WorldWideScience

Sample records for active length verification

  1. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    The most difficult and complex activity facing a safeguards inspector is the verification of measurements and of the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluating the measurement system and determining systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  2. Myofilament length dependent activation

    Energy Technology Data Exchange (ETDEWEB)

    de Tombe, Pieter P.; Mateja, Ryan D.; Tachampa, Kittipong; Mou, Younss Ait; Farman, Gerrie P.; Irving, Thomas C. (IIT); (Loyola)

    2010-05-25

    The Frank-Starling law of the heart describes the interrelationship between end-diastolic volume and cardiac ejection volume, a regulatory system that operates on a beat-to-beat basis. The main cellular mechanism that underlies this phenomenon is an increase in the responsiveness of cardiac myofilaments to activating Ca{sup 2+} ions at a longer sarcomere length, commonly referred to as myofilament length-dependent activation. This review focuses on the molecular mechanisms that may underlie myofilament length dependency. Specifically, the roles of inter-filament spacing, thick and thin filament based regulation, as well as sarcomeric regulatory proteins are discussed. Although the 'Frank-Starling law of the heart' constitutes a fundamental cardiac property that has been appreciated for well over a century, it is still not known how, in muscle, the contractile apparatus transduces the information concerning sarcomere length to modulate ventricular pressure development.

  3. Role of independent inspections in verification activities

    International Nuclear Information System (INIS)

    Verification activities have an important place in the activities associated with the implementation of a quality assurance programme and concern the compliance of all work with clearly defined requirements. In the first part of this paper the author reviews these requirements, classifying them into four groups: specific requirements of the product in question, particular requirements of the organization responsible for manufacturing the product, requirements of successive customers, and regulatory requirements. The second part of the paper examines the two approaches which can be adopted to establish verification systems outside the organizational structure. The first approach is to pool or organize the monitoring resources of different bodies with common requirements (electricity producer and its principal contractors); the second consists in using an external monitoring body. The third part of the paper describes the system used in France, which is the first of the methods described above. It requires constant co-operation between the different parties involved, and these have established two associations for the purpose of applying the system - AFCEN (nuclear) and AFCEC (conventional). The advantages and disadvantages of the two possible approaches to verification of activities must be assessed within their industrial and commercial regulatory context. In France the best method has proved to be the pooling of resources. This has led to a direct and fruitful dialogue between customers and suppliers aimed at defining common requirements (Design and Construction Regulations (RCC)) and monitoring their application. (author)

  4. Telomerase activity and telomere length in Daphnia.

    Science.gov (United States)

    Schumpert, Charles; Nelson, Jacob; Kim, Eunsuk; Dudycha, Jeffry L; Patel, Rekha C

    2015-01-01

    Telomeres, comprised of short repetitive sequences, are essential for genome stability and have been studied in relation to cellular senescence and aging. Telomerase, the enzyme that adds telomeric repeats to chromosome ends, is essential for maintaining the overall telomere length. A lack of telomerase activity in mammalian somatic cells results in progressive shortening of telomeres with each cellular replication event. Mammals exhibit high rates of cell proliferation during embryonic and juvenile stages but very little somatic cell proliferation occurs during adult and senescent stages. The telomere hypothesis of cellular aging states that telomeres serve as an internal mitotic clock and telomere length erosion leads to cellular senescence and eventual cell death. In this report, we have examined telomerase activity, processivity, and telomere length in Daphnia, an organism that grows continuously throughout its life. As in insects, the Daphnia telomeric repeat sequence was determined to be TTAGG, and telomerase products with five-nucleotide periodicity were generated in the telomerase activity assay. We investigated telomerase function and telomere lengths in two closely related ecotypes of Daphnia with divergent lifespans, short-lived D. pulex and long-lived D. pulicaria. Our results indicate that there is no age-dependent decline in telomere length, telomerase activity, or processivity in short-lived D. pulex. On the contrary, a significant age-dependent decline in telomere length, telomerase activity and processivity is observed over the life span of long-lived D. pulicaria. While providing the first report on characterization of Daphnia telomeres and telomerase activity, our results also indicate that mechanisms other than telomere shortening may be responsible for the strikingly short life span of D. pulex.

  5. 78 FR 6852 - Agency Information Collection (Student Verification of Enrollment) Activity Under OMB Review

    Science.gov (United States)

    2013-01-31

    ... AFFAIRS Agency Information Collection (Student Verification of Enrollment) Activity Under OMB Review....'' SUPPLEMENTARY INFORMATION: Title: Student Verification of Enrollment, VA Form 22-8979. OMB Control Number: 2900... a student's certification of actual attendance and verification of the student's...

  6. Remedial activities effectiveness verification in tailing areas

    International Nuclear Information System (INIS)

    A comprehensive radiological study of the sludge basin from uranium ore mining and preprocessing was carried out. Air kerma rates (including their spectral analysis) at the reference height of 1 m above ground were measured over the whole area, and the radiation fields were mapped during two measuring campaigns (in 2009 and 2014). K, U and Th concentrations in the sludge, and concentrations in depth profiles (including radon concentration and radon exhalation rates) at selected points, were determined by gamma spectrometry, both in situ and on laboratory samples. The results were used for the analysis, design evaluation and verification of the efficiency of the remediation measures. The efficiency of covering the sludge basin with inert material was modelled using the MicroShield code. (authors)
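The cover-efficiency estimate described above can be illustrated with a minimal narrow-beam attenuation sketch. Everything here is an assumption for illustration only (the attenuation coefficient, the unit source strength); MicroShield models the actual source geometry, buildup factors, and nuclide-specific photon spectra, none of which appear in this toy:

```python
import math

def kerma_after_cover(kerma_uncovered, mu_cm_inv, thickness_cm):
    """Narrow-beam attenuation of the air kerma rate by an inert cover layer.

    mu_cm_inv is an effective linear attenuation coefficient (1/cm) for the
    cover material; buildup and geometry are deliberately ignored here.
    """
    return kerma_uncovered * math.exp(-mu_cm_inv * thickness_cm)

# A cover one half-value layer thick should halve the kerma rate.
half_value_layer = math.log(2) / 0.12       # cm, for an assumed mu = 0.12 1/cm
reduction = kerma_after_cover(1.0, 0.12, half_value_layer)
```

In a real assessment the coefficient would be folded over the measured gamma spectrum, which is why the study relies on a dedicated shielding code rather than a single-energy formula.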

  7. Formal Verification of Effectiveness of Control Activities in Business Processes

    Science.gov (United States)

    Arimoto, Yasuhito; Iida, Shusaku; Futatsugi, Kokichi

    Dealing with risks in business processes is an important issue for achieving companies' goals. This paper introduces a method that applies a formal method to the analysis of risks and control activities in business processes, so that control activities can be evaluated consistently and exhaustively, and so that the results of the evaluation can be discussed scientifically. We focus on document flows in business activities, and on the control activities and risks related to documents, because documents play important roles in business. In our method, document flows including control activities are modeled, and the OTS/CafeOBJ Method is used to verify that the control activities in the model avert the risk of document falsification. The verification is carried out through interaction between humans and the CafeOBJ system with theorem proving, which makes the results open to scientific discussion because the interaction yields rigorous reasons why each result follows from the verification.

  8. 77 FR 67737 - Proposed Information Collection (Student Verification of Enrollment) Activity: Comment Request

    Science.gov (United States)

    2012-11-13

    ... AFFAIRS Proposed Information Collection (Student Verification of Enrollment) Activity: Comment Request...: Student Verification of Enrollment, VA Form 22-8979. OMB Control Number: 2900-0465. Type of Review... of actual attendance and verification of the student's continued enrollment in courses leading to...

  9. Investigation of an implantable dosimeter for single-point water equivalent path length verification in proton therapy

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Hsiao-Ming; Mann, Greg; Cascio, Ethan [Francis H. Burr Proton Therapy Center, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States); Sicel Technologies, Inc., Morrisville, North Carolina 27560 (United States); Francis H. Burr Proton Therapy Center, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States)

    2010-11-15

    Purpose: In vivo range verification in proton therapy is highly desirable. A recent study suggested that it was feasible to use point dose measurement for in vivo beam range verification in proton therapy, provided that the spread-out Bragg peak (SOBP) dose distribution is delivered in a different and rather unconventional manner. In this work, the authors investigate the possibility of using a commercial implantable dosimeter with wireless reading for this particular application. Methods: The traditional proton treatment technique delivers all the Bragg peaks required for an SOBP field in a single sequence, producing a constant dose plateau across the target volume. As a result, a point dose measurement anywhere in the target volume will produce the same value, thus providing no information regarding the water equivalent path length to the point of measurement. However, the same constant dose distribution can be achieved by splitting the field into a complementary pair of subfields, producing two oppositely "sloped" depth-dose distributions. The ratio between the two distributions can be a sensitive function of depth, and measuring this ratio at a point inside the target volume can provide the water equivalent path length to the dosimeter location. Two types of field splits were used in the experiment, one achieved by the technique of beam current modulation and the other by manipulating the location and width of the beam pulse relative to the range modulator track. Eight MOSFET-based implantable dosimeters at four different depths in a water tank were used to measure the dose ratios for these field pairs. A method was developed to correct for the effect of the well-known LET dependence of the MOSFET detectors on the depth-dose distributions using the columnar recombination model. The LET-corrected dose ratios were used to derive the water equivalent path lengths to the dosimeter locations, which were then compared to physical measurements. Results
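The ratio-based range measurement lends itself to a small numerical sketch. Everything below is illustrative: the linear "sloped" sub-field profiles and the 10 cm target are invented stand-ins for the modulated depth-dose curves in the study, chosen only to show how a measured sub-field dose ratio inverts to a water equivalent path length (WEPL):

```python
import numpy as np

R = 10.0                                  # target extent in cm WEPL (assumed)
depths = np.linspace(0.0, R, 101)         # water equivalent depth grid
dose_up = 0.5 + 0.05 * depths             # rising sub-field (illustrative ramp)
dose_down = 0.5 + 0.05 * (R - depths)     # falling complementary sub-field

# The pair still sums to a flat, SOBP-like plateau across the target...
plateau = dose_up + dose_down             # constant 1.5 at every depth

# ...but their ratio increases monotonically with depth, so a single
# point measurement of the ratio pins down the dosimeter depth.
ratio = dose_up / dose_down

def wepl_from_ratio(r_measured):
    """Invert a measured sub-field dose ratio to a WEPL estimate (cm)."""
    return float(np.interp(r_measured, ratio, depths))
```

With these toy profiles, a dosimeter at 4 cm WEPL would read doses of 0.70 and 0.80 from the two sub-fields, and `wepl_from_ratio(0.70 / 0.80)` recovers 4.0 cm.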

  10. Telomerase activity and telomere length in human hepatocellular carcinoma.

    Science.gov (United States)

    Huang, G T; Lee, H S; Chen, C H; Chiou, L L; Lin, Y W; Lee, C Z; Chen, D S; Sheu, J C

    1998-11-01

    Telomerase is activated and telomere length altered in various types of cancers, including hepatocellular carcinoma (HCC). A total of 39 HCC tissues and the corresponding non-tumour livers were analysed and correlated with clinical parameters. Telomere length was determined by terminal restriction fragment assay, and telomerase activity was assayed by the telomeric repeat amplification protocol. Telomerase activity was positive in 24 of the 39 tumour tissues (1.15-285.13 total product generated (TPG) units) and in six of the 39 non-tumour liver tissues (1.05-1.73 TPG units). In the 28 cases analysed for telomere length, telomere length was shortened in 11 cases, lengthened in six cases, and unaltered in 11 cases compared with non-tumour tissues. Neither telomere length nor telomerase activity was correlated with any clinical parameter. PMID:10023320

  11. [Telomere length and telomerase activity in hepatocellular carcinoma].

    Science.gov (United States)

    Nakashio, R; Kitamoto, M; Nakanishi, T; Takaishi, H; Takahashi, S; Kajiyama, G

    1998-05-01

    Telomerase activity and terminal restriction fragment (TRF) length were examined in hepatocellular carcinoma (HCC). Telomerase activity was assayed by the telomeric repeat amplification protocol (TRAP) coupled with an internal telomerase assay standard (ITAS). The incidence of strong telomerase activity (highly variable level compared with the activity of non-cancerous liver tissue) was 79% in well, 84% in moderately, and 100% in poorly differentiated HCC, versus 0% in non-cancerous liver tissues. The incidence of TRF length alteration (reduction or elongation) was 53% in HCC. The incidence of TRF alteration was significantly higher in HCC exceeding 3 cm in diameter or moderately or poorly differentiated in histology. Telomerase activity was not associated with TRF length alteration in HCC. In conclusion, strong telomerase activity and TRF length alteration increased with HCC tumor progression. PMID:9613130

  12. Investigation of an implantable dosimeter for single-point water equivalent path length verification in proton therapy

    OpenAIRE

    Lu, Hsiao-Ming; Mann, Greg; Cascio, Ethan

    2010-01-01

    Purpose: In vivo range verification in proton therapy is highly desirable. A recent study suggested that it was feasible to use point dose measurement for in vivo beam range verification in proton therapy, provided that the spread-out Bragg peak dose distribution is delivered in a different and rather unconventional manner. In this work, the authors investigate the possibility of using a commercial implantable dosimeter with wireless reading for this particular application.

  13. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    Science.gov (United States)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.
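The spirit of a spectral-comparison-ratio test can be sketched with a toy statistic. This is not the NSCRAD implementation: the region-of-interest counts, the background scaling, and the chi-square-style distance below are all simplifications invented for illustration:

```python
import numpy as np

def spectral_anomaly_statistic(roi_counts, background_roi_counts):
    """Toy spectral-comparison statistic over broad regions of interest.

    Scales the background ROI shape to the live spectrum's total counts and
    measures the Poisson-weighted squared deviation. Large values flag
    spectra whose shape departs from the background estimate.
    """
    s = np.asarray(roi_counts, dtype=float)
    b = np.asarray(background_roi_counts, dtype=float)
    expected = b / b.sum() * s.sum()          # background shape, rescaled
    # Chi-square-style distance with Poisson variance ~ expected counts.
    return float(np.sum((s - expected) ** 2 / np.maximum(expected, 1.0)))
```

A spectrum matching the background shape scores near zero, while an excess in one region, as a low-count source would produce, drives the statistic up. An MDA estimate then asks how much injected activity is needed before such a statistic reliably crosses the decision threshold.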

  14. Activity of telomerase and telomeric length in Apis mellifera.

    Science.gov (United States)

    Korandová, Michala; Frydrychová, Radmila Čapková

    2016-06-01

    Telomerase is an enzyme that adds repeats of DNA sequences to the ends of chromosomes, thereby preventing their shortening. Telomerase activity is associated with proliferative status of cells, organismal development, and aging. We report an analysis of telomerase activity and telomere length in the honeybee, Apis mellifera. Telomerase activity was found to be regulated in a development- and caste-specific manner. During the development of somatic tissues of larval drones and workers, telomerase activity declined to 10 % of its level in embryos and remained low during pupal and adult stages, but was upregulated in testes of late pupae, where it reached 70 % of the embryo level. Upregulation of telomerase activity was observed in the ovaries of late pupal queens, reaching 160 % of the level in embryos. Compared to workers and drones, queens displayed higher levels of telomerase activity. In the third larval instar of queens, telomerase activity reached the embryo level, and adult queen brains showed an enormous, 70-fold elevation compared to the brain of an adult worker. Southern hybridization of terminal TTAGG fragments revealed a high variability of telomeric length between different individuals, although the same pattern of hybridization signals was observed in different tissues of each individual. PMID:26490169

  15. Length-scale dependent mechanical properties of Al-Cu eutectic alloy: Molecular dynamics based model and its experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Tiwary, C. S., E-mail: cst.iisc@gmail.com; Chattopadhyay, K. [Department of Materials Engineering, Indian Institute of Science, Bangalore 560012 (India); Chakraborty, S.; Mahapatra, D. R. [Department of Aerospace Engineering, Indian Institute of Science, Bangalore 560012 (India)

    2014-05-28

    This paper attempts to gain an understanding of the effect of lamellar length scale on the mechanical properties of a two-phase metal-intermetallic eutectic structure. We first develop a molecular dynamics model of the in-situ grown eutectic interface, followed by a model of the deformation of Al-Al{sub 2}Cu lamellar eutectic. Leveraging the insights obtained from the simulations on the behaviour of dislocations at different length scales of the eutectic, we present and explain experimental results on Al-Al{sub 2}Cu eutectics with various lamellar spacings. The physics behind the mechanism is further quantified with the help of an atomic-level energy model at different length scales and strains. An atomic-level energy partitioning of the lamellae and the interface regions reveals that energy accumulates in the lamella cores predominantly through dislocations, irrespective of the length scale. At the interface, energy accumulation through dislocations dominates when the length scale is small, but the trend is reversed when the length scale grows beyond a critical size of about 80 nm.

  17. Summary of LHC MD 398: Verification of the dependence of the BCTF measurements on beam position and bunch length

    CERN Document Server

    Krupa, Michal; Gasior, Marek; Lefevre, Thibaut; Soby, Lars; CERN. Geneva. ATS Department

    2015-01-01

    The main aim of the MD was to study the dependency of bunch-by-bunch intensity measurements to beam position and bunch length variations. Large beam position offsets in IR4 and varying bunch length were introduced to compare the performance of the presently installed Fast Beam Current Transformers with the new Integrating Current Transformer and the new Wall Current Transformer. This note explains all the procedures of the LHC MD 398, which took place on 20/07/2015, and presents the obtained results.

  18. First Exon Length Controls Active Chromatin Signatures and Transcription

    Directory of Open Access Journals (Sweden)

    Nicole I. Bieberstein

    2012-07-01

    Here, we explore the role of splicing in transcription, employing both genome-wide analysis of human ChIP-seq data and experimental manipulation of exon-intron organization in transgenic cell lines. We show that the activating histone modifications H3K4me3 and H3K9ac map specifically to first exon-intron boundaries. This is surprising, because these marks help recruit general transcription factors (GTFs) to promoters. In genes with long first exons, promoter-proximal levels of H3K4me3 and H3K9ac are greatly reduced; consequently, GTFs and RNA polymerase II are low at transcription start sites (TSSs) and exhibit a second, promoter-distal peak from which transcription also initiates. In contrast, short first exons lead to increased H3K4me3 and H3K9ac at promoters, higher expression levels, accuracy in TSS usage, and a lower frequency of antisense transcription. Therefore, first exon length is predictive of gene activity. Finally, splicing inhibition and intron deletion reduce H3K4me3 levels and transcriptional output. Thus, gene architecture and splicing determine transcription quantity and quality as well as chromatin signatures.

  19. Training to Support Standardization and Improvement of Safety I&C Related Verification and Validation Activities

    Energy Technology Data Exchange (ETDEWEB)

    Ammon, G.; Schoenfelder, C.

    2014-07-01

    In recent years AREVA has taken several measures to enhance the effectiveness of safety I&C related verification and validation activities within nuclear power plant (NPP) new build as well as modernization projects, thereby further strengthening its commitment to achieving the highest level of safety in nuclear facilities. (Author)

  20. 78 FR 6849 - Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review

    Science.gov (United States)

    2013-01-31

    ... AFFAIRS Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review AGENCY... abstracted below to the Office of Management and Budget (OMB) for review and comment. The PRA submission... VA Benefits, VA Form 26-8937. OMB Control Number: 2900-0406. Type of Review: Extension of...

  1. 77 FR 20889 - Proposed Information Collection (Request One-VA Identification Verification Card) Activity...

    Science.gov (United States)

    2012-04-06

    ... of Veterans Affairs, 810 Vermont Avenue NW., Washington, DC 20420 or email: john.hancock@va.gov... AFFAIRS Proposed Information Collection (Request One-VA Identification Verification Card) Activity... Veterans Affairs (VA), is announcing an opportunity for public comment on the proposed collection...

  2. 77 FR 38396 - Agency Information Collection (One-VA Identification Verification Card) Activities Under OMB Review

    Science.gov (United States)

    2012-06-27

    ... information through www.Regulations.gov or to VA's OMB Desk Officer, Office of Information and Regulatory... 20420, (202) 632-7479, Fax (202) 632-7583 or email denise.mclamb@va.gov . Please refer to ``OMB Control... AFFAIRS Agency Information Collection (One-VA Identification Verification Card) Activities Under...

  3. The Influence of Epoch Length on Physical Activity Patterns Varies by Child's Activity Level

    Science.gov (United States)

    Nettlefold, Lindsay; Naylor, P. J.; Warburton, Darren E. R.; Bredin, Shannon S. D.; Race, Douglas; McKay, Heather A.

    2016-01-01

    Purpose: Patterns of physical activity (PA) and sedentary time, including volume of bouted activity, are important health indicators. However, the effect of accelerometer epoch length on measurement of these patterns and associations with health outcomes in children remain unknown. Method: We measured activity patterns in 308 children (52% girls,…

  4. Single-Cell Telomere-Length Quantification Couples Telomere Length to Meristem Activity and Stem Cell Development in Arabidopsis

    Directory of Open Access Journals (Sweden)

    Mary-Paz González-García

    2015-05-01

    Telomeres are specialized nucleoprotein caps that protect chromosome ends assuring cell division. Single-cell telomere quantification in animals established a critical role for telomerase in stem cells, yet, in plants, telomere-length quantification has been reported only at the organ level. Here, a quantitative analysis of telomere length of single cells in Arabidopsis root apex uncovered a heterogeneous telomere-length distribution of different cell lineages showing the longest telomeres at the stem cells. The defects in meristem and stem cell renewal observed in tert mutants demonstrate that telomere lengthening by TERT sets a replicative limit in the root meristem. Conversely, the long telomeres of the columella cells and the premature stem cell differentiation in plt1,2 mutants suggest that differentiation can prevent telomere erosion. Overall, our results indicate that telomere dynamics are coupled to meristem activity and continuous growth, disclosing a critical association between telomere length, stem cell function, and the extended lifespan of plants.

  5. Single-cell telomere-length quantification couples telomere length to meristem activity and stem cell development in Arabidopsis.

    Science.gov (United States)

    González-García, Mary-Paz; Pavelescu, Irina; Canela, Andrés; Sevillano, Xavier; Leehy, Katherine A; Nelson, Andrew D L; Ibañes, Marta; Shippen, Dorothy E; Blasco, Maria A; Caño-Delgado, Ana I

    2015-05-12

    Telomeres are specialized nucleoprotein caps that protect chromosome ends assuring cell division. Single-cell telomere quantification in animals established a critical role for telomerase in stem cells, yet, in plants, telomere-length quantification has been reported only at the organ level. Here, a quantitative analysis of telomere length of single cells in Arabidopsis root apex uncovered a heterogeneous telomere-length distribution of different cell lineages showing the longest telomeres at the stem cells. The defects in meristem and stem cell renewal observed in tert mutants demonstrate that telomere lengthening by TERT sets a replicative limit in the root meristem. Conversely, the long telomeres of the columella cells and the premature stem cell differentiation in plt1,2 mutants suggest that differentiation can prevent telomere erosion. Overall, our results indicate that telomere dynamics are coupled to meristem activity and continuous growth, disclosing a critical association between telomere length, stem cell function, and the extended lifespan of plants. PMID:25937286

  7. Modelling and Verification of Web Services Business Activity Protocol

    DEFF Research Database (Denmark)

    Ravn, Anders Peter; Srba, Jiri; Vighio, Saleem

    2011-01-01

    WS-Business Activity specification defines two coordination protocols in order to ensure a consistent agreement on the outcome of long-running distributed applications. We use the model checker Uppaal to analyse the Business Agreement with Coordination Completion protocol type. Our analyses show...

  8. Evaluation of micro-parallel liquid chromatography as a method for HTS-coupled actives verification.

    Science.gov (United States)

    Simeonov, Anton; Yasgar, Adam; Klumpp, Carleen; Zheng, Wei; Shafqat, Naeem; Oppermann, Udo; Austin, Christopher P; Inglese, James

    2007-12-01

    The identification of biologically active compounds from high-throughput screening (HTS) can involve considerable postscreening analysis to verify the nature of the sample activity. In this study we evaluated the performance of micro-parallel liquid chromatography (microPLC) as a separation-based enzyme assay platform for follow-up of compound activities found in quantitative HTS of two different targets, a hydrolase and an oxidoreductase. In an effort to couple secondary analysis to primary screening we explored the application of microPLC immediately after a primary screen. In microPLC, up to 24 samples can be loaded and analyzed simultaneously via high-performance liquid chromatography within a specially designed cartridge. In a proof-of-concept experiment for screen-coupled actives verification, we identified, selected, and consolidated the contents of "active" wells from a 1,536-well format HTS experiment into a 384-well plate and subsequently analyzed these samples by a 24-channel microPLC system. The method utilized 0.6% of the original 6-μl 1,536-well assay for the analysis. The analysis revealed several non-biological-based "positive" samples. The main examples included "false" enzyme activators resulting from an increase in well fluorescence due to fluorescent compound or impurity. The microPLC analysis also provided a verification of the activity of two activators of glucocerebrosidase. We discuss the benefits of microPLC and its limitations from the standpoint of ease of use and integration into a seamless postscreen workflow.

  9. Length adaptation of smooth muscle contractile filaments in response to sustained activation.

    Science.gov (United States)

    Stålhand, Jonas; Holzapfel, Gerhard A

    2016-05-21

    Airway and bladder smooth muscles are known to undergo length adaptation under sustained contraction. This adaptation process entails a remodelling of the intracellular actin and myosin filaments which shifts the peak of the active force-length curve towards the current length. Smooth muscles are therefore able to generate maximum force over a wide range of lengths. In contrast, length adaptation of vascular smooth muscle has attracted very little attention and only a handful of studies have been reported. Although their results conflict on the existence of a length adaptation process in vascular smooth muscle, it seems that at least peripheral arteries and arterioles undergo such adaptation. This is of interest since peripheral vessels are responsible for pressure regulation, and length adaptation will affect the function of the cardiovascular system. It has, for example, been suggested that the inward remodelling of resistance vessels associated with hypertension disorders may be related to smooth muscle adaptation. In this study we develop a continuum mechanical model for vascular smooth muscle length adaptation by assuming that the muscle cells remodel the actomyosin network such that the peak of the active stress-stretch curve is shifted towards the operating point. The model is specialised to hamster cheek pouch arterioles, and the response to stepwise length changes under contraction is simulated. The results show that the model recovers the salient features of length adaptation reported in the literature.

  10. Software Verification and Validation Plan Activities, 2011, Project Number: N6423, SAPHIRE Version 8

    Energy Technology Data Exchange (ETDEWEB)

    Kurt G. Vedros; Curtis L. Smith

    2011-11-01

    The SV&V Plan was revised over the past year to bring it into the operational software life cycle of SAPHIRE 8 and to keep its sections on design features current. Peer review of the SVVP with the former IV&V members identified the need for the operational use of metrics as a tool for quality maintenance and improvement. New tests were added to the SVVP to verify the operation of the new design features incorporated into SAPHIRE 8. Other additions to the SVVP included software metrics and the PDR and CDR processes. Audit support was provided to the NRC Technical Manager and Project Manager for the NRC OIG audit performed throughout 2011. The SVVP is considered an up-to-date reference and a useful roadmap of verification and validation activities going forward.

  11. Analysis of the age of Panax ginseng based on telomere length and telomerase activity.

    Science.gov (United States)

    Liang, Jiabei; Jiang, Chao; Peng, Huasheng; Shi, Qinghua; Guo, Xiang; Yuan, Yuan; Huang, Luqi

    2015-01-23

    Ginseng, which is the root of Panax ginseng (Araliaceae), has been used in Oriental medicine as a stimulant and dietary supplement for more than 7,000 years. Older ginseng plants are substantially more medically potent, but ginseng age can be simulated using unscrupulous cultivation practices. Telomeres progressively shorten with each cell division until they reach a critical length, at which point cells enter replicative senescence. However, in some cells, telomerase maintains telomere length. In this study, to determine whether telomere length reflects ginseng age and which tissue is best for such an analysis, we examined telomerase activity in the main roots, leaves, stems, secondary roots and seeds of ginseng plants of known age. Telomere length in the main root (approximately 1 cm below the rhizome) was found to be the best indicator of age. Telomeric terminal restriction fragment (TRF) lengths, which are indicators of telomere length, were determined for the main roots of plants of different ages through Southern hybridization analysis. Telomere length was shown to be positively correlated with plant age, and a simple mathematical model was formulated to describe the relationship between telomere length and age for P. ginseng.
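    The age-estimation logic described in this abstract — a positive linear relationship between TRF length and plant age, inverted to estimate the age of a root — can be sketched as follows. The (age, TRF length) pairs and the simple least-squares fit below are invented for illustration; they are not the paper's data or its fitted model.

```python
# Hypothetical sketch: fit TRF_length = a * age + b by least squares, then
# invert the fitted line to estimate age from a measured TRF length.
# All numbers are illustrative, not from the P. ginseng study.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

ages = [2, 4, 6, 8, 10, 12]             # known plant ages (years)
trf = [6.1, 6.8, 7.6, 8.2, 9.1, 9.7]    # illustrative TRF lengths (kb)
a, b = fit_line(ages, trf)

def estimate_age(trf_kb):
    """Invert the fitted line to estimate age from a TRF measurement."""
    return (trf_kb - b) / a

print(round(estimate_age(8.5), 1))
```

The inversion is only meaningful because the fitted slope is positive, matching the study's reported positive correlation between telomere length and age.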

  12. Statistical analysis and verification of 3-hourly geomagnetic activity probability predictions

    Science.gov (United States)

    Wang, Jingjing; Zhong, Qiuzhen; Liu, Siqing; Miao, Juan; Liu, Fanghua; Li, Zhitao; Tang, Weiwei

    2015-12-01

    The Space Environment Prediction Center (SEPC) classifies geomagnetic activity into four levels, ranging from quiet to unsettled up to storm (Kp ≥ 6). The 3-hourly Kp index prediction product provided by the SEPC is updated half-hourly. In this study, statistical conditional forecast models for the 3-hourly geomagnetic activity level were developed based on 10 years of data and applied to more than 3 years of data, using the previous Kp index, interplanetary magnetic field, and solar wind parameters measured by the Advanced Composition Explorer as conditional parameters. The quality of the forecast models was measured through verification of accuracy, reliability, discrimination capability, and skill in predicting all geomagnetic activity levels, especially the probability of reaching the storm level given a previous "calm" (nonstorm level) or "storm" (storm level) condition. It was found that the conditional models using the previous Kp index, the peak value of BtV (the product of the total interplanetary magnetic field and speed), the average value of Bz (the southward component of the interplanetary magnetic field), and BzV (the product of the southward component of the interplanetary magnetic field and speed) over the last 6 h as conditional parameters provide a relative operating characteristic area of 0.64 and can serve as an appropriate predictor for the probability forecast of geomagnetic activity level.
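    The relative operating characteristic (ROC) area used to score these probability forecasts can be computed directly from forecast probabilities and observed outcomes. A minimal sketch, with invented forecasts rather than SEPC data:

```python
# ROC area via the Mann-Whitney rank statistic: the probability that a
# randomly chosen event receives a higher forecast probability than a
# randomly chosen non-event (ties count as 0.5).

def roc_area(probs, events):
    pairs = hits = 0.0
    event_probs = [p for p, e in zip(probs, events) if e]
    nonevent_probs = [p for p, e in zip(probs, events) if not e]
    for pe in event_probs:
        for pn in nonevent_probs:
            pairs += 1
            if pe > pn:
                hits += 1
            elif pe == pn:
                hits += 0.5
    return hits / pairs if pairs else float("nan")

# Illustrative forecasts: a skilful model assigns higher probabilities to storms.
probs = [0.9, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
events = [1, 1, 0, 1, 0, 0, 0]
print(roc_area(probs, events))
```

An area of 0.5 corresponds to no discrimination and 1.0 to perfect discrimination, which puts the abstract's reported 0.64 in context.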

  13. Characterizing proton-activated materials to develop PET-mediated proton range verification markers

    Science.gov (United States)

    Cho, Jongmin; Ibbott, Geoffrey S.; Kerr, Matthew D.; Amos, Richard A.; Stingo, Francesco C.; Marom, Edith M.; Truong, Mylene T.; Palacio, Diana M.; Betancourt, Sonia L.; Erasmus, Jeremy J.; DeGroot, Patricia M.; Carter, Brett W.; Gladish, Gregory W.; Sabloff, Bradley S.; Benveniste, Marcelo F.; Godoy, Myrna C.; Patil, Shekhar; Sorensen, James; Mawlawi, Osama R.

    2016-06-01

    Conventional proton beam range verification using positron emission tomography (PET) relies on tissue activation alone and therefore requires particle-therapy PET, whose installation can represent a large financial burden for many centers. Previously, we showed the feasibility of developing patient-implantable markers using high proton cross-section materials (18O, Cu, and 68Zn) for in vivo proton range verification using conventional PET scanners. In this technical note, we characterize those materials to test their usability in more clinically relevant conditions. Two phantoms made of low-density balsa wood (~0.1 g cm-3) and beef (~1.0 g cm-3) were embedded with Cu or 68Zn foils of several volumes (10-50 mm3). The metal foils were positioned at several depths in the dose fall-off region, which had been determined from our previous study. The phantoms were then irradiated with different proton doses (1-5 Gy). After irradiation, the phantoms with the embedded foils were moved to a diagnostic PET scanner and imaged. The acquired data were reconstructed with 20-40 min of scan time using various delay times (30-150 min) to determine the maximum contrast-to-noise ratio. The resultant PET/computed tomography (CT) fusion images of the activated foils were then examined, and the foils' PET signal strength/visibility was scored on a 5-point scale by 13 radiologists experienced in nuclear medicine. For both phantoms, the visibility of activated foils increased in proportion to the foil volume, dose, and PET scan time. A linear model was constructed with visibility scores as the response variable and all other factors (marker material, phantom material, dose, and PET scan time) as covariates. Using the linear model, volumes of foils that provided adequate visibility (score 3) were determined for each dose and PET scan time. These foil volumes will be used as a guideline in developing practical implantable markers.
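    One factor behind the delay-time dependence reported above is simple radioactive decay of the activated foil between irradiation and imaging. A hedged illustration, assuming the textbook half-life of 68Ga (~67.7 min), the positron emitter produced by proton activation of 68Zn; the scoring model itself is not reproduced here.

```python
import math

def remaining_fraction(half_life_min, delay_min):
    """Fraction of induced activity remaining after a delay,
    from the exponential decay law N(t) = N0 * exp(-ln(2) * t / T_half)."""
    return math.exp(-math.log(2.0) * delay_min / half_life_min)

# e.g. 68Ga (T1/2 ~ 67.7 min, textbook value) after the longest delay studied:
print(round(remaining_fraction(67.7, 150.0), 3))
```

Longer delays let short-lived background activity in the surrounding phantom die away while the marker signal also decays, which is why the abstract scans over a range of delays to find the best contrast-to-noise ratio.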

  14. Amplification of Frequency-Modulated Similariton Pulses in Length-Inhomogeneous Active Fibers

    Directory of Open Access Journals (Sweden)

    I. O. Zolotovskii

    2012-01-01

    Full Text Available. The possibility of effective amplification of self-similar frequency-modulated (FM) wave packets is studied in length-inhomogeneous active fibers. The dynamics of parabolic pulses with constant chirp is considered. The optimal profile of the group-velocity dispersion variation corresponding to optimal similariton pulse amplification is obtained. It is shown that the use of FM pulses in active (gain) length-inhomogeneous optical fibers with normal group-velocity dispersion can provide subpicosecond optical pulse amplification up to energies above 1 nJ.
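    For reference, a parabolic similariton with constant chirp has the standard textbook form below; the normalization is illustrative and may differ from the paper's.

```latex
% Chirped parabolic (similariton) pulse, textbook form:
A(z,T) \;=\; A_0(z)\,\sqrt{\,1-\left(\frac{T}{T_p(z)}\right)^{2}}\;
\exp\!\left[\, i\,\varphi_0(z) \;-\; i\,C\,T^{2}\right],
\qquad |T| \le T_p(z),
```

with amplitude A_0(z) and width T_p(z) growing along an active fiber under gain, and a chirp coefficient C that stays constant — the property the abstract highlights.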

  15. Verification of relationships between anthropometric variables among ureteral stents recipients and ureteric lengths: a challenge for Vitruvian-da Vinci theory

    Directory of Open Access Journals (Sweden)

    Acelam PA

    2015-08-01

    Philip A Acelam, Walden University, College of Health Sciences, Minneapolis, MN, USA. Objective: To determine and verify how anthropometric variables correlate with ureteric lengths and how well statistical models approximate the actual ureteric lengths. Materials and methods: In this work, 129 charts of endourological patients (71 females and 58 males) were studied retrospectively. Data were gathered from various research centers in North and South America. Continuous data were studied using descriptive statistics. Anthropometric variables (age, body surface area, body weight, obesity, and stature) were utilized as predictors of ureteric lengths. Linear regressions and correlations were used for studying relationships between the predictors and the outcome variables (ureteric lengths); the P-value was set at 0.05. To assess how well statistical models were capable of predicting the actual ureteric lengths, percentages (or ratios) of matched to mismatched results were employed. Results: The results of the study show that anthropometric variables do not correlate well with ureteric lengths. Statistical models can only partially estimate ureteric lengths. Of the five anthropometric variables studied, three (body frame, stature, and weight, each with P<0.0001) were significant. Two of the variables, age (R2=0.01; P=0.20) and obesity (R2=0.03; P=0.06), were found to be poor estimators of ureteric lengths. None of the predictors reached the expected (match:above:below) ratio of 1:0:0 needed to qualify as a reliable predictor of ureteric lengths. Conclusion: There is not sufficient evidence to conclude that anthropometric variables can reliably predict ureteric lengths. These variables appear to lack adequate specificity, as they failed to reach the expected (match:above:below) ratio of 1:0:0. Consequently, selection of ureteral stents continues to remain a challenge. However, height (R2=0.68), with a (match:above:below) ratio of 3:3:4, appears suited for use as …
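    The (match:above:below) scoring used in this abstract can be sketched as a simple tally of model predictions against measured ureteric lengths, with a tolerance defining a "match". All numbers and the 0.5 cm tolerance below are invented for illustration; the study's actual matching criterion is not stated here.

```python
# Hypothetical sketch of a match:above:below tally for ureteric length
# predictions. A prediction within `tol` of the actual length is a match;
# otherwise it counts as an over- (above) or under- (below) prediction.

def match_ratio(predicted, actual, tol=0.5):
    match = sum(1 for p, a in zip(predicted, actual) if abs(p - a) <= tol)
    above = sum(1 for p, a in zip(predicted, actual) if p - a > tol)
    below = sum(1 for p, a in zip(predicted, actual) if a - p > tol)
    return match, above, below

predicted = [24.1, 26.0, 22.4, 25.5, 23.0]   # cm, from a height-based model
actual = [24.0, 27.2, 21.5, 25.6, 24.1]      # cm, measured at stent placement
print(match_ratio(predicted, actual))
```

A perfectly reliable predictor would tally all matches and no misses, i.e. the 1:0:0 ratio the abstract uses as its benchmark.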

  16. Validation and Verification of Operational Land Analysis Activities at the Air Force Weather Agency

    Science.gov (United States)

    Shaw, Michael; Kumar, Sujay V.; Peters-Lidard, Christa D.; Cetola, Jeffrey

    2012-01-01

    The NASA-developed Land Information System (LIS) is the Air Force Weather Agency's (AFWA) operational Land Data Assimilation System (LDAS), combining real-time precipitation observations and analyses, global forecast model data, vegetation, terrain, and soil parameters with the community Noah land surface model, along with other hydrology module options, to generate profile analyses of global soil moisture, soil temperature, and other important land surface characteristics: (1) a range of satellite data products and surface observations is used to generate the land analysis products; (2) coverage is global at 1/4 deg spatial resolution; (3) model analyses are generated every 3 hours. AFWA recognizes the importance of operational benchmarking and uncertainty characterization for land surface modeling and is developing standard methods, software, and metrics to verify and/or validate LIS output products. To facilitate this and other needs for land analysis activities at AFWA, the Model Evaluation Toolkit (MET) -- a joint product of the National Center for Atmospheric Research Developmental Testbed Center (NCAR DTC), AFWA, and the user community -- and the Land surface Verification Toolkit (LVT), developed at the Goddard Space Flight Center (GSFC), have been adapted to the operational benchmarking needs of AFWA's land characterization activities.

  17. Experimental verification of the effect of cable length on voltage distribution in stator winding of an induction motor under surge condition

    Energy Technology Data Exchange (ETDEWEB)

    Oyegoke, B.S. [Helsinki Univ. of Technology, Otaniemi (Finland). Lab. of Electromechanics

    1997-12-31

    This paper presents the results of surge distribution tests performed on the stator of a 6 kV induction motor. The primary aim of these tests was to determine the wave propagation properties of the machine winding fed via cables of different lengths. Based on the measured results, conclusions are derived regarding the effect of cable length on the surge distribution within the stator winding of an ac motor. (orig.) 15 refs.

  18. Association of day length and weather conditions with physical activity levels in older community dwelling people.

    Directory of Open Access Journals (Sweden)

    Miles D Witham

    BACKGROUND: Weather is a potentially important determinant of physical activity, yet little work has been done examining the relationship between weather and physical activity, and potential modifiers of any relationship, in older people. We therefore examined the relationship between weather and physical activity in a cohort of older community-dwelling people. METHODS: We analysed prospectively collected cross-sectional activity data from community-dwelling people aged 65 and over in the Physical Activity Cohort Scotland. We correlated seven-day triaxial accelerometry data with daily weather data (temperature, day length, sunshine, snow, rain), and a series of potential effect modifiers were tested in mixed models: environmental variables (urban vs rural dwelling, percentage of green space), psychological variables (anxiety, depression, perceived behavioural control), social variables (number of close contacts) and health status measured using the SF-36 questionnaire. RESULTS: 547 participants, mean age 78.5 years, were included in this analysis. Higher minimum daily temperature and longer day length were associated with higher activity levels; these associations remained robust to adjustment for other significant associates of activity: age, perceived behavioural control, number of social contacts and physical function. Of the potential effect modifier variables, only urban vs rural dwelling and the SF-36 measure of social functioning enhanced the association between day length and activity; no variable modified the association between minimum temperature and activity. CONCLUSIONS: In older community-dwelling people, minimum temperature and day length were associated with objectively measured activity. There was little evidence for moderation of these associations through potentially modifiable health, environmental, social or psychological variables.
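    A first-pass version of the correlation step described above — before any mixed-model adjustment for age, behavioural control, or social contacts — is a plain Pearson correlation between a weather covariate and daily activity counts. The temperature and accelerometry values below are invented for illustration.

```python
# Pearson correlation between daily minimum temperature and mean daily
# activity counts. Data are illustrative, not from the Scottish cohort.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

min_temp_c = [2, 5, 9, 12, 15, 18]        # daily minimum temperature (degC)
counts = [180, 210, 260, 300, 320, 340]   # mean daily activity counts
print(round(pearson_r(min_temp_c, counts), 3))
```

The study's mixed models go further by adding random effects per participant and testing interaction terms, which a raw correlation cannot capture.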

  19. AN APPROACH FOR ACTIVE SEGMENTATION OF UNCONSTRAINED HANDWRITTEN KOREAN STRINGS USING RUN-LENGTH CODE

    NARCIS (Netherlands)

    JeongSuk, J.; Kim, G.

    2004-01-01

    We propose an active handwritten Hangul segmentation method. A manageable structure based on run-length code is defined for use in preprocessing and segmentation. Three fundamental candidate-estimation functions are also introduced to detect clues at touching points, and the classifi…
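    Run-length coding, the representation this segmentation method builds on, encodes each scanline of a binary image as (start, length) runs of foreground pixels. A minimal sketch of the encoding step; the paper's candidate-estimation functions are not reproduced.

```python
# Run-length encode one binary scanline as a list of (start, length) runs
# of foreground (nonzero) pixels.

def run_length_encode(row):
    runs, start = [], None
    for i, px in enumerate(row):
        if px and start is None:
            start = i                      # a run of foreground begins
        elif not px and start is not None:
            runs.append((start, i - start))  # the run just ended
            start = None
    if start is not None:                   # run extends to the row's end
        runs.append((start, len(row) - start))
    return runs

print(run_length_encode([0, 1, 1, 1, 0, 0, 1, 1, 0]))
```

Working on runs instead of raw pixels makes operations such as locating touching strokes cheap, since each scanline collapses to a handful of intervals.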

  20. Day length and weather effects on children's physical activity and participation in play, sports, and active travel

    OpenAIRE

    Goodman, A.; Paskins, J.; MacKett, R

    2012-01-01

    BACKGROUND: Children in primary school are more physically active in the spring/summer. Little is known about the relative contributions of day length and weather, however, or about the underlying behavioral mediators. METHODS: 325 British children aged 8 to 11 wore accelerometers as an objective measure of physical activity, measured in terms of mean activity counts. Children simultaneously completed diaries in which we identified episodes of out-of-home play, structured sports, and active t...

  1. Effects of physical activity in telomere length: Systematic review and meta-analysis.

    Science.gov (United States)

    Mundstock, Eduardo; Zatti, Helen; Louzada, Fernanda Mattos; Oliveira, Suelen Goecks; Guma, Fátima T C R; Paris, Mariana Migliorini; Rueda, Angélica Barba; Machado, Denise Greff; Stein, Renato T; Jones, Marcus Herbert; Sarria, Edgar E; Barbé-Tuana, Florencia M; Mattiello, Rita

    2015-07-01

    The aim of this systematic review is to assess the effects of exercise on telomere length. We searched the following databases: MEDLINE, EMBASE, Cochrane Central Register of Controlled Trials (CENTRAL, The Cochrane Library), Scopus, LILACS, SPORTDiscus and Web of Science, from inception to August 2014. All articles that assessed the effects of exercise on telomere length were included in this review. The search strategy used the following combinations of terms: telomere AND "motor activity" OR exercise OR "physical activity". Two reviewers, working independently, screened all titles and abstracts to identify studies that could meet the inclusion criteria. Whenever possible, and if appropriate, we performed a random-effect meta-analysis of study outcomes. Thirty-seven original studies were included in this systematic review, including 41,230 participants. Twenty articles did not find a statistically significant association, whereas 15 described a positive association. Two papers found an inverted-"U" correlation. There is a tendency toward demonstrating an effect of exercise on telomere length. Few prospective studies were found, many studies did not reach statistical significance, and there was important methodological diversity. For this reason, a possible significant association between physical activity and telomere length remains an open question.
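    The random-effect pooling step mentioned above is commonly done with the DerSimonian-Laird estimator, which adds a between-study variance component to the inverse-variance weights. A self-contained sketch with invented effect sizes and variances, not the review's data:

```python
# DerSimonian-Laird random-effects meta-analysis (illustrative data).

def dersimonian_laird(effects, variances):
    """Pool per-study effects with inverse-variance weights plus a
    between-study variance tau^2 estimated from Cochran's Q."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return pooled, se, tau2

effects = [0.20, 0.05, 0.35, -0.10]   # illustrative standardized mean differences
variances = [0.02, 0.01, 0.04, 0.03]
pooled, se, tau2 = dersimonian_laird(effects, variances)
print(pooled, se, tau2)
```

The pooled estimate always lies within the range of the study effects, and tau^2 > 0 signals the kind of between-study heterogeneity the review describes as "important methodological diversity".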

  2. ANATOMY OF SOLAR CYCLE LENGTH AND SUNSPOT NUMBER: DEPENDENCE OF AVERAGE GLOBAL TEMPERATURE ON SOLAR ACTIVITY

    Directory of Open Access Journals (Sweden)

    A. B. BHATTACHARYA

    2011-11-01

    The paper examines thoroughly all of the past 23 sunspot cycles and the associated 11 Hale cycles. It is noticed that solar cycle 23 had a deep minimum with the longest decline phase. When solar cycles 20 to 23 are compared with solar cycles 1 to 4, the forthcoming Dalton minimum can be expected. The predicted variation of sunspot number for the present solar cycle 24 is examined at length, and it appears that the peak monthly sunspot number of solar cycle 24 will be around 80. We have correlated the solar cycle length and peak sunspot number, with priority given to solar cycle 24. From an elaborate analysis it appears that the most common cycle length is around 10.5 years, with few cycles in the range 11.5 to 12.5 years. Global temperature depends upon the total solar irradiance, which in turn depends on the duration of the solar cycle. Cloud cover also directly depends on the solar irradiance. Our analysis supports that the global temperature is governed by the length of the predicted cycle. From the increased length of solar cycle 23, we have estimated the temperature variation of cycle 24. The predicted result reassures that average global temperature will decrease for the next few solar cycles due to typical solar activity. The results have been interpreted emphasizing the formation of type III solar radio bursts caused by plasma excitation.

  3. Relationship between metabolism and ovarian activity in dairy cows with different dry period lengths.

    Science.gov (United States)

    Chen, J; Soede, N M; van Dorland, H A; Remmelink, G J; Bruckmaier, R M; Kemp, B; van Knegsel, A T M

    2015-11-01

    The objectives of the present study were to evaluate the effects of dry period length on ovarian activity in cows fed a lipogenic or a glucogenic diet within 100 days in milk (DIM) and to determine relationships between ovarian activity and energy balance and metabolic status in early lactation. Holstein-Friesian dairy cows (n = 167) were randomly assigned to one of three dry period lengths (0, 30, or 60 days) and one of two diets in early lactation (glucogenic or lipogenic), resulting in a 3 × 2 factorial design. Cows were monitored for body condition score, milk yield, dry matter intake, and energy balance from calving to week 8 postpartum, and blood was sampled weekly from 95 cows from calving to week 8 postpartum. Milk samples were collected three times a week until 100 DIM for determination of progesterone concentration. At least two successive milk samples with a progesterone concentration of 2 ng/mL or greater were used to indicate the occurrence of luteal activity. Normal resumption of ovarian cyclicity was defined as the onset of luteal activity (OLA) occurring at 45 DIM or less, followed by regular ovarian cycles of 18 to 24 days in length. Within 100 DIM, cows with a 0-day dry period had a greater incidence of normal resumption of ovarian cyclicity (53.2%; 25 out of 47 cows) compared with cows with a 60-day dry period (26.0%; 13 out of 50 cows, P = 0.02). Independent of dry period length or diet, cows with OLA at less than 21 DIM had a greater body condition score during weeks 1 and 2 (P = 0.01) and weeks 1 through 8 (P = 0.01) postpartum compared with cows with OLA at greater than 30 DIM. Cows with a first ovarian cycle of medium length (18-24 days) had greater energy balance (P = 0.03) and plasma concentrations of insulin (P = 0.03), glucose (P = 0.04), and insulin-like growth factor I (P = 0.04) than cows with long ovarian cycle lengths (>24 days), but had lower plasma β-hydroxybutyrate …
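    The luteal-activity rule stated in this abstract — onset of luteal activity (OLA) marked by at least two successive milk samples at or above 2 ng/mL progesterone — translates directly into a small scan over the sampling series. The sample days and progesterone values below are invented for illustration.

```python
# Detect onset of luteal activity (OLA): the first sampling day that starts
# a run of two successive milk samples with progesterone >= threshold ng/mL.

def onset_of_luteal_activity(days, p4, threshold=2.0):
    for i in range(len(p4) - 1):
        if p4[i] >= threshold and p4[i + 1] >= threshold:
            return days[i]
    return None  # no luteal activity detected in the series

days = [5, 8, 10, 12, 15, 17]           # sampling days (DIM)
p4 = [0.4, 0.9, 1.5, 2.4, 3.1, 4.0]     # milk progesterone (ng/mL)
print(onset_of_luteal_activity(days, p4))
```

Under the study's definition, a cow whose OLA falls at 45 DIM or less, followed by regular 18- to 24-day cycles, counts as a normal resumption of cyclicity.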

  4. Active aging as a way of keeping diseases at arm’s length

    DEFF Research Database (Denmark)

    Lassen, Aske Juul

    good for their quality of life, health, functionality and the economy (Sundhedsstyrelsen 2008, EC 2006, WHO 2002). At the same time, active aging is inscribed into a general health care focus, which individualizes the responsibility for health and disease. This requires subjects ready to self-care, by … paying attention to the signals of the body and leading healthy lives (Rose 2001). However, active aging seems to contain an ambiguity in this aspect, as the practice of active aging is often a way for the elderly to keep diseases at arm's length, and not a way to sense the possible abnormalities in the body …

  5. Implementation of the Additional Protocol: Verification activities at uranium mines and mills

    International Nuclear Information System (INIS)

    Full text: The mining and milling of uranium is the first in a long chain of processes required to produce nuclear materials in a form suitable for use in nuclear weapons. Misuse of a declared uranium mining/milling facility, in the form of understatement of production, would be hard to detect with the same high level of confidence as afforded by classical safeguards on other parts of the nuclear fuel cycle. For these reasons, it would not be cost-effective to apply verification techniques based on classical safeguards concepts to a mining/milling facility in order to derive assurance of the absence of misuse. Indeed, these observations have been recognised in the Model Protocol (INFCIRC/540): 'the Agency shall not mechanistically or systematically seek to verify' information provided to it by States (Article 4.a.). Nevertheless, complementary access to uranium mining/milling sites 'on a selective basis in order to assure the absence of undeclared nuclear material and activities' (Article 4.a.(i)) is provided for. On this basis, therefore, this paper will focus predominantly on options other than site access, which are available to the Agency for deriving assurance that declared mining/milling operations are not misused. Such options entail the interpretation and analysis of information provided to the Agency including, for example, from declarations, monitoring import/export data, open source reports, commercial satellite imagery, aerial photographs, and information provided by Member States. Uranium mining techniques are diverse, and the inventories, flows and uranium assays which arise at various points in the process will vary considerably between mines, and over the operating cycle of an individual mine. Thus it is essentially impossible to infer any information which can be used precisely to confirm, or otherwise, declared production by measuring or estimating any of those parameters at points within the mining/milling process. The task of attempting to …

  6. Effect of grass silage chop length on chewing activity and digestibility

    DEFF Research Database (Denmark)

    Garmo, T.H.; Randby, Å.T.; Eknæs, M.;

    2008-01-01

    Round bale grass silage harvested early (D-value 757 g kg-1 DM) or at a normal time (D-value 696 g kg-1 DM) was used to study the effect of harvesting time, chop length and their interaction on chewing activity and digestibility by dairy cows. Six early-lactating Norwegian Red cows were used in a 6 x 6 Latin square with 3-week periods. Chewing activity was measured using IGER Behaviour recorders, and digestibility was measured by total collection of faeces. The two silages were fed long (170 mm), coarsely chopped (55 mm), or finely chopped (24 mm median particle length). Cows were fed silage ad libitum and supplemented with 6 kg concentrate. Early-harvested silage significantly decreased total ration eating time (ET), rumination time (RT) and chewing time (CT) per kg silage DM compared with normal-harvested silage (CT = 38 vs. 46 min kg-1 DM). Chopping of silage reduced CT significantly, mainly…

  7. A Method Based on Active Appearance Model and Gradient Orientation Pyramid of Face Verification as People Age

    Directory of Open Access Journals (Sweden)

    Ji-Xiang Du

    2014-01-01

    Face verification in the presence of age progression is an important problem that has not been widely addressed. In this paper, we propose to use the active appearance model (AAM) and gradient orientation pyramid (GOP) feature representation for this problem. First, we apply the AAM to the dataset and generate AAM images; we then compute the gradient orientation representation over a hierarchical model, which yields the GOP appearance. When combined with a support vector machine (SVM), experimental results show that our approach has excellent performance on two public-domain face aging datasets: FGNET and MORPH. Second, we compare the performance of the proposed method with a number of related face verification methods; the results show that the new approach is more robust and performs better.
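    The per-pixel gradient orientations at the heart of the GOP feature can be illustrated with simple central differences. A real GOP additionally stacks these orientations over a Gaussian pyramid and compares them across image pairs; this hedged sketch covers only the orientation step.

```python
import math

def gradient_orientations(img):
    """Per-pixel gradient orientation (radians) via central differences,
    computed on the interior pixels of a 2D grayscale image (list of rows)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * (w - 2) for _ in range(h - 2)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            out[y - 1][x - 1] = math.atan2(gy, gx)
    return out

# A horizontal ramp: intensity grows left to right, so every gradient points
# along +x and every orientation is 0 radians.
ramp = [[x for x in range(4)] for _ in range(4)]
print(gradient_orientations(ramp))
```

Orientations are attractive for cross-age matching because they are insensitive to the smooth illumination and contrast changes that accumulate between photographs taken years apart.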

  8. Active Stream Length Dynamics in Headwater Catchments Spanning Physiographic Provinces in the Appalachian Highlands

    Science.gov (United States)

    Jensen, C.; McGuire, K. J.

    2015-12-01

    One of the most basic descriptions of streams is the presence of channelized flow. However, this seemingly simple query goes unanswered for the majority of headwater networks, as stream length expands and contracts with the wetness of catchments seasonally, interannually, and in response to storm events. Although streams are known to grow and shrink, a lack of information on longitudinal dynamics across different geographic regions precludes effective management. Understanding the temporal variation in temporary network length over a broad range of settings is critical for policy decisions that impact aquatic ecosystem health. This project characterizes changes in active stream length for forested headwater catchments spanning four physiographic provinces of the Appalachian Highlands: the New England province at Hubbard Brook Experimental Forest, New Hampshire; the Valley and Ridge at Poverty Creek and the North Fork of Big Stony Creek in Jefferson National Forest, Virginia; the Blue Ridge at Coweeta Hydrologic Laboratory, North Carolina; and the Appalachian Plateau at Fernow Experimental Forest, West Virginia. Multivariate statistical analysis confirms these provinces exhibit characteristic topographies reflecting differences in climate, geology, and environmental history and, thus, merit separate consideration. The active streams of three watersheds were mapped at extremes of discharge. This work demonstrates that streams can remain active in the form of isolated, disconnected sections along even the most upstream reaches during low flows. This finding suggests that we must consider the maximum stream extent for conservation and management strategies much more frequently than for just periods of high stream flow.

  9. Telomerase activity and telomere length in human tumor cells with acquired resistance to anticancer agents.

    Science.gov (United States)

    Smith, V; Dai, F; Spitz, M; Peters, G J; Fiebig, H H; Hussain, A; Burger, A M

    2009-11-01

    Telomeres and telomerase are targets for anticancer drug development, and specific inhibitors are currently under clinical investigation. However, it has been reported that standard cytotoxic agents can affect telomere length and telomerase activity, suggesting that they also have a role in drug resistance. In this study, telomere lengths and telomerase activity, as well as drug efflux pump expression, glutathione (GSH) levels and poly(ADP-ribose) polymerase (PARP) cleavage, were assessed in a panel of human tumor cell lines made resistant to vindesine, gemcitabine and cisplatin. These included two lung cancer cell lines resistant to vindesine (LXFL 529L/Vind, LXFA 526L/Vind), a renal cancer cell line (RXF944L/Gem) and an ovarian cancer cell line (AG6000) resistant to gemcitabine, and one resistant to cisplatin (ADDP). The resistant clones were compared to their parental lines and evaluated for cross-resistance to other cytotoxic agents. Several drug-specific resistance patterns were found, and various complex patterns of cross-resistance emerged from some cell lines, but these mechanisms of resistance could not be related to drug efflux pump expression, GSH levels or PARP cleavage. However, all displayed changes in telomerase activity and/or telomere length. Our studies present evidence that telomere maintenance should be taken into consideration in efforts not only to overcome drug resistance, but also to optimize the use of telomere-based therapeutics.

  10. Influence of linker length and composition on enzymatic activity and ribosomal binding of neomycin dimers.

    Science.gov (United States)

    Watkins, Derrick; Kumar, Sunil; Green, Keith D; Arya, Dev P; Garneau-Tsodikova, Sylvie

    2015-07-01

    The human and bacterial A-site rRNA binding as well as the aminoglycoside-modifying enzyme (AME) activity against a series of neomycin B (NEO) dimers are presented. The data indicate that by simple modifications of linker length and composition, substantial differences in rRNA selectivity and AME activity can be obtained. We tested five different AMEs with NEO dimers that were tethered via triazole, urea, and thiourea linkages. We show that triazole-linked dimers were the worst substrates for most AMEs, with those containing the longer linkers showing the largest decrease in activity. Thiourea-linked dimers that showed a decrease in activity by AMEs also showed increased bacterial A-site binding, with one compound (compound 14) even showing substantially reduced human A-site binding. The urea-linked dimers showed a substantial decrease in activity by AMEs when a conformationally restrictive phenyl linker was introduced. The information learned herein advances our understanding of the importance of linker length and composition for the generation of dimeric aminoglycoside antibiotics capable of avoiding the action of AMEs and of binding selectively to bacterial rRNA over human rRNA.

  11. Effect of rain boot shaft length on lower extremity muscle activity during treadmill walking

    Science.gov (United States)

    Kim, Young-Hwan; Yoo, Kyung-Tae

    2016-01-01

    [Purpose] This study aimed to determine the extent of lower extremity muscle activity before and after walking based on rain boot shaft length. [Subjects and Methods] The subjects, 12 young and healthy females, were divided into three groups based on rain boot shaft length (long, middle, and short). They walked on a treadmill for 30 minutes. Activity of the rectus femoris, vastus lateralis, semitendinosus, tibialis anterior, peroneus longus, and gastrocnemius was measured using electromyography before and after walking. Two-way repeated measures analysis of variance was performed to compare the muscle activities of each group. [Results] There were no significant differences in terms of the interactive effects between group and time for all muscles, the main effects of group, or the main effects of time. [Conclusion] The results of this study may indicate that movement of the lower extremities was not significantly limited by friction force based on the characteristics of the boot material or the circumference of the boot shaft. Thus, it may be helpful instead to consider the material of the sole or the weight of the boots when choosing which rain boots to wear. PMID:27799685

  12. Chain length dependence of non-surface activity and micellization behavior of cationic amphiphilic diblock copolymers.

    Science.gov (United States)

    Ghosh, Arjun; Yusa, Shin-ichi; Matsuoka, Hideki; Saruwatari, Yoshiyuki

    2014-04-01

    The cationic and anionic amphiphilic diblock copolymers with a critical chain length and block ratio do not adsorb at the air/water interface but form micelles in solution, which is a phenomenon called "non-surface activity". This is primarily due to the high charge density of the block copolymer, which creates a strong image charge effect at the air/water interface preventing adsorption. Very stable micelle formation in bulk solution could also play an important role in the non-surface activity. To further confirm these unique properties, we studied the adsorption and micellization behavior of cationic amphiphilic diblock copolymers of poly(n-butyl acrylate)-b-poly(3-(methacryloyloxy)ethyl)trimethylammonium chloride) (PBA-b-PDMC) with different molecular weights of hydrophobic blocks but with the same ionic block length. These block copolymers were successfully prepared via consecutive reversible addition-fragmentation chain transfer (RAFT) polymerization. The block copolymer with the shortest hydrophobic block length was surface-active; the solution showed surface tension reduction and foam formation. However, above the critical block ratio, the surface tension of the solution did not decrease with increasing polymer concentration, and there was no foam formation, indicating lack of surface activity. After addition of 0.1 M NaCl, stable foam formation and slight reduction of surface tension were observed, which is reminiscent of the electrostatic nature of the non-surface activity. Fluorescence and dynamic and static light scattering measurements showed that the copolymer with the shortest hydrophobic block did not form micelles, while the block copolymers formed spherical micelles having radii of 25-30 nm. These observations indicate that micelle formation is also important for non-surface activity. Upon addition of NaCl, cmc did not decrease but rather increased as observed for non-surface-active block copolymers previously studied. The micelles formed were

  13. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    Energy Technology Data Exchange (ETDEWEB)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.
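
The comparison against a known solution that the abstract describes can be illustrated with a minimal code-verification sketch (not one of the COVE-2A codes; all parameter values are illustrative): solve a 1-D steady groundwater-flow problem, -K h'' = q0 sin(pi x / L) with h(0) = h(L) = 0, by central finite differences, and measure the error against the analytic solution h(x) = (q0 L^2 / (K pi^2)) sin(pi x / L).

```python
# Code-verification sketch: finite-difference solution vs. analytic solution
# for 1-D steady groundwater flow. All names and parameters are illustrative.
import math

def solve_fd(n, L=1.0, K=1.0, q0=1.0):
    """Solve -K h'' = q0*sin(pi*x/L) on n interior nodes (Thomas algorithm)."""
    dx = L / (n + 1)
    xs = [(i + 1) * dx for i in range(n)]
    a = [-K / dx**2] * n          # sub-diagonal
    b = [2 * K / dx**2] * n       # main diagonal
    c = [-K / dx**2] * n          # super-diagonal
    d = [q0 * math.sin(math.pi * x / L) for x in xs]   # source term
    for i in range(1, n):         # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    h = [0.0] * n                 # back substitution
    h[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        h[i] = (d[i] - c[i] * h[i + 1]) / b[i]
    return xs, h

def max_error(n, L=1.0, K=1.0, q0=1.0):
    """Max pointwise deviation from the analytic solution."""
    amp = q0 * L**2 / (K * math.pi**2)
    xs, h = solve_fd(n, L, K, q0)
    return max(abs(hi - amp * math.sin(math.pi * x / L)) for x, hi in zip(xs, h))
```

For this second-order scheme, halving dx should cut the maximum error by roughly a factor of four; that is the kind of numerical-accuracy check that benchmark intercomparisons approximate when no analytic solution exists.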

  14. Leukocyte Telomere Length in Healthy Caucasian and African-American Adolescents: Relationships with Race, Sex, Adiposity, Adipokines, and Physical Activity

    NARCIS (Netherlands)

    Zhu, Haidong; Wang, Xiaoling; Gutin, Bernard; Davis, Catherine L.; Keeton, Daniel; Thomas, Jeffrey; Stallmann-Jorgensen, Inger; Mooken, Grace; Bundy, Vanessa; Snieder, Harold; van der Harst, Pim; Dong, Yanbin

    2011-01-01

    Objective To examine the relationships of race, sex, adiposity, adipokines, and physical activity to telomere length in adolescents. Study design Leukocyte telomere length (T/S ratio) was assessed cross-sectionally in 667 adolescents (aged 14-18 years; 48% African-Americans; 51% girls) using a quant

  15. An Origin AS Verification Mechanism Based on the Length of Prefix Assignment Path for Securing BGP

    Institute of Scientific and Technical Information of China (English)

    王娜; 张建辉; 马海龙; 汪斌强

    2009-01-01

    The paper finds that current origin Autonomous System (AS) verification mechanisms for securing BGP whose security properties have been widely recognized, such as S-BGP, share a vulnerability: because they are based on the assignment path of a prefix, they only guarantee that the prefix is originated by an AS authorized by some Internet Service Provider (ISP) on the prefix's assignment path, not that it is originated by the AS authorized by the last ISP on that path, i.e., the ISP that actually owns the prefix. Only the AS authorized by the owning ISP is the prefix's legitimate origin AS. As a result, these mechanisms are vulnerable to an "upper ISP" prefix-hijacking attack. The paper proposes a novel origin AS verification mechanism for securing BGP based on the length of a prefix's assignment path, called LAP (Length of Assignment Path). The basic idea is that every AS must provide the assignment path and attestations for the prefixes it originates, and, for a given prefix, the AS that provides the longest valid assignment path is its legitimate origin AS. LAP protects the inter-domain routing system against valid-prefix hijacking, sub-prefix hijacking, and unused-prefix hijacking, especially "upper ISP" prefix hijacking, and it can be seamlessly applied in current BGP security solutions and some next-generation inter-domain routing protocols.
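
The selection rule at the heart of LAP can be sketched as a toy (the data model and the validity predicate are assumptions for illustration, not the paper's wire format): among the ASes announcing a prefix, accept the one presenting the longest valid assignment path, since only that AS is authorized by the ISP at the end of the delegation chain, which actually owns the prefix.

```python
# Toy illustration of the LAP selection rule; data model is an assumption.

def lap_origin(announcements, is_valid):
    """announcements: list of (origin_as, assignment_path) pairs for one
    prefix, where assignment_path is the chain of delegating entities.
    Returns the origin AS with the longest valid assignment path, or None."""
    best = None
    for origin, path in announcements:
        if is_valid(path) and (best is None or len(path) > len(best[1])):
            best = (origin, path)
    return None if best is None else best[0]

# AS 64500 is authorized by ISP-B, the owner at the end of the chain;
# AS 64501 is authorized only by the upper ISP-A, one hop shorter.
announcements = [
    (64501, ["ICANN", "RIR", "ISP-A"]),
    (64500, ["ICANN", "RIR", "ISP-A", "ISP-B"]),
]
origin = lap_origin(announcements, is_valid=lambda path: True)  # attestation check stubbed out
```

With the attestation check stubbed to always pass, the longer path wins, so the upper-ISP announcement from AS 64501 is rejected in favor of AS 64500.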

  16. Rational Design for Rotaxane Synthesis through Intramolecular Slippage: Control of Activation Energy by Rigid Axle Length.

    Science.gov (United States)

    Masai, Hiroshi; Terao, Jun; Fujihara, Tetsuaki; Tsuji, Yasushi

    2016-05-01

    We describe a new concept for rotaxane synthesis through intramolecular slippage, using π-conjugated molecules as rigid axles linked with organic-soluble, flexible permethylated α-cyclodextrins (PM α-CDs) as macrocycles. Through hydrophilic-hydrophobic interactions and flipping of the PM α-CDs, quantitative conversion into rotaxanes was achieved without covalent bond formation. The rotaxanes had a high activation barrier for de-threading, so they were kinetically isolated and derivatized even under conditions unfavorable for maintaining the rotaxane structure. ¹H NMR spectroscopy experiments clearly revealed that the restricted motion of the macrocycle linked to the rigid axle made it possible to control the kinetic stability by adjusting the length of the rigid axle in the precursor structure, rather than the steric bulkiness of the stopper unit.

  17. Effects of cisplatin on telomerase activity and telomere length in BEL-7404 human hepatoma cells

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Telomerase activity was inhibited in a dose- and time-dependent manner by treatment with cisplatin for 24, 48, or 72 h at concentrations ranging from 0.8 to 50 μM in BEL-7404 human hepatoma cells. There were no changes in the expression pattern of the three telomerase subunits, the catalytic reverse transcriptase subunit (hTERT), the RNA component (hTR), or the associated protein subunit (TP1), after cisplatin treatment for 72 h at the indicated concentrations. Mean telomere length was decreased by cisplatin treatment. Cell growth inhibition and cell cycle accumulation in the G2/M phase were found to correlate with telomerase inhibition in the present study, but the percentage of apoptotic cells did not change markedly during the process.

  18. Status of the Agency's verification activities in Iraq as of 8 January 2003. Statement by the Director General. New York, 09 January 2003

    International Nuclear Information System (INIS)

    The following information is provided to update the Council on the activities of the IAEA pursuant to Security Council resolution 1441 (2002) and other relevant resolutions. It describes the verification activities performed thus far, next steps, and where we are at this stage

  19. Purification and activity testing of the full-length YycFGHI proteins of Staphylococcus aureus.

    Directory of Open Access Journals (Sweden)

    Michael Türck

    Full Text Available BACKGROUND: The YycFG two-component regulatory system (TCS of Staphylococcus aureus represents the only essential TCS that is almost ubiquitously distributed in gram-positive bacteria with a low G+C-content. YycG (WalK/VicK is a sensor histidine-kinase and YycF (WalR/VicR is the cognate response regulator. Both proteins play an important role in the biosynthesis of the cell envelope and mutations in these proteins have been involved in development of vancomycin and daptomycin resistance. METHODOLOGY/PRINCIPAL FINDINGS: Here we present high yield expression and purification of the full-length YycG and YycF proteins as well as of the auxiliary proteins YycH and YycI of Staphylococcus aureus. Activity tests of the YycG kinase and a mutated version, that harbours an Y306N exchange in its cytoplasmic PAS domain, in a detergent-micelle-model and a phosholipid-liposome-model showed kinase activity (autophosphorylation and phosphoryl group transfer to YycF only in the presence of elevated concentrations of alkali salts. A direct comparison of the activity of the kinases in the liposome-model indicated a higher activity of the mutated YycG kinase. Further experiments indicated that YycG responds to fluidity changes in its microenvironment. CONCLUSIONS/SIGNIFICANCE: The combination of high yield expression, purification and activity testing of membrane and membrane-associated proteins provides an excellent experimental basis for further protein-protein interaction studies and for identification of all signals received by the YycFGHI system.

  20. Ground-based verification and data processing of Yutu rover Active Particle-induced X-ray Spectrometer

    CERN Document Server

    Guo, Dongya; Peng, Wenxi; Cui, Xingzhu; Zhang, Chengmo; Liu, Yaqing; Liang, Xiaohua; Dong, Yifan; Wang, Jinzhou; Gao, Min; Yang, Jiawei; Zhang, Jiayu; Li, Chunlai; Zou, Yongliao; Zhang, Guangliang; Zhang, Liyan; Fu, Xiaohui

    2015-01-01

    The Active Particle-induced X-ray Spectrometer (APXS) is one of the payloads on board the Yutu rover of the Chang'E-3 mission. In order to assess the instrumental performance of the APXS, a ground verification test was performed on two unknown samples (a basaltic rock and a mixed powder sample). In this paper, the details of the experimental configuration and the data analysis method are presented. The results show that the abundances of major elements can be well determined by the APXS, with relative deviations < 15 wt.% (detection distance = 30 mm, acquisition time = 30 min). The derived detection limit of each major element is inversely proportional to acquisition time and directly proportional to detection distance, suggesting that the appropriate distance should be < 50 mm.
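
The stated scaling of the detection limit can be written down directly as a hedged sketch: directly proportional to detection distance, inversely proportional to acquisition time. The proportionality constant k is a made-up placeholder that would be fitted per element from calibration data.

```python
# Hedged sketch of the reported APXS detection-limit scaling; k is invented.

def detection_limit(distance_mm, time_min, k=1.0):
    """DL = k * d / t, in arbitrary units."""
    return k * distance_mm / time_min

dl_ref  = detection_limit(30, 30)   # reference geometry used in the test
dl_far  = detection_limit(60, 30)   # doubling the distance doubles the limit
dl_long = detection_limit(30, 60)   # doubling the time halves it
```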

  1. Translating activity diagram from duration calculus for modeling of real-time systems and its formal verification using UPPAAL and DiVinE

    International Nuclear Information System (INIS)

    Real-time systems (RTS) are widely used in industry, home appliances, life-saving systems, aircraft, and automatic weapons. These systems demand high accuracy, safety, and reliability, and accurate graphical modeling and verification of such systems is genuinely challenging. Formal methods make it possible to model such systems with greater accuracy. In this paper, we present a strategy to overcome the inadequacy of SysML (System Modeling Language) for modeling and verification of RTS, and illustrate the framework by applying it to a case study of a fuel-filling machine. We define Duration Calculus (DC) implementation-based formal semantics to specify the functionality of RTS. The activity diagram is then generated from these semantics. Finally, the graphical model is verified using the UPPAAL and DiVinE model checkers to validate timed and untimed properties with accelerated verification speed. Our results support the use of the methodology for modeling and verification of large-scale real-time systems with reduced verification cost. (author)

  2. Telomerase activity is increased and telomere length shortened in T cells from blood of patients with atopic dermatitis and psoriasis

    DEFF Research Database (Denmark)

    Wu, Kehuai; Higashi, N; Hansen, E R;

    2000-01-01

    We studied telomerase activity and telomere length in PBMC and purified CD4(+) and CD8(+) T cells from blood obtained from a total of 32 patients with atopic dermatitis, 16 patients with psoriasis, and 30 normal controls. The telomerase activity was significantly increased in PBMC from the patients......(+) T cell subsets from normal donors. In conclusion, the increased telomerase activity and shortened telomere length indicates that T lymphocytes in atopic dermatitis and psoriasis are chronically stimulated and have an increased cellular turnover in vivo....

  3. Estimation of active force-length characteristics of human vastus lateralis muscle.

    Science.gov (United States)

    Ichinose, Y; Kawakami, Y; Ito, M; Fukunaga, T

    1997-01-01

    The length and angles of fascicles were determined for the vastus lateralis muscle (VL) using ultrasonography in 6 subjects performing ramp isometric knee extension. The subjects increased torque from zero (relaxed) to maximum (MVC) with the knee positioned every 15 degrees from 10 degrees to 100 degrees of flexion (0 degrees = full extension). As the knee was positioned closer to extension, fascicle length was shorter [116 +/- 4.7 (mean +/- SEM) mm at 100 degrees vs. 88 +/- 4.1 mm at 10 degrees (relaxed)]. The fascicle length of the VL decreased with increasing torque at each knee position [116 +/- 4.7 (relaxed) to 92 +/- 4.3 mm (MVC) at 100 degrees]. On the other hand, fascicle angles increased with increasing torque. These changes reflect the compliance of the muscle-tendon complex, which increased as the knee approached a straight position. The estimated muscle force of the VL was maximal (2,052 +/- 125 N) at a fascicle length of 78 +/- 2.7 mm (i.e., optimum length), with the knee positioned at 70 degrees of flexion. The relationship between muscle force and fascicle length indicated that the VL uses the ascending limb (knee 70 degrees) of the force-length curve.
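
Locating an optimum length from sampled (length, force) data of this kind can be sketched with a standard numerical trick (the numbers below are synthetic, chosen only to peak near the reported 78 mm; they are not the study's data): fit a parabola through three samples bracketing the peak and read off its vertex.

```python
# Synthetic illustration: optimum fascicle length as the vertex of the
# parabola through three (length, force) samples bracketing the peak.

def parabola_vertex_x(p0, p1, p2):
    """x-coordinate of the vertex of the parabola through three points."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2**2 * (y0 - y1) + x1**2 * (y2 - y0) + x0**2 * (y1 - y2)) / denom
    return -b / (2 * a)

# Invented force samples (N) at fascicle lengths (mm) around a peak at 78 mm:
opt_len = parabola_vertex_x((70, 1988), (78, 2052), (90, 1908))
```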

  4. Telomere Length in Peripheral Blood Mononuclear Cells of Patients on Chronic Hemodialysis Is Related With Telomerase Activity and Treatment Duration.

    Science.gov (United States)

    Stefanidis, Ioannis; Voliotis, Georgios; Papanikolaou, Vassilios; Chronopoulou, Ioanna; Eleftheriadis, Theodoros; Kowald, Axel; Zintzaras, Elias; Tsezou, Aspasia

    2015-09-01

    Telomere shortening to a critical limit is associated with replicative senescence. This process is prevented by the enzyme telomerase. Oxidative stress and chronic inflammation are factors accelerating telomere loss. Chronic hemodialysis, typically accompanied by oxidative stress and inflammation, may be also associated with replicative senescence. To test this hypothesis, we determined telomere length and telomerase activity in peripheral blood mononuclear cells (PBMCs) in a cross-sectional study. Hemodialysis patients at the University Hospital Larissa and healthy controls were studied. Telomere length was determined by the TeloTAGGG Telomere Length Assay and telomerase activity by Telomerase PCR-ELISA (Roche Diagnostics GmbH, Mannheim, Germany). We enrolled 43 hemodialysis patients (17 females; age 65.0 ± 12.7 years) and 23 controls (six females; age 62.1 ± 15.7 years). Between the two groups, there was no difference in telomere length (6.95 ± 3.25 vs. 7.31 ± 1.96 kb; P = 0.244) or in telomerase activity (1.82 ± 2.91 vs. 2.71 ± 3.0; P = 0.085). Telomere length correlated inversely with vintage of hemodialysis (r = -0.332, P = 0.030). In hemodialysis patients, positive telomerase activity correlated with telomere length (r = 0.443, P = 0.030). Only age, and neither telomere length nor telomerase activity, was an independent survival predictor (hazard ratio 1.116, 95% confidence interval 1.009-1.234, P = 0.033). In this study, telomere length and telomerase activity in PBMCs are not altered in hemodialysis patients compared with healthy controls. Long duration of hemodialysis treatment is associated with telomere shortening and positive telomerase activity with an increased telomere length in PBMCs of hemodialysis patients. The underlying mechanism and clinical implications of our findings require further investigation.

  5. Telomerase activity is increased and telomere length shortened in T cells from blood of patients with atopic dermatitis and psoriasis

    DEFF Research Database (Denmark)

    Wu, Kehuai; Higashi, H; Hansen, E R;

    2000-01-01

    We studied telomerase activity and telomere length in PBMC and purified CD4(+) and CD8(+) T cells from blood obtained from a total of 32 patients with atopic dermatitis, 16 patients with psoriasis, and 30 normal controls. The telomerase activity was significantly increased in PBMC from the patients...... compared with PBMC from normal donors. This increase was most pronounced in the subpopulation of CD4(+) T cells, which were significantly above the activity of the CD8(+) T cells in atopic dermatitis, psoriasis patients, and control persons. The telomere length was significantly reduced in all T cell...... subsets from both atopic dermatitis and psoriasis patients compared with normal individuals. Furthermore, the telomere length was found to be significantly shorter in CD4(+) memory T cells compared with the CD4(+) naive T cells, and both of the cell subsets from diseases were shown to be of significantly...

  6. Nuclear disarmament verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  7. Experimental verification of the control mechanism of marketing activities of small consulting organization

    OpenAIRE

    Kirushkin, A. A.

    2013-01-01

    This article presents the results of testing a new mechanism for managing the marketing activity of a small consulting organization, proposed in the author's previous article (see Journal Marketing MBA, 2013, issue 1). The testing was done by computer simulation of the organization's activity under several sets of initial data. The simulations yielded recommendations on managing the marketing activity of small consulting organizations.

  8. A "Kane's Dynamics" Model for the Active Rack Isolation System Part Two: Nonlinear Model Development, Verification, and Simplification

    Science.gov (United States)

    Beech, G. S.; Hampton, R. D.; Rupert, J. K.

    2004-01-01

    Many microgravity space-science experiments require vibratory acceleration levels that are unachievable without active isolation. The Boeing Corporation's active rack isolation system (ARIS) employs a novel combination of magnetic actuation and mechanical linkages to address these isolation requirements on the International Space Station. Effective model-based vibration isolation requires: (1) an isolation device, (2) an adequate dynamic (i.e., mathematical) model of that isolator, and (3) a suitable, corresponding controller. This Technical Memorandum documents the validation of a high-fidelity dynamic model of ARIS. The verification of this dynamics model was achieved by utilizing two commercial off-the-shelf (COTS) software tools: Deneb's ENVISION (registered trademark) and Online Dynamics' Autolev (trademark). ENVISION is a robotics software package developed for the automotive industry that employs three-dimensional computer-aided design models to facilitate both forward and inverse kinematics analyses. Autolev is a DOS-based interpreter designed, in general, to solve vector-based mathematical problems and, specifically, to solve dynamics problems using Kane's method. The simplification of this model was achieved using the small-angle theorem for the joint angle of the ARIS actuators. This simplification profoundly reduces the overall complexity of the closed-form solution, yielding one that can readily be implemented using COTS control hardware.

  9. Experimental verification of the flow characteristics of an active controlled microfluidic valve with annular boundary

    Science.gov (United States)

    Pan, Chun-Peng; Wang, Dai-Hua

    2014-03-01

    The principle and structural configuration of an active controlled microfluidic valve with annular boundary is presented in this paper. The active controlled flowrate model of the active controlled microfluidic valve with annular boundary is established. The prototypes of the active controlled microfluidic valves with annular boundaries with three different combinations of the inner and outer radii are fabricated and tested on the established experimental setup. The experimental results show that: (1) The active controlled microfluidic valve with annular boundary possesses the on/off switching and the continuous control capability of the fluid with simple structure and easy fabrication processing; (2) When the inner and outer diameters of the annular boundary are 1.5 mm and 3.5 mm, respectively, the maximum flowrate of the valve is 0.14 ml/s when the differential pressure of the inlet and outlet of the valve is 1000 Pa and the voltage applied to circular piezoelectric unimorph actuator is 100 V; (3) The established active controlled flowrate model can accurately predict the controlled flowrate of the active controlled microfluidic valves with the maximum relative error of 6.7%. The results presented in this paper lay the foundation for designing and developing the active controlled microfluidic valves with annular boundary driven by circular piezoelectric unimorph actuators.
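
The quoted figure of merit, a maximum relative error between model-predicted and measured flowrates, is straightforward to compute; the sketch below uses invented numbers (not the paper's measurements) purely to show the metric's definition.

```python
# Toy illustration of a maximum-relative-error check between a model and
# experiment; all flowrate values are invented.
predicted = [0.050, 0.090, 0.140]   # ml/s, from the flowrate model
measured  = [0.048, 0.086, 0.150]   # ml/s, from the experiment
max_rel_error = max(abs(p - m) / m for p, m in zip(predicted, measured))
```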

  10. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
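
One standard solution-verification computation the passage alludes to is the observed order of accuracy from three systematically refined grids (Richardson-style extrapolation). The sketch below uses invented sample values and a refinement ratio of 2.

```python
# Observed order of accuracy from three grid solutions; sample values invented.
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """p = ln(|f_coarse - f_medium| / |f_medium - f_fine|) / ln(r)."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

# A nominally 2nd-order code: differences shrink ~4x per halving of dx.
p = observed_order(1.16, 1.04, 1.01)
```

A computed p close to the scheme's formal order is evidence that the coding and the numerics behave as designed; a large discrepancy flags a bug or an unresolved solution.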

  11. Finite element analysis of vibration-driven electro-active paper energy harvester with experimental verification

    Directory of Open Access Journals (Sweden)

    Zafar Abas

    2015-02-01

    Full Text Available In this research work, a coupled-field finite element model of electro-active paper energy harvester is presented, and the results are verified experimentally. Electro-active paper is a smart form of cellulose coated with electrodes on both sides. A finite element model was developed, and harmonic and transient analyses were performed using a commercial finite element analysis package. Two 80 mm × 50 mm and 100 mm × 50 mm aluminum cantilever benders bonded with electro-active paper were tested to validate the finite element model results. Displacement and voltage generated by the energy harvester at the electrode surfaces were measured. The electro-active paper energy harvesters were excited at their fundamental resonance frequencies by a sinusoidal force located 18 mm from the free end. The voltage obtained from the 80 mm × 50 mm and 100 mm × 50 mm electro-active paper energy harvester finite element model was 3.7 and 7 mV, respectively. Experimental results have shown good agreement with the finite element model. The direct piezoelectric effect of electro-active paper shows potential for a cellulose-based eco-friendly energy harvester.

  12. Simulated sudden increase in geomagnetic activity and its effect on heart rate variability: Experimental verification of correlation studies

    Science.gov (United States)

    Caswell, Joseph M.; Singh, Manraj; Persinger, Michael A.

    2016-08-01

    Previous research investigating the potential influence of geomagnetic factors on human cardiovascular state has tended to converge upon similar inferences although the results remain relatively controversial. Furthermore, previous findings have remained essentially correlational without accompanying experimental verification. An exception to this was noted for human brain activity in a previous study employing experimental simulation of sudden geomagnetic impulses in order to assess correlational results that had demonstrated a relationship between geomagnetic perturbations and neuroelectrical parameters. The present study employed the same equipment in a similar procedure in order to validate previous findings of a geomagnetic-cardiovascular dynamic with electrocardiography and heart rate variability measures. Results indicated that potential magnetic field effects on frequency components of heart rate variability tended to overlap with previous correlational studies where low frequency power and the ratio between low and high frequency components of heart rate variability appeared affected. In the present study, a significant increase in these particular parameters was noted during geomagnetic simulation compared to baseline recordings.
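
The LF and HF components of heart rate variability mentioned above are band powers of an evenly resampled RR-interval series; the sketch below illustrates the computation on a synthetic toy signal (not ECG data) using a naive DFT, with the conventional LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) band edges.

```python
# Synthetic HRV band-power sketch: LF/HF ratio of a toy tachogram via naive DFT.
import cmath
import math

fs, n = 4.0, 512                      # 4 Hz resampling, 128 s record
f_lf = 13 * fs / n                    # ~0.102 Hz, on a DFT bin inside the LF band
f_hf = 39 * fs / n                    # ~0.305 Hz, on a DFT bin inside the HF band
x = [0.05 * math.sin(2 * math.pi * f_lf * i / fs) +
     0.02 * math.sin(2 * math.pi * f_hf * i / fs) for i in range(n)]

def band_power(x, fs, lo, hi):
    """Sum of |X_k|^2 over DFT bins whose frequency lies in [lo, hi)."""
    n = len(x)
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f < hi:
            X = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
            total += abs(X) ** 2
    return total

lf = band_power(x, fs, 0.04, 0.15)
hf = band_power(x, fs, 0.15, 0.40)
lf_hf_ratio = lf / hf                 # ~ (0.05/0.02)^2 = 6.25 for this toy signal
```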

  13. Experimental verification and molecular basis of active immunization against fungal pathogens in termites.

    Science.gov (United States)

    Liu, Long; Li, Ganghua; Sun, Pengdong; Lei, Chaoliang; Huang, Qiuying

    2015-10-13

    Termites are constantly exposed to many pathogens when they nest and forage in the field, so they employ various immune strategies to defend against pathogenic infections. Here, we demonstrate that the subterranean termite Reticulitermes chinensis employs active immunization to defend against the entomopathogen Metarhizium anisopliae. Our results showed that allogrooming frequency increased significantly between fungus-treated termites and their nestmates. Through active social contact, previously healthy nestmates only received small numbers of conidia from fungus-treated individuals. These nestmates experienced low-level fungal infections, resulting in low mortality and apparently improved antifungal defences. Moreover, infected nestmates promoted the activity of two antioxidant enzymes (SOD and CAT) and upregulated the expression of three immune genes (phenoloxidase, transferrin, and termicin). We found 20 differentially expressed proteins associated with active immunization in R. chinensis through iTRAQ proteomics, including 12 stress response proteins, six immune signalling proteins, and two immune effector molecules. Subsequently, two significantly upregulated (60S ribosomal protein L23 and isocitrate dehydrogenase) and three significantly downregulated (glutathione S-transferase D1, cuticle protein 19, and ubiquitin conjugating enzyme) candidate immune proteins were validated by MRM assays. These findings suggest that active immunization in termites may be regulated by different immune proteins.

  14. The changes in telomerase activity and telomere length in HeLa cells undergoing apoptosis induced by sodium butyrate

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The changes in telomerase activity and telomere length during apoptosis of HeLa cells induced by sodium butyrate (SB) have been studied. After a 48 h SB treatment, HeLa cells demonstrated characteristic apoptotic hallmarks, including chromatin condensation, formation of apoptotic bodies, and DNA laddering caused by the cleavage and degradation of DNA between nucleosomes. There were no significant changes in the telomerase activity of apoptotic cells, while telomere length shortened markedly. Meanwhile, after telomerase activity was inhibited, cells became more susceptible to apoptotic stimuli and telomeres became more vulnerable to degradation. All the results suggest that SB-induced apoptosis is closely related to telomere shortening, while telomerase enhances the resistance of HeLa cells to apoptotic stimuli by protecting the telomere.

  15. Verification of a characterization method of the laser-induced selective activation based on industrial lasers

    DEFF Research Database (Denmark)

    Zhang, Yang; Hansen, Hans Nørgaard; Tang, Peter T.;

    2013-01-01

    In this article, laser-induced selective activation (LISA) for subsequent autocatalytic copper plating is performed by several types of industrial scale lasers, including a Nd:YAG laser, a UV laser, a fiber laser, a green laser, and a short pulsed laser. Based on analysis of all the laser-machine...

  16. An Activation-Verification Model for Letter and Word Recognition: The Word-Superiority Effect.

    Science.gov (United States)

    Paap, Kenneth R.; And Others

    1982-01-01

    An encoding algorithm uses empirically determined confusion matrices to activate units in an alphabetum and a lexicon to predict performance of word, orthographically regular nonword, or irregular nonword recognition. Performance is enhanced when decisions are based on lexical information which constrains test letter identity. Word prediction…

  17. Implementation and Verification of an Application with Active Path Selection in IPv6 Environment

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The paper proposes a function that allows users to actively select the path to a destination, and implements it in an FTP application by using the routing header defined in the IPv6 specifications. It also experimentally develops a "support system for quality selection", which assists path selection by providing information on each candidate path. The details of the function and an implementation of the application are also described.

  18. Biochemical Activities of the Wiskott-Aldrich Syndrome Homology Region 2 Domains of Sarcomere Length Short (SALS) Protein.

    Science.gov (United States)

    Tóth, Mónika Ágnes; Majoros, Andrea Kinga; Vig, Andrea Teréz; Migh, Ede; Nyitrai, Miklós; Mihály, József; Bugyi, Beáta

    2016-01-01

    Drosophila melanogaster sarcomere length short (SALS) is a recently identified Wiskott-Aldrich syndrome protein homology 2 (WH2) domain protein involved in skeletal muscle thin filament regulation. SALS was shown to be important for the establishment of the proper length and organization of sarcomeric actin filaments. Here, we present the first detailed characterization of the biochemical activities of the tandem WH2 domains of SALS (SALS-WH2). Our results revealed that SALS-WH2 binds both monomeric and filamentous actin and shifts the monomer-filament equilibrium toward the monomeric actin. In addition, SALS-WH2 can bind to but fails to depolymerize phalloidin- or jasplakinolide-bound actin filaments. These interactions endow SALS-WH2 with the following two major activities in the regulation of actin dynamics: SALS-WH2 sequesters actin monomers into non-polymerizable complexes and enhances actin filament disassembly by severing, which is modulated by tropomyosin. We also show that profilin does not influence the activities of the WH2 domains of SALS in actin dynamics. In conclusion, the tandem WH2 domains of SALS are multifunctional regulators of actin dynamics. Our findings suggest that the activities of the WH2 domains do not reconstitute the presumed biological function of the full-length protein. Consequently, the interactions of the WH2 domains of SALS with actin must be tuned in the cellular context by other modules of the protein and/or sarcomeric components for its proper functioning.

  19. ANATOMY OF SOLAR CYCLE LENGTH AND SUNSPOT NUMBER: DEPENDENCE OF AVERAGE GLOBAL TEMPERATURE ON SOLAR ACTIVITY

    OpenAIRE

    Bhattacharya, A. B.; B. RAHA; Das, T.; M. Debnath; D. HALDER

    2011-01-01

    The paper thoroughly examines all of the past 23 sunspot cycles and the associated 11 Hale cycles. It is noticed that solar cycle 23 had a deep minimum with the longest decline phase. When solar cycles 20 to 23 are compared with solar cycles 1 to 4, the forthcoming Dalton minimum can be expected. The predicted variation of sunspot number for the present solar cycle 24 is examined at length, and it appears that the peak monthly sunspot number of solar cycle 24 will be around 80. We have correlated...

  20. North Atlantic Basin Tropical Cyclone Activity in Relation to Temperature and Decadal- Length Oscillation Patterns

    Science.gov (United States)

    Wilson, Robert M.

    2009-01-01

    Yearly frequencies of North Atlantic basin tropical cyclones, their locations of origin, peak wind speeds, average peak wind speeds, lowest pressures, and average lowest pressures for the interval 1950-2008 are examined. The effects of El Nino and La Nina on the tropical cyclone parametric values are investigated. Yearly and 10-year moving average (10-yma) values of tropical cyclone parameters are compared against those of temperature and decadal-length oscillation, employing both linear and bivariate analysis, and first differences in the 10-yma are determined. Discussion of the 2009 North Atlantic basin hurricane season, updating earlier results, is given.

  1. Thermodynamic compatibility of actives encapsulated into PEG-PLA nanoparticles: In Silico predictions and experimental verification.

    Science.gov (United States)

    Erlebach, Andreas; Ott, Timm; Otzen, Christoph; Schubert, Stephanie; Czaplewska, Justyna; Schubert, Ulrich S; Sierka, Marek

    2016-09-15

    Achieving optimal solubility of active substances in polymeric carriers is of fundamental importance for a number of industrial applications, including targeted drug delivery within the growing field of nanomedicine. However, its experimental optimization using a trial-and-error approach is cumbersome and time-consuming. Here, an approach based on molecular dynamics (MD) simulations and the Flory-Huggins theory is proposed for rapid prediction of thermodynamic compatibility between active species and copolymers comprising hydrophilic and hydrophobic segments. In contrast to similar methods, our approach offers high computational efficiency by employing MD simulations that avoid explicit consideration of the actual copolymer chains. The accuracy of the method is demonstrated for compatibility predictions between pyrene and Nile red as model dyes, as well as indomethacin as a model drug, and copolymers containing blocks of poly(ethylene glycol) and poly(lactic acid) in different ratios. The results of the simulations are directly verified by comparison with the observed encapsulation efficiency of nanoparticles prepared by nanoprecipitation. © 2016 Wiley Periodicals, Inc. PMID:27425625
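The screening criterion above can be made concrete through the Flory-Huggins interaction parameter χ. The sketch below is a deliberately simplified stand-in for the paper's MD-based estimation: it computes χ from Hildebrand solubility parameters instead, and every numerical value (reference volume, solubility parameters) is an illustrative assumption, not data from the study.

```python
# Illustrative sketch: Flory-Huggins compatibility estimate from Hildebrand
# solubility parameters (a crude stand-in for the MD-derived chi in the study).
R = 8.314  # gas constant, J/(mol*K)

def flory_huggins_chi(delta_solute, delta_polymer, v_ref=1.0e-4, T=298.15):
    """chi = v_ref * (delta1 - delta2)^2 / (R*T).

    delta_* : Hildebrand solubility parameters in MPa^0.5 (assumed inputs)
    v_ref   : reference molar volume in m^3/mol (assumed value)
    """
    # (MPa^0.5)^2 = MPa = 1e6 J/m^3, hence the 1e6 unit conversion
    return v_ref * 1.0e6 * (delta_solute - delta_polymer) ** 2 / (R * T)

# Hypothetical solubility parameters (MPa^0.5); not taken from the paper.
chi_pla = flory_huggins_chi(23.6, 20.2)  # drug vs. a PLA-like (hydrophobic) block
chi_peg = flory_huggins_chi(23.6, 22.9)  # drug vs. a PEG-like (hydrophilic) block

print(f"chi(drug, PLA-like block) = {chi_pla:.3f}")
print(f"chi(drug, PEG-like block) = {chi_peg:.3f}")
```

Smaller χ means better thermodynamic compatibility, with χ ≈ 0.5 a common rough miscibility threshold for a small molecule in a high-molecular-weight polymer; evaluating χ against each block separately mirrors the block-by-block screening described in the abstract.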

  2. Association of day length and weather conditions with physical activity levels in older community dwelling people

    OpenAIRE

    Witham, Miles D; Donnan, Peter T.; Thenmalar Vadiveloo; Sniehotta, Falko F.; Crombie, Iain K; Zhiqiang Feng; McMurdo, Marion E. T.

    2014-01-01

    BACKGROUND: Weather is a potentially important determinant of physical activity. Little work has been done examining the relationship between weather and physical activity, and potential modifiers of any relationship in older people. We therefore examined the relationship between weather and physical activity in a cohort of older community-dwelling people. METHODS: We analysed prospectively collected cross-sectional activity data from community-dwelling people aged 65 and over in the Physical...

  3. Availability verification of information for human system interface in automatic SG level control using activity diagram

    Energy Technology Data Exchange (ETDEWEB)

    Nuraslinda, Anuar; Kim, Dong Young; Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Uljugun (Korea, Republic of)

    2012-10-15

    The Steam Generator (SG) level control system in OPR1000 is a representative automatic system that falls under the Supervisory Control level in Endsley's taxonomy. Supervisory control of automated systems is classified as a form of out-of-the-loop (OOTL) performance due to passive involvement in the system's operation, which can lead to loss of situation awareness (SA). A reported event caused by inadequate human-automation communication contributed to an unexpected reactor trip in July 2005: a high SG level trip occurred at Yeonggwang (YGN) Unit 6 Nuclear Power Plant (NPP) because the operator failed to recognize the need to change the control mode of the economizer valve controller (EVC) to manual during swap-over (the transition from low-power mode to high-power mode) after the loss of offsite power (LOOP) event was recovered. This paper models the human-system interaction in the NPP SG level control system using a Unified Modeling Language (UML) activity diagram. It then identifies the information missing for operators in the OPR1000 Main Control Room (MCR) and suggests some means of improving the human-system interaction.

  4. Validation and verification of a deterministic model for corrosion and activity incorporation using operational data from Kozloduy NPP

    Energy Technology Data Exchange (ETDEWEB)

    Betova, I. [Technical Univ. of Sofia, Sofia (Bulgaria); Bojinov, M. [Univ. of Chemical Technology and Metallurgy, Sofia (Bulgaria); Minkova, K. [Kozloduy Nuclear Power Plant, Kozloduy (Bulgaria)

    2010-07-01

    An integrated deterministic model of corrosion and activity incorporation in the construction materials of the primary circuit of light water reactors based on fundamental physico-chemical processes has been recently proposed. The calculational procedure of the model yields reliable estimates of the kinetic and transport parameters of growth and restructuring of inner and outer oxide layers on austenitic steels (AISI 304, 0X18H10T, AISI 316, A800) and nickel alloys (A600 and A690) via a quantitative comparison of the model equations with electrochemical data on the conduction mechanism and ex-situ analytical information on the thickness and in-depth chemical composition of the oxides on such materials stemming from both laboratory and in-pile BWR, PWR and WWER reactor experience. As a result, a large database of kinetic and transport parameters makes it possible to predict the kinetics of growth and restructuring of the oxides, as well as corrosion release from the construction materials in the primary circuit, by using data on the water chemistry and radioactive corrosion products in the coolant and the piping surfaces. Predictions on the incorporation of radioactive corrosion products during oxide growth and restructuring are also made. In the present communication, the validation and verification of the model using data for the primary circuit water chemistry and radioactive corrosion products from both reactors of the Kozloduy NPP are discussed. The calculations are in good agreement with the available experimental data and allow for reliable long-term predictions of the corrosion processes and radioactivity accumulation in the primary circuit of both reactors of Kozloduy NPP to be obtained. (author)

  5. The length of a lantibiotic hinge region has profound influence on antimicrobial activity and host specificity

    Directory of Open Access Journals (Sweden)

    Liang eZhou

    2015-01-01

    Lantibiotics are ribosomally synthesized (methyl)lanthionine containing peptides which can efficiently inhibit the growth of Gram-positive bacteria. As lantibiotics kill bacteria efficiently and resistance to them is difficult to obtain, they have the potential to be used in many applications, e.g., in the pharmaceutical or food industries. Nisin can inhibit the growth of Gram-positive bacteria by binding to lipid II and by making pores in their membrane. The C-terminal part of nisin is known to play an important role during translocation over the membrane and forming pore complexes. However, as the thickness of bacterial membranes varies between different species and environmental conditions, this property could have an influence on the pore forming activity of nisin. To investigate this, the so-called hinge region of nisin (residues NMK) was engineered to vary from one to six amino acid residues and specific activity against different indicators was compared. Antimicrobial activity in liquid culture assays showed that wild type nisin is most active, while truncation of the hinge region dramatically reduced the activity of the peptide. However, one or two amino acid extensions showed only slightly reduced activity against most indicator strains. Notably, some variants (+2, +1, -1, -2) exhibited higher antimicrobial activity than nisin in agar well diffusion assays against Lactococcus lactis MG1363, Listeria monocytogenes, Enterococcus faecalis VE14089, Bacillus sporothermodurans IC4 and Bacillus cereus 4153 at certain temperatures.

  6. The length of a lantibiotic hinge region has profound influence on antimicrobial activity and host specificity.

    Science.gov (United States)

    Zhou, Liang; van Heel, Auke J; Kuipers, Oscar P

    2015-01-01

    Lantibiotics are ribosomally synthesized (methyl)lanthionine containing peptides which can efficiently inhibit the growth of Gram-positive bacteria. As lantibiotics kill bacteria efficiently and resistance to them is difficult to obtain, they have the potential to be used in many applications, e.g., in the pharmaceutical or food industries. Nisin can inhibit the growth of Gram-positive bacteria by binding to lipid II and by making pores in their membrane. The C-terminal part of nisin is known to play an important role during translocation over the membrane and forming pore complexes. However, as the thickness of bacterial membranes varies between different species and environmental conditions, this property could have an influence on the pore forming activity of nisin. To investigate this, the so-called "hinge region" of nisin (residues NMK) was engineered to vary from one to six amino acid residues and specific activity against different indicators was compared. Antimicrobial activity in liquid culture assays showed that wild type nisin is most active, while truncation of the hinge region dramatically reduced the activity of the peptide. However, one or two amino acid extensions showed only slightly reduced activity against most indicator strains. Notably, some variants (+2, +1, -1, -2) exhibited higher antimicrobial activity than nisin in agar well diffusion assays against Lactococcus lactis MG1363, Listeria monocytogenes, Enterococcus faecalis VE14089, Bacillus sporothermodurans IC4 and Bacillus cereus 4153 at certain temperatures.

  7. The chain length of biologically produced (R)-3-hydroxyalkanoic acid affects biological activity and structure of anti-cancer peptides.

    Science.gov (United States)

    Szwej, Emilia; Devocelle, Marc; Kenny, Shane; Guzik, Maciej; O'Connor, Stephen; Nikodinovic-Runic, Jasmina; Radivojevic, Jelena; Maslak, Veselin; Byrne, Annete T; Gallagher, William M; Zulian, Qun Ren; Zinn, Manfred; O'Connor, Kevin E

    2015-06-20

    Conjugation of DP18L peptide with (R)-3-hydroxydecanoic acid, derived from the biopolymer polyhydroxyalkanoate, enhances its anti-cancer activity (O'Connor et al., 2013. Biomaterials 34, 2710-2718). However, it is unknown if other (R)-3-hydroxyalkanoic acids (R3HAs) can enhance peptide activity, if chain length affects enhancement, and what effect R3HAs have on peptide structure. Here we show that the degree of enhancement of peptide (DP18L) anti-cancer activity by R3HAs is carbon chain length dependent. In all but one example the R3HA conjugated peptides were more active against cancer cells than the unconjugated peptides. However, R3HAs with 9 and 10 carbons were most effective at improving DP18L activity. DP18L peptide variant DP17L, missing a hydrophobic amino acid (leucine residue 4), exhibited lower efficacy against MiaPaCa cells. Circular dichroism analysis showed DP17L had a lower alpha-helix content, and the conjugation of any R3HA ((R)-3-hydroxyhexanoic acid to (R)-3-hydroxydodecanoic acid) to DP17L returned the helix content back to the levels of DP18L. However, (R)-3-hydroxyhexanoic acid did not enhance the anti-cancer activity of DP17L, and at least 7 carbons were needed in the R3HA to enhance the activity of DP17L. DP17L needs a longer chain R3HA to achieve the same activity as DP18L conjugated to an R3HA. As a first step to assess the synthetic potential of polyhydroxyalkanoate derived R3HAs, (R)-3-hydroxydecanoic acid was synthetically converted to (±)3-chlorodecanoic acid, which when conjugated to DP18L improved its antiproliferative activity against MiaPaCa cells. PMID:25820126

  8. Field Test and Performance Verification: Integrated Active Desiccant Rooftop Hybrid System Installed in a School - Final Report: Phase 4A

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J

    2005-12-21

    This report summarizes the results of a field verification pilot site investigation that involved the installation of a hybrid integrated active desiccant/vapor-compression rooftop heating, ventilation, and air-conditioning (HVAC) unit at an elementary school in the Atlanta, Georgia, area. For years, the school had experienced serious humidity and indoor air quality (IAQ) problems that had resulted in occupant complaints and microbial (mold) remediation. The outdoor air louvers of the original HVAC units had been closed in an attempt to improve humidity control within the space. The existing vapor compression variable air volume system was replaced by the integrated active desiccant rooftop (IADR) system that was described in detail in an Oak Ridge National Laboratory (ORNL) report published in 2004 (Fischer and Sand 2004). The IADR system and all space conditions have been monitored remotely for more than a year. The hybrid system was able to maintain both the space temperature and humidity as desired while delivering the outdoor air ventilation rate required by American Society of Heating, Refrigerating and Air-Conditioning Engineers Standard 62. The performance level of the IADR unit and the overall system energy efficiency was measured and found to be very high. A comprehensive IAQ investigation was completed by the Georgia Tech Research Institute before and after the system retrofit. Before-and-after data resulting from this investigation confirmed a significant improvement in IAQ, humidity control, and occupant comfort. These observations were reported by building occupants and are echoed in a letter to ORNL from the school district energy manager. The IADR system was easily retrofitted in place of the original rooftop system using a custom curb adapter. All work was completed in-house by the school's maintenance staff over one weekend. A subsequent cost analysis completed for the school district by the design engineer of record concluded that the IADR

  9. Modulating anti-MicroRNA-21 activity and specificity using oligonucleotide derivatives and length optimization

    DEFF Research Database (Denmark)

    Munoz-Alarcon, Andres; Guterstam, Peter; Romero, Cristian;

    2012-01-01

    MicroRNAs are short, endogenous RNAs that direct posttranscriptional regulation of gene expression vital for many developmental and cellular functions. Implicated in the pathogenesis of several human diseases, this group of RNAs provides interesting targets for therapeutic intervention. Anti-microRNA oligonucleotides constitute a class of synthetic antisense oligonucleotides used to interfere with microRNAs. In this study, we investigate the effects of chemical modifications and truncations on activity and specificity of anti-microRNA oligonucleotides targeting microRNA-21. We observed an increased activity...

  10. Honey, I Shrunk the DNA : DNA Length as a Probe for Nucleic-Acid Enzyme Activity

    NARCIS (Netherlands)

    Oijen, Antoine M. van

    2007-01-01

    The replication, recombination, and repair of DNA are processes essential for the maintenance of genomic information and require the activity of numerous enzymes that catalyze the polymerization or digestion of DNA. This review will discuss how differences in elastic properties between single- and d

  11. Activated carbon load equalization of gas-phase toluene: effect of cycle length and fraction of time in loading

    Energy Technology Data Exchange (ETDEWEB)

    William M. Moe; Kodi L. Collins; John D. Rhodes [Louisiana State University, Baton Rouge, LA (United States). Department of Civil and Environmental Engineering

    2007-08-01

    Fluctuating pollutant concentrations pose challenges in the design and operation of air pollution control devices such as biofilters. Effective load equalization could decrease or eliminate many of these difficulties. In research described here, experiments were conducted to evaluate effects of cycle length and fraction of time contaminants are supplied on the degree of load equalization achieved by passively operated granular activated carbon (GAC) beds. Columns packed with bituminous coal based Calgon BPL 4 x 6 mesh GAC were subjected to a variety of cyclic loading conditions in which toluene was supplied at concentrations of 1000 or 250 ppmv during loading intervals, and uncontaminated air flowed through the columns during no-loading intervals. The fraction of time when toluene was supplied ranged from 1/2 to 1/6, and cycle lengths ranged from 6 to 48 h. Results demonstrate that passively operated GAC columns can temporarily accumulate contaminants during intervals of high influent concentration and desorb contaminants during intervals of no loading, resulting in appreciable load equalization without need for external regeneration by heating or other means. Greater load equalization was achieved as the fraction of time toluene was loaded decreased and as the cycle length decreased. A pore and surface diffusion model, able to predict the level of contaminant concentration attenuation in GAC columns with reasonable accuracy, was used to further explore the range of load equalization performance expected from columns of various packed bed depths. 19 refs., 6 figs., 1 tab.
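The load-equalization mechanism described here (adsorb during high-concentration intervals, desorb during no-load intervals) can be illustrated with a far simpler model than the pore-and-surface-diffusion model used in the study: a single well-mixed bed with a linear isotherm and a linear-driving-force uptake term. All parameter values below are invented for illustration and are not taken from the paper.

```python
# Toy load-equalization sketch: a square-wave toluene feed passing through a
# single well-mixed GAC bed with a linear isotherm and linear-driving-force
# uptake. This is NOT the pore-and-surface-diffusion model of the study;
# every parameter value is an illustrative assumption.

def simulate_outlet(cycle_h=6.0, load_frac=0.5, c_in_high=1000.0,
                    hours=120.0, dt=0.002):
    K = 50.0     # linear isotherm slope, dimensionless (illustrative)
    k = 0.5      # linear-driving-force rate constant, 1/h (illustrative)
    beta = 10.0  # sorbent-to-gas capacity ratio (illustrative)
    tau = 0.01   # gas-phase residence time, h (illustrative)
    c, q = 0.0, 0.0              # outlet concentration (ppmv), solid loading
    outlet = []
    steps = int(hours / dt)
    for i in range(steps):
        t = i * dt
        # square-wave inlet: on for load_frac of each cycle, off otherwise
        c_in = c_in_high if (t % cycle_h) < load_frac * cycle_h else 0.0
        drive = k * (K * c - q)  # driving force toward sorption equilibrium
        c += ((c_in - c) / tau - beta * drive) * dt  # gas-phase mass balance
        q += drive * dt                              # solid-phase mass balance
        if t > hours - 2 * cycle_h:  # record only the last (quasi-steady) cycles
            outlet.append(c)
    return outlet

out = simulate_outlet()
# The bed loads during the high-concentration interval and desorbs during the
# no-load interval, so outlet swings are attenuated relative to the
# 0-1000 ppmv inlet square wave.
print(f"outlet range: {min(out):.0f}-{max(out):.0f} ppmv")
```

Shortening `cycle_h` in this toy model damps the outlet swings further, qualitatively matching the study's finding that shorter cycles give greater load equalization.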

  12. NMR characterization of full-length farnesylated and non-farnesylated H-Ras and its implications for Raf activation.

    Science.gov (United States)

    Thapar, Roopa; Williams, Jason G; Campbell, Sharon L

    2004-11-01

    The C terminus, also known as the hypervariable region (residues 166-189), of H-, N-, and K-Ras proteins has sequence determinants necessary for full activation of downstream effectors such as Raf kinase and PI-3 kinase as well as for the correct targeting of Ras proteins to lipid rafts and non-raft membranes. There is considerable interest in understanding how residues in the extreme C terminus of the different Ras proteins and farnesylation of the CaaX box cysteine affect Ras membrane localization and allosteric activation of Raf kinase. To provide insights into the structural and dynamic changes that occur in Ras upon farnesylation, we have used NMR spectroscopy to compare the properties of truncated H-Ras (1-166) with those of non-processed full-length H-Ras (residues 1-185) and full-length (1-189) farnesylated H-Ras. We report that the C-terminal helix alpha-5 extends to residue N172, and the remaining 17 amino acid residues in the C terminus are conformationally averaged in solution. Removal of either 23 or 18 amino acid residues from the C terminus of full-length H-Ras generates truncated H-Ras (1-166) and H-Ras (1-171) proteins, respectively, that have been structurally characterized and are biochemically active. Here we report that C-terminal truncation of H-Ras results in minor structural and dynamic perturbations that are propagated throughout the H-Ras protein, including increased flexibility of the central beta-sheet and the C-terminal helix alpha-5. Ordering of residues in loop-2, which is involved in Raf CRD binding, is also observed. Farnesylation of full-length H-Ras at C186 does not result in detectable conformational changes in H-Ras. Chemical shift mapping studies of farnesylated and non-farnesylated forms of H-Ras with the Raf-CRD show that the farnesyl moiety, the extreme H-Ras C terminus and residues 23-30 contribute to H-Ras:Raf-CRD interactions, thereby increasing the affinity of H-Ras for the Raf-CRD.

  13. Photocatalytic Activity and Photocurrent Properties of TiO2 Nanotube Arrays Influenced by Calcination Temperature and Tube Length

    Science.gov (United States)

    Hou, Jian; Zhang, Min; Yan, Guotian; Yang, Jianjun

    2012-06-01

    In this article, titanium oxide nanotube arrays (TiO2-NTAs) were fabricated by anodic oxidation in an ethylene glycol (EG) electrolyte solution containing 0.25 wt.% NH4F. By varying the anodization time and annealing temperature, the obtained nanotube arrays exhibited different photocatalytic (PC) activities and photocurrent properties. The samples were characterized by scanning electron microscopy (SEM) and X-ray powder diffraction (XRD). SEM images indicated that the TiO2 nanotubes form a highly ordered structure, which is, however, completely destroyed when the temperature reaches 800°C. XRD showed that TiO2 nanotubes of all lengths possessed anatase crystallites when annealed at 500°C; meanwhile, at a fixed length, TiO2-NTAs annealed over the calcination temperature range of 300-600°C also presented anatase crystallites, with crystallinity gradually enhanced as the temperature increased. At 700°C, a mixed structure was observed, consisting of predominantly anatase with a small fraction of rutile. Methyl blue (MB) degradation and photocurrent measurements testified that the TiO2-NTAs anodized for 4 h and calcined at 600°C for 3 h exhibited the highest activity and photocurrent density.

  14. Hydrophobic Side-Chain Length Determines Activity and Conformational Heterogeneity of a Vancomycin Derivative Bound to the Cell Wall of Staphylococcus aureus§

    OpenAIRE

    Kim, Sung Joon; Schaefer, Jacob

    2008-01-01

    Disaccharide modified glycopeptides with hydrophobic sidechains are active against vancomycin-resistant enterococci and vancomycin-resistant S. aureus. The activity depends on the length of the sidechain. The benzyl sidechain of N-(4-fluorobenzyl)vancomycin (FBV) has the minimal length sufficient for enhancement in activity against vancomycin-resistant pathogens. The conformation of FBV bound to the peptidoglycan in whole cells of S. aureus has been determined using rotational-echo double res...

  15. Bacterial membrane activity of α-peptide/β-peptoid chimeras: Influence of amino acid composition and chain length on the activity against different bacterial strains

    DEFF Research Database (Denmark)

    Hein-Kristensen, Line; Knapp, Kolja M; Franzyk, Henrik;

    2011-01-01

    acid only had a minor effect on MIC values, whereas chain length had a profound influence on activity. All chimeras were less active against Serratia marcescens (MICs above 46 μM). The chimeras were bactericidal and induced leakage of ATP from Staphylococcus aureus and S. marcescens with similar time of onset and reduction in the number of viable cells. EDTA pre-treatment of S. marcescens and E. coli followed by treatment with chimeras resulted in pronounced killing, indicating that disintegration of the Gram-negative outer membrane eliminated innate differences in susceptibility. Chimera chain length... of the bacterial cell envelope, and the outer membrane may act as a barrier in Gram-negative bacteria. The tolerance of S. marcescens to chimeras may be due to differences in the composition of the lipopolysaccharide layer also responsible for its resistance to polymyxin B.

  16. Linker length and flexibility induces new cellobiohydrolase activity of PoCel6A from Penicillium oxalicum.

    Science.gov (United States)

    Gao, Le; Wang, Lushan; Jiang, Xukai; Qu, Yinbo

    2015-06-01

    In a previous study, a novel cellobiohydrolase, PoCel6A, with new enzymatic activity against p-nitrophenyl-β-D-cellobioside (pNPC), was purified from Penicillium oxalicum. The cellulose-binding module and catalytic domain of PoCel6A showed a high degree of sequence similarity with other fungal Cel6As. However, PoCel6A had 11 more amino acids in the linker region than other Cel6As. To evaluate the relationship between the longer linker of PoCel6A and its enzymatic activity, 11 amino acids were deleted from the linker region of PoCel6A. Shortening the PoCel6A linker abolished the enzymatic activity against pNPC but dramatically increased the enzyme's capacity for crystalline cellulose degradation. The shortened linker appeared to have no effect on the secondary structural conformation of the enzyme. Another variant (PoCel6A-6pro), with six consecutive proline residues in the interdomain linker, had a more rigid linker and showed no enzymatic activity against either soluble or insoluble substrates. The flexibility of the linker thus plays an important role in the formation of active cellulase. The length and flexibility of the linker are clearly able to modify the function of PoCel6A and induce new characteristics of Cel6A.

  17. Conjugation of fatty acids with different lengths modulates the antibacterial and antifungal activity of a cationic biologically inactive peptide.

    Science.gov (United States)

    Malina, Amir; Shai, Yechiel

    2005-09-15

    Many studies have shown that an amphipathic structure and a threshold of hydrophobicity of the peptidic chain are crucial for the biological function of AMPs (antimicrobial peptides). However, the factors that dictate their cell selectivity are not yet clear. In the present study, we show that the attachment of aliphatic acids with different lengths (10, 12, 14 or 16 carbon atoms) to the N-terminus of a biologically inactive cationic peptide is sufficient to endow the resulting lipopeptides with lytic activity against different cells. Mode-of-action studies were performed with model phospholipid membranes mimicking those of bacterial, mammalian and fungal cells. These included determination of the structure in solution and in membranes using CD and ATR-FTIR (attenuated total reflectance Fourier-transform infrared) spectroscopy, membrane leakage experiments, and visualization of bacterial and fungal damage via transmission electron microscopy. The results obtained reveal that: (i) the short lipopeptides (10 and 12 carbon atoms) are non-haemolytic, active towards both bacteria and fungi and monomeric in solution. (ii) The long lipopeptides (14 and 16 carbon atoms) are highly antifungal, haemolytic only at concentrations above their MIC (minimal inhibitory concentration) values and aggregate in solution. (iii) All the lipopeptides adopt a partial alpha-helical structure in 1% lysophosphatidylcholine and bacterial and mammalian model membranes. However, the two short lipopeptides contain a significant fraction of random coil in fungal membranes, in agreement with their reduced antifungal activity. (iv) All the lipopeptides have a membranolytic effect on all types of cells assayed. Overall, the results reveal that the length of the aliphatic chain is sufficient to control the pathogen specificity of the lipopeptides, most probably by controlling both the overall hydrophobicity and the oligomeric state of the lipopeptides in solution. Besides providing us with basic

  18. The Ca2+-activated Cl- channel Ano1 controls microvilli length and membrane surface area in the oocyte.

    Science.gov (United States)

    Courjaret, Raphael; Hodeify, Rawad; Hubrack, Satanay; Ibrahim, Awab; Dib, Maya; Daas, Sahar; Machaca, Khaled

    2016-07-01

    Ca(2+)-activated Cl(-) channels (CaCCs) play important physiological functions in epithelia and other tissues. In frog oocytes the CaCC Ano1 regulates resting membrane potential and the block to polyspermy. Here, we show that Ano1 expression increases the oocyte surface, revealing a novel function for Ano1 in regulating cell morphology. Confocal imaging shows that Ano1 increases microvilli length, which requires ERM-protein-dependent linkage to the cytoskeleton. A dominant-negative form of the ERM protein moesin precludes the Ano1-dependent increase in membrane area. Furthermore, both full-length and the truncated dominant-negative forms of moesin co-localize with Ano1 to the microvilli, and the two proteins co-immunoprecipitate. The Ano1-moesin interaction limits Ano1 lateral membrane mobility and contributes to microvilli scaffolding, therefore stabilizing larger membrane structures. Collectively, these results reveal a newly identified role for Ano1 in shaping the plasma membrane during oogenesis, with broad implications for the regulation of microvilli in epithelia. PMID:27173493

  19. Mesenchymal stem cells with high telomerase expression do not actively restore their chromosome arm specific telomere length pattern after exposure to ionizing radiation

    DEFF Research Database (Denmark)

    Graakjaer, Jesper; Christensen, Rikke; Kolvraa, Steen;

    2007-01-01

    BACKGROUND: Previous studies have demonstrated that telomeres in somatic cells are not randomly distributed at the end of the chromosomes. We hypothesize that these chromosome arm specific differences in telomere length (the telomere length pattern) may be actively maintained. In this study we investigate the existence and maintenance of the telomere length pattern in stem cells. For this aim we studied telomere length in primary human mesenchymal stem cells (hMSC) and their telomerase-immortalised counterpart (hMSC-telo1) during extended proliferation as well as after irradiation. Telomere lengths were measured using Fluorescence In Situ Hybridization (Q-FISH). RESULTS: A telomere length pattern was found to exist in primary hMSCs as well as in hMSC-telo1. This pattern is similar to what was previously found in lymphocytes and fibroblasts. The cells were then exposed to a high dose of ionizing

  20. Flame Length

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — Flame length was modeled using FlamMap, an interagency fire behavior mapping and analysis program that computes potential fire behavior characteristics. The tool...

  1. [Synergetic Inhibitory Effect of Free Ammonia and Aeration Phase Length Control on the Activity of Nitrifying Bacteria].

    Science.gov (United States)

    Sun, Hong-wei; Lü, Xin-tao; Wei, Xue-fen; Zhao, Hua-nan; Ma, Juan; Fang, Xiao-hang

    2016-03-15

    Three sequencing batch reactors (SBRs) labeled R(Ahead), R(Exact) and R(Exceed) were employed to investigate the synergetic inhibition effect of free ammonia (FA) and aeration phase length on the activity of ammonia-oxidizing bacteria (AOB) and nitrite-oxidizing bacteria (NOB) after shortcut nitritation was achieved in the systems. The experiments were conducted under the conditions of three FA concentrations (0.5, 5.1, 10.1 mg · L⁻¹) combined with three kinds of aeration time (t(Exact): the time when ammonia oxidation was completed; t(Ahead): 30 min ahead of the time when ammonia oxidation was completed; t(Exceed): 30 min beyond the time when ammonia oxidation was completed). It was found that short-cut nitrification could be successfully established in the three reactors at an FA level of 10.1 mg · L⁻¹. Meanwhile, the speed of achieving nitritation was in the sequence R(Ahead) > R(Exact) > R(Exceed), with 56, 62 and 72 operational cycles, respectively. Compared to AOB, NOB in the three reactors were observed to be more sensitive to FA, so that AOB activity remained higher than NOB activity throughout the whole experimental period. Moreover, there was a great difference in the activity coefficient (η) between AOB and NOB. The activity coefficients of AOB were in the order η(RExact) > η(RExceed) > η(RAhead), with values of 104.4%, 100% and 85.8%, respectively. Nevertheless, the activity coefficients of NOB were in the order η(RExceed) > η(RExact) > η(RAhead), with values of 71.2%, 64.9% and 50.2%, respectively. PMID:27337903
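
    Free-ammonia concentrations of the kind applied above are conventionally calculated from total ammonia nitrogen (TAN), pH and temperature using the Anthonisen (1976) equilibrium relationship. The abstract does not state how its FA levels were derived, so the sketch below is an assumption based on that standard formula; the input values are invented for illustration.

```python
import math

def free_ammonia_mg_per_l(tan_mg_n_per_l: float, ph: float, temp_c: float) -> float:
    """Free ammonia (NH3) concentration via the Anthonisen (1976) relationship.

    tan_mg_n_per_l: total ammonia nitrogen (NH4+-N plus NH3-N), in mg N/L.
    Returns the free-ammonia concentration in mg NH3/L.
    """
    # Kb/Kw equilibrium term; temperature in kelvin approximated as 273 + T(degC).
    kb_kw = math.exp(6344.0 / (273.0 + temp_c))
    # 17/14 converts from N mass to NH3 mass.
    return (17.0 / 14.0) * tan_mg_n_per_l * 10.0 ** ph / (kb_kw + 10.0 ** ph)

# Illustrative values: FA rises steeply with pH at fixed TAN and temperature.
print(round(free_ammonia_mg_per_l(50.0, 8.0, 25.0), 2))
print(round(free_ammonia_mg_per_l(50.0, 7.0, 25.0), 2))
```

    A one-unit drop in pH reduces free ammonia by roughly an order of magnitude at this temperature, which is why pH control is central to establishing FA inhibition levels such as those studied above.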

  3. Potency of Full-Length MGF to Induce Maximal Activation of the IGF-IR Is Similar to Recombinant Human IGF-I at High Equimolar Concentrations

    OpenAIRE

    Janssen, Joseph A. M. J. L.; Hofland, Leo J.; Strasburger, Christian J.; van den Dungen, Elisabeth S. R.; Thevis, Mario

    2016-01-01

    Aims To compare full-length mechano growth factor (full-length MGF) with human recombinant insulin-like growth factor-I (IGF-I) and human recombinant insulin (HI) in their ability to activate the human IGF-I receptor (IGF-IR), the human insulin receptor (IR-A) and the human insulin receptor-B (IR-B), respectively. In addition, we tested the stimulatory activity of human MGF and its stabilized analog Goldspink-MGF on the IGF-IR. Methods The effects of full-length MGF, IGF-I, human ...

  4. Role of active contraction and tropomodulins in regulating actin filament length and sarcomere structure in developing zebrafish skeletal muscle

    Directory of Open Access Journals (Sweden)

    Lise Mazelet

    2016-03-01

    Full Text Available Whilst it is recognised that contraction plays an important part in maintaining the structure and function of mature skeletal muscle, its role during development remains undefined. In this study the role of movement in skeletal muscle maturation was investigated in intact zebrafish embryos using a combination of genetic and pharmacological approaches. An immotile mutant line (cacnb1ts25), which lacks functional voltage-gated calcium channels (dihydropyridine receptors) in the muscle, and pharmacological immobilisation of embryos with a reversible anaesthetic (Tricaine), allowed the study of paralysis (in mutants and anaesthetised fish) and recovery of movement (reversal of anaesthetic treatment). The effect of paralysis in early embryos (aged between 17-24 hours post fertilisation, hpf) on skeletal muscle structure at both the myofibrillar and myofilament level was determined using both immunostaining with confocal microscopy and small angle X-ray diffraction. The consequences of paralysis and subsequent recovery on the localisation of the actin capping proteins Tropomodulin 1 & 4 (Tmod) in fish aged from 17 hpf until 42 hpf were also assessed. The functional consequences of early paralysis were investigated by examining the mechanical properties of the larval muscle. The length-force relationship and active and passive tension were measured in immotile, recovered and control skeletal muscle at 5 and 7 days post fertilisation (dpf). Recovery of muscle function was also assessed by examining swimming patterns in recovered and control fish. Inhibition of the initial embryonic movements (up to 24 hpf) resulted in an increase in myofibril length and a decrease in width, followed by almost complete recovery in both moving and paralysed fish by 42 hpf. In conclusion, myofibril organisation is regulated by a dual mechanism involving movement-dependent and movement-independent processes. The initial contractile event itself drives the localisation of Tmod1 to its sarcomeric

  5. On the role of code comparisons in verification and validation.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  6. Potency of Full-Length MGF to Induce Maximal Activation of the IGF-IR Is Similar to Recombinant Human IGF-I at High Equimolar Concentrations.

    Directory of Open Access Journals (Sweden)

    Joseph A M J L Janssen

    Full Text Available To compare full-length mechano growth factor (full-length MGF) with human recombinant insulin-like growth factor-I (IGF-I) and human recombinant insulin (HI) in their ability to activate the human IGF-I receptor (IGF-IR), the human insulin receptor (IR-A) and the human insulin receptor-B (IR-B), respectively. In addition, we tested the stimulatory activity of human MGF and its stabilized analog Goldspink-MGF on the IGF-IR. The effects of full-length MGF, IGF-I, human mechano growth factor (MGF), Goldspink-MGF and HI were compared using kinase specific receptor activation (KIRA) bioassays specific for IGF-I, IR-A or IR-B, respectively. These assays quantify activity by measuring auto-phosphorylation of the receptor upon ligand binding. IGF-IR: At high equimolar concentrations maximal IGF-IR stimulating effects generated by full-length MGF were similar to that of IGF-I (89-fold vs. 77-fold, respectively). However, EC50 values of IGF-I and full-length MGF for the IGF-I receptor were 0.86 nmol/L (95% CI 0.69-1.07) and 7.83 nmol/L (95% CI 4.87-12.58), respectively. No IGF-IR activation was observed by human MGF and Goldspink-MGF, respectively. IR-A/IR-B: At high equimolar concentrations similar maximal IR-A stimulating effects were observed for full-length MGF and HI, but maximal IR-B stimulation achieved by full-length MGF was stronger than that by HI (292-fold vs. 98-fold). EC50 values of HI and full-length MGF for the IR-A were 1.13 nmol/L (95% CI 0.69-1.84) and 73.11 nmol/L (95% CI 42.87-124.69), respectively; for IR-B these values were 1.28 nmol/L (95% CI 0.64-2.57) and 35.10 nmol/L (95% CI 17.52-70.33), respectively. Full-length MGF directly stimulates the IGF-IR. Despite a higher EC50 concentration, at high equimolar concentrations full-length MGF showed a similar maximal potency to activate the IGF-IR as compared to IGF-I.
Further research is needed to understand the actions of full-length MGF in vivo and to define the physiological relevance of our in vitro findings.
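
    KIRA dose-response data of the kind reported above are typically summarized by a logistic (Hill-type) curve defined by a maximal fold-stimulation and an EC50. The sketch below is illustrative only: it plugs in the reported approximate maxima and EC50 values for IGF-I and full-length MGF on the IGF-IR, and assumes a Hill slope of 1, which the study does not report.

```python
def fold_activation(conc_nmol_l, emax_fold, ec50_nmol_l, hill=1.0):
    """Logistic (Hill-type) dose-response; baseline is 1-fold (unstimulated)."""
    return 1.0 + (emax_fold - 1.0) / (1.0 + (ec50_nmol_l / conc_nmol_l) ** hill)

# Reported approximate values on the IGF-IR (Hill slope of 1 is an assumption):
igf1 = lambda c: fold_activation(c, 77.0, 0.86)   # Emax 77-fold, EC50 0.86 nmol/L
mgf = lambda c: fold_activation(c, 89.0, 7.83)    # Emax 89-fold, EC50 7.83 nmol/L

# At high equimolar concentrations the two curves converge near their maxima,
# even though MGF needs roughly nine-fold more ligand for half-maximal activation.
for c in (1.0, 10.0, 100.0):
    print(c, round(igf1(c), 1), round(mgf(c), 1))
```

    Evaluating both curves shows why the ligands look similar only at high equimolar concentrations: near each EC50 the responses differ several-fold, while well above both EC50 values each curve saturates at its own maximum.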

  7. Potency of Full-Length MGF to Induce Maximal Activation of the IGF-IR Is Similar to Recombinant Human IGF-I at High Equimolar Concentrations

    Science.gov (United States)

    Janssen, Joseph A. M. J. L.; Hofland, Leo J.; Strasburger, Christian J.; van den Dungen, Elisabeth S. R.; Thevis, Mario

    2016-01-01

    Aims To compare full-length mechano growth factor (full-length MGF) with human recombinant insulin-like growth factor-I (IGF-I) and human recombinant insulin (HI) in their ability to activate the human IGF-I receptor (IGF-IR), the human insulin receptor (IR-A) and the human insulin receptor-B (IR-B), respectively. In addition, we tested the stimulatory activity of human MGF and its stabilized analog Goldspink-MGF on the IGF-IR. Methods The effects of full-length MGF, IGF-I, human mechano growth factor (MGF), Goldspink-MGF and HI were compared using kinase specific receptor activation (KIRA) bioassays specific for IGF-I, IR-A or IR-B, respectively. These assays quantify activity by measuring auto-phosphorylation of the receptor upon ligand binding. Results IGF-IR: At high equimolar concentrations maximal IGF-IR stimulating effects generated by full-length MGF were similar to that of IGF-I (89-fold vs. 77-fold, respectively). However, EC50 values of IGF-I and full-length MGF for the IGF-I receptor were 0.86 nmol/L (95% CI 0.69–1.07) and 7.83 nmol/L (95% CI 4.87–12.58), respectively. No IGF-IR activation was observed by human MGF and Goldspink-MGF, respectively. IR-A/IR-B: At high equimolar concentrations similar maximal IR-A stimulating effects were observed for full-length MGF and HI, but maximal IR-B stimulation achieved by full-length MGF was stronger than that by HI (292-fold vs. 98-fold). EC50 values of HI and full-length MGF for the IR-A were 1.13 nmol/L (95% CI 0.69–1.84) and 73.11 nmol/L (95% CI 42.87–124.69), respectively; for IR-B these values were 1.28 nmol/L (95% CI 0.64–2.57) and 35.10 nmol/L (95% CI 17.52–70.33), respectively. Conclusions Full-length MGF directly stimulates the IGF-IR. Despite a higher EC50 concentration, at high equimolar concentrations full-length MGF showed a similar maximal potency to activate the IGF-IR as compared to IGF-I. Further research is needed to understand the actions of full-length MGF in vivo and to define the

  8. Rer1p maintains ciliary length and signaling by regulating γ-secretase activity and Foxj1a levels.

    Science.gov (United States)

    Jurisch-Yaksi, Nathalie; Rose, Applonia J; Lu, Huiqi; Raemaekers, Tim; Munck, Sebastian; Baatsen, Pieter; Baert, Veerle; Vermeire, Wendy; Scales, Suzie J; Verleyen, Daphne; Vandepoel, Roel; Tylzanowski, Przemko; Yaksi, Emre; de Ravel, Thomy; Yost, H Joseph; Froyen, Guy; Arrington, Cammon B; Annaert, Wim

    2013-03-18

    Cilia project from the surface of most vertebrate cells and are important for several physiological and developmental processes. Ciliary defects are linked to a variety of human diseases, named ciliopathies, underscoring the importance of understanding signaling pathways involved in cilia formation and maintenance. In this paper, we identified Rer1p as the first endoplasmic reticulum/cis-Golgi-localized membrane protein involved in ciliogenesis. Rer1p, a protein quality control receptor, was highly expressed in zebrafish ciliated organs and regulated ciliary structure and function. Both in zebrafish and mammalian cells, loss of Rer1p resulted in the shortening of cilium and impairment of its motile or sensory function, which was reflected by hearing, vision, and left-right asymmetry defects as well as decreased Hedgehog signaling. We further demonstrate that Rer1p depletion reduced ciliary length and function by increasing γ-secretase complex assembly and activity and, consequently, enhancing Notch signaling as well as reducing Foxj1a expression.

  9. Global-scale pattern of peatland Sphagnum growth driven by photosynthetically active radiation and growing season length

    Directory of Open Access Journals (Sweden)

    Z. Yu

    2012-02-01

    Full Text Available High-latitude peatlands contain about one third of the world's soil organic carbon, most of which is derived from partly decomposed Sphagnum (peat moss) plants. We conducted a meta-analysis based on a global dataset of Sphagnum growth measurements collected from published literature to investigate the effects of bioclimatic variables on Sphagnum growth. Analysis of variance and general linear models were used to relate Sphagnum magellanicum and S. fuscum growth rates to photosynthetically active radiation integrated over the growing season (PAR0) and a moisture index. We found that PAR0 was the main predictor of Sphagnum growth for the global dataset, and effective moisture was only correlated with moss growth at continental sites. The strong correlation between Sphagnum growth and PAR0 suggests the existence of a global pattern of growth, with slow rates under cool climate and short growing seasons, highlighting the important role of temperature and growing season length in explaining peatland biomass production. Large-scale patterns of cloudiness during the growing season might also limit moss growth. Although considerable uncertainty remains over the carbon balance of peatlands under a changing climate, our results suggest that increasing PAR0 as a result of global warming and lengthening growing seasons could promote Sphagnum growth. Assuming that production and decomposition have the same sensitivity to temperature, this enhanced growth could lead to greater peat-carbon sequestration, inducing a negative feedback to climate change.

  10. Global-scale pattern of peatland Sphagnum growth driven by photosynthetically active radiation and growing season length

    Directory of Open Access Journals (Sweden)

    Z. Yu

    2012-07-01

    Full Text Available High-latitude peatlands contain about one third of the world's soil organic carbon, most of which is derived from partly decomposed Sphagnum (peat moss) plants. We conducted a meta-analysis based on a global data set of Sphagnum growth measurements collected from published literature to investigate the effects of bioclimatic variables on Sphagnum growth. Analysis of variance and general linear models were used to relate Sphagnum magellanicum and S. fuscum growth rates to photosynthetically active radiation integrated over the growing season (PAR0) and a moisture index. We found that PAR0 was the main predictor of Sphagnum growth for the global data set, and effective moisture was only correlated with moss growth at continental sites. The strong correlation between Sphagnum growth and PAR0 suggests the existence of a global pattern of growth, with slow rates under cool climate and short growing seasons, highlighting the important role of growing season length in explaining peatland biomass production. Large-scale patterns of cloudiness during the growing season might also limit moss growth. Although considerable uncertainty remains over the carbon balance of peatlands under a changing climate, our results suggest that increasing PAR0 as a result of global warming and lengthening growing seasons, without major change in cloudiness, could promote Sphagnum growth. Assuming that production and decomposition have the same sensitivity to temperature, this enhanced growth could lead to greater peat-carbon sequestration, inducing a negative feedback to climate change.
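
    The general linear models mentioned above amount to a least-squares fit of growth rate against PAR0 and a moisture index. A minimal sketch with NumPy on synthetic data (all numbers are invented for illustration and are not the study's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration: growth driven mainly by PAR0 (integrated photosynthetically
# active radiation over the growing season), with a weaker moisture-index effect.
n = 40
par0 = rng.uniform(500.0, 3000.0, n)       # hypothetical PAR0 values
moisture = rng.uniform(0.5, 1.5, n)        # hypothetical moisture index
growth = 0.05 * par0 + 10.0 * moisture + rng.normal(0.0, 15.0, n)

# General linear model: growth ~ intercept + PAR0 + moisture, fit by least squares.
X = np.column_stack([np.ones(n), par0, moisture])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(beta)  # [intercept, PAR0 slope, moisture slope]; PAR0 slope recovers ~0.05
```

    With a wide PAR0 range the PAR0 coefficient is estimated precisely, while the moisture coefficient is noisy, mirroring the study's finding that PAR0 dominates and moisture matters only in some settings.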

  11. Verification and Validation of Flight Critical Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Verification and Validation is a multi-disciplinary activity that encompasses elements of systems engineering, safety, software engineering and test. The elements...

  12. The Study on the Variation of the Cavity Length's Influence on the Output Pulse Train of the Actively Mode-Locked Fiber Laser

    Institute of Scientific and Technical Information of China (English)

    LUO Hong-e; TIAN Xiao-jian; GAO Bo

    2005-01-01

    The influence of actively mode-locked Erbium-Doped Fiber Laser (EDFL) cavity-length variation on the noise of an optical pulse train is investigated theoretically and by MATLAB simulation. Using a simple model, the noise characteristics of the output pulse train are studied. The results show that the noise of the output pulse train increases with increasing cavity-length variation. The theoretical analysis and the simulation results agree well. This result is significant for improving the reliability and stability of actively mode-locked fiber lasers.

  13. Machine learning techniques for the verification of refueling activities in CANDU-type nuclear power plants (NPPs) with direct applications in nuclear safeguards

    International Nuclear Information System (INIS)

    This dissertation deals with the problem of automated classification of the signals obtained from certain radiation monitoring systems, specifically from the Core Discharge Monitor (CDM) systems, which are operated successfully by the International Atomic Energy Agency (IAEA) at various CANDU-type nuclear power plants around the world. In order to significantly reduce the costly and error-prone manual evaluation of the large amounts of collected CDM signals, a reliable and efficient algorithm for automated data evaluation is necessary, one that ensures real-time performance with a misclassification ratio of at most 0.01%. This thesis describes the research behind finding a successful prototype implementation of such automated analysis software. The methodology finally adopted assumes a nonstationary data-generating process that has a finite number of states or basic fueling activities, each of which can emit observable data patterns having particular stationary characteristics. To find the underlying state sequences, a unified probabilistic approach known as the hidden Markov model (HMM) is used. Each possible fueling sequence is modeled by a distinct HMM having a left-right profile topology with explicit insert and delete states. Given an unknown fueling sequence, a dynamic programming algorithm akin to the Viterbi search is used to find the maximum likelihood state path through each model, and the overall best-scoring path is then selected as the recognition hypothesis. Machine learning techniques are applied to estimate the observation densities of the states, because the densities are not simply parameterizable. Unlike most present applications of continuous monitoring systems that rely on heuristic approaches to the recognition of possibly risky events, this research focuses on finding techniques that make optimal use of prior knowledge and computer simulation in the recognition task.
Thus, a suitably modified, approximate n-best variant of
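
    The core recognition step described above, finding the maximum likelihood state path with a Viterbi-style dynamic program, can be sketched for a generic discrete-emission HMM. The two-state model and all probabilities below are toy values for illustration, not the dissertation's left-right profile HMMs:

```python
import math

def viterbi(obs, states, log_start, log_trans, log_emit):
    """Most likely state path for a discrete-emission HMM, computed in log space."""
    # V[t][s]: best log-probability of any path ending in state s after t+1 symbols.
    V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
    back = []  # back[t][s]: predecessor of s on the best path at step t+1
    for o in obs[1:]:
        prev = V[-1]
        col, ptr = {}, {}
        for s in states:
            best_prev = max(states, key=lambda p: prev[p] + log_trans[p][s])
            col[s] = prev[best_prev] + log_trans[best_prev][s] + log_emit[s][o]
            ptr[s] = best_prev
        V.append(col)
        back.append(ptr)
    # Trace back from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    path.reverse()
    return path, V[-1][last]

ln = math.log
states = ("idle", "fueling")  # hypothetical activity states
log_start = {"idle": ln(0.8), "fueling": ln(0.2)}
log_trans = {"idle": {"idle": ln(0.9), "fueling": ln(0.1)},
             "fueling": {"idle": ln(0.2), "fueling": ln(0.8)}}
log_emit = {"idle": {"low": ln(0.9), "high": ln(0.1)},
            "fueling": {"low": ln(0.3), "high": ln(0.7)}}

path, score = viterbi(["low", "high", "high", "low"], states,
                      log_start, log_trans, log_emit)
print(path)  # -> ['idle', 'fueling', 'fueling', 'fueling']
```

    In the dissertation's setting one such model would exist per candidate fueling sequence; each model scores the observation stream and the best-scoring model's path becomes the recognition hypothesis.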

  14. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    ; qualitative and quantitative measurements of nuclear material; familiarity and access to sensitive technologies related to detection, unattended verification systems, containment/surveillance and sensors; examination and verification of design information of large and complex facilities; theoretical and practical aspects of technologies relevant to verification objectives; analysis of inspection findings and evaluation of their mutual consistency; negotiations on technical issues with facility operators and State authorities. This experience is reflected in the IAEA Safeguards Manual, which sets out the policies and procedures to be followed in the inspection process, as well as in the Safeguards Criteria, which provide guidance for verification, evaluation and analysis of the inspection findings. The IAEA infrastructure and its experience with verification permitted the organization in 1991 to respond immediately and successfully to the tasks required by Security Council Resolution 687 (1991) for Iraq, as well as to the tasks related to the verification of completeness and correctness of the initial declarations in the cases of the DPRK and of South Africa. In the case of Iraq the discovery of its undeclared programs was made possible through the existing verification system enhanced by additional access rights, information and application of modern detection technology. Such discoveries made it evident that there was a need for an intensive development effort to strengthen the safeguards system to develop a capability to detect undeclared activities. For this purpose it was recognized that there was a need for additional and extended (a) access to information and (b) access to locations. It was also obvious that access to the Security Council, to bring the IAEA closer to the body responsible for the maintenance of international peace and security, would be a requirement for reporting periodically on non-proliferation and the results of the IAEA's verification activities.
While the case

  15. Increased in vitro glial fibrillary acidic protein expression, telomerase activity, and telomere length after productive human immunodeficiency virus-1 infection in murine astrocytes.

    Science.gov (United States)

    Ojeda, Diego; López-Costa, Juan José; Sede, Mariano; López, Ester María; Berria, María Isabel; Quarleri, Jorge

    2014-02-01

    Although HIV-associated neurocognitive disorders (HAND) result from injury and loss of neurons, productive infection routinely takes place in cells of macrophage lineage. In such a complex context, astrocytosis induced by local chemokines/cytokines is one of the hallmarks of HIV neuropathology. Whether this sustained astrocyte activation is able to alter telomere-aging process is unknown. We hypothesized that interaction of HIV with astrocytes may impact astrocyte telomerase activity (TA) and telomere length in a scenario of astrocytic activation measured by expression of glial fibrillary acidic protein (GFAP). To test this hypothesis, cultured murine astrocytes were challenged with pseudotyped HIV/vesicular stomatitis virus (HIV/VSV) to circumvent the absence of viral receptors; and GFAP, telomerase activity, and telomere length were quantified. As an early and transient event after HIV infection, both TA activity and telomere length were significantly augmented (P telomere length, that may attenuate cell proliferation and enhance the astrocyte dysregulation, contributing to HIV neuropathogenesis. Understanding the mechanisms involved in HIV-mediated persistence by altering the telomere-related aging processes could aid in the development of therapeutic modalities for neurological complications of HIV infection.

  16. PCB153 reduces telomerase activity and telomere length in immortalized human skin keratinocytes (HaCaT) but not in human foreskin keratinocytes (NFK)

    Energy Technology Data Exchange (ETDEWEB)

    Senthilkumar, P.K. [Interdisciplinary Graduate Program in Human Toxicology, The University of Iowa, Iowa City, IA (United States); Robertson, L.W. [Interdisciplinary Graduate Program in Human Toxicology, The University of Iowa, Iowa City, IA (United States); Department of Occupational and Environmental Health, The University of Iowa, Iowa City, IA (United States); Ludewig, G., E-mail: Gabriele-ludewig@uiowa.edu [Interdisciplinary Graduate Program in Human Toxicology, The University of Iowa, Iowa City, IA (United States); Department of Occupational and Environmental Health, The University of Iowa, Iowa City, IA (United States)

    2012-02-15

    Polychlorinated biphenyls (PCBs), ubiquitous environmental pollutants, are characterized by long term-persistence in the environment, bioaccumulation, and biomagnification in the food chain. Exposure to PCBs may cause various diseases, affecting many cellular processes. Deregulation of the telomerase and the telomere complex leads to several biological disorders. We investigated the hypothesis that PCB153 modulates telomerase activity, telomeres and reactive oxygen species resulting in the deregulation of cell growth. Exponentially growing immortal human skin keratinocytes (HaCaT) and normal human foreskin keratinocytes (NFK) were incubated with PCB153 for 48 and 24 days, respectively, and telomerase activity, telomere length, superoxide level, cell growth, and cell cycle distribution were determined. In HaCaT cells exposure to PCB153 significantly reduced telomerase activity, telomere length, cell growth and increased intracellular superoxide levels from day 6 to day 48, suggesting that superoxide may be one of the factors regulating telomerase activity, telomere length and cell growth compared to untreated control cells. Results with NFK cells showed no shortening of telomere length but reduced cell growth and increased superoxide levels in PCB153-treated cells compared to untreated controls. As expected, basal levels of telomerase activity were almost undetectable, which made a quantitative comparison of treated and control groups impossible. The significant down regulation of telomerase activity and reduction of telomere length by PCB153 in HaCaT cells suggest that any cell type with significant telomerase activity, like stem cells, may be at risk of premature telomere shortening with potential adverse health effects for the affected organism. -- Highlights: ► Human immortal (HaCaT) and primary (NFK) keratinocytes were exposed to PCB153. ► PCB153 significantly reduced telomerase activity and telomere length in HaCaT. ► No effect on telomere length and

  17. Proton Therapy Verification with PET Imaging

    OpenAIRE

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions...

  18. Verification of ceramic structures

    NARCIS (Netherlands)

    Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.

    2012-01-01

    In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instr

  19. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  20. Survey on Existing Techniques for Writer Verification

    Directory of Open Access Journals (Sweden)

    Rashmi Welekar

    2015-11-01

    Full Text Available This paper presents a survey of the literature on handwriting analysis and writer verification schemes and techniques to date. The paper outlines an overview of writer identification schemes, mainly in the English, Arabic, Bangla, Malayalam and Gujarati languages. A taxonomy of the features adopted for online and offline writer identification schemes is also drawn up. The feature extraction methods adopted by the schemes are discussed at length, outlining their merits and demerits. In automated writer verification, both text-independent and text-dependent methods are available, and these are also discussed in this paper. Writer verification schemes across multiple languages are evaluated by comparing recognition rates. A new method is proposed for identifying a writer using slant, orientation and eccentricity, enabling the writer's mental state to be inferred from the associated features.

  1. Automatic adjustment of cycle length and aeration time for improved nitrogen removal in an alternating activated sludge process

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard

    1997-01-01

    The paper examines the nitrogen dynamics in the alternating BIODENITRO and BIODENIPHO processes with a focus on two control handles influencing flow scheduling and aeration: the cycle length and the ammonia concentration at which a nitrifying period is terminated. A steady state analysis examining

  2. trans activation by the full-length E2 proteins of human papillomavirus type 16 and bovine papillomavirus type 1 in vitro and in vivo: cooperation with activation domains of cellular transcription factors.

    OpenAIRE

    Ushikai, M; Lace, M J; Yamakawa, Y.; Kono, M; Anson, J; Ishiji, T; Parkkinen, S; Wicker, N.; Valentine, M E; Davidson, I

    1994-01-01

    Papillomaviral E2 genes encode proteins that regulate viral transcription. While the full-length bovine papillomavirus type 1 (BPV-1) E2 peptide is a strong trans activator, the homologous full-length E2 product of human papillomavirus type 16 (HPV-16) appeared to vary in function in previous studies. Here we show that when expressed from comparable constructs, the full-length E2 products of HPV-16 and BPV-1 trans activate a simple E2- and Sp1-dependent promoter up to approximately 100-fold i...

  3. Augmented telomerase activity, reduced telomere length and the presence of alternative lengthening of telomere in renal cell carcinoma: plausible predictive and diagnostic markers.

    Science.gov (United States)

    Pal, Deeksha; Sharma, Ujjawal; Khajuria, Ragini; Singh, Shrawan Kumar; Kakkar, Nandita; Prasad, Rajendra

    2015-05-15

    In this study, we analyzed 100 cases of renal cell carcinoma (RCC) for telomerase activity, telomere length and alternative lengthening of telomeres (ALT) using the TRAP assay, the TeloTTAGGG assay kit and immunohistochemical analysis of ALT-associated promyelocytic leukemia (PML) bodies, respectively. Significantly higher (P=0.000) telomerase activity was observed in 81 cases of RCC and correlated with clinicopathological features of the tumor, for instance stage (P=0.008) and grade (P=0.000), but not with the subtypes of RCC (P=0.355). Strikingly, telomere length was found to be significantly shorter in RCC (P=0.000) than in the corresponding normal renal tissues, and it correlated well with grade (P=0.016) but not with stage (P=0.202) or subtype (P=0.669). Telomere length was also negatively correlated with the age of patients (r(2)=0.528; P=0.000), which supports the notion that it could be used as a marker for biological aging. ALT-associated PML bodies containing PML protein were found in telomerase-negative cases of RCC. This suggests the presence of an ALT pathway to maintain telomere length in telomerase-negative RCC tissues; this pathway was associated with high stages of RCC, suggesting a prevalent mechanism for telomere maintenance in high stages. In conclusion, telomerase activity and telomere length can be used as diagnostic as well as predictive markers in RCC. The prevalence of the ALT mechanism in high-stage RCC warrants the development of ALT inhibitors alongside telomerase inhibitors against RCC as a therapeutic approach. PMID:25769384
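
    The reported age dependence of telomere length (r(2)=0.528) is a standard Pearson correlation. A minimal sketch on synthetic data — the ages, decline rate and noise level below are illustrative assumptions, not the study's measurements:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic cohort (made-up numbers): telomere length in kb declining with age
    age = rng.uniform(30, 80, size=100)
    telomere_kb = 9.0 - 0.05 * age + rng.normal(0.0, 0.8, size=100)

    # Pearson correlation coefficient and coefficient of determination
    r = np.corrcoef(age, telomere_kb)[0, 1]
    r_squared = r ** 2
    ```

    A negative r with a sizeable r² is what supports using telomere length as a biological-aging marker; a formal analysis would also report the regression slope and a p-value.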

  4. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  5. Development and verification of signal processing system of avalanche photo diode for the active shields onboard ASTRO-H

    Science.gov (United States)

    Ohno, M.; Kawano, T.; Edahiro, I.; Shirakawa, H.; Ohashi, N.; Okada, C.; Habata, S.; Katsuta, J.; Tanaka, Y.; Takahashi, H.; Mizuno, T.; Fukazawa, Y.; Murakami, H.; Kobayashi, S.; Miyake, K.; Ono, K.; Kato, Y.; Furuta, Y.; Murota, Y.; Okuda, K.; Wada, Y.; Nakazawa, K.; Mimura, T.; Kataoka, J.; Ichinohe, Y.; Uchida, Y.; Katsuragawa, M.; Yoneda, H.; Sato, G.; Sato, R.; Kawaharada, M.; Harayama, A.; Odaka, H.; Hayashi, K.; Ohta, M.; Watanabe, S.; Kokubun, M.; Takahashi, T.; Takeda, S.; Kinoshita, M.; Yamaoka, K.; Tajima, H.; Yatsu, Y.; Uchiyama, H.; Saito, S.; Yuasa, T.; Makishima, K.

    2016-09-01

    The Hard X-ray Imager (HXI) and Soft Gamma-ray Detector (SGD) onboard ASTRO-H demonstrate high sensitivity to hard X-rays (5-80 keV) and soft gamma-rays (60-600 keV), respectively. To reduce the background, both instruments are actively shielded by large, thick Bismuth Germanate (BGO) scintillators. We have developed the signal processing system of the avalanche photodiode in the BGO active shields and have demonstrated its effectiveness after assembly in the flight model of the HXI/SGD sensor and after integration into the satellite. The energy threshold achieved is about 150 keV and the anti-coincidence efficiency for cosmic-ray events is almost 100%. Installed in the BGO active shield, the developed signal processing system successfully reduces the room background level of the main detector.

  6. Liposome encapsulation of lipophilic N-alkyl-propanediamine platinum complexes: impact on their cytotoxic activity and influence of the carbon chain length

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Heveline; Fontes, Ana Paula S. [Universidade Federal de Juiz de Fora (UFJF), MG (Brazil). Dept. de Quimica; Lopes, Miriam Teresa P. [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Farmacologia; Frezard, Frederic, E-mail: frezard@icb.ufmg.b [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Dept. de Fisiologia e Biofisica

    2010-07-01

    Antitumor platinum(II) complexes derived from N-alkyl-propanediamine differing in the length of their carbon chain (C8, C10, C12 and C14) were incorporated in liposomes and the cytotoxic activity of these formulations was evaluated against tumor (A{sub 549}, MDA-MB-231, B16-F1 and B16-F10) and non-tumor (BHK-21 and CHO) cell lines. Stable and monodisperse liposome suspensions incorporating the platinum complexes were obtained from the lipid composition consisting of distearoyl-sn-glycero-3-phosphocholine, cholesterol and 1,2-distearoyl-sn-glycero- 3-phosphoethanolamine-N-(methoxy(polyethylene glycol)-2000) at 5:3:0.3 molar ratio. The entrapment efficiency (EE%) of the platinum complexes in liposomes increased with the carbon chain length. EE% was higher than 80% in C12- and C14-derivatives. The effect of liposome encapsulation on the cytotoxic activity of the complexes was found to depend on the carbon chain length. These data indicate that the highest drug bioavailability from liposome formulations was achieved with the complex showing intermediate carbon chain length and partition between the liposome membrane and aqueous phase. (author)

  7. Investigating the role of chain and linker length on the catalytic activity of an H 2 production catalyst containing a β-hairpin peptide

    Energy Technology Data Exchange (ETDEWEB)

    Reback, Matthew L.; Ginovska, Bojana; Buchko, Garry W.; Dutta, Arnab; Priyadarshani, Nilusha; Kier, Brandon L.; Helm, Monte L.; Raugei, Simone; Shaw, Wendy J.

    2016-06-02

    Building on our recent report of an active H2 production catalyst [Ni(PPh2NProp-peptide)2]2+ (Prop=para-phenylpropionic acid, peptide (R10)=WIpPRWTGPR-NH2, p=D-proline, and P2N=1-aza-3,6-diphosphacycloheptane) that contains structured β-hairpin peptides, here we investigate how H2 production is affected by: (1) the length of the hairpin (eight or ten residues) and (2) limiting the flexibility between the peptide and the core complex by altering the length of the linker: para-phenylpropionic acid (three carbons) or para-benzoic acid (one carbon). Reduction of the peptide chain length from ten to eight residues increases or maintains the catalytic current for H2 production for all complexes, suggesting a non-productive steric interaction at longer peptide lengths. While the structure of the hairpin appears largely intact for the complexes, NMR data are consistent with differences in dynamic behavior which may contribute to the observed differences in catalytic activity. Molecular dynamics simulations demonstrate that complexes with a one-carbon linker have the desired effect of restricting the motion of the hairpin relative to the complex; however, the catalytic currents are significantly reduced compared to complexes containing a three-carbon linker as a result of the electron-withdrawing nature of the -COOH group. These results demonstrate the complexity and interrelated nature of the outer coordination sphere on catalysis.

  8. Spatial correlation of proton irradiation-induced activity and dose in polymer gel phantoms for PET/CT delivery verification studies

    International Nuclear Information System (INIS)

    activity and lays the groundwork for further investigations using BANG3-Pro2 as a dosimetric phantom in PET/CT delivery verification studies.

  9. Communication dated 18 December 2013 received from the Delegation of the European Union to the International Organisations in Vienna on the European Union's Support for the IAEA Activities in the Areas of Nuclear Security and Verification

    International Nuclear Information System (INIS)

    The Secretariat has received a note verbale dated 18 December 2013 from the Delegation of the European Union to the International Organisations in Vienna with Council Decision 2013/517/CFSP of 21 October 2013, in support of the IAEA activities in the areas of nuclear security and verification and in the framework of the implementation of the EU Strategy against Proliferation of Weapons of Mass Destruction. As requested in that communication, the note verbale and the enclosure are circulated herewith for information

  10. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  11. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.
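
    The paper's verification risk function is defined for a given system topology and is not reproduced here. As a toy illustration of the general idea of allocating examination effort by marginal risk, assume each examination of a subsystem leaves a fixed residual fraction of its verification risk (the numbers and the model itself are hypothetical):

    ```python
    def allocate_exams(risk, residual_fraction, stop_risk):
        """Greedy toy allocator: repeatedly examine the subsystem whose next
        examination removes the most verification risk, until the summed
        residual risk reaches the examination stop criterion."""
        counts = [0] * len(risk)
        residual = list(risk)
        while sum(residual) > stop_risk:
            # marginal risk removed by one more exam of subsystem j
            i = max(range(len(residual)),
                    key=lambda j: residual[j] * (1 - residual_fraction[j]))
            residual[i] *= residual_fraction[i]
            counts[i] += 1
        return counts, residual

    # three subsystems with hypothetical initial risks and per-exam effectiveness
    counts, residual = allocate_exams([0.4, 0.2, 0.1], [0.5, 0.3, 0.8], stop_risk=0.1)
    ```

    The greedy order is one possible "proposed sequence of examinations"; the paper's method instead derives marginal verification risks from the system's logic topology and verification method performance.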

  12. Ten1p promotes the telomeric DNA-binding activity of Cdc13p: implication for its function in telomere length regulation

    Institute of Scientific and Technical Information of China (English)

    Wei Qian; Jianyong Wang; Na-Na Jin; Xiao-Hong Fu; Yi-Chien Lin; Jing-Jer Lin; Jin-Qiu Zhou

    2009-01-01

    In Saccharomyces cerevisiae, the essential gene CDC13 encodes a telomeric single-stranded DNA-binding protein that interacts with Stn1p and Ten1p genetically and physically, and is required for telomere end protection and telomere length control. The molecular mechanism by which Ten1p participates in telomere length regulation and chromosome end protection remains elusive. In this work, we observed a weak interaction of Cdc13p and Ten1p in a gel-filtration analysis using purified recombinant Cdc13p and Ten1p. Ten1p itself exhibits a weak DNA-binding activity, but enhances the telomeric TG1-3 DNA-binding ability of Cdc13p. Cdc13p is co-immunoprecipitated with Ten1p. In the mutant ten1-55 or ten1-66 cells, the impaired interaction between Ten1p and Cdc13p results in much longer telomeres, as well as a decreased association of Cdc13p with telomeric DNA. Consistently, the Ten1-55 and Ten1-66 mutant proteins fail to stimulate the telomeric DNA-binding activity of Cdc13p in vitro. These results suggest that Ten1p enhances the telomeric DNA-binding activity of Cdc13p to negatively regulate telomere length.

  13. Development and experimental verification of a robust active noise control system for a diesel engine in submarines

    Science.gov (United States)

    Sachau, D.; Jukkert, S.; Hövelmann, N.

    2016-08-01

    This paper presents the development and experimental validation of an ANC (active noise control) system designed for a particular application in the exhaust line of a submarine. Tonal components of the exhaust noise in the frequency band from 75 Hz to 120 Hz are thereby reduced by more than 30 dB. The ANC system is based on the feedforward leaky FxLMS algorithm. The observability of the sound pressure in the standing-wave field is ensured by using two error microphones. A noninvasive online plant identification method is used to increase the robustness of the controller. Online plant identification is extended by a time-varying convergence gain to improve the performance in the presence of slight errors in the frequency of the reference signal.
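
    The feedforward leaky FxLMS algorithm at the core of such an ANC system can be sketched in a few lines. The single-channel version below, with a made-up tone, secondary-path model and step sizes, one error microphone, and no online plant identification, only illustrates the update rule, not the paper's system:

    ```python
    import numpy as np

    def leaky_fxlms(x, d, s_hat, n_taps=32, mu=0.01, leak=1e-4):
        """Single-channel leaky FxLMS. x: reference signal, d: primary
        disturbance at the error microphone, s_hat: FIR model of the
        secondary path. Returns the residual error at the microphone."""
        w = np.zeros(n_taps)          # adaptive control filter
        xbuf = np.zeros(n_taps)       # recent reference samples
        ybuf = np.zeros(len(s_hat))   # recent anti-noise samples
        fxbuf = np.zeros(len(s_hat))  # reference history for filtered-x
        fx = np.zeros(n_taps)         # filtered-reference regressor
        err = np.zeros(len(x))
        for n in range(len(x)):
            xbuf = np.roll(xbuf, 1)
            xbuf[0] = x[n]
            y = w @ xbuf                        # anti-noise output
            ybuf = np.roll(ybuf, 1)
            ybuf[0] = y
            e = d[n] + s_hat @ ybuf             # residual after secondary path
            fxbuf = np.roll(fxbuf, 1)
            fxbuf[0] = x[n]
            fx = np.roll(fx, 1)
            fx[0] = s_hat @ fxbuf               # filtered reference sample
            w = (1.0 - mu * leak) * w - mu * e * fx   # leaky LMS update
            err[n] = e
        return err

    # made-up demo: cancel a 100 Hz tone through an assumed secondary path
    fs, n = 1000, 4000
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * 100 * t)               # reference tone
    d = 0.8 * np.sin(2 * np.pi * 100 * t + 0.7)   # disturbance at the microphone
    s_hat = np.array([0.0, 0.5, 0.1])             # assumed secondary-path FIR
    err = leaky_fxlms(x, d, s_hat)
    rms_before = np.sqrt(np.mean(err[:500] ** 2))
    rms_after = np.sqrt(np.mean(err[-500:] ** 2))
    ```

    The leak term slowly shrinks the weights, which trades a small steady-state bias for robustness against plant-model errors, one reason the paper pairs leakage with online plant identification.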

  14. Verification of threshold activation detection (TAD) technique in prompt fission neutron detection using scintillators containing 19F

    International Nuclear Information System (INIS)

    In the present study, ⌀ 5'' × 3'' and ⌀ 2'' × 2'' EJ-313 liquid fluorocarbon as well as ⌀ 2'' × 3'' BaF2 scintillators were exposed to neutrons from a 252Cf neutron source and a Sodern Genie 16GT deuterium-tritium (D+T) neutron generator. The scintillators' responses to β− particles with a maximum endpoint energy of 10.4 MeV from the n+19F reactions were studied. The response of a ⌀ 5'' × 3'' BC-408 plastic scintillator was also studied as a reference. The β− particles are the products of the interaction of fast neutrons with 19F, which is a component of the EJ-313 and BaF2 scintillators. This method of fast neutron detection via fluorine activation is known as Threshold Activation Detection (TAD) and was proposed for photofission prompt neutron detection from fissionable and Special Nuclear Materials (SNM) in the field of Homeland Security and Border Monitoring. Measurements of the number of counts between 6.0 and 10.5 MeV with a 252Cf source showed that the relative neutron detection efficiency ratio, defined as εBaF2 / εEJ−313−5'', is 32.0% ± 2.3% and 44.6% ± 3.4% for front-on and side-on orientation of the BaF2, respectively. Moreover, the ⌀ 5'' EJ-313 and side-on oriented BaF2 were also exposed to neutrons from the D+T neutron generator, and the relative efficiency εBaF2 / εEJ−313−5'' was estimated to be 39.3%. Measurements of prompt photofission neutrons with the BaF2 detector by means of data acquisition after irradiation (out-of-beam) of nuclear material and between the beam pulses (beam-off) were also conducted on the 9 MeV LINAC of the SAPHIR facility
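
    A relative efficiency ratio such as εBaF2 / εEJ−313−5'' is a ratio of count totals, and its statistical uncertainty follows from Poisson error propagation. A sketch with invented counts — the quoted 32.0% ± 2.3% may also contain non-statistical contributions this ignores:

    ```python
    import math

    def count_ratio(n_num, n_den):
        """Ratio of two Poisson count totals with its propagated 1-sigma
        statistical uncertainty: (sigma_r / r)^2 = 1/n_num + 1/n_den."""
        r = n_num / n_den
        sigma_r = r * math.sqrt(1.0 / n_num + 1.0 / n_den)
        return r, sigma_r

    # invented counts in the 6.0-10.5 MeV window, not the paper's data
    ratio, sigma = count_ratio(3200, 10000)
    ```

    With these invented totals the ratio is 0.32 with a purely statistical uncertainty well under 1%, which is why quoted errors of a few percent usually point to systematic terms (geometry, background subtraction) dominating.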

  15. The role of instrumental activation analysis in the verification and completion of analytical data of rock reference materials

    International Nuclear Information System (INIS)

    Ten selected rock reference materials (USGS diabase W-1, basalt BCR-1, andesite AGV-1, granite G-2, granodiorite GSP-1, and CRPG basalt BE-N, granite GS-N, trachyte ISH-G, serpentine UB-N, glass standard VS-N) were analyzed by instrumental neutron and photon activation analyses. The results were evaluated on average for the entire set of samples to detect possible systematic deviations of the determined values from the reference values. Out of 47 elements determined, 43 elements were determined with reasonable agreement (deviation <10% on average) with the reference values. Au could not be determined because of a high blank from the packaging polyethylene foil. Systematically higher Dy and lower Ho and Tm (by about 20% on average) in our results require further investigation. In several cases, reasons for greater differences between the determined and recommended values could not be traced to our procedures. The most suspect is the recommended value for W in the CRPG BE-N basalt, which is twenty-five times higher than the value determined in the present work, probably due to inconsistent contamination from a W carbide mill used in production of this reference material. (author)

  16. Verification of biological activity of irradiated Sopoongsan, an oriental medicinal prescription, for industrial application of functional cosmetic material

    Science.gov (United States)

    Lee, Jin-Young; Park, Tae-Soon; Ho Son, Jun; Jo, Cheorun; Woo Byun, Myung; Jeun An, Bong

    2007-11-01

    Sopoongsan is an oriental medicinal prescription including 12 medicinal herbs. It is known to have anti-inflammatory, anti-microbial, anti-allergic, and anti-cancer effects on human skin. To use Sopoongsan extract in a functional cosmetic composition, its dark color should be lightened to meet consumer demand for clear products, without any adverse change in its function. Irradiation at doses of 0, 5, 10, and 20 kGy was applied to improve the color of ethanol- or water-extracted Sopoongsan, and superoxide dismutase (SOD), xanthine oxidase (XO), melanoma cell growth inhibition, and anti-microbial activities were investigated. Generally, the ethanol extract performed better than the water extract, and irradiation up to 20 kGy did not change any functional effect. In particular, the inhibition of melanin deposition on skin, measured by inhibition of B16F10 (melanoma) cell growth, was as high as that of arbutin, a commercially available product, when the ethanol-extracted Sopoongsan was irradiated at 20 kGy. The results showed that with irradiation technology, the limitation on the amount of natural materials added to food or cosmetic compositions because of color problems can be relaxed significantly, with time savings and cost benefits compared to the conventional color removal process. Therefore, irradiation would be a good method to add value for the related industry.


  18. Verification of biological activity of irradiated Sopoongsan, an oriental medicinal prescription, for industrial application of functional cosmetic material

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jin-Young; Park, Tae-Soon; Ho Son, Jun [Department of Cosmeceutical Science, Daegu Haany University, Kyungsan 712-715 (Korea, Republic of); Jo, Cheorun [Department of Animal Science and Biotechnology, Chungnam National University, Daejeon 305-764 (Korea, Republic of); Woo Byun, Myung [Radiation Food Science and Biotechnology Team, Korea Atomic Energy Research Institute, Jeongeup 580-185 (Korea, Republic of); Jeun An, Bong [Department of Cosmeceutical Science, Daegu Haany University, Kyungsan 712-715 (Korea, Republic of)], E-mail: anbj@dhu.ac.kr

    2007-11-15

    Sopoongsan is an oriental medicinal prescription including 12 medicinal herbs. It is known to have anti-inflammatory, anti-microbial, anti-allergic, and anti-cancer effects on human skin. To use Sopoongsan extract in a functional cosmetic composition, its dark color should be lightened to meet consumer demand for clear products, without any adverse change in its function. Irradiation at doses of 0, 5, 10, and 20 kGy was applied to improve the color of ethanol- or water-extracted Sopoongsan, and superoxide dismutase (SOD), xanthine oxidase (XO), melanoma cell growth inhibition, and anti-microbial activities were investigated. Generally, the ethanol extract performed better than the water extract, and irradiation up to 20 kGy did not change any functional effect. In particular, the inhibition of melanin deposition on skin, measured by inhibition of B16F10 (melanoma) cell growth, was as high as that of arbutin, a commercially available product, when the ethanol-extracted Sopoongsan was irradiated at 20 kGy. The results showed that with irradiation technology, the limitation on the amount of natural materials added to food or cosmetic compositions because of color problems can be relaxed significantly, with time savings and cost benefits compared to the conventional color removal process. Therefore, irradiation would be a good method to add value for the related industry.

  19. Revisiting the prediction of solar activity based on the relationship between the solar maximum amplitude and max-max cycle length

    CERN Document Server

    Carrasco, V M S; Gallego, M C

    2016-01-01

    Forecasting future solar activity is important because of its effects on our planet and near-Earth space. Here, we employ the new version of the sunspot number index (version 2) to analyse the relationship between the solar maximum amplitude and max-max cycle length proposed by Du (2006). We show that the correlation between the parameters used by Du (2006) for the prediction of the sunspot number (the amplitude of the cycle, Rm, and the max-max cycle length two solar cycles before, Pmax-2) disappears when we use solar cycles prior to solar cycle 9. We conclude that the correlation between these parameters depends on the time interval selected. Thus, the proposal of Du (2006) should not be considered for prediction purposes.

  20. Wind gust warning verification

    Science.gov (United States)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings alert the public to extreme weather situations that might occur, leading to damage and losses. By forecasting these extreme events, meteorological centres help their users prevent the damage or losses they might otherwise suffer. However, verifying these warnings requires specific methods, not only because such events happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These techniques are then applied to a real-life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained are discussed.
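
    Once warnings and observed events have been matched within the warning's time window, skill is commonly summarised from a 2x2 contingency table. A minimal sketch with invented counts — the paper's contribution concerns the matching step, which these standard scores take as given:

    ```python
    def warning_scores(hits, misses, false_alarms):
        """Standard categorical scores for warning verification:
        POD (probability of detection) = hits / (hits + misses)
        FAR (false alarm ratio)        = false_alarms / (hits + false_alarms)
        CSI (critical success index)   = hits / (hits + misses + false_alarms)
        """
        pod = hits / (hits + misses)
        far = false_alarms / (hits + false_alarms)
        csi = hits / (hits + misses + false_alarms)
        return pod, far, csi

    # invented counts for one season of wind gust warnings
    pod, far, csi = warning_scores(hits=42, misses=8, false_alarms=18)
    ```

    For rare events, CSI is often preferred over accuracy because correct rejections, which dominate the table, do not enter it.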

  1. Influence of Linker Length Variations on the Biomass-Degrading Performance of Heat-Active Enzyme Chimeras.

    Science.gov (United States)

    Rizk, Mazen; Antranikian, Garabed; Elleuche, Skander

    2016-04-01

    Plant cell walls are composed of complex polysaccharides such as cellulose and hemicellulose. In order to efficiently hydrolyze cellulose, the synergistic action of several cellulases is required. Some anaerobic cellulolytic bacteria form multienzyme complexes, namely cellulosomes, while other microorganisms produce a portfolio of diverse enzymes that work in a synergistic fashion. Molecular biological methods can mimic such effects through the generation of artificial bi- or multifunctional fusion enzymes. An endoglucanase and a β-glucosidase from the extremely thermophilic anaerobic bacteria Fervidobacterium gondwanense and Fervidobacterium islandicum, respectively, were fused end-to-end in an approach to optimize polysaccharide degradation. Both enzymes are optimally active at 90 °C and pH 6.0-7.0, making them excellent candidates for fusion experiments. The direct linkage of the two enzymes led to increased activity toward the substrate specific for β-glucosidase, but to decreased endoglucanase activity. However, these enzyme chimeras were superior to 1:1 mixtures of the individual enzymes, because the combined activities resulted in a higher final product yield. Such fusion enzymes therefore exhibit promising features for application in industrial bioethanol production processes. PMID:26921187

  2. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novice...

  3. Requirement Assurance: A Verification Process

    Science.gov (United States)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
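
    The bookkeeping this implies — each requirement paired with one of the four verification methods and its closure artifacts — can be sketched as a toy tracking structure. The fields and the example requirements below are invented for illustration, not NPR 7123.1A content:

    ```python
    from dataclasses import dataclass, field

    # The four verification methods named in the text
    METHODS = {"inspection", "analysis", "demonstration", "test"}

    @dataclass
    class Requirement:
        rid: str
        text: str
        method: str                                    # chosen verification method
        artifacts: list = field(default_factory=list)  # objective closure evidence

        def is_closed(self) -> bool:
            """Closed = valid method plus at least one closure artifact."""
            return self.method in METHODS and bool(self.artifacts)

    reqs = [
        Requirement("R-001", "Total mass shall not exceed 5 kg", "test",
                    artifacts=["mass-test-report-012"]),
        Requirement("R-002", "Structure shall withstand 10 g load", "analysis"),
    ]
    open_ids = [r.rid for r in reqs if not r.is_closed()]
    ```

    A real verification matrix would also track success criteria and approval state per requirement; the point here is only that closure requires evidence, not just a declared method.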

  4. Structural properties of the active layer of discotic hexabenzocoronene/perylene diimide bulk hetero junction photovoltaic devices: The role of alkyl side chain length

    Energy Technology Data Exchange (ETDEWEB)

    Al-Hussein, M., E-mail: m.alhussein@ju.edu.jo [Department of Physics, University of Jordan, Amman 11942 (Jordan); Hesse, H.C.; Weickert, J. [Ludwig-Maximilians-University Munich, Department of Physics and Center for NanoScience(CeNS), Amalienstr.54, 80799 Munich (Germany); Doessel, L.; Feng, X.; Muellen, K. [Max Planck Institute for Polymer Research, Ackermannweg 10, 55128 Mainz (Germany); Schmidt-Mende, L. [Ludwig-Maximilians-University Munich, Department of Physics and Center for NanoScience(CeNS), Amalienstr.54, 80799 Munich (Germany)

    2011-10-31

    We investigate thin blend films of phenyl-substituted hexa-peri-hexabenzocoronenes (HBC) with various alkyl side chain lengths ((CH{sub 2})n, n = 6, 8, 12 and 16)/perylenediimide (PDI). These blends constitute the active layers in the bulk heterojunction organic solar cells we studied recently [1]. Their structural properties are studied by both scanning electron microscopy and X-ray diffraction measurements. The results support the evidence for the formation of HBC donor-PDI acceptor complexes in all blends regardless of the side chain length of the HBC molecule. These complexes are packed into a layered structure parallel to the substrate for short side chain HBC molecules (n = 6 and 8). The layered structure is disrupted by increasing the side chain length of the HBC molecule, and eventually a disordered structure is formed for long side chains (n > 12). We attribute this behavior to the size difference between the aromatic parts of the HBC and PDI molecules. For short side chains, the size difference leaves room for the side chains of the two molecules to fill the space around the aromatic cores. For long side chains (n > 12), the empty space is not enough to accommodate this increase, leading to the disruption of the layered structure, and a rather disordered structure is formed. Our results highlight the importance of the donor-acceptor interaction in a bulk heterojunction active layer, as well as the geometry of the two molecules and their role in determining the structure of the active layer and thus the photovoltaic performance.

  5. 45 CFR 261.63 - When is a State's Work Verification Plan due?

    Science.gov (United States)

    2010-10-01

    45 CFR 261.63: When is a State's Work Verification Plan due? (a) Each State must submit its interim Work Verification Plan for validating work activities reported in the TANF Data Report and, if applicable, the...

  6. Telomerase-Associated Protein TEP1 Is Not Essential for Telomerase Activity or Telomere Length Maintenance In Vivo

    OpenAIRE

    Liu, Yie; Snow, Bryan E.; Hande, M. Prakash; Baerlocher, Gabriela; Kickhoefer, Valerie A.; Yeung, David; Wakeham, Andrew; Itie, Annick; Siderovski, David P.; Lansdorp, Peter M.; Robinson, Murray O; Harrington, Lea

    2000-01-01

    TEP1 is a mammalian telomerase-associated protein with similarity to the Tetrahymena telomerase protein p80. Like p80, TEP1 is associated with telomerase activity and the telomerase reverse transcriptase, and it specifically interacts with the telomerase RNA. To determine the role of mTep1 in telomerase function in vivo, we generated mouse embryonic stem (ES) cells and mice lacking mTep1. The mTep1-deficient (mTep1−/−) mice were viable and were bred for seven successive generations with no ob...

  7. Hybrid Deep Learning for Face Verification.

    Science.gov (United States)

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training. PMID:26660699
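The region-pair feature concatenation described above can be sketched in a few lines. The extractor below is a hypothetical stand-in (pooled absolute pixel differences) for the paper's learned ConvNet relational features, and all names are illustrative; the RBM classifier stage is omitted:

```python
import numpy as np

def region_features(region_a, region_b):
    # Stand-in for one deep ConvNet: a relational feature vector for a
    # pair of corresponding face regions. Here we pool absolute pixel
    # differences into a 4-dim vector; the paper learns these features.
    diff = np.abs(region_a.astype(float) - region_b.astype(float))
    return diff.reshape(4, -1).mean(axis=1)

def pair_representation(regions_a, regions_b):
    # Concatenate relational features over all region pairs, mirroring
    # the high-dimensional representation fed to the RBM classifier.
    return np.concatenate([region_features(a, b)
                           for a, b in zip(regions_a, regions_b)])
```

With 5 region pairs this yields a 20-dimensional representation; the real model uses many region pairs and far higher-dimensional learned features.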

  8. Proton Therapy Verification with PET Imaging

    Science.gov (United States)

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  9. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  10. Influenza activity in Europe during eight seasons (1999–2007): an evaluation of the indicators used to measure activity and an assessment of the timing, length and course of peak activity (spread) across Europe

    Directory of Open Access Journals (Sweden)

    Meijer Adam

    2007-11-01

    Abstract Background The European Influenza Surveillance Scheme (EISS) has collected clinical and virological data on influenza since 1996 in an increasing number of countries. The EISS dataset was used to characterise important epidemiological features of influenza activity in Europe during eight winters (1999–2007). The following questions were addressed: (1) are the sentinel clinical reports a good measure of influenza activity? (2) how long is a typical influenza season in Europe? (3) is there a west-east and/or south-north course of peak activity ('spread') of influenza in Europe? Methods Influenza activity was measured by collecting data from sentinel general practitioners (GPs) and reports by national reference laboratories. The sentinel reports were first evaluated by comparing them to the laboratory reports and were then used to assess the timing and spread of influenza activity across Europe during eight seasons. Results We found a good match between the clinical sentinel data and laboratory reports of influenza collected by sentinel physicians (overall match of 72% for +/- 1 week difference). We also found a moderate to good match between the clinical sentinel data and laboratory reports of influenza from non-sentinel sources (overall match of 60% for +/- 1 week). There were no statistically significant differences between countries using ILI (influenza-like illness) or ARI (acute respiratory disease) as case definition. When looking at the peak-weeks of clinical activity, the average length of an influenza season in Europe was 15.6 weeks (median 15 weeks; range 12–19 weeks). Plotting the peak weeks of clinical influenza activity reported by sentinel GPs against the longitude or latitude of each country indicated that there was a west-east spread of peak activity (spread) of influenza across Europe in four winters (2001–2002, 2002–2003, 2003–2004 and 2004–2005) and a south-north spread in three winters (2001–2002, 2004–2005 and 2006

  11. Estimation of genome length

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The genome length is a fundamental feature of a species. This note outlines the general concepts of physical and genetic length and the methods for estimating them. Some formulae for estimating the genetic length are derived in detail. As examples, the genetic length of the Pinus pinaster Ait. genome and the genetic length of chromosome Ⅵ of Oryza sativa L. are estimated from partial linkage data.
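A common method-of-moments estimator for total genetic length from partial linkage data can be sketched as follows. This is an illustrative sketch of the general idea, not necessarily the exact formulae derived in the note; the function name and the random-marker assumption are ours:

```python
def estimate_genetic_length(n_markers: int, linked_pairs: int,
                            threshold_cm: float) -> float:
    """Method-of-moments estimate of total genetic map length (cM).

    Assumes markers are randomly distributed along the genome: the
    probability that two random loci lie within `threshold_cm` of each
    other on a genome of length G is approximately 2 * threshold_cm / G,
    so E[linked_pairs] = C(n, 2) * 2 * threshold_cm / G.
    Solving for G gives the estimator below.
    """
    if linked_pairs <= 0:
        raise ValueError("need at least one linked pair")
    total_pairs = n_markers * (n_markers - 1) / 2
    return total_pairs * 2.0 * threshold_cm / linked_pairs
```

For example, 100 markers with 150 pairs linked at a 25 cM detection threshold give an estimated map length of 1650 cM.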

  12. Improved computational model (AQUIFAS) for activated sludge, integrated fixed-film activated sludge, and moving-bed biofilm reactor systems, part III: analysis and verification.

    Science.gov (United States)

    Sen, Dipankar; Randall, Clifford W

    2008-07-01

    Research was undertaken to analyze and verify a model that can be applied to activated sludge, integrated fixed-film activated sludge (IFAS), and moving-bed biofilm reactor (MBBR) systems. The model embeds a biofilm model into a multicell activated sludge model. The advantage of such a model is that it eliminates the need to run separate computations for a plant being retrofitted from activated sludge to IFAS or MBBR. The biofilm flux rates for organics, nutrients, and biomass can be computed by one of two methods: a semi-empirical model of the biofilm, which is relatively simple, or a diffusional model of the biofilm, which is computationally intensive. Biofilm support media can be incorporated into the anoxic and aerobic cells, but not the anaerobic cells. The model can be run for steady-state and dynamic simulations. The model was able to predict the changes in nitrification and denitrification at both pilot- and full-scale facilities. The semi-empirical and diffusional models of the biofilm were both used to evaluate the biofilm flux rates for media at different locations. The biofilm diffusional model was used to compute the biofilm thickness and growth, substrate concentrations, volatile suspended solids (VSS) concentration, and fraction of nitrifiers in each layer inside the biofilm. Following calibration, both models provided similar effluent results for reactor mixed liquor VSS and mixed liquor suspended solids and for the effluent organics, nitrogen forms, and phosphorus concentrations. While the semi-empirical model was quicker to run, the diffusional model provided additional information on biofilm thickness, quantity of growth in the biofilm, and substrate profiles inside the biofilm. PMID:18710147

  13. INF and IAEA: A comparative analysis of verification strategy

    International Nuclear Information System (INIS)

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities

  14. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and the cost-sensitive classifier was found to produce better results. The system has been evaluated on a fingerprint database, and the experimental results show that it produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.
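The alignment-based matching stage can be illustrated with a toy minutiae matcher. The tolerances and the simple nearest-match rule below are illustrative assumptions of ours; the paper's fuzzy-clustering and cost-sensitive classifier stages are omitted:

```python
import math

def matching_score(template, query, dist_tol=10.0, angle_tol=0.3):
    """Toy alignment-based minutiae matching score: the fraction of
    template minutiae that have a query minutia within the distance and
    angle tolerances. Minutiae are (x, y, theta) tuples; a real system
    would first align the two impressions and handle distortion."""
    matched = 0
    for (x, y, t) in template:
        for (xq, yq, tq) in query:
            if math.hypot(x - xq, y - yq) <= dist_tol and abs(t - tq) <= angle_tol:
                matched += 1
                break
    return matched / len(template)

def verify(template, query, threshold=0.6):
    """Accept the query print if the matching score clears a threshold."""
    return matching_score(template, query) >= threshold
```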

  15. Validation and verification

    Institute of Scientific and Technical Information of China (English)

    J. Manickam

    2007-01-01

    The challenge of developing a reliable predictive code with complicated scientific models includes a process of verification and validation. The process of simulating a physics phenomenon generally includes: (1) identifying the key features of the underlying physics; (2) determining a suitable model set of equations; (3) writing a computer code which can solve the set of equations; and (4) running the code and comparing the results with experimental data.

  16. Distorted Fingerprint Verification System

    OpenAIRE

    Divya KARTHIKAESHWARAN; Jeyalatha SIVARAMAKRISHNAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment based fingerprint matching, fuzzy clustering and classifier framework. First, an enhanced input fingerprint image has been aligned with the...

  17. Polyketide chain length control by chain length factor.

    Science.gov (United States)

    Tang, Yi; Tsai, Shiou-Chuan; Khosla, Chaitan

    2003-10-22

    Bacterial aromatic polyketides are pharmacologically important natural products. A critical parameter that dictates product structure is the carbon chain length of the polyketide backbone. Systematic manipulation of polyketide chain length represents a major unmet challenge in natural product biosynthesis. Polyketide chain elongation is catalyzed by a heterodimeric ketosynthase. In contrast to the homodimeric ketosynthases found in fatty acid synthases, the active site cysteine is absent from one subunit of this heterodimer. The precise role of this catalytically silent subunit has been debated over the past decade. We demonstrate here that this subunit is the primary determinant of polyketide chain length, thereby validating its designation as chain length factor. Using structure-based mutagenesis, we identified key residues in the chain length factor that could be manipulated to convert an octaketide synthase into a decaketide synthase and vice versa. These results should lead to novel strategies for the engineered biosynthesis of hitherto unidentified polyketide scaffolds.

  18. TFE verification program

    Science.gov (United States)

    1990-03-01

    The information presented herein includes evaluated test data, design evaluations, the results of analyses, and the significance of results. The program objective is to demonstrate the technology readiness of a Thermionic Fuel Element (TFE) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized, but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives are shown. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-88; and (5) the thermionic program in 1986 and 1987.

  19. Polyester hydrolytic and synthetic activity catalyzed by the medium-chain-length poly(3-hydroxyalkanoate) depolymerase from Streptomyces venezuelae SO1.

    Science.gov (United States)

    Santos, Marta; Gangoiti, Joana; Keul, Helmut; Möller, Martin; Serra, Juan L; Llama, María J

    2013-01-01

    The extracellular medium-chain-length polyhydroxyalkanoate (MCL-PHA) depolymerase from an isolate identified as Streptomyces venezuelae SO1 was purified to electrophoretic homogeneity and characterized. The molecular mass and pI of the purified enzyme were approximately 27 kDa and 5.9, respectively. The depolymerase showed its maximum activity in the alkaline pH range and at 50 °C, and retained more than 70 % of its initial activity after 8 h at 40 °C. The MCL-PHA depolymerase hydrolyzes various p-nitrophenyl-alkanoates and polycaprolactone but not polylactide, poly-3-hydroxybutyrate, or polyethylene succinate. The enzymatic activity was markedly enhanced by the presence of low concentrations of detergents and organic solvents, and was inhibited by dithiothreitol and EDTA. The potential of using the enzyme to produce (R)-3-hydroxyoctanoate in aqueous media or to catalyze ester-forming reactions in anhydrous media was investigated. In this sense, the MCL-PHA depolymerase catalyzes the hydrolysis of poly-3-hydroxyoctanoate to monomeric units and the ring-opening polymerization of β-butyrolactone and lactides, while ε-caprolactone and pentadecalactone were hardly polymerized. PMID:22695803

  20. Formal Verification of UML Profil

    DEFF Research Database (Denmark)

    Bhutto, Arifa; Hussain, Dil Muhammad Akbar

    2011-01-01

    The Unified Modeling Language (UML) is based on the Model Driven Development (MDD) approach, which captures the system functionality using a platform-independent model (PIM) and appropriate domain-specific languages. In UML-based system notations, the structural view is modeled by the class, component...... and object diagrams, and the behavioral view by the activity, use case, state, and sequence diagrams. However, UML does not provide a formal syntax, so its semantics is not formally definable; to assure correctness, we therefore need to incorporate semantic reasoning through verification, specification......, and refinement, and to incorporate these into the development process. Our motivation is to make the structural view easy to work with and to suggest the formal techniques/methods best applied to UML-based system development. We investigate the tools and methods which are broadly used for the formal...

  1. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and we carried out verification tests of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a result of the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking blocks and dense-graded asphalt pavement when applied to road decontamination. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off from the decontamination object, and the sludge could be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less that of the sludge. This result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)
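Decontamination rates like those quoted above follow from the standard before/after activity comparison; a minimal sketch (the function name and Bq units are our illustration):

```python
def decontamination_rate(before_bq: float, after_bq: float) -> float:
    """Decontamination rate (%) from surface activity measured before
    and after treatment: 100 * (before - after) / before."""
    if before_bq <= 0:
        raise ValueError("pre-treatment activity must be positive")
    return 100.0 * (before_bq - after_bq) / before_bq
```

For example, reducing a surface from 100 to 30 Bq corresponds to a 70% decontamination rate.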

  2. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  3. Criteria for structural verification of fast reactor core elements

    International Nuclear Information System (INIS)

    Structural and functional criteria and the corresponding verifications of the PEC reactor fuel element are presented and discussed. Particular attention has been given to differentiating the structural verifications of low-neutronic-damage zones from those of high-neutronic-damage ones. The structural verification criteria, which had already been presented at the 8th SMIRT Seminar Conference in Paris, underwent some modifications during the preparation of the Safety Report. Finally, some activities necessary for the validation of the structural criteria are indicated, in particular for irradiated components, and for converging towards a European fast reactor code. (author). 3 refs, 6 tabs

  4. Advanced formal verification

    CERN Document Server

    Drechsler, Rolf

    2007-01-01

    Preface. Contributing Authors. Introduction; R. Drechsler. 1. Formal Verification. 2. Challenges. 3. Contributions to this Book. 1: What SAT-Solvers Can and Cannot Do; E. Goldberg. 1. Introduction. 2. Hard Equivalence Checking CNF Formulas. 3. Stable Sets of Points. 2: Advancements in Mixed BDD and SAT Techniques; G. Cabodi, S. Quer. 1. Introduction. 2. Background. 3. Comparing SAT and BDD Approaches: Are they Different? 4. Decision Diagrams as a Slave Engine in General SAT: Clause Compression by Means of ZBDDs. 5. Decision Diagram Preprocessing and Circuit-Based SAT. 6. Using SAT in Symbolic

  5. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  6. EU Environmental Technology Verification pilot programme - Guidance documents: Guidelines for the workflow of documents and information between Verification Bodies, Technical Working Groups and Commission Services

    OpenAIRE

    BARBOSA LANHAM ANA; PIERS DE RAVESCHOOT RONALD; SCHOSGER Jean-Pierre; Henry, Pierre

    2014-01-01

    Environmental Technology Verification (ETV) is a new tool enabling the verification of the claims made for environmental technologies. The Programme is set up foreseeing the existence of Technical Working Groups (TWGs), one for each technology area active under the Pilot programme. These are chaired by the JRC and composed of Commission-invited experts and experts representing the Verification Bodies, with the overall aim of harmonising and exchanging good practices among member states. ...

  7. On the Relationship Between the Length of Season and Tropical Cyclone Activity in the North Atlantic Basin During the Weather Satellite Era, 1960-2013

    Science.gov (United States)

    Wilson, Robert M.

    2014-01-01

    Officially, the North Atlantic basin tropical cyclone season runs from June 1 through November 30 of each year. During this 183-day interval, the vast majority of tropical cyclone onsets are found to occur. For example, in a study of the 715 tropical cyclones that occurred in the North Atlantic basin during the interval 1945-2010, it was found that about 97 percent of them had their onsets during the conventional hurricane season, with the bulk (78 percent) having had onset during the late summer-early fall months of August, September, and October, and with none having had onset in the month of March. The 2014 hurricane season has already had the onset of its first named storm, Arthur, on July 1 (day of year (DOY) 182); it formed off the east coast of Florida, rapidly grew into a category-2 hurricane with a peak 1-minute sustained wind speed of about 90 kt, and struck the coast of North Carolina as a category-2 hurricane on July 3. Arthur is the first hurricane larger than category-1 to strike the United States (U.S.) since 2008, when Ike struck Texas as a category-2 hurricane, and there has not been a major hurricane (category-3 or larger) to strike the U.S. since Wilma struck Florida as a category-3 hurricane in 2005. Only two category-1 hurricanes struck the U.S. in 2012 (Isaac and Sandy, striking Louisiana and New York, respectively), and there were no U.S. land-falling hurricanes in 2013 (also true for the years 1962, 1973, 1978, 1981, 1982, 1990, 1994, 2000, 2001, 2006, 2009, and 2010).
In recent years it has been argued that the length of season (LOS), determined as the inclusive elapsed time between the first storm day (FSD) and the last storm day (LSD) of the yearly hurricane season (i.e., when peak 1-minute sustained wind speed of at least 34 kt occurred and the tropical cyclone was not classified as 'extratropical'), has increased in length with the lengthening believed to be due to the FSD occurring sooner and the LSD occurring
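The LOS metric described above, the inclusive elapsed time between the first storm day (FSD) and the last storm day (LSD), reduces to a simple date computation; a sketch (the function name is ours, and the paper works from storm-day records at this kind of resolution):

```python
from datetime import date

def length_of_season(first_storm_day: date, last_storm_day: date) -> float:
    """Inclusive elapsed length of a hurricane season, in weeks, between
    the first storm day (FSD) and the last storm day (LSD)."""
    days = (last_storm_day - first_storm_day).days + 1  # inclusive count
    return days / 7.0
```

A season running from June 1 to September 15, for instance, spans 107 inclusive days, or about 15.3 weeks, close to the 15.6-week European influenza-season average quoted in record 10 only by coincidence of scale.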

  8. Crystal Structure of Full-length Mycobacterium tuberculosis H37Rv Glycogen Branching Enzyme; Insights of N-Terminal [beta]-Sandwich in Substrate Specificity and Enzymatic Activity

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Kuntal; Kumar, Shiva; Sharma, Shikha; Garg, Saurabh Kumar; Alam, Mohammad Suhail; Xu, H. Eric; Agrawal, Pushpa; Swaminathan, Kunchithapadam (NU Singapore); (Van Andel); (IMT-India)

    2010-07-13

    The open reading frame Rv1326c of Mycobacterium tuberculosis (Mtb) H37Rv encodes for an {alpha}-1,4-glucan branching enzyme (MtbGlgB, EC 2.4.1.18, Uniprot entry Q10625). This enzyme belongs to glycoside hydrolase (GH) family 13 and catalyzes the branching of a linear glucose chain during glycogenesis by cleaving a 1 {yields} 4 bond and making a new 1 {yields} 6 bond. Here, we show the crystal structure of full-length MtbGlgB (MtbGlgBWT) at 2.33-{angstrom} resolution. MtbGlgBWT contains four domains: N1 {beta}-sandwich, N2 {beta}-sandwich, a central ({beta}/{alpha}){sub 8} domain that houses the catalytic site, and a C-terminal {beta}-sandwich. We have assayed the amylase activity with amylose and starch as substrates and the glycogen branching activity using amylose as a substrate for MtbGlgBWT and the N1 domain-deleted (the first 108 residues deleted) Mtb{Delta}108GlgB protein. The N1 {beta}-sandwich, which is formed by the first 105 amino acids and superimposes well with the N2 {beta}-sandwich, is shown to influence substrate binding in the amylase assay. Also, we have checked and shown that several GH13 family inhibitors are ineffective against MtbGlgBWT and Mtb{Delta}108GlgB. We propose a two-step reaction mechanism, for the amylase activity (1 {yields} 4 bond breakage) and isomerization (1 {yields} 6 bond formation), which occurs in the same catalytic pocket. The structural and functional properties of MtbGlgB and Mtb{Delta}108GlgB are compared with those of the N-terminal 112-amino acid-deleted Escherichia coli GlgB (EC{Delta}112GlgB).

  9. Solar cycle length hypothesis appears to support the IPCC on global warming

    DEFF Research Database (Denmark)

    Laut, Peter; Gundermann, Jesper

    1999-01-01

    warming from the enhanced concentrations of greenhouse gases. The "solar hypothesis" claims that solar activity causes a significant component of the global mean temperature to vary in phase opposite to the filtered solar cycle lengths. In an earlier paper we have demonstrated that for data covering...... as the contributions from man-made greenhouse gases and sulphate aerosols by using an upwelling diffusion-energy balance model similar to the model of Wigley and Raper employed in the Second Assessment Report of The Intergovernmental Panel on Climate Change. It turns out that the agreement of the filtered solar cycle...... lengths with the "corrected" temperature anomalies is substantially better than with the historical anomalies. Therefore our findings support a total reversal of the common assumption that a verification of the solar hypothesis would challenge the IPCC assessment of man-made global warming....

  10. Clinical Verification of Homeopathy

    Directory of Open Access Journals (Sweden)

    Michel Van Wassenhoven

    2011-07-01

    The world is changing! This is certainly true regarding homeopathic practice and access to homeopathic medicine. Therefore our first priority at the ECH-LMHI [1] has been to produce a yearly report on the scientific framework of homeopathy. In the 2010 version a new chapter about epidemic diseases has been added, including the Leptospirosis survey of the Cuban population. A second priority has been to review the definition of homeopathic medicines, respecting the new framework generated by the official registration procedure and the WHO report. We are now working on a documented (Materia Medica and provings) list of homeopathic remedies to facilitate the registration of our remedies. The new challenges are: first of all, more good research proposals and as such more funding (possible through ISCHI + the Blackie Foundation, as examples [2]); international acceptance of new guidelines for proving and clinical verification of homeopathic symptoms (proposals are ready for discussion); and a total reconsideration of the homeopathic repertories, including results of the clinical verification of the symptoms. The world is changing, we are part of the world, and changes are needed also for homeopathy!

  11. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  12. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPs/SoCs

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

    In this paper, we present a generic System Verilog Universal Verification Methodology based reusable verification environment for efficient verification of Image Signal Processing IPs/SoCs. With tight schedules on all projects, it is important to have a strong verification methodology which contributes to first-silicon success. Deploying methodologies which enforce full functional coverage and verification of corner cases through pseudo-random test scenarios is required, and standardization of the verification flow is also needed. Previously, inside the imaging group of ST, a Specman (e)/Verilog-based verification environment was used for IP/subsystem-level verification and a C/C++/Verilog-based directed verification environment for SoC-level verification. Different verification environments were used at the IP level and the SoC level, and different verification/validation methodologies were used for SoC verification across multiple sites. Verification teams were also looking for ways to catch bugs early in the design cycle. Thus, a generic System Verilog Universal Verification Methodology (UVM) based reusable verification environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  13. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
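The per-requirement Verification Plan structure described above can be modeled with plain data classes. The field names below mirror the elements listed in the abstract (Verification Requirement, Success Criteria, Method, Level, Owner) but are our illustration, not the project's actual SysML schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationPlan:
    """One plan per requirement, holding the elements the LSST approach
    assigns to each requirement (names are illustrative)."""
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    methods: List[str]   # e.g. "Inspection", "Analysis", "Demonstration", "Test"
    level: str           # e.g. "Subsystem", "System"
    owner: str

@dataclass
class VerificationEvent:
    """A collection of verification activities that can be executed
    concurrently, as in the paper's Verification Events."""
    name: str
    activities: List[VerificationPlan] = field(default_factory=list)
```

Grouping `VerificationPlan` instances into `VerificationEvent`s mirrors the traceability chain from requirement to scheduled activity that the paper maps into the PMCS.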

  14. Characterization of the cloned full-length and a truncated human target of rapamycin: Activity, specificity, and enzyme inhibition as studied by a high capacity assay

    International Nuclear Information System (INIS)

    The mammalian target of rapamycin (mTOR/TOR) is implicated in cancer and other human disorders and is thus an important target for therapeutic intervention. To study human TOR in vitro, we have produced in large scale both the full-length TOR (289 kDa) and a truncated TOR (132 kDa) from HEK293 cells. Both enzymes demonstrated robust and specific catalytic activity towards the physiological substrate proteins, p70 S6 ribosomal protein kinase 1 (p70S6K1) and eIF4E binding protein 1 (4EBP1), as measured with phospho-specific antibodies in Western blotting. We developed a high-capacity dissociation-enhanced lanthanide fluorescence immunoassay (DELFIA) for analysis of kinetic parameters. The Michaelis constant (Km) values of TOR for ATP and the His6-S6K substrate were shown to be 50 and 0.8 μM, respectively. Dose-response and inhibition mechanisms of several known inhibitors, the rapamycin-FKBP12 complex, wortmannin and LY294002, were also studied in DELFIA. Our data indicate that TOR exhibits kinetic features shared by traditional serine/threonine kinases and demonstrate the feasibility of TOR enzyme screens in the search for new inhibitors.
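    The reported Km values plug directly into the Michaelis-Menten rate law; a minimal sketch using the abstract's Km(ATP) = 50 μM and Km(His6-S6K) = 0.8 μM (Vmax is an arbitrary illustrative value, since the abstract does not report one):

```python
def mm_velocity(s_um, km_um, vmax=1.0):
    """Michaelis-Menten velocity v = Vmax * [S] / (Km + [S]), [S] in uM."""
    return vmax * s_um / (km_um + s_um)

# By definition, at [S] = Km the enzyme runs at half of Vmax.
assert abs(mm_velocity(50.0, 50.0) - 0.5) < 1e-9   # ATP
assert abs(mm_velocity(0.8, 0.8) - 0.5) < 1e-9     # His6-S6K

# The much lower Km for the protein substrate means it saturates far earlier:
print(mm_velocity(10.0, 0.8))   # ~0.926 of Vmax
print(mm_velocity(10.0, 50.0))  # ~0.167 of Vmax
```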

  15. Crystallization and preliminary X-ray crystallographic analysis of a full-length active form of the Cry4Ba toxin from Bacillus thuringiensis

    International Nuclear Information System (INIS)

    The crystallization of the Cry4Ba toxin from B. thuringiensis is described. To obtain a complete structure of the Bacillus thuringiensis Cry4Ba mosquito-larvicidal protein, a 65 kDa functional form of the Cry4Ba-R203Q mutant toxin was generated for crystallization by eliminating the tryptic cleavage site at Arg203. The 65 kDa trypsin-resistant fragment was purified and crystallized using the sitting-drop vapour-diffusion method. The crystals belonged to the rhombohedral space group R32, with unit-cell parameters a = b = 184.62, c = 187.36 Å. Diffraction data were collected to at least 2.07 Å resolution using synchrotron radiation and gave a data set with an overall Rmerge of 9.1% and a completeness of 99.9%. Preliminary analysis indicated that the asymmetric unit contained one molecule of the active full-length mutant, with a Matthews coefficient (VM) and solvent content of 4.33 Å³ Da⁻¹ and 71%, respectively.
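    The reported numbers can be cross-checked with standard crystallographic arithmetic; this sketch is my own consistency check, not the authors' calculation, and uses the common rule of thumb that solvent fraction ≈ 1 − 1.23/VM:

```python
import math

# Cell volume for a rhombohedral (R32) cell in the hexagonal setting:
# V = (sqrt(3)/2) * a^2 * c.
a, c = 184.62, 187.36                  # unit-cell parameters, Angstrom
V = math.sqrt(3) / 2 * a**2 * c
print(f"cell volume: {V:.3e} A^3")     # ~5.5e6 A^3

# Matthews rule of thumb relating VM to solvent content.
VM = 4.33                              # reported Matthews coefficient, A^3/Da
solvent = 1 - 1.23 / VM
print(f"solvent content: {solvent:.1%}")  # ~71.6%, consistent with the 71% reported
```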

  16. Diacyltransferase Activity and Chain Length Specificity of Mycobacterium tuberculosis PapA5 in the Synthesis of Alkyl β-Diol Lipids

    Energy Technology Data Exchange (ETDEWEB)

    Touchette, Megan H.; Bommineni, Gopal R.; Delle Bovi, Richard J.; Gadbery, John; Nicora, Carrie D.; Shukla, Anil K.; Kyle, Jennifer E.; Metz, Thomas O.; Martin, Dwight W.; Sampson, Nicole S.; Miller, W. T.; Tonge, Peter J.; Seeliger, Jessica C.

    2015-09-08

    Although classified as Gram-positive bacteria, Corynebacterineae possess an asymmetric outer membrane that imparts structural and thereby physiological similarity to more distantly related Gram-negative bacteria. Like lipopolysaccharide in Gram-negative bacteria, lipids in the outer membrane of Corynebacterineae have been associated with the virulence of pathogenic species such as Mycobacterium tuberculosis (Mtb). For example, Mtb strains that lack long, branched-chain alkyl esters known as dimycocerosates (DIMs) are significantly attenuated in model infections. The resultant interest in the biosynthetic pathway of these unusual virulence factors has led to the elucidation of many of the steps leading to the final esterification of the alkyl beta-diol, phthiocerol, with branched-chain fatty acids known as mycocerosates. PapA5 is an acyltransferase implicated in these final reactions. Here we show that PapA5 is indeed the terminal enzyme in DIM biosynthesis by demonstrating its dual esterification activity and chain-length preference using synthetic alkyl beta-diol substrate analogues. Applying these analogues to a series of PapA5 mutants, we also revise the model for substrate binding within PapA5. Finally, we demonstrate that the Mtb Ser/Thr kinase PknB modifies PapA5 on three Thr residues, including two (T196, T198) located on an unresolved loop. These results clarify the DIM biosynthetic pathway and suggest possible mechanisms by which DIM biosynthesis may be regulated by post-translational modification of PapA5.

  17. Corepressor effect on androgen receptor activity varies with the length of the CAG encoded polyglutamine repeat and is dependent on receptor/corepressor ratio in prostate cancer cells.

    Science.gov (United States)

    Buchanan, Grant; Need, Eleanor F; Barrett, Jeffrey M; Bianco-Miotto, Tina; Thompson, Vanessa C; Butler, Lisa M; Marshall, Villis R; Tilley, Wayne D; Coetzee, Gerhard A

    2011-08-01

    The response of prostate cells to androgens reflects a combination of androgen receptor (AR) transactivation and transrepression, but how these two processes differ mechanistically and influence prostate cancer risk and disease outcome remain elusive. Given recent interest in targeting AR transrepressive processes, a better understanding of AR/corepressor interaction and responses is warranted. Here, we used transactivation and interaction assays with wild-type and mutant ARs, and deletion AR fragments, to dissect the relationship between AR and the corepressor, silencing mediator for retinoic acid and thyroid hormone receptors (SMRT). We additionally tested how these processes are influenced by AR agonist and antagonist ligands, as well as by variation in the polyglutamine tract in the AR amino terminal domain (NTD), which is encoded by a polymorphic CAG repeat in the gene. SMRT was recruited to the AR ligand binding domain by agonist ligand, and as determined by the effect of strategic mutations in activation function 2 (AF-2), requires a precise conformation of that domain. A distinct region of SMRT also mediated interaction with the AR-NTD via the transactivation unit 5 (TAU5; residues 315-538) region. The degree to which SMRT was able to repress AR increased from 17% to 56% as the AR polyglutamine repeat length was increased from 9 to 42 residues, but critically this effect could be abolished by increasing the SMRT:AR molar ratio. These data suggest that the extent to which the CAG encoded polyglutamine repeat influences AR activity represents a balance between corepressor and coactivator occupancy of the same ligand-dependent and independent AR interaction surfaces. Changes in the homeostatic relationship of AR to these molecules, including SMRT, may explain the variable penetrance of the CAG repeat and the loss of AR signaling flexibility in prostate cancer progression.

  18. Measuring Thermodynamic Length

    Energy Technology Data Exchange (ETDEWEB)

    Crooks, Gavin E

    2007-09-07

    Thermodynamic length is a metric distance between equilibrium thermodynamic states. Among other interesting properties, this metric asymptotically bounds the dissipation induced by a finite-time transformation of a thermodynamic system. It is also connected to the Jensen-Shannon divergence, Fisher information, and Rao's entropy differential metric. Therefore, thermodynamic length is of central interest in understanding matter out of equilibrium. In this Letter, we consider how to define thermodynamic length for a small system described by equilibrium statistical mechanics and how to measure thermodynamic length within a computer simulation. Surprisingly, Bennett's classic acceptance-ratio method for measuring free energy differences also measures thermodynamic length.
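    As a toy numerical illustration of the concept (my construction, not Bennett's acceptance-ratio protocol): for a two-level system driven by an energy gap λ in units of kT, the occupation p(λ) = 1/(1+e^λ) is Bernoulli, its Fisher information is I(λ) = p(1−p), and the thermodynamic length of a protocol is the line integral of √I along it, which here has a Gudermannian closed form to check against.

```python
import math

def fisher(lam):
    """Fisher information of the two-level occupation p = 1/(1+e^lam)."""
    p = 1.0 / (1.0 + math.exp(lam))
    return p * (1.0 - p)

def thermo_length(lam0, lam1, steps=100_000):
    """Midpoint-rule integral of sqrt(I(lam)) along the driving path."""
    h = (lam1 - lam0) / steps
    return sum(math.sqrt(fisher(lam0 + (i + 0.5) * h))
               for i in range(steps)) * h

# For this metric, L = gd(l1/2) - gd(l0/2) with gd the Gudermannian function.
gd = lambda x: 2.0 * math.atan(math.tanh(x / 2.0))

L_num = thermo_length(-4.0, 4.0)
L_exact = gd(2.0) - gd(-2.0)
print(L_num, L_exact)  # the two agree closely
```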

  19. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to results from other Monte Carlo radiation transport codes and found very good agreement across a variety of comparison measures. These include prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident that Shift can provide reference results for CASL benchmarking.
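    One of the comparison measures named above, the critical eigenvalue, is conventionally reported as a reactivity difference in pcm (per cent mille). A minimal sketch of that conversion with hypothetical eigenvalues (not Shift results):

```python
def reactivity_pcm(k):
    """Reactivity rho = (k - 1)/k, expressed in pcm (1e-5)."""
    return (k - 1.0) / k * 1e5

k_ref, k_calc = 1.00000, 0.99950   # illustrative reference and computed k-eff
diff_pcm = reactivity_pcm(k_calc) - reactivity_pcm(k_ref)
print(f"{diff_pcm:.1f} pcm")       # ~-50 pcm
```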

  20. Verification and Validation in Computational Fluid Dynamics; TOPICAL

    International Nuclear Information System (INIS)

    Verification and validation (V and V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V and V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V and V, and develops a number of extensions to existing ideas. The review of the development of V and V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V and V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized
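    One standard technique for quantifying numerical error during solution verification is Richardson extrapolation on systematically refined grids. The sketch below uses synthetic second-order data (my construction, not an example from the paper) so the recovered observed order should be exactly 2.

```python
import math

# With solutions f1 (fine), f2 (medium), f3 (coarse) at constant grid
# refinement ratio r, the observed order of accuracy is
#   p = ln((f3 - f2) / (f2 - f1)) / ln(r).
r = 2.0
f = lambda h: 2.0 + 0.3 * h**2      # synthetic solution: exact value 2.0, O(h^2) error
f1, f2, f3 = f(0.25), f(0.5), f(1.0)

p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)
print(p)        # 2.0 -- matches the constructed order

# Richardson-extrapolated estimate of the grid-converged solution:
f_exact = f1 + (f1 - f2) / (r**p - 1.0)
print(f_exact)  # 2.0
```

    In practice the observed order is compared with the formal order of the discretization; a mismatch signals coding errors or solutions outside the asymptotic range.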

  1. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  2. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    Energy Technology Data Exchange (ETDEWEB)

    OLGUIN, L.J.

    2000-09-25

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work.

  3. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  4. Generic interpreters and microprocessor verification

    Science.gov (United States)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  5. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  6. TPS verification with UUT simulation

    Science.gov (United States)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    TPS (Test Program Set) verification, or first-article acceptance testing, commonly depends on fault-insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected on a UUT are limited, and the approach is almost infeasible when the UUT is still in development or in a distributed state. To resolve this problem, a TPS verification method based on simulation of the UUT interface signals is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is essential for automatic TPS verification. After analyzing the ATS software architecture, we propose an approach to realizing interoperability between the ATS software and the UUT simulation platform, and then derive the UUT simulation platform software architecture from the ATS software architecture. The hardware composition and software architecture of the UUT simulation platform are described in detail. The platform has been applied in avionics-equipment TPS development, debugging, and verification.

  7. Video-Based Fingerprint Verification

    OpenAIRE

    Lili Liu; Yilong Yin; Wei Qin

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and...

  8. Trajectory Based Behavior Analysis for User Verification

    Science.gov (United States)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers require a verification step for authorized access. The goal of verification is to tell the true account owner apart from intruders. We propose a general approach to user verification based on user trajectory inputs. The approach is labor-free for users and is likely to resist copying or simulation by non-authorized users or even automatic programs such as bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian-distributed transitions to describe the behavior in a trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning to capture their pairwise relationship. Based on this pairwise relationship, any effective classification or clustering method can be plugged in to detect unauthorized access. The method can also be applied to the task of recognition: predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user's identity, or suggest who owns the trajectory if the identity is not provided.
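    The core idea of scoring a trajectory against a behavioral model can be sketched very simply. The toy below fits a single Gaussian over step lengths and compares average log-likelihoods; this is a drastic simplification of the paper's Markov-chain model and dissimilarity measure, and all the trajectory data are invented.

```python
import math

def fit(steps):
    """Fit mean/std of step lengths from an enrollment trajectory."""
    n = len(steps)
    mu = sum(steps) / n
    var = sum((s - mu) ** 2 for s in steps) / n
    return mu, (math.sqrt(var) or 1e-9)   # guard against zero variance

def avg_loglik(steps, mu, sigma):
    """Average Gaussian log-likelihood of a trajectory's step lengths."""
    return sum(-0.5 * ((s - mu) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi))
               for s in steps) / len(steps)

owner = [5.0, 5.5, 4.8, 5.2, 5.1]     # hypothetical enrollment step lengths
intruder = [12.0, 1.0, 15.0, 0.5]     # hypothetical erratic trajectory

mu, sigma = fit(owner)
# The owner's own trajectory scores far higher than the intruder's.
assert avg_loglik(owner, mu, sigma) > avg_loglik(intruder, mu, sigma)
```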

  9. FINAL REPORT –INDEPENDENT VERIFICATION SURVEY SUMMARY AND RESULTS FOR THE ARGONNE NATIONAL LABORATORY BUILDING 330 PROJECT FOOTPRINT, ARGONNE, ILLINOIS

    Energy Technology Data Exchange (ETDEWEB)

    ERIKA N. BAILEY

    2012-02-29

    ORISE conducted onsite verification activities of the Building 330 project footprint during the period of June 6 through June 7, 2011. The verification activities included technical reviews of project documents, visual inspections, radiation surface scans, and sampling and analysis. The draft verification report was issued in July 2011 with findings and recommendations. The contractor performed additional evaluations and remediation.

  10. NDA techniques for spent fuel verification and radiation monitoring. Report on activities 6a and 6b of Task JNT C799 (SAGOR). Finnish support programme to the IAEA safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Tarvainen, M. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland); Levai, F. [Technical Univ., Budapest (Hungary); Valentine, T.E. [Oak Ridge National Lab., TN (United States); Abhold, M. [Los Alamos National Lab., NM (United States); Moran, B. [USNRC, Washington, DC (United States)

    1997-08-01

    A variety of NDA methods exist for measurement of spent fuel at various stages of the disposition process. Each of the methods has weaknesses and strengths that make them applicable to one or more stages in disposition. Both passive and active methods are, under favorable conditions, capable of providing either a mapping of an assembly to identify missing fuel pins or a measurement of the fissile content and some are capable of providing a mapping of a canister to identify missing assemblies or a measurement of the fissile content. However, a spent fuel measurement system capable of making routine partial defect tests of spent fuel assemblies is missing. The active NDA methods, in particular, the active neutron methods, hold the most promise for providing quantitative measurements on fuel assemblies and canisters. Application of NDA methods to shielded casks may not be practical or even possible due to the extent of radiation attenuation by the shielding materials, and none of these methods are considered to have potential for quantitative measurements once the spent fuel cask has been placed in a repository. The most practical approach to spent fuel verification is to confirm the characteristics of the spent fuel prior to loading in a canister or cask at the conditioning facility. Fissile material tracking systems in addition to containment and surveillance methods have the capability to assure continuity of the verified knowledge of the sample from loading of the canisters to final disposal and closing of the repository. (orig.). 49 refs.

  11. Verification of Monte Carlo transport codes FLUKA, Mars and Shield

    International Nuclear Information System (INIS)

    The present study is a continuation of the project 'Verification of Monte Carlo Transport Codes' which is running at GSI as a part of activation studies of FAIR relevant materials. It includes two parts: verification of stopping modules of FLUKA, MARS and SHIELD-A (with ATIMA stopping module) and verification of their isotope production modules. The first part is based on the measurements of energy deposition function of uranium ions in copper and stainless steel. The irradiation was done at 500 MeV/u and 950 MeV/u, the experiment was held at GSI from September 2004 until May 2005. The second part is based on gamma-activation studies of an aluminium target irradiated with an argon beam of 500 MeV/u in August 2009. Experimental depth profiling of the residual activity of the target is compared with the simulations. (authors)

  12. Addressing verification challenges [International safeguards symposium on addressing verification challenges

    International Nuclear Information System (INIS)

    now provide information relevant to physical protection as well. The IAEA does not receive all information they would need, for example systematic information from the Nuclear Suppliers Group on exports and imports. Other challenges are financial resources (IAEA's budget: $ 130 million) and the IAEA laboratories in Vienna which are not equipped for state- of-the-art analysis of environmental samples. There is also need for transparency measures in certain situations - for example, interviewing people and having access to documents. Another challenge is how to deal with countries having already begun weaponization activities, how to verify that weapons have been dismantled, weaponization structures have been destroyed and custody has been taken of weapon design information. The IAEA recently moved from a system based on facility verification to a State level safeguards approach. The IAEA has also introduced an integrated safeguards approach, which is more cost effective and enables the IAEA to provide better assurances. Environmental sampling and satellite monitoring are new tools that the IAEA is now using almost routinely. Moreover, the IAEA is continuing to work with the Member States to develop new verification tools. Each of the issues discussed presents its own challenge and there is hope for input and new ideas provided by the participants. The real purpose of the symposium is to determine how the IAEA can continue to be effective and relevant, and a valuable instrument to help the international community deal with nuclear weapons proliferation

  13. Multi-project verification of Swedish AIJ projects. Verification results and documentation

    Energy Technology Data Exchange (ETDEWEB)

    Uzzell, J.; Lehmann, M.; Nestaas, I.; Telnes, E.; Lund, T.; Vaage, G. [Det Norske Veritas AS, Hoevik (Norway)

    2000-03-01

    In 2000 DNV was engaged by the Swedish National Energy Administration to carry out a pilot multi-project verification of Swedish AIJ (Activities Implemented Jointly) projects located in the Baltic countries of Estonia, Latvia, and Lithuania. The CO{sub 2} emissions reductions from 27 fuel switch projects were verified as a case study for the multi-project verification methodology developed by DNV. These AIJ projects replaced fossil-fuel boilers with advanced biofuel boiler technology. The biofuel boilers burn primarily wood waste, and their air emissions are assumed to be CO{sub 2} neutral in accordance with IPCC guidelines. The aim of the multi-project methodology is to reduce verification transaction costs by selecting only a sample of the projects for on-site auditing. In order to maintain a high level of confidence in the verified emission reductions, the methodology estimates them conservatively: by discounting the ERUs for uncertainty in monitored data and in baseline parameters, and by extrapolating the observed project reporting error to the projects that were not audited on-site. A logical and transparent selection process was used to choose the projects for on-site auditing; DNV audited on-site 61% of the verified emission reductions while visiting only 11 of the 27 projects. The 27 AIJ projects were assessed against the AIJ and JI criteria existing at the time and were found to be in agreement with them. The total amount of emission reductions that could be conservatively verified by DNV for the period 1993-1999 for these 27 projects was 498,710 tonnes of CO{sub 2}. The lessons learned from this pilot multi-project verification are documented in a companion report.
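    The conservative verification arithmetic described above can be sketched in a few lines. All the numbers below are invented for illustration (the report gives only the 498,710 tCO2 total, not the discount rates), but the structure follows the description: uncertainty discounts on all projects, plus extrapolation of the audit-sample reporting error to projects not visited on-site.

```python
# Hypothetical reported reductions, tonnes CO2
reported = {"audited": 300_000.0, "not_audited": 220_000.0}

uncertainty_discount = 0.05   # assumed discount for monitored-data/baseline uncertainty
audit_error_rate = 0.03       # assumed over-reporting found in the on-site audits

# Audited projects: reporting errors were corrected directly during the audit,
# so only the uncertainty discount applies here.
verified_audited = reported["audited"] * (1 - uncertainty_discount)

# Unaudited projects: additionally extrapolate the sampled reporting error.
verified_not_audited = (reported["not_audited"]
                        * (1 - uncertainty_discount)
                        * (1 - audit_error_rate))

total_verified = verified_audited + verified_not_audited
print(f"{total_verified:,.0f} tCO2 conservatively verified")
```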

  14. SEMI-AUTOMATIC SPEAKER VERIFICATION SYSTEM

    Directory of Open Access Journals (Sweden)

    E. V. Bulgakova

    2016-03-01

    Subject of Research. The paper presents a semi-automatic speaker verification system based on comparing formant values, statistics of phone lengths, and melodic characteristics. With the development of speech technology, there is growing interest in expert speaker verification systems that combine high reliability with low labour intensity by automating the data processing required for expert analysis. System Description. We describe a novel system that assesses the similarity or distinction of speaker voices by comparing statistics of phone lengths, formant features, and melodic characteristics. A characteristic feature of the proposed system, which fuses several methods, is the weak correlation between the analyzed features, which reduces the speaker recognition error rate. A further advantage is rapid analysis of recordings, since data preprocessing and decision making are automated. We describe each method as well as the fusion scheme used to combine their decisions. Main Results. We tested the system on a speech database of 1190 target trials and 10450 non-target trials, including Russian speech from male and female speakers. The recognition accuracy of the system is 98.59% on the database containing records of male speech, and 96.17% on the database containing records of female speech. It was also established experimentally that the formant method is the most reliable of the methods used. Practical Significance. Experimental results show that the proposed system is applicable to the speaker recognition task in the course of phonoscopic examination.

  15. Telomerase inhibitor imetelstat has preclinical activity across the spectrum of non-small cell lung cancer oncogenotypes in a telomere length dependent manner.

    Science.gov (United States)

    Frink, Robin E; Peyton, Michael; Schiller, Joan H; Gazdar, Adi F; Shay, Jerry W; Minna, John D

    2016-05-31

    Telomerase was evaluated as a therapeutic oncotarget by studying the efficacy of the telomerase inhibitor imetelstat in non-small cell lung cancer (NSCLC) cell lines to determine the range of response phenotypes and identify potential biomarkers of response. A panel of 63 NSCLC cell lines was studied for telomere length and for imetelstat efficacy in inhibiting colony formation; no correlation was found with patient characteristics, tumor histology, or oncogenotype. While there was no overall correlation between imetelstat efficacy and initial telomere length (ranging from 1.5 to 20 kb), the quartile of NSCLC lines with the shortest telomeres was more sensitive than the quartile with the longest telomeres. Continuous long-term treatment with imetelstat resulted in sustained telomerase inhibition, progressive telomere shortening and eventual growth inhibition in a telomere-length dependent manner. Cessation of imetelstat therapy before growth inhibition was followed by telomere regrowth. Likewise, in vivo imetelstat treatment caused tumor xenograft growth inhibition in a telomere-length dependent manner. We conclude from these preclinical studies that imetelstat has efficacy across the entire oncogenotype spectrum of NSCLC, that continuous therapy is necessary to prevent telomere regrowth, and that short telomeres appear to be the best treatment biomarker.

  16. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  17. Telomere Length and Mortality

    DEFF Research Database (Denmark)

    Kimura, Masayuki; Hjelmborg, Jacob V B; Gardner, Jeffrey P;

    2008-01-01

    Leukocyte telomere length, representing the mean length of all telomeres in leukocytes, is ostensibly a bioindicator of human aging. The authors hypothesized that shorter telomeres might forecast imminent mortality in elderly people better than leukocyte telomere length. They performed mortality analysis in 548 same-sex Danish twins (274 pairs) aged 73-94 years, of whom 204 pairs experienced the death of one or both co-twins during 9-10 years of follow-up (1997-2007). From the terminal restriction fragment length (TRFL) distribution, the authors obtained the mean TRFL (mTRFL) and the mean values of the shorter 50% (mTRFL(50)) and shortest 25% (mTRFL(25)) of TRFLs in the distribution and computed the mode of TRFL (MTRFL). They analyzed the proportions of twin pairs in which the co-twin with the shorter telomeres died first. The proportions derived from the intrapair comparisons indicated that the shorter

  18. A Characteristic Particle Length

    CERN Document Server

    Roberts, Mark D

    2015-01-01

    It is argued that there are characteristic intervals associated with any particle that can be derived without reference to the speed of light $c$. Such intervals are inferred from zeros of wavefunctions which are solutions to the Schrödinger equation. The characteristic length is $\ell=\beta^2\hbar^2/(8Gm^3)$, where $\beta=3.8\dots$; this length might lead to observational effects on objects the size of a virus.
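
    As a quick numerical check, the characteristic length $\ell=\beta^2\hbar^2/(8Gm^3)$ quoted in the abstract can be evaluated directly; the virus mass used below is an assumed, illustrative value, not a figure from the paper.

```python
# Evaluate l = beta^2 * hbar^2 / (8 G m^3); the mass is an assumed value.
hbar = 1.0545718e-34   # reduced Planck constant, J s
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
beta = 3.8             # leading digits quoted in the abstract
m = 1.0e-17            # kg, roughly the mass of a large virus (assumption)

ell = beta**2 * hbar**2 / (8 * G * m**3)
print(ell)  # on the order of 1e-7 m, i.e. comparable to the size of a virus
```

    For this assumed mass the length comes out near 3e-7 m, consistent with the abstract's remark that effects might appear on virus-sized objects.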

  19. Equilibrium CO bond lengths

    Science.gov (United States)

    Demaison, Jean; Császár, Attila G.

    2012-09-01

    Based on a sample of 38 molecules, 47 accurate equilibrium CO bond lengths have been collected and analyzed. These ultimate experimental (reEX), semiexperimental (reSE), and Born-Oppenheimer (reBO) equilibrium structures are compared to reBO estimates from two lower-level techniques of electronic structure theory, MP2(FC)/cc-pVQZ and B3LYP/6-311+G(3df,2pd). A linear relationship is found between the best equilibrium bond lengths and their MP2 or B3LYP estimates. These (and similar) linear relationships make it possible to estimate the CO bond length with an accuracy of 0.002 Å within the full range of 1.10-1.43 Å, corresponding to single, double, and triple CO bonds, for a large number of molecules. The variation of the CO bond length is qualitatively explained using the Atoms in Molecules method. In particular, a clear correlation is found between the CO bond length and the bond critical point density, and it appears that the CO bond is at the same time covalent and ionic. Conditions which permit the computation of an accurate ab initio Born-Oppenheimer equilibrium structure are discussed. In particular, the core-core and core-valence correlation is investigated and is shown to increase roughly with the bond length.
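
    The linear-relationship idea described above can be sketched as a simple least-squares fit; the bond-length values below are invented for illustration and are not data from the study.

```python
import numpy as np

# Invented example data (not from the study): lower-level MP2-style bond
# lengths (x) versus best equilibrium values (y), in Angstroms.
r_mp2 = np.array([1.128, 1.205, 1.340, 1.425])
r_best = np.array([1.125, 1.200, 1.335, 1.420])

# Fit the linear relationship r_best ~ slope * r_mp2 + intercept,
# then use it to estimate the bond length from a lower-level value.
slope, intercept = np.polyfit(r_mp2, r_best, 1)

def estimate_co_bond(r_lower_level):
    return slope * r_lower_level + intercept
```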

  20. An innovative piping verification program for steam generator replacement

    International Nuclear Information System (INIS)

    The traditional programmatic approach to confirming the acceptability of piping thermal expansion has an impact on the schedule for the startup of nuclear plants. The process of obtaining, evaluating, and resolving critical measurements at pipe supports and plant structures is a critical path activity that extends the time required for the plant to obtain or resume full power operation. In order to support the schedule for, and minimize the duration of, the steam generator replacement (SGR) outage at North Anna Unit 1, an innovative piping verification program was developed and implemented. The approach used for the restart verification program involved a significant planning effort prior to the SGR outage and kept piping system commodity verification activities off of the critical path by performing a series of engineering evaluation tasks before and during the SGR outage. The lessons learned from the successful program are being used to revise and improve the program for implementation on the steam generator replacement project for North Anna Unit 2.

  1. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  2. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  3. Experimental verification of proton beam monitoring in a human body by use of activity image of positron-emitting nuclei generated by nuclear fragmentation reaction.

    Science.gov (United States)

    Nishio, Teiji; Miyatake, Aya; Inoue, Kazumasa; Gomi-Miyagishi, Tomoko; Kohno, Ryosuke; Kameoka, Satoru; Nakagawa, Keiichi; Ogino, Takashi

    2008-01-01

    Proton therapy is a form of radiotherapy that enables concentration of dose on a tumor by use of a scanned or modulated Bragg peak. Therefore, it is very important to evaluate the proton-irradiated volume accurately. The proton-irradiated volume can be confirmed by detection of pair-annihilation gamma rays from positron-emitting nuclei generated by the nuclear fragmentation reaction of the incident protons on target nuclei using a PET apparatus. The activity of the positron-emitting nuclei generated in a patient was measured with a PET-CT apparatus after proton beam irradiation of the patient. Activity measurement was performed in patients with tumors of the brain, head and neck, liver, lungs, and sacrum. The 3-D PET image obtained on the CT image showed the visual correspondence with the irradiation area of the proton beam. Moreover, it was confirmed that there were differences in the strength of activity from the PET-CT images obtained at each irradiation site. The values of activity obtained from both measurement and calculation based on the reaction cross section were compared, and it was confirmed that the intensity and the distribution of the activity changed with the start time of the PET imaging after proton beam irradiation. The clinical use of this information about the positron-emitting nuclei will be important for promoting proton treatment with higher accuracy in the future.

  4. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    Science.gov (United States)

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  5. Verification of radiation transport codes with unstructured meshes

    International Nuclear Information System (INIS)

    Confidence in the results of a radiation transport code requires that the code be verified against problems with known solutions. Such verification problems may be generated by means of the method of manufactured solutions. Previously we reported the application of this method to the verification of radiation transport codes for structured meshes, in particular the SCEPTRE code. We extend this work to verification with unstructured meshes and again apply it to SCEPTRE. We report on additional complexities for unstructured mesh verification of transport codes. Refinement of such meshes for error convergence studies is more involved, particularly for tetrahedral meshes. Furthermore, finite element integrations arising from the presence of the streaming operator exhibit different behavior for unstructured meshes than for structured meshes. We verify SCEPTRE with a combination of 'exact' and 'inexact' problems. Errors in the results are consistent with the discretizations, either being limited to roundoff error or displaying the expected rates of convergence with mesh refinement. We also observe behaviors in the results that were difficult to analyze and predict from a strictly theoretical basis, thereby yielding benefits from verification activities beyond demonstrating code correctness. (author)
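
    The method of manufactured solutions mentioned above can be illustrated on a much simpler operator than a transport equation; the sketch below manufactures u(x) = sin(pi x) for the 1-D Poisson problem -u'' = f, derives the source term analytically, and checks that a second-order finite-difference solver converges at the expected rate. This is a minimal illustration of the technique, not the SCEPTRE verification itself.

```python
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on [0,1] with u(0)=u(1)=0 by central differences,
    using the manufactured solution u(x) = sin(pi x), f = pi^2 sin(pi x)."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    f = np.pi**2 * np.sin(np.pi * x[1:-1])           # manufactured source term
    A = (2.0 * np.eye(n - 1) - np.eye(n - 1, k=1) - np.eye(n - 1, k=-1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))  # error vs. exact solution

# Halving h should reduce the error roughly 4x for a second-order scheme.
e1, e2 = solve_poisson(20), solve_poisson(40)
order = np.log(e1 / e2) / np.log(2.0)
print(order)  # close to 2
```

    Observing the expected convergence rate under mesh refinement is exactly the kind of evidence the abstract describes for accepting a discretization as verified.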

  6. Working Memory Mechanism in Proportional Quantifier Verification

    Science.gov (United States)

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  7. Guidance for the verification and validation of neural networks

    CERN Document Server

    Pullum, L; Darrah, M

    2007-01-01

    Guidance for the Verification and Validation of Neural Networks is a supplement to the IEEE Standard for Software Verification and Validation, IEEE Std 1012-1998. Born out of a need by the National Aeronautics and Space Administration's safety- and mission-critical research, this book compiles over five years of applied research and development efforts. It is intended to assist the performance of verification and validation (V&V) activities on adaptive software systems, with emphasis given to neural network systems. The book discusses some of the difficulties with trying to assure adaptive systems in general, presents techniques and advice for the V&V practitioner confronted with such a task, and based on a neural network case study, identifies specific tasking and recommendations for the V&V of neural network systems.

  8. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs
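
    The unreachable-code analysis described above can be sketched as a reachability search over a control-flow graph; the toy instruction format below is invented for illustration and is unrelated to the actual Intel 8086/Motorola 68000 analysers.

```python
# Toy sketch (invented instruction format) of unreachable-code detection:
# build a control-flow graph and mark instructions not reachable from entry.
program = [
    ("start", "mov"),       # 0: entry point
    ("",      "jmp end"),   # 1: unconditional jump
    ("",      "add"),       # 2: unreachable (after jmp, no label)
    ("end",   "ret"),       # 3
]

labels = {lab: i for i, (lab, _) in enumerate(program) if lab}

def successors(i):
    op = program[i][1]
    if op.startswith("jmp "):
        return [labels[op.split()[1]]]   # unconditional: only the target
    if op == "ret":
        return []
    return [i + 1] if i + 1 < len(program) else []

# Depth-first search from the entry instruction.
reachable, stack = set(), [0]
while stack:
    i = stack.pop()
    if i not in reachable:
        reachable.add(i)
        stack.extend(successors(i))

unreachable = [i for i in range(len(program)) if i not in reachable]
print(unreachable)  # [2]
```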

  9. Mappability and Read Length

    Directory of Open Access Journals (Sweden)

    Wentian eLi

    2014-11-01

    Full Text Available Power-law distributions are the main functional form for the distribution of repeat size and repeat copy number in the human genome. When the genome is broken into fragments for sequencing, the limited size of fragments and reads may prevent a unique alignment of repeat sequences to the reference sequence. Repeats in the human genome can be as long as $10^4$ bases, or $10^5-10^6$ bases when allowing for mismatches between repeat units. Sequence reads from these regions are therefore unmappable when the read length is in the range of $10^3$ bases. With a read length of exactly 1000 bases, slightly more than 1% of the assembled genome, and slightly less than 1% of the 1-kb reads, are unmappable, excluding the unassembled portion of the human genome (8% in GRCh37). The slow decay (long tail) of the power-law function implies a diminishing return in converting unmappable regions/reads into mappable ones as read length increases, with the understanding that increasing read length always moves towards the direction of 100% mappability.
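
    The diminishing-return behavior described above can be sketched with an assumed power-law tail for repeat lengths; the exponent and cutoff below are illustrative choices, not values fitted in the article.

```python
# Assumed Pareto-style tail P(repeat length > x) = (x / x_min)^(-alpha);
# alpha and x_min are illustrative, not fitted values.
ALPHA, X_MIN = 1.0, 100.0

def unmappable_fraction(read_length):
    """Rough proxy: reads shorter than a repeat region are unmappable."""
    if read_length <= X_MIN:
        return 1.0
    return (read_length / X_MIN) ** (-ALPHA)

# A tenfold increase in read length buys only a tenfold reduction here:
print(unmappable_fraction(1_000), unmappable_fraction(10_000))  # 0.1 0.01
```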

  10. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present...

  11. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and size

  12. A new model for verification

    Institute of Scientific and Technical Information of China (English)

    DU Zhen-jun; MA Guang-sheng; FENG Gang

    2007-01-01

    Formal verification is playing a significant role in IC design. However, the common models for verification either have complexity problems or have applicable limitations. In order to overcome these deficiencies, a novel model, WGL (Weighted Generalized List), is proposed, which is based on the general-list decomposition of polynomials, with three different weights and manipulation rules introduced to effect node sharing and canonicity. Timing parameters and operations on them are also considered. Examples show the word-level WGL is the only model to linearly represent the common word-level functions, and the bit-level WGL is especially suitable for arithmetic-intensive circuits. The model is proved to be a uniform and efficient model for both bit-level and word-level functions. Then, based on the WGL model, a backward-construction verification approach is proposed, which reduces time and space complexity for multipliers to polynomial complexity (time complexity less than O(n^3.6) and space complexity less than O(n^1.5)) without hierarchical partitioning. Both the model and the verification method show their theoretical and practical significance in IC design.

  13. Modular Verification of Recursive Programs

    CERN Document Server

    Apt, Krzysztof R; Olderog, Ernst-Rüdiger

    2009-01-01

    We argue that verification of recursive programs by means of the assertional method of C.A.R. Hoare can be conceptually simplified using a modular reasoning. In this approach some properties of the program are established first and subsequently used to establish other program properties. We illustrate this approach by providing a modular correctness proof of the Quicksort program.
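
    The modular idea above, where some properties are established first and then used to establish others, can be mimicked at runtime with assertions; the sketch below checks the two obligations a Hoare-style correctness proof of Quicksort establishes (the output is sorted, and it is a permutation of the input). This is an illustrative executable analogue, not the assertional proof itself.

```python
def quicksort(a):
    """Quicksort with runtime checks of the two proof obligations."""
    if len(a) <= 1:
        return list(a)
    pivot, rest = a[0], a[1:]
    result = (quicksort([x for x in rest if x < pivot]) + [pivot]
              + quicksort([x for x in rest if x >= pivot]))
    assert sorted(result) == sorted(a)  # permutation property
    assert all(result[i] <= result[i + 1]
               for i in range(len(result) - 1))  # sortedness property
    return result

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```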

  14. Verification of safety critical software

    International Nuclear Information System (INIS)

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase[1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulation requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  15. Emergency operating procedures. Generation, verification and validation

    International Nuclear Information System (INIS)

    Systems Response, Operator Cognition and the application of the Emergency Operating Procedures (EOP) Standards for Canadian Nuclear Utilities are three of the four corner stones of the Point Lepreau EOP program, the fourth corner stone is a common sense application of the other three. The Emergency Operating Procedures for the Point Lepreau Generating Station have been subject to two major revisions over the past decade. The later revision, currently in progress, reflects a full application of the 'Emergency Operating Procedures Standards for Canadian Utilities'. The Standards require, prior to issue of an Emergency Operating Procedure, the application of a process which entails the elements of 'Generation', 'Verification' and 'Validation'. This paper describes that process with respect to the production (including Generation, Verification and Validation) of a generic EOP and those EOPs which deal with Loss of Coolant Accidents and Loss of Heat Sink accidents. The activities involved in each of the elements are discussed and illustrated with examples extracted from the EOPs. The EOPs are part of a larger framework which dictates the human response to an upset - the plant specific 'Upset Response Strategy'. That strategy is developed from a fundamental understanding of the process time constants. Likewise, the strategies internal to an EOP must recognize both process time constants and the 'human time constants'. The EOP structure, format and detailed content must recognize the Control Room Operator as an intelligent controller -objectives, inputs, decisions and actions must be expressed with the CROs' cognition foremost. Proper application of the elements of Generation, Verification and Validation ensure that the necessary technical and operational experience has been incorporated into an EOP before it is released to training and before it is issued. (author) 8 refs., 4 figs

  16. Atomic frequency-time-length standards

    International Nuclear Information System (INIS)

    The principles of operation of atomic frequency-time-length standards and their principal characteristics are described. The role of quartz crystal oscillators which are slaved to active or passive standards is presented. (authors)

  17. Verification of the viability of virions detection using neutron activation analysis; Verificacao da viabilidade de deteccao de virions atraves da analise por ativacao com neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Wacha, R.; Silva, A.X. da; Crispim, V.R [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear; Couceiro, J.N.S.S. [Universidade Federal, Rio de Janeiro, RJ (Brazil). Inst. de Microbiologia Professor Paulo de Goes. Dept. de Virologia

    2002-07-01

    The use of nuclear techniques such as Neutron Activation Analysis can be an alternative for microbiological diagnosis, bringing a significant gain in analysis time because samples do not need to be pre-cultivated. In this technique, the samples are collected and exposed to a thermal neutron beam. The interaction of these neutrons with the samples generates gamma rays whose energy spectrum is characteristic of the elemental composition of the samples. In this way, the presence of a virus can be detected in a sample through the distinction of the respective elemental compositions, allowing the analysis to be carried out in real time. In this work, computational simulations were performed using the radiation transport code MCNP4B, based on the Monte Carlo method, to verify the viability of applying this system to the detection of virus particles in their natural collection environment. (author)

  18. Laboratory verification of the Active Particle-induced X-ray Spectrometer (APXS) on the Chang'e-3 mission

    International Nuclear Information System (INIS)

    In the Chang'e-3 mission, the Active Particle-induced X-ray Spectrometer (APXS) on the Yutu rover is used to analyze the chemical composition of lunar soil and rock samples. APXS data are valid only if the sensor head gets close enough to the target and the integration time lasts long enough. Therefore, working distance and integration time are the dominant factors that affect APXS results. This study confirms the ability of APXS to detect elements and investigates the effects of distance and time on the measurements. We make use of a backup APXS instrument to determine the chemical composition of both powder and bulk samples under different working distances and integration times. The results indicate that APXS can detect seven major elements, including Mg, Al, Si, K, Ca, Ti and Fe, when the working distance is less than 30 mm and the integration time is 30 min. The statistical deviation is smaller than 15%. This demonstrates the instrument's ability to detect major elements in the sample. Our measurements also indicate that increasing the integration time reduces the measurement error of the peak area, which is useful for detecting the elements Mg, Al and Si. However, an increase in working distance results in larger measurement errors, which significantly affects the detection of the element Mg. (paper)
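
    The observation that longer integration times reduce the peak-area error is consistent with Poisson counting statistics, where the relative error scales as 1/sqrt(t); the count rate below is an assumed, illustrative value, not an APXS calibration figure.

```python
import math

RATE = 5.0  # assumed counts per second in a characteristic X-ray peak

def relative_peak_error(t_seconds):
    """Poisson relative standard deviation of the accumulated peak area."""
    counts = RATE * t_seconds
    return 1.0 / math.sqrt(counts)

# Going from 5 min to 30 min of integration shrinks the relative error
# by a factor of sqrt(6), about 2.4x.
print(relative_peak_error(5 * 60), relative_peak_error(30 * 60))
```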

  19. Verification of anticlockwise gyre in the semi-closed water area of Lake Nakaumi, southwest Japan, by using 224Ra/228Ra activity ratios

    International Nuclear Information System (INIS)

    The Honjyo area in Lake Nakaumi is a semi-closed brackish water area where some mixing of up-flowing marine water and down-flowing lake water takes place. A large-scale gyre caused by the residual circulation was once indicated by a temporary algal bloom that spread over the semi-closed Honjyo area in brackish Lake Nakaumi. In order to verify this type of water circulation, we examined 224Ra (t1/2=3.66 d)/228Ra (t1/2=5.75 y) activity ratios of both upper and lower waters, which are differentiated by a well-developed halocline. The 224Ra/228Ra ratios in the upper water were lowest in the central area, suggesting the formation of an anticlockwise gyre. The ratios in the lower water were rather uniform, but a basin-wide anticlockwise flow of water is also indicated. The 224Ra/228Ra ratio is clearly effective for tracing the water flow of both the deep and surface waters. (author)

  20. A Series of Diamagnetic Pyridine Monoimine Rhenium Complexes with Different Degrees of Metal-to-Ligand Charge Transfer: Correlating 13C NMR Chemical Shifts with Bond Lengths in Redox-Active Ligands.

    Science.gov (United States)

    Sieh, Daniel; Kubiak, Clifford P

    2016-07-18

    A set of pyridine monoimine (PMI) rhenium(I) tricarbonyl chlorido complexes with substituents of different steric and electronic properties was synthesized and fully characterized. Spectroscopic (NMR and IR) and single-crystal X-ray diffraction analyses of these complexes showed that the redox-active PMI ligands are neutral and that the overall electronic structure is little affected by the choice of substituent on the ligand backbone. One- and two-electron reduction products were prepared from selected starting compounds and could also be characterized by multiple spectroscopic methods and X-ray diffraction. The final product of a one-electron reduction in THF is a diamagnetic metal-metal-bonded dimer after loss of the chlorido ligand. Bond lengths in, and NMR chemical shifts of, the PMI ligand backbone indicate partial electron transfer to the ligand. Two-electron reduction in THF also leads to loss of the chlorido ligand, and a pentacoordinate complex is obtained. The comparison with reported bond lengths and 13C NMR chemical shifts of doubly reduced free pyridine monoaldimine ligands indicates that both redox equivalents in the doubly reduced rhenium complex investigated here are located in the PMI ligand. With diamagnetic complexes varying over three formal reduction stages at the PMI ligand, we were, for the first time, able to establish correlations of the 13C NMR chemical shifts with the relevant bond lengths in redox-active ligands over a full redox series. PMID:27319753

  1. INDEPENDENT VERIFICATION OF THE BUILDING 3550 SLAB AT OAK RIDGE NATIONAL LABORATORY OAK RIDGE, TENNESSEE

    Energy Technology Data Exchange (ETDEWEB)

    Weaver, Phyllis C.

    2012-05-08

    The Oak Ridge Institute for Science and Education (ORISE) has completed the independent verification survey of the Building 3550 Slab; the results of this effort are provided here. The objective of this verification survey is to provide independent review and field assessment of remediation actions conducted by Safety and Ecology Corporation (SEC) to document that the final radiological condition of the slab meets the release guidelines. Verification survey activities on the Building 3550 Slab included scans, measurements, and the collection of smears. Scans for alpha, alpha plus beta, and gamma activity identified several areas that were investigated.

  2. Societal Verification: Intellectual Game or International Game-Changer

    International Nuclear Information System (INIS)

    Within the nuclear nonproliferation and arms control field, there is an increasing appreciation for the potential of open source information technologies to supplement existing verification and compliance regimes. While clearly not a substitute for on-site inspections or national technical means, it may be possible to better leverage information gleaned from commercial satellite imagery, international trade records and the vast amount of data being exchanged online and between publics (including social media) so as to develop a more comprehensive set of tools and practices for monitoring and verifying a state’s nuclear activities and helping judge compliance with international obligations. The next generation “toolkit” for monitoring and verifying items, facility operations and activities will likely include a more diverse set of analytical tools and technologies than are currently used internationally. To explore these and other issues, the Nuclear Threat Initiative has launched an effort that examines, in part, the role that emerging technologies and “citizen scientists” might play in future verification regimes. This paper will include an assessment of past proliferation and security “events” and whether emerging tools and technologies would have provided indicators concurrently or in advance of these actions. Such case studies will be instrumental in understanding the reliability of these technologies and practices and in thinking through the requirements of a 21st century verification regime. Keywords: Verification, social media, open-source information, arms control, disarmament.

  3. Ultrasonic Verification of Composite Structures

    OpenAIRE

    Pelt, Maurice; Boer, Robert Jan; Schoemaker, Christiaan; Sprik, Rudolf

    2014-01-01

    Ultrasonic Verification is a new method for monitoring large surface areas of CFRP by ultrasound with few sensors. The echo response of a transmitted pulse through the structure is compared with the response of an earlier obtained reference signal to calculate a fidelity parameter. A change in fidelity over time is indicative of a new defect in the structure. This paper presents an experimental assessment of the effectiveness and reproducibility of the method.

  4. An Effective Fingerprint Verification Technique

    OpenAIRE

    Gogoi, Minakshi; Bhattacharyya, D K

    2010-01-01

    This paper presents an effective method for fingerprint verification based on a data mining technique called minutiae clustering and a graph-theoretic approach to analyze the process of fingerprint comparison, to give a feature space representation of minutiae, and to produce a lower bound on the number of detectably distinct fingerprints. The method also proves the invariance of each individual fingerprint by using both the topological behavior of the minutiae graph and a distance ...

  5. discouraged by queue length

    Directory of Open Access Journals (Sweden)

    P. R. Parthasarathy

    2001-01-01

    The transient solution is obtained analytically using continued fractions for a state-dependent birth-death queue in which potential customers are discouraged by the queue length. This queueing system is then compared with the well-known infinite-server queueing system, which has the same steady-state solution as the model under consideration, whereas their transient solutions are different. A natural measure of the speed of convergence of the mean number in the system to its stationarity is also computed.
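
    The stated agreement of steady states can be checked directly: with arrival rate discouraged as lambda/(n+1) and service rate mu, the birth-death balance equations yield the same Poisson stationary distribution as the M/M/infinity queue. A minimal sketch (the rates below are chosen arbitrarily for illustration):

```python
import math

lam, mu = 2.0, 1.0   # arbitrary illustrative arrival and service rates
rho = lam / mu

# Birth-death balance: p_{n+1} = p_n * (lam / (n + 1)) / mu, then normalize.
p = [1.0]
for n in range(50):
    p.append(p[-1] * (lam / (n + 1)) / mu)
Z = sum(p)
p = [v / Z for v in p]

# Compare with the Poisson(rho) law, the M/M/infinity stationary distribution.
poisson = [math.exp(-rho) * rho**n / math.factorial(n) for n in range(51)]
print(max(abs(a - b) for a, b in zip(p, poisson)))  # essentially zero
```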

  6. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  7. Regression Verification Using Impact Summaries

    Science.gov (United States)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program versions.
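The impact-summary idea scales down to a toy illustration. The sketch below is hypothetical code, not the paper's tool: a static diff of two function versions identifies the changed branch, only inputs reaching that branch are treated as impacted, and equivalence is checked on those behaviors alone over a bounded input domain (standing in for depth-bounded symbolic execution).

```python
# Sketch (hypothetical example, not the paper's tool): check two program
# versions for behavioral equivalence, but only over the "impacted" inputs --
# those that reach code changed between versions. Unimpacted behaviors are
# skipped, mirroring the impact-summary idea at toy scale.

def v1(x):
    if x < 0:
        return -x          # unchanged branch
    return 2 * x           # changed in v2

def v2(x):
    if x < 0:
        return -x          # unchanged branch
    return x + x           # syntactically different, semantically equal

def impacted(x):
    # A static diff of v1/v2 shows only the x >= 0 branch changed,
    # so only those inputs can expose a behavioral difference.
    return x >= 0

# Bounded stand-in for symbolic execution: enumerate a small input domain,
# checking equivalence only on impacted inputs.
domain = range(-50, 51)
assert all(v1(x) == v2(x) for x in domain if impacted(x))
print("equivalent on impacted behaviors (bounded check)")
```

A real implementation would use a decision procedure over path summaries instead of enumeration, but the soundness argument has the same shape: unimpacted behaviors are identical by construction, so only impacted summaries need checking.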

  8. Measuring verification device error rates

    International Nuclear Information System (INIS)

A verification device generates a Type I (II) error when it recommends to reject (accept) a valid (false) identity claim. For a given identity, the rates or probabilities of these errors quantify random variations of the device from claim to claim. These are intra-identity variations. To some degree, these rates depend on the particular identity being challenged, and there exists a distribution of error rates characterizing inter-identity variations. However, for most security system applications we only need to know averages of this distribution. These averages are called the pooled error rates. In this paper the authors present the statistical underpinnings for the measurement of pooled Type I and Type II error rates. The authors consider a conceptual experiment, "a crate of biased coins". This model illustrates the effects of sampling both within trials of the same individual and among trials from different individuals. Application of this simple model to verification devices yields pooled error rate estimates and confidence limits for these estimates. A sample certification procedure for verification devices is given in the appendix.
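The crate-of-biased-coins model lends itself to a short calculation. The sketch below uses invented per-identity counts and a simple normal-approximation interval, not the paper's exact treatment: each identity is a coin with its own bias, the pooled Type I rate is the average across identities, and the inter-identity spread drives the uncertainty of the pooled estimate.

```python
import math

# Hypothetical data: per-identity Type I counts (false rejections of valid
# claims) out of n trials each -- the "crate of biased coins" model, where
# each identity is a coin with its own bias.
trials = {"id_a": (2, 100), "id_b": (0, 100), "id_c": (5, 100), "id_d": (1, 100)}

rates = [k / n for k, n in trials.values()]
pooled = sum(rates) / len(rates)          # pooled Type I error rate

# Between-identity variation drives the uncertainty of the pooled estimate.
var = sum((r - pooled) ** 2 for r in rates) / (len(rates) - 1)
sem = math.sqrt(var / len(rates))
lo = max(0.0, pooled - 1.96 * sem)        # approx. 95% CI, clamped at zero
hi = pooled + 1.96 * sem

print(f"pooled Type I rate = {pooled:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

With so few identities the interval is wide; a certification procedure like the one in the paper's appendix would prescribe how many identities and trials per identity are needed to shrink it.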

  9. Kinetic analysis of anionic surfactant adsorption from aqueous solution onto activated carbon and layered double hydroxide with the zero length column method

    NARCIS (Netherlands)

    Schouten, Natasja; Ham, Louis G.J. van der; Euverink, Gert-Jan W.; Haan, André B. de

    2009-01-01

    Low cost adsorption technology offers high potential to clean-up laundry rinsing water. From an earlier selection of adsorbents, layered double hydroxide (LDH) and granular activated carbon (GAC) proved to be interesting materials for the removal of anionic surfactant, linear alkyl benzene sulfonate

  10. Gender verification in competitive sports.

    Science.gov (United States)

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  11. Online Signature Verification Based on DCT and Sparse Representation.

    Science.gov (United States)

    Liu, Yishu; Yang, Zhihua; Yang, Lihua

    2015-11-01

    In this paper, a novel online signature verification technique based on discrete cosine transform (DCT) and sparse representation is proposed. We find a new property of DCT, which can be used to obtain a compact representation of an online signature using a fixed number of coefficients, leading to simple matching procedures and providing an effective alternative to deal with time series of different lengths. The property is also used to extract energy features. Furthermore, a new attempt to apply sparse representation to online signature verification is made, and a novel task-specific method for building overcomplete dictionaries is proposed, then sparsity features are extracted. Finally, energy features and sparsity features are concatenated to form a feature vector. Experiments are conducted on the Sabancı University's Signature Database (SUSIG)-Visual and SVC2004 databases, and the results show that our proposed method authenticates persons very reliably with a verification performance which is better than those of state-of-the-art methods on the same databases.
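The fixed-length idea can be sketched generically. The code below is a plain truncated-DCT sketch, not the authors' exact feature set: keeping only the first few low-frequency DCT coefficients maps signature trajectories of different lengths to same-size vectors, so a simple distance can compare them.

```python
import math

def dct2(x):
    """Type-II DCT of a sequence (naive O(N^2); fine for a sketch)."""
    n = len(x)
    return [sum(x[t] * math.cos(math.pi * k * (2 * t + 1) / (2 * n))
                for t in range(n)) for k in range(n)]

def fixed_length_features(series, n_coeffs=8):
    """Keep only the first n_coeffs low-frequency DCT coefficients, giving a
    fixed-size representation of a variable-length signature trajectory."""
    c = dct2(series)
    c += [0.0] * n_coeffs          # pad in case the series is very short
    return c[:n_coeffs]

# Two signature trajectories of different lengths (invented samples) map to
# same-size feature vectors, so a plain Euclidean distance can compare them.
sig_a = [0.0, 0.4, 0.9, 1.0, 0.7, 0.2, -0.3, -0.6, -0.4, 0.0]
sig_b = [0.0, 0.5, 1.0, 0.6, 0.1, -0.5, -0.3]
fa, fb = fixed_length_features(sig_a), fixed_length_features(sig_b)
dist = math.dist(fa, fb)
print(len(fa), len(fb), round(dist, 3))
```

The paper goes further (energy features, sparse coding over a task-specific dictionary), but the fixed-coefficient representation is what removes the variable-length matching problem.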

  12. Segment lengths influence hill walking strategies.

    Science.gov (United States)

    Sheehan, Riley C; Gottschall, Jinger S

    2014-08-22

    Segment lengths are known to influence walking kinematics and muscle activity patterns. During level walking at the same speed, taller individuals take longer, slower strides than shorter individuals. Based on this, we sought to determine if segment lengths also influenced hill walking strategies. We hypothesized that individuals with longer segments would display more joint flexion going uphill and more extension going downhill as well as greater lateral gastrocnemius and vastus lateralis activity in both directions. Twenty young adults of varying heights (below 155 cm to above 188 cm) walked at 1.25 m/s on a level treadmill as well as 6° and 12° up and downhill slopes while we collected kinematic and muscle activity data. Subsequently, we ran linear regressions for each of the variables with height, leg, thigh, and shank length. Despite our population having twice the anthropometric variability, the level and hill walking patterns matched closely with previous studies. While there were significant differences between level and hill walking, there were few hill walking variables that were correlated with segment length. In support of our hypothesis, taller individuals had greater knee and ankle flexion during uphill walking. However, the majority of the correlations were between tibialis anterior and lateral gastrocnemius activities and shank length. Contrary to our hypothesis, relative step length and muscle activity decreased with segment length, specifically shank length. In summary, it appears that individuals with shorter segments require greater propulsion and toe clearance during uphill walking as well as greater braking and stability during downhill walking. PMID:24968942
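The paper's analysis style, a separate linear regression of each gait variable against height or a segment length, is easy to reproduce in miniature. The numbers below are invented for illustration, not the study's data; the negative slope mirrors the reported finding that muscle activity fell with shank length.

```python
# Per-variable linear regression of a gait measure on segment length
# (hypothetical data, not the study's measurements).

def linreg(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Shank length (cm) vs. normalized tibialis anterior activity (invented).
shank = [34, 36, 38, 40, 42, 44]
tib_act = [1.10, 1.04, 0.97, 0.93, 0.88, 0.80]
slope, intercept = linreg(shank, tib_act)
print(f"slope = {slope:.4f} activity units per cm")
```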

  13. Kinetic analysis of anionic surfactant adsorption from aqueous solution onto activated carbon and layered double hydroxide with the zero length column method

    OpenAIRE

    Schouten, Natasja; Ham, Louis G.J. van der; Euverink, Gert-Jan W.; Haan, André B. de

    2009-01-01

    Low cost adsorption technology offers high potential to clean-up laundry rinsing water. From an earlier selection of adsorbents, layered double hydroxide (LDH) and granular activated carbon (GAC) proved to be interesting materials for the removal of anionic surfactant, linear alkyl benzene sulfonate (LAS), which is the main contaminant in rinsing water. The main research question is to identify adsorption kinetics of LAS onto GAC-1240 and LDH. The influence of pre-treatment of the adsorbent, ...
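Adsorption-kinetics data of this kind are often summarized with a pseudo-first-order model; the fit below is a generic sketch with made-up numbers, not the paper's ZLC measurements. The model q(t) = qe(1 − exp(−kt)) linearizes to ln(qe − q) = ln(qe) − kt, so the rate constant k is the negative slope of a least-squares line.

```python
import math

# Illustrative pseudo-first-order fit, a common way to describe adsorption
# kinetics such as LAS uptake on GAC or LDH. Numbers are invented:
# q(t) = qe * (1 - exp(-k * t)).
qe_true, k_true = 0.80, 0.15          # mmol/g, 1/min (assumed)
t = [2, 5, 10, 20, 40]
q = [qe_true * (1 - math.exp(-k_true * ti)) for ti in t]

# Linearized form: ln(qe - q) = ln(qe) - k * t; fit the slope by least squares.
y = [math.log(qe_true - qi) for qi in q]
n = len(t)
mt, my = sum(t) / n, sum(y) / n
k_fit = -sum((a - mt) * (b - my) for a, b in zip(t, y)) / sum((a - mt) ** 2 for a in t)
print(f"fitted rate constant k = {k_fit:.3f} 1/min")
```

With noise-free synthetic data the fit recovers k exactly; real ZLC data would scatter around the line and the residuals would indicate whether a different kinetic model fits better.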

  14. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    Science.gov (United States)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  15. Towards the Verification of Human-Robot Teams

    Science.gov (United States)

    Fisher, Michael; Pearce, Edward; Wooldridge, Mike; Sierhuis, Maarten; Visser, Willem; Bordini, Rafael H.

    2005-01-01

Human-Agent collaboration is increasingly important. Not only do high-profile activities such as NASA missions to Mars intend to employ such teams, but our everyday activities involving interaction with computational devices fall into this category. In many of these scenarios, we are expected to trust that the agents will do what we expect and that the agents and humans will work together as expected. But how can we be sure? In this paper, we bring together previous work on the verification of multi-agent systems with work on the modelling of human-agent teamwork. Specifically, we target human-robot teamwork. This paper provides an outline of the way we are using formal verification techniques in order to analyse such collaborative activities. A particular application is the analysis of human-robot teams intended for use in future space exploration.
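At its core, this kind of formal verification asks whether any reachable state of the team model violates a safety property. The toy below is not the authors' formalism, just an explicit-state reachability check over an invented two-party protocol: we verify that the unsafe pair (robot drilling while the human is in the work area) is unreachable from the initial state.

```python
from collections import deque

# Toy illustration (not the paper's agent formalism): explicit-state
# reachability over a tiny human-robot protocol. States are (robot, human)
# pairs; we check that the unsafe pair -- robot drilling while the human is
# in the work area -- is unreachable, the kind of property a model checker
# would verify.
transitions = {
    ("idle", "away"):        [("moving", "away"), ("idle", "approaching")],
    ("moving", "away"):      [("drilling", "away")],
    ("idle", "approaching"): [("idle", "in_area")],
    ("drilling", "away"):    [("idle", "away")],
    ("idle", "in_area"):     [("idle", "away")],
}
bad = ("drilling", "in_area")

seen, frontier = set(), deque([("idle", "away")])
while frontier:                       # breadth-first state exploration
    s = frontier.popleft()
    if s in seen:
        continue
    seen.add(s)
    frontier.extend(transitions.get(s, []))

print("unsafe state reachable:", bad in seen)
```

Real tools handle vastly larger state spaces symbolically and check temporal-logic properties, but the exhaustive-exploration idea is the same.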

  16. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOVs design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety related MOVs.
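The margin-versus-uncertainty comparison can be written down compactly. The sketch below uses illustrative numbers, not plant data, and a simple normal assumption that the paper's statistical treatment may refine: if the nominal margin M carries a combined uncertainty s, the probability that the true margin stays positive is Phi(M/s).

```python
import math

# Sketch of the margin-vs-uncertainty idea (illustrative numbers, not plant
# data): if the nominal design margin M is normally distributed with standard
# deviation s, the probability that the true margin is positive is Phi(M / s).
def reliability(nominal_margin, sigma):
    z = nominal_margin / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF

margin = 0.25   # 25% nominal thrust margin (assumed)
sigma = 0.10    # combined measurement/setup uncertainty (assumed)
print(f"estimated reliability = {reliability(margin, sigma):.4f}")
```

Trending this number across periodic tests, and recomputing it after packing adjustments or setup changes, is the scheduling use the abstract describes.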

  17. Verification-based Software-fault Detection

    OpenAIRE

    Gladisch, Christoph David

    2011-01-01

Software is used in many safety- and security-critical systems. Software development is, however, an error-prone task. In this dissertation new techniques for the detection of software faults (or software "bugs") are described which are based on a formal deductive verification technology. The described techniques take advantage of information obtained during verification and combine verification technology with deductive fault detection and test generation in a unified way.

  18. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    Science.gov (United States)

    Montgomery, Todd L.

    1995-01-01

This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These guarantees are selectable on a per message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP discounts this. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LAN). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently. This has allowed the verification to maintain a high fidelity between the design model, the implementation model, and the verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches. The protocol as a whole has matured more smoothly through the inclusion of several different perspectives into the product development.

  19. Wind-farm power performance verification

    Energy Technology Data Exchange (ETDEWEB)

    Dutilleux, P. [DEWI German Wind Energy Institute, Wilhelmshaven (Germany)

    2005-07-01

Details of wind farm power performance verification procedures were presented. Verifications were performed at the DEWI test site at Wilhelmshaven, Germany. Types of power performance guarantees included power performance of individual turbines with IEC verification measurement, and nacelle anemometer verification. In addition, availability guarantees were examined, as well as issues concerning energy production guarantees of complete wind farms in relation to nearby meteorology masts. An evaluation of power curve verification measurements was presented as well as measurement procedures relating to IEC standards. Methods for nacelle anemometry verification included calibration of the anemometer; documentation of its exact position and signal chain; and end-to-end calibration from sensor to SCADA database. Classification of anemometers included the impact of dynamical effects and the influence on annual energy production. An example of a project for performance verification of a wind farm with 9 identical wind turbines was presented. The significance of status signals was discussed, as well as alternative methods for power-curve measurements. Evaluation procedures for energy yield and power curve verifications were presented. The upcoming set of IEC standards concerning power curve measurements was discussed. Various alternative verification procedures for wind farm power performance were reviewed. refs., tabs., figs.
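The core of an IEC-style power curve verification is the "method of bins": 10-minute mean wind speed and power pairs are grouped into 0.5 m/s bins and averaged per bin. The sketch below is a simplified version of that binning with invented data; a real campaign adds air-density normalization, sector filtering, and uncertainty analysis.

```python
from collections import defaultdict

# Simplified "method of bins" as used in IEC power-curve verification:
# 10-minute means of wind speed (m/s) and power (kW) are grouped into
# 0.5 m/s bins and averaged per bin. Data below are invented.
samples = [(4.1, 95), (4.3, 110), (4.8, 160), (5.2, 210),
           (5.4, 230), (5.9, 300), (6.1, 330), (6.4, 360)]

bins = defaultdict(list)
for v, p in samples:
    center = round(v / 0.5) * 0.5      # bin centers at multiples of 0.5 m/s
    bins[center].append(p)

curve = {c: sum(ps) / len(ps) for c, ps in sorted(bins.items())}
for c, p in curve.items():
    print(f"{c:4.1f} m/s : {p:6.1f} kW")
```

Comparing the measured binned curve against the guaranteed curve, bin by bin, is what turns this into a performance verification rather than just a measurement.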

  20. Turbulence Modeling Verification and Validation

    Science.gov (United States)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important
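A staple of the verification side (as distinct from turbulence-model validation) is checking the observed order of accuracy by grid convergence; the calculation below is the standard Richardson-style estimate with illustrative values, not numbers from this article. For a p-th order scheme the discretization error shrinks by r^p per refinement, so three systematically refined grids determine p.

```python
import math

# Standard code-verification calculation (generic, not from this article):
# observed order of accuracy from solutions on three systematically refined
# grids with refinement ratio r. For a p-th order scheme,
#   p = ln((f_coarse - f_medium) / (f_medium - f_fine)) / ln(r).
f_fine, f_medium, f_coarse = 1.0001, 1.0004, 1.0016   # illustrative values
r = 2.0
p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
print(f"observed order of accuracy p = {p:.2f}")
```

If the observed p matches the scheme's formal order, the discretization is behaving as designed; a mismatch flags a coding or convergence problem before any turbulence-model validation is attempted.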

  1. SHIELD verification and validation report

    Energy Technology Data Exchange (ETDEWEB)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation.

  2. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive systems.

  3. An Effective Fingerprint Verification Technique

    CERN Document Server

    Gogoi, Minakshi

    2010-01-01

This paper presents an effective method for fingerprint verification based on a data mining technique called minutiae clustering and a graph-theoretic approach that analyzes the process of fingerprint comparison, giving a feature-space representation of minutiae and producing a lower bound on the number of detectably distinct fingerprints. The method also proves the invariance of each individual fingerprint by using both the topological behavior of the minutiae graph and a distance measure called the Hausdorff distance. The method provides a graph-based index generation mechanism for fingerprint biometric data. A self-organizing map neural network is also used for classifying the fingerprints.
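The Hausdorff distance mentioned above is simple to state for minutiae point sets: the largest distance from any point in one set to its nearest neighbor in the other, taken in both directions. The coordinates below are invented; the paper's full pipeline (clustering, graph topology, indexing) is not reproduced here.

```python
import math

# Sketch of the Hausdorff distance used to compare minutiae point sets
# (coordinates are invented): the largest distance from any point in one set
# to its nearest neighbor in the other, taken in both directions.
def hausdorff(a, b):
    def directed(p, q):
        return max(min(math.dist(x, y) for y in q) for x in p)
    return max(directed(a, b), directed(b, a))

minutiae_enrolled = [(10, 12), (25, 30), (40, 18)]
minutiae_probe = [(11, 12), (26, 29), (41, 20)]
print(round(hausdorff(minutiae_enrolled, minutiae_probe), 3))
```

A small Hausdorff distance means every enrolled minutia has a nearby counterpart in the probe and vice versa, which is why it serves as a match score that is robust to a few point perturbations.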

  4. Numerical Verification of Industrial Numerical Codes

    Directory of Open Access Journals (Sweden)

Séthy Montan

    2012-04-01

Several approximations occur during a numerical simulation: physical effects may be discarded, continuous functions replaced by discretized ones, and real numbers replaced by finite-precision representations. The use of floating-point arithmetic generates round-off errors at each arithmetical expression, and some mathematical properties are lost. The aim of the numerical verification activity at EDF R&D is to study the effect of round-off error propagation on the results of a numerical simulation. It is crucial to perform a numerical verification of industrial codes such as those developed at EDF R&D, all the more so for codes running in HPC environments. This paper presents some recent studies on numerical verification at EDF R&D.
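The round-off effects this activity studies are easy to demonstrate: floating-point addition is not associative, so a naive running sum can lose terms entirely, while a compensated summation recovers them. The example below uses Python's exact `math.fsum` as the compensated reference.

```python
import math

# Floating-point addition is not associative: in a naive left-to-right sum,
# each 1.0 is absorbed into 1e16 (below its unit in the last place) and then
# cancelled away, so the naive sum is 0.0. Compensated summation (math.fsum)
# tracks the lost low-order bits and returns the exact result.
vals = [1e16, 1.0, -1e16] * 1000

print(sum(vals))        # 0.0  -- every 1.0 term is lost to rounding
print(math.fsum(vals))  # 1000.0 -- exact compensated summation
```

In an HPC code the same effect appears when reduction order changes across process counts, which is exactly why run-to-run reproducibility is part of numerical verification.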

  5. Smoking cessation early in pregnancy and birth weight, length, head circumference, and endothelial nitric oxide synthase activity in umbilical and chorionic vessels: an observational study of healthy singleton pregnancies

    DEFF Research Database (Denmark)

    Andersen, Malene R; Simonsen, Ulf; Uldbjerg, Niels;

    2009-01-01

BACKGROUND: Reduced production of the vasodilator nitric oxide (NO) in fetal vessels in pregnant smokers may lower the blood flow to the fetus and result in lower birth weight, length, and head circumference. The present study measured endothelial NO synthase (eNOS) activity in fetal umbilical and chorionic vessels from nonsmokers, smokers, and ex-smokers and related the findings to the fetal outcome. METHODS AND RESULTS: Of 266 healthy, singleton pregnancies, 182 women were nonsmokers, 43 were smokers, and 41 stopped smoking early in pregnancy. eNOS activity and concentration were quantified in endothelial cells of the fetal vessels. Cotinine, lipid profiles, estradiol, l-arginine, and dimethylarginines that may affect NO production were determined in maternal and fetal blood. Serum cotinine verified self-reported smoking. Newborns of smokers had a lower weight (P

  6. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies how it differs from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors that influence verification quality are established. Software optimality verification is analyzed, and several metrics are defined for the verification process.

  7. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  8. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  9. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    2012-01-01

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to sho

  10. A Correlation-Based Fingerprint Verification System

    NARCIS (Netherlands)

    Bazen, Asker M.; Verwaaijen, Gerben T.B.; Gerez, Sabih H.; Veelenturf, Leo P.J.; Zwaag, van der Berend Jan

    2000-01-01

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates i

  11. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  12. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Ed; Mader, Angelika

    2004-01-01

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the verificatio

  13. Verification Support for Object Database Design

    NARCIS (Netherlands)

    Spelt, David

    1999-01-01

    In this thesis we have developed a verification theory and a tool for the automated analysis of assertions about object-oriented database schemas. The approach is inspired by the work of [SS89] in which a theorem prover is used to automate the verification of invariants for transactions on a relatio

  14. Methods of Verification, Accountability and Control of Special Nuclear Material

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, J.E.

    1999-05-03

This session demonstrates nondestructive assay (NDA) measurement, surveillance and analysis technology required to protect, control and account (MPC&A) for special nuclear materials (SNM) in sealed containers. These measurements, observations and analyses comprise state-of-the-art, strengthened SNM safeguards systems. Staff member specialists, actively involved in research, development, training and implementation worldwide, will present six NDA verification systems and two software tools for integration and analysis of facility MPC&A data.

  15. Concepts of Model Verification and Validation

    International Nuclear Information System (INIS)

Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. The development of guidelines and procedures for conducting a model V&V program is currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models.
The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model

  16. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    This paper describes the Verification and Validation (V and V ) process for safety software of Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience according to the software development life cycle. The main activities of DRPS V and V process are preparation of software planning documentation, verification of Software Requirement Specification (SRS), Software Design Specification (SDS) and codes, and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. SRS V and V of DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing integrated system test plan, software safety analysis, and software configuration management. Also, SDS V and V of RPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparing integrated software test plan, software safety analysis, and software configuration management. The code V and V of DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of software integration and system integration phase. Software safety analysis at SRS phase uses Hazard Operability (HAZOP) method, at SDS phase it uses HAZOP and Fault Tree Analysis (FTA), and at implementation phase it uses FTA. Finally, software configuration management is performed using Nu-SCM (Nuclear Software Configuration Management) tool developed by KNICS project. 
Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V
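
    The Fault Tree Analysis (FTA) step mentioned above can be illustrated with a minimal sketch. The gate structure and basic-event failure probabilities below are hypothetical, for illustration only, and are not taken from the KNICS project; the evaluation assumes independent basic events.

```python
# Minimal fault tree evaluation under an independence assumption.
# Gate structure and probabilities are hypothetical, for illustration only.

def and_gate(probs):
    """AND gate: output fails only if all inputs fail (independent events)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """OR gate: output fails if any input fails (independent events)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical basic-event failure probabilities
sensor_fail = 1e-3
logic_fail = 1e-4
actuator_fail = 5e-4

# Top event: the trip function fails if the logic fails, the actuator fails,
# OR both redundant sensors fail together.
top = or_gate([and_gate([sensor_fail, sensor_fail]), logic_fail, actuator_fail])
print(f"Top-event probability: {top:.3e}")
```

    The dominant contributors are the single-point failures; the redundant sensor pair contributes only at second order.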

  17. SU-E-J-138: On the Ion Beam Range and Dose Verification in Hadron Therapy Using Sound Waves

    International Nuclear Information System (INIS)

    Purpose: Accurate range verification is of great importance to fully exploit the potential benefits of ion beam therapies. Current research efforts on this topic include the use of PET imaging of induced activity and the detection of emerging prompt gamma rays or secondary particles. It has also been suggested recently to detect the ultrasound waves emitted through the ion energy absorption process. The energy absorbed in a medium is dissipated as heat; the subsequent thermal expansion generates acoustic waves. By using an array of ultrasound transducers, the precise spatial location of the Bragg peak can be obtained. The shape and intensity of the emitted ultrasound pulse depend on several variables, including the absorbed energy and the pulse length. The main objective of this work is to understand how the ultrasound wave amplitude and shape depend on the initial ion energy and intensity, which would help guide future experiments in ionoacoustic imaging. Methods: The absorbed energy densities for protons and carbon ions of different energies and field sizes were obtained using the FLUKA Monte Carlo code. Subsequently, the system of coupled equations for temperature and pressure was solved for different ion pulse intensities and lengths to obtain the pressure wave shape, amplitude and spectral distribution. Results: The calculations show that the excited pressure wave amplitude is proportional to the absorbed energy density and, for longer ion pulses, inversely proportional to the ion pulse duration. It is also shown that the resulting ionoacoustic pressure distribution depends on both the ion pulse duration and the time between pulses. Conclusion: Bragg peak localization using the ionoacoustic signal may eventually lead to the development of an alternative imaging method with sub-millimeter resolution. It may also open a way for in-vivo dose verification from the measured acoustic signal
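
    The scaling reported above (amplitude proportional to absorbed energy density, and inversely proportional to pulse duration for long pulses) can be sketched with a simple thermoacoustic estimate. This is not the paper's coupled-equation model: the long-pulse scaling rule, the confinement time, and the Grueneisen value are illustrative assumptions.

```python
# Illustrative scaling of ionoacoustic pressure amplitude (not the paper's model):
# for pulses longer than the stress-confinement time, amplitude ~ Gamma * E / tau.
GAMMA_WATER = 0.11  # approximate Grueneisen parameter of water at room temperature

def pressure_amplitude(energy_density_j_per_m3, pulse_s, confinement_s=1e-7):
    """Rough amplitude estimate in Pa, assuming a simple long-pulse scaling."""
    if pulse_s <= confinement_s:
        # Stress confinement: full thermoelastic conversion
        return GAMMA_WATER * energy_density_j_per_m3
    # Long pulse: amplitude reduced by the confinement-to-pulse time ratio
    return GAMMA_WATER * energy_density_j_per_m3 * confinement_s / pulse_s

short = pressure_amplitude(1e4, 1e-8)   # stress-confined pulse
long_ = pressure_amplitude(1e4, 1e-6)   # 10x longer than the confinement time
print(short, long_)
```

    Doubling the absorbed energy density doubles the amplitude, while stretching the pulse tenfold beyond confinement reduces it tenfold, matching the stated proportionalities.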

  18. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    The purpose of this thesis is to develop a method for verifying timed temporal properties of continuous dynamical systems, and to develop a method for verifying the safety of an interconnection of continuous systems. The methods must be scalable in the number of continuous variables......, an abstraction approach is taken to the verification problem. A method is developed for abstracting continuous dynamical systems by timed automata. This method is based on subdividing the state space into cells by use of subdivisioning functions that are decreasing along the vector field. To allow...... is developed based on sum of squares programming. In addition, a necessary and sufficient condition is provided for identifying the subdivisioning functions that allow the generation of complete abstractions. A complete abstraction makes it possible to verify and falsify timed temporal properties of continuous...
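
    The central requirement above, a subdivisioning function that decreases along the vector field, can be spot-checked numerically by sampling the Lie derivative dV/dt = grad(V) . f(x). The system and candidate function below are illustrative stand-ins, not taken from the thesis, and a grid check is a heuristic, not the sum-of-squares certificate the thesis constructs.

```python
# Numerical spot-check that a candidate subdivisioning function decreases
# along a vector field, by sampling the Lie derivative on a grid.
# System and function are illustrative, not from the thesis.
import itertools

def f(x, y):
    """Stable linear system: dx/dt = -x, dy/dt = -y."""
    return (-x, -y)

def V(x, y):
    """Candidate function; here a simple quadratic."""
    return x * x + y * y

def lie_derivative(x, y, h=1e-6):
    """Central-difference approximation of grad(V) . f at (x, y)."""
    fx, fy = f(x, y)
    dVdx = (V(x + h, y) - V(x - h, y)) / (2 * h)
    dVdy = (V(x, y + h) - V(x, y - h)) / (2 * h)
    return dVdx * fx + dVdy * fy

# Sample away from the equilibrium at the origin
grid = [i / 4 for i in range(-8, 9) if i != 0]
decreasing = all(lie_derivative(x, y) < 0 for x, y in itertools.product(grid, grid))
print("V decreasing along f on sample grid:", decreasing)
```

    For this system the Lie derivative is -2(x^2 + y^2), strictly negative away from the origin, so the grid check passes.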

  19. MFTF sensor verification computer program

    Energy Technology Data Exchange (ETDEWEB)

    Chow, H.K.

    1984-11-09

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained of their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of individual sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE database computer system.
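
    The periodic sensor-health check described above amounts to comparing each sensor's measured resistance against its recorded baseline. A minimal sketch, with hypothetical sensor IDs, resistances, and tolerance:

```python
# Sketch of a periodic sensor-health check: flag sensors whose resistance
# has drifted from the recorded baseline. All values are hypothetical.
baseline_ohms = {"TC-101": 100.0, "SG-204": 350.0, "LL-007": 1200.0}
monthly_ohms = {"TC-101": 100.4, "SG-204": 402.1, "LL-007": 1198.5}

def check_sensors(baseline, measured, tol_pct=5.0):
    """Return the IDs of sensors drifting more than tol_pct from baseline."""
    flagged = []
    for sensor, ref in baseline.items():
        drift = abs(measured[sensor] - ref) / ref * 100.0
        if drift > tol_pct:
            flagged.append(sensor)
    return flagged

print("Sensors needing attention:", check_sensors(baseline_ohms, monthly_ohms))
```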

  20. Age and disease at an arms length

    DEFF Research Database (Denmark)

    Lassen, Aske Juul

    How are the boundaries of disease and health, age, life and death negotiated through technology and active aging? The paper focuses on how disease and age are dealt with by the active elderly at activity centres in the Copenhagen area. New health technologies lead to new expectations to the longevity...... a chronic (previously fatal) disease. The active elderly often stick to their image of themselves as active, youthful and energetic in spite of a chronic disease. Old age and disease are not what they identify with, and seem to be kept at arm's length. In the paper the author explores how health...

  1. Verification for PWSCC mitigation technologies

    International Nuclear Information System (INIS)

    In order to prevent damage or leakage due to PWSCC (Primary Water Stress Corrosion Cracking) and to improve the reliability of power plants, various technologies for inspection, mitigation, replacement and repair have been developed and deployed. Water jet peening (WJP) and ultrasonic shot peening (USP) have been developed and verified as mitigation technologies, and these techniques have been applied to operating reactor vessels (RV) and steam generators (SG). So far, the effect of WJP/USP on materials without cracks has been confirmed by verification testing. However, pre-inspections for WJP/USP have detection limits. Therefore, it should be confirmed that WJP/USP can be applied in cases where cracks shallower than the detection limit are present, and that the treatment will not have any negative impact on those cracks, such as crack growth. This paper describes the verification test for the applicability of WJP and USP to materials with shallow cracks beyond pre-inspection detection. First, plate mockups with shallow cracks were prepared, and WJP or USP was conducted on the mockups. Then, the residual stress adjacent to the cracks was measured and the mockups were inspected for any shallow crack growth. From the test results, the effect of WJP/USP application on residual stress (compressive stress) on surfaces with shallow cracks was confirmed, and the harmlessness of WJP/USP application to shallow cracks, meaning no crack growth, was also observed. This means that the application of WJP and USP is effective even where shallow cracks beyond detection are present. Therefore, WJP and USP are confirmed as mitigation technologies for PWSCC even in cases where no cracks are detected during the pre-inspection. (authors)

  2. Conducting Verification and Validation of Multi- Agent Systems

    Directory of Open Access Journals (Sweden)

    Nedhal Al Saiyd

    2012-10-01

    Full Text Available Verification and Validation (V&V) is a series of activities, technical and managerial, performed by the system tester rather than the system developer in order to improve system quality and reliability and to assure that the product satisfies the users' operational needs. Verification is the assurance that the products of a particular development phase are consistent with the requirements of that phase and the preceding phase(s), while validation is the assurance that the final product meets system requirements. An outside agency can be used to perform V&V, which is referred to as Independent V&V, or IV&V; V&V performed by a group within the organization but not the developer is referred to as Internal V&V. Use of V&V often accompanies testing, can improve quality assurance, and can reduce risk. This paper puts forward guidelines for performing V&V of Multi-Agent Systems (MAS).

  3. Effects of Tea Saponin on Seed Germination, Root Lengths and Soil Enzyme Activities%茶皂素对种子发芽、根长及土壤酶活性的影响

    Institute of Scientific and Technical Information of China (English)

    侯俊杰; 李宁宁; 吕辉雄; 曾巧云; 吴启堂

    2015-01-01

    Tea saponin, as a biosurfactant, has great potential to be applied to environmental remediation. However, it has certain biological toxicity to the soil ecosystem. In this study, seeds of Chinese flowering cabbage (Brassica parachinensis), mung bean (Vigna radiata L.) and maize (Zea mays L.) were selected to examine the influence of tea saponin added to soil on seed germination rates, root elongation, and the activities of soil catalase and polyphenol oxidase. Results showed that the seed germination rates (excluding mung bean) and root lengths of the three plant species were significantly inhibited by tea saponin at the experimental concentrations, and the inhibitory effects worsened with increasing concentrations of tea saponin. When the tea saponin concentration was 0.2%, root lengths decreased significantly by more than 33.1% and germination rates were reduced by more than 10%, compared with the control (without tea saponin). The inhibition of germination rates and root lengths by tea saponin depended on the plant type: germination rates decreased in the order maize > Chinese flowering cabbage > mung bean, while root lengths decreased in the order Chinese flowering cabbage > maize > mung bean. With increasing concentrations of tea saponin, catalase activity in soil decreased significantly, while soil polyphenol oxidase activity increased significantly.
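
    The inhibition percentages above are computed relative to the untreated control. A minimal calculation, with hypothetical root lengths (the study reports percentages, not these raw values):

```python
# Percent inhibition relative to an untreated control.
# Root lengths below are hypothetical, for illustration only.
def inhibition_pct(control, treated):
    return (control - treated) / control * 100.0

control_root_cm = 6.5
treated_root_cm = 4.3   # at 0.2% tea saponin, hypothetical value
print(f"Root elongation inhibition: {inhibition_pct(control_root_cm, treated_root_cm):.1f}%")
```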

  4. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    Energy Technology Data Exchange (ETDEWEB)

    Van Buren, Kendra L. [Los Alamos National Laboratory; Canfield, Jesse M. [Los Alamos National Laboratory; Hemez, Francois M. [Los Alamos National Laboratory; Sauer, Jeremy A. [Los Alamos National Laboratory

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory, and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited scope of the verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
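
    The rate-of-convergence estimate used in such mesh-refinement studies is the observed order p = log(e_coarse / e_fine) / log(h_coarse / h_fine). A minimal sketch, with hypothetical error values (not HIGRAD results):

```python
# Observed order of accuracy from errors at two mesh resolutions.
# Error and mesh values are hypothetical, for illustration only.
import math

def observed_order(e_coarse, e_fine, h_coarse, h_fine):
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

# Halving the mesh size cuts the error by ~4x for a second-order scheme
p = observed_order(e_coarse=1.0e-2, e_fine=2.5e-3, h_coarse=0.1, h_fine=0.05)
print(f"Observed order of convergence: {p:.2f}")
```

    Agreement between the observed order and the scheme's formal order is the evidence a code verification study looks for.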

  5. On the verification of polynomial system solvers

    Institute of Scientific and Technical Information of China (English)

    Changbo CHEN; Marc MORENO MAZA; Wei PAN; Yuzhen XI

    2008-01-01

    We discuss the verification of mathematical software solving polynomial systems symbolically by way of triangular decomposition. Standard verification techniques are highly resource consuming and apply only to polynomial systems which are easy to solve. We exhibit a new approach which manipulates constructible sets represented by regular systems. We provide comparative benchmarks of different verification procedures applied to four solvers on a large set of well-known polynomial systems. Our experimental results illustrate the high efficiency of our new approach. In particular, we are able to verify triangular decompositions of polynomial systems which are not easy to solve.
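
    The baseline verification technique the paper improves on is back-substitution: checking that a solver's candidate solutions make every polynomial in the system vanish. A minimal sketch on a toy system (not one of the paper's benchmarks), using exact rational arithmetic:

```python
# Verify candidate solutions of a toy polynomial system by back-substitution:
#   x^2 + y^2 - 5 = 0,   x*y - 2 = 0
# Exact rationals avoid floating-point residual tolerances.
from fractions import Fraction as F

def residuals(x, y):
    return (x * x + y * y - 5, x * y - 2)

candidates = [(F(1), F(2)), (F(2), F(1)), (F(-1), F(-2)), (F(-2), F(-1))]
verified = all(all(r == 0 for r in residuals(x, y)) for x, y in candidates)
print("All candidate solutions verified:", verified)
```

    For components of positive dimension this pointwise check no longer suffices, which is why the paper works with constructible sets instead.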

  6. Curve Length Estimation using Vertex Chain Code

    Directory of Open Access Journals (Sweden)

    Habibollah Haron

    2010-09-01

    Full Text Available Most applications in image analysis are based on the Freeman chain code. In this paper, for the first time, the vertex chain code (VCC) proposed by Bribiesca is applied to improve length estimation of 2D digitized curves. The chain code has some advantages, such as stability under shifting, turning and mirroring of the image, and a normalized starting point. Given the variety of length estimator methods, we focused on three specific techniques: first, the approach Bribiesca proposed, which is based on counting links between vertices; second, one based on maximum-length digital straight segments (DSSs); and lastly, local metrics. The results of these length estimators are compared with the real perimeter. The results thus obtained exhibit that length estimation using VCC is nearest to the actual length.
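
    For context, the classic Freeman chain-code length estimator the paper compares against can be sketched in a few lines (VCC handling is more involved): in 8-connectivity, even codes are axis-aligned unit moves and odd codes are diagonal moves of length sqrt(2).

```python
# Basic Freeman chain-code length estimator (8-connectivity).
# Even codes 0,2,4,6 are axis moves (length 1); odd codes are diagonals (sqrt(2)).
import math

def freeman_length(codes):
    return sum(1.0 if c % 2 == 0 else math.sqrt(2.0) for c in codes)

# A 4x4 axis-aligned square boundary: four sides of four unit moves each
square = [0] * 4 + [6] * 4 + [4] * 4 + [2] * 4
print(f"Estimated perimeter: {freeman_length(square):.1f}")  # matches the true 16.0
```

    This estimator is exact for axis-aligned and 45-degree boundaries but systematically biased for other orientations, which is what the more refined estimators address.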

  7. Integrated safety management system verification: Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, R.F.

    1998-08-12

    Department of Energy (DOE) Policy (P) 450.4, Safety Management System Policy, commits to institutionalizing an Integrated Safety Management System (ISMS) throughout the DOE complex. The DOE Acquisition Regulations (DEAR 48 CFR 970) require contractors to manage and perform work in accordance with a documented Integrated Safety Management System. The Manager, Richland Operations Office (RL), initiated a combined Phase 1 and Phase 2 Integrated Safety Management Verification review to confirm that PNNL had successfully submitted a description of their ISMS and had implemented ISMS within the laboratory facilities and processes. A combined review was directed by the Manager, RL, based upon the progress PNNL had made in the implementation of ISM. This report documents the results of the review conducted to verify: (1) that the PNNL integrated safety management system description and enabling documents and processes conform to the guidance provided by the Manager, RL; (2) that corporate policy is implemented by line managers; (3) that PNNL has provided tailored direction to the facility management; and (4) that the Manager, RL, has documented processes that integrate their safety activities and oversight with those of PNNL. The general conduct of the review was consistent with the direction provided by the Under Secretary's Draft Safety Management System Review and Approval Protocol. The purpose of this review was to provide the Manager, RL, with a recommendation as to the adequacy of the ISMS description of the Pacific Northwest Laboratory, based upon compliance with the requirements of 48 CFR 970.5204-2 and -78, and to provide an evaluation of the extent and maturity of ISMS implementation within the Laboratory. Further, this review was intended to provide a model for other DOE Laboratories. In an effort to reduce the time and travel costs associated with ISM verification, the team agreed to conduct preliminary training and orientation electronically and by phone. These

  8. A hybrid framework for verification of satellite precipitation products

    Science.gov (United States)

    Li, J.; Hsu, K.; AghaKouchak, A.; Sorooshian, S.

    2011-12-01

    Advances in satellite technology have led to the development of many remote-sensing algorithms to estimate precipitation at quasi-global scales. A number of satellite precipitation products are provided at high spatial and temporal resolutions that are suitable for short-term hydrologic applications. Several coordinated validation activities have been established to evaluate the accuracy of satellite precipitation. Traditional verification measures summarize pixel-to-pixel differences between observations and estimates. Object-based verification methods, however, extend pixel-based validation to address errors related to spatial patterns and storm structure, such as the shape, volume, and distribution of precipitation rain-objects. In this investigation, a 2D watershed segmentation technique is used to identify rain storm objects and is further adopted in a hybrid verification framework to diagnose the storm-scale rainfall objects from both satellite-based precipitation estimates and ground observations (radar estimates). Five key scores are identified in the object-based verification framework, including false alarm ratio, missing ratio, maximum of total interest, and equal-weight and weighted summation of total interest. These scores indicate the performance of satellite estimates with features extracted from the segmented storm objects. The proposed object-based verification framework was used to evaluate PERSIANN, PERSIANN-CCS, CMORPH, and 3B42RT against the NOAA stage IV MPE multi-sensor composite rain analysis. All estimates are evaluated at 0.25°x0.25° daily scale in summer 2008 over the continental United States (CONUS). The five final scores for each precipitation product are compared with the median of maximum interest (MMI) of the Method for Object-Based Diagnostic Evaluation (MODE). The results show PERSIANN and CMORPH outperform 3B42RT and PERSIANN-CCS. Different satellite products presented distinct features of precipitation. For example, the sizes of
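
    Two of the object-based scores named above can be sketched for a set of matched storm objects. The object IDs below are hypothetical, and treating a match as simple ID equality is a simplification; real object matching uses spatial-overlap or interest-function criteria.

```python
# Object-based false alarm ratio and missing ratio for matched storm objects.
# Object IDs are hypothetical; matching by ID equality is a simplification.
forecast_objects = {"A", "B", "C", "D"}   # objects in the satellite estimate
observed_objects = {"A", "B", "E"}        # objects in the radar analysis
matched = forecast_objects & observed_objects

false_alarm_ratio = len(forecast_objects - matched) / len(forecast_objects)
missing_ratio = len(observed_objects - matched) / len(observed_objects)
print(f"FAR={false_alarm_ratio:.2f}  miss ratio={missing_ratio:.2f}")
```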

  9. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    Energy Technology Data Exchange (ETDEWEB)

    BRATZEL, D.R.

    2000-09-28

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks.

  10. Flammable Gas Refined Safety Analysis Tool Software Verification and Validation Report for Resolve Version 2.5

    International Nuclear Information System (INIS)

    The purpose of this report is to document all software verification and validation activities, results, and findings related to the development of Resolve Version 2.5 for the analysis of flammable gas accidents in Hanford Site waste tanks

  11. Minor modifications to the phosphate groups and the C3' acyl chain length of lipid A in two Bordetella pertussis strains, BP338 and 18-323, independently affect Toll-like receptor 4 protein activation.

    Science.gov (United States)

    Shah, Nita R; Albitar-Nehme, Sami; Kim, Emma; Marr, Nico; Novikov, Alexey; Caroff, Martine; Fernandez, Rachel C

    2013-04-26

    Lipopolysaccharides (LPS) of Bordetella pertussis are important modulators of the immune system. Interaction of the lipid A region of LPS with the Toll-like receptor 4 (TLR4) complex causes dimerization of TLR4 and activation of downstream nuclear factor κB (NFκB), which can lead to inflammation. We have previously shown that two strains of B. pertussis, BP338 (a Tohama I-derivative) and 18-323, display two differences in lipid A structure. 1) BP338 can modify the 1- and 4'-phosphates by the addition of glucosamine (GlcN), whereas 18-323 cannot, and 2) the C3' acyl chain in BP338 is 14 carbons long, but only 10 or 12 carbons long in 18-323. In addition, BP338 lipid A can activate TLR4 to a greater extent than 18-323 lipid A. Here we set out to determine the genetic reasons for the differences in these lipid A structures and the contribution of each structural difference to the ability of lipid A to activate TLR4. We show that three genes of the lipid A GlcN modification (Lgm) locus, lgmA, lgmB, and lgmC (previously locus tags BP0399-BP0397), are required for GlcN modification and a single amino acid difference in LpxA is responsible for the difference in C3' acyl chain length. Furthermore, by introducing lipid A-modifying genes into 18-323 to generate isogenic strains with varying penta-acyl lipid A structures, we determined that both modifications increase TLR4 activation, although the GlcN modification plays a dominant role. These results shed light on how TLR4 may interact with penta-acyl lipid A species.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  13. Tackling Verification and Validation for Prognostics

    Data.gov (United States)

    National Aeronautics and Space Administration — Verification and validation (V&V) has been identified as a critical phase in fielding systems with Integrated Systems Health Management (ISHM) solutions to...

  14. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  15. Seismic design verification of LMFBR structures

    Energy Technology Data Exchange (ETDEWEB)

    1977-07-01

    The report provides an assessment of the seismic design verification procedures currently used for nuclear power plant structures, a comparison of the dynamic test methods available, and conclusions and recommendations for future LMFBR structures.

  16. MAMA Software Features: Quantification Verification Documentation-1

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  17. Procedure Verification and Validation Toolset Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research is aimed at investigating a procedure verification and validation toolset, which will allow the engineers who are responsible for developing...

  18. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, for establishing the degree of visual observation to be performed, and for documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  19. Particularities of Verification Processes for Distributed Informatics Applications

    OpenAIRE

    IVAN, ION; Cristian CIUREA; Bogdan VINTILA; Gheorghe NOSCA

    2013-01-01

    This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed and some metrics are d...

  20. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  1. An Assembler Driven Verification Methodology (ADVM)

    OpenAIRE

    Macbeth, John S.; Heinz, Dietmar; Gray, Ken

    2005-01-01

    Submitted on behalf of EDAA (http://www.edaa.com/) International audience This paper presents an overview of an assembler driven verification methodology (ADVM) that was created and implemented for a chip card project at Infineon Technologies AG. The primary advantage of this methodology is that it enables rapid porting of directed tests to new targets and derivatives, with only a minimum amount of code refactoring. As a consequence, considerable verification development time and effort...

  2. Inventory verification measurements using neutron multiplicity counting

    Energy Technology Data Exchange (ETDEWEB)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-12-31

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification.

  3. Verification of MPI-based Computations

    OpenAIRE

    Siegel, Stephen F.

    2008-01-01

    The Message Passing Interface (MPI) is a widely-used parallel programming model and the effective standard for high-performance scientific computing. It has also been used in parallel model checkers, such as DiVinE. In this talk we discuss the verification problem for MPI-based programs. The MPI is quite large and its semantics are complex. Nevertheless, by restricting to a certain subset of MPI, the verification problem becomes tractable. Certain constructs outside of this subset (such as wildc...

  4. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, code input, and calculation results.
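
    At its core, the code-to-code comparison described above reduces to checking that two temperature profiles agree within a tolerance at matching nodes. The temperatures and the 1% criterion below are hypothetical, not the report's actual data:

```python
# Code-to-code thermal comparison: check that two temperature profiles agree
# within a relative tolerance. All values below are hypothetical.
frapcon_temps = [560.0, 720.5, 890.2, 1010.7]   # K, at matching radial nodes
abaqus_temps = [559.8, 721.0, 889.9, 1011.2]    # K, same nodes

def max_rel_diff(a, b):
    return max(abs(x - y) / y for x, y in zip(a, b))

agrees = max_rel_diff(frapcon_temps, abaqus_temps) < 0.01  # 1% criterion
print("Thermal models agree within 1%:", agrees)
```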

  5. A Continuous Verification Process in Concurrent Engineering

    OpenAIRE

    Schaus, Volker; Tiede, Michael; Fischer, Philipp M.; Lüdtke, Daniel; Gerndt, Andreas

    2013-01-01

    This paper presents how a continuous mission verification process, similar to that used in software engineering, can be applied in early spacecraft design and Concurrent Engineering. Following the Model-Based Systems Engineering paradigm, all engineers contribute to one single centralized data model of the system. The data model is enriched with some extra information to create an executable representation of the spacecraft and its mission. That executable scenario allows for verifications agains...

  6. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2012-01-01

    The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification...... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based...

  7. Code Formal Verification of Operation System

    OpenAIRE

    Yu Zhang; Yunwei Dong; Huo Hong; Fan Zhang

    2010-01-01

    With the increasing pressure on non-functional attribute (security, safety and reliability) requirements of an operating system, high-confidence operating systems are becoming more important. Formal verification is the only known way to guarantee that a system is free of programming errors. We research formal verification of the operating system kernel at the system code level, taking theorem proving and model checking as the main technical methods to resolve the key techniques of verifying operatio...

  8. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  9. An Assembler Driven Verification Methodology (ADVM)

    CERN Document Server

    Macbeth, John S; Gray, Ken

    2011-01-01

    This paper presents an overview of an assembler driven verification methodology (ADVM) that was created and implemented for a chip card project at Infineon Technologies AG. The primary advantage of this methodology is that it enables rapid porting of directed tests to new targets and derivatives, with only a minimum amount of code refactoring. As a consequence, considerable verification development time and effort was saved.

  10. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    Science.gov (United States)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, which are assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are confronted with the crowd-sourced data. The available data cover the investigation period from June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties that result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing categorical verification. Verification scores (e.g. the hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
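
    The categorical verification described in this record can be sketched numerically. The following is a minimal illustration, not the MeteoSwiss implementation; the contingency-table counts are hypothetical.

```python
# Sketch of categorical verification from a 2x2 contingency table, as used
# to compare binary crowd-sourced hail reports against radar detections.
# The counts passed in below are hypothetical example values.

def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Common categorical scores derived from a 2x2 contingency table."""
    pod = hits / (hits + misses)                   # probability of detection (hit rate)
    far = false_alarms / (hits + false_alarms)     # false alarm ratio
    n = hits + misses + false_alarms + correct_negatives
    accuracy = (hits + correct_negatives) / n      # fraction of correct outcomes
    csi = hits / (hits + misses + false_alarms)    # critical success index
    return {"POD": pod, "FAR": far, "ACC": accuracy, "CSI": csi}

scores = verification_scores(hits=40, misses=10, false_alarms=20, correct_negatives=130)
print(scores)
```

    Converting both data sources to binary yes/no sequences first, as the authors do, is what makes these simple ratio scores applicable.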

  11. State of the Art in the Research of Formal Verification

    Directory of Open Access Journals (Sweden)

    Serna-M. Edgar

    2014-10-01

    Full Text Available In recent years, research in formal verification of hardware and software has made important progress in the development of methodologies and tools to cope with the increasing complexity of systems. The explicit role of formal verification is to find errors and to improve confidence in the correctness of system designs, which poses a challenge for the software engineering of this century. The purpose of this research is to perform a systematic review of the literature to establish the state of the art of research in formal verification during the last 10 years and to identify the approaches, methods, techniques and methodologies used, as well as the intensity of those research activities. During the process it was found that research in this field has doubled since 2005, that the mean number of studies conducted each year remains stable, and that applications in control and interaction systems prevail. Additionally, it was found that the case study is the most used method and that empirical research is the most applied type.

  12. Verification and validation guidelines for high integrity systems. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Hecht, H.; Hecht, M.; Dinsmore, G.; Hecht, S.; Tang, D. [SoHaR, Inc., Beverly Hills, CA (United States)

    1995-03-01

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities.

  13. Spacelab Mission 2 Pallet-Only Mode Verification Flight

    Science.gov (United States)

    Lester, R. C.; Witt, W. R., Jr.

    1982-02-01

    The Spacelab is a flexible laboratory system, featuring an array of interchangeable components -- pressurized manned laboratories, unpressurized platforms, and related support systems -- that can be assembled in several different configurations of pallets and pressurized modules depending on the specific scientific requirements of the mission. The first two flights of Spacelab are designed to verify the flexibility and utility of all the elements of the Spacelab inventory. Spacelab Mission 2 will constitute the first flight of the pallet-only configuration of Spacelab. The major objective of Mission 2 is the verification of the performance of Spacelab systems and subsystems in this operating mode. The system performance will be verified using a complement of verification flight instrumentation and by operating a complement of scientific instrumentation to obtain scientific data. This paper describes the evolution of Spacelab Mission 2, including a discussion of the verification requirements and instrumentation, the experiment requirements and instrumentation, the major mission-peculiar equipment to integrate the payload, and the general mission planning for the flight. Finally, the current status of the mission will be discussed with emphasis on hardware and software development, and on major activities yet to be completed.

  14. Verification and validation guidelines for high integrity systems. Volume 1

    International Nuclear Information System (INIS)

    High integrity systems include all protective (safety and mitigation) systems for nuclear power plants, and also systems for which comparable reliability requirements exist in other fields, such as in the process industries, in air traffic control, and in patient monitoring and other medical systems. Verification aims at determining that each stage in the software development completely and correctly implements requirements that were established in a preceding phase, while validation determines that the overall performance of a computer system completely and correctly meets system requirements. Volume I of the report reviews existing classifications for high integrity systems and for the types of errors that may be encountered, and makes recommendations for verification and validation procedures, based on assumptions about the environment in which these procedures will be conducted. The final chapter of Volume I deals with a framework for standards in this field. Volume II contains appendices dealing with specific methodologies for system classification, for dependability evaluation, and for two software tools that can automate otherwise very labor intensive verification and validation activities

  15. Standard practice for verification and classification of extensometer systems

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice covers procedures for the verification and classification of extensometer systems, but it is not intended to be a complete purchase specification. The practice is applicable only to instruments that indicate or record values that are proportional to changes in length corresponding to either tensile or compressive strain. Extensometer systems are classified on the basis of the magnitude of their errors. 1.2 Because strain is a dimensionless quantity, this document can be used for extensometers based on either SI or US customary units of displacement. Note 1—Bonded resistance strain gauges directly bonded to a specimen cannot be calibrated or verified with the apparatus described in this practice for the verification of extensometers having definite gauge points. (See procedures as described in Test Methods E251.) 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...

  16. Distributed Verification and Hardness of Distributed Approximation

    CERN Document Server

    Sarma, Atish Das; Kor, Liah; Korman, Amos; Nanongkai, Danupon; Pandurangan, Gopal; Peleg, David; Wattenhofer, Roger

    2010-01-01

    We study the verification problem in distributed networks, stated as follows. Let $H$ be a subgraph of a network $G$ where each vertex of $G$ knows which edges incident on it are in $H$. We would like to verify whether $H$ has some properties, e.g., if it is a tree or if it is connected. We would like to perform this verification in a decentralized fashion via a distributed algorithm. The time complexity of verification is measured as the number of rounds of distributed communication. In this paper we initiate a systematic study of distributed verification, and give almost tight lower bounds on the running time of distributed verification algorithms for many fundamental problems such as connectivity, spanning connected subgraph, and $s-t$ cut verification. We then show applications of these results in deriving strong unconditional time lower bounds on the hardness of distributed approximation for many classical optimization problems including minimum spanning tree, shortest paths, and minimum cut....
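
    As a non-distributed illustration of what is being verified, the sketch below checks the spanning-connected-subgraph predicate centrally; the algorithms studied in the paper decide the same predicate in rounds of distributed communication. The function name and the example graphs are illustrative, not from the paper.

```python
# Centralized sketch of the spanning-connected-subgraph predicate: given the
# vertex set of G and the edge set of a subgraph H, H spans G and is
# connected iff a BFS over H's edges reaches every vertex of G.

from collections import deque

def is_spanning_connected(vertices, h_edges):
    adj = {v: [] for v in vertices}
    for u, v in h_edges:          # build H's adjacency lists
        adj[u].append(v)
        adj[v].append(u)
    if not vertices:
        return True
    start = next(iter(vertices))
    seen = {start}
    queue = deque([start])
    while queue:                  # breadth-first search restricted to H
        u = queue.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return seen == set(vertices)  # spanning and connected iff all reached

print(is_spanning_connected({1, 2, 3, 4}, [(1, 2), (2, 3), (3, 4)]))  # True
print(is_spanning_connected({1, 2, 3, 4}, [(1, 2), (3, 4)]))          # False
```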

  17. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
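
    The qualitative predictions in this abstract can be illustrated with a deliberately simplified Gaussian observer, not the authors' actual model: a zero-mean low-speed prior shrinks the posterior estimate of the tap separation, and it shrinks more at short temporal separations t and for noisier (weaker, harder to localize) taps.

```python
# Toy Gaussian sketch of a low-velocity Bayesian observer (illustrative
# only). The measured separation l_measured between two taps t seconds
# apart, with spatial localization noise sigma_x, is combined with a
# zero-mean Gaussian prior on speed of width sigma_v. The implied prior on
# length has variance (sigma_v * t)^2, so the posterior mean contracts.

def perceived_length(l_measured, t, sigma_x, sigma_v):
    prior_var = (sigma_v * t) ** 2              # implied prior variance on length
    shrink = prior_var / (prior_var + sigma_x ** 2)
    return shrink * l_measured                  # posterior mean estimate

# Contraction intensifies as the temporal separation t decreases...
print(perceived_length(10.0, t=1.0, sigma_x=1.0, sigma_v=5.0))  # ~9.6
print(perceived_length(10.0, t=0.2, sigma_x=1.0, sigma_v=5.0))  # 5.0
# ...and with noisier localization (larger sigma_x), as with weaker taps:
print(perceived_length(10.0, t=0.2, sigma_x=2.0, sigma_v=5.0))  # 2.0
```

    The model thus reproduces both findings reported above: stronger length contraction at small t, and enhanced contraction for weaker taps.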

  19. Context-aware approach for formal verification

    Directory of Open Access Journals (Sweden)

    Amel Benabbou

    2016-02-01

    Full Text Available The context-aware approach has proven to be an effective technique for software model-checking verification. It focuses on the explicit modelling of the environment as one or more contexts. In this area, specifying precise requirements is a challenging task for engineers, since environmental conditions often lack precision. A DSL, called CDL, has been proposed to facilitate the specification of requirements and contexts. However, this language is still low-level and error-prone, it is difficult to apply to complex models, and assessments of its usability remain mixed. In this paper, we propose a high-level formalism of CDL to facilitate specifying contexts, based on interaction overview diagrams that orchestrate activity diagrams automatically transformed from textual use cases. Our approach highlights the boundaries between the system and its environment. It is qualified as context-aware model checking and aims to reduce the semantic gap between informal and formal requirements; the objective is thus to assist and encourage engineers to provide sufficient detail to accomplish the specification process effectively.

  20. Field verification of CO sub 2 -foam

    Energy Technology Data Exchange (ETDEWEB)

    Martin, F.D.; Heller, J.P.; Weiss, W.W.

    1992-05-01

    In September 1989, the Petroleum Recovery Research Center (PRRC), a division of New Mexico Institute of Mining and Technology, received a grant from the US Department of Energy (DOE) for a project entitled ``Field Verification of CO{sub 2} Foam.'' The grant provided for an extension of the PRRC laboratory work to a field testing stage to be performed in collaboration with an oil producer actively conducting a CO{sub 2} flood. The objectives of this project are to: (1) conduct reservoir studies, laboratory tests, simulation runs, and field tests to evaluate the use of foam for mobility control or fluid diversion in a New Mexico CO{sub 2} flood, and (2) evaluate the concept of CO{sub 2}-foam in the field by using a reservoir where CO{sub 2} flooding is ongoing, characterizing the reservoir, modeling the process, and monitoring performance of the field test. Seven tasks were identified for the successful completion of the project: (1) evaluate and select a field site, (2) develop an initial site-specific plan, (3) conduct laboratory CO{sub 2}-foam mobility tests, (4) perform reservoir simulations, (5) design the foam slug, (6) implement a field test, and (7) evaluate results.

  1. Improving Fingerprint Verification Using Minutiae Triplets

    Directory of Open Access Journals (Sweden)

    Leopoldo Altamirano-Robles

    2012-03-01

    Full Text Available Improving fingerprint matching algorithms is an active and important research area in fingerprint recognition. Algorithms based on minutia triplets, an important matcher family, present some drawbacks that impact their accuracy, such as dependency to the order of minutiae in the feature, insensitivity to the reflection of minutiae triplets, and insensitivity to the directions of the minutiae relative to the sides of the triangle. To alleviate these drawbacks, we introduce in this paper a novel fingerprint matching algorithm, named M3gl. This algorithm contains three components: a new feature representation containing clockwise-arranged minutiae without a central minutia, a new similarity measure that shifts the triplets to find the best minutiae correspondence, and a global matching procedure that selects the alignment by maximizing the amount of global matching minutiae. To make M3gl faster, it includes some optimizations to discard non-matching minutia triplets without comparing the whole representation. In comparison with six verification algorithms, M3gl achieves the highest accuracy in the lowest matching time, using FVC2002 and FVC2004 databases.

  2. Visual inspection for CTBT verification

    Energy Technology Data Exchange (ETDEWEB)

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer ``ground truth`` in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  3. Integrated Key based Strict Friendliness Verification of Neighbors in MANET

    CERN Document Server

    Vaman, Dhadesugoor R

    2012-01-01

    A novel Strict Friendliness Verification (SFV) scheme is proposed, based on an integrated key consisting of the symmetric node identity, the geographic location, and the round-trip response time between the sender and receiver radios in a MANET. This key is dynamically updated for the encryption and decryption of each packet to resolve Wormhole and Sybil attacks. Additionally, it meets the minimal key lengths required for symmetric ciphers to provide adequate commercial security. Furthermore, detection of foe or unfriendly nodes is found to increase significantly as the number of symmetric IDs decreases. This paper presents simulations demonstrating the performance of SFV in terms of dynamic range using directional antennas on the radios (or nodes), and in terms of aggregate throughput, average end-to-end delay and packet delivery ratio.

  4. National Verification System of National Meteorological Center , China

    Science.gov (United States)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    The Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department; I am one of them and am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles include: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; 3) to evaluate the forecasting quality of each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate the forecasting quality of each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further

  5. Pairing versus quarteting coherence length

    CERN Document Server

    Delion, Doru S

    2015-01-01

    We systematically analyse the coherence length in even-even nuclei. The pairing coherence length in the spin-singlet channel for the effective density dependent delta (DDD) and Gaussian interaction is estimated. We consider in our calculations bound states as well as narrow resonances. It turns out that the pairing gaps given by the DDD interaction are similar to those of the Gaussian potential if one renormalizes the radial width to the nuclear radius. The correlations induced by the pairing interaction have in all considered cases a long range character inside the nucleus and decrease towards the surface. The mean coherence length is larger than the geometrical radius for light nuclei and approaches this value for heavy nuclei. The effect of the temperature and states in continuum is investigated. Strong shell effects are evidenced, especially for protons. We generalize this concept to quartets by considering similar relations, but between proton and neutron pairs. The quartet coherence length has a similar...

  6. Minimum Length from First Principles

    OpenAIRE

    Calmet, Xavier; Graesser, Michael; Hsu, Stephen D. H.

    2005-01-01

    We show that no device or gedanken experiment is capable of measuring a distance less than the Planck length. By "measuring a distance less than the Planck length" we mean, technically, resolving the eigenvalues of the position operator to within that accuracy. The only assumptions in our argument are causality, the uncertainty principle from quantum mechanics, and a dynamical criterion for gravitational collapse from classical general relativity called the hoop conjecture. The inability of any g...

  7. Production of enzymatically active recombinant full-length barley high pI alpha-glucosidase of glycoside family 31 by high cell-density fermentation of Pichia pastoris and affinity purification

    DEFF Research Database (Denmark)

    Næsted, Henrik; Kramhøft, Birte; Lok, F.;

    2006-01-01

    Recombinant barley high-pI alpha-glucosidase was produced by high cell-density fermentation of Pichia pastoris expressing the cloned full-length gene. The gene was amplified from a genomic clone and the exons (coding regions) were assembled by overlap PCR. The resulting cDNA was expressed under control of the alcohol oxidase 1 promoter using methanol induction of P. pastoris fermentation in a Biostat B 5 L reactor. Forty-two milligrams of alpha-glucosidase was purified from 3.5 L of culture in four steps applying an N-terminal hexa-histidine tag. The apparent molecular mass of the recombinant alpha-glucosidase was 100 kDa, compared to 92 kDa for the native barley enzyme. The secreted recombinant enzyme was highly stable during the 5-day fermentation and had a specific activity significantly superior to that of the enzyme purified previously from barley malt. The kinetic parameters K-m, V-max, and k(cat) were determined to be 1.7 mM, 139...

  8. Ozone Monitoring Instrument geolocation verification

    Science.gov (United States)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map as provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ˜6.1% in latitude and ˜1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and its plumes, that occur on spatial scales comparable to or smaller than OMI nadir pixels.
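
    The relative displacements quoted above follow from dividing the mean offsets by the dimensions of the OMI nadir footprint, taken here as the standard 13 km x 24 km ground pixel; this pixel size is an assumption consistent with the quoted percentages, not a figure stated in the abstract.

```python
# Arithmetic behind the relative-displacement figures: mean geolocation
# offset divided by the relevant OMI nadir pixel dimension, assumed here
# to be 13 km (along-track) by 24 km (across-track).

def relative_displacement(offset_km, pixel_km):
    """Offset expressed as a percentage of the pixel dimension."""
    return 100.0 * offset_km / pixel_km

lat = relative_displacement(0.79, 13.0)  # latitude offset vs 13 km dimension
lon = relative_displacement(0.29, 24.0)  # longitude offset vs 24 km dimension
print(f"{lat:.1f}% {lon:.1f}%")  # → 6.1% 1.2%
```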

  9. Verification of Internal Dose Calculations.

    Science.gov (United States)

    Aissi, Abdelmadjid

    The MIRD internal dose calculations have been in use for more than 15 years, but their accuracy has always been questionable. There have been attempts to verify these calculations; however, these attempts had various shortcomings which kept the question of verification of the MIRD data still unanswered. The purpose of this research was to develop techniques and methods to verify the MIRD calculations in a more systematic and scientific manner. The research consisted of improving a volumetric dosimeter, developing molding techniques, and adapting the Monte Carlo computer code ALGAM to the experimental conditions and vice versa. The organic dosimetric system contained TLD-100 powder and could be shaped to represent human organs. The dosimeter possessed excellent characteristics for the measurement of internal absorbed doses, even in the case of the lungs. The molding techniques are inexpensive and were used in the fabrication of dosimetric and radioactive source organs. The adaptation of the computer program provided useful theoretical data with which the experimental measurements were compared. The experimental data and the theoretical calculations were compared for 6 source organ-7 target organ configurations. The results of the comparison indicated the existence of an agreement between measured and calculated absorbed doses, when taking into consideration the average uncertainty (16%) of the measurements, and the average coefficient of variation (10%) of the Monte Carlo calculations. However, analysis of the data gave also an indication that the Monte Carlo method might overestimate the internal absorbed doses. Even if the overestimate exists, at least it could be said that the use of the MIRD method in internal dosimetry was shown to lead to no unnecessary exposure to radiation that could be caused by underestimating the absorbed dose. The experimental and the theoretical data were also used to test the validity of the Reciprocity Theorem for heterogeneous

  10. Evolution of IAEA verification in relation to nuclear disarmament

    International Nuclear Information System (INIS)

    The Agency has over forty years of experience in applying safeguards in 70 States. This experience has been used to provide safeguards for the 'excess material', nuclear material irreversibly released from the nuclear weapons program in the United States. The IAEA safeguards experience has also helped to put the Trilateral Initiative on a fast track. The basic work on an agreement and on technical verification details is well under way and may feed seamlessly into the Plutonium Management and Disposition Agreement (PMDA). Since fissile material remains the most essential part of a nuclear weapon, technology and approaches currently used for safeguards in non-nuclear-weapon States may be utilized, or further developed, to assure the international community that such material remains irreversibly removed from weapons programs. The IAEA's experience with the relevant nuclear fuel cycle processes permits the application of monitoring regimes to nuclear facilities and their operation, to assure that these facilities cannot be misused for activities proscribed under an international treaty that would ban the production of weapons-usable material. It must be remembered that the application of safeguards pursuant to the NPT is a core activity of the Agency. There is no comparably explicit and forceful mandate for the Agency in respect of nuclear disarmament, unless an international agreement or treaty were to designate the Agency as the verification organization for that agreement, too. The Agency's Statute requests the Agency to 'conduct its activities in conformity with policies of the UN furthering the establishment of safeguarded worldwide disarmament'. Technical skills and experience exist. A path from the IAEA international safeguards regime of today to a verification arrangement under a Fissile Material Cut-off Treaty (FMCT) may be possible.

  11. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    Science.gov (United States)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  12. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    Directory of Open Access Journals (Sweden)

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulae to the Spin format, and generation of counterexamples in terms of automata. Interactive verification makes it possible to decrease verification time and to increase the maximum size of verifiable programs. The considered method supports verification of parallel systems of hierarchical automata that interact with each other through messages and shared variables. A feature of the automaton model is that each state machine is considered as a new data type and can have an arbitrary bounded number of instances. Each state machine in the system can run a different state machine in a new thread or contain a nested state machine. This method was implemented in the developed Stater tool. Stater shows correct operation for all test cases.

  13. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods; however, they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  14. Cleanup Verification Package for the 116-K-2 Effluent Trench

    Energy Technology Data Exchange (ETDEWEB)

    J. M. Capron

    2006-04-04

    This cleanup verification package documents completion of remedial action for the 116-K-2 effluent trench, also referred to as the 116-K-2 mile-long trench and the 116-K-2 site. During its period of operation, the 116-K-2 site was used to dispose of cooling water effluent from the 105-KE and 105-KW Reactors by percolation into the soil. This site also received mixed liquid wastes from the 105-KW and 105-KE fuel storage basins, reactor floor drains, and miscellaneous decontamination activities.

  15. Concepts of Model Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    B.H.Thacker; S.W.Doebling; F.M.Hemez; M.C. Anderson; J.E. Pepin; E.A. Rodriguez

    2004-10-30

    Model verification and validation (V&V) is an enabling methodology for the development of computational models that can be used to make engineering predictions with quantified confidence. Model V&V procedures are needed by government and industry to reduce the time, cost, and risk associated with full-scale testing of products, materials, and weapon systems. Quantifying the confidence and predictive accuracy of model calculations provides the decision-maker with the information necessary for making high-consequence decisions. Guidelines and procedures for conducting a model V&V program are currently being defined by a broad spectrum of researchers. This report reviews the concepts involved in such a program. Model V&V is a current topic of great interest to both government and industry. In response to a ban on the production of new strategic weapons and nuclear testing, the Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship Program (SSP). An objective of the SSP is to maintain a high level of confidence in the safety, reliability, and performance of the existing nuclear weapons stockpile in the absence of nuclear testing. This objective has challenged the national laboratories to develop high-confidence tools and methods that can be used to provide credible models needed for stockpile certification via numerical simulation. There has been a significant increase in activity recently to define V&V methods and procedures. The U.S. Department of Defense (DoD) Modeling and Simulation Office (DMSO) is working to develop fundamental concepts and terminology for V&V applied to high-level systems such as ballistic missile defense and battle management simulations. The American Society of Mechanical Engineers (ASME) has recently formed a Standards Committee for the development of V&V procedures for computational solid mechanics models. The Defense Nuclear Facilities Safety Board (DNFSB) has been a proponent of model V&V for all

  16. Telomere length in human liver diseases.

    Science.gov (United States)

    Urabe, Y; Nouso, K; Higashi, T; Nakatsukasa, H; Hino, N; Ashida, K; Kinugasa, N; Yoshida, K; Uematsu, S; Tsuji, T

    1996-10-01

    To determine the role of telomere-mediated gene stability in hepatocarcinogenesis, we examined the telomere length of human liver with or without chronic liver diseases and hepatocellular carcinomas (HCC). The mean telomere restriction fragment (TRF) length of normal liver (n = 13), chronic hepatitis (n = 11), liver cirrhosis (n = 24) and HCC (n = 24) was 7.8 +/- 0.2, 7.1 +/- 0.3, 6.4 +/- 0.2 and 5.2 +/- 0.2 kb, respectively (mean +/- standard error). TRF length decreased with the progression of chronic liver disease, and TRF length in HCC was significantly shorter than in the other chronic liver diseases. The ratios of the TRF length of HCC to that of the corresponding surrounding liver for well differentiated (n = 7), moderately differentiated (n = 10) and poorly differentiated (n = 4) HCCs were 0.83 +/- 0.06, 0.75 +/- 0.05 and 0.98 +/- 0.09, respectively. The ratio for poorly differentiated HCC was significantly higher than that for moderately differentiated HCC. The telomere length ratio of moderately differentiated HCCs decreased with tumour size until it reached 50 mm in diameter; in contrast, the ratio increased as the size enlarged over 50 mm. These findings suggest that the gene stability of liver cells mediated by the telomere is reduced as chronic liver disease progresses, and that telomerase is activated in poorly differentiated HCC and in moderately differentiated HCC over 50 mm in diameter. PMID:8938628

  17. Energy verification in Ion Beam Therapy

    CERN Document Server

    Moser, F; Dorda, U

    2011-01-01

    The adoption of synchrotrons for medical applications necessitates a comprehensive on-line verification of all beam parameters, autonomous of common beam monitors. In particular for energy verification, the required precision of down to 0.1MeV in absolute terms, poses a special challenge regarding the betatron-core driven 3rd order extraction mechanism which is intended to be used at MedAustron [1]. Two different energy verification options have been studied and their limiting factors were investigated: 1) A time-of-flight measurement in the synchrotron, limited by the orbit circumference information and measurement duration as well as extraction uncertainties. 2) A calorimeter-style system in the extraction line, limited by radiation hardness and statistical fluctuations. The paper discusses in detail the benefits and specific aspects of each method.
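
    The time-of-flight option above amounts to inferring the relativistic kinetic energy from one measured revolution period. A minimal sketch of that relation (the 80 m circumference used in the usage note is illustrative, not a MedAustron parameter):

```python
import math

C = 299_792_458.0        # speed of light, m/s
M0C2_MEV = 931.494       # rest energy per nucleon (1 u), in MeV

def kinetic_energy_mev_per_u(circumference_m, revolution_time_s):
    """Kinetic energy per nucleon from a time-of-flight measurement:
    beta = v/c from one revolution, then E_kin = (gamma - 1) * m0 c^2."""
    beta = (circumference_m / revolution_time_s) / C
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return (gamma - 1.0) * M0C2_MEV
```

    For an assumed 80 m ring, a revolution period corresponding to beta = 0.5 gives roughly 144 MeV/u; reaching the quoted 0.1 MeV absolute precision then requires correspondingly tight knowledge of both the measured period and the orbit circumference, which is exactly the limitation noted above.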

  18. Code Formal Verification of Operation System

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2010-12-01

    With the increasing pressure on the non-functional attributes (security, safety and reliability) required of an operating system, high-confidence operating systems are becoming more important. Formal verification is the only known way to guarantee that a system is free of programming errors. We research formal verification of operating system kernels at the system code level, taking theorem proving and model checking as the main technical methods to resolve the key techniques of verifying an operating system kernel at the C code level. In the end, we present a case study on the verification of real-world C systems code derived from an implementation of μC/OS-II.

  19. Probabilistic Model for Dynamic Signature Verification System

    Directory of Open Access Journals (Sweden)

    Chai Tong Yuen

    2011-11-01

    This study proposes an algorithm for a signature verification system using dynamic parameters of the signature: pen pressure, velocity and position. The system reads, analyzes and verifies signatures from the SUSig online database. First, the test and reference samples are normalized, re-sampled and smoothed in a pre-processing stage. In the verification stage, the difference between the reference and test signatures is calculated based on the proposed thresholded standard deviation method. A probabilistic acceptance model has been designed to enhance the performance of the verification system. The proposed algorithm reports a False Rejection Rate (FRR) of 14.8% and a False Acceptance Rate (FAR) of 2.64%, while the classification rate of the system is around 97%.
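
    The thresholded standard-deviation idea can be sketched as a per-feature acceptance test: a test signature is accepted only if each of its features lies within a multiple of the standard deviation of the reference samples. The feature layout and the tuning parameter k below are hypothetical, not the paper's exact formulation:

```python
from statistics import mean, stdev

def verify_signature(reference_samples, test_sample, k=2.0):
    """Accept the test sample if every feature lies within k standard
    deviations of the reference mean (sketch of a thresholded
    standard-deviation rule; k is a hypothetical tuning parameter).

    reference_samples: list of feature vectors, e.g. mean pressure,
    mean velocity, ... per enrolled signature.
    test_sample: one feature vector of the same length.
    """
    for i in range(len(test_sample)):
        ref_values = [s[i] for s in reference_samples]
        mu, sigma = mean(ref_values), stdev(ref_values)
        if abs(test_sample[i] - mu) > k * max(sigma, 1e-9):
            return False  # feature i deviates too much: reject
    return True
```

    Tightening k lowers the FAR at the cost of a higher FRR, which is the trade-off the paper's probabilistic acceptance model is designed to soften.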

  20. Early Development of UVM based Verification Environment of Image Signal Processing Designs using TLM Reference Model of RTL

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2014-01-01

    Full Text Available With semiconductor industry trend of “smaller the better”, from an idea to a final product, more innovation on product portfolio and yet remaining competitive and profitable are few criteria which are culminating into pressure and need for more and more innovation for CAD flow, process management and project execution cycle. Project schedules are very tight and to achieve first silicon success is key for projects. This necessitates quicker verification with better coverage matrix. Quicker Verification requires early development of the verification environment with wider test vectors without waiting for RTL to be available. In this paper, we are presenting a novel approach of early development of reusable multi-language verification flow, by addressing four major activities of verification – 1. Early creation of Executable Specification 2. Early creation of Verification Environment 3. Early development of test vectors and 4. Better and increased Re-use of blocks Although this paper focuses on early development of UVM based Verification Environment of Image Signal Processing designs using TLM Reference Model of RTL, same concept can be extended for non-image signal processing designs.

  1. Continuously variable focal length lens

    Science.gov (United States)

    Adams, Bernhard W; Chollet, Matthieu C

    2013-12-17

    A material, preferably in crystal form, having a low atomic number such as beryllium (Z=4) provides for the focusing of x-rays in a continuously variable manner. The material is provided with plural spaced curvilinear, optically matched slots and/or recesses through which an x-ray beam is directed. The focal length of the material may be decreased or increased by increasing or decreasing, respectively, the number of slots (or recesses) through which the x-ray beam is directed, while fine tuning of the focal length is accomplished by rotation of the material so as to change the path length of the x-ray beam through the aligned cylindrical slots. X-ray analysis of a fixed point in a solid material may be performed by scanning the energy of the x-ray beam while rotating the material to maintain the beam's focal point at a fixed point in the specimen undergoing analysis.
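
    For a stack of refractive x-ray lens elements, the standard thin-lens result f = R / (2 N delta) shows why adding or removing slots from the beam path changes the focal length stepwise. This is the textbook compound-refractive-lens relation, not a formula quoted from the patent, and the beryllium decrement value in the usage note is only a typical order of magnitude near 10 keV:

```python
def focal_length_m(radius_m, n_elements, delta):
    """Thin-lens focal length of a stack of N parabolic/circular
    refractive x-ray lens elements: f = R / (2 * N * delta).
    delta is the refractive-index decrement (n = 1 - delta), which
    falls off roughly as 1/E^2 with photon energy."""
    return radius_m / (2 * n_elements * delta)
```

    With an assumed apex radius of 0.5 mm and delta ≈ 3.5e-6 (beryllium near 10 keV), ten elements give a focal length of about 7 m; halving the element count doubles it, which is the coarse adjustment described above, while rotating the material changes the effective path length for fine tuning.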

  2. Numerical Weather Predictions Evaluation Using Spatial Verification Methods

    Science.gov (United States)

    Tegoulias, I.; Pytharoulis, I.; Kotsopoulos, S.; Kartsios, S.; Bampzelis, D.; Karacostas, T.

    2014-12-01

    During the last years high-resolution numerical weather prediction simulations have been used to examine meteorological events with increased convective activity. Traditional verification methods do not provide the desired level of information to evaluate those high-resolution simulations. To assess those limitations new spatial verification methods have been proposed. In the present study an attempt is made to estimate the ability of the WRF model (WRF-ARW ver. 3.5.1) to reproduce selected days with high convective activity during the year 2010 using those feature-based verification methods. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02) and central Greece - Thessaly region (d03) are used at horizontal grid-spacings of 15km, 5km and 1km respectively. By alternating microphysics (Ferrier, WSM6, Goddard), boundary layer (YSU, MYJ) and cumulus convection (Kain-Fritsch, BMJ) schemes, a set of twelve model setups is obtained. The results of those simulations are evaluated against data obtained using a C-Band (5cm) radar located at the centre of the innermost domain. Spatial characteristics are well captured but with a variable time lag between simulation results and radar data. Acknowledgements: This research is co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).
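
    One widely used spatial (neighborhood) verification metric, the Fractions Skill Score, illustrates how such methods reward near-misses that traditional point-wise scores penalize: forecast and observed fields are compared as event fractions over a moving window rather than cell by cell. This is a generic sketch of the FSS, not necessarily the metric used in the study:

```python
def _neighborhood_fractions(field, threshold, n):
    """Fraction of cells exceeding threshold in an n x n window
    centred on each grid cell (edges use the truncated window)."""
    rows, cols = len(field), len(field[0])
    half = n // 2
    fracs = []
    for i in range(rows):
        row = []
        for j in range(cols):
            count = total = 0
            for di in range(-half, half + 1):
                for dj in range(-half, half + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < rows and 0 <= jj < cols:
                        total += 1
                        count += field[ii][jj] >= threshold
            row.append(count / total)
        fracs.append(row)
    return fracs

def fractions_skill_score(forecast, observed, threshold, n=3):
    """FSS = 1 - MSE(Pf, Po) / MSE_ref on 2-D grids;
    1 is a perfect forecast, 0 means no skill at this scale."""
    pf = _neighborhood_fractions(forecast, threshold, n)
    po = _neighborhood_fractions(observed, threshold, n)
    num = den = 0.0
    for rf, ro in zip(pf, po):
        for f, o in zip(rf, ro):
            num += (f - o) ** 2
            den += f ** 2 + o ** 2
    return 1.0 - num / den if den else 1.0
```

    Evaluating the score at several window sizes n shows the scale at which a high-resolution forecast becomes skilful, which is precisely the information the abstract says traditional methods fail to provide.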

  3. Overview of bunch length measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Lumpkin, A. H.

    1999-02-19

    An overview of particle and photon beam bunch length measurements is presented in the context of free-electron laser (FEL) challenges. Particle-beam peak current is a critical factor in obtaining adequate FEL gain for both oscillators and self-amplified spontaneous emission (SASE) devices. Since measurement of charge is a standard measurement, the bunch length becomes the key issue for ultrashort bunches. Both time-domain and frequency-domain techniques are presented in the context of using electromagnetic radiation over eight orders of magnitude in wavelength. In addition, the measurement of microbunching in a micropulse is addressed.
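
    The link between bunch length and peak current that the overview leans on is simple arithmetic for a Gaussian bunch: with the charge known, measuring the bunch duration fixes the peak current.

```python
import math

def peak_current_amps(charge_coulombs, sigma_t_seconds):
    """Peak current of a Gaussian bunch:
    I_peak = Q / (sqrt(2*pi) * sigma_t).
    Shows why, with charge routinely measured, the bunch length
    becomes the key quantity for FEL gain estimates."""
    return charge_coulombs / (math.sqrt(2 * math.pi) * sigma_t_seconds)
```

    For example, a 1 nC bunch compressed to sigma_t = 1 ps carries a peak current of roughly 400 A, while the same charge at 10 ps yields only about 40 A; these example numbers are illustrative, not taken from the overview.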

  4. Summary of neutron scattering lengths

    International Nuclear Information System (INIS)

    All available neutron-nuclei scattering lengths are collected together with their error bars in a uniform way. Bound scattering lengths are given for the elements, the isotopes, and the various spin-states. They are discussed in the sense of their use as basic parameters for many investigations in the field of nuclear and solid state physics. The data bank is available on magnetic tape, too. Recommended values and a map of these data serve for an uncomplicated use of these quantities. (orig.)
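
    The spin-state dependence mentioned above enters the coherent bound scattering length through the standard spin-weighted average of the two spin-state values. The hydrogen figures in the usage note are approximate textbook values, not entries quoted from this data bank:

```python
def coherent_scattering_length(spin, b_plus, b_minus):
    """Spin-weighted coherent bound scattering length for a nucleus
    of spin I:  b_coh = ((I + 1) * b+ + I * b-) / (2I + 1),
    where b+ and b- are the scattering lengths of the I + 1/2 and
    I - 1/2 spin states of the neutron-nucleus system."""
    return ((spin + 1) * b_plus + spin * b_minus) / (2 * spin + 1)
```

    For hydrogen (I = 1/2), triplet b+ ≈ 10.85 fm and singlet b- ≈ -47.5 fm give b_coh ≈ -3.74 fm, the familiar negative coherent scattering length of 1H that makes H/D contrast variation possible.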

  5. Continuous lengths of oxide superconductors

    Science.gov (United States)

    Kroeger, Donald M.; List, III, Frederick A.

    2000-01-01

    A layered oxide superconductor prepared by depositing a superconductor precursor powder on a continuous length of a first substrate ribbon. A continuous length of a second substrate ribbon is overlaid on the first substrate ribbon. Sufficient pressure is applied to form a bound layered superconductor precursor powder between the first substrate ribbon and the second substrate ribbon. The layered superconductor precursor is then heat treated to establish the oxide superconducting phase. The layered oxide superconductor has a smooth interface between the substrate and the oxide superconductor.

  6. An analysis of clinical activity, admission rates, length of hospital stay, and economic impact after a temporary loss of 50% of the non-operative podiatrists from a tertiary specialist foot clinic in the United Kingdom

    Directory of Open Access Journals (Sweden)

    Catherine Gooday

    2013-09-01

    Introduction: Podiatrists form an integral part of the multidisciplinary foot team in the treatment of diabetic foot-related complications. A set of unforeseen circumstances within our specialist diabetes foot service in the United Kingdom caused a loss of 50% of our non-operative podiatry team for almost 7 months during 2010. Some of this time was covered by non-specialist community non-operative podiatrists. Methods: We assessed the economic impact of this loss by examining data for the 5 years prior to this 7-month interruption, and for the 2 years after ‘normal service’ was resumed. Results: Our data show that the loss of the non-operative podiatrists led to a significant rise in the number of admissions into hospital, and hospital length of stay also increased. At our institution a single bed day costs £275. During the time that the number of specialist non-operative podiatry staff was depleted, and for up to 6 months after they returned to normal activities, the extra costs came to just under £90,000. The number of people admitted directly from specialist vascular and orthopaedic clinics is likely to have increased due to the lack of capacity to manage them in the diabetic foot clinic. Our data were unable to assess these individuals and did not look at the costs saved from avoiding surgery; thus the actual costs incurred are likely to be higher. Conclusions: Our data suggest that specialist non-operative podiatrists involved in the treatment of the diabetic foot may prevent unwarranted hospital admissions and increased hospitalisation rates by providing skilled assessment and care in outpatient clinical settings.
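
    As a rough consistency check of the figures reported in the abstract, dividing the extra cost by the per-bed-day cost gives the implied number of additional bed days over the affected period:

```python
# Back-of-the-envelope check using the abstract's own figures.
BED_DAY_COST_GBP = 275      # cost of a single bed day at the institution
EXTRA_COST_GBP = 90_000     # "just less than £90,000" of extra cost

# Implied excess occupancy: roughly 327 additional bed days over the
# depleted-staffing period plus the 6-month recovery tail.
extra_bed_days = EXTRA_COST_GBP / BED_DAY_COST_GBP
```

    Spread over roughly a year, this corresponds to about one hospital bed permanently occupied by the service disruption, which helps convey the scale of the effect.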

  7. Properties of Single-Wall Carbon Nanotubes with Finite Lengths

    Institute of Scientific and Technical Information of China (English)

    HU Di-Li; PAN Bi-Cai

    2001-01-01

    Carbon nanotubes with finite lengths should be natural components of future "nano devices". Based on orthogonal tight-binding molecular dynamics simulations, we report on our study of formation energies, optimal geometrical structures and active sites of carbon nanotubes with finite lengths. This should be useful to understand the properties of such natural components.

  8. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm...
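
    The cooperation scheme, many randomized searchers sharing the best cost found so far and pruning any exploration that already exceeds it, can be sketched on a toy model. The graph encoding, thread count and function below are illustrative only and bear no relation to Uppaal's internals:

```python
import random
import threading

def swarm_min_cost(graph, start, goal, workers=4, tries=500):
    """Toy cooperative swarm search for a cheap path to a goal state.

    graph: {state: [(next_state, step_cost), ...]} -- a hypothetical
    transition system. Each worker performs randomized walks; all
    workers share the best cost found so far and abandon any walk
    that already exceeds it (branch-and-bound-style pruning).
    """
    best = {"cost": float("inf"), "path": None}
    lock = threading.Lock()

    def worker(seed):
        rng = random.Random(seed)
        for _ in range(tries):
            state, cost, path = start, 0, [start]
            while state != goal and graph.get(state):
                state, step = rng.choice(graph[state])
                cost += step
                path.append(state)
                if cost >= best["cost"]:   # prune hopeless walks
                    break
            if state == goal and cost < best["cost"]:
                with lock:                 # re-check under the lock
                    if cost < best["cost"]:
                        best["cost"], best["path"] = cost, path

    threads = [threading.Thread(target=worker, args=(s,)) for s in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return best["cost"], best["path"]
```

    The shared bound is what makes the swarm cooperative rather than merely parallel: as soon as one worker finds a cheap witness, every other worker's search space shrinks.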

  9. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  10. A Novel Verification Approach of Workflow Schema

    Institute of Scientific and Technical Information of China (English)

    WANG Guangqi; WANG Juying; WANG Yan; SONG Baoyan; YU Ge

    2006-01-01

    A workflow schema is an abstract description of the business processed by a workflow model, and plays a critical role in analyzing, executing and reorganizing business processes. Verifying the correctness of complicated workflow schemas is a difficult issue in the workflow field, and we make an intensive study of it in this paper. We describe local errors and schema logic errors (global errors) in workflow schemas in detail, and offer some constraint rules that try to avoid schema errors during modeling. In addition, we propose a verification approach based on graph reduction and graph spread, and give the algorithm. The algorithm is implemented in a workflow prototype system, e-ScopeWork.
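
    Verification by graph reduction can be illustrated with a single rule, collapsing sequence nodes, applied until nothing changes: a schema that reduces to the lone start-to-end edge contains no unmatched split/join under this rule. The paper's actual rule set also covers splits, joins and graph spread, so this is only a sketch with hypothetical node names:

```python
def is_reducible(edge_list, start, end):
    """Sketch of workflow verification by graph reduction: repeatedly
    collapse any node with exactly one incoming and one outgoing edge
    (a pure sequence node). Returns True if the workflow graph
    collapses to the single edge (start, end)."""
    edges = set(edge_list)
    changed = True
    while changed:
        changed = False
        nodes = {n for e in edges for n in e} - {start, end}
        for n in nodes:
            preds = [u for (u, v) in edges if v == n]
            succs = [v for (u, v) in edges if u == n]
            if len(preds) == 1 and len(succs) == 1:
                edges -= {(preds[0], n), (n, succs[0])}
                edges.add((preds[0], succs[0]))  # splice pred -> succ
                changed = True
                break
    return edges == {(start, end)}
```

    A dangling branch, e.g. a split whose second branch never rejoins, survives every reduction step and is reported as a structural (global) error, which is the kind of schema logic error the approach is meant to surface.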

  11. Towards Automated Verification of Web Services

    CERN Document Server

    Vaz, C

    2011-01-01

    This paper proposes the use of model-checking technology for the verification of the behaviour of workflows and business processes based on web services, namely the use of the SPIN model checker. Since the specification of a business process behaviour based on web services can be decomposed into patterns, a translation of a well-known collection of workflow patterns into PROMELA, the input specification language of SPIN, is proposed. The use of this translation is illustrated with one business process example, which demonstrates how its translation to a PROMELA model can be useful in web service specification and verification.

  12. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification...... hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems-without resorting to point...

  13. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  14. 340 and 310 drawing field verification

    Energy Technology Data Exchange (ETDEWEB)

    Langdon, J.

    1996-09-27

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format.

  15. A New Signature Scheme with Shared Verification

    Institute of Scientific and Technical Information of China (English)

    JIA Xiao-yun; LUO Shou-shan; YUAN Chao-wei

    2006-01-01

    With expanding user demands, digital signature techniques are also being extended greatly, from single-signature and single-verification techniques to techniques supporting multiple users. This paper presents a new digital signature scheme with shared verification based on the Fiat-Shamir signature scheme. This scheme is suitable not only for digital signatures of one public key, but also for situations where multiple public keys are required. In addition, the scheme can resist all kinds of collusion, making it more practicable and safer. Additionally, it is more efficient than other schemes.

  16. Design and ground verification of proximity operations

    Science.gov (United States)

    Tobias, A.; Ankersen, F.; Fehse, W.; Pauvert, C.; Pairot, J.

    This paper describes the approach to guidance, navigation, and control (GNC) design and verification for proximity operations. The most critical part of the rendezvous mission is the proximity operations phase when the distance between chaser and target is below approximately 20 m. Safety is the overriding consideration in the design of the GNC system. Requirements on the GNC system also stem from the allocation of performance between proximity operations and the mating process, docking, or capture for berthing. Whereas the design process follows a top down approach, the verification process goes bottom up in a stepwise way according to the development stage.

  17. 47 CFR 2.952 - Limitation on verification.

    Science.gov (United States)

    2010-10-01

    ... person shall, in any advertising matter, brochure, etc., use or make reference to a verification in a deceptive or misleading manner or convey the impression that such verification reflects more than...

  18. VERIFICATION OF A TOXIC ORGANIC SUBSTANCE TRANSPORT AND BIOACCUMULATION MODEL

    Science.gov (United States)

    A field verification of the Toxic Organic Substance Transport and Bioaccumulation Model (TOXIC) was conducted using the insecticide dieldrin and the herbicides alachlor and atrazine as the test compounds. The test sites were two Iowa reservoirs. The verification procedure include...

  19. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  20. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  1. OS VERIFICATION- A SURVEY AS A SOURCE OF FUTURE CHALLENGES

    Directory of Open Access Journals (Sweden)

    Kushal Anjaria

    2015-08-01

    Formal verification of an operating system kernel demonstrates the absence of errors in the kernel and establishes trust in it. This paper evaluates various projects on operating system kernel verification and presents an in-depth survey of them. The methodologies and contributions of the operating system verification projects are discussed in the present work. At the end, a few unattended and interesting future challenges in the operating system verification area are discussed, and possible directions towards solutions are briefly described.

  2. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  3. Cyclic Codes of Length $2^m$

    Indian Academy of Sciences (India)

    Manju Pruthi

    2001-11-01

    In this paper explicit expressions of $m + 1$ idempotents in the ring $R = F_q[X]/\langle X^{2^m}-1\rangle$ are given. Cyclic codes of length $2^m$ over the finite field $F_q$, of odd characteristic, are defined in terms of their generator polynomials. The exact minimum distance and the dimension of the codes are obtained.

  4. Persistence length of dendronized polymers

    NARCIS (Netherlands)

    Mikhailov, I.V.; Darinskii, A.A.; Zhulina, E.B.; Borisov, O.V.; Leermakers, F.A.M.

    2015-01-01

    We present numerical results for the thermodynamic rigidity and induced persistence length of dendronized polymers with systematically varied topology of their grafts, obtained by the Scheutjens-Fleer self-consistent field method. The results were compared to predictions of an analytical mean-field...

  5. Cavity length below chute aerators

    Institute of Scientific and Technical Information of China (English)

    WU JianHua; RUAN ShiPing

    2008-01-01

    It has been proved that air entrainment is one of the most efficient measures for cavitation control in the release works of hydropower projects. There are many factors to be considered in designing a chute aerator. One of the most important concerns the cavity length below the aerator, which has an outstanding effect on air entrainment against cavitation damage. It is crucial to determine a reasonable emergence angle for the calculation of the cavity length. In the present paper the overall effects of structural and hydraulic parameters on the emergence angle of the flow from the aerator were analyzed. Four improved expressions of the emergence angle with weight coefficients were investigated through experimental data of 68 points observed from 12 aerators of 6 hydropower projects, of both model and prototype, on the basis of error theory. A method to calculate the cavity length below aerators was suggested, which considers the overall effects of the above-mentioned parameters. Comparison between the method in this paper and five other methods of calculating the cavity length showed that the present method is much more reliable than the existing methods, while its mean error is smaller than theirs.
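
    Since the cavity length is essentially the flight distance of the jet leaving the ramp at the emergence angle, a plain projectile-motion estimate in chute-aligned coordinates shows the role that angle plays. This is a textbook ballistic sketch under an assumed flat chute surface, not the weighted-coefficient method the paper proposes:

```python
import math

def cavity_length_m(v, theta_deg, slope_deg, g=9.81):
    """Rough ballistic estimate of the cavity length below an aerator:
    distance along a chute of given slope (degrees from horizontal)
    until a jet leaving the ramp with speed v (m/s) at emergence
    angle theta (degrees above the chute) falls back to the surface.

    Uses chute-aligned axes: gravity splits into an along-chute
    component g*sin(slope) and a normal component g*cos(slope)."""
    theta = math.radians(theta_deg)
    slope = math.radians(slope_deg)
    gx, gy = g * math.sin(slope), g * math.cos(slope)
    vx, vy = v * math.cos(theta), v * math.sin(theta)
    t = 2 * vy / gy                       # time until jet returns to chute
    return vx * t + 0.5 * gx * t * t      # distance travelled along chute
```

    Even this crude model makes the paper's point: the result is very sensitive to the emergence angle, so small errors in that angle translate into large errors in the predicted cavity length.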

  7. Verification of Monte Carlo transport codes: FLUKA, MARS and SHIELD-A

    Energy Technology Data Exchange (ETDEWEB)

    Chetvertkova, Vera [IAP, J. W. Goethe-University, Frankfurt am Main (Germany); GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Mustafin, Edil; Strasik, Ivan [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Ratzinger, Ulrich [IAP, J. W. Goethe-University, Frankfurt am Main (Germany); Latysheva, Ludmila; Sobolevskiy, Nikolai [Institute for Nuclear Research RAS, Moscow (Russian Federation)

    2011-07-01

    Monte Carlo transport codes like FLUKA, MARS and SHIELD are widely used for the estimation of radiation hazards in accelerator facilities. Accurate simulations are especially important with increasing energies and intensities of the machines. As the physical models implemented in the codes are constantly being developed further, verification is needed to make sure that the simulations give reasonable results. We report on the verification of the electronic stopping modules and of the nuclide production modules of the codes. The verification of the electronic stopping modules is based on the results of irradiation of stainless steel, copper and aluminum by 500 MeV/u and 950 MeV/u uranium ions: the experimentally measured stopping ranges are compared with the simulated ones. The verification of the isotope production modules is done by comparing the experimental depth profiles of residual activity (aluminum targets were irradiated by 500 MeV/u and 950 MeV/u uranium ions) with the results of simulations. Correspondences and discrepancies between the experiment and the simulations are discussed.

  9. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  10. Main Timber Legality Verification Schemes in the World

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Based on introduction of the timber legality verification schemes,the article provides a detailed review of existing legality verification schemes covering aspects such as definition of legality,verification process.It aims to help Chinese companies understand the different requirements and evidence of compliance required by legislation,public and private procurement policies.

  11. The Construction of Verification Models for Embedded Systems

    NARCIS (Netherlands)

    Mader, A.H.; Wupper, H.; Boon, M.

    2007-01-01

    The usefulness of verification hinges on the quality of the verification model. Verification is useful if it increases our confidence that an artefact behaves as expected. As modelling inherently contains non-formal elements, the quality of models cannot be captured by purely formal means. Still, we …

  12. A Domain-specific Framework for Automated Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2009-01-01

    … elaborate safety mechanisms in order to keep the risk at the same low level that has been established for European railways until today. The challenge is further increased by the demand for shorter time-to-market periods and higher competition among suppliers in the railway domain, both factors resulting in … a demand for a higher degree of automation for the development, verification, validation and test phases of projects, without impairing the thoroughness of safety-related quality measures and certification activities. Motivated by these considerations, this presentation describes an approach for … automated construction and verification of railway control systems. …

  13. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: The SoftCare tool for the SFMEA and SFTA

  14. QA's role in the independent verification of plant readiness for startup from planned outages

    International Nuclear Information System (INIS)

    Quality Assurance (QA) personnel at the N Reactor, located near Richland, Washington, USA, perform many activities to independently verify the readiness for startup of the reactor from planned outages. The verifications are performed as inspections, test witnessing, audits, surveillance, and followup on identified corrective action needs. The results of these verifications are used in a formal readiness review process. The formal reviews are administered by a Review Board of representatives from several major components of the Company and are conducted using systematically structured analytical techniques. The N Reactor QA staff includes 26 persons (excluding managers) who are involved in independent verifications of reactor-related work as part or all of their assigned functions

  15. IAEA inspectors complete verification of nuclear material in Iraq

    International Nuclear Information System (INIS)

    Full text: A team of IAEA inspectors has returned from Iraq to Vienna after completing the annual Physical Inventory Verification of declared nuclear material. The material - natural or low-enriched uranium - is consolidated at a storage facility near the Tuwaitha complex, south of Baghdad. The inspectors found no diversion of nuclear material. The two-day inspection was conducted with the logistical and security assistance of the Multinational Force, the Office of the UN Security Coordinator, and the UN Assistance Mission for Iraq. Every non-nuclear-weapon State party to the NPT that has declared holdings of nuclear material is required to undergo such inspections. The inspectors verify the correctness of the State's declaration, and that material has not been diverted to any undeclared activity. Such inspections have been performed in Iraq on a continuing basis. NPT safeguards inspections are limited in scope and coverage as compared to the verification activities carried out in 1991-98 and 2002-03 by the IAEA under Security Council resolution 687 and related resolutions. (IAEA)

  16. Cleanup Verification Package for the 118-F-5 PNL Sawdust Pit

    Energy Technology Data Exchange (ETDEWEB)

    L. D. Habel

    2008-05-20

    This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-5 Burial Ground, the PNL (Pacific Northwest Laboratory) Sawdust Pit. The 118-F-5 Burial Ground was an unlined trench that received radioactive sawdust from the floors of animal pens in the 100-F Experimental Animal Farm.

  17. Verification of Autonomous Systems for Space Applications

    Science.gov (United States)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  18. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered as the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, for stamps little work has been done to automatically verify their authenticity. In this paper, an approach for verification of color stamps is...

  19. RELAP-7 SOFTWARE VERIFICATION AND VALIDATION PLAN

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L [Idaho National Laboratory; Choi, Yong-Joon [Idaho National Laboratory; Zou, Ling [Idaho National Laboratory

    2014-09-01

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  20. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... reviewing a coal electric utility should be knowledgeable about mass balance calculations, fuel purchasing... calculations underpinning this verification. The verifying entity shall maintain such records related to...

  1. Evaluating software verification systems: benchmarks and competitions

    NARCIS (Netherlands)

    Beyer, Dirk; Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary

    2014-01-01

    This report documents the program and the outcomes of Dagstuhl Seminar 14171 “Evaluating Software Verification Systems: Benchmarks and Competitions”. The seminar brought together a large group of current and future competition organizers and participants, benchmark maintainers, as well as practitioners …

  2. The COST IC0701 verification competition 2011

    NARCIS (Netherlands)

    Bormer, T.; Brockschmidt, M.; Distefano, D.; Ernst, G.; Filliatre, J.-C.; Grigore, R.; Huisman, M.; Klebanov, V.; Marche, C.; Monahan, R.; Mostowski, W.I.; Polikarpova, N.; Scheben, C.; Schellhorn, G.; Tofan, B.; Tschannen, J.; Ulbrich, M.; Beckert, B.; Damiani, F.; Gurov, D.

    2012-01-01

    This paper reports on the experiences with the program verification competition held during the FoVeOOS conference in October 2011. There were 6 teams participating in this competition. We discuss the three different challenges that were posed and the solutions developed by the teams. We conclude with …

  3. 20 CFR 325.6 - Verification procedures.

    Science.gov (United States)

    2010-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  4. Verification Techniques for Graph Rewriting (Tutorial)

    NARCIS (Netherlands)

    Rensink, Arend; Abdulla, Parosh Aziz; Gadducci, Fabio; König, Barbara; Vafeiadis, Viktor

    2016-01-01

    This tutorial paints a high-level picture of the concepts involved in verification of graph transformation systems. We distinguish three fundamentally different application scenarios for graph rewriting: (1) as grammars (in which case we are interested in the language, or set, of terminal graphs for

  5. An eclectic quadrant of rule based system verification: work grounded in verification of fuzzy rule bases

    OpenAIRE

    Viaene, Stijn; Wets, G.; Vanthienen, Jan; Dedene, Guido

    1999-01-01

    In this paper, we used a research approach based on grounded theory in order to classify methods proposed in the literature that try to extend the verification of classical rule bases to the case of fuzzy knowledge modeling. Within this area of verification we identify two dual lines of thought, leading to what are termed static and dynamic anomaly detection methods, respectively. The major outcome of the confrontation of both approaches is that their results, most often stated in terms...

  6. Environmental radiation measurement in CTBT verification system

    International Nuclear Information System (INIS)

    This paper introduces the technical requirements of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) radionuclide stations, the CTBT-related activities carried out by the Japan Atomic Energy Research Institute (JAERI), and the ripple effects of the acquired radionuclide data on general research. The International Monitoring System (IMS), part of the CTBT verification regime, consists of 80 radionuclide air monitoring stations (of which 40 also monitor noble gases) and 16 certified laboratories that support these stations throughout the world. For radionuclide air monitoring under the CTBT, the stations collect particulates from the atmosphere on a filter and determine by gamma-ray spectrometry the presence or absence of any radionuclides (e.g. 140Ba, 131I, 99Mo, 132Te, 103Ru, 141Ce, 147Nd, 95Zr, etc.) that offer clear evidence of a possible nuclear explosion. Minimum technical requirements are stringently set for the radionuclide air monitoring stations: 500 m3/h air flow rate, 24-hour acquisition time, 10 to 30 µBq/m3 detection sensitivity for 140Ba, and no more than 7 consecutive days, or 15 days in total per year, of shutdown at the stations. For noble gas monitoring, on the other hand, the stations separate Xe from other gases in the atmosphere and, after purifying and concentrating it, measure 4 nuclides, 131mXe, 133Xe, 133mXe, and 135Xe, by gamma-ray spectrometry or the beta-gamma coincidence method. Minimum technical requirements are also set for the noble gas measurement: 0.4 m3/h air flow rate, a full capacity of 10 m3, and 1 mBq/m3 detection sensitivity for 133Xe, etc. At the request of the Ministry of Education, Culture, Sports and Technology, JAERI is currently establishing the CTBT radionuclide monitoring stations at Takasaki (both particle and noble gas) and Okinawa (particle), the certified laboratory at JAERI Tokai, and the National Data Center (NDC 2) at JAERI Tokai, which handles radionuclide data, as
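As an illustrative sketch (not part of the JAERI work), a station's reported parameters can be checked against the particulate-station minima; the field names, the `report` dict, and the Ba-140 sensitivity threshold of 30 µBq/m³ used below are assumptions of this sketch:

```python
# Minimal sketch: checking a radionuclide station's reported parameters
# against IMS-style minimum technical requirements.  Field names and
# the `report` dict are illustrative assumptions.
PARTICULATE_MINIMA = {
    "flow_m3_per_h": 500.0,   # minimum air flow rate
    "acquisition_h": 24.0,    # minimum sample acquisition time
}
BA140_MDC_MAX = 30e-6         # Bq/m^3: assumed Ba-140 detection sensitivity limit

def meets_requirements(report):
    """True iff the station meets the flow/acquisition minima and the
    Ba-140 minimum detectable concentration is low enough."""
    ok = all(report[key] >= minimum for key, minimum in PARTICULATE_MINIMA.items())
    return ok and report["ba140_mdc"] <= BA140_MDC_MAX

print(meets_requirements(
    {"flow_m3_per_h": 520.0, "acquisition_h": 24.0, "ba140_mdc": 25e-6}))  # -> True
```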

  7. Formalization and Verification of Business Process Modeling Based on UML and Petri Nets

    Institute of Scientific and Technical Information of China (English)

    YAN Zhi-jun; GAN Ren-chu

    2005-01-01

    In order to provide a quantitative analysis and verification method for activity-diagram-based business process modeling, a formal definition of activity diagrams is introduced, and the basic requirements for activity-diagram-based business process models are proposed. Furthermore, the standardized transformation technique between business process models and basic Petri nets is presented, and the analysis method for the soundness and well-structuredness properties of business processes is introduced.
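A minimal sketch of the underlying idea, assuming a toy three-activity sequential workflow net rather than the paper's UML-derived models: translate the process into a Petri net and check soundness by exhaustive reachability (markings treated as sets, i.e. a 1-safe net):

```python
# Toy soundness check of a workflow Petri net by exhaustive reachability.
# The net below (start -> t1 -> t2 -> t3 -> end) is an illustrative
# stand-in for a net derived from an activity diagram.
from collections import deque

# transitions: name -> (input places, output places)
NET = {
    "t1": (frozenset({"start"}), frozenset({"p1"})),
    "t2": (frozenset({"p1"}), frozenset({"p2"})),
    "t3": (frozenset({"p2"}), frozenset({"end"})),
}
INITIAL, FINAL = frozenset({"start"}), frozenset({"end"})

def successors(marking):
    for ins, outs in NET.values():
        if ins <= marking:                        # transition is enabled
            yield frozenset((marking - ins) | outs)

def reachable(start):
    """All markings reachable from `start` (breadth-first search)."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in successors(queue.popleft()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def sound():
    """Soundness (for this 1-safe sketch): the final marking is reachable
    from every marking that is reachable from the initial one."""
    return all(FINAL in reachable(m) for m in reachable(INITIAL))

print(sound())  # -> True
```

Replacing a transition's output place so the token gets stuck would make `sound()` return False, which is the kind of defect the analysis is meant to catch.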

  8. Variable focal length deformable mirror

    Science.gov (United States)

    Headley, Daniel; Ramsey, Marc; Schwarz, Jens

    2007-06-12

    A variable focal length deformable mirror has an inner ring and an outer ring that simply support and push axially on opposite sides of a mirror plate. The resulting variable clamping force deforms the mirror plate to provide a parabolic mirror shape. The rings are parallel planar sections of a single paraboloid and can provide an on-axis focus, if the rings are circular, or an off-axis focus, if the rings are elliptical. The focal length of the deformable mirror can be varied by changing the variable clamping force. The deformable mirror can generally be used in any application requiring the focusing or defocusing of light, including with both coherent and incoherent light sources.
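Since the deformed plate takes a parabolic shape z = r²/(4f), the focal length follows directly from the sag produced by the clamping force; a minimal sketch with purely illustrative numbers (not the device's actual geometry):

```python
# For a parabolic mirror z = r^2 / (4 f), the focal length follows from
# the sag z measured at radius r: f = r^2 / (4 z).  The numbers are
# illustrative; in the device the sag is set by the variable clamping force.
def focal_length(r, sag):
    """Focal length (m) of a paraboloid with sag `sag` (m) at radius r (m)."""
    return r * r / (4.0 * sag)

# e.g. a plate of 50 mm radius deformed by 10 micrometres of sag
print(round(focal_length(0.050, 10e-6), 3))  # -> 62.5
```

Halving the sag (i.e. relaxing the clamping force) doubles the focal length, which is the tuning mechanism the abstract describes.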

  9. INTERPOLATION WITH RESTRICTED ARC LENGTH

    Institute of Scientific and Technical Information of China (English)

    Petar Petrov

    2003-01-01

    For given data $(t_i, y_i)$, $i = 0,1,\dots,n$, $0 = t_0 < t_1 < \dots < t_n = 1$, we study a constrained interpolation problem of Favard type: $\inf\{\|f''\|_\infty \mid f \in W^2_\infty[0,1],\ f(t_i) = y_i,\ i = 0,\dots,n,\ l(f;[0,1]) \le l_0\}$, where $l(f;[0,1]) = \int_0^1 \sqrt{1 + f'^2(x)}\,dx$ is the arc length of $f$ in $[0,1]$. We prove the existence of a solution $f^*$ of the above problem, which is a quadratic spline whose second derivative $f''^*$ coincides with one of the constants $-\|f''^*\|_\infty$, $0$, $\|f''^*\|_\infty$ between every two consecutive knots. Thus, we extend a result of Karlin concerning the Favard problem to the case of restricted-length interpolation.
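The arc-length constraint $l(f;[0,1]) = \int_0^1 \sqrt{1+f'^2(x)}\,dx$ can be evaluated numerically; a minimal midpoint-rule sketch (the step count and test function are arbitrary choices, not from the paper):

```python
# Numerical evaluation of the arc length l(f;[a,b]) = ∫ sqrt(1 + f'(x)^2) dx,
# the quantity constrained in the Favard-type problem above.
# Plain midpoint rule; step count is an illustrative choice.
import math

def arc_length(df, a=0.0, b=1.0, n=10000):
    """Arc length of f on [a, b], given its derivative df."""
    h = (b - a) / n
    return sum(math.sqrt(1.0 + df(a + (i + 0.5) * h) ** 2) for i in range(n)) * h

# sanity check: the line f(x) = x has f'(x) = 1 and arc length sqrt(2) on [0,1]
print(abs(arc_length(lambda x: 1.0) - math.sqrt(2)) < 1e-9)  # -> True
```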

  10. Minimal Length, Measurability and Gravity

    Directory of Open Access Journals (Sweden)

    Alexander Shalyt-Margolin

    2016-03-01

    The present work is a continuation of the previous papers written by the author on the subject. In terms of the measurability (or measurable quantities) notion introduced in a minimal length theory, consideration is first given to a quantum theory in the momentum representation. The same terms are used to consider the Markov gravity model, which here illustrates the general approach to studies of gravity in terms of measurable quantities.

  11. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  12. SHINE Vacuum Pump Test Verification

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, Gregg A; Peters, Brent

    2013-09-30

    … scroll pump will be used to back the booster pump. In this case the "booster pump" is an Adixen Molecular Drag Pump (MDP 5011) and the backing pump is an Edwards (nXDS15iC) scroll pump. Various configurations of the two pumps and associated lengths of 3/4 inch tubing (0 feet to 300 feet) were used in combination with hydrogen and nitrogen flow rates ranging from 25-400 standard cubic centimeters per minute (sccm) to determine whether the proposed pump configuration meets the design criteria for SHINE. The results of this study indicate that even under the most severe conditions (300 feet of tubing and a 400 sccm flow rate) the Adixen MDP 5011 can serve as a booster pump to transport gases from the accelerator (NDAS) to the TPS. The Target Gas Receiving System pump (Edwards nXDS15iC), located approximately 300 feet from the accelerator, can effectively back the Adixen MDP. The molecular drag pump was able to maintain its full rotational speed even when the flow rate was 400 sccm hydrogen or nitrogen and 300 feet of tubing was installed between the drag pump and the Edwards scroll pump. In addition to maintaining adequate rotation, the pressure in the system was maintained below the target pressure of 30 torr for all flow rates, lengths of tubing, and process gases. This configuration is therefore adequate to meet the SHINE design requirements in terms of flow and pressure.

  13. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    International Nuclear Information System (INIS)

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques used include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). The PCA technique is a well-established technique and has a wide area of application including feature analysis, outlier detection, and gamma-ray spectral analysis. Results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. The results from the present study also showed that difficulties associated with the UF6 filling profile and observed in other unattended passive neutron measurements can possibly be overcome using the approach presented
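A minimal sketch of the PCA step on synthetic stand-in spectra (the toy spectral model, enrichment values, and noise level are assumptions of this sketch, not the MCNP5/Geant4 output used in the study):

```python
# Sketch of the PCA feature-extraction step: project neutron spectra onto
# their leading principal components so enrichment-dependent spectral
# features can be separated.  Synthetic spectra stand in for the
# MCNP5/Geant4 simulations; all model parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
channels = np.linspace(0.0, 10.0, 64)                     # energy bins (MeV)

def spectrum(enrichment):
    """Toy spectral shape whose slope depends on enrichment, plus noise."""
    shape = np.exp(-channels / (1.0 + 0.3 * enrichment))
    return shape + rng.normal(0.0, 0.01, channels.size)   # counting noise

# 5 noisy replicates at each of 4 assumed enrichments (wt% U-235)
X = np.array([spectrum(e) for e in (0.7, 2.0, 3.5, 5.0) for _ in range(5)])
Xc = X - X.mean(axis=0)                                   # mean-centre
# principal components from the SVD of the centred data matrix
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                                    # 2-D feature space
explained = s[0] ** 2 / (s ** 2).sum()                    # PC1 variance share
print(scores.shape, round(float(explained), 2))
```

Plotting `scores` would show the replicates clustering by enrichment, which is the feature-analysis behaviour the abstract exploits.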

  14. Verification and Validation Issues in Systems of Systems

    Directory of Open Access Journals (Sweden)

    Eric Honour

    2013-11-01

    The cutting edge in systems development today is in the area of "systems of systems" (SoS): large networks of inter-related systems that are developed and managed separately, but that also perform collective activities. Such large systems typically involve constituent systems operating with different life cycles, often with uncoordinated evolution. The result is an ever-changing SoS in which adaptation and evolution replace the older engineering paradigm of "development". This short paper presents key thoughts about verification and validation in this environment. Classic verification and validation methods rely on having (a) a basis of proof, in requirements and in operational scenarios, and (b) a known system configuration to be proven. However, with constant SoS evolution, management of both requirements and system configurations is problematic. Often, it is impossible to maintain a valid set of requirements for the SoS due to the ongoing changes in the constituent systems. Frequently, it is even difficult to maintain a vision of the SoS operational use as users find new ways to adapt the SoS. These features of the SoS result in significant challenges for system proof. In addition to discussing the issues, the paper also indicates some of the solutions that are currently used to prove the SoS.

  15. Testing of Cotton Fiber Length

    Institute of Scientific and Technical Information of China (English)

    刘若华; 李汝勤

    2001-01-01

    To understand the influences of actual sampling conditions on cotton fiber length testing, this article presents a theoretical study of the length distributions and fibrogram of a sample taken by a sampler from an ideal sliver at a certain angle. From the distribution expressions it can be seen that the size of the sampler and the sampling angle are important factors affecting sampling, but if the sampling width is narrow enough, the influence of the sampling angle on the distributions and fibrogram is small enough to be omitted. This is an important conclusion for sampling, and in light of it some suggestions for designing a new type of sampler are put forward.

  16. Role of TERRA in the regulation of telomere length.

    Science.gov (United States)

    Wang, Caiqin; Zhao, Li; Lu, Shiming

    2015-01-01

    Telomere dysfunction is closely associated with human diseases such as cancer and ageing. Inappropriate changes in telomere length and/or structure result in telomere dysfunction. Telomeres have been considered to be transcriptionally silent, but it was recently demonstrated that mammalian telomeres are transcribed into telomeric repeat-containing RNA (TERRA). TERRA, a long non-coding RNA, participates in the regulation of telomere length, telomerase activity and heterochromatinization. The correct regulation of telomere length may be crucial to telomeric homeostasis and functions. Here, we summarize recent advances in our understanding of the crucial role of TERRA in the maintenance of telomere length, with focus on the variety of mechanisms by which TERRA is involved in the regulation of telomere length. This review aims to enable further understanding of how TERRA-targeted drugs can target telomere-related diseases.

  17. Verificator: Educational Tool for Learning Programming

    Directory of Open Access Journals (Sweden)

    Danijel RADOSEVIC

    2009-09-01

    The paper introduces Verificator, our learning programming interface aimed at learning programming in C++ at the university beginners' level. In teaching programming, some specific problems concerning the teaching itself as well as the organization of the teaching process need to be considered. One of the biggest problems is that students tend to adopt certain bad programming habits in their attempt to deal more easily with their examinations, such as trying to write programs without any syntax and logical checking. It is very hard to help them correct those errors once they are deeply rooted. Our students' web questionnaire and its results show that the majority of problems in learning programming among our students arise from the gap between the understanding of programming language syntax and problem-solving algorithms. Verificator prevents students from making a lot of errors they are likely to make in learning programming and helps them to learn programming language syntax and adopt good programming habits.

  18. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  19. Shield verification and validation action matrix summary

    Energy Technology Data Exchange (ETDEWEB)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is completed.

  20. Learning to Order BDD Variables in Verification

    CERN Document Server

    Grumberg, O; Markovitch, S; 10.1613/jair.1096

    2011-01-01

    The size and complexity of software and hardware systems have significantly increased in the past years. As a result, it is harder to guarantee their correct behavior. One of the most successful methods for automated verification of finite-state systems is model checking. Most of the current model-checking systems use binary decision diagrams (BDDs) for the representation of the tested model and in the verification process of its properties. Generally, BDDs allow a canonical compact representation of a boolean function (given an order of its variables). The more compact the BDD is, the better performance one gets from the verifier. However, finding an optimal order for a BDD is an NP-complete problem. Therefore, several heuristic methods based on expert knowledge have been developed for variable ordering. We propose an alternative approach in which the variable ordering algorithm gains 'ordering experience' from training models and uses the learned knowledge for finding good orders. Our methodology is based o...
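
The sensitivity of BDD size to variable order is easy to demonstrate: counting the distinct Shannon cofactors at each level approximates the node count of a reduced ordered BDD. The sketch below illustrates the underlying phenomenon (not the paper's learning method) for f = x1·y1 ∨ x2·y2 ∨ x3·y3, comparing an interleaved order with a grouped one:

```python
from itertools import product

def cofactor_profile(f, order):
    """Number of distinct Shannon cofactors at each level for a given
    variable order; the sum approximates the size of a quasi-reduced
    ordered BDD for f."""
    n = len(order)
    # full truth table of f, with order[0] as the most significant variable
    tables = {tuple(f(dict(zip(order, bits)))
                    for bits in product([0, 1], repeat=n))}
    profile = []
    for _ in range(n):
        profile.append(len(tables))
        nxt = set()
        for t in tables:
            half = len(t) // 2
            nxt.update((t[:half], t[half:]))  # cofactors on the next variable
        tables = nxt
    return profile

f = lambda v: (v['x1'] and v['y1']) or (v['x2'] and v['y2']) \
              or (v['x3'] and v['y3'])
good = sum(cofactor_profile(f, ['x1', 'y1', 'x2', 'y2', 'x3', 'y3']))
bad = sum(cofactor_profile(f, ['x1', 'x2', 'x3', 'y1', 'y2', 'y3']))
```

With the interleaved order the cofactor counts stay small at every level, while the grouped order produces many distinct cofactors in the middle levels (exponentially many as more variable pairs are added); this is the gap that heuristic or learned ordering tries to avoid.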

  1. Sensor-fusion-based biometric identity verification

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States); Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  2. Spatial Verification Using Wavelet Transforms: A Review

    CERN Document Server

    Weniger, Michael; Friederichs, Petra

    2016-01-01

    Due to the emergence of new high resolution numerical weather prediction (NWP) models and the availability of new or more reliable remote sensing data, the importance of efficient spatial verification techniques is growing. Wavelet transforms offer an effective framework to decompose spatial data into separate (and possibly orthogonal) scales and directions. Most wavelet based spatial verification techniques have been developed or refined in the last decade and concentrate on assessing forecast performance (i.e. forecast skill or forecast error) on distinct physical scales. Particularly during the last five years, a significant growth in meteorological applications could be observed. However, a comparison with other scientific fields such as feature detection, image fusion, texture analysis, or facial and biometric recognition shows that there is still a considerable, currently unused potential to derive useful diagnostic information. In order to tap the full potential of wavelet analysis, we revise the stat...

  3. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  4. Automated Formal Verification for PLC Control Systems

    CERN Document Server

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing the requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) to different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
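
As a toy illustration of what model checking does (explicit-state here, unlike the symbolic engines behind the tools above), the sketch below exhaustively explores a two-variable interlock model and reports a counterexample path if a safety property can be violated. The interlock rules are invented for the example:

```python
from collections import deque

def check_safety(init, transitions, is_bad):
    """Explicit-state reachability: explore all states via BFS and return
    a counterexample path if a bad state is reachable, else None."""
    queue = deque([(init, [init])])
    seen = {init}
    while queue:
        state, path = queue.popleft()
        if is_bad(state):
            return path                     # counterexample trace
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None                             # property holds on the model

# toy interlock model: state = (motor_on, guard_open)
def trans(s):
    motor, guard = s
    succ = set()
    if motor:
        succ.add((False, guard))            # motor may always stop
    elif not guard:
        succ.add((True, guard))             # motor starts only if guard closed
    if not motor:
        succ.add((motor, not guard))        # guard moves only when motor off
    return succ

# safety property: never (motor running AND guard open)
cex = check_safety((False, False), trans, lambda s: s[0] and s[1])
```

Because every reachable state is examined, the absence of a counterexample is a proof of the property for this model, which is the guarantee that testing alone cannot give.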

  5. Formal verification of industrial control systems

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  6. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm

  7. Verification of Tolerance Chains in Micro Manufacturing

    DEFF Research Database (Denmark)

    Gasparin, Stefania

    The aim of this work is to define methodologies for the tolerance verification of injection moulded components with downscaled dimensions. In micro and nano metrology different challenges can be found: lack of calibration artefacts and available ISO standards, problematic uncertainty budget...... light. In this thesis a replica casting technique is proposed to overcome the problem: the workpiece is replicated and the replica is characterized instead of the part. Different investigations are carried out on roughness specimens and deterministic structures (e.g. grooves) in order to define...... on a calibrated standard artefact and to measure both in order to assure an unbroken chain of comparisons. The replica technique reveals to be a fast, cost-effective and reliable method. Regarding the tolerance verification of micro parts and nano-structured surfaces, a systematic approach is discussed based...

  8. Getting ready for final disposal in Finland - Independent verification of spent fuel

    International Nuclear Information System (INIS)

    Full text: Final disposal of spent nuclear fuel has long been known to be the solution for the back end of the fuel cycle in Finland. This has allowed the State system for accounting and control (SSAC) to prepare for the safeguards requirements in time. The Finnish SSAC includes the operator, the State authority STUK and the parties above them, e.g. the Ministry for Trade and Industry. Undisputed responsibility for the safe disposal of spent fuel lies with the operator. The role of the safety authority STUK is to set up detailed requirements, to inspect the operator's plans and, by using the different tools of a quality-audit approach, to verify that the requirements will be complied with in practice. Responsibility for safeguards issues is similar, with the addition of the role of the regional and international verification organizations represented by Euratom and the IAEA. As the competent safeguards authority, STUK has decided to maintain its active role also in the future. This will be reflected in the increasing cooperation between the SSAC and the IAEA in the new safeguards activities related to the Additional Protocol. The role of Euratom will remain the same concerning the implementation of conventional safeguards. Based on its SSAC role, STUK has continued carrying out safeguards inspections, including independent verification measurements on spent fuel, also after joining the EU and Euratom safeguards in 1995. Verification of the operator-declared data is the key verification element of safeguards. This will remain the case also under Integrated Safeguards (IS) in the future. It is believed that the importance of high-quality measurements will rather increase than decrease as the frequency of interim inspections decreases. Maintaining the continuity of knowledge makes sense only when the knowledge is reliable and independently verified. 
One of the cornerstones of the high quality of the Finnish SSAC activities is

  9. Analytical benchmarks for verification of thermal-hydraulic codes based on sub-channel approach

    International Nuclear Information System (INIS)

    Over the last year (2007), preliminary tests have been performed on the Moroccan TRIGA MARK II research reactor to show that, under all operating conditions, the coolant parameters fall within the ranges allowing the safe working conditions of the reactor core. In parallel, a sub-channel thermal-hydraulic code, named SACATRI (Sub-channel Analysis Code for Application to TRIGA reactors), was developed to satisfy the needs of numerical simulation tools, able to predict the coolant flow parameters. The thermal-hydraulic model of SACATRI code is based on four partial differential equations that describe the conservation of mass, energy, axial and transversal momentum. However, to achieve the full task of any numerical code, verification is a highly recommended activity for assessing the accuracy of computational simulations. This paper presents a new procedure which can be used during code and solution verification activities of thermal-hydraulic tools based on sub-channel approach. The technique of verification proposed is based mainly on the combination of the method of manufactured solution and the order of accuracy test. The verification of SACATRI code allowed the elaboration of exact analytical benchmarks that can be used to assess the mathematical correctness of the numerical solution to the elaborated model
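
The two ingredients named above combine in a few lines: pick a manufactured solution, derive the forcing term analytically, solve on two grids, and compare the observed convergence order against the scheme's formal order. The sketch below applies the procedure to a 1-D Poisson problem with a second-order central-difference scheme (an illustration of the method, not the SACATRI model):

```python
import math

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with second-order central
    differences and the Thomas algorithm; return the max-norm error
    against the manufactured solution u(x) = sin(pi x)."""
    h = 1.0 / n
    x = [i * h for i in range(n + 1)]
    # manufactured solution u = sin(pi x)  =>  f = -u'' = pi^2 sin(pi x)
    f = [math.pi ** 2 * math.sin(math.pi * xi) for xi in x]
    m = n - 1                        # interior unknowns
    a = [-1.0] * m                   # sub-diagonal
    b = [2.0] * m                    # diagonal
    c = [-1.0] * m                   # super-diagonal
    d = [h * h * f[i + 1] for i in range(m)]
    for i in range(1, m):            # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * m                    # back substitution
    u[-1] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    exact = [math.sin(math.pi * xi) for xi in x[1:-1]]
    return max(abs(ui - ei) for ui, ei in zip(u, exact))

e1 = solve_poisson(32)
e2 = solve_poisson(64)
order = math.log(e1 / e2, 2)   # observed order of accuracy; expect ~2
```

If the observed order matches the formal order of the scheme (here, 2) as the grid is refined, the discretization and solver are verified against the manufactured benchmark.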

  10. Analytical benchmarks for verification of thermal-hydraulic codes based on sub-channel approach

    Energy Technology Data Exchange (ETDEWEB)

    Merroun, O. [LMR/ERSN, Department of Physics, Faculty of Sciences, Abdelmalek Essaadi University, B.P. 2121, Tetouan 93002 (Morocco)], E-mail: meroun.ossama@gmail.com; Almers, A. [Department of Energetics, Ecole Nationale Superieure d' Arts et Metiers, Moulay Ismail University, B.P. 4024, Meknes (Morocco); El Bardouni, T.; El Bakkari, B. [LMR/ERSN, Department of Physics, Faculty of Sciences, Abdelmalek Essaadi University, B.P. 2121, Tetouan 93002 (Morocco); Chakir, E. [LRM/EPTN, Department of Physics, Faculty of Sciences, Kenitra (Morocco)

    2009-04-15

    Over the last year (2007), preliminary tests have been performed on the Moroccan TRIGA MARK II research reactor to show that, under all operating conditions, the coolant parameters fall within the ranges allowing the safe working conditions of the reactor core. In parallel, a sub-channel thermal-hydraulic code, named SACATRI (Sub-channel Analysis Code for Application to TRIGA reactors), was developed to satisfy the needs of numerical simulation tools, able to predict the coolant flow parameters. The thermal-hydraulic model of SACATRI code is based on four partial differential equations that describe the conservation of mass, energy, axial and transversal momentum. However, to achieve the full task of any numerical code, verification is a highly recommended activity for assessing the accuracy of computational simulations. This paper presents a new procedure which can be used during code and solution verification activities of thermal-hydraulic tools based on sub-channel approach. The technique of verification proposed is based mainly on the combination of the method of manufactured solution and the order of accuracy test. The verification of SACATRI code allowed the elaboration of exact analytical benchmarks that can be used to assess the mathematical correctness of the numerical solution to the elaborated model.

  11. Face Verification using MLP and SVM

    OpenAIRE

    Cardinaux, Fabien; Marcel, Sébastien

    2002-01-01

    The performance of machine learning algorithms such as MLPs and, more recently, SVMs has steadily improved over the past few years. In this paper, we compare two successful discriminant machine learning algorithms applied to the problem of face verification: MLP and SVM. These two algorithms are tested on a benchmark database, namely XM2VTS. Results show that an MLP is better than an SVM on this particular task.

  12. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  13. Food Quality Verification: Who Do Consumers Trust?

    OpenAIRE

    Hobbs, Jill E.; Innes, Brian G.; Uzea, Adrian D.

    2010-01-01

    Food markets are increasingly characterized by an array of quality assurances with respect to credence attributes, many of which relate to agricultural production methods. A variety of organizations are associated with these quality assurance claims, including private, third party and public sector organizations. How do quality verifications from different sources affect consumer food choices? Who do consumers trust for assurances about credence attributes? This paper draws upon two recent st...

  14. Probabilistic verification of partially observable dynamical systems

    OpenAIRE

    Gyori, Benjamin M.; Paulin, Daniel; Palaniappan, Sucheendra K.

    2014-01-01

    The construction and formal verification of dynamical models is important in engineering, biology and other disciplines. We focus on non-linear models containing a set of parameters governing their dynamics. The value of these parameters is often unknown and not directly observable through measurements, which are themselves noisy. When treating parameters as random variables, one can constrain their distribution by conditioning on observations and thereby constructing a posterior probability ...
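
Conditioning a parameter's distribution on noisy observations, as described above, can be sketched with a simple grid approximation of Bayes' rule. The uniform prior, Gaussian noise model, and toy linear model below are illustrative assumptions, not the paper's setup:

```python
import math

def posterior_grid(prior_grid, observations, model, sigma):
    """Grid-based Bayesian conditioning: weight each candidate parameter
    value by the Gaussian likelihood of the noisy observations under a
    uniform prior, then normalize to a probability distribution."""
    weights = []
    for theta in prior_grid:
        loglik = sum(-0.5 * ((y - model(theta, t)) / sigma) ** 2
                     for t, y in observations)
        weights.append(math.exp(loglik))
    z = sum(weights)
    return [w / z for w in weights]

# toy dynamics y = theta * t with true theta = 2 and noisy readings
obs = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
grid = [0.5 * i for i in range(1, 9)]          # candidate theta values
post = posterior_grid(grid, obs, lambda th, t: th * t, sigma=0.2)
best = grid[post.index(max(post))]             # posterior mode
```

The posterior concentrates on the candidate value most consistent with the data; probabilistic verification then asks questions of the model with parameters drawn from this constrained distribution rather than from a point estimate.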

  15. Unconstrained Face Verification using Deep CNN Features

    OpenAIRE

    Chen, Jun-Cheng; Patel, Vishal M.; Chellappa, Rama

    2015-01-01

    In this paper, we present an algorithm for unconstrained face verification based on deep convolutional features and evaluate it on the newly released IARPA Janus Benchmark A (IJB-A) dataset. The IJB-A dataset includes real-world unconstrained faces from 500 subjects with full pose and illumination variations, which are much harder than the traditional Labeled Faces in the Wild (LFW) and YouTube Faces (YTF) datasets. The deep convolutional neural network (DCNN) is trained using the CASIA-WebFace ...

  17. Dynamic analysis for shuttle design verification

    Science.gov (United States)

    Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.

    1972-01-01

    Two approaches used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite-element mathematical modeling of the space shuttle structure so that computer programs for dynamic structural analysis can be used. The second method utilizes modal-coupling techniques, in which experimental verification is performed by vibrating only spacecraft components and the modes and frequencies of the complete vehicle are deduced from the results obtained in the component tests.
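
For a small model, the first approach reduces to the generalized eigenvalue problem det(K − ω²M) = 0. A two-degree-of-freedom sketch (a toy spring-mass chain, not a shuttle model) solves it directly as a quadratic in ω²:

```python
import math

def two_dof_frequencies(m1, m2, k1, k2, k3):
    """Natural frequencies (rad/s) of a fixed-fixed two-mass chain:
    wall -k1- m1 -k2- m2 -k3- wall.  Expands det(K - lam*M) = 0,
    a quadratic in lam = omega^2, for diagonal mass matrix M."""
    K = [[k1 + k2, -k2], [-k2, k2 + k3]]      # stiffness matrix
    a = m1 * m2
    b = -(m1 * K[1][1] + m2 * K[0][0])
    c = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    disc = math.sqrt(b * b - 4 * a * c)
    lam = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(l) for l in lam]

# symmetric unit chain: theory gives omega1 = sqrt(k), omega2 = sqrt(3k)
w1, w2 = two_dof_frequencies(1.0, 1.0, 1.0, 1.0, 1.0)
```

Finite-element codes do the same thing at scale: assemble large K and M matrices and extract the lowest eigenpairs numerically, which the component tests then verify experimentally.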

  18. Analyzing personalized policies for online biometric verification.

    Directory of Open Access Journals (Sweden)

    Apaar Sadhwani

    Full Text Available Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
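
The likelihood-ratio decision rule described above can be sketched as follows; the independent Gaussian score models and their parameters are illustrative assumptions, not the paper's fitted joint distribution:

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(scores, genuine=(0.8, 0.1), imposter=(0.3, 0.1)):
    """Fused likelihood ratio over similarity scores, assuming each score
    is independent and Gaussian under both hypotheses (toy parameters)."""
    lr = 1.0
    for s in scores:
        lr *= gauss_pdf(s, *genuine) / gauss_pdf(s, *imposter)
    return lr

def verify(scores, threshold=1.0):
    """Accept the claimed identity iff the fused likelihood ratio exceeds
    the threshold; raising the threshold lowers FAR at the cost of FRR."""
    return likelihood_ratio(scores) > threshold
```

With a model of the score distributions in hand, the threshold can be tuned to trade false accepts against false rejects, which is exactly the FAR/FRR optimization the paper performs per resident.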

  19. Cleaning verification by air/water impingement

    Science.gov (United States)

    Jones, Lisa L.; Littlefield, Maria D.; Melton, Gregory S.; Caimi, Raoul E. B.; Thaxton, Eric A.

    1995-01-01

    This paper will discuss how the Kennedy Space Center intends to perform precision cleaning verification by Air/Water Impingement in lieu of chlorofluorocarbon-113 gravimetric nonvolatile residue analysis (NVR). Test results will be given that demonstrate the effectiveness of the Air/Water system. A brief discussion of the Total Carbon method via the use of a high temperature combustion analyzer will also be given. The necessary equipment for impingement will be shown along with other possible applications of this technology.

  20. Palm Vein Verification Using Gabor Filter

    OpenAIRE

    Ali Mohsin Al-Juboori

    2013-01-01

    Palm vein authentication is one of the modern biometric techniques; it employs the vein pattern in the human palm to verify a person. The merits of palm vein over classical biometrics (e.g. fingerprint, iris, face) are a low risk of falsification, difficulty of duplication, and stability. In this research, a new method is proposed for personal verification based on palm vein features. In the proposed method, the palm vein images are firstly enhanced and then the features are extracted by using...
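
A real-valued Gabor kernel, of the kind used to enhance and extract such vein features, is straightforward to generate. The sketch below uses the common parameterization (wavelength lambd, orientation theta, envelope sigma, aspect gamma, phase psi); the values chosen are illustrative:

```python
import math

def gabor_kernel(size, sigma, theta, lambd, gamma=1.0, psi=0.0):
    """Real Gabor kernel: an oriented cosine carrier under a Gaussian
    envelope, returned as a size x size list of lists (size odd)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)    # rotate axes
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr * xr + (gamma * yr) ** 2) / (2 * sigma * sigma)) \
                * math.cos(2 * math.pi * xr / lambd + psi)
            row.append(g)
        kernel.append(row)
    return kernel

# 9x9 kernel tuned to roughly 4-pixel vein spacing, horizontal orientation
k = gabor_kernel(size=9, sigma=2.0, theta=0.0, lambd=4.0)
```

Convolving the image with a bank of such kernels at several orientations responds strongly where vein ridges align with the kernel, which is what makes the filter useful for vein enhancement.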

  1. Finger-print based human verification system

    OpenAIRE

    Klopčič, Uroš

    2009-01-01

    The diploma thesis presents an algorithm for verification based on fingerprints. In order to achieve a simple and modular design, the algorithm is divided into a number of steps. As input, the algorithm takes greyscale fingerprint images. First, segmentation is performed, where the background is separated from the area that represents the fingerprint. This is followed by the calculation of the orientation field of the fingerprint using the gradient method, and by local frequency estimation. Both val...
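
The gradient method mentioned above estimates, per image block, a dominant orientation from averaged gradient products. A minimal sketch using central differences on one block (note the returned angle is the dominant gradient direction; the ridge direction is perpendicular to it):

```python
import math

def block_orientation(block):
    """Dominant gradient orientation of a grayscale block via the
    least-squares gradient method: 0.5 * atan2(2*Gxy, Gxx - Gyy)."""
    h, w = len(block), len(block[0])
    gxx = gyy = gxy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (block[y][x + 1] - block[y][x - 1]) / 2.0  # central diff
            gy = (block[y + 1][x] - block[y - 1][x]) / 2.0
            gxx += gx * gx
            gyy += gy * gy
            gxy += gx * gy
    return 0.5 * math.atan2(2.0 * gxy, gxx - gyy)

# vertical stripes: intensity varies only with x, so the dominant
# gradient direction is 0 (and the ridges run vertically)
stripes = [[math.cos(x) for x in range(12)] for _ in range(12)]
theta = block_orientation(stripes)
```

Averaging the squared-gradient terms over the block, rather than individual gradient angles, avoids the 180-degree ambiguity of raw gradient directions and is why this formulation is standard for fingerprint orientation fields.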

  2. Fresh fuel verification feasibility study

    International Nuclear Information System (INIS)

    The study examined several methods of verifying crates of fresh CANDU fuel without opening the crates. Of these methods, only a passive gamma technique and an active neutron technique were chosen as being appropriate. Computer simulations supported by physical measurements were performed to determine the applicability of various detectors, detector configurations, detector locations, etc. The results indicate that for the two techniques chosen, a crate of fuel can be verified without its being opened, provided modifications are made to the crate to permit a detector or source probes to be inserted. Estimates of the cost of modifying existing crates and procuring the measurement equipment are provided

  3. Final Report Independent Verification Survey of the High Flux Beam Reactor, Building 802 Fan House Brookhaven National Laboratory Upton, New York

    Energy Technology Data Exchange (ETDEWEB)

    Harpeneau, Evan M. [Oak Ridge Institute for Science and Education, Oak Ridge, TN (United States). Independent Environmental Assessment and Verification Program

    2011-06-24

    On May 9, 2011, ORISE conducted verification survey activities including scans, sampling, and the collection of smears of the remaining soils and off-gas pipe associated with the 802 Fan House within the HFBR (High Flux Beam Reactor) Complex at BNL. ORISE is of the opinion, based on independent scan and sample results obtained during verification activities at the HFBR 802 Fan House, that the FSS (final status survey) unit meets the applicable site cleanup objectives established for as left radiological conditions.

  4. Verification in referral-based crowdsourcing.

    Directory of Open Access Journals (Sweden)

    Victor Naroditskiy

    Full Text Available Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.
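
The winning MIT strategy in the Red Balloon Challenge, with which the paper's optimal compensation scheme coincides, split each balloon's reward geometrically up the referral chain: the finder received half the per-balloon allocation, the finder's inviter half of that, and so on, with the residual donated. A sketch:

```python
def payments(chain, reward):
    """Geometric referral payments: chain[0] is the finder, each later
    entry is the previous person's inviter; every step up the chain
    receives half of the payment below it."""
    pay = []
    amount = reward / 2.0
    for person in chain:
        pay.append((person, amount))
        amount /= 2.0
    return pay

# $4000 allocated per balloon, as in the MIT scheme
chain = ["finder", "inviter1", "inviter2"]
payouts = payments(chain, reward=4000.0)
```

Because the payments form a geometric series, the total paid out never exceeds the allocation no matter how long the referral chain grows, while every participant still has a positive incentive both to search and to recruit.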

  5. Functional Verification of Enhanced RISC Processor

    Directory of Open Access Journals (Sweden)

    SHANKER NILANGI

    2013-10-01

    Full Text Available This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating-point computation integrated within the core, designed to reduce cost and complexity. The designed 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single-precision floating-point multiplier, a floating-point adder/subtractor for floating-point operations, and a 32 x 32 Booth multiplier added to the integer core of the ARM7. The binary representation of the floating-point numbers employed in the design eliminates the need for floating-point registers and uses the same set of registers, thereby reducing complexity, area and cost. A mask-based data-reversal barrel shifter performs parallel flag computations during shift or rotate and has the least worst-case delay, 0.94 ns, compared to other barrel shifters. The hardware of the 32-bit RISC processor core has been modeled in Verilog HDL and simulated in VCS. Verification of a complex design such as a 32-bit RISC processor is one of the major challenges, as it consumes more time. In this work, a verification environment is developed to verify the designed RISC processor core.
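
A barrel shifter performs a rotate as a cascade of log2(n) conditional fixed-distance stages, one per bit of the shift amount. The behavioural sketch below mirrors that structure for a 32-bit rotate-left (an illustration of the stage-cascade principle, not the cited mask-based data-reversal design):

```python
MASK32 = 0xFFFFFFFF

def barrel_rotl32(value, amount):
    """Five-stage barrel rotate-left: stage k rotates by 2**k if and only
    if bit k of the shift amount is set, mirroring the mux stages a
    hardware barrel shifter evaluates in parallel."""
    amount &= 31
    for k in range(5):                       # stages: 1, 2, 4, 8, 16
        if (amount >> k) & 1:
            s = 1 << k
            value = ((value << s) | (value >> (32 - s))) & MASK32
    return value
```

In hardware, all five stages are fixed multiplexer layers rather than a loop, so any rotate amount completes in the same constant delay; such a reference model is also the kind of golden model a verification environment compares the RTL against.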

  6. Formal verification of an avionics microprocessor

    Science.gov (United States)

    Srivas, Mandayam, K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions: the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying, in the PVS language, a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels, and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  7. Tags and seals for arms control verification

    Energy Technology Data Exchange (ETDEWEB)

    DeVolpi, A.

    1990-09-18

    Tags and seals have long been recognized as important tools in arms control. The trend in control of armaments is to limit militarily significant equipment that is capable of being verified through direct and cooperative means, chiefly on-site inspection or monitoring. Although this paper will focus on the CFE treaty, the role of tags and seals for other treaties will also be addressed. Published technology and concepts will be reviewed, based on open sources. Arms control verification tags are defined as unique identifiers designed to be tamper-revealing; in that respect, seals are similar, being used as indicators of unauthorized access. Tamper-revealing tags might be considered as single-point markers, seals as two-point couplings, and nets as volume containment. The functions of an arms control tag can be considered to be two-fold: to provide field verification of the identity of a treaty-limited item (TLI), and to have a means of authentication of the tag and its tamper-revealing features. Authentication could take place in the field or be completed elsewhere. For CFE, the goal of tags and seals can be to reduce the overall cost of the entire verification system.

  8. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  9. DOE handbook: Integrated safety management systems (ISMS) verification team leader's handbook

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase I) and ISMS implementation (Phase II) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  10. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  11. Ligand chain length conveys thermochromism.

    Science.gov (United States)

    Ganguly, Mainak; Panigrahi, Sudipa; Chandrakumar, K R S; Sasmal, Anup Kumar; Pal, Anjali; Pal, Tarasankar

    2014-08-14

    Thermochromic properties of a series of non-ionic copper compounds have been reported. Herein, we demonstrate that the Cu(II) ion, together with a straight-chain primary amine (A) and alpha-linolenic acid (a fatty acid, AL), exhibits thermochromic properties. In the present case, we determined that the thermochromism is ligand chain length-dependent, and that at least one of the ligands (A or AL) must be long-chain. Thermochromism is attributed to a balanced competition between the fatty acids and amines for the copper(II) centre. The structure-property relationship of the non-ionic copper compounds Cu(AL)2(A)2 has been substantiated by various physical measurements along with detailed theoretical studies based on time-dependent density functional theory. Our results suggest that the compound would be a useful material for temperature-sensor applications. PMID:24943491

  12. Geometry of Area Without Length

    CERN Document Server

    Ho, Pei-Ming

    2015-01-01

    To define a free string by the Nambu-Goto action, all we need is the notion of area, and mathematically the area can be defined directly in the absence of a metric. Motivated by the possibility that string theory admits backgrounds where the notion of length is not well defined but a definition of area is given, we study space-time geometries based on the generalization of metric to area metric. In analogy with Riemannian geometry, we define the analogues of connections, curvatures and Einstein tensor. We propose a formulation generalizing Einstein's theory that will be useful if at a certain stage or a certain scale the metric is ill-defined and the space-time is better characterized by the notion of area. Static spherical solutions are found for the generalized Einstein equation in vacuum, including the Schwarzschild solution as a special case.
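
As a schematic illustration (the notation below is ours, not necessarily the paper's), the Nambu-Goto area of a surface X^mu(sigma^1, sigma^2) depends on the tangent vectors only through a quartic combination, so a rank-4 area metric G suffices to define area even when no metric g exists:

```latex
A[X] = \int d^2\sigma \,
  \sqrt{\, G_{\mu\nu\rho\sigma}\,
        \partial_1 X^\mu \, \partial_2 X^\nu \,
        \partial_1 X^\rho \, \partial_2 X^\sigma \,},
\qquad
G_{\mu\nu\rho\sigma} \,\stackrel{\text{metric case}}{=}\,
  g_{\mu\rho}\, g_{\nu\sigma} - g_{\mu\sigma}\, g_{\nu\rho}.
```

In the metric-induced special case the radicand reduces to the determinant of the induced metric h_ab = g_{mu nu} (d_a X^mu)(d_b X^nu), recovering the usual Nambu-Goto area; a general area metric need not factorize this way, which is the freedom the paper's generalized geometry exploits.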

  13. Geometry of area without length

    Science.gov (United States)

    Ho, Pei-Ming; Inami, Takeo

    2016-01-01

    To define a free string by the Nambu-Goto action, all we need is the notion of area, and mathematically the area can be defined directly in the absence of a metric. Motivated by the possibility that string theory admits backgrounds where the notion of length is not well defined but a definition of area is given, we study space-time geometries based on the generalization of a metric to an area metric. In analogy with Riemannian geometry, we define the analogues of connections, curvatures, and Einstein tensor. We propose a formulation generalizing Einstein's theory that will be useful if at a certain stage or a certain scale the metric is ill defined and the space-time is better characterized by the notion of area. Static spherical solutions are found for the generalized Einstein equation in vacuum, including the Schwarzschild solution as a special case.

  14. Thoughts on Verification of Nuclear Disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Dunlop, W H

    2007-09-26

    It is my pleasure to be here today to participate in this Conference. My thanks to the organizers for preparing such an interesting agenda on a very difficult topic. My effort in preparing my presentation was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48. And as many of you know, Lawrence Livermore National Laboratory is now, as of Oct 1st, under contract to the Lawrence Livermore National Security LLC. There has been a long history of how to view verification of arms control agreements. The basis for verification during the days of SALT was that verification would be based on each country's national technical means. For treaties dealing with strategic missiles this worked well, as the individual items subject to verification were of such a size that they were visible to the National Technical Means available at the time. And it was felt that the counting of missiles and launchers could be verified by our National Technical Means. For nuclear testing treaties, the use of seismic measurements developed into a capability that was reasonably robust for all but the smallest of nuclear tests. However, once we had the Threshold Test Ban Treaty, there was a significant problem in that the fidelity of the measurements was not sufficient to determine if a test was slightly above the 150 kt limit or slightly below the 150 kt limit. This led some in the US to believe that the Soviet Union was not living up to the TTBT agreement. An on-site verification protocol was negotiated in 1988 and 1989 that allowed the US to make hydrodynamic yield measurements on Soviet tests above 50 kt yield and regional seismic measurements on all tests above 35 kt of yield; and the Soviets to make the same type of measurements on US tests to ensure that they were not over 150 kt. These on-site measurements were considered reasonably intrusive. Again the measurement capability was

  15. Telomere Rapid Deletion Regulates Telomere Length in Arabidopsis thaliana

    OpenAIRE

    Watson, J. Matthew; Dorothy E Shippen

    2006-01-01

    Telomere length is maintained in species-specific equilibrium primarily through a competition between telomerase-mediated elongation and the loss of terminal DNA through the end-replication problem. Recombinational activities are also capable of both lengthening and shortening telomeres. Here we demonstrate that elongated telomeres in Arabidopsis Ku70 mutants reach a new length set point after three generations. Restoration of wild-type Ku70 in these mutants leads to discrete telomere-shorten...

  16. Abstraction and Learning for Infinite-State Compositional Verification

    Directory of Open Access Journals (Sweden)

    Dimitra Giannakopoulou

    2013-09-01

    Full Text Available Despite many advances that enable the application of model checking techniques to the verification of large systems, the state-explosion problem remains the main challenge for scalability. Compositional verification addresses this challenge by decomposing the verification of a large system into the verification of its components. Recent techniques use learning-based approaches to automate compositional verification based on the assume-guarantee style reasoning. However, these techniques are only applicable to finite-state systems. In this work, we propose a new framework that interleaves abstraction and learning to perform automated compositional verification of infinite-state systems. We also discuss the role of learning and abstraction in the related context of interface generation for infinite-state components.
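
Assume-guarantee reasoning of the kind referred to above is typically built on the simplest asymmetric proof rule, written in triple notation ⟨A⟩ M ⟨P⟩ ("component M satisfies property P under environment assumption A"). In the learning-based techniques, the assumption A is generated automatically (e.g. by an L*-style learner):

```latex
\frac{\langle A \rangle \; M_1 \; \langle P \rangle
      \qquad
      \langle \mathit{true} \rangle \; M_2 \; \langle A \rangle}
     {\langle \mathit{true} \rangle \; M_1 \parallel M_2 \; \langle P \rangle}
```

The rule is sound for safety properties: if M1 guarantees P whenever its environment behaves like A, and M2 in fact behaves like A, then the composition satisfies P without ever building the full product state space.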

  17. Verification of post permanently manned configuration Space Station elements

    Science.gov (United States)

    Scully, E. J.; Edwards, M. D.

    1986-01-01

    An account is given of the techniques and ground systems designed to fulfill post permanently manned configuration (PMC) Space Station verification tasks. Consideration is given to analysis using computer math models and computer-aided interface verification systems, testing using simulators and interface fixtures, and special inspection. It is noted that an initial Space Station design that accommodates and facilitates verification is crucial to an effective verification program, as are proper instrumentation, built-in test capability, and a precise configuration management, control, and record system. It is concluded that post-PMC verification should be accounted for both in the initial Space Station design and in the subsequent development of initial assembly flight verification techniques and capabilities.

  18. Seal Wire Integrity Verification Instrument: Evaluation of Laboratory Prototypes

    International Nuclear Information System (INIS)

    Tamper indicating devices (TIDs) provide evidence of whether the sensitive items to which they have been applied have been tampered with. Passive wire-loop seals, a class of TIDs, generally comprise a multi-strand seal wire that is threaded through or around key features and a unique seal body that captures and restrains the seal wire. Seal integrity resides with unique identification of the seal and the integrity of the seal body and the seal wire. Upon inspection, the seal wire may be cut and its full length inspected. A new seal may be applied in the field as a replacement, if desired. Seal wire inspection typically requires visual and tactile examinations, both of which are subjective. A need therefore exists to develop seal wire inspection technology that is easy to use in the field, is objective, provides an auditable data trail, and has low error rates. Expected benefits, if successfully implemented, are improved on-site inspection reliability and security. The work scope for this effort was restricted to the integrity of seal wire used by the International Atomic Energy Agency (IAEA) and resulted in development of a wire integrity verification instrument (WIVI) laboratory prototype. Work included a performance evaluation of a laboratory bench-top system, and design and delivery of two WIVI laboratory prototypes. The paper presents the basic physics of the eddy current measurement, a description of the WIVI laboratory prototype, and an initial evaluation performed by IAEA personnel.

  19. A quality assurance phantom for IMRT dose verification

    Science.gov (United States)

    Ma, C.-M.; Jiang, S. B.; Pawlicki, T.; Chen, Y.; Li, J. S.; Deng, J.; Boyer, A. L.

    2003-03-01

    This paper investigates a quality assurance (QA) phantom specially designed to verify the accuracy of dose distributions and monitor units (MU) calculated by clinical treatment planning optimization systems and by the Monte Carlo method for intensity-modulated radiotherapy (IMRT). The QA phantom is a PMMA cylinder of 30 cm diameter and 40 cm length with various bone and lung inserts. A procedure (and formalism) has been developed to measure the absolute dose to water in the PMMA phantom. Another cylindrical phantom of the same dimensions, but made of water, was used to confirm the results obtained with the PMMA phantom. The PMMA phantom was irradiated by 4, 6 and 15 MV photon beams and the dose was measured using an ionization chamber and compared to the results calculated by a commercial inverse planning system (CORVUS, NOMOS, Sewickley, PA) and by the Monte Carlo method. The results show that the dose distributions calculated by both CORVUS and Monte Carlo agreed to within 2% of dose maximum with measured results in the uniform PMMA phantom for both open and intensity-modulated fields. Similar agreement was obtained between Monte Carlo calculations and measured results with the bone and lung heterogeneity inside the PMMA phantom while the CORVUS results were 4% different. The QA phantom has been integrated as a routine QA procedure for the patient's IMRT dose verification at Stanford since 1999.
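
The pass criterion used in the study ("within 2% of dose maximum") reduces to a point-by-point comparison of measured and calculated doses normalized to the maximum dose. The sketch below is an illustrative reconstruction, not the paper's analysis code; the function names and flat dose lists are assumptions:

```python
def dose_diff_percent_of_max(measured, calculated):
    """Point-by-point dose difference, expressed as a percentage of the
    maximum measured dose (a common normalization in IMRT QA)."""
    d_max = max(measured)
    return [100.0 * (c - m) / d_max for m, c in zip(measured, calculated)]

def within_tolerance(measured, calculated, tol_percent=2.0):
    """True if every point agrees to within tol_percent of dose maximum."""
    diffs = dose_diff_percent_of_max(measured, calculated)
    return all(abs(d) <= tol_percent for d in diffs)
```

For example, comparing a measured profile of [1.00, 2.00, 1.50] Gy against a calculated [1.01, 1.98, 1.52] Gy gives differences of 0.5%, -1.0% and 1.0% of the 2.00 Gy maximum, which passes the 2% criterion.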

  20. Nuclear disarmament and the verification role of the IAEA

    International Nuclear Information System (INIS)

    At the height of the cold war, nuclear arsenals reached a peak of some 70,000 weapons. Although these numbers have since come down significantly, some 27,000 weapons remain. The fact that decades go by and nuclear disarmament is not realised contributes to a deep sense of concern and disappointment. So do other factors, such as the persistence of nuclear doctrines that admit first use; the lack of binding negative assurances; the ongoing research on nuclear explosives, including subcritical tests; and the maintenance of readiness to resume full-scale testing. The sense of insufficient or outright lack of progress in nuclear disarmament is even more disturbing if measured against existing legal obligations. First and foremost among those is of course Article VI of the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). According to the ICJ's Advisory Opinion, the obligation contained in Article VI is an obligation to achieve results in nuclear disarmament. The Comprehensive Nuclear Test Ban Treaty (CTBT) has yet to be brought into force, and a Fissile Material Cut-Off Treaty (FMCT) has yet to be negotiated. Despite significant unilateral reductions in nuclear arsenals, these have not been made within an international process that includes the commitment to total elimination. The notion that it is morally reprehensible for some countries to pursue weapons of mass destruction yet morally acceptable for others to rely on them for their security is simply unworkable. For achieving nuclear disarmament verification objectives, the IAEA clearly would have a major role to play. Under Article III.A.5 of its Statute, the Agency is allowed to apply, at the request of a State, safeguards to any of that State's nuclear activities. The Agency's capabilities and experience make it the international institution best suited to eventually perform nuclear disarmament verification tasks. In order to perform nuclear disarmament verification activities, the Agency would of course need to

  1. Skilled Impostor Attacks Against Fingerprint Verification Systems And Its Remedy

    OpenAIRE

    Gottschlich, Carsten

    2015-01-01

    Fingerprint verification systems are becoming ubiquitous in everyday life. This trend is propelled especially by the proliferation of mobile devices with fingerprint sensors such as smartphones and tablet computers, and fingerprint verification is increasingly applied for authenticating financial transactions. In this study we describe a novel attack vector against fingerprint verification systems which we coin skilled impostor attack. We show that existing protocols for performance evaluatio...
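
The performance-evaluation protocols the abstract refers to rest on two standard error rates at a given decision threshold: the false non-match rate (FNMR, genuine comparisons rejected) and the false match rate (FMR, impostor comparisons accepted). A minimal sketch follows; the score lists and names are illustrative, and the skilled impostor attack concerns how the impostor scores are sampled, which this sketch does not model:

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    """FNMR: fraction of genuine comparison scores below the threshold.
    FMR: fraction of impostor comparison scores at or above it."""
    fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return fnmr, fmr
```

Raising the threshold trades FMR for FNMR; the paper's point is that an evaluation whose impostor scores come only from random, unskilled impostors will understate the FMR achievable by a deliberate attacker.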

  2. A Simple Complexity Measurement for Software Verification and Software Testing

    OpenAIRE

    Cheng, Zheng; Monahan, Rosemary; Power, James F.

    2012-01-01

    In this paper, we used a simple metric (i.e. Lines of Code) to measure the complexity involved in software verification and software testing. The goal, then, is to argue for software verification over software testing, and to motivate a discussion of how to reduce the complexity involved in software verification. We propose to reduce this complexity by translating the software to a simple intermediate representation which can be verified using an efficient verifier, such as Boog...
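
The Lines of Code metric can be computed in several ways; a minimal sketch (counting non-blank, non-comment-only lines, with Python-style `#` comments assumed; the paper does not specify its exact counting rules) is:

```python
def lines_of_code(source: str) -> int:
    """Crude LOC metric: count lines that are neither blank nor
    comment-only (Python-style '#' comments assumed)."""
    return sum(
        1
        for line in source.splitlines()
        if line.strip() and not line.strip().startswith("#")
    )
```

A line with trailing code after an inline comment still counts as code; only blank lines and lines that are purely comments are excluded.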

  3. Work Breakdown Structure: A Tool for Software Project Scope Verification

    OpenAIRE

    Robert T. Hans

    2013-01-01

    Software project scope verification is a very important process in project scope management and it needs to be performed properly and thoroughly so as to avoid project rework and scope creep. Moreover, software scope verification is crucial in the process of delivering exactly what the customer requested and minimizing project scope changes. Well defined software scope eases the process of scope verification and contributes to project success. Furthermore, a deliverable-oriented WBS provides ...

  4. Replication and Abstraction: Symmetry in Automated Formal Verification

    Directory of Open Access Journals (Sweden)

    Thomas Wahl

    2010-04-01

    This article surveys fundamental and applied aspects of symmetry in system models, and of symmetry reduction methods used to counter state explosion in model checking, an automated formal verification technique. While covering the research field broadly, we particularly emphasize recent progress in applying the technique to realistic systems, including tools that promise to elevate the scope of symmetry reduction to large-scale program verification. The article targets researchers and engineers interested in formal verification of concurrent systems.
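
For fully symmetric systems of interchangeable processes, the core idea of symmetry reduction is to explore one canonical representative per orbit of the symmetry group instead of every state. A minimal sketch (raw state enumeration rather than reachability analysis, for illustration only; a model checker would apply the same canonicalization to the reachable states):

```python
from itertools import product

def canonical(state):
    """Representative of a state's orbit under process permutation:
    for fully interchangeable processes, sort the local states."""
    return tuple(sorted(state))

# n identical processes, each with k possible local states
n, k = 4, 3
full = set(product(range(k), repeat=n))   # k**n = 81 raw states
quotient = {canonical(s) for s in full}   # multisets: C(n+k-1, n) = 15
```

Checking properties on the quotient structure here shrinks the state space from 81 to 15 states, and the savings grow combinatorially with the number of symmetric processes.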

  5. Diagnoses and visit length in complementary and mainstream medicine.

    NARCIS (Netherlands)

    Heiligers, P.J.M.; Groot, J. de; Koster, D.; Dulmen, S. van

    2010-01-01

    BACKGROUND: The demand for complementary medicine (CM) is growing worldwide, and so is the supply. So far, there is not much insight into the activities in Dutch CM practices, nor into how these activities differ from mainstream general practice. Comparisons on diagnoses and visit length can offer an impr

  6. Telomerase and telomere length in pulmonary fibrosis.

    Science.gov (United States)

    Liu, Tianju; Ullenbruch, Matthew; Young Choi, Yoon; Yu, Hongfeng; Ding, Lin; Xaubet, Antoni; Pereda, Javier; Feghali-Bostwick, Carol A; Bitterman, Peter B; Henke, Craig A; Pardo, Annie; Selman, Moises; Phan, Sem H

    2013-08-01

    In addition to its expression in stem cells and many cancers, telomerase activity is transiently induced in murine bleomycin (BLM)-induced pulmonary fibrosis with increased levels of telomerase transcriptase (TERT) expression, which is essential for fibrosis. To extend these observations to human chronic fibrotic lung disease, we investigated the expression of telomerase activity in lung fibroblasts from patients with interstitial lung diseases (ILDs), including idiopathic pulmonary fibrosis (IPF). The results showed that telomerase activity was induced in more than 66% of IPF lung fibroblast samples, in comparison with less than 29% from control samples, some of which were obtained from lung cancer resections. Less than 4% of the human IPF lung fibroblast samples exhibited shortened telomeres, whereas less than 6% of peripheral blood leukocyte samples from patients with IPF or hypersensitivity pneumonitis demonstrated shortened telomeres. Moreover, shortened telomeres in late-generation telomerase RNA component knockout mice did not exert a significant effect on BLM-induced pulmonary fibrosis. In contrast, TERT knockout mice exhibited deficient fibrosis that was independent of telomere length. Finally, TERT expression was up-regulated by a histone deacetylase inhibitor, while the induction of TERT in lung fibroblasts was associated with the binding of acetylated histone H3K9 to the TERT promoter region. These findings indicate that significant telomerase induction was evident in fibroblasts from fibrotic murine lungs and a majority of IPF lung samples, whereas telomere shortening was not a common finding in the human blood and lung fibroblast samples. Notably, the animal studies indicated that the pathogenesis of pulmonary fibrosis was independent of telomere length.

  7. Passive Tomography for Spent Fuel Verification: Analysis Framework and Instrument Design Study

    Energy Technology Data Exchange (ETDEWEB)

    White, Timothy A.; Svard, Staffan J.; Smith, Leon E.; Mozin, Vladimir V.; Jansson, Peter; Davour, Anna; Grape, Sophie; Trellue, H.; Deshmukh, Nikhil S.; Wittman, Richard S.; Honkamaa, Tapani; Vaccaro, Stefano; Ely, James

    2015-05-18

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly is being assessed through a collaboration of Support Programs to the International Atomic Energy Agency (IAEA). In the first phase of this study, two safeguards verification objectives have been identified. The first is the independent determination of the number of active pins that are present in the assembly, in the absence of a priori information. The second objective is to provide quantitative measures of pin-by-pin properties, e.g. activity of key isotopes or pin attributes such as cooling time and relative burnup, for the detection of anomalies and/or verification of operator-declared data. The efficacy of GET to meet these two verification objectives will be evaluated across a range of fuel types, burnups, and cooling times, and with a target interrogation time of less than 60 minutes. The evaluation of GET viability for safeguards applications is founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types are used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. Instrument response data are processed by a variety of tomographic-reconstruction and image-processing methods, and scoring metrics specific to each of the verification objectives are defined and used to evaluate the performance of the methods. This paper will provide a description of the analysis framework and evaluation metrics, example performance-prediction results, and describe the design of a “universal” GET instrument intended to support the full range of verification scenarios envisioned by the IAEA.
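
The first verification objective (independently counting active pins) reduces, after tomographic reconstruction, to thresholding a per-pin activity map and comparing the count with the operator declaration. The sketch below illustrates only that final step; the names, the dict layout, and the simple global threshold are assumptions, not the study's actual scoring metrics:

```python
def count_active_pins(activity_map, threshold):
    """Count pins whose reconstructed activity exceeds a detection
    threshold.

    activity_map: dict mapping (row, col) pin position -> reconstructed
    intensity from the gamma emission tomography image."""
    return sum(1 for a in activity_map.values() if a > threshold)

def partial_defect_flag(activity_map, declared_pins, threshold):
    """Flag the assembly when the independent count disagrees with the
    operator-declared number of active pins."""
    return count_active_pins(activity_map, threshold) != declared_pins
```

In practice the detection threshold and the scoring metrics would be tuned per fuel type, burnup, and cooling time, which is exactly what the described modelling framework is built to evaluate.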

  8. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    Energy Technology Data Exchange (ETDEWEB)

    Mills, Brantley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study has shown solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria have been compared to simulations using the commercial CFD software ANSYS Fluent and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability.
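
A mesh refinement study of the kind described typically reports an observed order of convergence computed from the errors on two successively refined meshes. A minimal sketch of the standard Richardson-style estimate (the function name is ours; the report's exact procedure is not specified in the abstract):

```python
import math

def observed_order(e_coarse, e_fine, r):
    """Observed order of convergence p from errors on two meshes whose
    spacings differ by refinement ratio r (h_coarse = r * h_fine).
    Assumes e ~ C * h**p, so p = log(e_coarse / e_fine) / log(r)."""
    return math.log(e_coarse / e_fine) / math.log(r)
```

For example, halving the mesh spacing (r = 2) and seeing the error drop from 0.04 to 0.01 gives p = 2, i.e. second-order convergence, which is what one compares against the element's theoretical order.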

  9. Office of River Protection Integrated Safety Management System Phase 1 Verification Corrective Action Plan

    International Nuclear Information System (INIS)

    The purpose of this Corrective Action Plan is to demonstrate the ORP planned and/or completed actions to implement ISMS, as well as to prepare for the RPP ISMS Phase II Verification scheduled for August 1999. This Plan collates implied or explicit ORP actions identified in several key ISMS documents and aligns those actions and responsibilities perceived necessary to appropriately disposition all ISMS Phase II preparation activities specific to the ORP. The objective will be to complete or disposition the corrective actions prior to the commencement of the ISMS Phase II Verification. Improvement products/tasks not slated for completion prior to the RPP Phase II verification will be incorporated as corrective actions into the Strategic System Execution Plan (SSEP) Gap Analysis. Many of the business and management systems that were reviewed in the ISMS Phase I verification are being modified to support the ORP transition and are being assessed through the SSEP. The actions and processes identified in the SSEP will support the development of the ORP and continued ISMS implementation as committed to be complete by the end of FY-2000

  10. Monitoring, reporting and verification for national REDD + programmes: two proposals

    International Nuclear Information System (INIS)

    Different options have been suggested by Parties to the UNFCCC (United Nations Framework Convention on Climate Change) for inclusion in national approaches to REDD and REDD + (reduced deforestation, reduced degradation, enhancement of forest carbon stocks, sustainable management of forest, and conservation of forest carbon stocks). This paper proposes that from the practical and technical points of view of designing action for REDD and REDD + at local and sub-national level, as well as from the point of view of the necessary MRV (monitoring, reporting and verification), these should be grouped into three categories: conservation, which is rewarded on the basis of no changes in forest stock, reduced deforestation, in which lowered rates of forest area loss are rewarded, and positive impacts on carbon stock changes in forests remaining forest, which includes reduced degradation, sustainable management of forest of various kinds, and forest enhancement. Thus we have moved degradation, which conventionally is grouped with deforestation, into the forest management group reported as areas remaining forest land, with which it has, in reality, and particularly as regards MRV, much more in common. Secondly, in the context of the fact that REDD/REDD + is to take the form of a national or near-national approach, we argue that while systematic national monitoring is important, it may not be necessary for REDD/REDD + activities, or for national MRV, to be started at equal levels of intensity all over the country. Rather, areas where interventions seem easiest to start may be targeted, and here data measurements may be more rigorous (Tier 3), for example based on stakeholder self-monitoring with independent verification, while in other, untreated areas, a lower level of monitoring may be pursued, at least in the first instance. Treated areas may be targeted for any of the three groups of activities (conservation, reduced deforestation, and positive impact on carbon stock increases in

  11. Monitoring, reporting and verification for national REDD + programmes: two proposals

    Energy Technology Data Exchange (ETDEWEB)

    Herold, Martin [Center for Geoinformation, Department of Environmental Science, Wageningen University, Droevendaalsesteeg 3, 6708 PB Wageningen (Netherlands); Skutsch, Margaret, E-mail: martin.herold@wur.nl [Centro de Investigaciones en GeografIa Ambiental, UNAM Campus Morelia (Mexico)

    2011-01-15

    Different options have been suggested by Parties to the UNFCCC (United Nations Framework Convention on Climate Change) for inclusion in national approaches to REDD and REDD + (reduced deforestation, reduced degradation, enhancement of forest carbon stocks, sustainable management of forest, and conservation of forest carbon stocks). This paper proposes that from the practical and technical points of view of designing action for REDD and REDD + at local and sub-national level, as well as from the point of view of the necessary MRV (monitoring, reporting and verification), these should be grouped into three categories: conservation, which is rewarded on the basis of no changes in forest stock, reduced deforestation, in which lowered rates of forest area loss are rewarded, and positive impacts on carbon stock changes in forests remaining forest, which includes reduced degradation, sustainable management of forest of various kinds, and forest enhancement. Thus we have moved degradation, which conventionally is grouped with deforestation, into the forest management group reported as areas remaining forest land, with which it has, in reality, and particularly as regards MRV, much more in common. Secondly, in the context of the fact that REDD/REDD + is to take the form of a national or near-national approach, we argue that while systematic national monitoring is important, it may not be necessary for REDD/REDD + activities, or for national MRV, to be started at equal levels of intensity all over the country. Rather, areas where interventions seem easiest to start may be targeted, and here data measurements may be more rigorous (Tier 3), for example based on stakeholder self-monitoring with independent verification, while in other, untreated areas, a lower level of monitoring may be pursued, at least in the first instance. Treated areas may be targeted for any of the three groups of activities (conservation, reduced deforestation, and positive impact on carbon stock increases in

  12. NEMVP: North American energy measurement and verification protocol

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    This measurement and verification protocol discusses procedures that, when implemented, allow buyers, sellers, and financiers of energy projects to quantify energy conservation measure performance and savings.

  13. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure is hampered by many inherent misconceptions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, encompassing the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified, and possible improvements proposed.

  14. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features.

  15. Verification of IMRT fields by film dosimetry

    International Nuclear Information System (INIS)

    In intensity modulated radiation therapy (IMRT) the aim of an accurate conformal dose distribution is obtained through a complex process. This ranges from the calculation of the optimal fluence distribution by the treatment planning system (TPS) to the dose delivery through a multileaf collimator (MLC), with several segments per beam in the step-and-shoot approach. These considerations make an accurate dosimetric verification of the IM beams mandatory. A high-resolution, integrating dosimeter such as radiographic film permits one to measure the dose simultaneously in a matrix of points, providing a good means of obtaining dose distributions. The intrinsic limitation of film dosimetry is the dependence of the sensitivity on the field size and on the measurement depth. However, the introduction of a scattered-radiation filter permits the use of a single calibration curve for all field sizes and measurement depths. In this paper the quality control procedure developed for dosimetric verification of the IMRT technique is reported. In particular, a system of film dosimetry for the verification of a 6 MV photon beam has been implemented, with the introduction of the scattered-radiation filter into clinical practice, which permits an absolute dose determination with a global uncertainty within 3.4% (1 s.d.). The film has been calibrated for use in both perpendicular and parallel configurations. The work also includes the characterization of the Elekta MLC. Independent ionimetric detectors have been used to check single-point doses. The film dosimetry procedure has been applied to compare the measured absolute dose distributions with the ones calculated by the TPS, both for test and clinical plans. The agreement, quantified by the gamma index, which seldom exceeds 1.5, is satisfactory considering that the comparison is performed between absolute doses.
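
    As an illustration of the gamma-index comparison used to quantify agreement between measured and calculated doses, here is a minimal 1-D global-gamma sketch (the 3 mm / 3 % criterion and all names are illustrative, not the paper's procedure):

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, x, dta=3.0, dd=0.03):
    """Global 1-D gamma index: for each reference point, minimise the
    combined dose-difference / distance-to-agreement metric over all
    evaluated points. dta in mm, dd as a fraction of the maximum dose."""
    d_max = dose_ref.max()
    gammas = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dta) ** 2
        ddiff2 = ((dose_eval - di) / (dd * d_max)) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + ddiff2))
    return gammas

# identical distributions pass everywhere (gamma == 0)
x = np.linspace(0.0, 100.0, 201)          # positions in mm
d = np.exp(-((x - 50.0) / 20.0) ** 2)     # a smooth test "dose" profile
g = gamma_index(d, d, x)
print(g.max() <= 1.0)  # True: every point passes the 3 mm / 3 % criterion
```

    A point passes when its gamma value is at most 1; a clinical workflow would report the fraction of passing points over the measured film plane.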

  16. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2016-09-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we approach the challenges in spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
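
    The ETAS-style power-law smoothing described above can be sketched as a spatial kernel that spreads each simulated epicenter over the whole test region at a rate decaying with epicentral distance (the parameter values r0 and q are invented for the example, not the paper's calibrated ones):

```python
import numpy as np

def smoothed_rate_map(events, grid_x, grid_y, r0=5.0, q=1.5):
    """Spread each simulated epicenter over the grid with a power-law
    kernel (r + r0)**-q, normalised so each event contributes unit
    total rate (an ETAS-style spatial smoothing)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx, dtype=float)
    for ex, ey in events:
        r = np.hypot(gx - ex, gy - ey)      # epicentral distance to each cell
        kernel = (r + r0) ** (-q)
        rate += kernel / kernel.sum()       # normalise per event
    return rate

events = [(10.0, 10.0), (30.0, 25.0)]       # hypothetical fault-element epicenters
xs = np.arange(0.0, 50.0, 1.0)
ys = np.arange(0.0, 50.0, 1.0)
rmap = smoothed_rate_map(events, xs, ys)
print(round(rmap.sum(), 6))  # 2.0 -- total rate equals the number of events
```

    A rate map built this way assigns nonzero forecast rate to off-fault cells, which is what allows comparison against observed epicenters that do not fall on modeled faults.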

  17. Tolerance at arm's length: the Dutch experience.

    Science.gov (United States)

    Schuijer, J

    1990-01-01

    With respect to pedophilia and the age of consent, the Netherlands warrants special attention. Although pedophilia is not as widely accepted in the Netherlands as is sometimes supposed, developments in judicial practice have shown a growing restraint. These developments are a spin-off of related developments in Dutch society. Tolerance in Dutch society has roots that go far back in history and is also a consequence of the way this society is structured. The social changes of the sixties and seventies resulted in a "tolerance at arm's length" for pedophiles, which proved deceptive when the Dutch government proposed to lower the age of consent in 1985: the proposal provoked a vehement public outcry. The prevailing sex laws have been the prime target of protagonists of pedophile emancipation. Around 1960, organized as a group, they began to undertake several activities. In the course of their existence, they came to redefine the issue of pedophilia as one of youth emancipation.

  18. State of the Art: Signature Biometrics Verification

    Directory of Open Access Journals (Sweden)

    Nourddine Guersi

    2010-04-01

    Full Text Available This paper presents a comparative analysis of the performance of three estimation algorithms: Expectation Maximization (EM), Greedy EM Algorithm (GEM) and Figueiredo-Jain Algorithm (FJ), based on Gaussian mixture models (GMMs), for signature biometrics verification. The simulation results have shown significant performance achievements. The test performance of EER=5.49 % for "EM", EER=5.04 % for "GEM" and EER=5.00 % for "FJ", shows that the behavioral information scheme of signature biometrics is robust and has a discriminating power, which can be explored for identity authentication.
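
    As a rough illustration of the GMM machinery underlying such verification scores, the sketch below fits a 1-D two-component mixture with plain EM and scores samples by log-likelihood. This is a toy stand-in, not the paper's EM/GEM/FJ implementations; the data and all names are invented for the example.

```python
import numpy as np

def fit_gmm_em(x, iters=200):
    """Plain EM for a two-component 1-D Gaussian mixture."""
    mu = np.array([x.min(), x.max()])       # deterministic, well-separated init
    var = np.full(2, x.var())
    w = np.full(2, 0.5)
    for _ in range(iters):
        # E-step: responsibilities of each component for each sample
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

def log_likelihood(x, w, mu, var):
    """Verification-style score: total log-likelihood under the fitted GMM."""
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1)).sum()

# two well-separated "feature" clusters; EM should recover both modes
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(10.0, 1.0, 500)])
w, mu, var = fit_gmm_em(x)
# a sample near a mode scores higher than one far from both modes
print(log_likelihood(np.array([0.1]), w, mu, var)
      > log_likelihood(np.array([5.0]), w, mu, var))  # True
```

    In a verification setting, a claimed identity is accepted when the log-likelihood of the presented signature's features under that identity's GMM exceeds a threshold tuned for the desired EER.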

  19. The MODUS approach to formal verification

    DEFF Research Database (Denmark)

    Brewka, Lukasz Jerzy; Soler, José; Berger, Michael Stübert

    2014-01-01

    process of providing competitive products Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution...... Methods/Approach: This paper will describe the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice of the existing open-source model...

  20. Thermionic fuel element verification program—overview

    Science.gov (United States)

    Bohl, Richard J.; Dutt, Dale S.; Dahlberg, Richard C.; Wood, John T.

    1991-01-01

    The TFE Verification Program is in the sixth year of a program to demonstrate the performance and lifetime of thermionic fuel elements for high power space applications. It is jointly funded by SDIO and DOE. Data from accelerated tests in FFTF and EBR-II show component lifetimes longer than 7 years. Alumina insulators have shown good performance at high fast fluence. Graphite-cesium reservoirs based on isotropic graphite also meet requirements. Three TFEs are currently operating in the TRIGA reactor, the oldest having accumulated 15,000 hours of irradiation as of 1 October 1990.

  1. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shared...... data structure to reduce redundant computations in state exploration, which unifies the so-called passed and waiting lists of the traditional reachability algorithm. In the implementation, instead of using standard memory management functions from general-purpose operating systems, we have developed...
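
    The unified passed/waiting idea can be sketched with a toy reachability check: a single hash set records every state ever seen, so the separate passed and waiting lists of the traditional algorithm collapse into one structure and no state is explored twice. This is illustrative only, not UPPAAL's engine.

```python
from collections import deque

def reachable(initial, successors, goal):
    """Reachability with one hash set serving as the unified
    passed+waiting bookkeeping: a state is inserted exactly once,
    when first seen."""
    seen = {initial}            # unified passed/waiting structure
    queue = deque([initial])    # exploration order only
    while queue:
        state = queue.popleft()
        if goal(state):
            return True
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# toy "automaton": states are integers, transitions +1 and *2, bounded at 100
succ = lambda s: [t for t in (s + 1, s * 2) if t <= 100]
print(reachable(1, succ, lambda s: s == 96))   # True
print(reachable(1, succ, lambda s: s == 101))  # False
```

    The payoff of the unified structure is that a state waiting to be explored already blocks redundant re-insertion, which is the redundancy the pipeline architecture is designed to eliminate.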

  2. Thermionic fuel element Verification Program - Overview

    Science.gov (United States)

    Bohl, Richard J.; Dahlberg, Richard C.; Dutt, Dale S.; Wood, John T.

    The TFE Verification Program is in the sixth year of a program to demonstrate the performance and lifetime of thermionic fuel elements for high power space applications. Data from accelerated tests in FFTF and EBR-II show component lifetimes longer than 7 yr. Alumina insulators have shown good performance at high fast fluence. Graphite-cesium reservoirs based on isotropic graphite also meet requirements. Three TFEs are currently operating in the TRIGA reactor, the oldest having accumulated 15,000 hr of irradiation as of 1 October 1990.

  3. Fingerprint Verification based on Gabor Filter Enhancement

    CERN Document Server

    Lavanya, B N; Venugopal, K R

    2009-01-01

    Human fingerprints are reliable characteristics for personal identification, as they are unique and persistent. A fingerprint pattern consists of ridges, valleys and minutiae. In this paper we propose the Fingerprint Verification based on Gabor Filter Enhancement (FVGFE) algorithm for minutiae feature extraction and post-processing based on a 9-pixel neighborhood. Global feature extraction and fingerprint enhancement are based on the Hong enhancement method, which is able to extract local ridge orientation and ridge frequency simultaneously. It is observed that the sensitivity and specificity values are better than those of existing algorithms.
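
    The basic building block of Gabor-filter enhancement is a kernel tuned to the local ridge orientation and ridge frequency estimated by methods such as Hong's. The sketch below generates such a kernel; the parameter values are illustrative and this is not the FVGFE implementation.

```python
import numpy as np

def gabor_kernel(size, theta, freq, sigma):
    """Real Gabor kernel tuned to a local ridge orientation (theta,
    radians) and ridge frequency (cycles per pixel)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # rotate coordinates so the cosine wave runs across the ridges
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

k = gabor_kernel(size=11, theta=0.0, freq=0.1, sigma=4.0)
print(k.shape)            # (11, 11)
print(round(k[5, 5], 3))  # 1.0 -- unit response at the kernel center
```

    Convolving each image block with the kernel matched to its estimated orientation and frequency reinforces the ridge structure and suppresses noise before minutiae extraction.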

  4. Accelerating functional verification of an integrated circuit

    Energy Technology Data Exchange (ETDEWEB)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform bit shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.

  5. Dynamic statechart verification; Dynamische Zustandsautomaten Verifikation

    Energy Technology Data Exchange (ETDEWEB)

    Braitschink, P. [Volkswagen AG, Wolfsburg (Germany); Reuss, H.C. [IVK, Univ. Stuttgart (Germany)

    2005-07-01

    In today's vehicles a great number of functions are realized by statecharts: from electrical window lift systems to entry systems and finally driver assistance systems. The methodology presented in this paper can be used for test verification during function tests both in subsystems and in the complete vehicle system. The principle of the methodology can be applied to different models (UML, MATLAB/ Stateflow, ASCET-SD, etc.) for the representation of statecharts. The standardized data exchange format ASAM/ODX is used to store the data. (orig.)

  6. Investigation of Navier-Stokes Code Verification and Design Optimization

    Science.gov (United States)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate-model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single-element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite-rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design, whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization
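
    The grid-to-grid extrapolation idea can be illustrated with Richardson extrapolation, a simpler relative of the least-squares extrapolation assessed in the dissertation (the central-difference test problem and all numbers are illustrative, not the dissertation's setup):

```python
import numpy as np

def richardson_extrapolate(u_h, u_h2, p):
    """Combine solutions on grid spacings h and h/2, of formal order p,
    into a higher-order estimate."""
    return (2**p * u_h2 - u_h) / (2**p - 1)

# test problem: second-order central difference of sin at x = 1
f, x = np.sin, 1.0
d = lambda h: (f(x + h) - f(x - h)) / (2 * h)
h = 0.1
exact = np.cos(x)
err_h2 = abs(d(h / 2) - exact)
err_rich = abs(richardson_extrapolate(d(h), d(h / 2), p=2) - exact)
print(err_rich < err_h2)  # True: extrapolation beats the finer grid alone
```

    LSE generalizes this scalar combination to whole solution fields, choosing the combination by minimizing the residual of the discretized equations on the fine grid rather than assuming a fixed formal order everywhere.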

  7. 28 CFR 551.4 - Hair length.

    Science.gov (United States)

    2010-07-01

    Hair length. 551.4 Section 551.4, 28 Judicial Administration 2 (2010-07-01). (a) The Warden may not restrict hair length if the inmate keeps it neat and clean. (b) The Warden shall make available to an inmate hair care services which comply with applicable health and sanitation requirements.

  8. Independent Verification Review And Survey of the Argonne National Laboratory Building 301 Footprint Argonne, Illinois

    International Nuclear Information System (INIS)

    The objectives of the verification activities were to provide independent document and field data reviews in evaluating the adequacy and accuracy of the contractor's procedures and FSS results. The onsite verification survey was performed in order to generate independent radiological scan data and sample data (if necessary) for use by the DOE to verify that remedial actions were effective in reaching background levels. ORISE performed onsite verification survey activities of the Building 301 footprint area at ANL on September 14, 2009. The onsite activities included collecting independent gamma scan measurements. The survey results did not identify any locations of elevated radiation levels exceeding twice background within the footprint area. Following onsite activities, ORISE was provided a partial FSS laboratory data package to review. Upon identifying several technical issues with the analyses and reporting, ORISE was informed that those FSS samples were being re-analyzed; however, DOE approved the backfill of the area on September 16, 2009 prior to receiving the re-analyzed results. The subsequent laboratory report for the samples that were re-analyzed was provided to ORISE on November 2, 2009 and although the Building 301 footprint area was already backfilled, the review determined the analytical results were acceptable.

  9. Telomere Length – a New Biomarker in Medicine

    Directory of Open Access Journals (Sweden)

    Agnieszka Kozłowska

    2015-12-01

    Full Text Available A number of xenobiotics in the environment and workplace influence our health and lives. Biomarkers are tools for measuring such exposures and their effects in the organism. Nowadays, telomere length, epigenetic changes, mutations and changes in gene expression patterns have become new molecular biomarkers. Telomeres play the role of a molecular clock, which influences the life expectancy of cells and thus aging, the formation of damage, the development of diseases and carcinogenesis. Telomere length depends on the mechanisms of replication and the activity of telomerase. Telomere length is currently used as a biomarker of susceptibility and/or exposure. This paper describes the role of telomere length as a biomarker of cell aging and oxidative stress, a marker of many diseases including cancer, and a marker of environmental and occupational exposure.

  10. Synthesis and structure elucidation of new μ-oxamido-bridged dicopper(II) complex with in vitro anticancer activity: A combined study from experiment verification and docking calculation on DNA/protein-binding property.

    Science.gov (United States)

    Zhu, Ling; Zheng, Kang; Li, Yan-Tuan; Wu, Zhi-Yong; Yan, Cui-Wei

    2016-02-01

    A new oxamido-bridged dicopper(II) complex with the formula [Cu2(deap)(pic)2], where H2deap and pic represent N,N'-bis[3-(diethylamino)propyl]oxamide and picrate, respectively, was synthesized and characterized by elemental analyses, molar conductance measurements, IR and electronic spectral studies, and single-crystal X-ray diffraction. The crystal structure analyses revealed that the two copper(II) atoms in the dicopper(II) complex are bridged by the trans-deap(2-) ligand with a distance of 5.2116(17) Å, and the coordination environment around the copper(II) atoms can be described as a square-planar geometry. Hydrogen bonding and π-π stacking interactions link the dicopper(II) complex into a three-dimensional infinite network. The DNA/protein-binding properties of the complex are investigated by molecular docking and experimental assays. The results indicate that the dicopper(II) complex can interact with HS-DNA in the mode of intercalation and effectively quench the intrinsic fluorescence of the protein BSA by 1:1 binding, with the most probable binding site in the proximity of Trp134. The in vitro anticancer activities suggest that the complex is active against the selected tumor cell lines, and its IC50 values for SMMC-7721 and HepG2 are lower than those of cisplatin. The effects of the electron density distribution of the terminal ligand and the chelate ring arrangement around the copper(II) ions bridged by symmetric N,N'-bis(substituted)oxamides on the DNA/BSA-binding ability and in vitro anticancer activity are preliminarily discussed.

  11. High-Resolution Fast-Neutron Spectrometry for Arms Control and Treaty Verification

    Energy Technology Data Exchange (ETDEWEB)

    David L. Chichester; James T. Johnson; Edward H. Seabury

    2012-07-01

    Many nondestructive nuclear analysis techniques have been developed to support the measurement needs of arms control and treaty verification, including gross photon and neutron counting, low- and high-resolution gamma spectrometry, time-correlated neutron measurements, and photon and neutron imaging. One notable measurement technique that has not been extensively studied to date for these applications is high-resolution fast-neutron spectrometry (HRFNS). Applied to arms control and treaty verification, HRFNS has the potential to serve as a complementary measurement approach to these other techniques by providing a means to either qualitatively or quantitatively determine the composition and thickness of non-nuclear materials surrounding neutron-emitting materials. The technique uses the normally occurring neutrons present in arms control and treaty verification objects of interest as an internal source of neutrons for performing active-interrogation transmission measurements. Most low-Z nuclei of interest for arms control and treaty verification, including 9Be, 12C, 14N, and 16O, possess fast-neutron resonance features in their absorption cross sections in the 0.5- to 5-MeV energy range. Measuring the selective removal of source neutrons over this energy range, assuming for example a fission-spectrum starting distribution, may be used to estimate the stoichiometric composition of intervening materials between the neutron source and detector. At a simpler level, determination of the emitted fast-neutron spectrum may be used for fingerprinting 'known' assemblies for later use in template-matching tests. As with photon spectrometry, automated analysis of fast-neutron spectra may be performed to support decision making and reporting systems protected behind information barriers. This paper will report recent work at Idaho National Laboratory to explore the feasibility of using HRFNS for arms control and treaty verification applications, including simulations
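
    The transmission measurement at the heart of this approach is Beer-Lambert attenuation. A minimal sketch with invented cross-section numbers (not evaluated nuclear data) shows how a measured transmission at one energy could be inverted for the areal density of an intervening material:

```python
import numpy as np

def transmission(sigma_barns, areal_density):
    """Beer-Lambert neutron transmission T = exp(-N*sigma) for a slab,
    with sigma in barns and areal density N in atoms/barn."""
    return np.exp(-np.asarray(sigma_barns) * areal_density)

def areal_density_from_T(sigma_barns, T):
    """Invert a measured transmission at one energy for the areal density."""
    return -np.log(T) / sigma_barns

# hypothetical on-resonance cross section of 4 barns for a low-Z nucleus
n_true = 0.2                            # atoms/barn, to be "recovered"
t_on = transmission(4.0, n_true)        # simulated measured transmission
n_rec = areal_density_from_T(4.0, t_on)
print(round(n_rec, 6))  # 0.2
```

    Repeating the inversion at several resonance energies, each dominated by a different nuclide, is what would allow a stoichiometric estimate of the intervening low-Z materials.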

  12. Mapping {sup 15}O Production Rate for Proton Therapy Verification

    Energy Technology Data Exchange (ETDEWEB)

    Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States); Min, Chul Hee [Department of Radiological Science, College of Health Science, Yonsei University, Wonju, Kangwon (Korea, Republic of); Testa, Mauro; Winey, Brian [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts (United States); Normandin, Marc D. [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States); Shih, Helen A.; Paganetti, Harald; Bortfeld, Thomas [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts (United States); El Fakhri, Georges, E-mail: elfakhri@pet.mgh.harvard.edu [Center for Advanced Radiological Sciences, Nuclear Medicine and Molecular Imaging, Radiology Department, Massachusetts General Hospital, Boston, Massachusetts (United States)

    2015-06-01

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 ({sup 15}O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of {sup 15}O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of {sup 15}O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using {sup 15}O decay constant, whereas the live thigh activity decayed faster. Most importantly, the {sup 15}O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of {sup 15}O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of {sup 15}O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, {sup 15}O clearance rates may be useful in monitoring permeability changes due to therapy.
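
    The kinetic model behind such a fit can be sketched with the build-up and clearance solution of dC/dt = P - lam*C during the beam, followed by pure clearance afterwards. The sketch below recovers both rates from synthetic, noise-free data with a simple log-linear fit; it is not the paper's estimation procedure, and the production rate is an arbitrary made-up number.

```python
import numpy as np

lam_true = np.log(2) / 122.2      # 15O decay constant, s^-1 (half-life 122.2 s)
P_true = 50.0                     # production rate, arbitrary units/s
t_beam = 60.0                     # beam-on duration, s
t_post = np.arange(0.0, 300.0, 10.0)   # sampling times after beam-off

# build-up solution at beam-off, then exponential clearance samples
c_end = P_true / lam_true * (1 - np.exp(-lam_true * t_beam))
c_post = c_end * np.exp(-lam_true * t_post)

# recover the clearance rate from a log-linear fit of the decay phase,
# then invert the build-up solution for the production rate
lam_fit = -np.polyfit(t_post, np.log(c_post), 1)[0]
P_fit = c_end * lam_fit / (1 - np.exp(-lam_fit * t_beam))
print(round(lam_fit / lam_true, 3), round(P_fit / P_true, 3))  # 1.0 1.0
```

    In live tissue the fitted clearance rate exceeds the physical decay constant, and that excess is the perfusion-related washout the paper proposes to monitor.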

  13. Mapping 15O Production Rate for Proton Therapy Verification

    International Nuclear Information System (INIS)

    Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 (15O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of 15O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of 15O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using 15O decay constant, whereas the live thigh activity decayed faster. Most importantly, the 15O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of 15O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of 15O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, 15O clearance rates may be useful in monitoring permeability changes due to therapy

  14. The new geospatial tools: global transparency enhancing safeguards verification

    International Nuclear Information System (INIS)

    This paper focuses on the importance and potential role of the new, freely available geospatial tools for enhancing IAEA safeguards and how, together with commercial satellite imagery, they can be used to promote 'all-source synergy'. As additional 'open sources', these new geospatial tools have heralded a new era of 'global transparency' and can be used to substantially augment existing information-driven safeguards gathering techniques, procedures, and analyses in the remote detection of undeclared facilities, as well as to support ongoing monitoring and verification of various treaty-relevant (e.g., NPT, FMCT) activities and programs. As an illustration of how these new geospatial tools may be applied, an original exemplar case study shows how it is possible to derive value-added follow-up information on recent public media reporting of a former clandestine underground plutonium production complex (now being converted to a 'tourist attraction' following the site's abandonment by China in the early 1980s). That open-source media reporting, when combined with subsequent commentary found in various Internet-based blogs and wikis, led to independent verification of the reporting with additional ground truth via 'crowdsourcing' (tourist photos as found on 'social networking' venues like Google Earth's Panoramio layer and Twitter). Confirmation of the precise geospatial location of the site (along with a more complete facility characterization incorporating 3-D modeling and visualization) was only made possible following the acquisition of higher-resolution commercial satellite imagery that could be correlated with the reporting, ground photos, and an interior diagram, through original imagery analysis of the overhead imagery.

  15. Verification and validation for induction heating

    Energy Technology Data Exchange (ETDEWEB)

    Lam, Kin [Los Alamos National Laboratory; Tippetts, Trevor B [Los Alamos National Laboratory; Allen, David W [NON LANL

    2008-01-01

    Truchas is a software package being developed at LANL within the Telluride project for predicting the complex physical processes in metal alloy casting. The software was designed to be massively parallel, multi-material, multi-physics, and to run on 3D, fully unstructured meshes. This work describes a Verification and Validation assessment of Truchas for simulating the induction heating phase of a casting process. We used existing data from a simple experiment involving the induction heating of a graphite cylinder, as graphite is a common material used for mold assemblies. Because we do not have complete knowledge of all the conditions and properties in this experiment (as is the case in many other experiments), we performed a parameter sensitivity study, modeled the uncertainties of the most sensitive parameters, and quantified how these uncertainties propagate to the Truchas output response. A verification analysis produced estimates of the numerical error of the Truchas solution to our computational model. The outputs from Truchas runs with randomly sampled parameter values were used for the validation study.
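
    The sample-and-propagate pattern described for the Truchas assessment can be sketched generically: draw the uncertain inputs from their distributions, push each sample through the model, and summarize the output spread. The surrogate model and all numbers below are invented for the example, not Truchas physics.

```python
import numpy as np

def propagate(model, param_dists, n=20000, seed=0):
    """Monte Carlo uncertainty propagation: sample each uncertain input,
    evaluate the model on all samples, and report the output mean/std."""
    rng = np.random.default_rng(seed)
    samples = {k: dist(rng, n) for k, dist in param_dists.items()}
    out = model(**samples)
    return out.mean(), out.std()

# hypothetical heating surrogate: temperature rise = power * time / heat capacity
model = lambda power, heat_cap: power * 30.0 / heat_cap
dists = {
    "power":    lambda rng, n: rng.normal(1000.0, 50.0, n),   # W, ~5% uncertain
    "heat_cap": lambda rng, n: rng.normal(700.0, 35.0, n),    # J/K, ~5% uncertain
}
mean, std = propagate(model, dists)
print(abs(mean - 1000.0 * 30.0 / 700.0) < 1.0)  # True: mean near the nominal value
```

    The resulting output standard deviation is the quantity one compares against numerical-error estimates from the verification analysis: if input uncertainty dominates discretization error, further grid refinement buys little.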

  16. Formal Verification of Self-Assembling Systems

    CERN Document Server

    Sterling, Aaron

    2010-01-01

    This paper introduces the theory and practice of formal verification of self-assembling systems. We interpret a well-studied abstraction of nanomolecular self assembly, the Abstract Tile Assembly Model (aTAM), into Computation Tree Logic (CTL), a temporal logic often used in model checking. We then consider the class of "rectilinear" tile assembly systems. This class includes most aTAM systems studied in the theoretical literature, and all (algorithmic) DNA tile self-assembling systems that have been realized in laboratories to date. We present a polynomial-time algorithm that, given a tile assembly system T as input, either provides a counterexample to T's rectilinearity or verifies whether T has a unique terminal assembly. Using partial order reductions, the verification search space for this algorithm is reduced from exponential size to O(n^2), where n x n is the size of the assembly surface. That reduction is asymptotically the best possible. We report on experimental results obtained by translating tile ...

  17. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
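    The integration of the four sensor technologies can be hinted at with a minimal sketch. Assuming, purely for illustration, independent detections with per-technology probabilities (the real IVSEM treatment is more detailed), the integrated probability of detection is one minus the product of the miss probabilities:

```python
# Minimal sketch of combining detection probabilities across independent
# sensor technologies (seismic, infrasound, radionuclide, hydroacoustic).
# Independence and the probability values are illustrative assumptions.

def system_detection_probability(p_detect):
    """P(at least one technology detects) = 1 - prod(1 - p_i)."""
    p_miss = 1.0
    for p in p_detect.values():
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

p = {"seismic": 0.9, "infrasound": 0.5, "radionuclide": 0.3, "hydroacoustic": 0.6}
p_system = system_detection_probability(p)
```

    Even a weak technology raises the integrated detection probability, which is the "synergy among the technologies" the abstract refers to.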

  18. Verification and validation of control system software

    International Nuclear Information System (INIS)

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs

  19. Comparison of particulate verification techniques study

    Science.gov (United States)

    Rivera, Rachel

    2006-08-01

    The efficacy of five particulate verification techniques on four types of materials was studied. Statistical Analysis Software/JMP 6.0 was used to create a statistically valid design of experiments. In doing so, 35 witness coupons consisting of the four types of materials being studied were intentionally contaminated with particulate fallout. Image analysis was used to characterize the extent of particulate fallout on the coupons and to establish a baseline, or basis of comparison, against the five techniques that were studied. The five particulate verification techniques were the Tapelift, the Particulate Solvent Rinse, the GelPak lift, an in-line vacuum filtration probe, and the Infinity Focusing Microscope (IFM). The four types of materials consisted of magnesium fluoride (MgF2) coated mirrors, composite coated silver aluminum (CCAg), Z93 and NS43G coated aluminum, and silicon (Si) wafers. The vacuum probe was determined to be most effective for Z93, the Tapelift or vacuum probe for MgF2, and the GelPak lift for CCAg and Si substrates. A margin of error for each technique, based on experimental data from two experiments on Si wafer substrates, yielded the following: Tapelift - 67%, Solvent Rinse - 58%, GelPak - 26%, Vacuum Probe - 93%, IFM - to be determined.

  20. SPR Hydrostatic Column Model Verification and Validation.

    Energy Technology Data Exchange (ETDEWEB)

    Bettin, Giorgia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lord, David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rudeen, David Keith [Gram, Inc. Albuquerque, NM (United States)

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and that it is implemented as intended. The cavern BH101 long term nitrogen test was used to validate the model, whose predictions showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
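    The hydrostatic idea behind the HCM can be sketched as follows. Assuming a simple incompressible treatment (the densities, depths, and example pressure are illustrative, not SPR data or the model's actual correlations), wellhead pressure is the pressure at depth minus the weight of the fluid columns standing in the well:

```python
# Hedged sketch of a hydrostatic column: wellhead pressure equals the
# pressure at depth minus rho*g*h summed over the stacked fluid columns
# (e.g. nitrogen over crude oil). All numbers are invented for the example.

G = 9.80665  # standard gravity, m/s^2

def wellhead_pressure(p_at_depth_pa, columns):
    """columns: list of (density_kg_m3, height_m) from wellhead downward."""
    p = p_at_depth_pa
    for rho, h in columns:
        p -= rho * G * h
    return p

# Example: 600 m of nitrogen over 400 m of crude oil above a 12 MPa interface.
p_wh = wellhead_pressure(12e6, [(150.0, 600.0), (850.0, 400.0)])
```

    A small leak changes the interface depths over time, which shifts the predicted wellhead pressure; comparing predicted and measured pressures is the discrimination principle described above.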

  1. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  2. Monitoring/Verification using DMS: TATP Example

    Energy Technology Data Exchange (ETDEWEB)

    Stephan Weeks, Kevin Kyle, Manuel Manard

    2008-05-30

    Field-rugged and field-programmable differential mobility spectrometry (DMS) networks provide highly selective, universal monitoring of vapors and aerosols at detectable levels from persons or areas involved with illicit chemical/biological/explosives (CBE) production. CBE sensor motes used in conjunction with automated fast gas chromatography with DMS detection (GC/DMS) verification instrumentation integrated into situational operations-management systems can be readily deployed and optimized for changing application scenarios. Developing selective DMS motes for a “smart dust” sampling approach with guided, highly selective, fast GC/DMS verification analysis is a compelling way to minimize or prevent the illegal use of explosives or chemical and biological materials. DMS is currently one of the foremost emerging technologies for field separation and detection of gas-phase chemical species. This is due to trace-level detection limits, high selectivity, and small size. Fast GC is the leading field analytical method for gas phase separation of chemical species in complex mixtures. Low-thermal-mass GC columns have led to compact, low-power field systems capable of complete analyses in 15–300 seconds. A collaborative effort optimized a handheld, fast GC/DMS, equipped with a non-radioactive ionization source, for peroxide-based explosive measurements.

  3. Simple dose verification system for radiotherapy radiation

    International Nuclear Information System (INIS)

    The aim of this paper is to investigate an accurate and convenient quality assurance programme that should be included in the dosimetry system for radiotherapy-level radiation. We designed a mailed solid phantom and used TLD-100 chips and a Rexon UL320 reader for the purpose of dosimetry quality assurance in Taiwanese radiotherapy centers. After being assembled, the solid polystyrene phantom weighed only 375 g, which made it suitable for mailing. The Monte Carlo BEAMnrc code was applied in calculations of the dose conversion factor between the water and polystyrene phantoms; the dose conversion factor measurements were obtained by switching the TLDs at the same calibration depth in water and in the solid phantom to measure the absorbed dose and verify the accuracy of the theoretical calculation results. The experimental results showed that the dose conversion factors from TLD measurements and the calculated values from BEAMnrc were in good agreement, with a difference within 0.5%. Ten radiotherapy centers were instructed to deliver an absorbed dose of 2 Gy to the TLDs on the central beam axis. The measured doses were compared with the planned ones. A total of 21 beams were checked. The dose verification differences under reference conditions for 60Co and high-energy X-rays of 6, 10 and 15 MV were within 4%, which proved the feasibility of applying the method suggested in this work to radiotherapy dose verification
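    The pass/fail comparison described above reduces to a percent-difference check against the 4% tolerance; a minimal sketch, with made-up doses rather than the paper's data:

```python
# Simple sketch of a dose-verification comparison: percent difference
# between a TLD-measured dose and the planned dose, flagged against a
# 4% tolerance. The example doses are invented.

def dose_check(measured_gy, planned_gy, tolerance_pct=4.0):
    diff_pct = 100.0 * (measured_gy - planned_gy) / planned_gy
    return diff_pct, abs(diff_pct) <= tolerance_pct

diff, ok = dose_check(2.05, 2.00)  # 2 Gy prescribed on the central axis
```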

  4. Program verification using symbolic game semantics

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar

    2014-01-01

    We introduce a new symbolic representation of algorithmic game semantics, and show how it can be applied for efficient verification of open (incomplete) programs. The focus is on an Algol-like programming language which contains the core ingredients of imperative and functional languages, especially on its second-order recursion-free fragment with infinite data types. We revisit the regular-language representation of game semantics of this language fragment. By using symbolic values instead of concrete ones, we generalize the standard notions of regular-language and automata representations of game semantics to that of corresponding symbolic representations. In this way programs with infinite data types, such as integers, can be expressed as finite-state symbolic automata although the standard automata representation is infinite-state, i.e. the standard regular-language representation has...

  5. Modular verification of linked lists with views via separation logic

    DEFF Research Database (Denmark)

    Jensen, Jonas Braband; Birkedal, Lars; Sestoft, Peter

    2010-01-01

    We present a separation logic specification and verification of linked lists with views, a data structure from the C5 collection library for C#. A view is a generalization of the well-known concept of an iterator. Linked lists with views form an interesting case study for verification since they ...

  6. Certification and verification for calmac flat plate solar collector

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-27

    This document contains information used in the certification and verification of the Calmac Flat Plate Collector. Contained are such items as test procedures and results, information on materials used, Installation, Operation, and Maintenance Manuals, and other information pertaining to the verification and certification.

  7. Neutron spectrometric methods for core inventory verification in research reactors

    CERN Document Server

    Ellinger, A; Hansen, W; Knorr, J; Schneider, R

    2002-01-01

    As a consequence of Non-Proliferation Treaty safeguards, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate, and the inspection methods are continually improved. The Core Inventory Verification method is therefore being developed as an indirect way to verify the core inventory and to check the declared operation of research reactors.

  8. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the...

  9. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, A.; Larsen, K.G.; Møller, M.H.;

    2012-01-01

    of a leader election protocol: modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional approach to that of classical verification, showing a huge...

  10. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  11. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  12. Sciduction: Combining Induction, Deduction, and Structure for Verification and Synthesis

    OpenAIRE

    Seshia, Sanjit A.

    2012-01-01

    Even with impressive advances in automated formal methods, certain problems in system verification and synthesis remain challenging. Examples include the verification of quantitative properties of software involving constraints on timing and energy consumption, and the automatic synthesis of systems from specifications. The major challenges include environment modeling, incompleteness in specifications, and the complexity of underlying decision problems. This position paper proposes sciductio...

  13. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought....

  14. 20 CFR 211.15 - Verification of compensation claimed.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Verification of compensation claimed. 211.15... CREDITABLE RAILROAD COMPENSATION § 211.15 Verification of compensation claimed. Compensation claimed by an... Board before it may be credited. An employee's claim to compensation not credited shall be processed...

  15. 19 CFR 10.309 - Verification of documentation.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Verification of documentation. 10.309 Section 10.309 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF... Trade Agreement § 10.309 Verification of documentation. Any evidence of country of origin or of...

  16. A verification logic representation of indeterministic signal states

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1991-01-01

    The integration of modern CAD tools with formal verification environments requires translation from a hardware description language to a verification logic. A signal representation including both unknown state and a degree of strength indeterminacy is essential for correct modeling of many VLSI circuit designs. A higher-order logic theory of indeterministic logic signals is presented.

  17. 21 CFR 120.11 - Verification and validation.

    Science.gov (United States)

    2010-04-01

    ... § 120.11 Verification and validation. (a) Verification. Each processor shall verify that the Hazard Analysis and Critical Control Point (HACCP) system is being implemented according to design. (1... processor to determine whether such complaints relate to the performance of the HACCP plan or...

  18. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes into compliance with verification and validation requirements

  19. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    The Neutrons Laboratory is developing a project aimed at the construction of a portable test system for verifying the functioning conditions of neutron area monitors. This device will allow users to verify, at their own installations, that the calibration of their instruments has been maintained, avoiding the use of equipment whose response to a neutron beam is inadequate

  20. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    Energy Technology Data Exchange (ETDEWEB)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity.
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation

  1. Association of Telomere Length with Breast Cancer Prognostic Factors

    Science.gov (United States)

    Têtu, Bernard; Maunsell, Elizabeth; Poirier, Brigitte; Montoni, Alicia; Rochette, Patrick J.; Diorio, Caroline

    2016-01-01

    Introduction Telomere length, a marker of cell aging, seems to be affected by the same factors thought to be associated with breast cancer prognosis. Objective To examine associations of peripheral blood cell-measured telomere length with traditional and potential prognostic factors in breast cancer patients. Methods We conducted a cross-sectional analysis of data collected before surgery from 162 breast cancer patients recruited consecutively between 01/2011 and 05/2012, at a breast cancer reference center. Data on the main lifestyle factors (smoking, alcohol consumption, physical activity) were collected using standardized questionnaires. Anthropometric factors were measured. Tumor biological characteristics were extracted from pathology reports. Telomere length was measured using a highly reproducible quantitative PCR method in peripheral white blood cells. Spearman partial rank-order correlations and multivariate general linear models were used to evaluate relationships between telomere length and prognostic factors. Results Telomere length was positively associated with total physical activity (rs = 0.17, P = 0.033; Ptrend = 0.069), occupational physical activity (rs = 0.15, P = 0.054; Ptrend = 0.054) and transportation-related physical activity (rs = 0.19, P = 0.019; Ptrend = 0.005). Among post-menopausal women, telomere length remained positively associated with total physical activity (rs = 0.27, P = 0.016; Ptrend = 0.054) and occupational physical activity (rs = 0.26, P = 0.021; Ptrend = 0.056) and was only associated with transportation-related physical activity among pre-menopausal women (rs = 0.27, P = 0.015; Ptrend = 0.004). No association was observed between telomere length and recreational or household activities, other lifestyle factors or traditional prognostic factors. Conclusions Telomeres are longer in more active breast cancer patients. Since white blood cells are involved in anticancer immune responses, these findings suggest that even regular low
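    The Spearman rank correlation reported above can be sketched in a self-contained way (the study's covariate adjustment and partial-correlation machinery are omitted, and the data below are invented):

```python
# Self-contained Spearman rank correlation: rank both variables (with
# average ranks for ties), then compute the Pearson correlation of the
# ranks. The activity and telomere values are illustrative only.

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):  # assign average ranks over runs of ties
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

activity = [2.0, 5.5, 1.0, 7.0, 3.5]       # e.g. MET-hours/week (invented)
telomere = [0.91, 1.10, 0.88, 1.25, 1.02]  # relative T/S ratio (invented)
rs = spearman(activity, telomere)
```

    Because the statistic depends only on ranks, it is robust to the skewed distributions typical of activity and telomere measurements.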

  2. Work Breakdown Structure: A Tool for Software Project Scope Verification

    Directory of Open Access Journals (Sweden)

    Robert T. Hans

    2013-07-01

    Full Text Available Software project scope verification is a very important process in project scope management and it needs to be performed properly and thoroughly so as to avoid project rework and scope creep. Moreover, software scope verification is crucial in the process of delivering exactly what the customer requested and minimizing project scope changes. A well-defined software scope eases the process of scope verification and contributes to project success. Furthermore, a deliverable-oriented WBS provides a road map to a well-defined software scope of work. It is on this basis that this paper extends the use of a deliverable-oriented WBS to the scope verification process. This paper argues that a deliverable-oriented WBS is a tool for software scope verification

  3. Approaches to verification of two-dimensional water quality models

    Energy Technology Data Exchange (ETDEWEB)

    Butkus, S.R. (Tennessee Valley Authority, Chattanooga, TN (USA). Water Quality Dept.)

    1990-11-01

    The verification of a water quality model is the procedure most needed by decision makers evaluating model predictions, but it is often inadequate or not done at all. The results of a properly conducted verification provide the decision makers with an estimate of the uncertainty associated with model predictions. Several statistical tests are available for quantifying the performance of a model. Six methods of verification were evaluated using an application of the BETTER two-dimensional water quality model for Chickamauga reservoir. Model predictions for ten state variables were compared to observed conditions from 1989. Spatial distributions of the verification measures showed the model predictions were generally adequate, except at a few specific locations in the reservoir. The most useful statistic was the mean standard error of the residuals. Quantifiable measures of model performance should be calculated during calibration and verification of future applications of the BETTER model. 25 refs., 5 figs., 7 tabs.
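    Two of the simpler verification measures mentioned, bias (mean error) and the standard error of the residuals, can be sketched directly (the values are illustrative, not the Chickamauga results):

```python
# Sketch of residual-based verification statistics for model predictions
# vs. observations: mean error (bias) and the standard error of the
# residuals. The example data are invented.

def residual_stats(predicted, observed):
    res = [p - o for p, o in zip(predicted, observed)]
    n = len(res)
    mean_err = sum(res) / n                              # bias
    var = sum((r - mean_err) ** 2 for r in res) / (n - 1)
    return mean_err, var ** 0.5                          # bias, std. error

# e.g. dissolved oxygen, mg/L, at four stations
bias, se = residual_stats([8.1, 7.6, 6.9, 7.3], [8.0, 7.8, 7.0, 7.1])
```

    Reporting both numbers separates systematic offset (bias) from scatter (standard error), which is what lets a decision maker attach an uncertainty to a prediction.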

  4. A Formal Verification Methodology for Checking Data Integrity

    CERN Document Server

    Umezawa, Yasushi

    2011-01-01

    Formal verification techniques have been playing an important role in pre-silicon validation processes. One of the most important points in performing formal verification is to define good verification scopes; we should define clearly what is to be verified formally on the designs under test. We considered the following three practical requirements when we defined the scope of formal verification: (a) hard to verify, (b) small to handle, and (c) easy to understand. Our novel approach is to break down generic system-level properties into stereotyped block-level properties and to define requirements for Verifiable RTL. Consequently, each designer, instead of verification experts, can describe properties of the design easily, and formal model checking can be applied systematically and thoroughly to all the leaf modules. During the development of a component chip for server platforms, we focused on RAS (Reliability, Availability, and Serviceability) features and described more than 2000 properties in...

  5. Analysis and verification of multi-channel FULMS algorithm for adaptive feedforward active vibration control

    Institute of Scientific and Technical Information of China (English)

    朱晓锦; 黄全振; 高志远; 高守玮; 姜恩宇

    2011-01-01

    One of the key issues in active vibration control of flexible structures is the control strategy and method. A multi-channel FULMS algorithm for adaptive feedforward control systems was proposed to address the difficulty of obtaining the reference signal required by the FXLMS algorithm. Based on the controller architecture constructed and illustrated here, the multi-channel FULMS algorithm procedure was deduced and described in general. To verify the feasibility and superiority of the proposed control algorithm, performance comparisons between single-channel and multi-channel FXLMS and FULMS configurations were made using MATLAB. The analysis results indicated that multi-channel control performs better than single-channel control, while the FULMS algorithm outperforms the FXLMS algorithm. Finally, using a piezoelectric flexible epoxy-resin plate to simulate a spacecraft solar panel, an active vibration suppression experimental platform was established together with its measurement and control system. The experimental results showed that the proposed FULMS algorithm is feasible and efficient, with fast convergence and good control performance.
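    The adaptive feedforward idea can be sketched with a plain LMS FIR canceller (FULMS's IIR feedback path and FXLMS's secondary-path filtering are omitted, and the signals and path coefficients are invented): an FIR filter adapts so that its output cancels a disturbance that is a filtered version of the reference.

```python
# Toy adaptive feedforward canceller in the spirit of the (F)XLMS family:
# an FIR filter w is adapted with the LMS rule w <- w + mu*e*x so that its
# output y cancels the disturbance d(n), here an (assumed) unknown FIR
# filtering of the reference x(n).
import math, random

random.seed(0)
N, taps, mu = 4000, 8, 0.01
x = [math.sin(0.3 * n) + 0.1 * random.gauss(0, 1) for n in range(N)]
h = [0.6, -0.3, 0.2]  # "unknown" primary path (illustrative)
d = [sum(h[k] * x[n - k] for k in range(len(h)) if n - k >= 0) for n in range(N)]

w = [0.0] * taps
err = []
for n in range(N):
    xs = [x[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
    y = sum(wk * xk for wk, xk in zip(w, xs))  # anti-vibration output
    e = d[n] - y                               # residual at the error sensor
    w = [wk + mu * e * xk for wk, xk in zip(w, xs)]
    err.append(e)

early = sum(e * e for e in err[:200]) / 200
late = sum(e * e for e in err[-200:]) / 200  # residual power after adaptation
```

    The residual power drops by orders of magnitude as w converges toward h, which is the convergence behavior the abstract reports for the FULMS controller in its experiments.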

  6. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Sibois, R., E-mail: romain.sibois@vtt.fi [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Salminen, K.; Siuko, M. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland); Mattila, J. [Tampere University of Technology, Korkeakoulunkatu 6, 33720 Tampere (Finland); Määttä, T. [VTT Technical Research Centre of Finland, P.O. Box 1300, 33101 Tampere (Finland)

    2013-10-15

    Highlights: • Verification and validation process for ITER remote handling systems. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is written based on the results of a project, “Verification and validation (V&V) of the ITER RH system using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V&V of the ITER RH system design within a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype and surveys current trends in the use of virtual prototyping in industry during the early design phase. Based on the survey of best industrial practices, this paper proposes ways to improve the V&V process for ITER RH systems utilizing DMUs.

  7. Enhancement of the use of digital mock-ups in the verification and validation process for ITER remote handling systems

    International Nuclear Information System (INIS)

    Highlights: • Verification and validation process for ITER remote handling systems. • Verification and validation framework for complex engineering systems. • Verification and validation roadmap for the digital modelling phase. • Importance of product life-cycle management in the verification and validation framework. -- Abstract: The paper is part of EFDA's European Goal Oriented Training programme on remote handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. This paper is written based on the results of a project, “Verification and validation (V&V) of the ITER RH system using digital mock-ups (DMUs)”. The purpose of this project is to study an efficient approach to using DMUs for the V&V of the ITER RH system design within a systems engineering (SE) framework. This paper reviews the definitions of DMU and virtual prototype and surveys current trends in the use of virtual prototyping in industry during the early design phase. Based on the survey of best industrial practices, this paper proposes ways to improve the V&V process for ITER RH systems utilizing DMUs.

  8. Burnout among physiotherapists and length of service

    Directory of Open Access Journals (Sweden)

    Zbigniew Śliwiński

    2014-04-01

    Full Text Available Objectives: The aim of this study was to identify factors that contribute to the development of burnout among physiotherapists with different lengths of service in physiotherapy. Material and Methods: The following research tools were used to study burnout: the Life Satisfaction Questionnaire (LSQ), based on the FLZ (Fragebogen zur Lebenszufriedenheit) by Fahrenberg, Myrtek, Schumacher, and Brähler; the Burnout Scale Inventory (BSI) by Steuden and Okła; and an ad hoc questionnaire to collect socio-demographic data. The survey was anonymous and voluntary and involved a group of 200 active physiotherapists working in Poland. Results: A statistical analysis revealed significant differences in overall life satisfaction between the length-of-service groups (p = 0.03). Physiotherapists with more than 15 years of service reported greater satisfaction than those with less than 5 years and those with 5-15 years of service. The results suggest that burnout in those with 5-15 years of service is higher in physiotherapists working in health care centers and increases with age and greater financial satisfaction, while it decreases with greater satisfaction with friend and family relations and greater satisfaction with one's work and profession. In those with more than 15 years of service, burnout increases in the case of working in a setting other than a health care or educational center and decreases with greater satisfaction with one's work and profession. Conclusions: Job satisfaction and a satisfying family life prevent burnout among physiotherapists with 5-15 years of service in the profession. Financial satisfaction, age and being employed in health care may cause burnout among physiotherapists with 5-15 years of service. Physiotherapists with more than 15 years of service experience more burnout if they work in a setting other than a health care or educational center and less burnout if they are satisfied with their profession.

  9. Hydrodynamic length-scale selection in microswimmer suspensions

    Science.gov (United States)

    Heidenreich, Sebastian; Dunkel, Jörn; Klapp, Sabine H. L.; Bär, Markus

    2016-08-01

    A universal characteristic of mesoscale turbulence in active suspensions is the emergence of a typical vortex length scale, distinctly different from the scale invariance of turbulent high-Reynolds number flows. Collective length-scale selection has been observed in bacterial fluids, endothelial tissue, and active colloids, yet the physical origins of this phenomenon remain elusive. Here, we systematically derive an effective fourth-order field theory from a generic microscopic model that allows us to predict the typical vortex size in microswimmer suspensions. Building on a self-consistent closure condition, the derivation shows that the vortex length scale is determined by the competition between local alignment forces, rotational diffusion, and intermediate-range hydrodynamic interactions. Vortex structures found in simulations of the theory agree with recent measurements in Bacillus subtilis suspensions. Moreover, our approach yields an effective viscosity enhancement (reduction), as reported experimentally for puller (pusher) microorganisms.
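
    The class of theories the abstract refers to can be sketched schematically. The following incompressible fourth-order continuum equation is a generic form used in mean-field models of mesoscale active turbulence; the symbols λ₀, α, β, Γ₀, Γ₂ are illustrative and are not the paper's closure-derived coefficients:

```latex
% Schematic fourth-order model for an effective microswimmer velocity
% field v (coefficients illustrative, not the paper's derived values):
\partial_t \mathbf{v} + \lambda_0 (\mathbf{v}\cdot\nabla)\mathbf{v}
  = -\nabla p - \bigl(\alpha + \beta\,|\mathbf{v}|^2\bigr)\mathbf{v}
    + \Gamma_0 \nabla^2 \mathbf{v} - \Gamma_2 \nabla^4 \mathbf{v},
\qquad \nabla\cdot\mathbf{v} = 0.
```

    In such models, length-scale selection appears already at the linear level: for Γ₀ < 0 and Γ₂ > 0, linearizing about 𝐯 = 0 gives a growth rate maximized at wavenumber k* = √(−Γ₀ / 2Γ₂), i.e. a preferred vortex scale Λ = 2π √(2Γ₂ / |Γ₀|). The cited derivation ties such coefficients to alignment forces, rotational diffusion, and hydrodynamic interactions.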

  10. Image Hashes as Templates for Verification

    Energy Technology Data Exchange (ETDEWEB)

    Janik, Tadeusz; Jarman, Kenneth D.; Robinson, Sean M.; Seifert, Allen; McDonald, Benjamin S.; White, Timothy A.

    2012-07-17

    Imaging systems can provide measurements that confidently assess characteristics of nuclear weapons and dismantled weapon components, and such assessment will be needed in future verification for arms control. Yet imaging is often viewed as too intrusive, raising concern about the ability to protect sensitive information. In particular, the prospect of using image-based templates for verifying the presence or absence of a warhead, or of the declared configuration of fissile material in storage, may be rejected out-of-hand as being too vulnerable to violation of information barrier (IB) principles. Development of a rigorous approach for generating and comparing reduced-information templates from images, and assessing the security, sensitivity, and robustness of verification using such templates, are needed to address these concerns. We discuss our efforts to develop such a rigorous approach based on a combination of image-feature extraction and encryption, utilizing hash functions to confirm proffered declarations, providing strong classified data security while maintaining high confidence for verification. The proposed work is focused on developing secure, robust, tamper-sensitive and automatic techniques that may enable the comparison of non-sensitive hashed image data outside an IB. It is rooted in research on so-called perceptual hash functions for image comparison, at the interface of signal/image processing, pattern recognition, cryptography, and information theory. Such perceptual or robust image hashing—which, strictly speaking, is not truly cryptographic hashing—has extensive application in content authentication and information retrieval, database search, and security assurance. Applying and extending the principles of perceptual hashing to imaging for arms control, we propose techniques that are sensitive to altering, forging and tampering of the imaged object yet robust and tolerant to content-preserving image distortions and noise. Ensuring that the
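
    The perceptual-hashing idea at the core of this proposal can be illustrated with the simplest member of the family, an "average hash". This is a minimal sketch under stated assumptions: the function names and the toy 2×2 "images" are hypothetical, and a real template system would extract far more robust features than a brightness threshold.

```python
# Toy perceptual ("average") hash: each bit records whether a pixel is
# brighter than the image mean, so mild content-preserving distortions
# flip few bits, while altering the content flips many.

def average_hash(pixels):
    """Hash a 2D grid of grayscale values (0-255) into a bit tuple."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def matches_template(candidate, reference, max_distance=2):
    """Accept the declaration if the hashes are close enough."""
    return hamming(candidate, reference) <= max_distance

reference = average_hash([[10, 200], [220, 30]])   # enrolled template
noisy     = average_hash([[12, 198], [223, 28]])   # mild sensor noise
tampered  = average_hash([[200, 10], [30, 220]])   # content rearranged
```

    Only the bit tuples would cross the information barrier: the noisy re-measurement hashes to the same bits as the reference and is accepted, while the rearranged scene flips every bit and is rejected, without either side revealing the underlying image.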

  11. Multiple different defense mechanisms are activated in the young transgenic tobacco plants which express the full length genome of the Tobacco mosaic virus, and are resistant against this virus.

    Science.gov (United States)

    Jada, Balaji; Soitamo, Arto J; Siddiqui, Shahid Aslam; Murukesan, Gayatri; Aro, Eva-Mari; Salakoski, Tapio; Lehto, Kirsi

    2014-01-01

    Previously described transgenic tobacco lines express the full-length infectious Tobacco mosaic virus (TMV) genome under the 35S promoter (Siddiqui et al., 2007. Mol Plant Microbe Interact, 20: 1489-1494). Through their young stages these plants exhibit strong resistance against both the endogenously expressed and exogenously inoculated TMV, but at the age of about 7-8 weeks they break into TMV infection, with typical severe virus symptoms. Infections with some other viruses (Potato viruses Y, A, and X) induce the breaking of the TMV resistance and lead to synergistic proliferation of both viruses. To deduce the gene functions related to this early resistance, we have performed microarray analysis of the transgenic plants during the early resistant stage and after the resistance break, and also of TMV-infected wild-type tobacco plants. Comparison of these transcriptomes to those of corresponding wild-type healthy plants indicated that 1362, 1150 and 550 transcripts were up-regulated in the transgenic plants before and after the resistance break, and in the TMV-infected wild-type tobacco plants, respectively, and 1422, 1200 and 480 transcripts were down-regulated in these plants, respectively. These transcriptome alterations were distinctly different between the three types of plants, and it appears that several different mechanisms, such as the enhanced expression of the defense, hormone signaling and protein degradation pathways, contributed to the TMV resistance in the young transgenic plants. In addition to these alterations, we also observed a distinct and unique gene expression alteration in these plants, which was the strong suppression of the translational machinery. This may also contribute to the resistance by slowing down the synthesis of viral proteins. Viral replication potential may also be suppressed, to some extent, by the reduction of the translation initiation and elongation factors eIF-3 and eEF1A and B, which are required for TMV replication.

  12. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    Energy Technology Data Exchange (ETDEWEB)

    Hautamaeki, J.; Tiitta, A. [VTT Chemical Technology, Espoo (Finland)

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial-defect level before final disposal into the geologic repository. The rods from the research reactor may be verified at the gross-defect level. Developing a measurement system for partial-defect verification is a complicated and time-consuming task. Passive High Energy Gamma Emission Tomography and the Fork Detector combined with gamma spectrometry are the most promising measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in planning the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for final disposal; for example, criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage; the other is the encapsulation plant. Crucial viewpoints are which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. This report also assesses the integrity of the fuel assemblies after the wet intermediate storage period, because the assemblies have to withstand the handling operations of the verification measurements. (orig.)

  13. A GIS support system for declaration and verification

    International Nuclear Information System (INIS)

    Full text: The timely detection of a diversion of a significant amount of nuclear material from the civil cycle represents a complex activity that requires the use of powerful support systems. In this field the authors developed SIT (Safeguards Inspection Tool), an integrated platform for collecting, managing and analysing data from a variety of sources to support declaration and verification activities. The information dealt with is that requested by both the INFCIRC/153 and INFCIRC/540 protocols. SIT is based on a low-cost Geographic Information System platform, and extensive use is made of commercial software to reduce maintenance costs. The system has been developed using ARCVIEW GIS for Windows NT platforms. SIT is conceived as an integrator of multimedia information stored in local and remote databases; efforts have been focused on the automation of several tasks in order to produce a user-friendly system. The main characteristics of SIT are: capability to deal with multimedia data, e.g. text, images, video, using user-selected COTS; easy access to external databases, e.g. Oracle, Informix, Sybase, MS-Access, directly from the site map; selected access to open-source information via the Internet; capability to easily geo-reference site maps, to generate thematic layers of interest and to perform spatial analysis; capability of performing aerial and satellite image analysis operations, e.g. rectification, change detection, feature extraction; capability to easily add and run external models, e.g. for material data accounting, completeness checks, air dispersion modelling and material flow graph generation, and to present results in graphical form; capability to use a geo-positioning system (GPS) with a portable computer. SIT is at an advanced stage of development and will very soon be interfaced with VERITY, a powerful Web search engine, in order to allow open-source information retrieval from geographical maps. The paper will describe the main features of SIT and the advantages of

  14. Completely anonymous multi-recipient signcryption scheme with public verification.

    Directory of Open Access Journals (Sweden)

    Liaojun Pang

    Full Text Available Most of the existing multi-recipient signcryption schemes do not take the anonymity of recipients into consideration because the list of the identities of all recipients must be included in the ciphertext as a necessary element for decryption. Although the signer's anonymity has been taken into account in several alternative schemes, these schemes often suffer from the cross-comparison attack and joint conspiracy attack. That is to say, there are few schemes that can achieve complete anonymity for both the signer and the recipient. However, in many practical applications, such as network conferencing, both the signer's and the recipient's anonymity should be considered carefully. Motivated by these concerns, we propose a novel multi-recipient signcryption scheme with complete anonymity. The new scheme can achieve both the signer's and the recipient's anonymity at the same time. Each recipient can easily judge whether the received ciphertext is from an authorized source, but cannot determine the real identity of the sender, and at the same time, each participant can easily check decryption permission, but cannot determine the identity of any other recipient. The scheme also provides a public verification method which enables anyone to publicly verify the validity of the ciphertext. Analyses show that the proposed scheme is more efficient in terms of computation complexity and ciphertext length and possesses more advantages than existing schemes, which makes it suitable for practical applications. The proposed scheme could be used for network conferences, pay-TV or DVD broadcasting applications to solve the secure communication problem without violating the privacy of each participant.

  15. Accuracy verification methods theory and algorithms

    CERN Document Server

    Mali, Olli; Repin, Sergey

    2014-01-01

    The importance of accuracy verification methods was understood at the very beginning of the development of numerical analysis. Recent decades have seen a rapid growth of results related to adaptive numerical methods and a posteriori estimates. However, in this important area there often exists a noticeable gap between mathematicians creating the theory and researchers developing applied algorithms that could be used in engineering and scientific computations for guaranteed and efficient error control. The goals of the book are to (1) give a transparent explanation of the underlying mathematical theory in a style accessible not only to advanced numerical analysts but also to engineers and students; (2) present detailed step-by-step algorithms that follow from the theory; and (3) discuss their advantages and drawbacks and areas of applicability, and give recommendations and examples.
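
    A concrete instance of the guaranteed a posteriori estimates discussed here is the functional-type majorant for the model problem −Δu = f in Ω with u = 0 on ∂Ω (a standard textbook example in this literature; the notation below is illustrative): for any conforming approximation v and any flux field y ∈ H(div, Ω),

```latex
\| \nabla(u - v) \|_{\Omega}
  \;\le\; \| y - \nabla v \|_{\Omega}
  \;+\; C_{\Omega}\, \| \operatorname{div} y + f \|_{\Omega},
```

    where C_Ω is the Friedrichs constant of the domain. The right-hand side is fully computable from v, y and f, holds for arbitrary admissible choices (a guaranteed bound), and becomes sharp as y approaches the exact flux ∇u.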

  16. Multipartite entanglement verification resistant against dishonest parties

    CERN Document Server

    Pappa, Anna; Wehner, Stephanie; Diamanti, Eleni; Kerenidis, Iordanis

    2011-01-01

    Future quantum information networks will likely consist of quantum and classical agents, who have the ability to communicate in a variety of ways with trusted and untrusted parties and securely delegate computational tasks to untrusted large-scale quantum computing servers. Multipartite quantum entanglement is a fundamental resource for such a network and hence it is imperative to study the possibility of verifying a multipartite entanglement source in a way that is efficient and provides strong guarantees even in the presence of multiple dishonest parties. In this work, we show how an agent of a quantum network can perform a distributed verification of a multipartite entangled source with minimal resources, which is, nevertheless, resistant against any number of dishonest parties. Moreover, we provide a tight tradeoff between the level of security and the distance between the state produced by the source and the ideal maximally entangled state. Last, by adding the resource of a trusted common random source, ...

  17. Automated Verification of Quantum Protocols using MCMAS

    Directory of Open Access Journals (Sweden)

    F. Belardinelli

    2012-07-01

    Full Text Available We present a methodology for the automated verification of quantum protocols using MCMAS, a symbolic model checker for multi-agent systems. The method is based on the logical framework developed by D'Hondt and Panangaden for investigating epistemic and temporal properties, built on the model for Distributed Measurement-based Quantum Computation (DMC), an extension of the Measurement Calculus to distributed quantum systems. We describe the translation map from DMC to interpreted systems, the typical formalism for reasoning about time and knowledge in multi-agent systems. Then, we introduce dmc2ispl, a compiler into the input language of the MCMAS model checker. We demonstrate the technique by verifying the Quantum Teleportation Protocol, and discuss the performance of the tool.

  18. Security Protocols: Specification, Verification, Implementation, and Composition

    DEFF Research Database (Denmark)

    Almousa, Omar

    An important aspect of Internet security is the security of the cryptographic protocols it deploys. We need to make sure that such protocols achieve their goals, whether in isolation or in composition, i.e., security protocols must not suffer from any flaw that enables hostile intruders to break … their security. Among others, tools like OFMC [MV09b] and ProVerif [Bla01] are quite efficient for the automatic formal verification of a large class of protocols. These tools use different approaches such as symbolic model checking or static analysis. Either approach has its own pros and cons, and therefore, we … called SPS (Security Protocol Specification) language, that enables users, without requiring deep expertise in formal models from them, to specify a wide range of real-world protocols in a simple and intuitive way. Thus, SPS allows users to verify their protocols using different tools, and generate …

  19. Automated Verification of Practical Garbage Collectors

    CERN Document Server

    Hawblitzel, Chris

    2010-01-01

    Garbage collectors are notoriously hard to verify, due to their low-level interaction with the underlying system and the general difficulty in reasoning about reachability in graphs. Several papers have presented verified collectors, but either the proofs were hand-written or the collectors were too simplistic to use on practical applications. In this work, we present two mechanically verified garbage collectors, both practical enough to use for real-world C# benchmarks. The collectors and their associated allocators consist of x86 assembly language instructions and macro instructions, annotated with preconditions, postconditions, invariants, and assertions. We used the Boogie verification condition generator and the Z3 automated theorem prover to verify this assembly language code mechanically. We provide measurements comparing the performance of the verified collector with that of the standard Bartok collectors on off-the-shelf C# benchmarks, demonstrating their competitiveness.
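
    The kind of reachability contract such a verifier must discharge can be shown on a toy model. The following Python sketch (an illustration only, not the paper's annotated x86/Boogie artifact) checks dynamically the postcondition that the verified collectors prove statically: on exit from marking, the marked set is closed under pointer traversal.

```python
# Toy mark-sweep model. heap maps object id -> list of child ids.

def mark(heap, roots):
    """Return the set of objects reachable from the roots."""
    marked = set()
    worklist = list(roots)
    while worklist:
        obj = worklist.pop()
        if obj in marked:
            continue
        marked.add(obj)          # invariant: marked objects are reachable
        worklist.extend(heap[obj])
    # Postcondition: the marked set is closed under pointer traversal.
    assert all(child in marked for obj in marked for child in heap[obj])
    return marked

def sweep(heap, marked):
    """Reclaim every unmarked object; survivors keep their edges."""
    return {obj: kids for obj, kids in heap.items() if obj in marked}

# Object 4 points at live object 1 but is itself unreachable from the root.
demo_heap = {1: [2], 2: [3], 3: [], 4: [1], 5: []}
live = mark(demo_heap, {1})
```

    The hard part of the mechanical proof is exactly the assertion above, restated over raw x86 memory rather than Python dictionaries, plus the allocator's invariants that tie the mark bits to the heap layout.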

  20. Biometric Subject Verification Based on Electrocardiographic Signals

    Science.gov (United States)

    Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)

    2014-01-01

    A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.
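
    The two-phase flow above (enrollment builds a composite reference from mutually similar heart-cycle graphs; verification compares a candidate composite against it) can be sketched as follows. This is a hedged illustration: the normalized-correlation metric and both thresholds are hypothetical stand-ins for the method's two graph-comparison metrics.

```python
import math

def correlate(g1, g2):
    """Normalized cross-correlation of two equal-length sampled graphs."""
    m1, m2 = sum(g1) / len(g1), sum(g2) / len(g2)
    num = sum((a - m1) * (b - m2) for a, b in zip(g1, g2))
    den = math.sqrt(sum((a - m1) ** 2 for a in g1) *
                    sum((b - m2) ** 2 for b in g2))
    return num / den if den else 0.0

def composite(graphs):
    """Pointwise mean of several cycles: a simple composite template."""
    return [sum(vals) / len(vals) for vals in zip(*graphs)]

def verify(candidate_cycles, reference_template,
           self_sim=0.95, match_thresh=0.9):
    """Accept only if the candidate's cycles resemble each other AND
    their composite resembles the enrolled reference template."""
    comp = composite(candidate_cycles)
    consistent = all(correlate(g, comp) >= self_sim for g in candidate_cycles)
    return consistent and correlate(comp, reference_template) >= match_thresh

# Enrollment: a toy PQRST-like shape repeated over three cycles.
reference = composite([[0, 1, 5, 1, 0, -1, 0, 2, 0]] * 3)
impostor_cycles = [[5, 0, 0, 0, 5, 0, 0, 0, 5]] * 3
```

    The self-similarity gate mirrors the enrollment requirement that the presented cycles "resemble each other" before any comparison with the stored template is attempted; an impostor whose cycles are internally consistent but shaped differently still fails the template match.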