WorldWideScience

Sample records for base verification based

  1. Video-based fingerprint verification.

    Science.gov (United States)

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and alignment, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both the dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
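
    The abstract does not spell out how the two similarities are defined or combined, so the following Python sketch is only a hedged illustration: it assumes per-frame feature vectors, treats frame-to-frame feature differences as the dynamic ("inside") component and averaged features as the static ("outside") component, and fuses them with a tunable weight. The names video_match_score and w_inside are hypothetical, not the authors'.

        import numpy as np

        def cosine(a, b):
            # Cosine similarity between two feature vectors.
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        def video_match_score(video_a, video_b, w_inside=0.5):
            # Dynamic ("inside") part: compare frame-to-frame feature changes
            # (zip truncates to the shorter sequence of differences).
            da = [video_a[i + 1] - video_a[i] for i in range(len(video_a) - 1)]
            db = [video_b[i + 1] - video_b[i] for i in range(len(video_b) - 1)]
            inside = float(np.mean([cosine(x, y) for x, y in zip(da, db)]))
            # Static ("outside") part: compare averaged frame features.
            outside = cosine(np.mean(video_a, axis=0), np.mean(video_b, axis=0))
            # The paper combines the two kinds of similarity; a weighted sum is
            # one plausible combination, not the authors' actual formula.
            return w_inside * inside + (1.0 - w_inside) * outside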

  2. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    OpenAIRE

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulas to Spin format, and counterexamples expressed in terms of automata. Interactive verification makes it possible to decrease verification time and increase the maxi...

  3. A correlation-based fingerprint verification system

    NARCIS (Netherlands)

    Bazen, A.M.; Gerez, Sabih H.; Veelenturf, L.P.J.; van der Zwaag, B.J.; Verwaaijen, G.T.B.

    2000-01-01

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates

  4. A correlation-based fingerprint verification system

    NARCIS (Netherlands)

    Bazen, A.M.; Gerez, Sabih H.; Veelenturf, L.P.J.; van der Zwaag, B.J.; Verwaaijen, G.T.B.

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates

  5. Privacy Preserving Iris Based Biometric Identity Verification

    Directory of Open Access Journals (Sweden)

    Przemyslaw Strzelczyk

    2011-08-01

    Iris biometrics is considered one of the most accurate and robust methods of identity verification. Individually unique iris features can be presented in a compact binary form easily compared with a reference template to confirm identity. However, when templates or features are disclosed, iris biometrics is no longer suitable for verification. Therefore, there is a need to perform iris feature matching without revealing the features themselves or the reference template. The paper proposes an extension of the standard iris-based verification protocol that introduces a feature and template locking mechanism, which guarantees that no sensitive information is exposed.

  6. Consent Based Verification System (CBSV)

    Data.gov (United States)

    Social Security Administration — CBSV is a fee-based service offered by SSA's Business Services Online (BSO). It is used by private companies to verify the SSNs of their customers and clients that...

  7. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach, which has been studied recently for verifying knowledge bases, is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  8. Rule-Based Runtime Verification

    Science.gov (United States)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2003-01-01

    We present a rule-based framework for defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time logics, interval logics, forms of quantified temporal logics, and so on. Our logic, EAGLE, is implemented as a Java library and involves novel techniques for rule definition, manipulation and execution. Monitoring is done on a state-by-state basis, without storing the execution trace.
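
    EAGLE itself is a Java library whose rule notation is not reproduced in this record; as a language-neutral sketch of the general idea of rule-based, state-by-state monitoring without storing the execution trace, consider a toy past-time rule ("every release is preceded by an acquire") in Python. This is a stand-in illustration, not EAGLE's API.

        # Toy stand-in for a rule-based monitor: one past-time rule,
        # evaluated state by state, keeping only a summary of the past.
        def make_monitor():
            held = {"value": False}
            def step(event):
                if event == "acquire":
                    held["value"] = True
                elif event == "release":
                    if not held["value"]:
                        return "violation"
                    held["value"] = False
                return "ok"
            return step

        monitor = make_monitor()
        for ev in ["acquire", "release", "release"]:
            print(ev, monitor(ev))   # the third event reports a violation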

  9. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    Directory of Open Access Journals (Sweden)

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulas to Spin format, and counterexamples expressed in terms of automata. Interactive verification makes it possible to decrease verification time and increase the maximum size of verifiable programs. The considered method supports verification of parallel systems of hierarchical automata that interact with each other through messages and shared variables. A feature of the automaton model is that each state machine is considered a new data type and can have an arbitrary bounded number of instances. Each state machine in the system can run a different state machine in a new thread or contain a nested state machine. This method was implemented in the developed Stater tool. Stater shows correct operation for all test cases.

  10. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  11. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    A signature verification system matches a tested signature against a claimed reference signature. This paper proposes a time series based feature extraction method and dynamic time warping as the matching method. The system was evaluated on 900 signatures belonging to 50 participants: 3 signatures for reference and 5 test signatures each from the original user, simple impostors, and trained impostors. With 3 references per participant, system accuracy without impostors is 90.44897959% at threshold 44, with a rejection error rate (FNMR) of 5.2% and an acceptance error rate (FMR) of 4.35102%. With impostors, system accuracy is 80.1361% at threshold 27, with a rejection error rate (FNMR) of 15.6% and an average acceptance error rate (FMR) of 4.263946%, with details as follows: an acceptance error rate of 0.391837%, an acceptance error rate for simple impostors of 3.2%, and an acceptance error rate for trained impostors of 9.2%.
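
    The abstract names dynamic time warping (DTW) as the matching method; a minimal textbook DTW in Python (not the authors' implementation, and with the time-series feature extraction omitted) looks like this. The verify helper and its threshold argument are illustrative; the paper reports operating points at thresholds 44 and 27 on its own score scale.

        import numpy as np

        def dtw_distance(a, b):
            # Classic O(n*m) dynamic time warping distance between
            # two 1-D time series, with absolute difference as local cost.
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        def verify(test_sig, ref_sigs, threshold):
            # Accept the claimed identity when the best distance to any
            # reference signature falls below a tuned threshold.
            return min(dtw_distance(test_sig, r) for r in ref_sigs) <= threshold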

  12. Game-based verification and synthesis

    DEFF Research Database (Denmark)

    Vester, Steen

    and the environment behaves. Synthesis of strategies in games can thus be used for automatic generation of correct-by-construction programs from specifications. We consider verification and synthesis problems for several well-known game-based models. This includes both model-checking problems and satisfiability...... problems for logics capable of expressing strategic abilities of players in games with both qualitative and quantitative objectives. A number of computational complexity results for model-checking and satisfiability problems in this domain are obtained. We also show how the technique of symmetry reduction...... corresponds directly to a program for the corresponding entity of the system. A strategy for a player which ensures that the player wins no matter how the other players behave then corresponds to a program ensuring that the specification of the entity is satisfied no matter how the other entities...

  13. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    In this paper, we present a discriminative feature difference learning method for facial image based kinship verification. To transform feature difference of an image pair to be discriminative for kinship verification, a linear transformation matrix for feature difference between an image pair...... databases show that the proposed method combined with a SVM classification method outperforms or is comparable to state-of-the-art kinship verification methods. © Springer International Publishing AG, Part of Springer Science+Business Media...

  14. Android-Based Verification System for Banknotes

    Directory of Open Access Journals (Sweden)

    Ubaid Ur Rahman

    2017-11-01

    With the advancement in imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes losses to banks, traders, and individuals involved in financial transactions. Hence, efficient and reliable techniques for the detection of counterfeit banknotes are needed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing tasks on these phones. In addition, the number of smartphone users has increased greatly and continues to grow. This is a great motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and the surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered, since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes are very different in terms of the printing process, the ingredients used in preparation of banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.
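
    The record describes statistical and surface-roughness features fed to a support vector machine; the sklearn sketch below is a hedged illustration with random stand-in data. The banknote_features helper and its gradient-based roughness proxy are hypothetical, not the paper's XRD/SEM-motivated feature set.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        def banknote_features(region):
            # Illustrative statistical + roughness features from a grayscale
            # region (e.g. the serial-number or flag portion of the note).
            g = region.astype(float)
            grad = np.abs(np.diff(g, axis=0)).mean()   # crude roughness proxy
            return [g.mean(), g.std(), g.min(), g.max(), grad]

        # X: feature rows from known notes, y: 1 = genuine, 0 = counterfeit.
        # Random arrays stand in for real banknote image regions here.
        X = np.array([banknote_features(np.random.rand(64, 64)) for _ in range(20)])
        y = np.array([1, 0] * 10)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
        print(clf.predict(X[:2]))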

  15. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    This work presents an efficient solution using a computer algebra system to perform linear temporal property verification for synchronous digital systems. The method is essentially based on both Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations by using Groebner bases. The computational experience reported in this work shows that the algebraic approach is a quite competitive checking method and will be a useful supplement to existing verification methods based on simulation.
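
    As a worked miniature of the Groebner-basis idea (using sympy, with a single AND gate rather than the paper's SEREs machinery): encode Boolean signals as polynomials with idempotence constraints, then check an assertion by ideal membership. The encoding below is a standard textbook construction, not the paper's exact representation.

        from sympy import symbols, groebner

        # Boolean signals as polynomials; x**2 - x forces x to be 0/1.
        a, b, c = symbols('a b c')
        circuit = [c - a*b, a**2 - a, b**2 - b, c**2 - c]   # c = a AND b
        G = groebner(circuit, a, b, c, order='lex')

        # The assertion "c implies a" holds iff the polynomial c*(1 - a)
        # lies in the ideal, i.e. reduces to zero modulo the basis.
        print(G.contains(c * (1 - a)))   # True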

  16. NES++: number system for encryption based privacy preserving speaker verification

    Science.gov (United States)

    Xu, Lei; Feng, Tao; Zhao, Xi; Shi, Weidong

    2014-05-01

    As speech-based operation becomes a main hands-free interaction solution between humans and mobile devices (e.g., smartphones, Google Glass), privacy preserving speaker verification receives much attention nowadays. Privacy preserving speaker verification can be achieved in many different ways, such as fuzzy vaults and encryption. Encryption-based solutions are promising, as cryptography is built on solid mathematical foundations and the security properties can be easily analyzed in a well-established framework. Most current asymmetric encryption schemes work on finite algebraic structures, such as finite groups and finite fields. However, the encryption scheme for privacy preserving speaker verification must handle floating point numbers. This gap must be filled to make the overall scheme practical. In this paper, we propose a number system that meets the requirements of both speaker verification and the encryption scheme used in the process. It also supports the additive homomorphic property of Paillier's encryption, which is crucial for privacy preserving speaker verification. As asymmetric encryption is expensive, we propose a method of packing several numbers into one plaintext, by which the computation overhead is greatly reduced. To evaluate the performance of this method, we implement Paillier's encryption scheme over the proposed number system together with the packing technique. Our findings show that the proposed solution can fill the gap between speaker verification and the encryption scheme very well, and the packing technique improves the overall performance. Furthermore, our solution is a building block of encryption-based privacy preserving speaker verification, and the privacy protection and accuracy rate are not affected.
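
    The paper's number system itself is not reproduced in this record; the sketch below only illustrates the packing idea on top of the third-party phe Paillier library. SLOT, SCALE and the pack/unpack helpers are hypothetical choices: several fixed-point values share one plaintext, and Paillier's additive homomorphism still adds them slot-wise (as long as no slot overflows).

        # pip install phe  -- third-party Paillier implementation
        from phe import paillier

        SLOT = 2 ** 32          # bit-width reserved per packed value
        SCALE = 10 ** 4         # fixed-point scaling for floating-point features

        def pack(values):
            # Pack several scaled feature values into one plaintext integer.
            out = 0
            for v in values:
                out = out * SLOT + int(round(v * SCALE))
            return out

        def unpack(packed, n):
            vals = []
            for _ in range(n):
                packed, r = divmod(packed, SLOT)
                vals.append(r / SCALE)
            return list(reversed(vals))

        pub, priv = paillier.generate_paillier_keypair(n_length=2048)
        ct = pub.encrypt(pack([0.25, 1.5, 3.0])) + pub.encrypt(pack([0.75, 0.5, 1.0]))
        print(unpack(priv.decrypt(ct), 3))   # slot-wise sums: [1.0, 2.0, 4.0]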

  17. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used for strengthening program transformation techniques. This paper considers how the finite countermodel method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of these problems.

  18. Sensor-fusion-based biometric identity verification

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States); Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  19. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm

  20. Log-Gabor filters for image-based vehicle verification.

    Science.gov (United States)

    Arróspide, Jon; Salgado, Luis

    2013-06-01

    Vehicle detection based on image analysis has attracted increasing attention in recent years due to its low cost, flexibility, and potential toward collision avoidance. In particular, vehicle verification is especially challenging on account of the heterogeneity of vehicles in color, size, pose, etc. Image-based vehicle verification is usually addressed as a supervised classification problem. Specifically, descriptors using Gabor filters have been reported to show good performance in this task. However, Gabor functions have a number of drawbacks relating to their frequency response. The main contribution of this paper is the proposal and evaluation of a new descriptor based on the alternative family of log-Gabor functions for vehicle verification, as opposed to existing Gabor filter-based descriptors. These filters are theoretically superior to Gabor filters as they can better represent the frequency properties of natural images. As a second contribution, and in contrast to existing approaches, which transfer the standard configuration of filters used for other applications to the vehicle classification task, an in-depth analysis of the required filter configuration by both Gabor and log-Gabor descriptors for this particular application is performed for fair comparison. The extensive experiments conducted in this paper confirm that the proposed log-Gabor descriptor significantly outperforms the standard Gabor filter for image-based vehicle verification.
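
    The standard radial log-Gabor transfer function G(f) = exp(-(ln(f/f0))^2 / (2 (ln(sigma/f0))^2)) is built directly in the frequency domain; a numpy sketch follows, with illustrative parameter values (f0, sigma_ratio) rather than the configuration the paper tunes for vehicle verification.

        import numpy as np

        def log_gabor_radial(shape, f0, sigma_ratio=0.55):
            # Radial log-Gabor transfer function; unlike a Gabor filter it
            # has no DC component and a long tail toward high frequencies.
            rows, cols = shape
            fy = np.fft.fftfreq(rows)[:, None]
            fx = np.fft.fftfreq(cols)[None, :]
            f = np.sqrt(fx**2 + fy**2)
            f[0, 0] = 1.0                      # avoid log(0); DC is zeroed below
            g = np.exp(-(np.log(f / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2))
            g[0, 0] = 0.0                      # log-Gabor has zero DC response
            return g

        def filter_image(img, f0=0.1):
            # Descriptor building block: filter the patch, keep the magnitude.
            G = log_gabor_radial(img.shape, f0)
            return np.abs(np.fft.ifft2(np.fft.fft2(img) * G))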

  1. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype of robust biometric system for verification. The system uses features extracted using Speeded Up Robust Features (SURF) operator of human hand. The hand image for features is acquired using a low cost scanner. The palmprint region extracted is robust to hand translation and rotation on the scanner. The system is tested on IITK database of 200 images and PolyU database of 7751 images. The system is found to be robust with respect to translation and rotation. It has FAR 0.02%, FRR 0.01% and accuracy of 99.98% and can be a suitable system for civilian applications and high-security environments.

  2. Wavelet-based verification of the quantitative precipitation forecast

    Science.gov (United States)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localization and the associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by two indices, one for scale and one for localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object" oriented verification methods, as the latter tend to exhibit strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further development of the wavelet-based methods, especially towards the goal of identifying a weak physical process contributing to forecast error, are also pointed out.
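
    A hedged sketch of the scale-decomposition step with PyWavelets: decompose the forecast-minus-observed field and report error energy per scale. The paper's specific scale/localization indices and COAMPS data are not reproduced; random fields stand in for rain fields.

        # pip install PyWavelets
        import numpy as np
        import pywt

        def scale_energies(field, wavelet="haar", levels=4):
            # Energy of the wavelet detail coefficients at each scale; the
            # scale index localizes where the forecast error lives spectrally.
            coeffs = pywt.wavedec2(field, wavelet, level=levels)
            return [sum(float((d ** 2).sum()) for d in detail)
                    for detail in coeffs[1:]]      # coeffs[0] is the approximation

        obs = np.random.rand(64, 64)      # placeholder observed precipitation
        fcst = np.random.rand(64, 64)     # placeholder forecast
        print(scale_energies(fcst - obs)) # one error energy per scale, coarse to fine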

  3. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  4. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.; Delp, Edward J.; Wong, Ping W.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 x 44 piezoresistive elements is used to measure the grip pattern. An interface has been

  5. Formal Requirements Modeling for Simulation-Based Verification

    OpenAIRE

    Otter, Martin; Thuy, Nguyen; Bouskela, Daniel; Buffoni, Lena; Elmqvist, Hilding; Fritzson, Peter; Garro, Alfredo; Jardin, Audrey; Olsson, Hans; Payelleville, Maxime; Schamai, Wladimir; Thomas, Eric; Tundis, Andrea

    2015-01-01

    This paper describes a proposal on how to model formal requirements in Modelica for simulation-based verification. The approach is implemented in the open source Modelica_Requirements library. It requires extensions to the Modelica language, that have been prototypically implemented in the Dymola and Open-Modelica software. The design of the library is based on the FOrmal Requirement Modeling Language (FORM-L) defined by EDF, and on industrial use cases from EDF and Dassault Aviation. It uses...

  6. Verification of A Process Based Hydrograph Separation

    Science.gov (United States)

    Fackel, P.; Naef, F.

    Depending on soil structure, geology, topography, etc., runoff during intense precipitation is formed either by HOF (Hortonian overland flow), SOF (saturated overland flow), SSF (subsurface flow) or DP (deep percolation). A methodology was developed to delineate the different dominant runoff processes at varying intensities (1 very fast to 3 retarded) in a catchment. In the small Rohr catchment (2.1 km2) the dominant runoff processes were mapped with the help of soil investigations, sprinkling experiments, etc. A series of aerial infrared photographs was also taken after a heavy rainfall event to observe the drying of the catchment, indicating the extent of saturated areas. To verify the mapping we installed a hydrometric measurement network. Discharge was measured from a tile drain system, from areas where only SOF was expected, and from the whole catchment. Shallow groundwater wells were installed in areas where SOF and SSF of different intensities were mapped. Wells on SOF 1 areas showed saturation after 30 mm of precipitation, while SOF 2 and 3 soils needed a significantly higher amount of rainfall for saturation. Water levels in wells in SSF areas didn't rise to ground level even after large amounts of rainfall. Discharge from the SOF areas corresponded well to measured soil saturation. The runoff from the drainage system responded very fast and strongly to precipitation, indicating preferential flow through the soil. In a parallel project, the EAWAG investigated pesticide transport in the Rohr catchment. They applied the pesticide atrazine plus a labelling substance to 12 maize fields and measured their concentrations in runoff with a high temporal resolution. Based on the runoff process map, the IHW model Qarea simulates the contribution of each runoff process to the storm hydrograph separately, and thus implicitly yields a process-based hydrograph separation. Due to the high spatial resolution of the process map it was possible to calculate the spatial

  7. Verification Mechanism For Lightweight Componenent-Based Environment Based On Ioc Container

    Directory of Open Access Journals (Sweden)

    Rafal Leszko

    2013-01-01

    The paper presents a concept of a component verification framework dedicated to a particular lightweight component environment. The starting point of the paper is a discussion of the significance of verifying syntax inconsistencies in software development. Next, the need for verification in service-oriented and component-based systems is presented and various approaches to verification in existing component environments are explained. The main part of the paper introduces a concept of functional integrity of component-based systems that utilizes verification mechanisms checking component consistency. The proposed solution is built on a fine-grained component environment (close to classes, similar to the Spring Framework) realized in the AgE platform. Selected technical aspects of the framework design illustrate the considerations of the paper.

  8. An ontology based trust verification of software license agreement

    Science.gov (United States)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a large document appears stating rights and obligations, which many users lack the patience to read or understand. That may make users distrust the software. In this paper, we propose an ontology-based verification for software license agreements. First of all, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The license ontology can act as part of a generalized copyright-law knowledge model, and can also work as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  9. Simulation-based MDP verification for leading-edge masks

    Science.gov (United States)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC, and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing, and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification

  10. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Biometric-based identification/verification systems provide a solution to the security concerns of the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image) based. Minutiae-based algorithms depend upon the local discontinuities in the ridge flow pattern and are used when template size is important, while image-based matching algorithms use both the micro and macro features of a fingerprint and are used if fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database. Datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy with a learning image size of 100 x 100 and a threshold value of 700 (1000 being the perfect match) has been achieved.
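
    The learning phase with pseudo-random sub-sampling is specific to the article and is omitted here; the core image-based comparison, mapped onto the 0-1000 score scale the abstract mentions (700 as acceptance threshold), can be sketched as plain normalized cross-correlation in Python. The mapping from correlation to score is an assumption, not the article's exact formula.

        import numpy as np

        def match_score(template, probe):
            # Normalized cross-correlation of two same-size grayscale images,
            # mapped to a 0..1000 scale (1000 = perfect match).
            t = (template - template.mean()) / (template.std() + 1e-9)
            p = (probe - probe.mean()) / (probe.std() + 1e-9)
            ncc = float((t * p).mean())          # in [-1, 1]
            return max(0.0, ncc) * 1000.0

        def verify(template, probe, threshold=700.0):
            return match_score(template, probe) >= threshold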

  11. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  12. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research, as presented in this paper, is modelling legislation by capturing domain knowledge of legislation and specifying it in a generic way by using commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation by validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur on two different levels, on the level of a model or on the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics. Thus pragmatic features and attributes can be determined that could be relevant for evaluation of models. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  13. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2013-09-12

    ... From the Federal Register Online via the Government Publishing Office. SOCIAL SECURITY ADMINISTRATION. Consent Based Social Security Number Verification (CBSV) Service. AGENCY: Social Security... Security number (SSN) verification services to enrolled private businesses, State and local government...

  14. A Scala DSL for RETE-Based Runtime Verification

    Science.gov (United States)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.

  15. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.

  16. Protocol-Based Verification of Message-Passing Parallel Programs

    DEFF Research Database (Denmark)

    López-Acosta, Hugo-Andrés; Marques, Eduardo R. B.; Martins, Francisco

    2015-01-01

    a protocol language based on a dependent type system for message-passing parallel programs, which includes various communication operators, such as point-to-point messages, broadcast, reduce, array scatter and gather. For the verification of a program against a given protocol, the protocol is first......, that suffer from the state-explosion problem or that otherwise depend on parameters to the program itself. We experimentally evaluated our approach against state-of-the-art tools for MPI to conclude that our approach offers a scalable solution....

  17. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  18. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system has been confirmed by case studies

  19. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Biometric-based authentication systems provide solutions to the problems of high security that remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good biometric parameters but do not guarantee that the person is present and alive. A voice can be copied, a fingerprint can be lifted from glass onto synthetic skin, and in face recognition systems, due to genetic factors, identical twins or father and son may have the same facial appearance. ECG does not have these problems: it cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometric verification system, developed using the Laboratory Virtual Instruments Engineering Workbench (LabVIEW, version 7.1), is discussed. Experiments were conducted on a laboratory database of 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.

  20. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun, Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design with the regulations related to a product is necessary. For this, this study presents a new method for the verification of product design using regulation knowledge base and Web services. Regulation knowledge base consisting of product ontology and rules was built with a hybrid technique combining ontology and programming languages. Web service for design verification was developed ensuring the flexible extension of knowledge base. By virtue of two technical features, design verification is served to various products while the change of system architecture is minimized.

  1. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design with the regulations related to a product is necessary. For this, this study presents a new method for the verification of product design using regulation knowledge base and Web services. Regulation knowledge base consisting of product ontology and rules was built with a hybrid technique combining ontology and programming languages. Web service for design verification was developed ensuring the flexible extension of knowledge base. By virtue of two technical features, design verification is served to various products while the change of system architecture is minimized.

  2. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    AFRL-RV-PS-TR-2018-0008. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation (Contract FA9453-15-1-0315). This report covers the development of a torque sensor for verification and validation (V&V) of spacecraft attitude control actuators. The developed sensor directly

  3. Fusion of PCA-Based and LDA-Based Similarity Measures for Face Verification

    Directory of Open Access Journals (Sweden)

    Kittler Josef

    2010-01-01

    The problem of fusing similarity-measure-based classifiers is considered in the context of face verification. The performance of face verification systems using different similarity measures in two well-known appearance-based representation spaces, namely Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), is experimentally studied. The study is performed for both manually and automatically registered face images. The experimental results confirm that our optimised Gradient Direction (GD) metric within the LDA feature space outperforms the other adopted metrics. Different methods of selection and fusion of the similarity-measure-based classifiers are then examined. The experimental results demonstrate that the combined classifiers outperform any individual verification algorithm. In our studies, Support Vector Machines (SVMs) and weighted averaging of similarity measures appear to be the best fusion rules. Another interesting achievement of the work is that although features derived from the LDA approach lead to better results than those of the PCA algorithm for all the adopted scoring functions, fusing the PCA- and LDA-based scores improves the performance of the system.
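
    A hedged sklearn sketch of the score-fusion idea: cosine similarity stands in for the paper's optimised Gradient Direction metric, random arrays stand in for registered face images, and the 0.5/0.5 weights are illustrative rather than the paper's tuned values.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def cos(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

        rng = np.random.default_rng(0)
        X = rng.normal(size=(40, 64))        # placeholder face-feature rows
        y = np.repeat(np.arange(8), 5)       # 8 identities x 5 images each
        pca = PCA(n_components=10).fit(X)
        lda = LinearDiscriminantAnalysis(n_components=7).fit(X, y)

        probe, gallery = X[0], X[1:6]
        s_pca = [cos(g, pca.transform(probe[None])[0]) for g in pca.transform(gallery)]
        s_lda = [cos(g, lda.transform(probe[None])[0]) for g in lda.transform(gallery)]

        # Weighted averaging -- one of the two fusion rules the study found
        # best (the other being an SVM over the score vectors).
        fused = [0.5 * p + 0.5 * l for p, l in zip(s_pca, s_lda)]
        print(fused)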

  4. Efficient Data Integrity Verification Using CRC Based on HDFS in Cloud Storage

    Directory of Open Access Journals (Sweden)

    Xia Yun-Hao

    2017-01-01

    Data integrity verification is becoming a major challenge in cloud storage that cannot be ignored. This paper proposes an optimized variant of the CRC (Cyclic Redundancy Check) verification algorithm based on HDFS to improve the efficiency of data integrity verification in cloud storage, through research into the CRC checksum algorithm and the data integrity verification mechanism of HDFS. A new method is formulated to optimize and accelerate the algorithm by studying the characteristics of checksum generation and checking. Moreover, this method optimizes the code to improve computational efficiency in accordance with the data integrity verification mechanism of HDFS. A data integrity verification system based on Hadoop is designed to validate the proposed method. Experimental results demonstrate that the proposed HDFS-based CRC algorithm improves calculation efficiency and the utilization of system resources overall, and outperforms existing models in terms of accuracy and time.
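
    For context, HDFS keeps CRC32C checksums per 512-byte chunk in sidecar metadata; the Python sketch below mirrors the general mechanism with zlib.crc32 and an illustrative chunk size, not the paper's optimized variant.

        import zlib

        CHUNK = 64 * 1024   # illustrative granularity; HDFS uses 512 bytes

        def chunk_crcs(path):
            # CRC-32 of each fixed-size chunk of a file.
            crcs = []
            with open(path, "rb") as f:
                while True:
                    block = f.read(CHUNK)
                    if not block:
                        break
                    crcs.append(zlib.crc32(block))
            return crcs

        def verify(path, stored_crcs):
            # Integrity check: recompute and compare against stored checksums.
            return chunk_crcs(path) == stored_crcs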

  5. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2011-09-28

    ... Social Security number (SSN) verification service to private businesses and other requesters who obtain a... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2011-0073]. Consent Based Social Security Number Verification (CBSV) Service. AGENCY: Social Security Administration. ACTION: Notice of Revised Transaction Fee...

  6. E-Visas Verification Schemes Based on Public-Key Infrastructure and Identity Based Encryption

    OpenAIRE

    Najlaa A. Abuadhmah; Muawya Naser; Azman Samsudin

    2010-01-01

    Problem statement: A visa is a very important travel document, essential at the point of entry to any country being visited. However, such an important document is still handled manually, which affects the accuracy and efficiency of visa processing. Work on e-visas is almost unexplored. Approach: This study provided a detailed description of a newly proposed e-visa verification system, prototyped based on RFID technology. The core technology of the proposed e-visa...

  7. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  8. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new pixel-equivalent calibration method and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly indicate that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.
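
    The record names a new pixel-equivalent calibration method without detailing it; the arithmetic underneath any such calibration is simply millimetres per pixel derived from a known interval, as in this hypothetical sketch (the interval length and pixel counts below are invented for illustration).

        def pixel_equivalent(known_length_mm, pixel_count):
            # Calibration: physical length represented by one pixel.
            return known_length_mm / pixel_count

        def measured_length_mm(pixels, k):
            # Convert a pixel distance on the rule image into millimetres.
            return pixels * k

        # Example: a 10 mm calibration interval spanning 2875 pixels.
        k = pixel_equivalent(10.0, 2875)       # ~0.003478 mm per pixel
        print(measured_length_mm(28750, k))    # a 28750-pixel span -> 100.0 mm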

  9. Ensemble-based approximation of observation impact using an observation-based verification metric

    Directory of Open Access Journals (Sweden)

    Matthias Sommer

    2016-07-01

    Knowledge on the contribution of observations to forecast accuracy is crucial for the refinement of observing and data assimilation systems. Several recent publications highlighted the benefits of efficiently approximating this observation impact using adjoint methods or ensembles. This study proposes a modification of an existing method for computing observation impact in an ensemble-based data assimilation and forecasting system and applies the method to a pre-operational, convective-scale regional modelling environment. Instead of the analysis, the modified approach uses observation-based verification metrics to mitigate the effect of correlation between the forecast and its verification norm. Furthermore, a peculiar property in the distribution of individual observation impact values is used to define a reliability indicator for the accuracy of the impact approximation. Applying this method to a 3-day test period shows that a well-defined observation impact value can be approximated for most observation types and the reliability indicator successfully depicts where results are not significant.

  10. Verification of the Simultaneous Local Extraction Method of Base and Thermal Resistance of Bipolar Transistors

    OpenAIRE

    Robert Setekera; Luuk Tiemeijer; Ramses van der Toorn

    2014-01-01

    In this paper an extensive verification of the extraction method (published earlier) that consistently accounts for self-heating and the Early effect to accurately extract both the base and thermal resistance of bipolar junction transistors is presented. The method verification is demonstrated on advanced RF SiGe HBTs, where the extracted results for the thermal resistance are compared with those from another published method that ignores the influence of the Early effect on the internal base...

  11. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2016-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan, setting the verification goals. Those goals are then targeted by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...
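
    UVM itself is a SystemVerilog library; the toy Python sketch below only mirrors the CDV flow the abstract describes — random legal stimuli, a self-checking scoreboard, and coverage bins measuring progress. The DUT model and bins are invented for the example.

    ```python
    import random

    def dut_adder(a: int, b: int) -> int:          # stand-in device under test
        return (a + b) & 0xFF                      # 8-bit adder with wrap-around

    coverage = {"no_carry": 0, "carry": 0}         # functional coverage bins

    random.seed(0)
    for _ in range(1000):
        a, b = random.randrange(256), random.randrange(256)   # legal stimulus
        got = dut_adder(a, b)
        assert got == (a + b) % 256, "scoreboard mismatch"    # self-checking
        coverage["carry" if a + b > 255 else "no_carry"] += 1

    # Coverage closure: every bin must have been exercised at least once.
    print("coverage:", {name: hits > 0 for name, hits in coverage.items()})
    ```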

  12. Horn clause verification with convex polyhedral abstraction and tree automata-based refinement

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivations... underlying the Horn clauses. Experiments using linear constraint problems and the abstract domain of convex polyhedra show that the refinement technique is practical and that iteration of abstract interpretation with tree automata-based refinement solves many challenging Horn clause verification problems. We... compare the results with other state-of-the-art Horn clause verification tools.
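
    As a minimal illustration of the Horn-clause setting (not the authors' tree-automata refinement), the propositional sketch below evaluates a clause set bottom-up to a fixpoint and declares the program safe iff the error atom is not derivable; the clause set is invented.

    ```python
    # Clauses as (head, body): the head is derivable once every body atom is.
    CLAUSES = [
        ("init", []),                       # a fact
        ("step", ["init"]),
        ("error", ["step", "unsafe"]),      # 'unsafe' is never derivable -> safe
    ]

    def derivable():
        """Bottom-up fixpoint: repeatedly fire clauses until nothing changes."""
        known = set()
        changed = True
        while changed:
            changed = False
            for head, body in CLAUSES:
                if head not in known and all(b in known for b in body):
                    known.add(head)
                    changed = True
        return known

    print("safe" if "error" not in derivable() else "counterexample derivation exists")
    ```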

  13. Tree automata-based refinement with application to Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivations... underlying the Horn clauses. Experiments using linear constraint problems and the abstract domain of convex polyhedra show that the refinement technique is practical and that iteration of abstract interpretation with tree automata-based refinement solves many challenging Horn clause verification problems. We... compare the results with other state of the art Horn clause verification tools.

  14. Fuzzy-logic-based safety verification framework for nuclear power plants.

    Science.gov (United States)

    Rastogi, Achint; Gabbar, Hossam A

    2013-06-01

    This article presents a practical implementation of a safety verification framework for nuclear power plants (NPPs) based on fuzzy logic, where hazard scenarios are identified in view of safety and control limits on different plant process values. Risk is estimated quantitatively and compared with safety limits in real time so that safety verification can be achieved. Fuzzy logic is used to define safety rules that map a hazard condition to the required safety protection in view of the risk estimate. Case studies from an NPP are analyzed to realize the proposed real-time safety verification framework. An automated system was developed to demonstrate the safety limits for different hazard scenarios. © 2012 Society for Risk Analysis.
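
    A minimal sketch of the kind of fuzzy mapping described — a measured process value, compared against its safety limit, is mapped to a protection level by max-min inference. The membership functions, limits and rule names are invented, not the paper's.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function over [a, c] peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def protection_level(pressure, safety_limit=150.0):
        margin = pressure / safety_limit          # 1.0 == exactly at the limit
        low    = tri(margin, -0.5, 0.0, 0.9)      # fuzzification of the risk estimate
        near   = tri(margin,  0.6, 0.9, 1.0)
        beyond = tri(margin,  0.9, 1.2, 2.0)
        # Rule base: IF risk low THEN monitor; IF near limit THEN alarm;
        #            IF beyond limit THEN trip.  Pick the strongest rule.
        rules = {"monitor": low, "alarm": near, "trip": beyond}
        return max(rules, key=rules.get)

    print(protection_level(100.0), protection_level(140.0), protection_level(160.0))
    # -> monitor alarm trip
    ```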

  15. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

    A Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates on the V and V process followed, the document submission requirements at each stage, the V and V activities, the checklists used for reviews at each stage, and the reports

  16. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-12-10


  17. Developing Reading and Listening Comprehension Tests Based on the Sentence Verification Technique (SVT).

    Science.gov (United States)

    Royer, James M.

    2001-01-01

    Describes a team-based approach for creating Sentence Verification Technique (SVT) tests, a development procedure that allows teachers and other school personnel to develop comprehension tests from curriculum materials in use in their schools. Finds that if tests are based on materials that are appropriate for the population to be tested, the…

  18. A Feature Subtraction Method for Image Based Kinship Verification under Uncontrolled Environments

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    The proposed method minimizes the feature distance between face image pairs with kinship and maximizes the distance between non-kinship pairs. Based on the subtracted feature, verification is realized through a simple Gaussian based distance comparison method. Experiments on two public databases show that the feature subtraction method...

  19. Comparison of megavoltage position verification for prostate irradiation based on bony anatomy and implanted fiducials

    NARCIS (Netherlands)

    Nederveen, Aart J.; Dehnad, Homan; van der Heide, Uulke A.; van Moorselaar, R. Jeroen A.; Hofman, Pieter; Lagendijk, Jan J. W.

    2003-01-01

    PURPOSE: The patient position during radiotherapy treatment of prostate cancer can be verified with the help of portal images acquired during treatment. In this study we quantify the clinical consequences of the use of image-based verification based on the bony anatomy and the prostate target

  20. Method and computer product to increase accuracy of time-based software verification for sensor networks

    Science.gov (United States)

    Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA

    2011-01-25

    A recursive verification protocol that reduces the time variance due to delays in the network, by putting the subject node at most one hop from the verifier node, provides an efficient way to test wireless sensor nodes. Since the software signatures are time based, recursive testing gives a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, which in turn checks its neighbor, continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, testing a node twice, or not at all, can be avoided.
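
    A schematic simulation of the hop-by-hop idea described above, with an invented topology and signature check: each verified node checks its one-hop neighbours in turn, verification halts past a failed node, and an alternative path may still reach nodes downstream.

    ```python
    def verify_chain(graph, verifier, signature_ok, visited=None):
        """Return the set of nodes positively verified, hop by hop."""
        visited = visited or {verifier}
        for neighbour in graph.get(verifier, []):
            if neighbour in visited:
                continue                      # avoid testing a node twice
            if signature_ok(neighbour):       # time-based software signature check
                visited.add(neighbour)
                verify_chain(graph, neighbour, signature_ok, visited)
            # else: verification downstream of this node halts; another
            # branch of the recursion may still reach it via a different path.
        return visited

    graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
    # Node B fails its signature check; D is still verified via A -> C -> D.
    print(verify_chain(graph, "A", lambda n: n != "B"))   # {'A', 'C', 'D'}
    ```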

  1. Streaming-based verification of XML signatures in SOAP messages

    DEFF Research Database (Denmark)

    Somorovsky, Juraj; Jensen, Meiko; Schwenk, Jörg

    2010-01-01

    ...approach for XML processing, Web Services servers easily become a target of Denial-of-Service attacks. We present a solution to these problems: an external streaming-based WS-Security Gateway. Our implementation is capable of processing XML Signatures in SOAP messages using a streaming-based approach...

  2. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified... In the former case, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a "one-TA-per-instance-line" approach, and then reduce the problem of scenario-based verification also to a CTL real-time model checking problem. We show how we exploit the expressivity of the TA formalism and the CTL query language of the real-time model checker UPPAAL to accomplish these tasks. The proposed two approaches...

  3. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, it is difficult to isolate failures in an IMA system. Therefore, the critical problem IMA system verification faces is how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but for a complex one it is hard to completely test a huge, integrated avionics system. This paper therefore proposes applying compositional-verification theory to IMA system test, reducing the test process and improving its efficiency, and consequently economizing on the costs of IMA system integration.

  4. Action-based verification of RTCP-nets with CADP

    Science.gov (United States)

    Biernacki, Jerzy; Biernacka, Agnieszka; Szpyrka, Marcin

    2015-12-01

    The paper presents an algorithm for translating coverability graphs of RTCP-nets (real-time coloured Petri nets) into the Aldebaran format. The approach makes automatic verification of RTCP-nets possible using the model checking techniques provided by the CADP toolbox. An actual fire alarm control panel system has been modelled and several of its crucial properties have been verified to demonstrate the usability of the approach.

  5. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  6. [Verification of Learning Effects by Team-based Learning].

    Science.gov (United States)

    Ono, Shin-Ichi; Ito, Yoshihisa; Ishige, Kumiko; Inokuchi, Norio; Kosuge, Yasuhiro; Asami, Satoru; Izumisawa, Megumi; Kobayashi, Hiroko; Hayashi, Hiroyuki; Suzuki, Takashi; Kishikawa, Yukinaga; Hata, Harumi; Kose, Eiji; Tabata, Kei-Ichi

    2017-11-01

    It has been recommended by the Central Council for Education that active learning methods, such as team-based learning (TBL) and problem-based learning (PBL), be introduced into university classes. As such, for the past 3 years, we have implemented TBL in a medical therapeutics course for 4th-year students. Based upon our experience, TBL is characterized as follows: TBL needs fewer teachers than PBL to conduct a TBL module. TBL enables both students and teachers to recognize and confirm the learning results from preparation and review. TBL develops students' responsibility for themselves and their teams, and likely facilitates learning activities through peer assessment.

  7. SAT-based verification for timed component connectors

    NARCIS (Netherlands)

    S. Kemper (Stephanie)

    2011-01-01

    textabstractComponent-based software construction relies on suitable models underlying components, and in particular the coordinators which orchestrate component behaviour. Verifying correctness and safety of such systems amounts to model checking the underlying system model. The model checking

  8. Geothermal-resource verification for Air Force bases

    Energy Technology Data Exchange (ETDEWEB)

    Grant, P.R. Jr.

    1981-06-01

    This report summarizes the various types of geothermal energy, reviews some legal uncertainties of the resource, and then describes a methodology to evaluate geothermal resources for application at US Air Force bases. Estimates suggest that exploration costs will be $50,000 to $300,000, which, if favorable, would lead to drilling a $500,000 exploration well. Successful identification and development of a geothermal resource could supply all of a base's fixed system needs with an inexpensive, renewable energy source.

  9. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    Science.gov (United States)

    2003-03-01


  10. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    Science.gov (United States)

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delinks effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  11. Runtime Verification of Component-Based Embedded Software

    NARCIS (Netherlands)

    Sözer, Hasan; Hofmann, C.; Tekinerdogan, B.; Aksit, Mehmet; Gelenbe, Erol; Lent, Ricardo; Sakellari, Georgia

    To deal with increasing size and complexity, component-based software development has been employed in embedded systems. Due to several faults, components can make wrong assumptions about the working mode of the system and the working modes of the other components. To detect mode inconsistencies at

  12. Evaluation of an electrocardiograph-based PICC tip verification system.

    Science.gov (United States)

    Oliver, Gemma; Jones, Matt

    Performing a chest x-ray after insertion of a peripherally inserted central catheter (PICC) is recognised as the gold standard for checking that the tip of the catheter is correctly positioned in the lower third of the superior vena cava at the right atrial junction; however, numerous problems are associated with this practice. A recent technological advancement has been developed that utilises changes in a patient's electrocardiograph (ECG) recorded from the tip of the PICC as a more reliable method. This evaluation discusses how a vascular access team in a large acute NHS Trust safely and successfully incorporated the use of ECG guidance technology for verification of PICC tip placement into their practice.

  13. An Improved Constraint-Based System for the Verification of Security Protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov [30]. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also makes it possible to detect flaws associated with partial

  14. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

    The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do not p... ...on dictionary-based noise reduction and compare it to the baseline methods.

  15. An Improved Constraint-based system for the verification of security protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, (2) a monotonic behavior, which also makes it possible to detect flaws associated with partial runs

  16. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    Directory of Open Access Journals (Sweden)

    Jingzhen Li

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of each volunteer's forearm is measured by a vector network analyzer (VNA). Specifically, in order to determine the frequency chosen for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the frequency chosen for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
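
    A minimal sketch of TATM's core comparison as described — a weighted Euclidean distance over the 21-sample S21 template against a per-user threshold. The weights, data and threshold value here are invented; the paper's adaptive threshold rule is not reproduced.

    ```python
    import math

    def weighted_euclidean(template, probe, weights):
        """Weighted Euclidean distance between two equal-length S21 vectors."""
        return math.sqrt(sum(w * (t - p) ** 2
                             for w, t, p in zip(weights, template, probe)))

    def verify(template, probe, weights, threshold):
        """Accept the claimed identity iff the distance is within the threshold."""
        return weighted_euclidean(template, probe, weights) <= threshold

    template = [-30.0 + 0.1 * i for i in range(21)]    # enrolled S21 samples (dB)
    probe    = [t + 0.05 for t in template]            # same user, slightly noisy
    weights  = [1.0] * 21
    threshold = 0.5   # e.g. adapted per user from genuine enrolment distances
    print(verify(template, probe, weights, threshold))  # True
    ```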

  17. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  18. The research for the design verification of nuclear power plant based on VR dynamic plant

    International Nuclear Information System (INIS)

    Wang Yong; Yu Xiao

    2015-01-01

    This paper studies a new method of design verification based on a VR plant, in order to verify and validate that the plant design conforms to the requirements of accident emergency. The VR dynamic plant is built from the 3D design model and digital maps composed of a GIS system and indoor maps, and it is driven by analysis data from the design analyzer. The VR plant can present both the operating conditions and the accident conditions of the power plant. This paper simulates the execution of accident procedures, the development of accidents, the evacuation planning of people and so on, based on the VR dynamic plant, to ensure that the plant design will not cause adverse effects. Besides design verification, the simulation results can also be used to optimize the accident emergency plan, and for training in the accident plan and emergency accident treatment. (author)

  19. Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos

    Science.gov (United States)

    Long, Min; Li, You; Peng, Fei

    Aiming to strike a balance between the security, efficiency and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is used for the node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be applied to the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users' data privacy in a public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, and that it supports dynamic operations and the integrity verification of multiple copies of data. It has great potential to be implemented in cloud storage services.
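
    A generic sketch of tree-based integrity verification for two copies; the paper's spatiotemporal-chaos node calculation is replaced here by SHA-256 purely as a stand-in, so this shows only the binary-tree structure of the check, not the proposed scheme.

    ```python
    import hashlib

    def h(data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

    def tree_root(blocks):
        """Fold leaf digests pairwise up the binary tree to a single root."""
        level = [h(b) for b in blocks]
        while len(level) > 1:
            if len(level) % 2:
                level.append(level[-1])               # pad odd-sized levels
            level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    copy_a = [b"block0", b"block1", b"block2", b"block3"]
    copy_b = [b"block0", b"blockX", b"block2", b"block3"]   # corrupted copy
    print(tree_root(copy_a) == tree_root(copy_b))           # False: mismatch detected
    ```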

  20. NI PXI-Based Automated Measurement System for Digital ASICs Verification

    Directory of Open Access Journals (Sweden)

    Sorokoumov Georgiy

    2016-01-01

    The paper describes the structure of an automated measuring system used to control the electrical and functional parameters of digital ASICs. The automated measuring system is based on National Instruments PXI modules, with the PXI-7954R module being the most significant module in the system. Hardware and software operations of the measuring system are discussed in the paper. The measuring system relies on test vectors for the functional verification of digital ASICs.

  1. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    International Nuclear Information System (INIS)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H

    2015-01-01

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general, and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on the TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of the transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for the 3D dose reconstruction based on the TD read-out compared to dose computation (mean gamma value of the PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan

  2. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis of the potential are simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding its integration into simulation and modelling approaches. In this context we study the potential of the agent-based modelling technique for the verification of people trajectories

  3. Evaluation of DVH-based treatment plan verification in addition to gamma passing rates for head and neck IMRT

    International Nuclear Information System (INIS)

    Visser, Ruurd; Wauben, David J.L.; Groot, Martijn de; Steenbakkers, Roel J.H.M.; Bijl, Henk P.; Godart, Jeremy; Veld, Aart A. van’t; Langendijk, Johannes A.; Korevaar, Erik W.

    2014-01-01

    Background and purpose: Treatment plan verification of intensity modulated radiotherapy (IMRT) is generally performed with the gamma index (GI) evaluation method, which is difficult to extrapolate to clinical implications. Incorporating dose volume histogram (DVH) information can compensate for this. The aim of this study was to evaluate DVH-based treatment plan verification in addition to the GI evaluation method for head and neck IMRT. Materials and methods: Dose verifications of 700 subsequent head and neck cancer IMRT treatment plans were categorised according to gamma and DVH-based action levels. Fractionation-dependent absolute dose limits were chosen. The results of the gamma- and DVH-based evaluations were compared to the decision of the medical physicist and/or radiation oncologist for plan acceptance. Results: Nearly all treatment plans (99.7%) were accepted for treatment according to the GI evaluation combined with DVH-based verification. Two treatment plans were re-planned according to DVH-based verification, which would have been accepted using the GI evaluation alone. DVH-based verification increased insight into dose delivery to patient-specific structures, increasing confidence that the treatment plans were clinically acceptable. Moreover, DVH-based action levels clearly distinguished the roles of the medical physicist and radiation oncologist within the quality assurance (QA) procedure. Conclusions: DVH-based treatment plan verification complements the GI evaluation method, improving head and neck IMRT-QA
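
    A minimal 1D sketch of the gamma-index evaluation the study builds on, using the common 3%/3 mm criteria; the study's grids, normalisation and DVH-based action levels are not reproduced, and the profiles below are invented.

    ```python
    import math

    def gamma_1d(ref, meas, spacing_mm, dta_mm=3.0, dd_frac=0.03):
        """Per-point gamma for two dose profiles sampled on the same grid."""
        dmax = max(ref)                      # global dose-difference normalisation
        gammas = []
        for i, d_ref in enumerate(ref):
            best = math.inf
            for j, d_meas in enumerate(meas):
                dist = (i - j) * spacing_mm                  # distance-to-agreement term
                dose = d_meas - d_ref                        # dose-difference term
                g2 = (dist / dta_mm) ** 2 + (dose / (dd_frac * dmax)) ** 2
                best = min(best, g2)
            gammas.append(math.sqrt(best))
        return gammas

    ref  = [0.0, 0.5, 1.0, 2.0, 2.0, 1.0, 0.5, 0.0]   # planned profile (Gy)
    meas = [0.0, 0.5, 1.1, 2.0, 1.9, 1.0, 0.5, 0.0]   # measured profile (Gy)
    g = gamma_1d(ref, meas, spacing_mm=1.0)
    print("pass rate:", sum(x <= 1.0 for x in g) / len(g))   # fraction with gamma <= 1
    ```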

  4. On-Board Imager-based MammoSite treatment verification.

    Science.gov (United States)

    Wojcicka, Jadwiga; Yankelevich, Rafael; Iorio, Stephen; Tinger, Alfred

    2007-11-01

    Contemporary radiation oncology departments often lack a conventional simulator, due to the common use of virtual simulation and the recent implementation of image-guided radiation therapy. A protocol based on the MammoSite method was developed using CT-based planning, a Source Position Simulator (SPS) with a simulator wire, and a linear accelerator based On-Board Imager (OBI) for daily verification. After MammoSite balloon implantation, the patient undergoes a CT study. The images are evaluated for tissue conformance, balloon symmetry, and balloon surface to skin distance according to the departmental procedure. Prior to the CT study, the SPS is attached to the transfer tube, which in turn is attached to the balloon catheter. The length from the indexer to the first dwell position is measured using the simulator wire with X-ray markers. After the CT study is performed, the data set is sent to the Varian Eclipse treatment planning system (TPS) and to the Nucletron PLATO brachytherapy planning system. The reference digitally reconstructed radiographs (DRRs) of the anterior and lateral setup fields are created using the Eclipse TPS and are immediately available on the OBI console via the Varian Vision integrated system. The source dwell position coinciding with the balloon center is identified in the CT dataset, followed by offset calculation, catheter reconstruction, dose point placement and dwell time calculation. OBI fluoroscopy images are acquired and marked as initial. Prior to each treatment fraction, the balloon diameter and symmetry are evaluated using OBI fluoroscopy and the tools available on the OBI console. Acquired images are compared with the reference DRRs and/or initial OBI images. The whole process from initial evaluation to daily verification is filmless and does not undermine the precision of the procedure. The verification time does not exceed 10 min. The balloon diameter correlates well (within 1 mm) between the initial CT and OBI verification images. The balloon symmetry is

  5. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    International Nuclear Information System (INIS)

    Tachibana, H; Tachibana, R

    2015-01-01

    Purpose: Lung SBRT planning has shifted to the volume prescription technique. However, point dose agreement is still verified using independent dose verification as the secondary check. Volume dose verification is more strongly affected by inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected in our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm was implemented and the radiological path length was computed from the CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation at the center of the GTV, the difference showed a systematic shift (4.5% ± 1.9%) in comparison with the AC with inhomogeneity correction; on the other hand, there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation as a secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would show larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account
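
    A minimal sketch of the radiological path length idea used by such independent checks — water-equivalent depth accumulated by stepping a ray through a relative-density grid derived from CT. The grid, geometry and step size are invented for illustration.

    ```python
    def radiological_depth(density_grid, entry, direction, step_mm=1.0, n_steps=100):
        """Sum density * geometric step along a ray (2D grid for brevity)."""
        x, y = entry
        dx, dy = direction
        depth = 0.0
        for _ in range(n_steps):
            ix, iy = int(round(x)), int(round(y))
            if not (0 <= ix < len(density_grid) and 0 <= iy < len(density_grid[0])):
                break                                  # ray left the grid
            depth += density_grid[ix][iy] * step_mm    # water-equivalent increment
            x, y = x + dx, y + dy
        return depth

    # Toy CT: a low-density lung slab (0.3) embedded in unit-density tissue.
    lung = [[0.3 if 2 <= i <= 6 else 1.0 for j in range(10)] for i in range(10)]
    # 10 mm geometric path yields only 6.5 mm water-equivalent depth:
    print(radiological_depth(lung, entry=(0, 5), direction=(1, 0)))
    ```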

  6. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers write test benches covering the various verification stages of register-transfer level (RTL), gate level, and place and route. Writing test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results; moreover, performing verification at each stage is a major bottleneck that demands many activities and much time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to the verification and testing of an FPGA-based I and C system design in an NPP, which verifies all design modules of the system simultaneously using MATLAB/Simulink HDL co-simulation models, thereby facilitating the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks

  7. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. Generally, in FPGA design verification, designers write test benches covering the various verification stages of register-transfer level (RTL), gate level, and place and route. Writing test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results; moreover, performing verification at each stage is a major bottleneck that demands many activities and much time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to the verification and testing of an FPGA-based I and C system design in an NPP, which verifies all design modules of the system simultaneously using MATLAB/Simulink HDL co-simulation models, thereby facilitating the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  8. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    The work concerns formal verification of workflow-oriented software models using the deductive approach, considering the formal correctness of a model's behaviour. Manually building logical specifications, which are regarded as sets of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of logical primitive which enables the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
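
    A minimal sketch of the pattern idea — predefined workflow patterns expanded into temporal-logic formulas that make up the logical specification. The pattern set, LTL notation and the toy workflow below are illustrative, not the paper's definitions.

    ```python
    def seq(a: str, b: str) -> str:
        """Sequence pattern: whenever activity a completes, b eventually follows."""
        return f"G({a} -> F {b})"

    def branch(cond: str, a: str, b: str) -> str:
        """Exclusive-choice pattern: the condition selects exactly one branch."""
        return f"G(({cond} -> F {a}) & (!{cond} -> F {b}))"

    # Logical specification generated from a toy BPMN-like workflow model:
    spec = [
        seq("receive_order", "check_stock"),
        branch("in_stock", "ship", "backorder"),
        seq("ship", "send_invoice"),
    ]
    print("\n".join(spec))
    ```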

  9. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  10. Verification of a climate convention: role and limitations of a space based remote sensing system

    International Nuclear Information System (INIS)

    Lanchbery, J.F.; Fischer, W.; Katscher, R.; Stein, G.

    1992-01-01

    Techniques currently under discussion for verifying compliance with an international climate convention and its protocols include space-based remote sensing systems. This paper indicates present and potential verification sectors for such systems, together with the likely technical demands of the verification system user. Space-based remote sensors currently offer the prospect of being used to monitor changes in vegetation on the Earth's surface and could thus be used as tools to verify compliance with a forest protocol and possible land-use or agriculture agreements. After discussing the capabilities and limitations of currently available remote sensing techniques in this type of application, some of the technical requirements of remote sensors in other areas likely to need verifying under a climate convention are explored. 10 refs., 2 tabs

  11. Regional MLEM reconstruction strategy for PET-based treatment verification in ion beam radiotherapy

    International Nuclear Information System (INIS)

    Gianoli, Chiara; Riboldi, Marco; Fattori, Giovanni; Baselli, Giuseppe; Baroni, Guido; Bauer, Julia; Debus, Jürgen; Parodi, Katia; De Bernardi, Elisabetta

    2014-01-01

    In ion beam radiotherapy, PET-based treatment verification provides a consistency check of the delivered treatment with respect to a simulation based on the treatment planning. In this work the region-based MLEM reconstruction algorithm is proposed as a new evaluation strategy in PET-based treatment verification. The comparative evaluation is based on reconstructed PET images in selected regions, which are automatically identified on the expected PET images according to homogeneity in activity values. The strategy was tested on numerical and physical phantoms, simulating mismatches between the planned and measured β+ activity distributions. The region-based MLEM reconstruction was demonstrated to be robust against noise, and the sensitivity of the strategy was comparable to three voxel units, corresponding to 6 mm in the numerical phantoms. The robustness of the region-based MLEM evaluation outperformed the voxel-based strategies. The potential of the proposed strategy was also retrospectively assessed on patient data, and further clinical validation is envisioned. (paper)

  12. Geophysical System Verification (GSV): A Physics-Based Alternative to Geophysical Prove-Outs for Munitions Response. Addendum

    Science.gov (United States)

    2015-09-24

    Final report, July 2009 (SERDP/ESTCP). This document highlights a more rigorous physics-based alternative to geophysical prove-outs for munitions response.

  13. Geophysical System Verification (GSV): A Physics-Based Alternative to Geophysical Prove-Outs for Munitions Response

    Science.gov (United States)

    2015-09-24

    Final report, July 2009 (SERDP/ESTCP). This document highlights a more rigorous physics-based alternative to geophysical prove-outs for munitions response.

  14. A verification approach for crosscutting features based on extension join points

    OpenAIRE

    Coelho, Roberta; Alves, Vander; Kulesza, Uirá; Costa Neto, Alberto; Garcia, Alessandro; Staa, Arndt von; Lucena, Carlos; Borba, Paulo

    2006-01-01

    Recently, one pressing question in the context of product line development is how to improve the modularization and composition of crosscutting features. However, little attention has been paid to the closely related issue of testing crosscutting features. This paper proposes a verification approach for the crosscutting features of a product line based on the use of a previously proposed concept called Extension Join Points.

  15. Formal verification of software-based medical devices considering medical guidelines.

    Science.gov (United States)

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to their internal elements, without considering the interaction that these elements have with other devices or the application environment in which they are used. Medical guidelines define clinical procedures, which contain the information necessary to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model checker. For this purpose, a formalization approach for the UML models using timed automata (TA) is presented. A set of requirements is verified by the proposed approach for a navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open-source Eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one

  16. Time-Contrastive Learning Based DNN Bottleneck Features for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2017-01-01

    In this paper, we present a time-contrastive learning (TCL) based bottleneck (BN) feature extraction method for speech signals with an application to text-dependent (TD) speaker verification (SV). It is well known that speech signals exhibit quasi-stationary behavior in and only in a short interval..., and the TCL method aims to exploit this temporal structure. More specifically, it trains deep neural networks (DNNs) to discriminate temporal events obtained by uniformly segmenting speech signals, in contrast to existing DNN-based BN feature extraction methods that train DNNs using labeled data...

  17. A radar-based verification of precipitation forecast for local convective storms

    Czech Academy of Sciences Publication Activity Database

    Řezáčová, Daniela; Sokol, Zbyněk; Pešice, Petr

    2007-01-01

    Vol. 83, No. 2-4 (2007), pp. 211-224. ISSN 0169-8095. R&D Projects: GA ČR GA205/04/0114; GA AV ČR KJB3042404; GA MŠk OC 717.20. Institutional research plan: CEZ:AV0Z30420517. Keywords: local convective storm; flash flood precipitation; radar-based precipitation estimation; precipitation forecast; radar-based forecast verification. Subject RIV: DG - Atmosphere Sciences, Meteorology. Impact factor: 1.786, year: 2007

  18. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position in the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained prove the suitability of the virtual distances methodology as an alternative procedure for the verification of AACMMs using the indexed metrology platform. PMID:27869722

  19. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position in the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained prove the suitability of the virtual distances methodology as an alternative procedure for the verification of AACMMs using the indexed metrology platform.

  20. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    International Nuclear Information System (INIS)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor

    2013-01-01

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend Frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB code. Results: Gamma index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film

  1. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa (Japan); Juntendo University, Hongo, Tokyo (Japan); Hongo, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa (Japan); Tsukuba University, Tsukuba, Ibaraki (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Hashimoto, H [Shonan Fujisawa Tokushukai Hospital, Fujisawa, Kanagawa (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: There have been few reports on independent dose verification for Tomotherapy. We evaluated the accuracy and effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; the system implements a Clarkson-based dose calculation algorithm using the CT image dataset. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan information was converted into MU and MLC position information at more finely segmented control points. The performance of the SMU was assessed by point dose measurements in non-IMRT and IMRT plans (simple target and mock prostate plans). Subsequently, 30 patients' treatment plans for prostate were compared. Results: From the comparison, dose differences between the SMU and the measurement were within 3% for all cases in the non-IMRT plans. In the IMRT plan for the simple target, the differences (average ± 1 SD) were −0.70 ± 1.10% (SMU vs. TPS), −0.40 ± 0.10% (measurement vs. TPS) and −1.20 ± 1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40 ± 0.60% (SMU vs. TPS), −0.50 ± 0.90% (measurement vs. TPS) and −0.90 ± 0.60% (measurement vs. SMU), respectively. For the patients' plans, the difference was −0.50 ± 2.10% (SMU vs. TPS). Conclusion: A Clarkson-based independent dose verification for Tomotherapy can be clinically available as a secondary check with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
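
    For readers unfamiliar with the Clarkson technique, it estimates the scatter contribution at a point by dividing the irregular field into angular sectors and averaging a scatter-air-ratio (SAR) table over the sector radii. The sketch below illustrates only that sector integration, with a toy SAR table and polygonal field; the SMU itself uses measured machine beam data and further correction factors:

```python
import numpy as np

# Toy scatter-air-ratio table at one depth, indexed by field radius (cm).
sar_radius = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 10.0])
sar_value  = np.array([0.00, 0.05, 0.09, 0.15, 0.19, 0.24])

def sector_radius(vertices, theta):
    """Distance from the calc point (origin) to the field edge along theta."""
    d = np.array([np.cos(theta), np.sin(theta)])
    best = np.inf
    n = len(vertices)
    for i in range(n):
        p, q = vertices[i], vertices[(i + 1) % n]
        e = q - p
        m = np.array([[d[0], -e[0]], [d[1], -e[1]]])
        if abs(np.linalg.det(m)) < 1e-12:
            continue                       # ray parallel to this edge
        t, s = np.linalg.solve(m, p)       # origin + t*d == p + s*e
        if t > 0 and 0 <= s <= 1:
            best = min(best, t)
    return best

# Irregular field as a polygon around the calculation point (cm).
field = [np.array(v) for v in [(-3, -4), (5, -4), (5, 3), (-3, 6)]]

n_sectors = 36
radii = [sector_radius(field, th)
         for th in np.linspace(0, 2 * np.pi, n_sectors, endpoint=False)]
scatter = np.mean(np.interp(radii, sar_radius, sar_value))
print(f"mean sector SAR (scatter contribution): {scatter:.4f}")
```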

  2. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery.Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend Frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with following parameters: Tube voltage—110 kV, tube current—280 mA, pixel size—0.5 × 0.5 and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocking in each shot was made. Dose prescription of 4 Gy at 100% was delivered for the first fraction out of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile a set of films from the same batch were exposed from 0 to 12 Gy doses for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with inhouse written MATLAB codes.Results: Gamma index analysis of film measurement in comparison with TPS calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of film measured and computed dose distribution on sagittal and coronal plane were in close agreement.Conclusions: Through this study, the authors propose treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film.

  3. Development and verification of an agent-based model of opinion leadership.

    Science.gov (United States)

    Anderson, Christine A; Titler, Marita G

    2014-09-27

    The use of opinion leaders is a strategy used to speed the process of translating research into practice. Much is still unknown about opinion leader attributes and activities and the context in which they are most effective. Agent-based modeling is a methodological tool that enables demonstration of the interactive and dynamic effects of individuals and their behaviors on other individuals in the environment. The purpose of this study was to develop and test an agent-based model of opinion leadership. The details of the design and verification of the model are presented. The agent-based model was developed by using a software development platform to translate an underlying conceptual model of opinion leadership into a computer model. Individual agent attributes (for example, motives and credibility) and behaviors (seeking or providing an opinion) were specified as variables in the model in the context of a fictitious patient care unit. The verification process was designed to test whether or not the agent-based model was capable of reproducing the conditions of the preliminary conceptual model. The verification methods included iterative programmatic testing ('debugging') and exploratory analysis of simulated data obtained from execution of the model. The simulation tests included a parameter sweep, in which the model input variables were adjusted systematically, followed by an individual time series experiment. Statistical analysis of the model output for the 288 possible simulation scenarios in the parameter sweep revealed that the agent-based model performed consistently with the posited relationships in the underlying model. Nurse opinion leaders act on the strength of their beliefs and, as a result, become an opinion resource for their uncertain colleagues, depending on their perceived credibility. Over time, some nurses consistently act as this type of resource and have the potential to emerge as opinion leaders in a context where uncertainty exists.
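
    To make the mechanism concrete, the toy agent-based model below lets uncertain agents consult the most credible of a few randomly encountered colleagues, so that frequently consulted agents emerge as opinion leaders. All attribute names and parameters are illustrative stand-ins, not the authors' published model:

```python
import random

random.seed(1)

class Nurse:
    def __init__(self):
        self.credibility = random.uniform(0, 1)   # perceived credibility
        self.uncertainty = random.uniform(0, 1)   # probability of seeking advice
        self.times_consulted = 0

unit = [Nurse() for _ in range(30)]               # fictitious patient care unit

for step in range(1000):
    for seeker in unit:
        if random.random() < seeker.uncertainty:
            # Seek an opinion; more credible colleagues are chosen more often.
            provider = max(random.sample(unit, 3), key=lambda n: n.credibility)
            if provider is not seeker:
                provider.times_consulted += 1
                seeker.uncertainty *= 0.999       # advice slowly reduces uncertainty

# Agents consulted most often play the opinion-leader role.
for n in sorted(unit, key=lambda n: n.times_consulted, reverse=True)[:3]:
    print(f"credibility={n.credibility:.2f} consulted={n.times_consulted}")
```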

  4. Nonlinear dynamic analysis of multi-base seismically isolated structures with uplift potential II: verification examples

    Science.gov (United States)

    Roussis, Panayiotis C.; Tsopelas, Panos C.; Constantinou, Michael C.

    2010-03-01

    The work presented in this paper serves as numerical verification of the analytical model developed in the companion paper for nonlinear dynamic analysis of multi-base seismically isolated structures. To this end, two numerical examples have been analyzed using the computational algorithm incorporated into program 3D-BASIS-ME-MB, developed on the basis of the newly-formulated analytical model. The first example concerns a seven-story model structure that was tested on the earthquake simulator at the University at Buffalo and was also used as a verification example for program SAP2000. The second example concerns a two-tower, multi-story structure with a split-level seismic-isolation system. For purposes of verification, key results produced by 3D-BASIS-ME-MB are compared to experimental results, or results obtained from other structural/finite element programs. In both examples, the analyzed structure is excited under conditions of bearing uplift, thus yielding a case of much interest in verifying the capabilities of the developed analysis tool.

  5. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
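
    As a flavor of the cryptographic building blocks involved, the sketch below chains SHA-256 hashes over evidence records so that any later tampering is detectable. It is a minimal stand-in only: PKIDEV itself relies on digital signatures and a trusted time-stamp authority rather than local hashing, and the record fields here are invented:

```python
import hashlib, json, time

def add_record(chain, evidence_bytes, collector):
    """Append an evidence record whose hash covers the previous record."""
    prev = chain[-1]["record_hash"] if chain else "0" * 64
    record = {
        "evidence_sha256": hashlib.sha256(evidence_bytes).hexdigest(),
        "collector": collector,
        "utc_time": time.time(),
        "prev_hash": prev,
    }
    record["record_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

def verify_chain(chain):
    """Recompute every hash; any modified or reordered record breaks the chain."""
    prev = "0" * 64
    for rec in chain:
        body = {k: v for k, v in rec.items() if k != "record_hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev_hash"] != prev or digest != rec["record_hash"]:
            return False
        prev = rec["record_hash"]
    return True

chain = []
add_record(chain, b"disk image sector dump", "officer-17")
add_record(chain, b"network capture", "officer-17")
print("chain intact:", verify_chain(chain))
```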

  6. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    Science.gov (United States)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  7. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    Science.gov (United States)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. The customization actions of SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions, and then proposes a verification algorithm to assure that each step in customization will not cause unpredictable effects on the system and will follow the related rules defined by the SaaS provider.

  8. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems have been developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. Therefore, it is widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that suggest traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and not uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process to show that the reasoning path is correct.
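
    The anomaly classes named above can be illustrated on a toy propositional rule base: the sketch below flags contradictory conclusions, cyclic inference chains, and premises that nothing can establish. COKEP itself models the knowledge base as an extended Petri net rather than this simple dependency graph, and the example rules are invented:

```python
# Rules as (set of premises, conclusion); "~p" denotes the negation of p.
rules = [
    ({"high_pressure", "valve_open"}, "close_valve"),
    ({"close_valve"}, "pressure_drops"),
    ({"pressure_drops"}, "close_valve"),                 # circular with rule 2
    ({"high_pressure", "valve_open"}, "~close_valve"),   # contradicts rule 1
]
facts = {"high_pressure", "valve_open"}                  # declared inputs

# Inconsistency: identical premises deriving both p and ~p.
for (pa, ca) in rules:
    for (pb, cb) in rules:
        if pa == pb and cb == "~" + ca:
            print("contradiction:", ca, "vs", cb, "from", sorted(pa))

# Circularity: depth-first search for cycles in the premise->conclusion graph.
graph = {}
for prem, concl in rules:
    for p in prem:
        graph.setdefault(p, set()).add(concl)

def has_cycle(node, path, seen):
    if node in path:
        return True
    if node in seen:
        return False
    seen.add(node)
    return any(has_cycle(n, path | {node}, seen) for n in graph.get(node, ()))

print("circular chain:", any(has_cycle(n, set(), set()) for n in graph))

# Incompleteness: premises that are neither input facts nor any conclusion.
derivable = facts | {c for _, c in rules}
dangling = {p for prem, _ in rules for p in prem} - derivable
print("unreachable premises:", dangling or "none")
```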

  9. Content-based Image Hiding Method for Secure Network Biometric Verification

    Directory of Open Access Journals (Sweden)

    Xiangjiu Che

    2011-08-01

    For secure biometric verification, most existing methods embed biometric information directly into the cover image, but content correlation analysis between the biometric image and the cover image is often ignored. In this paper, we propose a novel biometric image hiding approach based on content correlation analysis to protect the image transmitted over the network. Using principal component analysis (PCA), the content correlation between the biometric image and the cover image is first analyzed. Then, based on a particle swarm optimization (PSO) algorithm, some regions of the cover image are selected to represent the biometric image, so that the cover image can carry partial content of the biometric image. Following the correlation analysis, the unrepresented part of the biometric image is embedded into the cover image by using the discrete wavelet transform (DWT). Combined with a human visual system (HVS) model, this approach makes the hiding result perceptually invisible. Extensive experimental results demonstrate that the proposed hiding approach is robust against some common frequency and geometric attacks; it also provides effective protection for secure biometric verification.
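
    The final embedding step can be pictured with PyWavelets: decompose the cover, perturb one subband with the payload bits, and reconstruct. The sketch below performs only that additive HH-subband embedding on synthetic data; the paper's PCA/PSO region selection and HVS weighting are omitted, and extraction here is non-blind (it uses the cover's coefficients):

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
cover = rng.uniform(0, 255, size=(64, 64))      # stand-in for the cover image
payload = rng.integers(0, 2, size=(32, 32))     # binary biometric fragment

alpha = 4.0                                     # embedding strength
cA, (cH, cV, cD) = pywt.dwt2(cover, "haar")
stego = pywt.idwt2((cA, (cH, cV, cD + alpha * (2 * payload - 1))), "haar")

# Extraction: re-decompose and compare against the cover's HH subband.
_, (_, _, cD_stego) = pywt.dwt2(stego, "haar")
recovered = (cD_stego - cD > 0).astype(int)
print("bit errors:", int(np.sum(recovered != payload)))
```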

  10. Verification of the two-dimensional hydrodynamic model based on remote sensing

    Science.gov (United States)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used more and more actively to evaluate possible damage, identify potential flood zones and assess the influence of individual factors affecting the river during the passage of a flood. Calculations were performed by means of the domestic software package «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model used. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method of verification of a hydrodynamic model based on a comparison of the actual flood zone area, which in turn is determined on the basis of automated satellite image interpretation methods for different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the Northern Dvina River, and the Amur River near Blagoveshchensk. We used satellite images acquired by optical and radar sensors: SPOT-5/HRG, Resurs-F, Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level at a given date for the upper and lower limits of the model, respectively, it is possible to calculate the flooded area by means of the STREAM-2D program and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
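
    Once both extents are rasterized, verification reduces to overlap statistics between the observed and simulated flood masks. A minimal sketch with synthetic, already co-registered binary masks is shown below (classification of the imagery itself happens upstream, e.g. by ISODATA):

```python
import numpy as np

rng = np.random.default_rng(42)
observed = rng.random((200, 200)) < 0.30      # mask from image classification
simulated = observed.copy()
simulated[:10] = ~simulated[:10]              # perturb to mimic model error

inter = np.logical_and(observed, simulated).sum()
union = np.logical_or(observed, simulated).sum()
print(f"flooded area observed : {observed.sum()} px")
print(f"flooded area simulated: {simulated.sum()} px")
print(f"fit index |A∩B|/|A∪B| = {inter / union:.3f}")
```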

  11. Verification of gamma knife based fractionated radiosurgery with newly developed head-thorax phantom

    International Nuclear Information System (INIS)

    Bisht, Raj Kishor; Kale, Shashank Sharad; Natanasabapathi, Gopishankar; Singh, Manmohan Jit; Agarwal, Deepak; Garg, Ajay; Rath, Goura Kishore; Julka, Pramod Kumar; Kumar, Pratik; Thulkar, Sanjay; Sharma, Bhawani Shankar

    2016-01-01

    Objective: The purpose of the study is to verify Gamma Knife Extend™ system (ES) based fractionated stereotactic radiosurgery with a newly developed head-thorax phantom. Methods: Phantoms are extensively used to measure radiation dose and verify treatment plans in radiotherapy. A human upper-body-shaped phantom with thorax was designed to simulate fractionated stereotactic radiosurgery using the Extend™ system of the Gamma Knife. The central component of the phantom aids in performing radiological precision tests, dosimetric evaluation and treatment verification. A hollow right circular cylindrical space of diameter 7.0 cm was created at the centre of this component to place various dosimetric devices using suitable adaptors. The phantom is made of poly(methyl methacrylate) (PMMA), a transparent thermoplastic material. Two sets of disk assemblies were designed to place dosimetric films in (1) horizontal (xy) and (2) vertical (xz) planes. Specific cylindrical adaptors were designed to place a thimble ionization chamber inside the phantom for point dose recording along the xz axis. EBT3 Gafchromic films were used to analyze and map the radiation field. The focal precision test was performed using a 4 mm collimator shot in the phantom to check the radiological accuracy of treatment. The phantom head position within the Extend™ frame was estimated using encoded aperture measurement of the repositioning check tool (RCT). For treatment verification, the phantom with inserts for film and ion chamber was scanned in the reference treatment position using an X-ray computed tomography (CT) machine, and the acquired stereotactic images were transferred into Leksell GammaPlan (LGP). A patient treatment plan with a hypo-fractionated regimen was delivered, and identical fractions were compared using EBT3 films and in-house MATLAB codes. Results: RCT measurement showed an overall positional accuracy of 0.265 mm (range 0.223 mm–0.343 mm). Gamma index analysis across fractions exhibited close agreement between LGP and film

  12. Comparison of carina-based versus bony anatomy-based registration for setup verification in esophageal cancer radiotherapy.

    Science.gov (United States)

    Machiels, Mélanie; Jin, Peng; van Gurp, Christianne H; van Hooft, Jeanin E; Alderliesten, Tanja; Hulshof, Maarten C C M

    2018-03-21

    To investigate the feasibility and geometric accuracy of carina-based registration for CBCT-guided setup verification in esophageal cancer IGRT, compared with the current practice of bony anatomy-based registration. Included were 24 esophageal cancer patients with 65 implanted fiducial markers visible on planning CTs and follow-up CBCTs. All available CBCT scans (n = 236) were rigidly registered to the planning CT with respect to the bony anatomy and the carina. Target coverage was visually inspected, and marker position variation was quantified relative to both registration approaches; the variation of systematic (Σ) and random (σ) errors was estimated. Automatic carina-based registration was feasible in 94.9% of the CBCT scans, with adequate target coverage in 91.1%, compared to 100% after bony anatomy-based registration. Overall, Σ (σ) in the LR/CC/AP directions was 2.9 (2.4)/4.1 (2.4)/2.2 (1.8) mm using the bony anatomy registration, compared to 3.3 (3.0)/3.6 (2.6)/3.9 (3.1) mm for the carina. Markers placed mid-thoracically showed a smaller, though non-significant, Σ in the CC and AP directions when using carina-based registration. Compared with bony anatomy-based registration, carina-based registration for esophageal cancer IGRT results in inadequate target coverage in 8.9% of cases. Furthermore, larger Σ and σ, requiring larger anisotropic margins, were seen after carina-based registration. Only for tumors entirely confined to the mid-thoracic region might carina-based registration be slightly favorable.
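
    The Σ and σ quoted above follow the usual population decomposition of setup errors: Σ is the standard deviation of the per-patient mean errors, and σ is the root-mean-square of the per-patient standard deviations. A minimal sketch on synthetic per-fraction offsets:

```python
import numpy as np

rng = np.random.default_rng(7)
n_patients, n_fractions = 24, 10
# Residual CC offsets (mm) after registration, one row per patient:
# each patient gets a persistent (systematic) shift plus daily (random) noise.
offsets = rng.normal(loc=rng.normal(0, 3.5, (n_patients, 1)),
                     scale=2.5, size=(n_patients, n_fractions))

patient_means = offsets.mean(axis=1)
patient_sds = offsets.std(axis=1, ddof=1)

Sigma = patient_means.std(ddof=1)           # systematic error
sigma = np.sqrt(np.mean(patient_sds ** 2))  # random error
print(f"Sigma = {Sigma:.1f} mm, sigma = {sigma:.1f} mm")
```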

  13. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu

    2007-01-01

    In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 computers in a network. With such a system size, software and hardware failures are quite frequent. To minimize system downtime, the Trigger-DAQ control system shall include advanced verification and diagnostic facilities. The operator should use tests and the expertise of the TDAQ and detector developers in order to diagnose and recover from errors, if possible automatically. The TDAQ control system is built as a distributed tree of controllers, where the behavior of each controller is defined in a rule-based language allowing easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system with different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  14. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Waterway and canal systems are particularly cost effective in the transport of bulk and containerized goods to support global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy. Failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from systems thinking, supported by systems engineering techniques. We propose a multi-level, multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for performance assessment and the formal verification of the correctness of functionality. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  15. PLM-based Approach for Design Verification and Validation using Manufacturing Process Knowledge

    Directory of Open Access Journals (Sweden)

    Luis Toussaint

    2010-02-01

    Out of 100 hours of engineering work, only 20 are dedicated to real engineering and 80 are spent on what is considered routine activity. Readjusting the ratio of innovative vs. routine work is a considerable challenge in a product lifecycle management (PLM) strategy. Therefore, the main objective is to develop an approach to accelerate routine processes in engineering design. The proposed methodology, called FabK, consists of capturing manufacturing knowledge and applying it to the design verification and validation of new engineering designs. The approach is implemented in a Web-based PLM prototype and a Computer Aided Design system. A series of experiments from an industrial case study is introduced to provide significant results.

  16. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  17. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    International Nuclear Information System (INIS)

    Lee, Se Ho; Lee, Seung Wook; Han, Su Chul; Park, Seung Woo

    2016-01-01

    Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Inks used as the 3D printer materials were selected to correspond to mouse tissues. To represent the lung, the selected material was partially used together with an air layer. To verify material equivalence, the photon attenuation characteristics were simply compared with those of a super-flex bolus. In the case of the lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing techniques continue to advance, 3D printer based small preclinical animal phantoms will increase the reliability of absorbed dose verification in small animals for preclinical studies.
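
    The tissue-equivalence check above rests on the Hounsfield scale, HU = 1000 (μ − μ_water)/μ_water for linear attenuation coefficient μ. A minimal sketch with illustrative coefficients, not the study's measured values:

```python
# Toy linear attenuation coefficients (1/cm) at a diagnostic energy.
mu_water = 0.206
materials = {
    "lung substitute (ink + air)": 0.055,
    "soft-tissue ink": 0.210,
    "bone-like ink": 0.380,
}
for name, mu in materials.items():
    hu = 1000.0 * (mu - mu_water) / mu_water
    print(f"{name:30s} HU = {hu:7.1f}")
```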

  18. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

    Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Inks used as the 3D printer materials were selected to correspond to mouse tissues. To represent the lung, the selected material was partially used together with an air layer. To verify material equivalence, the photon attenuation characteristics were simply compared with those of a super-flex bolus. In the case of the lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing techniques continue to advance, 3D printer based small preclinical animal phantoms will increase the reliability of absorbed dose verification in small animals for preclinical studies.

  19. Performance evaluation of wavelet-based face verification on a PDA recorded database

    Science.gov (United States)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison with identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequate secure infrastructure for sensitive and financial transactions by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios when communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster, the luxury of fixed infrastructure is not available or has been destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.

  20. Operational verification of a blow out preventer utilizing fiber Bragg grating based strain gauges

    Science.gov (United States)

    Turner, Alan L.; Loustau, Philippe; Thibodeau, Dan

    2015-05-01

    Ultra-deep water BOP (blowout preventer) operation poses numerous challenges in obtaining accurate knowledge of current system integrity and component condition; a salient example is the difficulty of verifying closure of the pipe and shearing rams during and after well control events. Ascertaining the integrity of these functions is currently based on a manual volume measurement performed with a stopwatch. Advances in sensor technology now permit more accurate methods of BOP condition monitoring. Fiber optic sensing technology, and particularly fiber optic strain gauges, have evolved to a point where we can derive a good representation of what is happening inside a BOP by installing sensors on the outside shell. Function signatures can be baselined to establish thresholds that indicate successful function activation. Based on this knowledge base, signal variation over time can then be utilized to assess degradation of these functions and subsequent failure to function. Monitoring the BOP from the outside has the advantage of gathering data through a system that can be interfaced with risk-based integrity management software and/or a smart monitoring system that analyzes BOP control redundancies without the requirement of interfacing with OEM control systems. The paper will present the results of ongoing work on a fully instrumented 13-½" 10,000 psi pipe ram. Instrumentation includes commonly used pressure transducers, accelerometers, flow meters, and optical strain gauges. Correlation will be presented between flow, pressure and acceleration signatures and the fiber optic strain gauge's response as it relates to functional verification and component-level degradation trending.
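
    For context, a fiber Bragg grating converts strain into a shift of its reflected Bragg wavelength, Δλ_B/λ_B ≈ (1 − p_e)ε. The sketch below applies that relation with a typical silica photo-elastic coefficient and an invented wavelength shift; fielded gauges need per-sensor calibration and temperature compensation:

```python
# Convert a temperature-compensated FBG wavelength shift to strain.
lambda_b = 1550.0e-9      # Bragg wavelength (m)
p_e = 0.22                # effective photo-elastic coefficient for silica
delta_lambda = 1.86e-12   # measured shift (m), e.g. during ram closure

strain = delta_lambda / (lambda_b * (1.0 - p_e))
print(f"strain = {strain * 1e6:.1f} microstrain")
```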

  1. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Science.gov (United States)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption, where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database, and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on the encrypted templates of the users in the database. In addition, the security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running quad-core 3.2 GHz CPUs at a 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
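
    The matching rule itself is just a thresholded Hamming distance over fixed-size binary templates; THRIVE's contribution is evaluating it on homomorphically randomized ciphertexts. The plaintext sketch below shows only the decision rule, on synthetic 256-bit templates:

```python
import numpy as np

rng = np.random.default_rng(3)
enrolled = rng.integers(0, 2, 256)              # 256-bit binary template
query = enrolled.copy()
flip = rng.choice(256, size=20, replace=False)
query[flip] ^= 1                                # noisy fresh capture

threshold = 64                                  # accept if < 25% of bits differ
hd = int(np.sum(enrolled ^ query))
print(f"Hamming distance = {hd}; verified: {hd < threshold}")
```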

  2. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Due to its easy accessibility, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many non-significant species at concentrations orders of magnitude higher. The extreme requirements of measurement sensitivity, dynamic range and specificity make the method development extremely challenging. Current clinical protein biomarker measurement relies primarily on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement an extremely low-throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput and quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  3. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    International Nuclear Information System (INIS)

    Azmy, Yousry; Wang, Yaqi

    2013-01-01

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code's numerical solution and its reference counterpart. The latter is either analytic or very fine-mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory's Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.
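
    The convergence-rate bookkeeping in the first phase is simple: for error norms E on meshes of characteristic size h, the observed order of accuracy is p = log(E_coarse/E_fine)/log(h_coarse/h_fine). A sketch with made-up norms:

```python
import math

h = [0.40, 0.20, 0.10, 0.05]             # characteristic mesh sizes
err = [2.1e-2, 5.4e-3, 1.38e-3, 3.5e-4]  # integral error norms vs. reference

for i in range(1, len(h)):
    p = math.log(err[i - 1] / err[i]) / math.log(h[i - 1] / h[i])
    print(f"h={h[i]:.2f}: observed order p = {p:.2f}")
```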

  4. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in distributed control systems. To improve the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, in order to solve the state space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains online, given the real-time nature of the train control system. Furthermore, we construct the LHA formal models of the train tracking model and the movement authority (MA) generation process as case studies to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters to avoid collision between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both normal and abnormal conditions influenced by the packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time features of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic timed Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor the high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.

  5. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Azmy, Yousry [North Carolina State Univ., Raleigh, NC (United States); Wang, Yaqi [North Carolina State Univ., Raleigh, NC (United States)

    2013-12-20

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code’s numerical solution and its reference counterpart. The latter is either analytic or very fine-mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory’s Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.

  6. Video-based cargo fire verification system with fuzzy inference engine for commercial aircraft

    Science.gov (United States)

    Sadok, Mokhtar; Zakrzewski, Radek; Zeliff, Bob

    2005-02-01

    Conventional smoke detection systems currently installed onboard aircraft are often subject to high rates of false alarms. Under current procedures, whenever an alarm is issued the pilot is obliged to release fire extinguishers and to divert to the nearest airport. Aircraft diversions are costly and dangerous in some situations. A reliable detection system that minimizes the false-alarm rate and allows continuous monitoring of cargo compartments is highly desirable. A video-based system has recently been developed by Goodrich Corporation to address this problem. The Cargo Fire Verification System (CFVS) is a multi-camera system designed to provide live streaming video to the cockpit crew and to perform hotspot, fire, and smoke detection in aircraft cargo bays. In addition to video frames, the CFVS uses other sensor readings to discriminate between genuine events such as fire or smoke and nuisance alarms such as fog or dust. A Mamdani-type fuzzy inference engine is developed to provide approximate reasoning for decision making. In one implementation, Gaussian membership functions for frame intensity-based features, relative humidity, and temperature are constructed using experimental data to form the system's inference engine. The CFVS performed better than conventional aircraft smoke detectors in all standardized tests.
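
    A Mamdani engine of the kind described maps sensor readings through Gaussian memberships, fires rules by min/max composition, and defuzzifies by centroid. The sketch below is a two-input, two-rule toy; the CFVS also uses temperature, and its membership parameters were fitted to experimental data rather than chosen by hand:

```python
import numpy as np

def gauss(x, mean, sd):
    return np.exp(-0.5 * ((x - mean) / sd) ** 2)

risk = np.linspace(0, 1, 101)            # output universe: alarm risk
risk_low = gauss(risk, 0.15, 0.12)
risk_high = gauss(risk, 0.85, 0.12)

def infer(intensity, humidity):
    smoke_like = gauss(intensity, 0.8, 0.15)   # bright/obscured frames
    fog_like = gauss(humidity, 0.9, 0.08)      # near-condensing air
    # Rule 1: smoke-like frames AND not-foggy air -> high risk.
    w1 = min(smoke_like, 1.0 - fog_like)
    # Rule 2: foggy air -> low risk (likely nuisance alarm).
    w2 = fog_like
    # Max aggregation of clipped consequents, then centroid defuzzification.
    agg = np.maximum(np.minimum(w1, risk_high), np.minimum(w2, risk_low))
    return (risk * agg).sum() / agg.sum()

print(f"risk (smoke-like, dry)   = {infer(0.85, 0.30):.2f}")
print(f"risk (bright, humid/fog) = {infer(0.85, 0.92):.2f}")
```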

  7. 4D offline PET-based treatment verification in ion beam therapy. Experimental and clinical evaluation

    International Nuclear Information System (INIS)

    Kurz, Christopher

    2014-01-01

    Due to the achievable sharp dose gradients, external beam radiotherapy with protons and heavier ions enables a highly conformal adaptation of the delivered dose to arbitrarily shaped tumour volumes. However, this high conformity is accompanied by an increased sensitivity to potential uncertainties, e.g., due to changes in the patient anatomy. Additional challenges are imposed by respiratory motion, which not only leads to rapid changes of the patient anatomy but, in the case of actively scanned ion beams, also to the formation of dose inhomogeneities. Therefore, it is highly desirable to verify the actual application of the treatment and to detect possible deviations with respect to the planned irradiation. At present, the only clinically implemented approach for a close-in-time verification of single treatment fractions is based on detecting the distribution of β+-emitters formed in nuclear fragmentation reactions during the irradiation by means of positron emission tomography (PET). For this purpose, a commercial PET/CT (computed tomography) scanner has been installed directly next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). Up to the present, the application of this treatment verification technique has, however, been limited to static target volumes. This thesis aimed at investigating the feasibility and performance of PET-based treatment verification under consideration of organ motion. In experimental irradiation studies with moving phantoms, not only could the practicability of PET-based treatment monitoring for moving targets using a commercial PET/CT device be shown for the first time, but also the potential of this technique to detect motion-related deviations from the planned treatment with sub-millimetre accuracy. The first application to four exemplary hepato-cellular carcinoma patient cases under substantially more challenging clinical conditions indicated potential for improvement by taking organ motion into account.

  8. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Science.gov (United States)

    Yin, Peili; Wang, Jianhua; Lu, Chunxia

    2017-08-01

    Validity and correctness test verification of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision was taken as the virtual workpiece, and a universal collision detection model was established. The whole-process simulation of workpiece measurement is implemented by the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking an involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master in a GMI. The experimental results indicate consistency of the tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new, ideal platform for testing complex workpiece-measuring software without calibrated artifacts.

  9. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    International Nuclear Information System (INIS)

    Mao, S P; Rottenberg, X; Rochus, V; Czarnecki, P; Helin, P; Severi, S; Tilmans, H A C; Nauwelaers, B

    2017-01-01

    This paper presents a lumped equivalent circuit model and its verification for both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, and thus a wide frequency range response of the cMUT cell can be simulated by our equivalent circuit model. The importance of the cross-modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper the development of this model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). This model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, and this holds for a single cMUT cell operated in either transmission or reception. Results obtained from the model also closely match the experimental results for the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells. (paper)

  10. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    Science.gov (United States)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents a lumped equivalent circuit model and its verification for both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, and thus a wide frequency range response of the cMUT cell can be simulated by our equivalent circuit model. The importance of the cross-modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper the development of this model is illustrated only for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). This model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results, and this holds for a single cMUT cell operated in either transmission or reception. Results obtained from the model also closely match the experimental results for the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.

  11. Fusion of hand vein, iris and fingerprint for person identity verification based on Bayesian theory

    Science.gov (United States)

    Li, Xiuyan; Liu, Tiegen; Deng, Shichao; Wang, Yunxin

    2009-11-01

    Biometric identification is an important safeguard for public security. In recent years, with social and economic development, greater accuracy and security of identification have been required. Person identity verification systems that use a single biometric exhibit inherent limitations in accuracy, user acceptance, and universality. The limitations of unimodal biometric systems can be overcome by using multimodal biometric systems, which combine the conclusions made by a number of unrelated biometric indicators. Addressing the limitations of unimodal biometric identification, a recognition algorithm for multimodal biometric fusion based on hand vein, iris and fingerprint was proposed. To verify a person's identity, the hand vein images, iris images and fingerprint images were first preprocessed. The region of interest (ROI) of the hand vein image was obtained and filtered to reduce image noise. Multiresolution analysis theory was utilized to extract the texture information of the hand vein. The iris image was preprocessed through iris localization, eyelid detection, image normalization and image enhancement, and then the iris feature code was extracted from the detail images obtained using the wavelet transform. The texture feature information representing the fingerprint pattern was extracted after filtering and image enhancement. Bayes' theorem was employed to realize fusion at the matching score level, and the fusion recognition result was finally obtained. Experimental results were presented which showed that the recognition performance of the proposed fusion method was markedly higher than that of single-biometric recognition algorithms, verifying the efficiency of the proposed method for biometrics.
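
    Score-level Bayesian fusion of independent modalities amounts to multiplying per-modality likelihood ratios into the posterior odds of a genuine match. The sketch below assumes Gaussian genuine/impostor score models with illustrative parameters, not the paper's fitted values:

```python
import math

def gauss_pdf(x, mean, sd):
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# (genuine mean, genuine sd, impostor mean, impostor sd) per modality.
models = {"vein":   (0.80, 0.10, 0.45, 0.12),
          "iris":   (0.85, 0.08, 0.40, 0.10),
          "finger": (0.75, 0.12, 0.50, 0.13)}
scores = {"vein": 0.72, "iris": 0.81, "finger": 0.58}

prior_genuine = 0.5
log_odds = math.log(prior_genuine / (1 - prior_genuine))
for mod, s in scores.items():
    gm, gs, im, isd = models[mod]
    # Independence assumption: likelihood ratios multiply (add in log space).
    log_odds += math.log(gauss_pdf(s, gm, gs) / gauss_pdf(s, im, isd))

posterior = 1.0 / (1.0 + math.exp(-log_odds))
print(f"P(genuine | scores) = {posterior:.4f}")
```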

  12. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI. The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  13. Energy Efficient Fuzzy Adaptive Verification Node Selection-Based Path Determination in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Muhammad Akram

    2017-10-01

    Wireless sensor networks are supplied with limited energy resources and are usually installed in unattended and unfriendly environments. These networks are also highly exposed to security attacks aimed at draining the energy of the network to render it unresponsive. Adversaries launch counterfeit report injection attacks and false vote injection attacks through compromised sensor nodes. Several filtering solutions have been suggested for detecting and filtering false reports during the multi-hop forwarding process. However, almost all such schemes presuppose a conventional underlying protocol for data routing that does not consider the attack status or energy dissipation on the route. Each design provides approximately equivalent resilience in terms of protection against compromised nodes, but the energy consumption characteristics of each design differ. We propose a fuzzy adaptive path selection to save energy and avoid the emergence of favored paths. Fresh authentication keys are generated periodically, and these are shared with the filtering nodes to restrict compromised intermediate filtering nodes from the verification process. The scheme helps delay the emergence of hotspot problems near the base station and exhibits improved energy-conserving behavior in wireless sensor networks. The proposed scheme provides an extended network lifetime and better false data filtering capacity.

  14. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    Science.gov (United States)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration testing (IT), mission operations training, verification and validation (V&V), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETUs). Considering the difficulty of providing a hardware test-bed to each developer and tester, hardware models are built based upon characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that the FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between the FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for V&V of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace.

  15. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    Science.gov (United States)

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-12-01

    The Microgravity active vibration isolation system (MAIS) is a device to reduce on-orbit vibration and to provide a lower gravity level for certain scientific experiments. The MAIS system is made up of a stator and a floater; the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to reduce the vibration transmitted from the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits, and a central controller embedded with the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module is used for electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from the accelerometers, measures the displacement between stator and floater from position-sensitive detectors, and computes a Lorentz force current for each actuator so as to eliminate the vibration of the scientific payload while avoiding collision between the stator and the floater. This is a motion control technique in 6 degrees of freedom (6-DOF), and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the opportunity to join the DLR 27th parabolic flight campaign to perform experiments verifying the 6-DOF control technique. The experiment results validate that the 6-DOF motion control technique is effective, and the vibration isolation performance matches what we expected based on theoretical analysis and simulation. MAIS is planned for Chinese manned spacecraft for many microgravity scientific experiments, and verification on parabolic flights is very important for its upcoming mission. Additionally, we also tested some additional functions of microgravity electromagnetic suspension, such as automatic catching and locking, and operation in fault mode. The parabolic flights produced much useful data for these experiments.
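    The control principle described above can be sketched for a single axis (the real MAIS controller, its gains, and the allocation across the 8 Lorentz actuators are not given in the abstract; the gains below are arbitrary): feed back the sensed floater acceleration to cancel vibration, plus a PD term on the stator-floater gap to prevent contact:

    ```python
    def actuator_current(accel, gap_error, gap_rate,
                         k_acc=2.0, k_p=40.0, k_d=6.0):
        """One-axis sketch of the MAIS control idea: oppose the measured
        floater acceleration to damp vibration, and add a PD term on the
        stator-floater displacement so the floater never drifts into the
        stator. Returns a commanded Lorentz-actuator current (arbitrary
        units); all gains are illustrative."""
        vibration_term = -k_acc * accel
        centering_term = -k_p * gap_error - k_d * gap_rate
        return vibration_term + centering_term

    # Example: floater accelerating at 0.01 m/s^2, 1 mm off-center, drifting.
    print(actuator_current(accel=0.01, gap_error=0.001, gap_rate=0.0002))
    ```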

  16. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    Science.gov (United States)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, allowing output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so that both require the same physical resources, in contrast to the more demanding verification protocols seen elsewhere in the literature.

  17. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available Handwritten signatures are broadly utilized for personal verification in financial institutions, which underscores the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria used to judge a signature verification tool in banking and other financial institutions.
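    A minimal sketch of the feature-matching stage using OpenCV (the paper's own pipeline, thresholds, and pre-processing are not reproduced here; SIFT is used alone since SURF is patent-encumbered and absent from default OpenCV builds):

    ```python
    import cv2

    def match_score(img_path_a, img_path_b, ratio=0.75):
        """Extract SIFT keypoints from two signature images and count
        ratio-test matches; a higher score suggests the same signer."""
        img_a = cv2.imread(img_path_a, cv2.IMREAD_GRAYSCALE)
        img_b = cv2.imread(img_path_b, cv2.IMREAD_GRAYSCALE)
        sift = cv2.SIFT_create()
        _, des_a = sift.detectAndCompute(img_a, None)
        _, des_b = sift.detectAndCompute(img_b, None)
        if des_a is None or des_b is None:
            return 0
        matches = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
        good = [p[0] for p in matches
                if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        return len(good)

    # Verification decision: compare a probe against the enrolled template
    # (the threshold would be tuned on genuine/forged pairs; filenames are
    # hypothetical).
    # print(match_score("probe.png", "template.png") > 25)
    ```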

  18. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    International Nuclear Information System (INIS)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J; Watkins, W

    2016-01-01

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares them with 1024×768-resolution EPID frames acquired at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and by utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm² and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software, with

  19. Film based verification of calculation algorithms used for brachytherapy planning-getting ready for upcoming challenges of MBDCA

    Directory of Open Access Journals (Sweden)

    Grzegorz Zwierzchowski

    2016-08-01

    Full Text Available Purpose: A well-known defect of TG-43-based algorithms used in brachytherapy is the lack of information about interaction cross-sections, which are determined not only by electron density but also by atomic number. TG-186 recommendations, with the use of MBDCA (model-based dose calculation algorithms), accurate tissue segmentation, and the structures' elemental composition, continue to create difficulties in brachytherapy dosimetry. For the clinical use of new algorithms, it is necessary to introduce reliable and repeatable methods of treatment planning system (TPS) verification. The aim of this study is the verification of the calculation algorithm used in a TPS for shielded vaginal applicators, as well as the development of verification procedures for current and future use, based on the film dosimetry method. Material and methods: Calibration data were collected by separately irradiating 14 sheets of Gafchromic® EBT film with doses from 0.25 Gy to 8.0 Gy using an HDR 192Ir source. Standard vaginal cylinders of three diameters were used in a water phantom. Measurements were performed without any shields and with three shield combinations. Gamma analyses were performed using the VeriSoft® package. Results: The calibration curve was determined to be of third-degree polynomial type. For all used diameters of the unshielded cylinder and for all shield combinations, gamma analyses were performed and showed that over 90% of analyzed points meet the gamma criteria (3%, 3 mm). Conclusions: Gamma analysis showed good agreement between dose distributions calculated using the TPS and measured by Gafchromic films, thus showing the viability of using film dosimetry in brachytherapy.
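    The reported third-degree polynomial calibration can be reproduced in outline as follows; the film response values below are made up for illustration, with only the dose range (0.25–8.0 Gy) and polynomial degree taken from the abstract:

    ```python
    import numpy as np

    # Hypothetical calibration points: net optical density of EBT film
    # sheets vs. delivered dose in Gy (0.25-8.0 Gy range, as in the paper).
    dose = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
    net_od = np.array([0.02, 0.05, 0.10, 0.19, 0.35, 0.48, 0.58])

    # Third-degree polynomial fitted so dose can be read back from the
    # measured film response.
    coeffs = np.polyfit(net_od, dose, deg=3)
    calibration = np.poly1d(coeffs)

    print(calibration(0.30))  # dose (Gy) for a measured net OD of 0.30
    ```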

  20. Requirements verification and validation of operating system software for a PLC-based plant protection system prototype

    Energy Technology Data Exchange (ETDEWEB)

    Cha, K. H.; Lee, Y. J.; Cheon, S. W.; Son, H. S.; Kim, J. Y.; Lee, J. S.; Kwon, K. C. [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    This paper describes the Requirements Verification and Validation (V and V) of the operating system (OS) software developed for a Programmable Logic Controller (PLC)-based digital Plant Protection System (PPS) prototype in the Korea Nuclear Instrumentation and Control System (KNICS) project. Since the OS is newly developed software, lifecycle V and V is applied, and the software V and V criteria and requirements in the Software Review Plan (SRP)/BTP-14, the IEEE Std. 7-4.3.2, the IEEE Std. 1012, and the IEEE Std. 1028 are applied systematically and strictly at each lifecycle phase. Checklist-based Fagan inspection has mainly been applied for requirements V and V, while model checking is applied for formal verification and HAZOP is applied for identification of safety requirements. The checklist-based V and V procedure was very effective for systematic requirements V and V of the OS software, and the V and V techniques and tools applied in requirements V and V can also be applied for systematic design V and V of the OS software.

  1. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  2. Fast 3D dosimetric verifications based on an electronic portal imaging device using a GPU calculation engine.

    Science.gov (United States)

    Zhu, Jinhan; Chen, Lixin; Chen, Along; Luo, Guangwen; Deng, Xiaowu; Liu, Xiaowei

    2015-04-11

    To use a graphics processing unit (GPU) calculation engine to implement a fast 3D pre-treatment dosimetric verification procedure based on an electronic portal imaging device (EPID). The GPU algorithm includes the deconvolution and convolution method for the fluence-map calculations, the collapsed-cone convolution/superposition (CCCS) algorithm for the 3D dose calculations, and the 3D gamma evaluation calculations. The results of the GPU-based CCCS algorithm were compared to those of Monte Carlo simulations. The planned and EPID-based reconstructed dose distributions in overridden-to-water phantoms and the original patients were compared for 6 MV and 10 MV photon beams in intensity-modulated radiation therapy (IMRT) treatment plans, based on dose differences and gamma analysis. The total single-field dose computation time was less than 8 s, and the gamma evaluation for a 0.1-cm grid resolution was completed in approximately 1 s. The results of the GPU-based CCCS algorithm exhibited good agreement with those of the Monte Carlo simulations. The gamma analysis indicated good agreement between the planned and reconstructed dose distributions for the treatment plans. For the target volume, the differences in the mean dose were less than 1.8%, and the differences in the maximum dose were less than 2.5%. For the critical organs, minor differences were observed between the reconstructed and planned doses. The GPU calculation engine was used to boost the speed of the 3D dose and gamma evaluation calculations, thus offering the possibility of true real-time 3D dosimetric verification.
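    For readers unfamiliar with gamma evaluation, a simplified 1D version of the metric is sketched below (the paper's engine computes this on 3D grids on the GPU; this CPU sketch only shows the combined dose-difference/distance-to-agreement criterion):

    ```python
    import numpy as np

    def gamma_pass_rate(ref, meas, spacing_mm, dd=0.03, dta_mm=3.0):
        """Simplified 1D global gamma evaluation: for each measured point,
        search the reference profile for the minimum combined
        dose-difference / distance-to-agreement metric."""
        n = len(ref)
        positions = np.arange(n) * spacing_mm
        dmax = ref.max()
        gammas = np.empty(n)
        for i in range(n):
            dist = (positions - positions[i]) / dta_mm
            dose_diff = (ref - meas[i]) / (dd * dmax)
            gammas[i] = np.sqrt(dist**2 + dose_diff**2).min()
        return np.mean(gammas <= 1.0)

    # Synthetic dose profile and a 1%-scaled "measurement" of it.
    ref = np.linspace(0, 2, 200) * np.exp(-np.linspace(0, 4, 200))
    meas = ref * 1.01
    print(f"pass rate: {gamma_pass_rate(ref, meas, spacing_mm=1.0):.3f}")
    ```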

  3. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    Science.gov (United States)

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatch between calculated control points and the detection grid in the verification process were discussed. Beyond the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH on the patient CT was also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre

  4. Channel Verification Results for the SCME models in a Multi-Probe Based MIMO OTA Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; S. Ashta, Jagjit

    2013-01-01

    , where the focus is on comparing results from various proposed methods. Channel model verification is necessary to ensure that the target channel models are correctly implemented inside the test area. This paper shows that all the key parameters of the SCME models, i.e., power delay profile, temporal...

  5. Type-Based Automated Verification of Authenticity in Asymmetric Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten; Kobayashi, Naoki; Sun, Yunde

    2011-01-01

    Gordon and Jeffrey developed a type system for verification of asymmetric and symmetric cryptographic protocols. We propose a modified version of Gordon and Jeffrey's type system and develop a type inference algorithm for it, so that protocols can be verified automatically as they are, without an...

  6. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    Science.gov (United States)

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to give support to the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…

  7. Convex polyhedral abstractions, specialisation and property-based predicate splitting in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    We present an approach to constrained Horn clause (CHC) verification combining three techniques: abstract interpretation over a domain of convex polyhedra, specialisation of the constraints in CHCs using abstract interpretation of query-answer transformed clauses, and refinement by splitting pred...

  8. Injecting Formal Verification in FMI-based Co-Simulations of Cyber-Physical Systems

    DEFF Research Database (Denmark)

    Couto, Luis Diogo; Basagiannis, Stylianos; Ridouane, El Hassan

    2017-01-01

    This publication is part of the Horizon 2020 project: Integrated Tool chain for model-based design of CPSs (INTO-CPS), project/GA number 644047.

  9. A Simple Visual Ethanol Biosensor Based on Alcohol Oxidase Immobilized onto Polyaniline Film for Halal Verification of Fermented Beverage Samples

    Science.gov (United States)

    Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa

    2014-01-01

    A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilized onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide; the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilization, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films were scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%–0.8%, with a correlation coefficient (r) of 0.996. The detection limit of the biosensor was 0.001%, with a reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks, in good agreement with the standard method (gas chromatography). Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples, useful to the Muslim community for halal verification. PMID:24473284

  10. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    Science.gov (United States)

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
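    As a toy instance of the kind of calculation such a framework formalizes (the workshop's framework additionally handles many candidate markers and staged false-positive constraints, which this sketch ignores), a normal-approximation sample size for detecting a standardized mean difference between two specimen groups is:

    ```python
    from math import ceil
    from scipy.stats import norm

    def n_per_group(effect_size, alpha=0.05, power=0.90):
        """Biospecimens per group to detect a standardized mean difference
        (Cohen's d) with a two-sided z-test at the given alpha and power --
        a simplified stand-in for the full statistical framework."""
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

    print(n_per_group(0.5))   # moderate effect: ~85 specimens per group
    ```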

  11. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other related issues of software systems have recently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems and verification and validation (V&V) assessments still remain important issues to be resolved, and they have now become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verification. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition decision coverage (MC/DC) test, module test – MATLAB/Simulink co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed

  12. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jaecheon; Heo, Gyunyoung

    2017-01-01

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach simultaneously verified the entire design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other related issues of software systems have recently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems and verification and validation (V&V) assessments still remain important issues to be resolved, and they have now become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verification. The design verification stage includes unit test – Very high speed integrated circuit Hardware Description Language (VHDL) test and modified condition decision coverage (MC/DC) test, module test – MATLAB/Simulink co-simulation test, and integration test – FPGA hardware test beds. To prove the adequacy of the proposed

  13. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, and it is a key property for CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described by an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is then verified by inputting the HP to KeYmaera. The advantage of the approach is that it models CPS intuitively and verifies their safety strictly, avoiding state-space explosion.

  14. Experimental verification of preset time count rate meters based on adaptive digital signal processing algorithms

    Directory of Open Access Journals (Sweden)

    Žigić Aleksandar D.

    2005-01-01

    Full Text Available Experimental verification of two optimized adaptive digital signal processing algorithms implemented in two preset time count rate meters was performed according to appropriate standards. A random pulse generator realized using a personal computer was used as an artificial radiation source for preliminary system tests and performance evaluations of the proposed algorithms. Then measurement results for background radiation levels were obtained. Finally, measurements with a natural radiation source, the radioisotope 90Sr-90Y, were carried out. Measurement results, obtained without and with the radioisotope for the specified errors of 10% and 5%, agreed well with theoretical predictions.

  15. Verification of Structural Simulation Results of Metal-based Additive Manufacturing by Means of Neutron Diffraction

    Science.gov (United States)

    Krol, T. A.; Seidel, C.; Schilp, J.; Hofmann, M.; Gan, W.; Zaeh, M. F.

    Metal-based additive processes are characterized by numerous transient physical effects, which exert an adverse influence on the production result. Hence, various research approaches exist for the optimization of, e.g., the structural part behavior in layered manufacturing. Increasingly, these approaches are based on finite element analysis in order to handle the complexity. Here it should be considered that the significance of the calculation results depends on the quality of the process modeling in the simulation environment. Based on a selected specimen, the current work demonstrates how the numerical accuracy of the residual stress state can be analyzed by utilizing neutron diffraction. For this purpose, different process parameter settings were examined.

  16. Base

    DEFF Research Database (Denmark)

    Hjulmand, Lise-Lotte; Johansson, Christer

    2004-01-01

    BASE - Engelsk basisgrammatik is the result of Lise-Lotte Hjulmand's thorough reworking and extensive revision of Christer Johansson's Engelska basgrammatik. The grammar differs from the Swedish original on a large number of points. Among other things, it has been adapted to a Danish audience and the Danish

  17. 76 FR 3859 - Trade Acknowledgment and Verification of Security-Based Swap Transactions

    Science.gov (United States)

    2011-01-21

    ... automation and the purported assignment of positions by transferring parties to third parties without notice... over-the-counter derivatives market and improve operational performance, by increasing automation... such customized security-based swap or does not contain the data elements necessary to calculate the...

  18. Verification of a characterization method of the laser-induced selective activation based on industrial lasers

    DEFF Research Database (Denmark)

    Zhang, Yang; Hansen, Hans Nørgaard; Tang, Peter T.

    2013-01-01

    In this article, laser-induced selective activation (LISA) for subsequent autocatalytic copper plating is performed with several types of industrial-scale lasers, including a Nd:YAG laser, a UV laser, a fiber laser, a green laser, and a short-pulsed laser. Based on analysis of all the laser...

  19. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impacts of various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). To assist the application of the assessment method, and as a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use because assessment scenarios can be defined through diagrams. Software verification was performed by comparing its results to SRS-19 and CROM software calculation results. Doses estimated by SRS-19 are higher than the results of the developed software. However, these are still acceptable, since dose estimation in SRS-19 is based on a conservative approach. On the other hand, compared to the CROM software, the same results were obtained for three scenarios, with a non-significant difference of 2.25% in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features should be added and new models developed to improve the capability of the software. (author)
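    The flavor of an SRS-19-style screening calculation can be sketched as a chain of multiplications; the numbers and the dilution factor below are illustrative placeholders, not SRS-19 tabulated values:

    ```python
    def inhalation_dose_sv(release_bq_per_s, dilution_s_per_m3,
                           breathing_m3_per_s, dose_coeff_sv_per_bq):
        """Generic screening-model chain in the spirit of SRS-19: a release
        rate times an atmospheric dilution factor gives air concentration;
        multiplying by breathing rate, dose coefficient, and exposure time
        gives an annual dose. All inputs here are illustrative."""
        conc_bq_per_m3 = release_bq_per_s * dilution_s_per_m3
        seconds_per_year = 3.15e7
        return (conc_bq_per_m3 * breathing_m3_per_s *
                dose_coeff_sv_per_bq * seconds_per_year)

    # Illustrative: 1e6 Bq/s release, 1e-6 s/m^3 dilution at the receptor,
    # adult breathing rate ~2.6e-4 m^3/s, dose coefficient 1e-9 Sv/Bq.
    print(inhalation_dose_sv(1e6, 1e-6, 2.6e-4, 1e-9), "Sv/yr")
    ```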

  20. Independent calculation-based verification of IMRT plans using a 3D dose-calculation engine

    International Nuclear Information System (INIS)

    Arumugam, Sankar; Xing, Aitang; Goozee, Gary; Holloway, Lois

    2013-01-01

    Independent monitor unit verification of intensity-modulated radiation therapy (IMRT) plans requires detailed 3-dimensional (3D) dose verification. The aim of this study was to investigate using a 3D dose engine in a second commercial treatment planning system (TPS) for this task, facilitated by in-house software. Our department has XiO and Pinnacle TPSs, both with IMRT planning capability and modeled for an Elekta-Synergy 6 MV photon beam. These systems allow the transfer of computed tomography (CT) data and RT structures between them but do not allow IMRT plans to be transferred. To provide this connectivity, an in-house computer programme was developed to convert radiation therapy prescription (RTP) files as generated by many planning systems into either XiO or Pinnacle IMRT file formats. Utilization of the technique and software was assessed by transferring 14 IMRT plans from XiO and Pinnacle onto the other system and performing 3D dose verification. The accuracy of the conversion process was checked by comparing the 3D dose matrices and dose volume histograms (DVHs) of structures for the recalculated plan on the same system. The developed software successfully transferred IMRT plans generated by 1 planning system into the other. Comparison of planning target volume (TV) DVHs for the original and recalculated plans showed good agreement; a maximum difference of 2% in mean dose, − 2.5% in D95, and 2.9% in V95 was observed. Similarly, a DVH comparison of organs at risk showed a maximum difference of +7.7% between the original and recalculated plans for structures in both high- and medium-dose regions. However, for structures in low-dose regions (less than 15% of prescription dose) a difference in mean dose up to +21.1% was observed between XiO and Pinnacle calculations. A dose matrix comparison of original and recalculated plans in XiO and Pinnacle TPSs was performed using gamma analysis with 3%/3 mm criteria. The mean and standard deviation of pixels passing

  1. Independent calculation-based verification of IMRT plans using a 3D dose-calculation engine.

    Science.gov (United States)

    Arumugam, Sankar; Xing, Aitang; Goozee, Gary; Holloway, Lois

    2013-01-01

    Independent monitor unit verification of intensity-modulated radiation therapy (IMRT) plans requires detailed 3-dimensional (3D) dose verification. The aim of this study was to investigate using a 3D dose engine in a second commercial treatment planning system (TPS) for this task, facilitated by in-house software. Our department has XiO and Pinnacle TPSs, both with IMRT planning capability and modeled for an Elekta-Synergy 6MV photon beam. These systems allow the transfer of computed tomography (CT) data and RT structures between them but do not allow IMRT plans to be transferred. To provide this connectivity, an in-house computer programme was developed to convert radiation therapy prescription (RTP) files as generated by many planning systems into either XiO or Pinnacle IMRT file formats. Utilization of the technique and software was assessed by transferring 14 IMRT plans from XiO and Pinnacle onto the other system and performing 3D dose verification. The accuracy of the conversion process was checked by comparing the 3D dose matrices and dose volume histograms (DVHs) of structures for the recalculated plan on the same system. The developed software successfully transferred IMRT plans generated by 1 planning system into the other. Comparison of planning target volume (TV) DVHs for the original and recalculated plans showed good agreement; a maximum difference of 2% in mean dose, - 2.5% in D95, and 2.9% in V95 was observed. Similarly, a DVH comparison of organs at risk showed a maximum difference of +7.7% between the original and recalculated plans for structures in both high- and medium-dose regions. However, for structures in low-dose regions (less than 15% of prescription dose) a difference in mean dose up to +21.1% was observed between XiO and Pinnacle calculations. A dose matrix comparison of original and recalculated plans in XiO and Pinnacle TPSs was performed using gamma analysis with 3%/3mm criteria. The mean and standard deviation of pixels passing gamma

  2. Verification of MENDL2 and IEAF-2001 Data bases at intermediate energies

    Energy Technology Data Exchange (ETDEWEB)

    Titarenko, Y. E. (Yury E.); Batyaev, V. F. (Vyacheslav F.); Karpikhin, E. I. (Evgeny I.); Zhivun, V. M. (Valery M.); Koldobsky, A. B. (Aleksander B.); Mulambetov, R. D. (Ruslan D.); Mulambetova, S. V.; Trebukhovsky, Y. V. (Yury V.); Zaitsev, S. L.; Lipatov, K. A.; Mashnik, S. G. (Stepan G.); Prael, R. E. (Richard E.)

    2004-01-01

    The work presents results of computer simulations of two experiments whose aim was measuring the threshold activation reaction rates in {sup 12}C, {sup 19}F, {sup 27}Al, {sup 59}Co, {sup 63}Cu, {sup 65}Cu, {sup 64}Zn, {sup 93}Nb, {sup 115}In, {sup 169}Tm, {sup 181}Ta, {sup 197}Au, and {sup 209}Bi thin samples placed inside and outside a 0.8-GeV proton-irradiated 4-cm thick W target and a 92-cm thick W-Na composite target, both of 15-cm diameter. In total, more than 1000 values of activation reaction rates were determined in the two experiments. The measured data were compared with results from the LAHET code using several nuclear data bases for the respective excitation functions, namely, ENDF/B6 for cross sections of neutrons at energies below 20 MeV and MENDL2 together with MENDL2P for cross sections of protons and neutrons of 20 to 100 MeV energies. The recently developed IEAF-2001 data base, which provides neutron cross sections up to 150 MeV, was used as well. Simulation-to-experiment results obtained using MENDL2 and IEAF-2001 are presented. The agreement between simulation and experiment was found to be satisfactory for both data bases. Nevertheless, further studies should be conducted to improve simulations of the production of secondary protons and high-energy neutrons, as well as high-energy neutron elastic scattering. Our results allow drawing some conclusions concerning the reliability of the transport codes and data bases used to simulate Accelerator Driven Systems (ADS), particularly with Na-cooled W targets. The high-energy threshold excitation functions to be used in activation-based unfolding of neutron spectra inside the ADS can also be inferred from our results.

  3. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    Science.gov (United States)

    Drapikowski, Paweł; Kazimierczak-Grygiel, Ewa; Korecki, Dominik; Wiland-Szymańska, Justyna

    2016-01-01

    This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated. PMID:27355949
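    Computing the S/V ratio from a triangle mesh, as the paper does for cacti and other succulents, reduces to summing triangle areas and signed tetrahedron volumes (divergence theorem). A minimal sketch, verified here on a unit cube:

    ```python
    import numpy as np

    def area_volume(vertices, faces):
        """Surface area and enclosed volume of a closed, consistently
        oriented triangle mesh: area from triangle cross products, volume
        from the divergence theorem (sum of signed tetrahedron volumes)."""
        v = vertices[faces]                       # (n_faces, 3, 3)
        cross = np.cross(v[:, 1] - v[:, 0], v[:, 2] - v[:, 0])
        area = 0.5 * np.linalg.norm(cross, axis=1).sum()
        volume = np.abs(np.einsum('ij,ij->i', v[:, 0],
                                  np.cross(v[:, 1], v[:, 2])).sum()) / 6.0
        return area, volume

    # Unit cube (12 outward-oriented triangles): expect area 6, volume 1.
    verts = np.array([[0,0,0],[1,0,0],[1,1,0],[0,1,0],
                      [0,0,1],[1,0,1],[1,1,1],[0,1,1]], dtype=float)
    faces = np.array([[0,2,1],[0,3,2],[4,5,6],[4,6,7],[0,1,5],[0,5,4],
                      [1,2,6],[1,6,5],[2,3,7],[2,7,6],[3,0,4],[3,4,7]])
    a, vol = area_volume(verts, faces)
    print(a, vol, a / vol)   # S, V, and the S/V ratio
    ```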

  4. A Tool for Verification and Validation of Neural Network Based Adaptive Controllers for High Assurance Systems

    Science.gov (United States)

    Gupta, Pramod; Schumann, Johann

    2004-01-01

    High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as monitoring the network performance during operation. The tool has been implemented in Simulink and simulation results on a F-15 aircraft are presented.
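    The tool itself derives analytical error bars around the NN output; as a simpler stand-in illustrating the idea of a confidence interval on a network's prediction, the sketch below estimates the band from the spread of a small ensemble of independently trained networks:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def train_tiny_net(x, y, hidden=8, epochs=2000, lr=0.05):
        """One-hidden-layer regression network trained by batch gradient
        descent (pure NumPy, for illustration only)."""
        w1 = rng.normal(0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
        w2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
        for _ in range(epochs):
            h = np.tanh(x @ w1 + b1)
            err = h @ w2 + b2 - y
            gh = (err @ w2.T) * (1 - h ** 2)          # backprop through tanh
            w2 -= lr * h.T @ err / len(x); b2 -= lr * err.mean(0)
            w1 -= lr * x.T @ gh / len(x); b1 -= lr * gh.mean(0)
        return lambda xq: np.tanh(xq @ w1 + b1) @ w2 + b2

    x = rng.uniform(-1, 1, (64, 1))
    y = np.sin(3 * x) + rng.normal(0, 0.05, x.shape)
    ensemble = [train_tiny_net(x, y) for _ in range(5)]

    xq = np.array([[0.2]])
    preds = np.array([net(xq).item() for net in ensemble])
    print(f"output {preds.mean():.3f} +/- {2 * preds.std():.3f}")  # error bar
    ```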

  5. Diffusion-weighted MRI for verification of electroporation-based treatments

    DEFF Research Database (Denmark)

    Mahmood, Faisal; Hansen, Rasmus H; Agerholm-Larsen, Birgit

    2011-01-01

    such a tissue reaction represents a great clinical benefit since, in case of target miss, retreatment can be performed immediately. We propose diffusion-weighted magnetic resonance imaging (DW-MRI) as a method to monitor EP tissue, using the concept of the apparent diffusion coefficient (ADC). We hypothesize...... that the plasma membrane permeabilization induced by EP changes the ADC, suggesting that DW-MRI constitutes a noninvasive and quick means of EP verification. In this study we performed in vivo EP in rat brains, followed by DW-MRI using a clinical MRI scanner. We found a pulse amplitude-dependent increase...... in the ADC following EP, indicating that (1) DW-MRI is sensitive to the EP-induced changes and (2) the observed changes in ADC are indeed due to the applied electric field....

  6. ECG-based PICC tip verification system: an evaluation 5 years on.

    Science.gov (United States)

    Oliver, Gemma; Jones, Matt

    2016-10-27

    In 2011, the vascular access team at East Kent Hospitals University NHS Foundation Trust safely and successfully incorporated the use of electrocardiogram (ECG) guidance technology for verification of peripherally inserted central catheters (PICC) tip placement into their practice. This study, 5 years on, compared the strengths and limitations of using this ECG method with the previous gold-standard of post-procedural chest X-ray. The study was undertaken using an embedded case study approach, and the cost, accuracy and efficiency of both systems were evaluated and compared. Using ECG to confirm PICC tip position was found to be cheaper, quicker and more accurate than post-procedural chest X-ray.

  7. SU-F-T-463: Light-Field Based Dynalog Verification

    International Nuclear Information System (INIS)

    Atwal, P; Ramaseshan, R

    2016-01-01

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity-modulated radiation therapy / volumetric-modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted by removing ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the images of central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned locations by [0.38±0.28] mm. Similarly, bank B leaves 15–45 had a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further and allow meaningful analysis over the full range of leaves.

  8. SU-F-T-463: Light-Field Based Dynalog Verification

    Energy Technology Data Exchange (ETDEWEB)

    Atwal, P; Ramaseshan, R [BC Cancer Agency, Abbotsford, BC (Canada)

    2016-06-15

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity-modulated radiation therapy / volumetric-modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted by removing ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the images of central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned locations by [0.38±0.28] mm. Similarly, bank B leaves 15–45 had a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further and allow meaningful analysis over the full range of leaves.

  9. Clinical evaluation of 3D/3D MRI-CBCT automatching on brain tumors for online patient setup verification - A step towards MRI-based treatment planning

    DEFF Research Database (Denmark)

    Buhl, S.K.; Duun-Christensen, Anne Katrine; Kristensen, B.H.

    2010-01-01

    Background. Magnetic Resonance Imaging (MRI) is often used in modern-day radiotherapy (RT) due to its superior soft tissue contrast. However, treatment planning based solely on MRI is restricted due to, e.g., the limitations of conducting online patient setup verification using MRI as reference. In th...... the absolute difference in couch shift coordinates acquired from MRI-CBCT and CT-CBCT automatching, were...

  10. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPPs which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained, as well as the required general setting. The SOL approach has been tested both in scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB)

  11. Abstract Interpretation-based verification/certification in the ciaoPP system

    OpenAIRE

    Puebla Sánchez, Alvaro Germán; Albert Albiol, Elvira; Hermenegildo, Manuel V.

    2005-01-01

    CiaoPP is the abstract interpretation-based preprocessor of the Ciao multi-paradigm (Constraint) Logic Programming system. It uses modular, incremental abstract interpretation as a fundamental tool to obtain information about programs. In CiaoPP, the semantic approximations thus produced have been applied to perform high- and low-level optimizations during program compilation, including transformations such as multiple abstract specialization, parallelization, partial evaluation, resource...

  12. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both hardware and software; however, there are still unsolved problems that limit their performance [6], [7]. Many fresh algorithms and new design techniques are suggested every year by industry and academic researchers who claim that they can improve the accuracy of measurements [8], [9]. With the lack of an accurate computer-based behavioural model for pulse oximeters, the only way to evaluate these newly developed systems and algorithms is through hardware implementation, which can be both expensive and time consuming. This paper presents an accurate Simulink-based behavioural model for a pulse oximeter that can be used by industry and academia alike working in this area, as an exploration as well as a productivity-enhancement tool during their research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment from which new ideas can be rapidly evaluated long before the real implementation.
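    Although the paper's behavioural model is built in Simulink, the core signal chain of a pulse oximeter can be illustrated in a few lines; the ratio-of-ratios calibration coefficients below are a textbook approximation, not the paper's model:

    ```python
    import numpy as np

    def spo2_from_ppg(red, infrared):
        """Classic ratio-of-ratios estimate: the AC/DC ratio of the red and
        infrared photoplethysmograms yields R, which an empirical calibration
        line maps to SpO2. The coefficients (110, 25) are a textbook
        approximation, not a device calibration."""
        def ac_dc(sig):
            dc = np.mean(sig)
            ac = np.ptp(sig)    # peak-to-peak amplitude of the pulsatile part
            return ac / dc
        r = ac_dc(red) / ac_dc(infrared)
        return 110.0 - 25.0 * r

    t = np.arange(0, 5, 0.01)
    red = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)   # synthetic 72 bpm PPG
    infrared = 1.0 + 0.04 * np.sin(2 * np.pi * 1.2 * t)
    print(f"SpO2 ~ {spo2_from_ppg(red, infrared):.1f}%")
    ```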

  13. Mechatronics design and experimental verification of an electric-vehicle-based hybrid thermal management system

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Hung

    2016-02-01

    Full Text Available In this study, an electric-vehicle-based thermal management system was designed for dual energy sources. An experimental platform developed in a previous study was modified. Regarding the mechanical components, a heat exchanger with a radiator, a proportional valve, coolant pipes, and a coolant pump was appropriately integrated. Regarding the electric components, two heaters emulating waste heat were controlled using two programmable power supply machines. A rapid-prototyping controller with two temperature inputs and three outputs was designed. Rule-based control strategies were coded to maintain optimal temperatures for the emulated proton exchange membrane fuel cells and lithium batteries. To evaluate the heat power of the dual energy sources, driving cycles, energy management control, and efficiency maps of the energy sources were considered for deriving time-variant values. The main results are as follows: (a) an advanced mechatronics platform was constructed; (b) a driving cycle simulation was successfully conducted; and (c) coolant temperatures reached their optimal operating ranges when the proportional valve, radiator, and coolant pump were sequentially controlled. The benefits of this novel electric-vehicle-based thermal management system are (a) high-efficiency operation of energy sources, (b) low occupied volume integrated with energy sources, and (c) greater electric vehicle traveling mileage. This system will be integrated with real energy sources and a real electric vehicle in the future.
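    A toy version of such a rule-based strategy is sketched below; the temperature bands, pump duties, and valve logic are illustrative assumptions, not the authors' calibrated rules:

    ```python
    def thermal_rules(t_fc, t_batt):
        """Rule-based sketch: pick a pump duty and valve split from the two
        coolant temperatures so each source stays in its (assumed) optimal
        band -- roughly 60-80 C for a PEM fuel cell, 25-40 C for lithium
        batteries. Bands and duties are illustrative only."""
        pump = 0.2                          # baseline circulation
        if t_fc > 80 or t_batt > 40:
            pump = 1.0                      # overtemperature: full flow
        elif t_fc > 70 or t_batt > 35:
            pump = 0.6
        # Proportional valve: route more coolant to whichever source is
        # hotter relative to the top of its band.
        excess_fc = max(0.0, t_fc - 70) / 10.0
        excess_batt = max(0.0, t_batt - 35) / 5.0
        total = excess_fc + excess_batt
        valve_to_fc = 0.5 if total == 0 else excess_fc / total
        return pump, valve_to_fc

    print(thermal_rules(t_fc=76.0, t_batt=33.0))  # more flow to the fuel cell
    ```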

  14. Verification and validation issues for digitally-based NPP safety systems

    International Nuclear Information System (INIS)

    Ets, A.R.

    1993-01-01

    The trend toward standardization, integration, and reduced costs has led to increasing use of digital systems in reactor protection systems. While digital systems provide maintenance and performance advantages, their use also introduces new safety issues, in particular with regard to software. Current practice relies on verification and validation (V and V) to ensure the quality of safety software. However, effective V and V must be done in conjunction with a structured software development process and must consider the context of the safety system application. This paper presents some of the issues and concerns that impact the V and V process. These include documentation of system requirements, common mode failures, hazards analysis, and independence. These issues and concerns arose during evaluations of NPP safety systems for advanced reactor designs and digital I and C retrofits for existing nuclear plants in the United States. The pragmatic lessons from actual system reviews can provide a basis for further refinement and development of guidelines for applying V and V to NPP safety systems. (author). 14 refs

  15. Measurement Verification of Plane Wave Synthesis Technique Based on Multi-probe MIMO-OTA Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; Nielsen, Jesper Ødum

    2012-01-01

    Standardization work for MIMO OTA testing methods is currently ongoing, where a multi-probe anechoic chamber based solution is an important candidate. In this paper, the probes located on an OTA ring are used to synthesize a plane wave field in the center of the OTA ring. This paper investigates...... the extent to which we can approach the synthesized plane wave in practical measurement systems. Both single plane wave with certain AoA and multiple plane waves with different AoAs and power weightings are synthesized and measured. Deviations of the measured plane wave and the simulated plane wave field...
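    The synthesis step can be illustrated by solving for probe weights in a least-squares sense so that the superposed probe fields match the target plane wave over sample points in the test zone (probes are idealized here as far-field plane-wave sources, a simplification of a real multi-probe setup):

    ```python
    import numpy as np

    def unit(angle_deg):
        """2D unit vector for a given angle of arrival."""
        a = np.deg2rad(angle_deg)
        return np.array([np.cos(a), np.sin(a)])

    def probe_weights(probe_angles_deg, aoa_deg, test_points, k):
        """Least-squares complex weights for ring probes so their summed
        (idealized plane-wave) fields approximate a plane wave with the
        target angle of arrival over the test-zone sample points."""
        target = np.exp(-1j * k * (test_points @ unit(aoa_deg)))
        basis = np.column_stack(
            [np.exp(-1j * k * (test_points @ unit(a)))
             for a in probe_angles_deg])
        w, *_ = np.linalg.lstsq(basis, target, rcond=None)
        return w

    k = 2 * np.pi / 0.125                     # wavenumber at ~2.4 GHz
    pts = np.random.default_rng(1).uniform(-0.1, 0.1, (200, 2))  # 20 cm zone
    w = probe_weights(np.arange(0, 360, 45), aoa_deg=30.0,
                      test_points=pts, k=k)
    print(np.round(np.abs(w), 3))             # probes near 30 deg dominate
    ```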

  16. Refinement-Based Verification of the FreeRTOS Scheduler in VCC

    OpenAIRE

    Woodcock, James Charles Paul; Divakaran, Sumesh; D'Souza, Deepak; Kushwah, Anirudh; Sampath, Prahladavaradan; Sridhar, Nigamanth

    2015-01-01

    We describe our experience with verifying the scheduler-related functionality of FreeRTOS, a popular open-source embedded real-time operating system. We propose a methodology for carrying out refinement-based proofs of functional correctness of abstract data types in the popular code-level verifier VCC. We then apply this methodology to carry out a full machine-checked proof of the functional correctness of the FreeRTOS scheduler. We describe the bugs found during this exercise, the fixes mad...

  17. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments

  18. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    International Nuclear Information System (INIS)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H.

    2006-09-01

    In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields. The three fields comprise gas, continuous liquid, and entrained liquid. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer, as well as momentum transfer. Fluid/structure interaction generally includes both heat and momentum transfer. Assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between these flow conditions. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect code stability. A further study would be required to enhance the code capability in this regard.
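    One of the checks listed above -- mass conservation -- is easy to illustrate in isolation. The toy below advects a single density field with first-order donor-cell (upwind) differencing under periodic boundaries and confirms that total mass is conserved; it is a minimal single-field stand-in for the code's coupled three-field equations, with all numbers invented.

```python
# Minimal mass-conservation verification for a 1D donor-cell advection step.
# Grid, time step, and velocity are illustrative, not the pilot code's.
import numpy as np

def donor_cell_step(rho, u, dx, dt):
    """One explicit upwind step of d(rho)/dt + d(rho*u)/dx = 0, u > 0."""
    flux = rho * u                          # donor-cell flux at cell centers
    return rho - dt / dx * (flux - np.roll(flux, 1))

nx, dx, dt, u = 100, 0.01, 0.001, 1.0       # CFL = u*dt/dx = 0.1
rho = 1.0 + 0.5 * np.sin(2 * np.pi * np.linspace(0.0, 1.0, nx, endpoint=False))
mass0 = rho.sum() * dx
for _ in range(500):
    rho = donor_cell_step(rho, u, dx, dt)
assert abs(rho.sum() * dx - mass0) < 1e-9, "mass not conserved"
print("total mass conserved over 500 steps")
```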

  19. Novel method based on Fricke gel dosimeters for dose verification in IMRT techniques

    International Nuclear Information System (INIS)

    Aon, E.; Brunetto, M.; Sansogne, R.; Castellano, G.; Valente, M.

    2008-01-01

    Modern radiotherapy is becoming increasingly complex. Conformal and intensity modulated (IMRT) techniques are nowadays available for achieving better tumour control. However, accurate methods for 3D dose verification for these modern irradiation techniques have not been adequately established yet. Fricke gel dosimeters consist, essentially, of a ferrous sulphate (Fricke) solution fixed to a gel matrix, which enables spatial resolution. A suitable radiochromic marker (xylenol orange) is added to the solution in order to produce radiochromic changes within the visible spectrum range, due to the chemical internal conversion (oxidation) of ferrous ions to ferric ions. In addition, xylenol orange has proved to slow down the internal diffusion effect of ferric ions. These dosimeters, suitably shaped in the form of thin layers and optically analyzed by means of visible light transmission imaging, have recently been proposed as a method for 3D absorbed dose distribution determinations in radiotherapy, and tested in several IMRT applications employing a homogeneous plane (visible light) illuminator and a CCD camera with a monochromatic filter for sample analysis by means of transmittance images. In this work, the performance of an alternative read-out method is characterized, consisting of visible light images acquired before and after irradiation by means of a commercially available flatbed-like scanner. Registered images are suitably converted to matrices and analyzed by means of dedicated 'in-house' software. The developed integral method allows performing 1D (profiles), 2D (surfaces) and 3D (volumes) dose mapping. In addition, quantitative comparisons have been performed by means of the Gamma composite criteria. Dose distribution comparisons between Fricke gel dosimeters and traditional standard dosimetric techniques for IMRT irradiations show an overall good agreement, supporting the suitability of the method. The agreement, quantified by the gamma index (that seldom
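    The gamma comparison mentioned above combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 2D sketch follows, assuming global 3%/3 mm criteria and a brute-force window search (production tools interpolate and optimize, and the wrap-around edge handling here is simplistic); the test fields are synthetic.

```python
# Minimal 2D gamma-index sketch with assumed global 3%/3 mm criteria.
import numpy as np

def gamma_index(ref, meas, spacing_mm=1.0, dd=0.03, dta_mm=3.0):
    """Gamma map between a reference and a measured dose on the same grid."""
    search = int(np.ceil(dta_mm / spacing_mm))   # search window half-width
    norm = dd * ref.max()                        # global dose criterion
    gamma = np.full(ref.shape, np.inf)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            dist2 = (dy**2 + dx**2) * spacing_mm**2 / dta_mm**2
            shifted = np.roll(np.roll(ref, dy, axis=0), dx, axis=1)
            dose2 = ((meas - shifted) / norm) ** 2
            gamma = np.minimum(gamma, np.sqrt(dist2 + dose2))
    return gamma

rng = np.random.default_rng(0)
ref = rng.random((64, 64))                       # synthetic reference dose
meas = ref + 0.01 * rng.standard_normal((64, 64))
g = gamma_index(ref, meas)
print(f"gamma passing rate (gamma <= 1): {100 * (g <= 1).mean():.1f}%")
```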

  20. Model-Based Interpretation and Experimental Verification of ECT Signals of Steam Generator Tubes

    International Nuclear Information System (INIS)

    Song, Sung Jin; Kim, Young Hwan; Kim, Eui Lae; Yim, Chang Jae; Lee, Jin Ho

    2004-01-01

    Model-based inversion tools for eddy current signals have been developed by combining neural networks and finite element modeling, for quantitative flaw characterization in steam generator tubes. In the present work, interpretation of experimental eddy current signals was carried out in order to validate the developed inversion tools. A database was constructed using the synthetic flaw signals generated by the finite element model. The hybrid neural networks, composed of a PNN classifier and BPNN size estimators, were trained using the synthetic signals. Experimental eddy current signals were obtained from axisymmetric artificial flaws. Interpretation of flaw signals was conducted by feeding the experimental signals into the neural networks. The interpretation was excellent, which shows that the developed inversion tools would be applicable to the interpretation of real eddy current signals.

  1. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)]

    1998-01-01

    Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model and experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al. are compared. All data predicted are inside the ±10% error band, with the mean average error below 4% using the slightly modified analytical solution. The solution obtained is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental data base obtained by Liu and Theofanous. The data are predicted within a ±30% error band. (author)

  2. Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach

    Directory of Open Access Journals (Sweden)

    Xu Zhi

    2018-01-01

    Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks have been limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed that is based on the symbolic model checking tool NuSMV. A dual PID system is used as an example system, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.
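    NuSMV models are written in NuSMV's own input language; purely to illustrate the underlying idea -- exhaustive state exploration that returns a counterexample path when a safety invariant fails -- here is a toy explicit-state reachability check over an invented discrete two-mode system (not the paper's dual PID model).

```python
# Toy explicit-state "model checking": BFS over reachable states of an
# invented two-mode discrete system, returning a counterexample path if a
# safety invariant (both state variables bounded) can be violated.
from collections import deque

def step(state, mode):
    """Invented discrete-time plant: two coupled levels, controller mode 0/1."""
    x, y = state
    return (x + 1, y - 1) if mode == 0 else (x - 1, y + 2)

def check_invariant(initial=(0, 0), bound=10):
    """BFS; invariant: |x| <= bound and |y| <= bound for all reachable states."""
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if abs(state[0]) > bound or abs(state[1]) > bound:
            path = []                      # reconstruct counterexample trace
            while state is not None:
                path.append(state)
                state = parent[state]
            return list(reversed(path))
        for mode in (0, 1):
            nxt = step(state, mode)
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None                            # invariant holds (finite systems)

print("counterexample path:", check_invariant())
```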

  3. Development and verification of remote research environment based on 'Fusion research grid'

    International Nuclear Information System (INIS)

    Iba, Katsuyuki; Ozeki, Takahisa; Totsuka, Toshiyuki; Suzuki, Yoshio; Oshima, Takayuki; Sakata, Shinya; Sato, Minoru; Suzuki, Mitsuhiro; Hamamatsu, Kiyotaka; Kiyono, Kimihiro

    2008-01-01

    'Fusion research grid' is a concept that unites scientists and lets them collaborate effectively across differences in time zone and location in nuclear fusion research. Fundamental technologies of 'Fusion research grid' have been developed at JAEA in the VizGrid project under the e-Japan project at the Ministry of Education, Culture, Sports, Science and Technology (MEXT). We are conscious of the need to create new systems that assist researchers with their research activities, because remote collaborations have been increasing in international projects. Therefore we have developed prototype remote research environments for experiments, diagnostics, analyses and communications based on 'Fusion research grid'. All users can access these environments from anywhere, because 'Fusion research grid' does not require a closed network like Super SINET to maintain security. The prototype systems were verified in experiments at JT-60U and their availability was confirmed.

  4. Design and Hardware Verification of Canard Based Sounding Rocket Attitude Controller Using Adaptive Filter

    Science.gov (United States)

    Sawai, Shujiro; Matsuda, Seiji

    A canard-based controller using an adaptive notch filter is proposed to control the attitude of launch vehicles, including the ISAS's sounding rocket `S-520'. As the characteristics of launch vehicles are time-variant in nature, a conventional time-invariant controller is not suitable for this purpose. Here, an adaptive notch filter is proposed to treat the time-variant nature. This adaptive filter acts to null out the structural bending mode, which often causes instability in the attitude controller. The proposed adaptation law requires only a limited computational cost, which makes it easy to install in the real flight system. The hardware module that aims to control the attitude of the sounding rocket `S-520' was designed and verified not only by numerical simulations but also by hardware tests.
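    As a rough illustration of adaptively nulling a bending mode, the sketch below uses a Widrow-style LMS canceller with quadrature references at a nominal mode frequency; the flight algorithm differs in detail, and the sample rate, frequencies, gains, and signals are all invented.

```python
# LMS adaptive canceller acting as an adaptive notch at an assumed nominal
# bending-mode frequency. All signal parameters are illustrative.
import numpy as np

fs = 200.0                       # sample rate [Hz], assumed
f_bend = 12.0                    # nominal bending-mode frequency [Hz], assumed
mu = 0.01                        # LMS step size
n = 4000
t = np.arange(n) / fs

command = np.sin(2 * np.pi * 0.5 * t)                  # rigid-body signal to keep
bending = 0.8 * np.sin(2 * np.pi * f_bend * t + 0.3)   # structural mode to null
x = command + bending

w = np.zeros(2)                  # adaptive weights on sin/cos references
out = np.empty(n)
for k in range(n):
    ref = np.array([np.sin(2 * np.pi * f_bend * t[k]),
                    np.cos(2 * np.pi * f_bend * t[k])])
    y = w @ ref                  # current estimate of the bending component
    e = x[k] - y                 # notch output: signal with mode removed
    w += 2 * mu * e * ref        # LMS weight update
    out[k] = e

print("residual bending power ratio:",
      np.var(out[-500:] - command[-500:]) / np.var(bending))
```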

  5. Development and verification of ground-based tele-robotics operations concept for Dextre

    Science.gov (United States)

    Aziz, Sarmad

    2013-05-01

    The Special Purpose Dexterous Manipulator (Dextre) is the latest addition to the on-orbit segment of the Mobile Servicing System (MSS), Canada's contribution to the International Space Station (ISS). Launched in March 2008, the advanced two-armed robot is designed to perform various ISS maintenance tasks on robotically compatible elements and on-orbit replaceable units using a wide variety of tools and interfaces. The addition of Dextre has increased the capabilities of the MSS, and has introduced significant complexity to ISS robotics operations. While the initial operations concept for Dextre was based on human-in-the-loop control by the on-orbit astronauts, the complexities of robotic maintenance and the associated costs of training and maintaining the operator skills required for Dextre operations demanded a reexamination of the old concepts. A new approach to ISS robotic maintenance was developed in order to utilize the capabilities of Dextre safely and efficiently, while at the same time reducing the costs of on-orbit operations. This paper will describe the development, validation, and on-orbit demonstration of the operations concept for ground-based tele-robotics control of Dextre. It will describe the evolution of the new concepts from the experience gained from the development and implementation of the ground control capability for the Space Station Remote Manipulator System, Canadarm 2. It will discuss the various technical challenges faced during the development effort, such as requirements for high positioning accuracy, force/moment sensing and accommodation, failure tolerance, complex tool operations, and the novel operational tools and techniques developed to overcome them. The paper will also describe the work performed to validate the new concepts on orbit and will discuss the results and lessons learned from the on-orbit checkout and commissioning of Dextre using the newly developed tele-robotics techniques and capabilities.

  6. Assessment of surface solar irradiance derived from real-time modelling techniques and verification with ground-based measurements

    Science.gov (United States)

    Kosmopoulos, Panagiotis G.; Kazadzis, Stelios; Taylor, Michael; Raptis, Panagiotis I.; Keramitsoglou, Iphigenia; Kiranoudis, Chris; Bais, Alkiviadis F.

    2018-02-01

    This study focuses on the assessment of surface solar radiation (SSR) based on operational neural network (NN) and multi-regression function (MRF) modelling techniques that produce instantaneous (in less than 1 min) outputs. Using real-time cloud and aerosol optical properties inputs from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite and the Copernicus Atmosphere Monitoring Service (CAMS), respectively, these models are capable of calculating SSR in high resolution (1 nm, 0.05°, 15 min) that can be used for spectrally integrated irradiance maps, databases and various applications related to energy exploitation. The real-time models are validated against ground-based measurements of the Baseline Surface Radiation Network (BSRN) in a temporal range varying from 15 min to monthly means, while a sensitivity analysis of the cloud and aerosol effects on SSR is performed to ensure reliability under different sky and climatological conditions. The simulated outputs, compared to their common training dataset created by the radiative transfer model (RTM) libRadtran, showed median error values in the range -15 to 15 % for the NN that produces spectral irradiances (NNS), 5-6 % underestimation for the integrated NN and close to zero errors for the MRF technique. The verification against BSRN revealed that the real-time calculation uncertainty ranges from -100 to 40 and -20 to 20 W m-2, for the 15 min and monthly mean global horizontal irradiance (GHI) averages, respectively, while the accuracy of the input parameters, in terms of aerosol and cloud optical thickness (AOD and COT), and their impact on GHI, was of the order of 10 % as compared to the ground-based measurements. The proposed system aims to be utilized through studies and real-time applications which are related to solar energy production planning and use.

  7. Verification of Compressible and Incompressible Computational Fluid Dynamics Codes and Residual-based Mesh Adaptation

    Science.gov (United States)

    Choudhary, Aniruddha

    Code verification is the process of ensuring, to the degree possible, that there are no algorithm deficiencies and coding mistakes (bugs) in a scientific computing simulation. In this work, techniques are presented for performing code verification of boundary conditions commonly used in compressible and incompressible Computational Fluid Dynamics (CFD) codes. Using a compressible CFD code, this study assesses the subsonic inflow (isentropic and fixed-mass), subsonic outflow, supersonic outflow, no-slip wall (adiabatic and isothermal), and inviscid slip-wall. The use of simplified curved surfaces is proposed for easier generation of manufactured solutions during the verification of certain boundary conditions involving many constraints. To perform rigorous code verification, general grids with mixed cell types at the verified boundary are used. A novel approach is introduced to determine manufactured solutions for boundary condition verification when the velocity field is constrained to be divergence-free during the simulation in an incompressible CFD code. Order of accuracy testing using the Method of Manufactured Solutions (MMS) is employed here for code verification of the major components of an open-source, multiphase flow code - MFIX. The presence of two-phase governing equations and a modified SIMPLE-based algorithm requiring divergence-free flows makes the selection of manufactured solutions more involved than for single-phase, compressible flows. Code verification is performed here on 2D and 3D, uniform and stretched meshes for incompressible, steady and unsteady, single-phase and two-phase flows using the two-fluid model of MFIX. In a CFD simulation, truncation error (TE) is the difference between the continuous governing equation and its discrete approximation. Since TE can be shown to be the local source term for the discretization error, TE is proposed as the criterion for determining which regions of the computational mesh should be refined/coarsened. For mesh
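    The systematic-refinement test at the heart of MMS-based code verification reduces to a small calculation: given discretization error norms E on meshes refined by ratio r, the observed order p = log(E_coarse/E_fine)/log(r) should approach the scheme's formal order. A minimal sketch with illustrative error norms:

```python
# Observed order-of-accuracy check as used with the Method of Manufactured
# Solutions. The error norms below are illustrative values for a nominally
# second-order scheme on meshes h, h/2, h/4, h/8.
import math

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """p = log(E_coarse / E_fine) / log(r)."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

errors = [4.0e-3, 1.1e-3, 2.7e-4, 6.8e-5]    # L2 error norms per mesh level
for ec, ef in zip(errors, errors[1:]):
    print(f"observed order: {observed_order(ec, ef):.2f}")
```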

  8. Experimental verification of internal dosimetry calculations: Construction of a heterogeneous phantom based on human organs

    International Nuclear Information System (INIS)

    Lauridsen, B.; Hedemann Jensen, P.

    1987-01-01

    The basic dosimetric quantity in ICRP Publication 30 is the absorbed fraction AF(T←S). This parameter is the fraction of energy absorbed in a target organ T per emission of radiation from activity deposited in the source organ S. Based upon this fraction it is possible to calculate the Specific Effective Energy SEE(T←S). From this, the committed effective dose equivalent from an intake of radioactive material can be found, and thus the annual limit on intake for given radionuclides can be determined. A male phantom has been constructed with the aim of measuring the Specific Effective Energy SEE(T←S) in various target organs. Impressions of real human organs have been used to produce vacuum forms. Tissue-equivalent plastic sheets were sucked into the vacuum forms, producing a shell with a shape identical to the original organ. Each organ has been made of two shells. The same procedure has been used for the body. Thin tubes through the organs make it possible to place TL dosimeters in a matrix so the dose distribution can be measured. The phantom has been supplied with lungs, liver, kidneys, spleen, stomach, bladder, pancreas, and thyroid gland. To select a suitable body liquid for the phantom, laboratory experiments have been made with different liquids and different radionuclides. In these experiments the change in dose rate due to changes in density and composition of the liquid was determined. Preliminary results of the experiments are presented. (orig.)

  9. MENDL2 and IEAF-2001 nuclide production yields data bases verification at intermediate energies.

    Energy Technology Data Exchange (ETDEWEB)

    Titarenko, Y. E. (Yury E.); Batyaev, V. F. (Vyacheslav F.); Zhivun, V. M. (Valery M.); Mulambetov, R. D. (Ruslan D.); Mulambetova, S. V.; Zaitsev, S. L.; Lipatov, K. A.; Mashnik, S. G. (Stepan G.); Prael, R. E. (Richard E.)

    2004-01-01

    The work presents the results of computer simulation of two experiments whose aim was to measure the threshold activation reaction rates in ¹²C, ¹⁹F, ²⁷Al, ⁵⁹Co, ⁶³Cu, ⁶⁵Cu, ⁶⁴Zn, ⁹³Nb, ¹¹⁵In, ¹⁶⁹Tm, ¹⁸¹Ta, ¹⁹⁷Au, and ²⁰⁹Bi thin samples placed inside and outside the 0.8-GeV proton-irradiated 4-cm thick W target and 92-cm thick W-Na composite target, both of 15-cm diameter. In total, more than 1000 activation reaction rate values were determined in the two experiments. The measured reaction rates were compared with the rates simulated by the LAHET code using several nuclear databases for the respective excitation functions, namely, MENDL2/2P for neutron/proton cross sections up to 100 MeV and the recently developed IEAF-2001, which provides neutron cross sections up to 150 MeV. The comparison between the simulation-to-experiment agreements obtained via MENDL2 and IEAF-2001 is presented. The agreement between simulation and experiment has been found generally satisfactory for both databases. The high-energy threshold excitation functions to be used in the activation-based unfolding of neutron spectra inside Accelerator Driven Systems (ADS), particularly with Na-cooled W targets, can be inferred from the results.

  10. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based `Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects which occur in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation and extreme control of these at each stage of the integrated OPC/MDP/mask/silicon lithography flow. The important potential sources of variation we focus on here originate in VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early and thus reduce the likelihood of costly long-loop iterations between the OPC, MDP, and wafer fabrication flows. It moreover describes how to detect regions of a layout or mask where hotspots may occur or where the robustness to intrinsic variations may be improved by modification of the OPC, choice of mask technology, or judicious design of VSB shots and dose assignment.

  11. Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification

    Science.gov (United States)

    Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand; et al.

    2016-01-01

    The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance engineering and physical-science research in the areas of navigation systems of small satellites, provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and verify the performance and durability of III-V Nitride-based materials.

  12. Ground-based multispectral measurements for airborne data verification in non-operating open pit mine "Kremikovtsi"

    Science.gov (United States)

    Borisova, Denitsa; Nikolov, Hristo; Petkov, Doyno

    2013-10-01

    The impact of the mining industry and metal production on the environment is present all over the world. In our research we focus on the impact of the now non-operating ferrous "Kremikovtsi" open pit mine and the related waste dumps and tailings, which we consider to be the major factor responsible for the pollution of a densely populated region in Bulgaria. The approach adopted is based on correct estimation of the distribution of iron oxides inside the open pit mine and the neighboring regions, considered in this case to be the key issue for assessing the ecological state of soils, vegetation and water. For this study the foremost source of data is airborne imagery, which, combined with ground-based in-situ and laboratory data, was used for verification of the environmental variables and thus for assessing the present environmental status influenced by previous mining activities. The percentage of iron content was selected as the main indicator of metal pollution, since it could be reliably identified from the multispectral data used in this study and since iron compounds are widely spread in most minerals, rocks and soils. In our research the number of samples from every source (air, field, lab) was chosen so as to be statistically sound. In order to establish a relationship between the degree of soil pollution and the multispectral data, 40 soil samples were collected during a field campaign in the study area, together with GPS measurements, for two types of laboratory measurements: chemical and mineralogical analysis, and non-destructive spectroscopy. In this work, multispectral satellite data from the Landsat TM/ETM+ instruments and from ALI/OLI (Operational Land Imager) were used for verification of environmental variables over large areas. Ground-based (laboratory and in-situ) spectrometric measurements were performed using the designed and constructed in Remote

  13. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification

    Science.gov (United States)

    Sager, Jennifer E.; Yu, Jingjing; Ragueneau-Majlessi, Isabelle

    2015-01-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, have not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms “PBPK” and “physiologically based pharmacokinetic model” to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines. PMID:26296709

  14. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    International Nuclear Information System (INIS)

    Qiu, J; Li, H. Harlod; Zhang, T; Yang, D; Ma, F

    2015-01-01

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters; they are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance the 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on the contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, the CLAHE algorithm block size, and the clip limit. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window-level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
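    A minimal sketch of the three-step pipeline follows, using OpenCV's CLAHE and a Gaussian high-pass, with a simple grid search standing in for the paper's interior-point optimization; the fixed Gaussian sigma, the parameter ranges, and the synthetic low-contrast image are all assumptions.

```python
# Entropy-maximizing contrast enhancement: weighted Gaussian high-pass,
# then CLAHE, with parameters chosen by a coarse grid search (a stand-in
# for the paper's interior-point optimizer). Parameter ranges are guesses.
import cv2
import numpy as np

def enhance(img, weight, clip, tile):
    """Subtract a weighted Gaussian-smoothed image (high-pass), then CLAHE."""
    smooth = cv2.GaussianBlur(img.astype(np.float32), (0, 0), 8.0)  # assumed sigma
    hp = img.astype(np.float32) - weight * smooth
    hp = cv2.normalize(hp, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=(tile, tile))
    return clahe.apply(hp)

def entropy(img):
    """Shannon entropy of the 8-bit intensity histogram."""
    p = np.bincount(img.ravel(), minlength=256).astype(float)
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(0)
img = (40 + 30 * rng.random((256, 256))).astype(np.uint8)  # synthetic low-contrast image

best = max(((w, c, t) for w in (0.3, 0.5, 0.7)
                      for c in (2.0, 4.0)
                      for t in (8, 16)),
           key=lambda p: entropy(enhance(img, *p)))
print("best (weight, clip, tile):", best)
```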

  15. Marker-based quantification of interfractional tumor position variation and the use of markers for setup verification in radiation therapy for esophageal cancer.

    Science.gov (United States)

    Jin, Peng; van der Horst, Astrid; de Jong, Rianne; van Hooft, Jeanin E; Kamphuis, Martijn; van Wieringen, Niek; Machiels, Melanie; Bel, Arjan; Hulshof, Maarten C C M; Alderliesten, Tanja

    2015-12-01

    The aim of this study was to quantify interfractional esophageal tumor position variation using markers and investigate the use of markers for setup verification. Sixty-five markers placed in the tumor volumes of 24 esophageal cancer patients were identified in computed tomography (CT) and follow-up cone-beam CT. For each patient we calculated pairwise distances between markers over time to evaluate geometric tumor volume variation. We then quantified marker displacements relative to bony anatomy and estimated the variation of systematic (Σ) and random errors (σ). During bony anatomy-based setup verification, we visually inspected whether the markers were inside the planning target volume (PTV) and attempted marker-based registration. Minor time trends with substantial fluctuations in pairwise distances implied tissue deformation. Overall, Σ(σ) in the left-right/cranial-caudal/anterior-posterior direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm; for the proximal stomach, it was 5.4(4.3)/4.9(3.2)/1.9(2.4) mm. After bony anatomy-based setup correction, all markers were inside the PTV. However, due to large tissue deformation, marker-based registration was not feasible. Generally, the interfractional position variation of esophageal tumors is more pronounced in the cranial-caudal direction and in the proximal stomach. Currently, marker-based setup verification is not feasible for clinical routine use, but markers can facilitate the setup verification by inspecting whether the PTV covers the tumor volume adequately. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
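    The Σ and σ quoted above are conventionally estimated by decomposing per-patient displacement series: Σ is the standard deviation of the per-patient mean displacements, and σ is the root mean square of the per-patient standard deviations. A sketch on synthetic displacements (mm), one direction at a time:

```python
# Standard decomposition of interfractional displacements into systematic
# (Sigma) and random (sigma) components. Input series are synthetic.
import numpy as np

rng = np.random.default_rng(42)
# displacements[p]: one patient's marker displacements (mm) over fractions
displacements = [rng.normal(loc=rng.normal(0, 3), scale=2, size=25)
                 for _ in range(24)]

patient_means = np.array([d.mean() for d in displacements])
patient_sds = np.array([d.std(ddof=1) for d in displacements])

M = patient_means.mean()                    # group mean (overall systematic)
Sigma = patient_means.std(ddof=1)           # systematic component
sigma = np.sqrt((patient_sds ** 2).mean())  # random component
print(f"M = {M:.1f} mm, Sigma = {Sigma:.1f} mm, sigma = {sigma:.1f} mm")
```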

  16. Comparison of DVH-based plan verification methods for VMAT: ArcCHECK-3DVH system and dynalog-based dose reconstruction.

    Science.gov (United States)

    Saito, Masahide; Kadoya, Noriyuki; Sato, Kiyokazu; Ito, Kengo; Dobashi, Suguru; Takeda, Ken; Onishi, Hiroshi; Jingu, Keiichi

    2017-07-01

    The purpose of this study was to compare dose-volume histogram (DVH)-based plan verification methods for volumetric modulated arc therapy (VMAT) pretreatment QA. We evaluated two 3D dose reconstruction systems: the ArcCHECK-3DVH system (Sun Nuclear Corp.) and a Varian dynalog-based dose reconstruction (DBDR) system developed in-house. Fifteen prostate cancer patients (67.6 Gy/26 Fr), four head and neck cancer patients (66 Gy/33 Fr), and four esophagus cancer patients (60 Gy/30 Fr) treated with VMAT were studied. First, ArcCHECK measurement was performed on all plans; simultaneously, the Varian dynalog data sets containing the actual delivered parameters (leaf positions, gantry angles, and cumulative MUs) were acquired from the Linac control system. Thereafter, the delivered 3D patient dose was reconstructed by the 3DVH software (two calculation modes were used: High Sensitivity (3DVH-HS) and Normal Sensitivity (3DVH-NS)) and by the in-house DBDR system. We evaluated the differences between the TPS-calculated dose and the reconstructed dose using 3D gamma passing rates and DVH dose index analysis. The average 3D gamma passing rates (3%/3 mm) between the TPS-calculated dose and the reconstructed dose were 99.1 ± 0.6%, 99.7 ± 0.3%, and 100.0 ± 0.1% for 3DVH-HS, 3DVH-NS, and DBDR, respectively. For the prostate cases, the average differences between the TPS-calculated dose and the reconstructed dose in the PTV mean dose were 1.52 ± 0.50%, -0.14 ± 0.55%, and -0.03 ± 0.07% for 3DVH-HS, 3DVH-NS, and DBDR, respectively. For the head and neck and esophagus cases, the dose difference to the TPS-calculated dose caused by heterogeneity was more apparent with the 3DVH dose reconstruction than with DBDR. Despite some residual dose reconstruction errors, these dose reconstruction methods can be used clinically as effective tools for DVH-based QA of VMAT delivery. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley
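    The DVH indices compared above come from a simple construction: for each dose level, the fraction of structure volume receiving at least that dose. A minimal sketch on synthetic planned/delivered dose grids (the clinical systems reconstruct the delivered dose from measurements or dynalogs; grids, masks, and noise here are invented):

```python
# Cumulative DVH and a PTV mean-dose comparison on synthetic dose grids.
import numpy as np

def cumulative_dvh(dose, mask, bins=200):
    """Cumulative DVH: % of structure volume receiving >= each dose level."""
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), bins)
    volume = np.array([(d >= lv).mean() * 100.0 for lv in levels])
    return levels, volume

rng = np.random.default_rng(7)
planned = rng.normal(67.6, 1.0, size=(40, 40, 40))        # "TPS" dose grid [Gy]
delivered = planned + rng.normal(0.0, 0.3, size=planned.shape)
ptv = np.zeros(planned.shape, dtype=bool)
ptv[10:30, 10:30, 10:30] = True                           # toy PTV mask

levels, dvh = cumulative_dvh(delivered, ptv)
v_rx = dvh[np.searchsorted(levels, 67.6)]                 # % volume >= 67.6 Gy
mean_diff = 100 * (delivered[ptv].mean() - planned[ptv].mean()) / planned[ptv].mean()
print(f"V(67.6 Gy) = {v_rx:.1f}%, PTV mean dose difference = {mean_diff:+.2f}%")
```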

  17. Grip-Pattern Verification for Smart Gun Based on Maximum-Pairwise Comparison and Mean-Template Comparison

    NARCIS (Netherlands)

    Shang, X.; Veldhuis, Raymond N.J.

    2008-01-01

    In our biometric verification system for a smart gun, the rightful user of the gun is authenticated by grip-pattern recognition. In this work, verification is done using two types of comparison methods. One is mean-template comparison, where the matching score between a test image and

  18. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    International Nuclear Information System (INIS)

    Lips, Irene M; Dehnad, Homan; Gils, Carla H van; Boeken Kruger, Arto E; Heide, Uulke A van der; Vulpen, Marco van

    2008-01-01

    We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with a severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation if the same dose constraints for the organs at risk will be used

  19. Characterization of a dose verification system dedicated to radiotherapy treatments based on a silicon detector multi-strips

    International Nuclear Information System (INIS)

    Bocca, A.; Cortes Giraldo, M. A.; Gallardo, M. I.; Espino, J. M.; Aranas, R.; Abou Haidar, Z.; Alvarez, M. A. G.; Quesada, J. M.; Vega-Leal, A. P.; Perez Neto, F. J.

    2011-01-01

    In this paper, we present the characterization of a multi-strip silicon detector (SSSSD: Single-Sided Silicon Strip Detector), developed by the company Micron Semiconductors Ltd., for use as a verification system for radiotherapy treatments.

  20. Main control system verification and validation of NPP digital I and C system based on engineering simulator

    International Nuclear Information System (INIS)

    Lin Meng; Hou Dong; Liu Pengfei; Yang Zongwei; Yang Yanhua

    2010-01-01

    Full-scope digital instrumentation and control (I and C) systems are being introduced in newly constructed Chinese nuclear power plants (NPPs); they mainly include three parts: the control system, the reactor protection system, and the engineered safety feature actuation system. For example, the SIEMENS TELEPERM XP and XS distributed control systems (DCS) have been used in the Ling Ao Phase II NPP, located in Guangdong province, China. This is the first NPP project in China in which Chinese engineers are fully responsible for the entire configuration of the actual analog and logic diagrams, although experience with full-scope digital I and C in NPPs is very limited. For safety, it has to be ensured that the configuration is correct and that the control functions can be accomplished before the phase of real plant testing on the reactor. Therefore, preliminary verification and validation (V and V) of the I and C needs to be carried out. Besides the common and basic way, i.e., checking the diagram configuration item by item against the original design, an NPP engineering simulator is applied as another effective approach to V and V. For this purpose, a virtual NPP thermal-hydraulic model is established according to the Ling Ao Phase II NPP design, and the NPP simulation tools can provide plant operation parameters to the DCS, accept control signals from the I and C, and give responses. During the test, a set of data acquisition equipment is used to build a connection between the engineering simulator (software) and the SIEMENS DCS I/O cabinet (hardware). In this emulation, the original diagram configuration in the DCS and the field hardware structures are kept unchanged. In this way, it is first judged whether there are problems by observing the input and output of the DCS without knowing the internal configuration. Second, problems can be found and corrected by understanding and checking the exact and complex configuration in detail. At last, the correctness and functionality of the control system are verified. This method is

  1. Description of a Computer Program Written for Approach and Landing Test Post Flight Data Extraction of Proximity Separation Aerodynamic Coefficients and Aerodynamic Data Base Verification

    Science.gov (United States)

    Homan, D. J.

    1977-01-01

    A computer program written to calculate the proximity aerodynamic force and moment coefficients of the Orbiter/Shuttle Carrier Aircraft (SCA) vehicles based on flight instrumentation is described. The ground reduced aerodynamic coefficients and instrumentation errors (GRACIE) program was developed as a tool to aid in flight test verification of the Orbiter/SCA separation aerodynamic data base. The program calculates the force and moment coefficients of each vehicle in proximity to the other, using the load measurement system data, flight instrumentation data and the vehicle mass properties. The uncertainty in each coefficient is determined, based on the quoted instrumentation accuracies. A subroutine manipulates the Orbiter/747 Carrier Separation Aerodynamic Data Book to calculate a comparable set of predicted coefficients for comparison to the calculated flight test data.

  2. WE-DE-BRA-01: SCIENCE COUNCIL JUNIOR INVESTIGATOR COMPETITION WINNER: Acceleration of a Limited-Angle Intrafraction Verification (LIVE) System Using Adaptive Prior Knowledge Based Image Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Y; Yin, F; Ren, L [Duke University Medical Center, Durham, NC (United States)]; Zhang, Y [UT Southwestern Medical Ctr at Dallas, Dallas, TX (United States)]

    2016-06-15

    Purpose: To develop an adaptive prior knowledge based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam’s eye view (BEV) MV cine images acquired from the treatment beam together with orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change, and phase shift. Limited-angle orthogonal kV and beam’s eye view (BEV) MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. Volume-percentage-difference (VPD) and center-of-mass-shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The

  3. WE-DE-BRA-01: SCIENCE COUNCIL JUNIOR INVESTIGATOR COMPETITION WINNER: Acceleration of a Limited-Angle Intrafraction Verification (LIVE) System Using Adaptive Prior Knowledge Based Image Estimation

    International Nuclear Information System (INIS)

    Zhang, Y; Yin, F; Ren, L; Zhang, Y

    2016-01-01

    Purpose: To develop an adaptive prior knowledge based image estimation method to reduce the scan angle needed in the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has been previously proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam’s eye view (BEV) MV cine images acquired from the treatment beam together with orthogonally acquired limited-angle kV projections to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during the treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment to utilize the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change, and phase shift. Limited-angle orthogonal kV and beam’s eye view (BEV) MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. Volume-percentage-difference (VPD) and center-of-mass-shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate the reconstruction accuracy. Results: Using orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The
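    The two accuracy metrics used above are straightforward to compute from binary tumor masks: the volume-percentage-difference (taken here as the non-overlapping volume relative to the ground-truth volume, one common definition) and the center-of-mass shift. A sketch with synthetic masks standing in for the XCAT ground truth:

```python
# VPD and COMS between a reconstructed and a ground-truth tumor mask.
# Masks and voxel size are synthetic stand-ins.
import numpy as np
from scipy.ndimage import center_of_mass

def vpd(recon, truth):
    """Non-overlapping volume relative to the ground-truth volume (%)."""
    return 100.0 * np.logical_xor(recon, truth).sum() / truth.sum()

def coms(recon, truth, voxel_mm=(1.0, 1.0, 1.0)):
    """Center-of-mass shift in mm."""
    shift = np.subtract(center_of_mass(recon), center_of_mass(truth))
    return np.linalg.norm(shift * np.asarray(voxel_mm))

truth = np.zeros((64, 64, 64), dtype=bool)
truth[20:40, 20:40, 20:40] = True
recon = np.roll(truth, 1, axis=0)        # a 1-voxel mis-reconstruction
print(f"VPD = {vpd(recon, truth):.1f}%, COMS = {coms(recon, truth):.1f} mm")
```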

  4. Design and verification of computer-based reactor control system modification at Bruce-A CANDU nuclear generating station

    International Nuclear Information System (INIS)

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  5. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    Energy Technology Data Exchange (ETDEWEB)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P. [and others]

    1996-12-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising.

  6. Optimized periodic verification testing blended risk and performance-based MOV inservice test program an application of ASME code case OMN-1

    International Nuclear Information System (INIS)

    Sellers, C.; Fleming, K.; Bidwell, D.; Forbes, P.

    1996-01-01

    This paper presents an application of ASME Code Case OMN-1 to the GL 89-10 Program at the South Texas Project Electric Generating Station (STPEGS). Code Case OMN-1 provides guidance for a performance-based MOV inservice test program that can be used for periodic verification testing and allows consideration of risk insights. Blended probabilistic and deterministic evaluation techniques were used to establish inservice test strategies including both test methods and test frequency. Described in the paper are the methods and criteria for establishing MOV safety significance based on the STPEGS probabilistic safety assessment, deterministic considerations of MOV performance characteristics and performance margins, the expert panel evaluation process, and the development of inservice test strategies. Test strategies include a mix of dynamic and static testing as well as MOV exercising

  7. The effect of radar-based QPE on the Fractions Skill Score used at the QPF verification

    Czech Academy of Sciences Publication Activity Database

    Zacharov, Petr, jr.; Řezáčová, Daniela

    2010-01-01

    Vol. 25 (2010), p. 91-95 ISSN 1680-7340 R&D Projects: GA MŠk OC 112; GA AV ČR(CZ) IAA300420804; GA ČR GA205/07/0905 Institutional research plan: CEZ:AV0Z30420517 Keywords: quantitative precipitation forecast * quantitative precipitation estimate * radar * verification * convective weather Subject RIV: DG - Atmosphere Sciences, Meteorology www.adv-geosci.net/25/91/2010/
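    The record above carries only metadata, but the metric in its title is compact: the Fractions Skill Score compares neighbourhood fractions of threshold exceedances between forecast and observation, FSS = 1 - MSE(f, o)/MSE_ref. A minimal sketch on synthetic precipitation fields (the threshold, window sizes, and fields are invented):

```python
# Fractions Skill Score on synthetic precipitation fields: binarize at a
# threshold, average over square neighbourhoods, compare fraction fields.
import numpy as np
from scipy.ndimage import uniform_filter

def fss(forecast, observed, threshold, window):
    f = uniform_filter((forecast >= threshold).astype(float), size=window)
    o = uniform_filter((observed >= threshold).astype(float), size=window)
    mse = np.mean((f - o) ** 2)
    mse_ref = np.mean(f ** 2) + np.mean(o ** 2)   # no-skill reference
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

rng = np.random.default_rng(3)
obs = rng.gamma(2.0, 2.0, size=(100, 100))
fcst = np.roll(obs, 5, axis=1)                    # spatially displaced forecast
for window in (1, 5, 15, 31):
    print(f"window {window:2d}: FSS = {fss(fcst, obs, 5.0, window):.3f}")
```

    As expected for a displaced but otherwise correct forecast, the score rises as the neighbourhood window grows, which is exactly the behaviour the FSS is designed to expose.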

  8. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    Science.gov (United States)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy in fully utilizing the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge) that image the prompt gammas (PG) emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of the multi-slit and knife-edge collimators for non-invasive proton beam range verification. PG imaging was simulated with a validated GATE/GEANT4 Monte Carlo code modeling the spot-scanning proton therapy and a cylindrical PMMA phantom in detail. For each spot, 10⁸ protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff regions of the PG profiles were fitted with a 3-line-segment curve function as the range estimate. Factors that may influence the accuracy of range detection, including the energy window setting, proton energy, phantom size, and phantom shift, were studied. Results indicated that both collimator systems achieve reasonable accuracy and good response to the phantom shift. The accuracy of the range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system can achieve a higher detection efficiency that leads to a smaller deviation in the predicted range. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of the two systems, especially the multi-slit system. Therefore, a neutron reduction technique for improving the accuracy of range verification in proton therapy is needed.
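    The range estimation step above fits a 3-line-segment curve to the falloff of the PG depth profile. A sketch of that idea follows, assuming a plateau/linear-falloff/background parameterization and a synthetic Poisson-noised profile (the paper's exact parameterization may differ):

```python
# Fit a 3-segment curve (plateau, linear falloff, background tail) to a
# synthetic prompt-gamma depth profile; the falloff midpoint serves as a
# range surrogate. Depths, counts, and noise are invented.
import numpy as np
from scipy.optimize import curve_fit

def three_segment(x, x1, x2, top, bottom):
    """Plateau at `top` until x1, linear drop to `bottom` at x2, flat after."""
    return np.interp(x, [x1, x2], [top, bottom])

depth = np.linspace(0.0, 200.0, 200)                     # mm
true = three_segment(depth, 120.0, 140.0, 1000.0, 50.0)
counts = np.random.default_rng(5).poisson(true).astype(float)

popt, _ = curve_fit(three_segment, depth, counts,
                    p0=[100.0, 160.0, counts.max(), counts.min()])
x1, x2, top, bottom = popt
print(f"estimated falloff midpoint (range surrogate): {(x1 + x2) / 2:.1f} mm")
```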

  9. Mass spectrometry based biomarker discovery, verification, and validation--quality assurance and control of protein biomarker assays.

    Science.gov (United States)

    Parker, Carol E; Borchers, Christoph H

    2014-06-01

    In its early years, mass spectrometry (MS)-based proteomics focused on the cataloging of proteins found in different species or different tissues. By 2005, proteomics was being used for protein quantitation, typically based on "proteotypic" peptides which act as surrogates for the parent proteins. Biomarker discovery is usually done by non-targeted "shotgun" proteomics, using relative quantitation methods to determine protein expression changes that correlate with disease (output given as "up-or-down regulation" or "fold-increases"). MS-based techniques can also perform "absolute" quantitation which is required for clinical applications (output given as protein concentrations). Here we describe the differences between these methods, factors that affect the precision and accuracy of the results, and some examples of recent studies using MS-based proteomics to verify cancer-related biomarkers. Copyright © 2014 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  10. Development of digital device based work verification system for cooperation between main control room operators and field workers in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min, E-mail: jewellee@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Lee, Hyun Chul, E-mail: leehc@kaeri.re.kr [Korea Atomic Energy Research Institute, 305-353, 989-111 Daedeok-daero, Yuseong-gu, Daejeon (Korea, Republic of); Ha, Jun Su, E-mail: junsu.ha@kustar.ac.ae [Department of Nuclear Engineering, Khalifa University of Science Technology and Research, Abu Dhabi P.O. Box 127788 (United Arab Emirates); Seong, Poong Hyun, E-mail: phseong@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)

    2016-10-15

    Highlights: • A digital device-based work verification and cooperation support system was developed. • Requirements were derived by interviewing field operators having experience with mobile-based work support systems. • The usability of the proposed system was validated by conducting questionnaire surveys. • The proposed system will be useful if the manual or the set of guidelines is well constructed. - Abstract: Digital technologies have been applied in the nuclear field to check task results, monitor events and accidents, and transmit/receive data. The results of using digital devices have proven that these devices can provide high accuracy and convenience for workers, producing clear positive effects by reducing their workloads. In this study, as one step forward, a digital device-based cooperation support system, the nuclear cooperation support and mobile documentation system (Nu-COSMOS), is proposed to support communication between main control room (MCR) operators and field workers by verifying field workers’ work results in nuclear power plants (NPPs). The proposed system consists of a mobile-based information storage system, which supports field workers by providing various functions that make them more trusted by MCR operators and improve meeting efficiency, and a large-screen-based information sharing system, which supports meetings by allowing both sides to share one medium. The usability of this system was estimated by interviewing field operators working in nuclear power plants and experts with experience working as operators. A survey to estimate the usability of the suggested system and the suitability of its functions for field work was conducted with 35 subjects who have experience in field work or in support-system development research. The usability test was conducted using the system usability scale (SUS), which is widely used in industrial usability evaluation. Using questionnaires

  11. Design and experimental verification of near-field Ka-band probe based on wideband OMJ with minimum higher-order spherical mode content

    DEFF Research Database (Denmark)

    Foged, L. J.; Giacomini, A.; Morbidini, R

    2012-01-01

    technology capable of maintaining the same high performance standards of traditional probes. However, in typical Spherical Near Field (SNF) measurement scenarios, the applicable frequency range of the single probe can also be limited by the content of μ≠1 spherical modes in the probe pattern [6...... probe/AUT distance this assumption may lead to unacceptable errors in special cases. This paper describes the design and experimental verification of a Ka-band probe based on the inverted ridge technology. The probe is intended for high precision SNF measurements in special conditions that require less...... than -45dB higher order spherical mode content. This performance level has been accomplished through careful design of the probe and meticulous selection of the components used in the external balanced feeding scheme. The paper reports on the electrical and mechanical design considerations...

  12. Development of synchronized control method for shaking table with booster device. Verification of the capabilities based on both real facility and numerical simulator

    International Nuclear Information System (INIS)

    Kajii, Shin-ichirou; Yasuda, Chiaki; Yamashita, Toshio; Abe, Hiroshi; Kanki, Hiroshi

    2004-01-01

    In the seismic design of nuclear power plants, the use of probabilistic methods in addition to deterministic methods has recently been considered. The probabilistic approach is called Seismic Probabilistic Safety Assessment (Seismic PSA). In a seismic PSA of some components of a nuclear power plant using a shaking table, the tests must reproduce limiting conditions with high acceleration levels, such as actual earthquake conditions. However, it can be difficult to achieve such test conditions with a current shaking table based on a hydraulic power system. Therefore, we have been planning a test method in which a current shaking table is combined with another shaking device called a booster. This paper describes the verification test of synchronized control between a current shaking table and a booster device. (author)

  13. Guidelines for the verification and validation of expert system software and conventional software: Evaluation of knowledge base certification methods. Volume 4

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This report presents the results of the Knowledge Base Certification activity of the expert systems verification and validation (V&V) guideline development project, which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was that of a behavioral experiment, the first known such evaluation of any type of V&V activity. The value of such experimentation is its capability to provide empirical evidence for -- or against -- the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG&G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V&V methods selected as being the best and most appropriate for study on the basis of prior evaluation activities. These methods either involved the analysis and tracing of requirements to elements in the knowledge base (requirements grouping and requirements tracing) or else involved direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V&V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases

  14. Towards agent-based modelling and verification of collaborative business processes : An approach centred on interactions and behaviours

    NARCIS (Netherlands)

    Stuit, M.; Szirbik, N.

    2009-01-01

    This paper presents the process-oriented aspects of a formal and visual agent-based business process modeling language. The language is of use for (networks of) organizations that elect or envisage multi-agent systems for the support of collaborative business processes. The paper argues that the

  15. Design and verification of focal plane assembly thermal control system of one space-based astronomy telescope

    Science.gov (United States)

    Yang, Wen-gang; Fan, Xue-wu; Wang, Chen-jie; Wang, Ying-hao; Feng, Liang-jie; Du, Yun-fei; Ren, Guo-rui; Wang, Wei; Li, Chuang; Gao, Wei

    2015-10-01

    One space-based astronomy telescope will observe astronomical objects fainter than 23rd magnitude. To ensure the telescope performance, the very low system noise requirement demands an extremely low CCD operating temperature (lower than -65°C). Because the satellite will be launched into a low earth orbit, inevitable space external heat fluxes will result in a high radiator sink temperature (higher than -65°C). Passive measures alone cannot meet the focal plane cooling specification, and active cooling technologies must be utilized. Based on detailed analysis of the thermal environment of the telescope and the thermal characteristics of the focal plane assembly (FPA), an active cooling system based on thermo-electric coolers (TECs) and a heat rejection system (HRS) based on flexible heat pipes and a radiator have been designed. Power consumption of the TECs depends on the heat pumping requirements and the hot side temperature. Heat rejection capability of the HRS mainly depends on the radiator size and temperature. To balance TEC power consumption against the radiator size requirement, the thermal design of the FPA must be optimized. Parasitic heat loads on the detector are minimized to reduce the heat pumping demands on the TECs and their power consumption. Thermal resistance of the heat rejection system is minimized to reject the heat dissipation of the TECs from the hot side to the radiator efficiently. The size and surface coating of the radiator are optimized to balance heat rejection requirements against system constraints. Based on the above work, transient thermal analysis of the FPA was performed. An FPA prototype model has been developed and a thermal vacuum/balance test has been accomplished. From the test, the temperatures of key parts and the working parameters of the TECs in extreme cases have been acquired. Test results show that the CCD can be controlled below -65°C and all parts worked well during the test. All of these verified the thermal design of the FPA and some lessons will be presented in this

  16. Clinical application of in vivo treatment delivery verification based on PET/CT imaging of positron activity induced at high energy photon therapy

    Science.gov (United States)

    Janek Strååt, Sara; Andreassen, Björn; Jonsson, Cathrine; Noz, Marilyn E.; Maguire, Gerald Q., Jr.; Näfstadius, Peder; Näslund, Ingemar; Schoenahl, Frederic; Brahme, Anders

    2013-08-01

    The purpose of this study was to investigate in vivo verification of radiation treatment with high energy photon beams using PET/CT to image the induced positron activity. The measurements of the positron activation induced in a preoperative rectal cancer patient and a prostate cancer patient following 50 MV photon treatments are presented. Total doses of 5 and 8 Gy, respectively, were delivered to the tumors. Imaging was performed with a 64-slice PET/CT scanner for 30 min, starting 7 min after the end of the treatment. The CT volume from the PET/CT and the treatment planning CT were coregistered by matching anatomical reference points in the patient. The treatment delivery was imaged in vivo based on the distribution of the induced positron emitters produced by photonuclear reactions in tissue, mapped onto the associated dose distribution of the treatment plan. The results showed that the spatial distribution of induced activity in both patients agreed well with the delivered beam portals of the treatment plans in the entrance subcutaneous fat regions, but less so in blood- and oxygen-rich soft tissues. For the preoperative rectal cancer patient, however, a 2 ± 0.5 cm misalignment was observed in the cranial-caudal direction of the patient between the induced activity distribution and the treatment plan, indicating a beam-patient setup error. No misalignment of this kind was seen in the prostate cancer patient. However, due to a hasty patient setup in the PET/CT scanner, a slight mispositioning of the patient was observed in all three planes, resulting in a deformed activity distribution compared to the treatment plan. The present study indicates that the positron emitters induced by high energy photon beams can be measured quite accurately using PET imaging of subcutaneous fat to allow portal verification of the delivered treatment beams. Measurement of the induced activity in the patient 7 min after receiving 5 Gy involved count rates which were about

  17. Transport Mechanisms and Quality Changes During Frying of Chicken Nuggets--Hybrid Mixture Theory Based Modeling and Experimental Verification.

    Science.gov (United States)

    Bansal, Harkirat S; Takhar, Pawan S; Alvarado, Christine Z; Thompson, Leslie D

    2015-12-01

    Hybrid mixture theory (HMT) based 2-scale fluid transport relations of Takhar coupled with a multiphase heat transfer equation were solved to model water, oil and gas movement during frying of chicken nuggets. A chicken nugget was treated as a heterogeneous material consisting of a meat core with wheat-based coating. The coupled heat and fluid transfer equations were solved using the finite element method. Numerical simulations resulted in data on spatial and temporal profiles for moisture, rate of evaporation, temperature, oil, pore pressure, pressure in various phases, and coefficient of elasticity. Results showed that most of the oil stayed in the outer 1.5 mm of the coating region. Temperature values greater than 100 °C were observed in the coating after 30 s of frying. Negative gage pore pressure (p(w) < p(g)) in the hydrophilic matrix causes p(w) to remain below p(g) throughout the frying time. © 2015 Institute of Food Technologists®

  18. Thermal-hydraulics verification of a coarse-mesh OpenFOAM-based solver for a Sodium Fast Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Bonet López, M.

    2015-07-01

    Recently, the Swiss Paul Scherrer Institut has developed a multiphysics platform, based on OpenFOAM, that is capable of performing multidimensional analysis of a nuclear reactor. One of the main objectives of this project is to verify the part of the code responsible for the thermal-hydraulic analysis of the reactor. To carry out simulations, this part of the code uses a coarse-mesh approximation based on the equations of a porous medium. Therefore, the other objective is to demonstrate that this method is applicable to the analysis of a sodium fast reactor, focusing on its capability to predict the heat transfer between a subassembly and the void space between subassemblies of the reactor core. (Author)

  19. Statistical methods for improving verification of claims of origin for Italian wines based on stable isotope ratios

    International Nuclear Information System (INIS)

    Dordevic, N.; Wehrens, R.; Postma, G.J.; Buydens, L.M.C.; Camin, F.

    2012-01-01

    Highlights: ► The assessment of claims of origin is of enormous economic importance for DOC and DOCG wines. ► The official method is based on univariate statistical tests of H, C and O isotopic ratios. ► We consider 5220 Italian wine samples collected in the period 2000–2010. ► Multivariate statistical analysis leads to much better specificity and easier detection of false claims of origin. ► In the case of multi-modal data, mixture modelling provides additional improvements. - Abstract: Wine derives its economic value to a large extent from geographical origin, which has a significant impact on the quality of the wine. According to food legislation, wines are classified as wines without geographical origin (table wines) and wines with origin. Wines with origin must have characteristics which are essential due to their region of production and must be produced, processed and prepared exclusively within that region. The development of fast and reliable analytical methods for the assessment of claims of origin is very important. The current official method is based on the measurement of stable isotope ratios of water and alcohol in wine, which are influenced by climatic factors. The results in this paper are based on 5220 Italian wine samples collected in the period 2000–2010. We evaluate the univariate approach underlying the official method to assess claims of origin and propose several new methods to achieve better geographical discrimination between samples. It is shown that multivariate methods are superior to univariate approaches in that they show increased sensitivity and specificity. In cases where data are non-normally distributed, an approach based on mixture modelling provides additional improvements.
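
    The multivariate improvement the abstract describes can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' code: it trains a linear discriminant classifier on synthetic three-feature "isotope ratio" data standing in for the (D/H), δ13C, and δ18O measurements, and flags a claim of origin when the declared region receives a low posterior probability. All data values and region centroids are invented; for multi-modal data one would swap in sklearn.mixture.GaussianMixture per region, in the spirit of the abstract's mixture modelling.

```python
# Minimal illustration of multivariate origin verification from stable
# isotope ratios. Data are synthetic stand-ins for (D/H), d13C and d18O
# measurements; the 5220-sample Italian dataset is not public.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
cov = [[1.0, 0.3, 0.2], [0.3, 1.0, 0.4], [0.2, 0.4, 1.0]]
region_a = rng.multivariate_normal([102.0, -27.0, 3.0], cov, size=200)
region_b = rng.multivariate_normal([101.0, -26.2, 4.1], cov, size=200)
X = np.vstack([region_a, region_b])
y = np.array([0] * 200 + [1] * 200)

clf = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# A claim of origin is questioned when the declared region gets a low posterior.
clf.fit(X, y)
posterior = clf.predict_proba([[102.1, -27.1, 2.9]])[0]
print("P(region A) = %.2f, P(region B) = %.2f" % (posterior[0], posterior[1]))
```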

  20. Mechanistic Physiologically Based Pharmacokinetic (PBPK) Model of the Heart Accounting for Inter-Individual Variability: Development and Performance Verification.

    Science.gov (United States)

    Tylutki, Zofia; Mendyk, Aleksander; Polak, Sebastian

    2018-04-01

    Modern model-based approaches to cardiac safety and efficacy assessment require accurate establishment of drug concentration-effect relationships. Thus, knowledge of the active concentration of drugs in heart tissue is desirable, along with estimation of the influence of inter-subject variability. To that end, we developed a mechanistic physiologically based pharmacokinetic model of the heart. The model was described with literature-derived parameters and implemented in R, v.3.4.0. Five parameters were estimated. The model was fitted to amitriptyline and nortriptyline concentrations after an intravenous infusion of amitriptyline. The cardiac model consisted of 5 compartments representing the pericardial fluid, heart extracellular water, and epicardial intracellular, midmyocardial intracellular, and endocardial intracellular fluids. Drug cardiac metabolism, passive diffusion, active efflux, and uptake were included in the model as mechanisms involved in the drug disposition within the heart. The model accounted for inter-individual variability. The estimates of the optimized parameters were within physiological ranges. The model performance was verified by simulating 5 clinical studies of amitriptyline intravenous infusion, and the simulated pharmacokinetic profiles agreed with clinical data. The results support the model's feasibility. The proposed structure can be tested with the goal of improving patient-specific model-based cardiac safety assessment and offers a framework for predicting cardiac concentrations of various xenobiotics. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
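
    As a rough illustration of the kind of structure described above (not the authors' R implementation), the sketch below sets up a small compartment system with passive diffusion and intracellular clearance and integrates it with SciPy. Compartment names loosely follow the abstract; all volumes, permeability-surface products, clearances, and the plasma forcing function are invented for illustration.

```python
# Illustrative compartment-model sketch in the spirit of the abstract's
# cardiac PBPK structure (extracellular water, pericardial fluid,
# intracellular fluid). All parameter values are invented, not the
# paper's fitted estimates, and the real model (in R) is richer.
import numpy as np
from scipy.integrate import solve_ivp

V = np.array([0.30, 0.03, 0.20])   # volumes (L): ecw, pericardial, intracellular
Q_EX = 1.5                         # exchange with plasma (L/h), assumed
PS_PERI, PS_CELL = 0.05, 0.20      # permeability-surface products (L/h), assumed
CL_MET = 0.02                      # intracellular metabolic clearance (L/h), assumed

def c_plasma(t):
    """Plasma forcing: rise during a 2 h infusion, then first-order decay."""
    return 0.5 * (1.0 - np.exp(-2.0 * t)) * np.exp(-0.1 * max(t - 2.0, 0.0))

def rhs(t, a):
    c = a / V                                   # amounts -> concentrations
    j_peri = PS_PERI * (c[0] - c[1])            # passive diffusion fluxes
    j_cell = PS_CELL * (c[0] - c[2])
    return [Q_EX * (c_plasma(t) - c[0]) - j_peri - j_cell,
            j_peri,
            j_cell - CL_MET * c[2]]

sol = solve_ivp(rhs, (0.0, 24.0), [0.0, 0.0, 0.0], max_step=0.1)
print("intracellular concentration at 24 h: %.4f mg/L" % (sol.y[2, -1] / V[2]))
```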

  1. CLSI-based transference and verification of CALIPER pediatric reference intervals for 29 Ortho VITROS 5600 chemistry assays.

    Science.gov (United States)

    Higgins, Victoria; Truong, Dorothy; Woroch, Amy; Chan, Man Khun; Tahmasebi, Houman; Adeli, Khosrow

    2018-03-01

    Evidence-based reference intervals (RIs) are essential to accurately interpret pediatric laboratory test results. To fill gaps in pediatric RIs, the Canadian Laboratory Initiative on Pediatric Reference Intervals (CALIPER) project developed an age- and sex-specific pediatric RI database based on healthy pediatric subjects. Originally established for Abbott ARCHITECT assays, CALIPER RIs were transferred to assays on Beckman, Roche, Siemens, and Ortho analytical platforms. This study provides transferred reference intervals for 29 biochemical assays for the Ortho VITROS 5600 Chemistry System (Ortho). Based on Clinical Laboratory Standards Institute (CLSI) guidelines, a method comparison analysis was performed by measuring approximately 200 patient serum samples using Abbott and Ortho assays. The equation of the line of best fit was calculated and the appropriateness of the linear model was assessed. This equation was used to transfer RIs from Abbott to Ortho assays. Transferred RIs were verified using 84 healthy pediatric serum samples from the CALIPER cohort. RIs for most chemistry analytes successfully transferred from Abbott to Ortho assays. Calcium and CO2 did not meet the statistical criteria for transference (r2 below the acceptance criterion). The transferred RIs extend the CALIPER pediatric RI database to laboratories using Ortho VITROS 5600 biochemical assays. Clinical laboratories should verify CALIPER reference intervals for their specific analytical platform and local population as recommended by CLSI. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
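
    The transference step itself is a short calculation. The sketch below, with invented data, follows the general CLSI-style recipe the abstract describes: regress the new (Ortho) method on the established (Abbott) method, push the established RI limits through the fitted line, then verify the transferred interval against healthy-subject samples. The 2-of-20 acceptance rule shown is a commonly used CLSI-style verification criterion, not a value taken from this study.

```python
# Sketch of CLSI-style reference-interval transference between two analyzers.
# Data are invented; the paper used ~200 comparison samples per analyte and
# 84 healthy CALIPER samples for verification.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(1)
abbott = rng.uniform(20, 120, size=200)                  # comparative method
ortho = 0.92 * abbott + 3.0 + rng.normal(0, 2.5, 200)    # candidate method

fit = linregress(abbott, ortho)
print(f"slope={fit.slope:.3f} intercept={fit.intercept:.2f} r^2={fit.rvalue**2:.3f}")

# Map the established (Abbott) RI limits through the regression line.
lower, upper = 35.0, 95.0                                # illustrative RI
t_lower = fit.slope * lower + fit.intercept
t_upper = fit.slope * upper + fit.intercept
print(f"transferred RI: [{t_lower:.1f}, {t_upper:.1f}]")

# Verification: with 20 healthy-subject samples, a transferred RI is commonly
# accepted when no more than 2 results fall outside it (CLSI-style rule).
healthy = 0.92 * rng.uniform(40, 90, size=20) + 3.0 + rng.normal(0, 2.5, 20)
outside = np.sum((healthy < t_lower) | (healthy > t_upper))
print(f"{outside}/20 verification samples outside the transferred RI")
```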

  2. Development and verification of a 281-group WIMS-D library based on ENDF/B-VII.1

    International Nuclear Information System (INIS)

    Dong, Zhengyun; Wu, Jun; Ma, Xubo; Yu, Hui; Chen, Yixue

    2016-01-01

    Highlights: • A new WIMS-D library based on the SHEM 281 energy group structure is developed. • The method for calculating the lambda factor is illustrated and parameters are discussed. • The results show the improvements of this library compared with other libraries. - Abstract: WIMS-D libraries based on the WIMS 69 or XMAS 172 energy group structures are widely used in thermal reactor research. However, the resonance overlap effect is not taken into account in these two energy group structures, which limits the accuracy of the resonance treatment. The SHEM 281 group structure was designed in France to avoid the resonance overlap effect. In this study, a new WIMS-D library with the SHEM 281 mesh is developed by using the NJOY nuclear data processing system based on the latest evaluated nuclear data library, ENDF/B-VII.1. The parameters that depend on the group structure, such as the thermal cut-off energy and the lambda factor, are discussed. The lambda factor is calculated by the Neutron Resonance Spectrum Calculation System and the effect of this factor is analyzed. The new library is verified through the analysis of various criticality benchmarks using the DRAGON code. The calculated multiplication factors are consistent with the experimental data and the results are also improved in comparison with other WIMS-D libraries.

  3. Differential evolution algorithm based photonic structure design: numerical and experimental verification of subwavelength λ/5 focusing of light

    Science.gov (United States)

    Bor, E.; Turduev, M.; Kurt, H.

    2016-01-01

    Photonic structure designs based on optimization algorithms provide superior properties compared to those using intuition-based approaches. In the present study, we numerically and experimentally demonstrate subwavelength focusing of light using wavelength scale absorption-free dielectric scattering objects embedded in an air background. An optimization algorithm based on differential evolution integrated into the finite-difference time-domain method was applied to determine the locations of each circular dielectric object with a constant radius and refractive index. The multiobjective cost function defined inside the algorithm ensures strong focusing of light with low intensity side lobes. The temporal and spectral responses of the designed compact photonic structure provided a beam spot size in air with a full width at half maximum value of 0.19λ, where λ is the wavelength of light. The experiments were carried out in the microwave region to verify numerical findings, and very good agreement between the two approaches was found. The subwavelength light focusing is associated with a strong interference effect due to nonuniformly arranged scatterers and an irregular index gradient. Improving the focusing capability of optical elements by surpassing the diffraction limit of light is of paramount importance in optical imaging, lithography, data storage, and strong light-matter interaction. PMID:27477060
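
    The design loop pairs a global optimizer with a field solver. The sketch below keeps the differential evolution part faithful in shape (scipy.optimize.differential_evolution over scatterer coordinates, with a cost rewarding a tight focus and penalizing side lobes) but replaces the FDTD solver with a toy coherent point-emitter model so the example stays self-contained; the wavelength, bounds, and cost weights are invented.

```python
# Shape of the design loop in the abstract: differential evolution searches
# scatterer positions to minimize a multiobjective focusing cost. The real
# work evaluates each candidate with FDTD; a toy interference model of
# cylindrical re-radiators stands in for the solver here.
import numpy as np
from scipy.optimize import differential_evolution

k = 2 * np.pi                      # wavenumber for wavelength = 1
y = np.linspace(-3, 3, 601)        # observation line (wavelength units)
d = 2.0                            # distance from scatterer line to observation line

def field(xs):
    # Each scatterer re-radiates a cylindrical wave toward the observation line.
    r = np.sqrt(d**2 + (y[None, :] - xs[:, None])**2)
    return np.sum(np.exp(1j * k * r) / np.sqrt(r), axis=0)

def cost(xs):
    inten = np.abs(field(np.asarray(xs)))**2
    center = inten[len(y) // 2]            # focal intensity at y = 0
    side = inten[np.abs(y) > 0.5].max()    # strongest side lobe
    return -center + 2.0 * side            # focus high, side lobes low

bounds = [(-2.5, 2.5)] * 8                 # eight scatterer positions
result = differential_evolution(cost, bounds, seed=0, maxiter=200, tol=1e-6)
print("optimized positions:", np.round(np.sort(result.x), 2))
print("cost:", result.fun)
```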

  4. Automatic fault detection in solar thermal systems based on rule verification and simulation

    Science.gov (United States)

    Maltais Larouche, Simon

    Solar hot water systems are often considered to lower the energy costs and greenhouse gas emissions related to the production of domestic hot water. Although the capital costs associated with solar domestic hot water systems are decreasing each year, they are still significantly higher than conventional solutions, and these extra costs are compensated by reduced energy bills. In order to be economically viable, these systems must then deliver a satisfactory performance over their useful lifetime. Unfortunately, it is not uncommon for solar hot water systems to encounter issues which result in a reduction of energy savings and/or their useful life span. These issues often result from poor design, careless installation, and a lack of maintenance. Furthermore, it is frequent that a system's owners stay unaware of a failure for an extended period of time, since these systems are generally equipped with auxiliary heating designed to meet the entire heat load. Thus, a system could be underperforming or out of service for months or even years without any noticeable symptoms from the hot water consumer's point of view. In this respect, it is important to find solutions to automatically warn a system's owner or manager in case of a system failure. This thesis presents an original two-level automatic fault detection method developed specifically for solar hot water systems. The first level monitors a system's operating conditions (e.g. temperatures, flow rates, pressures) through a rule-based algorithm. In the second level, the daily performances of the solar circuit and the auxiliary heater are evaluated using TRNSYS simulations and compared with the measured performance in order to determine if there is a significant discrepancy. The method was assessed using three years of operation data from a solar hot water system composed of 11 evacuated tubes with a total area of 35.5 m2 located at l'Accueil Bonneau in Montreal, Canada. The validation was also used to determine
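
    The first detection level lends itself to a compact sketch. The rules and thresholds below are invented examples of the kind of operating-condition checks described above, not the rule base of the thesis:

```python
# Sketch of a first-level, rule-based check over a solar loop's operating
# data. Rule conditions and threshold values are invented illustrations.
from dataclasses import dataclass

@dataclass
class Record:
    irradiance: float       # W/m2 on the collector plane
    t_collector_out: float  # degC
    t_tank_bottom: float    # degC
    pump_on: bool
    flow: float             # L/min in the solar loop

def check(rec: Record) -> list[str]:
    faults = []
    # Pump should run when the collector is usefully hotter than the tank.
    if (rec.irradiance > 400 and rec.t_collector_out > rec.t_tank_bottom + 10
            and not rec.pump_on):
        faults.append("pump not running under good solar conditions")
    # A running pump with no measurable flow suggests a blockage or air lock.
    if rec.pump_on and rec.flow < 0.1:
        faults.append("pump on but no flow")
    # Stagnation: very hot collector while the loop is idle.
    if rec.t_collector_out > 130 and not rec.pump_on:
        faults.append("possible stagnation / overheating")
    return faults

print(check(Record(irradiance=850, t_collector_out=75,
                   t_tank_bottom=45, pump_on=False, flow=0.0)))
```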

  5. Compton DIV: Using a Compton-Based Gamma-Ray Imager for Design Information Verification of Uranium Enrichment Plants

    International Nuclear Information System (INIS)

    Burks, M.; Verbeke, J.; Dougan, A.; Wang, T.; Decman, D.

    2009-01-01

    A feasibility study has been performed to determine the potential usefulness of Compton imaging as a tool for design information verification (DIV) of uranium enrichment plants. Compton imaging is a method of gamma-ray imaging capable of imaging with a 360-degree field of view over a broad range of energies. These systems can image a room (with a time span on the order of one hour) and return a picture of the distribution and composition of radioactive material in that room. The effectiveness of Compton imaging depends on the sensitivity and resolution of the instrument as well as the strength and energy of the radioactive material to be imaged. This study combined measurements and simulations to examine the specific issue of UF6 gas flow in pipes, at various enrichment levels, as well as hold-up resulting from the accumulation of enriched material in those pipes. It was found that current generation imagers could image pipes carrying UF6 in less than one hour at moderate to high enrichment. Pipes with low enriched gas would require more time. It was also found that hold-up was more amenable to this technique and could be imaged in gram quantities in a fraction of an hour. Another question arises regarding the ability to separately image two pipes spaced closely together; this depends on the capabilities of the instrument in question. These results are described in detail. In addition, suggestions are given as to how to develop Compton imaging as a tool for DIV

  6. HUMTRN: documentation and verification for an ICRP-based age- and sex-specific human simulation model for radionuclide dose assessment

    International Nuclear Information System (INIS)

    Gallegos, A.F.; Wenzel, W.J.

    1984-06-01

    The dynamic human simulation model HUMTRN is designed specifically as a major module of BIOTRAN to integrate climatic, hydrologic, atmospheric, food crop, and herbivore simulation with human dietary and physiological characteristics and radionuclide metabolism, in order to predict radiation doses to selected organs of both sexes in different age groups. The model is based on age- and weight-specific equations developed for predicting human radionuclide transport from metabolic and physical characteristics. These characteristics are modeled from studies documented by the International Commission on Radiological Protection (ICRP 23). HUMTRN allows cumulative doses from uranium or plutonium radionuclides to be predicted by modeling age-specific anatomical, physiological, and metabolic properties of individuals between 1 and 70 years of age, and can track radiation exposure and radionuclide metabolism for any age group for specified daily or yearly time periods. The simulated daily dose integration of eight or more simultaneous air, water, and food intakes gives a new, comprehensive, dynamic picture of radionuclide intake, uptake, and hazard analysis for complex scenarios. A detailed example using site-specific data based on the Pantex studies is included for verification. 14 references, 24 figures, 10 tables

  7. Model-based design and experimental verification of a monitoring concept for an active-active electromechanical aileron actuation system

    Science.gov (United States)

    Arriola, David; Thielecke, Frank

    2017-09-01

    Electromechanical actuators have become a key technology for the onset of power-by-wire flight control systems in the next generation of commercial aircraft. The design of robust control and monitoring functions for these devices capable to mitigate the effects of safety-critical faults is essential in order to achieve the required level of fault tolerance. A primary flight control system comprising two electromechanical actuators nominally operating in active-active mode is considered. A set of five signal-based monitoring functions are designed using a detailed model of the system under consideration which includes non-linear parasitic effects, measurement and data acquisition effects, and actuator faults. Robust detection thresholds are determined based on the analysis of parametric and input uncertainties. The designed monitoring functions are verified experimentally and by simulation through the injection of faults in the validated model and in a test-rig suited to the actuation system under consideration, respectively. They guarantee a robust and efficient fault detection and isolation with a low risk of false alarms, additionally enabling the correct reconfiguration of the system for an enhanced operational availability. In 98% of the performed experiments and simulations, the correct faults were detected and confirmed within the time objectives set.
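
    A minimal sketch of one such signal-based monitoring function follows, assuming invented sampling, threshold, and confirmation-time values: a residual between redundant signals must exceed a robust threshold for a confirmation period before a fault is latched, which is what keeps the false-alarm risk low.

```python
# Minimal signal-based monitoring function: a residual between two redundant
# actuator signals must exceed a robust threshold for a confirmation time
# before a fault is latched. DT, THRESHOLD and CONFIRM_TIME are invented.
import numpy as np

DT = 0.001           # sample period (s)
THRESHOLD = 0.8      # detection threshold including an uncertainty margin
CONFIRM_TIME = 0.05  # residual must persist for 50 ms

def monitor(sig_a, sig_b):
    """Return the sample index at which a fault is confirmed, else None."""
    residual = np.abs(sig_a - sig_b)
    needed = int(CONFIRM_TIME / DT)
    count = 0
    for i, r in enumerate(residual):
        count = count + 1 if r > THRESHOLD else 0   # consecutive exceedances
        if count >= needed:
            return i
    return None

t = np.arange(0.0, 1.0, DT)
a = np.sin(2 * np.pi * t)
b = a.copy()
b[600:] += 1.5                  # inject a jamming-like offset on one channel
idx = monitor(a, b)
print("no fault" if idx is None else "fault confirmed at t = %.3f s" % (idx * DT))
```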

  8. Use of an Existing Airborne Radon Data Base in the Verification of the NASA/AEAP Core Model

    Science.gov (United States)

    Kritz, Mark A.

    1998-01-01

    The primary objective of this project was to apply tropospheric atmospheric radon (Rn222) measurements to the development and verification of the global 3-D atmospheric chemical transport model under development by NASA's Atmospheric Effects of Aviation Project (AEAP). The AEAP project had two principal components: (1) a modeling effort, whose goal was to create, test and apply an elaborate three-dimensional atmospheric chemical transport model (the NASA/AEAP Core model) to an evaluation of the possible short and long-term effects of aircraft emissions on atmospheric chemistry and climate, and (2) a measurement effort, whose goal was to obtain a focused set of atmospheric measurements that would provide some of the observational data used in the modeling effort. My activity in this project was confined to the first of these components. Both atmospheric transport and atmospheric chemical reactions (as well as the input and removal of chemical species) are accounted for in the NASA/AEAP Core model. Thus, for example, in assessing the effect of aircraft effluents on the chemistry of a given region of the upper troposphere, the model must keep track not only of the chemical reactions of the effluent species emitted by aircraft flying in this region, but also of the transport into the region of these (and other) species from other, remote sources--for example, via the vertical convection of boundary layer air to the upper troposphere. Radon, because of its known surface source, known radioactive half-life, and freedom from chemical production or loss and from removal from the atmosphere by physical scavenging, is a recognized and valuable tool for testing the transport components of global transport and circulation models.

  9. Model-based ECT signal interpretation and experimental verification for the quantitative flaw characterization in steam generator tubes

    International Nuclear Information System (INIS)

    Song, Sung Jin; Kim, Young Hwan; Kim, Eui Lae; Chung, Tae Eon; Yim, Chang Jae

    2002-01-01

    Model-based inversion tools for eddy current signals have been developed through a novel combination of neural networks and finite element modeling for quantitative flaw characterization in steam generator tubes. In the present work, interpretation of experimental eddy current signals was carried out in order to validate the developed inversion tools. A database was constructed using the synthetic flaw signals generated by the finite element modeling. The hybrid neural networks, consisting of a PNN classifier and BPNN size estimators, were trained using the synthetic signals. Experimental eddy current signals were obtained from axisymmetric artificial flaws. Interpretation of the flaws was carried out by feeding the experimental signals into the neural networks. The results of the interpretations were excellent, indicating that the developed inversion tools are applicable to the interpretation of experimental eddy current signals.
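
    The two-stage inversion scheme can be illustrated as follows; scikit-learn MLP models stand in for the paper's PNN classifier and BPNN estimators, and random feature vectors stand in for the FEM-generated eddy current signals. Everything here is synthetic and illustrative.

```python
# Sketch of a two-stage inversion: a classifier first labels the flaw type,
# then a type-specific regressor estimates flaw size. Random feature vectors
# stand in for FEM-simulated eddy current signals.
import numpy as np
from sklearn.neural_network import MLPClassifier, MLPRegressor

rng = np.random.default_rng(0)
n, d = 400, 16
depth = rng.uniform(10, 80, n)            # flaw depth, % of wall thickness
flaw_type = rng.integers(0, 2, n)         # 0 = ID-initiated, 1 = OD-initiated
# Synthetic "signals": type shifts the pattern, depth scales it, plus noise.
X = (depth[:, None] / 80) * rng.normal(1, 0.05, (n, d)) \
    + flaw_type[:, None] * np.linspace(0, 1, d)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(X, flaw_type)
regs = {t: MLPRegressor(hidden_layer_sizes=(32,), max_iter=4000, random_state=0)
            .fit(X[flaw_type == t], depth[flaw_type == t]) for t in (0, 1)}

def invert(signal):
    t = int(clf.predict(signal[None, :])[0])        # stage 1: classification
    return t, regs[t].predict(signal[None, :])[0]   # stage 2: size estimation

t, size = invert(X[0])
print(f"predicted type={t}, depth~{size:.1f}% (true {flaw_type[0]}, {depth[0]:.1f}%)")
```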

  10. PCR-based verification of positive rapid diagnostic tests for intestinal protozoa infections with variable test band intensity.

    Science.gov (United States)

    Becker, Sören L; Müller, Ivan; Mertens, Pascal; Herrmann, Mathias; Zondie, Leyli; Beyleveld, Lindsey; Gerber, Markus; du Randt, Rosa; Pühse, Uwe; Walter, Cheryl; Utzinger, Jürg

    2017-10-01

    Stool-based rapid diagnostic tests (RDTs) for pathogenic intestinal protozoa (e.g. Cryptosporidium spp. and Giardia intestinalis) allow for prompt diagnosis and treatment in resource-constrained settings. Such RDTs can improve individual patient management and facilitate population-based screening programmes in areas without microbiological laboratories for confirmatory testing. However, RDTs are difficult to interpret in case of 'trace' results with faint test band intensities and little is known about whether such ambiguous results might indicate 'true' infections. In a longitudinal study conducted in poor neighbourhoods of Port Elizabeth, South Africa, a total of 1428 stool samples from two cohorts of schoolchildren were examined on the spot for Cryptosporidium spp. and G. intestinalis using an RDT (Crypto/Giardia DuoStrip; Coris BioConcept). Overall, 121 samples were positive for G. intestinalis and the RDT suggested presence of cryptosporidiosis in 22 samples. After a storage period of 9-10 months in cohort 1 and 2-3 months in cohort 2, samples were subjected to multiplex PCR (BD Max™ Enteric Parasite Panel, Becton Dickinson). Ninety-three percent (112/121) of RDT-positive samples for G. intestinalis were confirmed by PCR, with a correlation between RDT test band intensity and quantitative pathogen load present in the sample. For Cryptosporidium spp., all positive RDTs had faintly visible lines and these were negative on PCR. The performance of the BD Max™ PCR was nearly identical in both cohorts, despite the prolonged storage at disrupted cold chain conditions in cohort 1. The Crypto/Giardia DuoStrip warrants further validation in communities with a high incidence of diarrhoea. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Pedestrian Detection Based on Adaptive Selection of Visible Light or Far-Infrared Light Camera Image by Fuzzy Inference System and Convolutional Neural Network-Based Verification

    Science.gov (United States)

    Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung

    2017-01-01

    A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods. PMID:28698475
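
    The selection step can be sketched with a toy fuzzy inference system. The membership functions, rule base, and input variables below are invented illustrations of the idea (score each camera from scene brightness and thermal contrast, hand the winner to the CNN verifier), not the paper's actual FIS:

```python
# Toy fuzzy inference sketch for adaptive camera selection. Membership
# shapes, rule weights, and inputs are invented for illustration.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function over [a, c] with peak at b."""
    return np.clip(np.minimum((x - a) / (b - a + 1e-9),
                              (c - x) / (c - b + 1e-9)), 0, 1)

def select_camera(lux, thermal_contrast):
    dark = tri(lux, -1, 0, 80)                # night-time scenes
    bright = tri(lux, 50, 500, 1e5)           # daytime scenes
    low_tc = tri(thermal_contrast, -1, 0, 2)  # body ~ background temperature
    high_tc = tri(thermal_contrast, 1, 5, 50)

    # Rule base: FIR wins in darkness with usable thermal contrast;
    # visible wins in bright scenes or when thermal contrast collapses.
    score_fir = max(min(dark, high_tc), 0.5 * high_tc)
    score_vis = max(bright, min(bright, low_tc), 0.5 * low_tc)
    return "FIR" if score_fir > score_vis else "visible"

print(select_camera(lux=5, thermal_contrast=8))        # night -> FIR
print(select_camera(lux=20000, thermal_contrast=0.5))  # hot noon -> visible
```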

  12. AFLATOXIN B1 IN CORN: DIRECT VERIFICATION OF CONTAMINATION THROUGH AN AUTOMATIC COMPUTERIZED SYSTEM BASED ON THE FLUORESCENCE

    Directory of Open Access Journals (Sweden)

    L. Vallone

    2009-09-01

    “Aflaflesh” is a computer-based instrument designed by combining a visual data acquisition system with sophisticated software for the acquisition and analysis of images. This system allows checking a representative sample (5/10 kg) of corn for contamination by AFB1, using the fluorescence shown under UV light when the grain is contaminated. To optimize the use of this control equipment, a total of 80 samples were analyzed in two phases, comparing the results obtained by chemical analysis (HPLC) to those obtained using “Aflaflesh”. Initially the study was set up to correlate the number of contaminated grains to the ppb read by the official method, HPLC; the second step was to correlate ppb values to the number of pixels of contaminated grain surface read by the “Aflaflesh” instrument. The apparatus was then calibrated through a statistical analysis of the results obtained, to allow a direct reading of AFB1 concentrations in a short period of time (15 min) without the assistance of specialized personnel.

  13. Verification of a pencil beam based treatment planning system: output factors for open photon beams shaped with MLC or blocks

    International Nuclear Information System (INIS)

    Hansson, H.

    1999-01-01

    The accuracy of monitor unit calculations from a pencil beam based, three-dimensional treatment planning system (3D TPS) has been evaluated for open irregularly shaped photon fields. The dose per monitor unit was measured in water and in air for x-ray beam qualities from 6 to 15 MV. The fields were shaped either with a multileaf collimator (MLC) or with customized alloy blocks. Calculations from the 3D TPS were compared with measurements. The agreement between calculated and measured dose per monitor unit depended on field size and the amount of blocking and was within 3% for the MLC-shaped fields. The deviation could be traced to limitations in head scatter modelling for the MLC. For fields shaped with alloy blocks, the dose per monitor unit was calculated to be within 1.6% of measured values for all fields studied. The measured and calculated relative phantom scatter for fields with the same equivalent field size were identical for MLC and alloy shaped fields. These results indicate that the accuracy in the TPS calculations for open irregular fields, shaped with MLC or blocks, is satisfactory for clinical situations. (author)

  14. Verification of Radicals Formation in Ethanol-Water Mixture Based Solution Plasma and Their Relation to the Rate of Reaction.

    Science.gov (United States)

    Sudare, Tomohito; Ueno, Tomonaga; Watthanaphanit, Anyarat; Saito, Nagahiro

    2015-12-03

    Our previous research demonstrated that using an ethanol-water mixture as the liquid medium for the synthesis of gold nanoparticles by the solution plasma process (SPP) could lead to a reaction rate ∼35.2 times faster than that in pure water. This drastic change was observed when a small amount of ethanol, that is, an ethanol mole fraction (χethanol) of 0.089, was added to the system. Above this composition, the reaction rate decreased continuously. To better understand what happens in the ethanol-water mixture-based SPP, in this study, the effect of the ethanol content on radical formation in the system was verified. We focused on detecting the magnetic resonance of electronic spins using electron spin resonance spectroscopy to determine the type and quantity of the generated radicals at each χethanol. Results indicated that ethanol radicals were generated in the ethanol-water mixtures and exhibited a maximum quantity at the χethanol of 0.089. The relationship between the ethanol radical yield and the rate of reaction, along with a possible mechanism responsible for the observed phenomenon, is discussed in this paper.

  15. A Vehicular Mobile Standard Instrument for Field Verification of Traffic Speed Meters Based on Dual-Antenna Doppler Radar Sensor.

    Science.gov (United States)

    Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue

    2018-04-05

    Traffic speed meters are important legal measuring instruments specially used for traffic speed enforcement and must be tested and verified in the field every year using a vehicular mobile standard speed-measuring instrument to ensure their speed-measuring performance. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor is rapidly degraded if the number of received satellites is insufficient, which often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation; it has no specific requirements for its mounting distance, has no limitation on usage regions, and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a high-accuracy GPS speed sensor showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument.
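
    The self-compensation idea follows from the Doppler equation f_d = 2 f0 v cos(θ)/c evaluated at two beams separated by a known angle δ: the ratio of the two measured shifts fixes the unknown installation angle θ, after which the speed follows. The sketch below works through that algebra with an invented carrier frequency, beam separation, and mounting angle:

```python
# Dual-antenna self-compensation principle: each antenna sees
# f_d = 2*f0*v*cos(theta)/c, with the two beams separated by a known
# angle delta. Their ratio determines the unknown installation angle,
# so speed is recovered without precise mounting. Values are illustrative.
import numpy as np

C = 3e8                    # speed of light, m/s
F0 = 24.125e9              # radar carrier, Hz (typical K-band value)
DELTA = np.radians(40.0)   # angular separation between the two beams, assumed

def doppler(v, theta):
    return 2 * F0 * v * np.cos(theta) / C

def estimate_speed(f1, f2):
    # f2/f1 = cos(theta+delta)/cos(theta) = cos(delta) - sin(delta)*tan(theta)
    theta = np.arctan((np.cos(DELTA) - f2 / f1) / np.sin(DELTA))
    v = f1 * C / (2 * F0 * np.cos(theta))
    return v, np.degrees(theta)

# Vehicle at 27.78 m/s (100 km/h); radar mounted with an unknown 12 deg tilt.
true_theta = np.radians(12.0)
f1 = doppler(27.78, true_theta)
f2 = doppler(27.78, true_theta + DELTA)
v, theta_deg = estimate_speed(f1, f2)
print(f"recovered install angle = {theta_deg:.1f} deg, speed = {v*3.6:.2f} km/h")
```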

  16. Verification study of thorium cross section in MVP calculation of thorium based fuel core using experimental data

    International Nuclear Information System (INIS)

    Mai, V. T.; Fujii, T.; Wada, K.; Kitada, T.; Takaki, N.; Yamaguchi, A.; Watanabe, H.; Unesaki, H.

    2012-01-01

    Considering the importance of thorium data and concerns about the accuracy of the Th-232 cross section library, a series of thorium critical core experiments carried out at the KUCA facility of the Kyoto University Research Reactor Institute has been analyzed. The core was composed of pure thorium plates and 93% enriched uranium plates, with a solid polyethylene moderator with a hydrogen to U-235 ratio of 140 and a Th-232 to U-235 ratio of 15.2. Calculations of the effective multiplication factor, control rod worth, and reactivity worth of the Th plates have been conducted with the MVP code using the JENDL-4.0 library [1]. At the experiment site, after achieving the critical state with 51 fuel rods inserted inside the reactor, measurements of the reactivity worth of the control rods and of the thorium samples were carried out. Compared with the experimental data, the calculation overestimates the effective multiplication factor by about 0.90%. The MVP evaluation of the control rod reactivity worth is acceptable, with the maximum discrepancy on the order of the statistical error of the measured data. The calculated results agree with the measured ones within 3.1% for the reactivity worth of one Th plate. From this investigation, further experiments and research on the Th-232 cross section library need to be conducted to provide more reliable data for thorium based fuel core design and safety calculations. (authors)

  17. Verification of rapid method for estimation of added food colorant type in boiled sausages based on measurement of cross section color

    Science.gov (United States)

    Jovanović, J.; Petronijević, R. B.; Lukić, M.; Karan, D.; Parunović, N.; Branković-Lazić, I.

    2017-09-01

    During the previous development of a chemometric method for estimating the amount of added colorant in meat products, it was noticed that the natural colorant most commonly added to boiled sausages, E 120, has different CIE-LAB behavior compared to the artificial colors that are used for the same purpose. This opened the possibility of transforming the developed method into a method for identifying the addition of natural or synthetic colorants in boiled sausages based on the measurement of the color of the cross-section. After recalibration of the CIE-LAB method using linear discriminant analysis, verification was performed on 76 boiled sausages of either frankfurter or Parisian sausage type. The accuracy and reliability of the classification were confirmed by comparison with the standard HPLC method. Results showed that the LDA + CIE-LAB method can be applied with high accuracy, 93.42%, to estimate food colorant type in boiled sausages. Natural orange colors can give false positive results. Pigments from spice mixtures had no significant effect on CIE-LAB results.
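
    A minimal sketch of the recalibrated classifier: linear discriminant analysis over (L*, a*, b*) cross-section measurements, with invented color clusters standing in for the carmine (E 120) and synthetic-dye classes measured in the study:

```python
# Sketch of the LDA + CIE-LAB idea: classify the added colorant type from
# cross-section color coordinates. Training values are invented, not the
# study's frankfurter/Parisian sausage measurements.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
# Synthetic (L*, a*, b*) clusters: carmine (E 120) vs a synthetic red dye.
natural = rng.normal([58, 18, 12], [3, 2, 2], size=(60, 3))
synthetic = rng.normal([55, 23, 9], [3, 2, 2], size=(60, 3))
X = np.vstack([natural, synthetic])
y = np.array(["E120"] * 60 + ["synthetic"] * 60)

model = LinearDiscriminantAnalysis().fit(X, y)
sample = [[56.5, 21.8, 9.5]]     # cross-section measurement of a test sausage
print(model.predict(sample)[0], model.predict_proba(sample)[0])
```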

  18. SNP Data Quality Control in a National Beef and Dairy Cattle System and Highly Accurate SNP Based Parentage Verification and Identification.

    Science.gov (United States)

    McClure, Matthew C; McCarthy, John; Flynn, Paul; McClure, Jennifer C; Dair, Emma; O'Connell, D K; Kearney, John F

    2018-01-01

    A major use of genetic data is parentage verification and identification, as inaccurate pedigrees negatively affect genetic gain. Since 2012 the international standard for single nucleotide polymorphism (SNP) verification in Bos taurus cattle has been the ISAG SNP panels. While these ISAG panels provide an increased level of parentage accuracy over microsatellite markers (MS), they can validate the wrong parent at ≤1% misconcordance rate levels, indicating that more SNP are needed if a more accurate pedigree is required. With rapidly increasing numbers of cattle being genotyped in Ireland, representing 61 B. taurus breeds from a wide range of farm types (beef/dairy, AI/pedigree/commercial, purebred/crossbred, and large to small herd sizes), the Irish Cattle Breeding Federation (ICBF) analyzed different SNP densities and determined that a minimum of ≥500 SNP is needed to consistently predict only one set of parents at a ≤1% misconcordance rate. For parentage validation and prediction, ICBF uses 800 SNP (ICBF800) selected based on SNP clustering quality, ISAG200 inclusion, call rate (CR), and minor allele frequency (MAF) in the Irish cattle population. Large datasets require sample and SNP quality control (QC). Most publications only deal with SNP QC via CR, MAF, parent-progeny conflicts, and Hardy-Weinberg deviation, but not sample QC. We report here parentage, SNP QC, and genomic sample QC pipelines to deal with the unique challenges of >1 million genotypes from a national herd, such as SNP genotype errors from mis-tagging of animals, lab errors, farm errors, and multiple other issues that can arise. We divide the pipeline into two parts: a Genotype QC and an Animal QC pipeline. The Genotype QC identifies samples with low call rate, missing or mixed genotype classes (no BB genotype or ABTG alleles present), and low genotype frequencies. The Animal QC handles situations where the genotype might not belong to the listed individual by identifying: >1 non
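
    The core parentage check behind such a panel is a count of opposing homozygotes. The sketch below simulates an ICBF800-sized panel (genotypes coded 0/1/2, -1 for no-calls) and applies the ≤1% misconcordance acceptance rule from the abstract; the genotype simulation itself is invented for illustration and ignores genotyping error:

```python
# Core SNP parentage check: a candidate parent-offspring pair is incompatible
# at a SNP when the two are opposing homozygotes (0 vs 2 copies of an allele).
import numpy as np

rng = np.random.default_rng(7)
N_SNP = 800                                   # ICBF800-sized panel

def mismatch_rate(parent, child):
    called = (parent >= 0) & (child >= 0)     # drop no-calls coded as -1
    opposing = ((parent == 0) & (child == 2)) | ((parent == 2) & (child == 0))
    return (opposing & called).sum() / called.sum()

# Simulate a true parent: the child inherits one allele from it at every SNP.
parent = rng.integers(0, 3, N_SNP)
child = rng.binomial(1, parent / 2) + rng.integers(0, 2, N_SNP)
unrelated = rng.integers(0, 3, N_SNP)

for name, cand in (("true parent", parent), ("unrelated animal", unrelated)):
    rate = mismatch_rate(cand, child)
    verdict = "accept" if rate <= 0.01 else "exclude"
    print(f"{name}: mismatch rate = {rate:.3%} -> {verdict}")
```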

  19. Nuclear Energy -- Knowledge Base for Advanced Modeling and Simulation (NE-KAMS) Code Verification and Validation Data Standards and Requirements: Fluid Dynamics Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Greg Weirs; Hyung Lee

    2011-09-01

    V&V and UQ are the primary means to assess the accuracy and reliability of M&S and, hence, to establish confidence in M&S. Though other industries are establishing standards and requirements for the performance of V&V and UQ, at present, the nuclear industry has not established such standards or requirements. However, the nuclear industry is beginning to recognize that such standards are needed and that the resources needed to support V&V and UQ will be very significant. In fact, no single organization has sufficient resources or expertise required to organize, conduct and maintain a comprehensive V&V and UQ program. What is needed is a systematic and standardized approach to establish and provide V&V and UQ resources at a national or even international level, with a consortium of partners from government, academia and industry. Specifically, what is needed is a structured and cost-effective knowledge base that collects, evaluates and stores verification and validation data, and shows how it can be used to perform V&V and UQ, leveraging collaboration and sharing of resources to support existing engineering and licensing procedures as well as science-based V&V and UQ processes. The Nuclear Energy Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Utah State University and others with the objective of establishing a comprehensive and web-accessible knowledge base to provide V&V and UQ resources for M&S for nuclear reactor design, analysis and licensing. The knowledge base will serve as an important resource for technical exchange and collaboration that will enable credible and reliable computational models and simulations for application to nuclear power. NE-KAMS will serve as a valuable resource for the nuclear industry, academia, the national laboratories, the U.S. Nuclear Regulatory Commission (NRC) and

  20. Verification of organ doses calculated by a dose monitoring software tool based on Monte Carlo Simulation in thoracic CT protocols.

    Science.gov (United States)

    Guberina, Nika; Suntharalingam, Saravanabavaan; Naßenstein, Kai; Forsting, Michael; Theysohn, Jens; Wetter, Axel; Ringelstein, Adrian

    2018-03-01

    Background: Monitoring the radiation dose received by the human body during computed tomography (CT) examinations is important. Several dose-monitoring software tools have emerged in order to monitor and control dose distribution during CT examinations. Some software tools incorporate Monte Carlo simulation (MCS) and allow calculation of effective dose and organ doses apart from standard dose descriptors. Purpose: To verify the results of a dose-monitoring software tool based on MCS in the assessment of effective and organ doses in thoracic CT protocols. Material and Methods: Phantom measurements were performed with thermoluminescent dosimeters (TLD LiF:Mg,Ti) using two different thoracic CT protocols of the clinical routine: (I) standard CT thorax (CTT); and (II) CTT with high-pitch mode, P = 3.2. Radiation doses estimated with MCS and measured with TLDs were compared. Results: Inter-modality comparison showed an excellent correlation between MCS-simulated and TLD-measured doses ((I) after localizer correction r = 0.81; (II) r = 0.87). The following effective and organ doses were determined: (I) (a) effective dose = MCS 1.2 mSv, TLD 1.3 mSv; (b) thyroid gland = MCS 2.8 mGy, TLD 2.5 mGy; (c) thymus = MCS 3.1 mGy, TLD 2.5 mGy; (d) bone marrow = MCS 0.8 mGy, TLD 0.9 mGy; (e) breast = MCS 2.5 mGy, TLD 2.2 mGy; (f) lung = MCS 2.8 mGy, TLD 2.7 mGy; (II) (a) effective dose = MCS 0.6 mSv, TLD 0.7 mSv; (b) thyroid gland = MCS 1.4 mGy, TLD 1.8 mGy; (c) thymus = MCS 1.4 mGy, TLD 1.8 mGy; (d) bone marrow = MCS 0.4 mGy, TLD 0.5 mGy; (e) breast = MCS 1.1 mGy, TLD 1.1 mGy; (f) lung = MCS 1.2 mGy, TLD 1.3 mGy. Conclusion: Overall, in thoracic CT protocols, organ doses simulated by the dose-monitoring software tool were coherent with those measured by TLDs. Despite some challenges, the dose-monitoring software was capable of an accurate dose calculation.

  1. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    Science.gov (United States)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  2. SU-E-T-24: A Simple Correction-Based Method for Independent Monitor Unit (MU) Verification in Monte Carlo (MC) Lung SBRT Plans

    Energy Technology Data Exchange (ETDEWEB)

    Pokhrel, D; Badkul, R; Jiang, H; Estes, C; Kumar, P; Wang, F [UniversityKansas Medical Center, Kansas City, KS (United States)

    2014-06-01

    Purpose: Lung SBRT uses hypo-fractionated doses in small non-IMRT fields with tissue-heterogeneity corrected plans. An independent MU verification is mandatory for safe and effective delivery of the treatment plan. This report compares planned MU obtained from the iPlan XVMC algorithm against spreadsheet-based hand calculation using the most commonly used simple TMR-based method. Methods: Treatment plans of 15 patients who underwent MC-based lung SBRT to 50 Gy in 5 fractions with PTV V100% = 95% were studied. The ITV was delineated on MIP images based on 4D-CT scans. PTVs (ITV + 5 mm margins) ranged from 10.1-106.5 cc (average = 48.6 cc). MC-SBRT plans were generated using a combination of non-coplanar conformal arcs/beams with the iPlan XVMC algorithm (BrainLAB iPlan ver. 4.1.2) for a Novalis-TX consisting of micro-MLCs and a 6 MV-SRS (1000 MU/min) beam. These plans were re-computed using the heterogeneity-corrected Pencil Beam (PB-hete) algorithm without changing any beam parameters, such as MLCs/MUs. The dose ratio PB-hete/MC gave beam-by-beam inhomogeneity correction factors (ICFs): individual correction. For the independent second check, MC MUs were verified using TMR-based hand calculation, from which an average ICF was obtained: average correction; the TMR-based hand calculation systematically underestimated MC MUs by ∼5%. Also, the first 10 MC plans were verified with an ion-chamber measurement using a homogeneous phantom. Results: For both beams/arcs, the mean PB-hete dose was systematically overestimated by 5.5±2.6% and the mean hand-calculated MU was systematically underestimated by 5.5±2.5% compared to XVMC. With individual correction, mean hand-calculated MUs matched XVMC within -0.3±1.4%/0.4±1.4% for beams/arcs, respectively. After an average 5% correction, hand-calculated MUs matched XVMC within 0.5±2.5%/0.6±2.0% for beams/arcs, respectively. A small dependence on tumor volume (TV)/field size (FS) was also observed. Ion-chamber measurements were within ±3.0%. Conclusion: PB-hete overestimates dose to lung tumor relative to
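
    The shape of the TMR-based second check with the proposed correction can be sketched as follows; all beam data (output, Sc, Sp, TMR) are invented placeholders rather than commissioning values, and the 1.05 factor stands in for the ~5% average correction reported above:

```python
# Shape of a TMR-based independent MU check with the abstract's correction:
# MU = D / (k * Sc * Sp * TMR), then scaled by an inhomogeneity correction
# factor (ICF). All beam data below are invented placeholders.
def mu_tmr(dose_cgy, k=1.0, sc=0.98, sp=0.97, tmr=0.85):
    """Hand-style MU: dose (cGy) over output (cGy/MU) times scatter and TMR."""
    return dose_cgy / (k * sc * sp * tmr)

dose_per_beam = 125.0            # cGy from one beam of a 10 Gy fraction
mu_hand = mu_tmr(dose_per_beam)  # systematically ~5% below the MC MUs
ICF = 1.05                       # mean correction implied by the PB-hete/MC ratio
mu_corrected = mu_hand * ICF     # second check now tracks the MC MUs closely
print(f"hand-calc MU = {mu_hand:.1f}, corrected MU = {mu_corrected:.1f}")
```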

  3. An Engineering Approach to Atomic Transaction Verification : Use of a Simple Object Model to Achieve Semantics-based Reasoning at Compile-time

    NARCIS (Netherlands)

    Spelt, D.; Even, S.J.

    In this paper, we take an engineering approach to atomic transaction verification. We discuss the design and implementation of a verification tool that can reason about the semantics of atomic database operations. To bridge the gap between language design and automated reasoning, we make use of a

  4. On the feasibility of polyurethane based 3D dosimeters with optical CT for dosimetric verification of low energy photon brachytherapy seeds.

    Science.gov (United States)

    Adamson, Justus; Yang, Yun; Juang, Titania; Chisholm, Kelsey; Rankine, Leith; Adamovics, John; Yin, Fang Fang; Oldham, Mark

    2014-07-01

    To investigate the feasibility of, and challenges yet to be addressed in, measuring dose from low energy photon brachytherapy seeds in polyurethane based 3D dosimeters with optical CT. The authors' evaluation used the following sources: models 200 (Pd-103), CS-1 Rev2 (Cs-131), and 6711 (I-125). The authors used the Monte Carlo radiation transport code MCNP5, simulations with the ScanSim optical tomography simulation software, and experimental measurements with PRESAGE(®) dosimeters/optical CT to investigate the following: (1) the water equivalency of conventional (density = 1.065 g/cm(3)) and deformable (density = 1.02 g/cm(3)) formulations of polyurethane dosimeters, (2) the scatter conditions necessary to achieve accurate dosimetry for low energy photon seeds, (3) the change in photon energy spectrum within the dosimeter as a function of distance from the source in order to determine potential energy sensitivity effects, (4) the optimal delivered dose to balance optical transmission (per projection) with signal to noise ratio in the reconstructed dose distribution, and (5) the magnitude and characteristics of artifacts due to the presence of a channel in the dosimeter. Monte Carlo simulations were performed using both conventional and deformable dosimeter formulations. For verification, 2.8 Gy at 1 cm was delivered in 92 h using an I-125 source to a PRESAGE(®) dosimeter with conventional formulation and a central channel with 0.0425 cm radius for source placement. The dose distribution was reconstructed with 0.02 and 0.04 cm(3) voxel size using the Duke midsized optical CT scanner (DMOS). While the conventional formulation overattenuates dose from all three sources compared to water, the current deformable formulation has nearly water equivalent attenuation properties for Cs-131 and I-125, while underattenuating for Pd-103. The energy spectrum of each source is relatively stable within the first 5 cm, especially for I-125. The inherent assumption of radial symmetry in the TG43

  5. Marker-based quantification of interfractional tumor position variation and the use of markers for setup verification in radiation therapy for esophageal cancer

    NARCIS (Netherlands)

    Jin, Peng; van der Horst, Astrid; de Jong, Rianne; van Hooft, Jeanin E.; Kamphuis, Martijn; van Wieringen, Niek; Machiels, Melanie; Bel, Arjan; Hulshof, Maarten C. C. M.; Alderliesten, Tanja

    2015-01-01

    The aim of this study was to quantify interfractional esophageal tumor position variation using markers and investigate the use of markers for setup verification. Sixty-five markers placed in the tumor volumes of 24 esophageal cancer patients were identified in computed tomography (CT) and follow-up

  6. DEPSCOR: Research on ARL’s Intelligent Control Architecture: Hierarchical Hybrid-Model Based Design, Verification, Simulation, and Synthesis of Mission Control for Autonomous Underwater Vehicles

    Science.gov (United States)

    2007-02-01

    [Report contents, partially garbled in extraction:] APPENDIX B: Hybrid Models in Teja; Sequential coordinator. ... time or time-bounded constraints. The subsystems have been implemented using a high-level programming environment provided by Teja. ... Verification of a

  7. Software verification and testing

    Science.gov (United States)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  8. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    International Nuclear Information System (INIS)

    Folkerts, M; Graves, Y; Tian, Z; Gu, X; Jia, X; Jiang, S

    2014-01-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command line-based GPU applications to perform a MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.

  9. SU-E-T-29: A Web Application for GPU-Based Monte Carlo IMRT/VMAT QA with Delivered Dose Verification

    Energy Technology Data Exchange (ETDEWEB)

    Folkerts, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); University of California, San Diego, La Jolla, CA (United States); Graves, Y [University of California, San Diego, La Jolla, CA (United States); Tian, Z; Gu, X; Jia, X; Jiang, S [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States)

    2014-06-01

    Purpose: To enable an existing web application for GPU-based Monte Carlo (MC) 3D dosimetry quality assurance (QA) to compute “delivered dose” from linac logfile data. Methods: We added significant features to an IMRT/VMAT QA web application which is based on existing technologies (HTML5, Python, and Django). This tool interfaces with Python, C-code libraries, and command line-based GPU applications to perform a MC-based IMRT/VMAT QA. The web app automates many complicated aspects of interfacing clinical DICOM and logfile data with cutting-edge GPU software to run a MC dose calculation. The resultant web app is powerful, easy to use, and is able to re-compute both plan dose (from DICOM data) and delivered dose (from logfile data). Both dynalog and trajectorylog file formats are supported. Users upload zipped DICOM RP, CT, and RD data and set the expected statistical uncertainty for the MC dose calculation. A 3D gamma index map, 3D dose distribution, gamma histogram, dosimetric statistics, and DVH curves are displayed to the user. Additionally, the user may upload the delivery logfile data from the linac to compute a 'delivered dose' calculation and corresponding gamma tests. A comprehensive PDF QA report summarizing the results can also be downloaded. Results: We successfully improved a web app for a GPU-based QA tool that consists of logfile parsing, fluence map generation, CT image processing, GPU-based MC dose calculation, gamma index calculation, and DVH calculation. The result is an IMRT and VMAT QA tool that conducts an independent dose calculation for a given treatment plan and delivery log file. The system takes both DICOM data and logfile data to compute plan dose and delivered dose, respectively. Conclusion: We successfully improved a GPU-based MC QA tool to allow for logfile dose calculation. The high efficiency and accessibility will greatly facilitate IMRT and VMAT QA.
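    The 3%/3-mm γ comparison mentioned in the two records above follows the standard gamma-index test. A simplified 1-D Python sketch (the web app works in 3-D; this only illustrates the formula):

```python
import numpy as np

def gamma_1d(positions, ref_dose, eval_dose, dose_tol=0.03, dist_tol=3.0):
    """Global 1-D gamma: for each reference point, the minimum over all
    evaluated points of sqrt((dx/dist_tol)^2 + (dD/(dose_tol*Dmax))^2)."""
    d_max = ref_dose.max()
    gammas = np.empty(len(ref_dose))
    for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
        dist2 = ((positions - x_r) / dist_tol) ** 2
        dose2 = ((eval_dose - d_r) / (dose_tol * d_max)) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

x = np.linspace(0.0, 100.0, 201)              # positions in mm
ref = np.exp(-((x - 50.0) / 20.0) ** 2)       # toy reference profile
ev = np.exp(-((x - 51.0) / 20.0) ** 2)        # 1 mm shifted "delivery"
pass_rate = 100.0 * np.mean(gamma_1d(x, ref, ev) <= 1.0)  # % of points passing
```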

  10. SAT-Based Software Certification

    National Research Council Canada - National Science Library

    Chaki, Sagar

    2006-01-01

    ... predicate abstraction and validated by generating and proving verification conditions. In addition, the first part of the report proposes the use of theorem provers based on Boolean propositional satisfiability (SAT...

  11. From viral genome to specific peptide epitopes: methods for identifying porcine T cell epitopes based on in silico predictions, in vitro identification and ex vivo verification

    DEFF Research Database (Denmark)

    Pedersen, Lasse Eggers; Rasmussen, Michael; Harndah, Mikkel

    2013-01-01

    The affinity with which major histocompatibility complex (MHC) class I molecules bind peptides is instrumental to presentation of viral epitopes to cytotoxic T lymphocytes (CTLs). We analyzed three swine leukocyte antigen (SLA) molecules for complete nonamer peptide-based binding matrices in orde...

  12. WE-EF-303-06: Feasibility of PET Image-Based On-Line Proton Beam-Range Verification with Simulated Uniform Phantom and Human Brain Studies

    International Nuclear Information System (INIS)

    Lou, K; Sun, X; Zhu, X; Grosshans, D; Clark, J; Shao, Y

    2015-01-01

    Purpose: To study the feasibility of clinical on-line proton beam range verification with PET imaging. Methods: We simulated a 179.2-MeV proton beam with 5-mm diameter irradiating a PMMA phantom of human brain size, which was then imaged by a brain PET with a 300×300×100-mm(3) FOV and different system sensitivities and spatial resolutions. We calculated the mean and standard deviation of the positron activity range (AR) from reconstructed PET images, with respect to different data acquisition times (from 5 sec to 300 sec in 5-sec steps). We also developed a technique, “Smoothed Maximum Value (SMV)”, to improve AR measurement under a given dose. Furthermore, we simulated a human brain irradiated by a 110-MeV proton beam of 50-mm diameter with 0.3-Gy dose at the Bragg peak and imaged by the above PET system with 40% system sensitivity at the center of the FOV and 1.7-mm spatial resolution. Results: MC simulations on the PMMA phantom showed that, regardless of PET system sensitivities and spatial resolutions, the accuracy and precision of AR were proportional to the reciprocal of the square root of the image count if image smoothing was not applied. With image smoothing or the SMV method, the accuracy and precision could be substantially improved. For a cylindrical PMMA phantom (200 mm diameter and 290 mm long), the accuracy and precision of AR measurement could reach 1.0 and 1.7 mm, with 100-sec data acquired by the brain PET. The study with a human brain showed it was feasible to achieve sub-millimeter accuracy and precision of AR measurement with an acquisition time within 60 sec. Conclusion: This study established the relationship between count statistics and the accuracy and precision of activity-range verification. It showed the feasibility of clinical on-line beam-range verification with high-performance PET systems and improved AR measurement techniques. Cancer Prevention and Research Institute of Texas grant RP120326, NIH grant R21CA187717, The Cancer Center Support (Core) Grant CA016672
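    The activity range (AR) in this record is the distal falloff position of the positron-activity depth profile, whose precision improves roughly as one over the square root of the counts. A hedged Python sketch of one common estimator (a Gaussian-smoothed 50% distal-falloff point; the paper's SMV technique is not specified here, so this only illustrates the same spirit):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def activity_range_mm(depth_mm, profile, sigma_mm=2.0, level=0.5):
    """Distal depth at which the smoothed profile falls to `level` of its
    maximum, with linear interpolation between samples."""
    dz = depth_mm[1] - depth_mm[0]
    smooth = gaussian_filter1d(np.asarray(profile, float), sigma_mm / dz)
    peak = int(smooth.argmax())
    thresh = level * smooth[peak]
    below = np.where(smooth[peak:] <= thresh)[0]
    if below.size == 0:
        return depth_mm[-1]                      # falloff not reached
    j = peak + below[0]
    f = (smooth[j - 1] - thresh) / (smooth[j - 1] - smooth[j])
    return depth_mm[j - 1] + f * dz              # interpolated crossing
```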

  13. Verification of dosimetry cross sections above 10 MeV based on measurement of activation reaction rates in fission neutron field

    International Nuclear Information System (INIS)

    Odano, Naoteru; Miura, Toshimasa; Yamaji, Akio.

    1996-01-01

    To validate dosimetry cross sections in the fast neutron energy range, activation reaction rates were measured for 5 types of dosimetry cross sections which have sensitivity in the energy range above 10 MeV, utilizing the JRR-4 reactor of JAERI. The measured reaction rates were compared with reaction rates calculated by the continuous-energy Monte Carlo code MVP. The calculated reaction rates were based on two dosimetry files, the JENDL Dosimetry File and IRDF-90.2. (author)

  14. Clinical Implementation of a Model-Based In Vivo Dose Verification System for Stereotactic Body Radiation Therapy–Volumetric Modulated Arc Therapy Treatments Using the Electronic Portal Imaging Device

    Energy Technology Data Exchange (ETDEWEB)

    McCowan, Peter M., E-mail: pmccowan@cancercare.mb.ca [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Asuni, Ganiyu [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Van Uytven, Eric [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); VanBeek, Timothy [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); McCurdy, Boyd M.C. [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba (Canada); Loewen, Shaun K. [Department of Oncology, University of Calgary, Calgary, Alberta (Canada); Ahmed, Naseer; Bashir, Bashir; Butler, James B.; Chowdhury, Amitava; Dubey, Arbind; Leylek, Ahmet; Nashed, Maged [CancerCare Manitoba, Winnipeg, Manitoba (Canada)

    2017-04-01

    Purpose: To report findings from an in vivo dosimetry program implemented for all stereotactic body radiation therapy patients over a 31-month period and discuss the value and challenges of utilizing in vivo electronic portal imaging device (EPID) dosimetry clinically. Methods and Materials: From December 2013 to July 2016, 117 stereotactic body radiation therapy–volumetric modulated arc therapy patients (100 lung, 15 spine, and 2 liver) underwent 602 EPID-based in vivo dose verification events. A developed model-based dose reconstruction algorithm calculates the 3-dimensional dose distribution to the patient by back-projecting the primary fluence measured by the EPID during treatment. The EPID frame-averaging was optimized in June 2015. For each treatment, a 3%/3-mm γ comparison between our EPID-derived dose and the Eclipse AcurosXB–predicted dose to the planning target volume (PTV) and the ≥20% isodose volume was performed. Alert levels were defined as γ pass rates <85% (lung and liver) and <80% (spine). Investigations were carried out for all fractions exceeding the alert level and were classified as follows: EPID-related, algorithmic, patient setup, anatomic change, or unknown/unidentified errors. Results: The percentages of fractions exceeding the alert levels were 22.6% for lung before frame-average optimization and 8.0% for lung, 20.0% for spine, and 10.0% for liver after frame-average optimization. Overall, mean (± standard deviation) planning target volume γ pass rates were 90.7% ± 9.2%, 87.0% ± 9.3%, and 91.2% ± 3.4% for the lung, spine, and liver patients, respectively. Conclusions: Results from the clinical implementation of our model-based in vivo dose verification method using on-treatment EPID images are reported. The method is demonstrated to be valuable for routine clinical use for verifying delivered dose as well as for detecting errors.

  15. SU-F-T-287: A Preliminary Study On Patient Specific VMAT Verification Using a Phosphor-Screen Based Geometric QA System (Raven QA)

    International Nuclear Information System (INIS)

    Lee, M; Yi, B; Wong, J; Ding, K

    2016-01-01

    Purpose: The RavenQA system (LAP Laser, Germany) is a QA device with a phosphor screen detector for performing the QA tasks of TG-142. This study tested whether it is feasible to use the system for patient specific QA of Volumetric Modulated Arc Therapy (VMAT). Methods: Water equivalent material (5 cm) is attached to the front of the detector plate of the RavenQA for dosimetry purposes. The plate is then attached to the gantry to synchronize the movement between the detector and the gantry. Since the detector moves together with the gantry, the 'Reset gantry to 0' function of the Eclipse planning system (Varian, CA) is used to simulate the measurement situation when calculating dose to the detector plate. The same gantry setup is used when delivering the treatment beam for feasibility test purposes. Cumulative dose is acquired for each arc. The optical scatter component of each captured image from the CCD camera is corrected by deconvolving a 2D spatially invariant optical scatter kernel (OSK). We assume that the OSK is a 2D isotropic point spread function that decreases with the inverse square of the radius from the center. Results: Three VMAT plans, including head & neck, whole pelvis, and abdomen-pelvis, were tested. Setup time for measurements was less than 5 minutes. Passing rates of absolute gamma were 99.3%, 98.2%, and 95.9%, respectively, for 3%/3mm criteria and 96.2%, 97.1%, and 86.4% for 2%/2mm criteria. The abdomen-pelvis plan has long treatment fields (37 cm), longer than the detector plate (25 cm); this plan showed a relatively lower passing rate than the other plans. Conclusion: An algorithm for IMRT/VMAT verification using the RavenQA has been developed and tested. The model of a spatially invariant OSK works well for deconvolution purposes. It is proved that the RavenQA can be used for patient specific verification of VMAT. This work is funded in part by a Maryland Industrial Partnership Program grant to University of Maryland and to JPLC who owns the
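    The optical-scatter correction described above is a deconvolution with a spatially invariant, inverse-square kernel. A minimal frequency-domain Python sketch, assuming a regularized (Tikhonov-style) inverse filter; the r0 softening parameter and eps are assumptions, not RavenQA details:

```python
import numpy as np

def inverse_square_kernel(shape, r0=1.0):
    """2-D isotropic kernel ~ 1/(r^2 + r0^2), centered at shape//2 and
    normalized to unit sum; r0 softens the r = 0 singularity (assumption)."""
    y, x = np.indices(shape)
    r2 = (y - shape[0] // 2) ** 2 + (x - shape[1] // 2) ** 2
    k = 1.0 / (r2 + r0 ** 2)
    return k / k.sum()

def remove_optical_scatter(image, kernel, eps=1e-3):
    """Regularized frequency-domain deconvolution of the OSK."""
    K = np.fft.rfft2(np.fft.ifftshift(kernel), s=image.shape)
    I = np.fft.rfft2(image)
    est = I * np.conj(K) / (np.abs(K) ** 2 + eps)   # Tikhonov-style inverse
    return np.fft.irfft2(est, s=image.shape)
```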

  16. Bone Marrow Stromal Antigen 2 Is a Novel Plasma Biomarker and Prognosticator for Colorectal Carcinoma: A Secretome-Based Verification Study

    Directory of Open Access Journals (Sweden)

    Sum-Fu Chiang

    2015-01-01

    Full Text Available Background. The cancer cell secretome has been recognized as a valuable reservoir for identifying novel serum/plasma biomarkers for different cancers, including colorectal cancer (CRC). This study aimed to verify four CRC cell-secreted proteins (tumor-associated calcium signal transducer 2/trophoblast cell surface antigen 2 (TACSTD2/TROP2), tetraspanin-6 (TSPAN6), bone marrow stromal antigen 2 (BST2), and tumor necrosis factor receptor superfamily member 16 (NGFR)) as potential plasma CRC biomarkers. Methods. The study population comprises 152 CRC patients and 152 controls. Target protein levels in plasma and tissue samples were assessed by ELISA and immunohistochemistry, respectively. Results. Among the four candidate proteins examined by ELISA in a small sample set, only BST2 showed significantly elevated plasma levels in CRC patients versus controls. Immunohistochemical analysis revealed the overexpression of BST2 in CRC tissues, and higher BST2 expression levels correlated with poorer 5-year survival (46.47% versus 65.57%; p=0.044). Further verification confirmed the elevated plasma BST2 levels in CRC patients (2.35 ± 0.13 ng/mL) versus controls (1.04 ± 0.03 ng/mL) (p<0.01), with an area under the ROC curve (AUC) of 0.858, comparable to that of CEA (0.867). Conclusion. BST2, a membrane protein selectively detected in the CRC cell secretome, may be a novel plasma biomarker and prognosticator for CRC.

  17. Proposal and Verification of Visual Walk Aiming at a Rotating Target Object Based on Feature Values Caused by Biped Walking Motion

    Science.gov (United States)

    Asano, Yosuke; Kawamura, Atsuo

    The visual walking proposed by the authors is defined as the robot autonomously walking by making decisions based on image feature motion. One of its achievements is the “visual tracking walk”. In a previous conference paper, the authors proposed a hybrid control for “visual walking”. In this paper, the four vertices of a square surrounding the red target in the image plane are selected as image features. Using these image features, the robot moves to the desired position in front of the target. The rotation orientation of the robot is improved using the feature values caused by the walking motion. The proposed control law is verified by simulations and experiments.

  18. Verification of KERMA factor for beryllium at neutron energy of 14.2 MeV based on charged-particle measurement

    International Nuclear Information System (INIS)

    Kondo, Keitaro; Ochiai, Kentaro; Murata, Isao; Konno, Chikara

    2008-01-01

    In previous direct measurements of nuclear heating for beryllium induced with DT neutrons, it was pointed out that the calculation with JENDL-3.2 underestimated the measured value by 25%. However, the reasons for the discrepancy have not been clearly understood. Recently, we measured the α-particle emission double-differential cross section (DDX) for beryllium and found that the evaluation of the 9Be(n,2n+2α) reaction in nuclear data libraries has some problems. We examined KERMA factors for beryllium deduced from the three latest nuclear data libraries: JENDL-3.3, ENDF/B-VII.0 and JEFF-3.1. The partial KERMA factors for the 9Be(n,2n+2α) reaction channel at an incident neutron energy of 14.2 MeV deduced from these libraries were compared with a new partial KERMA factor calculated based on our experimental model. The partial KERMA factor from JENDL-3.3 was smaller by 20% than our experiment-based one. The discrepancy in the previous nuclear heating measurement comes from the smaller partial KERMA factor in JENDL-3.3, which is caused by significant underestimation of the higher-energy part of the α-particle emission DDX at forward emission angles.

  19. Software-In-the-Loop based Modeling and Simulation of Unmanned Semi-submersible Vehicle for Performance Verification of Autonomous Navigation

    Science.gov (United States)

    Lee, Kwangkook; Jeong, Mijin; Kim, Dong Hun

    2017-12-01

    Since an unmanned semi-submersible is mainly used to carry out dangerous missions at sea, it can operate in regions that are difficult to access for safety reasons. In this study, a USV hull design was determined using a Myring hull profile, and reinforcement was performed by designing and implementing inner stiffener members for 3D printing. In order to simulate sea state 5.0 or higher, which is difficult to realize in practice, regular and irregular wave equations were implemented in Matlab/Simulink. We performed modeling and simulation of the semi-submersible based on DMWorks, considering rolling motion in waves. To identify and correct unpredicted errors, we implemented numerical and physical simulation models of the USV based on the software-in-the-loop (SIL) method. This simulation allows shipbuilders to participate in new value-added markets such as engineering, procurement, construction, installation, commissioning, operation, and maintenance for the USV.

  20. Global 3-D imaging of mantle electrical conductivity based on inversion of observatory C-responses - I. An approach and its verification

    Science.gov (United States)

    Kuvshinov, Alexey; Semenov, Alexey

    2012-06-01

    We present a novel frequency-domain inverse solution to recover the 3-D electrical conductivity distribution in the mantle. The solution is based on analysis of local C-responses. It exploits an iterative gradient-type method - the limited-memory quasi-Newton method - for minimizing the penalty function consisting of data misfit and regularization terms. An integral equation code is used as a forward engine to calculate responses and data misfit gradients during inversion. An adjoint approach is implemented to compute misfit gradients efficiently. Further reductions in computational load come from parallelizing the scheme with respect to frequencies, and from setting the most time-consuming part of the forward calculations - the calculation of Green's tensors - apart from the inversion loop. Convergence, performance, and accuracy of our 3-D inverse solution are demonstrated with a synthetic numerical example. A companion paper applies the strategy set forth here to real data.
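    The inversion above minimizes a penalty function (data misfit plus regularization) with a limited-memory quasi-Newton method and adjoint-computed gradients. A toy Python sketch, with a hypothetical linear forward operator standing in for the integral-equation engine:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical linear stand-in for the integral-equation forward engine.
rng = np.random.default_rng(0)
G = rng.normal(size=(40, 100))                    # responses per parameter
d_obs = G @ np.ones(100) + 0.01 * rng.normal(size=40)
lam = 1e-2                                        # regularization weight

def phi_and_grad(m):
    r = G @ m - d_obs                             # data misfit residual
    phi = 0.5 * (r @ r) + 0.5 * lam * (m @ m)     # misfit + regularization
    grad = G.T @ r + lam * m                      # adjoint-style gradient
    return phi, grad

res = minimize(phi_and_grad, np.zeros(100), jac=True, method="L-BFGS-B")
print(res.success, res.fun)                       # converged penalty value
```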

  1. Studies on the matched potential method for determining the selectivity coefficients of ion-selective electrodes based on neutral ionophores: experimental and theoretical verification.

    Science.gov (United States)

    Tohda, K; Dragoe, D; Shibata, M; Umezawa, Y

    2001-06-01

    A theory is presented that describes the matched potential method (MPM) for the determination of the potentiometric selectivity coefficients (KA,Bpot) of ion-selective electrodes for two ions of any charge. This MPM theory is based on electrical diffuse layers on both the membrane and the aqueous side of the interface, and is therefore independent of the Nicolsky-Eisenman equation. Instead, the Poisson equation is used and a Boltzmann distribution is assumed with respect to all charged species, including primary, interfering and background electrolyte ions located in the diffuse double layers. In this model, the MPM selectivity coefficients of ions with equal charge (ZA = ZB) are expressed as the ratio of the concentrations of the primary and interfering ions in aqueous solutions at which the same amounts of the primary and interfering ions are permselectively extracted into the membrane surface. For ions with unequal charge (ZA not equal to ZB), the selectivity coefficients are expressed as a function not only of the amounts of the primary and interfering ions permeated into the membrane surface, but also of the primary ion concentration in the initial reference solution and the ΔEMF value. Using the measured complexation stability constants and single-ion distribution coefficients for the relevant systems, the corresponding MPM selectivity coefficients can be calculated from the developed MPM theory. It was found that this MPM theory is capable of accurately and precisely predicting the MPM selectivity coefficients for a series of ion-selective electrodes (ISEs) with representative ionophore systems, which are generally in complete agreement with independently determined MPM selectivity values from the potentiometric measurements. These results also confirm that the assumption of the Boltzmann distribution was in fact valid in the theory. Recent critical papers on the MPM have pointed out that because the MPM selectivity coefficients are highly concentration
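    For reference, the standard operational definition of the MPM selectivity coefficient that the theory above reproduces can be computed directly. A minimal numeric sketch (the diffuse-layer model itself is not reproduced here):

```python
# Operational MPM definition (equal-charge case, a sketch): a_A is the
# reference primary-ion activity, a_A_prime the increased primary-ion
# activity, and a_B the interfering-ion activity that produces the same
# EMF change as the primary-ion increase.

def k_mpm(a_A: float, a_A_prime: float, a_B: float) -> float:
    return (a_A_prime - a_A) / a_B

# Doubling the primary ion from 1e-4 to 2e-4 M, matched by 1e-3 M of
# interferent, gives K = 0.1, i.e. B is 10x less preferred than A.
print(k_mpm(1e-4, 2e-4, 1e-3))
```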

  2. Verification of an ENSO-Based Long-Range Prediction of Anomalous Weather Conditions During the Vancouver 2010 Olympics and Paralympics

    Science.gov (United States)

    Mo, Ruping; Joe, Paul I.; Doyle, Chris; Whitfield, Paul H.

    2014-01-01

    A brief review of the anomalous weather conditions during the Vancouver 2010 Winter Olympic and Paralympic Games and the efforts to predict these anomalies based on some preceding El Niño-Southern Oscillation (ENSO) signals are presented. It is shown that the Olympic Games were held under extraordinarily warm conditions in February 2010, with monthly mean temperature anomalies of +2.2 °C in Vancouver and +2.8 °C in Whistler, ranking respectively as the highest and the second highest in the past 30 years (1981-2010). The warm conditions continued, but became less anomalous, in March 2010 for the Paralympic Games. While the precipitation amounts in the area remained near normal through this winter, the lack of snow due to warm conditions created numerous media headlines and practical problems for the alpine competitions. A statistical model was developed on the premise that February and March temperatures in the Vancouver area could be predicted using an ENSO signal with considerable lead time. This model successfully predicted the warmer-than-normal, lower-snowfall conditions for the Vancouver 2010 Winter Olympics and Paralympics.

  3. The Verification of the Usefulness of Electronic Nose Based on Ultra-Fast Gas Chromatography and Four Different Chemometric Methods for Rapid Analysis of Spirit Beverages

    Directory of Open Access Journals (Sweden)

    Paulina Wiśniewska

    2016-01-01

    Full Text Available Spirit beverages are a diverse group of foodstuffs. They are very often counterfeited, which causes the appearance of low quality or wrongly labelled products on the market. It is important to find a proper method for quality control and botanical origin assessment that at the same time enables a preliminary check of the composition of the investigated samples, which was the main goal of this work. For this purpose, the usefulness of an electronic nose based on ultra-fast gas chromatography (fast GC e-nose) was verified. A set of 24 samples of raw spirits, 33 samples of vodkas, and 8 samples of whisky were analysed by fast GC e-nose. Four data analysis methods were used. PCA was applied for visualization of the dataset, observation of the variation inside groups of samples, and selection of variables for the other three statistical methods. The SQC method was utilized to compare the quality of the samples. Both the DFA and SIMCA data analysis methods were used for discrimination of vodka, whisky, and spirit samples. The fast GC e-nose combined with the four statistical methods can be used for rapid discrimination of raw spirits, vodkas, and whisky, and at the same time for preliminary determination of the composition of the investigated samples.

  4. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    Energy Technology Data Exchange (ETDEWEB)

    Stephens, Michael B. (Geological Survey of Sweden, Uppsala (Sweden)); Simeonov, Assen (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Isaksson, Hans (GeoVista AB, Luleaa (Sweden))

    2008-12-15

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective to site a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume. Additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 can be included in this category. Other data complement older information of identical character, both inside and outside this volume. These include the character and kinematics of deformation zones and fracture mineralogy. In general terms, it can be stated that all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data that were used in this work. In particular, although the new high-resolution ground magnetic data modify slightly the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data

  5. Bedrock geology Forsmark. Modelling stage 2.3. Implications for and verification of the deterministic geological models based on complementary data

    International Nuclear Information System (INIS)

    Stephens, Michael B.; Simeonov, Assen; Isaksson, Hans

    2008-12-01

    The Swedish Nuclear Fuel and Waste Management Company is in the process of completing site descriptive modelling at two locations in Sweden, with the objective to site a deep geological repository for spent nuclear fuel. At Forsmark, the results of the stage 2.2 geological modelling formed the input for downstream users. Since complementary ground and borehole geological and geophysical data, acquired after model stage 2.2, were not planned to be included in the deterministic rock domain, fracture domain and deformation zone models supplied to the users, it was deemed necessary to evaluate the implications of these stage 2.3 data for the stage 2.2 deterministic geological models and, if possible, to make use of these data to verify the models. This report presents the results of the analysis of the complementary stage 2.3 geological and geophysical data. Model verification from borehole data has been implemented in the form of a prediction-outcome test. The stage 2.3 geological and geophysical data at Forsmark mostly provide information on the bedrock outside the target volume. Additional high-resolution ground magnetic data and the data from the boreholes KFM02B, KFM11A, KFM12A and HFM33 to HFM37 can be included in this category. Other data complement older information of identical character, both inside and outside this volume. These include the character and kinematics of deformation zones and fracture mineralogy. In general terms, it can be stated that all these new data either confirm the geological modelling work completed during stage 2.2 or are in good agreement with the data that were used in this work. In particular, although the new high-resolution ground magnetic data modify slightly the position and trace length of some stage 2.2 deformation zones at the ground surface, no new or modified deformation zones with a trace length longer than 3,000 m at the ground surface have emerged. It is also apparent that the revision of fracture orientation data

  6. Semiportable load-cell-based weighing system prototype of 18.14-metric-ton (20-ton) capacity for UF6 cylinder weight verifications: description and testing procedure

    International Nuclear Information System (INIS)

    McAuley, W.A.

    1984-01-01

    The 18.14-metric-ton-capacity (20-ton) Load-Cell-Based Weighing System (LCBWS) prototype tested at the Oak Ridge (Tennessee) Gaseous Diffusion Plant March 20-30, 1984, is semiportable and has the potential to be highly accurate. Designed by Brookhaven National Laboratory, it can be moved to cylinders for weighing, as opposed to the widely used operating philosophy of most enrichment facilities of moving cylinders to stationary accountability scales. Composed mainly of commercially available, off-the-shelf hardware, the system's principal elements are two load cells that sense the weight (i.e., force) of a uranium hexafluoride (UF6) cylinder suspended from the LCBWS while the cylinder is in the process of being weighed. Portability is achieved by its attachment to a double-hook, overhead-bridge crane. The LCBWS prototype is designed to weigh 9.07- and 12.70-metric-ton (10- and 14-ton) UF6 cylinders. A detailed description of the LCBWS is given, design information and criteria are supplied, a testing procedure is outlined, and initial test results are reported. A major objective of the testing is to determine the reliability and accuracy of the system. Other testing objectives include the identification of (1) potential areas for system improvements and (2) procedural modifications that will reflect an improved and more efficient system. The testing procedure described includes, but is not limited to, methods that account for the temperature sensitivity of the instrumentation, the local variation in the acceleration due to gravity, and buoyancy effects. Operational and safety considerations are noted. A preliminary evaluation of the March test data indicates that the LCBWS prototype has the potential to achieve an accuracy in the vicinity of 1 kg.
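    Two of the corrections named in the testing procedure (local gravity and air buoyancy) follow standard weighing metrology. A hedged Python sketch, with placeholder densities and calibration assumptions rather than LCBWS calibration data:

```python
# Sketch of the local-gravity and air-buoyancy corrections; formulas are
# standard weighing metrology, numeric values are placeholders.

G_CAL = 9.80665      # m/s^2, gravity assumed at indicator calibration
RHO_AIR = 1.2        # kg/m^3, ambient air density (placeholder)
RHO_CYL = 7850.0     # kg/m^3, effective cylinder density (placeholder)

def corrected_mass_kg(indicated_kg: float, g_local: float) -> float:
    """Correct a force-derived mass reading for local gravity and buoyancy."""
    mass = indicated_kg * G_CAL / g_local      # local-gravity correction
    mass /= 1.0 - RHO_AIR / RHO_CYL            # air-buoyancy correction
    return mass

print(corrected_mass_kg(12700.0, g_local=9.7979))  # e.g. a 14-ton cylinder
```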

  7. Time-resolved imaging of prompt-gamma rays for proton range verification using a knife-edge slit camera based on digital photon counters

    Science.gov (United States)

    Cambraia Lopes, Patricia; Clementel, Enrico; Crespo, Paulo; Henrotin, Sebastien; Huizenga, Jan; Janssens, Guillaume; Parodi, Katia; Prieels, Damien; Roellinghoff, Frauke; Smeets, Julien; Stichelbaut, Frederic; Schaart, Dennis R.

    2015-08-01

    Proton range monitoring may facilitate online adaptive proton therapy and improve treatment outcomes. Imaging of proton-induced prompt gamma (PG) rays using a knife-edge slit collimator is currently under investigation as a potential tool for real-time proton range monitoring. A major challenge in collimated PG imaging is the suppression of neutron-induced background counts. In this work, we present an initial performance test of two knife-edge slit camera prototypes based on arrays of digital photon counters (DPCs). PG profiles emitted from a PMMA target upon irradiation with a 160 MeV proton pencil beam (about 6.5 × 10^9 protons delivered in total) were measured using detector modules equipped with four DPC arrays coupled to BGO or LYSO:Ce crystal matrices. The knife-edge slit collimator and detector module were placed at 15 cm and 30 cm from the beam axis, respectively, in all cases. The use of LYSO:Ce enabled time-of-flight (TOF) rejection of background events, by synchronizing the DPC readout electronics with the 106 MHz radiofrequency signal of the cyclotron. The signal-to-background (S/B) ratio of 1.6 obtained with a 1.5 ns TOF window and a 3 MeV-7 MeV energy window was about 3 times higher than that obtained with the same detector module without TOF discrimination and 2 times higher than the S/B ratio obtained with the BGO module. Even 1 mm shifts of the Bragg peak position translated into clear and consistent shifts of the PG profile if TOF discrimination was applied, for a total number of protons as low as about 6.5 × 10^8 and a detector surface of 6.6 cm × 6.6 cm.
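    The TOF background rejection described above keeps only events whose arrival phase relative to the cyclotron RF falls inside a narrow window, combined with an energy window. A minimal NumPy sketch (the t0 alignment and exact gating logic are assumptions):

```python
import numpy as np

RF_PERIOD_NS = 1.0e3 / 106.0   # ~9.43 ns period of the 106 MHz cyclotron RF

def pg_event_filter(t_ns, energy_mev, t0_ns=0.0, tof_window_ns=1.5,
                    e_lo=3.0, e_hi=7.0):
    """Boolean mask keeping events inside the TOF and energy windows.

    t_ns are detection timestamps; t0_ns aligns the prompt-gamma arrival
    phase (in practice this is calibrated per setup)."""
    half = RF_PERIOD_NS / 2.0
    phase = np.mod(t_ns - t0_ns + half, RF_PERIOD_NS) - half  # wrap to +/-T/2
    in_tof = np.abs(phase) <= tof_window_ns / 2.0
    in_energy = (energy_mev >= e_lo) & (energy_mev <= e_hi)
    return in_tof & in_energy
```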

  8. Validation Of Critical Knowledge-Based Systems

    Science.gov (United States)

    Duke, Eugene L.

    1992-01-01

    Report discusses approach to verification and validation of knowledge-based systems. Also known as "expert systems". Concerned mainly with development of methodologies for verification of knowledge-based systems critical to flight-research systems; e.g., fault-tolerant control systems for advanced aircraft. Subject matter also has relevance to knowledge-based systems controlling medical life-support equipment or commuter railroad systems.

  9. Graph-based software specification and verification

    NARCIS (Netherlands)

    Kastenberg, H.

    2008-01-01

    The (in)correct functioning of many software systems heavily influences the way we qualify our daily lives. Software companies as well as academic computer science research groups spend much effort on applying and developing techniques for improving the correctness of software systems. In this

  10. Constraint-based verification of imperative programs

    OpenAIRE

    Beyene, Tewodros Awgichew

    2011-01-01

    work presented in the context of the European Master’s program in Computational Logic, as the partial requirement for obtaining Master of Science degree in Computational Logic The continuous reduction in the cost of computing ever since the first days of computers has resulted in the ubiquity of computing systems today; there is no any sphere of life in the daily routine of human beings that is not directly or indirectly influenced by computer systems anymore. But this high reliance ...

  11. Graph Based Verification of Software Evolution Requirements

    NARCIS (Netherlands)

    Ciraci, S.

    2009-01-01

    Due to market demands and changes in the environment, software systems have to evolve. However, the size and complexity of the current software systems make it time consuming to incorporate changes. During our collaboration with the industry, we observed that the developers spend much time on the

  12. Geothermal Resource Verification for Air Force Bases,

    Science.gov (United States)

    1981-06-01

    680°F (360°C) in the Salton Sea, California, and the nearby Cerro Prieto region of Mexico. Liquid water can exist underground in nature to a maxi... northwest Mexico's Cerro Prieto field and south-central California's Imperial Valley area [Banwell (1970)]. The Baca field in New Mexico's Jemez Mountains... SAND81-7123, Philip R. Grant, Jr., Energy R...on, Inc., 9720-D Candelaria, NE, Albuquerque, New Mexico 87111. Abstract: Geothermal energy offers a

  13. Optimal Information-Theoretic Wireless Location Verification

    OpenAIRE

    Yan, Shihao; Malaney, Robert; Nevat, Ido; Peters, Gareth W.

    2012-01-01

    We develop a new Location Verification System (LVS) focussed on network-based Intelligent Transport Systems and vehicular ad hoc networks. The algorithm we develop is based on an information-theoretic framework which uses the received signal strength (RSS) from a network of base-stations and the claimed position. Based on this information we derive the optimal decision regarding the verification of the user's location. Our algorithm is optimal in the sense of maximizing the mutual information...

  14. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence, or ''AI'', concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error that is possible in complex, high-stress situations.

  15. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
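    A rule-based monitor of the kind described above can be reduced to forward chaining over Horn clauses in implication form: each incoming trace event is asserted as a fact and rules fire to a fixpoint. A minimal propositional Python sketch (not the paper's FLTL compilation):

```python
# Minimal propositional forward-chaining monitor sketch: rules are Horn
# clauses in implication form (body -> head); each trace event is asserted
# as a fact and rules fire to a fixpoint.

def forward_chain(facts, rules):
    """rules: iterable of (frozenset_of_body_atoms, head_atom)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and body <= derived:
                derived.add(head)
                changed = True
    return derived

rules = [(frozenset({"req"}), "pending"),
         (frozenset({"pending", "ack"}), "satisfied")]
state = set()
for event in ["req", "ack"]:          # the finite-but-expanding trace
    state = forward_chain(state | {event}, rules)
print("satisfied" in state)           # True: the property holds on this trace
```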

  16. Methodology, results and experience of independent brachytherapy plan verifications based on DICOM standard; Implementacion, resultados y experiencia de una verificacion independiente de tratamientos de braquiterapia basada en el estandar DICOM

    Energy Technology Data Exchange (ETDEWEB)

    Ferrando Sanchez, A.; Pardo Perez, E.; Castro Novals, J.; Casa de Julian, M. A. de la; Cabello Murillo, E.; Diaz Fuentes, R.; Molina Lopez, M. Y.

    2013-09-01

    The use of a high dose rate source together with afterloading treatment delivery in brachytherapy plans allows for dose modulation while minimizing dose to staff. An independent verification of the data exported to the treatment station is required by local regulations (and is also a widely accepted recommendation in the international literature). We have developed a methodology, implemented in home-brewed code, to import DICOM treatment data into an Excel spreadsheet that is able to calculate dose at given reference points using the TG-43 formalism of the AAPM. It employs analytic fits of the anisotropy factor and the radial dose function for different sources. The end-point implementation we present here allows merging an independent verification and a treatment printout into one step. The use of the DICOM standard makes our code versatile and provides greater compatibility with current treatment planning systems. (Author)
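    The TG-43 point-source formalism used for such independent checks computes the dose rate as D(r) = Sk · Λ · (r0/r)² · g(r) · φan(r). A hedged Python sketch with placeholder fit coefficients (real implementations use published consensus data for the specific source model):

```python
import numpy as np

R0_CM = 1.0  # TG-43 reference distance (1 cm)

def tg43_point_dose_rate(sk_u, lam_cgy_per_hu, r_cm, g, phi_an):
    """Point-source TG-43: Ddot(r) = Sk * Lambda * (r0/r)^2 * g(r) * phi_an(r).

    g and phi_an are callables (e.g. analytic fits of the radial dose
    function and the 1-D anisotropy function for the source model)."""
    return sk_u * lam_cgy_per_hu * (R0_CM / r_cm) ** 2 * g(r_cm) * phi_an(r_cm)

# Placeholder polynomial fits -- NOT published consensus data.
g_fit = np.poly1d([-0.002, 0.01, 0.992])        # ~1.0 at r = 1 cm
phi_fit = np.poly1d([0.0, -0.01, 0.99])
print(tg43_point_dose_rate(40000.0, 1.109, 2.0, g_fit, phi_fit))  # cGy/h
```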

  17. Touch BASE

    CERN Multimedia

    Antonella Del Rosso

    2015-01-01

    In a recent Nature article (see here), the BASE collaboration reported the most precise comparison of the charge-to-mass ratio of the proton to its antimatter equivalent, the antiproton. This result is just the beginning and many more challenges lie ahead.   CERN's AD Hall, where the BASE experiment is set-up. The Baryon Antibaryon Symmetry Experiment (BASE) was approved in June 2013 and was ready to take data in August 2014. During these 14 months, the BASE collaboration worked hard to set up its four cryogenic Penning traps, which are the heart of the whole experiment. As their name indicates, these magnetic devices are used to trap antiparticles – antiprotons coming from the Antiproton Decelerator – and particles of matter – negative hydrogen ions produced in the system by interaction with a degrader that slows the antiprotons down, allowing scientists to perform their measurements. “We had very little time to set up the wh...

  18. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  19. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  20. Web based foundry knowledge base

    Directory of Open Access Journals (Sweden)

    A. Stawowy

    2009-01-01

    Full Text Available The main assumptions and functions of the proposed Foundry Knowledge Base (FKB) are presented in this paper. The FKB is a framework for information exchange about casting products and manufacturing methods. We use a CMS (Content Management System) to develop and maintain our web-based system. CastML, an XML dialect developed by the authors for the description of casting products and processes, is used as a tool for information interchange between our system and outside systems, while SQL is used to store and edit knowledge rules and also to solve the basic selection problems in the rule-based module. Besides the standard functions (company data, news, events, forums and media kit), our website contains a number of nonstandard functions; the intelligent search module based on an expert system is the main advantage of our solution. The FKB is intended to be a social portal whose content will be developed by the foundry community.

  1. A SAT-Based Algorithm for Reparameterization in Symbolic Simulation

    National Research Council Canada - National Science Library

    Chauhan, Pankaj; Kroening, Daniel; Clarke, Edmund

    2003-01-01

    .... Efficient SAT solvers have been applied successfully for many verification problems. This paper presents a novel SAT-based reparameterization algorithm that is largely immune to the large number of input variables that need to be quantified...

  2. Principles of Component-Based Design of Intelligent Agents

    NARCIS (Netherlands)

    Brazier, F.M.; Jonker, C.M.; Treur, J.

    Compositional multi-agent system design is a methodological perspective on multi-agent system design based on the software engineering principles process and knowledge abstraction, compositionality, reuse, specification and verification. This paper addresses these principles from a generic

  3. Principles of component-based design of intelligent agents.

    NARCIS (Netherlands)

    Brazier, F.M.T.; Jonker, C.M.; Treur, J.

    2002-01-01

    Compositional multi-agent system design is a methodological perspective on multi-agent system design based on the software engineering principles process and knowledge abstraction, compositionality, reuse, specification and verification. This paper addresses these principles from a generic

  4. The SeaHorn Verification Framework

    Science.gov (United States)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  5. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)


    based model checking style of verification. The next paper by D'Souza & Thiagarajan presents an automata-theoretic approach to analysing timing properties of systems. The last paper by Mohalik and Ramanujam presents the assumption.

  6. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and a cost-sensitive classifier was found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  7. Fragmentation based

    Directory of Open Access Journals (Sweden)

    Shashank Srivastava

    2014-01-01

    Building on an understanding of mobile agent architecture and its security concerns, in this paper we propose a security protocol which addresses security at a reduced computational cost. The protocol is a combination of self-decryption, co-operation, and obfuscation techniques. To circumvent the risk of malicious code execution in an attacking environment, we propose a fragmentation-based encryption technique. Our encryption technique suits the typical mobile agent size and provides strong obfuscation, increasing the attacker's challenge while offering better performance with respect to computational cost compared to existing AES encryption.

  8. Abstraction and Learning for Infinite-State Compositional Verification

    Directory of Open Access Journals (Sweden)

    Dimitra Giannakopoulou

    2013-09-01

    Full Text Available Despite many advances that enable the application of model checking techniques to the verification of large systems, the state-explosion problem remains the main challenge for scalability. Compositional verification addresses this challenge by decomposing the verification of a large system into the verification of its components. Recent techniques use learning-based approaches to automate compositional verification based on the assume-guarantee style reasoning. However, these techniques are only applicable to finite-state systems. In this work, we propose a new framework that interleaves abstraction and learning to perform automated compositional verification of infinite-state systems. We also discuss the role of learning and abstraction in the related context of interface generation for infinite-state components.

  9. Applying rule-base anomalies to KADS inference structures

    NARCIS (Netherlands)

    Van Harmelen, Frank

    1997-01-01

    The literature on validation and verification of knowledge-based systems contains a catalogue of anomalies for knowledge-based systems, such as redundant, contradictory or deficient knowledge. Detecting such anomalies is a method for verifying knowledge-based systems. Unfortunately, the traditional

  10. A software architecture for knowledge-based systems

    NARCIS (Netherlands)

    Fensel, D; Groenboom, R

    The paper introduces a software architecture for the specification and verification of knowledge-based systems combining conceptual and formal techniques. Our focus is component-based specification enabling their reuse. We identify four elements of the specification of a knowledge-based system: a

  11. Parking Space Verification

    DEFF Research Database (Denmark)

    Høg Peter Jensen, Troels; Thomsen Schmidt, Helge; Dyremose Bodin, Niels

    2018-01-01

    With the number of privately owned cars increasing, the issue of locating an available parking space becomes apparent. This paper deals with the verification of vacant parking spaces, using a vision-based system looking over parking areas. In particular, the paper proposes a binary classifier system, based on a Convolutional Neural Network, that is capable of determining whether a parking space is occupied or not. A benchmark database consisting of images captured from different parking areas, under different weather and illumination conditions, has been used to train and test the system...
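    A binary occupied/vacant classifier of the kind described can be sketched in a few lines of PyTorch; the input size, architecture, and training details below are assumptions, not the paper's exact network:

```python
import torch
import torch.nn as nn

# Sketch of a binary occupied/vacant classifier; 64x64 RGB crops of single
# parking spaces and this architecture are assumptions, not the paper's net.
class SpaceClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(64 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, 1),                 # one logit: occupied vs vacant
        )

    def forward(self, x):
        return self.head(self.features(x))

model = SpaceClassifier()
x = torch.randn(8, 3, 64, 64)                 # a batch of space crops
loss = nn.BCEWithLogitsLoss()(model(x), torch.randint(0, 2, (8, 1)).float())
```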

  12. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  13. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare...... these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey...

  14. Constraint specialisation in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    We present a method for specialising the constraints in constrained Horn clauses with respect to a goal. We use abstract interpretation to compute a model of a query–answer transformed version of a given set of clauses and a goal. The constraints from the model are then used to compute...... underlying the clauses. Experimental results on verification problems show that this is an effective transformation, both in our own verification tools (based on a convex polyhedra analyser) and as a pre-processor to other Horn clause verification tools....

  15. Location Verification Systems Under Spatially Correlated Shadowing

    OpenAIRE

    Yan, Shihao; Nevat, Ido; Peters, Gareth W.; Malaney, Robert

    2014-01-01

    The verification of the location information utilized in wireless communication networks is a subject of growing importance. In this work we formally analyze, for the first time, the performance of a wireless Location Verification System (LVS) under the realistic setting of spatially correlated shadowing. Our analysis illustrates that anticipated levels of correlated shadowing can lead to a dramatic performance improvement of a Received Signal Strength (RSS)-based LVS. We also analyze the per...

  16. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  17. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a process based on Model Driven Engineering to automate the Verification and Validation of the software on board of satellites. The process is implemented in the software control unit of the energy particle detector which is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of Finite Timed Automata. When the system is deployed on target, the verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any of the constraints is not satisfied by the on-target evidence, the scheduling analysis is not valid.
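
    A minimal sketch of the final checking step, reduced to per-task response-time budgets rather than full timed automata; the trace format and the numbers are illustrative assumptions:

    ```python
    def check_deadlines(trace, constraints):
        """Check instrumented timing evidence against scheduling constraints.
        trace: list of (task, start_us, end_us) tuples extracted on target.
        constraints: {task: worst-case allowed response time in us}.
        Returns the list of violations (empty means the analysis holds)."""
        violations = []
        for task, start, end in trace:
            budget = constraints.get(task)
            if budget is not None and end - start > budget:
                violations.append((task, end - start, budget))
        return violations

    evidence = [("acq", 0, 180), ("proc", 200, 1450), ("tx", 1500, 1710)]
    limits = {"acq": 250, "proc": 1000, "tx": 300}
    print(check_deadlines(evidence, limits))  # [('proc', 1250, 1000)]
    ```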

  18. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly-seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
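
    A minimal sketch of the fusion-and-matching step, assuming the per-segment Zernike descriptors have already been computed (e.g., 16 moments per segment); the segment names, distance metric and threshold are illustrative:

    ```python
    import numpy as np

    def verify_hand(probe_descriptors, enrollment_templates, threshold=0.15):
        """Fuse per-segment Zernike descriptors (palm + 5 fingers) into one
        feature vector and accept the best-matching enrolled identity if
        its normalized distance is below a threshold. Descriptors are
        assumed precomputed and rotation/scale/translation-invariant."""
        probe = np.concatenate([probe_descriptors[s] for s in sorted(probe_descriptors)])
        best = None
        for identity, segs in enrollment_templates.items():
            tmpl = np.concatenate([segs[s] for s in sorted(segs)])
            d = np.linalg.norm(probe - tmpl) / len(probe)
            if best is None or d < best[1]:
                best = (identity, d)
        return best if best and best[1] <= threshold else None

    rng = np.random.default_rng(0)
    segs = {s: rng.random(16) for s in ["palm", "f1", "f2", "f3", "f4", "f5"]}
    templates = {"alice": segs}
    print(verify_hand(segs, templates))  # ('alice', 0.0)
    ```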

  19. Foundation: Transforming data bases into knowledge bases

    Science.gov (United States)

    Purves, R. B.; Carnes, James R.; Cutts, Dannie E.

    1987-01-01

    One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.

  20. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, system adaptability to small changes in the environment, portability and compatibility, etc. These methods vary both in their operation process and in the way the result is achieved. The article describes the static and dynamic methods of software verification and pays attention to the method of symbolic execution. In its review of static analysis, the deductive method and model checking are discussed and described. The pros and cons of each particular method are emphasized, and a classification of test techniques for each method is considered. The paper presents and analyzes the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependences connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis, such as testing, monitoring and profiling, are presented and analyzed, and some kinds of tools that can be applied to the software when using dynamic analysis methods are considered. Based on this work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solutions and
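
    A minimal sketch of the symbolic execution idea the article reviews, using the z3 SMT solver to decide path feasibility; the toy program and its path conditions are illustrative:

    ```python
    # Symbolic execution of a two-branch function: each program path is
    # explored with a path condition, and z3 decides which paths are feasible.
    from z3 import Int, Solver, And, Not, sat

    x = Int("x")

    # Program under analysis:  def f(x): return 10 // x if x > 0 else -x
    paths = [
        ("then-branch", And(x > 0)),          # condition to reach the division
        ("else-branch", Not(x > 0)),
        ("div-by-zero", And(x > 0, x == 0)),  # candidate error state
    ]

    for name, cond in paths:
        s = Solver()
        s.add(cond)
        verdict = "feasible" if s.check() == sat else "infeasible"
        print(f"{name}: {verdict}")
    # div-by-zero is reported infeasible: the guard x > 0 excludes x == 0.
    ```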

  1. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  2. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  3. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  4. A Comparison of Two Content Area Curriculum-Based Measurement Tools

    Science.gov (United States)

    Ford, Jeremy W.; Conoyer, Sarah J.; Lembke, Erica S.; Smith, R. Alex; Hosp, John L.

    2018-01-01

    In the present study, two types of curriculum-based measurement (CBM) tools in science, Vocabulary Matching (VM) and Statement Verification for Science (SV-S), a modified Sentence Verification Technique, were compared. Specifically, this study aimed to determine whether the format of information presented (i.e., SV-S vs. VM) produces differences…

  5. Signature-based store checking buffer

    Science.gov (United States)

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify that the content is error-free.
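
    A minimal software analogue of the mechanism described in this patent record: stores from redundant computation instances are folded into running signatures that are compared at the barrier. SHA-256 and the 64-bit address/value encoding are assumptions made for the sketch:

    ```python
    import hashlib

    class StoreFingerprintBuffer:
        """Software analogue of a store checking buffer: instead of keeping
        every store from each redundant computation, fold them into a
        running signature and compare signatures at the barrier."""
        def __init__(self):
            self._digests = {}

        def record(self, instance_id, address, value):
            h = self._digests.setdefault(instance_id, hashlib.sha256())
            h.update(address.to_bytes(8, "little"))
            h.update(value.to_bytes(8, "little"))

        def barrier_check(self):
            sigs = {i: h.hexdigest() for i, h in self._digests.items()}
            return len(set(sigs.values())) == 1  # all instances agree

    buf = StoreFingerprintBuffer()
    for inst in (0, 1):                 # two redundant instances
        buf.record(inst, 0x1000, 42)
        buf.record(inst, 0x1008, 7)
    print(buf.barrier_check())          # True: outputs are error-free
    ```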

  6. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  7. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint based identification is becoming more popular. The most widely used fingerprint representation is the minutiae based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter bank based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.
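
    A minimal sketch of a filter-bank representation in the spirit of this study: Gabor responses at several orientations summarized by their standard deviation. The published FingerCode approach computes such statistics per sector around the core point; whole-image statistics and the filter parameters here are simplifying assumptions:

    ```python
    import numpy as np
    from skimage.filters import gabor

    def filterbank_features(image, frequency=0.1, n_orientations=8):
        """Simplified filter-bank fingerprint representation: filter the
        image with Gabor filters at several orientations and use the
        standard deviation of each response as a feature."""
        feats = []
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            real, _ = gabor(image, frequency=frequency, theta=theta)
            feats.append(real.std())
        return np.array(feats)

    img = np.random.default_rng(1).random((64, 64))
    print(filterbank_features(img).shape)  # (8,)
    ```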

  8. Method for secure electronic voting system: face recognition based approach

    Science.gov (United States)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used to characterize face features as texture, and a chi-square distribution is then used for image classification. Two parallel systems, based on smart phone and web applications, are developed for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification, and a class-specific threshold controls the security level of the face verification. The system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, the proposed system provides secure, hassle-free voting and is less intrusive compared with other biometrics.
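
    A minimal sketch of the LBP-plus-chi-square verification step, assuming grayscale face crops; the uniform-LBP parameters and the single acceptance threshold are illustrative stand-ins for the paper's class-specific thresholds:

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(face, P=8, R=1.0):
        """Texture descriptor: normalized histogram of uniform LBP codes."""
        codes = local_binary_pattern(face, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
        return hist

    def chi_square(h1, h2, eps=1e-10):
        return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

    def verify_voter(probe_face, enrolled_face, threshold=0.05):
        """Accept the voter only if the claimed ID's enrolled face matches."""
        return chi_square(lbp_histogram(probe_face), lbp_histogram(enrolled_face)) < threshold

    rng = np.random.default_rng(2)
    face = rng.integers(0, 256, size=(96, 96), dtype=np.uint8)
    print(verify_voter(face, face))  # True: identical images match
    ```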

  9. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... Our fundamental approach actively assists subject-matter experts in organizing their knowledge inclusive of uncertainty to build such embedded systems in a consistent and correct as well as effective fashion...

  10. Ontology-Based Empirical Knowledge Verification for Professional Virtual Community

    Science.gov (United States)

    Chen, Yuh-Jen

    2011-01-01

    A professional virtual community provides an interactive platform for enterprise experts to create and share their empirical knowledge cooperatively, and the platform contains a tremendous amount of hidden empirical knowledge that knowledge experts have preserved in the discussion process. Therefore, enterprise knowledge management highly…

  11. Camera-based independent couch height verification in radiation oncology

    NARCIS (Netherlands)

    Kusters, M.; Louwe, R.J.W.; Biemans-van Kastel, L.; Nieuwenkamp, H.; Zahradnik, R.; Claessen, R.; Seters, R.V.; Huizenga, H.

    2015-01-01

    For specific radiation therapy (RT) treatments, it is advantageous to use the isocenter-to-couch distance (ICD) for initial patient setup.(1) Since sagging of the treatment couch is not properly taken into account by the electronic readout of the treatment machine, this readout cannot be used for

  12. Provenance based data integrity checking and verification in cloud environments.

    Science.gov (United States)

    Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent trend in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers respectively. It has been proposed as an effective solution for data outsourcing and on demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms user's data is moved into remotely located storages such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors amongst others. However, these methods use extra storage space by maintaining multiple copies of data or the presence of a third party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if it occurs. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme is capable of reducing the need for any third party services, additional hardware support and the replication of data items on the client side for integrity checking.
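
    A minimal sketch of a provenance-backed integrity check, assuming a hash-chained provenance log kept alongside the data (the paper's scheme is richer); SHA-256 and the record layout are assumptions:

    ```python
    import hashlib, json

    def provenance_append(chain, operation, data):
        """Append a provenance record whose hash covers both the payload
        and the previous record, forming a tamper-evident chain."""
        prev = chain[-1]["hash"] if chain else "0" * 64
        body = {"op": operation, "data": data, "prev": prev}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        chain.append({**body, "hash": digest})
        return chain

    def provenance_verify(chain):
        """Recompute every link; a single modified record breaks the chain."""
        prev = "0" * 64
        for rec in chain:
            body = {"op": rec["op"], "data": rec["data"], "prev": prev}
            if rec["prev"] != prev or rec["hash"] != hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest():
                return False
            prev = rec["hash"]
        return True

    chain = provenance_append([], "upload", "report-v1")
    chain = provenance_append(chain, "update", "report-v2")
    print(provenance_verify(chain))          # True
    chain[0]["data"] = "tampered"
    print(provenance_verify(chain))          # False: violation is detectable
    ```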

  13. Provenance based data integrity checking and verification in cloud environments.

    Directory of Open Access Journals (Sweden)

    Muhammad Imran

    Full Text Available Cloud computing is a recent trend in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers respectively. It has been proposed as an effective solution for data outsourcing and on demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms user's data is moved into remotely located storages such that users lose control over their data. This unique feature of the Cloud is facing many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is to provide the proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required where users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed like mirroring, checksumming and using third party auditors amongst others. However, these methods use extra storage space by maintaining multiple copies of data or the presence of a third party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track the violation of data integrity if it occurs. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme is capable of reducing the need for any third party services, additional hardware support and the replication of data items on the client side for integrity checking.

  14. Modular Product Verifications Based on Design for Assembly

    OpenAIRE

    Kenger, Patrik; Bergdahl, Anders; Onori, Mauro

    2005-01-01

    The desire to conquer markets through advanced product design and trendy business strategies are still predominant approaches in industry today. In fact, product development has acquired an ever more central role in the strategic planning of companies, and it has extended its influence to R&D funding levels as well. It is not surprising that many national R&D project frameworks within the EU today are dominated by product development topics, leaving production engineering, robotics, a...

  15. Automated Verification of IGRT-based Patient Positioning.

    Science.gov (United States)

    Jiang, Xiaojun; Fox, Tim; Cordova, James S; Schreibmann, Eduard

    2015-11-08

    A system for automated quality assurance of the therapist's registration in radiotherapy was designed and tested in clinical practice. The approach complements the clinical software's automated registration in terms of algorithm configuration and performance, and constitutes a practical approach for ensuring safe patient setups. Per our convergence analysis, evolutionary algorithms perform better at finding the global optima of the cost function, with discrepancies from a deterministic optimizer seen sporadically.

  16. Verification of Context-Dependent Channel-Based Service Models

    NARCIS (Netherlands)

    N. Kokash (Natallia); C. Krause (Christian, born Köhler); E.P. de Vink (Erik Peter)

    2010-01-01

    The paradigms of service-oriented computing and model-driven development are becoming of increasing importance in the field of software engineering. According to these paradigms, new systems are composed with added value from existing stand-alone services to support business processes

  17. SMT-Based Formal Verification of a TTEthernet Synchronization Function

    Science.gov (United States)

    Steiner, Wilfried; Dutertre, Bruno

    TTEthernet is a communication infrastructure for mixed-criticality systems that integrates dataflow from applications with different criticality levels on a single network. For applications of highest criticality, TTEthernet provides a synchronization strategy that tolerates multiple failures. The resulting fault-tolerant timebase can then be used for time-triggered communication to ensure temporal partitioning on the shared network.

  18. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer based system). These software modules can be either own-design systems or systems based on commercial modules, such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan

  19. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  20. Variance based OFDM frame synchronization

    Directory of Open Access Journals (Sweden)

    Z. Fedra

    2012-04-01

    Full Text Available The paper deals with a new frame synchronization scheme for OFDM systems and calculates the complexity of this scheme. The scheme is based on computing the variance of the detection window. The variance is computed at two delayed times, so a modified early-late loop is used for frame position detection. The proposed algorithm deals with different variants of OFDM parameters, including the guard interval and cyclic prefix, and has good properties regarding the choice of the algorithm's parameters, since they may be chosen within a wide range without strongly influencing system performance. The functionality of the proposed algorithm has been verified in a development environment using universal software radio peripheral (USRP) hardware.
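
    A minimal sketch of the variance-based detection idea: the sample variance inside a sliding window jumps where the frame starts. The early-late refinement is reduced here to picking the largest variance increase; the signal model and window length are illustrative:

    ```python
    import numpy as np

    def variance_sync(rx, win):
        """Locate the OFDM frame start as the sample index where the
        detection-window variance jumps: noise-only samples have low
        variance, modulated samples high. A full early-late loop would
        compare the variance at index-d and index+d; here the argmax of
        the variance difference keeps the sketch short."""
        var = np.array([rx[i:i + win].var() for i in range(len(rx) - win)])
        return int(np.argmax(np.diff(var)))

    rng = np.random.default_rng(3)
    noise = 0.05 * rng.standard_normal(200)
    frame = rng.choice([-1.0, 1.0], size=300)          # toy OFDM payload
    rx = np.concatenate([noise, frame])
    print(variance_sync(rx, win=32))  # estimate near the true start index 200
    ```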

  1. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  2. Extending the similarity-based XML multicast approach with digital signatures

    DEFF Research Database (Denmark)

    Azzini, Antonia; Marrara, Stefania; Jensen, Meiko

    2009-01-01

    This paper investigates the interplay between similarity-based SOAP message aggregation and digital signature application. An overview of the approaches resulting from the different orders for the tasks of signature application, verification, similarity aggregation and splitting is provided. Depe...

  3. Web-Based Dissemination System for the Trusted Computing Exemplar Project

    National Research Council Canada - National Science Library

    Kane, Douglas R., Jr

    2005-01-01

    Open dissemination of the Trusted Computing Exemplar (TCX) project is needed. This dissemination must include methods to provide secure web access to project material, integrity verification of data, and group-based access controls...

  4. Component-Based Development of Runtime Observers in the COMDES Framework

    DEFF Research Database (Denmark)

    Guan, Wei; Li, Gang; Angelov, Christo K.

    2013-01-01

    Formal verification methods, such as exhaustive model checking, are often infeasible because of high computational complexity. Runtime observers (monitors) provide an alternative, light-weight verification method, which offers a non-exhaustive but still feasible approach to monitor system behavior...... against formally specified properties. This paper presents a component-based design method for runtime observers in the context of the COMDES framework—a component-based framework for distributed embedded systems and its supporting tools. Therefore, runtime verification is facilitated by model...

  5. Formal verification of an oral messages algorithm for interactive consistency

    Science.gov (United States)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
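
    A minimal executable sketch of the Oral Messages algorithm OM(m) that the verified Interactive Consistency algorithm builds on; the fault model here (a traitor simply flips the value it relays) is a simplification of the arbitrary Byzantine behaviour treated in the formal proof:

    ```python
    from collections import Counter

    def om(m, commander, generals, value, traitors):
        """Recursive Oral Messages algorithm OM(m) (Lamport-Shostak-Pease).
        Returns {general: decided value}. A traitorous sender flips the
        value it relays, a simple (not worst-case) fault model."""
        lieutenants = [g for g in generals if g != commander]
        sent = {g: (1 - value if commander in traitors else value)
                for g in lieutenants}
        if m == 0:
            return sent
        decided = {}
        for g in lieutenants:
            # g collects the value received directly plus the values the
            # other lieutenants relay via OM(m-1), then majority-votes.
            relayed = [om(m - 1, other, lieutenants, sent[other], traitors)[g]
                       for other in lieutenants if other != g]
            votes = relayed + [sent[g]]
            decided[g] = Counter(votes).most_common(1)[0][0]
        return decided

    # 4 generals tolerate 1 traitor with OM(1): loyal lieutenants agree on 1.
    print(om(1, "A", ["A", "B", "C", "D"], 1, traitors={"D"}))
    ```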

  6. Verification and calibration of a simulation model for reproductive performance of beef cows based on body condition score: internal evaluation

    Directory of Open Access Journals (Sweden)

    José Acélio Silveira da Fontoura Júnior

    2010-12-01

    conception, it was based on the probability of occurrence. For internal evaluation of the model, the methodologies of verification and calibration were used. Input data variation was applied through the construction of scenarios by changing BCSAC, the starting date of the reproductive season (SDRS), the average date of calving (ADC), and the average daily weight gain from birth to weaning (ADWGBW); these scenarios also served to demonstrate the model. Through calibration, new standard deviations were chosen for the following variables: gestation period, ADWGBW and average birth date (ABD). Tests for degeneracy and independence of the seeds generating random numbers, performed after calibration, showed the coherence of the model in generating randomness for the variables studied. Variation of the input data showed the effectiveness of the model in simulating the dynamics of reproduction systems. Nevertheless, adjustments to the conception rate of primiparous females are needed for the model to generate values that are compatible with reality.

  7. [The study of noninvasive ventilator impeller based on ANSYS].

    Science.gov (United States)

    Hu, Zhaoyan; Lu, Pan; Xie, Haiming; Zhou, Yaxu

    2011-06-01

    An impeller plays a significant role in a non-invasive ventilator. In this paper, a model of a noninvasive ventilator impeller was established with the software SolidWorks, and its feasibility was studied based on ANSYS. The stress and strain of the impeller under external loads were then discussed. The results of the analysis provided verification for the reliable design of impellers.

  8. Augmented reality-assisted skull base surgery.

    Science.gov (United States)

    Cabrilo, I; Sarrafzadeh, A; Bijlenga, P; Landis, B N; Schaller, K

    2014-12-01

    Neuronavigation is widely considered as a valuable tool during skull base surgery. Advances in neuronavigation technology, with the integration of augmented reality, present advantages over traditional point-based neuronavigation. However, this development has not yet made its way into routine surgical practice, possibly due to a lack of acquaintance with these systems. In this report, we illustrate the usefulness and easy application of augmented reality-based neuronavigation through a case example of a patient with a clivus chordoma. We also demonstrate how augmented reality can help throughout all phases of a skull base procedure, from the verification of neuronavigation accuracy to intraoperative image-guidance.

  9. Summary 2: Graph Grammar Verification through Abstraction

    NARCIS (Netherlands)

    Baldan, P.; Koenig, B.; Rensink, A.; Rensink, Arend; König, B.; Montanari, U.; Gardner, P.

    2005-01-01

    Until now there have been few contributions concerning the verification of graph grammars, specifically of infinite-state graph grammars. This paper compares two existing approaches, based on abstractions of graph transformation systems. While in the unfolding approach graph grammars are

  10. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    The purpose of this thesis is to develop a method for verifying timed temporal properties of continuous dynamical systems, and to develop a method for verifying the safety of an interconnection of continuous systems. The methods must be scalable in the number of continuous variables...... to the high complexity of both the dynamical system and the specification. Therefore, there is a need for methods capable of verifying complex specifications of complex systems. The verification of high dimensional continuous dynamical systems is the key to verifying general systems. In this thesis......, an abstraction approach is taken to the verification problem. A method is developed for abstracting continuous dynamical systems by timed automata. This method is based on subdividing the state space into cells by use of subdivisioning functions that are decreasing along the vector field. To allow...

  11. Plan for a laser weapon verification research program

    Energy Technology Data Exchange (ETDEWEB)

    Karr, T.J.

    1990-03-01

    Currently there is great interest in the question of how, or even whether, a treaty limiting the development and deployment of laser weapons could be verified. The concept of cooperative laser weapon verification is that each party would place monitoring stations near the other party's declared or suspect laser weapon facilities. The monitoring stations would measure the "primary laser observables", such as power or energy, either directly or by collecting laser radiation scattered from the air or the target, and would verify that the laser is operated within treaty limits. This concept is modeled along the lines of the seismic network recently activated in the USSR as a joint project of the United States Geologic Survey and the Soviet Academy of Sciences. The seismic data, gathered cooperatively, can be used by each party as it wishes, including to support verification of future nuclear test ban treaties. For laser weapon verification the monitoring stations are envisioned as ground-based, and would verify treaty limitations on ground-based laser anti-satellite (ASAT) weapons and on the ground-based development of other laser weapons. They would also contribute to verification of limitations on air-, sea- and space-based laser weapons, and the technology developed for cooperative verification could also be used in national technical means of verification. 2 figs., 4 tabs.

  12. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy and was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are however not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development, before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  13. The potential of knowledge based systems in nuclear installations

    International Nuclear Information System (INIS)

    1993-04-01

    The integration of knowledge based systems (KBS) with processes is a new challenge for artificial intelligence developments. Integration requires improved robustness of the KBS, using methodologies which cater for building well structured, modular applications, and for verification and validation. The following key points related to KBS development were discussed during the meeting: state of the art in knowledge representation and reasoning; methods for building KBS; tools and computers used for building and implementing KBS; requirements for verification and validation; communication between the KBS and the process, and between the KBS and the operators. Nine papers were presented by participants. A separate abstract was prepared for each of the papers. Refs and figs

  14. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  15. The Guide-based Automatic Creation of Verified Test Scenarious

    Directory of Open Access Journals (Sweden)

    P. D. Drobintsev

    2013-01-01

    Full Text Available This paper presents an overview of a technology for the automated generation of test scenarios based on guides. Using this technology can significantly improve the quality of the developed program products. To motivate the creation of the technology, the main problems that occur during the development and testing of large industrial systems are described, as well as methodologies for verifying software conformity to product requirements. The capabilities of tools for automatic and semi-automatic generation of a test suite from a formal model in UCM notation are demonstrated, as well as tools for verification and test automation.

  16. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.

  17. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  18. Signature verification with writing posture analysis

    Science.gov (United States)

    Cheng, Hsu-Yung; Yu, Chih-Chang

    2013-07-01

    A video-based handwritten signature verification framework is proposed in this paper. Using a camera as the sensor has the advantage that the entire writing process can be captured along with the signature. The main contribution of this work is that writing postures are analyzed to achieve the verification purpose, because writing postures cannot be easily imitated or forged. The proposed system is able to achieve low false rejection rates while maintaining low false acceptance rates for a database containing both unskilled and skilled imitation signatures.

  19. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  20. The physics data base

    International Nuclear Information System (INIS)

    Gault, F.D.

    1984-01-01

    The physics data base is introduced along with its associated data base management system. The emphasis is on data and their use and a classification of data and of data bases is developed to distinguish compilation organizations. The characteristics of these organizations are examined briefly and the long term consequences of the physics data base discussed. (orig.)

  1. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  2. Solid Base Catalysis

    CERN Document Server

    Ono, Yoshio

    2011-01-01

    The importance of solid base catalysts has come to be recognized for their environmentally benign qualities, and much significant progress has been made over the past two decades in catalytic materials and solid base-catalyzed reactions. The book is focused on the solid base. Because of the advantages over liquid bases, the use of solid base catalysts in organic synthesis is expanding. Solid bases are easier to dispose than liquid bases, separation and recovery of products, catalysts and solvents are less difficult, and they are non-corrosive. Furthermore, base-catalyzed reactions can be performed without using solvents and even in the gas phase, opening up more possibilities for discovering novel reaction systems. Using numerous examples, the present volume describes the remarkable role solid base catalysis can play, given the ever increasing worldwide importance of "green" chemistry. The reader will obtain an overall view of solid base catalysis and gain insight into the versatility of the reactions to whic...

  3. Symptom-based emergency operating procedures development for Ignalina NPP

    International Nuclear Information System (INIS)

    Kruglov, Y.

    1999-01-01

    This paper and lecture present: (1) Introduction; (2) EOP project work stages and documentation; (3) Selection and justification of accident management strategy; (4) Content of EOP package; (5) Development of EOP package; (6) EOP package verification; (7) EOP package validation; (8) EOP training; (9) EOP implementation; (10) Conditions of symptom-based emergency operating procedures package application and its interconnection with event-based emergency operating procedures; (11) Rules of EOP application; (12) EOP maintenance

  4. Implementing model based systems engineering in South Africa

    CSIR Research Space (South Africa)

    Oosthuizen, Rudolph

    2017-10-01

    Full Text Available "...and validation activities beginning in the conceptual design phase and continuing throughout development and later life cycle phases". Model-Based Systems Engineering (MBSE) employs modelling languages, such as the Systems Modelling Language (SysML) or the Unified... MBSE is defined as the formalized application of modeling to support system requirements, design, analysis, verification and validation activities throughout the system life cycle phases (Gau...

  5. Prestandardisation Activities for Computer Based Safety Systems

    DEFF Research Database (Denmark)

    Taylor, J. R.; Bologna, S.; Ehrenberger, W.

    1981-01-01

    Questions of technical safety are becoming more and more important. Due to the higher complexity of their functions, computer based safety systems have special problems. Researchers, producers, licensing personnel and customers have met on a European basis to exchange knowledge and formulate positions. The Commission of the European Community supports the work. Major topics comprise hardware configuration and self supervision, software design, verification and testing, documentation, system specification and concurrent processing. Preliminary results have been used for the draft of an IEC standard and for some...

  6. An Optimized Signature Verification System for Vehicle Ad hoc NETwork

    OpenAIRE

    Mamun, Mohammad Saiful Islam; Miyaji, Atsuko

    2012-01-01

    This paper presents an efficient approach to an existing batch verification system on identity-based group signatures (IBGS), which can be applied to any mobile ad hoc network device, including Vehicle Ad hoc Networks (VANET). We propose an optimized way to batch signatures in order to get maximum throughput from a device in a runtime environment. In addition, we minimize the number of pairing computations in the batch verification proposed by B. Qin et al. for large scale VANET. We introduce a batch...

  7. Experience in non-proliferation verification: The Treaty of Raratonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Raratonga are subdivided into two categories: those performed by the IAEA and those performed by other entities. A final provision of the Treaty of Raratonga is relevant to IAEA safeguards, as it supports the continued effectiveness of the international non-proliferation system based on the Non-proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well

  8. Deductive Evaluation: Implicit Code Verification With Low User Burden

    Science.gov (United States)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating C code using a deductive approach that discovers and proves program properties. The framework applies Floyd-Hoare verification principles in its treatment of loops, with a library of iteration schemes serving to derive loop invariants. During evaluation, theorem proving is performed on-the-fly, obviating the generation of verification conditions normally needed to establish loop properties. A PVS-based prototype is presented along with results for sample C functions.
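
    A minimal sketch of the Floyd-Hoare treatment of loops that the framework applies, using an SMT solver (z3 here, standing in for the PVS-based prototype) to discharge the initiation and preservation conditions of a loop invariant:

    ```python
    # Floyd-Hoare style check of a loop invariant with an SMT solver:
    # loop: i = s = 0; while i < n: s += i; i += 1
    # invariant I: 2*s == i*(i-1) and 0 <= i <= n
    from z3 import Ints, Implies, And, Not, Solver, unsat

    i, s, n = Ints("i s n")
    inv = And(2 * s == i * (i - 1), i >= 0, i <= n)

    def valid(formula):
        solver = Solver()
        solver.add(Not(formula))          # valid iff negation is unsatisfiable
        return solver.check() == unsat

    # 1) Initiation: the invariant holds on entry (i = 0, s = 0).
    init = Implies(And(i == 0, s == 0, n >= 0), inv)
    # 2) Preservation: one iteration from an I-state re-establishes I.
    step = Implies(And(inv, i < n),
                   And(2 * (s + i) == (i + 1) * i, i + 1 >= 0, i + 1 <= n))
    print(valid(init), valid(step))       # True True: the invariant is inductive
    ```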

  9. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparison of records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality checks. The analysis of the different measurement methods and their limits of accuracy is discussed in the paper

  10. Average Gait Differential Image Based Human Recognition

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available The difference between adjacent frames of human walking contains useful information for human gait identification. Based on this idea, a silhouette-difference-based human gait recognition method named average gait differential image (AGDI) is proposed in this paper. The AGDI is generated by accumulating the silhouette differences between adjacent frames. The advantage of this method lies in that, as a feature image, it preserves both the kinetic and static information of walking. Compared to the gait energy image (GEI), the AGDI is better suited to representing the variation of silhouettes during walking. Two-dimensional principal component analysis (2DPCA) is used to extract features from the AGDI. Experiments on the CASIA dataset show that AGDI has better identification and verification performance than GEI. Compared to PCA, 2DPCA is a more efficient feature extraction method with less memory consumption in gait-based recognition.
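
    A minimal sketch of the AGDI computation itself (the paper then applies 2DPCA to such feature images); the toy silhouette sequence is illustrative:

    ```python
    import numpy as np

    def average_gait_differential_image(silhouettes):
        """AGDI: accumulate absolute differences between adjacent binary
        silhouette frames and average, so both motion (where frames differ)
        and stance (where they do not) shape the feature image."""
        frames = np.asarray(silhouettes, dtype=float)      # (T, H, W)
        diffs = np.abs(np.diff(frames, axis=0))            # (T-1, H, W)
        return diffs.mean(axis=0)

    # Toy sequence: a 1-pixel "leg" swinging across a 4x4 frame.
    seq = np.zeros((3, 4, 4))
    for t in range(3):
        seq[t, 3, t] = 1.0
    agdi = average_gait_differential_image(seq)
    print(agdi[3])  # motion energy concentrated where the leg moved
    ```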

  11. CVPP: A Tool Set for Compositional Verification of Control-Flow Safety Properties.

    NARCIS (Netherlands)

    Huisman, Marieke; Gurov, Dilian; Beckert, Bernhard; Marche, Claude

    2010-01-01

    This paper describes CVPP, a tool set for compositional verification of control-flow safety properties for programs with procedures. The compositional verification principle that underlies CVPP is based on maximal models constructed from component specifications. Maximal models replace the actual

  12. Verification of RRC Ki code package for neutronic calculations of WWER core with GD

    International Nuclear Information System (INIS)

    Aleshin, S.S.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Pavlov, V.I.; Pavlovitchev, A.M.; Sidorenko, V.D.; Tsvetkov, V.M.

    2001-01-01

    The report is concerned with the verification results of the TVS-M/PERMAK-A/BIPR-7A code package for WWER neutronic calculations, as applied to the calculation of systems containing U-Gd pins. The verification is based on corresponding benchmark calculations, data from critical experiments, and operation data obtained from WWER units with Gd. The comparison results are discussed (Authors)

  13. Characterization of a dose verification system dedicated to radiotherapy treatments, based on a multi-strip silicon detector

    Energy Technology Data Exchange (ETDEWEB)

    Bocca, A.; Cortes Giraldo, M. A.; Gallardo, M. I.; Espino, J. M.; Aranas, R.; Abou Haidar, Z.; Alvarez, M. A. G.; Quesada, J. M.; Vega-Leal, A. P.; Perez Neto, F. J.

    2011-07-01

    In this paper, we present the characterization of a multi-strip silicon detector (SSSSD: Single Sided Silicon Strip Detector), developed by Micron Semiconductors Ltd., for use as a verification system for radiotherapy treatments.

  14. Towards a Method for Combined Model-based Testing and Analysis

    DEFF Research Database (Denmark)

    Nielsen, Brian

    Efficient and effective verification and validation of complex embedded systems is challenging, and requires the use of various tools and techniques, such as model-based testing and analysis. The aim of this paper is to devise an overall method for how analysis and testing may be used...... in combination to increase the quality of embedded systems, and reduce development cost. The method is centered on common verification planning and iteratively exploiting the established results to strengthen the verification activities. We conclude that the proposed method is general enough to capture most...

  15. An Efficient Topology-Based Algorithm for Transient Analysis of Power Grid

    KAUST Repository

    Yang, Lan

    2015-08-10

    In the design flow of integrated circuits, chip-level verification is an important step that checks that performance is as expected. Power grid verification is one of the most expensive and time-consuming steps of chip-level verification, due to the extremely large size of the grid. Efficient power grid analysis technology is in high demand, as it saves computing resources and enables faster iteration. In this paper, a topology-based power grid transient analysis algorithm is proposed. Nodal analysis is adopted to analyze the topology, which is mathematically equivalent to iteratively solving a positive semi-definite linear system. The convergence of the method is proved.
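
    The paper's solver details are not reproduced in this record; as a generic illustration of iteratively solving the positive semi-definite nodal-analysis system G v = b, a minimal conjugate-gradient sketch in Python:

        import numpy as np

        def conjugate_gradient(G, b, tol=1e-10, max_iter=1000):
            # Iteratively solve G v = b for a symmetric positive
            # (semi-)definite conductance matrix G, as in nodal analysis.
            v = np.zeros_like(b)
            r = b - G @ v            # residual
            p = r.copy()             # search direction
            rs = r @ r
            for _ in range(max_iter):
                Gp = G @ p
                alpha = rs / (p @ Gp)
                v += alpha * p
                r -= alpha * Gp
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p
                rs = rs_new
            return v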

  16. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data, employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is the critical evaluation of neutron interaction data standards, including international coordination. Testing of critical data for important applications is also included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  17. Wu’s Characteristic Set Method for SystemVerilog Assertions Verification

    Directory of Open Access Journals (Sweden)

    Xinyan Gao

    2013-01-01

    Full Text Available We propose a verification solution based on the characteristic set of Wu's method for SystemVerilog assertion checking over digital circuit systems. We define a suitable subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithm framework based on algebraic representations using the characteristic set of a polynomial system. This symbolic algebraic approach is a useful supplement to existing simulation-based verification methods.
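
    The paper's exact polynomial encoding is not given in this record; as a hypothetical illustration of how Boolean gates can be modeled as polynomials (each gate constraint is satisfied exactly when its polynomial evaluates to zero over {0, 1}):

        # Boolean gates as polynomial constraints over {0, 1}.
        def p_not(x, z):     # z = NOT x   ->  z - (1 - x) = 0
            return z - (1 - x)

        def p_and(x, y, z):  # z = x AND y ->  z - x*y = 0
            return z - x * y

        def p_xor(x, y, z):  # z = x XOR y ->  z - (x + y - 2*x*y) = 0
            return z - (x + y - 2 * x * y)

        # A circuit (or an assertion over its signals) is then a polynomial
        # system; algebraic methods such as Wu's characteristic set decide
        # whether the assertion can be violated.
        assert p_xor(1, 0, 1) == 0 and p_and(1, 1, 1) == 0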

  18. Mobile Inquiry Based Learning

    NARCIS (Netherlands)

    Specht, Marcus

    2012-01-01

    Specht, M. (2012, 8 November). Mobile Inquiry Based Learning. Presentation given at the Workshop "Mobile inquiry-based learning" at the Mobile Learning Day 2012 at the Fernuniversität Hagen, Hagen, Germany.

  19. VectorBase

    Data.gov (United States)

    U.S. Department of Health & Human Services — VectorBase is a Bioinformatics Resource Center for invertebrate vectors. It is one of four Bioinformatics Resource Centers funded by NIAID to provide web-based...

  20. The ground based plan

    International Nuclear Information System (INIS)

    1989-01-01

    The paper presents a report on "The Ground Based Plan" of the United Kingdom Science and Engineering Research Council. The ground based plan is a plan for research in astronomy and planetary science by ground based techniques. The report contains a description of: the scientific objectives and technical requirements (the basis for the Plan); the present organisation and funding for the ground based programme; the Plan; the main scientific features; and the further objectives of the Plan. (U.K.)

  1. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
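
    As a hypothetical sketch of the per-requirement verification-plan structure described above (all names are illustrative, not the LSST schema):

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class VerificationActivity:
            # One Verification Method applied to one requirement.
            method: str   # e.g. "Test", "Analysis", "Inspection", "Demonstration"
            level: str    # e.g. "System", "Subsystem"
            owner: str

        @dataclass
        class VerificationPlan:
            # Attached to a single requirement.
            requirement_id: str
            verification_requirement: str
            success_criteria: str
            activities: List[VerificationActivity] = field(default_factory=list)

        @dataclass
        class VerificationEvent:
            # Groups activities that can be executed concurrently.
            name: str
            activities: List[VerificationActivity] = field(default_factory=list)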

  2. Stolen Base Physics

    Science.gov (United States)

    Kagan, David

    2013-01-01

    Few plays in baseball are as consistently close and exciting as the stolen base. While there are several studies of sprinting, the art of base stealing is much more nuanced. This article describes the motion of the base-stealing runner using a very basic kinematic model. The model will be compared to some data from a Major League game. The…

  3. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  4. DCT-based iris recognition.

    Science.gov (United States)

    Monro, Donald M; Rakshit, Soumyadip; Zhang, Dexin

    2007-04-01

    This paper presents a novel iris coding method based on differences of discrete cosine transform (DCT) coefficients of overlapped angular patches from normalized iris images. The feature extraction capabilities of the DCT are optimized on the two largest publicly available iris image data sets, 2,156 images of 308 eyes from the CASIA database and 2,955 images of 150 eyes from the Bath database. On this data, we achieve 100 percent Correct Recognition Rate (CRR) and perfect Receiver-Operating Characteristic (ROC) Curves with no registered false accepts or rejects. Individual feature bit and patch position parameters are optimized for matching through a product-of-sum approach to Hamming distance calculation. For verification, a variable threshold is applied to the distance metric and the False Acceptance Rate (FAR) and False Rejection Rate (FRR) are recorded. A new worst-case metric is proposed for predicting practical system performance in the absence of matching failures, and the worst case theoretical Equal Error Rate (EER) is predicted to be as low as 2.59 × 10^-4 on the available data sets.
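
    The DCT feature extraction and product-of-sum weighting are specific to the paper; as a generic sketch of the thresholded distance-metric verification step on binary iris codes (the threshold value is illustrative):

        import numpy as np

        def hamming_distance(code_a, code_b):
            # Fractional Hamming distance between two binary iris codes.
            code_a, code_b = np.asarray(code_a), np.asarray(code_b)
            return np.count_nonzero(code_a != code_b) / code_a.size

        def verify(code_a, code_b, threshold=0.32):
            # Accept the claimed identity when the codes are close enough;
            # sweeping the threshold trades FAR against FRR.
            return hamming_distance(code_a, code_b) <= threshold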

  5. Symmetry-Based Conflict Detection and Resolution Method towards Web3D-based Collaborative Design

    Directory of Open Access Journals (Sweden)

    Mingjiu Yu

    2016-05-01

    Full Text Available In web3D-based collaborative design, operation conflicts among designers must be prevented; they arise from distributed environments and complex 3D models. Conflict detection and conflict resolution are therefore of great significance for attaining an acceptable result. In order to facilitate effective and smooth design work, a symmetry-based collaborative design framework is proposed using the X3D operation models. Both models and operations are considered, while different operation strategies are utilized for conflict resolution in web-based collaborative design. The strategy achieves automatic operation, real-time conflict detection based on dynamically adjustable timing, and conflict auto-detection and resolution with designers' customization. A proof-of-concept system is developed for verification. The proposed resolution shows good performance, scalability and interactivity in a case study.

  6. Learning a Genetic Measure for Kinship Verification Using Facial Images

    Directory of Open Access Journals (Sweden)

    Lu Kou

    2015-01-01

    Full Text Available Motivated by the key observation that children generally resemble their parents more than they resemble other persons with respect to facial appearance, distance metric (similarity) learning has been the dominant choice for state-of-the-art kinship verification via facial images in the wild. Most existing learning-based approaches to kinship verification, however, are focused on learning a genetic similarity measure in a batch learning manner, leading to less scalability for practical applications with ever-growing amounts of data. To address this, we propose a new kinship verification approach that learns a sparse similarity measure in an online fashion. Experimental results on the kinship datasets show that our approach is highly competitive with the state-of-the-art alternatives in terms of verification accuracy, yet superior in terms of scalability for practical applications.

  7. Towards the molecular bases of polymerase dynamics

    International Nuclear Information System (INIS)

    Chela Flores, J.

    1991-03-01

    One aspect of the strong relationship that is known to exist between the processes of DNA replication and transcription is manifest in the coupling of the rates of movement of the replication fork (r_f) and RNA polymerase (r_t). We address two issues concerning the largely unexplored area of polymerase dynamics: (i) the validity of an approximate kinematic formula linking r_f and r_t, suggested by experiments in which transcription is initiated in some prokaryotes with the antibiotic streptolydigin, and (ii) what are the molecular bases of the kinematic formula? An analysis of the available data suggests possible molecular bases for polymerase dynamics. In particular, we are led to a hypothesis: in active chromatin, r_t may depend on the length (λ_t) of the transcript of the primary messenger RNA (pre-mRNA). This new effect is subject to experimental verification. We discuss possible experiments that may be performed in order to test this prediction. (author). Refs, 6 tabs

  8. Evidence-based hypnotherapy for depression.

    Science.gov (United States)

    Alladin, Assen

    2010-04-01

    Cognitive hypnotherapy (CH) is a comprehensive evidence-based hypnotherapy for clinical depression. This article describes the major components of CH, which integrate hypnosis with cognitive-behavior therapy as the latter provides an effective host theory for the assimilation of empirically supported treatment techniques derived from various theoretical models of psychotherapy and psychopathology. CH meets criteria for an assimilative model of psychotherapy, which is considered to be an efficacious model of psychotherapy integration. The major components of CH for depression are described in sufficient detail to allow replication, verification, and validation of the techniques delineated. CH for depression provides a template that clinicians and investigators can utilize to study the additive effects of hypnosis in the management of other psychological or medical disorders. Evidence-based hypnotherapy and research are encouraged; such a movement is necessary if clinical hypnosis is to integrate into mainstream psychotherapy.

  9. Value-based pricing

    OpenAIRE

    Netseva-Porcheva Tatyana

    2010-01-01

    The main aim of the paper is to present value-based pricing. A comparison between two pricing approaches is therefore made: cost-based pricing and value-based pricing. The 'Price sensitivity meter' is presented. The other topic of the paper is perceived value: its meaning, its components, its determination, and how it can be increased. In addition, the best company strategies in the 'value-cost' matrix are outlined.

  10. Soy-based polyols

    Science.gov (United States)

    Suppes, Galen; Lozada, Zueica; Lubguban, Arnold

    2013-06-25

    The invention provides processes for preparing soy-based oligomeric polyols or substituted oligomeric polyols, as well as urethane bioelastomers comprising the oligomeric polyols or substituted oligomeric polyols.

  11. XML-BASED REPRESENTATION

    Energy Technology Data Exchange (ETDEWEB)

    R. KELSEY

    2001-02-01

    For focused applications with limited user and use application communities, XML can be the right choice for representation. It is easy to use, maintain, and extend and enjoys wide support in commercial and research sectors. When the knowledge and information to be represented is object-based and use of that knowledge and information is a high priority, then XML-based representation should be considered. This paper discusses some of the issues involved in using XML-based representation and presents an example application that successfully uses an XML-based representation.

  12. Imagery Data Base Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Imagery Data Base Facility supports AFRL and other government organizations by providing imagery interpretation and analysis to users for data selection, imagery...

  13. Synthetic Base Fluids

    Science.gov (United States)

    Brown, M.; Fotheringham, J. D.; Hoyes, T. J.; Mortier, R. M.; Orszulik, S. T.; Randles, S. J.; Stroud, P. M.

    The chemical nature and technology of the main synthetic lubricant base fluids is described, covering polyalphaolefins, alkylated aromatics, gas-to-liquid (GTL) base fluids, polybutenes, aliphatic diesters, polyolesters, polyalkylene glycols or PAGs, and phosphate esters. Other synthetic lubricant base oils such as the silicones, borate esters, perfluoroethers and polyphenylene ethers are considered to have restricted applications due to either high cost or performance limitations and are not considered here. Each of the main synthetic base fluids is described in terms of its chemical and physical properties, manufacture and production, chemistry, key properties, applications, and implications when used in the environment.

  14. Strengths-based Learning

    DEFF Research Database (Denmark)

    Ledertoug, Mette Marie

    'Strength-based Learning - Children's Character Strengths as Means to their Learning Potential' is a Ph.D. project aiming to create a strength-based mindset in school settings and at the same time introducing strength-based interventions as specific tools to improve both learning and well-being. The Ph.D. project in strength-based learning took place in a Danish school with 750 pupils aged 6-16, with a similar school functioning as a control group. The presentation will focus on both the aware-explore-apply processes and the practical implications for the schools involved, and on measurable...

  15. Monitoring Knowledge Base (MKB)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Monitoring Knowledge Base (MKB) is a compilation of emissions measurement and monitoring techniques associated with air pollution control devices, industrial...

  16. Case-based reasoning

    CERN Document Server

    Kolodner, Janet

    1993-01-01

    Case-based reasoning is one of the fastest growing areas in the field of knowledge-based systems and this book, authored by a leader in the field, is the first comprehensive text on the subject. Case-based reasoning systems are systems that store information about situations in their memory. As new problems arise, similar situations are searched out to help solve these problems. Problems are understood and inferences are made by finding the closest cases in memory, comparing and contrasting the problem with those cases, making inferences based on those comparisons, and asking questions whe

  17. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process, mainly because of the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that, in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at an earlier stage, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  18. Secure base stations

    NARCIS (Netherlands)

    Bosch, Peter; Brusilovsky, Alec; McLellan, Rae; Mullender, Sape J.; Polakos, Paul

    2009-01-01

    With the introduction of the third generation (3G) Universal Mobile Telecommunications System (UMTS) base station router (BSR) and fourth generation (4G) base stations, such as the 3rd Generation Partnership Project (3GPP) Long Term Evolution (LTE) Evolved Node B (eNB), it has become important to

  19. Mutually unbiased bases

    Indian Academy of Sciences (India)

    Abstract. After a brief review of the notion of a full set of mutually unbiased bases in an N-dimensional Hilbert space, we summarize the work of Wootters and Fields (W K Wootters and B C Fields, Ann. Phys. 191, 363 (1989)) which gives an explicit construction for such bases for the case N = p^r, where p is a prime. Further, we ...

  20. Game-Based Teaching

    DEFF Research Database (Denmark)

    Hanghøj, Thorkild

    2013-01-01

    This chapter outlines theoretical and empirical perspectives on how Game-Based Teaching can be integrated within the context of formal schooling. Initially, this is done by describing game scenarios as models for possible actions that need to be translated into curricular knowledge practices...... approaches to game-based teaching, which may or may not correspond with the pedagogical models of particular games....

  1. Home-based care

    African Journals Online (AJOL)

    Mrs. Patience Edoho Samson-Akpan

    PLWHA. The recommendation was that home-based care should be encouraged and given priority by stakeholders in the management of PLWHA. KEY WORDS: home-based care, quality of life, basic nursing care, psychosocial care. INTRODUCTION. HIV/AIDS is a chronic progressive disease which threatens the quality ...

  2. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Olsen, Ole Fogh; Sporring, Jon

    2006-01-01

    . To address this problem we introduce a novel photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way we preserve the important illumination features......, while eliminating noise. We call our method diffusion based photon mapping....

  3. Generator of text-based assignments for programming courses

    OpenAIRE

    Jager, Mojca

    2013-01-01

    Verifying and assessing knowledge is an important part of education. A teacher can verify knowledge in different ways: classically, orally or in writing, which remain the dominant types, or alternatively, based on the student's current activities. When assembling questions for written verification, many teachers nowadays use test generators such as Hot Potatoes, Moodle, Test Pilot, and others, all available on the internet. At t...

  4. Model-Based Engineering for Supply Chain Risk Management

    Science.gov (United States)

    2015-09-30

    The General Accounting Office has placed ICT Supply Chain Risk Management on its annual "Key Issues, High Risk" list [9]. A primary contributor to... The Architecture Analysis and Design Language (AADL), which has tools for modeling and compliance verification, provides an effective capability to model and describe all component...

  5. A Model Based Security Testing Method for Protocol Implementation

    Directory of Open Access Journals (Sweden)

    Yu Long Fu

    2014-01-01

    Full Text Available The security of a protocol implementation is important and hard to verify. Since penetration testing is usually based on the experience of the security tester and the specific protocol specifications, a formal and automatic verification method is needed. In this paper, we propose an extended model of IOLTS to describe the legal roles and intruders of security protocol implementations, and then combine them to generate suitable test cases for verifying the security of a protocol implementation.
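
    As a hypothetical sketch of an IOLTS-style model (a toy challenge-response role; the paper's extended intruder model is not reproduced here):

        from dataclasses import dataclass, field

        @dataclass
        class IOLTS:
            # Input-Output Labeled Transition System: input actions are
            # prefixed '?', output actions '!'.
            initial: str
            transitions: dict = field(default_factory=dict)  # (state, action) -> state

            def run(self, actions):
                state = self.initial
                for a in actions:
                    if (state, a) not in self.transitions:
                        return None          # trace not accepted by the model
                    state = self.transitions[(state, a)]
                return state

        # A toy challenge-response fragment of a protocol role:
        role = IOLTS(initial="idle", transitions={
            ("idle", "?hello"): "challenged",
            ("challenged", "!nonce"): "waiting",
            ("waiting", "?response"): "verified",
        })
        assert role.run(["?hello", "!nonce", "?response"]) == "verified"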

  6. Skull base tumours

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Alexandra [Instituto Portugues de Oncologia Francisco Gentil, Servico de Radiologia, Rua Professor Lima Basto, 1093 Lisboa Codex (Portugal)], E-mail: borgesalexandra@clix.pt

    2008-06-15

    With the advances of cross-sectional imaging, radiologists have gained increasing responsibility in the management of patients with skull base pathology. As this anatomic area is hidden from clinical exam, surgeons and radiation oncologists have to rely on imaging studies to plan the most adequate treatment. To fulfil this endeavour, radiologists need to be knowledgeable about skull base anatomy and about the main treatment options available, their indications and contra-indications, and need to be aware of the wide gamut of pathologies seen in this anatomic region. This article will provide a radiologist-friendly approach to the central skull base and will review the most common central skull base tumours and tumours intrinsic to the bony skull base.

  7. Context Based Wikipedia Linking

    Science.gov (United States)

    Granitzer, Michael; Seifert, Christin; Zechner, Mario

    Automatically linking Wikipedia pages can be done either content-based, by exploiting word similarities, or structure-based, by exploiting characteristics of the link graph. Our approach follows a content-based strategy, detecting Wikipedia titles as link candidates and selecting the most relevant ones as links. The relevance calculation is based on the context, i.e. the surrounding text of a link candidate. Our goal was to evaluate the influence of the link context on selecting relevant links and on determining a link's best entry point. Results show that a whole Wikipedia page provides the best context for resolving links, and that straightforward inverse-document-frequency-based scoring of anchor texts achieves around 4% less Mean Average Precision on the provided data set.
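
    As a generic sketch of inverse-document-frequency scoring of anchor texts (a plain IDF formulation; the paper's exact weighting is an assumption here):

        import math
        from collections import Counter

        def idf_scores(documents):
            # Inverse document frequency for every term across the corpus;
            # documents are given as lists of terms.
            n = len(documents)
            df = Counter(term for doc in documents for term in set(doc))
            return {term: math.log(n / count) for term, count in df.items()}

        def score_anchor(anchor_terms, idf):
            # Rank a link candidate by the summed IDF of its anchor text.
            return sum(idf.get(term, 0.0) for term in anchor_terms)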

  8. Evidence-based radiography

    International Nuclear Information System (INIS)

    Hafslund, Bjorg; Clare, Judith; Graverholt, Birgitte; Wammen Nortvedt, Monica

    2008-01-01

    Evidence-based practice (EBP) offers the integration of the best research evidence with clinical knowledge and expertise and patient values. EBP is a well known term in health care. This paper discusses the implementation of EBP into radiography and introduces the term evidence-based radiography. Evidence-based radiography is radiography informed and based on the combination of clinical expertise and the best available research-based evidence, patient preferences and resources available. In Norway, EBP in radiography is being debated and radiographers are discussing the challenges of implementing EBP in both academic and clinical practice. This discussion paper explains why EBP needs to be a basis for a radiography curriculum and a part of radiographers' practice. We argue that Norwegian radiographers must increase participation in research and developing practice within their specific radiographic domain

  9. Password Authentication Based on Fractal Coding Scheme

    Directory of Open Access Journals (Sweden)

    Nadia M. G. Al-Saidi

    2012-01-01

    Full Text Available Password authentication is a mechanism used to authenticate user identity over an insecure communication channel. In this paper, a new method to improve the security of password authentication is proposed. It is based on the compression capability of fractal image coding to provide an authorized user secure access to the registration and login process. In the proposed scheme, a hashed password string is generated, encrypted, and captured together with the user identity using text-to-image mechanisms. The advantage of fractal image coding is that the compressed image data can be sent securely through a non-secured communication channel to the server. The verification of client information against the database system is performed on the server to authenticate the legal user. The encrypted hashed password in the decoded fractal image is recognized using optical character recognition. The authentication process is performed after successful verification of the client identity, by comparing the decrypted hashed password with the one stored in the database system. The system is analyzed and discussed from the attacker's viewpoint. A security comparison shows that the proposed scheme provides the essential security requirements, while its efficiency makes it easy to apply alone or in hybrid with other security methods. Computer simulation and statistical analysis are presented.
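
    The fractal coding and OCR stages are specific to the proposed scheme; as a generic illustration of the underlying hash-then-compare step, a minimal salted-hash sketch (the algorithm and parameter choices are assumptions, not the paper's):

        import hashlib, hmac, os

        def register(password: str):
            # Store a salted hash rather than the password itself.
            salt = os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
            return salt, digest

        def verify(password: str, salt: bytes, stored_digest: bytes) -> bool:
            # Recompute the hash and compare in constant time.
            candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
            return hmac.compare_digest(candidate, stored_digest)

        salt, digest = register("s3cret")
        assert verify("s3cret", salt, digest) and not verify("wrong", salt, digest)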

  10. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and radiation monitoring systems. The tasks required to meet these objectives include the following: identify system boundaries or scope for each drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by "smart" drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user data base application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  11. Authoring and verification of clinical guidelines: a model driven approach.

    Science.gov (United States)

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework enabling the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements, and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined on the basis of a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process.
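
    As a hypothetical illustration of the pattern-based requirement specification mentioned above, a small mapping from specification patterns to LTL templates (the pattern names follow common usage; the paper's own pattern set is not reproduced):

        # Property-specification patterns mapped to LTL templates; P and Q
        # stand for guideline propositions (e.g. "drug_prescribed").
        LTL_PATTERNS = {
            "universality": "G (P)",           # P holds throughout
            "absence":      "G (!P)",          # P never holds
            "existence":    "F (P)",           # P eventually holds
            "response":     "G (P -> F (Q))",  # every P is followed by Q
        }

        def instantiate(pattern, P, Q=None):
            # Fill a pattern template with concrete propositions.
            formula = LTL_PATTERNS[pattern].replace("P", P)
            return formula.replace("Q", Q) if Q else formula

        # "every abnormal glucose reading is eventually followed by therapy"
        print(instantiate("response", "abnormal_glucose", "therapy_started"))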

  12. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
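
    As a toy illustration of the swarm idea, many small diversified searches run in parallel and the first to find the target reports it (everything here is illustrative, not the Spin-based implementation):

        from multiprocessing import Pool
        import random

        def search_worker(seed):
            # One swarm member: a randomized walk over a toy state space,
            # with its own seed and its own depth bound.
            rng = random.Random(seed)
            state = 0
            for depth in range(10_000):
                state = (state * 3 + rng.choice([1, 7, 11])) % 100_003
                if state == 42_424:          # the "needle": an error state
                    return seed, depth
            return None

        if __name__ == "__main__":
            with Pool(8) as pool:            # diversify by seed, run in parallel
                for hit in pool.imap_unordered(search_worker, range(64)):
                    if hit is not None:
                        print("worker %d reached the error state at depth %d" % hit)
                        break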

  13. QuickBase

    CERN Document Server

    Conner, Nancy

    2007-01-01

    Ready to put Intuit's QuickBase to work? Our new Missing Manual shows you how to capture, modify, share, and manage data and documents with this web-based data-sharing program quickly and easily. No longer do you have to coordinate your team through a blizzard of emails or play frustrating games of "guess which document is the right one."QuickBase saves your organization time and money, letting you manage and share the information that makes your business tick: sales figures, project timelines, drafts of documents, purchase or work requests--whatever information you need to keep business flowi

  14. Equipment fragility data base

    International Nuclear Information System (INIS)

    Cover, L.E.

    1982-03-01

    Part of the effort of the Seismic Safety Margins Research Program (SSMRP) has been directed at generating a fragility data base for equipment used in control and safety systems in commercial nuclear power plants. Component fragility data exist in various forms, depending on their content, intended use, and level of reduction. The data are stored in a relational data base on the LLNL CDC 7600 computers; this provides easy accessibility for LLNL computer users. This report describes the present structure of the data base and presents its contents through the use of tables.

  15. Value-based pricing

    Directory of Open Access Journals (Sweden)

    Netseva-Porcheva Tatyana

    2010-01-01

    Full Text Available The main aim of the paper is to present value-based pricing. A comparison between two pricing approaches is therefore made: cost-based pricing and value-based pricing. The 'Price sensitivity meter' is presented. The other topic of the paper is perceived value: its meaning, its components, its determination, and how it can be increased. In addition, the best company strategies in the 'value-cost' matrix are outlined.

  16. Design-Based Research

    DEFF Research Database (Denmark)

    Gynther, Karsten; Christensen, Ove; Petersen, Trine Brun

    2012-01-01

    In this article, Design-Based Research is introduced for the first time in Danish in a scientific journal. The article presents the basic assumptions underlying the Design-Based Research tradition and discusses the principles governing the execution of a DBR research project. Taking the research and development project ELYK (E-learning, Peripheral Areas and Cluster Formation) as its point of departure, the article presents the innovation model that the project has developed on the basis of the Design-Based Research tradition. ELYK's DBR innovation model has proven effective with respect to...

  17. Nature-based integration

    DEFF Research Database (Denmark)

    Pitkänen, Kati; Oratuomi, Joose; Hellgren, Daniela

    Increased attention to, and careful planning of, the integration of migrants into Nordic societies is ever more important. Nature-based integration is a new solution to respond to this need. This report presents the results of a Nordic survey and workshop and illustrates current practices of nature-based integration through case study descriptions from Denmark, Sweden, Norway and Finland. Across the Nordic countries, several practical projects and initiatives have been launched to promote the benefits of nature in integration, and there is also growing academic interest in the topic. Nordic countries have the potential to become real forerunners in nature-based integration, even at the global scale.

  18. Cheboygan Vessel Base

    Data.gov (United States)

    Federal Laboratory Consortium — Cheboygan Vessel Base (CVB), located in Cheboygan, Michigan, is a field station of the USGS Great Lakes Science Center (GLSC). CVB was established by congressional...

  19. Biomimetics: nature based innovation

    National Research Council Canada - National Science Library

    Bar-Cohen, Yoseph

    2012-01-01

    "Based on the concept that nature offers numerous sources of inspiration for inventions related to mechanisms, materials, processes, and algorithms, this book covers the topic of biomimetics and the inspired innovation...

  20. Prize for Kelomees at the Basel festival

    Index Scriptorium Estoniae

    2000-01-01

    At the Basel festival "VIPER - International Festival for Film Video and New Media", Gustav Deutsch and Anna Schimek's "Odysee today" was named best CD-ROM and the Italian "01.ORG" best net project; an honourable mention went to Raivo Kelomees' "Videoweaver".

  1. Hanscom Air Force Base

    Data.gov (United States)

    Federal Laboratory Consortium — MIT Lincoln Laboratory occupies 75 acres (20 acres of which are MIT property) on the eastern perimeter of Hanscom Air Force Base, which is at the nexus of Lexington,...

  2. Network-Based Effectiveness

    National Research Council Canada - National Science Library

    Friman, Henrik

    2006-01-01

    ... (extended from Leavitt, 1965). This text identifies aspects of network-based effectiveness that can benefit from a better understanding of leadership and management development of people, procedures, technology, and organizations...

  3. Evidence based practice

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2011-01-01

    Evidence-based practice (EBP) is an influential interdisciplinary movement that originated in medicine as evidence-based medicine (EBM) about 1992. EBP is of considerable interest to library and information science (LIS) because it focuses on a thorough documentation of the basis for the decision...... making that is established in research as well as an optimization of every link in documentation and search processes. EBP is based on the philosophical doctrine of empiricism and, therefore, it is subject to the criticism that has been raised against empiricism. The main criticism of EBP...... is that practitioners lose their autonomy, that the understanding of theory and of underlying mechanisms is weakened, and that the concept of evidence is too narrow in the empiricist tradition. In this article, it is suggested that we should speak of “research-based practice” rather than EBP, because this term is open...

  4. WormBase

    Data.gov (United States)

    U.S. Department of Health & Human Services — WormBase is an international consortium of biologists and computer scientists dedicated to providing the research community with accurate, current, accessible...

  5. Mars 2020 Model Based Systems Engineering Pilot

    Science.gov (United States)

    Dukes, Alexandra Marie

    2017-01-01

    The pilot study is led by the Integration Engineering group in NASA's Launch Services Program (LSP). The Integration Engineering (IE) group is responsible for managing the interfaces between the spacecraft and launch vehicle. This pilot investigates the utility of Model-Based Systems Engineering (MBSE) with respect to managing and verifying interface requirements. The main objectives of the pilot are to model several key aspects of the Mars 2020 integrated operations and interface requirements based on the design and verification artifacts from the Mars Science Laboratory (MSL) and to demonstrate how MBSE could be used by LSP to gain further insight into the interface between the spacecraft and launch vehicle as well as to enhance how LSP manages the launch service. Work on the pilot started with familiarization with SysML, MagicDraw, and the Mars 2020 and MSL systems through books, tutorials, and NASA documentation. MSL was chosen as the focus of the model since its processes and verifications translate easily to the Mars 2020 mission. The study was further focused by modeling specialized systems and processes within MSL in order to demonstrate the utility of MBSE for the rest of the mission. The systems chosen were the In-Flight Disconnect (IFD) system and the Mass Properties process. The IFD was chosen as a system of focus since it is an interface between the spacecraft and launch vehicle which can demonstrate the usefulness of MBSE from a system perspective. The Mass Properties process was chosen as a process of focus since the verifications for mass properties occur throughout the lifecycle and can demonstrate the usefulness of MBSE from a multi-discipline perspective. Several iterations of both perspectives have been modeled and evaluated. While the pilot study will continue for another 2 weeks, pros and cons of using MBSE for LSP IE have been identified. A pro of using MBSE includes an integrated view of the disciplines, requirements, and

  6. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. We wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, we wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at the Idaho National Engineering Laboratory, this system helps protect personnel by reducing the need for vault entries. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. These data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur
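
    As a generic sketch of camera-driven change detection of the kind described (the thresholds are illustrative, not the EIVSystem's):

        import numpy as np

        def change_detected(reference, current, pixel_thresh=25, area_thresh=500):
            # Flag a change in the vault scene when enough pixels differ
            # noticeably between a reference frame and the current frame.
            ref = np.asarray(reference, dtype=np.int16)
            cur = np.asarray(current, dtype=np.int16)
            changed_pixels = np.abs(cur - ref) > pixel_thresh
            return changed_pixels.sum() > area_thresh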

  7. Loyalty Based Investment

    OpenAIRE

    Pei-Hsuan Lee; Ching-Wen Wang

    2010-01-01

    This study investigates the loyalty-based investment behavior in Taiwan. The link between team associations and loyalty has been examined to understand the characteristics of loyalty-based investors. The results revealed that both behavioral loyalty and attitudinal loyalty have significant positive effects on fans' investment intentions. Moreover, both attributes and benefits are significantly and positively related to either aspect of loyalty. In specific, behavioral loyalty has a higher inf...

  8. Game-based telerehabilitation.

    Science.gov (United States)

    Lange, B; Flynn, Sheryl M; Rizzo, A A

    2009-03-01

    This article summarizes the recent accomplishments and current challenges facing game-based virtual reality (VR) telerehabilitation. Specifically this article addresses accomplishments relative to realistic practice scenarios, part to whole practice, objective measurement of performance and progress, motivation, low cost, interaction devices and game design. Furthermore, a description of the current challenges facing game based telerehabilitation including the packaging, internet capabilities and access, data management, technical support, privacy protection, seizures, distance trials, scientific scrutiny and support from insurance companies.

  9. REST based mobile applications

    Science.gov (United States)

    Rambow, Mark; Preuss, Thomas; Berdux, Jörg; Conrad, Marc

    2008-02-01

    Simplicity is the major advantage of REST-based web services. Whereas SOAP is widespread in complex, security-sensitive business-to-business applications, REST is widely used for mashups and end-user-centric applications. In that context we give an overview of REST and compare it to SOAP. Furthermore, we use the GeoDrawing application as an example of a REST-based mobile application and emphasize the pros and cons of using REST in mobile application scenarios.

  10. Participatory design based research

    DEFF Research Database (Denmark)

    Dau, Susanne; Falk, Lars; Jensen, Louise Bach

    This poster reveals how participatory design-based research, using a CoED-inspired creative process, can be used to design solutions to problems regarding students' study activities outside campus.

  11. Knowledge Based Strategy

    OpenAIRE

    Nicolescu, Ovidiu

    2006-01-01

    The paper focuses on the essentials of knowledge-based strategy, taking into consideration theoretical and practical developments during the last 10 years, and especially the knowledge revolution. It demonstrates the specificity of company knowledge-based strategies and their necessity. Beyond this, it presents the essential elements of how to elaborate and implement a knowledge-based strategy, starting from well-known research in the field.

  12. Evidence-Based Development

    DEFF Research Database (Denmark)

    Hertzum, Morten; Simonsen, Jesper

    2004-01-01

    Systems development is replete with projects that represent substantial resource investments but result in systems that fail to meet users’ needs. Evidence-based development is an emerging idea intended to provide means for managing customer-vendor relationships and working systematically toward...... and electronic patient records for diabetes patients, this paper reports research in progress regarding the prospects and pitfalls of evidence-based development....

  13. Model Checking and Model-based Testing in the Railway Domain

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    2015-01-01

    This chapter describes some approaches and emerging trends for verification and model-based testing of railway control systems. We describe state-of-the-art methods and associated tools for verifying interlocking systems and their configuration data, using bounded model checking and k-induction. ...

  14. Research and Implementation of Automatic Fuzzy Garage Parking System Based on FPGA

    Directory of Open Access Journals (Sweden)

    Wang Kaiyu

    2016-01-01

    Full Text Available Because reverse parking is common in real life, this paper presents a fuzzy controller that accommodates front-and-back adjustment of the vehicle's body attitude, uses a chaotic-genetic algorithm to optimize the controller's membership functions, and obtains a vertical-parking fuzzy controller with good simulation results. The paper presents a hardware-software embedded design of the system based on a Field-Programmable Gate Array (FPGA), and sets up a 1:10-scale smart-car platform to verify the fuzzy garage parking system on real hardware. Verification results show that the system can complete the parking task very well.
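
    As a toy illustration of fuzzy control with triangular membership functions and weighted-average defuzzification (the rules and values are illustrative, not the paper's optimized controller):

        def tri(x, a, b, c):
            # Triangular membership function with peak at b.
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def steering_correction(lateral_error_m):
            # Two toy rules defuzzified by a weighted average:
            # "small error -> mild steer", "large error -> strong steer".
            mu_small = tri(abs(lateral_error_m), -0.1, 0.0, 0.8)
            mu_large = tri(abs(lateral_error_m), 0.4, 1.5, 10.0)
            weight = mu_small + mu_large
            if weight == 0.0:
                return 0.0
            angle = (mu_small * 5.0 + mu_large * 28.0) / weight  # degrees
            return -angle if lateral_error_m > 0 else angle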

  15. [Endonasal skull base endoscopy].

    Science.gov (United States)

    Simal-Julián, Juan Antonio; Miranda-Lloret, Pablo; Pancucci, Giovanni; Evangelista-Zamora, Rocío; Pérez-Borredá, Pedro; Sanromán-Álvarez, Pablo; Perez-de-Sanromán, Laila; Botella-Asunción, Carlos

    2013-01-01

    The endoscopic endonasal techniques used in skull base surgery have evolved greatly in recent years. Our objective was to perform a qualitative systematic review of the systematic reviews published in the English-language literature, examining the evidence and conclusions reached in studies comparing transcranial and endoscopic approaches in skull base surgery. We searched the references in the MEDLINE and EMBASE electronic databases, selecting the systematic reviews, meta-analyses and evidence-based medicine reviews on skull base pathologies published from January 2000 until January 2013. We focused on the impact of endoscopy and on comparisons between microsurgical and endoscopic techniques. Fully endoscopic endonasal approaches achieved higher gross total removal rates for craniopharyngiomas and chordomas than transcranial approaches. In anterior skull base meningiomas, complete resections were more frequently achieved after transcranial approaches, with a trend in favour of endoscopy with respect to visual prognosis. Endoscopic endonasal approaches minimised the postoperative complications after the treatment of cerebrospinal fluid (CSF) leaks, encephaloceles, meningoceles, craniopharyngiomas and chordomas, with the exception of postoperative CSF leaks. Randomized multicenter studies are necessary to resolve the controversy over endoscopic and microsurgical approaches in skull base surgery.

  16. LDEF materials data bases

    Science.gov (United States)

    Funk, Joan G.; Strickland, John W.; Davis, John M.

    1993-01-01

    The Long Duration Exposure Facility (LDEF) and the accompanying experiments were composed of and contained a wide variety of materials representing the largest collection of materials flown in low Earth orbit (LEO) and retrieved for ground based analysis to date. The results and implications of the mechanical, thermal, optical, and electrical data from these materials are the foundation on which future LEO space missions will be built. The LDEF Materials Special Investigation Group (MSIG) has been charged with establishing and developing data bases to document these materials and their performance to assure not only that the data are archived for future generations but also that the data are available to the spacecraft user community in an easily accessed, user-friendly form. This paper discusses the format and content of the three data bases developed or being developed to accomplish this task. The hardware and software requirements for each of these three data bases are discussed along with current availability of the data bases. This paper also serves as a user's guide to the MAPTIS LDEF Materials Data Base.

  17. Swarm-based medicine.

    Science.gov (United States)

    Putora, Paul Martin; Oldenburg, Jan

    2013-09-19

    Occasionally, medical decisions have to be taken in the absence of evidence-based guidelines. Other sources can be drawn upon to fill in the gaps, including experience and intuition. Authorities or experts, with their knowledge and experience, may provide further input--known as "eminence-based medicine". Due to the Internet and digital media, interactions among physicians now take place at a higher rate than ever before. With the rising number of interconnected individuals and their communication capabilities, the medical community is obtaining the properties of a swarm. The way individual physicians act depends on other physicians; medical societies act based on their members. Swarm behavior might facilitate the generation and distribution of knowledge as an unconscious process. As such, "swarm-based medicine" may add a further source of information to the classical approaches of evidence- and eminence-based medicine. How to integrate swarm-based medicine into practice is left to the individual physician, but even this decision will be influenced by the swarm.

  18. Mixed-signal methodology guide advanced methodology for AMS IP and SOC design, verification and implementation

    CERN Document Server

    Chen, Jess; Mar, Monte F; Nizic, Mladen; Bailey, Brian

    2012-01-01

    This book, the Mixed-Signal Methodology Guide: Advanced Methodology for AMS IP and SoC Design, Verification, and Implementation, provides a broad overview of the design, verification and implementation methodologies required for today's mixed-signal designs. The book covers mixed-signal design trends and challenges, abstraction of analog using behavioral models, assertion-based metric-driven verification methodology applied to analog and mixed-signal, and verification of low-power intent in mixed-signal design. It also describes methodology for physical implementation in the context of concurrent mixed-signal design and for handling advanced-node physical effects. The book contains many practical examples of models and techniques. The authors believe it should serve as a reference for many analog, digital and mixed-signal designers, verification and physical implementation engineers, and managers in their pursuit of information for a better methodology required to address the challenges of modern mixed-signal design.

  19. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides overall assurance on plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity which starts during the design phase, continues during the construction phase, and is performed in particular during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards-related activity, and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen where necessary to supplement the accountancy data. A complete "design verification" strategy comprises: informing the Agency of any changes in the plant system which are defined as "safeguards relevant"; and re-verification by the Agency of the "design information" upon receiving notice from the Operator of any changes. 13 refs

  20. Image based SAR product simulation for analysis

    Science.gov (United States)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new product simulation method is described that also employs a real SAR input image for image simulation; this can be denoted 'image-based simulation'. Different methods to perform this SAR prediction are presented and their advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit, and the results are compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  1. Characterization of lens based photoacoustic imaging system

    Directory of Open Access Journals (Sweden)

    Kalloor Joseph Francis

    2017-12-01

    Some of the challenges in translating photoacoustic (PA) imaging to clinical applications include the limited view of the target tissue, low signal-to-noise ratio and the high cost of developing real-time systems. Acoustic lens based PA imaging systems, also known as PA cameras, are a potential alternative to conventional imaging systems in these scenarios. The 3D focusing action of the lens enables real-time C-scan imaging with a 2D transducer array. In this paper, we model the underlying physics of a PA camera in the mathematical framework of an imaging system and derive a closed-form expression for the point spread function (PSF). Experimental verification follows, including details on how to design and fabricate the lens inexpensively. The system PSF is evaluated over a 3D volume that can be imaged by this PA camera. Its utility is demonstrated by imaging a phantom and an ex vivo human prostate tissue sample.

  2. A study of applications scribe frame data verifications using design rule check

    Science.gov (United States)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks, and we check at the end that the scribe frame design conforms to the specifications for the alignment and inspection marks. Recently, in COT (customer owned tooling) business and new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We show the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification, which includes pattern matching against the mark library, is performed on the scribe frame data. Our experiments demonstrated that, by use of pattern matching and DRC verification, the new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
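
    To make the flow concrete, the sketch below is a toy illustration (not from the paper) of the kind of geometric rule a DRC run can apply to scribe frame marks. The mark names, coordinates and the 40 um spacing rule are invented for illustration; real rule decks and mark libraries are far richer.

        # Hypothetical spacing check in the spirit of a DRC run on scribe frame
        # marks; names, coordinates and the rule value are made up.
        marks = {
            "scanner_align_L": (100.0, 50.0),   # (x, y) in micrometers
            "scanner_align_R": (130.0, 50.0),
            "inspection_mark": (100.0, 120.0),
        }

        MIN_SPACING_UM = 40.0

        def spacing_violations(marks, min_spacing):
            # Compare every pair of marks against the minimum-spacing rule.
            names = sorted(marks)
            for i, a in enumerate(names):
                for b in names[i + 1:]:
                    (ax, ay), (bx, by) = marks[a], marks[b]
                    dist = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
                    if dist < min_spacing:
                        yield a, b, dist

        for a, b, d in spacing_violations(marks, MIN_SPACING_UM):
            print(f"violation: {a} <-> {b} spaced {d:.1f} um < {MIN_SPACING_UM} um")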

  3. Experimental Comparison of the Behavior between Base Oil and Grease Starvation Based on Inlet Film Thickness

    Directory of Open Access Journals (Sweden)

    D. Kostal

    2017-03-01

    This paper deals with the experimental study of an elastohydrodynamic contact under conditions of insufficient lubricant supply. The starvation level of this type of contact may be determined experimentally from the position of the meniscus, but this approach cannot capture all levels of starvation. Subsequent developments in tribology produced a theoretical model that describes all levels of starvation through their dependency on the thickness of the lubricant film entering the contact, but this model is difficult to verify experimentally. The main goal of this work is an experimental study and description of the behavior of an elastohydrodynamic contact with a controlled lubricant film thickness at the contact inlet. The contact was lubricated by a base oil and by a grease, and the results were compared. Surprisingly, the only differences between the oil and the grease were observed for more viscous lubricants with a thicker film layer entering the contact.

  4. Paper based electronics platform

    KAUST Repository

    Nassar, Joanna Mohammad

    2017-07-20

    A flexible and non-functionalized low-cost paper-based electronic system platform fabricated from common paper, such as paper-based sensors, methods of producing paper-based sensors, and methods of sensing using the paper-based sensors are provided. A method of producing a paper-based sensor can include the steps of: a) providing a conventional paper product to serve as a substrate for the sensor or as an active material for the sensor or both, the paper product not further treated or functionalized; and b) applying a sensing element to the paper substrate, the sensing element selected from the group consisting of a conductive material, the conductive material providing contacts and interconnects, a sensitive material film that exhibits sensitivity to pH levels, a compressible and/or porous material disposed between a pair of opposed conductive elements, or a combination of two or more of said sensing elements. The method of sensing can further include measuring, using the sensing element, a change in resistance, a change in voltage, a change in current, a change in capacitance, or a combination of any two or more thereof.

  5. Design of Test Wrapper Scan Chain Based on Differential Evolution

    Directory of Open Access Journals (Sweden)

    Aijun Zhu

    2013-08-01

    Integrated circuits have entered the era of IP-based SoC (System on Chip) design, which makes IP core reuse a key issue. SoC test wrapper design for scan chains is an NP-hard problem; we propose an algorithm based on Differential Evolution (DE) to design wrapper scan chains. The design of the test wrapper scan chain is achieved through the population's mutation, crossover and selection operations. Experimental verification is carried out on the international standard benchmark ITC’02. The results show that the algorithm can obtain shorter longest wrapper scan chains compared with other algorithms.
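
    As a minimal sketch of the optimization loop (not the authors' implementation), the following Python code runs a standard rand/1/bin Differential Evolution over a hypothetical objective: the length of the longest wrapper scan chain when made-up core scan segments are packed into four chains.

        import random

        def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                                   generations=100):
            dim = len(bounds)
            # Initialize the population uniformly within the bounds.
            pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
            scores = [objective(ind) for ind in pop]
            for _ in range(generations):
                for i in range(pop_size):
                    # Mutation: combine three distinct other individuals (rand/1).
                    a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
                    mutant = [pop[a][k] + F * (pop[b][k] - pop[c][k]) for k in range(dim)]
                    # Crossover: mix mutant and current individual gene-wise (bin).
                    trial = [m if random.random() < CR else x for m, x in zip(mutant, pop[i])]
                    trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
                    # Selection: keep the better of trial and current.
                    s = objective(trial)
                    if s < scores[i]:
                        pop[i], scores[i] = trial, s
            best = min(range(pop_size), key=scores.__getitem__)
            return pop[best], scores[best]

        # Hypothetical objective: minimize the longest of 4 wrapper scan chains
        # when core scan segments (lengths invented here) are assigned to chains.
        segment_lengths = [12, 7, 19, 4, 9, 15, 6, 11]

        def longest_chain(assignment):
            chains = [0] * 4
            for seg_len, gene in zip(segment_lengths, assignment):
                chains[int(gene) % 4] += seg_len   # decode continuous gene to chain index
            return max(chains)

        best, score = differential_evolution(longest_chain, [(0, 4)] * len(segment_lengths))
        print(best, score)

    For these made-up lengths the best achievable longest chain is 22; DE usually gets close to it within the default budget.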

  6. Robust face detection based on components and their topology

    Science.gov (United States)

    Goldmann, Lutz; Mönich, Ullrich; Sikora, Thomas

    2006-01-01

    This paper presents a novel approach for automatic and robust object detection. It utilizes a component-based approach that combines techniques from both the statistical and structural pattern recognition domains. While the component detection relies on Haar-like features and an AdaBoost-trained classifier cascade, the topology verification is based on graph matching techniques. The system was applied to face detection and the experiments show its outstanding performance in comparison to other face detection approaches. Especially in the presence of partial occlusions, uneven illumination and out-of-plane rotations it yields higher robustness.
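
    The component detectors in the paper are custom-trained, but the Haar-plus-AdaBoost detection stage can be illustrated with OpenCV's stock frontal-face cascade; the paper's graph-based topology verification has no off-the-shelf counterpart here, and the image path is a placeholder.

        import cv2

        # Load a Haar cascade (an AdaBoost-trained classifier cascade) that
        # ships with OpenCV, then detect faces in a grayscale image.
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        img = cv2.imread("group_photo.jpg")           # placeholder input image
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

        # Returns (x, y, w, h) boxes; the parameters trade recall vs. false alarms.
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("detected.jpg", img)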

  7. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  8. Runtime Verification with State Estimation

    Science.gov (United States)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
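
    The classic forward algorithm the authors extend can be sketched in a few lines. The numbers below are toy values, not from the paper; one simple way to model a sampling-induced gap is to propagate the state distribution through the transition matrix without weighting by any emission probability.

        import numpy as np

        def forward(obs, A, B, pi):
            """P(observation sequence | HMM) via the forward algorithm.
            A: state transitions (n x n), B: emissions (n x m), pi: initial dist.
            An observation of None marks a sampling gap (nothing observed)."""
            first = obs[0]
            alpha = pi * (B[:, first] if first is not None else 1.0)
            for o in obs[1:]:
                alpha = alpha @ A                    # propagate one time step
                if o is not None:
                    alpha = alpha * B[:, o]          # weight by the emission
            return alpha.sum()

        A  = np.array([[0.9, 0.1], [0.2, 0.8]])      # toy two-state model
        B  = np.array([[0.7, 0.3], [0.1, 0.9]])
        pi = np.array([0.5, 0.5])

        print(forward([0, 1, None, 0], A, B, pi))    # gap at the third step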

  9. A verification regime for the spatial discretization of the SN transport equations

    International Nuclear Information System (INIS)

    Schunert, S.; Azmy, Y.

    2012-01-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure, including the order-of-accuracy test, on a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified SN code, and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type, we record the stage of the verification procedure at which each error is detected and report the frequency with which the errors are correctly identified at various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes, but rarely (1.44% of the time) and under certain circumstances it can fail. (authors)
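
    The core of the order-of-accuracy test is a one-line computation, sketched below with illustrative numbers: given discretization errors against a manufactured solution on two grids refined by a factor r, the observed order is compared with the theoretical order of the scheme.

        import math

        def observed_order(e_coarse, e_fine, r):
            # p = log(e_coarse / e_fine) / log(r) for grid refinement ratio r
            return math.log(e_coarse / e_fine) / math.log(r)

        # A 2nd-order scheme should roughly quarter its error when h is halved.
        print(observed_order(1.0e-3, 2.6e-4, 2.0))   # ~1.94, close to 2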

  10. Cantilever Based Mass Sensing

    DEFF Research Database (Denmark)

    Dohn, Søren

    2007-01-01

    suitable for a portable device and to investigate the possibility of enhancing the functionality and sensitivity of cantilever based mass sensors. A readout method based on the hard contact between the cantilever and a biased electrode placed in close proximity to the cantilever is proposed. The viability...... the mass and position of a particle attached to a cantilever to the resonant frequency. It is shown theoretically possible to find the mass and position of a particle by measurements of the resonant frequency of several bending modes. In the measurements the sensitivity of the cantilever based mass sensor...... is improved when operated at higher bending modes. By measuring the resonant frequency of several bending modes both the mass and position of an attached gold bead are determined....

  11. Problem Based Learning

    DEFF Research Database (Denmark)

    de Graaff, Erik; Guerra, Aida

    Problem-Based Learning (PBL) is an innovative method to organize the learning process in such a way that the students actively engage in finding answers by themselves. During the past 40 years PBL has evolved and diversified, resulting in a multitude of variations in models and practices. However......, the key principles remain the same everywhere. Graaff & Kolmos (2003) identify the main PBL principles as follows: 1. Problem orientation 2. Project organization through teams or group work 3. Participant-directed 4. Experiential learning 5. Activity-based learning 6. Interdisciplinary learning and 7...... in Engineering Education. In answer to requests for visits, the Aalborg Centre for Problem Based Learning in Engineering Science and Sustainability under the auspices of UNESCO (UCPBL) offers a two-day programme for visitors twice a year. The workshop is an introduction workshop to the Aalborg PBL...

  12. Location-based Scheduling

    DEFF Research Database (Denmark)

    Andersson, Niclas; Christensen, Knud

    the predominant scheduling method since it was introduced in the late 1950s. Over the years, CPM has proven to be a very powerful technique for planning, scheduling and controlling projects, which among other things is indicated by the development of a large number of CPM-based software applications available...... on the market. However, CPM is primarily an activity based method that takes the activity as the unit of focus and there is criticism raised, specifically in the case of construction projects, on the method for deficient management of construction work and continuous flow of resources. To seek solutions...... to the identified limitations of the CPM method, an alternative planning and scheduling methodology that includes locations is tested. Location-based Scheduling (LBS) implies a shift in focus, from primarily the activities to the flow of work through the various locations of the project, i.e. the building. LBS uses...

  13. Skull Base Anatomy.

    Science.gov (United States)

    Patel, Chirag R; Fernandez-Miranda, Juan C; Wang, Wei-Hsin; Wang, Eric W

    2016-02-01

    The anatomy of the skull base is complex with multiple neurovascular structures in a small space. Understanding all of the intricate relationships begins with understanding the anatomy of the sphenoid bone. The cavernous sinus contains the carotid artery and some of its branches; cranial nerves III, IV, VI, and V1; and transmits venous blood from multiple sources. The anterior skull base extends to the frontal sinus and is important to understand for sinus surgery and sinonasal malignancies. The clivus protects the brainstem and posterior cranial fossa. A thorough appreciation of the anatomy of these various areas allows for endoscopic endonasal approaches to the skull base.

  14. Mutually unbiased unitary bases

    Science.gov (United States)

    Shaari, Jesni Shamsul; Nasir, Rinie N. M.; Mancini, Stefano

    2016-11-01

    We consider the notion of unitary transformations forming bases for subspaces of M(d, C) such that the square of the Hilbert-Schmidt inner product of matrices from the differing bases is a constant. Moving from the qubit case, we construct the maximal number of such bases for the four- and two-dimensional subspaces while proving the nonexistence of such a construction for the three-dimensional case. Extending this to higher dimensions, we commit to such a construct for the case of qutrits and provide evidence for the existence of such unitaries for prime-dimensional quantum systems. Focusing on the qubit case, we show that the average fidelity for estimating any such transformation is equal to the case for estimating a completely unknown unitary from SU(2). This is then followed by a quick application for such unitaries in a quantum cryptographic setup.
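
    Read as a formula (our paraphrase of the abstract, not notation taken from the paper), the defining property is that for unitaries drawn from two different bases of the subspace the modulus-squared Hilbert-Schmidt overlap is constant:

        \left| \langle U, V \rangle_{\mathrm{HS}} \right|^{2}
          = \left| \operatorname{Tr}\left( U^{\dagger} V \right) \right|^{2} = c
        \qquad \text{for all } U \in \mathcal{B}_i, \; V \in \mathcal{B}_j, \; i \neq j.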

  15. Future directions of nuclear verification

    International Nuclear Information System (INIS)

    Blix, H.

    1997-01-01

    Future directions of nuclear verification are discussed including the following topics: verification of non-proliferation commitments; practicalities of strengthening safeguards; new tasks in nuclear verification

  16. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shared...... a special-purpose storage manager to best utilize sharing in physical storage. We present experimental results supporting these design decisions. It is demonstrated that the new design and implementation improves the efficiency of the current distributed version of UPPAAL by about 60% in time and 80...

  17. DEVELOPMENT OF A MULTIMODAL MONTE CARLO BASED TREATMENT PLANNING SYSTEM.

    Science.gov (United States)

    Kumada, Hiroaki; Takada, Kenta; Sakurai, Yoshinori; Suzuki, Minoru; Takata, Takushi; Sakurai, Hideyuki; Matsumura, Akira; Sakae, Takeji

    2017-10-26

    To establish boron neutron capture therapy (BNCT), the University of Tsukuba is developing a treatment device and the peripheral devices required in BNCT, such as a treatment planning system. We are developing a new multimodal Monte Carlo based treatment planning system (development code: Tsukuba Plan). Tsukuba Plan allows for dose estimation in proton therapy, X-ray therapy and heavy ion therapy in addition to BNCT because the system employs PHITS as the Monte Carlo dose calculation engine. Regarding BNCT, several verifications of the system are being carried out for its practical usage. The verification results demonstrate that Tsukuba Plan allows for accurate estimation of thermal neutron flux and gamma-ray dose as fundamental radiations of dosimetry in BNCT. In addition to the practical use of Tsukuba Plan in BNCT, we are investigating its application to other radiation therapies.

  18. On multivariate Wilson bases

    DEFF Research Database (Denmark)

    Bownik, Marcin; Jakobsen, Mads Sielemann; Lemvig, Jakob

    2017-01-01

    A Wilson system is a collection of finite linear combinations of time frequency shifts of a square integrable function. In this paper we give an account of the construction of bimodular Wilson bases in higher dimensions from Gabor frames of redundancy two.

  19. Inkjet-based micromanufacturing

    CERN Document Server

    Korvink, Jan G; Shin, Dong-Youn; Brand, Oliver; Fedder, Gary K; Hierold, Christofer; Tabata, Osamu

    2012-01-01

    Inkjet-based Micromanufacturing Inkjet technology goes way beyond putting ink on paper: it enables simpler, faster and more reliable manufacturing processes in the fields of micro- and nanotechnology. Modern inkjet heads are per se precision instruments that deposit droplets of fluids on a variety of surfaces in programmable, repeating patterns, allowing, after suitable modifications and adaptations, the manufacturing of devices such as thin-film transistors, polymer-based displays and photovoltaic elements. Moreover, inkjet technology facilitates the large-scale production of flexible RFID tr

  20. Scenario-based strategizing

    DEFF Research Database (Denmark)

    Lehr, Thomas; Lorenz, Ullrich; Willert, Markus

    2017-01-01

    on behavioural strategy as a theoretical lens, we design a yardstick to study the impact of scenario-based strategizing. We then describe our approach, which includes developing scenarios and alternative strategies separately and supporting the strategy selection through an integrated assessment of the goal......-based efficacy and robustness. To facilitate the collaborative strategizing in teams, we propose a matrix with robustness and efficacy as the two axes, which we call the Parmenides Matrix. We assess the impact of the novel approach by applying it in two cases, at a governmental agency (German Environmental...... Ministry) and a firm affected by disruptive change (Bosch, leading global supplier of technology and solutions)....

  1. Iron-based superconductivity

    CERN Document Server

    Johnson, Peter D; Yin, Wei-Guo

    2015-01-01

    This volume presents an in-depth review of experimental and theoretical studies on the newly discovered Fe-based superconductors. Following the Introduction, which places iron-based superconductors in the context of other unconventional superconductors, the book is divided into three sections covering sample growth, experimental characterization, and theoretical understanding. To understand the complex structure-property relationships of these materials, results from a wide range of experimental techniques and theoretical approaches are described that probe the electronic and magnetic properties.

  2. Identity-based encryption

    CERN Document Server

    Chatterjee, Sanjit

    2011-01-01

    Identity Based Encryption (IBE) is a type of public key encryption and has been intensely researched in the past decade. Identity-Based Encryption summarizes the available research for IBE and the main ideas that would enable users to pursue further work in this area. This book will also cover a brief background on Elliptic Curves and Pairings, security against chosen Cipher text Attacks, standards and more. Advanced-level students in computer science and mathematics who specialize in cryptology, and the general community of researchers in the area of cryptology and data security will find Ide

  3. Video-based rendering

    CERN Document Server

    Magnor, Marcus A

    2005-01-01

    Driven by consumer-market applications that enjoy steadily increasing economic importance, graphics hardware and rendering algorithms are a central focus of computer graphics research. Video-based rendering is an approach that aims to overcome the current bottleneck in the time-consuming modeling process and has applications in areas such as computer games, special effects, and interactive TV. This book offers an in-depth introduction to video-based rendering, a rapidly developing new interdisciplinary topic employing techniques from computer graphics, computer vision, and telecommunication engineering.

  4. Evidence-Based Development

    DEFF Research Database (Denmark)

    Hertzum, Morten; Simonsen, Jesper

    2004-01-01

    Systems development is replete with projects that represent substantial resource investments but result in systems that fail to meet users’ needs. Evidence-based development is an emerging idea intended to provide means for managing customer-vendor relationships and working systematically toward...... meeting customer needs. We are suggesting that the effects of the use of a system should play a prominent role in the contractual definition of IT projects and that contract fulfilment should be determined on the basis of evidence of these effects. Based on two ongoing studies of home-care management...

  5. Scenario-based strategizing

    DEFF Research Database (Denmark)

    Lehr, Thomas; Lorenz, Ullrich; Willert, Markus

    2017-01-01

    For over 40 years, scenarios have been promoted as a key technique for forming strategies in uncertain en- vironments. However, many challenges remain. In this article, we discuss a novel approach designed to increase the applicability of scenario-based strategizing in top management teams. Drawing...... on behavioural strategy as a theoretical lens, we design a yardstick to study the impact of scenario-based strategizing. We then describe our approach, which includes developing scenarios and alternative strategies separately and supporting the strategy selection through an integrated assessment of the goal...... Ministry) and a firm affected by disruptive change (Bosch, leading global supplier of technology and solutions)....

  6. Using Graph Transformations and Graph Abstractions for Software Verification

    NARCIS (Netherlands)

    Zambon, Eduardo; Rensink, Arend

    In this paper we describe our intended approach for the verification of software written in imperative programming languages. We base our approach on model checking of graph transition systems, where each state is a graph and the transitions are specified by graph transformation rules. We believe

  7. Using Graph Transformations and Graph Abstractions for Software Verification

    NARCIS (Netherlands)

    Zambon, Eduardo; Ehrig, Hartmut; Rensink, Arend; Rozenberg, Grzegorz; Schurr, Andy

    In this abstract we present an overview of our intended approach for the verification of software written in imperative programming languages. This approach is based on model checking of graph transition systems (GTS), where each program state is modeled as a graph and the exploration engine is

  8. Optimal decision fusion for verification of face sequences

    NARCIS (Netherlands)

    Tao, Q.; Veldhuis, Raymond N.J.; Veldhuis, R.N.J.; Cronie, H.S.

    2007-01-01

    Face sequence contains more information of the user than a single face image. In this paper, optimal decision fusion is proposed to verify the face sequences, based on the original verification system for a single face image. We show by experiments that optimal decision fusion is a simple but

  9. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.

    2013-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however...

  10. Verification of flood damage modelling using insurance data

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.

    2012-01-01

    This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, however...

  11. BProVe: Tool support for business process verification

    DEFF Research Database (Denmark)

    Corradini, Flavio; Fornari, Fabrizio; Polini, Andrea

    2017-01-01

    This demo introduces BProVe, a tool supporting automated verification of Business Process models. BProVe analysis is based on a formal operational semantics defined for the BPMN 2.0 modelling language, and is provided as a freely accessible service that uses open standard formats as input data...

  12. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  13. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  14. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the modeling languages of different verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  15. Program verification using symbolic game semantics

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar

    2014-01-01

    We introduce a new symbolic representation of algorithmic game semantics, and show how it can be applied for efficient verification of open (incomplete) programs. The focus is on an Algol-like programming language which contains the core ingredients of imperative and functional languages...... of game semantics to that of corresponding symbolic representations. In this way programs with infinite data types, such as integers, can be expressed as finite-state symbolic-automata although the standard automata representation is infinite-state, i.e. the standard regular-language representation has...... infinite summations. Moreover, in this way significant reductions of the state space of game semantics models are obtained. This enables efficient verification of programs by our prototype tool based on symbolic game models, which is illustrated with several examples....

  16. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  17. Molecule-based magnets

    Indian Academy of Sciences (India)

    The design of molecule-based magnets has also been extended to the design of poly-functional molecular magnets, such as those exhibiting second-order optical nonlinearity, liquid crystallinity, or chirality simultaneously with long-range magnetic order. Solubility, low density and biocompatibility are attractive features of ...

  18. Project-Based Science

    Science.gov (United States)

    Krajcik, Joe

    2015-01-01

    Project-based science is an exciting way to teach science that aligns with the "Next Generation Science Standards" ("NGSS"). By focusing on core ideas along with practices and crosscutting concepts, classrooms become learning environments where teachers and students engage in science by designing and carrying out…

  19. based gel polymer electrolytes

    Indian Academy of Sciences (India)

    Bull. Mater. Sci., Vol. 29, No. 7, December 2006, pp. 673–678. © Indian Academy of Sciences. Investigation on poly(vinylidene fluoride) based gel polymer electrolytes ... (Alamgir and Abraham 1993; Sukeshini et al 1996; Rajendran and Uma ... Yang et al 1996; Ramesh and Arof 2001) and such electrolytes exhibit ...

  20. Wrought cobalt- base superalloys

    Science.gov (United States)

    Klarstrom, D. L.

    1993-08-01

    Wrought cobalt-base superalloys are used extensively in gas turbine engines because of their excellent high-temperature creep and fatigue strengths and resistance to hot corrosion attack. In addition, the unique character of the oxide scales that form on some of the alloys provides outstanding resistance to high-temperature sliding wear. This article provides a review of the evolutionary development of wrought cobalt-base alloys in terms of alloy design and physical metallurgy. The topics include solid-solution strengthening, carbide precipitation characteristics, and attempts to introduce age hardening. The use of PHACOMP to enhance thermal stability characteristics and the incorporation of rare-earth elements to improve oxidation resistance is also reviewed and discussed. The further development of cobalt-base superalloys has been severely hampered by past political events, which have accentuated the strategic vulnerability of cobalt as a base or as an alloying element. Consequently, alternative alloys have been developed that use little or no cobalt. One such alternative, Haynes® 230™ alloy, is discussed briefly.

  1. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature by observing an apparent angular shift in an interference fringe pattern produced by back or forward scattering interferometry, ambiguities in the measurement caused...

  2. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    In a method for performing a refractive index based measurement of a property of a fluid such as chemical composition or temperature, a chirp in the local spatial frequency of interference fringes of an interference pattern is reduced by mathematical manipulation of the recorded light intensity...

  3. Refractive index based measurements

    DEFF Research Database (Denmark)

    2014-01-01

    A refractive index based measurement of a property of a fluid is measured in an apparatus comprising a variable wavelength coherent light source (16), a sample chamber (12), a wavelength controller (24), a light sensor (20), a data recorder (26) and a computation apparatus (28), by - directing...

  4. Community-Based Care

    Science.gov (United States)


  5. Mutually unbiased bases

    Indian Academy of Sciences (India)

    Mutually unbiased bases play an important role in quantum cryptography [2] and in the optimal determination of the density operator of an ensemble [3,4]. A density operator ρ in N dimensions depends on N² − 1 real quantities. With the help of MUBs, any such density operator can be encoded, in an optimal way, in terms of ...
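
    For reference (a standard definition, not text from this record), two orthonormal bases {|e_i⟩} and {|f_j⟩} of an N-dimensional Hilbert space are mutually unbiased when every overlap has the same magnitude:

        \left| \langle e_i | f_j \rangle \right|^{2} = \frac{1}{N}
        \qquad \text{for all } i, j,

    so a measurement in one basis reveals nothing about a state prepared in the other.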

  6. Lunar Base Siting

    Science.gov (United States)

    Staehle, Robert L.; Burke, James D.; Snyder, Gerald C.; Dowling, Richard; Spudis, Paul D.

    1993-01-01

    Speculation with regard to a permanent lunar base has been with us since Robert Goddard was working on the first liquid-fueled rockets in the 1920's. With the infusion of data from the Apollo Moon flights, a once speculative area of space exploration has become an exciting possibility. A Moon base is not only a very real possibility, but is probably a critical element in the continuation of our piloted space program. This article, originally drafted by World Space Foundation volunteers in conjunction with various academic and research groups, examines some of the strategies involved in selecting an appropriate site for such a lunar base. Site selection involves a number of complex variables, including raw materials for possible rocket propellant generation, hot and cold cycles, view of the sky (for astronomical considerations, among others), geological makeup of the region, and more. This article summarizes the key base siting considerations and suggests some alternatives. Availability of specific resources, including energy and certain minerals, is critical to success.

  7. Problem-based learning

    NARCIS (Netherlands)

    Loyens, Sofie; Kirschner, Paul A.; Paas, Fred

    2010-01-01

    Loyens, S. M. M., Kirschner, P. A., & Paas, F. (2011). Problem-based learning. In S. Graham (Editor-in-Chief), A. Bus, S. Major, & L. Swanson (Associate Editors), APA educational psychology handbook: Vol. 3. Application to learning and teaching (pp. 403-425). Washington, DC: American Psychological

  8. ISFET based enzyme sensors

    NARCIS (Netherlands)

    van der Schoot, Bart H.; Bergveld, Piet

    1987-01-01

    This paper reviews the results that have been reported on ISFET based enzyme sensors. The most important improvement that results from the application of ISFETs instead of glass membrane electrodes is in the method of fabrication. Problems with regard to the pH dependence of the response and the

  9. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...

  10. Financing Competency Based Programs.

    Science.gov (United States)

    Daniel, Annette

    Literature on the background, causes, and current prevalence of competency based programs is synthesized in this report. According to one analysis of the actual and probable costs of minimum competency testing, estimated costs for test development, test administration, bureaucratic structures, and remedial programs for students who cannot pass the…

  11. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Fogh Olsen, Ole; Sporring, Jon

    2007-01-01

    . To address this problem we introduce a novel photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way we preserve the important illumination features...

  12. Graphene-based spaser

    Science.gov (United States)

    Berman, Oleg; Kezerashvili, Roman; Lozovik, Yurii

    2013-03-01

    We propose a graphene-based surface plasmon amplification by stimulated emission of radiation (spaser) formed in a graphene nanoribbon located near a semiconductor quantum dot (QD). The population inversion of the two electron levels of the QD can be achieved by applying an external electric current or laser pumping. If the frequency of the dipole plasmon resonance in a graphene nanoribbon comes into resonance with the transition frequency of the QD, it is possible to excite plasmons and generate coherent surface plasmon states in the graphene nanoribbon. Therefore, the oscillating dipole in the QD excites coherent surface plasmons in the graphene nanoribbon. By solving the system of equations for the number of coherent localized plasmons in a graphene-based spaser, the optimal design, the optimal width of the graphene nanoribbon and the optimal regime for the graphene-based spaser are found. The minimal size and minimal threshold pumping intensity for the graphene-based spaser are obtained. The advantage of using graphene for the spaser is discussed.

  13. Molecule-based magnets

    Indian Academy of Sciences (India)

    Administrator

    Being weakly coloured, unlike their opaque classical magnet 'cousins' listed above, possibilities of ..... which confers a memory effect on it. Stronger coercive fields are expected for Co²⁺-based molecular ... of them a colour change, too, occurs reversibly and simultaneously with the change in magnetic properties at.

  14. Fundamental research data base

    Science.gov (United States)

    1983-01-01

    A fundamental research data base containing ground truth, image, and Badhwar profile feature data for 17 North Dakota, South Dakota, and Minnesota agricultural sites is described. Image data was provided for a minimum of four acquisition dates for each site and all four images were registered to one another.

  15. Base tree property

    Czech Academy of Sciences Publication Activity Database

    Balcar, B.; Doucha, Michal; Hrušák, M.

    2015-01-01

    Vol. 32, No. 1 (2015), pp. 69-81 ISSN 0167-8094 R&D Projects: GA AV ČR IAA100190902 Institutional support: RVO:67985840 Keywords: forcing * Boolean algebras * base tree Subject RIV: BA - General Mathematics Impact factor: 0.614, year: 2015 http://link.springer.com/article/10.1007/s11083-013-9316-2

  16. Steel column base classification

    OpenAIRE

    Jaspart, J.P.; Wald, F.; Weynand, K.; Gresnigt, A.M.

    2008-01-01

    The influence of the rotational characteristics of the column bases on the structural frame response is discussed and specific design criteria for stiffness classification into semi-rigid and rigid joints are derived. The particular case of an industrial portal frame is then considered.

  17. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.; Sarfraz, M.

    2004-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  18. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.; Banissi, E.; Khosrowshahi, F.; Sarfraz, M.; Ursyn, A.

    2001-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  19. based gel polymer electrolytes

    Indian Academy of Sciences (India)

    operating systems. With this situation, attempts have been made in poly (ethylene oxide) (PEO) based polymer electrolytes to reach an appreciable electrical conductivity at ambient temperature (Wright 1975; Martuscelli et al 1984). Generally solid polymer electrolytes have many advantages, viz. high ionic conductivity, ...

  20. Dictionary Based Image Segmentation

    DEFF Research Database (Denmark)

    Dahl, Anders Bjorholm; Dahl, Vedrana Andersen

    2015-01-01

    We propose a method for weakly supervised segmentation of natural images, which may contain both textured or non-textured regions. Our texture representation is based on a dictionary of image patches. To divide an image into separated regions with similar texture we use an implicit level sets...

  1. Molecule-based magnets

    Indian Academy of Sciences (India)

    Administrator

    SmCo5, Nd2Fe14B etc are all atom-based, and their preparation/processing require high temperature routes. Employing ... synthesis of molecular magnets in 1986, a large variety of them have been synthesized, which can be categorized on the basis of the ..... magnetic behaviour of Fe[Fe(CN)6]⋅xH2O nanoparticles.

  2. REST based service composition

    DEFF Research Database (Denmark)

    Grönvall, Erik; Ingstrup, Mads; Pløger, Morten

    2011-01-01

    This paper presents an ongoing work developing and testing a Service Composition framework based upon the REST architecture named SECREST. A minimalistic approach has been favored instead of creating a complete infrastructure. One focus has been on the system's interaction model. Indeed, an ai...

  3. Zero-Base Budgeting.

    Science.gov (United States)

    Harvey, L. James

    The concept of Zero-Base Budgeting (ZBB) is discussed in terms of its application, advantages, disadvantages, and implementation in an effective planning, management, and evaluation (PME) system. A ZBB system requires administrators to: review all programs and expenditures annually, set clear-cut goals, and analyze all possible alternatives for…

  4. Zero-Base Budgeting.

    Science.gov (United States)

    Yagielski, John

    In outline form, this document presents basic information on the school district, the reasons the district considered zero-base budgeting (ZBB), the formation and membership of the advisory School Cost Analysis Team, the district's investigation of the ZBB concept, an overview of the ways the district used the ZBB process, the identification of…

  5. Internet Based Benchmarking

    OpenAIRE

    Bogetoft, Peter; Nielsen, Kurt

    2002-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as non-parametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore alternative improvement strategies. Implementations of both a parametric and a non-parametric model are presented.

  6. Mutually unbiased bases

    Indian Academy of Sciences (India)

    ... we show how, by exploiting certain freedom in the Wootters–Fields construction, the task of explicitly writing down such bases can be simplified for the case when the dimension is an odd prime. In particular, we express the results entirely in terms of the character vectors of the cyclic group of that order. We also analyse the connection ...

  7. Surfel Based Geometry Reconstruction

    DEFF Research Database (Denmark)

    Andersen, Vedrana; Aanæs, Henrik; Bærentzen, Jakob Andreas

    2010-01-01

    We propose a method for retrieving a piecewise smooth surface from noisy data. In data acquired by a scanning process sampled points are almost never on the discontinuities making reconstruction of surfaces with sharp features difficult. Our method is based on a Markov Random Field (MRF) formulat...

  8. Performance based fault diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2002-01-01

    Different aspects of fault detection and fault isolation in closed-loop systems are considered. It is shown that using the standard setup known from feedback control, it is possible to formulate fault diagnosis problems based on a performance index in this general standard setup. It is also shown...

  9. Cotton-based nonwovens

    Science.gov (United States)

    This article is an abbreviated description of a new cotton-based nonwovens research program at the Southern Regional Research Center, which is one of the four regional research centers of the Agricultural Research Service, U.S. Department of Agriculture. Since cotton is a significant cash crop inte...

  10. Diffusion Based Photon Mapping

    DEFF Research Database (Denmark)

    Schjøth, Lars; Sporring, Jon; Fogh Olsen, Ole

    2008-01-01

    . To address this problem, we introduce a photon mapping algorithm based on nonlinear anisotropic diffusion. Our algorithm adapts according to the structure of the photon map such that smoothing occurs along edges and structures and not across. In this way, we preserve important illumination features, while...

  11. Spiritual-based Leadership

    DEFF Research Database (Denmark)

    Pruzan, Peter

    2015-01-01

    Although far from mainstream, the concept of spiritual-based leadership is emerging as an inclusive and yet highly personal approach to leadership that integrates a leader’s inner perspectives on identity, purpose, responsibility and success with her or his decisions and actions in the outer world...

  12. Visualization of Instrumental Verification Information Details (VIVID) : code development, description, and usage.

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Christopher John; Bainbridge, Bruce L.; Potter, Donald L.; Blottner, Frederick G.; Black, Amalia Rebecca

    2005-03-01

    The formulation, implementation and usage of a numerical solution verification code are described. This code uses the Richardson extrapolation procedure to estimate the order of accuracy and error of a computational program solution. It evaluates multiple solutions performed in numerical grid convergence studies to verify a numerical algorithm implementation. Analyses are performed on both structured and unstructured grid codes. Finite volume and finite element discretization programs are examined. Two- and three-dimensional solutions are evaluated. Steady state and transient solution analysis capabilities are present in the verification code. Multiple input databases are accepted. Benchmark options are included to allow for minimal solution validation capability as well as verification.
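
    The Richardson extrapolation procedure named above has a standard textbook form (this is not code or notation excerpted from the tool): with solutions f_fine and f_coarse on grids related by refinement ratio r and a scheme of observed order p,

        f_{\text{exact}} \approx f_{\text{fine}}
          + \frac{f_{\text{fine}} - f_{\text{coarse}}}{r^{p} - 1},
        \qquad
        p = \frac{\ln\left( (f_3 - f_2)/(f_2 - f_1) \right)}{\ln r},

    where f_1, f_2, f_3 are solutions on successively coarser grids with constant r; the second relation is the estimated order of accuracy such a code reports.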

  13. Verification of EPA's " Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    Energy Technology Data Exchange (ETDEWEB)

    Stagich, B. H. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-03-29

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems in obtaining solutions, as well as to ensure that the equations are programmed correctly.

  14. Verification of structural analysis computer codes in nuclear engineering

    International Nuclear Information System (INIS)

    Zebeljan, Dj.; Cizelj, L.

    1990-01-01

    Sources of potential errors that can arise during the use of computer programs based on the finite element method are described in the paper. The magnitude of errors was defined as the acceptance criterion for those programs. Error sources are described as they are treated by the National Agency for Finite Element Methods and Standards (NAFEMS). Specific verification examples are taken from the literature of the Nuclear Regulatory Commission (NRC). An example verification is performed on the PAFEC-FE computer code for seismic response analyses of piping systems by the response spectrum method. (author)

  15. Practical considerations in the verification of personal sound exposure meters

    Science.gov (United States)

    de Arcas, G.; López, J. M.; Ruiz, M.; Recuero, M.

    2007-06-01

    This paper analyses the problems that appear when calibration or verification of personal sound exposure meters is done following the recommendations of IEC 61252. The tests recommended in the standard to verify the characteristics of such instruments are discussed and alternative procedures are proposed to solve the problems detected. The two main problems detected are related to the duration of the process, which has a direct effect on the verification cost, and the measurement uncertainty. The use of procedures based on measurement modes that are not dependent on integration times, such as when measuring sound pressure level, is recommended whenever possible.

  16. A hand held photo identity verification system for mobile applications

    International Nuclear Information System (INIS)

    Kumar, Ranajit; Upreti, Anil; Mahaptra, U.; Bhattacharya, S.; Srivastava, G.P.

    2009-01-01

    A handheld portable system has been developed for mobile personnel identity verification. The system consists of a contactless RF smart card reader integrated with a Simputer through a serial link. The Simputer verifies the card data against its database and aids the security operator in identifying persons by providing the facial image of the verified person along with other personal details like name, designation, division etc. All transactions are recorded in the Simputer with time and date for future reference. This system finds extensive applications in mobile identity verification in nuclear and other industries. (author)

  17. Convolution based profile fitting

    International Nuclear Information System (INIS)

    Kern, A.; Coelho, A.A.; Cheary, R.W.

    2002-01-01

    Full text: In convolution based profile fitting, profiles are generated by convoluting functions together to form the observed profile shape. For a convolution of n functions this process can be written as Y(2θ) = F₁(2θ) ⊗ F₂(2θ) ⊗ … ⊗ Fᵢ(2θ) ⊗ … ⊗ Fₙ(2θ), where ⊗ denotes convolution. In powder diffractometry the functions Fᵢ(2θ) can be interpreted as the aberration functions of the diffractometer, but in general any combination of appropriate functions for Fᵢ(2θ) may be used in this context. Most direct convolution fitting methods are restricted to combinations of Fᵢ(2θ) that can be convoluted analytically (e.g. GSAS), such as Lorentzians, Gaussians, the hat (impulse) function and the exponential function. However, software such as TOPAS is now available that can accurately convolute and refine a wide variety of profile shapes numerically, including user-defined profiles, without the need to convolute analytically. Some of the most important advantages of modern convolution based profile fitting are: 1) virtually any peak shape and angle dependence can normally be described using minimal profile parameters in laboratory and synchrotron X-ray data as well as in CW and TOF neutron data. This is possible because numerical convolution and numerical differentiation are used within the refinement procedure, so that a wide range of functions can easily be incorporated into the convolution equation; 2) it can use physically based diffractometer models by convoluting the instrument aberration functions. This can be done for most laboratory-based X-ray powder diffractometer configurations, including conventional divergent beam instruments, parallel beam instruments, and diffractometers used for asymmetric diffraction. It can also accommodate various optical elements (e.g. multilayers and monochromators) and detector systems (e.g. point and position sensitive detectors) and has already been applied to neutron powder diffraction systems (e.g. ANSTO) as well as synchrotron based
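
    As a minimal illustration of the core operation, the sketch below numerically convolutes two component profiles (a Gaussian standing in for the instrument contribution, a Lorentzian for the sample contribution) to form an observed peak Y(2θ). The grid, widths and normalisation are illustrative assumptions, not TOPAS internals.

        import numpy as np

        # Observed profile built by numerical convolution of component
        # functions, the basic step of convolution-based profile fitting.

        two_theta = np.linspace(-2.0, 2.0, 801)   # degrees, centred on the peak
        step = two_theta[1] - two_theta[0]

        gauss = np.exp(-0.5 * (two_theta / 0.05) ** 2)    # instrument part
        lorentz = 1.0 / (1.0 + (two_theta / 0.08) ** 2)   # sample part

        observed = np.convolve(gauss, lorentz, mode="same") * step  # Y(2θ)
        observed /= observed.max()                                  # normalise

        print(f"Grid points above half maximum: {(observed > 0.5).sum()}")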

  18. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The paper considers why verification of software products is necessary throughout the software life cycle. Concepts of verification, software verification planning, and verification methodologies for the products generated throughout the life cycle are then discussed.

  19. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document, and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level, and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained of their initial resistance values. As the work progresses, monthly checks will be made to ensure continued sensor health, and after the MFTF-B demonstration, yearly checks will be performed, as well as checks of individual sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE database computer system
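
    A minimal sketch of the periodic health check described above, assuming a simple tolerance on drift from each sensor's recorded installation resistance; the sensor names, baselines and 2% tolerance are illustrative assumptions.

        # Compare current sensor resistances against installation baselines
        # and flag any that have drifted beyond a tolerance.

        BASELINE_OHMS = {"TE-101": 108.3, "SG-014": 350.1, "LL-007": 55.9}
        TOLERANCE = 0.02  # flag drift beyond 2 % of baseline

        def check_sensors(current_readings):
            """Return (sensor, baseline, reading) tuples for drifting sensors."""
            suspect = []
            for sensor, reading in current_readings.items():
                baseline = BASELINE_OHMS[sensor]
                if abs(reading - baseline) / baseline > TOLERANCE:
                    suspect.append((sensor, baseline, reading))
            return suspect

        # SG-014 has drifted ~3 % and is flagged for follow-up.
        print(check_sensors({"TE-101": 108.4, "SG-014": 361.0, "LL-007": 55.8}))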

  20. Calibration and verification of surface contamination meters --- Procedures and techniques

    International Nuclear Information System (INIS)

    Schuler, C.; Butterweck, G.; Wernli, C.; Bochud, F.; Valley, J.-F.

    2007-03-01

    A standardised measurement procedure for surface contamination meters (SCM) is presented. The procedure aims at rendering surface contamination measurements simple and safe to interpret. Essential for the approach is the introduction and common use of the radionuclide-specific quantity 'guideline value', specified in the Swiss Radiation Protection Ordinance, as the unit for the measurement of surface activity. The corresponding radionuclide-specific 'guideline value count rate' can be summarized as a verification reference value for a group of radionuclides ('basis guideline value count rate'). The concept can be generalized for SCM of the same type, or for SCM of different types using the same principle of detection. An SCM multi-source calibration technique is applied for the determination of the instrument efficiency. Four different electron radiation energy regions, four different photon radiation energy regions and an alpha radiation energy region are represented by a set of calibration sources built according to ISO standard 8769-2. A guideline value count rate, representing the activity per unit area of a surface contamination of one guideline value, can be calculated for any radionuclide using instrument efficiency, radionuclide decay data, contamination source efficiency, guideline value averaging area (100 cm²), and the radionuclide-specific guideline value. In this way, instrument responses for the evaluation of surface contaminations are obtained for radionuclides without available calibration sources, as well as for short-lived radionuclides, for which the continuous replacement of certified calibration sources can lead to unreasonable costs. SCM verification is based on surface emission rates of reference sources with an active area of 100 cm². The verification for a given list of radionuclides is based on the radionuclide-specific quantity guideline value count rate. Guideline value count rates for groups of radionuclides can be represented within the maximum
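
    A minimal sketch of the guideline value count rate calculation, assuming the factors named above combine multiplicatively; the nuclide data are illustrative assumptions, not values from the Swiss Radiation Protection Ordinance or the paper's tables.

        # Count rate an SCM should read on a surface contaminated at exactly
        # one guideline value, from the factors listed in the abstract.

        AVERAGING_AREA_CM2 = 100.0  # guideline value averaging area

        def guideline_value_count_rate(guideline_value_bq_per_cm2,
                                       emission_probability,
                                       source_efficiency,
                                       instrument_efficiency):
            """Counts per second corresponding to one guideline value."""
            emission_rate = (guideline_value_bq_per_cm2 * AVERAGING_AREA_CM2
                             * emission_probability * source_efficiency)
            return emission_rate * instrument_efficiency

        # Hypothetical beta emitter: GV 3 Bq/cm2, emission probability 1.0,
        # source efficiency 0.5, instrument efficiency 0.35.
        print(guideline_value_count_rate(3.0, 1.0, 0.5, 0.35), "counts/s")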