WorldWideScience

Sample records for ground based verification

  1. Solar energy prediction and verification using operational model forecasts and ground-based solar measurements

    International Nuclear Information System (INIS)

    Kosmopoulos, P.G.; Kazadzis, S.; Lagouvardos, K.; Kotroni, V.; Bais, A.

    2015-01-01

    The present study focuses on the prediction of solar energy, and on the verification of these predictions, using ground-based solar measurements from the Hellenic Network for Solar Energy and the National Observatory of Athens network, as well as solar radiation operational forecasts provided by the MM5 mesoscale model. The evaluation was carried out independently for the different networks, for two forecast horizons (1 and 2 days ahead), for the seasons of the year, for varying solar elevation, for the indicative energy potential of the area, and for four classes of cloud cover based on the calculated clearness index (k_t): CS (clear sky), SC (scattered clouds), BC (broken clouds) and OC (overcast). The seasonal dependence presented relative RMSE (rRMSE, Root Mean Square Error) values ranging from 15% (summer) to 60% (winter), while the solar elevation dependence revealed high effectiveness and reliability near local noon (rRMSE ∼30%). An increase of the errors with cloudiness was also observed. For CS with mean GHI (global horizontal irradiance) ∼650 W/m² the errors are 8%, for SC 20%, and for BC and OC the errors were greater (>40%) but correspond to much lower radiation levels (<120 W/m²) and consequently lower energy potential impact. The total energy potential for each ground station ranges from 1.5 to 1.9 MWh/m², while the mean monthly forecast error was found to be consistently below 10%. - Highlights: • Long term measurements at different atmospheric cases are needed for energy forecasting model evaluations. • The total energy potential at the Greek sites presented ranges from 1.5 to 1.9 MWh/m². • Mean monthly energy forecast errors are within 10% for all cases analyzed. • Cloud presence results in an additional forecast error that varies with the cloud cover.
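
    For orientation, a minimal sketch of the two quantities this evaluation rests on: the relative RMSE and a clearness-index sky classification. The k_t class boundaries below are illustrative placeholders, not the paper's values.

```python
import numpy as np

def rrmse(forecast, measured):
    """Relative RMSE in percent: RMSE normalised by the mean measured GHI."""
    forecast, measured = np.asarray(forecast), np.asarray(measured)
    rmse = np.sqrt(np.mean((forecast - measured) ** 2))
    return 100.0 * rmse / np.mean(measured)

def sky_class(ghi, ghi_clear):
    """Classify sky condition from the clearness index k_t = GHI / GHI_clear.
    Boundaries are illustrative; the study defines its own four classes."""
    kt = ghi / ghi_clear
    if kt > 0.8:
        return "CS"  # clear sky
    if kt > 0.6:
        return "SC"  # scattered clouds
    if kt > 0.3:
        return "BC"  # broken clouds
    return "OC"      # overcast
```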

  2. Development and verification of ground-based tele-robotics operations concept for Dextre

    Science.gov (United States)

    Aziz, Sarmad

    2013-05-01

    The Special Purpose Dexterous Manipulator (Dextre) is the latest addition to the on-orbit segment of the Mobile Servicing System (MSS), Canada's contribution to the International Space Station (ISS). Launched in March 2008, the advanced two-armed robot is designed to perform various ISS maintenance tasks on robotically compatible elements and on-orbit replaceable units using a wide variety of tools and interfaces. The addition of Dextre has increased the capabilities of the MSS, and has introduced significant complexity to ISS robotics operations. While the initial operations concept for Dextre was based on human-in-the-loop control by the on-orbit astronauts, the complexities of robotic maintenance and the associated costs of training and maintaining the operator skills required for Dextre operations demanded a reexamination of the old concepts. A new approach to ISS robotic maintenance was developed in order to utilize the capabilities of Dextre safely and efficiently, while at the same time reducing the costs of on-orbit operations. This paper will describe the development, validation, and on-orbit demonstration of the operations concept for ground-based tele-robotics control of Dextre. It will describe the evolution of the new concepts from the experience gained from the development and implementation of the ground control capability for the Space Station Remote Manipulator System, Canadarm2. It will discuss the various technical challenges faced during the development effort, such as requirements for high positioning accuracy, force/moment sensing and accommodation, failure tolerance, complex tool operations, and the novel operational tools and techniques developed to overcome them. The paper will also describe the work performed to validate the new concepts on orbit and will discuss the results and lessons learned from the on-orbit checkout and commissioning of Dextre using the newly developed tele-robotics techniques and capabilities.

  3. Assessment of surface solar irradiance derived from real-time modelling techniques and verification with ground-based measurements

    Science.gov (United States)

    Kosmopoulos, Panagiotis G.; Kazadzis, Stelios; Taylor, Michael; Raptis, Panagiotis I.; Keramitsoglou, Iphigenia; Kiranoudis, Chris; Bais, Alkiviadis F.

    2018-02-01

    This study focuses on the assessment of surface solar radiation (SSR) based on operational neural network (NN) and multi-regression function (MRF) modelling techniques that produce instantaneous (in less than 1 min) outputs. Using real-time cloud and aerosol optical properties inputs from the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite and the Copernicus Atmosphere Monitoring Service (CAMS), respectively, these models are capable of calculating SSR at high resolution (1 nm, 0.05°, 15 min) that can be used for spectrally integrated irradiance maps, databases and various applications related to energy exploitation. The real-time models are validated against ground-based measurements of the Baseline Surface Radiation Network (BSRN) in a temporal range varying from 15 min to monthly means, while a sensitivity analysis of the cloud and aerosol effects on SSR is performed to ensure reliability under different sky and climatological conditions. The simulated outputs, compared to their common training dataset created by the radiative transfer model (RTM) libRadtran, showed median error values in the range -15 to 15% for the NN that produces spectral irradiances (NNS), 5-6% underestimation for the integrated NN and close to zero errors for the MRF technique. The verification against BSRN revealed that the real-time calculation uncertainty ranges from -100 to 40 W m⁻² and -20 to 20 W m⁻², for the 15 min and monthly mean global horizontal irradiance (GHI) averages, respectively, while the accuracy of the input parameters, in terms of aerosol and cloud optical thickness (AOD and COT), and their impact on GHI, was of the order of 10% as compared to the ground-based measurements. The proposed system is intended for use in studies and real-time applications related to solar energy production planning and use.

  4. Cleanup Verification Package for the 618-2 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    W. S. Thompson

    2006-12-28

    This cleanup verification package documents completion of remedial action for the 618-2 Burial Ground, also referred to as Solid Waste Burial Ground No. 2; Burial Ground No. 2; 318-2; and Dry Waste Burial Site No. 2. This waste site was used primarily for the disposal of contaminated equipment, materials and laboratory waste from the 300 Area Facilities.

  5. Cleanup Verification Package for the 618-2 Burial Ground

    International Nuclear Information System (INIS)

    Thompson, W.S.

    2006-01-01

    This cleanup verification package documents completion of remedial action for the 618-2 Burial Ground, also referred to as Solid Waste Burial Ground No. 2; Burial Ground No. 2; 318-2; and Dry Waste Burial Site No. 2. This waste site was used primarily for the disposal of contaminated equipment, materials and laboratory waste from the 300 Area Facilities.

  6. Cleanup Verification Package for the 618-8 Burial Ground

    International Nuclear Information System (INIS)

    Appel, M.J.

    2006-01-01

    This cleanup verification package documents completion of remedial action for the 618-8 Burial Ground, also referred to as the Solid Waste Burial Ground No. 8, 318-8, and the Early Solid Waste Burial Ground. During its period of operation, the 618-8 site is believed to have been used to bury uranium-contaminated waste derived from fuel manufacturing, and construction debris from the remodeling of the 313 Building.

  7. Cleanup Verification Package for the 618-3 Burial Ground

    International Nuclear Information System (INIS)

    Appel, M.J.

    2006-01-01

    This cleanup verification package documents completion of remedial action for the 618-3 Solid Waste Burial Ground, also referred to as Burial Ground Number 3 and the Dry Waste Burial Ground Number 3. During its period of operation, the 618-3 site was used to dispose of uranium-contaminated construction debris from the 311 Building and construction/demolition debris from remodeling of the 313, 303-J and 303-K Buildings.

  8. Cleanup Verification Package for the 118-F-2 Burial Ground

    International Nuclear Information System (INIS)

    Capron, J.M.; Anselm, K.A.

    2008-01-01

    This cleanup verification package documents completion of remedial action, sampling activities, and compliance with cleanup criteria for the 118-F-2 Burial Ground. This burial ground, formerly called Solid Waste Burial Ground No. 1, was the original solid waste disposal site for the 100-F Area. Eight trenches contained miscellaneous solid waste from the 105-F Reactor and one trench contained solid waste from the biology facilities.

  9. Cleanup Verification Package for the 118-F-1 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    E. J. Farris and H. M. Sulloway

    2008-01-10

    This cleanup verification package documents completion of remedial action for the 118-F-1 Burial Ground on the Hanford Site. This burial ground is a combination of two locations formerly called Minor Construction Burial Ground No. 2 and Solid Waste Burial Ground No. 2. This waste site received radioactive equipment and other miscellaneous waste from 105-F Reactor operations, including dummy elements and irradiated process tubing; gun barrel tips, steel sleeves, and metal chips removed from the reactor; filter boxes containing reactor graphite chips; and miscellaneous construction solid waste.

  10. Cleanup Verification Package for the 118-F-6 Burial Ground

    International Nuclear Information System (INIS)

    Sulloway, H.M.

    2008-01-01

    This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris, as well as a railroad tank car.

  11. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.
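
    A minimal sketch of the likelihood-ratio decision rule the abstract refers to, assuming Gaussian densities for the claimed user and for the background population; the feature vector, means, covariances and threshold are all illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def verify(x, user_mean, user_cov, bg_mean, bg_cov, threshold=1.0):
    """Accept the claimed identity iff p(x | user) / p(x | background)
    exceeds the threshold (the likelihood-ratio test)."""
    lr = (multivariate_normal.pdf(x, user_mean, user_cov)
          / multivariate_normal.pdf(x, bg_mean, bg_cov))
    return lr > threshold

# toy 2-D feature vector; a tighter user density than background
accepted = verify(np.array([0.9, 1.1]),
                  user_mean=[1.0, 1.0], user_cov=0.05 * np.eye(2),
                  bg_mean=[0.0, 0.0], bg_cov=np.eye(2))
```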

  12. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.

  13. The ground based plan

    International Nuclear Information System (INIS)

    1989-01-01

    The paper presents a report of "The Ground Based Plan" of the United Kingdom Science and Engineering Research Council. The ground based plan is a plan for research in astronomy and planetary science by ground based techniques. The report contains a description of: the scientific objectives and technical requirements (the basis for the Plan); the present organisation and funding for the ground based programme; the Plan; the main scientific features; and the further objectives of the Plan. (U.K.)

  14. Cleanup Verification Package for the 118-B-6, 108-B Solid Waste Burial Ground

    International Nuclear Information System (INIS)

    Proctor, M.L.

    2006-01-01

    This cleanup verification package documents completion of remedial action for the 118-B-6, 108-B Solid Waste Burial Ground. The 118-B-6 site consisted of 2 concrete pipes buried vertically in the ground and capped by a concrete pad with steel lids. The site was used for the disposal of wastes from the 'metal line' of the P-10 Tritium Separation Project.

  15. Cleanup Verification Package for the 118-C-1, 105-C Solid Waste Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    M. J. Appel and J. M. Capron

    2007-07-25

    This cleanup verification package documents completion of remedial action for the 118-C-1, 105-C Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-C Reactor and received process tubes, aluminum fuel spacers, control rods, reactor hardware, spent nuclear fuel and soft wastes.

  16. Cleanup Verification Package for the 118-C-1, 105-C Solid Waste Burial Ground

    International Nuclear Information System (INIS)

    Appel, M.J.; Capron, J.M.

    2007-01-01

    This cleanup verification package documents completion of remedial action for the 118-C-1, 105-C Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-C Reactor and received process tubes, aluminum fuel spacers, control rods, reactor hardware, spent nuclear fuel and soft wastes.

  17. Cleanup Verification Package for the 118-B-1, 105-B Solid Waste Burial Ground

    International Nuclear Information System (INIS)

    Capron, J.M.

    2008-01-01

    This cleanup verification package documents completion of remedial action, sampling activities, and compliance criteria for the 118-B-1, 105-B Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-B Reactor and the P-10 Tritium Separation Project and also received waste from the 105-N Reactor. The burial ground received reactor hardware, process piping and tubing, fuel spacers, glassware, electrical components, tritium process wastes, soft wastes and other miscellaneous debris.

  18. Consent Based Verification System (CBSV)

    Data.gov (United States)

    Social Security Administration — CBSV is a fee-based service offered by SSA's Business Services Online (BSO). It is used by private companies to verify the SSNs of their customers and clients that...

  19. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
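
    The winning MIT strategy in the Red Balloon Challenge paid the finder of a balloon a fixed reward and recursively halved the payment up each link of the referral chain. A sketch of that geometric scheme follows; the function and the amounts are illustrative, and whether it matches the paper's optimal scheme in every detail is not shown here.

```python
def referral_payments(finder_reward, chain_length):
    """Geometric payments along a referral chain: the finder receives
    finder_reward, each ancestor half of the payment below them."""
    payments, p = [], float(finder_reward)
    for _ in range(chain_length):
        payments.append(p)
        p /= 2.0
    return payments

# finder, inviter, inviter's inviter, ...
print(referral_payments(2000, 4))  # [2000.0, 1000.0, 500.0, 250.0]
```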

  20. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geo-located Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkouts and tuning, instrument and product calibration and data quality support, monitoring and data/products distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  1. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes on an important role. The conventional Petri net approach, which has recently been studied for knowledge base verification, has been found inadequate for verifying the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  2. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes on an important role. The conventional Petri net approach, which has recently been studied for knowledge base verification, has been found inadequate for verifying the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  3. Cleanup Verification Package for the 118-F-3, Minor Construction Burial Ground

    International Nuclear Information System (INIS)

    Appel, M.J.

    2007-01-01

    This cleanup verification package documents completion of remedial action for the 118-F-3, Minor Construction Burial Ground waste site. This site was an open field covered with cobbles, with no vegetation growing on the surface. The site received irradiated reactor parts that were removed during conversion of the 105-F Reactor from the Liquid 3X to the Ball 3X Project safety systems and received mostly vertical safety rod thimbles and step plugs.

  4. Ground-based photo monitoring

    Science.gov (United States)

    Frederick C. Hall

    2000-01-01

    Ground-based photo monitoring is repeat photography using ground-based cameras to document change in vegetation or soil. Assume those installing the photo location will not be the ones re-photographing it. This requires a protocol that includes: (1) a map to locate the monitoring area, (2) another map diagramming the photographic layout, (3) type and make of film such...

  5. Game-based verification and synthesis

    DEFF Research Database (Denmark)

    Vester, Steen

    ...and the environment behaves. Synthesis of strategies in games can thus be used for automatic generation of correct-by-construction programs from specifications. We consider verification and synthesis problems for several well-known game-based models. This includes both model-checking problems and satisfiability... can be extended to solve finitely-branching turn-based games more efficiently. Further, the novel concept of winning cores in parity games is introduced. We use this to develop a new polynomial-time under-approximation algorithm for solving parity games. Experimental results show that this algorithm... corresponds directly to a program for the corresponding entity of the system. A strategy for a player which ensures that the player wins no matter how the other players behave then corresponds to a program ensuring that the specification of the entity is satisfied no matter how the other entities...

  6. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  7. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time series based feature extraction method and dynamic time warping as the matching method. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures for reference, and 5 signatures each from the original user, simple impostors and trained impostors as test signatures. The final system was tested with 50 participants and 3 references. The tests showed that system accuracy without impostors is 90.44897959% at threshold 44, with a rejection error (FNMR) of 5.2% and an acceptance error (FMR) of 4.35102%; with impostors, system accuracy is 80.1361% at threshold 27, with a rejection error (FNMR) of 15.6% and an average acceptance error (FMR) of 4.263946%, with details as follows: acceptance error 0.391837%, acceptance error for simple impostors 3.2%, and acceptance error for trained impostors 9.2%.
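
    A minimal sketch of the dynamic time warping distance underlying the matching step, assuming 1-D feature sequences have already been extracted from the signatures; the acceptance threshold is application-specific.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(n*m) dynamic time warping distance between two sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# accept the claim if the distance to the reference is below a threshold
print(dtw_distance([1, 2, 3, 4], [1, 2, 2, 3, 4]) < 1.0)  # True
```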

  8. Development of optical ground verification method for μm to sub-mm reflectors

    Science.gov (United States)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2017-11-01

    ...develop and realise suitable verification tools based on infrared interferometry and other optical techniques for testing large reflector structures, telescope configurations and their performances under simulated space conditions. Two methods and techniques are developed at CSL. The first one is an IR phase-shifting interferometer with high spatial resolution. This interferometer shall be used specifically for the verification of high-precision IR, FIR and sub-mm reflector surfaces and telescopes under both ambient and thermal vacuum conditions. The second one, presented hereafter, is a holographic method for relative shape measurement. The holographic solution proposed makes use of a home-built, vacuum-compatible holographic camera that allows displacement measurements from typically 20 nanometres to 25 microns in one shot. An iterative process allows the measurement of a total of up to several mm of deformation. Uniquely, the system is designed to measure both specular and diffuse surfaces.

  9. Android-Based Verification System for Banknotes

    Directory of Open Access Journals (Sweden)

    Ubaid Ur Rahman

    2017-11-01

    Full Text Available With the advancement in imaging technologies for scanning and printing, production of counterfeit banknotes has become cheaper, easier, and more common. The proliferation of counterfeit banknotes causes loss to banks, traders, and individuals involved in financial transactions. Hence, efficient and reliable techniques for the detection of counterfeit banknotes are inevitably needed. With the availability of powerful smartphones, it has become possible to perform complex computations and image processing related tasks on these phones. In addition, the number of smartphone users has increased greatly and continues to grow. This is a great motivating factor for researchers and developers to propose innovative mobile-based solutions. In this study, a novel technique for verification of Pakistani banknotes is developed, targeting smartphones with the Android platform. The proposed technique is based on statistical features and surface roughness of a banknote, representing different properties of the banknote, such as paper material, printing ink, paper quality, and surface roughness. The selection of these features is motivated by the X-ray Diffraction (XRD) and Scanning Electron Microscopy (SEM) analysis of genuine and counterfeit banknotes. In this regard, two important areas of the banknote, i.e., the serial number and flag portions, were considered, since these portions showed the maximum difference between genuine and counterfeit banknotes. The analysis confirmed that genuine and counterfeit banknotes are very different in terms of the printing process, the ingredients used in preparation of banknotes, and the quality of the paper. After extracting the discriminative set of features, a support vector machine is used for classification. The experimental results confirm the high accuracy of the proposed technique.

  10. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used for strengthening program transformation techniques. This paper considers the question of how the finite countermodels method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms trying to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of the problems.

  11. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near-minimal probability-of-error decision algorithm.

  12. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    Full Text Available This work presents an efficient solution using a computer algebra system to perform linear temporal properties verification for synchronous digital systems. The method is essentially based on Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations by using Groebner bases. The computational experience reported in this work shows that the algebraic approach is a quite competitive checking method and will be a useful supplement to the existing verification methods based on simulation.
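
    To illustrate the core idea (not the paper's exact mechanization): ideal membership tested via a Groebner basis can verify an assertion about a polynomial circuit model. A sketch in SymPy for a single AND gate with Boolean signals:

```python
from sympy import symbols, groebner

a, b, c = symbols('a b c')

# a^2 - a = 0 and b^2 - b = 0 force a, b into {0, 1};
# c - a*b encodes the gate c = a AND b.
circuit = [a**2 - a, b**2 - b, c - a*b]
G = groebner(circuit, a, b, c, order='lex')

# The assertion "the output c is Boolean" holds on all solutions
# iff c^2 - c lies in the ideal generated by the circuit polynomials.
print(G.contains(c**2 - c))  # True
```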

  13. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  14. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype robust biometric system for verification. The system uses features of the human hand extracted using the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7751 images. The system is found to be robust with respect to translation and rotation. It has an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.
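
    A sketch of the match-and-count flow behind such a verifier, using OpenCV's ORB as a stand-in for SURF (which is patented and often absent from stock OpenCV builds); the image paths and distance threshold are illustrative.

```python
import cv2

def match_score(probe_path, enrolled_path, max_dist=50):
    """Count close keypoint matches between two palmprint images."""
    orb = cv2.ORB_create()
    a = cv2.imread(probe_path, cv2.IMREAD_GRAYSCALE)
    b = cv2.imread(enrolled_path, cv2.IMREAD_GRAYSCALE)
    _, da = orb.detectAndCompute(a, None)
    _, db = orb.detectAndCompute(b, None)
    if da is None or db is None:
        return 0
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return sum(m.distance < max_dist for m in bf.match(da, db))

# accept the claimed identity if the score exceeds a tuned threshold
```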

  15. Biometric Subject Verification Based on Electrocardiographic Signals

    Science.gov (United States)

    Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)

    2014-01-01

    A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.

  16. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    In this paper, we present a discriminative feature difference learning method for facial image based kinship verification. To transform the feature difference of an image pair to be discriminative for kinship verification, a linear transformation matrix for the feature difference between an image pair... than the commonly used feature concatenation, leading to a low complexity. Furthermore, there is no positive semi-definite constraint on the transformation matrix, while there is in metric learning methods, leading to an easy solution for the transformation matrix. Experimental results on two public... databases show that the proposed method combined with an SVM classification method outperforms or is comparable to state-of-the-art kinship verification methods.

  17. Protocol-Based Verification of Message-Passing Parallel Programs

    DEFF Research Database (Denmark)

    López-Acosta, Hugo-Andrés; Eduardo R. B. Marques, Eduardo R. B.; Martins, Francisco

    2015-01-01

    We present ParTypes, a type-based methodology for the verification of Message Passing Interface (MPI) programs written in the C programming language. The aim is to statically verify programs against protocol specifications, enforcing properties such as fidelity and absence of deadlocks. We develo...

  18. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.; Delp, Edward J.; Wong, Ping W.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 × 44 piezoresistive elements is used to measure the grip pattern. An interface has been...

  19. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  20. Wavelet-based verification of the quantitative precipitation forecast

    Science.gov (United States)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using the two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localizations and associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by two indices, for scale and localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object"-oriented verification methods, as the latter tend to exhibit strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further developments of the wavelet-based methods, especially towards the goal of identifying a weak physical process contributing to forecast error, are also pointed out.
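
    A minimal sketch of the scale-and-localization idea, assuming the forecast and observed precipitation fields are 2-D arrays: decompose the error field with PyWavelets and report error energy per scale. The Haar wavelet and the level count are illustrative choices.

```python
import numpy as np
import pywt

def scale_errors(forecast, observed, wavelet="haar", level=3):
    """Energy of the forecast error per wavelet scale (coarsest first).
    Detail coefficients keep localization, so large entries can also be
    traced back to where in the domain the error sits."""
    coeffs = pywt.wavedec2(np.asarray(forecast) - np.asarray(observed),
                           wavelet, level=level)
    energies = [float(np.sum(coeffs[0] ** 2))]   # approximation level
    for cH, cV, cD in coeffs[1:]:                # details, coarse -> fine
        energies.append(float((cH**2 + cV**2 + cD**2).sum()))
    return energies
```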

  1. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 × 44 piezoresistive elements is used to measure the grip pattern. An interface has been...

  2. An ontology based trust verification of software license agreement

    Science.gov (United States)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a very large document is displayed stating rights and obligations, which many people do not have the patience to read or understand. That may make users feel distrust towards the software. In this paper, we propose an ontology-based trust verification for Software License Agreements. First of all, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed with the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright-law knowledge model, and can also work as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  3. Hamming Code Based Watermarking Scheme for 3D Model Verification

    Directory of Open Access Journals (Sweden)

    Jen-Tse Wang

    2014-01-01

    Full Text Available Due to the explosive growth of the Internet and the maturing of 3D hardware techniques, protecting 3D objects becomes an increasingly important issue. In this paper, a public Hamming code based fragile watermarking technique is proposed for 3D object verification. An adaptive watermark is generated from each cover model by using the Hamming code technique. A simple least significant bit (LSB) substitution technique is employed for watermark embedding. In the extraction stage, the Hamming code based watermark can be verified by using Hamming code checking without embedding any verification information. Experimental results show that 100% of the vertices of the cover model can be watermarked, extracted, and verified. It also shows that the proposed method can improve security and achieve low distortion of the stego object.
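
    One plausible reading of the verification step, as a sketch: watermark bits are read from the LSBs of quantised vertex coordinates and grouped into Hamming(7,4) codewords, so a non-zero syndrome flags tampering. The quantisation scale and bit layout here are assumptions, not the paper's scheme.

```python
import numpy as np

# Hamming(7,4) parity-check matrix: column j (1-indexed) is binary j
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def extract_lsbs(coords, scale=1024):
    """Read candidate watermark bits from LSBs of quantised coordinates."""
    q = np.floor(np.asarray(coords, dtype=float) * scale).astype(int)
    return (q & 1).ravel()

def verify_blocks(bits):
    """Zero syndrome for a 7-bit block means the codeword is intact;
    any non-zero syndrome flags a tampered region of the model."""
    bits = bits[: len(bits) - len(bits) % 7].reshape(-1, 7)
    syndromes = H.dot(bits.T) % 2
    return ~syndromes.any(axis=0)  # True = block verified

verts = np.array([[0.1234, 0.5678, 0.9012],   # illustrative vertices
                  [0.3456, 0.7890, 0.2345],
                  [0.6789, 0.0123, 0.4567]])
print(verify_blocks(extract_lsbs(verts)))
```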

  4. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently.

  5. Type A verification report for the high flux beam reactor stack and grounds, Brookhaven National Laboratory, Upton, New York

    International Nuclear Information System (INIS)

    Harpenau, Evan M.

    2012-01-01

    The U.S. Department of Energy (DOE) Order 458.1 requires independent verification (IV) of DOE cleanup projects (DOE 2011). The Oak Ridge Institute for Science and Education (ORISE) has been designated as the responsible organization for IV of the High Flux Beam Reactor (HFBR) Stack and Grounds area at Brookhaven National Laboratory (BNL) in Upton, New York. The IV evaluation may consist of an in-process inspection with document and data reviews (Type A Verification) or a confirmatory survey of the site (Type B Verification). DOE and ORISE determined that a Type A verification of the documents and data for the HFBR Stack and Grounds: Survey Units (SU) 6, 7, and 8 was appropriate based on the initial survey unit classification, the walkover surveys, and the final analytical results provided by Brookhaven Science Associates (BSA). The HFBR Stack and Grounds surveys began in June 2011 and were completed in September 2011. Survey activities by BSA included gamma walkover scans and sampling of the as-left soils in accordance with the BSA Work Procedure (BNL 2010a). The Field Sampling Plan - Stack and Remaining HFBR Outside Areas (FSP) stated that gamma walkover surveys would be conducted with a bare sodium iodide (NaI) detector, and a collimated detector would be used to check areas with elevated count rates to locate the source of the high readings (BNL 2010b). BSA used the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM) principles for determining the classifications of each survey unit. Therefore, SUs 6 and 7 were identified as Class 1 and SU 8 was deemed Class 2 (BNL 2010b). Gamma walkover surveys of SUs 6, 7, and 8 were completed using a 2½ × 2 NaI detector coupled to a data-logger with a global positioning system (GPS). The 100% scan surveys conducted prior to the final status survey (FSS) sampling identified two general soil areas and two isolated soil locations with elevated radioactivity. The general areas of elevated activity

  6. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    International Nuclear Information System (INIS)

    Hillen, F; Ehlers, M; Höfle, B; Reinartz, P

    2014-01-01

    In this paper, the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed, based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single-lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique for the verification of people trajectories.

  7. Ground-based observations of exoplanet atmospheres

    NARCIS (Netherlands)

    Mooij, Ernst Johan Walter de

    2011-01-01

    This thesis focuses on the properties of exoplanet atmospheres. The results of ground-based near-infrared secondary-eclipse observations of three different exoplanets, TrES-3b, HAT-P-1b and WASP-33b, obtained with ground-based telescopes as part of the GROUSE project, are presented.

  8. Simulation-based MDP verification for leading-edge masks

    Science.gov (United States)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing, and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification

  9. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for run-time verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a non-intrusive approach, by connecting to a variety of network interfaces. Due to a large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and to interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.

  10. Generalization of information-based concepts in forecast verification

    Science.gov (United States)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are shortly reviewed, then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
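
    For reference, the two scores at the centre of this discussion, written for N probabilistic forecasts of a binary event with forecast probability p_i and observed outcome o_i in {0, 1} (the notation is ours; IGN is conventionally taken with a base-2 logarithm):

```latex
\mathrm{BS} = \frac{1}{N} \sum_{i=1}^{N} \left( p_i - o_i \right)^2,
\qquad
\mathrm{IGN} = -\frac{1}{N} \sum_{i=1}^{N} \log_2 p_i(o_i)
```

    Here p_i(o_i) is the probability assigned to the outcome that actually occurred; summing the BS over ordered categories yields the RPS, and its continuum limit the CRPS, which is the analogy the CRIGN follows.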

  11. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Full Text Available Biometric-based identification/verification systems provide a solution to the security concerns in the modern world, where the machine is replacing the human in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image) based. The minutiae-based algorithm depends upon the local discontinuities in the ridge flow pattern and is used when template size is important, while the image-based matching algorithm uses both the micro and macro features of a fingerprint and is used if a fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. This system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database. Datalog files can access and manipulate data and complex data structures quickly and easily. They make writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy with a learning image size of 100 × 100 and a threshold value of 700 (1000 being the perfect match) has been achieved.

  12. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  13. A Scala DSL for RETE-Based Runtime Verification

    Science.gov (United States)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.
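
    Not LogFire's Scala DSL, but a toy Python analogue of the rule style such systems support, to make the idea concrete; a real Rete network incrementally caches partial matches instead of evaluating conditions from scratch.

```python
class RuleEngine:
    """Toy rule-based monitor: facts are tuples, and a rule fires when
    its condition matches a newly added fact against working memory."""
    def __init__(self):
        self.facts, self.rules = set(), []

    def rule(self, cond):
        def register(action):
            self.rules.append((cond, action))
            return action
        return register

    def add(self, fact):
        self.facts.add(fact)
        for cond, action in self.rules:
            if cond(fact, self.facts):
                action(fact)

engine = RuleEngine()

@engine.rule(lambda f, mem: f[0] == "close" and ("open", f[1]) not in mem)
def report(fact):
    print("violation: close without open:", fact)

engine.add(("open", "file1"))
engine.add(("close", "file1"))  # ok
engine.add(("close", "file2"))  # violation
```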

  14. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification... how this model can be refined to target both verification and implementation.

  15. Internet-based dimensional verification system for reverse engineering processes

    International Nuclear Information System (INIS)

    Song, In Ho; Kim, Kyung Don; Chung, Sung Chong

    2008-01-01

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies.

  16. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. By using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to verifying accuracy of production processes by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system. It is possible to diversify the information sharing capability among design collaborators. Validity and effectiveness of the developed system have been confirmed by case studies.

  17. Novel Verification Method for Timing Optimization Based on DPSO

    Directory of Open Access Journals (Sweden)

    Chuandong Chen

    2018-01-01

    Timing optimization for logic circuits is one of the key steps in logic synthesis. Existing timing optimization results are mainly produced by various intelligence algorithms. Hence, they are neither comparable with timing optimization data collected by the mainstream electronic design automation (EDA) tool nor able to verify the superiority of intelligence algorithms to the EDA tool in terms of optimization ability. To address these shortcomings, a novel verification method is proposed in this study. First, a discrete particle swarm optimization (DPSO) algorithm was applied to optimize the timing of the mixed polarity Reed-Muller (MPRM) logic circuit. Second, the Design Compiler (DC) algorithm was used to optimize the timing of the same MPRM logic circuit through special settings and constraints. Finally, the timing optimization results of the two algorithms were compared on MCNC benchmark circuits: DPSO demonstrates an average reduction of 9.7% in the timing delays of critical paths relative to DC. The proposed verification method directly ascertains whether the intelligence algorithm has a better timing optimization ability than DC.
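
    As a hedged illustration of the discrete (binary) PSO idea, the Python sketch below optimizes a bit vector, with the cost function stubbed in as a toy stand-in for the critical-path delay of an MPRM circuit; all parameter values are illustrative, not the paper's.

    ```python
    import math
    import random

    def cost(bits):
        # Toy stand-in for critical-path delay of a polarity assignment.
        return sum(b * (i % 3) for i, b in enumerate(bits))

    def dpso(n_bits=8, n_particles=10, iters=50, w=0.7, c1=1.5, c2=1.5):
        parts = [[random.randint(0, 1) for _ in range(n_bits)]
                 for _ in range(n_particles)]
        vel = [[0.0] * n_bits for _ in range(n_particles)]
        pbest = [p[:] for p in parts]
        gbest = min(pbest, key=cost)[:]
        for _ in range(iters):
            for i, p in enumerate(parts):
                for j in range(n_bits):
                    vel[i][j] = (w * vel[i][j]
                                 + c1 * random.random() * (pbest[i][j] - p[j])
                                 + c2 * random.random() * (gbest[j] - p[j]))
                    # Sigmoid maps velocity to the probability the bit is 1.
                    p[j] = 1 if random.random() < 1 / (1 + math.exp(-vel[i][j])) else 0
                if cost(p) < cost(pbest[i]):
                    pbest[i] = p[:]
            gbest = min(pbest, key=cost)[:]
        return gbest, cost(gbest)

    print(dpso())
    ```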

  18. Development of a tool for knowledge base verification of expert system based on Design/CPN

    International Nuclear Information System (INIS)

    Kim, Jong Hyun

    1998-02-01

    Verification is necessary work in developing a reliable expert system. Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, knowledge base verification takes an important position. The conventional Petri net approach, studied recently for verifying knowledge bases, is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base. Generally, the verification process requires computational support by automated tools. For this reason, this study developed a tool for knowledge base verification based on Design/CPN, which is a tool for editing, modeling, and simulating Colored Petri nets. This tool uses the Enhanced Colored Petri net as its modeling method. Applying this tool to the knowledge base of a nuclear power plant shows that it can successfully check most of the anomalies that can occur in a knowledge base.
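
    As a rough illustration of one anomaly such tools look for (a toy Python sketch, not Design/CPN or the developed tool): rules are treated as transitions from premise places to conclusion places, and a simple graph walk flags circular rule chains. The rule set is hypothetical, and the same cycle may be reported from several start places.

    ```python
    rules = {                        # rule: (premises, conclusion)
        "r1": ({"alarm_a"}, "fault_x"),
        "r2": ({"fault_x"}, "fault_y"),
        "r3": ({"fault_y"}, "fault_x"),   # forms a cycle with r2
    }

    def find_cycles(rules):
        graph = {}
        for prem, concl in rules.values():
            for p in prem:
                graph.setdefault(p, set()).add(concl)
        cycles = []

        def dfs(node, path):
            if node in path:                       # revisited a place: cycle
                cycles.append(path[path.index(node):] + [node])
                return
            for nxt in graph.get(node, ()):
                dfs(nxt, path + [node])

        for start in graph:
            dfs(start, [])
        return cycles

    print(find_cycles(rules))  # detects fault_x -> fault_y -> fault_x
    ```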

  19. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Biometric based authentication systems provide solutions to the problems in high security which remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, finger print, palm print or hand geometry, face, iris etc.) are used to verify the authenticity of a person. These parameters are good to be used as biometric parameters but do not provide the guarantee that the person is present and alive. A voice can be copied, a finger print can be picked from glass onto synthetic skin, and in a face recognition system, due to genetic factors, identical twins or father and son may have the same facial appearance. ECG does not have these problems. It cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometrics verification system which was developed using Laboratory Virtual Instruments Engineering Workbench (LabVIEW version 7.1) is discussed. Experiments were conducted on the database stored in the laboratory of 20 individuals having 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
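
    For reference, FRR and FAR at a fixed decision threshold can be computed from genuine and impostor match scores as in the Python sketch below (made-up scores, not the paper's LabVIEW pipeline or data).

    ```python
    genuine  = [0.91, 0.85, 0.40, 0.88, 0.95]   # same-person trial scores
    impostor = [0.10, 0.75, 0.30, 0.20, 0.05]   # different-person trial scores
    threshold = 0.7

    frr = sum(s < threshold for s in genuine) / len(genuine)     # false rejections
    far = sum(s >= threshold for s in impostor) / len(impostor)  # false acceptances
    print(f"FRR={frr:.0%}, FAR={far:.0%}")
    ```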

  20. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun, Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design against the regulations related to a product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed to ensure flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  1. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun, Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design against the regulations related to a product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed to ensure flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.
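
    A minimal sketch of the underlying idea, assuming a hypothetical rule set and design record (the paper's system uses a product ontology plus Web services; plain Python predicates stand in for regulation rules here):

    ```python
    design = {"material": "PVC", "wall_thickness_mm": 1.2, "max_voltage_v": 260}

    rules = [  # (design parameter, check, violation message) - all hypothetical
        ("wall_thickness_mm", lambda v: v >= 1.5, "wall thickness below 1.5 mm"),
        ("max_voltage_v",     lambda v: v <= 250, "rated voltage exceeds 250 V"),
    ]

    violations = [msg for key, ok, msg in rules
                  if key in design and not ok(design[key])]
    print(violations)  # both rules fire for this toy design
    ```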

  2. A Ground-Based Validation System of Teleoperation for a Space Robot

    Directory of Open Access Journals (Sweden)

    Xueqian Wang

    2012-10-01

    Teleoperation of space robots is very important for future on-orbit service. To ensure that a task is accomplished successfully, ground experiments are required to verify the function and validity of the teleoperation system before a space robot is launched. In this paper, a ground-based validation subsystem is developed as a part of a teleoperation system. The subsystem is mainly composed of four parts: the input verification module, the onboard verification module, the dynamic and image workstation, and the communication simulator. The input verification module, consisting of the hardware and software of the master, is used to verify the input ability. The onboard verification module, consisting of the same hardware and software as the onboard processor, is used to verify the processor's computing ability and execution schedule. In addition, the dynamic and image workstation calculates the dynamic response of the space robot and target, and generates emulated camera images, including the hand-eye cameras, global-vision camera and rendezvous camera. The communication simulator provides realistic communication conditions, i.e., time delays and communication bandwidth. Lastly, we integrated a teleoperation system and conducted many experiments on the system. Experimental results show that the ground system is very useful for verifying teleoperation technology.
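
    To illustrate the communication-simulator role, here is a minimal Python sketch of a fixed transport delay between operator commands and the robot (hypothetical interfaces and numbers; a real simulator would also model bandwidth limits and jitter):

    ```python
    import collections

    class DelayLine:
        """Delays each command by a fixed number of control cycles."""

        def __init__(self, delay_steps):
            self.queue = collections.deque([None] * delay_steps)

        def send(self, cmd):
            self.queue.append(cmd)
            return self.queue.popleft()   # command arriving this step, if any

    link = DelayLine(delay_steps=3)       # e.g., 3 control cycles of delay
    for t, cmd in enumerate(["c0", "c1", "c2", "c3", "c4"]):
        arrived = link.send(cmd)
        print(f"t={t}: sent {cmd}, robot receives {arrived}")
    ```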

  3. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provides assistance to the subject...

  4. Simulation based mask defect repair verification and disposition

    Science.gov (United States)

    Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo

    2009-10-01

    As the industry moves towards sub-65nm technology nodes, mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real, which of the real defects should be repaired, and how to verify the repaired defects. In this paper, we address the challenges in mask defect verification and disposition, in particular in post-repair defect verification, by an efficient methodology using SEM mask defect images and optical inspection mask defect images (the latter only for verification of phase- and transmission-related defects). We demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. An SEM image was taken for each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation based Mask Defect Disposition), has been used in this study. The software is used to extract edges from the mask SEM images and convert them into polygons saved in GDSII format. Then, the converted polygons from the SEM images were filled with the correct tone to form mask patterns and were merged back into the original GDSII design file. This merge is for the purpose of contour simulation, since normally the SEM images cover only a small area (~1 μm) and accurate simulation requires including a larger area for the optical proximity effect. With a lithography process model, the resist contour of the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such a complicated model is not available, a simple

  5. Development of advanced earthquake resistant performance verification on reinforced concrete underground structure. Pt. 2. Verification of the ground modeling methods applied to non-linear soil-structure interaction analysis

    International Nuclear Information System (INIS)

    Kawai, Tadashi; Kanatani, Mamoru; Ohtomo, Keizo; Matsui, Jun; Matsuo, Toyofumi

    2003-01-01

    In order to develop an advanced verification method for the earthquake resistant performance of reinforced concrete underground structures, the applicability of two different soil modeling methods in numerical analysis was verified through non-linear dynamic numerical simulations of large shaking table tests, conducted using a model composed of free-field ground and a reinforced concrete two-box culvert structure system. In these simulations, the structure was modeled by beam-type elements having a tri-linear relation between curvature and flexural moment. The soil was modeled by the Ramberg-Osgood model as well as by an elasto-plastic constitutive model. The former model only employs non-linearity of the shear modulus with respect to strain and initial stress conditions, whereas the latter can express non-linearity of the shear modulus caused by changes of mean effective stress during ground excitation and by dilatancy of the ground soil. Therefore the elasto-plastic constitutive model could precisely simulate the vertical acceleration and displacement response on the ground surface, which were produced by soil dilation during shaking events with a horizontal base input in the model tests. In addition, the model can explain the distinctive dynamic earth pressure acting on the vertical walls of the structure, which was also confirmed to be related to soil dilation. However, since both modeling methods could express the shear force on the upper slab surface of the model structure, which plays the predominant role in structural deformation, they were equally applicable to the evaluation of seismic performance similar to the model structure of this study. (author)
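
    For reference, a common Ramberg-Osgood form for strain-dependent soil stiffness, gamma = tau/G0 * (1 + alpha * |tau/tau_y|^(r-1)), can be evaluated as in the Python sketch below (illustrative parameter values, not the paper's calibrated model); the secant shear modulus follows as tau/gamma.

    ```python
    import numpy as np

    # Illustrative soil parameters: small-strain modulus, reference stress,
    # and Ramberg-Osgood shape parameters (all assumed, not from the paper).
    G0, tau_y, alpha, r = 80e6, 60e3, 1.0, 2.5   # Pa, Pa, -, -

    tau = np.linspace(1.0, 50e3, 5)              # shear stress levels, Pa
    gamma = tau / G0 * (1 + alpha * np.abs(tau / tau_y) ** (r - 1))
    G_sec = tau / gamma                          # secant shear modulus
    for t, g in zip(tau, G_sec / G0):
        print(f"tau={t/1e3:6.1f} kPa  G/G0={g:.3f}")  # stiffness degradation
    ```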

  6. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    We present a methodology for reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.

  7. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate models in a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
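
    The template-to-IP-XACT step could look roughly like the Python sketch below, a toy with simplified element names (the real IP-XACT schema and the paper's template format are much richer): tabular register rows are emitted as XML register elements.

    ```python
    import xml.etree.ElementTree as ET

    registers = [  # name, offset, width, access - hypothetical template rows
        ("CTRL",   0x00, 32, "read-write"),
        ("STATUS", 0x04, 32, "read-only"),
    ]

    root = ET.Element("registers")
    for name, offset, width, access in registers:
        reg = ET.SubElement(root, "register")
        ET.SubElement(reg, "name").text = name
        ET.SubElement(reg, "addressOffset").text = hex(offset)
        ET.SubElement(reg, "size").text = str(width)
        ET.SubElement(reg, "access").text = access

    print(ET.tostring(root, encoding="unicode"))  # IP-XACT-like XML fragment
    ```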

  8. Illumination compensation in ground based hyperspectral imaging

    Science.gov (United States)

    Wendel, Alexander; Underwood, James

    2017-07-01

    Hyperspectral imaging has emerged as an important tool for analysing vegetation data in agricultural applications. Recently, low altitude and ground based hyperspectral imaging solutions have come to the fore, providing very high resolution data for mapping and studying large areas of crops in detail. However, these platforms introduce a unique set of challenges that need to be overcome to ensure consistent, accurate and timely acquisition of data. One particular problem is dealing with changes in environmental illumination while operating with natural light under cloud cover, which can have considerable effects on spectral shape. In the past this has been commonly achieved by imaging known reference targets at the time of data acquisition, direct measurement of irradiance, or atmospheric modelling. While capturing a reference panel continuously or very frequently allows accurate compensation for illumination changes, this is often not practical with ground based platforms, and impossible in aerial applications. This paper examines the use of an autonomous unmanned ground vehicle (UGV) to gather high resolution hyperspectral imaging data of crops under natural illumination. A process of illumination compensation is performed to extract the inherent reflectance properties of the crops, despite variable illumination. This work adapts a previously developed subspace model approach to reflectance and illumination recovery. Though tested on a ground vehicle in this paper, it is applicable to low altitude unmanned aerial hyperspectral imagery also. The method uses occasional observations of reference panel training data from within the same or other datasets, which enables a practical field protocol that minimises in-field manual labour. This paper tests the new approach, comparing it against traditional methods. Several illumination compensation protocols for high volume ground based data collection are presented based on the results. The findings in this paper are
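
    As a simple point of reference for the compensation problem, the traditional reference-panel method that the paper contrasts with can be sketched in Python as below (toy numbers): dividing raw counts by a white-panel observation, scaled by the panel's known reflectance, yields apparent reflectance.

    ```python
    import numpy as np

    raw_crop   = np.array([120.0, 340.0, 560.0])  # sensor counts per band
    raw_panel  = np.array([400.0, 800.0, 900.0])  # counts over reference panel
    panel_refl = 0.99                             # known panel reflectance

    # Reflectance recovery under the illumination seen by the panel.
    reflectance = raw_crop / raw_panel * panel_refl
    print(reflectance)
    ```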

  9. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment. In this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  10. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Yordanova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...

  11. Verification and Planning Based on Coinductive Logic Programming

    Science.gov (United States)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Whereas induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (SLD resolution imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution

  12. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, brings about low precision and low efficiency. A machine vision based verification system for steel rules is designed referring to JJG 1-1999 Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly prove that these methods not only meet the precision of the verification regulation, but also improve the reliability and efficiency of the verification system.
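
    The pixel-equivalent idea reduces to a scale factor obtained from a feature of known length; a toy Python sketch with made-up numbers (not the paper's calibration procedure):

    ```python
    known_length_mm = 10.0            # gauge length in the calibration image
    measured_pixels = 2053.4          # that gauge length measured in pixels
    pixel_equiv = known_length_mm / measured_pixels   # mm per pixel

    line_pixels = 20521.0             # distance between two graduation lines
    print(f"{line_pixels * pixel_equiv:.4f} mm")      # ~99.94 mm
    ```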

  13. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  14. A knowledge-base verification of NPP expert systems using extended Petri nets

    International Nuclear Information System (INIS)

    Kwon, Il Won; Seong, Poong Hyun

    1995-01-01

    The verification phase of a knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assumed error incidence to be limited to rule pairs only. In addition, we consider certainty factors in checking, because most knowledge bases have certainty factors.

  15. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    Energy Technology Data Exchange (ETDEWEB)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E. [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 770 State St, Atlanta, GA 30332-0745 (United States)

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine, using transport theory, the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, in order to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)

  16. Ground-Based Telescope Parametric Cost Model

    Science.gov (United States)

    Stahl, H. Philip; Rowell, Ginger Holmes

    2004-01-01

    A parametric cost model for ground-based telescopes is developed using multi-variable statistical analysis. The model includes both engineering and performance parameters. While diameter continues to be the dominant cost driver, other significant factors include primary mirror radius of curvature and diffraction limited wavelength. The model includes an explicit factor for primary mirror segmentation and/or duplication (i.e., multi-telescope phased-array systems). Additionally, single variable models based on aperture diameter are derived. This analysis indicates that recent mirror technology advances have indeed reduced the historical telescope cost curve.
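
    A single-variable model of the kind mentioned is typically a power law in aperture; below is a minimal Python sketch fitting cost = a * D^b in log space (illustrative data points, not the paper's dataset or fitted exponent).

    ```python
    import numpy as np

    D    = np.array([2.0, 4.0, 8.0, 10.0])      # aperture diameter, m
    cost = np.array([5.0, 30.0, 180.0, 350.0])  # cost, arbitrary units

    # Linear fit of log(cost) vs log(D) gives the power-law exponent b.
    b, log_a = np.polyfit(np.log(D), np.log(cost), 1)
    print(f"cost ~ {np.exp(log_a):.2f} * D^{b:.2f}")
    ```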

  17. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

    The Verification and Validation (V and V) process is essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates the V and V process followed, document submission requirements at each stage, V and V activities, the checklist used for reviews at each stage, and reports.

  18. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Testing is a de-facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. On the other hand, model checking is a powerful technique that supports comprehensiveness, and is thus suitable for the verification of safety-critical systems. However, it generally requires more knowledge and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.

  19. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  20. Horn clause verification with convex polyhedral abstraction and tree automata-based refinement

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2017-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivations...... underlying the Horn clauses. Experiments using linear constraint problems and the abstract domain of convex polyhedra show that the refinement technique is practical and that iteration of abstract interpretation with tree automata-based refinement solves many challenging Horn clause verification problems. We...... compare the results with other state-of-the-art Horn clause verification tools....

  1. Space weather effects on ground based technology

    Science.gov (United States)

    Clark, T.

    Space weather can affect a variety of forms of ground-based technology, usually as a result of either the direct effects of the varying geomagnetic field, or as a result of the induced electric field that accompanies such variations. Technologies affected directly by geomagnetic variations include magnetic measurements made during geophysical surveys, and navigation relying on the geomagnetic field as a direction reference, a method that is particularly common in the surveying of well-bores in the oil industry. The most obvious technology affected by induced electric fields during magnetic storms is electric power transmission, where the example of the blackout in Quebec during the March 1989 magnetic storm is widely known. Additionally, space weather effects must be taken into account in the design of active cathodic protection systems on pipelines to protect them against corrosion. Long-distance telecommunication cables may also have to be designed to cope with space weather related effects. This paper reviews the effects of space weather in these different areas of ground-based technology, and provides examples of how mitigation against hazards may be achieved. (The paper does not include the effects of space weather on radio communication or satellite navigation systems).

  2. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the method of IMA system test needs to be simplified. An IMA system supports a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, it is difficult to isolate failures in an IMA system. A critical problem for IMA system verification is therefore how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but totally testing a huge, integrated avionics system is hard to complete. This paper therefore applies compositional verification theory to IMA system test, reducing the test processes and improving efficiency, and consequently economizing the costs of IMA system integration.

  3. SCIENTIFIC EFFICIENCY OF GROUND-BASED TELESCOPES

    International Nuclear Information System (INIS)

    Abt, Helmut A.

    2012-01-01

    I scanned the six major astronomical journals of 2008 for all 1589 papers that are based on new data obtained from ground-based optical/IR telescopes worldwide. Then I collected data on numbers of papers, citations to them in 3+ years, the most-cited papers, and annual operating costs. These data are assigned to four groups by telescope aperture. For instance, papers from telescopes with an aperture >7 m average 1.29 times more citations than those from telescopes with an aperture of 2 to 7 m. I wonder why the large telescopes do so relatively poorly and suggest possible reasons. I also found that archival data, such as the Sloan Digital Sky Survey, produce 10.6% as many papers and 20.6% as many citations as those based on new data. Also, the 577.2 papers based on radio data produced 36.3% as many papers and 33.6% as many citations as the 1589 papers based on optical/IR telescopes.

  4. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  5. Groebner Bases Based Verification Solution for SystemVerilog Concurrent Assertions

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2014-01-01

    of polynomial ring algebra to perform SystemVerilog assertion verification over digital circuit systems. This method is based on Groebner bases theory and sequential property checking. We define a constrained subset of SVAs so that an efficient polynomial modeling mechanism for both circuit descriptions and assertions can be applied. We present an algorithmic framework based on the algebraic representations using Groebner bases for checking concurrent SVAs. Case studies show that computer algebra can provide canonical symbolic representations for both assertions and circuit designs and can act as a novel solver engine from the viewpoint of symbolic computation.
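
    As a hedged illustration of the polynomial-checking idea (a toy combinational example in Python/sympy, not the paper's SVA subset or framework): gate equations plus Boolean constraints generate an ideal, and a property polynomial that reduces to zero modulo a Groebner basis of that ideal holds on all circuit behaviours.

    ```python
    from sympy import symbols, groebner

    a, b, s = symbols('a b s')
    circuit = [s - (a + b - 2*a*b),   # gate: s = XOR(a, b)
               a**2 - a, b**2 - b]    # a, b are Boolean (0/1)
    G = groebner(circuit, a, b, s, order='lex')

    prop = s + 2*a*b - a - b          # claim: s = a + b - 2ab, rearranged
    _, remainder = G.reduce(prop)     # divide the property by the basis
    print(remainder == 0)             # True: the property holds
    ```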

  6. A DICOM-RT-based toolbox for the evaluation and verification of radiotherapy plans

    International Nuclear Information System (INIS)

    Spezi, E; Lewis, D G; Smith, C W

    2002-01-01

    The verification of radiotherapy plans is an essential step in the treatment planning process. This is especially important for highly conformal and IMRT plans, which produce non-intuitive fluence maps and complex 3D dose distributions. In this work we present a DICOM (Digital Imaging and Communications in Medicine) based toolbox, developed for the evaluation and verification of radiotherapy treatment plans. The toolbox offers the possibility of importing treatment plans generated with different calculation algorithms and/or different optimization engines and evaluating dose distributions on an independent platform. Furthermore, the radiotherapy set-up can be exported to the BEAM Monte Carlo code system for dose verification. This can be done by simulating the irradiation of the patient CT dataset or the irradiation of a software-generated water phantom. We show the application of some of the functions implemented in this toolbox for the evaluation and verification of an IMRT treatment of the head and neck region.
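
    Independent dose comparison of this kind can be prototyped today with the pydicom library; a minimal sketch follows (hypothetical file names, and it ignores the grid alignment and plan/structure handling a real toolbox needs): the stored integer dose grids are scaled to Gy and the mean difference reported.

    ```python
    import numpy as np
    import pydicom

    plan_dose = pydicom.dcmread("dose_tps.dcm")           # TPS-calculated RTDOSE
    mc_dose   = pydicom.dcmread("dose_monte_carlo.dcm")   # independent RTDOSE

    # RTDOSE stores integers; DoseGridScaling converts to Gy.
    d1 = plan_dose.pixel_array * plan_dose.DoseGridScaling
    d2 = mc_dose.pixel_array * mc_dose.DoseGridScaling

    print(f"mean dose difference: {np.mean(d1 - d2):.3f} Gy")
    ```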

  7. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  8. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has been improved, a smart card employing 32-bit processors has been released, and more personal information such as medical, financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  9. Comparison of megavoltage position verification for prostate irradiation based on bony anatomy and implanted fiducials

    International Nuclear Information System (INIS)

    Nederveen, Aart J.; Dehnad, Homan; Heide, Uulke A. van der; Moorselaar, R. Jeroen A. van; Hofman, Pieter; Lagendijk, Jan J.W.

    2003-01-01

    Purpose: The patient position during radiotherapy treatment of prostate cancer can be verified with the help of portal images acquired during treatment. In this study we quantify the clinical consequences of image-based verification based on the bony anatomy and on the prostate target itself. Patients and methods: We analysed 2025 portal images and 23 computed tomography (CT) scans from 23 patients with prostate cancer. In all patients gold markers were implanted prior to CT scanning. Statistical data for both random and systematic errors were calculated for displacements of bones and markers, and we investigated the effectiveness of an off-line correction protocol. Results: Standard deviations for systematic marker displacement are 2.4 mm in the lateral (LR) direction, 4.4 mm in the anterior-posterior (AP) direction and 3.7 mm in the caudal-cranial (CC) direction. Application of off-line position verification based on the marker positions results in a shrinkage of the systematic error to well below 1 mm. Position verification based on the bony anatomy reduces the systematic target uncertainty to 50% in the AP direction and in the LR direction. No reduction was observed in the CC direction. For six out of 23 patients we found an increase of the systematic error after application of bony anatomy-based position verification. Conclusions: We show that even if correction based on the bony anatomy is applied, considerable margins have to be set to account for organ motion. Our study highlights that for individual patients the systematic error can increase after application of bony anatomy-based position verification, whereas the population standard deviation will decrease. Off-line target-based position verification effectively reduces the systematic error to well below 1 mm, thus enabling significant margin reduction.
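
    The random/systematic split reported here is conventionally computed from per-fraction displacements as in the Python sketch below (toy data, not the study's measurements): each patient's mean displacement is that patient's systematic error, Sigma is the standard deviation of those means over patients, and sigma is the RMS of the per-patient standard deviations.

    ```python
    import numpy as np

    displacements = {                 # mm, one direction, per patient
        "p1": [2.1, 3.0, 2.5, 2.8],
        "p2": [-1.0, -0.2, -0.8, -1.4],
        "p3": [0.5, 1.1, 0.2, 0.9],
    }

    means = [np.mean(v) for v in displacements.values()]
    sds   = [np.std(v, ddof=1) for v in displacements.values()]
    Sigma = np.std(means, ddof=1)               # systematic error SD
    sigma = np.sqrt(np.mean(np.square(sds)))    # random error (RMS of SDs)
    print(f"Sigma={Sigma:.2f} mm, sigma={sigma:.2f} mm")
    ```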

  10. A Feature Subtraction Method for Image Based Kinship Verification under Uncontrolled Environments

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    The most fundamental problem of local feature based kinship verification methods is that a local feature can capture the variations of environmental conditions and the differences between two persons having a kin relation, which can significantly decrease the performance. To address this problem...... the feature distance between face image pairs with kinship and maximize the distance between non-kinship pairs. Based on the subtracted feature, the verification is realized through a simple Gaussian based distance comparison method. Experiments on two public databases show that the feature subtraction method...

  11. Design of Service Net based Correctness Verification Approach for Multimedia Conferencing Service Orchestration

    Directory of Open Access Journals (Sweden)

    Cheng Bo

    2012-02-01

    Multimedia conferencing is increasingly becoming a very important and popular application over the Internet. The complexity of asynchronous communications and the handling of large, dynamically concurrent processes in multimedia conferencing make sufficient correctness guarantees hard to achieve, so supporting effective verification methods for multimedia conferencing service orchestration is an extremely difficult and challenging problem. In this paper, we first present the Business Process Execution Language (BPEL) based conferencing service orchestration, and mainly focus on the service net based correctness verification approach for multimedia conferencing service orchestration, which automatically translates the BPEL based service orchestration into a corresponding Petri net model with the Petri Net Markup Language (PNML). We also present the BPEL service net reduction rules and the correctness verification algorithms for multimedia conferencing service orchestration. We perform the correctness analysis and verification using service net properties such as safeness, reachability and deadlocks, and also provide an automated support tool for the formal analysis and soundness verification of multimedia conferencing service orchestration scenarios. Finally, we give comparisons and evaluations.

  12. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    Science.gov (United States)

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  13. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

    The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do not perform well in this setting. In this work we compare the performance of different noise reduction methods under different noise conditions in terms of speaker verification when the text is known and the system is trained on clean data (mis-matched conditions). We furthermore propose a new approach based on dictionary-based noise reduction and compare it to the baseline methods.
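
    For reference, the classical spectral subtraction baseline mentioned here can be sketched in a few lines of Python (toy signal and noise estimate, not the paper's dictionary-based method): a noise magnitude spectrum, estimated from a noise-only segment, is subtracted from each frame's magnitude spectrum, with a spectral floor to limit musical noise.

    ```python
    import numpy as np

    def spectral_subtract(frame, noise_mag, floor=0.01):
        spec = np.fft.rfft(frame * np.hanning(len(frame)))
        mag, phase = np.abs(spec), np.angle(spec)
        clean_mag = np.maximum(mag - noise_mag, floor * mag)  # spectral floor
        return np.fft.irfft(clean_mag * np.exp(1j * phase), n=len(frame))

    rng = np.random.default_rng(0)
    noise = 0.1 * rng.standard_normal(256)                    # noise-only segment
    noisy = np.sin(2 * np.pi * 0.05 * np.arange(256)) + noise
    noise_mag = np.abs(np.fft.rfft(noise * np.hanning(256)))  # noise estimate
    print(spectral_subtract(noisy, noise_mag)[:4])
    ```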

  14. Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation

    Science.gov (United States)

    2017-12-30

    Report AFRL-RV-PS-TR-2018-0008, Development of a Torque Sensor-Based Test Bed for Attitude Control System Verification and Validation; author: Norman Fitz-Coy; contract FA9453-15-1-0315.

  15. E-Visas Verification Schemes Based on Public-Key Infrastructure and Identity Based Encryption

    OpenAIRE

    Najlaa A. Abuadhmah; Muawya Naser; Azman Samsudin

    2010-01-01

    Problem statement: A visa is a very important travelling document, which is an essential need at the point of entry of any country being visited. However, such an important document is still handled manually, which affects the accuracy and efficiency of visa processing. Work on e-visas is almost unexplored. Approach: This study provided a detailed description of a newly proposed e-visa verification system prototyped based on RFID technology. The core technology of the proposed e-visa...

  16. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware related benefits. In FPGA design verification, designers generally write test benches involving verification activities at the register-transfer level (RTL), gate level, and place-and-route stages. Writing test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results, and performing verification at each stage is a major bottleneck that demands many activities and much time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to an FPGA-based I and C system in an NPP that simultaneously verifies the whole set of design modules using MATLAB/Simulink HDL co-simulation models, and we discuss how this technique can facilitate the verification processes. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  17. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Heo, Gyunyoung; Jung, Jaecheon

    2016-01-01

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have currently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware related benefits. In FPGA design verification, designers generally write test benches involving verification activities at the register-transfer level (RTL), gate level, and place-and-route stages. Writing test benches is considerably time consuming and requires a lot of effort to achieve satisfactory results, and performing verification at each stage is a major bottleneck that demands many activities and much time. Verification is conceivably the most difficult and complicated aspect of any design, and an FPGA design is no exception. Therefore, this work applied an integrated verification approach to an FPGA-based I and C system in an NPP that simultaneously verifies the whole set of design modules using MATLAB/Simulink HDL co-simulation models, and we discuss how this technique can facilitate the verification processes. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  18. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-based system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  19. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  20. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  1. Online 3D EPID-based dose verification: Proof of concept.

    Science.gov (United States)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took
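
    A minimal sketch of the verification step described in this record, assuming numpy arrays for the planned and reconstructed 3D dose grids; the 10% tolerances and the halting rule are illustrative assumptions, not the paper's values.

        import numpy as np

        def verify_dose(planned, reconstructed, target_mask, tol_mean=0.10, tol_d2=0.10):
            """Compare reconstructed with planned dose; True means delivery may continue.

            planned, reconstructed : 3D dose grids (Gy); target_mask : boolean 3D array.
            The non-target volume is restricted to voxels receiving at least
            0.10 Gy (10 cGy), mirroring the record. Tolerances are illustrative.
            """
            nontarget = (~target_mask) & (planned >= 0.10)

            # Mean-dose check in both volumes.
            for mask in (target_mask, nontarget):
                dp, dr = planned[mask].mean(), reconstructed[mask].mean()
                if abs(dr - dp) > tol_mean * dp:
                    return False

            # Near-maximum dose D2 (dose to the hottest 2%) in the non-target volume.
            d2p = np.percentile(planned[nontarget], 98)
            d2r = np.percentile(reconstructed[nontarget], 98)
            return abs(d2r - d2p) <= tol_d2 * d2p

        # Tiny demo on a synthetic grid; a linac-halt loop would call verify_dose()
        # after each reconstructed EPID frame and interrupt as soon as it fails.
        planned = np.full((10, 10, 10), 2.0)
        target = np.zeros_like(planned, dtype=bool)
        target[3:7, 3:7, 3:7] = True
        print(verify_dose(planned, planned * 1.02, target))   # True: within tolerance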

  2. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for design and verification of Field Programmable Gate Arrays (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C systems design, development, verification and validation (V and V) are discussed considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic design for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts, related to design and implementation activities, are outlined. An approach to test-based verification of FPGA electronic design algorithms, used in FPGA-based reactor trip systems, is proposed. The results of application of test-based techniques for assessment of FPGA electronic design algorithms for the reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  3. A method of knowledge base verification for nuclear power plant expert systems using extended Petri Nets

    International Nuclear Information System (INIS)

    Kwon, I. W.; Seong, P. H.

    1996-01-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. The verification phase of the knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of the verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike previous studies that assume error incidence to be limited to rule pairs only. In addition, we consider certainty factors in the checking, because most knowledge bases have certainty factors. 8 refs., 2 figs., 4 tabs. (author)
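
    The chained-error idea, errors that only appear when rules are composed and which pairwise checks miss, can be sketched as reachability over a fact-dependency graph. This is a simplified stand-in for COKEP's extended-Petri-net formalization, detecting one class of chained error (circular inference):

        def circular_facts(rules):
            """Return facts that can (transitively) re-derive themselves.

            rules : list of (premises, conclusion) pairs,
                    e.g. [({"a"}, "b"), ({"b"}, "a")].
            """
            graph = {}
            for premises, conclusion in rules:
                for p in premises:
                    graph.setdefault(p, set()).add(conclusion)

            def reachable(start):
                seen, stack = set(), [start]
                while stack:
                    for nxt in graph.get(stack.pop(), ()):
                        if nxt not in seen:
                            seen.add(nxt)
                            stack.append(nxt)
                return seen

            return [f for f in graph if f in reachable(f)]

        # A three-rule cycle that no rule-pair check would flag:
        print(circular_facts([({"a"}, "b"), ({"b"}, "c"), ({"c"}, "a")]))  # ['a', 'b', 'c']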

  4. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    Directory of Open Access Journals (Sweden)

    Jingzhen Li

    2017-01-01

    Full Text Available In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer’s forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.

  5. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.

    Science.gov (United States)

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-10

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer's forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.

  6. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    Science.gov (United States)

    Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer’s forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
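
    A sketch of the threshold-adaptive template matching step, assuming per-frequency inverse-variance weights and a mean-plus-k-sigma threshold rule; both choices are assumptions, since the record does not give the exact formulation.

        import numpy as np

        def weighted_euclidean(x, template, w):
            return np.sqrt(np.sum(w * (x - template) ** 2))

        def enroll(samples, k=3.0):
            """Build a template and an adaptive threshold from enrollment samples.

            samples : (n, d) array of S21 measurements (d = 21 points in the record).
            The threshold adapts to enrollment variability: mean intra-class
            distance plus k standard deviations (an assumption, not the paper's rule).
            """
            samples = np.asarray(samples, dtype=float)
            template = samples.mean(axis=0)
            w = 1.0 / (samples.var(axis=0) + 1e-9)   # down-weight unstable frequencies
            dists = [weighted_euclidean(s, template, w) for s in samples]
            threshold = np.mean(dists) + k * np.std(dists)
            return template, w, threshold

        def verify(sample, template, w, threshold):
            return weighted_euclidean(np.asarray(sample, float), template, w) <= threshold

        # Demo with synthetic enrollment curves around -40 dB:
        rng = np.random.default_rng(0)
        enrol = rng.normal(0.0, 0.1, size=(30, 21)) - 40.0
        template, w, thr = enroll(enrol)
        print(verify(enrol[0], template, w, thr))   # True for a genuine sample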

  7. The research for the design verification of nuclear power plant based on VR dynamic plant

    International Nuclear Information System (INIS)

    Wang Yong; Yu Xiao

    2015-01-01

    This paper studies a new method of design verification through the VR plant, in order to perform verification and validation the design of plant conform to the requirements of accident emergency. The VR dynamic plant is established by 3D design model and digital maps that composed of GIS system and indoor maps, and driven by the analyze data of design analyzer. The VR plant could present the operation conditions and accident conditions of power plant. This paper simulates the execution of accident procedures, the development of accidents, the evacuation planning of people and so on, based on VR dynamic plant, and ensure that the plant design will not cause bad effect. Besides design verification, simulated result also can be used for optimization of the accident emergency plan, the training of accident plan and emergency accident treatment. (author)

  8. An Improved Constraint-Based System for the Verification of Security Protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov [30]. Our system features (1) a significantly more efficient implementation, and (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs

  9. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2013-09-12

    ... developed CBSV as a user-friendly, internet-based application with safeguards that protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to users in a secure manner, CBSV provides us with cost and workload management benefits. New Information...

  10. An Improved Constraint-based system for the verification of security protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, and (2) a monotonic behavior, which also allows the detection of flaws associated with partial runs

  11. M3 version 3.0: Verification and validation; Hydrochemical model of ground water at repository site

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Javier B. (Dept. of Earth Sciences, Univ. of Zaragoza, Zaragoza (Spain)); Laaksoharju, Marcus (Geopoint AB, Sollentuna (Sweden)); Skaarman, Erik (Abscondo, Bromma (Sweden)); Gurban, Ioana (3D-Terra (Canada))

    2009-01-15

    Hydrochemical evaluation is a complex type of work that is carried out by specialists. The outcome of this work is generally presented as qualitative models and process descriptions of a site. To support and help to quantify the processes in an objective way, a multivariate mathematical tool entitled M3 (Multivariate Mixing and Mass balance calculations) has been constructed. The computer code can be used to trace the origin of the groundwater, and to calculate the mixing proportions and mass balances from groundwater data. The M3 code is a groundwater response model, which means that changes in the groundwater chemistry in terms of sources and sinks are traced in relation to an ideal mixing model. The complexity of the measured groundwater data determines the configuration of the ideal mixing model. Deviations from the ideal mixing model are interpreted as being due to reactions. Assumptions concerning important mineral phases altering the groundwater or uncertainties associated with thermodynamic constants do not affect the modelling because the calculations are solely based on the measured groundwater composition. M3 uses the opposite approach to that of many standard hydrochemical models. In M3, mixing is evaluated and calculated first. The constituents that cannot be described by mixing are described by reactions. The M3 model consists of three steps: the first is a standard principal component analysis, followed by mixing and finally mass balance calculations. The measured groundwater composition can be described in terms of mixing proportions (%), while the sinks and sources of an element associated with reactions are reported in mg/L. This report contains a set of verification and validation exercises with the intention of building confidence in the use of the M3 methodology. At the same time, clear answers are given to questions related to the accuracy and the precision of the results, including the inherent uncertainties and the errors that can be made
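
    The mixing step can be sketched as constrained least squares: find non-negative proportions summing to one that best reproduce a sample from reference end-member waters, with the unexplained residual attributed to reactions. M3's actual pipeline runs a principal component analysis first; this simplified solver (SciPy's SLSQP) is an assumption.

        import numpy as np
        from scipy.optimize import minimize

        def mixing_proportions(sample, end_members):
            """Mixing proportions of k reference waters best explaining one sample.

            sample      : (m,) element concentrations of the observed groundwater.
            end_members : (k, m) matrix of reference-water compositions.
            Proportions are constrained to be non-negative and sum to 1; whatever
            the mixture cannot explain is the residual ("reactions" in M3 terms).
            """
            E, y = np.asarray(end_members, float), np.asarray(sample, float)
            k = E.shape[0]
            loss = lambda p: np.sum((p @ E - y) ** 2)
            res = minimize(loss, np.full(k, 1.0 / k),
                           bounds=[(0.0, 1.0)] * k,
                           constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}],
                           method="SLSQP")
            residual = y - res.x @ E   # mg/L attributed to sources/sinks
            return res.x, residual

        # Example: a sample that is a 70/30 mix of two reference waters.
        E = np.array([[100.0, 10.0], [20.0, 50.0]])
        props, resid = mixing_proportions(0.7 * E[0] + 0.3 * E[1], E)
        print(props.round(3))   # approximately [0.7, 0.3]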

  12. Integration of a satellite ground support system based on analysis of the satellite ground support domain

    Science.gov (United States)

    Pendley, R. D.; Scheidker, E. J.; Levitt, D. S.; Myers, C. R.; Werking, R. D.

    1994-11-01

    This analysis defines a complete set of ground support functions based on those practiced in real space flight operations during the on-orbit phase of a mission. These functions are mapped against ground support functions currently in use by NASA and DOD. Software components to provide these functions can be hosted on RISC-based work stations and integrated to provide a modular, integrated ground support system. Such modular systems can be configured to provide as much ground support functionality as desired. This approach to ground systems has been widely proposed and prototyped both by government institutions and commercial vendors. The combined set of ground support functions we describe can be used as a standard to evaluate candidate ground systems. This approach has also been used to develop a prototype of a modular, loosely-integrated ground support system, which is discussed briefly. A crucial benefit to a potential user is that all the components are flight-qualified, thus giving high confidence in their accuracy and reliability.

  13. SU-E-T-762: Toward Volume-Based Independent Dose Verification as Secondary Check

    International Nuclear Information System (INIS)

    Tachibana, H; Tachibana, R

    2015-01-01

    Purpose: Lung SBRT planning has shifted to the volume prescription technique. However, point dose agreement is still verified using independent dose verification at the secondary check. Volume dose verification is more strongly affected by the inhomogeneity correction than the point dose verification currently used as the check. A feasibility study of volume dose verification was conducted for lung SBRT plans. Methods: Six SBRT plans were collected at our institute. Two dose distributions, with and without inhomogeneity correction, were generated using Adaptive Convolve (AC) in Pinnacle3. Simple MU Analysis (SMU, Triangle Product, Ishikawa, JP) was used as the independent dose verification software program, in which a modified Clarkson-based algorithm is implemented and the radiological path length is computed from the CT images independently of the treatment planning system. The agreement in point dose and mean dose between the AC with/without the correction and the SMU was assessed. Results: In the point dose evaluation at the center of the GTV, the comparison with the AC with the inhomogeneity correction showed a systematic shift (4.5% ± 1.9%), whereas there was good agreement of 0.2 ± 0.9% between the SMU and the AC without the correction. In the volume evaluation, there were significant differences in mean dose not only for the PTV (14.2 ± 5.1%) but also for the GTV (8.0 ± 5.1%) compared to the AC with the correction. Without the correction, the SMU showed good agreement for the GTV (1.5 ± 0.9%) as well as the PTV (0.9% ± 1.0%). Conclusion: Volume evaluation as a secondary check may be possible in homogeneous regions. However, volumes including inhomogeneous media would produce larger discrepancies. The dose calculation algorithm for independent verification needs to be modified to take the inhomogeneity correction into account
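
    A sketch of the radiological path length computation the record attributes to SMU, a density-weighted line integral through the CT-derived relative electron densities, using nearest-neighbour sampling on a 1 mm grid (the sampling scheme is an assumption).

        import numpy as np

        def radiological_path_length(density, start, end, step=0.5):
            """Density-weighted path length (water-equivalent mm) along a ray.

            density : 3D array of relative electron densities on a 1 mm grid.
            start, end : physical endpoints in mm, as (z, y, x) triples.
            Samples the volume every `step` mm with nearest-neighbour lookup.
            """
            start, end = np.asarray(start, float), np.asarray(end, float)
            length = np.linalg.norm(end - start)
            n = max(int(length / step), 1)
            ts = (np.arange(n) + 0.5) / n
            pts = start + ts[:, None] * (end - start)        # sample points along ray
            idx = np.clip(np.rint(pts).astype(int), 0,
                          np.array(density.shape) - 1)        # nearest voxel, clamped
            samples = density[idx[:, 0], idx[:, 1], idx[:, 2]]
            return samples.sum() * (length / n)

        # Uniform water phantom: radiological depth equals geometric depth.
        water = np.ones((50, 50, 50))
        print(radiological_path_length(water, (0, 25, 25), (40, 25, 25)))  # ~40.0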

  14. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    International Nuclear Information System (INIS)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H

    2015-01-01

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on the TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on the TD read-out compared to dose computation (mean gamma value of the PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan

  15. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H; Wertz, H [University Medical Center Mannheim, University of Heidelberg, Mannheim, Baden-Wuerttemberg (Germany)

    2015-06-15

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on the TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on the TD read-out compared to dose computation (mean gamma value of the PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan
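
    A brute-force sketch of the 2D global gamma evaluation behind the reported pass rates (dose difference combined with distance-to-agreement), without the interpolation and search optimizations a clinical implementation would use.

        import numpy as np

        def gamma_pass_rate(ref, eval_, spacing=1.0, dd=0.03, dta=3.0, cutoff=0.1):
            """Global 2D gamma pass rate (e.g. 3%/3 mm), brute force, no interpolation.

            ref, eval_ : 2D dose arrays on the same grid; spacing in mm.
            Points below `cutoff` * max(ref) are excluded from scoring.
            """
            dd_abs = dd * ref.max()
            ny, nx = ref.shape
            yy, xx = np.mgrid[0:ny, 0:nx] * spacing
            passed, total = 0, 0
            for iy in range(ny):
                for ix in range(nx):
                    if ref[iy, ix] < cutoff * ref.max():
                        continue
                    dist2 = (yy - iy * spacing) ** 2 + (xx - ix * spacing) ** 2
                    gamma2 = dist2 / dta**2 + (eval_ - ref[iy, ix]) ** 2 / dd_abs**2
                    total += 1
                    passed += gamma2.min() <= 1.0   # gamma <= 1 iff gamma^2 <= 1
            return passed / total

        # Identical distributions give a 100% pass rate.
        d = np.outer(np.hanning(32), np.hanning(32))
        print(gamma_pass_rate(d, d))  # 1.0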

  16. A two-region simulation model of vertical U-tube ground heat exchanger and its experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Weibo; Liu, Guangyuan [School of Energy and Power Engineering, Yangzhou University, Yangzhou City (China); Shi, Mingheng; Chen, Zhenqian [School of Energy and Environment, Southeast University, Nanjing City (China)

    2009-10-15

    Heat transfer around a vertical ground heat exchanger (GHE) is a common problem in the design and simulation of ground coupled heat pumps (GCHPs). In this paper, an updated two-region vertical U-tube GHE analytical model, suitable for system dynamic simulation of a GCHP, is proposed and developed. It divides the heat transfer region of the GHE into two parts at the boundary of the borehole wall, and the two regions are coupled by the borehole wall temperature. Steady-state and transient heat transfer methods are used to analyze the heat transfer processes inside and outside the borehole, respectively. The transient borehole wall temperature is calculated for the soil region outside the borehole by means of a variable heat flux cylindrical source model. For the region inside the borehole, considering the variation of fluid temperature along the borehole length and the thermal interference between the two adjacent legs of the U-tube, a quasi-three-dimensional steady-state heat transfer analytical model of the borehole is developed based on element energy conservation. The implementation of the model in the dynamic simulation of GCHPs is illustrated in detail and an application example is presented. Experimental validation of the model was performed in a solar-geothermal multifunctional heat pump experiment system with two vertical boreholes, each with a 30 m vertical 1 1/4 in nominal diameter HDPE single U-tube GHE; the results indicate that the fluid outlet temperatures of the GHE calculated by the model agree well with the corresponding test data, with a relative error of less than 6%. (author)
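
    The soil-side response in this record uses a variable-heat-flux cylindrical source model; a common simpler stand-in is the classical infinite line source, sketched here for a constant extraction rate (all parameter values are illustrative, not the paper's).

        import numpy as np
        from scipy.special import exp1

        def line_source_temperature(q, t, r=0.055, k=2.0, alpha=1e-6, t0=15.0):
            """Borehole-wall temperature from the infinite line-source model.

            q     : heat extraction rate per unit borehole length (W/m, >0 = extraction)
            t     : elapsed time (s)
            r     : borehole radius (m); k : soil conductivity (W/m.K)
            alpha : soil thermal diffusivity (m^2/s); t0 : undisturbed ground temp (degC)
            The record's model uses a variable-heat-flux cylindrical source instead;
            the line source is the classical first approximation.
            """
            return t0 - q / (4 * np.pi * k) * exp1(r**2 / (4 * alpha * t))

        # Wall temperature after 30 days of extracting 30 W/m:
        print(round(line_source_temperature(30.0, 30 * 86400), 2))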

  17. Biomass burning aerosols characterization from ground based and profiling measurements

    Science.gov (United States)

    Marin, Cristina; Vasilescu, Jeni; Marmureanu, Luminita; Ene, Dragos; Preda, Liliana; Mihailescu, Mona

    2018-04-01

    The goal of this study is to assess the chemical and optical properties of aerosols present in lofted layers and at the ground. Biomass burning aerosols in low-level layers were evaluated from multi-wavelength lidar measurements, while the chemical composition at the ground was assessed using an Aerosol Chemical Speciation Monitor (ACSM) and an Aethalometer. Aerosol type classification and specific organic markers were used to explore the potential to sense particles of the same origin at the ground and on profiles.

  18. Evaluation of DVH-based treatment plan verification in addition to gamma passing rates for head and neck IMRT

    International Nuclear Information System (INIS)

    Visser, Ruurd; Wauben, David J.L.; Groot, Martijn de; Steenbakkers, Roel J.H.M.; Bijl, Henk P.; Godart, Jeremy; Veld, Aart A. van’t; Langendijk, Johannes A.; Korevaar, Erik W.

    2014-01-01

    Background and purpose: Treatment plan verification of intensity modulated radiotherapy (IMRT) is generally performed with the gamma index (GI) evaluation method, which is difficult to extrapolate to clinical implications. Incorporating Dose Volume Histogram (DVH) information can compensate for this. The aim of this study was to evaluate DVH-based treatment plan verification in addition to the GI evaluation method for head and neck IMRT. Materials and methods: Dose verifications of 700 consecutive head and neck cancer IMRT treatment plans were categorised according to gamma- and DVH-based action levels. Fractionation-dependent absolute dose limits were chosen. The results of the gamma- and DVH-based evaluations were compared to the decision of the medical physicist and/or radiation oncologist for plan acceptance. Results: Nearly all treatment plans (99.7%) were accepted for treatment according to the GI evaluation combined with DVH-based verification. Two treatment plans were re-planned according to DVH-based verification, which would have been accepted using the GI evaluation alone. DVH-based verification increased insight into dose delivery to patient-specific structures, increasing confidence that the treatment plans were clinically acceptable. Moreover, DVH-based action levels clearly distinguished the roles of the medical physicist and radiation oncologist within the Quality Assurance (QA) procedure. Conclusions: DVH-based treatment plan verification complements the GI evaluation method, improving head and neck IMRT-QA

  19. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Belaguer, Sandie; David, Alexandre

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA-modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one

  20. Ground-based measurements of ionospheric dynamics

    Science.gov (United States)

    Kouba, Daniel; Chum, Jaroslav

    2018-05-01

    Different methods are used to research and monitor the ionospheric dynamics using ground measurements: Digisonde Drift Measurements (DDM) and Continuous Doppler Sounding (CDS). For the first time, we present comparison between both methods on specific examples. Both methods provide information about the vertical drift velocity component. The DDM provides more information about the drift velocity vector and detected reflection points. However, the method is limited by the relatively low time resolution. In contrast, the strength of CDS is its high time resolution. The discussed methods can be used for real-time monitoring of medium scale travelling ionospheric disturbances. We conclude that it is advantageous to use both methods simultaneously if possible. The CDS is then applied for the disturbance detection and analysis, and the DDM is applied for the reflection height control.

  1. Intelligent Tools for Planning Knowledge base Development and Verification

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  2. Streaming-based verification of XML signatures in SOAP messages

    DEFF Research Database (Denmark)

    Somorovsky, Juraj; Jensen, Meiko; Schwenk, Jörg

    2010-01-01

    approach for XML processing, the Web Services servers easily become a target of Denial-of-Service attacks. We present a solution for these problems: an external streaming-based WS-Security Gateway. Our implementation is capable of processing XML Signatures in SOAP messages using a streaming-based approach...
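
    The streaming principle can be sketched with a SAX parser that folds parse events into a running digest, so the message is never materialized as a DOM tree. This is only the streaming skeleton: a real XML Signature check additionally requires canonicalization (C14N) and verification of the signature value itself, which are omitted here.

        import hashlib
        import xml.sax

        class StreamingDigest(xml.sax.ContentHandler):
            """Fold SAX events into a running hash while parsing.

            Illustrates the streaming idea behind the record's gateway: the
            document is digested event by event instead of being built as a tree.
            """
            def __init__(self):
                self.h = hashlib.sha256()

            def startElement(self, name, attrs):
                self.h.update(("<" + name).encode())
                for k in sorted(attrs.getNames()):
                    self.h.update(f' {k}="{attrs.getValue(k)}"'.encode())
                self.h.update(b">")

            def characters(self, content):
                self.h.update(content.encode())

            def endElement(self, name):
                self.h.update(f"</{name}>".encode())

        handler = StreamingDigest()
        xml.sax.parseString(b"<soap:Body xmlns:soap='x'><a>1</a></soap:Body>", handler)
        print(handler.h.hexdigest())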

  3. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    Science.gov (United States)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (VV), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface/ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer and tester, hardware models are built based upon characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between the FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for VV of larger missions such as the James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For

  4. [Verification of Learning Effects by Team-based Learning].

    Science.gov (United States)

    Ono, Shin-Ichi; Ito, Yoshihisa; Ishige, Kumiko; Inokuchi, Norio; Kosuge, Yasuhiro; Asami, Satoru; Izumisawa, Megumi; Kobayashi, Hiroko; Hayashi, Hiroyuki; Suzuki, Takashi; Kishikawa, Yukinaga; Hata, Harumi; Kose, Eiji; Tabata, Kei-Ichi

    2017-11-01

     It has been recommended by the Central Council for Education that active learning methods, such as team-based learning (TBL) and problem-based learning (PBL), be introduced into university classes. As such, for the past 3 years, we have implemented TBL in a medical therapeutics course for 4th-year students. Based upon our experience, TBL is characterized as follows: TBL needs fewer teachers than PBL to conduct a TBL module. TBL enables both students and teachers to recognize and confirm the learning results from preparation and review. TBL develops students' responsibility for themselves and their teams, and likely facilitates learning activities through peer assessment.

  5. Feasibility of biochemical verification in a web-based smoking cessation study.

    Science.gov (United States)

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

    Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance of self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Geothermal-resource verification for Air Force bases

    Energy Technology Data Exchange (ETDEWEB)

    Grant, P.R. Jr.

    1981-06-01

    This report summarizes the various types of geothermal energy, reviews some legal uncertainties of the resource, and then describes a methodology to evaluate geothermal resources for application to US Air Force bases. Estimates suggest that exploration costs will be $50,000 to $300,000, which, if favorable, would lead to drilling a $500,000 exploration well. Successful identification and development of a geothermal resource could provide for all of a base's fixed system needs with an inexpensive, renewable energy source.

  7. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
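
    The pattern-as-logical-primitive idea can be sketched as functions that emit temporal-logic formulas for workflow patterns and conjoin them into a specification; the two patterns and their LTL encodings below are illustrative assumptions, not the system's actual pattern catalogue.

        def seq(a, b):
            """Sequence pattern: whenever a completes, b eventually follows."""
            return f"G({a} -> F({b}))"

        def xor_split(cond, a, b):
            """Exclusive choice: the condition selects exactly one branch."""
            return f"G(({cond} -> F({a})) & (!{cond} -> F({b}))) & G(!({a} & {b}))"

        def workflow_spec(patterns):
            """Conjoin pattern instances into one logical specification."""
            return " & ".join(patterns)

        spec = workflow_spec([
            seq("receive_order", "check_stock"),
            xor_split("in_stock", "ship_order", "notify_customer"),
        ])
        print(spec)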

  8. Time-Contrastive Learning Based DNN Bottleneck Features for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Sarkar, Achintya Kumar; Tan, Zheng-Hua

    2017-01-01

    In this paper, we present a time-contrastive learning (TCL) based bottleneck (BN) feature extraction method for speech signals with an application to text-dependent (TD) speaker verification (SV). It is well-known that speech signals exhibit quasi-stationary behavior in and only in a short interval, and the TCL method aims to exploit this temporal structure. More specifically, it trains deep neural networks (DNNs) to discriminate temporal events obtained by uniformly segmenting speech signals, in contrast to existing DNN based BN feature extraction methods that train DNNs using labeled data to discriminate speakers or pass-phrases or phones or a combination of them. In the context of speaker verification, speech data of fixed pass-phrases are used for TCL-BN training, while the pass-phrases used for TCL-BN training are excluded from being used for SV, so that the learned features can be considered
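
    The heart of TCL is its self-supervised label construction: each utterance is segmented uniformly and the segment index serves as the class target, so no speaker, phone, or pass-phrase labels are needed. A minimal sketch, assuming frame-level features:

        import numpy as np

        def tcl_labels(n_frames, n_segments):
            """Assign each frame the index of its uniform temporal segment.

            For a 300-frame utterance and 6 segments, frames 0-49 get class 0,
            frames 50-99 get class 1, and so on; these indices are the training
            targets for the bottleneck DNN.
            """
            seg_len = int(np.ceil(n_frames / n_segments))
            return np.minimum(np.arange(n_frames) // seg_len, n_segments - 1)

        print(tcl_labels(10, 3))  # [0 0 0 0 1 1 1 1 2 2]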

  9. Convex polyhedral abstractions, specialisation and property-based predicate splitting in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    We present an approach to constrained Horn clause (CHC) verification combining three techniques: abstract interpretation over a domain of convex polyhedra, specialisation of the constraints in CHCs using abstract interpretation of query-answer transformed clauses, and refinement by splitting… In conjunction with specialisation for propagating constraints it can frequently solve challenging verification problems. This is a contribution in itself, but refinement is needed when it fails, and the question of how to refine convex polyhedral analyses has not been studied much. We present a refinement technique based on interpolants derived from a counterexample trace; these are used to drive a property-based specialisation that splits predicates, leading in turn to more precise convex polyhedral analyses. The process of specialisation, analysis and splitting can be repeated, in a manner similar

  10. USB environment measurements based on full-scale static engine ground tests

    Science.gov (United States)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.

  11. Type-Based Automated Verification of Authenticity in Asymmetric Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten; Kobayashi, Naoki; Sun, Yunde

    2011-01-01

    Gordon and Jeffrey developed a type system for verification of asymmetric and symmetric cryptographic protocols. We propose a modified version of Gordon and Jeffrey's type system and develop a type inference algorithm for it, so that protocols can be verified automatically as they are, without any type annotations or explicit type casts. We have implemented a protocol verifier SpiCa based on the algorithm, and confirmed its effectiveness.

  12. BAVP: Blockchain-Based Access Verification Protocol in LEO Constellation Using IBE Keys

    OpenAIRE

    Wei, Songjie; Li, Shuai; Liu, Peilong; Liu, Meilin

    2018-01-01

    LEO constellation has received intensive research attention in the field of satellite communication. The existing centralized authentication protocols traditionally used for MEO/GEO satellite networks cannot accommodate LEO satellites with frequent user connection switching. This paper proposes a fast and efficient access verification protocol named BAVP by combining identity-based encryption and blockchain technology. Two different key management schemes with IBE and blockchain, respectively...

  13. KSC ADVANCED GROUND BASED FIELD MILL V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Ground Based Field Mill (AGBFM) network consists of 34 (31 operational) field mills located at Kennedy Space Center (KSC), Florida. The field mills...

  14. Ground Based Support for Exoplanet Space Missions

    Science.gov (United States)

    Haukka, H.; Hentunen, V.-P.; Salmi, T.; Aartolahti, H.; Juutilainen, J.; Vilokki, H.; Nissinen, M.

    2011-10-01

    Taurus Hill Observatory (THO), observatory code A95, is an amateur observatory located in Varkaus, Finland, maintained by the local astronomical association Warkauden Kassiopeia. The THO research team has observed and measured various stellar objects and phenomena. The observatory has mainly focused on asteroid [1] and exoplanet light curve measurements, gamma-ray burst observations, supernova discoveries and monitoring [2], and long-term monitoring projects [3]. In early 2011, Europlanet NA1 and NA2 organized the "Coordinated Observations of Exoplanets from Ground and Space" workshop in Graz, Austria. The workshop brought together pro-am astronomers who have the equipment to measure exoplanet light curves, as well as professional scientists working in the exoplanet field. The outcome of the workshop was a coordinated observation campaign for follow-up observations of exoplanets (e.g., CoRoT planets); a coordinated campaign to observe stellar CME outbreaks was also planned. THO has extensive experience in exoplanet light curve measurements, and the campaign is therefore strongly supported by the observatory's research team. In the coming observing seasons, THO will concentrate its efforts on campaigns of this kind.

  15. Verification Based on Set-Abstraction Using the AIF Framework

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander

    The AIF framework is a novel method for analyzing advanced security protocols, web services, and APIs, based on a new abstract interpretation method. It consists of the specification language AIF and a translation/abstraction process that produces a set of first-order Horn clauses. These can

  16. Verification of the Performance of a Vertical Ground Heat Exchanger Applied to a Test House in Melbourne, Australia

    Directory of Open Access Journals (Sweden)

    Koon Beng Ooi

    2017-10-01

    Full Text Available The ground heat exchanger is traditionally used as a heat source or sink for the heat pump that raises the temperature of water to about 50 °C to heat houses. However, in winter, the heating thermostat (the temperature at which heating begins) in the Australian Nationwide House Energy Rating Scheme (NatHERS) is only 20 °C during daytime and 15 °C at night. In South-East Melbourne, the temperature at the bottom of a 50-meter-deep borehole has been recorded with an Emerson™ recorder at 17 °C. Melbourne has an annual average temperature of 15 °C, so the ground temperature increases by 2 °C per 50 m of depth. A linear projection gives 23 °C at 200 m of depth, and hence as the average undisturbed ground temperature for a 400-m-deep vertical ground heat exchanger (VGHE). This study, by simulation and experimentation, aims to verify that the circulation of water from this VGHE's U-tube to low-temperature radiators (LTRs) could heat a house to thermal comfort. A literature review is included in the introduction. A simulation, using a model of a 60-m2 experimental house, shows that daytime circulation of water in this VGHE/LTR-on-opposite-walls system during the 8-month cold half of the year heats the indoors to NatHERS settings. Simulation for the hot half shows that this VGHE-LTR system could cool the indoors. Instead, a fan creating a cooling sensation of up to 4 °C is used, so that the VGHE is available for the regeneration of heat extracted from the ground during the cold portion. Simulations for this hot portion show that a 3.4-m2 flat plate solar collector can collect more than twice the heat extracted from the ground in the cold portion. It can thus replenish the ground heat extracted for houses double the size of this 60-m2 experimental house. Therefore, ground heat is sustainable for family-size homes. Since no heat pump is used, the cost of VGHE-LTR systems could be comparable to systems using the ground source heat pump. Water
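
    The 23 °C figure follows from a linear geothermal gradient using the record's own numbers (15 °C annual average at the surface, +2 °C per 50 m of depth); a worked check:

        def ground_temperature(depth_m, surface_c=15.0, gradient_c_per_m=2.0 / 50.0):
            """Linear projection of undisturbed ground temperature with depth."""
            return surface_c + gradient_c_per_m * depth_m

        print(ground_temperature(50))    # 17.0 degC, matching the borehole record
        print(ground_temperature(200))   # 23.0 degC, the 400 m VGHE midpoint average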

  17. Development and verification of symptom based emergency procedure support system

    International Nuclear Information System (INIS)

    Saijou, Nobuyuki; Sakuma, Akira; Takizawa, Yoji; Tamagawa, Naoko; Kubota, Ryuji; Satou, Hiroyuki; Ikeda, Koji; Taminami, Tatsuya

    1998-01-01

    A computerized Emergency Procedure Guideline (EPG) support system has been developed for BWRs and evaluated using a training simulator. It aims to enhance the effective utilization of the EPG. The system automatically identifies the symptom-based operating procedures suitable for the present plant status. It has two functions: a plant status identification function and a man-machine interface function. For the former, a method has been developed which identifies and prioritizes suitable symptom-based operating procedures for the present plant status. As the man-machine interface, an operation flow chart display has been developed, which expresses the flow of the identified operating procedures graphically. For easy understanding of the display, important information such as plant status changes, the priority of operating procedures, and the completion/non-completion of operations is displayed on the operation flow display in different colors. As an evaluation test, the response of the system to design basis accidents was evaluated by actual plant operators using the training simulator at the BWR Training Center. Analysis of interviews and questionnaires with the operators showed that the system is effective and can be utilized in a real plant. (author)
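
    A sketch of the identification-and-prioritization function described in this record, assuming plant status is encoded as a set of active symptoms (the encoding and the example procedures are hypothetical, chosen only to illustrate the matching):

        def identify_procedures(plant_status, procedures):
            """Rank symptom-based procedures applicable to the current plant status.

            plant_status : set of active symptoms, e.g. {"RPV_level_low"}
            procedures   : list of dicts with "name", "symptoms" (set) and
                           "priority" (lower = more urgent).
            Returns procedures whose symptoms are all present, most urgent first.
            """
            hits = [p for p in procedures if p["symptoms"] <= plant_status]
            return sorted(hits, key=lambda p: (p["priority"], -len(p["symptoms"])))

        procedures = [
            {"name": "RPV level control",   "symptoms": {"RPV_level_low"},         "priority": 1},
            {"name": "Containment control", "symptoms": {"drywell_pressure_high"}, "priority": 2},
        ]
        for p in identify_procedures({"RPV_level_low", "drywell_pressure_high"}, procedures):
            print(p["name"])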

  18. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    Science.gov (United States)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

    During the course of designing a time management algorithm for DVEs, researchers are often distracted from the algorithm itself by having to implement the trivial but fundamental details of simulation and verification. A platform that already realizes these details is therefore desirable. However, to our knowledge this has not been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is based on a newly designed causality consistency middleware and offers a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the implementation of the platform costs only a small overhead, and that its performance makes it highly effective for researchers who wish to focus solely on improving their algorithm designs.
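
    Of the three services compared, time-stamp order (TSO) is the most constrained; a minimal sketch of a TSO delivery queue follows (receive order would deliver in arrival order, and causal order according to causality constraints such as vector clocks). The interface is illustrative only, not the HLA API:

        import heapq

        class TSOQueue:
            """Minimal time-stamp-ordered delivery: release events only up to a
            granted lower bound on future timestamps, so every federate sees
            events in the same timestamp order."""

            def __init__(self):
                self._heap = []

            def receive(self, timestamp, event):
                heapq.heappush(self._heap, (timestamp, event))

            def deliver_until(self, granted_time):
                """Deliver every queued event with timestamp <= granted_time."""
                out = []
                while self._heap and self._heap[0][0] <= granted_time:
                    out.append(heapq.heappop(self._heap)[1])
                return out

        q = TSOQueue()
        q.receive(12.0, "fire")
        q.receive(5.0, "move")
        print(q.deliver_until(10.0))  # ['move'] -- 'fire' waits for a later grant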

  19. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need of using a physical gauge, therefore optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  20. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position in the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained prove the suitability of the virtual distances methodology as an alternative procedure for the verification of AACMMs using the indexed metrology platform.
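
    As an illustration of the virtual-distance idea, the following Python sketch builds virtual points from a single ball-bar sphere centre expressed in the arm's frame at the platform's six indexed positions, and derives reference distances between them; the coordinates and the simple rotation model are invented, not the platform's actual mathematical model.

      import numpy as np

      def rotz(deg):
          """Rotation about the platform's indexing axis."""
          t = np.radians(deg)
          return np.array([[np.cos(t), -np.sin(t), 0.0],
                           [np.sin(t),  np.cos(t), 0.0],
                           [0.0,        0.0,       1.0]])

      # Hypothetical sphere centre of the fixed ball-bar gauge (mm).
      sphere_centre = np.array([300.0, 120.0, 80.0])

      # Virtual points: the same physical point seen from the six indexed
      # platform positions (0, 60, ..., 300 degrees).
      virtual_points = [rotz(60.0 * k) @ sphere_centre for k in range(6)]

      # Any pair of virtual points defines a virtual reference distance.
      for i in range(1, 6):
          d = np.linalg.norm(virtual_points[i] - virtual_points[0])
          print(f"virtual distance 0-{i}: {d:.3f} mm")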

  1. Research on Linux Trusted Boot Method Based on Reverse Integrity Verification

    Directory of Open Access Journals (Sweden)

    Chenlin Huang

    2016-01-01

    Full Text Available Trusted computing aims to build a trusted computing environment for information systems with the help of the secure hardware TPM, which has proved to be an effective way to counter network security threats. However, TPM chips are not yet widely deployed in most computing devices, which limits the applicability of trusted computing technology. To solve the problem of missing trusted hardware on existing computing platforms, an alternative security hardware, USBKey, is introduced in this paper to simulate the basic functions of a TPM, and a new reverse USBKey-based integrity verification model is proposed to implement reverse integrity verification of the operating system boot process, achieving a trusted boot of the operating system on end systems without TPMs. A Linux operating system booting method based on reverse integrity verification is designed and implemented in this paper, with which the integrity of data and executable files in the operating system is verified and protected phase by phase during the trusted boot process. It implements trusted boot of the operating system without a TPM and supports remote attestation of the platform. Enhanced by our method, the flexibility of trusted computing technology is greatly improved, making it feasible to apply trusted computing in large-scale computing environments.
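
    The paper's model itself is not reproduced in the abstract; as a minimal sketch of phase-by-phase boot integrity checking, the Python code below verifies each boot component against a known-good hash before "executing" it, with hypothetical stage names and a local manifest standing in for the USBKey's protected storage.

      import hashlib

      def sha256(data: bytes) -> str:
          return hashlib.sha256(data).hexdigest()

      # Hypothetical boot stages; in the paper's model the reference digests
      # would be held by the USBKey, not in a local dictionary.
      STAGES = {
          "bootloader": b"bootloader image bytes",
          "kernel": b"kernel image bytes",
          "initrd": b"initrd image bytes",
      }
      MANIFEST = {name: sha256(data) for name, data in STAGES.items()}

      def trusted_boot(images: dict) -> bool:
          """Verify each stage before handing control to it; abort on mismatch."""
          for name, data in images.items():
              if sha256(data) != MANIFEST[name]:
                  print(f"integrity failure in {name}: boot aborted")
                  return False
              print(f"{name}: verified, executing")
          return True

      trusted_boot(STAGES)                            # all stages pass
      trusted_boot(dict(STAGES, kernel=b"tampered"))  # fails at the kernel stage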

  2. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed at doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB code. Results: Gamma index analysis of the film measurements against the TPS calculated dose gave high pass rates, >90% for a tolerance criterion of 1%/1 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions on the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film.
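
    Gamma index analysis itself is a standard technique; the simplified 1-D Python sketch below (invented profiles, the paper's 1%/1 mm criteria) shows how such a pass rate is computed from a reference and an evaluated dose profile.

      import numpy as np

      def gamma_pass_rate(dose_ref, dose_eval, dx_mm, dd_pct=1.0, dta_mm=1.0):
          """Brute-force 1-D global gamma: percent of points with gamma <= 1."""
          x = np.arange(len(dose_ref)) * dx_mm
          dd = dd_pct / 100.0 * dose_ref.max()   # global dose-difference criterion
          passed = 0
          for xi, di in zip(x, dose_ref):
              g2 = ((x - xi) / dta_mm) ** 2 + ((dose_eval - di) / dd) ** 2
              if np.sqrt(g2.min()) <= 1.0:
                  passed += 1
          return 100.0 * passed / len(dose_ref)

      # Invented example: the evaluated profile is the reference shifted by 0.5 mm.
      x = np.linspace(-20, 20, 401)              # 0.1 mm grid
      ref = 4.0 * np.exp(-x**2 / 50.0)           # Gaussian shot profile, Gy
      ev = 4.0 * np.exp(-(x - 0.5)**2 / 50.0)
      print(f"pass rate: {gamma_pass_rate(ref, ev, dx_mm=0.1):.1f}%")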

  3. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    International Nuclear Information System (INIS)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor

    2013-01-01

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two fractions planned. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed at doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB code. Results: Gamma index analysis of the film measurements against the TPS calculated dose gave high pass rates, >90% for a tolerance criterion of 1%/1 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions on the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film.

  4. Simulation-based design process for the verification of ITER remote handling systems

    International Nuclear Information System (INIS)

    Sibois, Romain; Määttä, Timo; Siuko, Mikko; Mattila, Jouni

    2014-01-01

    Highlights: •Verification and validation process for ITER remote handling systems. •Simulation-based design process for early verification of ITER RH systems. •Design process centralized around a simulation lifecycle management system. •Verification and validation roadmap for the digital modelling phase. -- Abstract: The work behind this paper takes place in EFDA's European Goal Oriented Training programme on Remote Handling (RH), “GOT-RH”. The programme aims to train engineers for activities supporting the ITER project and the long-term fusion programme. One of the projects of this programme focuses on the verification and validation (V and V) of ITER RH system requirements using digital mock-ups (DMUs). The purpose of this project is to study and develop an efficient approach to using DMUs in the V and V process of ITER RH system design, within a Systems Engineering (SE) framework. For complex engineering systems such as the ITER facilities, manufacturing a full-scale prototype entails a substantial rise in cost. In the V and V process for ITER RH equipment, physical tests are required to ensure that the system complies with its required operation; it is therefore essential to verify the developed system virtually before starting the prototype manufacturing phase. This paper gives an overview of current trends in the use of digital mock-ups within product design processes. It suggests a simulation-based design process centralized around a simulation lifecycle management system. The purpose of this paper is to describe possible improvements in the formalization of the ITER RH design process and its V and V processes, in order to increase their cost efficiency and reliability.

  5. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Juntendo University, Hongo, Tokyo (Japan); Hongo, H [Shonan Kamakura General Hospital, Kamakura, Kanagawa, (Japan); Tsukuba University, Tsukuba, Ibaraki (Japan); Kawai, D [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Takahashi, R [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Hashimoto, H [Shonan Fujisawa Tokushukai Hospital, Fujisawa, Kanagawa (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: There have been few reports on independent dose verification for Tomotherapy. We evaluated the accuracy and effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; it implements a Clarkson-based dose calculation algorithm using the CT image dataset. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan information was converted into MU and MLC location information at more finely segmented control points. The performance of the SMU was assessed by point dose measurements in non-IMRT and IMRT plans (a simple target plan and a mock prostate plan). Subsequently, 30 patients' treatment plans for prostate were compared. Results: Dose differences between the SMU and the measurements were within 3% for all cases in the non-IMRT plans. In the IMRT plan for the simple target, the differences (average ± 1 SD) were −0.70±1.10% (SMU vs. TPS), −0.40±0.10% (measurement vs. TPS) and −1.20±1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40±0.60% (SMU vs. TPS), −0.50±0.90% (measurement vs. TPS) and −0.90±0.60% (measurement vs. SMU), respectively. For the patients' plans, the difference was −0.50±2.10% (SMU vs. TPS). Conclusion: A Clarkson-based independent dose verification for Tomotherapy can be clinically available as a secondary check with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
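
    The SMU's implementation is not spelled out in the abstract; as a generic illustration of Clarkson-style sector integration, the Python sketch below averages scatter contributions over equal angular sectors of an irregular field, using an invented scatter-air-ratio lookup in place of measured machine data.

      import numpy as np

      def sar(radius_cm):
          """Invented scatter-air ratio vs. field radius; a real implementation
          would interpolate measured beam data."""
          return 0.30 * (1.0 - np.exp(-radius_cm / 8.0))

      def clarkson_scatter(edge_radii_cm):
          """Average the SAR over equal angular sectors; edge_radii_cm[i] is the
          distance from the calculation point to the field edge in sector i."""
          return float(np.mean([sar(r) for r in edge_radii_cm]))

      # Invented irregular field sampled in 36 sectors of 10 degrees each.
      angles = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
      radii = 5.0 + 2.0 * np.cos(2.0 * angles)   # roughly elliptical field edge
      print(f"mean scatter-air ratio: {clarkson_scatter(radii):.4f}")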

  6. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions to this second problem. However, to our knowledge, no previous work has proposed a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to secure the process of capturing and preserving digital evidence. PKIDEV employs, among others, cryptographic techniques such as digital signatures and secure time-stamping, as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when applied to the verification of digital evidence.
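
    PKIDEV's actual protocol is not given in the abstract; the Python sketch below illustrates only the integrity half of the idea, hashing the evidence and binding it to capture metadata, with an HMAC standing in for a PKI digital signature and a local timestamp standing in for secure time-stamping.

      import hashlib, hmac, json, time

      SIGNING_KEY = b"stand-in for the investigator's private key"

      def seal_evidence(payload: bytes, collector: str) -> dict:
          """Hash the evidence and sign a record of the hash plus metadata."""
          record = {
              "sha256": hashlib.sha256(payload).hexdigest(),
              "collector": collector,
              "captured_at": time.time(),  # a real system uses secure time-stamping
          }
          msg = json.dumps(record, sort_keys=True).encode()
          record["signature"] = hmac.new(SIGNING_KEY, msg, hashlib.sha256).hexdigest()
          return record

      def verify_evidence(payload: bytes, record: dict) -> bool:
          sig = record.pop("signature")
          msg = json.dumps(record, sort_keys=True).encode()
          record["signature"] = sig
          good = hmac.new(SIGNING_KEY, msg, hashlib.sha256).hexdigest()
          return hmac.compare_digest(sig, good) and \
                 record["sha256"] == hashlib.sha256(payload).hexdigest()

      rec = seal_evidence(b"disk image bytes", collector="officer-42")
      print(verify_evidence(b"disk image bytes", rec))   # True
      print(verify_evidence(b"tampered bytes", rec))     # False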

  7. Regional MLEM reconstruction strategy for PET-based treatment verification in ion beam radiotherapy

    International Nuclear Information System (INIS)

    Gianoli, Chiara; Riboldi, Marco; Fattori, Giovanni; Baselli, Giuseppe; Baroni, Guido; Bauer, Julia; Debus, Jürgen; Parodi, Katia; De Bernardi, Elisabetta

    2014-01-01

    In ion beam radiotherapy, PET-based treatment verification provides a consistency check of the delivered treatment against a simulation based on the treatment plan. In this work the region-based MLEM reconstruction algorithm is proposed as a new evaluation strategy for PET-based treatment verification. The comparative evaluation is based on reconstructed PET images in selected regions, which are automatically identified on the expected PET images according to homogeneity in activity values. The strategy was tested on numerical and physical phantoms, simulating mismatches between the planned and measured β+ activity distributions. The region-based MLEM reconstruction was demonstrated to be robust against noise, and the sensitivity of the strategy was comparable to three voxel units, corresponding to 6 mm in the numerical phantoms. The robustness of the region-based MLEM evaluation outperformed the voxel-based strategies. The potential of the proposed strategy was also retrospectively assessed on patient data, and further clinical validation is envisioned. (paper)
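
    The region-based variant is specific to the paper, but the underlying MLEM iteration is standard; for orientation, a compact numpy sketch of the classical voxel-based MLEM update on a toy 1-D system is shown below.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy system: 40 detector bins, 20 voxels, fixed random geometry.
      A = rng.uniform(0.0, 1.0, size=(40, 20))
      x_true = np.zeros(20)
      x_true[8:12] = 5.0                      # "activity" hot region
      y = rng.poisson(A @ x_true)             # measured counts

      # Classical MLEM: x <- x / (A^T 1) * A^T (y / (A x))
      x = np.ones(20)
      sens = A.T @ np.ones(len(y))            # sensitivity image
      for _ in range(50):
          proj = A @ x
          x *= (A.T @ (y / np.maximum(proj, 1e-12))) / sens

      print(np.round(x, 2))                   # estimate concentrates in voxels 8-11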

  8. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities relate both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  9. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident, and accurate estimates of their spacing, depth, and size were made. The potential uses for safeguards applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  10. The COROT ground-based archive and access system

    Science.gov (United States)

    Solano, E.; González-Riestra, R.; Catala, C.; Baglin, A.

    2002-01-01

    A prototype of the COROT ground-based archive and access system is presented here. The system has been developed at the Laboratorio de Astrofisica Espacial y Fisica Fundamental (LAEFF) and is based on the experience gained there with the INES (IUE Newly Extracted Spectra) Archive.

  11. Retrieval and analysis of atmospheric XCO2 using ground-based spectral observation.

    Science.gov (United States)

    Qin, Xiu-Chun; Lei, Li-Ping; Kawasaki, Masahiro; Masafumi, Ohashi; Takahiro, Kuroki; Zeng, Zhao-Cheng; Zhang, Bing

    2014-07-01

    Atmospheric XCO2 (the column-averaged dry air mole fraction of atmospheric carbon dioxide) data obtained by ground-based hyperspectral observation are an important source of data for the verification and improvement of CO2 retrievals based on satellite hyperspectral observation. However, few studies have been conducted on atmospheric CO2 column concentration retrieval from ground-based hyperspectral observation in China. In the present study, we carried out ground-based hyperspectral observations in the Xilingol Grassland, Inner Mongolia, China, using an observation system consisting of an optical spectrum analyzer, a sun tracker, and other elements. The atmospheric CO2 column concentration was retrieved from the observed hyperspectral data, and the effects of wavelength shifts in the observed spectra and of the meteorological parameters on the retrieval precision were evaluated and analyzed. The results show that the mean atmospheric CO2 column concentration was 390.9 microg x mL(-1) in the study area during the observation period from July to September. A wavelength shift in the range between -0.012 and 0.042 nm generally leads to a deviation of 1 microg x mL(-1) in the CO2 retrievals. This study also revealed that the spectral transmittance is sensitive to the meteorological parameters in the spectral ranges of 6 357-6 358, 6 360-6 361, and 6 363-6 364 cm(-1). By comparing the CO2 retrievals derived from meteorological parameters observed synchronously and non-synchronously with the spectral observations, it was shown that the concentration deviation caused by using non-synchronously observed meteorological parameters ranges from 0.11 to 4 microg x mL(-1). These results can serve as references for further improvement of CO2 column concentration retrieval based on spectral observation.

  12. Assertion based verification methodology for HDL designs of primary sodium pump speed and eddy current flow measurement systems of PFBR

    International Nuclear Information System (INIS)

    Misra, M.K.; Menon, Saritha P.; Thirugnana Murthy, D.

    2013-01-01

    With the growing complexity and size of digital designs, functional verification has become a huge challenge. The validation and testing process accounts for a significant percentage of the overall development effort and cost of electronic systems. Many studies have shown that up to 70% of design development time and resources are spent on functional verification. Functional errors manifest themselves very early in the design flow, and unless they are detected upfront, they can result in severe consequences, both financially and from a safety viewpoint. This paper covers the various types of verification methodologies and focuses on assertion-based verification methodology for HDL designs, taking as case studies the Primary Sodium Pump Speed and Eddy Current Flow Measurement Systems of PFBR. (author)
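
    Assertions for HDL designs are normally written in languages such as SystemVerilog Assertions or PSL; purely to illustrate the idea in this document's sketches, the Python code below checks a temporal property (every request must be acknowledged within three clock cycles) over a recorded simulation trace with invented signal names.

      # Hypothetical trace: one dict of signal values per clock cycle.
      trace = [
          {"req": 1, "ack": 0},
          {"req": 0, "ack": 0},
          {"req": 0, "ack": 1},   # ack two cycles after req: OK
          {"req": 1, "ack": 0},
          {"req": 0, "ack": 0},
          {"req": 0, "ack": 0},
          {"req": 0, "ack": 0},   # no ack within three cycles: violation
      ]

      def assert_req_ack(trace, window=3):
          """Every req must be followed by an ack within `window` cycles."""
          violations = []
          for t, cycle in enumerate(trace):
              if cycle["req"] and not any(c["ack"] for c in trace[t + 1:t + 1 + window]):
                  violations.append(t)
          return violations

      print("violations at cycles:", assert_req_ack(trace))   # [3]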

  13. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    Science.gov (United States)

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system's real-time operation was simulated offline using previously acquired images for 19 IMRT patient deliveries, with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system's sensitivity and performance. The accuracy of the synchronization method was shown to agree to within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using the 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
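
    The physics-based prediction model is beyond a short example, but the cumulative signal check can be caricatured; the Python sketch below compares running sums of measured and predicted frame signals and reports the frame at which a simulated 10% MU error exceeds an invented tolerance.

      import numpy as np

      rng = np.random.default_rng(1)

      frames = 100
      predicted = np.full(frames, 10.0)   # predicted integral signal per frame
      measured = predicted * 1.10 + rng.normal(0.0, 0.2, frames)  # 10% MU error

      def first_flagged_frame(measured, predicted, tol=0.05):
          """Flag when the cumulative measured/predicted ratio departs by > tol."""
          ratio = np.cumsum(measured) / np.cumsum(predicted)
          flags = np.abs(ratio - 1.0) > tol
          return int(np.argmax(flags)) if flags.any() else None

      print("error detected at frame:", first_flagged_frame(measured, predicted))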

  14. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Energy Technology Data Exchange (ETDEWEB)

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system's real-time operation was simulated offline using previously acquired images for 19 IMRT patient deliveries, with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system's sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree to within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using the 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  15. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    International Nuclear Information System (INIS)

    Fuangrod, Todsaporn; Woodruff, Henry C.; O’Connor, Daryl J.; Uytven, Eric van; McCurdy, Boyd M. C.; Kuncic, Zdenka; Greer, Peter B.

    2013-01-01

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system's real-time operation was simulated offline using previously acquired images for 19 IMRT patient deliveries, with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system's sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree to within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using the 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.

  16. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    Science.gov (United States)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. The customization actions in SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions (TLA), and then proposes a verification algorithm to ensure that each step in a customization will not cause unpredictable effects on the system and will follow the related rules defined by the SaaS provider.
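
    Setting the TLA machinery aside, the dependency idea can be shown with a small Python check (component names invented): a customization step is admissible only if every component it depends on has already been customized.

      # Invented dependency graph: component -> components it depends on.
      DEPS = {"data_model": set(), "workflow": {"data_model"}, "ui_theme": set()}

      def verify_customization(sequence):
          """Each step may only run after the steps it depends on."""
          done = set()
          for step in sequence:
              missing = DEPS[step] - done
              if missing:
                  return f"step '{step}' violates dependencies: {sorted(missing)}"
              done.add(step)
          return "customization sequence verified"

      print(verify_customization(["data_model", "workflow", "ui_theme"]))  # verified
      print(verify_customization(["workflow", "data_model"]))              # violation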

  17. Design Verification Enhancement of FPGA-based Plant Protection System Trip Logics for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jae Cheon; Heo, Gyun Young

    2016-01-01

    As part of strengthening the application of FPGA technology and finding solutions to its challenges in NPPs, the International Atomic Energy Agency (IAEA) has shown its interest by co-sponsoring the Topical Group on FPGA Applications in NPPs (TG-FAN), which has met seven times to date in the form of an annual international workshop on the application of FPGAs in NPPs, held since 2008. The workshops have attracted significant interest and a broad representation of stakeholders, such as regulators, utilities, research organizations, system designers, and vendors from various countries, who converge to discuss current issues regarding instrumentation and control (I and C) systems as well as FPGA applications. Two of the many technical issues identified by the group are the lifecycle of FPGA-based platforms, systems, and applications, and methods and tools for V and V. Therefore, in this work, several design steps that involve a model-based systems engineering process as well as a MATLAB/SIMULINK model, and that lead to the enhancement of design verification, are employed. The verified and validated design output works correctly and effectively. In conclusion, the model-based systems engineering approach and the structured step-by-step design modeling techniques used in this work, including the SIMULINK model, have shown how the verification of FPGA PPS trip logic designs can be enhanced. If these design approaches are employed in the design of FPGA-based I and C systems, the designs can be readily verified and validated.
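
    The paper's SIMULINK models are not given in the abstract; as a generic illustration, plant protection system trip logic is commonly a two-out-of-four vote over redundant channels, which the short Python sketch below implements with an invented measurement and setpoint.

      # Illustrative 2-out-of-4 coincidence trip logic; the channel variable
      # and the 2200 mm setpoint are invented, not taken from the PFBR design.
      SETPOINT_MM = 2200.0   # hypothetical low-level trip setpoint

      def channel_trip(level_mm: float) -> bool:
          """A channel trips when its measured level falls below the setpoint."""
          return level_mm < SETPOINT_MM

      def reactor_trip(channel_levels) -> bool:
          """Trip when at least 2 of the 4 redundant channels have tripped."""
          return sum(channel_trip(v) for v in channel_levels) >= 2

      print(reactor_trip([2300.0, 2310.0, 2290.0, 2305.0]))  # False: no trip
      print(reactor_trip([2100.0, 2310.0, 2150.0, 2305.0]))  # True: 2-out-of-4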

  18. A method of knowledge base verification and validation for nuclear power plants expert systems

    International Nuclear Information System (INIS)

    Kwon, Il Won

    1996-02-01

    The adoption of expert systems, mainly as operator support systems, is becoming increasingly popular as the control algorithms of systems become more and more sophisticated and complicated. As a result of this popularity, a large number of expert systems have been developed. The nature of expert systems, however, requires that they be verified and validated carefully and that detailed methodologies for their development be devised. It is therefore widely noted that assuring the reliability of expert systems is very important, especially in the nuclear industry, and it is also recognized that the process of verification and validation is an essential part of reliability assurance for these systems. Research and practice have produced numerous methods for expert system verification and validation (V and V) that adapt traditional software and system approaches to V and V. However, many approaches and methods for expert system V and V are partial, unreliable, and non-uniform. The purpose of this paper is to present a new approach to expert system V and V, based on Petri nets, providing a uniform model. We devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for checking incorrectness, inconsistency, and incompleteness in a knowledge base. We also suggest heuristic analysis for the validation process, to show that the reasoning path is correct.

  19. Knowledge-based verification of clinical guidelines by detection of anomalies.

    Science.gov (United States)

    Duftschmid, G; Miksch, S

    2001-04-01

    As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches to the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general, and in the domain of knowledge-based systems (KBS) in particular, a common strategy for examining a system for potential defects is verification. The focus of this work is to present an approach that helps to ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language, called Asbru, to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided, based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general to allow its application to several other guideline representation formats.

  20. Content-based Image Hiding Method for Secure Network Biometric Verification

    Directory of Open Access Journals (Sweden)

    Xiangjiu Che

    2011-08-01

    Full Text Available For secure biometric verification, most existing methods embed biometric information directly into the cover image, but content correlation analysis between the biometric image and the cover image is often ignored. In this paper, we propose a novel biometric image hiding approach based on content correlation analysis to protect network-transmitted biometric images. By using principal component analysis (PCA), the content correlation between the biometric image and the cover image is first analyzed. Then, based on a particle swarm optimization (PSO) algorithm, some regions of the cover image are selected to represent the biometric image, so that the cover image can carry partial content of the biometric image. Following the correlation analysis, the unrepresented part of the biometric image is embedded into the cover image using the discrete wavelet transform (DWT). Combined with a human visual system (HVS) model, this approach makes the hiding result perceptually invisible. Extensive experimental results demonstrate that the proposed hiding approach is robust against some common frequency and geometric attacks; it also provides effective protection for secure biometric verification.
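
    The full PCA/PSO pipeline is beyond a short example, but the DWT embedding step can be sketched; the Python code below (using the PyWavelets package, with random stand-in images and an invented embedding strength) hides a small patch in the cover image's detail coefficients and recovers it non-blindly.

      import numpy as np
      import pywt  # PyWavelets

      rng = np.random.default_rng(0)
      cover = rng.uniform(0, 255, size=(64, 64))    # stand-in cover image
      secret = rng.uniform(0, 255, size=(32, 32))   # stand-in biometric patch

      ALPHA = 0.05  # invented embedding strength; an HVS model would tune this

      # Embed: add the scaled secret to the horizontal detail sub-band.
      cA, (cH, cV, cD) = pywt.dwt2(cover, "haar")
      stego = pywt.idwt2((cA, (cH + ALPHA * secret, cV, cD)), "haar")

      # Extract (non-blind: the extractor knows the original coefficients).
      _, (cH2, _, _) = pywt.dwt2(stego, "haar")
      recovered = (cH2 - cH) / ALPHA

      print("max recovery error:", float(np.abs(recovered - secret).max()))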

  1. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  2. Mycological evaluation of a ground cocoa-based beverage ...

    African Journals Online (AJOL)

    Cocoa beans (Theobroma cacao) are processed into cocoa beverage through fermentation, drying, roasting and grinding of the seeds to powder. The mycological quality of 39 samples of different brands of this cocoa-based beverage, referred to as 'eruku oshodi', collected from 3 different markets in south-west Nigeria ...

  3. Performance Based Criteria for Ship Collision and Grounding

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2009-01-01

    The paper outlines a probabilistic procedure whereby the maritime industry can develop performance-based rules to reduce the risk associated with the human, environmental and economic costs of collision and grounding events and identify the most economic risk control options associated with prevention...

  4. Verification of gamma knife based fractionated radiosurgery with newly developed head-thorax phantom

    International Nuclear Information System (INIS)

    Bisht, Raj Kishor; Kale, Shashank Sharad; Natanasabapathi, Gopishankar; Singh, Manmohan Jit; Agarwal, Deepak; Garg, Ajay; Rath, Goura Kishore; Julka, Pramod Kumar; Kumar, Pratik; Thulkar, Sanjay; Sharma, Bhawani Shankar

    2016-01-01

    Objective: The purpose of the study is to verify Gamma Knife Extend™ system (ES) based fractionated stereotactic radiosurgery with a newly developed head-thorax phantom. Methods: Phantoms are extensively used to measure radiation dose and verify treatment plans in radiotherapy. A human upper-body-shaped phantom with a thorax was designed to simulate fractionated stereotactic radiosurgery using the Extend™ system of the Gamma Knife. The central component of the phantom aids in performing radiological precision tests, dosimetric evaluation and treatment verification. A hollow right circular cylindrical space of diameter 7.0 cm was created at the centre of this component to hold various dosimetric devices using suitable adaptors. The phantom is made of poly(methyl methacrylate) (PMMA), a transparent thermoplastic material. Two sets of disk assemblies were designed to place dosimetric films in (1) horizontal (xy) and (2) vertical (xz) planes. Specific cylindrical adaptors were designed to place a thimble ionization chamber inside the phantom for point dose recording along the xz axis. EBT3 Gafchromic films were used to analyze and map the radiation field. The focal precision test was performed using a 4 mm collimator shot in the phantom to check the radiological accuracy of treatment. The phantom head position within the Extend™ frame was estimated using encoded aperture measurements of the repositioning check tool (RCT). For treatment verification, the phantom with inserts for film and ion chamber was scanned in the reference treatment position using an X-ray computed tomography (CT) machine, and the acquired stereotactic images were transferred into Leksell GammaPlan (LGP). A patient treatment plan with a hypo-fractionated regimen was delivered, and identical fractions were compared using EBT3 films and in-house MATLAB codes. Results: The RCT measurements showed an overall positional accuracy of 0.265 mm (range 0.223 mm–0.343 mm). Gamma index analysis across fractions exhibited close agreement between LGP and film

  5. Ground-water contamination at Wurtsmith Air Force Base, Michigan

    Science.gov (United States)

    Stark, J.R.; Cummings, T.R.; Twenter, F.R.

    1983-01-01

    A sand and gravel aquifer of glacial origin underlies Wurtsmith Air Force Base in northeastern lower Michigan. The aquifer overlies a thick clay layer at an average depth of 65 feet. The water table is about 10 feet below land surface in the western part of the Base and about 25 feet below land surface in the eastern part. A ground-water divide cuts diagonally across the Base from northwest to southeast. South of the divide, ground water flows to the Au Sable River; north of the divide, it flows to Van Etten Creek and Van Etten Lake. Mathematical models were used to aid in calculating rates of groundwater flow. Rates range from about 0.8 feet per day in the eastern part of the Base to about 0.3 feet per day in the western part. Models also were used as an aid in making decisions regarding purging of contaminated water from the aquifer. In 1977, trichloroethylene was detected in the Air Force Base water-supply system. It had leaked from a buried storage tank near Building 43 in the southeastern part of the Base and moved northeastward under the influence of the natural ground-water gradient and the pumping of Base water-supply wells. In the most highly contaminated part of the plume, concentrations are greater than 1,000 micrograms per liter. Current purge pumping is removing some of the trichloroethylene, and seems to have arrested its eastward movement. Pumping of additional purge wells could increase the rate of removal. Trichloroethylene has also been detected in ground water in the vicinity of the Base alert apron, where a plume from an unknown source extends northeastward off Base. A smaller, less well-defined area of contamination also occurs just north of the larger plume. Trichloroethylene, identified near the waste-treatment plant, seepage lagoons, and the northern landfill area, is related to activities and operations in these areas. Dichloroethylene and trichloroethylene occur in significant quantities westward of Building 43, upgradient from the major

  6. GEARS: An Enterprise Architecture Based On Common Ground Services

    Science.gov (United States)

    Petersen, S.

    2014-12-01

    Earth observation satellites collect a broad variety of data used in applications that range from weather forecasting to climate monitoring. Within NOAA the National Environmental Satellite Data and Information Service (NESDIS) supports these applications by operating satellites in both geosynchronous and polar orbits. Traditionally NESDIS has acquired and operated its satellites as stand-alone systems with their own command and control, mission management, processing, and distribution systems. As the volume, velocity, veracity, and variety of sensor data and products produced by these systems continues to increase, NESDIS is migrating to a new concept of operation in which it will operate and sustain the ground infrastructure as an integrated Enterprise. Based on a series of common ground services, the Ground Enterprise Architecture System (GEARS) approach promises greater agility, flexibility, and efficiency at reduced cost. This talk describes the new architecture and associated development activities, and presents the results of initial efforts to improve product processing and distribution.

  7. Hanford Ground-Water Data Base management guide

    International Nuclear Information System (INIS)

    Rieger, J.T.; Mitchell, P.J.; Muffett, D.M.; Fruland, R.M.; Moore, S.B.; Marshall, S.M.

    1990-02-01

    This guide describes the Hanford Ground-Water Data Base (HGWDB), a computerized data base used to store hydraulic head, sample analytical, temperature, geologic, and well-structure information for ground-water monitoring wells on the Hanford Site. These data are stored for the purpose of data retrieval for report generation and also for historical purposes. This guide is intended as an aid to the data base manager and the various staff authorized to enter and verify data, maintain the data base, and maintain the supporting software. This guide focuses on the structure of the HGWDB, providing a fairly detailed description of the programs, files, and parameters. Data-retrieval instructions for the general user of the HGWDB will be found in the HGWDB User's Manual. 6 figs

  8. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu; Computing In High Energy and Nuclear Physics

    2007-01-01

    In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 networked computers. At such a system size, software and hardware failures are quite frequent. To minimize system downtime, the Trigger-DAQ control system includes advanced verification and diagnostics facilities. The operator uses tests and the expertise of the TDAQ and detector developers in order to diagnose and recover from errors, automatically where possible. The TDAQ control system is built as a distributed tree of controllers, where the behavior of each controller is defined in a rule-based language that allows easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system, with different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  9. Reconstruction based finger-knuckle-print verification with score level adaptive binary fusion.

    Science.gov (United States)

    Gao, Guangwei; Zhang, Lei; Yang, Jian; Zhang, Lin; Zhang, David

    2013-12-01

    Recently, a new biometric identifier, the finger knuckle print (FKP), has been proposed for personal authentication with very interesting results. One of the advantages of FKP verification lies in its user friendliness in data collection. However, the user's flexibility in positioning fingers also leads to a certain degree of pose variation in the collected query FKP images. The widely used Gabor filtering based competitive coding scheme is sensitive to such variations, resulting in many false rejections. We propose to alleviate this problem by reconstructing the query sample with a dictionary learned from the template samples in the gallery set. The reconstructed FKP image can greatly reduce the enlarged matching distance caused by finger pose variations; however, both the intra-class and inter-class distances will be reduced. We then propose a score-level adaptive binary fusion rule to adaptively fuse the matching distances before and after reconstruction, aiming to reduce the false rejections without increasing the false acceptances much. Experimental results on the benchmark PolyU FKP database show that the proposed method significantly improves the FKP verification accuracy.
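
    The paper's exact fusion rule is not given in the abstract; the Python sketch below shows one plausible score-level binary fusion of the two matching distances, trusting the post-reconstruction distance only when it is confidently small (threshold invented).

      def fused_distance(d_before: float, d_after: float,
                         t_accept: float = 0.30) -> float:
          """Illustrative adaptive binary fusion: use the post-reconstruction
          distance only when it is confidently small, since reconstruction
          also shrinks impostor (inter-class) distances."""
          return d_after if d_after < t_accept else d_before

      # Genuine match distorted by pose: large before, small after -> accepted.
      print(fused_distance(d_before=0.55, d_after=0.22))   # 0.22
      # Impostor: reconstruction shrinks the distance, but not below threshold.
      print(fused_distance(d_before=0.70, d_after=0.45))   # 0.70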

  10. Model-Based Design and Formal Verification Processes for Automated Waterway System Operations

    Directory of Open Access Journals (Sweden)

    Leonard Petnga

    2016-06-01

    Full Text Available Waterway and canal systems are particularly cost effective for the transport of bulk and containerized goods in support of global trade. Yet, despite these benefits, they are among the most under-appreciated forms of transportation engineering systems. Looking ahead, the long-term view is not rosy: failures, delays, incidents and accidents in aging waterway systems are doing little to attract the technical and economic assistance required for modernization and sustainability. In a step toward overcoming these challenges, this paper argues that programs for waterway and canal modernization and sustainability can benefit significantly from systems thinking, supported by systems engineering techniques. We propose a multi-level, multi-stage methodology for the model-based design, simulation and formal verification of automated waterway system operations. At the front end of development, semi-formal modeling techniques are employed for the representation of project goals and scenarios, requirements and high-level models of behavior and structure. To assure the accuracy of engineering predictions and the correctness of operations, formal modeling techniques are used for performance assessment and formal verification of functional correctness. The essential features of this methodology are highlighted in a case study examination of ship and lock-system behaviors in a two-stage lock system.

  11. Modal-pushover-based ground-motion scaling procedure

    Science.gov (United States)

    Kalkan, Erol; Chopra, Anil K.

    2011-01-01

    Earthquake engineering increasingly uses nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires the selection and scaling of ground motions appropriate to design hazard levels. This paper presents a modal-pushover-based scaling (MPS) procedure for scaling ground motions for use in nonlinear RHA of buildings. In the MPS method, the ground motions are scaled to match, to a specified tolerance, a target value of the inelastic deformation of the first-mode inelastic single-degree-of-freedom (SDF) system, whose properties are determined by first-mode pushover analysis. Appropriate for first-mode dominated structures, this approach is extended to structures with significant higher-mode contributions by considering the elastic deformation of second-mode SDF systems in selecting a subset of the scaled ground motions. Based on results presented for three actual buildings (4, 6, and 13 stories), the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
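
    For orientation, the Python sketch below scales an invented accelerogram so that the peak deformation of a single-degree-of-freedom system matches a target value, using bisection; a linear SDF oscillator stands in for the first-mode inelastic system of the actual MPS procedure.

      import numpy as np

      def peak_sdf_deformation(ag, dt, T=1.0, zeta=0.05):
          """Peak deformation of a linear SDF oscillator (central difference)."""
          wn = 2.0 * np.pi / T
          u = np.zeros(len(ag))
          u_prev = 0.0                        # zero initial conditions
          for i in range(len(ag) - 1):
              vel = (u[i] - u_prev) / dt
              acc = -ag[i] - 2.0 * zeta * wn * vel - wn**2 * u[i]
              u[i + 1] = 2.0 * u[i] - u_prev + acc * dt**2
              u_prev = u[i]
          return np.abs(u).max()

      def mps_scale_factor(ag, dt, target, lo=0.1, hi=10.0, tol=1e-3):
          """Bisect the scale factor until the peak deformation hits target."""
          while hi - lo > tol:
              mid = 0.5 * (lo + hi)
              if peak_sdf_deformation(mid * ag, dt) < target:
                  lo = mid
              else:
                  hi = mid
          return 0.5 * (lo + hi)

      rng = np.random.default_rng(0)
      t = np.arange(0.0, 20.0, 0.02)          # invented record, dt = 0.02 s
      ag = rng.normal(0.0, 1.0, len(t)) * np.exp(-t / 5.0)
      print(f"scale factor: {mps_scale_factor(ag, dt=0.02, target=0.05):.3f}")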

  12. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    International Nuclear Information System (INIS)

    Casey, Leslie A.

    2014-01-01

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  13. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Leslie A.

    2014-01-13

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for ground-based nuclear explosion monitoring science being pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results, as well as deliver impactful products. Promising future R&D is delineated including dual use associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes as well as associated metrics are identified along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements to nuclear explosion monitoring.

  14. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    Science.gov (United States)

    2012-04-01


  15. Silicon carbide optics for space and ground based astronomical telescopes

    Science.gov (United States)

    Robichaud, Joseph; Sampath, Deepak; Wainer, Chris; Schwartz, Jay; Peton, Craig; Mix, Steve; Heller, Court

    2012-09-01

    Silicon carbide (SiC) optical materials are being applied widely in both space-based and ground-based optical telescopes. The material provides a superior stiffness-to-weight ratio, which is an important metric for the design and fabrication of lightweight space telescopes. The material also has superior thermal properties, with a low coefficient of thermal expansion and a high thermal conductivity. These thermal advantages are important for both space-based and ground-based systems, which typically need to operate under stressing thermal conditions. The paper reviews L-3 Integrated Optical Systems - SSG's (L-3 SSG) work in developing SiC optics and SiC optical systems for astronomical observing systems. L-3 SSG has been fielding SiC optical components and systems for over 25 years. The space systems described will emphasize the recently launched Long Range Reconnaissance Imager (LORRI), developed for JHU-APL and NASA-GSFC. The review of ground-based applications of SiC includes L-3 IOS-Brashear's current contract to provide the 0.65 meter diameter, aspheric SiC secondary mirror for the Advanced Technology Solar Telescope (ATST).

  16. WE-DE-BRA-01: SCIENCE COUNCIL JUNIOR INVESTIGATOR COMPETITION WINNER: Acceleration of a Limited-Angle Intrafraction Verification (LIVE) System Using Adaptive Prior Knowledge Based Image Estimation

    International Nuclear Information System (INIS)

    Zhang, Y; Yin, F; Ren, L; Zhang, Y

    2016-01-01

    Purpose: To develop an adaptive prior knowledge based image estimation method to reduce the scan angle needed by the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has previously been proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam's eye view (BEV) MV cine images acquired from the treatment beam, together with orthogonally acquired limited-angle kV projections, to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment, to exploit the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change and phase shift. Limited-angle orthogonal kV and BEV MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. The volume percentage difference (VPD) and center-of-mass shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate the reconstruction accuracy. Results: Using the orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The

  17. WE-DE-BRA-01: SCIENCE COUNCIL JUNIOR INVESTIGATOR COMPETITION WINNER: Acceleration of a Limited-Angle Intrafraction Verification (LIVE) System Using Adaptive Prior Knowledge Based Image Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Y; Yin, F; Ren, L [Duke University Medical Center, Durham, NC (United States); Zhang, Y [UT Southwestern Medical Ctr at Dallas, Dallas, TX (United States)

    2016-06-15

    Purpose: To develop an adaptive prior knowledge based image estimation method to reduce the scan angle needed by the LIVE system to reconstruct 4D-CBCT for intrafraction verification. Methods: The LIVE system has previously been proposed to reconstruct 4D volumetric images on-the-fly during arc treatment for intrafraction target verification and dose calculation. This system uses limited-angle beam's eye view (BEV) MV cine images acquired from the treatment beam, together with orthogonally acquired limited-angle kV projections, to reconstruct 4D-CBCT images for target verification during treatment. In this study, we developed an adaptive constrained free-form deformation reconstruction technique in LIVE to further reduce the scanning angle needed to reconstruct the CBCT images. This technique uses free-form deformation with energy minimization to deform prior images to estimate 4D-CBCT based on projections acquired in a limited angle (orthogonal 6°) during treatment. Note that the prior images are adaptively updated using the latest CBCT images reconstructed by LIVE during treatment, to exploit the continuity of patient motion. The 4D digital extended-cardiac-torso (XCAT) phantom was used to evaluate the efficacy of this technique with the LIVE system. A lung patient was simulated with different scenarios, including baseline drift, amplitude change and phase shift. Limited-angle orthogonal kV and BEV MV projections were generated for each scenario. The CBCT images reconstructed from these projections were compared with the ground truth generated in XCAT. The volume percentage difference (VPD) and center-of-mass shift (COMS) were calculated between the reconstructed and ground-truth tumors to evaluate the reconstruction accuracy. Results: Using the orthogonal-view 6° kV and BEV MV projections, the VPD/COMS values were 12.7±4.0%/0.7±0.5 mm, 13.0±5.1%/0.8±0.5 mm, and 11.4±5.4%/0.5±0.3 mm for the three scenarios, respectively. Conclusion: The
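
    The two reported metrics are easy to state precisely; the numpy sketch below computes the volume percentage difference (here taken as the non-overlapping volume relative to the true volume, one common definition) and the center-of-mass shift between a reconstructed and a ground-truth binary tumor mask.

      import numpy as np

      def vpd_and_coms(recon_mask, truth_mask, voxel_mm=(1.0, 1.0, 1.0)):
          """VPD and COMS between reconstructed and ground-truth masks."""
          non_overlap = np.logical_xor(recon_mask, truth_mask).sum()
          vpd = 100.0 * non_overlap / truth_mask.sum()
          com_r = np.array(np.nonzero(recon_mask), float).mean(axis=1)
          com_t = np.array(np.nonzero(truth_mask), float).mean(axis=1)
          coms = np.linalg.norm((com_r - com_t) * np.array(voxel_mm))
          return vpd, coms

      # Toy example: a cubic "tumor" and a reconstruction shifted by one voxel.
      truth = np.zeros((20, 20, 20), bool)
      truth[8:12, 8:12, 8:12] = True
      recon = np.roll(truth, 1, axis=0)
      vpd, coms = vpd_and_coms(recon, truth)
      print(f"VPD = {vpd:.1f}%, COMS = {coms:.2f} mm")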

  18. High energy astrophysics with ground-based gamma ray detectors

    International Nuclear Information System (INIS)

    Aharonian, F; Buckley, J; Kifune, T; Sinnis, G

    2008-01-01

    Recent advances in ground-based gamma ray astronomy have led to the discovery of more than 70 sources of very high energy (E_γ ≥ 100 GeV) gamma rays, falling into a number of source populations including pulsar wind nebulae, shell-type supernova remnants, Wolf-Rayet stars, giant molecular clouds, binary systems, the Galactic Center, active galactic nuclei and 'dark' (yet unidentified) galactic objects. We summarize the history of TeV gamma ray astronomy up to the current status of the field, including a description of experimental techniques, and highlight recent astrophysical results. We also discuss the potential of ground-based gamma ray astronomy for future discoveries and describe possible directions for future instrumental developments.

  19. Automatic Barometric Updates from Ground-Based Navigational Aids

    Science.gov (United States)

    1990-03-12

    Automatic Barometric Updates from Ground-Based Navigational Aids. US Department of Transportation, Federal Aviation Administration, Office of Safety... tighter vertical spacing controls, particularly for operations near Terminal Control Areas (TCAs), Airport Radar Service Areas (ARSAs), military climb and... E.F., Ruth, J.C., and Williges, B.H. (1987). Speech Controls and Displays. In Salvendy, G. (Ed.), Handbook of Human Factors/Ergonomics. New York: John...

  20. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  1. Verification test for on-line diagnosis algorithm based on noise analysis

    International Nuclear Information System (INIS)

    Tamaoki, T.; Naito, N.; Tsunoda, T.; Sato, M.; Kameda, A.

    1980-01-01

    An on-line diagnosis algorithm was developed and its verification test was performed using a minicomputer. This algorithm identifies the plant state by analyzing various system noise patterns, such as power spectral densities and coherence functions, in a three-step procedure. Each observed noise pattern is examined using its distances from reference patterns prepared for the various plant states. The plant state is then identified by synthesizing the individual results with evaluation weights. These weights are determined automatically from the reference noise patterns prior to on-line diagnosis. The test was performed with 50 MW(th) steam generator noise data recorded under various controller parameter values. The algorithm's performance was evaluated using a newly devised index. The results obtained with one kind of weight showed the algorithm's efficiency under proper selection of noise patterns; results for another kind of weight showed the robustness of the algorithm to this selection. (orig.)
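
    The record above describes a pattern-distance classifier: each observed noise pattern is compared against reference patterns for the known plant states, and the per-pattern results are synthesized with evaluation weights. The following is a minimal Python sketch of that idea; the pattern names, weights, and Euclidean distance metric are illustrative assumptions, not the original algorithm.

```python
import numpy as np

def classify_plant_state(observed, references, weights):
    """Identify the plant state whose reference noise patterns are,
    in a weighted sense, closest to the observed patterns.

    observed   : dict pattern_name -> 1-D array (e.g. a PSD estimate)
    references : dict state_name -> dict pattern_name -> 1-D array
    weights    : dict pattern_name -> float (evaluation weight)
    """
    scores = {}
    for state, ref_patterns in references.items():
        # Weighted sum of distances over all available noise patterns.
        scores[state] = sum(
            weights[name] * np.linalg.norm(observed[name] - ref)
            for name, ref in ref_patterns.items()
        )
    return min(scores, key=scores.get), scores

# Toy usage with a synthetic "PSD" pattern and two plant states.
rng = np.random.default_rng(0)
normal_psd = np.exp(-np.linspace(0, 5, 64))
faulty_psd = normal_psd + 0.3 * np.sin(np.linspace(0, 20, 64))
references = {"normal": {"psd": normal_psd}, "fault": {"psd": faulty_psd}}
weights = {"psd": 1.0}
observed = {"psd": normal_psd + 0.01 * rng.standard_normal(64)}
state, scores = classify_plant_state(observed, references, weights)
print(state, scores)
```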

  2. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    International Nuclear Information System (INIS)

    Lee, Se Ho; Lee, Seung Wook; Han, Su Chul; Park, Seung Woo

    2016-01-01

    Since it is difficult to measure the dose absorbed by mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). The printing materials were selected to correspond to mouse tissues. To represent the lung, the selected material was used partially, with an air layer. To verify material equivalence, photon attenuation characteristics were compared against a Super-Flex bolus. For the lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing technology continues to mature, 3D printer based small preclinical animal phantoms should increase the reliability of absorbed dose verification in small animals for preclinical studies.

  3. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

    Since it is difficult to measure the dose absorbed by mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). The printing materials were selected to correspond to mouse tissues. To represent the lung, the selected material was used partially, with an air layer. To verify material equivalence, photon attenuation characteristics were compared against a Super-Flex bolus. For the lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as geometry similar to a live mouse. As 3D printing technology continues to mature, 3D printer based small preclinical animal phantoms should increase the reliability of absorbed dose verification in small animals for preclinical studies.

  4. BAVP: Blockchain-Based Access Verification Protocol in LEO Constellation Using IBE Keys

    Directory of Open Access Journals (Sweden)

    Songjie Wei

    2018-01-01

    Full Text Available The LEO constellation has received intensive research attention in the field of satellite communication. The centralized authentication protocols traditionally used for MEO/GEO satellite networks cannot accommodate LEO satellites with frequent user connection switching. This paper proposes a fast and efficient access verification protocol named BAVP that combines identity-based encryption (IBE) and blockchain technology. Two different key management schemes, with IBE and with blockchain, respectively, are investigated, which further enhance authentication reliability and efficiency in the LEO constellation. Experiments on the OPNET simulation platform evaluate and demonstrate the effectiveness, reliability, and fast-switching efficiency of the proposed protocol. For LEO networks, BAVP surpasses well-known existing solutions with significant advantages in both performance and scalability, which are supported by theoretical analysis and simulation results.

  5. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    Science.gov (United States)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption, where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database, and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running quad-core 3.2 GHz CPUs with a 10 Mbit/s up/down link. Consequently, the proposed system can be efficiently used in real-life applications.
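
    At its core, the decision rule described above accepts a query template when its Hamming distance to the enrolled template falls below a threshold; in THRIVE this comparison happens on homomorphically randomized ciphertexts. The plaintext sketch below shows only the threshold rule itself, with an assumed 256-bit template and an assumed threshold; the cryptographic layer is deliberately omitted.

```python
import numpy as np

def hamming_verify(enrolled: np.ndarray, query: np.ndarray, threshold: int) -> bool:
    """Accept the query when its Hamming distance to the enrolled binary
    template is below the threshold (plaintext analogue of the encrypted
    comparison performed in THRIVE)."""
    assert enrolled.shape == query.shape
    distance = int(np.count_nonzero(enrolled != query))
    return distance < threshold

rng = np.random.default_rng(1)
enrolled = rng.integers(0, 2, 256)                  # 256-bit binary template
genuine = enrolled.copy()
genuine[rng.choice(256, 20, replace=False)] ^= 1    # genuine query: 20 noisy bits
impostor = rng.integers(0, 2, 256)                  # unrelated template
print(hamming_verify(enrolled, genuine, threshold=64))   # True
print(hamming_verify(enrolled, impostor, threshold=64))  # False (almost surely)
```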

  6. On marker-based parentage verification via non-linear optimization.

    Science.gov (United States)

    Boerner, Vinzent

    2017-06-15

    Parentage verification by molecular markers has mainly been based on short tandem repeat markers. Single nucleotide polymorphisms (SNPs), as bi-allelic markers, have become the markers of choice for genotyping projects, so the natural next step is to use SNP genotypes for parentage verification as well. Recently developed algorithms, such as evaluating opposing homozygous SNP genotypes, have drawbacks, for example the inability to reject all animals of a sample of potential parents. This paper describes an algorithm for parentage verification by constrained regression which overcomes the latter limitation and proves to be very fast and accurate even when the number of SNPs is as low as 50. The algorithm was tested on a sample of 14,816 animals with 50, 100 and 500 SNP genotypes randomly selected from 40k genotypes. The samples of putative parents of these animals contained either five random animals, or four random animals and the true sire. Parentage assignment was performed by ranking of regression coefficients, or by setting a minimum threshold for regression coefficients. The assignment quality was evaluated by the power of assignment (P_A) and the power of exclusion (P_E). If the sample of putative parents contained the true sire and parentage was assigned by coefficient ranking, P_A and P_E were both higher than 0.99 for the 500 and 100 SNP genotypes, and higher than 0.98 for the 50 SNP genotypes. When parentage was assigned by a coefficient threshold, P_A was higher than 0.99 regardless of the number of SNPs, but P_E decreased from 0.99 (500 SNPs) to 0.97 (100 SNPs) and 0.92 (50 SNPs). If the sample of putative parents did not contain the true sire and parentage was rejected using a coefficient threshold, the algorithm achieved a P_E of 1 (500 SNPs), 0.99 (100 SNPs) and 0.97 (50 SNPs). The algorithm described here is easy to implement.
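
    The assignment rule above can be imitated with a bounded least-squares regression of the offspring's SNP genotype on the genotypes of the putative parents, assigning parentage to the largest coefficient (or rejecting when no coefficient exceeds a threshold). The sketch below uses scipy.optimize.lsq_linear with [0, 1] coefficient bounds; the genotype coding, bounds, and toy inheritance simulation are illustrative assumptions and not necessarily the constraint set of the published algorithm.

```python
import numpy as np
from scipy.optimize import lsq_linear

def rank_putative_parents(offspring, parent_genotypes):
    """Regress the offspring genotype (coded 0/1/2) on the putative
    parents' genotypes with coefficients constrained to [0, 1], and
    return the parent indices ranked by coefficient size."""
    X = np.column_stack(parent_genotypes)          # SNPs x parents
    res = lsq_linear(X, offspring, bounds=(0.0, 1.0))
    order = np.argsort(res.x)[::-1]
    return order, res.x

rng = np.random.default_rng(2)
n_snps = 100
parents = [rng.integers(0, 3, n_snps) for _ in range(5)]
true_sire = parents[3]
dam = rng.integers(0, 3, n_snps)
# Coarse toy simulation: offspring genotype derived from sire and dam.
offspring = np.clip((true_sire + dam) // 2 + rng.integers(0, 2, n_snps), 0, 2)
order, coef = rank_putative_parents(offspring.astype(float),
                                    [p.astype(float) for p in parents])
print("top-ranked parent:", order[0], "coefficients:", np.round(coef, 2))
```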

  7. Tree automata-based refinement with application to Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    In this paper we apply tree-automata techniques to the refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly, we handle tree automata rather than string automata and can thereby capture traces in any Horn clause derivation. … We compare the results with other state-of-the-art Horn clause verification tools.

  8. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available Handwritten signatures are broadly utilized for personal verification in financial institutions, which creates the need for a robust automatic signature verification tool aimed at reducing fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process compares the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which to judge a signature verification tool in banking and other financial institutions.
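
    A minimal sketch of the feature-extraction-and-matching stage is shown below using OpenCV's SIFT (SURF is patent-encumbered and often unavailable in stock OpenCV builds, so it is omitted here). The Lowe ratio test, match-count threshold, and file names are common defaults assumed for illustration, not values taken from the paper.

```python
import cv2
import numpy as np

def match_signatures(img_query: np.ndarray, img_reference: np.ndarray,
                     ratio: float = 0.75, min_matches: int = 15) -> bool:
    """Extract SIFT features from two grayscale signature images and
    accept the query when enough ratio-test matches survive."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img_query, None)
    kp2, des2 = sift.detectAndCompute(img_reference, None)
    if des1 is None or des2 is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des1, des2, k=2):
        # Lowe ratio test: keep a match only if clearly better than runner-up.
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return len(good) >= min_matches

# Usage (file paths are placeholders):
# query = cv2.imread("query_signature.png", cv2.IMREAD_GRAYSCALE)
# ref = cv2.imread("enrolled_signature.png", cv2.IMREAD_GRAYSCALE)
# print(match_signatures(query, ref))
```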

  9. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan; Ford, Eric C., E-mail: eford@uw.edu [Department of Radiation Oncology, University of Washington, 1959 N. E. Pacific Street, Seattle, Washington 98195 (United States)

    2015-09-15

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system comprising rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score (3 or 4 on a 4-point scale) recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of detectable incidents divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
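
    The detectability bookkeeping in this study reduces to simple set arithmetic: an incident counts as detected by the combined system if at least one verification layer flags it. A toy recomputation of that union follows; the incident data and layer names here are invented placeholders, not the study's records.

```python
# Each incident lists the verification layers able to catch it.
incidents = [
    {"epid_pretreatment"},
    {"epid_in_vivo"},
    {"rules_based", "bayesian_network"},
    set(),                                  # an undetectable incident
    {"epid_in_vivo", "rules_based"},
]

def detectability(incidents, layers):
    """Fraction of incidents detected by at least one of the given layers."""
    hit = sum(1 for caught_by in incidents if caught_by & layers)
    return hit / len(incidents)

print(detectability(incidents, {"epid_in_vivo"}))              # 0.4
print(detectability(incidents, {"epid_in_vivo", "rules_based",
                                "bayesian_network"}))          # 0.6
```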

  10. Model-based verification method for solving the parameter uncertainty in the train control system

    International Nuclear Information System (INIS)

    Cheng, Ruijun; Zhou, Jin; Chen, Dewang; Song, Yongduan

    2016-01-01

    This paper presents a parameter analysis method to solve the parameter uncertainty problem for hybrid systems and to explore the correlation of key parameters in distributed control systems. To improve the reusability of the control model, the proposed approach provides support for obtaining the constraint sets of all uncertain parameters in the abstract linear hybrid automata (LHA) model while satisfying the safety requirements of the train control system. Then, to address the state-space explosion problem, an online verification method is proposed to monitor the operating status of high-speed trains, given the real-time nature of the train control system. Furthermore, we construct LHA formal models of the train tracking model and the movement authority (MA) generation process as cases to illustrate the effectiveness and efficiency of the proposed method. In the first case, we obtain the constraint sets of uncertain parameters that avoid collision between trains. In the second case, the correlation of the position report cycle and the MA generation cycle is analyzed under both normal and abnormal conditions influenced by a packet-loss factor. Finally, considering the stochastic characterization of time distributions and the real-time features of the moving block control system, the transient probabilities of the wireless communication process are obtained by stochastic time Petri nets. - Highlights: • We solve the parameter uncertainty problem by using a model-based method. • We acquire the parameter constraint sets by verifying linear hybrid automata models. • Online verification algorithms are designed to monitor high-speed trains. • We analyze the correlation of key parameters and uncritical parameters. • The transient probabilities are obtained by using reliability analysis.

  11. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    International Nuclear Information System (INIS)

    Azmy, Yousry; Wang, Yaqi

    2013-01-01

    The research team has developed a practical, high-order, discrete-ordinates, short-characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both continuous-energy Monte Carlo simulations (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code's numerical solution and its reference counterpart. The latter is either analytic or a very fine-mesh numerical solution from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per-cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory's Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.

  12. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    Science.gov (United States)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and have applied to a few relevant cases, leads to the seismic verification of a building by: a) using a scenario-based neo-deterministic approach (NDSHA) for the calculation of the seismic input, and b) checking the numerical model of an existing building against free-vibration measurements of the real structure. The key point of this approach is the close collaboration of the seismologist and the civil engineer, from the definition of the seismic input to the monitoring of the building's response in the calculation phase. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select from the design spectrum the proper range of periods of interest for the structure. Then, realistic values of spectral acceleration can be selected, including the appropriate amplification obtained through the modeling of a "scenario" input applied to the final model. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced from the spectra of the national codes (i.e., NTC 2008 for Italy). The task of the verifying engineer is to ensure that the verification is conservative and realistic. We show some examples of the application of the procedure to relevant buildings (e.g., schools) in the Trieste Province. The adoption of the scenario input has in most cases increased the number of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the increase of elements to reinforce is reasonable, especially considering the important reduction of the risk level.

  13. Model-based verification and validation of the SMAP uplink processes

    Science.gov (United States)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  14. Formal Verification

    Indian Academy of Sciences (India)

    …by testing of the components, and successful testing leads to the software being… Formal verification is based on formal methods, which are mathematically based… scenario under which a similar error could occur. There are various other…

  15. Operational verification of a blow out preventer utilizing fiber Bragg grating based strain gauges

    Science.gov (United States)

    Turner, Alan L.; Loustau, Philippe; Thibodeau, Dan

    2015-05-01

    Ultra-deep water BOP (Blowout Preventer) operation poses numerous challenges in obtaining accurate knowledge of current system integrity and component condition; a salient example is the difficulty of verifying closure of the pipe and shearing rams during and after well control events. Ascertaining the integrity of these functions is currently based on a manual volume measurement performed with a stopwatch. Advances in sensor technology now permit more accurate methods of BOP condition monitoring. Fiber optic sensing technology, and particularly fiber optic strain gauges, have evolved to a point where we can derive a good representation of what is happening inside a BOP by installing sensors on the outside shell. Function signatures can be baselined to establish thresholds that indicate successful function activation. Based on this knowledge base, signal variation over time can then be utilized to assess degradation of these functions and subsequent failure to function. Monitoring the BOP from the outside has the advantage of gathering data through a system that can be interfaced with risk-based integrity management software and/or a smart monitoring system that analyzes BOP control redundancies without the requirement of interfacing with OEM control systems. The paper will present the results of ongoing work on a fully instrumented 13-½" 10,000 psi pipe ram. Instrumentation includes commonly used pressure transducers, accelerometers, flow meters, and optical strain gauges. Correlation will be presented between flow, pressure, and acceleration signatures and the fiber optic strain gauges' response as it relates to functional verification and component-level degradation trending.

  16. Quality verification at Arkansas Nuclear One using performance-based concepts

    International Nuclear Information System (INIS)

    Cooper, R.M.

    1990-01-01

    Performance-based auditing is beginning to make an impact within the nuclear industry. Its use provides performance assessments of the operating plant. In the past, this company, along with most other nuclear utilities, performed compliance-based audits. These audits focused on paper reviews of past activities and were completed in weeks or months. This type of audit did not provide a comprehensive assessment of the effectiveness of an activity's performance, nor was it able to identify performance problems that may have occurred. To address this shortcoming, a comprehensive overhaul of quality assurance (QA) assessment programs was developed. The first major change was to develop a technical specification (tech spec) audit program, with the objective of auditing each tech spec line item every 5 yr. To achieve performance-based results within the tech spec audit program, a tech spec surveillance program was implemented whose goal is to observe 75% of the tech-spec-required tests every 5 yr. The next major change was to develop a QA surveillance program to provide surveillance coverage for the remainder of the plant not covered by the tech spec surveillance program. Another improvement was to merge the QA/quality control (QC) functions into one nuclear quality group. The final part of the quality verification effort is trending of the quality performance-based data (including US Nuclear Regulatory Commission (NRC) violations).

  17. Augmenting WFIRST Microlensing with a Ground-Based Telescope Network

    Science.gov (United States)

    Zhu, Wei; Gould, Andrew

    2016-06-01

    Augmenting the Wide Field Infrared Survey Telescope (WFIRST) microlensing campaigns with intensive observations from a ground-based network of wide-field survey telescopes would have several major advantages. First, it would enable full two-dimensional (2-D) vector microlens parallax measurements for a substantial fraction of low-mass lenses as well as planetary and binary events that show caustic crossing features. For a significant fraction of the free-floating planet (FFP) events and all caustic-crossing planetary/binary events, these 2-D parallax measurements directly lead to complete solutions (mass, distance, transverse velocity) of the lens object (or lens system). For even more events, the complementary ground-based observations will yield 1-D parallax measurements. Together with the 1-D parallaxes from WFIRST alone, they can probe the entire mass range M > M_Earth. For luminous lenses, such 1-D parallax measurements can be promoted to complete solutions (mass, distance, transverse velocity) by high-resolution imaging. This would provide crucial information not only about the hosts of planets and other lenses, but also enable a much more precise Galactic model. Other benefits of such a survey include improved understanding of binaries (particularly with low mass primaries), and sensitivity to distant ice-giant and gas-giant companions of WFIRST lenses that cannot be detected by WFIRST itself due to its restricted observing windows. Existing ground-based microlensing surveys can be employed if WFIRST is pointed at lower-extinction fields than is currently envisaged. This would come at some cost to the event rate. Therefore the benefits of improved characterization of lenses must be weighed against these costs.

  18. Lidar to lidar calibration of Ground-based Lidar

    DEFF Research Database (Denmark)

    Fernandez Garcia, Sergio; Courtney, Michael

    This report presents the results of the lidar-to-lidar calibration performed for a ground-based lidar. Calibration is here understood as the establishment of a relation between the reference lidar wind speed measurements, with measurement uncertainties provided by a measurement standard, and the corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10-minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with those from the reference lidar is given for information only.

  19. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Due to its easy accessibility, protein biomarkers would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are "masked" by many non-significant species at concentrations orders of magnitude higher. The extreme requirements on measurement sensitivity, dynamic range and specificity make method development extremely challenging. Current clinical protein biomarker measurement relies primarily on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement an extremely low-throughput one, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput, quantitative candidate protein biomarker verification. Among them, triple quadrupole MS-based assays are the most promising. When coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode

  20. Strong Sporadic E Occurrence Detected by Ground-Based GNSS

    Science.gov (United States)

    Sun, Wenjie; Ning, Baiqi; Yue, Xinan; Li, Guozhu; Hu, Lianhuan; Chang, Shoumin; Lan, Jiaping; Zhu, Zhengping; Zhao, Biqiang; Lin, Jian

    2018-04-01

    The ionospheric sporadic E (Es) layer has a significant impact on radio wave propagation. The traditional techniques employed for Es layer observation, for example ionosondes, are not dense enough to resolve the morphology and dynamics of the Es layer spatially. The ground-based Global Navigation Satellite System (GNSS) technique is expected to shed light on the understanding of regional strong Es occurrence, owing to the facts that the critical frequency (foEs) of a strong Es structure is usually high enough to cause pulse-like disturbances in GNSS total electron content (TEC), and that a large number of GNSS receivers have been deployed all over the world. Based on the Chinese ground-based GNSS networks, including the Crustal Movement Observation Network of China and the Beidou Ionospheric Observation Network, a large-scale strong Es event was observed at middle latitudes over China. The strong Es, shown as a band-like structure in the southwest-northeast direction, extended more than 1,000 km. A statistical comparison of Es occurrences identified from simultaneous observations by ionosondes and GNSS TEC receivers over middle-latitude China showed that GNSS TEC can be employed to observe strong Es occurrence, with a threshold foEs value of 14 MHz.
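
    Strong Es appears in GNSS TEC as short pulse-like disturbances, so a first-pass detector can subtract a slowly varying background from the TEC series and flag excursions above a noise-scaled threshold. The sketch below is an assumed illustration of such a detector (running-median window length and the 5-sigma threshold are placeholders), not the paper's processing chain.

```python
import numpy as np

def detect_tec_pulses(tec: np.ndarray, window: int = 61, k: float = 5.0):
    """Flag pulse-like TEC disturbances: subtract a running-median
    background, then threshold at k robust standard deviations."""
    pad = window // 2
    padded = np.pad(tec, pad, mode="edge")
    background = np.array([np.median(padded[i:i + window])
                           for i in range(len(tec))])
    residual = tec - background
    # Robust sigma via the median absolute deviation.
    sigma = 1.4826 * np.median(np.abs(residual - np.median(residual)))
    return np.flatnonzero(np.abs(residual) > k * sigma)

# Synthetic example: smooth trend plus one sporadic-E-like pulse.
t = np.arange(3600)
tec = (20 + 5 * np.sin(2 * np.pi * t / 3600)
       + 0.05 * np.random.default_rng(3).standard_normal(t.size))
tec[1800:1815] += 1.5           # pulse-like disturbance
print(detect_tec_pulses(tec))   # indices near 1800-1814
```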

  1. USB environment measurements based on full-scale static engine ground tests. [Upper Surface Blowing for YC-14

    Science.gov (United States)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive-lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data and to establish a basis for future flight test comparisons.

  2. 4D offline PET-based treatment verification in ion beam therapy. Experimental and clinical evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kurz, Christopher

    2014-06-12

    Due to the accessible sharp dose gradients, external beam radiotherapy with protons and heavier ions enables a highly conformal adaptation of the delivered dose to arbitrarily shaped tumour volumes. However, this high conformity is accompanied by an increased sensitivity to potential uncertainties, e.g., due to changes in the patient anatomy. Additional challenges are imposed by respiratory motion, which not only leads to rapid changes of the patient anatomy but also, in the case of actively scanned ion beams, to the formation of dose inhomogeneities. Therefore, it is highly desirable to verify the actual application of the treatment and to detect possible deviations with respect to the planned irradiation. At present, the only clinically implemented approach for a close-in-time verification of single treatment fractions is based on detecting the distribution of β+-emitters formed in nuclear fragmentation reactions during the irradiation by means of positron emission tomography (PET). For this purpose, a commercial PET/CT (computed tomography) scanner has been installed directly next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). Up to the present, however, the application of this treatment verification technique has still been limited to static target volumes. This thesis aimed at investigating the feasibility and performance of PET-based treatment verification under consideration of organ motion. In experimental irradiation studies with moving phantoms, not only could the practicability of PET-based treatment monitoring for moving targets, using a commercial PET/CT device, be shown for the first time, but also the potential of this technique to detect motion-related deviations from the planned treatment with sub-millimetre accuracy. The first application to four exemplary hepato-cellular carcinoma patient cases under substantially more challenging clinical conditions indicated potential for improvement by taking organ motion into

  3. 4D offline PET-based treatment verification in ion beam therapy. Experimental and clinical evaluation

    International Nuclear Information System (INIS)

    Kurz, Christopher

    2014-01-01

    Due to the accessible sharp dose gradients, external beam radiotherapy with protons and heavier ions enables a highly conformal adaptation of the delivered dose to arbitrarily shaped tumour volumes. However, this high conformity is accompanied by an increased sensitivity to potential uncertainties, e.g., due to changes in the patient anatomy. Additional challenges are imposed by respiratory motion, which not only leads to rapid changes of the patient anatomy but also, in the case of actively scanned ion beams, to the formation of dose inhomogeneities. Therefore, it is highly desirable to verify the actual application of the treatment and to detect possible deviations with respect to the planned irradiation. At present, the only clinically implemented approach for a close-in-time verification of single treatment fractions is based on detecting the distribution of β+-emitters formed in nuclear fragmentation reactions during the irradiation by means of positron emission tomography (PET). For this purpose, a commercial PET/CT (computed tomography) scanner has been installed directly next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). Up to the present, however, the application of this treatment verification technique has still been limited to static target volumes. This thesis aimed at investigating the feasibility and performance of PET-based treatment verification under consideration of organ motion. In experimental irradiation studies with moving phantoms, not only could the practicability of PET-based treatment monitoring for moving targets, using a commercial PET/CT device, be shown for the first time, but also the potential of this technique to detect motion-related deviations from the planned treatment with sub-millimetre accuracy. The first application to four exemplary hepato-cellular carcinoma patient cases under substantially more challenging clinical conditions indicated potential for improvement by taking organ motion into

  4. Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW) data set measures atmospheric water vapor using ground-based...

  5. MODELING ATMOSPHERIC EMISSION FOR CMB GROUND-BASED OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Errard, J.; Borrill, J. [Space Sciences Laboratory, University of California, Berkeley, CA 94720 (United States); Ade, P. A. R. [School of Physics and Astronomy, Cardiff University, Cardiff CF10 3XQ (United Kingdom); Akiba, Y.; Chinone, Y. [High Energy Accelerator Research Organization (KEK), Tsukuba, Ibaraki 305-0801 (Japan); Arnold, K.; Atlas, M.; Barron, D.; Elleflot, T. [Department of Physics, University of California, San Diego, CA 92093-0424 (United States); Baccigalupi, C.; Fabbian, G. [International School for Advanced Studies (SISSA), Trieste I-34014 (Italy); Boettger, D. [Department of Astronomy, Pontifica Universidad Catolica de Chile (Chile); Chapman, S. [Department of Physics and Atmospheric Science, Dalhousie University, Halifax, NS, B3H 4R2 (Canada); Cukierman, A. [Department of Physics, University of California, Berkeley, CA 94720 (United States); Delabrouille, J. [AstroParticule et Cosmologie, Univ Paris Diderot, CNRS/IN2P3, CEA/Irfu, Obs de Paris, Sorbonne Paris Cité (France); Dobbs, M.; Gilbert, A. [Physics Department, McGill University, Montreal, QC H3A 0G4 (Canada); Ducout, A.; Feeney, S. [Department of Physics, Imperial College London, London SW7 2AZ (United Kingdom); Feng, C. [Department of Physics and Astronomy, University of California, Irvine (United States); and others

    2015-08-10

    Atmosphere is one of the most important noise sources for ground-based cosmic microwave background (CMB) experiments. By increasing the optical loading on the detectors, it amplifies their effective noise, while its fluctuations introduce spatial and temporal correlations between detected signals. We present a physically motivated 3D model of the atmospheric total intensity emission at millimeter and sub-millimeter wavelengths. We derive a new analytical estimate for the correlation between detector time-ordered data as a function of the instrument and survey design, as well as several atmospheric parameters such as wind, relative humidity, temperature and turbulence characteristics. Using an original numerical computation, we examine the effect of each physical parameter on the correlations in the time series of a given experiment. We then use a parametric-likelihood approach to validate the modeling and estimate atmospheric parameters from the POLARBEAR-I project's first-season data set. We derive a new 1.0% upper limit on the linear polarization fraction of atmospheric emission. We also compare our results to previous studies and weather station measurements. The proposed model can be used for realistic simulations of future ground-based CMB observations.

  6. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    This document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  7. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    OpenAIRE

    Jin-Won Park; Sung Bum Pan; Yongwha Chung; Daesung Moon

    2009-01-01

    As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification i...

  8. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    International Nuclear Information System (INIS)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-01-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. Odometry is widely used as the method based on internal sensors, but it suffers from cumulative errors. In methods using the laser range sensor (LRS), a kind of external sensor, the estimation accuracy is affected by the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization, integrating the LRS measurement data and the odometry information, with their relative weightings balanced adaptively according to the number of available LRS measurement data. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10-scale vehicle. The verification is conducted in situations where the vehicle position cannot be localized uniquely in a certain direction using the LRS measurement data alone. We achieve accurate localization even in such situations by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF). (paper)
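
    A compact way to see the integration described above: stack the poses over a short horizon and minimize a cost that penalizes disagreement with odometry increments plus disagreement with LRS position fixes, with the LRS weight scaled by how many measurements are available. The Python sketch below (1-D for brevity; weights, horizon, and the scaling rule are invented) is an assumed illustration of the MHE idea, not the authors' formulation.

```python
import numpy as np
from scipy.optimize import minimize

def mhe_estimate(odom_increments, lrs_fixes, w_odom=1.0, w_lrs_each=0.5):
    """Estimate positions x_0..x_N over a horizon by trading off
    odometry increments against sparse LRS position fixes.

    odom_increments : length-N array of measured displacements
    lrs_fixes       : list of (time_index, measured_position)
    The LRS weight grows with the number of available fixes, loosely
    mirroring the adaptive balancing described in the abstract.
    """
    N = len(odom_increments)
    w_lrs = w_lrs_each * len(lrs_fixes)

    def cost(x):
        c = w_odom * np.sum((np.diff(x) - odom_increments) ** 2)
        c += w_lrs * sum((x[k] - z) ** 2 for k, z in lrs_fixes)
        return c

    x0 = np.concatenate([[0.0], np.cumsum(odom_increments)])  # dead reckoning
    return minimize(cost, x0).x

# Odometry drifts (each increment biased by +0.05); two LRS fixes correct it.
true_x = np.linspace(0, 5, 11)
odom = np.diff(true_x) + 0.05
fixes = [(5, true_x[5]), (10, true_x[10])]
print(np.round(mhe_estimate(odom, fixes), 2))
```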

  9. Computer-aided diagnosis of mammographic masses using geometric verification-based image retrieval

    Science.gov (United States)

    Li, Qingliang; Shi, Weili; Yang, Huamin; Zhang, Huimao; Li, Guoxin; Chen, Tao; Mori, Kensaku; Jiang, Zhengang

    2017-03-01

    Computer-aided diagnosis of masses in mammograms is an important aid in the detection of breast cancer, and the use of retrieval systems in breast examination is increasing gradually. In this respect, methods exploiting the vocabulary tree framework and the inverted file for mammographic mass retrieval have been shown to offer high accuracy and excellent scalability. However, they treat the features in each image as independent visual words and ignore the spatial configuration of the features, which greatly affects retrieval performance. To overcome this drawback, we introduce a geometric verification method for mammographic mass retrieval. First, we obtain corresponding feature matches based on the vocabulary tree framework and the inverted file. After that, we capture the local similarity of deformations by constructing circular regions around the corresponding pairs. We then segment each circle to express the geometric relationship of the local matches in the area and generate a strict spatial encoding. Finally, we judge whether the matched features are correct by verifying whether all spatial encodings satisfy geometric consistency. Experiments show the promising results of our approach.
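
    Generic geometric verification of candidate matches can be sketched as a robust model fit: estimate a similarity transform between matched keypoints with RANSAC and keep a retrieval candidate only if the inlier fraction is high. This stand-in (OpenCV's estimateAffinePartial2D, with an invented inlier threshold) illustrates the verification step in general; it is not the circle-segmentation spatial encoding proposed in the paper.

```python
import cv2
import numpy as np

def geometric_consistency(pts_query: np.ndarray, pts_candidate: np.ndarray,
                          min_inlier_frac: float = 0.5) -> bool:
    """Accept a candidate match set when a RANSAC-fitted similarity
    transform explains most of the correspondences."""
    if len(pts_query) < 3:
        return False
    _, inliers = cv2.estimateAffinePartial2D(
        pts_query.astype(np.float32), pts_candidate.astype(np.float32),
        method=cv2.RANSAC, ransacReprojThreshold=3.0)
    if inliers is None:
        return False
    return inliers.mean() >= min_inlier_frac

# Toy check: a rotated/translated point set passes, a shuffled one fails.
rng = np.random.default_rng(4)
pts = rng.uniform(0, 100, (30, 2))
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(geometric_consistency(pts, pts @ R.T + 10))        # True
print(geometric_consistency(pts, rng.permutation(pts)))  # False (typically)
```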

  10. Csf Based Non-Ground Points Extraction from LIDAR Data

    Science.gov (United States)

    Shen, A.; Zhang, W.; Shi, H.

    2017-09-01

    Region growing is a classical method of point cloud segmentation. Based on the idea of collecting pixels with similar properties to form regions, region growing is widely used in many fields such as medicine, forestry and remote sensing. In this algorithm there are two core problems: one is the selection of seed points, the other is the setting of the growth constraints, of which the selection of the seed points is the foundation. In this paper, we propose a CSF (Cloth Simulation Filtering) based method to extract non-ground seed points effectively. The experiments have shown that this method obtains a better set of seed points than traditional methods. It is a new attempt at extracting seed points.
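
    The role CSF plays here, separating ground from non-ground before seeding region growing, can be mimicked without the full cloth simulation: approximate the ground in each grid cell by its lowest point and treat points sufficiently far above that surface as non-ground seed candidates. The sketch below is a deliberately simplified stand-in (grid size and height threshold are invented), not the CSF algorithm itself.

```python
import numpy as np

def non_ground_seeds(points: np.ndarray, cell: float = 1.0,
                     height_thresh: float = 0.5) -> np.ndarray:
    """Return indices of candidate non-ground seed points: points lying
    more than height_thresh above the lowest point of their grid cell
    (a crude ground proxy standing in for cloth simulation filtering)."""
    keys = np.floor(points[:, :2] / cell).astype(np.int64)
    ground = {}
    for (ix, iy), z in zip(map(tuple, keys), points[:, 2]):
        ground[(ix, iy)] = min(ground.get((ix, iy), np.inf), z)
    floor = np.array([ground[tuple(k)] for k in keys])
    return np.flatnonzero(points[:, 2] - floor > height_thresh)

# Toy cloud: flat ground plus a small "building" block above it.
rng = np.random.default_rng(5)
ground_pts = np.column_stack([rng.uniform(0, 10, (500, 2)),
                              rng.normal(0.0, 0.02, 500)])
building = np.column_stack([rng.uniform(4, 6, (100, 2)),
                            rng.uniform(2, 5, 100)])
cloud = np.vstack([ground_pts, building])
print(len(non_ground_seeds(cloud)))  # roughly the 100 building points
```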

  11. Monitoring Hydraulic Fracturing Using Ground-Based Controlled Source Electromagnetics

    Science.gov (United States)

    Hickey, M. S.; Trevino, S., III; Everett, M. E.

    2017-12-01

    Hydraulic fracturing allows hydrocarbon production in low permeability formations. Imaging the distribution of fluid used to create a hydraulic fracture can aid in the characterization of fracture properties such as extent of plume penetration as well as fracture azimuth and symmetry. This could contribute to improving the efficiency of an operation, for example, in helping to determine ideal well spacing or the need to refracture a zone. A ground-based controlled-source electromagnetics (CSEM) technique is ideal for imaging the fluid due to the change in field caused by the difference in the conductive properties of the fluid when compared to the background. With advances in high signal to noise recording equipment, coupled with a high-power, broadband transmitter we can show hydraulic fracture extent and azimuth with minimal processing. A 3D finite element code is used to model the complete well casing along with the layered subsurface. This forward model is used to optimize the survey design and isolate the band of frequencies with the best response. In the field, the results of the modeling are also used to create a custom pseudorandom numeric (PRN) code to control the frequencies transmitted through a grounded dipole source. The receivers record the surface voltage across two grounded dipoles, one parallel and one perpendicular to the transmitter. The data are presented as the displays of amplitude ratios across several frequencies with the associated spatial information. In this presentation, we show multiple field results in multiple basins in the United States along with the CSEM theory used to create the survey designs.

  12. Mechanisms of time-based figure-ground segregation.

    Science.gov (United States)

    Kandil, Farid I; Fahle, Manfred

    2003-11-01

    Figure-ground segregation can rely on purely temporal information, that is, on short temporal delays between positional changes of elements in figure and ground (Kandil, F.I. & Fahle, M. (2001) Eur. J. Neurosci., 13, 2004-2008). Here, we investigate the underlying mechanisms by measuring temporal segregation thresholds for various kinds of motion cues. Segregation can rely on monocular first-order motion (based on luminance modulation) and second-order motion cues (contrast modulation) with a high temporal resolution of approximately 20 ms. The mechanism can also use isoluminant motion with a reduced temporal resolution of 60 ms. Figure-ground segregation can be achieved even at presentation frequencies too high for human subjects to inspect successive frames individually. In contrast, when stimuli are presented dichoptically, i.e. separately to both eyes, subjects are unable to perceive any segregation, irrespective of temporal frequency. We propose that segregation in these displays is detected by a mechanism consisting of at least two stages. On the first level, standard motion or flicker detectors signal local positional changes (flips). On the second level, a segregation mechanism combines the local activities of the low-level detectors with high temporal precision. Our findings suggest that the segregation mechanism can rely on monocular detectors but not on binocular mechanisms. Moreover, the results oppose the idea that segregation in these displays is achieved by motion detectors of a higher order (motion-from-motion), but favour mechanisms sensitive to short temporal delays even without activation of higher-order motion detectors.

  13. SU-C-207A-04: Accuracy of Acoustic-Based Proton Range Verification in Water

    International Nuclear Information System (INIS)

    Jones, KC; Sehgal, CM; Avery, S; Vander Stappen, F

    2016-01-01

    Purpose: To determine the accuracy and dose required for acoustic-based proton range verification (protoacoustics) in water. Methods: Proton pulses with 17 µs FWHM and instantaneous currents of 480 nA (5.6 × 10^7 protons/pulse, 8.9 cGy/pulse) were generated by a clinical, hospital-based cyclotron at the University of Pennsylvania. The protoacoustic signal generated in a water phantom by the 190 MeV proton pulses was measured with a hydrophone placed at multiple known positions surrounding the dose deposition. The background random noise was measured. The protoacoustic signal was simulated for comparison with the experiments. Results: The maximum protoacoustic signal amplitude at 5 cm distance was 5.2 mPa per 1 × 10^7 protons (1.6 cGy at the Bragg peak). The background random noise of the measurement was 27 mPa. Comparison between simulation and experiment indicates that the hydrophone introduced a delay of 2.4 µs. For acoustic data collected with a signal-to-noise ratio (SNR) of 21, deconvolution of the protoacoustic signal with the proton pulse provided the most precise time-of-flight range measurement (standard deviation of 2.0 mm), but a systematic error (−4.5 mm) was observed. Conclusion: Based on water phantom measurements at a clinical hospital-based cyclotron, protoacoustics is a potential technique for measuring the proton Bragg peak range with a 2.0 mm standard deviation. Simultaneous use of multiple detectors is expected to reduce the standard deviation, but calibration is required to remove systematic error. Based on the measured background noise and protoacoustic amplitude, an SNR of 5.3 is projected for a deposited dose of 2 Gy.
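
    The range retrieval in the abstract is, at heart, a time-of-flight computation: the Bragg-peak-to-detector distance is the speed of sound in water times the measured acoustic arrival time, after removing the known instrumental delay. A worked sketch under assumed values follows (a sound speed of 1480 m/s near room temperature is an assumption; the 2.4 µs hydrophone delay is the value quoted above).

```python
# Time-of-flight protoacoustic range estimate (illustrative numbers).
C_WATER = 1480.0            # speed of sound in water, m/s (assumed ~20 °C)
HYDROPHONE_DELAY = 2.4e-6   # instrumental delay reported above, s

def bragg_peak_distance(arrival_time_s: float) -> float:
    """Distance from Bragg peak to hydrophone from the acoustic
    arrival time, corrected for the instrumental delay."""
    return C_WATER * (arrival_time_s - HYDROPHONE_DELAY)

# A 36.2 us arrival would place the Bragg peak ~50 mm from the detector.
print(f"{bragg_peak_distance(36.2e-6) * 1e3:.1f} mm")
```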

  14. Bridge Testing With Ground-Based Interferometric Radar: Experimental Results

    International Nuclear Information System (INIS)

    Chiara, P.; Morelli, A.

    2010-01-01

    The research into innovative non-contact techniques aimed at the vibration measurement of civil engineering structures (also for damage detection and structural health monitoring) is continuously directed at the optimization of measures and methods. Ground-Based Radar Interferometry (GBRI) represents the most recent technique available for static and dynamic monitoring of structures and ground movements. Dynamic tests of bridges and buildings in operational conditions are currently performed: (a) to assess the conformity of the structure to the project design at the end of construction; (b) to identify the modal parameters (i.e. natural frequencies, mode shapes and damping ratios) and to check the variation of any modal parameters over the years; (c) to evaluate the amplitude of the structural response to special load conditions (i.e. strong winds, earthquakes, heavy railway or roadway loads). If such tests are carried out using a non-contact technique (like GBRI), the classical issues of contact sensors (like accelerometers) are easily overcome. This paper presents and discusses the results of various tests carried out on full-scale bridges using a Stepped-Frequency Continuous-Wave radar system.

  15. Bridge Testing With Ground-Based Interferometric Radar: Experimental Results

    Science.gov (United States)

    Chiara, P.; Morelli, A.

    2010-05-01

    The research into innovative non-contact techniques aimed at the vibration measurement of civil engineering structures (also for damage detection and structural health monitoring) is continuously directed at the optimization of measures and methods. Ground-Based Radar Interferometry (GBRI) represents the most recent technique available for static and dynamic monitoring of structures and ground movements. Dynamic tests of bridges and buildings in operational conditions are currently performed: (a) to assess the conformity of the structure to the project design at the end of construction; (b) to identify the modal parameters (i.e. natural frequencies, mode shapes and damping ratios) and to check the variation of any modal parameters over the years; (c) to evaluate the amplitude of the structural response to special load conditions (i.e. strong winds, earthquakes, heavy railway or roadway loads). If such tests are carried out using a non-contact technique (like GBRI), the classical issues of contact sensors (like accelerometers) are easily overcome. This paper presents and discusses the results of various tests carried out on full-scale bridges using a Stepped-Frequency Continuous-Wave radar system.

  16. Ground-based observations coordinated with Viking satellite measurements

    International Nuclear Information System (INIS)

    Opgenoorth, H.J.; Kirkwood, S.

    1989-01-01

    The instrumentation and the orbit of the Viking satellite made this first Swedish satellite mission ideally suited for coordinated observations with the dense network of ground-based stations in northern Scandinavia. Several arrays of complementary instruments, such as magnetometers, all-sky cameras, riometers and Doppler radars, routinely monitored the ionosphere beneath the magnetospheric region traversed by Viking. For a large number of orbits, the Viking passages close to Scandinavia were covered by the operation of specially designed programmes at the European incoherent-scatter facility (EISCAT). First results of coordinated observations on the ground and aboard Viking have shed new light on the most spectacular feature of substorm expansion, the westward-travelling surge. The end of a substorm and the associated decay of a westward-travelling surge have been analysed. EISCAT measurements of high spatial and temporal resolution indicate that the conductivities and electric fields associated with westward-travelling surges are not represented correctly by the existing models. (author)

  17. Dynamic knowledge representation using agent-based modeling: ontology instantiation and verification of conceptual models.

    Science.gov (United States)

    An, Gary

    2009-01-01

    The sheer volume of biomedical research threatens to overwhelm the capacity of individuals to effectively process this information. Adding to this challenge is the multiscale nature of both biological systems and the research community as a whole. Given this volume and rate of generation of biomedical information, the research community must develop methods for robust representation of knowledge in order for individuals, and the community as a whole, to "know what they know." Despite increasing emphasis on "data-driven" research, the fact remains that researchers guide their research using intuitively constructed conceptual models derived from knowledge extracted from publications, knowledge that is generally qualitatively expressed using natural language. Agent-based modeling (ABM) is a computational modeling method that is suited to translating the knowledge expressed in biomedical texts into dynamic representations of the conceptual models generated by researchers. The hierarchical object-class orientation of ABM maps well to biomedical ontological structures, facilitating the translation of ontologies into instantiated models. Furthermore, ABM is suited to producing the nonintuitive behaviors that often "break" conceptual models. Verification in this context is focused at determining the plausibility of a particular conceptual model, and qualitative knowledge representation is often sufficient for this goal. Thus, utilized in this fashion, ABM can provide a powerful adjunct to other computational methods within the research process, as well as providing a metamodeling framework to enhance the evolution of biomedical ontologies.

  18. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Validity and correctness testing of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole workpiece-measurement process is simulated by the VGMI in place of a GMI, and the measuring software is tested in the proposed virtual environment. Taking the involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software were carried out on an involute master in a GMI. The experimental results indicate consistency between the tooth profile deviations and the calibration results, thus verifying the accuracy of the gear measuring system, including its measurement procedures. It is shown that the VGMI presented can be applied to the validation of measuring software, providing a new ideal platform for testing complex workpiece-measuring software without calibrated artifacts.
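
    The collision-detection building block can be illustrated with a ray/triangle intersection test against the triangular-patch workpiece. The sketch below uses the standard Moller-Trumbore algorithm; it is a generic stand-in, since the paper does not publish its universal collision detection model:

    import numpy as np

    def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
        """Return the ray parameter t at the hit point, or None if no hit."""
        e1, e2 = v1 - v0, v2 - v0
        p = np.cross(direction, e2)
        det = np.dot(e1, p)
        if abs(det) < eps:                  # ray parallel to the patch plane
            return None
        inv_det = 1.0 / det
        s = origin - v0
        u = np.dot(s, p) * inv_det
        if u < 0.0 or u > 1.0:
            return None
        q = np.cross(s, e1)
        v = np.dot(direction, q) * inv_det
        if v < 0.0 or u + v > 1.0:
            return None
        t = np.dot(e2, q) * inv_det
        return t if t >= eps else None      # hit must lie ahead of the probe

    # Virtual stylus travelling along -z toward a patch in the z = 0 plane:
    hit = ray_triangle_intersect(
        np.array([0.2, 0.2, 1.0]), np.array([0.0, 0.0, -1.0]),
        np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
        np.array([0.0, 1.0, 0.0]))
    print(hit)  # 1.0: contact after one unit of travel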

  19. Simple thermal to thermal face verification method based on local texture descriptors

    Science.gov (United States)

    Grudzien, A.; Palka, Norbert; Kowalski, M.

    2017-08-01

    Biometrics is the science of analyzing the physical structure of the human body and human behaviour. Biometrics has found many applications, ranging from border control systems and forensic systems for criminal investigations to systems for access control. Unique identifiers, also referred to as modalities, are used to distinguish individuals. One of the most common and natural human identifiers is the face. As a result of decades of investigation, face recognition has achieved a high level of maturity; however, recognition in the visible spectrum is still challenging due to illumination effects and new ways of spoofing. One of the alternatives is recognition of the face in a different part of the light spectrum, e.g. the infrared. Thermal infrared offers new possibilities for human recognition due to its specific properties as well as mature equipment. In this paper we present a methodology for subject verification using facial images in the thermal range. The study is focused on local feature extraction methods and on similarity metrics. We present a comparison of two local texture-based descriptors for thermal 1-to-1 face recognition.
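
    As an illustration of a local texture-based pipeline (the specific descriptors compared in the paper are not reproduced here), uniform Local Binary Patterns with a chi-square histogram distance are a typical choice for thermal 1-to-1 verification:

    import numpy as np
    from skimage.feature import local_binary_pattern

    P, R = 8, 1   # 8 neighbours on a radius-1 circle

    def lbp_descriptor(face: np.ndarray) -> np.ndarray:
        codes = local_binary_pattern(face, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
        return hist

    def chi_square(h1, h2, eps=1e-10):
        return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

    # 1-to-1 verification: accept the identity claim if the probe-template
    # distance falls below a threshold tuned on a development set.
    rng = np.random.default_rng(0)
    enrolled = rng.integers(0, 255, (128, 128)).astype(float)  # stand-in image
    probe = enrolled + rng.normal(0, 5, enrolled.shape)
    d = chi_square(lbp_descriptor(enrolled), lbp_descriptor(probe))
    print("match" if d < 0.05 else "no match", d)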

  20. Reconstruction of Sky Illumination Domes from Ground-Based Panoramas

    Science.gov (United States)

    Coubard, F.; Lelégard, L.; Brédif, M.; Paparoditis, N.; Briottet, X.

    2012-07-01

    The knowledge of the sky illumination is important for radiometric corrections and for computer graphics applications such as relighting or augmented reality. We propose an approach to compute environment maps, representing the sky radiance, from a set of ground-based images acquired by a panoramic acquisition system, for instance a mobile-mapping system. These images can be affected by important radiometric artifacts, such as bloom or overexposure. A Perez radiance model is estimated with the blue sky pixels of the images, and used to compute additive corrections in order to reduce these radiometric artifacts. The sky pixels are then aggregated in an environment map, which still suffers from discontinuities on stitching edges. The influence of the quality of estimated sky radiance on the simulated light signal is measured quantitatively on a simple synthetic urban scene; in our case, the maximal error for the total sensor radiance is about 10%.
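
    The Perez all-weather model mentioned above expresses the relative radiance of a sky element through a gradation term in the element's zenith angle and an indicatrix term in its angular distance from the sun. A compact sketch (with arbitrary clear-sky coefficients; the paper estimates them from the blue-sky pixels):

    import numpy as np

    def perez_radiance(theta, gamma, a, b, c, d, e):
        """Relative sky radiance; theta: zenith angle of the sky element (rad),
        gamma: angle between the sky element and the sun (rad)."""
        gradation = 1.0 + a * np.exp(b / np.cos(theta))
        indicatrix = 1.0 + c * np.exp(d * gamma) + e * np.cos(gamma) ** 2
        return gradation * indicatrix

    # Relative radiance 30 deg from the sun, for an element at 45 deg zenith:
    print(perez_radiance(np.radians(45), np.radians(30),
                         a=-1.0, b=-0.32, c=10.0, d=-3.0, e=0.45))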

  1. Ground-based transmission line conductor motion sensor

    International Nuclear Information System (INIS)

    Jacobs, M.L.; Milano, U.

    1988-01-01

    A ground-based conductor-motion-sensing apparatus is provided for remotely sensing movement of electric-power transmission lines, particularly as would occur during the wind-induced condition known as galloping. The apparatus comprises a motion sensor and signal-generating means which are placed underneath a transmission line and sense changes in the electric field around the line due to excessive line motion. The detector then signals a remote station when a condition of galloping is sensed. The apparatus of the present invention is advantageous over the line-mounted sensors of the prior art in that it is easier and less hazardous to install. The system can also be modified so that a signal will only be given when particular conditions, such as a specific temperature range, large-amplitude line motion, or excessive duration of the line motion, are occurring.

  2. RECONSTRUCTION OF SKY ILLUMINATION DOMES FROM GROUND-BASED PANORAMAS

    Directory of Open Access Journals (Sweden)

    F. Coubard

    2012-07-01

    The knowledge of the sky illumination is important for radiometric corrections and for computer graphics applications such as relighting or augmented reality. We propose an approach to compute environment maps, representing the sky radiance, from a set of ground-based images acquired by a panoramic acquisition system, for instance a mobile-mapping system. These images can be affected by important radiometric artifacts, such as bloom or overexposure. A Perez radiance model is estimated with the blue sky pixels of the images, and used to compute additive corrections in order to reduce these radiometric artifacts. The sky pixels are then aggregated in an environment map, which still suffers from discontinuities on stitching edges. The influence of the quality of estimated sky radiance on the simulated light signal is measured quantitatively on a simple synthetic urban scene; in our case, the maximal error for the total sensor radiance is about 10%.

  3. Wavelet-based ground vehicle recognition using acoustic signals

    Science.gov (United States)

    Choe, Howard C.; Karlsen, Robert E.; Gerhart, Grant R.; Meitzler, Thomas J.

    1996-03-01

    We present, in this paper, a wavelet-based acoustic signal analysis to remotely recognize military vehicles using their sound intercepted by acoustic sensors. Since expedited signal recognition is imperative in many military and industrial situations, we developed an algorithm that provides an automated, fast signal recognition once implemented in a real-time hardware system. This algorithm consists of wavelet preprocessing, feature extraction and compact signal representation, and a simple but effective statistical pattern matching. The current status of the algorithm does not require any training. The training is replaced by human selection of reference signals (e.g., squeak or engine exhaust sound) distinctive to each individual vehicle based on human perception. This allows a fast archiving of any new vehicle type in the database once the signal is collected. The wavelet preprocessing provides time-frequency multiresolution analysis using discrete wavelet transform (DWT). Within each resolution level, feature vectors are generated from statistical parameters and energy content of the wavelet coefficients. After applying our algorithm on the intercepted acoustic signals, the resultant feature vectors are compared with the reference vehicle feature vectors in the database using statistical pattern matching to determine the type of vehicle from where the signal originated. Certainly, statistical pattern matching can be replaced by an artificial neural network (ANN); however, the ANN would require training data sets and time to train the net. Unfortunately, this is not always possible for many real world situations, especially collecting data sets from unfriendly ground vehicles to train the ANN. Our methodology using wavelet preprocessing and statistical pattern matching provides robust acoustic signal recognition. We also present an example of vehicle recognition using acoustic signals collected from two different military ground vehicles. In this paper, we will
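
    The feature pipeline described (DWT per-level statistics plus nearest-reference matching) can be sketched with PyWavelets; the wavelet family, feature set and synthetic signals are assumptions, not the authors' specification:

    import numpy as np
    import pywt

    def acoustic_features(signal, wavelet="db4", level=5):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        feats = []
        for c in coeffs:                      # one band per resolution level
            feats += [np.sum(c ** 2),         # band energy
                      np.std(c), np.mean(np.abs(c))]
        f = np.asarray(feats)
        return f / np.linalg.norm(f)          # scale-invariant feature vector

    def classify(signal, references):
        """references: dict of vehicle name -> reference feature vector."""
        f = acoustic_features(signal)
        return min(references, key=lambda k: np.linalg.norm(f - references[k]))

    rng = np.random.default_rng(1)
    refs = {"tracked": acoustic_features(rng.normal(size=4096)),
            "wheeled": acoustic_features(rng.normal(size=4096) ** 3)}
    print(classify(rng.normal(size=4096), refs))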

  4. Satellite and Ground Based Monitoring of Aerosol Plumes

    International Nuclear Information System (INIS)

    Doyle, Martin; Dorling, Stephen

    2002-01-01

    Plumes of atmospheric aerosol have been studied using a range of satellite and ground-based techniques. The Sea-viewing Wide Field-of-view Sensor (SeaWiFS) has been used to observe plumes of sulphate aerosol and Saharan dust around the coast of the United Kingdom. Aerosol Optical Thickness (AOT) was retrieved from SeaWiFS for two events: a plume of Saharan dust transported over the United Kingdom from Western Africa, and a period of elevated sulphate experienced over the Eastern region of the UK. Patterns of AOT are discussed and related to the synoptic and mesoscale weather conditions. Further observation of the sulphate aerosol event was undertaken using the Advanced Very High Resolution Radiometer instrument (AVHRR). Atmospheric back trajectories and weather conditions were studied in order to identify the meteorological conditions which led to this event. Co-located ground-based measurements of PM10 and PM2.5 were obtained for 4 sites within the UK and PM2.5/PM10 ratios were calculated in order to identify any unusually high or low ratios (indicating the dominant size fraction within the plume) during either of these events. Calculated percentiles of PM2.5/PM10 ratios during the 2 events examined show that these events were notable within the record, but were in no way unique or unusual in the context of a 3-year monitoring record. Visibility measurements for both episodes have been examined and show that visibility degradation occurred during both the sulphate aerosol and Saharan dust episodes.

  5. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm^3 ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach.
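
    The acceptance rule implied by these confidence limits is easy to state in code; the sketch below screens per-beam deviations against the 3% / 6 cGy criterion quoted above (the numbers in the example are invented, not data from the study):

    # Hedged sketch of the routine QA screening rule described above.
    def passes_qa(dev_percent, dev_cgy, limit_pct=3.0, limit_cgy=6.0):
        return abs(dev_percent) <= limit_pct or abs(dev_cgy) <= limit_cgy

    for pct, cgy in [(1.2, 2.1), (2.8, 5.0), (4.1, 6.5)]:   # (%, cGy) per beam
        print(pct, cgy, "pass" if passes_qa(pct, cgy) else "review")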

  6. Accelerating SystemVerilog UVM Based VIP to Improve Methodology for Verification of Image Signal Processing Designs Using HW Emulator

    OpenAIRE

    Jain, Abhishek; Gupta, Piyush Kumar; Gupta, Dr. Hima; Dhar, Sachish

    2014-01-01

    In this paper we present the development of Acceleratable UVCs from standard UVCs in SystemVerilog and their usage in UVM based Verification Environment of Image Signal Processing designs to increase run time performance. This paper covers development of Acceleratable UVCs from standard UVCs for internal control and data buses of ST imaging group by partitioning of transaction-level components and cycle-accurate signal-level components between the software simulator and hardware accelerator r...

  7. Ground-Based Remote Sensing of Volcanic CO2 Fluxes at Solfatara (Italy)—Direct Versus Inverse Bayesian Retrieval

    Directory of Open Access Journals (Sweden)

    Manuel Queißer

    2018-01-01

    CO2 is the second most abundant volatile species of degassing magma. CO2 fluxes carry information of incredible value, such as periods of volcanic unrest. Ground-based laser remote sensing is a powerful technique to measure CO2 fluxes in a spatially integrated manner, quickly and from a safe distance, but it needs accurate knowledge of the plume speed. The latter is often difficult to estimate, particularly for complex topographies. So, a supplementary or even alternative way of retrieving fluxes would be beneficial. Here, we assess Bayesian inversion as a potential technique for the case of the volcanic crater of Solfatara (Italy), a complex terrain hosting two major CO2-degassing fumarolic vents close to a steep slope. Direct integration of remotely sensed CO2 concentrations of these vents using plume speed derived from optical flow analysis yielded a flux of 717 ± 121 t day⁻¹, in agreement with independent measurements. The flux from Bayesian inversion based on a simple Gaussian plume model was in excellent agreement under certain conditions. In conclusion, Bayesian inversion is a promising retrieval tool for CO2 fluxes, especially in situations where plume speed estimation methods fail, e.g., optical flow for transparent plumes. The results have implications beyond volcanology, including ground-based remote sensing of greenhouse gases and verification of satellite soundings.
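
    The "direct integration" retrieval reduces to integrating the measured CO2 column densities across a plume transect and scaling by the plume speed. A minimal sketch, with invented units and numbers:

    import numpy as np

    def direct_flux(column_kg_m2, x_m, speed_m_s):
        """Mass flux (kg/s) from column densities (kg/m^2) along a transect (m)."""
        return np.trapz(column_kg_m2, x_m) * speed_m_s

    x = np.linspace(0.0, 200.0, 101)                 # 200 m scan line
    col = 2e-3 * np.exp(-((x - 100.0) / 30.0) ** 2)  # Gaussian-ish plume
    flux = direct_flux(col, x, speed_m_s=4.0)
    print(flux * 86400 / 1000, "t/day")              # ~37 t/day here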

  8. Hydrogeology, simulated ground-water flow, and ground-water quality, Wright-Patterson Air Force Base, Ohio

    Science.gov (United States)

    Dumouchelle, D.H.; Schalk, C.W.; Rowe, G.L.; De Roche, J.T.

    1993-01-01

    Ground water is the primary source of water in the Wright-Patterson Air Force Base area. The aquifer consists of glacial sands and gravels that fill a buried bedrock-valley system. Consolidated rocks in the area consist of poorly permeable Ordovician shale of the Richmondian stage and, in the upland areas, the Brassfield Limestone of Silurian age. The valleys are filled with glacial sediments of Wisconsinan age consisting of clay-rich tills and coarse-grained outwash deposits. Estimates of hydraulic conductivity of the shales based on results of displacement/recovery tests range from 0.0016 to 12 feet per day; estimates for the glacial sediments range from less than 1 foot per day to more than 1,000 feet per day. Ground water flows from the uplands towards the valleys and the major rivers in the region, the Great Miami and the Mad Rivers. Hydraulic-head data indicate that ground water flows between the bedrock and unconsolidated deposits. Data from a gain/loss study of the Mad River system and hydrographs from nearby wells reveal that the reach of the river next to Wright-Patterson Air Force Base is a ground-water discharge area. A steady-state, three-dimensional ground-water-flow model was developed to simulate ground-water flow in the region. The model contains three layers and encompasses about 100 square miles centered on Wright-Patterson Air Force Base. Ground water enters the modeled area primarily by river leakage and underflow at the model boundary. Ground water exits the modeled area primarily by flow through the valleys at the model boundaries and through production wells. A model sensitivity analysis involving systematic changes in values of hydrologic parameters in the model indicates that the model is most sensitive to decreases in riverbed conductance and vertical conductance between the upper two layers. The analysis also indicates that the contribution of water to the buried-valley aquifer from the bedrock that forms the valley walls is about 2 to 4

  9. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example of a life cycle development process and V&V on FPGA-based I&C is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence in and ensure the reliability of a nuclear FPGA-based safety system, life cycle processes of disciplined specification and implementation of the design, as well as verification and validation (V&V) against regulations, are needed. A specific example of how to conduct the life cycle development process and V&V on an FPGA-based core heat removal (CHR) protection system for a CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for a PWR has been designed, implemented, verified and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios, and to provide input data to the under-test FPGA-based CHR protection system and a verified C-code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would contribute as an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  10. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions

    International Nuclear Information System (INIS)

    Woodruff, Henry C.; Fuangrod, Todsaporn; Van Uytven, Eric; McCurdy, Boyd M.C.; Beek, Timothy van; Bhatia, Shashank; Greer, Peter B.

    2015-01-01

    Purpose: Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. Methods and Materials: At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. Results: For cumulative-frame χ analysis evaluated globally over the integrated image, the system reported real-time pass rates (mean ± standard deviation) of 91.1% ± 11.5% with 4%, 4 mm criteria and 83.6% ± 13.2% with 3%, 3 mm criteria. Conclusions: A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy.
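
    A simplified version of such a frame comparison can be sketched as a dose-difference test relaxed by a distance-to-agreement (DTA) search: a measured pixel passes if it matches the prediction within a dose tolerance, or if some predicted value within the DTA radius does. The neighbourhood min/max trick below is a common fast approximation and is not the WatchDog implementation:

    import numpy as np
    from scipy.ndimage import grey_dilation, grey_erosion

    def pass_rate(measured, predicted, dose_tol=0.04, dta_px=4):
        tol = dose_tol * predicted.max()          # global dose criterion
        size = (2 * dta_px + 1, 2 * dta_px + 1)   # DTA search window
        hi = grey_dilation(predicted, size=size) + tol
        lo = grey_erosion(predicted, size=size) - tol
        return np.mean((measured >= lo) & (measured <= hi))

    rng = np.random.default_rng(2)
    pred = rng.random((384, 512))
    meas = pred + rng.normal(0, 0.01, pred.shape)  # well-delivered field
    print(f"{100 * pass_rate(meas, pred):.1f}% of pixels pass")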

  11. Verification of the Microgravity Active Vibration Isolation System based on Parabolic Flight

    Science.gov (United States)

    Zhang, Yong-kang; Dong, Wen-bo; Liu, Wei; Li, Zong-feng; Lv, Shi-meng; Sang, Xiao-ru; Yang, Yang

    2017-12-01

    The Microgravity Active Vibration Isolation System (MAIS) is a device to reduce on-orbit vibration and to provide a lower gravity level for certain scientific experiments. The MAIS is made up of a stator and a floater; the stator is fixed on the spacecraft, and the floater is suspended by electromagnetic force so as to reduce the vibration from the stator. The system has 3 position sensors, 3 accelerometers, 8 Lorentz actuators, signal processing circuits and a central controller embedded in the operating software and control algorithms. For the experiments on parabolic flights, a laptop is added to MAIS for monitoring and operation, and a power module for electric power conversion. The principle of MAIS is as follows: the system samples the vibration acceleration of the floater from the accelerometers, measures the displacement between stator and floater from position-sensitive detectors, and computes the Lorentz force current for each actuator so as to eliminate the vibration of the scientific payload while avoiding collisions between the stator and the floater. This is a motion control technique in 6 degrees of freedom (6-DOF), and its function can only be verified in a microgravity environment. Thanks to DLR and Novespace, we had the opportunity to join the 27th DLR parabolic flight campaign and perform experiments to verify the 6-DOF control technique. The experimental results validate that the 6-DOF motion control technique is effective, and that the vibration isolation performance matches what we expected based on theoretical analysis and simulation. MAIS is planned for use on Chinese manned spacecraft for many microgravity scientific experiments, and its verification on parabolic flights is very important for the following missions. Additionally, we also tested some additional functions of microgravity electromagnetic suspension, such as automatic catching and locking, and operation in fault mode. The parabolic flights produced much useful data for these experiments.
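
    The control principle described above can be sketched as PD feedback on the floater's pose error (plus acceleration feedback) allocated to the eight Lorentz actuators through the pseudo-inverse of an actuator matrix. The geometry matrix, gains, and limits below are placeholders, not MAIS parameters:

    import numpy as np

    rng = np.random.default_rng(3)
    B = rng.normal(size=(6, 8))        # assumed map: 8 coil currents -> wrench

    KP, KD, KA = 50.0, 10.0, 2.0       # illustrative control gains

    def control_step(pose_err, rate_err, accel):
        """pose_err, rate_err, accel: (6,) vectors in the 6-DOF task space."""
        wrench = -KP * pose_err - KD * rate_err - KA * accel
        currents = np.linalg.pinv(B) @ wrench      # minimum-norm allocation
        return np.clip(currents, -2.0, 2.0)        # actuator current limits (A)

    print(control_step(np.full(6, 1e-3), np.zeros(6), np.zeros(6)))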

  12. First Experience With Real-Time EPID-Based Delivery Verification During IMRT and VMAT Sessions

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, Henry C., E-mail: henry.woodruff@newcastle.edu.au [Faculty of Science and Information Technology, School of Mathematical and Physical Sciences, University of Newcastle, New South Wales (Australia); Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, University of Newcastle, New South Wales (Australia); Van Uytven, Eric; McCurdy, Boyd M.C.; Beek, Timothy van [Division of Medical Physics, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba (Canada); Bhatia, Shashank [Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales (Australia); Greer, Peter B. [Faculty of Science and Information Technology, School of Mathematical and Physical Sciences, University of Newcastle, New South Wales (Australia); Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Newcastle, New South Wales (Australia)

    2015-11-01

    Purpose: Gantry-mounted megavoltage electronic portal imaging devices (EPIDs) have become ubiquitous on linear accelerators. WatchDog is a novel application of EPIDs, in which the image frames acquired during treatment are used to monitor treatment delivery in real time. We report on the preliminary use of WatchDog in a prospective study of cancer patients undergoing intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) and identify the challenges of clinical adoption. Methods and Materials: At the time of submission, 28 cancer patients (head and neck, pelvis, and prostate) undergoing fractionated external beam radiation therapy (24 IMRT, 4 VMAT) had ≥1 treatment fraction verified in real time (131 fractions or 881 fields). EPID images acquired continuously during treatment were synchronized and compared with model-generated transit EPID images within a frame time (∼0.1 s). A χ comparison was performed on cumulative frames to gauge the overall delivery quality, and the resulting pass rates were reported graphically during treatment delivery. Every frame acquired (500-1500 per fraction) was saved for postprocessing and analysis. Results: For cumulative-frame χ analysis evaluated globally over the integrated image, the system reported real-time pass rates (mean ± standard deviation) of 91.1% ± 11.5% with 4%, 4 mm criteria and 83.6% ± 13.2% with 3%, 3 mm criteria. Conclusions: A real-time EPID-based radiation delivery verification system for IMRT and VMAT has been demonstrated that aims to prevent major mistreatments in radiation therapy.

  13. Vision-based Ground Test for Active Debris Removal

    Directory of Open Access Journals (Sweden)

    Seong-Min Lim

    2013-12-01

    Due to continuous space development by mankind, the number of space objects, including space debris, in orbits around the Earth has increased, and accordingly, difficulties for space development and activities are expected in the near future. In this study, among the stages of space debris removal, we describe the implementation of a vision-based technique for approaching space debris from a far-range rendezvous state to a proximity state, together with the ground test performance results. For vision-based object tracking, the CAM-shift algorithm, with its high speed and strong performance, was combined with a Kalman filter. For measuring the distance to the tracked object, a stereo camera was used. For the construction of a low-cost space environment simulation test bed, a sun simulator was used, and a two-dimensional mobile robot served as the approaching platform. The tracking status was examined while changing the position of the sun simulator, and the results indicated that the CAM-shift showed a tracking rate of about 87% and the relative distance could be measured down to 0.9 m. In addition, considerations for future space environment simulation tests are proposed.
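
    A sketch of such a tracking loop (CAM-shift on a hue back-projection, smoothed by a constant-velocity Kalman filter) using OpenCV; the frame source, initial window and target histogram are left abstract, and the parameter choices are assumptions rather than the authors' settings:

    import cv2
    import numpy as np

    kf = cv2.KalmanFilter(4, 2)                  # state: x, y, vx, vy
    kf.transitionMatrix = np.array([[1, 0, 1, 0], [0, 1, 0, 1],
                                    [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
    kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)

    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)

    def track(frames, window, target_hist):
        """Yield filtered (x, y) centres of the debris mock-up per frame."""
        for frame in frames:
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            back_proj = cv2.calcBackProject([hsv], [0], target_hist, [0, 180], 1)
            rot_rect, window = cv2.CamShift(back_proj, window, term)
            kf.predict()
            cx, cy = rot_rect[0]                 # CAM-shift centre measurement
            state = kf.correct(np.array([[cx], [cy]], np.float32))
            yield float(state[0, 0]), float(state[1, 0])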

  14. Comparison of carina-based versus bony anatomy-based registration for setup verification in esophageal cancer radiotherapy.

    Science.gov (United States)

    Machiels, Mélanie; Jin, Peng; van Gurp, Christianne H; van Hooft, Jeanin E; Alderliesten, Tanja; Hulshof, Maarten C C M

    2018-03-21

    To investigate the feasibility and geometric accuracy of carina-based registration for CBCT-guided setup verification in esophageal cancer IGRT, compared with current practice bony anatomy-based registration. Included were 24 esophageal cancer patients with 65 implanted fiducial markers, visible on planning CTs and follow-up CBCTs. All available CBCT scans (n = 236) were rigidly registered to the planning CT with respect to the bony anatomy and the carina. Target coverage was visually inspected and marker position variation was quantified relative to both registration approaches; the variation of systematic (Σ) and random errors (σ) was estimated. Automatic carina-based registration was feasible in 94.9% of the CBCT scans, with an adequate target coverage in 91.1% compared to 100% after bony anatomy-based registration. Overall, Σ (σ) in the LR/CC/AP direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm using the bony anatomy registration compared to 3.3(3.0)/3.6(2.6)/3.9(3.1) mm for the carina. Mid-thoracic placed markers showed a non-significant but smaller Σ in CC and AP direction when using the carina-based registration. Compared with a bony anatomy-based registration, carina-based registration for esophageal cancer IGRT results in inadequate target coverage in 8.9% of cases. Furthermore, large Σ and σ, requiring larger anisotropic margins, were seen after carina-based registration. Only for tumors entirely confined to the mid-thoracic region the carina-based registration might be slightly favorable.
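
    The population statistics quoted above follow the standard decomposition of setup errors: the systematic error Σ is the spread (SD) of the per-patient mean errors, and the random error σ is the root mean square of the per-patient SDs. A minimal sketch for one axis, with synthetic data:

    import numpy as np

    def setup_error_stats(errors_by_patient):
        """errors_by_patient: list of 1-D arrays of daily errors, one per patient."""
        means = np.array([e.mean() for e in errors_by_patient])
        sds = np.array([e.std(ddof=1) for e in errors_by_patient])
        sigma_sys = means.std(ddof=1)             # Sigma: inter-patient spread
        sigma_rand = np.sqrt(np.mean(sds ** 2))   # sigma: RMS of daily spread
        return sigma_sys, sigma_rand

    rng = np.random.default_rng(4)
    cohort = [rng.normal(rng.normal(0, 3), 2.5, size=10) for _ in range(24)]
    print(setup_error_stats(cohort))   # ~(3 mm, 2.5 mm) for this synthetic cohort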

  15. Graph-based specification and verification for aspect-oriented languages

    NARCIS (Netherlands)

    Staijen, T.

    2010-01-01

    Aspect-oriented software development aims at improving separation of concerns at all levels in the software development life-cycle, from architecture to code implementation. In this thesis we strive to develop verification methods specifically for aspect-oriented programming languages. For this

  16. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2011-09-28

    ... protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to the business community in a secure manner, CBSV provides us with cost and workload management benefits. New Information: To use CBSV, interested parties must pay a one-time non-refundable...

  17. Long term landslide monitoring with Ground Based SAR

    Science.gov (United States)

    Monserrat, Oriol; Crosetto, Michele; Luzi, Guido; Gili, Josep; Moya, Jose; Corominas, Jordi

    2014-05-01

    In the last decade, Ground-Based SAR (GBSAR) has proven to be a reliable microwave remote sensing technique in several application fields, especially for the monitoring of unstable slopes. GBSAR can provide displacement measurements over areas of a few square kilometres with very high spatial and temporal resolution. This work is focused on the use of the GBSAR technique for long-term landslide monitoring based on a particular data acquisition configuration called discontinuous GBSAR (D-GBSAR). In the most commonly used GBSAR configuration, the radar is left installed in situ, acquiring data periodically, e.g. every few minutes. Deformations are estimated by processing sets of GBSAR images acquired during several weeks or months, without moving the system. By contrast, in D-GBSAR the radar is installed and dismounted at each measurement campaign, revisiting a given site periodically. This configuration is useful for monitoring slow deformation phenomena. In this work, two alternative ways of exploiting the D-GBSAR technique are presented: the DInSAR technique and an amplitude-based technique. The former is based on the exploitation of the phase component of the acquired SAR images and provides millimetric precision on the deformation estimates. However, this technique presents several limitations, such as the reduction of measurable points as the period of observation increases, the ambiguous nature of the phase measurements, and the influence of the atmospheric phase component, which can make it inapplicable in some cases, especially when working in natural environments. The second approach, based on the use of the amplitude component of GBSAR images combined with an image matching technique, allows the estimation of displacements over specific targets, avoiding two of the limitations mentioned above (phase unwrapping and the atmospheric contribution) at the cost of reduced deformation measurement precision. Two successful examples of D
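
    The amplitude-based approach amounts to sub-pixel matching of a stable target's amplitude patch between two campaigns. A generic correlation-based sketch (not the authors' implementation; the pixel size is an assumption):

    import numpy as np
    from skimage.registration import phase_cross_correlation

    def campaign_offset(patch_ref, patch_new, pixel_m=0.5):
        """(row, col) displacement between campaigns, scaled to metres."""
        shift, error, _ = phase_cross_correlation(patch_ref, patch_new,
                                                  upsample_factor=20)
        return shift * pixel_m, error

    rng = np.random.default_rng(5)
    ref = rng.random((64, 64))
    new = np.roll(ref, 2, axis=0)     # simulate a 2-pixel shift between campaigns
    print(campaign_offset(ref, new))  # detects the 2-pixel shift (sign per the
                                      # skimage registration convention)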

  18. Control Method of Single-phase Inverter Based Grounding System in Distribution Networks

    DEFF Research Database (Denmark)

    Wang, Wen; Yan, L.; Zeng, X.

    2016-01-01

    The asymmetry of the inherent distributed capacitances causes the rise of neutral-to-ground voltage in ungrounded systems or high-resistance grounded systems. Overvoltage may occur in resonant grounded systems if the Petersen coil is resonant with the distributed capacitances. Thus, the restraint of neutral-to-ground voltage is critical for the safety of distribution networks. An active grounding system based on a single-phase inverter is proposed to achieve this objective. The relationship between the output current of the system and the neutral-to-ground voltage is derived to explain the principle of neutral...
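
    The principle can be sketched with elementary phasor algebra: the neutral-to-ground voltage of an ungrounded network follows from the asymmetry of the phase-to-ground admittances, and a current injected at the neutral by the inverter can null it. The values and sign conventions below are assumptions for illustration only:

    import numpy as np

    a = np.exp(2j * np.pi / 3)
    E = 10e3 / np.sqrt(3) * np.array([1, a**2, a])   # phase EMFs (V), 10 kV system
    C = np.array([6.1e-6, 6.0e-6, 5.9e-6])           # asymmetric capacitances (F)
    Y = 1j * 2 * np.pi * 50 * C                      # phase-to-ground admittances

    def neutral_voltage(i_inj=0.0):
        return -(E @ Y - i_inj) / Y.sum()

    print(abs(neutral_voltage()))        # displaced neutral, no compensation (~56 V)
    i_comp = E @ Y                       # injection that nulls the numerator
    print(abs(neutral_voltage(i_comp)))  # ~0 V with active grounding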

  19. Space- and Ground-based Coronal Spectro-Polarimetry

    Science.gov (United States)

    Fineschi, Silvano; Bemporad, Alessandro; Rybak, Jan; Capobianco, Gerardo

    This presentation gives an overview of the near-future perspectives of ultraviolet and visible-light spectro-polarimetric instrumentation for probing coronal magnetism from space-based and ground-based observatories. Spectro-polarimetric imaging of coronal emission lines in the visible-light wavelength band provides an important diagnostic tool for coronal magnetism. The interpretation, in terms of the Hanle and Zeeman effects, of the line polarization in forbidden emission lines yields information on the direction and strength of the coronal magnetic field. As a study case, this presentation will describe the Torino Coronal Magnetograph (CorMag) for the spectro-polarimetric observation of the FeXIV, 530.3 nm, forbidden emission line. CorMag - consisting of a Liquid Crystal (LC) Lyot filter and an LC linear polarimeter - has recently been installed on the Lomnicky Peak Observatory 20 cm Zeiss coronagraph. Preliminary results from CorMag will be presented. The linear polarization by resonance scattering of coronal permitted line emission in the ultraviolet (UV) can be modified by magnetic fields through the Hanle effect. Space-based UV spectro-polarimeters would provide an additional tool for the diagnostics of coronal magnetism. As a case study of space-borne UV spectro-polarimeters, this presentation will describe the future upgrade of the Sounding-rocket Coronagraphic Experiment (SCORE) to include the capability of imaging polarimetry of the HI Lyman-alpha line at 121.6 nm. SCORE is a multi-wavelength imager for the emission lines HeII 30.4 nm and HI 121.6 nm, and for the visible-light broad-band emission of the polarized K-corona. SCORE flew successfully in 2009. This presentation will describe how, in future re-flights, SCORE could observe the expected Hanle effect in the corona with a HI Lyman-alpha polarimeter.

  20. VME-based remote instrument control without ground loops

    CERN Document Server

    Belleman, J; González, J L

    1997-01-01

    New electronics has been developed for the remote control of the pick-up electrodes at the CERN Proton Synchrotron (PS). Communication between VME-based control computers and remote equipment is via full duplex point-to-point digital data links. Data are sent and received in serial format over simple twisted pairs at a rate of 1 Mbit/s, for distances of up to 300 m. Coupling transformers are used to avoid ground loops. The link hardware consists of a general-purpose VME-module, the 'TRX' (transceiver), containing four FIFO-buffered communication channels, and a dedicated control card for each remote station. Remote transceiver electronics is simple enough not to require micro-controllers or processors. Currently, some sixty pick-up stations of various types, all over the PS Complex (accelerators and associated beam transfer lines) are equipped with the new system. Even though the TRX was designed primarily for communication with pick-up electronics, it could also be used for other purposes, for example to for...

  1. Observing Tsunamis in the Ionosphere Using Ground Based GPS Measurements

    Science.gov (United States)

    Galvan, D. A.; Komjathy, A.; Song, Y. Tony; Stephens, P.; Hickey, M. P.; Foster, J.

    2011-01-01

    Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following recent seismic events, including the Tohoku tsunami of March 11, 2011. We observe fluctuations correlated in time, space, and wave properties with this tsunami in TEC estimates processed using JPL's Global Ionospheric Mapping Software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the Tohoku tsunami near the epicenter, at Hawaii, and near the west coast of North America. Disturbance magnitudes are 1-10% of the background TEC value. Observations near the epicenter are compared to estimates of expected tsunami-driven TEC variations produced by Embry Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement. The potential exists to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for future early warning systems.
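
    The band-pass step can be sketched with a zero-phase Butterworth filter that keeps fluctuations at tsunami-driven gravity-wave periods (the 10-30 minute band below is an assumption, not JPL's processing specification):

    import numpy as np
    from scipy.signal import butter, filtfilt

    def bandpass_tec(tec, fs_hz, t_short_s=600.0, t_long_s=1800.0, order=4):
        """Zero-phase band-pass of a TEC time series (TECU)."""
        b, a = butter(order, [1.0 / t_long_s, 1.0 / t_short_s],
                      btype="band", fs=fs_hz)
        return filtfilt(b, a, tec)

    fs = 1.0 / 30.0                        # one TEC sample every 30 s
    t = np.arange(0, 4 * 3600, 30.0)
    tec = 20 + 0.02 * t / 3600 + 0.3 * np.sin(2 * np.pi * t / 1200)
    wave = bandpass_tec(tec, fs)
    print(wave.std())   # ~0.2 TECU: the 20-min wave survives, the trend is gone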

  2. A design for a ground-based data management system

    Science.gov (United States)

    Lambird, Barbara A.; Lavine, David

    1988-01-01

    An initial design for a ground-based data management system which includes intelligent data abstraction and cataloging is described. The large quantity of data on some current and future NASA missions leads to significant problems in providing scientists with quick access to relevant data. Human screening of data for potential relevance to a particular study is time-consuming and costly. Intelligent databases can provide automatic screening when given relevant scientific parameters and constraints. The data management system would provide, at a minimum, information on the availability and range of data, the types available, and the specific time periods covered, together with data-quality information and related sources of data. The system would inform the user about the primary types of screening, analysis, and presentation methods available. The system would then aid the user in performing the desired tasks, in such a way that the user need only specify the scientific parameters and objectives, and not worry about the specific details of running a particular program. The design contains modules for data abstraction, catalog plan abstraction, a user-friendly interface, and expert systems for data handling, data evaluation, and application analysis. The emphasis is on developing general facilities for data representation, description, analysis, and presentation that will be easily used by scientists directly, thus bypassing the knowledge acquisition bottleneck. Expert system technology is used for many different aspects of the data management system, including the direct user interface, the interface to the data analysis routines, and the analysis of instrument status.

  3. Ground-Based Correction of Remote-Sensing Spectral Imagery

    Science.gov (United States)

    Alder-Golden, Steven M.; Rochford, Peter; Matthew, Michael; Berk, Alexander

    2007-01-01

    Software has been developed for an improved method of correcting for atmospheric optical effects (primarily, effects of aerosols and water vapor) in spectral images of the surface of the Earth acquired by airborne and spaceborne remote-sensing instruments. In this method, the variables needed for the corrections are extracted from the readings of a radiometer located on the ground in the vicinity of the scene of interest. The software includes algorithms that analyze measurement data acquired from a shadow-band radiometer. These algorithms are based on a prior radiation transport software model, called MODTRAN, that has been developed through several versions up to what are now known as MODTRAN4 and MODTRAN5. These components have been integrated with a user-friendly Interactive Data Language (IDL) front end and an advanced version of MODTRAN4. Software tools for handling general data formats, performing a Langley-type calibration, and generating an output file of retrieved atmospheric parameters for use in another atmospheric-correction computer program known as FLAASH have also been incorporated into the present software. Concomitantly with the software described thus far, a version of FLAASH has been developed that utilizes the retrieved atmospheric parameters to process spectral image data.
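
    A Langley-type calibration regresses the logarithm of the radiometer signal against airmass on a stable clear morning; the intercept gives the top-of-atmosphere signal V0 and the slope the optical depth. A minimal sketch with synthetic readings:

    import numpy as np

    def langley(voltages, solar_zenith_deg):
        m = 1.0 / np.cos(np.radians(solar_zenith_deg))  # plane-parallel airmass
        slope, intercept = np.polyfit(m, np.log(voltages), 1)
        return np.exp(intercept), -slope                # (V0, tau)

    sza = np.array([60.0, 65.0, 70.0, 75.0, 78.0])
    true_v0, true_tau = 1.50, 0.12
    v = true_v0 * np.exp(-true_tau / np.cos(np.radians(sza)))
    print(langley(v, sza))   # recovers (1.50, 0.12)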

  4. Use of ground-based wind profiles in mesoscale forecasting

    Science.gov (United States)

    Schlatter, Thomas W.

    1985-01-01

    A brief review is presented of recent uses of ground-based wind profile data in mesoscale forecasting. Some of the applications are in real time, and some are after the fact. Not all of the work mentioned here has been published yet, but references are given wherever possible. As Gage and Balsley (1978) point out, sensitive Doppler radars have been used to examine tropospheric wind profiles since the 1970's. It was not until the early 1980's, however, that the potential contribution of these instruments to operational forecasting and numerical weather prediction became apparent. Profiler winds and radiosonde winds compare favorably, usually within a few m/s in speed and 10 degrees in direction (see Hogg et al., 1983), but the obvious advantage of the profiler is its frequent (hourly or more often) sampling of the same volume. The rawinsonde balloon is launched only twice a day and drifts with the wind. In this paper, I will: (1) mention two operational uses of data from a wind profiling system developed jointly by the Wave Propagation and Aeronomy Laboratories of NOAA; (2) describe a number of displays of these same data on a workstation for mesoscale forecasting developed by the Program for Regional Observing and Forecasting Services (PROFS); and (3) explain some interesting diagnostic calculations performed by meteorologists of the Wave Propagation Laboratory.

  5. Tissue Engineering of Cartilage on Ground-Based Facilities

    Science.gov (United States)

    Aleshcheva, Ganna; Bauer, Johann; Hemmersbach, Ruth; Egli, Marcel; Wehland, Markus; Grimm, Daniela

    2016-06-01

    Investigations under simulated microgravity offer the opportunity for a better understanding of the influence of altered gravity on cells and on scaffold-free three-dimensional (3D) tissue formation. To investigate the short-term influence, human chondrocytes were cultivated for 2 h, 4 h, 16 h, and 24 h on a 2D Fast-Rotating Clinostat (FRC) in DMEM/F-12 medium supplemented with 10 % FCS. We detected holes in the vimentin network, perinuclear accumulations of vimentin after 2 h, and changes in the chondrocyte shape visualised by F-actin staining after 4 h of FRC exposure. Scaffold-free cultivation of chondrocytes for 7 d on the Random Positioning Machine (RPM), the FRC and the Rotating Wall Vessel (RWV) resulted in spheroid formation, a phenomenon already known from spaceflight experiments with chondrocytes (MIR Space Station) and thyroid cancer cells (SimBox/Shenzhou-8 space mission). The experiments enabled by the ESA-CORA-GBF programme gave us an optimal opportunity to study gravity-related cellular processes, validate ground-based facilities for our chosen cell system, and prepare long-term experiments under real microgravity conditions in space.

  6. Ground-based detection of G star superflares with NGTS

    Science.gov (United States)

    Jackman, James A. G.; Wheatley, Peter J.; Pugh, Chloe E.; Gänsicke, Boris T.; Gillen, Edward; Broomhall, Anne-Marie; Armstrong, David J.; Burleigh, Matthew R.; Chaushev, Alexander; Eigmüller, Philipp; Erikson, Anders; Goad, Michael R.; Grange, Andrew; Günther, Maximilian N.; Jenkins, James S.; McCormac, James; Raynard, Liam; Thompson, Andrew P. G.; Udry, Stéphane; Walker, Simon; Watson, Christopher A.; West, Richard G.

    2018-04-01

    We present high-cadence detections of two superflares from a bright G8 star (V = 11.56) with the Next Generation Transit Survey (NGTS). We improve upon previous superflare detections by resolving the flare rise and peak, allowing us to fit a solar flare inspired model without the need for arbitrary break points between rise and decay. Our data also enable us to identify substructure in the flares. From changing starspot modulation in the NGTS data we detect a stellar rotation period of 59 hours, along with evidence for differential rotation. We combine this rotation period with the observed ROSAT X-ray flux to determine that the star's X-ray activity is saturated. We calculate the flare bolometric energies as 5.4^{+0.8}_{-0.7} × 10^{34} and 2.6^{+0.4}_{-0.3} × 10^{34} erg and compare our detections with G star superflares detected in the Kepler survey. We find our main flare to be one of the largest amplitude superflares detected from a bright G star. With energies more than 100 times greater than the Carrington event, our flare detections demonstrate the role that ground-based instruments such as NGTS can have in assessing the habitability of Earth-like exoplanets, particularly in the era of PLATO.
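
    A break-point-free light-curve template of the kind described (a continuous rise joined to an exponential decay) can be fitted in a few lines; the Gaussian-rise form below is a common stand-in, not the exact NGTS model:

    import numpy as np
    from scipy.optimize import curve_fit

    def flare_model(t, amp, t_peak, rise, decay):
        return np.where(t < t_peak,
                        amp * np.exp(-((t - t_peak) / rise) ** 2),  # rise
                        amp * np.exp(-(t - t_peak) / decay))        # decay

    t = np.linspace(0.0, 2.0, 400)   # hours
    flux = flare_model(t, 0.05, 0.5, 0.08, 0.35) \
        + np.random.default_rng(6).normal(0, 2e-3, t.size)
    popt, _ = curve_fit(flare_model, t, flux, p0=(0.04, 0.45, 0.1, 0.3))
    eq_duration = np.trapz(flare_model(t, *popt), t)   # equivalent duration (h)
    print(popt, eq_duration)  # x quiescent bolometric luminosity -> flare energy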

  7. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    International Nuclear Information System (INIS)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J; Watkins, W

    2016-01-01

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares EPID-acquired 1024×768 resolution frames acquired at ∼8.5 Hz from a Varian Truebeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm^3 CT volume, is 2-3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm^2 and the out-of-field excess EPID fluence was 8 mm^2, both below error tolerances. Conclusion: A real-time verification software, with

  8. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, M; Nourzadeh, H; Neal, B; Siebers, J [University of Virginia Health System, Charlottesville, VA (United States); Watkins, W

    2016-06-15

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares EPID-acquired 1024×768 resolution frames acquired at ∼8.5 Hz from a Varian Truebeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm^3 CT volume, is 2-3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm^2 and the out-of-field excess EPID fluence was 8 mm^2, both below error tolerances. Conclusion: A real-time verification software, with

  9. Investigating Ground Swarm Robotics Using Agent Based Simulation

    National Research Council Canada - National Science Library

    Ho, Sze-Tek T

    2006-01-01

    The concept of employing ground swarm robotics to accomplish tasks has been proposed for future use in humanitarian de-mining, plume monitoring, searching for survivors in a disaster site, and other hazardous activities...

  10. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
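
    The kinds of variables listed above are straightforward to compute from the force records; the sketch below derives braking/propulsive impulses from Fy(t) and a left/right symmetry index (thresholds and the symmetry formula are common choices, not necessarily the authors' exact definitions):

    import numpy as np

    def stance_variables(fy, fz, fs_hz):
        """fy, fz: anteroposterior and vertical GRF (N) over one stance phase."""
        t = np.arange(fy.size) / fs_hz
        braking = np.trapz(np.minimum(fy, 0.0), t)     # negative Fy impulse (N.s)
        propulsive = np.trapz(np.maximum(fy, 0.0), t)  # positive Fy impulse (N.s)
        return braking, propulsive, fz.max()

    def symmetry_index(left, right):
        return 200.0 * (left - right) / (left + right)  # 0 % = perfect symmetry

    fs = 1000.0
    t = np.linspace(0.0, 0.7, 700)            # one stance phase (~0.7 s)
    fy = -80 * np.sin(2 * np.pi * t / 0.7)    # braking, then propulsion
    fz = 800 * np.sin(np.pi * t / 0.7)        # vertical load curve
    print(stance_variables(fy, fz, fs))
    print(symmetry_index(820.0, 790.0))       # e.g. peak Fz, left vs right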

  11. Foundation Investigation for Ground Based Radar Project-Kwajalein Island, Marshall Islands

    Science.gov (United States)

    1990-04-01

    Miscellaneous Paper GL-90-5: Foundation Investigation for Ground Based Radar Project -- Kwajalein Island, Marshall Islands, by Donald E. Yule. The results of a foundation investigation for the Ground Based Radar Project -- Kwajalein Island, Marshall Islands, are presented. Geophysical tests comprised of surface refraction

  12. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    Science.gov (United States)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the

  13. Characterization of subarctic vegetation using ground based remote sensing methods

    Science.gov (United States)

    Finnell, D.; Garnello, A.; Palace, M. W.; Sullivan, F.; Herrick, C.; Anderson, S. M.; Crill, P. M.; Varner, R. K.

    2014-12-01

    Stordalen mire is located at 68°21'N and 19°02'E in the Swedish subarctic. Climate monitoring has revealed a warming trend spanning the past 150 years affecting the mire's ability to hold stable palsa/hummock mounds. The micro-topography of the landscape has begun to degrade into thaw ponds, changing the vegetation cover from ombrotrophic to minerotrophic. Hummocks are ecologically important due to their ability to act as carbon sinks. Thaw ponds and sphagnum-rich transitional zones have been documented as sources of atmospheric CH4. An objective of this project is to determine if a high resolution three-band camera (RGB) and a RGNIR camera could detect differences in vegetation over five different site types. Species composition was collected for 50 plots, with ten repetitions for each site type: palsa/hummock, tall shrub, semi-wet, tall graminoid, and wet. Sites were differentiated based on dominating species and features consisting of open water presence, Sphagnum spp. cover, graminoid spp. cover, or the presence of dry raised plateaus/mounds. A pole-based camera mount was used to collect images at a height of ~2.44 m from the ground. The images were cropped in post-processing to fit a one-square-meter quadrat. Texture analysis was performed on all images, including entropy, lacunarity, and angular second moment. Preliminary results suggested that site type influences the number of species present. The p-values for the ability to predict site type using a t-test range from... With use of a stepwise regression of texture variables, actual vs. predicted percent of vegetation coverage provided R-squared values of 0.73, 0.71, 0.67, and 0.89 for C. bigelowii, R. chamaemorus, Sphagnum spp., and open water, respectively. These data have provided some support to the notion that texture analyses can be used for classification of mire site types. Future work will involve scaling up from the 50 plots through the use of data collected from two unmanned aerial systems (UAS), as
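
    Two of the texture measures named above, angular second moment and entropy, can be illustrated with a grey-level co-occurrence matrix; a minimal Python sketch follows (random stand-in data, not the mire imagery; recent scikit-image releases spell the functions graycomatrix/graycoprops).

        # GLCM-based ASM and entropy on a random 8-level image (stand-in data).
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(0)
        img = rng.integers(0, 8, size=(64, 64), dtype=np.uint8)   # 8 grey levels

        glcm = graycomatrix(img, distances=[1], angles=[0], levels=8,
                            symmetric=True, normed=True)
        asm = graycoprops(glcm, 'ASM')[0, 0]              # angular second moment
        p = glcm[:, :, 0, 0]
        entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))   # Shannon entropy of the GLCM
        print(f"ASM={asm:.4f}, entropy={entropy:.3f} bits")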

  14. OBSERVATIONAL SELECTION EFFECTS WITH GROUND-BASED GRAVITATIONAL WAVE DETECTORS

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hsin-Yu; Holz, Daniel E. [University of Chicago, Chicago, Illinois 60637 (United States); Essick, Reed; Vitale, Salvatore; Katsavounidis, Erik [LIGO, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2017-01-20

    Ground-based interferometers are not perfect all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean, and as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources’ right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO’s observations and electromagnetic (EM) follow-up. Beyond galactic foregrounds associated with seasonal variations, we find that equatorial observatories can access over 80% of the localization probability, while mid-latitudes will access closer to 70%. Facilities located near the two LIGO sites can observe sources closer to their zenith than their analogs in the south, but the average observation will still be no closer than 44° from zenith. We also find that observatories in Africa or the South Atlantic will wait systematically longer before they can begin observing compared to the rest of the world; though, there is a preference for longitudes near the LIGOs. These effects, along with knowledge of the LIGO antenna pattern, can inform EM follow-up activities and optimization, including the possibility of directing observations even before gravitational-wave events occur.

  15. Project management for complex ground-based instruments: MEGARA plan

    Science.gov (United States)

    García-Vargas, María. Luisa; Pérez-Calpena, Ana; Gil de Paz, Armando; Gallego, Jesús; Carrasco, Esperanza; Cedazo, Raquel; Iglesias, Jorge

    2014-08-01

    The project management of complex instruments for ground-based large telescopes is a challenge in itself. Good management is key to project success in terms of performance, schedule, and budget. Being on time has become a strict requirement for two reasons: to assure timely arrival at the telescope, given the pressure of demand for new instrumentation at these first world-class telescopes, and to avoid cost overruns. The budget and cash flow are not always as expected and have to be properly handled across the administrative departments of funding centers distributed worldwide. The complexity of the organizations, the technological and scientific return to the Consortium partners, and the participation in the project of all kinds of professional centers working in astronomical instrumentation (universities, research centers, small and large private companies, workshops, providers, etc.) make the project management strategy, and the tools and procedures tuned to the project needs, crucial for success. MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is a facility instrument for the 10.4 m GTC (La Palma, Spain) working at optical wavelengths that provides both Integral-Field Unit (IFU) and Multi-Object Spectrograph (MOS) capabilities at resolutions in the range R=6,000-20,000. The project is an initiative led by Universidad Complutense de Madrid (Spain) in collaboration with INAOE (Mexico), IAA-CSIC (Spain) and Universidad Politécnica de Madrid (Spain). MEGARA is being developed under contract with GRANTECAN.

  16. OBSERVATIONAL SELECTION EFFECTS WITH GROUND-BASED GRAVITATIONAL WAVE DETECTORS

    International Nuclear Information System (INIS)

    Chen, Hsin-Yu; Holz, Daniel E.; Essick, Reed; Vitale, Salvatore; Katsavounidis, Erik

    2017-01-01

    Ground-based interferometers are not perfect all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean, and as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources’ right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO’s observations and electromagnetic (EM) follow-up. Beyond galactic foregrounds associated with seasonal variations, we find that equatorial observatories can access over 80% of the localization probability, while mid-latitudes will access closer to 70%. Facilities located near the two LIGO sites can observe sources closer to their zenith than their analogs in the south, but the average observation will still be no closer than 44° from zenith. We also find that observatories in Africa or the South Atlantic will wait systematically longer before they can begin observing compared to the rest of the world; though, there is a preference for longitudes near the LIGOs. These effects, along with knowledge of the LIGO antenna pattern, can inform EM follow-up activities and optimization, including the possibility of directing observations even before gravitational-wave events occur.

  17. Simulating the Performance of Ground-Based Optical Asteroid Surveys

    Science.gov (United States)

    Christensen, Eric J.; Shelly, Frank C.; Gibbs, Alex R.; Grauer, Albert D.; Hill, Richard E.; Johnson, Jess A.; Kowalski, Richard A.; Larson, Stephen M.

    2014-11-01

    We are developing a set of asteroid survey simulation tools in order to estimate the capability of existing and planned ground-based optical surveys, and to test a variety of possible survey cadences and strategies. The survey simulator is composed of several layers, including a model population of solar system objects and an orbital integrator, a site-specific atmospheric model (including inputs for seeing, haze and seasonal cloud cover), a model telescope (with a complete optical path to estimate throughput), a model camera (including FOV, pixel scale, and focal plane fill factor) and model source extraction and moving object detection layers with tunable detection requirements. We have also developed a flexible survey cadence planning tool to automatically generate nightly survey plans. Inputs to the cadence planner include camera properties (FOV, readout time), telescope limits (horizon, declination, hour angle, lunar and zenithal avoidance), preferred and restricted survey regions in RA/Dec, ecliptic, and Galactic coordinate systems, and recent coverage by other asteroid surveys. Simulated surveys are created for a subset of current and previous NEO surveys (LINEAR, Pan-STARRS and the three Catalina Sky Survey telescopes), and compared against the actual performance of these surveys in order to validate the model’s performance. The simulator tracks objects within the FOV of any pointing that were not discovered (e.g. too few observations, too trailed, focal plane array gaps, too fast or slow), thus dividing the population into “discoverable” and “discovered” subsets, to inform possible survey design changes. Ongoing and future work includes generating a realistic “known” subset of the model NEO population, running multiple independent simulated surveys in coordinated and uncoordinated modes, and testing various cadences to find optimal strategies for detecting NEO sub-populations. These tools can also assist in quantifying the efficiency of novel
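
    The "discoverable" versus "discovered" bookkeeping described above can be illustrated with a toy detection filter; the thresholds and field names below are assumptions, not the simulator's actual parameters.

        # Toy per-pointing discovery filter for a simulated asteroid detection.
        from dataclasses import dataclass

        @dataclass
        class Detection:
            v_mag: float          # apparent visual magnitude
            rate_deg_day: float   # apparent sky motion
            n_obs: int            # detections within the pointing sequence

        def discovered(d, lim_mag=21.5, min_obs=3, min_rate=0.05, max_rate=10.0):
            """Count an object as discovered only if it is bright enough, is
            detected on enough frames to be linked, and moves neither too
            slowly nor too fast (trailing loss)."""
            return (d.v_mag <= lim_mag and d.n_obs >= min_obs
                    and min_rate <= d.rate_deg_day <= max_rate)

        print(discovered(Detection(20.8, 0.3, 4)))    # True: discovered
        print(discovered(Detection(20.8, 25.0, 4)))   # False: too trailed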

  18. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna, including support and satellite structure, with appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...

  19. Film based verification of calculation algorithms used for brachytherapy planning-getting ready for upcoming challenges of MBDCA

    Directory of Open Access Journals (Sweden)

    Grzegorz Zwierzchowski

    2016-08-01

    Full Text Available Purpose: A well-known defect of TG-43-based algorithms used in brachytherapy is a lack of information about interaction cross-sections, which are determined not only by electron density but also by atomic number. TG-186 recommendations involving the use of MBDCA (model-based dose calculation algorithms), accurate tissue segmentation, and the structure's elemental composition continue to create difficulties in brachytherapy dosimetry. For the clinical use of new algorithms, it is necessary to introduce reliable and repeatable methods of treatment planning system (TPS) verification. The aim of this study is the verification of the calculation algorithm used in a TPS for shielded vaginal applicators, as well as the development of verification procedures for current and further use, based on the film dosimetry method. Material and methods: Calibration data were collected by separately irradiating 14 sheets of Gafchromic® EBT film with doses from 0.25 Gy to 8.0 Gy using an HDR 192Ir source. Standard vaginal cylinders of three diameters were used in a water phantom. Measurements were performed without any shields and with three shield combinations. Gamma analyses were performed using the VeriSoft® package. Results: The calibration curve was fitted with a third-degree polynomial. For all cylinder diameters, both unshielded and for all shield combinations, gamma analysis showed that over 90% of analyzed points meet the gamma criteria (3%, 3 mm). Conclusions: Gamma analysis showed good agreement between dose distributions calculated using the TPS and measured by Gafchromic films, thus showing the viability of using film dosimetry in brachytherapy.
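
    The calibration step described in the abstract (a third-degree polynomial mapping film response to dose) can be sketched in a few lines of Python; the response values below are made-up stand-ins for actual Gafchromic EBT readings.

        # Fit and apply a third-degree film calibration curve (stand-in data).
        import numpy as np

        doses = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])          # Gy
        netod = np.array([0.04, 0.07, 0.12, 0.20, 0.26, 0.31, 0.39, 0.45])   # net optical density (assumed)

        coeffs = np.polyfit(netod, doses, deg=3)   # third-degree polynomial, dose(netOD)
        calib = np.poly1d(coeffs)
        print(f"dose at netOD 0.25 ≈ {calib(0.25):.2f} Gy")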

  20. Principle and Design of a Single-phase Inverter-Based Grounding System for Neutral-to-ground Voltage Compensation in Distribution Networks

    DEFF Research Database (Denmark)

    Wang, Wen; Yan, Lingjie; Zeng, Xiangjun

    2017-01-01

    Neutral-to-ground overvoltage may occur in non-effectively grounded power systems because of distributed-parameter asymmetry and resonance between the Petersen coil and distributed capacitances. Thus, the constraint of neutral-to-ground voltage is critical for the safety of distribution networks. In this paper, an active grounding system based on a single-phase inverter and its control parameter design method are proposed to achieve this objective. The relationship between its output current and the neutral-to-ground voltage is derived to explain the principle of neutral-to-ground voltage compensation. Then...

  1. Quantum supremacy in constant-time measurement-based computation: A unified architecture for sampling and verification

    Science.gov (United States)

    Miller, Jacob; Sanders, Stephen; Miyake, Akimasa

    2017-12-01

    While quantum speed-up in solving certain decision problems by a fault-tolerant universal quantum computer has been promised, a timely research interest includes how far one can reduce the resource requirement to demonstrate a provable advantage in quantum devices without demanding quantum error correction, which is crucial for prolonging the coherence time of qubits. We propose a model device made of locally interacting multiple qubits, designed such that simultaneous single-qubit measurements on it can output probability distributions whose average-case sampling is classically intractable, under similar assumptions as the sampling of noninteracting bosons and instantaneous quantum circuits. Notably, in contrast to these previous unitary-based realizations, our measurement-based implementation has two distinctive features. (i) Our implementation involves no adaptation of measurement bases, leading output probability distributions to be generated in constant time, independent of the system size. Thus, it could be implemented in principle without quantum error correction. (ii) Verifying the classical intractability of our sampling is done by changing the Pauli measurement bases only at certain output qubits. Our usage of random commuting quantum circuits in place of computationally universal circuits allows a unique unification of sampling and verification, so they require the same physical resource requirements in contrast to the more demanding verification protocols seen elsewhere in the literature.

  2. Review of commonly used remote sensing and ground-based ...

    African Journals Online (AJOL)

    This review provides an overview of the use of remote sensing data, the development of spectral reflectance indices for detecting plant water stress, and the usefulness of field measurements for ground-truthing purposes. Reliable measurements of plant water stress over large areas are often required for management ...

  3. Imaging of Ground Ice with Surface-Based Geophysics

    Science.gov (United States)

    2015-10-01

    ...terrains. Electrical Resistivity Tomography (ERT), in particular, has been effective for imaging ground ice. ERT measures the ability of materials to...

  4. Large antennas for ground-based astronomy above 1 THz

    NARCIS (Netherlands)

    Wild, Wolfgang; Guesten, R.; Holland, W. S.; Ivison, R.; Stacey, G. J.

    2006-01-01

    In its history astronomy has continuously expanded access to new wavelength regions both from space and on the ground. Today, one of the few unexplored regimes is the terahertz (THz) frequency range, more specifically above 1 THz (λ < 300 μm). Astronomical observations above 1 THz are

  5. Biosensors for EVA: Improved Instrumentation for Ground-based Studies

    Science.gov (United States)

    Soller, B.; Ellerby, G.; Zou, F.; Scott, P.; Jin, C.; Lee, S. M. C.; Coates, J.

    2010-01-01

    During lunar excursions in the EVA suit, real-time measurement of metabolic rate is required to manage consumables and guide activities to ensure safe return to the base. Metabolic rate, or oxygen consumption (VO2), is normally measured from pulmonary parameters but cannot be determined with standard techniques in the oxygen-rich environment of a spacesuit. Our group has developed novel near infrared spectroscopic (NIRS) methods to calculate muscle oxygen saturation (SmO2), hematocrit, and pH, and we recently demonstrated that we can use our NIRS sensor to measure VO2 on the leg during cycling. Our NSBRI project has 4 objectives: (1) increase the accuracy of the metabolic rate calculation through improved prediction of stroke volume; (2) investigate the relative contributions of calf and thigh oxygen consumption to metabolic rate calculation for walking and running; (3) demonstrate that the NIRS-based noninvasive metabolic rate methodology is sensitive enough to detect decrement in VO2 in a space analog; and (4) improve instrumentation to allow testing within a spacesuit. Over the past year we have made progress on all four objectives, but the most significant progress was made in improving the instrumentation. The NIRS system currently in use at JSC is based on fiber optics technology. Optical fiber bundles are used to deliver light from a light source in the monitor to the patient, and to carry light reflected back from the patient's muscle to the monitor for spectroscopic analysis. The fiber optic cables are large and fragile, and there is no way to get them in and out of the test spacesuit used for ground-based studies. With complementary funding from the US Army, we undertook a complete redesign of the sensor and control electronics to build a novel system small enough to be used within the spacesuit and portable enough to be used by a combat medic. In the new system the filament lamp used in the fiber optic system was replaced with a novel broadband near infrared

  6. Scaling earthquake ground motions for performance-based assessment of buildings

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.

    2011-01-01

    The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single-degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
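
    Method (3), first-mode-period scaling, amounts to a single multiplicative factor per record; a minimal worked example follows, with all numbers illustrative.

        # First-mode-period scaling: match the record's spectral acceleration
        # at T1 to the target spectrum (illustrative values).
        import numpy as np

        sa_target_T1 = 0.85   # target 5%-damped spectral acceleration at T1 (g)
        sa_record_T1 = 0.52   # record's spectral acceleration at T1 (g)

        scale = sa_target_T1 / sa_record_T1
        accel = np.array([0.01, -0.02, 0.05])   # stand-in accelerogram ordinates (g)
        scaled = scale * accel                  # every ordinate multiplied by the factor
        print(f"scale factor = {scale:.2f}")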

  7. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. The concept of cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: Overconfidence -> questionable decisions to deploy; Availability -> inability to conceive critical tests; Representativeness -> overinterpretation of results; Positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  8. Experimental verification of preset time count rate meters based on adaptive digital signal processing algorithms

    Directory of Open Access Journals (Sweden)

    Žigić Aleksandar D.

    2005-01-01

    Full Text Available Experimental verifications of two optimized adaptive digital signal processing algorithms implemented in two preset time count rate meters were performed according to appropriate standards. A random pulse generator realized using a personal computer was used as an artificial radiation source for preliminary system tests and performance evaluations of the proposed algorithms. Then measurement results for background radiation levels were obtained. Finally, measurements with a natural radiation source, the radioisotope 90Sr-90Y, were carried out. Measurement results, obtained without and with the radioisotope for the specified errors of 10% and 5%, agreed well with theoretical predictions.
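
    The 10% and 5% error specifications tested above have a simple Poisson-statistics reading: for N accumulated counts the relative standard error is 1/sqrt(N), so each specification implies a minimum number of counts, as this small Python check shows.

        # Counts needed for a given relative standard error under Poisson statistics.
        for rel_err in (0.10, 0.05):
            n_required = (1.0 / rel_err) ** 2
            print(f"{rel_err:.0%} error -> at least {n_required:.0f} counts")
        # 10% -> 100 counts, 5% -> 400 counts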

  9. A detrimental soil disturbance prediction model for ground-based timber harvesting

    Science.gov (United States)

    Derrick A. Reeves; Matthew C. Reeves; Ann M. Abbott; Deborah S. Page-Dumroese; Mark D. Coleman

    2012-01-01

    Soil properties and forest productivity can be affected during ground-based harvest operations and site preparation. The degree of impact varies widely depending on topographic features and soil properties. Forest managers who understand site-specific limits to ground-based harvesting can alter harvest method or season to limit soil disturbance. To determine the...

  10. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    International Nuclear Information System (INIS)

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I; Mans, A; Mijnheer, B; Herk, M van; Gonzalez, P

    2015-01-01

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼2.5 fps). After a portal image is acquired, the software searches for “hot spots” in the reconstructed 3D dose distribution. A hot spot is in this study defined as a 4 cm³ cube where the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated, resulting in a linac halt. The software has been tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf position error or a large monitor unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot spot detection, takes about 220 ms on a dual hexacore Intel Xeon X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments
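
    The hot-spot test defined above (a 4 cm³ cube whose mean reconstructed dose exceeds the mean planned dose by at least 20% and 50 cGy) can be sketched with a cube-averaging filter; the grid spacing and the random dose arrays are assumptions for illustration.

        # Sliding-cube hot-spot check over a 3D dose grid (stand-in data).
        import numpy as np
        from scipy.ndimage import uniform_filter

        voxel_mm = 4.0                                        # isotropic grid spacing (assumed)
        side = int(round(4000.0 ** (1.0 / 3.0) / voxel_mm))   # ~4 cm^3 cube side in voxels

        recon = np.random.rand(50, 50, 50) * 200.0    # reconstructed dose (cGy), stand-in
        plan = np.random.rand(50, 50, 50) * 200.0     # planned dose (cGy), stand-in

        mean_recon = uniform_filter(recon, size=side)   # cube-averaged doses
        mean_plan = uniform_filter(plan, size=side)
        hot = (mean_recon > 1.2 * mean_plan) & (mean_recon - mean_plan >= 50.0)
        if hot.any():
            print(f"ALERT: {hot.sum()} hot-spot cube centres -> halt the linac")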

  11. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry.

    Science.gov (United States)

    Barbeiro, A R; Ureba, A; Baeza, J A; Linares, R; Perucha, M; Jiménez-Ortega, E; Velázquez, S; Mateos, J C; Leal, A

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by the Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. This cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatch between calculated control points and the detection grid in the verification process are discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH in the patient CT is also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre

  12. DEVELOPMENT OF ENRICHMENT VERIFICATION ASSAY BASED ON THE AGE AND 235U AND 238U ACTIVITIES OF THE SAMPLES

    International Nuclear Information System (INIS)

    AL-YAMAHI, H.; EL-MONGY, S.A.

    2008-01-01

    Development of enrichment verification methods is the backbone of nuclear materials safeguards. In this study, the 235U percentage of depleted, natural, and very slightly enriched uranium samples was estimated based on the sample age and the measured activity of 235U and 238U. HpGe and NaI spectrometry were used for sample assay. An equation was derived to correlate the sample age and the 235U and 238U activities with the enrichment percentage (E%). The E% values calculated by the deduced equation and the target E% values were found to be similar, within a bias of 0.58-1.75% in the case of HpGe measurements. The correlation between them was found to be very strong. The activity was also calculated based on the measured sample count rate and the efficiency at the gamma energies of interest. The correlation between the E% and the 235U activity was estimated and found to be strongly linear. The results obtained by NaI were found to be less accurate than those obtained by HpGe. The bias in the case of NaI assay was in the range from 6.398% to 22.8% for E% verification.
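
    A hedged sketch of the algebra that underlies activity-based enrichment assay (standard specific-activity relations, not necessarily the paper's exact equation): since A = λN, the 235U/238U mass ratio follows from the measured activity ratio and the two half-lives, and E% is the 235U mass fraction.

        # Enrichment from a measured 235U/238U activity ratio (standard relations).
        import math

        T12_U235 = 7.04e8    # years
        T12_U238 = 4.468e9   # years

        def enrichment_percent(a235_over_a238):
            lam235 = math.log(2) / T12_U235
            lam238 = math.log(2) / T12_U238
            n_ratio = a235_over_a238 * lam238 / lam235   # atom ratio from activities
            m_ratio = n_ratio * 235.044 / 238.051        # mass ratio via atomic masses
            return 100.0 * m_ratio / (1.0 + m_ratio)     # 235U mass fraction, ignoring 234U

        print(f"E = {enrichment_percent(0.046):.2f}%")   # ~0.71%, natural uranium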

  13. NO2 DOAS measurements from ground and space: comparison of ground based measurements and OMI data in Mexico City

    Science.gov (United States)

    Rivera, C.; Stremme, W.; Grutter, M.

    2012-04-01

    The combination of satellite data and ground based measurements can provide valuable information about atmospheric chemistry and air quality. In this work we present a comparison between measured ground based NO2 differential columns at the Universidad Nacional Autónoma de México (UNAM) in Mexico City, using the Differential Optical Absorption Spectroscopy (DOAS) technique and NO2 total columns measured by the Ozone Monitoring Instrument (OMI) onboard the Aura satellite using the same measurement technique. From these data, distribution maps of average NO2 above the Mexico basin were constructed and hot spots inside the city could be identified. In addition, a clear footprint was detected from the Tula industrial area, ~50 km northwest of Mexico City, where a refinery, a power plant and other industries are located. A less defined footprint was identified in the Cuernavaca basin, South of Mexico City, and the nearby cities of Toluca and Puebla do not present strong enhancements in the NO2 total columns. With this study we expect to cross-validate space and ground measurements and provide useful information for future studies.

  14. Testing a ground-based canopy model using the wind river canopy crane

    Science.gov (United States)

    Robert Van Pelt; Malcolm P. North

    1999-01-01

    A ground-based canopy model that estimates the volume of occupied space in forest canopies was tested using the Wind River Canopy Crane. A total of 126 trees in a 0.25 ha area were measured from the ground and directly from a gondola suspended from the crane. The trees were located in a low elevation, old-growth forest in the southern Washington Cascades. The ground-...

  15. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach verified all design modules simultaneously. • The applicability of the proposed V&V approach facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems and verification and validation (V&V) assessments remain important issues to be resolved, and have now become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit tests – Very high speed integrated circuit Hardware Description Language (VHDL) tests and modified condition/decision coverage (MC/DC) tests; module tests – MATLAB/Simulink co-simulation tests; and integration tests – FPGA hardware test beds. To prove the adequacy of the proposed
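
    As a language-neutral illustration of the bistable fixed-setpoint trip logic named in the highlights (a Python model, not the VHDL design): the trip latches when the process variable reaches the setpoint and clears only below a reset deadband, which prevents chattering near the setpoint; all numbers are assumed.

        # Bistable fixed-setpoint trip with a reset deadband (illustrative values).
        def bistable_trip(samples, setpoint=110.0, deadband=2.0):
            tripped = False
            states = []
            for x in samples:
                if not tripped and x >= setpoint:
                    tripped = True                    # trip on reaching the setpoint
                elif tripped and x < setpoint - deadband:
                    tripped = False                   # reset only below the deadband
                states.append(tripped)
            return states

        print(bistable_trip([100, 109, 111, 109.5, 107, 105]))
        # [False, False, True, True, False, False]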

  16. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    International Nuclear Information System (INIS)

    Ahmed, Ibrahim; Jung, Jaecheon; Heo, Gyunyoung

    2017-01-01

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach verified all design modules simultaneously. • The applicability of the proposed V&V approach facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems and verification and validation (V&V) assessments remain important issues to be resolved, and have now become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can lead to the enhancement of the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verifications. The design verification stage includes unit tests – Very high speed integrated circuit Hardware Description Language (VHDL) tests and modified condition/decision coverage (MC/DC) tests; module tests – MATLAB/Simulink co-simulation tests; and integration tests – FPGA hardware test beds. To prove the adequacy of the proposed

  17. Independent calculation-based verification of IMRT plans using a 3D dose-calculation engine

    International Nuclear Information System (INIS)

    Arumugam, Sankar; Xing, Aitang; Goozee, Gary; Holloway, Lois

    2013-01-01

    Independent monitor unit verification of intensity-modulated radiation therapy (IMRT) plans requires detailed 3-dimensional (3D) dose verification. The aim of this study was to investigate using a 3D dose engine in a second commercial treatment planning system (TPS) for this task, facilitated by in-house software. Our department has XiO and Pinnacle TPSs, both with IMRT planning capability and modeled for an Elekta Synergy 6 MV photon beam. These systems allow the transfer of computed tomography (CT) data and RT structures between them but do not allow IMRT plans to be transferred. To provide this connectivity, an in-house computer programme was developed to convert radiation therapy prescription (RTP) files as generated by many planning systems into either XiO or Pinnacle IMRT file formats. Utilization of the technique and software was assessed by transferring 14 IMRT plans from XiO and Pinnacle onto the other system and performing 3D dose verification. The accuracy of the conversion process was checked by comparing the 3D dose matrices and dose-volume histograms (DVHs) of structures for the recalculated plan on the same system. The developed software successfully transferred IMRT plans generated by one planning system into the other. Comparison of planning target volume (PTV) DVHs for the original and recalculated plans showed good agreement; a maximum difference of 2% in mean dose, −2.5% in D95, and 2.9% in V95 was observed. Similarly, a DVH comparison of organs at risk showed a maximum difference of +7.7% between the original and recalculated plans for structures in both high- and medium-dose regions. However, for structures in low-dose regions (less than 15% of prescription dose) a difference in mean dose of up to +21.1% was observed between XiO and Pinnacle calculations. A dose matrix comparison of original and recalculated plans in the XiO and Pinnacle TPSs was performed using gamma analysis with 3%/3 mm criteria. The mean and standard deviation of pixels passing
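
    The DVH figures compared above (mean dose, D95, V95) reduce to simple percentile and threshold operations on the dose voxels inside a structure; a minimal Python sketch follows, with random stand-in arrays in place of the TPS output.

        # Mean dose, D95 and V95 from a dose grid and a binary structure mask.
        import numpy as np

        dose = np.random.rand(60, 60, 40) * 70.0    # Gy, stand-in dose grid
        mask = np.zeros(dose.shape, dtype=bool)
        mask[20:40, 20:40, 10:30] = True            # stand-in target volume

        d = dose[mask]
        prescription = 60.0                                # Gy (assumed)
        d95 = np.percentile(d, 5)                          # dose received by 95% of the volume
        v95 = 100.0 * np.mean(d >= 0.95 * prescription)    # % volume getting >= 95% of Rx
        print(f"mean={d.mean():.1f} Gy, D95={d95:.1f} Gy, V95={v95:.1f}%")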

  18. Integration of Remote Sensing Products with Ground-Based Measurements to Understand the Dynamics of Nepal's Forests and Plantation Sites

    Science.gov (United States)

    Gilani, H.; Jain, A. K.

    2016-12-01

    This study assembles information from three sources - remote sensing, terrestrial photography, and ground-based inventory data - to understand the dynamics of Nepal's tropical and sub-tropical forests and plantation sites for the period 1990-2015. Our study focuses on the following three district areas, which have conserved forests through social and agroforestry management practices: 1. Dolakha district: This site was selected to study the impact of community-based forest management on land cover change using repeat photography and satellite imagery, in combination with interviews with community members, over the period 1990-2010. We determined that satellite data with ground photographs can provide transparency for long-term monitoring. The initial results also suggest that the community-based forest management program in the mid-hills of Nepal was successful. 2. Chitwan district: Here we use high resolution remote sensing data and optimized community field inventories to evaluate the potential application and operational feasibility of community-level REDD+ measuring, reporting and verification (MRV) systems. The study uses temporal dynamics of land cover transitions, tree canopy size classes, and biomass over a Kayar khola watershed REDD+ study area with community forest to evaluate satellite image segmentation for land cover, a linear regression model for above-ground biomass (AGB), and estimation and monitoring field data for tree crowns and AGB. We study three specific years, 2002, 2009, and 2012, integrating WorldView-2 and airborne LiDAR data for tree-species-level analysis. 3. Nuwakot district: This district was selected to study the impact of the establishment of tree plantations on barren/fallow land. Over the last 40 years, this area has gone through drastic changes, from barren land to forest area with tree species consisting of Dalbergia sissoo, Leucaena leucocephala, Michelia champaca, etc. In 1994, this district area was registered

  19. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence or ''AI'' concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can then take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  20. Exoplanets -New Results from Space and Ground-based Surveys

    Science.gov (United States)

    Udry, Stephane

    The exploration of the outer solar system, and in particular of the giant planets and their environments, is an ongoing process, with the Cassini spacecraft currently around Saturn, the Juno mission to Jupiter preparing to depart, and two large future space missions planned to launch in the 2020-2025 time frame for the Jupiter system and its satellites (Europa and Ganymede) on the one hand, and the Saturnian system and Titan on the other hand [1,2]. Titan, Saturn's largest satellite, is the only other object in our Solar system to possess an extensive nitrogen atmosphere, host to an active organic chemistry, based on the interaction of N2 with methane (CH4). Following the Voyager flyby in 1980, Titan has been intensely studied with large ground-based telescopes (such as the Keck or the VLT) and by artificial satellites (such as the Infrared Space Observatory and the Hubble Space Telescope) for the past three decades. Prior to Cassini-Huygens, Titan's atmospheric composition was thus known to us from the Voyager missions and also through the explorations by the ISO. Our perception of Titan had thus been greatly enhanced, but many questions remained as to the nature of the haze surrounding the satellite and the composition of the surface. The recent revelations by the Cassini-Huygens mission have managed to surprise us with many discoveries [3-8] and have yet to reveal more of the interesting aspects of the satellite. The Cassini-Huygens mission to the Saturnian system has been an extraordinary success for the planetary community since the Saturn-Orbit-Insertion (SOI) in July 2004, and again with the very successful probe descent and landing of Huygens on January 14, 2005. One of its main targets was Titan. Titan was revealed to be a complex world more like the Earth than any other: it has a dense mostly nitrogen atmosphere and active climate and meteorological cycles where the working fluid, methane, behaves under Titan conditions the way that water does on

  1. A novel method for sub-arc VMAT dose delivery verification based on portal dosimetry with an EPID.

    Science.gov (United States)

    Cools, Ruud A M; Dirkx, Maarten L P; Heijmen, Ben J M

    2017-11-01

    The EPID-based sub-arc verification of VMAT dose delivery requires synchronization of the acquired electronic portal images (EPIs) with the VMAT delivery, that is, establishment of the start- and stop-MU of the acquired images. To realize this, published synchronization methods propose the use of logging features of the linac or dedicated hardware solutions. In this study, we developed a novel, software-based synchronization method that only uses information inherently available in the acquired images. The EPIs are continuously acquired during pretreatment VMAT delivery and converted into Portal Dose Images (PDIs). Sub-arcs of approximately 10 MU are then defined by combining groups of sequentially acquired PDIs. The start- and stop-MUs of measured sub-arcs are established in a synchronization procedure, using only dosimetric information in measured and predicted PDIs. Sub-arc verification of a VMAT dose delivery is based on comparison of measured sub-arc PDIs with synchronized, predicted sub-arc PDIs, using γ-analyses. To assess the accuracy of this new method, measured and predicted PDIs were compared for 20 clinically applied VMAT prostate cancer plans. The sensitivity of the method for detection of delivery errors was investigated using VMAT deliveries with intentionally inserted, small perturbations (25 error scenarios; leaf gap deviations ≤ 1.5 mm, leaf motion stops during ≤ 15 MU, linac output error ≤ 2%). For the 20 plans, the average failed pixel rates (FPR) for full-arc and sub-arc dose QA were 0.36% ± 0.26% (1 SD) and 0.64% ± 0.88%, based on 2%/2 mm and 3%/3 mm γ-analyses, respectively. Small systematic perturbations of up to 1% output error and 1 mm leaf offset were detected using full-arc QA. Sub-arc QA was able to detect positioning errors in three leaves only during approximately 20 MU and small dose delivery errors during approximately 40 MU. In an ROC analysis, the area under the curve (AUC) for the combined full-arc/sub-arc approach was

  2. Replacement of Hydrochlorofluorocarbon (HCFC) -225 Solvent for Cleaning and Verification Sampling of NASA Propulsion Oxygen Systems Hardware, Ground Support Equipment, and Associated Test Systems

    Science.gov (United States)

    Mitchell, Mark A.; Lowrey, Nikki M.

    2015-01-01

    Since the 1990's, when the Class I Ozone Depleting Substance (ODS) chlorofluorocarbon-113 (CFC-113) was banned, NASA's rocket propulsion test facilities at Marshall Space Flight Center (MSFC) and Stennis Space Center (SSC) have relied upon hydrochlorofluorocarbon-225 (HCFC-225) to safely clean and verify the cleanliness of large scale propulsion oxygen systems. Effective January 1, 2015, the production, import, export, and new use of HCFC-225, a Class II ODS, was prohibited by the Clean Air Act. In 2012 through 2014, leveraging resources from both the NASA Rocket Propulsion Test Program and the Defense Logistics Agency - Aviation Hazardous Minimization and Green Products Branch, test labs at MSFC, SSC, and Johnson Space Center's White Sands Test Facility (WSTF) collaborated to seek out, test, and qualify a replacement for HCFC-225 that is both an effective cleaner and safe for use with oxygen systems. Candidate solvents were selected and a test plan was developed following the guidelines of ASTM G127, Standard Guide for the Selection of Cleaning Agents for Oxygen Systems. Solvents were evaluated for materials compatibility, oxygen compatibility, cleaning effectiveness, and suitability for use in cleanliness verification and field cleaning operations. Two solvents were determined to be acceptable for cleaning oxygen systems and one was chosen for implementation at NASA's rocket propulsion test facilities. The test program and results are summarized. This project also demonstrated the benefits of cross-agency collaboration in a time of limited resources.

  3. Space- and ground-based particle physics meet at CERN

    CERN Multimedia

    CERN Bulletin

    2012-01-01

    The fourth international conference on Particle and Fundamental Physics in Space (SpacePart12) will take place at CERN from 5 to 7 November. The conference will bring together scientists working on particle and fundamental physics in space and on ground, as well as space policy makers from around the world.   One hundred years after Victor Hess discovered cosmic rays using hot air balloons, the experimental study of particle and fundamental physics is still being pursued today with extremely sophisticated techniques: on the ground, with state-of-the-art accelerators like the LHC; and in space, with powerful observatories that probe, with amazing accuracy, the various forms of cosmic radiation, charged and neutral, which are messengers of the most extreme conditions of matter and energy. SpacePart12 will be the opportunity for participants to exchange views on the progress of space-related science and technology programmes in the field of particle and fundamental physics in space. SpacePar...

  4. SU-F-T-463: Light-Field Based Dynalog Verification

    International Nuclear Information System (INIS)

    Atwal, P; Ramaseshan, R

    2016-01-01

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity modulated radiation therapy / volumetric modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted with removal of ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed in order to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the imaged positions of central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned locations by [0.38±0.28] mm. Similarly, bank B leaves 15–45 had a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further, and allow meaningful analysis over the full range of leaves.
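
    The leaf-end localization step can be sketched as finding the extremum of the intensity gradient along a profile across the light/shadow transition (the multi-scale iterative refinement in the abstract is omitted); the profile, noise level, and calibration factor below are synthetic assumptions.

        # Locate a leaf end as the steepest light-to-shadow transition (synthetic data).
        import numpy as np

        x = np.arange(200)                                     # pixels along the profile
        profile = 1.0 / (1.0 + np.exp(-(x - 123.0) / 2.5))     # smooth step at pixel 123
        profile += np.random.default_rng(1).normal(0.0, 0.01, x.size)

        grad = np.gradient(profile)
        leaf_end = x[np.argmax(np.abs(grad))]    # steepest transition, nearest pixel
        mm_per_px = 0.25                         # from the paper-sheet markings (assumed)
        print(f"leaf end at pixel {leaf_end} (~{leaf_end * mm_per_px:.2f} mm)")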

  5. SU-F-T-463: Light-Field Based Dynalog Verification

    Energy Technology Data Exchange (ETDEWEB)

    Atwal, P; Ramaseshan, R [BC Cancer Agency, Abbotsford, BC (Canada)

    2016-06-15

    Purpose: To independently verify leaf positions in so-called dynalog files for a Varian iX linac with a Millennium 120 MLC. This verification provides a measure of confidence that the files can be used directly as part of a more extensive intensity modulated radiation therapy / volumetric modulated arc therapy QA program. Methods: Initial testing used white paper placed at the collimator plane and a standard hand-held digital camera to image the light and shadow of a static MLC field through the paper. Known markings on the paper allow for image calibration. Noise reduction was attempted with removal of ‘inherent noise’ from an open-field light image through the paper, but the method was found to be inconsequential. This is likely because the environment could not be controlled to the precision required for the sort of reproducible characterization of the quantum noise needed in order to meaningfully characterize and account for it. A multi-scale iterative edge detection algorithm was used for localizing the leaf ends. These were compared with the planned locations from the treatment console. Results: With a very basic setup, the imaged positions of central bank A leaves 15–45, which are arguably the most important for beam modulation, differed from the planned locations by [0.38±0.28] mm. Similarly, bank B leaves 15–45 had a difference of [0.42±0.28] mm. Conclusion: It should be possible to determine leaf position accurately with not much more than a modern hand-held camera and some software. This means we can have a periodic and independent verification of the dynalog file information. This is indicated by the precision already achieved using a basic setup and analysis methodology. Currently, work is being done to reduce imaging and setup errors, which will bring the leaf position error down further, and allow meaningful analysis over the full range of leaves.

  6. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground and a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of clouds' features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (which is a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer; upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds would then not be observed).
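
    One way to read the combination of thermal imaging and a wind profile is geometric: a cloud feature advected at wind speed v sweeps an angular rate ω across the camera, so near zenith its height is roughly h = v/ω. The sketch below works through illustrative numbers; the camera resolution, frame interval, and wind speed are all assumptions.

        # Cloud base height from feature angular rate and sounded wind speed.
        import math

        pixels_moved = 40.0    # feature displacement between frames (assumed)
        ifov_deg = 0.05        # camera angular resolution per pixel (assumed)
        dt = 10.0              # seconds between frames
        omega = math.radians(pixels_moved * ifov_deg) / dt   # angular rate, rad/s

        v_wind = 8.0           # m/s at the candidate level, from the sounding
        height = v_wind / omega
        print(f"cloud base ≈ {height:.0f} m")   # ≈ 2290 m for these numbers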

  7. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground Based Computation and Control Systems and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems and on human health and safety, as well as the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in earth surface, atmospheric flight, and space flight environments. Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools (e.g., ground-based test methods as well as high energy particle transport and reaction codes) needed to design, test, and verify the safety and reliability of modern complex electronic systems as well as effects on human health and safety. The effects of primary cosmic ray particles, and secondary particle showers produced by nuclear reactions with spacecraft materials, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO).

  8. Intercomparison of ground based and satellite pictures of the sun

    International Nuclear Information System (INIS)

    Chapman, R.D.; Epstein, G.L.; Hobbs, R.W.; Neupert, W.M.; Thomas, R.J.

    1975-01-01

    Using NASA facilities in space (OSO-7) and on the ground (the Goddard Multi-Channel Spectrophotometer at Sacramento Peak, New Mexico), an active region has been mapped, and by combining these ultraviolet, X-ray and visible data a physical picture of this structured region has been constructed from the photosphere to the corona, corresponding to temperature regimes over the range 4500 K to 4 000 000 K. The morphology of the active region was then studied by comparing grey-shaded images, in which fine details stand out more clearly than in contour plots. One result of the study is that gross similarities persist from the low photosphere up to high in the transition region, while some changes occur in the corona. (Auth.)

  9. Ground-based spectral measurements of solar radiation, (2)

    International Nuclear Information System (INIS)

    Murai, Keizo; Kobayashi, Masaharu; Goto, Ryozo; Yamauchi, Toyotaro

    1979-01-01

    A newly designed spectro-pyranometer was used for the measurement of the global (direct + diffuse) and the diffuse sky radiation reaching the ground. By subtracting the diffuse component from the global radiation, we obtained the direct radiation component, which leads to the spectral distribution of the optical thickness (extinction coefficient) of the turbid atmosphere. The measurement of the diffuse sky radiation reveals the scattering effect of aerosols, and that of the global radiation allows the estimation of the total attenuation caused by scattering and absorption by aerosols. The effects of the aerosols are represented by the deviation of the measured real atmosphere from the Rayleigh atmosphere. By combining the measured values with those obtained by theoretical calculation for the model atmosphere, we estimated the amount of absorption by the aerosols. Very strong absorption in the ultraviolet region was recognized. (author)
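
    The subtraction and extinction steps described above can be sketched in a few lines. Assuming a plane-parallel atmosphere with airmass 1/cos(zenith) and a known extraterrestrial spectral irradiance, the direct component follows from global minus diffuse, and the total optical thickness from the Beer-Lambert law; all numbers below are illustrative, not the paper's data.

```python
import numpy as np

# illustrative spectral values (W m-2 nm-1), not measured data
wavelength_nm = np.array([350.0, 400.0, 500.0, 700.0])
global_irr    = np.array([0.45, 0.90, 1.20, 1.05])   # measured global
diffuse_irr   = np.array([0.25, 0.35, 0.30, 0.15])   # measured diffuse
i0            = np.array([1.00, 1.60, 1.90, 1.40])   # top-of-atmosphere

zenith_deg = 40.0
airmass = 1.0 / np.cos(np.radians(zenith_deg))        # plane-parallel airmass

# direct component on a horizontal surface, then at normal incidence
direct_horiz = global_irr - diffuse_irr
direct_normal = direct_horiz / np.cos(np.radians(zenith_deg))

# Beer-Lambert: total optical thickness of the turbid atmosphere
tau_total = np.log(i0 / direct_normal) / airmass
print(tau_total)
```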

  10. Modelling of Surface Fault Structures Based on Ground Magnetic Survey

    Science.gov (United States)

    Michels, A.; McEnroe, S. A.

    2017-12-01

    The island of Leka hosts the exposure of the Leka Ophiolite Complex (LOC), which contains mantle and crustal rocks and provides a rare opportunity to study the magnetic properties and response of these formations. The LOC is comprised of five rock units: (1) strongly deformed harzburgite, shifting into an increasingly olivine-rich dunite; (2) ultramafic cumulates with layers of olivine, chromite, clinopyroxene and orthopyroxene. These cumulates are overlain by (3) metagabbros, which are cut by (4) metabasaltic dykes and (5) pillow lavas (Furnes et al. 1988). Over the course of three field seasons a detailed ground-magnetic survey was made over the island, covering all units of the LOC and collecting samples from 109 sites for magnetic measurements. NRM, susceptibility, density and hysteresis properties were measured. In total, 66% of samples have a Q value > 1, suggesting that the magnetic anomalies should include both induced and remanent components in the model. This ophiolite originated from a suprasubduction zone near the coast of Laurentia (497±2 Ma), was obducted onto Laurentia (≈460 Ma) and then transferred to Baltica during the Caledonide Orogeny (≈430 Ma). The LOC was faulted, deformed and serpentinized during these events. The gabbro and ultramafic rocks are separated by a normal fault. The dominant magnetic anomaly that crosses the island correlates with this normal fault. There is a series of smaller-scale parallel faults, some of which correspond to local highs that can be highlighted by a tilt derivative of the magnetic data. These fault boundaries, which are well delineated by distinct magnetic anomalies in both ground and aeromagnetic survey data, are likely caused by an increased amount of serpentinization of the ultramafic rocks in the fault areas.

  11. Multisatellite and ground-based observations of transient ULF waves

    International Nuclear Information System (INIS)

    Potemra, T.A.; Zanetti, L.J.; Takahashi, K.; Erlandson, R.E.; Luehr, H.; Marklund, G.T.; Block, L.P.; Blomberg, L.G.; Lepping, R.P.

    1989-01-01

    A unique alignment of the Active Magnetospheric Particle Tracer Explorers (AMPTE) CCE and Viking satellites with respect to the EISCAT Magnetometer Cross has provided an opportunity to study transient ULF pulsations associated with variations in solar wind plasma density observed by the IMP 8 satellite. These observations were acquired during a relatively quiet period on April 24, 1986, during the Polar Region and Outer Magnetosphere International Study (PROMIS) period. An isolated 4-mHz (4-min period) pulsation was detected on the ground which was associated with transverse magnetic field oscillations observed by Viking at ∼2 R_E altitude above the auroral zone and by CCE at ∼8 R_E in the equatorial plane on nearly the same flux tube. CCE detected a compressional oscillation in the magnetic field with twice the period (∼10 min) of the transverse waves, and with a waveform nearly identical to an isolated oscillation in the solar wind plasma density measured by IMP 8. The authors conclude that the isolated 10-min oscillation in solar wind plasma density produced magnetic field compression oscillations inside the magnetosphere at the same frequency, which also enhanced resonant oscillations at approximately twice the frequency that were already present. The ground magnetic field variations are due to ionospheric Hall currents driven by the electric field of the standing Alfven waves. The time delay between surface and satellite data acquired at different local times supports the conclusion that the periodic solar wind density variation excites a tailward traveling large-scale magnetosphere wave train which excites local field line resonant oscillations. They conclude that these transient magnetic field variations are not associated with magnetic field reconnection or flux transfer events.

  12. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
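
    To make the Weibull transfer step concrete, here is a minimal sketch of the size-effect scaling mentioned above: coupon-level parameters (characteristic strength, Weibull modulus, reference volume) are extrapolated to a full-scale effective volume under uniform stress. All parameter values are illustrative assumptions, not guideline data.

```python
import numpy as np

# Weibull weakest-link scaling: failure probability of a part of effective
# volume V under uniform stress sigma, from coupon data at reference V0.
def weibull_failure_probability(sigma, m, sigma0, v, v0):
    return 1.0 - np.exp(-(v / v0) * (sigma / sigma0) ** m)

m, sigma0, v0 = 10.0, 300.0, 1.0e-6     # modulus, MPa, m^3 (coupon, assumed)
v_structure = 2.0e-3                    # m^3, full-scale effective volume
for sigma in (100.0, 150.0, 200.0):
    pf = weibull_failure_probability(sigma, m, sigma0, v_structure, v0)
    print(f"{sigma:6.1f} MPa -> Pf = {pf:.3e}")
```

    The design consequence is visible immediately: for the same stress, the larger effective volume carries a much higher failure probability than the coupon, which is why the transfer from elementary data to full-scale structures is central to the guideline.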

  13. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
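
    The manufactured-solutions idea recommended above for code verification benchmarks can be illustrated briefly: choose an exact solution, derive the source term it implies, run the discretized solver at two resolutions, and confirm the observed order of accuracy. The 1D Poisson problem below is entirely illustrative, not one of the paper's benchmarks.

```python
import numpy as np

# Manufactured solution u(x) = sin(pi x) for -u'' = f on (0,1), u(0)=u(1)=0,
# which implies f(x) = pi^2 sin(pi x). A second-order finite-difference
# solver should show an observed order of accuracy near 2.
def solve_poisson(n):
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)              # manufactured source term
    a = (np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1)
         + np.diag(-np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(a, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))  # error vs exact solution

e1, e2 = solve_poisson(32), solve_poisson(64)
print("observed order:", np.log2(e1 / e2))        # should approach 2.0
```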

  14. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  15. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  16. LEO-to-ground optical communications using SOTA (Small Optical TrAnsponder) - Payload verification results and experiments on space quantum communications

    Science.gov (United States)

    Carrasco-Casado, Alberto; Takenaka, Hideki; Kolev, Dimitar; Munemasa, Yasushi; Kunimori, Hiroo; Suzuki, Kenji; Fuse, Tetsuharu; Kubo-Oka, Toshihiro; Akioka, Maki; Koyama, Yoshisada; Toyoshima, Morio

    2017-10-01

    Free-space optical communications have long held the promise of revolutionizing space communications. The benefits of increasing the bitrate while reducing the volume, mass and energy of the space terminals have attracted the attention of many researchers. In the last few years, more and more technology demonstrations have been taking place with participants from both the public and the private sector. The National Institute of Information and Communications Technology (NICT) in Japan has long experience in this field. SOTA (Small Optical TrAnsponder) was the last NICT space lasercom mission, designed to demonstrate the potential of this technology applied to microsatellites. Since the beginning of the SOTA mission in 2014, NICT regularly established communication using the Optical Ground Stations (OGS) located in the headquarters at Koganei (Tokyo) to receive the SOTA signals, with over one hundred successful links. All the goals of the SOTA mission were fulfilled, including up to 10-Mbit/s downlinks using two different wavelengths and apertures, coarse and fine tracking of the OGS beacon, space-to-ground transmission of the on-board-camera images, experiments with different error-correcting codes, interoperability with other international OGSs, and experiments on quantum communications. The SOTA mission ended in November 2016, more than doubling its designed lifetime of one year. In this paper, the SOTA characteristics and basic operation are explained, along with the most relevant technological demonstrations.

  17. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems. The effects of primary cosmic ray particles and secondary particle showers produced by nuclear reactions with the atmosphere can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Finally, accumulation of both primary cosmic ray and secondary cosmic ray induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO). In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems, as well as on human health, and the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in ground-based, atmospheric flight, and space flight environments. Ground test methods applied to microelectronic components and systems are used in combination with radiation transport and reaction codes to predict the performance of microelectronic systems in their operating environments. Similar radiation transport

  18. (Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio)

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This report presents information concerning field procedures employed during the monitoring, well construction, well purging, sampling, and well logging at the Wright-Patterson Air Force Base. Activities were conducted in an effort to evaluate ground water contamination.

  19. Spectral Analysis of the Background in Ground-based, Long-slit ...

    Indian Academy of Sciences (India)

    1996-12-08

    Dec 8, 1996 ... Spectral Analysis of the Background in Ground-based, Long-slit ... Figure 1 plots spectra from the 2-D array, after instrumental calibration and before correction for ... which would merit attention and a better understanding.

  20. Ground-Based Global Navigation Satellite System Combined Broadcast Ephemeris Data (daily files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Combined Broadcast Ephemeris Data (daily files of all distinct navigation messages...

  1. Chasing Small Exoplanets with Ground-Based Near-Infrared Transit Photometry

    Science.gov (United States)

    Colon, K. D.; Barentsen, G.; Vinicius, Z.; Vanderburg, A.; Coughlin, J.; Thompson, S.; Mullally, F.; Barclay, T.; Quintana, E.

    2017-11-01

    I will present results from a ground-based survey to measure the infrared radius and other properties of small K2 exoplanets and candidates. The survey serves as preparation for upcoming discoveries from TESS and characterization with JWST.

  2. Verification and validation issues for digitally-based NPP safety systems

    International Nuclear Information System (INIS)

    Ets, A.R.

    1993-01-01

    The trend toward standardization, integration and reduced costs has led to increasing use of digital systems in reactor protection systems. While digital systems provide maintenance and performance advantages, their use also introduces new safety issues, in particular with regard to software. Current practice relies on verification and validation (V and V) to ensure the quality of safety software. However, effective V and V must be done in conjunction with a structured software development process and must consider the context of the safety system application. This paper presents some of the issues and concerns that impact the V and V process. These include documentation of system requirements, common mode failures, hazards analysis and independence. These issues and concerns arose during evaluations of NPP safety systems for advanced reactor designs and digital I and C retrofits for existing nuclear plants in the United States. The pragmatic lessons from actual system reviews can provide a basis for further refinement and development of guidelines for applying V and V to NPP safety systems. (author). 14 refs

  3. Grip-Pattern Verification for Smart Gun Based on Maximum-Pairwise Comparison and Mean-Template Comparison

    NARCIS (Netherlands)

    Shang, X.; Veldhuis, Raymond N.J.

    2008-01-01

    In our biometric verification system for a smart gun, the rightful user of the gun is authenticated by grip-pattern recognition. In this work verification is done using two types of comparison methods. One is mean-template comparison, where the matching score between a test image and
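
    A minimal sketch of the mean-template idea follows, under the assumption (all the record states) that a matching score is computed between a test image and a per-user mean template; normalized correlation is used here as an illustrative score, not necessarily the paper's exact measure.

```python
import numpy as np

def mean_template(enrollment_images):
    """Per-user template: pixelwise mean of the enrollment grip images."""
    return np.mean(np.stack(enrollment_images), axis=0)

def matching_score(test_image, template):
    """Zero-mean normalized correlation between test image and template."""
    t = test_image - test_image.mean()
    m = template - template.mean()
    return float(np.sum(t * m) / (np.linalg.norm(t) * np.linalg.norm(m)))

rng = np.random.default_rng(0)
enrolled = [rng.normal(size=(44, 44)) + 5.0 for _ in range(10)]  # synthetic
template = mean_template(enrolled)
genuine = enrolled[0]                       # image that built the template
impostor = rng.normal(size=(44, 44))        # unrelated grip pattern
print(matching_score(genuine, template), matching_score(impostor, template))
```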

  4. Novel method based on Fricke gel dosimeters for dose verification in IMRT techniques

    International Nuclear Information System (INIS)

    Aon, E.; Brunetto, M.; Sansogne, R.; Castellano, G.; Valente, M.

    2008-01-01

    Modern radiotherapy is becoming increasingly complex. Conformal and intensity modulated (IMRT) techniques are nowadays available for achieving better tumour control. However, accurate methods for 3D dose verification for these modern irradiation techniques have not yet been adequately established. Fricke gel dosimeters consist essentially of a ferrous sulphate (Fricke) solution fixed to a gel matrix, which enables spatial resolution. A suitable radiochromic marker (xylenol orange) is added to the solution in order to produce radiochromic changes within the visible spectrum range, due to the internal chemical conversion (oxidation) of ferrous ions to ferric ions. In addition, xylenol orange has proved to slow down the internal diffusion of ferric ions. These dosimeters, suitably shaped into thin layers and optically analyzed by means of visible light transmission imaging, have recently been proposed as a method for 3D absorbed dose distribution determination in radiotherapy, and tested in several IMRT applications employing a homogeneous plane (visible light) illuminator and a CCD camera with a monochromatic filter for sample analysis by means of transmittance images. In this work, the performance of an alternative read-out method is characterized, consisting of visible light images acquired before and after irradiation by means of a commercially available flatbed-like scanner. Registered images are suitably converted to matrices and analyzed by means of dedicated 'in-house' software. The developed method allows performing 1D (profile), 2D (surface) and 3D (volume) dose mapping. In addition, quantitative comparisons have been performed by means of the gamma composite criteria. Dose distribution comparisons between Fricke gel dosimeters and traditional standard dosimetric techniques for IMRT irradiations show an overall good agreement, supporting the suitability of the method. The agreement, quantified by the gamma index (that seldom
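
    The before/after read-out step lends itself to a short sketch: the two scans are converted to a per-pixel optical-density difference and mapped to dose with a calibration curve. The linear calibration slope below is a hypothetical placeholder, not the paper's calibration.

```python
import numpy as np

def delta_od(img_before, img_after):
    """Per-pixel optical-density difference from two transmittance images."""
    eps = 1e-6                          # avoids log of zero
    return np.log10((img_before + eps) / (img_after + eps))

def dose_map(d_od, gy_per_od=25.0):     # hypothetical calibration slope
    return gy_per_od * d_od

rng = np.random.default_rng(1)
before = np.full((64, 64), 0.90) + 0.01 * rng.standard_normal((64, 64))
after = before * 0.80                   # darkening from the ferric-ion complex
dose = dose_map(delta_od(before, after))
print(float(dose.mean()))               # ~25 * log10(1/0.8) ≈ 2.4 Gy
```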

  5. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

    In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields. The three fields comprise a gas, a continuous liquid and an entrained liquid field. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer as well as momentum transfer. Fluid/structure interaction generally includes both heat and momentum transfer. Assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and transitions between them. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect code stability. A further study would be required to enhance the code's capability in this regard.

  6. A Comparison of Two Above-Ground Biomass Estimation Techniques Integrating Satellite-Based Remotely Sensed Data and Ground Data for Tropical and Semiarid Forests in Puerto Rico

    Science.gov (United States)

    Two above-ground forest biomass estimation techniques were evaluated for the United States Territory of Puerto Rico using predictor variables acquired from satellite-based remotely sensed data and ground data from the U.S. Department of Agriculture Forest Inventory Analysis (FIA)...

  7. Knowledge-Base Application to Ground Moving Target Detection

    National Research Council Canada - National Science Library

    Adve, R

    2001-01-01

    This report summarizes a multi-year in-house effort to apply knowledge-base control techniques and advanced Space-Time Adaptive Processing algorithms to improve detection performance and false alarm...

  8. Sequential Ground Motion Effects on the Behavior of a Base-Isolated RCC Building

    Directory of Open Access Journals (Sweden)

    Zhi Zheng

    2017-01-01

    Full Text Available The sequential ground motion effects on the dynamic responses of reinforced concrete containment (RCC) buildings with typical isolators are studied in this paper. Although the base isolation technique has been developed to guarantee the security and integrity of RCC buildings under single earthquakes, the seismic behavior of base-isolated RCC buildings under sequential ground motions remains insufficiently studied. Hence, an ensemble of as-recorded sequential ground motions is employed to study the effect of including aftershocks in the seismic evaluation of base-isolated RCC buildings. The results indicate that base isolation can significantly attenuate the earthquake shaking of the RCC building under not only single earthquakes but also seismic sequences. It is also found that the adverse aftershock effect on the RCC can be reduced by the base isolation applied to the RCC. More importantly, the study indicates that disregarding aftershocks can lead to significant underestimation of the isolator displacement for base-isolated RCC buildings.
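
    The kind of evaluation described above can be sketched with a linear single-degree-of-freedom isolator model integrated through a mainshock-aftershock sequence while tracking peak isolator displacement. The period, damping and synthetic accelerograms below are illustrative assumptions, not the paper's ensemble or structural model.

```python
import numpy as np

# Linear SDOF isolator: u'' + 2*zeta*w*u' + w^2*u = -ag(t), RK4 integration.
def sdof_peak_disp(ag, dt, period=2.5, zeta=0.15):
    w = 2.0 * np.pi / period
    t = np.arange(len(ag)) * dt
    def f(ti, y):
        agi = np.interp(ti, t, ag)      # ground acceleration at time ti
        return np.array([y[1], -2.0 * zeta * w * y[1] - w**2 * y[0] - agi])
    y, peak = np.zeros(2), 0.0
    for ti in t[:-1]:
        k1 = f(ti, y)
        k2 = f(ti + dt / 2.0, y + dt / 2.0 * k1)
        k3 = f(ti + dt / 2.0, y + dt / 2.0 * k2)
        k4 = f(ti + dt, y + dt * k3)
        y = y + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        peak = max(peak, abs(y[0]))     # track peak isolator displacement
    return peak

dt = 0.01
tm = np.arange(0.0, 20.0, dt)
main = 3.0 * np.exp(-0.3 * tm) * np.sin(2.0 * np.pi * 1.0 * tm)   # m/s^2
after = 1.5 * np.exp(-0.3 * tm) * np.sin(2.0 * np.pi * 1.2 * tm)  # weaker shock
gap = np.zeros(int(30.0 / dt))                                    # quiet interval
sequence = np.concatenate([main, gap, after])
print("mainshock only:", sdof_peak_disp(main, dt))
print("full sequence :", sdof_peak_disp(sequence, dt))
```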

  9. Designed microtremor array based actual measurement and analysis of strong ground motion at Palu city, Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Thein, Pyi Soe, E-mail: pyisoethein@yahoo.com [Geology Department, Yangon University (Myanmar); Pramumijoyo, Subagyo; Wilopo, Wahyu; Setianto, Agung [Geological Engineering Department, Gadjah Mada University (Indonesia); Brotopuspito, Kirbani Sri [Physics Department, Gadjah Mada University (Indonesia); Kiyono, Junji; Putra, Rusnardi Rahmat [Graduate School of Global Environmental Studies, Kyoto University (Japan)

    2015-04-24

    In this study, we investigated the strong ground motion characteristics under Palu City, Indonesia. The shear-wave velocity structures evaluated from eight microtremor measurements are the most applicable for determining the thickness of sediments and the average shear-wave velocity where Vs ≤ 300 m/s. Based on the identified subsurface structure models, earthquake ground motion in a future Palu-Koro earthquake was estimated using the statistical Green's function method. The seismic microzonation parameters were derived by considering several significant factors controlling ground response during the January 23, 2005 earthquake.

  10. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    International Nuclear Information System (INIS)

    Chiodo, Mario S.G.; Ruggieri, Claudio

    2009-01-01

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corrosion defects.

  11. Failure assessments of corroded pipelines with axial defects using stress-based criteria: Numerical studies and verification analyses

    Energy Technology Data Exchange (ETDEWEB)

    Chiodo, Mario S.G. [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil); Ruggieri, Claudio [Department of Naval Architecture and Ocean Engineering, University of Sao Paulo, Av. Prof. Mello Moraes, 2231 (PNV-EPUSP), Sao Paulo, SP 05508-030 (Brazil)], E-mail: claudio.ruggieri@poli.usp.br

    2009-02-15

    Conventional procedures used to assess the integrity of corroded piping systems with axial defects generally employ simplified failure criteria based upon a plastic collapse failure mechanism incorporating the tensile properties of the pipe material. These methods establish acceptance criteria for defects based on limited experimental data for low strength structural steels which do not necessarily address specific requirements for the high grade steels currently used. For these cases, failure assessments may be overly conservative or provide significant scatter in their predictions, which leads to unnecessary repair or replacement of in-service pipelines. Motivated by these observations, this study examines the applicability of a stress-based criterion based upon plastic instability analysis to predict the failure pressure of corroded pipelines with axial defects. A central focus is to gain additional insight into the effects of defect geometry and material properties on the attainment of a local limit load to support the development of stress-based burst strength criteria. The work provides an extensive body of results which lend further support to adopting failure criteria for corroded pipelines based upon ligament instability analyses. A verification study conducted on burst testing of large-diameter pipe specimens with different defect lengths shows the effectiveness of a stress-based criterion using local ligament instability in burst pressure predictions, even though the adopted burst criterion exhibits a potential dependence on defect geometry and possibly on the material's strain hardening capacity. Overall, the results presented here suggest that the use of stress-based criteria based upon plastic instability analysis of the defect ligament is a valid engineering tool for integrity assessments of pipelines with axial corrosion defects.
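
    For orientation, the sketch below implements a modified-B31G-style burst estimate, a widely used stress-based assessment of a single axial corrosion defect. It is named plainly as such: it is not the ligament-instability criterion developed in these papers, and the pipe and defect dimensions are illustrative.

```python
import math

def modified_b31g_burst_pressure(d_out, t, depth, length, sigma_flow):
    """Modified B31G (0.85dL) burst estimate.
    d_out, t, depth, length in mm; sigma_flow in MPa; returns MPa."""
    z = length**2 / (d_out * t)
    if z <= 50.0:
        m = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z**2)  # Folias factor
    else:
        m = 0.032 * z + 3.3
    a = depth / t                                          # defect depth ratio
    return (2.0 * t * sigma_flow / d_out) * (1.0 - 0.85 * a) / (1.0 - 0.85 * a / m)

# X60-like line pipe (illustrative): flow stress approximated as SMYS + 69 MPa
print(modified_b31g_burst_pressure(d_out=323.9, t=9.5, depth=4.0,
                                   length=100.0, sigma_flow=414.0 + 69.0))
```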

  12. EPID-based verification of the MLC performance for dynamic IMRT and VMAT

    International Nuclear Information System (INIS)

    Rowshanfarzad, Pejman; Sabet, Mahsheed; Barnes, Michael P.; O’Connor, Daryl J.; Greer, Peter B.

    2012-01-01

    Purpose: In advanced radiotherapy treatments such as intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), verification of the performance of the multileaf collimator (MLC) is an essential part of the linac QA program. The purpose of this study is to use the existing measurement methods for geometric QA of the MLCs and extend them to more comprehensive evaluation techniques, and to develop dedicated robust algorithms to quantitatively investigate the MLC performance in a fast, accurate, and efficient manner. Methods: The behavior of leaves was investigated in the step-and-shoot mode by the analysis of integrated electronic portal imaging device (EPID) images acquired during picket fence tests at fixed gantry angles and arc delivery. The MLC was also studied in dynamic mode by the analysis of cine EPID images of a sliding gap pattern delivered in a variety of conditions including different leaf speeds, deliveries at fixed gantry angles or in arc mode, and changing the direction of leaf motion. The accuracy of the method was tested by detection of the intentionally inserted errors in the delivery patterns. Results: The algorithm developed for the picket fence analysis was able to find each individual leaf position, gap width, and leaf bank skewness in addition to the deviations from expected leaf positions with respect to the beam central axis with sub-pixel accuracy. For the three tested linacs over a period of 5 months, the maximum change in the gap width was 0.5 mm, the maximum deviation from the expected leaf positions was 0.1 mm and the MLC skewness was up to 0.2°. The algorithm developed for the sliding gap analysis could determine the velocity and acceleration/deceleration of each individual leaf as well as the gap width. There was a slight decrease in the accuracy of leaf performance with increasing leaf speeds. The analysis results were presented through several graphs. The accuracy of the method was assessed as 0.01 mm
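
    One step of such a picket-fence analysis, locating picket positions from an integrated EPID image and flagging deviations from expected positions, can be sketched as below. The synthetic image, pixel size and prominence threshold are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

def picket_positions(epid_image, pixel_mm=0.4):
    """Locate picket (gap) centers from the profile across leaf motion."""
    profile = epid_image.mean(axis=0)                  # average over leaf rows
    peaks, _ = find_peaks(profile, prominence=0.5 * np.ptp(profile))
    return peaks * pixel_mm                            # mm at the detector plane

pixel_mm, width = 0.4, 512
x = np.arange(width) * pixel_mm
expected = np.array([40.0, 70.0, 100.0, 130.0, 160.0])
actual = expected.copy()
actual[2] += 0.8                                       # deliberately shifted picket
img = np.zeros((128, width))
for c in actual:
    img += np.exp(-0.5 * ((x - c) / 1.0) ** 2)         # ~2.4 mm FWHM picket
print(np.round(picket_positions(img, pixel_mm) - expected, 2))  # flags 0.8 mm
```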

  13. Enhancing Ground Based Telescope Performance with Image Processing

    Science.gov (United States)

    2013-11-13

    called the hybrid diversity algorithm (HDA), based on the Gerchberg-Saxton algorithm with an additional process to perform phase-unwrapping [36, 45-47]. The HDA requires phase diversity similar to the LM least-squares method used for characterizing the HST [32]. The problem of generating ... In addition, the new phase retrieval algorithm proposed in this chapter has the advantage over NASA's hybrid diversity algorithm (HDA) planned for use on JWST

  14. Ground test of satellite constellation based quantum communication

    OpenAIRE

    Liao, Sheng-Kai; Yong, Hai-Lin; Liu, Chang; Shentu, Guo-Liang; Li, Dong-Dong; Lin, Jin; Dai, Hui; Zhao, Shuang-Qiang; Li, Bo; Guan, Jian-Yu; Chen, Wei; Gong, Yun-Hong; Li, Yang; Lin, Ze-Hong; Pan, Ge-Sheng

    2016-01-01

    Satellite based quantum communication has been proven a feasible way to achieve a global-scale quantum communication network. Very recently, a low-Earth-orbit (LEO) satellite has been launched for this purpose. However, with a single satellite, it takes an inefficient 3-day period to provide worldwide connectivity. On the other hand, similar to how the Iridium system functions in classical communication, a satellite constellation (SC) composed of many quantum satellites could provide global...

  15. Informing hydrological models with ground-based time-lapse relative gravimetry: potential and limitations

    DEFF Research Database (Denmark)

    Bauer-Gottwein, Peter; Christiansen, Lars; Rosbjerg, Dan

    2011-01-01

    Coupled hydrogeophysical inversion emerges as an attractive option to improve the calibration and predictive capability of hydrological models. Recently, ground-based time-lapse relative gravity (TLRG) measurements have attracted increasing interest because there is a direct relationship between ... parameter uncertainty decreased significantly when TLRG data were included in the inversion. The forced infiltration experiment caused changes in unsaturated zone storage, which were monitored using TLRG and ground-penetrating radar. A numerical unsaturated zone model was subsequently conditioned on both ...

  16. Quantitative Estimation of Above Ground Crop Biomass using Ground-based, Airborne and Spaceborne Low Frequency Polarimetric Synthetic Aperture Radar

    Science.gov (United States)

    Koyama, C.; Watanabe, M.; Shimada, M.

    2016-12-01

    Estimation of crop biomass is one of the important challenges in environmental remote sensing, related to agricultural as well as hydrological and meteorological applications. Usually passive optical data (photographs, spectral data) operating in the visible and near-infrared bands are used for such purposes. The value of optical remote sensing for yield estimation, however, is rather limited, as visible light can only provide information about the chemical characteristics of the canopy surface. Low frequency microwave signals with wavelengths longer than 20 cm have the potential to penetrate through the canopy and provide information about the whole vertical structure of vegetation, from the top of the canopy down to the very soil surface. This phenomenon has been well known and exploited to detect targets under vegetation in the military radar application known as FOPEN (foliage penetration). With the availability of polarimetric interferometric SAR data, the use of PolInSAR techniques to retrieve vertical vegetation structure has become an attractive tool. However, PolInSAR is still highly experimental and suitable data are not yet widely available. In this study we focus on the use of operational dual-polarization L-band (1.27 GHz) SAR, which has been available worldwide since the launch of Japan's Advanced Land Observing Satellite (ALOS, 2006-2011). Since 2014, ALOS-2 has continued to deliver this kind of partial polarimetric data for the entire land surface. In addition to these spaceborne data sets we use airborne L-band SAR data acquired by the Japanese Pi-SAR-L2 as well as ultra-wideband (UWB) ground-based SAR data operating in the frequency range from 1-4 GHz. By exploiting the complex dual-polarization [C2] covariance matrix information, the scattering contributions from the canopy can be well separated from the ground reflections, allowing for the establishment of semi-empirical relationships between measured radar reflectivity and the amount of fresh-weight above-ground
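
    The covariance step mentioned above can be sketched: complex HH and HV single-look images are multilooked into the 2x2 covariance matrix [C2], and the cross-polarized power C22 (dominated by volume scattering) feeds a semi-empirical biomass regression. The regression coefficients and synthetic data below are placeholders, not fitted values from the study.

```python
import numpy as np

def c2_matrix(s_hh, s_hv, looks=5):
    """Multilooked elements of the dual-pol covariance matrix [C2]."""
    def multilook(img):
        h, w = img.shape
        h, w = h - h % looks, w - w % looks
        return img[:h, :w].reshape(h // looks, looks, w // looks, looks).mean((1, 3))
    c11 = multilook(np.abs(s_hh) ** 2)          # <|HH|^2>
    c22 = multilook(np.abs(s_hv) ** 2)          # <|HV|^2>
    c12 = multilook(s_hh * np.conj(s_hv))       # <HH HV*>
    return c11, c12, c22

def biomass_t_ha(c22, a=60.0, b=8.0):           # hypothetical regression
    sigma0_db = 10.0 * np.log10(c22)
    return a + b * sigma0_db

rng = np.random.default_rng(2)
shape = (100, 100)
s_hh = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
s_hv = 0.3 * (rng.standard_normal(shape) + 1j * rng.standard_normal(shape))
_, _, c22 = c2_matrix(s_hh, s_hv)
print(float(biomass_t_ha(c22).mean()))
```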

  17. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impacts of the various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). In order to assist the application of the assessment method, and as a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use because assessment scenarios can be set up through diagrams. Software verification was performed by comparing its results to SRS-19 and CROM software calculation results. Doses estimated by SRS-19 are higher than those of the developed software. However, this is still acceptable since dose estimation in SRS-19 is based on a conservative approach. On the other hand, compared to the CROM software, the same results were obtained for three scenarios, and a non-significant difference of 2.25% was found in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features should be added and new models developed to improve the capability of the software. (author)
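
    For illustration of the kind of generic dispersion step such a screening code performs, the sketch below evaluates a textbook ground-level Gaussian-plume concentration for a continuous release. This is not the exact SRS-19 formulation; the dispersion coefficients (which in practice depend on downwind distance and stability class) and the source data are assumed values.

```python
import math

def plume_concentration(q_bq_s, u_m_s, h_m, sigma_y, sigma_z):
    """Centerline ground-level concentration (Bq/m^3) for a continuous
    release from an elevated source, with full ground reflection."""
    return (q_bq_s / (math.pi * sigma_y * sigma_z * u_m_s)
            * math.exp(-h_m**2 / (2.0 * sigma_z**2)))

# 1 GBq/s release from a 30 m stack in a 2 m/s wind; coefficients chosen
# to represent roughly 1 km downwind in neutral conditions (assumed)
c = plume_concentration(q_bq_s=1.0e9, u_m_s=2.0, h_m=30.0,
                        sigma_y=80.0, sigma_z=40.0)
print(f"{c:.3e} Bq/m^3")
```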

  18. New frontiers in ground-based optical astronomy

    Science.gov (United States)

    Strom, Steve

    1991-07-01

    Technological advances made in telescope design during the 1980s are outlined, including a segmented primary mirror for a 10-m telescope, new mirror-figuring techniques, and control systems based on computers and electronics. A new detector technology employing CCDs and advances in high-resolution telescopes are considered, along with areas of research ready for major advances given new observing tools, such as the origin of large-scale structure in the universe, the creation and evolution of galaxies, and the formation of stars and planetary systems. Attention is focused on circumstellar disks, dust veils, jets, and brown dwarfs.

  19. New frontiers in ground-based optical astronomy

    International Nuclear Information System (INIS)

    Strom, S.

    1991-01-01

    Technological advances made in telescope design during the 1980s are outlined, including a segmented primary mirror for a 10-m telescope, new mirror-figuring techniques, and control systems based on computers and electronics. A new detector technology employing CCDs and advances in high-resolution telescopes are considered, along with areas of research ready for major advances given new observing tools, such as the origin of large-scale structure in the universe, the creation and evolution of galaxies, and the formation of stars and planetary systems. Attention is focused on circumstellar disks, dust veils, jets, and brown dwarfs

  20. The setting for ground based augmentation system station

    Science.gov (United States)

    Ni, Yude; Liu, Ruihua

    2007-11-01

    Based on the minimum field strength required within the whole GBAS service volume, this paper performs a nominal link power budget for the GBAS VHF data broadcast (VDB) system, and the required power transmitted from the VDB system is derived. The paper establishes by test the Desired-to-Undesired (D/U) signal ratio required for a specific VHF airborne receiver to ensure normal operation, and presents the experimental method and results for acquiring the D/U signal ratios. The minimum geographical separations among GBAS, VOR and ILS stations are calculated according to the specifications of these three kinds of navigation systems.
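
    The nominal budget step can be sketched as received power = EIRP + receive antenna gain - free-space path loss, checked against the power corresponding to the minimum required field strength. VDB does operate in the 108-118 MHz band, but the EIRP, gain and distances below are illustrative assumptions, not the paper's figures.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss, 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def received_power_dbm(eirp_dbm, distance_m, freq_hz, rx_gain_dbi=0.0):
    return eirp_dbm + rx_gain_dbi - fspl_db(distance_m, freq_hz)

freq = 112.0e6                        # within the 108-118 MHz VDB band
for d_km in (10, 20, 43):             # 43 km ~ edge of a GBAS service volume
    p = received_power_dbm(eirp_dbm=47.0, distance_m=d_km * 1e3, freq_hz=freq)
    print(f"{d_km:3d} km -> {p:6.1f} dBm")
```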

  1. Verification and completion of a soil data base for process based erosion model applications in Mato Grosso/Brazil

    Science.gov (United States)

    Schindewolf, Marcus; Schultze, Nico; Schönke, Daniela; Amorim, Ricardo S. S.; Schmidt, Jürgen

    2014-05-01

    The study area in central Mato Grosso is subject to severe soil erosion. Continuous erosion leads to massive losses of top soil and related organic carbon. Consequently, agricultural soils suffer a drop in soil fertility which can only be balanced by mineral fertilization. In order to control soil degradation and organic carbon losses of Mato Grosso cropland soils, a process-based soil loss and deposition model is used. Applying the model, it will be possible to:
    - identify the main areas affected by soil erosion or deposition at different scales under present and future climate and socio-economic conditions
    - estimate the related nutrient and organic carbon losses/yields
    - figure out site-related causes of soil mobilization/deposition
    - locate sediment and sediment-related nutrient and organic matter pass-over points into surface water bodies
    - estimate the impacts of climate and land use changes on the losses of top soil, sediment-bound nutrients and organic carbon.
    Model input parameters include digital elevation data, precipitation characteristics and standard soil properties such as particle size distribution, total organic carbon (TOC) and bulk density. The effects of different types of land use and agricultural management practices are accounted for by varying site-specific parameters, predominantly related to soil surface properties such as erosional resistance, hydraulic roughness and percentage ground cover. In this context the existing EROSION 3D soil parameter database, derived from large-scale rainfall simulations in Germany, is verified for application in the study area using a small-scale disc-type rainfall simulator with an additional runoff reflux approach. Thus it is possible to enlarge the virtual plot length to at least 10 m. Experimental plots are located in the Cuiabá region of central Mato Grosso in order to cover the most relevant land use variants and tillage practices in the region. Results show that derived model parameters are highly influenced

  2. Hailstorms over Switzerland: Verification of Crowd-sourced Data

    Science.gov (United States)

    Noti, Pascal-Andreas; Martynov, Andrey; Hering, Alessandro; Martius, Olivia

    2016-04-01

    The reports of smartphone users witnessing hailstorms can be used as a source of independent, ground-based observation data on ground-reaching hailstorms with high temporal and spatial resolution. The presented work focuses on the verification of crowd-sourced data collected over Switzerland with the help of a smartphone application recently developed by MeteoSwiss. The precise location and time of hail precipitation and the hailstone size are included in the crowd-sourced data, assessed on the basis of the weather radar data of MeteoSwiss. Two radar-based hail detection algorithms in use at MeteoSwiss, POH (Probability of Hail) and MESHS (Maximum Expected Severe Hail Size), are confronted with the crowd-sourced data. The available data cover the investigation period from June to August 2015. Filter criteria have been applied in order to remove false reports from the crowd-sourced data. Neighborhood methods have been introduced to reduce the uncertainties which result from spatial and temporal biases. The crowd-sourced and radar data are converted into binary sequences according to previously set thresholds, allowing a categorical verification. Verification scores (e.g. hit rate) are then calculated from a 2x2 contingency table. The hail reporting activity and the patterns corresponding to "hail" and "no hail" reports sent from smartphones have been analyzed. The relationship between the reported hailstone sizes and both radar-based hail detection algorithms has been investigated.
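
    The categorical step can be sketched directly: the binary report and radar sequences are cross-tabulated into a 2x2 contingency table and summarized with standard scores such as the hit rate. The data below are synthetic placeholders.

```python
import numpy as np

def contingency_scores(observed, forecast):
    """POD, FAR and CSI from binary observation/detection sequences."""
    observed, forecast = np.asarray(observed, bool), np.asarray(forecast, bool)
    hits = np.sum(observed & forecast)
    misses = np.sum(observed & ~forecast)
    false_alarms = np.sum(~observed & forecast)
    pod = hits / (hits + misses)                 # hit rate
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

reports = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]   # crowd-sourced, after filtering
radar   = [1, 0, 0, 1, 1, 0, 1, 1, 0, 0]   # e.g. radar algorithm above threshold
print("POD=%.2f FAR=%.2f CSI=%.2f" % contingency_scores(reports, radar))
```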

  3. Coastal wind study based on Sentinel-1 and ground-based scanning lidar

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Badger, Merete; Pena Diaz, Alfredo

    Winds in the coastal zone are important for near-shore wind farm planning. Recently the Danish Energy Agency gave new options for placing offshore wind farms much closer to the coastlines than previously. The new tender areas are located from 3 to 8 km from the coast. Ground-based scanning lidar ... located on land can partly cover this area out to around 15 km. In order to improve wind farm planning for near-shore coastal areas, the project 'Reducing the Uncertainty of Near-shore Energy estimates from meso- and micro-scale wind models' (RUNE) is established. The measurement campaign starts October ... The various observation types have advantages and limitations; one advantage of both Sentinel-1 and the scanning lidar is that they both observe wind fields covering a large area and so can be combined for studying the spatial variability of winds. Sentinel-1 data are being processed near-real-time at DTU Wind

  4. Characterization of a dose verification system dedicated to radiotherapy treatments based on a silicon detector multi-strips

    International Nuclear Information System (INIS)

    Bocca, A.; Cortes Giraldo, M. A.; Gallardo, M. I.; Espino, J. M.; Aranas, R.; Abou Haidar, Z.; Alvarez, M. A. G.; Quesada, J. M.; Vega-Leal, A. P.; Perez Neto, F. J.

    2011-01-01

    In this paper, we present the characterization of a multi-strip silicon detector (SSSSD: Single-Sided Silicon Strip Detector), developed by the company Micron Semiconductors Ltd., for use as a verification system for radiotherapy treatments.

  5. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally a classifier framework has been developed, and it was found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.

  6. TH-CD-202-05: DECT Based Tissue Segmentation as Input to Monte Carlo Simulations for Proton Treatment Verification Using PET Imaging

    International Nuclear Information System (INIS)

    Berndt, B; Wuerl, M; Dedes, G; Landry, G; Parodi, K; Tessonnier, T; Schwarz, F; Kamp, F; Thieke, C; Belka, C; Reiser, M; Sommer, W; Bauer, J; Verhaegen, F

    2016-01-01

    Purpose: To improve agreement of predicted and measured positron emitter yields in patients, after proton irradiation for PET-based treatment verification, using a novel dual energy CT (DECT) tissue segmentation approach, overcoming known deficiencies from single energy CT (SECT). Methods: DECT head scans of 5 trauma patients were segmented and compared to existing decomposition methods with a first focus on the brain. For validation purposes, three brain equivalent solutions [water, white matter (WM) and grey matter (GM) – equivalent with respect to their reference carbon and oxygen contents and CT numbers at 90kVp and 150kVp] were prepared from water, ethanol, sucrose and salt. The activities of all brain solutions, measured during a PET scan after uniform proton irradiation, were compared to Monte Carlo simulations. Simulation inputs were various solution compositions obtained from different segmentation approaches from DECT, SECT scans, and known reference composition. Virtual GM solution salt concentration corrections were applied based on DECT measurements of solutions with varying salt concentration. Results: The novel tissue segmentation showed qualitative improvements in %C for patient brain scans (ground truth unavailable). The activity simulations based on reference solution compositions agree with the measurement within 3–5% (4–8 Bq/ml). These reference simulations showed an absolute activity difference between WM (20%C) and GM (10%C) to H2O (0%C) of 43 Bq/ml and 22 Bq/ml, respectively. Activity differences between reference simulations and segmented ones varied from −6 to 1 Bq/ml for DECT and −79 to 8 Bq/ml for SECT. Conclusion: Compared to the conventionally used SECT segmentation, the DECT based segmentation indicates a qualitative and quantitative improvement. In controlled solutions, a MC input based on DECT segmentation leads to better agreement with the reference. Future work will address the anticipated improvement of quantification

  7. TH-CD-202-05: DECT Based Tissue Segmentation as Input to Monte Carlo Simulations for Proton Treatment Verification Using PET Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Berndt, B; Wuerl, M; Dedes, G; Landry, G; Parodi, K [Ludwig-Maximilians-Universitaet Muenchen, Garching, DE (Germany); Tessonnier, T [Ludwig-Maximilians-Universitaet Muenchen, Garching, DE (Germany); Universitaetsklinikum Heidelberg, Heidelberg, DE (Germany); Schwarz, F; Kamp, F; Thieke, C; Belka, C; Reiser, M; Sommer, W [LMU Munich, Munich, DE (Germany); Bauer, J [Universitaetsklinikum Heidelberg, Heidelberg, DE (Germany); Heidelberg Ion-Beam Therapy Center, Heidelberg, DE (Germany); Verhaegen, F [Maastro Clinic, Maastricht (Netherlands)

    2016-06-15

    Purpose: To improve agreement of predicted and measured positron emitter yields in patients, after proton irradiation for PET-based treatment verification, using a novel dual energy CT (DECT) tissue segmentation approach, overcoming known deficiencies from single energy CT (SECT). Methods: DECT head scans of 5 trauma patients were segmented and compared to existing decomposition methods with a first focus on the brain. For validation purposes, three brain equivalent solutions [water, white matter (WM) and grey matter (GM) – equivalent with respect to their reference carbon and oxygen contents and CT numbers at 90kVp and 150kVp] were prepared from water, ethanol, sucrose and salt. The activities of all brain solutions, measured during a PET scan after uniform proton irradiation, were compared to Monte Carlo simulations. Simulation inputs were various solution compositions obtained from different segmentation approaches from DECT, SECT scans, and known reference composition. Virtual GM solution salt concentration corrections were applied based on DECT measurements of solutions with varying salt concentration. Results: The novel tissue segmentation showed qualitative improvements in %C for patient brain scans (ground truth unavailable). The activity simulations based on reference solution compositions agree with the measurement within 3–5% (4–8 Bq/ml). These reference simulations showed an absolute activity difference between WM (20%C) and GM (10%C) to H2O (0%C) of 43 Bq/ml and 22 Bq/ml, respectively. Activity differences between reference simulations and segmented ones varied from −6 to 1 Bq/ml for DECT and −79 to 8 Bq/ml for SECT. Conclusion: Compared to the conventionally used SECT segmentation, the DECT based segmentation indicates a qualitative and quantitative improvement. In controlled solutions, a MC input based on DECT segmentation leads to better agreement with the reference. Future work will address the anticipated improvement of quantification
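
    For orientation, the kind of inversion DECT enables can be sketched as a basic two-material decomposition: the HU pair measured at low and high kVp is solved for the volume fractions of two base materials. The base HU values below are assumed placeholders, and this is not the authors' segmentation algorithm, only the type of linear inversion behind it.

```python
import numpy as np

def decompose(hu_low, hu_high, base_low, base_high):
    """Solve for volume fractions f of two base materials from a DECT
    HU pair, assuming HU mixes linearly: A @ f = (hu_low, hu_high)."""
    a = np.array([base_low, base_high], float)   # rows = energies, cols = bases
    return np.linalg.solve(a, np.array([hu_low, hu_high], float))

base_low  = [38.0, 34.0]    # HU of [GM-like, WM-like] at low kVp (assumed)
base_high = [36.0, 35.5]    # HU of [GM-like, WM-like] at high kVp (assumed)
f_gm, f_wm = decompose(37.0, 35.8, base_low, base_high)  # one measured voxel
print(f"GM fraction {f_gm:.2f}, WM fraction {f_wm:.2f}")
```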

  8. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT.

    Science.gov (United States)

    Park, Justin C; Li, Jonathan G; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-04-01

    The use of sophisticated dose calculation procedures in modern radiation therapy treatment planning is inevitable in order to account for the complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of the clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving accuracy. The computational time of the finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is mathematically modeled such that the beamlets representing an arbitrary field shape no longer need to be infinitesimal or identical. As a result, it is possible to represent an arbitrary field shape with a minimal number of differently sized beamlets. In addition, the authors included model parameters to account for the rounded leaf edges and transmission of the MLC. The root mean square errors (RMSE) between the treatment planning system and conventional FSPB on a 10 × 10 cm(2) square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm(2) beamlet sizes were 4.90%, 3.19%, and 2.87%, respectively, compared with RMSEs of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm(2), where the RMSEs for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm(2) beamlet sizes were 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with RMSEs of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for MLC transmission without major discrepancy. The algorithm was also graphics processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (∼12 segments) and a volumetric modulated arc
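
    The adaptive idea can be illustrated with a toy sketch: instead of tiling an aperture with many identical infinitesimal beamlets, a few variable-size rectangular beamlets are summed, each contributing through a size-parameterized kernel. The error-function kernel, the function names, and the numbers below are illustrative assumptions for a Python sketch, not the authors' published model.

    import numpy as np
    from scipy.special import erf

    def beamlet_kernel(X, Y, cx, cy, wx, wy, sigma=0.3):
        # Size-parameterized beamlet dose: a wx-by-wy rectangle centered at
        # (cx, cy), convolved with a Gaussian penumbra of width sigma (cm).
        # This error-function form is an illustrative stand-in for the
        # paper's mathematically modeled beamlet.
        fx = 0.5 * (erf((X - cx + wx / 2) / (np.sqrt(2) * sigma))
                    - erf((X - cx - wx / 2) / (np.sqrt(2) * sigma)))
        fy = 0.5 * (erf((Y - cy + wy / 2) / (np.sqrt(2) * sigma))
                    - erf((Y - cy - wy / 2) / (np.sqrt(2) * sigma)))
        return fx * fy

    def dose_from_beamlets(x, y, beamlets):
        # Sum a *small* number of different-sized beamlets instead of many
        # identical infinitesimal ones -- the adaptive part of AB-FSPB.
        X, Y = np.meshgrid(x, y)
        dose = np.zeros_like(X)
        for cx, cy, wx, wy, weight in beamlets:
            dose += weight * beamlet_kernel(X, Y, cx, cy, wx, wy)
        return dose

    # A 10 x 10 cm field covered by a single 10 x 10 beamlet rather than
    # four hundred 0.5 x 0.5 ones.
    x = y = np.linspace(-10.0, 10.0, 201)
    dose = dose_from_beamlets(x, y, [(0.0, 0.0, 10.0, 10.0, 1.0)])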

  9. Thermal Analysis of the Driving Component Based on the Thermal Network Method in a Lunar Drilling System and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Dewei Tang

    2017-03-01

    Full Text Available The main task of the third Chinese lunar exploration project is to obtain soil samples that are greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component, based on the thermal network method (TNM), was established and the model was solved using the quasi-Newton method. A vacuum test platform was built and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized, based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the “safe working times” of every mode are given.
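
    As a rough illustration of the thermal network method, the steady-state temperatures of a few lumped nodes can be found by driving the nodal heat-balance residuals to zero with a quasi-Newton solver, as the abstract describes. The network topology, conductances, heat loads, and the use of SciPy's Broyden solver below are illustrative assumptions, not the paper's calibrated model.

    import numpy as np
    from scipy.optimize import root

    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

    def residuals(T, G, Q, eps_A, T_env):
        # Steady-state heat balance at every node of the thermal network:
        # inter-node conduction + internal dissipation + radiation to the
        # surroundings must sum to zero.
        r = np.zeros_like(T)
        n = len(T)
        for i in range(n):
            conduction = sum(G[i][j] * (T[j] - T[i]) for j in range(n) if j != i)
            radiation = eps_A[i] * SIGMA * (T_env**4 - T[i]**4)
            r[i] = conduction + radiation + Q[i]
        return r

    # Hypothetical 3-node network: motor windings, gear train, housing.
    G = [[0.0, 0.8, 0.2], [0.8, 0.0, 0.5], [0.2, 0.5, 0.0]]   # conductances, W/K
    Q = np.array([25.0, 5.0, 0.0])                            # dissipated heat, W
    eps_A = np.array([0.0, 0.0, 0.05])                        # emissivity*area, m^2
    balance = lambda T: residuals(T, G, Q, eps_A, 250.0)      # 250 K surroundings
    sol = root(balance, x0=np.full(3, 300.0), method='broyden1')  # quasi-Newton
    print(sol.x)   # predicted steady-state node temperatures, K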

  10. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search for the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
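
    A minimal sketch of the swarm idea in Python: launch many small, differently configured searches in parallel and stop as soon as any of them reports a counterexample. A real swarm run would invoke an external model checker such as SPIN with diversified search options; the configuration scheme and placeholder search below are assumptions for illustration.

    import multiprocessing as mp
    import random

    def verify(config):
        # Stand-in for one small, cheap model-checking run with its own
        # diversified search strategy (hash seed, randomized DFS order,
        # depth bound). A real swarm would run an external verifier here
        # and parse its output.
        seed, depth_bound = config
        rng = random.Random(seed)
        found_counterexample = rng.random() < 0.0   # placeholder: no bug found
        return config, found_counterexample

    if __name__ == '__main__':
        # Many differently configured searches instead of one exhaustive one.
        configs = [(seed, depth) for seed in range(32)
                   for depth in (10_000, 100_000)]
        with mp.Pool() as pool:
            for config, bug in pool.imap_unordered(verify, configs):
                if bug:
                    print('counterexample found by', config)
                    pool.terminate()
                    break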

  11. The relative importance of managerial competencies for predicting the perceived job performance of Broad-Based Black Economic Empowerment verification practitioners

    Directory of Open Access Journals (Sweden)

    Barbara M. Seate

    2016-04-01

    Full Text Available Orientation: There is a need for the growing Broad-Based Black Economic Empowerment (B-BBEE) verification industry to assess competencies and determine skills gaps for the management of the verification practitioners’ perceived job performance. Knowing which managerial competencies are important for different managerial functions is vital for developing and improving training and development programmes. Research purpose: The purpose of this study was to determine the managerial capabilities that are required of the B-BBEE verification practitioners, in order to improve their perceived job performance. Motivation for the study: The growing number of B-BBEE verification practitioners calls for more focused training and development. Generating such a training and development programme demands empirical research into the relative importance of managerial competencies. Research approach, design and method: A quantitative design using the survey approach was adopted. A questionnaire was administered to a stratified sample of 87 B-BBEE verification practitioners. Data were analysed using the Statistical Package for Social Sciences (version 22.0) and Smart Partial Least Squares software. Main findings: The results of the correlation analysis revealed that there were strong and positive associations between technical skills, interpersonal skills, compliance to standards and ethics, managerial skills and perceived job performance. Results of the regression analysis showed that managerial skills, compliance to standards and ethics and interpersonal skills were statistically significant in predicting perceived job performance. However, technical skills were insignificant in predicting perceived job performance. Practical/managerial implications: The study has shown that the B-BBEE verification industry, insofar as the technical skills of the practitioners are concerned, does have suitably qualified staff with the requisite educational qualifications. At

  12. Research on Ground Motion Metal Target Based on Rocket Projectile by Using Millimeter Wave Radiometer Technology

    Directory of Open Access Journals (Sweden)

    Zhang Dongyang

    2014-06-01

    Full Text Available How to detect ground motion metal targets effectively is an important guarantee of precision strike during rocket projectile flight. Accordingly, and in view of the millimeter-wave radiation characteristics of ground motion metal targets, a mathematical model of millimeter-wave detection of such targets from a rocket projectile was established. By varying the parameters of the rocket projectile's flight, the detection model was studied in simulation. The parameter variations and the effective range of the millimeter-wave radiometer were obtained for the rotation and horizontal flight phases, forming a theoretical basis for precision strikes against ground motion metal targets.

  13. Novel identification strategy for ground coffee adulteration based on UPLC-HRMS oligosaccharide profiling.

    Science.gov (United States)

    Cai, Tie; Ting, Hu; Jin-Lan, Zhang

    2016-01-01

    Coffee is one of the most common and most valuable beverages. According to International Coffee Organization (ICO) reports, the adulteration of coffee for financial reasons is regarded as the most serious threat to the sustainable development of the coffee market. In this work, a novel strategy for adulteration identification in ground coffee was developed based on UPLC-HRMS oligosaccharide profiling. Along with integrated statistical analysis, 17 oligosaccharide compositions were identified as markers for the identification of soybean and rice in ground coffee. This strategy, validated with manual mixtures, optimized both the reliability and authority of adulteration identification. Rice and soybean adulterants present in ground coffee in amounts as low as 5% were identified and evaluated. Some commercial ground coffees were also successfully tested using this strategy. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Development of a PC-based ground support system for a small satellite instrument

    Science.gov (United States)

    Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.

    1993-11-01

    The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal computer based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.

  15. High-precision ground-based photometry of exoplanets

    Directory of Open Access Journals (Sweden)

    de Mooij Ernst J.W.

    2013-04-01

    Full Text Available High-precision photometry of transiting exoplanet systems has contributed significantly to our understanding of the properties of their atmospheres. The best targets are the bright exoplanet systems, for which the high number of photons allows very high signal-to-noise ratios. Most current instruments are not optimised for these high-precision measurements: either they have a large read-out overhead to reduce the read noise, and/or their field-of-view is limited, preventing simultaneous observations of both the target and a reference star. Recently we have proposed a new wide-field imager for the Observatoire du Mont-Mégantic optimised for these bright systems (PI: Jayawardhana). The instrument has a dual-beam design and a field-of-view of 17' by 17'. The cameras have a read-out time of 2 seconds, significantly reducing read-out overheads. Over the past years we have gained significant experience with how to reach the high precision required for the characterisation of exoplanet atmospheres. Based on our experience we offer the following advice: Get the best calibrations possible; in the case of bad weather, characterise the instrument (e.g. non-linearity, dome flats, bias level), as this is vital for better understanding of the science data. Observe the target for as long as possible; the out-of-transit baseline is as important as the transit/eclipse itself, and a short baseline can lead to improperly corrected systematics and mis-estimation of the red noise. Keep everything (e.g. position on detector, exposure time) as stable as possible. Take care that the defocus is not too strong: for a large defocus, the contribution of the total flux from the sky background in the aperture can exceed that of the target, resulting in very strict requirements on the precision at which the background is measured.
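
    The core of the observing advice above is differential aperture photometry: measure the target and a simultaneously observed reference star in every frame, subtract a carefully estimated sky background, and normalize to the out-of-transit baseline. The following Python sketch assumes idealized inputs (fixed star positions, circular apertures) and is illustrative rather than a pipeline implementation.

    import numpy as np

    def aperture_flux(image, cx, cy, r_ap, r_in, r_out):
        # Sum the counts inside a circular aperture and subtract the median
        # sky level estimated in an annulus. With strong defocus the sky
        # contribution can rival the stellar flux, so it must be measured
        # carefully (see the advice above).
        yy, xx = np.indices(image.shape)
        rr = np.hypot(xx - cx, yy - cy)
        sky = np.median(image[(rr >= r_in) & (rr < r_out)])
        ap = rr < r_ap
        return image[ap].sum() - sky * ap.sum()

    def relative_light_curve(frames, target_xy, ref_xy,
                             r_ap=15.0, r_in=20.0, r_out=30.0):
        # Divide the target flux by that of a simultaneously observed
        # reference star to remove common-mode atmospheric and instrumental
        # variations, then normalize to the median (baseline) level.
        lc = []
        for img in frames:
            f_t = aperture_flux(img, *target_xy, r_ap, r_in, r_out)
            f_r = aperture_flux(img, *ref_xy, r_ap, r_in, r_out)
            lc.append(f_t / f_r)
        lc = np.asarray(lc)
        return lc / np.median(lc)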

  16. BigBOSS: The Ground-Based Stage IV BAO Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, David; Bebek, Chris; Heetderks, Henry; Ho, Shirley; Lampton, Michael; Levi, Michael; Mostek, Nick; Padmanabhan, Nikhil; Perlmutter, Saul; Roe, Natalie; Sholl, Michael; Smoot, George; White, Martin; Dey, Arjun; Abraham, Tony; Jannuzi, Buell; Joyce, Dick; Liang, Ming; Merrill, Mike; Olsen, Knut; Salim, Samir

    2009-04-01

    The BigBOSS experiment is a proposed DOE-NSF Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with an all-sky galaxy redshift survey. The project is designed to unlock the mystery of dark energy using existing ground-based facilities operated by NOAO. A new 4000-fiber R=5000 spectrograph covering a 3-degree diameter field will measure BAO and redshift space distortions in the distribution of galaxies and hydrogen gas spanning redshifts from 0.2 < z < 3.5. The Dark Energy Task Force figure of merit (DETF FoM) for this experiment is expected to be equal to that of a JDEM mission for BAO with the lower risk and cost typical of a ground-based experiment.

  17. MRI-based treatment planning for radiotherapy: Dosimetric verification for prostate IMRT

    International Nuclear Information System (INIS)

    Chen, Lili; Price, Robert A.; Wang Lu; Li Jinsheng; Qin Lihong; McNeeley, Shawn; Ma, C.-M. Charlie; Freedman, Gary M.; Pollack, Alan

    2004-01-01

    Purpose: Magnetic resonance (MR) and computed tomography (CT) image fusion with CT-based dose calculation is the gold standard for prostate treatment planning. MR and CT fusion with CT-based dose calculation has become a routine procedure for intensity-modulated radiation therapy (IMRT) treatment planning at Fox Chase Cancer Center. The use of MRI alone for treatment planning (or MRI simulation) will remove any errors associated with image fusion. Furthermore, it will reduce treatment cost by avoiding redundant CT scans and save patient, staff, and machine time. The purpose of this study is to investigate the dosimetric accuracy of MRI-based treatment planning for prostate IMRT. Methods and materials: A total of 30 IMRT plans for 15 patients were generated using both MRI and CT data. The MRI distortion was corrected using gradient distortion correction (GDC) software provided by the vendor (Philips Medical System, Cleveland, OH). The same internal contours were used for the paired plans. The external contours were drawn separately for the CT-based and MR imaging-based plans to evaluate the effect of any residual distortions on dosimetric accuracy. The same energy, beam angles, dose constraints, and optimization parameters were used for the dose calculations of each pair of plans using a treatment optimization system. The resulting plans were compared in terms of isodose distributions and dose-volume histograms (DVHs). Hybrid phantom plans were generated for both the CT-based plans and the MR-based plans using the same leaf sequences and associated monitor units (MUs). The physical phantom was then irradiated using the same leaf sequences to verify the dosimetric accuracy of the treatment plans. Results: Our results show that dose distributions between CT-based and MRI-based plans were equally acceptable based on our clinical criteria. The absolute dose agreement for the planning target volume was within 2% between CT-based and MR-based plans and 3% between measured dose
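
    One of the plan-comparison quantities mentioned above, the cumulative dose-volume histogram (DVH), is straightforward to compute from a dose grid and a structure mask. The Python sketch below uses synthetic dose grids and an arbitrary PTV mask purely for illustration; it is not the planning system's implementation.

    import numpy as np

    def cumulative_dvh(dose, mask, bins=200):
        # Cumulative DVH of one structure: for each dose level, the fraction
        # of the structure volume receiving at least that dose.
        d = dose[mask]
        edges = np.linspace(0.0, d.max(), bins)
        volume_fraction = np.array([(d >= e).mean() for e in edges])
        return edges, volume_fraction

    # Illustrative comparison of paired plans on the same PTV mask; the dose
    # grids here are synthetic stand-ins, not planning-system output.
    rng = np.random.default_rng(0)
    dose_ct = rng.normal(76.0, 1.5, (40, 40, 40))               # Gy
    dose_mr = dose_ct + rng.normal(0.0, 0.5, dose_ct.shape)
    ptv = np.zeros(dose_ct.shape, dtype=bool)
    ptv[10:30, 10:30, 10:30] = True
    for name, grid in (('CT', dose_ct), ('MR', dose_mr)):
        edges, vf = cumulative_dvh(grid, ptv)
        d95 = edges[np.searchsorted(-vf, -0.95)]   # first level where vf <= 95%
        print(name, 'D95 ~', round(d95, 1), 'Gy')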

  18. Static and Completion Analysis for Planning Knowledge Base Development and Verification

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  19. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based `Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects occurring in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation and extreme control of these at each stage of the integrated OPC/MDP/Mask/silicon lithography flow. The important potential sources of variation we focus on here originate in VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early and thus reduce the likelihood of costly long-loop iterations between the OPC, MDP, and wafer fabrication flows. It moreover describes how to detect regions of a layout or mask where hotspots may occur or where the robustness to intrinsic variations may be improved by modification of the OPC, choice of mask technology, or judicious design of VSB shots and dose assignment.
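
    The kind of analysis described, examining the deposited e-beam dose distribution like an aerial image and grading edges by their log-slope, can be caricatured in a few lines: rasterize the VSB shots with their dose assignments, blur with a point-spread stand-in, and evaluate the dose gradient at the printing threshold. The Gaussian blur, shot format, and metric below are simplifying assumptions, not the calibrated simulator.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def exposed_dose(shots, shape=(512, 512), blur_nm=20.0, pixel_nm=4.0):
        # Rasterize rectangular VSB shots with their assigned relative doses,
        # then blur with a Gaussian standing in for beam blur and short-range
        # scattering. A calibrated simulator would use a measured PSF instead.
        dose = np.zeros(shape)
        for x0, y0, x1, y1, d in shots:      # shot corners in pixels
            dose[y0:y1, x0:x1] += d
        return gaussian_filter(dose, sigma=blur_nm / pixel_nm)

    def edge_log_slope(dose, row, threshold=0.5):
        # Image-log-slope-style metric along one row: how steeply the dose
        # falls through the printing threshold. Steeper edges are more robust
        # to dose and process variation.
        line = dose[row]
        grad = np.abs(np.gradient(np.log(np.clip(line, 1e-6, None))))
        near_edge = np.abs(line - threshold) < 0.05
        return grad[near_edge].max() if near_edge.any() else 0.0

    dose = exposed_dose([(100, 100, 400, 200, 1.0), (150, 220, 350, 260, 1.1)])
    print(edge_log_slope(dose, row=150))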

  20. A transit timing analysis with combined ground- and space-based photometry

    Directory of Open Access Journals (Sweden)

    Raetz St.

    2015-01-01

    The CoRoT satellite looks back on six years of high-precision photometry of a very large number of stars. Thousands of transit events have been detected, of which 27 have so far been confirmed as transiting planets. In my research I search for and analyze transit timing variations (TTVs) in the CoRoT sample and combine the unprecedented precision of the light curves with ground-based follow-up photometry. Because CoRoT can observe transiting planets only for a maximum duration of 150 days, ground-based follow-up can help to refine the ephemeris. Here we present first examples.

  1. Asteroseismology of solar-type stars with Kepler: III. Ground-based data

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Molenda-Żakowicz, J.

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by the Kepler Asteroseis...

  2. Status of advanced ground-based laser interferometers for gravitational-wave detection

    International Nuclear Information System (INIS)

    Dooley, K L; Akutsu, T; Dwyer, S; Puppo, P

    2015-01-01

    Ground-based laser interferometers for gravitational-wave (GW) detection were first constructed starting 20 years ago and as of 2010 collection of several years’ worth of science data at initial design sensitivities was completed. Upgrades to the initial detectors together with construction of brand new detectors are ongoing and feature advanced technologies to improve the sensitivity to GWs. This conference proceeding provides an overview of the common design features of ground-based laser interferometric GW detectors and establishes the context for the status updates of each of the four gravitational-wave detectors around the world: Advanced LIGO, Advanced Virgo, GEO 600 and KAGRA. (paper)

  3. Status of advanced ground-based laser interferometers for gravitational-wave detection

    Science.gov (United States)

    Dooley, K. L.; Akutsu, T.; Dwyer, S.; Puppo, P.

    2015-05-01

    Ground-based laser interferometers for gravitational-wave (GW) detection were first constructed starting 20 years ago and as of 2010 collection of several years’ worth of science data at initial design sensitivities was completed. Upgrades to the initial detectors together with construction of brand new detectors are ongoing and feature advanced technologies to improve the sensitivity to GWs. This conference proceeding provides an overview of the common design features of ground-based laser interferometric GW detectors and establishes the context for the status updates of each of the four gravitational-wave detectors around the world: Advanced LIGO, Advanced Virgo, GEO 600 and KAGRA.

  4. A spent coffee grounds based biorefinery for the production of biofuels, biopolymers, antioxidants and biocomposites.

    Science.gov (United States)

    Karmee, Sanjib Kumar

    2018-02-01

    Spent coffee grounds are composed of lipids, carbohydrates, carbonaceous materials, and nitrogen-containing compounds, among others. Using n-hexane and an n-hexane/isopropanol mixture, the highest oil yields were achieved during Soxhlet extraction of oil from spent coffee grounds. Alternatively, supercritical carbon dioxide can be employed as a green solvent for the extraction of oil. Using advanced chemical and biotechnological methods, spent coffee grounds are converted to various biofuels such as biodiesel, renewable diesel, bioethanol, bioethers, bio-oil, biochar, and biogas. The in-situ transesterification of spent coffee grounds was carried out at large scale (4 kg), which led to 80-83% biodiesel yield. In addition, a large number of value-added and diversified products, viz. polyhydroxyalkanoates, biosorbent, activated carbon, polyol, polyurethane foam, carotenoid, phenolic antioxidants, and green composite, are obtained from spent coffee grounds. The principles of the circular economy are applied to develop a sustainable biorefinery based on valorisation of spent coffee grounds. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Communication grounding facility

    International Nuclear Information System (INIS)

    Lee, Gye Seong

    1998-06-01

    This book concerns communication grounding facilities and is made up of twelve chapters. It covers general grounding (purpose and materials, including thermal insulating materials), the construction of grounding, super-strength grounding methods, grounding facilities (grounding schemes and insulated construction), switched grounding with No. 1A and LCR, grounding facilities for transmission lines, wireless facility grounding, grounding facilities in wireless base stations, the grounding of power facilities, the grounding of low-tension interior power wiring, communication facilities for railroads, the installation of arresters in apartments and houses, and an introduction to arrester installation, earth conductivity, and the measurement of grounding resistance.

  6. Overview of Boundary Layer Clouds Using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Xi, B.; Dong, X.; Wu, P.; Qiu, S.

    2017-12-01

    A comprehensive summary of boundary layer cloud properties, based on several of our recent studies, will be presented. The analyses include global cloud fractions and cloud macro-/micro-physical properties based on satellite measurements using both CERES-MODIS and CloudSat/CALIPSO data products; the annual/seasonal/diurnal variations of stratocumulus clouds over different climate regions (mid-latitude land, mid-latitude ocean, and the Arctic) using DOE ARM ground-based measurements at the Southern Great Plains (SGP), Azores (GRW), and North Slope of Alaska (NSA) sites; the impact of environmental conditions on the formation and dissipation of marine boundary layer clouds at the Azores site; and the characterization of Arctic mixed-phase cloud structure and the environmental conditions favorable for the formation/maintenance of mixed-phase clouds at the NSA site. Although the presentation spans a wide range of topics, we will focus on the representativeness of the ground-based measurements over different climate regions; the evaluation of satellite-retrieved cloud properties using these ground-based measurements; and understanding the uncertainties of both satellite and ground-based retrievals and measurements.

  7. Ground-Based VIS/NIR Reflectance Spectra of 25143 Itokawa: What Hayabusa will See and How Ground-Based Data can Augment Analyses

    Science.gov (United States)

    Vilas, Faith; Abell, P. A.; Jarvis, K. S.

    2004-01-01

    Planning for the arrival of the Hayabusa spacecraft at asteroid 25143 Itokawa includes consideration of the spectral information expected to be obtained using the AMICA and NIRS instruments. The rotationally-resolved spatial coverage of the asteroid that we have obtained with ground-based telescopic spectrophotometry in the visible and near-infrared can be utilized here to address expected spacecraft data. We use spectrophotometry to simulate the types of data that Hayabusa will receive with the NIRS and AMICA instruments, and demonstrate them here. The NIRS will cover a wavelength range starting at 0.85 micrometers and have a dispersion per element of 250 Angstroms. Thus, we are limited in coverage of the 1.0-micrometer and 2.0-micrometer mafic silicate absorption features. The ground-based reflectance spectra of Itokawa show a large component of olivine in its surface material, and the 2.0-micrometer feature is shallow. Determining the olivine-to-pyroxene abundance ratio is critically dependent on the attributes of the 1.0- and 2.0-micrometer features. With a cut-off near 2.1 micrometers, the longer edge of the 2.0-micrometer feature will not be obtained by NIRS. Reflectance spectra obtained using ground-based telescopes can be used to determine the regional composition around space-based spectral observations, and possibly augment the longer-wavelength spectral attributes. Similarly, the shorter-wavelength end of the 1.0-micrometer absorption feature will be partially lost to the NIRS. The AMICA filters mimic the ECAS filters and have wavelength coverage overlapping with the NIRS spectral range. We demonstrate how merging photometry from AMICA will extend the spectral coverage of the NIRS. Lessons learned from earlier spacecraft missions to asteroids should be considered.

  8. CLSI-based transference and verification of CALIPER pediatric reference intervals for 29 Ortho VITROS 5600 chemistry assays.

    Science.gov (United States)

    Higgins, Victoria; Truong, Dorothy; Woroch, Amy; Chan, Man Khun; Tahmasebi, Houman; Adeli, Khosrow

    2018-03-01

    Evidence-based reference intervals (RIs) are essential to accurately interpret pediatric laboratory test results. To fill gaps in pediatric RIs, the Canadian Laboratory Initiative on Pediatric Reference Intervals (CALIPER) project developed an age- and sex-specific pediatric RI database based on healthy pediatric subjects. Originally established for Abbott ARCHITECT assays, CALIPER RIs were transferred to assays on Beckman, Roche, Siemens, and Ortho analytical platforms. This study provides transferred reference intervals for 29 biochemical assays for the Ortho VITROS 5600 Chemistry System (Ortho). Based on Clinical Laboratory Standards Institute (CLSI) guidelines, a method comparison analysis was performed by measuring approximately 200 patient serum samples using Abbott and Ortho assays. The equation of the line of best fit was calculated and the appropriateness of the linear model was assessed. This equation was used to transfer RIs from Abbott to Ortho assays. Transferred RIs were verified using 84 healthy pediatric serum samples from the CALIPER cohort. RIs for most chemistry analytes successfully transferred from Abbott to Ortho assays. Calcium and CO2 did not meet the statistical criteria for transference (r2). Of the transferred reference intervals, 29 were successfully verified, with approximately 90% of results from reference samples falling within the transferred confidence limits. Transferred RIs for total bilirubin, magnesium, and LDH did not meet verification criteria and are not reported. This study broadens the utility of the CALIPER pediatric RI database to laboratories using Ortho VITROS 5600 biochemical assays. Clinical laboratories should verify CALIPER reference intervals for their specific analytical platform and local population as recommended by CLSI. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
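
    The CLSI transference procedure sketched in the abstract — fit a line of best fit to the paired method-comparison results, map the existing reference limits through it, then check that enough healthy-donor results fall inside the transferred interval — can be outlined as follows. All data, the 90% acceptance threshold, and the function names are illustrative assumptions, not the CALIPER implementation.

    import numpy as np

    def transfer_interval(x_abbott, y_ortho, lower, upper):
        # Fit the line of best fit y = a*x + b to the paired method-comparison
        # results, then map the existing (Abbott) reference limits onto the
        # new (Ortho) assay through that equation.
        a, b = np.polyfit(x_abbott, y_ortho, 1)
        return a * lower + b, a * upper + b

    def verify_interval(healthy_results, lower, upper, threshold=0.90):
        # Verification step: accept the transferred interval when a large
        # enough fraction of healthy-donor results falls inside it. The
        # exact acceptance rule is an assumption; the abstract reports
        # roughly 90% of results within the transferred limits.
        inside = np.mean((healthy_results >= lower) & (healthy_results <= upper))
        return inside >= threshold, inside

    # Illustrative synthetic data, not CALIPER values.
    rng = np.random.default_rng(1)
    abbott = rng.normal(5.0, 1.0, 200)                      # ~200 patient samples
    ortho = 1.05 * abbott + 0.2 + rng.normal(0.0, 0.1, 200)
    lo, hi = transfer_interval(abbott, ortho, 3.0, 7.0)
    healthy = 1.05 * rng.normal(5.0, 0.8, 84) + 0.2         # 84 CALIPER-like sera
    accepted, fraction = verify_interval(healthy, lo, hi)
    print(lo, hi, accepted, round(fraction, 2))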

  9. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    Science.gov (United States)

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  10. Hanford ground-water data base management guide and user's manual

    International Nuclear Information System (INIS)

    Mitchell, P.J.; Argo, R.S.; Bradymire, S.L.; Newbill, C.A.

    1985-05-01

    This management guide and user's manual is a working document for the computerized Hanford Ground-water Data Base maintained by the Geosciences Research and Engineering Department at Pacific Northwest Laboratory for the Hanford Ground-Water Surveillance Program. The program is managed by the Occupational and Environmental Protection Department for the US Department of Energy. The data base is maintained to provide rapid access to data that are routinely collected from ground-water monitoring wells at the Hanford site. The data include water levels, sample analyses, geologic descriptions and well construction information for over 3000 existing or destroyed wells. These data are used to monitor water quality and to evaluate ground-water flow and pollutant transport problems. The management guide gives instructions for maintenance of the data base on the Digital Equipment Corporation PDP 11/70 computer using the CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) data base management software developed at Pacific Northwest Laboratory. Maintenance activities include inserting, modifying and deleting data, making back-up copies of the data base, and generating tables for annual monitoring reports. The user's guide includes instructions for running programs to retrieve the data in the form of listings or graphical plots. 3 refs

  11. (Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio)

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    An environmental investigation of ground water conditions has been undertaken at Wright-Patterson Air Force Base (WPAFB), Ohio to obtain data to assist in the evaluation of a potential removal action to prevent, to the extent practicable, migration of the contaminated ground water across Base boundaries. Field investigations were limited to the central section of the southwestern boundary of Area C and the Springfield Pike boundary of Area B. Further, the study was limited to a maximum depth of 150 feet below grade. The three primary activities of the field investigation were: (1) installation of 22 monitoring wells, (2) collection and analysis of ground water from 71 locations, and (3) measurement of ground water elevations at 69 locations. Volatile organic compounds including trichloroethylene, perchloroethylene, and/or vinyl chloride were detected in concentrations exceeding Maximum Contaminant Levels (MCLs) at three locations within the Area C investigation area. Ground water at the Springfield Pike boundary of Area B occurs in two primary units separated by a thicker-than-expected clay layer. One well within Area B was determined to exceed the MCL for trichloroethylene.

  12. Ground Control Point - Wireless System Network for UAV-based environmental monitoring applications

    Science.gov (United States)

    Mejia-Aguilar, Abraham

    2016-04-01

    In recent years, Unmanned Aerial Vehicles (UAVs) have seen widespread civil application, including use for survey and monitoring services in areas such as agriculture, construction and civil engineering, private surveillance and reconnaissance services, and cultural heritage management. Most aerial monitoring services require the integration of information acquired during the flight (such as imagery) with ground-based information (such as GPS information) for improved ground truth validation. For example, to obtain an accurate 3D model and Digital Elevation Model based on aerial imagery, it is necessary to include ground-based coordinate points, which are normally acquired with surveying methods based on the Global Positioning System (GPS). However, GPS surveys are very time consuming, and especially for longer monitoring time series, repeated GPS surveys are necessary. In order to improve the speed of data collection and integration, this work presents an autonomous system based on Waspmote technologies, built on single nodes interlinked in a Wireless Sensor Network (WSN) star topology, for ground-based information collection and later integration with surveying data obtained by UAV. The nodes are designed to be visible from the air and to resist extreme weather conditions with low power consumption. In addition, the nodes are equipped with GPS as well as an Inertial Measurement Unit (IMU), accelerometer, temperature and soil moisture sensors, and thus provide significant advantages in a broad range of applications for environmental monitoring. For our purpose, the WSN transmits the environmental data via 3G/GPRS to a database at regular time intervals. This project provides a detailed case study and implementation of a Ground Control Point System Network for UAV-based vegetation monitoring of dry mountain grassland in the Matsch valley, Italy.

  13. Kinematic analysis and experimental verification of an eccentric wheel based precision alignment mechanism for LINAC

    International Nuclear Information System (INIS)

    Mundra, G.; Jain, V.; Singh, K.K.; Saxena, P.; Khare, R.K.; Bagre, M.

    2011-01-01

    An eccentric wheel based precision alignment system was designed for the remote motorized alignment of the proposed proton injector LINAC (SFDTL). As part of the further development of the alignment and monitoring scheme, a menu-driven alignment system is being developed. The paper describes a general kinematic equation (with base line tilt correction) based on the various parameters of the mechanism, such as eccentricity, wheel diameter, distance between the wheels, and the diameter of the cylindrical accelerator component. Based on this equation, the extent of the alignment range for the 4 degrees of freedom is evaluated, and an analysis of some parameter variations and the theoretical accuracy/resolution is computed. For the same, a computer program is written which can compute the various points for each discrete position of the two-motor combinations. The paper also describes the experimentally evaluated values of these positions (for the full extent of the area) and the matching/comparison of the two data sets. These data can now be used for the movement computation required for the alignment of the four motors (two front and two rear motors of the support structure). (author)
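
    The geometry behind such a kinematic equation can be sketched as follows: each eccentric wheel's center orbits its fixed axle as the motor turns, and the supported cylindrical component's center lies at a fixed distance (component radius plus wheel radius) from both wheel centers. The dimensions and the simplified planar, no-slip geometry in this Python sketch are illustrative assumptions, not the published equation.

    import numpy as np

    def wheel_center(x_axle, e, theta):
        # An eccentric wheel turns about a fixed axle offset from its
        # geometric center by the eccentricity e, so the center orbits
        # the axle as the motor rotates.
        return np.array([x_axle + e * np.sin(theta), e * np.cos(theta)])

    def component_center(theta1, theta2, x1=-0.15, x2=0.15,
                         e=0.005, r=0.1, R=0.3):
        # The cylindrical accelerator section (radius R) rests on two wheels
        # (radius r), so its center lies at distance R + r from both wheel
        # centers: a circle-circle intersection. Dimensions in meters are
        # illustrative, not the LINAC values.
        c1 = wheel_center(x1, e, theta1)
        c2 = wheel_center(x2, e, theta2)
        chord = c2 - c1
        d = np.linalg.norm(chord)
        L = R + r
        mid = (c1 + c2) / 2.0
        h = np.sqrt(L**2 - (d / 2.0)**2)         # apex height above the chord
        n = np.array([-chord[1], chord[0]]) / d  # unit normal to the chord
        if n[1] < 0:
            n = -n                               # keep the upper intersection
        return mid + h * n

    # Equal wheel rotations raise the component; opposite rotations shift it.
    print(component_center(0.1, 0.1), component_center(0.1, -0.1))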

  14. Measurement Verification of Plane Wave Synthesis Technique Based on Multi-probe MIMO-OTA Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; Nielsen, Jesper Ødum

    2012-01-01

    Standardization work for MIMO OTA testing methods is currently ongoing, where a multi-probe anechoic chamber based solution is an important candidate. In this paper, the probes located on an OTA ring are used to synthesize a plane wave field in the center of the OTA ring. This paper investigates...

  15. Verification of Emulated Channels in Multi-Probe Based MIMO OTA Testing Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; Nielsen, Jesper Ødum

    2013-01-01

    Standardization work for MIMO OTA testing methods is currently ongoing, where a multi-probe anechoic chamber based solution is an important candidate. In this paper, the probes located on an OTA ring are used to synthesize a plane wave field in the center of the OTA ring. This paper investigates...

  16. Verification of MENDL2 and IEAF-2001 Data bases at intermediate energies

    Energy Technology Data Exchange (ETDEWEB)

    Titarenko, Y. E. (Yury E.); Batyaev, V. F. (Vyacheslav F.); Karpikhin, E. I. (Evgeny I.); Zhivun, V. M. (Valery M.); Koldobsky, A. B. (Aleksander B.); Mulambetov, R. D. (Ruslan D.); Mulambetova, S. V.; Trebukhovsky, Y. V. (Yury V.); Zaitsev, S. L.; Lipatov, K. A.; Mashnik, S. G. (Stepan G.); Prael, R. E. (Richard E.)

    2004-01-01

    The work presents results of computer simulations of two experiments whose aim was to measure the threshold activation reaction rates in {sup 12}C, {sup 19}F, {sup 27}Al, {sup 59}Co, {sup 63}Cu, {sup 65}Cu, {sup 64}Zn, {sup 93}Nb, {sup 115}In, {sup 169}Tm, {sup 181}Ta, {sup 197}Au, and {sup 209}Bi thin samples placed inside and outside a 0.8-GeV proton-irradiated 4-cm thick W target and a 92-cm thick W-Na composite target, both of 15-cm diameter. In total, more than 1000 values of activation reaction rates were determined in the two experiments. The measured data were compared with results from the LAHET code using several nuclear data bases for the respective excitation functions, namely, ENDF/B6 for cross sections of neutrons at energies below 20 MeV and MENDL2 together with MENDL2P for cross sections of protons and neutrons of 20 to 100 MeV energies. The recently developed IEAF-2001 data base, which provides neutron cross sections up to 150 MeV, was used as well. Simulation-to-experiment results obtained using MENDL2 and IEAF-2001 are presented. The agreement between simulation and experiment was found to be satisfactory for both data bases. Nevertheless, further studies should be conducted to improve simulations of the production of secondary protons and high-energy neutrons, as well as of high-energy neutron elastic scattering. Our results allow drawing some conclusions concerning the reliability of the transport codes and data bases used to simulate Accelerator Driven Systems (ADS), particularly those with Na-cooled W targets. The high-energy threshold excitation functions to be used in activation-based unfolding of neutron spectra inside the ADS can also be inferred from our results.

  17. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    International Nuclear Information System (INIS)

    Magazzù, G; Borgese, G; Costantino, N; Fanucci, L; Saponara, S; Incandela, J

    2013-01-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for the control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After a description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setup of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  18. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    Science.gov (United States)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for the control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After a description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setup of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  19. OGLE-2015-BLG-0196: GROUND-BASED GRAVITATIONAL MICROLENS PARALLAX CONFIRMED BY SPACE-BASED OBSERVATION

    Energy Technology Data Exchange (ETDEWEB)

    Han, C. [Department of Physics, Chungbuk National University, Cheongju 361-763 (Korea, Republic of); Udalski, A.; Szymański, M. K.; Soszyński, I.; Skowron, J.; Mróz, P.; Poleski, R.; Pietrukowicz, P.; Kozłowski, S.; Ulaczyk, K.; Pawlak, M. [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Gould, A.; Zhu, Wei; Fausnaugh, M.; Gaudi, B. S. [Department of Astronomy, Ohio State University, 140 W. 18th Ave., Columbus, OH 43210 (United States); Yee, J. C. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States); Beichman, C. [NASA Exoplanet Science Institute, MS 100-22, California Institute of Technology, Pasadena, CA 91125 (United States); Novati, S. Calchi [Dipartimento di Fisica “E. R. Caianiello,” Università di Salerno, Via Giovanni Paolo II, I-84084 Fisciano (Italy); Carey, S. [Spitzer Science Center, MS 220-6, California Institute of Technology, Pasadena, CA (United States); Bryden, C. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Dr., Pasadena, CA 91109 (United States); Collaboration: OGLE Collaboration; Spitzer Microlensing Team; and others

    2017-01-01

    In this paper, we present an analysis of the binary gravitational microlensing event OGLE-2015-BLG-0196. The event lasted for almost a year, and the light curve exhibited significant deviations from the lensing model based on rectilinear lens-source relative motion, enabling us to measure the microlens parallax. The ground-based microlens parallax is confirmed by data obtained from space-based microlens observations using the Spitzer telescope. By additionally measuring the angular Einstein radius from the analysis of the resolved caustic crossing, the physical parameters of the lens are determined up to the twofold degeneracy between the u {sub 0} < 0 and u {sub 0} > 0 solutions caused by the well-known “ecliptic” degeneracy. It is found that the binary lens is composed of two M dwarf stars with similar masses, M {sub 1} = 0.38 ± 0.04 M {sub ⊙} (0.50 ± 0.05 M {sub ⊙}) and M {sub 2} = 0.38 ± 0.04 M {sub ⊙} (0.55 ± 0.06 M {sub ⊙}), and the distance to the lens is D {sub L} = 2.77 ± 0.23 kpc (3.30 ± 0.29 kpc). Here the physical parameters outside and inside the parentheses are for the u {sub 0} < 0 and u {sub 0} > 0 solutions, respectively.

  20. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, J [Taishan Medical University, Taian, Shandong (China); Washington University in St Louis, St Louis, MO (United States); Li, H. Harold; Zhang, T; Yang, D [Washington University in St Louis, St Louis, MO (United States); Ma, F [Taishan Medical University, Taian, Shandong (China)

    2015-06-15

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on the contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limiting parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.
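
    The processing chain described — high-pass filtering by subtracting a weighted Gaussian-smoothed copy, then CLAHE, with the three parameters tuned to maximize the entropy of the result — can be sketched with OpenCV and SciPy. The pipeline details, parameter bounds, the hypothetical input file, and the use of SciPy's trust-constr method in place of the interior-point routine the authors cite are all assumptions for illustration.

    import cv2
    import numpy as np
    from scipy.optimize import minimize

    def entropy(img):
        # Shannon entropy of the 8-bit intensity histogram (the optimization
        # objective named in the abstract).
        hist = np.bincount(img.ravel(), minlength=256).astype(float)
        p = hist / hist.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def enhance(img, w, block, clip):
        # Step 1: high-pass filter by subtracting a weighted Gaussian-smoothed
        # copy. Step 2: CLAHE. (Background/noise removal is omitted here.)
        smooth = cv2.GaussianBlur(img.astype(np.float32), (0, 0), sigmaX=5.0)
        hp = cv2.normalize(img - w * smooth, None, 0, 255, cv2.NORM_MINMAX)
        clahe = cv2.createCLAHE(clipLimit=float(clip),
                                tileGridSize=(int(block), int(block)))
        return clahe.apply(hp.astype(np.uint8))

    def objective(params, img):
        w, block, clip = params
        return -entropy(enhance(img, w, block, clip))   # maximize entropy

    img = cv2.imread('setup_image.png', cv2.IMREAD_GRAYSCALE)  # hypothetical file
    res = minimize(objective, x0=[0.5, 8.0, 2.0], args=(img,),
                   bounds=[(0.0, 1.0), (2.0, 32.0), (1.0, 8.0)],
                   method='trust-constr')   # stand-in for interior-point solver
    best = enhance(img, *res.x)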

  1. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    International Nuclear Information System (INIS)

    Qiu, J; Li, H. Harold; Zhang, T; Yang, D; Ma, F

    2015-01-01

    Purpose: In RT patient setup 2D images, tissues often cannot be seen well due to the lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of the image processing filters and parameters, and are thus inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on the contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed result, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip limiting parameters. The goal of the optimization is to maximize the entropy of the processed result. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by the basic window level adjustment process, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and is able to significantly outperform the basic CLAHE algorithm and the manual window-level adjustment process that are currently used in clinical 2D image review software tools.

  2. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object-o... be integrated in computer-aided software engineering (CASE) tools for adding formally supported checking, transformation and generation facilities.

  3. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, system adaptability to small changes in the environment, portability, and compatibility. These methods vary both in their process of operation and in the way they achieve results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model-checking methods are discussed and described. The pros and cons of each particular method are emphasized, and a classification of test techniques is considered for each method. In this paper we present and analyze the characteristics and mechanisms of static dependency analysis, as well as its forms, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different paths of execution or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in the code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and also gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, along with some kinds of tools which can be applied to the software when using methods of dynamic analysis. Based on the work a conclusion is drawn, which describes the most relevant problems of analysis techniques, methods of their solutions and

  4. Abstract Interpretation-based verification/certification in the ciaoPP system

    OpenAIRE

    Puebla Sánchez, Alvaro Germán; Albert Albiol, Elvira; Hermenegildo, Manuel V.

    2005-01-01

    CiaoPP is the abstract interpretation-based preprocessor of the Ciao multi-paradigm (Constraint) Logic Programming system. It uses modular, incremental abstract interpretation as a fundamental tool to obtain information about programs. In CiaoPP, the semantic approximations thus produced have been applied to perform high- and low-level optimizations during program compilation, including transformations such as multiple abstract specialization, parallelization, partial evaluation, resource...

  5. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  6. A physically-based constitutive model for SA508-III steel: Modeling and experimental verification

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Dingqian [National Die & Mold CAD Engineering Research Center, Shanghai Jiao Tong University, 1954 Huashan Rd., Shanghai 200030 (China); Chen, Fei, E-mail: feechn@gmail.com [National Die & Mold CAD Engineering Research Center, Shanghai Jiao Tong University, 1954 Huashan Rd., Shanghai 200030 (China); Department of Mechanical, Materials and Manufacturing Engineering, University of Nottingham, Nottingham NG7 2RD (United Kingdom); Cui, Zhenshan, E-mail: cuizs@sjtu.edu.cn [National Die & Mold CAD Engineering Research Center, Shanghai Jiao Tong University, 1954 Huashan Rd., Shanghai 200030 (China)

    2015-05-14

    Due to its good toughness and high weldability, SA508-III steel has been widely used in the component manufacturing of reactor pressure vessels (RPV) and steam generators (SG). In this study, the hot deformation behaviors of SA508-III steel are investigated by isothermal hot compression tests at forming temperatures of 950–1250 °C and strain rates of 0.001–0.1 s−1, and the corresponding flow stress curves are obtained. According to the experimental results, a quantitative analysis of work hardening and dynamic softening behaviors is presented. The critical stress and critical strain for the initiation of dynamic recrystallization are calculated by setting the second derivative of the third-order polynomial to zero. Based on the classical stress–dislocation relation and the kinetics of dynamic recrystallization, a two-stage constitutive model is developed to predict the flow stress of SA508-III steel. Comparisons between the predicted and measured flow stress indicate that the established physically-based constitutive model can accurately characterize the hot deformations of the steel. Furthermore, a successful numerical simulation of the industrial upsetting process is carried out by implementing the developed constitutive model into commercial software, which evidences that the physically-based constitutive model is practical and promising for promoting the industrial forging process for nuclear components.
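
    The abstract quotes no equations. For orientation, two-stage models of this kind typically couple a dislocation-based work-hardening/dynamic-recovery law with Avrami-type kinetics of dynamic recrystallization (DRX); the generic forms are sketched below in LaTeX notation (standard expressions from the hot-working literature, not necessarily the authors' exact fitted model):

      \sigma_{WH} = \left[ \sigma_{sat}^2 + \left( \sigma_0^2 - \sigma_{sat}^2 \right) e^{-\Omega\varepsilon} \right]^{1/2}, \quad \varepsilon < \varepsilon_c

      X_{DRX} = 1 - \exp\left[ -k_d \left( \frac{\varepsilon - \varepsilon_c}{\varepsilon_p} \right)^{n_d} \right], \qquad \sigma = \sigma_{WH} - \left( \sigma_{sat} - \sigma_{ss} \right) X_{DRX}, \quad \varepsilon \ge \varepsilon_c

    where \sigma_0, \sigma_{sat} and \sigma_{ss} are the yield, saturation and steady-state stresses, \Omega is the dynamic-recovery coefficient, \varepsilon_c and \varepsilon_p are the critical and peak strains, and k_d, n_d are Avrami constants.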

  7. Mechatronics design and experimental verification of an electric-vehicle-based hybrid thermal management system

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Hung

    2016-02-01

    Full Text Available In this study, an electric-vehicle-based thermal management system was designed for dual energy sources. An experimental platform developed in a previous study was modified. Regarding the mechanical components, a heat exchanger with a radiator, a proportional valve, coolant pipes, and a coolant pump was appropriately integrated. Regarding the electric components, two heaters emulating waste heat were controlled using two programmable power supply machines. A rapid-prototyping controller with two temperature inputs and three outputs was designed. Rule-based control strategies were coded to maintain optimal temperatures for the emulated proton exchange membrane fuel cells and lithium batteries. To evaluate the heat power of the dual energy sources, driving cycles, energy management control, and efficiency maps of the energy sources were considered for deriving time-variant values. The main results are as follows: (a) an advanced mechatronics platform was constructed; (b) a driving cycle simulation was successfully conducted; and (c) coolant temperatures reached their optimal operating ranges when the proportional valve, radiator, and coolant pump were sequentially controlled. The benefits of this novel electric-vehicle-based thermal management system are (a) high-efficiency operation of the energy sources, (b) low occupied volume integrated with the energy sources, and (c) higher electric vehicle traveling mileage. This system will be integrated with real energy sources and a real electric vehicle in the future.

  8. A physically-based constitutive model for SA508-III steel: Modeling and experimental verification

    International Nuclear Information System (INIS)

    Dong, Dingqian; Chen, Fei; Cui, Zhenshan

    2015-01-01

    Due to its good toughness and high weldability, SA508-III steel has been widely used in the component manufacturing of reactor pressure vessels (RPV) and steam generators (SG). In this study, the hot deformation behaviors of SA508-III steel are investigated by isothermal hot compression tests at forming temperatures of 950–1250 °C and strain rates of 0.001–0.1 s−1, and the corresponding flow stress curves are obtained. According to the experimental results, a quantitative analysis of work hardening and dynamic softening behaviors is presented. The critical stress and critical strain for the initiation of dynamic recrystallization are calculated by setting the second derivative of the third-order polynomial to zero. Based on the classical stress–dislocation relation and the kinetics of dynamic recrystallization, a two-stage constitutive model is developed to predict the flow stress of SA508-III steel. Comparisons between the predicted and measured flow stress indicate that the established physically-based constitutive model can accurately characterize the hot deformations of the steel. Furthermore, a successful numerical simulation of the industrial upsetting process is carried out by implementing the developed constitutive model into commercial software, which evidences that the physically-based constitutive model is practical and promising for promoting the industrial forging process for nuclear components.

  9. Take-off and Landing Using Ground Based Power - Landing Simulations Using Multibody Dynamics

    NARCIS (Netherlands)

    Wu, P.; Voskuijl, M.; Van Tooren, M.J.L.

    2014-01-01

    A novel take-off and landing system using ground based power is proposed in the EUFP7 project GABRIEL. The proposed system has the potential benefit to reduce aircraft weight, emissions and noise. A preliminary investigation of the feasibility of the structural design of the connection mechanism

  10. ForestCrowns: a software tool for analyzing ground-based digital photographs of forest canopies

    Science.gov (United States)

    Matthew F. Winn; Sang-Mook Lee; Phillip A. Araman

    2013-01-01

    Canopy coverage is a key variable used to characterize forest structure. In addition, the light transmitted through the canopy is an important ecological indicator of plant and animal habitat and understory climate conditions. A common ground-based method used to document canopy coverage is to take digital photographs from below the canopy. To assist with analyzing...

  11. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Science.gov (United States)

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  12. Estimating and validating ground-based timber harvesting production through computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating the production of ground-based timber harvesting systems with an object-oriented methodology was investigated. The estimation model developed generates stands of trees; simulates chain saw, drive-to-tree feller-buncher, and swing-to-tree single-grip harvester felling, as well as grapple skidder and forwarder extraction activities; and analyzes costs and productivity. It also...

  13. On reconciling ground-based with spaceborne normalized radar cross section measurements

    DEFF Research Database (Denmark)

    Baumgartner, Francois; Munk, Jens; Jezek, K C

    2002-01-01

    This study examines differences in the normalized radar cross section, derived from ground-based versus spaceborne radar data. A simple homogeneous half-space model indicates that agreement between the two improves as 1) the distance from the scatterer is increased; and/or 2) the extinction...

  14. Validation of the CrIS fast physical NH3 retrieval with ground-based FTIR

    NARCIS (Netherlands)

    Dammers, E.; Shephard, M.W.; Palm, M.; Cady-Pereira, K.; Capps, S.; Lutsch, E.; Strong, K.; Hannigan, J.W.; Ortega, I.; Toon, G.C.; Stremme, W.; Grutter, M.; Jones, N.; Smale, D.; Siemons, J.; Hrpcek, K.; Tremblay, D.; Schaap, M.; Notholt, J.; Willem Erisman, J.

    2017-01-01

    Presented here is the validation of the CrIS (Cross-track Infrared Sounder) fast physical NH3 retrieval (CFPR) column and profile measurements using ground-based Fourier transform infrared (FTIR) observations. We use the total columns and profiles from seven FTIR sites in the Network for the

  15. A cost-performance model for ground-based optical communications receiving telescopes

    Science.gov (United States)

    Lesh, J. R.; Robinson, D. L.

    1986-01-01

    An analytical cost-performance model for a ground-based optical communications receiving telescope is presented. The model considers costs of existing telescopes as a function of diameter and field of view. This, coupled with communication performance as a function of receiver diameter and field of view, yields the appropriate telescope cost versus communication performance curve.

  16. Retrieval of liquid water cloud properties from ground-based remote sensing observations

    NARCIS (Netherlands)

    Knist, C.L.

    2014-01-01

    Accurate ground-based remotely sensed microphysical and optical properties of liquid water clouds are essential references to validate satellite-observed cloud properties and to improve cloud parameterizations in weather and climate models. This requires the evaluation of algorithms for retrieval of

  17. Modern developments for ground-based monitoring of fire behavior and effects

    Science.gov (United States)

    Colin C. Hardy; Robert Kremens; Matthew B. Dickinson

    2010-01-01

    Advances in electronic technology over the last several decades have been staggering. The cost of electronics continues to decrease while system performance increases seemingly without limit. We have applied modern techniques in sensors, electronics and instrumentation to create a suite of ground based diagnostics that can be used in laboratory (~ 1 m²), field scale...

  18. Submillimetric motion detection with a 94 GHz ground based synthetic aperture radar

    OpenAIRE

    Martinez Cervera, Arturo; Lort Cuenca, Marc; Aguasca Solé, Alberto; Broquetas Ibars, Antoni

    2015-01-01

    The paper presents the validation and experimental assessment of a 94 GHz (W-Band) CW-FM Radar that can be configured as a Ground Based SAR for high resolution imaging and interferometry. Several experimental campaigns have been carried out to assess the capability of the system to remotely observe submillimetric deformation and vibration in infrastructures.

  19. The Council of Regional Accrediting Commissions Framework for Competency-Based Education: A Grounded Theory Study

    Science.gov (United States)

    Butland, Mark James

    2017-01-01

    Colleges facing pressures to increase student outcomes while reducing costs have shown an increasing interest in competency-based education (CBE) models. Regional accreditors created a joint policy on CBE evaluation. Two years later, through this grounded theory study, I sought to understand from experts the nature of this policy, its impact, and…

  20. Ground-based forest harvesting effects on soil physical properties and Douglas-fir growth.

    Science.gov (United States)

    Adrian Ares; Thomas A. Terry; Richard E. Miller; Harry W. Anderson; Barry L. Flaming

    2005-01-01

    Soil properties and forest productivity can be affected by heavy equipment used for harvest and site preparation but these impacts vary greatly with site conditions and operational practices. We assessed the effects of ground-based logging on soil physical properties and subsequent Douglas-fir [Pseudotsuga menziesii (Mirb) Franco] growth on a highly...

  1. Design and verification of computer-based reactor control system modification at Bruce-A candu nuclear generating station

    International Nuclear Information System (INIS)

    Basu, S.; Webb, N.

    1995-01-01

    The Reactor Control System at Bruce-A Nuclear Generating Station is going through some design modifications, which involve a rigorous design process including independent verification and validation. The design modification includes changes to the control logic, alarms and annunciation, hardware and software. The design (and verification) process includes design plan, design requirements, hardware and software specifications, hardware and software design, testing, technical review, safety evaluation, reliability analysis, failure mode and effect analysis, environmental qualification, seismic qualification, software quality assurance, system validation, documentation update, configuration management, and final acceptance. (7 figs.)

  2. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments

  3. Formal Verification of Safety Buffers for State-Based Conflict Detection and Resolution

    Science.gov (United States)

    Herencia-Zapana, Heber; Jeannin, Jean-Baptiste; Munoz, Cesar A.

    2010-01-01

    The information provided by global positioning systems is never totally exact, and there are always errors when measuring position and velocity of moving objects such as aircraft. This paper studies the effects of these errors in the actual separation of aircraft in the context of state-based conflict detection and resolution. Assuming that the state information is uncertain but that bounds on the errors are known, this paper provides an analytical definition of a safety buffer and sufficient conditions under which this buffer guarantees that actual conflicts are detected and solved. The results are presented as theorems, which were formally proven using a mechanical theorem prover.
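
    The abstract does not reproduce the formal buffer definition or the proofs. The Python sketch below illustrates only the underlying idea: a state-based horizontal conflict probe made conservative by inflating the required separation with an error-dependent buffer. The buffer choice D + eps_p + T*eps_v, the function name, and the numbers are illustrative assumptions, not the paper's verified definitions.

      import math

      def conflict_within(px, py, vx, vy, D, T, eps_p=0.0, eps_v=0.0):
          # (px, py), (vx, vy): relative horizontal position/velocity of the
          # intruder w.r.t. the ownship; D: required separation; T: lookahead.
          # eps_p / eps_v: known bounds on position and velocity error.
          d_buf = D + eps_p + T * eps_v  # one conservative buffer choice
          v2 = vx * vx + vy * vy
          # time of closest horizontal approach, clipped to [0, T]
          t_star = 0.0 if v2 == 0 else min(max(-(px * vx + py * vy) / v2, 0.0), T)
          d_min = math.hypot(px + t_star * vx, py + t_star * vy)
          return d_min < d_buf

      # Head-on encounter: 10 NM apart, 60 kt closure, 5 min lookahead.
      print(conflict_within(10.0, 0.0, -60.0, 0.0, D=5.0, T=5/60, eps_p=0.2, eps_v=0.5))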

  4. Research on Signature Verification Method Based on Discrete Fréchet Distance

    Science.gov (United States)

    Fang, J. L.; Wu, W.

    2018-05-01

    This paper proposes a multi-feature signature template based on the discrete Fréchet distance, which breaks through the limitation of traditional signature authentication using a single signature feature. It addresses the heavy computational workload of global feature template extraction in online handwritten signature authentication and the problem of unreasonable signature feature selection. In the experiment, the false acceptance rate (FAR) and false rejection rate (FRR) of the signatures are calculated, and the average equal error rate (AEER) is computed. The feasibility of the combined template scheme is verified by comparing the average equal error rates of the combined template and the original template.
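
    For readers unfamiliar with the metric, the discrete Fréchet distance between two sampled pen trajectories can be computed with the standard Eiter-Mannila recurrence, sketched below in Python (the paper's feature extraction and template combination are omitted; for long signatures the recursion should be replaced by an iterative table):

      import math
      from functools import lru_cache

      def discrete_frechet(P, Q):
          # Discrete Fréchet distance between curves P and Q, each a list
          # of (x, y) sample points.
          @lru_cache(maxsize=None)
          def c(i, j):
              dij = math.dist(P[i], Q[j])  # Euclidean distance of the pair
              if i == 0 and j == 0:
                  return dij
              if i == 0:
                  return max(c(0, j - 1), dij)
              if j == 0:
                  return max(c(i - 1, 0), dij)
              return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), dij)
          return c(len(P) - 1, len(Q) - 1)

      # Toy usage with two short trajectories (hypothetical data):
      sig_a = [(0, 0), (1, 1), (2, 1), (3, 0)]
      sig_b = [(0, 0), (1, 2), (2, 2), (3, 0)]
      print(discrete_frechet(sig_a, sig_b))  # -> 1.0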

  5. Experimental verification of self-calibration radiometer based on spontaneous parametric downconversion

    Science.gov (United States)

    Gao, Dongyang; Zheng, Xiaobing; Li, Jianjun; Hu, Youbo; Xia, Maopeng; Salam, Abdul; Zhang, Peng

    2018-03-01

    Based on the spontaneous parametric downconversion process, we propose a novel self-calibration radiometer scheme which can self-calibrate the degradation of its own response and ultimately monitor the fluctuation of a target radiation. The monitoring results are independent of the radiometer's degradation and are not linked to the primary standard detector scale. The principle and feasibility of the proposed scheme were verified by observing a bromine-tungsten lamp. A relative standard deviation of 0.39 % was obtained for the stable bromine-tungsten lamp. The results demonstrate the soundness of the scheme's principle. The proposed scheme could make a significant breakthrough in the self-calibration issue on space platforms.

  6. Ground-Based Midcourse Defense (GMD) Initial Defensive Operations Capability (IDOC) at Vandenberg Air Force Base Environmental Assessment

    Science.gov (United States)

    2003-08-28


  7. Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence.

    Science.gov (United States)

    Teki, Sundeep; Barascud, Nicolas; Picard, Samuel; Payne, Christopher; Griffiths, Timothy D; Chait, Maria

    2016-09-01

    To make sense of natural acoustic environments, listeners must parse complex mixtures of sounds that vary in frequency, space, and time. Emerging work suggests that, in addition to the well-studied spectral cues for segregation, sensitivity to temporal coherence (the coincidence of sound elements in and across time) is also critical for the perceptual organization of acoustic scenes. Here, we examine pre-attentive, stimulus-driven neural processes underlying auditory figure-ground segregation using stimuli that capture the challenges of listening in complex scenes where segregation cannot be achieved based on spectral cues alone. Signals ("stochastic figure-ground": SFG) comprised a sequence of brief broadband chords containing random pure tone components that vary from one chord to another. Occasional tone repetitions across chords are perceived as "figures" popping out of a stochastic "ground." Magnetoencephalography (MEG) measurement in naïve, distracted, human subjects revealed robust evoked responses, commencing from about 150 ms after figure onset, that reflect the emergence of the "figure" from the randomly varying "ground." Neural sources underlying this bottom-up driven figure-ground segregation were localized to planum temporale and the intraparietal sulcus, demonstrating that this area, outside the "classic" auditory system, is also involved in the early stages of auditory scene analysis. © The Author 2016. Published by Oxford University Press.

  8. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)

    1998-01-01

    Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model are compared with experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al. All predicted data are inside the ±10% error band, with the mean averaged error being below 4% using the slightly modified analytical solution. The solution obtained is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental data base obtained by Liu and Theofanous. The data are predicted within a ±30% error band. (author)

  9. Formal verification of dynamic hybrid systems: a NuSMV-based model checking approach

    Directory of Open Access Journals (Sweden)

    Xu Zhi

    2018-01-01

    Full Text Available Software security is an important and challenging research topic in developing dynamic hybrid embedded software systems. Ensuring the correct behavior of these systems is particularly difficult due to the interactions between the continuous subsystem and the discrete subsystem. Currently available security analysis methods for system risks have been limited, as they rely on manual inspections of the individual subsystems under simplifying assumptions. To improve this situation, a new approach is proposed that is based on the symbolic model checking tool NuSMV. A dual PID system is used as an example system, for which the logical part and the computational part of the system are modeled in a unified manner. Constraints are constructed on the controlled object, and a counter-example path is ultimately generated, indicating that the hybrid system can be analyzed by the model checking tool.

  10. Development and verification of a compact TDC-based data acquisition system for space applications

    Energy Technology Data Exchange (ETDEWEB)

    Losekamm, Martin [Physics Department E18, Technische Universitaet Muenchen (Germany); Institute of Astronautics, Technische Universitaet Muenchen (Germany); Gaisbauer, Dominic; Konorov, Igor; Paul, Stephan; Poeschl, Thomas [Physics Department E18, Technische Universitaet Muenchen (Germany)

    2015-07-01

    The advances of solid-state detectors and in particular those for the detection of photons have made their application in space systems increasingly attractive in recent years. The use of, for example, silicon photomultipliers (SiPM) paired with a suitable scintillating material allows the development of compact and lightweight particle detectors. The Antiproton Flux in Space experiment (AFIS) intends to measure the flux of antiprotons trapped in Earth's magnetosphere aboard a nanosatellite using an active target tracking detector, consisting of plastic scintillating fibers read out by SiPMs. In order to implement a large number of detector channels while adhering to the given space, mass and power constraints, the development of a compact TDC-based data acquisition system was proposed. This talk presents a current prototype featuring 900 channels, real-time multi-channel temperature measurement and bias regulation. Possible alternative applications as well as the next steps in the development are also discussed.

  11. Development and verification of remote research environment based on 'Fusion research grid'

    International Nuclear Information System (INIS)

    Iba, Katsuyuki; Ozeki, Takahisa; Totsuka, Toshiyuki; Suzuki, Yoshio; Oshima, Takayuki; Sakata, Shinya; Sato, Minoru; Suzuki, Mitsuhiro; Hamamatsu, Kiyotaka; Kiyono, Kimihiro

    2008-01-01

    'Fusion research grid' is a concept that unites scientists and lets them collaborate effectively despite differences in time zone and location in nuclear fusion research. Fundamental technologies of 'Fusion research grid' have been developed at JAEA in the VizGrid project under the e-Japan project at the Ministry of Education, Culture, Sports, Science and Technology (MEXT). We are conscious of the need to create new systems that assist researchers with their research activities, because remote collaborations have been increasing in international projects. We have therefore developed prototype remote research environments for experiments, diagnostics, analyses and communications based on 'Fusion research grid'. All users can access these environments from anywhere because 'Fusion research grid' does not require a closed network like Super SINET to maintain security. The prototype systems were verified in experiments at JT-60U, and their availability was confirmed.

  12. Verification of simple illuminance based measures for indication of discomfort glare from windows

    DEFF Research Database (Denmark)

    Karlsen, Line Røseth; Heiselberg, Per Kvols; Bryn, Ida

    2015-01-01

    … predictions of discomfort glare from windows already in the early design stage, when decisions regarding the façade are taken. This study focuses on verifying whether simple illuminance-based measures, like vertical illuminance at eye level or horizontal illuminance at the desk, are correlated with the perceived glare reported by 44 test subjects in a repeated-measures occupant survey, and whether the reported glare corresponds with the predictions of the simple Daylight Glare Probability (DGPs) model. Large individual variations were seen in the occupants' assessment of glare in the present study. Yet, the results confirm that there is a statistically significant correlation between both vertical eye illuminance and horizontal illuminance at the desk and the occupants' perception of glare in a perimeter-zone office environment, which is promising evidence towards utilizing such simple measures for indication…

  13. Verification of a characterization method of the laser-induced selective activation based on industrial lasers

    DEFF Research Database (Denmark)

    Zhang, Yang; Hansen, Hans Nørgaard; Tang, Peter T.

    2013-01-01

    In this article, laser-induced selective activation (LISA) for subsequent autocatalytic copper plating is performed by several types of industrial-scale lasers, including a Nd:YAG laser, a UV laser, a fiber laser, a green laser, and a short-pulsed laser. Based on analysis of all the laser-machined surfaces, normalized bearing area curves and parameters are used to characterize the surfaces quantitatively. The range of normalized bearing area curve parameters for plate-able surfaces is suggested. PBT/PET with 40 % glass fiber was used as the substrate material. For all of the studied lasers, the parameters were varied over a relatively large range, and matrixes of the laser-machined surfaces were obtained. The topography of the laser-machined surfaces was examined by scanning electron microscope (SEM). For each sample examined by SEM, there was an identical workpiece plated for 90 min…

  14. Ground-Based Global Navigation Satellite System (GNSS) GLONASS Broadcast Ephemeris Data (hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) GLObal NAvigation Satellite System (GLONASS) Broadcast Ephemeris Data (hourly files)...

  15. The evaluation of a population based diffusion tensor image atlas using a ground truth method

    Science.gov (United States)

    Van Hecke, Wim; Leemans, Alexander; D'Agostino, Emiliano; De Backer, Steve; Vandervliet, Evert; Parizel, Paul M.; Sijbers, Jan

    2008-03-01

    Purpose: Voxel based morphometry (VBM) is increasingly being used to detect diffusion tensor (DT) image abnormalities in patients with different pathologies. An important requisite for these VBM studies is the use of a high-dimensional, non-rigid coregistration technique, which is able to align both the spatial and the orientational information. Recent studies furthermore indicate that high-dimensional DT information should be included during coregistration for an optimal alignment. In this context, a population based DTI atlas is created that preserves the orientational DT information robustly and contains a minimal bias towards any specific individual data set. Methods: A ground truth evaluation method is developed using a single subject DT image that is deformed with 20 deformation fields. Thereafter, an atlas is constructed based on these 20 resulting images. The non-rigid coregistration algorithm used is based on a viscous fluid model and on mutual information. The fractional anisotropy (FA) maps as well as the DT elements are used as DT image information during the coregistration algorithm, in order to minimize the orientational alignment inaccuracies. Results: The population based DT atlas is compared with the ground truth image using accuracy and precision measures of spatially and orientationally dependent metrics. Results indicate that the population based atlas preserves the orientational information in a robust way. Conclusion: A subject independent population based DT atlas is constructed and evaluated with a ground truth method. This atlas contains all available orientational information and can be used in future VBM studies as a reference system.

  16. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT

    International Nuclear Information System (INIS)

    Park, Justin C.; Li, Jonathan G.; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray

    2015-01-01

    Purpose: The use of sophisticated dose calculation procedures in modern radiation therapy treatment planning is inevitable in order to account for complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving the accuracy. Methods: The computational time of the finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is mathematically modeled such that the sizes of the beamlets representing an arbitrary field shape no longer need to be infinitesimal nor identical. As a result, it is possible to represent an arbitrary field shape with combinations of different-sized and a minimal number of beamlets. In addition, the authors included model parameters to consider the rounded edge and transmission of the MLC. Results: The root mean square error (RMSE) between the treatment planning system and conventional FSPB on a 10 × 10 cm² square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes was 4.90%, 3.19%, and 2.87%, respectively, compared with an RMSE of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm², where the RMSE for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes was 5.41%, 4.76%, and 3.54% in FSPB, respectively, compared with an RMSE of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for the MLC transmissions without major discrepancy. The algorithm was also graphics processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (∼12 segments) and a…

  17. Enhanced static ground power unit based on flying capacitor based h-bridge hybrid active-neutral-point-clamped converter

    DEFF Research Database (Denmark)

    Abarzadeh, Mostafa; Madadi Kojabadi, Hossein; Deng, Fujin

    2016-01-01

    Static power converters have various applications, such as static ground power units (GPUs) for airplanes. This study proposes a new configuration of a static GPU based on a novel nine-level flying capacitor h-bridge active-neutral-point-clamped (FCHB_ANPC) converter. The main advantages of the p…

  18. Response of base isolated structure during strong ground motions beyond design earthquakes

    International Nuclear Information System (INIS)

    Yabana, Shuichi; Ishida, Katsuhiko; Shiojiri, Hiroo

    1991-01-01

    In Japan, base-isolated structures for fast breeder reactors (FBRs) are being designed. When a base-isolated structure is designed, the relative displacement of the isolators is generally limited so that they remain in the linear range during design earthquakes. But to estimate the safety margin of a base-isolated structure, its response up to failure must be obtained, experimentally or analytically, for strong ground motions beyond the design earthquake. The aim of this paper is to investigate the response of a base-isolated structure when the stiffness of the isolators hardens, and to simulate the response during strong ground motions beyond design earthquakes. The optimum characteristics of isolators, with which the margin of the structure is increased, are discussed. (author)

  19. Efficient prediction of ground noise from helicopters and parametric studies based on acoustic mapping

    Directory of Open Access Journals (Sweden)

    Fei WANG

    2018-02-01

    Full Text Available Based on acoustic mapping, a prediction model for the ground noise radiated from an in-flight helicopter is established. To enhance calculation efficiency, a high-efficiency second-level acoustic radiation model, capable of taking the influence of atmospheric absorption on noise into account, is first developed by combining the point-source idea with the rotor noise radiation characteristics. A comparison between the present model and the direct computation method of noise is made, and the high efficiency of the model is validated. The rotor free-wake analysis method and the Ffowcs Williams-Hawkings (FW-H) equation are applied to the aerodynamics and noise prediction in the present model. Secondly, a database of noise spheres with the characteristic parameters of advance ratio and tip-path-plane angle is established by the helicopter trim model together with a parametric modeling approach. Furthermore, based on acoustic mapping, a method for rapid simulation of the ground noise radiated from an in-flight helicopter is developed. The noise footprint for the AH-1 rotor is then calculated, and the influence of parameters including advance ratio and flight path angle on ground noise is analyzed in depth using the developed model. The results suggest that with the increase of advance ratio and flight path angle, the peak noise levels on the ground first increase and then decrease; in the meantime, the maximum Sound Exposure Level (SEL) noise on the ground shifts toward the advancing side of the rotor. Besides, through analysis of the effects of longitudinal forces on miss-distance and rotor Blade-Vortex Interaction (BVI) noise in descent flight, some meaningful results for reducing the BVI noise on the ground are obtained. Keywords: Acoustic mapping, Helicopter, Noise footprint, Rotor noise, Second-level acoustic radiation model
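
    For reference, the FW-H equation used in such rotor-noise predictions has the standard form (the general aeroacoustic relation, stated here for orientation; the paper's implementation details are not reproduced):

      \left( \frac{1}{c_0^2} \frac{\partial^2}{\partial t^2} - \nabla^2 \right) p'(\mathbf{x},t) = \frac{\partial}{\partial t}\left[ \rho_0 v_n \delta(f) \right] - \frac{\partial}{\partial x_i}\left[ l_i \delta(f) \right] + \frac{\partial^2}{\partial x_i \partial x_j}\left[ T_{ij} H(f) \right]

    where f = 0 defines the moving blade surface, v_n is the surface normal velocity, l_i the local surface loading, T_ij the Lighthill stress tensor, and \delta and H the Dirac and Heaviside functions; the first two (thickness and loading) terms dominate rotor BVI noise.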

  20. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
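
    The abstract does not show the rule format. The Python fragment below illustrates only the forward-chaining mechanism on a hypothetical Horn-clause rule set (the paper's FLTL-to-rule translation and single-state monitor construction are not reproduced):

      # Rules are Horn clauses: a set of premises implies a conclusion.
      RULES = [
          ({"request", "grant"}, "served"),
          ({"served", "release"}, "ok"),
      ]

      def monitor(trace):
          facts = set()
          for event in trace:              # the trace expands one event at a time
              facts.add(event)
              changed = True
              while changed:               # forward-chain to a fixpoint
                  changed = False
                  for premises, conclusion in RULES:
                      if premises <= facts and conclusion not in facts:
                          facts.add(conclusion)
                          changed = True
              yield "ok" in facts          # verdict after each step

      print(list(monitor(["request", "grant", "release"])))  # -> [False, False, True]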

  1. Dynamic model based novel findings in power systems analysis and frequency measurement verification

    Science.gov (United States)

    Kook, Kyung Soo

    This study selects several new advanced topics in power systems and verifies their usefulness using simulation. In the study of the ratio of the equivalent reactance and resistance of bulk power systems, the simulation results give a more accurate value of X/R of the bulk power system, which explains why active power compensation is also important in voltage flicker mitigation. In the application study of the Energy Storage System (ESS) for wind power, a new model implementation of the ESS connected to wind power is proposed, and the control effect of the ESS on the intermittency of wind power is verified. This study also conducts intensive simulations to clarify the behavior of the wide-area power system frequency as well as the possibility of on-line instability detection. In our POWER IT Laboratory, since 2003, the U.S. national frequency monitoring network (FNET) has been continuously operated to monitor the wide-area power system frequency in the U.S. Using the measured frequency data, power system events are triggered, and their location and scale are estimated. This study also explores the possibility of using simulation technologies to contribute to the applications of FNET, finds similarity in the event detection orders between the frequency measurements and the simulations in the U.S. Eastern power grid, and develops a new methodology for estimating event location based on simulated N-1 contingencies using the frequency measurement. It has been pointed out that simulation results cannot fully represent the actual response of power systems, due to the inevitable limits of modeling power systems and the different operating conditions of the systems at every second. However, given the need to test such an important infrastructure supplying electric energy without taking any risk with it, software-based simulation will be the best solution to verify new technologies in…

  2. MENDL2 and IEAF-2001 nuclide production yields data bases verification at intermediate energies.

    Energy Technology Data Exchange (ETDEWEB)

    Titarenko, Y. E. (Yury E.); Batyaev, V. F. (Vyacheslav F.); Zhivun, V. M. (Valery M.); Mulambetov, R. D. (Ruslan D.); Mulambetova, S. V.; Zaitsev, S. L.; Lipatov, K. A.; Mashnik, S. G. (Stepan G.); Prael, R. E. (Richard E.)

    2004-01-01

    The work presents the results of computer simulation of two experiments whose aim was to measure threshold activation reaction rates in 12C, 19F, 27Al, 59Co, 63Cu, 65Cu, 64Zn, 93Nb, 115In, 169Tm, 181Ta, 197Au, and 209Bi thin samples placed inside and outside a 0.8-GeV proton-irradiated 4-cm thick W target and a 92-cm thick W-Na composite target, both of 15-cm diameter. In total, more than 1000 activation reaction rate values were determined in the two experiments. The measured reaction rates were compared with the rates simulated by the LAHET code using several nuclear databases for the respective excitation functions, namely MENDL2/2P for neutron/proton cross sections up to 100 MeV, and the recently developed IEAF-2001, which provides neutron cross sections up to 150 MeV. The comparison between the simulation-to-experiment agreements obtained via MENDL2 and IEAF-2001 is presented. The agreement between simulation and experiment has been found generally satisfactory for both databases. The high-energy threshold excitation functions to be used in activation-based unfolding of neutron spectra inside Accelerator Driven Systems (ADS), particularly those with Na-cooled W targets, can be inferred from the results.

  3. Experimental verification of internal dosimetry calculations: Construction of a heterogeneous phantom based on human organs

    International Nuclear Information System (INIS)

    Lauridsen, B.; Hedemann Jensen, P.

    1987-01-01

    The basic dosimetric quantity in ICRP publication no. 30 is the absorbed fraction AF(T←S). This parameter is the fraction of energy absorbed in a target organ T per emission of radiation from activity deposited in the source organ S. Based upon this fraction it is possible to calculate the Specific Effective Energy SEE(T←S). From this, the committed effective dose equivalent from an intake of radioactive material can be found, and thus the annual limit on intake for given radionuclides can be determined. A male phantom has been constructed with the aim of measuring the Specific Effective Energy SEE(T←S) in various target organs. Impressions of real human organs have been used to produce vacuum forms. Tissue-equivalent plastic sheets were sucked into the vacuum forms, producing a shell with a shape identical to the original organ. Each organ has been made of two shells. The same procedure has been used for the body. Thin tubes through the organs make it possible to place TL dosemeters in a matrix so the dose distribution can be measured. The phantom has been supplied with lungs, liver, kidneys, spleen, stomach, bladder, pancreas, and thyroid gland. To select a suitable body liquid for the phantom, laboratory experiments have been made with different liquids and different radionuclides. In these experiments the change in dose rate due to changes in density and composition of the liquid was determined. Preliminary results of the experiments are presented. (orig.)
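
    For orientation, the quantity being measured has the standard ICRP-30 form (quoted from the general formalism, not from this paper):

      SEE(T \leftarrow S) = \frac{1}{m_T} \sum_R Y_R \, \bar{E}_R \, AF(T \leftarrow S)_R \, Q_R

    where, for each radiation type R emitted in the source organ S, Y_R is the yield per transformation, \bar{E}_R the average energy, Q_R the quality factor, and m_T the mass of the target organ T.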

  4. Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop

    Science.gov (United States)

    Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.

    2018-04-01

    The data center is a new concept of data processing and application proposed in recent years. It is a new processing method based on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes the computing nodes of cluster resources and improves the efficiency of parallel data applications. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it called many computing nodes to process image storage blocks and pyramids in the background to improve the efficiency of image reading and application, and solved the need for concurrent multi-user high-speed access to remotely sensed data. It verified the rationality, reliability and superiority of the system design by testing the storage efficiency of different image data and multiple users, and by analyzing the distributed storage architecture to improve the application efficiency of remote sensing images through building an actual Hadoop service system.
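
    As a schematic of the MapReduce tiling pattern described (a hypothetical Hadoop Streaming-style job in Python, not the system's actual code), the mapper keys each pixel record to a tile and the reducer aggregates per tile; both stages are simulated in one process here:

      import sys
      from collections import defaultdict

      TILE = 256  # assumed tile edge length in pixels

      def mapper(lines):
          # each input line: "x y value"
          for line in lines:
              x, y, value = line.split()
              yield (int(x) // TILE, int(y) // TILE), float(value)

      def reducer(pairs):
          # average the values falling in each tile
          acc = defaultdict(lambda: [0.0, 0])
          for key, value in pairs:
              acc[key][0] += value
              acc[key][1] += 1
          for key, (total, n) in sorted(acc.items()):
              yield key, total / n

      if __name__ == "__main__":
          for key, mean in reducer(mapper(sys.stdin)):
              print(key, mean)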

  5. Multiple Authorities Attribute-Based Verification Mechanism for Blockchain Mircogrid Transactions

    Directory of Open Access Journals (Sweden)

    Sarmadullah Khan

    2018-05-01

    Full Text Available Recently, advancements in energy distribution models have fulfilled the needs of microgrids in finding a suitable energy distribution model between producer and consumer without the need for a central controlling authority. Most energy distribution models deal with energy transactions and losses without considering security aspects such as information tampering. The transaction data could be accessible online to keep track of the energy distribution between consumer and producer (e.g., online payment records and supplier profiles). However, this data is prone to modification and misuse if a consumer moves from one producer to another. Blockchain is considered one solution that allows users to exchange energy-related data and keep track of it without exposing it to modification. In this paper, electrical transactions embedded in the blockchain are validated using the signatures of multiple producers based on their assigned attributes. These signatures are verified and endorsed by consumers satisfying those attributes without revealing any information. The public and private keys for these consumers are generated by the producers, and the endorsement procedure using these keys ensures that these consumers are authorized. This approach does not need any central authority. To resist collision attacks, producers are given a secret pseudorandom function seed. The comparative analysis shows the efficiency of the proposed approach over existing ones.

  6. Feature-Based Attention in Early Vision for the Modulation of Figure–Ground Segregation

    OpenAIRE

    Wagatsuma, Nobuhiko; Oki, Megumi; Sakai, Ko

    2013-01-01

    We investigated psychophysically whether feature-based attention modulates the perception of figure–ground (F–G) segregation and, based on the results, we investigated computationally the neural mechanisms underlying attention modulation. In the psychophysical experiments, the attention of participants was drawn to a specific motion direction and they were then asked to judge the side of figure in an ambiguous figure with surfaces consisting of distinct motion directions. The results of these...

  7. Feature-based attention in early vision for the modulation of figure–ground segregation

    OpenAIRE

    Nobuhiko Wagatsuma; Megumi Oki; Ko Sakai

    2013-01-01

    We investigated psychophysically whether feature-based attention modulates the perception of figure–ground (F–G) segregation and, based on the results, we investigated computationally the neural mechanisms underlying attention modulation. In the psychophysical experiments, the attention of participants was drawn to a specific motion direction and they were then asked to judge the side of figure in an ambiguous figure with surfaces consisting of distinct motion directions. The results of these...

  8. Shear wave velocity-based evaluation and design of stone column improved ground for liquefaction mitigation

    Science.gov (United States)

    Zhou, Yanguo; Sun, Zhengbo; Chen, Jie; Chen, Yunmin; Chen, Renpeng

    2017-04-01

    The evaluation and design of stone column-improved ground for liquefaction mitigation is a challenging issue for the state of practice. In this paper, a shear wave velocity-based approach is proposed based on the well-defined correlations of liquefaction resistance (CRR), shear wave velocity (Vs) and void ratio (e) of sandy soils, and values of the parameters in this approach are recommended for preliminary design purposes when site-specific values are not available. The detailed procedures of pre- and post-improvement liquefaction evaluations and stone column design are given. According to this approach, the required level of ground improvement will be met once the target Vs of the soil is raised high enough (i.e., no less than the critical velocity) to resist the given earthquake loading according to the CRR-Vs relationship; this requirement is then transferred to the control of a target void ratio (i.e., the critical e) according to the Vs-e relationship. As this approach relies on the densification of the surrounding soil instead of the whole improved ground and is conservative by nature, specific considerations of the densification mechanism and effect are given, and the effects of drainage and reinforcement of stone columns are also discussed. A case study of a thermal power plant in Indonesia is introduced, where the effectiveness of the stone column-improved ground was evaluated by the proposed Vs-based method and compared with the SPT-based evaluation. The improved ground performed well and experienced no liquefaction during subsequent strong earthquakes.
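
    CRR-Vs correlations of the kind referred to are commonly written in the Andrus-Stokoe form, quoted here as background (the paper's calibrated variant and parameter values may differ):

      CRR = a \left( \frac{V_{s1}}{100} \right)^2 + b \left( \frac{1}{V_{s1}^{*} - V_{s1}} - \frac{1}{V_{s1}^{*}} \right)

    where V_{s1} is the overburden-corrected shear wave velocity, V_{s1}^{*} its limiting upper value for liquefaction occurrence (roughly 200-215 m/s for clean sands), and a and b fitting constants (about 0.022 and 2.8 in the original calibration).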

  9. Introducing the VISAGE project - Visualization for Integrated Satellite, Airborne, and Ground-based data Exploration

    Science.gov (United States)

    Gatlin, P. N.; Conover, H.; Berendes, T.; Maskey, M.; Naeger, A. R.; Wingo, S. M.

    2017-12-01

    A key component of NASA's Earth observation system is its field experiments, for intensive observation of particular weather phenomena, or for ground validation of satellite observations. These experiments collect data from a wide variety of airborne and ground-based instruments, on different spatial and temporal scales, often in unique formats. The field data are often used with high volume satellite observations that have very different spatial and temporal coverage. The challenges inherent in working with such diverse datasets make it difficult for scientists to rapidly collect and analyze the data for physical process studies and validation of satellite algorithms. The newly-funded VISAGE project will address these issues by combining and extending nascent efforts to provide on-line data fusion, exploration, analysis and delivery capabilities. A key building block is the Field Campaign Explorer (FCX), which allows users to examine data collected during field campaigns and simplifies data acquisition for event-based research. VISAGE will extend FCX's capabilities beyond interactive visualization and exploration of coincident datasets, to provide interrogation of data values and basic analyses such as ratios and differences between data fields. The project will also incorporate new, higher level fused and aggregated analysis products from the System for Integrating Multi-platform data to Build the Atmospheric column (SIMBA), which combines satellite and ground-based observations into a common gridded atmospheric column data product; and the Validation Network (VN), which compiles a nationwide database of coincident ground- and satellite-based radar measurements of precipitation for larger scale scientific analysis. The VISAGE proof-of-concept will target "golden cases" from Global Precipitation Measurement Ground Validation campaigns. This presentation will introduce the VISAGE project, initial accomplishments and near term plans.

  10. Tropospheric nitrogen dioxide column retrieval based on ground-based zenith-sky DOAS observations

    Science.gov (United States)

    Tack, F. M.; Hendrick, F.; Pinardi, G.; Fayt, C.; Van Roozendael, M.

    2013-12-01

    A retrieval approach has been developed to derive tropospheric NO2 vertical column amounts from ground-based zenith-sky measurements of scattered sunlight. Zenith radiance spectra are observed in the visible range by the BIRA-IASB Multi-Axis Differential Optical Absorption Spectroscopy (MAX-DOAS) instrument and analyzed with the DOAS technique, based on a least-squares spectral fitting. In recent years, this technique has proven to be a well-suited remote sensing tool for monitoring atmospheric trace gases. The retrieval algorithm is developed and validated on a two-month dataset acquired from June to July 2009 in the framework of the Cabauw (51.97° N, 4.93° E) Intercomparison campaign for Nitrogen Dioxide measuring Instruments (CINDI). Once fully operational, the retrieval approach can be applied to observations from stations of the Network for the Detection of Atmospheric Composition Change (NDACC). The obtained tropospheric vertical column amounts are compared with the multi-axis retrieval from the BIRA-IASB MAX-DOAS instrument and with the retrieval from a zenith-viewing-only SAOZ instrument (Système d'Analyse par Observations Zénithales), owned by the Laboratoire Atmosphères, Milieux, Observations Spatiales (LATMOS). First results show good agreement over the whole time series with the multi-axis retrieval (R = 0.82; y = 0.88x + 0.30) as well as with the SAOZ retrieval (R = 0.85; y = 0.76x + 0.28). The main error sources arise from the uncertainties in the determination of tropospheric and stratospheric air mass factors, the stratospheric NO2 abundances, and the residual amount in the reference spectrum. While zenith-sky measurements have commonly been used over the last decades for stratospheric monitoring, this study also illustrates their suitability for the retrieval of tropospheric column amounts. As long time series of zenith-sky acquisitions are available, the developed approach offers new perspectives with regard to the use of observations from the NDACC…
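
    The retrieved quantity follows the generic DOAS relation, stated here for orientation:

      VCD_{trop} = \frac{SCD_{meas} - SCD_{ref} - SCD_{strat}}{AMF_{trop}}

    i.e., the differential slant column density from the spectral fit (measured column minus the residual amount in the reference spectrum) is corrected for the stratospheric contribution and converted to a vertical column with the tropospheric air mass factor, which is where the error sources listed above enter.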

  11. Analysis of CPolSK-based FSO system working in space-to-ground channel

    Science.gov (United States)

    Su, Yuwei; Sato, Takuro

    2018-03-01

    In this article, the transmission performance of a circle polarization shift keying (CPolSK)-based free space optical (FSO) system working over a space-to-ground channel is analyzed. Formulas describing the optical polarization distortion caused by atmospheric turbulence and the communication quality in terms of signal-to-noise ratio (SNR), bit error ratio (BER) and outage probability of the proposed system are derived. Based on Stokes parameter data measured by a Japanese optical communication satellite, we evaluate the space-to-ground FSO link and simulate the system performance under a varying regime of turbulence strength. The proposed system provides a more efficient way to compensate scintillation effects in comparison with the on-off-keying (OOK)-based FSO system. These results are useful for designing and evaluating a deep-space FSO communication system.
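
    Since the analysis builds on measured Stokes parameters, recall the standard definition of the degree of polarization that such link evaluations track (a general polarization-optics relation, not specific to this paper):

      DOP = \frac{\sqrt{S_1^2 + S_2^2 + S_3^2}}{S_0}

    where S_0 is the total intensity and S_1, S_2, S_3 the polarization components of the Stokes vector; turbulence-induced depolarization lowers the DOP and thereby degrades the SNR of a polarization-keyed receiver.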

  12. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  13. Improving Agricultural Water Resources Management Using Ground-based Infrared Thermometry

    Science.gov (United States)

    Taghvaeian, S.

    2014-12-01

    Irrigated agriculture is the largest user of freshwater resources in arid/semi-arid parts of the world. Meeting rapidly growing demands in food, feed, fiber, and fuel while minimizing environmental pollution under a changing climate requires significant improvements in agricultural water management and irrigation scheduling. Although recent advances in remote sensing techniques and hydrological modeling have provided valuable information on agricultural water resources and their management, real improvements will only occur if farmers, the decision makers on the ground, are provided with simple, affordable, and practical tools to schedule irrigation events. This presentation reviews efforts in developing methods based on ground-based infrared thermometry and thermography for day-to-day management of irrigation systems. The results of research studies conducted in Colorado and Oklahoma show that ground-based remote sensing methods can be used effectively to quantify water stress and consequently trigger irrigation events. Crop water use estimates based on stress indices have also been shown to be in good agreement with estimates based on other methods (e.g. surface energy balance, root zone soil water balance, etc.). Major challenges to the adoption of this approach by agricultural producers include reduced accuracy under cloudy and humid conditions and its inability to forecast irrigation dates, which is critical knowledge since many irrigators need to decide about irrigation a few days in advance.
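
    One widely used index in this approach is the empirical Crop Water Stress Index of Idso and colleagues, computed from canopy-air temperature differences against two baselines (given here as background; the reviewed studies may use other formulations):

      CWSI = \frac{(T_c - T_a) - (T_c - T_a)_{ll}}{(T_c - T_a)_{ul} - (T_c - T_a)_{ll}}

    where T_c and T_a are canopy and air temperature, ll denotes the lower (well-watered) baseline and ul the upper (non-transpiring) limit; CWSI runs from 0 (no stress) to 1 (maximum stress), and irrigation is triggered when it exceeds a crop-specific threshold.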

  14. Toward High Altitude Airship Ground-Based Boresight Calibration of Hyperspectral Pushbroom Imaging Sensors

    Directory of Open Access Journals (Sweden)

    Aiwu Zhang

    2015-12-01

    Full Text Available The complexity of single-linear-array hyperspectral pushbroom imaging based on a high altitude airship (HAA) without a three-axis stabilized platform is much greater than that of spaceborne and airborne systems. Due to the effects of air pressure, temperature and airflow, large pitch and roll angles tend to appear frequently, which creates pushbroom images with severe geometric distortions. Thus, the in-flight calibration procedure is not appropriate for single linear pushbroom sensors on an HAA having no three-axis stabilized platform. In order to address this problem, a new ground-based boresight calibration method is proposed. Firstly, a coordinate transformation model is developed for direct georeferencing (DG) of the linear imaging sensor, and then the linear error equation is derived from it by using the Taylor expansion formula. Secondly, the boresight misalignments are worked out by using an iterative least squares method with a few ground control points (GCPs) and ground-based side-scanning experiments. The proposed method is demonstrated by three sets of experiments: (i) the stability and reliability of the method is verified through simulation-based experiments; (ii) the boresight calibration is performed using ground-based experiments; and (iii) validation is done by applying it to the orthorectification of real hyperspectral pushbroom images from a HAA Earth observation payload system developed by our research team ("LanTianHao"). The test results show that the proposed boresight calibration approach significantly improves the quality of georeferencing by reducing the geometric distortions caused by boresight misalignments to the minimum level.

  15. Ground Motion Prediction Trends For Eastern North America Based on the Next Generation Attenuation East Ground Motion Database

    Science.gov (United States)

    Cramer, C. H.; Kutliroff, J.; Dangkua, D.

    2010-12-01

    A five-year Next Generation Attenuation (NGA) East project to develop new ground motion prediction equations for stable continental regions (SCRs), including eastern North America (ENA), has begun at the Pacific Earthquake Engineering Research (PEER) Center, funded by the Nuclear Regulatory Commission (NRC), the U.S. Geological Survey (USGS), the Electric Power Research Institute (EPRI), and the Department of Energy (DOE). The initial effort focused on database design and collection of appropriate M>4 ENA broadband and accelerograph records to populate the database. Ongoing work has focused on adding records from smaller ENA earthquakes and from other SCRs such as Europe, Australia, and India. Currently, over 6500 horizontal and vertical component records from 60 ENA earthquakes have been collected and prepared (instrument response removed, filtering to the acceptable-signal band, determination of peak and spectral parameter values, quality assurance, etc.) for the database. Geological Survey of Canada (GSC) strong motion recordings, previously not available, have also been added to the NGA East database. The additional earthquakes increase the number of ground motion recordings in the 10-100 km range, particularly from the 2008 M5.2 Mt. Carmel, IL event, and the 2005 M4.7 Riviere du Loup and 2010 M5.0 Val des Bois earthquakes in Quebec, Canada. The goal is to complete the ENA database and make it available in 2011, followed by a SCR database in 2012. Comparisons of ground motion observations from four recent M5 ENA earthquakes with current ENA ground motion prediction equations (GMPEs) suggest that current GMPEs, as a group, reasonably agree with M5 observations at short periods, particularly at distances less than 200 km. However, at one second, current GMPEs overpredict M5 ground motion observations. The 2001 M7.6 Bhuj, India, earthquake provides some constraint at large magnitudes, as its geology and regional attenuation are analogous to ENA. Cramer and Kumar, 2003, have…

  16. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system, which non-nuclear-weapon States Parties to the NPT are obliged to accept, are described. Verification activities and problems in Iraq and North Korea are discussed.

  17. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system, which non-nuclear-weapon States Parties to the NPT are obliged to accept, are described. Verification activities and problems in Iraq and North Korea are discussed.

  18. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  19. Predicting Electron Population Characteristics in 2-D Using Multispectral Ground-Based Imaging

    Science.gov (United States)

    Grubbs, Guy; Michell, Robert; Samara, Marilia; Hampton, Donald; Jahn, Jorg-Micha

    2018-01-01

    Ground-based imaging and in situ sounding rocket data are compared to electron transport modeling for an active inverted-V type auroral event. The Ground-to-Rocket Electrodynamics-Electrons Correlative Experiment (GREECE) mission successfully launched from Poker Flat, Alaska, on 3 March 2014 at 11:09:50 UT and reached an apogee of approximately 335 km over the aurora. Multiple ground-based electron-multiplying charge-coupled device (EMCCD) imagers were positioned at Venetie, Alaska, and aimed toward magnetic zenith. The imagers observed the intensity of different auroral emission lines (427.8, 557.7, and 844.6 nm) at the magnetic foot point of the rocket payload. Emission line intensity data are correlated with electron characteristics measured by the GREECE onboard electron spectrometer. A modified version of the GLobal airglOW (GLOW) model is used to estimate precipitating electron characteristics based on optical emissions. GLOW predicted the electron population characteristics with 20% error given the observed spectral intensities within 10° of magnetic zenith. Predictions are within 30% of the actual values within 20° of magnetic zenith for inverted-V-type aurora. Therefore, it is argued that this technique can be used, at least in certain types of aurora, such as the inverted-V type presented here, to derive 2-D maps of electron characteristics. These can then be used to further derive 2-D maps of ionospheric parameters as a function of time, based solely on multispectral optical imaging data.

  20. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  1. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  2. [Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio

    International Nuclear Information System (INIS)

    1992-04-01

    This Removal Action System Design has been prepared as a Phase I volume for the implementation of the Phase II removal action at Wright-Patterson Air Force Base (WPAFB) near Dayton, Ohio. The objective of the removal action is to prevent, to the extent practicable, the migration of ground water contaminated with chlorinated volatile organic compounds (VOCs) across the southwest boundary of Area C. The Phase I, Volume 9 Removal Action System Design compiles the design documents prepared for the Phase II Removal Action. These documents, which are presented in appendices to Volume 9, include: Process Design, which presents the 30 percent design for the ground water treatment system (GWTS); Design Packages 1 and 2 for Earthwork and Road Construction, and the Discharge Pipeline, respectively (no drawings are included in the appendix); Design Package 3 for installation of the Ground Water Extraction Well(s); Design Package 4 for installation of the Monitoring Well Instrumentation; and Design Package 5 for installation of the Ground Water Treatment System; this Design Package is incorporated by reference because of its size.

  3. Ground target geolocation based on digital elevation model for airborne wide-area reconnaissance system

    Science.gov (United States)

    Qiao, Chuan; Ding, Yalin; Xu, Yongsen; Xiu, Jihong

    2018-01-01

    To obtain the geographical position of a ground target accurately, a geolocation algorithm based on a digital elevation model (DEM) is developed for an airborne wide-area reconnaissance system. According to the platform position and attitude information measured by the airborne position and orientation system and the gimbal angle information from the encoder, the line-of-sight pointing vector in the Earth-centered Earth-fixed coordinate frame is obtained by homogeneous coordinate transformation. The target longitude and latitude are then solved with the elliptical Earth model and the global DEM. The influences of systematic error and measurement error on the geolocation accuracy are analyzed by the Monte Carlo method. The simulation results show that this algorithm markedly improves the geolocation accuracy for ground targets in rough terrain. The geolocation accuracy for a moving ground target can be further improved by moving average filtering (MAF). The validity of the geolocation algorithm is verified by a flight test in which the aircraft flew at a geodetic height of 15,000 m with an outer gimbal angle of <47°. The geolocation root mean square error of the target trajectory is <45 m, and <7 m after MAF.
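    As a rough illustration of the two numerical steps named above, the sketch below intersects a line-of-sight ray with DEM terrain heights and smooths a moving target's track with a moving average filter (MAF). It works in a simplified local frame rather than the Earth-centered Earth-fixed frame of the paper, and dem_height is a hypothetical user-supplied lookup.

        import numpy as np

        def los_dem_intersect(p0, u, dem_height, step=30.0, max_range=5e4):
            # March from sensor position p0 along unit LOS u until the ray
            # drops below the terrain, then refine the hit by bisection.
            p0, u = np.asarray(p0, float), np.asarray(u, float)
            s, prev = 0.0, p0
            while s < max_range:
                s += step
                p = p0 + s * u
                if p[2] <= dem_height(p[0], p[1]):
                    lo, hi = prev, p
                    for _ in range(20):  # bisection refinement
                        mid = 0.5 * (lo + hi)
                        if mid[2] <= dem_height(mid[0], mid[1]):
                            hi = mid
                        else:
                            lo = mid
                    return 0.5 * (lo + hi)
                prev = p
            return None  # no terrain hit within range

        def moving_average(track, win=5):
            # MAF over an (n, 2) sequence of estimated target positions.
            kernel = np.ones(win) / win
            return np.column_stack([np.convolve(track[:, i], kernel, mode="valid")
                                    for i in range(track.shape[1])])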

  4. Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence

    Science.gov (United States)

    Teki, Sundeep; Barascud, Nicolas; Picard, Samuel; Payne, Christopher; Griffiths, Timothy D.; Chait, Maria

    2016-01-01

    To make sense of natural acoustic environments, listeners must parse complex mixtures of sounds that vary in frequency, space, and time. Emerging work suggests that, in addition to the well-studied spectral cues for segregation, sensitivity to temporal coherence—the coincidence of sound elements in and across time—is also critical for the perceptual organization of acoustic scenes. Here, we examine pre-attentive, stimulus-driven neural processes underlying auditory figure-ground segregation using stimuli that capture the challenges of listening in complex scenes where segregation cannot be achieved based on spectral cues alone. Signals (“stochastic figure-ground”: SFG) comprised a sequence of brief broadband chords containing random pure tone components that vary from one chord to another. Occasional tone repetitions across chords are perceived as “figures” popping out of a stochastic “ground.” Magnetoencephalography (MEG) measurement in naïve, distracted, human subjects revealed robust evoked responses, commencing from about 150 ms after figure onset, that reflect the emergence of the “figure” from the randomly varying “ground.” Neural sources underlying this bottom-up driven figure-ground segregation were localized to the planum temporale and the intraparietal sulcus, demonstrating that the latter area, outside the “classic” auditory system, is also involved in the early stages of auditory scene analysis. PMID:27325682
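    For concreteness, an SFG stimulus of the kind described can be synthesized roughly as below: random-tone chords form the stochastic "ground", and a few frequencies repeated across consecutive chords form the "figure". The parameter values are illustrative, not those of the study.

        import numpy as np

        def sfg(fs=44100, chord_ms=50, n_chords=40, ground_n=10,
                fig_n=4, fig_onset=20, seed=0):
            rng = np.random.default_rng(seed)
            pool = np.geomspace(180.0, 7000.0, 60)        # candidate frequencies
            n = int(fs * chord_ms / 1000)
            t = np.arange(n) / fs
            fig = rng.choice(pool, fig_n, replace=False)  # coherent components
            chords = []
            for k in range(n_chords):
                freqs = list(rng.choice(pool, ground_n, replace=False))
                if k >= fig_onset:            # the "figure" emerges here
                    freqs += list(fig)
                chord = sum(np.sin(2 * np.pi * f * t) for f in freqs)
                chords.append(chord / len(freqs))
            return np.concatenate(chords)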

  5. Topographic gradient based site characterization in India complemented by strong ground-motion spectral attributes

    KAUST Repository

    Nath, Sankar Kumar; Thingbaijam, Kiran Kumar; Adhikari, M. D.; Nayak, Avinash; Devaraj, N.; Ghosh, Soumalya K.; Mahajan, Arun K.

    2013-01-01

    We appraise the topographic-gradient approach for site classification, which employs correlations between the 30 m column-averaged shear-wave velocity and topographic gradients. Assessments based on site classifications reported from cities across India indicate that the approach is reasonably viable at the regional level. Additionally, we experiment with three techniques for site classification based on strong ground-motion recordings, namely Horizontal-to-Vertical Spectral Ratio (HVSR), Response Spectra Shape (RSS), and Horizontal-to-Vertical Response Spectral Ratio (HVRSR), at the strong motion stations located across the Himalayas and northeast India. Statistical tests on the results indicate that these three techniques broadly differentiate soil and rock sites, while RSS and HVRSR yield better signatures. The results also support the implemented site classification in the light of strong ground-motion spectral attributes observed in different parts of the globe. © 2013 Elsevier Ltd.
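    Of the three recording-based techniques named in this record, HVSR is the simplest to state concretely: the smoothed horizontal-component amplitude spectrum (geometric mean of the two horizontals) divided by the vertical-component spectrum. A minimal sketch, assuming three equal-length NumPy arrays sampled at fs Hz; the windowing and smoothing choices here are illustrative and vary between studies.

        import numpy as np

        def hvsr(ns, ew, ud, fs, smooth_bins=5):
            # Ratio of the geometric-mean horizontal spectrum to the
            # vertical spectrum, with crude boxcar smoothing.
            def amp(x):
                spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
                kernel = np.ones(smooth_bins) / smooth_bins
                return np.convolve(spec, kernel, mode="same")
            h = np.sqrt(amp(ns) * amp(ew))
            v = np.maximum(amp(ud), 1e-12)
            freqs = np.fft.rfftfreq(len(ns), d=1.0 / fs)
            return freqs, h / v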

  6. Topographic gradient based site characterization in India complemented by strong ground-motion spectral attributes

    KAUST Repository

    Nath, Sankar Kumar

    2013-12-01

    We appraise the topographic-gradient approach for site classification, which employs correlations between the 30 m column-averaged shear-wave velocity and topographic gradients. Assessments based on site classifications reported from cities across India indicate that the approach is reasonably viable at the regional level. Additionally, we experiment with three techniques for site classification based on strong ground-motion recordings, namely Horizontal-to-Vertical Spectral Ratio (HVSR), Response Spectra Shape (RSS), and Horizontal-to-Vertical Response Spectral Ratio (HVRSR), at the strong motion stations located across the Himalayas and northeast India. Statistical tests on the results indicate that these three techniques broadly differentiate soil and rock sites, while RSS and HVRSR yield better signatures. The results also support the implemented site classification in the light of strong ground-motion spectral attributes observed in different parts of the globe. © 2013 Elsevier Ltd.

  7. Ground-and satellite-based evidence of the biophysical mechanisms behind the greening Sahel

    DEFF Research Database (Denmark)

    Brandt, Martin Stefan; Mbow, Cheikh; Diouf, Abdoul A.

    2015-01-01

    After a dry period with prolonged droughts in the 1970s and 1980s, recent scientific outcome suggests that the decades of abnormally dry conditions in the Sahel have been reversed by positive anomalies in rainfall. Various remote sensing studies observed a positive trend in vegetation greenness over the last decades, which is known as the re-greening of the Sahel. However, little investment has been made in including long-term ground-based data collections to evaluate and better understand the biophysical mechanisms behind these findings. Thus, deductions on a possible increment in biomass remain speculative. Our aim is to bridge these gaps and give specifics on the biophysical background factors of the re-greening Sahel. Therefore, a trend analysis was applied on long time series (1987-2013) of satellite-based vegetation and rainfall data, as well as on ground observations of leaf biomass...

  8. Space debris removal using a high-power ground-based laser

    Energy Technology Data Exchange (ETDEWEB)

    Monroe, D.K.

    1993-12-31

    The feasibility and practicality of using a ground-based laser (GBL) to remove artificial space debris is examined. Physical constraints indicate that a reactor-pumped laser (RPL) may be best suited for this mission, because of its capability for multimegawatt output, long run times, and near-diffraction-limited initial beams. Simulations of a laser-powered debris removal system indicate that a 5-MW RPL with a 10-meter-diameter beam director and adaptive optics capabilities can deorbit 1-kg debris from space station altitudes. Larger debris can be deorbited or transferred to safer orbits after multiple laser engagements. A ground-based laser system may be the only realistic way to access and remove some 10,000 separate objects, having velocities in the neighborhood of 7 km/s, and being spatially distributed over some 10¹⁰ km³ of space.

  9. Climatological lower thermosphere winds as seen by ground-based and space-based instruments

    Directory of Open Access Journals (Sweden)

    J. M. Forbes

    2004-06-01

    Full Text Available Comparisons are made between climatological dynamic fields obtained from ground-based (GB) and space-based (SB) instruments with a view towards identifying SB/GB intercalibration issues for TIMED and other future aeronomy satellite missions. SB measurements are made from the High Resolution Doppler Imager (HRDI) instrument on the Upper Atmosphere Research Satellite (UARS). The GB data originate from meteor radars at Obninsk (55° N, 37° E), Shigaraki (35° N, 136° E) and Jakarta (6° S, 107° E) and MF spaced-antenna radars at Hawaii (22° N, 160° W), Christmas I. (2° N, 158° W) and Adelaide (35° S, 138° E). We focus on monthly-mean prevailing, diurnal and semidiurnal wind components at 96 km, averaged over the 1991-1999 period. We perform space-based (SB) analyses for 90° longitude sectors including the GB sites, as well as for the zonal mean. Taking the monthly prevailing zonal winds from these stations as a whole, on average, SB zonal winds exceed GB determinations by ~63%, whereas meridional winds are in much better agreement. The origin of this discrepancy remains unknown, and should receive high priority in initial GB/SB comparisons during the TIMED mission. We perform detailed comparisons between monthly climatologies from Jakarta and the geographically conjugate sites of Shigaraki and Adelaide, including some analyses of interannual variations. SB prevailing, diurnal and semidiurnal tides exceed those measured over Jakarta by factors, on average, of the order of 2.0, 1.6 and 1.3, respectively, for the eastward wind, although much variability exists. For the meridional component, SB/GB ratios for the diurnal and semidiurnal tide are about 1.6 and 1.7. Prevailing and tidal amplitudes at Adelaide are significantly lower than SB values, whereas similar net differences do not occur at the conjugate Northern Hemisphere location of Shigaraki. Adelaide diurnal phases lag SB phases by several hours, but excellent agreement between the two data

  10. Model-Based Knowing: How Do Students Ground Their Understanding About Climate Systems in Agent-Based Computer Models?

    Science.gov (United States)

    Markauskaite, Lina; Kelly, Nick; Jacobson, Michael J.

    2017-12-01

    This paper gives a grounded cognition account of model-based learning of complex scientific knowledge related to socio-scientific issues, such as climate change. It draws on the results from a study of high school students learning about the carbon cycle through computational agent-based models and investigates two questions: First, how do students ground their understanding about the phenomenon when they learn and solve problems with computer models? Second, what are common sources of mistakes in students' reasoning with computer models? Results show that students ground their understanding in computer models in five ways: direct observation, straight abstraction, generalisation, conceptualisation, and extension. Students also incorporate into their reasoning their knowledge and experiences that extend beyond phenomena represented in the models, such as attitudes about unsustainable carbon emission rates, human agency, external events, and the nature of computational models. The most common difficulties of the students relate to seeing the modelled scientific phenomenon and connecting results from the observations with other experiences and understandings about the phenomenon in the outside world. An important contribution of this study is the constructed coding scheme for establishing different ways of grounding, which helps to understand some challenges that students encounter when they learn about complex phenomena with agent-based computer models.

  11. (DCT-FY08) Target Detection Using Multiple Modality Airborne and Ground Based Sensors

    Science.gov (United States)

    2013-03-01

  12. The SPARC water vapor assessment II: intercomparison of satellite and ground-based microwave measurements

    Science.gov (United States)

    Nedoluha, Gerald E.; Kiefer, Michael; Lossow, Stefan; Gomez, R. Michael; Kämpfer, Niklaus; Lainer, Martin; Forkman, Peter; Christensen, Ole Martin; Oh, Jung Jin; Hartogh, Paul; Anderson, John; Bramstedt, Klaus; Dinelli, Bianca M.; Garcia-Comas, Maya; Hervig, Mark; Murtagh, Donal; Raspollini, Piera; Read, William G.; Rosenlof, Karen; Stiller, Gabriele P.; Walker, Kaley A.

    2017-12-01

    As part of the second SPARC (Stratosphere-troposphere Processes And their Role in Climate) water vapor assessment (WAVAS-II), we present measurements taken from or coincident with seven sites from which ground-based microwave instruments measure water vapor in the middle atmosphere. Six of the ground-based instruments are part of the Network for the Detection of Atmospheric Composition Change (NDACC) and provide datasets that can be used for drift and trend assessment. We compare measurements from these ground-based instruments with satellite datasets that have provided retrievals of water vapor in the lower mesosphere over extended periods since 1996. We first compare biases between the satellite and ground-based instruments from the upper stratosphere to the upper mesosphere. We then show a number of time series comparisons at 0.46 hPa, a level that is sensitive to changes in H2O and CH4 entering the stratosphere but, because almost all CH4 has been oxidized, is relatively insensitive to dynamical variations. Interannual variations and drifts are investigated with respect to both the Aura Microwave Limb Sounder (MLS; from 2004 onwards) and each instrument's climatological mean. We find that the variation in the interannual difference in the mean H2O measured by any two instruments is typically ~1%. Most of the datasets start in or after 2004 and show annual increases in H2O of 0-1 % yr-1. In particular, MLS shows a trend of between 0.5 % yr-1 and 0.7 % yr-1 at the comparison sites. However, the two longest measurement datasets used here, with measurements back to 1996, show much smaller trends of +0.1 % yr-1 (at Mauna Loa, Hawaii) and -0.1 % yr-1 (at Lauder, New Zealand).
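    The drifts quoted in % yr-1 amount to a least-squares slope expressed relative to the series mean; a one-function sketch with hypothetical inputs:

        import numpy as np

        def trend_percent_per_year(t_years, h2o_ppmv):
            # Least-squares slope expressed relative to the series mean.
            slope, _ = np.polyfit(t_years, h2o_ppmv, 1)   # ppmv per year
            return 100.0 * slope / np.mean(h2o_ppmv)      # % per year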

  13. The SPARC water vapor assessment II: intercomparison of satellite and ground-based microwave measurements

    Directory of Open Access Journals (Sweden)

    G. E. Nedoluha

    2017-12-01

    Full Text Available As part of the second SPARC (Stratosphere–troposphere Processes And their Role in Climate) water vapor assessment (WAVAS-II), we present measurements taken from or coincident with seven sites from which ground-based microwave instruments measure water vapor in the middle atmosphere. Six of the ground-based instruments are part of the Network for the Detection of Atmospheric Composition Change (NDACC) and provide datasets that can be used for drift and trend assessment. We compare measurements from these ground-based instruments with satellite datasets that have provided retrievals of water vapor in the lower mesosphere over extended periods since 1996. We first compare biases between the satellite and ground-based instruments from the upper stratosphere to the upper mesosphere. We then show a number of time series comparisons at 0.46 hPa, a level that is sensitive to changes in H2O and CH4 entering the stratosphere but, because almost all CH4 has been oxidized, is relatively insensitive to dynamical variations. Interannual variations and drifts are investigated with respect to both the Aura Microwave Limb Sounder (MLS; from 2004 onwards) and each instrument's climatological mean. We find that the variation in the interannual difference in the mean H2O measured by any two instruments is typically ∼ 1%. Most of the datasets start in or after 2004 and show annual increases in H2O of 0–1 % yr−1. In particular, MLS shows a trend of between 0.5 % yr−1 and 0.7 % yr−1 at the comparison sites. However, the two longest measurement datasets used here, with measurements back to 1996, show much smaller trends of +0.1 % yr−1 (at Mauna Loa, Hawaii) and −0.1 % yr−1 (at Lauder, New Zealand).

  14. Laser Guidestar Satellite for Ground-based Adaptive Optics Imaging of Geosynchronous Satellites and Astronomical Targets

    Science.gov (United States)

    Marlow, W. A.; Cahoy, K.; Males, J.; Carlton, A.; Yoon, H.

    2015-12-01

    Real-time observation and monitoring of geostationary (GEO) satellites with ground-based imaging systems would be an attractive alternative to fielding high cost, long lead, space-based imagers, but ground-based observations are inherently limited by atmospheric turbulence. Adaptive optics (AO) systems are used to help ground telescopes achieve diffraction-limited seeing. AO systems have historically relied on the use of bright natural guide stars or laser guide stars projected on a layer of the upper atmosphere by ground laser systems. There are several challenges with this approach, such as the sidereal motion of GEO objects relative to natural guide stars and limitations of ground-based laser guide stars; they cannot be used to correct tip-tilt, they are not point sources, and they have finite angular sizes when detected at the receiver. There is a difference between the wavefront error measured using the guide star and that of the target due to the cone effect, which also makes it difficult to use a distributed aperture system with a larger baseline to improve resolution. Inspired by previous concepts proposed by A.H. Greenaway, we present using a space-based laser guide star projected from a satellite orbiting the Earth. We show that a nanosatellite-based guide star system meets the needs for imaging GEO objects using a low power laser even from 36,000 km altitude. Satellite guide star (SGS) systems would be well above atmospheric turbulence and could provide a small angular size reference source. CubeSats offer inexpensive, frequent access to space at a fraction of the cost of traditional systems, and are now being deployed to geostationary orbits and on interplanetary trajectories. The fundamental CubeSat bus unit of 10 cm cubed can be combined in multiple units and offers a common form factor allowing for easy integration as secondary payloads on traditional launches and rapid testing of new technologies on-orbit. We describe a 6U CubeSat SGS measuring 10 cm x 20 cm x

  15. Study of the unknown hemisphere of mercury by ground-based astronomical facilities

    Science.gov (United States)

    Ksanfomality, L. V.

    2011-08-01

    The short exposure method has proved very productive in ground-based observations of Mercury. Telescopic observations with short exposures, together with computer codes for processing data arrays of many thousands of original electronic photos, make it possible to improve the resolution of images from ground-based instruments to almost the diffraction limit. The resulting composite images are comparable with images from spacecraft approaching from a distance of about 1 million km. This paper presents images of the hemisphere of Mercury in longitude sectors 90°-180°W, 215°-350°W, and 50°-90°W, including, among others, areas not covered by spacecraft cameras. For the first time a giant S basin was discovered in the longitude sector 250°-290°W, which is the largest formation of this type on the terrestrial planets. Mercury shows strong phase effects. As a result, the appearance of the surface changes completely with the planetary phase. But the choice of phase in spacecraft studies is limited by the orbital characteristics of the mission. Thus, ground-based observations of the planet provide valuable support.

  16. Intercomparison of ground-based ozone and NO2 measurements during the MANTRA 2004 campaign

    Directory of Open Access Journals (Sweden)

    K. Strong

    2007-11-01

    Full Text Available The MANTRA (Middle Atmosphere Nitrogen TRend Assessment) 2004 campaign took place in Vanscoy, Saskatchewan, Canada (52° N, 107° W) from 3 August to 15 September 2004. In support of the main balloon launch, a suite of five zenith-sky and direct-Sun-viewing UV-visible ground-based spectrometers was deployed, primarily measuring ozone and NO2 total columns. Three Fourier transform spectrometers (FTSs) that were part of the balloon payload also performed ground-based measurements of several species, including ozone. Ground-based measurements of ozone and NO2 differential slant column densities from the zenith-viewing UV-visible instruments are presented herein. They are found to partially agree within NDACC (Network for the Detection of Atmospheric Composition Change) standards for instruments certified for process studies and satellite validation. Vertical column densities of ozone from the zenith-sky UV-visible instruments, the FTSs, a Brewer spectrophotometer, and ozonesondes are compared, and found to agree within the combined error estimates of the instruments (15%). NO2 vertical column densities from two of the UV-visible instruments are compared, and are also found to agree within combined error (15%).

  17. Design and implementation of embedded hardware accelerator for diagnosing HDL-CODE in assertion-based verification environment

    Directory of Open Access Journals (Sweden)

    C. U. Ngene

    2013-08-01

    Full Text Available The use of assertions for monitoring the designer's intention in a hardware description language (HDL) model is gaining popularity, as it helps the designer to observe internal errors at the output ports of the device under verification. During verification, assertions are synthesised and the generated data are represented in tabular form. The amount of data generated can be enormous depending on the size of the code and the number of modules that constitute the code. Furthermore, manually inspecting these data to diagnose the module with a functional violation is a time-consuming process which negatively affects the overall product development time. To locate the module with a functional violation within an acceptable diagnostic time, the data processing and analysis procedure must be accelerated. In this paper a multi-array processor (hardware accelerator) was designed and implemented in a Virtex-6 field programmable gate array (FPGA), and it can be integrated into the verification environment. The design was captured in very high speed integrated circuit HDL (VHDL), synthesised with the Xilinx design suite ISE 13.1, and simulated with Xilinx ISim. The multi-array processor (MAP) executes three logical operations (AND, OR, XOR) and a one's compaction operation on arrays of data in parallel. An improvement in processing and analysis time was recorded, as compared to the manual procedure, after the multi-array processor was integrated into the verification environment. It was also found that the multi-array processor, which was developed as an Intellectual Property (IP) core, can also be used in applications where output responses and a golden model represented in the form of matrices can be compared for searching, recognition and decision-making.

  18. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    Science.gov (United States)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy to fully exploit the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. To accurately locate the position of the BP, cameras equipped with collimators (multi-slit and knife-edge) that image the prompt gamma (PG) rays emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of the multi-slit and knife-edge collimator systems for non-invasive proton beam range verification. PG imaging was simulated with a validated GATE/GEANT4 Monte Carlo code modeling the spot-scanning proton therapy and a cylindrical PMMA phantom in detail. For each spot, 10⁸ protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff region of the PG profile was fitted with a 3-line-segment curve function as the range estimate. Factors that may influence the accuracy of range detection, including the energy window setting, proton energy, phantom size, and phantom shift, were studied. Results indicated that both collimator systems achieve reasonable accuracy and good response to phantom shifts. The range predicted by the multi-slit collimator system is less affected by the proton energy, while the knife-edge collimator system achieves higher detection efficiency, leading to a smaller deviation in the predicted range. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of both systems, especially the multi-slit system. Therefore, a neutron reduction technique is needed to improve the accuracy of range verification in proton therapy.
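    The range-estimation step (fitting a 3-line-segment curve to the falloff region of the PG profile) can be sketched as below. The parameterization, a flat plateau, a linear falloff between two break points, and a flat tail, and the use of the falloff midpoint as the range surrogate are assumptions for illustration, not necessarily the authors' exact formulation.

        import numpy as np
        from scipy.optimize import curve_fit

        def three_segment(z, z1, z2, a, c):
            # Plateau a, linear falloff between break points z1 < z2, tail c.
            z = np.asarray(z, float)
            mid = a + (c - a) * (z - z1) / (z2 - z1)
            return np.where(z < z1, a, np.where(z > z2, c, mid))

        def estimate_range(z, counts):
            n = len(z)
            p0 = [z[n // 3], z[2 * n // 3], counts.max(), counts.min()]
            (z1, z2, a, c), _ = curve_fit(three_segment, z, counts, p0=p0)
            return 0.5 * (z1 + z2)  # falloff midpoint as the range surrogate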

  19. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    Science.gov (United States)

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes, as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed, followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems is reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the accuracy of the γ tool and its subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and the choice of tolerance limits for IMRT QA are made, with a focus on detecting differences between calculated and measured doses via robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA
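    As background to the γ metric the report analyzes, a compact one-dimensional sketch with a global dose-difference criterion follows; clinical implementations interpolate the evaluated distribution and work in two or three dimensions, so this is illustrative only.

        import numpy as np

        def gamma_1d(x_meas, d_meas, x_calc, d_calc,
                     dd=0.03, dta=3.0, cutoff=0.10):
            # dd: dose criterion (fraction of max dose, "global" normalization)
            # dta: distance-to-agreement criterion in mm
            # cutoff: skip measured points below this fraction of max dose
            d_ref = d_calc.max()
            vals = []
            for xm, dm in zip(x_meas, d_meas):
                if dm < cutoff * d_ref:
                    continue
                g = np.sqrt(((x_calc - xm) / dta) ** 2 +
                            ((d_calc - dm) / (dd * d_ref)) ** 2)
                vals.append(g.min())   # gamma = min combined distance
            vals = np.array(vals)
            pass_rate = float((vals <= 1.0).mean()) if vals.size else float("nan")
            return vals, pass_rate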

  20. Simulation of Ground-Water Flow and Effects of Ground-Water Irrigation on Base Flow in the Elkhorn and Loup River Basins, Nebraska

    Science.gov (United States)

    Peterson, Steven M.; Stanton, Jennifer S.; Saunders, Amanda T.; Bradley, Jesse R.

    2008-01-01

    Irrigated agriculture is vital to the livelihood of communities in the Elkhorn and Loup River Basins in Nebraska, and ground water is used to irrigate most of the cropland. Concerns about the sustainability of ground-water and surface-water resources have prompted State and regional agencies to evaluate the cumulative effects of ground-water irrigation in this area. To facilitate understanding of the effects of ground-water irrigation, a numerical computer model was developed to simulate ground-water flow and assess the effects of ground-water irrigation (including ground-water withdrawals, hereinafter referred to as pumpage, and enhanced recharge) on stream base flow. The study area covers approximately 30,800 square miles, and includes the Elkhorn River Basin upstream from Norfolk, Nebraska, and the Loup River Basin upstream from Columbus, Nebraska. The water-table aquifer consists of Quaternary-age sands and gravels and Tertiary-age silts, sands, and gravels. The simulation was constructed using one layer with 2-mile by 2-mile cell size. Simulations were constructed to represent the ground-water system before 1940 and from 1940 through 2005, and to simulate hypothetical conditions from 2006 through 2045 or 2055. The first simulation represents steady-state conditions of the system before anthropogenic effects, and then simulates the effects of early surface-water development activities and recharge of water leaking from canals during 1895 to 1940. The first simulation ends at 1940 because before that time, very little pumpage for irrigation occurred, but after that time it became increasingly commonplace. The pre-1940 simulation was calibrated against measured water levels and estimated long-term base flow, and the 1940 through 2005 simulation was calibrated against measured water-level changes and estimated long-term base flow. The calibrated 1940 through 2005 simulation was used as the basis for analyzing hypothetical scenarios to evaluate the effects of

  1. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 1: Interim report

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study; that is, it demonstrates the feasibility of applying formal analysis via application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are tantamount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog-based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria for Protection Systems for Nuclear Power Generating Stations (ANSI N42.7-1972).

  2. Methane Emissions from Bangladesh: Bridging the Gap Between Ground-based and Space-borne Estimates

    Science.gov (United States)

    Peters, C.; Bennartz, R.; Hornberger, G. M.

    2015-12-01

    Gaining an understanding of methane (CH4) emission sources and atmospheric dispersion is an essential part of climate change research. Large-scale and global studies often rely on satellite observations of column CH4 mixing ratio whereas high-spatial resolution estimates rely on ground-based measurements. Extrapolation of ground-based measurements on, for example, rice paddies to broad region scales is highly uncertain because of spatio-temporal variability. We explore the use of ground-based river stage measurements and independent satellite observations of flooded area along with satellite measurements of CH4 mixing ratio to estimate the extent of methane emissions. Bangladesh, which comprises most of the Ganges Brahmaputra Meghna (GBM) delta, is a region of particular interest for studying spatio-temporal variation of methane emissions due to (1) broadscale rice cultivation and (2) seasonal flooding and atmospheric convection during the monsoon. Bangladesh and its deltaic landscape exhibit a broad range of environmental, economic, and social circumstances that are relevant to many nations in South and Southeast Asia. We explore the seasonal enhancement of CH4 in Bangladesh using passive remote sensing spectrometer CH4 products from the SCanning Imaging Absorption SpectroMeter for Atmospheric CHartographY (SCIAMACHY) and the Atmospheric Infrared Sounder (AIRS). The seasonal variation of CH4 is compared to independent estimates of seasonal flooding from water gauge stations and space-based passive microwave water-to-land fractions from the Tropical Rainfall Measuring Mission Microwave Imager (TRMM-TMI). Annual cycles in inundation (natural and anthropogenic) and atmospheric CH4 concentrations show highly correlated seasonal signals. NOAA's HYSPLIT model is used to determine atmospheric residence time of ground CH4 fluxes. Using the satellite observations, we can narrow the large uncertainty in extrapolation of ground-based CH4 emission estimates from rice paddies

  3. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    International Nuclear Information System (INIS)

    Lips, Irene M; Dehnad, Homan; Gils, Carla H van; Boeken Kruger, Arto E; Heide, Uulke A van der; Vulpen, Marco van

    2008-01-01

    We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU toxicity and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation if the same dose constraints for the organs at risk are used.

  4. High-dose intensity-modulated radiotherapy for prostate cancer using daily fiducial marker-based position verification: acute and late toxicity in 331 patients

    Directory of Open Access Journals (Sweden)

    Boeken Kruger Arto E

    2008-05-01

    Full Text Available Abstract We evaluated the acute and late toxicity after high-dose intensity-modulated radiotherapy (IMRT) with fiducial marker-based position verification for prostate cancer. Between 2001 and 2004, 331 patients with prostate cancer received 76 Gy in 35 fractions using IMRT combined with fiducial marker-based position verification. The symptoms before treatment (pre-treatment) and weekly during treatment (acute toxicity) were scored using the Common Toxicity Criteria (CTC). The goal was to score late toxicity according to the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer (RTOG/EORTC) scale with a follow-up time of at least three years. Twenty-two percent of the patients experienced pre-treatment grade ≥ 2 genitourinary (GU) complaints and 2% experienced grade 2 gastrointestinal (GI) complaints. Acute grade 2 GU and GI toxicity occurred in 47% and 30%, respectively. Only 3% of the patients developed acute grade 3 GU toxicity and no grade ≥ 3 GI toxicity occurred. After a mean follow-up time of 47 months with a minimum of 31 months for all patients, the incidence of late grade 2 GU and GI toxicity was 21% and 9%, respectively. Grade ≥ 3 GU and GI toxicity rates were 4% and 1%, respectively, including one patient with a rectal fistula and one patient with severe hemorrhagic cystitis (both grade 4). In conclusion, high-dose intensity-modulated radiotherapy with fiducial marker-based position verification is well tolerated. The low grade ≥ 3 toxicity allows further dose escalation if the same dose constraints for the organs at risk are used.

  5. Understanding the Longitudinal Variability of Equatorial Electrodynamics using integrated Ground- and Space-based Observations

    Science.gov (United States)

    Yizengaw, E.; Moldwin, M.; Zesta, E.

    2015-12-01

    The currently funded African Meridian B-Field Education and Research (AMBER) magnetometer array comprises more than thirteen magnetometers stationed globally in the vicinity of the geomagnetic equator. One of the main objectives of the AMBER network is to understand the longitudinal variability of equatorial electrodynamics as a function of local time, magnetic activity, and season. While providing complete meridian observation in the region and filling the largest land-based gap in global magnetometer coverage, the AMBER array addresses two fundamental areas of space physics: first, the processes governing the electrodynamics of the equatorial ionosphere as a function of latitude (or L-shell), local time, longitude, magnetic activity, and season; and second, ULF pulsation strength at low/mid-latitude regions and its connection with the equatorial electrojet and density fluctuations. The global AMBER network can also be used to augment observations from space-based instruments, such as the triplet SWARM mission and the upcoming ICON mission. Thus, in coordination with space-based and other ground-based observations, the AMBER magnetometer network provides a great opportunity to understand the electrodynamics that governs equatorial ionosphere motions. In this paper we present the longitudinal variability of the equatorial electrodynamics using the combination of instruments onboard the SWARM and C/NOFS satellites and the ground-based AMBER network. Both ground- and space-based observations show stronger dayside and evening-sector equatorial electrodynamics in the American and Asian sectors compared to the African sector. On the other hand, the African sector is home to stronger and year-round ionospheric bubbles/irregularities compared to the American and Asian sectors. This raises the question of whether the evening-sector equatorial electrodynamics (vertical drift), which is believed to be the main cause for the enhancement of the Rayleigh-Taylor (RT) instability growth rate, is stronger in the

  6. PROBABILISTIC SEISMIC ASSESSMENT OF BASE-ISOLATED NPPS SUBJECTED TO STRONG GROUND MOTIONS OF TOHOKU EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    AHMER ALI

    2014-10-01

    Full Text Available The probabilistic seismic performance of a standard Korean nuclear power plant (NPP) with an idealized isolation is investigated in the present work. A probabilistic seismic hazard analysis (PSHA) of the Wolsong site on the Korean peninsula is performed by considering peak ground acceleration (PGA) as an earthquake intensity measure. A procedure is reported on the categorization and selection of two sets of ground motions of the Tohoku earthquake, i.e. long-period and common as Set A and Set B respectively, for the nonlinear time history response analysis of the base-isolated NPP. Limit state values as multiples of the displacement responses of the NPP base isolation are considered for the fragility estimation. The seismic risk of the NPP is further assessed by incorporation of the rate of frequency exceedance and conditional failure probability curves. Furthermore, this framework attempts to show the unacceptable performance of the isolated NPP in terms of the probabilistic distribution and annual probability of limit states. The comparative results for long and common ground motions are discussed to contribute to the future safety of nuclear facilities against drastic events like Tohoku.

  7. Nighttime Aerosol Optical Depth Measurements Using a Ground-based Lunar Photometer

    Science.gov (United States)

    Berkoff, Tim; Omar, Ali; Haggard, Charles; Pippin, Margaret; Tasaddaq, Aasam; Stone, Tom; Rodriguez, Jon; Slutsker, Ilya; Eck, Tom; Holben, Brent

    2015-01-01

    In recent years it was proposed to combine AERONET network photometer capabilities with a high-precision lunar model, used for satellite calibration, to retrieve columnar nighttime AODs. The USGS lunar model can continuously provide pre-atmosphere, high-precision lunar irradiance determinations for multiple wavelengths at ground sensor locations. When combined with measured irradiances from a ground-based AERONET photometer, atmospheric column transmissions can be determined, yielding nighttime column aerosol AOD and Angstrom coefficients. Additional demonstrations have utilized this approach to further develop calibration methods and to obtain data in polar regions where extended periods of darkness occur. This new capability enables more complete studies of the diurnal behavior of aerosols, and feedback for models and satellite retrievals of the nighttime behavior of aerosols. It is anticipated that the nighttime capability of these sensors will be useful for comparisons with satellite lidars such as CALIOP and CATS, in addition to ground-based lidars in MPLNET, at night, when the signal-to-noise ratio is higher than in daytime and more precise AOD comparisons can be made.
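    The retrieval idea reduces to Beer-Lambert: with the lunar model supplying the pre-atmosphere irradiance and the photometer measuring the ground-level irradiance, the total optical depth follows from the transmission and the airmass, and subtracting the Rayleigh (and, where relevant, gas) contributions leaves the AOD. A minimal sketch under a plane-parallel airmass approximation:

        import numpy as np

        def lunar_aod(i_meas, i0_model, zenith_deg, tau_rayleigh, tau_gas=0.0):
            # Beer-Lambert with a plane-parallel airmass m = 1/cos(zenith):
            # total optical depth, minus molecular terms, leaves the AOD.
            m = 1.0 / np.cos(np.radians(zenith_deg))
            tau_total = -np.log(i_meas / i0_model) / m
            return tau_total - tau_rayleigh - tau_gas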

  8. Probabilistic seismic assessment of base-isolated NPPs subjected to strong ground motions of Tohoku earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Ali, Ahmer; Hayah, Nadin Abu; Kim, Doo Kie [Dept. of Civil and Environmental Engineering, Kunsan National University, Kunsan (Korea, Republic of); Cho, Sung Gook [R and D Center, JACE KOREA Company, Gyeonggido (Korea, Republic of)

    2014-10-15

    The probabilistic seismic performance of a standard Korean nuclear power plant (NPP) with an idealized isolation is investigated in the present work. A probabilistic seismic hazard analysis (PSHA) of the Wolsong site on the Korean peninsula is performed by considering peak ground acceleration (PGA) as an earthquake intensity measure. A procedure is reported on the categorization and selection of two sets of ground motions of the Tohoku earthquake, i.e. long-period and common as Set A and Set B respectively, for the nonlinear time history response analysis of the base-isolated NPP. Limit state values as multiples of the displacement responses of the NPP base isolation are considered for the fragility estimation. The seismic risk of the NPP is further assessed by incorporation of the rate of frequency exceedance and conditional failure probability curves. Furthermore, this framework attempts to show the unacceptable performance of the isolated NPP in terms of the probabilistic distribution and annual probability of limit states. The comparative results for long and common ground motions are discussed to contribute to the future safety of nuclear facilities against drastic events like Tohoku.
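    The final step described in both records, combining the rate of frequency exceedance with the conditional failure probability, is in essence a hazard-fragility convolution. The sketch below assumes a lognormal fragility and differentiates the hazard curve numerically; both are standard illustrative choices, not details taken from the paper.

        import numpy as np
        from scipy.stats import norm

        def annual_limit_state_rate(pga, lam_exceed, median, beta):
            # pga: PGA grid (g); lam_exceed: annual exceedance rate at each PGA
            # (median, beta): lognormal fragility of the limit state
            frag = norm.cdf(np.log(pga / median) / beta)  # P(failure | PGA)
            dens = -np.gradient(lam_exceed, pga)          # occurrence density
            f = frag * dens
            # trapezoidal integration over the PGA grid -> events per year
            return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(pga)))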

  9. Recent successes and emerging challenges for coordinated satellite/ground-based magnetospheric exploration and modeling.

    Science.gov (United States)

    Angelopoulos, Vassilis

    With the availability of a distributed constellation of spacecraft (THEMIS, Geotail, Cluster) and increased-capability ground-based arrays (SuperDARN, THEMIS/GBOs), it is now possible to infer, simply from timing, significant information regarding the mapping of magnetospheric phenomena. Optical, magnetometer and radar data can pinpoint the location and nature of onset signatures. On the other hand, magnetic field modeling constrained by physical boundaries (such as the isotropy boundary) and the measured magnetic field and total pressure values at a distributed network of satellites has proven to do a much better job of correlating ionospheric precipitation and diffuse auroral boundaries to magnetospheric phenomena, such as the inward boundary of the dipolarization fronts. It is now possible to routinely compare in-situ measured phase space densities of ion and electron distributions during ionosphere-magnetosphere conjunctions, in the absence of potential drops. It is also possible not only to infer equivalent current systems from the ground, but to use reconstruction of the ionospheric current system from space to determine the full electrodynamic evolution of the ionosphere and compare with radars. Assimilation of this emerging ground-based and global magnetospheric panoply into a self-consistent magnetospheric model will likely be one of the most fruitful endeavors in magnetospheric exploration during the next few years.

  10. Integration between ground based and satellite SAR data in landslide mapping: The San Fratello case study

    Science.gov (United States)

    Bardi, Federica; Frodella, William; Ciampalini, Andrea; Bianchini, Silvia; Del Ventisette, Chiara; Gigli, Giovanni; Fanti, Riccardo; Moretti, Sandro; Basile, Giuseppe; Casagli, Nicola

    2014-10-01

    The potential use of the integration of PSI (Persistent Scatterer Interferometry) and GB-InSAR (Ground-based Synthetic Aperture Radar Interferometry) for landslide hazard mitigation was evaluated for mapping and monitoring activities of the San Fratello landslide (Sicily, Italy). Intense and exceptional rainfall events are the main factors that triggered several slope movements in the study area, which is susceptible to landslides because of its steep slopes and silty-clayey sedimentary cover. In the last three centuries, the town of San Fratello was affected by three large landslides, developed in different periods: the oldest one occurred in 1754, damaging the northeastern sector of the town; in 1922 a large landslide completely destroyed a wide area in the western hillside of the town. In this paper, attention is focused on the most recent landslide, which occurred on 14 February 2010: in this case, the phenomenon produced the failure of a large sector of the eastern hillside, causing severe damage to buildings and infrastructure. In particular, several slow-moving rotational and translational slides occurred in the area, making it suitable for monitoring ground instability through different InSAR techniques. PS-InSAR™ (Permanent Scatterers SAR Interferometry) techniques, using ERS-1/ERS-2, ENVISAT, RADARSAT-1, and COSMO-SkyMed SAR images, were applied to analyze ground displacements during pre- and post-event phases. Moreover, during the post-event phase in March 2010, a GB-InSAR system, able to acquire data continuously every 14 min, was installed, collecting ground displacement maps for a period of about three years, until March 2013. Through the integration of space-borne and ground-based data sets, ground deformation velocity maps were obtained, providing a more accurate delimitation of the February 2010 landslide boundary with respect to the traditional geomorphological field survey that was carried out. The integration of GB-InSAR and PSI techniques proved to

  11. A Terminal Guidance Law Based on Motion Camouflage Strategy of Air-to-Ground Missiles

    Directory of Open Access Journals (Sweden)

    Chang-sheng Gao

    2016-01-01

    Full Text Available A guidance law for attacking a ground target based on a motion camouflage strategy is proposed in this paper. According to the relative position between the missile and the target, a dual second-order dynamics model is derived. The missile guidance condition is given by analyzing the characteristics of the motion camouflage strategy. The terminal guidance law is then derived from the relative motion of the missile and the target together with the guidance condition. In the derivation, the three-dimensional guidance law can be designed in a two-dimensional plane, which reduces the difficulty of guidance law design. A two-dimensional guidance law for three-dimensional space is obtained by bringing in an estimate of the target maneuver. Finally, simulations of the proposed guidance law are carried out and compared with pure proportional navigation. The simulation results demonstrate that the proposed guidance law can be applied to air-to-ground missiles.
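    A toy two-dimensional kinematic sketch of the motion camouflage idea underlying such a guidance law: the pursuer steers so as to remain on the line joining a fixed reference point to the moving target while closing along it. This illustrates the geometry only, not the paper's dual second-order dynamics derivation, and all parameters are invented.

        import numpy as np

        def simulate(ref, target0, v_target, pursuer0, speed,
                     dt=0.01, steps=5000):
            ref = np.asarray(ref, float)
            t_pos = np.asarray(target0, float)
            p_pos = np.asarray(pursuer0, float)
            path = [p_pos.copy()]
            for _ in range(steps):
                t_pos = t_pos + np.asarray(v_target, float) * dt
                u = (t_pos - ref) / np.linalg.norm(t_pos - ref)  # camouflage line
                s = np.dot(p_pos - ref, u)           # progress along the line
                anchor = ref + (s + speed * dt) * u  # next point on the line
                step = anchor - p_pos
                p_pos = p_pos + speed * dt * step / max(np.linalg.norm(step), 1e-9)
                path.append(p_pos.copy())
                if np.linalg.norm(p_pos - t_pos) < speed * dt:
                    break  # intercept
            return np.array(path)

    Seen from the reference point, the pursuer stays visually superimposed on the target throughout the engagement, which is the defining property of motion camouflage.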

  12. Recovery Act: Finite Volume Based Computer Program for Ground Source Heat Pump Systems

    Energy Technology Data Exchange (ETDEWEB)

    James A Menart, Professor

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled Finite Volume Based Computer Program for Ground Source Heat Pump Systems. The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump. The
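    For a flavor of the finite-volume machinery such a tool is built on, the sketch below solves explicit transient heat conduction on a one-dimensional radial grid around a single borehole with a fixed heat load per unit length at the pipe wall. Property values and the boundary treatment are illustrative; the actual GEO2D/GEO3D programs resolve far more geometry (casing, tube wall, fluid) than this.

        import numpy as np

        def ground_temps(q_per_m=30.0, r_in=0.06, r_out=10.0, n=200,
                         k=2.0, rho_c=2.3e6, t_end=3600 * 24 * 30, t_inf=12.0):
            # Explicit finite-volume conduction on a 1-D radial grid (per unit
            # depth of borehole); q_per_m is the load at the pipe wall in W/m.
            r = np.linspace(r_in, r_out, n)
            dr = r[1] - r[0]
            dt = 0.4 * dr ** 2 * rho_c / k       # explicit stability limit
            T = np.full(n, t_inf)
            for _ in range(int(t_end / dt)):
                q = -k * np.diff(T) / dr                   # face fluxes, W/m^2
                area = 2 * np.pi * 0.5 * (r[:-1] + r[1:])  # face area per m depth
                net = np.zeros(n)
                net[:-1] -= q * area   # flux leaving cell i through face i
                net[1:] += q * area    # same flux entering cell i+1
                net[0] += q_per_m      # imposed heat load at the pipe wall
                T += dt * net / (rho_c * 2 * np.pi * r * dr)
                T[-1] = t_inf          # undisturbed far-field temperature
            return r, T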

  13. Finite Volume Based Computer Program for Ground Source Heat Pump System

    Energy Technology Data Exchange (ETDEWEB)

    Menart, James A. [Wright State University

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled "Finite Volume Based Computer Program for Ground Source Heat Pump Systems." The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump.

  14. (Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio)

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This Health and Safety Plan (HSP) was developed for the Environmental Investigation of Ground-water Contamination at Wright-Patterson Air Force Base near Dayton, Ohio, based on the projected scope of work for the Phase 1, Task 4 Field Investigation. The HSP describes hazards that may be encountered during the investigation, assesses the hazards, and indicates what type of personal protective equipment is to be used for each task performed. The HSP also addresses the medical monitoring program, decontamination procedures, air monitoring, training, site control, accident prevention, and emergency response.

  15. Remote sensing of high-latitude ionization profiles by ground-based and spaceborne instrumentation

    International Nuclear Information System (INIS)

    Vondrak, R.R.

    1981-01-01

    Ionospheric specification and modeling are now largely based on data provided by active remote sensing with radiowave techniques (ionosondes, incoherent-scatter radars, and satellite beacons). More recently, passive remote sensing techniques have been developed that can be used to monitor quantitatively the spatial distribution of high-latitude E-region ionization. These passive methods depend on the measurement, or inference, of the energy distribution of precipitating kilovolt electrons, the principal source of the nighttime E-region at high latitudes. To validate these techniques, coordinated measurements of the auroral ionosphere have been made with the Chatanika incoherent-scatter radar and a variety of ground-based and spaceborne sensors

  16. Plant diversity to support humans in a CELSS ground based demonstrator

    Science.gov (United States)

    Howe, J. M.; Hoff, J. E.

    1981-01-01

    A controlled ecological life support system (CELSS) for human habitation in preparation for future long duration space flights is considered. The success of such a system depends upon the feasibility of revitalization of food resources and the human nutritional needs which are to be met by these food resources. Edible higher plants are prime candidates for the photoautotrophic components of this system if nutritionally adequate diets can be derived from these plant sources to support humans. Human nutritional requirements, based on current knowledge, are developed for the inhabitants envisioned in the CELSS ground based demonstrator. Groups of plant products that can provide the nutrients are identified.

  17. The laser calibration system for the STACEE ground-based gamma ray detector

    CERN Document Server

    Hanna, D

    2002-01-01

    We describe the design and performance of the laser system used for calibration monitoring of components of the STACEE detector. STACEE is a ground based gamma ray detector which uses the heliostats of a solar power facility to collect and focus Cherenkov light onto a system of secondary optics and photomultiplier tubes. To monitor the gain and check the linearity and timing properties of the phototubes and associated electronics, a system based on a dye laser, neutral density filters and optical fibres has been developed. In this paper we describe the system and present some results from initial tests made with it.

  18. The sphinx project: experimental verification of design inputs for a transmuter with liquid fuel based on molten fluorides

    International Nuclear Information System (INIS)

    Hron, M.; Uhlir, J.; Vanicek, J.

    2002-01-01

    The current proposals for the disposal of highly active, long-lived (more than 10⁴ years) waste from spent nuclear fuel call forth an increasing societal mistrust towards nuclear power. These problems are highly topical in the Czech Republic, a country which is operating nuclear power and accumulating spent fuel from PWRs, and which is located in an inland and heavily populated Central European region. The proposed project, known under the acronym SPHINX (SPent Hot fuel Incineration by Neutron flux), deals with a solution to some of the principal problems through a very promising means of radioactive waste treatment. In particular, high-level wastes from spent nuclear fuel could be treated using this method, which is based on the transmutation of radionuclides through the use of a nuclear reactor with liquid fuel based on molten fluorides (Molten Salt Transmutation Reactor - MSTR), which might be a subcritical system driven by a suitable neutron source. Its superiority also lies in the fact that it makes it possible to utilize the actinides contained, among others, in spent nuclear fuel and thus to achieve a positive energy effect. After the first three-year stage of Research and Development, which was focused mostly on computer analyses of neutronics and corresponding physical characteristics, the next three-year stage of this programme will be devoted to experimental verification of inputs for the design of a demonstration transmuter using molten fluoride fuel. The Research and Development part of the SPHINX project in the area of the MSTR fuel cycle is focused in the first place on the development of a suitable technology for the preparation of an introductory liquid fluoride fuel for the MSTR, and subsequently on the development of a suitable fluoride pyrometallurgical technology for the separation of the transmuted elements from the non-transmuted ones. The idea of the introductory fuel preparation is based on the reprocessing of PWR spent fuel using the Fluoride Volatility Method.

  19. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  20. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
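
    In the published literature, the filter-bank representation alluded to here is typically a bank of oriented Gabor filters whose per-cell average absolute deviations form the feature vector (the FingerCode approach). The sketch below is a minimal version of that idea; the kernel size, frequency, and tessellation are illustrative assumptions, not this study's settings.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(size, theta, freq, sigma=4.0):
    """Even-symmetric Gabor kernel oriented at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def filterbank_features(img, n_orient=8, freq=0.1, cell=16):
    """Average-absolute-deviation features per cell per orientation."""
    feats = []
    for i in range(n_orient):
        resp = fftconvolve(img, gabor_kernel(17, i * np.pi / n_orient, freq),
                           mode="same")
        # tessellate the filtered image and summarise each cell by its AAD
        for r in range(0, img.shape[0] - cell + 1, cell):
            for c in range(0, img.shape[1] - cell + 1, cell):
                block = resp[r:r + cell, c:c + cell]
                feats.append(np.abs(block - block.mean()).mean())
    return np.array(feats)

# toy usage on a random patch; matching would compare two such feature
# vectors, e.g. by Euclidean distance against a threshold
img = np.random.rand(128, 128)
print(filterbank_features(img).shape)   # (8 orientations * 64 cells,) = (512,)
```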

  1. Multi-story base-isolated buildings under a harmonic ground motion. Pt. 1

    International Nuclear Information System (INIS)

    Fan Fagung; Ahmadi, G.; Tadjbakhsh, I.G.

    1990-01-01

    The performances of several leading base-isolation devices (Pure-Friction/Sliding-Joint, Rubber Bearing, French System, New Zealand System, and Resilient-Friction) and a newly proposed system (Sliding Resilient-Friction) for a multi-story building subject to a horizontal harmonic ground motion are studied. The governing equations of motion of the various systems and the criteria for stick-slip transition are described, and a computational algorithm for obtaining their numerical solutions is developed. The responses of the structure with different base-isolation systems under various conditions are analyzed. The peak absolute acceleration, the maximum structural deflection, and the peak base-displacement responses are obtained. The effectiveness of the various base isolators is studied, and advantages and disadvantages of the different systems are discussed. The results show that the base-isolation devices effectively reduce the column stresses and the acceleration transmitted to the superstructure. (orig.)
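
    Of the devices listed, the linear rubber-bearing case admits a compact numerical sketch (the friction-based devices additionally need the stick-slip transition logic the abstract mentions). Below, a one-story superstructure on a linear isolator is integrated under harmonic ground acceleration; all masses, stiffnesses, and damping values are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Linear sketch of a one-story structure on a rubber-bearing-type isolator
# under harmonic ground acceleration (all parameters are illustrative).
m_b, m_s = 2.0e4, 1.0e5            # base-slab and story mass [kg]
k_b, k_s = 4.0e6, 8.0e7            # isolator and story stiffness [N/m]
c_b, c_s = 1.0e5, 4.0e5            # damping coefficients [N s/m]
A_g, w_g = 2.0, 2 * np.pi * 2.0    # ground accel amplitude [m/s^2], 2 Hz

M = np.diag([m_b, m_s])
K = np.array([[k_b + k_s, -k_s], [-k_s, k_s]])
C = np.array([[c_b + c_s, -c_s], [-c_s, c_s]])
Minv = np.linalg.inv(M)

def rhs(t, y):
    x, v = y[:2], y[2:]
    ug_dd = A_g * np.sin(w_g * t)            # harmonic ground acceleration
    # EOM in coordinates relative to the ground: M x'' + C x' + K x = -M 1 ug''
    a = Minv @ (-C @ v - K @ x) - ug_dd
    return np.concatenate((v, a))

sol = solve_ivp(rhs, (0, 20), np.zeros(4), max_step=1e-3)
drift = sol.y[1] - sol.y[0]                  # structural (interstory) deflection
print(f"peak base displacement: {np.abs(sol.y[0]).max() * 100:.1f} cm, "
      f"peak structural drift: {np.abs(drift).max() * 1000:.2f} mm")
```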

  2. A Little Knowledge of Ground Motion: Explaining 3-D Physics-Based Modeling to Engineers

    Science.gov (United States)

    Porter, K.

    2014-12-01

    Users of earthquake planning scenarios require the ground-motion map to be credible enough to justify costly planning efforts, but not all ground-motion maps are right for all uses. There are two common ways to create a map of ground motion for a hypothetical earthquake. One approach is to map the median shaking estimated by empirical attenuation relationships. The other uses 3-D physics-based modeling, in which one analyzes a mathematical model of the earth's crust near the fault rupture and calculates the generation and propagation of seismic waves from source to ground surface by first principles. The two approaches produce different-looking maps. The more-familiar median maps smooth out variability and correlation. Using them in a planning scenario can lead to a systematic underestimation of damage and loss, and could leave a community underprepared for realistic shaking. The 3-D maps show variability, including some very high values that can disconcert non-scientists. So when the USGS Science Application for Risk Reduction's (SAFRR) Haywired scenario project selected 3-D maps, it was necessary to explain to scenario users—especially engineers who often use median maps—the differences, advantages, and disadvantages of the two approaches. We used authority, empirical evidence, and theory to support our choice. We prefaced our explanation with SAFRR's policy of using the best available earth science, and cited the credentials of the maps' developers and the reputation of the journal in which they published the maps. We cited recorded examples from past earthquakes of extreme ground motions that are like those in the scenario map. We explained the maps on theoretical grounds as well, explaining well established causes of variability: directivity, basin effects, and source parameters. The largest mapped motions relate to potentially unfamiliar extreme-value theory, so we used analogies to human longevity and the average age of the oldest person in samples of

  3. Summer planetary-scale oscillations: aura MLS temperature compared with ground-based radar wind

    Directory of Open Access Journals (Sweden)

    C. E. Meek

    2009-04-01

    Full Text Available The advent of satellite based sampling brings with it the opportunity to examine virtually any part of the globe. Aura MLS mesospheric temperature data are analysed in a wavelet format for easy identification of possible planetary waves (PW and aliases masquerading as PW. A calendar year, 2005, of eastward, stationary, and westward waves at a selected latitude is shown in separate panels for the wave number range −3 to +3 and the period range 8 h to 30 days (d. Such a wavelet analysis is made possible by Aura's continuous sampling at all latitudes 82° S–82° N. The data presentation is suitable for examination of years of data. However, this paper focuses on the striking feature of a "dish-shaped" upper limit to periods near 2 d in mid-summer, with longer periods appearing towards spring and fall, a feature also commonly seen in radar winds. The most probable cause is suggested to be filtering by the summer jet at 70–80 km, the latter being available from ground based medium frequency radar (MFR. Classically, the phase velocity of a wave must be greater than that of the jet in order to propagate through it. As an attempt to directly relate satellite and ground based sampling, a PW event of period 8 d and wave number 2, which appears to be the original rather than an alias, is compared with ground based radar wind data. An appendix discusses characteristics of satellite data aliases with regard to their periods and amplitudes.
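
    Separating eastward, stationary, and westward waves from such data reduces, at its core, to a two-dimensional transform over time and longitude. The sketch below demonstrates that decomposition on synthetic data; the wave parameters and sampling are illustrative assumptions, not the Aura MLS analysis itself.

```python
import numpy as np

# Minimal space-time spectral sketch: a 2-D FFT over (time, longitude)
# separates eastward- from westward-propagating planetary-wave components.
n_t, n_lon = 240, 72                        # 30 days at 3 h; 5-degree longitude bins
t = np.arange(n_t) * 3.0                    # time [h]
lon = np.deg2rad(np.arange(n_lon) * 5.0)
tt, ll = np.meshgrid(t, lon, indexing="ij")

# synthetic field: westward wave s=2 (48 h period) plus weaker eastward s=1 (80 h)
f = (1.0 * np.cos(2 * ll + 2 * np.pi * tt / 48.0)
     + 0.3 * np.cos(1 * ll - 2 * np.pi * tt / 80.0))

amp = 2 * np.abs(np.fft.fft2(f)) / (n_t * n_lon)   # single-sided amplitudes
T = n_t * 3.0                                      # record length [h]
# with the phase convention cos(s*lon + omega*t), a westward (s, period P)
# wave lands in bin [T/P, s]; an eastward one lands in bin [T/P, n_lon - s]
print("westward s=2, 48 h:", round(amp[int(T / 48), 2], 2))          # -> 1.0
print("eastward s=1, 80 h:", round(amp[int(T / 80), n_lon - 1], 2))  # -> 0.3
```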

  4. SAR Ground Moving Target Indication Based on Relative Residue of DPCA Processing

    Directory of Open Access Journals (Sweden)

    Jia Xu

    2016-10-01

    Full Text Available For modern synthetic aperture radar (SAR, it has much more urgent demands on ground moving target indication (GMTI, which includes not only the point moving targets like cars, truck or tanks but also the distributed moving targets like river or ocean surfaces. Among the existing GMTI methods, displaced phase center antenna (DPCA can effectively cancel the strong ground clutter and has been widely used. However, its detection performance is closely related to the target’s signal-to-clutter ratio (SCR as well as radial velocity, and it cannot effectively detect the weak large-sized river surfaces in strong ground clutter due to their low SCR caused by specular scattering. This paper proposes a novel method called relative residue of DPCA (RR-DPCA, which jointly utilizes the DPCA cancellation outputs and the multi-look images to improve the detection performance of weak river surfaces. Furthermore, based on the statistics analysis of the RR-DPCA outputs on the homogenous background, the cell average (CA method can be well applied for subsequent constant false alarm rate (CFAR detection. The proposed RR-DPCA method can well detect the point moving targets and distributed moving targets simultaneously. Finally, the results of both simulated and real data are provided to demonstrate the effectiveness of the proposed SAR/GMTI method.
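
    DPCA cancellation amounts to subtracting the two along-track channels after time alignment, and the relative residue then normalizes that output by the multi-look image power. The toy sketch below illustrates both steps on a one-dimensional synthetic signal; the signal model, phases, and amplitudes are illustrative assumptions, not the paper's processing chain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy DPCA sketch: two along-track channels see the same stationary clutter;
# the fore channel sees moving scatterers with an extra interferometric phase.
n = 4096
clutter = rng.normal(size=n) + 1j * rng.normal(size=n)     # strong stationary clutter
target = 0.2 * np.exp(2j * np.pi * 0.01 * np.arange(n))    # weak slow mover (low SCR)
aft = clutter + target
fore = clutter + target * np.exp(2j * np.pi * 0.08)        # phase from radial velocity

dpca = fore - aft                 # identical clutter cancels; mover residue remains
# relative residue: DPCA output normalised by the multi-look image power, so
# weak distributed movers still stand out against the homogeneous background
rr = np.abs(dpca) ** 2 / (0.5 * (np.abs(fore) ** 2 + np.abs(aft) ** 2))
print(f"clutter power {np.mean(np.abs(aft)**2):.2f}, "
      f"residue power {np.mean(np.abs(dpca)**2):.4f}, "
      f"mean relative residue {rr.mean():.4f}")
```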

  5. Removal of lead and fluoride from contaminated water using exhausted coffee grounds based bio-sorbent.

    Science.gov (United States)

    Naga Babu, A; Reddy, D Srinivasa; Kumar, G Suresh; Ravindhranath, K; Krishna Mohan, G V

    2018-07-15

    Water pollution by industrial and anthropogenic activities has become a serious threat to the environment. The World Health Organization (WHO) has identified lead and fluoride as being among the most poisonous water contaminants, with a devastating impact on the human race. The present work proposes an economical bio-adsorbent-based technique using exhausted coffee grounds for the removal of lead and fluoride contaminants from water. The exhausted coffee grounds, gathered from industrial wastes, were acid-activated and examined for their adsorption capacity. The surface morphology and elemental characterization of pre- and post-adsorption operations by FESEM, EDX and FTIR spectral analysis confirmed the potential of the exhausted coffee grounds as a successful bio-sorbent. Thermodynamic analysis confirmed the adsorption to be spontaneous physisorption following the Langmuir mode of homogeneous monolayer deposition. The kinetics of adsorption is well described by a pseudo-second-order model for both lead and fluoride. A significant quantity of lead and fluoride is removed from synthetic contaminated water by the proposed bio-sorbent, with respective sorption capacities of 61.6 mg/g and 9.05 mg/g. The developed bio-sorbent is also recyclable and is capable of removing lead and fluoride from domestic and industrial waste-water sources with an overall removal efficiency of about 90%.
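
    The pseudo-second-order model cited here is commonly fitted in its linearized form t/q_t = 1/(k2·qe²) + t/qe. The sketch below generates synthetic uptake data and recovers the parameters by linear regression; the rate constant is an illustrative assumption, with qe set near the reported lead capacity purely for flavor.

```python
import numpy as np

# Pseudo-second-order kinetics: q_t = qe^2*k2*t / (1 + qe*k2*t), linearised
# as t/q_t = 1/(k2*qe^2) + t/qe. Data are synthetic; constants illustrative.
qe_true, k2_true = 61.6, 0.002            # mg/g, g/(mg min)
t = np.array([5, 10, 20, 40, 60, 90, 120, 180], dtype=float)   # minutes
qt = qe_true**2 * k2_true * t / (1 + qe_true * k2_true * t)
qt += np.random.default_rng(1).normal(0, 0.5, t.size)          # measurement noise

slope, intercept = np.polyfit(t, t / qt, 1)   # linear regression of t/qt vs t
qe_fit = 1 / slope                            # slope = 1/qe
k2_fit = slope**2 / intercept                 # intercept = 1/(k2*qe^2)
print(f"qe = {qe_fit:.1f} mg/g, k2 = {k2_fit:.4f} g/(mg min)")
```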

  6. Component design challenges for the ground-based SP-100 nuclear assembly test

    International Nuclear Information System (INIS)

    Markley, R.A.; Disney, R.K.; Brown, G.B.

    1989-01-01

    The SP-100 ground engineering system (GES) program involves a ground test of the nuclear subsystems to demonstrate their design. The GES nuclear assembly test (NAT) will be performed in a simulated space environment within a vessel maintained at ultrahigh vacuum. The NAT employs a radiation shielding system that is comprised of both prototypical and nonprototypical shield subsystems to attenuate the reactor radiation leakage and also nonprototypical heat transport subsystems to remove the heat generated by the reactor. The reactor is cooled by liquid lithium, which will operate at temperatures prototypical of the flight system. In designing the components for these systems, a number of design challenges were encountered in meeting the operational requirements of the simulated space environment (and where necessary, prototypical requirements) while also accommodating the restrictions of a ground-based test facility with its limited available space. This paper presents a discussion of the design challenges associated with the radiation shield subsystem components and key components of the heat transport systems

  7. Evaluation of modal pushover-based scaling of one component of ground motion: Tall buildings

    Science.gov (United States)

    Kalkan, Erol; Chopra, Anil K.

    2012-01-01

    Nonlinear response history analysis (RHA) is now increasingly used for performance-based seismic design of tall buildings. Required for nonlinear RHAs is a set of ground motions selected and scaled appropriately so that analysis results are accurate (unbiased) and efficient (having relatively small dispersion). This paper evaluates the accuracy and efficiency of the recently developed modal pushover-based scaling (MPS) method for scaling ground motions for tall buildings. The procedure presented explicitly considers structural strength and is based on the standard intensity measure (IM) of spectral acceleration, in a form convenient for evaluating existing structures or proposed designs for new structures. Based on results presented for two actual buildings (19 and 52 stories, respectively), it is demonstrated that the MPS procedure provided a highly accurate estimate of the engineering demand parameters (EDPs), accompanied by significantly reduced record-to-record variability of the responses. In addition, the MPS procedure is shown to be superior to the scaling procedure specified in the ASCE/SEI 7-05 document.

  8. SU-F-T-288: Impact of Trajectory Log Files for Clarkson-Based Independent Dose Verification of IMRT and VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, R; Kamima, T [Cancer Institute Hospital of Japanese Foundation for Cancer Research, Koto, Tokyo (Japan); Tachibana, H [National Cancer Center, Kashiwa, Chiba (Japan)

    2016-06-15

    Purpose: To investigate the effect of using trajectory log files from the linear accelerator for Clarkson-based independent dose verification of IMRT and VMAT plans. Methods: A CT-based independent dose verification software (Simple MU Analysis: SMU, Triangle Products, Japan) with a Clarkson-based algorithm was modified to calculate dose using the trajectory log files. Eclipse, with the three techniques of step and shoot (SS), sliding window (SW) and RapidArc (RA), was used as the treatment planning system (TPS). In this study, clinically approved IMRT and VMAT plans for prostate and head and neck (HN) at two institutions were retrospectively analyzed to assess the dose deviation between the DICOM-RT plan (PL) and the trajectory log file (TJ). An additional analysis was performed to evaluate the MLC error detection capability of SMU when the trajectory log files were modified by adding systematic errors (0.2, 0.5, 1.0 mm) and random errors (5, 10, 30 mm) to the actual MLC positions. Results: The dose deviations for prostate and HN at the two sites were 0.0% and 0.0% in SS, 0.1±0.0% and 0.1±0.1% in SW, and 0.6±0.5% and 0.7±0.9% in RA, respectively. The MLC error detection analysis showed that the HN IMRT plans were the most sensitive, with 0.2 mm of systematic error producing a 0.7% dose deviation on average. The MLC random errors did not affect the dose deviation. Conclusion: The use of trajectory log files, which include the actual MLC positions, gantry angles, etc., should be more effective for an independent verification. The tolerance level for the secondary check using the trajectory file may be similar to that of the verification using the DICOM-RT plan file. In terms of the resolution of MLC positional error detection, the secondary check could detect MLC position errors corresponding to the treatment sites and techniques. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
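
    The error-detection part of this study reduces to differencing planned and logged MLC leaf positions per control point and separating systematic from random components. A minimal bookkeeping sketch follows; the geometry, injected errors, and the 0.1 mm action level are illustrative assumptions, not SMU's internals.

```python
import numpy as np

rng = np.random.default_rng(7)

# Sketch of a trajectory-log check: compare planned vs. logged MLC leaf
# positions per control point. All values and tolerances are assumptions.
n_cp, n_leaves = 120, 60
planned = rng.uniform(-50, 50, (n_cp, n_leaves))       # planned positions [mm]
systematic, random_sd = 0.2, 0.05                      # injected errors [mm]
logged = planned + systematic + rng.normal(0, random_sd, planned.shape)

diff = logged - planned
mean_offset = diff.mean(axis=0)        # per-leaf systematic component
rms = np.sqrt((diff**2).mean())        # overall positional deviation
flagged = np.abs(mean_offset) > 0.1    # hypothetical 0.1 mm action level
print(f"global RMS deviation {rms:.3f} mm, "
      f"{flagged.sum()}/{n_leaves} leaves flagged for systematic offset")
```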

  9. Validation of OMI erythemal doses with multi-sensor ground-based measurements in Thessaloniki, Greece

    Science.gov (United States)

    Zempila, Melina Maria; Fountoulakis, Ilias; Taylor, Michael; Kazadzis, Stelios; Arola, Antti; Koukouli, Maria Elissavet; Bais, Alkiviadis; Meleti, Chariklia; Balis, Dimitrios

    2018-06-01

    The aim of this study is to validate the Ozone Monitoring Instrument (OMI) erythemal dose rates using ground-based measurements in Thessaloniki, Greece. In the Laboratory of Atmospheric Physics of the Aristotle University of Thessaloniki, a Yankee Environmental System UVB-1 radiometer measures the erythemal dose rates every minute, and a Norsk Institutt for Luftforskning (NILU) multi-filter radiometer provides multi-filter based irradiances that were used to derive erythemal dose rates for the period 2005-2014. Both these datasets were independently validated against collocated UV irradiance spectra from a Brewer MkIII spectrophotometer. Cloud detection was performed based on measurements of the global horizontal radiation from a Kipp & Zonen pyranometer and from NILU measurements in the visible range. The satellite versus ground observation validation was performed taking into account the effect of temporal averaging, limitations related to OMI quality control criteria, cloud conditions, the solar zenith angle and atmospheric aerosol loading. Aerosol optical depth was also retrieved using a collocated CIMEL sunphotometer in order to assess its impact on the comparisons. The effect of satellite versus ground-based differences in total ozone columns on the erythemal dose comparisons was also investigated. Since most public awareness alerts are based on UV Index (UVI) classifications, an analysis and assessment of the OMI capability for retrieving UVIs was also performed. An overestimation of the OMI erythemal product by 3-6% and 4-8% with respect to ground measurements is observed when examining overpass and noontime estimates, respectively. The comparisons revealed a relatively small solar zenith angle dependence, with the OMI data showing a slight dependence on aerosol load, especially at high aerosol optical depth values. A mean underestimation of 2% in OMI total ozone columns under cloud-free conditions was found to lead to an overestimation in OMI erythemal

  10. Ground-based Efforts to Support a Space-based Experiment: the Latest LADEE Results (Abstract)

    Science.gov (United States)

    Cudnik, B.; Rahman, M.

    2014-12-01

    (Abstract only) The much anticipated launch of NASA’s Lunar Atmosphere and Dust Environment Explorer happened flawlessly last October, and the satellite has been doing science (and sending a few images) since late November. [The LADEE mission ended with the crash-landing of the spacecraft on the lunar far side on April 17, 2014, capping a successful 140-day mission.] We also have launched our campaign to document lunar meteoroid impact flashes from the ground, to supply ground truth on any changes in dust concentration encountered by the spacecraft in orbit around the moon. To date I have received six reports of impact flashes or flash candidates from the group I am coordinating; other groups around the world may have more to add when all is said and done. In addition, plans are underway to prepare a program at Prairie View A&M University to involve our physics majors in lunar meteoroid, asteroid occultation, and other astronomical work through our Center for Astronomical Sciences and Technology. This facility will be a control center to not only involve physics majors, but also to include pre-service teachers and members of the outside community to promote pro-am collaborations.

  11. Ground-based Efforts to Support a Space-Based Experiment: the Latest LADEE Results

    Science.gov (United States)

    Cudnik, Brian; Rahman, Mahmudur

    2014-05-01

    The much anticipated launch of the Lunar Atmosphere and Dust Environment Explorer happened flawlessly last October, and the satellite has been doing science (and sending a few images) since late November. [The LADEE mission ended with the crash-landing of the spacecraft on the lunar far side on April 17, 2014, capping a successful 140-day mission.] We also have launched our campaign to document lunar meteoroid impact flashes from the ground, to supply ground truth on any changes in dust concentration encountered by the spacecraft in orbit around the moon. To date I have received six reports of impact flashes or flash candidates from the group I am coordinating; other groups around the world may have more to add when all is said and done. In addition, plans are underway to prepare a program at Prairie View A&M University to involve our physics majors in lunar meteoroid, asteroid occultation, and other astronomical work through our Center for Astronomical Sciences and Technology. This facility will be a control center to not only involve physics majors, but also to include pre-service teachers and members of the outside community to promote pro-am collaborations.

  12. Main control system verification and validation of NPP digital I and C system based on engineering simulator

    International Nuclear Information System (INIS)

    Lin Meng; Hou Dong; Liu Pengfei; Yang Zongwei; Yang Yanhua

    2010-01-01

    The full-scope digital instrumentation and control (I and C) technique is being introduced in newly constructed Chinese Nuclear Power Plants (NPP); it mainly includes three parts: the control system, the reactor protection system and the engineered safety feature actuation system. For example, the SIEMENS TELEPERM XP and XS distributed control systems (DCS) have been used in the Ling Ao Phase II NPP, located in Guangdong province, China. This is the first NPP project in China in which Chinese engineers are fully responsible for the entire configuration of the actual analog and logic diagrams, although experience in NPP full-scope digital I and C is very limited. For safety, it has to be ensured that the configuration is correct and that the control functions can be accomplished before the phase of real plant testing on the reactor. Therefore, primary verification and validation (V and V) of the I and C needs to be carried out. In addition to the common and basic approach, i.e. checking the diagram configuration item by item against the original design, an NPP engineering simulator is applied as another effective approach to V and V. For this purpose, a virtual NPP thermal-hydraulic model is established according to the Ling Ao Phase II NPP design, and the NPP simulation tools can provide plant operation parameters to the DCS, accept control signals from the I and C and give responses. During the test, a set of data acquisition equipment is used to build a connection between the engineering simulator (software) and the SIEMENS DCS I/O cabinet (hardware). In this emulation, the original diagram configuration in the DCS and the field hardware structures are kept unchanged. In this way, it is first judged whether there are problems by observing the inputs and outputs of the DCS without knowledge of the internal configuration; second, problems can be found and corrected by understanding and checking the exact and complex configuration in detail. At last, the correctness and functionality of the control system are verified. This method is

  13. Ground-based SMART-COMMIT Measurements for Studying Aerosol and Cloud Properties

    Science.gov (United States)

    Tsay, Si-Chee

    2008-01-01

    From radiometric principles, it is expected that the retrieved properties of extensive aerosols and clouds from reflected/emitted measurements by satellite (and/or aircraft) should be consistent with those retrieved from transmitted/emitted radiance observed at the surface. Although space-borne remote sensing observations cover a large spatial domain, they are often plagued by contamination from surface signatures. Thus, ground-based in-situ and remote-sensing measurements, where signals come directly from atmospheric constituents, the sun, and/or the Earth-atmosphere interactions, provide additional information content for comparisons that confirm quantitatively the usefulness of the integrated surface, aircraft, and satellite data sets. The development and deployment of the SMART-COMMIT (Surface-sensing Measurements for Atmospheric Radiative Transfer - Chemical, Optical & Microphysical Measurements of In-situ Troposphere) mobile facilities aim at the optimal utilization of collocated ground-based observations as constraints to yield higher fidelity satellite retrievals and to determine any sampling bias due to target conditions. To quantify the energetics of the surface-atmosphere system and the atmospheric processes, SMART-COMMIT instruments fall into three categories: flux radiometers, radiance sensors and in-situ probes. In this paper, we will demonstrate the capability of SMART-COMMIT in recent field campaigns (e.g., CRYSTAL-FACE, UAE 2, BASE-ASIA, NAMMA) that were designed and executed to study the compelling temporal variability of both anthropogenic and natural aerosols (e.g., biomass-burning smoke, airborne dust) and cirrus clouds. We envision robust approaches in which well-collocated ground-based measurements and space-borne observations will greatly advance our knowledge of extensive aerosols and clouds.

  14. Kepler and Ground-Based Transits of the exo-Neptune HAT-P-11b

    Science.gov (United States)

    Deming, Drake; Sada, Pedro V.; Jackson, Brian; Peterson, Steven W.; Agol, Eric; Knutson, Heather A.; Jennings, Donald E.; Haase, Plynn; Bays, Kevin

    2011-01-01

    We analyze 26 archival Kepler transits of the exo-Neptune HAT-P-11b, supplemented by ground-based transits observed in the blue (B band) and near-IR (J band). Both the planet and host star are smaller than previously believed; our analysis yields Rp = 4.31 ± 0.06 R⊕ and Rs = 0.683 ± 0.009 R☉, both about 3σ smaller than the discovery values. Our ground-based transit data at wavelengths bracketing the Kepler bandpass serve to check the wavelength dependence of stellar limb darkening, and the J-band transit provides a precise and independent constraint on the transit duration. Both the limb darkening and transit duration from our ground-based data are consistent with the new Kepler values for the system parameters. Our smaller radius for the planet implies that its gaseous envelope can be less extensive than previously believed, being very similar to the H-He envelope of GJ 436b and Kepler-4b. HAT-P-11 is an active star, and signatures of star spot crossings are ubiquitous in the Kepler transit data. We develop and apply a methodology to correct the planetary radius for the presence of both crossed and uncrossed star spots. Star spot crossings are concentrated at phases −0.002 and +0.006. This is consistent with inferences from Rossiter-McLaughlin measurements that the planet transits nearly perpendicular to the stellar equator. We identify the dominant phases of star spot crossings with active latitudes on the star, and infer that the stellar rotational pole is inclined at about 12° ± 5° to the plane of the sky. We point out that precise transit measurements over long durations could in principle allow us to construct a stellar butterfly diagram to probe the cyclic evolution of magnetic activity on this active K-dwarf star.

  15. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    Science.gov (United States)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however, such observations are expensive and time-limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground-based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency-based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
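
    The weighted k-NN stage over texture feature vectors is straightforward to reproduce with scikit-learn; the sketch below stands in for the 45-feature classifier, using synthetic class-separated features and an illustrative k.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Stand-in for the 45 texture features per sky image, with 4 cloud classes
# (e.g. CB, TCU, stratiform, clear). The data here are synthetic blobs.
n_per_class, n_feat, n_classes = 200, 45, 4
X = np.vstack([rng.normal(c, 1.0, (n_per_class, n_feat)) for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = KNeighborsClassifier(n_neighbors=7, weights="distance")  # distance-weighted KNN
clf.fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2%}")
```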

  16. Development of ground-based wind energy in DOM and Corsica - Joint CGEDD / CGEIET report

    International Nuclear Information System (INIS)

    Joannis de Verclos, Christian de; Albrecht, Patrick; Iselin, Philippe; Legait, Benoit; Vignolles, Denis

    2012-09-01

    Addressing the peculiar cases of the French overseas districts (DOM: Guadeloupe, Martinique, Guyana, Mayotte, La Reunion) and Corsica, this report analyzes four main topics: the objectives and challenges of ground-based wind energy (sustainable development, non-interconnected areas, and the public service of electricity supply), the local situations and their cartography, the legal issues and the possible evolution options (energy law, environmental law, urban planning law, local community law), and the modalities of project devolution. The authors highlight the issues which require a new legal framework, notably governance and the devolution procedure.

  17. Tests of the gravitational redshift effect in space-borne and ground-based experiments

    Science.gov (United States)

    Vavilova, I. B.

    2018-02-01

    This paper provides a brief overview of tests of the gravitational redshift (GRS) effect in ground-based and space-borne experiments. In particular, we consider the GRS effect in the gravitational fields of the Earth, the major planets of the Solar system, and compact stars (white dwarfs and neutron stars), where this effect has been confirmed with higher accuracy. We discuss the possibilities of confirming the GRS effect for galaxies and galaxy clusters in the visible and X-ray ranges of the electromagnetic spectrum.

  18. Low velocity target detection based on time-frequency image for high frequency ground wave radar

    Institute of Scientific and Technical Information of China (English)

    YAN Songhua; WU Shicai; WEN Biyang

    2007-01-01

    The Doppler spectral broadening resulting from the non-stationary movement of targets and from radio-frequency interference decreases the accuracy of target detection by high frequency ground wave (HFGW) radar. By displaying the change of signal energy on two-dimensional time-frequency images based on time-frequency analysis, a new mathematical morphology method to distinguish targets from nonlinear time-frequency curves is presented. The results from measured data verify that, with this new method, targets can be detected correctly from a wide Doppler spectrum.

  19. On mean wind and turbulence profile measurements from ground-based wind lidars

    DEFF Research Database (Denmark)

    Mikkelsen, Torben

    2009-01-01

    Two types of wind lidars have become available for ground-based vertical mean wind and turbulence profiling: a continuous-wave (CW) wind lidar and a pulsed wind lidar. Although they are both built upon the same recent 1.55 μm telecom fibre technology, they possess fundamental differences in their temporal and spatial resolution capabilities. A literature review of the two lidar systems' spatial and temporal resolution characteristics will be presented, along with the implications for the two lidar types' vertical profile measurements of mean wind and turbulence in the lower atmospheric boundary layer...

  20. Pulsation of IU Per from the Ground-based and ‘Integral’ Photometry

    Directory of Open Access Journals (Sweden)

    Kundra E.

    2013-06-01

    Full Text Available IU Per is an eclipsing semi-detached binary with a pulsating component. Using our own ground-based photometric observations, as well as those from the INTEGRAL satellite, in the B and V passbands, we derived the geometrical and physical parameters of this system. We detected short-term variations of IU Per in the brightness residuals after the subtraction of synthetic light curves. Analysis of these residuals enabled us to characterize and localize the source of the short-term variations as pulsations of the primary component typical of δ Scuti-type stars.

  1. Liquid Structures and Physical Properties -- Ground Based Studies for ISS Experiments

    Science.gov (United States)

    Kelton, K. F.; Bendert, J. C.; Mauro, N. A.

    2012-01-01

    Studies of electrostatically-levitated supercooled liquids have demonstrated strong short- and medium-range ordering in transition metal and alloy liquids, which can influence phase transitions like crystal nucleation and the glass transition. The structure is also related to the liquid properties. Planned ISS experiments will allow a deeper investigation of these results as well as the first investigations of a new type of coupling in crystal nucleation in primary crystallizing liquids, resulting from a linking of the stochastic processes of diffusion with interfacial-attachment. A brief description of the techniques used for ground-based studies and some results relevant to planned ISS investigations are discussed.

  2. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications.

    Science.gov (United States)

    Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi

    2012-04-05

    Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins, whose expression could be confirmed in our cell system. As a result, an accuracy of 40% for the computational predictions was achieved, and we newly identified three incednine-binding proteins. This study revealed that our proposed protocol of predicting target proteins by combining in silico screening and experimental verification is useful, and it provides new insight into a strategy for identifying the target proteins of small molecules.

  3. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications

    Directory of Open Access Journals (Sweden)

    Kobayashi Hiroki

    2012-04-01

    Full Text Available Abstract Background Identification of the target proteins of bioactive compounds is critical for elucidating the mode of action; however, target identification has been difficult in general, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. Results We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins, whose expression could be confirmed in our cell system. As a result, an accuracy of 40% for the computational predictions was achieved, and we newly identified three incednine-binding proteins. Conclusions This study revealed that our proposed protocol of predicting target proteins by combining in silico screening and experimental verification is useful, and it provides new insight into a strategy for identifying the target proteins of small molecules.

  4. A Universal De-Noising Algorithm for Ground-Based LIDAR Signal

    Science.gov (United States)

    Ma, Xin; Xiang, Chengzhi; Gong, Wei

    2016-06-01

    Ground-based lidar, working as an effective remote sensing tool, plays an irreplaceable role in the study of the atmosphere, since it has the ability to provide the atmospheric vertical profile. However, the appearance of noise in a lidar signal is unavoidable, which leads to difficulties and complexities when searching for more information. Every de-noising method has its own characteristics but also certain limitations, since the lidar signal varies as the atmosphere changes. In this paper, a universal de-noising algorithm based on signal segmentation and reconstruction is proposed to enhance the SNR of a ground-based lidar signal. The signal segmentation, serving as the keystone of the algorithm, divides the lidar signal into three different parts, which are processed by different de-noising methods according to their own characteristics. The signal reconstruction is a relatively simple procedure that splices the signal sections end to end. Finally, a series of simulated signal tests and a real dual field-of-view lidar signal show the feasibility of the universal de-noising algorithm.
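
    The segment-and-reconstruct idea can be sketched directly: split the return by range, smooth each part with a method suited to its signal-to-noise ratio, and splice the parts back together. The segment boundaries and filter choices below are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(5)

# Synthetic lidar return with an aerosol layer at 3 km; noise dominates the
# far field. Segment boundaries and filter choices are illustrative only.
r = np.linspace(0.1, 15.0, 1500)                       # range [km]
clean = np.exp(-0.3 * r) + 0.05 * np.exp(-((r - 3.0) / 0.3) ** 2)
noisy = clean + rng.normal(0, 0.01, r.size)

i1, i2 = 400, 1000                                     # segment boundaries
den = np.empty_like(noisy)
den[:i1] = savgol_filter(noisy[:i1], 21, 3)            # near field: light smoothing
den[i1:i2] = savgol_filter(noisy[i1:i2], 61, 2)        # mid range: heavier smoothing
den[i2:] = np.convolve(noisy[i2:], np.ones(101) / 101, mode="same")  # far field

def snr_db(est):
    return 10 * np.log10(np.mean(clean**2) / np.mean((est - clean) ** 2))

print(f"SNR before: {snr_db(noisy):.1f} dB, after: {snr_db(den):.1f} dB")
```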

  5. A hardware-in-the-loop simulation program for ground-based radar

    Science.gov (United States)

    Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna

    2011-06-01

    A radar system created using an embedded computer system needs testing. The way to test an embedded computer system is different from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the outputs of the radar. Often, not all of the building blocks of the radar system are available to test. This requires the engineer to test parts of the radar system using a "black box" approach. A common way to test software code in a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations. The outputs are compared against the values that are expected. This requires the engineer to use valid test scenarios. We will present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows a rapid simulated evaluation of ground-based radar performance in a laboratory environment.

  6. Ground-based VHE γ ray astronomy with air Cherenkov imaging telescopes

    International Nuclear Information System (INIS)

    Mirzoyan, R.

    2000-01-01

    The history of astronomy has been one of scientific discovery immediately following the introduction of new technology. In this report, we briefly review the basic development of the atmospheric air Cherenkov light detection technique, particularly the imaging telescope technique, which in recent years led to the firm establishment of a new branch of experimental astronomy, namely ground-based very high-energy (VHE) γ ray astronomy. Milestones in the technology and in the analysis of the imaging technique will be discussed. The design of the 17 m diameter MAGIC Telescope, currently under construction, is based on the development of new technologies for all its major parts and sets new standards in the performance of ground-based γ detectors. MAGIC is one of the next major steps in the development of the technique, being the first instrument that will allow measurements in the as yet uninvestigated energy gap between 10 and 300 GeV.

  7. Automatic vetting of planet candidates from ground based surveys: Machine learning with NGTS

    Science.gov (United States)

    Armstrong, David J.; Günther, Maximilian N.; McCormac, James; Smith, Alexis M. S.; Bayliss, Daniel; Bouchy, François; Burleigh, Matthew R.; Casewell, Sarah; Eigmüller, Philipp; Gillen, Edward; Goad, Michael R.; Hodgkin, Simon T.; Jenkins, James S.; Louden, Tom; Metrailler, Lionel; Pollacco, Don; Poppenhaeger, Katja; Queloz, Didier; Raynard, Liam; Rauer, Heike; Udry, Stéphane; Walker, Simon R.; Watson, Christopher A.; West, Richard G.; Wheatley, Peter J.

    2018-05-01

    State-of-the-art exoplanet transit surveys are producing ever-increasing quantities of data. Making the best use of this resource, whether in detecting interesting planetary systems or in determining accurate planetary population statistics, requires new automated methods. Here we describe a machine learning algorithm that forms an integral part of the pipeline for the NGTS transit survey, demonstrating the efficacy of machine learning in selecting planetary candidates from multi-night ground-based survey data. Our method uses a combination of random forests and self-organising maps to rank planetary candidates, achieving an AUC score of 97.6% in ranking 12368 injected planets against 27496 false positives in the NGTS data. We build on past examples by using injected transit signals to form a training set, a necessary development for applying similar methods to upcoming surveys. We also make the autovet code used to implement the algorithm publicly accessible. autovet is designed to perform machine-learned vetting of planetary candidates, and can utilise a variety of methods. The apparent robustness of machine learning techniques, whether on space-based or the qualitatively different ground-based data, highlights their importance to future surveys such as TESS and PLATO and the need to better understand their advantages and pitfalls in an exoplanetary context.
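
    The ranking step, training a classifier on injected transits versus false positives and sorting candidates by class probability, can be sketched with a random forest alone (the self-organising-map stage and the real NGTS feature set are omitted here); the data below are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in: injected transits (label 1) vs. false positives (label 0)
# described by a handful of candidate features (depth, duration, SNR, ...).
n_pos, n_neg, n_feat = 2000, 4000, 8
X = np.vstack([rng.normal(0.5, 1.0, (n_pos, n_feat)),
               rng.normal(0.0, 1.0, (n_neg, n_feat))])
y = np.concatenate([np.ones(n_pos), np.zeros(n_neg)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
score = rf.predict_proba(X_te)[:, 1]          # class probability used for ranking
print(f"AUC: {roc_auc_score(y_te, score):.3f}")

ranked = np.argsort(score)[::-1]              # candidates, best-first
```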

  8. The Monitoring Case of Ground-Based Synthetic Aperture Radar with Frequency Modulated Continuous Wave System

    Science.gov (United States)

    Zhang, H. Y.; Zhai, Q. P.; Chen, L.; Liu, Y. J.; Zhou, K. Q.; Wang, Y. S.; Dou, Y. D.

    2017-09-01

    The features of landslide geological disasters are wide distribution, variety, high frequency, and high intensity, and they are destructive. Landslides have become a natural disaster with harmful effects and a wide range of influence. Ground-based synthetic aperture radar is a novel deformation monitoring technology developed in recent years; its features are a large monitoring area, high accuracy, and long-distance operation without contact. In this paper, a fast ground-based synthetic aperture radar (Fast-GBSAR) based on the frequency modulated continuous wave (FMCW) system is used to collect data on the Ma Liuzui landslide in Chongqing. The device can reduce the atmospheric errors caused by a rapidly changing environment. Landslide deformation can be monitored in severe weather conditions (for example, fog) by Fast-GBSAR, with an acquisition speed of up to one measurement every 5 seconds. The data of the Ma Liuzui landslide in Chongqing are analyzed in this paper. The result verifies that the device can monitor landslide deformation under severe weather conditions.

  9. A New Technique to Observe ENSO Activity via Ground-Based GPS Receivers

    Science.gov (United States)

    Suparta, Wayan; Iskandar, Ahmad; Singh, Mandeep Singh Jit

    In an attempt to study the effects of global climate change in the tropics and improve global climate models, this paper aims to detect ENSO events, especially the El Niño phase, by using ground-based GPS receivers. Precipitable water vapor (PWV) obtained from Global Positioning System (GPS) meteorology measurements, together with the sea surface temperature anomaly (SSTa), is used to connect their response to El Niño activity. Data gathered from four selected stations over Southeast Asia, namely PIMO (Philippines), KUAL (Malaysia), NTUS (Singapore) and BAKO (Indonesia), for the year 2009/2010 were processed. A strong correlation was observed for the PIMO station, with a correlation coefficient of -0.90, significant at the 99% confidence level. In general, the relationship between GPS PWV and SSTa at all stations on a weekly basis showed a negative correlation. The negative correlation indicates that during the El Niño event the PWV showed a decreasing trend. The decreasing trend of the PWV is caused by a dry season that affected the GPS signals through ocean-atmosphere coupling. Based on these promising results, we propose that ground-based GPS receivers are capable of monitoring ENSO activity, a prospective method that was previously unexplored.
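
    The per-station statistic is a Pearson correlation between weekly GPS PWV and SSTa; a minimal sketch on synthetic weekly series follows, with the anti-correlation built in by construction.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(9)

# Weekly GPS PWV vs. SST anomaly over one ENSO year (values are synthetic).
weeks = np.arange(52)
ssta = 1.2 * np.sin(2 * np.pi * weeks / 52)            # warm-phase anomaly [K]
pwv = 50.0 - 4.0 * ssta + rng.normal(0, 1.0, 52)       # PWV drops as SSTa rises

r, p = pearsonr(pwv, ssta)
print(f"r = {r:.2f}, p = {p:.1e}")   # strong negative correlation, as at PIMO
```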

  10. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  11. Human Walking Pattern Recognition Based on KPCA and SVM with Ground Reflex Pressure Signal

    Directory of Open Access Journals (Sweden)

    Zhaoqin Peng

    2013-01-01

    Full Text Available Algorithms based on the ground reflex pressure (GRF) signal obtained from a pair of sensing shoes for human walking pattern recognition were investigated. Dimensionality reduction algorithms based on principal component analysis (PCA) and kernel principal component analysis (KPCA) for walking pattern data compression were studied in order to obtain higher recognition speed. Classifiers based on support vector machine (SVM), SVM-PCA, and SVM-KPCA were designed, and the classification performances of these three kinds of algorithms were compared using data collected from a person wearing the sensing shoes. Experimental results showed that the algorithm fusing SVM and KPCA had better recognition performance than the other two methods. Experimental outcomes also confirmed that the sensing shoes developed in this paper can be employed for automatically recognizing human walking patterns in unconstrained environments, demonstrating a potential application in the control of exoskeleton robots.
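
    The SVM-KPCA classifier maps directly onto a scikit-learn pipeline; the sketch below uses synthetic stand-ins for the GRF feature vectors, and the component count and kernel settings are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

# Synthetic stand-in for GRF feature vectors from the sensing shoes:
# 3 walking patterns (e.g. level walking, upstairs, downstairs).
n, d = 300, 24
X = np.vstack([rng.normal(c, 1.0, (n, d)) for c in range(3)])
y = np.repeat(np.arange(3), n)

# KPCA compresses the features before the SVM, as in the SVM-KPCA variant
clf = make_pipeline(StandardScaler(),
                    KernelPCA(n_components=8, kernel="rbf", gamma=0.05),
                    SVC(kernel="rbf", C=10.0))
print(f"5-fold accuracy: {cross_val_score(clf, X, y, cv=5).mean():.2%}")
```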

  12. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  13. Validation of OMI UV measurements against ground-based measurements at a station in Kampala, Uganda

    Science.gov (United States)

    Muyimbwa, Dennis; Dahlback, Arne; Stamnes, Jakob; Hamre, Børge; Frette, Øyvind; Ssenyonga, Taddeo; Chen, Yi-Chun

    2015-04-01

    We present solar ultraviolet (UV) irradiance data measured with a NILU-UV instrument at a ground site in Kampala (0.31°N, 32.58°E), Uganda for the period 2005-2014. The data were analyzed and compared with UV irradiances inferred from the Ozone Monitoring Instrument (OMI) for the same period. Kampala is located on the shores of Lake Victoria, Africa's largest fresh water lake, which may influence the climate and weather conditions of the region. Also, there is extensive use of worn-out cars, which may contribute to a high anthropogenic loading of absorbing aerosols. The OMI surface UV algorithm does not account for absorbing aerosols, which may lead to systematic overestimation of surface UV irradiances inferred from OMI satellite data. We retrieved UV index values from OMI UV irradiances and validated them against the ground-based UV index values obtained from NILU-UV measurements. The UV index values were found to follow a seasonal pattern similar to that of the clouds and the rainfall. OMI-inferred UV index values were overestimated with a mean bias of about 28% under all-sky conditions, but the mean bias was reduced to about 8% under clear-sky conditions when only days with a radiation modification factor (RMF) greater than 65% were considered. However, when days with RMF greater than 70, 75, and 80% were considered, OMI-inferred UV index values were found to agree with the ground-based UV index values to within 5, 3, and 1%, respectively. In the validation we identified clouds/aerosols, which were present in 88% of the measurements, as the main cause of the OMI-inferred overestimation of the UV index.
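
    The headline comparison is a mean relative bias between satellite and ground UV indices under progressively stricter cloud screening; that bookkeeping can be sketched as below on synthetic data, with the cloud-driven overestimation built in by construction.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic daily pairs of satellite and ground UV index, with a cloud-driven
# overestimation that shrinks as the radiation modification factor (RMF) rises.
n = 2000
rmf = rng.uniform(20, 100, n)                       # percent
ground = rng.uniform(2, 14, n)                      # ground-based UV index
omi = ground * (1 + 0.4 * (1 - rmf / 100)) * rng.normal(1, 0.03, n)

for thresh in (0, 65, 70, 75, 80):                  # all-sky, then stricter screening
    sel = rmf > thresh
    bias = np.mean((omi[sel] - ground[sel]) / ground[sel]) * 100
    print(f"RMF > {thresh:2d}%: mean bias {bias:+5.1f}% (n={sel.sum()})")
```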

  14. How ground-based observations can support satellite greenhouse gas retrievals

    Science.gov (United States)

    Butler, J. H.; Tans, P. P.; Sweeney, C.; Dlugokencky, E. J.

    2012-04-01

    Global society will eventually accelerate efforts to reduce greenhouse gas emissions in a variety of ways, likely involving international treaties, national policies, and regional strategies that will affect a number of economic, social, and environmental sectors. Some strategies will work better than others, and some will not work at all. Because trillions of dollars will be involved in pursuing greenhouse gas emission reductions - through realignment of energy production, improvement of efficiencies, institution of taxes, implementation of carbon trading markets, and use of offsets - it is imperative that society be given all the tools at its disposal to ensure the ultimate success of these efforts. Providing independent, globally coherent information on the success of these efforts will lend considerable strength to treaties, policies, and strategies, but doing so will require greenhouse gas observations greatly expanded from what we have today. Satellite measurements may ultimately be indispensable in achieving global coverage, but the requirements for accuracy and continuity of measurements over time are demanding if the data are to be relevant. Issues such as sensor drift, aging electronics, and retrieval artifacts present challenges that can be addressed in part by close coordination with ground-based and in situ systems. This presentation identifies the information that ground-based systems provide very well, examines what would be deficient even in a greatly expanded surface system and where satellites can fill these gaps, and shows how ongoing ground-based and in situ measurements can help address issues of accuracy, long-term continuity, and retrieval artifacts.

  15. Geocenter variations derived from a combined processing of LEO- and ground-based GPS observations

    Science.gov (United States)

    Männel, Benjamin; Rothacher, Markus

    2017-08-01

    GNSS observations provided by the global tracking network of the International GNSS Service (IGS, Dow et al. in J Geod 83(3):191-198, 2009) play an important role in the realization of a unique terrestrial reference frame that is accurate enough to allow detailed monitoring of the Earth's system. Combining these ground-based data with GPS observations tracked by high-quality dual-frequency receivers on board low Earth orbiters (LEOs) is a promising way to further improve the realization of the terrestrial reference frame and the estimation of geocenter coordinates, GPS satellite orbits, and Earth rotation parameters. To assess the scope of the improvement in the geocenter coordinates, we processed a network of 53 globally distributed and stable IGS stations together with four LEOs (GRACE-A, GRACE-B, OSTM/Jason-2, and GOCE) over a time interval of 3 years (2010-2012). To ensure fully consistent solutions, the zero-difference phase observations of the ground stations and LEOs were processed in a common least-squares adjustment, estimating all the relevant parameters: GPS and LEO orbits, station coordinates, Earth rotation parameters, and geocenter motion. We present the significant impact on the geocenter coordinates of each individual LEO and of the combination of all four LEOs. Including a single LEO in the ground-only solution reduces the formal errors by around 20%, while in a solution with all four LEOs the LEO-specific characteristics are significantly reduced. We compare the derived geocenter coordinates with LAGEOS results and with external solutions based on GPS and SLR data, and find good agreement in the amplitudes of all components; however, the phases in the x- and z-directions do not agree well.
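
    To make the idea of a common least-squares adjustment concrete, the following heavily simplified sketch stacks weighted normal equations from two observation groups (ground-based and LEO-borne) for a shared three-parameter geocenter offset. The design matrices, noise levels, and parameterization are invented; a real GNSS adjustment also estimates orbits, clocks, and station coordinates.

    ```python
    # Hypothetical, heavily simplified combined adjustment: both observation
    # groups contribute to the normal equations N x = b for one shared
    # parameter vector x (here just a 3-component geocenter offset).
    import numpy as np

    rng = np.random.default_rng(1)
    x_true = np.array([2.0, -1.0, 4.0])  # mm, invented for illustration

    def simulate(n, sigma):
        A = rng.normal(size=(n, 3))                       # partial derivatives
        l = A @ x_true + rng.normal(scale=sigma, size=n)  # observations
        return A, l, sigma

    groups = [simulate(200, 2.0),   # ground-based observations
              simulate(120, 1.0)]   # LEO observations (assumed less noisy)

    # Stack weighted normal equations across both groups.
    N = np.zeros((3, 3)); b = np.zeros(3)
    for A, l, sigma in groups:
        w = 1.0 / sigma**2
        N += w * A.T @ A
        b += w * A.T @ l

    x_hat = np.linalg.solve(N, b)
    formal_sigma = np.sqrt(np.diag(np.linalg.inv(N)))  # formal errors
    print("estimate:", x_hat.round(2), " formal errors:", formal_sigma.round(3))
    ```

    Rerunning the solve with the LEO group omitted shows larger formal errors, which is the qualitative effect the record reports when a LEO is added to the ground-only solution.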

  16. Exploring the relationship between monitored ground-based and satellite aerosol measurements over the City of Johannesburg

    CSIR Research Space (South Africa)

    Garland, Rebecca M

    2012-09-01

    Full Text Available This project studied the relationship between aerosol optical depth (AOD) from the Multi-angle Imaging SpectroRadiometer (MISR) instrument on the Terra satellite and ground-monitored particulate matter (PM) mass concentrations measured...
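
    A typical first step in such a study is an ordinary least-squares fit of ground PM concentrations against satellite AOD. The sketch below shows that computation on invented stand-in values; it is not the Johannesburg data or the project's actual methodology.

    ```python
    # Hypothetical sketch: relating satellite AOD to ground-level PM by
    # ordinary least squares. All numbers are invented stand-ins.
    import numpy as np

    aod  = np.array([0.12, 0.25, 0.31, 0.18, 0.40, 0.22, 0.35])   # MISR AOD
    pm10 = np.array([28.0, 55.0, 61.0, 39.0, 83.0, 47.0, 70.0])   # ug/m^3

    # Fit pm10 = slope * aod + intercept and report the correlation.
    slope, intercept = np.polyfit(aod, pm10, 1)
    r = np.corrcoef(aod, pm10)[0, 1]
    print(f"PM10 ~ {slope:.1f} * AOD + {intercept:.1f}   (r = {r:.2f})")
    ```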

  17. Information Technology Management: Select Controls for the Information Security of the Ground-Based Midcourse Defense Communications Network

    National Research Council Canada - National Science Library

    Truex, Kathryn M; Lamar, Karen J; Leighton, George A; Woodruff, Courtney E; Brunetti, Tina N; Russell, Dawn M

    2006-01-01

    ... to the Ground-Based Midcourse Defense Communications Network should read this report to reduce the risk of interruption, misuse, modification, and unauthorized access to information in the system...

  18. Ground-Based Global Navigation Satellite System (GNSS) GPS Broadcast Ephemeris Data (daily files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) GPS Broadcast Ephemeris Data (daily files) from the NASA Crustal Dynamics Data Information System (CDDIS).

  19. Ground-Based Global Navigation Satellite System Mixed Broadcast Ephemeris Data (sub-hourly files) from NASA CDDIS

    Data.gov (United States)

    National Aeronautics and Space Administration — This dataset consists of ground-based Global Navigation Satellite System (GNSS) Mixed Broadcast Ephemeris Data (sub-hourly files) from the NASA Crustal Dynamics Data Information System (CDDIS).
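
    To indicate what the broadcast ephemeris files in records 18 and 19 are used for, the sketch below propagates a GPS satellite position from a Keplerian-style element set. It omits the harmonic correction and rate terms of the full GPS interface specification, and the element values are invented, so treat it as a toy illustration rather than a production implementation.

    ```python
    # Hypothetical sketch: simplified satellite position from broadcast-style
    # Keplerian elements (correction terms omitted; values invented).
    import numpy as np

    MU = 3.986005e14           # WGS-84 gravitational parameter, m^3/s^2
    OMEGA_E = 7.2921151467e-5  # Earth rotation rate, rad/s

    def sat_position(sqrt_a, e, i0, omega0, omega, m0, toe, t):
        """ECEF position (m) at GPS time t from simplified elements."""
        a = sqrt_a**2
        n = np.sqrt(MU / a**3)         # mean motion
        M = m0 + n * (t - toe)         # mean anomaly at t

        E = M                          # solve Kepler's equation iteratively
        for _ in range(10):
            E = M + e * np.sin(E)

        nu = np.arctan2(np.sqrt(1 - e**2) * np.sin(E), np.cos(E) - e)
        u = omega + nu                 # argument of latitude
        r = a * (1 - e * np.cos(E))    # orbital radius

        x_orb, y_orb = r * np.cos(u), r * np.sin(u)
        lam = omega0 - OMEGA_E * t     # node longitude, Earth-rotation corrected
        x = x_orb * np.cos(lam) - y_orb * np.cos(i0) * np.sin(lam)
        y = x_orb * np.sin(lam) + y_orb * np.cos(i0) * np.cos(lam)
        z = y_orb * np.sin(i0)
        return np.array([x, y, z])

    # Invented element set, roughly GPS-like (semi-major axis ~26,560 km).
    pos = sat_position(sqrt_a=5153.6, e=0.01, i0=0.96, omega0=1.2,
                       omega=0.5, m0=0.3, toe=0.0, t=3600.0)
    print("ECEF position (km):", (pos / 1e3).round(1))
    ```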

  20. Coordinated Ground-Based Observations and the New Horizons Fly-by of Pluto

    Science.gov (United States)

    Young, Eliot; Young, Leslie; Parker, Joel; Binzel, Richard

    2015-04-01

    The New Horizons (NH) spacecraft is scheduled to make its closest approach to Pluto on July 14, 2015. NH carries seven scientific instruments, including separate UV and visible-IR spectrographs, a long-focal-length imager, two plasma-sensing instruments, and a dust counter. There are three arenas in particular in which ground-based observations should augment the NH instrument suite in synergistic ways: IR spectra at wavelengths longer than 2.5 µm (i.e., longer than the NH Ralph spectrograph reaches), stellar occultation observations near the time of the fly-by, and thermal surface maps and atmospheric CO abundances based on ALMA observations; we discuss the first two of these. IR spectra in the 3-5 µm range cover the CH4 absorption band near 3.3 µm. This band can be an important constraint on the state and areal extent of nitrogen frost on Pluto's surface. If this band depth is close to zero (as was observed by Olkin et al. 2007), it limits the area of nitrogen frost, which is bright at that wavelength. Combined with the NH observations of nitrogen frost at 2.15 µm, the ground-based spectra will determine how much nitrogen frost is diluted with methane, which is a basic constraint on the seasonal cycle of sublimation and condensation that takes place on Pluto (and on similar objects like Triton and Eris). There is a fortuitous stellar occultation by Pluto on 29-JUN-2015, only two weeks before the NH closest approach. The occulted star will be the brightest ever observed in a Pluto event, about 2 magnitudes brighter than Pluto itself. The track of the event is predicted to cover parts of Australia and New Zealand. Thanks to HST and ground-based campaigns to find a TNO target reachable by NH, the position of the shadow path will be known at the ±100 km level, allowing SOFIA and mobile ground-based observers to reliably cover the central flash region. Ground-based and SOFIA observations in visible and IR wavelengths will characterize the haze opacity and vertical