WorldWideScience

Sample records for ground based verification

  1. Dust forecast over North Africa: verification with satellite and ground based observations

    Science.gov (United States)

    Singh, Aditi; Kumar, Sumit; George, John P.

    2016-05-01

    Arid regions of North Africa are considered one of the major dust sources. The present study focuses on forecasts of the aerosol optical depth (AOD) of dust over different regions of North Africa. The NCMRWF Unified Model (NCUM) produces dust AOD forecasts at different wavelengths with lead times up to 240 hr, based on 00 UTC initial conditions. Model forecasts of dust AOD at 550 nm, up to 72 hr lead time and based on different initial conditions, are verified against satellite and ground based observations of total AOD during May-June 2014, under the assumption that aerosol types other than dust are negligible. Location-specific forecasts and the geographical distribution of dust AOD are verified against Aerosol Robotic Network (AERONET) station observations of total and coarse-mode AOD. Moderate Resolution Imaging Spectroradiometer (MODIS) dark target and deep blue merged level 3 total aerosol optical depth (AOD) at 550 nm and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) retrieved dust AOD at 532 nm are also used for verification. CALIOP dust AOD was obtained by vertical integration of the aerosol extinction coefficient at 532 nm from the aerosol profile level 2 products. It is found that at all the selected AERONET stations, the trend in dust AOD is well predicted by NCUM up to three days in advance. Good correlation, with consistently low bias (~ +/-0.06) and RMSE (~ 0.2) values, is found between model forecasts and point measurements of AERONET, except at one location, Cinzana (Mali). Model forecasts consistently overestimated the dust AOD compared to CALIOP dust AOD, with a bias of 0.25 and RMSE of 0.40.
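
    The verification statistics quoted above (bias, RMSE and correlation against AERONET point measurements) reduce to a few lines of array arithmetic. A minimal sketch, assuming paired, co-located forecast/observation series; the sample values below are made up:

    ```python
    import numpy as np

    def verification_scores(forecast, observed):
        """Bias, RMSE and Pearson correlation between paired AOD series."""
        f = np.asarray(forecast, dtype=float)
        o = np.asarray(observed, dtype=float)
        mask = ~(np.isnan(f) | np.isnan(o))      # drop missing retrievals
        f, o = f[mask], o[mask]
        bias = np.mean(f - o)
        rmse = np.sqrt(np.mean((f - o) ** 2))
        corr = np.corrcoef(f, o)[0, 1]
        return bias, rmse, corr

    # Toy example: 72 hr forecast dust AOD vs. AERONET coarse-mode AOD
    forecast = [0.31, 0.42, 0.55, 0.48, 0.60]
    aeronet  = [0.28, 0.40, 0.49, 0.50, 0.52]
    print("bias=%.3f rmse=%.3f r=%.3f" % verification_scores(forecast, aeronet))
    ```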

  2. Ground-based multispectral measurements for airborne data verification in non-operating open pit mine "Kremikovtsi"

    Science.gov (United States)

    Borisova, Denitsa; Nikolov, Hristo; Petkov, Doyno

    2013-10-01

    The impact of the mining industry and metal production on the environment is evident all over the world. In our research we focus on the impact of the now non-operating ferrous "Kremikovtsi" open pit mine and the related waste dumps and tailings, which we consider to be the major factor responsible for pollution of a densely populated region in Bulgaria. The approach adopted is based on correct estimation of the distribution of iron oxides inside the open pit mine and the neighbouring regions, considered in this case to be the key issue for the ecological state assessment of soils, vegetation and water. For this study the foremost source of data is airborne imagery, which, combined with ground-based in-situ and laboratory data, was used for verification of the environmental variables and thus in the assessment of the present environmental status as influenced by previous mining activities. The percentage of iron content was selected as the main indicator of metal pollution, since it can be reliably identified from the multispectral data used in this study and since iron compounds are widespread in most minerals, rocks and soils. In our research the number of samples from every source (air, field, lab) was chosen so as to be statistically sound. In order to establish a relationship between the degree of soil pollution and the multispectral data, 40 soil samples were collected during a field campaign in the study area, together with GPS measurements, for two types of laboratory measurements: first, chemical and mineralogical analysis, and second, non-destructive spectroscopy. In this work, multispectral satellite data from the Landsat TM/ETM+ instruments and from ALI/OLI (Operational Land Imager) were used for verification of environmental variables over large areas. Ground-based (laboratory and in-situ) spectrometric measurements were performed using the designed and constructed in Remote

  3. Ground-based verification and data processing of Yutu rover Active Particle-induced X-ray Spectrometer

    CERN Document Server

    Guo, Dongya; Peng, Wenxi; Cui, Xingzhu; Zhang, Chengmo; Liu, Yaqing; Liang, Xiaohua; Dong, Yifan; Wang, Jinzhou; Gao, Min; Yang, Jiawei; Zhang, Jiayu; Li, Chunlai; Zou, Yongliao; Zhang, Guangliang; Zhang, Liyan; Fu, Xiaohui

    2015-01-01

    The Active Particle-induced X-ray Spectrometer (APXS) is one of the payloads on board the Yutu rover of the Chang'E-3 mission. In order to assess the instrumental performance of APXS, a ground verification test was done for two unknown samples (basaltic rock, mixed powder sample). In this paper, the details of the experiment configurations and data analysis method are presented. The results show that the elemental abundance of major elements can be well determined by the APXS with relative deviations < 15 wt. % (detection distance = 30 mm, acquisition time = 30 min). The derived detection limit of each major element is inversely proportional to acquisition time and directly proportional to detection distance, suggesting that the appropriate distance should be < 50 mm.

  4. Ground-based verification and data processing of Yutu rover Active Particle-induced X-ray Spectrometer

    Institute of Scientific and Technical Information of China (English)

    GUO Dong-Ya; WANG Huan-Yu; PENG Wen-Xi; CUI Xing-Zhu; ZHANG Cheng-Mo; LIU Ya-Qing; LIANG Xiao-Hua

    2015-01-01

    The Active Particle-induced X-ray Spectrometer (APXS) is one of the payloads on board the Yutu rover of the Chang'E-3 mission. In order to assess the instrumental performance of APXS, a ground verification test was performed for two unknown samples (basaltic rock, mixed powder sample). In this paper, the details of the experiment configurations and data analysis method are presented. The results show that the elemental abundance of major elements can be well determined by the APXS with relative deviations <15 wt.% (detection distance = 30 mm, acquisition time = 30 min). The derived detection limit of each major element is inversely proportional to acquisition time and directly proportional to detection distance, suggesting that the appropriate distance should be <50 mm.

  5. Cleanup Verification Package for the 618-2 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    W. S. Thompson

    2006-12-28

    This cleanup verification package documents completion of remedial action for the 618-2 Burial Ground, also referred to as Solid Waste Burial Ground No. 2; Burial Ground No. 2; 318-2; and Dry Waste Burial Site No. 2. This waste site was used primarily for the disposal of contaminated equipment, materials and laboratory waste from the 300 Area Facilities.

  6. Cleanup Verification Package for the 618-8 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    M. J. Appel

    2006-08-10

    This cleanup verification package documents completion of remedial action for the 618-8 Burial Ground, also referred to as the Solid Waste Burial Ground No. 8, 318-8, and the Early Solid Waste Burial Ground. During its period of operation, the 618-8 site is speculated to have been used to bury uranium-contaminated waste derived from fuel manufacturing, and construction debris from the remodeling of the 313 Building.

  7. Cleanup Verification Package for the 118-F-1 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    E. J. Farris and H. M. Sulloway

    2008-01-10

    This cleanup verification package documents completion of remedial action for the 118-F-1 Burial Ground on the Hanford Site. This burial ground is a combination of two locations formerly called Minor Construction Burial Ground No. 2 and Solid Waste Burial Ground No. 2. This waste site received radioactive equipment and other miscellaneous waste from 105-F Reactor operations, including dummy elements and irradiated process tubing; gun barrel tips, steel sleeves, and metal chips removed from the reactor; filter boxes containing reactor graphite chips; and miscellaneous construction solid waste.

  8. Cleanup Verification Package for the 118-F-6 Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    H. M. Sulloway

    2008-10-02

    This cleanup verification package documents completion of remedial action for the 118-F-6 Burial Ground located in the 100-FR-2 Operable Unit of the 100-F Area on the Hanford Site. The trenches received waste from the 100-F Experimental Animal Farm, including animal manure, animal carcasses, laboratory waste, plastic, cardboard, metal, and concrete debris as well as a railroad tank car.

  9. Generic System Verilog Universal Verification Methodology Based Reusable Verification Environment for Efficient Verification of Image Signal Processing IPs/SoCs

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2012-12-01

    In this paper, we present a generic SystemVerilog Universal Verification Methodology based reusable verification environment for efficient verification of image signal processing IPs/SoCs. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to first-silicon success. Deploying methodologies that enforce full functional coverage and verification of corner cases through pseudo-random test scenarios is required, as is standardization of the verification flow. Previously, inside the imaging group of ST, a Specman (e)/Verilog-based verification environment was used for IP/subsystem-level verification and a C/C++/Verilog-based directed verification environment for SoC-level functional verification; different verification environments were used at IP level and SoC level, and different verification/validation methodologies were used for SoC verification across multiple sites. Verification teams were also looking for ways to catch bugs earlier in the design cycle. Thus, a generic SystemVerilog Universal Verification Methodology (UVM) based reusable verification environment is required to avoid the problem of having so many methodologies and to provide a standard unified solution which compiles on all tools.

  10. A Correlation-Based Fingerprint Verification System

    NARCIS (Netherlands)

    Bazen, Asker M.; Verwaaijen, Gerben T.B.; Gerez, Sabih H.; Veelenturf, Leo P.J.; van der Zwaag, Berend Jan

    2000-01-01

    In this paper, a correlation-based fingerprint verification system is presented. Unlike the traditional minutiae-based systems, this system directly uses the richer gray-scale information of the fingerprints. The correlation-based fingerprint verification system first selects appropriate templates i

  11. Active Thermal Control Experiments for LISA Ground Verification Testing

    Science.gov (United States)

    Higuchi, Sei; DeBra, Daniel B.

    2006-11-01

    The primary mission goal of LISA is detecting gravitational waves. LISA uses laser metrology to measure the distance between proof masses in three identical spacecraft. The total acceleration disturbance to each proof mass is required to be below 3 × 10⁻¹⁵ m/s²/√Hz. Optical path length variations on each optical bench must be kept below 40 pm/√Hz over 1 Hz to 0.1 mHz. Thermal variations due to, for example, solar radiation or temperature gradients across the proof mass housing will distort the spacecraft, causing changes in the mass attraction and sensor location. We have developed a thermal control system for the LISA gravitational reference sensor (GRS) ground verification testing which provides thermal stability better than 1 mK/√Hz at low frequencies, as well as thermal control for the LISA spacecraft to compensate for solar irradiation. A thermally stable environment is essential for LISA performance verification. In a lab environment the specifications can be met with a considerable amount of insulation and thermal mass. For spacecraft, the very limited thermal mass calls for an active control system which can meet disturbance rejection and stability requirements simultaneously in the presence of long time delay. A simple proportional plus integral control law presently provides approximately 1 mK/√Hz of thermal stability for over 80 hours. Continuing development of a model predictive feed-forward algorithm will extend performance to below 1 mK/√Hz at f < 1 mHz and lower.

  12. Fingerprint verification based on wavelet subbands

    Science.gov (United States)

    Huang, Ke; Aviyente, Selin

    2004-08-01

    Fingerprint verification has been deployed in a variety of security applications. Traditional minutiae detection based verification algorithms do not utilize the rich discriminatory texture structure of fingerprint images. Furthermore, minutiae detection requires substantial improvement of image quality and is thus error-prone. In this paper, we propose an algorithm for fingerprint verification using the statistics of subbands from wavelet analysis. One important feature for each frequency subband is the distribution of the wavelet coefficients, which can be modeled with a Generalized Gaussian Density (GGD) function. A fingerprint verification algorithm that combines the GGD parameters from different subbands is proposed to match two fingerprints. The verification algorithm in this paper is tested on a set of 1,200 fingerprint images. Experimental results indicate that wavelet analysis provides useful features for the task of fingerprint verification.
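
    As a rough sketch of the subband-statistics idea (not the authors' exact pipeline), the GGD shape and scale can be estimated per subband by moment matching, using PyWavelets for the decomposition; the wavelet choice and level are assumptions:

    ```python
    import numpy as np
    import pywt
    from scipy.special import gamma
    from scipy.optimize import brentq

    def ggd_params(coeffs):
        """Moment-matching estimate of GGD shape (beta) and scale (alpha)."""
        c = np.asarray(coeffs, dtype=float).ravel()
        m1, m2 = np.mean(np.abs(c)), np.mean(c ** 2)
        ratio = m1 / np.sqrt(m2)
        # Solve Gamma(2/b)/sqrt(Gamma(1/b)*Gamma(3/b)) = ratio for shape b
        r = lambda b: gamma(2.0 / b) / np.sqrt(gamma(1.0 / b) * gamma(3.0 / b)) - ratio
        beta = brentq(r, 0.1, 5.0)
        alpha = np.sqrt(m2 * gamma(1.0 / beta) / gamma(3.0 / beta))
        return alpha, beta

    def fingerprint_feature(image, wavelet="db4", level=3):
        """Concatenate (alpha, beta) of every detail subband into one vector."""
        coeffs = pywt.wavedec2(image, wavelet, level=level)
        feats = []
        for detail in coeffs[1:]:                # (cH, cV, cD) per level
            for sub in detail:
                feats.extend(ggd_params(sub))
        return np.array(feats)

    img = np.random.rand(128, 128)               # stand-in for a fingerprint image
    print(fingerprint_feature(img).shape)        # 2 params x 3 subbands x 3 levels
    ```

    Two fingerprints would then be matched by comparing these parameter vectors, e.g. with a weighted distance; the matching rule itself is not specified by the abstract.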

  13. Imaging Ground Motions in the Tokyo Metropolitan Area Based on MeSO-net: Azimuth Verification for Seismometers and Transfer Function Estimation for Site Effects

    Science.gov (United States)

    Kano, M.; Nagao, H.; Shiomi, K.; Sakai, S.; Nakagawa, S.; Mizusako, S.; Hori, M.; Hirata, N.

    2014-12-01

    Prediction of structural motions during large earthquakes is important to prevent secondary disasters. To evaluate such strong motion as accurately as possible, it is essential to infer the image of ground motion in the target area based on densely installed seismological networks. In the Tokyo metropolitan area of Japan, the dense seismological array "MeSO-net" was established in 2007 and has approximately 300 stations at intervals of several kilometers. Mizusako et al. (2014, AGU) apply lasso, a linear regression modeling method using L1 regularization, to the MeSO-net data from the 2011 off the Pacific coast of Tohoku Earthquake to infer spatially high-resolution strong motions in the metropolitan area. Their method succeeds in reproducing the waveforms up to much higher frequency components than previous studies. However, two topics must be dealt with before their approach can be used in practice. The first is that the true azimuths of MeSO-net seismometers installed after 2009 have not been verified, while those installed in 2007 and 2008 were already verified based on cross correlation with nearby tiltmeters of Hi-net and/or seismometers of F-net (Shiomi et al., 2009). Since the azimuths of seismometers obviously affect the data processing, we evaluate them following Shiomi et al. (2009). The second is that we cannot directly obtain ground motion data at the surface, since MeSO-net seismometers are installed at 20 m depth. We have therefore also been developing a method to estimate transfer functions that convert strong motion at 20 m depth to that at the surface, utilizing continuous observations obtained both at the surface and at 20 m depth at two stations, and short-term observations obtained above the boreholes at more than 100 stations. A combination of this vertical transformation method and the horizontal estimation method (Mizusako et al., 2014) enables us to infer an image of ground motions over the whole Tokyo area.
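
    The azimuth check of Shiomi et al. (2009) rests on cross correlation against a trusted nearby sensor. A simplified illustration of that idea, as a plain grid search rather than their exact procedure; the signals and mis-orientation below are synthetic:

    ```python
    import numpy as np

    def estimate_azimuth(x, y, ref_n, step_deg=1.0):
        """Grid-search the sensor azimuth that best aligns the rotated
        horizontal components with a trusted reference north trace."""
        best = (-np.inf, None)
        for az in np.arange(0.0, 360.0, step_deg):
            a = np.deg2rad(az)
            north = np.cos(a) * x - np.sin(a) * y   # rotate into geographic frame
            r = np.corrcoef(north, ref_n)[0, 1]
            if r > best[0]:
                best = (r, az)
        return best[1], best[0]                      # azimuth [deg], correlation

    # Synthetic check: a 40-degree mis-orientation should be recovered
    t = np.linspace(0, 60, 6000)
    n = np.sin(2 * np.pi * 0.20 * t)                 # true north ground motion
    e = np.sin(2 * np.pi * 0.13 * t + 1.0)           # true east ground motion
    az = np.deg2rad(40.0)                            # unknown sensor azimuth
    x = np.cos(az) * n + np.sin(az) * e              # recorded components
    y = -np.sin(az) * n + np.cos(az) * e
    print(estimate_azimuth(x, y, n))                 # ~ (40.0, ~1.0)
    ```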

  14. Cleanup Verification Package for the 118-B-6, 108-B Solid Waste Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    M. L. Proctor

    2006-06-13

    This cleanup verification package documents completion of remedial action for the 118-B-6, 108-B Solid Waste Burial Ground. The 118-B-6 site consisted of 2 concrete pipes buried vertically in the ground and capped by a concrete pad with steel lids. The site was used for the disposal of wastes from the "metal line" of the P-10 Tritium Separation Project.

  15. Cleanup Verification Package for the 118-C-1, 105-C Solid Waste Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    M. J. Appel and J. M. Capron

    2007-07-25

    This cleanup verification package documents completion of remedial action for the 118-C-1, 105-C Solid Waste Burial Ground. This waste site was the primary burial ground for general wastes from the operation of the 105-C Reactor and received process tubes, aluminum fuel spacers, control rods, reactor hardware, spent nuclear fuel and soft wastes.

  16. VERIFICATION OF PARALLEL AUTOMATA-BASED PROGRAMS

    Directory of Open Access Journals (Sweden)

    M. A. Lukin

    2014-01-01

    The paper deals with an interactive method of automatic verification for parallel automata-based programs. The hierarchical state machines can be implemented in different threads and can interact with each other. Verification is done by means of the Spin tool and includes automatic Promela model construction, conversion of LTL formulae to Spin format, and presentation of counterexamples in terms of automata. Interactive verification makes it possible to decrease verification time and increase the maximum size of verifiable programs. The method supports verification of parallel systems of hierarchical automata that interact with each other through messages and shared variables. A feature of the automaton model is that each state machine is considered a new data type and can have an arbitrary bounded number of instances. Each state machine in the system can run another state machine in a new thread or contain a nested state machine. This method was implemented in the developed Stater tool. Stater shows correct operation for all test cases.

  17. Consent Based Verification System (CBSV)

    Data.gov (United States)

    Social Security Administration — CBSV is a fee-based service offered by SSA's Business Services Online (BSO). It is used by private companies to verify the SSNs of their customers and clients that...

  18. Fingerprint Verification based on Gabor Filter Enhancement

    CERN Document Server

    Lavanya, B N; Venugopal, K R

    2009-01-01

    Human fingerprints are reliable characteristics for personal identification, as they are unique and persistent. A fingerprint pattern consists of ridges, valleys and minutiae. In this paper we propose the Fingerprint Verification based on Gabor Filter Enhancement (FVGFE) algorithm for minutiae feature extraction and post-processing based on a 9-pixel neighborhood. Global feature extraction and fingerprint enhancement are based on the Hong enhancement method, which is simultaneously able to extract local ridge orientation and ridge frequency. It is observed that the sensitivity and specificity values are better than those of existing algorithms.

  19. Experimental verification of a computational technique for determining ground reactions in human bipedal stance.

    Science.gov (United States)

    Audu, Musa L; Kirsch, Robert F; Triolo, Ronald J

    2007-01-01

    We have developed a three-dimensional (3D) biomechanical model of human standing that enables us to study the mechanisms of posture and balance simultaneously in various directions in space. Since the two feet are on the ground, the system forms a kinematically closed chain, which has redundancy problems that cannot be resolved using the laws of mechanics alone. We have developed a computational (optimization) technique that avoids the problems of the closed-chain formulation, giving users of such models the ability to predict joint moments, and potentially muscle activations using more sophisticated musculoskeletal models. This paper describes the experimental verification of the computational technique that estimates the ground reaction vector acting on an unconstrained foot while the other foot is attached to the ground, thus allowing human bipedal standing to be analyzed as an open-chain system. The computational approach was verified in terms of its ability to predict lower extremity joint moments derived from inverse dynamic simulations performed on data acquired from four able-bodied volunteers standing in various postures on force platforms. Sensitivity analyses performed with model simulations indicated which ground reaction force (GRF) and center of pressure (COP) components were most critical for providing better estimates of the joint moments. Overall, the joint moments predicted by the optimization approach are strongly correlated with the joint moments computed using the experimentally measured GRF and COP (correlations of at least 0.78, with near-unity slope between experimental and computational results) for the postures of the four subjects examined. These results indicate that this model-based technique can be relied upon to predict reasonable and consistent estimates of the joint moments using the predicted GRF and COP for most standing postures.
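
    The core of the closed-chain redundancy problem can be illustrated with a toy static-equilibrium system: six equilibrium equations cannot fix the twelve unknown foot-wrench components, so some optimality criterion must pick a solution. The sketch below uses a minimum-norm pseudoinverse as a stand-in for the paper's optimization; the mass and geometry are invented:

    ```python
    import numpy as np

    def skew(r):
        """Cross-product matrix so that skew(r) @ f == np.cross(r, f)."""
        return np.array([[0, -r[2], r[1]],
                         [r[2], 0, -r[0]],
                         [-r[1], r[0], 0]])

    # Unknowns: ground reaction force and moment at each foot (12 values).
    # Static equilibrium about the CoM gives only 6 equations, so the
    # system is redundant; the pseudoinverse returns the minimum-norm
    # solution as a simple stand-in for a constrained optimization.
    m, g = 70.0, 9.81
    r1 = np.array([-0.12, 0.10, -0.9])   # left foot relative to CoM [m]
    r2 = np.array([ 0.12, -0.05, -0.9])  # right foot relative to CoM [m]

    A = np.zeros((6, 12))
    A[0:3, 0:3] = np.eye(3);  A[0:3, 6:9]  = np.eye(3)     # sum of forces
    A[3:6, 0:3] = skew(r1);   A[3:6, 3:6]  = np.eye(3)     # sum of moments
    A[3:6, 6:9] = skew(r2);   A[3:6, 9:12] = np.eye(3)

    b = np.concatenate([[0.0, 0.0, m * g], np.zeros(3)])   # support body weight
    w = np.linalg.pinv(A) @ b                              # min-norm wrenches
    print("left-foot force :", w[0:3])
    print("right-foot force:", w[6:9])
    ```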

  20. Verification in referral-based crowdsourcing.

    Directory of Open Access Journals (Sweden)

    Victor Naroditskiy

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.

  1. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, the verification of their knowledge bases takes on an important position. The conventional Petri net approach, studied recently for knowledge base verification, is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  2. Cleanup Verification Package for the 118-F-3, Minor Construction Burial Ground

    Energy Technology Data Exchange (ETDEWEB)

    M. J. Appel

    2007-01-04

    This cleanup verification package documents completion of remedial action for the 118-F-3, Minor Construction Burial Ground waste site. This site was an open field covered with cobbles, with no vegetation growing on the surface. The site received irradiated reactor parts that were removed during conversion of the 105-F Reactor safety systems from the Liquid 3X to the Ball 3X Project, mostly vertical safety rod thimbles and step plugs.

  3. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  4. Development of optical ground verification method for μm to sub-mm reflectors

    Science.gov (United States)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2004-06-01

    The objective of this work is to develop and realise suitable verification tools based on infrared interferometry and other optical techniques for testing large reflector structures, telescope configurations and their performance under simulated space conditions. The first tool is an IR phase-shifting interferometer with high spatial resolution. This interferometer shall be used specifically for the verification of high-precision IR, FIR and sub-mm reflector surfaces and telescopes under both ambient and thermal-vacuum conditions. The second, presented hereafter, is a holographic method for relative shape measurement. The proposed holographic solution makes use of a home-built, vacuum-compatible holographic camera that allows displacement measurements from typically 20 nanometres to 25 microns in one shot. An iterative process allows the measurement of a total of up to several mm of deformation. Uniquely, the system is designed to measure both specular and diffuse surfaces.

  5. On-ground electrical performance verification strategies for large deployable reflector antennas

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Breinbjerg, Olav;

    2012-01-01

    In this paper, possible verification strategies for large deployable reflector antennas are reviewed and analysed. One of the approaches considered to be the most feasible and promising is based on measurements of the feed characteristics, such as pattern and gain, and then calculation of the ove...

  6. CHANG'E-3 Active Particle-induced X-ray Spectrometer: ground verification test

    Science.gov (United States)

    Guo, Dongya; Peng, Wenxi; Cui, XingZhu; Wang, Huanyu

    The Active Particle-induced X-ray Spectrometer (APXS) is one of the payloads of the Chang'E-3 rover Yutu, with which the major elemental composition of lunar soils and rocks can be measured on site. In order to assess the instrument performance and the accuracy of determination, a ground verification test was carried out with two blind samples (basaltic rock, powder). Details of the experiments and data analysis method are discussed. The results show that the accuracy of quantitative analysis for major elements (Mg, Al, Si, K, Ca, Ti, Fe) is better than 15%.

  7. Base isolation system and verificational experiment of base isolated building

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Mikio; Harada, Osamu; Aoyagi, Sakae; Matsuda, Taiji

    1987-05-15

    With the objective of rationalizing earthquake-resistant design and the economical design based thereupon, many base isolation systems have been proposed, and their research, development and application have advanced in recent years. In order to disseminate such systems, it is necessary to accumulate data from vibration tests and earthquake observations and to verify the reliability of the systems. From this viewpoint, the Central Research Institute of Electric Power Industry and Okumura Corporation performed the following experiments on a base-isolated building: 1) static force application experiments, 2) shaking experiments, 3) free vibration experiments, 4) regular slight-vibration observations and 5) earthquake response observations (continuing). This article reports the outline of the base isolation system and the base-isolated building concerned, as well as the results of verification experiments 1) through 3). From these experiments, the basic vibration characteristics of the base isolation system, consisting of laminated rubber and a plastic damper, were revealed and its functions verified. In particular, during the free vibration experiments an initial displacement of up to 10 cm was applied between the foundation and the structure; this displacement corresponds to the response amplitude expected for an earthquake of seismic intensity 6. It is planned to continue the verification further. (18 figs, 3 tabs, 3 photos, 6 refs)
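
    Free vibration experiments of this kind are typically reduced to a natural frequency and damping ratio via the logarithmic decrement. A generic sketch of that reduction (not the authors' procedure; the 0.5 Hz mode and 15% damping are invented values typical of base-isolated buildings):

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def damping_from_free_vibration(x, dt):
        """Natural frequency and damping ratio from a free-vibration decay
        record, via peak spacing and the logarithmic decrement."""
        peaks, _ = find_peaks(x)
        fn = 1.0 / np.mean(np.diff(peaks) * dt)                 # frequency [Hz]
        delta = np.mean(np.log(x[peaks][:-1] / x[peaks][1:]))   # log decrement
        zeta = delta / np.sqrt(4 * np.pi ** 2 + delta ** 2)     # damping ratio
        return fn, zeta

    # Synthetic decay: 0.5 Hz isolated-building mode with 15 % damping
    dt = 0.01
    t = np.arange(0, 30, dt)
    wn, z = 2 * np.pi * 0.5, 0.15
    x = np.exp(-z * wn * t) * np.cos(wn * np.sqrt(1 - z ** 2) * t)
    print(damping_from_free_vibration(x, dt))                   # ~ (0.5, 0.15)
    ```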

  8. High-stability temperature control for ST-7/LISA Pathfinder gravitational reference sensor ground verification testing

    Science.gov (United States)

    Higuchi, S.; Allen, G.; Bencze, W.; Byer, R.; Dang, A.; DeBra, D. B.; Lauben, D.; Dorlybounxou, S.; Hanson, J.; Ho, L.; Huffman, G.; Sabur, F.; Sun, K.; Tavernetti, R.; Rolih, L.; Van Patten, R.; Wallace, J.; Williams, S.

    2006-03-01

    This article demonstrates experimental results of a thermal control system developed for ST-7 gravitational reference sensor (GRS) ground verification testing, which provides the thermal stability δT required for the test, and of thermal control of the LISA spacecraft to compensate for solar irradiance 1/f fluctuations. Although for ground testing these specifications can be met fairly readily with sufficient insulation and thermal mass, for spacecraft the very limited thermal mass calls for an active control system which can simultaneously meet disturbance rejection and stability requirements in the presence of long time delay; a considerable design challenge. Simple control laws presently provide ~1 mK/√Hz for >24 hours. Continuing development of a model predictive feedforward control algorithm will extend performance to <1 mK/√Hz at f < 0.01 mHz and possibly lower, extending LISA coverage of supermassive black hole mergers.

  9. On-Ground Processing of Yaogan-24 Remote Sensing Satellite Attitude Data and Verification Using Geometric Field Calibration.

    Science.gov (United States)

    Wang, Mi; Fan, Chengcheng; Yang, Bo; Jin, Shuying; Pan, Jun

    2016-07-30

    Satellite attitude accuracy is an important factor affecting the geometric processing accuracy of high-resolution optical satellite imagery. To address the problem that the accuracy of the Yaogan-24 remote sensing satellite's on-board attitude data processing is not high enough to meet its image geometry processing requirements, we developed an approach involving on-ground attitude data processing and verification against the digital orthophoto (DOM) and digital elevation model (DEM) of a geometric calibration field. The approach comprises three modules: on-ground processing based on a bidirectional filter, overall weighted smoothing and fitting, and evaluation in the geometric calibration field. Our experimental results demonstrate that the proposed on-ground processing method is both robust and feasible, which ensures the quality of the observation data and the convergence and stability of the parameter estimation model. In addition, both Euler angles and quaternions can be used to build the mathematical fitting model, while an orthogonal polynomial fitting model is more suitable for modeling the attitude parameters. Furthermore, compared to image geometric processing based on the on-board attitude data, the accuracy of uncontrolled and relative geometric positioning of the imagery can be increased by about 50%.
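
    The orthogonal-polynomial fitting the authors favour can be illustrated with a Legendre fit of a noisy attitude-angle series; the degree, noise level and signal below are assumptions, not values from the paper:

    ```python
    import numpy as np
    from numpy.polynomial import legendre

    # Smooth a noisy attitude-angle series with an orthogonal (Legendre)
    # polynomial fit; orthogonal bases keep the normal equations well
    # conditioned compared with raw power-series fits of the same degree.
    t = np.linspace(0.0, 10.0, 500)                    # time [s]
    truth = 1e-4 * t + 5e-5 * np.sin(0.8 * t)          # slowly varying angle [rad]
    meas = truth + 2e-6 * np.random.randn(t.size)      # star-sensor-like noise

    coef = legendre.legfit(t, meas, deg=8)             # orthogonal-basis fit
    smooth = legendre.legval(t, coef)
    print("RMS residual [rad]:", np.std(meas - smooth))
    ```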

  10. Finite Countermodel Based Verification for Program Transformation (A Case Study)

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers how the finite-countermodel method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of them.

  11. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    This work presents an efficient solution using a computer algebra system to perform linear temporal properties verification for synchronous digital systems. The method is essentially based on Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations using Groebner bases. The computational experience reported in this work shows that the algebraic approach is a quite competitive checking method and will be a useful supplement to existing verification methods based on simulation.
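
    The ideal-membership idea behind Groebner-basis checking can be sketched in a few lines with SymPy: encode the gate equations and boolean constraints as polynomials and reduce the assertion polynomial modulo a Groebner basis; a zero remainder means the assertion holds. The tiny XOR circuit here is an invented example, not one from the paper:

    ```python
    from sympy import symbols, groebner

    x, y, a, b, o = symbols("x y a b o")

    # Gate-level implementation of XOR plus boolean constraints, all
    # written as polynomials that vanish on every consistent assignment.
    circuit = [
        a - x * (1 - y),          # a = x AND (NOT y)
        b - y * (1 - x),          # b = y AND (NOT x)
        o - (a + b - a * b),      # o = a OR b
        x**2 - x, y**2 - y,       # x, y are boolean
    ]
    G = groebner(circuit, a, b, o, x, y, order="lex")

    spec = o - (x + y - 2 * x * y)      # assertion: o equals XOR(x, y)
    _, remainder = G.reduce(spec)       # divide spec by the Groebner basis
    print("assertion holds:", remainder == 0)
    ```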

  12. Optical secure image verification system based on ghost imaging

    Science.gov (United States)

    Wu, Jingjing; Haobogedewude, Buyinggaridi; Liu, Zhengjun; Liu, Shutian

    2017-09-01

    Ghost imaging can perform Fourier-space filtering by tailoring the configuration. We propose a novel optical secure image verification system based on this property, with the help of phase-matched filtering. In the verification process, the system key and the ID card, which contain the information of the correct image and the information to be verified, are put in the reference and the test paths, respectively. We demonstrate that the ghost imaging configuration can perform an incoherent correlation between the system key and the ID card. A correct verification manifests itself as a correlation peak in the ghost image. The primary image and the image to be verified are encrypted and encoded into pure phase masks beforehand for security. Multi-image secure verification can also be implemented in the proposed system.
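
    The decision rule described (a correlation peak signalling a match) can be emulated numerically with an ordinary FFT-based incoherent correlation; this sketch ignores the optical encryption and phase-mask encoding entirely:

    ```python
    import numpy as np

    def correlation_peak(key, probe):
        """FFT-based incoherent cross-correlation; a sharp peak indicates
        that the probe matches the stored key."""
        k = key - key.mean()                      # remove DC so the peak
        p = probe - probe.mean()                  # reflects structure only
        corr = np.abs(np.fft.ifft2(np.fft.fft2(k) * np.conj(np.fft.fft2(p))))
        return corr.max() / corr.mean()           # peak-to-mean ratio

    rng = np.random.default_rng(0)
    key = rng.random((64, 64))
    imposter = rng.random((64, 64))
    print(correlation_peak(key, key))             # large ratio -> verified
    print(correlation_peak(key, imposter))        # small ratio -> rejected
    ```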

  13. Sensor-fusion-based biometric identity verification

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W. [Sandia National Labs., Albuquerque, NM (United States); Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L. [New Mexico State Univ., Las Cruces, NM (United States). Electronic Vision Research Lab.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm.

  14. Simulation environment based on the Universal Verification Methodology

    Science.gov (United States)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification cycle.
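
    UVM itself is SystemVerilog, but the same CDV ingredients (constrained-random stimulus, a self-checking scoreboard, functional-coverage bins) can be sketched in Python with cocotb, a co-simulation framework substituted here purely for illustration; the DUT port names (a, b, q) are assumptions:

    ```python
    import random
    import cocotb
    from cocotb.triggers import Timer

    coverage = set()   # crude functional-coverage bins

    @cocotb.test()
    async def adder_random_test(dut):
        """Constrained-random stimulus with a self-checking scoreboard for a
        hypothetical combinational adder DUT with ports a, b and q."""
        for _ in range(200):
            a, b = random.randint(0, 15), random.randint(0, 15)
            dut.a.value = a
            dut.b.value = b
            await Timer(2, units="ns")       # let combinational logic settle
            assert int(dut.q.value) == a + b, f"scoreboard mismatch: {a}+{b}"
            coverage.add((a == 0, b == 0, a + b > 15))   # corner-case bins
        assert len(coverage) >= 4, "functional-coverage goal not met"
    ```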

  15. Verification strategies for fluid-based plasma simulation models

    Science.gov (United States)

    Mahadevan, Shankar

    2012-10-01

    Verification is an essential aspect of computational code development for models based on partial differential equations. However, verification of plasma models is often conducted internally by the authors of these programs and not openly discussed. Several professional research bodies, including the IEEE, AIAA, ASME and others, have formulated standards for verification and validation (V&V) of computational software. This work focuses on verification, defined succinctly as determining whether the mathematical model is solved correctly. As plasma fluid models share several aspects with the Navier-Stokes equations used in Computational Fluid Dynamics (CFD), the CFD verification process is used as a guide. The steps in the verification process (consistency checks; examination of iterative, spatial and temporal convergence; and comparison with exact solutions) are described with examples from plasma modeling. The Method of Manufactured Solutions (MMS), which has been used to verify complex systems of PDEs in solid and fluid mechanics, is introduced, and an example of its application to a self-consistent plasma fluid model using the local mean energy approximation is presented. The strengths and weaknesses of the techniques presented in this work are discussed.
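
    MMS is mechanical enough to sketch with a symbolic algebra package: choose a manufactured solution, push it through the PDE operator, and feed the resulting source term to the code under test. A one-dimensional drift-diffusion example; the equation form is a generic stand-in, not the paper's plasma model:

    ```python
    import sympy as sp

    # Method of Manufactured Solutions for du/dt + d(v*u)/dx - D*d2u/dx2 = S:
    # pick a smooth "manufactured" u, then derive the source S that makes it
    # an exact solution. The code under test must reproduce u when given S.
    x, t = sp.symbols("x t")
    v, D = sp.symbols("v D", positive=True)

    u_m = sp.exp(-t) * sp.sin(sp.pi * x)               # manufactured solution
    S = sp.diff(u_m, t) + sp.diff(v * u_m, x) - D * sp.diff(u_m, x, 2)
    print(sp.simplify(S))
    # -> exp(-t)*(pi*v*cos(pi*x) + (pi**2*D - 1)*sin(pi*x))
    ```

    Observed convergence of the numerical error against u_m under grid refinement then verifies the discretization order.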

  16. History-Based Verification of Functional Behaviour of Concurrent Programs

    NARCIS (Netherlands)

    Blom, Stefan; Huisman, Marieke; Zaharieva, M.; Calinescu, Radu; Rumpe, Bernhard

    2015-01-01

    We extend permission-based separation logic with a history-based mechanism to simplify the verification of functional properties in concurrent programs. This allows one to specify the local behaviour of a method intuitively in terms of actions added to a local history; local histories can be combine

  17. Palmprint Based Verification System Using SURF Features

    Science.gov (United States)

    Srinivas, Badrinath G.; Gupta, Phalguni

    This paper describes the design and development of a prototype robust biometric verification system. The system uses features of the human hand extracted with the Speeded Up Robust Features (SURF) operator. The hand image is acquired using a low-cost scanner. The extracted palmprint region is robust to hand translation and rotation on the scanner. The system is tested on the IITK database of 200 images and the PolyU database of 7,751 images and is found to be robust with respect to translation and rotation. It achieves an FAR of 0.02%, an FRR of 0.01% and an accuracy of 99.98%, and can be a suitable system for civilian applications and high-security environments.

  18. Biometric Subject Verification Based on Electrocardiographic Signals

    Science.gov (United States)

    Dusan, Sorin V. (Inventor); Jorgensen, Charles C. (Inventor)

    2014-01-01

    A method of authenticating or declining to authenticate an asserted identity of a candidate-person. In an enrollment phase, a reference PQRST heart action graph is provided or constructed from information obtained from a plurality of graphs that resemble each other for a known reference person, using a first graph comparison metric. In a verification phase, a candidate-person asserts his/her identity and presents a plurality of his/her heart cycle graphs. If a sufficient number of the candidate-person's measured graphs resemble each other, a representative composite graph is constructed from the candidate-person's graphs and is compared with a composite reference graph, for the person whose identity is asserted, using a second graph comparison metric. When the second metric value lies in a selected range, the candidate-person's assertion of identity is accepted.
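
    The enrollment/verification flow described (average mutually resembling heart-cycle graphs into a composite, then compare composites by a graph metric) can be sketched with correlation standing in for both metrics; the thresholds and toy PQRST-like template are assumptions:

    ```python
    import numpy as np

    def composite_graph(beats, resemblance=0.9):
        """Average the mutually resembling heart-cycle graphs (first metric:
        correlation against the running mean) into one composite."""
        beats = np.asarray(beats, dtype=float)
        ref = beats.mean(axis=0)
        keep = [b for b in beats if np.corrcoef(b, ref)[0, 1] > resemblance]
        return np.mean(keep, axis=0)

    def verify(candidate_beats, enrolled_composite, threshold=0.95):
        """Second metric: correlate the candidate composite with the
        enrolled reference composite; accept when above threshold."""
        cand = composite_graph(candidate_beats)
        score = np.corrcoef(cand, enrolled_composite)[0, 1]
        return score >= threshold, score

    # Toy data: noisy copies of one PQRST-like template per person
    t = np.linspace(0, 1, 250)
    pqrst = np.exp(-((t - 0.40) / 0.02) ** 2) - 0.2 * np.exp(-((t - 0.55) / 0.05) ** 2)
    enrolled = composite_graph([pqrst + 0.05 * np.random.randn(t.size) for _ in range(8)])
    print(verify([pqrst + 0.05 * np.random.randn(t.size) for _ in range(8)], enrolled))
    ```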

  19. Ground based materials science experiments

    Science.gov (United States)

    Meyer, M. B.; Johnston, J. C.; Glasgow, T. K.

    1988-01-01

    The facilities at the Microgravity Materials Science Laboratory (MMSL) at the Lewis Research Center, created to offer immediate and low-cost access to ground-based testing facilities for industrial, academic, and government researchers, are described. The equipment in the MMSL falls into three categories: (1) devices which emulate some aspect of low gravitational forces, (2) specialized capabilities for 1-g development and refinement of microgravity experiments, and (3) functional duplicates of flight hardware. Equipment diagrams are included.

  1. The Potential of Agent-Based Modelling for Verification of People Trajectories Based on Smartphone Sensor Data

    OpenAIRE

    Hillen, Florian; Höfle, Bernd; Ehlers, Manfred; Reinartz, Peter

    2013-01-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a se...

  2. A Ground-Based Validation System of Teleoperation for a Space Robot

    OpenAIRE

    Xueqian Wang; Houde Liu; Wenfu Xu; Bin Liang; Yingchun Zhang

    2012-01-01

    Teleoperation of space robots is very important for future on-orbit service. In order to assure the task is accomplished successfully, ground experiments are required to verify the function and validity of the teleoperation system before a space robot is launched. In this paper, a ground-based validation subsystem is developed as a part of a teleoperation system. The subsystem is mainly composed of four parts: the input verification module, the onboard verification module, the dynamic and ima...

  3. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond; Bazen, Asker; Kauffman, Joost; Hartel, Pieter

    2004-01-01

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 × 44 piezoresistive elements is used to measure the grip pattern. An interface has been d

  4. Biometric verification based on grip-pattern recognition

    NARCIS (Netherlands)

    Veldhuis, Raymond N.J.; Bazen, A.M.; Kauffman, J.A.; Hartel, Pieter H.; Delp, Edward J.; Wong, Ping W.

    This paper describes the design, implementation and evaluation of a user-verification system for a smart gun, which is based on grip-pattern recognition. An existing pressure sensor consisting of an array of 44 x 44 piezoresistive elements is used to measure the grip pattern. An interface has been

  5. Dynamic Frames Based Verification Method for Concurrent Java Programs

    NARCIS (Netherlands)

    Mostowski, Wojciech

    2016-01-01

    In this paper we discuss a verification method for concurrent Java programs based on the concept of dynamic frames. We build on our earlier work that proposes a new, symbolic permission system for concurrent reasoning and we provide the following new contributions. First, we describe our approach

  6. Protocol-Based Verification of Message-Passing Parallel Programs

    DEFF Research Database (Denmark)

    López-Acosta, Hugo-Andrés; Marques, Eduardo R. B.; Martins, Francisco;

    2015-01-01

    This work presents a protocol language based on a dependent type system for message-passing parallel programs, which includes various communication operators, such as point-to-point messages, broadcast, reduce, array scatter and gather. For the verification of a program against a given protocol, the protocol is first...

  7. Pixel Based Off-line Signature Verification System

    Directory of Open Access Journals (Sweden)

    Anik Barua

    2015-01-01

    The verification of handwritten signatures is one of the oldest and most popular authentication methods around the world. As technology improved, ways of comparing and analyzing signatures became more and more sophisticated. Since the early seventies, people have been exploring how computers can fully take over the task of signature verification and have tried different methods. However, none of them is fully satisfactory, and many are time-consuming. Our proposed pixel-based offline signature verification system is therefore one of the fastest and easiest ways we have found to authenticate a handwritten signature. For signature acquisition, we have used a scanner. We divide the signature image into a 2D array and calculate the hexadecimal RGB value of each pixel, and from these we calculate the total percentage of matching. If the percentage of matching is more than 90%, the signature is considered valid; otherwise it is invalid. We have experimented on more than 35 signatures, and the results of our experiments are quite impressive. We have made the whole system web-based so that a signature can be verified from anywhere. The average execution time for signature verification is only 0.00003545 seconds.
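
    A minimal sketch of the described pixel-matching rule, assuming both signatures are scanned images scaled to a common grid; the tolerance and file names are hypothetical, while the 90% acceptance threshold is the paper's:

    ```python
    import numpy as np
    from PIL import Image

    def match_percentage(path_a, path_b, size=(200, 100), tol=32):
        """Percentage of pixels whose RGB values agree within a tolerance,
        after scaling both signature scans to a common grid (the tolerance
        is an assumption; the paper compares hexadecimal RGB values)."""
        a = np.asarray(Image.open(path_a).convert("RGB").resize(size), dtype=int)
        b = np.asarray(Image.open(path_b).convert("RGB").resize(size), dtype=int)
        close = np.all(np.abs(a - b) <= tol, axis=-1)   # per-pixel RGB check
        return 100.0 * close.mean()

    def is_valid(path_a, path_b, threshold=90.0):
        """Apply the paper's 90% acceptance rule."""
        return match_percentage(path_a, path_b) > threshold

    # usage sketch (file names hypothetical):
    # print(is_valid("enrolled_signature.png", "submitted_signature.png"))
    ```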

  8. Text-Independent Speaker Verification Based on Information Theoretic Learning

    Directory of Open Access Journals (Sweden)

    Sheeraz Memon

    2011-07-01

    In this paper VQ (Vector Quantization) based on information-theoretic learning is investigated for the task of text-independent speaker verification. A novel VQ method based on IT (Information Theoretic) principles is used for the task of speaker verification and compared with two classical VQ approaches: the K-means algorithm and the LBG (Linde-Buzo-Gray) algorithm. The paper provides a theoretical background of the vector quantization techniques, followed by experimental results illustrating their performance. The results demonstrate that ITVQ (Information Theoretic Vector Quantization) provided the best performance in terms of classification rates, EER (Equal Error Rate) and MSE (Mean Squared Error) compared to the K-means and LBG algorithms. The outstanding performance of the ITVQ algorithm can be attributed to the fact that the IT criteria used by this algorithm provide superior matching between the distribution of the original data vectors and the codewords.
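
    The classical VQ baseline the paper compares against is easy to sketch: train a codebook per speaker, then score a test utterance by its quantization distortion. Plain K-means on stand-in feature vectors (real systems would use MFCC frames; the acceptance threshold is invented):

    ```python
    import numpy as np

    def train_codebook(features, k=16, iters=50, seed=0):
        """Plain K-means codebook, the classical VQ baseline."""
        rng = np.random.default_rng(seed)
        code = features[rng.choice(len(features), k, replace=False)]
        for _ in range(iters):
            d = np.linalg.norm(features[:, None] - code[None], axis=2)
            nearest = d.argmin(axis=1)
            for j in range(k):
                if np.any(nearest == j):
                    code[j] = features[nearest == j].mean(axis=0)
        return code

    def distortion(features, code):
        """Average quantization error of an utterance against a codebook."""
        d = np.linalg.norm(features[:, None] - code[None], axis=2)
        return d.min(axis=1).mean()

    # Accept if the claimed speaker's codebook fits the test utterance
    # clearly better than a background codebook (threshold assumed).
    rng = np.random.default_rng(1)
    speaker = rng.normal(0.0, 1.0, (500, 12))      # stand-in for MFCC frames
    background = rng.normal(0.5, 1.5, (500, 12))
    cb_spk, cb_bg = train_codebook(speaker), train_codebook(background)
    test = rng.normal(0.0, 1.0, (200, 12))
    score = distortion(test, cb_bg) - distortion(test, cb_spk)
    print("accept" if score > 0.1 else "reject", score)
    ```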

  9. Nonminutiae-Based Decision-Level Fusion for Fingerprint Verification

    OpenAIRE

    Helfroush Sadegh; Ghassemian Hassan

    2007-01-01

    Most of the methods proposed for fingerprint verification are based on local visible features called minutiae. However, due to the difficulty of extracting minutiae from low-quality fingerprint images, other discriminatory information has been considered. In this paper, the idea of decision-level fusion of orientation, texture, and spectral features of a fingerprint image is proposed. At first, a value is assigned to the similarity of the block orientation fields of two fingerprint images. This is ...

  10. Airworthiness Compliance Verification Method Based on Simulation of Complex System

    Institute of Scientific and Technical Information of China (English)

    XU Haojun; LIU Dongliang; XUE Yuan; ZHOU Li; MIN Guilong

    2012-01-01

    A study is conducted on a new airworthiness compliance verification method based on pilot-aircraft-environment complex system simulation. Verification scenarios are established by the “block diagram” method based on airworthiness criteria. A pilot-aircraft-environment complex model is set up, and a virtual flight testing method based on the connection of MATLAB/Simulink and FlightGear is proposed. Special research is conducted on the modeling of stochastic pilot manipulation parameters and of manipulation in critical situations. Unfavorable flight factors of a certain scenario are analyzed, and reliability modeling of important systems is researched. A distribution function of small-probability events and the theory of risk probability measurement are studied. A nonlinear function is used to depict the relationship between the cumulative probability and the extremum of the critical parameter. A synthetic evaluation model is set up, a modified genetic algorithm (MGA) is applied to ascertain the distribution parameters in the model, and a more reasonable result is obtained. A clause about vehicle control function (VCF) verification in MIL-HDBK-516B is selected as an example to validate the practicability of the method.

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ECR TECHNOLOGIES, INC., EARTHLINKED GROUND-SOURCE HEAT PUMP WATER HEATING SYSTEM

    Science.gov (United States)

    EPA has created the Environmental Technology Verification program to provide high quality, peer reviewed data on technology performance. This data is expected to accelerate the acceptance and use of improved environmental protection technologies. The Greenhouse Gas Technology C...

  13. The potential of agent-based modelling for verification of people trajectories based on smartphone sensor data

    Science.gov (United States)

    Hillen, F.; Höfle, B.; Ehlers, M.; Reinartz, P.

    2014-02-01

    In this paper the potential of smartphone sensor data for verification of people trajectories derived from airborne remote sensing data is investigated and discussed based on simulated test recordings in the city of Osnabrueck, Germany. For this purpose, the airborne imagery is simulated by images taken from a high building with a typical single lens reflex camera. The smartphone data required for the analysis of the potential is simultaneously recorded by test persons on the ground. In a second step, the quality of the smartphone sensor data is evaluated regarding its integration into simulation and modelling approaches. In this context we studied the potential of the agent-based modelling technique for the verification of people trajectories.
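
    Stripped of the agent-based modelling layer, the underlying verification question is whether two time-aligned tracks agree. A deliberately minimal sketch of such a check (the 5 m tolerance and the synthetic tracks are assumptions):

    ```python
    import numpy as np

    def trajectory_agreement(remote, phone, max_dev=5.0):
        """Mean point-wise deviation between a trajectory extracted from
        airborne imagery and a smartphone GPS track (both N x 2 arrays in
        metres, already time-aligned); verified if below a tolerance."""
        dev = np.linalg.norm(np.asarray(remote) - np.asarray(phone), axis=1)
        return dev.mean() <= max_dev, dev.mean()

    remote_track = np.cumsum(np.ones((60, 2)) * 1.2, axis=0)   # ~1.7 m/s walk
    gps_track = remote_track + np.random.randn(60, 2) * 3.0    # GPS noise
    print(trajectory_agreement(remote_track, gps_track))
    ```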

  14. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    In this paper, we present a discriminative feature difference learning method for facial image based kinship verification. To make the feature difference of an image pair discriminative for kinship verification, a linear transformation matrix for the feature difference between an image pair is inferred from training data. This transformation matrix is obtained by minimizing the difference of L2 norm between the feature difference of each kinship pair and its neighbors from non-kinship pairs; to find the neighbors, a cosine similarity is applied. Our method works on feature difference rather than the commonly used feature concatenation, leading to a low complexity. Furthermore, there is no positive semi-definite constraint on the transformation matrix, while there is in metric learning methods, leading to an easy solution for the transformation matrix. Experimental results on two public...

  15. MODEL-BASED VALIDATION AND VERIFICATION OF ANOMALIES IN LEGISLATION

    Directory of Open Access Journals (Sweden)

    Vjeran Strahonja

    2006-12-01

    An anomaly in legislation is the absence of completeness, consistency and other desirable properties, caused by different semantic, syntactic or pragmatic reasons. In general, the detection of anomalies in legislation comprises validation and verification. The basic idea of the research presented in this paper is to model legislation by capturing domain knowledge of legislation and specifying it in a generic way using the commonly agreed and understandable modelling concepts of the Unified Modelling Language (UML). Models of legislation make it possible to understand the system better, support the detection of anomalies and help to improve the quality of legislation through validation and verification. By implementing a model-based approach, the object of validation and verification moves from the legislation to its model. The business domain of legislation has two distinct aspects: a structural or static aspect (functionality, business data, etc.) and a behavioural or dynamic aspect (states, transitions, activities, sequences, etc.). Because anomalies can occur at two different levels, at the level of the model or at the level of the legislation itself, a framework for validation and verification of legal regulation and its model is discussed. The presented framework includes some significant types of semantic and syntactic anomalies. Some ideas for the assessment of pragmatic anomalies of models were found in the field of software quality metrics; thus pragmatic features and attributes can be determined that could be relevant for model evaluation purposes. Based on analogous standards for the evaluation of software, a qualitative and quantitative scale can be applied to determine the value of some feature for a specific model.

  16. PALMPRINT VERIFICATION USING INVARIANT MOMENTS BASED ON WAVELET TRANSFORM

    Directory of Open Access Journals (Sweden)

    Inass SH. Hussein

    2014-01-01

    Data security is one of the most important issues among computer users. Data security can prevent fraudulent users from accessing an individual's personal data. Biometric recognition is one of the most important parts of data security and an application of computer vision. Biometrics is an authentication method used in a wide variety of applications such as e-banking, e-commerce, e-government and many others. A biometric system is one which requires the recognition of a pattern, whereby it enables the differentiation of features from one individual to another. Biometric technologies may thus be defined as the automated methods of identifying, or authenticating, the identity of a living person based on physiological or behavioral traits. This study emphasizes palmprint recognition, which provides a widely deployable authentication method. The palmprint contains principal lines, wrinkles, fine lines, ridges and surface area; thus the palmprint differs from one person to another. Previous researchers have had difficulty extracting palmprint features because of the effects of rotation, translation and scaling changes, and the accuracy of verification performance needs to be improved. The aim of this study is to extract shape features using an invariant moments algorithm based on the wavelet transform and to verify a person's identity. This model has shown promising results that are unaffected by the rotation, translation and scaling of objects, because it is based on a good description of shape features. The system has been tested using databases from the Indian Institute of Technology, Kanpur (IITK); using the False Rejection Rate (FRR) and False Acceptance Rate (FAR), we may calculate the accuracy rate of verification. The experiment shows a 97.99% accuracy rate of verification.
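
    The feature pipeline described (wavelet transform, then invariant moments) can be sketched directly: take the wavelet approximation band and compute Hu-style moment invariants, which are insensitive to translation, scale and rotation. The wavelet choice and decomposition level are assumptions:

    ```python
    import numpy as np
    import pywt

    def hu_first_two(img):
        """First two Hu moment invariants of a grey-level image."""
        img = np.asarray(img, dtype=float)
        y, x = np.mgrid[:img.shape[0], :img.shape[1]]
        m00 = img.sum()
        xc, yc = (x * img).sum() / m00, (y * img).sum() / m00
        def mu(p, q):                         # central moments
            return ((x - xc) ** p * (y - yc) ** q * img).sum()
        def eta(p, q):                        # scale-normalized moments
            return mu(p, q) / m00 ** (1 + (p + q) / 2.0)
        i1 = eta(2, 0) + eta(0, 2)
        i2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
        return np.array([i1, i2])

    def palm_feature(palm):
        """Moments of the wavelet approximation band, following the
        wavelet-plus-invariant-moments idea (wavelet choice assumed)."""
        approx, *_ = pywt.wavedec2(palm, "haar", level=2)
        return hu_first_two(approx)

    print(palm_feature(np.random.rand(128, 128)))
    ```

    Verification then reduces to a distance threshold between feature vectors, tuned on FAR/FRR curves as in the paper.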

  17. Game-based verification and synthesis

    DEFF Research Database (Denmark)

    Vester, Steen

    Infinite-duration games provide a convenient way to model distributed, reactive and open systems in which several entities and an uncontrollable environment interact. Here, each entity as well as the uncontrollable environment is modelled as a player. A strategy for an entity player in the model... The thesis studies decision problems for logics capable of expressing strategic abilities of players in games with both qualitative and quantitative objectives. A number of computational complexity results for model-checking and satisfiability problems in this domain are obtained. We also show how the technique of symmetry reduction... can be extended to solve finitely-branching turn-based games more efficiently. Further, the novel concept of winning cores in parity games is introduced. We use this to develop a new polynomial-time under-approximation algorithm for solving parity games. Experimental results show that this algorithm...

  18. Online Signature Verification Based on DCT and Sparse Representation.

    Science.gov (United States)

    Liu, Yishu; Yang, Zhihua; Yang, Lihua

    2015-11-01

    In this paper, a novel online signature verification technique based on discrete cosine transform (DCT) and sparse representation is proposed. We find a new property of DCT, which can be used to obtain a compact representation of an online signature using a fixed number of coefficients, leading to simple matching procedures and providing an effective alternative to deal with time series of different lengths. The property is also used to extract energy features. Furthermore, a new attempt to apply sparse representation to online signature verification is made, and a novel task-specific method for building overcomplete dictionaries is proposed, then sparsity features are extracted. Finally, energy features and sparsity features are concatenated to form a feature vector. Experiments are conducted on the Sabancı University's Signature Database (SUSIG)-Visual and SVC2004 databases, and the results show that our proposed method authenticates persons very reliably with a verification performance which is better than those of state-of-the-art methods on the same databases.
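
    The fixed-length property mentioned above is easy to illustrate. The sketch below is an assumption-laden simplification (one pen-trajectory channel, a type-II orthonormal DCT, and an arbitrary choice of 64 retained coefficients); it maps signatures of different lengths to directly comparable vectors:

        # Hedged sketch: fixed-length DCT representation of a variable-length
        # time series, as an alternative to elastic matching such as DTW.
        import numpy as np
        from scipy.fft import dct

        def dct_features(channel: np.ndarray, n_coeffs: int = 64) -> np.ndarray:
            # channel: one sampled signature channel (e.g. x, y or pressure).
            coeffs = dct(channel, type=2, norm="ortho")
            out = np.zeros(n_coeffs)
            k = min(n_coeffs, coeffs.size)
            out[:k] = coeffs[:k]   # low-order coefficients carry most energy
            return out

        # Signatures of any length now map to 64-dimensional vectors, so a
        # plain distance or classifier score can compare them directly.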

  19. Image-based fingerprint verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil K. Singla

    2008-09-01

    Full Text Available Biometric-based identification/verification systems provide a solution to the security concerns of the modern world, where machines are replacing humans in every aspect of life. Fingerprints, because of their uniqueness, are the most widely used and highly accepted biometric. Fingerprint biometric systems are either minutiae-based or pattern-learning (image-based). Minutiae-based algorithms depend upon local discontinuities in the ridge flow pattern and are used when template size is important, while image-based matching algorithms use both micro and macro features of a fingerprint and are used when a fast response is required. In the present paper an image-based fingerprint verification system is discussed. The proposed method uses a learning phase, which is not present in conventional image-based systems. The learning phase uses pseudo-random sub-sampling, which reduces the number of comparisons needed in the matching stage. The system has been developed using the LabVIEW (Laboratory Virtual Instrument Engineering Workbench) toolbox, version 6i. The availability of datalog files in LabVIEW makes it one of the most promising candidates for use as a database: datalog files can access and manipulate data and complex data structures quickly and easily, making writing and reading much faster. After extensive experimentation involving a large number of samples and different learning sizes, high accuracy has been achieved with a learning image size of 100 × 100 and a threshold value of 700 (1000 being a perfect match).

  20. Performing Verification and Validation in Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  1. Measurement campaigns for selection of optimum on-ground performance verification approach for large deployable reflector antenna

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Nielsen, Jeppe Majlund; Kim, Oleksiy S.;

    2012-01-01

    This paper describes the measurement campaigns carried out at P-band (435 MHz) for selection of optimum on-ground verification approach for a large deployable reflector antenna (LDA). The feed array of the LDA was measured in several configurations with spherical, cylindrical, and planar near-field...... techniques at near-field facilities in Denmark and in the Netherlands. The measured results for the feed array were then used in calculation of the radiation pattern and gain of the entire LDA. The primary goals for the campaigns were to obtain realistic measurement uncertainty estimates and to investigate...

  2. Refinement-based verification of sequential implementations of Stateflow charts

    CERN Document Server

    Miyazawa, Alvaro; 10.4204/EPTCS.55.5

    2011-01-01

    Simulink/Stateflow charts are widely used in industry for the specification of control systems, which are often safety-critical. This suggests a need for a formal treatment of such models. In previous work, we have proposed a technique for automatic generation of formal models of Stateflow blocks to support refinement-based reasoning. In this article, we present a refinement strategy that supports the verification of automatically generated sequential C implementations of Stateflow charts. In particular, we discuss how this strategy can be specialised to take advantage of architectural features in order to allow a higher level of automation.

  3. Space-based monitoring of ground deformation

    Science.gov (United States)

    Nobakht Ersi, Fereydoun; Safari, Abdolreza; Gamse, Sonja

    2016-07-01

    Ground deformation monitoring is valuable to understanding the behaviour of natural phenomena. Space-based measurement systems such as the Global Positioning System (GPS) are useful tools for continuous monitoring of ground deformation. Ground deformation analysis based on space geodetic techniques has provided a new, more accurate, and reliable source of information for geodetic positioning, which is used to detect deformations of the ground surface. Studies of this type, using displacement fields derived from repeated measurements of space-based geodetic networks, indicate the crucial role space geodetic methods play in geodynamics. The main scope of this contribution is to monitor ground deformation using measurements obtained from GPS sites. We present ground deformation analysis in three steps: a global congruency test on daily coordinates of permanent GPS stations to specify in which epochs deformations occur, the localization of the deformed GPS sites, and the determination of the deformations.

  4. Ground-based observations of exoplanet atmospheres

    NARCIS (Netherlands)

    Mooij, Ernst Johan Walter de

    2011-01-01

    This thesis focuses on the properties of exoplanet atmospheres. The results for ground-based near-infrared secondary eclipse observations of three different exoplanets, TrES-3b, HAT-P-1b and WASP-33b, are presented which have been obtained with ground-based telescopes as part of the GROUSE project.

  6. Accelerometers for the GOCE Mission: on-ground verification and in-orbit early results

    Science.gov (United States)

    Foulon, B.; Christophe, B.; Marque, J.-P.

    2009-04-01

    The six accelerometers of the ESA GOCE mission were developed by ONERA under contract with Thales Alenia Space France, prime contractor for the gradiometer. These instruments are based on a principle similar to that of the accelerometers that have been flying for several years on board the CHAMP and twin GRACE satellites, but with some technological evolution to improve their resolution by two orders of magnitude, in order to guarantee an acceleration noise level lower than 2×10⁻¹² m·s⁻²·Hz⁻¹/² as required by the GOCE mission's scientific performance. Their contribution to the mission is twofold: they provide the satellite with the linear accelerations used as input to the continuous drag compensation system, and they deliver the scientific measurement data to be processed on the ground. The presentation will first briefly describe the accelerometer, together with a summary of the on-ground test plan philosophy and results, including free-fall tests in the Bremen drop tower. Then, if available at that time, the first preliminary results of the in-orbit performance of the accelerometers will be presented and compared. Such instruments can also help improve the performance of new geodetic missions by measuring more accurately the non-gravitational forces acting on the satellites, as corner-stone instruments in gradiometer arms or as sensors for the drag compensation systems of low-orbit spacecraft.

  7. Internet-based dimensional verification system for reverse engineering processes

    Energy Technology Data Exchange (ETDEWEB)

    Song, In Ho [Ajou University, Suwon (Korea, Republic of); Kim, Kyung Don [Small Business Corporation, Suwon (Korea, Republic of); Chung, Sung Chong [Hanyang University, Seoul (Korea, Republic of)

    2008-07-15

    This paper proposes a design methodology for a Web-based collaborative system applicable to reverse engineering processes in a distributed environment. Using the developed system, design reviewers of new products are able to confirm geometric shapes, inspect dimensional information of products through measured point data, and exchange views with other design reviewers on the Web. In addition, it is applicable to the verification of production process accuracy by manufacturing engineers. Functional requirements for designing this Web-based dimensional verification system are described in this paper. An ActiveX-server architecture and OpenGL plug-in methods using ActiveX controls realize the proposed system. In the developed system, visualization and dimensional inspection of the measured point data are done directly on the Web: conversion of the point data into a CAD file or a VRML form is unnecessary. Dimensional verification results and design modification ideas are uploaded to markups and/or XML files during collaboration processes. Collaborators review the markup results created by others to produce a good design result on the Web. The use of XML files allows information sharing on the Web to be independent of the platform of the developed system, making it possible to diversify the information-sharing capability among design collaborators. The validity and effectiveness of the developed system have been confirmed by case studies.

  8. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun, Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of a product design against the regulations related to the product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of a product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed to ensure the flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while minimizing changes to the system architecture.
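
    A minimal stand-in for this kind of rule-based design check follows; the actual system uses a product ontology and Web services, both omitted here, and the product attributes and regulatory limits below are invented purely for illustration:

        # Hedged sketch: checking a product description against coded rules.
        product = {"type": "pressure_vessel",
                   "wall_thickness_mm": 4.2,
                   "design_pressure_bar": 12.0}

        # Hypothetical regulation rules: each pairs an attribute test with a
        # violation message; a real knowledge base would derive these from
        # the ontology and rule language instead of hard-coding them.
        rules = [
            (lambda p: p["wall_thickness_mm"] >= 5.0,
             "Wall thickness below regulatory minimum of 5.0 mm"),
            (lambda p: p["design_pressure_bar"] <= 10.0,
             "Design pressure exceeds permitted 10.0 bar"),
        ]

        violations = [msg for check, msg in rules if not check(product)]
        print("PASS" if not violations else "\n".join(violations))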

  9. Efficient Data Integrity Verification Using CRC Based on HDFS in Cloud Storage

    Directory of Open Access Journals (Sweden)

    Xia Yun-Hao

    2017-01-01

    Full Text Available Data integrity verification is becoming a major challenge in cloud storage and cannot be ignored. This paper proposes an optimized variant of the CRC (Cyclic Redundancy Check) verification algorithm based on HDFS to improve the efficiency of data integrity verification in cloud storage, based on a study of the CRC checksum algorithm and the data integrity verification mechanism of HDFS. A new method is formulated to establish the deformational optimization and to accelerate the algorithm by examining the characteristics of checksum generation and checking. Moreover, the method optimizes the code to improve computational efficiency in accordance with the data integrity verification mechanism of HDFS. A data integrity verification system based on Hadoop was designed to validate the proposed method. Experimental results demonstrate that the proposed HDFS-based CRC algorithm improves calculation efficiency and overall utilization of system resources, and compares well with existing models in terms of accuracy and time.
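
    The baseline mechanism being optimized is easy to sketch. Assuming a fixed number of bytes per checksum (HDFS's configurable default is commonly 512), per-chunk CRC-32 values are stored at write time and recomputed on read:

        # Hedged sketch of per-chunk CRC-32 integrity verification, in the
        # style of HDFS block checksums; chunk size is an assumed default.
        import zlib

        CHUNK = 512  # bytes per checksum

        def chunk_crcs(data: bytes):
            # One CRC-32 per fixed-size chunk of the payload.
            return [zlib.crc32(data[i:i + CHUNK])
                    for i in range(0, len(data), CHUNK)]

        def verify_integrity(data: bytes, stored_crcs) -> bool:
            # Recompute and compare; any mismatch flags a corrupted chunk.
            return chunk_crcs(data) == stored_crcs

        blob = b"example payload" * 1000
        crcs = chunk_crcs(blob)
        assert verify_integrity(blob, crcs)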

  10. Fresnel zones for ground-based antennas

    DEFF Research Database (Denmark)

    Andersen, J. Bach

    1964-01-01

    The ordinary Fresnel zone concept is modified to include the influence of finite ground conductivity. This is important for ground-based antennas because the influence on the radiation pattern of irregularities near the antenna is determined by the amplitude and phase of the groundwave. A new...

  11. Integrated Key based Strict Friendliness Verification of Neighbors in MANET

    CERN Document Server

    Vaman, Dhadesugoor R

    2012-01-01

    A novel Strict Friendliness Verification (SFV) scheme is proposed, based on an integrated key consisting of a symmetric node identity, the geographic location, and the round-trip response time between the sender and receiver radios in a MANET. This key is dynamically updated for the encryption and decryption of each packet in order to counter wormhole and Sybil attacks. Additionally, it meets the minimal key lengths required for symmetric ciphers to provide adequate commercial security. Furthermore, the detection of foe (unfriendly) nodes is found to increase significantly as the number of symmetric IDs decreases. This paper presents simulations demonstrating the performance of SFV in terms of dynamic range using directional antennas on the radios (or nodes), and the performance in terms of aggregate throughput, average end-to-end delay and packet delivery ratio.
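
    A hedged sketch of the integrated-key idea follows. Every field name and parameter here is illustrative rather than taken from the paper, and the `cryptography` library's Fernet construction merely stands in for whatever symmetric cipher the scheme actually specifies:

        # Hedged sketch: derive a per-packet symmetric key from node identity,
        # reported position and measured round-trip time.
        import base64
        import hashlib
        from cryptography.fernet import Fernet  # pip install cryptography

        def packet_key(node_id: str, lat: float, lon: float,
                       rtt_ms: float, seq: int) -> bytes:
            # Quantize the inputs so both ends derive identical material.
            material = f"{node_id}|{lat:.4f}|{lon:.4f}|{rtt_ms:.1f}|{seq}".encode()
            digest = hashlib.sha256(material).digest()   # 32-byte key material
            return base64.urlsafe_b64encode(digest)      # Fernet key format

        key = packet_key("node-17", 29.72, -95.34, rtt_ms=12.3, seq=0)
        token = Fernet(key).encrypt(b"hello neighbour")
        # A receiver sharing the same identity/location/RTT view derives the
        # same key; a wormhole relay perturbs the RTT and decryption fails.
        assert Fernet(key).decrypt(token) == b"hello neighbour"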

  12. Transition and verification of ground fault protection method in Hokuriku Shinkansen line

    Directory of Open Access Journals (Sweden)

    Michiteru Koyanagi

    2016-01-01

    Full Text Available Electrical discharge gaps called S-type horns are applied for ground fault detection and protection in the AC traction power supply system of the high-speed railway known as the Shinkansen. In this method, the earth resistance of the steel pipe pillar is an important factor for ground fault protection by electrical discharge. In this study, analyses of the transient characteristics of ground faults are carried out using EMTP, and the ground resistance value required to trigger a discharge at the S-type horn is calculated. Moreover, the protective effect of a discharge gap called GP, which is inserted between the rails and the substation grounding mesh to protect substation equipment, is evaluated.

  13. Postures and Motions Library Development for Verification of Ground Crew Human Systems Integration Requirements

    Science.gov (United States)

    Jackson, Mariea Dunn; Dischinger, Charles; Stambolian, Damon; Henderson, Gena

    2012-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineers in the future to infuse real-to-life human activities into CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the SLS and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors analysis requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  14. Postures and Motions Library Development for Verification of Ground Crew Human Factors Requirements

    Science.gov (United States)

    Stambolian, Damon; Henderson, Gena; Jackson, Mariea Dunn; Dischinger, Charles

    2013-01-01

    Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineering analysts to infuse real to life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the Space Launch System (SLS) and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors engineering requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.

  15. Assessing the capability of numerical methods to predict earthquake ground motion: the Euroseistest verification and validation project

    Science.gov (United States)

    Chaljub, E. O.; Bard, P.; Tsuno, S.; Kristek, J.; Moczo, P.; Franek, P.; Hollender, F.; Manakou, M.; Raptakis, D.; Pitilakis, K.

    2009-12-01

    During the last decades, an important effort has been dedicated to developing accurate and computationally efficient numerical methods to predict earthquake ground motion in heterogeneous 3D media. Progress in methods and the increasing capability of computers have made it technically feasible to calculate realistic seismograms at the frequencies of interest for seismic design applications. In order to foster the use of numerical simulation in practical prediction, it is important to (1) evaluate the accuracy of current numerical methods when applied to realistic 3D applications where no reference solution exists (verification) and (2) quantify the agreement between recorded and numerically simulated earthquake ground motion (validation). Here we report the results of the Euroseistest verification and validation project - an ongoing international collaborative work organized jointly by the Aristotle University of Thessaloniki, Greece, the Cashima research project (supported by the French nuclear agency, CEA, and the Laue-Langevin institute, ILL, Grenoble), and the Joseph Fourier University, Grenoble, France. The project involves more than 10 international teams from Europe, Japan and the USA. The teams employ the Finite Difference Method (FDM), the Finite Element Method (FEM), the Global Pseudospectral Method (GPSM), the Spectral Element Method (SEM) and the Discrete Element Method (DEM). The project makes use of a new detailed 3D model of the Mygdonian basin (about 5 km wide, 15 km long, sediments reaching about 400 m depth, surface S-wave velocity of 200 m/s). The prime target is to simulate 8 local earthquakes with magnitudes from 3 to 5. In the verification, numerical predictions for frequencies up to 4 Hz for a series of models with increasing structural and rheological complexity are analyzed and compared using quantitative time-frequency goodness-of-fit criteria. Predictions obtained by one FDM team and the SEM team are close and different from other predictions
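
    As a toy stand-in for the quantitative comparison step (the project uses time-frequency goodness-of-fit criteria; the plain normalized RMS misfit between two equally sampled traces below is shown only to fix the idea, with synthetic traces invented for the example):

        # Hedged sketch: a simple misfit measure between a reference and a
        # simulated seismogram, both sampled on the same time axis.
        import numpy as np

        def nrms_misfit(reference: np.ndarray, simulated: np.ndarray) -> float:
            # 0 means a perfect match; larger values mean larger disagreement.
            return np.sqrt(np.mean((simulated - reference) ** 2) /
                           np.mean(reference ** 2))

        t = np.linspace(0, 10, 2001)
        ref = np.sin(2 * np.pi * 1.00 * t) * np.exp(-0.3 * t)
        sim = np.sin(2 * np.pi * 1.02 * t) * np.exp(-0.3 * t)  # detuned model
        print(f"NRMS misfit: {nrms_misfit(ref, sim):.3f}")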

  16. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    Fiergolski, Adrian

    2016-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  18. Cleaning Verification Monitor Technique Based on Infrared Optical Methods

    Science.gov (United States)

    2004-10-01

    ..."Cleaning Verification Techniques." Real-time methods to provide both qualitative and quantitative assessments of surface cleanliness are needed... The detection VCPI method offers a wide range of complementary capabilities in real-time surface cleanliness verification... It also has great potential to reduce or eliminate premature failures of surface coatings caused by a lack of surface cleanliness.

  19. Verification and Planning Based on Coinductive Logic Programming

    Science.gov (United States)

    Bansal, Ajay; Min, Richard; Simon, Luke; Mallya, Ajay; Gupta, Gopal

    2008-01-01

    Coinduction is a powerful technique for reasoning about unfounded sets, unbounded structures, infinite automata, and interactive computations [6]. Where induction corresponds to least fixed point semantics, coinduction corresponds to greatest fixed point semantics. Recently coinduction has been incorporated into logic programming and an elegant operational semantics has been developed for it [11, 12]. This operational semantics is the greatest fixed point counterpart of SLD resolution (which imparts operational semantics to least fixed point based computations) and is termed co-SLD resolution. In co-SLD resolution, a predicate goal p(t) succeeds if it unifies with one of its ancestor calls. In addition, rational infinite terms are allowed as arguments of predicates. Infinite terms are represented as solutions to unification equations and the occurs check is omitted during the unification process. Coinductive Logic Programming (Co-LP) and co-SLD resolution can be used to elegantly perform model checking and planning. A combined SLD and co-SLD resolution based LP system forms the common basis for planning, scheduling, verification, model checking, and constraint solving [9, 4]. This is achieved by amalgamating SLD resolution, co-SLD resolution, and constraint logic programming [13] in a single logic programming system. Given that parallelism in logic programs can be implicitly exploited [8], complex, compute-intensive applications (planning, scheduling, model checking, etc.) can be executed in parallel on multi-core machines. Parallel execution can result in speed-ups as well as in larger instances of the problems being solved. In the remainder we elaborate on (i) how planning can be elegantly and efficiently performed under real-time constraints, (ii) how real-time systems can be elegantly and efficiently model-checked, as well as (iii) how hybrid systems can be verified in a combined system with both co-SLD and SLD resolution. Implementations of co-SLD resolution

  20. Finger vein verification system based on sparse representation.

    Science.gov (United States)

    Xin, Yang; Liu, Zhi; Zhang, Haixia; Zhang, Hong

    2012-09-01

    Finger vein verification is a promising biometric pattern for personal identification in terms of security and convenience. The recognition performance of this technology heavily relies on the quality of finger vein images and on the recognition algorithm. To achieve efficient recognition performance, a special finger vein imaging device is developed, and a finger vein recognition method based on sparse representation is proposed. The motivation for the proposed method is that finger vein images exhibit a sparse property. In the proposed system, the regions of interest (ROIs) in the finger vein images are segmented and enhanced. Sparse representation and sparsity preserving projection on ROIs are performed to obtain the features. Finally, the features are measured for recognition. An equal error rate of 0.017% was achieved based on the finger vein image database, which contains images that were captured by using the near-IR imaging device that was developed in this study. The experimental results demonstrate that the proposed method is faster and more robust than previous methods.

  1. VERIFICATION & VALIDATION OF A SEMANTIC IMAGE TAGGING FRAMEWORK VIA GENERATION OF GEOSPATIAL IMAGERY GROUND TRUTH

    Energy Technology Data Exchange (ETDEWEB)

    Gleason, Shaun Scott [ORNL; Ferrell, Regina Kay [ORNL; Cheriyadat, Anil M [ORNL; Vatsavai, Raju [ORNL; Sari-Sarraf, Hamed [ORNL; Dema, Mesfin A [ORNL

    2011-01-01

    As a result of increasing geospatial image libraries, many algorithms are being developed to automatically extract and classify regions of interest from these images. However, limited work has been done to compare, validate and verify these algorithms due to the lack of datasets with high-accuracy ground truth annotations. In this paper, we present an approach to generate a large number of synthetic images accompanied by perfect ground truth annotation by learning scene statistics from a few training images through Maximum Entropy (ME) modeling. The ME model [1,2] embeds a Stochastic Context Free Grammar (SCFG), which models object attribute variations, within Markov Random Fields (MRF), with the final goal of modeling contextual relations between objects. Using this model, 3D scenes are generated by configuring a 3D object model to obey the learned scene statistics. Finally, these plausible 3D scenes are rendered by ray tracing software to produce synthetic images with the corresponding ground truth annotations, which are useful for evaluating the performance of a variety of image analysis algorithms.

  2. Continuous Verification of Large Embedded Software using SMT-Based Bounded Model Checking

    CERN Document Server

    Cordeiro, Lucas; Marques-Silva, Joao

    2009-01-01

    The complexity of software in embedded systems has increased significantly over the last years so that software verification now plays an important role in ensuring the overall product quality. In this context, SAT-based bounded model checking has been successfully applied to discover subtle errors, but for larger applications, it often suffers from the state space explosion problem. This paper describes a new approach called continuous verification to detect design errors as quickly as possible by looking at the Software Configuration Management (SCM) system and by combining dynamic and static verification to reduce the state space to be explored. We also give a set of encodings that provide accurate support for program verification and use different background theories in order to improve scalability and precision in a completely automatic way. A case study from the telecommunications domain shows that the proposed approach improves the error-detection capability and reduces the overall verification time by...
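
    To fix ideas, a toy bounded model check in the SAT/SMT style described above can be written with Z3's Python bindings. The transition system, bound and property here are invented for illustration; real tools of the kind the paper describes encode C program semantics instead:

        # Hedged sketch: unroll a transition relation K steps and ask the
        # solver for a path violating a property (here: counter never hits 5).
        from z3 import Ints, Solver, Or, sat  # pip install z3-solver

        K = 8
        x = Ints(" ".join(f"x{i}" for i in range(K + 1)))
        s = Solver()
        s.add(x[0] == 0)                                    # initial state
        for i in range(K):                                  # unrolled steps
            s.add(Or(x[i + 1] == x[i] + 1, x[i + 1] == 0))  # inc or reset
        s.add(Or(*[x[i] == 5 for i in range(K + 1)]))       # negated property
        if s.check() == sat:
            # The solver found a counterexample trace within the bound.
            print("counterexample:", [s.model()[v] for v in x])
        else:
            print(f"property holds up to bound {K}")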

  3. A Ground-Based Validation System of Teleoperation for a Space Robot

    Directory of Open Access Journals (Sweden)

    Xueqian Wang

    2012-10-01

    Full Text Available Teleoperation of space robots is very important for future on-orbit servicing. In order to ensure that tasks are accomplished successfully, ground experiments are required to verify the function and validity of the teleoperation system before a space robot is launched. In this paper, a ground-based validation subsystem is developed as part of a teleoperation system. The subsystem is mainly composed of four parts: the input verification module, the onboard verification module, the dynamics and image workstation, and the communication simulator. The input verification module, consisting of the hardware and software of the master, is used to verify the input capability. The onboard verification module, consisting of the same hardware and software as the onboard processor, is used to verify the processor's computing ability and execution schedule. In addition, the dynamics and image workstation calculates the dynamic response of the space robot and target, and generates emulated camera images, including those of the hand-eye cameras, the global-vision camera and the rendezvous camera. The communication simulator provides realistic communication conditions, i.e., time delays and communication bandwidth. Lastly, we integrated the teleoperation system and conducted many experiments on it. Experimental results show that the ground system is very useful for verifying teleoperation technology.

  4. Illumination compensation in ground based hyperspectral imaging

    Science.gov (United States)

    Wendel, Alexander; Underwood, James

    2017-07-01

    Hyperspectral imaging has emerged as an important tool for analysing vegetation data in agricultural applications. Recently, low altitude and ground based hyperspectral imaging solutions have come to the fore, providing very high resolution data for mapping and studying large areas of crops in detail. However, these platforms introduce a unique set of challenges that need to be overcome to ensure consistent, accurate and timely acquisition of data. One particular problem is dealing with changes in environmental illumination while operating with natural light under cloud cover, which can have considerable effects on spectral shape. In the past this has been commonly achieved by imaging known reference targets at the time of data acquisition, direct measurement of irradiance, or atmospheric modelling. While capturing a reference panel continuously or very frequently allows accurate compensation for illumination changes, this is often not practical with ground based platforms, and impossible in aerial applications. This paper examines the use of an autonomous unmanned ground vehicle (UGV) to gather high resolution hyperspectral imaging data of crops under natural illumination. A process of illumination compensation is performed to extract the inherent reflectance properties of the crops, despite variable illumination. This work adapts a previously developed subspace model approach to reflectance and illumination recovery. Though tested on a ground vehicle in this paper, it is applicable to low altitude unmanned aerial hyperspectral imagery also. The method uses occasional observations of reference panel training data from within the same or other datasets, which enables a practical field protocol that minimises in-field manual labour. This paper tests the new approach, comparing it against traditional methods. Several illumination compensation protocols for high volume ground based data collection are presented based on the results. The findings in this paper are
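
    In its simplest form, the reference-panel correction this work builds on divides each pixel's measured radiance spectrum by that of a white panel of known reflectance observed under the same illumination; the paper's subspace model generalizes this to the case where no panel is in view. A minimal sketch, with an assumed (rows, cols, bands) cube layout:

        # Hedged sketch: reference-panel illumination compensation for a
        # hyperspectral radiance cube.
        import numpy as np

        def to_reflectance(radiance_cube: np.ndarray,
                           panel_spectrum: np.ndarray,
                           panel_reflectance: float = 0.99) -> np.ndarray:
            # radiance_cube: (rows, cols, bands); panel_spectrum: (bands,)
            # measured over a white reference of known reflectance under the
            # same illumination as the scene.
            scale = panel_reflectance / np.clip(panel_spectrum, 1e-9, None)
            return radiance_cube * scale[None, None, :]

    Dividing out the panel spectrum removes the illumination's spectral shape, so cubes acquired under different cloud conditions become comparable.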

  5. Nonminutiae-Based Decision-Level Fusion for Fingerprint Verification

    Directory of Open Access Journals (Sweden)

    Hassan Ghassemian

    2007-01-01

    Full Text Available Most of the methods proposed for fingerprint verification are based on local visible features called minutiae. However, due to the difficulty of extracting minutiae from low-quality fingerprint images, other discriminatory information has been considered. In this paper, the idea of decision-level fusion of the orientation, texture, and spectral features of a fingerprint image is proposed. First, a value is assigned to the similarity of the block orientation fields of two fingerprint images. The same is done for the texture and spectral features. None of the proposed similarity measures requires core-point existence or detection. Rotation and translation between the two fingerprint images are also taken into account in each method, and all points of the fingerprint image are employed in feature extraction. Then the similarity of each feature is normalized and used for decision-level fusion of the fingerprint information. The experimental results on the FVC2000 database demonstrate the effectiveness of the proposed fusion method and its significant accuracy.
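
    A hedged sketch of decision-level score fusion of this kind follows; the weights, raw score ranges and acceptance threshold are placeholders one would tune on genuine/impostor score distributions, not values from the paper:

        # Hedged sketch: normalize three matcher scores, then fuse with a
        # weighted sum and a single accept/reject threshold.
        import numpy as np

        def normalize(score: float, lo: float, hi: float) -> float:
            # lo/hi would be estimated from a training set of scores.
            return float(np.clip((score - lo) / (hi - lo), 0.0, 1.0))

        def fused_decision(orientation_s: float, texture_s: float,
                           spectral_s: float,
                           weights=(0.4, 0.3, 0.3),
                           threshold: float = 0.6) -> bool:
            scores = [normalize(orientation_s, 0.0, 1.0),
                      normalize(texture_s, 0.0, 100.0),  # assumed raw ranges
                      normalize(spectral_s, 0.0, 10.0)]
            return float(np.dot(weights, scores)) >= threshold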

  6. E-Visas Verification Schemes Based on Public-Key Infrastructure and Identity Based Encryption

    Directory of Open Access Journals (Sweden)

    Najlaa A. Abuadhmah

    2010-01-01

    Full Text Available Problem statement: A visa is a very important travel document, essential at the point of entry of any country being visited. However, this important document is still handled manually, which affects the accuracy and efficiency of visa processing. Work on e-visas is almost unexplored. Approach: This study provided a detailed description of a newly proposed e-visa verification system prototyped using RFID technology. The core of the proposed e-visa verification system is based on Identity Based Encryption (IBE) and Public Key Infrastructure (PKI). This research compared the two methods in terms of processing time and application usability. Results: The results showed that the e-visa verification system is highly flexible when implemented with IBE and, on the other hand, achieves better processing speed when implemented with PKI. Conclusion: It is therefore believed that the proposed e-visa verification schemes are a valuable security protocol for future study on e-visas.

  7. Ground based spectroscopy of hot Jupiters

    Science.gov (United States)

    Waldmann, Ingo

    2010-05-01

    It has been shown in recent years, with great success, that spectroscopy of exoplanetary atmospheres is feasible using space-based observatories such as the HST and Spitzer. However, with the end of the Spitzer cold phase, space-based observations in the near to mid infra-red are limited, which will remain true until the onset of the JWST. The importance of developing methods for ground-based spectroscopic analysis of known hot Jupiters is therefore apparent. In the past, various groups have attempted exoplanetary spectroscopy using ground-based facilities and various techniques. Here I will present results using a novel spectral retrieval method for near to mid infra-red emission and transmission spectra of exoplanetary atmospheres taken from the ground, and discuss the feasibility of future ground-based spectroscopy in a broader context. My recently commenced PhD project is under the supervision of Giovanna Tinetti (University College London) and in collaboration with J. P. Beaulieu (Institut d'Astrophysique de Paris), Mark Swain and Pieter Deroo (Jet Propulsion Laboratory, Caltech).

  8. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2011-09-28

    From the Federal Register Online via the Government Publishing Office: SOCIAL SECURITY ADMINISTRATION, Consent Based Social Security Number Verification (CBSV) Service. AGENCY: Social... For more information about the CBSV service, visit our Internet site, Social Security Online, at http://www.socialsecurity.gov ...

  9. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Science.gov (United States)

    2013-09-12

    From the Federal Register Online via the Government Publishing Office: SOCIAL SECURITY ADMINISTRATION, Consent Based Social Security Number Verification (CBSV) Service. AGENCY: Social... For more information about the CBSV service, visit our Internet site, Social Security Online...

  10. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement unce...

  11. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Yordanova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement unce...

  12. Verification of g-factors for lead monofluoride ground state, PbF

    CERN Document Server

    Skripnikov, L V; Titov, A V; Mawhorter, R J; Baum, A L; Sears, T J; Grabow, J -U

    2015-01-01

    We report the results of our theoretical study and an analysis of earlier experimental data for the g-factor tensor components of the ground $^2\Pi_{1/2}$ state of the free PbF radical. We compare the values obtained within the relativistic coupled-cluster method, combined with the generalized relativistic effective core potential approach, with our fit of the experimental data from [R.J. Mawhorter, B.S. Murphy, A.L. Baum, T.J. Sears, T. Yang, P.M. Rupasinghe, C.P. McRaven, N.E. Shafer-Ray, L.D. Alphei, J.-U. Grabow, Phys. Rev. A 84, 022508 (2011); A. Baum, B.S. thesis, Pomona College, 2011]. The two sets of results agree very well with each other but contradict the previous fit performed in the cited works. Our final prediction for the g-factors is $G_{\parallel}= 0.081(5)$, $G_{\perp}=-0.27(1)$.

  13. Property-based Code Slicing for Efficient Verification of OSEK/VDX Operating Systems

    Directory of Open Access Journals (Sweden)

    Mingyu Park

    2012-12-01

    Full Text Available Testing is the de facto verification technique in industry, but it is insufficient for identifying subtle issues due to its optimistic incompleteness. Model checking, on the other hand, is a powerful technique that supports comprehensiveness and is thus suitable for the verification of safety-critical systems. However, it generally requires more expertise and costs more than testing. This work attempts to take advantage of both techniques to achieve integrated and efficient verification of OSEK/VDX-based automotive operating systems. We propose property-based environment generation and model extraction techniques using static code analysis, which can be applied to both model checking and testing. The technique is automated and has been applied to an OSEK/VDX-based automotive operating system, Trampoline. Comparative experiments using random testing and model checking for the verification of assertions in the Trampoline kernel code show how our environment generation and abstraction approach can be utilized for efficient fault detection.

  14. Fuzzy-logic-based safety verification framework for nuclear power plants.

    Science.gov (United States)

    Rastogi, Achint; Gabbar, Hossam A

    2013-06-01

    This article presents a practical implementation of a safety verification framework for nuclear power plants (NPPs) based on fuzzy logic, in which hazard scenarios are identified with respect to safety and control limits on different plant process values. Risk is estimated quantitatively and compared with safety limits in real time so that safety verification can be achieved. Fuzzy logic is used to define safety rules that map hazard conditions to the required safety protection in view of the risk estimate. Case studies from an NPP are analyzed to realize the proposed real-time safety verification framework. An automated system is developed to demonstrate the safety limits for different hazard scenarios.
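
    A minimal sketch of fuzzy safety rules of this kind follows; the membership functions, labels and numeric breakpoints below are invented for illustration and are not the article's rule base:

        # Hedged sketch: map a real-time risk estimate in [0, 1] to a
        # required protection level via triangular fuzzy memberships.
        def tri(x: float, a: float, b: float, c: float) -> float:
            # Triangular membership function peaking at b.
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def required_protection(risk: float) -> str:
            memberships = {
                "normal_operation": tri(risk, -0.1, 0.0, 0.4),
                "alarm_and_monitor": tri(risk, 0.2, 0.5, 0.8),
                "trip_reactor": tri(risk, 0.6, 1.0, 1.1),
            }
            # Strongest-firing rule wins (a simple max-defuzzification).
            return max(memberships, key=memberships.get)

        print(required_protection(0.72))  # prints the strongest-firing label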

  15. Protocol-Based Verification of Message-Passing Parallel Programs

    DEFF Research Database (Denmark)

    López-Acosta, Hugo-Andrés; Eduardo R. B. Marques, Eduardo R. B.; Martins, Francisco

    2015-01-01

    translated into a representation read by VCC, a software verifier for C. We successfully verified several MPI programs in a running time that is independent of the number of processes or other input parameters. This contrasts with alternative techniques, notably model checking and runtime verification...

  16. Arithmetic Circuit Verification Based on Word-Level Decision Diagrams

    Science.gov (United States)

    1998-05-01

    ...the addition of *BMDs may have exponential operations in the worst case. Arditi [3] used *BMDs for verification of arithmetic assembly instructions... [3] Arditi, L. *BMDs can delay the use of theorem proving for verifying arithmetic assembly instructions. In Proceedings of the...

  17. FGMOS Based Voltage-Controlled Grounded Resistor

    Directory of Open Access Journals (Sweden)

    R. Pandey

    2010-09-01

    Full Text Available This paper proposes a new floating-gate MOSFET (FGMOS) based voltage-controlled grounded resistor. In the proposed circuit, an FGMOS operating in the ohmic region is linearized by a conventional MOSFET operating in the saturation region. The major advantages of the FGMOS-based voltage-controlled grounded resistor (FGVCGR) are simplicity, low total harmonic distortion (THD), and low power consumption. A simple application of this FGVCGR as a tunable high-pass filter is also suggested. The proposed circuits operate at supply voltages of ±0.75 V. The circuits are designed and simulated using SPICE in a 0.25-µm CMOS technology. The simulation results for the FGVCGR demonstrate a THD of 0.28% for a 0.32-Vpp input signal at 45 kHz, and a maximum power consumption of 254 µW.

  18. Space and Ground-Based Infrastructures

    Science.gov (United States)

    Weems, Jon; Zell, Martin

    This chapter deals first with the main characteristics of the space environment, outside and inside a spacecraft. Then the space and space-related (ground-based) infrastructures are described. The most important infrastructure is the International Space Station, which holds many European facilities (for instance the European Columbus Laboratory). Some of them, such as the Columbus External Payload Facility, are located outside the ISS to benefit from external space conditions. There is only one other example of orbital platforms, the Russian Foton/Bion Recoverable Orbital Capsule. In contrast, non-orbital weightless research platforms, although limited in experimental time, are more numerous: sounding rockets, parabolic flight aircraft, drop towers and high-altitude balloons. In addition to these facilities, there are a number of ground-based facilities and space simulators, for both life sciences (for instance: bed rest, clinostats) and physical sciences (for instance: magnetic compensation of gravity). Hypergravity can also be provided by human and non-human centrifuges.

  19. Fusion of PCA-Based and LDA-Based Similarity Measures for Face Verification

    Directory of Open Access Journals (Sweden)

    Kittler Josef

    2010-01-01

    Full Text Available The problem of fusing similarity measure-based classifiers is considered in the context of face verification. The performance of face verification systems using different similarity measures in two well-known appearance-based representation spaces, namely Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), is experimentally studied. The study is performed for both manually and automatically registered face images. The experimental results confirm that our optimised Gradient Direction (GD) metric within the LDA feature space outperforms the other adopted metrics. Different methods for the selection and fusion of the similarity measure-based classifiers are then examined. The experimental results demonstrate that the combined classifiers outperform any individual verification algorithm. In our studies, Support Vector Machines (SVMs) and Weighted Averaging of similarity measures appear to be the best fusion rules. Another interesting finding is that although features derived from the LDA approach lead to better results than those of the PCA algorithm for all the adopted scoring functions, fusing the PCA- and LDA-based scores improves the performance of the system.

  20. Development of Ground-Based Plant Sentinels

    Science.gov (United States)

    2007-11-02

    Final Technical Report, April 2001 - April 2003, "Developing Plants as Ground-based Sentinels." Abstract: Plants emit volatile mixes characteristic of exposure to both plant and animal (insect) pathogens (bacteria and fungi). The... Cited references include work on plant volatile responses to different strains of Pseudomonas syringae (Planta 217:767-775) and De Moraes CM, Schultz JC, Mescher MC, Tumlinson JH (2004)...

  1. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. This simulation-only layer enables readers to maintain verification productivity by abstracting away the physical details of the FPGA fabric. Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  2. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  3. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of a test of a ground-based lidar of another type. The test was performed at DTU’s test site for large wind turbines at Høvsøre, Denmark. The result is the establishment of a relation between the reference wind speed measurements, with measurement uncertainties provided...... by measurement standard, and corresponding lidar wind speed indications with associated measurement uncertainties. The comparison of the lidar measurements of the wind direction with that from the wind vanes is also given....

  4. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Yordanova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  5. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Yordanova, Ginka; Gómez Arranz, Paula

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  6. Calibration of Ground-based Lidar instrument

    DEFF Research Database (Denmark)

    Villanueva, Héctor; Georgieva Yankova, Ginka

    This report presents the result of the lidar calibration performed for the given Ground-based Lidar at DTU’s test site for large wind turbines at Høvsøre, Denmark. Calibration is here understood as the establishment of a relation between the reference wind speed measurements with measurement...... uncertainties provided by measurement standard and corresponding lidar wind speed indications with associated measurement uncertainties. The lidar calibration concerns the 10 minute mean wind speed measurements. The comparison of the lidar measurements of the wind direction with that from wind vanes...

  7. A Feature Subtraction Method for Image Based Kinship Verification under Uncontrolled Environments

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    The most fundamental problem of local feature based kinship verification methods is that a local feature can capture the variations of environmental conditions and the differences between two persons having a kin relation, which can significantly decrease the performance. To address this problem...... the feature distance between face image pairs with kinship and maximize the distance between non-kinship pairs. Based on the subtracted feature, the verification is realized through a simple Gaussian based distance comparison method. Experiments on two public databases show that the feature subtraction method...

  8. Exploring implementation practices in results-based financing: the case of the verification in Benin.

    Science.gov (United States)

    Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier

    2017-03-14

    Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was originally designed. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delinks effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks

  9. A scenario-based verification technique to assess the compatibility of collaborative business processes

    NARCIS (Netherlands)

    De Backer, M.; Snoeck, M.; Monsieur, G.; Lemahieu, W.; Dedene, G.

    2009-01-01

    Successful E-Business is based on seamless collaborative business processes. Each partner in the collaboration specifies its own rules and interaction preconditions. The verification of the compatibility of collaborative business processes, based on local and global views, is a complex task, which i

  10. Application of Integrated Verification Approach to FPGA-based Safety-Critical I and C System of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Heo, Gyunyoung [Kyunghee Univ., Yongin (Korea, Republic of); Jung, Jaecheon [KEPCO, Ulsan (Korea, Republic of)

    2016-10-15

    Safety-critical instrumentation and control (I and C) systems in nuclear power plants (NPPs), implemented on programmable logic controllers (PLCs), play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs, because of their advantages and hardware-related benefits. In conventional FPGA design verification, designers write test benches for the various verification stages of register-transfer level (RTL), gate-level, and place-and-route. Writing test benches is considerably time-consuming and requires substantial effort to achieve satisfactory results, performing verification at each stage is a major bottleneck, and verification is conceivably the most difficult and complicated aspect of any design. Therefore, in view of these challenges, this work applied an integrated verification approach to the verification and testing of an FPGA-based I and C system design in an NPP, which verifies all design modules simultaneously using MATLAB/Simulink HDL co-simulation models, and we discuss how this approach can facilitate the verification process. In conclusion, the results showed that the integrated verification approach through MATLAB/Simulink models, if applied to any design to be verified, could speed up design verification and reduce the V and V tasks.

  11. A New Algorithm for On-Line Handwriting Signature Verification Based on Evolutionary Computation

    Institute of Scientific and Technical Information of China (English)

    ZHENG Jianbin; ZHU Guangxi

    2006-01-01

    The paper proposes an on-line signature verification algorithm, through which test sample and template signatures can be optimally matched, based on evolutionary computation (EC). Firstly, the similarity of signature curve segments is defined, and shift and scale transforms are introduced to account for the randomness of on-line signatures. Secondly, the paper puts forward a signature verification matching algorithm after establishing the mathematical model. Thirdly, the concrete realization of the algorithm based on EC is discussed. In addition, the influence of shift and scale on the matching result is fully considered in the algorithm. Finally, a computation example is given, and the matching results between the test sample curve and the template signature curve are analyzed in detail. The preliminary experiments reveal that this type of signature verification problem can be solved by EC.
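
    As a concrete illustration of the shift and scale transforms mentioned above, the following minimal Python sketch (our own illustration, not the authors' code; the EC search over segmentations and all names are assumptions) scores the similarity of two curve segments after fitting an optimal scale and shift by least squares:

    ```python
    import numpy as np

    def segment_similarity(template: np.ndarray, test: np.ndarray) -> float:
        """Score two 1D signature curve segments after removing an optimal
        scale a and shift b (least squares fit: template ~ a*test + b)."""
        a, b = np.polyfit(test, template, deg=1)
        rmse = np.sqrt(np.mean((a * test + b - template) ** 2))
        return 1.0 / (1.0 + rmse)   # 1.0 indicates a perfect match

    t = np.sin(np.linspace(0.0, 3.0, 50))
    print(segment_similarity(t, 2.0 * t + 0.3))                        # ~1.0
    print(segment_similarity(t, np.random.default_rng(0).random(50)))  # lower
    ```

    In the paper's setting, an evolutionary search would then optimize the segmentation and alignment between test and template curves, using a segment score of this kind as part of its fitness function.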

  12. Asteroseismology of Solar-type stars with Kepler III. Ground-based Data

    CERN Document Server

    Molenda-Zakowicz, Joanna; Sousa, Sergio; Frasca, Antonio; Biazzo, Katia; Huber, Daniel; Ireland, Mike; Bedding, Tim; Stello, Dennis; Uytterhoeven, Katrien; Dreizler, Stefan; De Cat, Peter; Briquet, Maryline; Catanzaro, Giovanni; Karoff, Christoffer; Frandsen, Soeren; Spezzi, Loredana; Catala, Claude

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by Kepler Asteroseismic Science Consortium Working Group 1 (KASC WG-1). The main goal of this coordinated research is the determination of fundamental stellar atmospheric parameters, which are used for computing asteroseismic models as well as for the verification of the Kepler Input Catalogue (KIC).

  13. SCENARIO AND TARGET SIMULATION FOR A GROUND BASED MULTIFUNCTION PHASED ARRAY RADAR

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper describes a scenario and target simulation which operates in non-real-time to provide full closed-loop operation of a ground based multifunction phased array radar simulation system in support of ballistic missile defence experiments against countermeasures. By simulating the target scattering signature and dynamical signature, this scenario and target simulation provides a realistic scenario source with which to evaluate the system performance of the multifunction phased array radar and to verify and validate key algorithms such as target tracking, multi-target imaging and target recognition.

  14. Exact Verification of Hybrid Systems Based on Bilinear SOS Representation

    CERN Document Server

    Yang, Zhengfeng; Lin, Wang

    2012-01-01

    In this paper, we address the problem of safety verification of nonlinear hybrid systems and stability analysis of nonlinear autonomous systems. A hybrid symbolic-numeric method is presented to efficiently compute exact inequality invariants of hybrid systems and exact estimates of regions of attraction of autonomous systems. Numerical invariants of a hybrid system, or an estimate of a region of attraction, are first obtained by solving a bilinear SOS program via the PENBMI solver or an iterative method; the modified Newton refinement and rational vector recovery techniques are then applied to obtain exact polynomial invariants and estimates of regions of attraction with rational coefficients. Experiments on benchmarks illustrate the efficiency of our algorithm.
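
    The rational vector recovery step lends itself to a compact illustration. The sketch below (our own, using continued-fraction rounding from Python's standard library; the subsequent exact verification of the recovered invariant is omitted) rounds numerically computed coefficients to nearby rationals with bounded denominator:

    ```python
    from fractions import Fraction

    def recover_rational_vector(v, max_den=100):
        """Round numerically computed coefficients to nearby rationals using
        continued-fraction (best rational) approximation."""
        return [Fraction(x).limit_denominator(max_den) for x in v]

    # e.g. coefficients returned by a numeric SOS / BMI solver:
    numeric = [0.3333333712, -0.4999999871, 1.0000000213]
    print(recover_rational_vector(numeric))   # -> 1/3, -1/2, 1
    ```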

  15. Verification of Interdomain Routing System Based on Formal Methods

    Institute of Scientific and Technical Information of China (English)

    ZANG Zhiyuan; LUO Guiming; YIN Chongyuan

    2009-01-01

    In networks, the stable path problem (SPP) usually results in oscillations in interdomain systems and may cause systems to become unstable. With the rapid development of internet technology, the occurrence of SPPs in interdomain systems has quite recently become a significant focus of research. A framework for checking SPPs is presented in this paper, with verification of an interdomain routing system using formal methods and the NuSMV software. Sufficient conditions and necessary conditions for determining SPP occurrence are presented, with proof of the method's effectiveness. Linear temporal logic was used to model an interdomain routing system and its properties were analyzed. An example is included to demonstrate the method's reliability.
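
    To make the SPP concrete, the following self-contained Python sketch (our illustration, not the paper's NuSMV encoding) brute-forces all path assignments of a classic "bad gadget" instance and finds that none is stable, which is exactly the situation that produces routing oscillations:

    ```python
    from itertools import product

    # Ranked permitted paths to destination 0 (best first) for the classic
    # "bad gadget" instance, known to have no stable path assignment.
    PERMITTED = {
        1: [(1, 2, 0), (1, 0)],
        2: [(2, 3, 0), (2, 0)],
        3: [(3, 1, 0), (3, 0)],
    }

    def realisable(path, assignment):
        """A path u -> v -> ... -> 0 is realisable iff its tail is exactly
        the path chosen by the next hop v (direct paths always are)."""
        return path[1] == 0 or assignment.get(path[1]) == path[1:]

    def best_choice(node, assignment):
        for path in PERMITTED[node]:          # paths are ranked best-first
            if realisable(path, assignment):
                return path
        return ()

    def is_stable(assignment):
        """Stable iff every node already uses its best realisable path."""
        return all(assignment[n] == best_choice(n, assignment)
                   for n in PERMITTED)

    nodes = sorted(PERMITTED)
    stable = []
    for combo in product(*(PERMITTED[n] for n in nodes)):
        assignment = dict(zip(nodes, combo))
        if is_stable(assignment):
            stable.append(assignment)

    print(stable if stable else "no stable assignment -- oscillation possible")
    ```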

  16. LithoScope: Simulation Based Mask Layout Verification with Physical Resist Model

    Science.gov (United States)

    Qian, Qi-De

    2002-12-01

    Simulation based mask layout verification and optimization is a cost-effective way to ensure high mask performance in wafer lithography. Because mask layout verification serves as a gateway to the expensive manufacturing process, the model used for verification must be more accurate than the models used upstream. In this paper, we demonstrate, for the first time, a software system for mask layout verification and optical proximity correction that employs a physical resist development model. The new system, LithoScope, predicts wafer patterning by solving optical and resist processing equations on a scale that until recently was considered impractical. Leveraging the predictive capability of the physical model, LithoScope can perform mask layout verification and optical proximity correction under a wide range of processing conditions and for any reticle enhancement technology without the need for multiple model development. We show the ability of the physical resist model to change iso-focal bias by optimizing resist parameters, which is critical for matching the experimental process window. We present line width variation statistics and chip-level process window predictions using a practical cell layout. We show that the LithoScope model can accurately describe the resist-intensive poly gate layer patterning. This system can be used to pre-screen mask data problems before manufacturing to reduce the overall cost of the mask and the product.

  17. An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices

    Directory of Open Access Journals (Sweden)

    Jingzhen Li

    2017-01-01

    In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of a volunteer's forearm is measured by a vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and the naive Bayesian method (NBM), respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.

  20. Information verification and encryption based on phase retrieval with sparsity constraints and optical inference

    Science.gov (United States)

    Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang

    2017-01-01

    A novel optical information verification and encryption method is proposed based on the inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both POMs need to be authenticated before being applied for decryption. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved with the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system, and there is no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of the proposed method.

  1. Verification of Concurrent Assembly Programs with a Petri Net Based Safety Policy

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Concurrent programs written in a machine-level language are used in many areas, but the verification of such programs brings new challenges to the programming language community. Most studies in the literature on verifying the safety properties of concurrent programs are for high-level languages, specifications, or calculi. Therefore, more studies are needed on concurrency verification for machine-level language programs. This paper describes a framework for a Petri net based safety policy for the verification of concurrent assembly programs, to exploit the capability of Petri nets in concurrency modeling. The concurrency safety properties can be considered separately using the net structure and by mixing Hoare logic and computation tree logic. Therefore, more useful higher-level safety properties can be specified and verified.

  2. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

    The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do not perform well in this setting. In this work we compare the performance of different noise reduction methods under different noise conditions in terms of speaker verification when the text is known and the system is trained on clean data (mis-matched conditions). We furthermore propose a new approach based...

  3. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    Science.gov (United States)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  5. An Improved Constraint-based system for the verification of security protocols

    NARCIS (Netherlands)

    Corin, R.J.; Etalle, Sandro; Hermenegildo, Manuel V.; Puebla, German

    We propose a constraint-based system for the verification of security protocols that improves upon the one developed by Millen and Shmatikov. Our system features (1) a significantly more efficient implementation, and (2) a monotonic behavior, which also makes it possible to detect flaws associated with partial runs.

  7. Ensemble-based approximation of observation impact using an observation-based verification metric

    Directory of Open Access Journals (Sweden)

    Matthias Sommer

    2016-07-01

    Knowledge of the contribution of observations to forecast accuracy is crucial for the refinement of observing and data assimilation systems. Several recent publications have highlighted the benefits of efficiently approximating this observation impact using adjoint methods or ensembles. This study proposes a modification of an existing method for computing observation impact in an ensemble-based data assimilation and forecasting system and applies the method to a pre-operational, convective-scale regional modelling environment. Instead of the analysis, the modified approach uses observation-based verification metrics to mitigate the effect of correlation between the forecast and its verification norm. Furthermore, a peculiar property in the distribution of individual observation impact values is used to define a reliability indicator for the accuracy of the impact approximation. Applying this method to a 3-day test period shows that a well-defined observation impact value can be approximated for most observation types, and the reliability indicator successfully depicts where results are not significant.

  8. Verification of Information Flow in Agent-Based Systems

    Science.gov (United States)

    Sabri, Khair Eddin; Khedri, Ridha; Jaskolka, Jason

    Analyzing information flow is beneficial for ensuring the satisfiability of security policies during the exchange of information between the agents of a system. In the literature, models such as the Bell-LaPadula model and the Chinese Wall model have been proposed to capture and govern the exchange of information among agents, and several verification techniques exist for analyzing information flow within programs or multi-agent systems. However, these models and techniques assume the atomicity of the exchanged information, meaning that the information cannot be decomposed or combined with other pieces of information; moreover, their policies prohibit any transfer of information from a high-level agent to a low-level agent. In this paper, we propose a technique that relaxes these assumptions. The proposed technique allows information to be classified into frames and finer-granularity policies to be articulated that involve information, its elements, or its frames. It also allows for information manipulation through several operations such as focusing and combining information. Relaxing the atomicity-of-information assumption permits an analysis that takes into account the ability of an agent to link elements of information in order to evolve its knowledge.
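
    For context, the atomic-information policies that the paper relaxes can be illustrated with a minimal "no write down" check over a linear security lattice (a sketch of the classic Bell-LaPadula flavor of policy, not the paper's frame-based technique; all names are illustrative):

    ```python
    # Linear security lattice: information may only flow upwards or laterally.
    LEVELS = {"public": 0, "confidential": 1, "secret": 2}

    def may_flow(sender_level: str, receiver_level: str) -> bool:
        """Atomic-information policy: block any high-to-low transfer."""
        return LEVELS[sender_level] <= LEVELS[receiver_level]

    exchanges = [
        ("alice", "secret", "bob", "confidential"),   # blocked: write down
        ("carol", "public", "bob", "confidential"),   # allowed: write up
    ]
    for src, s_lvl, dst, d_lvl in exchanges:
        verdict = "allowed" if may_flow(s_lvl, d_lvl) else "BLOCKED"
        print(f"{src} ({s_lvl}) -> {dst} ({d_lvl}): {verdict}")
    ```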

  9. M3 version 3.0: Verification and validation; Hydrochemical model of ground water at repository site

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Javier B. (Dept. of Earth Sciences, Univ. of Zaragoza, Zaragoza (Spain)); Laaksoharju, Marcus (Geopoint AB, Sollentuna (Sweden)); Skaarman, Erik (Abscondo, Bromma (Sweden)); Gurban, Ioana (3D-Terra (Canada))

    2009-01-15

    Hydrochemical evaluation is a complex type of work that is carried out by specialists. The outcome of this work is generally presented as qualitative models and process descriptions of a site. To support this work and help to quantify the processes in an objective way, a multivariate mathematical tool entitled M3 (Multivariate Mixing and Mass balance calculations) has been constructed. The computer code can be used to trace the origin of the groundwater and to calculate mixing proportions and mass balances from groundwater data. The M3 code is a groundwater response model, which means that changes in the groundwater chemistry in terms of sources and sinks are traced in relation to an ideal mixing model. The complexity of the measured groundwater data determines the configuration of the ideal mixing model, and deviations from the ideal mixing model are interpreted as being due to reactions. Assumptions concerning important mineral phases altering the groundwater, or uncertainties associated with thermodynamic constants, do not affect the modelling because the calculations are based solely on the measured groundwater composition. M3 uses the opposite approach to that of many standard hydrochemical models: in M3, mixing is evaluated and calculated first, and the constituents that cannot be described by mixing are described by reactions. The M3 model consists of three steps: the first is a standard principal component analysis, followed by mixing and finally mass balance calculations. The measured groundwater composition can be described in terms of mixing proportions (%), while the sinks and sources of an element associated with reactions are reported in mg/L. This report contains a set of verification and validation exercises with the intention of building confidence in the use of the M3 methodology. At the same time, clear answers are given to questions related to the accuracy and precision of the results, including the inherent uncertainties and the errors that can be made.
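
    The mixing step can be illustrated with a small numerical sketch. Assuming hypothetical end-member compositions and a measured sample (all values invented for illustration; M3's actual workflow begins with PCA), mixing proportions constrained to be non-negative and to sum to one can be estimated by augmented non-negative least squares, with the unexplained residual interpreted as a reaction source/sink term:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical end-member compositions (rows) for three reference waters;
    # columns are element concentrations in mg/L (illustrative values only).
    end_members = np.array([
        [5.0,    10.0,   300.0],   # meteoric water
        [6500.0, 3500.0,  90.0],   # brine
        [200.0,  120.0,  180.0],   # glacial melt
    ])
    sample = np.array([150.0, 95.0, 210.0])   # measured groundwater sample

    # Non-negative least squares with the closure condition sum(p) = 1
    # enforced softly through a heavily weighted extra row.
    w = 1e3
    A = np.vstack([end_members.T, w * np.ones(3)])
    b = np.concatenate([sample, [w]])
    p, _ = nnls(A, b)
    print("mixing proportions:", p.round(3))
    # Concentration not explained by mixing is attributed to reactions (mg/L):
    print("reaction source/sink:", (sample - end_members.T @ p).round(1))
    ```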

  11. Advances in SVM-Based System Using GMM Super Vectors for Text-Independent Speaker Verification

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jian; DONG Yuan; ZHAO Xianyu; YANG Hao; LU Liang; WANG Haila

    2008-01-01

    For text-independent speaker verification, the Gaussian mixture model (GMM) using a universal background model strategy and the GMM using support vector machines are the two most commonly used methodologies. Recently, a new SVM-based speaker verification method using GMM super vectors has been proposed. This paper describes the construction of a new speaker verification system and investigates the use of nuisance attribute projection and test normalization to further enhance performance. Experiments were conducted on the core test of the 2006 NIST speaker recognition evaluation corpus. The experimental results indicate that an SVM-based speaker verification system using GMM super vectors can achieve appealing performance. With the use of nuisance attribute projection and test normalization, the system performance can be significantly improved, with improvements in the equal error rate from 7.78% to 4.92% and detection cost function from 0.0376 to 0.0251.
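
    A minimal sketch of the GMM super vector construction (one common recipe, using scikit-learn and random placeholder features; normalisation details and the NIST evaluation pipeline are omitted, and parameter choices are assumptions) is shown below; the resulting vectors would be fed to an SVM classifier per target speaker:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # 1) Train a universal background model (UBM) on pooled background
    #    features (random placeholders standing in for MFCC frames).
    background = rng.normal(size=(2000, 20))
    ubm = GaussianMixture(n_components=8, covariance_type="diag",
                          random_state=0).fit(background)

    def supervector(utterance: np.ndarray, ubm: GaussianMixture,
                    relevance: float = 16.0) -> np.ndarray:
        """MAP-adapt the UBM means to one utterance and stack them."""
        post = ubm.predict_proba(utterance)    # responsibilities (frames, K)
        n_k = post.sum(axis=0)                 # soft occupation counts
        x_bar = (post.T @ utterance) / np.maximum(n_k, 1e-8)[:, None]
        alpha = (n_k / (n_k + relevance))[:, None]
        adapted = alpha * x_bar + (1.0 - alpha) * ubm.means_
        return adapted.ravel()

    # 2) One super vector per utterance; these become SVM inputs.
    sv = supervector(rng.normal(size=(300, 20)), ubm)
    print(sv.shape)   # (8 * 20,)
    ```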

  12. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H [University Medical Center Mannheim, University of Heidelberg, Mannheim, Baden-Wuerttemberg (Germany)

    2015-06-15

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could clearly be identified with the TD. Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable for routine treatment plan verification.
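
    The 2D gamma evaluation referred to above combines a dose-difference criterion with a distance-to-agreement criterion. Below is a brute-force sketch (our illustration with example 1%/1 mm parameters; clinical implementations use interpolation and optimised search):

    ```python
    import numpy as np

    def gamma_2d(ref, meas, spacing_mm, dose_tol=0.01, dist_tol_mm=1.0):
        """Brute-force 2D gamma index with global dose normalisation."""
        ny, nx = ref.shape
        norm = dose_tol * ref.max()
        ys, xs = np.mgrid[0:ny, 0:nx]
        gamma = np.empty_like(ref, dtype=float)
        for j in range(ny):
            for i in range(nx):
                dd = (meas - ref[j, i]) / norm                  # dose term
                dr = np.hypot(ys - j, xs - i) * spacing_mm / dist_tol_mm
                gamma[j, i] = np.sqrt(dd ** 2 + dr ** 2).min()  # best match
        return gamma

    ref = np.random.default_rng(1).random((20, 20))
    meas = ref + 0.002                      # small uniform dose offset
    g = gamma_2d(ref, meas, spacing_mm=0.5)
    print(f"pass rate (gamma <= 1): {(g <= 1).mean():.1%}")
    ```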

  13. Current trends in ground based solar magnetometry

    Science.gov (United States)

    Gosain, Sanjay

    2016-07-01

    Continuous observations of the sun, over more than a century, have led to several important discoveries in solar astronomy. These include the discovery of solar magnetism and its cyclic modulation, active region formation and decay and their role in energetic phenomena such as flares and coronal mass ejections (CMEs), the fine structure and dynamics of sunspots, and the small-scale organization of magnetic flux in the form of flux tubes. In this article we give a brief overview of advancements in solar observational techniques in recent decades and the results obtained from such observations. These include techniques to achieve high angular resolution, high spectral and polarimetric sensitivity, and innovative new detectors. The wide range of spatial, temporal and spectral domains exploited by solar astronomers to understand solar phenomena is discussed. Many new upcoming telescopes and instruments that are designed to address different aspects of solar physics are briefly described. Finally, we discuss the advantages of observing from the ground and how ground-based observations can complement space-based ones.

  14. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    Science.gov (United States)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
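
    As a flavor of what a bit-for-bit regression check involves (a generic sketch under our own naming, not LIVVkit's actual API), consider:

    ```python
    import numpy as np

    def bit_for_bit(test: np.ndarray, reference: np.ndarray) -> dict:
        """Exact-match regression check; on mismatch, report stats."""
        if test.shape != reference.shape:
            return {"status": "FAIL", "reason": "shape mismatch"}
        if np.array_equal(test, reference):
            return {"status": "PASS (bit-for-bit)"}
        diff = np.abs(test - reference)
        return {"status": "DIFF",
                "max_abs_err": float(diff.max()),
                "rmse": float(np.sqrt((diff ** 2).mean()))}

    print(bit_for_bit(np.ones(4), np.ones(4)))          # identical fields
    print(bit_for_bit(np.ones(4), np.ones(4) + 1e-9))   # round-off drift
    ```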

  15. Formalization and Verification of Business Process Modeling Based on UML and Petri Nets

    Institute of Scientific and Technical Information of China (English)

    YAN Zhi-jun; GAN Ren-chu

    2005-01-01

    In order to provide a quantitative analysis and verification method for activity diagram based business process modeling, a formal definition of activity diagrams is introduced, and the basic requirements for activity diagram based business process models are proposed. Furthermore, a standardized transformation technique between business process models and basic Petri nets is presented, and an analysis method for the soundness and well-structured properties of business processes is introduced.
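
    The transformation target, a basic Petri net, supports the kind of quantitative check the authors describe. A minimal Python sketch (our illustration with an invented three-transition process; real soundness analysis also checks for deadlocks and leftover tokens) tests whether the final marking is reachable from the initial one:

    ```python
    from collections import deque

    # Toy Petri net for a business process. Each transition consumes tokens
    # from its input places ("pre") and produces tokens in its outputs.
    TRANSITIONS = {
        "register": ({"start": 1},  {"review": 1}),
        "approve":  ({"review": 1}, {"done": 1}),
        "rework":   ({"review": 1}, {"start": 1}),
    }

    def enabled(marking, pre):
        return all(marking.get(p, 0) >= n for p, n in pre.items())

    def fire(marking, pre, post):
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return {p: n for p, n in m.items() if n > 0}

    def reachable(initial, target):
        """BFS over markings (finite here, so the search terminates)."""
        seen, queue = set(), deque([initial])
        while queue:
            m = queue.popleft()
            key = frozenset(m.items())
            if key in seen:
                continue
            seen.add(key)
            if m == target:
                return True
            for pre, post in TRANSITIONS.values():
                if enabled(m, pre):
                    queue.append(fire(m, pre, post))
        return False

    print(reachable({"start": 1}, {"done": 1}))  # True: process completes
    ```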

  16. Scenario-based verification of real-time systems using UPPAAL

    DEFF Research Database (Denmark)

    Li, Shuhao; Balaguer, Sandie; David, Alexandre

    2010-01-01

    This paper proposes two approaches to tool-supported automatic verification of dense real-time systems against scenario-based requirements, where a system is modeled as a network of timed automata (TAs) or as a set of driving live sequence charts (LSCs), and a requirement is specified as a separate monitored LSC chart. We make timed extensions to a kernel subset of the LSC language and define a trace-based semantics. By translating a monitored LSC chart to a behavior-equivalent observer TA and then non-intrusively composing this observer with the original TA modeled real-time system, the problem of scenario-based verification reduces to a computation tree logic (CTL) real-time model checking problem. In case the real-time system is modeled as a set of driving LSC charts, we translate these driving charts and the monitored chart into a behavior-equivalent network of TAs by using a “one...

  17. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    The work concerns formal verification of workflow-oriented software models using the deductive approach, considering the formal correctness of a model's behaviour. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of logical primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.

  18. Face verification system for Android mobile devices using histogram based features

    Science.gov (United States)

    Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu

    2016-07-01

    This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and face detection is then implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with different types of histogram based features are first obtained separately and then combined by weighted averaging. We evaluate the proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.
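
    For reference, the classic 8-neighbour LBP histogram used as one of the features can be sketched as follows (our own minimal version; the paper's Improved LBP variant differs in details not reproduced here):

    ```python
    import numpy as np

    def lbp_histogram(gray: np.ndarray, bins: int = 256) -> np.ndarray:
        """Basic 8-neighbour LBP codes, summarised as a normalised histogram."""
        g = gray.astype(np.int32)
        c = g[1:-1, 1:-1]                      # centre pixels
        # Neighbours clockwise from top-left; each contributes one bit.
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        code = np.zeros_like(c)
        for bit, (dy, dx) in enumerate(offsets):
            nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
            code |= (nb >= c).astype(np.int32) << bit
        hist, _ = np.histogram(code, bins=bins, range=(0, bins))
        return hist / hist.sum()               # normalised histogram feature

    img = (np.random.default_rng(0).random((64, 64)) * 255).astype(np.uint8)
    print(lbp_histogram(img).shape)   # (256,)
    ```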

  19. Introduction of a terrestrial free-space optical communications network facility: IN-orbit and Networked Optical ground stations experimental Verification Advanced testbed (INNOVA)

    Science.gov (United States)

    Toyoshima, Morio; Munemasa, Yasushi; Takenaka, Hideki; Takayama, Yoshihisa; Koyama, Yoshisada; Kunimori, Hiroo; Kubooka, Toshihiro; Suzuki, Kenji; Yamamoto, Shinichi; Taira, Shinichi; Tsuji, Hiroyuki; Nakazawa, Isao; Akioka, Maki

    2014-03-01

    A terrestrial free-space optical communications network facility, named the IN-orbit and Networked Optical ground stations experimental Verification Advanced testbed (INNOVA), is introduced. Many demonstrations have been conducted to verify the usability of sophisticated optical communications equipment in orbit. However, the influence of terrestrial weather conditions remains an issue to be solved. One potential solution is site diversity, where several ground stations are used. In such systems, implementing direct high-speed optical communications links for the transmission of data from satellites to terrestrial sites requires that links can be established even in the presence of clouds and rain. NICT is developing a terrestrial free-space optical communications network called INNOVA for future airborne and satellite-based optical communications projects. Several ground stations and environmental monitoring stations around Japan are being used to explore the site diversity concept. This paper describes the terrestrial free-space optical communications network facility, the monitoring stations around Japan for free-space laser communications, and potential research at NICT.

  20. Ground-based observations of Kepler asteroseismic targets

    DEFF Research Database (Denmark)

    Uytterhoeven, K.; Karoff, Christoffer

    2010-01-01

    We present the ground-based activities within the different working groups of the Kepler Asteroseismic Science Consortium (KASC). The activities aim at the systematic characterization of the 5000+ KASC targets, and at the collection of ground-based follow-up time-series data of selected promising...

  1. A solenoid-based active hydraulic engine mount: modelling, analysis, and verification

    OpenAIRE

    Hosseini, Ali

    2010-01-01

    The focus of this thesis is on the design, modelling, identification, simulation, and experimental verification of a low-cost solenoid-based active hydraulic engine mount. To build an active engine mount, a commercial on-off solenoid is modified to be used as an actuator and is embedded inside a hydraulic engine mount. The hydraulic engine mount is modelled and tested, the solenoid actuator is modelled and identified, and finally the models are integrated to obtain the analytical model of the...

  2. Power Gating Based Ground Bounce Noise Reduction

    Directory of Open Access Journals (Sweden)

    M. Uma Maheswari

    2014-08-01

    As low-power circuits have become popular, the decrease in supply voltage leads to an increase in leakage power as technology scales. To remove such leakage and provide better power efficiency, many power gating techniques are used. However, the leakage due to the ground connection of the active part of the circuit is higher than all other leakages. Since it is mainly due to the back EMF of the ground connection, it is called ground bounce noise. Different methodologies have been designed to reduce this noise. This paper presents the design of an efficient technique for ground bounce noise reduction using power gating circuits, with results compared using the DSCH and Microwind low-power tools. It analyzes adders (full adders) built with different types of power-gated circuits using low-power VLSI design techniques and presents comparison results between the different power gating methods.

  4. Movable Ground Based Recovery System for Reusable Space Flight Hardware

    Science.gov (United States)

    Sarver, George L. (Inventor)

    2013-01-01

    A reusable space flight launch system is configured to eliminate complex descent and landing systems from the space flight hardware and move them to maneuverable ground based systems. Precision landing of the reusable space flight hardware is enabled using a simple, lightweight aerodynamic device on board the flight hardware, such as a parachute, and one or more translating ground based vehicles, such as a hovercraft, that include active speed, orientation and directional control. The ground based vehicle maneuvers itself into position beneath the descending flight hardware, matches its speed and direction, and captures the flight hardware. The ground based vehicle contains propulsion, command and GN&C functionality as well as hardware for cushioning and retaining the space flight hardware on landing. The ground based vehicle propulsion system enables longitudinal and transverse maneuverability independent of its physical heading.

  5. Formal verification of software-based medical devices considering medical guidelines.

    Science.gov (United States)

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to their internal elements, without considering the interaction that these elements have with other devices or with the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of software-based medical devices. The medical devices are developed using the model-driven method Deterministic Models for Signal Processing of Embedded Systems (DMOSES), which uses Unified Modeling Language (UML) models as a basis for development. The UML activity diagram is used to describe medical guidelines as workflows, and the functionality of the medical devices is abstracted as a set of actions modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model checker; for this purpose, a formalization approach for the UML models using timed automata (TA) is presented. A set of requirements is verified by the proposed approach for navigation-guided biopsy, which shows the capability of identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition, an open-source Eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one

  6. Estimation of above ground biomass in boreal forest using ground-based Lidar

    Science.gov (United States)

    Taheriazad, L.; Moghadas, H.; Sanchez-Azofeifa, A.

    2017-05-01

    Assessing the above ground biomass of forests is important for carbon storage monitoring in the boreal forest. In this study, a new model is developed to estimate above ground biomass using ground-based Lidar data. 21 trees were measured and scanned across the study plot in the boreal forests of Alberta, Canada. The study area was scanned in the summer of 2014 to quantify the green biomass. The average total crown biomass and green biomass in this study were 377 kg (standard deviation, S.D. = 243 kg) and 6.42 kg (S.D. = 2.69 kg), respectively.

  7. Streaming-based verification of XML signatures in SOAP messages

    DEFF Research Database (Denmark)

    Somorovsky, Juraj; Jensen, Meiko; Schwenk, Jörg

    2010-01-01

    WS-Security is a standard providing message-level security in Web Services. Therewith, it ensures their integrity, confidentiality, and authenticity. However, using sophisticated security algorithms can lead to high memory consumption and long evaluation times. In combination with the standard DOM approach for XML processing, Web Services servers easily become a target of Denial-of-Service attacks. We present a solution for these problems: an external streaming-based WS-Security Gateway. Our implementation is capable of processing XML Signatures in SOAP messages using a streaming-based approach...

  8. The Application of GeoRSC Based on Domestic Satellite in Field Remote Sensing Anomaly Verification

    Science.gov (United States)

    Gao, Ting; Yang, Min; Han, Haihui; Li, Jianqiang; Yi, Huan

    2016-11-01

    GeoRSC is a digital remote sensing survey system based on domestic satellites. Using it, we carried out a field application test of remote sensing anomaly verification in the Nachitai area of Qinghai. The field test checked the system installation, the stability of system operation, the efficiency of reading and displaying remote sensing images and vector data, the security of the data management system, and the accuracy of BeiDou navigation. The test data indicate that the hardware and software can satisfy the needs of field remote sensing anomaly verification work, streamline the remote sensing survey workflow, and improve work efficiency. In the course of the experiment, we also found some shortcomings of the system and, drawing on the practical work, give suggestions for its improvement.

  9. Tree automata-based refinement with application to Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions: firstly, we handle tree automata rather than string automata and can thereby capture traces in any Horn clause derivations rather than just transition systems; secondly, we show how algorithms manipulating tree automata interact with abstract interpretations, establishing progress in refinement and generating refined clauses that eliminate causes of imprecision. We show how to derive a refined set of Horn clauses in which ... underlying the Horn clauses. Experiments using linear constraint problems and the abstract domain of convex polyhedra show that the refinement technique is practical and that iteration of abstract interpretation with tree automata-based refinement solves many challenging Horn clause verification problems.

  11. SAT-based verification for timed component connectors

    NARCIS (Netherlands)

    Kemper, S.

    2011-01-01

    Component-based software construction relies on suitable models underlying components, and in particular the coordinators which orchestrate component behaviour. Verifying correctness and safety of such systems amounts to model checking the underlying system model.

  12. Multimodal human verification using stereo-based 3D inforamtion, IR, and speech

    Science.gov (United States)

    Park, Changhan

    2007-04-01

    In this paper, we propose a personal verification method using 3D face information, infrared (IR) imagery, and speech to improve on the rate of single-biometric authentication. The false acceptance rate (FAR) and false rejection rate (FRR) have been a fundamental bottleneck of real-time personal verification. The proposed method uses principal component analysis (PCA) for face recognition and a hidden Markov model (HMM) for speech recognition, based on a stereo acquisition system with IR imagery. The 3D face information captures the face's depth and distance using the stereo system. The proposed system consists of eye detection, facial pose direction estimation, and PCA modules. An IR image of the human face presents its unique heat signature; IR images are used only to decide whether a human face is present. The system also uses fuzzy logic for the final decision of personal verification. Based on experimental results, the proposed system can reduce the FAR, which shows that the proposed method overcomes the limitation of single-biometric systems and provides stable person authentication in real time.

  13. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Directory of Open Access Journals (Sweden)

    Raquel Acero

    2016-11-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position in the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained prove the suitability of the virtual distances methodology as an alternative procedure for the verification of AACMMs using the indexed metrology platform.

  15. Verification of Gamma Knife extend system based fractionated treatment planning using EBT2 film

    Energy Technology Data Exchange (ETDEWEB)

    Natanasabapathi, Gopishankar; Bisht, Raj Kishor [Gamma Knife Unit, Department of Neurosurgery, Neurosciences Centre, All India Institute of Medical Sciences, Ansari Nagar, New Delhi 110029 (India)

    2013-12-15

    Purpose: This paper presents EBT2 film verification of fractionated treatment planning with the Gamma Knife (GK) Extend system, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. Methods: A human head shaped phantom simulated the verification process for fractionated Gamma Knife treatment. Phantom preparation for Extend frame based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens, Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 × 0.5 mm, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three sectors blocked in each shot was made. A dose prescription of 4 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery, while a set of films from the same batch was exposed to doses from 0 to 12 Gy for calibration purposes. An EPSON (Expression 10000 XL) scanner was used for scanning the exposed films in transparency mode, and the scanned films were analyzed with in-house MATLAB codes. Results: Gamma index analysis of the film measurements in comparison with the TPS calculated dose resulted in high pass rates >90% for tolerance criteria of 1%/1 mm. The isodose overlay and linear dose profiles of the film measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Conclusions: Through this study, the authors propose a treatment verification QA method for Extend frame based fractionated Gamma Knife radiosurgery using EBT2 film.

  17. Verification Based on Set-Abstraction Using the AIF Framework

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander

    The AIF framework is a novel method for analyzing advanced security protocols, web services, and APIs, based on a new abstract interpretation method. It consists of the specification language AIF and a translation/abstraction process that produces a set of first-order Horn clauses, which can then be checked with several back-ends such as SPASS. We discuss in this article how to use AIF to model a variety of examples.

  18. Early Development of UVM based Verification Environment of Image Signal Processing Designs using TLM Reference Model of RTL

    Directory of Open Access Journals (Sweden)

    Abhishek Jain

    2014-01-01

    With the semiconductor industry trend of "smaller is better," taking an idea to a final product while adding innovation to the product portfolio and remaining competitive and profitable puts pressure on, and creates a need for, continuous innovation in CAD flows, process management, and project execution. Project schedules are very tight, and achieving first-silicon success is key. This necessitates quicker verification with a better coverage matrix, which in turn requires early development of the verification environment with wider test vectors, without waiting for RTL to become available. In this paper, we present a novel approach to the early development of a reusable multi-language verification flow by addressing four major verification activities: (1) early creation of an executable specification; (2) early creation of the verification environment; (3) early development of test vectors; and (4) better and increased re-use of blocks. Although this paper focuses on early development of a UVM-based verification environment for image signal processing designs using a TLM reference model of the RTL, the same concept can be extended to non-image-signal-processing designs.

  19. Color information verification system based on singular value decomposition in gyrator transform domains

    Science.gov (United States)

    Abuturab, Muhammad Rafiq

    2014-06-01

    A new color image security system based on singular value decomposition (SVD) in gyrator transform (GT) domains is proposed. In the encryption process, a color image is decomposed into red, green, and blue channels. Each channel is independently modulated by random phase masks and then separately gyrator transformed with different parameters. The three gyrator spectra are joined by multiplication to produce one gray ciphertext, which is separated into U, S, and V parts by SVD. All three parts are individually gyrator transformed at different transformation angles, and the three encoded components can be assigned to different authorized users for highly secure verification. Only when all the authorized users place the U, S, and V parts in the correct multiplication order in the verification system can the correct information be obtained with all the right keys. In the proposed method, SVD offers a one-way, asymmetric decomposition algorithm and is an optimal matrix decomposition in a least-squares sense. The transformation angles of the GT provide very sensitive additional keys, and the pre-generated keys for the red, green, and blue channels serve as decryption (private) keys. Because all three encrypted parts are gray-scale ciphertexts with stationary white-noise distributions, they have a degree of camouflage. These advantages enhance the security and robustness of the scheme. Numerical simulations are presented to support the viability of the proposed verification system.
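
    The share-splitting step at the heart of the verification scheme is easy to see with numpy alone. The sketch below omits the gyrator-transform stages and random phase masks and shows only how SVD splits a ciphertext into three parts that recombine correctly only in the right multiplication order; the random ciphertext and array sizes are placeholders.

```python
import numpy as np

rng = np.random.default_rng(7)
cipher = rng.random((64, 64))              # stands in for the gray ciphertext

# Split the ciphertext into three shares via SVD: cipher = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(cipher)
shares = {"U": U, "S": np.diag(s), "V": Vt}   # one share per authorized user

# Verification succeeds only with all three shares in the correct order
recovered = shares["U"] @ shares["S"] @ shares["V"]
assert np.allclose(recovered, cipher)

# Any wrong multiplication order fails the check
wrong = shares["S"] @ shares["U"] @ shares["V"]
print(np.allclose(wrong, cipher))          # False
```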

  20. Secure voice-based authentication for mobile devices: vaulted voice verification

    Science.gov (United States)

    Johnson, R. C.; Scheirer, Walter J.; Boult, Terrance E.

    2013-05-01

    As the use of biometrics becomes more widespread, the privacy concerns that stem from it are becoming more apparent. As the usage of mobile devices grows, so does the desire to implement biometric identification into such devices, the large majority of which are mobile phones. While work is being done to implement different types of biometrics on mobile phones, such as photo-based biometrics, voice is a more natural choice. The idea of voice as a biometric identifier has been around for a long time; one of the major concerns with using voice as an identifier is its instability. We have developed a protocol that addresses those instabilities and preserves privacy. This paper describes a novel protocol that allows a user to authenticate by voice on a mobile/remote device without compromising their privacy. We first discuss the Vaulted Verification protocol, which has recently been introduced in the research literature, and describe its limitations. We then introduce a novel adaptation and extension of the Vaulted Verification protocol to voice, dubbed Vaulted Voice Verification (V3). Following that, we show a performance evaluation and conclude with a discussion of security and future work.

  1. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    Science.gov (United States)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage; on the other, there is the second problem of ensuring integrity and authenticity in order to achieve legal acceptance in a court of law. Conceiving digital evidence simply as an instance of digital data, it is evident that modern cryptography offers elegant solutions to this second problem. However, to our knowledge, no previous work has proposed a systematic model with a holistic view of all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to secure the process of capturing and preserving digital evidence. PKIDEV employs, among others, cryptographic techniques such as digital signatures and secure time-stamping, as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
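
    A minimal sketch of the sign-and-timestamp idea using the Python cryptography package. This is not the PKIDEV implementation; a real deployment would obtain the timestamp from a trusted time-stamping authority (e.g., per RFC 3161) rather than the local clock, and the record layout here is an assumption.

```python
import hashlib
import json
from datetime import datetime, timezone

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

def seal_evidence(private_key, evidence: bytes) -> dict:
    """Bind an evidence image to a capture time with a digital signature."""
    record = {
        "sha256": hashlib.sha256(evidence).hexdigest(),
        "captured_utc": datetime.now(timezone.utc).isoformat(),  # local clock!
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = private_key.sign(
        payload,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    ).hex()
    return record

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
sealed = seal_evidence(key, b"...disk image bytes...")
print(sealed["sha256"], sealed["captured_utc"])
```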

  2. Formal Verification Techniques Based on Boolean Satisfiability Problem

    Institute of Scientific and Technical Information of China (English)

    Xiao-Wei Li; Guang-Hui Li; Ming Shao

    2005-01-01

    This paper exploits the Boolean satisfiability (SAT) problem in equivalence checking and model checking, respectively. A combinational equivalence checking method based on incremental satisfiability is presented. This method chooses candidate equivalent pairs with some new techniques and uses an incremental satisfiability algorithm to improve performance. By substituting the internal equivalent pairs and converting the equivalence relations into conjunctive normal form (CNF) formulas, the approach avoids false negatives and reduces the search space of the SAT procedure. Experimental results on the ISCAS'85 benchmark circuits show that the presented approach is faster and more robust than those in the literature. The paper also presents an algorithm for extracting an unsatisfiable core, which has an important application in abstraction and refinement for model checking to alleviate the state-space explosion bottleneck. The error of approximate extraction is analyzed by means of simulation, and the analysis reveals an interesting phenomenon: as the density of the formula increases, the average error of the extraction decreases. An exact extraction approach for the minimally unsatisfiable (MU) subformula, referred to as the pre-assignment algorithm, is also proposed; both theoretical analysis and experimental results show that it is more efficient.
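
    The miter construction underlying SAT-based equivalence checking fits in a few lines. Assuming the python-sat package, the sketch encodes two implementations of a two-input AND (one via De Morgan's law) in CNF and asserts that their outputs differ; unsatisfiability of the miter proves equivalence. The toy circuits and variable numbering are illustrative.

```python
from pysat.solvers import Glucose3   # pip install python-sat

# Variables: a=1, b=2, f=3 (output of circuit 1), g=4 (output of circuit 2);
# na=5, nb=6, o=7 are Tseitin auxiliaries for circuit 2.
cnf = [
    # Circuit 1: f <-> (a AND b)
    [-3, 1], [-3, 2], [3, -1, -2],
    # Circuit 2: g <-> NOT(NOT a OR NOT b)   (De Morgan dual)
    [-5, -1], [5, 1],                 # na <-> NOT a
    [-6, -2], [6, 2],                 # nb <-> NOT b
    [-7, 5, 6], [7, -5], [7, -6],     # o  <-> na OR nb
    [-4, -7], [4, 7],                 # g  <-> NOT o
    # Miter: assert the two outputs differ (f XOR g)
    [3, 4], [-3, -4],
]

with Glucose3(bootstrap_with=cnf) as solver:
    if solver.solve():
        print("not equivalent; counterexample:", solver.get_model())
    else:
        print("circuits are equivalent (miter is UNSAT)")
```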

  3. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment

    Energy Technology Data Exchange (ETDEWEB)

    Fuangrod, Todsaporn [Faculty of Engineering and Built Environment, School of Electrical Engineering and Computer Science, the University of Newcastle, NSW 2308 (Australia); Woodruff, Henry C.; O’Connor, Daryl J. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308 (Australia); Uytven, Eric van; McCurdy, Boyd M. C. [Division of Medical Physics, CancerCare Manitoba, 675 McDermot Avenue, Winnipeg, Manitoba R3E 0V9 (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba R3T 2N2 (Canada); Kuncic, Zdenka [School of Physics, University of Sydney, Sydney, NSW 2006 (Australia); Greer, Peter B. [Faculty of Science and IT, School of Mathematical and Physical Sciences, the University of Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Calvary Mater Newcastle Hospital, Locked Bag 7, Hunter region Mail Centre, Newcastle, NSW 2310 (Australia)

    2013-09-15

    Purpose: To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. Methods: The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system's real-time operation was simulated offline using previously acquired images for 19 IMRT patient deliveries, with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system's sensitivity and performance. Results: The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using the 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). Conclusions: A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
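
    The cumulative signal check can be sketched independently of the physics model: keep running sums of predicted and measured frame intensities and flag the delivery once they diverge beyond a tolerance. This is a simplified stand-in for the published method (which also compares MLC apertures); the 10% tolerance echoes the error magnitude above but is otherwise an assumption.

```python
import numpy as np

def cumulative_signal_check(predicted_frames, measured_frames, tol=0.10):
    """Flag gross delivery errors from the running cumulative EPID signal.

    predicted_frames, measured_frames : iterables of 2D image arrays
    tol : allowed relative deviation of the cumulative signal (e.g. 10%)
    Returns (frame_index, deviation) at the first violation, else (None, 0).
    """
    cum_pred = cum_meas = 0.0
    for k, (pred, meas) in enumerate(zip(predicted_frames, measured_frames)):
        cum_pred += pred.sum()
        cum_meas += meas.sum()
        if cum_pred > 0:
            deviation = abs(cum_meas - cum_pred) / cum_pred
            if deviation > tol:
                return k, deviation     # frame at which the error is flagged
    return None, 0.0
```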

  4. VERIFICATION OF GRAPHEMES USING NEURAL NETWORKS IN AN HMM-BASED ON-LINE KOREAN HANDWRITING RECOGNITION SYSTEM

    NARCIS (Netherlands)

    So, S.J.; Kim, J.; Kim, J.H.

    2004-01-01

    This paper presents a neural network based verification method for an HMM-based on-line Korean handwriting recognition system. It penalizes unreasonable grapheme hypotheses and complements the global and structural information of the HMM-based recognition system, which is intrinsically based on local information.

  5. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    Science.gov (United States)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  6. Mask synthesis and verification based on geometric model for surface micro-machined MEMS

    Institute of Scientific and Technical Information of China (English)

    LI Jian-hua; LIU Yu-sheng; GAO Shu-ming

    2005-01-01

    Traditional MEMS (microelectromechanical system) design methodology is not a structured method and has become an obstacle to creative MEMS design. In this paper, a novel method of mask synthesis and verification for surface-micromachined MEMS is proposed, based on the geometric model of a MEMS device. The emphasis is on synthesizing the masks from the layer model generated from the geometric model of the device. The method comprises several steps: correction of the layer model, generation of initial and final masks (including multi-layer etch masks), and mask simulation. Finally, some test results are given.

  7. Design Verification Enhancement of FPGA-based Plant Protection System Trip Logics for Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim; Jung, Jae Cheon [KEPCO, Ulsan (Korea, Republic of); Heo, Gyun Young [Kyunghee University, Yongin (Korea, Republic of)

    2016-05-15

    As part of strengthening the application of FPGA technology and finding solutions to its challenges in NPPs, the International Atomic Energy Agency (IAEA) has joined the sponsorship of the Topical Group on FPGA Applications in NPPs (TG-FAN), which has met seven times to date in the form of an annual workshop (the International Workshop on the Application of FPGAs in NPPs) held since 2008. The workshops attract significant interest and have a broad representation of stakeholders, such as regulators, utilities, research organizations, system designers, and vendors from various countries, who convene to discuss current issues regarding instrumentation and control (I&C) systems as well as FPGA applications. Two of the many technical issues identified by the group are the lifecycle of FPGA-based platforms, systems, and applications, and methods and tools for V&V. Therefore, in this work, several design steps involving a model-based systems engineering process as well as a MATLAB/SIMULINK model, which lead to enhanced design verification, are employed. The verified and validated design output works correctly and effectively. In conclusion, the model-based systems engineering approach and the structured step-by-step design modeling techniques, including the SIMULINK model utilized in this work, have shown how the verification of FPGA PPS trip logic designs can be enhanced. If these design approaches are employed in the design of FPGA-based I&C systems, the design can be easily verified and validated.
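
    The verification pattern described here, checking trip logic against an executable specification, can be illustrated with a toy 2-out-of-4 coincidence voter in Python standing in for the MATLAB/SIMULINK reference model; the voting scheme and names are assumptions, not the paper's actual PPS design.

```python
from itertools import product

def trip_rtl(ch):
    """Implementation under verification: FPGA-style gate-level 2-out-of-4."""
    a, b, c, d = ch
    return (a & b) | (a & c) | (a & d) | (b & c) | (b & d) | (c & d)

def trip_spec(ch):
    """Executable specification: trip when at least 2 of 4 channels demand."""
    return sum(ch) >= 2

# Exhaustive design verification over all 2^4 input combinations
for ch in product([0, 1], repeat=4):
    assert bool(trip_rtl(ch)) == trip_spec(ch), ch
print("trip logic matches the executable specification")
```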

  8. Verification of the two-dimensional hydrodynamic model based on remote sensing

    Science.gov (United States)

    Sazonov, Alexey; Mikhailukova, Polina; Krylenko, Inna; Frolova, Natalya; Kireeva, Mariya

    2016-04-01

    Mathematical modeling methods are used increasingly to evaluate possible damage, identify potential flood zones, and assess the influence of individual factors affecting a river during the passage of a flood. Calculations were performed with the domestic software package «STREAM-2D», which is based on the numerical solution of the two-dimensional St. Venant equations. One of the major challenges in mathematical modeling is the verification of the model. This is usually done using data on water levels from hydrological stations: the smaller the difference between the actual level and the simulated one, the better the quality of the model. Data from hydrological stations are not always available, so alternative sources of verification, such as remote sensing, are increasingly used. The aim of this work is to develop a method of verifying a hydrodynamic model based on a comparison of the actual flooded area, determined by automated interpretation of satellite images from different imaging systems, with the flooded area obtained from the model. The study areas are the Lena River, the North Dvina River, and the Amur River near Blagoveshchensk. We used satellite images made by optical and radar sensors: SPOT-5/HRG, Resurs-F, and Radarsat-2. Flooded areas were calculated using unsupervised classification (ISODATA and K-means) for the optical images and segmentation for Radarsat-2. Knowing the flow rate and the water level on a given date for the upper and lower boundaries of the model, respectively, it is possible to calculate the flooded area by means of STREAM-2D and GIS technology. All the existing vector layers with the boundaries of flooding are included in a GIS project for flood area calculation. This study was supported by the Russian Science Foundation, project no. 14-17-00155.
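
    A minimal sketch of the comparison step, assuming the satellite-derived flood mask and the modeled one have been rasterized to a common grid; the intersection-over-union metric and function name are illustrative choices, not necessarily the authors' exact measure of agreement.

```python
import numpy as np

def flood_agreement(observed, simulated, cell_area_km2):
    """Compare a satellite-derived flood mask with a modeled one.

    observed, simulated : boolean rasters on the same grid
    cell_area_km2       : area of one raster cell (e.g. 30 m cells -> 9e-4)
    """
    observed, simulated = observed.astype(bool), simulated.astype(bool)
    inter = (observed & simulated).sum()
    union = (observed | simulated).sum()
    return {
        "observed_km2":  observed.sum() * cell_area_km2,
        "simulated_km2": simulated.sum() * cell_area_km2,
        "iou": inter / union if union else 1.0,   # 1.0 = perfect agreement
    }
```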

  9. 4D offline PET-based treatment verification in scanned ion beam therapy: a phantom study

    Science.gov (United States)

    Kurz, Christopher; Bauer, Julia; Unholtz, Daniel; Richter, Daniel; Stützer, Kristin; Bert, Christoph; Parodi, Katia

    2015-08-01

    At the Heidelberg Ion-Beam Therapy Center, patient irradiation with scanned proton and carbon ion beams is verified by offline positron emission tomography (PET) imaging: the β+-activity measured within the patient is compared to a prediction calculated on the basis of the treatment planning data in order to identify potential delivery errors. Currently, this monitoring technique is limited to the treatment of static target structures. However, intra-fractional organ motion imposes considerable additional challenges to scanned ion beam radiotherapy. In this work, the feasibility and potential of time-resolved (4D) offline PET-based treatment verification with a commercial full-ring PET/CT (x-ray computed tomography) device are investigated for the first time, based on an experimental campaign with moving phantoms. Motion was monitored during the gated beam delivery as well as the subsequent PET acquisition and was taken into account in the corresponding 4D Monte-Carlo simulations and data evaluation. Under the given experimental conditions, millimeter agreement between the prediction and measurement was found. Dosimetric consequences due to the phantom motion could be reliably identified. The agreement between PET measurement and prediction in the presence of motion was found to be similar to that in static reference measurements, thus demonstrating the potential of 4D PET-based treatment verification for future clinical applications.

  10. A Quarter Active Suspension System Based Ground-Hook Controller

    OpenAIRE

    Turnip Arjon

    2016-01-01

    An alternative design technique for an active suspension system of a vehicle, using a developed ground-hook damping system as a reference, is proposed. The controller parameters are determined using the Lyapunov method and can be tuned to precisely achieve the type of desired response given by the reference model. The simulation results show that the designed active suspension system based on the ground-hook reference model is able to significantly improve ride comfort and road holding compared with semi-active suspension.
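
    A minimal quarter-car sketch of the ground-hook idea: the actuator force between the sprung and unsprung masses is commanded to emulate a fictitious damper between the wheel and the ground. All parameter values are illustrative, not taken from the paper, and an actuator between the masses can only approximate the ideal ground-hook damper.

```python
import numpy as np

# Quarter-car parameters (illustrative values only)
ms, mu = 300.0, 40.0          # sprung / unsprung mass [kg]
ks, kt = 16e3, 190e3          # suspension / tire stiffness [N/m]
b_gh = 1.5e3                  # ground-hook damping coefficient [N s/m]

def quarter_car_rhs(x, road):
    """State x = [zs, vs, zu, vu]; the passive damper is omitted for brevity.

    The actuator force f_act acts between the two masses and is commanded
    to mimic a ground-hook damper opposing the wheel velocity vu.
    """
    zs, vs, zu, vu = x
    f_act = -b_gh * vu                        # emulated ground-hook force
    a_s = (-ks * (zs - zu) + f_act) / ms      # sprung-mass acceleration
    a_u = (ks * (zs - zu) - kt * (zu - road) - f_act) / mu
    return np.array([vs, a_s, vu, a_u])
```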

  11. Quantitative Safety: Linking Proof-Based Verification with Model Checking for Probabilistic Systems

    CERN Document Server

    Ndukwu, Ukachukwu

    2009-01-01

    This paper presents a novel approach for augmenting proof-based verification with performance-style analysis of the kind employed in state-of-the-art model checking tools for probabilistic systems. Quantitative safety properties, usually specified as probabilistic system invariants and modeled in proof-based environments, are evaluated using bounded model checking techniques. Our specific contributions include the statement of a theorem that is central to model checking safety properties of proof-based systems, the establishment of a procedure, and its full implementation in a prototype system (YAGA) which readily transforms a probabilistic model specified in a proof-based environment into an equivalent verifiable PRISM model equipped with reward structures. The reward structures capture the exact interpretation of the probabilistic invariants and can reveal succinct information about the model during experimental investigations. Finally, we demonstrate the novelty of the technique on a probabilistic library cas...

  12. Constraining millennial scale dynamics of a Greenland tidewater glacier for the verification of a calving criterion based numerical model

    Science.gov (United States)

    Lea, J.; Mair, D.; Rea, B.; Nick, F.; Schofield, E.

    2012-04-01

    The ability to successfully model the behaviour of Greenland tidewater glaciers is pivotal to understanding the controls on their dynamics and potential impact on global sea level. However, to have confidence in the results of numerical models in this setting, the evidence required for robust verification must extend well beyond the existing instrumental record. Perhaps uniquely for a major Greenland outlet glacier, both the advance and retreat dynamics of Kangiata Nunata Sermia (KNS), Nuuk Fjord, SW Greenland, over the last ~1000 years can be reasonably constrained through a combination of geomorphological, sedimentological, and archaeological evidence. It is therefore an ideal location to test the ability of the latest generation of calving-criterion-based tidewater models to explain millennial-scale dynamics. This poster presents geomorphological evidence recording the post-Little Ice Age maximum dynamics of KNS, derived from high-resolution satellite imagery. This includes evidence of annual retreat moraine complexes, suggesting controlled rather than catastrophic retreat between pinning points, in addition to a series of ice-dammed lake shorelines, allowing detailed interpretation of the dynamics of the glacier as it thinned and retreated. Pending ground truthing, this evidence will contribute towards the calibration of results obtained from a calving-criterion numerical model (Nick et al., 2010), driven by an air temperature reconstruction for the KNS region determined from ice core data.

  13. Evaluation of the Vocal Tract Length Normalization Based Classifiers for Speaker Verification

    Directory of Open Access Journals (Sweden)

    Walid Hussein

    2016-12-01

    This paper proposes and evaluates classifiers based on Vocal Tract Length Normalization (VTLN) for a text-dependent speaker verification (SV) task with short testing utterances. This type of task is important in commercial applications and is not easily addressed with methods designed for long utterances, such as JFA and i-vectors. In contrast, VTLN is a speaker compensation scheme that can lead to significant improvements in speech recognition accuracy with just a few seconds of speech samples. A novel scheme to generate new classifiers is employed by incorporating the observation vector sequence compensated with VTLN. The modified sequence of feature vectors and the corresponding warping factors are used to generate classifiers whose scores are combined by a Support Vector Machine (SVM) based SV system. The proposed scheme provides an average reduction in EER of 14% when compared with the baseline system based on the likelihood of observation vectors.

  14. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities relate both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  15. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu

    2007-01-01

    In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 computers in a network. At such a system size, software and hardware failures are quite frequent. To minimize system downtime, the Trigger-DAQ control system includes advanced verification and diagnostics facilities. The operator should use tests and the expertise of the TDAQ and detector developers in order to diagnose and recover from errors, automatically where possible. The TDAQ control system is built as a distributed tree of controllers, where the behavior of each controller is defined in a rule-based language allowing easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system, with different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  16. Range verification of passively scattered proton beams based on prompt gamma time patterns

    Science.gov (United States)

    Testa, Mauro; Min, Chul Hee; Verburg, Joost M.; Schümann, Jan; Lu, Hsiao-Ming; Paganetti, Harald

    2014-07-01

    We propose a proton range verification technique for passive scattering proton therapy systems where spread out Bragg peak (SOBP) fields are produced with rotating range modulator wheels. The technique is based on the correlation of time patterns of the prompt gamma ray emission with the range of protons delivering the SOBP. The main feature of the technique is the ability to verify the proton range with a single point of measurement and a simple detector configuration. We performed four-dimensional (time-dependent) Monte Carlo simulations using TOPAS to show the validity and accuracy of the technique. First, we validated the hadronic models used in TOPAS by comparing simulations and prompt gamma spectrometry measurements published in the literature. Second, prompt gamma simulations for proton range verification were performed for the case of a water phantom and a prostate cancer patient. In the water phantom, the proton range was determined with 2 mm accuracy with a full ring detector configuration for a dose of ~2.5 cGy. For the prostate cancer patient, 4 mm accuracy on range determination was achieved for a dose of ~15 cGy. The results presented in this paper are encouraging in view of a potential clinical application of the technique.
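
    One simplified way to see the time-pattern idea: cross-correlate a measured prompt-gamma time profile against a simulated reference, so that a range error appears as a shift of the pattern within the modulation cycle. This is an illustrative reduction, not the authors' analysis; all names are assumptions.

```python
import numpy as np

def time_pattern_shift(reference, measured, bin_width_ms):
    """Estimate the temporal shift between a simulated reference prompt-gamma
    time pattern and a measured one via normalized cross-correlation.

    reference, measured : 1D count histograms over one modulation cycle
    bin_width_ms        : width of one time bin in milliseconds
    """
    ref = (reference - reference.mean()) / reference.std()
    mea = (measured - measured.mean()) / measured.std()
    corr = np.correlate(mea, ref, mode="full")
    lag = int(corr.argmax()) - (len(ref) - 1)
    return lag * bin_width_ms   # positive: measured pattern arrives late
```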

  17. Model-based mask verification on critical 45nm logic masks

    Science.gov (United States)

    Sundermann, F.; Foussadier, F.; Takigawa, T.; Wiley, J.; Vacca, A.; Depre, L.; Chen, G.; Bai, S.; Wang, J.-S.; Howell, R.; Arnoux, V.; Hayano, K.; Narukawa, S.; Kawashima, S.; Mohri, H.; Hayashi, N.; Miyashita, H.; Trouiller, Y.; Robert, F.; Vautrin, F.; Kerrien, G.; Planchot, J.; Martinelli, C.; Di-Maria, J. L.; Farys, V.; Vandewalle, B.; Perraud, L.; Le Denmat, J. C.; Villaret, A.; Gardin, C.; Yesilada, E.; Saied, M.

    2008-05-01

    In the continuous battle to improve critical dimension (CD) uniformity, especially for advanced 45-nanometer (nm) logic products, one important recent advance is the ability to accurately predict the mask CD uniformity contribution to the overall global wafer CD error budget. In most wafer process simulation models, the mask error contribution is embedded in the optical and/or resist models. We have separated the mask effects, however, by creating a short-range mask process model (MPM) for each unique mask process and a long-range CD uniformity mask bias map (MBM) for each individual mask. By establishing a mask bias map, we are able to incorporate the mask CD uniformity signature into our modelling simulations and measure the effects on global wafer CD uniformity and hot spots. We have also examined several ways of proving the efficiency of this approach, including analyzing OPC hot spot signatures with and without the mask bias map (see Figure 1) and comparing the precision of the model contour prediction to wafer SEM images. In this paper we show the different steps of mask bias map generation and its use for advanced 45nm logic node layers, along with the current results of this new dynamic application to improve hot spot verification through Brion Technologies' model-based mask verification loop.

  18. Laser based bi-directional Gbit ground links with the Tesat transportable adaptive optical ground station

    Science.gov (United States)

    Heine, Frank; Saucke, Karen; Troendle, Daniel; Motzigemba, Matthias; Bischl, Hermann; Elser, Dominique; Marquardt, Christoph; Henninger, Hennes; Meyer, Rolf; Richter, Ines; Sodnik, Zoran

    2017-02-01

    Optical ground stations can be an alternative to radio-frequency-based transmit (forward) and receive (return) systems for data relay services and other applications, including direct-to-earth optical communications from low-earth-orbit spacecraft, deep space receivers, space-based quantum key distribution systems, and Tbps-capacity feeder links to geostationary spacecraft. The Tesat Transportable Adaptive Optical Ground Station has been operational since September 2015 at the European Space Agency site in Tenerife, Spain. This paper reports the results of the 2016 experimental campaigns, including the characterization of the optical channel from Tenerife for an optimized coding scheme, the performance of the T-AOGS under different atmospheric conditions, and the first successful measurements of the suitability of the Alphasat LCT optical downlink performance for future continuous-variable quantum key distribution systems.

  19. Ground point filtering of UAV-based photogrammetric point clouds

    Science.gov (United States)

    Anders, Niels; Seijmonsbergen, Arie; Masselink, Rens; Keesstra, Saskia

    2016-04-01

    Unmanned Aerial Vehicles (UAVs) have proved invaluable for generating high-resolution and multi-temporal imagery. Based on photographic surveys, 3D surface reconstructions can be derived photogrammetrically, producing point clouds, orthophotos, and surface models. For geomorphological or ecological applications it may be necessary to separate ground points from vegetation points, but existing filtering methods are designed for point clouds derived using other methods, e.g. laser scanning. The purpose of this paper is to test three filtering algorithms for the extraction of ground points from point clouds derived from low-altitude aerial photography. Three subareas were selected from a single flight, representing different scenarios: 1) low relief, sparsely vegetated; 2) low relief, moderately vegetated; 3) medium relief and moderately vegetated. The three filtering methods classify ground points in different ways, based on 1) RGB color values from training samples, 2) TIN densification as implemented in LAStools, and 3) an iterative surface lowering algorithm. Ground points are then interpolated into a digital terrain model using inverse distance weighting. The results suggest that different landscapes require different filtering methods for optimal ground point extraction. While iterative surface lowering and TIN densification are fully automated, color-based classification requires fine-tuning in order to optimize the filtering results. Finally, we conclude that filtering photogrammetric point clouds can provide a cheap alternative to laser scan surveys for creating digital terrain models in sparsely vegetated areas.
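
    For flavor, the sketch below implements a much cruder ground filter than TIN densification or iterative surface lowering: it keeps points lying within a height tolerance of the lowest point in their grid cell. Cell size and tolerance are illustrative assumptions.

```python
import numpy as np

def grid_minimum_ground_filter(points, cell=1.0, dz=0.15):
    """Keep points within dz of the lowest point of their grid cell.

    points : (N, 3) array of x, y, z coordinates
    cell   : grid cell size in the same units as x, y
    dz     : height tolerance above the local minimum
    """
    cells = [tuple(c) for c in np.floor(points[:, :2] / cell).astype(int)]
    zmin = {}
    for c, z in zip(cells, points[:, 2]):          # per-cell minimum height
        if c not in zmin or z < zmin[c]:
            zmin[c] = z
    keep = np.array([z - zmin[c] <= dz for c, z in zip(cells, points[:, 2])])
    return points[keep]
```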

  20. Magnetic nanoparticles-based extraction and verification of nucleic acids from different sources.

    Science.gov (United States)

    Ma, Chao; Li, Chuanyan; Wang, Fang; Ma, Ningning; Li, Xiaolong; Li, Zhiyang; Deng, Yan; Wang, Zhifei; Xi, Zhijiang; Tang, Yongjun; Hel, Nongyue

    2013-04-01

    In many molecular biology and genetic technology studies, the amount of available DNA can be one of the important criteria for selecting samples from different sources. Compared with genomic DNA methods using organic solvents or traditional commercial kits, a method based on magnetic nanoparticles (MNPs) and adsorption technology has many remarkable advantages: it is time-saving and cost-effective, avoids laborious centrifugation or precipitation steps, and, more importantly, has great potential for, and is especially suitable for, automated DNA extraction and scale-up. In this paper, the extraction efficiencies of genomic nucleic acids based on magnetic nanoparticles from four different sources, including bacteria, yeast, human blood, and virus samples, are compared and verified. Measurement and verification of the extracted genomic nucleic acids showed that all genomic nucleic acids extracted using the MNP method can be of high yield and available for subsequent molecular biology steps.

  1. GLAST and Ground-Based Gamma-Ray Astronomy

    Science.gov (United States)

    McEnery, Julie

    2008-01-01

    The launch of the Gamma-ray Large Area Space Telescope together with the advent of a new generation of ground-based gamma-ray detectors such as VERITAS, HESS, MAGIC and CANGAROO, will usher in a new era of high-energy gamma-ray astrophysics. GLAST and the ground based gamma-ray observatories will provide highly complementary capabilities for spectral, temporal and spatial studies of high energy gamma-ray sources. Joint observations will cover a huge energy range, from 20 MeV to over 20 TeV. The LAT will survey the entire sky every three hours, allowing it both to perform uniform, long-term monitoring of variable sources and to detect flaring sources promptly. Both functions complement the high-sensitivity pointed observations provided by ground-based detectors. Finally, the large field of view of GLAST will allow a study of gamma-ray emission on large angular scales and identify interesting regions of the sky for deeper studies at higher energies. In this poster, we will discuss the science returns that might result from joint GLAST/ground-based gamma-ray observations and illustrate them with detailed source simulations.

  2. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  3. PLM-based Approach for Design Verification and Validation using Manufacturing Process Knowledge

    Directory of Open Access Journals (Sweden)

    Luis Toussaint

    2010-02-01

    Out of 100 hours of engineering work, only 20 are dedicated to real engineering and 80 are spent on what is considered routine activity. Readjusting the ratio of innovative vs. routine work is a considerable challenge in a product lifecycle management (PLM) strategy. The main objective is therefore to develop an approach to accelerate routine processes in engineering design. The proposed methodology, called FabK, consists of capturing manufacturing knowledge and applying it to the design verification and validation of new engineering designs. The approach is implemented in a Web-based PLM prototype and a Computer Aided Design system. A series of experiments from an industrial case study is introduced to provide significant results.

  4. A ROBUST GA/KNN BASED HYPOTHESIS VERIFICATION SYSTEM FOR VEHICLE DETECTION

    Directory of Open Access Journals (Sweden)

    Nima Khairdoost

    2015-03-01

    Vehicle detection is an important issue in driver assistance systems and self-guided vehicles and includes two stages: hypothesis generation and verification. In the first stage, potential vehicles are hypothesized, and in the second stage, all hypotheses are verified. The focus of this work is on the second stage. We extract Pyramid Histogram of Oriented Gradients (PHOG) features from a traffic image as candidate feature vectors for detecting vehicles. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) are applied to these PHOG feature vectors in parallel as dimension reduction and feature selection tools. After feature fusion, we use a Genetic Algorithm (GA) and a cosine-similarity-based K Nearest Neighbor (KNN) classifier to improve the performance and generalization of the features. Our tests show good classification accuracy: more than 97% correct classification on realistic on-road vehicle images.
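
    A compressed sketch of the verification stage with scikit-learn, assuming PHOG features are already extracted: PCA for dimension reduction followed by a cosine-similarity KNN classifier. The GA feature-selection and LDA branches of the paper are omitted, and the toy data and parameters are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Toy stand-ins for PHOG feature vectors (1 = vehicle, 0 = non-vehicle)
rng = np.random.default_rng(0)
X = rng.random((200, 680))          # 680-D PHOG vectors (illustrative size)
y = rng.integers(0, 2, 200)

clf = make_pipeline(
    PCA(n_components=40),           # dimension reduction
    KNeighborsClassifier(n_neighbors=5, metric="cosine", algorithm="brute"),
)
clf.fit(X[:150], y[:150])
print("hold-out accuracy:", clf.score(X[150:], y[150:]))
```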

  5. Verification of photon attenuation characteristics for 3D printer based small animal lung model

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Se Ho; Lee, Seung Wook [Pusan National University, Busan (Korea, Republic of); Han, Su Chul; Park, Seung Woo [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2016-05-15

    Since it is difficult to measure the absorbed dose to mice in vivo, replica mice are mostly used as an alternative. In this study, a realistic mouse phantom was fabricated using a 3D printer (Objet500 Connex3, Stratasys, USA). Elemental inks, as the 3D printer materials, were selected to correspond to mouse tissue. To represent lung, the selected material was partially used with an air layer. In order to verify material equivalence, a super-flex bolus was simply compared to verify photon attenuation characteristics. In the case of lung, Hounsfield units (HU) of the phantom were compared with those of a live mouse. In this study, we fabricated a mouse phantom using a 3D printer and practically verified its photon attenuation characteristics. The fabricated phantom shows tissue equivalence as well as similar geometry to a live mouse. As 3D printing techniques continue to mature, 3D-printer-based small preclinical animal phantoms will increase the reliability of absorbed dose verification in small animals for preclinical studies.

  6. Truth in Complex Adaptive Systems Models Should Be Based on Proof by Constructive Verification

    Science.gov (United States)

    Shipworth, David

    It is argued that the truth status of emergent properties of complex adaptive systems models should be based on an epistemology of proof by constructive verification and therefore on the ontological axioms of a non-realist logical system such as constructivism or intuitionism. `Emergent' properties of complex adaptive systems (CAS) models create particular epistemological and ontological challenges. These challenges bear directly on current debates in the philosophy of mathematics and in theoretical computer science. CAS research, with its emphasis on computer simulation, is heavily reliant on models which explore the entailments of Formal Axiomatic Systems (FAS). The incompleteness results of Gödel, the incomputability results of Turing, and the Algorithmic Information Theory results of Chaitin, undermine a realist (platonic) truth model of emergent properties. These same findings support the hegemony of epistemology over ontology and point to alternative truth models such as intuitionism, constructivism and quasi-empiricism.

  7. A Quarter Active Suspension System Based Ground-Hook Controller

    Directory of Open Access Journals (Sweden)

    Turnip Arjon

    2016-01-01

    An alternative design technique for an active suspension system of a vehicle, using a developed ground-hook damping system as a reference, is proposed. The controller parameters are determined using the Lyapunov method and can be tuned to precisely achieve the type of desired response given by the reference model. The simulation results show that the designed active suspension system based on the ground-hook reference model is able to significantly improve ride comfort and road holding compared with semi-active suspension.

  8. GEARS: An Enterprise Architecture Based On Common Ground Services

    Science.gov (United States)

    Petersen, S.

    2014-12-01

    Earth observation satellites collect a broad variety of data used in applications that range from weather forecasting to climate monitoring. Within NOAA, the National Environmental Satellite, Data, and Information Service (NESDIS) supports these applications by operating satellites in both geosynchronous and polar orbits. Traditionally, NESDIS has acquired and operated its satellites as stand-alone systems with their own command and control, mission management, processing, and distribution systems. As the volume, velocity, veracity, and variety of sensor data and products produced by these systems continue to increase, NESDIS is migrating to a new concept of operations in which it will operate and sustain the ground infrastructure as an integrated enterprise. Based on a series of common ground services, the Ground Enterprise Architecture System (GEARS) approach promises greater agility, flexibility, and efficiency at reduced cost. This talk describes the new architecture and associated development activities, and presents the results of initial efforts to improve product processing and distribution.

  9. Integrated Train Ground Radio Communication System Based TD-LTE

    Institute of Scientific and Technical Information of China (English)

    ZHAO Hongli; CAO Yuan; ZHU Li; XU Wei

    2016-01-01

    In existing metro systems, the train-ground radio communication systems for different applications are deployed independently. Investing in and constructing communication infrastructures repeatedly wastes substantial social resources and makes it difficult to maintain all these infrastructures. We present the communication Quality of Service (QoS) requirements for different train-ground radio applications. An integrated TD-LTE based train-ground radio communication system for the metro system (LTE-M) is then designed. In order to test the LTE-M system performance, an indoor testing environment is set up. A channel simulator and programmable attenuators are used to simulate the real metro environment. Extensive test results show that the designed LTE-M system performance satisfies metro communication requirements.

  10. Ground-based observations of Kepler asteroseismic targets

    CERN Document Server

    Uytterhoeven, K; Southworth, J; Randall, S; Ostensen, R; Molenda-Zakowicz, J; Marconi, M; Kurtz, D W; Kiss, L; Gutierrez-Soto, J; Frandsen, S; De Cat, P; Bruntt, H; Briquet, M; Zhang, X B; Telting, J H; Steslicki, M; Ripepi, V; Pigulski, A; Paparo, M; Oreiro, R; Choong, Ngeow Chow; Niemczura, E; Nemec, J; Narwid, A; Mathias, P; Martin-Ruiz, S; Lehman, H; Kopacki, G; Karoff, C; Jackiewicz, J; Henden, A A; Handler, G; Grigachene, A; Green, E M; Garrido, R; Machado, L Fox; Debosscher, J; Creevey, O L; Catanzaro, G; Bognar, Z; Biazzo, K; Bernabei, S

    2010-01-01

    We present the ground-based activities within the different working groups of the Kepler Asteroseismic Science Consortium (KASC). The activities aim at the systematic characterization of the 5000+ KASC targets, and at the collection of ground-based follow-up time-series data of selected promising Kepler pulsators. So far, 35 different instruments at 30 telescopes at 22 different observatories in 12 countries are in use, and a total of more than 530 observing nights has been awarded. (Based on observations made with the Isaac Newton Telescope, William Herschel Telescope, Nordic Optical Telescope, Telescopio Nazionale Galileo, Mercator Telescope (La Palma, Spain), and IAC-80 (Tenerife, Spain). Also based on observations taken at the observatories of Sierra Nevada, San Pedro Martir, Vienna, Xinglong, Apache Point, Lulin, Tautenburg, Loiano, Serra la Nave, Asiago, McDonald, Skinakas, Pic du Midi, Mauna Kea, Steward Observatory, Bialkow Observatory of the Wroclaw University, Piszkesteto Mountain Station, Observato...

  11. Ground-Based Lidar for Atmospheric Boundary Layer Ozone Measurements

    Science.gov (United States)

    Kuang, Shi; Newchurch, Michael J.; Burris, John; Liu, Xiong

    2013-01-01

    Ground-based lidars are suitable for long-term ozone monitoring as a complement to satellite and ozonesonde measurements. However, current ground-based lidars are unable to consistently measure ozone below 500 m above ground level (AGL) due to both engineering issues and high retrieval sensitivity to various measurement errors. In this paper, we present our instrument design, retrieval techniques, and preliminary results that focus on the high-temporal profiling of ozone within the atmospheric boundary layer (ABL) achieved by the addition of an inexpensive and compact mini-receiver to the previous system. For the first time, to the best of our knowledge, the lowest, consistently achievable observation height has been extended down to 125 m AGL for a ground-based ozone lidar system. Both the analysis and preliminary measurements demonstrate that this lidar measures ozone with a precision generally better than 10% at a temporal resolution of 10 min and a vertical resolution from 150 m at the bottom of the ABL to 550 m at the top. A measurement example from summertime shows that inhomogeneous ozone aloft was affected by both surface emissions and the evolution of ABL structures.
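
    Ozone lidars of this kind typically retrieve ozone through the differential absorption (DIAL) relation, in which number density is proportional to the range derivative of the log ratio of the off-line and on-line returns. A bare-bones sketch, neglecting aerosol and molecular correction terms; the names are illustrative.

```python
import numpy as np

def dial_ozone_number_density(z_cm, p_on, p_off, delta_sigma_cm2):
    """Basic DIAL retrieval: n(z) = d/dz[ln(P_off / P_on)] / (2 * delta_sigma).

    z_cm            : range bins [cm]
    p_on, p_off     : background-subtracted signals at the on/off wavelengths
    delta_sigma_cm2 : ozone differential absorption cross section [cm^2]
    Returns ozone number density [cm^-3]; aerosol terms are neglected here.
    """
    return np.gradient(np.log(p_off / p_on), z_cm) / (2.0 * delta_sigma_cm2)
```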

  12. Ground-based lidar for atmospheric boundary layer ozone measurements.

    Science.gov (United States)

    Kuang, Shi; Newchurch, Michael J; Burris, John; Liu, Xiong

    2013-05-20

    Ground-based lidars are suitable for long-term ozone monitoring as a complement to satellite and ozonesonde measurements. However, current ground-based lidars are unable to consistently measure ozone below 500 m above ground level (AGL) due to both engineering issues and high retrieval sensitivity to various measurement errors. In this paper, we present our instrument design, retrieval techniques, and preliminary results that focus on the high-temporal profiling of ozone within the atmospheric boundary layer (ABL) achieved by the addition of an inexpensive and compact mini-receiver to the previous system. For the first time, to the best of our knowledge, the lowest, consistently achievable observation height has been extended down to 125 m AGL for a ground-based ozone lidar system. Both the analysis and preliminary measurements demonstrate that this lidar measures ozone with a precision generally better than ±10% at a temporal resolution of 10 min and a vertical resolution from 150 m at the bottom of the ABL to 550 m at the top. A measurement example from summertime shows that inhomogeneous ozone aloft was affected by both surface emissions and the evolution of ABL structures.

  13. On the validation of SPDM task verification facility

    NARCIS (Netherlands)

    Ma, Ou; Wang, Jiegao; Misra, Sarthak; Liu, Michael

    2004-01-01

    This paper describes a methodology for validating a ground-based, hardware-in-the-loop, space-robot simulation facility. This facility, called the ‘‘SPDM task verification facility,’’ is being developed by the Canadian Space Agency for the purpose of verifying the contact dynamics performance of the Special Purpose Dexterous Manipulator (SPDM).

  14. Ground-Based Calibration Of A Microwave Landing System

    Science.gov (United States)

    Kiriazes, John J.; Scott, Marshall M., Jr.; Willis, Alfred D.; Erdogan, Temel; Reyes, Rolando

    1996-01-01

    A system of microwave instrumentation and data-processing equipment was developed to enable ground-based calibration of a microwave scanning-beam landing system (MSBLS) at distances of about 500 to 1,000 ft from the MSBLS transmitting antenna. It ensures the accuracy of the MSBLS near the touchdown point without the expense and complex logistics of aircraft-based testing. Modified versions have proved useful in calibrating aircraft instrument landing systems.

  15. Scenario based seismic hazard assessment and its application to the seismic verification of relevant buildings

    Science.gov (United States)

    Romanelli, Fabio; Vaccari, Franco; Altin, Giorgio; Panza, Giuliano

    2016-04-01

    The procedure we developed, and applied to a few relevant cases, leads to the seismic verification of a building by: a) using a scenario-based neo-deterministic approach (NDSHA) for the calculation of the seismic input, and b) checking the numerical model of an existing building against free-vibration measurements of the real structure. The key point of this approach is the strict collaboration, from the definition of the seismic input to the monitoring of the response of the building in the calculation phase, between the seismologist and the civil engineer. The vibrometry study allows the engineer to adjust the computational model in the direction suggested by the experimental result of a physical measurement. Once the model has been calibrated by vibrometric analysis, one can select in the design spectrum the proper range of periods of interest for the structure. Then realistic values of spectral acceleration, which include the appropriate amplification obtained by modeling a "scenario" input applied to the final model, can be selected. Generally, but not necessarily, the "scenario" spectra lead to higher accelerations than those deduced from the spectra in the national codes (i.e., NTC 2008 for Italy). The task of the verifying engineer is to ensure that the outcome of the verification is conservative and realistic. We show some examples of the application of the procedure to relevant buildings (e.g., schools) of the Trieste Province. The adoption of the scenario input has, in most cases, increased the number of critical elements that have to be taken into account in the design of reinforcements. However, the higher cost associated with the increase of elements to reinforce is reasonable, especially considering the significant reduction of the risk level.

  16. Verification & Validation of High-Order Short-Characteristics-Based Deterministic Transport Methodology on Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Azmy, Yousry [North Carolina State Univ., Raleigh, NC (United States); Wang, Yaqi [North Carolina State Univ., Raleigh, NC (United States)

    2013-12-20

    The research team has developed a practical, high-order, discrete-ordinates, short characteristics neutron transport code for three-dimensional configurations represented on unstructured tetrahedral grids that can be used for realistic reactor physics applications at both the assembly and core levels. This project will perform a comprehensive verification and validation of this new computational tool against both a continuous-energy Monte Carlo simulation (e.g. MCNP) and experimentally measured data, an essential prerequisite for its deployment in reactor core modeling. Verification is divided into three phases. The team will first conduct spatial mesh and expansion order refinement studies to monitor convergence of the numerical solution to reference solutions. This is quantified by convergence rates that are based on integral error norms computed from the cell-by-cell difference between the code’s numerical solution and its reference counterpart. The latter is either analytic or very fine-mesh numerical solutions from independent computational tools. For the second phase, the team will create a suite of code-independent benchmark configurations to enable testing the theoretical order of accuracy of any particular discretization of the discrete ordinates approximation of the transport equation. For each tested case (i.e. mesh and spatial approximation order), researchers will execute the code and compare the resulting numerical solution to the exact solution on a per cell basis to determine the distribution of the numerical error. The final activity comprises a comparison to continuous-energy Monte Carlo solutions for zero-power critical configuration measurements at Idaho National Laboratory’s Advanced Test Reactor (ATR). Results of this comparison will allow the investigators to distinguish between modeling errors and the above-listed discretization errors introduced by the deterministic method, and to separate the sources of uncertainty.
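
    The convergence-rate bookkeeping in the first verification phase reduces to the standard observed-order formula; a small sketch with illustrative numbers for a nominally second-order scheme.

```python
import numpy as np

def observed_order(h, err):
    """Observed convergence order from successive mesh refinements.

    h, err : arrays of mesh sizes and corresponding integral error norms.
    Returns the order p estimated between consecutive refinement levels:
        p = ln(e1 / e2) / ln(h1 / h2)
    """
    h, err = np.asarray(h, float), np.asarray(err, float)
    return np.log(err[:-1] / err[1:]) / np.log(h[:-1] / h[1:])

# Example: errors for a roughly second-order scheme halving the mesh each time
print(observed_order([0.4, 0.2, 0.1], [1.6e-2, 4.1e-3, 1.0e-3]))  # ~[1.96, 2.04]
```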

  17. Ground-based Nuclear Detonation Detection (GNDD) Technology Roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Leslie A.

    2014-01-13

    This GNDD Technology Roadmap is intended to provide guidance to potential researchers and help management define research priorities to achieve technology advancements for the ground-based nuclear explosion monitoring science pursued by the Ground-based Nuclear Detonation Detection (GNDD) Team within the Office of Nuclear Detonation Detection in the National Nuclear Security Administration (NNSA) of the U.S. Department of Energy (DOE). Four science-based elements were selected to encompass the entire scope of nuclear monitoring research and development (R&D) necessary to facilitate breakthrough scientific results as well as deliver impactful products. Promising future R&D is delineated, including dual-use R&D associated with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Important research themes and associated metrics are identified, along with a progression of accomplishments, represented by a selected bibliography, that are precursors to major improvements in nuclear explosion monitoring.

  19. Operational verification of a blow out preventer utilizing fiber Bragg grating based strain gauges

    Science.gov (United States)

    Turner, Alan L.; Loustau, Philippe; Thibodeau, Dan

    2015-05-01

    Ultra-deep water BOP (Blowout Preventer) operation poses numerous challenges in obtaining accurate knowledge of current system integrity and component condition; a salient example is the difficulty of verifying closure of the pipe and shearing rams during and after well control events. Ascertaining the integrity of these functions is currently based on a manual volume measurement performed with a stopwatch. Advances in sensor technology now permit more accurate methods of BOP condition monitoring. Fiber optic sensing technology, and particularly fiber optic strain gauges, have evolved to a point where we can derive a good representation of what is happening inside a BOP by installing sensors on the outside shell. Function signatures can be baselined to establish thresholds that indicate successful function activation. Building on this baseline, signal variation over time can then be used to assess degradation of these functions and subsequent failure to function. Monitoring the BOP from the outside has the advantage of gathering data through a system that can be interfaced with risk-based integrity management software and/or a smart monitoring system that analyzes BOP control redundancies without the requirement of interfacing with OEM control systems. The paper will present the results of ongoing work on a fully instrumented 13-½" 10,000 psi pipe ram. Instrumentation includes commonly used pressure transducers, accelerometers, flow meters, and optical strain gauges. Correlation will be presented between flow, pressure, and acceleration signatures and the fiber optic strain gauge's response as it relates to functional verification and component-level degradation trending.
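    For orientation, the strain recovery behind fiber Bragg grating sensing follows the standard relation d(lambda_B)/lambda_B = (1 - p_e)*strain + k_T*dT. The sketch below uses generic textbook constants (e.g. p_e of roughly 0.22 for silica fiber), not the calibration of the instrumented ram described in the record.

      # Minimal sketch of FBG strain recovery (generic constants, illustrative only).
      LAMBDA_B = 1550.0e-9   # nominal Bragg wavelength (m)
      P_E      = 0.22        # effective photo-elastic coefficient of silica fiber
      K_T      = 6.7e-6      # combined thermo-optic + thermal-expansion term (1/degC)

      def strain_from_shift(d_lambda_m, d_temp_c=0.0):
          """Return mechanical strain from a Bragg wavelength shift,
          after removing the temperature contribution."""
          total = d_lambda_m / LAMBDA_B            # relative wavelength shift
          return (total - K_T * d_temp_c) / (1.0 - P_E)

      # A 12 pm shift with a 0.5 degC temperature drift:
      eps = strain_from_shift(12e-12, 0.5)
      print(f"strain = {eps * 1e6:.1f} microstrain")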

  20. Performance evaluation of wavelet-based face verification on a PDA recorded database

    Science.gov (United States)

    Sellahewa, Harin; Jassim, Sabah A.

    2006-05-01

    The rise of international terrorism and the rapid increase in fraud and identity theft have added urgency to the task of developing biometric-based person identification as a reliable alternative to conventional authentication methods. Human identification based on face images is a tough challenge in comparison to identification based on fingerprints or iris recognition. Yet, due to its unobtrusive nature, face recognition is the preferred method of identification for security-related applications. The success of such systems will depend on the support of massive infrastructures. Current mobile communication devices (3G smart phones) and PDAs are equipped with a camera which can capture both still images and streaming video clips, and a touch-sensitive display panel. Besides convenience, such devices provide an adequately secure infrastructure for sensitive and financial transactions, by protecting against fraud and repudiation while ensuring accountability. Biometric authentication systems for mobile devices would have obvious advantages in conflict scenarios when communication from beyond enemy lines is essential to save soldier and civilian lives. In areas of conflict or disaster, the luxury of fixed infrastructure is not available or is destroyed. In this paper, we present a wavelet-based face verification scheme that has been specifically designed and implemented on a currently available PDA. We report on its performance on the benchmark audio-visual BANCA database and on a newly developed PDA-recorded audio-visual database that includes indoor and outdoor recordings.
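    As an illustration of the general idea (not the authors' exact scheme), a low-frequency wavelet subband can serve as a compact, computationally cheap face feature vector. The sketch below uses the PyWavelets package; the wavelet choice, decomposition level, and similarity threshold are all illustrative assumptions.

      import numpy as np
      import pywt

      def wavelet_features(face, level=2, wavelet="haar"):
          """Return the level-`level` approximation subband of a grayscale
          face image, flattened and normalised, as a feature vector."""
          coeffs = pywt.wavedec2(face, wavelet, level=level)
          approx = coeffs[0]                     # low-frequency subband
          v = approx.ravel().astype(float)
          return v / (np.linalg.norm(v) + 1e-12)

      def verify(probe, template, threshold=0.9):
          """Accept the claimed identity if the cosine similarity between
          the probe and the enrolled template exceeds the threshold."""
          return float(probe @ template) >= threshold

      # Toy usage with random arrays standing in for enrolment/probe captures:
      rng = np.random.default_rng(0)
      enrolled = wavelet_features(rng.random((64, 64)))
      probe = wavelet_features(rng.random((64, 64)))
      print(verify(probe, enrolled))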

  1. The Impact of Verification Area Design on Tropical Cyclone Targeted Observations Based on the CNOP Method

    Institute of Scientific and Technical Information of China (English)

    ZHOU Feifan; MU Mu

    2011-01-01

    This study investigated the impact of different verification-area designs on the sensitive areas identified using the conditional nonlinear optimal perturbation (CNOP) method for tropical cyclone targeted observations. The sensitive areas identified using the first singular vector (FSV) method, which is the linear approximation of CNOP, were also investigated for comparison. By analyzing the validity of the sensitive areas, the proper design of a verification area was developed. Tropical cyclone Rananim, which occurred in August 2004 in the northwest Pacific Ocean, was studied. Two sets of verification areas were designed; one changed position, and the other changed both size and position. The CNOP and its identified sensitive areas were found to be less sensitive to small variations of the verification areas than those of the FSV and its sensitive areas. With larger variations of the verification area, the CNOP and the FSV, as well as their identified sensitive areas, changed substantially. In terms of reducing forecast errors in the verification area, the CNOP-identified sensitive areas were more beneficial than those identified using FSV. The design of the verification area is important for cyclone prediction. The verification area should be designed with a proper size according to the possible locations of the cyclone obtained from the ensemble forecast results. In addition, the development trend of the cyclone analyzed from its dynamic mechanisms was another reference. When the general position of the verification area was determined, a small variation in size or position had little influence on the results of CNOP.
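    In essence, the CNOP is the initial perturbation that maximises nonlinear forecast-error growth subject to an initial-amplitude constraint. A toy sketch of that optimisation is given below; the Lorenz-63 system stands in for the forecast model, the full state stands in for the verification area, and all numbers are illustrative, so this is a conceptual sketch rather than the paper's method.

      import numpy as np
      from scipy.optimize import minimize

      def forecast(x0, dt=0.01, steps=500):
          """Toy nonlinear 'forecast model': forward-Euler Lorenz-63."""
          x = np.array(x0, dtype=float)
          for _ in range(steps):
              dx = np.array([10.0 * (x[1] - x[0]),
                             x[0] * (28.0 - x[2]) - x[1],
                             x[0] * x[1] - 8.0 / 3.0 * x[2]])
              x += dt * dx
          return x

      x0 = np.array([1.0, 3.0, 15.0])    # analysis (basic state)
      delta = 0.1                        # initial-amplitude constraint ||p|| <= delta
      base = forecast(x0)

      # CNOP: maximise nonlinear error growth at verification time,
      # i.e. minimise its negative under the amplitude constraint.
      cost = lambda p: -np.linalg.norm(forecast(x0 + p) - base)
      con = {"type": "ineq", "fun": lambda p: delta - np.linalg.norm(p)}
      res = minimize(cost, 0.01 * np.ones(3), constraints=[con])
      print("CNOP:", res.x, " error growth:", -res.fun)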

  2. Mass Spectrometry-based Assay for High Throughput and High Sensitivity Biomarker Verification

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Xuejiang; Tang, Keqi

    2017-06-14

    Searching for disease-specific biomarkers has become a major undertaking in the biomedical research field, as the effective diagnosis, prognosis, and treatment of many complex human diseases are largely determined by the availability and the quality of the biomarkers. A successful biomarker, as an indicator of a specific biological or pathological process, is usually selected from a large group of candidates by a strict verification and validation process. To be clinically useful, the validated biomarkers must be detectable and quantifiable by the selected testing techniques in their related tissues or body fluids. Due to its easy accessibility, a protein biomarker would ideally be identified in blood plasma or serum. However, most disease-related protein biomarkers in blood exist at very low concentrations (<1 ng/mL) and are “masked” by many non-significant species at concentrations orders of magnitude higher. The extreme requirements on measurement sensitivity, dynamic range, and specificity make method development extremely challenging. Current clinical protein biomarker measurement relies primarily on antibody-based immunoassays, such as ELISA. Although the technique is sensitive and highly specific, the development of a high-quality protein antibody is both expensive and time consuming. The limited capability of assay multiplexing also makes the measurement extremely low-throughput, rendering it impractical when hundreds to thousands of potential biomarkers need to be quantitatively measured across multiple samples. Mass spectrometry (MS)-based assays have recently been shown to be a viable alternative for high-throughput, quantitative candidate protein biomarker verification. Among them, the triple quadrupole MS-based assay is the most promising one. When it is coupled with liquid chromatography (LC) separation and an electrospray ionization (ESI) source, a triple quadrupole mass spectrometer operating in a special selected reaction monitoring (SRM) mode ...

  3. A comparative study of satellite and ground-based phenology.

    Science.gov (United States)

    Studer, S; Stöckli, R; Appenzeller, C; Vidale, P L

    2007-05-01

    Long time series of ground-based plant phenology, as well as more than two decades of satellite-derived phenological metrics, are currently available to assess the impacts of climate variability and trends on terrestrial vegetation. Traditional plant phenology provides very accurate information on individual plant species, but with limited spatial coverage. Satellite phenology allows monitoring of terrestrial vegetation on a global scale and provides an integrative view at the landscape level. Linking the strengths of both methodologies has high potential value for climate impact studies. We compared a multispecies index from ground-observed spring phases with two types (maximum slope and threshold approach) of satellite-derived start-of-season (SOS) metrics. We focus on Switzerland from 1982 to 2001 and show that temporal and spatial variability of the multispecies index correspond well with the satellite-derived metrics. All phenological metrics correlate with temperature anomalies as expected. The slope approach proved to deviate strongly from the temporal development of the ground observations as well as from the threshold-defined SOS satellite measure. The slope spring indicator is considered to indicate a different stage in vegetation development and is therefore less suited as a SOS parameter for comparative studies in relation to ground-observed phenology. Satellite-derived metrics are, however, very susceptible to snow cover, and it is suggested that this snow cover should be better accounted for by the use of newer satellite sensors.
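    The threshold-type SOS metric mentioned above can be illustrated with a short sketch: SOS is taken as the first day of year on which the (smoothed) NDVI exceeds a fixed fraction of its seasonal amplitude. The NDVI curve and the 50% fraction below are synthetic stand-ins, not values from the study.

      import numpy as np

      def sos_threshold(doy, ndvi, frac=0.5):
          """Start of season: first day of year where NDVI crosses
          winter_min + frac * (summer_max - winter_min)."""
          thr = ndvi.min() + frac * (ndvi.max() - ndvi.min())
          above = np.nonzero(ndvi >= thr)[0]
          return int(doy[above[0]]) if above.size else None

      # Synthetic annual NDVI curve (10-day composites):
      doy = np.arange(1, 366, 10)
      ndvi = 0.25 + 0.45 / (1.0 + np.exp(-(doy - 130) / 12.0))  # green-up ~DOY 130
      print("SOS (day of year):", sos_threshold(doy, ndvi))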

  4. 4D offline PET-based treatment verification in ion beam therapy. Experimental and clinical evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kurz, Christopher

    2014-06-12

    Due to the accessible sharp dose gradients, external beam radiotherapy with protons and heavier ions enables a highly conformal adaptation of the delivered dose to arbitrarily shaped tumour volumes. However, this high conformity is accompanied by an increased sensitivity to potential uncertainties, e.g., due to changes in the patient anatomy. Additional challenges are imposed by respiratory motion, which not only leads to rapid changes of the patient anatomy but, in the case of actively scanned ion beams, also to the formation of dose inhomogeneities. Therefore, it is highly desirable to verify the actual application of the treatment and to detect possible deviations with respect to the planned irradiation. At present, the only clinically implemented approach for a close-in-time verification of single treatment fractions is based on detecting the distribution of β{sup +}-emitters formed in nuclear fragmentation reactions during the irradiation by means of positron emission tomography (PET). For this purpose, a commercial PET/CT (computed tomography) scanner has been installed directly next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). Up to the present, however, the application of this treatment verification technique has been limited to static target volumes. This thesis aimed at investigating the feasibility and performance of PET-based treatment verification under consideration of organ motion. In experimental irradiation studies with moving phantoms, not only the practicability of PET-based treatment monitoring for moving targets, using a commercial PET/CT device, could be shown for the first time, but also the potential of this technique to detect motion-related deviations from the planned treatment with sub-millimetre accuracy. The first application to four exemplary hepato-cellular carcinoma patient cases under substantially more challenging clinical conditions indicated potential for improvement by taking organ motion into account.

  5. M&V Guidelines: Measurement and Verification for Performance-Based Contracts Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-02

    Document outlines the Federal Energy Management Program's standard procedures and guidelines for measurement and verification (M&V) for federal energy managers, procurement officials, and energy service providers.

  6. Development and validation of MCNPX-based Monte Carlo treatment plan verification system

    OpenAIRE

    Iraj Jabbari; Shahram Monadi

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In the MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in digital image communications in medicine-radiation ...

  7. Towards Open-World Person Re-Identification by One-Shot Group-Based Verification.

    Science.gov (United States)

    Zheng, Wei-Shi; Gong, Shaogang; Xiang, Tao

    2016-03-01

    Solving the problem of matching people across non-overlapping multi-camera views, known as person re-identification (re-id), has received increasing interest in computer vision. In a real-world application scenario, a watch-list (gallery set) of a handful of known target people is provided with very few (in many cases only a single) image(s) (shots) per target. Existing re-id methods are largely unsuitable for addressing this open-world re-id challenge because they are designed for (1) a closed-world scenario where the gallery and probe sets are assumed to contain exactly the same people, (2) person-wise identification whereby the model attempts to verify exhaustively against each individual in the gallery set, and (3) learning a matching model using multi-shots. In this paper, a novel transfer local relative distance comparison (t-LRDC) model is formulated to address the open-world person re-identification problem by one-shot group-based verification. The model is designed to mine and transfer useful information from a labelled open-world non-target dataset. Extensive experiments demonstrate that the proposed approach outperforms both non-transfer learning and existing transfer learning based re-id methods.

  8. Experimental Verification of a Vehicle Localization based on Moving Horizon Estimation Integrating LRS and Odometry

    Science.gov (United States)

    Sakaeta, Kuniyuki; Nonaka, Kenichiro; Sekiguchi, Kazuma

    2016-09-01

    Localization is an important function for robots to complete various tasks. For localization, both internal and external sensors are generally used. Odometry is widely used as a method based on internal sensors, but it suffers from cumulative errors. In methods using the laser range sensor (LRS), a kind of external sensor, the estimation accuracy is affected by the number of available measurement data. In our previous study, we applied moving horizon estimation (MHE) to vehicle localization, integrating the LRS measurement data and the odometry information, with their relative weightings adapted to the number of available LRS measurement data. In this paper, the effectiveness of the proposed localization method is verified through both numerical simulations and experiments using a 1/10-scale vehicle. The verification is conducted in situations where the vehicle position cannot be localized uniquely along a certain direction using the LRS measurement data only. We achieve accurate localization even in such situations by integrating the odometry and LRS based on MHE. We also show the superiority of the method through comparisons with a method using an extended Kalman filter (EKF).
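    A schematic of the estimation step (a sketch of the idea, not the authors' implementation): over a horizon of N poses, a weighted least-squares problem balances odometry increments against LRS position fixes, with the LRS weight scaled by how many measurements are available. All values below are toy numbers.

      import numpy as np
      from scipy.optimize import least_squares

      # Toy 2-D MHE: estimate poses p_0..p_N from odometry increments u_k
      # and (possibly sparse) LRS position fixes z_k.
      N = 5
      u = np.tile([0.50, 0.02], (N, 1))                 # odometry increments
      z = {2: np.array([1.08, 0.06]),                   # LRS fixes, keyed by step
           5: np.array([2.55, 0.12])}
      w_odo, w_lrs = 1.0, 4.0                           # w_lrs would scale with the
                                                        # number of matched LRS points

      def residuals(x):
          p = x.reshape(N + 1, 2)
          r = [w_odo * (p[k + 1] - p[k] - u[k]) for k in range(N)]
          r += [w_lrs * (p[k] - zk) for k, zk in z.items()]
          r.append(p[0] - np.zeros(2))                  # prior on the first pose
          return np.concatenate(r)

      sol = least_squares(residuals, np.zeros(2 * (N + 1)))
      print(sol.x.reshape(N + 1, 2))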

  9. Ground-based complex for checking the optical system

    Science.gov (United States)

    Grebenyuk, V.; Boreiko, V.; Dmitrotsa, A.; Gorbunov, N.; Khrenov, B.; Klimov, P.; Lavrova, M.; Popescu, E. M.; Sabirov, B.; Tkachenko, A.; Tkachev, L.; Volvach, A.; Yashin, I.

    2016-09-01

    The purpose of the TUS space experiment is to study ultrahigh-energy cosmic rays by observing from space the extensive air showers they produce in the Earth's atmosphere. The light concentrator, mounted on the satellite, is made in the form of a Fresnel mirror facing the Earth's atmosphere, with a photodetector at its focus. The angle of view of the mirror is ±4.5°, which for the given orbital altitude corresponds to an area of 80 × 80 km2 on the ground. A ground complex consisting of a number of stations has been created to check the optical system of the experiment (their locations and number will be determined after the launch of the satellite, based on its actual orbit).

  10. Ground extraction from airborne laser data based on wavelet analysis

    Science.gov (United States)

    Xu, Liang; Yang, Yan; Jiang, Bowen; Li, Jia

    2007-11-01

    With the advantages of high resolution and accuracy, airborne laser scanning data are widely used in topographic mapping. In order to generate a DTM, measurements from object features such as buildings, vehicles, and vegetation have to be classified and removed. However, the automatic extraction of bare earth from point clouds acquired by airborne laser scanning equipment remains an open problem in LIDAR data filtering. In this paper, a filter algorithm based on wavelet analysis is proposed. Relying on the capability of the continuous wavelet transform to detect discontinuities, and on the features of multi-resolution analysis, object points can be removed while ground data are preserved. In order to evaluate the performance of this approach, we applied it to the data set used in the ISPRS filter test in 2003. 15 samples have been tested by the proposed approach. Results showed that it filtered most of the objects, such as vegetation and buildings, and extracted a well-defined ground model.
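    The core idea can be sketched as follows (an illustration, not the paper's algorithm): along a scan profile, object points such as buildings show up as sharp elevation discontinuities, which produce large continuous-wavelet-transform coefficients at fine scales. The profile, wavelet, scale, and threshold below are all illustrative assumptions; the sketch uses the PyWavelets package.

      import numpy as np
      import pywt

      # Synthetic elevation profile: gently sloping ground with a 5 m "building".
      x = np.linspace(0, 100, 400)
      z = 0.02 * x + 0.3 * np.sin(0.1 * x)
      z[150:210] += 5.0

      # Fine-scale CWT coefficients peak at discontinuities (building edges).
      coef, _ = pywt.cwt(z, scales=[2], wavelet="mexh")
      edges = np.abs(coef[0]) > 5.0 * np.median(np.abs(coef[0]))

      # Points between detected edges are candidate object points to remove.
      print("candidate edge indices:", np.nonzero(edges)[0][:5], "...")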

  11. Online Verification of Control Parameter Calculations in Communication Based Train Control System

    CERN Document Server

    Bu, Lei; Wang, Linzhang; Li, Xuandong

    2011-01-01

    Communication Based Train Control (CBTC) systems are the state of the art in train control. In a CBTC system, to guarantee the safety of train operation, trains communicate with each other intensively and adjust their control modes autonomously by computing critical control parameters, e.g. velocity range, according to the information they receive. As the correctness of the generated control parameters is critical to the safety of the system, a method to verify these parameters is strongly desired in the area of train control systems. In this paper, we present our ideas on how to model and verify the control parameter calculations in a CBTC system efficiently. - As the behavior of the system is highly nondeterministic, it is difficult to build and verify the complete behavior-space model of the system online in advance. Thus, we propose to model the system according to the ongoing behavior model induced by the control parameters. - As the parameters are generated online and updated very quickly, the verification...

  12. Modal analysis based equivalent circuit model and its verification for a single cMUT cell

    Science.gov (United States)

    Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.

    2017-03-01

    This paper presents a lumped equivalent circuit model, and its verification, for both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse, small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included using the mode superposition method, and thus a wide frequency range of the cMUT cell's response can be simulated by the equivalent circuit model. The importance of cross-modal coupling between different eigenmodes of a cMUT cell is discussed for the first time. In this paper the development of the model is illustrated only for a single circular cMUT cell under uniform excitation. An extension of this model and corresponding results under more generalized excitation will be presented in an upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). The model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by the model are in good agreement with the FEM simulation results, for a single cMUT cell operated in either transmission or reception. Results obtained from the model also match the experimental results of the cMUT cell reasonably well. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.

  13. Fusion of hand vein, iris and fingerprint for person identity verification based on Bayesian theory

    Science.gov (United States)

    Li, Xiuyan; Liu, Tiegen; Deng, Shichao; Wang, Yunxin

    2009-11-01

    Biometric identification is an important safeguard for social security. In recent years, with social and economic development, greater accuracy and safety of identification have been required. Person identity verification systems that use a single biometric exhibit inherent limitations in accuracy, user acceptance, and universality. These limitations of unimodal biometric systems can be overcome by using multimodal biometric systems, which combine the conclusions made by a number of unrelated biometric indicators. Addressing the limitations of unimodal biometric identification, a recognition algorithm for multimodal biometric fusion based on hand vein, iris, and fingerprint was proposed. To verify person identity, the hand vein, iris, and fingerprint images were first preprocessed. The region of interest (ROI) of the hand vein image was obtained and filtered to reduce image noise, and multiresolution analysis theory was utilized to extract the texture information of the hand vein. The iris image was preprocessed through iris localization, eyelid detection, image normalization, and image enhancement, and then the feature code of the iris was extracted from the detail images obtained using the wavelet transform. The texture feature information representing the fingerprint pattern was extracted after filtering and image enhancement. The Bayesian theorem was employed to realize fusion at the matching score level, and the fused recognition result was finally obtained. The experimental results showed that the recognition performance of the proposed fusion method is obviously higher than that of the single-biometric recognition algorithms, verifying the efficiency of the proposed method for biometrics.
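    A minimal sketch of matching-score-level fusion with Bayes' theorem is given below. Assuming score independence across modalities, the posterior probability of a genuine claim is formed from per-modality likelihoods and a prior; the Gaussian score densities and all numbers are illustrative placeholders, not the paper's trained models.

      import numpy as np
      from scipy.stats import norm

      # Illustrative genuine/impostor score models per modality: (mean, std) pairs
      # that would normally be learned from training scores.
      models = {
          "vein":        {"gen": (0.80, 0.08), "imp": (0.45, 0.12)},
          "iris":        {"gen": (0.85, 0.06), "imp": (0.40, 0.10)},
          "fingerprint": {"gen": (0.78, 0.10), "imp": (0.50, 0.12)},
      }

      def posterior_genuine(scores, prior=0.5):
          """Bayes fusion of per-modality matching scores, assuming independence."""
          like_g = like_i = 1.0
          for m, s in scores.items():
              like_g *= norm.pdf(s, *models[m]["gen"])
              like_i *= norm.pdf(s, *models[m]["imp"])
          return prior * like_g / (prior * like_g + (1 - prior) * like_i)

      p = posterior_genuine({"vein": 0.74, "iris": 0.81, "fingerprint": 0.69})
      print(f"P(genuine | scores) = {p:.3f}")  # accept if above a decision threshold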

  14. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness testing of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision was taken as the virtual workpiece, and a universal collision detection model was established. The whole-process simulation of workpiece measurement is implemented by the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking the involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master in a GMI. The experimental results indicate consistency of the tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new, ideal platform for testing complex workpiece-measuring software without calibrated artifacts.

  15. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Science.gov (United States)

    Yin, Peili; Wang, Jianhua; Lu, Chunxia

    2017-08-01

    Validity and correctness testing of measuring software has been a thorny issue hindering the development of the Gear Measuring Instrument (GMI). The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI) to independently validate the measuring software. A triangular patch model with accurately controlled precision was taken as the virtual workpiece, and a universal collision detection model was established. The whole-process simulation of workpiece measurement is implemented by the VGMI replacing the GMI, and the measuring software is tested in the proposed virtual environment. Taking the involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on an involute master in a GMI. The experimental results indicate consistency of the tooth profile deviation and calibration results, thus verifying the accuracy of the gear measuring system, which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new, ideal platform for testing complex workpiece-measuring software without calibrated artifacts.

  16. HMM based Offline Signature Verification system using ContourletTransform and Textural features

    Directory of Open Access Journals (Sweden)

    K N PUSHPALATHA

    2014-07-01

    Full Text Available Handwritten signatures occupy a very special place in the identification of an individual, and their verification is a challenging task because of the possible variations in the directions and shapes of the constituent strokes of written samples. In this paper we investigated an offline verification system based on the fusion of the contourlet transform, directional (textural) features, and a Hidden Markov Model (HMM) as classifier. The signature images of both the query and the database are preprocessed for noise removal, and a two-level contourlet transform is applied to obtain a feature vector. The textural features are computed and concatenated with the contourlet coefficients to form the final feature vector. The classification results are computed using the HTK tool with the HMM classifier. The experimental results are computed using GPDS-960 database images to obtain parameters such as the False Rejection Rate (FRR), False Acceptance Rate (FAR), and Total Success Rate (TSR). The results show that the values of FRR and FAR are improved compared to the existing algorithm.

  17. Numerical verification of similar Cam-clay model based on generalized potential theory

    Institute of Scientific and Technical Information of China (English)

    钟志辉; 杨光华; 傅旭东; 温勇; 张玉成

    2014-01-01

    From mathematical principles, the generalized potential theory can be employed to create constitutive models of geomaterials directly. The similar Cam-clay model, which is created based on the generalized potential theory, has fewer assumptions, a clearer mathematical basis, and better computational accuracy. Theoretically, it is more scientific than the traditional Cam-clay models. The particle flow code PFC3D was used to run numerical tests to verify the rationality and practicality of the similar Cam-clay model. The verification process was as follows: 1) creating the soil sample for the numerical tests in PFC3D, and then simulating the conventional triaxial compression test, isotropic compression test, and isotropic unloading test with PFC3D; 2) determining the parameters of the similar Cam-clay model from the results of the above tests; 3) predicting the sample's behavior in triaxial tests under different stress paths with the similar Cam-clay model, and comparing the predictions with those of the Cam-clay model and the modified Cam-clay model. The analysis results show that the similar Cam-clay model has relatively high prediction accuracy, as well as good practical value.

  18. Model based correction of placement error in EBL and its verification

    Science.gov (United States)

    Babin, Sergey; Borisov, Sergey; Militsin, Vladimir; Komagata, Tadashi; Wakatsuki, Tetsuro

    2016-05-01

    In maskmaking, the main source of placement error is charging. DISPLACE software corrects the placement error for any layout, based on a physical model. The charge of a photomask and multiple discharge mechanisms are simulated to find the charge distribution over the mask. The beam deflection is calculated for each location on the mask, creating data for the placement correction. The software considers the mask layout, EBL system setup, resist, and writing order, as well as other factors such as fogging and proximity effect correction. The output of the software is the data for placement correction. One important step is the calibration of the physical model. A test layout on a single calibration mask was used for calibration. The extracted model parameters were used to verify the correction. As an ultimate test of the correction, a sophisticated layout, very different from the calibration mask, was used for the verification. The placement correction results were predicted by DISPLACE. A good correlation between the measured and predicted values of the correction confirmed the high accuracy of the charging placement error correction.

  19. Verification measurements of the Karoo Array timing system: a laser radar based time transfer system

    Science.gov (United States)

    Siebrits, R.; Bauermeister, E.; Gamatham, R.; Adams, G.; Malan, J. A.; Burger, J. P.; Kapp, F.; Gibbon, T.; Kriel, H.; Abbott, T.

    2016-02-01

    An optical fiber based laser radar time transfer system has been developed for the 64-dish MeerKAT radio interferometer telescope project to provide accurate atomic time to the receivers of the telescope system. This time transfer system is called the Karoo Array Timing System (KATS). Calibration of the time transfer system is essential to ensure that time is accurately transferred to the digitisers that form part of the receivers. Frequency domain reflectometry via vector network analysers is also used to verify measurements taken using time interval counters. This paper details the progress made in the verification measurements of the system, in order to ensure that time, accurate to within a few nanoseconds of Coordinated Universal Time (UTC), is available at the point where radio signals from astronomical sources are received. This capability enables world-class transient and timing studies with a compact radio interferometer, which has inherent advantages over large single-dish radio telescopes in observing the transient sky.

  20. Combined ground-based optical support for the aurora (DELTA) sounding rocket campaign

    Science.gov (United States)

    Griffin, Eoghan; Kosch, Mike; Aruliah, Anasuya; Kavanagh, Andrew; McWhirter, Ian; Senior, Andrew; Ford, Elaina; Davis, Chris; Abe, Takumi; Kurihara, Junichi; Kauristie, Kirsti; Ogawa, Yasunobu

    2006-09-01

    The Japan Aerospace Exploration Agency (JAXA) DELTA rocket experiment, successfully launched from Andøya at 0033 UT on December 13, 2004, was supported by ground-based optical instruments, primarily two Fabry-Perot interferometers (FPIs) located at Skibotn, Norway (69.3°N, 20.4°E) and the KEOPS site, Esrange, Kiruna, Sweden (67.8°N, 20.4°E). Both instruments sampled the 557.7 nm lower-thermosphere atomic oxygen emission and provided neutral temperatures and line-of-sight wind velocities, with deduced vector wind patterns over each site. All-sky cameras allowed contextual auroral information to be acquired. The proximity of the sites provided overlapping fields of view adjacent to the trajectory of the DELTA rocket. This allowed independent verification of the absolute temperatures in the relatively quiet conditions early in the night, which is especially important given the context provided by co-located EISCAT ion temperature measurements, which allow investigation of the likely emission altitude of the passive FPI measurements. The results demonstrate that this altitude changes from 120 km pre-midnight to 115 km post-midnight. Within this large-scale context, the results from the FPIs also demonstrate smaller-scale structure in neutral temperatures, winds, and intensities consistent with localised heating. These results present a challenge to the representation of thermospheric variability in the existing models of the region.

  1. Film based verification of calculation algorithms used for brachytherapy planning-getting ready for upcoming challenges of MBDCA.

    Science.gov (United States)

    Zwierzchowski, Grzegorz; Bielęda, Grzegorz; Skowronek, Janusz; Mazur, Magdalena

    2016-08-01

    A well-known defect of the TG-43-based algorithms used in brachytherapy is the lack of information about interaction cross-sections, which are determined not only by electron density but also by atomic number. TG-186 recommendations, with the use of MBDCAs (model-based dose calculation algorithms), accurate tissue segmentation, and the structures' elemental composition, continue to create difficulties in brachytherapy dosimetry. For the clinical use of new algorithms, it is necessary to introduce reliable and repeatable methods of treatment planning system (TPS) verification. The aim of this study is the verification of the calculation algorithm used in the TPS for shielded vaginal applicators, as well as the development of verification procedures for current and further use, based on the film dosimetry method. Calibration data were collected by separately irradiating 14 sheets of Gafchromic(®) EBT film with doses from 0.25 Gy to 8.0 Gy using an HDR (192)Ir source. Standard vaginal cylinders of three diameters were used in a water phantom. Measurements were performed without any shields and with three shield combinations. Gamma analyses were performed using the VeriSoft(®) package. The calibration curve was determined to be of third-degree polynomial type. For all diameters of the unshielded cylinder and for all shield combinations, gamma analysis showed that over 90% of analyzed points meet the gamma criteria (3%, 3 mm). The gamma analysis showed good agreement between the dose distributions calculated using the TPS and measured by Gafchromic films, thus showing the viability of using film dosimetry in brachytherapy.
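    For reference, the (3%, 3 mm) gamma criterion used above can be sketched in one dimension: for each reference point, the gamma index is the minimum over evaluated points of a combined dose-difference/distance metric, and the point passes if gamma <= 1. The toy dose profiles below are illustrative, not measured data.

      import numpy as np

      def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
          """1-D global gamma index: dose-difference criterion `dd`
          (fraction of max reference dose), distance-to-agreement `dta` (mm)."""
          d_norm = dd * d_ref.max()
          gammas = np.empty_like(d_ref)
          for i, (x, d) in enumerate(zip(x_ref, d_ref)):
              g2 = ((x_eval - x) / dta) ** 2 + ((d_eval - d) / d_norm) ** 2
              gammas[i] = np.sqrt(g2.min())
          return gammas

      # Toy profiles (positions in mm, arbitrary dose units):
      x = np.linspace(0, 50, 101)
      measured = np.exp(-((x - 25) / 10) ** 2)
      calculated = np.exp(-((x - 25.5) / 10) ** 2)      # 0.5 mm shift
      g = gamma_1d(x, measured, x, calculated)
      print(f"gamma pass rate: {100 * np.mean(g <= 1.0):.1f}%")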

  2. Augmenting WFIRST Microlensing with a Ground-Based Telescope Network

    Science.gov (United States)

    Zhu, Wei; Gould, Andrew

    2016-06-01

    Augmenting the Wide Field Infrared Survey Telescope (WFIRST) microlensing campaigns with intensive observations from a ground-based network of wide-field survey telescopes would have several major advantages. First, it would enable full two-dimensional (2-D) vector microlens parallax measurements for a substantial fraction of low-mass lenses as well as planetary and binary events that show caustic crossing features. For a significant fraction of the free-floating planet (FFP) events and all caustic-crossing planetary/binary events, these 2-D parallax measurements directly lead to complete solutions (mass, distance, transverse velocity) of the lens object (or lens system). For even more events, the complementary ground-based observations will yield 1-D parallax measurements. Together with the 1-D parallaxes from WFIRST alone, they can probe the entire mass range M > M_Earth. For luminous lenses, such 1-D parallax measurements can be promoted to complete solutions (mass, distance, transverse velocity) by high-resolution imaging. This would provide crucial information not only about the hosts of planets and other lenses, but also enable a much more precise Galactic model. Other benefits of such a survey include improved understanding of binaries (particularly with low mass primaries), and sensitivity to distant ice-giant and gas-giant companions of WFIRST lenses that cannot be detected by WFIRST itself due to its restricted observing windows. Existing ground-based microlensing surveys can be employed if WFIRST is pointed at lower-extinction fields than is currently envisaged. This would come at some cost to the event rate. Therefore the benefits of improved characterization of lenses must be weighed against these costs.

  3. The STACEE-32 Ground Based Gamma-ray Detector

    CERN Document Server

    Hanna, D S; Boone, L M; Chantell, M C; Conner, Z; Covault, C E; Dragovan, M; Fortin, P; Gregorich, D T; Hinton, J A; Mukherjee, R; Ong, R A; Oser, S; Ragan, K; Scalzo, R A; Schütte, D R; Theoret, C G; Tümer, T O; Williams, D A; Zweerink, J A

    2002-01-01

    We describe the design and performance of the Solar Tower Atmospheric Cherenkov Effect Experiment detector in its initial configuration (STACEE-32). STACEE is a new ground-based gamma ray detector using the atmospheric Cherenkov technique. In STACEE, the heliostats of a solar energy research array are used to collect and focus the Cherenkov photons produced in gamma-ray induced air showers. The large Cherenkov photon collection area of STACEE results in a gamma-ray energy threshold below that of previous detectors.

  4. The STACEE Ground-Based Gamma-Ray Detector

    CERN Document Server

    Gingrich, D M; Bramel, D; Carson, J; Covault, C E; Fortin, P; Hanna, D S; Hinton, J A; Jarvis, A; Kildea, J; Lindner, T; Müller, C; Mukherjee, R; Ong, R A; Ragan, K; Scalzo, R A; Theoret, C G; Williams, D A; Zweerink, J A

    2005-01-01

    We describe the design and performance of the Solar Tower Atmospheric Cherenkov Effect Experiment (STACEE) in its complete configuration. STACEE uses the heliostats of a solar energy research facility to collect and focus the Cherenkov photons produced in gamma-ray induced air showers. The light is concentrated onto an array of photomultiplier tubes located near the top of a tower. The large Cherenkov photon collection area of STACEE results in a gamma-ray energy threshold below that of previous ground-based detectors. STACEE is being used to observe pulsars, supernova remnants, active galactic nuclei, and gamma-ray bursts.

  5. Research on target accuracy for ground-based lidar

    Science.gov (United States)

    Zhu, Ling; Shi, Ruoming

    2009-05-01

    In ground-based lidar systems, targets are used in the registration and georeferencing of point clouds, and can also be used as check points. Generally, the accuracy of capturing the flat target center is influenced by scanning range and scanning angle. In this research, experiments are designed to extract accuracy indices of the target center at 0-90° scan angles and 100-195 m scan ranges using a Leica HDS3000 laser scanner. The data of the experiments are listed in detail and the related results are analyzed.

  6. REQUIREMENT ANALYSIS, ARCHITECTURAL DESIGN AND FORMAL VERIFICATION OF A MULTI-AGENT BASED UNIVERSITY INFORMATION MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Nadeem AKHTAR

    2014-12-01

    Full Text Available This paper presents an approach based on the analysis, design, and formal verification of a multi-agent based university Information Management System (IMS). The university IMS accesses information, creates reports, and facilitates teachers as well as students. An orchestrator agent manages the coordination between all agents. It also manages the database connectivity for the whole system. The proposed IMS is based on the BDI agent architecture, which models the system in terms of beliefs, desires, and intentions. The correctness properties of safety and liveness are specified in first-order predicate logic.

  7. Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Ground-Based Global Positioning System (GPS) Meteorology Integrated Precipitable Water Vapor (IPW) data set measures atmospheric water vapor using ground-based...

  8. Feature-Aware Verification

    CERN Document Server

    Apel, Sven; Wendler, Philipp; von Rhein, Alexander; Beyer, Dirk

    2011-01-01

    A software product line is a set of software products that are distinguished in terms of features (i.e., end-user-visible units of behavior). Feature interactions (situations in which the combination of features leads to emergent and possibly critical behavior) are a major source of failures in software product lines. We explore how feature-aware verification can improve the automatic detection of feature interactions in software product lines. Feature-aware verification uses product-line verification techniques and supports the specification of feature properties along with the features in separate and composable units. It integrates the technique of variability encoding to verify a product line without generating and checking a possibly exponential number of feature combinations. We developed the tool suite SPLverifier for feature-aware verification, which is based on standard model-checking technology. We applied it to an e-mail system that incorporates domain knowledge of AT&T. We found that feat...

  9. Applications of FBG-based sensors to ground stability monitoring

    Institute of Scientific and Technical Information of China (English)

    An-Bin Huang; Chien-Chih Wang; Jui-Ting Lee; Yen-Te Ho

    2016-01-01

    Over the past few decades, many optical fiber sensing techniques have been developed. Among these available sensing methods, the optical fiber Bragg grating (FBG) is probably the most popular one. With its unique capabilities, FBG-based geotechnical sensors can be used as a sensor array for distributive (profile) measurements, deployed under water (submersible), and for localized high-resolution and/or differential measurements. The authors have developed a series of FBG-based transducers that include inclination, linear displacement, and gauge/differential pore pressure sensors. Techniques that involve the field deployment of FBG inclination, extension, and pore-pressure sensor arrays for automated slope stability and ground subsidence monitoring have been developed. The paper provides a background on FBG and the design concepts behind the FBG-based field monitoring sensors. Cases of field monitoring using the FBG sensor arrays are presented, and their practical implications are discussed.

  10. Draw the Verification Code Based on C#

    Institute of Scientific and Technical Information of China (English)

    马相芬

    2015-01-01

    Verification code (CAPTCHA) technology typically uses lines, random numbers, symbols, and irregular characters to prevent hackers from automating password attacks or from using robots to register, log in, and post spam automatically on a network. This paper introduces an implementation method for a verification code based on C# and GDI+ technology.
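    The record describes a C#/GDI+ implementation; since the idea is language-agnostic, here is a minimal sketch of the same technique (random characters plus interfering noise lines) in Python with the Pillow imaging library. The character count, image size, and styling are all illustrative choices.

      import random
      import string
      from PIL import Image, ImageDraw, ImageFont

      def make_captcha(n_chars=5, size=(160, 60)):
          """Draw a simple verification-code image: random characters,
          jittered positions, and interfering noise lines."""
          img = Image.new("RGB", size, "white")
          draw = ImageDraw.Draw(img)
          font = ImageFont.load_default()
          text = "".join(random.choices(string.ascii_uppercase + string.digits,
                                        k=n_chars))
          for i, ch in enumerate(text):
              x = 15 + i * (size[0] - 30) // n_chars + random.randint(-3, 3)
              y = size[1] // 3 + random.randint(-8, 8)
              draw.text((x, y), ch, fill=(random.randint(0, 150),) * 3, font=font)
          for _ in range(6):  # noise lines to hinder OCR
              draw.line([(random.randint(0, size[0]), random.randint(0, size[1]))
                         for _ in range(2)], fill="gray", width=1)
          return text, img

      code, image = make_captcha()
      image.save("captcha.png")   # serve the image; keep `code` in the session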

  11. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose, such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model ... Our approach has been developed in the context of the Coloured Petri Nets (CPNs) modelling language. We illustrate our approach by presenting a descriptive specification model of the Websocket protocol, which is currently under development by the Internet Engineering Task Force (IETF), and we show ...

  12. A Methodology for Platform-Based High-Level System-on-Chip Verification

    Institute of Scientific and Technical Information of China (English)

    GAO Feng; LIU Peng; YAO Qingdong

    2003-01-01

    The time-to-market challenge has increased the need to shorten the co-verification time in system-on-chip development. In this article, a new methodology for high-level hardware/software co-verification is introduced. With the help of a real-time operating system, the application program can easily be migrated from the software simulator to the hardware emulation board. The hierarchical architecture can be used to separate the application program from the implementation of the platform during the verification process. The high-level verification platform was successfully used in developing an HDTV decoding chip.

  13. Development, verification and validation of an FPGA-based core heat removal protection system for a PWR

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yichun, E-mail: ycwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China); Shui, Xuanxuan, E-mail: 807001564@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Cai, Yuanfeng, E-mail: 1056303902@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Zhou, Junyi, E-mail: 1032133755@qq.com [College of Energy, Xiamen University, Xiamen 361102 (China); Wu, Zhiqiang, E-mail: npic_wu@126.com [State Key Laboratory of Reactor System Design Technology, Nuclear Power Institute of China, Chengdu 610041 (China); Zheng, Jianxiang, E-mail: zwu@xmu.edu.cn [College of Energy, Xiamen University, Xiamen 361102 (China)

    2016-05-15

    Highlights: • An example of the life cycle development process and V&V of an FPGA-based I&C system is presented. • Software standards and guidelines are used in FPGA-based NPP I&C system logic V&V. • Diversified FPGA design and verification languages and tools are utilized. • An NPP operation principle simulator is used to simulate operation scenarios. - Abstract: To reach high confidence in, and ensure the reliability of, nuclear FPGA-based safety systems, life cycle processes of disciplined specification and implementation of the design, as well as verification and validation (V&V) against regulations, are needed. A specific example of how to conduct the life cycle development process and V&V of an FPGA-based core heat removal (CHR) protection system for the CPR1000 pressurized water reactor (PWR) is presented in this paper. Using the existing standards and guidelines for life cycle development and V&V, a simplified FPGA-based CHR protection system for a PWR has been designed, implemented, verified, and validated. Diversified verification and simulation languages and tools are used by the independent design team and the V&V team. In the system acceptance testing V&V phase, a CPR1000 NPP operation principle simulator (OPS) model is utilized to simulate normal and abnormal operation scenarios and to provide input data to the under-test FPGA-based CHR protection system and a verified C-code CHR function module. The evaluation results are applied to validate the under-test FPGA-based CHR protection system. The OPS model operation outputs also provide reasonable references for the tests. Using an OPS model in the system acceptance testing V&V is cost-effective and highly efficient. A dedicated OPS, as a commercial-off-the-shelf (COTS) item, would be an important tool in the V&V process of NPP I&C systems, including FPGA-based and microprocessor-based systems.

  14. Statistical Studies of Ground-Based Optical Lightning Signatures

    Science.gov (United States)

    Hunt, C. R.; Nemzek, R. J.; Suszcynsky, D. M.

    2005-12-01

    Most extensive optical studies of lightning have been conducted from orbit, and the statistics of events collected from the ground are relatively poorly documented. The time signatures of optical power measured in the presence of clouds are inevitably affected by scattering, which can distort the signatures by extending and delaying the amplitude profile in time. We have deployed two all-sky photodiode detectors, one in New Mexico and one in Oklahoma, which are gathering data alongside electric field change monitors as part of the LANL EDOTX Great Plains Array. Preliminary results show that the photodiode is sensitive to approximately 50% or more of RF events detected at ranges up to 30 km, and still has some sensitivity at ranges in excess of 60 km (distances determined by the EDOTX field-change array). The shapes of events within this range were assessed, with focus on rise time, width, peak power, and their correlation to the corresponding electric field signatures; these are being compared with published on-orbit and ground-based data. Initial findings suggest a mean characteristic width (ratio of total detected optical energy to peak power) of 291 +/- 12 microseconds and a mean delay between the RF signal peak and the optical peak of 121 +/- 17 microseconds. These values fall between prior ground-based measurements of direct return stroke emissions and scattering-dominated on-orbit measurements. This work will promote better understanding of the correspondence between radio and optical measurements of lightning.

  15. Real-time Gaussian Markov random-field-based ground tracking for ground penetrating radar data

    Science.gov (United States)

    Bradbury, Kyle; Torrione, Peter A.; Collins, Leslie

    2009-05-01

    Current ground penetrating radar algorithms for landmine detection require accurate estimates of the location of the air/ground interface to maintain high levels of performance. However, the presence of surface clutter, natural soil roughness, and antenna motion leads to uncertainty in these estimates. Previous work on improving estimates of the location of the air/ground interface has focused on one-dimensional filtering techniques to localize the interface. In this work, we propose an algorithm for interface localization using a 2-D Gaussian Markov random field (GMRF). The GMRF provides a statistical model of the surface structure, which enables the application of statistical optimization techniques. In this work, the ground location is inferred using iterated conditional modes (ICM) optimization, which maximizes the conditional pseudo-likelihood of the GMRF at a point, conditioned on its neighbors. To illustrate the efficacy of the proposed interface localization approach, pre-screener performance with and without the proposed ground localization algorithm is compared. We show that accurate localization of the air/ground interface provides the potential for future performance improvements.
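    A toy version of the ICM step described above is sketched below, reduced to a 1-D profile for brevity (the paper uses a 2-D GMRF): each surface estimate is updated to the mode of its conditional distribution, balancing data fidelity against smoothness with respect to its neighbors. The smoothness weight and synthetic data are illustrative assumptions.

      import numpy as np

      def icm_surface(obs, lam=4.0, iters=50):
          """Iterated conditional modes for a 1-D Gaussian MRF:
          minimise sum (s_i - obs_i)^2 + lam * sum (s_i - s_j)^2 over neighbors.
          The conditional mode of each site has a closed form."""
          s = obs.copy()
          for _ in range(iters):
              for i in range(len(s)):
                  nbrs = [s[j] for j in (i - 1, i + 1) if 0 <= j < len(s)]
                  # closed-form conditional mode: weighted average of the
                  # observation and the neighboring surface estimates
                  s[i] = (obs[i] + lam * sum(nbrs)) / (1.0 + lam * len(nbrs))
          return s

      # Noisy air/ground interface depths (in samples) along a GPR scan:
      rng = np.random.default_rng(1)
      truth = 20 + 2 * np.sin(np.linspace(0, 3, 60))
      obs = truth + rng.normal(0, 0.8, 60)
      obs[25] += 6.0                       # a surface-clutter outlier
      print(np.round(icm_surface(obs)[20:30], 1))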

  16. Verification of sectoral cloud motion based direct normal irradiance nowcasting from satellite imagery

    Science.gov (United States)

    Schroedter-Homscheidt, Marion; Gesell, Gerhard

    2016-05-01

    The successful integration of solar electricity from photovoltaics or concentrating solar power plants into the existing electricity supply requires an electricity production forecast for 48 hours, while any improved surface irradiance forecast over the upcoming hours is relevant for optimized operation of the power plant. While numerical weather prediction has been widely assessed and is in commercial use, short-term nowcasting is still a major field of development. The European Commission's FP7 DNICast project focuses on this task, and this paper reports on part of the DNICast results. A nowcasting scheme based on Meteosat Second Generation cloud imagery and cloud movement tracking has been developed for Southern Spain as part of a solar production forecasting tool (CSP-FoSyS). It avoids the well-known but not fully satisfactory standard cloud motion vector approach by using a sectoral approach, asking at which time any cloud structure will affect the power plant. It distinguishes between thin cirrus clouds and other clouds, which typically occur at different heights in the atmosphere and move in different directions. Their optical properties are also very different, especially for the calculation of the direct normal irradiance required by concentrating solar power plants. Results for Southern Spain show a positive impact of up to 8 hours depending on the time of day, and an RMSD reduction of up to 10% in hourly DNI irradiation compared to day-ahead forecasts. This paper presents the verification of this scheme at other locations in Europe and Northern Africa (BSRN and EnerMENA stations) with different cloud conditions. Especially for Jordan and Tunisia, the most relevant countries for CSP in this station list, we also find a positive impact of up to 8 hours.

  17. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    Science.gov (United States)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault-tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover, Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.

  18. Identification of rainy periods from ground based microwave radiometry

    Directory of Open Access Journals (Sweden)

    Ada Vittoria Bosisio

    2012-03-01

    Full Text Available In this paper the authors present the results of a study aimed at detecting rainy periods in measurements collected by a dual-band ground-based radiometer. The proposed criterion is based on the ratio of the brightness temperatures observed in the 20-30 GHz band, without need of any ancillary information. A major result, obtained from the probability density of the ratio computed over one month of data, is the identification of threshold values separating clear sky, cloudy sky, and rainy sky. A linear fit performed using radiometric data and concurrent rain gauge measurements shows a correlation coefficient of 0.56 between the temperature ratio and the observed precipitation.
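    Schematically, the classification reduces to thresholding the brightness-temperature ratio. In the sketch below, the threshold values, and even the sense of the comparisons, are placeholders standing in for the values the paper derives from the one-month probability density analysis.

      def classify_sky(tb_20ghz, tb_30ghz, thr_cloudy=1.10, thr_rainy=1.35):
          """Classify a radiometer sample from the 20/30 GHz brightness
          temperature ratio. Thresholds here are illustrative placeholders,
          not the values identified in the study."""
          ratio = tb_20ghz / tb_30ghz
          if ratio < thr_cloudy:
              return "clear"
          if ratio < thr_rainy:
              return "cloudy"
          return "rainy"

      print(classify_sky(45.0, 38.0))   # toy brightness temperatures in kelvin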

  19. Unique cell culture systems for ground based research

    Science.gov (United States)

    Lewis, Marian L.

    1990-01-01

    The horizontally rotating, fluid-filled, membrane-oxygenated bioreactors developed at NASA Johnson for spacecraft applications provide a powerful tool for ground-based research. Three-dimensional aggregates formed by cells cultured on microcarrier beads are useful for the study of cell-cell interactions and tissue development. By comparing electron micrographs of plant seedlings germinated during Shuttle flight 61-C and in an earth-based rotating bioreactor, it is shown that some effects of microgravity are mimicked. Bioreactors used in the UAH Bioreactor Laboratory will make it possible to determine some of the effects of altered gravity at the cellular level. Bioreactors can be valuable for performing critical preliminary-to-spaceflight experiments as well as medical investigations such as in vitro tumor cell growth and chemotherapeutic drug response; the enrichment of stem cells from bone marrow; and the effect of altered gravity on bone and muscle cell growth and function and on immune response depression.

  20. Spatial-angular modeling of ground-based biaxial lidar

    Science.gov (United States)

    Agishev, Ravil R.

    1997-10-01

    Results of spatial-angular LIDAR modeling, based on an introduced efficiency criterion, are presented. Their analysis shows that the low spatial-angular efficiency of traditional VIS and NIR systems is a main cause of the low signal-to-background-radiation (S/BR) ratio at the photodetector input. This leads to considerable measurement errors and, consequently, to low accuracy in the retrieval of atmospheric optical parameters. As we have shown, the most effective protection against intense sky background radiation for ground-based biaxial LIDARs consists in forming their angular field according to the spatial-angular efficiency criterion G. Some effective approaches to achieving high values of the G-parameter for receiving-system optimization are discussed.

  1. Coastal wind study based on Sentinel-1 and ground-based scanning lidar

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Badger, Merete; Pena Diaz, Alfredo

    Energy (Badger et al. 2016) using GFS winds as input. Wind direction can be checked from the various other observations. Sensitivity to possible deviations in wind directions in the near-shore area will be investigated. Furthermore, oceanic features not related to winds but to e.g. surface currents, breaking waves, etc. will be investigated. The plan is to establish high-quality coastal wind speed cases based on Sentinel-1 for quantification of the coastal winds, for verification of wind resource modelling best practices in the coastal zone. The study is supported by RUNE and New European Wind Atlas...

  2. Speaker-dependent Dictionary-based Speech Enhancement for Text-Dependent Speaker Verification

    DEFF Research Database (Denmark)

    Thomsen, Nicolai Bæk; Thomsen, Dennis Alexander Lehmann; Tan, Zheng-Hua

    2016-01-01

    The problem of text-dependent speaker verification under noisy conditions is becoming ever more relevant, due to increased usage for authentication in real-world applications. Classical methods for noise reduction such as spectral subtraction and Wiener filtering introduce distortion and do...

  3. Type-Based Automated Verification of Authenticity in Asymmetric Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten; Kobayashi, Naoki; Sun, Yunde

    2011-01-01

    Gordon and Jeffrey developed a type system for verification of asymmetric and symmetric cryptographic protocols. We propose a modified version of Gordon and Jeffrey's type system and develop a type inference algorithm for it, so that protocols can be verified automatically as they are, without any...

  4. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    Science.gov (United States)

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to give support to the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…

  6. Soil properties and performance of landmine detection by metal detector and ground-penetrating radar — Soil characterisation and its verification by a field test

    Science.gov (United States)

    Takahashi, Kazunori; Preetz, Holger; Igel, Jan

    2011-04-01

    Metal detectors have commonly been used for landmine detection, and ground-penetrating radar (GPR) is about to be deployed for this purpose. Both devices employ electromagnetic techniques and are therefore influenced by the magnetic and electric properties of the soil. Various soil properties and their spatial distributions were measured and determined with geophysical methods in four soil types where a test of metal detectors and GPR systems took place. Based on the analysis of the soil properties, the four soils were classified according to the expected influence on each detection technique, and the soil difficulty was predicted. This classification was compared to the detection performance of the detectors, and a clear correlation between predicted soil difficulty and performance was observed. The detection performance of the metal detectors and the target-identification performance of the GPR systems degraded in soils that were expected to be problematic. This study therefore demonstrates that metal detector and GPR performance for landmine detection can be assessed qualitatively by geophysical analyses.

  7. DDCC-Based Quadrature Oscillator with Grounded Capacitors and Resistors

    Directory of Open Access Journals (Sweden)

    Montree Kumngern

    2009-01-01

    A new voltage-mode quadrature oscillator using two differential difference current conveyors (DDCCs), two grounded capacitors, and three grounded resistors is presented. The proposed oscillator provides the following advantages: the oscillation condition and oscillation frequency are orthogonally controlled; the oscillation frequency is controlled through a single grounded resistor; the use of only grounded capacitors and resistors makes the proposed circuit ideal for IC implementation; and the passive and active sensitivities are low. Simulation results verifying the theoretical analysis are also included.

  8. The STACEE Ground-Based Gamma-ray Observatory

    Science.gov (United States)

    Ragan, Ken

    2002-04-01

    The Solar Tower Atmospheric Cherenkov Effect Experiment (STACEE) is a ground-based instrument designed to study astrophysical sources of gamma rays in the energy range from 50 to 500 GeV using an array of heliostat mirrors at the National Solar Thermal Test Facility in New Mexico. The mirrors collect Cherenkov light generated by gamma-ray air showers and concentrate it onto cameras composed of photomultiplier tubes. The STACEE instrument is now complete, and uses a total of 64 heliostats. Prototype instruments, using smaller numbers of heliostats, have previously detected gamma emission from both the Crab Nebula and the Active Galactic Nucleus Mrk 421. The complete instrument has a lower threshold -- approximately 50 GeV -- than those prototypes due to superior triggering and electronics, including flash ADCs for every channel. We will discuss the performance of the complete instrument in its first full season of operation, and present preliminary results of selected observations.

  9. Atmospheric contamination for CMB ground-based observations

    CERN Document Server

    Errard, J; Akiba, Y; Arnold, K; Atlas, M; Baccigalupi, C; Barron, D; Boettger, D; Borrill, J; Chapman, S; Chinone, Y; Cukierman, A; Delabrouille, J; Dobbs, M; Ducout, A; Elleflot, T; Fabbian, G; Feng, C; Feeney, S; Gilbert, A; Goeckner-Wald, N; Halverson, N W; Hasegawa, M; Hattori, K; Hazumi, M; Hill, C; Holzapfel, W L; Hori, Y; Inoue, Y; Jaehnig, G C; Jaffe, A H; Jeong, O; Katayama, N; Kaufman, J; Keating, B; Kermish, Z; Keskitalo, R; Kisner, T; Jeune, M Le; Lee, A T; Leitch, E M; Leon, D; Linder, E; Matsuda, F; Matsumura, T; Miller, N J; Myers, M J; Navaroli, M; Nishino, H; Okamura, T; Paar, H; Peloton, J; Poletti, D; Puglisi, G; Rebeiz, G; Reichardt, C L; Richards, P L; Ross, C; Rotermund, K M; Schenck, D E; Sherwin, B D; Siritanasak, P; Smecher, G; Stebor, N; Steinbach, B; Stompor, R; Suzuki, A; Tajima, O; Takakura, S; Tikhomirov, A; Tomaru, T; Whitehorn, N; Wilson, B; Yadav, A; Zahn, O

    2015-01-01

    The atmosphere is one of the most important noise sources for ground-based Cosmic Microwave Background (CMB) experiments. By increasing the optical loading on the detectors, it amplifies their effective noise, while its fluctuations introduce spatial and temporal correlations between detected signals. We present a physically motivated 3D model of the atmospheric total-intensity emission at millimeter and sub-millimeter wavelengths. We derive an analytical estimate for the correlation between detector time-ordered data as a function of the instrument and survey design, as well as several atmospheric parameters such as wind, relative humidity, temperature, and turbulence characteristics. Using numerical computation, we examine the effect of each physical parameter on the correlations in the time series of a given experiment. We then use a parametric-likelihood approach to validate the modeling and estimate atmosphere parameters from the POLARBEAR-I project first-season data set. We compare our results to previous st...

  10. Observational Selection Effects with Ground-based Gravitational Wave Detectors

    CERN Document Server

    Chen, Hsin-Yu; Vitale, Salvatore; Holz, Daniel E; Katsavounidis, Erik

    2016-01-01

    Ground-based interferometers are not perfectly all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean and, as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources' right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO's observations and electromagnetic follow-up. Beyond galactic foregrounds associated with seasonal variations, we find that equatorial observatories can access over 80% of the localization probability, while mid-latitudes will access closer to 70%. Facilities located near the two LIGO sites can obser...

  11. Progress in the ULTRA 1-m ground-based telescope

    Science.gov (United States)

    Romeo, Robert C.; Martin, Robert N.; Twarog, Bruce; Anthony-Twarog, Barbara; Taghavi, Ray; Hale, Rick; Etzel, Paul; Fesen, Rob; Shawl, Steve

    2006-06-01

    We present the technical status of the Ultra Lightweight Telescope for Research in Astronomy (ULTRA) program. The program is a 3-year Major Research Instrumentation (MRI) program funded by NSF. The MRI is a collaborative effort involving Composite Mirror Applications, Inc. (CMA), University of Kansas, San Diego State University and Dartmouth College. Objectives are to demonstrate the feasibility of carbon fiber reinforced plastic (CFRP) composite mirror technology for ground-based optical telescopes. CMA is spearheading the development of surface replication techniques to produce the optics, fabricating the 1m glass mandrel, and constructing the optical tube assembly (OTA). Presented will be an overview and status of the 1-m mandrel fabrication, optics development, telescope design and CFRP telescope fabrication by CMA for the ULTRA Telescope.

  12. Ground-based optical observation system for LEO objects

    Science.gov (United States)

    Yanagisawa, T.; Kurosaki, H.; Oda, H.; Tagawa, M.

    2015-08-01

    We propose a ground-based optical observation system for monitoring LEO objects, which uses numerous optical sensors to cover a vast region of the sky. Its potential in terms of detection and orbit determination was examined. LEO objects of about 30 cm at 1000 km altitude are detectable using an 18 cm telescope, a CCD camera, and the analysis software developed. Simulations and a test observation showed that two longitudinally separated observation sites with arrays of optical sensors can identify the same objects from numerous data sets and determine their orbits precisely. The proposed system may complement or replace the current radar observation system for monitoring LEO objects, for example for space situational awareness, in the near future.

  13. Optical vortex coronagraphs on ground-based telescopes

    CERN Document Server

    Jenkins, Charles

    2007-01-01

    The optical vortex coronagraph is potentially a remarkably effective device, at least for an ideal unobstructed telescope. Most ground-based telescopes, however, suffer from central obscuration and also have to operate through the aberrations of the turbulent atmosphere. This note analyzes the performance of the optical vortex in these circumstances and compares it to some other designs, showing that it performs similarly in this situation. There is a large class of coronagraphs of this general type, and choosing between them in particular applications depends on details of performance at small off-axis distances and uniformity of response in the focal plane. Issues of manufacturability to the necessary tolerances are also likely to be important.

  14. Observational Selection Effects with Ground-based Gravitational Wave Detectors

    Science.gov (United States)

    Chen, Hsin-Yu; Essick, Reed; Vitale, Salvatore; Holz, Daniel; Katsavounidis, Erik

    2017-01-01

    Ground-based interferometers are not perfectly all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean and, as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources' right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO's observations and electromagnetic follow-up. These effects can inform electromagnetic follow-up activities and optimization, including the possibility of directing observations even before gravitational-wave events occur.

  15. Ground-based Measurements of Next Generation Spectroradiometric Standard Stars

    Science.gov (United States)

    McGraw, John T.

    2013-01-01

    Accurate radiometric standards are essential to the future of ground- and space-based astronomy and astrophysics. While astronomers tend to think of “standard stars” as available calibration sources, progress at NIST to accurately calibrate inexpensive, easy-to-use photodiode detectors as spectroradiometric standards from 200 nm to 1800 nm allows referencing astronomical measurements to these devices. Direction-, time-, and wavelength-dependent transmission of Earth’s atmosphere is the single largest source of error for ground-based radiometric measurement of astronomical objects. Measurements and impacts of atmospheric extinction - scattering and absorption - on imaging radiometric and spectroradiometric measurements are described. The conclusion is that accurate real-time measurement of extinction in the column of atmosphere through which standard star observations are made, over the spectral region being observed and over the field of view of the telescope, is required. New techniques to directly and simultaneously measure extinction in the column of atmosphere through which observations are made are required. Our direct extinction measurement solution employs three small facility-class instruments working in parallel: a lidar to measure rapidly time-variable transmission at three wavelengths with uncertainty of 0.25% per airmass, a spectrophotometer to measure rapidly wavelength-variable extinction with sub-1% precision per nanometer resolution element from 350 to 1050 nm, and a wide-field camera to measure angularly variable extinction over the field of view. These instruments and their operation will be described. We assert that application of atmospheric metadata provided by this instrument suite corrects for a significant fraction of the systematic errors currently limiting radiometric precision, and provides a major step towards measurements that are provably dominated by random noise.

  16. New superfamily members identified for Schiff-base enzymes based on verification of catalytically essential residues.

    Science.gov (United States)

    Choi, Kyung H; Lai, Vicky; Foster, Christine E; Morris, Aaron J; Tolan, Dean R; Allen, Karen N

    2006-07-18

    Enzymes that utilize a Schiff-base intermediate formed with their substrates and that share the same alpha/beta barrel fold comprise a mechanistically diverse superfamily defined in the SCOP database as the class I aldolase family. The family includes the "classical" aldolases fructose-1,6-bisphosphate (FBP) aldolase, transaldolase, and 2-keto-3-deoxy-6-phosphogluconate aldolase. Moreover, the N-acetylneuraminate lyase family has been included in the class I aldolase family on the basis of similar Schiff-base chemistry and fold. Herein, we generate primary sequence identities based on structural alignment that support the homology and reveal additional mechanistic similarities beyond the common use of a lysine for Schiff-base formation. The structural and mechanistic correspondence comprises the use of a catalytic dyad, wherein a general acid/base residue (Glu, Tyr, or His) involved in Schiff-base chemistry is stationed on beta-strand 5 of the alpha/beta barrel. The role of the acid/base residue was probed by site-directed mutagenesis and steady-state and pre-steady-state kinetics on a representative member of this family, FBP aldolase. The kinetic results are consistent with the participation of this conserved residue or position in the protonation of the carbinolamine intermediate and dehydration of the Schiff base in FBP aldolase and, by analogy, in the class I aldolase family.

  17. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund;

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna, including support and satellite structure, with an appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation

  18. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    Science.gov (United States)

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in the discovery and verification stages prior to the clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for the statistical design of proteomics biomarker discovery and verification research.
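
    As a generic illustration of the kind of calculation such a framework involves (not the workshop's actual method), the following Python sketch estimates how many cases are needed for the confidence interval of an observed sensitivity to have a chosen half-width; all numbers are hypothetical.

        import math

        def n_for_proportion(p_expected, half_width, confidence=0.95):
            """Cases needed so a normal-approximation CI for a proportion,
            e.g. biomarker sensitivity at fixed specificity, has the given
            half-width."""
            z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
            return math.ceil(z**2 * p_expected * (1 - p_expected) / half_width**2)

        # E.g. to estimate ~80% sensitivity to within +/-5% at 95% confidence:
        print(n_for_proportion(0.80, 0.05))   # -> 246 cancer cases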

  19. Film based verification of calculation algorithms used for brachytherapy planning-getting ready for upcoming challenges of MBDCA

    Directory of Open Access Journals (Sweden)

    Grzegorz Zwierzchowski

    2016-08-01

    Purpose: A well-known shortcoming of the TG-43-based algorithms used in brachytherapy is the lack of information about interaction cross-sections, which are determined not only by electron density but also by atomic number. The TG-186 recommendations, with the use of model-based dose calculation algorithms (MBDCAs), accurate tissue segmentation, and the structures' elemental composition, continue to pose challenges for brachytherapy dosimetry. For the clinical use of new algorithms, it is necessary to introduce reliable and repeatable methods of treatment planning system (TPS) verification. The aim of this study is the verification of the calculation algorithm used in the TPS for shielded vaginal applicators, as well as the development of verification procedures for current and further use, based on the film dosimetry method. Material and methods: Calibration data were collected by separately irradiating 14 sheets of Gafchromic® EBT film with doses from 0.25 Gy to 8.0 Gy using an HDR 192Ir source. Standard vaginal cylinders of three diameters were used in the water phantom. Measurements were performed without any shields and with three shield combinations. Gamma analyses were performed using the VeriSoft® package. Results: The calibration curve was fitted with a third-degree polynomial. For all cylinder diameters without shielding and for all shield combinations, gamma analyses showed that over 90% of the analyzed points meet the gamma criteria (3%, 3 mm). Conclusions: The gamma analysis showed good agreement between dose distributions calculated with the TPS and measured with Gafchromic films, demonstrating the viability of film dosimetry in brachytherapy.
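
    For readers unfamiliar with the passing-rate criterion, the following is a minimal brute-force Python sketch of a global 2D gamma analysis with 3%/3 mm criteria on synthetic dose maps; the VeriSoft implementation and the study's film-handling details are of course more involved.

        import numpy as np

        def gamma_pass_rate(measured, calculated, pixel_mm, dd=0.03, dta_mm=3.0):
            """Global 2D gamma (dose difference dd, distance-to-agreement dta_mm).
            Brute force: minimize Gamma over a local search window per pixel."""
            dmax = calculated.max()
            win = int(np.ceil(dta_mm / pixel_mm))      # search radius in pixels
            ny, nx = measured.shape
            yy, xx = np.mgrid[-win:win + 1, -win:win + 1]
            dist2 = ((yy * pixel_mm) ** 2 + (xx * pixel_mm) ** 2) / dta_mm**2
            passed, total = 0, 0
            for iy in range(win, ny - win):
                for ix in range(win, nx - win):
                    patch = calculated[iy - win:iy + win + 1, ix - win:ix + win + 1]
                    ddose2 = ((measured[iy, ix] - patch) / (dd * dmax)) ** 2
                    passed += (ddose2 + dist2).min() <= 1.0
                    total += 1
            return 100.0 * passed / total

        # Synthetic example: planned dose vs. a slightly perturbed "film" map.
        calc = np.fromfunction(
            lambda y, x: 200 * np.exp(-((x - 64)**2 + (y - 64)**2) / 900),
            (128, 128))
        meas = calc * 1.01 + np.random.normal(0, 1.0, calc.shape)
        print(f"pass rate: {gamma_pass_rate(meas, calc, pixel_mm=1.0):.1f}%")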

  20. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  2. Hydrogeology, simulated ground-water flow, and ground-water quality, Wright-Patterson Air Force Base, Ohio

    Science.gov (United States)

    Dumouchelle, D.H.; Schalk, C.W.; Rowe, G.L.; De Roche, J.T.

    1993-01-01

    Ground water is the primary source of water in the Wright-Patterson Air Force Base area. The aquifer consists of glacial sands and gravels that fill a buried bedrock-valley system. Consolidated rocks in the area consist of poorly permeable Ordovician shale of the Richmondian stage and, in the upland areas, the Brassfield Limestone of Silurian age. The valleys are filled with glacial sediments of Wisconsinan age consisting of clay-rich tills and coarse-grained outwash deposits. Estimates of the hydraulic conductivity of the shales, based on results of displacement/recovery tests, range from 0.0016 to 12 feet per day; estimates for the glacial sediments range from less than 1 foot per day to more than 1,000 feet per day. Ground water flows from the uplands towards the valleys and the major rivers in the region, the Great Miami and the Mad Rivers. Hydraulic-head data indicate that ground water flows between the bedrock and the unconsolidated deposits. Data from a gain/loss study of the Mad River system and hydrographs from nearby wells reveal that the reach of the river next to Wright-Patterson Air Force Base is a ground-water discharge area. A steady-state, three-dimensional ground-water-flow model was developed to simulate ground-water flow in the region. The model contains three layers and encompasses about 100 square miles centered on Wright-Patterson Air Force Base. Ground water enters the modeled area primarily by river leakage and underflow at the model boundary. Ground water exits the modeled area primarily by flow through the valleys at the model boundaries and through production wells. A model sensitivity analysis involving systematic changes in the values of hydrologic parameters indicates that the model is most sensitive to decreases in riverbed conductance and in vertical conductance between the upper two layers. The analysis also indicates that the contribution of water to the buried-valley aquifer from the bedrock that forms the valley walls is about 2 to 4

  3. Independent Component Analyses of Ground-based Exoplanetary Transits

    Science.gov (United States)

    Silva Martins-Filho, Walter; Griffith, Caitlin Ann; Pearson, Kyle; Waldmann, Ingo; Biddle, Lauren; Zellem, Robert Thomas; Alvarez-Candal, Alvaro

    2016-10-01

    Most observations of exoplanetary atmospheres are conducted when a "hot Jupiter" exoplanet transits in front of its host star. These Jovian-sized planets have small orbital periods, on the order of days, and therefore short transit times, making them more amenable to observation. Measurements of hot Jupiter transits must achieve a 10^-4 level of accuracy in the flux to determine the spectral modulations of the exoplanetary atmosphere. To accomplish this level of precision, we need to separate systematic errors and, for ground-based measurements, the effects of Earth's atmosphere from the signal due to the exoplanet, which is several orders of magnitude smaller. Currently, the effects of the terrestrial atmosphere and some of the time-dependent systematic errors are treated by dividing the flux of the host star by that of a reference star at each wavelength and time step of the transit. More recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann 2014, 2012; Morello et al. 2015, 2016). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals in a data set (Hyvarinen and Oja, 2000). One strength of this method is that it requires no additional prior knowledge of the system. Here, we present a study of the application of ICA to ground-based transit observations of extrasolar planets, which are affected by Earth's atmosphere. We analyze photometric data of two extrasolar planets, WASP-1b and GJ3470b, recorded by the 61" Kuiper Telescope at Steward Observatory using the Harris B and U filters. The presentation will compare the light-curve depths and their dispersions as derived from the ICA analysis to those derived by analyses that ratio the host star to nearby reference stars. References: Waldmann, I. P. 2012, ApJ, 747, 12; Waldmann, I. P. 2014, ApJ, 780, 23; Morello, G. 2015, ApJ, 806
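
    A minimal sketch of the ICA step, assuming synthetic light curves that share a common systematic trend, can be written with scikit-learn's FastICA; this is an illustration of the blind-source-separation idea, not the authors' pipeline.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(-0.1, 0.1, 500)                    # days from mid-transit

        transit = np.where(np.abs(t) < 0.04, -0.01, 0.0)   # 1% box-shaped transit
        airmass = 0.008 * (t - t.min()) / np.ptp(t)        # shared terrestrial trend

        def noise():
            return 5e-4 * rng.standard_normal(t.size)

        # Target and two comparison stars mixing the same underlying sources.
        curves = np.column_stack([
            1.0 + transit + 1.0 * airmass + noise(),
            1.0 + 0.8 * airmass + noise(),
            1.0 + 1.2 * airmass + noise(),
        ])

        sources = FastICA(n_components=2, random_state=0).fit_transform(curves)

        # The component most correlated with a box shape carries the transit;
        # the other carries the shared systematics to be removed.
        corr = [abs(np.corrcoef(s, transit)[0, 1]) for s in sources.T]
        print("transit-like component:", int(np.argmax(corr)))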

  4. Models of ionospheric VLF absorption of powerful ground based transmitters

    Science.gov (United States)

    Cohen, M. B.; Lehtinen, N. G.; Inan, U. S.

    2012-12-01

    Ground based Very Low Frequency (VLF, 3-30 kHz) radio transmitters play a role in precipitation of energetic Van Allen electrons. Initial analyses of the contribution of VLF transmitters to radiation belt losses were based on early models of trans-ionospheric propagation known as the Helliwell absorption curves, but some recent studies have found that the model overestimates (by 20-100 dB) the VLF energy reaching the magnetosphere. It was subsequently suggested that conversion of wave energy into electrostatic modes may be responsible for the error. We utilize a newly available extensive record of VLF transmitter energy reaching the magnetosphere, taken from the DEMETER satellite, and perform a direct comparison with a sophisticated full wave model of trans-ionospheric propagation. Although the model does not include the effect of ionospheric irregularities, it correctly predicts the average total power injected into the magnetosphere within several dB. The results, particularly at nighttime, appear to be robust against the variability of the ionospheric electron density. We conclude that the global effect of irregularity scattering on whistler mode conversion to quasi-electrostatic may be no larger than 6 dB.

  5. Ontology Model Verification Approach Based on OCL

    Institute of Scientific and Technical Information of China (English)

    钱鹏飞; 王英林; 张申生

    2015-01-01

    In this paper, by combining set and relation theory with ontology models and by introducing and extending the Object Constraint Language (OCL) from object-oriented technology, we present an OCL-based ontology verification method. The method extracts an ontology definition meta-model (ODM), grounded in set and relation theory, from a large number of ontology models. The ontology model is divided into entity-related elements and constraint-rule-related elements, and a series of OCL extension functions provides the formalised expression of these two kinds of ontology model elements, thereby enabling OCL-based formal verification of ontology models. Finally, the use of this verification approach for ontology model conflict detection and resolution is discussed through a verification example based on the vehicle-management ontology slice of the Baosteel information sharing platform.

  6. Ground Based Investigation of Electrostatic Accelerometer in HUST

    Science.gov (United States)

    Bai, Y.; Zhou, Z.

    2013-12-01

    High-precision electrostatic accelerometers with six degrees of freedom (DOF) of acceleration measurement were successfully used in the CHAMP, GRACE, and GOCE missions to measure the Earth's gravity field. In our group, a space inertial sensor based on a capacitive transducer and electrostatic control techniques has been investigated for tests of the equivalence principle (TEPO), searches for non-Newtonian forces in the micrometer range, and satellite recovery of the Earth's field. The key techniques, a capacitive position sensor with a noise level of 2×10^-7 pF/Hz^1/2 and an electrostatic actuator at the μV/Hz^1/2 level, have been realized, and all six servo-loop controls using a discrete PID algorithm are implemented in an FPGA device. For testing on the ground, a fiber torsion pendulum facility is adopted to compensate for one g of the Earth's gravity and to measure parameters of the electrostatically controlled inertial sensor, such as the resolution, the electrostatic stiffness, and the cross-coupling between different DOFs. A simple short-distance double-capsule facility, providing a valid duration of about 0.5 second, has been set up in our lab for free-fall tests of the engineering model, which can directly verify the function of the six-DOF control. Meanwhile, a high-voltage suspension method has also been realized, and preliminary results show that the acceleration noise of the horizontal axis is at the 10^-8 m/s^2/Hz^1/2 level, limited mainly by seismic noise. References: [1] Fen Gao, Ze-Bing Zhou, Jun Luo, Feasibility for Testing the Equivalence Principle with Optical Readout in Space, Chin. Phys. Lett. 28(8) (2011) 080401. [2] Z. Zhu, Z. B. Zhou, L. Cai, Y. Z. Bai, J. Luo, Electrostatic gravity gradiometer design for the advanced GOCE mission, Adv. Sp. Res. 51 (2013) 2269-2276. [3] Z. B. Zhou, L. Liu, H. B. Tu, Y. Z. Bai, J. Luo, Seismic noise limit for ground-based performance measurements of an inertial sensor using a torsion balance, Class. Quantum Grav. 27 (2010) 175012. [4] H B Tu, Y Z Bai, Z B Zhou, L Liu, L

  7. Probing Pluto's Atmosphere Using Ground-Based Stellar Occultations

    Science.gov (United States)

    Sicardy, Bruno; Rio de Janeiro Occultation Team, Granada Team, International Occultation and Timing Association, Royal Astronomical Society New Zealand Occultation Section, Lucky Star associated Teams

    2016-10-01

    Over the last three decades, some twenty stellar occultations by Pluto have been monitored from Earth. They occur when the dwarf planet blocks the light from a star for a few minutes as it moves across the sky. Such events provided the first hint of Pluto's atmosphere in 1985, which was fully confirmed during another occultation in 1988, but it was only in 2002 that a new occultation could be recorded. From then on, the dwarf planet moved in front of the galactic center, which greatly increased the number of events observable per year. Pluto occultations are essentially refractive events during which the stellar rays are bent by the tenuous atmosphere, causing a gradual dimming of the star. This provides the density, pressure, and temperature profiles of the atmosphere from a few kilometers above the surface up to about 250 km altitude, corresponding to pressure levels of about 10 and 0.1 μbar, respectively. Moreover, the extremely fine spatial resolution (a few km) obtained through this technique allows the detection of atmospheric gravity waves, and permits in principle the detection of hazes, if present. Several aspects make Pluto stellar occultations quite special: first, they are the only way to probe Pluto's atmosphere in detail, as the dwarf planet is far too small on the sky and the atmosphere far too tenuous to be directly imaged from Earth. Second, they are an excellent example of participative science, as many amateurs have been able to record these events worldwide with valuable scientific returns, in collaboration with professional astronomers. Third, they reveal Pluto's climatic changes on decadal scales and constrain the various seasonal models currently explored. Finally, these observations are fully complementary to space exploration, in particular the New Horizons (NH) mission. I will show how ground-based occultations helped to better calibrate some NH profiles, and conversely, how NH results provide some key boundary conditions

  8. SU-E-T-398: Verification of Gamma Knife EXtend System Based Fractionated Treatment Planning Using EBT2 Film.

    Science.gov (United States)

    Gopishankar, N; Bisht, R K

    2012-06-01

    To present EBT2 film verification of treatment planning with the eXtend System, a relocatable frame system for multiple-fraction or serial multiple-session radiosurgery. A human-head-shaped phantom was used to simulate the verification process for fractionated Gamma Knife (GK) treatment. Phantom preparation for eXtend-frame-based treatment planning involved creating a dental impression, fitting the phantom to the frame system, and acquiring a stereotactic computed tomography (CT) scan. A CT scan (Siemens Emotion 6) of the phantom was obtained with the following parameters: tube voltage 110 kV, tube current 280 mA, pixel size 0.5 mm × 0.5 mm, and 1 mm slice thickness. A treatment plan with two 8 mm collimator shots and three-sector blocking in each shot was made. A dose prescription of 4.0 Gy at 100% was delivered for the first of the two planned fractions. Gafchromic EBT2 film (ISP, Wayne, NJ) was used as the 2D verification dosimeter in this process. Films were cut and placed inside the film insert of the phantom for treatment dose delivery. Meanwhile, a set of films from the same batch was exposed to doses from 0 Gy to 12 Gy for calibration purposes. An EPSON Expression 10000XL scanner was used to scan the exposed films in transparency mode. Scanned films were analyzed with in-house MATLAB codes. Gamma-index analysis of the film measurement in comparison with the TPS-calculated dose resulted in high pass rates (>90%) for the tolerance criteria 2%/2 mm, 1%/1 mm, and 0.5%/0.5 mm. The isodose overlays and linear dose profiles of the film-measured and computed dose distributions in the sagittal and coronal planes were in close agreement. Through this study we propose a treatment verification QA method for eXtend-frame-based fractionated Gamma Knife radiosurgery using EBT2 film. Acknowledgement: The authors acknowledge the help of Andre Micke, ISP, for sharing his expertise on EBT2 film. © 2012 American Association of Physicists in Medicine.
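
    The film calibration step can be sketched as follows in Python, assuming hypothetical net optical density readings for the 0-12 Gy calibration films and a third-degree polynomial response curve (a common choice for radiochromic film, used e.g. in the brachytherapy film study above).

        import numpy as np

        # Hypothetical calibration points: delivered dose (Gy) and the film's
        # measured net optical density (netOD) from the flatbed scanner.
        dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
        net_od = np.array([0.0, 0.030, 0.055, 0.100, 0.175,
                           0.235, 0.285, 0.328, 0.365])

        # Third-degree polynomial calibration: dose as a function of netOD.
        calibration = np.poly1d(np.polyfit(net_od, dose, deg=3))

        # Convert a scanned film reading to dose.
        print(calibration(0.20))   # dose (Gy) at netOD = 0.20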

  9. 3D VMAT Verification Based on Monte Carlo Log File Simulation with Experimental Feedback from Film Dosimetry

    Science.gov (United States)

    Barbeiro, A. R.; Ureba, A.; Baeza, J. A.; Linares, R.; Perucha, M.; Jiménez-Ortega, E.; Velázquez, S.; Mateos, J. C.

    2016-01-01

    A model based on a specific phantom, called QuAArC, has been designed for the evaluation of planning and verification systems for complex radiotherapy treatments, such as volumetric modulated arc therapy (VMAT). This model uses the high accuracy provided by Monte Carlo (MC) simulation of log files and allows experimental feedback from the high spatial resolution of films hosted in QuAArC. The cylindrical phantom was specifically designed to host films rolled at different radial distances, able to take into account the entrance fluence and the 3D dose distribution. Ionization chamber measurements are also included in the feedback process for absolute dose considerations. In this way, automated MC simulation of treatment log files is implemented to calculate the actual delivery geometries, while the monitor units are experimentally adjusted to reconstruct the dose-volume histogram (DVH) on the patient CT. Prostate and head-and-neck clinical cases, previously planned with the Monaco and Pinnacle treatment planning systems and verified with two different commercial systems (Delta4 and COMPASS), were selected in order to test the operational feasibility of the proposed model. The proper operation of the feedback procedure was proved through the high agreement achieved between reconstructed dose distributions and the film measurements (global gamma passing rates > 90% for the 2%/2 mm criteria). The necessary discretization level of the log file for dose calculation and the potential mismatch between calculated control points and the detection grid in the verification process are discussed. Besides the effect of the dose calculation accuracy of the analytic algorithm implemented in treatment planning systems for a dynamic technique, the importance of the detection density level and its location in the VMAT-specific phantom for obtaining a more reliable DVH on the patient CT is also discussed. The proposed model also showed enough robustness and efficiency to be considered as a pre

  10. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I; Mans, A; Mijnheer, B; Herk, M van; Gonzalez, P [Netherlands Cancer Institute - Antoni van Leeuwenhoek, Amsterdam, Noord-Holland (Netherlands)

    2015-06-15

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (~2.5 fps). After a portal image is acquired, the software searches for "hot spots" in the reconstructed 3D dose distribution. A hot spot is in this study defined as a 4 cm³ cube in which the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated, resulting in a linac halt. The software was tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT head-and-neck treatment having a large leaf-position error or a large monitor-unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot-spot detection, takes about 220 ms on a dual hexacore Intel Xeon X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot-spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments.
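
    The hot-spot criterion quoted above is simple enough to sketch directly. The following Python fragment flags voxels whose surrounding ~4 cm³ mean reconstructed dose exceeds the planned mean by at least 20% and 50 cGy; the grid spacing and dose values are assumptions for illustration, not the authors' implementation.

        import numpy as np
        from scipy.ndimage import uniform_filter

        VOXEL_CM = 0.2                           # assumed 2 mm isotropic dose grid
        side = round(4 ** (1 / 3) / VOXEL_CM)    # cube edge (voxels) for ~4 cm^3

        def find_hot_spots(reconstructed, planned):
            """Flag voxels where the mean dose in a ~4 cm^3 cube exceeds the
            mean planned dose by >= 20% and >= 50 cGy (abstract's criterion)."""
            mean_rec = uniform_filter(reconstructed, size=side, mode="nearest")
            mean_plan = uniform_filter(planned, size=side, mode="nearest")
            return (mean_rec >= 1.2 * mean_plan) & (mean_rec >= mean_plan + 50.0)

        # Synthetic test: 200 cGy plan with a localized delivery over-dose.
        plan = np.full((60, 60, 60), 200.0)      # cGy
        recon = plan.copy()
        recon[28:36, 28:36, 28:36] += 80.0       # +40% in a small region
        print("halt the linac:", bool(find_hot_spots(recon, plan).any()))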

  11. Human Identity Verification based on Heart Sounds: Recent Advances and Future Directions

    CERN Document Server

    Beritelli, Francesco

    2011-01-01

    Identity verification is an increasingly important process in our daily lives, and biometric recognition is a natural solution to the authentication problem. One of the most important research directions in the field of biometrics is the characterization of novel biometric traits that can be used in conjunction with other traits, to limit their shortcomings or to enhance their performance. The aim of this work is to introduce the reader to the usage of heart sounds for biometric recognition, describing the strengths and the weaknesses of this novel trait and analyzing in detail the methods developed so far by different research groups and their performance.

  12. Observing Tsunamis in the Ionosphere Using Ground Based GPS Measurements

    Science.gov (United States)

    Galvan, D. A.; Komjathy, A.; Song, Y. Tony; Stephens, P.; Hickey, M. P.; Foster, J.

    2011-01-01

    Ground-based Global Positioning System (GPS) measurements of ionospheric Total Electron Content (TEC) show variations consistent with atmospheric internal gravity waves caused by ocean tsunamis following recent seismic events, including the Tohoku tsunami of March 11, 2011. We observe fluctuations correlated in time, space, and wave properties with this tsunami in TEC estimates processed using JPL's Global Ionospheric Mapping software. These TEC estimates were band-pass filtered to remove ionospheric TEC variations with periods outside the typical range of internal gravity waves caused by tsunamis. Observable variations in TEC appear correlated with the Tohoku tsunami near the epicenter, at Hawaii, and near the west coast of North America. Disturbance magnitudes are 1-10% of the background TEC value. Observations near the epicenter are compared to estimates of expected tsunami-driven TEC variations produced by Embry-Riddle Aeronautical University's Spectral Full Wave Model, an atmosphere-ionosphere coupling model, and found to be in good agreement. The potential exists to apply these detection techniques to real-time GPS TEC data, providing estimates of tsunami speed and amplitude that may be useful for future early warning systems.
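
    The band-pass step can be sketched with a zero-phase Butterworth filter, assuming 30 s GPS sampling and a 10-30 minute period band typical of tsunami-driven gravity waves (the abstract does not state the exact band used); the TEC series below is synthetic.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 1 / 30.0                        # assumed 30 s GPS sampling cadence (Hz)
        t = np.arange(0, 6 * 3600, 30.0)     # six hours of TEC samples

        # Synthetic vTEC (TECU): slow trend plus a weak ~15-minute wave.
        tec = 20.0 + 3e-4 * t + 0.15 * np.sin(2 * np.pi * t / 900.0)

        # Keep periods of ~10-30 min (assumed gravity-wave band).
        low, high = 1 / (30 * 60), 1 / (10 * 60)          # band edges (Hz)
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="bandpass")
        tec_filtered = filtfilt(b, a, tec)                # zero-phase filtering

        print(f"residual wave amplitude: {tec_filtered.std():.3f} TECU")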

  13. Tissue Engineering of Cartilage on Ground-Based Facilities

    Science.gov (United States)

    Aleshcheva, Ganna; Bauer, Johann; Hemmersbach, Ruth; Egli, Marcel; Wehland, Markus; Grimm, Daniela

    2016-06-01

    Investigations under simulated microgravity offer the opportunity for a better understanding of the influence of altered gravity on cells and on scaffold-free three-dimensional (3D) tissue formation. To investigate the short-term influence, human chondrocytes were cultivated for 2 h, 4 h, 16 h, and 24 h on a 2D Fast-Rotating Clinostat (FRC) in DMEM/F-12 medium supplemented with 10% FCS. We detected holes in the vimentin network and perinuclear accumulations of vimentin after 2 h, and changes in chondrocyte shape, visualised by F-actin staining, after 4 h of FRC exposure. Scaffold-free cultivation of chondrocytes for 7 d on the Random Positioning Machine (RPM), the FRC, and the Rotating Wall Vessel (RWV) resulted in spheroid formation, a phenomenon already known from spaceflight experiments with chondrocytes (MIR Space Station) and thyroid cancer cells (SimBox/Shenzhou-8 space mission). The experiments, enabled by the ESA-CORA-GBF programme, gave us an optimal opportunity to study gravity-related cellular processes, validate ground-based facilities for our chosen cell system, and prepare long-term experiments under real microgravity conditions in space.

  14. Theoretical validation of ground-based microwave ozone observations

    Directory of Open Access Journals (Sweden)

    P. Ricaud

    Ground-based microwave measurements of the diurnal and seasonal variations of ozone at 42±4.5 and 55±8 km are validated by comparison with results from a zero-dimensional photochemical model and a two-dimensional (2D) chemical/radiative/dynamical model, respectively. O3 diurnal amplitudes measured in Bordeaux are shown to be in agreement with theory to within 5%. For the seasonal analysis of the O3 variation at 42±4.5 km, the 2D model underestimates the yearly averaged ozone concentration compared with the measurements. A double-maximum oscillation (~3.5%) is measured in Bordeaux, with an extended maximum in September and a maximum in February, whilst the 2D model predicts only a single large maximum (17%) in August and a pronounced minimum in January. Evidence suggests that dynamical transport by planetary-wave propagation causes the winter O3 maximum, a phenomenon which is not explicitly reproduced by the 2D model. At 55±8 km, the modeled yearly averaged O3 concentration is in very good agreement with the measured yearly average. A strong annual oscillation is both measured and modeled, with differences in amplitude shown to be exclusively linked to the temperature fields.

  15. Atmospheric Refraction Path Integrals in Ground-Based Interferometry

    CERN Document Server

    Mathar, R J

    2004-01-01

    The basic effect of the Earth's atmospheric refraction on telescope operation is the reduction of the true zenith angle to the apparent zenith angle, associated with prismatic aberrations due to the dispersion in air. If one attempts coherent superposition of star images in ground-based interferometry, one is in addition interested in the optical path length associated with the refracted rays. In a model of a flat Earth, the optical path difference between these is of no concern, as the translational symmetry of the setup means that no net effect remains. Here, I evaluate these interferometric integrals in the more realistic arrangement of two telescopes located on the surface of a common Earth sphere, pointing to a star through an atmosphere which also possesses spherical symmetry. Some focus is put on working out series expansions in terms of the small ratio of the baseline over the Earth radius, which allows one to bypass some numerics which otherwise are challenged by strong cancellation effects in building the opti...

  16. Experiments on a Ground-Based Tomographic Synthetic Aperture Radar

    Directory of Open Access Journals (Sweden)

    Hoonyol Lee

    2016-08-01

    This paper presents the development of, and experiments with, three-dimensional image formation using a ground-based tomographic synthetic aperture radar (GB-TomoSAR) system. GB-TomoSAR forms a two-dimensional synthetic aperture through the motion of the antennas in both the azimuth and vertical directions. After range compression, three-dimensional image focusing is performed by applying deramp-FFT (Fast Fourier Transform) algorithms in both the azimuth and vertical directions. Geometric and radiometric calibrations were applied to form an image cube, which was then projected onto range-azimuth and range-vertical cross-sections for visualization. An experiment with a C-band GB-TomoSAR system, with scan lengths of 2.49 m and 1.86 m in the azimuth and vertical directions, respectively, shows distinctive three-dimensional radar backscattering from stable buildings and roads, with resolutions similar to the theoretical values. Unstable objects such as trees and moving cars generate severe noise due to decorrelation during the eight-hour image-acquisition time.
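
    The deramp-FFT focusing idea can be illustrated in one dimension: after removing the rail geometry's quadratic phase, the residual phase of a point target is approximately linear in antenna position, so an FFT maps it to a peak at its cross-range location. The following numpy sketch uses hypothetical parameters (C-band wavelength, 100 m reference range) together with the paper's 2.49 m azimuth scan length.

        import numpy as np

        lam = 0.055                           # C-band wavelength (m), assumed
        R0 = 100.0                            # reference range to the scene (m), assumed
        x = np.linspace(-1.245, 1.245, 256)   # antenna positions on the 2.49 m rail

        x_t = 3.3                             # true cross-range of a point target (m)
        R = np.sqrt(R0**2 + (x - x_t) ** 2)   # simplified monostatic geometry
        signal = np.exp(-1j * 4 * np.pi * R / lam)

        # Deramp: remove the target-independent quadratic phase of the geometry.
        deramped = signal * np.exp(1j * 4 * np.pi * x**2 / (2 * R0 * lam))

        # FFT: the residual linear phase peaks at nu = 2*x_t/(lam*R0).
        spec = np.fft.fftshift(np.fft.fft(deramped))
        nu = np.fft.fftshift(np.fft.fftfreq(x.size, d=x[1] - x[0]))
        x_est = nu[np.argmax(np.abs(spec))] * lam * R0 / 2

        print(f"estimated cross-range: {x_est:.2f} m")   # ~3.3 m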

  17. Satellite Type Estimation from Ground-based Photometric Observation

    Science.gov (United States)

    Endo, T.; Ono, H.; Suzuki, J.; Ando, T.; Takanezawa, T.

    2016-09-01

    Optical photometric observation is potentially a powerful tool for understanding Geostationary Earth Orbit (GEO) objects. First, we measured in the laboratory the surface reflectance of common satellite materials, for example multi-layer insulation (MLI), mono-crystalline silicon cells, and carbon fiber reinforced plastic (CFRP). Next, we calculated the visual magnitude of a satellite using a simplified shape and the measured albedos. In this calculation model, the solar panels have dimensions of 2 by 8 meters and the bus area is 2 meters squared, with the measured optical properties described above. Under these conditions, the calculation shows that the brightness can change by 3 to 4 magnitudes in one night, while the color index changes by only 1 to 2 magnitudes. Finally, we observed color photometric data of several GEO satellites visible from Japan multiple times in August and September 2014, obtaining light curves of GEO satellites recorded in the B and V bands (using Johnson filters) with a ground-based optical telescope. As a result, the color index changed by approximately 0.5 to 1 magnitude in one night, and the order of magnitude was unchanged in all cases. In this paper, we briefly discuss satellite type estimation using the relation between brightness and color index obtained from the photometric observations.

  18. VME-based remote instrument control without ground loops

    CERN Document Server

    Belleman, J; González, J L

    1997-01-01

    New electronics has been developed for the remote control of the pick-up electrodes at the CERN Proton Synchrotron (PS). Communication between VME-based control computers and remote equipment is via full duplex point-to-point digital data links. Data are sent and received in serial format over simple twisted pairs at a rate of 1 Mbit/s, for distances of up to 300 m. Coupling transformers are used to avoid ground loops. The link hardware consists of a general-purpose VME-module, the 'TRX' (transceiver), containing four FIFO-buffered communication channels, and a dedicated control card for each remote station. Remote transceiver electronics is simple enough not to require micro-controllers or processors. Currently, some sixty pick-up stations of various types, all over the PS Complex (accelerators and associated beam transfer lines) are equipped with the new system. Even though the TRX was designed primarily for communication with pick-up electronics, it could also be used for other purposes, for example to for...

  19. Ground-based measurements of UV Index (UVI at Helwan

    Directory of Open Access Journals (Sweden)

    H. Farouk

    2012-12-01

    In October 2010, ground-based UV Index (UVI) measurements were carried out by the weather station at the solar laboratory of NRIAG. The daily variation has maximum values on spring and summer days and minimum values on autumn and winter days. A low level of UVI, between 2.55 and 2.825, was found in December, January, and February. A moderate level of UVI, between 3.075 and 5.6, was found in March, October, and November. A high level of UVI, between 6.7 and 7.65, was found in April, May, and September. A very high level of UVI, between 8 and 8.6, was found in June, July, and August. According to the fitted power law UVI = a(SZA)^b, the UVI increases with decreasing solar zenith angle (SZA), by 82% on a daily scale and 88% on a monthly scale. Helwan is thus exposed to a high level of radiation over 6 months per year, including 3 months with a very high UVI level, so it is advisable to avoid direct exposure to the sun from 11 am to 2 pm.
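
    Fitting the reported power-law form UVI = a(SZA)^b can be sketched with scipy; the data points below are hypothetical, not the Helwan measurements.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical midday observations: solar zenith angle (deg) vs. UV Index.
        sza = np.array([20.0, 30.0, 40.0, 50.0, 60.0, 70.0])
        uvi = np.array([8.6, 7.1, 5.4, 3.7, 2.3, 1.2])

        def power_law(x, a, b):
            return a * x**b

        (a, b), _ = curve_fit(power_law, sza, uvi, p0=(100.0, -1.0))

        print(f"UVI = {a:.1f} * SZA^{b:.2f}")   # b < 0: UVI rises as SZA falls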

  20. Replacement of Hydrochlorofluorocarbon (HCFC) -225 Solvent for Cleaning and Verification Sampling of NASA Propulsion Oxygen Systems Hardware, Ground Support Equipment, and Associated Test Systems

    Science.gov (United States)

    Burns, H. D.; Mitchell, M. A.; McMillian, J. H.; Farner, B. R.; Harper, S. A.; Peralta, S. F.; Lowrey, N. M.; Ross, H. R.; Juarez, A.

    2015-01-01

    Since the 1990s, NASA's rocket propulsion test facilities at Marshall Space Flight Center (MSFC) and Stennis Space Center (SSC) have used hydrochlorofluorocarbon-225 (HCFC-225), a Class II ozone-depleting substance, to safely clean and verify the cleanliness of large-scale propulsion oxygen systems and associated test facilities. From 2012 through 2014, test laboratories at MSFC, SSC, and Johnson Space Center-White Sands Test Facility collaborated to seek out, test, and qualify an environmentally preferred replacement for HCFC-225. Candidate solvents were selected, a test plan was developed, and the products were tested for materials compatibility, oxygen compatibility, cleaning effectiveness, and suitability for use in cleanliness verification and field cleaning operations. Honeywell Solstice (TradeMark) Performance Fluid (trans-1-chloro-3,3,3-trifluoropropene) was selected to replace HCFC-225 at NASA's MSFC and SSC rocket propulsion test facilities.

  1. Ground-based monitoring of solar radiation in Moldova

    Science.gov (United States)

    Aculinin, Alexandr; Smicov, Vladimir

    2010-05-01

    Integrated measurements of solar radiation in Kishinev, Moldova, have been carried out by the Atmospheric Research Group (ARG) at the Institute of Applied Physics since 2003. The direct, diffuse, and total components of solar radiation and atmospheric long-wave radiation are measured using the radiometric complex at the ground-based solar radiation monitoring station. Measurements are performed on stationary and moving platforms equipped with a set of 9 broadband solar radiation sensors covering the wavelength range from UV-B to IR. A detailed description of the station can be found at http://arg.phys.asm.md. The ground station is located in the urban environment of Kishinev city (47.00N; 28.56E). A summary of the observational data acquired at the station over the short-term period from 2004 to 2009 is presented below. Solar radiation measurements were performed using CM11 (280-3000 nm) and CH1 sensors (Kipp & Zonen). In the course of a year, the maximum and minimum monthly sums of total radiation were ~706.4 MJm-2 in June and ~82.1 MJm-2 in December, respectively. The monthly sums of direct solar radiation (on a horizontal plane) show maximum and minimum values of ~456.9 MJm-2 in July and ~25.5 MJm-2 in December, respectively. On average, direct radiation slightly predominates over scattered radiation within a year, 51% versus 49%. In the course of a year, the percentage contribution of direct radiation to total radiation is ~55-65% from May to September. In the remaining months, the percentage contribution decreases, taking a minimum value of ~28% in December. On average, the annual sum of total solar radiation is ~4679.9 MJm-2. The period from April to September accounts for ~76% of the annual amount of total radiation. The annual sum of sunshine duration amounts to ~2149 hours, which is ~48% of the possible sunshine duration. On average, within a year the maximum and minimum of sunshine duration are ~304 hours in

  2. Design verification enhancement of field programmable gate array-based safety-critical I&C system of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, Ibrahim [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of); Jung, Jaecheon, E-mail: jcjung@kings.ac.kr [Department of Nuclear Power Plant Engineering, KEPCO International Nuclear Graduate School, 658-91 Haemaji-ro, Seosang-myeon, Ulju-gun, Ulsan 45014 (Korea, Republic of); Heo, Gyunyoung [Department of Nuclear Engineering, Kyung Hee University, 1732 Deogyeong-daero, Giheung-gu, Yongin-si, Gyeonggi-do 17104 (Korea, Republic of)

    2017-06-15

    Highlights: • An enhanced, systematic and integrated design verification approach is proposed for V&V of FPGA-based I&C systems of NPPs. • The RPS bistable fixed setpoint trip algorithm is designed, analyzed, verified and discussed using the proposed approaches. • The application of the integrated verification approach simultaneously verified the entire set of design modules. • The applicability of the proposed V&V facilitated the design verification processes. - Abstract: Safety-critical instrumentation and control (I&C) systems in nuclear power plants (NPPs) implemented on programmable logic controllers (PLCs) play a vital role in the safe operation of the plant. Challenges such as fast obsolescence, vulnerability to cyber-attack, and other issues related to software systems have recently led to the consideration of field programmable gate arrays (FPGAs) as an alternative to PLCs because of their advantages and hardware-related benefits. However, safety analysis for FPGA-based I&C systems, and verification and validation (V&V) assessments, remain important issues to be resolved, and have now become a global point of research interest. In this work, we propose systematic design and verification strategies, from start to ready-to-use, in the form of model-based approaches for an FPGA-based reactor protection system (RPS) that can enhance the design verification and validation processes. The proposed methodology stages are requirement analysis, enhanced functional flow block diagram (EFFBD) models, finite state machine with data path (FSMD) models, hardware description language (HDL) code development, and design verification. The design verification stage includes unit tests – Very high speed integrated circuit Hardware Description Language (VHDL) tests and modified condition/decision coverage (MC/DC) tests, module tests – MATLAB/Simulink co-simulation tests, and integration tests – FPGA hardware test beds. To prove the adequacy of the proposed
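
    A behavioral sketch (in Python, not the authors' VHDL) of a bistable fixed-setpoint trip of the kind named above: the output latches on when the process variable exceeds the setpoint and resets only below a hysteresis band. The setpoint and hysteresis values are illustrative placeholders.

        def bistable_trip(samples, setpoint=110.0, hysteresis=5.0):
            """Latching trip logic: trip at >= setpoint, reset below setpoint - hysteresis."""
            tripped = False
            states = []
            for x in samples:
                if not tripped and x >= setpoint:
                    tripped = True          # latch the trip signal
                elif tripped and x < setpoint - hysteresis:
                    tripped = False         # reset only below the hysteresis band
                states.append(tripped)
            return states

        print(bistable_trip([100, 108, 111, 109, 106, 104, 100]))
        # -> [False, False, True, True, True, False, False]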

  3. GVT-Based Ground Flutter Test without Wind Tunnel Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc (ZONA) and Arizona State University (ASU) propose a R&D effort to develop a ground flutter testing system without wind tunnel, called the...

  4. GVT-Based Ground Flutter Test without Wind Tunnel Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc (ZONA) and Arizona State University (ASU) propose a R&D effort to further develop the ground flutter testing system in place of a wind...

  5. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in the following directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated, and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of the temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
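
    A minimal sketch, in Python rather than the authors' MATLAB, of the kind of processing described above: locating extremes of the vertical GRF component Fz(t), integrating the impulse of force, and computing a left/right symmetry index. The double-hump Fz curve is synthetic and purely illustrative.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 1000.0                            # sampling rate [Hz]
        t = np.arange(0.0, 0.8, 1.0 / fs)      # one stance phase [s]
        # Synthetic double-hump vertical GRF, loosely resembling real data:
        fz = 700.0 * (np.sin(np.pi * t / 0.8) + 0.25 * np.sin(3 * np.pi * t / 0.8))

        peaks, _ = find_peaks(fz)              # extremes of Fz(t)
        impulse = np.trapz(fz, t)              # impulse of force [N*s]

        def symmetry_index(left, right):
            """Classic symmetry index: 0 means perfect left/right symmetry."""
            return 100.0 * (left - right) / (0.5 * (left + right))

        print("peak times [s]:", t[peaks])
        print("impulse [N*s]: %.1f" % impulse)
        print("SI for example peak forces: %.1f%%" % symmetry_index(760.0, 720.0))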

  6. Acrylonitrile Butadiene Styrene (ABS) plastic based low cost tissue equivalent phantom for verification dosimetry in IMRT.

    Science.gov (United States)

    Kumar, Rajesh; Sharma, S D; Deshpande, Sudesh; Ghadi, Yogesh; Shaiju, V S; Amols, H I; Mayya, Y S

    2009-12-17

    A novel IMRT phantom was designed and fabricated using Acrylonitrile Butadiene Styrene (ABS) plastic. Physical properties of ABS plastic related to radiation interaction and dosimetry were compared with those of commonly available phantom materials for dose measurements in radiotherapy. The ABS IMRT phantom has provisions to hold various types of detectors such as ion chambers, radiographic/radiochromic films, TLDs, MOSFETs, and gel dosimeters. Measurements related to pre-treatment dose verification in IMRT of carcinoma of the prostate were carried out using the ABS and Scanditronics-Wellhoffer RW3 IMRT phantoms for five different cases. Point dose data were acquired using an ionization chamber and TLD discs, while Gafchromic EBT and radiographic EDR2 films were used for generating 2-D dose distributions. Treatment planning system (TPS) calculated and measured doses in the ABS plastic and RW3 IMRT phantoms were in agreement within +/-2%. The dose values at a point in a given patient acquired using the ABS and RW3 phantoms were found comparable within 1%. Fluence maps and dose distributions of these patients generated by the TPS and measured in the ABS IMRT phantom were also found comparable, both numerically and spatially. This study indicates that the ABS plastic IMRT phantom is tissue equivalent and dosimetrically similar to solid/plastic water IMRT phantoms. Though this material is demonstrated here for IMRT dose verification, it can also be used as a tissue-equivalent phantom material for other dosimetry purposes in radiotherapy.

  7. Development and validation of MCNPX-based Monte Carlo treatment plan verification system.

    Science.gov (United States)

    Jabbari, Iraj; Monadi, Shahram

    2015-01-01

    A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in digital imaging and communications in medicine-radiation therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by MCTPV was compared with that of the TiGRT planning system. The results showed correct implementation of the beam configurations and patient information in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for total beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans.
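
    A brute-force sketch of the gamma-index metric quoted above (1D profiles, global 3%/3 mm criteria); clinical tools implement far more elaborate 2D/3D versions, so this is only meant to make the pass-rate figure concrete.

        import numpy as np

        def gamma_pass_rate(ref, measured, dx_mm, dose_tol=0.03, dta_mm=3.0):
            """Global gamma: minimum combined dose-difference/DTA metric per point."""
            x = np.arange(len(ref)) * dx_mm
            dmax = ref.max()
            gammas = np.empty(len(measured))
            for i, d in enumerate(measured):
                g2 = ((x[i] - x) / dta_mm) ** 2 + ((d - ref) / (dose_tol * dmax)) ** 2
                gammas[i] = np.sqrt(g2.min())
            return 100.0 * np.mean(gammas <= 1.0)

        xi = np.arange(100)
        ref = np.exp(-0.5 * ((xi - 50) / 12.0) ** 2)    # reference profile
        meas = np.exp(-0.5 * ((xi - 51) / 12.0) ** 2)   # measured, shifted 1 mm
        print("gamma pass rate (3%%/3 mm): %.1f%%" % gamma_pass_rate(ref, meas, 1.0))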

  8. Development and validation of MCNPX-based Monte Carlo treatment plan verification system

    Directory of Open Access Journals (Sweden)

    Iraj Jabbari

    2015-01-01

    Full Text Available A Monte Carlo treatment plan verification (MCTPV) system was developed for clinical treatment plan verification (TPV), especially for conformal and intensity-modulated radiotherapy (IMRT) plans. In MCTPV, the MCNPX code was used for particle transport through the accelerator head and the patient body. MCTPV has an interface with the TiGRT planning system and reads the information needed for the Monte Carlo calculation, transferred in digital imaging and communications in medicine-radiation therapy (DICOM-RT) format. In MCTPV several methods were applied in order to reduce the simulation time. The relative dose distribution of a clinical prostate conformal plan calculated by MCTPV was compared with that of the TiGRT planning system. The results showed correct implementation of the beam configurations and patient information in this system. For quantitative evaluation of MCTPV, a two-dimensional (2D) diode array (MapCHECK2) and gamma index analysis were used. The gamma passing rate (3%/3 mm) of an IMRT plan was found to be 98.5% for total beams. Also, comparison of the measured and Monte Carlo calculated doses at several points inside an inhomogeneous phantom for 6- and 18-MV photon beams showed good agreement (within 1.5%). The accuracy and timing results of MCTPV showed that MCTPV could be used very efficiently for additional assessment of complicated plans such as IMRT plans.

  9. A Simple Visual Ethanol Biosensor Based on Alcohol Oxidase Immobilized onto Polyaniline Film for Halal Verification of Fermented Beverage Samples

    Science.gov (United States)

    Kuswandi, Bambang; Irmawati, Titi; Hidayat, Moch Amrun; Jayus; Ahmad, Musa

    2014-01-01

    A simple visual ethanol biosensor based on alcohol oxidase (AOX) immobilised onto polyaniline (PANI) film for halal verification of fermented beverage samples is described. This biosensor responds to ethanol via a colour change from green to blue, due to the enzymatic reaction of ethanol that produces acetaldehyde and hydrogen peroxide, when the latter oxidizes the PANI film. The procedure to obtain this biosensor consists of the immobilization of AOX onto PANI film by adsorption. For the immobilisation, an AOX solution is deposited on the PANI film and left at room temperature until dried (30 min). The biosensor was constructed as a dip stick for visual and simple use. The colour changes of the films were scanned and analysed using image analysis software (i.e., ImageJ) to study the characteristics of the biosensor's response toward ethanol. The biosensor has a linear response in an ethanol concentration range of 0.01%–0.8%, with a correlation coefficient (r) of 0.996. The limit of detection of the biosensor was 0.001%, with reproducibility (RSD) of 1.6% and a lifetime of up to seven weeks when stored at 4 °C. The biosensor provides accurate results for ethanol determination in fermented drinks and was in good agreement with the standard method (gas chromatography) results. Thus, the biosensor could be used as a simple visual method for ethanol determination in fermented beverage samples that can be useful to the Muslim community for halal verification. PMID:24473284
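
    A hedged sketch of the calibration arithmetic implied above: fit a straight line to colour response versus ethanol concentration and estimate the limit of detection as 3*sigma(blank)/slope. All numbers below are invented stand-ins, not the paper's data.

        import numpy as np

        conc = np.array([0.01, 0.05, 0.1, 0.2, 0.4, 0.8])       # % ethanol
        signal = np.array([2.1, 9.8, 20.5, 41.0, 79.6, 161.0])  # image-analysis units

        slope, intercept = np.polyfit(conc, signal, 1)
        r = np.corrcoef(conc, signal)[0, 1]
        sigma_blank = 0.07                 # assumed standard deviation of blanks
        lod = 3.0 * sigma_blank / slope    # limit of detection, % ethanol

        print("slope=%.1f  r=%.4f  LOD~%.4f%% ethanol" % (slope, r, lod))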

  10. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  11. Characterization of subarctic vegetation using ground based remote sensing methods

    Science.gov (United States)

    Finnell, D.; Garnello, A.; Palace, M. W.; Sullivan, F.; Herrick, C.; Anderson, S. M.; Crill, P. M.; Varner, R. K.

    2014-12-01

    Stordalen mire is located at 68°21'N and 19°02'E in the Swedish subarctic. Climate monitoring has revealed a warming trend spanning the past 150 years that is affecting the mire's ability to hold stable palsa/hummock mounds. The micro-topography of the landscape has begun to degrade into thaw ponds, changing the vegetation cover from ombrotrophic to minerotrophic. Hummocks are ecologically important due to their ability to act as carbon sinks. Thaw ponds and sphagnum-rich transitional zones have been documented as sources of atmospheric CH4. An objective of this project is to determine whether a high-resolution three-band (RGB) camera and an RGNIR camera can detect differences in vegetation over five different site types. Species composition was collected for 50 plots, with ten repetitions for each site type: palsa/hummock, tall shrub, semi-wet, tall graminoid, and wet. Sites were differentiated based on dominant species and features consisting of open water presence, Sphagnum spp. cover, graminoid spp. cover, or the presence of dry raised plateaus/mounds. A pole-based camera mount was used to collect images at a height of ~2.44 m above the ground. The images were cropped in post-processing to fit a one-square-meter quadrat. Texture analysis was performed on all images, including entropy, lacunarity, and angular second moment. Preliminary results suggest that site type influences the number of species present. The p-values for the ability to predict site type using a t-test range from <0.0001 to 0.0461. A stepwise discriminant analysis of site type vs. texture yielded a 10% misclassification rate. Through the use of a stepwise regression of texture variables, actual vs. predicted percent vegetation coverage gave R-squared values of 0.73, 0.71, 0.67, and 0.89 for C. bigelowii, R. chamaemorus, Sphagnum spp., and open water, respectively. These data provide some support to the notion that texture analyses can be used for classification of mire site types. Future

  12. Observational Selection Effects with Ground-based Gravitational Wave Detectors

    Science.gov (United States)

    Chen, Hsin-Yu; Essick, Reed; Vitale, Salvatore; Holz, Daniel E.; Katsavounidis, Erik

    2017-01-01

    Ground-based interferometers are not perfect all-sky instruments, and it is important to account for their behavior when considering the distribution of detected events. In particular, the LIGO detectors are most sensitive to sources above North America and the Indian Ocean and, as the Earth rotates, the sensitive regions are swept across the sky. However, because the detectors do not acquire data uniformly over time, there is a net bias on detectable sources' right ascensions. Both LIGO detectors preferentially collect data during their local night; it is more than twice as likely to be local midnight than noon when both detectors are operating. We discuss these selection effects and how they impact LIGO's observations and electromagnetic (EM) follow-up. Beyond galactic foregrounds associated with seasonal variations, we find that equatorial observatories can access over 80% of the localization probability, while mid-latitudes will access closer to 70%. Facilities located near the two LIGO sites can observe sources closer to their zenith than their analogs in the south, but the average observation will still be no closer than 44° from zenith. We also find that observatories in Africa or the South Atlantic will wait systematically longer before they can begin observing compared to the rest of the world, though there is a preference for longitudes near the LIGO sites. These effects, along with knowledge of the LIGO antenna pattern, can inform EM follow-up activities and optimization, including the possibility of directing observations even before gravitational-wave events occur.

  13. Ozone profiles above Kiruna from two ground-based radiometers

    Science.gov (United States)

    Ryan, Niall J.; Walker, Kaley A.; Raffalski, Uwe; Kivi, Rigel; Gross, Jochen; Manney, Gloria L.

    2016-09-01

    This paper presents new atmospheric ozone concentration profiles retrieved from measurements made with two ground-based millimetre-wave radiometers in Kiruna, Sweden. The instruments are the Kiruna Microwave Radiometer (KIMRA) and the Millimeter wave Radiometer 2 (MIRA 2). The ozone concentration profiles are retrieved using an optimal estimation inversion technique, and they cover an altitude range of ~16-54 km, with an altitude resolution of, at best, 8 km. The KIMRA and MIRA 2 measurements are compared to each other, to balloon-borne ozonesonde measurements at Sodankylä, Finland, and to measurements made by the Microwave Limb Sounder (MLS) aboard the Aura satellite. KIMRA has a correlation of 0.82 with the ozonesonde data but shows a low bias; MIRA 2 shows a smaller-magnitude low bias and a 0.98 correlation coefficient. Both radiometers are in general agreement with each other and with the MLS data, showing high correlation coefficients, but there are differences between measurements that are not explained by random errors. An oscillatory bias with a peak of approximately ±1 ppmv is identified in the KIMRA ozone profiles over an altitude range of ~18-35 km, and is believed to be due to baseline wave features present in the spectra. A time series analysis of KIMRA ozone for the winters 2008-2013 shows the existence of a local wintertime minimum in the ozone profile above Kiruna. The measurements have been ongoing at Kiruna since 2002 and late 2012 for KIMRA and MIRA 2, respectively.
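
    A tiny linear optimal-estimation sketch of the retrieval scheme named above, combining an a priori profile with a measurement through the forward-model Jacobian. All matrices are synthetic stand-ins for the real radiative-transfer quantities used with KIMRA and MIRA 2.

        import numpy as np

        rng = np.random.default_rng(0)
        n_state, n_meas = 6, 10
        K = rng.normal(size=(n_meas, n_state))     # Jacobian dy/dx
        xa = np.full(n_state, 5.0)                 # a priori ozone [ppmv]
        Sa = np.eye(n_state)                       # prior covariance
        Se = 0.1 * np.eye(n_meas)                  # measurement-noise covariance

        x_true = xa + rng.normal(0.0, 1.0, n_state)
        y = K @ x_true + rng.normal(0.0, np.sqrt(0.1), n_meas)

        # Gain matrix G and retrieved state x_hat = xa + G (y - K xa):
        G = np.linalg.solve(np.linalg.inv(Sa) + K.T @ np.linalg.inv(Se) @ K,
                            K.T @ np.linalg.inv(Se))
        x_hat = xa + G @ (y - K @ xa)
        print("retrieval error [ppmv]:", np.round(x_hat - x_true, 2))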

  14. Project management for complex ground-based instruments: MEGARA plan

    Science.gov (United States)

    García-Vargas, María. Luisa; Pérez-Calpena, Ana; Gil de Paz, Armando; Gallego, Jesús; Carrasco, Esperanza; Cedazo, Raquel; Iglesias, Jorge

    2014-08-01

    The project management of complex instruments for ground-based large telescopes is a challenge in itself. Good management is key to project success in terms of performance, schedule and budget. Being on time has become a strict requirement for two reasons: to assure arrival at the telescope, given the pressure of demand for new instrumentation at these first world-class telescopes, and to avoid cost overruns. The budget and cash flow are not always as expected and have to be properly handled across the administrative departments of the worldwide-distributed funding centers. The complexity of the organizations, the technological and scientific return to the Consortium partners, and the participation in the project of all kinds of professional centers working in astronomical instrumentation (universities, research centers, small and large private companies, workshops and providers, etc.) make the project management strategy, and the tools and procedures tuned to the project needs, crucial for success. MEGARA (Multi-Espectrógrafo en GTC de Alta Resolución para Astronomía) is a facility instrument of the 10.4m GTC (La Palma, Spain) working at optical wavelengths that provides both Integral-Field Unit (IFU) and Multi-Object Spectrograph (MOS) capabilities at resolutions in the range R=6,000-20,000. The project is an initiative led by Universidad Complutense de Madrid (Spain) in collaboration with INAOE (Mexico), IAA-CSIC (Spain) and Universidad Politécnica de Madrid (Spain). MEGARA is being developed under contract with GRANTECAN.

  15. Fast 3D dosimetric verifications based on an electronic portal imaging device using a GPU calculation engine.

    Science.gov (United States)

    Zhu, Jinhan; Chen, Lixin; Chen, Along; Luo, Guangwen; Deng, Xiaowu; Liu, Xiaowei

    2015-04-11

    To use a graphics processing unit (GPU) calculation engine to implement a fast 3D pre-treatment dosimetric verification procedure based on an electronic portal imaging device (EPID). The GPU algorithm includes the deconvolution and convolution method for the fluence-map calculations, the collapsed-cone convolution/superposition (CCCS) algorithm for the 3D dose calculations and the 3D gamma evaluation calculations. The results of the GPU-based CCCS algorithm were compared to those of Monte Carlo simulations. The planned and EPID-based reconstructed dose distributions in overridden-to-water phantoms and the original patients were compared for 6 MV and 10 MV photon beams in intensity-modulated radiation therapy (IMRT) treatment plans based on dose differences and gamma analysis. The total single-field dose computation time was less than 8 s, and the gamma evaluation for a 0.1-cm grid resolution was completed in approximately 1 s. The results of the GPU-based CCCS algorithm exhibited good agreement with those of the Monte Carlo simulations. The gamma analysis indicated good agreement between the planned and reconstructed dose distributions for the treatment plans. For the target volume, the differences in the mean dose were less than 1.8%, and the differences in the maximum dose were less than 2.5%. For the critical organs, minor differences were observed between the reconstructed and planned doses. The GPU calculation engine was used to boost the speed of 3D dose and gamma evaluation calculations, thus offering the possibility of true real-time 3D dosimetric verification.

  16. A coherency function model of ground motion at base rock corresponding to strike-slip fault

    Institute of Scientific and Technical Information of China (English)

    丁海平; 刘启方; 金星; 袁一凡

    2004-01-01

    At present, the method used to study the spatial variation of ground motions is statistical analysis based on dense array records, such as those of the SMART-1 array. Owing to the lack of recorded ground motions, there is no coherency function model for base rock or for sites of different types. In this paper, the spatial variation of ground motions in elastic media is analyzed by a deterministic method. Taking an elastic half-space model with a dislocation source on the fault, near-field ground motions are simulated. The model takes the strike-slip fault and the earth media into account. A coherency function is proposed for base rock sites.
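
    A sketch of the empirical side of the problem: estimating coherency between two simultaneously recorded (here synthetic) ground motions. The two "records" share a common signal plus independent noise and a small lag, standing in for simulated near-field motions at two base-rock points.

        import numpy as np
        from scipy.signal import coherence

        rng = np.random.default_rng(0)
        fs, n = 100.0, 4096                            # sampling rate [Hz], length
        common = rng.standard_normal(n)                # wavefield shared by both sites
        x = common + 0.5 * rng.standard_normal(n)      # station 1 record
        y = np.roll(common, 3) + 0.5 * rng.standard_normal(n)  # station 2, lagged

        f, coh = coherence(x, y, fs=fs, nperseg=512)   # magnitude-squared coherence
        print("coherence near 1 Hz: %.2f" % coh[np.argmin(np.abs(f - 1.0))])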

  17. Diffusion-weighted MRI for verification of electroporation-based treatments

    DEFF Research Database (Denmark)

    Mahmood, Faisal; Hansen, Rasmus H; Agerholm-Larsen, Birgit

    2011-01-01

    such a tissue reaction represents a great clinical benefit since, in case of a target miss, retreatment can be performed immediately. We propose diffusion-weighted magnetic resonance imaging (DW-MRI) as a method to monitor EP-treated tissue, using the concept of the apparent diffusion coefficient (ADC). We hypothesize...... that the plasma membrane permeabilization induced by EP changes the ADC, suggesting that DW-MRI constitutes a noninvasive and quick means of EP verification. In this study we performed in vivo EP in rat brains, followed by DW-MRI using a clinical MRI scanner. We found a pulse-amplitude-dependent increase...... in the ADC following EP, indicating that (1) DW-MRI is sensitive to the EP-induced changes and (2) the observed changes in ADC are indeed due to the applied electric field....
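
    A minimal sketch of the ADC arithmetic underlying DW-MRI: with a voxel signal S0 at b = 0 and Sb at diffusion weighting b, the mono-exponential model Sb = S0 exp(-b ADC) gives ADC = ln(S0/Sb)/b. The voxel values below are synthetic.

        import numpy as np

        b = 1000.0                                  # diffusion weighting [s/mm^2]
        S0 = np.array([1200.0, 1100.0, 1150.0])     # b = 0 voxel signals
        Sb = np.array([540.0, 420.0, 610.0])        # b = 1000 voxel signals

        adc = np.log(S0 / Sb) / b                   # ADC map [mm^2/s]
        print("ADC [mm^2/s]:", adc)
        # Per the study's hypothesis, electroporation-induced permeabilization
        # would show up as a regional change in these ADC values.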

  18. Verification of Geometric Model-Based Plant Phenotyping Methods for Studies of Xerophytic Plants

    Directory of Open Access Journals (Sweden)

    Paweł Drapikowski

    2016-06-01

    Full Text Available This paper presents the results of verification of certain non-contact measurement methods of plant scanning to estimate morphological parameters such as length, width, area, and volume of leaves and/or stems on the basis of computer models. The best results in reproducing the shape of scanned objects up to 50 cm in height were obtained with the structured-light DAVID Laserscanner. The optimal triangle mesh resolution for scanned surfaces was determined with the measurement error taken into account. The research suggests that measuring morphological parameters from computer models can supplement or even replace phenotyping with classic methods. Calculating precise values of area and volume makes determination of the S/V (surface/volume) ratio for cacti and other succulents possible, whereas for classic methods the result is an approximation only. In addition, the possibility of scanning and measuring plant species which differ in morphology was investigated.
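
    A sketch of how area, volume, and hence the S/V ratio can be computed from a triangle mesh like those produced by the scanner; the "mesh" below is a unit cube with outward-oriented faces, so the expected output is area 6, volume 1, S/V 6.

        import numpy as np

        def mesh_area_volume(vertices, faces):
            v = vertices[faces]                          # (n_faces, 3, 3)
            a, b, c = v[:, 0], v[:, 1], v[:, 2]
            area = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()
            # Signed-tetrahedron (divergence theorem) volume; requires a closed,
            # consistently oriented mesh.
            volume = abs(np.einsum('ij,ij->i', a, np.cross(b, c)).sum()) / 6.0
            return area, volume

        V = np.array([[0,0,0],[1,0,0],[1,1,0],[0,1,0],
                      [0,0,1],[1,0,1],[1,1,1],[0,1,1]], dtype=float)
        F = np.array([[0,2,1],[0,3,2],[4,5,6],[4,6,7],[0,1,5],[0,5,4],
                      [1,2,6],[1,6,5],[2,3,7],[2,7,6],[3,0,4],[3,4,7]])
        area, vol = mesh_area_volume(V, F)
        print("area=%.1f  volume=%.1f  S/V=%.1f" % (area, vol, area / vol))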

  19. Scaling earthquake ground motions for performance-based assessment of buildings

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.; Hamburger, R.O.

    2011-01-01

    The impact of alternate ground-motion scaling procedures on the distribution of displacement responses in simplified structural systems is investigated. Recommendations are provided for selecting and scaling ground motions for performance-based assessment of buildings. Four scaling methods are studied, namely, (1) geometric-mean scaling of pairs of ground motions, (2) spectrum matching of ground motions, (3) first-mode-period scaling to a target spectral acceleration, and (4) scaling of ground motions per the distribution of spectral demands. Data were developed by nonlinear response-history analysis of a large family of nonlinear single-degree-of-freedom (SDOF) oscillators that could represent fixed-base and base-isolated structures. The advantages and disadvantages of each scaling method are discussed. The relationship between spectral shape and a ground-motion randomness parameter is presented. A scaling procedure that explicitly considers spectral shape is proposed. © 2011 American Society of Civil Engineers.
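
    A sketch of method (1) above, geometric-mean scaling of a ground-motion pair, simplified to a single period: one factor scales both horizontal components so that the geometric mean of their spectral accelerations matches the target. The Sa values are stand-in numbers.

        import numpy as np

        def geomean_scale_factor(sa_1, sa_2, sa_target):
            """Single scale factor applied to both components of the pair."""
            return sa_target / np.sqrt(sa_1 * sa_2)

        sa_1, sa_2 = 0.42, 0.31   # Sa(T1) of the two as-recorded components [g]
        target = 0.50             # target spectral acceleration at T1 [g]
        s = geomean_scale_factor(sa_1, sa_2, target)
        print("scale factor: %.2f" % s)   # both acceleration histories are multiplied by s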

  20. Ground-based gamma-ray telescopes as ground stations in deep-space lasercom

    CERN Document Server

    Carrasco-Casado, Alberto; Vergaz, Ricardo

    2016-01-01

    As the amount of information to be transmitted from deep space rapidly increases, radiofrequency technology has become a bottleneck in space communications. RF is already limiting the scientific outcome of deep-space missions and could be a significant obstacle in the development of manned missions. Lasercom holds the promise to solve this problem, as it will considerably increase the data rate while decreasing the energy, mass and volume of onboard communication systems. In RF deep-space communications, where the received power is the main limitation, the traditional approach to boosting the data throughput has been increasing the receiver's aperture, e.g. the 70-m antennas in NASA's Deep Space Network. Optical communications can also benefit from this strategy, thus 10-m class telescopes have typically been suggested to support future deep-space links. However, the cost of big telescopes increases exponentially with their aperture, and new ideas are needed to optimize this ratio. Here, the use of ground-...

  1. Biosensors for EVA: Improved Instrumentation for Ground-based Studies

    Science.gov (United States)

    Soller, B.; Ellerby, G.; Zou, F.; Scott, P.; Jin, C.; Lee, S. M. C.; Coates, J.

    2010-01-01

    During lunar excursions in the EVA suit, real-time measurement of metabolic rate is required to manage consumables and guide activities to ensure safe return to the base. Metabolic rate, or oxygen consumption (VO2), is normally measured from pulmonary parameters but cannot be determined with standard techniques in the oxygen-rich environment of a spacesuit. Our group has developed novel near infrared spectroscopic (NIRS) methods to calculate muscle oxygen saturation (SmO2), hematocrit, and pH, and we recently demonstrated that we can use our NIRS sensor to measure VO2 on the leg during cycling. Our NSBRI project has 4 objectives: (1) increase the accuracy of the metabolic rate calculation through improved prediction of stroke volume; (2) investigate the relative contributions of calf and thigh oxygen consumption to the metabolic rate calculation for walking and running; (3) demonstrate that the NIRS-based noninvasive metabolic rate methodology is sensitive enough to detect decrements in VO2 in a space analog; and (4) improve instrumentation to allow testing within a spacesuit. Over the past year we have made progress on all four objectives, but the most significant progress was made in improving the instrumentation. The NIRS system currently in use at JSC is based on fiber optics technology. Optical fiber bundles are used to deliver light from a light source in the monitor to the patient, and light reflected back from the patient's muscle to the monitor for spectroscopic analysis. The fiber optic cables are large and fragile, and there is no way to get them in and out of the test spacesuit used for ground-based studies. With complementary funding from the US Army, we undertook a complete redesign of the sensor and control electronics to build a novel system small enough to be used within the spacesuit and portable enough to be used by a combat medic. In the new system the filament lamp used in the fiber optic system was replaced with a novel broadband near infrared

  3. Ground-Based Observing Campaign of Briz-M Debris

    Science.gov (United States)

    Lederer, S. M.; Buckalew, B.; Frith, J.; Cowardin, H. M.; Hickson, P.; Matney, M.; Anz-Meador, P.

    2017-01-01

    In 2015, NASA's Orbital Debris Program Office (ODPO) completed the installation of the Meter Class Autonomous Telescope (MCAT) on Ascension Island. MCAT is a 1.3m optical telescope designed with a fast tracking capability for observing orbital debris at all orbital regimes (low-Earth orbit (LEO) to geosynchronous (GEO) orbit) from a low-latitude site. This new asset is dedicated year-round to debris observations, and its location fills a geographical gap in the Ground-based Electro-Optical Deep Space Surveillance (GEODSS) network. A commercial off the shelf (COTS) research-grade 0.4m telescope (named the Benbrook telescope) will also be installed on Ascension at the end of 2016. This smaller version is controlled by the same master software, designed by Euclid Research, and can be tasked to work independently or in concert with MCAT. Like MCAT, it has the same suite of filters, a similar field of view, and a fast-tracking Astelco mount, and is also capable of tracking debris at all orbital regimes. These assets are well suited for targeted campaigns or surveys of debris. Since 2013, NASA's ODPO has also had extensive access to the 3.8m infrared UKIRT telescope, located on Mauna Kea. At nearly 14,000 ft, this site affords excellent conditions for collecting both photometry and spectroscopy in the near-IR (0.9-2.5 micrometers; SWIR) and thermal-IR (8-25 micrometers; LWIR) regimes, ideal for investigating material properties as well as thermal characteristics and sizes of debris. For the purposes of understanding orbital debris, data taken both in survey mode and by targeting individual objects for more in-depth characterization are desired. With the recent break-ups of Briz-M rocket bodies, we have collected a suite of data in the optical, near-infrared, and mid-infrared of intact objects as well as those classified as debris. A break-up at GEO of a Briz-M rocket occurred in January 2016, well timed for the first remote observing survey-campaign with MCAT. Access to

  4. Validation of a deformable image registration technique for cone beam CT-based dose verification

    Energy Technology Data Exchange (ETDEWEB)

    Moteabbed, M., E-mail: mmoteabbed@partners.org; Sharp, G. C.; Wang, Y.; Trofimov, A.; Efstathiou, J. A.; Lu, H.-M. [Massachusetts General Hospital, Boston, Massachusetts 02114 and Harvard Medical School, Boston, Massachusetts 02115 (United States)

    2015-01-15

    Purpose: As radiation therapy evolves toward more adaptive techniques, image guidance plays an increasingly important role, not only in patient setup but also in monitoring the delivered dose and adapting the treatment to patient changes. This study aimed to validate a method for evaluation of delivered intensity modulated radiotherapy (IMRT) dose based on multimodal deformable image registration (DIR) for prostate treatments. Methods: A pelvic phantom was scanned with CT and cone-beam computed tomography (CBCT). Both images were digitally deformed using two realistic patient-based deformation fields. The original CT was then registered to the deformed CBCT, resulting in a secondary deformed CT. The registration quality was assessed as the ability of the DIR method to recover the artificially induced deformations. The primary and secondary deformed CT images as well as the vector fields were compared to evaluate the efficacy of the registration method and its suitability for dose calculation. PLASTIMATCH, a free and open-source software package, was used for deformable image registration. A B-spline algorithm with optimized parameters was used to achieve the best registration quality. Geometric image evaluation was performed through voxel-based Hounsfield unit (HU) and vector field comparison. For dosimetric evaluation, IMRT treatment plans were created and optimized on the original CT image and recomputed on the two warped images to be compared. The dose volume histograms were compared for the warped structures that were identical in both warped images. This procedure was repeated for the phantom with full, half-full, and empty bladder. Results: The results indicated mean HU differences of up to 120 between registered and ground-truth deformed CT images. However, when the CBCT intensities were calibrated using a region of interest (ROI)-based calibration curve, these differences were reduced by up to 60%. Similarly, the mean differences in average vector field
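
    A sketch of the two evaluation metrics described above: the voxel-wise HU difference between the ground-truth deformed CT and the DIR-produced CT, and the mean magnitude of the difference between the induced and recovered deformation vector fields. The arrays are random stand-ins for real image volumes.

        import numpy as np

        rng = np.random.default_rng(1)
        ct_truth = rng.normal(0.0, 300.0, size=(64, 64, 32))          # deformed CT [HU]
        ct_dir = ct_truth + rng.normal(0.0, 40.0, size=ct_truth.shape)

        vf_truth = rng.normal(0.0, 5.0, size=(64, 64, 32, 3))         # induced DVF [mm]
        vf_dir = vf_truth + rng.normal(0.0, 1.0, size=vf_truth.shape) # recovered DVF

        print("mean |HU| difference: %.1f HU"
              % np.abs(ct_truth - ct_dir).mean())
        print("mean vector-field error: %.2f mm"
              % np.linalg.norm(vf_truth - vf_dir, axis=-1).mean())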

  5. Seismic Responses of Asymmetric Base-Isolated Structures under Near-Fault Ground Motion

    Institute of Scientific and Technical Information of China (English)

    YE Kun; LI Li; FANG Qin-han

    2008-01-01

    An inter-story shear model of asymmetric base-isolated structures incorporating the deformation of each isolation bearing was built, and a method to simultaneously simulate bi-directional near-fault and far-field ground motions was proposed. A comparative study of the dynamic responses of asymmetric base-isolated structures under near-fault and far-field ground motions was conducted to investigate the effects of eccentricity in the isolation system and in the superstructure, the ratio of the uncoupled torsional to lateral frequency of the superstructure, and the pulse period of near-fault ground motions on the nonlinear seismic response of asymmetric base-isolated structures. Numerical results show that eccentricity in the isolation system makes asymmetric base-isolated structures more sensitive to near-fault ground motions, and that the pulse period of near-fault ground motions plays an important role in governing their seismic responses.

  6. Clinical evaluation of 3D/3D MRI-CBCT automatching on brain tumors for online patient setup verification - A step towards MRI-based treatment planning

    DEFF Research Database (Denmark)

    Buhl, Sune K.; Duun-Christensen, Anne Katrine; Kristensen, Brian H.

    2010-01-01

    Background. Magnetic Resonance Imaging (MRI) is often used in modern-day radiotherapy (RT) due to superior soft tissue contrast. However, treatment planning based solely on MRI is restricted due to, e.g., the limitations of conducting online patient setup verification using MRI as reference....... In this study 3D/3D MRI-Cone Beam CT (CBCT) automatching for online patient setup verification was investigated. Material and methods. Initially, a multi-modality phantom was constructed and used for a quantitative comparison of CT-CBCT and MRI-CBCT automatching. Following the phantom experiment, three patients

  7. A Method for Cyber-Physical System Behavior Modeling and Safety Verification Based on Extended Hybrid System Description Language

    Directory of Open Access Journals (Sweden)

    Tuo Ming Fu

    2016-01-01

    Full Text Available The safety of a cyber-physical system (CPS) depends on its behavior, and it is a key property for CPS to be applied in critical application fields. A method for CPS behavior modeling and safety verification is put forward in this paper. The behavior model of a CPS is described in an extended hybrid system description language (EHYSDEL). The formal definition of a hybrid program (HP) is given, and the behavior model is transformed to an HP based on this definition. The safety of the CPS is verified by inputting the HP to KeYmaera. The advantage of the approach is that it models CPS intuitively and verifies their safety rigorously, avoiding state-space explosion.

  8. SMT-based Verification of LTL Specifications with Integer Constraints and its Application to Runtime Checking of Service Substitutability

    CERN Document Server

    Bersani, Marcello M; Frigeri, Achille; Pradella, Matteo; Rossi, Matteo

    2010-01-01

    An important problem that arises during the execution of service-based applications concerns the ability to determine whether a running service can be substituted with one with a different interface, for example if the former is no longer available. Standard Bounded Model Checking techniques can be used to perform this check, but they must be able to provide answers very quickly, lest the check hamper the operation of the application instead of aiding it. The problem becomes even more complex when conversational services are considered, i.e., services that expose operations that have input/output data dependencies among them. In this paper we introduce a formal verification technique for an extension of Linear Temporal Logic that allows users to include in formulae constraints on integer variables. This technique, applied to the substitutability problem for conversational services, is shown to be considerably faster and to have a smaller memory footprint than existing ones.
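
    A toy bounded-model-checking sketch in the spirit of the approach above, using the Z3 SMT solver as a stand-in prover (the paper's own toolchain is not reproduced here): a counter system is unrolled for k steps and the solver is asked whether the integer constraint x >= 8 is reachable.

        from z3 import Int, Solver, Or, sat

        k = 5
        xs = [Int("x_%d" % i) for i in range(k + 1)]
        s = Solver()
        s.add(xs[0] == 0)                           # initial state
        for i in range(k):
            s.add(Or(xs[i + 1] == xs[i] + 1,        # transition relation:
                     xs[i + 1] == xs[i] + 3))       # increment by 1 or by 3
        s.add(Or(*[x >= 8 for x in xs]))            # target (bad) states
        print("reachable within %d steps:" % k, s.check() == sat)  # True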

  9. NO2 DOAS measurements from ground and space: comparison of ground based measurements and OMI data in Mexico City

    Science.gov (United States)

    Rivera, C.; Stremme, W.; Grutter, M.

    2012-04-01

    The combination of satellite data and ground-based measurements can provide valuable information about atmospheric chemistry and air quality. In this work we present a comparison between ground-based NO2 differential columns measured at the Universidad Nacional Autónoma de México (UNAM) in Mexico City using the Differential Optical Absorption Spectroscopy (DOAS) technique and NO2 total columns measured by the Ozone Monitoring Instrument (OMI) onboard the Aura satellite using the same measurement technique. From these data, distribution maps of average NO2 above the Mexico basin were constructed and hot spots inside the city could be identified. In addition, a clear footprint was detected from the Tula industrial area, ~50 km northwest of Mexico City, where a refinery, a power plant and other industries are located. A less defined footprint was identified in the Cuernavaca basin, south of Mexico City, and the nearby cities of Toluca and Puebla do not present strong enhancements in the NO2 total columns. With this study we expect to cross-validate space and ground measurements and provide useful information for future studies.

  10. Integration of Remote Sensing Products with Ground-Based Measurements to Understand the Dynamics of Nepal's Forests and Plantation Sites

    Science.gov (United States)

    Gilani, H.; Jain, A. K.

    2016-12-01

    This study assembles information from three sources - remote sensing, terrestrial photography and ground-based inventory data - to understand the dynamics of Nepal's tropical and sub-tropical forests and plantation sites for the period 1990-2015. Our study focuses on the following three district areas, which have conserved forests through social and agroforestry management practices: 1. Dolakha district: This site was selected to study the impact of community-based forest management on land cover change over the period 1990-2010, using repeat photography and satellite imagery in combination with interviews with community members. We determined that satellite data combined with ground photographs can provide transparency for long-term monitoring. The initial results also suggest that the community-based forest management program in the mid-hills of Nepal was successful. 2. Chitwan district: Here we use high-resolution remote sensing data and optimized community field inventories to evaluate the potential application and operational feasibility of community-level REDD+ measuring, reporting and verification (MRV) systems. The study uses the temporal dynamics of land cover transitions, tree canopy size classes and biomass over a Kayar khola watershed REDD+ study area with community forest to evaluate satellite image segmentation for land cover, a linear regression model for above-ground biomass (AGB), and field data for estimating and monitoring tree crowns and AGB. We study three specific years (2002, 2009, 2012), integrating WorldView-2 and airborne LiDAR data at the tree-species level. 3. Nuwakot district: This district was selected to study the impact of establishing tree plantations on barren/fallow land. Over the last 40 years, this area has gone through drastic changes, from barren land to forest area with tree species consisting of Dalbergia sissoo, Leucaena leucocephala, Michelia champaca, etc. In 1994, this district area was registered

  11. Ground-based Space Weather Monitoring with LOFAR

    Science.gov (United States)

    Wise, Michael; van Haarlem, Michiel; Lawrence, Gareth; Reid, Simon; Bos, Andre; Rawlings, Steve; Salvini, Stef; Mitchell, Cathryn; Soleimani, Manuch; Amado, Sergio; Teresa, Vital

    As one of the first of a new generation of radio instruments, the International LOFAR Telescope (ILT) will provide a number of unique and novel capabilities for the astronomical community. These include remote configuration and operation, dynamic real-time processing and system response, and the ability to provide multiple simultaneous streams of data to a community whose scientific interests run the gamut from lightning in the atmospheres of distant planets to the origins of the universe itself. The LOFAR (LOw Frequency ARray) system is optimized for a frequency range from 30-240 MHz and consists of multiple antenna fields spread across Europe. In the Netherlands, a total of 36 LOFAR stations are nearing completion, with an initial 8 international stations currently being deployed in Germany, France, Sweden, and the UK. Digital beam-forming techniques make the LOFAR system agile and allow for rapid repointing of the telescope as well as the potential for multiple simultaneous observations. With its dense core array and long interferometric baselines, LOFAR has the potential to achieve unparalleled sensitivity and spatial resolution in the low frequency radio regime. LOFAR will also be one of the first radio observatories to feature automated processing pipelines to deliver fully calibrated science products to its user community. As we discuss in this presentation, the same capabilities that make LOFAR a powerful tool for radio astronomy also provide an excellent platform upon which to build a ground-based monitoring system for space weather events. For example, the ability to monitor Solar activity in near real-time is one of the key scientific capabilities being developed for LOFAR. With only a fraction of its total observing capacity, LOFAR will be able to provide continuous monitoring of the Solar spectrum over the entire 10-240 MHz band down to microsecond timescales. Autonomous routines will scan these incoming spectral data for evidence of Solar flares and be

  12. Environmental Technology Verification: Test Report of Mobile Source Selective Catalytic Reduction--Nett Technologies, Inc., BlueMAX 100 version A urea-based selective catalytic reduction technology

    Science.gov (United States)

    Nett Technologies’ BlueMAX 100 version A Urea-Based SCR System utilizes a zeolite catalyst coating on a cordierite honeycomb substrate for heavy-duty diesel nonroad engines for use with commercial ultra-low–sulfur diesel fuel. This environmental technology verification (ETV) repo...

  13. Analyses of Cryogenic Propellant Tank Pressurization based upon Ground Experiments

    OpenAIRE

    Ludwig, Carina; Dreyer, Michael

    2012-01-01

    The pressurization system of cryogenic propellant rockets requires on-board pressurant gas. The objective of this study was to analyze the influence of the pressurant gas temperature on the required pressurant gas mass in terms of lowering the launcher mass. First, ground experiments were performed in order to investigate the pressurization process with regard to the influence of the pressurant gas inlet temperature. Second, a system study for the cryogenic upper stage of a sma...

  14. Ground-based Remote Sensing of Cloud Liquid Water Path

    Science.gov (United States)

    Crewell, S.; Loehnert, U.

    Within the BALTEX Cloud LIquid WAter NETwork (CLIWA-NET), measurements of cloud parameters were performed to improve and evaluate cloud parameterizations in numerical weather prediction and climate models. The key variable is the cloud liquid water path (LWP), which is measured by passive microwave radiometry from the ground during three two-month CLIWA-NET observational periods. In addition to the high temporal resolution time series from the ground, LWP fields are derived from satellite measurements. During the first two campaigns a continental-scale network consisting of 12 stations was established. Most stations included further cloud-sensitive instruments such as infrared radiometers and lidar ceilometers. The third campaign started with a two-week microwave intercomparison campaign (MICAM) in Cabauw, The Netherlands, and proceeded with a regional network within a 100 by 100 km area. The presentation will focus on the accuracy of LWP derived from the ground, investigating the accuracy of the microwave brightness temperature measurement and examining the LWP retrieval uncertainty. Up to now, microwave radiometers are not standard instruments, and the seven radiometers involved in MICAM differ in frequencies, bandwidths, angular resolution, integration time, etc. The influence of these instrument specifications on the LWP retrieval will be discussed.

  15. An Early Warning System from debris flows based on ground vibration monitoring data

    Science.gov (United States)

    Arattano, Massimo; Coviello, Velio

    2015-04-01

    -2014. The algorithm is based on the real-time processing of ground vibration data detected by three vertical geophones. During the testing period, two debris flow events occurred; both were correctly detected by the algorithm, with a relatively limited number of false alarms.
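
    The record does not spell out the detection algorithm, so the sketch below uses a common choice for ground-vibration event detection, the short-term-average / long-term-average (STA/LTA) ratio with a trigger threshold; in practice the trigger could be required on all three geophones to limit false alarms.

        import numpy as np

        def sta_lta(signal, fs, sta_win=1.0, lta_win=30.0):
            """STA/LTA ratio of signal energy (window lengths in seconds)."""
            env = np.asarray(signal, dtype=float) ** 2
            sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
            sta = np.convolve(env, np.ones(sta_n) / sta_n, mode="same")
            lta = np.convolve(env, np.ones(lta_n) / lta_n, mode="same")
            return sta / np.maximum(lta, 1e-12)

        fs = 100.0
        rng = np.random.default_rng(2)
        trace = rng.normal(0.0, 1.0, int(120 * fs))           # 2 min of noise
        trace[6000:6500] += 8.0 * rng.normal(0.0, 1.0, 500)   # synthetic event
        print("triggered:", bool((sta_lta(trace, fs) > 4.0).any()))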

  16. Design Considerations and Experimental Verification of a Rail Brake Armature Based on Linear Induction Motor Technology

    Science.gov (United States)

    Sakamoto, Yasuaki; Kashiwagi, Takayuki; Hasegawa, Hitoshi; Sasakawa, Takashi; Fujii, Nobuo

    This paper describes the design considerations and experimental verification of an LIM rail brake armature. In order to generate power and maximize the braking force density despite the limited area between the armature and the rail and the limited space available for installation, we studied a design method suitable for an LIM rail brake armature and considered adoption of a ring winding structure. To examine the validity of the proposed design method, we developed a prototype ring winding armature for the rail brakes and examined its electromagnetic characteristics in a dynamic test system with roller rigs. By repeating various tests, we confirmed that unnecessary magnetic field components, which were expected to be present under high-speed running conditions or when a ring winding armature was used, were not present. Further, the necessary magnetic field component and braking force attained the desired values. These studies have helped us to develop a basic design method suitable for LIM rail brake armatures.

  17. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground Based Computation and Control Systems and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems and on human health and safety, as well as the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in earth-surface, atmospheric flight, and space flight environments. Three twentieth-century technological developments, 1) high-altitude commercial and military aircraft, 2) manned and unmanned spacecraft, and 3) increasingly complex and sensitive solid-state microelectronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools (e.g. ground-based test methods as well as high energy particle transport and reaction codes) needed to design, test, and verify the safety and reliability of modern complex electronic systems, as well as to evaluate effects on human health and safety. The effects of primary cosmic ray particles, and of secondary particle showers produced by nuclear reactions with spacecraft materials, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground-based computational and control systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Accumulation of both primary cosmic ray and secondary cosmic-ray-induced particle shower radiation dose is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO).

  18. Spectral invariance hypothesis study of polarized reflectance with Ground-based Multiangle SpectroPolarimetric Imager (GroundMSPI)

    Science.gov (United States)

    Bradley, Christine L.; Kupinski, Meredith; Diner, David J.; Xu, Feng; Chipman, Russell A.

    2015-09-01

    Many models used to represent the boundary condition for the separation of atmospheric scattering from the surface reflectance in polarized remote sensing measurements assume that the polarized surface reflectance is spectrally neutral. The Spectral Invariance Hypothesis asserts that the magnitude and shape of the polarized bidirectional reflectance factor (pBRF) is equal for all wavelengths. In order to test this hypothesis, JPL's Ground-based Multiangle SpectroPolarimetric Imager (GroundMSPI) is used to measure polarization information of different outdoor surface types. GroundMSPI measures the linear polarization Stokes parameters (I, Q, U), at three wavelengths, 470 nm, 660 nm, and 865 nm. The camera is mounted on a two-axis gimbal to accurately select the view azimuth and elevation directions. On clear sky days we acquired day-long scans of scenes that contain various surface types such as grass, dirt, cement, brick, and asphalt and placed a Spectralon panel in the camera field of view to provide a reflectance reference. Over the course of each day, changing solar position in the sky provides a large range of scattering angles for this study. The polarized bidirectional reflectance factor (pBRF) is measured for the three wavelengths and the best fit slope of the spectral correlation is reported. This work reports the range of best fit slopes measured for five region types.

  19. TPSPET—A TPS-based approach for in vivo dose verification with PET in proton therapy

    Science.gov (United States)

    Frey, K.; Bauer, J.; Unholtz, D.; Kurz, C.; Krämer, M.; Bortfeld, T.; Parodi, K.

    2014-01-01

    Since the interest in ion irradiation for tumour therapy has significantly increased over the last few decades, intensive investigations are being performed to improve the accuracy of this form of patient treatment. One major goal is the development of methods for in vivo dose verification. In proton therapy, a PET (positron emission tomography)-based approach measuring the irradiation-induced tissue activation inside the patient has already been clinically implemented. The acquired PET images can be compared to an expectation, derived under the assumption of a correct treatment application, to validate the particle range and the lateral field position in vivo. In the context of this work, TPSPET is introduced as a new approach to predict proton-irradiation-induced three-dimensional positron emitter distributions by means of the same algorithms as the clinical treatment planning system (TPS). In order to perform the additional activity calculations, reaction-channel-dependent input positron emitter depth distributions are necessary; these are determined by applying a modified filtering approach to the TPS reference depth dose profiles in water. This paper presents the implementation of TPSPET on the basis of the research treatment planning software "treatment planning for particles". The results are validated in phantom and patient studies against Monte Carlo simulations, and compared to β+-emitter distributions obtained from a slightly modified version of the originally proposed one-dimensional filtering approach applied to three-dimensional dose distributions. In contrast to previously introduced methods, TPSPET provides a faster implementation, the results show no sensitivity to lateral field extension, and the predicted β+-emitter densities are fully consistent with the planned treatment dose, as they are calculated by the same pencil beam algorithms. These findings suggest a large potential of the application of TPSPET for in vivo dose verification in the daily
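
    To make the filtering idea concrete, here is a hedged numpy sketch in which a schematic depth dose profile is convolved with a single reaction-channel filter kernel to yield a positron-emitter depth distribution. The profile shape, the Gaussian kernel, and all parameters are illustrative assumptions, not the paper's derived filters.

      import numpy as np

      # Depth grid (mm) and a schematic proton depth dose curve with a Bragg
      # peak (illustrative shape only, not a TPS reference profile).
      z = np.arange(0.0, 200.0, 1.0)
      dose = np.exp(-((z - 150.0) ** 2) / (2 * 15.0 ** 2)) + 0.3 * (z < 150.0)

      # Stand-in filter kernel for one reaction channel (the paper derives
      # channel-dependent filters; a normalized Gaussian is assumed here).
      kz = np.arange(-30.0, 31.0, 1.0)
      kernel = np.exp(-kz ** 2 / (2 * 8.0 ** 2))
      kernel /= kernel.sum()

      # Predicted positron-emitter depth distribution as the filtered dose.
      emitters = np.convolve(dose, kernel, mode="same")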

  20. Response of base isolation system excited by spectrum compatible ground motions

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Han; Kim, Min Kyu; Choi, In Kil [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    Structures in a nuclear power system are designed to remain elastic even under earthquake excitation. However, a structural component such as an isolator inherently shows inelastic behavior. For the seismic assessment of nonlinear structures, a response history analysis should be performed. Especially for performance-based design, where the failure probability of a system needs to be evaluated, the variation of the response must be quantified. In this study, two kinds of spectrum-compatible ground motions, artificial ground motions and modified ground motions, were generated. Using these ground motions, the variations of the seismic responses of a simplified isolation system were evaluated.
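
    Spectrum compatibility is judged against a target response spectrum, which is obtained by integrating a damped single-degree-of-freedom oscillator over the accelerogram. Below is a hedged Python sketch of that response-spectrum computation (central-difference time stepping, 5% damping); the white-noise record and all parameters are illustrative stand-ins for the study's ground motions.

      import numpy as np

      def response_spectrum(acc, dt, periods, zeta=0.05):
          """Pseudo-acceleration response spectrum of a ground acceleration
          record via central-difference integration of an SDOF oscillator."""
          spectrum = []
          for T in periods:
              wn = 2 * np.pi / T
              u_prev, u, peak = 0.0, 0.0, 0.0
              for ag in acc:
                  # u'' + 2*zeta*wn*u' + wn^2*u = -ag, central differences
                  u_next = (2 * u - u_prev
                            + dt ** 2 * (-ag - 2 * zeta * wn * (u - u_prev) / dt
                                         - wn ** 2 * u))
                  peak = max(peak, abs(u_next))
                  u_prev, u = u, u_next
              spectrum.append(wn ** 2 * peak)  # Sa = wn^2 * Sd
          return np.array(spectrum)

      # Synthetic accelerogram standing in for an artificial ground motion.
      acc = np.random.default_rng(0).normal(0.0, 1.0, 2000)  # m/s^2, assumed
      sa = response_spectrum(acc, dt=0.01, periods=np.linspace(0.1, 3.0, 30))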

  1. (Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio)

    Energy Technology Data Exchange (ETDEWEB)

    1992-03-01

    This report presents information related to the sampling of ground water at the Wright-Patterson Air Force Base. It is part of an investigation into possible ground water contamination. Information concerns well drilling/construction; x-ray diffraction and sampling; soil boring logs; and chain-of-custody records.

  2. A knowledge base system for ground control over abandoned mines

    Energy Technology Data Exchange (ETDEWEB)

    Nazimko, V.V.; Zviagilsky, E.L. [Donetsk State Technical University, Donetsk (Ukraine)

    1999-07-01

    A knowledge-based engineering system has been developed to choose the optimal technology for subsidence prevention over abandoned mines. The expert system treats a specific case, maps the consequences of actions, and derives the relevant technology (or a set of technologies) that should be used to prevent ground subsidence. Input parameters that characterise the case are treated using fuzzy logic and are then fed to a neural network. The network has been successfully trained by a backpropagation algorithm on the basis of three fuzzy rules. 5 refs., 2 figs., 3 tabs.
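
    A hedged sketch of the fuzzy-input/neural-network pattern the record describes: triangular membership functions fuzzify the case parameters, and the resulting membership degrees feed a small feed-forward network that scores candidate technologies. The parameter names, membership breakpoints, and (untrained) weights are hypothetical; in the paper the weights come from backpropagation training.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular fuzzy membership with support [a, c] and peak at b."""
          return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

      # Fuzzify two hypothetical case parameters: mining depth and roof span.
      depth, span = 120.0, 6.0
      inputs = np.array([
          tri(depth, 0, 50, 150),    # "shallow"
          tri(depth, 50, 150, 400),  # "moderate"
          tri(span, 0, 4, 8),        # "narrow"
          tri(span, 4, 8, 15),       # "wide"
      ])

      # One hidden layer with illustrative random weights (stand-ins for the
      # backpropagation-trained weights of the expert system).
      rng = np.random.default_rng(1)
      W1, W2 = rng.normal(size=(3, 4)), rng.normal(size=(2, 3))
      sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
      scores = sigmoid(W2 @ sigmoid(W1 @ inputs))  # candidate technology scores
      print("technology suitability scores:", scores)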

  3. Ground-based measurement of surface temperature and thermal emissivity

    Science.gov (United States)

    Owe, M.; Van De Griend, A. A.

    1994-01-01

    Motorized cable systems for transporting infrared thermometers have been used successfully during several international field campaigns. Systems may be configured with as many as four thermal sensors up to 9 m above the surface, and traverse a 30 m transect. Ground and canopy temperatures are important for solving the surface energy balance. The spatial variability of surface temperature is often great, so that averaged point measurements result in highly inaccurate areal estimates. The cable systems are ideal for quantifying both temporal and spatial variabilities. Thermal emissivity is also necessary for deriving the absolute physical temperature, and measurements may be made with a portable measuring box.
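
    The record notes that thermal emissivity is needed to turn a radiometric reading into an absolute physical temperature. A minimal broadband sketch of that correction, assuming the measured radiance is emitted surface radiance plus reflected sky radiance (all numbers illustrative):

      SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

      def surface_temperature(t_brightness, emissivity, t_sky):
          """Correct a broadband brightness temperature for emissivity and
          reflected sky radiance: L = eps*sigma*T^4 + (1 - eps)*L_sky."""
          l_meas = SIGMA * t_brightness ** 4
          l_sky = SIGMA * t_sky ** 4
          return ((l_meas - (1.0 - emissivity) * l_sky)
                  / (emissivity * SIGMA)) ** 0.25

      # Illustrative values: IR thermometer reads 302.0 K over grass
      # (emissivity ~0.98), effective sky brightness temperature 260 K.
      print(surface_temperature(302.0, 0.98, 260.0))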

  4. Commercial off the Shelf Ground Control Supports Calibration and Conflation from Ground to Space Based Sensors

    Science.gov (United States)

    Danielová, M.; Hummel, P.

    2016-06-01

    The need for rapid deployment of aerial and satellite imagery in support of GIS and engineering integration projects requires new sources of geodetic control to ensure the accuracy of geospatial projects. In the past, teams of surveyors would need to deploy to project areas to provide targeted or photo-identifiable points used to supply data for orthorectification, QA/QC, and calibration of multi-platform sensors. The challenge of integrating street-view, UAS, airborne, and space-based sensors to produce a common operational picture requires control to tie the multiple sources together. Today, commercial off-the-shelf delivery of existing photo-identifiable control is increasing the speed of deployment of this data without having to revisit sites over and over again. The presentation will discuss the processes developed by CompassData to build a global library of 40,000 control points available today. International Organization for Standardization (ISO)-based processes and initiatives ensure consistent quality of the survey data, the photo-identifiable features selected, and the metadata, supporting photogrammetrists, engineers, and GIS professionals in quickly delivering projects with better accuracy.

  5. Principle and Design of a Single-phase Inverter Based Grounding System for Neutral-to-ground Voltage Compensation in Distribution Networks

    DEFF Research Database (Denmark)

    Wang, Wen; Yan, Lingjie; Zeng, Xiangjun

    2017-01-01

    Neutral-to-ground overvoltage may occur in non-effectively grounded power systems because of the distributed parameters asymmetry and resonance between Petersen coil and distributed capacitances. Thus, the constraint of neutral-to-ground voltage is critical for the safety of distribution networks....... In this paper, an active grounding system based on single-phase inverter and its control parameter design method is proposed to achieve this objective. Relationship between its output current and neutral-to-ground voltage is derived to explain the principle of neutral-to-ground voltage compensation. Then...... margin subjecting to large range of load change. The PI method is taken as the comparative method and the performances of both control methods are presented in detail. Experimental results prove the effectiveness and novelty of the proposed grounding system and control method....

  6. Development and Verification of a Pilot Code based on Two-fluid Three-field Model

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Moon Kyu; Bae, S. W.; Lee, Y. J.; Chung, B. D.; Jeong, J. J.; Ha, K. S.; Kang, D. H

    2006-09-15

    In this study, a semi-implicit pilot code is developed for one-dimensional channel flow with three fields. The three fields comprise gas, continuous liquid, and entrained liquid. All three fields are allowed to have their own velocities. The temperatures of the continuous liquid and the entrained liquid are, however, assumed to be in equilibrium. The interphase phenomena include heat and mass transfer, as well as momentum transfer. Fluid/structure interaction generally includes both heat and momentum transfer; assuming an adiabatic system, only momentum transfer is considered in this study, leaving wall heat transfer for a future study. Using 10 conceptual problems, the basic pilot code has been verified. The results of the verification are summarized below: It was confirmed that the basic pilot code can simulate various flow conditions (such as single-phase liquid flow, bubbly flow, slug/churn turbulent flow, annular-mist flow, and single-phase vapor flow) and the transitions between them. The pilot code was programmed so that the source terms of the governing equations and the numerical solution schemes can be easily tested. Mass and energy conservation was confirmed for single-phase liquid and single-phase vapor flows. It was confirmed that the inlet pressure and velocity boundary conditions work properly. It was confirmed that, for single- and two-phase flows, the velocity and temperature of a non-existing phase are calculated as intended. Complete phase depletion, which might occur during a phase change, was found to adversely affect the code stability. A further study would be required to enhance the code capability in this regard.

  7. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and it is found that a cost-sensitive classifier produces better results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. Such a system plays an important role in forensic and civilian applications.

  8. Simultaneous ground- and satellite-based observation of MF/HF auroral radio emissions

    Science.gov (United States)

    Sato, Yuka; Kumamoto, Atsushi; Katoh, Yuto; Shinbori, Atsuki; Kadokura, Akira; Ogawa, Yasunobu

    2016-05-01

    We report on the first simultaneous measurements of medium-high frequency (MF/HF) auroral radio emissions (above 1 MHz) by ground- and satellite-based instruments. Observational data were obtained by ground-based passive receivers in Iceland and Svalbard, and by the Plasma Waves and Sounder experiment (PWS) on board the Akebono satellite. We observed two simultaneous-appearance events, during which the frequencies of the auroral roar and MF bursts detected at ground level differed from those of the terrestrial hectometric radiation (THR) observed by the Akebono satellite passing over the ground-based stations. This frequency difference confirms that auroral roar and THR are generated at different altitudes across the F peak. We found no simultaneous observations that indicated an identical generation region of auroral roar and THR. In most cases, MF/HF auroral radio emissions were observed only by the ground-based detector, or only by the satellite-based detector, even when the satellite was passing directly over the ground-based stations. A higher detection rate was observed from space than from ground level. This can be explained primarily by the fact that the Akebono satellite can detect THR emissions coming from a wider region, and that a considerable portion of the auroral radio emissions generated in the bottomside F region are masked by ionospheric absorption and screening in the D/E regions, associated with ionization resulting from auroral electrons and solar UV radiation.

  9. Research on Coverage-Driven SoC Verification Technology

    Institute of Scientific and Technical Information of China (English)

    朱车壮; 陈岚; 冯燕

    2011-01-01

    Coverage data is the qualitative measure by which verification engineers judge how complete an SoC verification effort is; it provides a safeguard for SoC verification completeness and points the direction forward. Taking the verification of an SoC bus arbiter as an example, this paper analyzes several kinds of coverage in detail, such as structural coverage, functional coverage, and assertion coverage, and then revises the RTL design code and test stimuli based on the coverage analysis results until the completeness of the verification meets the design requirements.

  10. Space- and ground-based particle physics meet at CERN

    CERN Document Server

    CERN Bulletin

    2012-01-01

    The fourth international conference on Particle and Fundamental Physics in Space (SpacePart12) will take place at CERN from 5 to 7 November. The conference will bring together scientists working on particle and fundamental physics in space and on the ground, as well as space policy makers from around the world.   One hundred years after Victor Hess discovered cosmic rays using hot-air balloons, the experimental study of particle and fundamental physics is still being pursued today with extremely sophisticated techniques: on the ground, with state-of-the-art accelerators like the LHC; and in space, with powerful observatories that probe, with amazing accuracy, the various forms of cosmic radiation, charged and neutral, which are messengers of the most extreme conditions of matter and energy. SpacePart12 will be an opportunity for participants to exchange views on the progress of space-related science and technology programmes in the field of particle and fundamental physics in space. SpacePar...

  11. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, and Human Health and Safety

    Science.gov (United States)

    Atwell, William; Koontz, Steve; Normand, Eugene

    2012-01-01

    Three twentieth century technological developments, 1) high altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid state micro-electronics systems, have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems. The effects of primary cosmic ray particles, and of secondary particle showers produced by nuclear reactions with the atmosphere, can determine the design and verification processes (as well as the total dollar cost) for manned and unmanned spacecraft avionics systems. Similar considerations apply to commercial and military aircraft operating at high latitudes and altitudes near the atmospheric Pfotzer maximum. Even ground based computational and controls systems can be negatively affected by secondary particle showers at the Earth's surface, especially if the net target area of the sensitive electronic system components is large. Finally, accumulation of radiation dose from both primary cosmic rays and secondary cosmic-ray-induced particle showers is an important health and safety consideration for commercial or military air crews operating at high altitude/latitude and is also one of the most important factors presently limiting manned space flight operations beyond low-Earth orbit (LEO). In this paper we review the discovery of cosmic ray effects on the performance and reliability of microelectronic systems, as well as on human health, and the development of the engineering and health science tools used to evaluate and mitigate cosmic ray effects in ground-based, atmospheric flight, and space flight environments. Ground test methods applied to microelectronic components and systems are used in combination with radiation transport and reaction codes to predict the performance of microelectronic systems in their operating environments. Similar radiation transport

  12. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer clouds and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. As with all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer; upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds would not be observed).
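
    One way to combine the two data sources, sketched below under stated assumptions: for a zenith-pointing camera, a cloud feature at height h advected by wind v(h) moves with angular speed omega ~ v(h)/h, so the cloud base can be taken as the sounding level that best satisfies this relation. The wind profile, the tracked angular speed, and the matching rule are illustrative, not the paper's exact algorithm.

      import numpy as np

      # Sounded wind profile: height (m AGL) and horizontal wind speed (m/s).
      heights = np.array([200.0, 500.0, 1000.0, 1500.0, 2000.0, 3000.0])
      wind = np.array([3.0, 5.0, 8.0, 10.0, 12.0, 15.0])

      # Angular speed of tracked cloud features in zenith thermal imagery
      # (rad/s, illustrative). For a zenith view, omega ~ v(h) / h.
      omega = 0.008

      # Cloud base height: the level whose wind speed best matches omega * h.
      cbh = heights[np.argmin(np.abs(wind - omega * heights))]
      print(f"estimated cloud base height: {cbh:.0f} m")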

  13. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  14. An Investigation of Widespread Ozone Damage to the Soybean Crop in the Upper Midwest Determined From Ground-Based and Satellite Measurements

    Science.gov (United States)

    Fishman, Jack; Creilson, John K.; Parker, Peter A.; Ainsworth, Elizabeth A.; Vining, G. Geoffrey; Szarka, John; Booker, Fitzgerald L.; Xu, Xiaojing

    2010-01-01

    Elevated concentrations of ground-level ozone (O3) are frequently measured over farmland regions in many parts of the world. While numerous experimental studies show that O3 can significantly decrease crop productivity, independent verifications of yield losses at current ambient O3 concentrations in rural locations are sparse. In this study, soybean crop yield data during a 5-year period over the Midwest of the United States were combined with ground and satellite O3 measurements to provide evidence that yield losses on the order of 10% could be estimated through the use of a multiple linear regression model. Yield loss trends based on both conventional ground-based instrumentation and satellite-derived tropospheric O3 measurements were statistically significant and were consistent with results obtained from open-top chamber experiments and an open-air experimental facility (SoyFACE, Soybean Free Air Concentration Enrichment) in central Illinois. Our analysis suggests that such losses are a relatively new phenomenon due to the increase in background tropospheric O3 levels over recent decades. Extrapolation of these findings supports previous studies that estimate the global economic loss to the farming community of more than $10 billion annually.
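
    The yield-loss estimate rests on a multiple linear regression of yields against ozone exposure and other covariates. A hedged numpy sketch of that kind of fit, with synthetic numbers standing in for the study's yield and O3 data and a hypothetical precipitation covariate:

      import numpy as np

      # Synthetic records: [O3 exposure index (ppb), precipitation (mm)] and
      # soybean yield (bu/ac); illustrative values, not the study's data.
      X = np.array([[52, 300], [58, 280], [63, 310],
                    [70, 260], [75, 240], [80, 250]], dtype=float)
      y = np.array([48.0, 46.5, 45.0, 42.0, 40.5, 38.5])

      # Ordinary least squares with an intercept column.
      A = np.column_stack([np.ones(len(X)), X])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      intercept, b_o3, b_precip = coef
      print(f"yield change per ppb O3: {b_o3:.2f} bu/ac")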

  15. Ground-Based Lidar Measurements During the CALIPSO and Twilight Zone (CATZ) Campaign

    Science.gov (United States)

    Berkoff, Timothy; Qian, Li; Kleidman, Richard; Stewart, Sebastian; Welton, Ellsworth; Li, Zhu; Holben, Brent

    2008-01-01

    The CALIPSO and Twilight Zone (CATZ) field campaign was carried out between June 26th and August 29th of 2007 in the multi-state Maryland-Virginia-Pennsylvania region of the U.S. to study aerosol properties and cloud-aerosol interactions during overpasses of the CALIPSO satellite. Field work was conducted on selected days when CALIPSO ground tracks occurred in the region. Ground-based measurements included data from multiple Cimel sunphotometers that were placed at intervals along a segment of the CALIPSO ground-track. These measurements provided sky radiance and AOD measurements to enable joint inversions and comparisons with CALIPSO retrievals. As part of this activity, four ground-based lidars provided backscatter measurements (at 523 nm) in the region. Lidars at University of Maryland Baltimore County (Catonsville, MD) and Goddard Space Flight Center (Greenbelt, MD) provided continuous data during the campaign, while two micro-pulse lidar (MPL) systems were temporarily stationed at various field locations directly on CALIPSO ground-tracks. As a result, thirteen on-track ground-based lidar observations were obtained from eight different locations in the region. In some cases, nighttime CALIPSO coincident measurements were also obtained. In most studies reported to date, ground-based lidar validation efforts for CALIPSO rely on systems at fixed locations some distance away from the satellite ground-track. The CATZ ground-based lidar data provide an opportunity to examine vertical structure properties of aerosols and clouds both on and off-track simultaneously during a CALIPSO overpass. A table of available ground-based lidar measurements during this campaign will be presented, along with example backscatter imagery for a number of coincident cases with CALIPSO. Results indicate that even for ground-based measurements directly on-track, comparisons can still pose a challenge due to the differing spatio-temporal properties of the ground and satellite

  16. Improving the detection of explosive hazards with LIDAR-based ground plane estimation

    Science.gov (United States)

    Buck, A.; Keller, J. M.; Popescu, M.

    2016-05-01

    Three-dimensional point clouds generated by LIDAR offer the potential to build a more complete understanding of the environment in front of a moving vehicle. In particular, LIDAR data facilitates the development of a non-parametric ground plane model that can filter target predictions from other sensors into above-ground and below-ground sets. This allows for improved detection performance when, for example, a system designed to locate above-ground targets considers only the set of above-ground predictions. In this paper, we apply LIDAR-based ground plane filtering to a forward looking ground penetrating radar (FLGPR) sensor system and a side looking synthetic aperture acoustic (SAA) sensor system designed to detect explosive hazards along the side of a road. Additionally, we consider the value of the visual magnitude of the LIDAR return as a feature for identifying anomalies. The predictions from these sensors are evaluated independently with and without ground plane filtering and then fused to produce a combined prediction confidence. Sensor fusion is accomplished by interpolating the confidence scores of each sensor along the ground plane model to create a combined confidence vector at specified points in the environment. The methods are tested along an unpaved desert road at an arid U.S. Army test site.
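
    A hedged sketch of the ground-plane filtering idea described above: a non-parametric ground model is built by keeping the lowest LIDAR return in each (x, y) grid cell, and a target prediction is retained only if it lies above the local ground estimate. Cell size, margin, and the synthetic data are illustrative assumptions.

      import numpy as np

      def ground_height_grid(points, cell=1.0):
          """Non-parametric ground model: lowest return per (x, y) grid cell."""
          ij = np.floor(points[:, :2] / cell).astype(int)
          ground = {}
          for key, z in zip(map(tuple, ij), points[:, 2]):
              ground[key] = min(z, ground.get(key, np.inf))
          return ground

      def is_above_ground(pred, ground, cell=1.0, margin=0.2):
          """Keep a prediction only if it sits above the local ground height."""
          key = tuple(np.floor(pred[:2] / cell).astype(int))
          return key in ground and pred[2] > ground[key] + margin

      # Synthetic point cloud and one radar-style prediction (illustrative).
      rng = np.random.default_rng(2)
      cloud = np.column_stack([rng.uniform(0, 10, 500),
                               rng.uniform(0, 10, 500),
                               rng.normal(0.0, 0.05, 500)])
      print(is_above_ground(np.array([4.3, 7.1, 0.5]), ground_height_grid(cloud)))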

  17. Biometric verification with correlation filters

    Science.gov (United States)

    Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-01

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.
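
    A minimal frequency-domain illustration of correlation-filter matching: the probe image is cross-correlated with a stored filter via FFTs and the match is scored with the peak-to-sidelobe ratio (PSR). The single-image matched filter here is a stand-in; advanced designs such as synthetic discriminant function filters are built from many training images.

      import numpy as np

      def correlation_psr(template_f, probe, exclude=5):
          """Cross-correlate probe with a frequency-domain filter and return
          the peak-to-sidelobe ratio of the correlation plane."""
          corr = np.real(np.fft.ifft2(np.fft.fft2(probe) * np.conj(template_f)))
          py, px = np.unravel_index(corr.argmax(), corr.shape)
          mask = np.ones(corr.shape, bool)
          mask[max(0, py - exclude):py + exclude + 1,
               max(0, px - exclude):px + exclude + 1] = False
          sidelobe = corr[mask]
          return (corr.max() - sidelobe.mean()) / sidelobe.std()

      # Illustrative "enrollment" and probe images (random stand-ins).
      rng = np.random.default_rng(3)
      enroll = rng.normal(size=(64, 64))
      template_f = np.fft.fft2(enroll)
      print("genuine PSR: ", correlation_psr(template_f, enroll))
      print("impostor PSR:", correlation_psr(template_f, rng.normal(size=(64, 64))))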

  18. Microcontroller-based ground weapon control system (Short Communication)

    Directory of Open Access Journals (Sweden)

    M. Sankar Kishore

    2001-10-01

    Armoured vehicles and tanks generally carry high-resolution optical (both infrared and visible) and display systems for recognition and identification of targets. Different weapons/articles to engage the targets may be present. A fire control system (FCS) controls all the above systems, monitors the status of the articles present, and passes the information to the display system. Depending upon the health and availability of the articles, the FCS selects and fires the articles. Emphasis is placed on the design and development, in both hardware and software, of the ground control unit, which is the heart of the FCS. The system has been developed using a microcontroller, with software written in ASM 51 language. The system also has a facility to test all the systems and articles as an initial power-on condition. From the safety point of view, software and hardware interlocks have been provided for critical operations such as the firing sequence.

  19. Research on a Framework for Rapid Verification Code Identification Based on PIL

    Institute of Scientific and Technical Information of China (English)

    胡光中; 欧阳鸿志

    2012-01-01

    Verification codes are widely used on the network as a form of Turing test; studying verification code identification technology helps the development of verification codes and solves some emerging problems. This paper presents an identification framework based on the PIL library which, by combining the strengths of Python with known identification methods, can quickly handle common forms of verification codes and achieve good recognition results.
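
    A hedged sketch of the kind of PIL preprocessing such a framework typically runs before character recognition (grayscale conversion, median denoising, thresholding). The function name, threshold value, and file names are hypothetical; only standard Pillow calls are used.

      from PIL import Image, ImageFilter

      def preprocess_captcha(path, threshold=140):
          """Grayscale -> median denoise -> binarize, a common first stage of
          verification code identification (parameters illustrative)."""
          img = Image.open(path).convert("L")            # grayscale
          img = img.filter(ImageFilter.MedianFilter(3))  # remove speckle noise
          return img.point(lambda p: 255 if p > threshold else 0)

      # Usage (hypothetical file name):
      # binary = preprocess_captcha("captcha.png")
      # binary.save("captcha_clean.png")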

  20. (Environmental investigation of ground water contamination at Wright-Patterson Air Force Base, Ohio)

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    This report presents information concerning field procedures employed during the monitoring, well construction, well purging, sampling, and well logging at the Wright-Patterson Air Force Base. Activities were conducted in an effort to evaluate ground water contamination.

  1. Ground-based Infrared Observations of Water Vapor and Hydrogen Peroxide in the Atmosphere of Mars

    Science.gov (United States)

    Encrenaz, T.; Greathouse, T. K.; Bitner, M.; Kruger, A.; Richter, M. J.; Lacy, J. H.; Bézard, B.; Fouchet, T.; Lefevre, F.; Forget, F.; Atreya, S. K.

    2008-11-01

    Ground-based observations of water vapor and hydrogen peroxide have been obtained in the thermal infrared range, using the TEXES instrument at the NASA Infrared Telescope Facility, for different times of the seasonal cycle.

  2. Informing hydrological models with ground-based time-lapse relative gravimetry: potential and limitations

    DEFF Research Database (Denmark)

    Bauer-Gottwein, Peter; Christiansen, Lars; Rosbjerg, Dan

    2011-01-01

    Coupled hydrogeophysical inversion emerges as an attractive option to improve the calibration and predictive capability of hydrological models. Recently, ground-based time-lapse relative gravity (TLRG) measurements have attracted increasing interest because there is a direct relationship between ...

  3. Changes in ground-based solar ultraviolet radiation during fire episodes: a case study

    CSIR Research Space (South Africa)

    Wright, CY

    2013-09-01

    about the relationship between fires and solar UVR without local high-quality column or ground-based ambient air pollution (particulate matter in particular) data; however, the threat to public health from fires was acknowledged....

  4. The relative importance of managerial competencies for predicting the perceived job performance of Broad-Based Black Economic Empowerment verification practitioners

    Directory of Open Access Journals (Sweden)

    Barbara M. Seate

    2016-02-01

    Orientation: There is a need for the growing Broad-Based Black Economic Empowerment (B-BBEE) verification industry to assess competencies and determine skills gaps for the management of the verification practitioners' perceived job performance. Knowing which managerial competencies are important for different managerial functions is vital for developing and improving training and development programmes. Research purpose: The purpose of this study was to determine the managerial capabilities that are required of the B-BBEE verification practitioners, in order to improve their perceived job performance. Motivation for the study: The growing number of B-BBEE verification practitioners calls for more focused training and development. Generating such a training and development programme demands empirical research into the relative importance of managerial competencies. Research approach, design and method: A quantitative design using the survey approach was adopted. A questionnaire was administered to a stratified sample of 87 B-BBEE verification practitioners. Data were analysed using the Statistical Package for the Social Sciences (version 22.0) and Smart Partial Least Squares software. Main findings: The results of the correlation analysis revealed that there were strong and positive associations between technical skills, interpersonal skills, compliance to standards and ethics, managerial skills, and perceived job performance. Results of the regression analysis showed that managerial skills, compliance to standards and ethics, and interpersonal skills were statistically significant in predicting perceived job performance. However, technical skills were insignificant in predicting perceived job performance. Practical/managerial implications: The study has shown that the B-BBEE verification industry, insofar as the technical skills of the practitioners are concerned, does have suitably qualified staff with the requisite educational qualifications. At the

  5. Spectrally selective surfaces for ground and space-based instrumentation: support for a resource base

    Science.gov (United States)

    McCall, Susan H.; Sinclair, R. Lawrence; Pompea, Stephen M.; Breault, Robert P.

    1993-11-01

    The performance of space telescopes, space instruments, and space radiator systems depends critically upon the selection of appropriate spectrally selective surfaces. Many space programs have suffered severe performance limitations, schedule setbacks, and spent hundreds of thousands of dollars in damage control because of a lack of readily-accessible, accurate data on the properties of spectrally selective surfaces, particularly black surfaces. A Canadian effort is underway to develop a resource base (database and support service) to help alleviate this problem. The assistance of the community is required to make the resource base comprehensive and useful to the end users. The paper aims to describe the objectives of this project. In addition, a request for information and support is made for various aspects of the project. The resource base will be useful for both ground and space-based instrumentation.

  6. Spherical coverage verification

    CERN Document Server

    Petkovic, Marko D; Latecki, Longin Jan

    2011-01-01

    We consider the problem of covering a hypersphere by a set of spherical hypercaps. This sort of problem has numerous practical applications, such as error-correcting codes and the reverse k-nearest neighbor problem. Using a reduction from the non-degenerate concave quadratic programming (QP) problem, we demonstrate that spherical coverage verification is NP-hard. We propose a recursive algorithm based on reducing the problem to several lower-dimension subproblems. We test the performance of the proposed algorithm on a number of generated constellations and demonstrate that, in spite of its exponential worst-case complexity, it is applicable in practice. In contrast, our results indicate that spherical coverage verification using QP solvers that utilize heuristics may, due to numerical instability, produce false positives.

  7. Evaluation of DVH-based treatment plan verification in addition to gamma passing rates for head and neck IMRT

    NARCIS (Netherlands)

    Visser, Ruurd; Wauben, David J. L.; de Groot, Martijn; Steenbakkers, Roel J. H. M.; Bijl, Henk P.; Godart, Jeremy; van t Veld, Aart; Langendijk, Johannes A.; Korevaar, Erik W.

    Background and purpose: Treatment plan verification of intensity modulated radiotherapy (IMRT) is generally performed with the gamma index (GI) evaluation method, which is difficult to extrapolate to clinical implications. Incorporating Dose Volume Histogram (DVH) information can compensate for

  8. System Identification and Automatic Mass Balancing of Ground-Based Three-Axis Spacecraft Simulator

    Science.gov (United States)

    2006-08-01

    Kim, Jae-Jun; Agrawal, Brij N.

  9. Verification of visual odometry algorithms with an OpenGL-based software tool

    Science.gov (United States)

    Skulimowski, Piotr; Strumillo, Pawel

    2015-05-01

    We present a software tool called a stereovision egomotion sequence generator that was developed for testing visual odometry (VO) algorithms. Various approaches to single- and multicamera VO algorithms are reviewed first, and then a reference VO algorithm that has served to demonstrate the program's features is described. The program offers simple tools for defining virtual static three-dimensional scenes and arbitrary six-degrees-of-freedom motion paths within such scenes, and it outputs sequences of stereovision images, disparity ground-truth maps, and segmented scene images. A simple script language is proposed that simplifies tests of VO algorithms for user-defined scenarios. The program's capabilities are demonstrated by testing a reference VO technique that employs stereoscopy and feature tracking.

  10. Thermal Analysis of the Driving Component Based on the Thermal Network Method in a Lunar Drilling System and Experimental Verification

    Directory of Open Access Journals (Sweden)

    Dewei Tang

    2017-03-01

    The main task of the third Chinese lunar exploration project is to obtain soil samples that are greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for the core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component, based on the thermal network method (TNM), was established, and the model was solved using the quasi-Newton method. A vacuum test platform was built and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the "safe working times" of every mode are given.
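
    For readers unfamiliar with the thermal network method, the sketch below solves a steady-state three-node network in which components are lumped nodes linked by conductances; the record's model additionally handles radiative exchange and is solved with a quasi-Newton method, which this linear toy omits. Node layout, conductances, and heat loads are assumed for illustration.

      import numpy as np

      # G[i, j]: conductance (W/K) between nodes i and j; q: dissipation (W).
      G = np.array([[0.0, 0.8, 0.2],
                    [0.8, 0.0, 0.5],
                    [0.2, 0.5, 0.0]])   # illustrative conductances
      q = np.array([0.0, 4.0, 1.5])     # assumed motor and gear losses
      t_boundary = 293.15               # K, mounting-interface temperature

      # Energy balance sum_j G_ij (T_j - T_i) + q_i = 0  ->  A T = b.
      A = np.diag(G.sum(axis=1)) - G
      b = q.copy()
      A[0, :] = 0.0
      A[0, 0] = 1.0
      b[0] = t_boundary                 # Dirichlet condition at node 0
      T = np.linalg.solve(A, b)
      print("node temperatures (K):", T)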

  11. Ground-based hyperspectral analysis of the urban nightscape

    Science.gov (United States)

    Alamús, Ramon; Bará, Salvador; Corbera, Jordi; Escofet, Jaume; Palà, Vicenç; Pipia, Luca; Tardà, Anna

    2017-02-01

    Airborne hyperspectral cameras provide the basic information to estimate the energy wasted skywards by outdoor lighting systems, as well as to locate and identify their sources. However, a complete characterization of urban light pollution levels also requires evaluating these effects from the city dwellers' standpoint, e.g. the energy waste associated with excessive illuminance on walls and pavements, light trespass, or the luminance distributions causing potential glare, to mention but a few. On the other hand, the spectral irradiance at the entrance of the human eye is the primary input for evaluating the possible health effects associated with exposure to artificial light at night, according to the most recent models available in the literature. In this work we demonstrate the possibility of using a hyperspectral imager (routinely used in airborne campaigns) to measure the ground-level spectral radiance of the urban nightscape and to retrieve several quantities of interest for light pollution studies. We also present preliminary results from a field campaign carried out in downtown Barcelona.

  12. Figure-ground organization based on three-dimensional symmetry

    Science.gov (United States)

    Michaux, Aaron; Jayadevan, Vijai; Delp, Edward; Pizlo, Zygmunt

    2016-11-01

    We present an approach to figure/ground organization using mirror symmetry as a general purpose and biologically motivated prior. Psychophysical evidence suggests that the human visual system makes use of symmetry in producing three-dimensional (3-D) percepts of objects. 3-D symmetry aids in scene organization because (i) almost all objects exhibit symmetry, and (ii) configurations of objects are not likely to be symmetric unless they share some additional relationship. No general purpose approach is known for solving 3-D symmetry correspondence in two-dimensional (2-D) camera images, because few invariants exist. Therefore, we present a general purpose method for finding 3-D symmetry correspondence by pairing the problem with the two-view geometry of the binocular correspondence problem. Mirror symmetry is a spatially global property that is not likely to be lost in the spatially local noise of binocular depth maps. We tested our approach on a corpus of 180 images collected indoors with a stereo camera system. K-means clustering was used as a baseline for comparison. The informative nature of the symmetry prior makes it possible to cluster data without a priori knowledge of which objects may appear in the scene, and without knowing how many objects there are in the scene.

  13. DATA PROCESSING AND ANALYSIS TOOLS BASED ON GROUND-BASED SYNTHETIC APERTURE RADAR IMAGERY

    Directory of Open Access Journals (Sweden)

    M. Crosetto

    2017-09-01

    The Ground-Based SAR (GBSAR) is a terrestrial remote sensing technique used to measure and monitor deformation. In this paper we describe two complementary approaches to deriving deformation measurements from GBSAR data. The first approach is based on radar interferometry, while the second one exploits the GBSAR amplitude. In this paper we consider the so-called discontinuous GBSAR acquisition mode. The interferometric process is not always straightforward: it requires appropriate data processing and analysis tools. One of the most critical steps is phase unwrapping, which can critically affect the deformation measurements. In this paper we describe the procedure used at the CTTC to process and analyse discontinuous GBSAR data. In the second part of the paper we describe the approach based on GBSAR amplitude images and an image-matching method.
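
    A hedged numpy sketch of the interferometric core: the wrapped phase history of one pixel is unwrapped along time and converted to line-of-sight deformation via d = (lambda / 4 pi) * phi. The Ku-band wavelength and the synthetic deformation series are assumptions for illustration.

      import numpy as np

      wavelength = 0.0176  # m, Ku-band GBSAR (assumed)

      # Synthetic slowly deforming target: 3 cm of cumulative motion observed
      # over 25 discontinuous campaigns, seen as wrapped phase in (-pi, pi].
      true_disp = np.linspace(0.0, 0.03, 25)
      phase = np.mod(4 * np.pi * true_disp / wavelength + np.pi,
                     2 * np.pi) - np.pi

      # Temporal phase unwrapping, then conversion to line-of-sight motion.
      disp = wavelength / (4 * np.pi) * np.unwrap(phase)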

  14. Cloud Base Height and Effective Cloud Emissivity Retrieval with Ground-Based Infrared Interferometer

    Institute of Scientific and Technical Information of China (English)

    PAN Lin-Jun; LU Da-Ren

    2012-01-01

    Based on ground-based Atmospheric Emitted Radiance Interferometer (AERI) observations at Shouxian, Anhui province, China, the authors retrieve the cloud base height (CBH) and effective cloud emissivity using the minimum root-mean-square difference method. This method was originally developed for satellite remote sensing. The high-temporal-resolution retrievals can continuously depict the fine variations of zenith clouds. The retrieval results are evaluated by comparing them with cloud radar observations. The comparison shows that the retrieval bias is smaller for middle and low clouds, especially for opaque clouds. When two layers of cloud exist, the retrieval results reflect the weighted radiative contribution of the multi-layer cloud. The retrieval accuracy is affected by uncertainties in the AERI radiances and sounding profiles, among which the uncertainty in the temperature profile is dominant.
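
    The minimum root-mean-square difference retrieval can be sketched as a one-dimensional search: compare the observed downwelling radiance with spectra computed for candidate cloud base heights and keep the best-fitting height. The toy forward model below stands in for radiative-transfer calculations on the sounding profile; all names and numbers are illustrative.

      import numpy as np

      def retrieve_cbh(obs, candidates, forward_model):
          """Return the candidate cloud base height whose modeled spectrum has
          the minimum RMS difference from the observation."""
          rms = [np.sqrt(np.mean((obs - forward_model(h)) ** 2))
                 for h in candidates]
          return candidates[int(np.argmin(rms))]

      # Toy forward model: radiance decreases with cloud base height (assumed).
      wn = np.linspace(520.0, 560.0, 40)  # wavenumber grid, cm^-1
      forward = lambda h: 100.0 * np.exp(-h / 5000.0) + 0.01 * wn
      obs = forward(2000.0) + np.random.default_rng(4).normal(0.0, 0.05, wn.size)
      print(retrieve_cbh(obs, np.arange(500.0, 8000.0, 250.0), forward))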

  15. Robust verification analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William, E-mail: wjrider@sandia.gov [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States); Witkowski, Walt [Sandia National Laboratories, Verification and Validation, Uncertainty Quantification, Credibility Processes Department, Engineering Sciences Center, Albuquerque, NM 87185 (United States); Kamm, James R. [Los Alamos National Laboratory, Methods and Algorithms Group, Computational Physics Division, Los Alamos, NM 87545 (United States); Wildey, Tim [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States)

    2016-02-15

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
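
    The core numerical move, estimating an observed order of convergence robustly, can be sketched as follows: fit the error ansatz E = A * h^p on every pair of refinement levels and take the median of the resulting rates, so that a single anomalous level does not skew the estimate. Grid sizes and errors below are illustrative.

      import numpy as np
      from itertools import combinations

      # Grid sizes and solution errors from a refinement study (illustrative,
      # with a slightly anomalous coarse-grid point).
      h = np.array([0.100, 0.050, 0.025, 0.0125])
      err = np.array([3.1e-2, 6.1e-3, 1.5e-3, 3.8e-4])

      # Pairwise observed rates p = log(E_i/E_j) / log(h_i/h_j), then median.
      rates = [np.log(err[i] / err[j]) / np.log(h[i] / h[j])
               for i, j in combinations(range(len(h)), 2)]
      print(f"robust observed order of convergence: {np.median(rates):.2f}")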

  16. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check software for compliance with its stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. These methods differ both in how they operate and in how they reach their results. The article describes the static and dynamic methods of software verification and pays particular attention to the method of symbolic execution. In its review of static analysis, the deductive method and model checking are discussed and described. The pros and cons of each particular method are emphasized, and a classification of testing techniques is considered for each method. The paper presents and analyzes the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in the code being verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, together with some kinds of tools that can be applied to the software when using dynamic analysis methods. Based on this work a conclusion is drawn which describes the most relevant problems of the analysis techniques, methods of their solutions and

  17. Marker-based quantification of interfractional tumor position variation and the use of markers for setup verification in radiation therapy for esophageal cancer.

    Science.gov (United States)

    Jin, Peng; van der Horst, Astrid; de Jong, Rianne; van Hooft, Jeanin E; Kamphuis, Martijn; van Wieringen, Niek; Machiels, Melanie; Bel, Arjan; Hulshof, Maarten C C M; Alderliesten, Tanja

    2015-12-01

    The aim of this study was to quantify interfractional esophageal tumor position variation using markers and investigate the use of markers for setup verification. Sixty-five markers placed in the tumor volumes of 24 esophageal cancer patients were identified in computed tomography (CT) and follow-up cone-beam CT. For each patient we calculated pairwise distances between markers over time to evaluate geometric tumor volume variation. We then quantified marker displacements relative to bony anatomy and estimated the variation of systematic (Σ) and random errors (σ). During bony anatomy-based setup verification, we visually inspected whether the markers were inside the planning target volume (PTV) and attempted marker-based registration. Minor time trends with substantial fluctuations in pairwise distances implied tissue deformation. Overall, Σ(σ) in the left-right/cranial-caudal/anterior-posterior direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm; for the proximal stomach, it was 5.4(4.3)/4.9(3.2)/1.9(2.4) mm. After bony anatomy-based setup correction, all markers were inside the PTV. However, due to large tissue deformation, marker-based registration was not feasible. Generally, the interfractional position variation of esophageal tumors is more pronounced in the cranial-caudal direction and in the proximal stomach. Currently, marker-based setup verification is not feasible for clinical routine use, but markers can facilitate the setup verification by inspecting whether the PTV covers the tumor volume adequately. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  18. Verification of the new detection method for irradiated spices based on microbial survival by collaborative blind trial

    Science.gov (United States)

    Miyahara, M.; Furuta, M.; Takekawa, T.; Oda, S.; Koshikawa, T.; Akiba, T.; Mori, T.; Mimura, T.; Sawada, C.; Yamaguchi, T.; Nishioka, S.; Tada, M.

    2009-07-01

    An irradiation detection method using the difference of the radiation sensitivity of the heat-treated microorganisms was developed as one of the microbiological detection methods of the irradiated foods. This detection method is based on the difference of the viable cell count before and after heat treatment (70 °C and 10 min). The verification by collaborative blind trial of this method was done by nine inspecting agencies in Japan. The samples used for this trial were five kinds of spices consisting of non-irradiated, 5 kGy irradiated, and 7 kGy irradiated black pepper, allspice, oregano, sage, and paprika, respectively. As a result of this collaboration, a high percentage (80%) of the correct answers was obtained for irradiated black pepper and allspice. However, the method was less successful for irradiated oregano, sage, and paprika. It might be possible to use this detection method for preliminary screening of the irradiated foods but further work is necessary to confirm these findings.

  19. Comparing the Efficacy of Reform-Based and Traditional/Verification Curricula to Support Student Learning about Space Science

    Science.gov (United States)

    Granger, E. M.; Bevis, T. H.; Saka, Y.; Southerland, S. A.

    2010-08-01

    This research explores the relationship between reform-based curriculum and the development of students' knowledge of and attitudes toward space science. Using a randomized cluster design, the effectiveness of the Great Explorations in Math and Science (GEMS) Space Science Curriculum Sequence was compared with the effectiveness of a more traditional curriculum in supporting 4th and 5th grade students' learning of and attitudes toward space science. GEMS employed an inductive approach to content (learning cycle), explicit use of evidence, and attention to scientific inquiry. The comparison group experienced traditional, verification-based teaching. Randomization occurred at the level of teacher assignment to treatment group (not at the student level). Students in the classrooms in which GEMS was employed demonstrated a statistically significant increase in content knowledge and attitudes toward space science; students in classrooms in which the traditional curriculum was employed did not show these increases. The GEMS effect on student achievement was greater for students in classrooms in which the teacher experienced a greater increase in content knowledge.

  20. Model-based virtual VSB mask writer verification for efficient mask error checking and optimization prior to MDP

    Science.gov (United States)

    Pack, Robert C.; Standiford, Keith; Lukanc, Todd; Ning, Guo Xiang; Verma, Piyush; Batarseh, Fadi; Chua, Gek Soon; Fujimura, Akira; Pang, Linyong

    2014-10-01

    A methodology is described wherein a calibrated model-based 'Virtual' Variable Shaped Beam (VSB) mask writer process simulator is used to accurately verify complex Optical Proximity Correction (OPC) and Inverse Lithography Technology (ILT) mask designs prior to Mask Data Preparation (MDP) and mask fabrication. This type of verification addresses physical effects which occur in mask writing that may impact lithographic printing fidelity and variability. The work described here is motivated by requirements for extreme accuracy and control of variations for today's most demanding IC products. These extreme demands necessitate careful and detailed analysis of all potential sources of uncompensated error or variation and extreme control of these at each stage of the integrated OPC/ MDP/ Mask/ silicon lithography flow. The important potential sources of variation we focus on here originate on the basis of VSB mask writer physics and other errors inherent in the mask writing process. The deposited electron beam dose distribution may be examined in a manner similar to optical lithography aerial image analysis and image edge log-slope analysis. This approach enables one to catch, grade, and mitigate problems early and thus reduce the likelihood for costly long-loop iterations between OPC, MDP, and wafer fabrication flows. It moreover describes how to detect regions of a layout or mask where hotspots may occur or where the robustness to intrinsic variations may be improved by modification to the OPC, choice of mask technology, or by judicious design of VSB shots and dose assignment.

  1. Verification of the new detection method for irradiated spices based on microbial survival by collaborative blind trial

    Energy Technology Data Exchange (ETDEWEB)

    Miyahara, M. [National Institute of Health Sciences, 1-18-1 Kamiyoga, Setagaya-Ku, Tokyo (Japan)], E-mail: mfuruta@b.s.osakafu-u.ac.jp; Furuta, M. [Osaka Prefecture University, 1-2 Gakuen-Cho, Naka-ku, Sakai, 599-8570 Osaka (Japan); Takekawa, T. [Nuclear Fuel Industries Ltd., 950-1 Asashiro-Nishi, Kumatori-Cho, Sennan-Gun, Osaka (Japan); Oda, S. [Japan Food Research Laboratories, 52-1 Motoyoyogi-Cho, Sibuya-Ku, Tokyo (Japan); Koshikawa, T. [Japan Radio Isotope Association, 121-19 Toriino, Koka, Shiga (Japan); Akiba, T. [Japan Food Hygiene Association, 2-5-47 Tadao, Machida, Tokyo (Japan); Mori, T. [Tokyo Kenbikyo-In Foundation, 44-1 Nihonbashi, Hakozaki-Cho, Chuo-Ku, Tokyo (Japan); Mimura, T. [Japan Oilstuff Inspector's Corporation, 26-1 Kaigandori 5-Chome, Naka-Ku, Yokohama (Japan); Sawada, C. [Japan Frozen Foods Inspection Corp., 2-13-45 Fukuura, Kanazawa-Ku, Yokohama (Japan); Yamaguchi, T. [Japan Electron Beam Irradiation Service Co., Ltd., 4-16 Midorigahara, Tukuba, Ibaraki (Japan); Nishioka, S. [Mycotoxin Inspection Corp., 15 Daikokufuto, Turumi-Ku, Yokohama (Japan); Tada, M. [Chugoku Gakuen University, 83 Niwase, Okayama (Japan)

    2009-07-15

    An irradiation detection method using the difference of the radiation sensitivity of the heat-treated microorganisms was developed as one of the microbiological detection methods of the irradiated foods. This detection method is based on the difference of the viable cell count before and after heat treatment (70 °C and 10 min). The verification by collaborative blind trial of this method was done by nine inspecting agencies in Japan. The samples used for this trial were five kinds of spices consisting of non-irradiated, 5 kGy irradiated, and 7 kGy irradiated black pepper, allspice, oregano, sage, and paprika, respectively. As a result of this collaboration, a high percentage (80%) of the correct answers was obtained for irradiated black pepper and allspice. However, the method was less successful for irradiated oregano, sage, and paprika. It might be possible to use this detection method for preliminary screening of the irradiated foods but further work is necessary to confirm these findings.

  2. Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification

    Science.gov (United States)

    Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand

    2016-01-01

    The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development, and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease the development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium, and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives: to advance navigation systems for small satellites, to provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and to verify the performance and durability of III-V Nitride-based materials.

  3. SU-E-T-761: TOMOMC, A Monte Carlo-Based Planning VerificationTool for Helical Tomotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chibani, O; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States)

    2015-06-15

    Purpose: Present a new Monte Carlo code (TOMOMC) to calculate 3D dose distributions for patients undergoing helical tomotherapy treatments. TOMOMC performs CT-based dose calculations using the actual dynamic variables of the machine (couch motion, gantry rotation, and MLC sequences). Methods: TOMOMC is based on the GEPTS (Gamma Electron and Positron Transport System) general-purpose Monte Carlo system (Chibani and Li, Med. Phys. 29, 2002, 835). First, beam models for the Hi-Art tomotherapy machine were developed for the different beam widths (1, 2.5 and 5 cm). The beam model accounts for the exact geometry and composition of the different components of the linac head (target, primary collimator, jaws and MLCs). The beam models were benchmarked by comparing calculated percentage depth doses (PDDs) and lateral/transversal dose profiles with ionization chamber measurements in water (figures 1–3). The MLC model was tuned so that the tongue-and-groove effect and inter-leaf and intra-leaf transmission are modeled correctly (figure 4). Results: By simulating the exact patient anatomy and the actual treatment delivery conditions (couch motion, gantry rotation and MLC sinogram), TOMOMC is able to calculate the 3D patient dose distribution, which is in principle more accurate than the one from the treatment planning system (TPS) since it relies on the Monte Carlo method (the gold standard). Dose volume parameters based on the Monte Carlo dose distribution can also be compared to those produced by the TPS. The attached figures show isodose lines for a H&N patient calculated by TOMOMC (transverse and sagittal views). Analysis of differences between TOMOMC and the TPS is ongoing work for different anatomic sites. Conclusion: A new Monte Carlo code (TOMOMC) was developed for tomotherapy patient-specific QA. The next step in this project is implementing GPU computing to speed up the Monte Carlo simulation and make Monte Carlo-based treatment verification a practical solution.

  4. GROUND FILTERING LiDAR DATA BASED ON MULTI-SCALE ANALYSIS OF HEIGHT DIFFERENCE THRESHOLD

    Directory of Open Access Journals (Sweden)

    P. Rashidi

    2017-09-01

    Separating point clouds into ground and non-ground points is a necessary step for generating a digital terrain model (DTM) from a LiDAR dataset. In this research, a new method based on multi-scale analysis of height difference thresholds is proposed for ground filtering of LiDAR data. The proposed method uses three windows of different sizes (small, medium, and large) to cover the entire LiDAR point cloud; within each local window, points are separated into ground and non-ground using a height difference threshold. The window sizes are chosen based on the physical characteristics of the ground surface and the size of objects, and the minimum object height in each window is selected as the height difference threshold. To evaluate the performance of the proposed algorithm, two datasets, one rural and one urban, were used. The overall accuracy was 96.06% in the rural area and 94.88% in the urban area. These filtering results show that the proposed method successfully separates non-ground points from LiDAR point clouds regardless of the scene type.
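
    A compact sketch of this multi-scale filtering idea is given below, applied to a gridded minimum-height field; the window sizes and thresholds are illustrative stand-ins, not the values the paper derives from terrain and object characteristics.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def ground_mask(z_grid, windows=((3, 0.3), (9, 1.5), (21, 3.0))):
    """z_grid: 2-D array of the lowest point height per cell (NaN = empty).
    windows: (window size in cells, height-difference threshold in metres),
    small to large, roughly vegetation / vehicle / building scales (assumed)."""
    z = np.where(np.isnan(z_grid), np.inf, z_grid)
    mask = np.ones(z_grid.shape, dtype=bool)
    for size, dh in windows:
        local_min = minimum_filter(z, size=size)   # lowest surface in window
        mask &= (z - local_min) <= dh              # keep points near that surface
    return mask & np.isfinite(z_grid)

z = np.array([[10.0, 10.1, 12.5],   # 12.5: small object above ground
              [10.0, 10.2, 10.1],
              [10.1, 18.0, 10.0]])  # 18.0: building-like outlier
print(ground_mask(z, windows=((3, 0.5),)))
```

    Points that survive every window/threshold pair are kept as ground; a point only has to exceed the height-difference threshold at one scale to be rejected as an object.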

  5. Low Power Ground-Based Laser Illumination for Electric Propulsion Applications

    Science.gov (United States)

    Lapointe, Michael R.; Oleson, Steven R.

    1994-01-01

    A preliminary evaluation of low power, ground-based laser powered electric propulsion systems is presented. A review of available and near-term laser, photovoltaic, and adaptive optic systems indicates that approximately 5-kW of ground-based laser power can be delivered at an equivalent one-sun intensity to an orbit of approximately 2000 km. Laser illumination at the proper wavelength can double photovoltaic array conversion efficiencies compared to efficiencies obtained with solar illumination at the same intensity, allowing a reduction in array mass. The reduced array mass allows extra propellant to be carried with no penalty in total spacecraft mass. The extra propellant mass can extend the satellite life in orbit, allowing additional revenue to be generated. A trade study using realistic cost estimates and conservative ground station viewing capability was performed to estimate the number of communication satellites which must be illuminated to make a proliferated system of laser ground stations economically attractive. The required number of satellites is typically below that of proposed communication satellite constellations, indicating that low power ground-based laser beaming may be commercially viable. However, near-term advances in low specific mass solar arrays and high energy density batteries for LEO applications would render the ground-based laser system impracticable.

  6. Systems, methods and apparatus for verification of knowledge-based systems

    Science.gov (United States)

    Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor); Rouff, Christopher A. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from the rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a corrected knowledge-based system is then translated.

  7. Designed microtremor array based actual measurement and analysis of strong ground motion at Palu city, Indonesia

    Science.gov (United States)

    Thein, Pyi Soe; Pramumijoyo, Subagyo; Brotopuspito, Kirbani Sri; Wilopo, Wahyu; Kiyono, Junji; Setianto, Agung; Putra, Rusnardi Rahmat

    2015-04-01

    In this study, we investigated the strong ground motion characteristics under Palu City, Indonesia. The shear wave velocity structures evaluated from eight microtremor measurements are well suited to determining the thickness of sediments and the average shear wave velocity with Vs ≤ 300 m/s. Based on the identified subsurface structure models, earthquake ground motion for a future Palu-Koro earthquake was estimated using the statistical Green's function method. The seismic microzonation parameters were derived by considering several significant factors controlling the ground response during the January 23, 2005 earthquake.

  8. Designed microtremor array based actual measurement and analysis of strong ground motion at Palu city, Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Thein, Pyi Soe, E-mail: pyisoethein@yahoo.com [Geology Department, Yangon University (Myanmar); Pramumijoyo, Subagyo; Wilopo, Wahyu; Setianto, Agung [Geological Engineering Department, Gadjah Mada University (Indonesia); Brotopuspito, Kirbani Sri [Physics Department, Gadjah Mada University (Indonesia); Kiyono, Junji; Putra, Rusnardi Rahmat [Graduate School of Global Environmental Studies, Kyoto University (Japan)

    2015-04-24

    In this study, we investigated the strong ground motion characteristics under Palu City, Indonesia. The shear wave velocity structures evaluated from eight microtremor measurements are well suited to determining the thickness of sediments and the average shear wave velocity with Vs ≤ 300 m/s. Based on the identified subsurface structure models, earthquake ground motion for a future Palu-Koro earthquake was estimated using the statistical Green's function method. The seismic microzonation parameters were derived by considering several significant factors controlling the ground response during the January 23, 2005 earthquake.

  9. A Laser Absorption Spectroscopy System for 2D Mapping of CO2 Over Large Spatial Areas for Monitoring, Reporting and Verification of Ground Carbon Storage Sites

    Science.gov (United States)

    Dobler, J. T.; Braun, M.; Blume, N.; McGregor, D.; Zaccheo, T. S.; Pernini, T.; Botos, C.

    2014-12-01

    We will present the development of the Greenhouse gas Laser Imaging Tomography Experiment (GreenLITE). GreenLITE consists of two laser-based transceivers and a number of retro-reflectors to measure the differential transmission (DT) of a number of overlapping chords in a plane over the site being monitored. The transceivers use the Intensity Modulated Continuous Wave (IM-CW) approach, a technique that allows simultaneous transmission/reception of multiple fixed-wavelength lasers and a lock-in, or matched filter, to measure the amplitude and phase of the different wavelengths in the digital domain. The technique was developed by Exelis and has been evaluated using an airborne demonstrator for the past 10 years by NASA Langley Research Center. The method has demonstrated high-accuracy, high-precision measurements compared to an in situ monitor traceable to WMO standards, agreeing to 0.65 ppm +/- 1.7 ppm. The GreenLITE system is coupled to a cloud-based data storage and processing system that takes the measured chord data, along with auxiliary data, retrieves an average CO2 concentration per chord, and combines the chords to provide an estimate of the spatial distribution of CO2 concentration in the plane. A web-based interface allows users to view real-time CO2 concentrations and 2D concentration maps of the area being monitored. The 2D maps can be differenced as a function of time for an estimate of the flux across the plane measured by the system. The system is designed to operate autonomously from semi-remote locations with a very low maintenance cycle. Initial instrument tests, conducted in June, showed a signal-to-noise ratio of >3000 in the measured data for 10 s averages. Additional local field testing and quantitative field testing at the Zero Emissions Research and Technology (ZERT) site in Bozeman, MT are planned for this fall. We will present details on the instrument and software tools that have been developed, along with results from the local
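
    The chord-to-map step lends itself to a small tomographic inversion sketch; in the toy example below, a ridge-regularized least squares stands in for the actual (non-public) GreenLITE retrieval, and the grid, chord layout, and plume are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10                                   # n x n concentration grid [ppm]
truth = np.full((n, n), 400.0)
truth[2:5, 6:9] += 30.0                  # synthetic CO2 plume

def chord_row(p0, p1, n, steps=200):
    """Row of the geometry matrix: average of cells sampled along p0 -> p1."""
    t = np.linspace(0.0, 1.0, steps)
    xs = (p0[0] + t * (p1[0] - p0[0])).astype(int).clip(0, n - 1)
    ys = (p0[1] + t * (p1[1] - p0[1])).astype(int).clip(0, n - 1)
    row = np.zeros(n * n)
    np.add.at(row, xs * n + ys, 1.0 / steps)
    return row

# "Transceivers" on two edges, retro-reflectors on the opposite edges.
chords = [((0, j), (n - 1, k)) for j in range(n) for k in range(0, n, 3)]
chords += [((j, 0), (k, n - 1)) for j in range(n) for k in range(0, n, 3)]
A = np.array([chord_row(p0, p1, n) for p0, p1 in chords])
y = A @ truth.ravel() + rng.normal(0.0, 0.2, len(chords))  # noisy chord means

lam = 1e-3                               # ridge weight (tunable assumption)
y0 = y - 400.0                           # solve for the anomaly above background
x = 400.0 + np.linalg.solve(A.T @ A + lam * np.eye(n * n), A.T @ y0)
print("mean abs error [ppm]:", np.abs(x.reshape(n, n) - truth).mean())
```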

  10. Establishing common ground in community-based arts in health.

    Science.gov (United States)

    White, Mike

    2006-05-01

    This article originates in current research into community-based arts in health. Arts in health is now a diverse field of practice, and community-based arts in health interventions have extended the work beyond healthcare settings into public health. Examples of this work can now be found internationally in different health systems and cultural contexts. The paper argues that researchers need to understand the processes through which community-based arts in health projects evolve, and how they work holistically in their attempt to produce therapeutic and social benefits for both individuals and communities, and to connect with a cultural base in healthcare services themselves. A development model that might be adapted to assist in analysing this is the World Health Organisation Quality of Life Index (WHOQOL). Issues raised in the paper around community engagement, healthy choice and self-esteem are then illustrated in case examples of community-based arts in health practice in South Africa and England; namely the DramAide and Siyazama projects in KwaZulu-Natal, and Looking Well Healthy Living Centre in North Yorkshire. In South Africa there are arts and media projects attempting to raise awareness about HIV/AIDS through mass messaging, but they also recognize that they lack models of longer-term community engagement. Looking Well by contrast addresses health issues identified by the community itself in ways that are personal, empathic and domesticated. But there are also similarities among these projects in their aims to generate a range of social, educational and economic benefits within a community-health framework, and they are successfully regenerating traditional cultural forms to create public participation in health promotion. Process evaluation may provide a framework in which community-based arts in health projects, especially if they are networked together to share practice and thinking, can assess their ability to address health inequalities and focus

  11. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    Science.gov (United States)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine, or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. A robust, well-tested top-down design flow for new control/readout protocols for front-end electronics is therefore very valuable. To this aim, and to reduce development time, costs, and risks, this paper describes an innovative design/verification flow, applied as an example case study to a new communication protocol called FF-LYNX. After a description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the definition of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  12. Analysis of meteorological variables in the Australasian region using ground- and space-based GPS techniques

    Science.gov (United States)

    Kuleshov, Yuriy; Choy, Suelynn; Fu, Erjiang Frank; Chane-Ming, Fabrice; Liou, Yuei-An; Pavelyev, Alexander G.

    2016-07-01

    Results of an analysis of meteorological variables (temperature and moisture) in the Australasian region using global positioning system (GPS) radio occultation (RO) and ground-based GPS observations, verified with in situ radiosonde (RS) data, are presented. The potential of using ground-based GPS observations for retrieving column-integrated precipitable water vapour (PWV) over the Australian continent has been demonstrated using the Australian ground-based GPS reference station network. Using data from 15 ground-based GPS stations, the state of the atmosphere over Victoria during a significant weather event, the March 2010 Melbourne storm, has been investigated, and it has been shown that GPS observations have potential for monitoring the movement of a weather front with a sharp moisture contrast. Temperature and moisture variability in the atmosphere over various climatic regions (the Indian and Pacific Oceans, the Antarctic, and Australia) has been examined using satellite-based GPS RO and in situ RS observations. Investigating recent atmospheric temperature trends over Antarctica, the time series of collocated GPS RO and RS data were examined, and strong cooling in the lower stratosphere and warming through the troposphere over Antarctica were identified, in agreement with the outputs of climate models. With further expansion of the Global Navigation Satellite Systems (GNSS), it is expected that GNSS satellite- and ground-based measurements will provide an order of magnitude more data, which in turn could significantly advance weather forecasting services, climate monitoring, and analysis in the Australasian region.
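
    For reference, the standard chain from a ground-based GPS zenith total delay to PWV can be sketched as below, using the Saastamoinen hydrostatic model and the Bevis mean-temperature approximation; the station values in the example are illustrative, not data from this study.

```python
import numpy as np

def zhd_saastamoinen(p_hpa, lat_deg, h_km):
    """Zenith hydrostatic delay [m] from surface pressure (Saastamoinen)."""
    return 0.0022768 * p_hpa / (1 - 0.00266 * np.cos(2 * np.radians(lat_deg))
                                - 0.00028 * h_km)

def pwv_from_ztd(ztd_m, p_hpa, ts_k, lat_deg, h_km):
    """Precipitable water vapour [mm] from a GPS zenith total delay [m]."""
    zwd = ztd_m - zhd_saastamoinen(p_hpa, lat_deg, h_km)   # wet delay
    tm = 70.2 + 0.72 * ts_k               # mean temperature (Bevis et al.)
    k2p, k3, rv, rho = 0.221, 3739.0, 461.5, 1000.0   # SI-consistent constants
    pi = 1e6 / (rho * rv * (k3 / tm + k2p))           # dimensionless, ~0.15
    return 1000.0 * pi * zwd              # m -> mm

# Illustrative station: 2.40 m ZTD at a warm coastal site near -37.8 deg.
print(f"PWV ~ {pwv_from_ztd(2.40, 1013.0, 293.0, -37.8, 0.05):.1f} mm")
```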

  13. Ground-Based Surveillance and Tracking System (GSTS)

    Science.gov (United States)

    1987-08-01

    … reported availability of relatively high-paying jobs. The consequences of increased migration could be significant. No significant impacts at U.S. Army … Air Force Base are contributing to overdrawing the aquifers, and at current usage rates the aquifers could be depleted (44). The "Draft Environmental …

  14. Tracking of urban aerosols using combined lidar-based remote sensing and ground-based measurements

    Directory of Open Access Journals (Sweden)

    T.-Y. He

    2011-10-01

    A measuring campaign was performed over the neighboring towns of Nova Gorica in Slovenia and Gorizia in Italy on 24 and 25 May 2010 to investigate the concentration and distribution of urban aerosols. Tracking of two-dimensional spatial and temporal aerosol distributions was performed using a scanning elastic lidar operating at 1064 nm. In addition, PM10 particle concentrations, NOx, and meteorological data were continuously monitored within the lidar scanning region. Based on the collected data, we investigated the flow dynamics and aerosol concentrations within the lower troposphere and found evidence of daily aerosol cycles. We observed a number of cases with spatially localized increased lidar returns, which were found to be due to the presence of point sources of particulate matter. Daily aerosol concentration cycles were clearly visible, with a peak in aerosol concentration during the morning rush hours and a daily maximum at around 17:00 Central European Time. We also found that the averaged horizontal atmospheric extinction within the scanning region 200 m above the ground is correlated with the PM10 concentration at ground level, with a correlation coefficient of 0.64, which may be due to the relatively quiet meteorological conditions and the basin-like terrain configuration.

  15. Ground Based GPS Phase Measurements for Atmospheric Sounding

    Science.gov (United States)

    2016-06-14

    … based GPS observations for the correction of radar observations. 6 REFERENCES Alber, C., R. Ware, C. Rocken, and J. Braun, A new method for sensing … rocken@ucar.edu Award #: N00014-97-1-0258 LONG-TERM GOAL The goal is to develop GPS remote sensing techniques to determine atmospheric signal delay and … agrees best with the observations in a least squares sense is selected. The corresponding refractivity profile is then selected. • We tested this …

  16. Constraint-based Ground contact handling in Humanoid Robotics Simulation

    OpenAIRE

    Martin Moraud, Eduardo; Hale, Joshua G.; Cheng, Gordon

    2008-01-01

    This paper presents a method for resolving contact in dynamic simulations of articulated figures. It is intended for humanoids with polygonal feet and incorporates Coulomb friction exactly. The proposed technique is based on a constraint selection paradigm. Its implementation offers an exact mode which guarantees correct behavior, as well as an efficiency-optimized mode which sacrifices accuracy for a tightly bounded computational burden, thus facilitating batch simulation.

  17. Ground-based follow-up in relation to Kepler Asteroseismic Investigation

    CERN Document Server

    Uytterhoeven, K; Bruntt, H; De Cat, P; Frandsen, S; Gutierrez-Soto, J; Kiss, L; Kurtz, D W; Marconi, M; Molenda-Zakowicz, J; Ostensen, R; Randall, S; Southworth, J; Szabo, R

    2010-01-01

    The Kepler space mission, successfully launched in March 2009, is providing continuous, high-precision photometry of thousands of stars simultaneously. The uninterrupted time-series of stars of all known pulsation types are a precious source for asteroseismic studies. The Kepler data do not provide information on the physical parameters, such as effective temperature, surface gravity, metallicity, and vsini, which are crucial for successful asteroseismic modelling. Additional ground-based time-series data are needed to characterize mode parameters in several types of pulsating stars. Therefore, ground-based multi-colour photometry and mid/high-resolution spectroscopy are needed to complement the space data. We present ground-based activities within KASC on selected asteroseismic Kepler targets of several pulsation types. (Based on observations made with the Isaac Newton Telescope, William Herschel Telescope, Nordic Optical Telescope, Telescopio Nazionale Galileo, Mercator Telescope (La Palma, Spain), and IAC-...

  18. Ka-band bistatic ground-based SAR using noise signals

    Science.gov (United States)

    Lukin, K.; Mogyla, A.; Vyplavin, P.; Palamarchuk, V.; Zemlyaniy, O.; Tarasenko, V.; Zaets, N.; Skretsanov, V.; Shubniy, A.; Glamazdin, V.; Natarov, M.; Nechayev, O.

    2008-01-01

    Remote monitoring of the technical state of large objects is currently a pressing problem. Different methods can be used for this purpose; the most promising relies on ground-based synthetic aperture radar (SAR) and differential interferometry. We have designed and tested a ground-based noise waveform SAR based on noise radar technology [1] and synthetic aperture antennas [2], which enabled us to build an instrument for precise, all-weather monitoring of large objects in real time. We describe the main performance characteristics of the ground-based interferometric SAR, which uses a continuous Ka-band noise waveform as the probe signal. Results of laboratory trials and an evaluation of its main performance characteristics are presented as well.
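
    The ranging principle behind a noise waveform radar can be illustrated in a few lines: correlate the received echo against a replica of the transmitted random signal and read the delay off the correlation peak. The sample rate, delay, and noise level below are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 1e9                                  # 1 GS/s sampling (illustrative)
tx = rng.standard_normal(10_000)          # transmitted noise waveform
delay = 333                               # samples (~50 m two-way at c)
rx = np.roll(tx, delay) + 0.5 * rng.standard_normal(tx.size)  # noisy echo

corr = np.correlate(rx, tx, mode="full")  # matched filtering
lag = corr.argmax() - (tx.size - 1)       # delay estimate in samples
c = 3e8
print(f"estimated range: {lag / fs * c / 2:.1f} m")   # ~50 m
```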

  19. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification.

    Science.gov (United States)

    Sager, Jennifer E; Yu, Jingjing; Ragueneau-Majlessi, Isabelle; Isoherranen, Nina

    2015-11-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design, and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms "PBPK" and "physiologically based pharmacokinetic model" to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for best-practice guidelines.

  20. SU-E-J-16: Automatic Image Contrast Enhancement Based On Automatic Parameter Optimization for Radiation Therapy Setup Verification

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, J [Taishan Medical University, Taian, Shandong (China); Washington University in St Louis, St Louis, MO (United States); Li, H. Harlod; Zhang, T; Yang, D [Washington University in St Louis, St Louis, MO (United States); Ma, F [Taishan Medical University, Taian, Shandong (China)

    2015-06-15

    Purpose: In 2D patient setup images for radiation therapy (RT), tissues often cannot be seen well due to a lack of image contrast. Contrast enhancement features provided by image reviewing software, e.g. Mosaiq and ARIA, require manual selection of image processing filters and parameters, which is inefficient and cannot be automated. In this work, we developed a novel method to automatically enhance 2D RT image contrast to allow automatic verification of patient daily setups as a prerequisite step of automatic patient safety assurance. Methods: The new method is based on the contrast limited adaptive histogram equalization (CLAHE) and high-pass filtering algorithms. The most important innovation is to automatically select the optimal parameters by optimizing the image contrast. The image processing procedure includes the following steps: 1) background and noise removal, 2) high-pass filtering by subtracting the Gaussian-smoothed image, and 3) histogram equalization using the CLAHE algorithm. Three parameters were determined through an iterative optimization based on the interior-point constrained optimization algorithm: the Gaussian smoothing weighting factor, and the CLAHE algorithm block size and clip-limit parameters. The goal of the optimization is to maximize the entropy of the processed image. Results: A total of 42 RT images were processed. The results were visually evaluated by RT physicians and physicists. About 48% of the images processed by the new method were ranked as excellent. In comparison, only 29% and 18% of the images processed by the basic CLAHE algorithm and by basic window-level adjustment, respectively, were ranked as excellent. Conclusion: This new image contrast enhancement method is robust and automatic, and significantly outperforms the basic CLAHE algorithm and the manual window-level adjustment process currently used in clinical 2D image review software tools.
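
    A minimal sketch of this pipeline is shown below, with a coarse grid search standing in for the paper's interior-point optimization and the background/noise-removal step omitted; the parameter ranges are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import exposure, measure

def enhance(img, w, clip, block):
    """High-pass (subtract weighted Gaussian-smoothed image), then CLAHE."""
    hipass = np.clip(img - w * gaussian_filter(img, sigma=20.0), 0.0, 1.0)
    return exposure.equalize_adapthist(hipass, kernel_size=block,
                                       clip_limit=clip)

def auto_enhance(img):
    best, best_h = None, -np.inf
    for w in (0.2, 0.5, 0.8):             # Gaussian-subtraction weight
        for clip in (0.01, 0.02, 0.05):   # CLAHE clip limit
            for block in (32, 64, 128):   # CLAHE tile size [pixels]
                out = enhance(img, w, clip, block)
                h = measure.shannon_entropy(out)  # objective: image entropy
                if h > best_h:
                    best, best_h = out, h
    return best

rng = np.random.default_rng(1)
img = np.clip(rng.normal(0.4, 0.05, (256, 256)), 0, 1)  # stand-in image
print(auto_enhance(img).shape)
```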

  1. Model-Based Verification and Validation of the SMAP Uplink Processes

    Science.gov (United States)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  2. Verification of a characterization method of the laser-induced selective activation based on industrial lasers

    DEFF Research Database (Denmark)

    Zhang, Yang; Hansen, Hans Nørgaard; Tang, Peter T.

    2013-01-01

    In this article, laser-induced selective activation (LISA) for subsequent autocatalytic copper plating is performed by several types of industrial-scale lasers, including a Nd:YAG laser, a UV laser, a fiber laser, a green laser, and a short-pulsed laser. Based on analysis of all the laser-machine...

  3. Analysis of English Complex Sentences based on Figure-Ground Theory

    Institute of Scientific and Technical Information of China (English)

    侯皓

    2015-01-01

    English is a language featuring complex sentences composed of main and subordinate clauses. The subordinate clause conveys the unfinished message of the main clause and can become quite complicated. The complex sentence is an important sentence type and is also important in English teaching. Analyzing complex sentences based on Figure-Ground Theory, especially adverbial clauses, is helpful for learning English and for translation. The Figure-Ground Theory originated in psychology studies and was introduced into cognitive linguistics to explain some language phenomena. From the Figure-Ground perspective, this essay studies attributive clauses, adverbial clauses, and nominal clauses; some critical sentence types are analyzed carefully, and the major finding is that the Figure-Ground relation is dynamic, not static.

  4. Novel identification strategy for ground coffee adulteration based on UPLC-HRMS oligosaccharide profiling.

    Science.gov (United States)

    Cai, Tie; Ting, Hu; Jin-Lan, Zhang

    2016-01-01

    Coffee is one of the most common and most valuable beverages. According to International Coffee Organization (ICO) reports, the adulteration of coffee for financial gain is regarded as the most serious threat to the sustainable development of the coffee market. In this work, a novel strategy for identifying adulteration in ground coffee was developed based on UPLC-HRMS oligosaccharide profiling. Together with integrated statistical analysis, 17 oligosaccharide compositions were identified as markers for detecting soybean and rice in ground coffee. This strategy, validated with manually prepared mixtures, improved both the reliability and the authority of adulteration identification. Rice and soybean adulterants present in ground coffee in amounts as low as 5% were identified and evaluated. Some commercial ground coffees were also successfully tested using this strategy.

  5. Coastal wind study based on Sentinel-1 and ground-based scanning lidar

    DEFF Research Database (Denmark)

    Ahsbahs, Tobias Torben; Badger, Merete; Pena Diaz, Alfredo

    … the project "Reducing the Uncertainty of Near-shore Energy estimates from meso- and micro-scale wind models" (RUNE) was established. The lidar measurement campaign started in November 2015 and ended in February 2016 at the Danish North Sea coast at around 56.5°N, 8.2°E. 107 satellite SAR scenes were collected … fields from the Sentinel-1A satellite using APL/NOAA's SAROPS system with GFS model wind directions as input. For the presented cases CMOD5.n is used. Ground-based scanning lidar located on land can also cover near-shore areas. In order to improve wind farm planning for near-shore coastal areas …

  6. Dust optical properties retrieved from ground-based polarimetric measurements.

    Science.gov (United States)

    Li, Zhengqiang; Goloub, Philippe; Blarel, Luc; Damiri, Bahaiddin; Podvin, Thierry; Jankowiak, Isabelle

    2007-03-20

    We have systematically processed one year of sunphotometer measurements (recorded at five AERONET/PHOTONS sites in Africa) in order to assess mineral dust optical properties with the use of a new polarimetry-based algorithm. We consider the Cimel CE318 polarized sunphotometer version to obtain single-scattering albedo, scattering phase matrix elements F(11) and F(12) for dust aerosols selected with Angström exponents ranging from -0.05 to 0.25. Retrieved F(11) and F(12) differ significantly from those of spherical particles. The degree of linear polarization -F(12)/F(11) for single scattering of atmospheric total column dust aerosols in the case of unpolarized incident light is systematically retrieved for the first time to our knowledge from sunphotometer measurements and shows consistency with previous laboratory characterizations of nonspherical particles.

  7. Wind-farm power performance verification

    Energy Technology Data Exchange (ETDEWEB)

    Dutilleux, P. [DEWI German Wind Energy Institute, Wilhelmshaven (Germany)

    2005-07-01

    Details of wind farm power performance verification procedures were presented. Verifications were performed at the DEWI test site at Wilhelmshaven, Germany. Types of power performance guarantees included power performance of individual turbines with IEC verification measurements, and nacelle anemometer verification. In addition, availability guarantees were examined, as well as issues concerning energy production guarantees of complete wind farms in relation to nearby meteorological masts. An evaluation of power curve verification measurements was presented, as well as measurement procedures relating to IEC standards. Methods for nacelle anemometry verification included calibration of the anemometer; documentation of its exact position and signal chain; and end-to-end calibration from sensor to SCADA database. Classification of anemometers covered the impact of dynamic effects and the influence on annual energy production. An example project for performance verification of a wind farm with 9 identical wind turbines was presented. The significance of status signals was discussed, as well as alternative methods for power-curve measurements. Evaluation procedures for energy yield and power curve verifications were presented, and the upcoming set of IEC standards concerning power curve measurements was discussed. Various alternative verification procedures for wind farm power performance were reviewed. refs., tabs., figs.

  8. The SeaHorn Verification Framework

    Science.gov (United States)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  9. Quantitative Verification of a Force-based Model for Pedestrian Dynamics

    CERN Document Server

    Chraibi, Mohcine; Schadschneider, Andreas; Mackens, Wolfgang

    2009-01-01

    This paper introduces a spatially continuous force-based model for simulating pedestrian dynamics. The main intention of this work is the quantitative description of pedestrian movement through bottlenecks and in corridors. Measurements of flow and density at bottlenecks will be presented and compared with empirical data. Furthermore the fundamental diagram for the movement in a corridor is reproduced. The results of the proposed model show a good agreement with empirical data.
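
    To give a flavor of the model class, the sketch below implements a generic force-based update (a driving term toward the desired velocity plus exponential inter-pedestrian repulsion); the parameters are illustrative and not the calibrated model of this paper.

```python
import numpy as np

def step(pos, vel, goal, dt=0.05, v0=1.34, tau=0.5, a=2.0, b=0.3):
    """One explicit integration step for unit-mass pedestrians."""
    n = len(pos)
    e = goal - pos
    e /= np.linalg.norm(e, axis=1, keepdims=True)   # desired directions
    force = (v0 * e - vel) / tau                    # driving term
    for i in range(n):                              # pairwise repulsion
        d = pos[i] - np.delete(pos, i, axis=0)
        r = np.linalg.norm(d, axis=1, keepdims=True)
        force[i] += (a * np.exp(-r / b) * d / r).sum(axis=0)
    vel = vel + dt * force
    return pos + dt * vel, vel

pos = np.array([[0.0, 0.0], [4.0, 0.1]])
vel = np.zeros((2, 2))
goal = np.array([[10.0, 0.0], [-6.0, 0.0]])         # counter-flow pair
for _ in range(100):
    pos, vel = step(pos, vel, goal)
print(np.round(pos, 2))                             # the two have passed laterally
```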

  10. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    Science.gov (United States)

    2003-03-01

    Index in Ankylosing Spondylitis," Proc. of the 5th World Conf. on Medical Informatics (MEDINFO-89), B. Barber, C. Dexian, Q. Dulie & G. Wagner, eds. … "Evaluation and Validation of a Functional Index in Ankylosing Spondylitis," Proc. of the 5th World Conf. on Medical Informatics (MEDINFO-89), B. … Webster, L., "The Development and Validation of an Exercise Countermeasure Protocol Management Expert System Based on Authentic Methods of Reasoning

  11. Analysis-Based Verification: A Programmer-Oriented Approach to the Assurance of Mechanical Program Properties

    Science.gov (United States)

    2010-05-27

    Metamathematics. D. Van Nostrand Company, 1950. [71] Glenn E. Krasner and Stephen T. Pope. A cookbook for using the model-view-controller user interface … Software-Practice and Experience, 9(4):255–65, 1979. [45] Cormac Flanagan and Stephen N. Freund. Type-based race detection for Java. In 2000 Conference on … Programming Language Design and Implementation (PLDI'00), pages 219–232. ACM Press, 2000. [46] Cormac Flanagan and Stephen N. Freund. Detecting race

  12. Simulink based behavioural modelling of a pulse oximeter for deployment in rapid development, prototyping and verification.

    Science.gov (United States)

    Shokouhian, M; Morling, R C S; Kale, I

    2012-01-01

    The pulse oximeter is a well-known device for measuring the level of oxygen in blood. Since their invention, pulse oximeters have been under constant development in both hardware and software; however, there are still unsolved problems that limit their performance [6], [7]. Many new algorithms and design techniques are suggested every year by industry and academic researchers, claiming to improve measurement accuracy [8], [9]. In the absence of an accurate computer-based behavioural model for pulse oximeters, the only way to evaluate these newly developed systems and algorithms is through hardware implementation, which can be both expensive and time consuming. This paper presents an accurate Simulink-based behavioural model for a pulse oximeter that can be used by industry and academia alike as an exploration and productivity-enhancement tool during the research and development process. The aim of this paper is to introduce a new computer-based behavioural model which provides a simulation environment in which new ideas can be rapidly evaluated long before real implementation.
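
    The core computation such a behavioural model has to reproduce can be sketched in a few lines: the ratio of ratios of the red and infrared photoplethysmograms mapped through an empirical calibration. The linear map below is a textbook approximation, not any device's calibration curve.

```python
import numpy as np

def spo2_ratio_of_ratios(red, ir):
    """red, ir: 1-D PPG waveforms over a few pulse periods."""
    r = ((red.max() - red.min()) / red.mean()) / \
        ((ir.max() - ir.min()) / ir.mean())   # (AC/DC)_red / (AC/DC)_ir
    return 110.0 - 25.0 * r                   # empirical linear map, percent

t = np.linspace(0, 5, 500)                    # 5 s at 100 Hz
pulse = 0.02 * np.sin(2 * np.pi * 1.2 * t)    # 72 bpm cardiac component
red = 1.00 + 0.5 * pulse                      # weaker red pulsation
ir = 1.00 + 1.0 * pulse
print(f"SpO2 ~ {spo2_ratio_of_ratios(red, ir):.1f} %")   # ~97.5 %
```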

  13. Mechatronics design and experimental verification of an electric-vehicle-based hybrid thermal management system

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Hung

    2016-02-01

    In this study, an electric-vehicle-based thermal management system was designed for dual energy sources. An experimental platform developed in a previous study was modified. On the mechanical side, a heat exchanger with a radiator, a proportional valve, coolant pipes, and a coolant pump were integrated. On the electrical side, two heaters emulating waste heat were controlled using two programmable power supplies. A rapid-prototyping controller with two temperature inputs and three outputs was designed, and rule-based control strategies were coded to maintain optimal temperatures for the emulated proton exchange membrane fuel cells and lithium batteries. To evaluate the heat power of the dual energy sources, driving cycles, energy management control, and efficiency maps of the energy sources were considered to derive time-variant values. The main results are as follows: (a) an advanced mechatronics platform was constructed; (b) a driving cycle simulation was successfully conducted; and (c) coolant temperatures reached their optimal operating ranges when the proportional valve, radiator, and coolant pump were sequentially controlled. The benefits of this novel electric-vehicle-based thermal management system are (a) high-efficiency operation of the energy sources, (b) low occupied volume when integrated with the energy sources, and (c) longer electric-vehicle driving range. This system will be integrated with real energy sources and a real electric vehicle in the future.
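
    The rule-based layer can be sketched as a simple temperature-to-actuator mapping, as below; the set-points, bands, and output names are invented for illustration and are not the paper's tuned values.

```python
def thermal_rules(t_fc: float, t_batt: float) -> dict:
    """t_fc: fuel-cell coolant temp [C]; t_batt: battery coolant temp [C]."""
    cmd = {"valve_to_radiator": 0.0, "fan_on": False, "pump_duty": 0.3}
    if t_fc > 65.0:                       # fuel cell above its optimal band
        cmd["valve_to_radiator"] = min(1.0, (t_fc - 65.0) / 10.0)
        cmd["pump_duty"] = 0.8
    if t_fc > 72.0 or t_batt > 40.0:      # hard limits: force air cooling
        cmd["fan_on"] = True
        cmd["pump_duty"] = 1.0
    if t_batt < 15.0:                     # cold battery: keep heat in the loop
        cmd["valve_to_radiator"] = 0.0
    return cmd

print(thermal_rules(t_fc=70.0, t_batt=35.0))
# {'valve_to_radiator': 0.5, 'fan_on': False, 'pump_duty': 0.8}
```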

  14. Analysis of the substorm trigger phase using multiple ground-based instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Kauristie, K.; Pulkkinen, T.I.; Pellinen, R.J. [Finnish Meteorological Institute, Helsinki (Finland)] [and others

    1995-08-01

    The authors discuss in detail an observed event of auroral activity fading during the trigger, or growth, phase of a magnetospheric substorm. This event was observed by all-sky cameras, EISCAT radar, magnetometers, riometers, and pulsation magnetometers at ground-based stations in Finland and Scandinavia. Based on their detailed analysis, the authors present a possible cause for the observed fading.

  15. Plans of a test bed for ionospheric modelling based on Fennoscandian ground-based instrumentation

    Science.gov (United States)

    Kauristie, Kirsti; Kero, Antti; Verronen, Pekka T.; Aikio, Anita; Vierinen, Juha; Lehtinen, Markku; Turunen, Esa; Pulkkinen, Tuija; Virtanen, Ilkka; Norberg, Johannes; Vanhamäki, Heikki; Kallio, Esa; Kestilä, Antti; Partamies, Noora; Syrjäsuo, Mikko

    2016-07-01

    One of the recommendations for teaming among research groups in the COSPAR/ILWS roadmap concerns building test beds in which coordinated observing supports model development. In this presentation we describe a test bed initiative supporting research on ionosphere-thermosphere-magnetosphere interactions. The EISCAT incoherent scatter radars, with their future extension EISCAT3D, form the backbone of the proposed system. The EISCAT radars are surrounded by versatile and dense arrays of ground-based instrumentation: magnetometers and auroral cameras (the MIRACLE and IMAGE networks), ionospheric tomography receivers (the TomoScand network), and other novel technology for upper atmospheric probing with radio waves (e.g. the KAIRA facility, riometers, and the ionosonde maintained by the Sodankylä Geophysical Observatory). As a new opening, close coordination with the Finnish national cubesat program is planned. We will investigate opportunities to establish a cost-efficient nanosatellite program which would support the ground-based observations in a systematic and persistent manner. First experiences will be gathered with the Aalto-1 and Aalto-2 satellites, the latter of which will be the Finnish contribution to the international QB50 mission. We envisage close collaboration also in the development of data analysis tools, with the goal of integrating routines and models from different research groups into one system where the different elements support each other. In the longer run we aim for an observationally guided modelling framework that gives a holistic description of ionosphere-thermosphere processes and thus enables reliable forecasts of upper atmospheric space weather activity.

  16. Comparing Aerosol Retrievals from Ground-Based Instruments at the Impact-Pm Field Campaign

    Science.gov (United States)

    Kupinski, M.; Bradley, C. L.; Kalashnikova, O. V.; Xu, F.; Diner, D. J.; Clements, C. B.; Camacho, C.

    2016-12-01

    Detection of aerosol types, i.e. components having different sizes and chemical compositions, over urban areas is important for understanding their impact on health and climate. In particular, sustained exposure to size-differentiated airborne particulate matter (PM10 and PM2.5) can lead to adverse health effects such as asthma attacks, heart and lung diseases, and premature mortality. Multi-angular polarimetric measurements have been advocated in recent years as an additional tool to better understand and retrieve the aerosol properties needed for improved predictions of aerosol impacts on air quality and climate. We deployed the ground-based Multiangle SpectroPolarimetric Imager (GroundMSPI) for accurate spectropolarimetric and radiance measurements, co-located with the AERONET CIMEL sun photometer and a Halo Doppler 18 m resolution lidar from San José State University, at the Garland-Fresno Air Quality supersite in Fresno, CA on July 7 during the Imaging Polarimetric Assessment and Characterization of Tropospheric Particulate Matter (ImPACT-PM) field experiment. GroundMSPI sampled the atmospheric scattering phase function in and 90 degrees out of the principal plane every 15 minutes in an automated manner, utilizing a 2-axis gimbal mount in elevation and azimuth. The goal of this work is to verify the atmospheric measurements of GroundMSPI against the coincident CIMEL sun photometer and ground-based lidar. Diffuse-sky radiance measurements from GroundMSPI are compared with the CIMEL sun photometer throughout the day. AERONET aerosol parameters such as size, shape, and index of refraction, as well as lidar aerosol extinction profiles, will be used in a forward radiative transfer model to compare with GroundMSPI observations and to optimize these parameters to best match the GroundMSPI data.

  17. Tracing ground water input to base flow using sulfate (S, O) isotopes.

    Science.gov (United States)

    Gu, Ailiang; Gray, Floyd; Eastoe, Christopher J; Norman, Laura M; Duarte, Oscar; Long, Austin

    2008-01-01

    Sulfate (S and O) isotopes used in conjunction with sulfate concentration provide a tracer for ground water contributions to base flow. They are particularly useful in areas where rock sources of contrasting S isotope character are juxtaposed, where water chemistry or H and O isotopes fail to distinguish water sources, and in arid areas where rain water contributions to base flow are minimal. Sonoita Creek basin in southern Arizona, where evaporite and igneous sources of sulfur are commonly juxtaposed, serves as an example. Base flow in Sonoita Creek is a mixture of three ground water sources: A, basin ground water with sulfate resembling that from Permian evaporite; B, ground water from the Patagonia Mountains; and C, ground water associated with Temporal Gulch. B and C contain sulfate like that of acid rock drainage in the region but differ in sulfate content. Source A contributes 50% to 70%, with the remainder equally divided between B and C during the base flow seasons. The proportion of B generally increases downstream. The proportion of A is greatest under drought conditions.

  18. Tracing ground water input to base flow using sulfate (S, O) isotopes

    Science.gov (United States)

    Gu, A.; Gray, F.; Eastoe, C.J.; Norman, L.M.; Duarte, O.; Long, A.

    2008-01-01

    Sulfate (S and O) isotopes used in conjunction with sulfate concentration provide a tracer for ground water contributions to base flow. They are particularly useful in areas where rock sources of contrasting S isotope character are juxtaposed, where water chemistry or H and O isotopes fail to distinguish water sources, and in arid areas where rain water contributions to base flow are minimal. Sonoita Creek basin in southern Arizona, where evaporite and igneous sources of sulfur are commonly juxtaposed, serves as an example. Base flow in Sonoita Creek is a mixture of three ground water sources: A, basin ground water with sulfate resembling that from Permian evaporite; B, ground water from the Patagonia Mountains; and C, ground water associated with Temporal Gulch. B and C contain sulfate like that of acid rock drainage in the region but differ in sulfate content. Source A contributes 50% to 70%, with the remainder equally divided between B and C during the base flow seasons. The proportion of B generally increases downstream. The proportion of A is greatest under drought conditions.
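
    The three-end-member apportionment described here reduces to a small linear system: balances on water, sulfate, and sulfate-weighted isotope ratio in the three unknown fractions. The sketch below uses invented end-member values, not the Sonoita Creek data.

```python
import numpy as np

C = np.array([120.0, 40.0, 15.0])        # SO4 conc. of sources A, B, C [mg/L]
d = np.array([12.0, 2.0, 3.5])           # d34S of source sulfate [permil]
c_mix, d_mix = 83.0, 10.73               # observed base-flow values (invented)

A = np.vstack([np.ones(3),               # f_A + f_B + f_C = 1 (water balance)
               C,                        # sulfate mass balance
               C * d])                   # concentration-weighted isotope balance
b = np.array([1.0, c_mix, c_mix * d_mix])
f = np.linalg.solve(A, b)
print(dict(zip("ABC", np.round(f, 3))))  # ~{'A': 0.6, 'B': 0.2, 'C': 0.2}
```

    Fractions falling outside [0, 1] would signal that the chosen end members cannot explain the mixture, which is part of why contrasting sulfur sources make the tracer useful.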

  19. Adaptive beamlet-based finite-size pencil beam dose calculation for independent verification of IMRT and VMAT

    Energy Technology Data Exchange (ETDEWEB)

    Park, Justin C.; Li, Jonathan G.; Arhjoul, Lahcen; Yan, Guanghua; Lu, Bo; Fan, Qiyong; Liu, Chihray, E-mail: liucr@ufl.edu [Department of Radiation Oncology, University of Florida, Gainesville, Florida 32610-0385 (United States)

    2015-04-15

    Purpose: The use of sophisticated dose calculation procedures in modern radiation therapy treatment planning is inevitable in order to account for complex treatment fields created by multileaf collimators (MLCs). As a consequence, independent volumetric dose verification is time consuming, which affects the efficiency of the clinical workflow. In this study, the authors present an efficient adaptive beamlet-based finite-size pencil beam (AB-FSPB) dose calculation algorithm that minimizes the computational procedure while preserving accuracy. Methods: The computational time of the finite-size pencil beam (FSPB) algorithm is proportional to the number of infinitesimal and identical beamlets that constitute an arbitrary field shape. In AB-FSPB, the dose distribution from each beamlet is mathematically modeled such that the sizes of the beamlets representing an arbitrary field shape no longer need to be infinitesimal nor identical. As a result, it is possible to represent an arbitrary field shape with a combination of different-sized and a minimal number of beamlets. In addition, the authors included model parameters to account for the rounded MLC leaf edge and MLC transmission. Results: The root mean square error (RMSE) between the treatment planning system and conventional FSPB on a 10 × 10 cm² square field using 10 × 10, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes was 4.90%, 3.19%, and 2.87%, respectively, compared with RMSEs of 1.10%, 1.11%, and 1.14% for AB-FSPB. This finding holds true for a larger square field size of 25 × 25 cm², where the RMSEs for 25 × 25, 2.5 × 2.5, and 0.5 × 0.5 cm² beamlet sizes were 5.41%, 4.76%, and 3.54% for FSPB, respectively, compared with RMSEs of 0.86%, 0.83%, and 0.88% for AB-FSPB. It was found that AB-FSPB could successfully account for the MLC transmission without major discrepancy. The algorithm was also graphical processing unit (GPU) compatible to maximize its computational speed. For an intensity modulated radiation therapy (
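
    The key idea, beamlets of arbitrary size whose lateral dose is modelled analytically so that a field can be tiled by a few large strips plus fine strips at the edges, can be sketched in 1-D as below; the Gaussian kernel width and field decomposition are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.special import erf

def strip_dose(x, left, right, sigma=3.0):
    """Analytic lateral dose of one finite beamlet spanning [left, right] mm
    under a Gaussian pencil-beam kernel (erf closed form)."""
    return 0.5 * (erf((x - left) / (np.sqrt(2) * sigma))
                  - erf((x - right) / (np.sqrt(2) * sigma)))

x = np.linspace(-80, 80, 1601)
# A 100 mm field from 3 adaptive beamlets instead of hundreds of tiny ones:
adaptive = [(-50, -45), (-45, 45), (45, 50)]     # fine strips only at the edges
dose = sum(strip_dose(x, l, r) for l, r in adaptive)
print(f"central dose {dose[800]:.3f}, 50%-width "
      f"{x[dose >= 0.5][-1] - x[dose >= 0.5][0]:.1f} mm")
```

    Because the strip responses are analytic, adjacent strips of any width sum consistently, which is what removes the need for infinitesimal, identical beamlets.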

  20. A framework for recovery-oriented, COTS-based ground station networks

    Science.gov (United States)

    Cutler, James William

    The complexity of space communication has limited our access to space systems and kept mission operations costs high. Ultimately, this results in reduced mission capabilities and yields. In particular, ground stations, the access point between space and terrestrial networks, suffer from monolithic designs, narrow interfaces, and unreliability that raise significant financial barriers for low-cost, experimental satellite missions. This research reduces these barriers by developing technology for recovery-oriented, flexible access networks built from commercial-off-the-shelf (COTS) components. Based on our extensive small satellite experiences, we decomposed ground station services and captured them in an extensible framework that simplified reuse of ground station services and improved portability across heterogeneous installations. This capability, combined with selective customization through virtual machine technology, allowed us to deliver "just in time" ground stations for QuakeSat-1 at a fraction of the price of current commodity solutions. This decomposition is also informed by principles of robust system design. Thus, our ground station reference implementation called Mercury was a candidate for recursive recovery (RR), a high availability technique whose effectiveness in reducing recovery time has been demonstrated on research prototypes of Internet server systems. Augmenting Mercury to implement RR reduced recovery time of typical ground station software failures by a factor of four, dropping recovery time to within the "window of recovery" and effectively eliminating the adverse effects of these failures. Since the time of failures cannot be predicted, RR allowed us to mitigate the effects of the failures and greatly reduce their potential impact on ground station operations. Our ground station architecture harnessed the benefits of COTS components, including rapid prototyping and deployment, while overcoming the challenges of COTS reliability and mission

  1. Verification of Emulated Channels in Multi-Probe Based MIMO OTA Testing Setup

    DEFF Research Database (Denmark)

    Fan, Wei; Carreño, Xavier; Nielsen, Jesper Ødum;

    2013-01-01

    Standardization work for MIMO OTA testing methods is currently ongoing, in which a multi-probe anechoic chamber based solution is an important candidate. In this paper, the probes located on an OTA ring are used to synthesize a plane wave field in the center of the OTA ring. This paper investigates the extent to which the synthesized plane wave can be approached in practical measurement systems. Both a single plane wave with a certain AoA and multiple plane waves with different AoAs and power weightings are synthesized and measured. Deviations of the measured plane wave and the simulated plane wave field …
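
    Plane-wave synthesis from a probe ring can be sketched as a least-squares problem for the complex probe weights, as below; the ring radius, probe count, frequency, and test-zone size are illustrative and do not correspond to a standardized setup.

```python
import numpy as np

c, f = 3e8, 2.45e9
k = 2 * np.pi * f / c
R, n_probes = 2.0, 16                                   # OTA ring (assumed)
phi = 2 * np.pi * np.arange(n_probes) / n_probes
probes = R * np.c_[np.cos(phi), np.sin(phi)]

g = np.linspace(-0.15, 0.15, 15)                        # 0.3 m test zone
zone = np.array([(x, y) for x in g for y in g
                 if x * x + y * y <= 0.15 ** 2])

aoa = np.radians(30.0)                                  # desired angle of arrival
k_vec = k * np.array([np.cos(aoa), np.sin(aoa)])
target = np.exp(-1j * zone @ k_vec)                     # ideal plane wave

d = np.linalg.norm(zone[:, None, :] - probes[None, :, :], axis=2)
A = np.exp(-1j * k * d) / d                             # probe-to-point transfer
w, *_ = np.linalg.lstsq(A, target, rcond=None)          # complex probe weights

err = np.linalg.norm(A @ w - target) / np.linalg.norm(target)
print(f"field synthesis NMSE: {20 * np.log10(err):.1f} dB")
```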

  2. Formal Verification of Safety Buffers for State-Based Conflict Detection and Resolution

    Science.gov (United States)

    Herencia-Zapana, Heber; Jeannin, Jean-Baptiste; Munoz, Cesar A.

    2010-01-01

    The information provided by global positioning systems is never totally exact, and there are always errors when measuring position and velocity of moving objects such as aircraft. This paper studies the effects of these errors in the actual separation of aircraft in the context of state-based conflict detection and resolution. Assuming that the state information is uncertain but that bounds on the errors are known, this paper provides an analytical definition of a safety buffer and sufficient conditions under which this buffer guarantees that actual conflicts are detected and solved. The results are presented as theorems, which were formally proven using a mechanical theorem prover.
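
    A toy version of the buffer argument for position error alone: if each reported position can be off by at most eps, then flagging a conflict whenever the reported separation is below D + 2*eps is guaranteed to catch every actual loss of separation below D. The paper's formally verified buffers also cover velocity error and time propagation; the numbers below are illustrative.

```python
import numpy as np

def conflict_with_buffer(p_own, p_intruder, D=9260.0, eps=100.0):
    """p_*: reported horizontal positions [m]; D: required separation
    (5 NM = 9260 m); eps: per-aircraft position error bound [m]."""
    sep = np.linalg.norm(np.asarray(p_own) - np.asarray(p_intruder))
    return sep < D + 2 * eps   # actual sep >= reported - 2*eps, so no miss

print(conflict_with_buffer([0.0, 0.0], [9300.0, 0.0]))   # True: inside buffer
print(conflict_with_buffer([0.0, 0.0], [9500.0, 0.0]))   # False
```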

  3. Refinement and verification in component-based model-driven design

    DEFF Research Database (Denmark)

    Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter

    2009-01-01

    Modern software development is complex as it has to deal with many different and yet related aspects of applications. In practical software engineering this is now handled by a UML-like modelling approach in which different aspects are modelled by different notations. Component-based and object… of Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may…

  4. Experimental verification of a broadband planar focusing antenna based on transformation optics

    Science.gov (United States)

    Lei Mei, Zhong; Bai, Jing; Cui, Tie Jun

    2011-06-01

    It is experimentally verified that a two-dimensional planar focusing antenna based on gradient-index metamaterials has a similar performance as that of its parabolic counterpart. The antenna is designed using quasi-conformal transformation optics, and is realized with non-resonant I-shaped metamaterial unit cells. It is shown that the antenna has a broad bandwidth and very low loss. Near-field distributions of the antenna are measured and far-field radiation patterns are calculated from the measured data, which have good agreement with the full-wave simulations. Using all-dielectric metamaterials, the design can be scaled down to find applications at optical frequencies.

  5. Experimental verification of a broadband planar focusing antenna based on transformation optics

    Energy Technology Data Exchange (ETDEWEB)

    Mei Zhonglei; Bai Jing [School of Information Science and Engineering, Lanzhou University, Lanzhou 730000 (China); Cui Tiejun, E-mail: meizl@lzu.edu.cn, E-mail: tjcui@seu.edu.cn [State Key Laboratory of Millimeter Waves, Department of Radio Engineering, Southeast University, Nanjing 210096 (China)

    2011-06-15

    It is experimentally verified that a two-dimensional planar focusing antenna based on gradient-index metamaterials has a similar performance as that of its parabolic counterpart. The antenna is designed using quasi-conformal transformation optics, and is realized with non-resonant I-shaped metamaterial unit cells. It is shown that the antenna has a broad bandwidth and very low loss. Near-field distributions of the antenna are measured and far-field radiation patterns are calculated from the measured data, which have good agreement with the full-wave simulations. Using all-dielectric metamaterials, the design can be scaled down to find applications at optical frequencies.

  6. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  7. A New Method of Desired Gait Synthesis for Biped Walking Robot Based on Ground Reaction Force

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    A new method of desired gait synthesis for a biped walking robot based on the ground reaction force was proposed. The relation between the ground reaction force and joint motion is derived using the d'Alembert principle. In view of dynamic walking with high stability, the ZMP (Zero Moment Point) stability criterion must be considered in the desired gait synthesis. After that, the joint trajectories of the biped walking robot are decided by substituting the ground reaction force into the aforesaid relation based on the ZMP criterion. The trajectory of the desired ZMP is determined by a fuzzy logic based upon the body posture of the biped walking robot. The proposed scheme is simulated and experimented on a 10-degree-of-freedom biped walking robot. The results indicate that the proposed method is feasible.
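
    For illustration, the ZMP itself follows from a measured ground reaction wrench by the standard force-plate relation; the sketch below uses that textbook formula, with variable names of our choosing rather than the paper's derivation.

        def zmp_from_grf(Fx, Fy, Fz, Mx, My, d=0.0):
            """Zero Moment Point from a force-plate ground reaction wrench.

            (Fx, Fy, Fz) is the ground reaction force and (Mx, My) the moments
            about the plate origin; d is the height of the plate surface above
            that origin.  Standard force-plate relation, illustrative only.
            """
            if Fz <= 0:
                raise ValueError("foot not in contact (Fz must be positive)")
            px = (-My - Fx * d) / Fz
            py = ( Mx - Fy * d) / Fz
            return px, py

        # A dynamically balanced gait keeps (px, py) inside the support polygon.
        print(zmp_from_grf(Fx=5.0, Fy=-2.0, Fz=400.0, Mx=8.0, My=-12.0))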

  8. BigBOSS: The Ground-Based Stage IV BAO Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Schlegel, David; Bebek, Chris; Heetderks, Henry; Ho, Shirley; Lampton, Michael; Levi, Michael; Mostek, Nick; Padmanabhan, Nikhil; Perlmutter, Saul; Roe, Natalie; Sholl, Michael; Smoot, George; White, Martin; Dey, Arjun; Abraham, Tony; Jannuzi, Buell; Joyce, Dick; Liang, Ming; Merrill, Mike; Olsen, Knut; Salim, Samir

    2009-04-01

    The BigBOSS experiment is a proposed DOE-NSF Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with an all-sky galaxy redshift survey. The project is designed to unlock the mystery of dark energy using existing ground-based facilities operated by NOAO. A new 4000-fiber R=5000 spectrograph covering a 3-degree diameter field will measure BAO and redshift space distortions in the distribution of galaxies and hydrogen gas spanning redshifts from 0.2 < z < 3.5. The Dark Energy Task Force figure of merit (DETF FoM) for this experiment is expected to be equal to that of a JDEM mission for BAO with the lower risk and cost typical of a ground-based experiment.

  9. Comparing Dawn, Hubble Space Telescope, and Ground-Based Interpretations of (4) Vesta

    CERN Document Server

    Reddy, Vishnu; Corre, Lucille Le; Scully, Jennifer E C; Gaskell, Robert; Russell, Christopher T; Park, Ryan S; Nathues, Andreas; Raymond, Carol; Gaffey, Michael J; Sierks, Holger; Becker, Kris J; McFadden, Lucy A

    2013-01-01

    Observations of asteroid 4 Vesta by NASA's Dawn spacecraft are interesting because its surface has the largest range of albedo, color and composition of any asteroid visited by spacecraft to date. These hemispherical and rotational variations in surface brightness and composition have been attributed to impact processes since Vesta's formation. Prior to Dawn's arrival at Vesta, its surface properties were the focus of intense telescopic investigations for nearly a hundred years. Ground-based photometric and spectroscopic observations first revealed these variations, followed later by those using the Hubble Space Telescope. Here we compare interpretations of Vesta's rotation period, pole, albedo, topographic, color, and compositional properties from ground-based telescopes and HST with those from Dawn. Rotational spectral variations observed from ground-based studies are also consistent with those observed by Dawn. While the interpretation of some of these features was tenuous from past data, the interpretati...

  10. Experimental verification of color flow imaging based on wideband Doppler method.

    Science.gov (United States)

    Tanaka, Naohiko

    2014-01-01

    The purpose of this study is to eliminate the aliasing in color flow imaging. The wideband Doppler method is applied to generate a color flow image, and the validity of the method is experimentally confirmed. The single beam experiment is carried out to confirm the velocity estimation based on the wideband Doppler method. The echo data for the conventional pulsed Doppler method and the wideband Doppler method are obtained using a flow model, and the estimated velocity for each method is compared. The color flow images for each method are also generated using several types of flow model. The generated images are compared, and the characteristics of the imaging based on the wideband Doppler method are discussed. The high velocity beyond the Nyquist limit is successfully estimated by the wideband Doppler method, and the availability in low velocity estimation is also confirmed. The aliasing in color flow images is eliminated, and the generated images show the significance of the elimination of the aliasing in the flow imaging. The aliasing in color flow imaging can be eliminated by the wideband Doppler method. This technique is useful for the exact understanding of blood flow dynamics.
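
    The aliasing that the wideband method removes is set by the Nyquist limit of conventional pulsed Doppler, v_nyq = c·PRF/(4·f0). A small sketch (values illustrative, assuming ultrasound in tissue) shows how a velocity beyond this limit wraps around:

        def nyquist_velocity(prf_hz: float, f0_hz: float, c: float = 1540.0) -> float:
            """Nyquist (aliasing) limit of a conventional pulsed Doppler system.

            c is the speed of sound in tissue [m/s]; all values are illustrative.
            """
            return c * prf_hz / (4.0 * f0_hz)

        def aliased(v_true: float, v_nyq: float) -> float:
            """Velocity a conventional estimator reports: wrapped into +/- v_nyq."""
            return (v_true + v_nyq) % (2.0 * v_nyq) - v_nyq

        v_nyq = nyquist_velocity(prf_hz=4000.0, f0_hz=5e6)   # ~0.31 m/s
        print(v_nyq, aliased(0.5, v_nyq))   # 0.5 m/s wraps to a negative velocity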

  11. Preliminary verification of instantaneous air temperature estimation for clear sky conditions based on SEBAL

    Science.gov (United States)

    Zhu, Shanyou; Zhou, Chuxuan; Zhang, Guixin; Zhang, Hailong; Hua, Junwei

    2017-02-01

    Spatially distributed near surface air temperature at the height of 2 m is an important input parameter for the land surface models. It is of great significance in both theoretical research and practical applications to retrieve instantaneous air temperature data from remote sensing observations. An approach based on the Surface Energy Balance Algorithm for Land (SEBAL) to retrieve air temperature under clear sky conditions is presented. Taking the meteorological measurement data at one station as the reference and remotely sensed data as the model input, the research estimates the air temperature by using an iterative computation. The method was applied to the area of Jiangsu province for nine scenes by using MODIS data products, as well as part of Fujian province, China based on four scenes of Landsat 8 imagery. Comparing the air temperature estimated from the proposed method with that of the meteorological station measurement, results show that the root mean square error is 1.7 and 2.6 °C at 1000 and 30 m spatial resolution respectively. Sensitivity analysis of influencing factors reveals that the estimation precision is most sensitive to land surface temperature. Research results indicate that the method has great potential for estimating instantaneous air temperature distribution under clear sky conditions.
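
    As a rough sketch of the iterative idea (not the exact SEBAL formulation, whose anchor-pixel calibration and Monin-Obukhov stability corrections are omitted), the bulk transfer relation H = rho·cp·(Ts - Ta)/rah can be inverted for Ta while the aerodynamic resistance is updated each pass; update_rah below is a hypothetical placeholder for that correction step.

        RHO_CP = 1.225 * 1004.0            # air density * specific heat [J m-3 K-1]

        def update_rah(rah, H):
            # placeholder stability correction (an assumption, not the SEBAL formula)
            return max(5.0, rah * (1.0 - 1e-4 * H))

        def estimate_ta(Ts, H, rah0=50.0, iters=20, tol=1e-3):
            """Ts: surface temperature [K]; H: sensible heat flux [W m-2]."""
            rah, Ta = rah0, Ts - 2.0
            for _ in range(iters):
                Ta_new = Ts - H * rah / RHO_CP
                rah = update_rah(rah, H)
                if abs(Ta_new - Ta) < tol:
                    break
                Ta = Ta_new
            return Ta

        print(estimate_ta(Ts=305.0, H=120.0))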

  12. High-precision ground-based photometry of exoplanets

    Directory of Open Access Journals (Sweden)

    de Mooij Ernst J.W.

    2013-04-01

    Full Text Available High-precision photometry of transiting exoplanet systems has contributed significantly to our understanding of the properties of their atmospheres. The best targets are the bright exoplanet systems, for which the high number of photons allows very high signal-to-noise ratios. Most current instruments are not optimised for these high-precision measurements: either they have a large read-out overhead to reduce the read noise and/or their field-of-view is limited, preventing simultaneous observations of both the target and a reference star. Recently we have proposed a new wide-field imager for the Observatoire du Mont-Mégantic optimised for these bright systems (PI: Jayawardhana). The instrument has a dual-beam design and a field-of-view of 17' by 17'. The cameras have a read-out time of 2 seconds, significantly reducing read-out overheads. Over the past years we have obtained significant experience with how to reach the high precision required for the characterisation of exoplanet atmospheres. Based on our experience we provide the following advice: Get the best calibrations possible. In the case of bad weather, characterise the instrument (e.g. non-linearity, dome flats, bias level); this is vital for better understanding of the science data. Observe the target for as long as possible; the out-of-transit baseline is as important as the transit/eclipse itself. A short baseline can lead to improperly corrected systematics and mis-estimation of the red noise. Keep everything (e.g. position on detector, exposure time) as stable as possible. Take care that the defocus is not too strong: for a large defocus, the contribution of the total flux from the sky background in the aperture could well exceed that of the target, resulting in very strict requirements on the precision at which the background is measured.

  13. Evaluation of Real-Time Ground-Based GPS Meteorology

    Science.gov (United States)

    Fang, P.; Bock, Y.; Gutman, S.

    2003-04-01

    We demonstrate and evaluate a system to estimate zenith tropospheric delays in real time (5-10 minute latency) based on the technique of instantaneous GPS positioning as described by Bock et al. [2000] using data from the Orange County Real Time GPS Network. OCRTN is an upgrade of a sub-network of SCIGN sites in southern California to low latency (1-2 sec), high-rate (1 Hz) data streaming. Currently, ten sites are streaming data (Ashtech binary MBEN format) by means of dedicated, point-to-point radio modems to a network hub that translates the asynchronous serial data to TCP/IP and onto a PC workstation residing on a local area network. Software residing on the PC allows multiple clients to access the raw data simultaneously through TCP/IP. One of the clients is a Geodetics RTD server that receives and archives (1) the raw 1 Hz network data, (2) estimates of instantaneous positions and zenith tropospheric delays, and (3) RINEX data decimated to 30 seconds. The network is composed of ten sites. The distribution of nine of the sites approximates a right triangle with two 60 km legs, and a tenth site on Catalina Island a distance of about 50 km (over water) from the hypotenuse of the triangle. Relative zenith delays are estimated every second with a latency less than a second. Median values are computed at a user-specified interval (e.g., 10 minutes) with outliers greater than 4 times the interquartile range rejected. We compare the results with those generated by our operational system using the GAMIT software, with a latency of 30-60 minutes. Earlier results (from a similar network) comparing 30-minute median RTD values to GAMIT 30-minute estimates indicate that the two solutions differ by about 1 cm. We also describe our approach to determining absolute zenith delays. If an Internet connection is available we will present a real-time demonstration. [Bock, Y., R. Nikolaidis, P. J. de Jonge, and M. Bevis, Instantaneous resolution of crustal motion at medium
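
    The stated outlier rule (reject values more than 4 interquartile ranges from the median, then report the median) is easy to restate as a short sketch; the data here are synthetic:

        import numpy as np

        def robust_median(zenith_delays):
            """Median of 1 Hz zenith-delay estimates over one reporting interval,
            after discarding outliers more than 4 interquartile ranges from the
            median, mirroring the rejection rule described in the abstract."""
            zd = np.asarray(zenith_delays, dtype=float)
            q1, med, q3 = np.percentile(zd, [25, 50, 75])
            iqr = q3 - q1
            kept = zd[np.abs(zd - med) <= 4.0 * iqr]
            return float(np.median(kept))

        # e.g. 10 minutes of 1 Hz relative zenith delays [m] with one bad epoch
        samples = np.random.normal(2.35, 0.01, 600)
        samples[100] = 3.5
        print(robust_median(samples))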

  14. Verification of a laboratory-based dilation model for in situ conditions using continuum models

    Institute of Scientific and Technical Information of China (English)

    G. Walton; M.S. Diederichs; L.R. Alejano; J. Arzúa

    2014-01-01

    With respect to constitutive models for continuum modeling applications, the post-yield domain remains the area of greatest uncertainty. Recent studies based on laboratory testing have led to the development of a number of models for brittle rock dilation, which account for both the plastic shear strain and confining stress dependencies of this phenomenon. Although these models are useful in providing an improved understanding of how dilatancy evolves during a compression test, there has been relatively little work performed examining their validity for modeling brittle rock yield in situ. In this study, different constitutive models for rock dilation are reviewed and then tested, in the context of a number of case studies, using a continuum finite-difference approach (FLAC). The uncertainty associated with the modeling of brittle fracture localization is addressed, and the overall ability of mobilized dilation models to replicate in situ deformation measurements and yield patterns is evaluated.

  15. Analysis of the Properties of Adjoint Equations and Accuracy Verification of Adjoint Model Based on FVM

    Directory of Open Access Journals (Sweden)

    Yaodeng Chen

    2014-01-01

    Full Text Available There are two different approaches to formulating an adjoint numerical model (ANM). Aiming at the disputes arising from the construction methods of ANM, the differences between the nonlinear shallow water equation and its adjoint equation are analyzed, and the hyperbolicity and homogeneity of the adjoint equation are discussed. Then, based on unstructured meshes and the finite volume method, a new adjoint model is advanced by discretizing the adjoint equations directly. Using a gradient check, the correctness of the adjoint model is verified. The results of twin experiments to invert the bottom friction coefficient (Manning’s roughness coefficient) indicate that the adjoint model can extract the observation information and produce good-quality inversions. The reason for the disputes about construction methods of ANM is also discussed in the paper.
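
    A gradient check of the kind mentioned verifies that the adjoint-computed gradient g of the cost function J satisfies the Taylor relation (J(x + eps*d) - J(x)) / (eps * <g, d>) -> 1 as eps -> 0. The sketch below applies the test to a toy quadratic cost standing in for the shallow-water model:

        import numpy as np

        rng = np.random.default_rng(0)
        A, b = rng.normal(size=(5, 4)), rng.normal(size=5)
        J = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
        grad = lambda x: A.T @ (A @ x - b)      # what the adjoint model computes

        x, d = rng.normal(size=4), rng.normal(size=4)
        g = grad(x)
        for eps in [1e-1, 1e-3, 1e-5, 1e-7]:
            phi = (J(x + eps * d) - J(x)) / (eps * (g @ d))
            print(f"eps={eps:.0e}  phi={phi:.8f}")   # drifts toward 1.0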

  16. Security Policy Development: Towards a Life-Cycle and Logic-Based Verification Model

    Directory of Open Access Journals (Sweden)

    Luay A. Wahsheh

    2008-01-01

    Full Text Available Although security plays a major role in the design of software systems, security requirements and policies are usually added to an already existing system, not created in conjunction with the product. As a result, there are often numerous problems with the overall design. In this paper, we discuss the relationship between software engineering, security engineering, and policy engineering and present a security policy life-cycle: an engineering methodology for policy development in high assurance computer systems. The model provides system security managers with a procedural engineering process to develop security policies. We also present an executable Prolog-based model as a formal specification and knowledge representation method using a theorem prover to verify system correctness with respect to security policies in their life-cycle stages.

  17. Verification of simple illuminance based measures for indication of discomfort glare from windows

    DEFF Research Database (Denmark)

    Karlsen, Line Røseth; Heiselberg, Per Kvols; Bryn, Ida

    2015-01-01

    predictions of discomfort glare from windows already in the early design stage when decisions regarding the façade are taken. This study focuses on verifying whether simple illuminance-based measures like vertical illuminance at eye level or horizontal illuminance at the desk are correlated with the perceived glare...... confirm that there is a statistically significant correlation between both vertical eye illuminance and horizontal illuminance at the desk and the occupants’ perception of glare in a perimeter zone office environment, which is promising evidence towards utilizing such simple measures for indication...... of discomfort glare in early building design. Further, the observed response indicates that the participants in the present study were more tolerant to low illuminance levels and more sensitive to high illuminance levels than the DGPs model would predict. More and larger studies are needed to confirm or enfeeble......

  18. Verification of the IVA4 film boiling model with the data base of Liu and Theofanous

    Energy Technology Data Exchange (ETDEWEB)

    Kolev, N.I. [Siemens AG Unternehmensbereich KWU, Erlangen (Germany)

    1998-01-01

    Part 1 of this work presents a closed analytical solution for mixed-convection film boiling on vertical walls. Heat transfer coefficients predicted by the proposed model and experimental data obtained at the Royal Institute of Technology in Sweden by Okkonen et al. are compared. All data predicted are inside the ±10% error band, with the mean averaged error being below 4% using the slightly modified analytical solution. The solution obtained is recommended for practical applications. The method presented here is used in Part 2 as a guideline for developing a model for film boiling on spheres. The new semi-empirical film boiling model for spheres used in the IVA4 computer code is compared with the experimental data base obtained by Liu and Theofanous. The data are predicted within a ±30% error band. (author)

  19. Implementation and verification of different ECC mitigation designs for BRAMs in flash-based FPGAs

    CERN Document Server

    Yang, Zhenlei; Zhang, Zhangang; Liu, Jie; Su, Hong

    2015-01-01

    Embedded RAM blocks (BRAMs) in field programmable gate arrays (FPGAs) are susceptible to single event effects (SEEs) induced by environmental factors such as cosmic rays, heavy ions, alpha particles and so on. As technology scales, the issue will be more serious. In order to tackle this issue, two different error correcting codes (ECCs), the shortened Hamming codes and shortened BCH codes, are investigated in this paper. The concrete design methods of the codes are presented. Also, the codes are both implemented in flash-based FPGAs. Finally, the synthesis report and simulation results are presented in the paper. Moreover, the heavy-ion experiments are performed, the experimental results indicate that the error cross-section using the shortened Hamming codes can be reduced by two orders of magnitude compared with the device without mitigation, and no errors are discovered in the experiments for the device using the shortened BCH codes.
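
    For illustration, a single-error-correcting Hamming code over an 8-bit word works as sketched below; the bit layout is the textbook one with parity at positions 1, 2, 4 and 8, not necessarily the shortened layout implemented in the paper.

        P = [0, 1, 3, 7]                     # parity positions 1,2,4,8 (0-based)

        def encode(data_bits):
            cw = [0] * 12
            it = iter(data_bits)
            for i in range(12):
                if i not in P:
                    cw[i] = next(it)
            for p in P:
                cw[p] = sum(cw[i] for i in range(12) if (i + 1) & (p + 1)) % 2
            return cw

        def correct(cw):
            cw = cw[:]
            syndrome = 0
            for p in P:
                if sum(cw[i] for i in range(12) if (i + 1) & (p + 1)) % 2:
                    syndrome += p + 1
            if syndrome:                     # flip the single erroneous bit
                cw[syndrome - 1] ^= 1
            return cw

        word = [1, 0, 1, 1, 0, 0, 1, 0]
        cw = encode(word)
        cw[5] ^= 1                           # inject a single-event upset
        assert correct(cw) == encode(word)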

  20. Experimental Verification of Electric Drive Technologies Based on Artificial Intelligence Tools

    Science.gov (United States)

    Rubaai, Ahmed; Kankam, David (Technical Monitor)

    2003-01-01

    A laboratory implementation of a fuzzy logic-tracking controller using a low cost Motorola MC68HC11E9 microprocessor is described in this report. The objective is to design the most optimal yet practical controller that can be implemented and marketed, and which gives respectable performance, even when the system loads, inertia and parameters are varying. A distinguishing feature of this work is the by-product goal of developing a marketable, simple, functional and low cost controller. Additionally, real-time nonlinearities are not ignored, and a mathematical model is not required. A number of components have been designed, built and tested individually, and in various combinations of hardware and software segments. These components have been integrated with a brushless motor to constitute the drive system. A microprocessor-based FLC is incorporated to provide robust speed and position control. Design objectives that are difficult to express mathematically can be easily incorporated in a fuzzy logic-based controller by linguistic information (in the form of fuzzy IF-THEN rules). The theory and design are tested in the laboratory using a hardware setup. Several test cases have been conducted to confirm the effectiveness of the proposed controller. The results indicate excellent tracking performance for both speed and position trajectories. For the purpose of comparison, a bang-bang controller has been tested. The fuzzy logic controller performs significantly better than the traditional bang-bang controller. The bang-bang controller has been shown to be relatively inaccurate and lacking in robustness. Description of the implementation hardware system is also given.
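
    A minimal flavor of the fuzzy IF-THEN control idea follows: two triangular membership functions on the speed error and a weighted-average defuzzification. All rules and constants are invented for illustration and are not the controller of the report.

        def tri(x, a, b, c):
            """Triangular membership function with peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fuzzy_voltage(err):
            # Rule 1: IF error is NEGATIVE THEN decrease drive voltage
            # Rule 2: IF error is POSITIVE THEN increase drive voltage
            mu_neg = tri(err, -200.0, -100.0, 0.0)
            mu_pos = tri(err, 0.0, 100.0, 200.0)
            out_neg, out_pos = -12.0, 12.0        # consequent singletons [V]
            if mu_neg + mu_pos == 0.0:
                return 0.0
            return (mu_neg * out_neg + mu_pos * out_pos) / (mu_neg + mu_pos)

        print(fuzzy_voltage(err=60.0))   # speed error [rpm] -> drive correction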

  1. A hybrid framework for verification of satellite precipitation products

    Science.gov (United States)

    Li, J.; Hsu, K.; AghaKouchak, A.; Sorooshian, S.

    2011-12-01

    Advances in satellite technology have led to the development of many remote-sensing algorithms to estimate precipitation at quasi-global scales. A number of satellite precipitation products are provided at high spatial and temporal resolutions that are suitable for short-term hydrologic applications. Several coordinated validation activities have been established to evaluate the accuracy of satellite precipitation. Traditional verification measures summarize pixel-to-pixel differences between observation and estimates. Object-based verification methods, however, extend pixel based validation to address errors related to spatial patterns and storm structure, such as the shape, volume, and distribution of precipitation rain-objects. In this investigation, a 2D watershed segmentation technique is used to identify rain storm objects and is further adopted in a hybrid verification framework to diagnose the storm-scale rainfall objects from both satellite-based precipitation estimates and ground observations (radar estimates). Five key scores are identified in the object-based verification framework, including false alarm ratio, missing ratio, maximum of total interest, equal weight and weighted summation of total interest. These scores indicate the performance of satellite estimates with features extracted from the segmented storm objects. The proposed object-based verification framework was used to evaluate PERSIANN, PERSIANN-CCS, CMORPH, 3B42RT against NOAA stage IV MPE multi-sensor composite rain analysis. All estimates are evaluated at 0.25° × 0.25° daily scale in summer 2008 over the continental United States (CONUS). The five final scores for each precipitation product are compared with the median of maximum interest (MMI) of the Method for Object-Based Diagnostic Evaluation (MODE). The results show PERSIANN and CMORPH outperform 3B42RT and PERSIANN-CCS. Different satellite products presented distinct features of precipitation. For example, the sizes of
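
    Once storm objects have been segmented and matched, two of the five scores reduce to simple ratios of match counts. The sketch below assumes the matching step has already been done; the counts are invented for illustration.

        def false_alarm_ratio(n_matched_sat, n_sat_objects):
            """Fraction of satellite rain objects with no observed counterpart."""
            return 1.0 - n_matched_sat / n_sat_objects

        def missing_ratio(n_matched_obs, n_obs_objects):
            """Fraction of observed rain objects missed by the satellite product."""
            return 1.0 - n_matched_obs / n_obs_objects

        print(false_alarm_ratio(42, 50), missing_ratio(42, 60))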

  2. Using Verification Code with Opportunity Model Based on CAPTCHA

    Institute of Scientific and Technical Information of China (English)

    朴顺姬; 戴德伟

    2012-01-01

    To make rational use of challenge messages and curb the spread of worm viruses, mathematical modeling is used to build a CAPTCHA-based model of the best moment at which to issue a verification code. To offset the extra user burden and network load that CAPTCHA introduces, an improvement strategy is proposed: find an appropriate time to send the challenge message, which allows a qualitative analysis of the optimal moment for using verification codes.

  3. OGLE-2015-BLG-0196: Ground-based Gravitational Microlens Parallax Confirmed By Space-Based Observation

    CERN Document Server

    Han, C; Gould, A; Zhu, Wei; Szymański, M K; Soszyński, I; Skowron, J; Mróz, P; Poleski, R; Pietrukowicz, P; Kozłowski, S; Ulaczyk, K; Pawlak, M; Yee, J C; Beichman, C; Novati, S Calchi; Carey, S; Bryden, C; Fausnaugh, M; Gaudi, B S; Henderson, Calen B; Shvartzvald, Y; Wibking, B

    2016-01-01

    In this paper, we present the analysis of the binary gravitational microlensing event OGLE-2015-BLG-0196. The event lasted for almost a year and the light curve exhibited significant deviations from the lensing model based on the rectilinear lens-source relative motion, enabling us to measure the microlens parallax. The ground-based microlens parallax is confirmed by the data obtained from space-based microlens observations using the {\it Spitzer} telescope. By additionally measuring the angular Einstein radius from the analysis of the resolved caustic crossing, the physical parameters of the lens are determined up to the two-fold degeneracy: the $u_0>0$ and $u_0<0$ solutions caused by the well-known "ecliptic" degeneracy. It is found that the binary lens is composed of two M dwarf stars with similar masses $M_1=0.38\pm 0.04\ M_\odot$ ($0.50\pm 0.05\ M_\odot$) and $M_2=0.38\pm 0.04\ M_\odot$ ($0.55\pm 0.06\ M_\odot$) and the distance to the lens is $D_{\rm L}=2.77\pm 0.23$ kpc ($3.30\pm 0.29$ kpc). Here the physical parameter...

  4. First ground-based FTIR-observations of methane in the tropics

    Directory of Open Access Journals (Sweden)

    A. K. Petersen

    2010-02-01

    Full Text Available Total column concentrations and volume mixing ratio profiles of methane have been retrieved from ground-based solar absorption FTIR spectra in the near-infrared recorded in Paramaribo (Suriname). The methane FTIR observations are compared with TM5 model simulations and satellite observations from SCIAMACHY, and represent the first validation of SCIAMACHY retrievals in the tropics using ground-based remote sensing techniques. Apart from local biomass burning features, our methane FTIR observations agree well with the SCIAMACHY retrievals and TM5 model simulations.

  5. Extended lateral heating of the nighttime ionosphere by ground-based VLF transmitters

    OpenAIRE

    İnan, Umran Savaş; Graf, K. L.; Spasojevic, M.; Marshall, R. A.; Lehtinen, N. G.; Foust, F. R.

    2013-01-01

    Journal of Geophysical Research: Space Physics, Vol. 118, 7783–7797, doi:10.1002/2013JA019337, 2013. The effects of ground-based very low frequency (VLF) transmitters on the lower ionospher...

  6. A transit timing analysis with combined ground- and space-based photometry

    Directory of Open Access Journals (Sweden)

    Raetz St.

    2015-01-01

    The CoRoT satellite looks back on six years of high precision photometry of a very high number of stars. Thousands of transiting events are detected, from which 27 were confirmed to be transiting planets so far. In my research I search for and analyze TTVs in the CoRoT sample and combine the unprecedented precision of the light curves with ground-based follow-up photometry. Because CoRoT can observe transiting planets only for a maximum duration of 150 days, the ground-based follow-up can help to refine the ephemeris. Here we present first examples.

  7. Status of advanced ground-based laser interferometers for gravitational-wave detection

    CERN Document Server

    Dooley, Katherine L; Dwyer, Sheila; Puppo, Paola

    2014-01-01

    Ground-based laser interferometers for gravitational-wave (GW) detection were first constructed starting 20 years ago and as of 2010 collection of several years' worth of science data at initial design sensitivities was completed. Upgrades to the initial detectors together with construction of brand new detectors are ongoing and feature advanced technologies to improve the sensitivity to GWs. This conference proceeding provides an overview of the common design features of ground-based laser interferometric GW detectors and establishes the context for the status updates of each of the four gravitational-wave detectors around the world: Advanced LIGO, Advanced Virgo, GEO600 and KAGRA.

  8. Asteroseismology of solar-type stars with Kepler: III. Ground-based data

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Molenda-Żakowicz , J.

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by the Kepler Asteroseis...

  9. Estimation of solar irradiance using ground-based whole sky imagers

    CERN Document Server

    Dev, Soumyabrata; Lee, Yee Hui; Winkler, Stefan

    2016-01-01

    Ground-based whole sky imagers (WSIs) can provide localized images of the sky of high temporal and spatial resolution, which permits fine-grained cloud observation. In this paper, we show how images taken by WSIs can be used to estimate solar radiation. Sky cameras are useful here because they provide additional information about cloud movement and coverage, which are otherwise not available from weather station data. Our setup includes ground-based weather stations at the same location as the imagers. We use their measurements to validate our methods.

  10. Intuitive Terrain Reconstruction Using Height Observation-Based Ground Segmentation and 3D Object Boundary Estimation

    Directory of Open Access Journals (Sweden)

    Sungdae Sim

    2012-12-01

    Full Text Available Mobile robot operators must make rapid decisions based on information about the robot’s surrounding environment. This means that terrain modeling and photorealistic visualization are required for the remote operation of mobile robots. We have produced a voxel map and textured mesh from the 2D and 3D datasets collected by a robot’s array of sensors, but some upper parts of objects are beyond the sensors’ measurements and these parts are missing in the terrain reconstruction result. This result is an incomplete terrain model. To solve this problem, we present a new ground segmentation method to detect non-ground data in the reconstructed voxel map. Our method uses height histograms to estimate the ground height range, and a Gibbs-Markov random field model to refine the segmentation results. To reconstruct a complete terrain model of the 3D environment, we develop a 3D boundary estimation method for non-ground objects. We apply a boundary detection technique to the 2D image, before estimating and refining the actual height values of the non-ground vertices in the reconstructed textured mesh. Our proposed methods were tested in an outdoor environment in which trees and buildings were not completely sensed. Our results show that the time required for ground segmentation is shorter than that for data sensing, which is necessary for a real-time approach. In addition, those parts of objects that were not sensed are accurately recovered to retrieve their real-world appearances.
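
    The height-histogram step can be sketched as follows: take the dominant low-elevation mode of the z-histogram as the ground level and label points near it as ground. The Gibbs-Markov refinement stage is omitted and all thresholds are illustrative.

        import numpy as np

        def segment_ground(z, bin_width=0.1, tolerance=0.3):
            """z: 1-D array of point/voxel heights [m]; returns boolean ground mask."""
            bins = np.arange(z.min(), z.max() + bin_width, bin_width)
            hist, edges = np.histogram(z, bins=bins)
            ground_bin = np.argmax(hist)              # dominant mode ~ ground level
            ground_z = 0.5 * (edges[ground_bin] + edges[ground_bin + 1])
            return np.abs(z - ground_z) <= tolerance

        z = np.r_[np.random.normal(0.0, 0.05, 1000),   # ground returns
                  np.random.normal(2.5, 0.80, 200)]    # trees and buildings
        mask = segment_ground(z)
        print(mask[:1000].mean(), mask[1000:].mean())  # ground vs. non-ground hit rate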

  11. Intuitive terrain reconstruction using height observation-based ground segmentation and 3D object boundary estimation.

    Science.gov (United States)

    Song, Wei; Cho, Kyungeun; Um, Kyhyun; Won, Chee Sun; Sim, Sungdae

    2012-12-12

    Mobile robot operators must make rapid decisions based on information about the robot's surrounding environment. This means that terrain modeling and photorealistic visualization are required for the remote operation of mobile robots. We have produced a voxel map and textured mesh from the 2D and 3D datasets collected by a robot's array of sensors, but some upper parts of objects are beyond the sensors' measurements and these parts are missing in the terrain reconstruction result. This result is an incomplete terrain model. To solve this problem, we present a new ground segmentation method to detect non-ground data in the reconstructed voxel map. Our method uses height histograms to estimate the ground height range, and a Gibbs-Markov random field model to refine the segmentation results. To reconstruct a complete terrain model of the 3D environment, we develop a 3D boundary estimation method for non-ground objects. We apply a boundary detection technique to the 2D image, before estimating and refining the actual height values of the non-ground vertices in the reconstructed textured mesh. Our proposed methods were tested in an outdoor environment in which trees and buildings were not completely sensed. Our results show that the time required for ground segmentation is shorter than that for data sensing, which is necessary for a real-time approach. In addition, those parts of objects that were not sensed are accurately recovered to retrieve their real-world appearances.

  12. Modelling, verification, and calibration of a photoacoustics based continuous non-invasive blood glucose monitoring system.

    Science.gov (United States)

    Pai, Praful P; Sanki, Pradyut K; Sarangi, Satyabrata; Banerjee, Swapna

    2015-06-01

    This paper examines the use of photoacoustic spectroscopy (PAS) at an excitation wavelength of 905 nm for making continuous non-invasive blood glucose measurements. The theoretical background of the measurement technique is verified through simulation. An apparatus is fabricated for performing photoacoustic measurements in vitro on glucose solutions and in vivo on human subjects. The amplitude of the photoacoustic signals measured from glucose solutions is observed to increase with the solution concentration, while photoacoustic amplitude obtained from in vivo measurements follows the blood glucose concentration of the subjects, indicating a direct proportionality between the two quantities. A linear calibration method is applied separately on measurements obtained from each individual in order to estimate the blood glucose concentration. The estimated glucose values are compared to reference glucose concentrations measured using a standard glucose meter. A plot of 196 measurement pairs taken over 30 normal subjects on a Clarke error grid gives a point distribution of 82.65% and 17.35% over zones A and B of the grid with a mean absolute relative deviation (MARD) of 11.78% and a mean absolute difference (MAD) of 15.27 mg/dl (0.85 mmol/l). The results obtained are better than or comparable to those obtained using photoacoustic spectroscopy based methods or other non-invasive measurement techniques available. The accuracy levels obtained are also comparable to commercially available continuous glucose monitoring systems.
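
    Reading "linear calibration" as an ordinary least-squares fit of reference glucose against photoacoustic amplitude (an assumption on our part), the two reported error statistics follow directly; the sample values below are invented.

        import numpy as np

        def calibrate_and_score(pa_amplitude, ref_glucose):
            slope, intercept = np.polyfit(pa_amplitude, ref_glucose, 1)
            est = slope * np.asarray(pa_amplitude) + intercept
            ref = np.asarray(ref_glucose, dtype=float)
            mad = np.mean(np.abs(est - ref))                 # mg/dl
            mard = 100.0 * np.mean(np.abs(est - ref) / ref)  # percent
            return est, mard, mad

        pa = [0.82, 0.90, 1.05, 1.21, 1.38]     # photoacoustic amplitudes (a.u.)
        ref = [78.0, 95.0, 110.0, 131.0, 150.0] # reference glucose [mg/dl]
        _, mard, mad = calibrate_and_score(pa, ref)
        print(f"MARD = {mard:.2f}%  MAD = {mad:.2f} mg/dl")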

  13. Monte Carlo based verification of a beam model used in a treatment planning system

    Science.gov (United States)

    Wieslander, E.; Knöös, T.

    2008-02-01

    Modern treatment planning systems (TPSs) usually separate the dose modelling into a beam modelling phase, describing the beam exiting the accelerator, followed by a subsequent dose calculation in the patient. The aim of this work is to use the Monte Carlo code system EGSnrc to study the modelling of head scatter as well as the transmission through multi-leaf collimator (MLC) and diaphragms in the beam model used in a commercial TPS (MasterPlan, Nucletron B.V.). An Elekta Precise linear accelerator equipped with an MLC has been modelled in BEAMnrc, based on available information from the vendor regarding the material and geometry of the treatment head. The collimation in the MLC direction consists of leaves which are complemented with a backup diaphragm. The characteristics of the electron beam, i.e., energy and spot size, impinging on the target have been tuned to match measured data. Phase spaces from simulations of the treatment head are used to extract the scatter from, e.g., the flattening filter and the collimating structures. Similar data for the source models used in the TPS are extracted from the treatment planning system, thus a comprehensive analysis is possible. Simulations in a water phantom, with DOSXYZnrc, are also used to study the modelling of the MLC and the diaphragms by the TPS. The results from this study will be helpful to understand the limitations of the model in the TPS and provide knowledge for further improvements of the TPS source modelling.

  14. Verification of Spin Magnetic Attitude Control System using air-bearing-based attitude control simulator

    Science.gov (United States)

    Ousaloo, H. S.; Nodeh, M. T.; Mehrabian, R.

    2016-09-01

    This paper has one goal: to verify and validate a Spin Magnetic Attitude Control System (SMACS) program and to perform Hardware-In-the-Loop (HIL) air-bearing experiments. A study of a closed-loop magnetic spin controller is presented using only magnetic rods as actuators. The magnetic spin rate control approach is able to perform spin rate control and it is verified with an Attitude Control System (ACS) air-bearing MATLAB® SIMULINK® model and a hardware-embedded LABVIEW® algorithm that controls the spin rate of the test platform on a spherical air bearing table. The SIMULINK® model includes the dynamic model of the air-bearing, its disturbances, actuator emulation and the time delays caused by on-board calculations. The air-bearing simulator is employed to develop, improve, and carry out objective tests of magnetic torque rods and the spin rate control algorithm in the experimental framework and to provide a more realistic demonstration of the expected performance of attitude control as compared with software-based architectures. Six sets of two torque rods are used as actuators for the SMACS. It is implemented and simulated to fulfill mission requirements, including spinning the satellite up to 12 deg/s around the z-axis. These techniques are documented for the full nonlinear equations of motion of the system and the performances of these techniques are compared in several simulations.
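
    One common way to realize such a spin-rate law with torque rods (a generic textbook mapping, not necessarily the SMACS implementation) is to convert a desired torque about the spin axis into a dipole command m = (B x tau_des)/|B|^2, which yields the achievable torque closest to tau_des since the component of torque along B is unobtainable. Gains and field values below are illustrative.

        import numpy as np

        def dipole_command(omega_z, omega_target, B, k=1e-3):
            """Proportional spin-rate law mapped to a magnetic dipole command."""
            tau_des = np.array([0.0, 0.0, k * (omega_target - omega_z)])
            return np.cross(B, tau_des) / np.dot(B, B)

        B = np.array([2e-5, -1e-5, 4e-5])        # local geomagnetic field [T]
        m = dipole_command(omega_z=np.deg2rad(4), omega_target=np.deg2rad(12), B=B)
        tau = np.cross(m, B)                      # torque actually delivered
        print(m, tau)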

  15. Experimental verification of clock noise transfer and components for space based gravitational wave detectors.

    Science.gov (United States)

    Sweeney, Dylan; Mueller, Guido

    2012-11-05

    The Laser Interferometer Space Antenna (LISA) and other space based gravitational wave detector designs require a laser communication subsystem to, among other things, transfer clock signals between spacecraft (SC) in order to cancel clock noise in post-processing. The original LISA baseline design requires frequency synthesizers to convert each SC clock into a 2 GHz signal, and electro-optic modulators (EOMs) to modulate this 2 GHz clock signal onto the laser light. Both the frequency synthesizers and the EOMs must operate with a phase fidelity of 2×10^-4 cycles/√Hz. In this paper we present measurements of the phase fidelity of frequency synthesizers and EOMs. We found that both the frequency synthesizers and the EOMs meet the requirement when tested independently and together. We also performed an electronic test of the clock noise transfer using frequency synthesizers and the University of Florida LISA Interferometry (UFLIS) phasemeter. We found that by applying a time varying fractional delay filter we could suppress the clock noise to a level below our measurement limit, which is currently determined by timing jitter and is less than an order of magnitude above the LISA requirement for phase measurements.
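
    The time-varying fractional delay mentioned above can be illustrated with a Lagrange-interpolating FIR filter, which delays a sampled record by a non-integer number of samples so one channel's clock noise can be aligned with and subtracted from another. The filter order and test signal below are arbitrary choices, not the UFLIS implementation.

        import numpy as np

        def lagrange_fd_taps(delay, order=3):
            """FIR taps approximating a delay of `delay` samples (0 <= delay <= order)."""
            taps = np.ones(order + 1)
            for k in range(order + 1):
                for i in range(order + 1):
                    if i != k:
                        taps[k] *= (delay - i) / (k - i)
            return taps

        fs = 100.0                                   # sample rate [Hz] (arbitrary)
        t = np.arange(1000) / fs
        x = np.sin(2 * np.pi * 1.3 * t)              # stand-in clock-noise record
        y = np.convolve(x, lagrange_fd_taps(1.37))[: len(x)]  # delayed by 1.37 samples
        ref = np.sin(2 * np.pi * 1.3 * (t - 1.37 / fs))
        print(np.abs(y[10:] - ref[10:]).max())       # small residual after alignment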

  16. Precision simulation of ground-based lensing data using observations from space

    CERN Document Server

    Mandelbaum, Rachel; Leauthaud, Alexie; Massey, Richard J; Rhodes, Jason

    2011-01-01

    Current and upcoming wide-field, ground-based, broad-band imaging surveys promise to address a wide range of outstanding problems in galaxy formation and cosmology. Several such uses of ground-based data, especially weak gravitational lensing, require highly precise measurements of galaxy image statistics with careful correction for the effects of the point-spread function (PSF). In this paper, we introduce the SHERA (SHEar Reconvolution Analysis) software to simulate ground-based imaging data with realistic galaxy morphologies and observing conditions, starting from space-based data (from COSMOS, the Cosmological Evolution Survey) and accounting for the effects of the space-based PSF. This code simulates ground-based data, optionally with a weak lensing shear applied, in a model-independent way using a general Fourier space formalism. The utility of this pipeline is that it allows for a precise, realistic assessment of systematic errors due to the method of data processing, for example in extracting weak len...

  17. Finding common ground in team-based qualitative research using the convergent interviewing method.

    Science.gov (United States)

    Driedger, S Michelle; Gallois, Cindy; Sanders, Carrie B; Santesso, Nancy

    2006-10-01

    Research councils, agencies, and researchers recognize the benefits of team-based health research. However, researchers involved in large-scale team-based research projects face multiple challenges as they seek to identify epistemological and ontological common ground. Typically, these challenges occur between quantitative and qualitative researchers but can occur between qualitative researchers, particularly when the project involves multiple disciplinary perspectives. The authors use the convergent interviewing technique in their multidisciplinary research project to overcome these challenges. This technique assists them in developing common epistemological and ontological ground while enabling swift and detailed data collection and analysis. Although convergent interviewing is a relatively new method described primarily in marketing research, it compares and contrasts well with grounded theory and other techniques. The authors argue that this process provides a rigorous method to structure and refine research projects and requires researchers to identify and be accountable for developing a common epistemological and ontological position.

  18. Ground-Based VIS/NIR Reflectance Spectra of 25143 Itokawa: What Hayabusa will See and How Ground-Based Data can Augment Analyses

    Science.gov (United States)

    Vilas, Faith; Abell, P. A.; Jarvis, K. S.

    2004-01-01

    Planning for the arrival of the Hayabusa spacecraft at asteroid 25143 Itokawa includes consideration of the expected spectral information to be obtained using the AMICA and NIRS instruments. The rotationally-resolved spatial coverage of the asteroid we have obtained with ground-based telescopic spectrophotometry in the visible and near-infrared can be utilized here to address expected spacecraft data. We use spectrophotometry to simulate the types of data that Hayabusa will receive with the NIRS and AMICA instruments, and will demonstrate them here. The NIRS will cover a wavelength range from 0.85 micrometer, with a dispersion per element of 250 Angstroms. Thus, we are limited in coverage of the 1.0 micrometer and 2.0 micrometer mafic silicate absorption features. The ground-based reflectance spectra of Itokawa show a large component of olivine in its surface material, and the 2.0 micrometer feature is shallow. Determining the olivine to pyroxene abundance ratio is critically dependent on the attributes of the 1.0 and 2.0 micrometer features. With a cut-off near 2.1 micrometer, the longer edge of the 2.0 micrometer feature will not be obtained by NIRS. Reflectance spectra obtained using ground-based telescopes can be used to determine the regional composition around space-based spectral observations, and possibly augment the longer wavelength spectral attributes. Similarly, the shorter wavelength end of the 1.0 micrometer absorption feature will be partially lost to the NIRS. The AMICA filters mimic the ECAS filters, and have wavelength coverage overlapping with the NIRS spectral range. We demonstrate how merging photometry from AMICA will extend the spectral coverage of the NIRS. Lessons learned from earlier spacecraft missions to asteroids should be considered.

  19. Reliable selection of earthquake ground motions for performance-based design

    DEFF Research Database (Denmark)

    Katsanos, Evangelos; Sextos, A.G.

    2016-01-01

    A decision support process is presented to accommodate selecting and scaling of earthquake motions as required for the time domain analysis of structures. Prequalified code-compatible suites of seismic motions are provided through a multi-criterion approach to satisfy prescribed reduced variability...... of selected Engineering Demand Parameters. Such a procedure, even though typically overlooked, is imperative to increase the reliability of the average response values, as required for the code-prescribed design verification of structures. Structure-related attributes such as the dynamic characteristics...... of the method, by being subjected to numerous suites of motions that were highly ranked according to both the proposed approach (δsv-sc) and the conventional index (δconv), already used by most existing code-based earthquake records selection and scaling procedures. The findings reveal the superiority...

  20. Simulation of the imaging quality of ground-based telescopes affected by atmospheric disturbances

    Science.gov (United States)

    Ren, Yubin; Kou, Songfeng; Gu, Bozhong

    2014-08-01

    A ground-based telescope imaging model is developed in this paper, and the relationship between atmospheric disturbances and ground-based telescope image quality is studied. Simulation of the wave-front distortions caused by atmospheric turbulence has long been an important method in the study of the propagation of light through the atmosphere. The phase of the starlight wave-front changes over time, but within a suitably short exposure time the atmospheric disturbance can be considered "frozen". In accordance with Kolmogorov turbulence theory, the atmospheric disturbance in the imaging model is simulated by generating turbulence-distorted phase screens with the fast Fourier transform (FFT). A Geiger-mode avalanche photodiode (APD) array model is used for atmospheric wave-front detection, and the image is reconstructed by a photon-counting inversion method after the target starlight passes through the phase screens and the ground-based telescope. The imaging model established in this paper can accurately capture the relationship between telescope imaging quality and single-layer or multilayer atmospheric disturbances, which is of great significance for wave-front detection and optical correction in a Multi-conjugate Adaptive Optics (MCAO) system.
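
    The FFT phase-screen step is a standard recipe: filter complex white noise with the square root of the Kolmogorov spectrum and inverse-transform. A compact sketch follows; the overall scaling convention varies between references, and r0 and the grid are illustrative.

        import numpy as np

        def kolmogorov_phase_screen(N=256, dx=0.01, r0=0.1, seed=0):
            """N x N phase screen [rad]; dx pixel size [m]; r0 Fried parameter [m]."""
            rng = np.random.default_rng(seed)
            fx = np.fft.fftfreq(N, d=dx)                # spatial frequencies [1/m]
            f = np.hypot(*np.meshgrid(fx, fx))
            f[0, 0] = 1.0 / (N * dx)                    # avoid the f = 0 singularity
            psd = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)
            df = 1.0 / (N * dx)
            cn = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) \
                 * np.sqrt(psd) * df
            return np.real(np.fft.ifft2(cn)) * N ** 2

        # A short-exposure image then follows by applying exp(1j * screen) to the
        # pupil field and propagating to the focal plane with another FFT.
        screen = kolmogorov_phase_screen()
        print(screen.std())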

  1. Hanford ground-water data base management guide and user's manual. [CIRMIS]

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, P.J.; Argo, R.S.; Bradymire, S.L.; Newbill, C.A.

    1985-05-01

    This management guide and user's manual is a working document for the computerized Hanford Ground-water Data Base maintained by the Geosciences Research and Engineering Department at Pacific Northwest Laboratory for the Hanford Ground-Water Surveillance Program. The program is managed by the Occupational and Environmental Protection Department for the US Department of Energy. The data base is maintained to provide rapid access to data that are routinely collected from ground-water monitoring wells at the Hanford site. The data include water levels, sample analyses, geologic descriptions and well construction information of over 3000 existing or destroyed wells. These data are used to monitor water quality and for the evaluation of ground-water flow and pollutant transport problems. The management guide gives instructions for maintenance of the data base on the Digital Equipment Corporation PDP 11/70 Computer using the CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) data base management software developed at Pacific Northwest Laboratory. Maintenance activities include inserting, modifying and deleting data, making back-up copies of the data base, and generating tables for annual monitoring reports. The user's guide includes instructions for running programs to retrieve the data in the form of listings or graphical plots. 3 refs.

  2. A Robust and Efficient Homography Based Approach for Ground Plane Detection

    Directory of Open Access Journals (Sweden)

    Sanjeev Sofat

    2012-07-01

    Full Text Available This paper presents a homography based ground plane detection method. The method is developed as a part of a stereo vision based obstacle detection technique for the visually impaired people. The method assumes the presence of a texture dominant ground plane in the lower portion of the scene, which is not a severe restriction in the real world. The SIFT algorithm is used to extract features in the stereo images. The extracted SIFT features are robustly matched by model fitting using RANSAC. A sample of putative matches lying in the lower portion of the image is selected. A fitness function is developed to select matches from this sample, which are used to estimate the ground plane homography hypothesis. The ground plane homography hypothesis is used to classify the SIFT features as either belonging to the ground plane or not. Image segmentation using mean shift and normalized cut is further used to filter the outliers and augment the ground plane. Experimental tests have been conducted to test the performance of the proposed approach. The tests indicate that the proposed approach has a good classification rate and an operating distance range from 3 feet to 12 feet.
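
    A condensed version of this pipeline is easy to express with OpenCV; the lower-half heuristic, the thresholds, and the omission of the mean-shift/normalized-cut refinement make this a sketch of the approach rather than the authors' implementation.

        import cv2
        import numpy as np

        def ground_plane_matches(img_left, img_right, reproj_thresh=3.0):
            """Return the feature matches consistent with a ground-plane homography."""
            sift = cv2.SIFT_create()
            k1, d1 = sift.detectAndCompute(img_left, None)
            k2, d2 = sift.detectAndCompute(img_right, None)
            matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(d1, d2)

            # Fit the homography to matches from the lower (ground-dominant) half
            h = img_left.shape[0]
            lower = [m for m in matches if k1[m.queryIdx].pt[1] > 0.5 * h]
            src = np.float32([k1[m.queryIdx].pt for m in lower]).reshape(-1, 1, 2)
            dst = np.float32([k2[m.trainIdx].pt for m in lower]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, reproj_thresh)

            # Classify every match by its transfer error under that homography
            src_all = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst_all = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            err = np.linalg.norm(cv2.perspectiveTransform(src_all, H) - dst_all,
                                 axis=2).ravel()
            return [m for m, e in zip(matches, err) if e < reproj_thresh]

        # usage (grayscale stereo pair):
        # ground = ground_plane_matches(cv2.imread("l.png", 0), cv2.imread("r.png", 0))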

  3. Development of access-based metrics for site location of ground segment in LEO missions

    Directory of Open Access Journals (Sweden)

    Hossein Bonyan Khamseh

    2010-09-01

    Full Text Available The classical metrics of ground segment site location do not take account of the pattern of ground segment access to the satellite. In this paper, based on the pattern of access between the ground segment and the satellite, two metrics for site location of ground segments in Low Earth Orbit (LEO) missions were developed. The two developed access-based metrics are total accessibility duration and longest accessibility gap in a given period of time. It is shown that the repeatability cycle is the minimum necessary time interval to study the steady behavior of the two proposed metrics. System and subsystem characteristics of the satellite represented by each of the metrics are discussed. Incorporation of the two proposed metrics, along with the classical ones, in the ground segment site location process results in financial saving in the satellite development phase and reduces the minimum required level of in-orbit autonomy of the satellite. To show the effectiveness of the proposed metrics, simulation results are included for illustration.
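
    Given the access windows between the ground segment and the satellite over one repeatability cycle, the two proposed metrics reduce to a sum and a maximum; the interval values below are invented for illustration.

        def access_metrics(intervals, cycle_span):
            """intervals: sorted (start, end) access windows within [0, cycle_span]."""
            total_access = sum(end - start for start, end in intervals)
            gaps = [intervals[0][0]] + \
                   [nxt[0] - cur[1] for cur, nxt in zip(intervals, intervals[1:])] + \
                   [cycle_span - intervals[-1][1]]
            return total_access, max(gaps)

        passes = [(1200, 1750), (6800, 7420), (44200, 44830)]   # toy LEO day [s]
        total, worst_gap = access_metrics(passes, cycle_span=86400)
        print(f"total access: {total} s, longest gap: {worst_gap} s")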

  4. Which future for electromagnetic Astronomy: Ground Based vs Space Borne Large Astrophysical Facilities

    Science.gov (United States)

    Ubertini, Pietro

    2015-08-01

    The combined use of large ground-based facilities and large space observatories is playing a key role in the advance of astrophysics by providing access to the entire electromagnetic spectrum, allowing high sensitivity observations from the longest radio wavelengths to the highest energy gamma rays. It is nowadays clear that a further step in the understanding of the evolution of the Universe and of large-scale structure formation is essential, and is only possible with the combined use of multiwavelength imaging and high-resolution spectral instruments. The increasing size, complexity and cost of large ground and space observatories place a growing emphasis on international collaboration. While the present set of astronomical facilities is impressive and complete, with nicely complementary space and ground-based telescopes, the scenario becomes worrisome and critical in the next two decades. In fact, only a few 'Large' main space missions are planned and there is a need to ensure proper ground facility coverage: the Ground-Space synergy is inescapable in the timeframe 2020-2030. The scope of this talk is to review the current astronomical instrumentation panorama, also in view of recent programmatic decisions by major national agencies and international bodies. This Division B meeting gives us a unique opportunity to review the current situation and discuss the future perspectives, taking advantage of the large audience ensured by the IAU GA.

  5. Ground Control Point - Wireless System Network for UAV-based environmental monitoring applications

    Science.gov (United States)

    Mejia-Aguilar, Abraham

    2016-04-01

    In recent years, Unmanned Aerial Vehicles (UAV) have seen widespread civil applications, including survey and monitoring services in areas such as agriculture, construction and civil engineering, private surveillance and reconnaissance services, and cultural heritage management. Most aerial monitoring services require the integration of information acquired during the flight (such as imagery) with ground-based information (such as GPS information or others) for improved ground truth validation. For example, to obtain an accurate 3D model and Digital Elevation Model based on aerial imagery, it is necessary to include ground-based coordinate points, which are normally acquired with surveying methods based on Global Positioning Systems (GPS). However, GPS surveys are very time consuming, and especially for longer time series of monitoring data repeated GPS surveys are necessary. In order to improve the speed of data collection and integration, this work presents an autonomous system based on Waspmote technology, built on single nodes interlinked in a Wireless Sensor Network (WSN) star topology, for ground-based information collection and later integration with surveying data obtained by UAV. Nodes are designed to be visible from the air, to resist extreme weather conditions, and to run on low power. Besides, nodes are equipped with GPS as well as an Inertial Measurement Unit (IMU), accelerometer, temperature and soil moisture sensors, and thus provide significant advantages in a broad range of applications for environmental monitoring. For our purpose, the WSN transmits the environmental data via 3G/GPRS to a database at regular intervals. This project provides a detailed case study and implementation of a Ground Control Point System Network for UAV-based vegetation monitoring of dry mountain grassland in the Matsch valley, Italy.

  6. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and size

  7. On reconciling ground-based with spaceborne normalized radar cross section measurements

    DEFF Research Database (Denmark)

    Baumgartner, Francois; Munk, Jens; Jezek, K C

    2002-01-01

    This study examines differences in the normalized radar cross section, derived from ground-based versus spaceborne radar data. A simple homogeneous half-space model, indicates that agreement between the two improves as 1) the distance from the scatterer is increased; and/or 2) the extinction...

  8. Facilitating Grounded Online Interactions in Video-Case-Based Teacher Professional Development

    Science.gov (United States)

    Nemirovsky, Ricardo; Galvis, Alvaro

    2004-01-01

    The use of interactive video cases for teacher professional development is an emergent medium inspired by case study methods used extensively in law, management, and medicine, and by the advent of multimedia technology available to support online discussions. This paper focuses on Web-based "grounded" discussions--in which the participants base…

  9. Ground-based LIDAR: a novel approach to quantify fine-scale fuelbed characteristics

    Science.gov (United States)

    E.L. Loudermilk; J.K. Hiers; J.J. O’Brien; R.J. Mitchell; A. Singhania; J.C. Fernandez; W.P. Cropper; K.C. Slatton

    2009-01-01

    Ground-based LIDAR (also known as laser ranging) is a novel technique that may precisely quantify fuelbed characteristics important in determining fire behavior. We measured fuel properties within a south-eastern US longleaf pine woodland at the individual plant and fuelbed scale. Data were collected using a mobile terrestrial LIDAR unit at sub-cm scale for individual...

  10. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Science.gov (United States)

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  11. Use of neural networks in ground-based aerosol retrievals from multi-angle spectropolarimetric observations

    NARCIS (Netherlands)

    Di Noia, A.; Hasekamp, O.P.; Harten, G. van; Rietjens, J.H.H.; Smit, J.M.; Snik, F.; Henzing, J.S.; Boer, J. de; Keller, C.U.; Volten, H.

    2015-01-01

    In this paper, the use of a neural network algorithm for the retrieval of the aerosol properties from ground-based spectropolarimetric measurements is discussed. The neural network is able to retrieve the aerosol properties with an accuracy that is almost comparable to that of an iterative retrieval

  12. Retrieval of liquid water cloud properties from ground-based remote sensing observations

    NARCIS (Netherlands)

    Knist, C.L.

    2014-01-01

    Accurate ground-based remotely sensed microphysical and optical properties of liquid water clouds are essential references to validate satellite-observed cloud properties and to improve cloud parameterizations in weather and climate models. This requires the evaluation of algorithms for retrieval of

  13. Ground-based remote sensing scheme for monitoring aerosol–cloud interactions (discussion)

    NARCIS (Netherlands)

    Sarna, K.; Russchenberg, H.W.J.

    2015-01-01

    A method for continuous observation of aerosol–cloud interactions with ground-based remote sensing instruments is presented. The main goal of this method is to enable the monitoring of cloud microphysical changes due to the changing aerosol concentration. We use high resolution measurements from lid

  14. Ground-based remote sensing scheme for monitoring aerosol-cloud interactions

    NARCIS (Netherlands)

    Sarna, K.; Russchenberg, H.W.J.

    2016-01-01

    A new method for continuous observation of aerosol–cloud interactions with ground-based remote sensing instruments is presented. The main goal of this method is to enable the monitoring of the change of the cloud droplet size due to the change in the aerosol concentration. We use high-resolution mea

  15. Asteroseismology of solar-type stars with Kepler: III. Ground-based data

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Molenda-Żakowicz, J.

    2010-01-01

    We report on the ground-based follow-up program of spectroscopic and photometric observations of solar-like asteroseismic targets for the Kepler space mission. These stars constitute a large group of more than a thousand objects which are the subject of an intensive study by the Kepler Asteroseis...

  16. A Fast Method for Embattling Optimization of Ground-Based Radar Surveillance Network

    Science.gov (United States)

    Jiang, H.; Cheng, H.; Zhang, Y.; Liu, J.

    A growing number of space activities have created an orbital debris environment that poses increasing impact risks to existing space systems and human space flight. For the safety of in-orbit spacecraft, many observation facilities are needed to catalog space objects, especially in low Earth orbit. Surveillance of low Earth orbit objects relies mainly on ground-based radar; given the limited capability of existing radar facilities, a large number of ground-based radars will need to be built in the next few years to meet current space surveillance demands, so optimizing the station layout of the ground-based radar surveillance network is a problem that needs to be solved. The traditional optimization method simulates detection for all possible stations against cataloged data, comparatively analyzes the various simulation results with a combinatorial method, and selects the best result as the station layout scheme. A single simulation is time consuming and the combinatorial analysis is computationally complex; as the number of stations increases, the complexity of the optimization problem grows exponentially and it can no longer be solved with the traditional method. In this paper, the target detection procedure is simplified in two steps. First, the space coverage of a ground-based radar is reduced to a coverage projection model on shells at different orbit altitudes; then a simplified model of objects crossing the radar coverage is established from the characteristics of space object orbital motion. These two simplifications greatly reduce the computational complexity of target detection, and simulation results show the correctness of the simplified results. In addition, the detection areas of a ground-based radar network can be easily computed with the
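
    The first simplification step, projecting a radar's coverage onto a spherical shell at a given orbit altitude, can be illustrated with elementary spherical geometry. The sketch below assumes an elevation-mask-limited radar on a spherical Earth and ignores any maximum-range cutoff; it is an illustration of the idea, not the authors' model.

        import math

        R_E = 6371.0  # mean Earth radius, km

        def coverage_half_angle(alt_km, min_elev_deg):
            """Geocentric half-angle of the cap a ground radar with the
            given elevation mask sweeps on a shell at altitude alt_km."""
            e = math.radians(min_elev_deg)
            # Law of sines in the triangle Earth-center/station/target:
            # the interior angle at the station is 90 deg + elevation.
            a = math.asin(R_E * math.cos(e) / (R_E + alt_km))
            return math.pi / 2 - e - a

        # Example: 5-degree elevation mask, 800 km shell altitude
        half = coverage_half_angle(800.0, 5.0)
        print("cap radius on shell: %.0f km" % ((R_E + 800.0) * half))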

  17. An Improved Algorithm of Grounding Grids Corrosion Diagnosis Based on Total Least Square Method

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ying-jiao; NIU Tao; WANG Sen

    2011-01-01

    A new model that accounts for corrosion properties is proposed for grounding grid diagnosis, providing reference solutions for ambiguous branches. The constrained total least squares method based on singular value decomposition is adopted to improve the effectiveness of the grounding grid diagnosis algorithm. The improvement weakens the influence of model error arising from differences between the design drawings and the actual grid. The influence of the interior resistance of conductors on touch and step voltages is also taken into account. Simulation results show the validity of this approach.
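
    The numerical core named in this abstract, total least squares via singular value decomposition, can be sketched in a few lines. The sketch shows plain (unconstrained) TLS for an overdetermined system Ax ≈ b; the paper's constrained variant and its grounding-grid network model are not reproduced.

        import numpy as np

        def tls_solve(A, b):
            """Total least squares solution of A x ~= b via SVD,
            allowing errors in both A and b (unlike ordinary LS)."""
            n = A.shape[1]
            C = np.column_stack([A, b])   # augmented matrix [A | b]
            _, _, Vt = np.linalg.svd(C)
            v = Vt[-1]                    # singular vector of the
                                          # smallest singular value
            return -v[:n] / v[n]

        # Tiny usage example with noisy synthetic data
        rng = np.random.default_rng(0)
        A = rng.normal(size=(20, 3))
        x_true = np.array([1.0, -2.0, 0.5])
        b = A @ x_true + 0.01 * rng.normal(size=20)
        print(tls_solve(A, b))            # close to x_true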

  18. A Method Based on Active Appearance Model and Gradient Orientation Pyramid of Face Verification as People Age

    Directory of Open Access Journals (Sweden)

    Ji-Xiang Du

    2014-01-01

    Face verification in the presence of age progression is an important problem that has not been widely addressed. In this paper, we propose to use the active appearance model (AAM) and the gradient orientation pyramid (GOP) feature representation for this problem. First, we apply the AAM to the dataset and generate AAM images; we then compute the gradient orientation representation on a hierarchical model, which gives the GOP appearance feature. When combined with a support vector machine (SVM), experimental results show that our approach has excellent performance on two public-domain face aging datasets: FGNET and MORPH. Second, we compare the performance of the proposed method with a number of related face verification methods; the results show that the new approach is more robust and performs better.
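
    A minimal sketch of a gradient orientation pyramid feature follows, assuming simple 2x decimation between levels; published GOP implementations typically Gaussian-smooth before downsampling, and the AAM normalization step is omitted here.

        import numpy as np

        def gradient_orientation_pyramid(img, levels=4):
            """Concatenate unit-gradient fields of an image pyramid.

            Keeping orientation only (unit vectors) and discarding
            magnitude gives some robustness to illumination change.
            """
            feats = []
            img = img.astype(float)
            for _ in range(levels):
                gy, gx = np.gradient(img)
                mag = np.hypot(gx, gy) + 1e-8
                feats.append(np.stack([gx / mag, gy / mag]).ravel())
                img = img[::2, ::2]   # crude 2x downsampling per level
            return np.concatenate(feats)

        # e.g. a fixed-length vector for an SVM on face image pairs
        print(gradient_orientation_pyramid(np.random.rand(64, 64)).shape)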

  19. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.
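
    Filter-bank fingerprint representations of this kind are typically built from oriented Gabor filters. The sketch below, with assumed frequency and size parameters, conveys the flavor of such features; it is not the specific implementation evaluated in this record.

        import numpy as np
        from scipy.signal import fftconvolve

        def gabor_kernel(theta, freq=0.1, sigma=4.0, size=33):
            """Even-symmetric Gabor filter at orientation theta (rad)."""
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            yr = -x * np.sin(theta) + y * np.cos(theta)
            envelope = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
            return envelope * np.cos(2 * np.pi * freq * xr)

        def ridge_features(img, n_orient=8):
            """Response statistics per orientation as a feature vector."""
            return np.array([
                fftconvolve(img, gabor_kernel(np.pi * k / n_orient),
                            mode="same").std()
                for k in range(n_orient)])

        print(ridge_features(np.random.rand(128, 128)))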

  20. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 1: Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study, that is, demonstrates the feasibility of applying formal analysis via application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are tantamount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria For Protection Systems For Nuclear Power Generating Stations (ANSI N42.7-1972).

  1. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 2, Appendixes: Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study, that is, demonstrates the feasibility of applying formal analysis via application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are tantamount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria For Protection Systems For Nuclear Power Generating Stations (ANSI N42.7-1972).

  2. A comparison of two prompt gamma imaging techniques with collimator-based cameras for range verification in proton therapy

    Science.gov (United States)

    Lin, Hsin-Hon; Chang, Hao-Ting; Chao, Tsi-Chian; Chuang, Keh-Shih

    2017-08-01

    In vivo range verification plays an important role in proton therapy to fully utilize the benefits of the Bragg peak (BP) for delivering a high radiation dose to the tumor while sparing normal tissue. For accurately locating the position of the BP, cameras equipped with collimators (multi-slit and knife-edge) that image the prompt gamma (PG) emitted along the proton tracks in the patient have been proposed for range verification. The aim of this work is to compare the performance of the multi-slit and knife-edge collimators for non-invasive proton beam range verification. PG imaging was simulated with a validated GATE/GEANT4 Monte Carlo code modeling the spot-scanning proton therapy and a cylindrical PMMA phantom in detail. For each spot, 10^8 protons were simulated. To investigate the correlation between the acquired PG profile and the proton range, the falloff regions of the PG profiles were fitted with a 3-line-segment curve function as the range estimate. Factors that may influence the accuracy of range detection, including the energy window setting, proton energy, phantom size, and phantom shift, were studied. Results indicated that both collimator systems achieve reasonable accuracy and respond well to phantom shifts. The range accuracy of the multi-slit system is less affected by the proton energy, while the knife-edge system achieves higher detection efficiency, leading to a smaller deviation in range prediction. We conclude that both collimator systems have potential for accurate range monitoring in proton therapy. It is noted that neutron contamination has a marked impact on the range prediction of both systems, especially the multi-slit system. Therefore, a neutron reduction technique is needed to improve the accuracy of range verification in proton therapy.
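
    One plausible reading of the 3-line-segment falloff fit is a plateau / linear falloff / tail shape whose fitted breakpoints yield a range surrogate. The sketch below uses synthetic depths in mm and assumed initial guesses; it is an interpretation, not the authors' exact fitting function.

        import numpy as np
        from scipy.optimize import curve_fit

        def three_segment(z, z1, z2, y0, y1):
            """Plateau at y0 up to z1, linear falloff to y1 at z2, tail."""
            return np.piecewise(
                z, [z < z1, (z >= z1) & (z <= z2), z > z2],
                [y0,
                 lambda z: y0 + (y1 - y0) * (z - z1) / (z2 - z1),
                 y1])

        # Synthetic PG profile: falloff between 120 and 140 mm
        zs = np.linspace(0.0, 200.0, 101)
        ys = three_segment(zs, 120.0, 140.0, 1.0, 0.2)
        ys += 0.02 * np.random.default_rng(1).normal(size=zs.size)
        popt, _ = curve_fit(three_segment, zs, ys, p0=[100, 150, 1.0, 0.2])
        print("falloff midpoint: %.1f mm" % (0.5 * (popt[0] + popt[1])))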

  3. A Dynamic Programming-Based Heuristic for the Shift Design Problem in Airport Ground Handling

    DEFF Research Database (Denmark)

    Clausen, Tommy

    We consider the heterogeneous shift design problem for a workforce with multiple skills, where work shifts are created to cover a given demand as well as possible while minimizing cost and satisfying a flexible set of constraints. We focus mainly on applications within airport ground handling, where the demand can be highly irregular and specified on time intervals as short as five minutes. Ground handling operations are subject to a high degree of cooperation and specialization that require workers with different qualifications to be planned together. Different labor regulations or organizational rules can apply to different ground handling operations, so the rules and restrictions can be numerous and vary significantly. This is modeled using flexible volume constraints that limit the creation of certain shifts. We present a fast heuristic for the heterogeneous shift design problem based on dynamic programming...

  4. (21) Lutetia spectrophotometry from Rosetta-OSIRIS images and comparison to ground-based observations

    Science.gov (United States)

    Magrin, S.; La Forgia, F.; Pajola, M.; Lazzarin, M.; Massironi, M.; Ferri, F.; da Deppo, V.; Barbieri, C.; Sierks, H.; Osiris Team

    2012-06-01

    Here we present some preliminary results on surface variegation found on (21) Lutetia from ROSETTA-OSIRIS images acquired on 2010-07-10. The spectrophotometry obtained by means of the two cameras NAC and WAC (Narrow and Wide Angle Cameras) is consistent with ground based observations, and does not show surface diversity above the data error bars. The blue and UV images (shortward 500 nm) may, however, indicate a variegation of the optical properties of the asteroid surface on the Baetica region (Sierks et al., 2011). We also speculate on the contribution due to different illumination and to different ground properties (composition or, more probably, grain size diversity). In particular a correlation with geologic units independently defined by Massironi et al. (2012) is evident, suggesting that the variegation of the ground optical properties is likely to be real.

  5. Protection Measures for Buildings Based on Coordinating Action Theory of Ground, Foundation and Structure

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on the theory of the coordinating action of building ground, foundation and structure, this paper presents a modified method for calculating additional stresses on buildings in mining areas by considering the joint effect of curvature deformation and horizontal deformation. It points out that for buildings over a coal pillar affected by mining, it is advisable to soften the intermediate ground beneath the building, while for buildings over the goaf it is preferable to soften the ground at both ends of the building. To enhance a building's ability to resist tensile deformation, the key measure is to reinforce its bottom foundation. In addition, the concept of the "angle of break of a building" is proposed; leaving the protective coal pillar according to this angle is a better solution than prevailing ones. The findings provide a more scientific basis for mining under buildings.

  6. Ground truth delineation for medical image segmentation based on Local Consistency and Distribution Map analysis.

    Science.gov (United States)

    Cheng, Irene; Sun, Xinyao; Alsufyani, Noura; Xiong, Zhihui; Major, Paul; Basu, Anup

    2015-01-01

    Computer-aided detection (CAD) systems are being increasingly deployed for medical applications in recent years with the goal to speed up tedious tasks and improve precision. Among others, segmentation is an important component in CAD systems as a preprocessing step to help recognize patterns in medical images. In order to assess the accuracy of a CAD segmentation algorithm, comparison with ground truth data is necessary. To-date, ground truth delineation relies mainly on contours that are either manually defined by clinical experts or automatically generated by software. In this paper, we propose a systematic ground truth delineation method based on a Local Consistency Set Analysis approach, which can be used to establish an accurate ground truth representation, or if ground truth is available, to assess the accuracy of a CAD generated segmentation algorithm. We validate our computational model using medical data. Experimental results demonstrate the robustness of our approach. In contrast to current methods, our model also provides consistency information at distributed boundary pixel level, and thus is invariant to global compensation error.

  7. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01


  8. Particle production during inflation and gravitational waves detectable by ground-based interferometers

    OpenAIRE

    Cook, Jessica L.; Sorbo, Lorenzo

    2011-01-01

    Inflation typically predicts a quasi scale-invariant spectrum of gravitational waves. In models of slow-roll inflation, the amplitude of such a background is too small to allow direct detection without a dedicated space-based experiment such as the proposed BBO or DECIGO. In this paper we note that particle production during inflation can generate a feature in the spectrum of primordial gravitational waves. We discuss the possibility that such a feature might be detected by ground-based laser...

  9. Comparison of NO2 vertical profiles from satellite and ground based measurements over Antarctica

    OpenAIRE

    Kulkarni, Pavan; Bortoli, Daniele; Costa, Maria João; Silva, Ana Maria; Ravegnani, Fabrizio; Giovanelli, Giorgio

    2011-01-01

    The intercomparison of nitrogen dioxide (NO2) vertical profiles derived from satellite-based HALogen Occultation Experiment (HALOE) measurements and from ground-based UV-VIS spectrometer GASCOD (Gas Analyzer Spectrometer Correlating Optical Differences) observations at the Mario Zucchelli Station (MZS), Antarctica, is performed for the first time. It is shown that both datasets are in good agreement, showing the same features in terms of magnitude, profile structure, a...

  10. Shear wave velocity-based evaluation and design of stone column improved ground for liquefaction mitigation

    Institute of Scientific and Technical Information of China (English)

    Zhou Yanguo; Sun Zhengbo; Chen Jie; Chen Yunmin; Chen Renpeng

    2017-01-01

    The evaluation and design of stone column improvement ground for liquefaction mitigation is a challenging issue for the state of practice. In this paper, a shear wave velocity-based approach is proposed based on the well-defined correlations of liquefaction resistance (CRR)-shear wave velocity (Vs)-void ratio (e) of sandy soils, and the values of parameters in this approach are recommended for preliminary design purposes when site-specific values are not available. The detailed procedures of pre- and post-improvement liquefaction evaluations and stone column design are given. According to this approach, the required level of ground improvement will be met once the target Vs of the soil is raised high enough (i.e., no less than the critical velocity) to resist the given earthquake loading according to the CRR-Vs relationship, and this requirement is then transferred to the control of a target void ratio (i.e., the critical e) according to the Vs-e relationship. As this approach relies on the densification of the surrounding soil instead of the whole improved ground and is conservative by nature, specific considerations of the densification mechanism and effect are given, and the effects of drainage and reinforcement of stone columns are also discussed. A case study of a thermal power plant in Indonesia is introduced, where the effectiveness of the stone column improved ground was evaluated by the proposed Vs-based method and compared with the SPT-based evaluation. This improved ground performed well and experienced no liquefaction during subsequent strong earthquakes.

  11. Shear wave velocity-based evaluation and design of stone column improved ground for liquefaction mitigation

    Science.gov (United States)

    Zhou, Yanguo; Sun, Zhengbo; Chen, Jie; Chen, Yunmin; Chen, Renpeng

    2017-04-01

    The evaluation and design of stone column improvement ground for liquefaction mitigation is a challenging issue for the state of practice. In this paper, a shear wave velocity-based approach is proposed based on the well-defined correlations of liquefaction resistance (CRR)-shear wave velocity (Vs)-void ratio (e) of sandy soils, and the values of parameters in this approach are recommended for preliminary design purposes when site-specific values are not available. The detailed procedures of pre- and post-improvement liquefaction evaluations and stone column design are given. According to this approach, the required level of ground improvement will be met once the target Vs of the soil is raised high enough (i.e., no less than the critical velocity) to resist the given earthquake loading according to the CRR-Vs relationship, and this requirement is then transferred to the control of a target void ratio (i.e., the critical e) according to the Vs-e relationship. As this approach relies on the densification of the surrounding soil instead of the whole improved ground and is conservative by nature, specific considerations of the densification mechanism and effect are given, and the effects of drainage and reinforcement of stone columns are also discussed. A case study of a thermal power plant in Indonesia is introduced, where the effectiveness of the stone column improved ground was evaluated by the proposed Vs-based method and compared with the SPT-based evaluation. This improved ground performed well and experienced no liquefaction during subsequent strong earthquakes.
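
    The CRR-Vs leg of this procedure is commonly written in the Andrus and Stokoe (2000) form sketched below; whether these authors use exactly this relationship is an assumption, and the limiting velocity (here 215 m/s) varies with fines content.

        def crr_from_vs1(vs1, vs1_star=215.0):
            """Cyclic resistance ratio (M7.5 level) from overburden-
            corrected shear wave velocity Vs1 in m/s."""
            if vs1 >= vs1_star:
                return float("inf")       # treated as non-liquefiable
            return (0.022 * (vs1 / 100.0) ** 2
                    + 2.8 * (1.0 / (vs1_star - vs1) - 1.0 / vs1_star))

        def critical_vs1(csr, lo=100.0, hi=214.9, tol=0.01):
            """Smallest Vs1 whose CRR meets the design CSR (bisection)."""
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if crr_from_vs1(mid) < csr:
                    lo = mid
                else:
                    hi = mid
            return hi

        print("target Vs1: %.0f m/s" % critical_vs1(0.25))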

  12. Key Ground-Based and Space-Based Assets to Disentangle Magnetic Field Sources in the Earth's Environment

    Science.gov (United States)

    Chulliat, A.; Matzka, J.; Masson, A.; Milan, S. E.

    2016-10-01

    The magnetic field measured on the ground or in space is the addition of several sources: from flows within the Earth's core to electric currents in distant regions of the magnetosphere. Properly separating and characterizing these sources requires appropriate observations, both ground-based and space-based. In the present paper, we review the existing observational infrastructure, from magnetic observatories and magnetometer arrays on the ground to satellites in low-Earth (Swarm) and highly elliptical (Cluster) orbits. We also review the capability of SuperDARN to provide polar ionospheric convection patterns supporting magnetic observations. The past two decades have been marked by exciting new developments in all observation types. We review these developments, focusing on how they complement each other and how they have led or could lead in the near future to improved separation and modeling of the geomagnetic sources.

  13. Assessment of a 2D electronic portal imaging devices-based dosimetry algorithm for pretreatment and in-vivo midplane dose verification

    Science.gov (United States)

    Jomehzadeh, Ali; Shokrani, Parvaneh; Mohammadi, Mohammad; Amouheidari, Alireza

    2016-01-01

    Background: The use of electronic portal imaging devices (EPIDs) is a method for the dosimetric verification of radiotherapy plans, both pretreatment and in vivo. The aim of this study is to test a 2D EPID-based dosimetry algorithm for dose verification of some plans inside a homogeneous and an anthropomorphic phantom, as well as in vivo. Materials and Methods: Dose distributions were reconstructed from EPID images using a 2D EPID dosimetry algorithm inside a homogeneous slab phantom for a simple 10 × 10 cm^2 box technique, for 3D conformal (prostate, head-and-neck, and lung) and intensity-modulated radiation therapy (IMRT) prostate plans inside an anthropomorphic (Alderson) phantom, and in patients (one fraction in vivo) for 3D conformal plans (prostate, head-and-neck and lung). Results: The planned and EPID dose difference at the isocenter was, on average, 1.7% for pretreatment verification and less than 3% for all in vivo plans except head-and-neck, which was 3.6%. The mean γ values for a seven-field prostate IMRT plan delivered to the Alderson phantom varied from 0.28 to 0.65. For 3D conformal plans applied to the Alderson phantom, all γ1% values were within the tolerance level for all plans and in both anteroposterior and posteroanterior (AP-PA) beams. Conclusion: The 2D EPID-based dosimetry algorithm provides an accurate method to verify the dose of a simple 10 × 10 cm^2 field, in two dimensions, inside a homogeneous slab phantom and for an IMRT prostate plan, as well as for 3D conformal plans (prostate, head-and-neck, and lung) applied to an anthropomorphic phantom and in vivo. However, further investigation is necessary to improve the 2D EPID dosimetry algorithm for head-and-neck cases. PMID:28028511
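
    The γ comparisons quoted here follow the standard gamma-index construction of Low et al. (1998). A minimal one-dimensional, global-normalization sketch on a common grid is shown below; clinical tools work in 2D with interpolation, and the 3%/3 mm criteria are assumed.

        import numpy as np

        def gamma_1d(ref, ev, dx_mm, dose_frac=0.03, dta_mm=3.0):
            """Global 1-D gamma of evaluated dose `ev` against `ref`,
            both sampled on the same grid with spacing dx_mm."""
            x = np.arange(ref.size) * dx_mm
            norm = dose_frac * ref.max()   # global dose normalization
            g = np.empty(ref.size)
            for i in range(ref.size):
                g[i] = np.sqrt(((x - x[i]) / dta_mm) ** 2
                               + ((ev - ref[i]) / norm) ** 2).min()
            return g

        ref = np.exp(-np.linspace(-3, 3, 121) ** 2)
        ev = np.roll(ref, 1) * 1.01        # shifted, rescaled copy
        g = gamma_1d(ref, ev, dx_mm=1.0)
        print("pass rate: %.1f%%" % (100 * (g <= 1).mean()))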

  14. Monitoring greenhouse gas emissions in Australian landscapes: Comparing ground based mobile surveying data to GOSAT observations

    Science.gov (United States)

    Bashir, S.; Iverach, C.; Kelly, B. F. J.

    2016-12-01

    Climate change is threatening the health and stability of the natural world and human society. Such concerns were emphasized at the COP21 conference in Paris in 2015, which highlighted the global need to improve our knowledge of greenhouse gas sources and to develop methods to mitigate the effects of their emissions. Ongoing spatial and temporal measurements of greenhouse gases at both point and regional scales are important for clarifying climate change mechanisms and for accounting. The Greenhouse gas Observing SATellite (GOSAT) is designed to monitor the global distribution of carbon dioxide (CO2) and methane (CH4) from orbit. As existing ground monitoring stations are limited and unevenly distributed, satellite observations provide frequent and spatially extensive, but low resolution, observations. Recent developments in portable laser-based greenhouse gas measurement systems have enabled the rapid measurement of greenhouse gases at ppb levels at the ground surface. This study was conducted to map major sources of CO2 and CH4 in the eastern states of Australia at the landscape scale and to compare the results with GOSAT observations. During April 2016 we conducted a regional CH4 and CO2 mobile survey using an LGR greenhouse gas analyzer. Measurements were made along a 4000 km circuit through major cities, country towns, dry sclerophyll forests, coastal wetlands, coal mining regions, coal seam gas developments, and dryland farming and irrigated agricultural landscapes. The ground-based survey data were then compared with the data (L2) from GOSAT. Ground-based mobile surveys showed that there are clear statistical differences in the ground-level atmospheric concentration of CH4 and CO2 associated with all major changes in land use. These changes extend for kilometers and cover one or more GOSAT pixels. In the coal mining districts the ground-level atmospheric concentration of CH4 exceeded 2 ppm for over 40 km, yet this was not discernible in the retrieved data (L2

  15. CRRES/Ground-based multi-instrument observations of an interval of substorm activity

    Directory of Open Access Journals (Sweden)

    T. K. Yeoman

    Full Text Available Observations are presented of data taken during a 3-h interval in which five clear substorm onsets/intensifications took place. During this interval ground-based data from the EISCAT incoherent scatter radar, a digital CCD all sky camera, and an extensive array of magnetometers were recorded. In addition data from the CRRES and DMSP spacecraft, whose footprints passed over Scandinavia very close to most of the ground-based instrumentation, are available. The locations and movements of the substorm current system in latitude and longitude, determined from ground and spacecraft magnetic field data, have been correlated with the locations and propagation of increased particle precipitation in the E-region at EISCAT, increased particle fluxes measured by CRRES and DMSP, with auroral luminosity and with ionospheric convection velocities. The onsets and propagation of the injection of magnetospheric particle populations and auroral luminosity have been compared. CRRES was within or very close to the substorm expansion phase onset sector during the interval. The onset region was observed at low latitudes on the ground, and has been confirmed to map back to within L=7 in the magnetotail. The active region was then observed to propagate tailward and poleward. Delays between the magnetic signature of the substorm field aligned currents and field dipolarisation have been measured. The observations support a near-Earth plasma instability mechanism for substorm expansion phase onset.

  16. Nulling interferometry: performance comparison between space and ground-based sites for exozodiacal disc detection

    CERN Document Server

    Defrère, D; Foresto, V Coudé du; Danchi, W C; Hartog, R den

    2008-01-01

    Characterising the circumstellar dust around nearby main sequence stars is a necessary step in understanding the planetary formation process and is crucial for future life-finding space missions such as ESA's Darwin or NASA's Terrestrial Planet Finder (TPF). Besides paving the technological way to Darwin/TPF, the space-based infrared interferometers Pegase and FKSI (Fourier-Kelvin Stellar Interferometer) will be valuable scientific precursors in that respect. In this paper, we investigate the performance of Pegase and FKSI for exozodiacal disc detection and compare the results with ground-based nulling interferometers. Besides their main scientific goal (characterising hot giant extrasolar planets), Pegase and FKSI are very efficient in assessing within a few minutes the level of circumstellar dust in the habitable zone around nearby main sequence stars. They are capable of detecting exozodiacal discs 5 times and 1 time as dense as the solar zodiacal cloud, respectively, and they outperform any ground-based instrumen...

  17. Validation of Aura OMI by Aircraft and Ground-Based Measurements

    Science.gov (United States)

    McPeters, R. D.; Petropavlovskikh, I.; Kroon, M.

    2006-12-01

    Both aircraft-based and ground-based measurements have been used to validate ozone measurements by the OMI instrument on Aura. Three Aura Validation Experiment (AVE) flights have been conducted, in November 2004 and June 2005 with the NASA WB57, and in January/February 2005 with the NASA DC-8. On these flights, validation of OMI was primarily done using data from the CAFS (CCD Actinic Flux Spectroradiometer) instrument, which is used to measure total column ozone above the aircraft. These measurements are used to differentiate changes in stratospheric ozone from changes in total column ozone. Also, changes in ozone over high clouds measured by OMI were checked in a flight over tropical storm Arlene on a flight on June 11th. Ground-based measurements were made during the SAUNA campaign in Sodankyla, Finland, in March and April 2006. Both total column ozone and the ozone vertical distribution were validated.

  18. REMOTE SENSING OF WATER VAPOR CONTENT USING GROUND-BASED GPS DATA

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Spatial and temporal resolution of water vapor content is useful in improving the accuracy of short-term weather prediction. Dense, continuously tracking regional GPS arrays will play an important role in remote sensing of atmospheric water vapor content. In this study, a piecewise linear solution method was proposed to estimate the precipitable water vapor (PWV) content from ground-based GPS observations in Hong Kong. To evaluate the accuracy of the water vapor content sensed by GPS, locally collected upper-air sounding (radiosonde) data were used to calculate the precipitable water vapor over the same period. One month of PWV results from the ground-based GPS sensing technique and the radiosonde method agree to within 1-2 mm. This encouraging result will motivate GPS meteorology applications based on the establishment of a dense GPS array in Hong Kong.
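
    The delay-to-PWV conversion behind such estimates is standard: subtract a modeled hydrostatic delay (Saastamoinen) from the GPS zenith total delay and scale the wet remainder by the Bevis et al. (1992) factor. The sketch below uses commonly tabulated constants; treat the exact coefficients as assumptions rather than this paper's values.

        import math

        def pwv_from_ztd(ztd_m, p_hpa, ts_k, lat_deg, h_km):
            """Precipitable water vapor (mm) from zenith total delay (m)."""
            # Saastamoinen zenith hydrostatic delay (m)
            zhd = 0.0022768 * p_hpa / (
                1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg))
                - 0.00028 * h_km)
            zwd = ztd_m - zhd                  # zenith wet delay (m)
            tm = 70.2 + 0.72 * ts_k            # mean temperature (K)
            k2p, k3 = 0.221, 3.739e3           # refractivity constants (SI)
            rho_w, r_v = 1000.0, 461.5         # water density; Rv (J/kg/K)
            pi_factor = 1e6 / (rho_w * r_v * (k3 / tm + k2p))   # ~0.15
            return pi_factor * zwd * 1000.0    # m -> mm

        # e.g. ZTD 2.45 m at 1013 hPa, 293 K, Hong Kong latitude
        print("%.1f mm" % pwv_from_ztd(2.45, 1013.0, 293.0, 22.3, 0.05))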

  19. DEM extraction and its accuracy analysis with ground-based SAR interferometry

    Science.gov (United States)

    Dong, J.; Yue, J. P.; Li, L. H.

    2014-03-01

    Two altimetry models for extracting a DEM (Digital Elevation Model) with GBSAR (Ground-Based Synthetic Aperture Radar) technology are studied and their accuracies analyzed in detail. The approximate and improved altimetry models of GBSAR were derived from spaceborne radar altimetry based on the principles of the GBSAR technique. The error caused by the parallel-ray approximation in the approximate model was analyzed quantitatively, and the results show that this error cannot be ignored for a ground-based radar system. For the improved altimetry model, the elevation error expression can be obtained by simulating and analyzing the error propagation coefficients of baseline length, wavelength, differential phase and range distance in the mathematical model. Analysis of the elevation error as a function of baseline and range distance shows that the improved altimetry model is suitable for high-precision DEM extraction and that the accuracy can be improved by adjusting the baseline and shortening the slant distance.

  20. Empirically Grounded Agent-Based Models of Innovation Diffusion: A Critical Review

    CERN Document Server

    Zhang, Haifeng

    2016-01-01

    Innovation diffusion has been studied extensively in a variety of disciplines, including sociology, economics, marketing, ecology, and computer science. Traditional literature on innovation diffusion has been dominated by models of aggregate behavior and trends. However, the agent-based modeling (ABM) paradigm is gaining popularity as it captures agent heterogeneity and enables fine-grained modeling of interactions mediated by social and geographic networks. While most ABM work on innovation diffusion is theoretical, empirically grounded models are increasingly important, particularly in guiding policy decisions. We present a critical review of empirically grounded agent-based models of innovation diffusion, developing a categorization of this research based on types of agent models as well as applications. By connecting the modeling methodologies in the fields of information and innovation diffusion, we suggest that the maximum likelihood estimation framework widely used in the former is a promising paradigm...

  1. A novel intelligent adaptive control of laser-based ground thermal test

    Directory of Open Access Journals (Sweden)

    Gan Zhengtao

    2016-08-01

    Laser heating technology is a potential and attractive space heat flux simulation technology, characterized by a high heating rate, controlled spatial intensity distribution and rapid response. However, the controlled plant is nonlinear, time-varying and uncertain when implementing laser-based heat flux simulation. In this paper, a novel intelligent adaptive controller based on proportion–integration–differentiation (PID) type fuzzy logic is proposed to improve the performance of laser-based ground thermal tests. The temperature range of thermal cycles is more than 200 K in many instances. In order to improve the adaptability of the controller, the output scaling factors are adjusted in real time while the thermal test is underway. The initial values of the scaling factors are optimized using a stochastic hybrid particle swarm optimization (H-PSO) algorithm. A validating system has been established in the laboratory. The performance of the proposed controller is evaluated through extensive experiments under different operating conditions (reference and load disturbance). The results show that the proposed adaptive controller performs remarkably better than the conventional PID controller and the conventional PID-type fuzzy (F-PID) controller in terms of overshoot, settling time and steady-state error for laser-based ground thermal tests. It is a reliable tool for effective temperature control of laser-based ground thermal tests.
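
    A bare-bones sketch of the control idea follows: a PID loop whose output scaling factor is a runtime-adjustable knob. The fuzzy rule base that retunes the scaling online and the hybrid PSO initialization are deliberately omitted; gains and the toy plant are illustrative only.

        class AdaptivePID:
            """PID controller with an adjustable output scaling factor."""

            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_err = 0.0
                self.scale = 1.0   # retuned online by fuzzy rules in the paper

            def update(self, setpoint, measured):
                err = setpoint - measured
                self.integral += err * self.dt
                deriv = (err - self.prev_err) / self.dt
                self.prev_err = err
                u = (self.kp * err + self.ki * self.integral
                     + self.kd * deriv)
                return self.scale * u    # scaled laser power command

        # Toy first-order thermal plant driven toward a 350 K setpoint
        pid, temp = AdaptivePID(1.0, 0.1, 0.05, dt=0.1), 300.0
        for _ in range(400):
            power = pid.update(350.0, temp)
            temp += 0.1 * (power - 0.05 * (temp - 300.0))
        print(round(temp, 1))             # settles near 350.0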

  2. A novel intelligent adaptive control of laser-based ground thermal test

    Institute of Scientific and Technical Information of China (English)

    Gan Zhengtao; Yu Gang; Li Shaoxia; He Xiuli; Chen Ru; Zheng Caiyun; Ning Weijian

    2016-01-01

    Laser heating technology is a potential and attractive space heat flux simulation technology, characterized by a high heating rate, controlled spatial intensity distribution and rapid response. However, the controlled plant is nonlinear, time-varying and uncertain when implementing laser-based heat flux simulation. In this paper, a novel intelligent adaptive controller based on proportion–integration–differentiation (PID) type fuzzy logic is proposed to improve the performance of laser-based ground thermal tests. The temperature range of thermal cycles is more than 200 K in many instances. In order to improve the adaptability of the controller, the output scaling factors are adjusted in real time while the thermal test is underway. The initial values of the scaling factors are optimized using a stochastic hybrid particle swarm optimization (H-PSO) algorithm. A validating system has been established in the laboratory. The performance of the proposed controller is evaluated through extensive experiments under different operating conditions (reference and load disturbance). The results show that the proposed adaptive controller performs remarkably better than the conventional PID controller and the conventional PID-type fuzzy (F-PID) controller in terms of overshoot, settling time and steady-state error for laser-based ground thermal tests. It is a reliable tool for effective temperature control of laser-based ground thermal tests.

  3. Comparison of Precipitation Observations from a Prototype Space-based Cloud Radar and Ground-based Radars

    Institute of Scientific and Technical Information of China (English)

    LIU Liping; ZHANG Zhiqiang; YU Danru; YANG Hu; ZHAO Chonghui; ZHONG Lingzhi

    2012-01-01

    A prototype space-based cloud radar has been developed and was installed on an airplane to observe a precipitation system over Tianjin, China in July 2010. Ground-based S-band and Ka-band radars were used to examine the observational capability of the prototype. A cross-comparison algorithm between radars of different wavelengths, spatial resolutions and platforms is presented. The reflectivity biases, correlation coefficients and standard deviations between the radars are analyzed. The equivalent reflectivity bias between the S- and Ka-band radars was simulated with a given raindrop size distribution. The results indicated that the reflectivity bias between the S- and Ka-band radars due to scattering properties was less than 5 dB, and for weak precipitation the bias was negligible. The prototype space-based cloud radar was able to measure a reasonable vertical profile of reflectivity, but the reflectivity below an altitude of 1.5 km above ground level was obscured by ground clutter. The reflectivity measured by the prototype space-based cloud radar was approximately 10.9 dB stronger than that by the S-band Doppler radar (SA radar), and 13.7 dB stronger than that by the ground-based cloud radar. The reflectivity measured by the SA radar was 0.4 dB stronger than that by the ground-based cloud radar. This study could provide a method for the quantitative examination of the observation ability of space-based radars.

  4. Toward High Altitude Airship Ground-Based Boresight Calibration of Hyperspectral Pushbroom Imaging Sensors

    Directory of Open Access Journals (Sweden)

    Aiwu Zhang

    2015-12-01

    Single-linear-array hyperspectral pushbroom imaging from a high altitude airship (HAA) without a three-axis stabilized platform is much more complex than its spaceborne and airborne counterparts. Due to the effects of air pressure, temperature and airflow, large pitch and roll angles tend to appear frequently, producing pushbroom images with severe geometric distortions. Thus, in-flight calibration procedures are not appropriate for single linear pushbroom sensors on an HAA without a three-axis stabilized platform. To address this problem, a new ground-based boresight calibration method is proposed. First, a coordinate transformation model is developed for direct georeferencing (DG) of the linear imaging sensor, and the linear error equation is derived from it using the Taylor expansion formula. Second, the boresight misalignments are worked out using an iterative least squares method with a few ground control points (GCPs) and ground-based side-scanning experiments. The proposed method is demonstrated by three sets of experiments: (i) the stability and reliability of the method are verified through simulation-based experiments; (ii) the boresight calibration is performed using ground-based experiments; and (iii) validation is done by application to the orthorectification of real hyperspectral pushbroom images from a HAA Earth observation payload system developed by our research team, “LanTianHao”. The test results show that the proposed boresight calibration approach significantly improves the quality of georeferencing by reducing the geometric distortions caused by boresight misalignments to a minimum level.

  5. The comparison between a ground based and a space based probabilistic landslide susceptibility assessment

    Science.gov (United States)

    Reichenbach, P.; Mondini, A.; Guzzetti, F.; Rossi, M.; Ardizzone, F.; Cardinali, M.

    2009-04-01

    ... thematic maps obtained by processing satellite data can be an effective alternative to maps prepared using more traditional, ground-based methods.

  6. Ground Motion Prediction Trends For Eastern North America Based on the Next Generation Attenuation East Ground Motion Database

    Science.gov (United States)

    Cramer, C. H.; Kutliroff, J.; Dangkua, D.

    2010-12-01

    A five-year Next Generation Attenuation (NGA) East project to develop new ground motion prediction equations for stable continental regions (SCRs), including eastern North America (ENA), has begun at the Pacific Earthquake Engineering Research (PEER) Center, funded by the Nuclear Regulatory Commission (NRC), the U.S. Geological Survey (USGS), the Electric Power Research Institute (EPRI), and the Department of Energy (DOE). The initial effort focused on database design and collection of appropriate M>4 ENA broadband and accelerograph records to populate the database. Ongoing work has focused on adding records from smaller ENA earthquakes and from other SCRs such as Europe, Australia, and India. Currently, over 6500 horizontal and vertical component records from 60 ENA earthquakes have been collected and prepared (instrument response removed, filtering to the acceptable-signal band, determination of peak and spectral parameter values, quality assurance, etc.) for the database. Geological Survey of Canada (GSC) strong motion recordings, previously not available, have also been added to the NGA East database. The additional earthquakes increase the number of ground motion recordings in the 10-100 km range, particularly from the 2008 M5.2 Mt. Carmel, IL event, and the 2005 M4.7 Riviere du Loup and 2010 M5.0 Val des Bois earthquakes in Quebec, Canada. The goal is to complete the ENA database and make it available in 2011, followed by an SCR database in 2012. Comparisons of ground motion observations from four recent M5 ENA earthquakes with current ENA ground motion prediction equations (GMPEs) suggest that current GMPEs, as a group, reasonably agree with M5 observations at short periods, particularly at distances less than 200 km. However, at one second, current GMPEs overpredict M5 ground motion observations. The 2001 M7.6 Bhuj, India, earthquake provides some constraint at large magnitudes, as its geology and regional attenuation are analogous to ENA. Cramer and Kumar, 2003, have

  7. Flow Characteristics of Tidewater Glaciers in Greenland and Alaska using Ground-Based LiDAR

    Science.gov (United States)

    Finnegan, D. C.; Stearns, L. A.; Hamilton, G. S.; O'Neel, S.

    2010-12-01

    LiDAR scanning systems have been employed to characterize and quantify multi-temporal glacier and ice sheet changes for nearly three decades. Until recently, LiDAR scanning systems were limited to airborne and space-based platforms which come at a significant cost to deploy and are limited in spatial and temporal sampling capabilities necessary to compare with in-situ field measurements. Portable ground-based LiDAR scanning systems are now being used as a glaciological tool. We discuss research efforts to employ ground-based near-infrared LiDAR systems at two differing tidewater glacier systems in the spring of 2009; Helheim Glacier in southeast Greenland and Columbia Glacier in southeast Alaska. Preliminary results allow us to characterize short term displacement rates and detailed observations of calving processes. These results highlight the operational limitations and capabilities of commercially available LiDAR systems, and allow us to identify optimal operating characteristics for monitoring small to large-scale tidewater glaciers in near real-time. Furthermore, by identifying the operational limitations of these sensors it allows for optimal design characteristics of new sensors necessary to meet ground-based calibration and validation requirements of ongoing scientific missions.

  8. Entry Dispersion Analysis for the Hayabusa Spacecraft using Ground Based Optical Observation

    CERN Document Server

    Yamaguchi, T; Yagi, M; Tholen, D J

    2011-01-01

    The Hayabusa asteroid explorer successfully returned its sample capsule to Australia on June 13, 2010. Since the Earth reentry phase of the sample return was critical, many backup plans for predicting the landing location were prepared. This paper investigates the reentry dispersion using ground-based optical observation as a backup to radiometric observation. Several scenarios are calculated and compared for the reentry phase of Hayabusa to evaluate the navigation accuracy of ground-based observation. Optical observation does not require any active response from the spacecraft, so these results show that optical observations could be a reliable backup strategy even if a spacecraft experienced problems. We also evaluate the landing dispersion of Hayabusa with optical observation alone.

  9. Ground-based walking training improves quality of life and exercise capacity in COPD.

    Science.gov (United States)

    Wootton, Sally L; Ng, L W Cindy; McKeough, Zoe J; Jenkins, Sue; Hill, Kylie; Eastwood, Peter R; Hillman, David R; Cecins, Nola; Spencer, Lissa M; Jenkins, Christine; Alison, Jennifer A

    2014-10-01

    This study was designed to determine the effect of ground-based walking training on health-related quality of life and exercise capacity in people with chronic obstructive pulmonary disease (COPD). People with COPD were randomised to either a walking group that received supervised, ground-based walking training two to three times a week for 8-10 weeks, or a control group that received usual medical care and did not participate in exercise training. 130 out of 143 participants (mean±sd age 69±8 years, forced expiratory volume in 1 s 43±15% predicted) completed the study. Compared to the control group, the walking group demonstrated greater improvements in the St George's Respiratory Questionnaire total score (mean difference -6 points, 95% CI -10 to -2). Ground-based walking training thus improves quality of life and endurance exercise capacity in people with COPD.

  10. Coherent receiving efficiency in satellite-ground coherent laser communication system based on analysis of polarization

    Science.gov (United States)

    Hao, Shiqi; Zhang, Dai; Zhao, Qingsong; Wang, Lei; Zhao, Qi

    2017-06-01

    To analyze the coherent receiving efficiency of a satellite-ground coherent laser communication system, the polarization state of the received light is studied. A circularly polarized, partially coherent laser is chosen as the transmitted light source. The analysis proceeds in three parts. First, a theoretical model of the received light's polarization state is constructed based on the Gaussian-Schell model (GSM) and the cross-spectral density function matrix. Then, analytic formulas for the coherent receiving efficiency are derived, considering both the modification of the initial ellipticity and the deflection angle between the polarization axes of the received light and the intrinsic light. Finally, numerical simulations are performed based on this analysis. The findings characterize the variation of the polarization state and provide analytic formulas for calculating the coherent receiving efficiency. The study provides theoretical guidance for the construction and optimization of satellite-ground coherent laser communication systems.

  11. Techniques to extend the reach of ground based gravitational wave detectors

    Science.gov (United States)

    Dwyer, Sheila

    2016-03-01

    While the current generation of advanced ground based detectors will open the gravitational wave universe to observation, ground based interferometry has the potential to extend the reach of these observatories to high redshifts. Several techniques have the potential to improve the advanced detectors beyond design sensitivity, including the use of squeezed light, upgraded suspensions, and possibly new optical coatings, new test mass materials, and cryogenic suspensions. To improve the sensitivity by more than a factor of 10 compared to advanced detectors new, longer facilities will be needed. Future observatories capable of hosting interferometers 10s of kilometers long have the potential to extend the reach of gravitational wave astronomy to cosmological distances, enabling detection of binary inspirals from throughout the history of star formation.

  12. Ground-based near-infrared imaging of the HD141569 circumstellar disk

    CERN Document Server

    Boccaletti, A; Marchis, F; Hanh, J

    2003-01-01

    We present the first ground-based near-infrared image of the circumstellar disk around the post-Herbig Ae/Be star HD141569A, initially detected with the HST. Observations were carried out in the near-IR (2.2 $\mu$m) at the Palomar 200-inch telescope using the adaptive optics system PALAO. The main large-scale asymmetric features of the disk are detected in our ground-based data. In addition, we measured that the surface brightness of the disk is slightly different from that derived from HST observations (at 1.1 $\mu$m and 1.6 $\mu$m). We interpret this possible color effect in terms of dust properties and derive a minimal

  13. Topographic gradient based site characterization in India complemented by strong ground-motion spectral attributes

    KAUST Repository

    Nath, Sankar Kumar

    2013-12-01

    We appraise a topographic-gradient approach to site classification that employs correlations between the 30 m column-averaged shear-wave velocity (Vs30) and topographic gradient. Assessments based on site classifications reported from cities across India indicate that the approach is reasonably viable at the regional level. Additionally, we test three techniques for site classification based on strong ground-motion recordings, namely Horizontal-to-Vertical Spectral Ratio (HVSR), Response Spectra Shape (RSS), and Horizontal-to-Vertical Response Spectral Ratio (HVRSR), at the strong motion stations located across the Himalayas and northeast India. Statistical tests on the results indicate that these three techniques broadly differentiate soil and rock sites, while RSS and HVRSR yield better signatures. The results also support the implemented site classification in the light of strong ground-motion spectral attributes observed in different parts of the globe. © 2013 Elsevier Ltd.
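
    For readers outside this subfield: the topographic-gradient step maps slope to Vs30 through published correlation tables (e.g., Wald and Allen, 2007; not reproduced here), and Vs30 is then binned into site classes. A sketch of the final binning, using the standard NEHRP boundaries, is given below.

        def nehrp_site_class(vs30_ms):
            """NEHRP site class from Vs30 (m/s), standard boundaries."""
            if vs30_ms > 1500.0:
                return "A"   # hard rock
            if vs30_ms > 760.0:
                return "B"   # rock
            if vs30_ms > 360.0:
                return "C"   # very dense soil / soft rock
            if vs30_ms > 180.0:
                return "D"   # stiff soil
            return "E"       # soft soil

        print(nehrp_site_class(410.0))   # -> "C"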

  14. Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence

    Science.gov (United States)

    Teki, Sundeep; Barascud, Nicolas; Picard, Samuel; Payne, Christopher; Griffiths, Timothy D.; Chait, Maria

    2016-01-01

    To make sense of natural acoustic environments, listeners must parse complex mixtures of sounds that vary in frequency, space, and time. Emerging work suggests that, in addition to the well-studied spectral cues for segregation, sensitivity to temporal coherence (the coincidence of sound elements in and across time) is also critical for the perceptual organization of acoustic scenes. Here, we examine pre-attentive, stimulus-driven neural processes underlying auditory figure-ground segregation using stimuli that capture the challenges of listening in complex scenes where segregation cannot be achieved based on spectral cues alone. Signals (“stochastic figure-ground”: SFG) comprised a sequence of brief broadband chords containing random pure tone components that vary from one chord to another. Occasional tone repetitions across chords are perceived as “figures” popping out of a stochastic “ground.” Magnetoencephalography (MEG) measurement in naïve, distracted, human subjects revealed robust evoked responses, commencing about 150 ms after figure onset, that reflect the emergence of the “figure” from the randomly varying “ground.” Neural sources underlying this bottom-up driven figure-ground segregation were localized to the planum temporale and the intraparietal sulcus, demonstrating that the latter area, outside the “classic” auditory system, is also involved in the early stages of auditory scene analysis. PMID:27325682

  15. Single Phase-to-Ground Fault Line Identification and Section Location Method for Non-Effectively Grounded Distribution Systems Based on Signal Injection

    Institute of Scientific and Technical Information of China (English)

    PAN Zhencun; WANG Chengshan; CONG Wei; ZHANG Fan

    2008-01-01

    A single phase-to-ground fault line identification and section location method for non-effectively grounded distribution systems, based on detecting the trace of an injected diagnostic signal current, is presented in this paper. A special diagnostic signal current is injected into the faulted distribution system and then detected at the outlet terminals to identify the fault line, and at the sectionalizing or branching points along the fault line to locate the fault section. The method has been put into application in an actual distribution network, and field experience shows that it identifies the fault line and locates the fault section correctly and effectively.
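    The section-location step reduces to a walk along the faulted feeder: the injected diagnostic current flows from the injection point toward the fault, so it is observed at every detection point upstream of the fault and at none beyond it. A minimal sketch of that search, with hypothetical point names and a simple detection map standing in for the field devices:

    ```python
    def locate_fault_section(points, detected):
        """points   : detection points ordered from substation outlet to line end
        detected : point -> True if the injected diagnostic current is seen there
        Returns the pair of points bounding the fault section."""
        last_seen = None
        for p in points:
            if detected[p]:
                last_seen = p        # signal still present: fault is further on
            else:
                return last_seen, p  # first silent point: fault lies in between
        return last_seen, "line end"

    # example: the signal disappears after S2, so the fault is between S2 and S3
    print(locate_fault_section(["S1", "S2", "S3", "S4"],
                               {"S1": True, "S2": True, "S3": False, "S4": False}))
    ```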

  16. Seismic Response of Base-Isolated Structures under Multi-component Ground Motion Excitation

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    An analysis of a base-isolated structure under multi-component random ground motion is presented. The mean square response of the system is obtained under different parametric variations. The effectiveness of the main parameters and of the torsional component of the earthquake is quantified by means of the response ratio and the root mean square response with and without base isolation. It is observed that base isolation has a considerable influence on the response and that the effect of the torsional component cannot be ignored.
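    For orientation, the kind of closed-form benchmark underlying such mean square response studies: for a single mode with natural frequency $\omega_n$ and damping ratio $\zeta$ driven by white-noise ground acceleration of constant spectral density $S_0$, random vibration theory gives $\sigma_x^2 = \int_{-\infty}^{+\infty} |H(\omega)|^2 S_0 \, d\omega = \pi S_0 / (2\zeta\omega_n^3)$, with $H(\omega) = [\omega_n^2 - \omega^2 + 2i\zeta\omega_n\omega]^{-1}$. This is a textbook single-component result, not the paper's multi-component formulation; it illustrates the trade-off (softer isolation reduces transmitted response at the cost of larger isolator displacement) that the response ratios quantify.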

  17. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and to the results of other Monte Carlo radiation transport codes, and found very good agreement across a variety of comparison measures, including prediction of the critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident that Shift can provide reference results for CASL benchmarking.
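    Several of the comparison measures named here are aggregate statistics over pin-wise quantities. As a hedged illustration of how radial pin power agreement is commonly scored (the array names and the mean-power normalization are assumptions, not taken from the report):

    ```python
    import numpy as np

    def pin_power_errors(simulated, reference):
        """Percent pin power differences between two pin power maps."""
        sim = simulated / simulated.mean()  # pin powers are relative, so
        ref = reference / reference.mean()  # normalize each map to mean 1
        diff = 100.0 * (sim - ref) / ref    # percent difference per pin
        return {"rms_pct": float(np.sqrt(np.mean(diff**2))),
                "max_abs_pct": float(np.abs(diff).max())}
    ```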

  18. CoRoT and asteroseismology. Preparatory work and simultaneous ground-based monitoring

    CERN Document Server

    Poretti, Ennio; Uytterhoeven, Katrien; Cutispoto, Giuseppe; Distefano, Elisa; Romano, Paolo

    2007-01-01

    The successful launch of the CoRoT (COnvection, ROtation and planetary Transits) satellite opens a new era in asteroseismology. The space photometry is complemented by high-resolution spectroscopy and multicolour photometry from the ground, to disclose the pulsational content of the asteroseismic targets in the most complete way. Some preliminary results obtained with both types of data are presented. The paper is based on observations collected at the S. Pedro Martir, Serra La Nave, La Silla, and Telescopio Nazionale Galileo observatories.

  19. Investigating the long-term evolution of subtropical ozone profiles applying ground-based FTIR spectrometry

    OpenAIRE

    García, O.E.; Schneider, M.; Redondas, A.; González, Y.; Hase, F.; Blumenstock, T.; Sepúlveda, E.

    2012-01-01

    This study investigates the long-term evolution of subtropical ozone profile time series (1999–2010) obtained from ground-based FTIR (Fourier Transform InfraRed) spectrometry at the Izaña Observatory ozone super-site. Different ozone retrieval strategies are examined, analysing the influence of an additional temperature retrieval and different constraints. The theoretical assessment reveals that the FTIR system is able to resolve four independent ozone layers with a precision of better than 6...
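    The claim of roughly four independent ozone layers is the standard degrees-of-freedom-for-signal (DOFS) diagnostic of optimal-estimation retrievals: DOFS equals the trace of the averaging kernel matrix A. A minimal sketch, with an invented kernel standing in for a retrieved one:

    ```python
    import numpy as np

    def dofs(averaging_kernel):
        """Degrees of freedom for signal: trace of the averaging kernel A."""
        return float(np.trace(averaging_kernel))

    # toy example: a diagonally dominant 10-layer kernel
    A = 0.4 * np.eye(10) + 0.01 * np.ones((10, 10))
    print(dofs(A))  # 4.1 -> roughly four independently resolved layers
    ```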