WorldWideScience

Sample records for reliable georeferenced points

  1. Georeferenced Point Clouds: A Survey of Features and Point Cloud Management

    Directory of Open Access Journals (Sweden)

    Johannes Otepka

    2013-10-01

    Full Text Available This paper presents a survey of georeferenced point clouds. The focus is, on the one hand, on features that originate in the measurement process itself and on features derived by processing the point cloud. On the other hand, approaches for the processing of georeferenced point clouds are reviewed, including data structures as well as spatial processing concepts. We suggest a categorization of features into levels that reflect the amount of processing involved. Point clouds are found across many disciplines, which is reflected in the versatility of the literature proposing specific features.

  2. 3D Maize Plant Reconstruction Based on Georeferenced Overlapping LiDAR Point Clouds

    Directory of Open Access Journals (Sweden)

    Miguel Garrido

    2015-12-01

    Full Text Available 3D crop reconstruction with high temporal resolution, using non-destructive measuring technologies, can support the automation of plant phenotyping processes. The availability of such 3D data can provide valuable information about plant development and the interaction of the plant genotype with the environment. This article presents a new methodology for the georeferenced 3D reconstruction of maize plant structure. For this purpose, a total station, an IMU, and several 2D LiDARs with different orientations were mounted on an autonomous vehicle. The multistep methodology presented, based on the application of the ICP algorithm for point cloud fusion, made it possible to overlap the georeferenced point clouds. The overlapping point cloud algorithm showed that the aerial points (corresponding mainly to plant parts) were reduced to 1.5%–9% of the total registered data; the remaining points were redundant or ground points. Through the inclusion of different LiDAR points of view of the scene, a more realistic representation of the surroundings is obtained by the incorporation of new useful information, but also of noise. The use of georeferenced 3D maize plant reconstruction at different growth stages, combined with the accuracy of the total station, could be highly useful when performing precision agriculture at the crop plant level.

  3. Georeferencing UAS Derivatives Through Point Cloud Registration with Archived Lidar Datasets

    Science.gov (United States)

    Magtalas, M. S. L. Y.; Aves, J. C. L.; Blanco, A. C.

    2016-10-01

    Georeferencing gathered images is a common step before performing spatial analysis and other processes on datasets acquired using unmanned aerial systems (UAS). Methods of applying spatial information to aerial images or their derivatives include geotagging with onboard GPS (Global Positioning System) receivers and tying models to GCPs (Ground Control Points) acquired in the field. Currently, UAS derivatives are limited to meter-level accuracy when their generation is unaided by points of known position on the ground. The use of ground control points established using survey-grade GPS or GNSS receivers can greatly reduce model errors to centimeter level. However, this comes with additional costs, not only for instrument acquisition and survey operations but also in actual time spent in the field. This study uses a workflow for cloud-based post-processing of UAS data in combination with already existing LiDAR data. The georeferencing of the UAV point cloud is executed using the Iterative Closest Point (ICP) algorithm. It is applied through the open-source CloudCompare software (Girardeau-Montaut, 2006) on a `skeleton point cloud'. This skeleton point cloud consists of manually extracted features consistent in both the LiDAR and UAV data. For this cloud, roads and buildings with minimal deviations given their differing dates of acquisition are considered consistent. Transformation parameters are computed for the skeleton cloud and can then be applied to the whole UAS dataset. In addition, a separate cloud consisting of non-vegetation features automatically derived using the CANUPO classification algorithm (Brodu and Lague, 2012) was used to generate a separate set of parameters. A ground survey was done to validate the transformed cloud. An RMSE value of around 16 centimeters was found when comparing validation data to the models georeferenced using the CANUPO cloud and the manual skeleton cloud. 
Cloud-to-cloud distance computations of
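
The ICP registration step described above can be illustrated with a minimal point-to-point ICP in NumPy. This is an illustrative sketch (brute-force nearest neighbours, no outlier rejection, synthetic data), not the CloudCompare implementation used in the study:

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B (Kabsch)."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

def icp(src, dst, iters=20):
    """Point-to-point ICP: iteratively align src to dst via nearest neighbours."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbour; fine for small "skeleton" clouds
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        nn = dst[d2.argmin(axis=1)]
        R, t = best_fit_transform(cur, nn)
        cur = cur @ R.T + t
    # total transform from the original cloud to the aligned one
    return best_fit_transform(src, cur)
```

In the workflow above, the (R, t) estimated on the skeleton cloud would then be applied to the full UAS point cloud.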

  4. Accuracy assessment of minimum control points for UAV photography and georeferencing

    Science.gov (United States)

    Skarlatos, D.; Procopiou, E.; Stavrou, G.; Gregoriou, M.

    2013-08-01

    In recent years, Autonomous Unmanned Aerial Vehicles (AUAVs) have become popular among researchers across disciplines because they combine many advantages. One major application is monitoring and mapping. Their ability to fly autonomously beyond line of sight, collecting data over large areas whenever and wherever needed, makes them an excellent platform for monitoring hazardous areas or disasters. In both cases rapid mapping is needed, while human access is not always possible. Indeed, current automatic processing of aerial photos using photogrammetry and computer vision algorithms allows for rapid orthophotomap production and Digital Surface Model (DSM) generation as tools for monitoring and damage assessment. In such cases, control point measurement using GPS is impossible, time consuming, or costly. This work investigates the accuracies that can be attained using few or no control points over areas of one square kilometer at two test sites: a typical block and a corridor survey. On-board GPS data logged during the AUAV's flight are used for direct georeferencing, while ground check points are used for evaluation. In addition, various control point layouts are tested using bundle adjustment for accuracy evaluation. Results indicate that it is possible to use on-board single-frequency GPS for direct georeferencing in cases of disaster management, areas without easy access, or even featureless areas. Due to the large number of tie points in the bundle adjustment, horizontal accuracy requirements can be fulfilled with a rather small number of control points, but vertical accuracy requirements may not.
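
Check-point evaluation of the kind used above boils down to computing horizontal and vertical RMSE over the residuals between direct-georeferenced and surveyed coordinates. A small sketch, assuming Easting/Northing/Height columns (the column layout and function name are illustrative assumptions):

```python
import numpy as np

def checkpoint_rmse(measured, reference):
    """Horizontal and vertical RMSE of check points.

    Both arrays are N x 3 with columns (Easting, Northing, Height).
    """
    d = measured - reference
    rmse_h = np.sqrt(np.mean(d[:, 0] ** 2 + d[:, 1] ** 2))  # planimetric
    rmse_v = np.sqrt(np.mean(d[:, 2] ** 2))                  # vertical
    return rmse_h, rmse_v
```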

  5. DIRECT GEOREFERENCING ON SMALL UNMANNED AERIAL PLATFORMS FOR IMPROVED RELIABILITY AND ACCURACY OF MAPPING WITHOUT THE NEED FOR GROUND CONTROL POINTS

    Directory of Open Access Journals (Sweden)

    O. Mian

    2015-08-01

    Full Text Available This paper presents results from a Direct Mapping Solution (DMS) comprising an Applanix APX-15 UAV GNSS-Inertial system integrated with a Sony a7R camera, mounted on a Microdrones md4-1000 platform, to produce highly accurate ortho-rectified imagery without Ground Control Points. A 55 millimeter Nikkor f/1.8 lens was mounted on the Sony a7R, and the camera was focused and calibrated terrestrially using the Applanix camera calibration facility, then integrated with the APX-15 UAV GNSS-Inertial system using a custom mount specifically designed for UAV applications. In July 2015, Applanix and Avyon carried out a test flight of this system. The goal of the test flight was to assess the performance of the DMS APX-15 UAV direct georeferencing system on the md4-1000. The area mapped during the test was a 250 x 300 meter block in a rural setting in Ontario, Canada. Several ground control points were distributed within the test area. The test included 8 north-south lines and 1 cross strip flown at 80 meters AGL, resulting in a ~1 centimeter Ground Sample Distance (GSD). Map products were generated from the test flight using Direct Georeferencing and then compared for accuracy against the known positions of ground control points in the test area. The GNSS-Inertial data collected by the APX-15 UAV were post-processed in Single Base mode, using a base station located in the project area, via POSPac UAV. The base station's position was precisely determined by processing a 12-hour session using the CSRS-PPP Post-Processing service. The ground control points were surveyed using differential GNSS post-processing techniques with respect to the base station.

  6. Assessing the Accuracy of Georeferenced Point Clouds Produced via Multi-View Stereopsis from Unmanned Aerial Vehicle (UAV) Imagery

    Directory of Open Access Journals (Sweden)

    Arko Lucieer

    2012-05-01

    Full Text Available Sensor miniaturisation, improved battery technology and the availability of low-cost yet advanced Unmanned Aerial Vehicles (UAVs) have provided new opportunities for environmental remote sensing. The UAV provides a platform for close-range aerial photography. Detailed imagery captured from a micro-UAV can produce dense point clouds using multi-view stereopsis (MVS) techniques combining photogrammetry and computer vision. This study applies MVS techniques to imagery acquired from a multi-rotor micro-UAV of a natural coastal site in southeastern Tasmania, Australia. A very dense point cloud (<1–3 cm point spacing) is produced in an arbitrary coordinate system using full-resolution imagery, whereas other studies usually downsample the original imagery. The point cloud is sparse in areas of complex vegetation and where surfaces have a homogeneous texture. Ground control points collected with a Differential Global Positioning System (DGPS) are identified and used for georeferencing via a Helmert transformation. This study compared the georeferenced point clouds to a Total Station survey in order to assess and quantify their geometric accuracy. The results indicate that a georeferenced point cloud accurate to 25–40 mm can be obtained from imagery acquired from 50 m. UAV-based image capture provides the spatial and temporal resolution required to map and monitor natural landscapes. This paper assesses the accuracy of the generated point clouds based on field survey points. Based on our key findings we conclude that sub-decimetre terrain change (in this case coastal erosion) can be monitored.
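
A Helmert (7-parameter similarity) transformation of the kind used for georeferencing here can be estimated from matched GCP pairs with a closed-form least-squares (Umeyama/Procrustes) solution. A minimal NumPy sketch, not the authors' implementation:

```python
import numpy as np

def helmert_3d(src, dst):
    """Estimate a 7-parameter Helmert transform so that dst ~ s * R @ src + t.

    src, dst are N x 3 arrays of matched points (N >= 3, non-collinear).
    """
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    A, B = src - cs, dst - cd
    H = A.T @ B
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # avoid a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    s = np.trace(R @ H) / (A ** 2).sum()   # optimal isotropic scale
    t = cd - s * R @ cs
    return s, R, t
```

Applying `s * R @ p + t` to every point of the arbitrary-frame cloud moves it into the GCP coordinate system.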

  7. Georeferencing in QGIS 2.0

    Directory of Open Access Journals (Sweden)

    Jim Clifford

    2013-12-01

    Full Text Available In this lesson, you will learn how to georeference historical maps so that they may be added to a GIS as a raster layer. Georeferencing is required for anyone who wants to accurately digitize data found on a paper map, and since historians work mostly in the realm of paper, georeferencing is one of our most commonly used tools. The technique uses a series of control points to give a two-dimensional object like a paper map the real-world coordinates it needs to align with the three-dimensional features of the earth in GIS software (in Intro to Google Maps and Google Earth we saw an ‘overlay’, which is a Google Earth shortcut version of georeferencing). Georeferencing a historical map requires a knowledge of both the geography and the history of the place you are studying to ensure accuracy. The built and natural landscapes change over time, and it is important to confirm that the location of your control points — whether they be houses, intersections, or even towns — has remained constant. Entering control points in a GIS is easy, but behind the scenes, georeferencing uses complex transformation and compression processes. These are used to correct the distortions and inaccuracies found in many historical maps and stretch the maps so that they fit geographic coordinates. In cartography this is known as rubber-sheeting, because it treats the map as if it were made of rubber and the control points as if they were tacks ‘pinning’ the historical document to a three-dimensional surface like the globe. To offer some examples of georeferenced historical maps, we prepared some National Topographic Series maps hosted on the University of Toronto Map Library website courtesy of Marcel Fortin, and we overlaid them on a Google web map. Viewers can adjust the transparency with the slider bar on the top right, view the historical map as an overlay on terrain or satellite images, or click ‘Earth’ to switch into Google Earth mode and see 3D
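
Behind the scenes, the simplest (first-order) georeferencing transformation is just an affine fit to the control points: each pixel (column, row) is mapped to map coordinates by six coefficients. A minimal NumPy sketch of that idea (QGIS itself also offers higher-order polynomial and thin-plate-spline options; the function names here are illustrative):

```python
import numpy as np

def fit_affine(pixels, world):
    """First-order (affine) georeferencing from >= 3 control points.

    pixels: N x 2 (col, row); world: N x 2 (x, y) map coordinates.
    Returns a 3 x 2 coefficient matrix M so that [1, col, row] @ M = [x, y].
    """
    G = np.column_stack([np.ones(len(pixels)), pixels])
    M, *_ = np.linalg.lstsq(G, world, rcond=None)
    return M

def apply_affine(M, pixels):
    """Transform pixel coordinates to map coordinates with a fitted model."""
    G = np.column_stack([np.ones(len(pixels)), pixels])
    return G @ M
```

With more than three control points the least-squares fit averages out small placement errors, which is why adding well-chosen extra points generally improves the result.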

  8. Robust and Accurate Image-Based Georeferencing Exploiting Relative Orientation Constraints

    Science.gov (United States)

    Cavegn, S.; Blaser, S.; Nebiker, S.; Haala, N.

    2018-05-01

    Urban environments with extended areas of poor GNSS coverage as well as indoor spaces that often rely on real-time SLAM algorithms for camera pose estimation require sophisticated georeferencing in order to fulfill our high requirements of a few centimeters for absolute 3D point measurement accuracies. Since we focus on image-based mobile mapping, we extended the structure-from-motion pipeline COLMAP with georeferencing capabilities by integrating exterior orientation parameters from direct sensor orientation or SLAM as well as ground control points into bundle adjustment. Furthermore, we exploit constraints for relative orientation parameters among all cameras in bundle adjustment, which leads to a significant robustness and accuracy increase especially by incorporating highly redundant multi-view image sequences. We evaluated our integrated georeferencing approach on two data sets, one captured outdoors by a vehicle-based multi-stereo mobile mapping system and the other captured indoors by a portable panoramic mobile mapping system. We obtained mean RMSE values for check point residuals between image-based georeferencing and tachymetry of 2 cm in an indoor area, and 3 cm in an urban environment where the measurement distances are several times greater than indoors. Moreover, in comparison to a solely image-based procedure, our integrated georeferencing approach showed a consistent accuracy increase by a factor of 2–3 at our outdoor test site. Due to pre-calibrated relative orientation parameters, images of all camera heads were oriented correctly in our challenging indoor environment. By performing self-calibration of relative orientation parameters among respective cameras of our vehicle-based mobile mapping system, remaining inaccuracies from suboptimal test field calibration were successfully compensated.

  9. ROBUST AND ACCURATE IMAGE-BASED GEOREFERENCING EXPLOITING RELATIVE ORIENTATION CONSTRAINTS

    Directory of Open Access Journals (Sweden)

    S. Cavegn

    2018-05-01

    Full Text Available Urban environments with extended areas of poor GNSS coverage as well as indoor spaces that often rely on real-time SLAM algorithms for camera pose estimation require sophisticated georeferencing in order to fulfill our high requirements of a few centimeters for absolute 3D point measurement accuracies. Since we focus on image-based mobile mapping, we extended the structure-from-motion pipeline COLMAP with georeferencing capabilities by integrating exterior orientation parameters from direct sensor orientation or SLAM as well as ground control points into bundle adjustment. Furthermore, we exploit constraints for relative orientation parameters among all cameras in bundle adjustment, which leads to a significant robustness and accuracy increase especially by incorporating highly redundant multi-view image sequences. We evaluated our integrated georeferencing approach on two data sets, one captured outdoors by a vehicle-based multi-stereo mobile mapping system and the other captured indoors by a portable panoramic mobile mapping system. We obtained mean RMSE values for check point residuals between image-based georeferencing and tachymetry of 2 cm in an indoor area, and 3 cm in an urban environment where the measurement distances are several times greater than indoors. Moreover, in comparison to a solely image-based procedure, our integrated georeferencing approach showed a consistent accuracy increase by a factor of 2–3 at our outdoor test site. Due to pre-calibrated relative orientation parameters, images of all camera heads were oriented correctly in our challenging indoor environment. By performing self-calibration of relative orientation parameters among respective cameras of our vehicle-based mobile mapping system, remaining inaccuracies from suboptimal test field calibration were successfully compensated.

  10. GEOREFERENCED IMAGE SYSTEM WITH DRONES

    Directory of Open Access Journals (Sweden)

    Héctor A. Pérez-Sánchez

    2017-07-01

    Full Text Available The general purpose of this paper is the development and implementation of a system that allows the generation of flight routes for a drone, the acquisition of geographic location information (GPS) during the flight, and the taking of photographs of points of interest for creating georeferenced images. These images are then used to generate KML (Keyhole Markup Language) files for the representation of geographical data in three dimensions, to be displayed in the Google Earth tool.
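
The KML-generation step can be sketched with plain string templating, since KML is just XML with a fixed schema. This is a minimal illustration (the function name and tuple layout are assumptions, not the paper's implementation); note that KML orders coordinates as longitude, latitude, altitude:

```python
def make_kml(placemarks):
    """Build a KML document string from (name, lon, lat, alt) tuples."""
    body = "\n".join(
        f"    <Placemark>\n"
        f"      <name>{name}</name>\n"
        f"      <Point><coordinates>{lon},{lat},{alt}</coordinates></Point>\n"
        f"    </Placemark>"
        for name, lon, lat, alt in placemarks
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "  <Document>\n" + body + "\n  </Document>\n</kml>\n"
    )
```

Writing the returned string to a `.kml` file produces a layer that Google Earth can open directly.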

  11. Object Georeferencing in UAV-Based SAR Terrain Images

    Directory of Open Access Journals (Sweden)

    Łabowski Michał

    2016-12-01

    Full Text Available Synthetic aperture radars (SAR) make it possible to obtain high-resolution terrain images comparable with the resolution of optical methods. Radar imaging is independent of the weather conditions and the daylight. The analysis of SAR images consists primarily of identifying objects of interest. The ability to determine their geographical coordinates can increase the usability of the solution from a user's point of view. The paper presents a georeferencing method for radar terrain images. The presented images were obtained from a SAR system installed on board an Unmanned Aerial Vehicle (UAV). The system was developed within a project under the acronym WATSAR, carried out by the Military University of Technology and WB Electronics S.A. The source of the navigation data was an INS/GNSS system integrated by a Kalman filter with a feedback correction loop. The paper presents the terrain images obtained during flight tests and the results of georeferencing selected objects, with an assessment of the accuracy of the method.

  12. Real-Time and Post-Processed Georeferencing for Hyperspectral Drone Remote Sensing

    Science.gov (United States)

    Oliveira, R. A.; Khoramshahi, E.; Suomalainen, J.; Hakala, T.; Viljanen, N.; Honkavaara, E.

    2018-05-01

    The use of drones and photogrammetric technologies is increasing rapidly in different applications. Currently, the drone processing workflow is in most cases based on sequential image acquisition and post-processing, but there is great interest in real-time solutions. Fast and reliable real-time drone data processing can benefit, for instance, environmental monitoring tasks in precision agriculture and forestry. Recent developments in miniaturized and low-cost inertial measurement systems and GNSS sensors, together with real-time kinematic (RTK) position data, are offering new perspectives for comprehensive remote sensing applications. The combination of these sensors with light-weight and low-cost multi- or hyperspectral frame sensors on drones provides the opportunity to create near-real-time or real-time remote sensing data of the target object. We have developed a system with direct georeferencing on board the drone, to be used with hyperspectral frame cameras in real-time remote sensing applications. The objective of this study is to evaluate the real-time georeferencing in comparison with post-processed solutions. Experimental data sets were captured at agricultural and forested test sites using the system. The accuracy of the onboard georeferencing data was better than 0.5 m. The results showed that real-time remote sensing is promising and feasible at both test sites.

  13. DIRECT GEOREFERENCING: A NEW STANDARD IN PHOTOGRAMMETRY FOR HIGH ACCURACY MAPPING

    Directory of Open Access Journals (Sweden)

    A. Rizaldy

    2012-07-01

    Full Text Available Direct georeferencing is a new method in photogrammetry, especially in the digital camera era. Theoretically, this method does not require ground control points (GCPs) or Aerial Triangulation (AT) to process aerial photography into ground coordinates. Compared with the old method, it has three main advantages: faster data processing, a simpler workflow, and lower project cost, at the same accuracy. Direct georeferencing uses two devices: a GPS, which records the camera coordinates (X, Y, Z), and an IMU, which records the camera orientation (omega, phi, kappa). Both sets of parameters are merged into the Exterior Orientation (EO) parameters, which are required for the next steps in photogrammetric projects, such as stereocompilation, DSM generation, orthorectification, and mosaicking. The accuracy of this method was tested on a topographic map project in Medan, Indonesia. A large-format digital camera, the Ultracam X from Vexcel, was used, while the GPS/IMU was an IGI AeroControl. 19 Independent Check Points (ICPs) were used to determine the accuracy. The horizontal accuracy is 0.356 meters and the vertical accuracy is 0.483 meters. Data with this accuracy can be used for a 1:2,500 map scale project.
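
The EO parameters (X, Y, Z, omega, phi, kappa) feed directly into the collinearity model. A minimal sketch of how a direct-georeferencing pipeline might project an image measurement onto the ground, assuming an omega-phi-kappa rotation convention and a flat horizontal terrain plane (both are simplifying assumptions for illustration, not details taken from the paper):

```python
import numpy as np

def rot_opk(omega, phi, kappa):
    """Rotation matrix from omega, phi, kappa (radians), R = Rz @ Ry @ Rx."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def ground_point(xy_img, f, eo_pos, eo_ang, z_ground=0.0):
    """Intersect an image ray with a horizontal ground plane.

    xy_img: image coordinates (metres, principal point at origin)
    f: focal length (metres); eo_pos: camera position; eo_ang: (omega, phi, kappa)
    """
    R = rot_opk(*eo_ang)
    ray = R @ np.array([xy_img[0], xy_img[1], -f])  # ray direction in mapping frame
    lam = (z_ground - eo_pos[2]) / ray[2]           # scale to reach the ground plane
    return eo_pos + lam * ray
```

For a nadir-looking camera at 100 m height with a 50 mm focal length, an image point 10 mm off-centre maps 20 m from the ground nadir point, which matches the expected scale factor of 2000.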

  14. Automatic Georeferencing of Aerial Images by Means of Topographic Database Information

    DEFF Research Database (Denmark)

    Høhle, Joachim

    The book includes a preface and four articles which deal with the automatic georeferencing of aerial images. The articles are the written contributions to a seminar held at Aalborg University in October 2002. The georeferencing or orientation of aerial images is the first step in mapping tasks like the generation of orthoimages, the updating of topographic map databases, and the generation of digital terrain models.

  15. User Defined Geo-referenced Information

    DEFF Research Database (Denmark)

    Konstantas, Dimitri; Villalba, Alfredo; di Marzo Serugendo, Giovanna

    2009-01-01

    In this paper we present two novel mobile and wireless collaborative services and concepts: the Hovering Information service, a mobile, geo-referenced content information management system, and the QoS Information service, providing user-observed end-to-end infrastructure geo-related QoS information.

  16. The Development of an UAV Borne Direct Georeferenced Photogrammetric Platform for Ground Control Point Free Applications

    Directory of Open Access Journals (Sweden)

    Chien-Hsun Chu

    2012-07-01

    Full Text Available To facilitate applications such as environment detection or disaster monitoring, the development of rapid, low-cost systems for collecting near-real-time spatial information is critical. Rapid spatial information collection has become an emerging trend for remote sensing and mapping applications. In this study, a fixed-wing Unmanned Aerial Vehicle (UAV)-based spatial information acquisition platform that can operate in Ground Control Point (GCP)-free environments is developed and evaluated. The proposed UAV-based photogrammetric platform has a Direct Georeferencing (DG) module that includes a low-cost Micro Electro Mechanical Systems (MEMS) Inertial Navigation System (INS)/Global Positioning System (GPS) integrated system. The DG module is able to provide GPS single-frequency carrier phase measurements for differential processing to obtain sufficient positioning accuracy. All necessary calibration procedures are implemented. Ultimately, a flight test is performed to verify the positioning accuracy in DG mode without using GCPs. The preliminary results illustrate that the horizontal positioning accuracies in the x and y axes are around 5 m at a flight height of 300 m above the ground. The positioning accuracy in the z axis is below 10 m. Therefore, the proposed platform is relatively safe and inexpensive for collecting critical spatial information for urgent-response applications such as disaster relief and assessment where GCPs are not available.

  17. Reliability Estimates for Undergraduate Grade Point Average

    Science.gov (United States)

    Westrick, Paul A.

    2017-01-01

    Undergraduate grade point average (GPA) is a commonly employed measure in educational research, serving as a criterion or as a predictor depending on the research question. Over the decades, researchers have used a variety of reliability coefficients to estimate the reliability of undergraduate GPA, which suggests that there has been no consensus…

  18. Feature extraction and descriptor calculation methods for automatic georeferencing of Philippines' first microsatellite imagery

    Science.gov (United States)

    Tupas, M. E. A.; Dasallas, J. A.; Jiao, B. J. D.; Magallon, B. J. P.; Sempio, J. N. H.; Ramos, M. K. F.; Aranas, R. K. D.; Tamondong, A. M.

    2017-10-01

    The FAST-SIFT corner detector and descriptor extractor combination was used to automatically georeference DIWATA-1 Spaceborne Multispectral Imager (SMI) images. The Features from Accelerated Segment Test (FAST) algorithm detects corners or keypoints in an image, and these robustly detected keypoints have well-defined positions. Descriptors were computed using the Scale-Invariant Feature Transform (SIFT) extractor. The FAST-SIFT method effectively matched SMI same-subscene images detected by the NIR sensor. The method was also tested in stitching NIR images with varying subscenes swept by the camera. The slave images were matched to the master image, and the keypoints served as the ground control points. Keypoints are matched based on their descriptor vectors: nearest-neighbor matching is employed based on a metric distance between the descriptors, where the metrics include Euclidean and city-block distance, among others. Rough matching outputs not only correct matches but also faulty ones. A previous work on automatic georeferencing incorporated a geometric restriction; in this work, we applied a simplified version of that learning method. Random Sample Consensus (RANSAC) was used to eliminate fall-out matches and ensure the accuracy of the feature points from which the transformation parameters were derived; it identifies whether a point fits the transformation function and returns the inlier matches. The transformation matrix was solved using Affine, Projective, and Polynomial models. The accuracy of the automatic georeferencing method was determined by calculating the RMSE of randomly selected interest points between the master image and the transformed slave image.
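
The RANSAC inlier-selection step described above can be illustrated with a minimal NumPy implementation that fits a 2D affine model to putative keypoint matches and rejects outliers. This is a sketch of the general technique on synthetic matches, not the processing chain used for DIWATA-1:

```python
import numpy as np

def ransac_affine(src, dst, n_iter=500, tol=1.0, seed=0):
    """Fit dst ~ [1, x, y] @ M to matched 2D points, rejecting outliers.

    Returns the 3 x 2 affine matrix M refit on the best inlier set,
    plus a boolean inlier mask.
    """
    rng = np.random.default_rng(seed)
    G_all = np.column_stack([np.ones(len(src)), src])
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(len(src), 3, replace=False)    # minimal affine sample
        G = np.column_stack([np.ones(3), src[idx]])
        M, *_ = np.linalg.lstsq(G, dst[idx], rcond=None)
        resid = np.linalg.norm(G_all @ M - dst, axis=1)  # reprojection error
        inliers = resid < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # final least-squares refit on all inliers
    G = np.column_stack([np.ones(best_inliers.sum()), src[best_inliers]])
    M, *_ = np.linalg.lstsq(G, dst[best_inliers], rcond=None)
    return M, best_inliers
```

The same loop generalizes to projective or polynomial models by swapping the minimal sample size and the fitting function.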

  19. Approximate direct georeferencing in national coordinates

    Science.gov (United States)

    Legat, Klaus

Direct georeferencing has gained increasing importance in photogrammetry and remote sensing. Thereby, the parameters of exterior orientation (EO) of an image sensor are determined by GPS/INS, yielding results in a global geocentric reference frame. Photogrammetric products like digital terrain models or orthoimages, however, are often required in national geodetic datums and mapped by national map projections, i.e., in "national coordinates". As the fundamental mathematics of photogrammetry is based on Cartesian coordinates, the scene restitution is often performed in a Cartesian frame located at some central position of the image block. The subsequent transformation to national coordinates is a standard problem in geodesy and can be done in a rigorous manner, at least if the formulas of the map projection are rigorous. Drawbacks of this procedure include practical deficiencies related to the photogrammetric processing as well as the computational cost of transforming the whole scene. To avoid these problems, the paper pursues an alternative processing strategy where the EO parameters are transformed prior to the restitution. If only this transition were done, however, the scene would be systematically distorted. The reason is that the national coordinates are not Cartesian, due to the earth curvature and the unavoidable length distortion of map projections. To account for these distortions, several corrections need to be applied. These are treated in detail for both passive and active imaging. Since all these corrections are approximations only, the resulting technique is termed "approximate direct georeferencing". Still, the residual distortions are usually very low, as is demonstrated by simulations, rendering the technique an attractive approach to direct georeferencing.

  20. AN ACCURACY ASSESSMENT OF GEOREFERENCED POINT CLOUDS PRODUCED VIA MULTI-VIEW STEREO TECHNIQUES APPLIED TO IMAGERY ACQUIRED VIA UNMANNED AERIAL VEHICLE

    Directory of Open Access Journals (Sweden)

    S. Harwin

    2012-08-01

    Full Text Available Low-cost Unmanned Aerial Vehicles (UAVs) are becoming viable environmental remote sensing tools. Sensor and battery technology is expanding the data capture opportunities. The UAV, as a close-range remote sensing platform, can capture high-resolution photography on demand. This imagery can be used to produce dense point clouds using multi-view stereopsis (MVS) techniques combining computer vision and photogrammetry. This study examines point clouds produced using MVS techniques applied to UAV and terrestrial photography. A multi-rotor micro-UAV acquired aerial imagery from an altitude of approximately 30–40 m. The point clouds produced are extremely dense (<1–3 cm point spacing) and provide a detailed record of the surface in the study area, a 70 m section of sheltered coastline in southeast Tasmania. Areas with little surface texture were not well captured; similarly, areas with complex geometry such as grass tussocks and woody scrub were not well mapped. The process fails to penetrate vegetation, but extracts very detailed terrain in unvegetated areas. Initially the point clouds are in an arbitrary coordinate system and need to be georeferenced. A Helmert transformation is applied based on matching ground control points (GCPs) identified in the point clouds to GCPs surveyed with differential GPS. These point clouds can be used, alongside laser scanning and more traditional techniques, to provide very detailed and precise representations of a range of landscapes at key moments. There are many potential applications for the UAV-MVS technique, including coastal erosion and accretion monitoring, mine surveying and other environmental monitoring applications. For the generated point clouds to be used in spatial applications they need to be converted to surface models that reduce dataset size without losing too much detail. Triangulated meshes are one option; another is Poisson Surface Reconstruction. This latter option makes use of point normal

  1. INTEGRATED GEOREFERENCING OF STEREO IMAGE SEQUENCES CAPTURED WITH A STEREOVISION MOBILE MAPPING SYSTEM – APPROACHES AND PRACTICAL RESULTS

    Directory of Open Access Journals (Sweden)

    H. Eugster

    2012-07-01

    Full Text Available Stereovision-based mobile mapping systems enable the efficient capture of directly georeferenced stereo pairs. With today's camera and onboard storage technologies, imagery can be captured at high data rates, resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment, which builds the foundation for a wide range of 3D mapping applications and image-based geo web services. Georeferenced stereo images are ideally suited for the 3D mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks, and image-based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations — in our case the imaging sensors — normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient, or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS-based trajectory with independently estimated positions during prolonged GNSS signal outages, in order to increase the georeferencing accuracy up to the project requirements.

  2. Automatic Georeferencing of Astronaut Auroral Photography: Providing a New Dataset for Space Physics

    Science.gov (United States)

    Riechert, Maik; Walsh, Andrew P.; Taylor, Matt

    2014-05-01

    Astronauts aboard the International Space Station (ISS) have taken tens of thousands of photographs showing the aurora in high temporal and spatial resolution. The use of these images in research, though, is limited, as they often lack accurate pointing and scale information. In this work we develop techniques and software libraries to automatically georeference such images, and provide a time- and location-searchable database and website of those images. Aurora photographs very often include a visible starfield due to the necessarily long camera exposure times. We extend the proof of concept of Walsh et al. (2012), who used starfield recognition software, Astrometry.net, to reconstruct the pointing and scale information. Previously a manual pre-processing step, the starfield can now in most cases be separated from earth and spacecraft structures successfully using image recognition. Once the pointing and scale of an image are known, latitudes and longitudes can be calculated for each pixel corner for an assumed auroral emission height. As part of this work, an open-source Python library is developed which automates the georeferencing process and aids in visualization tasks. The library facilitates the resampling of the resulting data from an irregular to a regular coordinate grid at a given pixels-per-degree density, supports the export of data in CDF and NetCDF formats, and generates polygons for drawing graphs and stereographic maps. In addition, the THEMIS all-sky imager web archive has been included as a first transparently accessible imaging source, which in this case is useful when drawing maps of ISS passes over North America. The database and website are in development and will use the Python library as their base. Through this work, georeferenced auroral ISS photography is made available as a continuously extended and easily accessible dataset.
This provides potential not only for new studies on the aurora australis, as there are few all-sky imagers in
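The geometric core of the pixel georeferencing described above is intersecting each pixel's line of sight with a spherical shell at the assumed auroral emission height. A hedged sketch of that ray–sphere intersection, assuming a spherical Earth; the function name is invented and no code here is from the paper's library:

```python
import numpy as np

R_EARTH = 6371.0  # mean Earth radius, km

def los_to_emission_shell(cam_pos, ray_dir, emission_height_km=110.0):
    """Intersect a camera line of sight with the spherical shell at the
    assumed auroral emission height; returns the 3D intersection point
    (geocentric km), from which a lat/lon can then be derived."""
    p = np.asarray(cam_pos, float)
    d = np.asarray(ray_dir, float)
    d = d / np.linalg.norm(d)
    r = R_EARTH + emission_height_km
    # Solve |p + t d|^2 = r^2 for the nearest positive t
    b = p @ d
    disc = b * b - (p @ p - r * r)
    if disc < 0:
        raise ValueError("line of sight misses the emission shell")
    t = -b - np.sqrt(disc)
    if t < 0:
        t = -b + np.sqrt(disc)
    return p + t * d
```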

  3. Direct Georeferencing of Uav Data Based on Simple Building Structures

    Science.gov (United States)

    Tampubolon, W.; Reinhardt, W.

    2016-06-01

    Unmanned Aerial Vehicle (UAV) data acquisition is more flexible compared with the more complex traditional airborne data acquisition. This advantage puts UAV platforms in a position as an alternative acquisition method in many applications including Large Scale Topographical Mapping (LSTM). LSTM, i.e. map scales of 1:10,000 or larger, is one of a number of prominent priority tasks to be solved in an accelerated way, especially in developing countries such as Indonesia. As one component of fundamental geospatial data sets, large scale topographical maps are mandatory in order to enable detailed spatial planning. However, the accuracy of the products derived from the UAV data is normally not sufficient for LSTM, as it needs robust georeferencing, which requires additional costly efforts such as the incorporation of a sophisticated GPS/Inertial Navigation System (INS) or Inertial Measurement Unit (IMU) on the platform and/or Ground Control Point (GCP) data on the ground. To reduce the costs and the weight on the UAV, alternative solutions have to be found. This paper outlines a direct georeferencing method of UAV data by providing image orientation parameters derived from simple building structures and presents results of an investigation on the achievable results in an LSTM application. In this case, the image orientation determination has been performed through sequential images without any input from INS/IMU equipment. The simple building structures play a significant role in such a way that their geometrical characteristics have been considered. Some instances are the orthogonality of the building's walls/rooftop and the local knowledge of the building orientation in the field. In addition, we want to include the Structure from Motion (SfM) approach in order to reduce the number of required GCPs, especially for the absolute orientation purpose. The SfM technique applied to the UAV data and simple building structures additionally presents an effective tool

  4. DIRECT GEOREFERENCING OF UAV DATA BASED ON SIMPLE BUILDING STRUCTURES

    Directory of Open Access Journals (Sweden)

    W. Tampubolon

    2016-06-01

    Full Text Available Unmanned Aerial Vehicle (UAV) data acquisition is more flexible compared with the more complex traditional airborne data acquisition. This advantage puts UAV platforms in a position as an alternative acquisition method in many applications including Large Scale Topographical Mapping (LSTM). LSTM, i.e. map scales of 1:10,000 or larger, is one of a number of prominent priority tasks to be solved in an accelerated way, especially in developing countries such as Indonesia. As one component of fundamental geospatial data sets, large scale topographical maps are mandatory in order to enable detailed spatial planning. However, the accuracy of the products derived from the UAV data is normally not sufficient for LSTM, as it needs robust georeferencing, which requires additional costly efforts such as the incorporation of a sophisticated GPS/Inertial Navigation System (INS) or Inertial Measurement Unit (IMU) on the platform and/or Ground Control Point (GCP) data on the ground. To reduce the costs and the weight on the UAV, alternative solutions have to be found. This paper outlines a direct georeferencing method of UAV data by providing image orientation parameters derived from simple building structures and presents results of an investigation on the achievable results in an LSTM application. In this case, the image orientation determination has been performed through sequential images without any input from INS/IMU equipment. The simple building structures play a significant role in such a way that their geometrical characteristics have been considered. Some instances are the orthogonality of the building's walls/rooftop and the local knowledge of the building orientation in the field. In addition, we want to include the Structure from Motion (SfM) approach in order to reduce the number of required GCPs, especially for the absolute orientation purpose. The SfM technique applied to the UAV data and simple building structures additionally presents an

  5. Vision-Based Georeferencing of GPR in Urban Areas

    Directory of Open Access Journals (Sweden)

    Riccardo Barzaghi

    2016-01-01

    Full Text Available Ground Penetrating Radar (GPR) surveying is widely used to gather accurate knowledge about the geometry and position of underground utilities. The sensor arrays need to be coupled to an accurate positioning system, like a geodetic-grade Global Navigation Satellite System (GNSS) device. However, in urban areas this approach is not always feasible, because GNSS accuracy can be substantially degraded by the presence of buildings, trees, tunnels, etc. In this work, a photogrammetric (vision-based) method for GPR georeferencing is presented. The method can be summarized in three main steps: tie point extraction from the images acquired during the survey, computation of approximate camera extrinsic parameters, and finally a refinement of the parameter estimation using a rigorous implementation of the collinearity equations. A test under operational conditions is described, where an accuracy of a few centimeters was achieved. The results demonstrate that the solution is robust enough to recover vehicle trajectories even in critical situations, such as poorly textured framed surfaces, short baselines, and low intersection angles.
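The refinement step above rests on the collinearity equations. A minimal sketch of the forward projection they express (object-space point to image coordinates), under an assumed rotation convention; the function signature is invented:

```python
import numpy as np

def collinearity_project(point, cam_center, R, focal):
    """Project an object-space point into image coordinates using the
    collinearity equations. R rotates object space into camera space;
    focal is the principal distance, in the same units as the image
    coordinates (principal point offsets omitted for brevity)."""
    dX = np.asarray(point, float) - np.asarray(cam_center, float)
    u = np.asarray(R, float) @ dX
    x = -focal * u[0] / u[2]
    y = -focal * u[1] / u[2]
    return x, y
```

In a bundle-style refinement, these projections are compared with the measured tie points and the extrinsic parameters are adjusted to minimize the residuals.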

  6. A library of georeferenced photos from the field

    Science.gov (United States)

    Xiao, Xiangming; Dorovskoy, Pavel; Biradar, Chandrashekhar; Bridge, Eli

    2011-12-01

    A picture is worth a thousand words, and every day hundreds of scientists, students, and environmentally aware citizens take field photos to document their observations of rocks, glaciers, soils, forests, wetlands, croplands, rangelands, livestock, and birds and mammals, as well as important events such as droughts, floods, wildfires, insect emergences, and infectious disease outbreaks. Where are those field photos stored? Can they be shared in a timely fashion to support education, research, and the leisure activities of citizens across the world? What are the financial and intellectual costs if those field photos are lost or not shared? Recently, researchers at the University of Oklahoma developed and released the Global Geo-Referenced Field Photo Library (hereinafter referred to as the Field Photo Library; http://www.eomf.ou.edu/photos/), a Web-based data portal designed for researchers and educators who wish to archive and share field photos from across the world, each tagged with exact positioning data (Figure 1). The data portal has a simple user interface that allows people to upload, query, and download georeferenced field photos in the library.
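Photo archives like this typically obtain each photo's position from the camera's EXIF metadata, where GPS coordinates are stored as degrees, minutes, and seconds plus a hemisphere reference. A small, generic helper for that conversion (the function name is an illustrative assumption):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert EXIF-style degrees/minutes/seconds GPS coordinates to
    signed decimal degrees (S and W hemispheres become negative)."""
    dec = degrees + minutes / 60.0 + seconds / 3600.0
    return -dec if hemisphere in ("S", "W") else dec
```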

  7. Reliability assessment of Indian Point Unit 3 containment structure

    International Nuclear Information System (INIS)

    Kawakami, J.; Hwang, H.; Chang, M.T.; Reich, M.

    1984-01-01

    In the current design criteria, the load combinations specified for the design of concrete containment structures are in a deterministic format. However, by applying the probability-based reliability method developed by BNL to concrete containment structures designed according to the criteria, it is possible to evaluate the reliability levels implied in the current design criteria. For this purpose, the reliability analysis is applied to the Indian Point Unit No. 3 containment. The details of the containment structure, such as the geometries and the rebar arrangements, are taken from the working drawings and the final safety analysis reports. Three kinds of loads are considered in the reliability analysis: dead load (D), accidental pressure due to a large LOCA (P), and earthquake ground acceleration (E). Reliability analysis of the containment subjected to all combinations of these loads is performed. Results are presented in this report

  8. Reliability of physical examination for diagnosis of myofascial trigger points: a systematic review of the literature.

    Science.gov (United States)

    Lucas, Nicholas; Macaskill, Petra; Irwig, Les; Moran, Robert; Bogduk, Nikolai

    2009-01-01

    Trigger points are promoted as an important cause of musculoskeletal pain. There is no accepted reference standard for the diagnosis of trigger points, and data on the reliability of physical examination for trigger points are conflicting. To systematically review the literature on the reliability of physical examination for the diagnosis of trigger points. MEDLINE, EMBASE, and other sources were searched for articles reporting the reliability of physical examination for trigger points. Included studies were evaluated for their quality and applicability, and reliability estimates were extracted and reported. Nine studies were eligible for inclusion. None satisfied all quality and applicability criteria. No study specifically reported reliability for the identification of the location of active trigger points in the muscles of symptomatic participants. Reliability estimates varied widely for each diagnostic sign, for each muscle, and across each study. Reliability estimates were generally higher for subjective signs such as tenderness (kappa range, 0.22–1.00) and pain reproduction (kappa range, 0.57–1.00), and lower for objective signs such as the taut band (kappa range, −0.08 to 0.75) and local twitch response (kappa range, −0.05 to 0.57). No study to date has reported the reliability of trigger point diagnosis according to the currently proposed criteria. On the basis of the limited number of studies available, and significant problems with their design, reporting, statistical integrity, and clinical applicability, physical examination cannot currently be recommended as a reliable test for the diagnosis of trigger points. The reliability of trigger point diagnosis needs to be further investigated in studies of high quality that use current diagnostic criteria in clinically relevant patients.
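The kappa values quoted above measure chance-corrected agreement between examiners. As a generic reference for how such numbers arise (not tied to any of the reviewed studies), here is a small implementation of Cohen's kappa for a two-rater agreement table:

```python
def cohens_kappa(table):
    """Cohen's kappa for a two-rater agreement (confusion) table.
    table[i][j] = number of subjects rater A placed in category i
    and rater B placed in category j."""
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n          # observed agreement
    row = [sum(table[i]) for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]
    pe = sum(row[i] * col[i] for i in range(k)) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa of 1.0 means perfect agreement, 0 means agreement no better than chance, and negative values (as in the taut-band range above) mean agreement worse than chance.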

  9. Reliability assessment of Indian Point Unit 3 containment structure under combined loads

    International Nuclear Information System (INIS)

    Hwang, H.; Shinozuka, M.; Kawakami, J.; Reich, M.

    1984-01-01

    In the current design criteria, the load combinations specified for the design of concrete containment structures are in a deterministic format. However, by applying the probability-based reliability analysis method developed by BNL to concrete containment structures designed according to the criteria, it is possible to evaluate the reliability levels implied in the current design criteria. For this purpose, the reliability analysis is applied to the Indian Point Unit No. 3 containment. The details of the containment structure, such as the geometries and the rebar arrangements, are taken from the working drawings and the Final Safety Analysis Report. Three kinds of loads are considered in the reliability analysis: dead load, accidental pressure due to a large LOCA, and earthquake ground acceleration. This paper presents the reliability analysis results for the Indian Point Unit 3 containment subjected to all combinations of these loads

  10. Georeferencing natural disaster impact footprints : lessons learned from the EM-DAT experience

    Science.gov (United States)

    Wallemacq, Pascaline; Guha Sapir, Debarati

    2014-05-01

    The Emergency Events Database (EM-DAT) contains data about the occurrence and consequences of all the disasters that have taken place since 1900. The main objectives of the database are to serve the purposes of humanitarian action at national and international levels and to aid decision making for disaster preparedness, as well as to provide an objective basis for vulnerability assessments and priority setting. EM-DAT records data on the human and economic impacts of each event as well as its location. This is recorded as text data, namely the province, department, county, district, or village. The first purpose of geocoding (or georeferencing) the EM-DAT database is to transform the location data from text format into code data. The GAUL (Global Administrative Unit Layers) database (FAO) is used as a basis to identify the geographic footprint of the disaster, ideally down to the second administrative level, and to add a unique code for each affected unit. Our first step has involved georeferencing earthquakes, since their locations are precise. The second purpose is to detail the degree of precision of the georeferencing. The applications and benefits of georeferencing are manifold. The geographic information on the footprint of past (after 2000) and future natural disasters permits locating vulnerable areas with a GIS and crossing data from different sources. It will allow the study of different elements such as the extent of a disaster and its human and economic consequences; the exposure and vulnerability of the population in space and time; and the efficiency of mitigation measures. In addition, any association between events and external factors can be identified (e.g., is famine located in the same places as drought?) and the precision of the information in the disaster report can be evaluated. Besides this, these maps will provide valuable communication support, since maps have a high communication power and are easily understandable by the
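The geocoding step described above, turning free-text province or district names into administrative-unit codes, can be sketched as a normalized gazetteer lookup. The place names and codes below are purely hypothetical stand-ins for the FAO GAUL layers:

```python
import unicodedata

# Toy stand-in for a GAUL admin-unit gazetteer; a real lookup would
# query the FAO GAUL layers down to the second administrative level.
GAZETTEER = {
    "port-au-prince": ("HTI", 1510),
    "leogane": ("HTI", 1512),
}

def normalize(name):
    """Lowercase and strip accents so free-text locations match keys."""
    nfkd = unicodedata.normalize("NFKD", name)
    return "".join(c for c in nfkd if not unicodedata.combining(c)).strip().lower()

def geocode(location_text):
    """Map a comma-separated location string to admin-unit codes,
    skipping parts that are not found in the gazetteer."""
    codes = []
    for part in location_text.split(","):
        hit = GAZETTEER.get(normalize(part))
        if hit:
            codes.append(hit)
    return codes
```

Name normalization matters in practice: EM-DAT location strings mix languages, accents, and spellings, so unmatched parts are exactly where the "degree of precision" bookkeeping mentioned above comes in.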

  11. Reliable and Valid Assessment of Point-of-care Ultrasonography

    DEFF Research Database (Denmark)

    Todsen, Tobias; Tolsgaard, Martin Grønnebæk; Olsen, Beth Härstedt

    2015-01-01

    OBJECTIVE: To explore the reliability and validity of the Objective Structured Assessment of Ultrasound Skills (OSAUS) scale for point-of-care ultrasonography (POC US) performance. BACKGROUND: POC US is increasingly used by clinicians and is an essential part of the management of acute surgical conditions. However, the quality of performance is highly operator-dependent. Therefore, reliable and valid assessment of trainees' ultrasonography competence is needed to ensure patient safety. METHODS: Twenty-four physicians, representing novices, intermediates, and experts in POC US, scanned 4 different… RESULTS: The generalizability coefficient was high (0.81) and a D-study demonstrated that 1 assessor and 5 cases would result in similar reliability. The construct validity of the OSAUS scale was supported by a significant difference in the mean scores… and by correlating physicians' OSAUS scores with diagnostic accuracy.

  12. Integrated GNSS attitude determination and positioning for direct geo-referencing

    NARCIS (Netherlands)

    Nadarajah, N.; Paffenholz, J.A.; Teunissen, P.J.G.

    2014-01-01

    Direct geo-referencing is an efficient methodology for the fast acquisition of 3D spatial data. It requires the fusion of spatial data acquisition sensors with navigation sensors, such as Global Navigation Satellite System (GNSS) receivers. In this contribution, we consider an integrated GNSS

  13. Towards an Automatic Framework for Urban Settlement Mapping from Satellite Images: Applications of Geo-referenced Social Media and One Class Classification

    Science.gov (United States)

    Miao, Zelang

    2017-04-01

    Currently, urban dwellers comprise more than half of the world's population, and this percentage is still dramatically increasing. The explosive urban growth over the next two decades poses a long-term, profound impact on people as well as the environment. Accurate and up-to-date delineation of urban settlements plays a fundamental role in defining planning strategies and in supporting sustainable development of urban settlements. In order to provide adequate data about urban extents and land covers, classifying satellite data has become a common practice, usually with sufficiently accurate results. Indeed, a number of supervised learning methods have proven effective in urban area classification, but they usually depend on a large amount of training samples, whose collection is a time- and labor-intensive task. This issue becomes particularly serious when classifying large areas at the regional/global level. As an alternative to manual ground truth collection, in this work we use geo-referenced social media data. Cities and densely populated areas are extremely fertile ground for the production of individual geo-referenced data (such as GPS and social network data). Training samples derived from geo-referenced social media have several advantages: they are easy to collect; they are usually freely exploitable; and data from social media are spatially available in many locations, certainly in most urban areas around the world. Despite these advantages, the selection of training samples from social media meets two challenges: 1) there are many duplicated points; 2) a method is required to automatically label them as "urban/non-urban". The objective of this research is to validate automatic sample selection from geo-referenced social media and its applicability in one-class classification for urban extent mapping from satellite images. The findings in this study shed new light on social media applications in the field of remote sensing.
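The two challenges noted above (duplicated points and automatic urban/non-urban labelling) suggest a simple baseline: deduplicate posts, snap them to a grid, and threshold on per-cell counts. This is a hedged sketch of that idea, not the paper's method; the cell size and count threshold are arbitrary choices:

```python
from collections import Counter

def label_urban_cells(points, cell_deg=0.01, min_count=5):
    """Snap geotagged posts (lat, lon) to a grid and label a cell
    'urban' when enough distinct posts fall inside it.
    Returns the set of (row, col) grid-cell indices labelled urban."""
    # Deduplicate exact coordinate repeats first (common in social media dumps)
    unique = set(points)
    cells = Counter(
        (round(lat / cell_deg), round(lon / cell_deg)) for lat, lon in unique
    )
    return {cell for cell, c in cells.items() if c >= min_count}
```

The resulting urban cells would then serve as positive training samples for a one-class classifier applied to the satellite imagery.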

  14. Real-time geo-referenced video mosaicking with the MATISSE system

    DEFF Research Database (Denmark)

    Vincent, Anne-Gaelle; Pessel, Nathalie; Borgetto, Manon

    This paper presents the MATISSE system: Mosaicking Advanced Technologies Integrated in a Single Software Environment. This system aims at producing in-line and off-line geo-referenced video mosaics of the seabed given a video input and navigation data. It is based upon several techniques of image

  15. Reliability of impingement sampling designs: An example from the Indian Point station

    International Nuclear Information System (INIS)

    Mattson, M.T.; Waxman, J.B.; Watson, D.A.

    1988-01-01

    A 4-year data base (1976-1979) of daily fish impingement counts at the Indian Point electric power station on the Hudson River was used to compare the precision and reliability of three random-sampling designs: (1) simple random, (2) seasonally stratified, and (3) empirically stratified. The precision of daily impingement estimates improved logarithmically for each design as more days in the year were sampled. Simple random sampling was the least, and empirically stratified sampling was the most precise design, and the difference in precision between the two stratified designs was small. Computer-simulated sampling was used to estimate the reliability of the two stratified-random-sampling designs. A seasonally stratified sampling design was selected as the most appropriate reduced-sampling program for Indian Point station because: (1) reasonably precise and reliable impingement estimates were obtained using this design for all species combined and for eight common Hudson River fish by sampling only 30% of the days in a year (110 d); and (2) seasonal strata may be more precise and reliable than empirical strata if future changes in annual impingement patterns occur. The seasonally stratified design applied to the 1976-1983 Indian Point impingement data showed that selection of sampling dates based on daily species-specific impingement variability gave results that were more precise, but not more consistently reliable, than sampling allocations based on the variability of all fish species combined. 14 refs., 1 fig., 6 tabs
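The precision advantage of stratified over simple random sampling reported above can be illustrated with the textbook variance formulas for the estimated mean (proportional allocation, no finite-population correction). The daily-count values below are invented for illustration, not taken from the Indian Point data:

```python
import statistics

def srs_variance(values, n):
    """Variance of the sample mean under simple random sampling
    (with-replacement approximation)."""
    return statistics.pvariance(values) / n

def stratified_variance(strata, n):
    """Variance of the mean under proportional-allocation stratified
    sampling: only within-stratum variances contribute."""
    total = sum(len(s) for s in strata)
    var = 0.0
    for s in strata:
        w = len(s) / total          # stratum weight
        n_h = n * w                 # proportional sample allocation
        var += w * w * statistics.pvariance(s) / n_h
    return var
```

With strongly seasonal impingement counts, the between-season variance dominates, so removing it via seasonal strata sharply reduces the variance of the estimate — which is the effect the study exploits.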

  16. Standardization of a geo-referenced fishing data set for the Indian Ocean bigeye tuna, Thunnus obesus (1952-2014)

    Science.gov (United States)

    Wibawa, Teja A.; Lehodey, Patrick; Senina, Inna

    2017-02-01

    Geo-referenced catch and fishing effort data of the bigeye tuna fisheries in the Indian Ocean over 1952-2014 were analyzed and standardized to facilitate population dynamics modeling studies. During this 62-year historical period of exploitation, many changes occurred both in the fishing techniques and in the monitoring of activity. This study includes a series of processing steps used for standardization of spatial resolution, conversion and standardization of catch and effort units, raising of geo-referenced catch to nominal catch level, screening and correction of outliers, and detection of major catchability changes over long time series of fishing data, i.e., the Japanese longline fleet operating in the tropical Indian Ocean. A total of 30 fisheries were finally determined from the longline, purse seine and other-gears data sets, of which 10 longline and 4 purse seine fisheries represented 96% of the whole historical geo-referenced catch. Nevertheless, one-third of the total nominal catch is still not included due to a total lack of geo-referenced information and would need to be processed separately, according to the requirements of the study. The geo-referenced records of catch, fishing effort and associated length frequency samples of all fisheries are available at doi:10.1594/PANGAEA.864154.

  17. Near Real-Time Dissemination of Geo-Referenced Imagery by an Enterprise Server

    National Research Council Canada - National Science Library

    Brown, Alison; Gilbert, Chris; Holland, Heather; Lu, Yan

    2006-01-01

    .... The payload is connected through a data link to a ground-based server that can process the georegistered data in near-real-time using our GeoReferenced Information Manager (GRIM) Enterprise Server...

  18. Reliability of an experimental method to analyse the impact point on a golf ball during putting.

    Science.gov (United States)

    Richardson, Ashley K; Mitchell, Andrew C S; Hughes, Gerwyn

    2015-06-01

    This study aimed to examine the reliability of an experimental method identifying the location of the impact point on a golf ball during putting. Forty trials were completed using a mechanical putting robot set to reproduce a putt of 3.2 m, with four different putter-ball combinations. After locating the centre of the dimple pattern (centroid) the following variables were tested; distance of the impact point from the centroid, angle of the impact point from the centroid and distance of the impact point from the centroid derived from the X, Y coordinates. Good to excellent reliability was demonstrated in all impact variables reflected in very strong relative (ICC = 0.98-1.00) and absolute reliability (SEM% = 0.9-4.3%). The highest SEM% observed was 7% for the angle of the impact point from the centroid. In conclusion, the experimental method was shown to be reliable at locating the centroid location of a golf ball, therefore allowing for the identification of the point of impact with the putter head and is suitable for use in subsequent studies.
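The impact variables described above (distance and angle of the impact point from the centroid, derived from the X, Y coordinates) reduce to a Cartesian-to-polar conversion. A minimal sketch, with an assumed angle convention (degrees, counter-clockwise from the +x axis); the function name is illustrative:

```python
import math

def impact_from_centroid(x, y):
    """Distance and angle of an impact point given its X, Y offsets
    from the dimple-pattern centroid of the ball."""
    return math.hypot(x, y), math.degrees(math.atan2(y, x))
```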

  19. An Emergency Georeferencing Framework for GF-4 Imagery Based on GCP Prediction and Dynamic RPC Refinement

    Directory of Open Access Journals (Sweden)

    Pengfei Li

    2017-10-01

    Full Text Available GaoFen-4 (GF-4) imagery has great potential for emergency response due to its gazing mode. However, only poor geometric accuracy can be obtained using the rational polynomial coefficient (RPC) parameters provided, making ground control points (GCPs) necessary for emergency response. However, selecting GCPs is traditionally time-consuming, labor-intensive, and not fully reliable. This is mainly due to the facts that (1) manual GCP selection is time-consuming and cumbersome because of too many human interventions, especially for the first few GCPs; (2) typically, GF-4 gives planar array imagery acquired at rather large tilt angles, and the distortion introduces problems in image matching; and (3) reference data will not always be available, especially under emergency circumstances. This paper provides a novel emergency georeferencing framework for GF-4 Level 1 imagery. The key feature is GCP prediction based on dynamic RPC refinement, which is able to predict even the first GCP, and the prediction is dynamically refined as the selection goes on. This is done by two techniques: (1) GCP prediction using RPC parameters and (2) dynamic RPC refinement using as few as one GCP. Besides, online map services are also adopted to automatically provide reference data. Experimental results show that (1) GCP predictions improve with dynamic RPC refinement; (2) GCP selection becomes more efficient with GCP prediction; and (3) the integration of online map services constitutes a good example for emergency response.
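The refinement idea above can be illustrated in its simplest form: estimate a constant image-space bias from as few as one GCP and apply it to subsequent predictions, so each new GCP pick lands closer to the right spot. This is a toy sketch of the principle, not the paper's algorithm; `project` stands in for a real RPC forward model:

```python
def refine_rpc(project, gcps):
    """Wrap an RPC-style forward projection with a constant image-space
    bias estimated from one or more GCPs. `project(lat, lon, h)` must
    return raw (row, col); the returned function adds the mean offset."""
    offsets = [
        (r_true - r, c_true - c)
        for (lat, lon, h, r_true, c_true) in gcps
        for (r, c) in [project(lat, lon, h)]
    ]
    dr = sum(o[0] for o in offsets) / len(offsets)
    dc = sum(o[1] for o in offsets) / len(offsets)

    def refined(lat, lon, h):
        r, c = project(lat, lon, h)
        return r + dr, c + dc

    return refined
```

With several GCPs, a real framework would grow this to an affine bias model and re-estimate it dynamically after each new selection.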

  20. The influence of the in situ camera calibration for direct georeferencing of aerial imagery

    Science.gov (United States)

    Mitishita, E.; Barrios, R.; Centeno, J.

    2014-11-01

    The direct determination of exterior orientation parameters (EOPs) of aerial images via GNSS/INS technologies is an essential prerequisite in photogrammetric mapping nowadays. Although direct sensor orientation technologies provide a high degree of automation in the process due to the GNSS/INS technologies, the accuracy of the obtained results depends on the quality of a group of parameters that accurately models the conditions of the system at the moment the job is performed. One sub-group of parameters (lever arm offsets and boresight misalignments) models the position and orientation of the sensors with respect to the IMU body frame, due to the impossibility of having all sensors in the same position and orientation on the airborne platform. Another sub-group of parameters models the internal characteristics of the sensor (IOP). A system calibration procedure has been recommended by worldwide studies to obtain accurate parameters (mounting and sensor characteristics) for applications of direct sensor orientation. Commonly, mounting and sensor characteristics are not stable; they can vary under different flight conditions. The system calibration requires a geometric arrangement of the flight and/or control points to decouple correlated parameters, which is not available in a conventional photogrammetric flight. Considering this difficulty, this study investigates the feasibility of in situ camera calibration to improve the accuracy of the direct georeferencing of aerial images. The camera calibration uses a minimal image block, extracted from the conventional photogrammetric flight, and a control point arrangement. A digital Vexcel UltraCam XP camera connected to a POS AV™ system was used to acquire two photogrammetric image blocks. The blocks have different flight directions and opposite flight lines. In situ calibration procedures to compute different sets of IOPs are performed and their results are analyzed and used in photogrammetric experiments. The IOPs

  1. Web-GIS approach for integrated analysis of heterogeneous georeferenced data

    Science.gov (United States)

    Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander; Shulgina, Tamara

    2014-05-01

    Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales [1]. Due to the inherent heterogeneity of environmental datasets as well as their huge size (up to tens of terabytes for a single dataset), special software supporting studies in the climate and environmental change areas is required [2]. A dedicated information-computational system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is presented. It is based on a combination of Web and GIS technologies according to Open Geospatial Consortium (OGC) standards, and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library (http://www.geoext.org), the ExtJS framework (http://www.sencha.com/products/extjs) and OpenLayers software (http://openlayers.org). The main advantage of the system lies in its capability to perform integrated analysis of time series of georeferenced data obtained from different sources (in-situ observations, model results, remote sensing data) and to combine the results in a single map [3, 4] as WMS and WFS layers in a web-GIS application. Analysis results are also available for downloading as binary files from the graphical user interface, or can be directly accessed through web mapping (WMS) and web feature (WFS) services for further processing by the user. Data processing is performed on a geographically distributed computational cluster comprising data storage systems and corresponding computational nodes. Several geophysical datasets represented by NCEP/NCAR Reanalysis II, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis, DWD Global Precipitation Climatology Centre's data, GMAO Modern Era-Retrospective analysis for Research and Applications, reanalysis of Monitoring

  2. Automatic UAV Image Geo-Registration by Matching UAV Images to Georeferenced Image Data

    Directory of Open Access Journals (Sweden)

    Xiangyu Zhuo

    2017-04-01

    Full Text Available Recent years have witnessed the fast development of UAVs (unmanned aerial vehicles). As an alternative to traditional image acquisition methods, UAVs bridge the gap between terrestrial and airborne photogrammetry and enable flexible acquisition of high resolution images. However, the georeferencing accuracy of UAVs is still limited by the low-performance on-board GNSS and INS. This paper investigates automatic geo-registration of an individual UAV image or UAV image blocks by matching the UAV image(s) with a previously taken georeferenced image, such as an individual aerial or satellite image with a height map attached, or an aerial orthophoto with a DSM (digital surface model) attached. As the biggest challenge for matching UAV and aerial images lies in the large differences in scale and rotation, we propose a novel feature matching method for nadir or slightly tilted images. The method is comprised of a dense feature detection scheme, a one-to-many matching strategy and a global geometric verification scheme. The proposed method is able to find thousands of valid matches in cases where SIFT and ASIFT fail. Those matches can be used to geo-register the whole UAV image block towards the reference image data. When the reference images offer high georeferencing accuracy, the UAV images can also be geolocalized in a global coordinate system. A series of experiments involving different scenarios was conducted to validate the proposed method. The results demonstrate that our approach achieves not only decimeter-level registration accuracy, but also global accuracy comparable to that of the reference images.

  3. GPS receivers for georeferencing of spatial variability of soil attributes Receptores GPS para georreferenciamento da variabilidade espacial de atributos do solo

    Directory of Open Access Journals (Sweden)

    David L Rosalen

    2011-12-01

    Full Text Available The characterization of the spatial variability of soil attributes is essential to support agricultural practices in a sustainable manner. The use of geostatistics to characterize the spatial variability of these attributes, such as soil resistance to penetration (RP) and gravimetric soil moisture (GM), is now usual practice in precision agriculture. The result of geostatistical analysis depends on the sample density and on other factors, such as the georeferencing methodology used. Thus, this study aimed to compare two methods of georeferencing for characterizing the spatial variability of RP and GM, as well as the spatial correlation of these variables. A sampling grid of 60 points spaced 20 m apart was used. For the RP measurements an electronic penetrometer was used, and to determine GM a Dutch auger (0.0-0.1 m depth) was used. The samples were georeferenced using Simple Point Positioning (SPP) with a navigation GPS receiver, and Semi-Kinematic Relative Positioning (SKRP) with an L1 geodetic GPS receiver. The results indicated that georeferencing conducted by SPP did not affect the characterization of the spatial variability of RP or GM, nor the spatial structure relationship of these attributes.
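The geostatistical characterization referred to above starts from an experimental semivariogram of the sampled attribute. A minimal NumPy sketch, assuming the study's 60-point grid with 20 m spacing but entirely synthetic attribute values:

```python
import numpy as np

def experimental_semivariogram(coords, values, lag_edges):
    """Classical (Matheron) semivariance per lag bin: gamma(h) = mean(dz^2) / 2."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)            # all unordered point pairs
    h = np.linalg.norm(coords[i] - coords[j], axis=1)   # pair separation distance
    dz2 = (values[i] - values[j]) ** 2
    gammas = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (h >= lo) & (h < hi)
        gammas.append(0.5 * dz2[mask].mean() if mask.any() else np.nan)
    return np.array(gammas)

# 60-point grid spaced 20 m, as in the study; attribute values are synthetic
rng = np.random.default_rng(1)
xy = np.array([(x * 20.0, y * 20.0) for x in range(10) for y in range(6)])
z = rng.normal(size=len(xy))
gamma = experimental_semivariogram(xy, z, lag_edges=[0, 25, 50, 75, 100])
```

Fitting a model (spherical, exponential, ...) to `gamma` would then supply the kriging weights used to map RP and GM.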

  4. Accurate Reconstruction of the Roman Circus in Milan by Georeferencing Heterogeneous Data Sources with GIS

    Directory of Open Access Journals (Sweden)

    Gabriele Guidi

    2017-09-01

    Full Text Available This paper presents the methodological approach and the actual workflow for creating the 3D digital reconstruction in time of the ancient Roman Circus of Milan, which is presently completely covered by the urban fabric of the modern city. The diachronic reconstruction is based on a proper mix of quantitative data originating from current 3D surveys and historical sources, such as ancient maps, drawings, archaeological reports, restriction decrees, and old photographs. When possible, such heterogeneous sources have been georeferenced and stored in a GIS system. In this way the sources have been analyzed in depth, allowing the deduction of geometrical information not explicitly revealed by the available material. A reliable reconstruction of the area in different historical periods has therefore been hypothesized. This research has been carried out in the framework of the project Cultural Heritage Through Time (CHT2), funded by the Joint Programming Initiative on Cultural Heritage (JPI-CH) and supported by the Italian Ministry for Cultural Heritage (MiBACT), the Italian Ministry for University and Research (MIUR), and the European Commission.

  5. DactyLoc : A minimally geo-referenced WiFi+GSM-fingerprint-based localization method for positioning in urban spaces

    DEFF Research Database (Denmark)

    Cujia, Kristian; Wirz, Martin; Kjærgaard, Mikkel Baun

    2012-01-01

    Fingerprinting-based localization methods relying on WiFi and GSM information provide sufficient localization accuracy for many mobile phone applications. Most existing approaches require a training set consisting of geo-referenced fingerprints to build a reference database. We propose a collaborative, semi-supervised WiFi+GSM fingerprinting method where only a small fraction of all fingerprints needs to be geo-referenced. Our approach enables indexing of areas in the absence of GPS reception, as often found in urban spaces and indoors, without manual labeling of fingerprints. The method takes...
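Fingerprint-based positioning of the kind described here is commonly implemented as a weighted k-nearest-neighbour search in signal space. A toy sketch with an invented three-access-point database (this illustrates plain fingerprinting, not the paper's collaborative, semi-supervised method):

```python
import numpy as np

def knn_locate(db_fps, db_xy, query_fp, k=3):
    """Weighted k-nearest-neighbour position estimate in RSSI fingerprint space."""
    d = np.linalg.norm(db_fps - query_fp, axis=1)   # signal-space distance
    nn = np.argsort(d)[:k]                          # k most similar fingerprints
    w = 1.0 / (d[nn] + 1e-9)                        # inverse-distance weights
    return (db_xy[nn] * w[:, None]).sum(0) / w.sum()

# Toy database: 4 geo-referenced fingerprints (RSSI of 3 APs) and their positions
db_fps = np.array([[-40., -70., -90.],
                   [-70., -40., -90.],
                   [-90., -70., -40.],
                   [-60., -60., -60.]])
db_xy = np.array([[0., 0.], [10., 0.], [10., 10.], [5., 5.]])
est = knn_locate(db_fps, db_xy, np.array([-42., -68., -88.]))
```

The query fingerprint above was chosen close to the first database entry, so the estimate lands near position (0, 0).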

  6. Mobile Laser Scanning along Dieppe coastal cliffs: reliability of the acquired point clouds applied to rockfall assessments

    Science.gov (United States)

    Michoud, Clément; Carrea, Dario; Augereau, Emmanuel; Cancouët, Romain; Costa, Stéphane; Davidson, Robert; Delacourt, Christophe; Derron, Marc-Henri; Jaboyedoff, Michel; Letortu, Pauline; Maquaire, Olivier

    2013-04-01

    Dieppe coastal cliffs, in Normandy, France, are mainly formed by sub-horizontal deposits of chalk and flint. Largely destabilized by intense weathering and erosion by the Channel sea, small and large rockfalls occur regularly and contribute to retrogressive cliff processes. During autumn 2012, cliff and intertidal topographies were acquired with a Terrestrial Laser Scanner (TLS) and a Mobile Laser Scanner (MLS), coupled with seafloor bathymetry acquired with a multibeam echosounder (MBES). MLS is a recent development of laser scanning based on the same theoretical principles as aerial LiDAR, but using smaller, cheaper and portable devices. The MLS system, composed of an accurate dynamic positioning and orientation device (INS) and a long-range LiDAR, is mounted on a marine vessel; it is then possible to quickly acquire georeferenced LiDAR point clouds in motion with a resolution of about 15 cm. For example, it takes about 1 h to scan 2 km of shoreline. MLS is becoming a promising technique supporting erosion and rockfall assessments along the shores of lakes, fjords or seas. In this study, the MLS system used to acquire the cliffs and intertidal areas of the Cap d'Ailly was composed of an Applanix POS-MV 320 V4 INS and an Optech ILRIS-LR LiDAR. On the same day, three MLS scans with large overlaps (J1, J2 and J3) were performed at ranges from 600 m at 4 knots (low tide) down to 200 m at 2.2 knots (high tide), with a calm sea at Beaufort 2.5 (small wavelets). Mean scan resolutions range from 26 cm for the far scan (J1) to about 8.1 cm for the close scan (J3). Moreover, one TLS point cloud of this test site was acquired with a mean resolution of about 2.3 cm, using a Riegl LMS-Z390i. In order to quantify the reliability of the methodology, comparisons between scans were performed with the Polyworks™ software, calculating the shortest distances between the points of one cloud and the interpolated surface of the reference point cloud. A Mat
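The scan-to-scan comparison described above (shortest distance from each point to a reference cloud) can be sketched in NumPy. This simplified stand-in measures the distance to the nearest reference point rather than to an interpolated surface as Polyworks does, which is adequate when the reference cloud is dense; all point data below are synthetic:

```python
import numpy as np

def cloud_to_cloud_distances(test_cloud, ref_cloud):
    """Shortest point-to-point distance from each test point to a reference cloud."""
    diff = test_cloud[:, None, :] - ref_cloud[None, :, :]   # (n_test, n_ref, 3)
    return np.sqrt((diff ** 2).sum(-1)).min(axis=1)

rng = np.random.default_rng(2)
ref = rng.uniform(0, 10, (500, 3))                  # dense "TLS" reference cloud
test = ref[:50] + rng.normal(0, 0.02, (50, 3))      # noisy "MLS" re-observation
d = cloud_to_cloud_distances(test, ref)             # per-point discrepancies
```

On real clouds of millions of points a k-d tree (or an octree) would replace the brute-force pairwise distance matrix.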

  7. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    Directory of Open Access Journals (Sweden)

    H. S. Liu

    2015-08-01

    Full Text Available A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to calculate positioning properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate image, coordinates and camera position. However, such a system is very expensive, and users cannot use the result immediately because the position information is not embedded in the images. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, positioning is calculated with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate all of these into the open source GIS software Quantum GIS. In this way a complete data collection and processing system can be constructed.

  8. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    Science.gov (United States)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to calculate positioning properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate image, coordinates and camera position. However, such a system is very expensive, and users cannot use the result immediately because the position information is not embedded in the images. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, positioning is calculated with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate all of these into the open source GIS software Quantum GIS. In this way a complete data collection and processing system can be constructed.

  9. Object-Based Coregistration of Terrestrial Photogrammetric and ALS Point Clouds in Forested Areas

    Science.gov (United States)

    Polewski, P.; Erickson, A.; Yao, W.; Coops, N.; Krzystek, P.; Stilla, U.

    2016-06-01

    Airborne Laser Scanning (ALS) and terrestrial photogrammetry are methods applicable for mapping forested environments. While ground-based techniques provide valuable information about the forest understory, the measured point clouds are normally expressed in a local coordinate system, whose transformation into a georeferenced system requires additional effort. In contrast, ALS point clouds are usually georeferenced, yet the point density near the ground may be poor under dense overstory conditions. In this work, we propose to combine the strengths of the two data sources by co-registering the respective point clouds, thus enriching the georeferenced ALS point cloud with detailed understory information in a fully automatic manner. Due to markedly different sensor characteristics, coregistration methods which expect a high geometric similarity between keypoints are not suitable in this setting. Instead, our method focuses on the object (tree stem) level. We first calculate approximate stem positions in the terrestrial and ALS point clouds and construct, for each stem, a descriptor which quantifies the 2D and vertical distances to other stem centers (at ground height). Then, the similarities between all descriptor pairs from the two point clouds are calculated, and standard graph maximum matching techniques are employed to compute corresponding stem pairs (tiepoints). Finally, the tiepoint subset yielding the optimal rigid transformation between the terrestrial and ALS coordinate systems is determined. We test our method on simulated tree positions and a plot situated in the northern interior of the Coast Range in western Oregon, USA, using ALS data (76 x 121 m2) and a photogrammetric point cloud (33 x 35 m2) derived from terrestrial photographs taken with a handheld camera. Results on both simulated and real data show that the proposed stem descriptors are discriminative enough to derive good correspondences. Specifically, for the real plot data, 24
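The stem-descriptor idea above can be sketched as follows: build a translation-invariant descriptor for each stem from its distances to neighbouring stems, then match descriptors across the two clouds. For brevity this sketch uses greedy best-first assignment instead of the paper's maximum graph matching, and the stem positions are synthetic:

```python
import numpy as np

def stem_descriptors(stems, k=3):
    """Per-stem descriptor: sorted 2D distances to the k nearest other stems."""
    d = np.linalg.norm(stems[:, None] - stems[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # ignore self-distance
    return np.sort(d, axis=1)[:, :k]

def match_stems(desc_a, desc_b):
    """Greedy best-first matching on descriptor similarity (a stand-in for
    the paper's maximum graph matching)."""
    cost = np.linalg.norm(desc_a[:, None] - desc_b[None, :], axis=-1)
    pairs, used_a, used_b = [], set(), set()
    for idx in np.argsort(cost, axis=None):      # cheapest candidate pairs first
        i, j = divmod(int(idx), cost.shape[1])
        if i not in used_a and j not in used_b:
            pairs.append((i, j))
            used_a.add(i); used_b.add(j)
    return sorted(pairs)

# "ALS" stems vs. the same stems observed in a shifted local (terrestrial) frame
rng = np.random.default_rng(3)
als = rng.uniform(0, 30, (8, 2))
terrestrial = als + np.array([120.0, -45.0])     # rigid shift between frames
pairs = match_stems(stem_descriptors(terrestrial), stem_descriptors(als))
```

Because the descriptor is built from inter-stem distances only, it is unchanged by the frame shift, and the matcher recovers the identity correspondence; the resulting tiepoints would then feed a rigid-transform solve.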

  10. Toward Automatic Georeferencing of Archival Aerial Photogrammetric Surveys

    Science.gov (United States)

    Giordano, S.; Le Bris, A.; Mallet, C.

    2018-05-01

    Images from archival aerial photogrammetric surveys are a unique and relatively unexplored means to chronicle 3D land-cover changes over the past 100 years. They provide a relatively dense temporal sampling of the territories with very high spatial resolution. Such time series image analysis is a mandatory baseline for a large variety of long-term environmental monitoring studies. The current bottleneck for accurate comparison between epochs is their fine georeferencing step. No fully automatic method has been proposed yet and existing studies are rather limited in terms of area and number of dates. The state of the art shows that the major challenge is the identification of ground references: cartographic coordinates and their position in the archival images. This task is performed manually and is extremely time-consuming. This paper proposes to use a photogrammetric approach, and states that the 3D information that can be computed is the key to full automation. Its original idea lies in a 2-step approach: (i) the computation of a coarse absolute image orientation; (ii) the use of the coarse Digital Surface Model (DSM) information for automatic absolute image orientation. It only relies on a recent orthoimage+DSM, used as master reference for all epochs. The coarse orthoimage, compared with such a reference, allows the identification of dense ground references and the coarse DSM provides their position in the archival images. Results on two areas and 5 dates show that this method is compatible with long and dense archival aerial image series. Satisfactory planimetric and altimetric accuracies are reported, with variations depending on the ground sampling distance of the images and the location of the Ground Control Points.

  11. TOWARD AUTOMATIC GEOREFERENCING OF ARCHIVAL AERIAL PHOTOGRAMMETRIC SURVEYS

    Directory of Open Access Journals (Sweden)

    S. Giordano

    2018-05-01

    Full Text Available Images from archival aerial photogrammetric surveys are a unique and relatively unexplored means to chronicle 3D land-cover changes over the past 100 years. They provide a relatively dense temporal sampling of the territories with very high spatial resolution. Such time series image analysis is a mandatory baseline for a large variety of long-term environmental monitoring studies. The current bottleneck for accurate comparison between epochs is their fine georeferencing step. No fully automatic method has been proposed yet and existing studies are rather limited in terms of area and number of dates. The state of the art shows that the major challenge is the identification of ground references: cartographic coordinates and their position in the archival images. This task is performed manually and is extremely time-consuming. This paper proposes to use a photogrammetric approach, and states that the 3D information that can be computed is the key to full automation. Its original idea lies in a 2-step approach: (i) the computation of a coarse absolute image orientation; (ii) the use of the coarse Digital Surface Model (DSM) information for automatic absolute image orientation. It only relies on a recent orthoimage+DSM, used as master reference for all epochs. The coarse orthoimage, compared with such a reference, allows the identification of dense ground references and the coarse DSM provides their position in the archival images. Results on two areas and 5 dates show that this method is compatible with long and dense archival aerial image series. Satisfactory planimetric and altimetric accuracies are reported, with variations depending on the ground sampling distance of the images and the location of the Ground Control Points.

  12. Integration of georeferencing, habitat, sampling, and genetic data for documentation of wild plant genetic resources

    Science.gov (United States)

    Plant genetic resource collections provide novel materials to the breeding and research communities. Availability of detailed documentation of passport, phenotypic, and genetic data increases the value of the genebank accessions. Inclusion of georeferenced sources, habitats, and sampling data in co...

  13. Obesity and Fast Food in Urban Markets: A New Approach Using Geo-referenced Micro Data

    NARCIS (Netherlands)

    Chen, S.E.; Florax, R.J.G.M.; Snyder, S.D.

    2013-01-01

    This paper presents a new method of assessing the relationship between features of the built environment and obesity, particularly in urban areas. Our empirical application combines georeferenced data on the location of fast-food restaurants with data about personal health, behavioral, and

  14. Alternative Estimates of the Reliability of College Grade Point Averages. Professional File. Article 130, Spring 2013

    Science.gov (United States)

    Saupe, Joe L.; Eimers, Mardy T.

    2013-01-01

    The purpose of this paper is to explore differences in the reliabilities of cumulative college grade point averages (GPAs), estimated for unweighted and weighted, one-semester, 1-year, 2-year, and 4-year GPAs. Using cumulative GPAs for a freshman class at a major university, we estimate internal consistency (coefficient alpha) reliabilities for…
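The internal-consistency (coefficient alpha) reliability mentioned above treats each semester GPA as an "item" and each student as a "case". A minimal sketch with hypothetical GPA data, not the study's actual freshman-class records:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Coefficient alpha; items (e.g. semester GPAs) in columns, students in rows."""
    item_scores = np.asarray(item_scores, float)
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = item_scores.sum(axis=1).var(ddof=1)     # variance of total score
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical data: 4 semester GPAs for 6 students
gpas = np.array([[3.2, 3.4, 3.1, 3.3],
                 [2.1, 2.3, 2.0, 2.2],
                 [3.8, 3.7, 3.9, 3.6],
                 [2.9, 3.0, 2.8, 3.1],
                 [1.9, 2.2, 2.1, 2.0],
                 [3.5, 3.3, 3.4, 3.6]])
alpha = cronbach_alpha(gpas)
```

Because the per-student semester GPAs in this toy matrix track each other closely, alpha comes out near 1; weighted GPAs would simply enter as differently scaled items.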

  15. A Geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity

    Science.gov (United States)

    Blair, J.L.; McCrory, P.A.; Oppenheimer, D.H.; Waldhauser, F.

    2011-01-01

    We present a Geographic Information System (GIS) of a new 3-dimensional (3D) model of the subducted Juan de Fuca Plate beneath western North America and associated seismicity of the Cascadia subduction system. The geo-referenced 3D model was constructed from weighted control points that integrate depth information from hypocenter locations and regional seismic velocity studies. We used the 3D model to differentiate earthquakes that occur above the Juan de Fuca Plate surface from earthquakes that occur below the plate surface. This GIS project of the Cascadia subduction system supersedes the one previously published by McCrory and others (2006). Our new slab model updates the model with new constraints. The most significant updates to the model include: (1) weighted control points to incorporate spatial uncertainty, (2) an additional gridded slab surface based on the Generic Mapping Tools (GMT) Surface program which constructs surfaces based on splines in tension (see expanded description below), (3) double-differenced hypocenter locations in northern California to better constrain slab location there, and (4) revised slab shape based on new hypocenter profiles that incorporate routine depth uncertainties as well as data from new seismic-reflection and seismic-refraction studies. We also provide a 3D fly-through animation of the model for use as a visualization tool.
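Gridding a slab surface from weighted control points can be illustrated with simple inverse-distance weighting; the actual model uses GMT's splines-in-tension Surface program, so this is only a conceptual stand-in, and the control points below are invented:

```python
import numpy as np

def idw_grid(ctrl_xy, ctrl_depth, weights, grid_x, grid_y, power=2.0):
    """Inverse-distance-weighted depth surface from weighted control points
    (a simple stand-in for splines-in-tension gridding)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    d = np.linalg.norm(pts[:, None] - ctrl_xy[None], axis=-1)
    w = weights / np.maximum(d, 1e-9) ** power      # spatial weight x point confidence
    return (w * ctrl_depth).sum(1) / w.sum(1)       # flattened grid of depths

# Invented control points: depth of the slab surface (km) with confidence weights
ctrl_xy = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])
ctrl_depth = np.array([-20., -40., -20., -40.])
weights = np.array([1.0, 1.0, 0.5, 0.5])            # spatial-uncertainty weights
z = idw_grid(ctrl_xy, ctrl_depth, weights, np.linspace(0, 10, 5), np.linspace(0, 10, 5))
```

The interpolated depths stay within the range of the control depths, which is one property IDW shares with the data it honors; splines in tension additionally control surface curvature between points.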

  16. Operational reliability of the Point Lepreau GS standby generators

    Energy Technology Data Exchange (ETDEWEB)

    Loughead, D. A.; McGregor, A. T. [Safety and Compliance Group, New Brunswick Electric Power Commission, Point Lepreau Generating Station, P.O. Box 10, Lepreau, New Brunswick E0G 2H0 (Canada)

    1986-02-15

    Performance of the two Point Lepreau GS standby generators during the first three years of licensed station operation is reviewed. It is shown that the mandated reliability/availability requirements have been met. The nature of starting and running failures has been examined and the consequences, in terms of design and procedural changes, discussed. A brief review of standby generator outages is included to permit estimates of standby generator availability and total Class III standby power unavailability. A pair of simple equations is introduced as a means of estimating the probable economic penalty, both cumulative and incremental, associated with the running failure of one standby generator while the other is on a maintenance outage. (authors)

  17. Operational reliability of the Point Lepreau GS standby generators

    International Nuclear Information System (INIS)

    Loughead, D.A.; McGregor, A.T.

    1986-01-01

    Performance of the two Point Lepreau GS standby generators during the first three years of licensed station operation is reviewed. It is shown that the mandated reliability/availability requirements have been met. The nature of starting and running failures has been examined and the consequences, in terms of design and procedural changes, discussed. A brief review of standby generator outages is included to permit estimates of standby generator availability and total Class III standby power unavailability. A pair of simple equations is introduced as a means of estimating the probable economic penalty, both cumulative and incremental, associated with the running failure of one standby generator while the other is on a maintenance outage. (authors)

  18. Geo-referenced modelling of metal concentrations in river basins at the catchment scale

    Science.gov (United States)

    Hüffmeyer, N.; Berlekamp, J.; Klasmeier, J.

    2009-04-01

    1. Introduction The European Water Framework Directive demands a good ecological and chemical status of surface waters [1]. This implies the reduction of unwanted metal concentrations in surface waters. To define reasonable environmental target values and to develop promising mitigation strategies, a detailed exposure assessment is required. This includes the identification of emission sources and the evaluation of their effect on local and regional surface water concentrations. Point-source emissions via municipal or industrial wastewater, which collect metal loads from a wide variety of applications and products, are important anthropogenic pathways into receiving waters. Natural background and historical influences from ore-mining activities may be another important factor. Non-point emissions occur via surface runoff and erosion from drained land areas. Besides atmospheric deposition, metals can be introduced by fertilizer application or the use of metal products such as wires or metal fences. Surface water concentrations vary according to the emission strength of sources located nearby and upstream of the considered location. A direct link between specific emission sources and pathways on the one hand and observed concentrations on the other can hardly be established by monitoring alone. Geo-referenced models such as GREAT-ER (Geo-referenced Regional Exposure Assessment Tool for European Rivers) deliver spatially resolved concentrations for a whole river basin and allow for evaluating the causal relationship between specific emissions and resulting concentrations. This study summarizes the results of investigations of the metals zinc and copper in three German catchments. 2. The model GREAT-ER The geo-referenced model GREAT-ER was originally developed to simulate and assess the chemical burden of European river systems from multiple emission sources [2]. 
Emission loads from private households and rainwater runoff are individually estimated based on average consumption figures, runoff rates

  19. Inter- and Intraexaminer Reliability in Identifying and Classifying Myofascial Trigger Points in Shoulder Muscles.

    Science.gov (United States)

    Nascimento, José Diego Sales do; Alburquerque-Sendín, Francisco; Vigolvino, Lorena Passos; Oliveira, Wandemberg Fortunato de; Sousa, Catarina de Oliveira

    2018-01-01

    To determine inter- and intraexaminer reliability of examiners without clinical experience in identifying and classifying myofascial trigger points (MTPs) in the shoulder muscles of subjects asymptomatic and symptomatic for unilateral subacromial impingement syndrome (SIS). Within-day inter- and intraexaminer reliability study. Physical therapy department of a university. Fifty-two subjects participated in the study, 26 symptomatic and 26 asymptomatic for unilateral SIS. Two examiners without experience in assessing MTPs, independent and blinded to the clinical conditions of the subjects, bilaterally assessed the presence of MTPs (present or absent) in 6 shoulder muscles and classified them (latent or active) on the affected side of the symptomatic group. Each examiner performed the same assessment twice in the same day. Reliability was calculated through percentage agreement, prevalence- and bias-adjusted kappa (PABAK) statistics, and weighted kappa. Intraexaminer reliability in identifying MTPs for the symptomatic and asymptomatic groups was moderate to perfect (PABAK, .46-1 and .60-1, respectively). Interexaminer reliability was between moderate and almost perfect in the 2 groups (PABAK, .46-.92), except for the muscles of the symptomatic group, which fell below these values. With respect to MTP classification, intraexaminer reliability was moderate to high for most muscles, but interexaminer reliability was moderate for only 1 muscle (weighted κ=.45), and between weak and reasonable for the rest (weighted κ=.06-.31). Intraexaminer reliability is acceptable in clinical practice for identifying and classifying MTPs. However, interexaminer assessment proved reliable only for identifying MTPs, with the symptomatic side exhibiting lower reliability values. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
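The agreement statistics used in this study are straightforward to compute. A sketch of PABAK (2·Po − 1 for two categories) alongside unweighted Cohen's kappa, on invented examiner ratings rather than the study's data:

```python
import numpy as np

def pabak(ratings_a, ratings_b):
    """Prevalence- and bias-adjusted kappa for two categories: 2*Po - 1."""
    po = np.mean(np.asarray(ratings_a) == np.asarray(ratings_b))
    return 2.0 * po - 1.0

def cohen_kappa(a, b, labels):
    """Unweighted Cohen's kappa, for comparison with PABAK."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                                   # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in labels)  # chance agreement
    return (po - pe) / (1.0 - pe)

# Invented ratings: two examiners scoring MTP presence (1) / absence (0) in 20 muscles
ex1 = np.array([1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1])
ex2 = np.array([1, 1, 0, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1])
k_pabak = pabak(ex1, ex2)   # 18/20 agreement -> 0.8
```

PABAK is preferred in studies like this one because plain kappa can be paradoxically low when one category (e.g. "MTP present") dominates.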

  20. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    Science.gov (United States)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which at present may reach tens of terabytes for a single dataset, studies in the area of climate and environmental change require special software support. A dedicated software framework for rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of 3 basic parts: a computational kernel developed using the ITT VIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for development of typical components of a web-mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a basis for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS Framework and the OpenLayers software. Based on this software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. 
Structured environmental datasets available for processing now include two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis

  1. Orthopaedic nurses' knowledge and interrater reliability of neurovascular assessments with 2-point discrimination test.

    Science.gov (United States)

    Turney, Jennifer; Raley Noble, Deana; Kim, Son Chae

    2013-01-01

    This study was conducted to evaluate the effects of education on knowledge and interrater reliability of neurovascular assessments with the 2-point discrimination (2-PD) test among pediatric orthopaedic nurses. A pre- and posttest study was done among 60 nurses attending 2-hour educational sessions. Neurovascular assessments with the 2-PD test were performed on 64 casted pediatric patients by the nurses and 5 nurse experts before and after the educational sessions. The mean neurovascular assessment knowledge score improved at posteducation compared with preeducation (p < .001). The 2-PD test interrater reliability also improved, from a Cohen's kappa value of 0.24 preeducation to 0.48 posteducation. The 2-hour educational session may be effective in improving nurses' knowledge and the interrater reliability of neurovascular assessment with the 2-PD test.

  2. The Direct Georeferencing Application and Performance Analysis of Uav Helicopter in Gcp-Free Area

    Science.gov (United States)

    Lo, C. F.; Tsai, M. L.; Chiang, K. W.; Chu, C. H.; Tsai, G. J.; Cheng, C. K.; El-Sheimy, N.; Ayman, H.

    2015-08-01

    Many disasters have occurred in recent years because of extreme weather changes, so facilitating applications such as environmental detection and monitoring has become very important. The development of rapid, low-cost systems for collecting near real-time spatial information is therefore critical, and rapid spatial information collection has become an emerging trend for remote sensing and mapping applications. This study develops a Direct Georeferencing (DG) based Unmanned Aerial Vehicle (UAV) helicopter photogrammetric platform, in which an Inertial Navigation System (INS)/Global Navigation Satellite System (GNSS) integrated Positioning and Orientation System (POS) is implemented to provide the DG capability of the platform. The performance verification indicates that the proposed platform can capture aerial images successfully. A flight test was performed to verify the positioning accuracy in DG mode without using Ground Control Points (GCPs). The preliminary results illustrate that the horizontal DG positioning accuracies in the x and y axes are around 5 m at a flight height of 100 m, and the positioning accuracy in the z axis is less than 10 m. Such accuracy is good for near real-time disaster relief. The DG-ready function of the proposed platform guarantees mapping and positioning capability even in GCP-free environments, which is very important for rapid urgent response for disaster relief. Generally speaking, the data processing time for the DG module, including POS solution generation, interpolation, Exterior Orientation Parameter (EOP) generation, and feature point measurement, is less than 1 hour.
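The core of direct georeferencing is turning the POS-derived camera position and attitude plus an image ray into a ground coordinate without GCPs. A simplified sketch that intersects the ray with a flat ground plane (a real pipeline would intersect a DSM and model the full camera geometry; all values below are illustrative):

```python
import numpy as np

def dg_ground_point(cam_pos, R_cam2world, pixel_ray_cam, ground_z=0.0):
    """Direct georeferencing of one image point: rotate the camera-frame ray
    into the mapping frame using the POS attitude, then intersect it with a
    flat ground plane at height ground_z."""
    ray_w = R_cam2world @ pixel_ray_cam
    s = (ground_z - cam_pos[2]) / ray_w[2]   # scale factor along the ray
    return cam_pos + s * ray_w

# Nadir-looking camera 100 m above ground (the flight height in the study)
cam_pos = np.array([500.0, 800.0, 100.0])   # camera position from GNSS
R = np.eye(3)                               # level attitude from the INS (toy case)
ray = np.array([0.01, -0.02, -1.0])         # ray through one image point
xyz = dg_ground_point(cam_pos, R, ray)
```

For this toy geometry the scale works out to 100, so the ground point lands at (501, 798, 0); INS attitude errors propagate to ground error roughly in proportion to flying height, which is why DG accuracy degrades with low-cost POS hardware.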

  3. Development of web-GIS system for analysis of georeferenced geophysical data

    Science.gov (United States)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.; Bogomolov, V. Y.; Genina, E.; Martynova, Y.; Shulgina, T. M.

    2012-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size, which at present may reach tens of terabytes for a single dataset, studies in the area of climate and environmental change require special software support. A dedicated web-GIS information-computational system for analysis of georeferenced climatological and meteorological data has been created. The information-computational system consists of 4 basic parts: a computational kernel developed using the GNU Data Language (GDL), a set of PHP controllers run within a specialized web portal, JavaScript class libraries for development of typical components of a web-mapping application graphical user interface (GUI) based on AJAX technology, and an archive of geophysical datasets. The computational kernel comprises a number of dedicated modules for querying and extraction of data, mathematical and statistical data analysis, visualization, and preparation of output files in geoTIFF and netCDF formats containing processing results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a basis for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript libraries aimed at graphical user interface development are based on the GeoExt library, combining the ExtJS Framework and the OpenLayers software. The archive of geophysical data consists of a number of structured environmental datasets represented by data files in netCDF, HDF, GRIB, and ESRI Shapefile formats. 
For processing by the system are available: two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25

  4. Methods for Georeferencing and Spectral Scaling of Remote Imagery using ArcView, ArcGIS, and ENVI

    Science.gov (United States)

    Remote sensing images can be used to support variable-rate (VR) application of material from aircraft. Geographic coordinates must be assigned to an image (georeferenced) so that the variable-rate system can determine where in the field to apply these inputs and adjust the system when a zone has bee...

  5. RIGOROUS GEOREFERENCING OF ALSAT-2A PANCHROMATIC AND MULTISPECTRAL IMAGERY

    Directory of Open Access Journals (Sweden)

    I. Boukerch

    2013-04-01

    Full Text Available Exploiting the full geometric capabilities of High-Resolution Satellite Imagery (HRSI) requires the development of an appropriate sensor orientation model. Several authors have studied this problem; in general there are two categories of geometric models: physical and empirical. Based on analysis of the metadata provided with ALSAT-2A, a rigorous pushbroom camera model can be developed. This model has been successfully applied to many very-high-resolution imagery systems. The relation between image and ground coordinates through the time-dependent collinearity equations, involving several coordinate systems, has been tested. The interior orientation parameters must be integrated into the model; they can be estimated from the viewing angles corresponding to the pointing directions of each detector, values derived from cubic polynomials provided in the metadata. The developed model integrates all the necessary elements, with 33 unknowns. Approximate values for all 33 unknown parameters may be derived from the information contained in the metadata files provided with the imagery technical specifications, or simply fixed to zero; the condition equation is then linearized and solved by SVD in a least-squares sense in order to correct the initial values, using a suitable number of well-distributed GCPs. Using ALSAT-2A images over the town of Toulouse in the south-west of France, three experiments were performed. The first concerns 2D accuracy analysis using several sets of parameters, the second the number and distribution of GCPs, and the third georeferencing the multispectral image by applying the model calculated from the panchromatic image.
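    The linearized least-squares step described above can be illustrated with a short sketch. The design matrix below is synthetic (the actual ALSAT-2A collinearity model is not reproduced here); it only shows how a 33-parameter correction vector is recovered from an overdetermined linear system via an SVD-based least-squares solve, as the abstract describes.

```python
import numpy as np

# Synthetic stand-in for the linearized condition equations A * dx = b:
# 120 observation equations, 33 unknown orientation parameters.
rng = np.random.default_rng(0)
n_obs, n_par = 120, 33
A = rng.normal(size=(n_obs, n_par))          # hypothetical design matrix
x_true = rng.normal(size=n_par)              # "true" corrections, for the demo
b = A @ x_true + rng.normal(scale=1e-6, size=n_obs)  # noisy observations

# numpy's lstsq solves min ||A dx - b|| via SVD, the strategy named above
dx, _, rank, sing_vals = np.linalg.lstsq(A, b, rcond=None)
print(rank)                                   # full column rank
print(np.allclose(dx, x_true, atol=1e-4))
```

In a real adjustment the solve would be iterated, relinearizing around the corrected parameter values until convergence.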

  6. Generation of High-Resolution Geo-referenced Photo-Mosaics From Navigation Data

    Science.gov (United States)

    Delaunoy, O.; Elibol, A.; Garcia, R.; Escartin, J.; Fornari, D.; Humphris, S.

    2006-12-01

    Optical images of the ocean floor are a rich source of data for understanding biological and geological processes. However, due to the attenuation of light in sea water, the area covered by optical systems is very limited, and a large number of images are needed to cover an area of interest, as individually they do not provide a global view of the surveyed area. Therefore, generating a composite view (or photo-mosaic) from multiple overlapping images is usually the most practical and flexible solution to visually cover a wide area, allowing analysis of the site in one single representation of the ocean floor. In most camera surveys carried out nowadays, some sort of positioning information is available (e.g., USBL, DVL, INS, gyros). For a towed camera, an estimate of the tether length together with the mother ship's GPS reading can also serve as navigation data. In any case, a photo-mosaic can be built just by taking into account the position and orientation of the camera. On the other hand, most regions of interest to the scientific community are quite large (>1 km2), and since better resolution is always required, the final photo-mosaic can be very large (>1,000,000 × 1,000,000 pixels) and cannot be handled by commonly available software. For this reason, we have developed a software package able to load a navigation file and the sequence of acquired images to automatically build a geo-referenced mosaic. This navigated mosaic provides a global view of the site of interest at the maximum available resolution. The developed package includes a viewer, allowing the user to load, view and annotate these geo-referenced photo-mosaics on a personal computer. A software library has been developed to allow the viewer to manage such very big images; the size of the resulting mosaic is therefore now limited only by the size of the hard drive. Work is being carried out to apply image processing techniques to the navigated
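    Placing each frame from navigation data alone reduces to simple pinhole geometry. The sensor and focal-length values below are illustrative assumptions, not those of any survey camera; the sketch only shows how a nadir image footprint is positioned and oriented in world coordinates from the camera's position, altitude and heading.

```python
import math

def image_footprint(cam_x, cam_y, altitude, heading_deg,
                    sensor_w=0.036, sensor_h=0.024, focal=0.05):
    """Ground footprint corners of a nadir-looking frame, from navigation
    data only (position + heading). Pinhole geometry: ground size scales
    with altitude / focal length. Parameter values are illustrative."""
    gw = altitude * sensor_w / focal   # footprint width on the seafloor (m)
    gh = altitude * sensor_h / focal   # footprint height (m)
    th = math.radians(heading_deg)
    corners = []
    for dx, dy in [(-gw/2, -gh/2), (gw/2, -gh/2), (gw/2, gh/2), (-gw/2, gh/2)]:
        # rotate by heading, then translate to the camera position
        corners.append((cam_x + dx*math.cos(th) - dy*math.sin(th),
                        cam_y + dx*math.sin(th) + dy*math.cos(th)))
    return corners

# A frame taken 3 m above the seafloor covers roughly 2.16 m x 1.44 m
print(image_footprint(100.0, 200.0, 3.0, 0.0))
```

A mosaicking package then rasterizes each image into the world grid bounded by these corners, which is why position and orientation alone suffice to build a first navigated mosaic.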

  7. PLÉIADES PROJECT: ASSESSMENT OF GEOREFERENCING ACCURACY, IMAGE QUALITY, PANSHARPENING PERFORMANCE AND DSM/DTM QUALITY

    Directory of Open Access Journals (Sweden)

    H. Topan

    2016-06-01

    Full Text Available Pléiades 1A and 1B are twin optical satellites of the Optical and Radar Federated Earth Observation (ORFEO) program jointly run by France and Italy. They are the first European satellites with sub-meter resolution. Airbus DS (formerly Astrium Geo) runs a MyGIC (formerly Pléiades Users Group) program to validate Pléiades images worldwide for various application purposes. The authors conduct three projects: one within this program, the second supported by the BEU Scientific Research Project Program, and the third supported by TÜBİTAK. Georeferencing accuracy, image quality, pansharpening performance and Digital Surface Model/Digital Terrain Model (DSM/DTM) quality are investigated in these projects. For these purposes, triplet panchromatic (50 cm Ground Sampling Distance (GSD)) and VNIR (2 m GSD) Pléiades 1A images were investigated over the Zonguldak test site (Turkey), which is urbanised, mountainous and covered by dense forest. The georeferencing accuracy was estimated with a standard deviation in X and Y (SX, SY) in the range of 0.45 m by bias-corrected Rational Polynomial Coefficient (RPC) orientation, using ~170 Ground Control Points (GCPs). 3D standard deviations of ±0.44 m in X, ±0.51 m in Y, and ±1.82 m in Z have been reached in spite of the very narrow angle of convergence by bias-corrected RPC orientation. The image quality was also investigated with respect to effective resolution, Signal-to-Noise Ratio (SNR) and blur coefficient. The effective resolution was estimated with a factor slightly below 1.0, meaning that the image quality corresponds to the nominal resolution of 50 cm. The blur coefficients were between 0.39 and 0.46 for the triplet panchromatic images, indicating satisfying image quality. The SNR is in the range of other comparable spaceborne images, which may be caused by de-noising of the Pléiades images. The pansharpened images were generated by various methods, and are validated by most common

  8. Test-retest reliability and minimal detectable change of two simplified 3-point balance measures in patients with stroke.

    Science.gov (United States)

    Chen, Yi-Miau; Huang, Yi-Jing; Huang, Chien-Yu; Lin, Gong-Hong; Liaw, Lih-Jiun; Lee, Shih-Chieh; Hsieh, Ching-Lin

    2017-10-01

    The 3-point Berg Balance Scale (BBS-3P) and 3-point Postural Assessment Scale for Stroke Patients (PASS-3P) were simplified from the BBS and PASS to overcome their complex scoring systems. The BBS-3P and PASS-3P are more feasible in busy clinical practice and showed similarly sound validity and responsiveness to the original measures. However, the reliability of the BBS-3P and PASS-3P is unknown, limiting their utility and the interpretability of scores. We aimed to examine the test-retest reliability and minimal detectable change (MDC) of the BBS-3P and PASS-3P in patients with stroke. Cross-sectional study. The rehabilitation departments of a medical center and a community hospital. A total of 51 chronic stroke patients (64.7% male). Both balance measures were administered twice, 7 days apart. The test-retest reliability of the BBS-3P and PASS-3P was examined by intraclass correlation coefficients (ICC). The MDC and its percentage of the total score (MDC%) of each measure were calculated to examine random measurement error. The ICC values of the BBS-3P and PASS-3P were 0.99 and 0.97, respectively. The MDC% (MDC) of the BBS-3P and PASS-3P were 9.1% (5.1 points) and 8.4% (3.0 points), respectively, indicating that both measures had small and acceptable random measurement errors. Our results showed that both the BBS-3P and the PASS-3P had good test-retest reliability, with small and acceptable random measurement error. These two simplified 3-level balance measures can provide reliable results over time. Our findings support the repeated administration of the BBS-3P and PASS-3P to monitor the balance of patients with stroke. The MDC values can help clinicians and researchers interpret change scores more precisely.
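    The MDC reported above follows directly from the ICC and the sample standard deviation via the standard formulas SEM = SD·√(1−ICC) and MDC95 = 1.96·√2·SEM. A minimal sketch with illustrative numbers (not the study's raw data):

```python
import math

def mdc95(sd, icc, scale_max):
    """Minimal detectable change at the 95% confidence level.
    SEM = SD * sqrt(1 - ICC); MDC95 = 1.96 * sqrt(2) * SEM.
    Returns (MDC in points, MDC as % of the scale's total score)."""
    sem = sd * math.sqrt(1.0 - icc)
    mdc = 1.96 * math.sqrt(2.0) * sem
    return mdc, 100.0 * mdc / scale_max

# Illustrative inputs only -- SD, ICC and scale maximum are assumptions
mdc, mdc_pct = mdc95(sd=10.0, icc=0.97, scale_max=36)
print(round(mdc, 2), round(mdc_pct, 1))
```

A change score exceeding the MDC can then be read as real change rather than random measurement error, which is how the study's MDC values are meant to be used.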

  9. Test-retest reliability of myofascial trigger point detection in hip and thigh areas.

    Science.gov (United States)

    Rozenfeld, E; Finestone, A S; Moran, U; Damri, E; Kalichman, L

    2017-10-01

    Myofascial trigger points (MTrPs) are a primary source of pain in patients with musculoskeletal disorders. Nevertheless, they are frequently underdiagnosed. Reliable MTrP palpation is necessary for their diagnosis and treatment. The few studies that have examined intra-tester reliability of MTrP detection in the upper body provide preliminary evidence that MTrP palpation is reliable. Reliability tests for MTrP palpation on the lower limb have not yet been performed. To evaluate inter- and intra-tester reliability of MTrP recognition in hip and thigh muscles. Reliability study. 21 patients (15 males and 6 females, mean age 21.1 years) referred to the physical therapy clinic, 10 with knee or hip pain and 11 with pain in an upper limb, low back, shin or ankle. Two experienced physical therapists performed the examinations, blinded to the subjects' identity, medical condition and the results of the previous MTrP evaluation. Each subject was evaluated four times, twice by each examiner, in random order. Dichotomous findings included a palpable taut band, tenderness, referred pain, and relevance of referred pain to the patient's complaint. Based on these, a diagnosis of latent or active MTrPs was established. The evaluation was performed on both legs and included a total of 16 locations in the following muscles: rectus femoris (proximal), vastus medialis (middle and distal), vastus lateralis (middle and distal) and gluteus medius (anterior, posterior and distal). Inter- and intra-tester reliability (Cohen's kappa (κ)) values for single sites ranged from -0.25 to 0.77. Median intra-tester reliability was 0.45 and 0.46 for latent and active MTrPs, and median inter-tester reliability was 0.51 and 0.64 for latent and active MTrPs, respectively. The examination of the distal vastus medialis was the most reliable for latent and active MTrPs (intra-tester κ = 0.27-0.77, inter-tester κ = 0.77 and intra-tester κ = 0.53-0.72, inter-tester κ = 0.72, correspondingly
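    The agreement statistic used above is Cohen's kappa, which corrects the observed agreement between two raters for the agreement expected by chance. A minimal sketch with two hypothetical examiners' dichotomous findings (the site data below are invented for illustration):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' dichotomous findings
    (e.g. taut band present / absent at each palpation site):
    kappa = (p_observed - p_chance) / (1 - p_chance)."""
    assert len(r1) == len(r2)
    n = len(r1)
    cats = sorted(set(r1) | set(r2))
    po = sum(a == b for a, b in zip(r1, r2)) / n                   # observed
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance
    return (po - pe) / (1 - pe)

# Two examiners over 10 hypothetical sites (1 = MTrP present)
ex1 = [1, 1, 0, 1, 0, 0, 1, 0, 1, 0]
ex2 = [1, 0, 0, 1, 0, 0, 1, 1, 1, 0]
print(round(cohens_kappa(ex1, ex2), 2))
```

With 8/10 raw agreement but 0.5 chance agreement, kappa lands at 0.6, in the "moderate to substantial" band the study reports for its better sites.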

  10. Mobile TDR for geo-referenced measurement of soil water content and electrical conductivity

    DEFF Research Database (Denmark)

    Thomsen, Anton; Schelde, Kirsten; Drøscher, Per

    2007-01-01

    The development of site-specific crop management is constrained by the availability of sensors for monitoring important soil and crop related conditions. A mobile time-domain reflectometry (TDR) unit for geo-referenced soil measurements has been developed and used for detailed mapping of soil wat...... analysis of the soil water measurements, recommendations are made with respect to sampling strategies. Depending on the variability of a given area, between 15 and 30 ha can be mapped with respect to soil moisture and electrical conductivity with sufficient detail within 8 h...

  11. Reliable four-point flexion test and model for die-to-wafer direct bonding

    Energy Technology Data Exchange (ETDEWEB)

    Tabata, T., E-mail: toshiyuki.tabata@cea.fr; Sanchez, L.; Fournel, F.; Moriceau, H. [Univ. Grenoble Alpes, F-38000 Grenoble, France and CEA, LETI, MINATEC Campus, F-38054 Grenoble (France)

    2015-07-07

    For many years, wafer-to-wafer (W2W) direct bonding has been highly developed, particularly in terms of bonding energy measurement and comprehension of bonding mechanisms. Nowadays, die-to-wafer (D2W) direct bonding has gained significant attention, for instance in photonics and micro-electromechanics, which presupposes controlled and reliable fabrication processes. Whatever the bonded materials may be, it is not obvious whether bonded D2W structures have the same bonding strength as bonded W2W ones, because of possible edge effects of the dies. For that reason, there is a strong need for a bonding energy measurement technique suitable for D2W structures. In this paper, both D2W- and W2W-type standard SiO{sub 2}-to-SiO{sub 2} direct bonding samples are fabricated from the same full-wafer bonding. Modifications of the four-point flexion test (4PT) technique and its application to measuring D2W direct bonding energies are reported. A comparison between the modified 4PT and double-cantilever beam techniques is drawn, also considering possible impacts of the measurement conditions, such as water stress corrosion at the debonding interface and friction error at the loading contact points. Finally, the reliability of the modified technique and of a new model established for measuring D2W direct bonding energies is demonstrated.

  12. Low aerial imagery - an assessment of georeferencing errors and the potential for use in environmental inventory

    Science.gov (United States)

    Smaczyński, Maciej; Medyńska-Gulij, Beata

    2017-06-01

    Unmanned aerial vehicles are increasingly being used in close range photogrammetry. Real-time observation of the Earth's surface and the photogrammetric images obtained are used as material for surveying and environmental inventory. The following study was conducted on a small area (approximately 1 ha). In such cases, the classical method of topographic mapping is not accurate enough, while the geodetic method of topographic surveying, on the other hand, is an overly precise measurement technique for the purpose of inventorying natural environment components. The author of the following study has proposed using unmanned aerial vehicle technology and tying the obtained images into a control point network established with the aid of GNSS technology. Georeferencing the acquired images and using them to create a photogrammetric model of the studied area enabled the researcher to perform calculations, which yielded a total root mean square error below 9 cm. A comparison of the real lengths of the vectors connecting the control points with their lengths calculated from the photogrammetric model made it possible to fully confirm the calculated RMSE and prove the usefulness of UAV technology in observing terrain components for the purpose of environmental inventory. Such environmental components include, among others, elements of road infrastructure and green areas, but also changes in the location of moving pedestrians and vehicles, as well as other changes in the natural environment that are not registered on classical base maps or topographic maps.
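    The reported comparison of real and model-derived vector lengths reduces to a root mean square error over the control vectors. A minimal sketch with hypothetical lengths (the study's actual measurements are not reproduced here):

```python
import math

def rmse(measured, modeled):
    """Root mean square error between control-vector lengths measured
    in the field and those derived from the photogrammetric model."""
    assert len(measured) == len(modeled)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(measured, modeled))
                     / len(measured))

# Hypothetical vector lengths (metres): GNSS-surveyed vs model-derived
field = [12.42, 8.17, 25.60, 14.03, 9.88]
model = [12.49, 8.11, 25.52, 14.10, 9.81]
print(round(rmse(field, model), 3))   # sub-decimetre agreement
```

An RMSE at this scale, computed over many such vectors, is what supports the study's sub-9 cm accuracy claim.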

  13. Mapping with Small UAS: A Point Cloud Accuracy Assessment

    Science.gov (United States)

    Toth, Charles; Jozkow, Grzegorz; Grejner-Brzezinska, Dorota

    2015-12-01

    Interest in using inexpensive Unmanned Aerial System (UAS) technology for topographic mapping has increased significantly in recent years. Small UAS platforms equipped with consumer-grade cameras can easily acquire high-resolution aerial imagery, allowing for dense point cloud generation, followed by surface model creation and orthophoto production. In contrast to conventional airborne mapping systems, UAS has limited ground coverage due to low flying height and limited flying time, yet it offers an attractive alternative to high-performance airborne systems, as the cost of the sensors and platform, and the flight logistics, is relatively low. In addition, UAS is better suited to small-area data acquisitions and to acquiring data in difficult-to-access areas, such as urban canyons or densely built-up environments. The main question with respect to the use of UAS is whether the inexpensive consumer sensors installed on UAS platforms can provide geospatial data of quality comparable to that provided by conventional systems. This study aims at evaluating the performance of current practice in UAS-based topographic mapping by reviewing the practical aspects of sensor configuration, georeferencing and point cloud generation, including comparisons between sensor types and processing tools. The main objective is to provide accuracy characterization and practical information for selecting and using UAS solutions in general mapping applications. The analysis is based on statistical evaluation as well as visual examination of experimental data acquired by a Bergen octocopter with three different image sensor configurations: a GoPro HERO3+ Black Edition, a Nikon D800 DSLR and a Velodyne HDL-32. In addition, georeferencing data of varying quality were acquired and evaluated. The optical imagery was processed using three commercial point cloud generation tools.
Comparing point clouds created by active and passive sensors using different-quality sensors, and finally

  14. A reliable, fast and low cost maximum power point tracker for photovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    Enrique, J.M.; Andujar, J.M.; Bohorquez, M.A. [Departamento de Ingenieria Electronica, de Sistemas Informaticos y Automatica, Universidad de Huelva (Spain)

    2010-01-15

    This work presents a new maximum power point tracker system for photovoltaic applications. The developed system is an analog version of the "P and O-oriented" algorithm. It maintains that algorithm's main advantages: simplicity, reliability and easy practical implementation, and avoids its main disadvantages: inaccuracy and relatively slow response. Additionally, the developed system can be implemented in a practical way at low cost, which is an added value. The system also shows excellent behavior under very fast variations in incident radiation levels. (author)
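    The system described is analog, but the underlying perturb-and-observe (P&O) logic can be sketched digitally. The PV curve below is a toy concave model with a known maximum power point (MPP) at 17 V, used only to show the P&O loop climbing the curve and then oscillating around the MPP, which is the behavior the analog circuit reproduces continuously.

```python
def pv_power(v):
    """Toy PV power curve with its maximum power point at 17 V (illustrative,
    not a real module's I-V characteristic)."""
    return max(0.0, 100.0 - (v - 17.0) ** 2)

def perturb_and_observe(v=5.0, step=0.1, iters=300):
    """Minimal P&O loop: perturb the operating voltage, observe the power,
    keep the perturbation direction while power rises, reverse otherwise."""
    direction = 1.0
    p_prev = pv_power(v)
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:          # power dropped: we stepped past the MPP
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()
print(round(v_mpp, 1))  # settles within one step of the 17 V MPP
```

The steady-state oscillation of one step around the MPP is exactly the inaccuracy the abstract says the digital P&O suffers from, and which the analog version is designed to avoid.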

  15. Design and development of a geo-referenced database to radionuclides in food

    Science.gov (United States)

    Nascimento, L. M. E.; Ferreira, A. C. M.; Gonzalez, S. A.

    2018-03-01

    The primary purpose of the range of activities concerning information management for environmental assessment is to provide the scientific community with improved access to environmental data, as well as to support the decision-making loop in case of contamination events due to either accidental or intentional causes. In recent years, geotechnologies have become a key reference in environmental research and monitoring, since they deliver efficient data retrieval and subsequent processing on natural resources. This study aimed at the development of a georeferenced database (SIGLARA – Sistema Georeferenciado Latino Americano de Radionuclídeos em Alimentos), designed for the storage of data on radioactivity in food, available in three languages (Spanish, Portuguese and English) and employing free software.

  16. Design and development of a geo-referenced database to radionuclides in food

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Lucia Maria Evangelista do; Ferreira, Ana Cristina de Melo; Gonzalez, Sergio de Albuquerque, E-mail: anacris@ird.gov.br [Instituto de Radioproteção e Dosimetria (RD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    The primary purpose of the range of activities concerning information management for environmental assessment is to provide the scientific community with improved access to environmental data, as well as to support the decision-making loop in case of contamination events due to either accidental or intentional causes. In recent years, geotechnologies have become a key reference in environmental research and monitoring, since they deliver efficient data retrieval and subsequent processing on natural resources. This study aimed at the development of a georeferenced database (SIGLARA - Sistema Georeferenciado Latino Americano de Radionuclídeos em Alimentos), designed for the storage of data on radioactivity in food, available in three languages (Spanish, Portuguese and English) and employing free software. (author)

  17. Design and development of a geo-referenced database to radionuclides in food

    International Nuclear Information System (INIS)

    Nascimento, Lucia Maria Evangelista do; Ferreira, Ana Cristina de Melo; Gonzalez, Sergio de Albuquerque

    2017-01-01

    The primary purpose of the range of activities concerning information management for environmental assessment is to provide the scientific community with improved access to environmental data, as well as to support the decision-making loop in case of contamination events due to either accidental or intentional causes. In recent years, geotechnologies have become a key reference in environmental research and monitoring, since they deliver efficient data retrieval and subsequent processing on natural resources. This study aimed at the development of a georeferenced database (SIGLARA - Sistema Georeferenciado Latino Americano de Radionuclídeos em Alimentos), designed for the storage of data on radioactivity in food, available in three languages (Spanish, Portuguese and English) and employing free software. (author)

  18. Development of a georeferenced data bank of radionuclides in typical food of Latin America - SIGLARA

    International Nuclear Information System (INIS)

    Nascimento, Lucia Maria Evangelista do

    2014-01-01

    The management of information related to environmental assessment activity aims to provide the world community with better access to meaningful environmental information and to help use this information in making decisions in case of contamination due to accidents or deliberate actions. In recent years, geotechnologies have become fundamental to environmental research and monitoring, since they make it possible to efficiently obtain large amounts of data on natural resources. The result of this work was the development of a database system to store georeferenced values of radionuclides in typical foods of Latin America (SIGLARA), available in three languages (Spanish, Portuguese and English) and using free software. The developed system meets a primary need of the RLA 09/72 ARCAL Project, funded by the International Atomic Energy Agency (IAEA), which has eleven participating countries in Latin America. The georeferenced database created for the SIGLARA system was tested for its applicability through the entry and manipulation of real analyzed data, which showed that the system is able to store and retrieve data and to display reports and maps of the registered food samples. The interfaces that connect the user with the database proved efficient, making the system easy to operate. Its application to environmental management is already showing results; it is hoped that these results will encourage its widespread adoption by other countries, institutions, the scientific community and the general public. (author)

  19. Low aerial imagery – an assessment of georeferencing errors and the potential for use in environmental inventory

    Directory of Open Access Journals (Sweden)

    Smaczyński Maciej

    2017-06-01

    Full Text Available Unmanned aerial vehicles are increasingly being used in close range photogrammetry. Real-time observation of the Earth’s surface and the photogrammetric images obtained are used as material for surveying and environmental inventory. The following study was conducted on a small area (approximately 1 ha). In such cases, the classical method of topographic mapping is not accurate enough. The geodetic method of topographic surveying, on the other hand, is an overly precise measurement technique for the purpose of inventorying the natural environment components. The author of the following study has proposed using the unmanned aerial vehicle technology and tying in the obtained images to the control point network established with the aid of GNSS technology. Georeferencing the acquired images and using them to create a photogrammetric model of the studied area enabled the researcher to perform calculations, which yielded a total root mean square error below 9 cm. The performed comparison of the real lengths of the vectors connecting the control points and their lengths calculated on the basis of the photogrammetric model made it possible to fully confirm the RMSE calculated and prove the usefulness of the UAV technology in observing terrain components for the purpose of environmental inventory. Such environmental components include, among others, elements of road infrastructure, green areas, but also changes in the location of moving pedestrians and vehicles, as well as other changes in the natural environment that are not registered on classical base maps or topographic maps.

  20. Automatic 3D relief acquisition and georeferencing of road sides by low-cost on-motion SfM

    Science.gov (United States)

    Voumard, Jérémie; Bornemann, Perrick; Malet, Jean-Philippe; Derron, Marc-Henri; Jaboyedoff, Michel

    2017-04-01

    3D terrain relief acquisition is important for a large part of the geosciences. Several methods have been developed to digitize terrain, such as total stations, LiDAR, GNSS or photogrammetry. To digitize road (or rail track) sides over long sections, mobile spatial imaging systems or UAVs are commonly used. In this project, we compare a still fairly new method - the SfM on-motion technique - with traditional terrain digitizing techniques (terrestrial laser scanning, traditional SfM, UAS imaging solutions, GNSS surveying systems and total stations). The SfM on-motion technique generates 3D spatial data by photogrammetric processing of images taken from a moving vehicle. Our mobile system consists of six action cameras placed on a vehicle. Four fisheye cameras mounted on a mast on the vehicle roof are placed 3.2 meters above the ground. Three of them have a GNSS chip providing geotagged images. Two pictures were acquired every second by each camera. 4K-resolution fisheye videos were also used to extract 8.3 MP non-geotagged pictures. All these pictures are then processed with the Agisoft PhotoScan Professional software. Results from the SfM on-motion technique are compared with results from classical SfM photogrammetry on a 500-meter-long alpine track, and with mobile laser scanning data on the same road section. First results seem to indicate that slope structures are well observable down to decimetric accuracy. For the georeferencing, the planimetric (XY) accuracy of a few meters is much better than the altimetric (Z) accuracy: there is a Z-coordinate shift of a few tens of meters between the GoPro cameras and the Garmin camera, which makes it necessary to give greater freedom to the altimetric coordinates in the processing software. The benefits of this low-cost SfM on-motion method are: 1) a simple setup for field use (easy to switch between vehicle types such as car, train, bike, etc.), 2) low cost and 3) automatic georeferencing of 3D point clouds.
Main

  1. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large-scale structures. Structural loading variations over the lifetime of the plant are considered more difficult to analyse than those for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions which enter this problem are considered. The rare-event situation is briefly mentioned, together with aspects of proof testing and of normal and upset loading conditions. (orig.)

  2. Accuracy analysis of indirect georeferencing about TH-1 satellite in Weinan test area

    International Nuclear Information System (INIS)

    Yunlan, Yang; Haiyan, Hu

    2014-01-01

    Optical linear scanning sensors can be divided into single-lens and multi-lens sensors according to the number of lenses. To enable stereo imaging, single-lens optical systems such as the aerial mapping cameras ADS40 and ADS80 have two or more parallel linear arrays placed on the focal plane, whereas a multi-lens optical system, typically carried on a spacecraft, has a single linear CCD array placed at the center of the focal plane of each lens. This difference in design means that the systematic errors, the in-orbit calibration and the data adjustment approach differ completely between the two kinds of systems. In recent years, domestic space optical sensor systems in China, such as TH-1 and ZY-3, have focused on multi-lens linear CCD sensors. The parameters influencing the positioning accuracy of the satellite system, which are unknown, imprecisely known, or changed after launch, can be estimated by in-orbit self-calibration, so that after self-calibration the accuracy of a mapping satellite is often strongly improved. In contrast to direct georeferencing, indirect georeferencing is introduced in this paper as a research approach for the TH-1 satellite, taking the systematic errors fully into account. Parameters describing systematic geometric positioning errors are introduced into the basic collinearity equations for a multi-lens linear array CCD sensor, and based on this extended model, a method of self-calibration bundle adjustment for space multi-lens linear array CCD sensors is presented. The test field is in the Weinan area of Shaanxi province, and observation data for GCPs and the orbit were collected. The extended rigorous model was used in the bundle adjustment, and the accuracy analysis showed that TH-1 has satisfactory metric performance

  3. Pointing Verification Method for Spaceborne Lidars

    Directory of Open Access Journals (Sweden)

    Axel Amediek

    2017-01-01

    Full Text Available High precision acquisition of atmospheric parameters from the air or space by means of lidar requires accurate knowledge of laser pointing. Discrepancies between the assumed and actual pointing can introduce large errors due to the Doppler effect or a wrongly assumed air pressure at ground level. In this paper, a method for precisely quantifying these discrepancies for airborne and spaceborne lidar systems is presented. The method is based on the comparison of ground elevations derived from the lidar ranging data with high-resolution topography data obtained from a digital elevation model and allows for the derivation of the lateral and longitudinal deviation of the laser beam propagation direction. The applicability of the technique is demonstrated by using experimental data from an airborne lidar system, confirming that geo-referencing of the lidar ground spot trace with an uncertainty of less than 10 m with respect to the used digital elevation model (DEM) can be obtained.
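    The core of the method, comparing lidar-derived ground elevations against a DEM to recover a pointing offset, can be sketched as a one-dimensional profile alignment. The terrain below is synthetic, and a real implementation would search in two dimensions (lateral and longitudinal) over georeferenced elevation grids; this sketch only shows the least-squares matching principle.

```python
import math

def estimate_shift(dem, lidar, max_shift=20):
    """Find the along-track offset (in samples) that best aligns lidar
    ground elevations with a DEM profile, by least squares over candidate
    shifts. The lidar trace is nominally expected at dem index max_shift."""
    n = len(lidar)
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        seg = dem[max_shift + s : max_shift + s + n]
        err = sum((a - b) ** 2 for a, b in zip(seg, lidar))
        if err < best_err:
            best, best_err = s, err
    return best

# Synthetic rolling terrain with a trend; the "lidar" trace is the same
# terrain sampled 7 samples late, i.e. a 7-sample pointing offset.
dem = [10 * math.sin(0.3 * i) + 0.5 * i for i in range(200)]
true_shift = 7
lidar = dem[20 + true_shift : 20 + true_shift + 100]
print(estimate_shift(dem, lidar))
```

Multiplying the recovered sample offset by the ground spot spacing converts it into the metric pointing deviation the paper quantifies.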

  4. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds

    Directory of Open Access Journals (Sweden)

    Christopher Watson

    2012-05-01

    Full Text Available Unmanned Aerial Vehicles (UAVs) are an exciting new remote sensing tool capable of acquiring high resolution spatial data. Remote sensing with UAVs has the potential to provide imagery at an unprecedented spatial and temporal resolution. The small footprint of UAV imagery, however, makes it necessary to develop automated techniques to geometrically rectify and mosaic the imagery such that larger areas can be monitored. In this paper, we present a technique for geometric correction and mosaicking of UAV photography using feature matching and Structure from Motion (SfM) photogrammetric techniques. Images are processed to create three dimensional point clouds, initially in an arbitrary model space. The point clouds are transformed into a real-world coordinate system using either a direct georeferencing technique that uses estimated camera positions or via a Ground Control Point (GCP) technique that uses automatically identified GCPs within the point cloud. The point cloud is then used to generate a Digital Terrain Model (DTM) required for rectification of the images. Subsequent georeferenced images are then joined together to form a mosaic of the study area. The absolute spatial accuracy of the direct technique was found to be 65–120 cm whilst the GCP technique achieves an accuracy of approximately 10–15 cm.
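    The transformation from arbitrary model space into a real-world coordinate system, whether anchored by estimated camera positions or by GCPs, is typically a seven-parameter similarity (Helmert) transform. A sketch of the SVD-based (Umeyama) estimate of scale, rotation and translation; the point data below are synthetic, not from the paper:

```python
import numpy as np

def similarity_transform(src, dst):
    """Estimate scale s, rotation R, translation t mapping model-space
    points onto world coordinates (dst ~ s * R @ src + t), via the
    SVD-based Umeyama/Procrustes solution."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(0), dst.mean(0)
    A, B = src - mu_s, dst - mu_d               # centered coordinates
    U, S, Vt = np.linalg.svd(B.T @ A / len(src))  # cross-covariance SVD
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:               # guard against a reflection
        D[2, 2] = -1
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / (A ** 2).sum() * len(src)
    t = mu_d - s * R @ mu_s
    return s, R, t

# Check: recover a known transform from 10 synthetic, noise-free "GCPs"
rng = np.random.default_rng(1)
pts = rng.normal(size=(10, 3))                  # model-space points
angle = 0.4
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
world = 2.5 * pts @ Rz.T + np.array([100.0, 200.0, 50.0])
s, R, t = similarity_transform(pts, world)
print(round(s, 6), np.allclose(R, Rz), np.allclose(t, [100, 200, 50]))
```

With noisy GCPs the same estimator gives the least-squares transform, and the residuals at the GCPs provide the kind of accuracy figures (10–15 cm) the paper reports.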

  5. Cortical projection of the inferior choroidal point as a reliable landmark to place the corticectomy and reach the temporal horn through a middle temporal gyrus approach.

    Science.gov (United States)

    Frigeri, Thomas; Rhoton, Albert; Paglioli, Eliseu; Azambuja, Ney

    2014-10-01

    To establish preoperatively the localization of the cortical projection of the inferior choroidal point (ICP) and use it as a reliable landmark when approaching the temporal horn through a middle temporal gyrus access. To review relevant anatomical features regarding selective amygdalohippocampectomy (AH) for treatment of mesial temporal lobe epilepsy (MTLE). The cortical projection of the inferior choroidal point was used in more than 300 surgeries by one of the authors as a reliable landmark to reach the temporal horn. In the laboratory, forty cerebral hemispheres were examined. The cortical projection of the ICP is a reliable landmark for reaching the temporal horn.

  6. Efficient reliability analysis of structures with the rotational quasi-symmetric point- and the maximum entropy methods

    Science.gov (United States)

    Xu, Jun; Dang, Chao; Kong, Fan

    2017-10-01

    This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
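The Monte Carlo baseline the paper compares against can be illustrated on a toy performance function with a known failure probability; the RQ-SPM and maximum-entropy machinery itself is beyond a short snippet. The performance function g(X) = 3 - X and the sample size below are assumptions for illustration only.

```python
import math
import numpy as np

# Crude Monte Carlo estimate of the failure probability P[g(X) < 0]
# for g(X) = 3 - X with X ~ N(0, 1); exact answer is Phi(-3) ≈ 1.35e-3.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
pf_mcs = np.mean(3.0 - x < 0.0)                 # fraction of failed samples
pf_exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))  # Phi(-3)
```

The point of methods like RQ-SPM is to reach comparable accuracy with far fewer performance-function evaluations than such brute-force sampling.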

  7. Validity and Reliability of Clinical Examination in the Diagnosis of Myofascial Pain Syndrome and Myofascial Trigger Points in Upper Quarter Muscles.

    Science.gov (United States)

    Mayoral Del Moral, Orlando; Torres Lacomba, María; Russell, I Jon; Sánchez Méndez, Óscar; Sánchez Sánchez, Beatriz

    2017-12-15

    To determine whether two independent examiners can agree on a diagnosis of myofascial pain syndrome (MPS). To evaluate interexaminer reliability in identifying myofascial trigger points in upper quarter muscles. To evaluate the reliability of clinical diagnostic criteria for the diagnosis of MPS. To evaluate the validity of clinical diagnostic criteria for the diagnosis of MPS. Validity and reliability study. Provincial Hospital. Toledo, Spain. Twenty myofascial pain syndrome patients and 20 healthy, normal control subjects, enrolled by a trained and experienced examiner. Ten bilateral muscles from the upper quarter were evaluated by two experienced examiners. The second examiner was blinded to the diagnosis group. The MPS diagnosis required at least one muscle to have an active myofascial trigger point. Three to four days separated the two examinations. The primary outcome measure was the frequency with which the two examiners agreed on the classification of the subjects as patients or as healthy controls. The kappa statistic (K) was used to determine the level of agreement between both examinations, interpreted as very good (0.81-1.00), good (0.61-0.80), moderate (0.41-0.60), fair (0.21-0.40), or poor (≤0.20). Interexaminer reliability for identifying subjects with MPS was very good (K = 1.0). Interexaminer reliability for identifying muscles leading to a diagnosis of MPS was also very good (K = 0.81). Sensitivity and specificity showed high values for most examination tests in all muscles, which confirms the validity of clinical diagnostic criteria in the diagnosis of MPS. Interrater reliability between two expert examiners identifying subjects with MPS involving upper quarter muscles exhibited substantial agreement. These results suggest that clinical criteria can be valid and reliable in the diagnosis of this condition. © 2017 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
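The agreement statistic reported above is Cohen's kappa; a minimal sketch of its computation and of the verbal scale quoted in the abstract (the 2x2 tables below are illustrative, not the study's data):

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from a square examiner-agreement contingency table."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    po = np.trace(table) / n                          # observed agreement
    pe = (table.sum(0) * table.sum(1)).sum() / n**2   # chance agreement
    return (po - pe) / (1.0 - pe)

def verbal(k):
    """Verbal scale used in the abstract (very good > 0.80, ..., poor <= 0.20)."""
    for cut, label in [(0.80, "very good"), (0.60, "good"),
                       (0.40, "moderate"), (0.20, "fair")]:
        if k > cut:
            return label
    return "poor"

# perfect agreement on 20 patients / 20 controls -> kappa = 1.0
k = cohens_kappa([[20, 0], [0, 20]])
```

A table like `[[15, 5], [5, 15]]` gives kappa = 0.5, i.e. "moderate" agreement on this scale.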

  8. Human reliability guidance - How to increase the synergies between human reliability, human factors, and system design and engineering. Phase 2: The American Point of View - Insights of how the US nuclear industry works with human reliability analysis

    International Nuclear Information System (INIS)

    Oxstrand, J.

    2010-12-01

    The main goal of this Nordic Nuclear Safety Research Council (NKS) project is to produce guidance for how to use human reliability analysis (HRA) to strengthen overall safety. The project consists of two substudies: The Nordic Point of View - A User Needs Analysis, and The American Point of View - Insights of How the US Nuclear Industry Works with HRA. The purpose of the Nordic Point of View study was a user needs analysis that aimed to survey current HRA practices in the Nordic nuclear industry, with the main focus being to connect HRA to system design. In this study, 26 Nordic (Swedish and Finnish) nuclear power plant specialists with research, practitioner, and regulatory expertise in HRA, PRA, HSI, and human performance were interviewed. This study was completed in 2009. This study concludes that HRA is an important tool when dealing with human factors in control room design or modernizations. The Nordic Point of View study showed areas where the use of HRA in the Nordic nuclear industry could be improved. To gain more knowledge about how these improvements could be made, and what improvements to focus on, the second study was conducted. The second study is focused on the American nuclear industry, which has many more years of experience with risk assessment and human reliability than the Nordic nuclear industry. Interviews were conducted to collect information to help the author understand the similarities and differences between the American and the Nordic nuclear industries, and to find data regarding the findings from the first study. The main focus of this report is to identify potential HRA improvements based on the data collected in the American Point of View survey. (Author)

  9. Human reliability guidance - How to increase the synergies between human reliability, human factors, and system design and engineering. Phase 2: The American Point of View - Insights of how the US nuclear industry works with human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, J. (Vattenfall Ringhals AB, Stockholm (Sweden))

    2010-12-15

    The main goal of this Nordic Nuclear Safety Research Council (NKS) project is to produce guidance for how to use human reliability analysis (HRA) to strengthen overall safety. The project consists of two substudies: The Nordic Point of View - A User Needs Analysis, and The American Point of View - Insights of How the US Nuclear Industry Works with HRA. The purpose of the Nordic Point of View study was a user needs analysis that aimed to survey current HRA practices in the Nordic nuclear industry, with the main focus being to connect HRA to system design. In this study, 26 Nordic (Swedish and Finnish) nuclear power plant specialists with research, practitioner, and regulatory expertise in HRA, PRA, HSI, and human performance were interviewed. This study was completed in 2009. This study concludes that HRA is an important tool when dealing with human factors in control room design or modernizations. The Nordic Point of View study showed areas where the use of HRA in the Nordic nuclear industry could be improved. To gain more knowledge about how these improvements could be made, and what improvements to focus on, the second study was conducted. The second study is focused on the American nuclear industry, which has many more years of experience with risk assessment and human reliability than the Nordic nuclear industry. Interviews were conducted to collect information to help the author understand the similarities and differences between the American and the Nordic nuclear industries, and to find data regarding the findings from the first study. The main focus of this report is to identify potential HRA improvements based on the data collected in the American Point of View survey. (Author)

  10. The quadrant method measuring four points is as reliable and accurate as the quadrant method in the evaluation after anatomical double-bundle ACL reconstruction.

    Science.gov (United States)

    Mochizuki, Yuta; Kaneko, Takao; Kawahara, Keisuke; Toyoda, Shinya; Kono, Norihiko; Hada, Masaru; Ikegami, Hiroyasu; Musha, Yoshiro

    2017-11-20

    The quadrant method was described by Bernard et al. and it has been widely used for postoperative evaluation of anterior cruciate ligament (ACL) reconstruction. The purpose of this research is to further develop the quadrant method measuring four points, which we named four-point quadrant method, and to compare with the quadrant method. Three-dimensional computed tomography (3D-CT) analyses were performed in 25 patients who underwent double-bundle ACL reconstruction using the outside-in technique. The four points in this study's quadrant method were defined as point1-highest, point2-deepest, point3-lowest, and point4-shallowest, in femoral tunnel position. Value of depth and height in each point was measured. Antero-medial (AM) tunnel is (depth1, height2) and postero-lateral (PL) tunnel is (depth3, height4) in this four-point quadrant method. The 3D-CT images were evaluated independently by 2 orthopaedic surgeons. A second measurement was performed by both observers after a 4-week interval. Intra- and inter-observer reliability was calculated by means of intra-class correlation coefficient (ICC). Also, the accuracy of the method was evaluated against the quadrant method. Intra-observer reliability was almost perfect for both AM and PL tunnel (ICC > 0.81). Inter-observer reliability of AM tunnel was substantial (ICC > 0.61) and that of PL tunnel was almost perfect (ICC > 0.81). The AM tunnel position was 0.13% deep, 0.58% high and PL tunnel position was 0.01% shallow, 0.13% low compared to quadrant method. The four-point quadrant method was found to have high intra- and inter-observer reliability and accuracy. This method can evaluate the tunnel position regardless of the shape and morphology of the bone tunnel aperture for use of comparison and can provide measurement that can be compared with various reconstruction methods. The four-point quadrant method of this study is considered to have clinical relevance in that it is a detailed and accurate tool for

  11. Cortical projection of the inferior choroidal point as a reliable landmark to place the corticectomy and reach the temporal horn through a middle temporal gyrus approach

    Directory of Open Access Journals (Sweden)

    Thomas Frigeri

    2014-10-01

    Full Text Available Objective To establish preoperatively the localization of the cortical projection of the inferior choroidal point (ICP) and use it as a reliable landmark when approaching the temporal horn through a middle temporal gyrus access. To review relevant anatomical features regarding selective amygdalohippocampectomy (AH) for treatment of mesial temporal lobe epilepsy (MTLE). Method The cortical projection of the inferior choroidal point was used in more than 300 surgeries by one of the authors as a reliable landmark to reach the temporal horn. In the laboratory, forty cerebral hemispheres were examined. Conclusion The cortical projection of the ICP is a reliable landmark for reaching the temporal horn.

  12. The reliability of the McCabe score as a marker of co-morbidity in healthcare-associated infection point prevalence studies.

    Science.gov (United States)

    Reilly, J S; Coignard, B; Price, L; Godwin, J; Cairns, S; Hopkins, S; Lyytikäinen, O; Hansen, S; Malcolm, W; Hughes, G J

    2016-05-01

    This study aimed to ascertain the reliability of the McCabe score in a healthcare-associated infection point prevalence survey. A survey of 20 hospitals across 10 European Union Member States (n = 1912) indicated that there was a moderate level of agreement (κ = 0.57) with the score. The reliability of the application of the score could be increased by training data collectors, particularly with reference to the ultimately fatal criteria. This is important if the score is to be used to risk adjust data to drive infection prevention and control interventions.

  13. GEOREFERENCING IN GNSS-CHALLENGED ENVIRONMENT: INTEGRATING UWB AND IMU TECHNOLOGIES

    Directory of Open Access Journals (Sweden)

    C. K. Toth

    2017-05-01

    Full Text Available Acquiring geospatial data in GNSS compromised environments remains a problem in mapping and positioning in general. Urban canyons, heavily vegetated areas, and indoor environments represent different levels of GNSS signal availability, from weak to no signal reception. Even outdoors, with multiple GNSS systems and an ever-increasing number of satellites, there are many situations with limited or no access to GNSS signals. Independent navigation sensors, such as an IMU, can provide high-data-rate information, but their initial accuracy degrades quickly, as the measurement data drift over time unless positioning fixes are provided from another source. At The Ohio State University’s Satellite Positioning and Inertial Navigation (SPIN) Laboratory, as one feasible solution, Ultra-Wideband (UWB) radio units are used to aid positioning and navigating in GNSS compromised environments, including indoor and outdoor scenarios. Here we report on experiences obtained with georeferencing a pushcart-based sensor system under canopied areas. The positioning system is based on UWB and IMU sensor integration, and provides sensor platform orientation for an electromagnetic interference (EMI) sensor. Performance evaluation results are provided for various test scenarios, confirming acceptable results for applications where high accuracy is not required.

  14. As reliable as the sun

    Science.gov (United States)

    Leijtens, J. A. P.

    2017-11-01

    Fortunately there is almost nothing as reliable as the sun, which can consequently be utilized as a very reliable source of spacecraft power. In order to harvest this power, the solar panels have to be pointed towards the sun as accurately and reliably as possible. To this end, sun sensors are available on almost every satellite to support vital sun-pointing capability throughout the mission, even in the deployment and safe-mode phases of the satellite's life. Given the criticality of the application one would expect that after more than 50 years of sun sensor utilisation, such sensors would be fully matured and optimised. In actual fact though, the majority of sun sensors employed are still coarse sun sensors, which have a proven extreme reliability but present major issues regarding albedo sensitivity and pointing accuracy.

  15. The Feasibility of 3d Point Cloud Generation from Smartphones

    Science.gov (United States)

    Alsubaie, N.; El-Sheimy, N.

    2016-06-01

    This paper proposes a new technique for increasing the accuracy of direct geo-referenced image-based 3D point cloud generated from low-cost sensors in smartphones. The smartphone's motion sensors are used to directly acquire the Exterior Orientation Parameters (EOPs) of the captured images. These EOPs, along with the Interior Orientation Parameters (IOPs) of the camera/phone, are used to reconstruct the image-based 3D point cloud. However, because smartphone motion sensors suffer from poor GPS accuracy, accumulated drift and high signal noise, inaccurate 3D mapping solutions often result. Therefore, horizontal and vertical linear features, visible in each image, are extracted and used as constraints in the bundle adjustment procedure. These constraints correct the relative position and orientation of the 3D mapping solution. Once the enhanced EOPs are estimated, the semi-global matching algorithm (SGM) is used to generate the image-based dense 3D point cloud. Statistical analysis and assessment are implemented herein, in order to demonstrate the feasibility of 3D point cloud generation from the consumer-grade sensors in smartphones.

  16. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate the conservatism in the estimations of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom, otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metrics.

  17. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first Section, definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, examines the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  18. Determination of Exterior Orientation Parameters Through Direct Geo-Referencing in a Real-Time Aerial Monitoring System

    Science.gov (United States)

    Kim, H.; Lee, J.; Choi, K.; Lee, I.

    2012-07-01

    Rapid responses for emergency situations such as natural disasters or accidents often require geo-spatial information describing the on-going status of the affected area. Such geo-spatial information can be promptly acquired by a manned or unmanned aerial vehicle based multi-sensor system that can monitor the emergent situations in near real-time from the air using several kinds of sensors. Thus, we are in progress of developing such a real-time aerial monitoring system (RAMS) consisting of both aerial and ground segments. The aerial segment acquires the sensory data about the target areas by a low-altitude helicopter system equipped with sensors such as a digital camera and a GPS/IMU system and transmits them to the ground segment through a RF link in real-time. The ground segment, which is a deployable ground station installed on a truck, receives the sensory data and rapidly processes them to generate ortho-images, DEMs, etc. In order to generate geo-spatial information, in this system, exterior orientation parameters (EOP) of the acquired images are obtained through direct geo-referencing because it is difficult to acquire coordinates of ground points in a disaster area. The main process, from the data acquisition stage to the measurement of EOP, is discussed as follows. First, at the time of data acquisition, image acquisition time synchronized by GPS time is recorded as part of the image file name. Second, the acquired data are then transmitted to the ground segment in real-time. Third, by the processing software of the ground segment, positions/attitudes of acquired images are calculated through a linear interpolation using the GPS time of the received position/attitude data and images. Finally, the EOPs of images are obtained from position/attitude data by deriving the relationships between a camera coordinate system and a GPS/IMU coordinate system. In this study, we evaluated the accuracy of the EOP determined by direct geo-referencing in our system.
To perform this
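The interpolation step in the process above (matching image timestamps against GPS/IMU records) can be sketched as follows; the navigation records and timestamps are invented, and real code would interpolate attitude angles with proper unwrapping or quaternions rather than component-wise:

```python
import numpy as np

def interpolate_eop(t_img, t_nav, nav):
    """Linearly interpolate navigation records to an image timestamp.

    nav: rows of (X, Y, Z, roll, pitch, yaw) sampled at GPS times t_nav.
    """
    return np.array([np.interp(t_img, t_nav, nav[:, k])
                     for k in range(nav.shape[1])])

# toy GPS/IMU stream at 1 Hz
t_nav = np.array([0.0, 1.0, 2.0])
nav = np.array([[ 0.0,  0.0, 100.0, 0.0, 0.0, 10.0],
                [10.0,  5.0, 110.0, 1.0, 0.5, 20.0],
                [20.0, 10.0, 120.0, 2.0, 1.0, 30.0]])
eop = interpolate_eop(1.5, t_nav, nav)   # image captured at t = 1.5 s
```

The interpolated position/attitude is then rotated from the GPS/IMU frame into the camera frame to give the final EOPs.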

  19. DETERMINATION OF EXTERIOR ORIENTATION PARAMETERS THROUGH DIRECT GEO-REFERENCING IN A REAL-TIME AERIAL MONITORING SYSTEM

    Directory of Open Access Journals (Sweden)

    H. Kim

    2012-07-01

    Full Text Available Rapid responses for emergency situations such as natural disasters or accidents often require geo-spatial information describing the on-going status of the affected area. Such geo-spatial information can be promptly acquired by a manned or unmanned aerial vehicle based multi-sensor system that can monitor the emergent situations in near real-time from the air using several kinds of sensors. Thus, we are in progress of developing such a real-time aerial monitoring system (RAMS) consisting of both aerial and ground segments. The aerial segment acquires the sensory data about the target areas by a low-altitude helicopter system equipped with sensors such as a digital camera and a GPS/IMU system and transmits them to the ground segment through a RF link in real-time. The ground segment, which is a deployable ground station installed on a truck, receives the sensory data and rapidly processes them to generate ortho-images, DEMs, etc. In order to generate geo-spatial information, in this system, exterior orientation parameters (EOP) of the acquired images are obtained through direct geo-referencing because it is difficult to acquire coordinates of ground points in a disaster area. The main process, from the data acquisition stage to the measurement of EOP, is discussed as follows. First, at the time of data acquisition, image acquisition time synchronized by GPS time is recorded as part of the image file name. Second, the acquired data are then transmitted to the ground segment in real-time. Third, by the processing software of the ground segment, positions/attitudes of acquired images are calculated through a linear interpolation using the GPS time of the received position/attitude data and images. Finally, the EOPs of images are obtained from position/attitude data by deriving the relationships between a camera coordinate system and a GPS/IMU coordinate system. 
In this study, we evaluated the accuracy of the EOP determined by direct geo-referencing in our system.

  20. Sensor Fusion of a Mobile Device to Control and Acquire Videos or Images of Coffee Branches and for Georeferencing Trees

    Directory of Open Access Journals (Sweden)

    Paula Jimena Ramos Giraldo

    2017-04-01

    Full Text Available Smartphones show potential for controlling and monitoring variables in agriculture. Their processing capacity, instrumentation, connectivity, low cost, and accessibility allow farmers (among other users in rural areas) to operate them easily with applications adjusted to their specific needs. In this investigation, the integration of inertial sensors, a GPS, and a camera is presented for the monitoring of a coffee crop. An Android-based application was developed with two operating modes: (i) Navigation: for georeferencing trees, which can be as close as 0.5 m from each other; and (ii) Acquisition: control of video acquisition, based on the movement of the mobile device over a branch, and measurement of image quality, using clarity indexes to select the most appropriate frames for application in future processes. The integration of inertial sensors in navigation mode shows a mean relative error of ±0.15 m, and a total error of ±5.15 m. In acquisition mode, the system correctly identifies the beginning and end of mobile phone movement in 99% of cases, and image quality is determined by means of a sharpness factor which measures blurriness. With the developed system, it will be possible to obtain georeferenced information about coffee trees, such as their production, nutritional state, and presence of pests or diseases.
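The abstract does not specify its clarity index; a common choice for a blurriness-sensitive sharpness factor is the variance of a discrete Laplacian, sketched here on synthetic images (a hypothetical stand-in, not the authors' exact measure):

```python
import numpy as np

def laplacian_variance(img):
    """Sharpness factor: variance of a 5-point discrete Laplacian.

    High-frequency (sharp) content gives a large value; blur suppresses it.
    """
    img = np.asarray(img, dtype=float)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

rng = np.random.default_rng(1)
sharp = rng.random((64, 64))                        # high-frequency content
blurred = (sharp[:-1, :-1] + sharp[1:, :-1]
           + sharp[:-1, 1:] + sharp[1:, 1:]) / 4.0  # crude 2x2 box blur
```

Frames whose factor exceeds a chosen threshold would be kept as the "most appropriate" ones for later processing.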

  1. Sensor Fusion of a Mobile Device to Control and Acquire Videos or Images of Coffee Branches and for Georeferencing Trees.

    Science.gov (United States)

    Giraldo, Paula Jimena Ramos; Aguirre, Álvaro Guerrero; Muñoz, Carlos Mario; Prieto, Flavio Augusto; Oliveros, Carlos Eugenio

    2017-04-06

    Smartphones show potential for controlling and monitoring variables in agriculture. Their processing capacity, instrumentation, connectivity, low cost, and accessibility allow farmers (among other users in rural areas) to operate them easily with applications adjusted to their specific needs. In this investigation, the integration of inertial sensors, a GPS, and a camera is presented for the monitoring of a coffee crop. An Android-based application was developed with two operating modes: (i) Navigation: for georeferencing trees, which can be as close as 0.5 m from each other; and (ii) Acquisition: control of video acquisition, based on the movement of the mobile device over a branch, and measurement of image quality, using clarity indexes to select the most appropriate frames for application in future processes. The integration of inertial sensors in navigation mode shows a mean relative error of ±0.15 m, and a total error of ±5.15 m. In acquisition mode, the system correctly identifies the beginning and end of mobile phone movement in 99% of cases, and image quality is determined by means of a sharpness factor which measures blurriness. With the developed system, it will be possible to obtain georeferenced information about coffee trees, such as their production, nutritional state, and presence of pests or diseases.

  2. Reliability analyses to detect weak points in secondary-side residual heat removal systems of KWU PWR plants

    International Nuclear Information System (INIS)

    Schilling, R.

    1983-01-01

    Requirements made by Federal German licensing authorities called for the analysis of the secondary-side residual heat removal systems of new PWR plants with regard to availability, possible weak points and the balanced nature of the overall system for different incident sequences. Following a description of the generic concept and the process and safety-related systems for steam generator feed and main steam discharge, the reliability of the latter is analyzed for the small break LOCA and emergency power mode incidents, weak points in the process systems identified, remedial measures of a system-specific and test-strategic nature presented and their contribution to improving system availability quantified. A comparison with the results of the German Risk Study on Nuclear Power Plants (GRS) shows a distinct reduction in core meltdown frequency. (orig.)

  3. Geoinformation web-system for processing and visualization of large archives of geo-referenced data

    Science.gov (United States)

    Gordov, E. P.; Okladnikov, I. G.; Titov, A. G.; Shulgina, T. M.

    2010-12-01

    A working model of an information-computational system aimed at scientific research in the area of climate change is presented. The system will allow processing and analysis of large archives of geophysical data obtained both from observations and modeling. Accumulated experience of developing information-computational web-systems providing computational processing and visualization of large archives of geo-referenced data was used during the implementation (Gordov et al, 2007; Okladnikov et al, 2008; Titov et al, 2009). Functional capabilities of the system comprise a set of procedures for mathematical and statistical analysis, processing and visualization of data. At present five archives of data are available for processing: 1st and 2nd editions of NCEP/NCAR Reanalysis, ECMWF ERA-40 Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, and NOAA-CIRES XX Century Global Reanalysis Version I. To provide data processing functionality, a computational modular kernel and a class library providing data access for computational modules were developed. Currently a set of computational modules for climate change indices approved by WMO is available. Also a special module providing visualization of results and writing to Encapsulated Postscript, GeoTIFF and ESRI shape files was developed. As a technological basis for representation of cartographical information on the Internet, the GeoServer software conforming to OpenGIS standards is used. GIS functionality is integrated with web-portal software to provide a basis for the web-portal's development as a part of the geoinformation web-system. Such a geoinformation web-system is the next step in the development of applied information-telecommunication systems, offering specialists from various scientific fields unique opportunities to perform reliable analysis of heterogeneous geophysical data using approved computational algorithms. It will allow a wide range of researchers to work with geophysical data without specific programming

  4. Georeferenced cartography dataset of the La Fossa crater fumarolic field at Vulcano Island (Aeolian Archipelago, Italy): conversion and comparison of data from local to global positioning methods

    Directory of Open Access Journals (Sweden)

    Carmelo Sammarco

    2011-07-01

    Full Text Available The present study illustrates the procedures applied for the coordinate system conversion of the historical fumarole positions at La Fossa crater, to allow their comparison with newly acquired global positioning system (GPS) data. Due to the absence of ground control points in the field and on both the old Gauss Boaga and the new UTM WGS 1984 maps, we had to model the transformation errors between the two systems using differential GPS techniques. Once corrected, the maps show a residual Easting shift, due to erroneous georeferencing of the original base maps; this is corrected by morphological comparative methods. The good correspondence between the corrected positions of the historical data and the results of the new GPS survey that was carried out in 2009 highlights the good quality of the old surveys, although they were carried out without the use of accurate topographical instruments.

  5. DTM GENERATION WITH UAV BASED PHOTOGRAMMETRIC POINT CLOUD

    Directory of Open Access Journals (Sweden)

    N. Polat

    2017-11-01

    Full Text Available Nowadays Unmanned Aerial Vehicles (UAVs) are widely used in many applications for different purposes. Their benefits, however, are not fully realized without the integration of other equipment such as a digital camera, GPS, or laser scanner. The main scope of this paper is evaluating the performance of a camera-equipped UAV for geomatic applications by way of Digital Terrain Model (DTM) generation in a small area. For this purpose, 7 ground control points were surveyed with RTK and 420 photographs were captured. Over 30 million georeferenced points were used in the DTM generation process. Accuracy of the DTM was evaluated with 5 check points. The root mean square error is calculated as 17.1 cm for an altitude of 100 m. Besides, a LiDAR-derived DTM is used as reference in order to calculate correlation. The UAV-based DTM has a 94.5 % correlation with the reference DTM. Outcomes of the study show that it is possible to use UAV photogrammetry data for map production, surveying, and some other engineering applications with the advantages of low cost, time conservation, and minimum field work.
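The check-point accuracy figure in this record is a root mean square error over independent check points; a minimal sketch with invented elevations (not the paper's measurements):

```python
import numpy as np

def rmse(predicted, observed):
    """Root mean square error between DTM elevations and check points."""
    d = np.asarray(predicted, dtype=float) - np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean(d ** 2)))

# illustrative values: DTM elevations vs RTK-surveyed check points (m)
dtm_z  = [101.12, 98.47, 102.90, 99.33, 100.75]
gnss_z = [101.00, 98.60, 102.75, 99.50, 100.60]
err = rmse(dtm_z, gnss_z)   # a few centimetres to decimetres is typical
```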

  6. DTM Generation with Uav Based Photogrammetric Point Cloud

    Science.gov (United States)

    Polat, N.; Uysal, M.

    2017-11-01

    Nowadays Unmanned Aerial Vehicles (UAVs) are widely used in many applications for different purposes. Their benefits, however, are not fully realized without the integration of other equipment, such as a digital camera, GPS, or laser scanner. The main scope of this paper is evaluating the performance of a camera-integrated UAV for geomatic applications by way of Digital Terrain Model (DTM) generation in a small area. For this purpose, 7 ground control points were surveyed with RTK and 420 photographs were captured. Over 30 million georeferenced points were used in the DTM generation process. Accuracy of the DTM was evaluated with 5 check points. The root mean square error is calculated as 17.1 cm for an altitude of 100 m. Besides, a LiDAR-derived DTM is used as reference in order to calculate correlation. The UAV-based DTM has a 94.5% correlation with the reference DTM. Outcomes of the study show that it is possible to use UAV photogrammetry data for map production, surveying, and other engineering applications, with the advantages of low cost, time savings, and minimal field work.

  7. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  8. Self-Similar Spin Images for Point Cloud Matching

    Science.gov (United States)

    Pulido, Daniel

    based on the concept of self-similarity to aid in the scale and feature matching steps. An open problem in fusion is how best to extract features from two point clouds and then perform feature-based matching. The proposed approach for this matching step is the use of local self-similarity as an invariant measure to match features. In particular, the proposed approach is to combine the concept of local self-similarity with a well-known feature descriptor, Spin Images, and thereby define "Self-Similar Spin Images". This approach is then extended to the case of matching two point clouds in very different coordinate systems (e.g., a geo-referenced Lidar point cloud and a stereo-image derived point cloud without geo-referencing). The use of Self-Similar Spin Images is again applied to address this problem by introducing a "Self-Similar Keyscale" that matches the spatial scales of two point clouds. Another open problem is how best to detect changes in content between two point clouds. A method is proposed to find changes between two point clouds by analyzing the order statistics of the nearest neighbors between the two clouds, and thereby define the "Nearest Neighbor Order Statistic" method. Note that the well-known Hausdorff distance is a special case, being just the maximum order statistic. Therefore, by studying the entire histogram of these nearest neighbors, this is expected to yield a more robust method to detect points that are present in one cloud but not the other. This approach is applied at multiple resolutions. Therefore, changes detected at the coarsest level will yield large missing targets and at finer levels will yield smaller targets.
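    The "Nearest Neighbor Order Statistic" idea can be illustrated with a small sketch: for each point of one cloud, compute the distance to its nearest neighbor in the other cloud, and inspect the sorted distances (the maximum being the one-sided Hausdorff distance). The clouds below are synthetic, not from the dissertation:

    ```python
    import math
    import random

    random.seed(0)

    def nearest_neighbor_distances(cloud_a, cloud_b):
        """For every point in cloud_a, distance to its nearest neighbor in cloud_b,
        returned sorted ascending -- i.e., the order statistics."""
        return sorted(min(math.dist(p, q) for q in cloud_b) for p in cloud_a)

    # Two illustrative clouds: B is A plus a tiny offset, plus one far-away "change" point.
    cloud_a = [(random.random(), random.random(), random.random()) for _ in range(200)]
    cloud_b = [(x + 0.001, y, z) for x, y, z in cloud_a] + [(10.0, 10.0, 10.0)]

    order_stats = nearest_neighbor_distances(cloud_b, cloud_a)
    hausdorff_ab = order_stats[-1]        # one-sided Hausdorff = maximum order statistic
    median_nn = order_stats[len(order_stats) // 2]
    print(f"median NN distance: {median_nn:.4f}, max (one-sided Hausdorff): {hausdorff_ab:.2f}")
    ```

    The isolated "change" point dominates only the top order statistic, while the bulk of the histogram stays near zero, which is the robustness argument made in the abstract.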

  9. PET-MR image fusion in soft tissue sarcoma: accuracy, reliability and practicality of interactive point-based and automated mutual information techniques

    International Nuclear Information System (INIS)

    Somer, Edward J.R.; Marsden, Paul K.; Benatar, Nigel A.; O'Doherty, Michael J.; Goodey, Joanne; Smith, Michael A.

    2003-01-01

    The fusion of functional positron emission tomography (PET) data with anatomical magnetic resonance (MR) or computed tomography images, using a variety of interactive and automated techniques, is becoming commonplace, with the technique of choice dependent on the specific application. The case of PET-MR image fusion in soft tissue is complicated by a lack of conspicuous anatomical features and deviation from the rigid-body model. Here we compare a point-based external marker technique with an automated mutual information algorithm and discuss the practicality, reliability and accuracy of each when applied to the study of soft tissue sarcoma. Ten subjects with suspected sarcoma in the knee, thigh, groin, flank or back underwent MR and PET scanning after the attachment of nine external fiducial markers. In the assessment of the point-based technique, three error measures were considered: fiducial localisation error (FLE), fiducial registration error (FRE) and target registration error (TRE). FLE, which represents the accuracy with which the fiducial points can be located, is related to the FRE minimised by the registration algorithm. The registration accuracy is best characterised by the TRE, which is the distance between corresponding points in each image space after registration. In the absence of salient features within the target volume, the TRE can be measured at fiducials excluded from the registration process. To assess the mutual information technique, PET data, acquired after physically removing the markers, were reconstructed in a variety of ways and registered with MR. Having applied the transform suggested by the algorithm to the PET scan acquired before the markers were removed, the residual distance between PET and MR marker-pairs could be measured. The manual point-based technique yielded the best results (RMS TRE =8.3 mm, max =22.4 mm, min =1.7 mm), performing better than the automated algorithm (RMS TRE =20.0 mm, max =30.5 mm, min =7.7 mm) when
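    The fiducial-based registration errors discussed above (FRE at the fitted markers, TRE at a held-out target) can be sketched with a least-squares rigid fit. This is a generic Kabsch/SVD implementation on synthetic marker sets, not the authors' pipeline; the marker coordinates and noise level are illustrative:

    ```python
    import numpy as np

    def rigid_register(src, dst):
        """Least-squares rigid transform (rotation r, translation t) mapping the
        src fiducials onto dst, via the Kabsch/SVD algorithm."""
        src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
        u, _, vt = np.linalg.svd(src_c.T @ dst_c)
        d = np.sign(np.linalg.det(vt.T @ u.T))
        r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T   # guard against reflections
        t = dst.mean(0) - r @ src.mean(0)
        return r, t

    rng = np.random.default_rng(0)
    mr = rng.uniform(0.0, 100.0, size=(9, 3))        # nine fiducials (mm), as in the study
    true_r, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(true_r) < 0:                    # force a proper rotation
        true_r = -true_r
    pet = mr @ true_r.T + np.array([5.0, -3.0, 12.0])
    pet += rng.normal(scale=0.5, size=pet.shape)     # fiducial localisation error (FLE)

    # Fit on eight markers; hold the ninth out as a "target" to measure TRE.
    r, t = rigid_register(pet[:8], mr[:8])
    fre = np.sqrt(np.mean(np.sum((pet[:8] @ r.T + t - mr[:8]) ** 2, axis=1)))
    tre = np.linalg.norm(pet[8] @ r.T + t - mr[8])
    print(f"FRE = {fre:.2f} mm, TRE = {tre:.2f} mm")
    ```

    Holding a marker out of the fit, as here, mirrors the abstract's point that TRE must be measured at fiducials excluded from the registration.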

  10. Frequency and Proximity Clustering Analyses for Georeferencing Toponyms and Points-of-Interest Names from a Travel Journal

    Science.gov (United States)

    McDermott, Scott D.

    2017-01-01

    This research study uses geographic information retrieval (GIR) to georeference toponyms and points-of-interest (POI) names from a travel journal. Travel journals are an ideal data source with which to conduct this study because they are significant accounts specific to the author's experience, and contain geographic instances based on the…

  11. An Initial Seed Selection Algorithm for K-means Clustering of Georeferenced Data to Improve Replicability of Cluster Assignments for Mapping Application

    OpenAIRE

    Khan, Fouad

    2016-01-01

    K-means is one of the most widely used clustering algorithms in various disciplines, especially for large datasets. However the method is known to be highly sensitive to initial seed selection of cluster centers. K-means++ has been proposed to overcome this problem and has been shown to have better accuracy and computational efficiency than k-means. In many clustering problems though -such as when classifying georeferenced data for mapping applications- standardization of clustering methodolo...
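    The k-means++ seeding that this abstract builds on can be sketched in a few lines: each new center is sampled with probability proportional to its squared distance from the nearest already-chosen center. The data below are synthetic blobs standing in for georeferenced observations:

    ```python
    import math
    import random

    def kmeans_pp_seeds(points, k, rng):
        """k-means++ seeding: draw each new center with probability proportional
        to the squared distance to the nearest existing center."""
        centers = [rng.choice(points)]
        while len(centers) < k:
            d2 = [min(math.dist(p, c) ** 2 for c in centers) for p in points]
            r = rng.random() * sum(d2)
            acc = 0.0
            for p, w in zip(points, d2):
                acc += w
                if acc >= r:
                    centers.append(p)
                    break
        return centers

    rng = random.Random(42)
    # Three well-separated 2-D blobs of 50 points each.
    blobs = [(bx + rng.gauss(0, 0.1), by + rng.gauss(0, 0.1))
             for bx, by in [(0, 0), (5, 0), (0, 5)] for _ in range(50)]
    seeds = kmeans_pp_seeds(blobs, 3, rng)
    print(seeds)
    ```

    With well-separated clusters the squared-distance weighting almost always places one seed per blob, which is the replicability property the paper exploits.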

  12. Probabilistic reliability analyses to detect weak points in secondary-side residual heat removal systems of KWU PWR plants

    International Nuclear Information System (INIS)

    Schilling, R.

    1984-01-01

    Requirements made by Federal German licensing authorities called for the analysis of the secondary-side residual heat removal systems of new PWR plants with regard to availability, possible weak points and the balanced nature of the overall system for different incident sequences. Following a description of the generic concept and the process and safety-related systems for steam generator feed and main steam discharge, the reliability of the latter is analyzed for the small break LOCA and emergency power mode incidents, weak points in the process systems are identified, remedial measures of a system-specific and test-strategic nature are presented and their contribution to improving system availability is quantified. A comparison with the results of the German Risk Study on Nuclear Power Plants (GRS) shows a distinct reduction in core meltdown frequency. (orig.)

  13. The Berg Balance Scale has high intra- and inter-rater reliability but absolute reliability varies across the scale: a systematic review.

    Science.gov (United States)

    Downs, Stephen; Marquez, Jodie; Chiarelli, Pauline

    2013-06-01

    What is the intra-rater and inter-rater relative reliability of the Berg Balance Scale? What is the absolute reliability of the Berg Balance Scale? Does the absolute reliability of the Berg Balance Scale vary across the scale? Systematic review with meta-analysis of reliability studies. Any clinical population that has undergone assessment with the Berg Balance Scale. Relative intra-rater reliability, relative inter-rater reliability, and absolute reliability. Eleven studies involving 668 participants were included in the review. The relative intra-rater reliability of the Berg Balance Scale was high, with a pooled estimate of 0.98 (95% CI 0.97 to 0.99). Relative inter-rater reliability was also high, with a pooled estimate of 0.97 (95% CI 0.96 to 0.98). A ceiling effect of the Berg Balance Scale was evident for some participants. In the analysis of absolute reliability, all of the relevant studies had an average score of 20 or above on the 0 to 56 point Berg Balance Scale. The absolute reliability across this part of the scale, as measured by the minimal detectable change with 95% confidence, varied between 2.8 points and 6.6 points. The Berg Balance Scale has a higher absolute reliability when close to 56 points due to the ceiling effect. We identified no data that estimated the absolute reliability of the Berg Balance Scale among participants with a mean score below 20 out of 56. The Berg Balance Scale has acceptable reliability, although it might not detect modest, clinically important changes in balance in individual subjects. The review was only able to comment on the absolute reliability of the Berg Balance Scale among people with moderately poor to normal balance. Copyright © 2013 Australian Physiotherapy Association. All rights reserved.

  14. Georeferenced historical forest maps of Bukovina (Northern Romania) - important tool for paleoenvironmental analyses

    Science.gov (United States)

    Popa, Ionel; Crǎciunescu, Vasile; Candrea, Bogdan; Timár, Gábor

    2010-05-01

    The historical region of Bukovina is one of the most forested areas of Romania. The name itself, beech land, suggests the high wood resources located here. Systematic wood exploitation started in Bukovina during the Austrian rule (1775 - 1918). To fully assess the region's wood potential and to make the exploitation and replantation processes more efficient, the Austrian engineers developed a dedicated mapping system. The result was a series of maps, surveyed for each forest district. In the first editions, we can find maps crafted at different scales (e.g. 1:50 000, 1:20 000, 1:25 000). Later on (after 1900), the map sheet scale was standardized to 1:25 000. Each sheet was accompanied by a register with information regarding the forest parcels. The system was kept after 1918, when Bukovina became part of Romania. For another 20 years, the forest districts were periodically surveyed and the maps updated. The basemap content also changed over time. For most of the maps, the background was compiled from the Austrian Third Military Survey maps. After the Second World War, the Romanian military maps ("planurile directoare de tragere") were also used. The forest surveys were positioned using the Austrian triangulation network, with the closest baseline at Rădăuţi. Considered lost after WWII, an important part of these maps was recently recovered through a fortunate and accidental finding. Such information is highly valuable for today's forest planners. By carefully studying these documents, a modern forest manager can better understand the way forests were managed in the past and the implications of that management in today's forest reality. In order to do that, the maps should first be georeferenced into a known coordinate system of the Third Survey and integrated with recent geospatial datasets using a GIS environment. The paper presents the challenges of finding and applying the right information regarding the datum and projection used by the

  15. Comparative analysis of different configurations of PLC-based safety systems from reliability point of view

    Science.gov (United States)

    Tapia, Moiez A.

    1993-01-01

    A comparative analysis of distinct multiplex and fault-tolerant configurations for a PLC-based safety system from a reliability point of view is presented. It considers simplex, duplex and fault-tolerant triple redundancy configurations. The standby unit in the duplex configuration has a failure rate that is k times the failure rate of the main unit, with k varying from 0 to 1. For distinct values of MTTR and MTTF of the main unit, MTBF and availability for these configurations are calculated. The effect on MTBF of duplexing only the PLC module, or only the sensor and actuator modules, is also presented. The results are summarized, and merits and demerits of the various configurations under distinct environments are discussed.
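    Under common textbook assumptions (constant failure rates, perfect switchover, no repair), the mean time to failure of a duplex configuration with a warm standby failing at k times the active unit's rate has a closed form; the sketch below compares it with a simplex unit. This is a generic warm-standby model, not necessarily the paper's exact formulation:

    ```python
    def mttf_simplex(lam):
        """Mean time to failure of a single unit with constant failure rate lam."""
        return 1.0 / lam

    def mttf_duplex_warm(lam, k):
        """Mean time to system failure for an active unit (rate lam) backed by a
        warm standby failing at rate k*lam, with perfect switchover and no repair:
        expected time to the first of the two failures, after which the surviving
        unit runs alone at rate lam."""
        return 1.0 / ((1.0 + k) * lam) + 1.0 / lam

    lam = 1e-4  # illustrative failure rate, failures per hour
    for k in (0.0, 0.5, 1.0):
        print(f"k={k:3.1f}: duplex MTTF = {mttf_duplex_warm(lam, k):,.0f} h "
              f"(simplex: {mttf_simplex(lam):,.0f} h)")
    ```

    At k = 0 (cold standby) the duplex MTTF is exactly double the simplex value; as k rises toward 1 the benefit shrinks, which is the trade-off the paper quantifies.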

  16. Psychogeography in the Age of the Quantified Self — Mental Map modelling with Georeferenced Personal Activity Data

    Science.gov (United States)

    Meier, Sebastian; Glinka, Katrin

    2018-05-01

    Personal and subjective perceptions of urban space have been a focus of various research projects in the area of cartography, geography, and related fields such as urban planning. This paper illustrates how personal georeferenced activity data can be used in algorithmic modelling of certain aspects of mental maps and customised spatial visualisations. The technical implementation of the algorithm is accompanied by a preliminary study which evaluates the performance of the algorithm. As a linking element between personal perception, interpretation, and depiction of space and the field of cartography and geography, we include perspectives from artistic practice and cultural theory. By developing novel visualisation concepts based on personal data, the paper in part mitigates the challenges presented by user modelling that is, amongst others, used in LBS applications.

  17. Reliability-guided digital image correlation for image deformation measurement

    International Nuclear Information System (INIS)

    Pan Bing

    2009-01-01

    A universally applicable reliability-guided digital image correlation (DIC) method is proposed for reliable image deformation measurement. The zero-mean normalized cross correlation (ZNCC) coefficient is used to quantify the reliability of each computed point. The correlation calculation begins with a seed point and is then guided by the ZNCC coefficient: among the queued points, the neighbors of the point with the highest ZNCC coefficient are processed first. The calculation path thus always follows the most reliable direction, and the error propagation possible in the conventional DIC method is avoided. The proposed DIC method is universally applicable to images with shadows, discontinuous areas, and deformation discontinuities. Two image pairs were used to evaluate the performance of the proposed technique, and the successful results clearly demonstrate its robustness and effectiveness.
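    The reliability-guided scan order can be sketched as a best-first traversal with a priority queue keyed on the ZNCC coefficient; the ZNCC map below is a toy stand-in for values computed from real images:

    ```python
    import heapq

    def reliability_guided_order(zncc, seed):
        """Visit grid points starting from `seed`, always expanding the queued
        point with the highest ZNCC coefficient first (max-heap via negation)."""
        rows, cols = len(zncc), len(zncc[0])
        visited = set()
        heap = [(-zncc[seed[0]][seed[1]], seed)]
        order = []
        while heap:
            _, (r, c) = heapq.heappop(heap)
            if (r, c) in visited:
                continue
            visited.add((r, c))
            order.append((r, c))
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in visited:
                    heapq.heappush(heap, (-zncc[nr][nc], (nr, nc)))
        return order

    # Toy ZNCC map: the middle column mimics a low-reliability (e.g. shadowed) strip.
    zncc = [[0.9, 0.2, 0.8],
            [0.9, 0.2, 0.8],
            [0.9, 0.2, 0.8]]
    path = reliability_guided_order(zncc, (1, 0))
    print(path)
    ```

    The whole high-reliability left column is processed before any low-ZNCC point, so unreliable points are reached last and cannot seed error propagation.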

  18. Monte Carlo simulation based reliability evaluation in a multi-bilateral contracts market

    International Nuclear Information System (INIS)

    Goel, L.; Viswanath, P.A.; Wang, P.

    2004-01-01

    This paper presents a time sequential Monte Carlo simulation technique to evaluate customer load point reliability in multi-bilateral contracts market. The effects of bilateral transactions, reserve agreements, and the priority commitments of generating companies on customer load point reliability have been investigated. A generating company with bilateral contracts is modelled as an equivalent time varying multi-state generation (ETMG). A procedure to determine load point reliability based on ETMG has been developed. The developed procedure is applied to a reliability test system to illustrate the technique. Representing each bilateral contract by an ETMG provides flexibility in determining the reliability at various customer load points. (authors)
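    The time-sequential Monte Carlo idea can be illustrated for a single two-state generating unit: sample alternating exponential up and down durations, accumulate up-time, and compare with the analytic availability MTTF/(MTTF + MTTR). This is a minimal sketch, not the multi-bilateral-contract model of the paper:

    ```python
    import random

    def simulate_two_state_unit(mttf, mttr, horizon, rng):
        """Time-sequential sampling of one unit's up/down history over `horizon`
        hours; returns total up-time (exponential times-to-failure and repairs)."""
        t, up_time = 0.0, 0.0
        while t < horizon:
            ttf = rng.expovariate(1.0 / mttf)
            up_time += min(ttf, horizon - t)
            t += ttf
            if t >= horizon:
                break
            t += rng.expovariate(1.0 / mttr)   # repair duration
        return up_time

    rng = random.Random(1)
    horizon = 8760.0          # one simulated year, in hours
    years = 500
    avail = sum(simulate_two_state_unit(2000.0, 100.0, horizon, rng)
                for _ in range(years)) / (years * horizon)
    print(f"simulated availability: {avail:.4f} (analytic: {2000 / 2100:.4f})")
    ```

    Extending this to a load point means running such histories for every contracted unit in parallel and tallying the hours in which supply falls short of load.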

  19. Reliability Estimation Based Upon Test Plan Results

    National Research Council Canada - National Science Library

    Read, Robert

    1997-01-01

    The report contains a brief summary of aspects of the Maximus reliability point and interval estimation technique as it has been applied to the reliability of a device whose surveillance tests contain...

  20. Reliability-Based Optimization of Structural Elements

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    In this paper structural elements from an optimization point of view are considered, i.e. only the geometry of a structural element is optimized. Reliability modelling of the structural element is discussed both from an element point of view and from a system point of view. The optimization...

  1. A heuristic-based approach for reliability importance assessment of energy producers

    International Nuclear Information System (INIS)

    Akhavein, A.; Fotuhi Firuzabad, M.

    2011-01-01

    Reliability of energy supply is one of the most important issues of service quality. On one hand, customers usually have different expectations for service reliability and price. On the other hand, providing different level of reliability at load points is a challenge for system operators. In order to take reasonable decisions and obviate reliability implementation difficulties, market players need to know impacts of their assets on system and load-point reliabilities. One tool to specify reliability impacts of assets is the criticality or reliability importance measure by which system components can be ranked based on their effect on reliability. Conventional methods for determination of reliability importance are essentially on the basis of risk sensitivity analysis and hence, impose prohibitive calculation burden in large power systems. An approach is proposed in this paper to determine reliability importance of energy producers from perspective of consumers or distribution companies in a composite generation and transmission system. In the presented method, while avoiding immense computational burden, the energy producers are ranked based on their rating, unavailability and impact on power flows in the lines connecting to the considered load points. Study results on the IEEE reliability test system show successful application of the proposed method. - Research highlights: → Required reliability level at load points is a concern in modern power systems. → It is important to assess reliability importance of energy producers or generators. → Generators can be ranked based on their impacts on power flow to a selected area. → Ranking of generators is an efficient tool to assess their reliability importance.

  2. Unemployment estimation: Spatial point referenced methods and models

    KAUST Repository

    Pereira, Soraia; Turkman, Kamil Feridun; Correia, Luis; Rue, Haavard

    2017-01-01

    Portuguese Labor force survey, from 4th quarter of 2014 onwards, started geo-referencing the sampling units, namely the dwellings in which the surveys are carried out. This opens new possibilities in analysing and estimating unemployment and its spatial

  3. Reliable and accurate point-based prediction of cumulative infiltration using soil readily available characteristics: A comparison between GMDH, ANN, and MLR

    Science.gov (United States)

    Rahmati, Mehdi

    2017-08-01

    Developing accurate and reliable pedo-transfer functions (PTFs) to predict soil non-readily available characteristics is one of the topics of greatest concern in soil science, and selecting more appropriate predictors is a crucial factor in PTF development. Group method of data handling (GMDH), which finds an approximate relationship between a set of input and output variables, not only provides an explicit procedure to select the most essential PTF input variables, but also results in more accurate and reliable estimates than other commonly applied methodologies. Therefore, the current research aimed to apply GMDH, in comparison with multivariate linear regression (MLR) and artificial neural network (ANN) approaches, to develop several PTFs to predict soil cumulative infiltration point-by-point at specific time intervals (0.5-45 min) using soil readily available characteristics (RACs). In this regard, soil infiltration curves as well as several soil RACs including soil primary particles (clay (CC), silt (Si), and sand (Sa)), saturated hydraulic conductivity (Ks), bulk (Db) and particle (Dp) densities, organic carbon (OC), wet-aggregate stability (WAS), electrical conductivity (EC), and soil antecedent (θi) and field saturated (θfs) water contents were measured at 134 different points in Lighvan watershed, northwest of Iran. Then, applying GMDH, MLR, and ANN methodologies, several PTFs were developed to predict cumulative infiltrations using two sets of selected soil RACs, including and excluding Ks. According to the test data, results showed that the PTFs developed by the GMDH and MLR procedures using all soil RACs including Ks resulted in more accurate (with E values of 0.673-0.963) and reliable (with CV values lower than 11 percent) predictions of cumulative infiltrations at different specific time steps. In contrast, the ANN procedure had lower accuracy (with E values of 0.356-0.890) and reliability (with CV values up to 50 percent) compared to GMDH and MLR. The results also revealed
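    The accuracy criterion E reported above is presumably the Nash-Sutcliffe model efficiency (a common choice for infiltration PTFs; the abstract does not define it). A minimal sketch with illustrative cumulative-infiltration values, not the study's data:

    ```python
    def nash_sutcliffe(obs, pred):
        """Nash-Sutcliffe model efficiency E: 1 - SSE / (variance about the mean).
        E = 1 is a perfect fit; E <= 0 means no better than the observed mean."""
        mean_obs = sum(obs) / len(obs)
        sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
        sst = sum((o - mean_obs) ** 2 for o in obs)
        return 1.0 - sse / sst

    # Illustrative cumulative infiltration (mm) at one time step: measured vs. PTF.
    observed = [12.0, 18.5, 25.1, 30.2, 41.7, 55.0]
    predicted = [11.2, 19.9, 24.0, 31.5, 40.1, 53.2]
    e = nash_sutcliffe(observed, predicted)
    print(f"E = {e:.3f}")
    ```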

  4. FRELIB, Failure Reliability Index Calculation

    International Nuclear Information System (INIS)

    Parkinson, D.B.; Oestergaard, C.

    1984-01-01

    1 - Description of problem or function: Calculation of the reliability index given the failure boundary. A linearization point (design point) is found on the failure boundary for a stationary reliability index (min) and a stationary failure probability density function along the failure boundary, provided that the basic variables are normally distributed. 2 - Method of solution: Iteration along the failure boundary which must be specified - together with its partial derivatives with respect to the basic variables - by the user in a subroutine FSUR. 3 - Restrictions on the complexity of the problem: No distribution information included (first-order-second-moment-method). 20 basic variables (could be extended)
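    The iteration FRELIB performs can be sketched with the classic Hasofer-Lind/Rackwitz-Fiessler recursion, where the user supplies the limit-state function and its gradient (the role of the FSUR subroutine). This is a generic first-order reliability sketch, not the original code; a linear limit state makes the fixed point easy to verify:

    ```python
    import math

    def hasofer_lind_beta(g, grad_g, n, iters=50):
        """First-order reliability index for standard-normal basic variables,
        via the HL-RF recursion  u <- [(grad.u - g(u)) / |grad|^2] * grad."""
        u = [0.0] * n
        for _ in range(iters):
            gu = g(u)
            gr = grad_g(u)
            norm2 = sum(gi * gi for gi in gr)
            c = (sum(gi * ui for gi, ui in zip(gr, u)) - gu) / norm2
            u = [c * gi for gi in gr]
        return math.sqrt(sum(ui * ui for ui in u)), u

    # Linear limit state g(u) = 3 - u1 - u2 in standard-normal space:
    # the exact design point is (1.5, 1.5) and beta = 3 / sqrt(2).
    g = lambda u: 3.0 - u[0] - u[1]
    grad = lambda u: [-1.0, -1.0]
    beta, design_point = hasofer_lind_beta(g, grad, 2)
    print(f"beta = {beta:.4f}, design point = {design_point}")
    ```

    For a linear boundary the recursion converges in a single step; nonlinear boundaries (supplied with their partial derivatives, as FRELIB requires) iterate along the failure surface exactly as the abstract describes.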

  5. Scale Reliability Evaluation with Heterogeneous Populations

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  6. Common-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CROSSER, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CROSSER, CUMBIN (NPO-17555), and NEWTONP (NPO-17556), used independently of one another. Point of equality between reliability of system and common reliability of components found. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
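    The "point of equality" CROSSER computes can be sketched by bisecting for the component reliability at which a k-out-of-n system's cumulative-binomial reliability equals the component reliability itself. This is an illustrative Python reimplementation, not the original C program:

    ```python
    from math import comb

    def system_reliability(p, n, k):
        """Reliability of a k-out-of-n system of identical components, each of
        reliability p (cumulative binomial upper tail)."""
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def crossover_point(n, k, tol=1e-12):
        """Bisect for the component reliability where system reliability equals
        component reliability (the nontrivial 'point of equality')."""
        lo, hi = 1e-9, 1.0 - 1e-9
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if system_reliability(mid, n, k) > mid:
                hi = mid      # above the crossover: system beats component
            else:
                lo = mid
        return 0.5 * (lo + hi)

    p_star = crossover_point(5, 3)   # 3-out-of-5 majority system
    print(f"crossover reliability: {p_star:.6f}")
    ```

    For a majority system the crossover sits at p = 0.5 by symmetry: above it, redundancy improves on a single component; below it, redundancy hurts.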

  7. A Reliability Based Model for Wind Turbine Selection

    Directory of Open Access Journals (Sweden)

    A.K. Rajeevan

    2013-06-01

    Full Text Available A wind turbine generator output at a specific site depends on many factors, particularly cut-in, rated and cut-out wind speed parameters. Hence power output varies from turbine to turbine. The objective of this paper is to develop a mathematical relationship between reliability and wind power generation. The analytical computation of monthly wind power is obtained from a Weibull statistical model using the cubic mean cube root of wind speed. Reliability calculation is based on failure probability analysis. There are many different types of wind turbines commercially available in the market. From a reliability point of view, to get optimum reliability in power generation, it is desirable to select a wind turbine generator which is best suited for a site. The mathematical relationship developed in this paper can be used for site-matching turbine selection from a reliability point of view.
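    The "cubic mean cube root" wind speed mentioned above has a closed form under a Weibull model, (E[v^3])^(1/3) = c · Γ(1 + 3/k)^(1/3), from which a mean power density follows. The scale, shape, and power coefficient below are illustrative assumptions, not the paper's values:

    ```python
    import math

    def cubic_mean_cube_root(c, k):
        """(E[v^3])^(1/3) for a Weibull(scale=c, shape=k) wind-speed distribution."""
        return c * math.gamma(1.0 + 3.0 / k) ** (1.0 / 3.0)

    def mean_wind_power(c, k, rho=1.225, area=1.0, cp=0.4):
        """Mean power (W) through a rotor of swept `area` (m^2), using the cubic
        mean cube root speed; cp is an assumed overall power coefficient."""
        v3 = cubic_mean_cube_root(c, k) ** 3      # equals c^3 * Gamma(1 + 3/k)
        return 0.5 * rho * area * cp * v3

    v_cmc = cubic_mean_cube_root(c=7.0, k=2.0)    # k = 2 is the Rayleigh case
    power = mean_wind_power(7.0, 2.0)
    print(f"cubic mean cube root speed: {v_cmc:.2f} m/s, mean power: {power:.0f} W/m^2")
    ```

    Because power scales with v^3, the cubic mean cube root exceeds the arithmetic mean speed, which is why it is the appropriate single-speed summary for energy estimates.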

  8. Study on the reliability of large coal-fired and nuclear power plants. Factors affecting power plant reliability. Volume I. Final report

    International Nuclear Information System (INIS)

    1975-01-01

    The study consisted of a comparative evaluation of 2 nuclear units (Indian Point 2 - Consolidated Edison of New York, Turkey Point 4 - Florida Power and Light Company) and 2 coal-fired units (Bull Run and Widows Creek Unit 8 - Tennessee Valley Authority). The purpose of the study was to identify and assess the underlying causes of unit reliability and the causes of the observed differences in reliability performance of the units. Recommended actions for improving the reliability of one of the study units were to be presented in a format useful to other utility companies for improving the reliability of their generating units. The emphasis of the study was on the aspects of management, manning, operations, and maintenance which had a significant impact on unit reliability. Volume 1 includes a summary, a description of the major findings from the comparative evaluation, conclusions based on these findings, and recommendations for improving the reliability of the below average units

  9. New Approaches to Reliability Assessment

    DEFF Research Database (Denmark)

    Ma, Ke; Wang, Huai; Blaabjerg, Frede

    2016-01-01

    of energy. New approaches for reliability assessment are being taken in the design phase of power electronics systems based on the physics-of-failure in components. In this approach, many new methods, such as multidisciplinary simulation tools, strength testing of components, translation of mission profiles, and statistical analysis, are involved to enable better prediction and design of reliability for products. This article gives an overview of the new design flow in the reliability engineering of power electronics from the system-level point of view and discusses some of the emerging needs for the technology...

  10. Geopan AT@S: a Brokering Based Gateway to Georeferenced Historical Maps for Risk Analysis

    Science.gov (United States)

    Previtali, M.

    2017-08-01

    The importance of ancient and historical maps is nowadays recognized in many applications (e.g., urban planning, landscape valorisation and preservation, land changes identification, etc.). In the last years a great effort has been made by different institutions, such as Geographical Institutes, Public Administrations, and collaborative communities, for digitizing and publishing online collections of historical maps. In spite of this variety and availability of data, information overload makes their discovery and management difficult: without knowing the specific repository where the data are stored, it is difficult to find the information required. In addition, problems of interconnection between different data sources and their restricted interoperability may arise. This paper describes a new brokering-based gateway developed to assure interoperability between data, in particular georeferenced historical maps and geographic data, gathered from different data providers, with various features and referring to different historical periods. The developed approach is exemplified by a new application named GeoPAN Atl@s that is aimed at linking land changes in the Northern Italy area with risk analysis (local seismicity amplification and flooding risk) by using multi-temporal data sources and historic maps.

  11. Digital microwave communication engineering point-to-point microwave systems

    CERN Document Server

    Kizer, George

    2013-01-01

    The first book to cover all engineering aspects of microwave communication path design for the digital age Fixed point-to-point microwave systems provide moderate-capacity digital transmission between well-defined locations. Most popular in situations where fiber optics or satellite communication is impractical, it is commonly used for cellular or PCS site interconnectivity where digital connectivity is needed but not economically available from other sources, and in private networks where reliability is most important. Until now, no book has adequately treated all en

  12. Reliability of infrared thermometric measurements of skin temperature in the hand.

    Science.gov (United States)

    Packham, Tara L; Fok, Diana; Frederiksen, Karen; Thabane, Lehana; Buckley, Norman

    2012-01-01

    Clinical measurement study. Skin temperature asymmetries (STAs) are used in the diagnosis of complex regional pain syndrome (CRPS), but little evidence exists for reliability of the equipment and methods. This study examined the reliability of an inexpensive infrared (IR) thermometer and measurement points in the hand for the study of STA. ST was measured three times at five points on both hands with an IR thermometer by two raters in 20 volunteers (12 normals and 8 CRPS). ST measurement results using IR thermometers support inter-rater reliability: intraclass correlation coefficient (ICC) estimate for single measures 0.80; all ST measurement points were also highly reliable (ICC single measures, 0.83-0.91). The equipment demonstrated excellent reliability, with little difference in the reliability of the five measurement sites. These preliminary findings support their use in future CRPS research. Not applicable. Copyright © 2012 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.

  13. Reliability issues of free-space communications systems and networks

    Science.gov (United States)

    Willebrand, Heinz A.

    2003-04-01

    Free space optics (FSO) is a high-speed point-to-point connectivity solution traditionally used in the enterprise campus networking market for building-to-building LAN connectivity. However, more recently some wire line and wireless carriers have started to deploy FSO systems in their networks. The requirements on FSO system reliability, meaning both system availability and component reliability, are far more stringent in the carrier market when compared to the requirements in the enterprise market segment. This paper tries to outline some of the aspects that are important to ensure carrier class system reliability.

  14. Direct Georeferencing of a Pushbroom, Lightweight Hyperspectral System for Mini-UAV Applications

    Directory of Open Access Journals (Sweden)

    Marion Jaud

    2018-01-01

    Full Text Available Hyperspectral imagery has proven its potential in many research applications, especially in the field of environmental sciences. Currently, hyperspectral imaging is generally performed by satellite or aircraft platforms, but mini-UAV (Unmanned Aerial Vehicle) platforms (<20 kg) are now emerging. On such platforms, payload restrictions are critical, so sensors must be selected according to stringent specifications. This article presents the integration of a light pushbroom hyperspectral sensor onboard a multirotor UAV, which we have called Hyper-DRELIO (Hyperspectral DRone for Environmental and LIttoral Observations). This article depicts the system design: the UAV platform, the imaging module, the navigation module, and the interfacing between the different elements. Pushbroom sensors offer a better combination of spatial and spectral resolution than full-frame cameras. Nevertheless, data georectification has to be performed line by line, the quality of direct georeferencing being limited by mechanical stability, timing accuracy, and the resolution and accuracy of the proprioceptive sensors. A georegistration procedure is proposed for geometrical pre-processing of hyperspectral data. The specifications of Hyper-DRELIO surveys are described through two examples of surveys above coastal or inland waters, with different flight altitudes. This system can collect hyperspectral data in the VNIR (Visible and Near InfraRed) domain above small study sites (up to about 4 ha) with both high spatial resolution (<10 cm) and high spectral resolution (1.85 nm), and with georectification accuracy on the order of 1 to 2 m.

  15. Georeferenced measurement of soil EC as a tool to detect susceptible areas to water erosion.

    Science.gov (United States)

    Fabian Sallesses, Leonardo; Aparicio, Virginia Carolina; Costa, Jose Luis

    2017-04-01

    The Southeast region of Buenos Aires Province, Argentina, is one of the main regions for the cultivation of potato (Solanum tuberosum L.) in that country. The implementation of complementary irrigation for potato cultivation meant an increase in yield of up to 60%; therefore, all potato production in the region is under irrigation. In this way, the area under central-pivot irrigation has increased by 150% in the last two decades. The water used for irrigation in the region is groundwater with a high concentration of sodium bicarbonate. The combination of irrigation and rain increases the sodium adsorption ratio of the soil (SARs), consequently raising clay dispersion and reducing infiltration. A reduction in infiltration means greater partitioning of precipitation into runoff. The degree of slope of the terrain, added to its length, increases the erosive potential of runoff water. The content of dissolved salts, in combination with the water content, affects the apparent electrical conductivity of the soil (EC), which is directly related to the concentration of Na+ in the soil solution. In August 2016, severe rill erosion was detected in a productive plot of 300 ha. The preceding crop was irrigated potato; the history of the plot, however, consists of various winter and summer crops, always grown in dry land under no-till. Cumulative rainfall from harvest to erosion detection (four months) was 250 mm. A georeferenced EC measurement was performed using the Veris 3100® contact sensor. With the data obtained, a geostatistical analysis was performed using kriging spatial interpolation. The maps obtained were processed, dividing them into 4 EC ranges. The values and amplitude of the EC ranges for each plot were determined according to the distribution observed in the generated histograms. A distribution of elevated EC ranges, and consequently of a higher concentration of Na+, was observed coinciding with the irrigation areas of the pivots. These

  16. Basics of Bayesian reliability estimation from attribute test data

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Waller, R.A.

    1975-10-01

    The basic notions of Bayesian reliability estimation from attribute lifetest data are presented in an introductory and expository manner. Both Bayesian point and interval estimates of the probability of surviving the lifetest, the reliability, are discussed. The necessary formulas are simply stated, and examples are given to illustrate their use. In particular, a binomial model in conjunction with a beta prior model is considered. Particular attention is given to the procedure for selecting an appropriate prior model in practice. Empirical Bayes point and interval estimates of reliability are discussed and examples are given. 7 figures, 2 tables
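    The binomial-likelihood/beta-prior model mentioned above has a closed-form conjugate update; the following is a minimal sketch (with illustrative numbers, not figures from the report) that computes the Bayesian point estimate and an equal-tailed credible interval for the survival probability:

    ```python
    import math

    def beta_pdf(x, a, b):
        """Beta(a, b) density, via log-gamma for numerical stability."""
        ln = (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
              + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))
        return math.exp(ln)

    def posterior(a0, b0, n, s):
        """Conjugate update: Beta(a0, b0) prior, s survivors in n trials."""
        return a0 + s, b0 + (n - s)

    def credible_interval(a, b, level=0.90, steps=100_000):
        """Equal-tailed interval from a grid approximation of the Beta CDF."""
        lo_p, hi_p = (1 - level) / 2, 1 - (1 - level) / 2
        cdf, lo, hi, dx = 0.0, 0.0, 1.0, 1.0 / steps
        for i in range(1, steps):
            x = i * dx
            cdf += beta_pdf(x, a, b) * dx
            if lo == 0.0 and cdf >= lo_p:
                lo = x
            if cdf >= hi_p:
                hi = x
                break
        return lo, hi

    # Example: 18 of 20 units survive the lifetest, uniform Beta(1, 1) prior.
    a, b = posterior(1.0, 1.0, n=20, s=18)
    print("posterior mean reliability:", a / (a + b))   # 19/22, about 0.864
    print("90% credible interval:", credible_interval(a, b))
    ```

    The same update applies to any attribute (pass/fail) test; only the prior parameters change when prior information is available.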

  17. Prime implicants in dynamic reliability analysis

    International Nuclear Information System (INIS)

    Tyrväinen, Tero

    2016-01-01

    This paper develops an improved definition of a prime implicant for the needs of dynamic reliability analysis. Reliability analyses often aim to identify minimal cut sets or prime implicants, which are minimal conditions that cause an undesired top event, such as a system's failure. Dynamic reliability analysis methods take the time-dependent behaviour of a system into account. This means that the state of a component can change in the analysed time frame and prime implicants can include the failure of a component at different time points. There can also be dynamic constraints on a component's behaviour. For example, a component can be non-repairable in the given time frame. If a non-repairable component needs to be failed at a certain time point to cause the top event, we consider that the condition that it is failed at the latest possible time point is minimal, and the condition in which it fails earlier non-minimal. The traditional definition of a prime implicant does not account for this type of time-related minimality. In this paper, a new definition is introduced and illustrated using a dynamic flowgraph methodology model. - Highlights: • A new definition of a prime implicant is developed for dynamic reliability analysis. • The new definition takes time-related minimality into account. • The new definition is needed in dynamic flowgraph methodology. • Results can be represented by a smaller number of prime implicants.
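    The time-related minimality can be illustrated with a toy discrete-time sketch (this is illustrative only, not the paper's dynamic flowgraph model): a non-repairable component causes the top event if it has failed by a deadline, so every earlier failure time also implies the top event, but only the latest possible failure time is kept as the prime implicant.

    ```python
    # Toy example: a non-repairable valve causes the top event iff it has
    # failed by t = 3 (hypothetical deadline); discrete time points 1..4.
    TIMES = [1, 2, 3, 4]

    def causes_top_event(fail_time):
        """Monotone top-event condition for the non-repairable valve."""
        return fail_time <= 3

    # Every failure time t <= 3 implies the top event...
    implicants = [t for t in TIMES if causes_top_event(t)]

    # ...but under time-related minimality only the LATEST possible
    # failure time is retained as the prime implicant:
    prime_implicant = max(implicants)
    print(implicants, "->", prime_implicant)  # [1, 2, 3] -> 3
    ```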

  18. Reliability of High-Temperature Fixed-Point Installations over 8 Years

    Science.gov (United States)

    Elliott, C. J.; Ford, T.; Ongrai, O.; Pearce, J. V.

    2017-12-01

    At NPL, high-temperature metal-carbon eutectic fixed points have been set up for thermocouple calibration purposes since 2006, for realising reference temperatures above the highest point specified in the International Temperature Scale of 1990 for contact thermometer calibrations. Additionally, cells of the same design have been provided by NPL to other national measurement institutes (NMIs) and calibration laboratories over this period, creating traceable and ISO 17025 accredited facilities around the world for calibrating noble metal thermocouples at 1324 °C (Co-C) and 1492 °C (Pd-C). This paper shows collections of thermocouple calibration results obtained during use of the high-temperature fixed-point cells at NPL and, as further examples, the use of cells installed at CCPI Europe (UK) and NIMT (Thailand). The lifetime of the cells can now be shown to be in excess of 7 years, whether used on a weekly or monthly basis, and whether used in an NMI or industrial calibration laboratory.

  19. DETECTION OF SLOPE MOVEMENT BY COMPARING POINT CLOUDS CREATED BY SFM SOFTWARE

    Directory of Open Access Journals (Sweden)

    K. Oda

    2016-06-01

    Full Text Available This paper proposes a movement-detection method for comparing point clouds created by SfM software, without setting any onsite georeferenced points. SfM software such as Smart3DCapture, PhotoScan, and Pix4D is convenient for non-professional operators of photogrammetry, because these systems simply require a sequence of photos and output point clouds with a colour index corresponding to the colour of the original image pixel onto which each point is projected. SfM software can execute aerial triangulation and create dense point clouds fully automatically. This is useful when monitoring the motion of unstable slopes, or of loose rocks in slopes along roads or railroads. Most existing methods, however, use mesh-based DSMs for comparing point clouds before and after movement, which cannot be applied in cases where part of the slope forms overhangs. In some cases the movement is smaller than the precision of the ground control points, so registering the two point clouds with GCPs is not appropriate. The change-detection method in this paper adopts the CCICP (Classification and Combined ICP) algorithm for registering point clouds before and after movement. The CCICP algorithm is a type of ICP (Iterative Closest Point) that minimizes point-to-plane and point-to-point distances simultaneously, and also rejects incorrect correspondences based on point classification by PCA (Principal Component Analysis). Precision tests show that the CCICP method can register two point clouds to the order of 1 pixel in the original images. Ground control points set on site are useful for the initial alignment of the two point clouds. If there are no GCPs at the slope site, initial alignment is achieved by measuring feature points as ground control points in the point cloud before movement, and creating the point cloud after movement with these ground control points.
When the motion is a rigid transformation, as when a loose rock is moving in a slope, motion including rotation can be analysed by executing CCICP for a
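    The full CCICP algorithm adds PCA-based point classification and point-to-plane terms; the core loop it builds on can be sketched as a plain point-to-point ICP with brute-force correspondences (a minimal sketch assuming NumPy and synthetic data, not the authors' implementation):

    ```python
    import numpy as np

    def nearest_neighbors(src, dst):
        """Brute-force closest point in dst for every point in src."""
        d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        return dst[d2.argmin(axis=1)]

    def best_rigid_transform(A, B):
        """Kabsch/SVD solution of the rigid transform mapping A onto B."""
        ca, cb = A.mean(0), B.mean(0)
        H = (A - ca).T @ (B - cb)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = cb - R @ ca
        return R, t

    def icp(src, dst, iters=30):
        """Plain point-to-point ICP (no classification step)."""
        cur = src.copy()
        for _ in range(iters):
            matched = nearest_neighbors(cur, dst)
            R, t = best_rigid_transform(cur, matched)
            cur = cur @ R.T + t
        return cur

    # Synthetic "before" cloud and a slightly rotated/translated "after" cloud.
    rng = np.random.default_rng(0)
    cloud = rng.uniform(0, 10, (200, 3))
    th = np.radians(1.0)
    Rz = np.array([[np.cos(th), -np.sin(th), 0],
                   [np.sin(th),  np.cos(th), 0],
                   [0, 0, 1]])
    moved = cloud @ Rz.T + np.array([0.1, -0.1, 0.05])
    aligned = icp(cloud, moved)
    print("mean residual:", np.abs(aligned - moved).mean())  # small after alignment
    ```

    Point-to-plane and classification-based rejection, as in CCICP, mainly improve robustness when the two clouds sample the surface differently.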

  20. Point Cloud Management Through the Realization of the Intelligent Cloud Viewer Software

    Science.gov (United States)

    Costantino, D.; Angelini, M. G.; Settembrini, F.

    2017-05-01

    The paper presents a software package dedicated to the elaboration of point clouds, called Intelligent Cloud Viewer (ICV), made in-house by AESEI software (a spin-off of Politecnico di Bari), which allows viewing point clouds of several tens of millions of points, even on systems without very high performance. Elaborations are carried out on the whole point cloud, while only part of it is displayed in order to speed up rendering. It is designed for 64-bit Windows, is fully written in C++, and integrates different specialized modules for computer graphics (Open Inventor by SGI, Silicon Graphics Inc.), maths (BLAS, EIGEN), computational geometry (CGAL, Computational Geometry Algorithms Library), registration and advanced algorithms for point clouds (PCL, Point Cloud Library), advanced data structures (BOOST, Basic Object Oriented Supporting Tools), etc. ICV incorporates a number of features such as cropping, transformation and georeferencing, matching, registration, decimation, sections, distance calculation between clouds, etc. It has been tested on photographic and TLS (Terrestrial Laser Scanner) data, obtaining satisfactory results. The potential of the software was tested by carrying out a photogrammetric survey of Castel del Monte, which was already available in a previous laser scanner survey made from the ground by the same authors. For the aerophotogrammetric survey, a flight height of approximately 1000 ft AGL (Above Ground Level) was adopted and, overall, over 800 photos were acquired in just over 15 minutes, with coverage of not less than 80% and a planned speed of about 90 knots.

  2. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.

    Science.gov (United States)

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-28

    Vision navigation, which determines position and attitude via real-time processing of data collected from imaging sensors, is advantageous when a high-performance global positioning system (GPS) and an inertial measurement unit (IMU) are not available. Vision navigation is widely used in indoor navigation, deep space navigation, and multiple-sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach aided by imaging sensors and a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple-sensor platforms in environments with poor GPS coverage. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image search and retrieval. Third, a robust image matching algorithm is presented to search for and match a real-time image against the GRID. The image matched with the real-time scene is then used to calculate the 3D navigation parameters of the multiple-sensor platform. Experimental results show that the proposed approach retrieves images efficiently and achieves navigation accuracies of 1.2 m in plane and 1.8 m in height during a 5 min GPS outage over a distance of 1500 m.

  3. Integrated Reliability-Based Optimal Design of Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1987-01-01

    In conventional optimal design of structural systems the weight or the initial cost of the structure is usually used as objective function. Further, the constraints require that the stresses and/or strains at some critical points have to be less than some given values. Finally, all variables......-based optimal design is discussed. Next, an optimal inspection and repair strategy for existing structural systems is presented. An optimization problem is formulated , where the objective is to minimize the expected total future cost of inspection and repair subject to the constraint that the reliability...... value. The reliability can be measured from an element and/or a systems point of view. A number of methods to solve reliability-based optimization problems has been suggested, see e.g. Frangopol [I]. Murotsu et al. (2], Thoft-Christensen & Sørensen (3] and Sørensen (4). For structures where...

  4. Recent Advances in Optimal Design of Structures from a Reliability Point of View

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard

    1987-01-01

    -Christensen and Baker[4] and Madsen et al.[5]. Next a heuristic method, the so-called ß-unzipping method[6] is mentioned. This method can be used to estimate the reliability of a structural system if some modelling assumptions are fulfilled. In the third section some elements of structural optimisation theory...... section. First, a short review of the reliability theory for structural elements (e.g., beams and tubular joints) based on the so-called ß-index philosophy (Cornell[1], Ditlevsen[2], and Hasofer and Lind[3]) is given. Detailed descriptions are given in textbooks such as those by Thoft...

  5. power system reliability in supplying nuclear reactors

    International Nuclear Information System (INIS)

    Gad, M.M.M.

    2007-01-01

    This thesis presents a simple technique for deducing the minimal cut sets (MCS) from the defined minimal path sets (MPS) of a generic distribution system, and this technique has been used to evaluate the basic reliability indices of the electrical distribution network of Egypt's second research reactor (ETRR-2). Alternative system configurations are then studied to evaluate their impact on service reliability. The proposed MCS approach considers both sustained and temporary outages; temporary outages constitute an important parameter in characterizing the system reliability indices for critical load points in a distribution system. It also considers the impact of power quality on the reliability indices.
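    One standard way to deduce MCS from MPS (possibly along the lines used here; the abstract does not spell out the exact technique) is to take the minimal hitting sets of the path sets: a cut set must intersect every path. A brute-force sketch with a hypothetical two-path feeder:

    ```python
    from itertools import combinations

    def minimal_cut_sets(path_sets):
        """Minimal hitting sets of the minimal path sets.
        A cut set must intersect every path set; only minimal ones are kept.
        Candidates are enumerated by increasing size, so any candidate that
        contains an already-found cut set is non-minimal and skipped."""
        components = sorted(set().union(*path_sets))
        cuts = []
        for r in range(1, len(components) + 1):
            for cand in combinations(components, r):
                s = set(cand)
                if all(s & p for p in path_sets):        # hits every path
                    if not any(c <= s for c in cuts):    # minimality check
                        cuts.append(s)
        return cuts

    # Hypothetical feeder: two parallel supply paths sharing breaker 'b'.
    paths = [{"a", "b"}, {"c", "b"}]
    print(minimal_cut_sets(paths))  # breaker 'b' alone, or both lines a and c
    ```

    The enumeration is exponential in the number of components, so real distribution networks need the kind of structured deduction the thesis describes rather than this exhaustive search.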

  6. Acid dew point measurement in flue gases

    Energy Technology Data Exchange (ETDEWEB)

    Struschka, M.; Baumbach, G.

    1986-06-01

    The operation of modern boiler plants requires continuous measurement of the acid dew point in flue gases. An existing measuring instrument was modified in such a way that it can determine acid dew points reliably, reproducibly and continuously. The authors present the mechanisms of dew point formation, the dew point measuring principle, the modification and the operational results.

  7. System reliability effects in wind turbine blades

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian

    2012-01-01

    from reliability point of view. The present paper discusses the specifics of system reliability behavior of laminated composite sandwich panels, and solves an example system reliability problem for a glass fiber-reinforced composite sandwich structure subjected to in-plane compression.......Laminated composite sandwich panels have a layered structure, where individual layers have randomly varying stiffness and strength properties. The presence of multiple failure modes and load redistribution following partial failures are the reason for laminated composites to exhibit system behavior...

  8. MHTGR thermal performance envelopes: Reliability by design

    International Nuclear Information System (INIS)

    Etzel, K.T.; Howard, W.W.; Zgliczynski, J.B.

    1992-05-01

    This document discusses thermal performance envelopes which are used to specify steady-state design requirements for the systems of the Modular High Temperature Gas-Cooled Reactor to maximize plant performance reliability with optimized design. The thermal performance envelopes are constructed around the expected operating point accounting for uncertainties in actual plant as-built parameters and plant operation. The components are then designed to perform successfully at all points within the envelope. As a result, plant reliability is maximized by accounting for component thermal performance variation in the design. The design is optimized by providing a means to determine required margins in a disciplined and visible fashion

  9. Inter-Rater Reliability of Neck Reflex Points in Women with Chronic Neck Pain.

    Science.gov (United States)

    Weinschenk, Stefan; Göllner, Richard; Hollmann, Markus W; Hotz, Lorenz; Picardi, Susanne; Hubbert, Katharina; Strowitzki, Thomas; Meuser, Thomas

    2016-01-01

    Neck reflex points (NRP) are tender soft tissue areas of the cervical region that display reflectory changes in response to chronic inflammations of correlated regions in the visceral cranium. Six bilateral areas, NRP C0, C1, C2, C3, C4 and C7, are detectable by palpating the lateral neck. We investigated the inter-rater reliability of NRP to assess their potential clinical relevance. 32 consecutive patients with chronic neck pain were examined for NRP tenderness by an experienced physician and an inexperienced medical student in a blinded design. A detailed description of the palpation technique is included in this section. Absence of pain was defined as pain index (PI) = 0, slight tenderness = 1, and marked pain = 2. Findings were evaluated either by pair-wise Cohen's kappa (κ) or by percentage of agreement (PA). Examiners identified 40% and 41% of positive NRP, respectively (PI > 0, physician: 155, student: 157), with a slight preference for the left side (1.2:1). The number of patients identified with >6 positive NRP by the examiners was similar (13 vs. 12 patients). κ values ranged from 0.52 to 0.95. The overall kappa was κ = 0.80 for the left and κ = 0.74 for the right side. PA varied from 78.1% to 96.9%, with the strongest agreement at NRP C0, NRP C2, and NRP C7. Inter-rater agreement was independent of patients' age, gender, body mass index and examiner's experience. The high reproducibility suggests the clinical relevance of NRP in women. © 2016 S. Karger GmbH, Freiburg.
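    Pair-wise Cohen's kappa, as used above, can be computed directly from the two raters' scores; a minimal sketch with hypothetical pain-index ratings (the study's per-point data are not given in the abstract):

    ```python
    from collections import Counter

    def cohens_kappa(r1, r2):
        """Unweighted Cohen's kappa for two raters over the same items."""
        assert len(r1) == len(r2)
        n = len(r1)
        po = sum(a == b for a, b in zip(r1, r2)) / n       # observed agreement
        c1, c2 = Counter(r1), Counter(r2)
        cats = set(c1) | set(c2)
        pe = sum(c1[c] * c2[c] for c in cats) / n ** 2     # chance agreement
        return (po - pe) / (1 - pe)

    # Hypothetical pain-index scores (0/1/2): physician vs. student.
    physician = [0, 1, 2, 2, 0, 1, 1, 0, 2, 2, 1, 0]
    student   = [0, 1, 2, 1, 0, 1, 2, 0, 2, 2, 1, 0]
    print(round(cohens_kappa(physician, student), 3))  # → 0.75
    ```

    Unlike raw percentage agreement (PA), kappa discounts the agreement expected by chance from the raters' marginal score frequencies, which is why both statistics are reported in the study.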

  10. Evaluating horizontal positional accuracy of low-cost UAV orthomosaics over forest terrain using ground control points extracted from different sources

    Science.gov (United States)

    Patias, Petros; Giagkas, Fotis; Georgiadis, Charalampos; Mallinis, Giorgos; Kaimaris, Dimitris; Tsioukas, Vassileios

    2017-09-01

    Within the field of forestry, forest road mapping and inventory play an important role in management activities related to the wood harvesting industry, sediment and water run-off modelling, biodiversity distribution and ecological connectivity, recreation activities, future planning of forest road networks, and wildfire protection and fire-fighting. Especially in countries of the Mediterranean Rim, knowledge at regional and national scales of the distribution and characteristics of the rural and forest road network is essential in order to ensure effective emergency management and a rapid response of the fire-fighting mechanism. Yet the absence of accurate and updated geodatabases, the drawbacks of traditional cartographic methods arising from the forest environment settings, and the cost and effort needed, as thousands of meters must be surveyed per site, trigger the need for new data sources and innovative mapping approaches. Monitoring the condition of unpaved forest roads with unmanned aerial vehicle technology is an attractive, objective option for substituting laborious surveys. Although photogrammetric processing of UAV imagery can achieve accuracies of 1-2 centimeters and dense point clouds, the process is commonly based on the establishment of control points. In the case of forest road networks, which are linear features, a great number of control points is needed. Our aim is to evaluate low-cost UAV orthoimages generated over forest areas with GCPs captured from existing national-scale aerial orthoimagery, satellite imagery available through a web mapping service (WMS), and field surveys using a Mobile Mapping System and a GNSS receiver. We also explored the direct georeferencing potential of the GNSS onboard the low-cost UAV. The results suggest that the GNSS approach proved to be the most accurate, while the positional accuracy derived using the WMS and the aerial orthoimagery datasets was deemed satisfactory for the
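    Horizontal positional accuracy of this kind is typically summarized as the planimetric RMSE over independent check points; a minimal sketch with made-up coordinates (not the study's data):

    ```python
    import math

    def horizontal_rmse(checked, reference):
        """RMSE of planimetric (E, N) errors between orthomosaic-derived
        coordinates and independent check-point coordinates."""
        n = len(checked)
        se = sum((xc - xr) ** 2 + (yc - yr) ** 2
                 for (xc, yc), (xr, yr) in zip(checked, reference))
        return math.sqrt(se / n)

    # Hypothetical check points in metres (UAV orthomosaic vs. GNSS survey).
    ortho = [(100.12, 200.05), (150.40, 210.30), (130.00, 190.20)]
    gnss  = [(100.00, 200.00), (150.50, 210.25), (130.05, 190.10)]
    print(round(horizontal_rmse(ortho, gnss), 3))  # → 0.118
    ```

    Each of the four GCP sources (national orthoimagery, WMS, mobile mapping, GNSS) can be compared by computing this statistic against the same set of surveyed check points.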

  11. Operational reliability management; Gestao da confiabilidade operacional

    Energy Technology Data Exchange (ETDEWEB)

    Bressan, Edemir [Refinaria Alberto Pasqualini (REFAP), Canoas, RS (Brazil). Setor de Tecnologia de Equipamentos

    2000-07-01

    The process plant reliability management of the PETROBRAS Alberto Pasqualini Refinery is described: strategies, maintenance organizational structure, management processes, predictive and preventive maintenance, condition monitoring techniques, and reliability metrics. The paper points out the need for a close working relationship between the production, maintenance and project engineering functions, with highly qualified and committed teams, in order to reach one of the highest mechanical availabilities among Latin American refineries. (author)

  12. Obesity and fast food in urban markets: a new approach using geo-referenced micro data.

    Science.gov (United States)

    Chen, Susan Elizabeth; Florax, Raymond J; Snyder, Samantha D

    2013-07-01

    This paper presents a new method of assessing the relationship between features of the built environment and obesity, particularly in urban areas. Our empirical application combines georeferenced data on the location of fast-food restaurants with data about personal health, behavioral, and neighborhood characteristics. We define a 'local food environment' for every individual utilizing buffers around a person's home address. Individual food landscapes are potentially endogenous because of spatial sorting of the population and food outlets, and the body mass index (BMI) values for individuals living close to each other are likely to be spatially correlated because of observed and unobserved individual and neighborhood effects. The potential biases associated with endogeneity and spatial correlation are handled using spatial econometric estimation techniques. Our application provides quantitative estimates of the effect of proximity to fast-food restaurants on obesity in an urban food market. We also present estimates of a policy simulation that focuses on reducing the density of fast-food restaurants in urban areas. In the simulations, we account for spatial heterogeneity in both the policy instruments and individual neighborhoods and find a small effect for the hypothesized relationships between individual BMI values and the density of fast-food restaurants. Copyright © 2012 John Wiley & Sons, Ltd.
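    The "local food environment" buffers described above reduce, in the simplest case, to counting outlets within a fixed distance of each geocoded home address; a minimal sketch with hypothetical coordinates and a hypothetical 800 m buffer (the paper's actual buffer definitions and data differ):

    ```python
    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two (lat, lon) points."""
        R = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    def outlets_in_buffer(home, outlets, radius_m=800.0):
        """Local food environment: outlets within a radius of the home address."""
        return sum(haversine_m(*home, *o) <= radius_m for o in outlets)

    home = (40.4406, -79.9959)                      # hypothetical geocoded address
    fast_food = [(40.4420, -79.9900),               # ~0.5 km away: inside buffer
                 (40.4500, -80.0100),               # ~1.6 km away: outside
                 (40.4410, -79.9970)]               # ~0.1 km away: inside
    print(outlets_in_buffer(home, fast_food))  # → 2
    ```

    The count (or its density per unit area) then enters the regression as the built-environment covariate; the paper's contribution is treating it as potentially endogenous rather than exogenous.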

  13. Probabilistic assessment of pressure vessel and piping reliability

    International Nuclear Information System (INIS)

    Sundararajan, C.

    1986-01-01

    The paper presents a critical review of the state-of-the-art in probabilistic assessment of pressure vessel and piping reliability. First the differences in assessing the reliability directly from historical failure data and indirectly by a probabilistic analysis of the failure phenomenon are discussed and the advantages and disadvantages are pointed out. The rest of the paper deals with the latter approach of reliability assessment. Methods of probabilistic reliability assessment are described and major projects where these methods are applied for pressure vessel and piping problems are discussed. An extensive list of references is provided at the end of the paper

  14. [Dancing with Pointe Shoes: Characteristics and Assessment Criteria for Pointe Readiness].

    Science.gov (United States)

    Wanke, Eileen M; Exner-Grave, Elisabeth

    2017-12-01

    Training with pointe shoes is an integral part of professional dance education and ambitious hobby dancing. Pointe shoes, developed more than a hundred years ago and almost unaltered since then, are highly specific and strike a balance between aesthetics, function, protection, and health care. Therefore, pointe readiness should be tested prior to any dance training or career training; medical specialists are often confronted with this issue. Pointe readiness tests must cover specific anatomical preconditions, dance technique-orientated general conditioning and coordination preconditions, and dance-technical prerequisites, in order to keep traumatic injuries and long-term damage to a minimum. In addition to a (training) history, medical counselling sessions have come to include various tests that enable a reliable decision for or against pointe work. This article suggests adequate testing procedures (STT TEST), taking account of professional as well as hobby dancing. © Georg Thieme Verlag KG Stuttgart · New York.

  15. Estimation of reliability of a interleaving PFC boost converter

    Directory of Open Access Journals (Sweden)

    Gulam Amer Sandepudi

    2010-01-01

    Full Text Available Reliability plays an important role in power supplies. In other electronic equipment, a certain failure mode, at least for a part of the total system, can often be tolerated without serious (critical) effects. For a power supply, however, no such condition can be accepted, since very high demands on its reliability must be met. At higher power levels, the continuous conduction mode (CCM) boost converter is the preferred topology for implementing a front end with PFC. As a result, significant efforts have been made to improve the performance of the boost converter. This paper is one such effort, aimed at improving the performance of the converter from the reliability point of view. In this paper, an interleaving boost power-factor-correction converter with a single switch is simulated in continuous conduction mode (CCM), discontinuous conduction mode (DCM) and critical conduction mode (CRM) under different output power ratings. Results for the converter are explored from the reliability point of view.

  16. Cyber security for greater service reliability

    Energy Technology Data Exchange (ETDEWEB)

    Vickery, P. [N-Dimension Solutions Inc., Richmond Hill, ON (Canada)

    2008-05-15

    Service reliability in the electricity transmission and distribution (T and D) industry is being challenged by increased equipment failures, harsher climatic conditions, and computer hackers who aim to disrupt services by gaining access to transmission and distribution resources. This article discussed methods of ensuring the cyber-security of T and D operators. Weak points in the T and D industry include remote terminal units; intelligent electronic devices; distributed control systems; programmable logic controllers; and various intelligent field devices. An increasing number of interconnection points exist between an operator's service control system and external systems. The North American Electric Reliability Council (NERC) standards specify that cyber security strategies should ensure that all cyber assets are protected, and that access points must be monitored to detect intrusion attempts. The introduction of new advanced metering initiatives must also be considered. Comprehensive monitoring systems should be available to support compliance with cyber security standards. It was concluded that senior management should commit to a periodic cyber security re-assessment program in order to keep up-to-date.

  17. Structural Optimization with Reliability Constraints

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1986-01-01

    During the last 25 years considerable progress has been made in the fields of structural optimization and structural reliability theory. In classical deterministic structural optimization all variables are assumed to be deterministic. Due to the unpredictability of loads and strengths of actual......]. In this paper we consider only structures which can be modelled as systems of elasto-plastic elements, e.g. frame and truss structures. In section 2 a method to evaluate the reliability of such structural systems is presented. Based on a probabilistic point of view a modern structural optimization problem...... is formulated in section 3. The formulation is a natural extension of the commonly used formulations in determinstic structural optimization. The mathematical form of the optimization problem is briefly discussed. In section 4 two new optimization procedures especially designed for the reliability...

  18. Research on reliability management systems for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Maki, Nobuo

    2000-01-01

    An investigation of reliability management systems for Nuclear Power Plants (NPPs) has been performed on national and international archived documents as well as on the current status of studies at the Idaho National Engineering and Environmental Laboratory (INEEL), US NPPs (McGuire, Seabrook), a French NPP (St. Laurent-des-Eaux), the Japan Atomic Energy Research Institute (JAERI), the Central Research Institute of Electric Power Industry (CRIEPI), and power plant manufacturers in Japan. As a result of the investigation, the following points were identified: (i) A reliability management system is composed of a maintenance management system that inclusively manages maintenance data, and an anomalies-information and reliability-data management system that extracts data from maintenance results stored in the maintenance management system and constructs a reliability database. (ii) The maintenance management system, which is widely used among NPPs in the US and Europe, is an indispensable system for increasing maintenance reliability. (iii) Maintenance management methods utilizing reliability data, such as Reliability Centered Maintenance, are applied to NPP maintenance in the US and Europe and contribute to cost saving; maintenance templates are effective in the application process. In addition, the following points were proposed for the design of the system: (i) A detailed database on the specifications of facilities and components is necessary for effective use of the system. (ii) A demand database is indispensable for the application of these methods. (iii) Full-time database managers are important for maintaining the quality of the reliability data. (author)

  19. Preparedness for the Rio 2016 Olympic Games: hospital treatment capacity in georeferenced areas

    Directory of Open Access Journals (Sweden)

    Carolina Figueiredo Freitas

    2016-01-01

Full Text Available Abstract: Recently, Brazil has hosted mass events with recognized international relevance. The 2014 FIFA World Cup was held in 12 Brazilian state capitals, and health sector preparedness drew on the history of other World Cups and Brazil's own experience with the 2013 FIFA Confederations Cup. The current article aims to analyze the treatment capacity of hospital facilities in georeferenced areas for sports events in the 2016 Olympic Games in the city of Rio de Janeiro, based on a model built drawing on references from the literature. Data sources were Brazilian health databases and the Rio 2016 website. Sports venues for the Olympic Games and surrounding hospitals in a 10 km radius were located by geoprocessing and designated a "health area" referring to the probable inflow of persons to be treated in case of hospital referral. Six different factors were used to calculate needs for surge capacity and one was used to calculate needs in case of disasters (20/1,000). Hospital treatment capacity is defined by the coincidence of beds and life support equipment, namely the number of cardiac monitors (electrocardiographs) and ventilators in each hospital unit. Maracanã, followed by the Olympic Stadium (Engenhão) and the Sambódromo, would have the highest single demand for hospitalizations (1,572, 1,200 and 600, respectively). Hospital treatment capacity proved capable of accommodating surges, but insufficient in cases of mass casualties. Since most treatments at mass events involve easy clinical management, it is expected that the current capacity will not have negative consequences for participants.
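The disaster-planning ratio quoted in the abstract (20 casualties per 1,000 attendees) can be illustrated with a quick calculation; the venue capacity below is an assumed figure, chosen only to reproduce the order of magnitude of the quoted demand, not a number from the study.

```python
# Disaster rule of thumb from the abstract: 20 casualties per 1,000 attendees.
def expected_hospitalizations(venue_capacity, rate_per_1000=20):
    return venue_capacity * rate_per_1000 // 1000

# An assumed capacity of ~78,600 for Maracanã reproduces the 1,572
# hospitalizations quoted as the stadium's single demand.
demand = expected_hospitalizations(78600)
```

The same arithmetic with an assumed capacity of 30,000 yields the 600 hospitalizations quoted for the Sambódromo.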

  20. WWER-1000 unit reliability problems from the point of view of the main supplier of technological equipment

    International Nuclear Information System (INIS)

    Bursa, V.; Holousova, M.; Turnik, S.

    1990-01-01

    At Skoda Works in Plzen, data from the period of construction of nuclear power plants are processed on an ICL DRS 300 computer. The database systems DBASE II and DATAFLEX are available for creating reliability information systems. The information system that is being developed for WWER-1000 units is tested at the WWER-440 units of the Dukovany and Mochovce nuclear power plants. Activities in the field of evaluation of structure reliability of WWER-1000 nuclear power plants are aimed at two major goals, viz., developing a methodology for testing the reliability of the whole unit and its subsystems, and performing reliability analysis and calculations of reliability indices of the secondary circuit of a WWER-1000 nuclear power plant. The reason for the latter concern is the fact that in 1984-1986, secondary circuits contributed 46% to failures of Czechoslovak WWER-440 nuclear power plants. According to existing analyses, the time fluctuations of reliability indices obey no rule that could be employed for inferring indices expected in steady-state operating conditions from indices established in the starting stage of operation. (Z.M.). 10 refs

  1. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that takes advantage of an efficient sampling method, Hermite polynomials and uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate response surface function without cross terms, to identify the highest order of each random variable and to determine the significant variables connected with point estimate method. The cross terms between two significant random variables are added to the response surface function to improve the approximation accuracy. Integrating the nested strategy, the improved HORSM is explored in solving RBDO problems. Additionally, a sampling based reliability sensitivity analysis method is employed to reduce the computational effort further when design variables are distributional parameters of input random variables. The proposed methodology is applied on two test problems to validate its accuracy and efficiency. The proposed methodology is more efficient than first order reliability method based RBDO and Monte Carlo simulation based RBDO, and enables the use of RBDO as a practical design tool.
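The mapping from Gauss-Hermite quadrature points to support points of a standard normal random variable, which underlies sampling schemes of this kind, can be sketched as follows; the performance function here is a made-up example, not one from the paper.

```python
import numpy as np

# Physicists' Gauss-Hermite nodes/weights for the weight exp(-t^2)
nodes, weights = np.polynomial.hermite.hermgauss(5)

# Change of variables for a standard normal U: u = sqrt(2)*t, w' = w/sqrt(pi)
u = np.sqrt(2.0) * nodes
w = weights / np.sqrt(np.pi)

# Hypothetical performance function evaluated at the support points
g = u**3 + 2.0 * u

# Quadrature estimates of moments of U and of E[g(U)]
second_moment = np.sum(w * u**2)   # exact value for N(0,1): 1
mean_g = np.sum(w * g)             # exact value: 0 (odd function)
```

A 5-point rule integrates polynomials up to degree 9 exactly, which is why both estimates match the exact moments here.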

  2. Recording Approach of Heritage Sites Based on Merging Point Clouds from High Resolution Photogrammetry and Terrestrial Laser Scanning

    Science.gov (United States)

    Grussenmeyer, P.; Alby, E.; Landes, T.; Koehl, M.; Guillemin, S.; Hullo, J. F.; Assali, P.; Smigiel, E.

    2012-07-01

    Different approaches and tools are required in Cultural Heritage Documentation to deal with the complexity of monuments and sites. The documentation process has strongly changed in the last few years, always driven by technology. Accurate documentation is closely relied to advances of technology (imaging sensors, high speed scanning, automation in recording and processing data) for the purposes of conservation works, management, appraisal, assessment of the structural condition, archiving, publication and research (Patias et al., 2008). We want to focus in this paper on the recording aspects of cultural heritage documentation, especially the generation of geometric and photorealistic 3D models for accurate reconstruction and visualization purposes. The selected approaches are based on the combination of photogrammetric dense matching and Terrestrial Laser Scanning (TLS) techniques. Both techniques have pros and cons and recent advances have changed the way of the recording approach. The choice of the best workflow relies on the site configuration, the performances of the sensors, and criteria as geometry, accuracy, resolution, georeferencing, texture, and of course processing time. TLS techniques (time of flight or phase shift systems) are widely used for recording large and complex objects and sites. Point cloud generation from images by dense stereo or multi-view matching can be used as an alternative or as a complementary method to TLS. Compared to TLS, the photogrammetric solution is a low cost one, as the acquisition system is limited to a high-performance digital camera and a few accessories only. Indeed, the stereo or multi-view matching process offers a cheap, flexible and accurate solution to get 3D point clouds. Moreover, the captured images might also be used for models texturing. Several software packages are available, whether web-based, open source or commercial. 
The main advantage of this photogrammetric or computer vision based technology is to get

  3. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
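The symmetric rank-one (SR1) update mentioned above has a compact closed form. A minimal sketch, with the usual denominator safeguard and a made-up quadratic test problem (not one of the paper's examples), is:

```python
import numpy as np

def sr1_update(B, s, y, eps=1e-8):
    """Symmetric rank-one update of a Hessian approximation B.

    s: step in the variables, y: corresponding change in gradient.
    Skips the update when the denominator is near zero (standard safeguard).
    """
    r = y - B @ s
    denom = r @ s
    if abs(denom) < eps * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom

# Quadratic test problem: grad f(x) = H x, so y = H s exactly
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])
B = np.eye(2)              # initial Hessian guess
s = np.array([1.0, -0.5])  # step
y = H @ s                  # gradient change
B = sr1_update(B, s, y)
# After one update, B satisfies the secant condition B @ s == y
```

Unlike BFGS, the SR1 update does not enforce positive definiteness, which is one reason it is attractive for approximating the (possibly indefinite) Hessians needed in second order reliability analysis.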

  4. SharePoint Server 2010 Enterprise Content Management

    CERN Document Server

    Kitta, Todd; Caplinger, Chris; Houberg, Russ

    2011-01-01

    SharePoint experts focus on SharePoint 2010 as a platform for Enterprise Content Management SharePoint allows all users in an organization to manage and share their content reliably and securely. If you're interested in building Web sites using the new capabilities of enterprise content management (ECM) in SharePoint 2010, then this book is for you. You'll discover how SharePoint 2010 spans rich document management, records management, business process management and web content management in a seamless way to manage and share content. The team of SharePoint experts discusses the ECM capabi

  5. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to integrate probabilistic and psychological approaches to human reliability. This is achieved first by adopting methods that adequately reflect the essential features of the process control activity, and secondly by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context support the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints of the activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis

  6. Test-Retest Reliability of the Short-Form Survivor Unmet Needs Survey.

    Science.gov (United States)

    Taylor, Karen; Bulsara, Max; Monterosso, Leanne

    2018-01-01

    Reliable and valid needs assessment measures are important assessment tools in cancer survivorship care. A new 30-item short-form version of the Survivor Unmet Needs Survey (SF-SUNS) was developed and validated with cancer survivors, including hematology cancer survivors; however, test-retest reliability has not been established. The objective of this study was to assess the test-retest reliability of the SF-SUNS with a cohort of lymphoma survivors ( n = 40). Test-retest reliability of the SF-SUNS was conducted at two time points: baseline (time 1) and 5 days later (time 2). Test-retest data were collected from lymphoma cancer survivors ( n = 40) in a large tertiary cancer center in Western Australia. Intraclass correlation analyses compared data at time 1 (baseline) and time 2 (5 days later). Cronbach's alpha analyses were performed to assess the internal consistency at both time points. The majority (23/30, 77%) of items achieved test-retest reliability scores 0.45-0.74 (fair to good). A high degree of overall internal consistency was demonstrated (time 1 = 0.92, time 2 = 0.95), with scores 0.65-0.94 across subscales for both time points. Mixed test-retest reliability of the SF-SUNS was established. Our results indicate the SF-SUNS is responsive to the changing needs of lymphoma cancer survivors. Routine use of cancer survivorship specific needs-based assessments is required in oncology care today. Nurses are well placed to administer these assessments and provide tailored information and resources. Further assessment of test-retest reliability in hematology and other cancer cohorts is warranted.
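Cronbach's alpha, the internal-consistency statistic reported above, is straightforward to compute from a respondents-by-items score matrix. The sketch below uses toy data, not the study's:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for items: an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1.0) * (1.0 - item_variances / total_variance)

# Toy data: three items answered consistently by five respondents
scores = np.array([[1, 2, 1],
                   [2, 2, 2],
                   [3, 4, 3],
                   [4, 4, 5],
                   [5, 5, 5]])
alpha = cronbach_alpha(scores)   # close to 1 for consistent answers
```

Perfectly correlated items give alpha exactly 1; values of 0.92 to 0.95, as reported for the SF-SUNS, indicate a high degree of internal consistency.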

  7. Thermal performance envelopes for MHTGRs - Reliability by design

    International Nuclear Information System (INIS)

    Etzel, K.T.; Howard, W.W.; Zgliczynski, J.

    1992-01-01

    Thermal performance envelopes are used to specify steady-state design requirements for the systems of the modular high-temperature gas-cooled reactor (MHTGR) to maximize plant performance reliability with optimized design. The thermal performance envelopes are constructed around the expected operating point to account for uncertainties in actual plant as-built parameters and plant operation. The components are then designed to perform successfully at all points within the envelope. As a result, plant reliability is maximized by accounting for component thermal performance variation in the design. The design is optimized by providing a means to determine required margins in a disciplined and visible fashion. This is accomplished by coordinating these requirements with the various system and component designers in the early stages of the design, applying the principles of total quality management. The design is challenged by the more complex requirements associated with a range of operating conditions, but in return, high probability of delivering reliable performance throughout the plant life is ensured

  8. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of magnitude for two...... different, but by and large equally justifiable probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence...... is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  9. Reliability analysis of safety systems of nuclear power plant and utility experience with reliability safeguarding of systems during specified normal operation

    International Nuclear Information System (INIS)

    Balfanz, H.P.

    1989-01-01

The paper gives an outline of the methods applied for reliability analysis of safety systems in nuclear power plants. The main tasks are to check the system design for detection of weak points, and to find possibilities of optimizing the strategies for inspection, inspection intervals, and maintenance periods. Reliability safeguarding measures include the determination and verification of the boundary conditions of the analysis with regard to the reliability parameters and maintenance parameters used in the analysis, and the analysis of data feedback reflecting the plant response during operation. (orig.) [de

  10. Evaluation of European blanket concepts for DEMO from availability and reliability point of view

    International Nuclear Information System (INIS)

    Nardi, C.

    1995-12-01

This technical report describes the ENEA activities on reliability and availability for the selection of two of the four European blanket concepts for the DEMO reactor. The activities on the BIT concept, the one proposed by ENEA, are emphasized. In spite of the lack of data on the behaviour of structures in an environment similar to that of a fusion reactor, it is shown that the available data are relevant to the BIT concept geometry. Moreover, it is shown that qualitative reliability evaluations, compared to quantitative ones, can lead to a better understanding of the typical problems of a structure to be used in a fusion reactor

  11. The cost of reliability

    International Nuclear Information System (INIS)

    Ilic, M.

    1998-01-01

In this article the restructuring process under way in the US power industry is revisited from the point of view of transmission system provision and reliability. While this cost used to be rolled into the average cost of electricity to all customers, it is not so obvious how this cost is managed in the new industry. A new MIT approach to transmission pricing is suggested here as a possible solution [it

  12. Effects of image enhancement on reliability of landmark identification in digital cephalometry

    Directory of Open Access Journals (Sweden)

    M Oshagh

    2013-01-01

Full Text Available Introduction: Although digital cephalometric radiography is gaining popularity in orthodontic practice, the most important source of error in its tracing is uncertainty in landmark identification. Therefore, efforts to improve accuracy in landmark identification were directed primarily toward improvement in image quality. One of the more useful techniques of this process involves digital image enhancement, which can increase the overall visual quality of the image, but this does not necessarily mean better identification of landmarks. The purpose of this study was to evaluate the effectiveness of digital image enhancements on reliability of landmark identification. Materials and Methods: Fifteen common landmarks, including 10 skeletal and 5 soft tissue landmarks, were selected on the cephalograms of 20 randomly selected patients, prepared in Natural Head Position (NHP). Two observers (orthodontists) identified landmarks on the 20 original photostimulable phosphor (PSP) digital cephalogram images and 20 enhanced digital images twice with an intervening time interval of at least 4 weeks. The x and y coordinates were further analyzed to evaluate the pattern of recording differences in horizontal and vertical directions. Reliability of landmark identification was analyzed by paired t test. Results: There was a significant difference between original and enhanced digital images in terms of reliability of points Ar and N in vertical and horizontal dimensions, and enhanced images were significantly more reliable than original images. Identification of the A point, Pogonion and Pronasal points in the vertical dimension of enhanced images was significantly more reliable than in original ones. Reliability of Menton point identification in the horizontal dimension was significantly higher in enhanced images than in original ones.
Conclusion: Direct digital image enhancement by altering brightness and contrast can increase reliability of some landmark identification and this may lead to more

  13. Reliability issues : a Canadian perspective

    International Nuclear Information System (INIS)

    Konow, H.

    2004-01-01

A Canadian perspective on power reliability issues was presented. Reliability depends on adequacy of supply and a framework for standards. The challenges facing the electric power industry include new demand, plant replacement and exports. It is expected that demand will be 670 TWh by 2020, with 205 TWh coming from new plants. Canada will require an investment of $150 billion to meet this demand, and the need is comparable in the United States. As trade grows, the challenge becomes a continental issue, and investment in the bi-national transmission grid will be essential. The 5 point plan of the Canadian Electricity Association (CEA) is to: (1) establish an investment climate to ensure future electricity supply, (2) move government and industry towards smart and effective regulation, (3) work to ensure a sustainable future for the next generation, (4) foster innovation and accelerate skills development, and (5) build on the strengths of an integrated North American system to maximize opportunity for Canadians. The CEA's 7 measures that enhance North American reliability were listed with emphasis on its support for a self-governing international organization for developing and enforcing mandatory reliability standards. CEA also supports the creation of a binational Electric Reliability Organization (ERO) to identify and solve reliability issues in the context of a bi-national grid. tabs., figs

  14. Metrological Reliability of Medical Devices

    Science.gov (United States)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurements results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with the analysis of their contributions to guarantee the innovative health technologies compliance with the main ethical pillars of Bioethics.

  15. Establishing meaningful cut points for online user ratings.

    Science.gov (United States)

    Hirschfeld, Gerrit; Thielsch, Meinald T

    2015-01-01

Subjective perceptions of websites can be reliably measured with questionnaires. But it is unclear how such scores should be interpreted in practice, e.g. is an aesthetics score of 4 points on a seven-point scale satisfactory? The current paper introduces a receiver-operating characteristic (ROC)-based methodology to establish meaningful cut points for the VisAWI (visual aesthetics of websites inventory) and its short form, the VisAWI-S. In two studies we use users' global ratings (UGRs) and website rankings as anchors. A total of 972 participants took part in the studies, which yielded similar results. First, one-item UGRs correlate highly with the VisAWI. Second, cut points on the VisAWI reliably differentiate between sites that are perceived as attractive versus unattractive. Third, these cut points are variable, but only within a certain range. Together the research presented here establishes a score of 4.5 on the VisAWI as a reasonable goal for website designers and highlights the utility of the ROC methodology to derive relevant scores for rating scales.
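A ROC-based cut point of the kind described can be found by maximizing Youden's J statistic (sensitivity + specificity − 1) over candidate thresholds. The sketch below uses made-up ratings and binary "attractive" anchors, not the study's data:

```python
import numpy as np

def youden_cut_point(scores, labels):
    """Return the score threshold maximizing Youden's J = sens + spec - 1.

    scores: questionnaire scores; labels: 1 = attractive, 0 = unattractive.
    """
    best_t, best_j = None, -1.0
    pos, neg = labels == 1, labels == 0
    for t in np.unique(scores):
        pred = scores >= t                         # classify by threshold
        sens = np.mean(pred[pos])                  # true positive rate
        spec = np.mean(~pred[neg])                 # true negative rate
        if sens + spec - 1.0 > best_j:
            best_j, best_t = sens + spec - 1.0, t
    return best_t, best_j

# Toy, perfectly separable ratings: the cut lands between the two groups
scores = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
labels = np.array([0, 0, 0, 1, 1, 1])
best_t, best_j = youden_cut_point(scores, labels)
```

With real data the groups overlap, J is below 1, and the resulting threshold plays the role of the 4.5 cut point reported above.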

  16. Mapping and prediction of schistosomiasis in Nigeria using compiled survey data and Bayesian geospatial modelling

    DEFF Research Database (Denmark)

    Ekpo, Uwem F.; Hürlimann, Eveline; Schur, Nadine

    2013-01-01

Schistosomiasis prevalence data for Nigeria were extracted from peer-reviewed journals and reports, geo-referenced and collated in a nationwide geographical information system database for the generation of point prevalence maps. This exercise revealed that the disease is endemic in 35 of the cou...

  17. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
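For contrast with the paper's single-loop strategy, the conventional double-loop arrangement it is designed to avoid can be sketched directly: an outer loop samples uncertain distribution parameters (epistemic), an inner loop samples the random variables themselves (aleatory). All numbers and the limit state below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Limit state g = R - S; failure when g < 0 (hypothetical example).
R = 8.0                                     # deterministic resistance
pf_samples = []
for _ in range(200):                        # outer, epistemic loop
    mu_S = rng.normal(5.0, 0.2)             # uncertain distribution parameter
    S = rng.normal(mu_S, 1.0, 2000)         # inner, aleatory samples of the load
    pf_samples.append(np.mean(R - S < 0.0))

pf_mean = float(np.mean(pf_samples))               # pooled failure probability
pf_band = np.percentile(pf_samples, [2.5, 97.5])   # epistemic spread of pf
```

The output is not a single failure probability but a distribution of them; collapsing the two loops into one, as the auxiliary variable approach does, removes the nested cost of this scheme.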

  18. Development of reliability-based safety enhancement technology

    International Nuclear Information System (INIS)

    Kim, Kil Yoo; Han, Sang Hoon; Jang, Seung Cherl

    2002-04-01

This project aims to develop critical technologies and the necessary reliability DB for maximizing the economics of NPP operation while keeping safety, using the information of the risk (or reliability). First, four critical technologies (Risk Informed Tech. Spec. Optimization, Risk Informed Inservice Testing, On-line Maintenance, Maintenance Rule) for RIR and A have been developed. Secondly, KIND (Korea Information System for Nuclear Reliability Data) has been developed. Using KIND, YGN 3,4 and UCN 3,4 component reliability DBs have been established. A reactor trip history DB for all NPPs in Korea has also been developed and analyzed. Finally, a detailed reliability analysis of RPS/ESFAS for KNSP has been performed. With the result of the analysis, a sensitivity analysis has also been performed to optimize the AOT/STI of the tech. spec. A statistical analysis procedure and computer code have been developed for the set point drift analysis

  19. Reliability analysis of offshore structures using OMA based fatigue stresses

    DEFF Research Database (Denmark)

    Silva Nabuco, Bruna; Aissani, Amina; Glindtvad Tarpø, Marius

    2017-01-01

    focus is on the uncertainty observed on the different stresses used to predict the damage. This uncertainty can be reduced by Modal Based Fatigue Monitoring which is a technique based on continuously measuring of the accelerations in few points of the structure with the use of accelerometers known...... points of the structure, the stress history can be calculated in any arbitrary point of the structure. The accuracy of the estimated actual stress is analyzed by experimental tests on a scale model where the obtained stresses are compared to strain gauges measurements. After evaluating the fatigue...... stresses directly from the operational response of the structure, a reliability analysis is performed in order to estimate the reliability of using Modal Based Fatigue Monitoring for long term fatigue studies....

  20. Reliability Analysis of Safety Grade PLC(POSAFE-Q) for Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lyou, J.; Lee, D. Y.; Choi, J. G.; Park, W. M.

    2006-01-01

The Parts Count Method of the military standard MIL-HDBK-217F has been used for reliability prediction in the nuclear field. This handbook determines the Programmable Logic Controller (PLC) failure rate by summing the failure rates of the individual components included in the PLC. Normally it is easily predictable that the components added for fault detection improve the reliability of the PLC. But the application of this handbook predicts poorer reliability because of the increased component count for fault detection. To compensate for this discrepancy, a quantitative reliability analysis method using a functional separation model is suggested in this paper. It is applied to the Reactor Protection System (RPS) being developed in Korea to identify any design weak points from a safety point of view
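The parts count arithmetic that produces this paradox is just a sum of component failure rates: every part added for fault detection raises the predicted total failure rate, even though it improves the system's actual reliability. A sketch with illustrative failure rates (not values from the handbook):

```python
# Parts count prediction: lambda_total = sum of n_i * lambda_i.
# Quantities and failure rates below are illustrative, not MIL-HDBK-217F data.
parts = {
    "cpu_module":   (1, 2.0e-6),   # (quantity, failures per hour)
    "io_module":    (4, 1.5e-6),
    "watchdog":     (1, 0.5e-6),   # added for fault detection: raises the sum
    "power_supply": (2, 3.0e-6),
}
lambda_total = sum(n * lam for n, lam in parts.values())  # failures per hour
mtbf_hours = 1.0 / lambda_total                           # predicted MTBF
```

Because the method has no notion of what a component does, a watchdog that detects faults counts only as one more thing that can fail, which is the discrepancy the paper's functional separation model addresses.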

  1. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. NDE methods are required to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe life analysis of fracture critical parts. The paper provides discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures and uses the binomial distribution for probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
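The binomial arithmetic behind a 29-flaw point estimate demonstration can be sketched directly. The 29-detections-out-of-29 plan is the classic case, and the numbers below follow from the binomial model alone (the function and its parameters are illustrative, not NASA's procedure):

```python
from math import comb

def pass_probability(pod, n=29, max_misses=0):
    """Probability that a flaw set with true detection probability `pod`
    passes a binomial demonstration allowing at most `max_misses` misses."""
    q = 1.0 - pod
    return sum(comb(n, k) * q**k * pod**(n - k) for k in range(max_misses + 1))

# Under the 29-of-29 plan, a flaw size whose true POD is exactly 0.90
# passes less than 5% of the time -- the basis of a 90/95 POD claim.
p_pass_at_90 = pass_probability(0.90)
```

Optimizing the demonstration amounts to trading off this curve's two tails: a high pass probability for procedures with genuinely high POD against a low pass probability (few false qualifications) when the true POD is only at the target.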

  2. Strategic bidding of generating units in competitive electricity market with considering their reliability

    International Nuclear Information System (INIS)

    Soleymani, S.; Ranjbar, A.M.; Shirani, A.R.

    2008-01-01

In restructured power systems, generating units are typically scheduled based on offers and bids to buy and sell energy and ancillary services (AS), subject to operational and security constraints. Generally, no account is taken of unit reliability in the scheduling, so generating units have no incentive to improve their reliability. This paper proposes a new method to obtain the equilibrium points for the reliability and price bidding strategies of units when unit reliability is considered in the scheduling problem. The proposed methodology employs the supply function equilibrium (SFE) for modeling a unit's bidding strategy. Units change their bidding strategies and improve their reliability until Nash equilibrium points are obtained. The GAMS (general algebraic modeling system) language has been used to solve the market scheduling problem, using the DICOPT solver for mixed integer non-linear programming. (author)

  3. Ultra-Reliable Communication in a Factory Environment for 5G Wireless Networks

    DEFF Research Database (Denmark)

    Singh, Bikramjit; Lee, Zexian; Tirkkonen, Olav

    2016-01-01

The focus of this paper is on mission-critical communications in a 5G cellular communication system. Technologies to provide ultra-reliable communication, with 99.999% availability, in a factory environment are studied. We have analysed the feasibility requirements for ultra-reliable communication...... are compared. Last, the importance of multi-hop communication and multi-point coordination schemes is highlighted to improve reliable communication in the presence of interference and clutter. Keywords—5G; mission-critical communications; ultra-reliable communication; availability; reliability...

  4. A framework for reliability and risk centered maintenance

    International Nuclear Information System (INIS)

    Selvik, J.T.; Aven, T.

    2011-01-01

    Reliability centered maintenance (RCM) is a well-established analysis method for preventive maintenance planning. As its name indicates, reliability is the main point of reference for the planning, but consequences of failures are also assessed. However, uncertainties and risk are to a limited extent addressed by the RCM method, and in this paper we suggest an extension of the RCM to reliability and risk centered maintenance (RRCM) by also considering risk as the reference for the analysis in addition to reliability. A broad perspective on risk is adopted where uncertainties are the main component of risk in addition to possible events and associated consequences. A case from the offshore oil and gas industry is presented to illustrate and discuss the suggested approach.

  5. The impact of scheduling on service reliability : Trip-time determination and holding points in long-headway services

    NARCIS (Netherlands)

    Van Oort, N.; Boterman, J.W.; Van Nes, R.

    2012-01-01

    This paper presents research on optimizing the service reliability of long-headway services in urban public transport. Setting the driving time, and thus the departure time at stops, is an important decision when optimizing reliability in urban public transport. The choice of the percentile out of

  6. Development of a georeferenced data bank of radionuclides in typical food of Latin America - SIGLARA; Desenvolvimento de um banco de dados georeferenciado de radionuclideos em alimentos tipicos na America Latina - SIGLARA

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Lucia Maria Evangelista do

    2014-07-01

    The management of information related to environmental assessment activities aims to provide the world community with better access to meaningful environmental information and to help use this information in decision making in cases of contamination due to accidents or deliberate actions. In recent years, geotechnologies have become fundamental to environmental research and monitoring, since they make it possible to obtain large amounts of natural-resource data efficiently. The result of this work was the development of a database system to store georeferenced values of radionuclides in typical foods of Latin America (SIGLARA), defined in three languages (Spanish, Portuguese and English), using free software. The developed system meets a primary need of the RLA 09/72 ARCAL Project, funded by the International Atomic Energy Agency (IAEA), which has eleven participating countries in Latin America. The georeferenced database created for the SIGLARA system was tested for applicability through the entry and manipulation of real analyzed data, which showed that the system is able to store and retrieve data and to display reports and maps of the registered food samples. The interfaces that connect the user with the database proved efficient, making the system easy to operate. Its application to environmental management is already showing results, and it is hoped that these results will encourage its widespread adoption by other countries, institutions, the scientific community and the general public. (author)

  7. Reliability of sonographic assessment of tendinopathy in tennis elbow.

    Science.gov (United States)

    Poltawski, Leon; Ali, Syed; Jayaram, Vijay; Watson, Tim

    2012-01-01

    To assess the reliability and compute the minimum detectable change using sonographic scales to quantify the extent of pathology and hyperaemia in the common extensor tendon in people with tennis elbow. The lateral elbows of 19 people with tennis elbow were assessed sonographically twice, 1-2 weeks apart. Greyscale and power Doppler images were recorded for subsequent rating of abnormalities. Tendon thickening, hypoechogenicity, fibrillar disruption and calcification were each rated on four-point scales, and scores were summed to provide an overall rating of structural abnormality; hyperaemia was scored on a five-point scale. Inter-rater reliability was established using the intraclass correlation coefficient (ICC) to compare scores assigned independently to the same set of images by a radiologist and a physiotherapist with training in musculoskeletal imaging. Test-retest reliability was assessed by comparing scores assigned by the physiotherapist to images recorded at the two sessions. The minimum detectable change (MDC) was calculated from the test-retest reliability data. ICC values for inter-rater reliability ranged from 0.35 (95% CI: 0.05, 0.60) for fibrillar disruption to 0.77 (0.55, 0.88) for overall greyscale score, and 0.89 (0.79, 0.95) for hyperaemia. Test-retest reliability ranged from 0.70 (0.48, 0.84) for tendon thickening to 0.82 (0.66, 0.90) for overall greyscale score and 0.86 (0.73, 0.93) for calcification. The MDC for the greyscale total score was 2.0/12 and for the hyperaemia score was 1.1/5. The sonographic scoring system used in this study may be used reliably to quantify tendon abnormalities and change over time. A relatively inexperienced imager can conduct the assessment and use the rating scales reliably.
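
The minimum detectable change reported above follows from test-retest reliability in a standard way (SEM = SD·√(1−ICC), MDC₉₅ = 1.96·√2·SEM); a minimal sketch, with illustrative numbers rather than the study's raw data:

```python
from math import sqrt

def minimum_detectable_change(sd, icc, z=1.96):
    """MDC at 95% confidence from test-retest reliability:
    SEM = SD * sqrt(1 - ICC); MDC = z * sqrt(2) * SEM."""
    sem = sd * sqrt(1.0 - icc)
    return z * sqrt(2.0) * sem

# Illustrative (assumed) values: a score SD of 2.4 points with a
# test-retest ICC of 0.82 gives an MDC of about 2.8 points.
print(round(minimum_detectable_change(2.4, 0.82), 1))  # → 2.8
```

A change smaller than the MDC cannot be distinguished from measurement noise, which is why the study reports MDC alongside the ICC values.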

  8. Reliable control using the primary and dual Youla parameterizations

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, J.

    2002-01-01

    Different aspects of modeling faults in dynamic systems are considered in connection with reliable control (RC). The fault models include models with additive faults, multiplicative faults and structural changes in the models due to faults in the systems. These descriptions are considered...... in connection with reliable control and feedback control with fault rejection. The main emphasis is on fault modeling. A number of fault diagnosis problems, reliable control problems, and feedback control with fault rejection problems are formulated/considered, again, mainly from a fault modeling point of view....... Reliability is introduced by means of the (primary) Youla parameterization of all stabilizing controllers, where an additional loop is closed around a diagnostic signal. In order to quantify the level of reliability, the dual Youla parameterization is introduced which can be used to analyze how large faults...

  9. Power system reliability memento; Memento de la surete du systeme electrique

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    The reliability memento of the French power system (national power transmission grid) is an educational document whose purpose is to point out the role of each actor in power system operating reliability. This memento was first published in 1999. Extensive changes have taken place since then. The new 2002 edition shows that system operating reliability is as important a subject as ever: 1 - foreword; 2 - system reliability: the basics; 3 - equipment measures taken in order to guarantee the reliability of the system; 4 - organisational and human measures taken to guarantee the reliability of the system; appendix 1 - system operation: basic concepts; appendix 2 - guiding principles governing the reliability of the power system; appendix 3 - international associations of transmission system operators; appendix 4 - description of major incidents.

  10. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The precision of reliability worth estimates can be greatly affected by the customer interruption cost model used. The choice of cost model can change system and load point reliability indices....... In this study, a cascade correlation neural network is adopted to further develop two cost models, comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...

  11. Use of reliability analysis for the safety evaluation of technical facilities

    International Nuclear Information System (INIS)

    Balfanz, H.P.; Eggert, H.; Lindauer, E.

    1975-01-01

    Using examples from nuclear technology, the following is discussed: how efficient the present practical measures are for increasing reliability, which weak points can be recognized and what appears to be the most promising direction to take for improvements. The following are individually dealt with: 1) determination of the relevant parameters for the safety of a plant; 2) definition and fixing of reliability requirements; 3) process to prove the fulfilment of requirements; 4) measures to guarantee the reliability; 5) data feed-back to check and improve the reliability. (HP/LH) [de

  12. LOFT pressurizer safety: relief valve reliability

    Energy Technology Data Exchange (ETDEWEB)

    Brown, E.S.

    1978-01-18

    The LOFT pressurizer self-actuating safety-relief valves are constructed to the present state-of-the-art and should have reliability equivalent to the valves in use on PWR plants in the U.S. There have been no NRC incident reports on valve failures to lift that would challenge the Technical Specification Safety Limit. Fourteen valves have been reported as lifting a few percentage points outside the +-1% Tech. Spec. surveillance tolerance (9 valves tested over and 5 valves tested under specification). There have been no incident reports on failures to reseat. The LOFT surveillance program for assuring reliability is equivalent to nuclear industry practice.

  13. LOFT pressurizer safety: relief valve reliability

    International Nuclear Information System (INIS)

    Brown, E.S.

    1978-01-01

    The LOFT pressurizer self-actuating safety-relief valves are constructed to the present state-of-the-art and should have reliability equivalent to the valves in use on PWR plants in the U.S. There have been no NRC incident reports on valve failures to lift that would challenge the Technical Specification Safety Limit. Fourteen valves have been reported as lifting a few percentage points outside the +-1% Tech. Spec. surveillance tolerance (9 valves tested over and 5 valves tested under specification). There have been no incident reports on failures to reseat. The LOFT surveillance program for assuring reliability is equivalent to nuclear industry practice

  14. Reliability evaluation of deregulated electric power systems for planning applications

    International Nuclear Information System (INIS)

    Ehsani, A.; Ranjbar, A.M.; Jafari, A.; Fotuhi-Firuzabad, M.

    2008-01-01

    In a deregulated electric power utility industry in which a competitive electricity market can influence system reliability, market risks cannot be ignored. This paper (1) proposes an analytical probabilistic model for reliability evaluation of competitive electricity markets and (2) develops a methodology for incorporating the market reliability problem into HLII reliability studies. A Markov state space diagram is employed to evaluate the market reliability. Since the market is a continuously operated system, the concept of absorbing states is applied to it in order to evaluate the reliability. The market states are identified by using market performance indices, and the transition rates are calculated from historical data. The key point of the proposed method is the concept that the reliability level of a restructured electric power system can be calculated using the availability of the composite power system (HLII) and the reliability of the electricity market. Two case studies are carried out on the Roy Billinton Test System (RBTS) to illustrate interesting features of the proposed methodology.
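
The Markov building block behind such availability calculations can be sketched in its simplest two-state (up/down) form; the failure and repair rates below are illustrative assumptions, not the paper's data:

```python
def steady_state_availability(failure_rate, repair_rate):
    """Steady-state probability of the 'up' state for a two-state
    continuous-time Markov model. Rates are in events per hour:
    A = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

# Illustrative rates: a component that fails 4 times per year and
# takes 24 h on average to restore.
lam = 4 / 8760.0   # failures per hour
mu = 1 / 24.0      # repairs per hour
print(round(steady_state_availability(lam, mu), 4))  # ≈ 0.989
```

Multi-state market models generalize this by solving the balance equations of a full transition-rate matrix, with transition rates estimated from historical market performance data as described above.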

  15. Cost- and reliability-oriented aggregation point association in long-term evolution and passive optical network hybrid access infrastructure for smart grid neighborhood area network

    Science.gov (United States)

    Cheng, Xiao; Feng, Lei; Zhou, Fanqin; Wei, Lei; Yu, Peng; Li, Wenjing

    2018-02-01

    With the rapid development of the smart grid, the data aggregation point (AP) in the neighborhood area network (NAN) is becoming increasingly important for forwarding information between the home area network and the wide area network. Due to a limited budget, no single access technology can meet the ongoing requirements on AP coverage. This paper first introduces a wired and wireless hybrid access network integrating long-term evolution (LTE) and passive optical network (PON) systems for the NAN, which allows a good trade-off among cost, flexibility, and reliability. Then, based on the already existing wireless LTE network, an AP association optimization model is proposed to make the PON serve as many APs as possible, considering both economic efficiency and network reliability. Moreover, given the features of the constraints and variables of this NP-hard problem, a hybrid intelligent optimization algorithm is proposed, combining genetic, ant colony, and dynamic greedy algorithms. By comparison with other published methods, simulation results verify the performance of the proposed method in improving AP coverage and the performance of the proposed algorithm in terms of convergence.
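
One ingredient of such a hybrid, the greedy step, can be sketched as follows. The benefit/cost model, data, and capacity budget are entirely hypothetical; the paper's actual algorithm additionally involves genetic and ant colony stages:

```python
def greedy_ap_association(aps, pon_budget):
    """Greedy sketch of AP association: serve APs by PON in order of
    fiber cost per unit of reliability benefit until the budget is
    exhausted; the remaining APs stay on the existing LTE network.
    `aps` is a list of (name, benefit, cost) tuples (hypothetical)."""
    chosen, used = [], 0.0
    for name, benefit, cost in sorted(aps, key=lambda a: a[2] / a[1]):
        if used + cost <= pon_budget:
            chosen.append(name)
            used += cost
    return chosen

aps = [("AP1", 5.0, 2.0), ("AP2", 1.0, 3.0), ("AP3", 4.0, 1.0)]
print(greedy_ap_association(aps, 3.0))  # → ['AP3', 'AP1']
```

A pure greedy pass like this gives a fast feasible solution; the genetic and ant colony stages then search around it, which is why the hybrid converges better than any single heuristic.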

  16. Reliability and Validity of Five Mental Health Scales in Older Persons.

    Science.gov (United States)

    Himmelfarb, Samuel; Murrell, Stanley A.

    1983-01-01

    Assessed five scales as mental health measures for older persons (N=318). The internal consistency reliabilities for the anxiety, depression, and well-being scales were moderately high to high, but the reliabilities for the affect balance scale suggest some caution. Cutting points for the well-being and depression scales are suggested. (Author/JAC)

  17. The fair value of operational reliability

    International Nuclear Information System (INIS)

    Patino-Echeverri, Dalia; Morel, Benoit

    2005-01-01

    Information about the uncertainties that surround the operation of the power system can be used to inform the debate over how much reliability should be pursued and how resources should be allocated to pursue it. In this paper we present a method to determine the value of having flexible generators that can react to load fluctuations. This value can be seen as the value of hedging against load uncertainty due to the volatility of demand and the possibility of congestion. Because having this flexibility can be related to a financial option, we use an extension of options theory, and in particular the risk-neutral valuation method, to find a risk-neutral quantification of its value. We illustrate our point by valuing the flexibility that leads to ''operational reliability'' in the PJM market. Our formula for that value is what we call ''the fair value'' of operational reliability. (Author)

  18. Quality assurance and reliability in the Japanese electronics industry

    Science.gov (United States)

    Pecht, Michael; Boulton, William R.

    1995-02-01

    Quality and reliability are two attributes required for all Japanese products, although the JTEC panel found these attributes to be secondary to customer cost requirements. While our Japanese hosts gave presentations on the challenges of technology, cost, and miniaturization, quality and reliability were infrequently the focus of our discussions. Quality and reliability were assumed to be sufficient to meet customer needs. Fujitsu's slogan, 'quality built-in, with cost and performance as prime consideration,' illustrates this point. Sony's definition of a next-generation product is 'one that is going to be half the size and half the price at the same performance of the existing one'. Quality and reliability are so integral to Japan's electronics industry that they need no new emphasis.

  19. Tactile acuity charts: a reliable measure of spatial acuity.

    Science.gov (United States)

    Bruns, Patrick; Camargo, Carlos J; Campanella, Humberto; Esteve, Jaume; Dinse, Hubert R; Röder, Brigitte

    2014-01-01

    For assessing tactile spatial resolution it has recently been recommended to use tactile acuity charts which follow the design principles of the Snellen letter charts for visual acuity and involve active touch. However, it is currently unknown whether acuity thresholds obtained with this newly developed psychophysical procedure are in accordance with established measures of tactile acuity that involve passive contact with fixed duration and control of contact force. Here we directly compared tactile acuity thresholds obtained with the acuity charts to traditional two-point and grating orientation thresholds in a group of young healthy adults. For this purpose, two types of charts, using either Braille-like dot patterns or embossed Landolt rings with different orientations, were adapted from previous studies. Measurements with the two types of charts were equivalent, but generally more reliable with the dot pattern chart. A comparison with the two-point and grating orientation task data showed that the test-retest reliability of the acuity chart measurements after one week was superior to that of the passive methods. Individual thresholds obtained with the acuity charts agreed reasonably with the grating orientation threshold, but less so with the two-point threshold that yielded relatively distinct acuity estimates compared to the other methods. This potentially considerable amount of mismatch between different measures of tactile acuity suggests that tactile spatial resolution is a complex entity that should ideally be measured with different methods in parallel. The simple test procedure and high reliability of the acuity charts makes them a promising complement and alternative to the traditional two-point and grating orientation thresholds.

  20. Exploiting Deep Matching and SAR Data for the Geo-Localization Accuracy Improvement of Optical Satellite Images

    Directory of Open Access Journals (Sweden)

    Nina Merkle

    2017-06-01

    Full Text Available Improving the geo-localization of optical satellite images is an important pre-processing step for many remote sensing tasks like monitoring by image time series or scene analysis after sudden events. These tasks require geo-referenced and precisely co-registered multi-sensor data. Images captured by the high resolution synthetic aperture radar (SAR) satellite TerraSAR-X exhibit an absolute geo-location accuracy within a few decimeters. These images therefore represent a reliable source for improving the geo-location accuracy of optical images, which is on the order of tens of meters. In this paper, a deep learning-based approach for the geo-localization accuracy improvement of optical satellite images through SAR reference data is investigated. Image registration between SAR and optical images requires few, but accurate and reliable, matching points. These are derived from a Siamese neural network. The network is trained using TerraSAR-X and PRISM image pairs covering greater urban areas spread over Europe, in order to learn the two-dimensional spatial shifts between optical and SAR image patches. Results confirm that accurate and reliable matching points can be generated with higher matching accuracy and precision with respect to state-of-the-art approaches.

  1. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  2. Contribution to high voltage matrix switches reliability

    International Nuclear Information System (INIS)

    Lausenaz, Yvan

    2000-01-01

    Nowadays, the requirements on power electronic equipment are demanding in terms of performance, quality and reliability. On the other hand, costs have to be reduced in order to satisfy market rules. To provide low cost, reliability and performance, many standard components are mass-produced. For the construction of specific products, two options must be considered: on the one hand, one can produce specific components, with delays, over-cost problems and possibly quality and reliability problems; on the other hand, one can use standard components in adapted topologies. The CEA of Pierrelatte has adopted the latter approach to power electronics design for the development of its high voltage pulsed power converters. The technique consists in using standard components and associating them in series and in parallel. The matrix constitutes a high voltage macro-switch in which the electrical parameters are distributed among the synchronized components. This study deals with the reliability of these structures. It brings out the high reliability of MOSFET matrix associations. Thanks to several homemade test facilities, we obtained extensive data concerning the components we use. Understanding the defect propagation mechanisms in matrix structures has allowed us to put forward the necessity of a robust drive system, adapted clamping voltage protection, and careful geometrical construction. All these reliability considerations in matrix associations have notably allowed the construction of a new matrix structure regrouping all the solutions ensuring reliability. Reliable and robust, this product has already reached the industrial stage. (author) [fr

  3. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models
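
As background to the comparison above: a second-moment theory works only from the means and standard deviations of the variables, not their full distributions. A minimal first-order second-moment (FOSM) sketch for a margin g = R − S with independent resistance R and load S, using illustrative numbers:

```python
from math import erf, sqrt

def beta_and_pf(mu_r, sd_r, mu_s, sd_s):
    """FOSM sketch: reliability index beta for the margin g = R - S,
    with independent R (resistance) and S (load), and the failure
    probability under a normal assumption, Pf = Phi(-beta)."""
    beta = (mu_r - mu_s) / sqrt(sd_r**2 + sd_s**2)
    pf = 0.5 * (1.0 - erf(beta / sqrt(2.0)))  # standard normal CDF at -beta
    return beta, pf

beta, pf = beta_and_pf(mu_r=100.0, sd_r=10.0, mu_s=60.0, sd_s=15.0)
print(round(beta, 2))  # → 2.22
```

Because only two moments enter, there is no slot in which to carry statistical (parameter estimation) uncertainty, which is the limitation the paper points out.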

  4. Composite reliability evaluation for transmission network planning

    Directory of Open Access Journals (Sweden)

    Jiashen Teh

    2018-01-01

    Full Text Available As the penetration of wind power into the power system increases, the ability to assess the reliability impact of such interaction becomes more important. Composite reliability evaluations involving wind energy provide ample opportunities for assessing the benefits of different wind farm connection points. A connection to a weak area of the transmission network will require network reinforcement for absorbing the additional wind energy. Traditionally, the reinforcements are performed by constructing new transmission corridors. However, a new state-of-the-art technology such as the dynamic thermal rating (DTR) system provides a new reinforcement strategy, and this requires a new reliability assessment method. This paper demonstrates a methodology for assessing the cost and the reliability of network reinforcement strategies that consider DTR systems when large-scale wind farms are connected to the existing power network. Sequential Monte Carlo simulations were performed, and all DTRs and wind speeds were simulated using the auto-regressive moving average (ARMA) model. Various reinforcement strategies were assessed in terms of their cost and reliability. Practical industrial standards are used as guidelines when assessing costs. Because of this, the proposed methodology is able to determine the optimal reinforcement strategies when both the cost and reliability requirements are considered.
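
The ARMA-driven wind speed input to a sequential Monte Carlo study can be sketched as follows; the ARMA(1,1) coefficients and wind statistics are illustrative assumptions, not values fitted in the paper:

```python
import random

def simulate_wind_arma(n, phi=0.9, theta=0.2, mean=8.0, sd=2.5, seed=42):
    """Sketch of an ARMA(1,1)-driven hourly wind speed series (m/s):
    y_t = phi*y_{t-1} + eps_t + theta*eps_{t-1}, then scaled by the
    site mean and standard deviation. Coefficients are illustrative."""
    rng = random.Random(seed)
    y, eps_prev, series = 0.0, 0.0, []
    for _ in range(n):
        eps = rng.gauss(0.0, 1.0)
        y = phi * y + eps + theta * eps_prev
        eps_prev = eps
        series.append(max(0.0, mean + sd * y))  # speeds cannot be negative
    return series

hourly = simulate_wind_arma(8760)  # one simulated year
print(len(hourly))  # → 8760
```

Each Monte Carlo replication draws a fresh series like this, converts speeds to wind farm output via the turbine power curve, and overlays it on the sampled component up/down states.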

  5. Automated Identification of Fiducial Points on 3D Torso Images

    Directory of Open Access Journals (Sweden)

    Manas M. Kawale

    2013-01-01

    Full Text Available Breast reconstruction is an important part of the breast cancer treatment process for many women. Recently, 2D and 3D images have been used by plastic surgeons for evaluating surgical outcomes. Distances between different fiducial points are frequently used as quantitative measures for characterizing breast morphology. Fiducial points can be directly marked on subjects for direct anthropometry, or can be manually marked on images. This paper introduces novel algorithms to automate the identification of fiducial points in 3D images. Automating the process will make measurements of breast morphology more reliable, reducing the inter- and intra-observer bias. Algorithms to identify three fiducial points, the nipples, sternal notch, and umbilicus, are described. The algorithms used for localization of these fiducial points are formulated using a combination of surface curvature and 2D color information. Comparison of the 3D coordinates of automatically detected fiducial points and those identified manually, and geodesic distances between the fiducial points are used to validate algorithm performance. The algorithms reliably identified the location of all three of the fiducial points. We dedicate this article to our late colleague and friend, Dr. Elisabeth K. Beahm. Elisabeth was both a talented plastic surgeon and physician-scientist; we deeply miss her insight and her fellowship.

  6. A point of application study to determine the accuracy, precision and reliability of a low-cost balance plate for center of pressure measurement.

    Science.gov (United States)

    Goble, Daniel J; Khan, Ehran; Baweja, Harsimran S; O'Connor, Shawn M

    2018-04-11

    Changes in postural sway measured via force plate center of pressure have been associated with many aspects of human motor ability. A previous study validated the accuracy and precision of a relatively new, low-cost and portable force plate called the Balance Tracking System (BTrackS). This work compared a laboratory-grade force plate versus BTrackS during human-like dynamic sway conditions generated by an inverted pendulum device. The present study sought to extend previous validation attempts for BTrackS using a more traditional point of application (POA) approach. Computer numerical control (CNC) guided application of ∼155 N of force was performed five times at each of 21 points on five different BTrackS Balance Plate (BBP) devices with a hex-nose plunger. Results showed excellent agreement (ICC > 0.999) between the POAs and the COP measured by the BBP devices, as well as high accuracy ( 0.999) providing evidence of almost perfect inter-device reliability. Taken together, these results provide an important, static corollary to the previously obtained dynamic COP results from inverted pendulum testing of the BBP. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Reliability of fitness tests using methods and time periods common in sport and occupational management.

    Science.gov (United States)

    Burnstein, Bryan D; Steele, Russell J; Shrier, Ian

    2011-01-01

    Fitness testing is used frequently in many areas of physical activity, but the reliability of these measurements under real-world, practical conditions is unknown. To evaluate the reliability of specific fitness tests using the methods and time periods used in the context of real-world sport and occupational management. Cohort study. Eighteen different Cirque du Soleil shows. Cirque du Soleil physical performers who completed 4 consecutive tests (6-month intervals) and were free of injury or illness at each session (n = 238 of 701 physical performers). Performers completed 6 fitness tests on each assessment date: dynamic balance, Harvard step test, handgrip, vertical jump, pull-ups, and 60-second jump test. We calculated the intraclass correlation coefficient (ICC) and limits of agreement between baseline and each time point, and the ICC over all 4 time points combined. Reliability was acceptable (ICC > 0.6) over an 18-month time period for all pairwise comparisons and all time points together for the handgrip, vertical jump, and pull-up assessments. The Harvard step test and 60-second jump test had poor reliability (ICC < 0.6) between baseline and other time points. When we excluded the baseline data and calculated the ICC for the 6-month, 12-month, and 18-month time points, both the Harvard step test and 60-second jump test demonstrated acceptable reliability. Dynamic balance was unreliable in all contexts. Limit-of-agreement analysis demonstrated considerable intraindividual variability for some tests and a learning effect by administrators on others. Five of the 6 tests in this battery had acceptable reliability over an 18-month time frame, but the values for certain individuals may vary considerably from time to time for some tests. Specific tests may require a learning period for administrators.
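
The limit-of-agreement analysis mentioned above is a standard Bland-Altman computation; a minimal sketch, using invented scores rather than the study's data:

```python
from math import sqrt

def limits_of_agreement(test, retest):
    """Bland-Altman sketch: mean difference between two sessions and
    the 95% limits of agreement, mean_d +/- 1.96 * SD of differences."""
    diffs = [a - b for a, b in zip(test, retest)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd_d = sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d, mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d

# Invented session scores for illustration:
t1 = [12, 15, 11, 14, 13, 16, 12, 15]
t2 = [11, 16, 11, 13, 14, 15, 13, 14]
print(tuple(round(v, 2) for v in limits_of_agreement(t1, t2)))  # → (0.12, -1.82, 2.07)
```

Wide limits relative to the score range signal the kind of intraindividual variability the study reports, even when the ICC looks acceptable.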

  8. Neglect Of Parameter Estimation Uncertainty Can Significantly Overestimate Structural Reliability

    Directory of Open Access Journals (Sweden)

    Rózsás Árpád

    2015-12-01

    Full Text Available Parameter estimation uncertainty is often neglected in reliability studies, i.e. point estimates of distribution parameters are used for representative fractiles, and in probabilistic models. A numerical example examines the effect of this uncertainty on structural reliability using Bayesian statistics. The study reveals that the neglect of parameter estimation uncertainty might lead to an order of magnitude underestimation of failure probability.
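
The paper's point can be illustrated with a toy model: the failure probability computed from a point estimate of a distribution parameter versus one that averages over the parameter's posterior. The model below (normal variable with known unit variance, flat prior on the mean) is an assumption for illustration only, not the paper's example:

```python
import random
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def pf_point_vs_bayes(mu=3.0, n_obs=10, n_mc=200_000, seed=1):
    """Failure probability P(X < 0) for X ~ N(mu, 1) when mu is only
    estimated from n_obs samples. The point estimate ignores the
    estimation uncertainty; the Bayesian value averages Phi(-mu) over
    the posterior mu ~ N(mu_hat, 1/n_obs) (flat prior, known variance)."""
    rng = random.Random(seed)
    mu_hat = mu  # for illustration, let the sample mean equal the true mean
    pf_point = phi(-mu_hat)
    draws = (rng.gauss(mu_hat, 1.0 / sqrt(n_obs)) for _ in range(n_mc))
    pf_bayes = sum(phi(-m) for m in draws) / n_mc
    return pf_point, pf_bayes

pf_point, pf_bayes = pf_point_vs_bayes()
print(pf_bayes > pf_point)  # → True: ignoring estimation uncertainty underestimates Pf
```

Because the posterior spreads probability into the tail, the averaged failure probability exceeds the point-estimate value; with small samples and rarer events the gap can reach the order of magnitude the study reports.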

  9. Assessing the Reliability of Student Evaluations of Teaching: Choosing the Right Coefficient

    Science.gov (United States)

    Morley, Donald

    2014-01-01

    Many of the studies used to support the claim that student evaluations of teaching are reliable measures of teaching effectiveness have frequently calculated inappropriate reliability coefficients. This paper points to three coefficients that would be appropriate depending on if student evaluations were used for formative or summative purposes.…

  10. Validation and Test-Retest Reliability of New Thermographic Technique Called Thermovision Technique of Dry Needling for Gluteus Minimus Trigger Points in Sciatica Subjects and TrPs-Negative Healthy Volunteers

    Science.gov (United States)

    Rychlik, Michał; Samborski, Włodzimierz

    2015-01-01

    The aim of this study was to assess the validity and test-retest reliability of the Thermovision Technique of Dry Needling (TTDN) for the gluteus minimus muscle. TTDN is a new thermography approach used to support trigger point (TrPs) diagnostic criteria by the presence of short-term vasomotor reactions occurring in the area where TrPs refer pain. Method. Thirty chronic sciatica patients (n=15 TrPs-positive and n=15 TrPs-negative) and 15 healthy volunteers were evaluated by TTDN three times during two consecutive days based on TrPs of the gluteus minimus muscle confirmed additionally by referred pain presence. TTDN employs average temperature (T avr), maximum temperature (T max), low/high isothermal-area, and the autonomic referred pain phenomenon (AURP), which reflects vasodilatation/vasoconstriction. Validity and test-retest reliability were assessed concurrently. Results. Two components of TTDN validity and reliability, T avr and AURP, had almost perfect agreement according to κ (e.g., thigh: 0.880 and 0.938; calf: 0.902 and 0.956, resp.). The sensitivity for T avr, T max, AURP, and high isothermal-area was 100% in every case, but only T avr and AURP reached 100% specificity. Conclusion. TTDN is a valid and reliable method for T avr and AURP measurement to support TrPs diagnostic criteria for the gluteus minimus muscle when a digitally evoked referred pain pattern is present. PMID:26137486

  11. Strategy for continuous improvement in IC manufacturability, yield, and reliability

    Science.gov (United States)

    Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary

    1993-01-01

    Continual improvements in yield, reliability and manufacturability measure a fab and ultimately result in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability, failed bit map analysis, analytical tools, inline monitoring, cross functional teams and a defect engineering group. The strategy requires the fastest detection, identification and implementation of possible corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. Payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.

  12. A technical survey on issues of the quantitative evaluation of software reliability

    International Nuclear Information System (INIS)

    Park, J. K; Sung, T. Y.; Eom, H. S.; Jeong, H. S.; Park, J. H.; Kang, H. G.; Lee, K. Y.; Park, J. K.

    2000-04-01

    To develop a methodology for evaluating the reliability of the software included in digital instrumentation and control (I and C) systems, many of the methodologies/techniques that have been proposed in the software reliability engineering field are analyzed to identify their strong and weak points. According to the analysis results, no existing methodology/technique can be directly applied to evaluate software reliability. Thus additional research combining the most appropriate of the existing methodologies/techniques would be needed to evaluate the software reliability. (author)

  13. The fair value of operational reliability

    Energy Technology Data Exchange (ETDEWEB)

    Patino-Echeverri, Dalia; Morel, Benoit

    2005-12-15

    Information about the uncertainties that surround the operation of the power system can be used to enlighten the debate of how much reliability should be pursued and how resources should be allocated to pursue it. In this paper we present a method to determine the value of having flexible generators to react to load fluctuations. This value can be seen as the value of hedging against the uncertainty on the load due to the volatility of the demand and the possibility of congestion. Because having this flexibility can be related to a financial option, we use an extension of options theory, and in particular the risk-neutral valuation method, to find a risk-neutral quantification of its value. We illustrate our point by valuing the flexibility that leads to "operational reliability" in the PJM market. Our formula for that value is what we call "the fair value" of operational reliability. (Author)

  14. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins by asking what reliability is, covering the origin of reliability problems and the definition and use of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions in reliability, estimation of MTBF, processes of probability distributions, down time, maintainability and availability, breakdown maintenance and preventive maintenance, reliability design, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.

  15. Reliable and valid assessment of performance in thoracoscopy

    DEFF Research Database (Denmark)

    Konge, Lars; Lehnert, Per; Hansen, Henrik Jessen

    2012-01-01

    BACKGROUND: As we move toward competency-based education in medicine, we have lagged in developing competency-based evaluation methods. In the era of minimally invasive surgery, there is a need for a reliable and valid tool dedicated to measure competence in video-assisted thoracoscopic surgery. The purpose of this study is to create such an assessment tool, and to explore its reliability and validity. METHODS: An expert group of physicians created an assessment tool consisting of 10 items rated on a five-point rating scale. The following factors were included: economy and confidence of movement...

  16. Assessing the consistency of UAV-derived point clouds and images acquired at different altitudes

    Science.gov (United States)

    Ozcan, O.

    2016-12-01

    Unmanned Aerial Vehicles (UAVs) offer several advantages in terms of cost and image resolution compared to terrestrial photogrammetry and satellite remote sensing systems. UAVs bridge the gap between satellite-scale and field-scale applications and are now used in many areas to acquire hyperspatial, high-temporal-resolution imagery, since they can cover a site in a short span of time compared with conventional photogrammetry methods. UAVs have been used in various fields, such as the creation of 3-D earth models, production of high-resolution orthophotos, network planning, and monitoring of fields and agricultural lands. Thus, the geometric accuracy of orthophotos and the volumetric accuracy of point clouds are of capital importance for land surveying applications. Correspondingly, Structure from Motion (SfM) photogrammetry, which is frequently used in conjunction with UAVs, recently appeared in the environmental sciences as an impressive tool allowing the creation of 3-D models from unstructured imagery. In this study, we aimed to determine the spatial accuracy of images acquired from an integrated digital camera and the volumetric accuracy of Digital Surface Models (DSMs) derived from UAV flight plans at different altitudes using the SfM methodology. Low-altitude multispectral overlapping aerial photography was collected at altitudes of 30 to 100 meters and georeferenced with RTK-GPS ground control points. These altitudes allow hyperspatial imagery with resolutions of 1-5 cm depending upon the sensor being used. Preliminary results revealed that vertical comparison of the UAV-derived point clouds with the GPS measurements gave average distances at the cm level. Larger values are found in areas where instantaneous changes in the surface are present.

  17. Tactile acuity charts: a reliable measure of spatial acuity.

    Directory of Open Access Journals (Sweden)

    Patrick Bruns

    Full Text Available For assessing tactile spatial resolution it has recently been recommended to use tactile acuity charts which follow the design principles of the Snellen letter charts for visual acuity and involve active touch. However, it is currently unknown whether acuity thresholds obtained with this newly developed psychophysical procedure are in accordance with established measures of tactile acuity that involve passive contact with fixed duration and control of contact force. Here we directly compared tactile acuity thresholds obtained with the acuity charts to traditional two-point and grating orientation thresholds in a group of young healthy adults. For this purpose, two types of charts, using either Braille-like dot patterns or embossed Landolt rings with different orientations, were adapted from previous studies. Measurements with the two types of charts were equivalent, but generally more reliable with the dot pattern chart. A comparison with the two-point and grating orientation task data showed that the test-retest reliability of the acuity chart measurements after one week was superior to that of the passive methods. Individual thresholds obtained with the acuity charts agreed reasonably with the grating orientation threshold, but less so with the two-point threshold that yielded relatively distinct acuity estimates compared to the other methods. This potentially considerable amount of mismatch between different measures of tactile acuity suggests that tactile spatial resolution is a complex entity that should ideally be measured with different methods in parallel. The simple test procedure and high reliability of the acuity charts makes them a promising complement and alternative to the traditional two-point and grating orientation thresholds.

  18. Reliable clinical serum analysis with reusable electrochemical sensor: Toward point-of-care measurement of the antipsychotic medication clozapine.

    Science.gov (United States)

    Kang, Mijeong; Kim, Eunkyoung; Winkler, Thomas E; Banis, George; Liu, Yi; Kitchen, Christopher A; Kelly, Deanna L; Ghodssi, Reza; Payne, Gregory F

    2017-09-15

    Clozapine is one of the most promising medications for managing schizophrenia but it is under-utilized because of the challenges of maintaining serum levels in a safe therapeutic range (1-3μM). Timely measurement of serum clozapine levels has been identified as a barrier to the broader use of clozapine, which is however challenging due to the complexity of serum samples. We demonstrate a robust and reusable electrochemical sensor with graphene-chitosan composite for rapidly measuring serum levels of clozapine. Our electrochemical measurements in clinical serum from clozapine-treated and clozapine-untreated schizophrenia groups are well correlated to centralized laboratory analysis for the readily detected uric acid and for the clozapine which is present at 100-fold lower concentration. The benefits of our electrochemical measurement approach for serum clozapine monitoring are: (i) rapid measurement (≈20min) without serum pretreatment; (ii) appropriate selectivity and sensitivity (limit of detection 0.7μM); (iii) reusability of an electrode over several weeks; and (iv) rapid reliability testing to detect common error-causing problems. This simple and rapid electrochemical approach for serum clozapine measurements should provide clinicians with the timely point-of-care information required to adjust dosages and personalize the management of schizophrenia. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. The factorial reliability of the Middlesex Hospital Questionnaire in normal subjects.

    Science.gov (United States)

    Bagley, C

    1980-03-01

    The internal reliability of the Middlesex Hospital Questionnaire and its component subscales has been checked by means of principal components analyses of data on 256 normal subjects. The subscales (with the possible exception of Hysteria) were found to contribute to the general underlying factor of psychoneurosis. In general, the principal components analysis points to the reliability of the subscales, despite some item overlap.

  20. Wolf Point Substation, Roosevelt County, Montana

    International Nuclear Information System (INIS)

    1991-05-01

    The Western Area Power Administration (Western), an agency of the United States Department of Energy, is proposing to construct the 115-kV Wolf Point Substation near Wolf Point in Roosevelt County, Montana (Figure 1). As part of the construction project, Western's existing Wolf Point Substation would be taken out of service. The existing 115-kV Wolf Point Substation is located approximately 3 miles west of Wolf Point, Montana (Figure 2). The substation was constructed in 1949. The existing Wolf Point Substation serves as a "switching station" for the 115-kV transmission in the region. The need for substation improvements is based on operational and reliability issues. For this environmental assessment (EA), the environmental review of the proposed project took into account the removal of the old Wolf Point Substation, rerouting of the five Western lines and four lines from the Cooperatives and Montana-Dakota Utilities Company, and the new road into the proposed substation. Reference to the new proposed Wolf Point Substation in the EA includes these facilities as well as the old substation site. The environmental review looked at the impacts to all resource areas in the Wolf Point area. 7 refs., 6 figs

  1. Fort Peck-Wolf Point transmission line project, Montana

    International Nuclear Information System (INIS)

    1992-01-01

    The primary objective of the project is to replace the existing 36-mile Fort Peck-Wolf Point transmission line which has reached the end of its useful service life. Presently, the overall condition of this existing section of the 47-year-old line is poor. Frequent repairs have been required because of the absence of overhead ground wires. The continued maintenance of the line will become more expensive and customer interruptions will persist because of the damage due to lightning. The expense of replacing shell rotted poles, and the concern for the safety of the maintenance personnel because of hazards caused by severe shell rot are also of primary importance. The operational and maintenance problems coupled with power system simulation studies, demonstrate the need for improvements to the Wolf Point area to serve area loads. Western's Wolf Point Substation is an important point of interconnection for the power output from the Fort Peck Dam to area loads as far away as Williston, North Dakota. The proposed transmission line replacement would assure that there will continue to be reliable transmission capacity available to serve area electrical loads, as well as provide a reliable second high-voltage transmission path from the Fort Peck generation to back-up a loss of the Fort Peck-Wolf Point 115-kV Line No. 1

  2. Reliability technology and nuclear power

    International Nuclear Information System (INIS)

    Garrick, B.J.; Kaplan, S.

    1976-01-01

    This paper reviews some of the history and status of nuclear reliability and the evolution of this subject from art towards science. It shows that probability theory is the appropriate and essential mathematical language of this subject. The authors emphasize that it is more useful to view probability not as a 'frequency', i.e., not as the result of a statistical experiment, but rather as a measure of a state of confidence or a state of knowledge. They also show that the probabilistic, quantitative approach has a considerable history of application in the electric power industry in the area of power system planning. Finally, the authors show that the decision theory notion of utility provides a point of view from which risks, benefits, safety, and reliability can be viewed in a unified way, thus facilitating understanding, comparison, and communication. 29 refs

  3. Intra-rater and inter-rater reliability of a medical record abstraction study on transition of care after childhood cancer.

    Directory of Open Access Journals (Sweden)

    Micòl E Gianinazzi

    Full Text Available The abstraction of data from medical records is a widespread practice in epidemiological research. However, studies using this means of data collection rarely report reliability. Within the Transition after Childhood Cancer Study (TaCC), which is based on a medical record abstraction, we conducted a second independent abstraction of data with the aim to assess (a) the intra-rater reliability of one rater at two time points; (b) the possible learning effects between these two time points compared to a gold standard; and (c) inter-rater reliability. Within the TaCC study we conducted a systematic medical record abstraction in the 9 Swiss clinics with pediatric oncology wards. In a second phase we selected a subsample of medical records in 3 clinics to conduct a second independent abstraction. We then assessed intra-rater reliability at two time points, the learning effect over time (comparing each rater at two time points with a gold standard), and the inter-rater reliability of a selected number of variables. We calculated percentage agreement and Cohen's kappa. For the assessment of the intra-rater reliability we included 154 records (80 for rater 1; 74 for rater 2). For the inter-rater reliability we could include 70 records. Intra-rater reliability was substantial to excellent (Cohen's kappa 0.6-0.8) with an observed percentage agreement of 75%-95%. In all variables learning effects were observed. Inter-rater reliability was substantial to excellent (Cohen's kappa 0.70-0.83) with high agreement ranging from 86% to 100%. Our study showed that data abstracted from medical records are reliable. Investigating intra-rater and inter-rater reliability can give confidence to draw conclusions from the abstracted data and increase data quality by minimizing systematic errors.
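
    The two agreement statistics reported in this record are simple to compute. A minimal Python sketch (the rater labels below are made up for illustration, not TaCC data):

```python
def percent_agreement(a, b):
    """Fraction of items on which two raters gave the same label."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    po = percent_agreement(a, b)
    cats = set(a) | set(b)
    # expected chance agreement from each rater's marginal label frequencies
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

rater1 = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
rater2 = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes"]
print(percent_agreement(rater1, rater2))   # 0.75
print(cohens_kappa(rater1, rater2))        # ~0.467: only moderate once chance is removed
```

    The example shows why the study reports both numbers: raw agreement can look high while kappa, which discounts agreement expected by chance alone, is considerably lower.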

  4. Interval Mathematics Applied to Critical Point Transitions

    Directory of Open Access Journals (Sweden)

    Benito A. Stradi

    2012-03-01

    Full Text Available The determination of critical points of mixtures is important for both practical and theoretical reasons in the modeling of phase behavior, especially at high pressure. The equations that describe the behavior of complex mixtures near critical points are highly nonlinear and with multiplicity of solutions to the critical point equations. Interval arithmetic can be used to reliably locate all the critical points of a given mixture. The method also verifies the nonexistence of a critical point if a mixture of a given composition does not have one. This study uses an interval Newton/Generalized Bisection algorithm that provides a mathematical and computational guarantee that all mixture critical points are located. The technique is illustrated using several example problems. These problems involve cubic equation of state models; however, the technique is general purpose and can be applied in connection with other nonlinear problems.
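
    The guarantee described here comes from interval arithmetic: a box whose interval function value excludes zero provably contains no root and can be discarded. The sketch below is a one-dimensional toy version (plain bisection with a natural interval extension, no Newton contraction step, and a simple polynomial instead of an equation-of-state critical-point system), but it shows the exhaustive enclose-or-discard logic:

```python
class Interval:
    """Closed interval [lo, hi] with the arithmetic needed for polynomials."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)

    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)

    def __mul__(self, o):
        p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(p), max(p))

    def contains_zero(self):
        return self.lo <= 0.0 <= self.hi

def f(x):                       # works on Intervals via operator overloading
    return x * x * x - x        # f(x) = x^3 - x, roots at -1, 0, 1

def all_roots(lo, hi, tol=1e-7):
    """Return enclosures guaranteed to contain every root of f in [lo, hi]."""
    boxes, found = [(lo, hi)], []
    while boxes:
        a, b = boxes.pop()
        if not f(Interval(a, b)).contains_zero():
            continue            # interval extension excludes zero: no root here
        if b - a < tol:
            found.append((a, b))
        else:
            m = 0.5 * (a + b)
            boxes += [(a, m), (m, b)]
    found.sort()
    merged = [found[0]]         # merge adjacent enclosures of the same root
    for a, b in found[1:]:
        if a - merged[-1][1] < tol:
            merged[-1] = (merged[-1][0], b)
        else:
            merged.append((a, b))
    return merged

print(all_roots(-2.0, 2.0))     # three enclosures, near -1, 0 and 1
```

    The interval Newton/generalized bisection method of the paper adds a Newton contraction step to shrink surviving boxes much faster, and works on systems of equations rather than a single polynomial.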

  5. Human reliability analysis of dependent events

    International Nuclear Information System (INIS)

    Swain, A.D.; Guttmann, H.E.

    1977-01-01

    In the human reliability analysis in WASH-1400, the continuous variable of degree of interaction among human events was approximated by selecting four points on this continuum to represent the entire continuum. The four points selected were identified as zero coupling (i.e., zero dependence), complete coupling (i.e., complete dependence), and two intermediate points--loose coupling (a moderate level of dependence) and tight coupling (a high level of dependence). The paper expands the WASH-1400 treatment of common mode failure due to the interaction of human activities. Mathematical expressions for the above four levels of dependence are derived for parallel and series systems. The psychological meaning of each level of dependence is illustrated by examples, with probability tree diagrams to illustrate the use of conditional probabilities resulting from the interaction of human actions in nuclear power plant tasks
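
    The four coupling levels map naturally onto conditional error probabilities. The closed forms below are the ones later standardized in Swain and Guttmann's THERP handbook (NUREG/CR-1278) for moderate and high dependence; identifying "loose" and "tight" coupling with those two formulas is an illustrative assumption on our part, not the exact WASH-1400 treatment:

```python
def conditional_hep(p, coupling):
    """Conditional error probability for the second of two coupled human actions,
    given that the first action failed.  p is the basic (unconditional) error
    probability of the second action."""
    formulas = {
        "zero": lambda p: p,                   # independent actions
        "loose": lambda p: (1 + 6 * p) / 7,    # moderate dependence (THERP MD)
        "tight": lambda p: (1 + p) / 2,        # high dependence (THERP HD)
        "complete": lambda p: 1.0,             # second fails whenever first does
    }
    return formulas[coupling](p)

p = 0.01
for coupling in ("zero", "loose", "tight", "complete"):
    # For a series system, P(both actions fail) = p * conditional_hep(p, coupling)
    print(coupling, conditional_hep(p, coupling), p * conditional_hep(p, coupling))
```

    Note how strongly the joint failure probability depends on the assumed coupling: for p = 0.01 it ranges from 1e-4 (zero dependence) up to 1e-2 (complete dependence), which is why selecting the dependence level is central to the analysis.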

  6. Improved Reliability-Based Optimization with Support Vector Machines and Its Application in Aircraft Wing Design

    Directory of Open Access Journals (Sweden)

    Yu Wang

    2015-01-01

    Full Text Available A new reliability-based design optimization (RBDO method based on support vector machines (SVM and the Most Probable Point (MPP is proposed in this work. SVM is used to create a surrogate model of the limit-state function at the MPP with the gradient information in the reliability analysis. This guarantees that the surrogate model not only passes through the MPP but also is tangent to the limit-state function at the MPP. Then, importance sampling (IS is used to calculate the probability of failure based on the surrogate model. This treatment significantly improves the accuracy of reliability analysis. For RBDO, the Sequential Optimization and Reliability Assessment (SORA is employed as well, which decouples deterministic optimization from the reliability analysis. The improved SVM-based reliability analysis is used to amend the error from linear approximation for limit-state function in SORA. A mathematical example and a simplified aircraft wing design demonstrate that the improved SVM-based reliability analysis is more accurate than FORM and needs less training points than the Monte Carlo simulation and that the proposed optimization strategy is efficient.
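
    The importance-sampling step at the MPP can be shown in isolation. In the sketch below the true limit-state function stands in for the SVM surrogate, and the limit state is a hypothetical linear one in standard normal space, so the exact failure probability Φ(−β) is available for comparison:

```python
import math
import random

def g(u1, u2, beta=3.0):
    """Linear limit state in standard normal space; failure when g <= 0.
    The exact failure probability is Phi(-beta)."""
    return beta - (u1 + u2) / math.sqrt(2.0)

def importance_sampling_pf(n=20000, beta=3.0, seed=1):
    """Estimate P(g <= 0) by sampling around the Most Probable Point (MPP)."""
    rng = random.Random(seed)
    # MPP of this limit state: the failure-domain point closest to the origin
    mpp = (beta / math.sqrt(2.0), beta / math.sqrt(2.0))
    total = 0.0
    for _ in range(n):
        u1 = rng.gauss(mpp[0], 1.0)
        u2 = rng.gauss(mpp[1], 1.0)
        if g(u1, u2, beta) <= 0.0:
            # likelihood ratio: standard normal density / shifted sampling density
            w = math.exp(-0.5 * (u1 * u1 + u2 * u2)
                         + 0.5 * ((u1 - mpp[0]) ** 2 + (u2 - mpp[1]) ** 2))
            total += w
    return total / n

exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))   # Phi(-3), about 1.35e-3
print(importance_sampling_pf(), exact)
```

    Centering the sampling density at the MPP makes roughly half of the samples land in the failure domain, so a few thousand evaluations suffice where crude Monte Carlo would need millions for a probability this small; this is the efficiency argument the abstract makes against plain Monte Carlo simulation.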

  7. Assessment and Improving Methods of Reliability Indices in Bakhtar Regional Electricity Company

    Directory of Open Access Journals (Sweden)

    Saeed Shahrezaei

    2013-04-01

    Full Text Available The reliability of a system is its ability to perform its expected duties in the future, i.e., the probability of desirable operation in carrying out predetermined duties. Failure data for power system elements are the main input for reliability assessment of the network. The goal of reliability assessment is to determine characteristic parameters from system history data; these parameters help to identify weak points of the system. In other words, the goal of reliability assessment is to improve operation and to reduce failures and power outages. This paper assesses the reliability indices of Bakhtar Regional Electricity Company up to 1393 and the improving methods and their effects on the reliability indices in this network. DIgSILENT Power Factory software is employed for simulation. Simulation results show the positive effect of the improving methods on the reliability indices of Bakhtar Regional Electricity Company.

  8. Test-retest reliability of the KINARM end-point robot for assessment of sensory, motor and neurocognitive function in young adult athletes.

    Directory of Open Access Journals (Sweden)

    Cameron S Mang

    Full Text Available Current assessment tools for sport-related concussion are limited by a reliance on subjective interpretation and patient symptom reporting. Robotic assessments may provide more objective and precise measures of neurological function than traditional clinical tests. The objective was to determine the reliability of assessments of sensory, motor and cognitive function conducted with the KINARM end-point robotic device in young adult elite athletes. Sixty-four randomly selected healthy, young adult elite athletes participated. Twenty-five individuals (25 M; mean age ± SD, 20.2±2.1 years) participated in a within-season study, where three assessments were conducted within a single season (assessments labeled by session: S1, S2, S3). An additional 39 individuals (28 M; 22.8±6.0 years) participated in a year-to-year study, where annual pre-season assessments were conducted for three consecutive seasons (assessments labeled by year: Y1, Y2, Y3). Forty-four parameters from five robotic tasks (Visually Guided Reaching, Position Matching, Object Hit, Object Hit and Avoid, and Trail Making B), together with overall Task Scores describing performance on each task, were quantified. Test-retest reliability was determined by intra-class correlation coefficients (ICCs) between the first and second, and second and third assessments. In the within-season study, ICCs were ≥0.50 for 68% of parameters between S1 and S2, 80% of parameters between S2 and S3, and for three of the five Task Scores both between S1 and S2, and S2 and S3. In the year-to-year study, ICCs were ≥0.50 for 64% of parameters between Y1 and Y2, 82% of parameters between Y2 and Y3, and for four of the five Task Scores both between Y1 and Y2, and Y2 and Y3. Overall, the results suggest moderate-to-good test-retest reliability for the majority of parameters measured by the KINARM robot in healthy young adult elite athletes. Future work will consider the potential use of this information for clinical assessment of concussion.

  9. Reliability analysis of grid connected small wind turbine power electronics

    International Nuclear Information System (INIS)

    Arifujjaman, Md.; Iqbal, M.T.; Quaicoe, J.E.

    2009-01-01

    Grid connection of small permanent magnet generator (PMG) based wind turbines requires a power conditioning system comprising a bridge rectifier, a dc-dc converter and a grid-tie inverter. This work presents a reliability analysis and an identification of the least reliable component of the power conditioning system of such grid connection arrangements. Reliability of the configuration is analyzed for the worst case scenario of maximum conversion losses at a particular wind speed. The analysis reveals that the reliability of the power conditioning system of such PMG based wind turbines is fairly low and it reduces to 84% of initial value within one year. The investigation is further enhanced by identifying the least reliable component within the power conditioning system and found that the inverter has the dominant effect on the system reliability, while the dc-dc converter has the least significant effect. The reliability analysis demonstrates that a permanent magnet generator based wind energy conversion system is not the best option from the point of view of power conditioning system reliability. The analysis also reveals that new research is required to determine a robust power electronics configuration for small wind turbine conversion systems.
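
    The series-system calculation behind figures like these can be sketched in a few lines. Under a constant-failure-rate model each stage contributes exp(−λt), and since the rectifier, dc-dc converter and inverter form a chain, the system reliability is the product. The failure rates below are illustrative placeholders chosen to reproduce a reliability near 84% at one year, not the values from the study:

```python
import math

# Illustrative constant failure rates (failures per hour) for the three
# power-conditioning stages -- placeholder values, not the paper's data.
failure_rates = {
    "bridge rectifier":  0.6e-5,
    "dc-dc converter":   0.2e-5,   # least significant contributor
    "grid-tie inverter": 1.2e-5,   # dominant contributor
}

def stage_reliability(lam, t):
    """Exponential reliability model R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

def system_reliability(rates, t):
    """Series system: every stage must survive, so stage reliabilities multiply."""
    r = 1.0
    for lam in rates.values():
        r *= stage_reliability(lam, t)
    return r

t = 8760.0  # one year in hours
print(system_reliability(failure_rates, t))  # ~0.84 with these illustrative rates
```

    The structure also shows why the inverter dominates: in a product of exponentials the stage with the largest λ sets most of the system-level degradation.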

  10. Design methodologies for reliability of SSL LED boards

    NARCIS (Netherlands)

    Jakovenko, J.; Formánek, J.; Perpiñà, X.; Jorda, X.; Vellvehi, M.; Werkhoven, R.J.; Husák, M.; Kunen, J.M.G.; Bancken, P.; Bolt, P.J.; Gasse, A.

    2013-01-01

    This work presents a comparison of various LED board technologies from thermal, mechanical and reliability point of view provided by an accurate 3-D modelling. LED boards are proposed as a possible technology replacement of FR4 LED boards used in 400 lumen retrofit SSL lamps. Presented design

  11. Reliability Analysis for Adhesive Bonded Composite Stepped Lap Joints Loaded in Fatigue

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Sørensen, John Dalsgaard; Lund, Erik

    2012-01-01

    This paper describes a probabilistic approach to calculate the reliability of adhesive bonded composite stepped lap joints loaded in fatigue using three-dimensional finite element analysis (FEA). A method for progressive damage modelling is used to assess fatigue damage accumulation and residual... Asymptotic sampling is used to estimate the reliability with support points generated by randomized Sobol sequences. The predicted reliability level is compared with the implicitly required target reliability level defined by the wind turbine standard IEC 61400-1, where partial safety factors are introduced together with characteristic values. Finally, an approach for the assessment of the reliability of adhesive bonded composite stepped lap joints loaded in fatigue is presented. The introduced methodology can be applied in the same way to calculate the reliability level of wind turbine blade components...

  12. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy - Quality and Reliability Date

    Science.gov (United States)

    Orr, James K.; Peltier, Daryl

    2010-01-01

    This slide presentation reviews the avionics software system on board the space shuttle, with particular emphasis on quality and reliability. The Primary Avionics Software System (PASS) provides automatic and fly-by-wire control of critical shuttle systems and executes in redundant computers. Charts show the number of space shuttle flights versus time, PASS's development history, and other indicators of the reliability of the system's development. The reliability of the system is also compared to predicted reliability.

  13. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It covers the definition and importance of reliability, the development of reliability engineering, the failure rate and failure probability density function and their types, CFR and the exponential distribution, IFR and the normal and Weibull distributions, maintainability and availability, reliability testing and reliability estimation for the exponential, normal, and Weibull distribution types, reliability sampling tests, system reliability, reliability design, and functional failure analysis by FTA.

  14. Longitudinal Reliability of Self-Reported Age at Menarche in Adolescent Girls: Variability across Time and Setting

    Science.gov (United States)

    Dorn, Lorah D.; Sontag-Padilla, Lisa M.; Pabst, Stephanie; Tissot, Abbigail; Susman, Elizabeth J.

    2013-01-01

    Age at menarche is critical in research and clinical settings, yet there is a dearth of studies examining its reliability in adolescents. We examined age at menarche during adolescence, specifically, (a) average method reliability across 3 years, (b) test-retest reliability between time points and methods, (c) intraindividual variability of…

  15. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points should be generated to fill the design space, and this can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, the cokriging method, which is an extension of kriging, is proposed to calculate the structural reliability. The cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve the accuracy and efficiency of structural reliability analysis and is a viable alternative to the kriging method.
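
    The surrogate idea is easy to sketch: with a Gaussian covariance and no nugget, simple kriging reduces to radial-basis interpolation that passes exactly through the sample points. Cokriging would augment the same linear system with gradient observations (not shown). A minimal stdlib-only sketch on a toy stand-in for an expensive limit-state function:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def kernel(a, b, length=0.5):
    """Gaussian (squared-exponential) covariance between two sample locations."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def fit_kriging(xs, ys):
    """Weights of a simple zero-mean kriging / Gaussian RBF interpolator."""
    K = [[kernel(xi, xj) for xj in xs] for xi in xs]
    return solve(K, ys)

def predict(x, xs, w):
    return sum(wi * kernel(x, xi) for xi, wi in zip(xs, w))

# Toy stand-in for an "expensive" limit-state function: g(x) = sin(3x)
xs = [0.0, 0.4, 0.8, 1.2, 1.6, 2.0]
ys = [math.sin(3 * x) for x in xs]
w = fit_kriging(xs, ys)
print(predict(1.0, xs, w), math.sin(3.0))   # surrogate vs. true value
```

    Once fitted, the cheap surrogate is evaluated thousands of times inside the reliability loop in place of the expensive model, which is exactly the trade the abstract describes.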

  16. Reliability of Broadcast Communications Under Sparse Random Linear Network Coding

    OpenAIRE

    Brown, Suzie; Johnson, Oliver; Tassi, Andrea

    2018-01-01

    Ultra-reliable Point-to-Multipoint (PtM) communications are expected to become pivotal in networks offering future dependable services for smart cities. In this regard, sparse Random Linear Network Coding (RLNC) techniques have been widely employed to provide an efficient way to improve the reliability of broadcast and multicast data streams. This paper addresses the pressing concern of providing a tight approximation to the probability of a user recovering a data stream protected by this kin...

  17. Some reliability issues for incomplete two-dimensional warranty claims data

    International Nuclear Information System (INIS)

    Kumar Gupta, Sanjib; De, Soumen; Chatterjee, Aditya

    2017-01-01

    Bivariate reliability and vector bivariate hazard rate or hazard gradient functions are expected to play a role in meaningful assessment of field performance for items under two-dimensional warranty coverage. In this paper a simple usage-rate-based class of bivariate reliability functions is proposed, and various bivariate reliability characteristics are studied for warranty claims data. The utility of the study is explored with the help of realistic synthetic data. - Highlights: • Independence between age and usage rate is established. • Conditional reliability and hazard gradient along age and usage are determined. • The change point of the hazard gradients is estimated. • The concepts of layered renewal process and NHPP are introduced. • Expected numbers of renewals and failures at different age-usage cut-offs are obtained.

  18. Reliability of Two Smartphone Applications for Radiographic Measurements of Hallux Valgus Angles.

    Science.gov (United States)

    Mattos E Dinato, Mauro Cesar; Freitas, Marcio de Faria; Milano, Cristiano; Valloto, Elcio; Ninomiya, André Felipe; Pagnano, Rodrigo Gonçalves

    The objective of the present study was to assess the reliability of 2 smartphone applications compared with the traditional goniometer technique for measurement of radiographic angles in hallux valgus, and the time required for analysis with the different methods. The radiographs of 31 patients (52 feet) with a diagnosis of hallux valgus were analyzed. Four observers, 2 with >10 years' experience in foot and ankle surgery and 2 in-training surgeons, measured the hallux valgus angle and intermetatarsal angle using a manual goniometer technique and 2 smartphone applications (Hallux Angles and iPinPoint). The interobserver and intermethod reliability were estimated using intraclass correlation coefficients (ICCs), and the time required for measurement of the angles among the 3 methods was compared using the Friedman test. Very good or good interobserver reliability was found among the 4 observers measuring the hallux valgus angle and intermetatarsal angle using the goniometer (ICC 0.913 and 0.821, respectively) and iPinPoint (ICC 0.866 and 0.638, respectively). Using the Hallux Angles application, very good interobserver reliability was found for measurements of the hallux valgus angle (ICC 0.962) and intermetatarsal angle (ICC 0.935) only among the more experienced observers. The time required for measurement was significantly shorter with both smartphone applications than with the goniometer method. One smartphone application (iPinPoint) was reliable for measurements of the hallux valgus angles by either experienced or inexperienced observers. The use of these tools might save time in the evaluation of radiographic angles in hallux valgus. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
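
    Interobserver agreement in studies like this one is usually summarized with an intraclass correlation coefficient. As a generic illustration (not tied to this study's data), the sketch below computes ICC(2,1) - two-way random effects, absolute agreement, single rater - from a subjects × raters matrix; the sample ratings are invented.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.

    `ratings` is an (n subjects) x (k raters) array.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between-subjects
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between-raters
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented ratings: 5 radiographs (angles in degrees) scored by 2 observers.
obs = np.array([[14.0, 14.5],
                [22.0, 21.0],
                [30.5, 31.0],
                [18.0, 18.5],
                [40.0, 39.0]])
print(f"ICC(2,1) = {icc_2_1(obs):.3f}")
```

    Other ICC forms (e.g. consistency rather than absolute agreement, or average-rater versions) change only the denominator.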

  19. Estimation of some stochastic models used in reliability engineering

    International Nuclear Information System (INIS)

    Huovinen, T.

    1989-04-01

    This work studies the estimation of some stochastic models used in reliability engineering. In reliability engineering, continuous probability distributions have been used as models for the lifetimes of technical components. We consider here the following distributions: exponential, 2-mixture exponential, conditional exponential, Weibull, lognormal, and gamma. The maximum likelihood method is used to estimate the distributions from observed data, which may be either complete or censored. We consider models based on homogeneous Poisson processes, such as the gamma-Poisson and lognormal-Poisson models, for the analysis of failure intensity. We also study a beta-binomial model for the analysis of failure probability. The parameters of these three models are estimated by the method of matching moments and, in the case of the gamma-Poisson and beta-binomial models, also by maximum likelihood. A great deal of the mathematical and statistical problems that arise in reliability engineering can be solved by utilizing point processes. Here we consider the statistical analysis of non-homogeneous Poisson processes to describe the failure behaviour of a set of components with a Weibull intensity function. We use the method of maximum likelihood to estimate the parameters of the Weibull model. A common cause failure can seriously reduce the reliability of a system. We consider a binomial failure rate (BFR) model, as an application of marked point processes, for modelling common cause failures in a system. The parameters of the binomial failure rate model are also estimated with the maximum likelihood method.
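
    For complete (uncensored) samples, the two-parameter Weibull maximum-likelihood equations reduce to a one-dimensional root-finding problem in the shape parameter. The sketch below is a generic illustration of that standard estimator, not code from the report; the simulated data, bracketing interval, and tolerances are arbitrary, and censored data would require a modified likelihood.

```python
import numpy as np

def weibull_mle(x, k_lo=0.01, k_hi=50.0, iters=200):
    """Two-parameter Weibull MLE for a complete sample.

    Solves the profile likelihood equation for the shape k by bisection:
        sum(x^k ln x) / sum(x^k) - 1/k - mean(ln x) = 0,
    then recovers the scale as lam = (mean(x^k))^(1/k).
    """
    x = np.asarray(x, dtype=float)
    logx = np.log(x)

    def score(k):
        xk = x ** k
        return (xk @ logx) / xk.sum() - 1.0 / k - logx.mean()

    lo, hi = k_lo, k_hi
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if score(mid) < 0.0:   # score is increasing in k
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = np.mean(x ** k) ** (1.0 / k)
    return k, lam

# Simulated lifetimes: true shape 2.0, true scale 1.5.
rng = np.random.default_rng(42)
sample = 1.5 * rng.weibull(2.0, size=20_000)
k_hat, lam_hat = weibull_mle(sample)
print(f"shape ~ {k_hat:.3f}, scale ~ {lam_hat:.3f}")
```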

  20. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. Available first- or second-order reliability methods (FORM/SORM) will find it challenging to estimate the reliability of such systems. Because of its inefficiency, the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens, instead of hundreds or thousands, of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach is proposed, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes. The method is illustrated with the help of several numerical examples.
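
    FORM, one of the baselines mentioned above, locates the most probable failure point in standard normal space with the Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration. The sketch below is a textbook version with numerical gradients, not the paper's hybrid method; the linear limit state is a made-up example whose exact reliability index is 3/sqrt(2).

```python
import math
import numpy as np

def form_hlrf(g, ndim, tol=1e-6, max_iter=100, h=1e-6):
    """First-order reliability method via the HL-RF iteration.

    `g` is a limit state function of a standard normal vector u
    (failure when g(u) <= 0). Returns the reliability index beta
    and the first-order failure probability Phi(-beta).
    """
    u = np.zeros(ndim)
    for _ in range(max_iter):
        g0 = g(u)
        # Forward-difference gradient of g at u.
        grad = np.array([(g(u + h * e) - g0) / h for e in np.eye(ndim)])
        u_new = (grad @ u - g0) / (grad @ grad) * grad
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    pf = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
    return beta, pf

# Made-up linear limit state: g(u) = 3 - u1 - u2, exact beta = 3/sqrt(2).
beta, pf = form_hlrf(lambda u: 3.0 - u[0] - u[1], ndim=2)
print(f"beta = {beta:.4f}, pf = {pf:.5f}")
```

    For a linear limit state in standard normal variables the iteration is exact; the abstract's point is that for implicit, time-dependent performance functions each call to g is an expensive simulation, which is why so few evaluations can be afforded.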

  1. Scale for positive aspects of caregiving experience: development, reliability, and factor structure.

    Science.gov (United States)

    Kate, N; Grover, S; Kulhara, P; Nehra, R

    2012-06-01

    OBJECTIVE. To develop an instrument (Scale for Positive Aspects of Caregiving Experience [SPACE]) that evaluates positive caregiving experience and assess its psychometric properties. METHODS. Available scales which assess some aspects of positive caregiving experience were reviewed and a 50-item questionnaire with a 5-point rating was constructed. In all, 203 primary caregivers of patients with severe mental disorders were asked to complete the questionnaire. Internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity were evaluated. Principal component factor analysis was run to assess the factorial validity of the scale. RESULTS. The scale developed as part of the study was found to have good internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity. Principal component factor analysis yielded a 4-factor structure, which also had good test-retest reliability and cross-language reliability. There was a strong correlation between the 4 factors obtained. CONCLUSION. The SPACE developed as part of this study has good psychometric properties.

  2. Stress-strength reliability for general bivariate distributions

    Directory of Open Access Journals (Sweden)

    Alaa H. Abdel-Hamid

    2016-10-01

    An expression for the stress-strength reliability R = P(X1 < X2) is obtained for general bivariate distributions, and in the parametric case point and interval estimates of the reliability function R are obtained. In the non-parametric case, point and interval estimates of R are developed using Govindarajulu's asymptotic distribution-free method when X1 and X2 are dependent. An example is given when the population distribution is bivariate compound Weibull. Simulations based on different sample sizes are performed to study the performance of the estimates.
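
    The stress-strength quantity R = P(X1 < X2) has a direct Monte Carlo estimate once dependent pairs can be sampled. The sketch below uses a bivariate normal stand-in (not the paper's bivariate compound Weibull) so that the exact answer, Phi(1) ~ 0.8413 for the chosen parameters, is available for comparison; all parameter values are invented.

```python
import numpy as np

def stress_strength_mc(mean, cov, n=200_000, seed=0):
    """Monte Carlo estimate of R = P(X1 < X2) for dependent (X1, X2)."""
    rng = np.random.default_rng(seed)
    x = rng.multivariate_normal(mean, cov, size=n)
    return np.mean(x[:, 0] < x[:, 1])

# Invented example: stress X1 ~ N(2, 1), strength X2 ~ N(3, 1), correlation 0.5.
# Then X1 - X2 ~ N(-1, 1), so the exact value is Phi(1) ~ 0.8413.
mean = [2.0, 3.0]
cov = [[1.0, 0.5],
       [0.5, 1.0]]
print(f"R ~ {stress_strength_mc(mean, cov):.4f}")
```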

  3. Teaching accuracy and reliability for student projects

    Science.gov (United States)

    Fisher, Nick

    2002-09-01

    Physics students at Rugby School follow the Salters Horners A-level course, which involves working on a two-week practical project of their own choosing. Pupils often misunderstand the concepts of accuracy and reliability, believing, for example, that repeating readings makes them more accurate and more reliable, whereas all it does is help to check repeatability. The course emphasizes the ideas of checking anomalous points, improving accuracy and making readings more sensitive. This article describes how we teach pupils in preparation for their projects. Based on many years of running such projects, much of this material is from a short booklet that we give out to pupils, when we train them in practical project skills.

  4. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  5. Evaluation of 2-point, 3-point, and 6-point Dixon magnetic resonance imaging with flexible echo timing for muscle fat quantification.

    Science.gov (United States)

    Grimm, Alexandra; Meyer, Heiko; Nickel, Marcel D; Nittka, Mathias; Raithel, Esther; Chaudry, Oliver; Friedberger, Andreas; Uder, Michael; Kemmler, Wolfgang; Quick, Harald H; Engelke, Klaus

    2018-06-01

    The purpose of this study is to evaluate and compare 2-point (2pt), 3-point (3pt), and 6-point (6pt) Dixon magnetic resonance imaging (MRI) sequences with flexible echo times (TE) to measure proton density fat fraction (PDFF) within muscles. Two subject groups were recruited (G1: 23 young and healthy men, 31 ± 6 years; G2: 50 elderly, sarcopenic men, 77 ± 5 years). A 3-T MRI system was used to perform Dixon imaging on the left thigh. PDFF was measured with six Dixon prototype sequences: 2pt, 3pt, and 6pt sequences, once with optimal TEs (in- and opposed-phase echo times), lower resolution, and higher bandwidth (optTE sequences) and once with higher image resolution (highRes sequences) and the shortest possible TE, respectively. Intra-fascia PDFF content was determined. To evaluate the comparability of the sequences, Bland-Altman analysis was performed. The highRes 6pt Dixon sequence served as reference, as a high correlation of this sequence with magnetic resonance spectroscopy has been shown before. The PDFF difference between the highRes 6pt Dixon sequence and the optTE 6pt, both 3pt, and the optTE 2pt sequences was low (between 2.2% and 4.4%), but not for the highRes 2pt Dixon sequence (33%). For the optTE sequences, the difference decreased with the number of echoes used. In conclusion, for Dixon sequences with more than two echoes the fat fraction measurement was reliable with arbitrary echo times, while for 2pt Dixon sequences it was reliable only with dedicated in- and opposed-phase echo timing. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Practice of value-based distribution reliability assessment (VBDRA) at Scarborough Public Utilities

    International Nuclear Information System (INIS)

    Chen, R-L.

    1995-01-01

    The development of value-based distribution reliability assessment (VBDRA) at Scarborough Public Utilities was described. Load point reliability indices, customer interruption costs (CIC), continuity and service reliability, accuracy of CIC, and the aspects of application of VBDRA were addressed. The application of VBDRA to a long-term rebuild plan for 4.16 kV distribution system was described. The importance of a cost-benefit analysis for implementation of VBDRA was emphasized. In the case of the Scarborough Public Utilities Commission the enhanced feeder reliability assessment was found to influence the allocation of funding to where it provided the most value to customers. 14 refs., 3 tabs., 3 figs

  7. Position Mooring Control Based on a Structural Reliability Criterion

    DEFF Research Database (Denmark)

    Fang, Shaoji; Leira, Bernt J.; Blanke, Mogens

    2013-01-01

    To prevent the mooring lines from simultaneously exceeding a stress threshold, this paper suggests a new algorithm to determine the reference position and an associated control system. The safety of each line is assessed through a structural reliability index, and a reference position where all mooring lines are safe is achieved using structural reliability indices in a cost function, where both the mean mooring-line tension and dynamic effects are considered. An optimal set-point is automatically produced without need for manual interaction. The parameters of the extreme value distribution are calculated on-line.

  8. Reliability in endoscopic diagnosis of portal hypertensive gastropathy

    Science.gov (United States)

    de Macedo, George Fred Soares; Ferreira, Fabio Gonçalves; Ribeiro, Maurício Alves; Szutan, Luiz Arnaldo; Assef, Mauricio Saab; Rossini, Lucio Giovanni Battista

    2013-01-01

    AIM: To analyze reliability among endoscopists in diagnosing portal hypertensive gastropathy (PHG) and to determine which criteria from the most utilized classifications are the most suitable. METHODS: From January to July 2009, in an academic quaternary referral center at Santa Casa of São Paulo Endoscopy Service, Brazil, we performed this single-center prospective study. In this period, we included 100 patients, including 50 sequential patients who had portal hypertension of various etiologies; who were previously diagnosed based on clinical, laboratory and imaging exams; and who presented with esophageal varices. In addition, our study included 50 sequential patients who had dyspeptic symptoms and were referred for upper digestive endoscopy without portal hypertension. All subjects underwent upper digestive endoscopy, and the images of the exam were digitally recorded. Five endoscopists with more than 15 years of experience answered an electronic questionnaire, which included endoscopic criteria from the 3 most commonly used Portal Hypertensive Gastropathy classifications (McCormack, NIEC and Baveno) and the presence of elevated or flat antral erosive gastritis. All five endoscopists were blinded to the patients’ clinical information, and all images of varices were deliberately excluded for the analysis. RESULTS: The three most common etiologies of portal hypertension were schistosomiasis (36%), alcoholic cirrhosis (20%) and viral cirrhosis (14%). Of the 50 patients with portal hypertension, 84% were Child A, 12% were Child B, 4% were Child C, 64% exhibited previous variceal bleeding and 66% were previously endoscopic treated. The endoscopic parameters, presence or absence of mosaic-like pattern, red point lesions and cherry-red spots were associated with high inter-observer reliability and high specificity for diagnosing Portal Hypertensive Gastropathy. Sensitivity, specificity and reliability for the diagnosis of PHG (%) were as follows: mosaic-like pattern

  9. Validity and Reliability Study of the Korean Tinetti Mobility Test for Parkinson's Disease.

    Science.gov (United States)

    Park, Jinse; Koh, Seong-Beom; Kim, Hee Jin; Oh, Eungseok; Kim, Joong-Seok; Yun, Ji Young; Kwon, Do-Young; Kim, Younsoo; Kim, Ji Seon; Kwon, Kyum-Yil; Park, Jeong-Ho; Youn, Jinyoung; Jang, Wooyoung

    2018-01-01

    Postural instability and gait disturbance are the cardinal symptoms associated with falling among patients with Parkinson's disease (PD). The Tinetti mobility test (TMT) is a well-established measurement tool used to predict falls among elderly people. However, the TMT has not been established or widely used among PD patients in Korea. The purpose of this study was to evaluate the reliability and validity of the Korean version of the TMT for PD patients. Twenty-four patients diagnosed with PD were enrolled in this study. For the interrater reliability test, thirteen clinicians scored the TMT after watching a video clip. We also used the test-retest method to determine intrarater reliability. For concurrent validation, the Unified Parkinson's Disease Rating Scale, Hoehn and Yahr staging, Berg Balance Scale, Timed Up and Go test, 10-m walk test, and gait analysis by three-dimensional motion capture were also used. We analyzed the receiver operating characteristic curve to predict falls. The interrater and intrarater reliability of the Korean Tinetti balance scale were 0.97 and 0.98, respectively. The interrater and intrarater reliability of the Korean Tinetti gait scale were 0.94 and 0.96, respectively. The Korean TMT scores were significantly correlated with the other clinical scales and with three-dimensional motion capture. The cutoff values for predicting falling were 14 points (balance subscale) and 10 points (gait subscale). We found that the Korean version of the TMT showed excellent validity and reliability for gait and balance and had high sensitivity and specificity for predicting falls among patients with PD.

  10. Grand Canyon as a universally accessible virtual field trip for intro Geoscience classes using geo-referenced mobile game technology

    Science.gov (United States)

    Bursztyn, N.; Pederson, J. L.; Shelton, B.

    2012-12-01

    There is a well-documented and nationally reported trend of declining interest, poor preparedness, and lack of diversity within U.S. students pursuing geoscience and other STEM disciplines. We suggest that a primary contributing factor to this problem is that introductory geoscience courses simply fail to inspire (i.e. they are boring). Our experience leads us to believe that the hands-on, contextualized learning of field excursions are often the most impactful component of lower division geoscience classes. However, field trips are becoming increasingly more difficult to run due to logistics and liability, high-enrollments, decreasing financial and administrative support, and exclusivity of the physically disabled. Recent research suggests that virtual field trips can be used to simulate this contextualized physical learning through the use of mobile devices - technology that exists in most students' hands already. Our overarching goal is to enhance interest in introductory geoscience courses by providing the kinetic and physical learning experience of field trips through geo-referenced educational mobile games and test the hypothesis that these experiences can be effectively simulated through virtual field trips. We are doing this by developing "serious" games for mobile devices that deliver introductory geology material in a fun and interactive manner. Our new teaching strategy will enhance undergraduate student learning in the geosciences, be accessible to students of diverse backgrounds and physical abilities, and be easily incorporated into higher education programs and curricula at institutions globally. Our prototype involves students virtually navigating downstream along a scaled down Colorado River through Grand Canyon - physically moving around their campus quad, football field or other real location, using their smart phone or a tablet. As students reach the next designated location, a photo or video in Grand Canyon appears along with a geological

  11. NDT Reliability - Final Report. Reliability in non-destructive testing (NDT) of the canister components

    Energy Technology Data Exchange (ETDEWEB)

    Pavlovic, Mato; Takahashi, Kazunori; Mueller, Christina; Boehm, Rainer (BAM, Federal Inst. for Materials Research and Testing, Berlin (Germany)); Ronneteg, Ulf (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden))

    2008-12-15

    This report describes the methodology of the reliability investigation performed on the ultrasonic phased array NDT system, developed by SKB in collaboration with Posiva, for inspection of the canisters for permanent storage of spent nuclear fuel. The canister is composed of a cast iron insert surrounded by a copper shell. The shell is composed of the tube and the lid/base, which are welded to the tube after the fuel has been placed in the tube. The manufacturing process of the canister parts and the welding process are described. Possible defects, which might arise in the canister components during manufacturing or in the weld during welding, are identified. The number of real defects in manufactured components has been limited; therefore, the reliability of the NDT system has been determined using a number of test objects with artificial defects. The reliability analysis is based on signal response analysis. The conventional signal response analysis is adopted and further developed before being applied to the modern ultrasonic phased-array NDT system. The concept of multi-parameter a, where the response of the NDT system depends on more than just one parameter, is introduced. The weakness of using the peak signal response in the analysis is demonstrated, and integration of the amplitudes in the C-scan is proposed as an alternative. The calculation of the volume POD, when the part is inspected with multiple configurations, is also presented. The reliability analysis is supported by ultrasonic simulation based on the point source synthesis method.

  12. Reactor coolant flow measurements at Point Lepreau

    International Nuclear Information System (INIS)

    Brenciaglia, G.; Gurevich, Y.; Liu, G.

    1996-01-01

    The CROSSFLOW ultrasonic flow measurement system manufactured by AMAG is fully proven as reliable and accurate when applied to large piping in defined geometries for such applications as feedwater flow measurement. Its application to direct reactor coolant flow (RCF) measurements - both individual channel flows and bulk flows such as pump suction flow - has been well established through recent work by AMAG at Point Lepreau, with application to other reactor types (e.g., PWR) imminent. At Point Lepreau, measurements have been demonstrated at full power; improvements to consistently meet ±1% accuracy are in progress. The development and recent customization of CROSSFLOW for RCF measurement at Point Lepreau are described in this paper; typical measurement results are included. (author)

  13. Reliability of accumulators systems for Angra-I: a reavaluation

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Fleming, P.V.; Frutuoso e Melo, P.F.F.; Tayt-Sohn, L.C.

    1983-01-01

    The reliability analysis of the accumulator systems of Angra-1, based on a study done in 1979/80, is re-evaluated. The methodology used is the same (WASH-1400). In addition, a computer program was used to obtain the minimal cut sets. (author) [pt

  14. Reliability improvement of multiversion software by exchanging modules

    International Nuclear Information System (INIS)

    Shima, Kazuyuki; Matsumoto, Ken-ichi; Torii, Koji

    1996-01-01

    In this paper, we propose a method to improve the reliability of multiversion software. In the CER scheme proposed earlier, checkpoints are put in the versions of a program, and errors in the versions are detected and recovered at the checkpoints. This prevents versions from failing and improves the reliability of the multiversion software. However, it has been pointed out that CER decreases the reliability of the multiversion software if the detection and recovery of errors are assumed to be fallible. In the method proposed in this paper, the versions of a program are developed following the same module specifications. When failures of versions are detected, faulty modules are identified and replaced with other modules. This creates versions without faulty modules and improves the reliability of the multiversion software. With the proposed method, the failure probability of the multiversion software is estimated to become about one hundredth of the original failure probability when the failure probability of each version is 0.000698, the number of versions is 5, and the number of modules is 20. (author)

  15. Reshaping the Science of Reliability with the Entropy Function

    Directory of Open Access Journals (Sweden)

    Paolo Rocchi

    2015-01-01

    The present paper revolves around two main points. First, we observe a certain parallel between the reliability of systems and the progressive disorder of thermodynamical systems, and we import the notion of reversibility/irreversibility into the reliability domain. Second, we note that reliability theory is a very active area of research that has nonetheless not yet become a mature discipline. This is because the majority of researchers adopt inductive logic instead of the deductive logic typical of mature scientific sectors. The deductive approach was inaugurated by Gnedenko in the reliability domain. We mean to continue Gnedenko's work, and we use the Boltzmann-like entropy to pursue this objective. This paper condenses the papers published in the past decade which illustrate the calculus of the Boltzmann-like entropy. It is demonstrated how every result complies with deductive logic and is consistent with Gnedenko's achievements.

  16. Approach to developing reliable space reactor power systems

    International Nuclear Information System (INIS)

    Mondt, J.F.; Shinbrot, C.H.

    1991-01-01

    The Space Reactor Power System Project is in the engineering development phase of a three-phase program. During Phase II, the Engineering Development Phase, the SP-100 Project has defined and is pursuing a new approach to developing reliable power systems. The approach to developing such a system during the early technology phase is described in this paper, along with some preliminary examples to help explain the approach. Developing reliable components to meet space reactor power system requirements is based on a top-down systems approach, which includes a point design based on a detailed technical specification of a 100 kW power system.

  17. Protection of HCl dew point corrosion in municipal incinerators

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, S.; Tsuruta, T.; Maeda, N.

    1976-12-01

    HCl dew point corrosion is often observed on the components of municipal incinerators used for burning wastes which contain polyvinyl chloride. In order to solve the problem, the relation between concentrations of gaseous HCl and the corresponding dew points as well as concentrations of condensed HCl, was investigated. A series of HCl dipping tests for the materials concerned was performed and the dip test results were compared with in-plant tests. As a result it was concluded that HCl dew point corrosion can be reliably predicted from measurements of HCl concentrations in the water and in the gas and the partial pressure of the saturated steam at the dew point.

  18. Reliability Analysis Study of Digital Reactor Protection System in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Guo, Xiao Ming; Liu, Tao; Tong, Jie Juan; Zhao, Jun

    2011-01-01

    Digital I&C systems are generally believed to improve a plant's safety and reliability, and the reliability analysis of digital I&C systems has become a research hotspot. The traditional fault tree method is one means of quantifying digital I&C system reliability. A review of the digital protection system evaluation for the advanced nuclear power plant AP1000 clarifies both the fault tree application and the analysis process for digital system reliability. A typical digital protection system for an advanced reactor has been developed, and its reliability evaluation is necessary for design demonstration. The construction of this typical digital protection system is introduced in the paper, and the application of FMEA and fault trees to its reliability evaluation is described. Reliability data and bypass logic modeling are two points given special attention in the paper. Because time-sequence and feedback factors are not obviously present in the reactor protection system, the dynamic features of the digital system are not discussed.

  19. Reliability analysis of reactor protection systems

    International Nuclear Information System (INIS)

    Alsan, S.

    1976-07-01

    A theoretical mathematical study of reliability is presented, and the concepts subsequently defined are applied to the study of nuclear reactor safety systems. The theory is applied to investigations of the operational reliability of the Siloe reactor from the point of view of rod drop. A statistical study conducted between 1964 and 1971 demonstrated that most rod drop incidents arose from circumstances associated with experimental equipment (new set-ups). The reliability of the most suitable safety system for some recently developed experimental equipment is discussed. Calculations indicate that if all experimental equipment were equipped with these new systems, only 1.75 rod drop accidents would be expected to occur per year on average. It is suggested that all experimental equipment should be equipped with these new safety systems and tested every 21 days. The reliability of the new safety system currently being studied for the Siloe reactor was also investigated. The following results were obtained: definite failures must be detected immediately as a result of the disturbances produced; the repair time must not exceed a few hours; the equipment must be tested every week. Under such conditions, the rate of accidental rod drops is about 0.013 on average per year. The level of nondefinite failures is less than 10^-6 per hour and the level of nonprotection 1 hour per year. (author)

  20. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    Science.gov (United States)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information to assess impacts of climate change at regional and global scales. Statistical downscaling methods are applied to prepare climate model data for various applications, such as hydrologic and ecologic modelling at a watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data mainly depend on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data to regional modelling systems. However, inconsistencies in these climate products - for example, different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with the physiographic characteristics of the landscape - have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e., thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis), and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparing them with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products according to the elevations of the stations, discretized into several classes. According to the rank of the climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system

  1. Prospective comparison of liver stiffness measurements between two point shear wave elastography methods: Virtual touch quantification and elastography point quantification

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Hyun Suk; Lee, Jeong Min; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo [Seoul National University Hospital, Seoul (Korea, Republic of)

    2016-09-15

    To prospectively compare technical success rate and reliable measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rate and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ² analysis. LS values measured using the two techniques were compared and correlated via Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limit of agreement. The intraobserver reproducibility of ElastPQ was determined by 95% Bland-Altman limit of agreement and intraclass correlation coefficient (ICC). The two pSWE techniques showed similar technical success rate (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Virtual touch quantification and ElastPQ showed similar technical success rate and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement.
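
The 95% Bland-Altman limits of agreement used above to compare the two methods can be computed from paired readings as below; the VTQ/ElastPQ values shown are invented for illustration, not the study's data.

```python
# Minimal sketch of 95% Bland-Altman limits of agreement for two methods
# measuring the same quantity; the paired readings below are hypothetical.
from statistics import mean, stdev

def bland_altman_limits(a, b):
    """Return (mean difference, lower LoA, upper LoA) for paired readings."""
    diffs = [x - y for x, y in zip(a, b)]
    d, s = mean(diffs), stdev(diffs)     # bias and SD of the differences
    return d, d - 1.96 * s, d + 1.96 * s

vtq     = [1.2, 1.5, 1.8, 2.1, 1.4]     # hypothetical VTQ readings (m/s)
elastpq = [1.1, 1.6, 1.7, 2.0, 1.5]     # hypothetical ElastPQ readings
bias, lo, hi = bland_altman_limits(vtq, elastpq)
```

Wide limits relative to the mean, as reported above, indicate the methods cannot be used interchangeably even when their correlation is strong.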

  2. Reliability of Carbon Stock Estimates in Imperata Grassland (East Kalimantan, Indonesia), Using Georeferenced Information

    NARCIS (Netherlands)

    Yassir, I.; Putten, van B.; Buurman, P.

    2012-01-01

    Knowledge of the spatial distribution of total carbon is important for understanding the impact of regional land use change on the global carbon cycle. We studied spatial total carbon variability using transect sampling in an Imperata grassland area. Spatial variability was modeled following an

  3. Feature Extraction from 3D Point Cloud Data Based on Discrete Curves

    Directory of Open Access Journals (Sweden)

    Yi An

    2013-01-01

    Full Text Available Reliable feature extraction from 3D point cloud data is an important problem in many application domains, such as reverse engineering, object recognition, industrial inspection, and autonomous navigation. In this paper, a novel method is proposed for extracting the geometric features from 3D point cloud data based on discrete curves. We extract the discrete curves from 3D point cloud data and examine the behaviors of chord lengths, angle variations, and principal curvatures at the geometric features in the discrete curves. Then, the corresponding similarity indicators are defined. Based on the similarity indicators, the geometric features can be extracted from the discrete curves, which are also the geometric features of 3D point cloud data. The threshold values of the similarity indicators are taken from [0,1], which characterize the relative relationship and make the threshold setting easier and more reasonable. The experimental results demonstrate that the proposed method is efficient and reliable.
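
One of the per-point behaviors mentioned above, the angle variation along a discrete curve, can be sketched as a turning-angle computation; the function and test curve below are our own illustration, not the authors' exact indicator.

```python
# Turning angle at each interior point of a 2D discrete curve; sharp
# geometric features show up as large angle variations. Illustrative only.
from math import acos, hypot, degrees

def turning_angles(curve):
    """Angle (degrees) between successive chords at each interior point."""
    angles = []
    for i in range(1, len(curve) - 1):
        (x0, y0), (x1, y1), (x2, y2) = curve[i - 1], curve[i], curve[i + 1]
        u, v = (x1 - x0, y1 - y0), (x2 - x1, y2 - y1)
        dot = u[0] * v[0] + u[1] * v[1]
        cos_t = dot / (hypot(*u) * hypot(*v))
        cos_t = max(-1.0, min(1.0, cos_t))   # guard against rounding
        angles.append(degrees(acos(cos_t)))
    return angles

# A right-angle corner at (1, 0) stands out against the straight runs
curve = [(0, 0), (1, 0), (1, 1), (1, 2)]
angs = turning_angles(curve)
```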

  4. Reliability of the Radiographic Sagittal and Frontal Tibiotalar Alignment after Ankle Arthrodesis.

    Science.gov (United States)

    Willegger, Madeleine; Holinka, Johannes; Nemecek, Elena; Bock, Peter; Wanivenhaus, Axel Hugo; Windhager, Reinhard; Schuh, Reinhard

    2016-01-01

    Accurate measurement of the tibiotalar alignment is important in radiographic outcome assessment of ankle arthrodesis (AA). In studies, various radiological methods have been used to measure the tibiotalar alignment leading to facultative misinterpretation of results. However, to our knowledge, no previous study has investigated the reliability of tibiotalar alignment measurement in AA. We aimed to investigate the reliability of four different methods of measurement of the frontal and sagittal tibiotalar alignment after AA, and to further clarify the most reliable method for determining the longitudinal axis of the tibia. Thirty-eight weight bearing anterior to posterior and lateral ankle radiographs of thirty-seven patients who had undergone AA with a two screw fixation technique were selected. Three observers measured the frontal tibiotalar angle (FTTA) and the sagittal tibiotalar angle (STTA) using four different methods. The methods differed by the definition of the longitudinal tibial axis. Method A was defined by a line drawn along the lateral tibial border in anterior to posterior radiographs and along the posterior tibial border in lateral radiographs. Method B was defined by a line connecting two points in the middle of the proximal and the distal tibial shaft. Method C was drawn "freestyle" along the longitudinal axis of the tibia, and method D was defined by a line connecting the center of the tibial articular surface and a point in the middle of the proximal tibial shaft. Intra- and interobserver correlation coefficients (ICC) and repeated measurement ANOVA were calculated to assess measurement reliability and accuracy. All four methods showed excellent inter- and intraobserver reliability for the FTTA and the STTA. When the longitudinal tibial axis is defined by connecting two points in the middle of the proximal and the distal tibial shaft, the highest interobserver reliability for the FTTA (ICC: 0.980; CI 95%: 0.966-0.989) and for the STTA (ICC: 0
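
The interobserver agreement reported above rests on intraclass correlation coefficients. As a hedged illustration, a one-way random-effects ICC (a simpler model than the study may actually have used) can be computed from a subjects-by-raters table:

```python
# One-way random-effects ICC from a subjects (rows) x raters (columns)
# table. This is an illustrative model choice; the study's exact ICC
# variant is not stated in the abstract. Readings below are fabricated.
def icc_oneway(table):
    n, k = len(table), len(table[0])
    grand = sum(sum(r) for r in table) / (n * k)
    row_means = [sum(r) / k for r in table]
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for r, m in zip(table, row_means) for x in r) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical FTTA readings (degrees) by three observers
ratings = [[88, 89, 88], [92, 91, 92], [85, 86, 85], [90, 90, 91]]
icc = icc_oneway(ratings)   # close to 1 when raters agree
```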

  5. Reliability of the Radiographic Sagittal and Frontal Tibiotalar Alignment after Ankle Arthrodesis.

    Directory of Open Access Journals (Sweden)

    Madeleine Willegger

    Full Text Available Accurate measurement of the tibiotalar alignment is important in radiographic outcome assessment of ankle arthrodesis (AA). In studies, various radiological methods have been used to measure the tibiotalar alignment leading to facultative misinterpretation of results. However, to our knowledge, no previous study has investigated the reliability of tibiotalar alignment measurement in AA. We aimed to investigate the reliability of four different methods of measurement of the frontal and sagittal tibiotalar alignment after AA, and to further clarify the most reliable method for determining the longitudinal axis of the tibia. Thirty-eight weight bearing anterior to posterior and lateral ankle radiographs of thirty-seven patients who had undergone AA with a two screw fixation technique were selected. Three observers measured the frontal tibiotalar angle (FTTA) and the sagittal tibiotalar angle (STTA) using four different methods. The methods differed by the definition of the longitudinal tibial axis. Method A was defined by a line drawn along the lateral tibial border in anterior to posterior radiographs and along the posterior tibial border in lateral radiographs. Method B was defined by a line connecting two points in the middle of the proximal and the distal tibial shaft. Method C was drawn "freestyle" along the longitudinal axis of the tibia, and method D was defined by a line connecting the center of the tibial articular surface and a point in the middle of the proximal tibial shaft. Intra- and interobserver correlation coefficients (ICC) and repeated measurement ANOVA were calculated to assess measurement reliability and accuracy. All four methods showed excellent inter- and intraobserver reliability for the FTTA and the STTA. When the longitudinal tibial axis is defined by connecting two points in the middle of the proximal and the distal tibial shaft, the highest interobserver reliability for the FTTA (ICC: 0.980; CI 95%: 0.966-0.989) and for the

  6. Using Generalizability Theory to Assess the Score Reliability of Communication Skills of Dentistry Students

    Science.gov (United States)

    Uzun, N. Bilge; Aktas, Mehtap; Asiret, Semih; Yormaz, Seha

    2018-01-01

    The goal of this study is to determine the reliability of the performance points of dentistry students regarding communication skills and to examine the scoring reliability by generalizability theory in balanced random and fixed facet (mixed design) data, considering also the interactions of student, rater and duty. The study group of the research…

  7. Modified personal interviews: resurrecting reliable personal interviews for admissions?

    Science.gov (United States)

    Hanson, Mark D; Kulasegaram, Kulamakan Mahan; Woods, Nicole N; Fechtig, Lindsey; Anderson, Geoff

    2012-10-01

    Traditional admissions personal interviews provide flexible faculty-student interactions but are plagued by low inter-interview reliability. Axelson and Kreiter (2009) retrospectively showed that multiple independent sampling (MIS) may improve reliability of personal interviews; thus, the authors incorporated MIS into the admissions process for medical students applying to the University of Toronto's Leadership Education and Development Program (LEAD). They examined the reliability and resource demands of this modified personal interview (MPI) format. In 2010-2011, LEAD candidates submitted written applications, which were used to screen for participation in the MPI process. Selected candidates completed four brief (10-12 minutes) independent MPIs each with a different interviewer. The authors blueprinted MPI questions to (i.e., aligned them with) leadership attributes, and interviewers assessed candidates' eligibility on a five-point Likert-type scale. The authors analyzed inter-interview reliability using the generalizability theory. Sixteen candidates submitted applications; 10 proceeded to the MPI stage. Reliability of the written application components was 0.75. The MPI process had overall inter-interview reliability of 0.79. Correlation between the written application and MPI scores was 0.49. A decision study showed acceptable reliability of 0.74 with only three MPIs scored using one global rating. Furthermore, a traditional admissions interview format would take 66% more time than the MPI format. The MPI format, used during the LEAD admissions process, achieved high reliability with minimal faculty resources. The MPI format's reliability and effective resource use were possible through MIS and employment of expert interviewers. MPIs may be useful for other admissions tasks.
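
The decision-study numbers above are consistent with a Spearman-Brown projection, which can serve as a simplified sketch of the full generalizability analysis: invert the four-interview reliability of 0.79 to a per-interview value, then project to three interviews.

```python
# Spearman-Brown projection as a simplified stand-in for the D-study step;
# the abstract's 0.79 (four MPIs) and 0.74 (three MPIs) are consistent
# with it, but the authors' actual analysis used generalizability theory.
def spearman_brown(r_single, m):
    """Reliability of the mean of m parallel measurements."""
    return m * r_single / (1 + (m - 1) * r_single)

def single_from_composite(r_m, m):
    """Invert Spearman-Brown: per-measurement reliability from a composite."""
    return r_m / (m - (m - 1) * r_m)

r1 = single_from_composite(0.79, 4)   # per-MPI reliability, about 0.48
r3 = spearman_brown(r1, 3)            # projected three-MPI reliability
```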

  8. Optimised and balanced structural and system reliability of offshore wind turbines. An account

    Energy Technology Data Exchange (ETDEWEB)

    Tarp-Johansen, N.J.; Kozine, I. (Risoe National Lab., DTU, Roskilde (DK)); Rademarkers, L. (Netherlands Energy Research Foundation (NL)); Dalsgaard Soerensen, J. (Aalborg Univ. (DK)); Ronold, K. (Det Norske Veritas (DK))

    2005-04-15

    This report gives the results of the research project 'Optimised and Uniform Safety and Reliability of Offshore Wind Turbines (an account)'. The main subject of the project has been an account of the state of the art of knowledge about, and attempts at, harmonisation of the structural reliability of wind turbines on the one hand, and the reliability of the wind turbine's control/safety system on the other. Some forward-looking research has also been conducted within the project. (au)

  9. Reliability analysis for thermal cutting method based non-explosive separation device

    International Nuclear Information System (INIS)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu

    2016-01-01

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn through and unwind, holder separation failure, ball separation failure, and pin release failure. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.

  10. Reliability analysis for thermal cutting method based non-explosive separation device

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu [Korea Aerospace University, Goyang (Korea, Republic of)

    2016-12-15

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn through and unwind, holder separation failure, ball separation failure, and pin release failure. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.
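
The gain from carrying several redundant coils can be illustrated with the standard parallel-redundancy formula. The single-coil reliability below is an assumed figure, since the paper's 0.999989 aggregates all fault-tree events, not just wire failures.

```python
# Parallel redundancy: the system survives if at least one of n
# independent elements works. The wire reliability is a made-up
# illustration, not a figure from the paper.
def parallel_reliability(r_single, n):
    """Reliability of n independent redundant elements in parallel."""
    return 1 - (1 - r_single) ** n

r_wire = 0.90                              # assumed single-coil reliability
r_sys = parallel_reliability(r_wire, 5)    # five redundant Ni-Cr coils
```

Even a modest per-coil reliability compounds quickly: five 0.90 coils already give 0.99999 against wire failure alone.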

  11. Point-Connecting Measurements of the Hallux Valgus Deformity: A New Measurement and Its Clinical Application

    Science.gov (United States)

    Seo, Jeong-Ho; Boedijono, Dimas

    2016-01-01

    Purpose The aim of this study was to investigate new point-connecting measurements for the hallux valgus angle (HVA) and the first intermetatarsal angle (IMA), which can reflect the degree of subluxation of the first metatarsophalangeal joint (MTPJ). Also, this study attempted to compare the validity of midline measurements and the new point-connecting measurements for the determination of HVA and IMA values. Materials and Methods Sixty feet of hallux valgus patients who underwent surgery between 2007 and 2011 were classified in terms of the severity of HVA, congruency of the first MTPJ, and type of chevron metatarsal osteotomy. On weight-bearing dorsal-plantar radiographs, HVA and IMA values were measured and compared preoperatively and postoperatively using both the conventional and new methods. Results Compared with midline measurements, point-connecting measurements showed higher inter- and intra-observer reliability for preoperative HVA/IMA and similar or higher inter- and intra-observer reliability for postoperative HVA/IMA. Patients who underwent distal chevron metatarsal osteotomy (DCMO) had higher intraclass correlation coefficient for inter- and intra-observer reliability for pre- and post-operative HVA and IMA measured by the point-connecting method compared with the midline method. All differences in the preoperative HVAs and IMAs determined by both the midline method and point-connecting methods were significant between the deviated group and subluxated groups (p=0.001). Conclusion The point-connecting method for measuring HVA and IMA in the subluxated first MTPJ may better reflect the severity of a HV deformity with higher reliability than the midline method, and is more useful in patients with DCMO than in patients with proximal chevron metatarsal osteotomy. PMID:26996576
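
Both HVA and IMA reduce to the angle between two lines, each defined by a pair of landmark points. The sketch below shows that computation with invented coordinates; it is an illustration of the geometry, not the paper's measurement software.

```python
# Unsigned angle between two lines, each given by two landmark points.
# Landmark coordinates below are invented for illustration.
from math import atan2, degrees

def angle_between(p1, p2, q1, q2):
    """Unsigned angle (degrees) between line p1-p2 and line q1-q2."""
    a1 = degrees(atan2(p2[1] - p1[1], p2[0] - p1[0]))
    a2 = degrees(atan2(q2[1] - q1[1], q2[0] - q1[0]))
    d = abs(a1 - a2) % 180
    return min(d, 180 - d)

# First-metatarsal axis vs proximal-phalanx axis (hypothetical landmarks)
hva = angle_between((0, 0), (0, 10), (0, 10), (5, 19))
```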

  12. Validity and Reliability of the Catastrophic Cognitions Questionnaire-Turkish Version

    Directory of Open Access Journals (Sweden)

    Ayse Kart

    2016-01-01

    Full Text Available Aim: The importance of catastrophic cognitions is well known for the development and maintenance of panic disorder. The Catastrophic Cognitions Questionnaire (CCQ) measures thoughts associated with danger and was originally developed by Khawaja (1992). In this study, we aimed to evaluate the validity and reliability of the Turkish version of the CCQ. Material and Method: The CCQ was administered to 250 patients with panic disorder. The Turkish version of the CCQ was created by translation, back-translation and pilot assessment. A sociodemographic data form and the CCQ Turkish version were administered to participants. Reliability of the CCQ was analyzed by test-retest correlation, the split-half technique and Cronbach's alpha coefficient. Construct validity was evaluated by factor analysis after the Kaiser-Meyer-Olkin (KMO) and Bartlett tests had been performed. Principal component analysis and varimax rotation were used for factor analysis. Results: Fifty-five point six percent (n=139) of the participants were female and forty-four point four percent (n=111) were male. Internal consistency of the questionnaire was calculated as 0.920 by Cronbach's alpha. In the analysis performed by the split-half method, reliability coefficients of the two halves were found to be 0.917 and 0.832, and the Spearman-Brown coefficient was 0.875. Factor analysis revealed five basic factors, which together explained 66.2% of the total variance. Discussion: The results of this study show that the Turkish version of the CCQ is a reliable and valid scale.
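
The Cronbach's alpha reported above follows from a standard formula; below is a minimal sketch of coefficient alpha with fabricated item scores rather than CCQ data.

```python
# Cronbach's alpha from its definitional formula:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
# Item scores below are fabricated, not CCQ data.
def cronbach_alpha(items):
    """items: one inner list per item, each holding every respondent's score."""
    k = len(items)
    n = len(items[0])

    def var(xs):                      # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var / var(totals))

# Four hypothetical items answered by five respondents
items = [[1, 2, 3, 4, 5],
         [1, 2, 3, 5, 5],
         [2, 2, 3, 4, 4],
         [1, 3, 3, 4, 5]]
alpha = cronbach_alpha(items)
```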

  13. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  14. Reliability and Validity of 10 Different Standard Setting Procedures.

    Science.gov (United States)

    Halpin, Glennelle; Halpin, Gerald

    Research indicating that different cut-off points result from the use of different standard-setting techniques leaves decision makers with a disturbing dilemma: Which standard-setting method is best? This investigation of the reliability and validity of 10 different standard-setting approaches was designed to provide information that might help…

  15. Sample size planning for composite reliability coefficients: accuracy in parameter estimation via narrow confidence intervals.

    Science.gov (United States)

    Terry, Leann; Kelley, Ken

    2012-11-01

    Composite measures play an important role in psychology and related disciplines. Composite measures almost always have error. Correspondingly, it is important to understand the reliability of the scores from any particular composite measure. However, the point estimates of the reliability of composite measures are fallible and thus all such point estimates should be accompanied by a confidence interval. When confidence intervals are wide, there is much uncertainty in the population value of the reliability coefficient. Given the importance of reporting confidence intervals for estimates of reliability, coupled with the undesirability of wide confidence intervals, we develop methods that allow researchers to plan sample size in order to obtain narrow confidence intervals for population reliability coefficients. We first discuss composite reliability coefficients and then provide a discussion on confidence interval formation for the corresponding population value. Using the accuracy in parameter estimation approach, we develop two methods to obtain accurate estimates of reliability by planning sample size. The first method provides a way to plan sample size so that the expected confidence interval width for the population reliability coefficient is sufficiently narrow. The second method ensures that the confidence interval width will be sufficiently narrow with some desired degree of assurance (e.g., 99% assurance that the 95% confidence interval for the population reliability coefficient will be less than W units wide). The effectiveness of our methods was verified with Monte Carlo simulation studies. We demonstrate how to easily implement the methods with easy-to-use and freely available software. ©2011 The British Psychological Society.

  16. Problems in diagnosing and forecasting power equipment reliability

    Energy Technology Data Exchange (ETDEWEB)

    Popkov, V I; Demirchyan, K S

    1979-11-01

    This general survey deals with approaches to the resolution of such problems as the gathering, analysis and systematization of data on component defects in power equipment, and the setting up of feedback with the manufacturing plants and planning organizations to improve equipment reliability. Such efforts on the part of designers, manufacturers and operating and repair organizations in analyzing faults in 300 MW turbogenerators during 1974-1977 reduced the specific fault rate by 20 to 25% and the downtime per failure by 35 to 40%. Since power equipment should operate for several hundred thousand hours (20 to 30 years) and the majority of power components have guaranteed service lives of no more than 10⁵ hours, determining the reliability of equipment past the 10⁵-hour point is an extremely difficult problem. The present trend in the USSR Unified Power System towards an increasing number of shutdowns and startups, which for turbogenerators of up to 1200 MW can reach 7500 to 10,000 cycles, is noted. Other areas briefly treated are: MHD generator reliability and economy; nuclear power plant reliability and safety; the reliability of high-power high-voltage thyristor converters; the difficulties involved in scale modeling of power system reliability and the high cost of the requisite full-scale studies; and the poor understanding of long-term corrosion and erosion processes. The review concludes with arguments in favor of greater computerization of all aspects of power system management.

  17. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF mathematical properties, so the two functions together makes up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process, a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Nevertheless it is foreseen model applications to inspection and maintenance of physical systems. The paper includes a complete numerical example of the model application to a software reliability analysis
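
As a hedged illustration of the NHPP view of software failures described above, the classic exponential (Goel-Okumoto) mean-value function can stand in for MERF, whose exact form the abstract does not give.

```python
# Standard NHPP software-reliability illustration (Goel-Okumoto form),
# used here as a stand-in for MERF, not as the authors' model.
from math import exp

def mean_failures(t, a, b):
    """Expected cumulative failures by time t: m(t) = a(1 - e^{-bt})."""
    return a * (1 - exp(-b * t))

def reliability(x, t, a, b):
    """P(no failure in (t, t+x]) = exp(-(m(t+x) - m(t)))."""
    return exp(-(mean_failures(t + x, a, b) - mean_failures(t, a, b)))

# Hypothetical: 100 eventual faults, detection rate 0.05 per unit time
r = reliability(x=10, t=50, a=100, b=0.05)
```

Because m(t) flattens as faults are found and corrected, the same mission length x becomes more survivable the longer the software has been tested.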

  18. On the reliability of finite element solutions

    International Nuclear Information System (INIS)

    Prasad, K.S.R.K.

    1975-01-01

    The extent of reliability of the finite element method for analysis of nuclear reactor structures, and that of reactor vessels in particular and the need for the engineer to guard against the pitfalls that may arise out of both physical and mathematical models have been high-lighted. A systematic way of checking the model to obtain reasonably accurate solutions is presented. Quite often sophisticated elements are suggested for specific design and stress concentration problems. The desirability or otherwise of these elements, their scope and utility vis-a-vis the use of large stack of conventional elements are discussed from the view point of stress analysts. The methods of obtaining a check on the reliability of the finite element solutions either through modelling changes or an extrapolation technique are discussed. (author)

  19. Bulk electric system reliability evaluation incorporating wind power and demand side management

    Science.gov (United States)

    Huang, Dange

    Electric power systems are experiencing dramatic changes with respect to structure, operation and regulation and are facing increasing pressure due to environmental and societal constraints. Bulk electric system reliability is an important consideration in power system planning, design and operation particularly in the new competitive environment. A wide range of methods have been developed to perform bulk electric system reliability evaluation. Theoretically, sequential Monte Carlo simulation can include all aspects and contingencies in a power system and can be used to produce an informative set of reliability indices. It has become a practical and viable tool for large system reliability assessment technique due to the development of computing power and is used in the studies described in this thesis. The well-being approach used in this research provides the opportunity to integrate an accepted deterministic criterion into a probabilistic framework. This research work includes the investigation of important factors that impact bulk electric system adequacy evaluation and security constrained adequacy assessment using the well-being analysis framework. Load forecast uncertainty is an important consideration in an electrical power system. This research includes load forecast uncertainty considerations in bulk electric system reliability assessment and the effects on system, load point and well-being indices and reliability index probability distributions are examined. There has been increasing worldwide interest in the utilization of wind power as a renewable energy source over the last two decades due to enhanced public awareness of the environment. Increasing penetration of wind power has significant impacts on power system reliability, and security analyses become more uncertain due to the unpredictable nature of wind power. The effects of wind power additions in generating and bulk electric system reliability assessment considering site wind speed

  20. The reliability of a severity rating scale to measure stuttering in an unfamiliar language.

    Science.gov (United States)

    Hoffman, Laura; Wilson, Linda; Copley, Anna; Hewat, Sally; Lim, Valerie

    2014-06-01

    With increasing multiculturalism, speech-language pathologists (SLPs) are likely to work with stuttering clients from linguistic backgrounds that differ from their own. No research to date has estimated SLPs' reliability when measuring severity of stuttering in an unfamiliar language. Therefore, this study was undertaken to estimate the reliability of SLPs' use of a 9-point severity rating (SR) scale, to measure severity of stuttering in a language that was different from their own. Twenty-six Australian SLPs rated 20 speech samples (10 Australian English [AE] and 10 Mandarin) of adults who stutter using a 9-point SR scale on two separate occasions. Judges showed poor agreement when using the scale to measure stuttering in Mandarin samples. Results also indicated that 50% of individual judges were unable to reliably measure the severity of stuttering in AE. The results highlight the need for (a) SLPs to develop intra- and inter-judge agreement when using the 9-point SR scale to measure severity of stuttering in their native language (in this case AE) and in unfamiliar languages; and (b) research into the development and evaluation of practice and/or training packages to assist SLPs to do so.

  1. Maximum power point tracking

    International Nuclear Information System (INIS)

    Enslin, J.H.R.

    1990-01-01

    A well-engineered renewable remote energy system utilizing the principle of maximum power point tracking can be more cost-effective, has a higher reliability and can improve the quality of life in remote areas. This paper reports that a high-efficiency power electronic converter, for converting the output voltage of a solar panel or wind generator to the required DC battery bus voltage, has been realized. The converter is controlled to track the maximum power point of the input source under varying input and output parameters. Maximum power point tracking for relatively small systems is achieved by maximization of the output current in a battery charging regulator, using an optimized hill-climbing, inexpensive microprocessor-based algorithm. Practical field measurements show that a minimum input source saving of 15% on 3-5 kWh/day systems can easily be achieved. A total cost saving of at least 10-15% on the capital cost of these systems is achievable for relatively small Remote Area Power Supply systems. The advantages are greater for larger temperature variations and higher power ratings. Other advantages include optimal sizing and system monitoring and control.
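
The hill-climbing (perturb-and-observe) control mentioned above can be sketched against a made-up power-voltage curve; the step size and the curve itself are illustrative assumptions, not the paper's implementation.

```python
# Perturb-and-observe hill-climbing MPPT sketch against a toy PV curve.
# Curve shape, step size and starting point are illustrative assumptions.
def panel_power(v):
    """Toy PV curve: concave with its maximum power point at v = 17."""
    return max(0.0, 60 - 0.5 * (v - 17) ** 2)

def track_mpp(v0, step=0.5, iters=200):
    """Perturb the operating voltage; keep direction while power rises."""
    v, direction = v0, 1
    p_prev = panel_power(v)
    for _ in range(iters):
        v += direction * step
        p = panel_power(v)
        if p < p_prev:            # overshot the peak: reverse direction
            direction = -direction
        p_prev = p
    return v

v_mpp = track_mpp(10.0)   # settles to oscillating within one step of 17 V
```

The steady-state oscillation around the peak is the classic trade-off of perturb-and-observe: a smaller step tightens it but slows convergence.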

  2. reliability reliability

    African Journals Online (AJOL)

    eobe

    Corresponding author, Tel: +234-703. RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory,. Technical ...

  3. LBA-ECO ND-11 Soil Properties of Forested Headwater Catchments, Mato Grosso, Brazil

    Data.gov (United States)

    National Aeronautics and Space Administration — The results of the analysis of soil chemical parameters, texture, and color are reported for 185 georeferenced soil profile sample points over four forested...

  4. Accounting basis adjustments and deficit reliability: Evidence from southern European countries

    Directory of Open Access Journals (Sweden)

    Maria Antónia Jesus

    2016-01-01

    The main findings point to the need for standardised procedures to convert cash-based (GA) into accrual-based (NA) data as a crucial step, preventing accounting manipulation and thus increasing the reliability of informative outputs for both micro and macro purposes.

  5. IFMIF - Layout and arrangement of cells according to requirements of technical logistics, reliability and remote handling

    Energy Technology Data Exchange (ETDEWEB)

    Mittwollen, Martin, E-mail: martin.mittwollen@kit.edu [Karlsruhe Institute of Technology, Institute for Conveying Technology and Logistics, Karlsruhe (Germany); Eilert, Dirk; Kubaschewski, Martin; Madzharov, Vladimir [Karlsruhe Institute of Technology, Institute for Conveying Technology and Logistics, Karlsruhe (Germany); Tian Kuo [Karlsruhe Institute of Technology, Institute for Neutron Physics and Reactor Technology, Karlsruhe (Germany)

    2012-08-15

    Highlights: ► In a first approach, the layout and arrangement of the cells followed a predetermined plant layout. ► Disadvantages in technical logistics, reliability and remote handling were detected. ► Deliberation with the project teams opened space for improvements. ► The layout and arrangement of the cells have been improved by simplification of the design. ► Speed and reliability have been increased significantly. - Abstract: The International Fusion Material Irradiation Facility (IFMIF) is designed to study and qualify structural and functional materials which shall be used in future fusion nuclear power plants. During the current engineering validation and engineering design activities (EVEDA) phase, the development of an optimized layout and arrangement of the cells (Access Cell, Test Cell, and Test Module Handling Cells) is of major interest. After defining different functions for the individual cells, such as large-scale and fine-scale disassembly of test modules, a first layout was developed. This design followed requirements such as having a minimum of carrier changes to avoid sources of failure. On the other hand, a compact arrangement of cells was required due to restrictions from the plant layout; a series of changes of transfer direction and different crane systems were the consequence. Constructive discussion with the project teams led to the conclusion that, for reasons of reliability and speed, the layout and arrangement of the cells come first, and the plant layout then follows. The opportunity for major improvements was taken, and the result was a simplified design with a strongly reduced number of functional elements and increased reliability and speed.

  6. Material and design considerations of FBGA reliability performance

    International Nuclear Information System (INIS)

    Lee, Teck Kheng; Ng, T.C.; Chai, Y.M.

    2004-01-01

    FBGA package reliability is usually assessed through the conventional approaches of die attach and mold compound material optimization. However, with the rapid changes and fast-moving pace of electronic packaging and the introduction of new soldermask and core materials, substrate design has also become a critical factor in determining overall package reliability. The purpose of this paper is to understand the impact of the design and soldermask material of a rigid substrate on overall package reliability. Three different soldermask patterns with a matrix of different die attach, mold compound, and soldermask materials are assessed using the moisture sensitivity test (MST). Package reliability is also assessed through the use of temperature cycling (T/C) at conditions 'B' and 'C'. For material optimization, three different mold compounds and die attach materials are used. Material adhesion between the different die attach materials and soldermask materials is obtained through die shear performed at various temperatures and preset moisture conditions. A study correlating the different packaging material properties and their relative adhesion strengths with overall package reliability in terms of both MST and T/C performance was performed. Soldermask design under the die pads was found to affect package reliability. For example, locating vias at the edge of the die is not desirable because the vias act as initiation points for delamination and moisture-induced failure. Through die shear testing, soldermask B demonstrated higher adhesion properties than soldermask A across several packaging materials and enhanced the overall package reliability in terms of both MST and T/C performance. Both MST JEDEC level 1 and T/C of 'B' and 'C' at 1000 cycles have been achieved through design and package material optimization

  7. Identification of Influential Points in a Linear Regression Model

    Directory of Open Access Journals (Sweden)

    Jan Grosz

    2011-03-01

    Full Text Available The article deals with the detection and identification of influential points in the linear regression model. Three methods for the detection of outliers and leverage points are described. These procedures can also be used for one-sample (independent) datasets. The paper also briefly describes theoretical aspects of several robust methods. Robust statistics is a powerful tool for increasing the reliability and accuracy of statistical modelling and data analysis. A simulation model of simple linear regression is presented.
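    Two standard diagnostics of the kind this article surveys are leverage (hat values) and internally studentized residuals; the sketch below is a generic illustration for simple linear regression (function and variable names are ours, not the article's):

```python
import numpy as np

def influence_measures(x, y):
    """Leverage (hat values) and internally studentized residuals for
    simple linear regression, two standard influence diagnostics."""
    X = np.column_stack([np.ones_like(x), x])        # design matrix with intercept
    H = X @ np.linalg.inv(X.T @ X) @ X.T             # hat matrix
    leverage = np.diag(H)
    resid = y - H @ y                                # ordinary residuals
    p = X.shape[1]
    s2 = resid @ resid / (len(x) - p)                # residual variance estimate
    studentized = resid / np.sqrt(s2 * (1.0 - leverage))
    return leverage, studentized

# A point far out in x dominates the fit and gets high leverage.
x = np.array([1.0, 2.0, 3.0, 4.0, 20.0])
y = np.array([1.1, 2.0, 2.9, 4.2, 19.5])
lev, stu = influence_measures(x, y)
```

The hat values always sum to the number of fitted parameters, so a single value near 1 (here the point at x = 20) signals a point that largely determines its own fitted value.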

  8. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... This paper presents a reliability analysis of such a system using reliability ... Keywords-compressor system, reliability, reliability block diagram, RBD .... the same structure has been kept with the three subsystems: air flow, oil flow and .... and Safety in Engineering Design", Springer, 2009. [3] P. O'Connor ...

  9. Data Applicability of Heritage and New Hardware For Launch Vehicle Reliability Models

    Science.gov (United States)

    Al Hassan, Mohammad; Novack, Steven

    2015-01-01

    Bayesian reliability requires the development of a prior distribution to represent degree of belief about the value of a parameter (such as a component's failure rate) before system specific data become available from testing or operations. Generic failure data are often provided in reliability databases as point estimates (mean or median). A component's failure rate is considered a random variable where all possible values are represented by a probability distribution. The applicability of the generic data source is a significant source of uncertainty that affects the spread of the distribution. This presentation discusses heuristic guidelines for quantifying uncertainty due to generic data applicability when developing prior distributions mainly from reliability predictions.
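    One common heuristic of the kind discussed: when a database gives only a median failure rate, a lognormal prior can be built by pairing the median with an "error factor" expressing applicability uncertainty. The sketch below uses the usual PRA convention that the error factor is the ratio of the 95th percentile to the median; this convention and all numbers are illustrative assumptions, not taken from the presentation.

```python
import math

def lognormal_prior(median, error_factor):
    """Lognormal prior parameters from a generic point estimate.
    Assumes the PRA convention EF = (95th percentile) / median,
    so sigma = ln(EF) / z_0.95 with z_0.95 = 1.645."""
    mu = math.log(median)
    sigma = math.log(error_factor) / 1.645
    return mu, sigma

# A database median failure rate of 1e-5/h with an error factor of 10
# (expressing low confidence in applicability) gives a wide prior:
mu, sigma = lognormal_prior(median=1e-5, error_factor=10.0)
p95 = math.exp(mu + 1.645 * sigma)    # 95th percentile = median * EF = 1e-4
```

Lower applicability of the generic source is then expressed simply as a larger error factor, i.e. a wider prior.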

  10. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  11. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  12. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including: quality and reliability; reliability data; the importance of reliability engineering; reliability measures; the Poisson process, with goodness-of-fit tests and the Poisson arrival model; reliability estimation, e.g. for the exponential distribution; reliability of systems; availability; preventive maintenance, such as replacement policies, minimal repair policy, shock models, spares, group maintenance and periodic inspection; analysis of common cause failures; and a model for the analysis of repair effects.
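    For the "reliability of systems" topic listed above, the two elementary building blocks are series and parallel (redundant) structures; a minimal sketch:

```python
def series(reliabilities):
    """Series system: it works only if every component works."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel(reliabilities):
    """Parallel (redundant) system: it fails only if every component fails."""
    q = 1.0
    for ri in reliabilities:
        q *= 1.0 - ri
    return 1.0 - q

# Two components of reliability 0.9: series degrades, redundancy improves.
r_series = series([0.9, 0.9])      # about 0.81
r_parallel = parallel([0.9, 0.9])  # about 0.99
```

More complex systems are evaluated by composing these two rules over the reliability block diagram.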

  13. FINDING CUBOID-BASED BUILDING MODELS IN POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    W. Nguatem

    2012-07-01

    Full Text Available In this paper, we present an automatic approach for the derivation of 3D building models of level-of-detail 1 (LOD 1) from point clouds obtained from (dense) image matching or, for comparison only, from LIDAR. Our approach makes use of the predominance of vertical structures and orthogonal intersections in architectural scenes. After robustly determining the scene's vertical direction based on the 3D points, we use it as a constraint for a RANSAC-based search for vertical planes in the point cloud. The planes are further analyzed to segment reliable outlines of rectangular surfaces within these planes, which are connected to construct cuboid-based building models. We demonstrate that our approach is robust and effective over a range of real-world input data sets with varying point density, amount of noise, and outliers.
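    The vertical-direction constraint described here can be sketched as a RANSAC variant in which every candidate plane is forced to contain the up direction, so only two sampled points are needed per hypothesis. The following is an illustrative reconstruction, not the authors' code; parameter names and tolerances are assumptions.

```python
import random
import numpy as np

def ransac_vertical_plane(points, up, iters=500, tol=0.05):
    """RANSAC search constrained to vertical planes: each candidate plane
    is built from two sampled points plus the vertical direction `up`,
    so its normal is perpendicular to `up` by construction."""
    best = (None, 0.0, 0)                            # (normal, d, inlier count)
    for _ in range(iters):
        i, j = random.sample(range(len(points)), 2)
        n = np.cross(points[j] - points[i], up)      # normal perpendicular to up
        norm = np.linalg.norm(n)
        if norm < 1e-9:                              # degenerate sample, skip
            continue
        n = n / norm
        d = float(n @ points[i])                     # plane equation: n . x = d
        count = int(np.sum(np.abs(points @ n - d) < tol))
        if count > best[2]:
            best = (n, d, count)
    return best

# Synthetic check: a planar "wall" at x = 0 plus scattered noise points.
random.seed(0)
np.random.seed(0)
wall = np.column_stack([np.zeros(200), np.random.rand(200) * 10, np.random.rand(200) * 3])
noise = np.random.rand(50, 3) * 10
pts = np.vstack([wall, noise])
normal, d, count = ransac_vertical_plane(pts, np.array([0.0, 0.0, 1.0]))
```

Because the plane is constrained to contain `up`, the hypothesis space is one dimension smaller than in generic 3-point plane RANSAC, which is exactly what makes the constrained search cheaper and more robust.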

  14. LIF: A new Kriging based learning function and its application to structural reliability analysis

    International Nuclear Information System (INIS)

    Sun, Zhili; Wang, Jian; Li, Rui; Tong, Cao

    2017-01-01

    The main task of structural reliability analysis is to estimate the failure probability of a studied structure, taking the randomness of input variables into account. To represent structural behavior realistically, numerical models become increasingly complicated and time-consuming, which increases the difficulty of reliability analysis. Therefore, sequential strategies of design of experiments (DoE) have been raised. In this research, a new learning function, named the least improvement function (LIF), is proposed to update the DoE of the Kriging-based reliability analysis method. LIF quantifies how much the accuracy of the estimated failure probability will be improved by adding a given point to the DoE. It takes both the statistical information provided by the Kriging model and the joint probability density function of the input variables into account, which is the most important difference from existing learning functions. The maximum point of LIF is approximately determined with Markov chain Monte Carlo (MCMC) simulation. A new reliability analysis method is developed based on the Kriging model, in which LIF, MCMC and Monte Carlo (MC) simulation are employed. Three examples are analyzed. Results show that LIF and the new method proposed in this research are very efficient when dealing with nonlinear performance functions, small failure probabilities, complicated limit states and engineering problems of high dimension. - Highlights: • Least improvement function (LIF) is proposed for structural reliability analysis. • LIF takes both Kriging-based statistical information and the joint PDF into account. • A reliability analysis method is constructed based on Kriging, MCS and LIF.
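    For context, the quantity all such methods target is the failure probability P[g(X) ≤ 0] for a limit-state function g. A crude Monte Carlo baseline (the expensive approach that Kriging-based active learning with a learning function such as LIF aims to match using far fewer g-evaluations) can be sketched as follows; the toy limit state is our own example, not one of the paper's:

```python
import numpy as np

def mc_failure_probability(g, sample, n=100_000, seed=0):
    """Crude Monte Carlo estimate of the failure probability P[g(X) <= 0]."""
    rng = np.random.default_rng(seed)
    x = sample(rng, n)
    return float(np.mean(g(x) <= 0.0))

# Toy limit state g(x1, x2) = x1 + x2 + 3 with independent standard normal
# inputs; the exact answer is Phi(-3/sqrt(2)), about 0.017.
g = lambda x: x[:, 0] + x[:, 1] + 3.0
sample = lambda rng, n: rng.standard_normal((n, 2))
pf = mc_failure_probability(g, sample)
```

Each of the 100,000 samples above costs one g-evaluation, which is exactly why a surrogate with a well-chosen learning function pays off when g is an expensive numerical model.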

  15. Improving inspection reliability through operator selection and training

    International Nuclear Information System (INIS)

    McGrath, Bernard; Carter, Luke

    2013-01-01

    A number of years ago the UK's Health and Safety Executive sponsored a series of three PANI projects investigating the application of manual ultrasonics, which endeavoured to establish the necessary steps to ensure that a reliable inspection is performed. The results of the three projects were each reported separately on completion and also presented at a number of international conferences. This paper summarises the results of these projects from the point of view of operator performance. The correlation of operator ultrasonic performance with the results of aptitude tests is presented, along with observations on the impact of the training and qualifications of the operators. The results lead to conclusions on how the selection and training of operators could be modified to improve the reliability of inspections.

  16. Reliable avionics design for deep space

    Science.gov (United States)

    Johnson, Stephen B.

    The technical and organizational problems posed by the Space Exploration Initiative (SEI) are discussed, and some possible solutions are examined. It is pointed out that SEI poses a whole new set of challenging problems in the design of reliable systems. These missions and their corresponding systems are far more complex than current systems. The initiative requires a set of vehicles and systems which must have very high levels of autonomy, reliability, and operability for long periods of time. It is emphasized that to achieve these goals in the face of great complexity, new technologies and organizational techniques will be necessary. It is noted that the key to a good design is good people. Not only must good people be found, but they must be placed in positions appropriate to their skills. It is argued that the atomistic and autocratic paradigm of vertical organizations must be replaced with more team-oriented and democratic structures.

  17. Test-retest reliability of the Middlesex Assessment of Mental State (MEAMS): a preliminary investigation in people with probable dementia.

    Science.gov (United States)

    Powell, T; Brooker, D J; Papadopolous, A

    1993-05-01

    Relative and absolute test-retest reliability of the MEAMS was examined in 12 subjects with probable dementia and 12 matched controls. Relative reliability was good. Measures of absolute reliability showed scores changing by up to 3 points over an interval of a week. A version effect was found to be in evidence.

  18. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  19. A study of operational and testing reliability in software reliability analysis

    International Nuclear Information System (INIS)

    Yang, B.; Xie, M.

    2000-01-01

    Software reliability is an important aspect of any complex equipment today. Software reliability is usually estimated based on reliability models such as nonhomogeneous Poisson process (NHPP) models. Software systems improve during the testing phase, whereas they normally do not change during the operational phase. Depending on whether the reliability is to be predicted for the testing phase or the operational phase, different measures should be used. In this paper, two different reliability concepts, namely the operational reliability and the testing reliability, are clarified and studied in detail. These concepts have been mixed up or even misused in some of the existing literature. Using different reliability concepts leads to different estimated reliability values and, in turn, to different reliability-based decisions. The difference between the estimated reliabilities is studied and the effect on the optimal release time is investigated.

  20. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)

  1. [Reliability and validity of the Braden Scale for predicting pressure sore risk].

    Science.gov (United States)

    Boes, C

    2000-12-01

    For more accurate and objective pressure sore risk assessment, various risk assessment tools have been developed, mainly in the USA and Great Britain. The Braden Scale for Predicting Pressure Sore Risk is one such example. By means of an analysis of the German- and English-language literature on the Braden Scale, the scientific quality criteria of reliability and validity are traced, and consequences for the application of the scale in Germany are demonstrated. Analysis of 4 reliability studies shows an exclusive focus on interrater reliability. Further, even though the 19 validity studies examined cover many different settings, they are limited to the criteria of sensitivity and specificity (accuracy). Reported sensitivity and specificity levels range from 35% to 100%, and the recommended cut-off points range from 10 to 19 points. The studies prove not to be comparable with each other. Furthermore, distortions can be found in these studies which affect the accuracy of the scale. The results of the analysis presented here show insufficient proof of reliability and validity in the American studies. In Germany, the Braden Scale has not yet been tested against scientific criteria. Such testing is needed before the scale is used in different German settings. In the course of such testing, the construction and study procedures of the American studies can serve as a basis, as can the problems identified in the analysis presented here.
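    For reference, the sensitivity and specificity of a cut-off on a risk scale such as the Braden Scale (where a lower score means higher risk) are computed as below; the data are invented for illustration and are not from the reviewed studies.

```python
def sens_spec(scores, has_sore, cutoff):
    """Sensitivity and specificity of a risk-scale cut-off where a score
    at or below `cutoff` flags the patient as at risk (on the Braden
    Scale a lower total score means higher risk)."""
    tp = sum(1 for s, d in zip(scores, has_sore) if s <= cutoff and d)
    fn = sum(1 for s, d in zip(scores, has_sore) if s > cutoff and d)
    tn = sum(1 for s, d in zip(scores, has_sore) if s > cutoff and not d)
    fp = sum(1 for s, d in zip(scores, has_sore) if s <= cutoff and not d)
    return tp / (tp + fn), tn / (tn + fp)

scores   = [12, 14, 17, 19, 21, 23]                  # invented Braden totals
has_sore = [True, True, False, True, False, False]   # invented outcomes
sens, spec = sens_spec(scores, has_sore, cutoff=18)
```

Moving the cut-off trades the two quantities against each other, which is why studies using different cut-offs (here, anywhere from 10 to 19 points) are hard to compare.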

  2. Identifying and characterizing major emission point sources as a basis for geospatial distribution of mercury emissions inventories

    Science.gov (United States)

    Steenhuisen, Frits; Wilson, Simon J.

    2015-07-01

    Mercury is a global pollutant that poses threats to ecosystem and human health. Due to its global transport, mercury contamination is found in regions of the Earth that are remote from major emissions areas, including the Polar regions. Global anthropogenic emission inventories identify important sectors and industries responsible for emissions at a national level; however, to be useful for air transport modelling, more precise information on the locations of emission is required. This paper describes the methodology applied, and the results of work that was conducted to assign anthropogenic mercury emissions to point sources as part of geospatial mapping of the 2010 global anthropogenic mercury emissions inventory prepared by AMAP/UNEP. Major point-source emission sectors addressed in this work account for about 850 tonnes of the emissions included in the 2010 inventory. This work allocated more than 90% of these emissions to some 4600 identified point source locations, including significantly more point source locations in Africa, Asia, Australia and South America than had been identified during previous work to geospatially-distribute the 2005 global inventory. The results demonstrate the utility and the limitations of using existing, mainly public domain resources to accomplish this work. Assumptions necessary to make use of selected online resources are discussed, as are artefacts that can arise when these assumptions are applied to assign (national-sector) emissions estimates to point sources in various countries and regions. Notwithstanding the limitations of the available information, the value of this procedure over alternative methods commonly used to geo-spatially distribute emissions, such as use of 'proxy' datasets to represent emissions patterns, is illustrated. Improvements in information that would facilitate greater use of these methods in future work to assign emissions to point-sources are discussed. These include improvements to both national

  3. Business models & business cases for point-of-care testing

    NARCIS (Netherlands)

    Staring, A.J.; Meertens, L. O.; Sikkel, N.

    2016-01-01

    Point-Of-Care Testing (POCT) enables clinical tests at or near the patient, with test results that are available instantly or in a very short time frame, to assist caregivers with immediate diagnosis and/or clinical intervention. The goal of POCT is to provide accurate, reliable, fast, and

  4. The Reliability of Assessing Radiographic Healing of Osteochondritis Dissecans of the Knee.

    Science.gov (United States)

    Wall, Eric J; Milewski, Matthew D; Carey, James L; Shea, Kevin G; Ganley, Theodore J; Polousky, John D; Grimm, Nathan L; Eismann, Emily A; Jacobs, Jake C; Murnaghan, Lucas; Nissen, Carl W; Myer, Gregory D; Weiss, Jennifer; Edmonds, Eric W; Anderson, Allen F; Lyon, Roger M; Heyworth, Benton E; Fabricant, Peter D; Zbojniewicz, Andy

    2017-05-01

    The reliability of assessing healing on plain radiographs has not been well-established for knee osteochondritis dissecans (OCD). To determine the inter- and intrarater reliability of specific radiographic criteria in judging healing of femoral condyle OCD. Cohort study (Diagnosis); Level of evidence, 3. Ten orthopedic sports surgeons rated the radiographic healing of 30 knee OCD lesions at 2 time points, a minimum of 1 month apart. First, raters compared pretreatment and 2-year follow-up radiographs on "overall healing" and on 5 subfeatures of healing, including OCD boundary, sclerosis, size, shape, and ossification using a continuous slider scale. "Overall healing" was also rated using a 7-tier ordinal scale. Raters then compared the same 30 pretreatment knee radiographs in a stepwise progression to the 2-, 4-, 7-, 12-, and 24-month follow-up radiographs on "overall healing" using a continuous slider scale. Interrater and intrarater reliability were assessed using intraclass correlations (ICC) derived from a 2-way mixed effects analysis of variance for absolute agreement. Overall healing of the OCD lesions from pretreatment to 2-year follow-up radiographs was rated with excellent interrater reliability (ICC = 0.94) and intrarater reliability (ICC = 0.84) when using a continuous scale. The reliability of the 5 subfeatures of healing was also excellent (interrater ICCs of 0.87-0.89; intrarater ICCs of 0.74-0.84). The 7-tier ordinal scale rating of overall healing had lower interrater (ICC = 0.61) and intrarater (ICC = 0.68) reliability. The overall healing of OCD lesions at the 5 time points up to 24 months had interrater ICCs of 0.81-0.88 and intrarater ICCs of 0.65-0.70. Interrater reliability was excellent when judging the overall healing of OCD femoral condyle lesions on radiographs as well as on 5 specific features of healing on 2-year follow-up radiographs. 
Continuous scale rating of OCD radiographic healing yielded higher reliability than the ordinal scale rating.
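    The ICC variant implied by "2-way mixed effects analysis of variance for absolute agreement" can be illustrated with the standard single-measure formula, computed from the ANOVA mean squares; this is a generic textbook sketch (the study's own software and data are not reproduced):

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way model, absolute agreement, single measurement.
    `scores` is a subjects-by-raters matrix."""
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)                  # per-subject means
    col_means = scores.mean(axis=0)                  # per-rater means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # subjects mean square
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # raters mean square
    sse = np.sum((scores - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                         # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Perfect agreement between two raters gives ICC = 1; a constant offset
# between raters is penalized because agreement is absolute.
perfect = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0], [4.0, 4.0]])
offset = perfect + np.array([0.0, 1.0])
```

Because absolute agreement counts a systematic rater offset as disagreement, the offset example scores below 1 even though the two raters rank every subject identically.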

  5. Towards higher reliability of CMS computing facilities

    International Nuclear Information System (INIS)

    Bagliesi, G; Bloom, K; Brew, C; Flix, J; Kreuzer, P; Sciabà, A

    2012-01-01

    The CMS experiment has adopted a computing system where resources are distributed worldwide across more than 50 sites. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established procedures to extensively test all relevant aspects of a site and its capability to sustain the various CMS computing workflows at the required scale. The Site Readiness monitoring infrastructure has been instrumental in understanding how the system as a whole was improving towards LHC operations, measuring the reliability of sites when running CMS activities, and providing sites with the information they need to troubleshoot any problem. This contribution reviews the complete automation of the Site Readiness program, with a description of the monitoring tools and their inclusion into the Site Status Board (SSB), the performance checks, the use of tools like HammerCloud, and the impact on improving the overall reliability of the Grid from the point of view of the CMS computing system. These results are used by CMS to select good sites for conducting workflows, in order to maximize workflow efficiencies. The performance of the sites against these tests during the first years of LHC running is reviewed as well.

  6. Study of structural reliability of existing concrete structures

    Science.gov (United States)

    Druķis, P.; Gaile, L.; Valtere, K.; Pakrastiņš, L.; Goremikins, V.

    2017-10-01

    Structural reliability of buildings became an important issue after the collapse of a shopping center in Riga on 21.11.2013 caused the death of 54 people. The reliability of a building is the practice of designing, constructing, operating, maintaining and removing buildings in ways that maintain health and ward off injuries or death due to use of the building. Evaluation and improvement of existing buildings is becoming more and more important. For a large part of existing buildings, the design life has been reached or will be reached in the near future. The structures of these buildings need to be reassessed in order to find out whether the safety requirements are met. The safety requirements provided by the Eurocodes are a starting point for the assessment of safety. However, it would be uneconomical to require all existing buildings and structures to comply fully with these new codes and the corresponding safety levels; therefore the assessment of existing buildings differs with each design situation. This case study describes a simple and practical procedure for determining the minimal reliability index β of existing concrete structures designed to codes other than the Eurocodes, and allows the actual reliability level of different structural elements of existing buildings under design load to be reassessed.
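    The reliability index β mentioned here is linked to the probability of failure by β = -Φ⁻¹(p_f), where Φ is the standard normal CDF; for instance, the EN 1990 target of β = 3.8 for reliability class RC2 over a 50-year reference period corresponds to p_f of roughly 7.2e-5. A minimal sketch:

```python
from statistics import NormalDist

def beta_from_pf(pf):
    """Reliability index from failure probability: beta = -Phi^{-1}(pf)."""
    return -NormalDist().inv_cdf(pf)

def pf_from_beta(beta):
    """Failure probability from reliability index: pf = Phi(-beta)."""
    return NormalDist().cdf(-beta)

# EN 1990 target for reliability class RC2, 50-year reference period:
beta_target = 3.8
pf_target = pf_from_beta(beta_target)   # roughly 7.2e-5
```

Reassessing an existing structure then amounts to computing its actual β and comparing it against a (possibly reduced) target appropriate to the remaining working life.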

  7. Considerations concerning the reliability of reactor safety equipment

    International Nuclear Information System (INIS)

    Furet, J.; Guyot, Ch.

    1967-01-01

    A review is made of the circumstances which favor a good collection of maintenance data at the C.E.A. The large amount of data to be treated has made necessary the use of a computer for automatically analyzing the results collected. Here, only particular aspects of reliability from the point of view of the electronics used for nuclear reactor control will be dealt with: safe and unsafe failures; probability of survival (in the case of reactor safety); availability. The general diagrams of the safety assemblies which have been drawn up for two types of reactor (a power reactor and a low-power experimental reactor) are given. Results are presented of a reliability analysis applicable to the use of functional modular elements developed industrially in France. Improvement of this reliability appears to be fairly limited when pursued through increased redundancy; on the other hand, it is shown how it may be very markedly improved by the use of automatic tests, run at different frequencies, for detecting unsafe failures of the measurement sub-assemblies and of the logic sub-assemblies. Finally, examples are given to show the effect of complexity and of the use of different technologies on the reliability of reactor safety equipment. (authors) [fr

  8. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in analysis of very...

  9. Point-based POMDP Risk Based Inspection of Offshore Wind Substructures

    DEFF Research Database (Denmark)

    Morato, Pablo G.; Mai, Quang A.; Rigo, Philippe

    2018-01-01

    This article presents a novel methodology to select the optimal maintenance strategy of an offshore wind structural component, providing a flexible and reliable support to decision-making and balancing inspection, repair and failure costs. The procedure to create a “Point-Based” Partially...

  10. Material and design considerations of FBGA reliability performance

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Teck Kheng; Ng, T.C.; Chai, Y.M

    2004-09-01

    FBGA package reliability is usually assessed through the conventional approaches of die attach and mold compound material optimization. However, with the rapid changes and fast-moving pace of electronic packaging and the introduction of new soldermask and core materials, substrate design has also become a critical factor in determining overall package reliability. The purpose of this paper is to understand the impact of the design and soldermask material of a rigid substrate on overall package reliability. Three different soldermask patterns with a matrix of different die attach, mold compound, and soldermask materials are assessed using the moisture sensitivity test (MST). Package reliability is also assessed through the use of temperature cycling (T/C) at conditions 'B' and 'C'. For material optimization, three different mold compounds and die attach materials are used. Material adhesion between the different die attach materials and soldermask materials is obtained through die shear performed at various temperatures and preset moisture conditions. A study correlating the different packaging material properties and their relative adhesion strengths with overall package reliability in terms of both MST and T/C performance was performed. Soldermask design under the die pads was found to affect package reliability. For example, locating vias at the edge of the die is not desirable because the vias act as initiation points for delamination and moisture-induced failure. Through die shear testing, soldermask B demonstrated higher adhesion properties than soldermask A across several packaging materials and enhanced the overall package reliability in terms of both MST and T/C performance. Both MST JEDEC level 1 and T/C of 'B' and 'C' at 1000 cycles have been achieved through design and package material optimization.

  11. Reliability of an instrument to determine lower limb comfort in professional football

    Directory of Open Access Journals (Sweden)

    Michael Kinchington

    2010-06-01

    Full Text Available Michael Kinchington1, Kevin Ball1, Geraldine Naughton2; 1School of Human Movement, Recreation and Performance, Victoria University, Melbourne, Australia; 2The Centre of Physical Activity Across the Lifespan (COPAAL), Australian Catholic University, Victoria, Australia. Aims and Objectives: This study extends previous work in the field of injury awareness using a novel lower limb comfort index (LLCI), which was developed to assess comfort in professional football. Participants rated comfort for designated anatomical segments of the lower limb utilizing a seven-point Likert scale. The aims of the study were (i) to assess the reliability of the LLCI in a competitive football environment (Australian Rules and Rugby League), and (ii) to assess whether LLCI measurements were responsive to changes in lower limb comfort over time. Methods and Results: The reliability of the LLCI was observed in two professional football environments: Training Week (mean difference 0.1 point, intra-class correlation coefficient, ICC 0.99) for n = 41 participants; and Match Day (mean difference 0.2 points, ICC 0.97) for n = 22 players. Measurements of lower limb comfort were responsive to changes in comfort over time. Within-player differences were not significant for periods 0–8 hrs (P > 0.05) but, generally, significant for time periods 0–24 hrs (P < 0.05), and significant between 24–96 hrs (P < 0.01). The results indicate that the LLCI was reliable when tested for repeated measures and indicate how the index measures lower limb comfort changes over time. Conclusion: This study shows that the use of a lower limb comfort index, when used in a competitive football environment, is both reliable and responsive to change during both a training week and under match day conditions. Keywords: lower limb comfort, musculoskeletal, football, injury

  12. A Reliable Method to Measure Lip Height Using Photogrammetry in Unilateral Cleft Lip Patients.

    Science.gov (United States)

    van der Zeeuw, Frederique; Murabit, Amera; Volcano, Johnny; Torensma, Bart; Patel, Brijesh; Hay, Norman; Thorburn, Guy; Morris, Paul; Sommerlad, Brian; Gnarra, Maria; van der Horst, Chantal; Kangesu, Loshan

    2015-09-01

    There is still no reliable tool to determine the outcome of the repaired unilateral cleft lip (UCL). The aim of this study was therefore to develop an accurate, reliable tool to measure vertical lip height from photographs. The authors measured the vertical height of the cutaneous and vermilion parts of the lip in 72 anterior-posterior view photographs of 17 patients with repairs to a UCL. Points on the lip's white roll and vermilion were marked on both the cleft and the noncleft sides in each image. Two new concepts were tested. First, photographs were standardized using the horizontal (medial to lateral) eye fissure width (EFW) for calibration. Second, the authors tested the interpupillary line (IPL) and the alar base line (ABL) for their reliability as horizontal lines of reference. Measurements were taken by 2 independent researchers, at 2 different time points each. Overall, 2304 data points were obtained and analyzed. Results showed that the method was very effective in comparing the height of the lip on the cleft side with the noncleft side. When using the IPL, inter- and intra-rater reliability was 0.99 to 1.0; with the ABL it varied from 0.91 to 0.99, with one exception at 0.84. The IPL was easier to define, because in some subjects the overhanging nasal tip obscured the alar base, and it gave more consistent measurements, possibly because the reconstructed alar base was sometimes indistinct. However, measurements from the IPL can only give the percentage difference between the left and right sides of the lip, whereas those from the ABL can also give exact measurements. Patient examples are given that show how the measurements correlate with clinical assessment. The authors propose this method of photogrammetry, with the innovative use of the IPL as a reliable horizontal plane and the use of the EFW for calibration, as a useful and reliable tool to assess the outcome of UCL repair.

  13. Systems Reliability Framework for Surface Water Sustainability and Risk Management

    Science.gov (United States)

    Myers, J. R.; Yeghiazarian, L.

    2016-12-01

    With microbial contamination posing a serious threat to the availability of clean water across the world, it is necessary to develop a framework that evaluates the safety and sustainability of water systems with respect to non-point source fecal microbial contamination. The concept of water safety is closely related to the concept of failure in reliability theory. In water quality problems, the event of failure can be defined as the concentration of microbial contamination exceeding a certain standard for usability of water. It is pertinent in watershed management to know the likelihood of such an event of failure occurring at a particular point in space and time. Microbial fate and transport are driven by environmental processes taking place in complex, multi-component, interdependent environmental systems that are dynamic and spatially heterogeneous, which means these processes and therefore their influences upon microbial transport must be considered stochastic and variable through space and time. A physics-based stochastic model of microbial dynamics is presented that propagates uncertainty using a unique sampling method based on artificial neural networks to produce a correlation between watershed characteristics and spatial-temporal probabilistic patterns of microbial contamination. These results are used to address the question of water safety through several sustainability metrics: reliability, vulnerability, resilience and a composite sustainability index. System reliability is described uniquely through the temporal evolution of risk along watershed points or pathways. Probabilistic resilience describes how long the system is above a certain probability of failure, and the vulnerability metric describes how the temporal evolution of risk changes throughout a hierarchy of failure levels. Additionally, our approach allows for the identification of contributions in microbial contamination and uncertainty from specific pathways and sources. We expect that this
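    The sustainability metrics named above (reliability, resilience, vulnerability) can be illustrated on a simulated concentration time series. This is a generic sketch in the spirit of Hashimoto-style water-resources metrics, with made-up numbers, not the authors' watershed model:

```python
import numpy as np

rng = np.random.default_rng(0)
conc = rng.lognormal(mean=1.0, sigma=0.8, size=1000)  # microbial concentration
standard = 10.0                                       # usability threshold

fail = conc > standard                 # failure = standard exceeded

# reliability: fraction of time steps in the satisfactory state
reliability = 1.0 - fail.mean()

# resilience: probability that a failure step is followed by a safe step
recoveries = fail[:-1] & ~fail[1:]
resilience = recoveries.sum() / max(fail[:-1].sum(), 1)

# vulnerability: mean exceedance magnitude during failure
vulnerability = (conc[fail] - standard).mean() if fail.any() else 0.0

print(reliability, resilience, vulnerability)
```

    A composite sustainability index is often formed from products or weighted combinations of these three quantities.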

  14. Fix These First: How the World's Leading Companies Point the Way Toward High Reliability in the Military Health System.

    Science.gov (United States)

    Beauvais, Brad; Richter, Jason; Brezinski, Paul

    The 2014 Military Health System Review calls for healthcare system leaders to implement effective strategies used by other high-performing organizations. The authors state, "the [military health system] MHS can create an optimal healthcare environment that focuses on continuous quality improvement where every patient receives safe, high-quality care at all times" (Military Health System, 2014, p. 1). Although aspirational, the document does not specify how a highly reliable health system is developed or what systemic factors are necessary to sustain highly reliable performance. Our work seeks to address this gap and provide guidance to MHS leaders regarding how high-performing organizations develop exceptional levels of performance. The authors' expectation is that military medicine will draw on these lessons to enhance leadership, develop exceptional organizational cultures, onboard and engage employees, build customer loyalty, and improve quality of care. Leaders from other segments of the healthcare field likely will find this study valuable given the size of the military healthcare system (9.6 million beneficiaries), the United States' steady progression toward population-based health, and the increasing need for highly reliable systems and performance.

  15. Constructing the Best Reliability Data for the Job

    Science.gov (United States)

    Kleinhammer, R. K.; Kahn, J. C.

    2014-01-01

    Modern business and technical decisions are based on the results of analyses. When considering assessments using "reliability data", the concern is how long a system will continue to operate as designed. Generally, the results are only as good as the data used. Ideally, a large set of pass/fail tests or observations to estimate the probability of failure of the item under test would produce the best data. However, this is a costly endeavor if used for every analysis and design. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, we attempt to develop the "best" or composite analog data to support our assessments. One method used incorporates processes for reviewing existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component and can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. Data that is more representative of reality and more project specific would provide a more accurate analysis and, hopefully, a better final decision.

  16. Constructing the "Best" Reliability Data for the Job

    Science.gov (United States)

    DeMott, D. L.; Kleinhammer, R. K.

    2014-01-01

    Modern business and technical decisions are based on the results of analyses. When considering assessments using "reliability data", the concern is how long a system will continue to operate as designed. Generally, the results are only as good as the data used. Ideally, a large set of pass/fail tests or observations to estimate the probability of failure of the item under test would produce the best data. However, this is a costly endeavor if used for every analysis and design. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, we attempt to develop the "best" or composite analog data to support our assessments. One method used incorporates processes for reviewing existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component and can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. Data that is more representative of reality and more project specific would provide a more accurate analysis and, hopefully, a better final decision.
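    The composite-analog idea described in this record can be sketched as a similarity-weighted combination of generic failure rates. The weighting scheme and the numbers are illustrative assumptions, not the authors' procedure:

```python
# Illustrative composite analog failure rate from generic sources.
sources = [
    # (failure rate per 1e6 h, similarity weight 0..1)
    (12.0, 0.9),   # same component family, similar environment
    (25.0, 0.5),   # similar component, harsher environment
    (8.0,  0.3),   # looser analog from a different industry
]

def composite_rate(sources):
    """Similarity-weighted average of generic failure rates."""
    total_weight = sum(w for _, w in sources)
    return sum(rate * w for rate, w in sources) / total_weight

lam = composite_rate(sources)   # composite failure rate per 1e6 hours
print(round(lam, 2))
```

    In practice the weights would encode the dissimilarities the record mentions (equipment description, environment, quality, failure modes), and the composite would feed early risk or reliability trade studies.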

  17. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from a safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  18. 3-D OBJECT RECOGNITION FROM POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    W. Smith

    2012-09-01

    roofs. Several case studies have been conducted using a variety of point densities, terrain types and building densities. The results have been encouraging. More work is required for better processing of, for example, forested areas, buildings with sides that are not at right angles or are not straight, and single trees that impinge on buildings. Further work may also be required to ensure that the buildings extracted are of fully cartographic quality. A first version will be included in production software later in 2011. In addition to the standard geospatial applications and the UAV navigation, the results have a further advantage: since LiDAR data tends to be accurately georeferenced, the building models extracted can be used to refine image metadata whenever the same buildings appear in imagery for which the GPS/IMU values are poorer than those for the LiDAR.

  19. 3-D Object Recognition from Point Cloud Data

    Science.gov (United States)

    Smith, W.; Walker, A. S.; Zhang, B.

    2011-09-01

    studies have been conducted using a variety of point densities, terrain types and building densities. The results have been encouraging. More work is required for better processing of, for example, forested areas, buildings with sides that are not at right angles or are not straight, and single trees that impinge on buildings. Further work may also be required to ensure that the buildings extracted are of fully cartographic quality. A first version will be included in production software later in 2011. In addition to the standard geospatial applications and the UAV navigation, the results have a further advantage: since LiDAR data tends to be accurately georeferenced, the building models extracted can be used to refine image metadata whenever the same buildings appear in imagery for which the GPS/IMU values are poorer than those for the LiDAR.

  20. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm as a non-trivial extension of the Cloud is considered, and the reliability of networks of smart devices is discussed. Combining the reliability...... requirements of grid and cloud paradigms with the reliability requirements of networks of sensors and actuators it follows that designing a reliable Fog computing platform is feasible....

  1. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  2. Reliability and validity of the de Morton Mobility Index in individuals with sub-acute stroke.

    Science.gov (United States)

    Braun, Tobias; Marks, Detlef; Thiel, Christian; Grüneberg, Christian

    2018-02-04

    To establish the validity and reliability of the de Morton Mobility Index (DEMMI) in patients with sub-acute stroke. This cross-sectional study was performed in a neurological rehabilitation hospital. We assessed unidimensionality, construct validity, internal consistency reliability, inter-rater reliability, minimal detectable change and possible floor and ceiling effects of the DEMMI in adult patients with sub-acute stroke. The study included a total sample of 121 patients with sub-acute stroke. We analysed validity (n = 109) and reliability (n = 51) in two sub-samples. Rasch analysis indicated unidimensionality with an overall fit to the model (chi-square = 12.37, p = 0.577). All hypotheses on construct validity were confirmed. Internal consistency reliability (Cronbach's alpha = 0.94) and inter-rater reliability (intraclass correlation coefficient = 0.95; 95% confidence interval: 0.92-0.97) were excellent. The minimal detectable change with 90% confidence was 13 points. No floor or ceiling effects were evident. These results indicate unidimensionality, sufficient internal consistency reliability, inter-rater reliability, and construct validity of the DEMMI in patients with a sub-acute stroke. Advantages of the DEMMI in clinical application are the short administration time, no need for special equipment and interval level data. The de Morton Mobility Index, therefore, may be a useful performance-based bedside test to measure mobility in individuals with a sub-acute stroke across the whole mobility spectrum. Implications for Rehabilitation The de Morton Mobility Index (DEMMI) is a unidimensional measurement instrument of mobility in individuals with sub-acute stroke. The DEMMI has excellent internal consistency and inter-rater reliability, and sufficient construct validity. The minimal detectable change of the DEMMI with 90% confidence in stroke rehabilitation is 13 points. The lack of any floor or ceiling effects on hospital admission indicates
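    A minimal detectable change like the one reported above is conventionally derived from the reliability statistics via the standard error of measurement. A minimal sketch, assuming a hypothetical baseline standard deviation (the study's raw data are not reproduced here):

```python
import math

def mdc90(sd, icc):
    """Minimal detectable change at 90% confidence from a baseline SD and ICC."""
    sem = sd * math.sqrt(1.0 - icc)       # standard error of measurement
    return 1.645 * math.sqrt(2.0) * sem   # two measurements, 90% confidence

# Hypothetical SD of 24 DEMMI points with the reported ICC of 0.95:
print(round(mdc90(sd=24.0, icc=0.95), 1))
```

    With these assumed inputs the formula lands near the order of magnitude reported (13 points); the exact value depends on the sample's actual standard deviation.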

  3. Reliability of the Wii Balance Board in kayak.

    Science.gov (United States)

    Vando, Stefano; Laffaye, Guillaume; Masala, Daniele; Falese, Lavinia; Padulo, Johnny

    2015-01-01

    The seat of the kayaker represents the principal contact point for transmitting mechanical energy; we therefore investigated the reliability of Wii Balance Board measures in the kayak vs. on the ground. The Bland-Altman test showed a low systematic bias on the ground (2.85%) and in the kayak (-2.13%), respectively, while the intra-class correlation coefficient was 0.996. The Wii Balance Board is therefore useful for assessing postural sway in the kayak.
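    The Bland-Altman comparison used above can be sketched as follows, with made-up paired measurements rather than the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Systematic bias and 95% limits of agreement between paired measures."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)  # 95% limits of agreement
    return bias, (bias - half_width, bias + half_width)

ground = [101.2, 99.8, 102.5, 100.1, 98.9]   # hypothetical paired trials
kayak  = [100.6, 100.2, 101.9, 99.5, 99.3]
bias, (lo, hi) = bland_altman(ground, kayak)
print(round(bias, 2), round(lo, 2), round(hi, 2))
```

    A bias near zero with narrow limits of agreement supports interchangeability of the two measurement conditions.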

  4. Reliability analysis of mining equipment: A case study of a crushing plant at Jajarm Bauxite Mine in Iran

    International Nuclear Information System (INIS)

    Barabady, Javad; Kumar, Uday

    2008-01-01

    The performance of mining machines depends on the reliability of the equipment used, the operating environment, the maintenance efficiency, the operation process, the technical expertise of the miners, etc. As the size and complexity of mining equipment continue to increase, the implications of equipment failure become ever more critical. Therefore, reliability analysis is required to identify the bottlenecks in the system and to find the components or subsystems with low reliability for a given designed performance. It is important to select a suitable method for data collection as well as for reliability analysis. This paper presents a case study describing reliability and availability analysis of crushing plant number 3 at Jajarm Bauxite Mine in Iran. In this study, crushing plant number 3 is divided into six subsystems. The parameters of some probability distributions, such as the Weibull, Exponential, and Lognormal distributions, have been estimated by using ReliaSoft's Weibull++6 software. The results of the analysis show that the conveyer subsystem and secondary screen subsystem are critical from a reliability point of view, and the secondary crusher subsystem and conveyer subsystem are critical from an availability point of view. The study also shows that reliability analysis is very useful for deciding maintenance intervals.
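    The kind of distribution fitting described above can be sketched with a simple median-rank regression Weibull fit (synthetic failure times and plain least squares standing in for Weibull++6):

```python
import numpy as np

# Synthetic times between failures (hours) for one subsystem.
t = np.sort(np.array([12.0, 25.0, 31.0, 47.0, 60.0, 78.0, 95.0, 130.0]))
n = len(t)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank plotting positions

# Linearized Weibull CDF: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)
x, y = np.log(t), np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)
eta = np.exp(-intercept / beta)               # scale parameter

def reliability(time):
    """Weibull reliability (survival) function from the fitted parameters."""
    return np.exp(-(time / eta) ** beta)

print(round(beta, 2), round(eta, 1), round(float(reliability(50.0)), 3))
```

    The fitted shape parameter beta indicates whether failures are early-life (beta < 1), random (beta near 1) or wear-out (beta > 1), which in turn informs the choice of maintenance interval.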

  5. Reliability evaluation of a natural circulation system

    International Nuclear Information System (INIS)

    Jafari, Jalil; D'Auria, Francesco; Kazeminejad, Hossein; Davilu, Hadi

    2003-01-01

    This paper discusses a reliability study performed with reference to a passive thermohydraulic natural circulation (NC) system, named TTL-1. A methodology based on probabilistic techniques has been applied with the main purpose of optimizing the system design. The obtained results have been adopted to estimate the thermal-hydraulic reliability (TH-R) of the same system. A total of 29 relevant parameters (including nominal values and plausible ranges of variations) affecting the design and the NC performance of the TTL-1 loop are identified and a probability of occurrence is assigned for each value based on expert judgment. Following procedures established for the uncertainty evaluation of thermal-hydraulic system code results, 137 system configurations have been selected and each configuration has been analyzed via the Relap5 best-estimate code. The reference system configuration and the failure criteria derived from the 'mission' of the passive system are adopted for the evaluation of the system TH-R. Four different definitions of a less-than-unity 'reliability-value' (where unity represents the maximum achievable reliability) are proposed for the performance of the selected passive system. This is normally considered fully reliable, i.e. reliability-value equal to one, in typical Probabilistic Safety Assessment (PSA) applications in nuclear reactor safety. The two 'point' TH-R values for the considered NC system were found equal to 0.70 and 0.85, i.e. values comparable with the reliability of a pump installed in an 'equivalent' forced circulation (active) system having the same 'mission'. The design optimization study was completed by a regression analysis addressing the output of the 137 calculations: heat losses, undetected leakage, loop length, riser diameter, and equivalent diameter of the test section have been found to be the most important parameters leading to the optimal system design and affecting the TH-R. As added values for this work, the comparison has

  6. Advances in small zero-leak valves point to better nuclear power-plant reliability

    Energy Technology Data Exchange (ETDEWEB)

    Eacott, K B; Kin, J C; Hotta, Y [Dresser Japan, Ltd.

    1978-04-01

    In the selection of small valves (less than two inches) used in nuclear power plants, sufficient consideration must be given to reliability in containing radioactive material, ease of operation, and critical functions, especially zero leakage. These valves are classified into bellows and diaphragm seal types, which must satisfy zero leakage, a 4000-cycle life test, and good maintainability. Welded bellows, formed bellows, and metal diaphragms are used to meet these requirements, and their construction is shown. The requirements and principal specifications for these small valves are explained, and some examples are given. These zero-leak valves are installed in the reactor coolant loop system, borated water from the B. A. system, pressurizer instrument system, containment spray system, high head system, and off-gas system for PWRs, and in the main steam line system, diesel generator cooling water system, recirculation system, clean-up water system, etc. for BWRs.

  7. Analyzing the reliability of shuffle-exchange networks using reliability block diagrams

    International Nuclear Information System (INIS)

    Bistouni, Fathollah; Jahanshahi, Mohsen

    2014-01-01

    Supercomputers and multi-processor systems are comprised of thousands of processors that need to communicate in an efficient way. One reasonable solution is the utilization of multistage interconnection networks (MINs), where the challenge is to analyze the reliability of such networks. One of the methods to increase the reliability and fault-tolerance of MINs is the use of additional switching stages. Therefore, the reliability of one of the most common MINs, namely the shuffle-exchange network (SEN), has recently been evaluated by investigating the impact of increasing the number of switching stages. That work concluded that the reliability of the SEN with one additional stage (SEN+) is better than that of the SEN, and that the reliability of the SEN is in turn better than that of the SEN with two additional stages (SEN+2). Here we re-evaluate the reliability of these networks, and the results of the terminal, broadcast, and network reliability analysis demonstrate that SEN+ and SEN+2 consistently outperform SEN and are very alike in terms of reliability. - Highlights: • The impact of increasing the number of stages on the reliability of MINs is investigated. • The RBD method, as an accurate method, is used for the reliability analysis of MINs. • Complex series–parallel RBDs are used to determine the reliability of the MINs. • All measures of reliability (i.e. terminal, broadcast, and network reliability) are analyzed. • All reliability equations will be calculated for different size N×N
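    Under a common series-parallel RBD model (each 2x2 switch working with probability r; SEN+ approximated as entry and exit switches in series with two disjoint middle paths in parallel), the terminal-reliability comparison can be sketched as below. This is a generic textbook model, not the paper's exact equations:

```python
import math

def sen_terminal(r, N):
    """SEN terminal reliability: one switch per stage, all in series."""
    stages = int(math.log2(N))
    return r ** stages

def sen_plus_terminal(r, N):
    """SEN+ terminal reliability: entry and exit switches in series with
    two disjoint middle paths (each log2(N)-1 switches) in parallel."""
    mid = int(math.log2(N)) - 1
    parallel = 1.0 - (1.0 - r ** mid) ** 2
    return r * r * parallel

r, N = 0.95, 16
print(round(sen_terminal(r, N), 4), round(sen_plus_terminal(r, N), 4))
```

    With these numbers the SEN+ terminal reliability exceeds the SEN's, consistent with the conclusion above that the added stage pays for its extra switches by providing path redundancy.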

  8. Uncertainty propagation and sensitivity analysis in system reliability assessment via unscented transformation

    International Nuclear Information System (INIS)

    Rocco Sanseverino, Claudio M.; Ramirez-Marquez, José Emmanuel

    2014-01-01

    The reliability of a system, notwithstanding its intended function, can be significantly affected by the uncertainty in the reliability estimates of the components that define the system. This paper implements the Unscented Transformation to quantify the effects of the uncertainty of component reliability through two approaches. The first approach is based on the concept of uncertainty propagation, which is the assessment of the effect that the variability of the component reliabilities produces on the variance of the system reliability. This assessment based on UT has been previously considered in the literature, but only for systems represented through series/parallel configurations. In this paper the assessment is extended to systems whose reliability cannot be represented through analytical expressions and which require, for example, Monte Carlo simulation. The second approach consists of the evaluation of the importance of components, i.e., the evaluation of the components that most contribute to the variance of the system reliability. An extension of the UT is proposed to evaluate the so-called "main effects" of each component, as well as to assess high-order component interactions. Several examples with excellent results illustrate the proposed approach. - Highlights: • Simulation based approach for computing reliability estimates. • Computation of reliability variance via 2n+1 points. • Immediate computation of component importance. • Application to network systems
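    The 2n+1-point unscented transformation mentioned in the highlights can be sketched for a small series-parallel system. The component means, variances, and the kappa = 3 - n choice are illustrative assumptions, not the paper's case studies:

```python
import numpy as np

def system_reliability(r):
    """Component 1 in series with the parallel pair (2, 3)."""
    r1, r2, r3 = r
    return r1 * (1.0 - (1.0 - r2) * (1.0 - r3))

mu = np.array([0.95, 0.90, 0.85])          # component reliability means
var = np.array([0.0005, 0.002, 0.002])     # component reliability variances

n, kappa = len(mu), 3.0 - len(mu)          # common kappa = 3 - n heuristic
scale = np.sqrt((n + kappa) * var)         # diagonal-covariance case

# 2n+1 symmetric sigma points and their weights
points = [mu] + [mu + d for d in np.diag(scale)] + [mu - d for d in np.diag(scale)]
weights = np.array([kappa / (n + kappa)] + [1.0 / (2 * (n + kappa))] * (2 * n))

ys = np.array([system_reliability(p) for p in points])
mean = float(weights @ ys)
variance = float(weights @ (ys - mean) ** 2)
print(round(mean, 4), round(variance, 6))
```

    The propagated variance shows how much of the system-reliability uncertainty is inherited from the component estimates; perturbing one component's variance at a time gives a crude importance ranking in the spirit of the paper's "main effects".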

  9. Design of a composite structure to achieve a specified reliability level

    International Nuclear Information System (INIS)

    Boyer, C.; Beakou, A.; Lemaire, M.

    1997-01-01

    Safety factors are widely used in structural design. For composite material structures, however, the lack of experimental feedback does not allow the use of safety factors optimized from a cost and reliability point of view. Reliability methods are one way to achieve the calibration of partial safety factors using a more rational method than judgement alone. First we present the calibration process. The reliability methods FORM, SORM and simulation are initially applied to a laminate plate under uniform pressure. In this example, we compare three design criteria; the different reliability methods agree with the reference method for all criteria used. We chose the Tsai-Hill criterion and the FORM method to calculate safety factors. Then, a calibration process is undertaken on a composite pipe, and this serves to illustrate the different steps in the calculation. Finally, we present a calibration of a general plate structure. The partial safety factors and their sensitivities to the different parameters of the stochastic variables are given according to load type
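    For the simplest limit state, FORM reduces to a closed form, which makes the reliability-index idea behind the calibration concrete. A sketch with illustrative numbers (not the composite-plate example of the paper):

```python
from math import sqrt
from statistics import NormalDist

# Linear limit state g = R - S with independent normal resistance R and
# load effect S; the Hasofer-Lind index has a closed form in this case.
mu_R, sd_R = 600.0, 60.0   # resistance (e.g., laminate strength)
mu_S, sd_S = 400.0, 50.0   # load effect

beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)   # reliability index
pf = NormalDist().cdf(-beta)                     # failure probability
print(round(beta, 2), f"{pf:.2e}")
```

    Partial safety factor calibration then amounts to choosing factors so that designs satisfying the factored checks achieve a target beta; for nonlinear criteria such as Tsai-Hill, the index is found iteratively at the design point instead.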

  10. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more and more important and challenging. Therefore, the structural reliability method must be applied in ocean engineering design, such as offshore platform design. If the performance function is known in structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be definitely expressed, the response surface method is always used because it has a very clear train of thought and simple programming. However, the traditional response surface method fits a response surface of quadratic polynomials, where the problem of accuracy cannot be solved, because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used for the situation where the performance function cannot be definitely expressed in structural reliability analysis. In this method, a response surface of the fuzzy neural network for the whole area is constructed first, and then the structural reliability is calculated by the genetic algorithm. In the proposed method, all the sample points for the training network come from the whole area, so the true limit state surface in the whole area can be fitted. Through calculation examples and comparative analysis, it can be seen that the proposed method is much better than the traditional response surface method of quadratic polynomials, because the amount of calculation of finite element analysis is largely reduced, the accuracy of calculation is improved, and the true limit state surface can be fitted very well in the whole area. The method proposed in this paper is thus suitable for engineering application.
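    The traditional quadratic response-surface method that the authors compare against can be shown in miniature: fit a quadratic surrogate to a few evaluations of the performance function, then estimate the failure probability by Monte Carlo on the cheap surrogate. The performance function here is a made-up stand-in for an expensive finite element analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x1, x2):
    """Stand-in performance function; failure when g < 0."""
    return x1 ** 2 - 2.0 * x2 + 2.0

# Design points around the mean (standard normal variables assumed)
X = np.array([(a, b) for a in (-2, 0, 2) for b in (-2, 0, 2)], float)
y = g(X[:, 0], X[:, 1])

# Quadratic basis: 1, x1, x2, x1^2, x2^2, x1*x2
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def g_hat(x1, x2):
    """Fitted quadratic response surface."""
    return (coef[0] + coef[1] * x1 + coef[2] * x2 +
            coef[3] * x1 ** 2 + coef[4] * x2 ** 2 + coef[5] * x1 * x2)

u = rng.standard_normal((100000, 2))
pf = float(np.mean(g_hat(u[:, 0], u[:, 1]) < 0.0))
print(pf)
```

    Because the surrogate is only trusted near the design points, the fit degrades away from the checking point for genuinely non-quadratic limit states, which is exactly the weakness the whole-area neural-network surface is meant to address.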

  11. Reliability optimization design of the gear modification coefficient based on the meshing stiffness

    Science.gov (United States)

    Wang, Qianqian; Wang, Hui

    2018-04-01

    Since the time varying meshing stiffness of a gear system is the key factor affecting gear vibration, it is important to design the meshing stiffness to reduce vibration. Based on the effect of the gear modification coefficient on the meshing stiffness, and considering random parameters, reliability optimization design of the gear modification is researched. The dimension reduction and point estimation method is used to estimate the moments of the limit state function, and the reliability is obtained by the fourth moment method. The comparison of the dynamic amplitude results before and after optimization indicates that the research is useful for the reduction of vibration and noise and the improvement of reliability.

  12. Circuit reliability boosted by soldering pins of disconnect plugs to sockets

    Science.gov (United States)

    Pierce, W. B.

    1964-01-01

    Where disconnect pins must be used for wiring and testing a circuit, improved system reliability is obtained by making a permanent joint between pins and sockets of the disconnect plug. After the circuit has been tested, contact points may be fused through soldering, brazing, or welding.

  13. Methods for estimating the reliability of the RBMK fuel assemblies and elements

    International Nuclear Information System (INIS)

    Klemin, A.I.; Sitkarev, A.G.

    1985-01-01

    Applied non-parametric methods for the calculation of point and interval estimations for the basic nomenclature of reliability factors of the RBMK fuel assemblies and elements are described. As fuel assembly and element reliability factors, the average lifetime is considered at a preset operating time up to unloading due to fuel burnout, as well as the average lifetime at reactor transient operation and at the steady-state fuel reloading mode of reactor operation. The formulae obtained are included in the special standardized engineering documentation.

  14. Test-retest reliability of a balance testing protocol with external perturbations in young healthy adults.

    Science.gov (United States)

    Robbins, Shawn M; Caplan, Ryan M; Aponte, Daniel I; St-Onge, Nancy

    2017-10-01

    External perturbations are utilized to challenge balance and mimic realistic balance threats in patient populations. The reliability of such protocols has not been established. The purpose was to examine test-retest reliability of balance testing with external perturbations. Healthy adults (n=34; mean age 23 years) underwent balance testing over two visits. Participants completed ten balance conditions in which the following parameters were combined: perturbation or non-perturbation, single or double leg, and eyes open or closed. Three trials were collected for each condition. Data were collected on a force plate and external perturbations were applied by translating the plate. Force plate center of pressure (CoP) data were summarized using 13 different CoP measures. Test-retest reliability was examined using intraclass correlation coefficients (ICC) and Bland-Altman plots. CoP measures of total speed and excursion in both anterior-posterior and medial-lateral directions generally had acceptable ICC values for perturbation conditions (ICC=0.46 to 0.87); however, many other CoP measures (e.g. range, area of ellipse) had unacceptable test-retest reliability. Changes to balance testing protocols that include external perturbations should be made to improve test-retest reliability and diminish learning effects, including more extensive participant training and increasing the number of trials. CoP measures that consider all data points (e.g. total speed) are more reliable than those that only consider a few data points. Copyright © 2017 Elsevier B.V. All rights reserved.
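    The CoP summary measures found reliable above (total speed and directional excursion) can be computed from a CoP trace as follows; the trace, sampling rate, and units are synthetic assumptions, not the study's setup:

```python
import numpy as np

fs = 100.0                                # assumed sampling rate, Hz
rng = np.random.default_rng(2)
# Synthetic CoP trace: random walk, columns = (medial-lateral, anterior-posterior), cm
cop = np.cumsum(rng.normal(0.0, 0.05, size=(500, 2)), axis=0)

step = np.diff(cop, axis=0)               # per-sample displacement
excursion_ml = float(np.abs(step[:, 0]).sum())        # medial-lateral excursion
excursion_ap = float(np.abs(step[:, 1]).sum())        # anterior-posterior excursion
path = float(np.sqrt((step ** 2).sum(axis=1)).sum())  # total path length
total_speed = path / (len(cop) / fs)      # cm/s over the 5 s trial
print(round(excursion_ml, 1), round(excursion_ap, 1), round(total_speed, 2))
```

    These measures accumulate every sample of the trace, which is the property the abstract credits for their better test-retest reliability compared with extrema-based measures such as range or ellipse area.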

  15. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability (what it is, why it is needed, and how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks in different industries are described in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the following chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European Reliability Data System (ERDS) and the development of a large data bank come next. The last three chapters ask 'Reliability data banks - friend, foe or a waste of time?' and look at future developments. (UK)

  16. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability: its definition, the need for it, and reliability across the system life cycle. It covers reliability and failure rate (overview, reliability characteristics, chance failures, failure rates that change over time, failure modes, and replacement), reliability in engineering design, reliability testing under failure-rate assumptions, the plotting of reliability data, prediction of system reliability, system conservation (maintenance), and failure topics including failure relays and the analysis of system safety.

  17. Addressing Uniqueness and Unison of Reliability and Safety for a Better Integration

    Science.gov (United States)

    Huang, Zhaofeng; Safie, Fayssal

    2016-01-01

    Over time, it has been observed that Safety and Reliability have not been clearly differentiated, which leads to confusion, inefficiency, and sometimes counter-productive practices in executing each of these two disciplines. It is imperative to address this situation to help the Reliability and Safety disciplines improve their effectiveness and efficiency. The paper poses an important question: "Safety and Reliability - are they unique or unisonous?" To answer it, the paper reviews several of the most commonly used analyses from each discipline, namely FMEA, reliability allocation and prediction, reliability design involvement, system safety hazard analysis, Fault Tree Analysis, and Probabilistic Risk Assessment. The paper points out the uniqueness and unison of Safety and Reliability in their respective roles, requirements, approaches, and tools, and presents suggestions for enhancing and improving the individual disciplines as well as promoting the integration of the two. The paper concludes that Safety and Reliability are unique but complement each other in many aspects, and need to be integrated. In particular, their individual roles need to be differentiated: Safety is to ensure and assure that the product meets safety requirements, goals, or desires, while Reliability is to ensure and assure maximum achievability of the intended design functions. With the integration of Safety and Reliability, personnel can be shared, tools and analyses have to be integrated, and skill sets can be possessed by the same person, with the purpose of providing the best value to product development.

  18. Value-Eroding Teacher Behaviors Scale: A Validity and Reliability Study

    Science.gov (United States)

    Arseven, Zeynep; Kiliç, Abdurrahman; Sahin, Seyma

    2016-01-01

    In the present study, the aim is to develop a valid and reliable scale for determining the value-eroding behaviors of teachers, and hence their value judgments. The items of the "Value-eroding Teacher Behaviors Scale" were designed as a 5-point Likert-type rating scale. An exploratory factor analysis (EFA) was conducted to…

  19. Image Relaxation Matching Based on Feature Points for DSM Generation

    Institute of Scientific and Technical Information of China (English)

    ZHENG Shunyi; ZHANG Zuxun; ZHANG Jianqing

    2004-01-01

    In photogrammetry and remote sensing, image matching is a basic and crucial process for automatic DEM generation. In this paper we present an image relaxation matching method based on feature points. This method can be considered an extension of regular grid-point-based matching, and it avoids that approach's shortcomings. For example, it can avoid areas of low or no texture, where errors frequently appear in cross-correlation matching. Meanwhile, it makes full use of mature techniques such as probability relaxation and image pyramids, which have already been used successfully in grid-point matching. Application of the technique to DEM generation in different regions proved that it is more reasonable and reliable.
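
    For contrast with the feature-based approach, the cross-correlation matching mentioned above can be sketched as a brute-force normalized cross-correlation (NCC) search; this is a generic illustration of the technique that fails in textureless areas, not the authors' relaxation algorithm:

    ```python
    import numpy as np

    def ncc_match(image, template):
        """Slide `template` over `image` and return the (row, col) offset
        maximizing normalized cross-correlation. Brute force for clarity;
        practical matchers add image pyramids and FFT acceleration."""
        th, tw = template.shape
        t = template - template.mean()
        tnorm = np.sqrt((t * t).sum())
        best, best_rc = -2.0, (0, 0)
        for r in range(image.shape[0] - th + 1):
            for c in range(image.shape[1] - tw + 1):
                w = image[r:r + th, c:c + tw]
                wc = w - w.mean()
                denom = np.sqrt((wc * wc).sum()) * tnorm
                # Flat (textureless) windows give denom == 0: no usable score,
                # which is exactly where correlation matching breaks down.
                score = (wc * t).sum() / denom if denom > 0 else 0.0
                if score > best:
                    best, best_rc = score, (r, c)
        return best_rc, best
    ```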

  20. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during operation, with the purpose of improving safety or reliability. Due to plant complexity and safety and availability requirements, sophisticated tools that are flexible and efficient are needed. Such tools have been developed over the last 20 years, and they must be continuously refined to meet growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for calculating the reliability of structures or components; a new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used, especially in the analysis of very complex systems. To increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied, and procedures for the implementation of importance sampling are suggested. (author)
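
    The importance-sampling idea mentioned for variance reduction can be illustrated on a toy problem: estimating a small normal tail probability by sampling from a proposal distribution shifted into the tail and reweighting. This is a generic sketch, not the program described in the report:

    ```python
    import math
    import random

    def normal_tail_is(t=4.0, n=200_000, seed=1):
        """Importance-sampling estimate of P(Z > t) for Z ~ N(0, 1),
        drawing from the shifted proposal N(t, 1). The likelihood ratio is
        phi(x) / phi_t(x) = exp(t**2 / 2 - t * x)."""
        rng = random.Random(seed)
        acc = 0.0
        for _ in range(n):
            x = rng.gauss(t, 1.0)            # sample from the proposal
            if x > t:
                acc += math.exp(t * t / 2.0 - t * x)
        return acc / n

    # Reference value for comparison: P(Z > 4) = 0.5 * erfc(4 / sqrt(2))
    exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))
    ```

    A naive Monte Carlo run of the same size would see only a handful of tail hits; the shifted proposal makes nearly half the samples informative, which is the variance reduction being described.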

  1. Reliable and Efficient Communications in Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Abdelhakim, M.M.

    2014-01-01

    Wireless sensor network (WSN) is a key technology for a wide range of military and civilian applications. Limited by the energy resources and processing capabilities of the sensor nodes, reliable and efficient communication in wireless sensor networks is challenging, especially when the sensors are deployed in hostile environments. This research aims to improve the reliability and efficiency of time-critical communications in WSNs under both benign and hostile environments. We start with wireless sensor networks with mobile access points (SENMA), where the mobile access points traverse the network to collect information from individual sensors. Due to its routing simplicity and energy efficiency, SENMA has attracted considerable attention from the research community. Here, we study reliable distributed detection in SENMA under Byzantine attacks, where some authenticated sensors are compromised to report fictitious information. The q-out-of-m rule is considered. It is popular in distributed detection and can achieve a good trade-off between the miss-detection probability and the false-alarm rate. However, a major limitation of this rule is that the optimal scheme parameters can only be obtained through exhaustive search. By exploiting the linear relationship between the scheme parameters and the network size, we propose simple but effective sub-optimal linear approaches. Then, for better flexibility and scalability, we derive a near-optimal closed-form solution based on the central limit theorem. It is proved that the false-alarm rate of the q-out-of-m scheme diminishes exponentially as the network size increases, even if the percentage of malicious nodes remains fixed. This implies that large-scale sensor networks are more reliable under malicious attacks. To further improve the performance under time-varying attacks, we propose an effective malicious-node detection scheme for adaptive data fusion; the proposed scheme is analyzed using the entropy-based trust model.
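
    The q-out-of-m fusion rule itself is simple to state in code, and for independent sensors its false-alarm rate is a binomial tail. The sketch below is a generic illustration of the rule, not the authors' optimized parameter-selection scheme:

    ```python
    from math import comb

    def q_out_of_m(reports, q):
        """Fusion decision: declare detection when at least q of the m
        one-bit sensor reports equal 1."""
        return sum(reports) >= q

    def fused_false_alarm(m, q, pf):
        """False-alarm rate of the q-out-of-m rule with m independent
        sensors, each with per-sensor false-alarm probability pf:
        P(at least q false positives) = binomial tail from k = q to m."""
        return sum(comb(m, k) * pf ** k * (1 - pf) ** (m - k)
                   for k in range(q, m + 1))
    ```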

  2. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    Science.gov (United States)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents a reliability-based sequential optimization (RBSO) method to solve the trajectory optimization problem for the Mars entry mission with parametric uncertainties in the entry dynamics. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, a modified sequential optimization method, employing the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method supports the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and approximates the trajectory solution efficiently. The MPP method, which assesses the reliability of constraint satisfaction only up to the necessary level, is employed to further improve computational efficiency. The cycle of SO, reliability assessment, and constraint update is repeated in the RBSO until the reliability requirements of constraint satisfaction are met. Finally, the RBSO is compared with traditional DO and with traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and efficiency of the proposed method.
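
    The nonintrusive PCE step can be illustrated in one dimension: sample the random input, evaluate the model, and fit probabilists' Hermite coefficients by least squares, after which the surrogate's mean and variance follow from orthogonality (E[He_j He_k] = k! for j = k, 0 otherwise). This sketch is generic and not tied to the entry-dynamics model of the paper:

    ```python
    import math
    import numpy as np
    from numpy.polynomial import hermite_e as He

    def pce_fit(xi, y, degree):
        """Least-squares fit of coefficients c_k with y ≈ sum_k c_k He_k(xi),
        where He_k are probabilists' Hermite polynomials and xi are samples
        of the standard normal input (the 'nonintrusive' part: the model is
        only evaluated, never modified)."""
        A = He.hermevander(xi, degree)        # columns He_0 .. He_degree
        c, *_ = np.linalg.lstsq(A, y, rcond=None)
        return c

    def pce_moments(c):
        """Mean and variance of the surrogate for xi ~ N(0, 1):
        mean = c_0 and var = sum_{k>=1} k! * c_k**2 by orthogonality."""
        var = sum(math.factorial(k) * float(ck) ** 2
                  for k, ck in enumerate(c) if k > 0)
        return float(c[0]), var
    ```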

  3. Fuzzy logic prediction of dew point pressure of selected Iranian gas condensate reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Nowroozi, Saeed [Shahid Bahonar Univ. of Kerman (Iran); Iranian Offshore Oil Company (I.O.O.C.) (Iran); Ranjbar, Mohammad; Hashemipour, Hassan; Schaffie, Mahin [Shahid Bahonar Univ. of Kerman (Iran)

    2009-12-15

    The experimental determination of dew point pressure in a window PVT cell is often difficult, especially in the case of lean retrograde gas condensate. Alongside statistical, graphical and experimental methods, the fuzzy logic method can be useful and more reliable for the estimation of reservoir properties, as it can accommodate the uncertainty inherent in many of them. Complexity, non-linearity and vagueness are characteristics of reservoir parameters that fuzzy logic can propagate simply. The fuzzy logic dew point pressure model used in this study is a multi-input single-output (MISO) Mamdani system. The model was developed using experimental constant volume depletion (CVD) measurements of samples from several Iranian fields. The performance of the model is compared against some of the most accurate and general correlations for dew point pressure calculation. Results show that this novel method is more accurate and reliable, with an average absolute deviation of 1.33% and 2.68% for the development and checking data sets, respectively. (orig.)
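
    A Mamdani system of the kind described combines rule firing strengths by min-implication, aggregates by max, and defuzzifies by centroid. The single-input sketch below is purely illustrative; the membership functions, universes, and variable names are invented for the example and are not those of the dew-point model:

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership with feet a and c and peak b (a < b < c)."""
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, None)

    def mamdani(temp):
        """Minimal one-input Mamdani sketch with two rules:
           IF temp is LOW  THEN out is SMALL
           IF temp is HIGH THEN out is LARGE
        Min-implication (clipping), max-aggregation, centroid defuzzification."""
        w_low = tri(np.array(temp), -10.0, 0.0, 10.0)    # rule firing strengths
        w_high = tri(np.array(temp), 0.0, 10.0, 20.0)
        z = np.linspace(0.0, 100.0, 201)                 # discretized output universe
        agg = np.maximum(np.minimum(tri(z, 0.0, 25.0, 50.0), float(w_low)),
                         np.minimum(tri(z, 50.0, 75.0, 100.0), float(w_high)))
        return float((z * agg).sum() / agg.sum())        # centroid defuzzification
    ```

    At an input where both rules fire equally, the symmetric clipped sets balance and the centroid lands at the midpoint of the output universe.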

  4. Field data sets for seagrass biophysical properties for the Eastern Banks, Moreton Bay, Australia, 2004-2014

    Science.gov (United States)

    Roelfsema, Chris M.; Kovacs, Eva M.; Phinn, Stuart R.

    2015-08-01

    This paper describes seagrass species and percentage cover point-based field data sets derived from georeferenced photo transects. Annually or biannually over a ten-year period (2004-2014), data sets were collected using 30-50 transects, 500-800 m in length, distributed across a 142 km² shallow, clear-water seagrass habitat, the Eastern Banks, Moreton Bay, Australia. Each of the eight data sets includes seagrass property information derived from approximately 3000 georeferenced, downward-looking photographs captured at 2-4 m intervals along the transects. Photographs were manually interpreted to estimate seagrass species composition and percentage cover (Coral Point Count with Excel extensions; CPCe). Understanding seagrass biology, ecology and dynamics for scientific and management purposes requires point-based data on species composition and cover. This data set, and the methods used to derive it, are a globally unique example for seagrass ecological applications. It provides the basis for multiple further studies at this site, for regional to global comparative studies, and for the design of similar monitoring programs elsewhere.

  5. CERN Technical training 2008 - Learning for the LHC: Special Workshop demonstrating reliability with accelerated testing

    CERN Multimedia

    2008-01-01

    Larry Edson’s workshop will show examples of quantitative reliability predictions based upon accelerated testing and demonstrate that reliability testing during the prototyping phase will help ascertain product shortcomings. When these weak points are addressed and the redesigned product is re-tested, the reliability of that product will become much higher. These methodologies, used successfully in industry, might be exceedingly useful also for component development in particle physics, where reliability is of the utmost importance. This training will provide participants with the skills necessary to demonstrate reliability requirements using accelerated testing methods. The workshop will focus on accelerated test design that employs increased stress levels. This approach has the advantage of reducing test time, sample size and test facility resources. The methodologies taught are applicable to all types of stresses, spanning the electro...

  8. Interactive reliability assessment using an integrated reliability data bank

    International Nuclear Information System (INIS)

    Allan, R.N.; Whitehead, A.M.

    1986-01-01

    The logical structure, techniques and practical application of a computer-aided technique based on a microcomputer using floppy disc random access files are described. This interactive computational technique is efficient if the reliability prediction program is coupled directly to a relevant source of data to create an integrated reliability assessment/reliability data bank system. (DG)

  9. An application of the fault tree analysis for the power system reliability estimation

    International Nuclear Information System (INIS)

    Volkanovski, A.; Cepin, M.; Mavko, B.

    2007-01-01

    The power system is a complex system whose main function is to produce, transfer and deliver electrical energy to consumers. Combinations of component failures in the system can result in a failure of power delivery to certain load points and, in some cases, in a full blackout of the power system. Power system reliability directly affects the safe and reliable operation of nuclear power plants, because loss of offsite power is a significant contributor to the core damage frequency in probabilistic safety assessments of nuclear power plants. A method based on the integration of fault tree analysis with the analysis of power flows in the power system was developed and implemented for power system reliability assessment. The main contributors to power system reliability are identified, both quantitatively and qualitatively. (author)
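
    Once a fault tree's minimal cut sets are known, the top-event probability for independent basic events can be computed by inclusion-exclusion over the cut sets. This generic sketch illustrates that step only, not the authors' integration with power-flow analysis:

    ```python
    from itertools import combinations

    def top_event_probability(cut_sets, p):
        """Exact top-event probability of a fault tree from its minimal cut
        sets, assuming independent basic events. `cut_sets` is a list of sets
        of event names; `p` maps event name -> failure probability.
        A cut set fails when all its events occur; the top event is the union
        of the cut-set failures, evaluated by inclusion-exclusion."""
        total = 0.0
        for k in range(1, len(cut_sets) + 1):
            for combo in combinations(cut_sets, k):
                events = set().union(*combo)   # intersection of cut failures
                prob = 1.0
                for e in events:
                    prob *= p[e]
                total += (-1) ** (k + 1) * prob
        return total
    ```

    For example, with cut sets {A, B} and {C}, the result is P(AB) + P(C) - P(ABC). The exhaustive enumeration is exponential in the number of cut sets, which is why production tools use rare-event approximations or BDDs instead.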

  10. 76 FR 23222 - Electric Reliability Organization Interpretation of Transmission Operations Reliability

    Science.gov (United States)

    2011-04-26

    ....3d 1342 (DC Cir. 2009). \\5\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693... Reliability Standards for the Bulk-Power System. Action: FERC-725A. OMB Control No.: 1902-0244. Respondents...] Electric Reliability Organization Interpretation of Transmission Operations Reliability AGENCY: Federal...

  11. Automated Generation of Geo-Referenced Mosaics From Video Data Collected by Deep-Submergence Vehicles: Preliminary Results

    Science.gov (United States)

    Rzhanov, Y.; Beaulieu, S.; Soule, S. A.; Shank, T.; Fornari, D.; Mayer, L. A.

    2005-12-01

    Many advances in understanding geologic, tectonic, biologic, and sedimentologic processes in the deep ocean are facilitated by direct observation of the seafloor. However, making such observations is both difficult and expensive. Optical systems (e.g., video, still camera, or direct observation) will always be constrained by the severe attenuation of light in the deep ocean, limiting the field of view to distances that are typically less than 10 meters. Acoustic systems can 'see' much larger areas, but at the cost of spatial resolution. Ultimately, scientists want to study and observe deep-sea processes in the same way we do land-based phenomena, so that the spatial distribution and juxtaposition of processes and features can be resolved. We have begun development of algorithms that will, in near real-time, generate mosaics from video collected by deep-submergence vehicles. Mosaics consist of >>10 video frames and can cover hundreds of square meters. This work builds on a publicly available still and video mosaicking software package developed by Rzhanov and Mayer. Here we present the results of initial tests of data collection methodologies (e.g., transects across the seafloor and panoramas across features of interest), algorithm application, and GIS integration conducted during a recent cruise to the Eastern Galapagos Spreading Center (0 deg N, 86 deg W). We have developed a GIS database for the region that will act as a means to access and display mosaics within a geospatially-referenced framework. We have constructed numerous mosaics using both video and still imagery and assessed the quality of the mosaics (including registration errors) under different lighting conditions and with different navigation procedures. We have begun to develop algorithms for efficient and timely mosaicking of collected video, as well as integration with navigation data for georeferencing the mosaics. Initial results indicate that operators must be properly versed in the control of the

  12. Improving the Accuracy of Direct Geo-referencing of Smartphone-Based Mobile Mapping Systems Using Relative Orientation and Scene Geometric Constraints

    Directory of Open Access Journals (Sweden)

    Naif M. Alsubaie

    2017-09-01

    Full Text Available This paper introduces a new method that facilitates the use of smartphones as handheld low-cost mobile mapping systems (MMS). Smartphones are becoming more sophisticated and smarter and are quickly closing the gap between computers and portable tablet devices. The current generation of smartphones is equipped with low-cost GPS receivers, high-resolution digital cameras, and micro-electro-mechanical systems (MEMS)-based navigation sensors (e.g., accelerometers, gyroscopes, magnetic compasses, and barometers). These sensors are in fact the essential components of a MMS. However, smartphone navigation sensors suffer from the poor accuracy of global navigation satellite systems (GNSS), accumulated drift, and high signal noise. These issues affect the accuracy of the initial Exterior Orientation Parameters (EOPs) that are input into the bundle adjustment algorithm, which then produces inaccurate 3D mapping solutions. This paper proposes new methodologies for increasing the accuracy of direct geo-referencing of smartphones using relative orientation and smartphone motion sensor measurements, as well as integrating geometric scene constraints into free network bundle adjustment. The new methodologies fuse the relative orientations of the captured images and their corresponding motion sensor measurements to improve the initial EOPs. Then, geometric features (e.g., horizontal and vertical linear lines) visible in each image are extracted and used as constraints in the bundle adjustment procedure, which corrects the relative position and orientation of the 3D mapping solution.

  13. FPFH-based graph matching for 3D point cloud registration

    Science.gov (United States)

    Zhao, Jiapeng; Li, Chen; Tian, Lihua; Zhu, Jihua

    2018-04-01

    Correspondence detection is a vital step in point cloud registration, and it can help obtain a reliable initial alignment. In this paper, we put forward an advanced point-feature-based graph matching algorithm to solve the initial alignment problem of rigid 3D point cloud registration with partial overlap. Specifically, Fast Point Feature Histograms are first used to determine the initial possible correspondences. Next, a new objective function is provided to make the graph matching more suitable for partially overlapping point clouds. The objective function is optimized by the simulated annealing algorithm to obtain the final group of correct correspondences. Finally, we present a novel set partitioning method which can transform the NP-hard optimization problem into an O(n³)-solvable one. Experiments on the Stanford and UWA public data sets indicate that our method can obtain better results in terms of both accuracy and time cost compared with other point cloud registration methods.
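
    Once correspondences are established (here, hypothetically, from FPFH matching), the rigid transform for the initial alignment can be recovered in closed form with the Kabsch algorithm; the sketch below covers only that final step, not the paper's graph-matching and annealing stages:

    ```python
    import numpy as np

    def rigid_transform(src, dst):
        """Least-squares rotation R and translation t with dst ≈ R @ src + t,
        for matched point sets of shape (N, 3) (Kabsch algorithm)."""
        cs, cd = src.mean(axis=0), dst.mean(axis=0)
        H = (src - cs).T @ (dst - cd)          # 3x3 cross-covariance of centered sets
        U, _, Vt = np.linalg.svd(H)
        # Guard against reflections: force det(R) = +1.
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = cd - R @ cs
        return R, t
    ```

    In practice wrong correspondences dominate the error, which is why robust pipelines wrap an estimator like this in RANSAC or, as in the paper, select correspondences by graph matching first.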

  14. The relative and absolute reliability of the Functional Independence and Difficulty Scale in community-dwelling frail elderly Japanese people using long-term care insurance services.

    Science.gov (United States)

    Saito, Takashi; Izawa, Kazuhiro P; Watanabe, Shuichiro

    2017-06-01

    The newly developed Functional Independence and Difficulty Scale (FIDS) is a tool for assessing the performance of basic activities of daily living in terms of both independence and difficulty. The reliability of this new scale has not been assessed. The aim of this study was to examine the relative and absolute reliability of the newly developed scale in community-dwelling frail elderly people in Japan. Participants were 47 community-dwelling elderly subjects (22 for assessing test-retest reliability and 25 for assessing inter-rater reliability). Intra-class correlation coefficients were used as relative reliability indices. From an absolute reliability perspective, we conducted Bland-Altman analysis and calculated the limit of agreement and minimal detectable change to determine the acceptable range of error. Intra-class correlation coefficients for test-retest and inter-rater reliability were 0.90. The limit of agreement for test-retest reliability was -5.2 to 1.8, representing an increase of over six points for improvement and a decrease of over two points for decline of basic activities of daily living ability. The minimal detectable change for inter-rater reliability was 3.7, indicating that a three-point difference might exist between different raters. The results of this study demonstrated that the FIDS appears to be a reliable instrument for use with Japanese community-dwelling frail elderly people. While further research using a larger and more diverse sample of participants is needed, our findings support the use of the FIDS in clinical practice and clinical research targeting frail elderly Japanese people.
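
    The absolute-reliability quantities used here follow standard formulas: Bland-Altman 95% limits of agreement are the mean difference ± 1.96 SD of the differences, and MDC95 = 1.96 · √2 · SEM with SEM = SD · √(1 − ICC). A minimal sketch with illustrative data (not the study's):

    ```python
    import math

    def bland_altman_loa(test1, test2):
        """95% limits of agreement for paired measurements:
        mean difference +/- 1.96 * SD of the differences."""
        diffs = [b - a for a, b in zip(test1, test2)]
        n = len(diffs)
        mean = sum(diffs) / n
        sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
        return mean - 1.96 * sd, mean + 1.96 * sd

    def minimal_detectable_change(sd, icc):
        """MDC95 = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC).
        The sqrt(2) accounts for the error in both of the two measurements
        being compared."""
        sem = sd * math.sqrt(1.0 - icc)
        return 1.96 * math.sqrt(2.0) * sem
    ```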

  15. Point Topography and Within-Session Learning Are Important Predictors of Pet Dogs’ (Canis lupus familiaris Performance on Human Guided Tasks

    Directory of Open Access Journals (Sweden)

    Dorey, Nicole R.

    2013-07-01

    Full Text Available Pet domestic dogs (Canis lupus familiaris) are generally considered successful on object choice tasks, reliably following human points to a target. However, defining the specific topography of the point types utilized and assessing the potential for dogs to generalize their responses across similar point types has received little attention. In Experiment 1, we assessed pet dogs’ performance on an object choice task utilizing nine different point types that varied across the dimensions of movement, duration, and distance. These dimensions reliably predicted the performance of pet dogs on this task. In Experiment 2, pet dogs presented with nine different point types in order of increasing difficulty performed better on more difficult point types than both naive dogs and dogs experiencing the nine points in order of decreasing difficulty. In Experiment 3, we manipulated the attentional state of the experimenter (as in perspective-taking studies) and found that human orientation was not a strong predictor of performance on pointing tasks. The results of this study indicate that dogs do not reliably follow all point types without additional training or experience. Furthermore, dogs appear to learn continuously about the dimensions of human points, adjusting their behavior accordingly, even over the course of experimental testing. These findings bring claims of pet dogs’ spontaneous success on pointing tasks into question. The ability to learn about, and respond flexibly to, human gestures may benefit pet dogs living in human homes more than a spontaneous responsiveness to specific gesture types.

  16. 76 FR 23171 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Science.gov (United States)

    2011-04-26

    ... Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ] 31,242, order on reh'g...-Power System reliability may request an interpretation of a Reliability Standard.\\7\\ The ERO's standards... information in its reliability assessments. The Reliability Coordinator must monitor Bulk Electric System...

  17. Reliability and Validity Assessment of a Linear Position Transducer

    Science.gov (United States)

    Garnacho-Castaño, Manuel V.; López-Lastra, Silvia; Maté-Muñoz, José L.

    2015-01-01

    The objectives of the study were to determine the validity and reliability of peak velocity (PV), average velocity (AV), peak power (PP) and average power (AP) measurements made using a linear position transducer. Validity was assessed by comparing measurements simultaneously obtained using the Tendo Weightlifting Analyzer System and the T-Force Dynamic Measurement System (Ergotech, Murcia, Spain) during two resistance exercises, bench press (BP) and full back squat (BS), performed by 71 trained male subjects. For the reliability study, a further 32 men completed both lifts using the Tendo Weightlifting Analyzer System in two identical testing sessions one week apart (session 1 vs. session 2). Intraclass correlation coefficients (ICCs) indicating the validity of the Tendo Weightlifting Analyzer System were high, with values ranging from 0.853 to 0.989. Systematic biases and random errors were low to moderate for almost all variables, being higher in the case of PP (bias ±157.56 W; error ±131.84 W). Proportional biases were identified for almost all variables. Test-retest reliability was strong, with ICCs ranging from 0.922 to 0.988. Reliability results also showed minimal systematic biases and random errors, which were only significant for PP (bias -19.19 W; error ±67.57 W). Only PV recorded in the BS showed no significant proportional bias. The Tendo Weightlifting Analyzer System emerged as a reliable system for measuring movement velocity and estimating power in resistance exercises. The low biases and random errors observed here (mainly for AV and AP) make this device a useful tool for monitoring resistance training. Key points: This study determined the validity and reliability of peak velocity, average velocity, peak power and average power measurements made using a linear position transducer. The Tendo Weightlifting Analyzer System emerged as a reliable system for measuring movement velocity and power. PMID:25729300

  18. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Science.gov (United States)

    Bednar, David; Beerens, Koen; Sebestova, Eva; Bendl, Jaroslav; Khare, Sagar; Chaloupkova, Radka; Prokop, Zbynek; Brezovsky, Jan; Baker, David; Damborsky, Jiri

    2015-11-01

    There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.

  19. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Directory of Open Access Journals (Sweden)

    David Bednar

    2015-11-01

    Full Text Available There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.

  20. Reliability, Validity, and Sensitivity of a Novel Smartphone-Based Eccentric Hamstring Strength Test in Professional Football Players.

    Science.gov (United States)

    Lee, Justin W Y; Cai, Ming-Jing; Yung, Patrick S H; Chan, Kai-Ming

    2018-05-01

    To evaluate the test-retest reliability, sensitivity, and concurrent validity of a smartphone-based method for assessing eccentric hamstring strength among male professional football players. A total of 25 healthy male professional football players performed the Chinese University of Hong Kong (CUHK) Nordic break-point test, a hamstring fatigue protocol, and an isokinetic hamstring strength test. The CUHK Nordic break-point test is based on a Nordic hamstring exercise. The Nordic break-point angle was defined as the maximum point at which the participant could no longer support the weight of his body against gravity. The criterion for the sensitivity test was the presprinting and postsprinting difference of the Nordic break-point angle with a hamstring fatigue protocol. The hamstring fatigue protocol consists of 12 repetitions of the 30-m sprint with 30-s recoveries between sprints. Hamstring peak torque of the isokinetic hamstring strength test was used as the criterion for validity. A high test-retest reliability (intraclass correlation coefficient = .94; 95% confidence interval, .82-.98) was found in the Nordic break-point angle measurements. The Nordic break-point angle significantly correlated with isokinetic hamstring peak torques at eccentric action of 30°/s (r = .88, r² = .77, P …) … hamstring strength measures among male professional football players.

  1. INTRA- AND INTER-OBSERVER RELIABILITY IN SELECTION OF THE HEART RATE DEFLECTION POINT DURING INCREMENTAL EXERCISE: COMPARISON TO A COMPUTER-GENERATED DEFLECTION POINT

    Directory of Open Access Journals (Sweden)

    Bridget A. Duoos

    2002-12-01

    Full Text Available This study was designed to (1) determine the relative frequency of occurrence of a heart rate deflection point (HRDP), when compared to a linear relationship, during progressive exercise, (2) measure the reproducibility of a visual assessment of the HRDP, both within and between observers, and (3) compare visual and computer-assessed deflection points. Subjects consisted of 73 competitive male cyclists with mean age of 31.4 ± 6.3 years, mean height 178.3 ± 4.8 cm and weight 74.0 ± 4.4 kg. Tests were conducted on an electrically-braked cycle ergometer beginning at 25 watts and progressing 25 watts per minute to fatigue. Heart rates were recorded the last 10 seconds of each stage and at fatigue. Scatter plots of heart rate versus watts were computer-generated and given to 3 observers on two different occasions. A computer program was developed to assess whether data points were best represented by a single line or two lines. The HRDP represented the intersection of the two lines. Results of this study showed that (1) computer-assessed HRDP showed that 44 of 73 subjects (60.3%) had scatter plots best represented by a straight line with no HRDP; (2) in those subjects having HRDP, all 3 observers showed significant differences (p = 0.048, p = 0.007, p = 0.001) in reproducibility of their HRDP selection, and differences in HRDP selection were significant for two of the three comparisons between observers (p = 0.002, p = 0.305, p = 0.0003); (3) computer-generated HRDP was significantly different from visual HRDP for 2 of 3 observers (p = 0.0016, p = 0.513, p = 0.0001). It is concluded that (1) HRDP occurs in a minority of subjects, (2) significant differences exist, both within and between observers, in selection of HRDP, and (3) differences in agreement between visual and computer-generated HRDP indicate that, when HRDP exists, it should be computer-assessed.
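
    The one-line versus two-line decision made by the authors' program can be imitated with ordinary least squares and a grid search over candidate breakpoints. This is a generic sketch, not the study's actual code; the statistical test (e.g. an F-test) that would decide whether the two-line fit is a significant improvement is left out.

```python
# Generic sketch: fit one line to heart-rate vs. workload data, then the
# best pair of lines joined at a candidate breakpoint, and compare the
# residual sums of squares (SSE). A markedly smaller two-line SSE
# suggests a heart rate deflection point (HRDP) at the breakpoint.

def fit_line(xs, ys):
    """Least-squares line; returns (intercept, slope, sse)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def deflection_point(watts, hr):
    """Return (breakpoint_watts, two_line_sse, one_line_sse)."""
    _, _, sse_one = fit_line(watts, hr)
    best_x, best_sse = None, None
    for k in range(2, len(watts) - 2):      # >= 3 points per segment
        _, _, s_lo = fit_line(watts[:k + 1], hr[:k + 1])
        _, _, s_hi = fit_line(watts[k:], hr[k:])
        if best_sse is None or s_lo + s_hi < best_sse:
            best_x, best_sse = watts[k], s_lo + s_hi
    return best_x, best_sse, sse_one
```

    On data with a genuine slope change the two-line SSE collapses at the true breakpoint; on linear data it barely improves on the single-line fit, mirroring the 60.3% of subjects best described by one straight line.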

  2. Workplace Bullying Scale: The Study of Validity and Reliability

    Directory of Open Access Journals (Sweden)

    Nizamettin Doğar

    2015-01-01

    Full Text Available The aim of this research is to adapt the Workplace Bullying Scale (Tınaz, Gök & Karatuna, 2013) to the Albanian language and to examine its psychometric properties. The research was conducted on 386 persons from different sectors of Albania. Results of exploratory and confirmatory factor analysis demonstrated that the Albanian scale yielded 2 factors, differing from the original form because of cultural differences. Internal consistency coefficients are .890 and .801, and split-half test reliability coefficients are .864 and .808. Confirmatory factor analysis results range from .40 to .73. Corrected item-total correlations ranged from .339 to .672, and according to t-test results the differences between each item's means for the upper 27% and lower 27% groups were significant. Thus the Workplace Bullying Scale can be used as a valid and reliable instrument in social sciences in Albania.
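
    The two kinds of coefficients reported above can be computed on any item-response matrix with a few lines of code. The functions below implement the standard formulas for Cronbach's alpha and a Spearman-Brown corrected split-half coefficient; the matrix in the test is illustrative, not the study's data.

```python
# Internal consistency statistics for a respondents x items score matrix.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def pearson(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def cronbach_alpha(rows):
    """rows: list of respondents, each a list of item scores."""
    k = len(rows[0])
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

def split_half(rows):
    """Odd-even split-half reliability with Spearman-Brown correction."""
    odd = [sum(r[0::2]) for r in rows]
    even = [sum(r[1::2]) for r in rows]
    r = pearson(odd, even)
    return 2 * r / (1 + r)
```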

  3. Reliability of multi-model and structurally different single-model ensembles

    Energy Technology Data Exchange (ETDEWEB)

    Yokohata, Tokuta [National Institute for Environmental Studies, Center for Global Environmental Research, Tsukuba, Ibaraki (Japan); Annan, James D.; Hargreaves, Julia C. [Japan Agency for Marine-Earth Science and Technology, Research Institute for Global Change, Yokohama, Kanagawa (Japan); Collins, Matthew [University of Exeter, College of Engineering, Mathematics and Physical Sciences, Exeter (United Kingdom); Jackson, Charles S.; Tobis, Michael [The University of Texas at Austin, Institute of Geophysics, 10100 Burnet Rd., ROC-196, Mail Code R2200, Austin, TX (United States); Webb, Mark J. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-08-15

    The performance of several state-of-the-art climate model ensembles, including two multi-model ensembles (MMEs) and four structurally different (perturbed parameter) single model ensembles (SMEs), are investigated for the first time using the rank histogram approach. In this method, the reliability of a model ensemble is evaluated from the point of view of whether the observations can be regarded as being sampled from the ensemble. Our analysis reveals that, in the MMEs, the climate variables we investigated are broadly reliable on the global scale, with a tendency towards overdispersion. On the other hand, in the SMEs, the reliability differs depending on the ensemble and variable field considered. In general, the mean state and historical trend of surface air temperature, and the mean state of precipitation, are reliable in the SMEs. However, variables such as sea level pressure or top-of-atmosphere clear-sky shortwave radiation do not cover a sufficiently wide range in some of the SMEs. It is not possible to assess whether this is a fundamental feature of SMEs generated with a particular model, or a consequence of the algorithm used to select and perturb the values of the parameters. As under-dispersion is a potentially more serious issue when using ensembles to make projections, we recommend the application of rank histograms to assess reliability when designing and running perturbed physics SMEs. (orig.)
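
    The rank histogram idea is simple to state concretely: for each observation, count how many ensemble members fall below it, and histogram those ranks over all observations. A generic sketch (not the authors' code; ties between an observation and a member are not treated specially here):

```python
# Rank histogram for ensemble reliability. A roughly flat histogram
# suggests the observations behave like draws from the ensemble
# (reliable); a U-shape indicates under-dispersion; a central dome
# indicates over-dispersion.

def rank_histogram(observations, ensembles):
    """ensembles[i] is the list of member values paired with observations[i];
    returns counts over ranks 0 .. n_members."""
    counts = [0] * (len(ensembles[0]) + 1)
    for obs, members in zip(observations, ensembles):
        rank = sum(1 for m in members if m < obs)
        counts[rank] += 1
    return counts
```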

  4. Reproducibility of tender point examination in chronic low back pain patients as measured by intrarater and inter-rater reliability and agreement

    DEFF Research Database (Denmark)

    Jensen, Ole Kudsk; Callesen, Jacob; Nielsen, Merete Graakjaer

    2013-01-01

    back examination and return-to-work intervention, 43 and 39 patients, respectively (18 women, 46%) entered and completed the study. MAIN OUTCOME MEASURES: The reliability was estimated by the intraclass correlation coefficient (ICC), and agreement was calculated for up to ±3 TPs. Furthermore, the smallest detectable difference was calculated. RESULTS: TP examination was performed twice by two consultants in rheumatology and rehabilitation at 20 min intervals and repeated 1 week later. Intrarater reliability in the more and less experienced rater was ICC 0.84 (95% CI 0.69 to 0.98) and 0.72 (95% CI 0.49 to 0.95), respectively. The figures for inter-rater reliability were intermediate between these figures. In more than 70% of the cases, the raters agreed within ±3 TPs in both men and women and between test days. The smallest detectable difference between raters was 5, and for the more and less...
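
    The smallest detectable difference (SDD) reported above is conventionally derived from the ICC via the standard error of measurement: SDD = 1.96 · √2 · SEM with SEM = SD · √(1 − ICC). The between-subject SD used below is an assumed value, chosen only to show that an ICC of 0.84 is numerically compatible with an SDD of about 5 tender points.

```python
# Smallest detectable difference from reliability: a difference smaller
# than the SDD cannot be distinguished from measurement noise at the
# 95% level.
import math

def sdd(sd, icc):
    """sd: between-subject standard deviation; icc: reliability."""
    sem = sd * math.sqrt(1 - icc)       # standard error of measurement
    return 1.96 * math.sqrt(2) * sem

# With an assumed SD of 4.5 TPs and ICC = 0.84, the SDD is close to
# 5 tender points, matching the figure reported in the abstract.
```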

  5. Effect of fuel fabrication parameters on performance- designer's point of view

    International Nuclear Information System (INIS)

    Prasad, P.N.; Ravi, M.; Soni, R.; Bajaj, S.S.; Bhardwaj, S.A.

    2004-01-01

    The fuel bundle performance in reactor depends upon the material properties, dimensions of the different components and their inter-compatibility. This paper brings out the fuel parameters required to be optimised to achieve better fuel reliability, operational flexibility, safety and economics from the designer point of view

  6. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  7. Unemployment estimation: Spatial point referenced methods and models

    KAUST Repository

    Pereira, Soraia

    2017-06-26

    The Portuguese Labor Force Survey, from the 4th quarter of 2014 onwards, started geo-referencing the sampling units, namely the dwellings in which the surveys are carried out. This opens new possibilities in analysing and estimating unemployment and its spatial distribution across any region. The labor force survey chooses, according to a pre-established sampling criterion, a certain number of dwellings across the nation and surveys the number of unemployed in these dwellings. Based on this survey, the National Statistical Institute of Portugal presently uses direct estimation methods to estimate the national unemployment figures. Recently, there has been increased interest in estimating these figures in smaller areas. Direct estimation methods, due to reduced sampling sizes in small areas, tend to produce fairly large sampling variations; therefore model based methods, which tend to

  8. Multidisciplinary Inverse Reliability Analysis Based on Collaborative Optimization with Combination of Linear Approximations

    Directory of Open Access Journals (Sweden)

    Xin-Jia Meng

    2015-01-01

    Full Text Available Multidisciplinary reliability is an important part of reliability-based multidisciplinary design optimization (RBMDO). However, it usually involves a considerable amount of computation. The purpose of this paper is to improve the computational efficiency of multidisciplinary inverse reliability analysis. A multidisciplinary inverse reliability analysis method based on collaborative optimization with combination of linear approximations (CLA-CO) is proposed in this paper. In the proposed method, the multidisciplinary reliability assessment problem is first transformed into a problem of most probable failure point (MPP) search of inverse reliability, and then the search for the MPP of multidisciplinary inverse reliability is performed within the framework of CLA-CO. The method improves the MPP search through two elements. One is treating the discipline analyses as equality constraints in the subsystem optimization, and the other is using linear approximations of the subsystem responses to replace the consistency equality constraint in the system optimization. With these two elements, the proposed method enables the disciplines to be analysed in parallel and achieves higher computational efficiency. Additionally, there are no difficulties in applying the proposed method to problems with nonnormally distributed variables. One mathematical test problem and an electronic packaging problem are used to demonstrate the effectiveness of the proposed method.
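
    The MPP search that CLA-CO reorganizes can be illustrated, for a single analytic limit state in standard normal space, with the classic HL-RF iteration. This sketch deliberately ignores the multidisciplinary coupling that is the paper's actual subject; it only shows the core subproblem of finding the point on g(u) = 0 nearest the origin.

```python
# HL-RF iteration for the most probable (failure) point: the point on the
# limit state g(u) = 0 closest to the origin of standard normal space.
# Its distance from the origin is the reliability index beta.
import math

def hlrf(g, grad, u0, tol=1e-10, max_iter=100):
    """g: limit state function; grad: its gradient; u0: starting point."""
    u = list(u0)
    for _ in range(max_iter):
        gv = g(u)
        gr = grad(u)
        norm2 = sum(c * c for c in gr)
        # Project onto the linearized limit state through the origin.
        lam = (sum(gi * ui for gi, ui in zip(gr, u)) - gv) / norm2
        u_new = [lam * gi for gi in gr]
        if max(abs(a - b) for a, b in zip(u, u_new)) < tol:
            return u_new
        u = u_new
    return u

# For the linear limit state g(u) = 3 - u1 - u2 the MPP is u1 = u2 = 1.5,
# giving beta = 3 / sqrt(2).
```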

  9. 78 FR 41339 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Science.gov (United States)

    2013-07-10

    ...] Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards AGENCY: Federal... Reliability Standards identified by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. FOR FURTHER INFORMATION CONTACT: Kevin Ryan (Legal Information...

  10. Reliability Analyses of Groundwater Pollutant Transport

    Energy Technology Data Exchange (ETDEWEB)

    Dimakis, Panagiotis

    1997-12-31

    This thesis develops a probabilistic finite element model for the analysis of groundwater pollution problems. Two computer codes were developed, (1) one using the finite element technique to solve the two-dimensional steady state equations of groundwater flow and pollution transport, and (2) a first order reliability method code that can do a probabilistic analysis of any given analytical or numerical equation. The two codes were connected into one model, PAGAP (Probability Analysis of Groundwater And Pollution). PAGAP can be used to obtain (1) the probability that the concentration at a given point at a given time will exceed a specified value, (2) the probability that the maximum concentration at a given point will exceed a specified value and (3) the probability that the residence time at a given point will exceed a specified period. PAGAP could be used as a tool for assessment purposes and risk analyses, for instance the assessment of the efficiency of a proposed remediation technique or to study the effects of parameter distribution for a given problem (sensitivity study). The model has been applied to study the greatest self-sustained, precipitation-controlled aquifer in North Europe, which underlies Oslo's new major airport. 92 refs., 187 figs., 26 tabs.
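
    In a first order reliability method, each of the three probability queries above reduces to computing a reliability index β and reporting the exceedance probability Φ(−β). A minimal, generic version of that final step (not PAGAP itself):

```python
# FORM exceedance probability: once the reliability index beta has been
# found (e.g. by an MPP search), the probability that the quantity of
# interest exceeds its threshold is Phi(-beta).
import math

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def exceedance_probability(beta):
    """FORM estimate of P(quantity > threshold)."""
    return std_normal_cdf(-beta)
```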

  11. Robust tie points selection for InSAR image coregistration

    Science.gov (United States)

    Skanderi, Takieddine; Chabira, Boulerbah; Afifa, Belkacem; Belhadj Aissa, Aichouche

    2013-10-01

    Image coregistration is an important step in SAR interferometry, which is a well known method for DEM generation and surface displacement monitoring. A practical and widely used automatic coregistration algorithm is based on selecting a number of tie points in the master image and looking for the correspondence of each point in the slave image using a correlation technique. The characteristics of these points, their number and their distribution have a great impact on the reliability of the estimated transformation. In this work, we present a method for automatic selection of suitable tie points that are well distributed over the common area without reducing the desired number of tie points. First we select candidate points using the Harris operator. Then from these points we select tie points depending on their cornerness measure (the highest first). Once a tie point is selected, its correspondence is searched for in the slave image; if the similarity measure maximum is less than a given threshold or lies at the border of the search window, the point is discarded and we proceed to the next Harris point. Otherwise, the cornerness measures of the remaining candidate Harris points are multiplied by a spatially radially increasing function centered at the selected point, to disadvantage points in a neighborhood whose radius is determined from the size of the common area and the desired number of points. This is repeated until the desired number of points is selected. Results for an ERS1/2 tandem pair are presented and discussed.
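
    The selection loop can be sketched as follows. The correlation check against the slave image is omitted, and the penalty function and parameters are illustrative rather than the authors' choices.

```python
# Greedy tie-point selection: repeatedly take the strongest remaining
# Harris candidate, then down-weight candidates near it with a radially
# increasing penalty so that later picks spread over the common area.
import math

def select_tie_points(candidates, n_points, radius):
    """candidates: (cornerness, x, y) tuples; returns chosen (x, y)."""
    cands = [[c, x, y] for c, x, y in candidates]
    chosen = []
    while cands and len(chosen) < n_points:
        cands.sort(key=lambda t: -t[0])
        c, x, y = cands.pop(0)
        chosen.append((x, y))
        for t in cands:
            d = math.hypot(t[1] - x, t[2] - y)
            # Penalty weight rises from 0 at the chosen point to 1 at
            # `radius`, discouraging clustered selections.
            t[0] *= min(d / radius, 1.0)
    return chosen
```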

  12. Content and Construct Validity, Reliability, and Responsiveness of the Rheumatoid Arthritis Flare Questionnaire

    DEFF Research Database (Denmark)

    Bartlett, Susan J; Barbic, Skye P; Bykerk, Vivian P

    2017-01-01

    -FQ), and the voting results at OMERACT 2016. METHODS: Classic and modern psychometric methods were used to assess reliability, validity, sensitivity, factor structure, scoring, and thresholds. Interviews with patients and clinicians also assessed content validity, utility, and meaningfulness of RA-FQ scores. RESULTS: People with RA in observational trials in Canada (n = 896) and France (n = 138), and an RCT in the Netherlands (n = 178) completed 5 items (11-point numerical rating scale) representing RA Flare core domains. There was moderate to high evidence of reliability, content and construct validity... to identify and measure RA flares. Its review through OMERACT Filter 2.0 shows evidence of reliability, content and construct validity, and responsiveness. These properties merit its further validation as an outcome for clinical trials.

  13. Sensitivity Weaknesses in Application of some Statistical Distribution in First Order Reliability Methods

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Enevoldsen, I.

    1993-01-01

    It has been observed and shown that in some examples a sensitivity analysis of the first order reliability index results in an increasing reliability index when the standard deviation of a stochastic variable is increased while the expected value is fixed. This unfortunate behaviour can occur when a stochastic variable is modelled by an asymmetrical density function. For lognormally, Gumbel- and Weibull-distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In relation to practical application the behaviour is probably rather infrequent. A simple example is shown as illustration and to exemplify that for second order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent.

  14. 76 FR 58101 - Electric Reliability Organization Interpretation of Transmission Operations Reliability Standard

    Science.gov (United States)

    2011-09-20

    ....C. Cir. 2009). \\4\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693, FERC... for maintaining real and reactive power balance. \\14\\ Electric Reliability Organization Interpretation...; Order No. 753] Electric Reliability Organization Interpretation of Transmission Operations Reliability...

  15. An Impact of Thermodynamic Processes in Human Bodies on Performance Reliability of Individuals

    Directory of Open Access Journals (Sweden)

    Smalko Zbigniew

    2015-01-01

    Full Text Available The article presents the problem of the influence of thermodynamic factors on human fallibility in different zones of thermal discomfort. It describes the energy processes in the human body and gives a formal description of the energy balance of human thermoregulation. It points to human reactions to temperature changes of the internal and external environment, including reactions associated with exercise. A methodology is presented to estimate and determine reliability indicators of basic human performance in different zones of thermal discomfort. The significant effect of thermodynamic factors on the reliability and safety of a person is shown.

  16. 76 FR 42534 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits; System...

    Science.gov (United States)

    2011-07-19

    ... Reliability Operating Limits; System Restoration Reliability Standards AGENCY: Federal Energy Regulatory... data necessary to analyze and monitor Interconnection Reliability Operating Limits (IROL) within its... Interconnection Reliability Operating Limits, Order No. 748, 134 FERC ] 61,213 (2011). \\2\\ The term ``Wide-Area...

  17. Fixed Point Learning Based Intelligent Traffic Control System

    Science.gov (United States)

    Zongyao, Wang; Cong, Sui; Cheng, Shao

    2017-10-01

    Fixed point learning has become an important tool for analysing large scale distributed systems such as urban traffic networks. This paper presents a fixed point learning based intelligent traffic network control system. The system applies the convergence property of the fixed point theorem to optimize traffic flow density. The intelligent traffic control system achieves maximum road resource usage by averaging traffic flow density across the traffic network. The intelligent traffic network control system is built on a decentralized structure and intelligent cooperation; no central control is needed to manage the system. The proposed system is simple, effective and feasible for practical use. The performance of the system is tested via theoretical proof and simulations. The results demonstrate that the system can effectively solve the traffic congestion problem and increase the vehicles' average speed. They also show that the system is flexible, reliable and feasible for practical use.
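
    A toy version of the density-averaging idea: each road repeatedly moves toward the mean density of its neighbours, and the iteration converges to the uniform network, which is the fixed point of the map. This illustrates the convergence principle only; it is not the paper's controller.

```python
# Fixed point iteration on a road network: each road's density relaxes
# toward the mean density of its neighbours. On a connected network the
# iteration converges to a uniform density (the fixed point), i.e. a
# balanced use of road resources.

def balance_densities(density, neighbours, steps=50, rate=0.5):
    """neighbours[i]: indices of roads adjacent to road i."""
    d = list(density)
    for _ in range(steps):
        d = [di + rate * (sum(d[j] for j in nbrs) / len(nbrs) - di)
             for di, nbrs in zip(d, neighbours)]
    return d
```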

  18. Using minimal cuts to evaluate the system reliability of a stochastic-flow network with failures at nodes and arcs

    International Nuclear Information System (INIS)

    Lin, Y.-K.

    2002-01-01

    This paper deals with a stochastic-flow network in which each node and arc has a designated capacity, which will have different lower levels due to various partial and complete failures. We try to evaluate the system reliability that the maximum flow of the network is not less than a demand (d+1). A simple algorithm in terms of minimal cuts is first proposed to generate all upper boundary points for d, and then the system reliability can be calculated in terms of such points. The upper boundary point for d is a maximal vector, which represents the capacity of each component (arc or node), such that the maximum flow of the network is d. A computer example is shown to illustrate the solution procedure.
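
    Given the upper boundary points, the reliability calculation "in terms of such points" is an inclusion-exclusion over the events {X ≤ b}. A minimal sketch assuming independent discrete component capacity distributions follows; the generation of the boundary points themselves (the paper's minimal-cut algorithm) is not shown, and the example network in the test is hypothetical.

```python
# System reliability from upper boundary points: P(max flow <= d) is the
# probability of the union of {X <= b} over all upper boundary points b
# for d, computed by inclusion-exclusion; the reliability that max flow
# is at least d+1 is its complement.
from itertools import combinations

def reliability(upper_boundary_points, dists):
    """dists[i]: dict mapping capacity level -> probability, component i."""
    def p_le(vec):
        # P(X_i <= vec_i for all i), assuming independent components.
        p = 1.0
        for i, v in enumerate(vec):
            p *= sum(pr for cap, pr in dists[i].items() if cap <= v)
        return p

    def meet(a, b):
        # Intersection {X <= a} and {X <= b} equals {X <= min(a, b)}.
        return tuple(min(x, y) for x, y in zip(a, b))

    pts = [tuple(b) for b in upper_boundary_points]
    total = 0.0
    for k in range(1, len(pts) + 1):
        for subset in combinations(pts, k):
            inter = subset[0]
            for b in subset[1:]:
                inter = meet(inter, b)
            total += (-1) ** (k + 1) * p_le(inter)
    return 1.0 - total
```

    For two components in series with capacities 0/1/2 and the boundary points (1,2) and (2,1) for d = 1, the result equals the probability that both components have capacity 2.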

  19. 76 FR 66055 - North American Electric Reliability Corporation; Order Approving Interpretation of Reliability...

    Science.gov (United States)

    2011-10-25

    ...\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ] 31,242... materially affected'' by Bulk-Power System reliability may request an interpretation of a Reliability... Electric Reliability Corporation; Order Approving Interpretation of Reliability Standard; Before...

  20. An adaptive cubature formula for efficient reliability assessment of nonlinear structural dynamic systems

    Science.gov (United States)

    Xu, Jun; Kong, Fan

    2018-05-01

    Extreme value distribution (EVD) evaluation is a critical topic in reliability analysis of nonlinear structural dynamic systems. In this paper, a new method is proposed to obtain the EVD. The maximum entropy method (MEM) with fractional moments as constraints is employed to derive the entire range of the EVD. Then, an adaptive cubature formula is proposed for the assessment of the fractional moments involved in the MEM, which is closely related to the efficiency and accuracy of the reliability analysis. Three point sets, which include a total of 2d² + 1 integration points in dimension d, are generated in the proposed formula. In this regard, the efficiency of the proposed formula is ensured. Besides, a "free" parameter is introduced, which makes the proposed formula adaptive with the dimension. The "free" parameter is determined by arranging one point set adjacent to the boundary of the hyper-sphere which contains the bulk of the total probability. In this regard, the tail distribution may be better reproduced and the fractional moments can be evaluated with accuracy. Finally, the proposed method is applied to a ten-storey shear frame structure under seismic excitations, which exhibits strong nonlinearity. The numerical results demonstrate the efficacy of the proposed method.
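
    Two of the quantities above are easy to make concrete: the point-set size 2d² + 1, and a fractional moment E[X^α] of the kind the MEM takes as a constraint (here estimated from samples rather than by the proposed cubature formula, which this sketch does not reproduce):

```python
# Illustrative helpers: the size of the proposed integration point set in
# dimension d, and a sample-based fractional moment of a non-negative
# response (MEM constraints use such moments with non-integer alpha).

def n_cubature_points(d):
    """Total points in the three proposed point sets: 2*d**2 + 1."""
    return 2 * d * d + 1

def fractional_moment(samples, alpha):
    """Monte Carlo estimate of E[X**alpha] for non-negative samples."""
    return sum(x ** alpha for x in samples) / len(samples)
```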

  1. Advanced reliability improvement of AC-modules (ARIA)

    International Nuclear Information System (INIS)

    Rooij, P.; Real, M.; Moschella, U.; Sample, T.; Kardolus, M.

    2001-09-01

    The AC-module is a relatively new development in PV-system technology and offers significant advantages over conventional PV-systems with a central inverter: e.g. increased modularity, ease of installation and freedom of system design. The Netherlands and Switzerland have a leading position in the field of AC-modules, both in terms of technology and of commercial and large-scale application. An obstacle towards large-scale market introduction of AC-modules is that the reliability and operational lifetime of AC-modules, and of the integrated inverters in particular, are not yet proven. Despite the advantages, no module-integrated inverter has yet achieved large scale introduction. The AC-modules will lower the barrier towards market penetration. But due to the great interest in the new AC-module technology there is a risk of introducing a product that is not fully proven, which may damage the image of PV-systems. To speed up the development and to improve the reliability, research institutes and the PV-industry will address the aspects of reliability and operational lifetime of AC-modules. From field experience we learn that in general the inverter is still the weakest point in PV-systems, and the lifetime of inverters is an important factor in reliability. Some authors indicate a lifetime of 1.5 years, whereas field experience in Germany and Switzerland has shown that for central inverter systems an availability of 97% has been achieved in recent years. From this point of view it is highly desirable that the operational lifetime and reliability of PV-inverters, and especially AC-modules, are demonstrated and improved to make large scale use of PV a success. Module Integrated Inverters will most likely be used in modules in the power range between 100 and 300 Watt DC-power. These are modules with more than 100 cells in series, assuming that the module inverter will benefit from the higher voltage. Hot-spot is the phenomenon that can occur when one or more cells of a string

  2. The Influence of Nutrition Labeling and Point-of-Purchase Information on Food Behaviours.

    Science.gov (United States)

    Volkova, Ekaterina; Ni Mhurchu, Cliona

    2015-03-01

    Point-of-purchase information on packaged food has been a highly debated topic. Various types of nutrition labels and point-of-purchase information have been studied to determine their ability to attract consumers' attention, be well understood and promote healthy food choices. Country-specific regulatory and monitoring frameworks have been implemented to ensure reliability and accuracy of such information. However, the impact of such information on consumers' behaviour remains contentious. This review summarizes recent evidence on the real-world effectiveness of nutrition labels and point-of-purchase information.

  3. Development and Evaluation of a UAV-Photogrammetry System for Precise 3D Environmental Modeling.

    Science.gov (United States)

    Shahbazi, Mozhdeh; Sohn, Gunho; Théau, Jérôme; Menard, Patrick

    2015-10-30

    The specific requirements of UAV-photogrammetry necessitate particular solutions for system development, which have mostly been ignored or not assessed adequately in recent studies. Accordingly, this paper presents the methodological and experimental aspects of correctly implementing a UAV-photogrammetry system. The hardware of the system consists of an electric-powered helicopter, a high-resolution digital camera and an inertial navigation system. The software of the system includes the in-house programs specifically designed for camera calibration, platform calibration, system integration, on-board data acquisition, flight planning and on-the-job self-calibration. The detailed features of the system are discussed, and solutions are proposed in order to enhance the system and its photogrammetric outputs. The developed system is extensively tested for precise modeling of the challenging environment of an open-pit gravel mine. The accuracy of the results is evaluated under various mapping conditions, including direct georeferencing and indirect georeferencing with different numbers, distributions and types of ground control points. Additionally, the effects of imaging configuration and network stability on modeling accuracy are assessed. The experiments demonstrated that 1.55 m horizontal and 3.16 m vertical absolute modeling accuracy could be achieved via direct georeferencing, which was improved to 0.4 cm and 1.7 cm after indirect georeferencing.
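
    Accuracies of this kind are normally reported as horizontal and vertical RMSE over independent check points; a minimal version of that bookkeeping (generic, not the authors' evaluation code):

```python
# Horizontal and vertical RMSE of modeled points against surveyed
# reference (check) points, the usual way georeferencing accuracy is
# summarized.
import math

def rmse_horizontal_vertical(estimated, reference):
    """Points are (x, y, z) tuples; returns (horizontal RMSE, vertical RMSE)."""
    n = len(estimated)
    h = sum((ex - rx) ** 2 + (ey - ry) ** 2
            for (ex, ey, _), (rx, ry, _) in zip(estimated, reference)) / n
    v = sum((ez - rz) ** 2
            for (_, _, ez), (_, _, rz) in zip(estimated, reference)) / n
    return math.sqrt(h), math.sqrt(v)
```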

  4. Inverse Reliability Task: Artificial Neural Networks and Reliability-Based Optimization Approaches

    OpenAIRE

    Lehký , David; Slowik , Ondřej; Novák , Drahomír

    2014-01-01

    Part 7: Genetic Algorithms; International audience; The paper presents two alternative approaches to solve inverse reliability task – to determine the design parameters to achieve desired target reliabilities. The first approach is based on utilization of artificial neural networks and small-sample simulation Latin hypercube sampling. The second approach considers inverse reliability task as reliability-based optimization task using double-loop method and also small-sample simulation. Efficie...

  5. 76 FR 23801 - North American Electric Reliability Corporation; Order Approving Reliability Standard

    Science.gov (United States)

    2011-04-28

    ... have an operating plan and facilities for backup functionality to ensure Bulk-Power System reliability... entity's primary control center on the reliability of the Bulk-Power System. \\1\\ Mandatory Reliability... potential impact of a violation of the Requirement on the reliability of the Bulk-Power System. The...

  6. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  7. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

    A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Discussed are the numerous factors that potentially have a degrading effect on system reliability and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  8. Identification of a practical and reliable method for the evaluation of litter moisture in turkey production.

    Science.gov (United States)

    Vinco, L J; Giacomelli, S; Campana, L; Chiari, M; Vitale, N; Lombardi, G; Veldkamp, T; Hocking, P M

    2018-02-01

1. An experiment was conducted to compare 5 different methods for the evaluation of litter moisture. 2. For litter collection and assessment, 55 farms were selected; one shed from each farm was inspected and 9 points were identified within each shed. 3. For each device used for the evaluation of litter moisture, the mean and standard deviation of wetness measures per collection point were assessed. 4. The reliability and overall consistency between the 5 instruments used to measure wetness were high (α = 0.72). 5. Measurement of three of the 9 collection points was sufficient to provide a reliable assessment of litter moisture throughout the shed. 6. Based on the direct correlation between litter moisture and footpad lesions, litter moisture measurement can be used as a resource-based on-farm animal welfare indicator. 7. Among the 5 methods analysed, visual scoring is the most simple and practical, and therefore the best candidate to be used on-farm for animal welfare assessment.
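The consistency figure cited above (α = 0.72) is a Cronbach's alpha computed across the 5 instruments. As a sketch of how such a coefficient is obtained from item and total-score variances — using made-up moisture readings, not the study's data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, k_items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item (instrument)
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical moisture readings: 6 collection points x 5 measuring devices
readings = np.array([
    [3.1, 3.0, 2.8, 3.2, 2.9],
    [4.5, 4.6, 4.2, 4.4, 4.7],
    [2.0, 2.1, 1.9, 2.2, 2.0],
    [5.0, 4.8, 4.9, 5.1, 4.7],
    [3.8, 3.9, 3.6, 3.7, 4.0],
    [1.2, 1.3, 1.1, 1.4, 1.2],
])
alpha = cronbach_alpha(readings)
print(round(alpha, 3))  # close to 1: the five devices rank points consistently
```

Values above roughly 0.7, as in the study, are conventionally taken as acceptable internal consistency.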

  9. Critical Assessment of the Foundations of Power Transmission and Distribution Reliability Metrics and Standards.

    Science.gov (United States)

    Nateghi, Roshanak; Guikema, Seth D; Wu, Yue Grace; Bruss, C Bayan

    2016-01-01

    The U.S. federal government regulates the reliability of bulk power systems, while the reliability of power distribution systems is regulated at a state level. In this article, we review the history of regulating electric service reliability and study the existing reliability metrics, indices, and standards for power transmission and distribution networks. We assess the foundations of the reliability standards and metrics, discuss how they are applied to outages caused by large exogenous disturbances such as natural disasters, and investigate whether the standards adequately internalize the impacts of these events. Our reflections shed light on how existing standards conceptualize reliability, question the basis for treating large-scale hazard-induced outages differently from normal daily outages, and discuss whether this conceptualization maps well onto customer expectations. We show that the risk indices for transmission systems used in regulating power system reliability do not adequately capture the risks that transmission systems are prone to, particularly when it comes to low-probability high-impact events. We also point out several shortcomings associated with the way in which regulators require utilities to calculate and report distribution system reliability indices. We offer several recommendations for improving the conceptualization of reliability metrics and standards. We conclude that while the approaches taken in reliability standards have made considerable advances in enhancing the reliability of power systems and may be logical from a utility perspective during normal operation, existing standards do not provide a sufficient incentive structure for the utilities to adequately ensure high levels of reliability for end-users, particularly during large-scale events. © 2015 Society for Risk Analysis.

  10. Improving Stochastic Communication Network Performance: Reliability vs. Throughput

    Science.gov (United States)

    1991-12-01

increased to one, 2) arc survivabilities will be increased in increments of one tenth, and 3) the costs to increase arc survivabilities were equal and... This reliability value is then used to maximize the associated expected flow. For Network A, a budget of 800 produces a tradeoff point at (.58, 37)... Network B for a budget of 2000, which allows a network reliability of one to be achieved, and a budget of 1200, which allows for a maximum 57...

  11. Point of care hematocrit and hemoglobin in cardiac surgery: a review.

    Science.gov (United States)

    Myers, Gerard J; Browne, Joe

    2007-05-01

The use of point-of-care blood gas analyzers in cardiac surgery has been on the increase over the past decade. The availability of these analyzers in the operating room and post-operative intensive care units eliminates the time delays to transport samples to the main laboratory and reduces the amount of blood sampled to measure such parameters as electrolytes, blood gases, lactates, glucose and hemoglobin/hematocrit. Point-of-care analyzers also lead to faster and more reliable clinical decisions while the patient is still on the heart-lung machine. Point-of-care devices were designed to provide safe, appropriate and consistent care of those patients in need of rapid acid/base balance and electrolyte management in the clinical setting. As a result, clinicians rely on their values to make decisions regarding ventilation, acid/base management, transfusion and glucose management. Therefore, accuracy and reliability are an absolute must for these bedside analyzers in both the cardiac operating room and the post-op intensive care units. Clinicians have a choice of two types of technology to measure hemoglobin/hematocrit during bypass, which subsequently determines their patient's level of hemodilution, as well as their transfusion threshold. All modern point-of-care blood gas analyzers measure hematocrit using a technology called conductivity, while other similar devices measure hemoglobin using a technology called co-oximetry. The two methods are analyzed and compared in this review. The literature indicates that using conductivity to measure hematocrit during and after cardiac surgery could produce inaccurate results when hematocrits are less than 30%, and, therefore, result in unnecessary homologous red cell transfusions in some patients. These inaccuracies are influenced by several factors that are common and unique to cardiopulmonary bypass, and will also be reviewed here. It appears that the only accurate, consistent and reliable method to determine hemodilution

  12. Optimization Algorithms for Calculation of the Joint Design Point in Parallel Systems

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1992-01-01

    In large structures it is often necessary to estimate the reliability of the system by use of parallel systems. Optimality criteria-based algorithms for calculation of the joint design point in a parallel system are described and efficient active set strategies are developed. Three possible...

  13. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  14. Problems Related to Use of Some Terms in System Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Nadezda Hanusova

    2004-01-01

Full Text Available The paper deals with problems of using dependability terms, defined in the current standard STN IEC 50 (191): International electrotechnical dictionary, chap. 191: Dependability and quality of service (1993), in technical systems dependability analysis. The goal of the paper is to find a relation between the terms introduced in the mentioned standard and used in technical systems dependability analysis and the rules and practices used in system analysis within systems theory. A description of the part of the system life cycle related to reliability is used as a starting point. This part of the system life cycle is described by a state diagram, and the reliability-relevant terms are assigned.

  15. Relativity evaluation of reliability on operation in nuclear power plant

    International Nuclear Information System (INIS)

    Inata, Takashi

    1987-01-01

The report presents a quantitative method for evaluating the reliability of operations conducted in nuclear power plants. The quantitative reliability evaluation method is based on the 'detailed block diagram analysis (De-BDA)'. All units of a series of operations are separately displayed for each block and combined sequentially. Then, calculation is performed to evaluate the reliability. Basically, De-BDA calculation is made for pairs of operation labels, which are connected in parallel or in series at different subordination levels. The applicability of the De-BDA method is demonstrated by carrying out calculation for three model cases: operations in the event of malfunction of the control valve in the main water supply system for PWR, switching from an electrically-operated water supply pump to a turbine-operated water supply pump, and isolation and water removal operation for a low-pressure condensate pump. It is shown that the relative importance of each unit of a series of operations can be evaluated, making it possible to extract those units of greater importance, and that the priority among the factors which affect the reliability of operations can be determined. Results of the De-BDA calculation can serve to find important points to be considered in developing an operation manual, conducting education and training, and improving facilities. (Nogami, K.)

  16. Factors Influencing the Reliability of the Glasgow Coma Scale: A Systematic Review.

    Science.gov (United States)

    Reith, Florence Cm; Synnot, Anneliese; van den Brande, Ruben; Gruen, Russell L; Maas, Andrew Ir

    2017-06-01

    The Glasgow Coma Scale (GCS) characterizes patients with diminished consciousness. In a recent systematic review, we found overall adequate reliability across different clinical settings, but reliability estimates varied considerably between studies, and methodological quality of studies was overall poor. Identifying and understanding factors that can affect its reliability is important, in order to promote high standards for clinical use of the GCS. The aim of this systematic review was to identify factors that influence reliability and to provide an evidence base for promoting consistent and reliable application of the GCS. A comprehensive literature search was undertaken in MEDLINE, EMBASE, and CINAHL from 1974 to July 2016. Studies assessing the reliability of the GCS in adults or describing any factor that influences reliability were included. Two reviewers independently screened citations, selected full texts, and undertook data extraction and critical appraisal. Methodological quality of studies was evaluated with the consensus-based standards for the selection of health measurement instruments checklist. Data were synthesized narratively and presented in tables. Forty-one studies were included for analysis. Factors identified that may influence reliability are education and training, the level of consciousness, and type of stimuli used. Conflicting results were found for experience of the observer, the pathology causing the reduced consciousness, and intubation/sedation. No clear influence was found for the professional background of observers. Reliability of the GCS is influenced by multiple factors and as such is context dependent. This review points to the potential for improvement from training and education and standardization of assessment methods, for which recommendations are presented. Copyright © 2017 by the Congress of Neurological Surgeons.

  17. ETARA PC version 3.3 user's guide: Reliability, availability, maintainability simulation model

    Science.gov (United States)

    Hoffman, David J.; Viterna, Larry A.

    1991-01-01

    A user's manual describing an interactive, menu-driven, personal computer based Monte Carlo reliability, availability, and maintainability simulation program called event time availability reliability (ETARA) is discussed. Given a reliability block diagram representation of a system, ETARA simulates the behavior of the system over a specified period of time using Monte Carlo methods to generate block failure and repair intervals as a function of exponential and/or Weibull distributions. Availability parameters such as equivalent availability, state availability (percentage of time as a particular output state capability), continuous state duration and number of state occurrences can be calculated. Initial spares allotment and spares replenishment on a resupply cycle can be simulated. The number of block failures are tabulated both individually and by block type, as well as total downtime, repair time, and time waiting for spares. Also, maintenance man-hours per year and system reliability, with or without repair, at or above a particular output capability can be calculated over a cumulative period of time or at specific points in time.
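The core of a Monte Carlo availability simulation like the one ETARA performs can be sketched in a few lines: draw alternating failure and repair intervals from the specified distributions and accumulate uptime. The following is an illustrative single-block example with exponential distributions (the MTBF/MTTR values are arbitrary and this is not ETARA's actual code):

```python
import random

def simulate_availability(mtbf: float, mttr: float, mission_time: float,
                          n_runs: int = 2000, seed: int = 1) -> float:
    """Monte Carlo estimate of average availability of one repairable block
    with exponentially distributed failure (MTBF) and repair (MTTR) times."""
    rng = random.Random(seed)
    total_up = 0.0
    for _ in range(n_runs):
        t = up = 0.0
        while t < mission_time:
            ttf = rng.expovariate(1.0 / mtbf)     # time to next failure
            up += min(ttf, mission_time - t)      # uptime, capped at mission end
            t += ttf
            if t >= mission_time:
                break
            t += rng.expovariate(1.0 / mttr)      # repair (downtime) interval
        total_up += up
    return total_up / (n_runs * mission_time)

# Should approach the steady-state value MTBF / (MTBF + MTTR) = 0.99
avail = simulate_availability(mtbf=990.0, mttr=10.0, mission_time=10000.0)
print(round(avail, 4))
```

Weibull-distributed intervals, as supported by ETARA, would simply swap `expovariate` for `weibullvariate`; spares and multi-block diagrams add bookkeeping on top of this same loop.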

  18. 76 FR 16240 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits

    Science.gov (United States)

    2011-03-23

    ... identified by the Commission. \\5\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693... reliability of the interconnection by ensuring that the bulk electric system is assessed during the operations... responsibility for SOLs. Further, Bulk-Power System reliability practices assign responsibilities for analyzing...

  19. Pneumothorax size measurements on digital chest radiographs: Intra- and inter- rater reliability.

    Science.gov (United States)

    Thelle, Andreas; Gjerdevik, Miriam; Grydeland, Thomas; Skorge, Trude D; Wentzel-Larsen, Tore; Bakke, Per S

    2015-10-01

Detailed and reliable methods may be important for discussions on the importance of pneumothorax size in clinical decision-making. Rhea's method is widely used to estimate pneumothorax size in percent based on chest X-rays (CXRs) from three measurement points. Choi's addendum is used for anterioposterior projections. The aim of this study was to examine the intrarater and interrater reliability of the Rhea and Choi method using digital CXR in ward-based PACS monitors. Three physicians examined a retrospective series of 80 digital CXRs showing pneumothorax, using Rhea and Choi's method, then repeated the examination in a random order two weeks later. We used the analysis of variance technique by Eliasziw et al. to assess the intrarater and interrater reliability in altogether 480 estimations of pneumothorax size. Estimated pneumothorax sizes ranged between 5% and 100%. The intrarater reliability coefficient was 0.98 (95% one-sided lower-limit confidence interval 0.96), and the interrater reliability coefficient was 0.95 (95% one-sided lower-limit confidence interval 0.93). This study has shown that the Rhea and Choi method for calculating pneumothorax size has high intrarater and interrater reliability. These results are valid across gender, side of pneumothorax and whether the patient is diagnosed with primary or secondary pneumothorax. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. The validity and reliability of the Moroccan version of the Revised Fibromyalgia Impact Questionnaire.

    Science.gov (United States)

    Srifi, Najlaa; Bahiri, Rachid; Rostom, Samira; Bendeddouche, Imad; Lazrek, Noufissa; Hajjaj-Hassouni, Najia

    2013-01-01

The Revised Fibromyalgia Impact Questionnaire (FIQ-R) is an updated version that attempts to address the limitations of the Fibromyalgia Impact Questionnaire (FIQ). As there is no Moroccan version of the FIQ-R available, we aimed to investigate the validity and reliability of a Moroccan translation of the FIQR in Moroccan fibromyalgia (FM) patients. After translating the FIQR into Moroccan, it was administered to 80 patients with FM. All of the patients filled out the questionnaire together with the Arabic version of the Short Form-36 (SF-36). The tender-point count was calculated from tender points identified by thumb palpation. Three days later, FM patients filled out the Moroccan FIQR at their second visit. The test-retest reliability of the Moroccan FIQR questions ranged from 0.72 to 0.87. The test-retest reliability of the total FIQR score was 0.84. Cronbach's alpha was 0.91 for FIQR visit 1 (the first assessment) and 0.92 for FIQR visit 2 (the second assessment), indicating acceptable levels of internal consistency for both assessments. Significant correlations for construct validity were obtained between the Moroccan FIQ-R total and domain scores and the subscales of the SF-36 (FIQR total versus SF-36 physical component score and mental component score were r = -0.69, P < ...). The Moroccan FIQ-R showed adequate reliability and validity. This instrument can be used in the clinical evaluation of Moroccan and Arabic-speaking patients with FM.

  1. Design for reliability: NASA reliability preferred practices for design and test

    Science.gov (United States)

    Lalli, Vincent R.

    1994-01-01

    This tutorial summarizes reliability experience from both NASA and industry and reflects engineering practices that support current and future civil space programs. These practices were collected from various NASA field centers and were reviewed by a committee of senior technical representatives from the participating centers (members are listed at the end). The material for this tutorial was taken from the publication issued by the NASA Reliability and Maintainability Steering Committee (NASA Reliability Preferred Practices for Design and Test. NASA TM-4322, 1991). Reliability must be an integral part of the systems engineering process. Although both disciplines must be weighed equally with other technical and programmatic demands, the application of sound reliability principles will be the key to the effectiveness and affordability of America's space program. Our space programs have shown that reliability efforts must focus on the design characteristics that affect the frequency of failure. Herein, we emphasize that these identified design characteristics must be controlled by applying conservative engineering principles.

  2. Prediction of safety critical software operational reliability from test reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1999-01-01

It has been a critical issue to predict safety critical software reliability in the nuclear engineering area. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate the reliability with the failure data collected during the test, assuming that the test environments well represent the operational profile. The user's interest is, however, in the operational reliability rather than the test reliability. Experience shows that the operational reliability is higher than the test reliability. With the assumption that the difference in reliability results from the change of environment from testing to operation, testing environment factors comprising an aging factor and a coverage factor are developed in this paper and used to predict the ultimate operational reliability from the failure data of the testing phase. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results show that the proposed method can estimate the operational reliability accurately. (Author). 14 refs., 1 tab., 1 fig

  3. Inter-rater and intra-rater reliability of the Bahasa Melayu version of Rose Angina Questionnaire.

    Science.gov (United States)

    Hassan, N B; Choudhury, S R; Naing, L; Conroy, R M; Rahman, A R A

    2007-01-01

The objective of the study is to translate the Rose Questionnaire (RQ) into a Bahasa Melayu version, adapt it cross-culturally, and measure its inter-rater and intra-rater reliability. This cross-sectional study was conducted in the respondents' homes or workplaces in Kelantan, Malaysia. One hundred respondents aged 30 and above with different socio-demographic status were interviewed for face validity. For each of inter-rater and intra-rater reliability, a sample of 150 respondents was interviewed. Inter-rater and intra-rater reliabilities were assessed by Cohen's kappa. The overall inter-rater agreement by the five pairs of interviewers at points one and two was 0.86, and the intra-rater reliability by the five interviewers on the seven-item questionnaire at points one and two was 0.88, as measured by the kappa coefficient. The translated Malay version of the RQ demonstrated almost perfect inter-rater and intra-rater reliability, and further validation such as sensitivity and specificity analysis of this translated questionnaire is highly recommended.
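Cohen's kappa, the agreement statistic reported above, corrects the observed proportion of agreement for the agreement expected by chance from each rater's marginal frequencies. A minimal sketch with hypothetical yes/no angina classifications (not the study's data):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical ratings of the same subjects."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Chance agreement from the product of each category's marginal frequencies
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical classifications of 10 respondents by two interviewers
r1 = ["yes", "no", "no", "yes", "no", "no", "yes", "no", "no", "no"]
r2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "no", "no"]
kappa = cohens_kappa(r1, r2)
print(round(kappa, 2))  # 9/10 raw agreement shrinks once chance is removed
```

On the common Landis and Koch scale, values above 0.81, as found in the study, are labelled "almost perfect" agreement.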

  4. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    International Nuclear Information System (INIS)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J.

    2000-09-01

Digital I and C systems are widely used in the non-safety systems of NPPs, and their applications are expanding to safety-critical systems. The regulatory body is shifting its policy to risk-based regulation and may require Probabilistic Safety Assessment for digital I and C systems, but there is not yet an established reliability prediction methodology for digital I and C systems covering both software and hardware. This survey report covers many reliability prediction methods for electronic systems from a hardware point of view. Each method has both strong and weak points. The report presents the state of the art of prediction methods, focusing in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also includes the state of the art of software tools that support reliability prediction.

  5. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    Energy Technology Data Exchange (ETDEWEB)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J

    2000-09-01

Digital I and C systems are widely used in the non-safety systems of NPPs, and their applications are expanding to safety-critical systems. The regulatory body is shifting its policy to risk-based regulation and may require Probabilistic Safety Assessment for digital I and C systems, but there is not yet an established reliability prediction methodology for digital I and C systems covering both software and hardware. This survey report covers many reliability prediction methods for electronic systems from a hardware point of view. Each method has both strong and weak points. The report presents the state of the art of prediction methods, focusing in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also includes the state of the art of software tools that support reliability prediction.

  6. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
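Deriving system reliability from component reliability via a fault tree reduces, for simple structures, to combining series blocks (every component must work) and redundant blocks (any one suffices). A minimal sketch with hypothetical component reliabilities, not figures from the report:

```python
def series(*r: float) -> float:
    """Reliability of components in series: all must work (OR gate on failures)."""
    p = 1.0
    for x in r:
        p *= x
    return p

def parallel(*r: float) -> float:
    """Reliability of redundant components: at least one must work (AND gate on failures)."""
    q = 1.0
    for x in r:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical converter: a controller in series with two redundant power stages
r_controller = 0.95
r_stage = 0.90
r_system = series(r_controller, parallel(r_stage, r_stage))
print(round(r_system, 4))  # redundancy lifts the stage pair well above 0.90
```

Real fault-tree tools add repeated events, k-out-of-n gates, and importance measures on top of this same Boolean structure.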

  7. How Reliable Are Students' Evaluations of Teaching Quality? A Variance Components Approach

    Science.gov (United States)

    Feistauer, Daniela; Richter, Tobias

    2017-01-01

    The inter-rater reliability of university students' evaluations of teaching quality was examined with cross-classified multilevel models. Students (N = 480) evaluated lectures and seminars over three years with a standardised evaluation questionnaire, yielding 4224 data points. The total variance of these student evaluations was separated into the…

  8. Interrater and Intrarater Reliability of the Balance Computerized Adaptive Test in Patients With Stroke.

    Science.gov (United States)

    Chiang, Hsin-Yu; Lu, Wen-Shian; Yu, Wan-Hui; Hsueh, I-Ping; Hsieh, Ching-Lin

    2018-04-11

To examine the interrater and intrarater reliability of the Balance Computerized Adaptive Test (Balance CAT) in patients with chronic stroke having a wide range of balance functions. Repeated assessments design (1 wk apart). Seven teaching hospitals. A pooled sample (N=102) including 2 independent groups of outpatients (n=50 for the interrater reliability study; n=52 for the intrarater reliability study) with chronic stroke. Not applicable. Balance CAT. For the interrater reliability study, the values of the intraclass correlation coefficient, minimal detectable change (MDC), and percentage of MDC (MDC%) for the Balance CAT were .84, 1.90, and 31.0%, respectively. For the intrarater reliability study, the values of the intraclass correlation coefficient, MDC, and MDC% ranged from .89 to .91, from 1.14 to 1.26, and from 17.1% to 18.6%, respectively. The Balance CAT showed sufficient intrarater reliability in patients with chronic stroke having balance functions ranging from sitting with support to independent walking. Although the Balance CAT may have good interrater reliability, we found substantial random measurement error between different raters. Accordingly, if the Balance CAT is used as an outcome measure in clinical or research settings, the same raters should be used across different time points to ensure reliable assessments. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
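The MDC and MDC% figures above follow the standard definitions built on the ICC and the sample standard deviation: the standard error of measurement is SD·√(1−ICC), and the 95% MDC is 1.96·√2·SEM. A small sketch with hypothetical values for the score SD and mean (chosen for illustration, not taken from the study):

```python
import math

def mdc_95(sd: float, icc: float) -> float:
    """Minimal detectable change at the 95% confidence level."""
    sem = sd * math.sqrt(1.0 - icc)        # standard error of measurement
    return 1.96 * math.sqrt(2.0) * sem     # sqrt(2): difference of two measurements

# Hypothetical sample: score SD = 1.7, interrater ICC = .84, mean score = 6.1
sd, icc, mean_score = 1.7, 0.84, 6.1
mdc = mdc_95(sd, icc)
mdc_percent = 100.0 * mdc / mean_score
print(round(mdc, 2), round(mdc_percent, 1))
```

A score change smaller than the MDC cannot be distinguished from measurement error, which is why large interrater MDC% values argue for keeping the same rater across time points.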

  9. Effect of knee angle on neuromuscular assessment of plantar flexor muscles: A reliability study

    Science.gov (United States)

    Cornu, Christophe; Jubeau, Marc

    2018-01-01

    Introduction This study aimed to determine the intra- and inter-session reliability of neuromuscular assessment of plantar flexor (PF) muscles at three knee angles. Methods Twelve young adults were tested for three knee angles (90°, 30° and 0°) and at three time points separated by 1 hour (intra-session) and 7 days (inter-session). Electrical (H reflex, M wave) and mechanical (evoked and maximal voluntary torque, activation level) parameters were measured on the PF muscles. Intraclass correlation coefficients (ICC) and coefficients of variation were calculated to determine intra- and inter-session reliability. Results The mechanical measurements presented excellent (ICC>0.75) intra- and inter-session reliabilities regardless of the knee angle considered. The reliability of electrical measurements was better for the 90° knee angle compared to the 0° and 30° angles. Conclusions Changes in the knee angle may influence the reliability of neuromuscular assessments, which indicates the importance of considering the knee angle to collect consistent outcomes on the PF muscles. PMID:29596480

  10. Practical treatment of risks from the point of view of reliability of the human component of the system

    International Nuclear Information System (INIS)

    Burkardt, F.

One measure for reducing errors in the reliability of controlling and supervising activities is the use of aptitude tests. In addition, the working system must be designed according to the principles of information collection and processing and of working activities. In certain fields such measures can have a greater effect than selection processes. (DG) [de

  11. Application of a Terrestrial LIDAR System for Elevation Mapping in Terra Nova Bay, Antarctica

    Directory of Open Access Journals (Sweden)

    Hyoungsig Cho

    2015-09-01

Full Text Available A terrestrial Light Detection and Ranging (LIDAR) system has high productivity and accuracy for topographic mapping, but the harsh conditions of Antarctica make LIDAR operation difficult. Low temperatures cause malfunctioning of the LIDAR system, and unpredictable strong winds can deteriorate data quality by irregularly shaking co-registration targets. For stable and efficient LIDAR operation in Antarctica, this study proposes and demonstrates the following practical solutions: (1) a lagging cover with a heating pack to maintain the temperature of the terrestrial LIDAR system; (2) co-registration using square planar targets and two-step point-merging methods based on extracted feature points and the Iterative Closest Point (ICP) algorithm; and (3) a georeferencing module consisting of an artificial target and a Global Navigation Satellite System (GNSS) receiver. The solutions were used to produce a topographic map for construction of the Jang Bogo Research Station in Terra Nova Bay, Antarctica. Co-registration and georeferencing precision reached 5 and 45 mm, respectively, and the accuracy of the Digital Elevation Model (DEM) generated from the LIDAR scanning data was ±27.7 cm.

  12. Application of a Terrestrial LIDAR System for Elevation Mapping in Terra Nova Bay, Antarctica.

    Science.gov (United States)

    Cho, Hyoungsig; Hong, Seunghwan; Kim, Sangmin; Park, Hyokeun; Park, Ilsuk; Sohn, Hong-Gyoo

    2015-09-16

    A terrestrial Light Detection and Ranging (LIDAR) system has high productivity and accuracy for topographic mapping, but the harsh conditions of Antarctica make LIDAR operation difficult. Low temperatures cause malfunctioning of the LIDAR system, and unpredictable strong winds can deteriorate data quality by irregularly shaking co-registration targets. For stable and efficient LIDAR operation in Antarctica, this study proposes and demonstrates the following practical solutions: (1) a lagging cover with a heating pack to maintain the temperature of the terrestrial LIDAR system; (2) co-registration using square planar targets and two-step point-merging methods based on extracted feature points and the Iterative Closest Point (ICP) algorithm; and (3) a georeferencing module consisting of an artificial target and a Global Navigation Satellite System (GNSS) receiver. The solutions were used to produce a topographic map for construction of the Jang Bogo Research Station in Terra Nova Bay, Antarctica. Co-registration and georeferencing precision reached 5 and 45 mm, respectively, and the accuracy of the Digital Elevation Model (DEM) generated from the LIDAR scanning data was ±27.7 cm.
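At the heart of the ICP co-registration step mentioned above is the closed-form least-squares rigid transform between matched point sets (the Kabsch/SVD solution), applied repeatedly as correspondences are re-estimated. A minimal sketch of that core step on a synthetic cloud — an illustration of the general algorithm, not the authors' two-step point-merging implementation:

```python
import numpy as np

def best_rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t mapping src onto dst
    (the inner step of each ICP iteration, solved via SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Synthetic check: recover a known 30-degree rotation plus translation
rng = np.random.default_rng(0)
src = rng.random((20, 3))
a = np.radians(30)
R_true = np.array([[np.cos(a), -np.sin(a), 0],
                   [np.sin(a),  np.cos(a), 0],
                   [0,          0,         1]])
dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = best_rigid_transform(src, dst)
print(np.allclose(src @ R.T + t, dst))  # exact recovery in the noiseless case
```

Full ICP wraps this in a loop: match each source point to its nearest destination point, solve for (R, t), apply, and repeat until the residual converges.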

  13. 76 FR 73608 - Reliability Technical Conference, North American Electric Reliability Corporation, Public Service...

    Science.gov (United States)

    2011-11-29

    ... or municipal authority play in forming your bulk power system reliability plans? b. Do you support..., North American Electric Reliability Corporation (NERC) Nick Akins, CEO of American Electric Power (AEP..., EL11-62-000] Reliability Technical Conference, North American Electric Reliability Corporation, Public...

  14. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    A high reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution will describe the forum and advertise its usage in the community.

  15. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  16. The commissioning of CMS sites: Improving the site reliability

    International Nuclear Information System (INIS)

    Belforte, S; Fisk, I; Flix, J; Hernandez, J M; Klem, J; Letts, J; Magini, N; Saiz, P; Sciaba, A

    2010-01-01

    The computing system of the CMS experiment works using distributed resources from more than 60 computing centres worldwide. These centres, located in Europe, America and Asia are interconnected by the Worldwide LHC Computing Grid. The operation of the system requires a stable and reliable behaviour of the underlying infrastructure. CMS has established a procedure to extensively test all relevant aspects of a Grid site, such as the ability to efficiently use their network to transfer data, the functionality of all the site services relevant for CMS and the capability to sustain the various CMS computing workflows at the required scale. This contribution describes in detail the procedure to rate CMS sites depending on their performance, including the complete automation of the program, the description of monitoring tools, and its impact in improving the overall reliability of the Grid from the point of view of the CMS computing system.

  17. A framework for efficient spatial web object retrieval

    DEFF Research Database (Denmark)

    Wu, Dingming; Cong, Gao; Jensen, Christian S.

    2012-01-01

    The conventional Internet is acquiring a geospatial dimension. Web documents are being geo-tagged and geo-referenced objects such as points of interest are being associated with descriptive text documents. The resulting fusion of geo-location and documents enables new kinds of queries that take...

  18. RELIABILITY AND RESPONSIVENESS OF THE DANISH MODIFIED INTERNATIONAL KNEE DOCUMENTATION COMMITTEE SUBJECTIVE KNEE FORM FOR CHILDREN WITH KNEE DISORDERS

    DEFF Research Database (Denmark)

    Jacobsen, Julie Sandell; Knudsen, Pernille; Fynbo, Charlotte

    2016-01-01

    Introduction The modified International Knee Documentation Committee Subjective Knee Form (Pedi-IKDC) is a widely used patient-reported tool applicable for children with knee disorders, scored on a scale from 0-100. We aimed to translate the Pedi-IKDC Subjective Knee Form into Danish, and furthermore to assess its reliability and responsiveness. Material and Methods The Pedi-IKDC Subjective Knee Form was translated to Danish according to international guidelines. Reliability was assessed with Bland-Altman plots, standard error of measurement (SEM), Minimal Detectable Change (MDC) and the Intraclass Correlation Coefficient (ICC). Reliability and responsiveness were assessed in 50 children (median 15 years) referred to hospital due to different knee disorders. Results The SEM was 4.2 points and the MDC was 11.5 points. The ICC was 0.91 (0.9-1.0). The change score of the Pedi-IKDC Subjective Knee Form was correlated to the external...
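The reported SEM and MDC are related by standard formulas: SEM = SD·√(1 − ICC) and MDC95 = 1.96·√2·SEM. With SEM = 4.2 points this gives ≈ 11.6, consistent with the reported 11.5 given rounding of the inputs. A minimal sketch:

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the test-retest SD and ICC."""
    return sd * math.sqrt(1 - icc)

def mdc95(sem_value):
    """Minimal detectable change at 95% confidence."""
    return 1.96 * math.sqrt(2) * sem_value
```

For example, a test-retest SD of 10 points with ICC = 0.84 yields SEM = 4.0 and MDC95 ≈ 11.1.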

  19. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

    Reliability techniques have developed in response to the needs of the various engineering disciplines, although some would argue that much work was done on reliability before the word was used in its current sense. The military, space and nuclear industries were the first to become involved in this topic, but this small revolution in favour of higher product reliability has not remained confined to those environments; it has spread to industry as a whole. Mass production, characteristic of modern industry, led four decades ago to a fall in the reliability of its products, caused on the one hand by mass production itself and, on the other, by industrial techniques that were newly introduced and not yet stabilized. Industry had to change to meet these two new requirements, creating products of medium complexity while assuring a level of reliability appropriate to production costs and controls. Reliability became an integral part of the manufactured product. With this philosophy, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, with a unifying scientific basis for the entire subject. It consists of eight chapters plus statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and has a potentially diverse audience as a textbook for both academic and industrial courses. (author)

  20. An overview of reliability methods in mechanical and structural design

    Science.gov (United States)

    Wirsching, P. H.; Ortiz, K.; Lee, S. J.

    1987-01-01

    An evaluation is made of modern methods of fast probability integration and Monte Carlo treatment for the assessment of structural systems' and components' reliability. Fast probability integration methods are noted to be more efficient than Monte Carlo ones. This is judged to be an important consideration when several point probability estimates must be made in order to construct a distribution function. An example illustrating the relative efficiency of the various methods is included.

  1. Using a Hybrid Cost-FMEA Analysis for Wind Turbine Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Nacef Tazi

    2017-02-01

    Full Text Available Failure mode and effects analysis (FMEA) has been proven to be an effective methodology to improve system design reliability. However, the standard approach reveals some weaknesses when applied to wind turbine systems. The conventional criticality assessment method has been criticized as having many limitations, such as the weighting of severity and detection factors. In this paper, we aim to overcome these drawbacks and develop a hybrid cost-FMEA by integrating cost factors to assess criticality; these costs vary from replacement costs to expected failure costs. Then, a quantitative comparative study is carried out to point out average failure rate, main cause of failure, expected failure costs and failure detection techniques. A special reliability analysis of the gearbox and rotor blades is presented.
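One way to integrate cost factors into criticality, in the spirit of the hybrid cost-FMEA above, is to score each failure mode by its expected annual cost instead of an ordinal severity × occurrence × detection product. The weighting below is hypothetical and illustrative; the paper's exact formulation may differ:

```python
def expected_failure_cost(failure_rate_per_year, replacement_cost,
                          downtime_hours, cost_per_hour):
    """Criticality of a failure mode as its expected annual cost:
    failure rate times (replacement cost + lost-production cost)."""
    return failure_rate_per_year * (replacement_cost
                                    + downtime_hours * cost_per_hour)
```

Failure modes can then be ranked by this monetary criticality, e.g. a mode failing 0.5 times/year with a 1000-unit replacement cost and 24 h of downtime at 100/h scores 1700 per year.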

  2. Inter-rater reliability of measures to characterize the tobacco retail environment in Mexico

    Directory of Open Access Journals (Sweden)

    Marissa G Hall

    2015-11-01

    Full Text Available Objective. To evaluate the inter-rater reliability of a data collection instrument to assess the tobacco retail environment in Mexico, after major marketing regulations were implemented. Materials and methods. In 2013, two data collectors independently evaluated 21 stores in two census tracts, through a data collection instrument that assessed the presence of price promotions, whether single cigarettes were sold, the number of visible advertisements, the presence of signage prohibiting the sale of cigarettes to minors, and characteristics of cigarette pack displays. We evaluated the inter-rater reliability of the collected data through the calculation of metrics such as the intraclass correlation coefficient, percent agreement, Cohen’s kappa and Krippendorff’s alpha. Results. Most measures demonstrated substantial or perfect inter-rater reliability. Conclusions. Our results indicate the potential utility of the data collection instrument for future point-of-sale research.
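Two of the agreement metrics named above, percent agreement and Cohen's kappa, can be computed directly from the two raters' paired codes. A minimal sketch (the study also used the ICC and Krippendorff's alpha, not shown here):

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Fraction of items on which the two raters gave the same code."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    po = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # expected chance agreement from each rater's marginal code frequencies
    pe = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / n ** 2
    return (po - pe) / (1 - pe)
```

Kappa is 1.0 for perfect agreement and 0.0 when observed agreement equals what the raters' marginal frequencies would produce by chance.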

  3. Reliability and criterion-related validity testing (construct) of the Endotracheal Suction Assessment Tool (ESAT©).

    Science.gov (United States)

    Davies, Kylie; Bulsara, Max K; Ramelet, Anne-Sylvie; Monterosso, Leanne

    2018-05-01

    To establish criterion-related construct validity and test-retest reliability for the Endotracheal Suction Assessment Tool© (ESAT©). Endotracheal tube suction performed in children can significantly affect clinical stability. Previously identified clinical indicators for endotracheal tube suction were used as criteria when designing the ESAT©. Content validity was reported previously. The final stages of psychometric testing are presented. Observational testing was used to measure construct validity and determine whether the ESAT© could guide "inexperienced" paediatric intensive care nurses' decision-making regarding endotracheal tube suction. Test-retest reliability of the ESAT© was performed at two time points. The researchers and paediatric intensive care nurse "experts" developed 10 hypothetical clinical scenarios with predetermined endotracheal tube suction outcomes. "Experienced" (n = 12) and "inexperienced" (n = 14) paediatric intensive care nurses were presented with the scenarios and the ESAT© guiding decision-making about whether to perform endotracheal tube suction for each scenario. Outcomes were compared with those predetermined by the "experts" (n = 9). Test-retest reliability of the ESAT© was measured at two consecutive time points (4 weeks apart) with "experienced" and "inexperienced" paediatric intensive care nurses using the same scenarios and tool to guide decision-making. No differences were observed between endotracheal tube suction decisions made by "experts" (n = 9), "inexperienced" (n = 14) and "experienced" (n = 12) nurses confirming the tool's construct validity. No differences were observed between groups for endotracheal tube suction decisions at T1 and T2. Criterion-related construct validity and test-retest reliability of the ESAT© were demonstrated. Further testing is recommended to confirm reliability in the clinical setting with the "inexperienced" nurse to guide decision-making related to endotracheal tube

  4. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; reliability of non-repairable systems; reliability of repairable systems; reliability sampling tests; failure analysis, such as failure analysis by FMEA and FTA, with cases; accelerated life testing, including basic concepts, acceleration and acceleration factors, and analysis of accelerated life testing data; and maintenance policies concerning replacement and inspection.

  5. Reliability in automotive and mechanical engineering determination of component and system reliability

    CERN Document Server

    Bertsche, Bernd

    2008-01-01

    In the present climate of global competition in every branch of engineering and manufacture, extensive customer surveys have shown that, above every other attribute, reliability stands as the most desired feature in a finished product. Any organisation that neglects the pursuit of excellence in reliability will do so at serious cost. Reliability in Automotive and Mechanical Engineering draws together a wide spectrum of diverse and relevant applications and analyses on reliability engineering, distilled into this attractive and well documented volume. Practising engineers are challenged with the formidable task of simultaneously improving reliability and reducing the costs and down-time due to maintenance. The volume brings together eleven chapters to highlight the importance of the interrelated reliability and maintenance disciplines. They represent the development trends and progress resulting in making this book ess...

  6. Validity evidence and reliability of a simulated patient feedback instrument.

    Science.gov (United States)

    Schlegel, Claudia; Woermann, Ulrich; Rethans, Jan-Joost; van der Vleuten, Cees

    2012-01-27

    In the training of healthcare professionals, one of the advantages of communication training with simulated patients (SPs) is the SP's ability to provide direct feedback to students after a simulated clinical encounter. The quality of SP feedback must be monitored, especially because it is well known that feedback can have a profound effect on student performance. Due to the current lack of valid and reliable instruments to assess the quality of SP feedback, our study examined the validity and reliability of one potential instrument, the 'modified Quality of Simulated Patient Feedback Form' (mQSF). Content validity of the mQSF was assessed by inviting experts in the area of simulated clinical encounters to rate the importance of the mQSF items. Moreover, generalizability theory was used to examine the reliability of the mQSF. Our data came from videotapes of clinical encounters between six simulated patients and six students and the ensuing feedback from the SPs to the students. Ten faculty members judged the SP feedback according to the items on the mQSF. Three weeks later, this procedure was repeated with the same faculty members and recordings. All but two items of the mQSF received importance ratings of > 2.5 on a four-point rating scale. A generalizability coefficient of 0.77 was established with two judges observing one encounter. The findings for content validity and reliability with two judges suggest that the mQSF is a valid and reliable instrument to assess the quality of feedback provided by simulated patients.

  7. Waste container weighing data processing to create reliable information of household waste generation.

    Science.gov (United States)

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data was processed by knowledge discovery and data mining techniques to create reliable information of household waste generation. The final data set included 27,865 weight measurements covering the whole year 2013 and it was selected from a database of Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers and it was processed identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%. This provides evidence that the waste weighing data gives reliable information of mixed waste generation at collection point level. Characteristic of mixed household waste arising at the waste collection point level is a wide variation between pickups. The seasonal variation pattern as a result of collective similarities in behaviour of households was clearly detected by smoothed medians of waste weight time series. The evaluation of the collection time series against the defined distribution range of pickup weights on the waste collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity and the collection points with over- and under-dimensioned container capacities were noted in 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information of household waste generation and its variations. Copyright © 2015 Elsevier Ltd. All rights reserved.
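The screening step described, flagging missing values and inconsistently low or high weights as errors, and the smoothed medians used to expose seasonal variation, can be sketched as follows. The thresholds and window size are illustrative, not values from the paper:

```python
import statistics

def screen_weights(weights, low=20.0, high=1200.0):
    """Split pickup weights into valid values and errors
    (missing, or outside a plausible range for the container type)."""
    valid, errors = [], []
    for w in weights:
        if w is None or not (low <= w <= high):
            errors.append(w)
        else:
            valid.append(w)
    return valid, errors

def smoothed_median(series, window=5):
    """Running median over a centred window, truncated at the ends."""
    half = window // 2
    return [statistics.median(series[max(0, i - half): i + half + 1])
            for i in range(len(series))]
```

The running median damps the wide pickup-to-pickup variation so that the underlying seasonal pattern of a collection point becomes visible.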

  8. A discrete-time Bayesian network reliability modeling and analysis framework

    International Nuclear Information System (INIS)

    Boudali, H.; Dugan, J.B.

    2005-01-01

    Dependability tools are becoming an indispensable tool for modeling and analyzing (critical) systems. However the growing complexity of such systems calls for increasing sophistication of these tools. Dependability tools need to not only capture the complex dynamic behavior of the system components, but they must be also easy to use, intuitive, and computationally efficient. In general, current tools have a number of shortcomings including lack of modeling power, incapacity to efficiently handle general component failure distributions, and ineffectiveness in solving large models that exhibit complex dependencies between their components. We propose a novel reliability modeling and analysis framework based on the Bayesian network (BN) formalism. The overall approach is to investigate timed Bayesian networks and to find a suitable reliability framework for dynamic systems. We have applied our methodology to two example systems and preliminary results are promising. We have defined a discrete-time BN reliability formalism and demonstrated its capabilities from a modeling and analysis point of view. This research shows that a BN based reliability formalism is a powerful potential solution to modeling and analyzing various kinds of system components behaviors and interactions. Moreover, being based on the BN formalism, the framework is easy to use and intuitive for non-experts, and provides a basis for more advanced and useful analyses such as system diagnosis
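At its core, a discrete-time BN reliability model assigns each component a failure probability per time slice and infers the system state from the joint distribution over component states. A brute-force sketch for a series system (real BN tools factorise this computation instead of enumerating every joint state):

```python
from itertools import product

def bn_series_reliability(fail_probs):
    """P(system up) for a series system, by summing the joint
    distribution over component states (0 = up, 1 = failed).
    This enumerates what BN inference computes by factorisation."""
    rel = 0.0
    for states in product([0, 1], repeat=len(fail_probs)):
        p = 1.0
        for s, q in zip(states, fail_probs):
            p *= q if s else (1 - q)
        if not any(states):        # series system: up iff all components up
            rel += p
    return rel
```

For independent components this reduces to the product of the component reliabilities, e.g. [0.1, 0.2] gives 0.9 × 0.8 = 0.72; the value of the BN formalism is that it also handles dependencies between components, which this toy enumeration could express by making `p` depend on parent states.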

  9. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    [Figure captions: inverters connected in a chain; typical graph showing frequency versus square root of ...] The report describes developing an experimental reliability estimating methodology that could illuminate the lifetime reliability of advanced devices and circuits, i.e. the FIT of the device. In other words, an accurate estimate of the device lifetime was found, and thus the reliability can be conveniently...

  10. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. The treatment draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis, and includes examples of the application of the systems approach.

  11. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.
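The integration of system cost with outage cost described above amounts to choosing the reserve margin that minimises total expected cost: carrying cost of the reserve plus the expected cost of outages it fails to prevent. A schematic sketch, in which the cost figures and the outage-cost function are hypothetical:

```python
def optimal_reserve(reserve_options_mw, carrying_cost_per_mw, expected_outage_cost):
    """Pick the reserve margin (MW) minimising
    total cost = reserve carrying cost + expected customer outage cost."""
    return min(reserve_options_mw,
               key=lambda r: r * carrying_cost_per_mw + expected_outage_cost(r))
```

With a declining outage-cost curve such as `lambda r: 100.0 / (1 + r)` and a carrying cost of 1.0 per MW, the optimum sits where adding more reserve costs more than the outage cost it avoids.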

  12. Combining structure-from-motion derived point clouds from satellites and unmanned aircraft systems images with ground-truth data to create high-resolution digital elevation models

    Science.gov (United States)

    Palaseanu, M.; Thatcher, C.; Danielson, J.; Gesch, D. B.; Poppenga, S.; Kottermair, M.; Jalandoni, A.; Carlson, E.

    2016-12-01

    Coastal topographic and bathymetric (topobathymetric) data with high spatial resolution (1-meter or better) and high vertical accuracy are needed to assess the vulnerability of Pacific Islands to climate change impacts, including sea level rise. According to the Intergovernmental Panel on Climate Change reports, low-lying atolls in the Pacific Ocean are extremely vulnerable to king tide events, storm surge, tsunamis, and sea-level rise. The lack of coastal topobathymetric data has been identified as a critical data gap for climate vulnerability and adaptation efforts in the Republic of the Marshall Islands (RMI). For Majuro Atoll, home to the largest city of RMI, the only elevation dataset currently available is the Shuttle Radar Topography Mission data, which has a 30-meter spatial resolution and 16-meter vertical accuracy (expressed as linear error at 90%). To generate high-resolution digital elevation models (DEMs) in the RMI, elevation information and photographic imagery have been collected from field surveys using GNSS/total station and unmanned aerial vehicles for Structure-from-Motion (SfM) point cloud generation. DigitalGlobe WorldView-2 imagery was processed to create SfM point clouds to fill in gaps in the point cloud derived from the higher resolution UAS photos. The combined point cloud data is filtered and classified to bare-earth and georeferenced using the GNSS data acquired on roads and along survey transects perpendicular to the coast. A total station was used to collect elevation data under tree canopies where heavy vegetation cover blocked the view of GNSS satellites. A subset of the GPS/total station data was set aside for error assessment of the resulting DEM.
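Error assessment against withheld checkpoints, as in the final step above, typically reports RMSE and linear error at 90% (LE90 ≈ 1.6449 · RMSE when errors are normally distributed, the convention used for the SRTM figure quoted earlier). A minimal sketch:

```python
import math

def vertical_accuracy(dem_z, checkpoint_z):
    """RMSE and LE90 of DEM elevations against ground-truth checkpoints.
    LE90 assumes normally distributed, zero-mean errors."""
    errors = [d - c for d, c in zip(dem_z, checkpoint_z)]
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    le90 = 1.6449 * rmse
    return rmse, le90
```

For example, DEM elevations [1.1, 0.9, 1.0, 1.2] m against four 1.0 m checkpoints give an RMSE of about 0.12 m and an LE90 of about 0.20 m.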

  13. How to interpret safety critical failures in risk and reliability assessments

    International Nuclear Information System (INIS)

    Selvik, Jon Tømmerås; Signoret, Jean-Pierre

    2017-01-01

    Management of safety systems often receives high attention due to the potential for industrial accidents. In risk and reliability literature concerning such systems, and particularly concerning safety-instrumented systems, one frequently comes across the term ‘safety critical failure’. It is a term associated with the term ‘critical failure’, and it is often deduced that a safety critical failure refers to a failure occurring in a safety critical system. Although this is correct in some situations, it is not matching with for example the mathematical definition given in ISO/TR 12489:2013 on reliability modeling, where a clear distinction is made between ‘safe failures’ and ‘dangerous failures’. In this article, we show that different interpretations of the term ‘safety critical failure’ exist, and there is room for misinterpretations and misunderstandings regarding risk and reliability assessments where failure information linked to safety systems are used, and which could influence decision-making. The article gives some examples from the oil and gas industry, showing different possible interpretations of the term. In particular we discuss the link between criticality and failure. The article points in general to the importance of adequate risk communication when using the term, and gives some clarification on interpretation in risk and reliability assessments.

  14. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, Bayesian analysis. ... A good introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. LEWIS NERI, CHIEF ... includes one or both of the following objectives: a) prediction of the current system reliability, b) projection on the system reliability for some future...

  15. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  16. Reliability studies of a high-power proton accelerator for accelerator-driven system applications for nuclear waste transmutation

    Energy Technology Data Exchange (ETDEWEB)

    Burgazzi, Luciano [ENEA-Centro Ricerche 'Ezio Clementel', Advanced Physics Technology Division, Via Martiri di Monte Sole, 4, 40129 Bologna (Italy)]. E-mail: burgazzi@bologna.enea.it; Pierini, Paolo [INFN-Sezione di Milano, Laboratorio Acceleratori e Superconduttivita Applicata, Via Fratelli Cervi 201, I-20090 Segrate (MI) (Italy)

    2007-04-15

    The main effort of the present study is to analyze the availability and reliability of a high-performance linac (linear accelerator) conceived for Accelerator-Driven Systems (ADS) purpose and to suggest recommendations, in order both to meet the high operability goals and to satisfy the safety requirements dictated by the reactor system. Reliability Block Diagrams (RBD) approach has been considered for system modelling, according to the present level of definition of the design: component failure modes are assessed in terms of Mean Time Between Failure (MTBF) and Mean Time To Repair (MTTR), reliability and availability figures are derived, applying the current reliability algorithms. The lack of a well-established component database has been pointed out as the main issue related to the accelerator reliability assessment. The results, affected by the conservative character of the study, show a high margin for the improvement in terms of accelerator reliability and availability figures prediction. The paper outlines the viable path towards the accelerator reliability and availability enhancement process and delineates the most proper strategies. The improvement in the reliability characteristics along this path is shown as well.
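The RBD calculation described, deriving availability from per-component MTBF and MTTR figures, reduces to the steady-state formula A = MTBF / (MTBF + MTTR) per block, with blocks in series multiplying. A minimal sketch:

```python
def availability(mtbf, mttr):
    """Steady-state availability of one repairable component."""
    return mtbf / (mtbf + mttr)

def series_availability(components):
    """RBD series combination: the system is up only if every block is up.
    components is a list of (MTBF, MTTR) pairs in consistent time units."""
    a = 1.0
    for mtbf, mttr in components:
        a *= availability(mtbf, mttr)
    return a
```

A linac modelled as many series blocks shows why accelerator availability targets are demanding: two components at 99.9% availability each already limit the chain to about 99.8%.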

  17. Reliability studies of a high-power proton accelerator for accelerator-driven system applications for nuclear waste transmutation

    International Nuclear Information System (INIS)

    Burgazzi, Luciano; Pierini, Paolo

    2007-01-01

    The main effort of the present study is to analyze the availability and reliability of a high-performance linac (linear accelerator) conceived for Accelerator-Driven Systems (ADS) purpose and to suggest recommendations, in order both to meet the high operability goals and to satisfy the safety requirements dictated by the reactor system. Reliability Block Diagrams (RBD) approach has been considered for system modelling, according to the present level of definition of the design: component failure modes are assessed in terms of Mean Time Between Failure (MTBF) and Mean Time To Repair (MTTR), reliability and availability figures are derived, applying the current reliability algorithms. The lack of a well-established component database has been pointed out as the main issue related to the accelerator reliability assessment. The results, affected by the conservative character of the study, show a high margin for the improvement in terms of accelerator reliability and availability figures prediction. The paper outlines the viable path towards the accelerator reliability and availability enhancement process and delineates the most proper strategies. The improvement in the reliability characteristics along this path is shown as well

  18. Memory persistency and nonlinearity in daily mean dew point across India

    Science.gov (United States)

    Ray, Rajdeep; Khondekar, Mofazzal Hossain; Ghosh, Koushik; Bhattacharjee, Anup Kumar

    2016-04-01

    Enterprising endeavour has been taken in this work to realize and estimate the persistence in memory of the daily mean dew point time series obtained from seven different weather stations viz. Kolkata, Chennai (Madras), New Delhi, Mumbai (Bombay), Bhopal, Agartala and Ahmedabad representing different geographical zones in India. Hurst exponent values reveal an anti-persistent behaviour of these dew point series. To affirm the Hurst exponent values, five different scaling methods have been used and the corresponding results are compared to synthesize a finer and reliable conclusion out of it. The present analysis also bespeaks that the variation in daily mean dew point is governed by a non-stationary process with stationary increments. The delay vector variance (DVV) method has been exploited to investigate nonlinearity, and the present calculation confirms the presence of deterministic nonlinear profile in the daily mean dew point time series of the seven stations.
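One of the scaling methods commonly used for such estimates is rescaled-range (R/S) analysis; a Hurst exponent H < 0.5 indicates the anti-persistent behaviour reported for the dew point series. A compact sketch of one such estimator (illustrative, not the authors' code, and only one of the five methods they compare):

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Rescaled-range estimate of the Hurst exponent: compute the mean
    R/S statistic over non-overlapping chunks at doubling sizes, then
    fit the slope of log(R/S) against log(chunk size)."""
    x = np.asarray(x, dtype=float)
    sizes, rs = [], []
    n = min_chunk
    while n <= len(x) // 2:
        vals = []
        for i in range(0, len(x) - n + 1, n):
            c = x[i:i + n]
            dev = np.cumsum(c - c.mean())   # cumulative deviation from chunk mean
            s = c.std()
            if s > 0:
                vals.append((dev.max() - dev.min()) / s)
        sizes.append(n)
        rs.append(np.mean(vals))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope
```

White noise should score near H = 0.5 (the estimator is known to be biased slightly upward on short series), persistent series above it, and anti-persistent series such as the dew point data below it.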

  19. Presentation layer finding database of cyanobacteria and algae

    OpenAIRE

    SEMECKÝ, Jiří

    2012-01-01

    The Phycological Laboratory of the University of South Bohemia in České Budějovice uses a database of occurrence samples. This work deals with the analysis and optimization of the existing database, and with designing and programming an extension that processes points based on GPS coordinates and displays them on online maps and georeferenced images.

  20. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.

  1. Assessing the Reliability of Curriculum-Based Measurement: An Application of Latent Growth Modeling

    Science.gov (United States)

    Yeo, Seungsoo; Kim, Dong-Il; Branum-Martin, Lee; Wayman, Miya Miura; Espin, Christine A.

    2012-01-01

    The purpose of this study was to demonstrate the use of Latent Growth Modeling (LGM) as a method for estimating the reliability of Curriculum-Based Measurement (CBM) progress-monitoring data. The LGM approach permits the error associated with each measure to differ at each time point, thus providing an alternative method for examining the…

  2. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    Many attempts have been made to estimate a reliability function. At the ESReDA 20th seminar, a new nonparametric method was proposed. The major point of that paper is how to use censored data efficiently. Generally, there are three approaches to estimating a reliability function in a nonparametric way, i.e., the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. These three methods have some limitations, so we suggest an advanced method that reflects censored information more efficiently. In many instances there will be a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by the process of differentiation. It is well known that the three methods generally used to estimate a reliability function nonparametrically have maximum likelihood estimators that exist uniquely. The MLE of the new method is therefore derived in this study. The procedure to calculate the MLE is similar to that of the PL-estimator; the difference between the two is that in the new method the mass (or weight) of each observation influences the others, whereas in the PL-estimator it does not.
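    For context, the baseline Product-Limit (Kaplan-Meier) estimator that such a method refines can be sketched as follows; this is the standard textbook estimator, not the paper's new weighting scheme.

    ```python
    # Standard product-limit (Kaplan-Meier) estimator for right-censored
    # lifetimes: `times` are observed times, `events` flags 1 = failure,
    # 0 = censored observation.

    def product_limit(times, events):
        data = sorted(zip(times, events))
        n = len(data)
        surv, curve, i = 1.0, [], 0
        while i < n:
            t = data[i][0]
            tied = [e for tt, e in data if tt == t]
            deaths = sum(tied)
            at_risk = n - i
            if deaths:                    # the curve steps only at failures
                surv *= 1.0 - deaths / at_risk
                curve.append((t, surv))
            i += len(tied)                # censored ties only shrink the risk set
        return curve
    ```

    Censored observations reduce the risk set without forcing a step in the curve; the method described in the abstract differs precisely in how each observation's mass is allowed to influence the others.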

  3. Reliability analysis of the epidural spinal cord compression scale.

    Science.gov (United States)

    Bilsky, Mark H; Laufer, Ilya; Fourney, Daryl R; Groff, Michael; Schmidt, Meic H; Varga, Peter Paul; Vrionis, Frank D; Yamada, Yoshiya; Gerszten, Peter C; Kuklo, Timothy R

    2010-09-01

    The evolution of imaging techniques, along with highly effective radiation options has changed the way metastatic epidural tumors are treated. While high-grade epidural spinal cord compression (ESCC) frequently serves as an indication for surgical decompression, no consensus exists in the literature about the precise definition of this term. The advancement of the treatment paradigms in patients with metastatic tumors for the spine requires a clear grading scheme of ESCC. The degree of ESCC often serves as a major determinant in the decision to operate or irradiate. The purpose of this study was to determine the reliability and validity of a 6-point, MR imaging-based grading system for ESCC. To determine the reliability of the grading scale, a survey was distributed to 7 spine surgeons who participate in the Spine Oncology Study Group. The MR images of 25 cervical or thoracic spinal tumors were distributed consisting of 1 sagittal image and 3 axial images at the identical level including T1-weighted, T2-weighted, and Gd-enhanced T1-weighted images. The survey was administered 3 times at 2-week intervals. The inter- and intrarater reliability was assessed. The inter- and intrarater reliability ranged from good to excellent when surgeons were asked to rate the degree of spinal cord compression using T2-weighted axial images. The T2-weighted images were superior indicators of ESCC compared with T1-weighted images with and without Gd. The ESCC scale provides a valid and reliable instrument that may be used to describe the degree of ESCC based on T2-weighted MR images. This scale accounts for recent advances in the treatment of spinal metastases and may be used to provide an ESCC classification scheme for multicenter clinical trial and outcome studies.

  4. A Method to Increase Drivers' Trust in Collision Warning Systems Based on Reliability Information of Sensor

    Science.gov (United States)

    Tsutsumi, Shigeyoshi; Wada, Takahiro; Akita, Tokihiko; Doi, Shun'ichi

    A driver's workload tends to increase when driving in complicated traffic environments, such as during a lane change. In such cases, a rear collision warning is effective for reducing cognitive workload. On the other hand, it has been pointed out that false or missing alarms caused by sensor errors decrease the driver's trust in the warning system, which can result in low efficiency of the system. Suppose that reliability information about the sensor is provided in real time. In this paper, we propose a new warning method that increases the driver's trust in the system, even with low sensor reliability, by utilizing the sensor reliability information. The effectiveness of the warning method is shown by driving simulator experiments.

  5. Research on cognitive reliability model for main control room considering human factors in nuclear power plants

    International Nuclear Information System (INIS)

    Jiang Jianjun; Zhang Li; Wang Yiqun; Zhang Kun; Peng Yuyuan; Zhou Cheng

    2012-01-01

    To address the shortcomings of traditional cognitive factors and cognitive models, this paper presents a Bayesian network cognitive reliability model, taking the main control room as the reference background and human factors as the key points. The model mainly analyzes how cognitive reliability is affected by human factors, and, for each cognitive node and the influence factors corresponding to it, a series of methods and function formulas to compute the node's cognitive reliability are proposed. The model and the corresponding methods can be applied to the evaluation of the cognitive process of nuclear power plant operators and have a certain significance for the prevention of safety accidents in nuclear power plants. (authors)

  6. A Method for Estimating Surveillance Video Georeferences

    Directory of Open Access Journals (Sweden)

    Aleksandar Milosavljević

    2017-07-01

    Full Text Available The integration of a surveillance camera video with a three-dimensional (3D) geographic information system (GIS) requires the georeferencing of that video. Since a video consists of separate frames, each frame must be georeferenced. To georeference a video frame, we rely on the information about the camera view at the moment that the frame was captured. A camera view in 3D space is completely determined by the camera position, orientation, and field-of-view. Since the accurate measuring of these parameters can be extremely difficult, in this paper we propose a method for their estimation based on matching video frame coordinates of certain point features with their 3D geographic locations. To obtain these coordinates, we rely on high-resolution orthophotos and digital elevation models (DEM) of the area of interest. Once an adequate number of points are matched, Levenberg–Marquardt iterative optimization is applied to find the most suitable video frame georeference, i.e., the position and orientation of the camera.
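    The Levenberg-Marquardt step at the heart of such a method can be illustrated on a toy pose problem. The sketch below recovers only a 2D camera position and heading from bearings to known ground points; the actual method estimates full 3D position, orientation, and field-of-view, and all landmark coordinates here are invented.

    ```python
    import numpy as np

    def lm_fit(residuals, p0, iters=100, lam=1e-3):
        """Minimal Levenberg-Marquardt loop with a forward-difference Jacobian."""
        p = np.asarray(p0, dtype=float)
        for _ in range(iters):
            r = residuals(p)
            J = np.empty((r.size, p.size))
            for j in range(p.size):
                dp = np.zeros_like(p)
                dp[j] = 1e-7
                J[:, j] = (residuals(p + dp) - r) / 1e-7
            step = np.linalg.solve(J.T @ J + lam * np.eye(p.size), -J.T @ r)
            if np.sum(residuals(p + step) ** 2) < np.sum(r ** 2):
                p, lam = p + step, lam * 0.5   # accept step, relax damping
            else:
                lam *= 2.0                     # reject step, increase damping
        return p

    # Known ground points (hypothetically read off an orthophoto/DEM) and the
    # bearings a camera at an unknown 2D pose would observe to each of them.
    landmarks = np.array([[10.0, 0.0], [10.0, 5.0], [6.0, 8.0]])

    def bearings(p):                           # p = (x, y, heading)
        d = landmarks - p[:2]
        return np.arctan2(d[:, 1], d[:, 0]) - p[2]

    true_pose = np.array([1.0, 2.0, 0.3])
    observed = bearings(true_pose)

    estimate = lm_fit(lambda p: bearings(p) - observed, p0=np.zeros(3))
    ```

    The damping parameter interpolates between gradient descent (large `lam`) and Gauss-Newton (small `lam`), which is what makes the iteration robust to a poor initial camera guess.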

  7. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  8. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability "worth" and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  9. CMOS Cell Sensors for Point-of-Care Diagnostics

    Science.gov (United States)

    Adiguzel, Yekbun; Kulah, Haluk

    2012-01-01

    The burden of health-care related services in a global era with continuously increasing population and inefficient dissipation of the resources requires effective solutions. From this perspective, point-of-care diagnostics is a demanded field in clinics. It is also necessary both for prompt diagnosis and for providing health services evenly throughout the population, including the rural districts. The requirements can only be fulfilled by technologies whose productivity has already been proven, such as complementary metal-oxide-semiconductors (CMOS). CMOS-based products can enable clinical tests in a fast, simple, safe, and reliable manner, with improved sensitivities. Portability due to diminished sensor dimensions and compactness of the test set-ups, along with low sample and power consumption, is another vital feature. CMOS-based sensors for cell studies have the potential to become essential counterparts of point-of-care diagnostics technologies. Hence, this review attempts to inform on the sensors fabricated with CMOS technology for point-of-care diagnostic studies, with a focus on CMOS image sensors and capacitance sensors for cell studies. PMID:23112587

  10. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at Slovenské elektrárne, a.s. (SE) nuclear power plants as one of the components of the Excellent Performance initiative in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to three major areas of improvement: the need to improve results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program includes: - tools to prevent human error; - managerial observation and coaching; - human factor analysis; - quick information about events involving a human agent; - human reliability timelines and performance indicators; - basic, periodic and extraordinary training in human factor reliability. (authors)

  11. Modeling Optimal Scheduling for Pumping System to Minimize Operation Cost and Enhance Operation Reliability

    Directory of Open Access Journals (Sweden)

    Yin Luo

    2012-01-01

    Full Text Available Traditional pump scheduling models neglect operation reliability, which directly relates to the unscheduled maintenance cost and the wear cost during operation. For this reason, based on the assumption that vibration directly relates to operation reliability and to the degree of wear, operation reliability can be expressed as the normalized vibration level. The characteristic of the vibration with respect to the operating point was studied, and it could be concluded that an idealized flow-versus-vibration plot should have a distinct bathtub shape. There is a narrow sweet spot (80 to 100 percent of BEP) in this shape where low vibration levels are obtained, and the vibration also follows a similar law with the square of the rotation speed in the absence of resonance phenomena. The operation reliability could then be modeled as a function of the capacity and rotation speed of the pump, and this function was added to the traditional model to form the new one. In contrast with the traditional method, the results showed that the new model could correct the result produced by the traditional one and make the pump operate at low vibration, so that operation reliability increases and maintenance cost decreases.

  12. Cumulative trauma disorders in the upper extremities: reliability of the postural and repetitive risk-factors index.

    Science.gov (United States)

    James, C P; Harburn, K L; Kramer, J F

    1997-08-01

    This study addresses test-retest reliability of the Postural and Repetitive Risk-Factors Index (PRRI) for work-related upper body injuries. This assessment was developed by the present authors. A repeated measures design was used to assess the test-retest reliability of a videotaped work-site assessment of subjects' movements. Ten heavy users of video display terminals (VDTs) from a local banking industry participated in the study. The 10 subjects' movements were videotaped for 2 hours on each of 2 separate days, while working on-site at their VDTs. The videotaped assessment, which utilized known postural risk factors for developing musculoskeletal disorder, pain, and discomfort in heavy VDT users (ie, repetitiveness, awkward and static postures, and contraction time), was called the PRRI. The videotaped movement assessments were subsequently analyzed in 15-minute sessions (five sessions per 2-hour videotape, which produced a total of 10 sessions over the 2 testing days), and each session was chosen randomly from the videotape. The subjects' movements were given a postural risk score according to the criteria in the PRRI. Each subject was therefore tested a total of 10 times (ie, 10 sessions), over two days. The maximum PRRI score for both sides of the body was 216 points. Reliability coefficients (RCs) for the PRRI scores were calculated, and the reliability of any one session met the minimum criterion for excellent reliability, which was .75. A two-way analysis of variance (ANOVA) confirmed that there was no statistically significant difference between sessions (p < .05). Calculations using the standard error of measurement (SEM) indicated that an individual tested once, on one day and with a PRRI score of 25, required a change of at least 8 points in order to be confident that a true change in score had occurred. The significant results from the reliability tests indicated that the PRRI was a reliable measurement tool that could be used by occupational health

  13. Toward High Altitude Airship Ground-Based Boresight Calibration of Hyperspectral Pushbroom Imaging Sensors

    Directory of Open Access Journals (Sweden)

    Aiwu Zhang

    2015-12-01

    Full Text Available The complexity of single-linear hyperspectral pushbroom imaging based on a high altitude airship (HAA) without a three-axis stabilized platform is much greater than that based on spaceborne and airborne platforms. Due to the effects of air pressure, temperature and airflow, large pitch and roll angles tend to appear frequently, which creates pushbroom images characterized by severe geometric distortions. Thus, the in-flight calibration procedure is not appropriate for single-linear pushbroom sensors on an HAA with no three-axis stabilized platform. In order to address this problem, a new ground-based boresight calibration method is proposed. Firstly, a coordinate transformation model is developed for direct georeferencing (DG) of the linear imaging sensor, and then the linear error equation is derived from it by using the Taylor expansion formula. Secondly, the boresight misalignments are worked out by using an iterative least squares method with few ground control points (GCPs) and ground-based side-scanning experiments. The proposed method is demonstrated by three sets of experiments: (i) the stability and reliability of the method is verified through simulation-based experiments; (ii) the boresight calibration is performed using ground-based experiments; and (iii) validation is done by application to the orthorectification of real hyperspectral pushbroom images from a HAA Earth observation payload system developed by our research team—"LanTianHao". The test results show that the proposed boresight calibration approach significantly improves the quality of georeferencing by reducing the geometric distortions caused by boresight misalignments to the minimum level.

  14. Reliability analysis of self-actuated shutdown system

    International Nuclear Information System (INIS)

    Itooka, S.; Kumasaka, K.; Okabe, A.; Satoh, K.; Tsukui, Y.

    1991-01-01

    An analytical study was performed for the reliability of a self-actuated shutdown system (SASS) under the unprotected loss of flow (ULOF) event in a typical loop-type liquid metal fast breeder reactor (LMFBR) by the use of the response surface Monte Carlo analysis method. Dominant parameters for the SASS, such as Curie point characteristics, subassembly outlet coolant temperature, electromagnetic surface condition, etc., were selected and their probability density functions (PDFs) were determined from the design study information and experimental data. To get the response surface function (RSF) for the maximum coolant temperature, transient analyses of ULOF were performed, utilizing the experimental design method in the determination of analytical cases. Then, the RSF was derived by multi-variable regression analysis. The unreliability of the SASS was evaluated as the probability that the maximum coolant temperature exceeded an acceptable level, employing a Monte Carlo calculation using the above PDFs and RSF. In this study, sensitivities to the dominant parameters were compared. The dispersion of subassembly outlet coolant temperature near the SASS was found to be one of the most sensitive parameters. Fault tree analysis was performed using this value for the SASS in order to evaluate the shutdown system reliability. As a result of this study, the effectiveness of the SASS on the reliability improvement in the LMFBR shutdown system was analytically confirmed. This study has been performed as a part of joint research and development projects for DFBR under the sponsorship of the nine Japanese electric power companies, Electric Power Development Company and the Japan Atomic Power Company. (author)
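    The response-surface Monte Carlo step described above can be sketched generically: postulate (or fit) a response surface for the peak coolant temperature, sample the parameter PDFs, and count exceedances of the acceptable level. The quadratic surface and every coefficient below are invented placeholders, not the study's RSF.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Invented quadratic response surface: peak coolant temperature as a
    # function of two normalized parameters (e.g. Curie point offset and
    # outlet coolant temperature dispersion).
    def rsf(x1, x2):
        return 650.0 + 40.0 * x1 + 25.0 * x2 + 10.0 * x1 * x2

    # Sample the parameter PDFs (standard normals here) and estimate the
    # unreliability as the probability of exceeding an acceptable level.
    n = 200_000
    x1 = rng.standard_normal(n)
    x2 = rng.standard_normal(n)
    limit = 750.0
    unreliability = np.mean(rsf(x1, x2) > limit)
    ```

    Evaluating a cheap polynomial surrogate instead of re-running the transient code is what makes a 200,000-sample Monte Carlo estimate affordable, which is the point of the response-surface approach.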

  15. Populating a Control Point Database: A cooperative effort between the USGS, Grand Canyon Monitoring and Research Center and the Grand Canyon Youth Organization

    Science.gov (United States)

    Brown, K. M.; Fritzinger, C.; Wharton, E.

    2004-12-01

    The Grand Canyon Monitoring and Research Center measures the effects of Glen Canyon Dam operations on the resources along the Colorado River from Glen Canyon Dam to Lake Mead in support of the Grand Canyon Adaptive Management Program. Control points are integral to geo-referencing the myriad data collected in the Grand Canyon, including aerial photography and the topographic and bathymetric data used for classification and change-detection analysis of physical, biologic and cultural resources. The survey department has compiled a list of 870 control points installed by various organizations needing to establish a consistent reference for data collected at field sites along the 240-mile stretch of the Colorado River in the Grand Canyon. This list is the foundation for the Control Point Database, established primarily for researchers to locate control points and independently geo-reference collected field data. The database has the potential to be a valuable mapping tool for helping researchers easily locate a control point and reduce the occurrence of unknowingly installing new control points within close proximity of an existing control point. The database is missing photographs and accurate site description information. Current site descriptions do not accurately define the location of the point but refer to the project that used the point, or some other interesting fact associated with the point. The Grand Canyon Monitoring and Research Center (GCMRC) resolved this problem by turning the data collection effort into an educational exercise for the participants of the Grand Canyon Youth organization. Grand Canyon Youth is a non-profit organization providing experiential education for middle and high school aged youth. GCMRC and the Grand Canyon Youth formed a partnership where GCMRC provided the logistical support, equipment, and training to conduct the field work, and the Grand Canyon Youth provided the time and personnel to complete the field work. Two data

  16. Human reliability guidance - How to increase the synergies between human reliability, human factors, and system design and engineering. Phase 1: The Nordic Point of View - A user needs analysis

    International Nuclear Information System (INIS)

    Oxstrand, J.; Boring, R.L.

    2010-12-01

    The main goal of this Nordic Nuclear Safety Research (NKS) council project is to produce guidance for how to use human reliability analysis (HRA) to strengthen overall safety. This project is intended to work across (and hopefully diminish) the borders that exist between human reliability analysis (HRA) and human-system interaction, human performance, human factors, and probabilistic risk assessment at Nordic nuclear power plants. This project consists of two major phases, where the initial phase (phase 1) is a study of current practices in the Nordic region, which is presented in this report. Even though the project covers the synergies between HRA and all other relevant fields, the main focus for this phase is to bridge HRA and design. Interviews with 26 Swedish and Finnish plant experts are summarized in the present report, and 10 principles to improve the utilization of HRA at plants are presented. A second study, which is not documented in this preliminary report, will chronicle insights into how the US nuclear industry works with HRA. To gain this knowledge the authors will conduct interviews with the US regulator, research laboratories, and utilities. (Author)

  17. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  18. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants. The GO methodology is one of these methods. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. Through this analysis, the authors found some disadvantages of the GO methodology. In the GO methodology, a signal is either on-to-off or off-to-on; therefore GO finds the time point at which the state of a system changes, and cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. In order to overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point, and the definitions of operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given.

  19. Scoring haemophilic arthropathy on X-rays: improving inter- and intra-observer reliability and agreement using a consensus atlas

    Energy Technology Data Exchange (ETDEWEB)

    Foppen, Wouter; Schaaf, Irene C. van der; Beek, Frederik J.A. [University Medical Center Utrecht, Department of Radiology (Netherlands); Verkooijen, Helena M. [University Medical Center Utrecht, Department of Radiology (Netherlands); University Medical Center Utrecht, Julius Center for Health Sciences and Primary Care, Utrecht (Netherlands); Fischer, Kathelijn [University Medical Center Utrecht, Julius Center for Health Sciences and Primary Care, Utrecht (Netherlands); University Medical Center Utrecht, Van Creveldkliniek, Department of Hematology, Utrecht (Netherlands)

    2016-06-15

    The radiological Pettersson score (PS) is widely applied for classification of arthropathy to evaluate costly haemophilia treatment. This study aims to assess and improve inter- and intra-observer reliability and agreement of the PS. Two series of X-rays (bilateral elbows, knees, and ankles) of 10 haemophilia patients (120 joints) with haemophilic arthropathy were scored by three observers according to the PS (maximum score 13/joint). Subsequently, (dis-)agreement in scoring was discussed until consensus. Example images were collected in an atlas. Thereafter, second series of 120 joints were scored using the atlas. One observer rescored the second series after three months. Reliability was assessed by intraclass correlation coefficients (ICC), agreement by limits of agreement (LoA). Median Pettersson score at joint level (PS_joint) of affected joints was 6 (interquartile range 3-9). Using the consensus atlas, inter-observer reliability of the PS_joint improved significantly from 0.94 (95 % confidence interval (CI) 0.91-0.96) to 0.97 (CI 0.96-0.98). LoA improved from ±1.7 to ±1.1 for the PS_joint. Therefore, true differences in arthropathy were differences in the PS_joint of >2 points. Intra-observer reliability of the PS_joint was 0.98 (CI 0.97-0.98), intra-observer LoA were ±0.9 points. Reliability and agreement of the PS improved by using a consensus atlas. (orig.)
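    The agreement statistics in such studies follow the standard Bland-Altman recipe. The sketch below computes 95% limits of agreement and the SEM-based smallest detectable change; the paired scores are made up for illustration, not the study's data.

    ```python
    import numpy as np

    def limits_of_agreement(a, b):
        """Bland-Altman 95% limits of agreement between two raters' scores."""
        d = np.asarray(a, float) - np.asarray(b, float)
        m, s = d.mean(), d.std(ddof=1)
        return m - 1.96 * s, m + 1.96 * s

    def smallest_detectable_change(sem):
        """Score change an individual must show to exceed measurement error."""
        return 1.96 * np.sqrt(2.0) * sem

    # Made-up paired Pettersson-style joint scores from two observers
    obs1 = [6, 3, 9, 7, 4, 8]
    obs2 = [6, 4, 8, 7, 5, 8]
    low, high = limits_of_agreement(obs1, obs2)
    ```

    The "change of at least 8 points" statement in the abstract follows the same logic: a retest difference smaller than the SEM-derived threshold cannot be distinguished from measurement noise.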

  20. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability-based code calibration. First, basic principles of structural reliability theory are introduced and it is shown how the results of FORM-based reliability analysis may be related to partial safety factors and characteristic values. Thereafter, the code calibration problem is presented in its principal decision-theoretical form and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally, the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure - CodeCal - for the practical implementation of reliability-based code calibration of LRFD-based design codes.

  1. Reliability of dose volume constraint inference from clinical data

    DEFF Research Database (Denmark)

    Lutz, C M; Møller, D S; Hoffmann, L

    2017-01-01

    Dose volume histogram points (DVHPs) frequently serve as dose constraints in radiotherapy treatment planning. An experiment was designed to investigate the reliability of DVHP inference from clinical data for multiple cohort sizes and complication incidence rates. The experimental background was radiation pneumonitis in non-small cell lung cancer and the DVHP inference method was based on logistic regression. From 102 NSCLC real-life dose distributions and a postulated DVHP model, an 'ideal' cohort was generated where the most predictive model was equal to the postulated model. A bootstrap…
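    A hedged sketch of the kind of experiment the abstract describes: fit a logistic dose-response to a single DVH point and bootstrap the cohort to see how stable the inferred slope is. The data are simulated and the gradient-ascent fit is a generic substitute, not the paper's code.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def fit_logistic(x, y, lr=1.0, iters=5000):
        """Fit p(complication) = sigmoid(b0 + b1*x) by gradient ascent on
        the log-likelihood."""
        b0 = b1 = 0.0
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
            b0 += lr * np.mean(y - p)
            b1 += lr * np.mean((y - p) * x)
        return b0, b1

    # Simulated cohort: one normalized DVH point per patient and a binary
    # complication outcome drawn from a postulated logistic model.
    n = 400
    dvhp = rng.uniform(0.0, 1.0, n)
    p_true = 1.0 / (1.0 + np.exp(-(-3.0 + 4.0 * dvhp)))
    tox = (rng.uniform(size=n) < p_true).astype(float)

    b0, b1 = fit_logistic(dvhp, tox)

    # Bootstrap resampling of the cohort shows the spread of the inferred
    # dose-response slope for this cohort size and incidence rate.
    slopes = []
    for _ in range(30):
        idx = rng.integers(0, n, n)
        slopes.append(fit_logistic(dvhp[idx], tox[idx])[1])
    ```

    Repeating this for several cohort sizes and incidence rates is the essence of the reliability experiment: the wider the bootstrap spread, the less a clinically inferred DVHP constraint should be trusted.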

  2. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  3. Use of DEMs Derived from TLS and HRSI Data for Landslide Feature Recognition

    Directory of Open Access Journals (Sweden)

    Maurizio Barbarella

    2018-04-01

    Full Text Available This paper addresses the problems arising from the use of data acquired with two different remote sensing techniques—high-resolution satellite imagery (HRSI) and terrestrial laser scanning (TLS)—for the extraction of digital elevation models (DEMs) used in the geomorphological analysis and recognition of landslides, taking into account the uncertainties associated with DEM production. In order to obtain a georeferenced and edited point cloud, the two data sets require quite different processes, which are more complex for satellite images than for TLS data. The differences between the two processes are highlighted. The point clouds are interpolated on a DEM with a 1 m grid size using kriging. Starting from these DEMs, a number of contour, slope, and aspect maps are extracted, together with their associated uncertainty maps. Comparative analysis of selected landslide features drawn from the two data sources allows recognition and classification of hierarchical and multiscale landslide components. Taking into account the uncertainty related to the maps enables areas to be located for which one data source was able to give more reliable results than the other. Our case study is located in Southern Italy, in an area known for active landslides.

  4. Reliable Multicast MAC Protocol for IEEE 802.11 Wireless LANs with Extended Service Range

    Science.gov (United States)

    Choi, Woo-Yong

    2011-11-01

    In this paper, we propose an efficient reliable multicast MAC protocol by which the AP (Access Point) can reliably transmit its multicast data frames to the recipients in the AP's one-hop or two-hop transmission range. The AP uses the STAs (Stations) that are directly associated with itself as relays for data delivery to the remote recipients that cannot be reached directly. Based on the connectivity information among the recipients, the reliable multicast MAC protocol optimizes the number of RAK (Request for ACK) frame transmissions in a reasonable computational time. Numerical examples show that our proposed MAC protocol significantly enhances the MAC performance compared with the BMMM (Batch Mode Multicast MAC) protocol extended to support recipients in the AP's one-hop or two-hop transmission range in IEEE 802.11 wireless LANs.

  5. On the change points of mean residual life and failure rate functions for some generalized gamma type distributions

    Directory of Open Access Journals (Sweden)

    Parsa M.

    2014-01-01

    Full Text Available Mean residual life and failure rate functions are ubiquitously employed in reliability analysis. The useful period of a lifetime distribution with a bathtub-shaped failure rate function refers to the flat region of this function, and has attracted authors and researchers in reliability, actuarial science, and survival analysis. In recent years, the change points of the mean residual life and failure rate functions have been extensively utilized in determining the optimum burn-in time. In this paper we investigate the difference between the change points of the failure rate and mean residual life functions of some generalized gamma type distributions, due to the capability of these distributions in modeling various bathtub-shaped failure rate functions.

  6. Design of Wireless Point of Sale Based on ZigBee Technology

    Directory of Open Access Journals (Sweden)

    Xiaoning Jiang

    2014-02-01

    Full Text Available With the rapid development of Point of Sale technology and modern communication technology, financial Point of Sale terminal systems have begun to move from wired to wireless. Wireless payment technology can be used where cable networks are unreliable or unavailable. As one of the most important technologies of the information era, the Wireless Sensor Network has been widely used in banking and various other modern business fields. This paper describes a simple portable Point of Sale terminal based on a ZigBee wireless network, which is a low-power, low-cost, flexible, safe and reliable network. This Point of Sale system can be applied to gas stations, liquefied petroleum gas stations and other complex sales environments, and it improves the safety of gas stations and their personnel. Simple and user-friendly, this design and optimization method greatly improves efficiency and thus has much value for practical application.

  7. Reliability evaluation of nuclear power plants by fault tree analysis

    International Nuclear Information System (INIS)

    Iwao, H.; Otsuka, T.; Fujita, I.

    1993-01-01

    As a work sponsored by the Ministry of International Trade and Industry, the Safety Information Research Center of NUPEC, using reliability data based on the operational experience of the domestic LWR plants, has implemented FTA for the standard PWRs and BWRs in Japan, with reactor scram due to system failures as the top event. Up to this point, we have obtained the FT chart and minimal cut sets for each type of system failure for qualitative evaluation, and we have estimated system unavailability, Fussell-Vesely importance and risk worth for components for quantitative evaluation. As the second stage of a series in our reliability evaluation work, another program was started to establish a support system. The aim of this system is to assist foreign and domestic plants in creating countermeasures when incidents occur, by providing them with the necessary information using the above analytical method and its results. (author)
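A minimal cut set of a fault tree is a smallest set of basic events whose joint occurrence triggers the top event. As a rough illustration of the qualitative evaluation mentioned above, a MOCUS-style expansion over an AND/OR tree can be sketched as follows; the tuple tree encoding is an assumption for illustration, not NUPEC's actual tooling:

```python
from itertools import product

def minimal_cut_sets(tree):
    """tree is either a basic-event name (str) or ('AND'|'OR', [children]).
    Returns the minimal cut sets as a set of frozensets."""
    def expand(node):
        if isinstance(node, str):
            return [frozenset([node])]
        op, kids = node
        kid_sets = [expand(k) for k in kids]
        if op == 'OR':
            # OR: any child's cut set triggers the gate
            return [cs for ks in kid_sets for cs in ks]
        # AND: one cut set from every child must occur together
        return [frozenset().union(*combo) for combo in product(*kid_sets)]

    candidates = expand(tree)
    # discard any candidate that strictly contains another (non-minimal)
    return {s for s in candidates if not any(t < s for t in candidates)}
```

For example, a top event that occurs on `A`, or on `B` and `C` together, reduces to the minimal cut sets {A} and {B, C}; the redundant {A, B} candidate is absorbed by {A}.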

  8. Ifcwall Reconstruction from Unstructured Point Clouds

    Science.gov (United States)

    Bassier, M.; Klein, R.; Van Genechten, B.; Vergauwen, M.

    2018-05-01

    The automated reconstruction of Building Information Modeling (BIM) objects from point cloud data is still ongoing research. A key aspect is the creation of accurate wall geometry, as it forms the basis for further reconstruction of objects in a BIM. After segmenting and classifying the initial point cloud, the labelled segments are processed and the wall topology is reconstructed. However, the procedure is challenging due to noise, occlusions and the complexity of the input data. In this work, a method is presented to automatically reconstruct consistent wall geometry from point clouds. More specifically, the use of room information is proposed to aid the wall topology creation. First, a set of partial walls is constructed based on classified planar primitives. Next, the rooms are identified using the retrieved wall information along with the floors and ceilings. The wall topology is computed by the intersection of the partial walls conditioned on the room information. The final wall geometry is defined by creating IfcWallStandardCase objects conforming to the IFC4 standard. The result is a set of walls according to the as-built conditions of a building. The experiments prove that the method used is a reliable framework for wall reconstruction from unstructured point cloud data. Also, the use of room information reduces the rate of false positives for the wall topology. Given the walls, ceilings and floors, 94% of the rooms are correctly identified. A key advantage of the proposed method is that it deals with complex rooms and is not bound to single storeys.

  9. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
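As one concrete flavour of the methods surveyed — estimating a leak or failure rate from observed events — a conjugate Gamma prior on a Poisson rate yields a closed-form posterior. This is a textbook sketch, not taken from the proceedings, and the prior parameters are illustrative:

```python
def gamma_poisson_posterior(alpha, beta, failures, exposure_hours):
    """Conjugate Bayesian update for a Poisson event rate.
    Prior: lambda ~ Gamma(alpha, beta) (beta in hours).
    Posterior: Gamma(alpha + failures, beta + exposure_hours)."""
    a_post = alpha + failures
    b_post = beta + exposure_hours
    posterior_mean = a_post / b_post   # events per hour
    return a_post, b_post, posterior_mean
```

With a weak Gamma(1, 1000) prior and 2 events observed over 10,000 hours, the posterior mean rate is 3/11,000 per hour, pulled only slightly away from the raw 2/10,000 estimate by the prior.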

  10. Twitter location (sometimes) matters: Exploring the relationship between georeferenced tweet content and nearby feature classes

    Directory of Open Access Journals (Sweden)

    Stefan Hahmann

    2014-12-01

    Full Text Available In this paper, we investigate whether microblogging texts (tweets) produced on mobile devices are related to the geographical locations where they were posted. For this purpose, we correlate tweet topics to areas. In doing so, classified points of interest from OpenStreetMap serve as validation points. We adopted the classification and geolocation of these points to correlate with tweet content by means of manual, supervised, and unsupervised machine learning approaches. Evaluation showed the manual classification approach to be of the highest quality, followed by the supervised method; the unsupervised classification was of low quality. We found that the degree to which tweet content is related to nearby points of interest depends upon the topic (that is, upon the OpenStreetMap category). A more general synthesis with prior research leads to the conclusion that the strength of the relationship between tweets and their geographic origin also depends upon geographic scale (smaller-scale correlations are more significant than larger-scale ones).

  11. Methods for registration laser scanner point clouds in forest stands

    International Nuclear Information System (INIS)

    Bienert, A.; Pech, K.; Maas, H.-G.

    2011-01-01

    Laser scanning is a fast and efficient 3-D measurement technique to capture surface points describing the geometry of a complex object in an accurate and reliable way. Besides airborne laser scanning, terrestrial laser scanning finds growing interest for forestry applications. These two recording platforms show large differences in resolution, recording area and scan viewing direction. Using both datasets for a combined point cloud analysis may yield advantages because of their largely complementary information. In this paper, methods are presented to automatically register airborne and terrestrial laser scanner point clouds of a forest stand. In a first step, tree detection is performed in both datasets in an automatic manner. In a second step, corresponding tree positions are determined using RANSAC. Finally, the geometric transformation is performed, divided into a coarse and a fine registration. After the coarse registration, the fine registration is done in an iterative manner (ICP) using the point clouds themselves. The methods are tested and validated with a dataset of a forest stand. The presented registration results provide accuracies which fulfill the forestry requirements.
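Once corresponding tree positions have been matched, the coarse rigid transformation can be estimated in closed form. The following is a generic 2-D Kabsch/Procrustes sketch of that step — a standard technique, not necessarily the exact formulation used in the paper:

```python
import numpy as np

def rigid_transform_2d(src, dst):
    """Least-squares rotation R and translation t with dst ≈ R @ src + t,
    e.g. aligning matched tree positions from two scans (Kabsch method)."""
    src_c = src - src.mean(axis=0)          # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

In a RANSAC loop, this estimator would be run on random minimal subsets of candidate tree correspondences, keeping the transform with the largest inlier count before ICP refines it on the raw clouds.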

  12. Student Practice Evaluation Form-Revised Edition online comment bank: development and reliability analysis.

    Science.gov (United States)

    Rodger, Sylvia; Turpin, Merrill; Copley, Jodie; Coleman, Allison; Chien, Chi-Wen; Caine, Anne-Maree; Brown, Ted

    2014-08-01

    The reliable evaluation of occupational therapy students completing practice education placements along with provision of appropriate feedback is critical for both students and for universities from a quality assurance perspective. This study describes the development of a comment bank for use with an online version of the Student Practice Evaluation Form-Revised Edition (SPEF-R Online) and investigates its reliability. A preliminary bank of 109 individual comments (based on previous students' placement performance) was developed via five stages. These comments reflected all 11 SPEF-R domains. A purpose-designed online survey was used to examine the reliability of the comment bank. A total of 37 practice educators returned surveys, 31 of which were fully completed. Participants were asked to rate each individual comment using the five-point SPEF-R rating scale. One hundred and two of 109 comments demonstrated satisfactory agreement with their respective default ratings that were determined by the development team. At each domain level, the intra-class correlation coefficients (ranging between 0.86 and 0.96) also demonstrated good to excellent inter-rater reliability. There were only seven items that required rewording prior to inclusion in the final SPEF-R Online comment bank. The development of the SPEF-R Online comment bank offers a source of reliable comments (consistent with the SPEF-R rating scale across different domains) and aims to assist practice educators in providing reliable and timely feedback to students in a user-friendly manner. © 2014 Occupational Therapy Australia.

  13. Incorporation of Spatial Interactions in Location Networks to Identify Critical Geo-Referenced Routes for Assessing Disease Control Measures on a Large-Scale Campus

    Directory of Open Access Journals (Sweden)

    Tzai-Hung Wen

    2015-04-01

    Full Text Available Respiratory diseases mainly spread through interpersonal contact. Class suspension is the most direct strategy to prevent the spread of disease through elementary or secondary schools by blocking the contact network. However, as university students usually attend courses in different buildings, the daily contact patterns on a university campus are complicated, and once disease clusters have occurred, suspending classes is far from an efficient strategy to control disease spread. The purpose of this study is to propose a methodological framework for generating campus location networks from a routine administration database, analyzing the community structure of the network, and identifying the critical links and nodes for blocking respiratory disease transmission. The data comes from the student enrollment records of a major comprehensive university in Taiwan. We combined the social network analysis and spatial interaction model to establish a geo-referenced community structure among the classroom buildings. We also identified the critical links among the communities that were acting as contact bridges and explored the changes in the location network after the sequential removal of the high-risk buildings. Instead of conducting a questionnaire survey, the study established a standard procedure for constructing a location network on a large-scale campus from a routine curriculum database. We also present how a location network structure at a campus could function to target the high-risk buildings as the bridges connecting communities for blocking disease transmission.

  14. Implementation of the project of equipment reliability in the nuclear power plant of Laguna Verde

    International Nuclear Information System (INIS)

    Rios O, J. E.; Martinez L, A. G.

    2008-01-01

    Equipment is reliable if it fulfills the function for which it was designed whenever it is required. Implementing a reliability project in a nuclear power plant is associated with a process of continuous analysis of the operation, condition and faults of the equipment. Analyzing the operation of a system, the faults of its equipment and the parts that make up that equipment leads to identifying the potential causes of faults. Predictive analysis of components and equipment makes it possible to take corrective action and establish guidelines to optimize maintenance and to guarantee the reliability and function of the equipment. Equipment reliability is without doubt a broad project, embracing everything from the smallest component of the equipment, through the testing of spare parts and the operating conditions, to the operational analysis techniques. Without doubt, for a nuclear power plant, decision-making based on the reliability of its systems and equipment is the appropriate way to assure their operation and reliability. This work presents the reliability project, its processes, criteria, indicators, improvement actions and the interaction of the different disciplines of the Nuclear Power Plant of Laguna Verde as a fundamental point for putting it into operation. (Author)

  15. A hybrid reliability algorithm using PSO-optimized Kriging model and adaptive importance sampling

    Science.gov (United States)

    Tong, Cao; Gong, Haili

    2018-03-01

    This paper aims to reduce the computational cost of reliability analysis. A new hybrid algorithm is proposed based on a PSO-optimized Kriging model and an adaptive importance sampling method. Firstly, the particle swarm optimization (PSO) algorithm is used to optimize the parameters of the Kriging model. A typical function is fitted to validate the improvement by comparing the results of the PSO-optimized Kriging model with those of the original Kriging model. Secondly, a hybrid algorithm for reliability analysis combining the optimized Kriging model and adaptive importance sampling is proposed. Two cases from the literature are given to validate its efficiency and correctness. The proposed method is shown to be more efficient because it requires only a small number of sample points, according to the comparison results.
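For readers unfamiliar with PSO, the parameter search it performs can be sketched as below. This is a generic global-best PSO minimizing an arbitrary objective (for the paper's use, the objective would be a Kriging model-fit criterion); the inertia and acceleration constants are conventional defaults, not the paper's settings:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, seed=0,
                 w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal global-best particle swarm optimizer."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))     # positions
    v = np.zeros_like(x)                            # velocities
    pbest = x.copy()                                # personal bests
    pval = np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()                 # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # inertia + pull toward personal best + pull toward global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()
```

On a smooth low-dimensional objective like Kriging hyperparameter likelihood, a few hundred iterations of a small swarm are typically enough to locate the optimum without gradient information.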

  16. Corrections for criterion reliability in validity generalization: The consistency of Hermes, the utility of Midas

    Directory of Open Access Journals (Sweden)

    Jesús F. Salgado

    2016-04-01

    Full Text Available There is criticism in the literature about the use of interrater coefficients to correct for criterion reliability in validity generalization (VG) studies, disputing whether .52 is an accurate and non-dubious estimate of the interrater reliability of overall job performance (OJP) ratings. We present a second-order meta-analysis of three independent meta-analytic studies of the interrater reliability of job performance ratings and make a number of comments and reflections on LeBreton et al.'s paper. The results of our meta-analysis indicate that the interrater reliability for a single rater is .52 (k = 66, N = 18,582, SD = .105). Our main conclusions are: (a) the value of .52 is an accurate estimate of the interrater reliability of overall job performance for a single rater; (b) it is not reasonable to conclude that past VG studies that used .52 as the criterion reliability value have a less than secure statistical foundation; (c) based on interrater reliability, test-retest reliability, and coefficient alpha, supervisor ratings are a useful and appropriate measure of job performance and can be confidently used as a criterion; (d) validity correction for criterion unreliability has been unanimously recommended by "classical" psychometricians and I/O psychologists as the proper way to estimate predictor validity, and is still recommended at present; (e) the substantive contribution of VG procedures to inform HRM practices in organizations should not be lost in these technical points of debate.
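The correction for criterion unreliability discussed above is the classical attenuation formula: operational validity is the observed predictor-criterion correlation divided by the square root of the criterion reliability. A one-line sketch, using the .52 interrater estimate from the abstract as an example input:

```python
import math

def correct_for_criterion_unreliability(r_xy, r_yy):
    """Classical correction for attenuation in the criterion:
    rho = r_xy / sqrt(r_yy), where r_yy is the criterion reliability."""
    return r_xy / math.sqrt(r_yy)
```

An observed validity of .25 against ratings with interrater reliability .52 corrects to roughly .35, which is why the choice of reliability estimate matters so much in VG debates.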

  17. 78 FR 38851 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Science.gov (United States)

    2013-06-28

    ... either: Provide little protection for Bulk-Power System reliability or are redundant with other aspects... for retirement either: (1) Provide little protection for Bulk-Power System reliability or (2) are... to assure reliability of the Bulk-Power System and should be withdrawn. We have identified 41...

  18. Reliability Omnipotent Analysis For First Stage Separator On The Separation Process Of Gas, Oil And Water

    International Nuclear Information System (INIS)

    Sony Tjahyani, D. T.; Ismu W, Puradwi; Asmara Santa, Sigit

    2001-01-01

    The reliability of an industrial plant can be evaluated based on two aspects, risk and economics, from which an optimal value can be determined. The risks of the oil refinery process are fire and explosion, so an assessment of this system must be done. One system of the oil refinery process is the first stage separator, which is used to separate gas, oil and water. An evaluation of reliability for the first stage separator system has been done with the FAMECA and HAZOP methods. From the analysis, the probabilities of fire and explosion are 1.1×10⁻²³/hour and 1.2×10⁻¹¹/hour, respectively. The reliability value of the system is high because each undesired event is anticipated with a safety system or safety component.

  19. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime of the plant.

  20. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  1. Assessing the contribution of microgrids to the reliability of distribution networks

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Paulo Moises [Escola Superior Tecnologia Viseu, Instituto Politecnico Viseu, Campus Politecnico Repeses, 3504-510 Viseu (Portugal); Matos, Manuel A. [INESC Porto, Faculdade de Engenharia da Universidade do Porto, Porto (Portugal)

    2009-02-15

    The emergence of microgeneration has recently led to the concept of the microgrid, a network of LV consumers and producers able to export electric energy in some circumstances and also to work in an isolated way in emergency situations. Research on the organization of microgrids, control devices, functionalities and other technical aspects is presently being carried out, in order to establish a consistent technical framework to support the concept. The successful development of the microgrid concept implies the definition of a suitable regulation for its integration on distribution systems. In order to define such a regulation, the identification of costs and benefits that microgrids may bring is a crucial task. Actually, this is the basis for a discussion about the way global costs could be divided among the different agents that benefit from the development of microgrids. Among other aspects, the effect of microgrids on the reliability of the distribution network has been pointed out as an important advantage, due to the ability of isolated operation in emergency situations. This paper identifies the situations where the existence of a microgrid may reduce the interruption rate and duration and thus improve the reliability indices of the distribution network. The relevant expressions necessary to quantify the reliability are presented. An illustrative example is included, where the global influence of the microgrid on the reliability is commented. (author)
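The interruption-rate and duration effects mentioned above are commonly summarized by IEEE 1366-style indices such as SAIFI, SAIDI and CAIDI. A hedged sketch of how such indices are computed from outage records (the simple list-of-tuples data layout is an assumption for illustration, not the paper's formulation):

```python
def reliability_indices(outages, n_customers):
    """Distribution reliability indices from outage records.
    outages: list of (customers_affected, duration_minutes) tuples.
    SAIFI = customer interruptions per customer served;
    SAIDI = customer interruption minutes per customer served;
    CAIDI = average minutes per interruption (SAIDI / SAIFI)."""
    saifi = sum(c for c, _ in outages) / n_customers
    saidi = sum(c * d for c, d in outages) / n_customers
    caidi = saidi / saifi if saifi else 0.0
    return saifi, saidi, caidi
```

A microgrid that rides through an upstream fault in islanded mode removes that event's customers from the outage list entirely, lowering both SAIFI and SAIDI for the feeder.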

  2. Investment in new product reliability

    International Nuclear Information System (INIS)

    Murthy, D.N.P.; Rausand, M.; Virtanen, S.

    2009-01-01

    Product reliability is of great importance to both manufacturers and customers. Building reliability into a new product is costly, but the consequences of inadequate product reliability can be costlier. This implies that manufacturers need to decide on the optimal investment in new product reliability by achieving a suitable trade-off between the two costs. This paper develops a framework and proposes an approach to help manufacturers decide on the investment in new product reliability.

  3. The chronic toxicity of molybdate to marine organisms. I. Generating reliable effects data

    Energy Technology Data Exchange (ETDEWEB)

    Heijerick, D.G., E-mail: Dagobert.heijerick@arche-consulting.be [ARCHE - Assessing Risks of Chemicals, Stapelplein 70 Bus 104, Gent (Belgium); Regoli, L. [International Molybdenum Association, 4 Heathfield Terrace, London, W4 4JE (United Kingdom); Stubblefield, W. [Oregon State University, Department of Environmental and Molecular Toxicology, 421 Weniger Hall, Corvallis, OR 97331 (United States)

    2012-07-15

    A scientific research program was initiated by the International Molybdenum Association (IMOA) which addressed identified gaps in the environmental toxicity data for the molybdate ion (MoO₄²⁻). These gaps were previously identified during the preparation of EU REACH dossiers for different molybdenum compounds (European Union regulation on Registration, Evaluation, Authorization and Restriction of Chemical substances; EC, 2006). Evaluation of the open literature identified few reliable marine ecotoxicological data that could be used for deriving a Predicted No-Effect Concentration (PNEC) for the marine environment. Rather than calculating a marine PNEC using the assessment-factor methodology on a combined freshwater/marine dataset, IMOA decided to generate sufficient reliable marine chronic data to permit derivation of a PNEC by means of the more scientifically robust species sensitivity distribution (SSD) approach (also called the statistical extrapolation approach). Nine test species were chronically exposed to molybdate (added as sodium molybdate dihydrate, Na₂MoO₄·2H₂O) according to published standard testing guidelines that are acceptable for a broad range of regulatory purposes. The selected test organisms were representative of typical marine trophic levels: micro-algae/diatom (Phaeodactylum tricornutum, Dunaliella tertiolecta), macro-alga (Ceramium tenuicorne), mysids (Americamysis bahia), copepod (Acartia tonsa), fish (Cyprinodon variegatus), echinoderms (Dendraster excentricus, Strongylocentrotus purpuratus) and molluscs (Mytilus edulis, Crassostrea gigas). Available NOEC/EC₁₀ levels ranged between 4.4 mg Mo/L (blue mussel M. edulis) and 1174 mg Mo/L (oyster C. gigas). Using all reliable marine chronic effects data currently available, an HC₅,₅₀% (median hazardous concentration affecting 5% of the species) of 5.74 mg Mo/L was derived with the statistical extrapolation approach.
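Under a log-normal SSD, the HC₅ is simply the 5th percentile of the distribution fitted to the log-transformed NOEC/EC₁₀ values. A minimal sketch of that calculation, using a plain sample standard deviation; the authors' actual SSD fit (and its confidence bounds) may well differ:

```python
import math
import statistics

def hc5_lognormal(noec_values, z5=-1.6449):
    """HC5 under a log-normal species sensitivity distribution:
    the 5th percentile of the fitted log10-NOEC distribution
    (z5 is the 5% quantile of the standard normal)."""
    logs = [math.log10(v) for v in noec_values]
    mu = statistics.fmean(logs)
    sd = statistics.stdev(logs)
    return 10 ** (mu + z5 * sd)
```

Regulatory SSD software additionally reports the confidence interval around this percentile (the "50%" in HC₅,₅₀% refers to the median estimate), which this bare sketch omits.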

  4. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    Directory of Open Access Journals (Sweden)

    Hai An

    2016-08-01

    Full Text Available Aiming to resolve the problems of a variety of uncertainty variables that coexist in engineering structure reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. The convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new hybrid reliability index definition is presented based on the random–fuzzy–interval model. Furthermore, the calculation flowchart of the hybrid reliability index is presented, and it is solved using the modified limit-step length iterative algorithm, which ensures convergence. The validity of the convergent algorithm for the hybrid reliability model is verified through calculation examples from the literature. In the end, a numerical example demonstrates that the hybrid reliability index is applicable for the wear reliability assessment of mechanisms, where truncated random variables, fuzzy random variables, and interval variables coexist. The demonstration also shows the good convergence of the iterative algorithm proposed in this article.

  5. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  6. AN ADAPTIVE APPROACH FOR SEGMENTATION OF 3D LASER POINT CLOUD

    Directory of Open Access Journals (Sweden)

    Z. Lari

    2012-09-01

    Full Text Available Automatic processing and object extraction from 3D laser point cloud is one of the major research topics in the field of photogrammetry. Segmentation is an essential step in the processing of laser point cloud, and the quality of extracted objects from laser data is highly dependent on the validity of the segmentation results. This paper presents a new approach for reliable and efficient segmentation of planar patches from a 3D laser point cloud. In this method, the neighbourhood of each point is firstly established using an adaptive cylinder while considering the local point density and surface trend. This neighbourhood definition has a major effect on the computational accuracy of the segmentation attributes. In order to efficiently cluster planar surfaces and prevent introducing ambiguities, the coordinates of the origin's projection on each point's best fitted plane are used as the clustering attributes. Then, an octree space partitioning method is utilized to detect and extract peaks from the attribute space. Each detected peak represents a specific cluster of points which are located on a distinct planar surface in the object space. Experimental results show the potential and feasibility of applying this method for segmentation of both airborne and terrestrial laser data.
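The planar-surface attributes used for clustering rest on fitting a local plane to each point's neighbourhood. A standard PCA-based plane fit is shown here as a generic sketch rather than the paper's exact formulation (the paper's adaptive-cylinder neighbourhood selection is omitted):

```python
import numpy as np

def fit_plane(points):
    """Best-fit plane for an (n, 3) array of neighbourhood points via PCA:
    the normal is the eigenvector of the covariance matrix with the
    smallest eigenvalue. Returns (centroid, unit_normal)."""
    c = points.mean(axis=0)
    cov = np.cov((points - c).T)        # 3x3 covariance of the neighbourhood
    evals, evecs = np.linalg.eigh(cov)  # eigh sorts eigenvalues ascending
    n = evecs[:, 0]                     # direction of least variance
    return c, n / np.linalg.norm(n)
```

The ratio of the smallest eigenvalue to the others also gives a planarity score, useful for rejecting neighbourhoods (e.g. on vegetation) that are not well modelled by a plane.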

  7. 18 CFR 39.5 - Reliability Standards.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Reliability Standards... RELIABILITY STANDARDS § 39.5 Reliability Standards. (a) The Electric Reliability Organization shall file each Reliability Standard or modification to a Reliability Standard that it proposes to be made effective under...

  8. 76 FR 66057 - North American Electric Reliability Corporation; Order Approving Regional Reliability Standard

    Science.gov (United States)

    2011-10-25

    ... Reliability Standard that is necessitated by a physical difference in the Bulk-Power System.\\7\\ \\7\\ Order No... Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ] 31,242, order on reh'g... electric system event analyses and thereby improve system reliability by promoting improved system design...

  9. The obscure factor analysis on the vibration reliability of the internals of nuclear power plant reactor and anti-vibration measures

    International Nuclear Information System (INIS)

    Fu Geyan; Zhu Qirong

    1998-11-01

    It is pointed out that the main reason making nuclear power plants reactors leak is the vibration of internals of reactors. The factors which lead the vibration all have randomness and obscureness. The obscure reliability theory is introduced to the vibration system of internals of nuclear power reactor. Based on a quantity of designing and moving data, the obscure factors effecting the vibration reliability of the internals of nuclear power plant reactor are analyzed and the anti-vibration reliability criteria and the evaluating model are given. And the anti-vibration reliability measures are advanced from different quarters of the machine design and building, the thermohydraulics design, the control of reactivity, etc.. They may benefit the theory and practice for building and perfecting the vibration obscure reliability model of the reactor internals

  10. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problems of a variety of uncertainty variables that coexist in the engineering structure reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. The convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  11. Recent Developments in Maximum Power Point Tracking Technologies for Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Nevzat Onat

    2010-01-01

Full Text Available In photovoltaic (PV) system applications, it is very important to design the system so that the solar cells (SCs) operate under the best conditions and at the highest efficiency. The maximum power point (MPP) varies with the angle of sunlight on the panel surface and with cell temperature; hence, the operating point of the load does not always coincide with the MPP of the PV system. Therefore, in order to supply reliable energy to the load, PV systems are often designed with more modules than strictly required. The alternative solution to this problem is a switching power converter, called a maximum power point tracker (MPPT). In this study, the various aspects of these algorithms are analyzed in detail. Classifications, definitions, and the basic equations of the most widely used MPPT technologies are given, and a comparison is made in the conclusion.
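One of the most widely used MPPT techniques surveyed in such reviews is perturb and observe, which hill-climbs the operating voltage until the measured power stops increasing. A minimal sketch, assuming a `measure_power(v)` callback supplied by the hardware layer (the function name, step size, and iteration count are illustrative, not from the paper):

```python
def perturb_and_observe(measure_power, v, step=0.1, iters=50):
    """Hill-climb the PV operating voltage toward the maximum power point.

    Perturbs the voltage by a fixed step; if the measured power drops,
    the perturbation direction is reversed. The operating point ends up
    oscillating within about one step of the true MPP.
    """
    p_prev = measure_power(v)
    direction = 1.0
    for _ in range(iters):
        v += direction * step
        p = measure_power(v)
        if p < p_prev:          # power dropped: we stepped past the peak
            direction = -direction
        p_prev = p
    return v
```

In practice the step size trades tracking speed against steady-state oscillation around the MPP, which is why adaptive-step variants exist.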

  12. Exponential Reliability Coefficient based Reputation Mechanism for isolating selfish nodes in MANETs

    Directory of Open Access Journals (Sweden)

    J. Sengathir

    2015-07-01

Full Text Available In mobile ad hoc networks, cooperation among active mobile nodes plays a vital role in the reliable transmission of data. However, selfish mobile nodes present in an ad hoc environment refuse to forward neighbouring nodes' packets in order to conserve their own energy. This intentional selfish behaviour drastically reduces the degree of cooperation maintained between the mobile nodes. Hence, an effective mechanism is needed that takes both energy efficiency and reputation into account for mitigating selfish behaviour in MANETs. In this paper, we propose an Exponential Reliability Coefficient based Reputation Mechanism (ERCRM) which isolates selfish nodes from the routing path based on an Exponential Reliability Coefficient (ExRC). This reliability coefficient, computed from an exponential failure rate via the moving-average method, emphasizes the most recent past behaviour of a mobile node when quantifying its genuineness. The simulation results show that the proposed ERCRM approach outperforms the existing Packet Conservation Monitoring Algorithm (PCMA) and Split Half Reliability Coefficient based Mathematical Model (SHRCM) in terms of performance evaluation metrics such as packet delivery ratio, throughput, total overhead, and control overhead. Further, the ERCRM mechanism achieves a 28% success rate in isolating selfish nodes from the routing path, and it frames the exponential detection threshold at 0.4, where the maximum number of selfish nodes is identified compared with the existing models in the literature.
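The abstract does not reproduce the exact ExRC formula, but the idea of an exponentially weighted coefficient that emphasizes recent forwarding behaviour, combined with the reported 0.4 detection threshold, can be sketched as follows (the smoothing factor `alpha`, initial value, and function names are assumptions for illustration):

```python
THRESHOLD = 0.4  # detection threshold reported in the abstract

def update_reliability(coeff, forwarded, alpha=0.3):
    """Exponentially weighted update of a node's reliability coefficient.

    Recent observations dominate: a forwarded packet pulls the coefficient
    toward 1.0, a dropped packet pulls it toward 0.0.
    """
    return (1.0 - alpha) * coeff + alpha * (1.0 if forwarded else 0.0)

def is_selfish(history, alpha=0.3, start=1.0):
    """Replay a node's forwarding history and test against the threshold."""
    coeff = start
    for forwarded in history:
        coeff = update_reliability(coeff, forwarded, alpha)
    return coeff < THRESHOLD
```

Because the weight on old observations decays geometrically, a previously cooperative node that turns selfish is detected after only a handful of drops, which matches the stated goal of highlighting the most recent past behaviour.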

  13. Node-pair reliability of network systems with small distances between adjacent nodes

    International Nuclear Information System (INIS)

    Malinowski, Jacek

    2007-01-01

A new method for computing the node-pair reliability of network systems modeled by random graphs with nodes arranged in sequence is presented. It is based on a recursive algorithm using the 'sliding window' technique, the window being composed of several consecutive nodes. In a single step, the connectivity probabilities for all nodes included in the window are found. Subsequently, the window is moved one node forward. This process is repeated until, in the last step, the window reaches the terminal node. The connectivity probabilities found at that point are used to compute the node-pair reliability of the network system considered. The algorithm is designed especially for graphs with small distances between adjacent nodes, where the distance between two nodes is defined as the absolute value of the difference between the nodes' numbers. The maximal distance between any two adjacent nodes is denoted by Γ(G), where G symbolizes a random graph. If Γ(G)=2, the method can be applied to directed as well as undirected graphs whose nodes and edges are subject to failure. This is important in view of the fact that many algorithms computing network reliability are designed for graphs with failure-prone edges and reliable nodes. If Γ(G)=3, the method's applicability is limited to undirected graphs with reliable nodes. The main asset of the presented algorithm is its low computational complexity, O(n), where n denotes the number of nodes.
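The sliding-window idea can be illustrated on a deliberately simplified case: Γ(G)=2, perfectly reliable edges (between nodes whose numbers differ by at most 2), and failure-prone nodes. The window then only needs to carry the reachability status of the last two nodes, giving the O(n) complexity noted in the abstract. This is a sketch of the technique, not the paper's full algorithm:

```python
from collections import defaultdict

def two_step_path_reliability(p):
    """P(node 1 and node n connected) for a sequence of failure-prone nodes.

    p[i] is the up-probability of node i (0-indexed). Edges join nodes whose
    indices differ by at most 2 and are assumed perfectly reliable. The
    'window' is the pair (reach of node i-1, reach of node i), where reach
    means the node is up and connected to the source; each step slides the
    window one node forward, so the whole pass is O(n) with O(1) state.
    """
    # Initial window: a virtual node 0 (unreachable) and the source node.
    states = {(False, True): p[0], (False, False): 1.0 - p[0]}
    for pi in p[1:]:
        nxt = defaultdict(float)
        for (a, b), pr in states.items():
            if a or b:
                # New node is reachable iff it is up and either of the two
                # preceding nodes in the window is reachable.
                nxt[(b, True)] += pr * pi
                nxt[(b, False)] += pr * (1.0 - pi)
            else:
                nxt[(False, False)] += pr   # source already cut off
        states = dict(nxt)
    return sum(pr for (_, b), pr in states.items() if b)
```

For example, with three nodes of which only the middle one can fail, the skip edge 1-3 keeps the terminal pair connected regardless, and the function returns 1.0.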

  14. Component reliability analysis for development of component reliability DB of Korean standard NPPs

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.; Kim, S. H.

    2002-01-01

Reliability data that reflect the plant-specific characteristics of Korean NPPs are necessary for PSA and risk-informed applications. We have performed a project to develop a component reliability DB and to calculate component reliability measures such as failure rate and unavailability. We collected component operation data and failure/repair data of Korean standard NPPs, analyzed the failure data using a data analysis method developed to suit the domestic data situation, and then compared the reliability results with generic data for foreign NPPs.

  15. Using VIIRS Day/Night Band to Measure Electricity Supply Reliability: Preliminary Results from Maharashtra, India

    Directory of Open Access Journals (Sweden)

    Michael L. Mann

    2016-08-01

Full Text Available Unreliable electricity supplies are common in developing countries and impose large socio-economic costs, yet precise information on electricity reliability is typically unavailable. This paper presents preliminary results from a machine-learning approach that uses satellite imagery of nighttime lights to estimate electricity reliability for western India at a fine spatial scale. We use data from the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the Suomi National Polar-orbiting Partnership (SNPP) satellite together with newly available data from networked household voltage meters. Our results point to the possibilities of this approach as well as areas for refinement. With currently available training data, we find a limited ability to detect individual outages identified by household-level measurements of electricity voltage, likely because of the relatively small number of individual outages observed in our preliminary data. However, we find that the approach estimates electricity reliability rates for individual locations fairly well, with the predicted-versus-actual regression yielding an R2 > 0.5. We also find that, despite the after-midnight overpass time of the SNPP satellite, the derived reliability estimates are representative of daytime reliability.
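The R2 > 0.5 result refers to the coefficient of determination between predicted and meter-measured reliability rates. As a reminder of the metric (a generic sketch, not the authors' code; the function name is illustrative):

```python
def r_squared(actual, predicted):
    """Coefficient of determination for predicted vs. actual reliability rates.

    R^2 = 1 - SS_res / SS_tot: the fraction of variance in the actual
    rates explained by the predictions. 1.0 is a perfect fit; 0.0 means
    the model does no better than predicting the mean.
    """
    mean = sum(actual) / len(actual)
    ss_tot = sum((a - mean) ** 2 for a in actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    return 1.0 - ss_res / ss_tot
```

An R2 above 0.5 on held-out locations thus means the nighttime-lights model explains more than half of the spatial variance in measured reliability rates.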

  16. Georeferenced energy information system integrated of energetic matrix of Sao Paulo state from 2005 to 2035; Sistema de informacoes energeticas georreferenciadas integrado a matriz energetica do estado de Sao Paulo: 2005-2035

    Energy Technology Data Exchange (ETDEWEB)

    Alvares, Joao Malta [IX Consultoria e Representacoes Ltda, Itajuba, MG (Brazil); Universidade Federal de Itajuba (UNIFEI), MG (Brazil)

    2010-07-01

A georeferenced energy information system, or simply SIEG, is designed to be integrated with the energy matrix of Sao Paulo state from 2005 to 2035. Commissioned by the state's Department of Sanitation and Energy, this system is intended to collect and aggregate information and data on several themes, relating this content to spatialized geographic locations. The main focus of the system is the analysis of the energy sector as a whole, from generation to final consumption, through all phases such as transmission and distribution. The energy data would also be cross-referenced with various supporting themes, contributing to the development of numerous analyses and generating sound conclusions. Issues such as environment, socio-economics, infrastructure, interconnected sectors, geographical conditions, and other information could be entered, viewed, and linked in the system. SIEG is also a facilitator for planning and managing the energy sector, with models that forecast possible future situations. (author)

  17. Isocount scintillation scanner with preset statistical data reliability

    International Nuclear Information System (INIS)

    Ikebe, J.; Yamaguchi, H.; Nawa, O.A.

    1975-01-01

A scintillation detector scans an object such as a live body along horizontal straight scanning lines, stopping at each scanning point for the time interval T required to count a predetermined number N of pulses. The rate R_N = N/T is then calculated, and an output signal whose pulse count represents this rate (or a corresponding output signal) is used as the recording signal for forming the scintigram. In contrast to the usual scanner, the isocount scanner scans an object stepwise in order to gather data with statistically uniform reliability.
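Because every scan point dwells until the same preset count N is reached, the relative statistical error of the Poisson count, 1/sqrt(N), is identical at every point, which is what makes the reliability "statistically uniform". A minimal sketch of the per-point rate computation (function and variable names are illustrative):

```python
import math

def preset_count_rate(arrival_times, n_preset):
    """Count-rate estimate at one scan point using preset-count acquisition.

    arrival_times: pulse arrival times (seconds) since the detector stopped
    at this point, in increasing order. The dwell time T is simply however
    long it takes to accumulate n_preset pulses, so every scan point carries
    the same relative statistical error 1/sqrt(N).
    """
    if len(arrival_times) < n_preset:
        raise ValueError("not enough pulses recorded at this scan point")
    T = arrival_times[n_preset - 1]        # time of the N-th pulse
    rate = n_preset / T                    # R_N = N / T
    rel_error = 1.0 / math.sqrt(n_preset)  # Poisson relative uncertainty
    return rate, rel_error
```

The contrast with a conventional preset-time scanner is that there, low-activity points yield few counts and hence large relative error, while here the error is fixed by the choice of N and only the dwell time varies.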

  18. Georeferenced Population Datasets of Mexico (GEO-MEX): Urban Place GIS Coverage of Mexico

    Data.gov (United States)

    National Aeronautics and Space Administration — The Urban Place GIS Coverage of Mexico is a vector based point Geographic Information System (GIS) coverage of 696 urban places in Mexico. Each Urban Place is...

  19. Electronics reliability calculation and design

    CERN Document Server

    Dummer, Geoffrey W A; Hiller, N

    1966-01-01

    Electronics Reliability-Calculation and Design provides an introduction to the fundamental concepts of reliability. The increasing complexity of electronic equipment has made problems in designing and manufacturing a reliable product more and more difficult. Specific techniques have been developed that enable designers to integrate reliability into their products, and reliability has become a science in its own right. The book begins with a discussion of basic mathematical and statistical concepts, including arithmetic mean, frequency distribution, median and mode, scatter or dispersion of mea

  20. Mathematical reliability an expository perspective

    CERN Document Server

    Mazzuchi, Thomas; Singpurwalla, Nozer

    2004-01-01

    In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...