WorldWideScience

Sample records for previously validated technique

  1. 40 CFR 152.93 - Citation of a previously submitted valid study.

    Science.gov (United States)

    2010-07-01

    ... Data Submitters' Rights § 152.93 Citation of a previously submitted valid study. An applicant may demonstrate compliance for a data requirement by citing a valid study previously submitted to the Agency. The... the original data submitter, the applicant may cite the study only in accordance with paragraphs (b...

  2. Mean-Variance-Validation Technique for Sequential Kriging Metamodels

    International Nuclear Information System (INIS)

    Lee, Tae Hee; Kim, Ho Sung

    2010-01-01

    The rigorous validation of the accuracy of metamodels is an important topic in research on metamodel techniques. Although the leave-k-out cross-validation technique involves a considerably high computational cost, it cannot measure the fidelity of metamodels. Recently, the mean-0 validation technique has been proposed to quantitatively determine the accuracy of metamodels. However, the use of the mean-0 validation criterion may lead to premature termination of the sampling process even if the kriging model is inaccurate. In this study, we propose a new validation technique based on the mean and variance of the response evaluated when a sequential sampling method, such as maximum entropy sampling, is used. The proposed validation technique is more efficient and accurate than the leave-k-out cross-validation technique because, instead of performing numerical integration, the kriging model is integrated explicitly to evaluate the mean and variance of the response accurately. The error in the proposed validation technique resembles a root mean squared error, and it can therefore be used as a stopping criterion for the sequential sampling of metamodels.
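
    To make the stopping idea concrete, here is a minimal sketch with a scikit-learn Gaussian process standing in for the kriging metamodel; the test function, error threshold and sampling rule are illustrative assumptions, not the authors' implementation:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        def f(x):
            # Hypothetical true response used to generate samples
            return np.sin(3 * x).ravel()

        X = np.linspace(0, 2, 5).reshape(-1, 1)      # initial design points
        for step in range(20):
            gp = GaussianProcessRegressor().fit(X, f(X))
            x_grid = np.linspace(0, 2, 200).reshape(-1, 1)
            mean, std = gp.predict(x_grid, return_std=True)
            # RMSE-like error estimate from the predictive variance,
            # standing in for the paper's mean/variance criterion
            err = np.sqrt(np.mean(std ** 2))
            if err < 1e-2:                            # illustrative stop criterion
                break
            # Maximum-entropy-style choice: sample where variance is largest
            X = np.vstack([X, x_grid[np.argmax(std)]])

    The stopping quantity plays the role of the paper's root-mean-squared-error-like measure: sampling continues until the metamodel's own predictive uncertainty is small everywhere.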

  3. Validation of the actuator line/Navier Stokes technique using mexico measurements

    DEFF Research Database (Denmark)

    Shen, Wen Zhong; Zhu, Wei Jun; Sørensen, Jens Nørkær

    2010-01-01

    This paper concerns the contribution of DTU MEK to the international research collaboration project MexNext, within the framework of IEA Annex 29, to validate aerodynamic models and CFD codes using the existing measurements made in the previous EU-funded project MEXICO (Model Experiments in Controlled Conditions). The Actuator Line/Navier-Stokes (AL/NS) technique developed at DTU is validated against the detailed MEXICO measurements. The AL/NS computations were carried out without the DNW wind tunnel, at speeds of 10 m/s, 15 m/s and 24 m/s. Comparisons of blade loading between computations and measurements show...

  4. A Validation Study of the Impression Replica Technique.

    Science.gov (United States)

    Segerström, Sofia; Wiking-Lima de Faria, Johanna; Braian, Michael; Ameri, Arman; Ahlgren, Camilla

    2018-04-17

    To validate the well-known and often-used impression replica technique for measuring the fit between a preparation and a crown in vitro. The validation consisted of three steps. First, a measuring instrument was validated to elucidate its accuracy. Second, a specimen consisting of male and female counterparts was created and validated by the measuring instrument. Calculations were made of the exact values of three gaps between the male and female parts. Finally, impression replicas were produced of the specimen gaps and sectioned into four pieces. The replicas were then measured with the use of a light microscope. The values obtained from measuring the specimen were then compared with the values obtained from the impression replicas, and the technique was thereby validated. The impression replica technique overestimated all measured gaps. Depending on the location of the three measuring sites, the difference between the specimen and the impression replicas varied from 47 to 130 μm. The impression replica technique overestimates gaps within the range of 2% to 11%. The validation of the replica technique enables the method to be used as a reference when testing other methods for evaluating fit in dentistry. © 2018 by the American College of Prosthodontists.

  5. Validation of an online replanning technique for prostate adaptive radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Peng Cheng; Chen Guangpei; Ahunbay, Ergun; Wang Dian; Lawton, Colleen; Li, X Allen, E-mail: ali@mcw.edu [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, WI (United States)

    2011-06-21

    We have previously developed an online adaptive replanning technique to rapidly adapt the original plan according to daily CT. This paper reports the quality assurance (QA) developments in its clinical implementation for prostate cancer patients. A series of pre-clinical validation tests was carried out to verify the overall accuracy and consistency of the online replanning procedure. These tests included (a) phantom measurements of 22 individual patient adaptive plans to verify their accuracy and deliverability and (b) checks of the efficiency and applicability of the online replanning process. A four-step QA procedure was established to ensure the safe and accurate delivery of an adaptive plan, including (1) offline phantom measurement of the original plan, (2) online independent monitor unit (MU) calculation for a redundancy check, (3) online verification of plan-data transfer using in-house software and (4) offline validation of the actually delivered beam parameters. The pre-clinical validations demonstrate that the newly implemented online replanning technique is dosimetrically accurate and practically efficient. The four-step QA procedure is capable of identifying possible errors in the process of online adaptive radiotherapy and of ensuring the safe and accurate delivery of the adaptive plans. Based on the success of this work, the online replanning technique has been used in the clinic to correct for interfractional changes during prostate radiation therapy.

  6. Validation of an online replanning technique for prostate adaptive radiotherapy

    International Nuclear Information System (INIS)

    Peng Cheng; Chen Guangpei; Ahunbay, Ergun; Wang Dian; Lawton, Colleen; Li, X Allen

    2011-01-01

    We have previously developed an online adaptive replanning technique to rapidly adapt the original plan according to daily CT. This paper reports the quality assurance (QA) developments in its clinical implementation for prostate cancer patients. A series of pre-clinical validation tests was carried out to verify the overall accuracy and consistency of the online replanning procedure. These tests included (a) phantom measurements of 22 individual patient adaptive plans to verify their accuracy and deliverability and (b) checks of the efficiency and applicability of the online replanning process. A four-step QA procedure was established to ensure the safe and accurate delivery of an adaptive plan, including (1) offline phantom measurement of the original plan, (2) online independent monitor unit (MU) calculation for a redundancy check, (3) online verification of plan-data transfer using in-house software and (4) offline validation of the actually delivered beam parameters. The pre-clinical validations demonstrate that the newly implemented online replanning technique is dosimetrically accurate and practically efficient. The four-step QA procedure is capable of identifying possible errors in the process of online adaptive radiotherapy and of ensuring the safe and accurate delivery of the adaptive plans. Based on the success of this work, the online replanning technique has been used in the clinic to correct for interfractional changes during prostate radiation therapy.

  7. Validation of the Online version of the Previous Day Food Questionnaire for schoolchildren

    Directory of Open Access Journals (Sweden)

    Raquel ENGEL

    Objective: To evaluate the validity of the web-based version of the Previous Day Food Questionnaire Online for schoolchildren from the 2nd to 5th grades of elementary school. Methods: Participants were 312 schoolchildren aged 7 to 12 years at a public school in the city of Florianópolis, Santa Catarina, Brazil. Validity was assessed by sensitivity and specificity, as well as by agreement rates (match, omission, and intrusion rates) of food items reported by children on the Previous Day Food Questionnaire Online, using direct observation of foods/beverages eaten during school meals (mid-morning snack or afternoon snack) on the previous day as the reference. Multivariate multinomial logistic regression analysis was used to evaluate the influence of participants' characteristics on omission and intrusion rates. Results: The results showed adequate sensitivity (67.7%) and specificity (95.2%). There were low omission and intrusion rates of 22.8% and 29.5%, respectively, when all food items were analyzed. Pizza/hamburger showed the highest omission rate, whereas milk and milk products showed the highest intrusion rate. Participants who attended school in the afternoon shift presented a higher probability of intrusion compared with their peers who attended school in the morning. Conclusion: The Previous Day Food Questionnaire Online showed satisfactory validity for the assessment of food intake at the group level in schoolchildren from the 2nd to 5th grades of public school.

  8. Ultrasonic techniques validation on shell

    International Nuclear Information System (INIS)

    Navarro, J.; Gonzalez, E.

    1998-01-01

    Due to the results obtained in several international RRTs during the 1980s, it has been necessary to prove the effectiveness of NDT techniques. For this reason it has been imperative to verify the adequacy of the inspection procedure on different mock-ups that are representative of the inspection area and contain real defects. Prior to the revision of the inspection procedure, and with the aim of updating the techniques used, it is good practice to perform different scans on the mock-ups until validation is achieved. It is at this point that all the parameters of the inspection at hand are defined (transducer, step, scan direction, ...) and, more importantly, that the technique to be used for the area requiring inspection is demonstrated to be suitable for evaluating the degradation phenomena that could appear. (Author)

  9. Immediate breast reconstruction after skin- or nipple-sparing mastectomy for previously augmented patients: a personal technique.

    Science.gov (United States)

    Salgarello, Marzia; Rochira, Dario; Barone-Adesi, Liliana; Farallo, Eugenio

    2012-04-01

    Breast reconstruction for previously augmented patients differs from breast reconstruction for nonaugmented patients. Many surgeons regard conservation therapy as not feasible for these patients because of implant complications, whether radiotherapy-induced or not. Despite this, most authors agree that mastectomy with immediate breast reconstruction is the most suitable choice, ensuring both a good cosmetic result and a low complication rate. Implant retention or removal remains a controversial topic, in addition to the best available surgical technique. This study reviewed the authors' experience with immediate breast reconstruction after skin-sparing mastectomy (SSM) and nipple-sparing mastectomy (NSM) with anatomically definitive implants. The retrospective records of 12 patients were examined (group A). These patients were among 254 patients who underwent SSM or NSM for breast carcinoma. The control group comprised 12 of the 254 patients who underwent SSM or NSM (group B) and who best matched the 12 patients in the study group. All of them underwent immediate breast reconstruction, with an anatomically definitive implant placed in a submuscular-subfascial pocket. The demographic, technical, and oncologic data of the two groups were compared, as well as the aesthetic outcomes using the BREAST-Q score. The proportion of complications, the type of implant, the axillary lymph node procedure, and the histology were compared between the two groups using Fisher's exact test. Student's t test was used to compare the scores for the procedure-specific modules of the BREAST-Q questionnaire in the two groups. A validated patient satisfaction score was obtained using the BREAST-Q questionnaire after breast reconstruction. The demographic, technical, and oncologic characteristics were not significantly different between the two groups. The previously augmented patients reported a significantly higher level of satisfaction with their breast than the control patients. The scores...

  10. Technique for sparing previously irradiated critical normal structures in salvage proton craniospinal irradiation

    International Nuclear Information System (INIS)

    McDonald, Mark W; Wolanski, Mark R; Simmons, Joseph W; Buchsbaum, Jeffrey C

    2013-01-01

    Cranial reirradiation is clinically appropriate in some cases but cumulative radiation dose to critical normal structures remains a practical concern. The authors developed a simple technique in 3D conformal proton craniospinal irradiation (CSI) to block organs at risk (OAR) while minimizing underdosing of adjacent target brain tissue. Two clinical cases illustrate the use of proton therapy to provide salvage CSI when a previously irradiated OAR required sparing from additional radiation dose. The prior radiation plan was coregistered to the treatment planning CT to create a planning organ at risk volume (PRV) around the OAR. Right and left lateral cranial whole brain proton apertures were created with a small block over the PRV. Then right and left lateral “inverse apertures” were generated, creating an aperture opening in the shape of the area previously blocked and blocking the area previously open. The inverse aperture opening was made one millimeter smaller than the original block to minimize the risk of dose overlap. The inverse apertures were used to irradiate the target volume lateral to the PRV, selecting a proton beam range to abut the 50% isodose line against either lateral edge of the PRV. Together, the 4 cranial proton fields created a region of complete dose avoidance around the OAR. Comparative photon treatment plans were generated with opposed lateral X-ray fields with custom blocks and coplanar intensity modulated radiation therapy optimized to avoid the PRV. Cumulative dose volume histograms were evaluated. Treatment plans were developed and successfully implemented to provide sparing of previously irradiated critical normal structures while treating target brain lateral to these structures. The absence of dose overlapping during irradiation through the inverse apertures was confirmed by film. Compared to the lateral X-ray and IMRT treatment plans, the proton CSI technique improved coverage of target brain tissue while providing the least

  11. Validation techniques of agent based modelling for geospatial simulations

    Directory of Open Access Journals (Sweden)

    M. Darvishi

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge of ABMS, however, is the difficulty of its validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. The attempt to find appropriate validation techniques for ABM therefore seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  12. Validation techniques of agent based modelling for geospatial simulations

    Science.gov (United States)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that occur at large scales and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge of ABMS, however, is the difficulty of its validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS with conventional validation methods. The attempt to find appropriate validation techniques for ABM therefore seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.
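
    As one concrete empirical tactic among the validation techniques surveyed, an emergent model output can be compared statistically with reference data. A minimal sketch, assuming a toy random-walk agent model and a two-sample Kolmogorov-Smirnov test; the model, sample sizes and significance level are illustrative only:

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)

        def run_abm(n_agents=500, n_steps=100):
            # Toy agent rule: each agent takes unit random steps in 1D
            steps = rng.choice([-1.0, 1.0], size=(n_agents, n_steps))
            return steps.sum(axis=1)          # emergent displacement per agent

        model_output = run_abm()
        reference = rng.normal(0.0, np.sqrt(100), size=500)  # stand-in field data

        stat, p = ks_2samp(model_output, reference)
        # Illustrative acceptance rule: failing to reject means the emergent
        # distribution is statistically consistent with the reference data
        print(f"KS statistic={stat:.3f}, p={p:.3f}, consistent={p > 0.05}")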

  13. Validation of radioisotopic labelling techniques in gastric emptying studies

    International Nuclear Information System (INIS)

    Corinaldesi, R.; Stanghellini, V.; Raiti, C.; Calamelli, R.; Salgemini, R.; Barbara, L.; Zarabini, G.E.

    1987-01-01

    Several techniques are currently employed to label solid and liquid foods with gamma-emitting radioisotopes in order to carry out gamma-camera gastric emptying studies. The present study describes an in vitro technique for evaluating the labelling stability of some of the most commonly employed radiomarkers of both the solid and liquid phases. Technetium-99m sulphur colloid (99mTc-SC), used to label liver of chickens and other animal species in vivo and in vitro, appears to be an almost ideal marker of the solid phase (97% of radioactivity still bound to the solid phase after incubation in gastric juice for 90 minutes). On the contrary, 51CrCl3-labelled ground beef (81%) and 99mTc-SC egg white (69%) are unsatisfactory markers of the solid phase. Likewise, 99mTc-DTPA and 111In-DTPA cannot be considered satisfactory fluid-phase agents, because of the high proportion of radioactivity that leaves the liquid phase to become bound to the solid phase (76% and 49%, respectively, after 90 minutes of incubation). This validation technique appears to be simple, feasible and reproducible, and can be applied in any Nuclear Medicine Department to evaluate the validity of labelling procedures, in order to improve the accuracy of radioisotopic gastric emptying studies.

  14. Validation of New Crack Monitoring Technique for Victoria Class High-Pressure Air Bottles

    Science.gov (United States)

    2014-06-01

    Ian Thompson; John R. MacKay. Defence Research and Development Canada, Scientific Report DRDC-RDDC-2014-R81, June 2014.

  15. Technique for unit testing of safety software verification and validation

    International Nuclear Information System (INIS)

    Li Duo; Zhang Liangju; Feng Junting

    2008-01-01

    The key issue arising from digitalization of the reactor protection system for a nuclear power plant is how to carry out verification and validation (V&V) to demonstrate and confirm that the software performing reactor safety functions is safe and reliable. One of the most important processes in software V&V is unit testing, which verifies and validates the software coding against the concept design for consistency, correctness and completeness during software development. The paper presents a preliminary study of techniques for unit testing in safety software V&V, focusing on such aspects as how to confirm test completeness, how to establish a test platform, how to develop test cases and how to carry out unit testing. The technique discussed here was successfully used in the unit testing of safety software for a digital reactor protection system. (authors)
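
    To illustrate the flavour of unit testing discussed above, here is a minimal pytest-style sketch for a hypothetical trip-decision function; the function, limits and cases are invented for illustration and are not from the referenced protection system:

        def should_trip(pressure_mpa: float, temperature_c: float) -> bool:
            """Hypothetical reactor-trip predicate under test."""
            PRESSURE_LIMIT = 15.5   # illustrative limits only
            TEMP_LIMIT = 350.0
            return pressure_mpa > PRESSURE_LIMIT or temperature_c > TEMP_LIMIT

        def test_trip_on_overpressure():
            assert should_trip(16.0, 300.0)

        def test_trip_on_overtemperature():
            assert should_trip(15.0, 360.0)

        def test_no_trip_in_normal_range():
            assert not should_trip(15.0, 300.0)

        def test_boundary_values_do_not_trip():
            # Completeness check at the exact limits (boundary-value analysis)
            assert not should_trip(15.5, 350.0)

    Test completeness in this style is argued case by case: normal range, each trip condition in isolation, and the boundary values between them.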

  16. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: first, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...
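
    As a toy illustration of casting a model consistency question as propositional satisfiability, the sketch below brute-forces a three-variable encoding of a hypothetical miniature UML/OCL model; it is not the USE tool's actual translation:

        from itertools import product

        # Variables: p = "class Person is instantiated",
        #            a = "class Account is instantiated",
        #            o = "some Account has an owner"
        # Hypothetical constraints:
        #   0. a              (test scenario: at least one Account exists)
        #   1. o -> p         (an owner must be a Person instance)
        #   2. o -> a         (an owned Account must exist)
        #   3. a -> o         (OCL invariant: every Account has an owner)
        def satisfiable():
            for p, a, o in product([False, True], repeat=3):
                if all([a, (not o or p), (not o or a), (not a or o)]):
                    return p, a, o
            return None

        print(satisfiable())   # (True, True, True): the constraints are consistent

    A real SAT-based approach delegates this search to a solver instead of enumerating assignments, which is what makes it scale to full class diagrams.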

  17. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  18. Validity of a cross-specialty test in basic laparoscopic techniques (TABLT)

    DEFF Research Database (Denmark)

    Thinggaard, Ebbe; Bjerrum, Flemming; Strandbygaard, Jeanett

    2015-01-01

    The aim of this study was to establish validity evidence for the Training and Assessment of Basic Laparoscopic Techniques (TABLT) test, a tablet-based training system. METHODS: Laparoscopic surgeons and trainees were recruited from departments of general surgery, gynaecology and urology. Participants included novice, intermediate and experienced surgeons. All participants performed the TABLT test. Performance scores were calculated based on time taken and errors made. Evidence of validity was explored using a contemporary framework of validity. RESULTS: Some 60 individuals participated. The TABLT was shown to be reliable, with an intraclass correlation coefficient of 0.99...

  19. Insertion of central venous catheters for hemodialysis using angiographic techniques in patients with previous multiple catheterizations

    Energy Technology Data Exchange (ETDEWEB)

    Kotsikoris, Ioannis, E-mail: gkotsikoris@gmail.com [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Zygomalas, Apollon, E-mail: azygomalas@upatras.gr [Department of General Surgery, University Hospital of Patras (Greece); Papas, Theofanis, E-mail: pfanis@otenet.gr [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Maras, Dimitris, E-mail: dimmaras@gmail.com [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Pavlidis, Polyvios, E-mail: polpavlidis@yahoo.gr [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Andrikopoulou, Maria, E-mail: madric@gmail.com [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece); Tsanis, Antonis, E-mail: atsanis@gmail.com [Department of Interventional Radiology, “Erythros Stauros” General Hospital (Greece); Alivizatos, Vasileios, E-mail: valiviz@hol.gr [Department of General Surgery and Artificial Nutrition Unit, “Agios Andreas” General Hospital of Patras (Greece); Bessias, Nikolaos, E-mail: bessias@otenet.gr [Department of Vascular Surgery, “Erythros Stauros” General Hospital (Greece)

    2012-09-15

    Introduction: Central venous catheter placement is an effective alternative vascular access for dialysis in patients with chronic renal failure. The purpose of this study was to evaluate the insertion of central venous catheters for hemodialysis using angiographic techniques in patients with previous multiple catheterizations, in terms of efficacy of the procedure and early complications. Materials and methods: Between 2008 and 2010, the vascular access team of our hospital placed 409 central venous catheters in patients with chronic renal failure. The procedure was performed using the blind Seldinger technique. In 18 (4.4%) cases it was impossible to advance the guidewire, and so the patients were transported to the angiography suite. Results: Using the angiographic technique, the guidewire was advanced in order to position the central venous catheter. The catheter was inserted into the subclavian vein in 12 (66.6%) cases, into the internal jugular vein in 4 (22.2%) and into the femoral vein in 2 (11.1%) cases. The only complication was severe arrhythmia in 1 (5.5%) patient. Conclusion: Our results suggest that insertion of central venous catheters using angiographic techniques in hemodialysis patients with previous multiple catheterizations is a safe and effective procedure with few complications and high success rates.

  20. Insertion of central venous catheters for hemodialysis using angiographic techniques in patients with previous multiple catheterizations

    International Nuclear Information System (INIS)

    Kotsikoris, Ioannis; Zygomalas, Apollon; Papas, Theofanis; Maras, Dimitris; Pavlidis, Polyvios; Andrikopoulou, Maria; Tsanis, Antonis; Alivizatos, Vasileios; Bessias, Nikolaos

    2012-01-01

    Introduction: Central venous catheter placement is an effective alternative vascular access for dialysis in patients with chronic renal failure. The purpose of this study was to evaluate the insertion of central venous catheters for hemodialysis using angiographic techniques in patients with previous multiple catheterizations, in terms of efficacy of the procedure and early complications. Materials and methods: Between 2008 and 2010, the vascular access team of our hospital placed 409 central venous catheters in patients with chronic renal failure. The procedure was performed using the blind Seldinger technique. In 18 (4.4%) cases it was impossible to advance the guidewire, and so the patients were transported to the angiography suite. Results: Using the angiographic technique, the guidewire was advanced in order to position the central venous catheter. The catheter was inserted into the subclavian vein in 12 (66.6%) cases, into the internal jugular vein in 4 (22.2%) and into the femoral vein in 2 (11.1%) cases. The only complication was severe arrhythmia in 1 (5.5%) patient. Conclusion: Our results suggest that insertion of central venous catheters using angiographic techniques in hemodialysis patients with previous multiple catheterizations is a safe and effective procedure with few complications and high success rates.

  1. Validation Techniques for Sensor Data in Mobile Health Applications

    Directory of Open Access Journals (Sweden)

    Ivan Miguel Pires

    2016-01-01

    Mobile applications have become a must on every user's smart device, and many of these applications make use of the device's sensors to achieve their goals. Nevertheless, it remains fairly unknown to the user to what extent the data the applications use can be relied upon and, therefore, to what extent the output of a given application is trustworthy. To help developers and researchers, and to provide a common ground of data validation algorithms and techniques, this paper presents a review of the most commonly used data validation algorithms, along with their usage scenarios, and proposes a classification for these algorithms. This paper also discusses the process of achieving statistical significance and trust for the desired output.
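
    A minimal sketch of two validation checks of the kind such a review classifies, range validation and spike (rate-of-change) detection, applied to an accelerometer stream; the limits and readings are invented:

        import math

        ACC_RANGE = (-16.0, 16.0)   # plausible accelerometer range, in g
        MAX_JUMP = 4.0              # max plausible change between samples, in g

        def validate_stream(samples):
            """Yield (value, is_valid) pairs for a raw accelerometer stream."""
            previous = None
            for value in samples:
                in_range = ACC_RANGE[0] <= value <= ACC_RANGE[1]
                no_spike = previous is None or abs(value - previous) <= MAX_JUMP
                not_nan = not math.isnan(value)
                yield value, (in_range and no_spike and not_nan)
                previous = value

        readings = [0.1, 0.2, 9.8, 0.3, float("nan"), 0.4]
        for v, ok in validate_stream(readings):
            print(f"{v!r}: {'ok' if ok else 'rejected'}")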

  2. Reliability and criterion validity of an observation protocol for working technique assessments in cash register work.

    Science.gov (United States)

    Palm, Peter; Josephson, Malin; Mathiassen, Svend Erik; Kjellberg, Katarina

    2016-06-01

    We evaluated the intra- and inter-observer reliability and criterion validity of an observation protocol, developed in an iterative process involving practicing ergonomists, for assessment of working technique during cash register work for the purpose of preventing upper extremity symptoms. Two ergonomists independently assessed 17 15-min videos of cash register work on two occasions each, as a basis for examining reliability. Criterion validity was assessed by comparing these assessments with meticulous video-based analyses by researchers. Intra-observer reliability was acceptable (i.e. proportional agreement >0.7 and kappa >0.4) for 10/10 questions. Inter-observer reliability was acceptable for only 3/10 questions. An acceptable inter-observer reliability combined with an acceptable criterion validity was obtained only for one working technique aspect, 'Quality of movements'. Thus, major elements of the cashiers' working technique could not be assessed with an acceptable accuracy from short periods of observations by one observer, such as often desired by practitioners. Practitioner Summary: We examined an observation protocol for assessing working technique in cash register work. It was feasible in use, but inter-observer reliability and criterion validity were generally not acceptable when working technique aspects were assessed from short periods of work. We recommend the protocol to be used for educational purposes only.
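
    The reliability thresholds quoted above lend themselves to a compact computation. A minimal sketch in Python of proportional agreement and Cohen's kappa for one protocol question, with made-up ratings from two observers:

        def agreement_and_kappa(rater1, rater2):
            n = len(rater1)
            categories = set(rater1) | set(rater2)
            # Proportional agreement: share of items rated identically
            p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
            # Expected chance agreement, from each rater's marginal frequencies
            p_e = sum(
                (rater1.count(c) / n) * (rater2.count(c) / n) for c in categories
            )
            kappa = (p_o - p_e) / (1 - p_e)
            return p_o, kappa

        obs_a = ["good", "good", "poor", "good", "poor", "good", "good", "poor"]
        obs_b = ["good", "poor", "poor", "good", "poor", "good", "good", "good"]
        p_o, kappa = agreement_and_kappa(obs_a, obs_b)
        # Acceptable by the paper's criteria if agreement > 0.7 and kappa > 0.4
        print(f"agreement={p_o:.2f}, kappa={kappa:.2f}")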

  3. Validation of keypad user identity using a novel biometric technique

    International Nuclear Information System (INIS)

    Grabham, N J; White, N M

    2007-01-01

    This paper presents initial work on the development of a keypad incorporating sensors to enable the biometric identity validation of the person using the keypad. The technique reported here is covert and non-intrusive and in use, requires no additional actions on the part of the user. Test systems have been developed using commercially available keypads, modified with mass-produced force sensors to facilitate measurement of key-press dynamics. Measurements are accomplished using a DAQ module attached to a PC running custom software to extract the biometric data and perform the validation. The design of the test system and the results from initial trials are presented. For a system designed for a false rejection ratio of 0%, a false acceptance rate of around 15% can be achieved
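
    For context on the error rates quoted above, here is a minimal sketch of how false acceptance and false rejection rates are conventionally computed from match scores; the scores and threshold are invented, not taken from the keypad system:

        genuine_scores  = [0.91, 0.85, 0.78, 0.88, 0.66, 0.95]  # same user
        impostor_scores = [0.32, 0.55, 0.71, 0.40, 0.28, 0.60]  # other users

        def far_frr(genuine, impostor, threshold):
            far = sum(s >= threshold for s in impostor) / len(impostor)
            frr = sum(s < threshold for s in genuine) / len(genuine)
            return far, frr

        # Choosing the threshold so that FRR = 0 (the paper's design point)
        # fixes the false acceptance rate that can then be achieved.
        threshold = min(genuine_scores)     # lowest genuine score => FRR 0
        far, frr = far_frr(genuine_scores, impostor_scores, threshold)
        print(f"threshold={threshold}, FAR={far:.0%}, FRR={frr:.0%}")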

  4. Physical validation issue of the NEPTUNE two-phase modelling: validation plan to be adopted, experimental programs to be set up and associated instrumentation techniques developed

    International Nuclear Information System (INIS)

    Pierre Peturaud; Eric Hervieu

    2005-01-01

    A long-term joint development program for the next generation of nuclear reactor simulation tools was launched in 2001 by EDF (Electricite de France) and CEA (Commissariat a l'Energie Atomique). The NEPTUNE Project constitutes the thermal-hydraulics part of this comprehensive program. Along with the underway development of this new two-phase flow software platform, the physical validation of the modelling involved is a crucial issue, whatever the modelling scale, and the present paper deals with this issue. After a brief recall of the NEPTUNE platform, the general validation strategy to be adopted is first of all clarified by means of three major features: (i) physical validation in close connection with the concerned industrial applications, (ii) involving (as far as possible) a two-step process successively focusing on dominant separate models and assessing the whole modelling capability, (iii) thanks to the use of relevant data with respect to the validation aims. Based on this general validation process, a four-step generic work approach has been defined; it includes: (i) a thorough analysis of the concerned industrial applications to identify the key physical phenomena involved and the associated dominant basic models, (ii) an assessment of these models against the available validation pieces of information, to specify the additional validation needs and define dedicated validation plans, (iii) an inventory and assessment of existing validation data (with respect to the requirements specified in the previous task) to identify the actual needs for new validation data, (iv) the specification of the new experimental programs to be set up to provide the needed new data. This work approach has been applied to the NEPTUNE software, focusing on 8 high-priority industrial applications, and it has resulted in the definition of (i) the validation plan and experimental programs to be set up for the open medium 3D modelling...

  5. Using of Structural Equation Modeling Techniques in Cognitive Levels Validation

    Directory of Open Access Journals (Sweden)

    Natalija Curkovic

    2012-10-01

    When constructing knowledge tests, cognitive level is usually one of the dimensions comprising the test specifications, with each item assigned to measure a particular level. Recently used taxonomies of cognitive levels most often represent some modification of the original Bloom's taxonomy. There are many concerns in the current literature about the existence of predefined cognitive levels. The aim of this article is to investigate whether structural equation modeling techniques can confirm the existence of different cognitive levels. For the purpose of the research, a Croatian final high-school Mathematics exam was used (N = 9626). Confirmatory factor analysis and structural regression modeling were used to test three different models. Structural equation modeling techniques did not support the existence of different cognitive levels in this case. There is more than one possible explanation for that finding. Some other techniques that take into account the nonlinear behaviour of the items, as well as qualitative techniques, might be more useful for the purpose of cognitive level validation. Furthermore, it seems that cognitive levels were not efficient descriptors of the items, and so improvements are needed in describing the cognitive skills measured by items.

  6. Age validation of canary rockfish (Sebastes pinniger) using two independent otolith techniques: lead-radium and bomb radiocarbon dating.

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, A H; Kerr, L A; Cailliet, G M; Brown, T A; Lundstrom, C C; Stanley, R D

    2007-11-04

    Canary rockfish (Sebastes pinniger) have long been an important part of recreational and commercial rockfish fishing from southeast Alaska to southern California, but localized stock abundances have declined considerably. Based on age estimates from otoliths and other structures, lifespan estimates vary from about 20 years to over 80 years. For the purpose of monitoring stocks, age composition is routinely estimated by counting growth zones in otoliths; however, age estimation procedures and lifespan estimates remain largely unvalidated. Typical age validation techniques have limited application for canary rockfish because they are deep dwelling and may be long lived. In this study, the unaged otolith of the pair from fish aged at the Department of Fisheries and Oceans Canada was used in one of two age validation techniques: (1) lead-radium dating and (2) bomb radiocarbon (14C) dating. Age estimate accuracy and the validity of age estimation procedures were validated based on the results from each technique. Lead-radium dating proved successful in determining that a minimum estimate of lifespan was 53 years and provided support for age estimation procedures up to about 50-60 years. These findings were further supported by Δ14C data, which indicated a minimum estimate of lifespan of 44 ± 3 years. Both techniques validate, to differing degrees, age estimation procedures and provide support for inferring that canary rockfish can live more than 80 years.
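
    The lead-radium method rests on the ingrowth of 210Pb from 226Ra trapped in the otolith core. A standard form of the ingrowth relation, assuming negligible initial 210Pb and effectively constant 226Ra activity over the fish's life, is:

        \[
          \frac{A_{^{210}\mathrm{Pb}}(t)}{A_{^{226}\mathrm{Ra}}} = 1 - e^{-\lambda t},
          \qquad
          \lambda = \frac{\ln 2}{T_{1/2}}, \quad T_{1/2}\bigl(^{210}\mathrm{Pb}\bigr) \approx 22.3\ \mathrm{yr},
        \]
        \[
          t = -\frac{1}{\lambda}\,\ln\!\left(1 - \frac{A_{^{210}\mathrm{Pb}}}{A_{^{226}\mathrm{Ra}}}\right).
        \]

    A measured core activity ratio therefore yields a minimum age that is independent of growth-zone counts, which is what lets it cross-validate the otolith readings.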

  7. Validation of a technique of measurement in vivo of 131I in thyroids

    International Nuclear Information System (INIS)

    Villella, A.M.; Puerta Yepes, N.; Gossio, S.; Papadopulos, S.

    2010-01-01

    The Total Body Counter (TBC) Laboratory of the Nuclear Regulatory Authority, following the institutional initiative of quality assurance in its measurement techniques, has been involved in an accreditation process based on the ISO/IEC 17025:2005 standard. In vivo measurement of 131I in the thyroid has been selected as the first technique in this process, and it is described in this paper. The TBC Laboratory uses for this technique a gamma spectrometry system with a NaI(Tl) detector, calibrated with a neck simulator of the IRD and a certified plane source of 131I with thyroid shape. A validation plan has been carried out that has permitted the characterization of the 131I measurement technique and the evaluation of its uncertainty. Measurement parameters that affect the uncertainty are discussed, and recommendations for the optimization of the technique are proposed. (authors)

  8. TCV software test and validation tools and technique. [Terminal Configured Vehicle program for commercial transport aircraft operation

    Science.gov (United States)

    Straeter, T. A.; Williams, J. R.

    1976-01-01

    The paper describes techniques for testing and validating software for the TCV (Terminal Configured Vehicle) program which is intended to solve problems associated with operating a commercial transport aircraft in the terminal area. The TCV research test bed is a Boeing 737 specially configured with digital computer systems to carry out automatic navigation, guidance, flight controls, and electronic displays research. The techniques developed for time and cost reduction include automatic documentation aids, an automatic software configuration, and an all software generation and validation system.

  9. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin

    2015-01-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. PMID:26590280

  10. Validation of the ELISA technique for diagnosis of trypanosomiasis in cattle in Uganda

    International Nuclear Information System (INIS)

    Okuna, N.M.

    1992-01-01

    ELISA, developed at ILRAD for the diagnosis of T. congolense, T. brucei and T. vivax in cattle, has not been validated in Uganda. This study was undertaken to validate the technique. Negative reference sera were collected from 44 cattle in Kapchorwa, a tsetse-free area. The cattle were free of the three trypanosome species T. congolense, T. brucei and T. vivax by the haematocrit buffy coat technique (BCT). But by ELISA, three were positive for T. vivax, one for both T. congolense and T. vivax and one for T. congolense. Sera were collected from the same 44 cattle 10 weeks later. The cattle were again free of T. congolense, T. brucei and T. vivax, both by BCT and by mouse inoculation. Two cattle out of 450 screened at a centre 5 km away had T. vivax by BCT. The ELISA results for the second set of sera were quite similar to those obtained from the first set. The calculated optical density (OD) cut-off point was 50 for both T. brucei and T. vivax, but 60 for T. congolense. Sera from 5 cattle which had T. theileri and two which had microfilaria were all negative for antigenaemia by ELISA. Positive reference sera were collected from 40 cattle in a high tsetse challenge area. Using the haematocrit buffy coat technique, 5 had T. vivax, two had T. brucei and one had T. congolense. Checked by ELISA for antigenaemia, only 4 cattle were free of all three trypanosome species, T. congolense, T. vivax and T. brucei. All 40 cattle were treated with diminazene aceturate at the rate of 7 mg/kg body weight. Two weeks later, the ELISA test showed that 10 cattle were free of any antigenaemia. Those still positive for antigenaemia had lower OD readings. The ELISA technique is valid. It is much more sensitive than parasitological tests. It is specific, since none of the 7 cattle with either T. theileri or microfilaria gave positive results by ELISA. The technique would be very useful for epizootiological studies. (author)
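
    A minimal sketch of one common way an ELISA cut-off is derived from negative reference sera, as mean optical density plus two standard deviations; the OD readings are invented, and the abstract does not state which rule the study actually used:

        import statistics

        negative_od = [28, 35, 41, 30, 44, 38, 25, 47, 33, 39]  # reference sera

        mean = statistics.mean(negative_od)
        sd = statistics.stdev(negative_od)
        cutoff = mean + 2 * sd   # illustrative decision rule

        def is_positive(od_reading, threshold=cutoff):
            return od_reading >= threshold

        print(f"cut-off = {cutoff:.0f} OD units")
        print(is_positive(62), is_positive(31))   # True, False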

  11. Site characterization and validation - equipment design and techniques used in single borehole hydraulic testing, simulated drift experiment and crosshole testing

    International Nuclear Information System (INIS)

    Holmes, D.C.; Sehlstedt, M.

    1991-10-01

    This report describes the equipment and techniques used to investigate the variation of hydrogeological parameters within a fractured crystalline rock mass. The testing program was performed during stage 3 of the site characterization and validation programme at the Stripa mine in Sweden. This programme used a multidisciplinary approach, combining geophysical, geological and hydrogeological methods, to determine how groundwater moved through the rock mass. The hydrogeological work package involved three components. Firstly, novel single borehole techniques (focused packer testing) were used to determine the distribution of hydraulic conductivity and head along individual boreholes. Secondly, water was abstracted from boreholes which were drilled to simulate a tunnel (simulated drift experiment). Locations and magnitudes of flows were measured together with pressure responses at various points in the SCV rock mass. Thirdly, small scale crosshole tests, involving detailed interference testing, were used to determine the variability of hydrogeological parameters within previously identified, significant flow zones. (au)

  12. Validation of the FFM PD count technique for screening personality pathology in later middle-aged and older adults.

    Science.gov (United States)

    Van den Broeck, Joke; Rossi, Gina; De Clercq, Barbara; Dierckx, Eva; Bastiaansen, Leen

    2013-01-01

    Research on the applicability of the five-factor model (FFM) to capturing personality pathology coincided with the development of an FFM personality disorder (PD) count technique, which has been validated in adolescent, young adult, and middle-aged samples. This study extends the literature by validating this technique in an older sample. Five alternative FFM PD counts based on the Revised NEO Personality Inventory (NEO PI-R) are computed and evaluated in terms of both convergent and divergent validity with the Assessment of DSM-IV Personality Disorders Questionnaire (ADP-IV; DSM-IV, Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition). For the best-working count for each PD, normative data are presented, from which cut-off scores are derived. The validity of these cut-offs and their usefulness as a screening tool is tested against both a categorical (i.e., the DSM-IV Text Revision) and a dimensional (i.e., the Dimensional Assessment of Personality Pathology; DAPP) measure of personality pathology. All but the Antisocial and Obsessive-Compulsive counts exhibited adequate convergent and divergent validity, supporting the use of this method in older adults. Using the ADP-IV and the DAPP Short Form as validation criteria, the results corroborate the use of the FFM PD count technique to screen for PDs in older adults, in particular for the Paranoid, Borderline, Histrionic, Avoidant, and Dependent PDs. Given the age-neutrality of the NEO PI-R and the considerable lack of valid personality assessment tools, the current findings appear promising for the assessment of personality pathology in older adults.
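
    A minimal sketch of the count logic itself: facets judged prototypically high for a disorder are summed, and prototypically low facets are reverse-scored before summing. The facet selection, scoring range and cut-off below are hypothetical illustrations, not the validated values from this study:

        REVERSE_MAX = 5  # assumes facets scored 1-5

        # Hypothetical prototype: avoidant PD ~ high anxiety, high
        # self-consciousness, low gregariousness, low excitement-seeking
        AVOIDANT_HIGH = ["anxiety", "self_consciousness"]
        AVOIDANT_LOW = ["gregariousness", "excitement_seeking"]

        def pd_count(facets: dict) -> int:
            high = sum(facets[f] for f in AVOIDANT_HIGH)
            low = sum(REVERSE_MAX + 1 - facets[f] for f in AVOIDANT_LOW)
            return high + low

        person = {"anxiety": 4, "self_consciousness": 5,
                  "gregariousness": 2, "excitement_seeking": 1}
        CUTOFF = 14   # illustrative screening threshold
        score = pd_count(person)
        print(score, "flag for follow-up" if score >= CUTOFF else "below cut-off")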

  13. Fabrication and Characterization of Surrogate Glasses Aimed to Validate Nuclear Forensic Techniques

    Science.gov (United States)

    2017-12-01

    Ken G. Foos. Master's thesis, December 2017 (thesis advisor: Claudia...). ...the glass formed during a nuclear event, trinitite [14]. The SiO2 composition is generally greater than 50% for trinitite and can vary appreciably...

  14. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields.

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin; Strawn, Laura K

    2016-02-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
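
    A minimal sketch of the rule-validation step: a logistic regression of pathogen isolation on a binary proximity predictor, with the odds ratio recovered from the fitted coefficient; the counts are invented, not the study's data:

        import numpy as np
        import statsmodels.api as sm

        # 1 = sample positive for L. monocytogenes, by proximity stratum
        near_water = np.array([1] * 60 + [0] * 60)          # predictor
        positive   = np.array([1] * 30 + [0] * 30           # near water: 30/60
                            + [1] * 12 + [0] * 48)          # far from water: 12/60

        X = sm.add_constant(near_water)
        fit = sm.Logit(positive, X).fit(disp=0)
        odds_ratio = np.exp(fit.params[1])
        print(f"OR for proximity to water: {odds_ratio:.1f}")  # ~4.0 here

    A rule is retained when the odds ratio is significantly above 1, i.e. samples in the "high-risk" stratum really are more likely to yield the pathogen.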

  15. Validation of transport models using additive flux minimization technique

    Energy Technology Data Exchange (ETDEWEB)

    Pankin, A. Y.; Kruger, S. E. [Tech-X Corporation, 5621 Arapahoe Ave., Boulder, Colorado 80303 (United States); Groebner, R. J. [General Atomics, San Diego, California 92121 (United States); Hakim, A. [Princeton Plasma Physics Laboratory, Princeton, New Jersey 08543-0451 (United States); Kritz, A. H.; Rafiq, T. [Department of Physics, Lehigh University, Bethlehem, Pennsylvania 18015 (United States)

    2013-10-15

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile.

  16. Validation of transport models using additive flux minimization technique

    International Nuclear Information System (INIS)

    Pankin, A. Y.; Kruger, S. E.; Groebner, R. J.; Hakim, A.; Kritz, A. H.; Rafiq, T.

    2013-01-01

    A new additive flux minimization technique is proposed for carrying out the verification and validation (V and V) of anomalous transport models. In this approach, the plasma profiles are computed in time dependent predictive simulations in which an additional effective diffusivity is varied. The goal is to obtain an optimal match between the computed and experimental profile. This new technique has several advantages over traditional V and V methods for transport models in tokamaks and takes advantage of uncertainty quantification methods developed by the applied math community. As a demonstration of its efficiency, the technique is applied to the hypothesis that the paleoclassical density transport dominates in the plasma edge region in DIII-D tokamak discharges. A simplified version of the paleoclassical model that utilizes the Spitzer resistivity for the parallel neoclassical resistivity and neglects the trapped particle effects is tested in this paper. It is shown that a contribution to density transport, in addition to the paleoclassical density transport, is needed in order to describe the experimental profiles. It is found that more additional diffusivity is needed at the top of the H-mode pedestal, and almost no additional diffusivity is needed at the pedestal bottom. The implementation of this V and V technique uses the FACETS::Core transport solver and the DAKOTA toolkit for design optimization and uncertainty quantification. The FACETS::Core solver is used for advancing the plasma density profiles. The DAKOTA toolkit is used for the optimization of plasma profiles and the computation of the additional diffusivity that is required for the predicted density profile to match the experimental profile
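
    A minimal sketch of the additive-flux idea, with a toy steady-state slab diffusion problem standing in for the transport solve (the study itself uses the FACETS::Core solver and DAKOTA); the numbers are illustrative:

        import numpy as np
        from scipy.optimize import minimize_scalar

        x = np.linspace(0.0, 1.0, 50)     # normalized radius
        D_MODEL = 0.5                     # diffusivity from the physics model

        def profile(d_total):
            # Toy solution of -d_total * n'' = 1 with n(1) = 0, n'(0) = 0
            return (1.0 - x**2) / (2.0 * d_total)

        n_exp = profile(0.8)              # synthetic "experimental" profile

        def mismatch(d_add):
            # Squared error between predicted and experimental profiles
            return np.sum((profile(D_MODEL + d_add) - n_exp) ** 2)

        res = minimize_scalar(mismatch, bounds=(0.0, 2.0), method="bounded")
        print(f"inferred additional diffusivity: {res.x:.2f}")   # ~0.30

    The size of the inferred additional diffusivity is the diagnostic: where it is near zero the tested model suffices, and where it is large the model misses transport.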

  17. Validation of a large-scale audit technique for CT dose optimisation

    International Nuclear Information System (INIS)

    Wood, T. J.; Davis, A. W.; Moore, C. S.; Beavis, A. W.; Saunderson, J. R.

    2008-01-01

    The expansion and increasing availability of computed tomography (CT) imaging means that there is a greater need for the development of efficient optimisation strategies that are able to inform clinical practice, without placing a significant burden on limited departmental resources. One of the most fundamental aspects to any optimisation programme is the collection of patient dose information, which can be compared with appropriate diagnostic reference levels. This study has investigated the implementation of a large-scale audit technique, which utilises data that already exist in the radiology information system, to determine typical doses for a range of examinations on four CT scanners. This method has been validated against what is considered the 'gold standard' technique for patient dose audits, and it has been demonstrated that results equivalent to the 'standard-sized patient' can be inferred from this much larger data set. This is particularly valuable where CT optimisation is concerned as it is considered a 'high dose' technique, and hence close monitoring of patient dose is particularly important. (authors)
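
    A minimal sketch of the audit computation, assuming a hypothetical tabular export from the radiology information system with invented column names and values:

        import pandas as pd

        # Hypothetical RIS export: one row per scan
        ris = pd.DataFrame({
            "scanner":   ["CT1", "CT1", "CT2", "CT2", "CT1", "CT2"],
            "exam":      ["head", "head", "head", "chest", "chest", "chest"],
            "dlp_mGycm": [880.0, 910.0, 975.0, 420.0, 455.0, 390.0],
        })

        # With a large sample, the median DLP per scanner and exam type
        # approximates the typical ("standard-sized patient") dose
        typical = ris.groupby(["scanner", "exam"])["dlp_mGycm"].median()
        print(typical)

        # Compare against a diagnostic reference level (illustrative value)
        DRL_HEAD_DLP = 970.0
        print(typical.loc[("CT1", "head")] <= DRL_HEAD_DLP)   # True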

  18. MO-B-BRB-03: 3D Dosimetry in the Clinic: Validating Special Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Juang, T. [Stanford Cancer Center (United States)

    2016-06-15

    Full three-dimensional (3D) dosimetry using volumetric chemical dosimeters probed by 3D imaging systems has long been a promising technique for the radiation therapy clinic, since it provides a unique methodology for dose measurements in the volume irradiated using complex conformal delivery techniques such as IMRT and VMAT. To date, true 3D dosimetry is still not widely practiced in the community; it has been confined to centres of specialized expertise, especially for quality assurance or commissioning roles where other dosimetry techniques are difficult to implement. The potential for improved clinical applicability has been advanced considerably in the last decade by the development of improved 3D dosimeters (e.g., radiochromic plastics, radiochromic gel dosimeters and normoxic polymer gel systems) and by improved readout protocols using optical computed tomography or magnetic resonance imaging. In this session, established users of some current 3D chemical dosimeters will briefly review the current status of 3D dosimetry, describe several dosimeters and their appropriate imaging for dose readout, present workflow procedures required for good dosimetry, and analyze some limitations for applications in select settings. We will review the application of 3D dosimetry to various clinical situations, describing how 3D approaches can complement other dose delivery validation approaches already available in the clinic. The applications presented will be selected to inform attendees of the unique features provided by full 3D techniques. Learning Objectives: L. John Schreiner (Background and Motivation): understand recent developments enabling clinically practical 3D dosimetry; appreciate 3D dosimetry workflow and dosimetry procedures; observe select examples from the clinic. Sofie Ceberg (Application to dynamic radiotherapy): observe full dosimetry under dynamic radiotherapy during respiratory motion; understand how the measurement of high-resolution dose data in an...

  19. MO-B-BRB-03: 3D Dosimetry in the Clinic: Validating Special Techniques

    International Nuclear Information System (INIS)

    Juang, T.

    2016-01-01

    Full three-dimensional (3D) dosimetry using volumetric chemical dosimeters probed by 3D imaging systems has long been a promising technique for the radiation therapy clinic, since it provides a unique methodology for dose measurements in the volume irradiated using complex conformal delivery techniques such as IMRT and VMAT. To date true 3D dosimetry is still not widely practiced in the community; it has been confined to centres of specialized expertise especially for quality assurance or commissioning roles where other dosimetry techniques are difficult to implement. The potential for improved clinical applicability has been advanced considerably in the last decade by the development of improved 3D dosimeters (e.g., radiochromic plastics, radiochromic gel dosimeters and normoxic polymer gel systems) and by improved readout protocols using optical computed tomography or magnetic resonance imaging. In this session, established users of some current 3D chemical dosimeters will briefly review the current status of 3D dosimetry, describe several dosimeters and their appropriate imaging for dose readout, present workflow procedures required for good dosimetry, and analyze some limitations for applications in select settings. We will review the application of 3D dosimetry to various clinical situations describing how 3D approaches can complement other dose delivery validation approaches already available in the clinic. The applications presented will be selected to inform attendees of the unique features provided by full 3D techniques. Learning Objectives: L. John Schreiner: Background and Motivation Understand recent developments enabling clinically practical 3D dosimetry, Appreciate 3D dosimetry workflow and dosimetry procedures, and Observe select examples from the clinic. Sofie Ceberg: Application to dynamic radiotherapy Observe full dosimetry under dynamic radiotherapy during respiratory motion, and Understand how the measurement of high resolution dose data in an

  20. Survey and assessment of conventional software verification and validation techniques

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-02-01

    Reliable software is required for nuclear power plant applications. Verification and validation (V&V) techniques may be applied during software development to help eliminate errors that can inhibit the proper operation of digital systems and that may cause safety problems. EPRI and the NRC are cosponsoring this investigation to determine the best strategies for V&V of expert system software. The strategy used for a particular system will depend on the complexity of the software and the level of integrity required. This report covers the first task in the investigation: reviewing methods for V&V of conventional software systems and evaluating them for use with expert systems.

  1. Validity of proxy data obtained by different psychological autopsy information reconstruction techniques.

    Science.gov (United States)

    Fang, L; Zhang, J

    2010-01-01

    Two informants were interviewed for each of 416 living controls (individuals sampled from the normal population) in a Chinese case-control psychological autopsy study. The validity of proxy data, obtained using seven psychological autopsy information reconstruction techniques (types 1, 2 and A - E), was evaluated, with the living controls' self reports used as the gold standard. Proxy data for reconstruction technique types 1, 2 and D on the Impulsivity Inventory Scale (total impulsivity score) were no different from the living controls' self-report gold standard, whereas data for types A and E were smaller than data from living controls. On the 'acceptance or resignation' sub-scale of the avoidance coping dimension of the Moos Coping Response Inventory, information obtained by reconstruction technique types 1 and D was not significantly different from the living controls' self reports, whereas proxy data from types 2, A and E were smaller than those from the living controls. No statistically significant differences were identified for other proxy data obtained by reconstruction technique types 1, 2, A, D and E. These results indicate that using a second informant does not significantly enhance information reconstruction for the target.
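
    A hedged sketch of the gold-standard comparison described above: paired proxy and self-report scores, here with toy data standing in for the 416 controls, tested for a systematic difference.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Toy paired data: self-report gold standard and one reconstruction type's
# proxy scores on the same scale (synthetic, for illustration only).
self_report = rng.normal(50, 10, size=416)
proxy = self_report + rng.normal(-1.5, 8, size=416)  # proxies tend to underreport

# Paired test: does this reconstruction type differ from the gold standard?
t, p = stats.ttest_rel(proxy, self_report)
print(f"mean difference {np.mean(proxy - self_report):+.2f}, t={t:.2f}, p={p:.4f}")
```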

  2. SU-F-BRE-07: Experimental Validation of a Lung SBRT Technique Using a Novel, True Volumetric Plenoptic-Plastic-Scintillator Detector

    International Nuclear Information System (INIS)

    Goulet, M; Rilling, M; Gingras, L; Beaulieu, L; Archambault, L; Beddar, S

    2014-01-01

    Purpose: Lung SBRT is being used by an increasing number of clinics, including our center, which recently treated its first patient. In order to validate this technique, the 3D dose distribution of the SBRT plan was measured using a previously developed 3D detector based on plenoptic camera and plastic scintillator technology. The excellent agreement between the detector measurement and the expected dose from the treatment planning system Pinnacle³ shows great promise and amply justifies the development of the technique. Methods: The SBRT treatment comprised 8 non-coplanar 6 MV photon fields with a mean field size of 12 cm² at isocentre and a prescription dose of 12 Gy per fraction for a total of 48 Gy. The 3D detector was composed of a 10×10×10 cm³ EJ-260 water-equivalent plastic scintillator embedded inside a truncated cylindrical acrylic phantom of 10 cm radius. The scintillation light was recorded using a static R5 light-field camera, and the 3D dose was reconstructed at a 2 mm resolution in all 3 dimensions using an iterative backprojection algorithm. Results: The whole 3D dose distribution was recorded at a rate of one acquisition per second. The mean absolute dose difference between the detector and Pinnacle³ was 1.3% over the region with more than 10% of the maximum dose. 3D gamma tests performed over the same region yielded passing rates of 98.8% and 96.6% with criteria of 3%/1mm and 2%/1mm, respectively. Conclusion: Experimental results showed that our beam modeling and treatment planning system calculation were adequate for the safe administration of small field/high dose techniques such as SBRT. Moreover, because of the real-time capability of the detector, further validation of small-field rotational, dynamic or gated techniques can be monitored or verified by this system.
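
    The 3%/1mm and 2%/1mm figures above refer to a gamma comparison. Below is a naive, brute-force sketch of a global 3D gamma index on a uniform grid, with the 10% low-dose cut-off from the abstract; it is an illustration, not the detector software.

```python
import numpy as np

def gamma_pass_rate(measured, reference, voxel_mm=2.0, dd_pct=3.0, dta_mm=1.0,
                    cutoff=0.10):
    """Simplified global gamma index, brute-forced over a small neighbourhood."""
    dd = dd_pct / 100.0 * reference.max()        # absolute dose-difference criterion
    r = max(1, int(np.ceil(dta_mm / voxel_mm)))  # search radius in voxels
    shape = reference.shape
    gammas = []
    # Evaluate only voxels above the low-dose cut-off, as in the abstract.
    for i, j, k in np.argwhere(reference > cutoff * reference.max()):
        best = np.inf
        for di in range(-r, r + 1):
            for dj in range(-r, r + 1):
                for dk in range(-r, r + 1):
                    ii, jj, kk = i + di, j + dj, k + dk
                    if 0 <= ii < shape[0] and 0 <= jj < shape[1] and 0 <= kk < shape[2]:
                        dose_term = (measured[i, j, k] - reference[ii, jj, kk]) ** 2 / dd ** 2
                        dist_term = (di * di + dj * dj + dk * dk) * voxel_mm ** 2 / dta_mm ** 2
                        best = min(best, dose_term + dist_term)
        gammas.append(np.sqrt(best))
    return float(np.mean(np.asarray(gammas) <= 1.0))
```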

  3. SU-F-BRE-07: Experimental Validation of a Lung SBRT Technique Using a Novel, True Volumetric Plenoptic-Plastic-Scintillator Detector

    Energy Technology Data Exchange (ETDEWEB)

    Goulet, M; Rilling, M; Gingras, L; Beaulieu, L; Archambault, L [CHU de Quebec and Universite Laval, Quebec, QC (Canada); Beddar, S [MD Anderson Cancer Ctr., Houston, TX (United States)

    2014-06-15

    Purpose: Lung SBRT is being used by an increasing number of clinics, including our center, which recently treated its first patient. In order to validate this technique, the 3D dose distribution of the SBRT plan was measured using a previously developed 3D detector based on plenoptic camera and plastic scintillator technology. The excellent agreement between the detector measurement and the expected dose from the treatment planning system Pinnacle³ shows great promise and amply justifies the development of the technique. Methods: The SBRT treatment comprised 8 non-coplanar 6 MV photon fields with a mean field size of 12 cm² at isocentre and a prescription dose of 12 Gy per fraction for a total of 48 Gy. The 3D detector was composed of a 10×10×10 cm³ EJ-260 water-equivalent plastic scintillator embedded inside a truncated cylindrical acrylic phantom of 10 cm radius. The scintillation light was recorded using a static R5 light-field camera, and the 3D dose was reconstructed at a 2 mm resolution in all 3 dimensions using an iterative backprojection algorithm. Results: The whole 3D dose distribution was recorded at a rate of one acquisition per second. The mean absolute dose difference between the detector and Pinnacle³ was 1.3% over the region with more than 10% of the maximum dose. 3D gamma tests performed over the same region yielded passing rates of 98.8% and 96.6% with criteria of 3%/1mm and 2%/1mm, respectively. Conclusion: Experimental results showed that our beam modeling and treatment planning system calculation were adequate for the safe administration of small field/high dose techniques such as SBRT. Moreover, because of the real-time capability of the detector, further validation of small-field rotational, dynamic or gated techniques can be monitored or verified by this system.

  4. Validation of SWAT+ at field level and comparison with previous SWAT models in simulating hydrologic quantity

    Science.gov (United States)

    GAO, J.; White, M. J.; Bieger, K.; Yen, H.; Arnold, J. G.

    2017-12-01

    Over the past 20 years, the Soil and Water Assessment Tool (SWAT) has been adopted by many researchers to assess water quantity and quality in watersheds around the world. As the demand for model support, maintenance, and future development increases, the SWAT source code and data have undergone major modifications over the past few years. To make the model more flexible in terms of the interactions of spatial units and processes occurring in watersheds, a completely revised version of SWAT (SWAT+) was developed to improve SWAT's ability in water resource modelling and management. There are only a few applications of SWAT+ in large watersheds, however, and no study has validated the new model at the field level or assessed its performance there. To test the basic hydrologic function of SWAT+, it was implemented in five field cases across five states in the U.S., and the SWAT+ results were compared with those from the previous models at the same fields. Additionally, an automatic calibration tool was used to test which model can be calibrated well within a limited number of parameter adjustments. The goal of the study was to evaluate the performance of SWAT+ in simulating stream flow at the field level at different geographical locations. The results demonstrate that SWAT+ performed similarly to the previous SWAT model, but the flexibility offered by SWAT+ via the connection of different spatial objects can yield a spatially more accurate simulation of hydrological processes, especially for watersheds with artificial facilities. Autocalibration shows that SWAT+ reaches a satisfactory result much more easily than the previous SWAT. Although many capabilities have already been enhanced in SWAT+, inaccuracies in simulation remain; these will be reduced as scientific knowledge of hydrologic processes in specific watersheds advances. Currently, SWAT+ is in prerelease, and any errors are being addressed.
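
    The abstract does not name its goodness-of-fit metric; the Nash-Sutcliffe efficiency sketched below is the customary choice for stream flow comparisons of this kind and is shown only as an illustration, with toy flow values.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1 is a perfect fit; values <= 0 mean the model is no better than
    simply predicting the mean of the observations."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Toy daily flows (m^3/s), illustrative only.
obs = np.array([1.2, 1.5, 2.3, 5.1, 3.0, 2.2])
sim = np.array([1.0, 1.6, 2.6, 4.5, 3.3, 2.0])
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")
```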

  5. Validity of the Consensual Assessment Technique--Evidence with Three Groups of Judges and an Elementary School Student Sample

    Science.gov (United States)

    Long, Haiying

    2012-01-01

    As one of the most widely used creativity assessment tools, the Consensual Assessment Technique (CAT) has been praised as a valid tool to assess creativity. In Amabile's (1982) seminal work, the inter-rater reliability was defined as construct validity of the CAT. During the past three decades, researchers followed this definition and…

  6. Kidnapping Detection and Recognition in Previous Unknown Environment

    Directory of Open Access Journals (Sweden)

    Yang Tian

    2017-01-01

    An unnoticed event referred to as kidnapping makes the localization estimate incorrect. In a previously unknown environment, an incorrect localization result causes an incorrect mapping result in Simultaneous Localization and Mapping (SLAM). In this situation, the explored and unexplored areas become disconnected, which makes kidnapping recovery difficult. To provide sufficient information on kidnapping, a framework is proposed to judge whether kidnapping has occurred and to identify the type of kidnapping in filter-based SLAM. The framework, called double kidnapping detection and recognition (DKDR), performs two checks, before and after the "update" process, with different metrics in real time. To explain one of the principles of DKDR, we describe a property of filter-based SLAM: it corrects the mapping result of the environment using the current observations after the "update" process. Two classical filter-based SLAM algorithms, Extended Kalman Filter (EKF) SLAM and Particle Filter (PF) SLAM, are modified to show that DKDR can be applied simply and widely in existing filter-based SLAM algorithms. Furthermore, a technique to determine the adapted thresholds of the metrics in real time, without previous data, is presented. Both simulated and experimental results demonstrate the validity and accuracy of the proposed method.
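
    The abstract does not spell out the DKDR metrics. Below is a generic, assumption-laden sketch of the kind of pre-update consistency check a filter-based detector can use: the normalized innovation squared against a chi-square gate. It illustrates the idea only and is not the authors' method.

```python
import numpy as np

def innovation_check(z, z_pred, S, chi2_gate=9.21):
    """Pre-update check for an EKF: flag a possible kidnapping when the
    normalized innovation squared (NIS) exceeds a chi-square gate
    (9.21 is roughly the 99% point for a 2-DOF measurement)."""
    nu = z - z_pred                                # innovation
    nis = float(nu.T @ np.linalg.solve(S, nu))
    return nis > chi2_gate, nis

# Toy 2D measurement with innovation covariance S (illustrative numbers).
S = np.diag([0.04, 0.01])
suspect, nis = innovation_check(np.array([2.3, 0.9]), np.array([1.1, 0.4]), S)
print(f"NIS={nis:.1f}, kidnapping suspected: {suspect}")
```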

  7. Modelling, analysis and validation of microwave techniques for the characterisation of metallic nanoparticles

    Science.gov (United States)

    Sulaimalebbe, Aslam

    In the last decade, the study of nanoparticle (NP) systems has become a large and interesting research area due to their novel properties and functionalities, which differ from those of the bulk materials, and their potential applications in different fields. It is vital to understand the behaviour and properties of nanomaterials when aiming to implement nanotechnology, control their behaviour and design new material systems with superior performance. Physical characterisation of NPs falls into two main categories, property and structure analysis, where the properties of the NPs cannot be studied without knowledge of their size and structure. The direct measurement of the electrical properties of metal NPs presents a key challenge and necessitates the use of innovative experimental techniques. There have been numerous reports of two/four-point resistance measurements of NP films and also of the electrical conductivity of NP films using the interdigitated microarray (IDA) electrode. However, microwave techniques such as the open-ended coaxial probe (OCP) and the microwave dielectric resonator (DR) are much more accurate and effective for the electrical characterisation of metallic NPs than traditional techniques, because they are inexpensive, convenient, non-destructive, contactless, hazardless (i.e. at low power) and require no special sample preparation. This research is the first attempt to determine the microwave properties of Pt and Au NP films, which are appealing materials for nano-scale electronics, using the aforementioned microwave techniques. The ease of synthesis, relatively low cost, unique catalytic activities and control over size and shape were the main considerations in choosing Pt and Au NPs for the present study. The initial phase of this research was to implement and validate the aperture admittance model for the OCP measurement through experiments and 3D full-wave simulation using the commercially available Ansoft

  8. Study on rapid valid acidity evaluation of apple by fiber optic diffuse reflectance technique

    Science.gov (United States)

    Liu, Yande; Ying, Yibin; Fu, Xiaping; Jiang, Xuesong

    2004-03-01

    Some issues related to the nondestructive evaluation of valid acidity in intact apples by means of the Fourier transform near infrared (FTNIR, 800-2631 nm) method were addressed. A relationship was established between the diffuse reflectance spectra recorded with a bifurcated optic fiber and the valid acidity. The data were analyzed by multivariate calibration methods such as partial least squares (PLS) analysis and principal component regression (PCR). A total of 120 Fuji apples were tested and 80 of them were used to form a calibration data set. The influence of data preprocessing and of different spectral treatments was also investigated. Models based on smoothed spectra were slightly worse than models based on derivative spectra, and the best result was obtained when the segment length was 5 and the gap size was 10. Depending on data preprocessing and multivariate calibration technique, the best prediction model had a correlation coefficient of 0.871, a low RMSEP (0.0677), a low RMSEC (0.056) and a small difference between RMSEP and RMSEC by PLS analysis. The results point out the feasibility of FTNIR spectral analysis for predicting fruit valid acidity non-destructively. The ratio of the data standard deviation to the root mean square error of prediction (SDR) should be no less than 3 in calibration models, however, and the present results cannot meet this demand for actual application. Therefore, further study is required for better calibration and prediction.
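
    A minimal sketch of the calibration workflow described above, using scikit-learn's PLS regression. The spectra and acidity values here are synthetic stand-ins; only the 120/80 sample split mirrors the abstract.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
# Toy stand-in for FT-NIR diffuse reflectance spectra: 120 apples x 500 wavelengths.
X = rng.normal(size=(120, 500))
y = X[:, 50] * 0.02 + X[:, 300] * 0.01 + rng.normal(0, 0.005, 120)  # "valid acidity"

X_cal, y_cal = X[:80], y[:80]   # 80 samples form the calibration set, as above
X_val, y_val = X[80:], y[80:]

pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
rmsec = mean_squared_error(y_cal, pls.predict(X_cal).ravel()) ** 0.5
rmsep = mean_squared_error(y_val, pls.predict(X_val).ravel()) ** 0.5
sdr = y_val.std() / rmsep       # the SDR figure of merit used in the abstract
print(f"RMSEC={rmsec:.4f}  RMSEP={rmsep:.4f}  SDR={sdr:.2f}")
```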

  9. MCNP HPGe detector benchmark with previously validated Cyltran model.

    Science.gov (United States)

    Hau, I D; Russ, W R; Bronson, F

    2009-05-01

    An exact copy of the detector model generated for Cyltran was reproduced as an MCNP input file, and the detection efficiency was calculated with the same methodology used in previous experimental measurements and simulation of a 280 cm³ HPGe detector. Below 1000 keV the MCNP data agreed with the Cyltran results within 0.5%, while above this energy the difference between MCNP and Cyltran increased to about 6% at 4800 keV, depending on the electron cut-off energy.

  10. Electron sterilization validation techniques using the controlled depth of sterilization process

    International Nuclear Information System (INIS)

    Cleghorn, D.A.; Nablo, S.V.

    1990-01-01

    Many pharmaceutical products, especially parenteral drugs, cannot be sterilized with gamma rays or high-energy electrons due to the concomitant product degradation. In view of the well-controlled electron energy spectrum available in modern electron processors, it is practical to deliver sterilizing doses over depths considerably less than those defining the thickness of blister-pack constructions or pharmaceutical containers. Because bremsstrahlung and X-ray production are minimized at these low electron energies and in these low-Z materials, very high ratios of electron dose to penetrating X-ray dose are possible for the application of the technique. Thin-film dosimetric techniques have been developed utilizing radiochromic film in the 10-60 g/m² range for determining the surface dose distribution in occluded surface areas where direct electron illumination is not possible. Procedures for validation of the process, using dried spore inoculum on the product as well as in good geometry, are employed to determine the process lethality and its dependence on product surface geometry. Applications of the process to labile pharmaceuticals in glass and polystyrene syringes are reviewed. It has been applied to the sterilization of commercial sterile products since 1987, and the advantages and the natural limitations of the technique are discussed. (author)

  11. X-ray digital industrial radiography (DIR) for local liquid velocity (VLL) measurement in trickle bed reactors (TBRs): Validation of the technique

    Science.gov (United States)

    Mohd Salleh, Khairul Anuar; Rahman, Mohd Fitri Abdul; Lee, Hyoung Koo; Al Dahhan, Muthanna H.

    2014-06-01

    Local liquid velocity measurements in Trickle Bed Reactors (TBRs) are one of the essential components of their hydrodynamic studies. These measurements are used to effectively determine a reactor's operating condition. This study was conducted to validate a newly developed technique that combines Digital Industrial Radiography (DIR) with Particle Tracking Velocimetry (PTV) to measure the Local Liquid Velocity (VLL) inside TBRs. Three-millimeter-sized Expanded Polystyrene (EPS) beads were used as packing material. Three validation procedures were designed to test the newly developed technique. All procedures and statistical approaches provided strong evidence that the technique can be used to measure the VLL within TBRs.
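
    A hedged sketch of the PTV step: once a bead has been tracked across radiograph frames, the local velocity follows from displacement, pixel calibration and frame rate. All numbers below are illustrative, not from the study.

```python
import numpy as np

def local_liquid_velocity(track_px, fps, mm_per_px):
    """Convert a tracked bead trajectory (pixel coordinates per frame) into a
    mean local speed: per-frame displacement x calibration x frame rate."""
    track = np.asarray(track_px, dtype=float)
    steps_mm = np.linalg.norm(np.diff(track, axis=0), axis=1) * mm_per_px
    return steps_mm.mean() * fps  # mm/s

# Toy trajectory of one EPS bead over five radiograph frames.
trajectory = [(120, 40), (121, 52), (122, 65), (122, 77), (123, 90)]
print(f"V_LL ~ {local_liquid_velocity(trajectory, fps=30, mm_per_px=0.2):.1f} mm/s")
```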

  12. Validation of a new technique to detect Cryptosporidium spp. oocysts in bovine feces.

    Science.gov (United States)

    Inácio, Sandra Valéria; Gomes, Jancarlo Ferreira; Oliveira, Bruno César Miranda; Falcão, Alexandre Xavier; Suzuki, Celso Tetsuo Nagase; Dos Santos, Bianca Martins; de Aquino, Monally Conceição Costa; de Paula Ribeiro, Rafaela Silva; de Assunção, Danilla Mendes; Casemiro, Pamella Almeida Freire; Meireles, Marcelo Vasconcelos; Bresciani, Katia Denise Saraiva

    2016-11-01

    Cryptosporidiosis, initially considered a rare and opportunistic disease, arouses strong interest in the scientific community due to its important zoonotic potential. The parasitological diagnosis of its causative agent, the protozoan Cryptosporidium spp., requires specific concentration and permanent-staining techniques, which are laborious and costly and difficult to use in routine laboratory tests. In view of the above, we conducted the feasibility assessment, development, evaluation and intralaboratory validation of a new parasitological technique for the optical-microscopy analysis of Cryptosporidium spp. oocysts, called TF-Test Coccidia, using fecal samples from calves from the city of Araçatuba, São Paulo. To confirm the aforementioned parasite and prove the diagnostic efficiency of the new technique, we used two methodologies established in the scientific literature: parasite concentration by centrifugal sedimentation with negative staining by malachite green (CSN-Malachite) and Nested-PCR. We observed good effectiveness of the TF-Test Coccidia technique, statistically equivalent to CSN-Malachite. Thus, we verified the effectiveness of the TF-Test Coccidia parasitological technique for the detection of Cryptosporidium spp. oocysts and observed good concentration and morphology of the parasite, with a low amount of debris in the fecal smear. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. X-ray digital industrial radiography (DIR) for local liquid velocity (VLL) measurement in trickle bed reactors (TBRs): Validation of the technique

    International Nuclear Information System (INIS)

    Mohd Salleh, Khairul Anuar; Lee, Hyoung Koo; Rahman, Mohd Fitri Abdul; Al Dahhan, Muthanna H.

    2014-01-01

    Local liquid velocity measurements in Trickle Bed Reactors (TBRs) are one of the essential components of their hydrodynamic studies. These measurements are used to effectively determine a reactor's operating condition. This study was conducted to validate a newly developed technique that combines Digital Industrial Radiography (DIR) with Particle Tracking Velocimetry (PTV) to measure the Local Liquid Velocity (VLL) inside TBRs. Three-millimeter-sized Expanded Polystyrene (EPS) beads were used as packing material. Three validation procedures were designed to test the newly developed technique. All procedures and statistical approaches provided strong evidence that the technique can be used to measure the VLL within TBRs.

  14. Uranium Detection - Technique Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Colletti, Lisa Michelle [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Chemistry Division; Garduno, Katherine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Chemistry Division; Lujan, Elmer J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Chemistry Division; Mechler-Hickson, Alexandra Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Chemistry Division; Univ. of Wisconsin, Madison, WI (United States); May, Iain [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Chemistry Division; Reilly, Sean Douglas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Chemistry Division

    2016-04-14

    As a LANL activity for DOE/NNSA in support of SHINE Medical Technologies™ 'Accelerator Technology', we have been investigating the application of UV-vis spectroscopy for uranium analysis in solution. While the technique has been developed specifically for sulfate solutions, the proposed SHINE target solutions, it can be adapted to a range of different solution matrices. The FY15 work scope incorporated technical development to improve accuracy, specificity, linearity and range, precision and ruggedness, and comparative analysis. Significant progress was achieved throughout FY15 in addressing these technical challenges, as summarized in this report. In addition, comparative analysis of unknown samples using the Davies-Gray titration technique highlighted the importance of controlling temperature during analysis (impacting both technique accuracy and linearity/range). To fully understand the impact of temperature, additional experimentation and data analyses were performed during FY16. The results from this FY15/FY16 work were presented in a detailed presentation, LA-UR-16-21310, and an update of this presentation is included with this short report summarizing the key findings. The technique is based on analysis of the most intense U(VI) absorbance band in the visible region of the uranium spectrum in 1 M H2SO4, at λmax = 419.5 nm.
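
    The quantitation behind such a UV-vis measurement is the Beer-Lambert law, c = A/(ε·l), applied at the band reported above (λmax = 419.5 nm). The molar absorptivity below is a placeholder, not a value from the report.

```python
# Beer-Lambert inversion: concentration = absorbance / (epsilon * path length).
def uranium_conc_molar(absorbance, epsilon_M_cm=8.0, path_cm=1.0):
    # epsilon_M_cm is an assumed placeholder molar absorptivity (M^-1 cm^-1).
    return absorbance / (epsilon_M_cm * path_cm)

print(f"{uranium_conc_molar(0.42):.4f} M U(VI)")  # 0.42 AU -> 0.0525 M with these assumptions
```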

  15. List of new names and new combinations previously effectively, but not validly, published.

    Science.gov (United States)

    2008-09-01

    The purpose of this announcement is to effect the valid publication of the following effectively published new names and new combinations under the procedure described in the Bacteriological Code (1990 Revision). Authors and other individuals wishing to have new names and/or combinations included in future lists should send three copies of the pertinent reprint or photocopies thereof, or an electronic copy of the published paper, to the IJSEM Editorial Office for confirmation that all of the other requirements for valid publication have been met. It is also a requirement of IJSEM and the ICSP that authors of new species, new subspecies and new combinations provide evidence that types are deposited in two recognized culture collections in two different countries (i.e. documents certifying deposition and availability of type strains). It should be noted that the date of valid publication of these new names and combinations is the date of publication of this list, not the date of the original publication of the names and combinations. The authors of the new names and combinations are as given below, and these authors' names will be included in the author index of the present issue and in the volume author index. Inclusion of a name on these lists validates the publication of the name and thereby makes it available in bacteriological nomenclature. The inclusion of a name on this list is not to be construed as taxonomic acceptance of the taxon to which the name is applied. Indeed, some of these names may, in time, be shown to be synonyms, or the organisms may be transferred to another genus, thus necessitating the creation of a new combination.

  16. Examination of irradiated fuel elements using gamma scanning technique

    International Nuclear Information System (INIS)

    Ichim, O.; Mincu, M.; Man, I.; Stanica, M.

    2016-01-01

    The purpose of this paper is to validate the gamma scanning technique used to calculate the activity of gamma-emitting fission products in CANDU/TRIGA irradiated fuel elements. After a short presentation of the equipment used and its characteristics, the paper describes the calibration technique for the devices and how the computed tomography reconstruction is done. Following the previously mentioned steps, it is possible to obtain the axial and radial profiles and the computed tomography reconstruction for the calibration sources and for the irradiated fuel elements. The results are used to validate the gamma scanning technique as a non-destructive examination method. The gamma scanning technique will be used to: identify the fission products in irradiated CANDU/TRIGA fuel elements, construct the axial and radial distributions of fission products, obtain the cross-sectional distribution through computed tomography reconstruction, and determine the number of nuclei and the activity of the fission products in irradiated CANDU/TRIGA fuel elements. (authors)
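
    The nuclei-number and activity determination rests on the standard photopeak relations A = counts/(t·ε·Iγ) and N = A/λ; the sketch below uses illustrative Cs-137 numbers, not data from the examination.

```python
import numpy as np

def activity_bq(net_counts, live_time_s, det_efficiency, gamma_yield):
    """Activity from a fitted photopeak: A = counts / (t * efficiency * yield)."""
    return net_counts / (live_time_s * det_efficiency * gamma_yield)

def nuclei_number(activity, half_life_s):
    """N = A / lambda, with lambda = ln 2 / T_half."""
    return activity * half_life_s / np.log(2.0)

# Toy numbers for a Cs-137 peak (661.7 keV, gamma yield ~0.851; efficiency assumed).
A = activity_bq(1.2e5, live_time_s=600, det_efficiency=2.5e-3, gamma_yield=0.851)
print(f"A ~ {A:.3e} Bq, N ~ {nuclei_number(A, 30.08 * 3.156e7):.3e} nuclei")
```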

  17. MO-B-BRB-02: 3D Dosimetry in the Clinic: IMRT Technique Validation in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Ceberg, S. [Lund University (Sweden)

    2016-06-15

    Full three-dimensional (3D) dosimetry using volumetric chemical dosimeters probed by 3D imaging systems has long been a promising technique for the radiation therapy clinic, since it provides a unique methodology for dose measurements in the volume irradiated using complex conformal delivery techniques such as IMRT and VMAT. To date true 3D dosimetry is still not widely practiced in the community; it has been confined to centres of specialized expertise especially for quality assurance or commissioning roles where other dosimetry techniques are difficult to implement. The potential for improved clinical applicability has been advanced considerably in the last decade by the development of improved 3D dosimeters (e.g., radiochromic plastics, radiochromic gel dosimeters and normoxic polymer gel systems) and by improved readout protocols using optical computed tomography or magnetic resonance imaging. In this session, established users of some current 3D chemical dosimeters will briefly review the current status of 3D dosimetry, describe several dosimeters and their appropriate imaging for dose readout, present workflow procedures required for good dosimetry, and analyze some limitations for applications in select settings. We will review the application of 3D dosimetry to various clinical situations describing how 3D approaches can complement other dose delivery validation approaches already available in the clinic. The applications presented will be selected to inform attendees of the unique features provided by full 3D techniques. Learning Objectives: L. John Schreiner: Background and Motivation Understand recent developments enabling clinically practical 3D dosimetry, Appreciate 3D dosimetry workflow and dosimetry procedures, and Observe select examples from the clinic. Sofie Ceberg: Application to dynamic radiotherapy Observe full dosimetry under dynamic radiotherapy during respiratory motion, and Understand how the measurement of high resolution dose data in an

  18. MO-B-BRB-02: 3D Dosimetry in the Clinic: IMRT Technique Validation in Sweden

    International Nuclear Information System (INIS)

    Ceberg, S.

    2016-01-01

    Full three-dimensional (3D) dosimetry using volumetric chemical dosimeters probed by 3D imaging systems has long been a promising technique for the radiation therapy clinic, since it provides a unique methodology for dose measurements in the volume irradiated using complex conformal delivery techniques such as IMRT and VMAT. To date true 3D dosimetry is still not widely practiced in the community; it has been confined to centres of specialized expertise especially for quality assurance or commissioning roles where other dosimetry techniques are difficult to implement. The potential for improved clinical applicability has been advanced considerably in the last decade by the development of improved 3D dosimeters (e.g., radiochromic plastics, radiochromic gel dosimeters and normoxic polymer gel systems) and by improved readout protocols using optical computed tomography or magnetic resonance imaging. In this session, established users of some current 3D chemical dosimeters will briefly review the current status of 3D dosimetry, describe several dosimeters and their appropriate imaging for dose readout, present workflow procedures required for good dosimetry, and analyze some limitations for applications in select settings. We will review the application of 3D dosimetry to various clinical situations describing how 3D approaches can complement other dose delivery validation approaches already available in the clinic. The applications presented will be selected to inform attendees of the unique features provided by full 3D techniques. Learning Objectives: L. John Schreiner: Background and Motivation Understand recent developments enabling clinically practical 3D dosimetry, Appreciate 3D dosimetry workflow and dosimetry procedures, and Observe select examples from the clinic. Sofie Ceberg: Application to dynamic radiotherapy Observe full dosimetry under dynamic radiotherapy during respiratory motion, and Understand how the measurement of high resolution dose data in an

  19. Applications of the parity space technique to the validation of the water level measurement of pressurizer for steady state and transients

    International Nuclear Information System (INIS)

    Zwingelstein, G.; Bath, L.

    1983-01-01

    During the design of disturbance analysis and surveillance systems, safety parameter display systems, computerized operator support systems or advanced control rooms, sensor signal validation is commonly considered the first task to be performed. After an introduction to the anticipated benefits of signal validation techniques and a brief survey of methods in current practice, a signal validation technique based upon the parity space methodology is presented. The efficiency of the method, applied to the detection and identification of five types of failures, is illustrated with two examples in which three water level measurements of the pressurizer of a nuclear plant are redundant. In the first example, the use of the analytical redundancy technique is presented for the case where only two identical sensors are available; a detailed description of the dynamic model of the pressurizer is given. In the second example, the case of three identical water level sensors is considered. The performance of the software, developed on a DEC PDP 11 computer, is finally given.
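
    A minimal sketch of the parity space idea for three like sensors measuring one level: with measurement matrix H = [1, 1, 1]^T, the rows of V are chosen orthogonal to H, so the parity vector p = Vm is insensitive to the true level and responds only to sensor faults. Threshold and readings below are illustrative.

```python
import numpy as np

# Rows of V span the left null space of H = [1, 1, 1]^T (so V @ H = 0),
# normalized to unit length.
V = np.array([[1.0, -1.0,  0.0],
              [1.0,  1.0, -2.0]]) / np.sqrt(np.array([[2.0], [6.0]]))

def parity_check(m, threshold=0.5):
    """Flag a fault when the parity vector of the three readings is too long;
    the direction of p can then be used to isolate the faulty sensor."""
    p = V @ np.asarray(m, dtype=float)
    return np.linalg.norm(p) > threshold, p

# Sensor 3 has drifted upward in this toy set of level readings.
fault, p = parity_check([10.1, 10.0, 11.4])
print(f"fault detected: {fault}, parity vector: {p}")
```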

  20. Further validation of the HPCD-technique for the evaluation of PAH microbial availability in soil

    International Nuclear Information System (INIS)

    Doick, Kieron J.; Clasper, Paula J.; Urmann, Karina; Semple, Kirk T.

    2006-01-01

    There is currently considerable scientific interest in finding a chemical technique capable of predicting bioavailability; non-exhaustive extraction techniques (NEETs) offer such potential. Hydroxypropyl-beta-cyclodextrin (HPCD), a NEET, is further validated here through the investigation of concentration ranges, differing soil types, and the presence of co-contaminants. This is the first study to demonstrate the utility of the HPCD-extraction technique in predicting the microbial availability of phenanthrene across a wide concentration range, independent of soil-contaminant contact time (123 d). The efficacy of the HPCD-extraction technique for the estimation of PAH microbial availability in soil is demonstrated in the presence of co-contaminants aged together in the soil for the duration of the experiment. Desorption dynamics are compared in co-contaminant and single-PAH spiked soils to demonstrate the occurrence of competitive displacement. Overall, a single HPCD extraction proved accurate and reproducible for the estimation of PAH bioavailability in soil. - HPCD extractions can determine the microbial availability of PAHs in mixtures and over a range of concentrations

  1. X-ray digital industrial radiography (DIR) for local liquid velocity (V(LL)) measurement in trickle bed reactors (TBRs): validation of the technique.

    Science.gov (United States)

    Mohd Salleh, Khairul Anuar; Rahman, Mohd Fitri Abdul; Lee, Hyoung Koo; Al Dahhan, Muthanna H

    2014-06-01

    Local liquid velocity measurements in Trickle Bed Reactors (TBRs) are one of the essential components in its hydrodynamic studies. These measurements are used to effectively determine a reactor's operating condition. This study was conducted to validate a newly developed technique that combines Digital Industrial Radiography (DIR) with Particle Tracking Velocimetry (PTV) to measure the Local Liquid Velocity (V(LL)) inside TBRs. Three millimeter-sized Expanded Polystyrene (EPS) beads were used as packing material. Three validation procedures were designed to test the newly developed technique. All procedures and statistical approaches provided strong evidence that the technique can be used to measure the V(LL) within TBRs.

  2. Validation of the Berlese-funnel technique for thrips extraction.

    Science.gov (United States)

    Casteels, H; Witters, J; De Bondt, G; Desamblanx, J

    2009-01-01

    In order to obtain EN ISO/IEC 17025 accreditation for Thrips palmi, the Berlese-funnel technique, which is used for the isolation of quarantine insects from plant material, was validated. The following parameters were investigated: cleaning of the funnel, temperature during isolation, detection limit and duration of the isolation period. Thrips fuscipennis was collected from heavily infested rosehip and used as the target organism. Besides orchids, artificially contaminated maple leaves (Acer pseudoplatanus) were used for the validation. Results showed that thrips and other organisms can be present, alive or dead, in the funnel after removing the treated plants and can contaminate the next sample or isolate. Cleaning of the funnel with a vacuum cleaner and compressed-air apparatus is therefore necessary before running a new extraction. Contamination of the recipient from the environment is also possible; this can be avoided by closing the opening between the funnel and the recipient. To reach an optimal temperature for isolation of the thrips, a 60 Watt bulb is necessary. The results showed that the maximum temperature does not rise above 51 degrees C, with average temperatures between 35.74 degrees C and 39.38 degrees C. A 40 Watt bulb does not create enough heat to guarantee an efficient isolation of the thrips; the average temperature was 34.74 degrees C and the maximum temperature 36.80 degrees C. Based on the results, we conclude that an isolation time of 20 hours is necessary to obtain accurate data. Depending on the number of thrips in the artificially infested samples, 87 to 95% were isolated after 20 hours. The detection limit is one thrips, with a 95% probability of isolation after 20 hours.

  3. Are we really measuring what we say we're measuring? Using video techniques to supplement traditional construct validation procedures.

    Science.gov (United States)

    Podsakoff, Nathan P; Podsakoff, Philip M; Mackenzie, Scott B; Klinger, Ryan L

    2013-01-01

    Several researchers have persuasively argued that the most important evidence to consider when assessing construct validity is whether variations in the construct of interest cause corresponding variations in the measures of the focal construct. Unfortunately, the literature provides little practical guidance on how researchers can go about testing this. Therefore, the purpose of this article is to describe how researchers can use video techniques to test whether their scales measure what they purport to measure. First, we discuss how researchers can develop valid manipulations of the focal construct that they hope to measure. Next, we explain how to design a study to use this manipulation to test the validity of the scale. Finally, comparing and contrasting traditional and contemporary perspectives on validation, we discuss the advantages and limitations of video-based validation procedures. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  4. Validation study of the modified injection technique for internal mammary sentinel lymph node biopsy in breast cancer

    Directory of Open Access Journals (Sweden)

    Cong BB

    2015-09-01

    According to the hypothesis of the internal mammary sentinel lymph node (IM-SLN) lymphatic drainage pattern, a modified radiotracer injection technique (periareolar intraparenchymal, high volume, and under ultrasonographic guidance) was established. To verify the accuracy of the hypothesis and validate the modified injection technique, and to observe whether the lymphatic drainage of the whole breast parenchyma reaches the same IM-SLN, different tracers were injected into different locations of the breast. The validation study results showed that the correlation and the agreement between the radiotracer and the fluorescence tracer are significant (case-based, rs = 0.808, P < 0.001; Kappa = 0.79, P < 0.001). This proved that the lymphatic drainage from different locations of the breast (the primary tumor, the subareolar plexus) reached the same IM-SLNs, supporting the hypothesis of the IM-SLN lymphatic drainage pattern (i.e., the IM-SLN receives lymphatic drainage not only from the primary tumor area, but from the entire breast parenchyma). In other words, it validated the accuracy of our modified radiotracer injection technique. Keywords: breast cancer, internal mammary, sentinel lymph node biopsy, visualization rate

  5. X-ray digital industrial radiography (DIR) for local liquid velocity (VLL) measurement in trickle bed reactors (TBRs): Validation of the technique

    Energy Technology Data Exchange (ETDEWEB)

    Mohd Salleh, Khairul Anuar, E-mail: kmfgf@mst.edu; Lee, Hyoung Koo [Department of Mining and Nuclear Engineering, Missouri University of Science and Technology, Fulton Hall, 310 W. 14th St., Rolla, Missouri 65409 (United States); Rahman, Mohd Fitri Abdul [Department of Chemical and Biochemical Engineering, Missouri University of Science and Technology, 143 Schrenk Hall, 400 W. 11th St., Rolla, Missouri 65409 (United States); Al Dahhan, Muthanna H. [Department of Mining and Nuclear Engineering, Missouri University of Science and Technology, Fulton Hall, 310 W. 14th St., Rolla, Missouri 65409 (United States); Department of Chemical and Biochemical Engineering, Missouri University of Science and Technology, 143 Schrenk Hall, 400 W. 11th St., Rolla, Missouri 65409 (United States)

    2014-06-15

    Local liquid velocity measurements in Trickle Bed Reactors (TBRs) are one of the essential components of their hydrodynamic studies. These measurements are used to effectively determine a reactor's operating condition. This study was conducted to validate a newly developed technique that combines Digital Industrial Radiography (DIR) with Particle Tracking Velocimetry (PTV) to measure the Local Liquid Velocity (VLL) inside TBRs. Three-millimeter-sized Expanded Polystyrene (EPS) beads were used as packing material. Three validation procedures were designed to test the newly developed technique. All procedures and statistical approaches provided strong evidence that the technique can be used to measure the VLL within TBRs.

  6. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    Science.gov (United States)

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  7. Parental Attitudes Toward Advanced Behavior Guidance Techniques Used in Pediatric Dentistry.

    Science.gov (United States)

    Patel, Monica; McTigue, Dennis J; Thikkurissy, Sarat; Fields, Henry W

    2016-01-01

    To re-examine parental attitudes toward advanced behavior management techniques in pediatric dentistry and determine whether cost, urgency, and amount of treatment influence parental preferences. Parents viewed previously validated videotaped clinical vignettes of four advanced behavior guidance techniques: (1) passive immobilization; (2) active immobilization; (3) general anesthesia; and (4) oral sedation. The study was conducted in a children's hospital dental clinic and a suburban private pediatric dentistry office. Parents rated overall acceptance of the techniques, and acceptance under specified conditions using an anchored visual analogue scale. One hundred five parents completed the survey; 55 from the children's hospital and 50 from private practice. Oral sedation was rated as the most acceptable technique, followed by general anesthesia, active immobilization, and passive immobilization. As urgency, convenience, and previous experience increased, parental acceptance of the technique increased. As cost of treatment increased, parental acceptance decreased. Ratings between the children's hospital group and private practice group differed, as did the demographic variables of insurance, income, and race. The hierarchy of parental acceptance of advanced behavior guidance techniques is changing with increasing approval of pharmacological management and decreasing approval of physical management. The health care delivery system, urgency, convenience, previous experience, and cost all influence parental acceptance.

  8. Validating database constraints and updates using automated reasoning techniques

    NARCIS (Netherlands)

    Feenstra, Remco; Wieringa, Roelf J.

    1995-01-01

    In this paper, we propose a new approach to the validation of formal specifications of integrity constraints. The validation problem for formal specifications consists of assuring that the formal specification corresponds with what the domain specialist intends. This is distinct from the

  9. Reliability and validity of the ultrasound technique to measure the rectus femoris muscle diameter in older CAD-patients

    Directory of Open Access Journals (Sweden)

    Thomaes Tom

    2012-04-01

    Background: The increasing age of coronary artery disease (CAD) patients and the occurrence of sarcopenia in the elderly population, together with 'fear of moving' and hospitalization, often result in a substantial loss of skeletal muscle mass and muscle strength in these patients. Cardiac rehabilitation can improve exercise tolerance and muscle strength in CAD patients, but fewer data describe the accompanying morphological muscular changes, possibly because of more difficult access to imaging techniques. The aim of this study is therefore to assess and quantify the reliability and validity of an easily applicable method, the ultrasound (US) technique, for measuring the diameter of the rectus femoris muscle in comparison with the muscle dimensions measured on CT scans. Methods: 45 older CAD patients without a cardiac event during the last 9 months were included in this study. 25 patients were tested twice with ultrasound, with a two-day interval, to assess test-retest reliability, and 20 patients were tested twice (once with US and once with CT) on the same day to assess the validity of the US technique with CT as the gold standard. Isometric and isokinetic muscle testing was performed to test potential zero-order correlations between muscle diameter, muscle volume and muscle force. Results: An intraclass correlation coefficient (ICC) of 0.97 (95% CI: 0.92-0.99) was found for the test-retest reliability of US, and the ICC computed between US and CT was 0.92 (95% CI: 0.81-0.97). The absolute difference between the two techniques was 0.01 ± 0.12 cm (p = 0.66), resulting in a typical percentage error of 4.4%. Significant zero-order correlations were found between local muscle volume and muscle diameter assessed with CT (r = 0.67, p = 0.001) and assessed with US (r = 0.49). Conclusions: Ultrasound imaging can be used as a valid and reliable measurement tool to assess the rectus femoris muscle diameter in older CAD patients.

  10. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Following abdominal surgery, extensive adhesions often occur, and they can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veress needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system or genital organs, Cesarean section, or abdominal war injuries were the most common reasons for the previous laparotomy. During these operations, and during entry into the abdominal cavity, we did not experience any complications, while in 7 patients we converted to laparotomy following the diagnostic laparoscopy. In all patients, insertion of the Veress needle and trocar was performed in the umbilical region, i.e. the closed laparoscopy technique. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  11. Validity of photo-identification technique to analyze natural markings in Melanophryniscus montevidensis (Anura: Bufonidae

    Directory of Open Access Journals (Sweden)

    Ernesto Elgue

    2014-08-01

    Individual identification is useful for answering a variety of biological questions about animal life histories. Most of the techniques used to mark amphibians are invasive and can cause negative effects, compromising individual survivorship and biasing studies. Photo-identification consists of identifying specimens based on photographic records of unique color-design patterns. This technique has been used with success in several amphibian species. Melanophryniscus montevidensis is an endangered anuran species inhabiting the Uruguayan Atlantic coast. The general pattern of coloration is black with red and yellow blotches on the belly. In this study, we validated the photo-identification technique, assisted by software for individual recognition, in M. montevidensis using natural markings. Field trips were performed over 16 months, during which the ventral color pattern of specimens was photographed. The photos were edited and analyzed with the Wild-ID 1.0 software for photographic reconnaissance. An efficiency of 100% was obtained in visual recognition and 90% in the detection of recaptures using the software. Photo-identification using natural marks is an effective technique in this species, because the color pattern of the belly was highly variable among individuals and remained unchanged over the 16-month period. In this evaluation, the use of software for photo-identification was necessary for the treatment of large databases.

  12. Models of signal validation using artificial intelligence techniques applied to a nuclear reactor

    International Nuclear Information System (INIS)

    Oliveira, Mauro V.; Schirru, Roberto

    2000-01-01

    This work presents two models of signal validation in which the analytical redundancy of the monitored signals from a nuclear plant is provided by neural networks. In one model the analytical redundancy is provided by a single neural network, while in the other it is provided by several neural networks, each one working in a specific part of the entire operating region of the plant. Four clustering techniques were tested to separate the entire operating region into several specific regions. Additional information on the signals' reliability is supplied by a fuzzy inference system. The models were implemented in the C language and tested with signals acquired from the Angra I nuclear power plant, from startup to 100% of power. (author)
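
    A hedged sketch of the analytical-redundancy idea (not the authors' implementation): a small network learns to estimate one monitored signal from correlated ones, and the residual against the actual measurement flags a suspect sensor. All signals below are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
# Toy plant signals: estimate one monitored signal from four correlated ones.
others = rng.normal(size=(2000, 4))
target = others @ np.array([0.5, -0.2, 0.8, 0.1]) + rng.normal(0, 0.02, 2000)

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(others[:1500], target[:1500])

# The network's estimate stands in for a hardware-redundant channel; a large
# residual between measurement and estimate flags a suspect sensor.
residual = np.abs(target[1500:] - net.predict(others[1500:]))
print(f"mean residual {residual.mean():.3f}; flag readings exceeding ~3 sigma of training residuals")
```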

  13. Validation of Extraction Paper Chromatography as a Quality Control Technique for Analysis of Sr-90 in Y-90 Product

    International Nuclear Information System (INIS)

    Nipavan, Poramatikul; Jatupol, Sangsuriyan; Wiranee, Sriweing

    2009-07-01

    Yttrium-90 (Y-90) is a daughter product of strontium-90 (Sr-90). It is specified that there should be less than 2 microcuries of Sr-90 in Y-90 radiopharmaceuticals. Since both nuclides are beta emitters and there is always some contamination of Y-90 in a Sr-90 sample, validation of the analytical method is necessary. In this study, commercial Y-90 and Sr-85 (a gamma-emitting isotope of strontium) were used as daughter and mother nuclides, respectively. The extraction paper chromatography technique and an efficient method for its validation were investigated. Bis-(2-ethylhexyl) diphosphonate was dropped at the origin of the chromatography paper and air-dried prior to applying the sample drops. Validation of the separation was done by radio-chromatography scanning of the chromatography paper. The energy spectra were identified in the spectra mode of a Packard Cobra II automatic gamma counter, which can differentiate a pure gamma emitter, a pure beta emitter and a mixture of beta and gamma nuclides. Results showed that yttrium acetate remained fixed at the origin of the chromatography paper while strontium acetate moved to the solvent front when developed in saline. In conclusion, the extraction paper chromatography technique can effectively separate Sr-90 from Y-90

  14. Validation of a residue method to determine pesticide residues in cucumber by using nuclear techniques

    International Nuclear Information System (INIS)

    Baysoyu, D.; Tiryaki, O.; Secer, E.; Aydin, G.

    2009-01-01

    In this study, a multi-residue method using ethyl acetate for extraction and gel permeation chromatography for clean-up was validated for the determination of chlorpyrifos, malathion and dichlorvos in cucumber by gas chromatography. For this purpose, homogenized cucumber samples were fortified with the pesticides at the 0.02, 0.2, 0.8 and 1 mg/kg levels. The efficiency and repeatability of the method in the extraction and clean-up steps were assessed with 14C-carbaryl using the radioisotope tracer technique. 14C-carbaryl recoveries after the extraction and clean-up steps were between 92.63-111.73% with a repeatability of 4.85% (CV) and 74.83-102.22% with a repeatability of 7.19% (CV), respectively. The homogeneity of the analytical samples and the stability of the pesticides during homogenization were determined using the radiotracer technique and chromatographic methods, respectively.

  15. Validation and qualification of surface-applied fibre optic strain sensors using application-independent optical techniques

    International Nuclear Information System (INIS)

    Schukar, Vivien G; Kadoke, Daniel; Kusche, Nadine; Münzenberger, Sven; Gründer, Klaus-Peter; Habel, Wolfgang R

    2012-01-01

    Surface-applied fibre optic strain sensors were investigated using a unique validation facility equipped with application-independent optical reference systems. First, different adhesives for the sensor's application were analysed regarding their material properties. Measurements resulting from conventional measurement techniques, such as thermo-mechanical analysis and dynamic mechanical analysis, were compared with measurements resulting from digital image correlation, which has the advantage of being a non-contact technique. Second, fibre optic strain sensors were applied to test specimens with the selected adhesives. Their strain-transfer mechanism was analysed in comparison with conventional strain gauges. Relative movements between the applied sensor and the test specimen were visualized easily using optical reference methods, digital image correlation and electronic speckle pattern interferometry. Conventional strain gauges showed limited opportunities for an objective strain-transfer analysis because they are also affected by application conditions. (paper)

  16. Retrieval of nitrogen dioxide stratospheric profiles from ground-based zenith-sky UV-visible observations: validation of the technique through correlative comparisons

    Directory of Open Access Journals (Sweden)

    F. Hendrick

    2004-01-01

    A retrieval algorithm based on the Optimal Estimation Method (OEM) has been developed in order to provide vertical distributions of NO2 in the stratosphere from ground-based (GB) zenith-sky UV-visible observations. It has been applied to observational data sets from the NDSC (Network for Detection of Stratospheric Change) stations of Harestua (60° N, 10° E) and Andøya (69° N, 16° E) in Norway. The information content and retrieval errors have been analyzed following a formalism used for characterizing ozone profiles retrieved from solar infrared absorption spectra. In order to validate the technique, the retrieved NO2 vertical profiles and columns have been compared to correlative balloon and satellite observations. Such extensive validation of the profile and column retrievals was not reported in previously published work on profiling from GB UV-visible measurements. A good agreement - generally better than 25% - has been found with the SAOZ (Système d'Analyse par Observations Zénithales) and DOAS (Differential Optical Absorption Spectroscopy) balloons. A similar agreement has been reached with correlative satellite data from the HALogen Occultation Experiment (HALOE) and Polar Ozone and Aerosol Measurement (POAM III) instruments above 25 km altitude. Below 25 km, a systematic underestimation - by up to 40% in some cases - of both HALOE and POAM III profiles by our GB profile retrievals has been observed, more likely pointing to a limitation of both satellite instruments at these altitudes. We have concluded that our study strengthens our confidence in the reliability of the retrieval of vertical distribution information from GB UV-visible observations and offers new perspectives in the use of GB UV-visible network data for validation purposes.

  17. Validation of a dose-point kernel convolution technique for internal dosimetry

    International Nuclear Information System (INIS)

    Giap, H.B.; Macey, D.J.; Bayouth, J.E.; Boyer, A.L.

    1995-01-01

    The objective of this study was to validate a dose-point kernel convolution technique that provides a three-dimensional (3D) distribution of absorbed dose from a 3D distribution of the radionuclide 131I. A dose-point kernel for the penetrating radiations was calculated by a Monte Carlo simulation and cast in a 3D rectangular matrix. This matrix was convolved with the 3D activity map furnished by quantitative single-photon-emission computed tomography (SPECT) to provide a 3D distribution of absorbed dose. The convolution calculation was performed using a 3D fast Fourier transform (FFT) technique, which takes less than 40 s for a 128 x 128 x 16 matrix on an Intel 486 DX2 (66 MHz) personal computer. The calculated photon absorbed dose was compared with values measured by thermoluminescent dosimeters (TLDs) inserted along the diameter of a 22 cm diameter annular source of 131I. The mean and standard deviation of the percentage difference between the measurements and the calculations were equal to -1% and 3.6%, respectively. This convolution method was also used to calculate the 3D dose distribution in an Alderson abdominal phantom containing a liver, a spleen, and a spherical tumour volume loaded with various concentrations of 131I. By averaging the dose calculated throughout the liver, spleen, and tumour, the dose-point kernel approach was compared with values derived using the MIRD formalism, and found to agree to better than 15%. (author)
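
    The computational core of the method is a linear convolution of the cumulated-activity map with the dose-point kernel, evaluated in the Fourier domain. The sketch below illustrates that step with NumPy/SciPy; the smoothly decaying kernel and the voxel values are stand-ins for illustration, not a physical 131I kernel.

```python
import numpy as np
from scipy.signal import fftconvolve

def dose_from_activity(cumulated_activity, dose_kernel):
    """3D absorbed-dose map: cumulated activity (e.g. Bq*s per voxel)
    convolved with a dose-point kernel sampled on the same voxel grid
    and centred in its array. mode='same' keeps the activity grid."""
    return fftconvolve(cumulated_activity, dose_kernel, mode='same')

# Toy usage: a single hot voxel reproduces the kernel shape around it.
activity = np.zeros((128, 128, 16))
activity[64, 64, 8] = 1.0e6                   # illustrative Bq*s in one voxel
r = np.indices((9, 9, 9)) - 4                 # voxel offsets from kernel centre
kernel = 1.0 / (1.0 + (r ** 2).sum(axis=0))   # stand-in kernel, not physical
dose = dose_from_activity(activity, kernel)
```

    With a Monte Carlo-derived kernel in place of the stand-in, the same call applies directly to SPECT grids of the 128 x 128 x 16 size mentioned above.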

  18. Validation of an extraction paper chromatography (EPC) technique for estimation of trace levels of 90Sr in 90Y solutions obtained from 90Sr/90Y generator systems

    International Nuclear Information System (INIS)

    Usha Pandey; Yogendra Kumar; Ashutosh Dash

    2014-01-01

    While the extraction paper chromatography (EPC) technique constitutes a novel paradigm for the determination of a few becquerels of 90Sr in MBq quantities of 90Y obtained from a 90Sr/90Y generator, validation of the technique is essential to ensure its usefulness as a real-time analytical tool. With a view to exploring the relevance and applicability of the EPC technique as a real-time quality control (QC) technique for the routine estimation of the 90Sr content in generator-produced 90Y, a systematic validation study was carried out diligently, not only to establish its worthiness but also to broaden its horizon. The ability of the EPC technique to separate trace amounts of Sr2+ in the presence of large amounts of Y3+ was verified. The specificity of the technique for Y3+ was demonstrated with 90Y obtained by neutron irradiation. The method was validated under real experimental conditions and compared with the QC method described in the US Pharmacopeia for the detection of 90Sr levels in 90Y radiopharmaceuticals. (author)

  19. Arterial stiffness estimation in healthy subjects: a validation of oscillometric (Arteriograph) and tonometric (SphygmoCor) techniques.

    Science.gov (United States)

    Ring, Margareta; Eriksson, Maria Jolanta; Zierath, Juleen Rae; Caidahl, Kenneth

    2014-11-01

    Arterial stiffness is an important cardiovascular risk marker, which can be measured noninvasively with different techniques. To validate such techniques in healthy subjects, we compared the recently introduced oscillometric Arteriograph (AG) technique with the tonometric SphygmoCor (SC) method and their associations with carotid ultrasound measures and traditional risk indicators. Sixty-three healthy subjects aged 20-69 (mean 48 ± 15) years were included. We measured aortic pulse wave velocity (PWVao) and augmentation index (AIx) by AG and SC, and with SC also the PWVao standardized to 80% of the direct distance between the carotid and femoral sites (St-PWVaoSC). The carotid strain, stiffness index and intima-media thickness (cIMTmean) were evaluated by ultrasound. PWVaoAG (8.00 ± 2.16 m s⁻¹) was higher than PWVaoSC. Arterial stiffness indices by AG and SC correlate with vascular risk markers in healthy subjects. AIxao results by AG and SC are closely interrelated, but higher values are obtained by AG. In the lower range, PWVao values by AG and SC are similar, but they differ for higher values. Our results imply the necessity of applying one and the same technique for repeated studies.

  20. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    Science.gov (United States)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for Internet of Things (IoT) based applications has inadvertently forced the move towards integrated circuits of higher complexity supporting SoC designs. Such an increase in complexity poses correspondingly complicated validation challenges, and has led researchers to come up with various methodologies to overcome the problem, notably dynamic verification, formal verification and hybrid techniques. In addition, it is very important to discover bugs in the infancy of the verification process of an SoC in order to reduce time consumed and achieve a fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as an effort towards a fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, to avert bottlenecks in the validation platform.

  1. The validation of the Z-Scan technique for the determination of plasma glucose

    Science.gov (United States)

    Alves, Sarah I.; Silva, Elaine A. O.; Costa, Simone S.; Sonego, Denise R. N.; Hallack, Maira L.; Coppini, Ornela L.; Rowies, Fernanda; Azzalis, Ligia A.; Junqueira, Virginia B. C.; Pereira, Edimar C.; Rocha, Katya C.; Fonseca, Fernando L. A.

    2013-11-01

    Glucose is the main energy source for the human body. The concentration of blood glucose is regulated by several hormones, including the two antagonists insulin and glucagon. The quantification of glucose in the blood is used for diagnosing metabolic disorders of carbohydrates, such as diabetes, idiopathic hypoglycemia and pancreatic diseases. Currently, the methodology used for this determination is the enzymatic colorimetric method with spectrophotometric detection. This study aimed to validate the use of measurements of the nonlinear optical properties of plasma glucose via the Z-Scan technique. For this we used calibrator samples that simulate commercial samples from patients (ELITech©). Besides the calibrators, serum with glucose levels within the acceptable reference values (normal control serum - Brazilian Society of Clinical Pathology and Laboratory Medicine) and with overestimated levels (pathological control serum - Brazilian Society of Clinical Pathology and Laboratory Medicine) was used with the proposed methodology. Calibrator dilutions were performed and measured by the Z-Scan technique for the preparation of the calibration curve. In conclusion, the Z-Scan method can be used to determine glucose levels in biological samples, alongside the enzymatic colorimetric reaction, and the same quality control parameters used in clinical biochemistry can be applied to it.
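
    In practice the validation step described above reduces to fitting a calibration curve of the Z-Scan signal against known calibrator concentrations and inverting it for unknown samples. The sketch below illustrates that with NumPy; the concentrations, signal values, and the assumption of a linear working range are all hypothetical.

```python
import numpy as np

# Hypothetical calibrator data: glucose concentration (mg/dL) of each
# calibrator dilution vs. the normalized peak-to-valley Z-scan
# transmittance difference measured for it (illustrative values only).
conc = np.array([25.0, 50.0, 100.0, 200.0, 400.0])
delta_Tpv = np.array([0.011, 0.022, 0.041, 0.083, 0.162])

# Calibration curve: assume the nonlinear signal is linear in
# concentration over the working range.
slope, intercept = np.polyfit(conc, delta_Tpv, 1)

def glucose_from_zscan(signal):
    """Invert the calibration line to estimate glucose concentration."""
    return (signal - intercept) / slope

print(glucose_from_zscan(0.05))  # roughly 120 mg/dL for these toy numbers
```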

  2. [A brief history of resuscitation - the influence of previous experience on modern techniques and methods].

    Science.gov (United States)

    Kucmin, Tomasz; Płowaś-Goral, Małgorzata; Nogalski, Adam

    2015-02-01

    Cardiopulmonary resuscitation (CPR) is a relatively novel branch of medical science; however, the first descriptions of mouth-to-mouth ventilation are to be found in the Bible, and the literature is full of descriptions of different resuscitation methods - from flagellation and ventilation with bellows, through hanging victims upside down and compressing the chest in order to stimulate ventilation, to rectal fumigation with tobacco smoke. The modern history of CPR starts with Kouwenhoven et al., who in 1960 published a paper regarding heart massage through chest compressions. Shortly after that, in 1961, Peter Safar presented a paradigm promoting opening the airway, performing rescue breaths and chest compressions. The first CPR guidelines were published in 1966. Since that time the guidelines have been modified and improved numerous times by the two leading world expert organizations, the ERC (European Resuscitation Council) and the AHA (American Heart Association), and published in a new version every 5 years; at the time of writing, the 2010 guidelines applied. In this paper the authors attempt to present the history of the development of resuscitation techniques and methods and to assess the influence of previous lifesaving methods on today's technologies, equipment and guidelines, which make it possible to help those women and men whose lives are in danger due to sudden cardiac arrest. © 2015 MEDPRESS.

  3. Convergent validity test, construct validity test and external validity test of the David Liberman algorithm

    Directory of Open Access Journals (Sweden)

    David Maldavsky

    2013-08-01

    Full Text Available The author first presents a complement to a previous test of convergent validity, then a construct validity test, and finally an external validity test of the David Liberman algorithm (DLA). The first part of the paper focuses on a complementary aspect, the differential sensitivity of the DLA (1) in an external comparison (to other methods), and (2) in an internal comparison (between two ways of using the same method, the DLA). The construct validity test presents the concepts underlying the DLA, their operationalization, and some corrections emerging from several empirical studies we carried out. The external validity test examines the possibility of using the investigation of a single case and its relation to the investigation of a more extended sample.

  4. Validating a UAV artificial intelligence control system using an autonomous test case generator

    Science.gov (United States)

    Straub, Jeremy; Huber, Justin

    2013-05-01

    The validation of safety-critical applications, such as autonomous UAV operations in an environment which may include human actors, is an ill-posed problem. To build confidence in the autonomous control technology, numerous scenarios must be considered. This paper expands upon previous work, related to autonomous testing of robotic control algorithms in a two-dimensional plane, to evaluate the suitability of similar techniques for validating artificial intelligence control in three dimensions, where a minimum level of airspeed must be maintained. The results of human-conducted testing are compared to this automated testing in terms of error detection, speed and testing cost.
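
    The general pattern of such an autonomous test case generator can be sketched in a few lines: draw random scenarios from seeded distributions, fly each one with the controller under test, and check the safety invariant at every step. The sketch below is only a schematic of that loop; the Scenario fields, parameter ranges, and the controller interface are hypothetical stand-ins, not the authors' system.

```python
import random
from dataclasses import dataclass

MIN_AIRSPEED = 12.0  # m/s -- the invariant the control system must hold

@dataclass
class Scenario:
    wind_mps: float
    start_alt_m: float
    waypoint: tuple  # (x, y, z) in metres

def generate_scenario(rng):
    """Draw one random 3-D test scenario (parameter ranges illustrative)."""
    return Scenario(
        wind_mps=rng.uniform(0.0, 15.0),
        start_alt_m=rng.uniform(50.0, 400.0),
        waypoint=(rng.uniform(-500.0, 500.0),
                  rng.uniform(-500.0, 500.0),
                  rng.uniform(50.0, 400.0)),
    )

def run_test(controller, scenario, steps=1000):
    """Fly one scenario with the AI controller under test (a stand-in
    object exposing reset/step) and flag airspeed-invariant violations."""
    state = controller.reset(scenario)
    for _ in range(steps):
        state = controller.step(state, scenario)
        if state.airspeed < MIN_AIRSPEED:
            return False, state  # error detected: would-be stall
    return True, state

# Seeded generation makes every detected failure reproducible.
rng = random.Random(42)
scenarios = [generate_scenario(rng) for _ in range(1000)]
```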

  5. Attributes of low involvement products: A comparison of five elicitation techniques and a test of their nomological validity

    DEFF Research Database (Denmark)

    Bech-Larsen, Tino; Nielsen, Niels Asger; Grunert, Klaus G.

    attributes, which are important to consumers, is the critical first step in the majority of consumer behavior studies. 2. A number of techniques, ranging from the complicated elicitation of idiosyncratic attributes, to simpler techniques, as picking from a pre-specified list of attributes, has been developed...... negatively to the abstraction level and the number of attributes involved in the choice task. 4. The purpose of the study presented in this paper is to: (a) compare different elicitation techniques on a number of different criteria, such as: importance to consumers, ability to discriminate between brands......, predictive ability, time use, and number of attributes elicited; and to (b) test the nomological validity of the basic assumptions regarding attributes and consumer choices for a low involvement product (veg oil). 5. The study presented is part of the project Rape seed oil for human consumption. Although...

  6. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    Directory of Open Access Journals (Sweden)

    Muhammad Nurul Zhafirah

    2017-01-01

    Full Text Available Increased demand for Internet of Things (IoT) based applications has inadvertently forced the move towards integrated circuits of higher complexity supporting SoC designs. Such an increase in complexity poses correspondingly complicated validation challenges, and has led researchers to come up with various methodologies to overcome the problem, notably dynamic verification, formal verification and hybrid techniques. In addition, it is very important to discover bugs in the infancy of the verification process of an SoC in order to reduce time consumed and achieve a fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as an effort towards a fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, to avert bottlenecks in the validation platform.

  7. Experimental evaluation of a quasi-modal parameter based rotor foundation identification technique

    Science.gov (United States)

    Yu, Minli; Liu, Jike; Feng, Ningsheng; Hahn, Eric J.

    2017-12-01

    Correct modelling of the foundation of rotating machinery is an invaluable asset in model-based rotor dynamic study. One attractive approach for this purpose is to identify the relevant modal parameters of an equivalent foundation using the motion measurements of the rotor and foundation at the bearing supports. Previous research showed that a complex quasi-modal parameter based system identification technique could be feasible for this purpose; however, the technique was only validated by identifying simple structures under harmonic excitation. In this paper, this identification technique is further extended and evaluated by identifying the foundation of a numerical rotor-bearing-foundation system and of an experimental rotor rig, respectively. In the identification of a rotor foundation with multiple bearing supports, all application points of the excitation forces transmitted through the bearings need to be included; however, assumed vibration modes far outside the rotor operating speed range cannot be, or need not be, identified. The extended identification technique allows one to identify correctly an equivalent foundation with fewer modes than the assumed number of degrees of freedom, essentially by generalising the technique to handle rectangular complex modal matrices. The extended technique is robust in both numerical and experimental validation and is therefore likely to be applicable in the field.

  8. Natural resource valuation: A primer on concepts and techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ulibarri, C.A.; Wellman, K.F.

    1997-07-01

    Natural resource valuation has always had a fundamental role in the practice of cost-benefit analysis of health, safety, and environmental issues. The authors provide an objective overview of resource valuation techniques and describe their potential role in environmental restoration/waste management (ER/WM) activities at federal facilities. This handbook considers five general classes of valuation techniques: (1) market-based techniques, which rely on historical information on market prices and transactions to determine resource values; (2) nonmarket techniques that rely on indirect estimates of resource values; (3) nonmarket techniques that are based on direct estimates of resource values; (4) cross-cutting valuation techniques, which combine elements of one or more of these methods; and (5) ecological valuation techniques used in the emerging field of ecological economics. The various valuation techniques under consideration are described by highlighting their applicability in environmental management and regulation. The handbook also addresses key unresolved issues in the application of valuation techniques generally, including discounting future values, incorporating environmental equity concerns, and concerns over the uncertainties in the measurement of natural resource values and environmental risk.

  9. [Estimating child mortality using the previous child technique, with data from health centers and household surveys: methodological aspects].

    Science.gov (United States)

    Aguirre, A; Hill, A G

    1988-01-01

    Two trials of the previous child or preceding birth technique in Bamako, Mali, and Lima, Peru, gave very promising results for the measurement of infant and early child mortality using data on the survivorship of the 2 most recent births. In the Peruvian study, another technique was tested in which each woman was asked about her last 3 births. The preceding birth technique described by Brass and Macrae has rapidly been adopted as a simple means of estimating recent trends in early childhood mortality. The questions formulated and the analysis of results are direct when the mothers are visited at the time of birth or soon after. Several technical aspects of the method believed to introduce unforeseen biases have now been studied and found to be relatively unimportant. But the problems arising when the data come from a nonrepresentative fraction of the total fertile-aged population have not been resolved. The analysis based on data from 5 maternity centers, including 1 hospital, in Bamako, Mali, indicated some practical problems, and the information obtained showed the kinds of subtle biases that can result from the effects of selection. The study in Lima tested 2 abbreviated methods for obtaining recent early childhood mortality estimates in countries with deficient vital registration. The basic idea was that a few simple questions added to household surveys on immunization or diarrheal disease control, for example, could produce improved child mortality estimates. The mortality estimates in Peru were based on 2 distinct sources of information in the questionnaire. All women were asked their total number of live-born children and the number still alive at the time of the interview. The proportion of deaths was converted into a measure of child survival using a life table. Then each woman was asked for a brief history of the 3 most recent live births. Dates of birth and death were noted in month and year of occurrence. The interviews took only slightly longer than the basic survey

  10. Investigation of previously derived Hyades, Coma, and M67 reddenings

    International Nuclear Information System (INIS)

    Taylor, B.J.

    1980-01-01

    New Hyades polarimetry and field star photometry have been obtained to check the Hyades reddening, which was found to be nonzero in a previous paper. The new Hyades polarimetry implies essentially zero reddening; this is also true of polarimetry published by Behr (which was incorrectly interpreted in the previous paper). Four photometric techniques which are presumed to be insensitive to blanketing are used to compare the Hyades to nearby field stars; these four techniques also yield essentially zero reddening. When all of these results are combined with others which the author has previously published and a simultaneous solution for the Hyades, Coma, and M67 reddenings is made, the results are E(B-V) = 3 ± 2 (σ) mmag, -1 ± 3 (σ) mmag, and 46 ± 6 (σ) mmag, respectively. No support for a nonzero Hyades reddening is offered by the new results. When the newly obtained reddenings for the Hyades, Coma, and M67 are compared with results from techniques given by Crawford and by users of the David Dunlap Observatory photometric system, no differences between the new and other reddenings are found which are larger than about 2 sigma. The author had previously found that the M67 main-sequence stars have about the same blanketing as that of Coma and less blanketing than the Hyades; this conclusion is essentially unchanged by the revised reddenings.

  11. Development and validation of the European Cluster Assimilation Techniques run libraries

    Science.gov (United States)

    Facskó, G.; Gordeev, E.; Palmroth, M.; Honkonen, I.; Janhunen, P.; Sergeev, V.; Kauristie, K.; Milan, S.

    2012-04-01

    The European Commission funded the European Cluster Assimilation Techniques (ECLAT) project as a collaboration of five leading European universities and research institutes. A main contribution of the Finnish Meteorological Institute (FMI) is to provide a wide range of global MHD runs with the Grand Unified Magnetosphere Ionosphere Coupling simulation (GUMICS). The runs are divided into two categories: synthetic runs investigating the range of solar wind drivers that can influence magnetospheric dynamics, and dynamic runs using measured solar wind data as input. Here we consider the first set of runs, with synthetic solar wind input. The solar wind density, velocity and interplanetary magnetic field had different magnitudes and orientations; furthermore, two F10.7 flux values were selected, for solar radiation minimum and maximum. The solar wind parameter values were held constant such that a constant, stable solution was achieved. All configurations were run several times with three different tilt angles (-15°, 0°, +15°) in the GSE X-Z plane. The results of the 192 simulations, the so-called "synthetic run library", were visualized and uploaded to the homepage of the FMI after validation. Here we present details of these runs.
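
    The structure of such a run library is easy to make concrete: it is the Cartesian product of the driver parameters. The sketch below reproduces the 192-run count with an assumed 2 x 2 x 8 grid of solar wind states; the actual ECLAT parameter values are not given in the abstract, so every number here is illustrative.

```python
from itertools import product

# Illustrative parameter grid reproducing the 192-run count
# (2 x 2 x 8 solar wind states x 2 F10.7 values x 3 tilt angles);
# the actual ECLAT values are assumptions for this sketch.
densities = [3.0, 10.0]                  # cm^-3
velocities = [400.0, 600.0]              # km/s
imf_clock_angles = range(0, 360, 45)     # deg, 8 IMF orientations
f107 = [70.0, 200.0]                     # solar minimum / maximum
tilts = [-15.0, 0.0, +15.0]              # deg, GSE X-Z plane

run_library = [
    dict(n=n, v=v, imf_deg=a, f107=f, tilt_deg=t)
    for n, v, a, f, t in product(densities, velocities,
                                 imf_clock_angles, f107, tilts)
]
assert len(run_library) == 192
```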

  12. Aircraft applications of fault detection and isolation techniques

    Science.gov (United States)

    Marcos Esteban, Andres

    In this thesis the problems of fault detection & isolation and fault-tolerant systems are studied from the perspective of LTI frequency-domain, model-based techniques. Emphasis is placed on the applicability of these LTI techniques to nonlinear models, especially aerospace systems. Two applications of H-infinity LTI fault diagnosis are given using an open-loop (no controller) design approach: one for the longitudinal motion of a Boeing 747-100/200 aircraft, the other for a turbofan jet engine. An algorithm formalizing a robust identification approach based on model-validation ideas is also given and applied to the same jet engine. A general linear fractional transformation formulation is given in terms of the Youla and dual Youla parameterizations for the integrated (control and diagnosis filter) approach. This formulation provides better insight into the trade-off between the control and the diagnosis objectives. It also provides the basic groundwork towards the development of nested schemes for the integrated approach. These nested structures allow iterative improvements of the control/filter Youla parameters based on successive identification of the system uncertainty (as given by the dual Youla parameter). The thesis concludes with an application of H-infinity LTI techniques to the integrated design for the longitudinal motion of the same Boeing 747-100/200 model.

  13. Experimental validation of waveform relaxation technique for power ...

    Indian Academy of Sciences (India)

    Two systems are considered: an HVDC controller tested with a detailed model of the converters, and a TCSC-based damping controller tested with a low-frequency model of a power system. The results are validated against those obtained using simulated models of the controllers. We also present results of an experiment in ...

  14. Maternal perception of fever in children by tactile technique how valid it is

    International Nuclear Information System (INIS)

    Jalil, J.; Bashir, F.

    2014-01-01

    To determine the validity of the tactile technique as a tool for fever assessment in children by mothers. Study Design: A cohort study. Place and Duration of Study: The study was conducted at the department of Paediatrics, Combined Military Hospital, Bahawalpur, Pakistan, from September 2007 to September 2009. Patients and Methods: A convenience sampling technique was employed. Three hundred and ninety-three children between the ages of 6 months and 5 years were brought to hospital by mothers with a history of prolonged fever (7 days or more) perceived by the tactile technique. Children were not necessarily febrile at the time of enrollment. Six-hourly temperature recording was done. Moreover, whenever mothers felt that their child was febrile, using the tactile method of their choice, axillary thermometry was done irrespective of the number of recordings. Standard mercury thermometry by the axillary technique (without adding a degree to the measured value) was chosen. A reading of more than 99.5° Fahrenheit (37.5° centigrade) was labeled as fever. Cases that remained fever-free for five days were labeled afebrile and discharged. Mothers were advised to watch for fever for one week at home and to report back immediately if they felt that their child had fever, confirmed by a single tactile measurement. Those who reported back were readmitted and subjected to the same method of monitoring and recording as was applied on first admission. Data were analyzed using SPSS version 17. Descriptive statistics were applied to calculate the frequencies, means and standard deviations. Results: Among the 392 children, 58.4% were male and 41.4% were female. The mean age was 24.4 ± 14.39 months. The majority (70.2%) had a history of fever of 5 to 24 days. In only 184 (46.93%) patients was fever confirmed. In 208 (53.08%) patients no fever was recorded and they were discharged. Twenty-one patients reported back with fever. However, fever was confirmed in only 11 patients. In summary, a total of 195 (49

  15. Validation of the L.A.L. technique (gel clot) in radiodiagnosis and radioisotopes

    International Nuclear Information System (INIS)

    Morote, M.; Otero, M.; Chavez, G.

    1999-01-01

    Based on the validation of the LAL technique (gel clot), real endotoxin limits (EL), expressed in endotoxin units per milliliter (EU/mL), as well as maximum dilution volumes (MDV), were determined for seven radiodiagnosis agents (RDA): medronate, pentetate, sulfur colloid, macroaggregated albumin, succimer, disofenin, and pyrophosphates, which showed MDV values of 1:80; 1:16; 1:128; 1:64; 1:4; 1:128; 1:64 and EL of 20 EU/mL; 4 EU/mL; 32 EU/mL; 16 EU/mL; 1 EU/mL and 16 EU/mL, respectively. The radioisotope Tc-99m presents an MDV of 1:1 and an EL below 0.25 EU/mL. Clear interference from the acidity of the sulfur colloid (pH below 2) with the Pyrotell reaction was also found, even at the sixth dilution of the product. The MDV results also showed that the dilutions are as much as 100 times smaller than the theoretical values.

  16. Validation of an Improved Computer-Assisted Technique for Mining Free-Text Electronic Medical Records.

    Science.gov (United States)

    Duz, Marco; Marshall, John F; Parkin, Tim

    2017-06-29

    The use of electronic medical records (EMRs) offers opportunity for clinical epidemiological research. With large EMR databases, automated analysis processes are necessary but require thorough validation before they can be routinely used. The aim of this study was to validate a computer-assisted technique using commercially available content analysis software (SimStat-WordStat v.6 (SS/WS), Provalis Research) for mining free-text EMRs. The dataset used for the validation process included life-long EMRs from 335 patients (17,563 rows of data), selected at random from a larger dataset (141,543 patients, ~2.6 million rows of data) and obtained from 10 equine veterinary practices in the United Kingdom. The ability of the computer-assisted technique to detect rows of data (cases) of colic, renal failure, right dorsal colitis, and non-steroidal anti-inflammatory drug (NSAID) use in the population was compared with manual classification. The first step of the computer-assisted analysis process was the definition of inclusion dictionaries to identify cases, including terms identifying a condition of interest. Words in inclusion dictionaries were selected from the list of all words in the dataset obtained in SS/WS. The second step consisted of defining an exclusion dictionary, including combinations of words to remove cases erroneously classified by the inclusion dictionary alone. The third step was the definition of a reinclusion dictionary to reinclude cases that had been erroneously classified by the exclusion dictionary. Finally, cases obtained by the exclusion dictionary were removed from cases obtained by the inclusion dictionary, and cases from the reinclusion dictionary were subsequently reincluded using Rv3.0.2 (R Foundation for Statistical Computing, Vienna, Austria). Manual analysis was performed as a separate process by a single experienced clinician reading through the dataset once and classifying each row of data based on the interpretation of the free
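
    The three-dictionary cascade described above maps naturally onto a few lines of code. The sketch below reproduces the include / exclude / re-include logic with Python regular expressions; the patterns and example rows are invented stand-ins, not the study's actual dictionaries (which were built in SimStat-WordStat from the corpus word list).

```python
import re

# Toy dictionaries in the spirit of the three-step process above.
INCLUSION = [r"\bcolic\b", r"\bcolick(y|ing)\b"]
EXCLUSION = [r"\bno signs? of colic\b", r"\bcolic ruled out\b"]
REINCLUSION = [r"\bno signs? of colic (yesterday|previously)\b"]

def matches(patterns, text):
    return any(re.search(p, text, re.IGNORECASE) for p in patterns)

def classify(row_text):
    """Return True if this row of free text is a case of the condition:
    step 1 include, step 2 exclude, step 3 re-include."""
    if not matches(INCLUSION, row_text):
        return False
    if matches(EXCLUSION, row_text) and not matches(REINCLUSION, row_text):
        return False
    return True

rows = ["Horse presented with acute colic.",
        "Re-exam: no signs of colic today.",
        "No signs of colic previously, now colicky again."]
print([classify(r) for r in rows])  # [True, False, True]
```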

  17. Gastric emptying of liquid meals: validation of the gamma camera technique

    Energy Technology Data Exchange (ETDEWEB)

    Lawaetz, Otto; Dige-Petersen, Harriet

    1989-05-01

    To assess the extent of errors and to provide correction factors for gamma camera gastric emptying studies of liquid meals labelled with radionuclides (99mTc or 113mIn), phantom studies were performed with different gastric emptying procedures, gamma cameras and data handling systems. To validate the overall accuracy of the method, 24 combined aspiration and gamma camera gastric emptying studies were carried out in three normal volunteers. Gastric meal volume was underestimated due to scattered radiation from the stomach. The underestimation was 7-20%, varying with the size of the gastric region of interest (ROI), the energy of the nuclide and the fraction of meal in the stomach. The overestimation, due to scattered radiation from the gut, was negligible (1-3%) for any of the procedures. The gamma camera technique eliminated much of the error due to variations of stomach geometry and produced accurate quantitative gastric emptying data comparable to those obtained by evacuation (P > 0.10), when the entire field maximum 1-min count achieved within the first 20 min of a study was taken as representing the original volume of the meal ingested, and when corrections for area-related errors due to scattered radiation from the stomach were performed. (author).

  18. Machine learning techniques for gait biometric recognition using the ground reaction force

    CERN Document Server

    Mason, James Eric; Woungang, Isaac

    2016-01-01

    This book focuses on how machine learning techniques can be used to analyze and make use of one particular category of behavioral biometrics known as the gait biometric. A comprehensive Ground Reaction Force (GRF)-based Gait Biometrics Recognition framework is proposed and validated by experiments. In addition, an in-depth analysis of existing recognition techniques that are best suited for performing footstep GRF-based person recognition is also proposed, as well as a comparison of feature extractor, normalizer, and classifier configurations that were never directly compared with one another in any previous GRF recognition research. Finally, a detailed theoretical overview of many existing machine learning techniques is presented, leading to a proposal of two novel data processing techniques developed specifically for the purpose of gait biometric recognition using GRF. The book introduces novel machine-learning-based temporal normalization techniques and bridges research gaps concerning the effect of ...
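
    The two core steps named above - temporal normalization of footstep GRF waveforms followed by classification - can be sketched briefly. The example below uses NumPy and scikit-learn on synthetic waveforms; the data, the choice of a k-NN classifier, and all parameter values are illustrative assumptions, not the book's framework.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def normalize_length(grf, n_points=100):
    """Temporal normalization: resample a footstep's vertical GRF
    waveform to a fixed number of samples (linear interpolation)."""
    x_old = np.linspace(0.0, 1.0, num=len(grf))
    x_new = np.linspace(0.0, 1.0, num=n_points)
    return np.interp(x_new, x_old, grf)

# Synthetic stand-in data: two "people" whose footsteps differ in
# peak force; a real study would use force-plate recordings.
rng = np.random.default_rng(0)
waveforms, labels = [], []
for person, peak in [(0, 700.0), (1, 820.0)]:
    for _ in range(30):
        n = rng.integers(80, 120)                 # variable step duration
        t = np.linspace(0.0, np.pi, num=n)
        w = peak * np.sin(t) + rng.normal(0.0, 10.0, size=n)
        waveforms.append(normalize_length(w))
        labels.append(person)

X, y = np.array(waveforms), np.array(labels)
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
clf.fit(X[::2], y[::2])                           # train on half the steps
print(clf.score(X[1::2], y[1::2]))                # held-out recognition rate
```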

  19. The validation of an infrared simulation system

    CSIR Research Space (South Africa)

    De Waal, A

    2013-08-01

    Full Text Available theoretical validation framework. This paper briefly describes the procedure used to validate software models in an infrared system simulation, and provides application examples of this process. The discussion includes practical validation techniques...

  20. Specification and Preliminary Validation of IAT (Integrated Analysis Techniques) Methods: Executive Summary.

    Science.gov (United States)

    1985-03-01

    conceptual framwork , and preliminary validation of IAT concepts. Planned work for FY85, including more extensive validation, is also described. 20...Developments: Required Capabilities .... ......... 10 2-1 IAT Conceptual Framework - FY85 (FEO) ..... ........... 11 2-2 Recursive Nature of Decomposition...approach: 1) Identify needs & requirements for IAT. 2) Develop IAT conceptual framework. 3) Validate IAT methods. 4) Develop applications materials. To

  1. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  2. Validation and Test-Retest Reliability of New Thermographic Technique Called Thermovision Technique of Dry Needling for Gluteus Minimus Trigger Points in Sciatica Subjects and TrPs-Negative Healthy Volunteers

    Science.gov (United States)

    Rychlik, Michał; Samborski, Włodzimierz

    2015-01-01

    The aim of this study was to assess the validity and test-retest reliability of the Thermovision Technique of Dry Needling (TTDN) for the gluteus minimus muscle. TTDN is a new thermography approach used to support trigger points (TrPs) diagnostic criteria by the presence of short-term vasomotor reactions occurring in the area where TrPs refer pain. Method. Thirty chronic sciatica patients (n=15 TrP-positive and n=15 TrPs-negative) and 15 healthy volunteers were evaluated by TTDN three times during two consecutive days based on TrPs of the gluteus minimus muscle confirmed additionally by referred pain presence. TTDN employs average temperature (Tavr), maximum temperature (Tmax), low/high isothermal-area, and the autonomic referred pain phenomenon (AURP) that reflects vasodilatation/vasoconstriction. Validity and test-retest reliability were assessed concurrently. Results. Two components of TTDN validity and reliability, Tavr and AURP, had almost perfect agreement according to κ (e.g., thigh: 0.880 and 0.938; calf: 0.902 and 0.956, resp.). The sensitivity for Tavr, Tmax, AURP, and high isothermal-area was 100% for everyone, but specificity of 100% was for Tavr and AURP only. Conclusion. TTDN is a valid and reliable method for Tavr and AURP measurement to support TrPs diagnostic criteria for the gluteus minimus muscle when a digitally evoked referred pain pattern is present. PMID:26137486

  3. Novel experimental measuring techniques required to provide data for CFD validation

    International Nuclear Information System (INIS)

    Prasser, H.-M.

    2008-01-01

    CFD code validation requires experimental data that characterize the distributions of parameters within large flow domains. On the other hand, the development of geometry-independent closure relations for CFD codes has to rely on instrumentation and experimental techniques appropriate for the phenomena that are to be modelled, which usually requires high spatial and time resolution. The paper reports on the use of wire-mesh sensors to study turbulent mixing processes in single-phase flow as well as to characterize the dynamics of the gas-liquid interface in a vertical pipe flow. Experiments on a pipe with a nominal diameter of 200 mm are taken as the basis for the development and test of closure relations describing bubble coalescence and break-up, interfacial momentum transfer and turbulence modulation for a multi-bubble-class model. This is done by measuring the evolution of the flow structure along the pipe. The transferability of the extended CFD code to more complicated 3D flow situations is assessed against measured data from tests involving two-phase flow around an asymmetric obstacle placed in a vertical pipe. The obstacle, a half-moon-shaped diaphragm, is movable in the direction of the pipe axis; this allows the 3D gas fraction field to be recorded without changing the sensor position. In the outlook, the pressure chamber of TOPFLOW is presented, which will be used as the containment for a test facility, in which experiments can be conducted in pressure equilibrium with the inner atmosphere of the tank. In this way, flow structures can be observed by optical means through large-scale windows even at pressures of up to 5 MPa. The so-called 'Diving Chamber' technology will be used for Pressurized Thermal Shock (PTS) tests. Finally, some important trends in instrumentation for multi-phase flows will be given. This includes the state of the art of X-ray and gamma tomography, new multi-component wire-mesh sensors, and a discussion of the potential of other non

  4. Novel experimental measuring techniques required to provide data for CFD validation

    International Nuclear Information System (INIS)

    Prasser, H.M.

    2007-01-01

    CFD code validation requires experimental data that characterize distributions of parameters within large flow domains. On the other hand, the development of geometry-independent closure relations for CFD codes has to rely on instrumentation and experimental techniques appropriate for the phenomena that are to be modelled, which usually requires high spatial and time resolution. The presentation reports on the use of wire-mesh sensors to study turbulent mixing processes in the single-phase flow as well as to characterize the dynamics of the gas-liquid interface in a vertical pipe flow. Experiments on a pipe with a nominal diameter of 200 mm are taken as the basis for the development and test of closure relations describing bubble coalescence and break-up, interfacial momentum transfer and turbulence modulation for a multi-bubble-class model. This is done by measuring the evolution of the flow structure along the pipe. The transferability of the extended CFD code to more complicated 3D flow situations is assessed against measured data from tests involving two-phase flow around an asymmetric obstacle placed in a vertical pipe. The obstacle, a half-moon-shaped diaphragm, is movable in the direction of the pipe axis; this allows the 3D gas fraction field to be recorded without changing the sensor position. In the outlook, the pressure chamber of TOPFLOW is presented, which will be used as the containment for a test facility, in which experiments can be conducted in pressure equilibrium with the inner atmosphere of the tank. In this way, flow structures can be observed by optical means through large-scale windows even at pressures of up to 5 MPa. The so-called 'Diving Chamber' technology will be used for Pressurized Thermal Shock (PTS) tests. Finally, some important trends in instrumentation for multi-phase flows will be given. This includes the state of the art of X-ray and gamma tomography, new multi-component wire-mesh sensors, and a discussion of the potential of

  5. Characteristics and Validation Techniques for PCA-Based Gene-Expression Signatures

    Directory of Open Access Journals (Sweden)

    Anders E. Berglund

    2017-01-01

    Full Text Available Background. Many gene-expression signatures exist for describing the biological state of profiled tumors. Principal Component Analysis (PCA can be used to summarize a gene signature into a single score. Our hypothesis is that gene signatures can be validated when applied to new datasets, using inherent properties of PCA. Results. This validation is based on four key concepts. Coherence: elements of a gene signature should be correlated beyond chance. Uniqueness: the general direction of the data being examined can drive most of the observed signal. Robustness: if a gene signature is designed to measure a single biological effect, then this signal should be sufficiently strong and distinct compared to other signals within the signature. Transferability: the derived PCA gene signature score should describe the same biology in the target dataset as it does in the training dataset. Conclusions. The proposed validation procedure ensures that PCA-based gene signatures perform as expected when applied to datasets other than those that the signatures were trained upon. Complex signatures, describing multiple independent biological components, are also easily identified.
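
    The PCA scoring step that underlies all four criteria is compact enough to sketch. The example below computes a first-principal-component signature score and a simple coherence measure with NumPy; the function names and the use of PC1 of the signature genes illustrate the general approach, not the authors' exact pipeline.

```python
import numpy as np

def pca_signature_score(expr, signature_genes):
    """Summarize a gene signature as the first principal component of
    the signature genes' expression (expr: samples x genes matrix).
    Also returns PC1's explained-variance ratio, which speaks to the
    'robustness' criterion: one biological effect should dominate."""
    X = expr[:, signature_genes]
    X = X - X.mean(axis=0)                 # centre each gene
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[0]                     # projection on PC1
    evr = S[0] ** 2 / (S ** 2).sum()
    return scores, evr

def coherence(expr, signature_genes):
    """'Coherence': mean pairwise correlation of the signature genes,
    which should exceed what chance alone would give."""
    C = np.corrcoef(expr[:, signature_genes], rowvar=False)
    off_diag = C[~np.eye(len(signature_genes), dtype=bool)]
    return off_diag.mean()
```

    A signature whose PC1 explains little variance, or whose coherence is near zero, would fail the robustness and coherence checks described above when applied to a new dataset.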

  6. Validation of CryoSat-2 SAR mode based lake levels

    DEFF Research Database (Denmark)

    Nielsen, Karina; Stenseng, Lars; Andersen, Ole Baltazar

    2015-01-01

    Lake level serves as an important indicator of the climate, and continuous measurements are therefore essential. Satellite radar altimetry has now been used successfully for more than two decades to measure lake level as an addition to gauge measurements. The technique has, due to the large footprint...... with water levels obtained from Envisat. We find that the along-track precision of the mean based on CryoSat-2 is a few centimetres, even for the small lakes, which is a significant improvement compared to previous missions such as Envisat. When validating against gauge data we find RMS values of differences

  7. An internally validated prognostic model for success in revision stapes surgery for otosclerosis.

    Science.gov (United States)

    Wegner, Inge; Vincent, Robert; Derks, Laura S M; Rauh, Simone P; Heymans, Martijn W; Stegeman, Inge; Grolman, Wilko

    2018-03-09

    To develop a prediction model that can accurately predict the chance of success following revision stapes surgery in patients with recurrent or persistent otosclerosis at 2- to 6-months follow-up and to validate this model internally. A retrospective cohort study of prospectively gathered data in a tertiary referral center. The associations of 11 prognostic factors with treatment success were tested in 705 cases using multivariable logistic regression analysis with backward selection. Success was defined as a mean air-bone gap closure to 10 dB or less. The most relevant predictors were used to derive a clinical prediction rule to determine the probability of success. Internal validation by means of bootstrapping was performed. Model performance indices, including the Hosmer-Lemeshow test, the area under the receiver operating characteristics curve (AUC), and the explained variance were calculated. Success was achieved in 57.7% of cases at 2- to 6-months follow-up. Certain previous surgical techniques, primary causes of failure leading up to revision stapes surgery, and positions of the prosthesis placed during revision surgery were associated with higher success percentages. The clinical prediction rule performed moderately well in the original dataset (Hosmer-Lemeshow P = .78; AUC = 0.73; explained variance = 22%), which slightly decreased following internal validation by means of bootstrapping (AUC = 0.69; explained variance = 13%). Our study established the importance of previous surgical technique, primary cause of failure, and type of the prosthesis placed during the revision surgery in predicting the probability of success following stapes surgery at 2- to 6-months follow-up. 2b. Laryngoscope, 2018. © 2018 The American Laryngological, Rhinological and Otological Society, Inc.
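
    The modelling recipe described here - multivariable logistic regression, then internal validation by bootstrapping - can be illustrated generically. The sketch below estimates the bootstrap optimism of the apparent AUC with scikit-learn; the data layout and all settings are assumptions for illustration, not the study's actual predictors.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def bootstrap_validate(X, y, n_boot=200, seed=0):
    """Harrell-style internal validation: refit the model on each
    bootstrap sample, measure how much better it looks on its own
    sample than on the original data (optimism), and subtract the
    mean optimism from the apparent AUC."""
    rng = np.random.default_rng(seed)
    model = LogisticRegression(max_iter=1000).fit(X, y)
    apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])
    optimism, n = [], len(y)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)
        if len(np.unique(y[idx])) < 2:
            continue                      # skip degenerate resamples
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
        auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
        optimism.append(auc_boot - auc_orig)
    return apparent, apparent - float(np.mean(optimism))
```

    The drop from an apparent AUC of 0.73 to 0.69 after bootstrapping in the study is exactly the optimism correction this kind of procedure produces.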

  8. Validation of myocardial blood flow estimation with nitrogen-13 ammonia PET by the argon inert gas technique in humans

    International Nuclear Information System (INIS)

    Kotzerke, J.; Glatting, G.; Neumaier, B.; Reske, S.N.; Hoff, J. van den; Hoeher, M.; Woehrle, J.

    2001-01-01

    We simultaneously determined global myocardial blood flow (MBF) by the argon inert gas technique and by nitrogen-13 ammonia positron emission tomography (PET) to validate PET-derived MBF values in humans. A total of 19 patients were investigated at rest (n=19) and during adenosine-induced hyperaemia (n=16). Regional coronary artery stenoses were ruled out by angiography. The argon inert gas method uses the difference of arterial and coronary sinus argon concentrations during inhalation of a mixture of 75% argon and 25% oxygen to estimate global MBF. It can be considered as valid as the microspheres technique, which, however, cannot be applied in humans. Dynamic PET was performed after injection of 0.8 ± 0.2 GBq 13N-ammonia, and MBF was calculated applying a two-tissue compartment model. MBF values derived from the argon method at rest and during the hyperaemic state were 1.03 ± 0.24 ml min⁻¹ g⁻¹ and 2.64 ± 1.02 ml min⁻¹ g⁻¹, respectively. MBF values derived from ammonia PET at rest and during hyperaemia were 0.95 ± 0.23 ml min⁻¹ g⁻¹ and 2.44 ± 0.81 ml min⁻¹ g⁻¹, respectively. The correlation between the two methods was close (y = 0.92x + 0.14, r = 0.96). It is concluded that global MBF can be determined accurately in humans with 13N-ammonia PET. (orig.)

  9. Validation of DRAGON side-step method for Bruce-A restart Phase-B physics tests

    International Nuclear Information System (INIS)

    Shen, W.; Ngo-Trong, C.; Davis, R.S.

    2004-01-01

    The DRAGON side-step method, developed at AECL, has a number of advantages over the all-DRAGON method that was used before. It is now the qualified method for reactivity-device calculations. Although the side-step-method-generated incremental cross sections have been validated against those previously calculated with the all-DRAGON method, it is highly desirable to validate the side-step method against device-worth measurements in power reactors directly. In this paper, the DRAGON side-step method was validated by comparison with the device-calibration measurements made in Bruce-A NGS Unit 4 restart Phase-B commissioning in 2003. The validation exercise showed excellent results, with the DRAGON code overestimating the measured ZCR worth by ∼5%. A sensitivity study was also performed in this paper to assess the effect of various DRAGON modelling techniques on the incremental cross sections. The assessment shows that the refinement of meshes in 3-D and the use of the side-step method are two major reasons contributing to the improved agreement between the calculated ZCR worths and the measurements. Use of different DRAGON versions, DRAGON libraries, local-parameter core conditions, and weighting techniques for the homogenization of tube clusters inside the ZCR have a very small effect on the ZCR incremental thermal absorption cross section and ZCR reactivity worth. (author)

  10. Development and validation of a new technique for estimating a minimum postmortem interval using adult blow fly (Diptera: Calliphoridae) carcass attendance.

    Science.gov (United States)

    Mohr, Rachel M; Tomberlin, Jeffery K

    2015-07-01

    Understanding the onset and duration of adult blow fly activity is critical to accurately estimating the period of insect activity or minimum postmortem interval (minPMI). Few, if any, reliable techniques have been developed and consequently validated for using adult fly activity to determine a minPMI. In this study, adult blow flies (Diptera: Calliphoridae) of Cochliomyia macellaria and Chrysomya rufifacies were collected from swine carcasses in rural central Texas, USA, during summer 2008, and Phormia regina and Calliphora vicina during the winters of 2009 and 2010. Carcass attendance patterns of blow flies were related to species, sex, and oocyte development. Summer-active flies were found to arrive 4-12 h after initial carcass exposure, with both C. macellaria and C. rufifacies arriving within 2 h of one another. Winter-active flies arrived within 48 h of one another. There was a significant difference in the degree of oocyte development on each of the first 3 days postmortem. These frequency differences allowed a minPMI to be calculated using a binomial analysis. When validated with seven tests using domestic and feral swine and human remains, the technique correctly estimated time of placement in six trials.
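
    As a toy illustration of how stage-frequency differences can be turned into a day estimate, the sketch below scores candidate days by a multinomial likelihood (the multi-stage generalization of the binomial idea) over the oocyte stages observed in a collection. All frequencies and counts are invented for illustration; the paper's actual reference frequencies come from its field data.

```python
import numpy as np

# Hypothetical reference frequencies of oocyte development stages
# (rows: day 1-3 postmortem; columns: stage A, B, C).
stage_freq = np.array([
    [0.70, 0.25, 0.05],   # day 1
    [0.30, 0.50, 0.20],   # day 2
    [0.10, 0.35, 0.55],   # day 3
])

def min_pmi_day(observed_counts, prior=None):
    """Posterior over candidate days given the stage counts of flies
    collected from remains (uniform prior by default)."""
    counts = np.asarray(observed_counts, dtype=float)
    log_lik = (counts * np.log(stage_freq)).sum(axis=1)
    if prior is None:
        prior = np.ones(len(stage_freq)) / len(stage_freq)
    post = np.exp(log_lik - log_lik.max()) * prior
    return post / post.sum()

print(min_pmi_day([2, 9, 6]))  # stage counts from a hypothetical collection
```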

  11. Reoperative sentinel lymph node biopsy after previous mastectomy.

    Science.gov (United States)

    Karam, Amer; Stempel, Michelle; Cody, Hiram S; Port, Elisa R

    2008-10-01

    Sentinel lymph node (SLN) biopsy is the standard of care for axillary staging in breast cancer, but many clinical scenarios questioning the validity of SLN biopsy remain. Here we describe our experience with reoperative-SLN (re-SLN) biopsy after previous mastectomy. Review of the SLN database from September 1996 to December 2007 yielded 20 procedures done in the setting of previous mastectomy. SLN biopsy was performed using radioisotope with or without blue dye injection superior to the mastectomy incision, in the skin flap in all patients. In 17 of 20 patients (85%), re-SLN biopsy was performed for local or regional recurrence after mastectomy. Re-SLN biopsy was successful in 13 of 20 patients (65%) after previous mastectomy. Of the 13 patients, 2 had positive re-SLN, and completion axillary dissection was performed, with 1 having additional positive nodes. In the 11 patients with negative re-SLN, 2 patients underwent completion axillary dissection demonstrating additional negative nodes. One patient with a negative re-SLN experienced chest wall recurrence combined with axillary recurrence 11 months after re-SLN biopsy. All others remained free of local or axillary recurrence. Re-SLN biopsy was unsuccessful in 7 of 20 patients (35%). In three of seven patients, axillary dissection was performed, yielding positive nodes in two of the three. The remaining four of seven patients all had previous modified radical mastectomy, so underwent no additional axillary surgery. In this small series, re-SLN was successful after previous mastectomy, and this procedure may play some role when axillary staging is warranted after mastectomy.

  12. Heat Transfer Modeling and Validation for Optically Thick Alumina Fibrous Insulation

    Science.gov (United States)

    Daryabeigi, Kamran

    2009-01-01

    Combined radiation/conduction heat transfer through unbonded alumina fibrous insulation was modeled using the diffusion approximation for the radiation component of heat transfer in the optically thick insulation. The validity of the heat transfer model was investigated by comparison to previously reported experimental effective thermal conductivity data over the insulation density range of 24 to 96 kg/cu m, with a pressure range of 0.001 to 750 torr (0.1 to 101.3 × 10³ Pa), and a test sample hot side temperature range of 530 to 1360 K. The model was further validated by comparison to thermal conductivity measurements using the transient step heating technique on an insulation sample at a density of 144 kg/cu m over a pressure range of 0.001 to 760 torr, and a temperature range of 290 to 1090 K.
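
    Under the diffusion (Rosseland) approximation for an optically thick medium, radiation acts as an additional conductivity proportional to T³ that adds to gas/solid conduction. The sketch below illustrates that combined model; the extinction coefficient and the conduction value are assumed placeholders, not the paper's fitted properties.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_conductivity(T, beta_R):
    """Rosseland diffusion approximation for optically thick media:
    k_rad = 16 * sigma * T^3 / (3 * beta_R), where beta_R is the
    Rosseland-mean extinction coefficient (1/m)."""
    return 16.0 * SIGMA * T ** 3 / (3.0 * beta_R)

def effective_conductivity(T, beta_R, k_solid_gas):
    """Combined radiation/conduction model: radiative part added to
    the (temperature- and pressure-dependent) solid/gas conduction."""
    return k_solid_gas + radiative_conductivity(T, beta_R)

# Illustrative numbers only: assumed extinction 5000 1/m and
# conduction 0.03 W/m/K for a sample at 1000 K.
print(effective_conductivity(T=1000.0, beta_R=5000.0, k_solid_gas=0.03))
```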

  13. Urban roughness mapping validation techniques and some first results

    NARCIS (Netherlands)

    Bottema, M; Mestayer, PG

    1998-01-01

    Because of measuring problems related to the evaluation of urban roughness parameters, a new approach using a roughness mapping tool has been tested: evaluation of the roughness length z0 and zero-plane displacement zd from cadastral databases. Special attention needs to be given to the validation of the

  14. International exchange on nuclear safety related expert systems: The role of software verification and validation

    International Nuclear Information System (INIS)

    Sun, B.K.H.

    1996-01-01

    An important lesson learned from the Three Mile Island accident is that human errors can be significant contributors to risk. Recent advancements in computer hardware and software technology have helped make expert system techniques potentially viable tools for improving nuclear power plant safety and reliability. As part of the general man-machine interface technology, expert systems have recently become increasingly prominent as a potential solution to a number of previously intractable problems in many phases of human activity, including operation, maintenance, and engineering functions. Traditional methods for testing and analyzing analog systems are no longer adequate to handle the increased complexity of software systems. The role of Verification and Validation (V and V) is to add rigor to the software development and maintenance cycle to guarantee the high level of confidence needed for applications. Verification includes the process and techniques for confirming that all the software requirements in one stage of the development are met before proceeding on to the next stage. Validation involves testing the integrated software and hardware system to ensure that it reliably fulfills its intended functions. Only through a comprehensive V and V program can a high level of confidence be achieved. There exist many different standards and techniques for software verification and validation, yet they lack uniform approaches that provide adequate levels of practical guidance to users for nuclear power plant applications. There is a need to unify different approaches to software verification and validation and to develop practical and cost-effective guidelines for user and regulatory acceptance. (author). 8 refs

  15. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal models. A survey of current practices and techniques was undertaken and evaluated against these criteria, and the items most relevant to waste disposal models were identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made.

  16. Validation of a simple isotopic technique for the measurement of global and separated renal function

    International Nuclear Information System (INIS)

    Chachati, A.; Meyers, A.; Rigo, P.; Godon, J.P.

    1986-01-01

    Schlegel and Gates described an isotopic method for the measurement of global and separated glomerular filtration rate (GFR) and effective renal plasma flow (ERPF) based on the determination by scintillation camera of the fraction of the injected dose (99mTc-DTPA, [131I]hippuran) present in the kidneys 1-3 min after its administration. This method requires counting of the injected dose and attenuation correction, but no blood or urine sampling. We validated this technique by the simultaneous infusion of inulin and para-aminohippuric acid (PAH) in patients with various levels of renal function (anuric to normal). To better define individual renal function we studied 9 kidneys in patients either nephrectomized or with a nephrostomy enabling separated function measurement. A good correlation between the inulin and PAH clearances and the isotopic GFR-ERPF measurements for both global and separate renal function was observed.
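
    The computational core of such a camera-based method is simple: background-subtracted kidney ROI counts, corrected for soft tissue attenuation at the kidney depth, expressed as a fraction of the injected dose counts. The sketch below illustrates this; the count values and depths are invented, and the attenuation coefficient of 0.153 cm⁻¹ for 140 keV photons in soft tissue is an assumption commonly used with such methods.

```python
import numpy as np

def kidney_uptake_fraction(roi_counts, background_counts, depth_cm,
                           injected_counts, mu_cm=0.153):
    """Fraction of the injected dose in one kidney shortly after
    injection: background-subtracted ROI counts, corrected for soft
    tissue attenuation at the kidney depth, divided by the counts of
    the injected syringe dose (all values here are illustrative)."""
    corrected = (roi_counts - background_counts) * np.exp(mu_cm * depth_cm)
    return corrected / injected_counts

left = kidney_uptake_fraction(52_000, 12_000, depth_cm=6.5,
                              injected_counts=4_000_000)
right = kidney_uptake_fraction(48_000, 11_000, depth_cm=6.8,
                               injected_counts=4_000_000)
split_left = left / (left + right)   # separated (split) renal function
```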

  17. Calibration and validation of full-field techniques

    Directory of Open Access Journals (Sweden)

    Thalmann R.

    2010-06-01

    We review basic metrological terms related to the use of measurement equipment for verification of numerical model calculations. We address three challenges that are faced when performing measurements in experimental mechanics with optical techniques: the calibration of a measuring instrument that (i) measures strain values, (ii) provides full-field data, and (iii) is dynamic.

  18. Validation and Application of Computed Radiography (CR) Tangential Technique for Wall Thickness Measurement of 10 Inch Carbon Steel Pipe

    International Nuclear Information System (INIS)

    Norhazleena Azaman; Khairul Anuar Mohd Salleh; Amry Amin Abas; Arshad Yassin; Sukhri Ahmad

    2016-01-01

    The oil and gas industry requires Non Destructive Testing (NDT) to ensure that each component, in-service and critical, is fit-for-purpose. Pipes used to transfer oil or gas are amongst the critical components that need to be well maintained and inspected. Typical pipe discontinuities that may lead to unintended incidents are erosion, corrosion, dents, welding defects, etc. Wall thickness assessment with Radiography Testing (RT) is normally used to inspect such discontinuities and can be performed with two approaches: (a) the centre-line beam tangential technique, and (b) the offset-from-centre tangential technique. The latter is the method of choice for this work because of the pipe dimensions and the limited radiation safe distance at the site. Two validation approaches (simulation and experimental) were performed to determine the likelihood of success before the actual RT work with the tangential technique was carried out. The pipe was a 10 inch diameter in-service wrapped carbon steel pipe. A 9 Ci Ir-192 source and a white Imaging Plate (IP) were used to provide the gamma radiation and to record the radiographic image. Results of this work suggest that RT with the tangential technique for a 10 inch wrapped in-service carbon steel pipe can be successfully performed. (author)
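
    For orientation only, a sketch of the projection-magnification correction that tangential-technique wall-thickness measurement relies on; the distances and the measured value below are hypothetical, not taken from the paper:

    ```python
    def true_wall_thickness(measured_mm, sdd_mm, sod_mm):
        """Correct an apparent wall thickness on the image plate for the
        geometric magnification of an offset tangential exposure
        (M = SDD / SOD, with the source-object distance taken to the
        tangent point at the pipe wall)."""
        return measured_mm * sod_mm / sdd_mm

    # Hypothetical geometry: source 900 mm from the imaging plate, tangent
    # point 750 mm from the source, 11.4 mm apparent wall on the image.
    print(f"corrected wall thickness: {true_wall_thickness(11.4, 900.0, 750.0):.2f} mm")
    ```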

  19. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques.

    Directory of Open Access Journals (Sweden)

    Hazlee Azil Illias

    It is important to predict the incipient fault in transformer oil accurately so that maintenance of the transformer oil can be performed correctly, reducing the cost of maintenance and minimising error. Dissolved gas analysis (DGA) has been widely used to predict the incipient fault in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions of the incipient fault in transformer oil because each method is only suitable for certain conditions. Many previous works have reported on the use of intelligence methods to predict transformer faults. However, it is believed that the accuracy of the previously proposed methods can still be improved. Since artificial neural network (ANN) and particle swarm optimisation (PSO) techniques have never been used together in previously reported work, this work proposes a combination of ANN and various PSO techniques to predict the transformer incipient fault. The advantages of PSO are simplicity and easy implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with the results from the actual fault diagnosis, an existing diagnosis method and ANN alone. Comparison of the results from the proposed methods with previously reported work was also performed to show the improvement of the proposed methods. It was found that the proposed ANN-Evolutionary PSO method yields a higher percentage of correct transformer fault type identification than the existing diagnosis method and previously reported works.

  20. Transformer Incipient Fault Prediction Using Combined Artificial Neural Network and Various Particle Swarm Optimisation Techniques.

    Science.gov (United States)

    Illias, Hazlee Azil; Chai, Xin Rui; Abu Bakar, Ab Halim; Mokhlis, Hazlie

    2015-01-01

    It is important to predict the incipient fault in transformer oil accurately so that maintenance of the transformer oil can be performed correctly, reducing the cost of maintenance and minimising error. Dissolved gas analysis (DGA) has been widely used to predict the incipient fault in power transformers. However, the existing DGA methods sometimes yield inaccurate predictions of the incipient fault in transformer oil because each method is only suitable for certain conditions. Many previous works have reported on the use of intelligence methods to predict transformer faults. However, it is believed that the accuracy of the previously proposed methods can still be improved. Since artificial neural network (ANN) and particle swarm optimisation (PSO) techniques have never been used together in previously reported work, this work proposes a combination of ANN and various PSO techniques to predict the transformer incipient fault. The advantages of PSO are simplicity and easy implementation. The effectiveness of the various PSO techniques in combination with ANN is validated by comparison with the results from the actual fault diagnosis, an existing diagnosis method and ANN alone. Comparison of the results from the proposed methods with previously reported work was also performed to show the improvement of the proposed methods. It was found that the proposed ANN-Evolutionary PSO method yields a higher percentage of correct transformer fault type identification than the existing diagnosis method and previously reported works.
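
    As an illustration of the kind of ANN-PSO combination described, the sketch below trains a tiny one-hidden-layer network on synthetic two-feature data with a plain global-best PSO; the network size, PSO constants and data are all stand-ins, not the authors' configuration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic two-feature stand-in for DGA gas-ratio data (two fault classes).
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

    N_HID = 6
    DIM = 4 * N_HID + 1          # weights and biases of a 2-6-1 network

    def forward(w, X):
        W1 = w[:2 * N_HID].reshape(2, N_HID)
        b1 = w[2 * N_HID:3 * N_HID]
        W2 = w[3 * N_HID:4 * N_HID].reshape(N_HID, 1)
        b2 = w[-1]
        h = np.tanh(X @ W1 + b1)
        z = np.clip(h @ W2 + b2, -50, 50)
        return 1.0 / (1.0 + np.exp(-z.ravel()))

    def fitness(w):
        return np.mean((forward(w, X) - y) ** 2)

    # Plain global-best PSO over the flattened weight vector.
    n_particles, iters = 30, 200
    pos = rng.normal(scale=0.5, size=(n_particles, DIM))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([fitness(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()

    accuracy = np.mean((forward(gbest, X) > 0.5) == (y > 0.5))
    print(f"training accuracy after PSO: {accuracy:.1%}")
    ```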

  2. Linear Unlearning for Cross-Validation

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Larsen, Jan

    1996-01-01

    The leave-one-out cross-validation scheme for generalization assessment of neural network models is computationally expensive due to replicated training sessions. In this paper we suggest linear unlearning of examples as an approach to approximative cross-validation. Further, we discuss...... time series prediction benchmark demonstrate the potential of the linear unlearning technique...
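
    The linear-unlearning idea can be made exact for ordinary least squares, where the leave-one-out residuals follow from a single fit via the hat matrix; a self-contained check on synthetic data, using the standard Sherman-Morrison identity, is sketched below:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = np.column_stack([np.ones(50), rng.normal(size=(50, 3))])
    y = X @ np.array([1.0, 2.0, -1.0, 0.5]) + 0.3 * rng.normal(size=50)

    # One fit on the full data set.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat matrix
    r = y - X @ beta                              # full-fit residuals
    loo = r / (1.0 - np.diag(H))                  # leave-one-out residuals, no retraining

    # Brute-force check: 50 actual retrainings.
    brute = np.empty(50)
    for i in range(50):
        keep = np.arange(50) != i
        b, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        brute[i] = y[i] - X[i] @ b

    print(np.allclose(loo, brute))   # True: one fit reproduces all 50 held-out errors
    ```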

  3. A new validation technique for estimations of body segment inertia tensors: Principal axes of inertia do matter.

    Science.gov (United States)

    Rossi, Marcel M; Alderson, Jacqueline; El-Sallam, Amar; Dowling, James; Reinbolt, Jeffrey; Donnelly, Cyril J

    2016-12-08

    The aims of this study were to: (i) establish a new criterion method to validate inertia tensor estimates by setting the experimental angular velocity data of an airborne object as ground truth against simulations run with the estimated tensors, and (ii) test the sensitivity of the simulations to changes in the inertia tensor components. A rigid steel cylinder was covered with reflective kinematic markers and projected through a calibrated motion capture volume. Simulations of the airborne motion were run with two models, using inertia tensors estimated with a geometric formula or the compound pendulum technique. The deviation angles between experimental (ground truth) and simulated angular velocity vectors and the root mean squared deviation angle were computed for every simulation. Monte Carlo analyses were performed to assess the sensitivity of simulations to changes in magnitude of principal moments of inertia within ±10% and to changes in orientation of principal axes of inertia within ±10° (of the geometric-based inertia tensor). Root mean squared deviation angles ranged between 2.9° and 4.3° for the inertia tensor estimated geometrically, and between 11.7° and 15.2° for the compound pendulum values. Errors up to 10% in magnitude of principal moments of inertia yielded root mean squared deviation angles ranging between 3.2° and 6.6°, and between 5.5° and 7.9° when lumped with errors of 10° in principal axes of inertia orientation. The proposed technique can effectively validate inertia tensors from novel estimation methods of body segment inertial parameters. Principal axes of inertia orientation should not be neglected when modelling human/animal mechanics.
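
    A compact sketch of the simulation core such a criterion method needs: torque-free rigid-body (Euler) equations integrated for two candidate tensors, with deviation angles computed between the resulting angular velocity histories. The tensors and initial rates below are illustrative, and one simulation stands in for the experimental ground truth:

    ```python
    import numpy as np

    def simulate_omega(I, omega0, dt=0.001, steps=2000):
        """RK4 integration of the torque-free Euler equations
        I * domega/dt = -omega x (I * omega) in the body frame."""
        I_inv = np.linalg.inv(I)
        f = lambda w: I_inv @ (-np.cross(w, I @ w))
        w, out = omega0.copy(), [omega0.copy()]
        for _ in range(steps):
            k1 = f(w); k2 = f(w + 0.5 * dt * k1)
            k3 = f(w + 0.5 * dt * k2); k4 = f(w + dt * k3)
            w = w + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            out.append(w.copy())
        return np.array(out)

    def deviation_angles(w_ref, w_sim):
        """Angle between reference and simulated angular velocity vectors [deg]."""
        cosang = np.sum(w_ref * w_sim, axis=1) / (
            np.linalg.norm(w_ref, axis=1) * np.linalg.norm(w_sim, axis=1))
        return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

    # Hypothetical cylinder-like tensor and a copy with +10% on one principal moment.
    I_geo = np.diag([0.040, 0.040, 0.002])          # kg m^2
    I_pert = np.diag([0.044, 0.040, 0.002])
    w0 = np.array([2.0, 1.0, 15.0])                 # rad/s
    dev = deviation_angles(simulate_omega(I_geo, w0), simulate_omega(I_pert, w0))
    print(f"RMS deviation angle: {np.sqrt(np.mean(dev**2)):.2f} deg")
    ```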

  4. Determination of sulfate in thorium salts using gravimetric technique with previous thorium separation

    International Nuclear Information System (INIS)

    Silva, C.M. da; Pires, M.A.F.

    1994-01-01

    Available as short communication only. A simple analytical method to analyze sulfates in thorium salts is presented. The method is based on the separation of thorium as hydroxide. The gravimetric technique is then used to determine the sulfate in the filtrate as barium sulfate. Using this method, the sulfate separation from thorium has reached 99.9% yield with 0.1% precision. The method is applied to thorium salts, specifically thorium sulfate, carbonate and nitrate. (author). 5 refs, 2 tabs
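
    The gravimetric step reduces to a stoichiometric conversion from the weighed BaSO4 precipitate to sulfate; a one-line worked example with a hypothetical weighing:

    ```python
    M_BASO4 = 233.39    # g/mol
    M_SO4 = 96.06       # g/mol

    def sulfate_from_precipitate(baso4_g):
        """Convert a weighed BaSO4 precipitate to the sulfate it contains."""
        return baso4_g * M_SO4 / M_BASO4

    # Hypothetical weighing: 0.4215 g of ignited BaSO4.
    print(f"{sulfate_from_precipitate(0.4215):.4f} g SO4(2-)")   # ~0.1735 g
    ```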

  5. Validation of variance reduction techniques in Mediso (SPIRIT DH-V) SPECT system by Monte Carlo

    International Nuclear Information System (INIS)

    Rodriguez Marrero, J. P.; Diaz Garcia, A.; Gomez Facenda, A.

    2015-01-01

    Monte Carlo simulation of nuclear medical imaging systems is a widely used method for reproducing their operation in a real clinical environment. There are several Single Photon Emission Tomography (SPECT) systems in Cuba, so it is clearly necessary to introduce a reliable and fast simulation platform in order to obtain consistent image data that reproduce the original measurement conditions. To fulfill these requirements the Monte Carlo platform GAMOS (Geant4 Medicine Oriented Architecture for Applications) has been used. Due to the size and complex configuration of parallel-hole collimators in real clinical SPECT systems, Monte Carlo simulation usually consumes excessive time and computing resources. The main goal of the present work is to optimize the efficiency of the calculation by means of new GAMOS functionality. Two GAMOS variance reduction techniques were developed and validated to speed up the calculations. These procedures focus and limit the transport of gamma quanta inside the collimator. The results were assessed experimentally on the Mediso (SPIRIT DH-V) SPECT system. The main quality control parameters, such as sensitivity and spatial resolution, were determined. Differences of 4.6% in sensitivity and 8.7% in spatial resolution were found with respect to manufacturer values. Simulation time was reduced by a factor of up to 650. Using these techniques it was possible to perform several studies in almost 8 hours each. (Author)
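
    The GAMOS code itself is not reproduced here; the toy estimator below only illustrates the principle behind such collimator-focused variance reduction: emission directions are sampled only into a cone around the collimator axis and compensated by a pdf-ratio weight, leaving the estimate unbiased while far fewer photons are tracked:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    THETA = np.radians(3.0)            # collimator acceptance half-angle (assumed)
    EXACT = (1 - np.cos(THETA)) / 2    # analytic geometric efficiency, isotropic source

    def analog(n):
        """Analog sampling: isotropic emission, most photons never hit the cone."""
        cos_t = rng.uniform(-1, 1, n)
        return (cos_t > np.cos(THETA)).mean()

    def biased(n, theta_b=np.radians(10.0)):
        """Emit only into a wider 10-degree cone; the pdf-ratio weight keeps the
        estimator unbiased while no time is spent tracking useless photons."""
        lo = np.cos(theta_b)
        cos_t = rng.uniform(lo, 1, n)      # biased pdf: uniform on [cos(theta_b), 1]
        weight = (1 - lo) / 2              # analog density (1/2) over biased (1/(1-lo))
        return ((cos_t > np.cos(THETA)) * weight).mean()

    n = 200_000
    print(f"exact {EXACT:.3e}   analog {analog(n):.3e}   biased {biased(n):.3e}")
    ```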

  6. Development and Validation of a Hypersonic Vehicle Design Tool Based On Waverider Design Technique

    Science.gov (United States)

    Dasque, Nastassja

    Methodologies for a tool capable of assisting design initiatives for practical waverider-based hypersonic vehicles were developed and validated. The design space for vehicle surfaces was formed using an algorithm that coupled directional derivatives with the conservation laws to determine a flow field defined by a set of post-shock streamlines. The design space is used to construct an ideal waverider with a sharp leading edge. A blunting method was developed to modify the ideal shapes to a more practical geometry for real-world application. Empirical and analytical relations were then systematically applied to the resulting geometries to determine local pressure, skin-friction and heat flux. For the ideal portion of the geometry, flat plate relations for compressible flow were applied. For the blunted portion of the geometry, modified Newtonian theory, Fay-Riddell theory and Modified Reynolds analogy were applied. The design and analysis methods were validated using analytical solutions as well as empirical and numerical data. The streamline solution for the flow field generation technique was compared with a Taylor-Maccoll solution and showed very good agreement. The relationship between the local Stanton number and skin friction coefficient with local Reynolds number along the ideal portion of the body showed good agreement with experimental data. In addition, an automated grid generation routine was formulated to construct a structured mesh around resulting geometries in preparation for Computational Fluid Dynamics analysis. The overall analysis of the waverider body using the tool was then compared to CFD studies. The CFD flow field showed very good agreement with the design space. However, the distribution of the surface properties was close to the CFD results, though the agreement was not as strong.
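
    The Taylor-Maccoll comparison solution used for validation is classical; a sketch of its standard numerical treatment (oblique-shock start conditions at the shock angle, integration inward until the cross-flow velocity vanishes at the cone surface) follows, with illustrative inputs:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    GAMMA = 1.4

    def taylor_maccoll(theta, y):
        """Nondimensional Taylor-Maccoll equations, y = (v_r, v_theta)."""
        vr, vt = y
        c = (GAMMA - 1) / 2 * (1 - vr**2 - vt**2)
        return [vt, (vr * vt**2 - c * (2 * vr + vt / np.tan(theta))) / (c - vt**2)]

    def cone_half_angle(M1, shock_deg):
        """March inward from an assumed shock angle; the cone surface is where
        the cross-flow velocity v_theta vanishes."""
        ts = np.radians(shock_deg)
        Mn1 = M1 * np.sin(ts)
        delta = np.arctan(2 / np.tan(ts) * (Mn1**2 - 1)
                          / (M1**2 * (GAMMA + np.cos(2 * ts)) + 2))  # flow deflection
        Mn2 = np.sqrt((1 + (GAMMA - 1) / 2 * Mn1**2)
                      / (GAMMA * Mn1**2 - (GAMMA - 1) / 2))
        M2 = Mn2 / np.sin(ts - delta)
        V2 = (1 + 2 / ((GAMMA - 1) * M2**2)) ** -0.5   # V / V_max behind the shock
        y0 = [V2 * np.cos(ts - delta), -V2 * np.sin(ts - delta)]
        surface = lambda th, y: y[1]
        surface.terminal, surface.direction = True, 1
        sol = solve_ivp(taylor_maccoll, (ts, 1e-3), y0, events=surface, max_step=1e-3)
        return np.degrees(sol.t_events[0][0])

    print(f"M = 3.0, 30 deg shock -> cone half-angle {cone_half_angle(3.0, 30.0):.1f} deg")
    ```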

  7. Evaluation of content validity for the FACT-G quality of life questionnaire through multidimensional scaling techniques

    International Nuclear Information System (INIS)

    Sanchez, Ricardo; Ballesteros, Monica; Ortiz, Natascha

    2010-01-01

    Objective: To evaluate the structure of FACT-G latent variables in a sample of patients attending the National Cancer Institute of Colombia. Methods: The FACT-G questionnaire was applied to 473 patients with different types of cancer during 2005-2007. A factor analysis was done based on a polychoric matrix, with multidimensional scaling techniques for ordinal variables determining the domain structure of the questionnaire. Results: Breast and prostate cancer were the most frequent types of tumors. In total 54.6% were men and the mean age was 61 years (SD 11.7). The four domains of the questionnaire revealed similar scores. The factor analysis showed a structure similar to the original FACT-G, with emotional function as the least consistent domain. According to the multidimensional scaling analysis, a two-dimensional structure is suitable under different fit indices. Only the emotional function domain showed a heterogeneous structure; the remaining domains revealed clustered structures and independence among them. Central components of quality of life were functional well-being and social/family well-being. Conclusions: The FACT-G quality of life questionnaire applied in a sample of Colombian patients was consistent with the original instrument. Multidimensional scaling techniques provide information additional to conventional analysis and are useful for validating quality of life questionnaires.

  8. Technique for axillary radiotherapy using computer-assisted planning for high-risk skin cancer

    International Nuclear Information System (INIS)

    Fogarty, G.B.; Martin, J.M.; Fay, M.; Ainslie, J.; Cassumbhoy, R.

    2007-01-01

    High-risk skin cancer arising on the upper limb or trunk can cause axillary nodal metastases. Previous studies have shown that axillary radiotherapy improves regional control, but there is little published work on technique. Technique standardization is important in quality assurance and comparison of results, especially for trials. Our technique, planned with CT assistance, is presented. To assess efficacy, an audit of patients treated in our institution over a 15-month period was conducted. Of 24 patients treated, 13 were treated with radical intent, 11 of them with this technique. With a follow-up of over 2 years, the technique achieved more than 90% (10/11) regional control in this radical group. Both of the radical patients who were not treated according to the technique had regional failure. One case of late toxicity, asymptomatic lymphoedema, was found in a radically treated patient. This technique for axillary radiotherapy for regional control of skin cancer is acceptable in terms of disease control and toxicity, as validated by audit at 2 years.

  9. Development of knowledgebase system for assisting signal validation scheme design

    International Nuclear Information System (INIS)

    Kitamura, M.; Baba, T.; Washio, T.; Sugiyama, K.

    1987-01-01

    The purpose of this study is to develop a knowledgebase system to be used as a tool for designing signal validation schemes. The outputs from the signal validation scheme can be used as; (1) auxiliary signals for detecting sensor failures, (2) inputs to advanced instrumentation such as disturbance analysis and diagnosis system or safety parameter display system, and (3) inputs to digital control systems. Conventional signal validation techniques such as comparison of redundant sensors, limit checking, and calibration tests have been employed in nuclear power plants. However, these techniques have serious drawbacks, e.g. needs for extra sensors, vulnerability to common mode failures, limited applicability to continuous monitoring, etc. To alleviate these difficulties, a new signal validation technique has been developed by using the methods called analytic redundancy and parity space. Although the new technique has been proved feasible as far as preliminary tests are concerned, further developments should be made in order to enhance its practical applicability
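
    The parity-space idea mentioned above can be shown in a few lines for a triply redundant measurement: a parity matrix annihilates the common signal, so the parity vector responds only to sensor faults. The thresholds and noise levels below are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Three redundant sensors measuring one process variable: m = H x + noise.
    H = np.ones((3, 1))

    # Parity matrix: rows are an orthonormal basis of the left null space of H,
    # so V @ m is insensitive to the true value x and responds only to faults.
    V = np.array([[2.0, -1.0, -1.0],
                  [0.0, np.sqrt(3.0), -np.sqrt(3.0)]]) / np.sqrt(6.0)
    assert np.allclose(V @ H, 0) and np.allclose(V @ V.T, np.eye(2))

    def check(m, threshold=0.5):
        p = V @ m                                # parity vector
        if np.linalg.norm(p) < threshold:
            return "all sensors consistent"
        # The fault signature of sensor i is the i-th column of V; pick the
        # column most aligned with the observed parity vector.
        return f"suspect sensor {int(np.argmax(np.abs(V.T @ p)))}"

    m = 120.0 + rng.normal(scale=0.1, size=3)    # healthy triple measurement
    print(check(m))
    m[1] += 5.0                                  # bias failure on sensor 1
    print(check(m))
    ```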

  10. Sentinel Lymph Node Biopsy (SLNB) for Breast Cancer (BC) - Validation Protocol of the Technique

    International Nuclear Information System (INIS)

    Blidaru, A.; Bordea, C.I.; Condrea, Ileana; Albert, Paul

    2006-01-01

    Full text: The sentinel ganglion concept originates in the assumption that the primary tumor drains into a specific ganglionar area and then runs through the lymphatic nodes in an orderly, sequential manner. When neoplastic dissemination along the lymphatic pathway occurs, there is an initial invasion of a specific lymph node (rarely more than one) located on the drainage route. That first lymph node is identified as the sentinel node, which mirrors the regional ganglionar status. In order to establish the indication for lymphadenectomy and avoid the situations in which such a surgical procedure would be of no use (N-), the only correct method consists in the identification and biopsy of the sentinel node. Radioactive tracing and/or use of vital staining enable the identification of the regional ganglionar group towards which the primary lesion is draining. The technique of sentinel lymph node identification and biopsy by means of radioactive tracing includes: - pre-surgical lymphoscintigraphy, - identification of the sentinel lymph node and its excisional biopsy, - intra-operative histopathological examination and immunohistochemical stains of the sentinel lymph node. Regional lymphadenectomy serves two major purposes: - diagnostic (axillary lymph node invasion represents an important prognostic factor) and therapeutic (to ensure local control of the disease). Regional lymph node invasion in breast cancer is directly related to the primary tumour size. In the less advanced stages (T1), as there is rarely invasion of the axillary lymph nodes, lymphadenectomy can be avoided in most cases. The paper presents the refinement of the technique, the validation of the method for the identification and biopsy of the sentinel lymph node in breast cancer using Tc99 and the intra-operative use of NEOPROBE 2000 gamma camera at the 'Prof. Dr. Alexandru Trestioreanu' Oncological Institute in Bucharest. 93 patients with primary breast cancer (T1, T2, N0

  11. Validation for chromatographic and electrophoretic methods

    OpenAIRE

    Ribani, Marcelo; Bottoli, Carla Beatriz Grespan; Collins, Carol H.; Jardim, Isabel Cristina Sales Fontes; Melo, Lúcio Flávio Costa

    2004-01-01

    The validation of an analytical method is fundamental to implementing a quality control system in any analytical laboratory. As the separation techniques, GC, HPLC and CE, are often the principal tools used in such determinations, procedure validation is a necessity. The objective of this review is to describe the main aspects of validation in chromatographic and electrophoretic analysis, showing, in a general way, the similarities and differences between the guidelines established by the dif...

  12. Validation of small Kepler transiting planet candidates in or near the habitable zone

    DEFF Research Database (Denmark)

    Torres, Guillermo; Kane, Stephen R.; Rowe, Jason F.

    2017-01-01

    A main goal of NASA's Kepler Mission is to establish the frequency of potentially habitable Earth-size planets. Relatively few such candidates identified by the mission can be confirmed to be rocky via dynamical measurement of their mass. Here we report an effort to validate 18 of them...... statistically using the BLENDER technique, by showing that the likelihood they are true planets is far greater than that of a false positive. Our analysis incorporates follow-up observations including high-resolution optical and near-infrared spectroscopy, high-resolution imaging, and information from...... the analysis of the flux centroids of the Kepler observations themselves. Although many of these candidates have been previously validated by others, the confidence levels reported typically ignore the possibility that the planet may transit a star different from the target along the same line of sight...

  13. Dynamic testing in schizophrenia: does training change the construct validity of a test?

    Science.gov (United States)

    Wiedl, Karl H; Schöttke, Henning; Green, Michael F; Nuechterlein, Keith H

    2004-01-01

    Dynamic testing typically involves specific interventions for a test to assess the extent to which test performance can be modified, beyond level of baseline (static) performance. This study used a dynamic version of the Wisconsin Card Sorting Test (WCST) that is based on cognitive remediation techniques within a test-training-test procedure. From results of previous studies with schizophrenia patients, we concluded that the dynamic and static versions of the WCST should have different construct validity. This hypothesis was tested by examining the patterns of correlations with measures of executive functioning, secondary verbal memory, and verbal intelligence. Results demonstrated a specific construct validity of WCST dynamic (i.e., posttest) scores as an index of problem solving (Tower of Hanoi) and secondary verbal memory and learning (Auditory Verbal Learning Test), whereas the impact of general verbal capacity and selective attention (Verbal IQ, Stroop Test) was reduced. It is concluded that the construct validity of the test changes with dynamic administration and that this difference helps to explain why the dynamic version of the WCST predicts functional outcome better than the static version.

  14. Contextual Validity in Hybrid Logic

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin

    2013-01-01

    interpretations. Moreover, such indexicals give rise to a special kind of validity—contextual validity—that interacts with ordinary logical validity in interesting and often unexpected ways. In this paper we model these interactions by combining standard techniques from hybrid logic with insights from the work...... of Hans Kamp and David Kaplan. We introduce a simple proof rule, which we call the Kamp Rule, and first we show that it is all we need to take us from logical validities involving now to contextual validities involving now too. We then go on to show that this deductive bridge is strong enough to carry us...... to contextual validities involving yesterday, today and tomorrow as well....

  15. Experiments at the GELINA facility for the validation of the self-indication neutron resonance densitometry technique

    Directory of Open Access Journals (Sweden)

    Rossa Riccardo

    2017-01-01

    Self-Indication Neutron Resonance Densitometry (SINRD) is a passive non-destructive method that is being investigated to quantify the 239Pu content in a spent fuel assembly. The technique relies on the energy dependence of total cross sections for neutron induced reactions. The cross sections show resonance structures that can be used to quantify the presence of materials in objects, e.g. the total cross-section of 239Pu shows a strong resonance close to 0.3 eV. This resonance will cause a reduction of the number of neutrons emitted from spent fuel when 239Pu is present. Hence such a reduction can be used to quantify the amount of 239Pu present in the fuel. A neutron detector with a high sensitivity to neutrons in this energy region is used to enhance the sensitivity to 239Pu. This principle is similar to self-indication cross section measurements. An appropriate detector can be realized by surrounding a 239Pu-loaded fission chamber with appropriate neutron absorbing material. In this contribution experiments performed at the GELINA time-of-flight facility of the JRC at Geel (Belgium) to validate the simulations are discussed. The results confirm that the strongest sensitivity to the target material was achieved with the self-indication technique, highlighting the importance of using a 239Pu fission chamber for the SINRD measurements.
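
    A toy transmission calculation illustrates why the 0.3 eV resonance carries the 239Pu signature; the single-level resonance parameters below are illustrative stand-ins, not evaluated nuclear data:

    ```python
    import numpy as np

    # Illustrative single-level stand-in for the 0.3 eV 239Pu resonance
    # (not evaluated nuclear data).
    E0, WIDTH, SIG0 = 0.296, 0.1, 5.0e3        # eV, eV, barn

    def sigma_tot(E):
        return SIG0 / (1 + ((E - E0) / (WIDTH / 2)) ** 2)   # Lorentzian shape, barn

    def transmission(E, n_pu):
        """Transmission through a 239Pu areal density n_pu [atoms/barn]."""
        return np.exp(-n_pu * sigma_tot(E))

    E = np.linspace(0.05, 1.0, 500)
    dE = E[1] - E[0]
    for n_pu in (0.0, 5e-5, 2e-4):
        # Weighting the transmitted flux with the same cross section mimics the
        # response of a 239Pu-loaded fission chamber, as in self-indication.
        signal = (transmission(E, n_pu) * sigma_tot(E)).sum() * dE
        print(f"n_Pu = {n_pu:.0e} at/b -> relative detector signal {signal:.3e}")
    ```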

  16. Airfoil shape optimization using non-traditional optimization technique and its validation

    Directory of Open Access Journals (Sweden)

    R. Mukesh

    2014-07-01

    Computational fluid dynamics (CFD) is one of the computer-based solution methods most widely employed in aerospace engineering. The computational power and time required to carry out the analysis increase as the fidelity of the analysis increases. Aerodynamic shape optimization has become a vital part of aircraft design in recent years. Generally, to optimize an airfoil we first have to describe it, which requires at least a hundred points of x and y co-ordinates. It is really difficult to optimize airfoils with this large number of co-ordinates. Nowadays many different parameter-set schemes are used to describe a general airfoil, such as B-spline and PARSEC. The main goal of these parameterization schemes is to reduce the number of needed parameters to as few as possible while controlling the important aerodynamic features effectively. Here the work has been done with the PARSEC geometry representation method. The objective of this work is to describe a general airfoil using twelve parameters by representing its shape as a polynomial function. We also introduce a Genetic Algorithm to optimize the aerodynamic characteristics of a general airfoil for specific conditions. A MATLAB program has been developed to implement PARSEC, the panel technique, and the Genetic Algorithm. This program has been tested on a standard NACA 2411 airfoil and used to improve its coefficient of lift. Pressure distribution and coefficient of lift for the airfoil geometries have been calculated using the panel method. The optimized airfoil has an improved coefficient of lift compared to the original one. The optimized airfoil is validated using wind tunnel data.
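
    A sketch of the PARSEC idea for the upper surface: six shape coefficients of y(x) = sum a_n x^(n-1/2) are solved from six geometric constraints (trailing-edge height and slope; crest position, height and curvature; leading-edge radius). The parameter values are illustrative, not a fitted airfoil:

    ```python
    import numpy as np

    def parsec_upper(r_le, x_up, z_up, zxx_up, z_te, theta_te_deg):
        """Solve the six upper-surface PARSEC coefficients a_n of
        y(x) = sum a_n * x**(n - 1/2) from six geometric constraints."""
        e = np.arange(1, 7) - 0.5
        A = np.zeros((6, 6)); b = np.zeros(6)
        A[0] = 1.0 ** e;                      b[0] = z_te               # TE height
        A[1] = e * 1.0 ** (e - 1);            b[1] = np.tan(np.radians(theta_te_deg))  # TE slope
        A[2] = x_up ** e;                     b[2] = z_up               # crest height
        A[3] = e * x_up ** (e - 1);           b[3] = 0.0                # crest slope
        A[4] = e * (e - 1) * x_up ** (e - 2); b[4] = zxx_up             # crest curvature
        A[5, 0] = 1.0;                        b[5] = np.sqrt(2 * r_le)  # LE radius
        return np.linalg.solve(A, b)

    # Illustrative parameter values (not a fitted airfoil):
    a = parsec_upper(r_le=0.01, x_up=0.4, z_up=0.06, zxx_up=-0.45,
                     z_te=0.0, theta_te_deg=-8.0)
    x = np.linspace(0, 1, 5)
    y = sum(a[i] * x ** (i + 0.5) for i in range(6))
    print(np.round(y, 4))
    ```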

  17. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    Given the growth of e-commerce, websites play an essential role in business success. Therefore, many authors have offered website evaluation models since 1995. However, the multiplicity and diversity of evaluation models make it difficult to integrate them into a single comprehensive model. In this paper a quantitative method has been used to integrate previous models into a comprehensive model that is compatible with them. In this approach the researcher's judgment has no role in the integration of models, and the new model takes its validity from 93 previous models and a systematic quantitative approach.

  18. Trace-based post-silicon validation for VLSI circuits

    CERN Document Server

    Liu, Xiao

    2014-01-01

    This book first provides a comprehensive coverage of state-of-the-art validation solutions based on real-time signal tracing to guarantee the correctness of VLSI circuits.  The authors discuss several key challenges in post-silicon validation and provide automated solutions that are systematic and cost-effective.  A series of automatic tracing solutions and innovative design for debug (DfD) techniques are described, including techniques for trace signal selection for enhancing visibility of functional errors, a multiplexed signal tracing strategy for improving functional error detection, a tracing solution for debugging electrical errors, an interconnection fabric for increasing data bandwidth and supporting multi-core debug, an interconnection fabric design and optimization technique to increase transfer flexibility and a DfD design and associated tracing solution for improving debug efficiency and expanding tracing window. The solutions presented in this book improve the validation quality of VLSI circuit...

  19. A second study of the prediction of cognitive errors using the 'CREAM' technique

    International Nuclear Information System (INIS)

    Collier, Steve; Andresen, Gisle

    2000-03-01

    Some human errors, such as errors of commission and knowledge-based errors, are not adequately modelled in probabilistic safety assessments. Even qualitative methods for handling these sorts of errors are comparatively underdeveloped. The 'Cognitive Reliability and Error Analysis Method' (CREAM) was recently developed for prediction of cognitive error modes. It has not yet been comprehensively established how reliable, valid and generally useful it could be to researchers and practitioners. A previous study of CREAM at Halden was promising, showing a relationship between errors predicted in advance and those that actually occurred in simulated fault scenarios. The present study continues this work. CREAM was used to make predictions of cognitive error modes throughout two rather difficult fault scenarios. Predictions were made of the most likely cognitive error mode, were one to occur at all, at several points throughout the expected scenarios, based upon the scenario design and description. Each scenario was then run 15 times with different operators. Error modes occurring during simulations were later scored using the task description for the scenario, videotapes of operator actions, eye-track recording, operators' verbal protocols and an expert's concurrent commentary. The scoring team had no previous substantive knowledge of the experiment or the techniques used, so as to provide a more stringent test of the data and knowledge needed for scoring. The scored error modes were then compared with the CREAM predictions to assess the degree of agreement. Some cognitive error modes were predicted successfully, but the results were generally not so encouraging as the previous study. Several problems were found with both the CREAM technique and the data needed to complete the analysis. It was felt that further development was needed before this kind of analysis can be reliable and valid, either in a research setting or as a practitioner's tool in a safety assessment

  20. Validation of a cylindrical phantom for verification of radiotherapy treatments in head and neck with special techniques

    International Nuclear Information System (INIS)

    Vargas, Nicolas M.; Garcia, Marcia; Piriz, Gustavo; Perez, Niurka

    2011-01-01

    Verification of radiotherapy treatments in head and neck requires, among other things, small-volume chambers and a phantom to reproduce the geometry and density of the anatomical structure. New documents from the ICRU (International Commission on Radiation Units and Measurements), Report 83, established the need for quality control in radiotherapy with special techniques such as IMRT (intensity-modulated radiation therapy). In this study, we built a cylindrical acrylic phantom with standing water, containing seven measuring points in the transverse plane and free location (0-20 cm) in the longitudinal plane. These measurement points consist of cavities that accommodate ionization chambers of up to 7 mm major diameter (Semiflex, PinPoint with build-up cap). The phantom validation yielded percentage differences of less than 1% for fixed beams and less than 2.5% for arc therapy with respect to the Eclipse TPS calculation. This phantom, made particularly to verify head and neck treatments, proved simple and reliable for checking the dose in radiotherapy with fixed beams and/or special techniques such as arc therapy or IMRT, and it will therefore be sent to various radiotherapy centers in the country for dosimetric verification of such treatments. (author)

  2. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta

    Directory of Open Access Journals (Sweden)

    Barbara Gasse

    2017-06-01

    Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene (MMP20) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva and MMP20 exons and exon-intron boundaries sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  3. Evolutionary Analysis Predicts Sensitive Positions of MMP20 and Validates Newly- and Previously-Identified MMP20 Mutations Causing Amelogenesis Imperfecta.

    Science.gov (United States)

    Gasse, Barbara; Prasad, Megana; Delgado, Sidney; Huckert, Mathilde; Kawczynski, Marzena; Garret-Bernardin, Annelyse; Lopez-Cazaux, Serena; Bailleul-Forestier, Isabelle; Manière, Marie-Cécile; Stoetzel, Corinne; Bloch-Zupan, Agnès; Sire, Jean-Yves

    2017-01-01

    Amelogenesis imperfecta (AI) designates a group of genetic diseases characterized by a large range of enamel disorders causing important social and health problems. These defects can result from mutations in enamel matrix proteins or protease encoding genes. A range of mutations in the enamel cleavage enzyme matrix metalloproteinase-20 gene (MMP20) produce enamel defects of varying severity. To address how various alterations produce a range of AI phenotypes, we performed a targeted analysis to find MMP20 mutations in French patients diagnosed with non-syndromic AI. Genomic DNA was isolated from saliva and MMP20 exons and exon-intron boundaries sequenced. We identified several homozygous or heterozygous mutations, putatively involved in the AI phenotypes. To validate missense mutations and predict sensitive positions in the MMP20 sequence, we evolutionarily compared 75 sequences extracted from the public databases using the Datamonkey webserver. These sequences were representative of mammalian lineages, covering more than 150 million years of evolution. This analysis allowed us to find 324 sensitive positions (out of the 483 MMP20 residues), pinpoint functionally important domains, and build an evolutionary chart of important conserved MMP20 regions. This is an efficient tool to identify new- and previously-identified mutations. We thus identified six functional MMP20 mutations in unrelated families, finding two novel mutated sites. The genotypes and phenotypes of these six mutations are described and compared. To date, 13 MMP20 mutations causing AI have been reported, making these genotypes and associated hypomature enamel phenotypes the most frequent in AI.

  4. Use of Modern Chemical Protein Synthesis and Advanced Fluorescent Assay Techniques to Experimentally Validate the Functional Annotation of Microbial Genomes

    Energy Technology Data Exchange (ETDEWEB)

    Kent, Stephen [University of Chicago

    2012-07-20

    The objective of this research program was to prototype methods for the chemical synthesis of predicted protein molecules in annotated microbial genomes. High throughput chemical methods were to be used to make large numbers of predicted proteins and protein domains, based on microbial genome sequences. Microscale chemical synthesis methods for the parallel preparation of peptide-thioester building blocks were developed; these peptide segments are used for the parallel chemical synthesis of proteins and protein domains. Ultimately, it is envisaged that these synthetic molecules would be ‘printed’ in spatially addressable arrays. The unique ability of total synthesis to precision label protein molecules with dyes and with chemical or biochemical ‘tags’ can be used to facilitate novel assay technologies adapted from state-of-the art single molecule fluorescence detection techniques. In the future, in conjunction with modern laboratory automation this integrated set of techniques will enable high throughput experimental validation of the functional annotation of microbial genomes.

  5. Improving clinical decision support using data mining techniques

    Science.gov (United States)

    Burn-Thornton, Kath E.; Thorpe, Simon I.

    1999-02-01

    Physicians, in their ever-demanding jobs, are looking to decision support systems for aid in clinical diagnosis. However, clinical decision support systems need to be of sufficiently high accuracy that they help, rather than hinder, the physician in his/her diagnosis. Decision support systems with accuracies of patient state determination greater than 80 percent are generally perceived to be sufficiently accurate to fulfill the role of helping the physician. We have previously shown that data mining techniques have the potential to provide the underpinning technology for clinical decision support systems. In this paper, an extension of the work in reference 2, we describe how changes in data mining methodologies for the analysis of 12-lead ECG data improve the accuracy with which data mining algorithms determine which patients are suffering from heart disease. We show that the accuracy of patient state prediction, for all the algorithms we investigated, can be increased by up to 6 percent using the combination of appropriate test-training ratios and 5-fold cross-validation. The use of cross-validation greater than 5-fold appears to reduce the improvement in algorithm classification accuracy gained by the use of this validation method. The accuracy of 84 percent in patient state predictions, obtained using the algorithm OC1, suggests that this algorithm will be capable of providing the required accuracy for clinical decision support systems.
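
    A minimal sketch of the kind of k-fold comparison described, using scikit-learn on synthetic stand-in features; the study used 12-lead ECG data and the OC1 oblique classifier, for which a stock decision tree substitutes here:

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic stand-in for 12-lead ECG feature vectors; a stock decision tree
    # substitutes for the oblique OC1 classifier named in the abstract.
    X, y = make_classification(n_samples=600, n_features=24, n_informative=10,
                               random_state=0)
    clf = DecisionTreeClassifier(random_state=0)

    for k in (5, 10, 20):
        scores = cross_val_score(clf, X, y, cv=k)
        print(f"{k:>2}-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
    ```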

  6. Validation of IPS Single-Station Analysis (SSA) Using MEXART Routines in Multi-Station Spectra. Comparison with Cross Correlation Function (CCF) Analysis.

    Science.gov (United States)

    Bisi, M. M.; Chang, O.; Gonzalez-Esparza, A.; Fallows, R. A.; Aguilar-Rodriguez, E.

    2017-12-01

    The phenomenon of Interplanetary Scintillation (IPS) occurs from the scattering of radio waves coming from compact radio sources that cross electron density fluctuations in the interplanetary medium. By analyzing these fluctuations in the measurements of flux intensity of galactic (compact) radio sources in a radio telescope, it is possible to infer some properties of structures in the solar wind. Studies based on observations of IPS have provided valuable information on the physics of the internal heliosphere for over 50 years. There are two techniques that provide IPS results: 1) Single-Station Analysis (SSA), where a theoretical model is fitted to the observed spectrum; and 2) Cross-Correlation Function (CCF), where two antennas separated by a few hundred kilometers simultaneously and independently observe the same radio source. In order to combine and complement solar wind speed determinations, it is important to validate the results of these two IPS techniques. In this work we analyze events from previously studied observations from MERLIN (Multi-Element Radio-Linked Interferometer Network) using the CCF methodology. The SSA model fit is applied to these observations and compared with the previous results to validate the two techniques. The objective is to know the behavior of the parameters in cases studied by CCFs that can be implemented in the SSA model. This work studies the capability of SSA model fit to describe complex events in the interplanetary environment and seeks to improve the adjustment of parameters from individual spectra to the theoretical model. The validation of these two methodologies is important to be able to combine data in real time from different radio telescopes which is necessary for the success of the Worldwide Interplanetary Scintillation Stations (WIPSS) Network to monitor solar wind structures using IPS data.
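
    The CCF estimate reduces to locating the lag of peak cross-correlation between the two stations and dividing the projected baseline by it; a synthetic sketch with an assumed baseline and wind speed:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    FS = 50.0              # sampling rate, Hz
    BASELINE_KM = 200.0    # projected antenna separation (assumed)
    TRUE_V = 400.0         # km/s -> pattern lag of 0.5 s between stations

    N = 3000
    lag_true = int(round(FS * BASELINE_KM / TRUE_V))      # 25 samples
    p = rng.normal(size=N + lag_true)                     # common scintillation pattern
    s1 = p[lag_true:] + 0.3 * rng.normal(size=N)          # upstream station
    s2 = p[:N] + 0.3 * rng.normal(size=N)                 # downstream station (delayed)

    def ccf_lag(a, b, max_lag):
        """Lag (in samples) at which b best matches a delayed copy of a."""
        lags = np.arange(1, max_lag + 1)
        cc = [np.corrcoef(a[:N - L], b[L:])[0, 1] for L in lags]
        return lags[int(np.argmax(cc))]

    lag = ccf_lag(s1, s2, 100)
    print(f"estimated solar wind speed: {BASELINE_KM / (lag / FS):.0f} km/s")
    ```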

  7. An automated patient recognition method based on an image-matching technique using previous chest radiographs in the picture archiving and communication system environment

    International Nuclear Information System (INIS)

    Morishita, Junji; Katsuragawa, Shigehiko; Kondo, Keisuke; Doi, Kunio

    2001-01-01

    An automated patient recognition method for correcting 'wrong' chest radiographs being stored in a picture archiving and communication system (PACS) environment has been developed. The method is based on an image-matching technique that uses previous chest radiographs. For identification of a 'wrong' patient, the correlation value was determined for a previous image of a patient and a new, current image of the presumed corresponding patient. The current image was shifted horizontally and vertically and rotated, so that we could determine the best match between the two images. The results indicated that the correlation values between the current and previous images for the same, 'correct' patients were generally greater than those for different, 'wrong' patients. Although the two histograms for the same patient and for different patients overlapped at correlation values greater than 0.80, most parts of the histograms were separated. The correlation value was compared with a threshold value that was determined based on an analysis of the histograms of correlation values obtained for the same patient and for different patients. If the current image is considered potentially to belong to a 'wrong' patient, then a warning sign with the probability for a 'wrong' patient is provided to alert radiology personnel. Our results indicate that at least half of the 'wrong' images in our database can be identified correctly with the method described in this study. The overall performance in terms of a receiver operating characteristic curve showed a high performance of the system. The results also indicate that some readings of 'wrong' images for a given patient in the PACS environment can be prevented by use of the method we developed. Therefore an automated warning system for patient recognition would be useful in correcting 'wrong' images being stored in the PACS environment
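
    A sketch of the shift-and-rotate correlation search the method describes, on synthetic smooth images; the search grid, threshold and noise level are illustrative (the paper derived its threshold from histograms of same- and different-patient correlation values):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, rotate, shift

    def ncc(a, b):
        """Normalized cross-correlation of two equally sized images."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float((a * b).mean())

    def best_match(previous, current):
        """Best correlation over a small grid of rotations and translations."""
        best = -1.0
        for angle in (-3.0, -1.5, 0.0, 1.5, 3.0):            # degrees
            rot = rotate(current, angle, reshape=False, order=1, mode='nearest')
            for dy in range(-8, 9, 2):
                for dx in range(-8, 9, 2):
                    cand = shift(rot, (dy, dx), order=1, mode='nearest')
                    best = max(best, ncc(previous, cand))
        return best

    rng = np.random.default_rng(5)
    field = gaussian_filter(rng.random((64, 64)), 3)         # smooth random image
    prev_img = (field - field.mean()) / field.std()
    same = shift(prev_img, (4, -6), order=1, mode='nearest') \
           + 0.1 * rng.standard_normal((64, 64))             # same "patient", moved
    other_field = gaussian_filter(rng.random((64, 64)), 3)
    other = (other_field - other_field.mean()) / other_field.std()

    THRESHOLD = 0.80     # stand-in for the histogram-derived threshold
    for name, img in (("same patient ", same), ("other patient", other)):
        r = best_match(prev_img, img)
        print(f"{name}: r = {r:.2f} -> {'accepted' if r >= THRESHOLD else 'warning'}")
    ```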

  8. Experimental Validation of Advanced Dispersed Fringe Sensing (ADFS) Algorithm Using Advanced Wavefront Sensing and Correction Testbed (AWCT)

    Science.gov (United States)

    Wang, Xu; Shi, Fang; Sigrist, Norbert; Seo, Byoung-Joon; Tang, Hong; Bikkannavar, Siddarayappa; Basinger, Scott; Lay, Oliver

    2012-01-01

    Large aperture telescopes commonly feature segmented mirrors, and a coarse phasing step is needed to bring these individual segments into the fine phasing capture range. Dispersed Fringe Sensing (DFS) is a powerful coarse phasing technique, and a variant of it is currently used for JWST. An Advanced Dispersed Fringe Sensing (ADFS) algorithm was recently developed to improve the performance and robustness of previous DFS algorithms, with better accuracy and a unique solution. The first part of the paper introduces the basic ideas and the essential features of the ADFS algorithm and presents some algorithm sensitivity study results. The second part of the paper describes the full details of the algorithm validation process on the Advanced Wavefront Sensing and Correction Testbed (AWCT): first, the optimization of the DFS hardware of AWCT to ensure data accuracy and reliability is illustrated. Then, a few carefully designed algorithm validation experiments are implemented, and the corresponding data analysis results are shown. Finally, the fiducial calibration using the Range-Gate-Metrology technique is carried out, and a <10 nm or <1% algorithm accuracy is demonstrated.

  9. Assessing Discriminative Performance at External Validation of Clinical Prediction Models.

    Directory of Open Access Journals (Sweden)

    Daan Nieboer

    Full Text Available External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures to judge any changes in c-statistic from development to external validation setting.We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated them in the validation set. We concentrated on two scenarios: 1 the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2 the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury.The permutation test indicated that the validation and development set were homogenous in scenario 1 (in almost all simulated samples and heterogeneous in scenario 2 (in 17%-39% of simulated samples. Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2.The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients.

  10. The LEAP™ Gesture Interface Device and Take-Home Laparoscopic Simulators: A Study of Construct and Concurrent Validity.

    Science.gov (United States)

    Partridge, Roland W; Brown, Fraser S; Brennan, Paul M; Hennessey, Iain A M; Hughes, Mark A

    2016-02-01

    To assess the potential of the LEAP™ infrared motion tracking device to map laparoscopic instrument movement in a simulated environment. Simulator training is optimized when augmented by objective performance feedback. We explore the potential LEAP has to provide this in a way compatible with affordable take-home simulators. LEAP and the previously validated InsTrac visual tracking tool mapped expert and novice performances of a standardized simulated laparoscopic task. Ability to distinguish between the 2 groups (construct validity) and correlation between techniques (concurrent validity) were the primary outcome measures. Forty-three expert and 38 novice performances demonstrated significant differences in LEAP-derived metrics, including instrument path distance. The LEAP device is able to track the movement of hands using instruments in a laparoscopic box simulator. Construct validity is demonstrated by its ability to distinguish novice from expert performances. However, only time and instrument path distance demonstrated concurrent validity with the existing visual tracking method. A number of limitations to the tracking method used by LEAP have been identified. These need to be addressed before it can be considered an alternative to visual tracking for the delivery of objective performance metrics in take-home laparoscopic simulators.

  11. Student mathematical imagination instruments: construction, cultural adaptation and validity

    Science.gov (United States)

    Dwijayanti, I.; Budayasa, I. K.; Siswono, T. Y. E.

    2018-03-01

    Imagination has an important role as the center of students' sensorimotor activity. The purpose of this research is to construct an instrument for students' mathematical imagination in understanding the concept of algebraic expressions. Validity was assessed using questionnaire and test techniques, and the data were analyzed descriptively. The stages performed include: 1) construction of the embodiment of imagination; 2) determination of the learning-style questionnaire; 3) construction of the instruments; 4) translation into Indonesian and adaptation of the learning-style questionnaire content to the students' culture; 5) content validation. The results show that the constructed instrument is valid by both content validation and empirical validation, so it can be used with revisions. Content validation involved Indonesian linguists, English linguists and mathematics content experts. Empirical validation was done through a legibility test (10 students), which showed that in general the language used can be understood. In addition, a questionnaire trial (86 students) was analyzed using the point-biserial correlation technique, yielding 16 valid items, with KR-20 reliability in the medium range. In a trial of the test instrument (32 students), all items were found valid, with a KR-21 reliability of 0.62.
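
    The item-analysis quantities named above are short formulas; a sketch with simulated dichotomous responses (86 simulated students and a hypothetical 0.3 item cut-off) computing point-biserial discrimination and KR-20 reliability:

    ```python
    import numpy as np

    def point_biserial(item, total):
        """Point-biserial correlation between a 0/1 item and the total score."""
        m1, m0 = total[item == 1].mean(), total[item == 0].mean()
        p = item.mean()
        return (m1 - m0) / total.std() * np.sqrt(p * (1 - p))

    def kr20(items):
        """Kuder-Richardson formula 20 reliability for dichotomous items."""
        k = items.shape[1]
        p = items.mean(axis=0)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - np.sum(p * (1 - p)) / total_var)

    rng = np.random.default_rng(7)
    ability = rng.normal(size=86)                       # 86 students, as in the study
    items = (ability[:, None] + rng.normal(size=(86, 20)) > 0).astype(int)

    total = items.sum(axis=1)
    valid = [j for j in range(items.shape[1])
             if point_biserial(items[:, j], total) > 0.3]   # illustrative cut-off
    print(f"valid items: {len(valid)} of 20, KR-20 = {kr20(items):.2f}")
    ```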

  12. Prevalent musculoskeletal pain as a correlate of previous exposure to torture

    DEFF Research Database (Denmark)

    Olsen, Dorte Reff; Montgomery, Edith; Bojholm, S

    2006-01-01

    AIM: To research possible associations between previous exposure to specific torture techniques and prevalent pain in the head and face, back, and feet. METHODS: 221 refugees, 193 males and 28 females, previously exposed to torture in their home country, were subjected to a clinical interview...... was general abuse of the whole body (OR 5.64, 95% CI 1.93-16.45). CONCLUSION: In spite of many factors being potentially co-responsible for prevalent pain, years after the torture took place it presents itself as strongly associated with specific loci of pain, with generalized effects, and with somatizing.

  13. Implementation and Validation of Artificial Intelligence Techniques for Robotic Surgery

    OpenAIRE

    Aarshay Jain; Deepansh Jagotra; Vijayant Agarwal

    2014-01-01

    The primary focus of this study is the implementation of an Artificial Intelligence (AI) technique for developing an inverse kinematics solution for the Raven-II™ surgical research robot [1]. First, the kinematic model of the Raven-II™ robot was analysed along with the proposed analytical solution [2] for the inverse kinematics problem. Next, the Artificial Neural Network (ANN) technique was implemented. The training data for the same was carefully selected by keeping manipulability constraints in mind...
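
    As a stand-in for the Raven-II case, the sketch below trains a small network to learn the inverse kinematics of a planar two-link arm, the usual minimal example of the approach; the architecture and workspace limits are arbitrary choices, not the authors' setup:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(10)

    # Stand-in problem: a planar 2-link arm (L1 = L2 = 1) instead of the
    # Raven-II kinematics; joint limits keep a single solution branch.
    q = rng.uniform([0.2, 0.2], [2.0, 2.0], size=(5000, 2))    # joint angles
    x = np.cos(q[:, 0]) + np.cos(q[:, 0] + q[:, 1])
    y = np.sin(q[:, 0]) + np.sin(q[:, 0] + q[:, 1])
    P = np.column_stack([x, y])

    # Learn the inverse map: end-effector position -> joint angles.
    net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                       random_state=0).fit(P, q)

    q_hat = net.predict(P[:5])
    # Forward-check the predictions: reconstruct positions from q_hat.
    x_hat = np.cos(q_hat[:, 0]) + np.cos(q_hat[:, 0] + q_hat[:, 1])
    y_hat = np.sin(q_hat[:, 0]) + np.sin(q_hat[:, 0] + q_hat[:, 1])
    print("position error:", np.round(np.hypot(x_hat - P[:5, 0], y_hat - P[:5, 1]), 3))
    ```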

  14. Validation of a motion-robust 2D sequential technique for quantification of hepatic proton density fat fraction during free breathing.

    Science.gov (United States)

    Pooler, B Dustin; Hernando, Diego; Ruby, Jeannine A; Ishii, Hiroshi; Shimakawa, Ann; Reeder, Scott B

    2018-04-17

    Current chemical-shift-encoded (CSE) MRI techniques for measuring hepatic proton density fat fraction (PDFF) are sensitive to motion artifacts. Initial validation of a motion-robust 2D-sequential CSE-MRI technique for quantification of hepatic PDFF. Phantom study and prospective in vivo cohort. Fifty adult patients (27 women, 23 men, mean age 57.2 years). 3D, 2D-interleaved, and 2D-sequential CSE-MRI acquisitions at 1.5T. Three CSE-MRI techniques (3D, 2D-interleaved, 2D-sequential) were performed in a PDFF phantom and in vivo. Reference standards were 3D CSE-MRI PDFF measurements for the phantom study and single-voxel MR spectroscopy hepatic PDFF measurements (MRS-PDFF) in vivo. In vivo hepatic MRI-PDFF measurements were performed during a single breath-hold (BH) and free breathing (FB), and were repeated by a second reader for the FB 2D-sequential sequence to assess interreader variability. Correlation plots to validate the 2D-sequential CSE-MRI against the phantom and in vivo reference standards. Bland-Altman analysis of FB versus BH CSE-MRI acquisitions to evaluate robustness to motion. Bland-Altman analysis to assess interreader variability. Phantom 2D-sequential CSE-MRI PDFF measurements demonstrated excellent agreement and correlation (R 2 > 0.99) with 3D CSE-MRI. In vivo, the mean (±SD) hepatic PDFF was 8.8 ± 8.7% (range 0.6-28.5%). Compared with BH acquisitions, FB hepatic PDFF measurements demonstrated bias of +0.15% for 2D-sequential compared with + 0.53% for 3D and +0.94% for 2D-interleaved. 95% limits of agreement (LOA) were narrower for 2D-sequential (±0.99%), compared with 3D (±3.72%) and 2D-interleaved (±3.10%). All CSE-MRI techniques had excellent correlation with MRS (R 2 > 0.97). The FB 2D-sequential acquisition demonstrated little interreader variability, with mean bias of +0.07% and 95% LOA of ± 1.53%. This motion-robust 2D-sequential CSE-MRI can accurately measure hepatic PDFF during free breathing in a patient population with
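
    Bland-Altman agreement, as used above, is a two-line computation; the sketch below applies it to synthetic PDFF-like data constructed only to mimic the reported free-breathing vs breath-hold comparison (the numbers are not the study's data):

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bias and 95% limits of agreement between two measurement methods."""
        diff = a - b
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    rng = np.random.default_rng(8)
    pdff_bh = np.clip(rng.gamma(1.2, 7.0, 50), 0.5, 28.5)   # breath-hold PDFF, %
    pdff_fb = pdff_bh + rng.normal(0.15, 0.5, 50)           # free-breathing PDFF, %

    bias, (lo, hi) = bland_altman(pdff_fb, pdff_bh)
    print(f"bias {bias:+.2f}%  95% LOA [{lo:+.2f}%, {hi:+.2f}%]")
    ```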

  15. Validating MEDIQUAL Constructs

    Science.gov (United States)

    Lee, Sang-Gun; Min, Jae H.

    In this paper, we validate MEDIQUAL constructs through the different media users in help desk service. In previous research, only two end-user constructs were used: assurance and responsiveness. In this paper, we extend the MEDIQUAL constructs to include reliability, empathy, assurance, tangibles, and responsiveness, which are based on the SERVQUAL theory. The results suggest that: 1) the five MEDIQUAL constructs are validated through factor analysis; that is, the constructs have relatively high correlations between measures of the same construct using different methods and low correlations between measures of constructs that are expected to differ; and 2) the five MEDIQUAL constructs are statistically significant for media users' satisfaction in help desk service by regression analysis.

  16. Is email a reliable means of contacting authors of previously published papers? A study of the Emergency Medicine Journal for 2001.

    Science.gov (United States)

    O'Leary, F

    2003-07-01

    To determine whether it is possible to contact authors of previously published papers via email. A cross-sectional study of the Emergency Medicine Journal for 2001. 118 articles were included in the study. The response rate from those with valid email addresses was 73%. There was no statistically significant association between the type of email address used and the address being invalid (p=0.392), or between the type of article and the likelihood of a reply (p=0.197). More responses were obtained from work addresses than from Hotmail addresses (86% v 57%, p=0.02). Email is a valid means of contacting authors of previously published articles, particularly within the emergency medicine specialty. A work-based email address may be a more valid means of contact than a Hotmail address.

  17. Fish age validation by radiometric analysis of otoliths

    International Nuclear Information System (INIS)

    Fenton, G.E.

    1992-01-01

    Radiochemical analysis of aragonitic fish otoliths provides a useful approach to validating ages obtained by more common methods. The history of applications of radiometry using short-lived natural isotopes to clams, Nautilus, living corals and fish otoliths is briefly reviewed. The biogeochemical assumptions required for successful use of these techniques are discussed, and the appropriate mathematical treatments required for data analysis are outlined. Novel normalization techniques designed to widen the validity of this approach are proposed. Desirable lines of further research are also briefly discussed. 38 refs., 1 tab

  18. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    Science.gov (United States)

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method called "Patient Recursive Survival Peeling" is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e. decision-box) and survival estimation. This alternative technique, called "combined" cross-validation is done by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
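
    The "combined" cross-validation idea can be sketched generically: pool the held-out predictions from every fold and compute a single statistic over the pooled set, rather than averaging a per-fold statistic. The sketch below substitutes a plain linear regressor and RMSE for the paper's survival peeling model and survival statistics:

    ```python
    # Hedged sketch of "combined" cross-validation: pool held-out predictions
    # across all folds and compute one summary statistic, instead of averaging
    # one statistic per fold. A plain regressor stands in for the survival
    # peeling model of the paper.
    import numpy as np
    from sklearn.model_selection import KFold
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    y = X @ rng.normal(size=5) + rng.normal(scale=0.5, size=200)

    pooled_pred = np.empty_like(y)
    for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=1).split(X):
        model = LinearRegression().fit(X[train_idx], y[train_idx])
        pooled_pred[test_idx] = model.predict(X[test_idx])  # combine test samples

    # One statistic over the combined test samples (here: RMSE).
    print(np.sqrt(np.mean((pooled_pred - y) ** 2)))
    ```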

  19. Radiological findings for hip dysplasia at skeletal maturity. Validation of digital and manual measurement techniques.

    Science.gov (United States)

    Engesæter, Ingvild Øvstebø; Laborie, Lene Bjerke; Lehmann, Trude Gundersen; Sera, Francesco; Fevang, Jonas; Pedersen, Douglas; Morcuende, José; Lie, Stein Atle; Engesæter, Lars Birger; Rosendahl, Karen

    2012-07-01

    To report on intra-observer, inter-observer, and inter-method reliability and agreement for radiological measurements used in the diagnosis of hip dysplasia at skeletal maturity, as obtained by a manual and a digital measurement technique. Pelvic radiographs from 95 participants (56 females) in a follow-up hip study of 18- to 19-year-old patients were included. Eleven radiological measurements relevant for hip dysplasia (Sharp's, Wiberg's, and Ogata's angles; acetabular roof angle of Tönnis; articulo-trochanteric distance; acetabular depth-width ratio; femoral head extrusion index; maximum teardrop width; and the joint space width in three different locations) were validated. Three observers measured the radiographs using both a digital measurement program and manually in AgfaWeb1000. Inter-method and inter- and intra-observer agreement were analyzed using the mean differences between the readings/readers, establishing the 95% limits of agreement. We also calculated the minimum detectable change and the intra-class correlation coefficient. Large variations among different radiological measurements were demonstrated. However, the variation was not related to the use of either the manual or digital measurement technique. For measurements with greater absolute values (Sharp's angle, femoral head extrusion index, and acetabular depth-width ratio) the inter- and intra-observer and inter-method agreements were better as compared to measurements with lower absolute values (acetabular roof angle, teardrop and joint space width). The inter- and intra-observer variation differs notably across different radiological measurements relevant for hip dysplasia at skeletal maturity, a fact that should be taken into account in clinical practice. The agreement between the manual and digital methods is good.

  20. Radiological findings for hip dysplasia at skeletal maturity. Validation of digital and manual measurement techniques

    Energy Technology Data Exchange (ETDEWEB)

    Engesaeter, Ingvild Oevsteboe [University of Bergen, Department of Surgical Sciences, Bergen (Norway); Haukeland University Hospital, Department of Orthopaedic Surgery, Bergen (Norway); Haukeland University Hospital, Department of Radiology, Bergen (Norway); Haukeland University Hospital, The Norwegian Arthroplasty Register, Department of Orthopaedic Surgery, Bergen (Norway); Laborie, Lene Bjerke; Rosendahl, Karen [University of Bergen, Department of Surgical Sciences, Bergen (Norway); Haukeland University Hospital, Department of Radiology, Bergen (Norway); Lehmann, Trude Gundersen; Fevang, Jonas; Engesaeter, Lars Birger [University of Bergen, Department of Surgical Sciences, Bergen (Norway); Haukeland University Hospital, Department of Orthopaedic Surgery, Bergen (Norway); Sera, Francesco [University College London Institute of Child Health, Medical Research Council Centre of Epidemiology for Child Health, London (United Kingdom); Pedersen, Douglas; Morcuende, Jose [University of Iowa Hospital and Clinics, Department of Orthopaedics and Rehabilitation, Iowa City, IA (United States); Lie, Stein Atle [Uni Health, Uni Research, Bergen (Norway)

    2012-07-15

    To report on intra-observer, inter-observer, and inter-method reliability and agreement for radiological measurements used in the diagnosis of hip dysplasia at skeletal maturity, as obtained by a manual and a digital measurement technique. Pelvic radiographs from 95 participants (56 females) in a follow-up hip study of 18- to 19-year-old patients were included. Eleven radiological measurements relevant for hip dysplasia (Sharp's, Wiberg's, and Ogata's angles; acetabular roof angle of Toennis; articulo-trochanteric distance; acetabular depth-width ratio; femoral head extrusion index; maximum teardrop width; and the joint space width in three different locations) were validated. Three observers measured the radiographs using both a digital measurement program and manually in AgfaWeb1000. Inter-method and inter- and intra-observer agreement were analyzed using the mean differences between the readings/readers, establishing the 95% limits of agreement. We also calculated the minimum detectable change and the intra-class correlation coefficient. Large variations among different radiological measurements were demonstrated. However, the variation was not related to the use of either the manual or digital measurement technique. For measurements with greater absolute values (Sharp's angle, femoral head extrusion index, and acetabular depth-width ratio) the inter- and intra-observer and inter-method agreements were better as compared to measurements with lower absolute values (acetabular roof angle, teardrop and joint space width). The inter- and intra-observer variation differs notably across different radiological measurements relevant for hip dysplasia at skeletal maturity, a fact that should be taken into account in clinical practice. The agreement between the manual and digital methods is good. (orig.)

  1. Validation of the dynamics of SDS and RRS flux, flow, pressure and temperature signals using noise analysis technique

    International Nuclear Information System (INIS)

    Glockler, O.; Cooke, D.F.; Tulett, M.V.

    1995-01-01

    In 1992, a program was initiated to establish reactor noise analysis as a practical tool for plant performance monitoring and system diagnostics in Ontario Hydro's CANDU reactors. Since then, various CANDU-specific noise analysis applications have been developed and validated. The noise-based statistical techniques are being successfully applied as powerful troubleshooting and diagnostic tools to a wide variety of actual operational I and C problems. Critical plant components, instrumentation and processes are monitored on a regular basis, and their dynamic characteristics are verified on-power. Recent applications of noise analysis include (1) validating the dynamics of in-core flux detectors (ICFDs) and ion chambers, (2) estimating the prompt fraction of ICFDs in noise measurements at full power and in power rundown tests, (3) identifying the cause of excessive signal fluctuations in certain flux detectors, (4) validating the dynamic coupling between liquid zone control signals, (5) detecting and monitoring mechanical vibrations of detector tubes, reactivity devices and fuel channels induced by moderator/coolant flow, (6) estimating the dynamics and response time of RTD temperature signals, (7) isolating the cause of RTD signal anomalies, (8) investigating the source of abnormal flow signal behaviour, (9) estimating the overall response time of flow and pressure signals, (10) detecting coolant boiling in fully instrumented fuel channels, (11) monitoring moderator circulation via temperature noise, and (12) predicting the performance of shut-off rods. Some of these applications are performed on an as-needed basis. The noise analysis program, in the Pickering-B station alone, has saved Ontario Hydro millions of dollars during its first three years. The results of the noise analysis program have also been reviewed by the regulator (Atomic Energy Control Board of Canada) with favorable results. The AECB have expressed interest in Ontario Hydro further exploiting the

  2. EVLncRNAs: a manually curated database for long non-coding RNAs validated by low-throughput experiments

    Science.gov (United States)

    Zhao, Huiying; Yu, Jiafeng; Guo, Chengang; Dou, Xianghua; Song, Feng; Hu, Guodong; Cao, Zanxia; Qu, Yuanxu

    2018-01-01

    Abstract Long non-coding RNAs (lncRNAs) play important functional roles in various biological processes. Early databases were utilized to deposit all lncRNA candidates produced by high-throughput experimental and/or computational techniques to facilitate classification, assessment and validation. As more lncRNAs are validated by low-throughput experiments, several databases were established for experimentally validated lncRNAs. However, these databases are small in scale (with a few hundred lncRNAs only) and specific in their focuses (plants, diseases or interactions). Thus, it is highly desirable to have a comprehensive dataset for experimentally validated lncRNAs as a central repository for all of their structures, functions and phenotypes. Here, we established EVLncRNAs by curating lncRNAs validated by low-throughput experiments (up to 1 May 2016) and integrating specific databases (lncRNAdb, LncRNADisease, Lnc2Cancer and PLNlncRbase) with additional functional and disease-specific information not covered previously. The current version of EVLncRNAs contains 1543 lncRNAs from 77 species and is 2.9 times larger than the current largest database for experimentally validated lncRNAs. Seventy-four percent of the lncRNA entries are partially or completely new compared with all existing experimentally validated databases. The established database allows users to browse, search and download as well as to submit experimentally validated lncRNAs. The database is available at http://biophy.dzu.edu.cn/EVLncRNAs. PMID:28985416

  3. Safe pediatric surgery: development and validation of preoperative interventions checklist

    Directory of Open Access Journals (Sweden)

    Maria Paula de Oliveira Pires

    2013-09-01

    Full Text Available OBJECTIVES: this study was aimed at developing and validating a checklist of preoperative pediatric interventions related to the safety of surgical patients. METHOD: methodological study concerning the construction and validation of an instrument with safe preoperative care indicators. The checklist was subject to validation through the Delphi technique, establishing a consensus level of 80%. RESULTS: five professional specialists in the area conducted the validation and a consensus on the content and the construct was reached after two applications of the Delphi technique. CONCLUSION: the "Safe Pediatric Surgery Checklist", simulating the preoperative trajectory of children, is an instrument capable of contributing to the preparation and promotion of safe surgery, as it identifies the presence or absence of measures required to promote patient safety.

  4. Novel Approach for Ensuring Increased Validity in Home Blood Pressure Monitoring

    DEFF Research Database (Denmark)

    Wagner, Stefan Rahr; Toftegaard, Thomas Skjødeberg; Bertelsen, Olav Wedege

    This paper proposes a novel technique to increase the validity of home blood pressure monitoring by using various sensor technologies as part of an intelligent environment platform in the home of the user. A range of recommendations exists on how to obtain a valid blood pressure but with the devi......

  5. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Science.gov (United States)

    2010-07-01

    .... 1607.6 Section 1607.6 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should...

  6. Experimental validation of energy parameters in parabolic trough collector with plain absorber and analysis of heat transfer enhancement techniques

    Science.gov (United States)

    Bilal, F. R.; Arunachala, U. C.; Sandeep, H. M.

    2018-01-01

    The heat loss from the receiver of the Parabolic Trough Collector is considerable, which results in lower thermal efficiency of the system. Hence heat transfer augmentation is essential, and it can be attained by various techniques. An analytical model to evaluate the performance of the system with a bare receiver was developed using MATLAB. Experimental validation of the model resulted in less than 5.5% error in exit temperature using both water and thermic oil as heat transfer fluid. Further, heat transfer enhancement techniques were incorporated in the model, including the use of twisted tape inserts, nanofluid, and a combination of both. It was observed that the use of an evacuated glass cover in the existing setup would increase the useful heat gain by up to 5.3%. Fe3O4/H2O nanofluid showed a maximum enhancement of 56% in the Nusselt number for a volume concentration of 0.6% at the highest Reynolds number. Similarly, twisted tape turbulators (with a twist ratio of 2) used alone with water exhibited a 59% improvement in the Nusselt number. Combining both heat transfer augmentation techniques at their best values yielded a Nusselt number enhancement of up to 87%. It is concluded that the use of twisted tape with water is the best method for heat transfer augmentation, since it gives the maximum effective thermal efficiency for the range of Re considered.

  7. Cross validation for the classical model of structured expert judgment

    International Nuclear Information System (INIS)

    Colson, Abigail R.; Cooke, Roger M.

    2017-01-01

    We update the 2008 TU Delft structured expert judgment database with data from 33 professionally contracted Classical Model studies conducted between 2006 and March 2015 to evaluate its performance relative to other expert aggregation models. We briefly review alternative mathematical aggregation schemes, including harmonic weighting, before focusing on linear pooling of expert judgments with equal weights and performance-based weights. Performance weighting outperforms equal weighting in all but 1 of the 33 studies in-sample. True out-of-sample validation is rarely possible for Classical Model studies, and cross validation techniques that split calibration questions into a training and test set are used instead. Performance weighting incurs an “out-of-sample penalty” and its statistical accuracy out-of-sample is lower than that of equal weighting. However, as a function of training set size, the statistical accuracy of performance-based combinations reaches 75% of the equal weight value when the training set includes 80% of calibration variables. At this point the training set is sufficiently powerful to resolve differences in individual expert performance. The information of performance-based combinations is double that of equal weighting when the training set is at least 50% of the set of calibration variables. Previous out-of-sample validation work used a Total Out-of-Sample Validity Index based on all splits of the calibration questions into training and test subsets, which is expensive to compute and includes small training sets of dubious value. As an alternative, we propose an Out-of-Sample Validity Index based on averaging the product of statistical accuracy and information over all training sets sized at 80% of the calibration set. Performance weighting outperforms equal weighting on this Out-of-Sample Validity Index in 26 of the 33 post-2006 studies; the probability of 26 or more successes on 33 trials if there were no difference between performance

  8. Validation of a gating technique for radiotherapy treatment of lesions affected by respiratory motion; Validacion de una tecnica de gating para el tratamiento con radioterapia externa de lesiones afectadas por el movimiento respiratorio

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Ortega, J.; Castro Tejero, P.

    2011-07-01

    The use of gating techniques for the treatment of lesions affected by respiratory motion may bring an increase in the dose administered to tumors and a decrease in the dose to adjacent healthy organs. The study presented shows the steps taken to validate the respiratory gating technique using the RPM system (Real-time Position Management) from Varian. (Author)

  9. Validation of non-stationary precipitation series for site-specific impact assessment: comparison of two statistical downscaling techniques

    Science.gov (United States)

    Mullan, Donal; Chen, Jie; Zhang, Xunchang John

    2016-02-01

    Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various different SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This infers that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
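
    One of the validation statistics above, wet- and dry-day frequency, is straightforward to compute. A minimal sketch on synthetic placeholder series; the 1 mm wet-day threshold is an assumption, not taken from the paper:

    ```python
    # Illustrative check of one downscaling validation statistic: wet- and
    # dry-day frequencies of a downscaled series versus observations.
    import numpy as np

    def wet_dry_fraction(precip_mm, wet_threshold=1.0):
        """Fraction of wet and dry days under an assumed 1 mm threshold."""
        wet = precip_mm >= wet_threshold
        return wet.mean(), (~wet).mean()

    rng = np.random.default_rng(2)
    observed   = rng.gamma(shape=0.40, scale=6.0, size=3650)   # placeholder series
    downscaled = rng.gamma(shape=0.38, scale=6.3, size=3650)   # placeholder series

    print("observed   wet/dry:", wet_dry_fraction(observed))
    print("downscaled wet/dry:", wet_dry_fraction(downscaled))
    ```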

  10. Poor Validity of the DSM-IV Schizoid Personality Disorder Construct as a Diagnostic Category.

    Science.gov (United States)

    Hummelen, Benjamin; Pedersen, Geir; Wilberg, Theresa; Karterud, Sigmund

    2015-06-01

    This study sought to evaluate the construct validity of schizoid personality disorder (SZPD) by investigating a sample of 2,619 patients from the Norwegian Network of Personality-Focused Treatment Programs using a variety of statistical techniques. Nineteen patients (0.7%) reached the diagnostic threshold of SZPD. Results from the factor analyses indicated that SZPD consists of three factors: social detachment, withdrawal, and restricted affectivity/anhedonia. Overall, internal consistency and diagnostic efficiency were poor, and were best for the criteria that belong to the social detachment factor. These findings pose serious questions about the clinical utility of SZPD as a diagnostic category. On the other hand, the three factors were in concordance with findings from previous studies and with the trait model for personality disorders in DSM-5, supporting the validity of SZPD as a dimensional construct. The authors recommend that SZPD should be deleted as a diagnostic category in future editions of DSM-5.

  11. Stochastic modeling of oligodendrocyte generation in cell culture: model validation with time-lapse data

    Directory of Open Access Journals (Sweden)

    Noble Mark

    2006-05-01

    Full Text Available Abstract Background The purpose of this paper is two-fold. The first objective is to validate the assumptions behind a stochastic model developed earlier by these authors to describe oligodendrocyte generation in cell culture. The second is to generate time-lapse data that may help biomathematicians to build stochastic models of cell proliferation and differentiation under other experimental scenarios. Results Using time-lapse video recording it is possible to follow the individual evolutions of different cells within each clone. This experimental technique is very laborious and cannot replace model-based quantitative inference from clonal data. However, it is unrivalled in validating the structure of a stochastic model intended to describe cell proliferation and differentiation at the clonal level. In this paper, such data are reported and analyzed for oligodendrocyte precursor cells cultured in vitro. Conclusion The results strongly support the validity of the most basic assumptions underpinning the previously proposed model of oligodendrocyte development in cell culture. However, there are some discrepancies; the most important is that the contribution of progenitor cell death to cell kinetics in this experimental system has been underestimated.

  12. A Community Based Study to Test the Reliability and Validity of Physical Activity Measurement Techniques

    Directory of Open Access Journals (Sweden)

    Puneet Misra

    2014-01-01

    Full Text Available Introduction: Physical activity (PA) is protective against non-communicable diseases and it can reduce premature mortality. However, it is difficult to assess the frequency, duration, type and intensity of PA. The global physical activity questionnaire (GPAQ) has been developed by the World Health Organization with the aim of having valid and reliable estimates of PA. The primary aim of this study is to assess the repeatability of the GPAQ instrument and the secondary aim is to validate it against the International Physical Activity Questionnaire (IPAQ) and against an objective measure of PA (i.e., using pedometers) in both rural and peri-urban areas of North India. Methods: A total of 262 subjects were recruited by random selection from Ballabgarh Block of Haryana State in India. For test-retest repeatability of GPAQ and IPAQ, the instruments were administered on two occasions separated by at least 3 days. For concurrent validity, both questionnaires were administered in random order, and for criterion validity, step counters were used. Spearman's correlation coefficient, intra-class correlation (ICC) and Cohen's kappa were used in the analysis. Results: For GPAQ validity, Spearman's rho ranged from 0.40 to 0.59 and ICC ranged from 0.43 to 0.81, while for IPAQ validity, the Spearman correlation coefficient ranged from 0.42 to 0.43 and ICC ranged from 0.56 to 0.68. The observed concurrent validity coefficients suggested that both questionnaires had reasonable agreement (Spearman's rho > 0.90; P < 0.0001; ICC: 0.76-0.91, P < 0.05). Conclusions: GPAQ is similar to IPAQ in measuring PA and can be used for measurement of PA in community settings.
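
    The reliability statistics used above are standard and easy to reproduce. A minimal sketch of test-retest agreement with Spearman's rho for continuous activity scores and Cohen's kappa for activity categories; all values are illustrative placeholders:

    ```python
    # Sketch of test-retest statistics: Spearman's rho for continuous scores
    # and Cohen's kappa for categorical activity levels. Placeholder data.
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import cohen_kappa_score

    met_visit1 = np.array([600, 1200, 2400, 480, 3600, 900])  # MET-min/week, visit 1
    met_visit2 = np.array([660, 1100, 2500, 520, 3300, 840])  # MET-min/week, visit 2

    rho, p = spearmanr(met_visit1, met_visit2)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

    # Agreement on low/moderate/high activity categories between visits.
    cat1 = ["low", "mod", "high", "low", "high", "mod"]
    cat2 = ["low", "mod", "high", "low", "high", "low"]
    print("kappa =", cohen_kappa_score(cat1, cat2))
    ```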

  13. Validation of the sentinel lymph node biopsy technique in head and neck cancers of the oral cavity.

    Science.gov (United States)

    Radkani, Pejman; Mesko, Thomas W; Paramo, Juan C

    2013-12-01

    The purpose of this study was to present our experience and validate the use of sentinel lymph node (SLN) mapping in patients with head and neck cancers. A retrospective review of a prospectively collected database of patients with a diagnosis of squamous cell carcinomas of the head and neck from 2008 to 2011 was done. The group consisted of a total of 20 patients. The first node(s) highlighted with blue, or identified as radioactive by Tc99-sulfur radioactive colloid, was (were) identified as the SLNs. In the first seven patients, formal modified neck dissection was performed. In the remaining 13 patients, only a SLN biopsy procedure was done. At least one SLN was identified in all 20 patients (100%). Only one patient (5%) had positive nodes. In this case, the SLN was also positive. In the remaining 19 cases, all lymph nodes were negative. After an average of 24 months of follow-up, there have been three local recurrences (15%) but no evidence of distant metastatic disease. SLN mapping in head and neck cancers is a feasible technique with a high identification rate and a low false-negative rate. Although the detection rate of regional metastatic disease compares favorably with published data as well as the disease-free and overall survival, further studies are warranted before considering this technique to be the "gold standard" in patients with oral squamous cell carcinoma and a negative neck by clinical examination and imaging studies.

  14. Remarks about the displaced spectra techniques

    International Nuclear Information System (INIS)

    Behringer, K.; Pineyro, J.

    1989-01-01

    In a recent paper a new method, called displaced spectra techniques, was presented for distinguishing between sinusoidal components and narrowband random noise contributions in otherwise random noise data. It is based on Fourier transform techniques, and uses the power spectral density (PSD) and a newly-introduced second-order displaced power spectral density (SDPSD) function. In order to distinguish between the two peak types, a validation criterion has been established. In this note, three topics are covered: a) improved numerical data for the validation criterion are given by using the refined estimation procedure of the PSD and SDPSD functions by the Welch method; b) the validation criterion requires the subtraction of the background below the peaks. A semiautomatic procedure is described; c) it was observed that peaks in the real part of the SDPSD function can be accompanied by fine structure phenomena which are unresolved in the PSD function. A few remarks are made about this problem. (author)

  15. The Myotonometer: Not a Valid Measurement Tool for Active Hamstring Musculotendinous Stiffness.

    Science.gov (United States)

    Pamukoff, Derek N; Bell, Sarah E; Ryan, Eric D; Blackburn, J Troy

    2016-05-01

    Hamstring musculotendinous stiffness (MTS) is associated with lower-extremity injury risk (ie, hamstring strain, anterior cruciate ligament injury) and is commonly assessed using the damped oscillatory technique. However, despite a preponderance of studies that measure MTS reliably in laboratory settings, there are no valid clinical measurement tools. A valid clinical measurement technique is needed to assess MTS and permit identification of individuals at heightened risk of injury and track rehabilitation progress. To determine the validity and reliability of the Myotonometer for measuring active hamstring MTS. Descriptive laboratory study. Laboratory. 33 healthy participants (15 men, age 21.33 ± 2.94 y, height 172.03 ± 16.36 cm, mass 74.21 ± 16.36 kg). Hamstring MTS was assessed using the damped oscillatory technique and the Myotonometer. Intraclass correlations were used to determine the intrasession, intersession, and interrater reliability of the Myotonometer. Criterion validity was assessed via Pearson product-moment correlation between MTS measures obtained from the Myotonometer and from the damped oscillatory technique. The Myotonometer demonstrated good intrasession (ICC(3,1) = .807) and interrater reliability (ICC(2,k) = .830) and moderate intersession reliability (ICC(2,k) = .693). However, it did not provide a valid measurement of MTS compared with the damped oscillatory technique (r = .346, P = .061). The Myotonometer does not provide a valid measure of active hamstring MTS. Although the Myotonometer does not measure active MTS, it possesses good reliability and portability and could be used clinically to measure tissue compliance, muscle tone, or spasticity associated with multiple musculoskeletal disorders. Future research should focus on portable and clinically applicable tools to measure active hamstring MTS in efforts to prevent and monitor injuries.
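
    For context, the damped oscillatory technique referenced above estimates stiffness from the frequency of a limb's damped oscillation after a perturbation. The sketch below uses the common simplification k ≈ 4π²mf² (damping neglected); the sampling rate, effective mass, and signal are all assumed placeholders:

    ```python
    # Simplified sketch of the damped oscillatory technique: stiffness from
    # the damped oscillation frequency, k ~ 4*pi^2*m*f^2 (damping neglected,
    # a common simplification; mass and signal values are assumptions).
    import numpy as np
    from scipy.signal import find_peaks

    fs = 1000.0                          # sampling rate (Hz), assumed
    t = np.arange(0, 1.0, 1.0 / fs)
    accel = np.exp(-3 * t) * np.cos(2 * np.pi * 6.0 * t)  # placeholder trace

    peaks, _ = find_peaks(accel)
    f_damped = fs / np.mean(np.diff(peaks))  # frequency from peak spacing
    m = 4.2                                  # effective limb+load mass (kg), assumed
    k = 4 * np.pi**2 * m * f_damped**2
    print(f"f = {f_damped:.2f} Hz, MTS ~ {k:.0f} N/m")
    ```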

  16. Potential of accuracy profile for method validation in inductively coupled plasma spectrochemistry

    International Nuclear Information System (INIS)

    Mermet, J.M.; Granier, G.

    2012-01-01

    Method validation is usually performed over a range of concentrations for which analytical criteria must be verified. One important criterion in quantitative analysis is accuracy, i.e. the contribution of both trueness and precision. The study of accuracy over this range is called an accuracy profile and provides experimental tolerance intervals. Comparison with acceptability limits fixed by the end user defines a validity domain. This work describes the computation involved in the building of the tolerance intervals, particularly for the intermediate precision with within-laboratory experiments and for the reproducibility with interlaboratory studies. Computation is based on ISO 5725-4 and on previously published work. Moreover, the bias uncertainty is also computed to verify the bias contribution to accuracy. The various types of accuracy profile behavior are exemplified with results obtained by using ICP-MS and ICP-AES. This procedure allows the analyst to define unambiguously a validity domain for a given accuracy. However, because the experiments are time-consuming, the accuracy profile method is mainly dedicated to method validation. - Highlights: ► An analytical method is defined by its accuracy, i.e. both trueness and precision. ► The accuracy as a function of an analyte concentration is an accuracy profile. ► Profile basic concepts are explained for trueness and intermediate precision. ► Profile-based tolerance intervals have to be compared with acceptability limits. ► Typical accuracy profiles are given for both ICP-AES and ICP-MS techniques.
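
    A much-simplified sketch of one accuracy-profile point follows: a β-expectation tolerance interval on recovery at a single concentration level, compared against acceptability limits. The full ISO 5725-based computation separates within- and between-series variance components; this sketch collapses them into a single sample standard deviation:

    ```python
    # Much-simplified accuracy-profile point: beta-expectation tolerance
    # interval on % recovery at one concentration level, checked against
    # assumed acceptability limits.
    import numpy as np
    from scipy.stats import t

    measured = np.array([98.2, 101.5, 99.8, 100.9, 97.6, 100.2])  # % recovery
    beta, limits = 0.80, (90.0, 110.0)  # assumed acceptability limits (%)

    n = measured.size
    mean, s = measured.mean(), measured.std(ddof=1)
    half = t.ppf((1 + beta) / 2, n - 1) * s * np.sqrt(1 + 1 / n)
    lo, hi = mean - half, mean + half

    print(f"tolerance interval: [{lo:.1f}, {hi:.1f}] %")
    print("within validity domain:", limits[0] <= lo and hi <= limits[1])
    ```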

  17. Fuel reprocessing data validation using the isotope correlation technique

    International Nuclear Information System (INIS)

    Persiani, P.J.; Bucher, R.G.; Pond, R.B.; Cornella, R.J.

    1990-01-01

    The Isotope Correlation Technique (ICT), in conjunction with the gravimetric (Pu/U ratio) method for mass determination, provides an independent verification of the input accountancy at the dissolver or accountancy stage of the reprocessing plant. The Isotope Correlation Technique has been applied to many classes of domestic and international reactor systems (light-water, heavy-water, and graphite reactors) operating in a variety of modes (power, research, and production reactors), and for a variety of reprocessing fuel cycle management strategies. Analysis of reprocessing operations data, based on isotopic correlations derived for assemblies in a PWR environment and fuel management scheme, yielded differences between the measurement-derived and ICT-derived plutonium mass determinations of (-0.02 ± 0.23)% for the measured U-235 and (+0.50 ± 0.31)% for the measured Pu-239, for a core campaign. The ICT analysis has also been implemented for the plutonium isotopics in a depleted uranium assembly in a heavy-water, enriched-uranium system and for the uranium isotopes in the fuel assemblies of light-water, highly-enriched systems.

  18. Isotopic and criticality validation for actinide-only burnup credit

    International Nuclear Information System (INIS)

    Fuentes, E.; Lancaster, D.; Rahimi, M.

    1997-01-01

    The techniques used for actinide-only burnup credit isotopic validation and criticality validation are presented and discussed. Trending analyses have been incorporated into both methodologies, requiring biases and uncertainties to be treated as a function of the trending parameters. The isotopic validation is demonstrated using the SAS2H module of SCALE 4.2, with the 27BURNUPLIB cross section library; correction factors are presented for each of the actinides in the burnup credit methodology. For the criticality validation, the demonstration is performed with the CSAS module of SCALE 4.2 and the 27BURNUPLIB, resulting in a validated upper safety limit

  19. An Improved Fuzzy Based Missing Value Estimation in DNA Microarray Validated by Gene Ranking

    Directory of Open Access Journals (Sweden)

    Sujay Saha

    2016-01-01

    Full Text Available Most of the gene expression data analysis algorithms require the entire gene expression matrix without any missing values. Hence, it is necessary to devise methods which would impute missing data values accurately. There exist a number of imputation algorithms to estimate those missing values. This work starts with a microarray dataset containing multiple missing values. We first apply a modified version of the existing fuzzy-theory-based method LRFDVImpute to impute multiple missing values of time series gene expression data, and then validate the result of imputation by a genetic algorithm (GA) based gene ranking methodology along with some regular statistical validation techniques, such as the RMSE method. Gene ranking, to the best of our knowledge, has not yet been used to validate the result of missing value estimation. Firstly, the proposed method has been tested on the very popular Spellman dataset, and results show that error margins have been drastically reduced compared to some previous works, which indirectly validates the statistical significance of the proposed method. Then it has been applied on four other 2-class benchmark datasets, namely the Colorectal Cancer tumours dataset (GDS4382), the Breast Cancer dataset (GSE349-350), the Prostate Cancer dataset, and DLBCL-FL (Leukaemia), for both missing value estimation and ranking the genes, and the results show that the proposed method can reach 100% classification accuracy with very few dominant genes, which indirectly validates the biological significance of the proposed method.
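
    The RMSE validation step above can be sketched by hiding known values, imputing them, and scoring the estimates against the hidden truth. A column-mean imputer stands in for LRFDVImpute here, and the expression matrix is synthetic:

    ```python
    # Minimal sketch of RMSE validation of imputation: mask known entries,
    # impute them (column means stand in for LRFDVImpute), score vs truth.
    import numpy as np

    rng = np.random.default_rng(3)
    truth = rng.normal(size=(50, 20))            # placeholder expression matrix
    mask = rng.random(truth.shape) < 0.05        # 5% artificially missing
    data = np.where(mask, np.nan, truth)

    col_means = np.nanmean(data, axis=0)         # stand-in imputation rule
    imputed = np.where(mask, col_means, data)

    rmse = np.sqrt(np.mean((imputed[mask] - truth[mask]) ** 2))
    print(f"imputation RMSE = {rmse:.3f}")
    ```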

  20. Measurement of migration of soft tissue by modified Roentgen stereophotogrammetric analysis (RSA): validation of a new technique to monitor rotator cuff tears.

    Science.gov (United States)

    Cashman, P M M; Baring, T; Reilly, P; Emery, R J H; Amis, A A

    2010-04-01

    The purpose of this study was to develop a technique to use Roentgen stereophotogrammetric analysis (RSA) to measure migration of soft-tissue structures after rotator cuff repair. RSA stereo films were obtained; images were analysed using a semi-automatic software program allowing 3D viewing of results. RSA imaging experiments were performed to validate the technique, using a glass phantom with implanted RSA beads and an animal model with steel sutures as RSA markers which were moved known distances. Repeated measurements allowed assessment of inter- and intra-observer variability at a maximum of 1.06 mm. RSA analysis of the phantom showed a variation up to 0.22 mm for static and 0.28 mm for dynamic studies. The ovine tissue specimen demonstrated that using steel sutures as RSA markers in soft tissue is feasible, although less accurate than when measuring bone motion. This novel application of RSA to measure soft tissue migration is practicable and can be extended to in vivo studies.

  1. Validation of a Low Cost, Disposable, and Ultrasound-guided Suprapubic Catheter Insertion Trainer.

    Science.gov (United States)

    Nonde, James; Adam, Ahmed; Laher, Abdullah Ebrahim

    2018-02-27

    To validate the newly designed ultrasound-guided suprapubic catheter insertion trainer (US-SCIT) model against real-life experience by enrolling participants with prior confidence in the technique of US-guided suprapubic catheter (SPC) insertion. The US-SCIT was self-constructed from common disposables and equipment found in the emergency department. A validation questionnaire was completed by all participants after SPC insertion on the US-SCIT model. Fifty participants enrolled in the study. Each participant had reported confidence in the SPC insertion technique prior to participation in this study. There were 13 "super-users" (>65 previous successful real-life SPC insertions) in the study. The total material cost per US-SCIT unit was 1.71 USD. The US-SCIT's value in understanding the principles of US-guided SPC insertion had a mean score of 8.86 (standard deviation [SD] 1.03), whereas its value in simulating contextual anatomy had a mean score of 8.26 (SD 1.48). The mean score of the model's ability to provide realistic sensory feedback was 8.12 (SD 1.78), whereas that of realism of initial urine outflow was 9.06 (SD 1.20). Simulation with the model compared well with real-life SPC insertion, with a mean score of 8.30 (SD 1.48). The US-SCIT model performed well in the various spheres developed to assess its ability to simulate real-life SPC insertion. We are confident that this low-cost, validated, US-compatible SPC trainer, constructed from common material present in the ED, will be a valuable learning asset to trainees across the globe. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Lithium-Ion Cell Fault Detection by Single-Point Impedance Diagnostic and Degradation Mechanism Validation for Series-Wired Batteries Cycled at 0 °C

    Directory of Open Access Journals (Sweden)

    Corey T. Love

    2018-04-01

    Full Text Available The utility of a single-point impedance-based technique to monitor the state-of-health of a pack of four 18650 lithium-ion cells wired in series (4S was demonstrated in a previous publication. This work broadens the applicability of the single-point monitoring technique to identify temperature induced faults within 4S packs at 0 °C by two distinct discharge cut-off thresholds: individual cell cut-off and pack voltage cut-off. The results show how the single-point technique applied to a 4S pack can identify cell faults induced by low temperature degradation when plotted on a unique state-of-health map. Cell degradation is validated through an extensive incremental capacity technique to quantify capacity loss due to low temperature cycling and investigate the underpinnings of cell failure.
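
    The incremental capacity technique mentioned above differentiates capacity with respect to voltage (dQ/dV) along a discharge curve; shifts in the resulting peaks indicate degradation modes. A hedged sketch on a synthetic curve, not cell data from the study:

    ```python
    # Hedged sketch of incremental capacity (dQ/dV) analysis on a synthetic
    # discharge curve (not Li-ion data from the study).
    import numpy as np

    v = np.linspace(4.2, 3.0, 500)                         # terminal voltage (V)
    q = 2.5 * (1 - (v - 3.0) / 1.2) + 0.1 * np.sin(4 * v)  # capacity (Ah), fake
    dq_dv = np.gradient(q, v)                              # incremental capacity

    # Simple moving-average smoothing before peak inspection.
    kernel = np.ones(15) / 15
    dq_dv_smooth = np.convolve(dq_dv, kernel, mode="same")
    print("peak |dQ/dV| at V =", v[np.argmax(np.abs(dq_dv_smooth))])
    ```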

  3. Static Validation of a Voting Protocol

    DEFF Research Database (Denmark)

    Nielsen, Christoffer Rosenkilde; Andersen, Esben Heltoft; Nielson, Hanne Riis

    2005-01-01

    is formalised in an extension of the LySa process calculus with blinding signatures. The analysis, which is fully automatic, pinpoints previously undiscovered flaws related to verifiability and accuracy and we suggest modifications of the protocol needed for validating these properties....

  4. The development of digital monitoring technique

    International Nuclear Information System (INIS)

    Koo, In Soo; Kim, D. H.; Kim, J. S.; Kim, C. H.; Kim, G. O.; Park, H. Y.; Suh, S. Y.; Sung, S. H.; Song, S. J.; Lee, C. K.; Jang, G. S.; Hur, S.

    1997-08-01

    A study has been performed on advanced DSP technology for digital nuclear I and C systems and on monitoring and diagnosis techniques for the integrity of high-pressure structures in the NSSS. In the development of the advanced DSP technology, real-time processing, communication networks and signal validation were selected as the essential technologies of digital signal processing, and the requirements and methodology for the application of these technologies in NPPs were established through technical analysis. Based on these results, the DPIS and the signal validation algorithm were developed. For real-time processing, the necessary requirements were defined and the methodology of real-time software modeling was developed. For the communication network, the requirements were extracted, and the methodology for selecting the communication technique and the development procedure were established. Functions, requirements, structure and technical specifications were developed for the DPIS, and a real-time signal validation algorithm was developed and implemented for signal validation. In a study on monitoring techniques for abnormal conditions, test and experimental facilities were set up in order to carry out the required tests during the research activities. Studies concentrated on how to acquire proper vibration or emission signals from mechanical structures and equipment, and how to diagnose effectively the abnormal conditions of high-pressure structure integrity. Algorithms for automatic signal analysis and diagnosis of abnormal conditions have been developed in this study to assist the operator's monitoring and diagnosis activities on structure integrity using new technologies. (author). 23 refs., 68 tabs., 196 figs

  5. The development of digital monitoring technique

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Kim, D. H.; Kim, J. S.; Kim, C. H.; Kim, G. O.; Park, H. Y.; Suh, S. Y.; Sung, S. H.; Song, S. J.; Lee, C. K.; Jang, G. S.; Hur, S.

    1997-08-01

    A study has been performed on advanced DSP technology for digital nuclear I and C systems and on monitoring and diagnosis techniques for the integrity of high-pressure structures in the NSSS. In the development of the advanced DSP technology, real-time processing, communication networks and signal validation were selected as the essential technologies of digital signal processing, and the requirements and methodology for the application of these technologies in NPPs were established through technical analysis. Based on these results, the DPIS and the signal validation algorithm were developed. For real-time processing, the necessary requirements were defined and the methodology of real-time software modeling was developed. For the communication network, the requirements were extracted, and the methodology for selecting the communication technique and the development procedure were established. Functions, requirements, structure and technical specifications were developed for the DPIS, and a real-time signal validation algorithm was developed and implemented for signal validation. In a study on monitoring techniques for abnormal conditions, test and experimental facilities were set up in order to carry out the required tests during the research activities. Studies concentrated on how to acquire proper vibration or emission signals from mechanical structures and equipment, and how to diagnose effectively the abnormal conditions of high-pressure structure integrity. Algorithms for automatic signal analysis and diagnosis of abnormal conditions have been developed in this study to assist the operator's monitoring and diagnosis activities on structure integrity using new technologies. (author). 23 refs., 68 tabs., 196 figs.

  6. Validation of NAA Method for Urban Particulate Matter

    International Nuclear Information System (INIS)

    Woro Yatu Niken Syahfitri; Muhayatun; Diah Dwiana Lestiani; Natalia Adventini

    2009-01-01

    Nuclear analytical techniques have been applied in many countries for the determination of environmental pollutants. NAA (neutron activation analysis) is a nuclear analytical technique that has low detection limits, high specificity, high precision and accuracy for the large majority of naturally occurring elements, offers non-destructive and simultaneous multi-element determination, and can handle small sample sizes (< 1 mg). To ensure the quality and reliability of the method, validation needs to be performed. A standard reference material, SRM NIST 1648 Urban Particulate Matter, has been used to validate the NAA method. Accuracy and precision tests were used as validation parameters. The particulate matter was validated for 18 elements: Ti, I, V, Br, Mn, Na, K, Cl, Cu, Al, As, Fe, Co, Zn, Ag, La, Cr, and Sm. The results showed that the percent relative standard deviation of the measured elemental concentrations ranged from 2 to 14.8% for most of the elements analyzed, whereas the HorRat values were in the range 0.3-1.3. Accuracy test results showed that the relative bias ranged from -11.1 to 3.6%. Based on the validation results, it can be stated that the NAA method is reliable for the characterization of particulate matter and other similar matrix samples to support air quality monitoring. (author)

  7. 3D structure tensor analysis of light microscopy data for validating diffusion MRI.

    Science.gov (United States)

    Khan, Ahmad Raza; Cornea, Anda; Leigland, Lindsey A; Kohama, Steven G; Jespersen, Sune Nørhøj; Kroenke, Christopher D

    2015-05-01

    Diffusion magnetic resonance imaging (d-MRI) is a powerful non-invasive and non-destructive technique for characterizing brain tissue on the microscopic scale. However, the lack of validation of d-MRI by independent experimental means poses an obstacle to accurate interpretation of data acquired using this method. Recently, structure tensor analysis has been applied to light microscopy images, and this technique holds promise to be a powerful validation strategy for d-MRI. Advantages of this approach include its similarity to d-MRI in terms of averaging the effects of a large number of cellular structures, and its simplicity, which enables it to be implemented in a high-throughput manner. However, a drawback of previous implementations of this technique arises from it being restricted to 2D. As a result, structure tensor analyses have been limited to tissue sectioned in a direction orthogonal to the direction of interest. Here we describe the analytical framework for extending structure tensor analysis to 3D, and utilize the results to analyze serial image "stacks" acquired with confocal microscopy of rhesus macaque hippocampal tissue. Implementation of 3D structure tensor procedures requires removal of sources of anisotropy introduced in tissue preparation and confocal imaging. This is accomplished with image processing steps to mitigate the effects of anisotropic tissue shrinkage, and the effects of anisotropy in the point spread function (PSF). In order to address the latter confound, we describe procedures for measuring the dependence of PSF anisotropy on distance from the microscope objective within tissue. Prior to microscopy, ex vivo d-MRI measurements performed on the hippocampal tissue revealed three regions of tissue with mutually orthogonal directions of least restricted diffusion that correspond to CA1, alveus and inferior longitudinal fasciculus. We demonstrate the ability of 3D structure tensor analysis to identify structure tensor orientations that
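
    In outline, a 3D structure tensor is built from Gaussian-smoothed outer products of the intensity gradient, followed by a per-voxel eigen-decomposition. The sketch below (on a synthetic volume, with an assumed integration scale) shows the volume-averaged case; it omits the tissue-shrinkage and PSF-anisotropy corrections the authors describe:

    ```python
    # Sketch of a 3D structure tensor: smoothed outer products of intensity
    # gradients, then an eigen-decomposition (shown for the volume-averaged
    # tensor). The volume and integration scale are assumptions.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(4)
    vol = gaussian_filter(rng.normal(size=(64, 64, 64)), sigma=2)  # placeholder stack

    gz, gy, gx = np.gradient(vol)
    grads = [gz, gy, gx]

    # Structure tensor components J_ij = G_sigma * (d_i I * d_j I).
    sigma = 3.0  # integration scale, assumed
    J = np.empty((3, 3) + vol.shape)
    for i in range(3):
        for j in range(3):
            J[i, j] = gaussian_filter(grads[i] * grads[j], sigma)

    # Dominant orientation of the volume-averaged tensor; the eigenvector of
    # the smallest eigenvalue points along the least-varying (fiber) direction.
    J_mean = J.reshape(3, 3, -1).mean(axis=-1)
    w, V = np.linalg.eigh(J_mean)
    print("fiber-like direction:", V[:, 0])
    ```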

  8. Field validity and feasibility of four techniques for the detection of Trichuris in simians: a model for monitoring drug efficacy in public health?

    Directory of Open Access Journals (Sweden)

    Bruno Levecke

    Full Text Available BACKGROUND: Soil-transmitted helminths, such as Trichuris trichiura, are of major concern in public health. Current efforts to control these helminth infections involve periodic mass treatment in endemic areas. Since these large-scale interventions are likely to intensify, monitoring the drug efficacy will become indispensable. However, studies comparing detection techniques based on sensitivity, fecal egg counts (FEC), feasibility for mass diagnosis and drug efficacy estimates are scarce. METHODOLOGY/PRINCIPAL FINDINGS: In the present study, the ether-based concentration, the Parasep Solvent Free (SF), the McMaster and the FLOTAC techniques were compared based on both validity and feasibility for the detection of Trichuris eggs in 100 fecal samples of nonhuman primates. In addition, the drug efficacy estimates of the quantitative techniques were examined using a statistical simulation. Trichuris eggs were found in 47% of the samples. FLOTAC was the most sensitive technique (100%), followed by the Parasep SF (83.0% [95% confidence interval (CI): 82.4-83.6%]) and the ether-based concentration technique (76.6% [95% CI: 75.8-77.3%]). McMaster was the least sensitive (61.7% [95% CI: 60.7-62.6%]) and failed to detect low FEC. The quantitative comparison revealed a positive correlation between the four techniques (Rs = 0.85-0.93; p<0.0001). However, the ether-based concentration technique and the Parasep SF detected significantly fewer eggs than both the McMaster and the FLOTAC (p<0.0083). Overall, the McMaster was the most feasible technique (3.9 min/sample for preparing, reading and cleaning of the apparatus), followed by the ether-based concentration technique (7.7 min/sample) and the FLOTAC (9.8 min/sample). Parasep SF was the least feasible (17.7 min/sample). The simulation revealed that the sensitivity is less important for monitoring drug efficacy and that both FLOTAC and McMaster were reliable estimators. CONCLUSIONS/SIGNIFICANCE: The results of this study

  9. Validation of diffuse correlation spectroscopy sensitivity to nicotinamide-induced blood flow elevation in the murine hindlimb using the fluorescent microsphere technique

    Science.gov (United States)

    Proctor, Ashley R.; Ramirez, Gabriel A.; Han, Songfeng; Liu, Ziping; Bubel, Tracy M.; Choe, Regine

    2018-03-01

    Nicotinamide has been shown to affect blood flow in both tumor and normal tissues, including skeletal muscle. Intraperitoneal injection of nicotinamide was used as a simple intervention to test the sensitivity of noninvasive diffuse correlation spectroscopy (DCS) to changes in blood flow in the murine left quadriceps femoris skeletal muscle. DCS was then compared with the gold-standard fluorescent microsphere (FM) technique for validation. The nicotinamide dose-response experiment showed that relative blood flow measured by DCS increased following treatment with 500- and 1000-mg / kg nicotinamide. The DCS and FM technique comparison showed that blood flow index measured by DCS was correlated with FM counts quantified by image analysis. The results of this study show that DCS is sensitive to nicotinamide-induced blood flow elevation in the murine left quadriceps femoris. Additionally, the results of the comparison were consistent with similar studies in higher-order animal models, suggesting that mouse models can be effectively employed to investigate the utility of DCS for various blood flow measurement applications.

  10. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  11. Comparison of a new expert elicitation model with the Classical Model, equal weights and single experts, using a cross-validation technique

    Energy Technology Data Exchange (ETDEWEB)

    Flandoli, F. [Dip.to di Matematica Applicata, Universita di Pisa, Pisa (Italy); Giorgi, E. [Dip.to di Matematica Applicata, Universita di Pisa, Pisa (Italy); Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Pisa, via della Faggiola 32, 56126 Pisa (Italy); Aspinall, W.P. [Dept. of Earth Sciences, University of Bristol, and Aspinall and Associates, Tisbury (United Kingdom); Neri, A., E-mail: neri@pi.ingv.it [Istituto Nazionale di Geofisica e Vulcanologia, Sezione di Pisa, via della Faggiola 32, 56126 Pisa (Italy)

    2011-10-15

    The problem of ranking and weighting experts' performances when quantitative judgments are being elicited for decision support is considered. A new scoring model, the Expected Relative Frequency model, is presented, based on the closeness between central values provided by the expert and known values used for calibration. Using responses from experts in five different elicitation datasets, a cross-validation technique is used to compare this new approach with the Cooke Classical Model, the Equal Weights model, and individual experts. The analysis is performed using alternative reward schemes designed to capture proficiency either in quantifying uncertainty, or in estimating true central values. Results show that although there is only a limited probability that one approach is consistently better than another, the Cooke Classical Model is generally the most suitable for assessing uncertainties, whereas the new ERF model should be preferred if the goal is central value estimation accuracy. - Highlights: ► A new expert elicitation model, named Expected Relative Frequency (ERF), is presented. ► A cross-validation approach to evaluate the performance of different elicitation models is applied. ► The new ERF model shows the best performance with respect to the point-wise estimates.

  12. Service Providers’ Willingness to Change as Innovation Inductor in Services: Validating a Scale

    Directory of Open Access Journals (Sweden)

    Marina Figueiredo Moreir

    2016-12-01

    Full Text Available This study explores the willingness of service providers to incorporate changes suggested by clients that alter previously planned services during their delivery, hereby named Willingness to Change in Services [WCS]. We apply qualitative research techniques to map seven dimensions related to this phenomenon: Client relationship management; Organizational conditions for change; Software characteristics and development; Conditions affecting teams; Administrative procedures and decision-making conditions; Entrepreneurial behavior; Interaction with supporting organizations. These dimensions have been converted into variables composing a WCS scale, which was later submitted to theoretical and semantic validations. The scale of 26 variables resulting from these procedures was applied in a large survey of 351 typical Brazilian software development service companies operating all over the country. Data from our sample have been submitted to multivariate statistical analysis to provide validation for the scale. After factorial analysis procedures, 24 items have been validated and assigned to three factors representative of WCS: Organizational Routines and Values – 12 variables; Organizational Structure for Change – 6 variables; and Service Specificities – 6 variables. As future contributions, we expect to see further testing of the WCS scale on alternative service activities to provide evidence about its limits and contributions to general service innovation theory.

  13. Effort, symptom validity testing, performance validity testing and traumatic brain injury.

    Science.gov (United States)

    Bigler, Erin D

    2014-01-01

    To understand the neurocognitive effects of brain injury, valid neuropsychological test findings are paramount. This review examines the research on what has been referred to as symptom validity testing (SVT). Performance above a designated cut-score signifies a 'passing' SVT result, which is likely the best indicator of valid neuropsychological test findings. Likewise, performance substantially below the cut-point, near or at chance, signifies invalid test performance. Performance significantly below chance is the sine qua non neuropsychological indicator of malingering. However, the interpretative problems with SVT performance below the cut-point yet far above chance are substantial, as pointed out in this review. This intermediate, border-zone performance on SVT measures is where substantial interpretative challenges exist. Case studies are used to highlight the many areas where additional research is needed. Historical perspectives are reviewed along with the neurobiology of effort. Reasons why the term performance validity testing (PVT) may be preferable to SVT are reviewed. Advances in neuroimaging techniques may be key to better understanding the meaning of border-zone SVT failure. The review demonstrates the problems with rigid interpretation of established cut-scores. A better understanding is needed of how certain types of neurological, neuropsychiatric and/or even test conditions may affect SVT performance.
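
    The 'significantly below chance' criterion lends itself to a simple worked example. The sketch below, a hypothetical illustration rather than any published SVT's scoring rule, applies a one-sided binomial test to a 50-item two-alternative forced-choice task:

    ```python
    # Hedged sketch: is a forced-choice SVT score significantly below chance?
    # 50 two-alternative items, so chance performance is 25 correct (p = 0.5).
    from scipy.stats import binomtest

    n_items, n_correct = 50, 15
    result = binomtest(n_correct, n_items, p=0.5, alternative="less")
    print(f"P(score <= {n_correct} | guessing) = {result.pvalue:.4f}")
    # A p-value this small suggests below-chance responding, the classic
    # indicator of intentional wrong answers rather than random guessing.
    ```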

  14. An integrated approach for signal validation in nuclear power plants

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Kerlin, T.W.; Gloeckler, O.; Frei, Z.; Qualls, L.; Morgenstern, V.

    1987-08-01

    A signal validation system, based on several parallel signal processing modules, is being developed at the University of Tennessee. The major modules perform (1) general consistency checking (GCC) of a set of redundant measurements, (2) multivariate data-driven modeling of dynamic signal components for maloperation detection, (3) process empirical modeling for prediction and redundancy generation, (4) jump, pulse, and noise detection, and (5) expert-system-based qualitative signal validation. A central database stores information related to sensors, diagnostic rules, past system performance, subsystem models, etc. We are primarily concerned with signal validation during steady-state operation and slow degradations; in general, however, the different modules will perform signal validation during all operating conditions. The techniques have been successfully tested using a PWR steam generator simulation, and efforts are currently underway in applying the techniques to Millstone-III operational data. These methods could be implemented in advanced reactors, including advanced liquid metal reactors.
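
    A minimal sketch of the first module's idea, general consistency checking of redundant measurements, is given below; the median-band rule and all thresholds are illustrative assumptions, not the Tennessee system's actual algorithm:

    ```python
    # Minimal sketch of a general consistency check (GCC) over redundant sensors:
    # flag any channel that deviates from the median of its redundancy group by
    # more than an allowed band. Thresholds and data are illustrative only.
    import numpy as np

    def gcc(readings: np.ndarray, band: float) -> np.ndarray:
        """readings: samples x redundant-channels. Returns a mask of suspect channels."""
        median = np.median(readings, axis=1, keepdims=True)   # robust consensus estimate
        deviation = np.abs(readings - median)
        return (deviation > band).mean(axis=0) > 0.05         # suspect if >5% of samples inconsistent

    # Four redundant steam-generator level channels; channel 3 drifts slowly.
    rng = np.random.default_rng(2)
    x = 50 + rng.normal(0, 0.2, size=(1000, 4))
    x[:, 3] += np.linspace(0, 2.0, 1000)                      # simulated slow degradation
    print(gcc(x, band=0.8))                                   # -> [False False False  True]
    ```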

  15. Microsoft Visio 2013 business process diagramming and validation

    CERN Document Server

    Parker, David

    2013-01-01

    Microsoft Visio 2013 Business Process Diagramming and Validation provides a comprehensive and practical tutorial including example code and demonstrations for creating validation rules, writing ShapeSheet formulae, and much more. If you are a Microsoft Visio 2013 Professional Edition power user or developer who wants to get to grips with both the essential features of Visio 2013 and the validation rules in this edition, then this book is for you. A working knowledge of Microsoft Visio and optionally .NET for the add-on code is required, though previous knowledge of business process diagramming

  16. Intelligence, previous convictions and interrogative suggestibility: a path analysis of alleged false-confession cases.

    Science.gov (United States)

    Sharrock, R; Gudjonsson, G H

    1993-05-01

    The main purpose of this study was to investigate the relationship between interrogative suggestibility and previous convictions among 108 defendants in criminal trials, using a path analysis technique. It was hypothesized that previous convictions, which may provide defendants with interrogative experiences, would correlate negatively with 'shift' as measured by the Gudjonsson Suggestibility Scale (Gudjonsson, 1984a), after intelligence and memory had been controlled for. The hypothesis was partially confirmed and the theoretical and practical implications of the findings are discussed.
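
    For readers unfamiliar with the method, a path analysis of this kind can be sketched as a series of ordinary least-squares regressions on standardized variables, where the betas are the path coefficients. The snippet below uses synthetic data with hypothetical effect sizes, not the study's defendant sample:

    ```python
    # Minimal sketch of path analysis via regressions on standardized variables.
    import numpy as np

    def standardize(v):
        return (v - v.mean()) / v.std(ddof=1)

    rng = np.random.default_rng(3)
    n = 108
    iq = standardize(rng.normal(size=n))
    memory = standardize(0.6 * iq + rng.normal(size=n))
    convictions = standardize(-0.2 * iq + rng.normal(size=n))
    shift = standardize(-0.3 * convictions - 0.2 * memory + rng.normal(size=n))

    def path_coefficients(y, predictors):
        """OLS betas on standardized variables = path coefficients."""
        X = np.column_stack(predictors)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta

    # Path: convictions -> 'shift', controlling for intelligence and memory.
    print(path_coefficients(shift, [convictions, iq, memory]))
    ```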

  17. Development and validation of three-dimensional CFD techniques for reactor safety applications. Final report

    International Nuclear Information System (INIS)

    Buchholz, Sebastian; Palazzo, Simone; Papukchiev, Angel; Scheurer, Martina

    2016-12-01

    The overall goal of the project RS 1506, ''Development and Validation of Three-Dimensional CFD Methods for Reactor Safety Applications'', is the validation of Computational Fluid Dynamics (CFD) software for the simulation of three-dimensional thermal-hydraulic heat and fluid flow phenomena in nuclear reactors. For this purpose, a wide spectrum of validation and test cases was selected, covering fluid flow and heat transfer phenomena in the downcomer and in the core of pressurized water reactors. In addition, the coupling of the system code ATHLET with the CFD code ANSYS CFX was further developed and validated. The first choice was the UPTF experiments, where turbulent single- and two-phase flows were investigated in a 1:1 scaled model of a German KONVOI reactor. The scope of the CFD calculations covers thermal mixing and stratification, including condensation in single- and two-phase flows. In the complex core region, the flow in a fuel assembly with spacer grid was simulated as defined in the OECD/NEA benchmark MATIS-H. Good agreement is achieved when the geometrical and physical boundary conditions are reproduced as realistically as possible. This includes, in particular, the consideration of heat transfer to walls. The influence of wall modelling on CFD results was investigated on the TALL-3D T01 experiment. In this case, dynamic three-dimensional fluid flow and heat transfer phenomena were simulated in a Generation IV liquid-metal-cooled reactor. Concurrently with the validation work, the coupling of the system code ATHLET with the ANSYS CFX software was optimized and expanded for two-phase flows. Different coupling approaches were investigated in order to overcome the large difference between the CPU-time requirements of system and CFD codes. Finally, the coupled simulation system was validated by applying it to the simulation of the PSI double T-junction experiment, the LBE-flow in the MYRRA Spallation experiment and a demonstration test case simulating a pump trip.

  18. Model Validation Using Coordinate Distance with Performance Sensitivity

    Directory of Open Access Journals (Sweden)

    Jiann-Shiun Lew

    2008-01-01

    This paper presents an innovative approach to model validation for a structure with significant parameter variations. Model uncertainty of the structural dynamics is quantified with the use of a singular value decomposition technique to extract the principal components of parameter change, and an interval model is generated to represent the system with parameter uncertainty. The coordinate vector of the validation system, corresponding to the identified principal directions, is computed. The coordinate distance between the validation system and the identified interval model is used as a metric for model validation. A beam structure with an attached subsystem, which has significant parameter uncertainty, is used to demonstrate the proposed approach.
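
    The coordinate-distance metric lends itself to a compact sketch. Below, an SVD of hypothetical parameter samples yields the principal directions, the identified systems' coordinates define interval bounds, and a validation system is scored by how far its coordinates fall outside the box; the box-distance rule is an illustrative reading of the abstract, not the paper's exact formulation:

    ```python
    # Minimal sketch of the coordinate-distance idea: SVD for principal directions
    # of parameter variation, interval bounds on coordinates, distance metric.
    import numpy as np

    rng = np.random.default_rng(4)
    params = rng.normal(size=(20, 5))            # 20 identified systems x 5 parameters
    mean = params.mean(axis=0)
    U, s, Vt = np.linalg.svd(params - mean, full_matrices=False)

    coords = (params - mean) @ Vt.T              # coordinates along principal directions
    lo, hi = coords.min(axis=0), coords.max(axis=0)   # interval model bounds

    def coordinate_distance(system: np.ndarray) -> float:
        c = (system - mean) @ Vt.T
        # Distance outside the interval box; zero if inside.
        return float(np.linalg.norm(np.maximum(0, np.maximum(lo - c, c - hi))))

    print(coordinate_distance(mean))             # 0.0: inside the interval model
    print(coordinate_distance(mean + 5 * Vt[0])) # large: fails the validation metric
    ```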

  19. Robustness study in SSNTD method validation: indoor radon quality

    Energy Technology Data Exchange (ETDEWEB)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2017-07-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)
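
    The Youden-style design described here can be sketched concretely. The snippet below builds a standard 8-run, 7-factor two-level fractional factorial and estimates each condition's effect as the difference between the mean responses at its high and low levels; the design assignment and radon results are hypothetical:

    ```python
    # Minimal sketch of a Youden-style robustness test: 8 runs, 7 two-level factors.
    import numpy as np

    # Standard 2^(7-4) fractional factorial (+1 = high, -1 = low), one row per etching run.
    design = np.array([
        [+1, +1, +1, +1, +1, +1, +1],
        [+1, +1, -1, +1, -1, -1, -1],
        [+1, -1, +1, -1, +1, -1, -1],
        [+1, -1, -1, -1, -1, +1, +1],
        [-1, +1, +1, -1, -1, +1, -1],
        [-1, +1, -1, -1, +1, -1, +1],
        [-1, -1, +1, +1, -1, -1, +1],
        [-1, -1, -1, +1, +1, +1, -1],
    ])
    # Hypothetical indoor radon concentration result (Bq/m3) from each of the 8 runs.
    results = np.array([102, 95, 110, 97, 99, 104, 92, 101], dtype=float)

    effects = design.T @ results / 4        # mean(high) - mean(low) for each factor
    for i, e in enumerate(effects, start=1):
        print(f"condition {i}: effect = {e:+.1f} Bq/m3")
    ```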

  20. Robustness study in SSNTD method validation: indoor radon quality

    International Nuclear Information System (INIS)

    Dias, D.C.S.; Silva, N.C.; Bonifácio, R.L.

    2017-01-01

    Quality control practices are indispensable to organizations aiming to reach analytical excellence. Method validation is an essential component to quality systems in laboratories, serving as a powerful tool for standardization and reliability of outcomes. This paper presents a study of robustness conducted over a SSNTD technique validation process, with the goal of developing indoor radon measurements at the highest level of quality. This quality parameter indicates how well a technique is able to provide reliable results in face of unexpected variations along the measurement. In this robustness study, based on the Youden method, 7 analytical conditions pertaining to different phases of the SSNTD technique (with focus on detector etching) were selected. Based on the ideal values for each condition as reference, extreme levels regarded as high and low were prescribed to each condition. A partial factorial design of 8 unique etching procedures was defined, where each presented their own set of high and low condition values. The Youden test provided 8 indoor radon concentration results, which allowed percentage estimations that indicate the potential influence of each analytical condition on the SSNTD technique. As expected, detector etching factors such as etching solution concentration, temperature and immersion time were identified as the most critical parameters to the technique. Detector etching is a critical step in the SSNTD method – one that must be carefully designed during validation and meticulously controlled throughout the entire process. (author)

  1. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
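
    The moving-average fit can be reproduced in outline. The sketch below builds a synthetic pair of series in which literary misery tracks an 11-year trailing mean of economic misery, then scans candidate window lengths for the correlation peak; both series are stand-ins, not the paper's book or economic data:

    ```python
    # Minimal sketch of the moving-average correlation scan (synthetic series).
    import numpy as np

    rng = np.random.default_rng(5)
    years = np.arange(1930, 2001)
    econ_misery = rng.normal(8, 3, years.size)                  # inflation + unemployment
    # Construct literary misery that really does track the previous 11-year mean.
    lit_misery = np.array([econ_misery[max(0, i - 11):i or 1].mean() for i in range(years.size)])
    lit_misery += rng.normal(0, 0.3, years.size)

    def trailing_mean(x, k):
        return np.array([x[max(0, i - k):i or 1].mean() for i in range(x.size)])

    for k in (5, 8, 11, 14):
        r = np.corrcoef(lit_misery[k:], trailing_mean(econ_misery, k)[k:])[0, 1]
        print(f"window {k:2d} years: r = {r:.2f}")   # expect the peak near k = 11
    ```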

  2. Development and validation of technique for in-vivo 3D analysis of cranial bone graft survival

    Science.gov (United States)

    Bernstein, Mark P.; Caldwell, Curtis B.; Antonyshyn, Oleh M.; Ma, Karen; Cooper, Perry W.; Ehrlich, Lisa E.

    1997-05-01

    Bone autografts are routinely employed in the reconstruction of facial deformities resulting from trauma, tumor ablation or congenital malformations. The combined use of post-operative 3D CT and SPECT imaging provides a means for quantitative in vivo evaluation of bone graft volume and osteoblastic activity. The specific objectives of this study were: (1) Determine the reliability and accuracy of interactive computer-assisted analysis of bone graft volumes based on 3D CT scans; (2) Determine the error in CT/SPECT multimodality image registration; (3) Determine the error in SPECT/SPECT image registration; and (4) Determine the reliability and accuracy of CT-guided SPECT uptake measurements in cranial bone grafts. Five human cadaver heads served as anthropomorphic models for all experiments. Four cranial defects were created in each specimen with inlay and onlay split skull bone grafts and reconstructed to skull and malar recipient sites. To acquire all images, each specimen was CT scanned and coated with Technetium doped paint. For purposes of validation, skulls were landmarked with 1/16-inch ball-bearings and Indium. This study provides a new technique relating anatomy and physiology for the analysis of cranial bone graft survival.

  3. Cognitive Bias in the Verification and Validation of Space Flight Systems

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of

  4. Evaluation of convergent and discriminant validity of the Russian version of MMPI-2: First results

    Directory of Open Access Journals (Sweden)

    Emma I. Mescheriakova

    2015-06-01

    The paper presents the results of construct validity testing for a new version of the MMPI-2 (Minnesota Multiphasic Personality Inventory), whose restandardization started in 1982 (J.N. Butcher, W.G. Dahlstrom, J.R. Graham, A. Tellegen, B. Kaemmer) and is still going on. The professional community's interest in this new version of the Inventory is determined by its advantage over the previous one in restructuring the inventory and adding new items, which offer additional opportunities for psychodiagnostics and personality assessment. The construct validity testing was carried out using three up-to-date techniques, namely the Quality of Life and Satisfaction with Life questionnaire (a short version of Ritsner's instrument adapted by E.I. Rasskazova), Janoff-Bulman's World Assumptions Scale (adapted by O. Kravtsova), and the Character Strengths Assessment questionnaire developed by E. Osin based on Peterson and Seligman's Values in Action Inventory of Strengths. These psychodiagnostic techniques were selected in line with current trends in psychology, such as its orientation toward positive phenomena as well as its interpretation of subjectivity potential as the need for self-determined, self-organized, self-realized and self-controlled behavior and the ability to accomplish it. The procedure of construct validity testing involved respondents of the 'norm' group, with the total sample including 205 people (62% were females, 32% were males). It focused on the MMPI-2 additional and expanded scales (FI, BF, FP, S and K) and six of its ten basic ones (D, Pd, Pa, Pt, Sc, Si). The results obtained confirmed the construct validity of the scales concerned, which allows the MMPI-2 to be applied to examining one's personal potential instead of a set of questionnaires, facilitating, in turn, the personality researchers' objectives. The paper discusses the first stage of this construct validity testing, the further stage highlighting the factor

  5. Experimental Investigation of Coolant Mixing in WWER and PWR Reactor Fuel Bundles by Laser Optical Techniques for CFD Validation

    International Nuclear Information System (INIS)

    Tar, D.; Baranyai, V.; Ezsoel, Gy.; Toth, I.

    2010-01-01

    Non-intrusive laser optical measurements have been carried out to investigate coolant mixing in a model of the head part of a fuel assembly of a WWER reactor. The goal of this research was to investigate the coolant flow around the point-based in-core thermocouple and to provide an experimental database as a validation tool for computational fluid dynamics calculations. The experiments were carried out on a full-scale model of the head part of a WWER-440/213 fuel assembly. In this paper, the previous results of the research project are first summarised: full-field velocity vectors and temperature were obtained by particle image velocimetry and planar laser-induced fluorescence, respectively. Then, preliminary results of the investigation of the influence of the flow in the central tube are reported in the form of velocity measurements. In order to have a well-measurable effect, extreme flow rates were set in the central tube by applying an inner tube with controlled flow rates. Despite the extreme conditions, the influence of the central tube on the velocity field proved to be significant. Further measurements will be made to investigate the effect of the gaps at the spacer fixings by displacing the inner tube vertically, and the temperature distribution will also be determined in similar geometries by laser-induced fluorescence. The aim of the measurements was to establish an experimental database, as well as to validate computational fluid dynamics calculations. (Authors)

  6. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Andersen, Morten T.; Wendt, Fabian; Robertson, Amy; Jonkman, Jason; Hall, Matthew

    2016-08-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  7. Menstrual blood loss measurement: validation of the alkaline hematin technique for feminine hygiene products containing superabsorbent polymers.

    Science.gov (United States)

    Magnay, Julia L; Nevatte, Tracy M; Dhingra, Vandana; O'Brien, Shaughn

    2010-12-01

    To validate the alkaline hematin technique for measurement of menstrual blood loss using ultra-thin sanitary towels that contain superabsorbent polymer granules as the absorptive agent. Laboratory study using simulated menstrual fluid (SMF) and Always Ultra Normal, Long, and Night "with wings" sanitary towels. Keele Menstrual Disorders Laboratory. None. None. Recovery of blood, linearity, and interassay variation over a range of SMF volumes applied to towels. Because of the variable percentage of blood in menstrual fluid, blood recovery was assessed from SMF constituted as 10%, 25%, 50%, and 100% blood. The lower limit of reliable detection and the effect of storing soiled towels for up to 4 weeks at 15°C-20°C, 4°C, and -20°C before analysis were determined. Ninety percent recovery was reproducibly achieved up to 30 mL applied volume at all tested SMF compositions, except at low volumes or high dilutions. These findings support the applicability of the alkaline hematin technique to sanitary towels that contain superabsorbent polymers. Copyright © 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  8. An easy-to-use semiquantitative food record validated for energy intake by using doubly labelled water technique.

    Science.gov (United States)

    Koebnick, C; Wagner, K; Thielecke, F; Dieter, G; Höhne, A; Franke, A; Garcia, A L; Meyer, H; Hoffmann, I; Leitzmann, P; Trippo, U; Zunft, H J F

    2005-09-01

    Estimating dietary intake is important for both epidemiological and clinical studies, but often lacks accuracy. To investigate the accuracy and validity of energy intake estimated by an easy-to-use semiquantitative food record (EI(SQFR)) compared to total energy expenditure (TEE) estimated by the doubly labelled water technique (EE(DLW)). TEE was measured in 29 nonobese subjects using the doubly labelled water method over a period of 14 days. Within this period, subjects reported their food consumption with a newly developed semiquantitative food record for 4 consecutive days. Energy intake was calculated using the German Food Code and Nutrition Data Base BLS II.3. A good correlation was observed between EI(SQFR) and EE(DLW) (r = 0.685, P < 0.001). EI(SQFR) underestimated EE(DLW) by more than 20% in nine subjects (31%). In five subjects (17%), an overestimation of EI(SQFR) was observed. The easy-to-use semiquantitative food record provided good estimates of EI in free-living and nonobese adults without prior detailed verbal instructions. The presented food record has limitations regarding accuracy at the individual level.

  9. Validating presupposed versus focused text information.

    Science.gov (United States)

    Singer, Murray; Solar, Kevin G; Spear, Jackie

    2017-04-01

    There is extensive evidence that readers continually validate discourse accuracy and congruence, but that they may also overlook conspicuous text contradictions. Validation may be thwarted when the inaccurate ideas are embedded sentence presuppositions. In four experiments, we examined readers' validation of presupposed ("given") versus new text information. Throughout, a critical concept, such as a truck versus a bus, was introduced early in a narrative. Later, a character stated or thought something about the truck, which therefore matched or mismatched its antecedent. Furthermore, truck was presented as either given or new information. Mismatch target reading times uniformly exceeded the matching ones by similar magnitudes for given and new concepts. We obtained this outcome using different grammatical constructions and with different antecedent-target distances. In Experiment 4, we examined only given critical ideas, but varied both their matching and the main verb's factivity (e.g., factive know vs. nonfactive think). The Match × Factivity interaction closely resembled that previously observed for new target information (Singer, 2006). Thus, readers can successfully validate given target information. Although contemporary theories tend to emphasize either deficient or successful validation, both types of theory can accommodate the discourse and reader variables that may regulate validation.

  10. Towards optical spectroscopic anatomical mapping (OSAM) for lesion validation in cardiac tissue (Conference Presentation)

    Science.gov (United States)

    Singh-Moon, Rajinder P.; Zaryab, Mohammad; Hendon, Christine P.

    2017-02-01

    Electroanatomical mapping (EAM) is an invaluable tool for guiding cardiac radiofrequency ablation (RFA) therapy. The principal roles of EAM are the identification of candidate ablation sites by detecting regions of abnormal electrogram activity, and lesion validation subsequent to RF energy delivery. However, incomplete lesions may present interim electrical inactivity similar to effective treatment in the acute setting, despite efforts to reveal them with pacing or drugs such as adenosine. Studies report that the misidentification and recovery of such lesions is a leading cause of arrhythmia recurrence and repeat procedures. In previous work, we demonstrated spectroscopic characterization of cardiac tissues using a fiber optic-integrated RF ablation catheter. In this work, we introduce OSAM (optical spectroscopic anatomical mapping), the application of this spectroscopic technique to obtain 2-dimensional biodistribution maps. We demonstrate its diagnostic potential as an auxiliary method for lesion validation in treated swine preparations. Endocardial lesion sets were created on fresh swine cardiac samples using a commercial RFA system. An optically integrated catheter console fabricated in-house was used for measurement of tissue optical spectra between 600-1000 nm. Three-dimensional spatio-spectral datasets were generated by raster scanning of the optical catheter across the treated sample surface in the presence of whole blood. Tissue optical parameters were recovered at each spatial position using an inverse Monte Carlo method. OSAM biodistribution maps showed strong correspondence with gross examination of tetrazolium chloride-stained tissue specimens. Specifically, we demonstrate the ability of OSAM to readily distinguish between shallow and deeper lesions, a limitation faced by current EAM techniques. These results showcase OSAM's potential for lesion validation strategies in the treatment of cardiac arrhythmias.

  11. Magnetoencephalographic localization of a peritumoral temporal epileptic focus prior to surgical resection.

    Science.gov (United States)

    Amo, Carlos; Saldaña, Cristóbal; Hidalgo, Mercedes González; Maestú, Fernando; Fernández, Alberto; Arrazola, Juan; Ortiz, Tomás

    2003-01-01

    Magnetoencephalography (MEG) is suggested as a technique for localizing epileptogenic areas in drug-resistant seizure patients with intracranial lesions. We present a 42-year-old male patient whose partial complex drug-resistant seizures began at age 26. MRI shows a 9 mm diameter lesion located in the left superior temporal gyrus, compatible with a cavernoma. Both conventional and sleep-deprivation EEGs have proved normal. Sleep EEG shows sharp waves in the left temporal region. MEG helped to localize interictal spike and spike-wave activity, as well as areas of wide slow-wave (2-7 Hz) activity. Craniotomy under analgesia and conscious sedation was carried out. Intrasurgical cortical electric stimulation assisted by a neuronavigator caused a limited partial complex seizure which the patient recognized as exactly like his own. Thus, the MEG localization of the epileptogenic area was confirmed. Surgical resection of both the lesion and the epileptogenic area was carried out. The patient remains free from seizures 9 months after surgery. A control MEG study reveals neither epileptogenic nor slow-wave activity. In this particular case, MEG proved to be a useful presurgical evaluation technique for localizing epileptogenic activity, validated by intrasurgical cortical stimulation.

  12. Magnetism in meteorites. [terminology, principles and techniques

    Science.gov (United States)

    Herndon, J. M.; Rowe, M. W.

    1974-01-01

    An overview of this subject is presented. The paper includes a glossary of magnetism terminology and a discussion of magnetic techniques used in meteorite research. These techniques comprise thermomagnetic analysis, alternating field demagnetization, thermal demagnetization, magnetic anisotropy, low-temperature cycling, and coercive forces, with emphasis on the first method. Limitations on the validity of paleointensity determinations are also discussed.

  13. Clinical outcomes of Laparoscopically Assisted Vaginal Hysterectomy in patients who had previous abdominopelvic surgery

    Directory of Open Access Journals (Sweden)

    Ali Riza Odabasi

    2007-03-01

    OBJECTIVE: To determine the clinical outcomes of laparoscopically assisted vaginal hysterectomy (LAVH) in patients who had previous abdominopelvic surgery. Design: A clinical observational, prospective, non-randomised trial comparing the outcomes of 13 patients who had previous abdominopelvic surgery with the outcomes of 19 patients who had no previous surgery. Setting: Adnan Menderes University Faculty of Medicine, Department of Obstetrics and Gynecology. Patients: Thirty-two subjects [average age 51.1 ± 6.9 years (range 37-66)] with an indication for total abdominal hysterectomy and bilateral salpingo-oophorectomy due to benign pathologies. Interventions: Following ACOG guidance, LAVH was performed by the same operator using the Garry technique for the trocar insertions, the Reich technique for the laparoscopic phase and the Heaney technique for the vaginal phase. After adhesiolysis and diagnostic procedures, the ureters were dissected medially. After mobilisation of the bladder, the bilateral round and infundibulopelvic ligaments were coagulated and cut. The operation was then completed vaginally by the same operating team. In all operations, 80 W unipolar or 150 W bipolar diathermic dissection and 25-35 W unipolar diathermic cutting were used. Main outcome measures: Age, parity, menopausal status, preoperative indications, type of previous abdominopelvic surgery and incision, intraoperative indications, adhesion scores, rate of unintended laparotomy, operative time, uterus weight, blood loss, complications, postoperative pain scores and analgesic requirements, time to return of normal intestinal function, length of hospitalisation and rate of readmission to hospital. RESULTS: Compared with the patients who had no previous abdominopelvic surgery, all adhesion scores, uterus weight, operative time and the number of total postoperative complications were significantly higher in patients who had previous surgery. Loss of blood, the rate

  14. DTU PMU Laboratory Development - Testing and Validation

    OpenAIRE

    Garcia-Valle, Rodrigo; Yang, Guang-Ya; Martin, Kenneth E.; Nielsen, Arne Hejde; Østergaard, Jacob

    2010-01-01

    This is a report of the results of phasor measurement unit (PMU) laboratory development and testing done at the Centre for Electric Technology (CET), Technical University of Denmark (DTU). Analysis of the PMU performance first required the development of tools to convert the DTU PMU data into IEEE standard, and the validation is done for the DTU-PMU via a validated commercial PMU. The commercial PMU has been tested from the authors' previous efforts, where the response can be expected to foll...

  15. Validation of the multiplex PCR for identification of Brucella spp.

    Directory of Open Access Journals (Sweden)

    Lívia de Lima Orzil

    2016-05-01

    A multiplex PCR technique for detection of Brucella spp. in samples of bacterial suspension was validated as a complementary tool in the diagnosis of the disease. This technique allows characterization of the agent without performing biochemical tests, which greatly reduces the time to a final diagnosis and provides more safety for the analyst by reducing the time of exposure to microorganisms. The validation was performed in accordance with the OIE Manual of Diagnostic Tests (2008) and following the requirements of ABNT NBR ISO/IEC 17025:2005. The mPCR validated in this study identified the different species of Brucella (Brucella abortus, B. suis, B. ovis and B. melitensis) in bacterial suspensions obtained from slaughterhouse samples, distinguished the biovars of B. abortus in grouped form (1, 2 and 4; 3b, 5, 6 and 9), and differentiated field strains from vaccine strains, making it a quick, useful and less expensive technique for the diagnosis of brucellosis in Brazil.

  16. Validation of gamma-ray detection techniques for safeguards monitoring at natural uranium conversion facilities

    Energy Technology Data Exchange (ETDEWEB)

    Dewji, S.A., E-mail: dewjisa@ornl.gov [Oak Ridge National Laboratory, 1 Bethel Valley Road, MS-6335, Oak Ridge, TN 37831-6335 (United States); Lee, D.L.; Croft, S. [Oak Ridge National Laboratory, 1 Bethel Valley Road, MS-6335, Oak Ridge, TN 37831-6335 (United States); Hertel, N.E. [Oak Ridge National Laboratory, 1 Bethel Valley Road, MS-6335, Oak Ridge, TN 37831-6335 (United States); Nuclear and Radiological Engineering Program, Georgia Institute of Technology, 770 State Street, Atlanta, GA 30332-0745 (United States); Chapman, J.A.; McElroy, R.D.; Cleveland, S. [Oak Ridge National Laboratory, 1 Bethel Valley Road, MS-6335, Oak Ridge, TN 37831-6335 (United States)

    2016-07-01

    Recent IAEA circulars and policy papers have sought to implement safeguards when any purified aqueous uranium solution or uranium oxides suitable for isotopic enrichment or fuel fabrication exist. Under the revised policy, IAEA Policy Paper 18, the starting point for nuclear material under safeguards was reinterpreted, suggesting that purified uranium compounds should be subject to safeguards procedures no later than the first point in the conversion process. In response to this technical need, a combination of simulation models and experimental measurements was employed to develop and validate concepts of nondestructive assay monitoring systems in a natural uranium conversion plant (NUCP). In particular, uranyl nitrate (UO{sub 2}(NO{sub 3}){sub 2}) solution exiting solvent extraction was identified as a key measurement point (KMP), where gamma-ray spectroscopy was selected as the process monitoring tool. The Uranyl Nitrate Calibration Loop Equipment (UNCLE) facility at Oak Ridge National Laboratory was employed to simulate the full-scale operating conditions of a purified uranium-bearing aqueous stream exiting the solvent extraction process in an NUCP. Nondestructive assay techniques using gamma-ray spectroscopy were evaluated to determine their viability as a technical means for drawing safeguards conclusions at NUCPs, and whether the IAEA detection requirement of 1 significant quantity (SQ) can be met in a timely way. This work investigated gamma-ray signatures of uranyl nitrate circulating in the UNCLE facility and evaluated various gamma-ray detector sensitivities to uranyl nitrate. These detector validation activities include assessing detector responses to the uranyl nitrate gamma-ray signatures for spectrometers based on sodium iodide, lanthanum bromide, and high-purity germanium detectors. The results of measurements under static and dynamic operating conditions at concentrations ranging from 10–90 g U/L of natural uranyl nitrate are presented. A range of

  17. Formal validation of supervisory energy management systems for microgrids

    DEFF Research Database (Denmark)

    Sugumar, Gayathri; Selvamuthukumaran, R.; Dragicevic, T.

    2017-01-01

    Although various techniques are available in the literature to monitor and control the energy flows among distributed RES in MGs, formal verification of those techniques has not yet been proposed. The emphasis of this paper is to design and validate an energy management system for a MG which consists of a solar photovoltaic (PV

  18. Pseudodynamic Bearing Capacity Analysis of Shallow Strip Footing Using the Advanced Optimization Technique “Hybrid Symbiosis Organisms Search Algorithm” with Numerical Validation

    Directory of Open Access Journals (Sweden)

    Arijit Saha

    2018-01-01

    The analysis of shallow foundations subjected to seismic loading has been an important area of research for civil engineers. This paper presents an upper-bound solution for the bearing capacity of a shallow strip footing, considering composite failure mechanisms, by the pseudodynamic approach. A recently developed hybrid symbiosis organisms search (HSOS) algorithm has been used to solve this problem. In the HSOS method, the exploration capability of SQI and the exploitation potential of SOS are combined to increase the robustness of the algorithm. This combination can improve the searching capability of the algorithm for attaining the global optimum. Numerical analysis is also carried out using the dynamic modules of PLAXIS-8.6v for the validation of this analytical solution. The results obtained from the present analysis using HSOS are thoroughly compared with the existing available literature and also with other optimization techniques. The significance of the present methodology for analysing the bearing capacity is discussed, and the suitability of the HSOS technique for solving this type of engineering problem is justified.
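
    For orientation, the snippet below sketches the plain symbiosis organisms search (SOS) metaheuristic, with its mutualism, commensalism and parasitism phases, on a toy sphere objective; the paper's HSOS additionally hybridizes SOS with SQI, which is not reproduced here:

    ```python
    # Minimal sketch of basic symbiosis organisms search (SOS), not the hybrid HSOS.
    import numpy as np

    def sos(f, lo, hi, n=20, iters=200, seed=6):
        """Basic SOS: minimize f over the box [lo, hi]."""
        rng = np.random.default_rng(seed)
        dim = len(lo)
        pop = rng.uniform(lo, hi, (n, dim))
        fit = np.array([f(x) for x in pop])

        def try_replace(idx, cand):
            fc = f(cand)
            if fc < fit[idx]:
                pop[idx], fit[idx] = cand, fc

        for _ in range(iters):
            best = pop[fit.argmin()].copy()
            for i in range(n):
                j = rng.choice([k for k in range(n) if k != i])
                # Mutualism: i and j both move toward the current best organism.
                mutual = (pop[i] + pop[j]) / 2
                bf1, bf2 = rng.integers(1, 3, size=2)      # benefit factors, 1 or 2
                try_replace(i, np.clip(pop[i] + rng.random(dim) * (best - mutual * bf1), lo, hi))
                try_replace(j, np.clip(pop[j] + rng.random(dim) * (best - mutual * bf2), lo, hi))
                # Commensalism: i benefits from interacting with j; j is unaffected.
                try_replace(i, np.clip(pop[i] + rng.uniform(-1, 1, dim) * (best - pop[j]), lo, hi))
                # Parasitism: a mutated copy of i tries to displace j.
                parasite = pop[i].copy()
                d = rng.integers(dim)
                parasite[d] = rng.uniform(lo[d], hi[d])
                try_replace(j, parasite)
        return pop[fit.argmin()], fit.min()

    sphere = lambda x: float(np.sum(x ** 2))     # toy objective, optimum at the origin
    x_best, f_best = sos(sphere, lo=np.array([-5.0] * 3), hi=np.array([5.0] * 3))
    print(x_best, f_best)
    ```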

  19. Books Average Previous Decade of Economic Misery

    Science.gov (United States)

    Bentley, R. Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a ‘literary misery index’ derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade. PMID:24416159

  20. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    Science.gov (United States)

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criteria, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criteria developed following the threshold approach is not only a function of Comparison Error, E (which is the difference between experiments and simulations) but also takes in to account the risk to patient safety because of E. The method is applicable for scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate if the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test. However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be

  1. The Effect of Previous Co-Worker Experience on the Survival of Knowledge Intensive Start-Ups

    DEFF Research Database (Denmark)

    Timmermans, Bram

    The aim of the paper is to investigate the effect of previous co-worker experience on the survival of knowledge-intensive start-ups. For the empirical analysis I use the Danish Integrated Database of Labor Market Research (IDA). This longitudinal employer-employee database allows me to identify co-worker experience among all members of the firm. In addition, I make a distinction between ordinary start-ups and entrepreneurial spin-offs. The results show that previous co-worker experience has a positive effect on new firm survival. This effect appears to hold predominantly for ordinary start-ups rather than...

  2. A brief fatigue inventory of shoulder health developed by quality function deployment technique.

    Science.gov (United States)

    Liu, Shuo-Fang; Lee, Yannlong; Huang, Yiting

    2009-01-01

    The purpose of this study was to develop a diagnostic outcome instrument that has high reliability and low cost. The scale, called the Shoulder Fatigue Scale-30 Items (SFS-30) risk assessment, was used to determine the severity of patient neck and shoulder discomfort. The quality function deployment (QFD) technique was used in designing and developing a simple medical diagnostic scale with a high degree of accuracy. Research data can be used to divide the common causes of neck and shoulder discomfort into 6 core categories: occupation, cumulative factors, psychological factors, disease, diet, and sleep quality. The SFS-30 was validated using a group of individuals who had been previously diagnosed with different levels of neck and shoulder symptoms. The SFS-30 assessment determined that 78.57% of the participants experienced a neck and shoulder discomfort level above the SFS-30 risk curve and required immediate medical attention. The QFD technique can improve the accuracy and reliability of an assessment outcome instrument, mainly because it is effective in prioritizing and assigning weights to the items in the scale. This research successfully developed a reliable risk assessment scale to diagnose neck and shoulder symptoms using the QFD technique. The scale was proven to have high accuracy and to represent reality closely.

  3. Revision and simplification of the minimal anterior approach to the lumbar spine

    International Nuclear Information System (INIS)

    Lazannec, JY; Del Vecchio, R; Ramare, S; Saillant, G

    2001-01-01

    This paper describes the minimal anterior retroperitoneal approach, which provides access to any discal and vertebral level between T12 and S1. A retroperitoneal dissection technique is used that facilitates renal and duodenopancreatic mobilization to expose the entire anterior left aspect of the lumbar spine and the thoracolumbar junction. Careful anatomical dissections were performed in fresh and preserved cadavers to determine the topography and the anatomical relationships of interest, and thus to develop a safe and easily reproducible approach. Special attention has been paid to the description of the lumbar veins and the anastomoses between the left renal vein and the hemiazygos system for the exposure of the left anterolateral aspect of T12 and L1. A series of 94 patients with lesions caused by trauma or degenerative processes is reported. At all lumbar levels, even in patients with a history of intraperitoneal surgery, the minimal retroperitoneal approach was safe for the kidneys, ureters, spleen, hypogastric plexus and duodenopancreatic junction. Better cosmetic results, shorter operative time, minimal intraoperative bleeding, and easier decortication and implant placement are reported. The minimal anterior retroperitoneal approach to the spine, developed from the classic retroperitoneal approaches, offers significant advantages over endoscopic techniques, which require sophisticated equipment and are technically demanding. Exposure of all lumbar levels, as well as reduction maneuvers and implant placement, can be performed easily without causing muscular damage.

  4. Improvement and Validation of Weld Residual Stress Modelling Procedure

    International Nuclear Information System (INIS)

    Zang, Weilin; Gunnars, Jens; Dong, Pingsha; Hong, Jeong K.

    2009-06-01

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, taking advantage of recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data are available, a mixed hardening model should be used.

  5. Improvement and Validation of Weld Residual Stress Modelling Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Zang, Weilin; Gunnars, Jens (Inspecta Technology AB, Stockholm (Sweden)); Dong, Pingsha; Hong, Jeong K. (Center for Welded Structures Research, Battelle, Columbus, OH (United States))

    2009-06-15

    The objective of this work is to identify and evaluate improvements for the residual stress modelling procedure currently used in Sweden. There is a growing demand to eliminate any unnecessary conservatism involved in residual stress assumptions. The study was focused on the development and validation of an improved weld residual stress modelling procedure, taking advantage of recent advances in residual stress modelling and stress measurement techniques. The major changes applied in the new weld residual stress modelling procedure are: - Improved procedure for heat source calibration based on use of analytical solutions. - Use of an isotropic hardening model where mixed hardening data is not available. - Use of an annealing model for improved simulation of strain relaxation in re-heated material. The new modelling procedure is demonstrated to capture the main characteristics of the through-thickness stress distributions by validation against experimental measurements. Three austenitic stainless steel butt-weld cases are analysed, covering a large range of pipe geometries. From the cases it is evident that there can be large differences between the residual stresses predicted using the new procedure and the earlier procedure or handbook recommendations. Previously recommended profiles could give misleading fracture assessment results. The stress profiles according to the new procedure agree well with the measured data. If data are available, a mixed hardening model should be used.

  6. Using support vector machines in the multivariate state estimation technique

    International Nuclear Information System (INIS)

    Zavaljevski, N.; Gross, K.C.

    1999-01-01

    One approach to validating nuclear power plant (NPP) signals makes use of pattern recognition techniques. This approach often assumes that there is a set of signal prototypes that are continuously compared with the actual sensor signals. These signal prototypes are often computed from empirical models with little or no knowledge about the physical processes. A common problem of all data-based models is their limited ability to make predictions on the basis of available training data. Another problem is related to suboptimal training algorithms. Both of these potential shortcomings with conventional approaches to signal validation and sensor operability validation are successfully resolved by adopting a recently proposed learning paradigm called the support vector machine (SVM). The work presented here is a novel application of SVM for data-based modeling of system state variables in an NPP, integrated with a nonlinear, nonparametric technique called the multivariate state estimation technique (MSET), an algorithm developed at Argonne National Laboratory for a wide range of nuclear plant applications.
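
    A minimal sketch of the underlying idea, using an SVM regressor to estimate one sensor from correlated neighbors and flagging drift from the residual, is shown below; the simulated signals and thresholds are assumptions, and the snippet does not reproduce the MSET algorithm itself:

    ```python
    # Minimal sketch of SVM-based signal validation: learn one sensor from
    # correlated neighbors, then flag drift when the residual grows.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(8)
    t = np.linspace(0, 100, 2000)
    base = np.sin(0.1 * t) + 0.05 * rng.normal(size=t.size)
    X = np.column_stack([base + 0.02 * rng.normal(size=t.size) for _ in range(3)])
    y = 0.8 * base + 0.02 * rng.normal(size=t.size)      # target sensor signal

    model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:1000], y[:1000])

    y_test = y[1000:].copy()
    y_test[500:] += np.linspace(0, 0.5, 500)             # inject a slow sensor drift
    residual = np.abs(y_test - model.predict(X[1000:]))
    print("drift detected:", bool((residual > 0.1).mean() > 0.1))
    ```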

  7. IDENTIFICATION OF CANINE VISCERAL LEISHMANIASIS IN A PREVIOUSLY UNAFFECTED AREA BY CONVENTIONAL DIAGNOSTIC TECHNIQUES AND CELL-BLOCK FIXATION

    Directory of Open Access Journals (Sweden)

    Tuanne Rotti ABRANTES

    2016-01-01

    After the report of a second case of canine visceral leishmaniasis (CVL) in São Bento da Lagoa, Itaipuaçu, in the municipality of Maricá, Rio de Janeiro State, an epidemiological survey was carried out through active search, totaling 145 dogs. Indirect immunofluorescence assay (IFA), enzyme-linked immunosorbent assay (ELISA), and a rapid chromatographic immunoassay based on the dual-path platform (DPP®) were used to perform the serological examinations. The parasitological diagnosis of cutaneous fragments was performed by parasitological culture, histopathology, and immunohistochemistry. In the serological assessment, 21 dogs were seropositive by IFA, 17 by ELISA, and 11 by DPP®, with sensitivities of 66.7%, 66.7% and 50%, and specificities of 87.2%, 90.2% and 94%, respectively. The immunohistochemistry of bone marrow using the cell-block technique presented the best results, with six positive dogs found, three of which tested negative by the other parasitological techniques. Leishmania sp. was isolated by parasitological culture in three dogs. The detection of autochthonous Leishmania infantum in Itaipuaçu and the high prevalence of seropositive dogs confirm the circulation of this parasite in the study area and alert to the risk of expansion in the State of Rio de Janeiro.

  8. Validation of the laser-induced kinetic phosphorimetry technique for the determination of uranium concentration in urine; Validacion de la tecnica de fosforimetria cinetica inducida por laser para la determinacion de la concentracion de uranio en orina

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, C.; Sierra, I.; Benito, P.; Lopez, C.

    2013-07-01

    This paper describes the methodology used to validate the method for determining uranium using the Kinetic Phosphorescence Analyzer (KPA) technique. The technical requirements of standard ISO/IEC 17025 stress the need to validate the KPA method for urine samples from workers exposed to the risk of internal contamination. (Author)

  9. Classification and Validation of Behavioral Subtypes of Learning-Disabled Children.

    Science.gov (United States)

    Speece, Deborah L.; And Others

    1985-01-01

    Using the Classroom Behavior Inventory, teachers rated the behaviors of 63 school-identified, learning-disabled first and second graders. Hierarchical cluster analysis techniques identified seven distinct behavioral subtypes. Internal validation techniques indicated that the subtypes were replicable and had profile patterns different from a sample…
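
    The clustering step can be sketched in a few lines with standard tools. Below, Ward-linkage hierarchical clustering groups simulated behavior-rating profiles into subtypes; the data are synthetic stand-ins for the Classroom Behavior Inventory ratings:

    ```python
    # Minimal sketch of hierarchical (Ward) clustering of pupils' rating profiles.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(9)
    # 63 children x 7 rating dimensions, drawn from three latent subtypes.
    centers = rng.uniform(1, 5, size=(3, 7))
    profiles = np.vstack([c + 0.4 * rng.normal(size=(21, 7)) for c in centers])

    Z = linkage(profiles, method="ward")             # agglomerative cluster tree
    labels = fcluster(Z, t=7, criterion="maxclust")  # cut the tree into (up to) 7 subtypes
    print(np.bincount(labels)[1:])                   # subtype sizes
    ```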

  10. Assessing the validity of discourse analysis: transdisciplinary convergence

    Science.gov (United States)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  11. CAVEAT: an assistance project for software validation using formal techniques

    International Nuclear Information System (INIS)

    Trotin, A.; Antoine, C.; Baudin, P.; Collart, J.M.; Raguideau, J.; Zylberajch, C.

    1995-01-01

    The aim of the CAVEAT project is to provide a tool for the validation of industrial software written in the C language. It allows the user to go inside the program and gain a good comprehension of it. It also makes it possible to perform refined verifications of the consistency between the specifications and the program by translating the properties into a more suitable language. It automatically calculates the conditions to be demonstrated, and offers assistance for interactive demonstrations. The principal application of this tool is the safety of systems during the verification/certification phase, or during the development phase, where it can work as an intelligent debugging system. (J.S.). 5 refs., 1 fig

  12. Validation of satellite data through the remote sensing techniques and the inclusion of them into agricultural education pilot programs

    Science.gov (United States)

    Papadavid, Georgios; Kountios, Georgios; Bournaris, T.; Michailidis, Anastasios; Hadjimitsis, Diofantos G.

    2016-08-01

    Nowadays, remote sensing techniques play a significant role in all fields of agricultural extension, as well as in agricultural economics and education, but they are used more specifically in hydrology. The aim of this paper is to demonstrate the use of field spectroscopy for the validation of satellite data, and how the combination of remote sensing techniques and field spectroscopy can yield more accurate results for irrigation purposes. For this reason vegetation indices are used, which are mostly empirical equations describing vegetation parameters during the lifecycle of the crops. These numbers are generated by some combination of remote sensing bands and may have some relationship to the amount of vegetation in a given image pixel. Since most of the commonly used vegetation indices are concerned only with the red and near-infrared spectrum and can be divided into perpendicular and ratio-based indices, the specific goal of the research is to illustrate the effect of the atmosphere on those indices, in both categories. In this frame, field spectroscopy is employed to derive the spectral signatures of different crops in the red and infrared spectrum: the main indices were calculated from ground measurements made with a GER 1500 spectroradiometer at intervals during the whole lifecycle of the crops. These indices were compared with those extracted from satellite images after applying an atmospheric correction algorithm, darkest pixel, to the satellite images at the pre-processing level, so that the two sets of indices would be in comparable form. Furthermore, research has been carried out concerning the prospects of including the above-mentioned remote sensing techniques in agricultural education pilot programs.
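
    The darkest-pixel correction and its effect on a ratio-based index can be shown with a toy example. The sketch below assumes purely additive path radiance and a scene that contains a near-zero dark object in each band; the index used is NDVI = (NIR - Red)/(NIR + Red):

    ```python
    # Toy illustration of dark object subtraction ("darkest pixel") and NDVI.
    import numpy as np

    rng = np.random.default_rng(10)
    red_surface = rng.uniform(5, 25, (100, 100))      # surface signal, arbitrary DN units
    nir_surface = rng.uniform(80, 140, (100, 100))
    red_surface[0, 0] = nir_surface[0, 0] = 0.5       # deep-water pixel: a near-zero dark object

    path_red, path_nir = 20.0, 8.0                    # additive atmospheric path radiance
    red_sensor, nir_sensor = red_surface + path_red, nir_surface + path_nir

    def ndvi(nir, red):
        return (nir - red) / (nir + red)

    def darkest_pixel(band):
        # The darkest pixel is assumed to be ~zero at the surface, so its
        # at-sensor value estimates the additive haze term to subtract.
        return band - band.min()

    print("surface NDVI   :", round(float(ndvi(nir_surface, red_surface).mean()), 3))
    print("at-sensor NDVI :", round(float(ndvi(nir_sensor, red_sensor).mean()), 3))
    print("corrected NDVI :", round(float(ndvi(darkest_pixel(nir_sensor),
                                               darkest_pixel(red_sensor)).mean()), 3))
    ```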

  13. Validating safeguards effectiveness given inherently limited test data

    International Nuclear Information System (INIS)

    Sicherman, A.

    1987-01-01

    A key issue in designing and evaluating nuclear safeguards systems is how to validate safeguards effectiveness against a spectrum of potential threats. Safeguards effectiveness is measured by a performance indicator such as the probability of defeating an adversary attempting a malevolent act. Effectiveness validation means a testing program that provides sufficient evidence that the performance indicator is at an acceptable level. Traditional statistical approaches can provide such evidence when numerous independent system trials are possible. However, within the safeguards environment, many situations arise for which traditional statistical approaches may be neither feasible nor appropriate. Such situations can occur, for example, when there are obvious constraints on the number of possible tests due to operational impacts and testing costs. Furthermore, these tests are usually simulations (e.g., staged force-on-force exercises) rather than actual tests, and the system is often modified after each test. Under such circumstances, it is difficult to make and justify inferences about system performance by using traditional statistical techniques. In this paper, the authors discuss several alternative quantitative techniques for validating system effectiveness. The techniques include: (1) minimizing the number of required tests using sequential testing; (2) combining data from models, inspections, and exercises using Bayesian statistics to improve inferences about system performance; and (3) using reliability growth and scenario modeling to help specify which safeguards elements and scenarios to test
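
    Item (2), combining sparse exercise data with prior information via Bayesian statistics, can be illustrated with a conjugate beta-binomial update. This is a generic sketch, not the authors' method; the prior parameters, trial counts, and threshold are hypothetical.

```python
from scipy import stats

# Hypothetical prior belief about the probability of defeating an adversary,
# encoded as Beta(alpha, beta), e.g. informed by models and inspections.
alpha, beta = 8.0, 2.0

# Hypothetical force-on-force exercise results: 4 defeats in 5 trials.
successes, trials = 4, 5

# Conjugate update: the posterior is Beta(alpha + s, beta + n - s).
post = stats.beta(alpha + successes, beta + trials - successes)

# Evidence that performance meets an acceptability threshold, e.g. P(p > 0.7).
print("posterior mean:", post.mean())
print("P(defeat probability > 0.7):", 1 - post.cdf(0.7))
```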

  14. NDE reliability and advanced NDE technology validation

    International Nuclear Information System (INIS)

    Doctor, S.R.; Deffenbaugh, J.D.; Good, M.S.; Green, E.R.; Heasler, P.G.; Hutton, P.H.; Reid, L.D.; Simonen, F.A.; Spanner, J.C.; Vo, T.V.

    1989-01-01

    This paper reports on progress for three programs: (1) evaluation and improvement in nondestructive examination reliability for inservice inspection of light water reactors (LWR) (NDE Reliability Program), (2) field validation, acceptance, and training for advanced NDE technology, and (3) evaluation of computer-based NDE techniques and regional support of inspection activities. The NDE Reliability Program objectives are to quantify the reliability of inservice inspection techniques for LWR primary system components through independent research and establish means for obtaining improvements in the reliability of inservice inspections. The areas of significant progress will be described concerning ASME Code activities, re-analysis of the PISC-II data, the equipment interaction matrix study, new inspection criteria, and PISC-III. The objectives of the second program are to develop field procedures for the AE and SAFT-UT techniques, perform field validation testing of these techniques, provide training in the techniques for NRC headquarters and regional staff, and work with the ASME Code for the use of these advanced technologies. The final program's objective is to evaluate the reliability and accuracy of interpretation of results from computer-based ultrasonic inservice inspection systems, and to develop guidelines for NRC staff to monitor and evaluate the effectiveness of inservice inspections conducted on nuclear power reactors. This program started in the last quarter of FY89, and the extent of the program was to prepare a work plan for presentation to and approval from a technical advisory group of NRC staff

  15. The application of self-validation to wireless sensor networks

    International Nuclear Information System (INIS)

    Collett, Michael A; Cox, Maurice G; Esward, Trevor J; Harris, Peter M; Duta, Mihaela; Henry, Manus P

    2008-01-01

    Self-validation is a valuable tool for extending the operating range of sensing systems and making them more robust. Wireless sensor networks suffer many limitations, meaning that their efficacy could be greatly improved by self-validation techniques. We present two independently developed data analysis techniques and demonstrate that they can be applied to a wireless sensor network. Using an acoustic ranging application, we demonstrate an improvement of more than ten-fold in the uncertainty of a single measurement when multiple sensor readings are appropriately combined. We also demonstrate that, of the two methods for determining a largest consistent subset, one is more rigorous in dealing with correlation, and the other is more suited to time-series data
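
    A reduction in single-measurement uncertainty of this kind is consistent with standard inverse-variance weighting of independent readings. The sketch below shows the generic calculation with made-up values, not the paper's acoustic-ranging data.

```python
import numpy as np

def combine(readings, uncertainties):
    """Inverse-variance weighted mean of independent sensor readings.
    The combined standard uncertainty is sqrt(1 / sum(1/u_i^2))."""
    w = 1.0 / np.asarray(uncertainties) ** 2
    x = np.asarray(readings)
    mean = np.sum(w * x) / np.sum(w)
    u_comb = np.sqrt(1.0 / np.sum(w))
    return mean, u_comb

# Hypothetical range estimates (metres) from several network nodes.
readings = [10.2, 10.5, 10.3, 10.4, 10.1]
uncertainties = [0.3] * 5

mean, u = combine(readings, uncertainties)
print(mean, u)  # for equal u_i, combined uncertainty is 0.3 / sqrt(5)
```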

  16. Hole Drilling Technique – on site stress measurement

    OpenAIRE

    Schueremans, Luc

    2009-01-01

    The Hole Drilling Technique for on-site stress measurement has been used to validate the stress level at 2 pillars of the Sint-Jacobschurch (Leuven, B). The technique allows the stress in a stone to be estimated by measuring the deformation when a small hole is made. It is a low-intrusive technique. Its application is limited to local stress measurements and is a complement to stress estimates from calculations or from the use of, for example, flat jacks. In addition to the flat-jack technique...

  17. Member checking: a tool to enhance trustworthiness or merely a nod to validation?

    OpenAIRE

    Birt, Linda; Scott, Suzanne; Cavers, Deborah; Campbell, Christine; Walter, Fiona M

    2016-01-01

    The trustworthiness of results is the bedrock of high quality qualitative research. Member checking, also known as participant or respondent validation, is a technique for exploring the credibility of results. Data or results are returned to participants to check for accuracy and resonance with their experiences. Member checking is often mentioned as one in a list of validation techniques. This simplistic reporting might not acknowledge the value of using the method, nor its juxtaposition wit...

  18. A review on diagnostic techniques for brucellosis | Kaltungo | African ...

    African Journals Online (AJOL)

    ... but has not been validated for standard laboratory use. This paper highlights useful samples and, especially, the different techniques for the diagnosis of brucellosis, from conventional to more sophisticated molecular methods. Keywords: Brucellosis, diagnosis, techniques. African Journal of Biotechnology, Vol. 13(1), pp. 1-10, 1 January, ...

  19. Flight test techniques for validating simulated nuclear electromagnetic pulse aircraft responses

    Science.gov (United States)

    Winebarger, R. M.; Neely, W. R., Jr.

    1984-01-01

    An attempt has been made to determine the effects of nuclear EM pulses (NEMPs) on aircraft systems, using a highly instrumented NASA F-106B to document the simulated NEMP environment at the Kirtland Air Force Base's Vertically Polarized Dipole test facility. Several test positions were selected so that aircraft orientation relative to the test facility would be the same in flight as when on the stationary dielectric stand, in order to validate the dielectric stand's use in flight configuration simulations. Attention is given to the flight test portions of the documentation program.

  20. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident

  1. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  2. Analysis of the validity of the asymptotic techniques in the lower hybrid wave equation solution for reactor applications

    International Nuclear Information System (INIS)

    Cardinali, A.; Morini, L.; Castaldo, C.; Cesario, R.; Zonca, F.

    2007-01-01

    Knowing that the lower hybrid (LH) wave propagation in tokamak plasmas can be correctly described with a full wave approach only, based on fully numerical techniques or on semianalytical approaches, in this paper, the LH wave equation is asymptotically solved via the Wentzel-Kramers-Brillouin (WKB) method for the first two orders of the expansion parameter, obtaining governing equations for the phase at the lowest and for the amplitude at the next order. The nonlinear partial differential equation (PDE) for the phase is solved in a pseudotoroidal geometry (circular and concentric magnetic surfaces) by the method of characteristics. The associated system of ordinary differential equations for the position and the wavenumber is obtained and analytically solved by choosing an appropriate expansion parameter. The quasilinear PDE for the WKB amplitude is also solved analytically, allowing us to reconstruct the wave electric field inside the plasma. The solution is also obtained numerically and compared with the analytical solution. A discussion of the validity limits of the WKB method is also given on the basis of the obtained results
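
    The two-order WKB hierarchy summarised in this abstract has the familiar generic structure sketched below. The notation (dispersion function D, ray parameter τ, expansion parameter ε) is the textbook convention and is assumed here for illustration rather than taken from the paper.

```latex
% Generic WKB ansatz for the wave field:
\[ E(\mathbf{x}) = A(\mathbf{x})\, e^{\,iS(\mathbf{x})/\epsilon} \]
% Lowest order: eikonal equation, a nonlinear PDE for the phase S,
% with local wavevector k = grad S:
\[ D\!\left(\mathbf{x}, \nabla S\right) = 0, \qquad \mathbf{k} \equiv \nabla S \]
% solved by the method of characteristics (ray equations):
\[ \frac{d\mathbf{x}}{d\tau} = \frac{\partial D}{\partial \mathbf{k}}, \qquad
   \frac{d\mathbf{k}}{d\tau} = -\frac{\partial D}{\partial \mathbf{x}} \]
% Next order: quasilinear transport equation for the WKB amplitude A:
\[ \nabla \cdot \left( A^{2}\, \frac{\partial D}{\partial \mathbf{k}} \right) = 0 \]
```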

  3. Social validation of vocabulary selection: ensuring stakeholder relevance.

    Science.gov (United States)

    Bornman, Juan; Bryen, Diane Nelson

    2013-06-01

    The vocabulary needs of individuals who are unable to spell their messages continue to be of concern in the field of augmentative and alternative communication (AAC). Social validation of vocabulary selection has been suggested as one way to improve the effectiveness and relevance of service delivery in AAC. Despite increased emphasis on stakeholder accountability, social validation is not frequently used in AAC research. This paper describes an investigation of the social validity of a vocabulary set identified in earlier research. A previous study used stakeholder focus groups to identify vocabulary that could be used by South African adults who use AAC to disclose their experiences as victims of crime or abuse. Another study used this vocabulary to create communication boards for use by adults with complex communication needs. In this current project, 12 South African adults with complex communication needs who use AAC systems used a 5-point Likert scale to score the importance of each of the previously identified 57 vocabulary items. This two-step process of first using stakeholder focus groups to identify vocabulary, and then having literate persons who use AAC provide information on social validity of the vocabulary on behalf of their peers who are illiterate, appears to hold promise as a culturally relevant vocabulary selection approach for sensitive topics such as crime and abuse.

  4. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of detailed simulation models is presented. This kind of validation is always tied to an experimental case. Empirical validation is residual in nature, because the conclusions are based on comparisons between simulated outputs and experimental measurements. This methodology will guide us in detecting the failures of the simulation model. Furthermore, it can be used as a guide in the design of subsequent experiments. Three steps can be well differentiated. Sensitivity analysis: it can be performed with DSA, differential sensitivity analysis, or with MCSA, Monte-Carlo sensitivity analysis. Searching for the optimal domains of the input parameters: a procedure based on Monte-Carlo methods and cluster techniques has been developed to find the optimal domains of these parameters. Residual analysis: this analysis has been made in the time domain and in the frequency domain, using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a building thermal simulation model is presented, studying the behaviour of building components in a Test Cell of LECE of CIEMAT (Spain). (Author) 17 refs.
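
    A minimal sketch of the MCSA step, assuming a toy model in place of the building thermal code: sample the inputs over their domains, run the model, and rank parameters by correlation with the output. The model and parameter ranges are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(u_wall, solar_gain, infiltration):
    """Toy stand-in for a detailed building thermal simulation model."""
    return 5.0 * u_wall + 2.0 * solar_gain - 0.5 * infiltration

# Monte-Carlo sensitivity analysis: sample input parameters over their domains.
n = 2000
u_wall = rng.uniform(0.2, 2.0, n)
solar_gain = rng.uniform(0.0, 1.0, n)
infiltration = rng.uniform(0.1, 1.5, n)

output = model(u_wall, solar_gain, infiltration)

# Rank inputs by absolute correlation with the output.
for name, x in [("u_wall", u_wall), ("solar_gain", solar_gain),
                ("infiltration", infiltration)]:
    r = np.corrcoef(x, output)[0, 1]
    print(f"{name}: r = {r:+.2f}")
```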

  5. Evaluation of modified Dennis parasitological technique for diagnosis of bovine fascioliasis.

    Science.gov (United States)

    Correa, Stefanya; Martínez, Yudy Liceth; López, Jessika Lissethe; Velásquez, Luz Elena

    2016-02-23

    Bovine fascioliasis causes important economic losses, estimated at COP$ 12,483 billion per year; its prevalence is 25% in dairy cattle. Parasitological techniques are required for its diagnosis. The Dennis technique, modified in 2002, is the one used in Colombia, but its sensitivity, specificity and validity are not known. To evaluate the validity and performance of the modified Dennis technique for diagnosis of bovine fascioliasis, using as reference test the observation of parasites in the liver, we conducted a diagnostic evaluation study. We selected a convenience sample of culled bovines slaughtered between March and June, 2013, in Frigocolanta. We collected 25 g of feces from each animal, and their livers and bile ducts were examined for Fasciola hepatica. The sensitivity, specificity, positive predictive value, negative predictive value, and validity index were calculated with 95% confidence intervals. The post-mortem evaluation was used as the gold standard. We analyzed 180 bovines. The sensitivity and specificity of the modified Dennis technique were 73.2% (95% CI=58.4% - 87.9%) and 84.2% (95% CI= 77.7% - 90.6%), respectively. The positive predictive value was 57.7% (95% CI= 43.3% - 72.1%) and the negative one 91.4% (95% CI= 86.2% - 96.6%). The prevalence of bovine fascioliasis was 22.8% (95% CI= 16.4% - 29.2%). The validity and performance of the modified Dennis technique were higher than those of the traditional one, which makes it a good screening test for diagnosing fascioliasis in population and prevalence studies and during animal health campaigns.
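
    The reported measures follow directly from the 2×2 table of test result versus liver examination (the gold standard). A sketch with the generic formulas and Wald-type 95% intervals; the counts below are back-calculated to be consistent with the reported percentages, for illustration only, and the study's actual table may differ.

```python
import math

def diagnostic_metrics(tp, fp, fn, tn, z=1.96):
    """Sensitivity, specificity, PPV and NPV with Wald 95% CIs
    from a 2x2 table against a gold standard."""
    def prop_ci(k, n):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, (p - half, p + half)
    return {
        "sensitivity": prop_ci(tp, tp + fn),
        "specificity": prop_ci(tn, tn + fp),
        "PPV": prop_ci(tp, tp + fp),
        "NPV": prop_ci(tn, tn + fn),
    }

# Illustrative counts consistent with the reported results (n = 180).
for name, (p, ci) in diagnostic_metrics(tp=30, fp=22, fn=11, tn=117).items():
    print(f"{name}: {100*p:.1f}% (95% CI {100*ci[0]:.1f}-{100*ci[1]:.1f}%)")
```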

  6. A practical guide for operational validation of discrete simulation models

    Directory of Open Access Journals (Sweden)

    Fabiano Leal

    2011-04-01

    As the number of simulation experiments increases, the necessity for validation and verification of these models demands special attention on the part of simulation practitioners. An analysis of the current scientific literature shows that the descriptions of operational validation presented in many papers do not agree on the importance assigned to this process, nor on the techniques applied, whether subjective or objective. With the aim of guiding professionals, researchers and students in simulation, this article elaborates a practical guide through the compilation of statistical techniques for the operational validation of discrete simulation models. Finally, the guide's applicability was evaluated using two study objects, which represent two manufacturing cells: one from the automobile industry and the other from a Brazilian tech company. For each application, the guide identified distinct steps, due to the different aspects that characterize the analyzed distributions
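
    One of the objective techniques such guides typically compile is a two-sample hypothesis test between the real system's output and the simulated output. The sketch below uses hypothetical cycle-time data, not the manufacturing cells studied in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical throughput observations from the real manufacturing cell
# and from replications of the simulation model.
real = rng.normal(loc=52.0, scale=4.0, size=30)
simulated = rng.normal(loc=51.2, scale=4.5, size=30)

# Welch's t-test: are the mean outputs statistically distinguishable?
t, p = stats.ttest_ind(real, simulated, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
# A large p-value gives no evidence of a difference, supporting
# (but not proving) the operational validity of the model.
```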

  7. Validation of software releases for CMS

    International Nuclear Information System (INIS)

    Gutsche, Oliver

    2010-01-01

    The CMS software stack currently consists of more than 2 million lines of code developed by over 250 authors, with a new version being released every week. CMS has set up a validation process for quality assurance which enables the developers to compare the performance of a release to previous releases and references. The validation process provides the developers with reconstructed datasets of real data and MC samples. The samples span the whole range of detector effects and important physics signatures to benchmark the performance of the software. They are used to investigate interdependency effects of all CMS software components and to find and fix bugs. The release validation process described here is an integral part of CMS software development and contributes significantly to ensure stable production and analysis. It represents a sizable contribution to the overall MC production of CMS. Its success emphasizes the importance of a streamlined release validation process for projects with a large code basis and significant number of developers and can function as a model for future projects.

  8. Simple protein precipitation extraction technique followed by validated chromatographic method for linezolid analysis in real human plasma samples to study its pharmacokinetics.

    Science.gov (United States)

    Mohammed, Samah A; Eissa, Maya S; Ahmed, Hytham M

    2017-02-01

    A fast and sensitive HPLC method was developed, optimized and validated for the quantification of linezolid (LNZ) in human plasma, using guaifenesin as an internal standard (IS). Analyte and IS were extracted from plasma by a simple protein precipitation extraction technique using methanol as the precipitating solvent. The pretreated samples were injected in a mobile phase composed of acetonitrile:water:methanol (20:70:10, v/v/v) in isocratic mode at a flow rate of 1.5 mL/min, with UV detection at 251 nm. Separation was performed on an Agilent ODS C18 column. The method showed linearity in the range of 0.75-50 μg/mL with a correlation coefficient of 0.9991. Precision and accuracy were in conformity with the criteria normally accepted in bio-analytical method validation. The RSDs for intra- and inter-day assays were <3.56 and 4.63%, respectively. The intra- and inter-day accuracies were 94.67-98.28% and 91.25-96.18%, respectively. The mean absolute recoveries ranged from 92.56±1.78 to 95.24±2.84. According to the stability results, LNZ was stable in human plasma during storage and analysis. The pharmacokinetic behavior of LNZ was studied by applying the proposed analytical method. Copyright © 2016 Elsevier B.V. All rights reserved.
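
    Linearity in bio-analytical validation is typically established by a least-squares calibration curve over the working range. A sketch follows: the 0.75-50 μg/mL range is from the abstract, but the peak-area ratios are invented for illustration.

```python
import numpy as np

# Hypothetical calibration standards (ug/mL) across the validated range
# and invented analyte/IS peak-area ratios.
conc = np.array([0.75, 2.5, 5, 10, 25, 50])
ratio = np.array([0.041, 0.135, 0.268, 0.545, 1.34, 2.71])

# Least-squares calibration line and correlation coefficient.
slope, intercept = np.polyfit(conc, ratio, 1)
r = np.corrcoef(conc, ratio)[0, 1]
print(f"y = {slope:.4f}x + {intercept:.4f}, r = {r:.4f}")

# Back-calculate an unknown sample's concentration from its ratio.
unknown_ratio = 0.80
print("estimated conc:", (unknown_ratio - intercept) / slope)
```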

  9. Network Security Validation Using Game Theory

    Science.gov (United States)

    Papadopoulou, Vicky; Gregoriades, Andreas

    Non-functional requirements (NFR) such as network security have recently gained widespread attention in distributed information systems. Despite their importance, however, there is no systematic approach to validating these requirements given the complexity and uncertainty characterizing modern networks. Traditionally, network security requirements specification has been the result of a reactive process. This, however, limited the immunity property of the distributed systems that depended on these networks. Security requirements specification needs a proactive approach. Networks' infrastructure is constantly under attack by hackers and malicious software that aim to break into computers. To combat these threats, network designers need sophisticated security validation techniques that will guarantee the minimum level of security for their future networks. This paper presents a game-theoretic approach to security requirements validation. An introduction to game theory is presented, along with an example that demonstrates the application of the approach.
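
    A minimal sketch of the game-theoretic idea, assuming a zero-sum game between defender and attacker with an invented payoff matrix (not the paper's example): the defender's optimal mixed strategy and the game value come from the standard linear-programming formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Invented defender payoffs: rows = defender configurations,
# columns = attacker actions (higher = better for the defender).
A = np.array([[0.8, 0.2, 0.5],
              [0.4, 0.7, 0.6],
              [0.6, 0.5, 0.3]])
m, n = A.shape

# Maximise the game value v subject to sum_i p_i * A[i, j] >= v for all j,
# sum(p) = 1, p >= 0. Variables x = [p_1..p_m, v]; linprog minimises,
# so the objective is -v.
c = np.zeros(m + 1); c[-1] = -1.0
A_ub = np.hstack([-A.T, np.ones((n, 1))])   # v - sum_i p_i * A[i, j] <= 0
b_ub = np.zeros(n)
A_eq = np.array([[1.0] * m + [0.0]])
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
p, v = res.x[:m], res.x[-1]
print("defender mixed strategy:", np.round(p, 3), "game value:", round(v, 3))
```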

  10. Multi-element analysis of lubricant oil by WDXRF technique using thin-film sample preparation

    International Nuclear Information System (INIS)

    Scapin, M. A.; Salvador, V. L. R.; Lopes, C. D.; Sato, I. M.

    2006-01-01

    The quantitative analysis of chemical elements in matrices like oils or gels represents a challenge for analytical chemists. Classical methods and instrumental techniques such as atomic absorption spectrometry (AAS) and inductively coupled plasma optical emission spectrometry (ICP-OES) need chemical treatments, mainly sample dissolution and degradation processes. The X-ray fluorescence technique allows a direct, multi-element analysis without previous sample treatment. In this work, a sensitive method for the determination of the elements Mg, Al, Si, P, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Mo, Ag, Sn, Ba and Pb in lubricating oil is presented. The wavelength-dispersive X-ray fluorescence (WDXRF) technique, using a linear regression method and thin-film sample preparation, was used. The validation of the methodology (repeatability and accuracy) was obtained by the analysis of the standard reference material SRM Alpha AESAR lot 703527D, applying the Chauvenet, Cochran, ANOVA and Z-score statistical tests. The method presents a relative standard deviation lower than 10% for all the elements, except for the Pb determination (RSD Pb = 15%). The Z-score values for all the elements were in the range -2 < Z < 2, indicating very good accuracy.
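
    The Z-score acceptance criterion used here is the standard proficiency-testing statistic. A sketch with invented certified/measured values follows; the elements and numbers are placeholders, not the paper's data.

```python
def z_score(measured, certified, sigma):
    """Proficiency-style Z-score: |Z| <= 2 is conventionally satisfactory,
    2 < |Z| < 3 questionable, |Z| >= 3 unsatisfactory."""
    return (measured - certified) / sigma

# Invented values per element: (certified value, lab result, target sd).
results = {"Fe": (250.0, 243.0, 8.0),
           "Zn": (60.0, 63.5, 2.5),
           "Pb": (15.0, 19.8, 1.6)}

for element, (certified, measured, sigma) in results.items():
    z = z_score(measured, certified, sigma)
    verdict = "OK" if abs(z) <= 2 else "check"
    print(f"{element}: Z = {z:+.2f} ({verdict})")
```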

  11. Factor analysis methods and validity evidence: A systematic review of instrument development across the continuum of medical education

    Science.gov (United States)

    Wetzel, Angela Payne

    Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across the continuum of medical education had not been previously identified. Therefore, the purpose of this study was a critical review of instrument development articles employing exploratory factor or principal component analysis published in medical education (2006--2010) to describe and assess the reporting of methods and validity evidence based on the Standards for Educational and Psychological Testing and factor analysis best practices. Data extraction of 64 articles measuring a variety of constructs that have been published throughout the peer-reviewed medical education literature indicates significant errors in the translation of exploratory factor analysis best practices to current practice. Further, techniques for establishing validity evidence tend to derive from a limited scope of methods including reliability statistics to support internal structure and support for test content. Instruments reviewed for this study lacked supporting evidence based on relationships with other variables and response process, and evidence based on consequences of testing was not evident. Findings suggest a need for further professional development within the medical education researcher community related to (1) appropriate factor analysis methodology and reporting, and (2) the importance of pursuing multiple sources of reliability and validity evidence to construct a well-supported argument for the inferences made from the instrument. Medical education researchers and educators should be cautious in adopting instruments from the literature and carefully review available evidence. Finally, editors and reviewers are encouraged to recognize

  12. Validating the Remotely Sensed Geography of Crime: A Review of Emerging Issues

    Directory of Open Access Journals (Sweden)

    Alice B. Kelly

    2014-12-01

    This paper explores the existing literature on the active detection of crimes using remote sensing technologies. The paper reviews sixty-one studies that use remote sensing to actively detect crime. Considering the serious consequences of misidentifying crimes or sites of crimes (e.g., opening that place and its residents up to potentially needless intrusion, intimidation, surveillance or violence), the authors were surprised to find a lack of rigorous validation of the remote sensing methods utilized in these studies. In some cases, validation was not mentioned, while in others, validation was severely hampered by security issues, rough terrain and weather conditions. The paper also considers the potential hazards of the use of Google Earth to identify crimes and criminals. The paper concludes by considering alternate, "second order" validation techniques that could add vital context and understanding to remotely sensed images in a law enforcement context. With this discussion, the authors seek to initiate a discussion on other potential "second order" validation techniques, as well as on the exponential growth of surveillance in our everyday lives.

  13. Site characterization and validation - stress field in the SCV block and around the validation drift. Stage 3

    International Nuclear Information System (INIS)

    McKinnon, S.; Carr, P.

    1990-04-01

    The results of previous stress measurement and stress modelling programmes carried out in the vicinity of the SCV block have been reviewed. Collectively, the results show that the stress field is influenced by the presence of the old mine excavations, and the measurements can be divided into near-field and far-field locations. The near-field measurements denote the extent and magnitude of the mining induced stresses while the far-field measurements reflect virgin conditions. Because of large scatter in the previous data, additional stress measurements were carried out using the CSIRO hollow inclusion cell. Combining all measurements, an estimate of the virgin stress tensor was made. Three-dimensional stress modelling was carried out using the program BEFE to determine the state of stress in the SCV block, and around the validation drift. This modelling showed that most of the SCV block is in a virgin stress field. Stresses acting on the fracture zones in the SCV block will be due only to the virgin stress field and induced stresses from the validation drift. (orig.)

  14. Prosthetic valve sparing aortic root replacement: an improved technique.

    Science.gov (United States)

    Leacche, Marzia; Balaguer, Jorge M; Umakanthan, Ramanan; Byrne, John G

    2008-10-01

    We describe a modified surgical technique to treat patients with a previous history of isolated aortic valve replacement who now require aortic root replacement for an aneurysmal or dissected aorta. This technique consists of replacing the aortic root with a Dacron conduit, leaving intact the previously implanted prosthesis, and re-implanting the coronary arteries in the Dacron graft. Our technique differs from other techniques in that we do not leave behind any aortic tissue remnant and also in that we use a felt strip to obliterate any gap between the old sewing ring and the newly implanted graft. In our opinion, this promotes better hemostasis. We demonstrate that this technique is safe, feasible, and results in acceptable outcomes.

  15. Indications and technique of fetal magnetic resonance imaging

    International Nuclear Information System (INIS)

    Asenbaum, U.; Woitek, R.; Furtner, J.; Prayer, D.; Brugger, P.C.

    2013-01-01

    The indication for fetal MRI is the evaluation and confirmation of fetal pathologies previously suspected or diagnosed with ultrasound. The methods of prenatal fetal examination are ultrasound and magnetic resonance imaging (MRI). Fetal MRI is an established supplementary technique to prenatal ultrasound. It should only be used as an additional method in prenatal diagnostics, not for routine screening, and should only be performed in perinatal medicine centers after a previous level III ultrasound examination. (orig.)

  16. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible.* Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods* Covers the latest developments on multiple comparisons * Includes recent advanc

  17. Criminal profiling as expert witness evidence: The implications of the profiler validity research.

    Science.gov (United States)

    Kocsis, Richard N; Palermo, George B

    The use and development of the investigative tool colloquially known as criminal profiling has steadily increased over the past five decades throughout the world. Coupled with this growth has been a diversification in the suggested range of applications for this technique. Possibly the most notable of these has been the attempted transition of the technique from a tool intended to assist police investigations into a form of expert witness evidence admissible in legal proceedings. Whilst case law in various jurisdictions has considered with mutual disinclination the evidentiary admissibility of criminal profiling, a disjunction has evolved between these judicial examinations and the scientifically vetted research testing the accuracy (i.e., validity) of the technique. This article offers an analysis of the research directly testing the validity of the criminal profiling technique and the extant legal principles considering its evidentiary admissibility. This analysis reveals that research findings concerning the validity of criminal profiling are surprisingly compatible with the extant legal principles. The overall conclusion is that a discrete form of crime behavioural analysis is supported by the profiler validity research and could be regarded as potentially admissible expert witness evidence. Finally, a number of theoretical connections are also identified concerning the skills and qualifications of individuals who may feasibly provide such expert testimony. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Applying modern psychometric techniques to melodic discrimination testing: Item response theory, computerised adaptive testing, and automatic item generation.

    Science.gov (United States)

    Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel

    2017-06-15

    Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
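
    The item response theory models underpinning such adaptive tests are compactly expressed by the two-parameter logistic (2PL) model. A sketch with invented item parameters follows; this is generic IRT, not the melodic discrimination test's calibrated item bank.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: probability that a person with ability
    theta answers an item with discrimination a and difficulty b correctly."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Invented item bank: (discrimination, difficulty) pairs.
items = [(1.2, -1.0), (0.8, 0.0), (1.5, 1.2)]

theta = 0.5  # current ability estimate of the test-taker
for a, b in items:
    p = p_correct(theta, a, b)
    # Fisher information a^2 * p * (1 - p) drives adaptive item selection:
    info = a**2 * p * (1 - p)
    print(f"item(a={a}, b={b}): P(correct) = {p:.2f}, info = {info:.3f}")
```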

  19. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    Science.gov (United States)

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute for Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). Total fluorine results may be used

  20. Labelled antibody techniques in glycoprotein estimation

    International Nuclear Information System (INIS)

    Hazra, D.K.; Ekins, R.P.; Edwards, R.; Williams, E.S.

    1977-01-01

    The problems in the radioimmunoassay of the glycoprotein hormones (pituitary LH, FSH and TSH, and human chorionic gonadotrophin, HCG) are reviewed, viz: limited specificity and sensitivity in the clinical context, interpretation of disparity between bioassay and radioimmunoassay, and interlaboratory variability. The advantages and limitations of the labelled antibody techniques - classical immunoradiometric methods and 2-site or 125 I-anti-IgG indirect labelling modifications - are reviewed in general, and their theoretical potential in glycoprotein assays examined in the light of previous work. Preliminary experiments in the development of a coated-tube 2-site assay for glycoproteins using 125 I-anti-IgG labelling are described, including conditions for maximizing solid-phase extraction of the antigen, iodination of anti-IgG, and assay conditions such as the effects of incubation temperature with antigen, 'hormone-free serum', heterologous serum and detergent washing. Experiments with extraction and antigen-specific antisera raised in the same or different species are described, as exemplified by LH and TSH assay systems, the latter apparently promising greater sensitivity than radioimmunoassay. Proposed experimental and mathematical optimisation and validation of the method as an assay system are outlined, and the areas for further work delineated. (orig.)

  1. Alternative Constraint Handling Technique for Four-Bar Linkage Path Generation

    Science.gov (United States)

    Sleesongsom, S.; Bureerat, S.

    2018-03-01

    This paper proposes an extension of a new concept for path generation from our previous work by adding a new constraint handling technique. The proposed technique was initially designed for problems without prescribed timing, avoiding the timing constraint, while the remaining constraints are handled with a new constraint handling technique, which is a kind of penalty technique. In a comparative study, path generation optimisation problems are solved using self-adaptive population size teaching-learning based optimization (SAP-TLBO) and the original TLBO. In this study, two traditional path generation test problems are used to test the proposed technique. The results show that the new technique can be applied to path generation problems without prescribed timing and gives better results than the previous technique. Furthermore, SAP-TLBO outperforms the original one.
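
    Since the abstract only identifies the constraint handling as "one kind of penalty technique", here is a generic quadratic exterior-penalty sketch showing how such a scheme plugs into any population-based optimiser such as TLBO. The objective, constraints, and penalty factor are invented stand-ins, not the paper's formulation.

```python
def penalised_fitness(x, objective, constraints, R=1e4):
    """Exterior penalty: add R * sum(max(0, g_i(x))^2) over violated
    inequality constraints g_i(x) <= 0. Illustrative only."""
    penalty = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return objective(x) + R * penalty

# Invented path-generation-style objective: tracking error between the
# coupler point and a target point, standing in for the real problem.
def tracking_error(x):
    return (x[0] - 2.0) ** 2 + (x[1] - 3.0) ** 2

# Invented constraints, e.g. Grashof-type link-length inequalities g(x) <= 0.
constraints = [lambda x: x[0] + x[1] - 6.0,   # satisfied when <= 0
               lambda x: 1.0 - x[0]]

# Feasible candidate: no penalty, fitness equals the raw objective.
print(penalised_fitness([2.5, 3.0], tracking_error, constraints))
```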

  2. Verification and Validation of a Fingerprint Image Registration Software

    Directory of Open Access Journals (Sweden)

    Liu Yan

    2006-01-01

    The need for reliable identification and authentication is driving the increased use of biometric devices and systems. Verification and validation techniques applicable to these systems are rather immature and ad hoc, yet the consequences of the wide deployment of biometric systems could be significant. In this paper we discuss an approach towards validation and reliability estimation of a fingerprint registration software. Our validation approach includes the following three steps: (a) the validation of the source code with respect to the system requirements specification; (b) the validation of the optimization algorithm, which is at the core of the registration system; and (c) the automation of testing. Since the optimization algorithm is heuristic in nature, mathematical analysis and test results are used to estimate the reliability and perform failure analysis of the image registration module.

  3. Time since discharge of 9mm cartridges by headspace analysis, part 1: Comprehensive optimisation and validation of a headspace sorptive extraction (HSSE) method.

    Science.gov (United States)

    Gallidabino, M; Romolo, F S; Weyermann, C

    2017-03-01

    Estimating the time since discharge of spent cartridges can be a valuable tool in the forensic investigation of firearm-related crimes. To reach this aim, it was previously proposed that the decrease of volatile organic compounds released during discharge be monitored over time using non-destructive headspace extraction techniques. While promising results were obtained for large-calibre cartridges (e.g., shotgun shells), handgun calibres yielded unsatisfactory results. In addition to the natural complexity of the specimen itself, this can also be attributed to some selective choices in method development. Thus, the present series of papers aims to more systematically evaluate the potential of headspace analysis to estimate the time since discharge of cartridges through the use of more comprehensive analytical and interpretative techniques. Specifically, in this first part, a method based on headspace sorptive extraction (HSSE) was comprehensively optimised and validated, as the latter recently proved to be a more efficient alternative than previous approaches. For this purpose, 29 volatile organic compounds were preliminarily selected on the basis of previous work. A multivariate statistical approach based on design of experiments (DOE) was used to optimise variables potentially involved in interaction effects. The introduction of deuterated analogues in sampling vials was also investigated as a strategy to account for analytical variations. Analysis was carried out by gas chromatography coupled to mass spectrometry (GC-MS) in selected-ion mode. Results showed good chromatographic resolution as well as good detection limits and peak-area repeatability. Application to 9mm spent cartridges confirmed that the use of co-extracted internal standards allowed for improved reproducibility of the measured signals. The validated method will be applied in the second part of this work to estimate the time since discharge of 9mm spent cartridges using multivariate models.

  4. Validity and reliability of an instrument for assessing case analyses in bioengineering ethics education.

    Science.gov (United States)

    Goldin, Ilya M; Pinkus, Rosa Lynn; Ashley, Kevin

    2015-06-01

    Assessment in ethics education faces a challenge. From the perspectives of teachers, students, and third-party evaluators like the Accreditation Board for Engineering and Technology and the National Institutes of Health, assessment of student performance is essential. Because of the complexity of ethical case analysis, however, it is difficult to formulate assessment criteria, and to recognize when students fulfill them. Improvement in students' moral reasoning skills can serve as the focus of assessment. In previous work, Rosa Lynn Pinkus and Claire Gloeckner developed a novel instrument for assessing moral reasoning skills in bioengineering ethics. In this paper, we compare that approach to existing assessment techniques, and evaluate its validity and reliability. We find that it is sensitive to knowledge gain and that independent coders agree on how to apply it.

  5. Validation study of genetic biomarkers of response to TNF inhibitors in rheumatoid arthritis.

    Directory of Open Access Journals (Sweden)

    Rosario Lopez-Rodriguez

    Genetic biomarkers are sought to personalize treatment of patients with rheumatoid arthritis (RA), given their variable response to TNF inhibitors (TNFi). However, no genetic biomarker is yet sufficiently validated. Here, we report a validation study of 18 previously reported genetic biomarkers, including 11 from GWAS of response to TNFi. The validation was attempted in 581 patients with RA who had not been treated with biologic antirheumatic drugs previously. Their response to TNFi was evaluated at 3, 6 and 12 months in two ways: change in the DAS28 measure of disease activity, and according to the EULAR criteria for response to antirheumatic drugs. Association of these parameters with the genotypes, obtained by PCR amplification followed by single-base extension, was tested with regression analysis. These analyses were adjusted for baseline DAS28, sex, and the specific TNFi. However, none of the proposed biomarkers was validated, as none showed association with response to TNFi in our study, even at the time of assessment and with the outcome that showed the most significant result in previous studies. These negative results are notable because this was the first independent validation study for 12 of the biomarkers, and because they indicate that prudence is needed in the interpretation of the proposed biomarkers of response to TNFi even when they are supported by very low p values. The results also emphasize the requirement of independent replication for validation, and the need to seek protocols that could increase the reproducibility of the biomarkers of response to TNFi.

  6. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, have been presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how model parameter space analysis is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  7. Validation testing of a soil macronutrient sensing system

    Science.gov (United States)

    Rapid on-site measurements of soil macronutrients (i.e., nitrogen, phosphorus, and potassium) are needed for site-specific crop management, where fertilizer nutrient application rates are adjusted spatially based on local requirements. This study reports on validation testing of a previously develop...

  8. Modeling, implementation, and validation of arterial travel time reliability.

    Science.gov (United States)

    2013-11-01

    Previous research funded by Florida Department of Transportation (FDOT) developed a method for estimating : travel time reliability for arterials. This method was not initially implemented or validated using field data. This : project evaluated and r...

  9. Performance analysis of clustering techniques over microarray data: A case study

    Science.gov (United States)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis. In such investigations, cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. The grading approach depends on the characteristics of the dataset as well as on the validity indices, so a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also established by the Nemenyi post-hoc hypothesis test.
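
    A reduced sketch of the idea: score several clustering techniques on the same data with a validity index and rank them. HSC, PAM and VQ are not in scikit-learn, so k-means and an AGNES-style agglomerative algorithm stand in, and the data are synthetic rather than microarray expression profiles.

```python
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic stand-in for a microarray expression matrix.
X, _ = make_blobs(n_samples=300, centers=4, n_features=10, random_state=0)

techniques = {
    "k-means": KMeans(n_clusters=4, n_init=10, random_state=0),
    "AGNES": AgglomerativeClustering(n_clusters=4),
}

# Grade each technique by a validity index (here: silhouette width).
grades = {}
for name, algo in techniques.items():
    labels = algo.fit_predict(X)
    grades[name] = silhouette_score(X, labels)

for name, s in sorted(grades.items(), key=lambda kv: -kv[1]):
    print(f"{name}: silhouette = {s:.3f}")
```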

  10. Statistical Analysis Methods for Physics Models Verification and Validation

    CERN Document Server

    De Luca, Silvia

    2017-01-01

    The validation and verification process is a fundamental step for any software like Geant4 and GeantV, which aim to perform data simulation using physics models and Monte Carlo techniques. As experimental physicists, we have to face the problem of comparing the results obtained using simulations with what the experiments actually observed. One way to solve the problem is to perform a consistency test. Within the Geant group, we developed a compact C++ library which will be added to the automated validation process on the Geant Validation Portal
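
    A consistency test between a simulated and an observed distribution is commonly a two-sample Kolmogorov-Smirnov (or chi-square) test. The sketch below is generic and does not use the C++ library mentioned in the abstract; the distributions are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Stand-ins for a simulated physics observable and experimental data.
simulated = rng.normal(loc=0.0, scale=1.0, size=5000)
observed = rng.normal(loc=0.02, scale=1.0, size=1000)

# Two-sample KS test: small p-values flag inconsistency between
# the model prediction and the measurement.
stat, p = stats.ks_2samp(simulated, observed)
print(f"KS statistic = {stat:.3f}, p = {p:.3f}")
```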

  11. Validation of previously reported predictors for radiation-induced hypothyroidism in nasopharyngeal cancer patients treated with intensity-modulated radiation therapy, a post hoc analysis from a Phase III randomized trial.

    Science.gov (United States)

    Lertbutsayanukul, Chawalit; Kitpanit, Sarin; Prayongrat, Anussara; Kannarunimit, Danita; Netsawang, Buntipa; Chakkabat, Chakkapong

    2018-05-10

    This study aimed to validate previously reported dosimetric parameters, including thyroid volume, mean dose, percentage of thyroid volume receiving at least 40, 45 and 50 Gy (V40, V45 and V50), absolute thyroid volume spared (VS) from 45, 50 and 60 Gy (VS45, VS50 and VS60), and clinical factors affecting the development of radiation-induced hypothyroidism (RHT). A post hoc analysis was performed in 178 euthyroid nasopharyngeal cancer (NPC) patients from a Phase III study comparing sequential versus simultaneous-integrated boost intensity-modulated radiation therapy. RHT was determined by increased thyroid-stimulating hormone (TSH) with or without reduced free thyroxine, regardless of symptoms. The median follow-up time was 42.5 months. The 1-, 2- and 3-year freedom from RHT rates were 78.4%, 56.4% and 43.4%, respectively. The median latency period was 21 months. The thyroid gland received a median mean dose of 53.5 Gy. Female gender, smaller thyroid volume, higher pretreatment TSH level (≥1.55 μU/ml) and VS60 were associated with the development of RHT and should be considered during treatment planning.

  12. Experience with Aero- and Fluid-Dynamic Testing for Engineering and CFD Validation

    Science.gov (United States)

    Ross, James C.

    2016-01-01

    Ever since computations have been used to simulate aerodynamics, the need to ensure that the computations adequately represent real life has followed. Many experiments have been performed specifically for validation, and as computational methods have improved, so have the validation experiments. Validation is also a moving target because computational methods improve, requiring validation for the new aspects of flow physics that the computations aim to capture. Concurrently, new measurement techniques are being developed that can help capture more detailed flow features; pressure-sensitive paint (PSP) and particle image velocimetry (PIV) come to mind. This paper will present various wind-tunnel tests the author has been involved with and how they were used for validation of various kinds of CFD. A particular focus is the application of advanced measurement techniques to flow fields (and geometries) that had proven difficult to predict computationally. Many of these difficult flow problems arose from engineering and development problems that needed to be solved for a particular vehicle or research program. In some cases the experiments required to solve the engineering problems were refined to provide valuable CFD validation data in addition to the primary engineering data. All of these experiments have provided physical insight and validation data for a wide range of aerodynamic and acoustic phenomena for vehicles ranging from tractor-trailers to crewed spacecraft.

  13. Solution Validation for a Double Façade Prototype

    Directory of Open Access Journals (Sweden)

    Pau Fonseca i Casas

    2017-12-01

    Solution validation involves comparing the data obtained from the system implemented following the model recommendations with the model results. This paper presents a solution validation performed with the aim of certifying that a set of computer-optimized designs for a double façade are consistent with reality. To validate the results obtained through simulation models, based on dynamic thermal calculation and using Computational Fluid Dynamics techniques, a comparison with the data obtained by monitoring a real implemented prototype was carried out. The newly validated model can be used to describe the system's thermal behavior in different climatic zones without having to build a new prototype. The good performance of the proposed double façade solution is confirmed, since the validation assures there is a considerable energy saving while preserving and even improving interior comfort. This work shows all the processes in the solution validation, depicting some of the problems we faced, and represents an example of this kind of validation, which often is not considered in a simulation project.

  14. Pulse shaping using the optical Fourier transform technique - for ultra-high-speed signal processing

    DEFF Research Database (Denmark)

    Palushani, Evarist; Oxenløwe, Leif Katsuo; Galili, Michael

    2009-01-01

    This paper reports on the generation of a 1.6 ps FWHM flat-top pulse using the optical Fourier transform technique. The pulse is validated in a 320 Gbit/s demultiplexing experiment.

  15. Validation of New Cancer Biomarkers

    DEFF Research Database (Denmark)

    Duffy, Michael J; Sturgeon, Catherine M; Söletormos, Georg

    2015-01-01

    BACKGROUND: Biomarkers are playing increasingly important roles in the detection and management of patients with cancer. Despite an enormous number of publications on cancer biomarkers, few of these biomarkers are in widespread clinical use. CONTENT: In this review, we discuss the key steps in advancing a newly discovered cancer candidate biomarker from pilot studies to clinical application. Four main steps are necessary for a biomarker to reach the clinic: analytical validation of the biomarker assay, clinical validation of the biomarker test, demonstration of clinical value from performance of the biomarker test, and regulatory approval. In addition to these 4 steps, all biomarker studies should be reported in a detailed and transparent manner, using previously published checklists and guidelines. Finally, all biomarker studies relating to demonstration of clinical value should be registered before

  16. MRI technique for the snapshot imaging of quantitative velocity maps using RARE

    Science.gov (United States)

    Shiko, G.; Sederman, A. J.; Gladden, L. F.

    2012-03-01

    A quantitative PGSE-RARE pulse sequence was developed and successfully applied to the in situ dissolution of two pharmaceutical formulations dissolving over a range of timescales. The new technique was chosen over other existing fast velocity imaging techniques because it is T2 weighted, not T2∗ weighted, and is, therefore, robust for imaging time-varying interfaces and flow in magnetically heterogeneous systems. The complex signal was preserved intact by separating odd and even echoes to obtain two phase maps which are then averaged in post-processing. Initially, the validity of the technique was shown when imaging laminar flow in a pipe. Subsequently, the dissolution of two drugs was followed in situ, where the technique enables the imaging and quantification of changes in the form of the tablet and the flow field surrounding it at high spatial and temporal resolution. First, the complete 3D velocity field around an eroding salicylic acid tablet was acquired at a resolution of 98 × 49 μm², within 20 min, and monitored over ~13 h. The tablet was observed to experience a heterogeneous flow field and, hence a heterogeneous shear field, which resulted in the non-symmetric erosion of the tablet. Second, the dissolution of a fast dissolving immediate release tablet was followed using one-shot 2D velocity images acquired every 5.2 s at a resolution of 390 × 390 μm². The quantitative nature of the technique and fast acquisition times provided invaluable information on the dissolution behaviour of this tablet, which had not been attainable previously with conventional quantitative MRI techniques.

  18. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    Science.gov (United States)

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  19. The development of a self-administered dementia checklist: the examination of concurrent validity and discriminant validity.

    Science.gov (United States)

    Miyamae, Fumiko; Ura, Chiaki; Sakuma, Naoko; Niikawa, Hirotoshi; Inagaki, Hiroki; Ijuin, Mutsuo; Okamura, Tsuyoshi; Sugiyama, Mika; Awata, Shuichi

    2016-01-01

    The present study aims to develop a self-administered dementia checklist to enable community-residing older adults to realize their declining functions and start using necessary services. A previous study confirmed the factorial validity and internal reliability of the checklist. The present study examined its concurrent validity and discriminant validity. The authors conducted a 3-step study (a self-administered survey including the checklist, interviews by nurses, and interviews by doctors and psychologists) of 7,682 community-residing individuals who were over 65 years of age. The authors calculated Spearman's correlation coefficients between the scores of the checklist and the results of a psychological test to examine the concurrent validity. They also compared the average total scores of the checklist between groups with different Clinical Dementia Rating (CDR) scores to examine discriminant validity and conducted a receiver operating characteristic analysis to examine the discriminative power for dementia. The authors analyzed the data of 131 respondents who completed all 3 steps. The checklist scores were significantly correlated with the respondents' Mini-Mental State Examination and Frontal Assessment Battery scores. The checklist also significantly discriminated the patients with dementia (CDR = 1+) from those without dementia (CDR = 0 or 0.5). The optimal cut-off point for the two groups was 17/18 (sensitivity, 72.0%; specificity, 69.2%; positive predictive value, 69.2%; negative predictive value, 72.0%). This study confirmed the concurrent validity and discriminant validity of the self-administered dementia checklist. However, due to its insufficient discriminative power as a screening tool for older people with declining cognitive functions, the checklist is only recommended as an educational and public awareness tool.
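
    The cut-off metrics quoted above follow from a simple 2×2 confusion table. A minimal sketch (all response data hypothetical) of how sensitivity, specificity, PPV and NPV are obtained at a 17/18 cut-off:

```python
import numpy as np

# Hypothetical data: total checklist scores and dementia status (CDR >= 1).
scores = np.array([12, 19, 25, 9, 21, 16, 30, 14])   # stand-in values
has_dementia = np.array([0, 1, 1, 0, 1, 0, 1, 0], dtype=bool)

cutoff = 17  # "17/18" cut-off: classify as positive when score >= 18
predicted_pos = scores > cutoff

tp = np.sum(predicted_pos & has_dementia)    # true positives
fp = np.sum(predicted_pos & ~has_dementia)   # false positives
fn = np.sum(~predicted_pos & has_dementia)   # false negatives
tn = np.sum(~predicted_pos & ~has_dementia)  # true negatives

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)   # positive predictive value
npv = tn / (tn + fn)   # negative predictive value
print(sensitivity, specificity, ppv, npv)
```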

  20. Validation of measures from the smartphone sway balance application: a pilot study.

    Science.gov (United States)

    Patterson, Jeremy A; Amick, Ryan Z; Thummar, Tarunkumar; Rogers, Michael E

    2014-04-01

    A number of different balance assessment techniques are currently available and widely used. These include both subjective and objective assessments. The ability to provide quantitative measures of balance and posture is the benefit of objective tools; however, these instruments are not generally utilized outside of research laboratory settings due to cost, complexity of operation, size, duration of assessment, and general practicality. The purpose of this pilot study was to assess the value and validity of using software developed to access the iPod and iPhone accelerometer output and translate it to the measurement of human balance. Thirty healthy college-aged individuals (13 male, 17 female; age = 26.1 ± 8.5 years) volunteered. Participants performed a static Athlete's Single Leg Test protocol for 10 sec on a Biodex Balance System SD while concurrently utilizing a mobile device with balance software. Anterior/posterior stability was recorded using both devices, described as the displacement in degrees from level, and was termed the "balance score." There were no significant differences between the two reported balance scores (p = 0.818). Mean balance score on the balance platform was 1.41 ± 0.90, as compared to 1.38 ± 0.72 using the mobile device. There is a need for a valid, convenient, and cost-effective tool to objectively measure balance. Results of this study are promising, as balance scores derived from the smartphone accelerometers were consistent with balance scores obtained from a previously validated balance system. However, further investigation is necessary, as this version of the mobile software only assessed balance in the anterior/posterior direction. Additionally, further testing is necessary on healthy populations as well as those with impairment of the motor control system. Level 2b (observational study of validity).
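
    Since both devices scored the same trials concurrently, the agreement check reduces to a paired comparison. A minimal sketch with hypothetical paired scores (the study's exact statistical procedure is not specified beyond the reported p-value):

```python
import numpy as np
from scipy import stats

# Hypothetical paired anterior/posterior balance scores (degrees from level),
# one value per participant per device, from the same 10-s trials.
platform = np.array([1.2, 0.9, 2.3, 1.8, 0.7, 1.5])   # stand-in values
phone = np.array([1.1, 1.0, 2.0, 1.9, 0.8, 1.4])      # stand-in values

# Paired t-test: are the two devices' scores systematically different?
t_stat, p_value = stats.ttest_rel(platform, phone)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```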

  1. On the effectiveness of XML schema validation for countering XML signature wrapping attacks

    DEFF Research Database (Denmark)

    Jensen, Meiko; Meyer, Christopher; Somorovsky, Juraj

    2011-01-01

    In the context of security of Web Services, the XML Signature Wrapping attack technique has lately received increasing attention. Following a broad range of real-world exploits, general interest in applicable countermeasures rises. However, few approaches for countering these attacks have been...... investigated closely enough to make any claims about their effectiveness. In this paper, we analyze the effectiveness of the specific countermeasure of XML Schema validation in terms of fending Signature Wrapping attacks. We investigate the problems of XML Schema validation for Web Services messages......, and discuss the approach of Schema Hardening, a technique for strengthening XML Schema declarations. We conclude that XML Schema validation with a hardened XML Schema is capable of fending XML Signature Wrapping attacks, but bears some pitfalls and disadvantages as well....
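
    In practice, schema validation of an incoming message is a one-call operation in most XML libraries. A minimal sketch using lxml (an assumption; the paper does not name a toolkit), with hypothetical file names. The point of schema hardening is that element order and cardinality are pinned down tightly enough that a wrapped (relocated) signed element no longer validates:

```python
from lxml import etree

# Hypothetical file names: a hardened XML Schema and an incoming SOAP message.
schema = etree.XMLSchema(etree.parse("hardened-soap-schema.xsd"))
message = etree.parse("incoming-message.xml")

# Reject any message whose structure deviates from the hardened declaration.
if schema.validate(message):
    print("message conforms to the hardened schema")
else:
    print("rejected:", schema.error_log.last_error)
```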

  2. Cognitive Support in Teaching Football Techniques

    Science.gov (United States)

    Duda, Henryk

    2009-01-01

    Study aim: To improve the teaching of football techniques by applying cognitive and imagery techniques. Material and methods: Four groups of subjects, n = 32 each, were studied: male and female physical education students aged 20-21 years, not engaged previously in football training; male juniors and minors, aged 16 and 13 years, respectively,…

  3. Limits of validity of photon-in-cell simulation techniques

    International Nuclear Information System (INIS)

    Reitsma, A. J. W.; Jaroszynski, D. A.

    2008-01-01

    A comparison is made between two reduced models for studying laser propagation in underdense plasma; namely, photon kinetic theory and the slowly varying envelope approximation. Photon kinetic theory is a wave-kinetic description of the electromagnetic field where the motion of quasiparticles in photon coordinate-wave number phase space is described by the ray-tracing equations. Numerically, the photon kinetic theory is implemented with standard particle-in-cell techniques, which results in a so-called photon-in-cell code. For all the examples presented in this paper, the slowly varying envelope approximation is accurate and therefore discrepancies indicate the failure of photon kinetic approximation for these cases. Possible remedies for this failure are discussed at the end of the paper
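
    The ray-tracing equations referred to above are the standard Hamiltonian ray equations of wave-kinetic theory, with the local frequency ω(x, k, t) playing the role of the Hamiltonian; the dispersion relation shown is the usual one for light in unmagnetized, underdense plasma (an illustration, not a quotation from the paper):

```latex
% Quasiparticle (ray-tracing) equations of photon kinetic theory:
\[
  \frac{d\mathbf{x}}{dt} = \frac{\partial \omega}{\partial \mathbf{k}},
  \qquad
  \frac{d\mathbf{k}}{dt} = -\,\frac{\partial \omega}{\partial \mathbf{x}},
\]
% with, e.g., the dispersion relation for light in unmagnetized plasma:
\[
  \omega^{2}(\mathbf{x},\mathbf{k},t) = \omega_{pe}^{2}(\mathbf{x},t) + c^{2}k^{2}.
\]
```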

  4. Validation of a Radiosensitivity Molecular Signature in Breast Cancer

    Science.gov (United States)

    Eschrich, Steven A.; Fulp, William J.; Pawitan, Yudi; Foekens, John A.; Smid, Marcel; Martens, John W. M.; Echevarria, Michelle; Kamath, Vidya; Lee, Ji-Hyun; Harris, Eleanor E.; Bergh, Jonas; Torres-Roca, Javier F.

    2014-01-01

    Purpose Previously, we developed a radiosensitivity molecular signature (RSI) that was clinically-validated in three independent datasets (rectal, esophageal, head and neck) in 118 patients. Here, we test RSI in radiotherapy (RT) treated breast cancer patients. Experimental Design RSI was tested in two previously published breast cancer datasets. Patients were treated at the Karolinska University Hospital (n=159) and Erasmus Medical Center (n=344). RSI was applied as previously described. Results We tested RSI in RT-treated patients (Karolinska). Patients predicted to be radiosensitive (RS) had an improved 5 yr relapse-free survival when compared with radioresistant (RR) patients (95% vs. 75%, p=0.0212) but there was no difference between RS/RR patients treated without RT (71% vs. 77%, p=0.6744), consistent with RSI being RT-specific (interaction term RSIxRT, p=0.05). Similarly, in the Erasmus dataset RT-treated RS patients had an improved 5-year distant-metastasis-free survival over RR patients (77% vs. 64%, p=0.0409) but no difference was observed in patients treated without RT (RS vs. RR, 80% vs. 81%, p=0.9425). Multivariable analysis showed RSI is the strongest variable in RT-treated patients (Karolinska, HR=5.53, p=0.0987, Erasmus, HR=1.64, p=0.0758) and in backward selection (removal alpha of 0.10) RSI was the only variable remaining in the final model. Finally, RSI is an independent predictor of outcome in RT-treated ER+ patients (Erasmus, multivariable analysis, HR=2.64, p=0.0085). Conclusions RSI is validated in two independent breast cancer datasets totaling 503 patients. Including prior data, RSI is validated in five independent cohorts (621 patients) and represents, to our knowledge, the most extensively validated molecular signature in radiation oncology. PMID:22832933
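
    Comparisons such as RS versus RR relapse-free survival are standard two-group survival analyses. A minimal sketch using the lifelines package (an assumption; not the authors' software), with hypothetical follow-up data:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical follow-up: time to relapse (years) and event indicator
# (1 = relapse observed, 0 = censored) for RS and RR groups.
t_rs = np.array([5.0, 4.2, 5.0, 3.8, 5.0]); e_rs = np.array([0, 1, 0, 1, 0])
t_rr = np.array([2.1, 5.0, 1.7, 3.0, 5.0]); e_rr = np.array([1, 0, 1, 1, 0])

# Kaplan-Meier estimate of relapse-free survival in the RS group.
km = KaplanMeierFitter()
km.fit(t_rs, event_observed=e_rs, label="radiosensitive")
print(km.survival_function_)

# Log-rank test for a difference between the two survival curves.
result = logrank_test(t_rs, t_rr, event_observed_A=e_rs, event_observed_B=e_rr)
print(result.p_value)
```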

  5. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10¹³-10¹⁴ neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the...

  6. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    In this paper a new model validation procedure for a logistic regression model is presented. First, we give a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.
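
    A typical quantitative check of a fitted logistic model evaluates discrimination and calibration on held-out data. A minimal scikit-learn sketch with simulated data (the paper's specific performance measures may differ):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, brier_score_loss

# Hypothetical data: 200 cases, 3 predictors, binary outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

# Hold out a validation split; fit on the rest.
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# Two common performance measures: discrimination (AUC) and calibration (Brier).
p_val = model.predict_proba(X_val)[:, 1]
print("AUC:", roc_auc_score(y_val, p_val))
print("Brier score:", brier_score_loss(y_val, p_val))
```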

  7. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    Science.gov (United States)

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  8. Experience with the Large Eddy Simulation (LES) Technique for the Modelling of Premixed and Non-premixed Combustion

    OpenAIRE

    Malalasekera, W; Ibrahim, SS; Masri, AR; Gubba, SR; Sadasivuni, SK

    2013-01-01

    Compared to RANS based combustion modelling, the Large Eddy Simulation (LES) technique has recently emerged as a more accurate and very adaptable technique in terms of handling complex turbulent interactions in combustion modelling problems. In this paper the application of LES based combustion modelling techniques and the validation of models in non-premixed and premixed situations are considered. Two well defined experimental configurations where high quality data are available for validation are...

  9. The low-frequency sound power measuring technique for an underwater source in a non-anechoic tank

    Science.gov (United States)

    Zhang, Yi-Ming; Tang, Rui; Li, Qi; Shang, Da-Jing

    2018-03-01

    In order to determine the radiated sound power of an underwater source below the Schroeder cut-off frequency in a non-anechoic tank, a low-frequency extension measuring technique is proposed. This technique is based on a unique relationship between the transmission characteristics of the enclosed field and those of the free field, which can be obtained as a correction term based on previous measurements of a known simple source. The radiated sound power of an unknown underwater source in the free field can thereby be obtained accurately from measurements in a non-anechoic tank. To verify the validity of the proposed technique, a mathematical model of the enclosed field is established using normal-mode theory, and the relationship between the transmission characteristics of the enclosed and free fields is obtained. The radiated sound power of an underwater transducer source is tested in a glass tank using the proposed low-frequency extension measuring technique. Compared with the free field, the radiated sound power level of the narrowband spectrum deviation is found to be less than 3 dB, and the 1/3 octave spectrum deviation is found to be less than 1 dB. The proposed testing technique can be used not only to extend the low-frequency applications of non-anechoic tanks, but also for measurement of radiated sound power from complicated sources in non-anechoic tanks.
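
    The correction-term idea can be reduced to level arithmetic: calibrate the tank's transfer characteristics once with a known simple source, then subtract that correction from subsequent tank measurements of an unknown source. A heavily simplified sketch with hypothetical band levels (the paper's actual derivation is via normal-mode theory; everything below is a stand-in):

```python
import numpy as np

# Hypothetical 1/3-octave band levels (dB). The correction term is calibrated
# once from a known simple source measured in both the tank and the free field.
Lp_ref_tank = np.array([112.0, 115.5, 118.2])   # reference source, measured in tank
Lw_ref_free = np.array([100.0, 104.0, 107.5])   # reference source, known free-field power

# Band-wise correction: transmission characteristics of the enclosed field
# relative to the free field.
correction = Lp_ref_tank - Lw_ref_free

# Unknown source measured in the same tank, same bands and positions.
Lp_unknown_tank = np.array([108.3, 111.0, 116.9])

# Estimated free-field radiated sound power of the unknown source.
Lw_unknown_free = Lp_unknown_tank - correction
print(Lw_unknown_free)
```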

  10. Satellite imager calibration and validation

    CSIR Research Space (South Africa)

    Vhengani, L

    2010-10-01

    The success or failure of any earth observation mission depends on the quality of its data. To achieve optimum levels of reliability, most sensors are calibrated pre-launch. However... The paper discusses calibration and validation techniques specific to South Africa.

  11. SU-E-T-516: Dosimetric Validation of AcurosXB Algorithm in Comparison with AAA & CCC Algorithms for VMAT Technique.

    Science.gov (United States)

    Kathirvel, M; Subramanian, V Sai; Arun, G; Thirumalaiswamy, S; Ramalingam, K; Kumar, S Ashok; Jagadeesh, K

    2012-06-01

    To dosimetrically validate the AcurosXB algorithm for Volumetric Modulated Arc Therapy (VMAT) in comparison with the standard clinical Anisotropic Analytic Algorithm (AAA) and Collapsed Cone Convolution (CCC) dose calculation algorithms. The AcurosXB dose calculation algorithm is available with the Varian Eclipse treatment planning system (V10). It uses a grid-based Boltzmann equation solver to predict dose precisely in less time. This study was made to assess the algorithm's ability to predict dose accurately as delivered, for which five clinical cases each of brain, head & neck, thoracic, pelvic and SBRT were taken. Verification plans were created on a multicube phantom with an iMatrixx-2D detector array; dose prediction was done with the AcurosXB, AAA and CCC (COMPASS system) algorithms, and the plans were delivered on a CLINAC-iX treatment machine. Delivered dose was captured in the iMatrixx plane for all 25 plans. Measured dose was taken as reference to quantify the agreement of the AcurosXB calculation algorithm against the previously validated AAA and CCC algorithms. Gamma evaluation was performed with clinical criteria of distance-to-agreement 3 and 2 mm and dose difference 3 and 2% in omnipro-I'MRT software. Plans were evaluated in terms of correlation coefficient, quantitative area gamma and average gamma. The study shows good agreement, with mean correlations of 0.9979±0.0012, 0.9984±0.0009 and 0.9979±0.0011 for AAA, CCC and Acuros, respectively. Mean area gamma for the 3 mm/3% criterion was found to be 98.80±1.04, 98.14±2.31, 98.08±2.01, and for 2 mm/2% was found to be 93.94±3.83, 87.17±10.54 and 92.36±5.46 for AAA, CCC and Acuros, respectively. Mean average gamma for 3 mm/3% was 0.26±0.07, 0.42±0.08, 0.28±0.09, and for 2 mm/2% was found to be 0.39±0.10, 0.64±0.11, 0.42±0.13 for AAA, CCC and Acuros, respectively. This study demonstrated that the AcurosXB algorithm had good agreement with AAA and CCC in terms of dose prediction. In conclusion, the AcurosXB algorithm provides a valid, accurate and speedy alternative to AAA
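
    The gamma criterion used here combines a dose-difference tolerance with a distance-to-agreement tolerance. A simplified 1D sketch of a global gamma computation (the study used 2D detector-array data and commercial software; all profiles below are synthetic):

```python
import numpy as np

def gamma_1d(x, measured, calculated, dta_mm=3.0, dd_percent=3.0):
    """Simplified 1D global gamma index (Low et al. style)."""
    dd = dd_percent / 100.0 * measured.max()   # global dose-difference criterion
    gammas = np.empty_like(measured)
    for i, (xi, mi) in enumerate(zip(x, measured)):
        dist2 = ((x - xi) / dta_mm) ** 2          # normalized distance term
        dose2 = ((calculated - mi) / dd) ** 2     # normalized dose term
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

# Hypothetical 1D dose profiles on a 1 mm grid (Gaussian stand-ins).
x = np.arange(0.0, 50.0, 1.0)
measured = 100.0 * np.exp(-((x - 25.0) / 10.0) ** 2)
calculated = 100.0 * np.exp(-((x - 25.5) / 10.0) ** 2)

g = gamma_1d(x, measured, calculated)
print("pass rate (gamma <= 1):", np.mean(g <= 1.0))
```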

  12. Validation of a radiosensitivity molecular signature in breast cancer

    NARCIS (Netherlands)

    S.A. Eschrich (Steven); C. Fulp (Carl); Y. Pawitan (Yudi); J.A. Foekens (John); M. Smid (Marcel); J.W.M. Martens (John); M. Echevarria (Michelle); P.S. Kamath (Patrick); J.-H. Lee (Ji-Hyun); E.E. Harris (Eleanor); J. Bergh (Jonas); J.F. Torres-Roca (Javier)

    2012-01-01

    Purpose: Previously, we developed a radiosensitivity molecular signature [radiosensitivity index (RSI)] that was clinically validated in 3 independent datasets (rectal, esophageal, and head and neck) in 118 patients. Here, we test RSI in radiotherapy (RT)-treated breast cancer patients.

  13. Image decomposition as a tool for validating stress analysis models

    Directory of Open Access Journals (Sweden)

    Mottershead J.

    2010-06-01

    It is good practice to validate analytical and numerical models used in stress analysis for engineering design by comparison with measurements obtained from real components either in-service or in the laboratory. In reality, this critical step is often neglected or reduced to placing a single strain gage at the predicted hot-spot of stress. Modern techniques of optical analysis allow full-field maps of displacement, strain and, or stress to be obtained from real components with relative ease and at modest cost. However, validations continue to be performed only at predicted and, or observed hot-spots and most of the wealth of data is ignored. It is proposed that image decomposition methods, commonly employed in techniques such as fingerprinting and iris recognition, can be employed to validate stress analysis models by comparing all of the key features in the data from the experiment and the model. Image decomposition techniques such as Zernike moments and Fourier transforms have been used to decompose full-field distributions for strain generated from optical techniques such as digital image correlation and thermoelastic stress analysis as well as from analytical and numerical models by treating the strain distributions as images. The result of the decomposition is 10¹ to 10² image descriptors instead of the 10⁵ or 10⁶ pixels in the original data. As a consequence, it is relatively easy to make a statistical comparison of the image descriptors from the experiment and from the analytical/numerical model and to provide a quantitative assessment of the stress analysis.
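
    The core idea (compress a full-field map into on the order of 10² descriptors, then compare those) can be sketched with a Fourier basis; the paper also uses Zernike moments. All field data below are synthetic stand-ins:

```python
import numpy as np

def fourier_descriptors(field, k=50):
    """Keep the k largest-magnitude 2D Fourier coefficients as descriptors."""
    coeffs = np.fft.fft2(field).ravel()
    idx = np.argsort(np.abs(coeffs))[-k:]   # indices of the dominant modes
    return idx, coeffs[idx]

# Hypothetical full-field strain maps (256 x 256 pixels) from an
# experiment (e.g. DIC) and from a numerical model.
y, x = np.mgrid[0:256, 0:256] / 256.0
experiment = np.sin(4 * np.pi * x) * np.cos(2 * np.pi * y)
model = experiment + 0.01 * np.random.randn(256, 256)

idx, d_exp = fourier_descriptors(experiment)
d_mod = np.fft.fft2(model).ravel()[idx]     # same modes extracted from the model

# ~10^2 descriptors instead of ~10^5 pixels; compare them directly.
rel_err = np.abs(d_mod - d_exp) / np.abs(d_exp).max()
print("max relative descriptor discrepancy:", rel_err.max())
```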

  14. Construct validity of the ovine model in endoscopic sinus surgery training.

    Science.gov (United States)

    Awad, Zaid; Taghi, Ali; Sethukumar, Priya; Tolley, Neil S

    2015-03-01

    To demonstrate construct validity of the ovine model as a tool for training in endoscopic sinus surgery (ESS). Prospective, cross-sectional evaluation study. Over 18 consecutive months, trainees and experts were evaluated in their ability to perform a range of tasks (based on previous face validation and descriptive studies conducted by the same group) relating to ESS on the sheep-head model. Anonymized randomized video recordings of the above were assessed by two independent and blinded assessors. A validated assessment tool utilizing a five-point Likert scale was employed. Construct validity was calculated by comparing scores across training levels and experts using mean and interquartile range of global and task-specific scores. Subgroup analysis of the intermediate group ascertained previous experience. Nonparametric descriptive statistics were used, and analysis was carried out using SPSS version 21 (IBM, Armonk, NY). Reliability of the assessment tool was confirmed. The model discriminated well between different levels of expertise in global and task-specific scores. A positive correlation was noted between year in training and both global and task-specific scores. Previous experience was variable, and the number of ESS procedures performed under supervision had the highest impact on performance. This study describes an alternative model for ESS training and assessment. It is also the first to demonstrate construct validity of the sheep-head model for ESS training.

  15. PEANO, a toolbox for real-time process signal validation and estimation

    International Nuclear Information System (INIS)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real time process signal validation and condition monitoring, has been developed. This system analyses the signals, which are e.g. the readings of process monitoring sensors, computes their expected values and alerts if real values deviate from the expected ones more than limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques. Artificial Neural Networks and Fuzzy Logic models can be combined to exploit the learning and generalisation capability of the first technique with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where the use of this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region where the validation process has to be performed. The possibilistic approach (rather than probabilistic) allows a "don't know" classification that results in a fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster in which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability is increased compared to the case of a single network working in the entire operating region, and the ability to identify abnormal conditions, where the system is not capable to operate with a satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)
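
    The routing logic described here, possibilistic cluster memberships gating per-cluster estimators with a "don't know" outcome, can be caricatured in a few lines. Everything below (centers, per-cluster models, membership kernel, threshold) is a hypothetical stand-in, not the PEANO implementation:

```python
import numpy as np

# Hypothetical operating-region cluster centers and per-cluster linear
# "networks" (weight vectors standing in for trained neural networks).
centers = np.array([[0.2, 0.3], [0.7, 0.8]])
models = [np.array([1.0, 0.5]), np.array([0.2, 2.0])]

def validate_signal(x, threshold=0.2):
    # Possibilistic membership: decays with distance and is independent per
    # cluster (memberships need not sum to 1, unlike probabilistic clustering).
    d2 = ((centers - x) ** 2).sum(axis=1)
    u = np.exp(-d2 / 0.05)
    if u.max() < threshold:
        return None, "don't know: unforeseen plant condition or outlier"
    # Membership-weighted combination of the per-cluster estimators.
    estimates = np.array([w @ x for w in models])
    return (u * estimates).sum() / u.sum(), "ok"

print(validate_signal(np.array([0.25, 0.35])))  # inside the operating map
print(validate_signal(np.array([5.0, 5.0])))    # far outside: "don't know"
```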

  16. PEANO, a toolbox for real-time process signal validation and estimation

    Energy Technology Data Exchange (ETDEWEB)

    Fantoni, Paolo F.; Figedy, Stefan; Racz, Attila

    1998-02-01

    PEANO (Process Evaluation and Analysis by Neural Operators), a toolbox for real time process signal validation and condition monitoring, has been developed. This system analyses the signals, which are e.g. the readings of process monitoring sensors, computes their expected values and alerts if real values deviate from the expected ones more than limits allow. The reliability level of the current analysis is also produced. The system is based on neuro-fuzzy techniques. Artificial Neural Networks and Fuzzy Logic models can be combined to exploit the learning and generalisation capability of the first technique with the approximate reasoning embedded in the second approach. Real-time process signal validation is an application field where the use of this technique can improve the diagnosis of faulty sensors and the identification of outliers in a robust and reliable way. This study implements a fuzzy and possibilistic clustering algorithm to classify the operating region where the validation process has to be performed. The possibilistic approach (rather than probabilistic) allows a "don't know" classification that results in a fast detection of unforeseen plant conditions or outliers. Specialised Artificial Neural Networks are used for the validation process, one for each fuzzy cluster in which the operating map has been divided. There are two main advantages in using this technique: the accuracy and generalisation capability is increased compared to the case of a single network working in the entire operating region, and the ability to identify abnormal conditions, where the system is not capable to operate with a satisfactory accuracy, is improved. This model has been tested in a simulated environment on a French PWR, to monitor safety-related reactor variables over the entire power-flow operating map. (author)

  17. Upon Further Review: V. An Examination of Previous Lightcurve Analysis from the Palmer Divide Observatory

    Science.gov (United States)

    Warner, Brian D.

    2011-01-01

    Updated results are given for nine asteroids previously reported from the Palmer Divide Observatory (PDO). The original images were re-measured to obtain new data sets using the latest version of MPO Canopus photometry software, analysis tools, and revised techniques for linking multiple observing runs covering several days to several weeks. Results that were previously not reported or were moderately different were found for 1659 Punkajarju, 1719 Jens, 1987 Kaplan, 2105 Gudy, 2961 Katsurahama, 3285 Ruth Wolfe, 3447 Burckhalter, 7816 Hanoi, and (34817) 2000 SE116. This is one in a series of papers that will examine results obtained during the initial years of the asteroid lightcurve program at PDO.

  18. Symptom validity issues in the psychological consultative examination for social security disability.

    Science.gov (United States)

    Chafetz, Michael D

    2010-08-01

    This article is about Social Security Administration (SSA) policy with regard to the Psychological Consultative Examination (PCE) for Social Security Disability, particularly with respect to validation of the responses and findings. First, the nature of the consultation and the importance of understanding the boundaries and ethics of the psychologist's role are described. Issues particular to working with low-functioning claimants usually form a large part of these examinations. The psychologist must understand various forms of non-credible behavior during the PCE, and how malingering might be considered among other non-credible presentations. Issues pertaining to symptom validity testing in low-functioning claimants are further explored. SSA policy with respect to symptom validity testing is carefully examined, with an attempt to answer specific concerns and show how psychological science can be of assistance, particularly with evidence-based practice. Additionally, the nature and importance of techniques to avoid the mislabeling of claimants as malingerers are examined. SSA requires the use of accepted diagnostic techniques with which to establish impairment, and this article describes the implementation of that requirement, particularly with respect to validating the findings.

  19. An Integrated Research Infrastructure for Validating Cyber-Physical Energy Systems

    DEFF Research Database (Denmark)

    Strasser, T. I.; Moyo, C.; Bründlinger, R.

    2017-01-01

    quality and ensure security of supply. At the same time, the increased availability of advanced automation and communication technologies provides new opportunities for the derivation of intelligent solutions to tackle the challenges. Previous work has shown various new methods of operating highly...... interconnected power grids, and their corresponding components, in a more effective way. As a consequence of these developments, the traditional power system is being transformed into a cyber-physical energy system, a smart grid. Previous and ongoing research has tended to focus mainly on how specific aspects...... of smart grids can be validated, but until now no integrated approach has existed for the analysis and evaluation of complex cyber-physical system configurations. This paper introduces an integrated research infrastructure that provides methods and tools for validating smart grid systems in a holistic, cyber...

  20. Contact Modelling in Resistance Welding, Part II: Experimental Validation

    DEFF Research Database (Denmark)

    Song, Quanfeng; Zhang, Wenqi; Bay, Niels

    2006-01-01

    Contact algorithms in resistance welding presented in the previous paper are experimentally validated in the present paper. In order to verify the mechanical contact algorithm, two types of experiments, i.e. sandwich upsetting of circular, cylindrical specimens and compression tests of discs...... with a solid ring projection towards a flat ring, are carried out at room temperature. The complete algorithm, involving not only the mechanical model but also the thermal and electrical models, is validated by projection welding experiments. The experimental results are in satisfactory agreement...

  1. Empirical techniques in finance

    CERN Document Server

    Bhar, Ramaprasad

    2005-01-01

    This book offers the opportunity to study and experience advanced empirical techniques in finance and in general financial economics. It is not only suitable for students with an interest in the field, it is also highly recommended for academic researchers as well as researchers in industry. The book focuses on the contemporary empirical techniques used in the analysis of financial markets and how these are implemented using actual market data. With an emphasis on implementation, this book helps focus on strategies for rigorously combining finance theory and modeling technology to extend extant considerations in the literature. The main aim of this book is to equip the readers with an array of tools and techniques that will allow them to explore financial market problems with a fresh perspective. In this sense it is not another volume in econometrics. Of course, the traditional econometric methods are still valid and important; the contents of this book will bring in other related modeling topics that...

  2. New continuous air pumping technique to improve clinical outcomes of descemet-stripping automated endothelial keratoplasty in asian patients with previous ahmed glaucoma valve implantation.

    Directory of Open Access Journals (Sweden)

    Chang-Min Liang

    BACKGROUND: To evaluate the outcomes of Descemet-stripping automated endothelial keratoplasty (DSAEK) with the use of a continuous air pumping technique in Asian eyes with previous Ahmed glaucoma valve implantation. METHODS: The DSAEK procedure was modified in that complete air retention of the anterior chamber was maintained for 10 min using continuous air pumping at 30 mm Hg. The primary outcome measurement was graft survival; postoperative clinical features including rate of graft detachment, endothelial cell count, intraocular pressure (IOP), surgical time and cup/disc ratio were also recorded. RESULTS: A total of 13 eyes of 13 patients which underwent modified DSAEK and 6 eyes of 6 patients which underwent conventional DSAEK were included. There was a significant difference in graft survival curves between the two groups (P = 0.029); the 1-year graft survival rates were estimated as 100% and 66.7% for patients with modified DSAEK and those with traditional DSAEK, respectively. The rates of graft detachment were 0% and 33.3% for the modified DSAEK and conventional DSAEK groups, respectively (P = 0.088). Surgical time for air tamponade was significantly lower in the modified DSAEK group than in the conventional DSAEK group [median (IQR): 10.0 (10.0, 10.0) min vs. 24.5 (22.0, 27.0) min; P<0.001]. Postoperatively, patients in the modified DSAEK group had significantly lower IOP as compared to the conventional DSAEK group [12.0 (11.0, 15.0) mm Hg vs. 16.0 (15.0, 18.0) mm Hg; P = 0.047]. Modified DSAEK patients had higher endothelial cell counts as compared to conventional DSAEK patients [2148.0 (1964.0, 2218.0) vs. 1529.0 (713.0, 2014.0)], but the difference did not reach statistical significance (P = 0.072). CONCLUSIONS: The new continuous air pumping technique in DSAEK can be performed safely and effectively in patients with prior glaucoma drainage device (GDD) placement who have corneal failure.

  3. Spacecraft early design validation using formal methods

    International Nuclear Information System (INIS)

    Bozzano, Marco; Cimatti, Alessandro; Katoen, Joost-Pieter; Katsaros, Panagiotis; Mokos, Konstantinos; Nguyen, Viet Yen; Noll, Thomas; Postma, Bart; Roveri, Marco

    2014-01-01

    The size and complexity of software in spacecraft is increasing exponentially, and this trend complicates its validation within the context of the overall spacecraft system. Current validation methods are labor-intensive as they rely on manual analysis, review and inspection. For future space missions, we developed – with challenging requirements from the European space industry – a novel modeling language and toolset for a (semi-)automated validation approach. Our modeling language is a dialect of AADL and enables engineers to express the system, the software, and their reliability aspects. The COMPASS toolset utilizes state-of-the-art model checking techniques, both qualitative and probabilistic, for the analysis of requirements related to functional correctness, safety, dependability and performance. Several pilot projects have been performed by industry, with two of them having focused on the system-level of a satellite platform in development. Our efforts resulted in a significant advancement of validating spacecraft designs from several perspectives, using a single integrated system model. The associated technology readiness level increased from level 1 (basic concepts and ideas) to early level 4 (laboratory-tested)

  4. Predicting Radiation Pneumonitis After Stereotactic Ablative Radiation Therapy in Patients Previously Treated With Conventional Thoracic Radiation Therapy

    International Nuclear Information System (INIS)

    Liu Hui; Zhang Xu; Vinogradskiy, Yevgeniy Y.; Swisher, Stephen G.; Komaki, Ritsuko; Chang, Joe Y.

    2012-01-01

    Purpose: To determine the incidence of and risk factors for radiation pneumonitis (RP) after stereotactic ablative radiation therapy (SABR) to the lung in patients who had previously undergone conventional thoracic radiation therapy. Methods and Materials: Seventy-two patients who had previously received conventionally fractionated radiation therapy to the thorax were treated with SABR (50 Gy in 4 fractions) for recurrent disease or secondary parenchymal lung cancer. The V10 and mean lung dose (MLD) of the previous plan and the V10-V40 and MLD of the composite plan were also related to RP. Multivariate analysis revealed that ECOG PS scores of 2-3 before SABR (P=.009), FEV1 ≤65% before SABR (P=.012), V20 ≥30% of the composite plan (P=.021), and an initial PTV in the bilateral mediastinum (P=.025) were all associated with RP. Conclusions: We found that severe RP was relatively common, occurring in 20.8% of patients, and could be predicted by an ECOG PS score of 2-3, an FEV1 ≤65%, a previous PTV spanning the bilateral mediastinum, and V20 ≥30% on composite (previous RT + SABR) plans. Prospective studies are needed to validate these predictors and the scoring system on which they are based.

  5. Right ventriculography as a valid method for the diagnosis of tricuspid insufficiency.

    Science.gov (United States)

    Ubago, J L; Figueroa, A; Colman, T; Ochoteco, A; Rodríguez, M; Durán, C M

    1981-01-01

    The value of right ventriculography in the diagnosis of tricuspid insufficiency (TI) is often questioned because of 1) the high incidence of premature ventricular contractions (PVCs) during injections and 2) interference of the catheter with the valve closure mechanism. In 168 patients a commercially available, not preshaped, balloon-tipped catheter was used for right ventriculography. To avoid the induction of PVCs, the catheter tip was placed in the middle third of the diaphragmatic wall of the right ventricle, and the balloon was inflated, becoming trapped by the trabeculae. In this position the catheter's side holes should be located in the inflow chamber. To ensure this correct position, and therefore a lack of ectopic beats during angiography, a saline test injection was performed beforehand in every case. With this technique the incidence of PVCs during ventriculography was only 7.7%. In all but one case, such beats were isolated. The 168 patients were divided into three groups according to their likelihood of experiencing tricuspid interference by the catheter: group I included 41 patients with a normal heart or with coronary artery disease. No one from this group had TI. Of group II, 28 patients with right ventricular pressure or volume overload or cardiomyopathy, only 2 had TI, both with a previous clinical diagnosis of regurgitation. Group III contained 99 patients with rheumatic heart disease. Thirty-five of them showed angiographic TI, and 24 of these had this diagnosis confirmed either clinically or at surgery. It is felt that this technique of right ventriculography, with its low incidence of PVCs and slight interference with tricuspid closure, is a valid method for the objective study of the tricuspid valve.

  6. DTU PMU Laboratory Development - Testing and Validation

    DEFF Research Database (Denmark)

    Garcia-Valle, Rodrigo; Yang, Guang-Ya; Martin, Kenneth E.

    2010-01-01

    This is a report of the results of phasor measurement unit (PMU) laboratory development and testing done at the Centre for Electric Technology (CET), Technical University of Denmark (DTU). Analysis of the PMU performance first required the development of tools to convert the DTU PMU data into IEEE...... standard, and the validation is done for the DTU-PMU via a validated commercial PMU. The commercial PMU has been tested in the authors' previous efforts, where the response can be expected to follow known patterns and provide confirmation about the test system, its design and its settings...... In a nutshell, having 2 PMUs that observe the same signals provides validation of the operation and flags questionable results with more certainty. Moreover, the performance and accuracy of the DTU-PMU are tested, yielding good and precise results when compared with a commercial phasor measurement device, PMU-1.
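
    One common way to compare two PMUs observing the same signals is the total vector error (TVE) of IEEE C37.118, computed sample by sample against the reference device. A minimal sketch with hypothetical phasor streams (the report does not state which comparison metric was used):

```python
import numpy as np

# Hypothetical synchrophasor streams: complex phasors from the commercial
# (reference) PMU and from the device under test, time-aligned per sample.
ref = np.array([1.00 + 0.00j, 0.98 + 0.05j, 0.97 + 0.08j])  # stand-in values
dut = np.array([1.00 + 0.01j, 0.97 + 0.05j, 0.98 + 0.07j])  # stand-in values

# Total vector error (IEEE C37.118): magnitude of the phasor difference
# relative to the reference magnitude, per sample.
tve = np.abs(dut - ref) / np.abs(ref)
print("max TVE: {:.2%}".format(tve.max()))
```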

  7. Validation of bioelectrical impedance analysis in Ethiopian adults with HIV

    DEFF Research Database (Denmark)

    Hegelund, Maria H; Wells, Jonathan C; Girma, Tsinuel

    2017-01-01

    in populations of different ethnic origin and health status. The aim of the present study was to test the validity of BIA in Ethiopian antiretroviral-naive HIV patients. BIA was validated against the ²H dilution technique by comparing fat-free mass (FFM) measured by the two methods using paired t tests and Bland-Altman analysis...... % were underweight with a BMI below 18·5 kg/m². There were no differences in FFM between the methods. Overall, BIA slightly underestimated FFM by 0·1 kg (-0·1; 95% CI -0·3, 0·2 kg). The Bland-Altman plot indicated acceptable agreement, with an upper limit of agreement of 4·5 kg and a lower limit of agreement of -4·6 kg, but with a small correlation between the mean difference and the average FFM. BIA slightly overestimated FFM at low values compared with the ²H dilution technique, while it slightly underestimated FFM at high values. In conclusion, BIA proved to be valid in this population and may...
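
    The Bland-Altman limits of agreement quoted above are simply the mean difference ± 1.96 standard deviations of the paired differences. A minimal sketch with hypothetical paired FFM values:

```python
import numpy as np

# Hypothetical fat-free mass (kg) from the two methods, same participants.
ffm_bia = np.array([38.2, 41.0, 35.7, 44.3, 39.9])        # stand-in values
ffm_deuterium = np.array([38.5, 40.6, 36.1, 44.0, 40.3])  # stand-in values

diff = ffm_bia - ffm_deuterium
mean_diff = diff.mean()          # bias of BIA relative to 2H dilution
sd_diff = diff.std(ddof=1)

# 95% limits of agreement: mean difference +/- 1.96 SD of the differences.
upper = mean_diff + 1.96 * sd_diff
lower = mean_diff - 1.96 * sd_diff
print(f"bias {mean_diff:.2f} kg, limits of agreement [{lower:.2f}, {upper:.2f}] kg")
```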

  8. Intralesional Osteophyte Regrowth Following Autologous Chondrocyte Implantation after Previous Treatment with Marrow Stimulation Technique.

    Science.gov (United States)

    Demange, Marco Kawamura; Minas, Tom; von Keudell, Arvind; Sodha, Sonal; Bryant, Tim; Gomoll, Andreas H

    2017-04-01

    Objective Bone marrow stimulation surgeries are frequent in the treatment of cartilage lesions. Autologous chondrocyte implantation (ACI) may be performed after failed microfracture surgery. Alterations to subchondral bone as intralesional osteophytes are commonly seen after previous microfracture and removed during ACI. There have been no reports on potential recurrence. Our purpose was to evaluate the incidence of intralesional osteophyte development in 2 cohorts: existing intralesional osteophytes and without intralesional osteophytes at the time of ACI. Study Design We identified 87 patients (157 lesions) with intralesional osteophytes among a cohort of 497 ACI patients. Osteophyte regrowth was analyzed on magnetic resonance imaging and categorized as small or large (less or more than 50% of the cartilage thickness). Twenty patients (24 defects) without intralesional osteophytes at the time of ACI acted as control. Results Osteophyte regrowth was observed in 39.5% of lesions (34.4% of small osteophytes and 5.1% of large osteophytes). In subgroup analyses, regrowth was observed in 45.8% of periosteal-covered defects and in 18.9% of collagen membrane-covered defects. Large osteophyte regrowth occurred in less than 5% in either group. Periosteal defects showed a significantly higher incidence for regrowth of small osteophytes. In the control group, intralesional osteophytes developed in 16.7% of the lesions. Conclusions Even though intralesional osteophytes may regrow after removal during ACI, most of them are small. Small osteophyte regrowth occurs almost twice in periosteum-covered ACI. Large osteophytes occur only in 5% of patients. Intralesional osteophyte formation is not significantly different in preexisting intralesional osteophytes and control groups.

  9. Dosimetric evaluation of the response of the TLD-100 dosemeters in the IMRT technique by 'Step and Shoot'

    International Nuclear Information System (INIS)

    Vasquez, J.; Benavides, S.O.

    2005-01-01

    We present the dosimetric response of LiF thermoluminescent crystals (TLD-100) irradiated on a Siemens Primus Hl linear accelerator using the step-and-shoot Intensity Modulated Radiation Therapy (IMRT) technique. Prior to crystal calibration and response evaluation, the acceptance procedures recommended by the TG-53 protocol for validation of the technique were carried out. The planning system was Theraplan Plus 3.8, using a pencil-kernel algorithm; the record-and-verify system was Lantis 5.2. The dose-versus-charge response curve was obtained from TLD readings in a Harshaw 3500 reader. The crystals were irradiated in a BenchMark phantom with doses previously determined using ionization chambers for square radiation fields, in a beam with a TPR20,10 of 0.68, corresponding to 6 MV. We then compared their response under irradiation with segmented fields in an anthropomorphic phantom against the doses calculated by the planning system. The results show deviations of less than 5% between measured and calculated dose in low-gradient zones, which supports the use of these crystals for routine IMRT quality control. (Author)

  10. Identification, detection, and validation of vibrating structures: a signal processing approach

    International Nuclear Information System (INIS)

    Candy, J.V.; Lager, D.L.

    1979-01-01

    This report discusses the application of modern signal processing techniques to characterize parameters governing the vibrational response of a structure. Simulated response data are used to explore the feasibility of applying these techniques to various structural problems. On-line estimators/identifiers are used to estimate structural parameters, validate designed structures, and detect structural failure when used with a detector

  11. Polarographic validation of chemical speciation models

    International Nuclear Information System (INIS)

    Duffield, J.R.; Jarratt, J.A.

    2001-01-01

    It is well established that the chemical speciation of an element in a given matrix, or system of matrices, is of fundamental importance in controlling the transport behaviour of the element. Therefore, to accurately understand and predict the transport of elements and compounds in the environment it is a requirement that both the identities and concentrations of trace element physico-chemical forms can be ascertained. These twin requirements present the analytical scientist with considerable challenges given the labile equilibria, the range of time scales (from nanoseconds to years) and the range of concentrations (ultra-trace to macro) that may be involved. As a result of this analytical variability, chemical equilibrium modelling has become recognised as an important predictive tool in chemical speciation analysis. However, this technique requires firm underpinning by the use of complementary experimental techniques for the validation of the predictions made. The work reported here has been undertaken with the primary aim of investigating possible methodologies that can be used for the validation of chemical speciation models. However, in approaching this aim, direct chemical speciation analyses have been made in their own right. Results will be reported and analysed for the iron(II)/iron(III)-citrate proton system (pH 2 to 10; total [Fe] = 3 mmol dm⁻³; total [citrate³⁻] = 10 mmol dm⁻³) in which equilibrium constants have been determined using glass electrode potentiometry, speciation is predicted using the PHREEQE computer code, and validation of predictions is achieved by determination of iron complexation and redox state with associated concentrations. (authors)

  12. Validation of the TEXSAN thermal-hydraulic analysis program

    International Nuclear Information System (INIS)

    Burns, S.P.; Klein, D.E.

    1992-01-01

    The TEXSAN thermal-hydraulic analysis program has been developed by the University of Texas at Austin (UT) to simulate buoyancy driven fluid flow and heat transfer in spent fuel and high level nuclear waste (HLW) shipping applications. As part of the TEXSAN software quality assurance program, the software has been subjected to a series of test cases intended to validate its capabilities. The validation tests include many physical phenomena which arise in spent fuel and HLW shipping applications. This paper describes some of the principal results of the TEXSAN validation tests and compares them to solutions available in the open literature. The TEXSAN validation effort has shown that the TEXSAN program is stable and consistent under a range of operating conditions and provides accuracy comparable with other heat transfer programs and evaluation techniques. The modeling capabilities and the interactive user interface employed by the TEXSAN program should make it a useful tool in HLW transportation analysis

  13. A cross-validation package driving Netica with python

    Science.gov (United States)

    Fienen, Michael N.; Plant, Nathaniel G.

    2014-01-01

    Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid overfitting resulting from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and read, rebuild, and learn BNs from data. Insights gained from cross-validation, and implications for prediction versus description, are illustrated with two applications: a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than allowed by supporting data, and that overfitting incurs computational costs as well as causing a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
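
    The cross-validation loop at the heart of such a tool can be sketched generically (this is not CVNetica's Netica-specific API): train on k-1 folds, score on the held-out fold, and watch for held-out error growing as model complexity grows.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

# Hypothetical data standing in for the cases a Bayesian network would learn from.
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 4))
y = X @ np.array([1.0, -0.5, 0.0, 0.25]) + rng.normal(scale=0.3, size=120)

# k-fold cross-validation: fit on k-1 folds, score on the held-out fold.
# Rising held-out error with growing model complexity signals overfitting.
errors = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = LinearRegression().fit(X[train_idx], y[train_idx])
    errors.append(mean_squared_error(y[test_idx], model.predict(X[test_idx])))
print("mean held-out MSE:", np.mean(errors))
```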

  14. PolyNano M.6.1.1 Process validation state-of-the-art

    DEFF Research Database (Denmark)

    Tosello, Guido; Hansen, Hans Nørgaard; Calaon, Matteo

    2012-01-01

    Nano project. Methods for replication process validation are presented and will be further investigated in WP6 “Process Chain Validation” and applied to PolyNano study cases. Based on the available information, effective best practice standard process validation will be defined and implemented...... assessment methods, and presents measuring procedures/techniques suitable for replication fidelity studies. The report reviews state‐of‐the‐art research results regarding replication obtained at different scales, tooling technologies based on surface replication, and process validation through design

  15. Validating a Technology Enhanced Student-Centered Learning Model

    Science.gov (United States)

    Kang, Myunghee; Hahn, Jungsun; Chung, Warren

    2015-01-01

    The Technology Enhanced Student Centered Learning (TESCL) Model in this study presents the core factors that ensure the quality of learning in a technology-supported environment. Although the model was conceptually constructed using a student-centered learning framework and drawing upon previous studies, it should be validated through real-world…

  16. Do emotional intelligence and previous caring experience influence student nurse performance? A comparative analysis.

    Science.gov (United States)

    Stenhouse, Rosie; Snowden, Austyn; Young, Jenny; Carver, Fiona; Carver, Hannah; Brown, Norrie

    2016-08-01

    Reports of poor nursing care have focused attention on values-based selection of candidates onto nursing programmes. Values-based selection lacks clarity and valid measures. Previous caring experience might lead to better care. Emotional intelligence (EI) might be associated with performance, and is conceptualised and measurable. To examine the impact of 1) previous caring experience, 2) emotional intelligence, and 3) social connection scores on performance and retention in a cohort of first-year nursing and midwifery students in Scotland. A longitudinal, quasi-experimental design. Adult and mental health nursing, and midwifery programmes in a Scottish university. Adult, mental health and midwifery students (n=598) completed the Trait Emotional Intelligence Questionnaire-short form and Schutte's Emotional Intelligence Scale on entry to their programmes at a Scottish university, alongside demographic and previous caring experience data. Social connection was calculated from a subset of questions identified within the TEIQue-SF in a prior factor and Rasch analysis. Student performance was calculated as the mean mark across the year. Withdrawal data were gathered. 598 students completed baseline measures. 315 students declared previous caring experience, 277 not. An independent-samples t-test identified that those without previous caring experience scored higher on performance (57.33±11.38) than those with previous caring experience (54.87±11.19), a statistically significant difference of 2.47 (95% CI, 0.54 to 4.38), t(533)=2.52, p=.012. Emotional intelligence scores were not associated with performance. Social connection scores for those withdrawing (mean rank=249) and those remaining (mean rank=304.75) were statistically significantly different, U=15,300, z=-2.61, p<0.009. Previous caring experience led to worse performance in this cohort. Emotional intelligence was not a useful indicator of performance. Lower scores on the social connection factor were associated

  17. Validation of a Mathematical Model for Green Algae (Raphidocelis Subcapitata) Growth and Implications for a Coupled Dynamical System with Daphnia Magna

    Directory of Open Access Journals (Sweden)

    Michael Stemkovski

    2016-05-01

    Toxicity testing in populations probes for responses in demographic variables to anthropogenic or natural chemical changes in the environment. Importantly, these tests are primarily performed on species in isolation of adjacent trophic levels in their ecosystem. The development and validation of coupled species models may aid in predicting adverse outcomes at the ecosystem level. Here, we aim to validate a model for the population dynamics of the green algae Raphidocelis subcapitata, a planktonic species that is often used as a primary food source in toxicity experiments for the freshwater crustacean Daphnia magna. We collected longitudinal data from three replicate population experiments of R. subcapitata. We used these data with statistical model comparison tests and uncertainty quantification techniques to compare the performance of four models: the logistic model, the Bernoulli model, the Gompertz model, and a discretization of the logistic model. Overall, our results suggest that the logistic model is the most accurate continuous model for R. subcapitata population growth. We then implement the numerical discretization, showing how the continuous logistic model for algae can be coupled to a previously validated discrete-time population model for D. magna.
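
    Fitting the candidate growth models reduces to nonlinear least squares. A minimal sketch for the logistic model with hypothetical cell-density data (the study's actual estimation and model-comparison procedure is more involved):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, N0):
    """Logistic growth: N(t) = K / (1 + ((K - N0)/N0) * exp(-r t))."""
    return K / (1.0 + ((K - N0) / N0) * np.exp(-r * t))

# Hypothetical algae density time series (days; cells/mL stand-in values).
t = np.array([0, 1, 2, 3, 4, 5, 6, 7], dtype=float)
N = np.array([2e4, 4.1e4, 8.0e4, 1.5e5, 2.6e5, 3.6e5, 4.2e5, 4.4e5])

# Nonlinear least-squares fit with rough initial guesses.
params, _ = curve_fit(logistic, t, N, p0=[5e5, 1.0, 2e4])
K, r, N0 = params
print(f"carrying capacity K = {K:.3g}, growth rate r = {r:.3g}")
```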

  18. The nutrition for sport knowledge questionnaire (NSKQ): development and validation using classical test theory and Rasch analysis.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-01-01

    Appropriate dietary intake can have a significant influence on athletic performance. There is a growing consensus on sports nutrition and professionals working with athletes often provide dietary education. However, due to the limitations of existing sports nutrition knowledge questionnaires, previous reports of athletes' nutrition knowledge may be inaccurate. An updated questionnaire has been developed based on a recent review of sports nutrition guidelines. The tool has been validated using a robust methodology that incorporates relevant techniques from classical test theory (CTT) and item response theory (IRT), namely, Rasch analysis. The final questionnaire has 89 questions and six sub-sections (weight management, macronutrients, micronutrients, sports nutrition, supplements, and alcohol). The content and face validity of the tool have been confirmed based on feedback from expert sports dietitians and university sports students, respectively. The internal reliability of the questionnaire as a whole is high (KR = 0.88), and most sub-sections achieved an acceptable internal reliability. Construct validity has been confirmed, with an independent t-test revealing a significant (p < 0.001) difference in knowledge scores of nutrition (64 ± 16%) and non-nutrition students (51 ± 19%). Test-retest reliability has been assured, with a strong correlation (r = 0.92, p < 0.001) between individuals' scores on two attempts of the test, 10 days to 2 weeks apart. Three of the sub-sections fit the Rasch Unidimensional Model. The final version of the questionnaire represents a significant improvement over previous tools. Each nutrition sub-section is unidimensional, and therefore researchers and practitioners can use these individually, as required. Use of the questionnaire will allow researchers to draw conclusions about the effectiveness of nutrition education programs, and differences in knowledge across athletes of varying ages, genders, and athletic
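
    The internal-reliability coefficient quoted above (KR = 0.88) is Kuder-Richardson 20, computable directly from dichotomous item scores. A minimal sketch with a hypothetical response matrix:

```python
import numpy as np

def kr20(items):
    """Kuder-Richardson 20 for a respondents x items matrix of 0/1 scores."""
    k = items.shape[1]
    p = items.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1.0)) * (1.0 - (p * q).sum() / total_var)

# Hypothetical responses: 6 respondents x 8 dichotomous items.
items = np.array([
    [1, 1, 0, 1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 0, 1, 1],
    [1, 0, 0, 0, 1, 0, 0, 0],
])
print("KR-20 =", round(kr20(items), 3))
```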

  19. A systematic review of the reliability and validity of discrete choice experiments in valuing non-market environmental goods

    DEFF Research Database (Denmark)

    Rokotonarivo, Sarobidy; Schaafsma, Marije; Hockley, Neal

    2016-01-01

    …reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with assumptions of rational choice theory) was limited. In content validity tests, 2…

  20. Validation and Error Characterization for the Global Precipitation Measurement

    Science.gov (United States)

    Bidwell, Steven W.; Adams, W. J.; Everett, D. F.; Smith, E. A.; Yuter, S. E.

    2003-01-01

    …, assumption, or algorithm. The instrumentation and techniques of the Supersites will be discussed. The GPM core satellite, with its dual-frequency radar and conically scanning radiometer, will provide insight into precipitation drop-size distributions and potentially increased measurement capabilities for light rain and snowfall. The ground validation program will include instrumentation and techniques commensurate with these new measurement capabilities.

  1. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests, and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and done closest to but before admission (median interval, 2 months), 13 repeat values (0.4%; 95% CI, 0.2% to 0.7%) were outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 repeat values at admission (17%; CI, 13% to 20%) were outside a range considered acceptable for surgery (P less than 0.001 for the comparison of the frequency of clinically important abnormalities between patients with normal and those with abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.

  2. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems together with the validation limits, a general validation concept, and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  3. VALIDATING A COMPUTER-BASED TECHNIQUE FOR ASSESSING STABILITY TO FAILURE STRESS

    Directory of Open Access Journals (Sweden)

    I. F. Arshava

    2013-03-01

    Full Text Available An upsurge of interest in implicit personality assessment, currently observed both in personality psycho-diagnostics and in experimental studies of social attitudes and prejudices, signals the shifting of researchers' attention from defining between-person personality taxonomy to specifying comprehensive within-person processes, the dynamics of which can be captured at the level of an individual case. This research examines the possibility of the implicit assessment of the individual's stability vs. susceptibility to failure stress by comparing the degrees of efficacy in the voluntary self-regulation of a computer-simulated information-processing activity under different conditions (patent of Ukraine № 91842, issued in 2010). By exposing two groups of participants (university undergraduates) to processing information whose scope exceeds human short-term memory capacity at one of the stages of the modeled activity, an unexpected and unavoidable failure is elicited. The participants who retain stability of their self-regulation behavior after having been exposed to failure, i.e. who keep processing information as effectively as they did prior to failure, are claimed to retain homeostasis and thus possess emotional stability. Those who lose homeostasis after failure and display lower standards of self-regulation behavior are considered to be susceptible to stress. The validity of the suggested type of implicit diagnostics was empirically tested by clustering (K-means algorithm) two samples of participants on the properties of their self-regulation behavior and testing between-cluster differences on a set of explicitly assessed variables: action control efficacy (Kuhl, 2001), preferred strategies of coping with stressful situations (Endler & Parker, 1990), Purpose-in-Life orientation (a Russian version of the test by Crumbaugh and Maholick, modified by D. Leontiev, 1992), and Psychological Well-being (Ryff, 1989).
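
    The clustering step described above, K-means on self-regulation features followed by between-cluster tests on explicitly assessed measures, can be sketched as follows; the features and the criterion variable are invented placeholders, not the study's variables.

```python
import numpy as np
from scipy import stats
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Toy self-regulation features (e.g., pre-/post-failure accuracy); placeholders.
X = np.vstack([rng.normal([0.90, 0.88], 0.05, (40, 2)),   # stable profile
               rng.normal([0.90, 0.65], 0.05, (40, 2))])  # susceptible profile

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Test between-cluster differences on an explicitly assessed toy criterion.
well_being = rng.normal(0, 1, 80) + (labels == 0) * 0.8
t, p = stats.ttest_ind(well_being[labels == 0], well_being[labels == 1])
print(f"between-cluster t = {t:.2f}, p = {p:.4f}")
```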

  4. Refinement, Validation and Benchmarking of a Model for E-Government Service Quality

    Science.gov (United States)

    Magoutas, Babis; Mentzas, Gregoris

    This paper presents the refinement and validation of a model for the Quality of e-Government Services (QeGS). We built upon our previous work, in which a conceptualized model was identified, and focused on the confirmatory phase of the model development process in order to arrive at a valid and reliable QeGS model. The validated model, which was benchmarked with very positive results against similar models found in the literature, can be used for measuring QeGS in a reliable and valid manner. This will form the basis for a continuous quality improvement process, unleashing the full potential of e-government services for both citizens and public administrations.

  5. Trypanosomosis surveillance on Zanzibar island, using the trypanosomal antigen detection ELISA (enzyme-linked immunosorbent assay) technique

    Energy Technology Data Exchange (ETDEWEB)

    Mbwambo, H A [Animal Disease Research Inst. (ADRI), Dar-es-Salaam (Tanzania, United Republic of)

    1997-02-01

    The effectiveness of trypanosomosis control programs depends greatly on prior knowledge of basic epidemiological data on the disease, which in turn depends, among other things, on the use of techniques that give a fairly quick and accurate diagnosis. An antigen-detection (Ag) ELISA was first introduced into Tanzania and validated at the Animal Disease Research Institute (ADRI) through FAO/IAEA Research Contract (RC) No. 5030/NL. Incorporation of the Ag-ELISA technique into a FAO animal disease control project (1986-1993) on Unguja island, in 1992, revealed useful information on high trypanosomosis prevalence in an area previously declared free of the disease on the basis of stained blood smears and buffy coat examinations alone. This triggered further efforts towards more intensive surveys of the tsetse and trypanosomosis situation on Unguja island. The present study is a continuation of previous work in an effort to confirm the practical application of Ag-ELISA in trypanosomosis control operations. Results obtained from a known tsetse- and trypanosomosis-free area, on Pemba island, showed a high specificity of the test for Trypanosoma congolense, T. vivax and T. brucei. A preliminary cut-off value of 10% (Percent Positivity = PP) was used. When the PP of 10 was applied to sera from trypanosomosis-endemic areas (Mangapwani, Ndijani, Dunga and Kikungwi) on Unguja island, the results reflected the 'real' trypanosomosis situation in the affected area. This was most evident in the Mangapwani area, where tsetse and trypanosomosis were considered under control by 1994 (no tsetse flies were caught and no samples were found positive by the buffy coat technique). However, it should be stressed that the buffy coat technique and the Ag-ELISA complement each other and should be used in conjunction. (author). 8 refs, 1 fig., 5 tabs.
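
    Percent positivity scoring against a strong-positive control with a fixed cut-off, as described above, is a one-line computation; the optical densities and control value below are invented for illustration.

```python
# Percent positivity (PP) scoring for an antigen-detection ELISA.
# OD values below are invented, not from the study.
STRONG_POSITIVE_OD = 1.25
CUTOFF_PP = 10.0          # preliminary cut-off used in the study

samples = {"serum_01": 0.06, "serum_02": 0.31, "serum_03": 0.11}

for name, od in samples.items():
    pp = 100.0 * od / STRONG_POSITIVE_OD
    status = "positive" if pp >= CUTOFF_PP else "negative"
    print(f"{name}: PP = {pp:.1f} -> {status}")
```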

  6. Trypanosomosis surveillance on Zanzibar island, using the trypanosomal antigen detection ELISA (enzyme-linked immunosorbent assay) technique

    International Nuclear Information System (INIS)

    Mbwambo, H.A.

    1997-01-01

    The effectiveness of trypanosomosis control programs depends greatly on prior knowledge of basic epidemiological data on the disease, which in turn depends, among other things, on the use of techniques that give a fairly quick and accurate diagnosis. An antigen-detection (Ag) ELISA was first introduced into Tanzania and validated at the Animal Disease Research Institute (ADRI) through FAO/IAEA Research Contract (RC) No. 5030/NL. Incorporation of the Ag-ELISA technique into a FAO animal disease control project (1986-1993) on Unguja island, in 1992, revealed useful information on high trypanosomosis prevalence in an area previously declared free of the disease on the basis of stained blood smears and buffy coat examinations alone. This triggered further efforts towards more intensive surveys of the tsetse and trypanosomosis situation on Unguja island. The present study is a continuation of previous work in an effort to confirm the practical application of Ag-ELISA in trypanosomosis control operations. Results obtained from a known tsetse- and trypanosomosis-free area, on Pemba island, showed a high specificity of the test for Trypanosoma congolense, T. vivax and T. brucei. A preliminary cut-off value of 10% (Percent Positivity = PP) was used. When the PP of 10 was applied to sera from trypanosomosis-endemic areas (Mangapwani, Ndijani, Dunga and Kikungwi) on Unguja island, the results reflected the 'real' trypanosomosis situation in the affected area. This was most evident in the Mangapwani area, where tsetse and trypanosomosis were considered under control by 1994 (no tsetse flies were caught and no samples were found positive by the buffy coat technique). However, it should be stressed that the buffy coat technique and the Ag-ELISA complement each other and should be used in conjunction. (author). 8 refs, 1 fig., 5 tabs

  7. Pesticides residues in water treatment plant sludge: validation of analytical methodology using liquid chromatography coupled to Tandem mass spectrometry (LC-MS/MS)

    International Nuclear Information System (INIS)

    Moracci, Luiz Fernando Soares

    2008-01-01

    The evolving scenario of Brazilian agriculture brings benefits to the population and demands technological advances in this field. New pesticides are constantly being introduced, encouraging scientific studies that aim to determine and evaluate their impacts on the population and on the environment. In this work, the evaluated sample was the sludge resulting from a water treatment plant (WTP) located in the Vale do Ribeira, Sao Paulo, Brazil. The technique used was reversed-phase liquid chromatography coupled to electrospray ionization tandem mass spectrometry. Compounds were previously liquid-extracted from the matrix. Method development demanded that the raw data be processed into reliable information, following the concepts of validation of chemical analysis. The evaluated parameters were selectivity, linearity, range, sensitivity, accuracy, precision, limit of detection, limit of quantification and robustness. The qualitative and quantitative results obtained were statistically treated and presented. The developed and validated methodology is simple. Even exploiting the sensitivity of the analytical technique, the target compounds were not detected in the sludge of the WTP. These compounds may be present at very low concentrations, may be degraded under the conditions of the water treatment process, or may not be completely retained by the WTP. (author)

  8. Verification and validation of computer based systems for PFBR

    International Nuclear Information System (INIS)

    Thirugnanamurthy, D.

    2017-01-01

    Verification and Validation (V and V) processes are essential to build quality into a system. Verification is the process of evaluating a system to determine whether the products of each development phase satisfy the requirements imposed by the previous phase. Validation is the process of evaluating a system at the end of the development process to ensure compliance with the functional, performance and interface requirements. This presentation elaborates on the V and V process followed, the document submission requirements at each stage, the V and V activities, the checklists used for reviews at each stage, and the reports.

  9. Postmortem validation of breast density using dual-energy mammography

    OpenAIRE

    Molloi, Sabee; Ducote, Justin L.; Ding, Huanjun; Feig, Stephen A.

    2014-01-01

    Purpose: Mammographic density has been shown to be an indicator of breast cancer risk and also reduces the sensitivity of screening mammography. Currently, there is no accepted standard for measuring breast density. Dual energy mammography has been proposed as a technique for accurate measurement of breast density. The purpose of this study is to validate its accuracy in postmortem breasts and compare it with other existing techniques. Methods: Forty postmortem breasts were imaged using a dua...

  10. Validation of conversion between mini-mental state examination and montreal cognitive assessment.

    Science.gov (United States)

    Lawton, Michael; Kasten, Meike; May, Margaret T; Mollenhauer, Brit; Schaumburg, Martina; Liepelt-Scarfone, Inga; Maetzler, Walter; Vollstedt, Eva-Juliane; Hu, Michele T M; Berg, Daniela; Ben-Shlomo, Yoav

    2016-04-01

    Harmonizing data across cohorts is important for validating findings or combining data in meta-analyses. We replicate and validate a previous conversion of MoCA to MMSE in PD. We used five studies with 1,161 PD individuals and 2,091 observations measured with both the MoCA and MMSE. We compared a previously published conversion table using equipercentile equating with log-linear smoothing to our internally derived scores. Both conversions found good agreement within and across the studies when comparing true and converted MMSE (mean difference: 0.05; standard deviation: 1.84; median difference: 0; interquartile range: -1 to 1, using internal conversion). These results show that one can get a reliable and valid conversion between two commonly used measures of cognition in PD studies. These approaches need to be applied to other scales and domains to enable large-scale collaborative analyses across multiple PD cohorts. © 2016 The Authors. International Parkinson and Movement Disorder Society.
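
    A bare-bones version of equipercentile equating (without the log-linear smoothing used in the published conversion) maps a score to the value with the same percentile rank on the other test; the paired cohort scores below are simulated, not the PD cohort data.

```python
import numpy as np

def equipercentile(score, from_scores, to_scores):
    """Map `score` to the value with the same percentile rank in `to_scores`."""
    pct = np.mean(np.asarray(from_scores) <= score)   # percentile rank
    return float(np.quantile(np.asarray(to_scores), pct))

rng = np.random.default_rng(2)
moca = np.clip(rng.normal(24, 4, 500).round(), 0, 30)   # toy paired cohorts
mmse = np.clip(rng.normal(27, 3, 500).round(), 0, 30)

print(f"MoCA 22 -> converted MMSE {equipercentile(22, moca, mmse):.0f}")
```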

  11. A Dataset and a Technique for Generalized Nuclear Segmentation for Computational Pathology.

    Science.gov (United States)

    Kumar, Neeraj; Verma, Ruchika; Sharma, Sanuj; Bhargava, Surabhi; Vahadane, Abhishek; Sethi, Amit

    2017-07-01

    Nuclear segmentation in digital microscopic tissue images can enable extraction of high-quality features for nuclear morphometrics and other analysis in computational pathology. Conventional image processing techniques, such as Otsu thresholding and watershed segmentation, do not work effectively on challenging cases, such as chromatin-sparse and crowded nuclei. In contrast, machine learning-based segmentation can generalize across various nuclear appearances. However, training machine learning algorithms requires data sets of images, in which a vast number of nuclei have been annotated. Publicly accessible and annotated data sets, along with widely agreed upon metrics to compare techniques, have catalyzed tremendous innovation and progress on other image classification problems, particularly in object recognition. Inspired by their success, we introduce a large publicly accessible data set of hematoxylin and eosin (H&E)-stained tissue images with more than 21000 painstakingly annotated nuclear boundaries, whose quality was validated by a medical doctor. Because our data set is taken from multiple hospitals and includes a diversity of nuclear appearances from several patients, disease states, and organs, techniques trained on it are likely to generalize well and work right out-of-the-box on other H&E-stained images. We also propose a new metric to evaluate nuclear segmentation results that penalizes object- and pixel-level errors in a unified manner, unlike previous metrics that penalize only one type of error. We also propose a segmentation technique based on deep learning that lays a special emphasis on identifying the nuclear boundaries, including those between the touching or overlapping nuclei, and works well on a diverse set of test images.

  12. Validation of a technique by high-performance liquid chromatography for the determination of total isoflavones

    Directory of Open Access Journals (Sweden)

    Pilar A. Soledispa Cañarte

    2017-04-01

    Full Text Available Context: Isoflavones may act as selective regulators in the prevention of various diseases. The most important source of isoflavones is soy, from which different phytotherapeutic products used by the Ecuadorian population are prepared. However, the isoflavone concentration varies depending on several factors, so quality assessment needs to be carried out using appropriate analytical methods. Aims: To validate an analytical method by high-performance liquid chromatography (HPLC) to quantify total isoflavones in herbal medicine. Methods: To quantify isoflavones, a liquid chromatograph with a UV/VIS detector at 260 nm and a C-18 column was used, with an isocratic method. The mobile phase was composed of 2% acetic acid:acetonitrile (75:25). The quantification was performed against a reference standard. The parameters for the validation followed those established in USP 33. Results: The chromatogram presented six peaks with elution between 1.557 and 18.913 min. The linearity of the system and of the method yielded r² values of 0.98 and 0.99, respectively. The coefficients of variation were 1.5% in the study of repeatability and 2% in intermediate precision. The fitted linear model for accuracy exhibited r = 0.95 and an intercept confidence interval of (-0.921; 1.743). Conclusions: The validated method was specific, accurate, precise and linear. It can be used for quality control and stability studies of isoflavones present in herbal medicine.
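
    The linearity and precision checks reported above reduce to a calibration-curve fit and coefficients of variation; the concentrations, peak areas and replicate injections below are invented, not the study's data.

```python
import numpy as np

# Toy calibration data: standard concentrations (ug/mL) vs. peak areas.
conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
area = np.array([51.0, 103.0, 198.0, 405.0, 796.0])

slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# Repeatability: coefficient of variation of replicate injections (toy data).
replicates = np.array([201.0, 198.5, 203.2, 199.8, 202.1, 197.9])
cv = 100 * replicates.std(ddof=1) / replicates.mean()

print(f"r^2 = {r2:.3f}, CV = {cv:.2f}%")
```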

  13. Recovery Act Validation of Innovative Exploration Techniques Pilgrim Hot Springs, Alaska

    Energy Technology Data Exchange (ETDEWEB)

    Holdmann, Gwen [Univ. of Alaska, Fairbanks, AK (United States)

    2015-04-30

    Drilling and temperature logging campaigns between the late 1970s and early 1980s measured temperatures at Pilgrim Hot Springs in excess of 90°C. Between 2010 and 2014, the University of Alaska used a variety of methods, including geophysical surveys, remote sensing techniques, heat budget modeling, and additional drilling, to better understand the resource and estimate the available geothermal energy.

  14. Derivation and Cross-Validation of Cutoff Scores for Patients With Schizophrenia Spectrum Disorders on WAIS-IV Digit Span-Based Performance Validity Measures.

    Science.gov (United States)

    Glassmire, David M; Toofanian Ross, Parnian; Kinney, Dominique I; Nitch, Stephen R

    2016-06-01

    Two studies were conducted to identify and cross-validate cutoff scores on the Wechsler Adult Intelligence Scale-Fourth Edition Digit Span-based embedded performance validity (PV) measures for individuals with schizophrenia spectrum disorders. In Study 1, normative scores were identified on Digit Span-embedded PV measures among a sample of patients (n = 84) with schizophrenia spectrum diagnoses who had no known incentive to perform poorly and who put forth valid effort on external PV tests. Previously identified cutoff scores resulted in unacceptable false positive rates and lower cutoff scores were adopted to maintain specificity levels ≥90%. In Study 2, the revised cutoff scores were cross-validated within a sample of schizophrenia spectrum patients (n = 96) committed as incompetent to stand trial. Performance on Digit Span PV measures was significantly related to Full Scale IQ in both studies, indicating the need to consider the intellectual functioning of examinees with psychotic spectrum disorders when interpreting scores on Digit Span PV measures. © The Author(s) 2015.
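
    Deriving a cutoff that keeps specificity at or above 90% in a credible-responder sample, as in Study 1 above, can be sketched as follows; the score distribution is simulated, not the patients' data.

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy Digit Span-based scores from patients known to be responding validly.
valid_scores = rng.normal(7, 2, 84).round().clip(0, 16)

# Scores AT or BELOW the cutoff flag possible invalidity, so specificity is
# the fraction of valid responders scoring above it. Take the highest cutoff
# whose specificity stays >= 90% (false positive rate <= 10%).
for cutoff in range(int(valid_scores.max()), -1, -1):
    specificity = np.mean(valid_scores > cutoff)
    if specificity >= 0.90:
        print(f"cutoff <= {cutoff}: specificity = {specificity:.2%}")
        break
```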

  15. Validation of nursing management diagnoses.

    Science.gov (United States)

    Morrison, R S

    1995-01-01

    Nursing management diagnosis based on nursing and management science, merges "nursing diagnosis" and "organizational diagnosis". Nursing management diagnosis is a judgment about nursing organizational problems. The diagnoses provide a basis for nurse manager interventions to achieve outcomes for which a nurse manager is accountable. A nursing organizational problem is a discrepancy between what should be happening and what is actually happening that prevents the goals of nursing from being accomplished. The purpose of this study was to validate 73 nursing management diagnoses identified previously in 1992: 71 of the 72 diagnoses were considered valid by at least 70% of 136 participants. Diagnoses considered to have high priority for future research and development were identified by summing the mean scores for perceived frequency of occurrence and level of disruption. Further development of nursing management diagnoses and testing of their effectiveness in enhancing decision making is recommended.

  16. Validation of Metrics as Error Predictors

    Science.gov (United States)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
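
    The regression step described here, a logistic model predicting error probability from model metrics, can be sketched as below; the metrics and error flags are simulated stand-ins for the EPC sample.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
# Toy model metrics (e.g., size, connector count) and error flags; placeholders.
X = rng.normal(size=(120, 2))
y = (X @ np.array([1.2, 0.8]) + rng.normal(0, 1, 120) > 0).astype(int)

model = LogisticRegression().fit(X, y)        # fit on the modeling sample
p_error = model.predict_proba(X[:3])[:, 1]    # predicted error probability
print(np.round(p_error, 3))
```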

  17. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Science.gov (United States)

    2010-07-01

    ... EMPLOYMENT OPPORTUNITY, DEPARTMENT OF LABOR 3-UNIFORM GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... techniques contemplated by these guidelines usually should be followed if technically feasible. Where the...

  18. SDL-Based Protocol Validation for the Integrated Safety Communication Network in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Jung-hun; Kim, Dong-hoon; Lee, Dong-young; Park, Sung-woo

    2006-01-01

    The communication protocol in nuclear power plants needs to be validated systematically to avoid critical situations that may be caused by its own faults. We establish a methodology to validate the protocol designed for the Integrated Safety Communication Networks (ISCN) of the Korea Nuclear Instrumentation and Control System (KNICS). The ISCN protocol is specified using the formal description technique called SDL. The validation of the ISCN protocol is done via the Simulator and Validator, both of which are main functions provided by the SDL.

  19. The development and validation of a psychological contract of safety scale.

    Science.gov (United States)

    Walker, Arlene

    2010-08-01

    This paper builds on previous research by the author and describes the development and validation of a new measure of the psychological contract of safety. The psychological contract of safety is defined as the beliefs of individuals about reciprocal safety obligations inferred from implicit and explicit promises. A psychological contract is established when an individual believes that perceived employer and employee safety obligations are contingent on each other. A pilot test of the measure is first undertaken with participants from three different occupations: nurses, construction workers, and meat processing workers (N=99). Item analysis is used to refine the measure and provide initial validation of the scale. A larger validation study is then conducted with a participant sample of health care workers (N=424) to further refine the measure and to determine the psychometric properties of the scale. Item and correlational analyses produced the final employer and employee obligations scales, consisting of 21 and 17 items, respectively. Factor analyses identified two underlying dimensions in each scale comparable to that previously established in the organizational literature. These transactional and relational-type obligations provided construct validity of the scale. Internal consistency ratings using Cronbach's alpha found the components of the psychological contract of safety measure to be reliable. The refined and validated psychological contract of safety measure will allow investigation of the positive and negative outcomes associated with fulfilment and breach of the psychological contract of safety in future research. 2010 Elsevier Ltd. All rights reserved.

  20. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    Energy Technology Data Exchange (ETDEWEB)

    Bravenec, Ronald [Fourth State Research, Austin, TX (United States)

    2017-11-14

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  1. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    Science.gov (United States)

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
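
    As a rough illustration of the kind of sectional analysis involved (not the authors' specific algorithm), the fitted-ellipse geometry of each fiber cross-section yields its in-plane angle, and the axis ratio relates to the out-of-plane inclination for circular fibers; the synthetic image below is an assumption.

```python
import numpy as np
from skimage.draw import ellipse
from skimage.measure import label, regionprops

# Synthetic sectional image: one elliptical fiber cross-section.
img = np.zeros((200, 200), dtype=np.uint8)
rr, cc = ellipse(100, 100, 10, 30, rotation=np.deg2rad(30))
img[rr, cc] = 1

# Each region's fitted-ellipse geometry gives the in-plane angle; for a
# circular fiber, cos(inclination) = minor axis / major axis.
for prop in regionprops(label(img)):
    inclination = np.degrees(np.arccos(prop.minor_axis_length /
                                       prop.major_axis_length))
    print(f"in-plane angle: {np.degrees(prop.orientation):.1f} deg, "
          f"inclination: {inclination:.1f} deg")
```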

  2. Static Validation of Security Protocols

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, P.

    2005-01-01

    We methodically expand protocol narrations into terms of a process algebra in order to specify some of the checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we demonstrate that these techniques suffice to identify several authentication flaws in symmetric and asymmetric key protocols such as Needham-Schroeder symmetric key, Otway-Rees, Yahalom, Andrew secure RPC, Needham-Schroeder asymmetric key, and Beller-Chang-Yacobi MSR...

  3. A comparison of morbidity associated with placenta previa with and without previous caesarean sections

    International Nuclear Information System (INIS)

    Baqai, S.; Siraj, A.; Noor, N.

    2018-01-01

    To compare the morbidity associated with placenta previa with and without previous caesarean sections. Study Design: Retrospective comparative study. Place and Duration of Study: From March 2014 till March 2016, in the department of Obstetrics and Gynaecology at PNS Shifa hospital, Karachi. Material and Methods: After approval from the hospital ethical committee, antenatal patients with singleton pregnancy of gestational age >32 weeks, in the age group of 20-40 years, diagnosed to have placenta previa were included in the study. All patients with twin pregnancy, or aged less than 20 years or more than 40 years, were excluded. The records of all patients fulfilling the inclusion criteria were reviewed. Data were collected on demographic and maternal variables, placenta previa, history of previous lower segment caesarean section (LSCS), complications associated with placenta previa, and techniques used to control blood loss. Results: During the study period, 6879 patients were delivered in PNS Shifa; of these, 2060 (29.9%) had a caesarean section, and of these, 47.3% of patients had a previous history of LSCS. Thirty-three (1.6%) patients were diagnosed to have placenta previa, and the frequency of placenta previa was significantly higher in patients with a previous history of LSCS than with a previous normal delivery, i.e. 22 vs. 11 (p=0.023). It was observed that the frequencies of morbidly adherent placenta (MAP) and intensive care unit (ICU) stay were significantly higher in patients with a previous history of LSCS than in those with a previous history of normal delivery. Conclusion: The frequency of placenta previa was significantly higher in patients with a history of LSCS. Placenta previa also remains a major risk factor for various maternal complications. (author)

  4. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  5. Structural exploration for the refinement of anticancer matrix metalloproteinase-2 inhibitor designing approaches through robust validated multi-QSARs

    Science.gov (United States)

    Adhikari, Nilanjan; Amin, Sk. Abdul; Saha, Achintya; Jha, Tarun

    2018-03-01

    Matrix metalloproteinase-2 (MMP-2) is a promising pharmacological target for designing potential anticancer drugs. MMP-2 plays critical functions in apoptosis by cleaving the DNA repair enzyme poly(ADP-ribose) polymerase (PARP). Moreover, MMP-2 expression triggers the vascular endothelial growth factor (VEGF), which has a positive influence on tumor size, invasion, and angiogenesis. There is therefore an urgent need to develop potential MMP-2 inhibitors without toxicity and with better pharmacokinetic properties. In this article, robust validated multi-quantitative structure-activity relationship (QSAR) modeling approaches were applied to a dataset of 222 MMP-2 inhibitors to explore the important structural and pharmacophoric requirements for higher MMP-2 inhibition. Different validated regression- and classification-based QSARs, pharmacophore mapping and 3D-QSAR techniques were performed. These results were then challenged and subjected to further validation to explain 24 in-house MMP-2 inhibitors, to judge the reliability of the models further. All models were individually validated internally as well as externally, and were supported and validated by each other. The results were further justified by molecular docking analysis. The modeling techniques adopted here help not only to explore the necessary structural and pharmacophoric requirements, but also the overall validation and refinement process for designing potential MMP-2 inhibitors.
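
    A minimal sketch of the internal-validation step common to regression-based QSAR, a leave-one-out cross-validated q², on invented descriptor data; the descriptors and activities are not from the 222-compound dataset.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(8)
X = rng.normal(size=(40, 4))                  # toy molecular descriptors
y = X @ np.array([0.8, -0.5, 0.3, 0.0]) + rng.normal(0, 0.3, 40)  # toy pIC50

pred = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
q2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"LOO q^2 = {q2:.2f}")                  # internal validation statistic
```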

  6. Slip Validation and Prediction for Mars Exploration Rovers

    Directory of Open Access Journals (Sweden)

    Jeng Yen

    2008-04-01

    Full Text Available This paper presents a novel technique to validate and predict rover slip on the Martian surface for NASA's Mars Exploration Rover (MER) mission. Unlike the traditional approach, the proposed method uses the actual velocity profile of the wheels and the digital elevation map (DEM) from stereo images of the terrain to formulate the equations of motion. The six wheel speeds from the empirical encoder data constitute the vehicle's velocity, and the rover motion can be estimated using mixed differential and algebraic equations. Applying a discretization operator to these equations, the full kinematic state of the rover is then resolved by the configuration kinematics solution in the Rover Sequencing and Visualization Program (RSVP). This method, with proper wheel slip and sliding factors, produces accurate simulations of the Mars Exploration Rovers, which have been validated against the Earth-testing vehicle. The computational technique has been deployed in the operation of the MER rovers in the extended mission period. In particular, it yields high-quality prediction of rover motion in high-slope areas. The simulated path of the rovers has been validated using telemetry from the onboard Visual Odometry (VisOdom). Preliminary results indicate that the proposed simulation is very effective in planning the paths of the rovers in high-slope areas.

  7. Validating relationships among attachment, emotional intelligence and clinical communication.

    Science.gov (United States)

    Cherry, M Gemma; Fletcher, Ian; O'Sullivan, Helen

    2014-10-01

    In a previous study, we found that emotional intelligence (EI) mediates the negative influences of Year 1 medical students' attachment styles on their provider-patient communication (PPC). However, in that study, students were examined on a relatively straightforward PPC skill set and were not assessed on their abilities to elicit relevant clinical information from standardised patients. The influence of these psychological variables in more demanding and realistic clinical scenarios warrants investigation. This study aimed to validate previous research findings by exploring the mediating effect of EI on the relationship between medical students' attachment styles and their PPC across an ecologically valid PPC objective structured clinical examination (OSCE). Year 2 medical students completed measures of attachment (the Experiences in Close Relationships-Short Form [ECR-SF], a 12-item measure which provides attachment avoidance and attachment anxiety dimensional scores) and EI (the Mayer-Salovey-Caruso Emotional Intelligence Test [MSCEIT], a 141-item measure on the perception, use, understanding and management of emotions), prior to their summative PPC OSCE. Provider-patient communication was assessed using OSCE scores. Structural equation modelling (SEM) was used to validate our earlier model of the relationships between attachment style, EI and PPC. A total of 296 of 382 (77.5%) students participated. Attachment avoidance was significantly negatively correlated with total EI scores (r = -0.23, p < 0.01); total EI was significantly positively correlated with OSCE scores (r = 0.32, p < 0.01). Parsimonious SEM confirmed that EI mediated the negative influence of attachment avoidance on OSCE scores. It significantly predicted 14% of the variance in OSCE scores, twice as much as the 7% observed in the previous study. In more demanding and realistic clinical scenarios, EI makes a greater contribution towards effective PPC. Attachment is perceived to be stable…

  8. Testing philosophy and simulation techniques

    International Nuclear Information System (INIS)

    Holtbecker, H.

    1977-01-01

    This paper reviews past and present testing philosophies and simulation techniques in the field of structure loading and response studies. The main objective of experimental programmes in the past was to simulate a hypothetical energy release with explosives and to deduce the potential damage to a reactor from the measured damage to the model. This approach was continuously refined by improving the instrumentation of the models, by reproducing the structures as faithfully as possible, and by developing new explosive charges. This paper presents an analysis of the factors which are expected to have an influence on the validity of the results, e.g. strain rate effects and the use of water instead of sodium. More recently, the discussion of a whole series of accidents in probabilistic accident analysis and the intention to compare different reactor designs have revealed the need to develop and validate computer codes. Consequently, experimental programmes have been started in which the primary aim is not to test a specific reactor but to validate codes. This paper shows the principal aspects of this approach and discusses first results. (Auth.)

  9. Validity and Reliability of the Achilles Tendon Total Rupture Score

    DEFF Research Database (Denmark)

    Ganestam, Ann; Barfod, Kristoffer; Klit, Jakob

    2013-01-01

    The best treatment of acute Achilles tendon rupture remains debated. Patient-reported outcome measures have become cornerstones in treatment evaluations. The Achilles tendon total rupture score (ATRS) has been developed for this purpose but requires additional validation. The purpose of the present study was to validate a Danish translation of the ATRS. The ATRS was translated into Danish according to internationally adopted standards. Of 142 patients, 90 with previous rupture of the Achilles tendon participated in the validity study and 52 in the reliability study. The ATRS showed moderately strong criterion validity (… = .07). The limits of agreement were ±18.53. A strong correlation was found between test and retest (intercorrelation coefficient .908); the standard error of measurement was 6.7, and the minimal detectable change was 18.5. The Danish version of the ATRS showed moderately strong criterion validity…
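
    The reported reliability statistics are related by standard formulas: SEM = SD·√(1−ICC) and MDC = SEM·1.96·√2. A small check, in which the between-subject SD (≈22.1) is back-calculated to reproduce the reported SEM and is therefore an assumption:

```python
import math

# Reliability quantities reported for the Danish ATRS; SD is an assumption
# back-calculated from SEM = SD * sqrt(1 - ICC).
icc, sd = 0.908, 22.1

sem = sd * math.sqrt(1 - icc)        # standard error of measurement (~6.7)
mdc = sem * 1.96 * math.sqrt(2)      # minimal detectable change, 95% (~18.5)
print(f"SEM = {sem:.1f}, MDC = {mdc:.1f}")
```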

  10. High spatial validity is not sufficient to elicit voluntary shifts of attention.

    Science.gov (United States)

    Pauszek, Joseph R; Gibson, Bradley S

    2016-10-01

    Previous research suggests that the use of valid symbolic cues is sufficient to elicit voluntary shifts of attention. The present study interpreted this previous research within a broader theoretical context which contends that observers will voluntarily use symbolic cues to orient their attention in space when the temporal costs of using the cues are perceived to be less than the temporal costs of searching without the aid of the cues. In this view, previous research has not addressed the sufficiency of valid symbolic cues, because the temporal cost of using the cues is usually incurred before the target display appears. To address this concern, 70%-valid spatial word cues were presented simultaneously with a search display. In addition, other research suggests that opposing cue-dependent and cue-independent spatial biases may operate in these studies and alter standard measures of orienting. After identifying and controlling these opposing spatial biases, the results of two experiments showed that the word cues did not elicit voluntary shifts of attention when the search task was relatively easy but did when the search task was relatively difficult. Moreover, the findings also showed that voluntary use of the word cues changed over the course of the experiment when the task was difficult, presumably because the temporal cost of searching without the cue lessened as the task got easier with practice. Altogether, the present findings suggested that the factors underlying voluntary control are multifaceted and contextual, and that spatial validity alone is not sufficient to elicit voluntary shifts of attention.

  11. Indiana pouch continent urinary reservoir in patients with previous pelvic irradiation

    International Nuclear Information System (INIS)

    Mannel, R.S.; Braly, P.S.; Buller, R.E.

    1990-01-01

    Little information exists on the use of continent urinary reservoirs in patients with previous pelvic irradiation. We report the use of the Indiana pouch urinary reservoir in ten women with a history of pelvic irradiation for cervical cancer, of whom eight underwent a total pelvic exenteration for recurrent pelvic tumor and two had diversion for radiation-induced vesicovaginal fistula. All ten women achieved daytime continence, with a median time between catheterizations of 4.5 hours and a median pouch capacity of 500 mL. There was no evidence of leakage from the reservoir or significant ureteral reflux or obstruction on postoperative radiographic evaluation. No patient has required reoperation or had significant postoperative complications with the technique described

  12. Validation test case generation based on safety analysis ontology

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Wang, Wen-Shing

    2012-01-01

    Highlights: ► Current practice in validation test case generation for nuclear system is mainly ad hoc. ► This study designs a systematic approach to generate validation test cases from a Safety Analysis Report. ► It is based on a domain-specific ontology. ► Test coverage criteria have been defined and satisfied. ► A computerized toolset has been implemented to assist the proposed approach. - Abstract: Validation tests in the current nuclear industry practice are typically performed in an ad hoc fashion. This study presents a systematic and objective method of generating validation test cases from a Safety Analysis Report (SAR). A domain-specific ontology was designed and used to mark up a SAR; relevant information was then extracted from the marked-up document for use in automatically generating validation test cases that satisfy the proposed test coverage criteria; namely, single parameter coverage, use case coverage, abnormal condition coverage, and scenario coverage. The novelty of this technique is its systematic rather than ad hoc test case generation from a SAR to achieve high test coverage.

  13. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    Science.gov (United States)

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall-event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques in order to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
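
    For derived quantities such as an event load (concentration × volume), the law of propagation of uncertainty combines relative uncertainties in quadrature; a toy example with magnitudes of the order reported above, not the paper's figures:

```python
import math

# Law of propagation for a product quantity such as a TSS event load
# L = C * V (concentration times volume): relative variances add.
rel_u_conc   = 0.30   # ~30% relative standard uncertainty on concentration
rel_u_volume = 0.08   # ~8% on event volume

rel_u_load = math.sqrt(rel_u_conc**2 + rel_u_volume**2)
print(f"relative uncertainty on the event load: {rel_u_load:.1%}")
```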

  14. Influence of Previous Knowledge in Torrance Tests of Creative Thinking

    Directory of Open Access Journals (Sweden)

    María Aranguren

    2015-07-01

    Full Text Available The aim of this work is to analyze the influence of study field, expertise, and participation in recreational activities on Torrance Tests of Creative Thinking (TTCT, 1974) performance. Several hypotheses were postulated to explore the possible effects of previous knowledge on the TTCT verbal and TTCT figural outcomes of university students. Participants in this study included 418 students from five study fields: Psychology; Philosophy and Literature; Music; Engineering; and Journalism and Advertising (Communication Sciences). The results found in this research seem to indicate that there is no influence of study field, expertise, or participation in recreational activities on either of the TTCT tests. Instead, the findings seem to suggest some kind of interaction between certain skills needed to succeed in specific study fields and performance on creativity tests such as the TTCT. These results imply that the TTCT is a useful and valid instrument to measure creativity and that some cognitive processes involved in innovative thinking can be promoted using different intervention programs in schools and universities regardless of the students' study field.

  15. Left ventricular assist device implantation in a patient who had previously undergone apical myectomy for hypertrophic cardiomyopathy.

    Science.gov (United States)

    Cho, Yang Hyun; Deo, Salil V; Topilsky, Yan; Grogan, Martha A; Park, Soon J

    2012-03-01

    Apical hypertrophy is a rare variant of hypertrophic cardiomyopathy. These patients may present with end-stage congestive heart failure subsequent to long-standing diastolic dysfunction. We report the technique for left ventricular assist device insertion in a patient with previous apical myectomy for hypertrophic cardiomyopathy. © 2012 Wiley Periodicals, Inc.

  16. Methods for validating the performance of wearable motion-sensing devices under controlled conditions

    International Nuclear Information System (INIS)

    Bliley, Kara E; Kaufman, Kenton R; Gilbert, Barry K

    2009-01-01

    This paper presents validation methods for assessing the accuracy and precision of motion-sensing device (i.e. accelerometer) measurements. The main goals of this paper were to assess the accuracy and precision of these measurements against a gold standard, to determine if differences in manufacturing and assembly significantly affected device performance and to determine if measurement differences due to manufacturing and assembly could be corrected by applying certain post-processing techniques to the measurement data during analysis. In this paper, the validation of a posture and activity detector (PAD), a device containing a tri-axial accelerometer, is described. Validation of the PAD devices required the design of two test fixtures: a test fixture to position the device in a known orientation, and a test fixture to rotate the device at known velocities and accelerations. Device measurements were compared to these known orientations and accelerations. Several post-processing techniques were utilized in an attempt to reduce variability in the measurement error among the devices. In conclusion, some of the measurement errors due to the inevitable differences in manufacturing and assembly were significantly improved (p < 0.01) by these post-processing techniques

  17. Huffman-based code compression techniques for embedded processors

    KAUST Repository

    Bonny, Mohamed Talal

    2010-09-01

    The size of embedded software is increasing at a rapid pace. It is often challenging and time consuming to fit the required software functionality within a given hardware resource budget. Code compression is a means to alleviate the problem by providing substantial savings in terms of code size. In this article we introduce a novel and efficient hardware-supported compression technique that is based on Huffman coding. Our technique reduces the size of the generated decoding table, which takes a large portion of the memory. It combines our previous techniques, the Instruction Splitting Technique and the Instruction Re-encoding Technique, into a new one called the Combined Compression Technique to improve the final compression ratio by taking advantage of both previous techniques. The Instruction Splitting Technique is instruction set architecture (ISA)-independent. It splits the instructions into portions of varying size (called patterns) before Huffman coding is applied. This technique improves the final compression ratio by more than 20% compared to other known schemes based on Huffman coding. The average compression ratios achieved using this technique are 48% and 50% for ARM and MIPS, respectively. The Instruction Re-encoding Technique is ISA-dependent. It investigates the benefits of re-encoding unused bits (we call them re-encodable bits) in the instruction format for a specific application to improve the compression ratio. Re-encoding those bits can reduce the size of decoding tables by up to 40%. Using this technique, we improve the final compression ratios in comparison to the first technique to 46% and 45% for ARM and MIPS, respectively (including all overhead that incurs). The Combined Compression Technique improves the compression ratio to 45% and 42% for ARM and MIPS, respectively. In our compression technique, we have conducted evaluations using a representative set of applications and have applied each technique to two major embedded processor architectures.
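
    A minimal, self-contained sketch of the Huffman-coding core that such pattern-based schemes build on (not the authors' hardware design); the "patterns" below are invented stand-ins for instruction bit fields.

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a prefix-free Huffman code table from a symbol sequence."""
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(Counter(symbols).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        # Prepend a bit to every codeword in each merged subtree.
        for table, bit in ((lo[2], "0"), (hi[2], "1")):
            for s in table:
                table[s] = bit + table[s]
        heapq.heappush(heap, [lo[0] + hi[0], tiebreak, {**lo[2], **hi[2]}])
        tiebreak += 1
    return heap[0][2]

# Toy "instruction patterns"; frequent patterns receive shorter codewords.
patterns = ["1010", "1010", "0001", "1010", "1111", "0001", "1010"]
table = huffman_code(patterns)
compressed = "".join(table[p] for p in patterns)
print(table, len(compressed), "bits vs", 4 * len(patterns), "uncompressed")
```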

  18. Reliability and Validity of the Inline Skating Skill Test

    Science.gov (United States)

    Radman, Ivan; Ruzic, Lana; Padovan, Viktoria; Cigrovski, Vjekoslav; Podnar, Hrvoje

    2016-01-01

    This study aimed to examine the reliability and validity of an inline skating skill test. Based on previous skating experience, forty-two skaters (26 female and 16 male) were randomized into two groups (competitive level vs. recreational level). They performed the test four times, with a recovery time of 45 minutes between sessions. Prior to testing, the participants rated their skating skill using a scale from 1 to 10. The protocol included performance time measurement through a course combining different skating techniques. Trivial changes in performance time between the repeated sessions were determined in both competitive females/males and recreational females/males (-1.7% [95% CI: -5.8–2.6%] – 2.2% [95% CI: 0.0–4.5%]). In all four subgroups, the skill test had a low mean within-individual variation (1.6% [95% CI: 1.2–2.4%] – 2.7% [95% CI: 2.1–4.0%]) and high mean inter-session correlation (ICC = 0.97 [95% CI: 0.92–0.99] – 0.99 [95% CI: 0.98–1.00]). The comparison of detected typical errors and smallest worthwhile changes (calculated as standard deviations × 0.2) revealed that the skill test was able to track changes in skaters' performances. Competitive-level skaters needed shorter times to complete the course (24.4–26.4%; all p values significant), supporting the ability of the test to evaluate skating skills in amateur competitive and recreational level skaters. Further studies are needed to evaluate the reproducibility of this skill test in different populations, including elite inline skaters. Key points: The study evaluated the reliability and construct validity of a newly developed inline skating skill test. The evaluated test is the first protocol designed to assess specific inline skating skill. Two groups of amateur skaters with different skating proficiency repeated the skill test on four separate occasions. The results suggest that the evaluated test is reliable and valid for evaluating inline skating skill in amateur skaters.
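
    The typical-error vs. smallest-worthwhile-change comparison used above can be sketched as follows, with invented course times: the SWC is 0.2 × the between-subject SD, and the typical error is the SD of test-retest differences divided by √2.

```python
import numpy as np

rng = np.random.default_rng(5)
trial1 = rng.normal(60, 6, 20)                  # toy course times (s), session 1
trial2 = trial1 + rng.normal(0, 1.2, 20)        # session 2, small random change

diff = trial2 - trial1
typical_error = diff.std(ddof=1) / np.sqrt(2)   # within-subject variation
swc = 0.2 * trial1.std(ddof=1)                  # smallest worthwhile change

print(f"typical error = {typical_error:.2f} s, SWC = {swc:.2f} s")
print("test can track changes" if typical_error < swc else "too noisy")
```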

  19. Intelligent Testing of Traffic Light Programs: Validation in Smart Mobility Scenarios

    OpenAIRE

    Javier Ferrer; José García-Nieto; Enrique Alba; Francisco Chicano

    2016-01-01

    In smart cities, the use of intelligent automatic techniques to find efficient cycle programs of traffic lights is becoming an innovative front for traffic flow management. However, this automatic programming of traffic lights requires a validation process of the generated solutions, since they can affect the mobility (and security) of millions of citizens. In this paper, we propose a validation strategy based on genetic algorithms and feature models for the automatic generation of different ...

  20. Assessing the construct validity of aberrant salience

    Directory of Open Access Journals (Sweden)

    Kristin Schmidt

    2009-12-01

    Full Text Available We sought to validate the psychometric properties of a recently developed paradigm that aims to measure salience attribution processes proposed to contribute to positive psychotic symptoms, the Salience Attribution Test (SAT). The “aberrant salience” measure from the SAT showed good face validity in previous results, with elevated scores both in high-schizotypy individuals and in patients with schizophrenia suffering from delusions. Exploring the construct validity of salience attribution variables derived from the SAT is important, since other factors, including latent inhibition/learned irrelevance, attention, probabilistic reward learning, sensitivity to probability, general cognitive ability and working memory, could influence these measures. Fifty healthy participants completed schizotypy scales, the SAT, a learned irrelevance task, and a number of other cognitive tasks tapping into potentially confounding processes. Behavioural measures of interest from each task were entered into a principal components analysis, which yielded a five-factor structure accounting for ~75% of the variance in behaviour. Implicit aberrant salience was found to load onto its own factor, which was associated with elevated “Introvertive Anhedonia” schizotypy, replicating our previous finding. Learned irrelevance loaded onto a separate factor, which also included implicit adaptive salience, but was not associated with schizotypy. Explicit adaptive and aberrant salience, along with a measure of probabilistic learning, loaded onto a further factor, though this also did not correlate with schizotypy. These results suggest that the measures of learned irrelevance and implicit adaptive salience might be based on similar underlying processes, which are dissociable both from implicit aberrant salience and explicit measures of salience.
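
    A bare-bones sketch of the principal components step (without the rotation choices a full analysis would involve), on invented behavioural measures rather than the SAT battery:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
# Toy behavioural measures (rows: 50 participants, columns: task variables).
X = rng.normal(size=(50, 10))

pca = PCA(n_components=5).fit(X)        # five-component solution
loadings = pca.components_.T            # variables-by-components loadings
print(f"variance explained: {pca.explained_variance_ratio_.sum():.0%}")
```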

  1. Progress in Geant4 Electromagnetic Physics Modelling and Validation

    International Nuclear Information System (INIS)

    Apostolakis, J; Burkhardt, H; Ivanchenko, V N; Asai, M; Bagulya, A; Grichine, V; Brown, J M C; Chikuma, N; Cortes-Giraldo, M A; Elles, S; Jacquemier, J; Guatelli, S; Incerti, S; Kadri, O; Maire, M; Urban, L; Pandola, L; Sawkey, D; Toshito, T; Yamashita, T

    2015-01-01

    In this work we report on recent improvements in the electromagnetic (EM) physics models of Geant4 and new validations of EM physics. Improvements have been made in models of the photoelectric effect, Compton scattering, gamma conversion to electron and muon pairs, fluctuations of energy loss, multiple scattering, synchrotron radiation, and high energy positron annihilation. The results of these developments are included in the new Geant4 version 10.1 and in patches to previous versions 9.6 and 10.0, which are planned to be used for production in Run 2 at the LHC. The Geant4 validation suite for EM physics has been extended and new validation results are shown in this work. In particular, the effect of gamma-nuclear interactions on EM shower shape at LHC energies is discussed. (paper)

  2. Application of nuclear techniques on environmental pollution problems

    International Nuclear Information System (INIS)

    Sumatra, Made

    1998-01-01

    Radioanalysis and tracer techniques can be applied to environmental pollution problems. Neutron activation analysis (NAA) and X-ray fluorescence (XRF) spectrometry are the two methods most frequently used on such problems; both are used for metal analysis. Tracer techniques with radioactively labeled compounds are used to study the fate of pollutants in environmental systems. It is very important to validate every newly developed analysis method, since environmental pollution problems are closely related to law enforcement. (author)

  3. Validation of a survey tool for use in cross-cultural studies

    Directory of Open Access Journals (Sweden)

    Costa FA

    2008-09-01

    There is a need for tools to measure the information patients need in order for healthcare professionals in general, and pharmacists in particular, to communicate effectively and play an active part in the way patients manage their medicines. Previous research has developed and validated constructs to measure patients’ desires for information and their perceptions of how useful their medicines are. It is important to develop these tools for use in different settings and countries so that best practice is shared and is based on the best available evidence. Objectives: this project sought to validate a survey tool measuring the “Extent of Information Desired” (EID), the “Perceived Utility of Medicines” (PUM), and the “Anxiety about Illness” (AI) that had been previously translated for use with Portuguese patients. Methods: The scales were validated in a patient sample of 596: construct validity was explored in factor analysis (PCA) and internal consistency analysed using Cronbach’s alpha. Criterion validity was explored by correlating scores on the AI scale and patients’ perceived health status. Discriminatory power was assessed using ANOVA. Temporal stability was explored in a sub-sample of patients who responded at two time points, using a t-test to compare their mean scores. Results: Construct validity results indicated the need to remove one item from the Perceived Harm of Medicines (PHM) and Perceived Benefit of Medicines (PBM) scales for use in a Portuguese sample, and to abandon the Tolerance scale. The internal consistency was high for the EID, PBM and AI scales (alpha > 0.600) and acceptable for the PHM scale (alpha = 0.536). All scales, except the EID, were consistent over time (p > 0.05; p < 0.01). All the scales tested showed good discriminatory power. The comparison of the AI scale with the SF-36 indicated good criterion validity (p < 0.05). Conclusion: The translated tool was valid and reliable in Portuguese patients, excluding the Tolerance scale.

  4. Signalign: An Ontology of DNA as Signal for Comparative Gene Structure Prediction Using Information-Coding-and-Processing Techniques.

    Science.gov (United States)

    Yu, Ning; Guo, Xuan; Gu, Feng; Pan, Yi

    2016-03-01

    Conventional character-analysis-based techniques in genome analysis manifest three main shortcomings: inefficiency, inflexibility, and incompatibility. In our previous research, a general framework called DNA As X was proposed for character-analysis-free techniques to overcome these shortcomings, where X is an intermediate such as digit, code, signal, vector, tree, graph, network, and so on. In this paper, we further implement an ontology of DNA As Signal, by designing a tool named Signalign for comparative gene structure analysis, in which DNA sequences are converted into signal series, processed by a modified method of dynamic time warping, and measured by signal-to-noise ratio (SNR). The ontology of DNA As Signal integrates the principles and concepts of other disciplines, including information coding theory and signal processing, into sequence analysis and processing. Compared with conventional character-analysis-based methods, Signalign not only achieves equivalent or superior performance, but also enriches the tools and the knowledge library of computational biology by extending the domain from character/string to diverse areas. The evaluation results validate the success of the character-analysis-free technique for improved performances in comparative gene structure prediction.

  5. Optimization of coronary optical coherence tomography imaging using the attenuation-compensated technique: a validation study.

    NARCIS (Netherlands)

    Teo, Jing Chun; Foin, Nicolas; Otsuka, Fumiyuki; Bulluck, Heerajnarain; Fam, Jiang Ming; Wong, Philip; Low, Fatt Hoe; Leo, Hwa Liang; Mari, Jean-Martial; Joner, Michael; Girard, Michael J A; Virmani, Renu

    2016-01-01

    PURPOSE To optimize conventional coronary optical coherence tomography (OCT) images using the attenuation-compensated technique to improve identification of plaques and the external elastic lamina (EEL) contour. METHOD The attenuation-compensated technique was optimized via manipulating contrast

  6. Planet Candidate Validation in K2 Crowded Fields

    Science.gov (United States)

    Rampalli, Rayna; Vanderburg, Andrew; Latham, David; Quinn, Samuel

    2018-01-01

    In just three years, the K2 mission has yielded some remarkable outcomes with the discovery of over 100 confirmed planets and 500 reported planet candidates to be validated. One challenge with this mission is the search for planets located in star-crowded regions. Campaign 13 is one such example, located towards the galactic plane in the constellation of Taurus. We subject the potential planetary candidates to a validation process involving spectroscopy to derive certain stellar parameters. Seeing-limited on/off imaging follow-up is also utilized in order to rule out false positives due to nearby eclipsing binaries. Using Markov chain Monte Carlo analysis, the best-fit parameters for each candidate are generated. These will be suitable for finding a candidate’s false positive probability through methods including feeding such parameters into the Validation of Exoplanet Signals using a Probabilistic Algorithm (VESPA). These techniques and results serve as important tools for conducting candidate validation and follow-up observations for space-based missions such as the upcoming TESS mission since TESS’s large camera pixels resemble K2’s star-crowded fields.

  7. Validation of safeguards monitoring systems and their simulations

    International Nuclear Information System (INIS)

    Standley, V.; Boeck, H.; Villa, M.

    2001-01-01

    Research is underway at the Atominstitut in Vienna, Austria, with the objective of designing and quantitatively validating a safeguards monitoring system (SMS) and its simulation. The work is novel because the simulation is also used as the basis for automated evaluation of SMS data. Preliminary results indicate that video and radiation data can be automatically interpreted using this approach. Application of the technique promises that an investment in a simulation directly supports the safeguards objective, which is to detect diversion of nuclear material. Consequently, it is easier for a safeguards agency to also realize other benefits associated with simulation-based acquisition, in addition to having a quantitative method for validation.

  8. The validity of nasal endoscopy in patients with chronic rhinosinusitis

    DEFF Research Database (Denmark)

    Larsen, K. L.; Lange, B.; Darling, P.

    2018-01-01

    Objectives: Nasal endoscopy is a cornerstone in diagnosing sinonasal disease, but different raters might generate different results using the technique. Our study aims to evaluate the agreement between multiple raters to assess the validity of nasal endoscopy. Design/Participants: Three independent ...

  9. Cross-validation of an employee safety climate model in Malaysia.

    Science.gov (United States)

    Bahari, Siti Fatimah; Clarke, Sharon

    2013-06-01

    Whilst substantial research has investigated the nature of safety climate, and its importance as a leading indicator of organisational safety, much of this research has been conducted with Western industrial samples. The current study focuses on the cross-validation of a safety climate model in the non-Western industrial context of Malaysian manufacturing. The first-order factorial validity of Cheyne et al.'s (1998) [Cheyne, A., Cox, S., Oliver, A., Tomas, J.M., 1998. Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12(3), 255-271] model was tested, using confirmatory factor analysis, in a Malaysian sample. Results showed that the model fit indices were below accepted levels, indicating that the original Cheyne et al. (1998) safety climate model was not supported. An alternative three-factor model was developed using exploratory factor analysis. Although these findings are not consistent with previously reported cross-validation studies, we argue that previous studies have focused on validation across Western samples, and that the current study demonstrates the need to take account of cultural factors in the development of safety climate models intended for use in non-Western contexts. The results have important implications for the transferability of existing safety climate models across cultures (for example, in global organisations) and highlight the need for future research to examine cross-cultural issues in relation to safety climate. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  10. JaCVAM-organized international validation study of the in vivo rodent alkaline comet assay for the detection of genotoxic carcinogens: I. Summary of pre-validation study results.

    Science.gov (United States)

    Uno, Yoshifumi; Kojima, Hajime; Omori, Takashi; Corvi, Raffaella; Honma, Masamistu; Schechtman, Leonard M; Tice, Raymond R; Burlinson, Brian; Escobar, Patricia A; Kraynak, Andrew R; Nakagawa, Yuzuki; Nakajima, Madoka; Pant, Kamala; Asano, Norihide; Lovell, David; Morita, Takeshi; Ohno, Yasuo; Hayashi, Makoto

    2015-07-01

    The in vivo rodent alkaline comet assay (comet assay) is used internationally to investigate the in vivo genotoxic potential of test chemicals. This assay, however, has not previously been formally validated. The Japanese Center for the Validation of Alternative Methods (JaCVAM), with the cooperation of the U.S. NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM)/the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM), the European Centre for the Validation of Alternative Methods (ECVAM), and the Japanese Environmental Mutagen Society/Mammalian Mutagenesis Study Group (JEMS/MMS), organized an international validation study to evaluate the reliability and relevance of the assay for identifying genotoxic carcinogens, using liver and stomach as target organs. The ultimate goal of this validation effort was to establish an Organisation for Economic Co-operation and Development (OECD) test guideline. The purpose of the pre-validation studies (i.e., Phase 1 through 3), conducted in four or five laboratories with extensive comet assay experience, was to optimize the protocol to be used during the definitive validation study. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Conception and validation of the Behavioral Intentions Scale of Organizational Citizenship (BISOC)

    Directory of Open Access Journals (Sweden)

    Ana Cristina Passos Gomes Menezes

    2016-01-01

    This study aimed to construct and validate the Behavioral Intentions Scale of Organizational Citizenship (BISOC). Organizational citizenship consists of voluntary behaviors, which are beneficial to organizations and are not explicit in employment contracts. To investigate the psychometric properties of the BISOC, we selected 767 employees in different cities from the states of Bahia and Pernambuco (Brazil). The validation procedures adopted, which used techniques from both Classical Test Theory and Item Response Theory, showed that the BISOC has a unidimensional structure. From the initial set of 42 items, 35 items met the validation criteria. By presenting suitable psychometric parameters, the BISOC is the first measure of organizational citizenship behaviors developed and validated to assess behavioral intentions.

  12. Complementary techniques: validation of gene expression data by quantitative real time PCR.

    Science.gov (United States)

    Provenzano, Maurizio; Mocellin, Simone

    2007-01-01

    Microarray technology can be considered the most powerful tool for screening gene expression profiles of biological samples. After data mining, results need to be validated with highly reliable biotechniques allowing for precise quantitation of the transcriptional abundance of identified genes. Quantitative real time PCR (qrt-PCR) technology has recently reached a level of sensitivity, accuracy and practical ease that supports its use as routine bioinstrumentation for gene level measurement. Currently, qrt-PCR is considered by most experts the most appropriate method to confirm or refute microarray-generated data. The knowledge of the biochemical principles underlying qrt-PCR, as well as some related technical issues, must be borne in mind when using this biotechnology.
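    The abstract does not spell out the quantitation arithmetic, but a common way qrt-PCR results are expressed when confirming microarray hits is the comparative-Ct (2^−ΔΔCt) method. The sketch below, with invented Ct values, shows that calculation; it illustrates the general technique, not a procedure taken from this paper.

    ```python
    # Comparative-Ct (2^-ddCt) relative quantification with invented Ct values:
    # the target gene is normalized to a housekeeping gene in both conditions,
    # and the fold change follows from the difference of the two delta-Cts.
    ct_target_treated, ct_ref_treated = 22.1, 18.0
    ct_target_control, ct_ref_control = 24.6, 18.2

    d_ct_treated = ct_target_treated - ct_ref_treated   # 4.1
    d_ct_control = ct_target_control - ct_ref_control   # 6.4
    dd_ct = d_ct_treated - d_ct_control                 # -2.3

    fold_change = 2 ** (-dd_ct)                         # ~4.9-fold up-regulation
    print(f"ddCt = {dd_ct:.2f}, fold change = {fold_change:.2f}")
    ```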

  13. Decoupling Solar Variability and Instrument Trends Using the Multiple Same-Irradiance-Level (MuSIL) Analysis Technique

    Science.gov (United States)

    Woods, Thomas N.; Eparvier, Francis G.; Harder, Jerald; Snow, Martin

    2018-05-01

    The solar spectral irradiance (SSI) dataset is a key record for studying and understanding the energetics and radiation balance in Earth's environment. Understanding the long-term variations of the SSI over timescales of the 11-year solar activity cycle and longer is critical for many Sun-Earth research topics. Satellite measurements of the SSI have been made since the 1970s, most of them in the ultraviolet, but recently also in the visible and near-infrared. A limiting factor for the accuracy of previous solar variability results is the uncertainty in the instrument degradation corrections, which can be fairly large relative to the amount of solar cycle variability at some wavelengths. The primary objective of this investigation has been to separate out solar cycle variability and any residual uncorrected instrumental trends in the SSI measurements from the Solar Radiation and Climate Experiment (SORCE) mission and the Thermosphere Ionosphere Mesosphere Energetics and Dynamics (TIMED) mission. A new technique called the Multiple Same-Irradiance-Level (MuSIL) analysis has been developed, which examines an SSI time series at different levels of solar activity to provide long-term trends in an SSI record; the most common result is a downward trend that most likely stems from uncorrected instrument degradation. This technique has been applied to each wavelength in the SSI records from SORCE (2003 – present) and TIMED (2002 – present) to provide new solar cycle variability results between 27 nm and 1600 nm with a resolution of about 1 nm at most wavelengths. The technique, which was validated against the highly accurate total solar irradiance (TSI) record, has an estimated relative uncertainty of about 5% of the measured solar cycle variability. The MuSIL results are further validated by comparing the new solar cycle variability results from different solar cycles.

  14. Computed simulation of radiographies of pipes - validation of techniques for wall thickness measurements

    International Nuclear Information System (INIS)

    Bellon, C.; Tillack, G.R.; Nockemann, C.; Wenzel, L.

    1995-01-01

    A macroscopic model of radiographic NDE methods and applications is given. A computer-aided approach for the determination of wall thickness from radiographs is presented, guaranteeing high accuracy and reproducibility of wall thickness determination by means of projection radiography. The algorithm was applied to computed simulations of radiographies. The simulation thus offers an effective means for testing such automated wall thickness determination as a function of imaging conditions, pipe geometries, coatings, and media tracking, and is likewise a tool for validation and optimization of the method. (orig.)
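    The physical relation underlying any such wall thickness determination is attenuation along the ray path. As a rough sketch (assuming a monoenergetic beam, a known attenuation coefficient, and no scatter buildup, all of which the paper's full model would treat more carefully), the thickness follows from the Beer-Lambert law:

    ```python
    import math

    # Beer-Lambert: I = I0 * exp(-mu * t)  =>  t = ln(I0 / I) / mu
    mu = 0.055      # assumed linear attenuation coefficient of the wall (1/mm)
    i0 = 1000.0     # unattenuated detector intensity (arbitrary units)
    i = 520.0       # intensity measured behind one pipe wall

    thickness_mm = math.log(i0 / i) / mu
    print(f"estimated wall thickness: {thickness_mm:.1f} mm")   # ~11.9 mm
    ```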

  15. Conceptual dissonance: evaluating the efficacy of natural language processing techniques for validating translational knowledge constructs.

    Science.gov (United States)

    Payne, Philip R O; Kwok, Alan; Dhaval, Rakesh; Borlawsky, Tara B

    2009-03-01

    The conduct of large-scale translational studies presents significant challenges related to the storage, management and analysis of integrative data sets. Ideally, the application of methodologies such as conceptual knowledge discovery in databases (CKDD) provides a means for moving beyond intuitive hypothesis discovery and testing in such data sets, and towards the high-throughput generation and evaluation of knowledge-anchored relationships between complex bio-molecular and phenotypic variables. However, the induction of such high-throughput hypotheses is non-trivial, and requires correspondingly high-throughput validation methodologies. In this manuscript, we describe an evaluation of the efficacy of a natural language processing-based approach to validating such hypotheses. As part of this evaluation, we will examine a phenomenon that we have labeled as "Conceptual Dissonance" in which conceptual knowledge derived from two or more sources of comparable scope and granularity cannot be readily integrated or compared using conventional methods and automated tools.

  16. Performance of Jet Substructure Techniques and Boosted Object Identification in ATLAS

    CERN Document Server

    Lacey, J; The ATLAS collaboration

    2014-01-01

    ATLAS has implemented and commissioned many new jet substructure techniques to aid in the identification and interpretation of hadronic final states originating from Lorentz-boosted heavy particles produced at the LHC. These techniques include quantum jets, jet charge, jet shapes, quark/gluon, boosted boson, and top quark tagging, along with grooming methods such as pruning, trimming, and filtering. These techniques have been validated using the large 2012 ATLAS dataset. Presented here is a summary of the state-of-the-art jet substructure and tagging techniques developed in ATLAS, their performance and recent results.

  17. Validation of the revised Mystical Experience Questionnaire in experimental sessions with psilocybin.

    Science.gov (United States)

    Barrett, Frederick S; Johnson, Matthew W; Griffiths, Roland R

    2015-11-01

    The 30-item revised Mystical Experience Questionnaire (MEQ30) was previously developed within an online survey of mystical-type experiences occasioned by psilocybin-containing mushrooms. The rated experiences occurred on average eight years before completion of the questionnaire. The current paper validates the MEQ30 using data from experimental studies with controlled doses of psilocybin. Data were pooled and analyzed from five laboratory experiments in which participants (n=184) received a moderate to high oral dose of psilocybin (at least 20 mg/70 kg). Results of confirmatory factor analysis demonstrate the reliability and internal validity of the MEQ30. Structural equation models demonstrate the external and convergent validity of the MEQ30 by showing that latent variable scores on the MEQ30 positively predict persisting change in attitudes, behavior, and well-being attributed to experiences with psilocybin while controlling for the contribution of the participant-rated intensity of drug effects. These findings support the use of the MEQ30 as an efficient measure of individual mystical experiences. A method to score a "complete mystical experience" that was used in previous versions of the mystical experience questionnaire is validated in the MEQ30, and a stand-alone version of the MEQ30 is provided for use in future research. © The Author(s) 2015.

  18. A new type of In-core sensor validation outline

    International Nuclear Information System (INIS)

    Figedy, S.

    2006-01-01

    In this paper, a new type of in-core sensor validation method is outlined, which is based on a combination of correlation coefficients and mutual information indices. First experience with this approach is described, and further improvements to enhance the reliability of the outcome are proposed, namely through computational intelligence techniques. (Authors)

  19. Functional validation of logic controllers: contribution to conformance testing and closed-loop analysis

    OpenAIRE

    Guignard , Anaïs

    2014-01-01

    The results presented in this PhD thesis deal with the functional validation of logic controllers using conformance testing and closed-loop validation techniques. The specification model is written in the Grafcet language and the logic controller is assumed to be a Programmable Logic Controller (PLC). In order to contribute to these validation techniques, this thesis presents: an extension to a formalization method for the Grafcet language by translation to a Mealy machine. This extension generates a ...

  20. Validation of Small Kepler Transiting Planet Candidates in or near the Habitable Zone

    Science.gov (United States)

    Torres, Guillermo; Kane, Stephen R.; Rowe, Jason F.; Batalha, Natalie M.; Henze, Christopher E.; Ciardi, David R.; Barclay, Thomas; Borucki, William J.; Buchhave, Lars A.; Crepp, Justin R.; Everett, Mark E.; Horch, Elliott P.; Howard, Andrew W.; Howell, Steve B.; Isaacson, Howard T.; Jenkins, Jon M.; Latham, David W.; Petigura, Erik A.; Quintana, Elisa V.

    2017-12-01

    A main goal of NASA’s Kepler Mission is to establish the frequency of potentially habitable Earth-size planets (η⊕). Relatively few such candidates identified by the mission can be confirmed to be rocky via dynamical measurement of their mass. Here we report an effort to validate 18 of them statistically using the BLENDER technique, by showing that the likelihood they are true planets is far greater than that of a false positive. Our analysis incorporates follow-up observations including high-resolution optical and near-infrared spectroscopy, high-resolution imaging, and information from the analysis of the flux centroids of the Kepler observations themselves. Although many of these candidates have been previously validated by others, the confidence levels reported typically ignore the possibility that the planet may transit a star different from the target along the same line of sight. If that were the case, a planet that appears small enough to be rocky may actually be considerably larger and therefore less interesting from the point of view of habitability. We take this into consideration here and are able to validate 15 of our candidates at a 99.73% (3σ) significance level or higher, and the other three at a slightly lower confidence. We characterize the GKM host stars using available ground-based observations and provide updated parameters for the planets, with sizes between 0.8 and 2.9 R⊕. Seven of them (KOI-0438.02, 0463.01, 2418.01, 2626.01, 3282.01, 4036.01, and 5856.01) have a better than 50% chance of being smaller than 2 R⊕ and being in the habitable zone of their host stars.

  1. Monitoring of piglets' open field activity and choice behaviour during the replay of maternal vocalization: a comparison between Observer and PID technique.

    Science.gov (United States)

    Puppe, B; Schön, P C; Wendland, K

    1999-07-01

    The paper presents a new system for the automatic monitoring of open field activity and choice behaviour of medium-sized animals. Passive infrared motion detectors (PID) were linked on-line via a digital I/O interface to a personal computer provided with self-developed analysis software based on LabVIEW (PID technique). The setup was used for testing 18 one-week-old piglets (Sus scrofa) for their approach to their mother's nursing vocalization replayed through loudspeakers. The results were validated by comparison with a conventional Observer technique, i.e. computer-aided direct observation. In most cases, no differences were seen between the Observer and PID techniques regarding the percentage of stay in previously defined open field segments, the locomotor open field activity, and the choice behaviour. The results revealed that piglets are clearly attracted by their mother's nursing vocalization. The monitoring system presented in this study is thus suitable for detailed behavioural investigations of individual acoustic recognition. In general, the PID technique is a useful tool for research into the behaviour of individual animals in a restricted open field which does not rely on subjective analysis by a human observer.

  2. A maximum entropy reconstruction technique for tomographic particle image velocimetry

    International Nuclear Information System (INIS)

    Bilsky, A V; Lozhkin, V A; Markovich, D M; Tokarev, M P

    2013-01-01

    This paper studies a novel approach for reducing tomographic PIV computational complexity. The proposed approach is an algebraic reconstruction technique, termed MENT (maximum entropy). This technique computes the three-dimensional light intensity distribution several times faster than SMART, using at least ten times less memory. Additionally, the reconstruction quality remains nearly the same as with SMART. This paper presents the theoretical computation performance comparison for MENT, SMART and MART, followed by validation using synthetic particle images. Both the theoretical assessment and validation of synthetic images demonstrate significant computational time reduction. The data processing accuracy of MENT was compared to that of SMART in a slot jet experiment. A comparison of the average velocity profiles shows a high level of agreement between the results obtained with MENT and those obtained with SMART. (paper)
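    The abstract names the family of algebraic reconstruction techniques (MART, SMART, and the proposed MENT) without giving update rules. As a hedged illustration of what a multiplicative algebraic reconstruction iteration looks like (this is generic MART on a toy two-ray, three-voxel system, not the MENT algorithm itself), consider:

    ```python
    import numpy as np

    W = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0]])   # 2 rays x 3 voxels (toy projection geometry)
    p = np.array([2.0, 3.0])          # measured line integrals
    f = np.ones(3)                    # initial intensity guess (must stay > 0)

    for _ in range(50):
        for i in range(len(p)):
            est = W[i] @ f                    # current estimate of ray i
            f *= (p[i] / est) ** W[i]         # multiplicative correction per voxel

    print("reconstruction:", np.round(f, 3), "re-projection:", np.round(W @ f, 3))
    ```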

  3. Expertise, Domains, and the Consensual Assessment Technique

    Science.gov (United States)

    Kaufman, James C.; Baer, John; Cole, Jason C.

    2009-01-01

    The Consensual Assessment Technique (CAT) argues that the most valid judgments of creativity are the combined opinions of experts in the field. Yet who exactly qualifies as an expert to evaluate a creative product such as a short story? This study examines both novice and expert judgments of student short fiction. Results indicate a…

  4. SONOGRAPHIC PREDICTION OF SCAR DEHISCENCE IN WOMEN WITH PREVIOUS CAESAREAN SECTION

    Directory of Open Access Journals (Sweden)

    Shubhada Suhas Jajoo

    2018-01-01

    BACKGROUND Caesarean section (Sectio Caesarea) is a surgical method for the completion of delivery. After various historical modifications of operative techniques, the modern approach consists in the transverse dissection of the anterior wall of the uterus. The rate of vaginal birth after caesarean section has been significantly reduced from year to year, and the rate of repeated caesarean section has increased over the past 10 years. Evaluation of scar thickness is done by ultrasound, but the scar thickness that would serve as a guiding “cut-off value” for choosing the method of delivery is still debated. To better assess the risk of uterine rupture, some authors have proposed sonographic measurement of lower uterine segment thickness near term, assuming that there is an inverse correlation between LUS thickness and the risk of uterine scar defect. Therefore, this assessment for the management of women with prior CS may increase safety during labour by selecting women with the lowest risk of uterine rupture. The aim of the study is to assess the diagnostic accuracy of sonographic measurements of the Lower Uterine Segment (LUS) thickness near term in predicting uterine scar defects in women with prior Caesarean Section (CS), and to ascertain the best cut-off values for predicting uterine rupture. MATERIALS AND METHODS 100 antenatal women with a history of one previous LSCS who came to attend the antenatal clinic were assessed for scar thickness by transabdominal ultrasonography, and the findings were correlated with intraoperative findings. This prospective longitudinal study was conducted for 1 year after IEC approval. Inclusion criterion: previous one LSCS. Exclusion criteria: (1) previous myomectomy scar; (2) previous two LSCS; (3) previous hysterotomy scar. RESULTS Our findings indicate that there is a strong association between the degree of LUS thinning measured near term and the risk of uterine scar defect at birth. In our study, the optimal cut-off value for predicting

  5. Comparison on extraction yield of sennoside A and sennoside B from senna (Cassia angustifolia) using conventional and non conventional extraction techniques and their quantification using a validated HPLC-PDA detection method.

    Science.gov (United States)

    Dhanani, Tushar; Singh, Raghuraj; Reddy, Nagaraja; Trivedi, A; Kumar, Satyanshu

    2017-05-01

    Senna is an important medicinal plant and is used in many Ayurvedic formulations. Dianthraquinone glucosides are the main bioactive phytochemicals present in the leaves and pods of senna. The extraction efficiency, in terms of yield and composition, of senna extracts prepared using both conventional (cold percolation at room temperature and refluxing) and non-conventional (ultrasound- and microwave-assisted solvent extraction as well as supercritical fluid extraction) techniques was compared in the present study. In addition, a rapid reverse-phase HPLC-PDA detection method was developed and validated for the simultaneous determination of sennoside A and sennoside B in the different extracts of senna leaves. Ultrasound- and microwave-assisted solvent extraction techniques were more effective in terms of yield and composition of the extracts than cold percolation at room temperature and refluxing.

  6. Can relaxation interventions reduce anxiety in patients receiving radiotherapy? outcomes and study validity

    International Nuclear Information System (INIS)

    Elith, C.A.; Perkins, B.A.; Johnson, L.S.; Skelly, M.H.; Dempsey, S.

    2001-01-01

    This study piloted the use of three relaxation interventions in an attempt to reduce levels of anxiety in patients who are immobilised for radiotherapy treatment of head and neck cancers, as well as trying to validate the study methodology. In addition to receiving normal radiation therapy treatment, 14 patients were assigned either to a control group not receiving a relaxation intervention or to one of three validated relaxation intervention techniques: music therapy, aromatherapy or guided imagery. Patients in the intervention groups underwent the relaxation technique daily for the first seven days of treatment. On days 1, 3, 5 and 7 of treatment, patients were required to complete the State Anxiety Inventory survey. While caution should be taken in accepting the results due to the small number of patients involved in the study and the non-randomised assignment of patients within the study, the results demonstrate a clinically significant reduction in anxiety levels for each of the three relaxation interventions compared to the control group. The study demonstrated good validity due to the ease of implementation, the unambiguous results generated, and the use of already validated anxiety interventions and measurement tools. Copyright (2001) Australian Institute of Radiography

  7. Reliability and Validity of 10 Different Standard Setting Procedures.

    Science.gov (United States)

    Halpin, Glennelle; Halpin, Gerald

    Research indicating that different cut-off points result from the use of different standard-setting techniques leaves decision makers with a disturbing dilemma: Which standard-setting method is best? This investigation of the reliability and validity of 10 different standard-setting approaches was designed to provide information that might help…

  8. Validation of phenol red versus gravimetric method for water reabsorption correction and study of gender differences in Doluisio's absorption technique.

    Science.gov (United States)

    Tuğcu-Demiröz, Fatmanur; Gonzalez-Alvarez, Isabel; Gonzalez-Alvarez, Marta; Bermejo, Marival

    2014-10-01

    The aim of the present study was to develop a method for water flux reabsorption measurement in Doluisio's Perfusion Technique based on the use of phenol red as a non-absorbable marker and to validate it by comparison with gravimetric procedure. The compounds selected for the study were metoprolol, atenolol, cimetidine and cefadroxil in order to include low, intermediate and high permeability drugs absorbed by passive diffusion and by carrier mediated mechanism. The intestinal permeabilities (Peff) of the drugs were obtained in male and female Wistar rats and calculated using both methods of water flux correction. The absorption rate coefficients of all the assayed compounds did not show statistically significant differences between male and female rats consequently all the individual values were combined to compare between reabsorption methods. The absorption rate coefficients and permeability values did not show statistically significant differences between the two strategies of concentration correction. The apparent zero order water absorption coefficients were also similar in both correction procedures. In conclusion gravimetric and phenol red method for water reabsorption correction are accurate and interchangeable for permeability estimation in closed loop perfusion method. Copyright © 2014 Elsevier B.V. All rights reserved.
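    The marker-based correction this study validates rests on simple conservation arithmetic: phenol red is not absorbed, so any rise in its concentration reflects water reabsorption, and drug concentrations can be rescaled accordingly. A minimal sketch of that correction, with invented concentrations (the paper's exact formulas may differ):

    ```python
    # Phenol red as non-absorbable marker: marker mass is conserved, so
    # C_pr(0) * V(0) = C_pr(t) * V(t), giving V(t)/V(0) = C_pr(0)/C_pr(t).
    pr_t0, pr_t = 50.0, 62.5        # marker concentration at 0 and t (ug/mL)
    drug_measured = 80.0            # measured drug concentration at t (ug/mL)

    volume_factor = pr_t0 / pr_t    # remaining luminal volume fraction = 0.8
    drug_corrected = drug_measured * volume_factor   # referenced to initial volume
    print(f"V(t)/V(0) = {volume_factor:.2f}, corrected conc. = {drug_corrected:.1f} ug/mL")
    ```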

  9. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Gougar, Hans [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-02-01

    The Department of Energy (DOE) has made significant progress in developing simulation tools that predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics across multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the needs of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to

  10. Discriminant Validity Assessment: Use of Fornell & Larcker criterion versus HTMT Criterion

    Science.gov (United States)

    Hamid, M. R. Ab; Sami, W.; Mohmad Sidek, M. H.

    2017-09-01

    Assessment of discriminant validity is a must in any research that involves latent variables, for the prevention of multicollinearity issues. The Fornell and Larcker criterion is the most widely used method for this purpose. However, a new method has emerged for establishing discriminant validity: the heterotrait-monotrait (HTMT) ratio of correlations. This article therefore presents the results of a discriminant validity assessment using both methods. Data from a previous study, involving 429 respondents, were used for the empirical validation of a value-based excellence model in higher education institutions (HEI) in Malaysia. From the analysis, convergent, divergent and discriminant validity were established and admissible under the Fornell and Larcker criterion. However, discriminant validity was an issue when employing the HTMT criterion. This shows that the latent variables under study faced multicollinearity issues, which should be looked into in further detail. It also implies that the HTMT criterion is a stringent measure that can detect a possible lack of discrimination among the latent variables. In conclusion, the instrument, which consisted of six latent variables, was still lacking in terms of discriminant validity and should be explored further.
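    For readers unfamiliar with the newer criterion, the HTMT ratio is simply the mean between-construct item correlation divided by the geometric mean of the within-construct item correlations. A small Python illustration on an invented six-item correlation matrix (three items per construct; not the study's data):

    ```python
    import numpy as np

    # Correlation matrix for items a1..a3 (construct A) and b1..b3 (construct B)
    R = np.array([
        [1.00, 0.62, 0.58, 0.35, 0.31, 0.33],
        [0.62, 1.00, 0.60, 0.30, 0.28, 0.32],
        [0.58, 0.60, 1.00, 0.34, 0.29, 0.30],
        [0.35, 0.30, 0.34, 1.00, 0.55, 0.57],
        [0.31, 0.28, 0.29, 0.55, 1.00, 0.59],
        [0.33, 0.32, 0.30, 0.57, 0.59, 1.00],
    ])

    a, b = slice(0, 3), slice(3, 6)
    hetero = R[a, b].mean()                              # heterotrait-heteromethod block
    mono_a = R[a, a][np.triu_indices(3, k=1)].mean()     # within A, off-diagonal
    mono_b = R[b, b][np.triu_indices(3, k=1)].mean()     # within B, off-diagonal

    htmt = hetero / np.sqrt(mono_a * mono_b)
    print(f"HTMT = {htmt:.3f}  (common cut-offs: 0.85 or 0.90)")
    ```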

  11. Verification and validation of decision support software: Expert Choice{trademark} and PCM{trademark}

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Q.H.; Martin, J.D.

    1994-11-04

    This report documents the verification and validation of two decision support programs: EXPERT CHOICE{trademark} and PCM{trademark}. Both programs use the Analytic Hierarchy Process (AHP) -- or pairwise comparison technique -- developed by Dr. Thomas L. Saaty. In order to provide an independent method for validating the two programs, the pairwise comparison algorithm was implemented in a standard mathematical program. A standard data set -- selecting a car to purchase -- was used with each of the three programs for validation. The results show that both commercial programs performed correctly.
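    The core AHP calculation being cross-checked here is small enough to sketch: priorities come from the principal eigenvector of the pairwise comparison matrix, and judgment consistency is checked via Saaty's consistency ratio. The 3×3 judgment matrix below is an invented example, not the report's car-selection data set:

    ```python
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])        # invented pairwise judgments

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # principal (Perron) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                           # priority vector

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)   # consistency index
    ri = 0.58                              # Saaty's random index for n = 3
    cr = ci / ri                           # judgments acceptable when CR < 0.10
    print("priorities:", np.round(w, 3), f"CR = {cr:.3f}")
    ```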

  12. A COMPARISON OF TWO FUZZY CLUSTERING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Samarjit Das

    2013-10-01

    In fuzzy clustering, unlike hard clustering, depending on the membership value, a single object may belong exactly to one cluster or partially to more than one cluster. Among the many fuzzy clustering techniques, Bezdek's Fuzzy C-Means and the Gustafson-Kessel clustering techniques are well known; they use Euclidean distance and Mahalanobis distance, respectively, as the measure of similarity. We applied these two fuzzy clustering techniques to a dataset of individual differences consisting of fifty feature vectors of dimension three. Based on several validity measures, we examined the performance of these two clustering techniques from three different aspects: first, by initializing the membership values of the feature vectors considering the values of the three features separately, one at a time; secondly, by changing the number of predefined clusters; and thirdly, by changing the size of the dataset.
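    As a hedged sketch of the first of the two techniques compared, the following implements the standard Fuzzy C-Means alternating updates (Euclidean metric, fuzzifier m = 2) on random data; Gustafson-Kessel differs mainly in replacing the Euclidean distance with a cluster-specific Mahalanobis metric:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (25, 2)), rng.normal(5, 1, (25, 2))])
    c, m = 2, 2.0                                  # number of clusters, fuzzifier
    U = rng.dirichlet(np.ones(c), size=len(X))     # random initial memberships

    for _ in range(100):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]            # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)                  # membership update

    print("cluster centers:\n", np.round(centers, 2))
    ```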

  13. Developing a validation for environmental sustainability

    Science.gov (United States)

    Adewale, Bamgbade Jibril; Mohammed, Kamaruddeen Ahmed; Nawi, Mohd Nasrun Mohd; Aziz, Zulkifli

    2016-08-01

    One of the agendas for addressing environmental protection in construction is to reduce impacts and make construction activities more sustainable. This important consideration has generated several research interests within the construction industry, especially considering construction's damaging effects on the ecosystem, such as various forms of environmental pollution, resource depletion and biodiversity loss on a global scale. Using the Partial Least Squares-Structural Equation Modeling technique, this study validates an environmental sustainability (ES) construct in the context of large construction firms in Malaysia. A cross-sectional survey was carried out in which data were collected from Malaysian large construction firms using a structured questionnaire. The results of this study revealed that business innovativeness and new technology are important in determining the environmental sustainability (ES) of Malaysian construction firms. The study also established an adequate level of internal consistency reliability, convergent validity and discriminant validity for each of its constructs. Based on this result, it can be suggested that the indicators for the organisational innovativeness dimensions (business innovativeness and new technology) are useful for measuring these constructs in order to study construction firms' tendency to adopt environmental sustainability (ES) in their project execution.

  14. Model-based wear measurements in total knee arthroplasty : development and validation of novel radiographic techniques

    NARCIS (Netherlands)

    IJsseldijk, van E.A.

    2016-01-01

    The primary aim of this work was to develop novel model-based mJSW measurement methods using a 3D reconstruction and compare the accuracy and precision of these methods to conventional mJSW measurement. This thesis contributed to the development, validation and clinical application of model-based

  15. Original article The Imagination in Sport Questionnaire – reliability and validity characteristics

    Directory of Open Access Journals (Sweden)

    Dagmara Budnik-Przybylska

    2014-07-01

    Background: Imagery is an effective performance enhancement technique that has been described previously in a range of psychological domains. Measuring imagery is critical in research and practice in sport, and self-report questionnaires are the most regularly used method. The aim of the present study was to examine the reliability and validity characteristics of the Imagination in Sport Questionnaire (Kwestionariusz Wyobraźni w Sporcie – KWS). Participants and procedure: Five hundred and eight (N = 326 – study I; N = 182 – study II) Polish athletes completed the questionnaire (169 male, 156 female – study I; 139 male, 43 female – study II), aged between 12 and 57 years (M = 22.08, SD = 8.18 – study I; age 19–24, M = 20.46, SD = 1.1 – study II), at different competitive levels and recruited from various sports disciplines. Results: The results indicated good stability and internal consistency over a 3-week period. Confirmatory factor analysis suggested that the 7-factor structure of the KWS resulted in acceptable model fit indices (NC = 2416.63, df = 1203, GFI = 0.944, AGFI = 0.944, CFI = 0.786, RMSEA = 0.056, p(RMSEA < 0.05) = 0.002 – first study; NC = 2234.39, df = 1203, GFI = 0.673, AGFI = 0.640, CFI = 0.691, RMSEA = 0.069, p(RMSEA < 0.05) = 0.000 – second study). Concurrent validity was supported by examination of the relationships between the KWS subscales and the SIAM (Sport Imagery Ability Measure) in Polish adaptation. In addition, differences in athletes' imagery ability were examined across competitive levels, and in relation to both gender and age. Conclusions: Overall, the results supported the reliability and construct validity of the KWS.

  16. Analytical difficulties facing today's regulatory laboratories: issues in method validation.

    Science.gov (United States)

    MacNeil, James D

    2012-08-01

    The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as development and maintenance of expertise, maintenance and up-dating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context which includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method or on the design of a validation scheme for a complex multi-residue method require a well-considered strategy, based on a current knowledge of international guidance documents and regulatory requirements, as well the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving change in methods are discussed and a decision process proposed for selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.

  17. Distributed Trust Management for Validating SLA Choreographies

    Science.gov (United States)

    Haq, Irfan Ul; Alnemr, Rehab; Paschke, Adrian; Schikuta, Erich; Boley, Harold; Meinel, Christoph

    For business workflow automation in a service-enriched environment such as a grid or a cloud, services scattered across heterogeneous Virtual Organizations (VOs) can be aggregated in a producer-consumer manner, building hierarchical structures of added value. In order to preserve the supply chain, the Service Level Agreements (SLAs) corresponding to the underlying choreography of services should also be incrementally aggregated. This cross-VO hierarchical SLA aggregation requires validation, for which a distributed trust system becomes a prerequisite. Elaborating on our previous work on rule-based SLA validation, we propose a hybrid distributed trust model. This new model is based on Public Key Infrastructure (PKI) and reputation-based trust systems. It helps prevent SLA violations by identifying violation-prone services at the service selection stage and contributes actively to breach management at the time of penalty enforcement.

  18. Mitigation of the voltage fluctuations using an efficient disturbance extraction technique

    Energy Technology Data Exchange (ETDEWEB)

    Elnady, Amr; Salama, M.M.A. [University of Waterloo, Electrical and Computer Engineering, 200 University Ave. West, Waterloo, Ont. (Canada N2L3G1)

    2007-03-15

    This paper introduces an efficient technique for extracting the disturbance signal of voltage flicker. The proposed technique depends on supervised state estimation controlled by the Widrow-Hoff delta rule. The extracted disturbance signal is employed to mitigate cyclic voltage flicker using series and parallel mitigating devices. The speed and accuracy of the proposed technique are verified by simulation results with EMTDC/PSCAD. In addition, experimental results are presented to prove the validity of the proposed algorithm. (author)
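    The Widrow-Hoff delta rule named here is the classic LMS update. As a rough sketch of the idea (an adaptive linear combiner tracking the in-phase and quadrature components of the fundamental, with the tracked amplitude revealing the flicker envelope; the signal parameters and step size are invented, and the paper's estimator surely differs in detail):

    ```python
    import numpy as np

    fs, f0 = 5000.0, 50.0
    t = np.arange(0.0, 0.5, 1.0 / fs)
    envelope = 1.0 + 0.08 * np.sin(2 * np.pi * 8.0 * t)    # 8 Hz cyclic flicker
    v = envelope * np.sin(2 * np.pi * f0 * t)              # flickering voltage

    w = np.zeros(2)          # weights for in-phase / quadrature references
    mu = 0.02                # Widrow-Hoff step size
    amp = np.empty_like(v)
    for k in range(len(v)):
        x = np.array([np.sin(2 * np.pi * f0 * t[k]), np.cos(2 * np.pi * f0 * t[k])])
        err = v[k] - w @ x                 # estimation error
        w += 2.0 * mu * err * x            # delta-rule (LMS) weight update
        amp[k] = np.hypot(w[0], w[1])      # tracked fundamental amplitude

    flicker_estimate = amp - amp.mean()    # amplitude deviation = disturbance signal
    print(f"peak extracted flicker ~ {np.abs(flicker_estimate[len(v)//2:]).max():.3f} (true 0.08)")
    ```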

  19. Validity of observer ratings of the five-factor model of personality traits: a meta-analysis.

    Science.gov (United States)

    Oh, In-Sue; Wang, Gang; Mount, Michael K

    2011-07-01

    Conclusions reached in previous research about the magnitude and nature of personality-performance linkages have been based almost exclusively on self-report measures of personality. The purpose of this study is to address this void in the literature by conducting a meta-analysis of the relationship between observer ratings of the five-factor model (FFM) personality traits and overall job performance. Our results show that the operational validities of FFM traits based on observer ratings are higher than those based on self-report ratings. In addition, the results show that when based on observer ratings, all FFM traits are significant predictors of overall performance. Further, observer ratings of FFM traits show meaningful incremental validity over self-reports of corresponding FFM traits in predicting overall performance, but the reverse is not true. We conclude that the validity of FFM traits in predicting overall performance is higher than previously believed, and our results underscore the importance of disentangling the validity of personality traits from the method of measurement of the traits.

  20. Evaluation of miniature tension specimen fabrication techniques and performance

    International Nuclear Information System (INIS)

    Hamilton, M.L.; Blotter, M.A.; Edwards, D.J.

    1993-01-01

    The confident application of miniature tensile specimens requires adequate control over their fabrication and is facilitated by automated test and analysis techniques. Three fabrication processes -- punching, chemical milling, and electrical discharge machining (EDM) -- were recently evaluated, leading to the replacement of the previously used punching technique with a wire EDM technique. The automated data acquisition system was upgraded, and an interactive data analysis program was developed

  1. Evaluation of miniature tensile specimen fabrication techniques and performance

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, M.L. (Pacific Northwest Lab., Richland, WA (United States)); Blotter, M.A.; Edwards, D.J. (Missouri Univ., Rolla, MO (United States))

    1992-01-01

    The confident application of miniature tensile specimens requires adequate control over their fabrication and is facilitated by automated test and analysis techniques. Three fabrication processes -- punching, chemical milling, and electrical discharge machining (EDM) -- were recently evaluated, leading to the replacement of the previously used punching technique with a wire EDM technique. The automated data acquisition system was upgraded, and an interactive data analysis program was developed.

  2. Evaluation of miniature tensile specimen fabrication techniques and performance

    International Nuclear Information System (INIS)

    Hamilton, M.L.; Blotter, M.A.; Edwards, D.J.

    1992-01-01

    The confident application of miniature tensile specimens requires adequate control over their fabrication and is facilitated by automated test and analysis techniques. Three fabrication processes -- punching, chemical milling, and electrical discharge machining (EDM) -- were recently evaluated, leading to the replacement of the previously used punching technique with a wire EDM technique. The automated data acquisition system was upgraded, and an interactive data analysis program was developed.

  3. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
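    The two procedures the paper defines map directly onto standard library tooling. Below is a compact sketch of repeated nested cross-validation (an inner grid-search for parameter tuning, outer folds for assessment, the whole procedure repeated over different random splits); the dataset and parameter grid are invented for illustration:

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

    X, y = make_regression(n_samples=120, n_features=10, noise=10, random_state=0)

    scores = []
    for rep in range(5):                                   # repetitions with new splits
        inner = KFold(n_splits=5, shuffle=True, random_state=rep)
        outer = KFold(n_splits=5, shuffle=True, random_state=100 + rep)
        model = GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0]}, cv=inner)
        scores.extend(cross_val_score(model, X, y, cv=outer, scoring="r2"))

    print(f"R^2 = {np.mean(scores):.3f} +/- {np.std(scores):.3f}")
    ```

    The spread of the outer-fold scores across repetitions is exactly the split-dependent variation the authors argue must be reported alongside the mean.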

  4. Internal Cluster Validation on Earthquake Data in the Province of Bengkulu

    Science.gov (United States)

    Rini, D. S.; Novianti, P.; Fransiska, H.

    2018-04-01

    K-means is an algorithm for clustering n objects, based on their attributes, into k partitions, where k < n. A deficiency of the algorithm is that, before it is executed, k points are initialized randomly, so the resulting clustering can differ from run to run; if the random initialization is not good, the clustering becomes less than optimal. Cluster validation is a technique to determine the optimum number of clusters without prior information about the data. There are two types of cluster validation: internal cluster validation and external cluster validation. This study aims to examine and apply several internal cluster validation indices, including the Calinski-Harabasz (CH) Index, Silhouette (S) Index, Davies-Bouldin (DB) Index, Dunn (D) Index, and S-Dbw Index, to earthquake data from the Bengkulu Province. Based on internal cluster validation, the CH, S, and S-Dbw indices yield an optimum of k = 2, the DB Index k = 6, and the D Index k = 15. The optimum clustering (k = 6) based on the DB Index gives good results for clustering earthquakes in the Bengkulu Province.
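    Three of the indices named here are available directly in scikit-learn, so the scan over k is short to sketch. The snippet below uses synthetic blobs in place of the earthquake data (CH and Silhouette: higher is better; Davies-Bouldin: lower is better):

    ```python
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import (calinski_harabasz_score, silhouette_score,
                                 davies_bouldin_score)

    X, _ = make_blobs(n_samples=300, centers=4, random_state=0)   # stand-in data
    for k in range(2, 8):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        print(k,
              round(calinski_harabasz_score(X, labels), 1),
              round(silhouette_score(X, labels), 3),
              round(davies_bouldin_score(X, labels), 3))
    ```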

  5. The validity and reliability of iridology in the diagnosis of previous acute appendicitis as evidenced by appendectomy

    Directory of Open Access Journals (Sweden)

    L. Frank

    2013-01-01

Iridology is defined as a photographic science that identifies pathological and functional changes within organs via biomicroscopic iris assessment for aberrant lines, spots, and discolourations. According to iridology, the iris does not reflect changes during anaesthesia, due to the drugs' inhibitory effects on nerve impulses, and in cases of organ removal it reflects the pre-surgical condition. The profession of Homoeopathy is frequently associated with iridology, and in a recent survey (2009) investigating the perceptions of Masters of Technology graduates in Homoeopathy at the University of Johannesburg, iridology was highly regarded as a potential additional skill for assessing the health status of the patient. This study investigated the reliability of iridology in the diagnosis of previous acute appendicitis, as evidenced by appendectomy. A total of 60 participants took part in the study; 30 had undergone an appendectomy due to acute appendicitis, and 30 had no prior history of appendicitis. Each participant's right iris was documented by photography with the use of a non-mydriatic retinal camera reset for photographing the iris. The photographs were then randomized by an external person, and no identifying data were made available to the three raters. The raters included the researcher, who had little experience in iridology, and two highly experienced practising iridologists. Data were obtained from the analyses of the photographs, wherein the presence or absence of lesions (implying acute appendicitis) was indicated by the raters. None of the three raters was able to show a significant success rate in identifying correctly the people with a previous history of acute appendicitis and resultant appendectomies.

  6. Validity and reliability of the Achilles tendon total rupture score.

    Science.gov (United States)

    Ganestam, Ann; Barfod, Kristoffer; Klit, Jakob; Troelsen, Anders

    2013-01-01

The best treatment of acute Achilles tendon rupture remains debated. Patient-reported outcome measures have become cornerstones in treatment evaluations. The Achilles tendon total rupture score (ATRS) has been developed for this purpose but requires additional validation. The purpose of the present study was to validate a Danish translation of the ATRS. The ATRS was translated into Danish according to internationally adopted standards. Of 142 patients, 90 with previous rupture of the Achilles tendon participated in the validity study and 52 in the reliability study. The ATRS showed moderately strong correlations with the physical subscores of the Medical Outcomes Study 36-item Short-Form Health Survey (r = .70 to .75) and with another validated questionnaire (r = .71), supporting its validity. For study and follow-up purposes, the ATRS seems reliable for comparisons of groups of patients. Its usability is limited for repeated assessment of individual patients. The development of analysis guidelines would be desirable. Copyright © 2013 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  7. Development and Validation of a Bioanalytical Method for Direct ...

    African Journals Online (AJOL)

    Purpose: To develop and validate a user-friendly spiked plasma method for the extraction of diclofenac potassium that reduces the number of treatments with plasma sample, in order to minimize human error. Method: Instead of solvent evaporation technique, the spiked plasma sample was modified with H2SO4 and NaCl, ...

  8. Models and Techniques for Proving Data Structure Lower Bounds

    DEFF Research Database (Denmark)

    Larsen, Kasper Green

    In this dissertation, we present a number of new techniques and tools for proving lower bounds on the operational time of data structures. These techniques provide new lines of attack for proving lower bounds in the cell probe model, the group model, the pointer machine model and the I/O-model. In all cases, we push the frontiers further by proving lower bounds higher than what could possibly be proved using previously known techniques. For the cell probe model, our results have the following consequences: the first Omega(lg n) query time lower bound for linear space static data structures, and a bound of t_u * t_q = Omega(lg^(d-1) n). For ball range searching, we get a lower bound of t_u * t_q = Omega(n^(1-1/d)). The highest previous lower bound proved in the group model does not exceed Omega((lg n / lg lg n)^2) on the maximum of t_u and t_q. Finally, we present a new technique for proving lower bounds.

  9. Validation of a Russian Language Oswestry Disability Index Questionnaire.

    Science.gov (United States)

    Yu, Elizabeth M; Nosova, Emily V; Falkenstein, Yuri; Prasad, Priya; Leasure, Jeremi M; Kondrashov, Dimitriy G

    2016-11-01

Study Design: Retrospective reliability and validity study. Objective: To validate a recently translated Russian language version of the Oswestry Disability Index (R-ODI) using standardized methods detailed from previous validations in other languages. Methods: We included all subjects who were seen in our spine surgery clinic, were over the age of 18, and were fluent in the Russian language. The R-ODI was translated by six bilingual people and combined into a consensus version. R-ODI and visual analog scale (VAS) questionnaires for leg and back pain were distributed to subjects during both their initial and follow-up visits. Test validity, stability, and internal consistency were measured using standardized psychometric methods. Results: Ninety-seven subjects participated in the study. No change in the meaning of the questions on the R-ODI was noted with translation from English to Russian. There was a significant positive correlation between R-ODI and VAS scores for both the leg and back during both the initial and follow-up visits. Conclusion: The R-ODI is a valid and reliable instrument for use in the Russian-speaking population in the United States.

  10. Development and validation of a questionnaire designed to measure foot-health status.

    Science.gov (United States)

    Bennett, P J; Patterson, C; Wearing, S; Baglioni, T

    1998-09-01

    The aim of this study was to apply the principles of content, criterion, and construct validation to a new questionnaire specifically designed to measure foot-health status. One hundred eleven subjects completed two different questionnaires designed to measure foot health (the new Foot Health Status Questionnaire and the previously validated Foot Function Index) and underwent a clinical examination in order to provide data for a second-order confirmatory factor analysis. Presented herein is a psychometrically evaluated questionnaire that contains 13 items covering foot pain, foot function, footwear, and general foot health. The tool demonstrates a high degree of content, criterion, and construct validity and test-retest reliability.

  11. Comparison of time-resolved and continuous-wave near-infrared techniques for measuring cerebral blood flow in piglets

    Science.gov (United States)

    Diop, Mamadou; Tichauer, Kenneth M.; Elliott, Jonathan T.; Migueis, Mark; Lee, Ting-Yim; Lawrence, Keith St.

    2010-09-01

    A primary focus of neurointensive care is monitoring the injured brain to detect harmful events that can impair cerebral blood flow (CBF), resulting in further injury. Since current noninvasive methods used in the clinic can only assess blood flow indirectly, the goal of this research is to develop an optical technique for measuring absolute CBF. A time-resolved near-infrared (TR-NIR) apparatus is built and CBF is determined by a bolus-tracking method using indocyanine green as an intravascular flow tracer. As a first step in the validation of this technique, CBF is measured in newborn piglets to avoid signal contamination from extracerebral tissue. Measurements are acquired under three conditions: normocapnia, hypercapnia, and following carotid occlusion. For comparison, CBF is concurrently measured by a previously developed continuous-wave NIR method. A strong correlation between CBF measurements from the two techniques is revealed with a slope of 0.79+/-0.06, an intercept of -2.2+/-2.5 ml/100 g/min, and an R2 of 0.810+/-0.088. Results demonstrate that TR-NIR can measure CBF with reasonable accuracy and is sensitive to flow changes. The discrepancy between the two methods at higher CBF could be caused by differences in depth sensitivities between continuous-wave and time-resolved measurements.
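
    As a sketch of the method-comparison arithmetic behind the reported slope, intercept, and R², the snippet below regresses one set of CBF readings against the other; the paired values are invented for illustration.

```python
# Compare two CBF measurement methods by ordinary linear regression,
# reporting slope, intercept, and R^2 as in the study.
import numpy as np
from scipy.stats import linregress

cw_cbf = np.array([25.0, 38.0, 52.0, 60.0, 71.0, 85.0])  # reference, ml/100 g/min
tr_cbf = np.array([18.5, 28.0, 39.5, 45.0, 54.5, 65.0])  # time-resolved readings

fit = linregress(cw_cbf, tr_cbf)
print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.1f} ml/100 g/min, "
      f"R^2 = {fit.rvalue ** 2:.3f}")
```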

  12. Adaptive importance sampling for probabilistic validation of advanced driver assistance systems

    NARCIS (Netherlands)

    Gietelink, O.J.; Schutter, B. de; Verhaegen, M.

    2006-01-01

    We present an approach for validation of advanced driver assistance systems, based on randomized algorithms. The new method consists of an iterative randomized simulation using adaptive importance sampling. The randomized algorithm is more efficient than conventional simulation techniques.
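
    The core idea, drawing test scenarios from a proposal distribution biased toward failures and reweighting by the likelihood ratio, can be sketched as follows; the scalar "failure if X exceeds a threshold" model and the fixed proposal shift are illustrative stand-ins for the adaptive scheme.

```python
# Importance sampling for a rare failure probability P(X > threshold),
# X ~ N(0, 1). The proposal is shifted toward the failure region and each
# sample is reweighted by the likelihood ratio f(x)/g(x); adaptive IS would
# tune the shift iteratively instead of fixing it.
from math import erf, sqrt
import numpy as np

rng = np.random.default_rng(0)
threshold = 4.0
shift = 4.0                                   # proposal mean
n = 10_000

x = rng.normal(loc=shift, scale=1.0, size=n)
w = np.exp(-shift * x + 0.5 * shift ** 2)     # N(0,1) density over N(shift,1)
p_hat = np.mean((x > threshold) * w)

print(f"IS estimate: {p_hat:.3e}")
print(f"exact value: {0.5 * (1 - erf(threshold / sqrt(2))):.3e}")
```

    Crude Monte Carlo would need on the order of a million samples to observe even a handful of such failures; the reweighted estimator reaches comparable accuracy with a small fraction of that.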

  13. A reference data set for validating vapor pressure measurement techniques: homologous series of polyethylene glycols

    Science.gov (United States)

    Krieger, Ulrich K.; Siegrist, Franziska; Marcolli, Claudia; Emanuelsson, Eva U.; Gøbel, Freya M.; Bilde, Merete; Marsh, Aleksandra; Reid, Jonathan P.; Huisman, Andrew J.; Riipinen, Ilona; Hyttinen, Noora; Myllys, Nanna; Kurtén, Theo; Bannan, Thomas; Percival, Carl J.; Topping, David

    2018-01-01

    To predict atmospheric partitioning of organic compounds between gas and aerosol particle phase based on explicit models for gas phase chemistry, saturation vapor pressures of the compounds need to be estimated. Estimation methods based on functional group contributions require training sets of compounds with well-established saturation vapor pressures. However, vapor pressures of semivolatile and low-volatility organic molecules at atmospheric temperatures reported in the literature often differ by several orders of magnitude between measurement techniques. These discrepancies exceed the stated uncertainty of each technique, which is generally reported to be smaller than a factor of 2. At present, there is no general reference technique for measuring saturation vapor pressures of atmospherically relevant compounds with low vapor pressures at atmospheric temperatures. To address this problem, we measured vapor pressures with different techniques over a wide temperature range for intercomparison and to establish a reliable training set. We determined saturation vapor pressures for the homologous series of polyethylene glycols (H-(O-CH2-CH2)n-OH) for n = 3 to n = 8, ranging in vapor pressure at 298 K from 10^-7 to 5×10^-2 Pa, and compare them with quantum chemistry calculations. Such a homologous series provides a reference set that covers several orders of magnitude in saturation vapor pressure, allowing a critical assessment of the lower limits of detection of vapor pressures for the different techniques as well as permitting the identification of potential sources of systematic error. Also, internal consistency within the series allows outlying data to be rejected more easily. Most of the measured vapor pressures agreed within the stated uncertainty range. Deviations mostly occurred for vapor pressure values approaching the lower detection limit of a technique. The good agreement between the measurement techniques (some of which are sensitive to the mass

  14. Analog fault diagnosis by inverse problem technique

    KAUST Repository

    Ahmed, Rania F.

    2011-12-01

    A novel algorithm for detecting soft faults in linear analog circuits based on the inverse problem concept is proposed. The proposed approach utilizes optimization techniques with the aid of sensitivity analysis. The main contribution of this work is to apply the inverse problem technique to estimate the actual parameter values of the tested circuit and so to detect and diagnose a single fault in analog circuits. The validation of the algorithm is illustrated by applying it to a Sallen-Key second-order band-pass filter; the results show that the detection efficiency was 100% and that the maximum percentage error in estimating the parameter values was 0.7%. This technique can be applied to any other linear circuit, and it can also be extended to nonlinear circuits. © 2011 IEEE.
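
    A hypothetical sketch of the inverse-problem idea follows: estimate a component value by fitting the simulated response to the "measured" one with least squares, then flag a soft fault when the estimate deviates from nominal. A first-order RC low-pass stands in for the Sallen-Key filter, and the capacitance is assumed known so the resistance is identifiable from the magnitude response.

```python
# Fit a circuit parameter to a measured frequency response; a large
# deviation of the estimate from the design value indicates a soft fault.
import numpy as np
from scipy.optimize import least_squares

C = 100e-9                                   # capacitance assumed known

def gain(R, f):
    """Magnitude response of a first-order RC low-pass filter."""
    return 1.0 / np.sqrt(1.0 + (2 * np.pi * f * R * C) ** 2)

f = np.logspace(1, 5, 50)                    # 10 Hz .. 100 kHz
R_nominal = 1.0e3                            # design value: 1 kOhm
measured = gain(1.3e3, f)                    # resistor drifted +30% (soft fault)

fit = least_squares(lambda p: gain(p[0], f) - measured, x0=[R_nominal])
deviation = (fit.x[0] - R_nominal) / R_nominal * 100
print(f"estimated R = {fit.x[0]:.0f} Ohm ({deviation:+.1f}% from nominal)")
```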

  15. Validation of Portable Muscle Tone Measurement Device Based on a Motor-Driven System

    National Research Council Canada - National Science Library

    Chen, Jia-Jin

    2001-01-01

    .... The aim of this study is to extend a sophisticated motor-driven measurement system, developed in our previous research, as a validation platform for developing a portable muscle tone measurement system...

  16. Validation of biomarkers for the study of environmental carcinogens: a review

    DEFF Research Database (Denmark)

    Gallo, Valentina; Khan, Aneire; Gonzales, Carlos

    2008-01-01

    There is a need for validation of biomarkers. Our aim is to review published work on the validation of selected biomarkers: bulky DNA adducts, N-nitroso compounds, 1-hydroxypyrene, and oxidative damage to DNA. A systematic literature search in PubMed was performed. Information on the variability and reliability of the laboratory tests used for biomarker measurements was collected. For the evaluation of the evidence on validation we referred to the ACCE criteria. Little is known about intraindividual variation of DNA adduct measurements, but measurements have good repeatability irrespective of the technique used for their identification; reproducibility improved after correction for a laboratory factor. A high-sensitivity method is available for the measurement of 1-hydroxypyrene in urine. There is consensus on validation of biomarkers of oxidative damage to DNA based on the comet assay.

  17. Design for validation: An approach to systems validation

    Science.gov (United States)

    Carter, William C.; Dunham, Janet R.; Laprie, Jean-Claude; Williams, Thomas; Howden, William; Smith, Brian; Lewis, Carl M. (Editor)

    1989-01-01

    Every complex system built is validated in some manner. Computer validation begins with review of the system design. As systems became too complicated for one person to review, validation began to rely on the application of ad hoc methods by many individuals. As the cost of the changes mounted and the expense of failure increased, more organized procedures became essential. Attempts at devising and carrying out those procedures showed that validation is indeed a difficult technical problem. The successful transformation of the validation process into a systematic series of formally sound, integrated steps is necessary if the liability inherent in the future digital-system-based avionic and space systems is to be minimized. A suggested framework and timetable for the transformation are presented. Basic working definitions of two pivotal ideas (validation and system life-cycle) are provided, and it is shown how the two concepts interact. Many examples are given of past and present validation activities by NASA and others. A conceptual framework is presented for the validation process. Finally, important areas are listed for ongoing development of the validation process at NASA Langley Research Center.

  18. Developing and validating a nutrition knowledge questionnaire: key methods and considerations.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-10-01

    To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire have been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
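
    As a sketch of the item-analysis step (vi), the snippet below computes item difficulty (the proportion answering correctly) and discrimination (the corrected item-total correlation) from a mock matrix of scored responses; the data are invented for illustration.

```python
# Classical item analysis: difficulty and corrected item-total
# discrimination for a respondents-by-items matrix of 0/1 scores.
import numpy as np

rng = np.random.default_rng(1)
responses = (rng.random((100, 10)) < 0.6).astype(int)  # mock scored answers

difficulty = responses.mean(axis=0)          # proportion correct per item

total = responses.sum(axis=1)
discrimination = np.empty(responses.shape[1])
for j in range(responses.shape[1]):
    rest = total - responses[:, j]           # total score excluding item j
    discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]

print("difficulty:    ", np.round(difficulty, 2))
print("discrimination:", np.round(discrimination, 2))
# Items with difficulty near 0 or 1, or discrimination below ~0.2,
# are typical candidates for revision or removal.
```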

  19. Development of the staffing evaluation technique for mental tasks of the advanced main control room

    International Nuclear Information System (INIS)

    Hsieh, Tsung-Ling; Yang, Chih-Wei; Lin, Chiuhsiang Joe

    2011-01-01

    The key goals of the staffing and qualifications review element are to ensure that the right number of people with the appropriate skills and abilities is available to support plant operations and events. If the staffing level is too low, excessive stress that can cause human error may be placed on the operators. Accordingly, this study developed a staffing evaluation technique based on CPM-GOMS for mental tasks such as operations in the advanced main control room. A within-subject experiment was designed to examine the validity of the staffing evaluation technique. The results indicated that performance at the staffing level evaluated via the technique was significantly higher than at the non-evaluated staffing level; thus, the validity of the staffing evaluation technique can be accepted. Finally, the implications of these findings for managerial practice are discussed. (author)

  20. Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation?

    Science.gov (United States)

    Birt, Linda; Scott, Suzanne; Cavers, Debbie; Campbell, Christine; Walter, Fiona

    2016-06-22

    The trustworthiness of results is the bedrock of high quality qualitative research. Member checking, also known as participant or respondent validation, is a technique for exploring the credibility of results. Data or results are returned to participants to check for accuracy and resonance with their experiences. Member checking is often mentioned as one in a list of validation techniques. This simplistic reporting might not acknowledge the value of using the method, nor its juxtaposition with the interpretative stance of qualitative research. In this commentary, we critique how member checking has been used in published research, before describing and evaluating an innovative in-depth member checking technique, Synthesized Member Checking. The method was used in a study with patients diagnosed with melanoma. Synthesized Member Checking addresses the co-constructed nature of knowledge by providing participants with the opportunity to engage with, and add to, interview and interpreted data, several months after their semi-structured interview. © The Author(s) 2016.

  1. Validation of KENO-based criticality calculations at Rocky Flats

    International Nuclear Information System (INIS)

    Felsher, P.D.; McKamy, J.N.; Monahan, S.P.

    1992-01-01

    In the absence of experimental data, it is necessary to rely on computer-based computational methods in evaluating the criticality condition of a nuclear system. The validity of the computer codes is established in a two-part procedure as outlined in ANSI/ANS 8.1. The first step, usually the responsibility of the code developer, involves verification that the algorithmic structure of the code is performing the intended mathematical operations correctly. The second step involves an assessment of the code's ability to realistically portray the governing physical processes in question. This is accomplished by determining the code's bias, or systematic error, through a comparison of computational results to accepted values obtained experimentally. In this paper, the authors discuss the validation process for KENO and the Hansen-Roach cross sections in use at EG and G Rocky Flats. The validation process at Rocky Flats consists of both global and local techniques. The global validation resulted in a maximum k-eff limit of 0.95 for the limiting-accident scenarios of a criticality evaluation.

  2. An experimental technique to measure the capillary waves in electrified microjets

    Directory of Open Access Journals (Sweden)

    Rebollo-Muñoz Noelia

    2012-04-01

Backlight optical imaging is an experimental technique with an enormous potential in microfluidics to study very varied fluid configurations and phenomena. In this paper, we show the capability of this technique to precisely characterize the capillary waves growing in electrified microjets. For this purpose, images of electrified liquid jets formed by electrospray were acquired and processed using a sub-pixel resolution technique. Our results reflect the validity and usefulness of optical imaging for this type of application.

  3. Current role of endovascular therapy in Marfan patients with previous aortic surgery

    Directory of Open Access Journals (Sweden)

    Ibrahim Akin

    2008-02-01

Ibrahim Akin, Stephan Kische, Tim C Rehders, Tushar Chatterjee, Henrik Schneider, Thomas Körber, Christoph A Nienaber, Hüseyin Ince. Department of Medicine, Division of Cardiology at the University Hospital Rostock, Rostock School of Medicine, Ernst-Heydemann-Str. 6, 18057 Rostock, Germany. Abstract: The Marfan syndrome is a heritable disorder of the connective tissue which affects the cardiovascular, ocular, and skeletal systems. The cardiovascular manifestation, with aortic root dilatation, aortic valve regurgitation, and aortic dissection, has a prevalence of 60% to 90% and determines the premature death of these patients. Thirty-four percent of patients with Marfan syndrome will have serious cardiovascular complications requiring surgery in the first 10 years after diagnosis. Before aortic surgery became available, the majority of patients died by the age of 32 years. The introduction of aortic surgery techniques increased the 10-year survival rate up to 97%. The purpose of this article is to give an overview of the feasibility and outcome of stent-graft placement in the descending thoracic aorta in Marfan patients with previous aortic surgery. Keywords: Marfan syndrome, aortic dissection, root replacement, stent-graft, previous aortic surgery

  4. The OECD validation program of the H295R steroidogenesis assay: Phase 3. Final inter-laboratory validation study

    DEFF Research Database (Denmark)

    Hecker, Markus; Hollert, Henner; Cooper, Ralph

    2011-01-01

    In response to increasing concerns regarding the potential of chemicals to interact with the endocrine system of humans and wildlife, various national and international programs have been initiated with the aim to develop new guidelines for the screening and testing of these chemicals in vertebrates. Here, we report on the validation of an in vitro assay, the H295R steroidogenesis assay, to detect chemicals with the potential to inhibit or induce the production of the sex steroid hormones testosterone (T) and 17β-estradiol (E2), in preparation for the development of an Organization for Economic Cooperation and Development (OECD) test guideline. A previously optimized and pre-validated protocol was used to assess the potential of 28 chemicals of diverse structures and properties to validate the H295R steroidogenesis assay. These chemicals are comprised of known endocrine-active chemicals...

  5. Detour technique, Dipping technique, or Ileal bladder flap technique for surgical correction of uretero-ileal anastomotic stricture in orthotopic ileal neobladder

    Directory of Open Access Journals (Sweden)

    Mohamed Wishahi

    2015-08-01

Background: Uretero-ileal anastomotic stricture (UIAS) is a urological complication after ileal neobladder, the initial management being endourological intervention. If this fails or the stricture recurs, surgical intervention is indicated. Design and Participants: From 1994 to 2013, 129 patients were treated for UIAS after unsuccessful endourological intervention. Unilateral UIAS was present in 101 patients, and bilateral in 28 patients; total procedures were 157. The previous ileal neobladder techniques were the Hautmann neobladder, detubularized U-shape, or spherical-shape neobladder. Surgical procedures: The dipping technique was performed in 74 UIAS. The detour technique was done in 60 renal units. The ileal bladder flap was indicated in 23 renal units. Each procedure ended with insertion of a double-J stent, an abdominal drain, and an indwelling catheter. Results: Follow-up was done for 12 to 36 months. Patency of the anastomosis was found in 91.7% of cases. Thirteen patients (8.3%) underwent antegrade dilatation and insertion of a double-J stent. Conclusion: After endourological treatment for uretero-ileal anastomotic failure, basically three techniques may be indicated: the dipping technique, the detour technique, and the ileal bladder flap. The indication depends on the length of the stenotic/dilated ureteral segment. Better results for a long stenotic ureteral segment are obtained with the detour technique; for a short stenotic segment, the dipping technique; when the stenotic segment is 5 cm or more with a short ureter, the ileal tube flap is indicated. The use of a double-J stent is mandatory in the majority of cases. Early intervention is the rule for protecting renal units from progressive loss of function.

  6. Images of hospitality : validation of experiential dimensions

    NARCIS (Netherlands)

    Pijls-Hoekstra, Ruth; Groen, Brenda H.; Galetzka, Mirjam; Pruyn, Ad T.H.

    2016-01-01

    Hospitality is a much-researched topic, but its definition is still debated. This paper is part of a larger research project into the perception of hospitality. Previous research using the Delphi-method (hospitality providers and experts) and the Critical Incident Technique (guests and consumers)

  7. Cultural Orientations Framework (COF) Assessment Questionnaire in Cross-Cultural Coaching: A Cross-Validation with Wave Focus Styles

    OpenAIRE

    Rojon, C; McDowall, A

    2010-01-01

    This paper outlines a cross-validation of the Cultural Orientations Framework assessment questionnaire (COF, Rosinski, 2007; a new tool designed for cross-cultural coaching) with the Saville Consulting Wave Focus Styles questionnaire (Saville Consulting, 2006; an existing validated measure of occupational personality), using data from UK and German participants (N = 222). The convergent and divergent validity of the questionnaire was adequate. Contrary to previous findings...

  8. Evaluating internal public relations using the critical incident technique

    NARCIS (Netherlands)

    Koning, K.H.; de Jong, Menno D.T.; van Vuuren, Hubrecht A.

    2015-01-01

    Although the critical incident technique (CIT) is one of the current methods in communication audits, little is known about the way it works. The validity of the CIT in the context of internal public relations depends on 3 assumptions: that participants can describe discrete communication events,

  9. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  10. Content validation of the 'Mosaic of Opinions About Abortion' (Mosai).

    Science.gov (United States)

    Cacique, Denis Barbosa; Passini Junior, Renato; Osis, Maria José Martins Duarte

    2013-01-01

    This study aimed to develop and validate the contents of the Mosaico de Opiniões Sobre o Aborto Induzido (Mosai), a structured questionnaire intended to be used as a tool to collect information about the views of health professionals on the morality of abortion. The contents of the first version of the questionnaire were developed based on the technique of thematic content analysis of books, articles, films, websites, and newspapers reporting cases of abortion and arguing about their practice. The Mosai was composed of 6 moral dilemmas (vignettes) related to induced abortion, whose outcomes were to be chosen by the respondents and could be justified by the classification of 15 patterns of arguments about the morality of abortion. In order to validate its contents, the questionnaire was submitted to the scrutiny of a panel of 12 experts (an intentional sample of doctors, lawyers, ethicists, sociologists, nurses, and statisticians), who evaluated the criteria of clarity of writing, relevance, appropriateness to the sample, and suitability of the fields. These scores were analyzed by the concordance rate method, while the free comments were analyzed using the content analysis technique. All the moral dilemmas and arguments were considered valid according to the rate of agreement; however, some comments led to the exclusion of a dilemma about emergency contraception, among other changes. The content of the Mosai was considered valid to serve as a tool to collect the opinions of healthcare professionals regarding the morality of abortion. Copyright © 2013 Elsevier Editora Ltda. All rights reserved.

  11. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    .... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...

  12. Validity of Type D personality in Iceland

    DEFF Research Database (Denmark)

    Svansdottir, Erla; Karlsson, Hrobjartur D; Gudnason, Thorarinn

    2012-01-01

Type D personality has been associated with poor prognosis in cardiac patients. This study investigated the validity of the Type D construct in Iceland and its association with disease severity and health-related risk markers in cardiac patients. A sample of 1,452 cardiac patients completed the Type D scale (DS14), and a subgroup of 161 patients completed measurements for the five-factor model of personality, emotional control, anxiety, depression, stress and lifestyle factors. The Icelandic DS14 had good psychometric properties and its construct validity was confirmed. The prevalence of Type D personality was 26-29%, and assessment of Type D personality was not confounded by severity of underlying coronary artery disease. Regarding risk markers, Type D patients reported more psychopharmacological medication use and smoking, but the frequency of previous mental problems was similar across groups.

  13. Multimicrophone Speech Dereverberation: Experimental Validation

    Directory of Open Access Journals (Sweden)

    Marc Moonen

    2007-05-01

Dereverberation is required in various speech processing applications such as handsfree telephony and voice-controlled systems, especially when signals are applied that are recorded in a moderately or highly reverberant environment. In this paper, we compare a number of classical and more recently developed multimicrophone dereverberation algorithms, and validate the different algorithmic settings by means of two performance indices and a speech recognition system. It is found that some of the classical solutions obtain a moderate signal enhancement. More advanced subspace-based dereverberation techniques, on the other hand, fail to enhance the signals despite their high-computational load.

  14. Physical standards and valid calibration

    International Nuclear Information System (INIS)

    Smith, D.B.

    1975-01-01

    The desire for improved nuclear material safeguards has led to the development and use of a number of techniques and instruments for the nondestructive assay (NDA) of special nuclear material. Sources of potential bias in NDA measurements are discussed and methods of eliminating the effects of bias in assay results are suggested. Examples are given of instruments in which these methods have been successfully applied. The results of careful attention to potential sources of assay bias are a significant reduction in the number and complexity of standards required for valid instrument calibration and more credible assay results. (auth)

  15. Multiphysics software and the challenge to validating physical models

    International Nuclear Information System (INIS)

    Luxat, J.C.

    2008-01-01

    This paper discusses multiphysics software and validation of physical models in the nuclear industry. The major challenge is to convert the general-purpose software package to a robust application-specific solution. This requires greater knowledge of the underlying solution techniques and the limitations of the packages. Good user interfaces and neat graphics do not compensate for any deficiencies.

  16. Development and Validation of the Alcohol Identity Implicit Associations Test (AI-IAT)

    Science.gov (United States)

    Gray, Heather M.; LaPlante, Debi A.; Bannon, Brittany L.; Ambady, Nalini; Shaffer, Howard J.

    2011-01-01

    Alcohol identity is the extent to which an individual perceives drinking alcohol to be a defining characteristic of his or her self-identity. Although alcohol identity might play an important role in risky college drinking practices, there is currently no easily administered, implicit measure of this concept. Therefore we developed a computerized implicit measure of alcohol identity (the Alcohol Identity Implicit Associations Test; AI-IAT) and assessed its reliability and predictive validity in relation to risky college drinking practices. One hundred forty-one college students completed the AI-IAT. Three and six months later, we again administered the AI-IAT along with indices of engagement in risky college drinking practices. A subset of participants also completed the previously validated implicit measure of alcohol identity. Scores on the AI-IAT were stable over time, internally consistent, and positively correlated with the previously validated measure of alcohol identity. Baseline AI-IAT scores predicted future engagement in risky college drinking practices, even after controlling for standard alcohol consumption measures. We conclude that the AI-IAT reliably measures alcohol identity, a concept that appears to play an important role in risky college drinking practices. PMID:21621924

  17. Selecting the "Best" Factor Structure and Moving Measurement Validation Forward: An Illustration.

    Science.gov (United States)

    Schmitt, Thomas A; Sass, Daniel A; Chappelle, Wayne; Thompson, William

    2018-04-09

    Despite the broad literature base on factor analysis best practices, research seeking to evaluate a measure's psychometric properties frequently fails to consider or follow these recommendations. This leads to incorrect factor structures, numerous and often overly complex competing factor models and, perhaps most harmful, biased model results. Our goal is to demonstrate a practical and actionable process for factor analysis through (a) an overview of six statistical and psychometric issues and approaches to be aware of, investigate, and report when engaging in factor structure validation, along with a flowchart for recommended procedures to understand latent factor structures; (b) demonstrating these issues to provide a summary of the updated Posttraumatic Stress Disorder Checklist (PCL-5) factor models and a rationale for validation; and (c) conducting a comprehensive statistical and psychometric validation of the PCL-5 factor structure to demonstrate all the issues we described earlier. Considering previous research, the PCL-5 was evaluated using a sample of 1,403 U.S. Air Force remotely piloted aircraft operators with high levels of battlefield exposure. Previously proposed PCL-5 factor structures were not supported by the data, but instead a bifactor model is arguably more statistically appropriate.

  18. Lower-Order Compensation Chain Threshold-Reduction Technique for Multi-Stage Voltage Multipliers.

    Science.gov (United States)

    Dell' Anna, Francesco; Dong, Tao; Li, Ping; Wen, Yumei; Azadmehr, Mehdi; Casu, Mario; Berg, Yngvar

    2018-04-17

    This paper presents a novel threshold-compensation technique for multi-stage voltage multipliers employed in low-power applications such as passive and autonomous wireless sensing nodes (WSNs) powered by energy harvesters. The proposed threshold-reduction technique enables a topological design methodology which, through optimum control of the trade-off between transistor conductivity and leakage losses, aims to maximize the voltage conversion efficiency (VCE) for a given ac input signal and physical chip area. The conducted simulations confirm the validity of the proposed design methodology, emphasizing the exploitable design space yielded by the transistor connection scheme in the voltage multiplier chain. An experimental validation and comparison of threshold-compensation techniques was performed, adopting 2N5247 N-channel junction field-effect transistors (JFETs) for the realization of the voltage multiplier prototypes. The attained measurements clearly support the effectiveness of the proposed threshold-reduction approach, which can significantly reduce the chip area for a given target output performance and ac input signal.

  19. Evaluation of biologic occupational risk control practices: quality indicators development and validation.

    Science.gov (United States)

    Takahashi, Renata Ferreira; Gryschek, Anna Luíza F P L; Izumi Nichiata, Lúcia Yasuko; Lacerda, Rúbia Aparecida; Ciosak, Suely Itsuko; Gir, Elucir; Padoveze, Maria Clara

    2010-05-01

    There is growing demand for the adoption of qualification systems for health care practices. This study is aimed at describing the development and validation of indicators for evaluation of biologic occupational risk control programs. The study involved 3 stages: (1) setting up a research team, (2) development of indicators, and (3) validation of the indicators by a team of specialists recruited to validate each attribute of the developed indicators. The content validation method was used for the validation, and a psychometric scale was developed for the specialists' assessment. A consensus technique was used, and every attribute that obtained a Content Validity Index of at least 0.75 was approved. Eight indicators were developed for the evaluation of the biologic occupational risk prevention program, with emphasis on accidents caused by sharp instruments and occupational tuberculosis prevention. The indicators included evaluation of the structure, process, and results at the prevention and biologic risk control levels. The majority of indicators achieved a favorable consensus regarding all validated attributes. The developed indicators were considered validated, and the method used for construction and validation proved to be effective. Copyright (c) 2010 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.
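
    The approval arithmetic can be sketched as follows; the 4-point relevance scale and the mock ratings are assumptions for illustration, while the 0.75 cutoff is the one reported above.

```python
# Item-level Content Validity Index (CVI): the share of specialists rating
# an attribute as relevant (3 or 4 on a 4-point scale); attributes with
# CVI >= 0.75 are approved, matching the study's consensus cutoff.
import numpy as np

rng = np.random.default_rng(2)
ratings = rng.integers(1, 5, size=(12, 8))   # mock: 12 specialists x 8 indicators

cvi = (ratings >= 3).mean(axis=0)            # item-level CVI per indicator
for i, v in enumerate(cvi, start=1):
    verdict = "approved" if v >= 0.75 else "revise"
    print(f"indicator {i}: CVI = {v:.2f} -> {verdict}")
```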

  20. Predicting and validating protein interactions using network structure.

    Directory of Open Access Journals (Sweden)

    Pao-Yang Chen

    2008-07-01

Protein interactions play a vital part in the function of a cell. As experimental techniques for detection and validation of protein interactions are time consuming, there is a need for computational methods for this task. Protein interactions appear to form a network with a relatively high degree of local clustering. In this paper we exploit this clustering by suggesting a score based on triplets of observed protein interactions. The score utilises both protein characteristics and network properties. Our score based on triplets is shown to complement existing techniques for predicting protein interactions, outperforming them on data sets which display a high degree of clustering. The predicted interactions score highly against test measures for accuracy. Compared to a similar score derived from pairwise interactions only, the triplet score displays higher sensitivity and specificity. By looking at specific examples, we show how an experimental set of interactions can be enriched and validated. As part of this work we also examine the effect of different prior databases upon the accuracy of prediction and find that the interactions from the same kingdom give better results than from across kingdoms, suggesting that there may be fundamental differences between the networks. These results all emphasize that network structure is important and helps in the accurate prediction of protein interactions. The protein interaction data set and the program used in our analysis, and a list of predictions and validations, are available at http://www.stats.ox.ac.uk/bioinfo/resources/PredictingInteractions.
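
    As a sketch of the triplet intuition (not the authors' exact score, which also weighs protein characteristics), a candidate interaction can be supported by counting the observed triangles it would close:

```python
# Each common neighbour w of a candidate pair (u, v) forms an observed
# triplet (u, w), (v, w); in a clustered network, many such triplets
# suggest the missing edge (u, v) is a plausible interaction.
import networkx as nx

g = nx.Graph()
g.add_edges_from([("A", "B"), ("B", "C"), ("A", "C"),
                  ("C", "D"), ("B", "D"), ("A", "E")])

def triplet_support(graph, u, v):
    """Number of triangles the candidate edge (u, v) would close."""
    return len(set(graph[u]) & set(graph[v]))

print(triplet_support(g, "A", "D"))   # 2: supported via B and via C
```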

  1. Reliability and Validity of the Inline Skating Skill Test

    Directory of Open Access Journals (Sweden)

    Ivan Radman, Lana Ruzic, Viktoria Padovan, Vjekoslav Cigrovski, Hrvoje Podnar

    2016-09-01

This study aimed to examine the reliability and validity of the inline skating skill test. Based on previous skating experience, forty-two skaters (26 female and 16 male) were randomized into two groups (competitive level vs. recreational level). They performed the test four times, with a recovery time of 45 minutes between sessions. Prior to testing, the participants rated their skating skill using a scale from 1 to 10. The protocol included performance time measurement through a course combining different skating techniques. Trivial changes in performance time between the repeated sessions were determined in both competitive females/males and recreational females/males (-1.7% [95% CI: -5.8–2.6%] – 2.2% [95% CI: 0.0–4.5%]). In all four subgroups, the skill test had a low mean within-individual variation (1.6% [95% CI: 1.2–2.4%] – 2.7% [95% CI: 2.1–4.0%]) and high mean inter-session correlation (ICC = 0.97 [95% CI: 0.92–0.99] – 0.99 [95% CI: 0.98–1.00]). The comparison of detected typical errors and smallest worthwhile changes (calculated as standard deviations × 0.2) revealed that the skill test was able to track changes in skaters' performances. Competitive-level skaters needed less time (24.4–26.4%, all p < 0.01) to complete the test than recreational-level skaters. Moreover, a moderate correlation (ρ = 0.80–0.82; all p < 0.01) was observed between the participants' self-rating and achieved performance times. In conclusion, the proposed test is a reliable and valid method to evaluate inline skating skills in amateur competitive and recreational level skaters. Further studies are needed to evaluate the reproducibility of this skill test in different populations including elite inline skaters.

  2. Validation and Refinement of Prediction Models to Estimate Exercise Capacity in Cancer Survivors Using the Steep Ramp Test

    NARCIS (Netherlands)

    Stuiver, Martijn M.; Kampshoff, Caroline S.; Persoon, Saskia; Groen, Wim; van Mechelen, Willem; Chinapaw, Mai J. M.; Brug, Johannes; Nollet, Frans; Kersten, Marie-José; Schep, Goof; Buffart, Laurien M.

    2017-01-01

    Objective: To further test the validity and clinical usefulness of the steep ramp test (SRT) in estimating exercise tolerance in cancer survivors by external validation and extension of previously published prediction models for peak oxygen consumption (VO2peak) and peak power output (Wpeak).

  3. Validity and reliability of Nike + Fuelband for estimating physical activity energy expenditure.

    Science.gov (United States)

    Tucker, Wesley J; Bhammar, Dharini M; Sawyer, Brandon J; Buman, Matthew P; Gaesser, Glenn A

    2015-01-01

    The Nike + Fuelband is a commercially available, wrist-worn accelerometer used to track physical activity energy expenditure (PAEE) during exercise. However, validation studies assessing the accuracy of this device for estimating PAEE are lacking. Therefore, this study examined the validity and reliability of the Nike + Fuelband for estimating PAEE during physical activity in young adults. Secondarily, we compared PAEE estimation of the Nike + Fuelband with the previously validated SenseWear Armband (SWA). Twenty-four participants (n = 24) completed two, 60-min semi-structured routines consisting of sedentary/light-intensity, moderate-intensity, and vigorous-intensity physical activity. Participants wore a Nike + Fuelband and SWA, while oxygen uptake was measured continuously with an Oxycon Mobile (OM) metabolic measurement system (criterion). The Nike + Fuelband (ICC = 0.77) and SWA (ICC = 0.61) both demonstrated moderate to good validity. PAEE estimates provided by the Nike + Fuelband (246 ± 67 kcal) and SWA (238 ± 57 kcal) were not statistically different than OM (243 ± 67 kcal). Both devices also displayed similar mean absolute percent errors for PAEE estimates (Nike + Fuelband = 16 ± 13 %; SWA = 18 ± 18 %). Test-retest reliability for PAEE indicated good stability for Nike + Fuelband (ICC = 0.96) and SWA (ICC = 0.90). The Nike + Fuelband provided valid and reliable estimates of PAEE, that are similar to the previously validated SWA, during a routine that included approximately equal amounts of sedentary/light-, moderate- and vigorous-intensity physical activity.

  4. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8

    DEFF Research Database (Denmark)

    Andersen, Morten Thøtt; Wendt, Fabian F.; Robertson, Amy N.

    2016-01-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the...

  5. Using process elicitation and validation to understand and improve chemotherapy ordering and delivery.

    Science.gov (United States)

    Mertens, Wilson C; Christov, Stefan C; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Cassells, Lucinda J; Marquard, Jenna L

    2012-11-01

    Chemotherapy ordering and administration, in which errors have potentially severe consequences, was quantitatively and qualitatively evaluated by employing process formalism (or formal process definition), a technique derived from software engineering, to elicit and rigorously describe the process, after which validation techniques were applied to confirm the accuracy of the described process. The chemotherapy ordering and administration process, including exceptional situations and individuals' recognition of and responses to those situations, was elicited through informal, unstructured interviews with members of an interdisciplinary team. The process description (or process definition), written in a notation developed for software quality assessment purposes, guided process validation (which consisted of direct observations and semistructured interviews to confirm the elicited details for the treatment plan portion of the process). The overall process definition yielded 467 steps; 207 steps (44%) were dedicated to handling 59 exceptional situations. Validation yielded 82 unique process events (35 new expected but not yet described steps, 16 new exceptional situations, and 31 new steps in response to exceptional situations). Process participants actively altered the process as ambiguities and conflicts were discovered by the elicitation and validation components of the study. Chemotherapy error rates declined significantly during and after the project, which was conducted from October 2007 through August 2008. Each elicitation method and the subsequent validation discussions contributed uniquely to understanding the chemotherapy treatment plan review process, supporting rapid adoption of changes, improved communication regarding the process, and ensuing error reduction.

  6. DBS-LC-MS/MS assay for caffeine: validation and neonatal application.

    Science.gov (United States)

    Bruschettini, Matteo; Barco, Sebastiano; Romantsik, Olga; Risso, Francesco; Gennai, Iulian; Chinea, Benito; Ramenghi, Luca A; Tripodi, Gino; Cangemi, Giuliana

    2016-09-01

    DBS might be an appropriate microsampling technique for therapeutic drug monitoring of caffeine in infants. Nevertheless, its application presents several issues that still limit its use. This paper describes a validated DBS-LC-MS/MS method for caffeine. The results of the method validation showed an hematocrit dependence. In the analysis of 96 paired plasma and DBS clinical samples, caffeine levels measured in DBS were statistically significantly lower than in plasma but the observed differences were independent from hematocrit. These results clearly showed the need for extensive validation with real-life samples for DBS-based methods. DBS-LC-MS/MS can be considered to be a good alternative to traditional methods for therapeutic drug monitoring or PK studies in preterm infants.

  7. Experimental validation of incomplete data CT image reconstruction techniques

    International Nuclear Information System (INIS)

    Eberhard, J.W.; Hsiao, M.L.; Tam, K.C.

    1989-01-01

    X-ray CT inspection of large metal parts is often limited by x-ray penetration problems along many of the ray paths required for a complete CT data set. In addition, because of the complex geometry of many industrial parts, manipulation difficulties often prevent scanning over some range of angles. CT images reconstructed from these incomplete data sets contain a variety of artifacts which limit their usefulness in part quality determination. Over the past several years, the authors' company has developed two new methods of incorporating a priori information about the parts under inspection to significantly improve incomplete-data CT image quality. This work reviews the methods developed and presents experimental results which confirm the effectiveness of the techniques. The new methods for dealing with incomplete CT data sets rely on a priori information from part blueprints (in electronic form), outer boundary information from touch sensors, estimates of part outer boundaries from available x-ray data, and linear x-ray attenuation coefficients of the part. The two methods make use of this information in different fashions. The relative performance of the two methods in detecting various flaw types is compared. Methods for accurately registering a priori information with x-ray data are also described. These results are critical to a new industrial x-ray inspection cell built for inspection of large aircraft engine parts.

  8. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references are listed.

  9. Construct Validity and Case Validity in Assessment

    Science.gov (United States)

    Teglasi, Hedwig; Nebbergall, Allison Joan; Newman, Daniel

    2012-01-01

    Clinical assessment relies on both "construct validity", which focuses on the accuracy of conclusions about a psychological phenomenon drawn from responses to a measure, and "case validity", which focuses on the synthesis of the full range of psychological phenomena pertaining to the concern or question at hand. Whereas construct validity is…

  10. Vibration transducer calibration techniques

    Science.gov (United States)

    Brinkley, D. J.

    1980-09-01

    Techniques for the calibration of vibration transducers used in the Aeronautical Quality Assurance Directorate of the British Ministry of Defence are presented. Following a review of the types of measurements necessary in the calibration of vibration transducers, the performance requirements of vibration transducers, which can be used to measure acceleration, velocity or vibration amplitude, are discussed, with particular attention given to the piezoelectric accelerometer. Techniques for the accurate measurement of sinusoidal vibration amplitude in reference-grade transducers are then considered, including the use of a position sensitive photocell and the use of a Michelson laser interferometer. Means of comparing the output of working-grade accelerometers with that of previously calibrated reference-grade devices are then outlined, with attention given to a method employing a capacitance bridge technique and a method to be used at temperatures between -50 and 200 C. Automatic calibration procedures developed to speed up the calibration process are outlined, and future possible extensions of system software are indicated.

  11. Engineers find climbing techniques work well for dam inspections

    Energy Technology Data Exchange (ETDEWEB)

    O'Shea, M.; Graves, A. [Bureau of Reclamation, Denver, CO (United States)

    1996-10-01

    Climbing techniques adopted by the Bureau of Reclamation to inspect previously inaccessible or difficult-to-reach features at dams are described. Following the failure of the steel radial-arm gate at Folsom Dam, engineers mounted an effort to reach and inspect the dam's seven other spillway gates. This close-up examination was performed to: (1) determine the condition of these gates; and (2) gather clues about the failure of the one gate. The access techniques described involved mountaineering techniques, as opposed to high-scaling techniques, performed with dynamic and static nylon kernmantle ropes.

  12. A simple method of measuring tibial tubercle to trochlear groove distance on MRI: description of a novel and reliable technique.

    Science.gov (United States)

    Camp, Christopher L; Heidenreich, Mark J; Dahm, Diane L; Bond, Jeffrey R; Collins, Mark S; Krych, Aaron J

    2016-03-01

    Tibial tubercle-trochlear groove (TT-TG) distance is a variable that helps guide surgical decision-making in patients with patellar instability. The purpose of this study was to compare the accuracy and reliability of an MRI TT-TG measuring technique using a simple external alignment method to a previously validated gold-standard technique that requires advanced software read by radiologists. TT-TG was calculated by MRI on 59 knees with a clinical diagnosis of patellar instability in a blinded and randomized fashion by two musculoskeletal radiologists using advanced software and by two orthopaedists using the study technique, which utilizes measurements taken on a simple electronic imaging platform. Interrater reliability between the two radiologists and the two orthopaedists and intermethod reliability between the two techniques were calculated using intraclass correlation coefficients (ICC) and concordance correlation coefficients (CCC). ICC and CCC values greater than 0.75 were considered to represent excellent agreement. The mean TT-TG distance was 14.7 mm (standard deviation (SD) 4.87 mm) and 15.4 mm (SD 5.41) as measured by the radiologists and orthopaedists, respectively. Excellent interobserver agreement was noted between the radiologists (ICC 0.941; CCC 0.941), the orthopaedists (ICC 0.978; CCC 0.976), and the two techniques (ICC 0.941; CCC 0.933). The simple TT-TG distance measurement technique analysed in this study resulted in excellent agreement and reliability as compared to the gold-standard technique. This method can predictably be performed by orthopaedic surgeons without advanced radiologic software. Level of evidence: II.
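
    For readers unfamiliar with the agreement statistics quoted above, the following Python sketch computes Lin's concordance correlation coefficient (CCC) for paired measurements from two methods; the sample TT-TG values are hypothetical, not data from the study.

        import numpy as np

        def lins_ccc(x, y):
            """Lin's concordance correlation coefficient between two raters/methods."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()            # population variances
            sxy = ((x - mx) * (y - my)).mean()   # population covariance
            return 2 * sxy / (vx + vy + (mx - my) ** 2)

        # Hypothetical TT-TG distances (mm) from two methods on the same knees
        radiologist = [12.1, 14.8, 19.5, 9.7, 16.2]
        orthopaedist = [12.5, 15.1, 20.0, 10.3, 16.8]
        print(round(lins_ccc(radiologist, orthopaedist), 3))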

  13. A modified random decrement technique for modal identification from nonstationary ambient response data only

    International Nuclear Information System (INIS)

    Lin, Chang Sheng; Chiang, Dar Yun

    2012-01-01

    Modal identification is considered from response data of structural system under nonstationary ambient vibration. In a previous paper, we showed that by assuming the ambient excitation to be nonstationary white noise in the form of a product model, the nonstationary response signals can be converted into free-vibration data via the correlation technique. In the present paper, if the ambient excitation can be modeled as a nonstationary white noise in the form of a product model, then the nonstationary cross random decrement signatures of structural response evaluated at any fixed time instant are shown theoretically to be proportional to the nonstationary cross-correlation functions. The practical problem of insufficient data samples available for evaluating nonstationary random decrement signatures can be approximately resolved by first extracting the amplitude-modulating function from the response and then transforming the nonstationary responses into stationary ones. Modal-parameter identification can then be performed using the Ibrahim time-domain technique, which is effective at identifying closely spaced modes. The theory proposed can be further extended by using the filtering concept to cover the case of nonstationary color excitations. Numerical simulations confirm the validity of the proposed method for identification of modal parameters from nonstationary ambient response data.
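
    A minimal Python sketch of the basic random decrement idea referred to above: segments of the response following each up-crossing of a trigger level are averaged, so the random part cancels and a free-decay-like signature remains. The triggering condition and toy signal are illustrative assumptions; the paper's nonstationary extensions are not reproduced here.

        import numpy as np

        def random_decrement(x, threshold, n_lags):
            """Random decrement signature: average of the response segments
            that follow each up-crossing of the trigger level."""
            x = np.asarray(x, float)
            # indices where the signal crosses the threshold going upward
            triggers = np.where((x[:-1] < threshold) & (x[1:] >= threshold))[0] + 1
            segments = [x[t:t + n_lags] for t in triggers if t + n_lags <= x.size]
            return np.mean(segments, axis=0)

        # Toy stationary example: noisy oscillations
        rng = np.random.default_rng(1)
        t = np.arange(0, 60.0, 0.01)
        x = np.sin(2 * np.pi * 1.5 * t) + 0.5 * rng.standard_normal(t.size)
        signature = random_decrement(x, threshold=x.std(), n_lags=400)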

  14. Effect of Changes in Prolactin RIA Reactants on the Validity of the Results

    International Nuclear Information System (INIS)

    Ahmed, A.M.; Megahed, Y.M.; El Mosallamy, M.A.F.; El-Khoshnia, R.A.M.

    1998-01-01

    Human prolactin plays an essential role in the secretion of milk and has the ability to suppress gonadal function. This study is considered a trial to discuss some technical problems made by the operator in the RIA technique, in order to select optimized, reliable and valid parameters for the measurement of prolactin concentration in human sera. Prolactin concentration was measured in a normal control group and a chronic renal failure group using the optimized technique. Finally, the present optimized technique is a very suitable one for the measurement of prolactin.

  15. Geostatistical validation and cross-validation of magnetometric measurements of soil pollution with Potentially Toxic Elements in problematic areas

    Science.gov (United States)

    Fabijańczyk, Piotr; Zawadzki, Jarosław

    2016-04-01

    Field magnetometry is a fast method that has previously been used effectively to assess potential soil pollution. One of the most popular devices used to measure soil magnetic susceptibility on the soil surface is the MS2D Bartington. A single reading of soil magnetic susceptibility with the MS2D device takes little time but is often characterized by considerable errors related to the instrument or to environmental and lithogenic factors. In this connection, measured values of soil magnetic susceptibility usually have to be validated using more precise, but also much more expensive, chemical measurements. The goal of this study was to analyze methods of validating magnetometric measurements using chemical analyses of element concentrations in soil. Additionally, validation of surface measurements of soil magnetic susceptibility was performed using selected parameters of the distribution of magnetic susceptibility in a soil profile. Validation was performed using selected geostatistical measures of cross-correlation. The geostatistical approach was compared with validation performed using classic statistics. Measurements were performed at selected areas located in the Upper Silesian Industrial Area in Poland and in selected parts of Norway. In these areas soil magnetic susceptibility was measured on the soil surface using an MS2D Bartington device and in the soil profile using an MS2C Bartington device. Additionally, soil samples were taken in order to perform chemical measurements. Acknowledgment: The research leading to these results has received funding from the Polish-Norwegian Research Programme operated by the National Centre for Research and Development under the Norwegian Financial Mechanism 2009-2014 in the frame of Project IMPACT - Contract No Pol-Nor/199338/45/2013.
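
    As a sketch of the kind of geostatistical cross-correlation measure mentioned above, the following Python function estimates an empirical cross-variogram between two co-located variables (for example, magnetic susceptibility and an element concentration). The coordinates, lags and tolerance are illustrative assumptions.

        import numpy as np

        def cross_variogram(coords, z1, z2, lags, tol):
            """Empirical cross-variogram between two co-located variables,
            e.g. soil magnetic susceptibility (z1) and element content (z2)."""
            coords = np.asarray(coords, float)
            z1, z2 = np.asarray(z1, float), np.asarray(z2, float)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            gamma = []
            for h in lags:
                i, j = np.where((d > h - tol) & (d <= h + tol))
                gamma.append(0.5 * np.mean((z1[i] - z1[j]) * (z2[i] - z2[j])))
            return np.array(gamma)

        # Hypothetical usage: 50 sampling points on a 100 m x 100 m plot
        rng = np.random.default_rng(0)
        pts = rng.uniform(0, 100, size=(50, 2))
        susc = rng.normal(50, 10, 50)
        conc = 0.8 * susc + rng.normal(0, 5, 50)   # correlated "chemistry"
        print(cross_variogram(pts, susc, conc, lags=[10, 20, 30], tol=5))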

  16. Development and validation of a multidimensional measure of lean manufacturing

    Directory of Open Access Journals (Sweden)

    Juan A. Marin-Garcia

    2010-02-01

    Full Text Available In the last 30 years of research on lean manufacturing, many different questionnaires were proposed to check the degree of use of the concept. The set of items used changed considerably from one investigation to another. Until now there has been no clear movement converging towards the use, by investigators, of a few instruments whose validity and reliability have been compared in different settings. In fact, the majority of investigations are based on ad-hoc questionnaires, and only a few of them present a questionnaire validation, checking unidimensionality and Cronbach's α alone. Nevertheless, there seems to be a consensus in identifying 5 big constructs that compose lean manufacturing (TQM, JIT, TPM, supply chain management and high-involvement). Our research consisted of identifying and summarizing the models that have been published previously to aggregate the items into constructs or sub-scales of constructs. Later we developed an integrating questionnaire, starting from the items that appeared in previous investigations. Finally we validated the sub-scales and models through a confirmatory factor analysis, using data from a sample of Spanish Sheltered Work Centres (N=128). Of all the proposed models, the best fit was obtained with the first-order model with 20 sub-scales. Our investigation contributes an integrating vision of the published models and a verification of the validity and reliability of the lean manufacturing sub-scales proposed by other investigators. Due to its confirmatory approach, it can serve as a generalization of studies that had been carried out in contexts with samples different from the one we used for the replication.
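
    Since the record criticizes validations that stop at unidimensionality and Cronbach's α, a minimal Python sketch of the α computation itself may be useful for orientation; the score matrix below is random, purely for illustration.

        import numpy as np

        def cronbach_alpha(scores):
            """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
            scores = np.asarray(scores, float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = scores.sum(axis=1).var(ddof=1)    # variance of total score
            return k / (k - 1) * (1.0 - item_var / total_var)

        # Hypothetical 1-5 Likert responses, 128 respondents x 6 items
        rng = np.random.default_rng(42)
        base = rng.integers(1, 6, size=(128, 1))
        items = np.clip(base + rng.integers(-1, 2, size=(128, 6)), 1, 5)
        print(round(cronbach_alpha(items), 2))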

  17. First validation of the new continuous energy version of the MORET5 Monte Carlo code

    International Nuclear Information System (INIS)

    Miss, Joachim; Bernard, Franck; Forestier, Benoit; Haeck, Wim; Richet, Yann; Jacquet, Olivier

    2008-01-01

    The 5.A.1 version is the next release of the MORET Monte Carlo code dedicated to criticality and reactor calculations. This new version combines all the capabilities that are already available in the multigroup version with many new and enhanced features. The main capabilities of the previous version are the powerful association of a deterministic and Monte Carlo approach (like, for instance, APOLLO-MORET), the modular geometry, five source sampling techniques and two simulation strategies. The major advance in MORET5 is the ability to perform either multigroup or continuous-energy simulations. Thanks to these new developments, we now have better control over the whole process of criticality calculations, from reading the basic nuclear data to the Monte Carlo simulation itself. Moreover, this new capability enables us to better validate the deterministic-Monte Carlo multigroup calculations by performing continuous-energy calculations with the same code, using the same geometry and tracking algorithms. The aim of this paper is to describe the main options available in this new release, and to present the first results. Comparisons of the MORET5 continuous-energy results with experimental measurements and against another continuous-energy Monte Carlo code are provided in terms of validation and time performance. Finally, an analysis of the interest of using a unified energy grid for continuous-energy Monte Carlo calculations is presented. (authors)

  18. Efficient techniques for wave-based sound propagation in interactive applications

    Science.gov (United States)

    Mehra, Ravish

    …-driven, rotating or time-varying directivity function at runtime. Unlike previous approaches, the listener directivity approach can be used to compute spatial audio (3D audio) for a moving, rotating listener at interactive rates. Lastly, we propose an efficient GPU-based time-domain solver for the wave equation that enables wave simulation up to the mid-frequency range in tens of minutes on a desktop computer. It is demonstrated that by carefully mapping all the components of the wave simulator to match the parallel processing capabilities of the graphics processors, significant improvement in performance can be achieved compared to CPU-based simulators, while maintaining numerical accuracy. We validate these techniques with offline numerical simulations and measured data recorded in an outdoor scene. We present results of preliminary user evaluations conducted to study the impact of these techniques on users' immersion in the virtual environment. We have integrated these techniques with the Half-Life 2 game engine, Oculus Rift head-mounted display, and Xbox game controller to enable users to experience high-quality acoustic effects and spatial audio in the virtual environment.

  19. Exploring conformational space using a mean field technique with ...

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    2007-06-21

    Jun 21, 2007 ... In general the calculations identified all the folds determined by previous ...

  20. Validation of cardiovascular diagnoses in the Greenlandic Hospital Discharge Register for epidemiological use

    DEFF Research Database (Denmark)

    Tvermosegaard, Maria; Ronn, Pernille Falberg; Pedersen, Michael Lynge

    2018-01-01

    Cardiovascular disease (CVD) is one of the leading causes of death worldwide. In Greenland, valid estimates of prevalence and incidence of CVD do not exist and can only be calculated if diagnoses of CVD in the Greenlandic Hospital Discharge Register (GHDR) are correct. Diagnoses of CVD in GHDR have not previously been validated specifically. The objective of the study was to validate diagnoses of CVD in GHDR. The study was conducted as a validation study with the primary investigator comparing information in GHDR with information in medical records. Diagnoses in GHDR were considered correct and thus valid if they matched the diagnoses or the medical information in the medical records. A total of 432 online accessible medical records with a cardiovascular diagnosis according to GHDR from Queen Ingrid's Hospital from 2001 to 2013 (n=291) and from local health care centres from 2007 to 2013 (n=141) were reviewed.

  1. Don't believe everything you hear: Routine validation of audiovisual information in children and adults.

    Science.gov (United States)

    Piest, Benjamin A; Isberner, Maj-Britt; Richter, Tobias

    2018-04-05

    Previous research has shown that the validation of incoming information during language comprehension is a fast, efficient, and routine process (epistemic monitoring). Previous research on this topic has focused on epistemic monitoring during reading. The present study extended this research by investigating epistemic monitoring of audiovisual information. In a Stroop-like paradigm, participants (Experiment 1: adults; Experiment 2: 10-year-old children) responded to the probe words correct and false by keypress after the presentation of auditory assertions that could be either true or false with respect to concurrently presented pictures. Results provide evidence for routine validation of audiovisual information. Moreover, the results show a stronger and more stable interference effect for children compared with adults.

  2. Parachute technique for partial penectomy

    Directory of Open Access Journals (Sweden)

    Fernando Korkes

    2010-04-01

    Full Text Available PURPOSE: Penile carcinoma is a rare but mutilating malignancy. In this context, partial penectomy is the most commonly applied approach for best oncological results. We herein propose a simple modification of the classic technique of partial penectomy, for better cosmetic and functional results. TECHNIQUE: If partial penectomy is indicated, the present technique can bring additional benefits. Unlike the classical technique, the urethra is spatulated only ventrally. An inverted "V" skin flap with 0.5 cm of extension is sectioned ventrally. The suture is performed with vicryl 4-0 in a "parachute" fashion, beginning from the ventral portion of the urethra and the "V" flap, followed by the "V" flap angles and then by the dorsal portion of the penis. After completion of the suture, a Foley catheter and light dressing are placed for 24 hours. CONCLUSIONS: Several complex reconstructive techniques have been previously proposed, but they normally require specific surgical abilities, adequate patient selection and staged procedures. We believe that these reconstructive techniques are very useful in some specific subsets of patients. However, the technique proposed herein is a simple alternative that can be applied to all men after a partial penectomy, and takes the same amount of time as the classic technique. In conclusion, the "parachute" technique for penile reconstruction after partial amputation not only improves the appearance of the penis, but also maintains adequate function.

  3. Repeat immigration: A previously unobserved source of heterogeneity?

    Science.gov (United States)

    Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D

    2017-07-01

    Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.

  4. Monitoring Ground Subsidence in Hong Kong via Spaceborne Radar: Experiments and Validation

    Directory of Open Access Journals (Sweden)

    Yuxiao Qin

    2015-08-01

    Full Text Available The persistent scatterers interferometry (PSI) technique is gradually becoming known for its capability of providing up to millimeter accuracy of measurement on ground displacement. Nevertheless, there is still considerable doubt regarding its correctness and accuracy. In this paper, we carried out an experiment corroborating the capability of the PSI technique with the help of a traditional survey method in the urban area of Hong Kong, China. Seventy-three TerraSAR-X (TSX) and TanDEM-X (TDX) images spanning over four years are used for the data processing. There are three aims of this study. The first is to generate a displacement map of urban Hong Kong and to check for spots with possible ground movements. This information will be provided to the local surveyors so that they can check these specific locations. The second is to validate whether the accuracy of the PSI technique can indeed reach the millimeter level in this real application scenario. For validating the accuracy of PSI, four corner reflectors (CR) were installed at a construction site on reclaimed land in Hong Kong. They were manually moved up or down by a few to tens of millimeters, and the values derived from the PSI analysis were compared to the true values. The experiment, carried out in non-ideal conditions, nevertheless proved that millimeter accuracy can be achieved by the PSI technique. The last is to evaluate the advantages and limitations of the PSI technique. Overall, the PSI technique can be extremely useful if used in collaboration with other techniques, so that the advantages can be highlighted and the drawbacks avoided.

  5. Validation of the actuator line and disc techniques using the New MEXICO measurements

    DEFF Research Database (Denmark)

    Sarmast, Sasan; Shen, Wen Z.; Zhu, Wei Jun

    2016-01-01

    Actuator line and disc techniques are employed to analyse the wake obtained in the New MEXICO wind turbine experiment. The New MEXICO measurement campaign, carried out in 2014, is a follow-up to the MEXICO campaign, which was completed in 2006. Three flow configurations in axial flow conditions are simulated…

  6. Review of Nipple Reconstruction Techniques and Introduction of V to Y Technique in a Bilateral Wise Pattern Mastectomy or Reduction Mammaplasty

    OpenAIRE

    Riccio, Charles A.; Zeiderman, Matthew R.; Chowdhry, Saeed; Wilhelmi, Bradon J.

    2015-01-01

    Introduction: Nipple-areola complex reconstruction (NAR) is the final procedure in breast reconstruction after the majority of mastectomies. Many methods of NAR have been described, each with inherent advantages and disadvantages depending on local healthy tissue availability, previous scarring and procedures, and the operative morbidity of the NAR technique. Nipple reconstructions may be complicated by scars or previous nipple reconstruction, making the procedure more challenging. We propose...

  7. Developing an instrument to measure emotional behaviour abilities of meaningful learning through the Delphi technique.

    Science.gov (United States)

    Cadorin, Lucia; Bagnasco, Annamaria; Tolotti, Angela; Pagnucci, Nicola; Sasso, Loredana

    2017-09-01

    To identify items for a new instrument that measures emotional behaviour abilities of meaningful learning, according to Fink's Taxonomy. Meaningful learning is an active process that promotes a wider and deeper understanding of concepts. It is the result of an interaction between new and previous knowledge and produces a long-term change of knowledge and skills. To measure meaningful learning capability, it is very important in the education of health professionals to identify problems or special learning needs. For this reason, it is necessary to create valid instruments. A Delphi Study technique was implemented in four phases by means of e-mail. The study was conducted from April-September 2015. An expert panel consisting of ten researchers with experience in Fink's Taxonomy was established to identify the items of the instrument. Data were analysed for conceptual description and item characteristics and attributes were rated. Expert consensus was sought in each of these phases. An 87.5% consensus cut-off was established. After four rounds, consensus was obtained for validation of the content of the instrument 'Assessment of Meaningful learning Behavioural and Emotional Abilities'. This instrument consists of 56 items evaluated on a 6-point Likert-type scale. Foundational Knowledge, Application, Integration, Human Dimension, Caring and Learning How to Learn were the six major categories explored. This content validated tool can help educators (teachers, trainers and tutors) to identify and improve the strategies to support students' learning capability, which could increase their awareness of and/or responsibility in the learning process. © 2017 John Wiley & Sons Ltd.

  8. Estimating Rooftop Suitability for PV: A Review of Methods, Patents, and Validation Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Melius, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    A number of methods have been developed using remote sensing data to estimate rooftop area suitable for the installation of photovoltaics (PV) at various geospatial resolutions. This report reviews the literature and patents on methods for estimating rooftop-area appropriate for PV, including constant-value methods, manual selection methods, and GIS-based methods. This report also presents NREL's proposed method for estimating suitable rooftop area for PV using Light Detection and Ranging (LiDAR) data in conjunction with a GIS model to predict areas with appropriate slope, orientation, and sunlight. NREL's method is validated against solar installation data from New Jersey, Colorado, and California to compare modeled results to actual on-the-ground measurements.
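
    A minimal Python sketch of the core GIS step described above: deriving slope and aspect from a LiDAR-derived elevation raster and flagging cells that fall within PV-suitable ranges. The thresholds and the aspect convention below are illustrative assumptions, not NREL's published criteria.

        import numpy as np

        def pv_suitable(dem, cell_size, max_slope_deg=40.0):
            """Flag raster cells whose slope and orientation suit rooftop PV.
            dem: 2-D elevation array (m); cell_size: raster resolution (m)."""
            dzdy, dzdx = np.gradient(dem, cell_size)          # rows, then cols
            slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
            # One common aspect convention: 0 deg = north, clockwise positive
            aspect = np.degrees(np.arctan2(dzdx, -dzdy)) % 360.0
            south_facing = (aspect > 90.0) & (aspect < 270.0)  # northern hemisphere
            near_flat = slope < 10.0                           # any orientation OK
            return (slope <= max_slope_deg) & (south_facing | near_flat)

        # Hypothetical 4 x 4 rooftop patch at 1 m resolution
        dem = np.array([[5.0, 5.2, 5.4, 5.6]] * 4).T
        print(pv_suitable(dem, cell_size=1.0))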

  9. Validation of single-sample doubly labeled water method

    International Nuclear Information System (INIS)

    Webster, M.D.; Weathers, W.W.

    1989-01-01

    We have experimentally validated a single-sample variant of the doubly labeled water method for measuring metabolic rate and water turnover in a very small passerine bird, the verdin (Auriparus flaviceps). We measured CO2 production using the Haldane gravimetric technique and compared these values with estimates derived from isotopic data. Doubly labeled water results based on the one-sample calculations differed from Haldane values by less than 0.5% on average (range -8.3 to 11.2%, n = 9). Water flux computed by the single-sample method differed by -1.5% on average from results for the same birds based on the standard, two-sample technique (range -13.7 to 2.0%, n = 9).

  10. Validation of biomarkers for loin meat quality (m. longissimus) of pigs

    NARCIS (Netherlands)

    Pierzchala, M.; Hoekman, A.J.W.; Urbanski, P.; Kruijt, L.; Kristensen, L.; Young, L.; Oksbjerg, N.; Goluch, D.; Pas, te M.F.W.

    2014-01-01

    The aim of this study was to validate previously reported associations between microarray gene expression levels and pork quality traits using real-time PCR. Meat samples and meat quality data from 100 pigs were collected from a different pig breed to the one tested by microarray (Large White versus

  11. Combined antegrade and retrograde ureteral stenting: the rendezvous technique

    International Nuclear Information System (INIS)

    Macri, A.; Magno, C.; Certo, A.; Basile, A.; Scuderi, G.; Crescenti, F.; Famulari, C.

    2005-01-01

    Ureteral stenting is a routine procedure in endourology. To increase the success rate in difficult cases, it may be helpful to use the rendezvous technique, a combined antegrade and retrograde approach. We performed 16 urological rendezvous in 11 patients with ureteral strictures or urologic lesions. The combined approach was successful in all patients, without morbidity or mortality. In our experience the rendezvous technique increased the success rate of antegrade ureteral stenting from 78.6 to 88.09% (p>0.05). This procedure is a valid option in case of failure of conventional ureteral stenting

  12. Development and Validation of a Technique for Detection of Stress and Pregnancy in Large Whales

    Science.gov (United States)

    2015-09-30

    …humpback whales, blue whales, and possibly insular false killer whales). 2) The second objective is to complete the biological validation using … identification using high-pressure liquid chromatography (HPLC). Briefly, pooled blubber extract from animals of known gender will be serially diluted…

  13. Testing the Predictive Validity and Construct of Pathological Video Game Use

    Science.gov (United States)

    Groves, Christopher L.; Gentile, Douglas; Tapscott, Ryan L.; Lynch, Paul J.

    2015-01-01

    Three studies assessed the construct of pathological video game use and tested its predictive validity. Replicating previous research, Study 1 produced evidence of convergent validity in 8th and 9th graders (N = 607) classified as pathological gamers. Study 2 replicated and extended the findings of Study 1 with college undergraduates (N = 504). Predictive validity was established in Study 3 by measuring cue reactivity to video games in college undergraduates (N = 254), such that pathological gamers were more emotionally reactive to and provided higher subjective appraisals of video games than non-pathological gamers and non-gamers. The three studies converged to show that pathological video game use seems similar to other addictions in its patterns of correlations with other constructs. Conceptual and definitional aspects of Internet Gaming Disorder are discussed. PMID:26694472

  14. Testing the Predictive Validity and Construct of Pathological Video Game Use

    Directory of Open Access Journals (Sweden)

    Christopher L. Groves

    2015-12-01

    Full Text Available Three studies assessed the construct of pathological video game use and tested its predictive validity. Replicating previous research, Study 1 produced evidence of convergent validity in 8th and 9th graders (N = 607) classified as pathological gamers. Study 2 replicated and extended the findings of Study 1 with college undergraduates (N = 504). Predictive validity was established in Study 3 by measuring cue reactivity to video games in college undergraduates (N = 254), such that pathological gamers were more emotionally reactive to and provided higher subjective appraisals of video games than non-pathological gamers and non-gamers. The three studies converged to show that pathological video game use seems similar to other addictions in its patterns of correlations with other constructs. Conceptual and definitional aspects of Internet Gaming Disorder are discussed.

  15. Integrated Design Validation: Combining Simulation and Formal Verification for Digital Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Lun Li

    2006-04-01

    Full Text Available The correct design of complex hardware continues to challenge engineers. Bugs in a design that are not uncovered in early design stages can be extremely expensive. Simulation is the predominant tool used to validate a design in industry. Formal verification overcomes the weakness of exhaustive simulation by applying mathematical methodologies to validate a design. The work described here focuses upon a technique that integrates the best characteristics of both simulation and formal verification methods to provide an effective design validation tool, referred to as Integrated Design Validation (IDV). The novelty in this approach consists of three components: circuit complexity analysis, partitioning based on design hierarchy, and coverage analysis. The circuit complexity analyzer and partitioning decompose a large design into sub-components and feed the sub-components to different verification and/or simulation tools based upon the known strengths of modern verification and simulation tools. The coverage analysis unit computes the coverage of design validation and improves the coverage by further partitioning. Various simulation and verification tools comprising IDV are evaluated and an example is used to illustrate the overall validation process. The overall process successfully validates the example to a high coverage rate within a short time. The experimental results show that our approach is a very promising design validation method.

  16. 49 CFR 173.23 - Previously authorized packaging.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Previously authorized packaging. 173.23 Section... REQUIREMENTS FOR SHIPMENTS AND PACKAGINGS Preparation of Hazardous Materials for Transportation § 173.23 Previously authorized packaging. (a) When the regulations specify a packaging with a specification marking...

  17. Diagnostic PCR: validation and sample preparation are two sides of the same coin

    DEFF Research Database (Denmark)

    Hoorfar, Jeffrey; Wolffs, Petra; Radstrøm, Peter

    2004-01-01

    Increased use of powerful PCR technology for the routine detection of pathogens has focused attention on the need for international validation and preparation of official non-commercial guidelines. Bacteria of epidemiological importance should be the prime focus, although a "validation" … of quantitative reference DNA material and reagents, production of stringent protocols and tools for thermal cycler performance testing, uncomplicated sample preparation techniques, and extensive ring trials for assessment of the efficacy of selected matrix/pathogen detection protocols.

  18. Validating competencies for an undergraduate training program in rural medicine using the Delphi technique.

    Science.gov (United States)

    Gouveia, Eneline Ah; Braga, Taciana D; Heráclio, Sandra A; Pessoa, Bruno Henrique S

    2016-01-01

    Worldwide, half the population lives in rural or remote areas; however, less than 25% of doctors work in such regions. Despite the continental dimensions of Brazil and its enormous cultural diversity, only some medical schools in this country offer students the opportunity to acquire work experience focused on medicine in rural or remote areas. The objective of the present study was to develop a framework of competencies for a longitudinal medical training program in rural medicine as an integrated part of medical training in Brazil. Two rounds of a modified version of the Delphi technique were conducted. Initially, a structured questionnaire was elaborated, based on a literature review. This questionnaire was submitted to the opinion of 20 panelists affiliated with the Rural Medicine Working Party of the Brazilian Society of Family and Community Medicine. The panelists were asked to evaluate the relevance of the competencies using a five-point Likert-type scale. In this study, the consensus criterion for including a competency in the framework was that a simple majority of the participants deemed it 'very important' or 'indispensable', while the criterion for excluding a competency was that a simple majority of the panel members considered that it 'should not be included' or was 'of little importance'. When a consensus was not reached regarding a given competency, it was submitted to a second round to enable the panelists to re-evaluate the now dichotomized questions. Compliance in responding to the questionnaire was better among the panelists predominantly involved in teaching activities (85%; n=12) compared to those working principally in patient care (45%; n=8). The questionnaire consisted of 26 core competencies and 165 secondary competencies. After evaluation by the specialists, all 26 core competencies were classified as relevant, with none being excluded and only eight secondary competencies failing to achieve a consensus. No new competencies…

  19. 22 CFR 40.91 - Certain aliens previously removed.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  20. Karolinska prostatectomy: a robot-assisted laparoscopic radical prostatectomy technique.

    Science.gov (United States)

    Nilsson, Andreas E; Carlsson, Stefan; Laven, Brett A; Wiklund, N Peter

    2006-01-01

    The last decade has witnessed an increasing trend towards minimally invasive management of prostate cancer, including laparoscopic and, more recently, robot-assisted laparoscopic prostatectomy. Several different laparoscopic approaches have been continuously developed during the last 5 years and it is still unclear which technique yields the best outcome. We present our current technique of robot-assisted laparoscopic radical prostatectomy. The technique described has evolved during the course of >400 robotic prostatectomies performed by the robotic team since the robot-assisted laparoscopic radical prostatectomy program was introduced at Karolinska University Hospital in January 2002. Our procedure comprises several modifications of previously reported ones, and we utilize fewer robotic instruments to reduce costs. An extended posterior dissection is performed to aid in the bladder neck-sparing dissection. In nerve-sparing procedures the vesicles are divided to avoid damage to the erectile nerves. In order to preserve the apical anatomy the dorsal venous complex is incised sharply and is first over-sewn after the apical dissection is completed. Our technique enables a more fluent dissection than previously described robotic techniques. Minimizing changes of instruments and the camera not only cuts costs but also reduces inefficient operating maneuvers, such as switching between 30 degrees and 0 degrees lenses during the procedure. We present a technique which in our hands has achieved excellent functional and oncological results.

  1. Application of the Delphi technique in healthcare maintenance.

    Science.gov (United States)

    Njuangang, Stanley; Liyanage, Champika; Akintoye, Akintola

    2017-10-09

    Purpose The purpose of this paper is to examine the research design, issues and considerations in the application of the Delphi technique to identify, refine and rate the critical success factors and performance measures in maintenance-associated infections. Design/methodology/approach An in-depth literature review and the application of open and axial coding were used to formulate the interview and research questions. These were used to conduct an exploratory case study of two healthcare maintenance managers, randomly selected from two National Health Service Foundation Trusts in England. The results of the exploratory case study provided the rationale for the application of the Delphi technique in this research. The different processes in the application of the Delphi technique in healthcare research are examined thoroughly. Findings This research demonstrates the need to apply and integrate different research methods to enhance the validity of the Delphi technique. The rationale for the application of the Delphi technique in this research is that some healthcare maintenance managers lack knowledge about basic infection control (IC) principles to make hospitals safe for patient care. The result of the first round of the Delphi exercise is a useful contribution in its own right. It identified a number of salient issues and differences in the opinions of the Delphi participants, notably between healthcare maintenance managers and members of the infection control team. It also resulted in useful suggestions and comments to improve the quality and presentation of the second- and third-round Delphi instruments. Practical implications This research provides a research methodology that can be adopted by researchers investigating new and emerging issues in the healthcare sector. As this research demonstrates, the Delphi technique is relevant in soliciting expert knowledge and opinion to identify performance measures to control maintenance-associated infections in…

  2. Expert system verification and validation for nuclear power industry applications

    International Nuclear Information System (INIS)

    Naser, J.A.

    1990-01-01

    The potential for the use of expert systems in the nuclear power industry is widely recognized. The benefits of such systems include consistency of reasoning during off-normal situations when humans are under great stress, the reduction of times required to perform certain functions, the prevention of equipment failures through predictive diagnostics, and the retention of human expertise in performing specialized functions. The increased use of expert systems brings with it concerns about their reliability. Difficulties arising from software problems can affect plant safety, reliability, and availability. A joint project between EPRI and the US Nuclear Regulatory Commission is being initiated to develop a methodology for verification and validation of expert systems for nuclear power applications. This methodology will be tested on existing and developing expert systems. This effort will explore the applicability of conventional verification and validation methodologies to expert systems. The major area of concern will be certification of the knowledge base. This is expected to require new types of verification and validation techniques. A methodology for developing validation scenarios will also be studied

  3. Validation and application of the methodology for analysis of radon concentration in the air through the technique of solid state nuclear track detectors (SSNTD)

    International Nuclear Information System (INIS)

    Carvalho, Caroline de; Comissao Nacional de Energia Nuclear; Silva, Nivaldo Carlos da

    2011-01-01

    Radon is a radioactive noble gas that occurs naturally in soil and can enter residences. The decay products of radon are radioactive metals which, when inhaled, can be retained in the respiratory system, leading to an internal dose of radiation. The monitoring of radon levels in residences and workplaces is extremely important, since high concentrations of this gas can cause serious public health problems. This study analyzed the concentration of radon in the air in 94 work environments at the Laboratory of Pocos de Caldas - LAPOC/CNEN, including laboratories, administrative rooms, workshop, warehouse and guardhouse. The method employed in the monitoring was the technique of solid state nuclear track detectors, known as SSNTD. For calibration and validation of this method, controlled experiments were conducted in the laboratory with specific instrumentation. The monitoring results indicated that most environments present radon concentrations above 100 Bq m⁻³, which is the reference level recommended by the World Health Organization. (author)
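
    A minimal sketch, in Python, of how an SSNTD reading is typically converted into an average radon concentration: the net track density is divided by a detector-specific calibration factor times the exposure time. The calibration factor and counts below are hypothetical, not values from this study.

        def radon_concentration(track_density, background, k_cal, days):
            """Average radon concentration (Bq/m^3) from an SSNTD detector.
            track_density, background: tracks per cm^2; k_cal: calibration
            factor in tracks cm^-2 per (Bq m^-3 * day), instrument-specific
            and hypothetical here; days: exposure time in days."""
            return (track_density - background) / (k_cal * days)

        # Example: 90-day exposure with a hypothetical calibration factor
        print(radon_concentration(track_density=520.0, background=40.0,
                                  k_cal=0.052, days=90))   # ~102.6 Bq/m^3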

  4. Validation and application of the methodology for analysis of radon concentration in the air through the technique of solid state nuclear track detectors (SSNTD)

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Caroline de [Pontificia Universidade Catolica de Minas Gerais (PUC-Pocos), Pocos de Caldas, MG (Brazil); Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Lab. de Pocos de Caldas; Silva, Nivaldo Carlos da, E-mail: ncsilva@cnen.gov.b [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Lab. de Pocos de Caldas

    2011-07-01

    Radon is a radioactive noble gas that occurs naturally in soil and can enter residences. The decay products of radon are radioactive metals which, when inhaled, can be retained in the respiratory system, leading to an internal dose of radiation. The monitoring of radon levels in residences and workplaces is extremely important, since high concentrations of this gas can cause serious public health problems. This study analyzed the concentration of radon in the air in 94 work environments at the Laboratory of Pocos de Caldas - LAPOC/CNEN, including laboratories, administrative rooms, workshop, warehouse and guardhouse. The method employed in the monitoring was the technique of solid state nuclear track detectors, known as SSNTD. For calibration and validation of this method, controlled experiments were conducted in the laboratory with specific instrumentation. The monitoring results indicated that most environments present radon concentrations above 100 Bq m⁻³, which is the reference level recommended by the World Health Organization. (author)

  5. Explicating Validity

    Science.gov (United States)

    Kane, Michael T.

    2016-01-01

    How we choose to use a term depends on what we want to do with it. If "validity" is to be used to support a score interpretation, validation would require an analysis of the plausibility of that interpretation. If validity is to be used to support score uses, validation would require an analysis of the appropriateness of the proposed…

  6. Data Analysis and Neuro-Fuzzy Technique for EOR Screening: Application in Angolan Oilfields

    Directory of Open Access Journals (Sweden)

    Geraldo A. R. Ramos

    2017-06-01

    Full Text Available In this work, a neuro-fuzzy (NF) simulation study was conducted in order to screen candidate reservoirs for enhanced oil recovery (EOR) projects in Angolan oilfields. First, a knowledge pattern is extracted by combining both the searching potential of fuzzy logic (FL) and the learning capability of a neural network (NN) to make a priori decisions. The extracted knowledge pattern is validated against rock and fluid data trained from successful EOR projects around the world. Then, data from the Block K offshore Angolan oilfields are mined and analysed using the box-plot technique to investigate the degree of suitability for EOR projects. The trained and validated model is then tested on the Angolan field data (Block K), where EOR application is yet to be fully established. The results from the NF simulation technique applied in this investigation show that polymer, hydrocarbon gas, and combustion are the suitable EOR techniques.
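
    To make the screening idea concrete, here is a minimal Python sketch of a fuzzy suitability score for one EOR method: each reservoir property gets a trapezoidal membership value and the overall score is the weakest property. The property names and ranges are hypothetical placeholders, not the criteria used in the paper, and the neural-network learning step is omitted.

        import numpy as np

        def trapezoid(x, a, b, c, d):
            """Trapezoidal membership function used in fuzzy screening."""
            return float(np.clip(min((x - a) / (b - a + 1e-12),
                                     (d - x) / (d - c + 1e-12)), 0.0, 1.0))

        # Hypothetical polymer-flood screening ranges (not the paper's criteria)
        criteria = {
            "viscosity_cp": (10, 20, 100, 150),
            "porosity_pct": (15, 20, 30, 35),
            "api_gravity":  (15, 20, 40, 45),
        }

        def screening_score(reservoir):
            """Aggregate suitability: minimum membership across all properties."""
            return min(trapezoid(reservoir[k], *criteria[k]) for k in criteria)

        print(screening_score({"viscosity_cp": 60, "porosity_pct": 24,
                               "api_gravity": 32}))   # -> 1.0 (fully suitable)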

  7. The Predictive Validity of CBM Writing Indices for Eighth-Grade Students

    Science.gov (United States)

    Amato, Janelle M.; Watkins, Marley W.

    2011-01-01

    Curriculum-based measurement (CBM) is an alternative to traditional assessment techniques. Technical work has begun to identify CBM writing indices that are psychometrically sound for monitoring older students' writing proficiency. This study examined the predictive validity of CBM writing indices in a sample of 447 eighth-grade students.…

  8. Identification and validation of highly frequent CpG island hypermethylation in colorectal adenomas and carcinomas

    DEFF Research Database (Denmark)

    Øster, Bodil; Thorsen, Kasper; Lamy, Philippe

    2011-01-01

    …), in carcinomas only (ABHD9, AOX1 and RERG), or in MSI but not MSS carcinomas (RAMP2, DSC3 and MLH1) were validated using MS-HRM. Four of these genes (MLH1, AOX1, EYA4 and TWIST1) had previously been reported to be hypermethylated in CRC. Eleven genes, not previously known to be affected by CRC-specific…

  9. On qualification of TOFD technique for austenitic stainless steel welds inspection

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Ona, R. [Tecnatom, San Sebastian de los Reyes (Spain); Viggianiello, S.; Bleuze, A. [Metalscan, Saint-Remy (France)

    2006-07-01

    The Time of Flight Diffraction (TOFD) technique is gaining ground as a solid method for the detection and sizing of defects. It has been reported that the TOFD technique provides good results in the inspection of fine-grain steels. However, there are few results regarding the application and performance of this technique on austenitic stainless steels. A big challenge of these inspections is the coarse grain structure, which produces a low signal-to-noise ratio and may mask the diffraction signals. Appropriate transducer design, selection of technique parameters and analysis tools can overcome the present difficulties. In this paper, the main design aspects and parameters of the TOFD technique for austenitic steels are presented. This is followed by a description of the qualification tests carried out to validate the technique for inspecting stainless steel welds. To conclude, results from actual inspections are discussed. (orig.)

  10. Study of water flowrate using time transient and cross-correlation techniques with 82Br radiotracer

    International Nuclear Information System (INIS)

    Salgado, William L.; Brandao, Luiz E.B.

    2013-01-01

    This paper aims to determine the water flowrate using the Time Transient and Cross-Correlation techniques. The detection system uses two NaI(Tl) detectors adequately positioned on the outside of the pipe and a gamma-ray source (⁸²Br radiotracer). The water flowrate measurements obtained with the Time Transient and Cross-Correlation techniques were compared to those of a conventional invasive flowmeter previously installed in the pipeline. Discrepancies between the Time Transient and Cross-Correlation flowrate values and the conventional ones were found to be less than 3%. (author)
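
    A minimal Python sketch of the Cross-Correlation principle used above: the lag that maximizes the cross-correlation between the upstream and downstream detector signals gives the tracer transit time, from which velocity and volumetric flowrate follow. The detector spacing, sampling rate and pipe area are hypothetical.

        import numpy as np

        def crosscorr_flowrate(up, down, fs, spacing_m, pipe_area_m2):
            """Volumetric flowrate (m^3/s) from two detector traces sampled
            at fs (Hz), with detectors separated by spacing_m along the pipe."""
            up = np.asarray(up, float) - np.mean(up)
            down = np.asarray(down, float) - np.mean(down)
            corr = np.correlate(down, up, mode="full")
            lag_s = (np.argmax(corr) - (len(up) - 1)) / fs   # transit time (s)
            velocity = spacing_m / lag_s                     # mean tracer velocity
            return velocity * pipe_area_m2

        # Synthetic test: downstream pulse delayed by 0.5 s at fs = 1 kHz
        fs, delay = 1000, 0.5
        t = np.arange(0, 5.0, 1.0 / fs)
        up = np.exp(-0.5 * ((t - 1.0) / 0.05) ** 2)
        down = np.exp(-0.5 * ((t - 1.0 - delay) / 0.05) ** 2)
        print(crosscorr_flowrate(up, down, fs, spacing_m=1.0, pipe_area_m2=0.01))
        # -> about 0.02 m^3/s  (v = 2 m/s through 0.01 m^2)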

  11. The Development and Validation of the Student Response System Benefit Scale

    Science.gov (United States)

    Hooker, J. F.; Denker, K. J.; Summers, M. E.; Parker, M.

    2016-01-01

    Previous research into the benefits of student response systems (SRS) brought into the classroom revealed that SRS can contribute positively to student experiences. However, while the benefits of SRS have been conceptualized and operationalized into a widely cited scale, the validity of this scale had not been tested. Furthermore,…

  12. Validation of Land Surface Temperature from Sentinel-3

    Science.gov (United States)

    Ghent, D.

    2017-12-01

    …-based coefficients, thus emphasizing the importance of non-traditional forms of validation such as radiance-based techniques. Here we present examples of the ongoing routine application of the protocol to operational Sentinel-3 LST data.

  13. Recent Progress Validating the HADES Model of LLNL's HEAF MicroCT Measurements

    Energy Technology Data Exchange (ETDEWEB)

    White, W. T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bond, K. C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lennox, K. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Aufderheide, M. B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Seetho, I. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Roberson, G. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-07-17

    This report compares recent HADES calculations of x-ray linear attenuation coefficients to previous MicroCT measurements made at Lawrence Livermore National Laboratory’s High Energy Applications Facility (HEAF). The chief objective is to investigate what impact recent changes in HADES modeling have on validation results. We find that these changes have no obvious effect on the overall accuracy of the model. Detailed comparisons between recent and previous results are presented.

  14. Validation of the Single-Factor Model of the Relationship Assessment Scale among Married and Cohabiting Persons from Monterrey, Mexico

    Directory of Open Access Journals (Sweden)

    José Moral de la Rubia

    2015-07-01

    Full Text Available The study of intimate partner relationships is particularly important because this union is the foundation of the family. Satisfaction with the relationship can be defined as the overall attitude towards the relationship and the partner. Hendrick's Relationship Assessment Scale (RAS) is an instrument commonly used to assess the construct. Previous research has shown that this scale has high internal consistency and a single-factor structure. Although there are validation studies of the RAS, these studies used inappropriate statistical techniques to analyze its Likert-type items and to determine the number of factors; likewise, its factor invariance across sex had not previously been contrasted. Therefore, this study posed the following research questions: Does the RAS have consistent and discriminating items? Basing the analysis on a polychoric correlation matrix, what is its level of internal consistency? How many factors emerge using rigorous empirical methods? Is the single-factor model invariant across sex? In order to answer these research questions, we used random route probability sampling in this instrument validation study of the RAS. The sample was drawn from the population of married couples and those living in consensual union in Monterrey, Mexico. There were 431 female and 376 male participants in the study. The RAS items were consistent and discriminative. The internal consistency of the scale was excellent in the whole sample (ordinal α = .93), as well as among female (ordinal α = .94) and male participants (ordinal α = .92). Horn's parallel analysis and Velicer's minimum average partial test suggested a one-factor solution. Moreover, the single-factor model (with one correlation between the residuals of the two negatively worded items) had a close fit to the data, and its properties of invariance across sex were very acceptable by the Unweighted Least Squares method. We conclude that the scale shows internal…
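
    Because Horn's parallel analysis is central to the factor-retention argument above, here is a minimal Python sketch of it: observed eigenvalues are kept only while they exceed the chosen quantile of eigenvalues from random data of the same shape. For simplicity this uses Pearson correlations, whereas the study worked from a polychoric matrix; the demo data are synthetic.

        import numpy as np

        def parallel_analysis(data, n_iter=500, quantile=95, seed=0):
            """Horn's parallel analysis: number of factors whose observed
            correlation-matrix eigenvalues exceed the given quantile of
            eigenvalues from random data of the same shape."""
            rng = np.random.default_rng(seed)
            n, p = data.shape
            obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
            rand = np.empty((n_iter, p))
            for i in range(n_iter):
                r = rng.standard_normal((n, p))
                rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
            thresh = np.percentile(rand, quantile, axis=0)
            return int(np.sum(obs > thresh))

        # Synthetic one-factor data: 300 cases, 7 items
        rng = np.random.default_rng(3)
        f = rng.standard_normal((300, 1))
        X = f + 0.8 * rng.standard_normal((300, 7))
        print(parallel_analysis(X))   # -> 1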

  15. Developing a model for hospital inherent safety assessment: Conceptualization and validation.

    Science.gov (United States)

    Yari, Saeed; Akbari, Hesam; Gholami Fesharaki, Mohammad; Khosravizadeh, Omid; Ghasemi, Mohammad; Barsam, Yalda; Akbari, Hamed

    2018-01-01

    Paying attention to the safety of hospitals, as the most crucial institutions for providing medical and health services, wherein a bundle of facilities, equipment, and human resources exist, is of significant importance. The present research aims at developing a model for assessing hospitals' safety based on principles of inherent safety design. Face validity (30 experts), content validity (20 experts), construct validity (268 samples), convergent validity, and divergent validity were employed to validate the prepared questionnaire; item analysis, the Cronbach's alpha test, the ICC test (to measure the reliability of the test), and the composite reliability coefficient were used to measure primary reliability. The relationship between variables and factors was confirmed at the 0.05 significance level by conducting confirmatory factor analysis (CFA) and the structural equation modeling (SEM) technique with the use of Smart-PLS. R-squared and factor loading values, which were higher than 0.67 and 0.300 respectively, indicated a strong fit. Moderation (0.970), simplification (0.959), substitution (0.943), and minimization (0.5008) had the greatest weights in determining the inherent safety of a hospital, respectively. Moderation, simplification, and substitution, among the other dimensions, have more weight on inherent safety, while minimization has the least weight, which could be due to its definition as minimizing the risk.

  16. Design and Validation of Affective Warning Pictorial on Cigarette Labels

    Directory of Open Access Journals (Sweden)

    Chanduen Pat-Arin

    2016-01-01

    Full Text Available The purpose of the present study was to design and validate affective warning pictorials for cigarette labels in Thailand. Brainstorming and survey techniques were used to collect ideas for possible warning pictorials. All ideas were grouped to find candidate pictorials. Then, a preliminary set of sixty warning pictorials was collected and equally classified into three affective warning pictorial groups: positive, neutral, and negative. Sixty Thai male engineering students participated in the affective validation of the warning pictorials using SAM ratings. The International Affective Picture System (IAPS) was used to manipulate the affective state of participants to a neutral affective state before the experiments. The results revealed that all affective warning pictorials successfully evoked the target affective states in the participants. After refining, thirty affective warning pictorials were provided as positive, neutral, and negative affective warning pictorials for use on cigarette labels. Implications for the design and validation of affective warning pictorials are discussed.

  17. Reliability and validity of a Danish version of the multiple sclerosis neuropsychological screening Questionnaire

    DEFF Research Database (Denmark)

    Sejbæk, Tobias; Blaabjerg, Morten; Sprogøe, Pippi

    2018-01-01

    The Multiple Sclerosis Neuropsychological Screening Questionnaire (MSNQ) has previously shown good validity in American, Argentinean, and Dutch MS cohorts. We sought to test the reliability and validity of a Danish translation of the MSNQ compared with formal neuropsychological testing, measures of depression, … the Expanded Disability Status Scale and the MS Impairment Scale. Results: The test-retest reliability of the MSNQ-P was significant (R2 = 0.79, P …), suggesting that the MSNQ-P measures these items more than the cognitive abilities of the patients. Conclusions: This study does not support use of the MSNQ as a sensitive or valid screening tool for cognitive impairment in Danish patients with MS.

  18. Lower-Order Compensation Chain Threshold-Reduction Technique for Multi-Stage Voltage Multipliers

    Directory of Open Access Journals (Sweden)

    Francesco Dell’ Anna

    2018-04-01

    Full Text Available This paper presents a novel threshold-compensation technique for multi-stage voltage multipliers employed in low-power applications such as passive and autonomous wireless sensing nodes (WSNs) powered by energy harvesters. The proposed threshold-reduction technique enables a topological design methodology which, through optimum control of the trade-off between transistor conductivity and leakage losses, is aimed at maximizing the voltage conversion efficiency (VCE) for a given ac input signal and physical chip area occupation. The conducted simulations support the validity of the proposed design methodology, emphasizing the exploitable design space yielded by the transistor connection scheme in the voltage multiplier chain. An experimental validation and comparison of threshold-compensation techniques was performed, adopting 2N5247 N-channel junction field effect transistors (JFETs) for the realization of the voltage multiplier prototypes. The attained measurements clearly support the effectiveness of the proposed threshold-reduction approach, which can significantly reduce the chip area occupation for a given target output performance and ac input signal.

  19. Gating Techniques for Rao-Blackwellized Monte Carlo Data Association Filter

    Directory of Open Access Journals (Sweden)

    Yazhao Wang

    2014-01-01

    Full Text Available This paper studies the Rao-Blackwellized Monte Carlo data association (RBMCDA) filter for multiple target tracking. The elliptical gating strategies are redesigned and incorporated into the framework of the RBMCDA filter. The obvious benefit is a reduction in time cost, because the data association procedure can be carried out with fewer validated measurements. In addition, the overlapping parts of neighboring validation regions are divided into several separate subregions according to the possible origins of the validated measurements. In these subregions, the measurement uncertainties can be taken into account more reasonably than with the simple elliptical gate. This helps the RBMCDA algorithm achieve a higher tracking ability through a better association prior approximation. Simulation results are provided to show the effectiveness of the proposed gating techniques.
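
    A minimal Python sketch of the standard elliptical validation gate underlying the strategies described above: a measurement is accepted when its normalized innovation squared lies inside a chi-square threshold. The gate probability and matrices are illustrative; the paper's subregion refinement is not reproduced here.

        import numpy as np
        from scipy.stats import chi2

        def in_gate(z, z_pred, S, gate_prob=0.99):
            """Accept measurement z for a track with predicted measurement
            z_pred and innovation covariance S if it falls inside the
            elliptical (Mahalanobis) validation gate."""
            nu = np.asarray(z, float) - np.asarray(z_pred, float)   # innovation
            d2 = float(nu @ np.linalg.solve(S, nu))                 # Mahalanobis^2
            return d2 <= chi2.ppf(gate_prob, df=nu.size)

        # Hypothetical 2-D position measurement
        S = np.array([[4.0, 0.5], [0.5, 2.0]])
        print(in_gate(z=[101.0, 49.0], z_pred=[100.0, 50.0], S=S))  # True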

  20. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…