WorldWideScience

Sample records for factor analysis procedures

  1. Human factoring administrative procedures

    International Nuclear Information System (INIS)

    Grider, D.A.; Sturdivant, M.H.

    1991-01-01

    In nonnuclear business, administrative procedures bring to mind such mundane topics as filing correspondence and scheduling vacation time. In the nuclear industry, on the other hand, administrative procedures play a vital role in assuring the safe operation of a facility. For some time now, industry focus has been on improving technical procedures. Producing a sound technical procedure requires that a validated technical, regulatory, and administrative basis be developed and that the technical process be established for each procedure. Producing usable technical procedures requires that procedure presentation be engineered to the same human factors principles used in control room design. The vital safety role of administrative procedures requires that they be just as sound, just as rigorously formulated, and just as well documented as technical procedures. Procedure programs at the Tennessee Valley Authority and at Boston Edison's Pilgrim Station demonstrate that human factors engineering techniques can be applied effectively to technical procedures. With a few modifications, those same techniques can be used to produce more effective administrative procedures. Efforts are under way at the US Department of Energy Nuclear Weapons Complex and at some utilities (Boston Edison, for instance) to apply human factors engineering to administrative procedures. The techniques being adapted include the following

  2. [Delirium in stroke patients: Critical analysis of statistical procedures for the identification of risk factors].

    Science.gov (United States)

    Nydahl, P; Margraf, N G; Ewers, A

    2017-04-01

    Delirium is a relevant complication following an acute stroke. It is a multifactorial occurrence with numerous interacting risk factors that alternately influence each other. The risk factors of delirium in stroke patients are often based on limited clinical studies. The statistical procedures and clinical relevance of delirium-related risk factors in adult stroke patients should therefore be questioned. This secondary analysis includes clinically relevant studies that give evidence for the clinical relevance and statistical significance of delirium-associated risk factors in stroke patients. The quality of the reporting of regression analyses was assessed using Ottenbacher's quality criteria. The delirium-associated risk factors identified were examined for statistical significance using the Bonferroni method, which corrects multiple tests against false-positive findings. This was followed by a literature-based discussion on clinical relevance. Nine clinical studies were included. None of the studies fulfilled all the prerequisites and assumptions given for the reporting of regression analyses according to Ottenbacher. Of the 108 delirium-associated risk factors, a total of 48 (44.4%) were significant, of which a total of 28 (58.3%) were false positive after Bonferroni correction. Following a literature-based discussion on clinical relevance, the assumption of statistical significance and clinical relevance could be upheld for only four risk factors (dementia or cognitive impairment, total anterior infarct, severe infarct, and infections). The statistical procedures used in the existing literature are questionable, as are their results. A post-hoc analysis and critical appraisal reduced the number of possible delirium-associated risk factors to just a few clinically relevant factors.
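The Bonferroni correction applied above is a simple division of the significance level by the number of hypotheses tested. A minimal sketch, with illustrative p-values (the study's 108 actual test results are not reproduced here):

```python
def bonferroni_significant(p_values, alpha=0.05):
    """Return flags for p-values that remain significant after Bonferroni correction."""
    m = len(p_values)          # number of hypotheses tested
    threshold = alpha / m      # corrected per-test significance level
    return [p <= threshold for p in p_values]

# Four illustrative p-values; with m = 4 the corrected threshold is 0.0125.
p_vals = [0.0001, 0.004, 0.03, 0.0004]
print(bonferroni_significant(p_vals))   # -> [True, True, False, True]
```

A nominally significant result (p = 0.03) fails the corrected threshold, which is exactly how 28 of the 48 significant risk factors above became false positives after correction.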

  3. Human factor analysis related to new symptom based procedures used by control room crews during treatment of emergency states

    International Nuclear Information System (INIS)

    Holy, J.

    1999-01-01

    New symptom-based emergency procedures have been developed for the Dukovany Nuclear Power Plant in the Czech Republic. As one step in the process of verification and validation of the procedures, a specific effort was devoted to a detailed analysis of the procedures from the human factors and human reliability point of view. The course and results of the analysis are discussed in this article. Although the analyzed procedures were developed for one specific plant of the WWER-440/213 type, most of the presented results may be valid for many other procedures recently developed for the semi-automatic control of technological units operated under a measurable level of risk. (author)

  4. Human Factors Process Task Analysis Liquid Oxygen Pump Acceptance Test Procedure for the Advanced Technology Development Center

    Science.gov (United States)

    Diorio, Kimberly A.

    2002-01-01

    A process task analysis effort was undertaken by Dynacs Inc. commencing in June 2002 under contract from NASA YA-D6. Funding was provided through NASA's Ames Research Center (ARC), Code M/HQ, and Industrial Engineering and Safety (IES). The John F. Kennedy Space Center (KSC) Engineering Development Contract (EDC) Task Order was 5SMA768. The scope of the effort was to conduct a Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) of a hazardous activity and provide recommendations to eliminate or reduce the effects of errors caused by human factors. The Liquid Oxygen (LOX) Pump Acceptance Test Procedure (ATP) was selected for this analysis. The HF PFMEA table (see appendix A) provides an analysis of six major categories evaluated for this study. These categories include Personnel Certification, Test Procedure Format, Test Procedure Safety Controls, Test Article Data, Instrumentation, and Voice Communication. For each specific requirement listed in appendix A, the following topics were addressed: Requirement, Potential Human Error, Performance-Shaping Factors, Potential Effects of the Error, Barriers and Controls, Risk Priority Numbers, and Recommended Actions. This report summarizes findings and gives recommendations as determined by the data contained in appendix A. It also includes a discussion of technology barriers and challenges to performing task analyses, as well as lessons learned. The HF PFMEA table in appendix A recommends the use of accepted and required safety criteria in order to reduce the risk of human error. The items with the highest risk priority numbers should receive the greatest amount of consideration. Implementation of the recommendations will result in a safer operation for all personnel.
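The risk priority numbers mentioned above follow the standard FMEA convention of multiplying severity, occurrence, and detection ratings; a minimal sketch, in which the 1-10 scales and the example entries are illustrative assumptions rather than values from the report's appendix A:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number = severity x occurrence x detection, each rated 1-10."""
    return severity * occurrence * detection

# Hypothetical HF PFMEA entries (not from the actual LOX pump ATP analysis).
items = {
    "missed torque verification": rpn(8, 4, 6),
    "mislabeled valve position":  rpn(7, 3, 2),
}

# Rank so the items with the highest RPN receive the greatest consideration.
for name, score in sorted(items.items(), key=lambda kv: -kv[1]):
    print(name, score)
```

The descending sort reflects the report's recommendation that items with the highest risk priority numbers be addressed first.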

  5. Human Factors Process Task Analysis: Liquid Oxygen Pump Acceptance Test Procedure at the Advanced Technology Development Center

    Science.gov (United States)

    Diorio, Kimberly A.; Voska, Ned (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define system; Identify human-machine; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.

  6. Human Reliability Analysis For Computerized Procedures

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Gertman, David I.; Le Blanc, Katya

    2011-01-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  7. Biomass Compositional Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

    NREL develops laboratory analytical procedures (LAPs) for standard biomass analysis. These procedures help scientists and analysts understand more about the chemical composition of raw biomass.

  8. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.
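The core computation the book covers, extracting factor loadings from a correlation matrix, can be sketched with a principal-component extraction; the data below are randomly generated for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two latent factors generating six observed variables plus noise.
factors = rng.normal(size=(500, 2))
loadings_true = rng.normal(size=(2, 6))
X = factors @ loadings_true + 0.3 * rng.normal(size=(500, 6))

R = np.corrcoef(X, rowvar=False)        # 6x6 correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)    # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]       # re-sort descending
k = 2                                   # number of factors retained
loadings = eigvecs[:, order[:k]] * np.sqrt(eigvals[order[:k]])
print(loadings.shape)                   # (6, 2): one loading per variable per factor
```

Scaling each retained eigenvector by the square root of its eigenvalue yields loadings whose squared row sums approximate each variable's communality, the quantity factor-analytic interpretation rests on.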

  9. Design-related influencing factors of the computerized procedure system for inclusion into human reliability analysis of the advanced control room

    International Nuclear Information System (INIS)

    Kim, Jaewhan; Lee, Seung Jun; Jang, Seung Cheol; Ahn, Kwang-Il; Shin, Yeong Cheol

    2013-01-01

    This paper presents major design factors of the computerized procedure system (CPS) by task characteristics/requirements, with individual relative weights evaluated by the analytic hierarchy process (AHP) technique, for inclusion into human reliability analysis (HRA) of advanced control rooms. Task characteristics/requirements of an individual procedural step are classified into four categories according to the dynamic characteristics of an emergency situation: (1) a single-static step, (2) a single-dynamic and single-checking step, (3) a single-dynamic and continuous-monitoring step, and (4) a multiple-dynamic and continuous-monitoring step. According to the importance ranking evaluation by the AHP technique, 'clearness of the instruction for taking action', 'clearness of the instruction and its structure for rule interpretation', and 'adequate provision of requisite information' were rated as being of higher importance for all the task classifications. The importance of 'adequacy of the monitoring function' and 'adequacy of representation of the dynamic link or relationship between procedural steps' is dependent upon task characteristics. The result of the present study gives valuable insight into which design factors of the CPS should be incorporated, and with what relative importance or weight, into HRA of advanced control rooms. (author)
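The AHP weighting used above derives relative weights from the principal eigenvector of a pairwise-comparison matrix. A minimal sketch, in which the 3x3 comparison matrix (Saaty's 1-9 scale) is a made-up illustration, not the paper's actual design-factor comparisons:

```python
import numpy as np

# Pairwise comparisons: A[i][j] = judged importance of factor i over factor j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()   # normalized priority weights, sum to 1
print(weights.round(3))
```

The resulting weight vector ranks the factors in the order implied by the judgments; in a full AHP the maximum eigenvalue is also compared against the matrix size to compute a consistency ratio.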

  10. Safety analysis procedures for PHWR

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae; Yoo, Kun Joong

    2004-03-01

    The methodology of safety analyses for CANDU reactors in Canada, the vendor country, uses a combination of best-estimate physical models and conservative input parameters so as to minimize the uncertainty of the plant behavior predictions. By using conservative input parameters, the results of the safety analyses are assured to meet the regulatory requirements, such as the public dose, the integrity of fuel and fuel channels, the integrity of containment and reactor structures, etc. However, there are no comprehensive and systematic procedures for safety analyses of CANDU reactors in Korea. In this regard, the development of safety analysis procedures for CANDU reactors is being conducted not only to establish the safety analysis system, but also to enhance the quality assurance of the safety assessment. In the first phase of this study, the general procedures of the deterministic safety analyses are developed. The general safety procedures cover the specification of the initiating event, selection of the methodology and accident sequences, computer codes, safety analysis procedures, verification of errors and uncertainties, etc. Finally, these general procedures are applied to the Large Break Loss Of Coolant Accident (LBLOCA) in the Final Safety Analysis Report (FSAR) for Wolsong units 2, 3, and 4.

  11. A Human Error Analysis Procedure for Identifying Potential Error Modes and Influencing Factors for Test and Maintenance Activities

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2010-01-01

    Periodic or non-periodic test and maintenance (T and M) activities in large, complex systems such as nuclear power plants (NPPs) are essential for sustaining stable and safe operation of the systems. On the other hand, it has also been raised that erroneous human actions occurring during T and M activities can incur unplanned reactor trips (RTs) or power derating, make safety-related systems unavailable, or degrade the reliability of components. The contribution of human errors during normal and abnormal activities of NPPs to unplanned RTs is known to be about 20% of the total events. This paper introduces a procedure for predictively analyzing human error potentials when maintenance personnel perform T and M tasks based on a work procedure or their work plan. This procedure helps the plant maintenance team prepare for plausible human errors. The procedure focuses on the recurrent error forms (or modes) in execution-based errors, such as wrong object, omission, too little, and wrong action.

  12. Abdominoplasty: Risk Factors, Complication Rates, and Safety of Combined Procedures.

    Science.gov (United States)

    Winocour, Julian; Gupta, Varun; Ramirez, J Roberto; Shack, R Bruce; Grotting, James C; Higdon, K Kye

    2015-11-01

    Among aesthetic surgery procedures, abdominoplasty is associated with a higher complication rate, but previous studies are limited by small sample sizes or single-institution experience. A cohort of patients who underwent abdominoplasty between 2008 and 2013 was identified from the CosmetAssure database. Major complications were recorded. Univariate and multivariate analyses were performed evaluating risk factors, including age, smoking, body mass index, sex, diabetes, type of surgical facility, and combined procedures. The authors identified 25,478 abdominoplasties from 183,914 procedures in the database. Of these, 8,975 patients had abdominoplasty alone and 16,503 underwent additional procedures. The number of complications recorded was 1,012 (4.0 percent overall rate versus 1.4 percent in other aesthetic surgery procedures). Of these, 31.5 percent were hematomas, 27.2 percent were infections, and 20.2 percent were suspected or confirmed venous thromboembolism. On multivariate analysis, significant risk factors included combined procedures (1.5) and performance of the procedure in a hospital or surgical center versus an office-based surgical suite (1.6). Combined procedures increased the risk of complication (abdominoplasty alone, 3.1 percent; with liposuction, 3.8 percent; breast procedure, 4.3 percent; liposuction and breast procedure, 4.6 percent; body-contouring procedure, 6.8 percent; liposuction and body-contouring procedure, 10.4 percent). Abdominoplasty is associated with a higher complication rate compared with other aesthetic procedures. Combined procedures can significantly increase complication rates and should be considered carefully in higher risk patients. Risk, II.

  13. Factors Affecting Patient Satisfaction During Endoscopic Procedures

    International Nuclear Information System (INIS)

    Qureshi, M. O.; Shafqat, F.; Ahmed, S.; Niazi, T. K.; Khokhar, N. K.

    2013-01-01

    Objective: To assess the quality and patient satisfaction in the Endoscopy Unit of Shifa International Hospital. Study Design: Cross-sectional survey. Place and Duration of Study: Division of Gastroenterology, Shifa International Hospital, Islamabad, Pakistan, from July 2011 to January 2012. Methodology: Quality and patient satisfaction after the endoscopic procedure was assessed using a modified GHAA-9 questionnaire. Data was analyzed using SPSS version 16. Results: A total of 1028 patients were included with a mean age of 45 ± 14.21 years. Out of all the procedures, 670 (65.17%) were gastroscopies, 181 (17.60%) were flexible sigmoidoscopies and 177 (17.21%) were colonoscopies. The maximum unsatisfactory responses were on the waiting time before the procedure (13.13%), followed by unsatisfactory explanation of the procedure and answers to questions (7.58%). Overall, unsatisfied impression was 4.86%. The problem rate was 6.22%. Conclusion: The quality of procedures and level of satisfaction of patients undergoing a gastroscopy or colonoscopy was generally good. The factors that influence the satisfaction of these patients are related to communication between doctor and patient, doctor's manner and waiting time for the procedure. Feedback information in an endoscopy unit may be useful in improving standards, including the performance of endoscopists. (author)

  14. A procedure for effective Dancoff factor calculation

    International Nuclear Information System (INIS)

    Milosevic, M.

    2001-01-01

    In this paper, a procedure for Dancoff factor calculation based on the equivalence principle, and its application in the SCALE-4.3 code system, is described. The procedure is founded on the principle of conservation of neutron absorption for the resolved resonance range in a heterogeneous medium and in an equivalent medium consisting of an infinite array of two-region pin cells, where the presence of other fuel rods is taken into account through a Dancoff factor. The neutron absorption in both media is obtained using a fine-group elastic slowing-down calculation. This procedure is implemented in a design-oriented lattice physics code, which is applicable to any geometry where the method of collision probability can be applied to obtain a flux solution. The proposed procedure was benchmarked against a recent exercise that represents a system with fuel double heterogeneity, i.e., fuel in solid form (pellets) surrounded by fissile material in solution, and against a 5x5 irregular pressurised water reactor assembly, which requires different Dancoff factors. (author)

  15. Risk factors for unplanned readmission within 30 days after pediatric neurosurgery: a nationwide analysis of 9799 procedures from the American College of Surgeons National Surgical Quality Improvement Program

    Science.gov (United States)

    Sherrod, Brandon A.; Johnston, James M.; Rocque, Brandon G.

    2017-01-01

    Objective Readmission rate is increasingly used as a quality outcome measure after surgery. The purpose of this study was to establish, using a national database, the baseline readmission rates and risk factors for readmission after pediatric neurosurgical procedures. Methods The American College of Surgeons National Surgical Quality Improvement Program–Pediatric database was queried for pediatric patients treated by a neurosurgeon from 2012 to 2013. Procedures were categorized by current procedural terminology code. Patient demographics, comorbidities, preoperative laboratory values, operative variables, and postoperative complications were analyzed via univariate and multivariate techniques to find associations with unplanned readmission within 30 days of the primary procedure. Results A total of 9799 cases met the inclusion criteria, 1098 (11.2%) of which had an unplanned readmission within 30 days. Readmission occurred 14.0 ± 7.7 days postoperatively (mean ± standard deviation). The 4 procedures with the highest unplanned readmission rates were CSF shunt revision (17.3%), repair of myelomeningocele > 5 cm in diameter (15.4%), CSF shunt creation (14.1%), and craniectomy for infratentorial tumor excision (13.9%). Spine (6.5%), craniotomy for craniosynostosis (2.1%), and skin lesion (1.0%) procedures had the lowest unplanned readmission rates. On multivariate regression analysis, the odds of readmission were greatest in patients experiencing postoperative surgical site infection (SSI; deep, organ/space, superficial SSI and wound disruption: OR > 12 and p readmission risk. Independent patient risk factors for unplanned readmission included Native American race (OR 2.363, p = 0.019), steroid use > 10 days (OR 1.411, p = 0.010), oxygen supplementation (OR 1.645, p = 0.010), nutritional support (OR 1.403, p = 0.009), seizure disorder (OR 1.250, p = 0.021), and longer operative time (per hour increase, OR 1.059, p = 0.014). Conclusions This study may aid in

  16. Procedures monitoring and MAAP analysis

    International Nuclear Information System (INIS)

    May, R.S.

    1991-01-01

    Numerous studies of severe accidents in light water reactors have shown that operator response can play a crucial role in the predicted outcomes of dominant accident scenarios. MAAP provides the capability to specify certain operator actions as input data. However, making reasonable assumptions about the nature and timing of operator response requires substantial knowledge about plant practices and procedures and what they imply for the event being analyzed. The appearance of knowledge-based software technology in the mid-1980s provided a natural format for representing and maintaining procedures as IF-THEN rules. The boiling water reactor (BWR) Emergency Operating Procedures Tracking System (EOPTS) was composed of a rule base of procedures and a dedicated inference engine (problem-solver). Based on the general approach and experience of EOPTS, the authors have developed a prototype procedures-monitoring system that reads MAAP transient output files and evaluates the EOP messages and instructions that would be implied during each transient time interval. The prototype system was built using the NEXPERT OBJECT expert system development tool, running on a 386-class personal computer with 4 MB of memory. The limited-scope prototype includes a reduced set of BWR6 EOPs, procedure evaluation on a coarse time interval, a simple text-based user interface, and a summary-report generator. The prototype, which is limited to batch-mode analysis of MAAP output, is intended to demonstrate the concept and aid in the design of a production system, which will involve a direct link to MAAP and interactive capabilities.
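The IF-THEN rule evaluation described above can be sketched as a toy loop over plant-state snapshots; the parameter names, thresholds, and EOP messages below are invented for illustration and are not taken from the actual EOPTS rule base:

```python
# Each rule pairs a condition on the plant state with the EOP message it implies.
rules = [
    (lambda s: s["rpv_level_m"] < 0.5,   "Enter RC/L: restore RPV water level"),
    (lambda s: s["drywell_p_kpa"] > 240, "Enter PC/P: control containment pressure"),
]

def evaluate(state):
    """Return the EOP messages implied by one transient time interval."""
    return [msg for cond, msg in rules if cond(state)]

# One snapshot, as might be read from a MAAP transient output file.
print(evaluate({"rpv_level_m": 0.2, "drywell_p_kpa": 150}))
# -> ['Enter RC/L: restore RPV water level']
```

A production system of the kind the paper envisions would fire such rules on each time interval of the MAAP output and summarize which procedure entries and instructions apply when.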

  17. Risk factors for unplanned readmission within 30 days after pediatric neurosurgery: a nationwide analysis of 9799 procedures from the American College of Surgeons National Surgical Quality Improvement Program.

    Science.gov (United States)

    Sherrod, Brandon A; Johnston, James M; Rocque, Brandon G

    2016-09-01

    OBJECTIVE Hospital readmission rate is increasingly used as a quality outcome measure after surgery. The purpose of this study was to establish, using a national database, the baseline readmission rates and risk factors for patient readmission after pediatric neurosurgical procedures. METHODS The American College of Surgeons National Surgical Quality Improvement Program-Pediatric database was queried for pediatric patients treated by a neurosurgeon between 2012 and 2013. Procedures were categorized by current procedural terminology (CPT) code. Patient demographics, comorbidities, preoperative laboratory values, operative variables, and postoperative complications were analyzed via univariate and multivariate techniques to find associations with unplanned readmissions within 30 days of the primary procedure. RESULTS A total of 9799 cases met the inclusion criteria, 1098 (11.2%) of which had an unplanned readmission within 30 days. Readmission occurred 14.0 ± 7.7 days postoperatively (mean ± standard deviation). The 4 procedures with the highest unplanned readmission rates were CSF shunt revision (17.3%; CPT codes 62225 and 62230), repair of myelomeningocele > 5 cm in diameter (15.4%), CSF shunt creation (14.1%), and craniectomy for infratentorial tumor excision (13.9%). The lowest unplanned readmission rates were for spine (6.5%), craniotomy for craniosynostosis (2.1%), and skin lesion (1.0%) procedures. On multivariate regression analysis, the odds of readmission were greatest in patients experiencing postoperative surgical site infection (SSI; deep, organ/space, superficial SSI, and wound disruption: OR > 12 and p 10 days (OR 1.411, p = 0.010), oxygen supplementation (OR 1.645, p = 0.010), nutritional support (OR 1.403, p = 0.009), seizure disorder (OR 1.250, p = 0.021), and longer operative time (per hour increase, OR 1.059, p = 0.029). CONCLUSIONS This study may aid in identifying patients at risk for unplanned readmission following pediatric neurosurgery
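The odds ratios reported in this record come from logistic regression, where the OR for a predictor is the exponential of its coefficient. A minimal sketch; the coefficient below is chosen to reproduce the reported per-hour operative-time OR of 1.059 and is illustrative, not a fitted value from the database:

```python
import math

def odds_ratio(beta):
    """OR = exp(beta) for a one-unit increase in the predictor."""
    return math.exp(beta)

beta_optime = 0.0573   # hypothetical log-odds increase per hour of operative time
print(round(odds_ratio(beta_optime), 3))   # -> 1.059
```

Reading it the other way, an OR of 1.645 for oxygen supplementation corresponds to a coefficient of ln(1.645), i.e. about a 64.5% increase in the odds of readmission holding the other predictors fixed.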

  18. Groin hematoma after electrophysiological procedures-incidence and predisposing factors.

    Science.gov (United States)

    Dalsgaard, Anja Borgen; Jakobsen, Christina Spåbæk; Riahi, Sam; Hjortshøj, Søren

    2014-10-01

    We evaluated the incidence and predisposing factors of groin hematomas after electrophysiological (EP) procedures. This was a prospective, observational study enrolling consecutive patients after EP procedures (atrial fibrillation: n = 151; supraventricular tachycardia/diagnostic EP: n = 82; ventricular tachycardia: n = 18). Patients underwent manual compression for 10 min followed by 3 h of post-procedural bed rest. AF ablations were performed with INR 2-3, ACT > 300, and no protamine sulfate. Adhesive pressure dressings (APDs) were used if sheath size ≥ 10F, procedural time > 120 min, or BMI > 30. Patient-reported hematomas were recorded by a telephone follow-up after 2 weeks. Hematoma developed immediately in 26 patients (10%), and after 14 days significant hematoma was reported in 68 patients (27%). On regression analysis, sex, age, BMI 25, ACT 300, use of APD, sheath size and number, and complicated venous access were not associated with hematoma, either immediately after the procedure or after 14 days. Any hematoma presenting immediately after the procedure was associated with patient-reported hematoma after 14 days, odds ratio 18.7 (95% CI: 5.00-69.8; P hematoma immediately after EP procedures was the sole predictor of patient-reported hematoma after 2 weeks. Initiatives to prevent groin hematoma should focus on the procedure itself as well as on post-procedural care.

  19. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of the importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates which products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods that is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study.

  20. Procedures as a Contributing Factor to Events in the Swedish Nuclear Power Plants. Analysis of a Database with Licensee Event Reports 1995-1999

    International Nuclear Information System (INIS)

    Bento, Jean-Pierre

    2002-12-01

    The operating experience from the twelve Swedish nuclear power units has been reviewed for the years 1995 - 1999 with respect to events - both scrams and Licensee Event Reports (LERs) - to which deficient procedures have been a contributing cause. In the present context, 'procedure' is defined as all written documentation used for the planning, performance, and control of the tasks necessary for the operation and maintenance of the plants. The study used an MTO database (Man - Technology - Organisation) containing, for the five years studied, 42 MTO-related scrams out of the 87 scrams that occurred, and about 800 MTO-related LERs out of 2000 reported LERs. On average, deficient procedures contribute to approximately 0.2 scrams/unit/year and to slightly more than three LERs/unit/year. Presented differently, procedure-related scrams amount to 15% of the total number of scrams and to 31% of the MTO-related scrams. Similarly, procedure-related LERs amount to 10% of the total number of LERs and to 25% of the MTO-related LERs. For the most frequent work types performed at the plants, procedure-related LERs are - in decreasing order - associated with tasks performed during maintenance, modification, testing, and operation. However, for the latest year studied, almost as many procedure-related LERs are associated with modification tasks as with the three other work types together. A further analysis indicates that 'deficient procedure content' is, by far, the dominating underlying cause contributing to procedure-related scrams and LERs. The study also discusses the coupling between procedure-related scrams/LERs, power operation and refuelling outages, and Common Cause Failures (CCF). An overall conclusion is that procedure-related events in the Swedish nuclear power plants do not, on a national scale, represent an alarming issue. Significant and sustained efforts have been and are made at most units to improve the quality of procedures. However, a few units exhibit a noticeable

  1. Procedure for measurement of anisotropy factor for neutron sources

    International Nuclear Information System (INIS)

    Creazolla, Prycylla Gomes

    2017-01-01

    Radioisotope neutron sources allow the production of reference fields for the calibration of neutron detectors for radiation protection and analysis purposes. When the emission rate of these sources is isotropic, no correction is necessary. However, variations in the source encapsulation and in the radioactive material concentration produce differences in the neutron emission rate relative to the source axis; this effect is called anisotropy. This study describes a procedure for measuring the anisotropy factor of neutron sources, performed in the Laboratório de Metrologia de Neutrons (LN) using a Precision Long Counter (PLC) detector. A measurement procedure that takes the anisotropy factor of neutron sources into account helps to resolve some issues, particularly the high uncertainties associated with neutron dosimetry. A bibliographical review was carried out based on international standards and technical regulations specific to the area of neutron fields, and its recommendations were then reproduced in practice in the LN procedure for measuring the anisotropy factor of neutron sources. The anisotropy factor is determined as a function of the angle of 90° relative to the cylindrical axis of the source. This angle is the most important because it is widely used in measurements and has a higher neutron emission rate than other angles. (author)
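An anisotropy factor of the kind measured above is the emission rate observed at a given angle divided by the angle-averaged rate; a minimal sketch with invented count rates (not LN measurement data):

```python
# Hypothetical neutron emission rates (s^-1) measured at four angles
# around the cylindrical axis of the source.
rates = {0: 980.0, 90: 1050.0, 180: 990.0, 270: 1020.0}

mean_rate = sum(rates.values()) / len(rates)   # angle-averaged emission rate
F_90 = rates[90] / mean_rate                   # anisotropy factor at 90 degrees
print(round(F_90, 4))                          # -> 1.0396
```

A factor above 1 at 90° is consistent with the text's observation that the emission rate perpendicular to the source axis exceeds that at other angles.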

  2. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi

  3. Step Complexity Measure for Emergency Operating Procedures - Determining Weighting Factors

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Kim, Jaewhan; Ha, Jaejoo

    2003-01-01

    weighting factors are determined by using a nonlinear regression analysis. The results show that the SC scores quantified with the new weighting factors have a statistically meaningful correlation with averaged step performance time data. Thus, it can be concluded that the SC measure can represent the complexity of the procedural steps included in EOPs
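
    As a rough illustration of the fitting step (not the authors' actual SC model or data), weights for a complexity score that is linear in its submeasures can be estimated by least squares against averaged step performance times; a genuinely nonlinear model would instead be fitted iteratively:

```python
import numpy as np

# Hypothetical submeasures for four procedural steps (columns could be,
# e.g., step length and step logic complexity) and averaged performance
# times per step; all numbers are illustrative.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 3.0],
              [4.0, 2.0]])
t = np.array([5.1, 4.2, 9.0, 8.1])

w, *_ = np.linalg.lstsq(X, t, rcond=None)  # estimated weighting factors
sc = X @ w                                 # SC score per step
r = np.corrcoef(sc, t)[0, 1]               # correlation with time data
print(w.round(2), round(r, 3))
```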

  4. Current outcomes and risk factors for the Norwood procedure.

    Science.gov (United States)

    Stasik, Chad N; Gelehrter, S; Goldberg, Caren S; Bove, Edward L; Devaney, Eric J; Ohye, Richard G

    2006-02-01

    Tremendous strides have been made in the outcomes for hypoplastic left heart syndrome and other functional single-ventricle malformations over the past 25 years. This progress relates primarily to improvements in survival for patients undergoing the Norwood procedure. Previous reports on risk factors have been based on smaller groups of patients or on data collected over relatively long periods of time, during which management has evolved. We analyzed our current results for the Norwood procedure with attention to risk factors for poor outcome. A single-institution review of all patients undergoing a Norwood procedure for a single-ventricle malformation from May 1, 2001, through April 30, 2003, was performed. Patient demographics, anatomy, clinical condition, associated anomalies, operative details, and outcomes were recorded. Of the 111 patients, there were 23 (21%) hospital deaths. Univariate analysis revealed noncardiac abnormalities (genetic or significant extracardiac diagnosis, P = .0018), gestational age (P = .03), diagnosis of unbalanced atrioventricular septal defect (P = .017), and weight of less than 2.5 kg (P = .0072) to be related to hospital death. On multivariate analysis, only weight of less than 2.5 kg and noncardiac abnormalities were found to be independent risk factors. Patients with either of these characteristics had a hospital survival of 52% (12/23), whereas those at standard risk had a survival of 86% (76/88). Although improvements in management might have lessened the effect of some of the traditionally reported risk factors related to variations in the cardiovascular anatomy, noncardiac abnormalities and low birth weight remain a future challenge for the physician caring for the patient with single-ventricle physiology.
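
    The survival figures quoted above imply a roughly 3.5-fold relative risk of hospital death for the high-risk group, which can be checked directly from the abstract's own counts:

```python
# Hospital deaths implied by the survival counts quoted above.
deaths_high, n_high = 23 - 12, 23     # high-risk: 52% survival (12/23)
deaths_std, n_std = 88 - 76, 88       # standard risk: 86% survival (76/88)

risk_high = deaths_high / n_high      # ~0.48
risk_std = deaths_std / n_std         # ~0.14
print(round(risk_high / risk_std, 2)) # relative risk of death ~3.51
```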

  5. Procedures for measurement of anisotropy factor of neutron sources

    International Nuclear Information System (INIS)

    Creazolla, P.G.; Camargo, A.; Astuto, A.; Silva, F.; Pereira, W.W.

    2017-01-01

    Radioisotope sources of neutrons allow the production of reference fields for the calibration of neutron measurement devices for radioprotection and analysis purposes. When the emission rate of these sources is isotropic, no correction is necessary. However, variations in the source capsule material and in the concentration of the emitting material may produce differences in the neutron emission rate relative to the source axis; this effect is called anisotropy. A procedure for measuring the anisotropy factor of the sources belonging to the IRD/LNMRI/LN Neutron Metrology Laboratory, using a Precision Long Counter (PLC) detector, is presented

  6. Factors affecting the design of instrument flight procedures

    Directory of Open Access Journals (Sweden)

    Ivan FERENCZ

    2008-01-01

    Full Text Available The article highlights the factors that might affect the design of instrument flight procedures. An Ishikawa diagram is used to sort the individual factors into classes: People, Methods, Regulations, Tools, Data and Environment.

  7. Gastroesophageal reflux disease after peroral endoscopic myotomy: Analysis of clinical, procedural and functional factors, associated with gastroesophageal reflux disease and esophagitis.

    Science.gov (United States)

    Familiari, Pietro; Greco, Santi; Gigante, Giovanni; Calì, Anna; Boškoski, Ivo; Onder, Graziano; Perri, Vincenzo; Costamagna, Guido

    2016-01-01

    Peroral endoscopic myotomy (POEM) does not include any antireflux procedure, resulting in a certain risk of iatrogenic gastroesophageal reflux disease (GERD). The aim of the present study was to evaluate the incidence of iatrogenic GERD after POEM and to identify preoperative, perioperative and postoperative factors associated with GERD. All patients treated at a single center who had a complete GERD evaluation after POEM were included in the study. Demographics, preoperative and follow-up data, results of functional studies and procedural data were collected and analyzed. A total of 103 patients (mean age 46.6 years, 47 males) were included. Postoperative altered esophageal acid exposure was found in 52 patients (50.5%). A total of 19 patients (18.4%) had heartburn and 21 (20.4%) had esophagitis. Overall, clinically relevant GERD (altered esophageal acid exposure associated with heartburn and/or esophagitis) was diagnosed in 30 patients (29.1%). The severity of esophageal acid exposure correlated with heartburn and esophagitis after POEM. Patients with heartburn had a lower postoperative 4-second integrated relaxation pressure than patients without symptoms (7.6 ± 3.8 mmHg vs 10.01 ± 4.4 mmHg, p<0.05). No correlations were identified with patient sex, age, postoperative body mass index, esophageal shape (sigmoid vs non-sigmoid), lower esophageal sphincter pressure, length of myotomy, previous therapies or type of achalasia at high-resolution manometry. Preoperative, perioperative and postoperative factors correlated only minimally with GERD after POEM. Clinically relevant GERD was identified in less than one-third of patients, and all patients were well controlled with medical therapy. © 2015 The Authors Digestive Endoscopy © 2015 Japan Gastroenterological Endoscopy Society.

  8. Factorization Procedure for Harmonically Bound Brownian Particle

    International Nuclear Information System (INIS)

    Omolo, JK.

    2006-01-01

    The method of factorization was applied to solve the problem of the one-dimensional harmonically bound Brownian particle. Assuming that the rapidly fluctuating random force is Gaussian and has an infinitely short correlation time, explicit expressions were obtained for the position-position, velocity-velocity and position-velocity correlation functions, which were also used to write down the appropriate distribution functions. The correlation and distribution functions for the complex quantity (the amplitude) that provides the expressions for the position and velocity of the particle are calculated. Finally, Fokker-Planck equations are obtained for the joint probability distribution functions of the amplitude and its complex conjugate, as well as of the position and velocity of the particle. (author)
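
    The model described can be simulated directly. Below is a hedged Euler-Maruyama sketch of the harmonically bound Brownian particle (our parameter choices, with m = k_BT = 1, not the paper's notation), checking that the stationary position variance approaches the equipartition value k_BT/(m ω²) = 1:

```python
import numpy as np

# Langevin dynamics: dv = (-gamma*v - omega**2 * x) dt + sqrt(2*gamma) dW,
# with a Gaussian, delta-correlated random force and m = k_B*T = 1.
rng = np.random.default_rng(0)
gamma, omega, dt, n = 1.0, 1.0, 0.01, 400_000
noise = np.sqrt(2.0 * gamma * dt) * rng.standard_normal(n)

x, v = 0.0, 0.0
xs = np.empty(n)
for i in range(n):
    v += (-gamma * v - omega ** 2 * x) * dt + noise[i]
    x += v * dt
    xs[i] = x

# Equipartition predicts Var(x) -> 1; the estimate carries a few percent
# of statistical and discretization error.
print(round(xs[n // 2:].var(), 2))
```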

  9. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

    NREL develops laboratory analytical procedures (LAPs) for the analysis of raw and upgraded pyrolysis bio-oils. These standard procedures have been validated and allow for reliable bio-oil analysis.

  10. Safeguards Network Analysis Procedure (SNAP): overview

    International Nuclear Information System (INIS)

    Chapman, L.D.; Engi, D.

    1979-08-01

    Nuclear safeguards systems provide physical protection and control of nuclear materials. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard analysis methodology for the evaluation of physical protection system effectiveness. This is achieved through a standard set of symbols which characterize the various elements of safeguards systems and an analysis program to execute simulation models built using the SNAP symbology. The outputs provided by the SNAP simulation program supplement the safeguards analyst's evaluative capabilities and support the evaluation of existing sites as well as alternative design possibilities. This paper describes the SNAP modeling technique and provides an example illustrating its use

  11. Building America House Performance Analysis Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Farrar-Nagy, S.; Anderson, R.; Judkoff, R.

    2001-10-29

    As the Building America Program has grown to include a large and diverse cross section of the home building industry, accurate and consistent analysis techniques have become more important to help all program partners as they perform design tradeoffs and calculate energy savings for prototype houses built as part of the program. This document illustrates some of the analysis concepts proven effective and reliable for analyzing the transient energy usage of advanced energy systems as well as entire houses. The analysis procedure described here provides a starting point for calculating the energy savings of a prototype house relative to two base cases: builder standard practice and regional standard practice. It also provides a building simulation analysis to calculate annual energy savings based on side-by-side short-term field testing of a prototype house.

  12. Factor analysis and scintigraphy

    International Nuclear Information System (INIS)

    Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.

    1976-01-01

    The goal of factor analysis is usually to achieve a reduction of a large set of data, extracting essential features without a previous hypothesis. Owing to the development of computerized systems, the use of larger samples, the possibility of sequential data acquisition and the increase in dynamic studies, the problem of data compression is now encountered in routine work. Thus, results obtained for the compression of scintigraphic images are presented first. Then the possibilities offered by factor analysis for scan processing are discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing [fr]

  13. Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.

    Science.gov (United States)

    Stankov, L

    1979-07-01

    The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.

  14. Aesthetic Surgical Procedures in Men: Major Complications and Associated Risk Factors.

    Science.gov (United States)

    Kaoutzanis, Christodoulos; Winocour, Julian; Yeslev, Max; Gupta, Varun; Asokan, Ishan; Roostaeian, Jason; Grotting, James C; Higdon, K Kye

    2018-03-14

    The number of men undergoing cosmetic surgery is increasing in North America. To determine the incidence and risk factors of major complications in males undergoing cosmetic surgery, compare the complication profiles between men and women, and identify specific procedures that are associated with higher risk of complications in males. A prospective cohort of patients undergoing cosmetic surgery between 2008 and 2013 was identified from the CosmetAssure database. Gender-specific procedures were excluded. The primary outcome was the occurrence of a major complication in males requiring an emergency room visit, hospital admission, or reoperation within 30 days of the index operation. Univariate and multivariate analysis evaluated potential risk factors for major complications including age, body mass index (BMI), smoking, diabetes, type of surgical facility, type of procedure, and combined procedures. Of the 129,007 patients, 54,927 underwent gender-nonspecific procedures, of which 5801 (10.6%) were males. Women showed a higher mean age (46.4 ± 14.1 vs 45.2 ± 16.7 years, P ...) ... procedures (RR 3.47), and combined procedures (RR 2.56). Aesthetic surgery in men is safe with low major complication rates. Modifiable predictors of complications included BMI and combined procedures.

  15. Standard Procedure for Grid Interaction Analysis

    International Nuclear Information System (INIS)

    Svensson, Bertil; Lindahl, Sture; Karlsson, Daniel; Joensson, Jonas; Heyman, Fredrik

    2015-01-01

    Grid events, simultaneously affecting all safety-related auxiliary systems in a nuclear power plant, are critical and must be carefully addressed in the design, upgrading and operational processes. Up to now, the connecting grid has often been treated as either fully available or totally unavailable, and too little attention has been paid to specifying the grid performance criteria. This paper deals with standard procedures for grid interaction analysis, deriving tools and criteria to handle grid events that challenge the safety systems of the plant. Critical external power system events are investigated and characterised with respect to severity and rate of occurrence. These critical events are then grouped with respect to their impact on the safety systems when a disturbance propagates into the plant. It is then important to make sure that 1) the impact of the disturbance will never reach any critical system, 2) the impact of the disturbance will be eliminated before it can harm any critical system, or 3) the critical systems are proven to be designed in such a way that they can withstand the impact of the disturbance, and that the associated control and protection systems can withstand the voltage and frequency transients associated with the disturbances. A number of representative disturbance profiles, reflecting connecting grid conditions, are therefore derived, to be used for equipment testing. (authors)

  16. Factor Analysis Using "R"

    Directory of Open Access Journals (Sweden)

    A. Alexander Beaujean

    2013-02-01

    Full Text Available R (R Development Core Team, 2011) is a very powerful tool for analyzing data that is gaining in popularity due to its cost (it is free) and flexibility (it is open source). This article gives a general introduction to using R (i.e., loading the program, using functions, importing data). Then, using data from Canivez, Konold, Collins, and Wilson (2009), this article walks the user through how to use the program to conduct factor analysis, from both an exploratory and a confirmatory approach.
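
    The article's walkthrough is in R; as a rough Python analogue (scikit-learn's FactorAnalysis as a stand-in for the R functions, and synthetic data in place of the Canivez et al. dataset), the exploratory step looks like:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Six observed variables generated from two latent factors plus noise.
rng = np.random.default_rng(1)
latent = rng.standard_normal((500, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.2],
                     [0.1, 0.9], [0.0, 0.8], [0.2, 0.7]])
X = latent @ loadings.T + 0.3 * rng.standard_normal((500, 6))

fa = FactorAnalysis(n_components=2).fit(X)
print(fa.components_.shape)   # (2, 6): one row of estimated loadings per factor
```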

  17. Centre characteristics and procedure-related factors have an impact on outcomes of allogeneic transplantation for patients with CLL: a retrospective analysis from the European Society for Blood and Marrow Transplantation (EBMT)

    NARCIS (Netherlands)

    Schetelig, J.; Wreede, L.C. de; Andersen, N.S.; Moreno, C.; Gelder, M. van; Vitek, A.; Karas, M.; Michallet, M.; Machaczka, M.; Gramatzki, M.; Beelen, D.; Finke, J.; Delgado, J.; Volin, L.; Passweg, J.; Dreger, P.; Schaap, N.P.; Wagner, E.; Henseler, A.; Biezen, A. van; Bornhauser, M.; Iacobelli, S.; Putter, H.; Schonland, S.O.; Kroger, N.

    2017-01-01

    The best approach for allogeneic haematopoietic stem cell transplantations (alloHCT) in patients with chronic lymphocytic leukaemia (CLL) is unknown. We therefore analysed the impact of procedure- and centre-related factors on 5-year event-free survival (EFS) in a large retrospective study. Data of

  18. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSIS – APPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

    and drug. This study confirmed the importance of using a valid analytical procedure for the purpose of carvedilol population pharmacokinetic analysis. Identification of the demographic, pathophysiological and other factors that may influence the population carvedilol PK parameters gives the physician a more comprehensive overview of the patient and a better optimization of the therapeutic regimen.

  19. Risk factors for postoperative urinary tract infection following midurethral sling procedures.

    Science.gov (United States)

    Doganay, Melike; Cavkaytar, Sabri; Kokanali, Mahmut Kuntay; Ozer, Irfan; Aksakal, Orhan Seyfi; Erkaya, Salim

    2017-04-01

    To identify the potential risk factors for urinary tract infections following midurethral sling procedures, 556 women who underwent a midurethral sling procedure for stress urinary incontinence over a four-year period were reviewed in this retrospective study. Of the study population, 280 women underwent TVT procedures and 276 underwent TOT procedures. Patients were evaluated at 4-8 weeks postoperatively and investigated for the occurrence of a urinary tract infection. Patients who experienced a urinary tract infection were defined as cases, and patients who did not were defined as controls. All data were collected from medical records. A multivariate logistic regression model was used to identify the risk factors for urinary tract infection. Of the 556 women, 58 (10.4%) were defined as cases and 498 (89.6%) as controls. The mean age of the cases (57.8±12.9 years) was significantly greater than that of the controls (51.8±11.2 years) (p<0.05). Preoperative urinary tract infection, concomitant vaginal hysterectomy and cystocele repair, the TVT procedure and a postoperative postvoiding residual bladder volume ≥100 ml were more common in cases than in controls. However, in the multivariate regression model, the presence of preoperative urinary tract infection [OR (95% CI)=0.1 (0.1-0.7); p=0.013], the TVT procedure [OR (95% CI)=8.4 (3.1-22.3); p=0.000] and a postoperative postvoiding residual bladder volume ≥100 ml [OR (95% CI)=4.6 (1.1-19.2); p=0.036] were significant independent risk factors for urinary tract infection following midurethral slings CONCLUSION: Urinary tract infection after midurethral sling procedures is a relatively common complication. The presence of preoperative urinary tract infection, the TVT procedure and a postoperative postvoiding residual bladder volume ≥100 ml may increase the risk of this complication. Identification of these factors could help surgeons to minimize this complication by developing effective strategies. Copyright © 2017. Published by Elsevier B.V.
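
    For a single binary factor, the unadjusted odds ratio behind such a model can be computed from a 2x2 table. The counts below are hypothetical (chosen only to match the cohort sizes reported: 58 cases, 498 controls, 280 TVT, 276 TOT); the study's own ORs are multivariate-adjusted and will differ:

```python
import math

#                 cases (UTI)   controls
tvt = (44, 236)   # hypothetical split of the 280 TVT patients
tot = (14, 262)   # hypothetical split of the 276 TOT patients

odds_ratio = (tvt[0] / tvt[1]) / (tot[0] / tot[1])
# 95% confidence interval on the log scale (Woolf's method)
se = math.sqrt(sum(1.0 / c for c in tvt + tot))
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```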

  20. Summative Mass Analysis of Algal Biomass - Integration of Analytical Procedures: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Laurens, Lieve M. L.

    2016-01-13

    This procedure guides the integration of laboratory analytical procedures to measure algal biomass constituents in an unambiguous manner and ultimately achieve mass balance closure for algal biomass samples. Many of these methods build on years of research in algal biomass analysis.

  1. UNC Nuclear Industries' human-factored approach to the operating or maintenance procedure

    International Nuclear Information System (INIS)

    Nelson, A.A.; Clark, J.E.

    1982-01-01

    The development of Human Factors Engineering (HFE) and UNC Nuclear Industries' (UNC) commitment to minimizing the potential for human error in the performance of operating or maintenance procedures have led to a procedure upgrade program. Human-factored procedures were developed using information from many sources including, but not limited to, operators, a human factors specialist, engineers and supervisors. This has resulted in the Job Performance Aid (JPA). This paper presents UNC's approach to providing human-factored operating and maintenance procedures

  2. Solid-phase extraction procedures in systematic toxicological analysis

    NARCIS (Netherlands)

    Franke, J.P.; de Zeeuw, R.A

    1998-01-01

    In systematic toxicological analysis (STA) the substance(s) present is (are) not known at the start of the analysis. In such an undirected search the extraction procedure cannot be directed to a given substance but must be a general procedure where a compromise must be reached in that the substances

  3. Factors affecting construction performance: exploratory factor analysis

    Science.gov (United States)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) was applied to the collected data, which gave rise to 10 factors with 57 items affecting construction performance. The findings further reveal the items constituting ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for an effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental and technological aspects. It is important to understand a multidimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that the management level can effectively plan and implement a performance development plan that matches the mission and vision of the company.
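
    The retention step in such an EFA can be sketched numerically: with the Kaiser criterion, one keeps factors whose correlation-matrix eigenvalues exceed 1. The data below are synthetic; the study's ten factors came from its own survey responses:

```python
import numpy as np

# Nine observed items driven by three latent constructs plus noise.
rng = np.random.default_rng(2)
latent = rng.standard_normal((200, 3))
X = np.hstack([latent + 0.5 * rng.standard_normal((200, 3)) for _ in range(3)])

corr = np.corrcoef(X, rowvar=False)       # 9x9 correlation matrix
eigvals = np.linalg.eigvalsh(corr)
print(int((eigvals > 1.0).sum()))         # Kaiser criterion retains 3 factors
```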

  4. Selective Sequential Zero-Base Budgeting Procedures Based on Total Factor Productivity Indicators

    OpenAIRE

    A. Ishikawa; E. F. Sudit

    1981-01-01

    The authors' purpose in this paper is to develop productivity-based sequential budgeting procedures designed to expedite identification of major problem areas in budgetary performance, as well as to reduce the costs associated with comprehensive zero-base analyses. The concept of total factor productivity is reviewed and its relations to ordinary and zero-based budgeting are discussed in detail. An outline for a selective sequential analysis based on monitoring of three key indicators of (a) i...

  5. Analysis procedure for americium in environmental samples

    International Nuclear Information System (INIS)

    Holloway, R.W.; Hayes, D.W.

    1982-01-01

    Several methods for the analysis of 241Am in environmental samples were evaluated and a preferred method was selected. This method was modified and used to determine the 241Am content in sediments, biota, and water. The advantages and limitations of the method are discussed. The method is also suitable for 244Cm analysis

  6. Building America Performance Analysis Procedures: Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    2004-06-01

    To measure progress toward multi-year research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques using test data to calibrate simulation models.

  7. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    International Nuclear Information System (INIS)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-01-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
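
    As a toy illustration of that idea (this mini-schema is invented for the example, not taken from the INL prototype), each step can be a basic element whose attributes drive the functionality the CBPS generates:

```python
import xml.etree.ElementTree as ET

# Hypothetical procedure fragment: the "type" attribute tells the CBPS
# whether to render an instruction, request a decision, or accept input.
step_xml = """
<procedure id="SOP-001">
  <step id="1" type="instruction">Verify pump P-1 is stopped.</step>
  <step id="2" type="decision" prompt="Is tank level below 40 percent?"/>
  <step id="3" type="input" unit="degC">Record coolant temperature.</step>
</procedure>
"""

root = ET.fromstring(step_xml)
for step in root.findall("step"):
    print(step.get("id"), step.get("type"))
```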

  8. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the

  9. Analysis of fetal dose in CT procedures

    International Nuclear Information System (INIS)

    Ortiz Torres, A.; Plazas, M. C.

    2006-01-01

    The miracle of life is a sublime gift, and it is consequently our responsibility to ensure its protection and care. In radiodiagnostic examinations today, mainly in computed axial tomography procedures, the fetus absorbs a considerable dose of radiation, and questions are frequently raised as to whether these doses carry a risk of malformations or whether interruption of the pregnancy is necessary. In most cases, a treatment with ionizing radiation that is beneficial for the mother is only indirectly so for the fetus, which is exposed to a risk. The possibility that a fetus or small child develops cancer caused by radiation can be three times that of the general population; hence the importance of analyzing the effects of prenatal irradiation and the main factors to consider when estimating the magnitude of the risk of exposure in utero, in the different circumstances in which it can occur in computed axial tomography procedures. (Author)

  10. Shakedown analysis by finite element incremental procedures

    International Nuclear Information System (INIS)

    Borkowski, A.; Kleiber, M.

    1979-01-01

    It is a common occurrence in many practical problems that external loads are variable and the exact time-dependent history of loading is unknown. Instead, the load is characterized by a given loading domain: a convex polyhedron in the n-dimensional space of load parameters. The problem is then to check whether a structure shakes down, i.e. responds elastically after a few elasto-plastic cycles, to a variable loading as defined above. Such a check can be performed by an incremental procedure. One should reproduce incrementally a simple cyclic process consisting of proportional load paths that connect the origin of the load space with the corners of the loading domain. It has been proved that if a structure shakes down under such a loading history, then it is able to adapt itself to an arbitrary load path contained in the loading domain. The main advantage of this approach is the possibility of using existing incremental finite-element computer codes. (orig.)

  11. Updating QR factorization procedure for solution of linear least squares problem with equality constraints.

    Science.gov (United States)

    Zeb, Salman; Yousaf, Muhammad

    2017-01-01

    In this article, we present a QR updating procedure as a solution approach for the linear least squares problem with equality constraints. We reduce the constrained problem to an unconstrained linear least squares problem and partition it into a small subproblem. The QR factorization of the subproblem is calculated, and updating techniques are then applied to its upper triangular factor R to obtain the solution. We carry out an error analysis of the proposed algorithm to show that it is backward stable. We also illustrate the implementation and accuracy of the proposed algorithm by providing some numerical experiments, with particular emphasis on dense problems.
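
    For comparison, the same kind of constrained problem can be attacked with the classical weighting trick: stack the equality constraints with a large weight and solve one ordinary least squares problem. This is a well-known workaround, not the authors' algorithm, which instead updates the upper triangular factor R of a reduced subproblem:

```python
import numpy as np

# min ||Ax - b||  subject to  Bx = d  (weighting-method sketch)
A = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
B = np.array([[1.0, 1.0]])      # constraint: x0 + x1 = 1
d = np.array([1.0])

w = 1e8                         # large weight enforces Bx ~= d
M = np.vstack([A, w * B])
rhs = np.concatenate([b, w * d])
x, *_ = np.linalg.lstsq(M, rhs, rcond=None)
print(x.round(4), float((B @ x)[0]))  # -> [0.4 0.6] with the constraint met
```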

  12. Video content analysis of surgical procedures.

    Science.gov (United States)

    Loukas, Constantinos

    2018-02-01

    In addition to its therapeutic benefits, minimally invasive surgery offers the potential for video recording of the operation. The videos may be archived and used later for purposes such as cognitive training, skills assessment, and workflow analysis. Methods from the broader field of video content analysis and representation are increasingly applied in the surgical domain. In this paper, we review recent developments and analyze future directions in the field of content-based video analysis of surgical operations. The reviewed literature was obtained from PubMed and Google Scholar searches on combinations of the following keywords: 'surgery', 'video', 'phase', 'task', 'skills', 'event', 'shot', 'analysis', 'retrieval', 'detection', 'classification', and 'recognition'. The collected articles were categorized and reviewed based on the technical goal sought, the type of surgery performed, and the structure of the operation. A total of 81 articles were included. Publication activity is constantly increasing; more than 50% of these articles were published in the last 3 years. Significant research has been performed on video task detection and retrieval in eye surgery. In endoscopic surgery, the research activity is more diverse: gesture/task classification, skills assessment, tool type recognition, and shot/event detection and retrieval. Recent works employ deep neural networks for phase and tool recognition as well as shot detection. Content-based video analysis of surgical operations is a rapidly expanding field. Several future prospects for research exist including, inter alia, shot boundary detection, keyframe extraction, video summarization, pattern discovery, and video annotation. The development of publicly available benchmark datasets to evaluate and compare task-specific algorithms is essential.

  13. Analysis and optimization of blood-testing procedures.

    NARCIS (Netherlands)

    Bar-Lev, S.K.; Boxma, O.J.; Perry, D.; Vastazos, L.P.

    2017-01-01

    This paper is devoted to the performance analysis and optimization of blood testing procedures. We present a queueing model of two queues in series, representing the two stages of a blood-testing procedure. Service (testing) in stage 1 is performed in batches, whereas it is done individually in

  14. System analysis procedures for conducting PSA of nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Yoon Hwan; Jeong, Won Dae; Kim, Tae Un; Kim, Kil You; Han, Sang Hoon; Chang, Seung Chul; Sung, Tae Yong; Yang, Jun Eon; Kang, Dae Il; Park, Jin Hee; Hwang, Mi Jeong; Jin, Young Ho.

    1997-03-01

    This document, the Probabilistic Safety Assessment (PSA) procedures guide for system analysis, is intended to provide guidelines for analyzing target systems consistently and technically in the performance of PSA for nuclear power plants (NPPs). The guide has been prepared in accordance with the procedures and techniques of fault tree analysis (FTA) used in system analysis. Normally, the main objective of system analysis is to assess the reliability of the systems modeled by event tree analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is a method for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to mechanical failures of the components, common cause failures (CCFs), human errors, and outages for testing and maintenance. After construction of the fault tree is completed, the system unavailability is calculated with the CUT module of KIRAP, and the qualitative and quantitative analysis is performed through the process stated above. The procedures for system analysis are based on the PSA procedures and methods that have been applied to the safety assessments of NPPs under construction in the country. Accordingly, the FTA method stated in this procedures guide will be applicable to PSA for NPPs to be constructed in the future. (author). 6 tabs., 11 figs., 7 refs
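As a generic illustration of the fault tree quantification described above (not the KIRAP CUT module itself; the gates and failure probabilities below are invented), a small AND/OR tree can be evaluated under the assumption of independent basic events:

```python
# Minimal fault tree evaluation sketch: gates are (type, children); basic
# events carry independent failure probabilities. Invented example values.
from math import prod

basic = {'pump_a': 0.01, 'pump_b': 0.01, 'valve': 0.005, 'operator': 0.02}

gates = {
    'pumps_fail': ('AND', ['pump_a', 'pump_b']),        # redundant pumps
    'flow_lost':  ('OR',  ['pumps_fail', 'valve']),
    'top':        ('OR',  ['flow_lost', 'operator']),   # system unavailable
}

def unavailability(node):
    """Recursively evaluate a node, assuming independent events."""
    if node in basic:
        return basic[node]
    kind, children = gates[node]
    ps = [unavailability(c) for c in children]
    if kind == 'AND':                        # all children must fail
        return prod(ps)
    return 1 - prod(1 - p for p in ps)       # OR: at least one child fails

print(round(unavailability('top'), 6))  # → 0.024998
```

A production tool such as KIRAP instead enumerates minimal cut sets, which also yields the qualitative (importance) results mentioned in the abstract; the direct evaluation above only gives the numeric unavailability.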

  15. System analysis procedures for conducting PSA of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Jeong, Won Dae; Kim, Tae Un; Kim, Kil You; Han, Sang Hoon; Chang, Seung Chul; Sung, Tae Yong; Yang, Jun Eon; Kang, Dae Il; Park, Jin Hee; Hwang, Mi Jeong; Jin, Young Ho

    1997-03-01

    This document, the Probabilistic Safety Assessment (PSA) procedures guide for system analysis, is intended to provide guidelines for analyzing target systems consistently and technically in the performance of PSA for nuclear power plants (NPPs). The guide has been prepared in accordance with the procedures and techniques of fault tree analysis (FTA) used in system analysis. Normally, the main objective of system analysis is to assess the reliability of the systems modeled by event tree analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is a method for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to the system unavailability. This should include contributions due to mechanical failures of the components, common cause failures (CCFs), human errors, and outages for testing and maintenance. After construction of the fault tree is completed, the system unavailability is calculated with the CUT module of KIRAP, and the qualitative and quantitative analysis is performed through the process stated above. The procedures for system analysis are based on the PSA procedures and methods that have been applied to the safety assessments of NPPs under construction in the country. Accordingly, the FTA method stated in this procedures guide will be applicable to PSA for NPPs to be constructed in the future. (author). 6 tabs., 11 figs., 7 refs.

  16. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jeong, Kwangsup; Jung, Wondea

    2005-01-01

    In complex systems such as nuclear and chemical plants, it is well known that the provision of understandable procedures that allow operators to clarify what needs to be done and how to do it is one of the requisites for securing safety. In a previous study on providing understandable procedures, the step complexity (SC) measure, which quantifies the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, the need for additional complexity factors that account for cognitive aspects in evaluating the complexity of procedural steps has been raised. To this end, this study compares operators' performance data, measured in the form of step performance times, with their behavior in carrying out the prescribed activities of procedural steps. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of the additional complexity factors, it is strongly believed that the change in operators' performance data can be explained more authentically if the additional complexity factors are taken into consideration

  17. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jinkyun [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)]. E-mail: kshpjk@kaeri.re.kr; Jeong, Kwangsup [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of); Jung, Wondea [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)

    2005-08-01

    In complex systems such as nuclear and chemical plants, it is well known that the provision of understandable procedures that allow operators to clarify what needs to be done and how to do it is one of the requisites for securing safety. In a previous study on providing understandable procedures, the step complexity (SC) measure, which quantifies the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, the need for additional complexity factors that account for cognitive aspects in evaluating the complexity of procedural steps has been raised. To this end, this study compares operators' performance data, measured in the form of step performance times, with their behavior in carrying out the prescribed activities of procedural steps. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of the additional complexity factors, it is strongly believed that the change in operators' performance data can be explained more authentically if the additional complexity factors are taken into consideration.

  18. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Jinkyun Park; Kwangsup Jeong; Wondea Jung [Korea Atomic Energy Research Institute, Taejon (Korea). Integrated Safety Assessment Division

    2005-08-15

    In complex systems such as nuclear and chemical plants, it is well known that the provision of understandable procedures that allow operators to clarify what needs to be done and how to do it is one of the requisites for securing safety. In a previous study on providing understandable procedures, the step complexity (SC) measure, which quantifies the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, the need for additional complexity factors that account for cognitive aspects in evaluating the complexity of procedural steps has been raised. To this end, this study compares operators' performance data, measured in the form of step performance times, with their behavior in carrying out the prescribed activities of procedural steps. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of the additional complexity factors, it is strongly believed that the change in operators' performance data can be explained more authentically if the additional complexity factors are taken into consideration. (author)

  19. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures that are currently available and the kinds of experimental situations and analytical problems they address. The latter point is extended by a description of our own development of the fundamental parameter method, which makes it possible to include non-parallel beam geometries. Finally, open problems for the quantification procedures are discussed

  20. Operating procedures: Fusion Experiments Analysis Facility

    Energy Technology Data Exchange (ETDEWEB)

    Lerche, R.A.; Carey, R.W.

    1984-03-20

    The Fusion Experiments Analysis Facility (FEAF) is a computer facility based on a DEC VAX 11/780 computer. It became operational in late 1982. At that time two manuals were written to aid users and staff in their interactions with the facility. This manual is designed as a reference to assist the FEAF staff in carrying out their responsibilities. It is meant to supplement the equipment and software manuals supplied by the vendors. This manual also provides the FEAF staff with a set of consistent, written guidelines for the daily operation of the facility.

  1. Building America Performance Analysis Procedures: Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Anderson, R.; Judkoff, R.; Christensen, C.; Eastment, M.; Norton, P.; Reeves, P.; Hancock, E.

    2004-06-01

    To measure progress toward multi-year Building America research goals, cost and performance trade-offs are evaluated through a series of controlled field and laboratory experiments supported by energy analysis techniques that use test data to "calibrate" energy simulation models. This report summarizes the guidelines for reporting such analytical results using the Building America Research Benchmark (Version 3.1) in studies that also include consideration of current Regional and Builder Standard Practice. Version 3.1 of the Benchmark is generally consistent with the 1999 Home Energy Rating System (HERS) Reference Home, with additions that allow evaluation of all home energy uses.

  2. Operating procedures: Fusion Experiments Analysis Facility

    International Nuclear Information System (INIS)

    Lerche, R.A.; Carey, R.W.

    1984-01-01

    The Fusion Experiments Analysis Facility (FEAF) is a computer facility based on a DEC VAX 11/780 computer. It became operational in late 1982. At that time two manuals were written to aid users and staff in their interactions with the facility. This manual is designed as a reference to assist the FEAF staff in carrying out their responsibilities. It is meant to supplement the equipment and software manuals supplied by the vendors. This manual also provides the FEAF staff with a set of consistent, written guidelines for the daily operation of the facility

  3. Factor analysis of multivariate data

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A.; Mahadevan, R.

    A brief introduction to factor analysis is presented. A FORTRAN program, which can perform Q-mode and R-mode factor analysis and the singular value decomposition of a given data matrix, is presented in Appendix B. This computer program uses...
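The FORTRAN listing in Appendix B is not reproduced in this record. As an illustrative sketch of the R-mode computation such a program performs, factor loadings can be obtained from the SVD of the standardized data matrix; the simulated data below are invented.

```python
import numpy as np

def r_mode_loadings(X, n_factors):
    """R-mode factor loadings via SVD of the standardized data matrix.

    Illustrative sketch of the computation such a program performs; the
    original FORTRAN code in Appendix B is not reproduced here.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize columns
    n = X.shape[0]
    _, s, Vt = np.linalg.svd(Z, full_matrices=False)
    # Eigenvalues of the correlation matrix R = Z'Z/(n-1) are s**2/(n-1);
    # loadings are eigenvectors scaled by sqrt(eigenvalue).
    eigvals = s**2 / (n - 1)
    loadings = Vt.T[:, :n_factors] * np.sqrt(eigvals[:n_factors])
    return loadings, eigvals

# Simulated data dominated by one latent factor.
rng = np.random.default_rng(1)
f = rng.standard_normal((200, 1))
X = f @ np.array([[0.9, 0.8, 0.7]]) + 0.3 * rng.standard_normal((200, 3))
L, ev = r_mode_loadings(X, 1)
print(np.round(np.abs(L.ravel()), 2))   # first-factor loadings, all large
```

Q-mode analysis applies the same decomposition to similarities between samples (rows) rather than variables (columns), which is why one SVD routine can serve both modes.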

  4. The scientific use of factor analysis in behavioral and life sciences

    National Research Council Canada - National Science Library

    Cattell, Raymond Bernard

    1978-01-01

    ...; the choice of procedures in experimentation; factor interpretation; the relationship of factor analysis to broadened psychometric concepts such as scaling, validity, and reliability, and to higher- strata models...

  5. Seismic Retrofit of Reinforced Concrete Frame Buildings with Hysteretic Bracing Systems: Design Procedure and Behaviour Factor

    Directory of Open Access Journals (Sweden)

    Antonio Di Cesare

    2017-01-01

    This paper presents a design procedure to evaluate the mechanical characteristics of hysteretic Energy Dissipation Bracing (EDB) systems for the seismic retrofitting of existing reinforced concrete framed buildings. The proposed procedure, which aims at controlling the maximum interstorey drifts, imposes a maximum top displacement as a function of the seismic demand and, if needed, regularizes the stiffness and strength of the building along its elevation. In order to explain the application of the proposed procedure and its capacity to involve most of the devices in the energy dissipation with a similar level of ductility demand, a simple benchmark structure has been studied and nonlinear dynamic analyses have been performed. A further goal of this work is to propose a simplified approach for designing dissipating systems based on linear analysis with the application of a suitable behaviour factor, in order to promote widespread adoption of passive control techniques. To this end, the increase in structural performance due to the addition of an EDB system designed with the above-mentioned procedure has been estimated for one thousand case studies designed with different combinations of the main design parameters. An analytical formulation of the behaviour factor for braced buildings has been proposed.

  6. Centre characteristics and procedure-related factors have an impact on outcomes of allogeneic transplantation for patients with CLL: a retrospective analysis from the European Society for Blood and Marrow Transplantation (EBMT).

    Science.gov (United States)

    Schetelig, Johannes; de Wreede, Liesbeth C; Andersen, Niels S; Moreno, Carol; van Gelder, Michel; Vitek, Antonin; Karas, Michal; Michallet, Mauricette; Machaczka, Maciej; Gramatzki, Martin; Beelen, Dietrich; Finke, Jürgen; Delgado, Julio; Volin, Liisa; Passweg, Jakob; Dreger, Peter; Schaap, Nicolaas; Wagner, Eva; Henseler, Anja; van Biezen, Anja; Bornhäuser, Martin; Iacobelli, Simona; Putter, Hein; Schönland, Stefan O; Kröger, Nicolaus

    2017-08-01

    The best approach for allogeneic haematopoietic stem cell transplantations (alloHCT) in patients with chronic lymphocytic leukaemia (CLL) is unknown. We therefore analysed the impact of procedure- and centre-related factors on 5-year event-free survival (EFS) in a large retrospective study. Data of 684 CLL patients who received a first alloHCT between 2000 and 2011 were analysed by multivariable Cox proportional hazards models with a frailty component to investigate unexplained centre heterogeneity. Five-year EFS of the whole cohort was 37% (95% confidence interval [CI], 34-42%). Larger numbers of CLL alloHCTs (hazard ratio [HR] 0·96, P = 0·002), certification of quality management (HR 0·7, P = 0·045) and a higher gross national income per capita (HR 0·4, P = 0·04) improved EFS. In vivo T-cell depletion (TCD) with alemtuzumab compared to no TCD (HR 1·5, P = 0·03), and a female donor compared to a male donor for a male patient (HR 1·4, P = 0·02) had a negative impact on EFS, but not non-myeloablative versus more intensive conditioning. After correcting for patient-, procedure- and centre-characteristics, significant variation in centre outcomes persisted. In conclusion, further research on the impact of centre and procedural characteristics is warranted. Non-myeloablative conditioning appears to be the preferable approach for patients with CLL. © 2017 John Wiley & Sons Ltd.
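The multivariable frailty Cox models behind these results cannot be reproduced from the abstract, but the underlying quantity, a 5-year event-free survival probability, is a standard product-limit estimate. A minimal Kaplan-Meier sketch over invented follow-up data (not the EBMT cohort) looks like:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve; events=1 for an event, 0 for censoring.

    Minimal sketch of how an event-free survival probability such as a
    5-year EFS is estimated; the data below are invented.
    """
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    surv = 1.0
    curve = []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                    # still under follow-up
        d = np.sum((times == t) & (events == 1))        # events at time t
        surv *= 1 - d / at_risk
        curve.append((t, surv))
    return curve

times  = [0.5, 1.2, 2.0, 2.0, 3.1, 4.0, 4.5, 5.5, 6.0, 7.0]  # years
events = [1,   0,   1,   1,   0,   1,   0,   1,   0,   0]
curve = kaplan_meier(times, events)
five_year_efs = [s for t, s in curve if t <= 5.0][-1]
print(round(five_year_efs, 3))  # → 0.54
```

The paper's analysis additionally adjusts such curves for patient, procedure, and centre covariates, with a frailty term absorbing the residual centre heterogeneity.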

  7. Reconstruction in oral malignancy: Factors affecting morbidity of various procedures

    Science.gov (United States)

    Chakrabarti, Suvadip; Chakrabarti, Preeti Rihal; Desai, Sanjay M.; Agrawal, Deepak; Mehta, Dharmendra Y.; Pancholi, Mayank

    2015-01-01

    Aims and Objectives: (1) To study the age and sex distribution of patients with oral malignancies; (2) to analyze the various types of surgery performed; (3) to evaluate reconstruction and the factors affecting complications in relation to the type of reconstruction. Materials and Methods: Cases of oral malignancy undergoing surgery at Sri Aurobindo Medical College and PG Institute, Indore, from October 1, 2012, to March 31, 2015 were included. Results: Of the 111 cases of oral malignancy analyzed, 31 (27.9%) were in the fifth decade of life, with a male to female ratio of 1.9:1. The commonest site of cancer was the buccal mucosa. Forty-seven cases (43.2%) were in stage IVa. Diabetes was the most common co-morbidity, accounting for 53.9% of cases with reported morbidity. Tobacco chewing was the most common personal habit. All cases underwent neck dissection along with resection of the primary. Hemimandibulectomy was the most frequent form of primary resection, accounting for 53.15% (59 cases), followed by wide resection of the primary at 27% (30 cases). The pectoralis major myocutaneous (PMMC) flap alone was the most common reconstruction across the study population, accounting for 38.7% (43 cases). The infection rate was 16.21%. PMMC alone accounted for 5 of 18 infections (27.8% of the total infection rate and 4.5% of the study population), as did PMMC + deltopectoral flap (5 of 18; 27.8% and 4.5% respectively). Conclusion: The PMMC flap is a major workhorse for reconstruction, with better functional outcome and acceptance among operated patients. PMID:26981469

  8. [High risk factors of upper gastrointestinal bleeding after neurosurgical procedures].

    Science.gov (United States)

    Zheng, Kang; Wu, Gang; Cheng, Neng-neng; Yao, Cheng-jun; Zhou, Liang-fu

    2005-12-21

    To analyze the high risk factors of postoperative upper gastrointestinal (GI) bleeding after neurosurgery so as to give guidance for the prevention of GI bleeding. A questionnaire was developed to investigate the medical records of 1500 patients who were hospitalized and underwent neurosurgical operations in 1997, and logistic regression analysis was performed. 1430 valid questionnaires were obtained. Postoperative upper GI bleeding occurred in 75 patients (5.24%). The incidence of upper GI bleeding was 6.64% (54/813) in male patients and 3.40% (21/617) in female patients (P = 0.007), and 9.88% (41/415) in those aged over 50 versus 3.35% in those aged 50 or under. The incidence in patients with intracerebral hematoma, intraventricular hemorrhage, subdural hematoma, and extradural hematoma was 15.7%, 10.0%, 6.00%, and 2.94% respectively (P = 0.02). The incidence of upper GI bleeding in patients with tumors of the fourth ventricle, brainstem, cerebral hemisphere, and sellar hypothalamus was 15.79% (3/19), 7.89%, 5.71%, and 3.74% respectively. In emergency cases, the incidence of upper GI bleeding was higher in those with hypertension. The incidence of upper GI bleeding was 5.46% in patients undergoing adrenocortical hormone treatment, significantly higher than in those who did not receive such treatment (2.13%). Patients at high risk of developing postoperative upper GI bleeding include those with: age greater than 50 years; male sex; Glasgow Coma Score less than 10 before and after the operation; a lesion located in the brainstem or fourth ventricle; hypertensive cerebral hemorrhage; intracerebral and intraventricular hemorrhagic brain trauma; and postoperative complications such as pneumonia, brain edema, intracranial hypertension, and pyogenic infection of the central nervous system. The mortality of patients with postoperative upper GI bleeding was evidently higher than that of patients without postoperative upper GI bleeding.

  9. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management, since the need to update hardcopy procedures is eliminated. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  10. Procedure for the analysis of americium in complex matrices

    International Nuclear Information System (INIS)

    Knab, D.

    1978-02-01

    A radioanalytical procedure for the analysis of americium in complex matrices has been developed. Clean separations of americium can be obtained from up to 100 g of sample ash, regardless of the starting material. The ability to analyze large masses of material provides the increased sensitivity necessary to detect americium in many environmental samples. The procedure adequately decontaminates from rare earth elements and natural radioactive nuclides that interfere with the alpha spectrometric measurements

  11. ORNL-PWR BDHT analysis procedure: an overview

    International Nuclear Information System (INIS)

    Cliff, S.B.

    1978-01-01

    The key computer programs currently used by the analysis procedure of the ORNL-PWR Blowdown Heat Transfer Separate Effects Program are overviewed with particular emphasis placed on their interrelationships. The major modeling and calculational programs, COBRA, ORINC, ORTCAL, PINSIM, and various versions of RELAP4, are summarized and placed into the perspective of the procedure. The supportive programs, REDPLT, ORCPLT, BDHTPLOT, OXREPT, and OTOCI, and their uses are described

  12. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  13. Accident Sequence Evaluation Program: Human reliability analysis procedure

    International Nuclear Information System (INIS)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs

  14. Confirmatory factor analysis using Microsoft Excel.

    Science.gov (United States)

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA and thus to gain insight into the workings of the procedure.
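As a sketch of the arithmetic such a spreadsheet performs, a one-factor model with implied covariance Sigma = lam lam' + diag(psi) can be fitted by principal-axis iteration. This is an illustrative stand-in written in Python for testability, not the article's actual workbook, and the simulated data are invented.

```python
import numpy as np

def one_factor_cfa(S, n_iter=200):
    """Fit S ≈ lam lam' + diag(psi) by principal-axis iteration.

    Illustrative stand-in for the spreadsheet calculation: each sweep
    eigendecomposes the reduced covariance S - diag(psi) and takes the
    leading eigenpair as the factor loadings.
    """
    psi = 0.5 * np.diag(S)                         # starting unique variances
    for _ in range(n_iter):
        e, V = np.linalg.eigh(S - np.diag(psi))    # ascending eigenvalues
        lam = V[:, -1] * np.sqrt(max(e[-1], 0.0))  # leading eigenpair
        psi = np.clip(np.diag(S) - lam**2, 1e-6, None)
    return lam, psi

# Recover known loadings from simulated one-factor data.
rng = np.random.default_rng(2)
true_lam = np.array([0.8, 0.7, 0.6, 0.5])
X = rng.standard_normal((2000, 1)) @ true_lam[None, :] \
    + 0.5 * rng.standard_normal((2000, 4))
lam, psi = one_factor_cfa(np.cov(X, rowvar=False))
print(np.round(np.abs(lam), 2))   # estimated loadings, close to true_lam
```

Every step here is a cell formula away in Excel, which is the article's point: the mechanics of a simple CFA are ordinary matrix arithmetic, not a black box.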

  15. Human factors evaluation of teletherapy: Human-system interfaces and procedures. Volume 3

    International Nuclear Information System (INIS)

    Kaye, R.D.; Henriksen, K.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

    A series of human factors evaluations was undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. The principal sources of radiation are a radioactive isotope, typically cobalt-60 (Co-60), or a linear accelerator device capable of producing very high-energy x-ray and electron beams. A team of human factors specialists conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. In addition, a panel of radiation oncologists, medical physicists, and radiation technologists served as subject matter experts. A function and task analysis was initially performed to guide subsequent evaluations in the areas of user-system interfaces, procedures, training and qualifications, and organizational policies and practices. The present report focuses on an evaluation of the human-system interfaces in relation to the treatment machines and supporting equipment (e.g., simulators, treatment planning computers, control consoles, patient charts) found in the teletherapy environment. The report also evaluates operating, maintenance and emergency procedures and practices involved in teletherapy. The evaluations are based on the function and task analysis and established human engineering guidelines, where applicable

  16. Human factors evaluation of teletherapy: Human-system interfaces and procedures. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    A series of human factors evaluations was undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. The principal sources of radiation are a radioactive isotope, typically cobalt-60 (Co-60), or a linear accelerator device capable of producing very high-energy x-ray and electron beams. A team of human factors specialists conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. In addition, a panel of radiation oncologists, medical physicists, and radiation technologists served as subject matter experts. A function and task analysis was initially performed to guide subsequent evaluations in the areas of user-system interfaces, procedures, training and qualifications, and organizational policies and practices. The present report focuses on an evaluation of the human-system interfaces in relation to the treatment machines and supporting equipment (e.g., simulators, treatment planning computers, control consoles, patient charts) found in the teletherapy environment. The report also evaluates operating, maintenance and emergency procedures and practices involved in teletherapy. The evaluations are based on the function and task analysis and established human engineering guidelines, where applicable.

  17. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  18. Factors affecting successful colonoscopy procedures: Single-center experience.

    Science.gov (United States)

    Kozan, Ramazan; Yılmaz, Tonguç Utku; Baştuğral, Uygar; Kerimoğlu, Umut; Yavuz, Yücel

    2018-01-01

    Colonoscopy is a gold standard procedure for several colon pathologies. A successful colonoscopy entails demonstration of the ileocecal valve and detection of colon polyps. Here we aimed to evaluate our colonoscopy success and results. This retrospective descriptive study was performed in the İstanbul Eren hospital endoscopy unit between 2012 and 2015. Colonoscopy results and patient demographics were obtained from the hospital database. All colonoscopy procedures were performed under general anesthesia and after full bowel preparation. In all, 870 patients were included in the study. We reached the cecum in 850 (97.8%) patients. We were unable to reach the cecum in patients who were old or obese and in those with previous lower abdominal operations. Angulation, inability to advance the scope, and a tortuous colon were the reasons for failure to reach the cecum. A total of 203 polyp samplings were performed in 139 patients. We performed 1, 2, and 3 polypectomies in 97, 28, and 10 patients, respectively. There were 29 (3.3%) colorectal cancers in our series. There was no mortality or morbidity in our study. General anesthesia and full bowel preparation may explain the increased success of colonoscopy. Increased experience and patient-endoscopist cooperation increased the rate of cecal access and polyp resection and decreased the complication rate.

  19. First course in factor analysis

    CERN Document Server

    Comrey, Andrew L

    2013-01-01

    The goal of this book is to foster a basic understanding of factor analytic techniques so that readers can use them in their own research and critically evaluate their use by other researchers. Both the underlying theory and correct application are emphasized. The theory is presented through the mathematical basis of the most common factor analytic models and several methods used in factor analysis. On the application side, considerable attention is given to the extraction problem, the rotation problem, and the interpretation of factor analytic results. Hence, readers are given a background of

  20. Method and procedure of fatigue analysis for nuclear equipment

    International Nuclear Information System (INIS)

    Wen Jing; Fang Yonggang; Lu Yan; Zhang Yue; Sun Zaozhan; Zou Mingzhong

    2014-01-01

    As an example, fatigue analysis of the upper head of the pressurizer in one NPP was carried out using the finite element analysis software ANSYS. According to the RCC-M code, only two kinds of typical temperature and pressure transients were considered in the fatigue analysis. Meanwhile, the influence of earthquake loading was taken into account. The method and procedure of fatigue analysis for nuclear safety equipment are described in detail. This paper provides a reference for the fatigue analysis and assessment of nuclear safety grade equipment and piping. (authors)
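
    A code-based fatigue assessment of this kind ultimately reduces to cumulative usage factor bookkeeping (Miner's rule). The sketch below is a hypothetical illustration of that bookkeeping, not the paper's actual analysis; the transient names, occurrence counts, and allowable-cycle values are invented:

```python
# Hypothetical transient set: (name, specified occurrences n_i, allowable cycles N_i
# read from the design fatigue curve at the computed alternating stress amplitude).
transients = [
    ("heatup_cooldown", 200, 12_000),
    ("pressure_test",    40,  3_500),
]

def usage_factor(transients):
    """Cumulative usage factor U = sum(n_i / N_i); acceptance requires U < 1.0."""
    return sum(n / N for _, n, N in transients)

u = usage_factor(transients)
print(round(u, 4), "acceptable" if u < 1.0 else "not acceptable")
```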

  1. Lithuanian Population Aging Factors Analysis

    Directory of Open Access Journals (Sweden)

    Agnė Garlauskaitė

    2015-05-01

    Full Text Available The aim of this article is to identify the factors that determine the aging of Lithuania's population and to assess their influence. The analysis consists of two main parts: the first describes population aging and its characteristics in theoretical terms; the second assesses the demographic trends and factors that influence the aging of the population of Lithuania. The article concludes that the decline in the birth rate and the excess of emigrants over immigrants have the greatest impact on aging of the population, so in order to slow the aging of the population, close attention should be paid to the management of these demographic processes.

  2. ORNL: PWR-BDHT analysis procedure, a preliminary overview

    International Nuclear Information System (INIS)

    Cliff, S.B.

    1978-01-01

    The computer programs currently used in the analysis of the ORNL-PWR Blowdown Heat Transfer Separate-Effects Program are overviewed. The current linkages and relationships among the programs are given, along with general comments about the future directions of some of these programs. The overview is strictly from the computer science point of view, with only minimal information concerning the engineering aspects of the analysis procedure.

  3. Cost analysis of robotic versus laparoscopic general surgery procedures.

    Science.gov (United States)

    Higgins, Rana M; Frelich, Matthew J; Bosler, Matthew E; Gould, Jon C

    2017-01-01

    Robotic surgical systems have been used at a rapidly increasing rate in general surgery. Many of these procedures have been performed laparoscopically for years. In a surgical encounter, a significant portion of the total costs is associated with consumable supplies. Our hospital system has invested in a software program that can track the costs of consumable surgical supplies. We sought to determine the differences in cost of consumables with elective laparoscopic and robotic procedures for our health care organization. De-identified procedural cost and equipment utilization data were collected from the Surgical Profitability Compass Procedure Cost Manager System (The Advisory Board Company, Washington, DC) for our health care system for laparoscopic and robotic cholecystectomy, fundoplication, and inguinal hernia repair between the years 2013 and 2015. Outcomes were length of stay, case duration, and supply cost. Statistical analysis was performed using a t-test for continuous variables, and statistical significance was defined as p < 0.05. Supply costs were significantly greater for robotic procedures. Length of stay did not differ for fundoplication or cholecystectomy. Length of stay was greater for robotic inguinal hernia repair. Case duration was similar for cholecystectomy (84.3 min robotic and 75.5 min laparoscopic, p = 0.08), but significantly longer for robotic fundoplication (197.2 min robotic and 162.1 min laparoscopic, p = 0.01) and inguinal hernia repair (124.0 min robotic and 84.4 min laparoscopic, p < 0.01). We found a significantly increased cost of general surgery procedures for our health care system when cases commonly performed laparoscopically are instead performed robotically. Our analysis is limited by the fact that we only included costs associated with consumable surgical supplies. The initial acquisition cost (over $1 million for a robotic surgical system), depreciation, and service contracts for the robotic and laparoscopic systems were not included in this analysis.
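
    The group comparison described above can be sketched with a two-sample Welch t-test. The per-case supply-cost figures below are hypothetical, not the study's data:

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical per-case consumable supply costs in dollars (illustrative only).
robotic = [2850, 3100, 2990, 3240, 3050, 2910, 3180, 3020]
laparoscopic = [2100, 1950, 2230, 2040, 2150, 1980, 2210, 2060]

t, df = welch_t(robotic, laparoscopic)
print(f"t = {t:.2f}, df = {df:.1f}")  # |t| far above ~2, i.e. significant at p < 0.05
```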

  4. Human Factors Evaluation of Procedures for Periodic Safety Review of Yonggwang Unit no. 1, 2

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Lee, Jung Woon; Park, Jae Chang (and others)

    2006-01-15

    This report describes the results of a human factors assessment of the plant operating procedures as part of the Periodic Safety Review (PSR) of Yonggwang Nuclear Power Plant Units 1 and 2. The suitability of items and the appropriateness of format and structure in the key operating procedures were investigated through a review of plant operating experience and procedure documents, a field survey, and an experimental assessment of selected procedures. A checklist was used to perform this assessment and record the review results. The reviewed procedures include EOPs (Emergency Operating Procedures), GOPs (General Operating Procedures), AOPs (Abnormal Operating Procedures), and management procedures of some technical departments. As a result of the assessments, no significant human factors problem challenging safety was found in the operating procedures. However, several small items to be changed and improved were identified. An action plan is recommended to accommodate the suggestions and review comments; it will enhance plant safety through improved operating procedures.

  5. Factor Analysis for Clustered Observations.

    Science.gov (United States)

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)

  6. Transforming Rubrics Using Factor Analysis

    Science.gov (United States)

    Baryla, Ed; Shelley, Gary; Trainor, William

    2012-01-01

    Student learning and program effectiveness are often assessed using rubrics. While much time and effort may go into their creation, it is equally important to assess how effective and efficient the rubrics actually are in terms of measuring competencies over a number of criteria. This study demonstrates the use of common factor analysis to identify…

  7. Flood risk analysis procedure for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.

    1982-01-01

    This paper describes a methodology and procedure for determining the impact of floods on nuclear power plant risk. The procedures are based on techniques of fault tree and event tree analysis and use the logic of these techniques to determine the effects of a flood on system failure probability and accident sequence occurrence frequency. The methodology can be applied independently or as an add-on analysis for an existing risk assessment. Each stage of the analysis yields useful results such as the critical flood level, failure flood level, and the flood's contribution to accident sequence occurrence frequency. The results of applications show the effects of floods on the risk from nuclear power plants analyzed in the Reactor Safety Study.
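
    The add-on logic can be illustrated with a toy fault tree in which flood-vulnerable basic events fail with certainty once a postulated flood exceeds their failure flood level. The component names, probabilities, and levels below are hypothetical, not taken from the paper:

```python
# Hypothetical fault tree: top event = (pump A fails AND pump B fails) OR control fails.
# Each component fails randomly with p_random, or with certainty when the flood
# reaches its failure flood level (meters).
components = {
    "pump_a":  {"p_random": 1e-3, "fail_level": 2.0},
    "pump_b":  {"p_random": 1e-3, "fail_level": 3.0},
    "control": {"p_random": 1e-4, "fail_level": 5.0},
}

def p_fail(name, flood_level):
    c = components[name]
    return 1.0 if flood_level >= c["fail_level"] else c["p_random"]

def top_event_probability(flood_level):
    pa = p_fail("pump_a", flood_level)
    pb = p_fail("pump_b", flood_level)
    pc = p_fail("control", flood_level)
    p_and = pa * pb                    # AND gate: both redundant pumps (independence assumed)
    return p_and + pc - p_and * pc     # OR gate of independent events

# Stepping the flood level upward reveals the critical levels where risk jumps.
for level in [0.0, 2.0, 3.0, 5.0]:
    print(level, top_event_probability(level))
```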

  8. Sample preparation procedure for PIXE elemental analysis on soft tissues

    International Nuclear Information System (INIS)

    Kubica, B.; Kwiatek, W.M.; Dutkiewicz, E.M.; Lekka, M.

    1997-01-01

    Trace element analysis is one of the most important fields in analytical chemistry. There are several instrumental techniques which are applied for the determination of microscopic elemental content. PIXE (Proton Induced X-ray Emission) is one of the nuclear techniques commonly applied for this purpose because of its multielemental analysis capability. The aim of this study was to establish the optimal conditions for the target preparation procedure. In this paper two different approaches to the topic are presented and discussed in detail. The first approach was the traditional pellet technique and the second was a mineralization procedure. Soft tissue, liver, was used for the analysis. Some results on water samples are also presented. (author)

  9. Application of human error theory in case analysis of wrong procedures.

    Science.gov (United States)

    Duthie, Elizabeth A

    2010-06-01

    The goal of this study was to contribute to the emerging body of literature about the role of human behaviors and cognitive processes in the commission of wrong procedures. Case analysis of 5 wrong procedures in operative and nonoperative settings using James Reason's human error theory was performed. The case analysis showed that cognitive underspecification, cognitive flips, automode processing, and skill-based errors were contributory to wrong procedures. Wrong-site procedures accounted for the preponderance of the cases. Front-line supervisory staff used corrective actions that focused on the performance of the individual without taking into account cognitive factors. System fixes using human cognition concepts have a greater chance of achieving sustainable safety outcomes than those that are based on the traditional approach of counseling, education, and disciplinary action for staff.

  10. Experimental verification for standard analysis procedure of 241Am in food

    International Nuclear Information System (INIS)

    Liu Qingfen; Zhu Hongda; Liu Shutian; Pan Jingshun; Yang Dating

    2005-01-01

    Objective: To briefly describe the experimental verification of the standard procedure 'Determination of 241Am in food'. Methods: The overall recovery, the MDL of the method, and decontamination experiments were performed following the standard analysis procedure. Results: The overall recovery is 76.26 ± 4.1%. The MDL is 3.4 × 10^-5 Bq/g ash; the decontamination factor is higher than 10^3 for Po, 10^2 for U, Th, and Pu, and 60 for 237Np. Conclusion: The results showed that the overall recovery is high and reliable, and the MDL of the method is sufficient for checking 241Am against the limit values in foods. The decontamination factors obtained with the recommended procedure can meet the needs of 241Am analysis in food examination. The verification results of the procedure were satisfactory, as confirmed using a 243Am spike and a 241Am standard reference material. (authors)

  11. Cost analysis of radiological interventional procedures and reimbursement within a clinic

    International Nuclear Information System (INIS)

    Strotzer, M.; Voelk, M.; Lenhart, M.; Fruend, R.; Feuerbach, S.

    2002-01-01

    Purpose: Analysis of the costs of vascular radiological interventions on a per-patient basis and comparison with reimbursement based on the GOÄ (Gebührenordnung für Ärzte) and the DKG-NT (Deutsche Krankenhausgesellschaft-Nebenkostentarif). Material and Methods: The ten procedures most frequently performed within 12 months were evaluated. Personnel costs were derived from precise costs per hour and the estimated procedure time for each intervention. Costs for medical devices were included. Reimbursement based on the GOÄ was calculated using the official conversion factor of 0.114 DM for each specific relative value unit and a multiplication factor of 1.0. The corresponding conversion factor for the DKG-NT, determined by the DKG, was 0.168 DM. Results: A total of 832 interventional procedures were included. Marked differences between calculated costs and reimbursement rates were found. For the ten most frequently performed procedures, there was a deficit of 1.06 million DM according to the GOÄ (factor 1.0) and 0.787 million DM according to the DKG-NT. The percentage of reimbursement was only 34.2% (GOÄ, factor 1.0) and 51.3% (DKG-NT), respectively. Conclusion: Reimbursement of radiological interventional procedures based on GOÄ and DKG-NT data is of limited value for economic controlling purposes within a hospital. (orig.) [de]
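
    The fee-schedule arithmetic described above (relative value units × conversion factor × multiplication factor) can be sketched as follows; the conversion factors are those quoted in the abstract, while the relative value units and calculated cost are hypothetical:

```python
# Conversion factors from the abstract (DM per relative value unit).
GOA_CONVERSION = 0.114     # GOÄ, used here with multiplication factor 1.0
DKG_NT_CONVERSION = 0.168  # DKG-NT

def reimbursement(relative_value_units, conversion, multiplier=1.0):
    """Reimbursement in DM for one procedure."""
    return relative_value_units * conversion * multiplier

# Hypothetical procedure with 20,000 relative value units:
rvu = 20_000
goa = reimbursement(rvu, GOA_CONVERSION)     # ≈ 2280 DM
dkg = reimbursement(rvu, DKG_NT_CONVERSION)  # ≈ 3360 DM

# Coverage ratio against a hypothetical calculated cost of 6000 DM:
cost = 6000.0
print(round(100 * goa / cost, 1), round(100 * dkg / cost, 1))
```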

  12. The utilisation of thermal analysis to optimise radiocarbon dating procedures

    International Nuclear Information System (INIS)

    Brandova, D.; Keller, W.A.; Maciejewski, M.

    1999-01-01

    Thermal analysis combined with mass spectrometry was applied to radiocarbon dating procedures (age determination of carbon-containing samples). Experiments carried out under an oxygen atmosphere were used to determine the carbon content and combustion range of soil and wood samples. The composition of the shell sample and its decomposition were investigated. The quantification of CO2 formed by the oxidation of carbon was done by the application of pulse thermal analysis. Experiments carried out under an inert atmosphere determined the combustion range of coal with CuO as an oxygen source. To eliminate a possible source of contamination in the radiocarbon dating procedures, the adsorption of CO2 by CuO was investigated. (author)

  13. Structural analysis and optimization procedure of the TFTR device substructure

    International Nuclear Information System (INIS)

    Driesen, G.

    1975-10-01

    A structural evaluation of the TFTR device substructure is performed in order to verify the feasibility of the proposed design concept as well as to establish a design optimization procedure for minimizing the material and fabrication cost of the substructure members. A preliminary evaluation of the seismic capability is also presented. The design concept on which the analysis is based is consistent with that described in the Conceptual Design Status Briefing report dated June 18, 1975.

  14. EXPLORATORY FACTOR ANALYSIS (EFA) IN CONSUMER BEHAVIOR AND MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    Marcos Pascual Soler

    2012-06-01

    Full Text Available Exploratory Factor Analysis (EFA) is one of the most widely used statistical procedures in social research. The main objective of this work is to describe the most common practices used by researchers in the consumer behavior and marketing area. Using a literature review methodology, the EFA practices in five consumer behavior and marketing journals (2000-2010) were analyzed. Then, the choices made by the researchers concerning factor model, retention criteria, rotation, factor interpretation, and other issues relevant to factor analysis were analyzed. The results suggest that researchers routinely conduct analyses using questionable methods. Suggestions for improving the use of factor analysis and the reporting of results are presented, and a checklist (Exploratory Factor Analysis Checklist, EFAC) is provided to help editors, reviewers, and authors improve the reporting of exploratory factor analysis.
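
    Many of the choices such reviews scrutinize concern extraction, retention, and rotation. The NumPy sketch below illustrates one generic pipeline (eigen-decomposition of the correlation matrix, Kaiser eigenvalue-greater-than-one retention, varimax rotation) on synthetic two-factor data; it is a minimal illustration of the technique, not code from the article:

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Varimax rotation of a loading matrix (standard SVD-based algorithm)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    variance = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p)
        )
        rotation = u @ vt
        new_variance = s.sum()
        if new_variance < variance * (1 + tol):  # converged
            break
        variance = new_variance
    return loadings @ rotation

rng = np.random.default_rng(0)
n = 500
f1, f2 = rng.standard_normal(n), rng.standard_normal(n)
# Six observed variables: three load on factor 1, three on factor 2, plus noise.
X = np.column_stack([f1, f1, f1, f2, f2, f2]) + 0.3 * rng.standard_normal((n, 6))

corr = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                 # sort eigenvalues descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = int((eigvals > 1.0).sum())                    # Kaiser criterion: retain eigenvalues > 1
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])  # unrotated loadings
rotated = varimax(loadings)                       # simple structure after rotation
print("factors retained:", k)
```

After rotation, each variable loads strongly on exactly one of the two recovered factors.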

  15. Computation of stress intensity factors for nozzle corner cracks by various finite element procedures

    International Nuclear Information System (INIS)

    Broekhoven, M.J.G.

    1975-01-01

    The present study aims at deriving accurate K-factors for a series of 5 elliptical nozzle corner cracks of increasing size by various finite element procedures, using a three-level recursive substructuring scheme to perform the computations in an economical way on an intermediate-size computer (IBM 360/65 system). A nozzle on a flat plate has been selected for subsequent experimental verification, this configuration being considered an adequate simulation of a nozzle on a shallow shell. The computations have been performed with the ASKA finite element system using mainly HEXEC-27 (incomplete quartic) elements. The geometry has been subdivided into 5 subnets with a total of 3515 nodal points and 6250 unknowns, two main nets and one hyper net. Each crack front is described by 11 nodal points, and all crack front nodes are inserted in the hyper net, which allows for the realization of the successive crack geometries by changing only a relatively small hyper net (615 to 725 unknowns). Output data have been interpreted in terms of K-factors by the global energy method, the displacement method, and the stress method. In addition, a stiffness derivative procedure, recently developed at Brown University, which takes full advantage of the finite element formulation to calculate local K-factors, has been applied. Finally, it has been investigated whether sufficiently accurate results can be obtained by analyzing a considerably smaller part than one half of the geometry (as strictly required by symmetry considerations), using fixed boundary conditions derived from a far cheaper analysis of the uncracked structure.

  16. Analysis of Elementary School students’ algebraic perceptions and procedures

    Directory of Open Access Journals (Sweden)

    Sandra Mara Marasini

    2012-12-01

    Full Text Available This study aims to verify how students in elementary school see themselves in relation to mathematics and, at the same time, to analyze the procedures they use to solve algebraic tasks. These students, in the 8th year of elementary school and the first and third years of high school at two state schools in Passo Fundo/RS, answered a questionnaire about their own perceptions of the mathematics lessons, the subject of mathematics, and algebraic content. The analysis was based mainly on authors from the mathematical education and historic-cultural psychology areas. It was verified that, even among students who claimed to be happy with the idea of having mathematics classes, several presented learning difficulties regarding algebraic content, revealed by the procedures employed. It was concluded that it is necessary to design proposals with didactic sequences, mathematically and pedagogically based, which can efficiently optimize the appropriation of meaning of the concepts approached and their application in different situations.

  17. First Outbreak with MRSA in a Danish Neonatal Intensive Care Unit: Risk Factors and Control Procedures

    Science.gov (United States)

    Ramsing, Benedicte Grenness Utke; Arpi, Magnus; Andersen, Erik Arthur; Knabe, Niels; Mogensen, Dorthe; Buhl, Dorte; Westh, Henrik; Østergaard, Christian

    2013-01-01

    Introduction The purpose of the study was to describe demographic and clinical characteristics and outbreak handling of a large methicillin-resistant Staphylococcus aureus (MRSA) outbreak in a neonatal intensive care unit (NICU) in Denmark June 25th–August 8th 2008, and to identify risk factors for MRSA transmission. Methods Data were collected retrospectively from medical records and the Danish Neobase database. All MRSA isolates obtained from neonates, relatives and NICU health care workers (HCW) as well as environmental cultures were typed. Results During the 46 day outbreak period, 102 neonates were admitted to the two neonatal wards. Ninety-nine neonates were subsequently sampled, and 32 neonates (32%) from 25 families were colonized with MRSA (spa-type t127, SCCmec V, PVL negative). Thirteen family members from 11 of those families (44%) and two of 161 HCWs (1%) were colonized with the same MRSA. No one was infected. Five environmental cultures were MRSA positive. In a multiple logistic regression analysis, nasal Continuous Positive Airway Pressure (nCPAP) treatment (p = 0.006) and Caesarean section (p = 0.016) were independent risk factors for MRSA acquisition, whereas days of exposure to MRSA was a risk factor in the unadjusted analysis (p = 0.04). Conclusions MRSA transmission occurs with high frequency in the NICU during hospitalization with unidentified MRSA neonates. Caesarean section and nCPAP treatment were identified as risk factors for MRSA colonization. The MRSA outbreak was controlled through infection control procedures. PMID:23825581

  19. Elemental hair analysis: A review of procedures and applications

    International Nuclear Information System (INIS)

    Pozebon, D.; Scheffler, G.L.; Dressler, V.L.

    2017-01-01

    Although exogenous contamination and unreliable reference values have limited the utility of scalp hair as a biomarker of exposure to chemical elements, its use in toxicological, clinical, environmental and forensic investigations is growing and becoming more extensive. Therefore, hair elemental analysis is reviewed in the current manuscript, which spans articles published in the last 10 years. It starts with a general discussion of history, morphology and possible techniques for elemental analysis, where inductively coupled plasma-mass spectrometry (ICP-MS) is clearly highlighted since this technique is leading quantitative ultra-trace elemental analysis. Emphasis is given to sampling, quality assurance, washing procedures and sample decomposition, with detailed protocols compiled in tables, as well as to the utility of hair to identify human gender, age, diseases, health conditions, nutrition status and contamination sites. Isotope ratio information, chemical speciation analysis and analyte preconcentration are also considered for hair. Finally, the potential of laser ablation ICP-MS (LA-ICP-MS) to provide spatial resolution and time-tracked monitoring of elements in hair strands, instead of conventional bulk analysis, is spotlighted as a real future trend in the field. - Highlights: • Elemental analysis of hair is critically reviewed, with focus on ICP-MS employment. • Standard protocols for hair washing and sample decomposition are compiled. • The usefulness of elemental and/or isotopic analysis of hair is demonstrated. • The potential of LA-ICP-MS for elemental time tracking in hair is highlighted.

  20. Simplified procedures for fast reactor fuel cycle and sensitivity analysis

    International Nuclear Information System (INIS)

    Badruzzaman, A.

    1979-01-01

    The Continuous Slowing Down-Integral Transport Theory has been extended to perform criticality calculations in a fast reactor core-blanket system, achieving excellent prediction of the spectrum and the eigenvalue. The integral transport parameters did not need recalculation with source iteration and were found to be relatively constant with exposure. Fuel cycle parameters were accurately predicted when these were not varied, thus reducing a principal potential penalty of the integral transport approach, where considerable effort may be required to calculate transport parameters in more complicated geometries. The small variation of the spectrum in the central core region, and its weak dependence on exposure for this region, the core-blanket interface, and the blanket region, led to the extension and development of inexpensive simplified procedures to complement exact methods. These procedures gave accurate predictions of key fuel cycle parameters such as cost, and of their sensitivity to variations in spectrum-averaged and multigroup cross sections. They also predicted the implications of design variations on these parameters very well. The accuracy of these procedures and their use in analyzing a wide variety of sensitivities demonstrate the potential utility of survey calculations in fast reactor analysis and fuel management.

  1. A simplified procedure of linear regression in a preliminary analysis

    Directory of Open Access Journals (Sweden)

    Silvia Facchinetti

    2013-05-01

    Full Text Available The analysis of a large statistical data-set can be guided by the study of a particularly interesting variable Y (the regressand) and an explicative variable X, chosen from among the remaining variables, jointly observed. The study gives a simplified procedure to obtain the functional link between the variables, y = y(x), by a partition of the data-set into m subsets, in which the observations are synthesized by location indices (mean or median) of X and Y. Polynomial models for y(x) of order r are considered to verify the characteristics of the given procedure; in particular we assume r = 1 and 2. The distributions of the parameter estimators are obtained by simulation when the fitting is done for m = r + 1. Comparisons of the results, in terms of distribution and efficiency, are made with the results obtained by the ordinary least squares method. The study also gives some considerations on the consistency of the parameters estimated by the given procedure.
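
    A minimal sketch of the partition procedure for the r = 1 case (a line through the (mean X, mean Y) points of m = 2 ordered subsets), compared against ordinary least squares, might look as follows; the data are synthetic and the function names are ours:

```python
import random

def partition_fit(xs, ys):
    """Fit y = a + b*x through the (mean x, mean y) points of m = 2 subsets
    obtained by splitting the data at the median of x (the r = 1 case)."""
    pairs = sorted(zip(xs, ys))
    half = len(pairs) // 2
    lo, hi = pairs[:half], pairs[half:]
    mx1 = sum(x for x, _ in lo) / len(lo)
    my1 = sum(y for _, y in lo) / len(lo)
    mx2 = sum(x for x, _ in hi) / len(hi)
    my2 = sum(y for _, y in hi) / len(hi)
    b = (my2 - my1) / (mx2 - mx1)     # slope through the two location points
    return my1 - b * mx1, b           # intercept, slope

def ols_fit(xs, ys):
    """Ordinary least squares fit of y = a + b*x, for comparison."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Synthetic data: y = 2.0 + 0.5*x plus Gaussian noise.
random.seed(1)
xs = [random.uniform(0, 10) for _ in range(200)]
ys = [2.0 + 0.5 * x + random.gauss(0, 0.5) for x in xs]
print(partition_fit(xs, ys), ols_fit(xs, ys))
```

Both fits recover parameters close to (2.0, 0.5); the partition estimator trades some efficiency for simplicity, which is the comparison the study formalizes.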

  2. Reduction procedures for accurate analysis of MSX surveillance experiment data

    Science.gov (United States)

    Gaposchkin, E. Mike; Lane, Mark T.; Abbot, Rick I.

    1994-01-01

    Technical challenges of the Midcourse Space Experiment (MSX) science instruments require careful characterization and calibration of these sensors for analysis of surveillance experiment data. Procedures for reduction of Resident Space Object (RSO) detections will be presented which include refinement and calibration of the metric and radiometric (and photometric) data and calculation of a precise MSX ephemeris. Examples will be given which support the reduction, and these are taken from ground-test data similar in characteristics to the MSX sensors and from the IRAS satellite RSO detections. Examples to demonstrate the calculation of a precise ephemeris will be provided from satellites in similar orbits which are equipped with S-band transponders.

  3. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  4. Review and Application of Ship Collision and Grounding Analysis Procedures

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2010-01-01

    It is the purpose of the paper to present a review of prediction and analysis tools for collision and grounding analyses and to outline a probabilistic procedure by which these tools can be used by the maritime industry to develop performance-based rules to reduce the risk associated with the human, environmental and economic costs of collision and grounding events. The main goal of collision and grounding research should be to identify the most economic risk control options associated with prevention and mitigation of collision and grounding events.

  5. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single-entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user-friendly menu drivers.

  6. Stent-Assisted Coil Embolization of Vertebrobasilar Dissecting Aneurysms: Procedural Outcomes and Factors for Recanalization.

    Science.gov (United States)

    Jeon, Jin Pyeong; Cho, Young Dae; Rhim, Jong Kook; Park, Jeong Jin; Cho, Won-Sang; Kang, Hyun-Seung; Kim, Jeong Eun; Hwang, Gyojun; Kwon, O-Ki; Han, Moon Hee

    2016-01-01

    Outcomes of stent-assisted coil embolization (SACE) have not been well established in the setting of vertebrobasilar dissecting aneurysms (VBDAs) due to the low percentage of cases that need treatment and the array of available therapeutic options. Herein, we present the clinical and radiographic results of SACE in patients with VBDAs. A total of 47 patients (M:F, 30:17; mean age ± SD, 53.7 ± 12.6 years) with a VBDA who underwent SACE between 2008 and 2014 at two institutions were evaluated retrospectively. Medical records and radiologic data were analyzed to assess the outcome of SACE procedures. Cox proportional hazards regression analysis was conducted to determine the factors that were associated with aneurysmal recanalization after SACE. Stent-assisted coil embolization was technically successful in all patients. Three cerebellar infarctions occurred on postembolization day 1, week 2, and month 2, but no other procedure-related complications developed. Immediately following SACE, 25 aneurysms (53.2%) showed no contrast filling into the aneurysmal sac. During a mean follow-up of 20.2 months, 37 lesions (78.7%) appeared completely occluded, whereas 10 lesions showed recanalization, 5 of which required additional embolization. The overall recanalization rate was 12.64% per lesion-year, and the mean postoperative time to recanalization was 18 months (range, 3-36 months). In multivariable analysis, major branch involvement (hazard ratio [HR]: 7.28; p = 0.013) and the presence of residual sac filling (HR: 8.49, p = 0.044) were identified as statistically significant independent predictors of recanalization. No bleeding was encountered in follow-up monitoring. Stent-assisted coil embolization appears feasible and safe for treatment of VBDAs. Long-term results were acceptable in a majority of patients studied, despite a relatively high rate of incomplete occlusion immediately after SACE. Major branch involvement and coiled aneurysms with residual sac filling may predispose to

  7. Shrunken head (tsantsa): a complete forensic analysis procedure.

    Science.gov (United States)

    Charlier, P; Huynh-Charlier, I; Brun, L; Hervé, C; de la Grandmaison, G Lorin

    2012-10-10

    Based on the analysis of shrunken heads referred to our forensic laboratory for anthropological expertise, and data from both anthropological and medical literature, we propose a complete forensic procedure for the analysis of such pieces. A list of 14 original morphological criteria has been developed, based on the global aspect, color, physical deformation, anatomical details, and any associated material (wood, vegetal fibers, sand, charcoals, etc.). Such criteria have been tested on a control sample of 20 tsantsa (i.e. shrunken heads from the Jivaro or Shuar tribes of South America). Further complementary analyses are described, such as CT scanning and microscopic examination. Such expertise is increasingly requested of forensic anthropologists and practitioners in the context of the global repatriation of human artifacts to native communities. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  8. Pertinent anatomy and analysis for midface volumizing procedures.

    Science.gov (United States)

    Surek, Christopher C; Beut, Javier; Stephens, Robert; Jelks, Glenn; Lamb, Jerome

    2015-05-01

    The study was conducted to construct an anatomically inspired midfacial analysis facilitating safe, accurate, and dynamic nonsurgical rejuvenation. Emphasis is placed on determining injection target areas and adverse event zones. Twelve hemifacial fresh cadavers were dissected in a layered fashion. Dimensional measurements between the midfacial fat compartments, prezygomatic space, mimetic muscles, and neurovascular bundles were used to develop a topographic analysis for clinical injections. A longitudinal line from the base of the alar crease to the medial edge of the levator anguli oris muscle (1.9 cm), lateral edge of the levator anguli oris muscle (2.6 cm), and zygomaticus major muscle (4.6 cm) partitions the cheek into two aesthetic regions. A six-step facial analysis outlines three target zones and two adverse event zones and triangulates the point of maximum cheek projection. The lower adverse event zone yields an anatomical explanation to inadvertent jowling during anterior cheek injection. The upper adverse event zone localizes the palpebral branch of the infraorbital artery. The medial malar target area isolates quadrants for anterior cheek projection and tear trough effacement. The middle malar target area addresses lid-cheek blending and superficial compartment turgor. The lateral malar target area highlights lateral cheek projection and locates the prezygomatic space. This stepwise analysis illustrates target areas and adverse event zones to achieve midfacial support, contour, and profile in the repose position and simultaneous molding of a natural shape during animation. This reproducible method can be used both procedurally and in record-keeping for midface volumizing procedures.

  9. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis…

  10. Comparative analysis of diagnostic accuracy of different brain biopsy procedures.

    Science.gov (United States)

    Jain, Deepali; Sharma, Mehar Chand; Sarkar, Chitra; Gupta, Deepak; Singh, Manmohan; Mahapatra, A K

    2006-12-01

    Image-guided procedures such as computed tomography (CT) guided, neuronavigator-guided and ultrasound-guided methods can assist neurosurgeons in localizing intraparenchymal lesions of the brain. However, despite improvements in imaging techniques, an accurate diagnosis of an intrinsic lesion requires tissue sampling and histological verification. The present study was carried out to examine the reliability of diagnoses made on tumor samples obtained via different stereotactic and ultrasound-guided brain biopsy procedures. A retrospective analysis was conducted of all brain biopsies (frame-based and frameless stereotactic and ultrasound-guided) performed in a single tertiary care neurosciences center between 1995 and 2005. The overall diagnostic accuracy achieved on histopathology and its correlation with the type of biopsy technique were evaluated. A total of 130 cases were included, consisting of 82 males and 48 females. Age ranged from 4 to 75 years (mean age 39.5 years). Twenty per cent (27 patients) were in the pediatric age group, while 12% (16 patients) were 60 years of age or older. A definitive histological diagnosis was established in 109 cases (diagnostic yield 80.2%), encompassing 101 neoplastic and eight nonneoplastic lesions. Frame-based, frameless stereotactic and ultrasound-guided biopsies were done in 95, 15 and 20 patients, respectively. Although the number of cases was small, there was a trend toward better yield with frameless image-guided stereotactic biopsy, which achieved the maximum diagnostic yield of 87% (13/15), in comparison with conventional frame-based CT-guided stereotactic biopsy and ultrasound-guided biopsy. Overall, a trend of higher diagnostic yield was seen in cases with frameless image-guided stereotactic biopsy. Thus, this small series confirms that frameless neuronavigator-guided stereotactic procedures represent the lesion sufficiently to permit a histopathologic diagnosis.

  11. A receptor model for urban aerosols based on oblique factor analysis

    DEFF Research Database (Denmark)

    Keiding, Kristian; Sørensen, Morten S.; Pind, Niels

    1987-01-01

    A procedure is outlined for the construction of receptor models of urban aerosols, based on factor analysis. The advantage of the procedure is that the covariation of source impacts is included in the construction of the models. The results are compared with results obtained by other receptor-modelling procedures. It was found that procedures based on correlating sources were physically sound as well as in mutual agreement. Procedures based on non-correlating sources were found to generate physically obscure models.
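
The extraction step that precedes any rotation (oblique or orthogonal) in such factor-analysis receptor models can be sketched with a power iteration on the covariance matrix; this is a generic pure-Python illustration, not the authors' procedure, and the function name is ours:

```python
def dominant_factor(cov, iters=200):
    """Power iteration: dominant eigenpair of a covariance matrix,
    i.e. the first principal component used in PCA-style extraction.
    Rotation (oblique or orthogonal) would be applied afterwards."""
    n = len(cov)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # Rayleigh quotient of the (unit-norm) converged vector
    eigval = sum(v[i] * sum(cov[i][j] * v[j] for j in range(n)) for i in range(n))
    return eigval, v

# toy 2x2 covariance: the dominant axis carries variance 2
eigval, axis = dominant_factor([[2.0, 0.0], [0.0, 1.0]])
```

Real receptor modelling would extract several factors from measured species concentrations and then rotate them; this shows only the shared first step.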

  12. A study on the identification of cognitive complexity factors related to the complexity of procedural steps

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jeong, Kwang Sup; Jung, Won Dea [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    In complex systems, it is well recognized that the provision of understandable procedures that allow operators to clarify 'what needs to be done' and 'how to do it' is one of the requisites to confirm their safety. In this regard, the step complexity (SC) measure that can quantify the complexity of procedural steps in emergency operating procedures (EOPs) of a nuclear power plant (NPP) was suggested. However, the necessity of additional complexity factors that can consider a cognitive aspect in evaluating the complexity of procedural steps is evinced from the comparisons between SC scores and operators' performance data. To this end, the comparisons between operators' performance data with their behavior in conducting prescribed activities of procedural steps are conducted in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect operators' cognitive burden are identified. Although a well-designed experiment is indispensable in confirming the appropriateness of cognitive complexity factors, it is strongly believed that the change of an operator's performance can be more authentically explained if they are taken into consideration.

  13. A study on the identification of cognitive complexity factors related to the complexity of procedural steps

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jeong, Kwang Sup; Jung, Won Dea

    2004-01-01

    In complex systems, it is well recognized that the provision of understandable procedures that allow operators to clarify 'what needs to be done' and 'how to do it' is one of the requisites to confirm their safety. In this regard, the step complexity (SC) measure that can quantify the complexity of procedural steps in emergency operating procedures (EOPs) of a nuclear power plant (NPP) was suggested. However, the necessity of additional complexity factors that can consider a cognitive aspect in evaluating the complexity of procedural steps is evinced from the comparisons between SC scores and operators' performance data. To this end, the comparisons between operators' performance data with their behavior in conducting prescribed activities of procedural steps are conducted in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect operators' cognitive burden are identified. Although a well-designed experiment is indispensable in confirming the appropriateness of cognitive complexity factors, it is strongly believed that the change of an operator's performance can be more authentically explained if they are taken into consideration

  14. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  15. 75 FR 72739 - Compliance Testing Procedures: Correction Factor for Room Air Conditioners

    Science.gov (United States)

    2010-11-26

    ...: Correction Factor for Room Air Conditioners AGENCY: Office of the General Counsel, Department of Energy (DOE... air conditioners. The petition seeks temporary enforcement forbearance, or in the alternative, a... procedures for room air conditioners. Public comment is requested on whether DOE should grant the petition...

  16. Assessment of job stress factors and organizational personality types for procedure-based jobs in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Dae-Ho; Lee, Yong-Hee; Lee, Jung-Woon

    2008-01-01

    The purpose of this study is to assess the organizational types and the job stress factors that affect procedure-based job performances in nuclear power plants. We derived 24 organizational factors affecting job stress level in nuclear power plants from the job stress analysis models developed by NIOSH, JDI, and IOR. Considering the safety characteristics in the operating tasks of nuclear power plants, we identified the job contents and characteristics through the analyses of job assignments that appeared in the organizational chart and the results of an activity-based costing. By using questionnaire surveys and structured interviews with the plant personnel and expert panels, we assessed 70 jobs among the 777 jobs managed officially in accordance with the procedures. They consist of the representative jobs of each department and are directly related to safety. We utilized the organizational personality type indicators to characterize the personality types of each organization in nuclear power plants. (author)

  17. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    Science.gov (United States)

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists, the decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic, and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
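
The core of PROMETHEE II is small enough to sketch in pure Python; the scores, weights, and the simple "usual" preference function below are hypothetical, for illustration only, not the study's actual data:

```python
def promethee_ii(scores, weights):
    """Minimal PROMETHEE II net outranking flows.

    scores[i][j]: performance of alternative i on criterion j
    (all criteria to be maximized; weights sum to 1).
    Uses the 'usual' preference function: any positive difference
    counts as full preference on that criterion."""
    n = len(scores)
    def pref(a, b):  # weighted preference of alternative a over b
        return sum(w for sa, sb, w in zip(scores[a], scores[b], weights) if sa > sb)
    phi_pos = [sum(pref(a, b) for b in range(n) if b != a) / (n - 1) for a in range(n)]
    phi_neg = [sum(pref(b, a) for b in range(n) if b != a) / (n - 1) for a in range(n)]
    return [p - m for p, m in zip(phi_pos, phi_neg)]

# three hypothetical procedures scored on greenness and economy (higher is better)
net = promethee_ii([[0.9, 0.4], [0.5, 0.8], [0.2, 0.6]], [0.6, 0.4])
ranking = sorted(range(len(net)), key=lambda i: net[i], reverse=True)  # best first
```

Net flows sum to zero by construction; alternatives are ranked by descending net flow. A full implementation would add the other five preference-function shapes and indifference/preference thresholds.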

  18. Development of a draft of human factors safety review procedures for the Korean next generation reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Moon, B. S.; Park, J. C.; Lee, Y. H.; Oh, I. S.; Lee, H. C. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    In this study, a draft of human factors engineering (HFE) safety review procedures (SRP) was developed for the safety review of KNGR based on HFE Safety and Regulatory Requirements and Guidelines (SRRG). This draft includes acceptance criteria, review procedure, and evaluation findings for the areas of review including HFE Program Management, Human Factors Analyses, Human Factors Design, and HFE Verification and Validation, based on Section 15.1 'Human Factors Engineering Design Process' and 15.2 'Control Room Human Factors Engineering' of KNGR Specific Safety Requirements and Chapter 15 'Human Factors Engineering' of KNGR Safety Regulatory Guides. For the effective review, human factors concerns or issues related to advanced HSI design that have been reported so far should be extensively examined. In this study, a total of 384 human factors issues related to the advanced HSI design were collected through our review of a total of 145 documents. A summary of each issue was described and the issues were identified by specific features of HSI design. These results were implemented into a database system. 8 refs., 2 figs. (Author)

  19. Development of a draft of human factors safety review procedures for the Korean Next Generation Reactor

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Moon, B. S.; Park, J. C.; Lee, Y. H.; Oh, I. S.; Lee, H. C.

    2000-02-01

    In this study, a draft of Human Factors Engineering (HFE) Safety Review Procedures (SRP) was developed for the safety review of KNGR based on HFE Safety and Regulatory Requirements and Guidelines (SRRG). This draft includes acceptance criteria, review procedure, and evaluation findings for the areas of review including HFE program management, human factors analyses, human factors design, and HFE verification and validation, based on section 15.1 'human factors engineering design process' and 15.2 'control room human factors engineering' of KNGR specific safety requirements and chapter 15 'human factors engineering' of KNGR safety regulatory guides. For the effective review, human factors concerns or issues related to advanced HSI design that have been reported so far should be extensively examined. In this study, a total of 384 human factors issues related to the advanced HSI design were collected through our review of a total of 145 documents. A summary of each issue was described and the issues were identified by specific features of HSI design. These results were implemented into a database system

  20. Scenes for Social Information Processing in Adolescence: Item and factor analytic procedures for psychometric appraisal.

    Science.gov (United States)

    Vagos, Paula; Rijo, Daniel; Santos, Isabel M

    2016-04-01

    Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years old) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two measures for attribution of intent were produced, including hostile and neutral; along with 3 emotion measures, focused on negative emotional states; 8 response evaluation measures; and 4 response decision measures, including prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys seemed to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, and assertiveness and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems a valuable alternative for evaluating social information processing, even if it is essential to continue investigation into its internal and external validity. (c) 2016 APA, all rights reserved.

  1. Supplement to procedures, analysis, and comparison of groundwater velocity measurement methods for unconfined aquifers

    International Nuclear Information System (INIS)

    Zinkl, R.J.; Kearl, P.M.

    1988-09-01

    This report is a supplement to Procedures, Analysis, and Comparison of Groundwater Velocity Measurement Methods for Unconfined Aquifers and provides computer program descriptions, type curves, and calculations for the analysis of field data in determining groundwater velocity in unconfined aquifers. The computer programs analyze bail or slug tests, pumping tests, Geoflo Meter data, and borehole dilution data. Appendix A is a description of the code, instructions for using the code, an example data file, and the calculated results to allow checking the code after installation on the user's computer. Calculations, development of formulas, and correction factors for the various programs are presented in Appendices B through F. Appendix G provides a procedure for calculating transmissivity and specific yield for pumping tests performed in unconfined aquifers

  2. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  3. Fracture analysis procedure for cast austenitic stainless steel pipe with an axial crack

    International Nuclear Information System (INIS)

    Kamaya, Masayuki

    2012-01-01

    Since the ductility of cast austenitic stainless steel pipes decreases due to thermal aging embrittlement after long term operation, not only plastic collapse failure but also unstable ductile crack propagation (elastic-plastic failure) should be taken into account for the structural integrity assessment of cracked pipes. In the fitness-for-service code of the Japan Society of Mechanical Engineers (JSME), Z-factor is used to incorporate the reduction in failure load due to elastic-plastic failure. However, the JSME code does not provide the Z-factor for axial cracks. In this study, Z-factor for axial cracks in aged cast austenitic stainless steel pipes was derived. Then, a comparison was made for the elastic-plastic failure load obtained from different analysis procedures. It was shown that the obtained Z-factor could derive reasonable elastic-plastic failure loads, although the failure loads were more conservative than those obtained by the two-parameter method. (author)
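
The role of the Z-factor can be shown with a one-line relation; this is a sketch of the general idea, not the JSME code's exact formulation, and the names and numbers are illustrative:

```python
def elastic_plastic_failure_load(plastic_collapse_load, z_factor):
    """Z (> 1) reduces the plastic collapse load to account for
    unstable ductile crack propagation in thermally aged material."""
    return plastic_collapse_load / z_factor

# e.g. a hypothetical Z of 1.25 lowers a 100 kN collapse load to 80 kN
reduced = elastic_plastic_failure_load(100.0, 1.25)
```

In practice Z depends on crack geometry and material condition, which is why a dedicated Z-factor had to be derived for axial cracks.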

  4. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Developing a national brand is one of the most important issues in brand development. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors; the sample was drawn from two major Iranian automakers, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 84%, well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors, including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.
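
A Cronbach's alpha figure like the one quoted can be reproduced for any item-by-respondent score matrix with a short routine; this is a generic pure-Python sketch (population variances), not the authors' computation:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. items[j] holds every respondent's score on item j."""
    k = len(items)
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(col) for col in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# three perfectly parallel items give perfect internal consistency
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])  # ≈ 1.0
```

Values above roughly 0.70 are conventionally taken as acceptable internal consistency, which is the threshold the abstract cites.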

  5. PROCEDURE FOR ANALYSIS AND EVALUATION OF MARKET POSITION PRODUCTION ORGANIZATION

    Directory of Open Access Journals (Sweden)

    A. N. Polozova

    2014-01-01

    Summary. Methodical procedures are proposed for the economic monitoring of the market position of an industrial organization, particularly one engaged in food production. The approach includes five elements: the matrix «components of business processes», the matrix «materiality – efficiency», the matrix «materiality – relevance», the matrix of facilitating and hindering factors, and the matrix of operation scenarios. Components for assessing the strengths and weaknesses of the business activities of organizations are substantiated; they characterize the state of the internal business environment along the elements of production, organization, personnel, finance, and marketing. The advantages of the matrix «materiality – relevance» are described: it consists of two materiality levels, high and low, and three directions of relevance: «no change», «gains importance in the future», and «loses importance in the future». The contents of the matrix «scenarios for the functioning of the organization» are presented, involving six attribute levels, 10 classes of scenarios, and 19 activities, including optimistic and pessimistic ones. An evaluation is given of the primary classes of scenarios, characterized by the properties of «development», «dynamic equilibrium», «quality improvement», «competitiveness», «favorable realization of opportunities», and «competition resistance».

  6. The Infinitesimal Jackknife with Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…
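
By way of contrast with the infinitesimal version, the ordinary delete-one jackknife standard error takes only a few lines; this pure-Python sketch is generic (for the sample mean it reproduces the familiar s/√n), not the article's method:

```python
import math

def jackknife_se(data, stat):
    """Delete-one jackknife standard error of statistic `stat`."""
    n = len(data)
    reps = [stat(data[:i] + data[i + 1:]) for i in range(n)]  # leave-one-out replicates
    mean_rep = sum(reps) / n
    return math.sqrt((n - 1) / n * sum((r - mean_rep) ** 2 for r in reps))

mean = lambda xs: sum(xs) / len(xs)
se = jackknife_se([1, 2, 3, 4, 5], mean)  # ≈ 0.707, matching s/sqrt(n)
```

The infinitesimal jackknife replaces the discrete leave-one-out perturbations with derivatives of the statistic with respect to observation weights, which is what makes it applicable to rotated loadings and factor correlations.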

  7. Structural Analysis of Correlated Factors: Lessons from the Verbal-Performance Dichotomy of the Wechsler Scales.

    Science.gov (United States)

    Macmann, Gregg M.; Barnett, David W.

    1994-01-01

    Describes exploratory and confirmatory analyses of verbal-performance procedures to illustrate concepts and procedures for analysis of correlated factors. Argues that, based on convergent and discriminant validity criteria, factors should have higher correlations with variables that they purport to measure than with other variables. Discusses…

  8. Sensitivity of the diagnostic radiological index of protection to procedural factors in fluoroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Jones, A. Kyle, E-mail: kyle.jones@mdanderson.org [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Pasciak, Alexander S. [Department of Radiology, The University of Tennessee Medical Center at Knoxville, Knoxville, Tennessee 37922 (United States); Wagner, Louis K. [Department of Diagnostic and Interventional Imaging, The John P. and Katharine G. McGovern Medical School, Houston, Texas 77030 (United States)

    2016-07-15

    Purpose: To evaluate the sensitivity of the diagnostic radiological index of protection (DRIP), used to quantify the protective value of radioprotective garments, to procedural factors in fluoroscopy in an effort to determine an appropriate set of scatter-mimicking primary beams to be used in measuring the DRIP. Methods: Monte Carlo simulations were performed to determine the shape of the scattered x-ray spectra incident on the operator in different clinical fluoroscopy scenarios, including interventional radiology and interventional cardiology (IC). Two clinical simulations studied the sensitivity of the scattered spectrum to gantry angle and patient size, while technical factors were varied according to measured automatic dose rate control (ADRC) data. Factorial simulations studied the sensitivity of the scattered spectrum to gantry angle, field of view, patient size, and beam quality for constant technical factors. Average energy (E_avg) was the figure of merit used to condense fluence in each energy bin to a single numerical index. Results: Beam quality had the strongest influence on the scattered spectrum in fluoroscopy. Many procedural factors affect the scattered spectrum indirectly through their effect on primary beam quality through ADRC, e.g., gantry angle and patient size. Lateral C-arm rotation, common in IC, increased the energy of the scattered spectrum, regardless of the direction of rotation. The effect of patient size on scattered radiation depended on ADRC characteristics, patient size, and procedure type. Conclusions: The scattered spectrum striking the operator in fluoroscopy is most strongly influenced by primary beam quality, particularly kV. Use cases for protective garments should be classified by typical procedural primary beam qualities, which are governed by the ADRC according to the impacts of patient size, anatomical location, and gantry angle.

  9. Main clinical, therapeutic and technical factors related to patient's maximum skin dose in interventional cardiology procedures

    Science.gov (United States)

    Journy, N; Sinno-Tellier, S; Maccia, C; Le Tertre, A; Pirard, P; Pagès, P; Eilstein, D; Donadieu, J; Bar, O

    2012-01-01

    Objective The study aimed to characterise the factors related to the X-ray dose delivered to the patient's skin during interventional cardiology procedures. Methods We studied 177 coronary angiographies (CAs) and/or percutaneous transluminal coronary angioplasties (PTCAs) carried out in a French clinic on the same radiography table. The clinical and therapeutic characteristics, and the technical parameters of the procedures, were collected. The dose area product (DAP) and the maximum skin dose (MSD) were measured by an ionisation chamber (Diamentor; Philips, Amsterdam, The Netherlands) and radiosensitive film (Gafchromic; International Specialty Products Advanced Materials Group, Wayne, NJ). Multivariate analyses were used to assess the effects of the factors of interest on dose. Results The mean MSD and DAP were respectively 389 mGy and 65 Gy·cm² for CAs, and 916 mGy and 69 Gy·cm² for PTCAs. For 8% of the procedures, the MSD exceeded 2 Gy. Although a linear relationship between the MSD and the DAP was observed for CAs (r=0.93), a simple extrapolation of such a model to PTCAs would lead to an inadequate assessment of the risk, especially for the highest dose values. For PTCAs, the body mass index, the therapeutic complexity, the fluoroscopy time and the number of cine frames were independent explanatory factors of the MSD, whoever the practitioner was. Moreover, the effect of technical factors such as collimation, cinematography settings and X-ray tube orientations on the DAP was shown. Conclusion Optimising the technical options for interventional procedures and training staff on radiation protection might notably reduce the dose and ultimately avoid patient skin lesions. PMID:22457404
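
The strength of such a linear MSD–DAP relationship is summarized by Pearson's r, which is straightforward to compute; the (DAP, MSD) pairs below are made up for illustration and are not the study's data:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# hypothetical (DAP, MSD) pairs; a near-linear relation gives r close to 1
dap = [10, 25, 40, 60, 90]    # Gy·cm²
msd = [70, 160, 250, 390, 560]  # mGy
r = pearson_r(dap, msd)
```

A high r for CAs justifies estimating MSD from the routinely recorded DAP; the abstract's point is that the same shortcut fails for the more complex PTCAs.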

  10. An Aggregate IRT Procedure for Exploratory Factor Analysis

    NARCIS (Netherlands)

    Camilli, Gregory; Fox, Gerardus J.A.

    2015-01-01

An aggregation strategy is proposed to potentially address practical limitations related to computing resources for two-level multidimensional item response theory (MIRT) models with large data sets. The aggregate model is derived by integration of the normal ogive model, and an adaptation of the …

  11. An Aggregate IRT Procedure for Exploratory Factor Analysis

    Science.gov (United States)

    Camilli, Gregory; Fox, Jean-Paul

    2015-01-01

An aggregation strategy is proposed to potentially address practical limitations related to computing resources for two-level multidimensional item response theory (MIRT) models with large data sets. The aggregate model is derived by integration of the normal ogive model, and an adaptation of the stochastic approximation expectation maximization…

  12. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of the multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented

  13. A finite volume procedure for fluid flow, heat transfer and solid-body stress analysis

    KAUST Repository

    Jagad, P. I.; Puranik, B. P.; Date, A. W.

    2018-01-01

A unified cell-centered unstructured mesh finite volume procedure is presented for fluid flow, heat transfer and solid-body stress analysis. An in-house procedure (A. W. Date, Solution of Transport Equations on Unstructured Meshes with Cell…

  14. Procedural-support music therapy in the healthcare setting: a cost-effectiveness analysis.

    Science.gov (United States)

    DeLoach Walworth, Darcy

    2005-08-01

This comparative analysis examined the cost-effectiveness of music therapy as a procedural support in the pediatric healthcare setting. Many healthcare organizations are actively attempting to reduce the amount of sedation for pediatric patients undergoing various procedures. Patients receiving music therapy-assisted computerized tomography scans (n = 57), echocardiograms (n = 92), and other procedures (n = 17) were included in the analysis. Results of music therapy-assisted procedures indicate successful elimination of patient sedation, reduction in procedural times, and decrease in the number of staff members present for procedures. Implications for nurses and music therapists in the healthcare setting are discussed.

  15. U.F.F.A.: A numerical procedure for fatigue analysis according to ASME code

    International Nuclear Information System (INIS)

    Bellettato, W.; Ticozzi, C.; Zucchini, C.

    1981-01-01

A new procedure is developed which employs some already used methodologies and introduces some new concepts; the computer code UFFA implements it. The first part of this paper describes the methodology used for the usage factor calculation, the second part gives a general description of the code, and the third part shows some examples and their respective results. We assume an elastic behaviour of the materials and do not consider the effect of the order in which the loads are applied. Moreover, we assume the hypothesis of cumulative damage, i.e. we apply Miner's rule. One of the problems in nuclear component fatigue analysis is that the load histories contain a high number of operational cycles for which no succession in time can be specified. Therefore, the concept of 'level' (or steady working status) was introduced, by which the load conditions can be approximated in a realistic way. As regards multiaxial cases, it can be shown that it is neither correct nor conservative to analyse the 3 stress differences separately and then take the maximum of the 3 computed usage factors as the component usage factor. Indeed, since the stresses act on the structure at the same time, a simultaneous analysis of the 3 stress difference curves is necessary. The computer code can also deal with the case of shear stresses (varying principal stress directions) through the ASME 'normalization' procedure. The results of the UFFA program, compared with the results of other programs currently in use, come up to expectations. (orig./HP)

  16. Pricing of common cosmetic surgery procedures: local economic factors trump supply and demand.

    Science.gov (United States)

    Richardson, Clare; Mattison, Gennaya; Workman, Adrienne; Gupta, Subhas

    2015-02-01

    The pricing of cosmetic surgery procedures has long been thought to coincide with laws of basic economics, including the model of supply and demand. However, the highly variable prices of these procedures indicate that additional economic contributors are probable. The authors sought to reassess the fit of cosmetic surgery costs to the model of supply and demand and to determine the driving forces behind the pricing of cosmetic surgery procedures. Ten plastic surgery practices were randomly selected from each of 15 US cities of various population sizes. Average prices of breast augmentation, mastopexy, abdominoplasty, blepharoplasty, and rhytidectomy in each city were compared with economic and demographic statistics. The average price of cosmetic surgery procedures correlated substantially with population size (r = 0.767), cost-of-living index (r = 0.784), cost to own real estate (r = 0.714), and cost to rent real estate (r = 0.695) across the 15 US cities. Cosmetic surgery pricing also was found to correlate (albeit weakly) with household income (r = 0.436) and per capita income (r = 0.576). Virtually no correlations existed between pricing and the density of plastic surgeons (r = 0.185) or the average age of residents (r = 0.076). Results of this study demonstrate a correlation between costs of cosmetic surgery procedures and local economic factors. Cosmetic surgery pricing cannot be completely explained by the supply-and-demand model because no association was found between procedure cost and the density of plastic surgeons. © 2015 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.
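The correlation analysis described above is a straightforward Pearson computation across cities. A hedged sketch with hypothetical city-level data (all values below are invented for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data for 15 cities.
cost_of_living = rng.uniform(85, 180, size=15)
# Assume average procedure price tracks cost of living plus noise.
avg_price = 40.0 * cost_of_living + rng.normal(0, 500, size=15)
surgeon_density = rng.uniform(1, 10, size=15)   # unrelated by construction

r_col = np.corrcoef(cost_of_living, avg_price)[0, 1]
r_density = np.corrcoef(surgeon_density, avg_price)[0, 1]

print(round(r_col, 2), round(r_density, 2))
```

By construction this mirrors the study's pattern: a strong price correlation with cost of living and essentially none with surgeon density.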

  17. Analysis of Bernstein's factorization circuit

    NARCIS (Netherlands)

    Lenstra, A.K.; Shamir, A.; Tomlinson, J.; Tromer, E.; Zheng, Y.

    2002-01-01

In [1], Bernstein proposed a circuit-based implementation of the matrix step of the number field sieve factorization algorithm. These circuits offer an asymptotic cost reduction under the measure "construction cost x run time". We evaluate the cost of these circuits, in agreement with [1], but argue…

  18. Analysis of decision procedures for a sequence of inventory periods

    International Nuclear Information System (INIS)

    Avenhaus, R.

    1982-07-01

    Optimal test procedures for a sequence of inventory periods will be discussed. Starting with a game theoretical description of the conflict situation between the plant operator and the inspector, the objectives of the inspector as well as the general decision theoretical problem will be formulated. In the first part the objective of 'secure' detection will be emphasized which means that only at the end of the reference time a decision is taken by the inspector. In the second part the objective of 'timely' detection will be emphasized which will lead to sequential test procedures. At the end of the paper all procedures will be summarized, and in view of the multitude of procedures available at the moment some comments about future work will be given. (orig./HP) [de

  19. Procedures for uncertainty and sensitivity analysis in repository performance assessment

    International Nuclear Information System (INIS)

    Poern, K.; Aakerlund, O.

    1985-10-01

The objective of the project was mainly a literature study of available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complete models such as those concerning geologic disposal of radioactive waste. The study, which has run parallel with the development of a code package (PROPER) for computer assisted analysis of function, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors like the number of input parameters, the capacity of the model and the computer resources required to use the model. Two basic approaches are addressed in the report. In one of these the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. Applying the other basic method the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)
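The first of the two basic approaches, direct simulation of the model by a sampling technique, can be sketched as follows. The toy model and input distributions are assumptions for illustration only and have nothing to do with the PROPER package itself:

```python
import numpy as np

# Toy "repository model": output depends nonlinearly on two uncertain inputs.
def model(k, q):
    return q * np.exp(-k)

rng = np.random.default_rng(42)
n = 10_000
k = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # uncertain decay-like input
q = rng.uniform(0.5, 1.5, size=n)                # uncertain source-like input

# Direct sampling: push the input samples through the model.
y = model(k, q)

# The output distribution is summarized by percentiles.
p5, p50, p95 = np.percentile(y, [5, 50, 95])

# Crude sensitivity measure: correlation of each input with the output.
s_k = np.corrcoef(k, y)[0, 1]
s_q = np.corrcoef(q, y)[0, 1]
```

The response-surface alternative mentioned in the abstract would replace `model` with a cheap fitted approximation before the sampling phase.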

  20. Performing MR-guided biopsies in clinical routine: factors that influence accuracy and procedure time

    International Nuclear Information System (INIS)

    Hoffmann, Ruediger; Thomas, Christoph; Rempp, Hansjoerg; Schmidt, Diethard; Claussen, Claus D.; Clasen, Stephan; Pereira, Philippe L.

    2012-01-01

To assess the accuracy, the duration and the factors that influence the duration of MRI-guided liver or soft-tissue biopsies. Nineteen liver biopsies and 19 soft-tissue biopsies performed under 1.5-T MRI guidance were retrospectively analysed. Diagnostic performance and complications were assessed. Intervention time was subdivided into preparation period, puncture period and control period. Correlations between procedure time and target size, skin-to-target distance, sequences used and interventionalists' experience were analysed. Overall sensitivity, specificity and accuracy were 0.86, 1.0 and 0.92, respectively. Two minor complications occurred. Overall median procedure time was 103.5 min. Liver biopsies lasted longer than soft-tissue biopsies (mean [soft-tissue]: 73.0 min; mean [liver]: 134.1 min), and the procedure time (P [liver] = 0.048, P [soft-tissue] = 0.005) was significantly prolonged for longer skin-to-target distances. Lower numbers of image acquisitions (P [liver] = 0.0007, P [soft-tissue] = 0.0012) and the interventionalists' experience reduced the procedure duration significantly (P < 0.05); moreover, all false-negative results occurred during the first five biopsies of each individual radiologist. The interventionalists' experience, skin-to-target distance and number of image acquisitions influence the procedure time significantly. (orig.)

  1. Work procedures and risk factors for high radiation exposure among radiologic technologists in South Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Young; Choi, Yeong Chull [Dept. of Preventive Medicine, Keimyung University College of Medicine, Daegu (Korea, Republic of); Lee, Won Jin; Cha, Eun Shil [Dept. of Preventive Medicine, Korea University College of Medicine, Seoul (Korea, Republic of)

    2016-12-15

Radiologic technologists currently make up 31.5% of diagnostic radiation workers in South Korea and receive the highest annual and collective doses among them. A comprehensive assessment of work practices and the associated radiation doses from diagnostic radiology procedures should be undertaken for effective prevention. Using a national survey, this study aimed (1) to explore the distribution of the work procedures performed by gender, (2) to evaluate occupational radiation exposure by work characteristics and safety compliance, and (3) to identify the primary factors influencing high radiation exposure among radiologic technologists in South Korea. The study provides detailed information on work practices, the number of procedures performed on a weekly basis, and occupational radiation doses among radiologic technologists in South Korea. The average radiation dose for radiologic technologists is higher than in other countries, and the type of facility, work safety and the wearing of a lead apron explained a considerable portion of the increased risk in the association between radiology procedures and radiation exposure.

  2. Work procedures and risk factors for high radiation exposure among radiologic technologists in South Korea

    International Nuclear Information System (INIS)

    Kim, Jae Young; Choi, Yeong Chull; Lee, Won Jin; Cha, Eun Shil

    2016-01-01

Radiologic technologists currently make up 31.5% of diagnostic radiation workers in South Korea and receive the highest annual and collective doses among them. A comprehensive assessment of work practices and the associated radiation doses from diagnostic radiology procedures should be undertaken for effective prevention. Using a national survey, this study aimed (1) to explore the distribution of the work procedures performed by gender, (2) to evaluate occupational radiation exposure by work characteristics and safety compliance, and (3) to identify the primary factors influencing high radiation exposure among radiologic technologists in South Korea. The study provides detailed information on work practices, the number of procedures performed on a weekly basis, and occupational radiation doses among radiologic technologists in South Korea. The average radiation dose for radiologic technologists is higher than in other countries, and the type of facility, work safety and the wearing of a lead apron explained a considerable portion of the increased risk in the association between radiology procedures and radiation exposure.

  3. Text mining factor analysis (TFA) in green tea patent data

    Science.gov (United States)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

Factor analysis has become one of the most widely used multivariate statistical procedures in applied research across a multitude of domains. There are two main types of analysis based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model the observed relationships among a group of indicators with a latent variable, but they differ fundamentally in the a priori restrictions placed on the factor model. This method is applied to patent data in the green tea technology sector to trace the development of green tea technology worldwide; patent analysis is useful in identifying future technological trends in a specific field of technology. The patent database was obtained from the European Patent Organization (EPO). In this paper, the CFA model is applied to nominal data obtained from a presence-absence matrix; for such nominal data the CFA is based on the tetrachoric correlation matrix. Meanwhile, the EFA model is applied to titles from the dominant technology sector, which are first pre-processed using text mining.
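The tetrachoric-matrix step for presence-absence data can be sketched with Pearson's cosine-pi approximation, a closed-form stand-in for full maximum-likelihood tetrachoric estimation. The patent-term matrix below is synthetic, and the function name is a convenience for this sketch:

```python
import numpy as np

def tetrachoric_approx(x, y):
    """Cosine-pi approximation to the tetrachoric correlation of two
    binary vectors (Pearson's approximation, not the ML estimate)."""
    a = np.sum((x == 1) & (y == 1)) + 0.5  # +0.5 guards against empty cells
    b = np.sum((x == 1) & (y == 0)) + 0.5
    c = np.sum((x == 0) & (y == 1)) + 0.5
    d = np.sum((x == 0) & (y == 0)) + 0.5
    odds = (a * d) / (b * c)
    return np.cos(np.pi / (1.0 + np.sqrt(odds)))

# Synthetic presence-absence matrix: 200 "patents" x 4 binary "terms"
# that all reflect one latent technology theme.
rng = np.random.default_rng(7)
latent = rng.normal(size=200)
X = np.column_stack([(latent + rng.normal(size=200)) > 0
                     for _ in range(4)]).astype(int)

p = X.shape[1]
R = np.eye(p)
for i in range(p):
    for j in range(i + 1, p):
        R[i, j] = R[j, i] = tetrachoric_approx(X[:, i], X[:, j])

# One-factor check: the leading eigenvalue of R should dominate.
eigvals = np.linalg.eigvalsh(R)[::-1]
```

The matrix `R` would then serve as input to the CFA/EFA step in place of an ordinary Pearson correlation matrix.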

  4. Factors that influence length of stay for in-patient gynaecology surgery: is the Case Mix Group (CMG) or type of procedure more important?

    Science.gov (United States)

    Carey, Mark S; Victory, Rahi; Stitt, Larry; Tsang, Nicole

    2006-02-01

    To compare the association between the Case Mix Group (CMG) code and length of stay (LOS) with the association between the type of procedure and LOS in patients admitted for gynaecology surgery. We examined the records of women admitted for surgery in CMG 579 (major uterine/adnexal procedure, no malignancy) or 577 (major surgery ovary/adnexa with malignancy) between April 1997 and March 1999. Factors thought to influence LOS included age, weight, American Society of Anesthesiologists (ASA) score, physician, day of the week on which surgery was performed, and procedure type. Procedures were divided into six categories, four for CMG 579 and two for CMG 577. Data were abstracted from the hospital information costing system (T2 system) and by retrospective chart review. Multivariable analysis was performed using linear regression with backwards elimination. There were 606 patients in CMG 579 and 101 patients in CMG 577, and the corresponding median LOS was four days (range 1-19) for CMG 579 and nine days (range 3-30) for CMG 577. Combined analysis of both CMGs 577 and 579 revealed the following factors as highly significant determinants of LOS: procedure, age, physician, and ASA score. Although confounded by procedure type, the CMG did not significantly account for differences in LOS in the model if procedure was considered. Pairwise comparisons of procedure categories were all found to be statistically significant, even when controlled for other important variables. The type of procedure better accounts for differences in LOS by describing six statistically distinct procedure groups rather than the traditional two CMGs. It is reasonable therefore to consider changing the current CMG codes for gynaecology to a classification based on the type of procedure.
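Linear regression with backwards elimination, as used in the multivariable analysis above, can be sketched as follows. The LOS-like data, predictor names and the |t| >= 2 stopping rule are illustrative assumptions (a full analysis would use exact p-values):

```python
import numpy as np

def backward_elimination(X, y, names, t_threshold=2.0):
    """OLS with backward elimination: repeatedly drop the predictor with
    the smallest |t| until every remaining |t| >= t_threshold."""
    keep = list(range(X.shape[1]))
    while keep:
        Xd = np.column_stack([np.ones(len(y)), X[:, keep]])
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        resid = y - Xd @ beta
        dof = len(y) - Xd.shape[1]
        sigma2 = (resid @ resid) / dof
        se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xd.T @ Xd)))
        t_abs = np.abs(beta[1:] / se[1:])        # skip the intercept
        worst = int(np.argmin(t_abs))
        if t_abs[worst] >= t_threshold:
            break                                 # all predictors significant
        del keep[worst]                           # drop the weakest predictor
    if not keep:
        return [], {}
    kept_names = [names[i] for i in keep]
    return kept_names, dict(zip(kept_names, beta[1:]))

# Synthetic LOS-like data: procedure complexity and age matter, weekday does not.
rng = np.random.default_rng(3)
n = 300
proc = rng.integers(1, 7, n).astype(float)   # hypothetical procedure category score
age = rng.uniform(20, 80, n)
weekday = rng.integers(0, 5, n).astype(float)
los = 1.0 + 1.2 * proc + 0.05 * age + rng.normal(0, 1.0, n)

kept, coefs = backward_elimination(np.column_stack([proc, age, weekday]),
                                   los, ["proc", "age", "weekday"])
```

On data like this, the informative predictors survive elimination while the noise predictor is typically dropped, mirroring how procedure type and age emerge as determinants of LOS.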

  5. A scaling procedure for the response of an isolated system with high modal overlap factor

    Science.gov (United States)

    De Rosa, S.; Franco, F.

    2008-10-01

    The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system: it has been named Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses to increase the frequency band of validity of the discrete or continuous coordinates model through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper discusses the ASMA vs. the statistical energy analysis and the energy distribution approach. Some insights are also given about the limits of the scaling coefficient. Finally it is shown that the linear dynamic response, predicted with the scaling procedure, has the same quality and characteristics of the statistical energy analysis, but it can be useful when the system cannot be solved appropriately by the standard Statistical Energy Analysis (SEA).

  6. Solution Tree Problem Solving Procedure for Engineering Analysis ...

    African Journals Online (AJOL)

    Illustrations are provided in the thermofluid engineering area to showcase the procedure's applications. This approach has proved to be a veritable tool for enhancing the problem-solving and computer algorithmic skills of engineering students, eliciting their curiosity, active participation and appreciation of the taught course.

  7. A Comparative Analysis of the Procedure Employed in Item ...

    African Journals Online (AJOL)

    Zimbabwe Journal of Educational Research ... and psychological scales designed to measure constructs in education and social sciences were purposively selected for the study based on accessibility and availability of validation information. The instruments used for the study were scaling procedures used in 27 published ...

  8. Analysis of emergency response procedures and air traffic accidents ...

    African Journals Online (AJOL)

    Incessant air transport accidents have been a source of concern to stakeholders and aviation experts in Nigeria, yet the response and process has not been adequately appraised. This study attempts an evaluation of the emergency response procedures in the aviation industry with particular focus on Murtala Muhammed ...

  9. Multiple factor analysis by example using R

    CERN Document Server

    Pagès, Jérôme

    2014-01-01

Multiple factor analysis (MFA) enables users to analyze tables of individuals and variables in which the variables are structured into quantitative, qualitative, or mixed groups. Written by the co-developer of this methodology, Multiple Factor Analysis by Example Using R brings together the theoretical and methodological aspects of MFA. It also includes examples of applications and details of how to implement MFA using an R package (FactoMineR). The first two chapters cover the basic factorial analysis methods of principal component analysis (PCA) and multiple correspondence analysis (MCA). The…

  10. Analysis of assistance procedures to normal birth in primiparous

    Directory of Open Access Journals (Sweden)

    Joe Luiz Vieira Garcia Novo

    2016-04-01

Full Text Available Introduction: Current medical technologies in childbirth care have increased maternal and fetal benefits, yet numerous unnecessary procedures persist. The purpose of normal childbirth care is to have healthy women and newborns, using a minimum of safe interventions. Objective: To analyze the assistance to normal delivery in a secondary care maternity. Methodology: A total of 100 primiparous mothers who had vaginal delivery were included, and the care practices used were categorized: (1) according to the WHO classification for assistance to normal childbirth: effective, harmful, used with caution, and used inappropriately; (2) associating calculations with the Bologna Index parameters: presence of a birth partner, partograph, no stimulation of labor, delivery in a non-supine position, and mother-newborn skin-to-skin contact. Results: Birth partners (85%), correctly filled partographs (62%), mother-newborn skin-to-skin contact (36%), use of oxytocin (87%), use of parenteral nutrition during labor (86%) and at delivery (74%), episiotomy (94%) and uterine fundal pressure in the expulsion stage (58%). The overall average value of the Bologna Index of the mothers analyzed was 1.95. Conclusions: Some effective procedures recommended by WHO were complied with (presence of a birth partner), some effective and mandatory practices were not (partograph completely filled), and potentially harmful or ineffective procedures (oxytocin in labor/post-partum) as well as inadequate procedures (uterine fundal pressure during the expulsion stage, use of forceps and episiotomy) were used. The maternity's care model did not offer excellent natural birth procedures to its primiparous mothers (BI = 1.95).

  11. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

Interactions of human error factors in NPP operation events are introduced, and 645 WANO operation event reports from 1999 to 2008 were analyzed, among which 432 were found to be related to human errors. After classifying these errors by Root Causes or Causal Factors and applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors; forming good personnel work practices is systematic work which needs support in many aspects. (2) Verbal communications, personnel work practices, man-machine interface, and written procedures and documents play great roles. They are four interacting factors which often come in a bundle; if improvements need to be made on one of them, synchronous measures are also necessary for the others. (3) Management direction and decision process, which are related to management, have a significant interaction with personnel factors. (authors)

  12. Rates and risk factors of unplanned 30-day readmission following general and thoracic pediatric surgical procedures.

    Science.gov (United States)

    Polites, Stephanie F; Potter, Donald D; Glasgow, Amy E; Klinkner, Denise B; Moir, Christopher R; Ishitani, Michael B; Habermann, Elizabeth B

    2017-08-01

Postoperative unplanned readmissions are costly and decrease patient satisfaction; however, little is known about this complication in pediatric surgery. The purpose of this study was to determine rates and predictors of unplanned readmission in a multi-institutional cohort of pediatric surgical patients. Unplanned 30-day readmissions following general and thoracic surgical procedures in children were identified, and rates of readmission per 30 person-days were determined to account for varied postoperative length of stay (pLOS). Patients were randomly divided into 70% derivation and 30% validation cohorts, which were used for creation and validation of a risk model for readmission. Readmission occurred in 1948 (3.6%) of 54,870 children, for a rate of 4.3% per 30 person-days. Adjusted predictors of readmission included hepatobiliary procedures, increased wound class, operative duration, complications, and pLOS. The predictive model discriminated well in the derivation and validation cohorts (AUROC 0.710 and 0.701) with good calibration between observed and expected readmission events in both cohorts (p > .05). Unplanned readmission occurs less frequently in pediatric surgery than described in adults, calling into question its use as a quality indicator in this population. Factors that predict readmission, including type of procedure, complications, and pLOS, can be used to identify at-risk children and develop prevention strategies. III. Copyright © 2017 Elsevier Inc. All rights reserved.
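The derivation/validation scheme described above (70/30 split, a risk model for a binary outcome, AUROC for discrimination) can be sketched on synthetic data. The predictors and coefficients below are assumptions, not the study's fitted model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic cohort: predictors loosely modeled on the abstract's factors
# (operative duration, complications, pLOS); all coefficients are invented.
rng = np.random.default_rng(11)
n = 20_000
duration = rng.exponential(1.0, n)
complication = rng.binomial(1, 0.1, n)
plos = rng.exponential(3.0, n)
logit = -4.0 + 0.5 * duration + 1.2 * complication + 0.15 * plos
readmit = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([duration, complication, plos])
# 70% derivation / 30% validation split, as in the abstract.
X_dev, X_val, y_dev, y_val = train_test_split(X, readmit,
                                              test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
auc_dev = roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1])
auc_val = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
```

Similar derivation and validation AUROCs, as in the abstract's 0.710 vs 0.701, indicate the model is not overfit to the derivation cohort.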

  13. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well-known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
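The EM approach Graham recommends can be sketched directly: for a multivariate normal with values missing at random, EM alternates between filling in conditional expectations of the missing values and re-estimating the mean and covariance. A minimal NumPy sketch (not the SPSS MVA implementation):

```python
import numpy as np

def em_covariance(X, n_iter=100):
    """EM estimates of the mean and covariance of a multivariate normal
    with values missing at random (NaN entries)."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    miss = np.isnan(X)
    mu = np.nanmean(X, axis=0)
    sigma = np.cov(np.where(miss, mu, X), rowvar=False, bias=True)
    for _ in range(n_iter):
        Xhat = np.where(miss, 0.0, X)
        C = np.zeros((p, p))  # accumulated conditional covariances
        for i in range(n):
            m = miss[i]
            if not m.any():
                continue
            o = ~m
            B = np.linalg.solve(sigma[np.ix_(o, o)], sigma[np.ix_(o, m)])
            # E-step: conditional mean of the missing block given the observed
            Xhat[i, m] = mu[m] + (X[i, o] - mu[o]) @ B
            # conditional covariance of the missing block
            C[np.ix_(m, m)] += sigma[np.ix_(m, m)] - sigma[np.ix_(m, o)] @ B
        # M-step: update mean and covariance from the completed data
        mu = Xhat.mean(axis=0)
        diff = Xhat - mu
        sigma = (diff.T @ diff + C) / n
    return mu, sigma

# Bivariate normal with correlation 0.7; 30% of column 1 set missing at random.
rng = np.random.default_rng(5)
true_cov = np.array([[1.0, 0.7], [0.7, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], true_cov, size=2000)
X[rng.random(2000) < 0.3, 1] = np.nan

mu_hat, sigma_hat = em_covariance(X)
```

The resulting EM covariance (or its correlation form) is exactly the kind of matrix the abstract's macros feed into the FACTOR and RELIABILITY procedures.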

  14. Use of safety analysis in a site confirmation procedure in the case of a hard rock repository

    International Nuclear Information System (INIS)

    Peltonen, E.K.

    1984-02-01

The role of safety analysis in a confirmation procedure for a candidate radioactive waste disposal site is discussed. Items dealt with include the principal reasons and practical goals of the use of safety analysis, the methodology of safety analysis and assessment, as well as the usefulness and adequacy of present safety analysis. Safety analysis is a tool which enables one to estimate quantitatively the possible radiological impacts of the disposal. The results can be compared with the criteria and suitability conclusions drawn. Because of its systems-analytical nature, safety analysis is an effective method to reveal the most important factors of the disposal system and the most critical site characteristics inside the lumped parameters often provided by the experimental site investigation methods. Furthermore, it gives information on the accuracy needs of different site properties. This can be used to judge whether the quality and quantity of the measurements for the characterization are sufficient, as well as to guide further site investigations. A more practical discussion of the applicability of safety analysis is presented via an example concerning the assessment of a Finnish candidate site for a low- and intermediate-level radioactive waste repository. (author)

  15. Comparative analysis of diagnostic accuracy of different brain biopsy procedures

    OpenAIRE

    Jain Deepali; Sharma Mehar; Sarkar Chitra; Gupta Deepak; Singh Manmohan; Mahapatra A

    2006-01-01

    Background: Image-guided procedures such as computed tomography (CT) guided, neuronavigator-guided and ultrasound-guided methods can assist neurosurgeons in localizing the intraparenchymal lesion of the brain. However, despite improvements in the imaging techniques, an accurate diagnosis of intrinsic lesion requires tissue sampling and histological verification. Aims: The present study was carried out to examine the reliability of the diagnoses made on tumor sample obtained via different s...

  16. An analytical inductor design procedure for three-phase PWM converters in power factor correction applications

    DEFF Research Database (Denmark)

    Kouchaki, Alireza; Niroumand, Farideh Javidi; Haase, Frerk

    2015-01-01

This paper presents an analytical method for designing the inductor of three-phase power factor correction converters (PFCs). The complex behavior of the inductor current complicates the inductor design procedure as well as the core loss and copper loss calculations. Therefore, this paper analyzes … circuit is used to provide the inductor current harmonic spectrum. Using the harmonic spectrum, the low and high frequency copper losses are calculated. The high frequency minor B-H loops in one switching cycle are also analyzed. Then, the loss map provided by the measurement setup is used … to calculate the core loss in the PFC application. To investigate the impact of the dc link voltage level, two inductors for different dc voltage levels are designed and the results are compared.

  17. Factors influencing changes in levels of radiation doses received by patients during gastroduodenal series procedures in the Hospital Dr. Max Peralta de Cartago

    International Nuclear Information System (INIS)

    Guzman Campos, Jeremy; Vargas Navarro, Jonnathan

    2009-01-01

    A measurement was made of the radiation doses emitted by fluoroscopy equipment used in Hospital Dr. Max Peralta, specifically at the Centro de Deteccion de Cancer Gastrico. The analysis included the factors that could be driving increases in the total dose to the patient, by means of indicators that directly affect unnecessary dose increases, such as the procedure, image sequences, dose-level indicators, varying conditions of the actual studies, dose-level variations, and production process factors.

  18. A Quantitative Review of Functional Analysis Procedures in Public School Settings

    Science.gov (United States)

    Solnick, Mark D.; Ardoin, Scott P.

    2010-01-01

    Functional behavioral assessments can consist of indirect, descriptive and experimental procedures, such as a functional analysis. Although the research contains numerous examples demonstrating the effectiveness of functional analysis procedures, experimental conditions are often difficult to implement in classroom settings and analog conditions…

  19. Analysis of technological, institutional and socioeconomic factors ...

    African Journals Online (AJOL)

    Analysis of technological, institutional and socioeconomic factors that influences poor reading culture among secondary school students in Nigeria. ... Proliferation and availability of smart phones, chatting culture and social media were identified as technological factors influencing poor reading culture among secondary ...

  20. Hand function evaluation: a factor analysis study.

    Science.gov (United States)

    Jarus, T; Poremba, R

    1993-05-01

    The purpose of this study was to investigate hand function evaluations. Factor analysis with varimax rotation was used to assess the fundamental characteristics of the items included in the Jebsen Hand Function Test and the Smith Hand Function Evaluation. The study sample consisted of 144 subjects without disabilities and 22 subjects with Colles fracture. Results suggest a four factor solution: Factor I--pinch movement; Factor II--grasp; Factor III--target accuracy; and Factor IV--activities of daily living. These categories differentiated the subjects without Colles fracture from the subjects with Colles fracture. A hand function evaluation consisting of these four factors would be useful. Such an evaluation that can be used for current clinical purposes is provided.
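
    The four-factor solution above comes from an exploratory factor analysis with varimax rotation. As a minimal numpy-only sketch of that technique (the synthetic data, item count, and loading values below are illustrative, not the study's), principal-axis extraction followed by varimax rotation looks like:

```python
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate a loading matrix toward simple structure (Kaiser's varimax)."""
    p, k = L.shape
    R = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        Lam = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lam**3 - (gamma / p) * Lam @ np.diag((Lam**2).sum(axis=0))))
        R = u @ vt
        if s.sum() < d * (1 + tol):
            break
        d = s.sum()
    return L @ R

# Synthetic scores: 144 subjects, 6 items driven by 2 latent factors.
rng = np.random.default_rng(0)
F = rng.normal(size=(144, 2))                   # latent factor scores
W = np.array([[.9, 0], [.8, 0], [.7, 0],        # true loading pattern
              [0, .9], [0, .8], [0, .7]])
X = F @ W.T + 0.3 * rng.normal(size=(144, 6))   # observed item scores

Rcorr = np.corrcoef(X, rowvar=False)            # item correlation matrix
vals, vecs = np.linalg.eigh(Rcorr)
idx = np.argsort(vals)[::-1][:2]                # keep the 2 largest factors
loadings = vecs[:, idx] * np.sqrt(vals[idx])    # unrotated loadings
rotated = varimax(loadings)                     # simple-structure loadings
```

    After rotation each item loads predominantly on one factor, which is what makes interpretable groupings such as pinch, grasp, target accuracy, and activities of daily living possible.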

  1. Sample handling and chemical procedures for efficacious trace analysis of urine by neutron activation analysis

    International Nuclear Information System (INIS)

    Blotcky, A.J.; Rack, E.P.; Roman, F.R.

    1988-01-01

    Important for the determination of trace elements, ions, or compounds in urine by chemical neutron activation analysis is the optimization of the sample handling, preirradiation chemistry, and radioassay procedures necessary for viable analysis. Each element, because of its natural abundance in the earth's crust and, hence, its potential for reagent and environmental contamination, requires specific procedures for storage, handling, and preirradiation chemistry. Radioassay techniques for radionuclides vary depending on their half-lives and decay characteristics. Described in this paper are optimized procedures for aluminum and selenium. While Al-28 (T1/2 = 2.24 min) and Se-77m (T1/2 = 17.4 s) have short half-lives, their gamma-ray spectra are quite different. Aluminum-28 decays with a 1779-keV gamma ray and Se-77m with a 162-keV gamma ray. Unlike selenium, aluminum is a ubiquitous element in the environment, requiring special handling to minimize contamination in all phases of its analytical determination.

  2. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    International Nuclear Information System (INIS)

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L.

    1995-10-01

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all of the 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population that actually failed; it found a statistically significant factor-of-two bias on the average.

  3. A model for analysing factors which may influence quality management procedures in higher education

    Directory of Open Access Journals (Sweden)

    Cătălin MAICAN

    2015-12-01

    In all universities, the Office for Quality Assurance defines the procedure for assessing the performance of the teaching staff, with a view to establishing students' perception of the teachers' activity in terms of the quality of the teaching process, of the relationship with the students, and of the assistance provided for learning. The present paper aims at creating a combined model for evaluation based on Data Mining statistical methods: starting from the findings of the evaluations teachers performed of students, and using cluster analysis and discriminant analysis, we identified the subjects that produced significant differences between students' grades; these subjects were subsequently evaluated by the students. The results of these analyses allowed the formulation of measures for enhancing the quality of the evaluation process.

  4. Using P-Stat, BMDP and SPSS for a cross-products factor analysis.

    Science.gov (United States)

    Tanner, B A; Leiman, J M

    1983-06-01

    The major disadvantage of the Q factor analysis with Euclidean distances described by Tanner and Koning [Comput. Progr. Biomed. 12 (1980) 201-202] is the considerable editing required. An alternative procedure, using commercially distributed software and with cross-products in place of Euclidean distances, is described. This procedure does not require any editing.
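
    The substitution of cross-products for Euclidean distances is less of a change than it may appear: the two association matrices are algebraically linked. A hypothetical numpy sketch of a Q-technique setup (the case and variable counts are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 20))      # 8 cases (rows) scored on 20 variables

# Q-technique factors the CASES, so the association matrix is 8 x 8.
S = X @ X.T                                        # cross-products matrix
D = ((X[:, None, :] - X[None, :, :])**2).sum(-1)   # squared Euclidean distances

# Both carry the same pairwise information: D_ij = S_ii + S_jj - 2*S_ij.
recon = np.diag(S)[:, None] + np.diag(S)[None, :] - 2 * S

# Q-factor loadings from the eigendecomposition of the cross-products matrix.
vals, vecs = np.linalg.eigh(S)
idx = np.argsort(vals)[::-1][:2]
loadings = vecs[:, idx] * np.sqrt(vals[idx])
```

    Switching association measures therefore changes the scaling of the solution rather than the underlying pairwise geometry, while allowing standard factor-analysis software to be used without the editing the distance-based program required.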

  5. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage in that it does not treat operator error as the sole contributor to human failure within a system, but rather as a combination of all the underlying factors.

  6. A scenario-based procedure for seismic risk analysis

    International Nuclear Information System (INIS)

    Kluegel, J.-U.; Mualchin, L.; Panza, G.F.

    2006-12-01

    A new methodology for seismic risk analysis based on probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required by safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest, as well as advanced physical modelling techniques, to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time, a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA) as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA), which relies on projections of the Gutenberg-Richter (G-R) equation. Problems with the validity of G-R projections, owing to data that are incomplete or entirely absent, remain unresolved. Consequently, the information from G-R must not be used in decisions on the design of critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data, and is complemented by physical modelling techniques which can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited number of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)

  7. Analysis of generalized Schwarz alternating procedure for domain decomposition

    Energy Technology Data Exchange (ETDEWEB)

    Engquist, B.; Zhao, Hongkai [Univ. of California, Los Angeles, CA (United States)

    1996-12-31

    The Schwarz alternating method (SAM) is the theoretical basis for domain decomposition, which itself is a powerful tool both for parallel computation and for computing in complicated domains. The convergence rate of the classical SAM is very sensitive to the size of the overlap between subdomains, which is not desirable for most applications. We propose a generalized SAM procedure which is an extension of the modified SAM proposed by P.-L. Lions. Instead of using only Dirichlet data at the artificial boundary between subdomains, we take a convex combination of u and ∂u/∂n, i.e. ∂u/∂n + Λu, where Λ is some "positive" operator. Convergence of the modified SAM without overlapping in a quite general setting has been proven by P.-L. Lions using delicate energy estimates. Important questions remain for the generalized SAM. (1) What is the most essential mechanism for convergence without overlapping? (2) Given the partial differential equation, what is the best choice for the positive operator Λ? (3) In the overlapping case, is the generalized SAM superior to the classical SAM? (4) What is the convergence rate and what does it depend on? (5) Can we numerically obtain an easy-to-implement operator Λ such that the convergence is independent of the mesh size? To analyze the convergence of the generalized SAM we focus, for simplicity, on the Poisson equation for two typical geometries in the two-subdomain case.

  8. Procedure for conducting a human-reliability analysis for nuclear power plants. Final report

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1983-05-01

    This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented, along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study.

  9. Failure mode and effects analysis: an empirical comparison of failure mode scoring procedures.

    Science.gov (United States)

    Ashley, Laura; Armitage, Gerry

    2010-12-01

    To empirically compare 2 different commonly used failure mode and effects analysis (FMEA) scoring procedures with respect to their resultant failure mode scores and prioritization: a mathematical procedure, where scores are assigned independently by FMEA team members and averaged, and a consensus procedure, where scores are agreed on by the FMEA team via discussion. A multidisciplinary team undertook a Healthcare FMEA of chemotherapy administration. This included mapping the chemotherapy process, identifying and scoring failure modes (potential errors) for each process step, and generating remedial strategies to counteract them. Failure modes were scored using both an independent mathematical procedure and a team consensus procedure. Almost three-fifths of the 30 failure modes generated were scored differently by the 2 procedures, and for just more than one-third of cases, the score discrepancy was substantial. Using the Healthcare FMEA prioritization cutoff score, almost twice as many failure modes were prioritized by the consensus procedure than by the mathematical procedure. This is the first study to empirically demonstrate that different FMEA scoring procedures can score and prioritize failure modes differently. It found considerable variability in individual team members' opinions on scores, which highlights the subjective and qualitative nature of failure mode scoring. A consensus scoring procedure may be most appropriate for FMEA as it allows variability in individuals' scores and rationales to become apparent and to be discussed and resolved by the team. It may also yield team learning and communication benefits unlikely to result from a mathematical procedure.
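
    The two scoring procedures compared in the study can be stated compactly. In the sketch below the failure modes, the individual (severity, occurrence, detectability) scores, and the prioritization cutoff are all invented for illustration; only the two aggregation rules reflect the abstract:

```python
# Hypothetical FMEA scores: (severity, occurrence, detectability), each 1-10,
# from three team members; failure-mode names and numbers are invented.
member_scores = {
    "wrong infusion rate": [(9, 3, 4), (7, 4, 5), (8, 2, 6)],
    "mislabeled drug":     [(10, 2, 2), (9, 2, 3), (10, 1, 2)],
}
consensus_triples = {          # one agreed triple per failure mode
    "wrong infusion rate": (8, 3, 4),
    "mislabeled drug":     (10, 2, 2),
}
CUTOFF = 100                   # illustrative prioritization cutoff

def rpn(s, o, d):
    return s * o * d           # risk priority number

# Mathematical procedure: average the members' independently computed RPNs.
mathematical = {fm: sum(rpn(*t) for t in ts) / len(ts)
                for fm, ts in member_scores.items()}
# Consensus procedure: one RPN from the triple the team agreed on.
consensus = {fm: rpn(*t) for fm, t in consensus_triples.items()}

prioritized_math = {fm for fm, v in mathematical.items() if v >= CUTOFF}
prioritized_cons = {fm for fm, v in consensus.items() if v >= CUTOFF}
```

    With these numbers the averaged independent scores prioritize a failure mode that the consensus triple does not, a discrepancy of the kind the study reports for a substantial share of its 30 failure modes.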

  10. Analysis of Economic Factors Affecting Stock Market

    OpenAIRE

    Xie, Linyin

    2010-01-01

    This dissertation concentrates on analysis of economic factors affecting Chinese stock market through examining relationship between stock market index and economic factors. Six economic variables are examined: industrial production, money supply 1, money supply 2, exchange rate, long-term government bond yield and real estate total value. Stock market comprises fixed interest stocks and equities shares. In this dissertation, stock market is restricted to equity market. The stock price in thi...

  11. Analysis of Relational Communication in Dyads: New Measurement Procedures.

    Science.gov (United States)

    Rogers, L. Edna; Farace, Richard

    Relational communication refers to the control or dominance aspects of message exchange in dyads--distinguishing it from the report or referential aspects of communication. In relational communicational analysis, messages as transactions are emphasized; major theoretical concepts which emerge are symmetry, transitoriness, and complementarity of…

  12. Application of a statistical thermal design procedure to evaluate the PWR DNBR safety analysis limits

    International Nuclear Information System (INIS)

    Robeyns, J.; Parmentier, F.; Peeters, G.

    2001-01-01

    In the framework of safety analysis for the Belgian nuclear power plants and for reload compatibility studies, Tractebel Energy Engineering (TEE) has developed a statistical thermal design method based on the analytical full statistical approach, the Statistical Thermal Design Procedure (STDP), to define a 95/95 DNBR criterion. In that methodology, each DNBR value in the core assemblies is calculated with an adapted CHF (critical heat flux) correlation implemented in the sub-channel code Cobra for core thermal-hydraulic analysis. The uncertainties of the correlation are represented by statistical parameters calculated from an experimental database. The main objective of a sub-channel analysis is to prove that in all class 1 and class 2 situations, the minimum DNBR (departure from nucleate boiling ratio) remains higher than the Safety Analysis Limit (SAL). The SAL value is calculated from the Statistical Design Limit (SDL) value, adjusted with penalties and deterministic factors. The search for a realistic value of the SDL is the objective of statistical thermal design methods. In this report, we apply a full statistical approach to define the DNBR criterion, or SDL (Statistical Design Limit), with strict observance of the design criteria defined in the Standard Review Plan. The same statistical approach is used to define the expected number of rods experiencing DNB. (author)
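
    The STDP itself is not reproduced here, but the "95/95" requirement has a standard nonparametric reading that is easy to sketch. Assuming Wilks' first-order rule for illustration (the paper's own machinery is an analytical full statistical approach, not this), the sample maximum of n runs bounds the 95th percentile with 95% confidence once 1 - 0.95**n >= 0.95:

```python
import math
import numpy as np

# Smallest n such that the largest of n samples is a one-sided 95%/95% bound.
n = math.ceil(math.log(1 - 0.95) / math.log(0.95))

# Monte Carlo sanity check: fraction of trials in which the sample maximum
# of n standard-normal draws covers the true 95th percentile (about 1.6449).
rng = np.random.default_rng(0)
draws = rng.normal(size=(20000, n))
coverage = (draws.max(axis=1) >= 1.6449).mean()
```

    The rule gives n = 59 regardless of the underlying distribution, which is why 95/95 tolerance arguments of this kind are so common in thermal-hydraulic design.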

  13. A Bayesian multidimensional scaling procedure for the spatial analysis of revealed choice data

    NARCIS (Netherlands)

    DeSarbo, WS; Kim, Y; Fong, D

    1999-01-01

    We present a new Bayesian formulation of a vector multidimensional scaling procedure for the spatial analysis of binary choice data. The Gibbs sampler is gainfully employed to estimate the posterior distribution of the specified scalar products, bilinear model parameters. The computational procedure

  14. Responding to Self-Harm: A Documentary Analysis of Agency Policy and Procedure

    Science.gov (United States)

    Paul, Sally; Hill, Malcolm

    2013-01-01

    This paper reports on the findings of a documentary analysis of policies and procedures relating to self-harm from a range of organisations working with young people in the UK. It identifies the extent to which policies and/or procedures relating to self-harm are available for service providers and offers a wider understanding of the concepts of…

  15. Fast analysis procedure of radiochemical coordinat uptake for methotrexate

    International Nuclear Information System (INIS)

    Caston, J.D.; Kamen, B.A.

    1976-01-01

    This invention provides a radiochemical assay to determine the concentration of methotrexate, or its assay equivalents, in a biological medium. The uptake of the labelled compound at known concentrations of the unlabelled compound to be determined is related radio-isotopically in a first system containing a predetermined amount of the labelled compound and a predetermined amount of the unlabelled compound. In a second system, identical to the first except that the sample of the biological medium to be analyzed takes the place of the unlabelled compound, the amount of labelled compound taken up is determined radio-isotopically. The concentration of the compound in the sample is then determined by correlating the labelled-compound uptake measured in the second system with the relation established in the first system. The radio-isotopic relations and determinations may be made by direct and sequential analytical techniques.

  16. Development of the quantification procedures for in situ XRF analysis

    International Nuclear Information System (INIS)

    Kump, P.; Necemer, M.; Rupnik, P.

    2005-01-01

    For in situ XRF applications, two excitation systems (radioisotope and tube excited) and an X-ray spectrometer based on an Si-PIN detector were assembled and used. The radioisotope excitation system with an Am-241 source was assembled into a prototype of a compact XRF analyser, PEDUZO-01, which is also applicable in field work. The existing quantification software QAES (quantitative analysis of environmental samples) was assessed to be adequate for field work as well. The QAES software was also integrated into new software attached to the developed XRF analyser PEDUZO-01, which includes spectrum acquisition, spectrum analysis and quantification, and runs in the LABVIEW environment. In assessing the Si-PIN based X-ray spectrometers and the QAES quantification software in field work, a comparison was made with results obtained by a standard Si(Li) based spectrometer. The results of this study prove that the use of this spectrometer is adequate for field work. This work was accepted for publication in X-Ray Spectrometry. The application of a simple preparation of solid samples was studied in view of the analytical results obtained. It was established that, under definite conditions, the results do not differ greatly from those obtained with a homogenized sample pressed into a pellet. The influence of particle size and mineralogical effects on quantitative results was studied, and a simple sample preparation kit was proposed. Sample preparation for the analysis of water samples by precipitation with APDC, and aerosol analysis using a dichotomous sampler, were also adapted and used in the field work. An adequate sample preparation kit was proposed. (author)

  17. Human factors research plan for instrument procedures : FY12 version 1.1

    Science.gov (United States)

    2012-06-19

    This research will support the development of instrument procedures for performance-based navigation (PBN) operations. These procedures include, but are not limited to, area navigation (RNAV) and required navigation performance (RNP) operations. The ...

  18. A comparative examination of sample treatment procedures for ICAP-AES analysis of biological tissue

    Science.gov (United States)

    De Boer, J. L. M.; Maessen, F. J. M. J.

    The objective of this study was to contribute to the evaluation of existing sample preparation procedures for ICAP-AES analysis of biological material. Performance characteristics were established for current digestion procedures comprising extraction, solubilization, pressure digestion, and wet and dry ashing methods. Apart from accuracy and precision, a number of criteria of special interest for analytical practice were applied. SRM bovine liver served as the test sample; six elements were determined simultaneously in this material. Results showed that every procedure has its defects and advantages. Hence, standard digestion procedures can be unambiguously recommended only when the specific analytical problem is taken into account.

  19. A comparison of various procedures in photon activation analysis with the same irradiation setup

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Z.J. [Chemical Sciences and Engineering Division, Argonne National Laboratory, 9700 S. Cass Ave., Argonne, IL 60439 (United States); Wells, D. [Physics Department, South Dakota School of Mines and Technology, 501 E. Saint Joseph St., Rapid City, SD 57701 (United States); Segebade, C. [Idaho Accelerator Center, Idaho State University, 921 S. 8th Ave., Pocatello, ID 83209 (United States); Quigley, K.; Chemerisov, S. [Chemical Sciences and Engineering Division, Argonne National Laboratory, 9700 S. Cass Ave., Argonne, IL 60439 (United States)

    2014-11-15

    A sample of known elemental concentrations was activated in the bremsstrahlung photon beam created by a pulsed electron LINAC. Several procedures of photon activation analysis, including those applied with/without reference material and with/without a photon flux monitor, were conducted to compare their precision and accuracy in practice. Experimental results indicate that: (1) relative procedures usually produce better outcomes, even though the absolute measurement is straightforward and eliminates the need for reference materials; (2) among relative procedures, the method with an internal flux monitor yields higher-quality analytical results. The pros and cons of each procedure are also discussed in the article.

  20. Replica Analysis for Portfolio Optimization with Single-Factor Model

    Science.gov (United States)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.
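
    The qualitative conclusion, that a common return factor raises the attainable minimum investment risk, can be illustrated without replica methods. The numpy sketch below (all parameter values invented) builds the single-factor covariance Sigma = sigma_f^2 * b b^T + D and compares fully invested minimum-variance risks with and without the factor-induced correlation:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50
beta = rng.normal(1.0, 0.3, N)              # hypothetical factor loadings
sigma_f2 = 0.04                             # common-factor variance
D = rng.uniform(0.01, 0.05, N)              # idiosyncratic variances

Sigma_corr = sigma_f2 * np.outer(beta, beta) + np.diag(D)
Sigma_indep = np.diag(np.diag(Sigma_corr))  # same marginal variances, no correlation

def min_variance_risk(S):
    """Risk of the fully invested minimum-variance portfolio w ~ S^{-1} 1."""
    w = np.linalg.solve(S, np.ones(len(S)))
    w /= w.sum()
    return float(w @ S @ w)

risk_corr = min_variance_risk(Sigma_corr)
risk_indep = min_variance_risk(Sigma_indep)
```

    With the mostly positive loadings drawn here, the factor contribution sigma_f^2 * (beta . w)^2 cannot be diversified away cheaply, so the correlated minimum risk comes out several times the independent one, the same direction of effect the replica analysis quantifies.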

  1. Aesthetic Breast Surgery and Concomitant Procedures: Incidence and Risk Factors for Major Complications in 73,608 Cases.

    Science.gov (United States)

    Gupta, Varun; Yeslev, Max; Winocour, Julian; Bamba, Ravinder; Rodriguez-Feo, Charles; Grotting, James C; Higdon, K Kye

    2017-05-01

    Major complications following aesthetic breast surgery are uncommon and thus assessment of risk factors is challenging. To determine the incidence and risk factors of major complications following aesthetic breast surgery and concomitant procedures. A prospective cohort of patients who enrolled into the CosmetAssure (Birmingham, AL) insurance program and underwent aesthetic breast surgery between 2008 and 2013 was identified. Major complications (requiring reoperation, readmission, or emergency room visit) within 30 days of surgery were recorded. Risk factors including age, smoking, body mass index (BMI), diabetes, type of surgical facility, and combined procedures were evaluated. Among women, augmentation was the most common breast procedure (n = 41,651, 58.6%) followed by augmentation-mastopexy, mastopexy, and reduction. Overall, major complications occurred in 1.46% with hematoma (0.99%) and infection (0.25%) being most common. Augmentation-mastopexy had a higher risk of complications, particularly infection (relative risk [RR] 1.74, P procedures. Age was the only significant predictor for hematomas (RR 1.01, P procedures or abdominoplasty performed alone. Among men, correction of gynecomastia was the most common breast procedure (n = 1613, 64.6%) with a complication rate of 1.80% and smoking as a risk factor (RR 2.73, P = 0.03). Incidence of major complications after breast cosmetic surgical procedures is low. Risk factors for major complications include increasing age and BMI. Combining abdominoplasty with any breast procedure increases the risk of major complications.

  2. Using plant procedures as the basis for conducting a job and task analysis

    International Nuclear Information System (INIS)

    Haynes, F.H.; Ruth, B.W.

    1985-01-01

    Plant procedures were selected by Northeast Utilities (NU) as the basis for conducting Job and Task Analyses (JTA). The resultant JTA was used to design procedure-based simulator training programs for Millstone 1, 2, and Connecticut Yankee. The task listings were both plant specific and exhibited excellent correlation to INPO's generic PWR and BWR task analyses. Using the procedure-based method enabled us to perform the JTA using plant and training staff. This proved cost-effective in terms of both time and money. Learning objectives developed from the JTA were easily justified and correlated directly to job performance within the context of the plant procedures. In addition, the analysis generated a comprehensive review of plant procedures and, conversely, the plant's normal procedure revision process generated an automatic trigger for updating the task data.

  3. Quality assurance procedures for the analysis of TRU waste samples

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Giaquinto, J.M.; Robinson, L.

    1995-01-01

    The Waste Isolation Pilot Plant (WIPP) project was undertaken in response to the growing need for a national repository for transuranic (TRU) waste. Guidelines for WIPP specify that any waste item to be interred must be fully characterized and analyzed to determine the presence of chemical compounds designated hazardous and certain toxic elements. The Transuranic Waste Characterization Program (TWCP) was launched to develop analysis and quality guidelines, certify laboratories, and oversee the actual waste characterizations at the laboratories. ORNL is participating in the waste characterization phase and brings to bear a variety of analytical techniques, including ICP-AES, cold vapor atomic absorption, and instrumental neutron activation analysis (INAA), to collectively determine arsenic, cadmium, barium, chromium, mercury, selenium, silver, and other elements. All of the analytical techniques involved participate in a cooperative effort to meet the project objectives. One important component of any good quality assurance program is determining when an alternate method is more suitable for a given analytical problem. By bringing to bear a whole arsenal of analytical techniques working toward common objectives, few analytical problems prove to be insurmountable. INAA and ICP-AES form a powerful pair when functioning in this cooperative manner. This paper provides details of the quality assurance protocols, typical results from quality control samples for both INAA and ICP-AES, and details of the method cooperation schemes used.

  4. Factor Economic Analysis at Forestry Enterprises

    Directory of Open Access Journals (Sweden)

    M.Yu. Chik

    2018-03-01

    The article examines the importance of economic analysis, drawing on the scientific works of domestic and foreign scholars. The influence of factors on the change in the cost of harvesting timber products is calculated by cost item, and the influence of factors on the change in costs per 1 UAH of sold products is determined using the full cost of sold products. Variable and fixed costs and their distribution are allocated, which influences the calculation of the impact of factors on cost changes per 1 UAH of sold products. The paper presents the overall results of calculating the influence of factors on cost changes per 1 UAH of sold products. Based on the results of the analysis, a list of reserves for reducing the cost of production at forestry enterprises is proposed, and the main sources of such reserves are investigated on the basis of the conducted factor analysis.

  5. An SPSS R-Menu for Ordinal Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mario Basto

    2012-01-01

    Exploratory factor analysis is a widely used statistical technique in the social sciences. It attempts to identify underlying factors that explain the pattern of correlations within a set of observed variables. A statistical software package is needed to perform the calculations. However, there are some limitations with popular statistical software packages, like SPSS. The R programming language is a free software package for statistical and graphical computing. It offers many packages written by contributors from all over the world, and programming resources that allow it to overcome the dialog limitations of SPSS. This paper offers an SPSS dialog written in the R programming language, with the help of some packages, so that researchers with little or no knowledge of programming, or those who are accustomed to making their calculations based on statistical dialogs, have more options when applying factor analysis to their data and hence can adopt a better approach when dealing with ordinal, Likert-type data.

  6. Establishment of analysis procedure for control rod reactivity worth

    Energy Technology Data Exchange (ETDEWEB)

    Song, Hoon; Kim, Young Il; Kim, Sang Ji; Kim, Young In

    2001-03-01

    For the calculation of control rod reactivity worth in the hexagonal assemblies generally used in fast reactors, we investigated the calculation method, the problems arising during calculation, the accuracy of the calculation, and possible enhancements to the calculation modeling. We assessed the applicability of the method by comparing and analyzing results obtained with the effective cross-section generation system TRANSX/TWODANT and the neutron flux calculation system, the diffusion theory code DIF-3D, both of which belong to the K-CORE System. On this basis we determined the basic calculation method and identified the present calculation problems and future improvement items for application within the K-CORE System.

  7. Establishment of analysis procedure for control rod reactivity worth

    International Nuclear Information System (INIS)

    Song, Hoon; Kim, Young Il; Kim, Sang Ji; Kim, Young In

    2001-03-01

    For the calculation of control rod reactivity worth in the hexagonal assemblies generally used in fast reactors, we investigated the calculation method, the problems arising during calculation, the accuracy of the calculation, and possible enhancements to the calculation modeling. We assessed the applicability of the method by comparing and analyzing results obtained with the effective cross-section generation system TRANSX/TWODANT and the neutron flux calculation system, the diffusion theory code DIF-3D, both of which belong to the K-CORE System. On this basis we determined the basic calculation method and identified the present calculation problems and future improvement items for application within the K-CORE System.

  8. Pragmatic evaluation of the Toyota Production System (TPS) analysis procedure for problem solving with entry-level nurses

    Directory of Open Access Journals (Sweden)

    Lukasz Maciej Mazur

    2008-12-01

    Full Text Available Medication errors occurring in hospitals are a growing national concern. These medication errors and their related costs (or wastes are seen as major factors leading to increased patient safety risks and increased waste in the hospital setting.  This article presents a study in which sixteen entry-level nurses utilized a Toyota Production System (TPS analysis procedure to solve medication delivery problems at one community hospital. The objective of this research was to study and evaluate the TPS analysis procedure for problem solving with entry-level nurses. Personal journals, focus group discussions, and a survey study were used to collect data about entry-level nurses’ perceptions of using the TPS problem solving approach to study medication delivery. A regression analysis was used to identify characteristics that enhance problem solving efforts. In addition, propositions for effective problem solving by entry-level nurses to aid in the reduction of medication errors in healthcare delivery settings are offered.

  9. ANALYSIS OF THE FACTORS AFFECTING THE AVERAGE

    Directory of Open Access Journals (Sweden)

    Carmen BOGHEAN

    2013-12-01

    Full Text Available Productivity in agriculture most relevantly and concisely expresses the economic efficiency of using the factors of production. Labour productivity is affected by a considerable number of variables (including the system of relationships and interdependence between factors), which differ in each economic sector and give rise to a series of technical, economic and organizational idiosyncrasies. The purpose of this paper is to analyse the factors underlying average labour productivity in agriculture, forestry and fishing. The analysis takes into account data on the economically active population and the gross value added in agriculture, forestry and fishing in Romania during 2008-2011. The breakdown of average labour productivity by the factors affecting it is carried out by means of the u-substitution method.
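    The kind of factor breakdown described above can be illustrated with a chain-substitution sketch: labour productivity is gross value added divided by the economically active population, and each factor is substituted in turn while the other is held fixed. The figures and the two-factor decomposition below are illustrative assumptions, not the paper's data:

```python
# Chain-substitution decomposition of the change in average labour
# productivity W = GVA / L (gross value added per economically active
# person). All figures are illustrative, not the paper's data.

gva0, gva1 = 30.0, 36.0   # gross value added, base and current period
l0, l1 = 2.0, 1.5         # economically active population (millions)

w0 = gva0 / l0            # base-period productivity
w1 = gva1 / l1            # current-period productivity

# Substitute one factor at a time, holding the other fixed:
effect_gva = gva1 / l0 - gva0 / l0   # effect of the change in GVA
effect_l   = gva1 / l1 - gva1 / l0   # effect of the change in employment

# The two effects sum exactly to the total change in productivity.
print(w1 - w0, effect_gva, effect_l)
```

    The decomposition is order-dependent (substituting employment first would split the total differently), which is why substitution-type methods fix a conventional order of factors.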

  10. Oxygen isotope analysis of plant water without extraction procedure

    International Nuclear Information System (INIS)

    Gan, K.S.; Wong, S.C.; Farquhar, G.D.; Yong, J.W.H.

    2001-01-01

    Isotopic analyses of plant water (mainly xylem, phloem and leaf water) are gaining importance as the isotopic signals reflect plant-environment interactions, affect the oxygen isotopic composition of atmospheric O2 and CO2, and are eventually incorporated into plant organic matter. Conventionally, such isotopic measurements require a time-consuming process of isolating the plant water by azeotropic distillation or vacuum extraction, which does not complement the speed of isotope analysis provided by continuous-flow IRMS (Isotope-Ratio Mass Spectrometry), especially when large data sets are needed for statistical calculations in biological studies. Further, a substantial amount of plant material is needed for water extraction, and leaf samples would invariably include unenriched water from the fine veins. To measure sub-microlitre amounts of leaf mesophyll water, a new approach is undertaken in which a small disc of fresh leaf is cut using a specially designed leaf punch and pyrolysed directly in an IRMS. By comparison with results from pyrolysis of the dry matter of the same leaf, the 18O content of leaf water can be determined without extraction from fresh leaves. This method is validated using a range of cellulose-water mixtures to simulate the constituents of fresh leaf. Cotton leaf water δ18O obtained from both methods, fresh leaf pyrolysis and azeotropic distillation, will be compared. The pyrolysis technique provides a robust approach to measuring the isotopic content of water or any volatile present in a homogeneous solution or solid hydrous substance.

  11. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed description of the model input parameters, their development, and the relationship between the parameters and specific features events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose
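    How BDCFs are used downstream can be sketched simply: the all-pathway annual dose is the sum, over radionuclides, of the groundwater concentration multiplied by the corresponding BDCF. All values below are hypothetical placeholders, not numbers from the ERMYN analysis:

```python
# All-pathway annual dose from groundwater radionuclide concentrations
# using biosphere dose conversion factors (BDCFs). The BDCF values and
# concentrations are hypothetical placeholders, not ERMYN outputs.

bdcf = {           # Sv/yr per Bq/m^3 of groundwater (illustrative)
    "Tc-99": 1.1e-9,
    "I-129": 7.5e-8,
    "Np-237": 4.2e-7,
}

concentration = {  # Bq/m^3 in groundwater (illustrative)
    "Tc-99": 500.0,
    "I-129": 20.0,
    "Np-237": 5.0,
}

# Annual dose = sum over radionuclides of concentration x BDCF.
annual_dose = sum(concentration[n] * bdcf[n] for n in bdcf)
print(f"{annual_dose:.3e} Sv/yr")
```

    Because the BDCFs are linear conversion factors, a performance assessment can scale doses directly with any computed groundwater concentration without rerunning the biosphere model.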

  12. Human factors review for Severe Accident Sequence Analysis (SASA)

    International Nuclear Information System (INIS)

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper will discuss work being conducted during this human factors review, including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate the contributions of human factors data and methods to SASA analyses. A descriptive model was developed, called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and is useful for identifying needs and deficiencies in the area of accident management. The assessment of human factors issues related to the ATWS required extensive coordination with SASA analysts. The analysis was consolidated primarily around six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitate a knowledge-based emergency response by the operators. The FOAM model provides a functionally oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid or minimize core degradation. Operators must also respond to potential radiological releases beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure

  13. Factor analysis for exercise stress radionuclide ventriculography

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Yasuda, Mitsutaka; Oku, Hisao; Ikuno, Yoshiyasu; Takeuchi, Kazuhide; Takeda, Tadanao; Ochi, Hironobu

    1987-01-01

    Using factor analysis, a new image processing technique in exercise stress radionuclide ventriculography, changes in factors associated with exercise were evaluated in 14 patients with angina pectoris or old myocardial infarction. The patients were imaged in the left anterior oblique projection, and three factor images were presented on a color-coded scale. Abnormal factors (AF) were observed in 6 patients before exercise, 13 during exercise, and 4 after exercise. In 7 patients, the occurrence of AF was associated with exercise. Five of them became free from AF after exercise. Three patients showing AF before exercise had aggravation of AF during exercise. Overall, the occurrence or aggravation of AF was associated with exercise in ten (71%) of the patients. The other three patients, however, had disappearance of AF during exercise. In the last patient, no AF was observed throughout the study. In view of the high incidence of AF associated with exercise, factor analysis may have potential for evaluating cardiac reserve from the viewpoint of left ventricular wall motion abnormality. (Namekawa, K.)

  14. An analysis of tolerance levels in IMRT quality assurance procedures

    International Nuclear Information System (INIS)

    Basran, Parminder S.; Woo, Milton K.

    2008-01-01

    Increased use of intensity modulated radiation therapy (IMRT) has resulted in increased efforts in patient quality assurance (QA). Software and detector systems intended to streamline the IMRT quality assurance process often report metrics, such as percent discrepancies between measured and computed doses, which can be compared to benchmark or threshold values. The purpose of this work is to examine the relationships between two different types of IMRT QA processes in order to define, or refine, appropriate tolerance values. For 115 IMRT plans delivered in a 3 month period, we examine the discrepancies between (a) the treatment planning system (TPS) and results from a commercial independent monitor unit (MU) calculation program; (b) the TPS and results from a commercial diode-array measurement system; and (c) the independent MU calculation and the diode-array measurements. Statistical tests were performed to assess the significance of the IMRT QA results for different disease sites and machine models. First, there is no evidence that the average total dose discrepancy in the monitor unit calculation depends on the disease site. Second, the discrepancies in the two IMRT QA methods are independent: there is no evidence that a better (or worse) monitor unit validation result is related to a better (or worse) diode-array measurement result. Third, there is marginal benefit in repeating the independent MU calculation with a more suitable dose point if the initial IMRT QA failed a certain tolerance. Based on these findings, the authors arrive at acceptable tolerances based on disease site and IMRT QA method. Specifically, monitor unit validations are expected to have a total dose discrepancy of 3% overall, and 5% per beam, independent of disease site. Diode array measurements are expected to have a total absolute dose discrepancy of 3% overall, and 3% per beam, independent of disease site. The percent of pixels exceeding a 3% and 3 mm threshold in a gamma analysis should be
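    The per-beam and total-dose tolerances quoted above translate directly into a simple pass/fail check. A minimal sketch (the function name and the discrepancy values are illustrative, not from the study):

```python
# Tolerance check for an IMRT monitor-unit validation, using the
# tolerances quoted above: 3% total dose discrepancy, 5% per beam.
# Discrepancy values (in percent) are illustrative.

def mu_validation_ok(total_disc, per_beam_disc,
                     total_tol=3.0, beam_tol=5.0):
    """Return True if the absolute total discrepancy and every
    absolute per-beam discrepancy are within tolerance."""
    return (abs(total_disc) <= total_tol
            and all(abs(d) <= beam_tol for d in per_beam_disc))

print(mu_validation_ok(2.1, [1.5, -3.2, 4.0]))   # all within tolerance
print(mu_validation_ok(2.1, [1.5, -5.6, 4.0]))   # one beam exceeds 5%
```

    The same structure applies to the diode-array tolerances (3% total, 3% per beam) by changing the default thresholds.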

  15. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the results of analyzing the emergency tasks in emergency operating procedures (EOPs) that can be observed from simulator data are introduced. The task type, component type, system type, and additional information related to the performance of the operators are described. In addition, a prospective application of the analyzed information to the HEP quantification process is discussed. In the probabilistic safety analysis (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, many HRA methods have provided basic or nominal HEPs for typical tasks and quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affect the HEPs. In the HRA community, however, the necessity of appropriate and sufficient human performance data has recently been pointed out, because a wide range of quantitative estimates in previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database being developed at KAERI is that human errors and related PSF surrogates that are objectively observable are collected from full-scope simulator experience. In this environment, to produce concretely grounded bases for the HEPs, the traits or attributes of tasks where significant human errors can be observed should be clearly determined. The determined traits should make it possible to compare the HEPs for those traits with the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse type of EOPs were analyzed using defined task, component, and system taxonomies.

  16. Effective Work Procedure design Using Discomfort and Effort Factor in Brick stacking operation-A case study

    Science.gov (United States)

    Rout, Biswaranjan; Dash, R. R.; Dhupal, D.

    2018-02-01

    In this work, the movements of the worker's limbs and torso are planned so as to reduce the worker's fatigue and energy expenditure. A simulation model is generated to suit the procedure and comply with the constraints of the workspace. This requires verifying the capability of human postures and movements under different working conditions in order to evaluate the effectiveness of the new design. In this article a simple human performance measure is introduced that enables a mathematical model for the evaluation of a cost function. The basic scheme is to evaluate performance in the form of several cost factors using AI techniques. The two main cost factors taken into consideration are the discomfort factor and the effort factor in limb movements. The discomfort factor measures the level of discomfort from the most neutral position of a given limb to the position of the corresponding limb after movement, and the effort factor is a measure of the displacement of the corresponding limbs from the original position. The basic aim is to optimize the movement of the limbs with respect to the above-mentioned cost functions. The effectiveness of the procedure is tested with an example of the working procedure of workers stacking fly ash bricks in a local fly ash brick manufacturing unit. The objective is to find the optimized movements of the limbs that reduce the discomfort level and the effort required of workers. The effectiveness of the procedure in this case study is illustrated with the obtained results.
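    The combination of the two cost factors into a single objective can be sketched as a weighted sum minimized over candidate movements. The weights, candidate labels, and cost values below are illustrative assumptions, not the paper's model:

```python
# Choosing among candidate limb movements by a weighted combination of
# the two cost factors described above. All values are illustrative.

candidates = [
    # (label, discomfort, effort) -- both normalised to [0, 1]
    ("bend at waist",   0.8, 0.4),
    ("squat",           0.5, 0.7),
    ("pivot and slide", 0.3, 0.5),
]

w_discomfort, w_effort = 0.6, 0.4   # assumed relative weights

def cost(discomfort, effort):
    # total cost = weighted sum of the discomfort and effort factors
    return w_discomfort * discomfort + w_effort * effort

best = min(candidates, key=lambda c: cost(c[1], c[2]))
print(best[0])
```

    In the paper's setting the candidate set would come from the simulation model and the cost factors from the posture and displacement measures, but the selection step has this same weighted-minimization shape.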

  17. Human Factors Analysis of Pipeline Monitoring and Control Operations: Final Technical Report

    Science.gov (United States)

    2008-11-26

    The purpose of the Human Factors Analysis of Pipeline Monitoring and Control Operations project was to develop procedures that could be used by liquid pipeline operators to assess and manage the human factors risks in their control rooms that may adv...

  18. Correction factor for hair analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Castro Faria, L.V. de; Paschoa, A.S.

    1980-01-01

    The application of the Particle Induced X-ray Emission (PIXE) technique to analyse quantitatively the elemental composition of hair specimens brings about some difficulties in the interpretation of the data. The present paper proposes a correction factor to account for the effects of the energy loss of the incident particle with penetration depth, and X-ray self-absorption when a particular geometrical distribution of elements in hair is assumed for calculational purposes. The correction factor has been applied to the analysis of hair contents Zn, Cu and Ca as a function of the energy of the incident particle. (orig.)
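    The physics behind such a correction factor can be illustrated with a toy depth integral: the X-ray yield from depth x is reduced both because the beam loses energy (lowering the ionization cross-section) and because the emitted X-rays are self-absorbed on the way out. The cross-section model and every parameter value below are illustrative assumptions, not the paper's:

```python
import math

# Toy sketch of a PIXE depth-correction factor. The corrected yield is
# the depth integral of (cross-section at local beam energy) times
# (X-ray escape probability); the ideal yield ignores both effects.
# All parameters are illustrative, not the paper's values.

mu = 0.02         # effective X-ray attenuation coefficient (1/um)
de_dx = 2.0       # stopping power (keV/um), assumed constant
e0 = 2000.0       # incident proton energy (keV)
thickness = 60.0  # specimen thickness (um)

def cross_section(e):
    # crude monotone model: ionization yield ~ E^2 (arbitrary units)
    return (e / e0) ** 2

# Midpoint-rule integration over depth.
n = 1000
dx = thickness / n
yield_corrected = sum(
    cross_section(e0 - de_dx * (i + 0.5) * dx)
    * math.exp(-mu * (i + 0.5) * dx) * dx
    for i in range(n)
)
yield_ideal = cross_section(e0) * thickness  # no energy loss, no absorption

correction_factor = yield_corrected / yield_ideal
print(round(correction_factor, 3))
```

    The factor is below 1 and decreases with specimen thickness and incident-energy loss, which is why the paper reports it as a function of the energy of the incident particle.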

  19. Boolean Factor Analysis by Attractor Neural Network

    Czech Academy of Sciences Publication Activity Database

    Frolov, A. A.; Húsek, Dušan; Muraviev, I. P.; Polyakov, P.Y.

    2007-01-01

    Roč. 18, č. 3 (2007), s. 698-707 ISSN 1045-9227 R&D Projects: GA AV ČR 1ET100300419; GA ČR GA201/05/0079 Institutional research plan: CEZ:AV0Z10300504 Keywords : recurrent neural network * Hopfield-like neural network * associative memory * unsupervised learning * neural network architecture * neural network application * statistics * Boolean factor analysis * dimensionality reduction * features clustering * concepts search * information retrieval Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.769, year: 2007

  20. Correction factor for hair analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Castro Faria, L.V. de; Paschoa, A.S.

    1979-06-01

    The application of the Particle Induced X-ray Emission (PIXE) technique to analyse quantitatively the elemental composition of hair specimens brings about some difficulties in the interpretation of the data. The present paper proposes a correction factor to account for the effects of energy loss of the incident particle with penetration depth, and x-ray self-absorption when a particular geometrical distribution of elements in hair is assumed for calculational purposes. The correction factor has been applied to the analysis of hair contents Zn, Cu and Ca as a function of the energy of the incident particle.(Author) [pt

  1. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed description of the model input parameters. This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.

  2. The effects of predictor method factors on selection outcomes: A modular approach to personnel selection procedures.

    Science.gov (United States)

    Lievens, Filip; Sackett, Paul R

    2017-01-01

    Past reviews and meta-analyses typically conceptualized and examined selection procedures as holistic entities. We draw on the product design literature to propose a modular approach as a complementary perspective to conceptualizing selection procedures. A modular approach means that a product is broken down into its key underlying components. Therefore, we start by presenting a modular framework that identifies the important measurement components of selection procedures. Next, we adopt this modular lens for reviewing the available evidence regarding each of these components in terms of affecting validity, subgroup differences, and applicant perceptions, as well as for identifying new research directions. As a complement to the historical focus on holistic selection procedures, we posit that the theoretical contributions of a modular approach include improved insight into the isolated workings of the different components underlying selection procedures and greater theoretical connectivity among different selection procedures and their literatures. We also outline how organizations can put a modular approach into operation to increase the variety in selection procedures and to enhance the flexibility in designing them. Overall, we believe that a modular perspective on selection procedures will provide the impetus for programmatic and theory-driven research on the different measurement components of selection procedures. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Factors Associated with Anxiety About Colonoscopy: The Preparation, the Procedure, and the Anticipated Findings.

    Science.gov (United States)

    Shafer, L A; Walker, J R; Waldman, C; Yang, C; Michaud, V; Bernstein, C N; Hathout, L; Park, J; Sisler, J; Restall, G; Wittmeier, K; Singh, H

    2018-03-01

    Previous research has assessed anxiety around colonoscopy procedures, but has not considered anxiety related to different aspects of the colonoscopy process. Before colonoscopy, we assessed anxiety about the bowel preparation, the procedure, and the anticipated results. We evaluated associations between patient characteristics and anxiety in each area. An anonymous survey was distributed to patients immediately prior to their outpatient colonoscopy in six hospitals and two ambulatory care centers in Winnipeg, Canada. Anxiety was assessed using a visual analog scale. For each aspect, logistic regression models were used to explore associations between patient characteristics and high anxiety. A total of 1316 respondents completed the questions about anxiety (52% female, median age 56 years). Anxiety scores > 70 (high anxiety) were reported by 18% about bowel preparation, 29% about the procedure, and 28% about the procedure results. High anxiety about bowel preparation was associated with female sex, perceived unclear instructions, unfinished laxative, and no previous colonoscopies. High anxiety about the procedure was associated with female sex, no previous colonoscopies, and confusing instructions. High anxiety about the results was associated with symptoms as an indication for colonoscopy and instructions perceived as confusing. Fewer people had high anxiety about the preparation than about the procedure and its findings. There are unique predictors of anxiety for each aspect of colonoscopy. Understanding these nuanced differences may help in designing strategies to reduce anxiety, leading to improved acceptance of the procedure, better compliance with preparation instructions, and less discomfort during the procedure.
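    For a single binary predictor, the association such logistic regression models report reduces to an odds ratio from a 2x2 table. A minimal sketch with hypothetical counts (not the study's data), including a Woolf-method confidence interval:

```python
import math

# Odds ratio for a hypothetical binary predictor (e.g. "instructions
# perceived as unclear") versus high anxiety. Counts are illustrative,
# not the study's data.

high_anx_unclear, low_anx_unclear = 40, 60     # predictor present
high_anx_clear,   low_anx_clear   = 180, 1036  # predictor absent

odds_ratio = (high_anx_unclear * low_anx_clear) / (low_anx_unclear * high_anx_clear)

# Woolf's method: standard error of log(OR) from the four cell counts.
se_log_or = math.sqrt(1/40 + 1/60 + 1/180 + 1/1036)
ci_low  = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(round(odds_ratio, 2), round(ci_low, 2), round(ci_high, 2))
```

    A multivariable logistic regression, as used in the study, adjusts each such odds ratio for the other patient characteristics simultaneously.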

  4. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed description of the model input parameters, their development, and the relationship between the parameters and specific features events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis

  5. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed description of the model input parameters, their development, and the relationship between the parameters and specific features events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  6. A definition and evaluation procedure of generalized stress intensity factors at cracks and multi-material wedges

    International Nuclear Information System (INIS)

    Song Chongmin

    2010-01-01

    A definition of generalized stress intensity factors is proposed. It is based on a matrix function solution for singular stress fields obtained from the scaled boundary finite-element method. The dimensions of the matrix are equal to the number of singular terms. Not only real and complex power singularities but also power-logarithmic singularities are represented in a unified expression without explicitly determining the type of singularity. The generalized stress intensity factors are evaluated directly from the definition by following standard stress recovery procedures in the finite element method. Numerical examples are presented to validate the definition and evaluation procedure.

  7. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    . Schölkopf et al. introduce kernel PCA. Shawe-Taylor and Cristianini is an excellent reference for kernel methods in general. Bishop and Press et al. describe kernel methods among many other subjects. Nielsen and Canty use kernel PCA to detect change in univariate airborne digital camera images. The kernel...... version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis...
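    A minimal sketch of the kernel PCA step referred to above, using a Gaussian kernel with NumPy (the data, kernel choice, and kernel width are illustrative):

```python
import numpy as np

# Kernel PCA sketch: build a Gaussian (RBF) kernel matrix, centre it in
# feature space, and eigendecompose; projections onto the leading
# eigenvectors play the role of principal components.

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))           # 50 samples, 3 variables (toy data)

# Gaussian kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
sigma = 1.5
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / (2 * sigma**2))

# Centre the kernel matrix in feature space.
n = K.shape[0]
one = np.ones((n, n)) / n
Kc = K - one @ K - K @ one + one @ K @ one

# Leading eigenpairs give the nonlinear "components".
eigvals, eigvecs = np.linalg.eigh(Kc)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2
scores = eigvecs[:, :k] * np.sqrt(np.maximum(eigvals[:k], 0))
print(scores.shape)
```

    The kernel versions of MAF and related transformations mentioned in the abstract follow the same pattern, with the linear analysis carried out implicitly in the kernel-induced feature space.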

  8. Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.

    Science.gov (United States)

    G.R. Johnson; J.N. King

    1998-01-01

    Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work on most statistical software packages that can compute variance component estimates. The...
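
    As a minimal, hedged illustration of the variance-component arithmetic involved, the sketch below applies method-of-moments estimation to a balanced one-way random-effects layout. A real half-diallel analysis partitions general and specific combining ability across the two variance-component runs, so treat this as a simplified stand-in, not the paper's procedure.

```python
import numpy as np

def variance_components_oneway(groups):
    """Method-of-moments variance components for a balanced one-way
    random-effects ANOVA: family (between) and residual (within)."""
    k = len(groups)                    # number of families
    n = len(groups[0])                 # observations per family
    grand = np.mean([x for g in groups for x in g])
    means = [np.mean(g) for g in groups]
    msb = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (k * (n - 1))
    var_between = max((msb - msw) / n, 0.0)   # E[MSB] = sigma2_w + n * sigma2_b
    var_within = msw
    return var_between, var_within

vb, vw = variance_components_oneway([[1, 2, 3], [4, 5, 6]])
print(vb, vw)  # ~4.167, 1.0
```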

  9. Development of a procedure for qualitative and quantitative evaluation of human factors as a part of probabilistic safety assessments of nuclear power plants. Part A

    International Nuclear Information System (INIS)

    Richei, A.

    1998-01-01

    The objective of this project is the development of a procedure for the qualitative and quantitative evaluation of human factors in probabilistic safety assessments for nuclear power plants. The Human Error Rate Assessment and Optimizing System (HEROS) is introduced. A task is evaluated with HEROS on three evaluation levels, i.e. 'Management Structure', 'Working Environment' and 'Man-Machine Interface'. The developed expert system uses fuzzy set theory for the assessment. Evaluation criteria are also derived for cognitive tasks. The validation of the procedure is based on three examples that reflect the common practice of probabilistic safety assessments and include problems that cannot, or only insufficiently, be evaluated with established human reliability analysis procedures. HEROS applications give plausible and comprehensible results. (orig.) [de

  10. DISRUPTIVE EVENT BIOSPHERE DOSE CONVERSION FACTOR ANALYSIS

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The Biosphere Model Report (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). 
The objective of this analysis was to develop the BDCFs for the volcanic

  11. CONSIDERATIONS FOR THE TREATMENT OF COMPUTERIZED PROCEDURES IN HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-07-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  12. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES, PART TWO: APPLICABILITY OF CURRENT METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-10-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  13. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness.

    Science.gov (United States)

    Graneheim, U H; Lundman, B

    2004-02-01

    Qualitative content analysis as described in published literature shows conflicting opinions and unsolved issues regarding meaning and use of concepts, procedures and interpretation. This paper provides an overview of important concepts (manifest and latent content, unit of analysis, meaning unit, condensation, abstraction, content area, code, category and theme) related to qualitative content analysis; illustrates the use of concepts related to the research procedure; and proposes measures to achieve trustworthiness (credibility, dependability and transferability) throughout the steps of the research procedure. Interpretation in qualitative content analysis is discussed in light of Watzlawick et al.'s [Pragmatics of Human Communication. A Study of Interactional Patterns, Pathologies and Paradoxes. W.W. Norton & Company, New York, London] theory of communication.

  14. Procedures for conducting common cause failure analysis in probabilistic safety assessment

    International Nuclear Information System (INIS)

    1992-05-01

    The principal objective of this report is to supplement the procedure developed in Mosleh et al. (1988, 1989) by providing more explicit guidance for a practical approach to common cause failures (CCF) analysis. The detailed CCF analysis following that procedure would be very labour intensive and time consuming. This document identifies a number of options for performing the more labour intensive parts of the analysis in an attempt to achieve a balance between the need for detail, the purpose of the analysis and the resources available. The document is intended to be compatible with the Agency's Procedures for Conducting Probabilistic Safety Assessments for Nuclear Power Plants (IAEA, 1992), but can be regarded as a stand-alone report to be used in conjunction with NUREG/CR-4780 (Mosleh et al., 1988, 1989) to provide additional detail, and discussion of key technical issues
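
    For orientation, the simplest parametric CCF model treated in NUREG/CR-4780 is the beta-factor model, in which a fraction beta of a component's failure rate is attributed to common cause. The sketch below shows only the point-estimate arithmetic, with purely hypothetical rates and mission time.

```python
def beta_factor_unavailability(lam, beta, mission_time, n_redundant):
    """Point-estimate failure probability of an n-redundant group under
    the beta-factor model: all trains must fail independently, while a
    single common cause event fails the whole group at once."""
    lam_ind = (1.0 - beta) * lam          # independent failure rate
    lam_ccf = beta * lam                  # common cause failure rate
    q_ind = (lam_ind * mission_time) ** n_redundant
    q_ccf = lam_ccf * mission_time
    return q_ind + q_ccf

# hypothetical 2-train group: lambda = 1e-5/h, beta = 0.1, 24 h mission
q = beta_factor_unavailability(1e-5, 0.1, 24.0, 2)
print(q)  # the common cause term (2.4e-5) dominates the independent term
```

    The common cause term dominates by almost three orders of magnitude, which is why CCF treatment matters for redundant trains.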

  15. Experience with conventional inelastic analysis procedures in very high temperature applications

    International Nuclear Information System (INIS)

    Mallett, R.H.; Thompson, J.M.; Swindeman, R.W.

    1991-01-01

    Conventional incremental plasticity and creep analysis procedures for inelastic analysis are applied to hot flue gas cleanup system components. These flue gas systems operate at temperatures where plasticity and creep are closely intertwined, while the two phenomena are treated separately in the conventional inelastic analysis procedure. Data for RA333 material are represented in forms appropriate for the conventional inelastic analysis procedures. Behavior is predicted for typical operating cycles. Creep-fatigue damage is estimated based upon usage fractions. Excessive creep damage is predicted; the major contributions occur during high-stress, short-term intervals caused by rapid temperature changes. In this paper these results are presented and their interpretation in terms of creep-fatigue damage for very high temperature applications is discussed
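
    The usage-fraction estimate mentioned above is a linear damage summation: fatigue usage sums n/N over cycle types and creep usage sums t/t_r over hold intervals. A minimal sketch, with made-up cycle counts and rupture times rather than RA333 data:

```python
def creep_fatigue_damage(cycles, hold_intervals):
    """Linear damage summation: fatigue usage is the sum of applied
    cycles n over allowable cycles N; creep usage is the sum of hold
    time t over time-to-rupture t_r at the operating stress."""
    d_fatigue = sum(n / N for n, N in cycles)
    d_creep = sum(t / t_r for t, t_r in hold_intervals)
    return d_fatigue + d_creep

# 200 cycles against a 1e4-cycle allowable, plus two hold intervals
total = creep_fatigue_damage([(200, 1e4)], [(500, 2e4), (100, 5e4)])
print(total)  # 0.02 + 0.025 + 0.002 = 0.047
```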

  16. Procedure for statistical analysis of one-parameter discrepant experimental data

    International Nuclear Information System (INIS)

    Badikov, Sergey A.; Chechev, Valery P.

    2012-01-01

    A new, Mandel–Paule-type procedure for statistical processing of one-parameter discrepant experimental data is described. The procedure enables one to estimate the contribution of unrecognized experimental errors to the total experimental uncertainty as well as to include it in the analysis. A definition of discrepant experimental data for an arbitrary number of measurements is introduced as an accompanying result. In the case of negligible unrecognized experimental errors, the procedure simply reduces to the calculation of the weighted average and its internal uncertainty. The procedure was applied to the statistical analysis of half-life experimental data. Mean half-lives for 20 actinides were calculated and the results were compared to the ENSDF and DDEP evaluations. On the whole, the calculated half-lives are consistent with the ENSDF and DDEP evaluations. However, the uncertainties calculated in this work essentially exceed the ENSDF and DDEP evaluations for discrepant experimental data. This effect can be explained by adequately taking into account unrecognized experimental errors. - Highlights: ► A new statistical procedure for processing one-parameter discrepant experimental data has been presented. ► The procedure estimates the contribution of unrecognized errors to the total experimental uncertainty. ► The procedure was applied to processing half-life discrepant experimental data. ► Results of the calculations are compared to the ENSDF and DDEP evaluations.
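
    The core of a Mandel–Paule-type procedure is solving for the between-measurement variance s² (the unrecognized-error term) such that the weighted sum of squared residuals equals n − 1. The sketch below uses a simple Newton iteration; the starting value and convergence details are implementation choices, not the paper's exact algorithm.

```python
import numpy as np

def mandel_paule(x, u, tol=1e-10, max_iter=200):
    """Mandel-Paule estimate: find s2 >= 0 such that
    sum w_i (x_i - xbar)^2 = n - 1, where w_i = 1/(u_i^2 + s2).
    Returns the weighted mean and the estimated s2."""
    x, u = np.asarray(x, float), np.asarray(u, float)
    n = len(x)
    s2 = max(np.var(x, ddof=1) - np.mean(u ** 2), 0.0)  # starting value
    for _ in range(max_iter):
        w = 1.0 / (u ** 2 + s2)
        xbar = np.sum(w * x) / np.sum(w)
        f = np.sum(w * (x - xbar) ** 2) - (n - 1)  # root condition
        if abs(f) < tol:
            break
        df = -np.sum(w ** 2 * (x - xbar) ** 2)     # d f / d s2 (approx.)
        s2 = max(s2 - f / df, 0.0)
    w = 1.0 / (u ** 2 + s2)
    return np.sum(w * x) / np.sum(w), s2

# two discrepant measurements: stated uncertainties cannot explain the spread
xbar, s2 = mandel_paule([10.0, 12.0], [0.1, 0.1])
print(xbar, s2)  # ~11.0, ~1.99
```

    When the data are consistent, s2 collapses to zero and the result is the ordinary weighted average with its internal uncertainty, as the abstract notes.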

  17. Analysis of mineral phases in coal utilizing factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    The mineral phase inclusions of coal are discussed. The contribution of these to a coal sample is determined utilizing several techniques. Neutron activation analysis in conjunction with coal washability studies has produced some information on the general trends of elemental variation in the mineral phases. These results have been enhanced by the use of various statistical techniques. The target transformation factor analysis is specifically discussed and shown to be able to produce elemental profiles of the mineral phases in coal. A data set consisting of physically fractionated coal samples was generated. These samples were analyzed by neutron activation analysis and then their elemental concentrations examined using TTFA. Information concerning the mineral phases in coal can thus be acquired from factor analysis even with limited data. Additional data may permit the resolution of additional mineral phases as well as refinement of those already identified

  18. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works and its criteria, including its main assumptions, is discussed, as well as when it should be used. Mathematical theories are explored to enlighten students on how exploratory factor analysis works, an example of how to run an exploratory factor analysis on SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.
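
    As a concrete example of one retention criterion covered in such overviews, the sketch below computes the correlation-matrix eigenvalues and applies Kaiser's eigenvalue-greater-than-one rule. The paper demonstrates SPSS; this NumPy version and its synthetic one-factor data are our own illustration.

```python
import numpy as np

def kaiser_retention(data):
    """Eigenvalues of the correlation matrix (descending) and the
    number of factors retained by Kaiser's eigenvalue > 1 rule."""
    R = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
    return eigvals, int(np.sum(eigvals > 1.0))

rng = np.random.default_rng(1)
latent = rng.normal(size=300)                        # one common factor
data = latent[:, None] + rng.normal(size=(300, 4)) * 0.7
eigvals, n_retained = kaiser_retention(data)
print(n_retained)  # 1 -- a single dominant factor
```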

  19. Seismic analysis response factors and design margins of piping systems

    International Nuclear Information System (INIS)

    Shieh, L.C.; Tsai, N.C.; Yang, M.S.; Wong, W.L.

    1985-01-01

    The objective of the simplified methods project of the Seismic Safety Margins Research Program is to develop a simplified seismic risk methodology for general use. The goal is to reduce seismic PRA costs to roughly 60 man-months over a 6 to 8 month period, without compromising the quality of the product. To achieve the goal, it is necessary to simplify the calculational procedure of the seismic response. The response factor approach serves this purpose. The response factor relates the median level response to the design data. Through a literature survey, we identified the various seismic analysis methods adopted in the U.S. nuclear industry for the piping system. A series of seismic response calculations was performed. The response factors and their variabilities for each method of analysis were computed. A sensitivity study of the effect of piping damping, in-structure response spectra envelop method, and analysis method was conducted. In addition, design margins, which relate the best-estimate response to the design data, are also presented

  20. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle

  1. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…
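
    One of the criteria commonly evaluated in such comparisons is Horn's parallel analysis, which retains factors whose observed eigenvalues exceed those of random data of the same shape. The sketch below is an R-technique version over subjects; a P-technique run would instead feed in a single subject's variables-by-occasions data matrix. The data here are synthetic.

```python
import numpy as np

def parallel_analysis(data, n_sim=100, seed=0):
    """Horn's parallel analysis: count observed correlation-matrix
    eigenvalues that exceed the mean eigenvalues of simulated random
    normal data with the same number of rows and columns."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sims = np.empty((n_sim, p))
    for i in range(n_sim):
        fake = rng.normal(size=(n, p))
        sims[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(fake, rowvar=False)))[::-1]
    return int(np.sum(obs > sims.mean(axis=0)))

rng = np.random.default_rng(2)
latent = rng.normal(size=300)                        # one strong factor
data = latent[:, None] + rng.normal(size=(300, 5)) * 0.5
print(parallel_analysis(data))  # 1
```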

  2. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  3. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  4. Exploratory Bi-Factor Analysis: The Oblique Case

    Science.gov (United States)

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 47:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  5. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    Science.gov (United States)

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.

  6. Exploratory Bi-factor Analysis: The Oblique Case

    OpenAIRE

    Jennrich, Robert L.; Bentler, Peter M.

    2011-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading mat...

  7. Quantitative analysis of crystalline pharmaceuticals in tablets by pattern-fitting procedure using X-ray diffraction pattern.

    Science.gov (United States)

    Takehira, Rieko; Momose, Yasunori; Yamamura, Shigeo

    2010-10-15

    A pattern-fitting procedure using an X-ray diffraction pattern was applied to the quantitative analysis of a binary system of crystalline pharmaceuticals in tablets. Orthorhombic crystals of isoniazid (INH) and mannitol (MAN) were used for the analysis. Tablets were prepared under various compression pressures using a direct compression method with various compositions of INH and MAN. Assuming that the X-ray diffraction pattern of the INH-MAN system consists of diffraction intensities from the respective crystals, observed diffraction intensities were fitted to an analytic expression based on X-ray diffraction theory and separated into two intensities from INH and MAN crystals by a nonlinear least-squares procedure. After separation, the contents of INH were determined by using the optimized normalization constants for INH and MAN. A correction parameter including all the factors that are beyond experimental control was required for quantitative analysis without a calibration curve. The pattern-fitting procedure made it possible to determine crystalline phases in the range of 10-90% (w/w) of the INH contents. Further, certain characteristics of the crystals in the tablets, such as the preferred orientation, size of crystallite, and lattice disorder, were determined simultaneously. This method can be adopted to analyze compounds whose crystal structures are known. It is a potentially powerful tool for the quantitative phase analysis and characterization of crystals in tablets and powders using X-ray diffraction patterns.
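
    The separation step can be illustrated by a least-squares fit of an observed pattern to two single-phase reference patterns. The published procedure is a full nonlinear pattern fit with normalization constants and a correction parameter, so the sketch below, with synthetic Gaussian "peaks", shows only the structure of the idea.

```python
import numpy as np

def two_phase_fractions(observed, ref_a, ref_b):
    """Decompose an observed pattern into nonnegative contributions of
    two reference patterns and return normalized weight fractions."""
    A = np.column_stack([ref_a, ref_b])
    coef, *_ = np.linalg.lstsq(A, observed, rcond=None)
    coef = np.clip(coef, 0.0, None)
    return coef / coef.sum()

# synthetic check: a 70/30 mixture of two made-up single-phase patterns
x = np.linspace(0.0, 10.0, 500)
pat_a = np.exp(-(x - 3.0) ** 2 / 0.05)
pat_b = np.exp(-(x - 6.0) ** 2 / 0.05)
mix = 0.7 * pat_a + 0.3 * pat_b
print(two_phase_fractions(mix, pat_a, pat_b))  # [0.7 0.3]
```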

  8. Analysis of Factors Associated With Rhytidectomy Malpractice Litigation Cases.

    Science.gov (United States)

    Kandinov, Aron; Mutchnick, Sean; Nangia, Vaibhuv; Svider, Peter F; Zuliani, Giancarlo F; Shkoukani, Mahdi A; Carron, Michael A

    2017-07-01

    This study investigates the financial burden of medical malpractice litigation associated with rhytidectomies, as well as factors that contribute to litigation and poor defendant outcomes, which can help guide physician practices. To comprehensively evaluate rhytidectomy malpractice litigation. Jury verdict and settlement reports related to rhytidectomy malpractice litigations were obtained using the Westlaw Next database. Use of medical malpractice in conjunction with several terms for rhytidectomy, to account for the various procedure names associated with the procedure, yielded 155 court cases. Duplicate and nonrelevant cases were removed, and 89 cases were included in the analysis and reviewed for outcomes, defendant specialty, payments, and other allegations raised in proceedings. Data were collected from November 21, 2015, to December 25, 2015. Data analysis took place from December 25, 2015, to January 20, 2016. A total of 89 cases met our inclusion criteria. Most plaintiffs were female (81 of 88 with known sex [92%]), and patient age ranged from 40 to 76 years (median age, 56 years). Fifty-three (60%) were resolved in the defendant's favor, while the remaining 36 cases (40%) were resolved with either a settlement or a plaintiff verdict payment. The mean payment was $1.4 million. A greater proportion of cases involving plastic surgeon defendants were resolved with payment compared with cases involving defendants with ear, nose, and throat specialty (15 [36%] vs 4 [24%]). The most common allegations raised in litigation were intraoperative negligence (61 [69%]), poor cosmesis or disfigurement (57 [64%]), inadequate informed consent (30 [34%]), additional procedures required (14 [16%]), postoperative negligence (12 [14%]), and facial nerve injury (10 [11%]). Six cases (7%) involved alleged negligence surrounding a "lifestyle-lift" procedure, which tightens or oversews the superficial muscular aponeurosis system layer. In this study, although most cases of

  9. Energetic soft-tissue treatment technologies: an overview of procedural fundamentals and safety factors

    NARCIS (Netherlands)

    van de Berg, N. J.; van den Dobbelsteen, J. J.; Jansen, F. W.; Grimbergen, C. A.; Dankelman, J.

    2013-01-01

    Energy administered during soft-tissue treatments may cauterize, coagulate, seal, or otherwise affect underlying structures. A general overview of the functionality, procedural outcomes, and associated risks of these treatments, however, is not yet generally available. In addition, literature is

  10. Procedure for measurement of anisotropy factor for neutron sources; Procedimentos para medição do fator de anisotropia de fontes de nêutrons

    Energy Technology Data Exchange (ETDEWEB)

    Creazolla, Prycylla Gomes

    2017-07-01

    Radioisotope neutron sources allow the production of reference fields for calibration of neutron detectors for radiation protection and analysis purposes. When the emission rate of these sources is isotropic, no correction is necessary. However, variations in source encapsulation and in the radioactive material concentration produce differences in the neutron emission rate relative to the source axis; this effect is called anisotropy. In this study, a procedure for measuring the anisotropy factor of neutron sources, performed in the Laboratório de Metrologia de Neutrons (LN) using a Precision Long Counter (PLC) detector, is described. A measurement procedure that takes into account the anisotropy factor of neutron sources contributes to solving some issues, particularly with respect to the high uncertainties associated with neutron dosimetry. Thus, a bibliographical review was carried out based on international standards and technical regulations specific to the area of neutron fields, which were later reproduced in practice by means of the procedure for measuring the anisotropy factor of neutron sources at the LN. The anisotropy factor is determined as a function of the angle of 90° in relation to the cylindrical axis of the source. This angle is the most important because it is widely used in measurements and has a higher neutron emission rate than other angles. (author)
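
    Numerically, the anisotropy factor at a given angle is the ratio of the count rate measured at that angle to the angle-averaged rate. The sketch below uses a plain mean over hypothetical PLC count rates; a rigorous average over the full sphere would weight each angle by its solid angle.

```python
import numpy as np

def anisotropy_factor(angles_deg, count_rates, at_angle=90.0):
    """Ratio of the count rate at the requested angle (relative to the
    source's cylindrical axis) to the mean rate over measured angles."""
    rates = np.asarray(count_rates, float)
    i = int(np.argmin(np.abs(np.asarray(angles_deg, float) - at_angle)))
    return rates[i] / rates.mean()

# hypothetical PLC measurements around a cylindrical source
angles = [0.0, 45.0, 90.0, 135.0, 180.0]
rates = [95.0, 100.0, 110.0, 100.0, 95.0]
print(anisotropy_factor(angles, rates))  # 1.1
```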

  11. Procedures for multielement analysis using high-flux fast-neutron activation

    International Nuclear Information System (INIS)

    Williams, R.E.; Hopke, P.K.; Meyer, R.A.

    1981-06-01

    Improvements have been made in the rabbit system used for multi-element fast-neutron activation analysis at the Lawrence Livermore National Laboratory Rotating Target Neutron Source, RTNS-I. Procedures have been developed for the analysis of 20 to 25 elements in samples with an inorganic matrix and 10 to 15 elements in biological samples, without the need for prohibitively expensive, long irradiations. Results are presented for the analysis of fly ash, orchard leaves, and bovine liver

  12. Simple procedure for evaluating earthquake response spectra of large-event motions based on site amplification factors derived from smaller-event records

    International Nuclear Information System (INIS)

    Dan, Kazuo; Miyakoshi, Jun-ichi; Yashiro, Kazuhiko.

    1996-01-01

    A simple procedure was proposed for evaluating earthquake response spectra of large-event motions by making use of records from smaller events. The result of the regression analysis of the response spectra was utilized to obtain the site amplification factors in the proposed procedure, and the formulation of the seismic-source term in the regression analysis was examined. A linear form of the moment magnitude, Mw, is good for scaling the source term of moderate earthquakes with Mw of 5.5 to 7.0, while a quadratic form of Mw and the ω-square source-spectrum model is appropriate for scaling the source term of smaller and greater earthquakes, respectively. (author). 52 refs
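
    The two source-term formulations compared above can be sketched as ordinary least squares on log spectral amplitude versus moment magnitude. The site and distance terms of the full regression analysis are omitted, and the data below are synthetic.

```python
import numpy as np

def fit_source_term(mw, log_sa, quadratic=False):
    """Least-squares fit of log Sa = a + b*Mw (linear form) or
    log Sa = a + b*Mw + c*Mw**2 (quadratic form)."""
    mw = np.asarray(mw, float)
    cols = [np.ones_like(mw), mw] + ([mw ** 2] if quadratic else [])
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, np.asarray(log_sa, float), rcond=None)
    return coef

mw = np.array([5.5, 6.0, 6.5, 7.0])         # moderate-magnitude range
coef = fit_source_term(mw, 0.5 + 0.3 * mw)  # exactly linear toy data
print(coef)  # ~[0.5 0.3]
```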

  13. Reliability assessment of a manual-based procedure towards learning curve modeling and fmea analysis

    Directory of Open Access Journals (Sweden)

    Gustavo Rech

    2013-03-01

    Separation procedures in drug Distribution Centers (DC) are manual-based activities prone to failures such as shipping exchanged, expired or broken drugs to the customer. Two interventions seem promising for improving the reliability of the separation procedure: (i) selection and allocation of appropriate operators to the procedure, and (ii) analysis of potential failure modes incurred by selected operators. This article integrates Learning Curves (LC) and FMEA (Failure Mode and Effect Analysis) aimed at reducing the occurrence of failures in the manual separation of a drug DC. LC parameters enable generating an index to identify the operators recommended to perform the procedures. The FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. The traditional FMEA severity index is also deployed into two sub-indexes related to financial issues and damage to the company's image in order to characterize failure severity. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.
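
    The severity deployment can be sketched as a modified Risk Priority Number. How the two severity sub-indexes are combined here (a simple mean) is our assumption, not the paper's formula, and all ratings are hypothetical.

```python
def rpn_with_split_severity(occurrence, detection, sev_financial, sev_image):
    """RPN = occurrence x detection x severity, with the traditional
    severity index deployed into financial and company-image
    sub-indexes, combined as their mean (assumed combination rule)."""
    severity = (sev_financial + sev_image) / 2.0
    return occurrence * detection * severity

print(rpn_with_split_severity(occurrence=4, detection=6,
                              sev_financial=8, sev_image=6))  # 168.0
```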

  14. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    Science.gov (United States)

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of

  15. Power analysis for multivariate and repeated measures designs: a flexible approach using the SPSS MANOVA procedure.

    Science.gov (United States)

    D'Amico, E J; Neilands, T B; Zambarano, R

    2001-11-01

    Although power analysis is an important component in the planning and implementation of research designs, it is often ignored. Computer programs for performing power analysis are available, but most have limitations, particularly for complex multivariate designs. An SPSS procedure is presented that can be used for calculating power for univariate, multivariate, and repeated measures models with and without time-varying and time-constant covariates. Three examples provide a framework for calculating power via this method: an ANCOVA, a MANOVA, and a repeated measures ANOVA with two or more groups. The benefits and limitations of this procedure are discussed.
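
The same kind of calculation can be done outside SPSS. A minimal sketch of fixed-effects one-way ANOVA power via the noncentral F distribution, with an illustrative effect size and group sizes:

```python
# Sketch of ANOVA power via the noncentral F distribution (the quantity
# the SPSS MANOVA procedure computes internally). Effect size f and the
# group structure are illustrative.
from scipy.stats import f as f_dist, ncf

def anova_power(f_effect, k_groups, n_per_group, alpha=0.05):
    n_total = k_groups * n_per_group
    df1, df2 = k_groups - 1, n_total - k_groups
    lam = (f_effect ** 2) * n_total           # noncentrality parameter
    f_crit = f_dist.ppf(1 - alpha, df1, df2)  # critical value under H0
    return 1 - ncf.cdf(f_crit, df1, df2, lam)

power = anova_power(f_effect=0.25, k_groups=3, n_per_group=20)
print(round(power, 3))
```

For a medium effect (f = 0.25) with three groups of 20, power is well below the conventional 0.8, which is exactly the kind of result that motivates doing the analysis before data collection.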

  16. Effectiveness of internet-based affect induction procedures: A systematic review and meta-analysis.

    Science.gov (United States)

    Ferrer, Rebecca A; Grenen, Emily G; Taber, Jennifer M

    2015-12-01

    Procedures used to induce affect in a laboratory are effective and well-validated. Given recent methodological and technological advances in Internet research, it is important to determine whether affect can be effectively induced using Internet methodology. We conducted a meta-analysis and systematic review of prior research that has used Internet-based affect induction procedures, and examined potential moderators of the effectiveness of affect induction procedures. Twenty-six studies were included in final analyses, with 89 independent effect sizes. Affect induction procedures effectively induced general positive affect, general negative affect, fear, disgust, anger, sadness, and guilt, but did not significantly induce happiness. Contamination of other nontarget affect did not appear to be a major concern. Video inductions resulted in greater effect sizes. Overall, results indicate that affect can be effectively induced in Internet studies, suggesting an important venue for the acceleration of affective science. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
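
Pooling independent effect sizes of this kind is typically done with a random-effects model. A minimal DerSimonian-Laird sketch on invented data, not the 89 effect sizes analyzed here:

```python
import numpy as np

# Illustrative sketch of random-effects pooling of standardized effect
# sizes (DerSimonian-Laird). The effects and variances are invented.
def dersimonian_laird(effects, variances):
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances                        # fixed-effect weights
    theta_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fe) ** 2)  # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)              # between-study variance
    w_re = 1.0 / (variances + tau2)
    theta_re = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return theta_re, se, tau2

theta, se, tau2 = dersimonian_laird([0.5, 0.8, 0.3, 0.6],
                                    [0.04, 0.05, 0.03, 0.06])
print(round(theta, 3), round(se, 3))
```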

  17. Pretreatment procedures applied to samples to be analysed by neutron activation analysis at CDTN/CNEN

    International Nuclear Information System (INIS)

    Francisco, Dovenir; Menezes, Maria Angela de Barros Correia

    2009-01-01

    The neutron activation technique - using several methods - has been applied in 80% of the analytical demand of the Division for Reactor and Analytical Techniques at CDTN/CNEN, Belo Horizonte, Minas Gerais. This scenario emphasizes the responsibility of the Laboratory to provide and assure the quality of the measurements. The first step in assuring the quality of results is the preparation of the samples. Therefore, this paper describes the experimental procedures adopted at CDTN/CNEN in order to standardize the conditions of analysis and to avoid contamination by elements present everywhere. Some of the procedures are based on methods described in the literature; others are based on many years of experience preparing samples from many kinds of matrices. The procedures described are related to geological materials - soil, sediment, rock, gems, clay, archaeological ceramics and ore - biological materials - hair, fish, plants, food - water, etc. Analytical results for sediment samples are shown as an example, pointing out the efficiency of the experimental procedure. (author)

  18. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this

  19. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  20. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  1. Kernel parameter dependence in spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2010-01-01

    kernel PCA. Shawe-Taylor and Cristianini [4] is an excellent reference for kernel methods in general. Bishop [5] and Press et al. [6] describe kernel methods among many other subjects. The kernel version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply a kernel version of maximum autocorrelation factor (MAF) [7, 8] analysis to irregularly sampled stream sediment geochemistry data from South Greenland and illustrate the dependence of the results on the kernel width. The 2,097 samples, each covering on average 5 km2, are analyzed chemically for the content of 41 elements.
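
The implicit transformation described above can be sketched with plain kernel PCA (the MAF variant adds an autocorrelation term not reproduced here). The data and kernel widths below are illustrative:

```python
import numpy as np

# Sketch of the kernel trick: an RBF Gram matrix is centered in feature
# space and eigendecomposed (kernel PCA). The random 2-D data and the
# kernel widths sigma are illustrative; note how the spectrum changes
# with the width, the dependence the record studies.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))

def kernel_pca_eigvals(X, sigma):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma ** 2))        # RBF Gram matrix
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                            # centering in feature space
    return np.sort(np.linalg.eigvalsh(Kc))[::-1]

for sigma in (0.5, 2.0):                      # spectrum depends on the width
    print(sigma, round(kernel_pca_eigvals(X, sigma)[0], 3))
```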

  2. Risk Factor Analysis for Oral Precancer among Slum Dwellers in ...

    African Journals Online (AJOL)

    risk factors for oral precancer, i.e., smoking/smokeless tobacco, chewing ... procedure was performed on a group of 10 subjects, which were ... clinical description of observed oral mucosal lesions was made ..... use and effects of cessation.

  3. Minerals sampling: sensitivity analysis and correction factors for Pierre Gy's equation

    International Nuclear Information System (INIS)

    Vallebuona, G.; Niedbalski, F.

    2005-01-01

    Pierre Gy's equation is widely used in ore sampling. The equation is based on four parameters: the shape factor, the size distribution factor, the mineralogical factor and the liberation factor. The usual practice is to consider fixed values for the shape and size distribution factors, a practice that does not represent several important ores well. The mineralogical factor considers only one species of interest and the gangue, leaving out cases such as polymetallic ores where there is more than one species of interest. A sensitivity analysis of the factors in Gy's equation was performed, and a procedure to determine specific values for them was developed and is presented in this work. Mean ore characteristics associated with an insecure use of the current procedure were determined. Finally, for a case study, the effects of using each alternative were evaluated. (Author) 4 refs
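
Gy's formula itself is short enough to sketch. All parameter values below are illustrative, not those of the case study:

```python
# Sketch of Pierre Gy's fundamental sampling error formula with
# illustrative values: shape factor f, granulometric (size distribution)
# factor g, mineralogical factor c (g/cm3), liberation factor l, top
# particle size d in cm, sample and lot masses in grams.
def gy_relative_variance(f, g, c, l, d_cm, m_sample_g, m_lot_g):
    """Relative variance of the fundamental sampling error."""
    return (1.0 / m_sample_g - 1.0 / m_lot_g) * f * g * c * l * d_cm ** 3

var = gy_relative_variance(f=0.5, g=0.25, c=400.0, l=0.2, d_cm=1.0,
                           m_sample_g=2_000, m_lot_g=1_000_000)
rsd_percent = 100 * var ** 0.5
print(round(rsd_percent, 1))  # relative standard deviation, percent
```

The sensitivity the record analyzes is visible directly: halving the sample mass roughly doubles the variance, while the cubic dependence on top particle size dominates everything else.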

  4. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    Directory of Open Access Journals (Sweden)

    James Baglin

    2014-06-01

    Full Text Available Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many guidelines have been proposed with the aim of improving application. Unfortunately, implementing recommended EFA practices has been restricted by the range of options available in commercial statistical packages and, perhaps, by an absence of clear, practical 'how-to' demonstrations. Consequently, this article describes the application of methods recommended to get the most out of your EFA. The article focuses on dealing with the common situation of analysing ordinal data as derived from Likert-type scales. These methods are demonstrated using the free, stand-alone, easy-to-use and powerful EFA package FACTOR (http://psico.fcep.urv.es/utilitats/factor/; Lorenzo-Seva & Ferrando, 2006). The demonstration applies the recommended techniques to an accompanying dataset based on the Big 5 personality test. The outcomes obtained by the EFA using the recommended procedures through FACTOR are compared to the default techniques currently available in SPSS.
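
One of the commonly recommended practices that FACTOR implements is Horn's parallel analysis for deciding how many factors to retain. A minimal numpy sketch on simulated data with two underlying factors (illustrative, not the article's Big 5 dataset):

```python
import numpy as np

# Horn's parallel analysis: retain factors whose observed eigenvalues
# exceed the mean eigenvalues of random data of the same shape. The
# simulated dataset (6 variables, 2 factors, loadings 0.7) is illustrative.
rng = np.random.default_rng(42)
n, loading = 500, 0.7
F = rng.normal(size=(n, 2))
noise = rng.normal(size=(n, 6)) * np.sqrt(1 - loading ** 2)
X = np.column_stack([loading * F[:, i // 3] for i in range(6)]) + noise

obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(X.T)))[::-1]
rand_eig = np.mean([
    np.sort(np.linalg.eigvalsh(np.corrcoef(rng.normal(size=(n, 6)).T)))[::-1]
    for _ in range(200)], axis=0)
retained = int(np.sum(obs_eig > rand_eig))   # factors beating random data
print(retained)
```

With the two-factor structure built into the simulation, the criterion recovers exactly two factors; a full ordinal-data workflow would first replace the Pearson correlations with polychoric ones, as FACTOR does.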

  5. A finite volume procedure for fluid flow, heat transfer and solid-body stress analysis

    KAUST Repository

    Jagad, P. I.

    2018-04-12

    A unified cell-centered unstructured mesh finite volume procedure is presented for fluid flow, heat transfer and solid-body stress analysis. An in-house procedure (A. W. Date, Solution of Transport Equations on Unstructured Meshes with Cell-Centered Colocated Variables. Part I: Discretization, International Journal of Heat and Mass Transfer, vol. 48 (6), 1117-1127, 2005) is extended to include the solid-body stress analysis. The transport terms for a cell-face are evaluated in a structured grid-like manner. The Cartesian gradients at the center of each cell-face are evaluated using the coordinate transformation relations. The accuracy of the procedure is demonstrated by solving several benchmark problems involving different boundary conditions, source terms, and types of loading.

  6. An automated sensitivity analysis procedure for the performance assessment of nuclear waste isolation systems

    International Nuclear Information System (INIS)

    Pin, F.G.; Worley, B.A.; Oblow, E.M.; Wright, R.Q.; Harper, W.V.

    1986-01-01

    To support an effort to make large-scale sensitivity analyses feasible, cost-efficient and quantitatively complete, the authors have developed an automated procedure making use of computer calculus. The procedure, called GRESS (GRadient Enhanced Software System), is embodied in a precompiler that can process Fortran computer codes and add derivative-taking capabilities to the normal calculation scheme. In this paper, the automated GRESS procedure is described and applied to the code UCB-NE-10.2, which simulates the migration through a sorption medium of the radionuclide members of a decay chain. The sensitivity calculations for a sample problem are verified by comparison with analytical and perturbation analysis results. Conclusions are drawn regarding the applicability of GRESS to more general large-scale sensitivity studies, and the role of such techniques in an overall sensitivity and uncertainty analysis program is discussed
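
GRESS works by precompiling Fortran, but the underlying idea of carrying derivatives through a calculation can be sketched with forward-mode dual numbers on a toy single-species decay model (values illustrative, not the UCB-NE-10.2 chain):

```python
import math

# Forward-mode automatic differentiation sketch: each Dual carries a value
# and a derivative, so the "model" N(t) = N0*exp(-k*t) yields dN/dk
# alongside N in one pass. The decay constant and time are illustrative.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def __neg__(self):
        return Dual(-self.val, -self.der)

def exp(x):
    e = math.exp(x.val)
    return Dual(e, e * x.der)                 # chain rule for exp

def model(k, n0=100.0, t=2.0):
    return n0 * exp(-(k * t))                 # N(t) with derivative carried

k = Dual(0.3, 1.0)                            # seed the sensitivity dN/dk
out = model(k)
print(round(out.val, 4), round(out.der, 4))
```

The computed derivative matches the analytic sensitivity dN/dk = -N0 t e^{-kt} exactly, which is the kind of verification the paper performs against perturbation results.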

  7. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies employed for the analysis of uncertainties in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA specific uncertainty analysis methodologies, (2) selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes
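
Statistical propagation of uncertainty inputs through an APET can be sketched by Monte Carlo sampling of branch probabilities. The two-branch tree and beta parameters below are invented for illustration:

```python
import numpy as np

# Hypothetical sketch: propagating uncertain branch probabilities through
# a tiny two-branch accident progression event tree in which early
# containment failure requires vessel breach AND direct containment
# heating. Beta distributions stand in for the analysts' subjective
# probabilities; the structure and parameters are invented.
rng = np.random.default_rng(1)
n = 100_000
p_vessel_breach = rng.beta(8, 2, n)           # uncertain branch probability
p_dch_given_breach = rng.beta(2, 8, n)        # conditional branch probability
p_early_failure = p_vessel_breach * p_dch_given_breach

mean = p_early_failure.mean()
p5, p95 = np.percentile(p_early_failure, [5, 95])
print(round(mean, 3), round(p5, 3), round(p95, 3))
```

The output distribution, rather than a single point value, is what feeds the source term categories downstream.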

  8. Long-term results of the Draf 3 procedure

    NARCIS (Netherlands)

    Georgalas, C.; Hansen, F.; Videler, W. J. M.; Fokkens, W. J.

    2011-01-01

    To assess the effectiveness of, and the factors associated with restenosis after, the Draf type III (Endoscopic Modified Lothrop) frontal sinus drainage procedure. Retrospective analysis of prospectively collected data. One hundred and twenty-two consecutive patients undergoing the Draf III procedure for recalcitrant

  9. A Procedure for the Computerized Analysis of Cleft Palate Speech Transcription

    Science.gov (United States)

    Fitzsimons, David A.; Jones, David L.; Barton, Belinda; North, Kathryn N.

    2012-01-01

    The phonetic symbols used by speech-language pathologists to transcribe speech contain underlying hexadecimal values used by computers to correctly display and process transcription data. This study aimed to develop a procedure to utilise these values as the basis for subsequent computerized analysis of cleft palate speech. A computer keyboard…

  10. Computer-Aided Diagnosis of Solid Breast Lesions Using an Ultrasonic Multi-Feature Analysis Procedure

    Science.gov (United States)

    2011-01-01

    1. BACKGROUND AND INTRODUCTION: Breast cancer affects one of every eight women, kills one of 29 women in the United States, and is the leading...

  11. A Numerical Procedure for Analysis of W/R Contact Using Explicit Finite Element Methods

    NARCIS (Netherlands)

    Ma, Y.; Markine, V.L.

    2015-01-01

    Since no effective experimental approaches have been proposed to assess wheel and rail (W/R) contact performance till now, numerical computational analysis is known as an alternative to approximately simulate the W/R interaction. In this paper, one numerical procedure is proposed on the basis of

  12. Development of residual stress analysis procedure for fitness-for-service assessment of welded structure

    International Nuclear Information System (INIS)

    Kim, Jong Sung; Jin, Tae Eun; Dong, P.; Prager, M.

    2003-01-01

    In this study, a state-of-the-art review of existing residual stress analysis techniques and representative solutions is presented in order to develop a residual stress analysis procedure for the Fitness-For-Service (FFS) assessment of welded structures. Critical issues associated with existing residual stress solutions and their treatment in performing FFS assessment are discussed. It should be recognized that detailed residual stress evolution is an extremely complicated phenomenon that typically involves material-specific thermomechanical/metallurgical response, welding process physics, and structural interactions within the component being welded. As a result, computational procedures can vary significantly, from highly complicated numerical techniques intended only to elucidate a small part of the process physics to cost-effective procedures that are deemed adequate for capturing some of the important features of a final residual stress distribution. A residual stress analysis procedure for FFS purposes belongs to the latter category. With this in mind, both residual stress analysis techniques and their adequacy for FFS are assessed based on both literature data and analyses performed in this investigation

  13. Methodology for LOCA analysis and its qualification procedures for PWR reload licensing

    International Nuclear Information System (INIS)

    Serrano, M.A.B.

    1986-01-01

    The methodology for LOCA analysis developed by FURNAS and its qualification procedure for PWR reload licensing are presented. Digital computer codes developed by the NRC and published collectively as the WREM package were modified to obtain versions that comply with each requirement of the Brazilian licensing criteria. This methodology is applied to the Angra-1 base case to conclude the qualification process. (Author) [pt

  14. Risk factors for surgical site infection following nonshunt pediatric neurosurgery: a review of 9296 procedures from a national database and comparison with a single-center experience.

    Science.gov (United States)

    Sherrod, Brandon A; Arynchyna, Anastasia A; Johnston, James M; Rozzelle, Curtis J; Blount, Jeffrey P; Oakes, W Jerry; Rocque, Brandon G

    2017-04-01

    OBJECTIVE Surgical site infection (SSI) following CSF shunt operations has been well studied, yet risk factors for nonshunt pediatric neurosurgery are less well understood. The purpose of this study was to determine SSI rates and risk factors following nonshunt pediatric neurosurgery using a nationwide patient cohort and an institutional data set specifically for better understanding SSI. METHODS The authors reviewed the American College of Surgeons National Surgical Quality Improvement Program-Pediatric (ACS NSQIP-P) database for the years 2012-2014, including all neurosurgical procedures performed on pediatric patients except CSF shunts and hematoma evacuations. SSI included deep (intracranial abscesses, meningitis, osteomyelitis, and ventriculitis) and superficial wound infections. The authors performed univariate analyses of SSI association with procedure, demographic, comorbidity, operative, and hospital variables, with subsequent multivariate logistic regression analysis to determine independent risk factors for SSI within 30 days of the index procedure. A similar analysis was performed using a detailed institutional infection database from Children's of Alabama (COA). RESULTS A total of 9296 nonshunt procedures were identified in NSQIP-P with an overall 30-day SSI rate of 2.7%. The 30-day SSI rate in the COA institutional database was similar (3.3% of 1103 procedures, p = 0.325). Postoperative time to SSI in NSQIP-P and COA was 14.6 ± 6.8 days and 14.8 ± 7.3 days, respectively (mean ± SD). Myelomeningocele (4.3% in NSQIP-P, 6.3% in COA), spine (3.5%, 4.9%), and epilepsy (3.4%, 3.1%) procedure categories had the highest SSI rates by procedure category in both NSQIP-P and COA. Independent SSI risk factors in NSQIP-P included postoperative pneumonia (OR 4.761, 95% CI 1.269-17.857, p = 0.021), immune disease/immunosuppressant use (OR 3.671, 95% CI 1.371-9.827, p = 0.010), cerebral palsy (OR 2.835, 95% CI 1.463-5.494, p = 0.002), emergency operation (OR 1
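
For a dichotomous exposure, the univariate step in such an analysis reduces to an odds ratio from a 2x2 table. A sketch with a Woolf confidence interval and invented counts (not the NSQIP-P data):

```python
import math

# Illustrative sketch of a univariate odds ratio with a Woolf (log-based)
# 95% confidence interval from a 2x2 exposure-by-SSI table. All counts
# are invented for illustration.
def odds_ratio_ci(a, b, c, d, z=1.96):
    """a=exposed cases, b=exposed non-cases, c=unexposed cases, d=unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(20, 180, 40, 1360)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI excluding 1 flags the variable for entry into the multivariate logistic regression, which then adjusts each OR for the others.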

  15. Risk factors for surgical site infection following nonshunt pediatric neurosurgery: a review of 9296 procedures from a national database and comparison with a single-center experience

    Science.gov (United States)

    Sherrod, Brandon A.; Arynchyna, Anastasia A.; Johnston, James M.; Rozzelle, Curtis J.; Blount, Jeffrey P.; Oakes, W. Jerry; Rocque, Brandon G.

    2017-01-01

    Objective Surgical site infection (SSI) following CSF shunt operations has been well studied, yet risk factors for nonshunt pediatric neurosurgery are less well understood. The purpose of this study was to determine SSI rates and risk factors following nonshunt pediatric neurosurgery using a nationwide patient cohort and an institutional dataset specifically for better understanding SSI. Methods The authors reviewed the American College of Surgeons National Surgical Quality Improvement Program Pediatric (ACS NSQIP-P) database for the years 2012–2014, including all neurosurgical procedures performed on pediatric patients except CSF shunts and hematoma evacuations. SSI included deep (intracranial abscesses, meningitis, osteomyelitis, and ventriculitis) and superficial wound infections. The authors performed univariate analyses of SSI association with procedure, demographic, comorbidity, operative, and hospital variables, with subsequent multivariate logistic regression analysis to determine independent risk factors for SSI within 30 days of the index procedure. A similar analysis was performed using a detailed institutional infection database from Children’s Hospital of Alabama (COA). Results A total of 9296 nonshunt procedures were identified in NSQIP-P with an overall 30-day SSI rate of 2.7%. The 30-day SSI rate in the COA institutional database was similar (3.3% of 1103 procedures, p = 0.325). Postoperative time to SSI in NSQIP-P and COA was 14.6 ± 6.8 days and 14.8 ± 7.3 days, respectively (mean ± SD). Myelomeningocele (4.3% in NSQIP-P, 6.3% in COA), spine (3.5%, 4.9%), and epilepsy (3.4%, 3.1%) procedure categories had the highest SSI rates by procedure category in both NSQIP-P and COA. Independent SSI risk factors in NSQIP-P included postoperative pneumonia (OR 4.761, 95% CI 1.269–17.857, p = 0.021), immune disease/immunosuppressant use (OR 3.671, 95% CI 1.371–9.827, p = 0.010), cerebral palsy (OR 2.835, 95% CI 1.463–5.494, p = 0.002), emergency

  16. Interdisciplinary analysis procedures in the modeling and control of large space-based structures

    Science.gov (United States)

    Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.

    1987-01-01

    The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.

  17. THE INFLUENCE OF THE CHOSEN SOCIO-DEMOGRAPHIC FACTORS ON THE QUALITY OF LIFE IN WOMEN AFTER GYNAECOLOGICAL SURGICAL PROCEDURES

    Directory of Open Access Journals (Sweden)

    Beata Karakiewicz

    2010-09-01

    Full Text Available Background: The aim of this study was to assess how chosen socio-demographic factors affect the quality of life of patients after gynaecological surgical procedures. Materials and Methods: Research was conducted in 2007 among 250 women operated on in the Department of Reproduction and Gynaecology, the Pomeranian Medical University in Szczecin. In this survey-based study, we used a standardized quality of life questionnaire, the Women's Health Questionnaire (WHQ), developed by Dr Myra Hunter at London University. Results: The most numerous patients were those with sleep disorders (38.8%); 37.6% of the surveyed complained of troublesome menstrual symptoms, and 26.8% of respondents had disturbing somatic symptoms, short memory and problems with concentration. The lowest percentage of women (12.4%) felt anxiety and fear associated with the past gynaecological surgical procedure. Conclusions: 1. General satisfaction and good disposition are declared by the majority of patients after gynaecological surgical procedures. 2. Age, education, having a partner, place of residence, and the number of children are the factors which have a significant effect on the quality of life of women after gynaecological procedures.

  18. SU-D-209-05: Sensitivity of the Diagnostic Radiological Index of Protection (DRIP) to Procedural Factors in Fluoroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Jones, A [UT MD Anderson Cancer Center, Houston, TX (United States); Pasciak, A [University of Tennessee Medical Center, Knoxville, TN (United States); Wagner, L [UT Medical School, Houston, TX (United States)

    2016-06-15

    Purpose: To evaluate the sensitivity of the Diagnostic Radiological Index of Protection (DRIP) to procedural factors in fluoroscopy in an effort to determine an appropriate set of scatter-mimicking primary beams (SMPB) to be used in measuring the DRIP. Methods: A series of clinical and factorial Monte Carlo simulations were conducted to determine the shape of the scattered X-ray spectra incident on the operator in different clinical fluoroscopy scenarios. Two clinical evaluations studied the sensitivity of the scattered spectrum to gantry angle and patient size while technical factors were varied according to measured automatic dose rate control (ADRC) data. Factorial evaluations studied the sensitivity of the scattered spectrum to gantry angle, field of view, patient size and beam quality for constant technical factors. Average energy was the figure of merit used to condense fluence in each energy bin to a single numerical index. Results: Beam quality had the strongest influence on the scattered spectrum in fluoroscopy. Many procedural factors affected the scattered spectrum indirectly through their effects on primary beam quality through ADRC, e.g., gantry angle and patient size. Lateral C-arm rotation, common in interventional cardiology, increased the energy of the scattered spectrum, regardless of the direction of rotation. The effect of patient size on scattered radiation depended on ADRC characteristics, patient size, and procedure type. Conclusion: The scattered spectrum striking the operator in fluoroscopy, and therefore the DRIP, is most strongly influenced by primary beam quality, particularly kV. Use cases for protective garments should be classified by typical procedural primary beam qualities, which are governed by the ADRC according to the impacts of patient size, anatomical location, and gantry angle. These results will help determine an appropriate set of SMPB to be used for measuring the DRIP.
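
The figure of merit used above is a fluence-weighted mean; a two-line sketch on a hypothetical coarse-binned spectrum:

```python
import numpy as np

# Fluence-weighted average energy of a scattered spectrum, the figure of
# merit used to condense per-bin fluence to one index. The bin centers
# and relative fluences below are hypothetical.
energy_kev = np.array([20, 40, 60, 80, 100])        # bin centers
fluence = np.array([0.05, 0.30, 0.40, 0.20, 0.05])  # relative fluence per bin
avg_energy = np.sum(energy_kev * fluence) / np.sum(fluence)
print(round(float(avg_energy), 1))
```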

  19. An expert system-based aid for analysis of Emergency Operating Procedures in NPPs

    International Nuclear Information System (INIS)

    Jakubowski, Z.; Beraha, D.

    1996-01-01

    Emergency Operating Procedures (EOPs) in general, and accident management (AM) in particular, have played a significant part in the safety philosophy of NPPs for many years, and a better methodology for the development and validation of EOPs is desired. A prototype Emergency Operating Procedures Analysis System (EOPAS), developed at GRS, is presented in the paper. The hardware configuration and software organisation of the system are briefly reviewed, and its main components, such as the expert system knowledge base and the engineering simulator, are described. (author)

  20. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not so simple as to lose the information from the alternative hypothesis. The first step is to transform distributions of different test statistics (e.g., t, chi-square or F) to distributions of corresponding p-values. We then use a step function to approximate each p-value distribution by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
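The step-function model itself is not reproduced here; as a minimal illustration of why multiplicity adjustment must enter power analysis, the sketch below computes the power of a two-sided z-test before and after a Bonferroni correction, using only the standard library. The effect size, level, and number of tests are illustrative.

```python
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def power_two_sided_z(mu, alpha):
    """Power of a two-sided z-test at level alpha when the true mean shift is mu."""
    z_crit = N.inv_cdf(1 - alpha / 2)
    return (1 - N.cdf(z_crit - mu)) + N.cdf(-z_crit - mu)

mu, alpha, m = 3.0, 0.05, 10          # effect size, familywise level, number of tests
unadjusted = power_two_sided_z(mu, alpha)      # single-test power
bonferroni = power_two_sided_z(mu, alpha / m)  # per-test power after adjustment
print(f"unadjusted={unadjusted:.3f} bonferroni={bonferroni:.3f}")
```

The adjusted power is visibly lower, which is exactly the loss a multiple-testing-aware power analysis has to budget for.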

  1. The mathematical pathogenetic factors analysis of acute inflammatory diseases development of bronchopulmonary system among infants

    Directory of Open Access Journals (Sweden)

    G. O. Lezhenko

    2017-10-01

    Full Text Available The purpose. To study the factor structure and establish the associative interaction of the pathogenetic links in the development of acute diseases of the bronchopulmonary system in infants. Materials and methods. The examination group consisted of 59 infants (average age 13.8 ± 1.4 months) with acute inflammatory bronchopulmonary diseases. We also measured the serum levels of 25-hydroxyvitamin D (25(OH)D), vitamin D-binding protein, hBPI, cathelicidin LL-37, β1-defensins and lactoferrin by enzyme immunoassay. Prognostically important pathogenetic factors of acute bronchopulmonary disease in infants were selected using ROC analysis. Objects were classified by hierarchical cluster analysis using centroid-based clustering. Results. Based on the results of the ROC analysis, 15 potential predictors of the development of acute inflammatory diseases of the bronchopulmonary system in infants were selected. The factor analysis made it possible to determine the 6 main components. The greatest influence on the development of the disease was exerted by "the anemia factor", "the factor of inflammation", "the maternal factor", "the vitamin D supply factor", "the immune factor" and "the phosphorus-calcium exchange factor", each with a factor loading of more than 0.6. The performed hierarchical cluster analysis confirmed the initial role of the immuno-inflammatory components. The conclusions. The highlighted factors define a group of parameters that must be influenced to achieve a maximum effect in carrying out preventive and therapeutic measures. First of all, it is necessary to influence "the anemia factor", "the calcium exchange factor" and "the vitamin D supply factor"; in other words, to correct vitamin D deficiency and carry out measures aimed at preventing the development of anemia. The prevention and treatment of the pathological course of
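The predictor selection above relies on ROC analysis. A dependency-free sketch of the key quantity, the area under the ROC curve, uses the Mann-Whitney pairwise form; the marker values below are made up, not the study's data.

```python
def auc(cases, controls):
    """Area under the ROC curve: probability that a randomly chosen case
    scores higher than a randomly chosen control (ties count 1/2)."""
    wins = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return wins / (len(cases) * len(controls))

# Hypothetical marker values for sick vs. healthy infants
sick = [0.9, 0.8, 0.4]
healthy = [0.3, 0.5, 0.2]
print(auc(sick, healthy))  # → 0.888... (8 of 9 case-control pairs ranked correctly)
```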

  2. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    International Nuclear Information System (INIS)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves

    2017-01-01

    In this study optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied for evaluation of the individual contribution of selected variables of the analytical procedure in the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of the statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and better irradiation conditions. (author)

  3. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves, E-mail: uandapaula@gmail.com, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    In this study optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied for evaluation of the individual contribution of selected variables of the analytical procedure in the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of the statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and better irradiation conditions. (author)
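A 2^k design estimates each variable's main effect as the contrast between its high-level and low-level runs. A generic sketch with synthetic noiseless responses (not INAA data):

```python
from itertools import product

def main_effects(k, response):
    """Main effects of a full 2^k factorial design.

    response maps each run, a tuple of k levels (-1 or +1), to its measured value."""
    runs = list(product([-1, 1], repeat=k))
    effects = []
    for factor in range(k):
        hi = [response[r] for r in runs if r[factor] == +1]
        lo = [response[r] for r in runs if r[factor] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Synthetic 2^3 responses: y = 10 + 3*A - 2*C, factor B inactive
y = {r: 10 + 3 * r[0] - 2 * r[2] for r in product([-1, 1], repeat=3)}
print(main_effects(3, y))  # → [6.0, 0.0, -4.0]
```

Each main effect recovers twice the underlying coefficient, and the inactive factor shows an effect of zero, which is the pattern used to decide which procedure variables matter.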

  4. Container Throughput Forecasting Using Dynamic Factor Analysis and ARIMAX Model

    Directory of Open Access Journals (Sweden)

    Marko Intihar

    2017-11-01

    Full Text Available The paper examines the impact of integration of macroeconomic indicators on the accuracy of container throughput time series forecasting model. For this purpose, a dynamic factor analysis and AutoRegressive Integrated Moving-Average model with eXogenous inputs (ARIMAX are used. Both methodologies are integrated into a novel four-stage heuristic procedure. Firstly, dynamic factors are extracted from external macroeconomic indicators influencing the observed throughput. Secondly, the family of ARIMAX models of different orders is generated based on the derived factors. In the third stage, the diagnostic and goodness-of-fit testing is applied, which includes statistical criteria such as fit performance, information criteria, and parsimony. Finally, the best model is heuristically selected and tested on the real data of the Port of Koper. The results show that by applying macroeconomic indicators into the forecasting model, more accurate future throughput forecasts can be achieved. The model is also used to produce future forecasts for the next four years, indicating more oscillatory behaviour in the period 2018-2020. Hence, care must be taken concerning any larger investment decisions initiated by the management. It is believed that the proposed model might be a useful reinforcement of the existing forecasting module in the observed port.
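An ARIMAX model augments an ARIMA time-series model with exogenous regressors (here, a macroeconomic indicator). As a dependency-light sketch, the snippet below fits the simplest member of that family, an ARX(1) model y_t = c + a*y_{t-1} + b*x_t + e_t, by ordinary least squares on synthetic data; a full ARIMAX fit would normally use a statistics package rather than this hand-rolled regression.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate throughput y driven by its own past and an exogenous indicator x
n, c, a, b = 300, 1.0, 0.6, 0.8
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = c + a * y[t - 1] + b * x[t] + 0.05 * rng.normal()

# Ordinary least squares regression of y_t on [1, y_{t-1}, x_t]
X = np.column_stack([np.ones(n - 1), y[:-1], x[1:]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
c_hat, a_hat, b_hat = coef
print(np.round(coef, 2))
```

With low noise the estimates land close to the true (c, a, b), illustrating how the exogenous coefficient b quantifies the indicator's contribution to the forecast.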

  5. A finite element based substructuring procedure for design analysis of large smart structural systems

    International Nuclear Information System (INIS)

    Ashwin, U; Raja, S; Dwarakanathan, D

    2009-01-01

    A substructuring based design analysis procedure is presented for large smart structural systems using the Craig–Bampton method. The smart structural system is distinctively characterized as an active substructure, modelled as a design problem, and a passive substructure, idealized as an analysis problem. Furthermore, a novel approach has been applied by introducing the electro–elastic coupling into the reduction scheme to solve the global structural control problem in a local domain. As an illustration, a smart composite box beam with surface bonded actuators/sensors is considered, and results of the local to global control analysis are presented to show the potential use of the developed procedure. The present numerical scheme is useful for optimally designing the active substructures to study their locations and coupled structure–actuator interaction, and provides a solution to the global design of large smart structural systems.
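A minimal numerical sketch of the Craig–Bampton reduction on a toy spring-mass chain (not the smart-structure model of the paper): boundary DOFs are retained, and interior DOFs are replaced by static constraint modes plus fixed-interface normal modes. Keeping all interior modes makes the reduction an exact change of basis, which the final check exploits.

```python
import numpy as np

def gen_eigh(K, M):
    """Generalized symmetric eigenproblem K v = w M v via Cholesky of M."""
    Li = np.linalg.inv(np.linalg.cholesky(M))
    A = Li @ K @ Li.T
    w, V = np.linalg.eigh((A + A.T) / 2)
    return w, Li.T @ V

# Toy model: 6-DOF spring-mass chain, grounded at one end
n, k, m = 6, 100.0, 2.0
K = np.zeros((n, n))
for i in range(n - 1):
    K[i:i + 2, i:i + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
K[0, 0] += k                       # ground spring keeps K nonsingular
M = m * np.eye(n)

b = [0]                            # boundary (retained) DOFs
i_ = [j for j in range(n) if j not in b]
Kii, Kib = K[np.ix_(i_, i_)], K[np.ix_(i_, b)]

Psi = -np.linalg.solve(Kii, Kib)               # static constraint modes
_, Phi = gen_eigh(Kii, M[np.ix_(i_, i_)])      # fixed-interface normal modes (all kept)

T = np.zeros((n, len(b) + Phi.shape[1]))       # Craig–Bampton transformation matrix
T[np.ix_(b, range(len(b)))] = np.eye(len(b))
T[np.ix_(i_, range(len(b)))] = Psi
T[np.ix_(i_, range(len(b), T.shape[1]))] = Phi

K_red, M_red = T.T @ K @ T, T.T @ M @ T        # reduced (substructured) matrices
w_full, _ = gen_eigh(K, M)
w_red, _ = gen_eigh((K_red + K_red.T) / 2, (M_red + M_red.T) / 2)
print(np.allclose(np.sort(w_full), np.sort(w_red)))
```

In practice only a few fixed-interface modes are kept, trading exactness for a much smaller reduced model.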

  6. Orthogonal Higher Order Structure of the WISC-IV Spanish Using Hierarchical Exploratory Factor Analytic Procedures

    Science.gov (United States)

    McGill, Ryan J.; Canivez, Gary L.

    2016-01-01

    As recommended by Carroll, the present study examined the factor structure of the Wechsler Intelligence Scale for Children-Fourth Edition Spanish (WISC-IV Spanish) normative sample using higher order exploratory factor analytic techniques not included in the WISC-IV Spanish Technical Manual. Results indicated that the WISC-IV Spanish subtests were…

  7. Great Lakes water quality initiative technical support document for the procedure to determine bioaccumulation factors. Draft report

    International Nuclear Information System (INIS)

    1993-03-01

    The purpose of the document is to provide the technical information and rationale in support of the proposed procedures to determine bioaccumulation factors. Bioaccumulation factors, together with the quantity of aquatic organisms eaten, determine the extent to which people and wildlife are exposed to chemicals through the consumption of aquatic organisms. The more bioaccumulative a pollutant is, the more important the consumption of aquatic organisms becomes as a potential source of contaminants to humans and wildlife. Bioaccumulation factors are needed to determine both human health and wildlife tier I water quality criteria and tier II values. Also, they are used to define Bioaccumulative Chemicals of Concern among the Great Lakes Initiative universe of pollutants. Bioaccumulation factors range from less than one to several million.
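The role a bioaccumulation factor (BAF) plays in exposure assessment can be sketched with the standard intake relation; all numbers below are hypothetical, not Great Lakes Initiative criteria.

```python
def dietary_exposure(c_water_mg_per_l, baf_l_per_kg, intake_kg_per_day, body_weight_kg):
    """Daily dose (mg/kg-day) from eating aquatic organisms:
    tissue concentration = water concentration * BAF."""
    c_tissue = c_water_mg_per_l * baf_l_per_kg  # mg pollutant per kg tissue
    return c_tissue * intake_kg_per_day / body_weight_kg

# Hypothetical: 1 ng/L in water, BAF of 1e6 L/kg, 15 g fish/day, 70 kg adult
dose = dietary_exposure(1e-6, 1e6, 0.015, 70)
print(f"{dose:.6f} mg/kg-day")
```

The example shows why a highly bioaccumulative pollutant (large BAF) makes fish consumption the dominant exposure route even at trace water concentrations.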

  8. Procedure of trace element analysis in oyster tissues by using X-ray fluorescence

    International Nuclear Information System (INIS)

    Vo Thi Tuong Hanh; Dinh Thi Bich Lieu; Dinh Thien Lam and Nguyen Manh Hung

    2004-01-01

    A procedure for the analysis of trace elements such as Ca, Mn, Fe, Zn, Cu and Pb in molluscs (oyster tissues) was established using X-ray fluorescence techniques. The procedure was investigated from sample collection, drying and ashing ratio through to the analytical technique, which used a Cd-109 excitation source, a Si(Li) detector and the MCAPLUS peak processing program. The procedure is based on direct comparison with the certified concentrations of the international standard reference material SRM 1566b Oyster Tissue of the National Institute of Standards and Technology, Department of Commerce, United States of America, for Ca, Mn, Fe, Zn and Cu, and on the Standard Addition Method for Pb. The accuracy of the Standard Addition Method was estimated with CRM 281 Rye Grass of the Community Bureau of Reference (BCR), European Commission. The results for 10 samples collected from several markets in Hanoi are shown. (author)
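The Standard Addition Method mentioned above can be sketched numerically: the analyte signal is measured after spiking known amounts, a line is fitted, and the x-intercept magnitude gives the original concentration. The data here are synthetic, not the paper's Pb measurements.

```python
import numpy as np

def standard_addition(added, signal):
    """Estimate the original analyte concentration from a standard-addition
    series: fit signal = k*(c0 + added) and return c0 = intercept/slope."""
    slope, intercept = np.polyfit(added, signal, 1)
    return intercept / slope

added = np.array([0.0, 1.0, 2.0, 3.0])    # spiked concentration (arbitrary units)
signal = np.array([3.0, 5.0, 7.0, 9.0])   # detector response (synthetic, k = 2)
print(standard_addition(added, signal))   # → 1.5
```

This calibration-in-matrix approach is what lets the method correct for the matrix effects that make a direct comparison against an external standard unreliable for Pb.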

  9. Policy analysis of authorisation procedures for wind energy deployment in Spain

    International Nuclear Information System (INIS)

    Iglesias, Guillermo; Rio, Pablo del; Dopico, Jesus Angel

    2011-01-01

    The aim of this paper is to analyse the administrative procedures for the granting of authorisations for the siting of wind farms in Spain, currently the competency of regional authorities. The analysis reveals some commonalities and differences between the procedures across regions. Furthermore, some aspects regarding these procedures have raised the concern of different stakeholders, including the central government and wind energy investors. A conflict between the interests of the central and regional governments can be observed. Lack of coordination between the different administrative levels and the 'more is better' mentality of regional authorities have led to a significant growth of wind energy requests for the (national) feed-in tariff. In turn, investors have complained about the discretionary and non-transparent nature of those procedures and the lack of homogeneity across regions. This is likely to result in delays, uncertainty for investors and higher transaction costs. Although there has been a trend towards a model which involves the use of multicriteria bidding procedures with more explicit, objective and precise criteria regarding project selection, the aforementioned problems suggest the need to improve coordination between the different administrative levels. - Highlights: → A conflict between the interests of the central and regional governments in the granting of administrative procedures can be observed. → Lack of coordination between different administrative levels has led to a significant growth of wind energy requests for the (national) feed-in tariff. → The resulting increase in the total costs of wind energy promotion has been a major concern for national policy-makers. → In turn, investors have complained about the discretionary and non-transparent nature of those procedures and the lack of homogeneity across regions. → Those problems suggest the need to improve coordination between the different administrative levels.

  10. Compliance & dexterity, factors to consider in home care and maintenance procedures Adherencia e destreza: factores a considerar en programas preventivos

    Directory of Open Access Journals (Sweden)

    Victoria Criado

    2007-01-01

    Full Text Available Mechanical plaque control appears to be the primary means of controlling supragingival dental plaque build-up. Although daily oral hygiene practices and periodic professional care are considered the basis for any program aimed at the prevention and treatment of oral diseases, these procedures are technically demanding, time consuming and can be affected by the compliance and manual dexterity of the patient. Individual skills and acquired behavior patterns determine effectiveness of a preventive program and oral hygiene practice. Successful preventive programs and home care procedures clearly depend on the interaction and commitment between the dental professional and the patient. Identifying the capacity of the individual to comply with the professional recommendations and evaluating the dexterity of the patient to remove supragingival dental plaque will permit the implementation of an adequate preventive program and can help on the selection of adjunctive antimicrobial agents and devices needed to reach an effective oral care routine.

  11. Estimating the Cost of Neurosurgical Procedures in a Low-Income Setting: An Observational Economic Analysis.

    Science.gov (United States)

    Abdelgadir, Jihad; Tran, Tu; Muhindo, Alex; Obiga, Doomwin; Mukasa, John; Ssenyonjo, Hussein; Muhumza, Michael; Kiryabwire, Joel; Haglund, Michael M; Sloan, Frank A

    2017-05-01

    There are no data on the cost of neurosurgery in low-income and middle-income countries. The objective of this study was to estimate the cost of neurosurgical procedures in a low-resource setting to better inform resource allocation and health sector planning. In this observational economic analysis, microcosting was used to estimate the direct and indirect costs of neurosurgical procedures at Mulago National Referral Hospital (Kampala, Uganda). During the study period, October 2014 to September 2015, 1440 charts were reviewed. Of these patients, 434 had surgery, whereas the other 1006 were treated nonsurgically. Thirteen types of procedures were performed at the hospital. The estimated mean cost of a neurosurgical procedure was $542.14 (standard deviation [SD], $253.62). The mean cost of different procedures ranged from $291 (SD, $101) for burr hole evacuations to $1,221 (SD, $473) for excision of brain tumors. For most surgeries, overhead costs represented the largest proportion of the total cost (29%-41%). This is the first study using primary data to determine the cost of neurosurgery in a low-resource setting. Operating theater capacity is likely the binding constraint on operative volume, and thus, investing in operating theaters should achieve a higher level of efficiency. Findings from this study could be used by stakeholders and policy makers for resource allocation and to perform economic analyses to establish the value of neurosurgery in achieving global health goals. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  13. An improved and explicit surrogate variable analysis procedure by coefficient adjustment.

    Science.gov (United States)

    Lee, Seunggeun; Sun, Wei; Wright, Fred A; Zou, Fei

    2017-06-01

    Unobserved environmental, demographic, and technical factors can negatively affect the estimation and testing of the effects of primary variables. Surrogate variable analysis, proposed to tackle this problem, has been widely used in genomic studies. To estimate hidden factors that are correlated with the primary variables, surrogate variable analysis performs principal component analysis either on a subset of features or on all features, but weighting each differently. However, existing approaches may fail to identify hidden factors that are strongly correlated with the primary variables, and the extra step of feature selection and weight calculation makes the theoretical investigation of surrogate variable analysis challenging. In this paper, we propose an improved surrogate variable analysis using all measured features that has a natural connection with restricted least squares, which allows us to study its theoretical properties. Simulation studies and real data analysis show that the method is competitive to state-of-the-art methods.

  14. A novel procedure on next generation sequencing data analysis using text mining algorithm.

    Science.gov (United States)

    Zhao, Weizhong; Chen, James J; Perkins, Roger; Wang, Yuping; Liu, Zhichao; Hong, Huixiao; Tong, Weida; Zou, Wen

    2016-05-13

    Next-generation sequencing (NGS) technologies have provided researchers with vast possibilities in various biological and biomedical research areas. Efficient data mining strategies are in high demand for large scale comparative and evolutional studies to be performed on the large amounts of data derived from NGS projects. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. We report a novel procedure to analyse NGS data using topic modeling. It consists of four major procedures: NGS data retrieval, preprocessing, topic modeling, and data mining using Latent Dirichlet Allocation (LDA) topic outputs. The NGS data set of the Salmonella enterica strains was used as a case study to show the workflow of this procedure. The perplexity measurement of the topic numbers and the convergence efficiencies of Gibbs sampling were calculated and discussed for achieving the best result from the proposed procedure. The output topics by LDA algorithms could be treated as features of Salmonella strains to accurately describe the genetic diversity of the fliC gene in various serotypes. The results of a two-way hierarchical clustering and data matrix analysis on LDA-derived matrices successfully classified Salmonella serotypes based on the NGS data. The implementation of topic modeling in the NGS data analysis procedure provides a new way to elucidate genetic information from NGS data and identify gene-phenotype relationships and biomarkers, especially in the era of biological and medical big data.
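LDA is usually run through a library; as a self-contained illustration of the Gibbs sampling the abstract refers to, here is a minimal collapsed Gibbs sampler over a toy corpus, with integer word IDs standing in for sequence-derived tokens. All parameters and data are illustrative.

```python
import numpy as np

def lda_gibbs(docs, n_topics, vocab_size, iters=100, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for Latent Dirichlet Allocation.
    docs: list of lists of integer word IDs. Returns (theta, phi):
    document-topic and topic-word probability matrices."""
    rng = np.random.default_rng(seed)
    z = [rng.integers(n_topics, size=len(d)) for d in docs]   # topic assignments
    ndk = np.zeros((len(docs), n_topics))                     # doc-topic counts
    nkw = np.zeros((n_topics, vocab_size))                    # topic-word counts
    nk = np.zeros(n_topics)                                   # topic totals
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                                   # remove current token
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + vocab_size * beta)
                k = rng.choice(n_topics, p=p / p.sum())       # resample its topic
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    theta = (ndk + alpha) / (ndk + alpha).sum(axis=1, keepdims=True)
    phi = (nkw + beta) / (nkw + beta).sum(axis=1, keepdims=True)
    return theta, phi

# Toy corpus: three documents over words 0-4, three over words 5-9
docs = [[0, 1, 2, 3, 4] * 4] * 3 + [[5, 6, 7, 8, 9] * 4] * 3
theta, phi = lda_gibbs(docs, n_topics=2, vocab_size=10)
print(theta.shape, phi.shape)
```

In the paper's setting, the rows of theta serve as strain-level features that downstream clustering can act on.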

  15. Forensic analysis of Salvia divinorum using multivariate statistical procedures. Part I: discrimination from related Salvia species.

    Science.gov (United States)

    Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell

    2012-01-01

    Salvia divinorum is a hallucinogenic herb that is internationally regulated. In this study, salvinorin A, the active compound in S. divinorum, was extracted from S. divinorum plant leaves using a 5-min extraction with dichloromethane. Four additional Salvia species (Salvia officinalis, Salvia guaranitica, Salvia splendens, and Salvia nemorosa) were extracted using this procedure, and all extracts were analyzed by gas chromatography-mass spectrometry. Differentiation of S. divinorum from other Salvia species was successful based on visual assessment of the resulting chromatograms. To provide a more objective comparison, the total ion chromatograms (TICs) were subjected to principal components analysis (PCA). Prior to PCA, the TICs were subjected to a series of data pretreatment procedures to minimize non-chemical sources of variance in the data set. Successful discrimination of S. divinorum from the other four Salvia species was possible based on visual assessment of the PCA scores plot. To provide a numerical assessment of the discrimination, a series of statistical procedures such as Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores. The statistical procedures were then compared to determine the advantages and disadvantages for forensic applications.
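The pretreatment-then-PCA workflow can be sketched on synthetic chromatograms: normalize each total ion chromatogram to unit area, mean-center, take principal component scores from an SVD, and compare within-species versus between-species score distances. The data are simulated Gaussian peaks, not Salvia extracts.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """PCA scores of the rows of X (samples) after mean-centering, via SVD."""
    Xc = X - X.mean(axis=0)
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :n_components] * s[:n_components]

t = np.arange(60.0)
def peak(mu):
    return np.exp(-0.5 * ((t - mu) / 2.0) ** 2)

# Three replicate "TICs" per species, with small retention-time shifts
species_a = np.array([peak(15 + 0.3 * j) for j in range(3)])
species_b = np.array([peak(40 + 0.3 * j) for j in range(3)])
X = np.vstack([species_a, species_b])
X = X / X.sum(axis=1, keepdims=True)       # total-area normalization (pretreatment)

S = pca_scores(X)
within = np.linalg.norm(S[0] - S[1])                      # same-species replicates
between = np.linalg.norm(S[:3].mean(0) - S[3:].mean(0))   # species centroids
print(within < between)  # → True
```

A small within-class distance relative to the between-class distance in the scores plot is what makes the visual, and subsequently statistical, discrimination possible.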

  16. Development of a 3-dimensional flow analysis procedure for axial pump impellers

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Kim, Jong In; Park, Jin Seok; Huh, Houng Huh; Chang, Moon Hee

    1999-06-01

    A fluid dynamic analysis procedure was developed using the three-dimensional solid model of an axial pump impeller which was theoretically designed using I-DEAS CAD/CAM/CAE software. The CFD software FLUENT was used in the flow field analysis. The steady-state flow regime in the MCP impeller and diffuser was simulated using the developed procedure. The calculation results were analyzed to confirm whether the design requirements were properly implemented in the impeller model. The validity of the developed procedure was demonstrated by comparing the calculation results with the experimental data available. The pump performance at the design point could be effectively predicted using the developed procedure. The computed velocity distributions showed good agreement with the experimental data except for the regions near the wall. The computed head, however, was over-predicted compared with the experiment. The design period and cost required for the development of an axial pump impeller can be significantly reduced by applying the proposed methodology. (author). 7 refs., 2 tabs

  17. What are the factors predictive of hysterosalpingogram compliance after female sterilization by the Essure procedure in a publicly insured population?

    Science.gov (United States)

    Howard, David L; Wall, Jeffrey; Strickland, Julie L

    2013-12-01

    To determine what factors are predictive of post-Essure hysterosalpingogram (HSG) compliance. We conducted a retrospective chart review of all patients who underwent the Essure procedure at the two campuses of the Truman Medical Center, Kansas City, Missouri, from January 1, 2005 through December 31, 2010. Our study population consisted primarily of women who were publicly insured (89.0 %) and unmarried (76.7 %). Of 132 patients referred for HSG, 70 (53.0 %) complied. In adjusted analyses women 35 years and older had an almost fourfold higher odds of HSG compliance (OR = 3.72, 95 % CI 1.35-10.23) and women with 3 or more living children had a 64 % lower odds of HSG compliance (OR = 0.36, 95 % CI 0.16-0.82). Women younger than 35 who had 3 or more children had the lowest compliance rate (36.4 %) suggesting an interaction between age and parity. Women undergoing the Essure procedure at the campus with a dedicated protocol to ensure compliance had an almost fourfold higher odds of HSG compliance (OR = 3.67, 95 % CI 1.01-13.40). In a population consisting largely of publicly insured, unmarried women, several factors are predictive of post-Essure HSG compliance. These include age, parity and the presence or absence of an institutional protocol to keep track of patients after their Essure procedure.

  18. A qualitative analysis of the determinants in the choice of a French journal reviewing procedures

    Science.gov (United States)

    Morge, Ludovic

    2015-12-01

    Between 1993 and 2010, two French journals (Aster and Didaskalia), coming from different backgrounds but belonging to the same institution, used to publish papers on research in science and technology education. The merging of these journals made it necessary to compare the different reviewing procedures used by each. This merging occurred at a time when research is becoming increasingly international, which partly determines some of the reviewing procedure choices. In order for a francophone international journal to survive, it needs to take this internationalization into account in a reasoned manner. The author of this article, as chief editor of RDST (Recherches en Didactique des Sciences et des Technologies), the journal resulting from the merger, analyses the social, cultural and pragmatic determinants which impacted the choices made in reviewing procedures. This paper describes how this diversity of factors led to dropping the idea of a standard reviewing procedure that would be valid for all journals.

  19. Theoretical analysis about early detection of hepatocellular carcinoma by medical imaging procedure

    Energy Technology Data Exchange (ETDEWEB)

    Odano, Ikuo; Hinata, Hiroshi; Hara, Keiji; Sakai, Kunio [Niigata Univ. (Japan). School of Medicine

    1983-04-01

    It is well known that patients with chronic hepatitis and liver cirrhosis frequently develop hepatocellular carcinoma (hepatoma); they are therefore called a high risk group for hepatoma. In order to detect a small hepatoma, it is reasonable to perform screening examinations on these high risk group patients. The optimal screening interval, however, has not been established. In this report, a theoretical analysis was made to estimate the optimal screening interval for imaging procedures such as ultrasonography, x-ray computed tomography and scintigraphy. From the analysis of eight cases, the mean doubling time of hepatoma was estimated at about four months (73 - 143 days). If we want to detect a hepatoma not greater than 3.0 cm in diameter, a medical screening procedure combining ultrasonography and scintigraphy should be performed about once every nine months.
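The relation between doubling time and screening interval can be made explicit: under exponential volume growth, the diameter grows by a factor 2^(1/3) per volume doubling time DT, so the time to grow from a detectable diameter d1 to a target ceiling d2 is t = 3·DT·log2(d2/d1). The detection threshold and doubling time used below are assumptions for illustration, not values derived from the paper's eight cases.

```python
from math import log2

def growth_time_days(doubling_time_days, d_start_cm, d_end_cm):
    """Days for a tumor to grow from d_start to d_end in diameter,
    assuming exponential volume growth (volume proportional to diameter cubed)."""
    return 3 * doubling_time_days * log2(d_end_cm / d_start_cm)

# Assumed detection threshold of 1.5 cm and volume doubling time of ~90 days
print(growth_time_days(90, 1.5, 3.0))  # days available before reaching 3.0 cm → 270.0
```

The screening interval then simply has to be shorter than this growth window.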

  20. Growth inhibitory factors in bovine faeces impairs detection of Salmonella Dublin by conventional culture procedure

    DEFF Research Database (Denmark)

    Baggesen, Dorte Lau; Nielsen, L.R.; Sørensen, Gitte

    2007-01-01

    Aims: To analyse the relative importance of different biological and technical factors on the analytical sensitivity of conventional culture methods for detection of Salmonella Dublin in cattle faeces. Methods and Results: Faeces samples collected from six adult bovines from different salmonella...... novobiocin, followed by combinations of culture media (three types) and selective media (two types). The sensitivity of each combination and sources of variation in detection were determined by a generalized linear mixed model using a split-plot design. Conclusions: Biological factors, such as faecal origin...... and S. Dublin strain influenced the sensitivity more than technical factors. Overall, the modified semisolid Rappaport Vassiliadis (MSRV)-culture medium had the most reliable detection capability, whereas detection with selenite cystine broth and Mueller Kauffman tetrathionate broth combinations varied...

  1. An Information Processing Analysis of the Function of Conceptual Understanding in the Learning of Arithmetic Procedures

    Science.gov (United States)

    1988-08-01

    by Gelman and co-workers with respect to counting (Gelman & Gallistel, 1978; Gelman & Meck, 1983, 1987; Gelman, Meck, & Merkin, 1986; Greeno, Riley...Gelman, 1984). Gelman and Gallistel (1978) formulated a set of principles that determine the correct procedure for counting. The three most important...Gelman & Gallistel, 1978). Greeno, Riley, and Gelman (1984) and Smith, Greeno, and Vitolo (in press) have proposed a theoretical analysis that shows how

  2. Analysis of the acceptance procedure and quality control a virtual simulation system

    International Nuclear Information System (INIS)

    Gonzalez Ruiz, C.; Pedrero de Aristizabal, D.; Jimenez Rojas, R.; Garcia Hernandez, M. J.; Ruiz Galan, G.; Ayala Lazaro, R.; Garcia Marcos, R.

    2011-01-01

    Acceptance testing, determination of the reference state, commissioning, and implementation of a quality control protocol have been carried out for a virtual simulation system consisting of a computerized tomography (CT) image acquisition unit, an independent external laser locator, and a simulation module associated with the existing planner for clinical dosimetry in radiotherapy. This paper summarizes the path followed in this process, together with the established procedure for periodic monitoring, and analyzes the results obtained in the two years of clinical use and quality control.

  3. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Bork, Lasse; Møller, Stig Vinther

    of the model stays high at longer horizons. The estimated factors are strongly statistically significant according to a bootstrap resampling method which takes into account that the factors are estimated regressors. The simple three-factor model also contains substantial out-of-sample predictive power...

  4. Erikson Psychosocial Stage Inventory: A Factor Analysis

    Science.gov (United States)

    Gray, Mary McPhail; And Others

    1986-01-01

    The 72-item Erikson Psychosocial Stage Inventory (EPSI) was factor analyzed for a group of 534 university freshmen and sophomore students. Seven factors emerged, which were labeled Initiative, Industry, Identity, Friendship, Dating, Goal Clarity, and Self-Confidence. Items representing Erikson's factors, Trust and Autonomy, were dispersed across…

  5. A highly rationalized procedure of neutron activation analysis for routine applications in dairy science

    International Nuclear Information System (INIS)

    Heine, K.; Wiechen, A.

    1976-01-01

    A rational procedure was developed for multi-element neutron activation analysis for applications in dairy science. Sample preparation prior to irradiation consists of drying, weighing, and sealing in quartz ampoules. The neutron flux to which samples are exposed during reactor irradiation is determined by the mono-comparator technique, for which the Co content of a commercial aluminium foil was chosen as the flux monitor. The constancy of the Co content and the uncomplicated handling of the foil essentially simplify the flux determination. The samples are irradiated for 72 h, dissolved in HNO3/H2SO4, and measured in the liquid state, after waiting times of 1-2, 4 and 8 weeks, with a Ge(Li) detector and a 4,096-channel spectrometer. The procedure was confirmed by investigations of the biological KALE standard and by participation in intercomparisons of biological substances of the Analytical Quality Control Service of the IAEA for the analysis of the elements Na, Ca, Cr, Fe, Co, Zn, Se, Rb, and Cs. Thus a procedure suitable for routine multi-element analysis of biological samples was developed by optimizing and rationalizing all analytical steps. (orig./MG) [de

  6. Safety of Running Two Rooms: A Systematic Review and Meta-Analysis of Overlapping Neurosurgical Procedures.

    Science.gov (United States)

    Self, D Mitchell; Ilyas, Adeel; Stetler, William R

    2018-04-27

    Overlapping surgery, a long-standing practice within academic neurosurgery centers nationwide, has recently come under scrutiny from the government and media as potentially harmful to patients. Therefore, the objective of this systematic review and meta-analysis is to determine the safety of overlapping neurosurgical procedures. The authors performed a systematic review and meta-analysis in accordance with PRISMA guidelines. A review of PubMed and Medline databases was undertaken with the search phrase "overlapping surgery AND neurosurgery AND outcomes." Data regarding patient demographics, type of neurosurgical procedure, and outcomes and complications were extracted from each study. The principal summary measure was the odds ratio (OR) of the association of overlapping versus non-overlapping surgery with outcomes. The literature search yielded a total of 36 studies, of which 5 met inclusion criteria and were included in this study. These studies included a total of 25,764 patients undergoing neurosurgical procedures. Overlapping surgery was associated with an increased likelihood of being discharged home (OR = 1.32; 95% CI 1.20 to 1.44; P < 0.001) and a reduced 30-day unexpected return to the operating room (OR = 0.79; 95% CI 0.72 to 0.87; P < 0.001). Overlapping surgery did not significantly affect the OR of length of surgery, 30-day mortality, or 30-day readmission. Overlapping neurosurgical procedures were not associated with worse patient outcomes. Additional, prospective studies are needed to further assess the safety of overlapping procedures. Copyright © 2018. Published by Elsevier Inc.

  7. Analysis of nasopharyngeal carcinoma risk factors with Bayesian networks.

    Science.gov (United States)

    Aussem, Alex; de Morais, Sérgio Rodrigues; Corbex, Marilys

    2012-01-01

    We propose a new graphical framework for extracting the relevant dietary, social and environmental risk factors that are associated with an increased risk of nasopharyngeal carcinoma (NPC) on a case-control epidemiologic study that consists of 1289 subjects and 150 risk factors. This framework builds on the use of Bayesian networks (BNs) for representing statistical dependencies between the random variables. We discuss a novel constraint-based procedure, called Hybrid Parents and Children (HPC), that builds recursively a local graph that includes all the relevant features statistically associated to the NPC, without having to find the whole BN first. The local graph is afterwards directed by the domain expert according to his knowledge. It provides a statistical profile of the recruited population, and meanwhile helps identify the risk factors associated to NPC. Extensive experiments on synthetic data sampled from known BNs show that the HPC outperforms state-of-the-art algorithms that appeared in the recent literature. From a biological perspective, the present study confirms that chemical products, pesticides and domestic fume intake from incomplete combustion of coal and wood are significantly associated with NPC risk. These results suggest that industrial workers are often exposed to noxious chemicals and poisonous substances that are used in the course of manufacturing. This study also supports previous findings that the consumption of a number of preserved food items, like house made proteins and sheep fat, are a major risk factor for NPC. BNs are valuable data mining tools for the analysis of epidemiologic data. They can explicitly combine both expert knowledge from the field and information inferred from the data. These techniques therefore merit consideration as valuable alternatives to traditional multivariate regression techniques in epidemiologic studies. Copyright © 2011 Elsevier B.V. All rights reserved.
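
    Constraint-based procedures such as HPC are built from conditional-independence tests between variables. The sketch below is not the authors' implementation: it is a generic chi-square test of X ⟂ Y | Z for discrete data, using the common simplification of summing per-stratum chi-square statistics and degrees of freedom.

```python
import numpy as np
from scipy.stats import chi2, chi2_contingency

def ci_test(x, y, z, alpha=0.05):
    """Test X independent of Y given Z for discrete arrays by summing
    chi-square statistics over the strata of Z. Sparse strata (an empty
    row or column) are skipped; real implementations need more care."""
    stat, dof = 0.0, 0
    for level in np.unique(z):
        mask = z == level
        table = np.asarray(
            [[np.sum((x[mask] == i) & (y[mask] == j))
              for j in np.unique(y)] for i in np.unique(x)])
        if (table.sum(axis=0) == 0).any() or (table.sum(axis=1) == 0).any():
            continue
        s, _, d, _ = chi2_contingency(table, correction=False)
        stat, dof = stat + s, dof + d
    p = chi2.sf(stat, dof)
    return bool(p > alpha)  # True -> X and Y look independent given Z
```

    A structure learner calls such a test many times to decide which edges to keep around the target variable (here, NPC status).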

  8. Factors of Predicted Learning Disorders and their Interaction with Attentional and Perceptual Training Procedures.

    Science.gov (United States)

    Friar, John T.

    Two factors of predicted learning disorders were investigated: (1) inability to maintain appropriate classroom behavior (BEH), and (2) perceptual discrimination deficit (PERC). Three groups of first-graders (BEH, PERC, normal control) were administered measures of impulse control, distractibility, auditory discrimination, and visual discrimination.…

  9. 75 FR 33379 - Railroad Cost Recovery Procedures-Productivity Adjustment; Quarterly Rail Cost Adjustment Factor

    Science.gov (United States)

    2010-06-11

    ... information is contained in the Board's June 14, 2010 decision, which is available on our website at http://www.stb.dot.gov . Copies of the decision may be purchased by contacting the office of Public... Cost Adjustment Factor AGENCY: Surface Transportation Board.

  10. Extension of a GIS procedure for calculating the RUSLE equation LS factor

    NARCIS (Netherlands)

    Zhang, H.; Yang, Q.; Li, R.; Liu, Q.; Moore, D.; He, P.; Ritsema, C.J.; Geissen, V.

    2013-01-01

    The Universal Soil Loss Equation (USLE) and revised USLE (RUSLE) are often used to estimate soil erosion at regional landscape scales; however, a major limitation is the difficulty in extracting the LS factor. The geographic information system-based (GIS-based) methods which have been developed for
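
    For a single uniform slope, the point form of the LS factor can be computed from the McCool et al. relationships used in the RUSLE handbook. The GIS extension discussed in this record instead derives slope length from flow accumulation over a DEM, so the following only sketches the underlying equations:

```python
import math

def rusle_ls(slope_length_m, slope_percent):
    """Point-form RUSLE LS factor for a uniform slope, using the
    McCool et al. relationships from the RUSLE handbook:
    L = (lambda/22.13)**m with m = beta/(1+beta), and a two-branch
    S factor split at a 9% gradient."""
    theta = math.atan(slope_percent / 100.0)
    beta = (math.sin(theta) / 0.0896) / (3 * math.sin(theta) ** 0.8 + 0.56)
    m = beta / (1 + beta)
    L = (slope_length_m / 22.13) ** m
    S = (10.8 * math.sin(theta) + 0.03 if slope_percent < 9
         else 16.8 * math.sin(theta) - 0.50)
    return L * S
```

    At the reference length of 22.13 m and a 9% gradient, LS is close to 1 by construction, which is a convenient sanity check.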

  11. Incidence and Risk Factors for Major Hematomas in Aesthetic Surgery: Analysis of 129,007 Patients.

    Science.gov (United States)

    Kaoutzanis, Christodoulos; Winocour, Julian; Gupta, Varun; Ganesh Kumar, Nishant; Sarosiek, Konrad; Wormer, Blair; Tokin, Christopher; Grotting, James C; Higdon, K Kye

    2017-10-16

    Postoperative hematomas are one of the most frequent complications following aesthetic surgery. Identifying risk factors for hematoma has been limited by underpowered studies from single-institution experiences. To examine the incidence and identify independent risk factors for postoperative hematomas following cosmetic surgery utilizing a prospective, multicenter database. A prospectively enrolled cohort of patients who underwent aesthetic surgery between 2008 and 2013 was identified from the CosmetAssure database. Primary outcome was occurrence of major hematomas requiring emergency room visit, hospital admission, or reoperation within 30 days of the index operation. Univariate and multivariate analysis was used to identify potential risk factors for hematomas including age, gender, body mass index (BMI), smoking, diabetes, type of surgical facility, procedure by body region, and combined procedures. Of 129,007 patients, 1180 (0.91%) had a major hematoma. Mean age was higher in patients with hematomas (42.0 ± 13.0 years vs 40.9 ± 13.9 years). Males suffered more hematomas than females (1.4% vs 0.9%). Hematoma rates were higher in patients undergoing combined procedures compared to single procedures (1.1% vs 0.8%). Independent risk factors for hematoma included age (Relative Risk [RR] 1.01), male gender (RR 1.98), the procedure being performed in a hospital setting rather than an office-based setting (RR 1.68), combined procedures (RR 1.35), and breast procedures rather than body/extremity and face procedures (RR 1.81). Major hematoma is the most common complication following aesthetic surgery. Male patients and those undergoing breast or combined procedures have a significantly higher risk of developing hematomas. © 2017 The American Society for Aesthetic Plastic Surgery, Inc.

  12. Completely automated modal analysis procedure based on the combination of different OMA methods

    Science.gov (United States)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only modal analysis procedure is presented and its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been made more robust, returning only the true natural frequencies, damping ratios and mode shapes of the system. The effect of temperature can be taken into account as well, leading to a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building present in the laboratories of Politecnico di Milano.

  13. Multi-response permutation procedure as an alternative to the analysis of variance: an SPSS implementation.

    Science.gov (United States)

    Cai, Li

    2006-02-01

    A permutation test typically requires fewer assumptions than does a comparable parametric counterpart. The multi-response permutation procedure (MRPP) is a class of multivariate permutation tests of group difference useful for the analysis of experimental data. However, psychologists seldom make use of the MRPP in data analysis, in part because the MRPP is not implemented in popular statistical packages that psychologists use. A set of SPSS macros implementing the MRPP test is provided in this article. The use of the macros is illustrated by analyzing example data sets.
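
    Outside SPSS, the MRPP statistic and a Monte Carlo permutation p-value are straightforward to compute directly. The sketch below uses Euclidean distances, group-size weights, and random permutations (rather than the full permutation distribution); it illustrates the idea, not the macros described in the article.

```python
import numpy as np

def mrpp(data, groups, n_perm=2000, seed=0):
    """Multi-response permutation procedure. The observed statistic
    delta is the group-size-weighted mean of the average within-group
    pairwise Euclidean distances; the p-value is the fraction of
    random label permutations giving a delta at least as small."""
    data = np.asarray(data, dtype=float)
    groups = np.asarray(groups)
    diff = data[:, None, :] - data[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))       # pairwise distance matrix
    n = len(groups)

    def delta(labels):
        d = 0.0
        for g in np.unique(labels):
            idx = np.where(labels == g)[0]
            k = len(idx)
            if k < 2:
                continue
            sub = dist[np.ix_(idx, idx)]
            d += (k / n) * sub[np.triu_indices(k, 1)].mean()
        return d

    observed = delta(groups)
    rng = np.random.default_rng(seed)
    count = sum(delta(rng.permutation(groups)) <= observed
                for _ in range(n_perm))
    return observed, (count + 1) / (n_perm + 1)
```

    Small p-values indicate that the observed within-group concentration would be unlikely under random relabeling, i.e. a genuine group difference.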

  14. Sampling, storage and sample preparation procedures for X ray fluorescence analysis of environmental materials

    International Nuclear Information System (INIS)

    1997-06-01

    The X ray fluorescence (XRF) method is one of the most commonly used nuclear analytical techniques because of its multielement and non-destructive character, speed, economy and ease of operation. From the point of view of quality assurance practices, sampling and sample preparation procedures are the most crucial steps in all analytical techniques (including X ray fluorescence) applied to the analysis of heterogeneous materials. This technical document covers recent modes of the X ray fluorescence method and recent developments in sample preparation techniques for the analysis of environmental materials. Refs, figs, tabs

  15. Procedure proposed for performance of a probabilistic safety analysis for the event of ''Air plane crash''

    International Nuclear Information System (INIS)

    Hoffmann, H.H.

    1998-01-01

    A procedures guide for a probabilistic safety analysis for the external event 'Air plane crash' has been prepared. The method is based on analyses done within the framework of PSA for German NPPs as well as on international documents. Both crashes of military airplanes and crashes of commercial airplanes contribute to the plant risk. For the determination of the plant-related crash rate, the air traffic is divided into three categories: the landing and takeoff phase, airway and holding-pattern traffic, and free air traffic; the airplanes are divided into different types and weight classes. (orig./GL) [de

  16. Random analysis of bearing capacity of square footing using the LAS procedure

    Science.gov (United States)

    Kawa, Marek; Puła, Wojciech; Suska, Michał

    2016-09-01

    In the present paper, a three-dimensional problem of bearing capacity of a square footing on a random soil medium is analyzed. The random fields of strength parameters c and φ are generated using the LAS procedure (Local Average Subdivision, Fenton and Vanmarcke 1990). The procedure used is re-implemented by the authors in the Mathematica environment in order to combine it with a commercial program. Since the procedure is still being tested, the random field has been assumed to be one-dimensional: the strength properties of the soil are random in the vertical direction only. Individual realizations of the bearing capacity boundary problem, with strength parameters of the medium defined by the above procedure, are solved using FLAC3D software. The analysis is performed for two qualitatively different cases, namely for purely cohesive and cohesive-frictional soils. For the latter case the friction angle and cohesion have been assumed to be independent random variables. For these two cases the random square footing bearing capacity results have been obtained for a range of fluctuation scales from 0.5 m to 10 m. Each time 1000 Monte Carlo realizations have been performed. The obtained results allow not only the mean and variance but also the probability density function to be estimated. An example of application of this function to reliability calculation is presented in the final part of the paper.
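
    The overall workflow can be sketched compactly. The example below substitutes a Cholesky-factorized correlated field for the LAS generator and a closed-form bearing capacity factor for the FLAC3D solve, so it only illustrates the Monte Carlo structure, not the paper's procedure; all numeric values are placeholders.

```python
import numpy as np

def simulate_bearing(mean_c=100.0, cov=0.3, theta=1.0, dz=0.25,
                     depth=5.0, n_sims=1000, seed=0):
    """Monte Carlo of a purely cohesive square footing. A 1-D lognormal
    random field of undrained strength c (kPa) with an exponential
    (Markov) autocorrelation of fluctuation scale theta is generated by
    Cholesky factorisation (a stand-in for LAS). Bearing capacity is
    taken as q_u = N_c * c_avg with N_c = 6.17 for a square footing,
    in place of a numerical solve of each realization."""
    z = np.arange(0.0, depth, dz)
    rho = np.exp(-2.0 * np.abs(z[:, None] - z[None, :]) / theta)
    L = np.linalg.cholesky(rho + 1e-10 * np.eye(len(z)))
    sigma_ln = np.sqrt(np.log(1.0 + cov ** 2))
    mu_ln = np.log(mean_c) - 0.5 * sigma_ln ** 2
    rng = np.random.default_rng(seed)
    g = rng.standard_normal((n_sims, len(z))) @ L.T
    c = np.exp(mu_ln + sigma_ln * g)   # lognormal strength field
    qu = 6.17 * c.mean(axis=1)         # depth-averaged strength (simplistic)
    return qu
```

    The returned sample of q_u values is what the paper summarizes via mean, variance, and an estimated probability density function for reliability calculations.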

  17. Risk analysis procedure for post-wildfire natural hazards in British Columbia

    Science.gov (United States)

    Jordan, Peter

    2010-05-01

    Following a severe wildfire season in 2003, and several subsequent damaging debris flow and flood events, the British Columbia Forest Service developed a procedure for analysing risks to public safety and infrastructure from such events. At the same time, the Forest Service undertook a research program to determine the extent of post-wildfire hazards, and examine the hydrologic and geomorphic processes contributing to the hazards. The risk analysis procedure follows the Canadian Standards Association decision-making framework for risk management (which in turn is based on international standards). This has several steps: identification of risk, risk analysis and estimation, evaluation of risk tolerability, developing control or mitigation strategies, and acting on these strategies. The Forest Service procedure deals only with the first two steps. The results are passed on to authorities such as the Provincial Emergency Program and local government, who are responsible for evaluating risks, warning residents, and applying mitigation strategies if appropriate. The objective of the procedure is to identify and analyse risks to public safety and infrastructure. The procedure is loosely based on the BAER (burned area emergency response) program in the USA, with some important differences. Our procedure focuses on identifying risks and warning affected parties, not on mitigation activities such as broadcast erosion control measures. Partly this is due to limited staff and financial resources. Also, our procedure is not multi-agency, but is limited to wildfires on provincial forest land; in British Columbia about 95% of forest land is in the publicly-owned provincial forest. Each fire season, wildfires are screened by size and proximity to values at risk such as populated areas. For selected fires, when the fire is largely contained, the procedure begins with an aerial reconnaissance of the fire, and photography with a hand-held camera, which can be used to make a

  18. Patient Dose During Carotid Artery Stenting With Embolic-Protection Devices: Evaluation With Radiochromic Films and Related Diagnostic Reference Levels According to Factors Influencing the Procedure

    International Nuclear Information System (INIS)

    D’Ercole, Loredana; Quaretti, Pietro; Cionfoli, Nicola; Klersy, Catherine; Bocchiola, Milena; Rodolico, Giuseppe; Azzaretti, Andrea; Lisciandro, Francesco; Cascella, Tommaso; Zappoli Thyrion, Federico

    2013-01-01

    To measure the maximum entrance skin dose (MESD) in patients undergoing carotid artery stenting (CAS) using embolic-protection devices (EPDs), to analyze the dependence of dose and exposure parameters on anatomical, clinical, and technical factors affecting procedure complexity, to obtain local diagnostic reference levels (DRLs), and to evaluate whether exceeding the DRLs is related to procedure complexity. MESD was evaluated with radiochromic films in 31 patients (mean age 72 ± 7 years). Five of 33 procedures (15 %) used a proximal EPD and 28 of 33 (85 %) a distal EPD. Local DRLs were derived from the recorded exposure parameters in 93 patients (65 men and 28 women, mean age 73 ± 9 years) undergoing 96 CAS procedures with proximal (33 %) or distal (67 %) EPDs. Four bilateral lesions were included. MESD values averaged 0.96 ± 0.42 Gy. The local DRLs for kerma-area product (KAP), fluoroscopy time, and number of frames (N_FR) were 269 Gy·cm², 28 minutes, and 251, respectively. Simultaneous bilateral treatment was associated with KAP overexposure (odds ratio [OR] 10.14, 95 % CI 1–102.7) and N_FR overexposure (OR 10.8, 95 % CI 1.1–109.5, p = 0.027), and stenosis ≥ 90 % with N_FR overexposure (OR 2.8, 95 % CI 1.1–7.4, p = 0.040). At multivariable analysis, stenosis ≥ 90 % (OR 2.8, 95 % CI 1.1–7.4, p = 0.040) and bilateral treatment (OR 10.8, 95 % CI 1.1–109.5, p = 0.027) were associated with overexposure for two or more parameters. Skin doses are not problematic in CAS with EPDs because these procedures rarely lead to doses > 2 Gy.

  19. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    Wasiolek, M.

    2000-01-01

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Consideration of radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain
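
    The probabilistic-realization approach can be illustrated generically: uncertain pathway parameters are sampled, each realization yields one BDCF value, and a dose follows by multiplying by a groundwater concentration. The pathway structure, distributions, and numbers below are invented placeholders for illustration, not values from the Yucca Mountain analysis or the GENII-S code.

```python
import numpy as np

def sample_bdcf(n=10_000, seed=42):
    """Toy Monte Carlo propagation of pathway-parameter uncertainty
    into a dose conversion factor (Sv/yr per Bq/L). All distributions
    and values are assumed placeholders, not from the report."""
    rng = np.random.default_rng(seed)
    water_ing = rng.lognormal(np.log(2.0), 0.3, n)   # L/day drunk (assumed)
    crop_tf = rng.lognormal(np.log(0.05), 0.5, n)    # Bq/kg per Bq/L (assumed)
    crop_ing = rng.lognormal(np.log(0.4), 0.3, n)    # kg/day eaten (assumed)
    dose_coeff = 2.8e-7                              # Sv/Bq, fixed here
    intake_per_bq_l = 365 * (water_ing + crop_tf * crop_ing)
    return intake_per_bq_l * dose_coeff

bdcf = sample_bdcf()
dose = 1.0 * bdcf  # annual dose distribution for 1 Bq/L in groundwater
```

    The spread of the resulting sample is what carries the input-parameter uncertainty into the TSPA dose calculation.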

  20. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report and so is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  1. Procedural Factors That Affect Psychophysical Measures of Spatial Selectivity in Cochlear Implant Users

    Directory of Open Access Journals (Sweden)

    Stefano Cosentino

    2015-09-01

    Full Text Available Behavioral measures of spatial selectivity in cochlear implants are important both for guiding the programming of individual users’ implants and for the evaluation of different stimulation methods. However, the methods used are subject to a number of confounding factors that can contaminate estimates of spatial selectivity. These factors include off-site listening, charge interactions between masker and probe pulses in interleaved masking paradigms, and confusion effects in forward masking. We review the effects of these confounds and discuss methods for minimizing them. We describe one such method in which the level of a 125-pps masker is adjusted so as to mask a 125-pps probe, and where the masker and probe pulses are temporally interleaved. Five experiments describe the method and evaluate the roles of the different potential confounding factors. No evidence was obtained for off-site listening of the type observed in acoustic hearing. The choice of the masking paradigm was shown to alter the measured spatial selectivity. For short gaps between masker and probe pulses, both facilitation and refractory mechanisms had an effect on masking; this finding should inform the choice of stimulation rate in interleaved masking experiments. No evidence for confusion effects in forward masking was revealed. It is concluded that the proposed method avoids many potential confounds but that the choice of method should depend on the research question under investigation.

  2. Cath lab costs in patients undergoing percutaneous coronary angioplasty - detailed analysis of consecutive procedures.

    Science.gov (United States)

    Dziki, Beata; Miechowicz, Izabela; Iwachów, Piotr; Kuzemczak, Michał; Kałmucki, Piotr; Szyszka, Andrzej; Baszko, Artur; Siminiak, Tomasz

    2017-01-01

    Costs of percutaneous coronary interventions (PCI) have an important impact on health care expenditures. Despite the present stress upon cost-effectiveness issues in medicine, few comprehensive data exist on costs and resource use in different clinical settings. To assess catheterisation laboratory costs related to use of drugs and single-use devices in patients undergoing PCI due to coronary artery disease. Retrospective analysis of 1500 consecutive PCIs (radial approach, n = 1103; femoral approach, n = 397) performed due to ST-segment elevation myocardial infarction (STEMI; n = 345) and non-ST-segment elevation myocardial infarction (NSTEMI; n = 426) as well as unstable angina (UA; n = 489) and stable angina (SA; n = 241) was undertaken. Comparative cost analysis was performed and shown in local currency units (PLN). The cath lab costs were higher in STEMI (4295.01 ± 2384.54 PLN) than in the other groups, and costs were positively correlated with X-ray dose, fluoroscopy, and total procedure times. Patients' age negatively correlated with cath lab costs in STEMI/NSTEMI patients. Cath lab costs were higher in STEMI patients compared to other groups. In STEMI/NSTEMI they were lower in older patients. In all analysed groups costs were related to the level of procedural difficulty. In female patients, the costs of PCI performed via radial approach were higher compared to femoral approach. Despite younger age, male patients underwent more expensive procedures.

  3. Effect of music in endoscopy procedures: systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Wang, Man Cai; Zhang, Ling Yi; Zhang, Yu Long; Zhang, Ya Wu; Xu, Xiao Dong; Zhang, You Cheng

    2014-10-01

    Endoscopies are common clinical examinations that are somewhat painful and can even cause fear and anxiety for patients. We performed this systematic review and meta-analysis of randomized controlled trials to determine the effect of music on patients undergoing various endoscopic procedures. We searched the Cochrane Library, Issue 6, 2013, PubMed, and EMBASE databases up to July 2013. Randomized controlled trials comparing endoscopies, with and without the use of music, were included. Two authors independently abstracted data and assessed risk of bias. Subgroup analyses were performed to examine the impact of music on different types of endoscopic procedures. Twenty-one randomized controlled trials involving 2,134 patients were included. Overall, across a variety of endoscopic procedures, music significantly improved pain score (weighted mean difference [WMD] = -1.53, 95% confidence interval [CI] [-2.53, -0.53]), anxiety (WMD = -6.04, 95% CI [-9.61, -2.48]), heart rate (P = 0.01), and arterial pressure in the music group compared with the control group. Furthermore, music had little effect for patients undergoing colposcopy and bronchoscopy in the subanalysis. Our meta-analysis suggested that music may offer benefits for patients undergoing endoscopy, except in colposcopy and bronchoscopy. Wiley Periodicals, Inc.
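
    Pooled WMD figures like those quoted above come from inverse-variance weighting of study-level mean differences. A minimal fixed-effect sketch (the review itself may have used a random-effects model; study inputs here are made up for illustration):

```python
import math

def pool_wmd(studies):
    """Fixed-effect inverse-variance pooling of mean differences.
    Each study is a (mean_difference, standard_error) pair. Returns
    the pooled WMD and its 95% confidence interval."""
    weights = [1.0 / se ** 2 for _, se in studies]
    wmd = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return wmd, (wmd - 1.96 * se, wmd + 1.96 * se)
```

    Precise studies (small standard errors) dominate the pooled estimate, which is the core design choice of inverse-variance weighting.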

  4. Comparative rate and risk factors of recurrent urethral stricture during different surgical procedures

    Directory of Open Access Journals (Sweden)

    D. Yu. Pushkar

    2014-11-01

    Full Text Available Objective – to identify the major risk factors leading to worse results of surgical treatment in patients with urethral stricture. Subjects and methods. Two hundred and forty-eight patients with urethral stricture underwent different surgical interventions: internal optical urethrotomy (IOU) for strictures of different portions of the urethra in 157 patients (the operation was performed once in 121 patients, twice in 24 patients, and three or more times in 12); replacement urethroplasty using a buccal mucosa graft for strictures of the anterior urethra in 46 patients; and Turner-Warwick’s anastomotic urethroplasty modified by Webster for strictures (distraction defects) of the posterior urethra in 45 patients. The results of surgical treatment were studied using urethrography, uroflowmetry, urethrocystoscopy, the international prostate symptom score, the quality of life (QoL) questionnaire, and the international index of erectile function (IIEF) questionnaire. The role of risk factors for postoperative recurrent urethral stricture was assessed by univariate and multivariate analyses. Results. The rate of recurrent urethral stricture after IOU was 66.9 % (59.5, 87.5, and 100 % after the first, second, and third or more operations, respectively); 12.1 % after all types of urethroplasty; 15.2 % after augmentation urethroplasty; and 8.9 % after anastomotic urethroplasty. The major risk factors of recurrent urethral stricture after IOU were recognized to be the location of the stricture in the penile or bulbomembranous portions, a urethral stricture length of > 1 cm, severe urethral lumen narrowing, and performance of 2 or more operations; those after augmentation urethroplasty were previous ineffective treatment, a stricture length of > 4 cm, lichen sclerosus, and smoking; those after anastomotic urethroplasty were previous ineffective treatment, smoking, and a stricture length of > 4 cm. Conclusion. The results of the investigation have shown that only

  6. Impacts of biological and procedural factors on semiquantification uptake value of liver in fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography imaging.

    Science.gov (United States)

    Mahmud, Mohd Hafizi; Nordin, Abdul Jalil; Ahmad Saad, Fathinul Fikri; Azman, Ahmad Zaid Fattah

    2015-10-01

    Increased metabolic activity of fluorodeoxyglucose (FDG) in tissue does not result from pathological uptake alone, but from physiological uptake as well. This study aimed to determine the impacts of biological and procedural factors on FDG uptake of the liver in whole body positron emission tomography/computed tomography (PET/CT) imaging. Whole body fluorine-18 ((18)F) FDG PET/CT scans of 51 oncology patients were reviewed. The maximum standardized uptake value (SUVmax) of lesion-free liver was quantified in each patient. Pearson correlation was performed to determine the association between liver SUVmax and the factors of age, body mass index (BMI), blood glucose level, FDG dose and incubation period. Multivariate regression analysis was established to determine the significant factors that best predicted the liver SUVmax. The subjects were then stratified into four BMI groups, and analysis of variance (ANOVA) was performed on the mean difference in liver SUVmax between those BMI groups. BMI and incubation period were significantly associated with liver SUVmax; these factors accounted for 29.6% of the liver SUVmax variance. Statistically significant differences were observed in the mean SUVmax of liver among the BMI groups (P < 0.05). It is recommended to establish a reference value for physiological liver SUVmax as a reference standard for different BMIs of patients in PET/CT interpretation, and to use a standard protocol for the patient incubation period to reduce variation in physiological FDG uptake of the liver in PET/CT studies.
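The statistical workflow described in the abstract above (univariate correlation screening, a multivariate regression, then ANOVA across BMI strata) can be sketched as follows. The data below are simulated and the BMI cut points are invented for illustration; only the sample size matches the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 51  # same sample size as the study; the data themselves are synthetic
bmi = rng.uniform(18, 35, n)
incubation = rng.uniform(45, 90, n)  # minutes
suv_liver = 1.5 + 0.03 * bmi + 0.005 * incubation + rng.normal(0, 0.2, n)

# Univariate screening: Pearson correlation of each factor with liver SUVmax
r_bmi, p_bmi = stats.pearsonr(bmi, suv_liver)
r_inc, p_inc = stats.pearsonr(incubation, suv_liver)

# Multivariate model: ordinary least squares on both predictors
X = np.column_stack([np.ones(n), bmi, incubation])
coef, *_ = np.linalg.lstsq(X, suv_liver, rcond=None)
resid = suv_liver - X @ coef
r2 = 1 - np.sum(resid**2) / np.sum((suv_liver - suv_liver.mean())**2)

# One-way ANOVA across four BMI strata (cut points are illustrative)
groups = np.digitize(bmi, [22, 26, 30])  # 0..3 -> four groups
f_stat, p_anova = stats.f_oneway(*[suv_liver[groups == g] for g in range(4)])
print(r_bmi, r2, p_anova)
```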

  7. A Bayesian Nonparametric Approach to Factor Analysis

    DEFF Research Database (Denmark)

    Piatek, Rémi; Papaspiliopoulos, Omiros

    2018-01-01

    This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does no...

  8. Classification analysis of organization factors related to system safety

    International Nuclear Information System (INIS)

    Liu Huizhen; Zhang Li; Zhang Yuling; Guan Shihua

    2009-01-01

    This paper analyzes the different types of organization factors which influence system safety. Organization factors can be divided into interior and exterior factors. The latter include political, economic, technical, legal, socio-cultural and geographical factors, as well as the relationships among different interest groups. The former include organization culture, communication, decision-making, training, process, supervision and management, and organization structure. This paper focuses on the description of the organization factors; their classification analysis is the preliminary work for quantitative analysis. (authors)

  9. Secondary Analysis of Audio Data. Technical Procedures for Virtual Anonymization and Pseudonymization

    Directory of Open Access Journals (Sweden)

    Henning Pätzold

    2005-01-01

    Full Text Available Qualitative material presented as audio data requires a greater degree of anonymity protection than, for example, textual data. Apart from the verbal content, it carries paraverbal aspects, including voice characteristics, making it easier to identify the speaker. This complicates secondary analysis or reanalysis conducted by researchers who were not involved in the data collection. Difficulties increase if the chances are high that the researcher and the interviewee will come into contact, for example through a meeting. This paper describes the technical procedures used to modify the sound of the audio source so as to reduce the possibility of recognition (i.e., to a level similar to that of a carefully written transcript). A discussion of the technical possibilities of this procedure along with an exploration of the boundaries of anonymization is presented. URN: urn:nbn:de:0114-fqs0501249

  10. Cost analysis of procedures related to the management of renal artery stenosis from various perspectives

    International Nuclear Information System (INIS)

    Helvoort-Postulart, Debby van; Dirksen, Carmen D.; Kessels, Alfons G.H.; Kroon, Abraham A.; Leeuw, Peter W. de; Nelemans, Patricia J.; Engelshoven, Jos M.A. van; Myriam Hunink, M.G.

    2006-01-01

    To determine the costs associated with the diagnostic work-up and percutaneous revascularization of renal artery stenosis from various perspectives. A prospective multicenter comparative study was conducted between 1998 and 2001. A total of 402 hypertensive patients with suspected renal artery stenosis were included. The costs of computed tomography angiography (CTA), magnetic resonance angiography (MRA), digital subtraction angiography (DSA), and percutaneous revascularization were assessed. From the societal perspective, DSA was the most costly (EUR 1,721) and CTA the least costly diagnostic technique (EUR 424). CTA was the least costly imaging procedure irrespective of the perspective used. The societal costs associated with percutaneous renal artery revascularization ranged from EUR 2,680 to EUR 6,172. Overall, the radiology department incurred the largest proportion of the total societal costs. For the management of renal artery stenosis, performing the analysis from different perspectives leads to the same conclusion concerning the least costly diagnostic imaging and revascularization procedure. (orig.)

  11. Simplified Procedure For The Free Vibration Analysis Of Rectangular Plate Structures With Holes And Stiffeners

    Directory of Open Access Journals (Sweden)

    Cho Dae Seung

    2015-04-01

    Full Text Available Thin and thick plates, plates with holes, stiffened panels and stiffened panels with holes are primary structural members in almost all fields of engineering: civil, mechanical, aerospace, naval, ocean etc. In this paper, a simple and efficient procedure for the free vibration analysis of such elements is presented. It is based on the assumed mode method and can handle different plate thickness, various shapes and sizes of holes, different framing sizes and types as well as different combinations of boundary conditions. Natural frequencies and modes are determined by solving an eigenvalue problem of a multi-degree-of-freedom system matrix equation derived by using Lagrange’s equations. Mindlin theory is applied for a plate and Timoshenko beam theory for stiffeners. The applicability of the method in the design procedure is illustrated with several numerical examples obtained by the in-house developed code VAPS. Very good agreement with standard commercial finite element software is achieved.
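The assumed mode method described above leads, via Lagrange's equations, to a matrix eigenvalue problem of the form K q = ω² M q. A minimal sketch with an invented 3-DOF stiffness/mass pair (not output of the VAPS code, and not a plate model):

```python
import numpy as np
from scipy.linalg import eigh

# Illustrative 3-DOF stiffness (N/m) and mass (kg) matrices, not from VAPS
K = np.array([[ 4.0, -2.0,  0.0],
              [-2.0,  4.0, -2.0],
              [ 0.0, -2.0,  2.0]]) * 1e6
M = np.diag([2.0, 2.0, 1.0])

# Generalized symmetric eigenvalue problem: K q = w^2 M q
w2, modes = eigh(K, M)                 # eigenvalues ascending
freqs_hz = np.sqrt(w2) / (2 * np.pi)   # natural frequencies
print(freqs_hz)
```

Each column of `modes` is a natural mode shape corresponding to the frequency in `freqs_hz`.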

  12. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others

    2003-07-01

    This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). The purpose of a Level I PSA is to delineate the accident sequences leading to core damage and to estimate their frequencies. It has been used directly for assessing and modifying system safety and reliability, as a key and basic part of PSA. Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report consists of six major procedural steps for Level I PSA: familiarization with the plant, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level I PSAs for NPPs. A particular aim is to promote a standardized framework, terminology and form of documentation for PSAs. This report would also be useful for managers or regulatory persons involved in risk-informed regulation, and for conducting PSAs in other industries.
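The accident sequence quantification step, in the rare-event approximation, amounts to multiplying the initiating event frequency by the summed probabilities of the minimal cut sets. A minimal sketch; every event name, frequency and probability below is invented, not taken from the report:

```python
# Rare-event approximation for accident sequence quantification.
# All frequencies/probabilities are illustrative only.
ie_freq = 1.0e-2           # initiating event frequency (/yr)
basic_events = {"pump_a": 3e-3, "pump_b": 3e-3, "dg_start": 1e-2, "valve": 5e-4}
minimal_cut_sets = [["pump_a", "pump_b"], ["dg_start", "valve"]]

def cut_set_prob(cut_set):
    """Probability of a minimal cut set, assuming independent basic events."""
    p = 1.0
    for event in cut_set:
        p *= basic_events[event]
    return p

# P(top) ~ sum of cut set probabilities when all probabilities are small
p_top = sum(cut_set_prob(cs) for cs in minimal_cut_sets)
cdf = ie_freq * p_top      # core damage frequency contribution (/yr)
print(cdf)
```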

  13. Contribution of the ergonomic analysis to the improvement of the design of operating procedures in nuclear power plants

    International Nuclear Information System (INIS)

    Dien, Y.; Montmayeul, R.

    1992-11-01

    The design of operating procedures for continuous processes is all too often based on implicit assumptions about both the operators and the operating conditions that must be dealt with. The merit of the ergonomic approach to procedure design is that it takes account of the way the various operators actually use operating procedures. Actual use is determined from the analysis of on-site operation (normal and incident operating conditions) and of full-scale simulator tests (incident operating conditions). The introduction of the ergonomic approach into procedure design results in new design principles being proposed.

  14. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    Science.gov (United States)

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
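The flavor of a bootstrap test for a covariate effect on accuracy can be sketched for the simplest case of a binary covariate: compare the AUC in two covariate strata and bootstrap the difference under the pooled null. This is a strong simplification of the paper's nonparametric approach (which handles continuous covariates and factor-by-curve interactions), and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

def auc(neg, pos):
    # Mann-Whitney estimate of the area under the ROC curve
    wins = sum((pos > x).sum() + 0.5 * (pos == x).sum() for x in neg)
    return wins / (len(neg) * len(pos))

# Synthetic test values in two covariate strata (e.g. two age groups)
neg_a, pos_a = rng.normal(0, 1, 80), rng.normal(1.5, 1, 80)  # stratum A
neg_b, pos_b = rng.normal(0, 1, 80), rng.normal(0.5, 1, 80)  # stratum B
observed = auc(neg_a, pos_a) - auc(neg_b, pos_b)

# Bootstrap the null (no covariate effect) by resampling pooled strata
boot = []
for _ in range(500):
    na = rng.choice(np.r_[neg_a, neg_b], 80); pa = rng.choice(np.r_[pos_a, pos_b], 80)
    nb = rng.choice(np.r_[neg_a, neg_b], 80); pb = rng.choice(np.r_[pos_a, pos_b], 80)
    boot.append(auc(na, pa) - auc(nb, pb))
p_value = np.mean(np.abs(boot) >= abs(observed))
print(observed, p_value)
```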

  15. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    Science.gov (United States)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties; possible sources are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent, taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties in the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum DEM resolution required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Socio-economic information and the monetary transfer functions required for a damage risk analysis in particular show a high uncertainty. This study therefore helps to identify the weak points of the flood risk and damage risk assessment procedure.
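The Monte Carlo treatment of DEM uncertainty described above can be sketched in a few lines: perturb the elevation grid with the assumed DEM error, flag cells below the water level plane, and accumulate per-cell flooding probabilities. The toy DEM, water level and error magnitude below are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy DEM (m above datum) standing in for the high-resolution city DEMs
dem = np.array([[2.0, 1.5, 1.2],
                [1.8, 1.1, 0.9],
                [1.6, 1.0, 0.8]])
water_level = 1.3   # m, assumed horizontal water level plane
sigma_dem = 0.2     # m, assumed DEM elevation error (1 sigma)

# Monte Carlo: perturb the DEM, flag flooded cells, accumulate probability
n_runs = 2000
flooded = np.zeros_like(dem)
for _ in range(n_runs):
    noisy = dem + rng.normal(0.0, sigma_dem, dem.shape)
    flooded += (noisy < water_level)
flood_prob = flooded / n_runs   # per-cell flooding probability
print(flood_prob)
```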

  16. Using BMDP and SPSS for a Q factor analysis.

    Science.gov (United States)

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.
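The core idea of the Q technique is to factor persons rather than variables, here on Euclidean distances between person profiles rather than correlations. The BMDP and SPSS commands provided by the paper are not reproduced; below is only a numpy stand-in that transposes the person-by-variable matrix, computes person-to-person Euclidean distances, and factors the resulting association matrix (via an eigendecomposition of double-centred squared distances, as in classical multidimensional scaling):

```python
import numpy as np

rng = np.random.default_rng(7)

# rows = persons, columns = test variables (synthetic score profiles)
scores = rng.normal(0, 1, (10, 6))
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)  # standardize variables

# Q technique treats persons as the "variables": Euclidean distances
# between person profiles instead of variable correlations
diff = z[:, None, :] - z[None, :, :]
dist = np.sqrt((diff ** 2).sum(axis=-1))                 # 10 x 10 person distances

# Factor the person-by-person association matrix derived from distances
d2 = dist ** 2
J = np.eye(10) - np.ones((10, 10)) / 10                  # centring matrix
B = -0.5 * J @ d2 @ J
eigvals, person_loadings = np.linalg.eigh(B)             # ascending eigenvalues
print(dist.round(2))
```

The leading columns of `person_loadings` (largest eigenvalues) group similar persons, which is the typology-building step the abstract refers to.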

  17. Factor analysis of serogroups botanica and aurisina of Leptospira biflexa.

    Science.gov (United States)

    Cinco, M

    1977-11-01

    Factor analysis was performed on serovars of the Botanica and Aurisina serogroups of Leptospira biflexa. The results show the arrangement of the main serovar- and serogroup-specific factors, as well as the antigens shared with serovars of heterologous serogroups.

  18. Human factors analysis of incident/accident report

    International Nuclear Information System (INIS)

    Kuroda, Isao

    1992-01-01

    Human factors analysis of accidents/incidents faces various difficulties, not only in its technical but also in its psychosocial background. This report introduces some experiments with the 'variation diagram method', which can be extended to operational and managerial factors. (author)

  19. Chronic subdural hematoma: a systematic review and meta-analysis of surgical procedures.

    Science.gov (United States)

    Liu, Weiming; Bakker, Nicolaas A; Groen, Rob J M

    2014-09-01

    In this paper the authors systematically evaluate the results of different surgical procedures for chronic subdural hematoma (CSDH). The MEDLINE, Embase, Cochrane Central Register of Controlled Trials, and other databases were scrutinized according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analysis) statement, after which only randomized controlled trials (RCTs) and quasi-RCTs were included. At least 2 different neurosurgical procedures in the management of chronic subdural hematoma (CSDH) had to be evaluated. Included studies were assessed for the risk of bias. Recurrence rates, complications, and outcome including mortality were taken as outcome measures. Statistical heterogeneity in each meta-analysis was assessed using the T(2) (tau-squared), I(2), and chi-square tests. The DerSimonian-Laird method was used to calculate the summary estimates using the fixed-effect model in meta-analysis. Of the 297 studies identified, 19 RCTs were included. Of them, 7 studies evaluated the use of postoperative drainage, of which the meta-analysis showed a pooled OR of 0.36 (95% CI 0.21-0.60; p < 0.001) in favor of drainage. Four studies compared twist drill and bur hole procedures. No significant differences between the 2 methods were present, but heterogeneity was considered to be significant. Three studies directly compared the use of irrigation before drainage. A fixed-effects meta-analysis showed a pooled OR of 0.49 (95% CI 0.21-1.14; p = 0.10) in favor of irrigation. Two studies evaluated postoperative posture. The available data did not reveal a significant advantage in favor of the postoperative supine posture. Regarding positioning of the catheter used for drainage, it was shown that a frontal catheter led to a better outcome. One study compared duration of drainage, showing that 48 hours of drainage was as effective as 96 hours of drainage. Postoperative drainage has the advantage of reducing recurrence without increasing complications.
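A fixed-effect pooled odds ratio of the kind reported above is an inverse-variance weighted average of per-trial log odds ratios. A minimal sketch; the 2x2 counts below are invented for illustration and are not the trial data from the review:

```python
import math

# Each tuple: (events_treated, n_treated, events_control, n_control)
# Counts are invented for illustration only.
trials = [(8, 100, 20, 100), (5, 60, 12, 60), (10, 120, 22, 118)]

num = den = 0.0
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of the log OR
    w = 1 / var                           # inverse-variance weight
    num += w * log_or
    den += w

pooled = math.exp(num / den)
se = math.sqrt(1 / den)
ci = (math.exp(num / den - 1.96 * se), math.exp(num / den + 1.96 * se))
print(pooled, ci)
```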

  20. Analysis of success factors in advertising

    OpenAIRE

    Fedorchak, Oleksiy; Kedebecz, Kristina

    2017-01-01

    The essence of the success factors of advertising campaigns is investigated. The stages of conducting advertising campaigns and the stages of evaluating their effectiveness are determined. The goals and objectives of advertising campaigns are also defined.

  1. Holographic analysis of diffraction structure factors

    International Nuclear Information System (INIS)

    Marchesini, S.; Bucher, J.J.; Shuh, D.K.; Fabris, L.; Press, M.J.; West, M.W.; Hussain, Z.; Mannella, N.; Fadley, C.S.; Van Hove, M.A.; Stolte, W.C.

    2002-01-01

    We combine the theory of inside-source/inside-detector x-ray fluorescence holography and Kossel lines/x-ray standing waves in the kinematic approximation to directly obtain the phases of the diffraction structure factors. The influence of Kossel lines and standing waves on holography is also discussed. We achieve a partial phase determination from experimental data, obtaining the sign of the real part of the structure factor for several reciprocal lattice vectors of a vanadium crystal.

  2. Cell-based land use screening procedure for regional siting analysis

    International Nuclear Information System (INIS)

    Jalbert, J.S.; Dobson, J.E.

    1976-01-01

    An energy facility site-screening methodology which permits the land resource planner to identify candidate siting areas was developed. Through the use of spatial analysis procedures and computer graphics, a selection of candidate areas is obtained. Specific sites may then be selected from among the candidate areas for environmental impact analysis. The computerized methodology utilizes a cell-based geographic information system for specifying the suitability of candidate areas for an energy facility. The criteria to be considered may be specified by the user and weighted in terms of importance. Three primary computer programs have been developed; these programs produce thematic maps, proximity calculations, and suitability calculations. The programs are written so as to be transferable to regional planning or regulatory agencies to assist in rational and comprehensive power plant site identification and analysis.
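The cell-based suitability calculation described above is, at its core, a weighted overlay of criterion rasters followed by a screening threshold. A minimal sketch; the criteria, weights and threshold below are invented, not the ones used by the original programs:

```python
import numpy as np

# Criterion rasters on a common cell grid, each scaled 0 (unsuitable)..1 (best).
# Criteria and weights are illustrative only.
slope      = np.array([[0.9, 0.7], [0.4, 0.2]])
water_dist = np.array([[0.3, 0.8], [0.9, 0.5]])
population = np.array([[0.6, 0.9], [0.8, 0.1]])

weights = {"slope": 0.5, "water_dist": 0.3, "population": 0.2}  # user-chosen

# Weighted overlay: per-cell suitability score
suitability = (weights["slope"] * slope
               + weights["water_dist"] * water_dist
               + weights["population"] * population)

# Candidate siting cells: suitability above a screening threshold
candidates = suitability >= 0.6
print(suitability.round(2))
print(candidates)
```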

  3. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    Science.gov (United States)

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells are a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA, for nuclear-based cell demarcation, or with probes that react with proteins, for image analysis based on whole cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter also provides troubleshooting guidelines for some of the common problems associated with these aspects of HCI.
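A generic nuclear-based segmentation step of the kind the chapter relies on (not the chapter's specific protocol) can be sketched as smooth, threshold, then label connected components. The image below is synthetic, standing in for a DNA-stained field:

```python
import numpy as np
from scipy import ndimage

# Synthetic "nuclear stain" image: two bright blobs on a dark background
img = np.zeros((40, 40))
img[5:12, 5:12] = 1.0      # nucleus 1
img[25:33, 20:29] = 1.0    # nucleus 2
img += np.random.default_rng(3).normal(0, 0.05, img.shape)

# Segmentation: smooth, threshold, then label connected components
smooth = ndimage.gaussian_filter(img, sigma=1)
mask = smooth > 0.5
labels, n_objects = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_objects + 1))
print(n_objects, sizes)
```

In a real HCI pipeline the threshold would be chosen adaptively and touching nuclei split (e.g. by watershed), which this sketch omits.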

  4. Periodic tests: a human factors analysis of documentary aspects

    International Nuclear Information System (INIS)

    Perinet, Romuald; Rousseau, Jean-Marie

    2007-01-01

    Periodic tests are technical inspections aimed at verifying the availability of the safety-related systems during operation. The French licensee, Electricite de France (EDF), manages periodic tests according to procedures, methods of examination and a frequency which were defined when the systems were designed. These requirements are defined by national authorities of EDF in a reference document composed of rules of testing and tables containing the reference values to be respected. This reference document is analyzed and transformed by each 'Centre Nucleaire de Production d'Electricite' (CNPE) into station-specific operating ranges of periodic tests. In 2003, the IRSN noted that significant events for safety (ESS) involving periodic tests represented more than 20% of ESS between 2000 and 2002. In all, 340 ESS were related to non-compliance with the conditions of the test and errors in the implementation of the procedures. A first analysis showed that almost 26% of all ESS from 2000 to 2002 were related to periodic tests. For many of them, the national reference document and the operating ranges of tests were involved. In this context, the 'Direction Generale de la Surete Nucleaire' (DGSNR) requested the 'Institut de Radioprotection et de Surete Nucleaire' (IRSN) to examine the process of definition and implementation of the periodic tests. The IRSN analyzed about thirty French licensee event reports from the period considered (2000-2002). The IRSN also interviewed the main persons responsible for the processes and observed the performance of 3 periodic tests. The results of this analysis were presented to a group of experts ('Groupe Permanent') charged with delivering advice to the DGSNR about the origin of the problems identified and the improvements to be implemented. The main conclusions of the IRSN addressed the quality of the prescriptive documents. In this context, EDF decided to carry out a thorough analysis of the whole process. The first

  5. Identification of noise in linear data sets by factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, Ph.K.

    1982-01-01

    A technique which has the ability to identify bad data points after the data have been generated is classical factor analysis. The ability of classical factor analysis to identify two different types of data errors makes it ideally suited for scanning large data sets. Since the results yielded by factor analysis indicate correlations between parameters, one must know something about the nature of the data set and the analytical techniques used to obtain it in order to confidently isolate errors. (author)
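One way a factor model can flag bad data points, as a stand-in for the classical factor analysis described above, is via reconstruction residuals: points poorly reproduced by a low-rank (here one-component) model are suspect. The data set and the planted error are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic linear data set: variable 2 ~ 2 x variable 1, plus one bad point
x = rng.uniform(0, 10, 30)
data = np.column_stack([x, 2 * x + rng.normal(0, 0.05, 30)])
data[17, 1] += 8.0                     # planted bad data point

# One-component (one-factor) model of the centred data
centred = data - data.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[0]
recon = np.outer(scores, vt[0])        # rank-1 reconstruction

# Points poorly reproduced by the model are flagged as suspect
residual = np.linalg.norm(centred - recon, axis=1)
suspect = int(np.argmax(residual))
print(suspect, round(float(residual[suspect]), 3))
```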

  6. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Jonušauskas, Steponas; Raišienė, Agota Giedrė

    2016-01-01

    With reference to the results of a large sample factor analysis, the article aims to propose a frame for examining technostress in a population. The survey and principal component analysis of a sample consisting of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the dispersion of the answers. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  7. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    Science.gov (United States)

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often causes violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  8. A procedure for partitioning bulk sediments into distinct grain-size fractions for geochemical analysis

    Science.gov (United States)

    Barbanti, A.; Bothner, Michael H.

    1993-01-01

    A method to separate sediments into discrete size fractions for geochemical analysis has been tested. The procedures were chosen to minimize the destruction or formation of aggregates and involved gentle sieving and settling of wet samples. Freeze-drying and sonication pretreatments, known to influence aggregates, were used for comparison. Freeze-drying was found to increase the silt/clay ratio by an average of 180 percent compared to analysis of a wet sample that had been wet sieved only. Sonication of a wet sample decreased the silt/clay ratio by 51 percent. The concentrations of metals and organic carbon in the separated fractions changed depending on the pretreatment procedures in a manner consistent with the hypothesis that aggregates consist of fine-grained organic- and metal-rich particles. The coarse silt fraction of a freeze-dried sample contained 20–44 percent higher concentrations of Zn, Cu, and organic carbon than the coarse silt fraction of the wet sample. Sonication resulted in concentrations of these analytes that were 18–33 percent lower in the coarse silt fraction than found in the wet sample. Sonication increased the concentration of lead in the clay fraction by an average of 40 percent compared to an unsonicated sample. Understanding the magnitude of change caused by different analysis protocols is an aid in designing future studies that seek to interpret the spatial distribution of contaminated sediments and their transport mechanisms.

  9. EFFICIENCY OF MOMENT AMPLIFICATION PROCEDURES FOR THE SECOND-ORDER ANALYSIS OF STEEL FRAMES

    OpenAIRE

    Paschal Chiadighikaobi

    2018-01-01

    A beam-column is a member subjected to both axial compression and bending. The secondary moment accounted for in design is the additional moment induced by the axial load. The results of analyses from two computer-aided tools (SAP2000 and a Java code) are compared. The moment amplification factor Af was input in the Java code; Af did not create any change in the output of the Java code results. There are many different ways to apply amplification factors to first-order analysis results, each wit...
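One common textbook form of the moment amplification factor, the AISC-style member (P-delta) amplifier B1 = Cm / (1 - Pu/Pe), illustrates the idea; the abstract's exact Af definition may differ, and all numbers below are invented:

```python
def moment_amplifier(cm, p_u, p_e):
    """AISC-style amplifier B1 = Cm / (1 - Pu/Pe) for member (P-delta) effects.

    cm  : equivalent uniform moment factor
    p_u : required axial compressive strength
    p_e : Euler buckling load of the member
    (Common textbook form; the paper's Af definition may differ.)
    """
    if p_u >= p_e:
        raise ValueError("axial load at or above Euler load: amplification undefined")
    return max(cm / (1.0 - p_u / p_e), 1.0)

# Second-order moment ~ amplifier x first-order moment (illustrative values)
m1 = 120.0  # kN*m, first-order moment
af = moment_amplifier(cm=1.0, p_u=200.0, p_e=1000.0)
m2 = af * m1
print(af, m2)
```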

  10. Legitimization Arguments for Procedural Reforms: a semio-linguistic analysis of statement of reasons from the Civil Procedure Code of 1939 and of the draft bill of the New Civil Procedure Code of 2010.

    Directory of Open Access Journals (Sweden)

    Matheus Guarino Sant’Anna Lima de Almeida

    2016-08-01

    Full Text Available This research aims to analyze the arguments of legitimization that were used in the reform of Brazilian procedural legal codes, by comparing the texts of the statement of reasons of the Civil Procedure Code of 1939 and the draft bill of the New Civil Procedure Code. We consider these codes as milestones: the Civil Procedure Code of 1939 was the first one with a national scope; the draft bill of the New Civil Procedure Code was the first one produced during a democratic period. Our goal is to search for similarities and contrasts between the legitimization arguments used in each historical and political period, by asking if they were only arguments to bestow legitimacy to such reforms. We decided to use the methodological tools of sociolinguistic analysis of speech developed by Patrick Charaudeau in his analyses of political speech in order to elucidate how the uses of language and elements of meaning in the speech construction provide justification for the concept of procedure, in both 1939 and 2010. As a result, we conclude that the process of drafting the CPC of 1939 and the New CPC, even if they are very distant in terms of political and historical contexts, they are also very close in their rhetorical construction and their attempt to find justification and adherence. On balance, some of the differences depend on the vocabulary used when the codes were developed, their justification and the need for change. 

  11. Drawing up of a procedure for vanadium determination in mussels using the neutron activation analysis method

    International Nuclear Information System (INIS)

    Seo, Daniele; Vasconcellos, Marina B.A.; Saiki, Mitiko; Catharino, Marilia G.M.; Moreira, Edson G.; Sousa, Eduinetty C.P.M. de; Pereira, Camilo D.S.

    2009-01-01

    This work establishes an adequate procedure for obtaining reliable results in the determination of vanadium in mussels by neutron activation analysis (NAA), with a view to its later application in biomonitoring of coastal pollution, particularly near petroleum terminals. To evaluate the quality of the results, the certified reference material NIST SRM 1566b Oyster Tissue was analysed. The precision of the results was also assessed using replicates of mussel samples collected on the coast of northern Sao Paulo state, Brazil. The NAA procedure consisted of irradiating 200 mg of sample together with a synthetic vanadium standard for 8 s under a thermal neutron flux of 6.6 x 10 12 n cm -2 s -1 at pneumatic station 4 of the IEA-R1 nuclear reactor of IPEN-CNEN/SP. After a 3 min decay, the gamma activities of the sample and the standard were measured using a hyperpure Ge semiconductor detector connected to a gamma-ray multichannel analyser. Vanadium was determined by measuring the gamma activity of 52 V through the 1434.08 keV peak (half-life of 3.75 min). The concentration of V was calculated by the comparative method.The obtained results indicated the viability of the NAA procedure established for the determination of vanadium in mussels.
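The comparative method referred to above scales the standard's known concentration by the ratio of decay-corrected specific activities of sample and standard. A generic textbook sketch of the comparator calculation (not the authors' exact computation; all counts, masses and the standard concentration are invented, while the 3.75 min half-life comes from the abstract):

```python
import math

HALF_LIFE_S = 3.75 * 60   # 52V half-life in seconds (value from the abstract)
LAMBDA = math.log(2) / HALF_LIFE_S

def decay_corrected(counts, delay_s):
    """Correct a measured peak count back to the end of irradiation."""
    return counts / math.exp(-LAMBDA * delay_s)

def v_concentration(a_sample, m_sample, a_std, m_std, c_std,
                    delay_sample_s, delay_std_s):
    """Comparative NAA: C_sample = C_std * (A_sample/m_sample) / (A_std/m_std),
    with both activities decay-corrected to a common reference time."""
    spec_sample = decay_corrected(a_sample, delay_sample_s) / m_sample
    spec_std = decay_corrected(a_std, delay_std_s) / m_std
    return c_std * spec_sample / spec_std

# Illustrative numbers only (peak counts, masses in mg, standard conc. in ug/g)
c = v_concentration(a_sample=5200, m_sample=200.0, a_std=48000, m_std=10.0,
                    c_std=50.0, delay_sample_s=180.0, delay_std_s=200.0)
print(round(c, 3))
```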

  12. Analysis and application of ratcheting evaluation procedure of Japanese high temperature design code DDS

    International Nuclear Information System (INIS)

    Lee, H. Y.; Kim, J. B.; Lee, J. H.

    2002-01-01

    In this study, the evaluation procedure of the Japanese DDS code, recently developed to assess the progressive inelastic deformation occurring under repeated secondary stresses, was analyzed, and the evaluation results according to DDS were compared with those of the thermal ratchet structural test carried out by KAERI in order to examine the conservativeness of the code. The existing high temperature codes, US ASME-NH and French RCC-MR, provide ratcheting procedures only for load cases of cyclic secondary stresses under primary stresses, so they are not applicable to the actual ratcheting problem, which can occur under cyclic secondary membrane stresses due to the movement of the hot free surface in a pool-type LMR. DDS explicitly provides an analysis procedure for ratcheting due to moving thermal gradients near the hot free surface. A comparison study was carried out between the results of the DDS design code and those of the structural test to investigate the conservativeness of the DDS code; it showed that the evaluation results by DDS were in good agreement with those of the structural test.

  13. New analysis procedure for fast and reliable size measurement of nanoparticles from atomic force microscopy images

    International Nuclear Information System (INIS)

    Boyd, Robert D.; Cuenat, Alexandre

    2011-01-01

    Accurate size measurement during nanoparticle production is essential for the continuing innovation, quality and safety of nano-enabled products. Size measurement by analysing a number of separate particles individually has particular advantages over ensemble methods: in the latter case the nanoparticles have to be well dispersed in a fluid, and changes that may occur during analysis, such as agglomeration and degradation, will not be detected, which can lead to misleading results. Atomic force microscopy (AFM) allows imaging of particles both in air and in liquid; however, the strong interaction between the probe and the particle broadens the lateral dimensions in the final image. In this paper a new procedure to measure the size of spherical nanoparticles from AFM images via vertical height measurement is described. This procedure quickly analyses hundreds of particles simultaneously and reproduces the measurements obtained from electron microscopy (EM). Nanoparticle samples that were difficult, if not impossible, to analyse with EM were successfully measured using this method. The combination of this procedure with the use of a metrological AFM moves closer to truly traceable measurements of nanoparticle dispersions.
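The height-based sizing idea can be sketched as follows: tip convolution broadens lateral dimensions but not height, so the diameter of an isolated sphere on a flat substrate can be taken as its peak height above the background. This is a minimal illustration with `scipy.ndimage`, not the authors' implementation; the threshold and the synthetic image are invented.

```python
import numpy as np
from scipy import ndimage

def particle_heights(image, background=0.0, threshold=1.0):
    """Estimate spherical-particle diameters from an AFM height image:
    segment particles by a height threshold, then take each particle's
    maximum height above background as its diameter."""
    mask = (image - background) > threshold
    labels, n = ndimage.label(mask)          # connected regions = particles
    if n == 0:
        return np.array([])
    peaks = ndimage.maximum(image, labels, index=np.arange(1, n + 1))
    return np.asarray(peaks) - background

# synthetic image: two "particles" of height 10 and 20 (nm)
img = np.zeros((50, 50))
img[10:14, 10:14] = 10.0
img[30:35, 30:35] = 20.0
print(np.sort(particle_heights(img)))   # → [10. 20.]
```

Many particles are measured in one pass, which is what makes the per-particle approach fast compared with manual cross-sections.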

  14. Procedure for the record, calculation and analysis of costs at the Post Company of Cuba.

    Directory of Open Access Journals (Sweden)

    María Luisa Lara Zayas

    2012-12-01

    Full Text Available The Cuban enterprise system is immersed in important changes that lead to a new economic model, one that requires increasing labour productivity and improving economic efficiency through the rational use of material, financial and human resources. The present work proposes a procedure, based on cost techniques, for the recording, calculation and analysis of the costs of activities at the Post Company of Cuba in Sancti Spiritus, with the objective of achieving greater efficiency through the rational use of resources.

  15. The analysis, by a fusion procedure and X-ray-fluorescence spectrometry, of silicates and slags

    International Nuclear Information System (INIS)

    Jacobs, J.J.; Balaes, A.M.E.

    1980-01-01

    A glass-disc fusion method is described for the analysis, by X-ray fluorescence spectrometry, of slags and silicate materials. The data are corrected for detector dead time and short-term instrumental drift. Corrections for matrix variations are made using the Lachance-Traill mathematical model, and the results are processed on a minicomputer, with an iterative procedure used to solve the simultaneous equations. As the alpha-correction coefficients of the Lachance-Traill model are not truly constant, a modified version of the model is proposed
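The iterative solution of the Lachance-Traill equations can be sketched as follows. In that model each concentration satisfies C_i = R_i (1 + Σ_{j≠i} α_ij C_j), where R_i is the measured relative intensity and α_ij are the correction coefficients; starting from the uncorrected intensities, the equations are iterated to a fixed point. The numbers below are illustrative only.

```python
def lachance_traill(R, alpha, n_iter=100, tol=1e-10):
    """Iteratively solve the Lachance-Traill equations
        C_i = R_i * (1 + sum_{j != i} alpha_ij * C_j)
    for concentrations C, given relative intensities R and
    alpha correction coefficients."""
    n = len(R)
    C = list(R)                     # start from uncorrected intensities
    for _ in range(n_iter):
        C_new = [R[i] * (1 + sum(alpha[i][j] * C[j] for j in range(n) if j != i))
                 for i in range(n)]
        if max(abs(a - b) for a, b in zip(C_new, C)) < tol:
            return C_new
        C = C_new
    return C
```

For the small alpha coefficients typical of fused glass discs the iteration is a contraction and converges in a handful of steps.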

  16. A risk-factor analysis of medical litigation judgments related to fall injuries in Korea.

    Science.gov (United States)

    Kim, Insook; Won, Seonae; Lee, Mijin; Lee, Won

    2018-01-01

    The aim of this study was to identify risk factors through an analysis of seven medical-malpractice judgments related to fall injuries. The risk factors were analysed using a framework that approaches falls from a systems perspective, comprising people, organisational and environmental factors, each composed of subfactors. The risk factors found in each of the seven judgments were aggregated into one framework. The risk factors related to patients (the people factor) were age, pain, related disease, activity and functional status, urination state, cognitive impairment, past history of falls, blood transfusion, sleep-endoscopy state and an uncooperative attitude. The risk factors related to medical staff and caregivers (also the people factor) were observation negligence, absence of fall-prevention activities and negligence in managing the high-risk group for falls. Organisational risk factors were a lack of workforce, a lack of training, neglect of the management of the high-risk group, neglect of the management of caregivers and the absence of a fall-prevention procedure. Regarding the environment, the risk factors were the emergency room, chairs without a backrest and the examination table. Identifying risk factors is essential for preventing fall accidents, since falls are preventable patient-safety incidents. Falls do not happen as a result of a single risk factor; therefore, a systems approach is effective in identifying risk factors, especially organisational and environmental ones.

  17. Analysis of Increased Information Technology Outsourcing Factors

    Directory of Open Access Journals (Sweden)

    Brcar Franc

    2013-01-01

    Full Text Available The study explores the field of IT outsourcing. The specific focus of the research is to build a model of IT outsourcing based on influential factors. The purpose of this research is to determine the factors influencing the expansion of IT outsourcing. A survey was conducted among 141 large Slovenian companies. Data were statistically analyzed using binary logistic regression. The final model contains five factors: (1) management support; (2) knowledge of IT outsourcing; (3) improvement of efficiency and effectiveness; (4) quality improvement of IT services; and (5) innovation improvement of IT. Managers can use the results of this research immediately in their decision-making. Increased performance of each individual organization benefits the entire society. The examination of IT outsourcing with the methods used is the first such research in Slovenia.
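The binary-logistic-regression step can be illustrated with a minimal numpy sketch: a plain gradient-ascent fit of the log-likelihood on a binary outcome (e.g. outsourcing expanded: yes/no). The study itself would have used standard statistical software; all names and data here are invented for illustration.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Binary logistic regression by batch gradient ascent.
    X: (n, p) predictors, y: (n,) binary outcome in {0, 1}."""
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])   # add intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))           # predicted probabilities
        w += lr * X1.T @ (y - p) / len(y)           # log-likelihood gradient
    return w

def predict(w, X):
    """Classify at the 0.5 probability threshold."""
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])
    return (1.0 / (1.0 + np.exp(-X1 @ w))) >= 0.5
```

The fitted coefficients (here `w`) play the role of the five significant factors' odds-ratio estimates in such a model.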

  18. Criteria and procedures for validating biomathematical models of human performance and fatigue : procedures for analysis of work schedules.

    Science.gov (United States)

    2013-01-01

    Each railroad covered by 49 CFR 228.407 must perform an analysis of the work schedules of its train employees who are engaged in commuter or intercity rail passenger transportation and identify those schedules that, if worked by such a train employee...

  19. Comparative analysis of lockout programs and procedures applied to industrial machines

    Energy Technology Data Exchange (ETDEWEB)

    Chinniah, Y.; Champoux, M.; Burlet-Vienney, D.; Daigle, R. [Institut de recherche Robert-Sauve en sante et en securite du travail, Montreal, PQ (Canada)

    2008-09-15

    In 2005, approximately 20 workers in Quebec were killed by dangerous machines. Approximately 13,000 accidents in the province were linked to the use of machines. The resulting cost associated with these accidents was estimated to be $70 million to the Quebec Occupational Health and Safety Commission (CSST) in compensation and salary replacement. According to article 185 of the Quebec Occupational Health and Safety Regulation (RSST), workers intervening in hazardous zones of machines and processes during maintenance, repairs, and unjamming activities must apply lockout procedures. Lockout is defined as the placement of a lock or tag on an energy-isolating device in accordance with an established procedure, indicating that the energy-isolating device is not to be operated until removal of the lock or tag in accordance with an established procedure. This report presented a comparative analysis of lockout programs and procedures applied to industrial machines. The study attempted to answer several questions regarding the concept of lockout and its definition in the literature; the differences between legal lockout requirements among provinces and countries; different standards on lockout; the contents of lockout programs as described by different documents; and the compliance of lockout programs in a sample of industries in Quebec in terms of Canadian standard on lockout, the CSA Z460-05 (2005). The report discussed the research objectives, methodology, and results of the study. It was concluded that the concept of lockout has different meanings or definitions in the literature, especially in regulations. However, definitions of lockout which are found in standards have certain similarities. 50 refs., 52 tabs., 2 appendices.

  20. Electrically evoked compound action potentials artefact rejection by independent component analysis: procedure automation.

    Science.gov (United States)

    Akhoun, Idrick; McKay, Colette; El-Deredy, Wael

    2015-01-15

    Independent component analysis (ICA) successfully separated electrically evoked compound action potentials (ECAPs) from the stimulation artefact and noise (ECAP-ICA; Akhoun et al., 2013). This paper shows how to automate the ECAP-ICA artefact-cancellation process. Raw ECAPs without artefact rejection were consecutively recorded for each stimulation condition from at least 8 intra-cochlear electrodes. Firstly, amplifier-saturated recordings were discarded, and the data from different stimulus conditions (different current levels) were concatenated temporally. The key aspect of the automation procedure was the sequential deductive source categorisation after ICA was applied with a restriction to 4 sources. The stereotypical aspect of the 4 sources enables their automatic classification, on theoretical and empirical grounds, as two artefact components, a noise component and the sought ECAP. The automatic procedure was tested on 8 cochlear implant (CI) users with one to four stimulus electrodes. The artefact and noise sources were successively identified and discarded, leaving the ECAP as the remaining source. The automated ECAP-ICA procedure extracted the correct ECAPs, compared with the standard clinical forward-masking paradigm, in 22 out of 26 cases. ECAP-ICA does not require extracting the ECAP from a combination of distinct buffers, as is the case with regular methods. It is an alternative that avoids the possible bias of traditional artefact rejections such as alternate-polarity or forward-masking paradigms. The ECAP-ICA procedure has clinical relevance, for example as the artefact-rejection sub-module of automated ECAP-threshold detection techniques, which are common features of CI clinical fitting software. Copyright © 2014. Published by Elsevier B.V.
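The source-separation step can be sketched with a generic fixed-point ICA (FastICA-style, with deflation) written in plain numpy. This is not the authors' implementation: a two-source toy demo stands in for the four-source ECAP case, and the whitening and nonlinearity choices are standard textbook ones.

```python
import numpy as np

def fastica(X, n_components, n_iter=200, seed=0):
    """Minimal FastICA-style fixed-point ICA with deflation.
    X: (n_samples, n_channels) mixed recordings; returns estimated sources."""
    X = X - X.mean(axis=0)
    # whiten: decorrelate channels and scale to unit variance
    cov = np.cov(X, rowvar=False)
    d, E = np.linalg.eigh(cov)
    Z = X @ (E @ np.diag(d ** -0.5) @ E.T)
    rng = np.random.default_rng(seed)
    W = np.zeros((n_components, Z.shape[1]))
    for i in range(n_components):
        w = rng.normal(size=Z.shape[1])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            g = np.tanh(Z @ w)
            w_new = (Z * g[:, None]).mean(axis=0) - (1 - g ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)   # deflate against found rows
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < 1e-10
            w = w_new
            if converged:
                break
        W[i] = w
    return Z @ W.T
```

In the ECAP application the analogous step would return four sources, which are then classified (two artefacts, noise, ECAP) by the deductive rules the paper describes.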

  1. A procedure for the determination of scenario earthquakes for seismic design based on probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Hirose, Jiro; Muramatsu, Ken

    2002-03-01

    This report presents a study of procedures for determining scenario earthquakes for the seismic design of nuclear power plants (NPPs) based on probabilistic seismic hazard analysis (PSHA). In recent years, the use of PSHA, which is part of seismic probabilistic safety assessment (PSA), to determine the design-basis earthquake motions for NPPs has been proposed. The earthquakes so identified are called probability-based scenario earthquakes (PBSEs). The concept of PBSEs originates both from the study of the US NRC and from Ishikawa and Kameda. The assessment of PBSEs comprises seismic hazard analysis and identification of dominant earthquakes. The objectives of this study are to formulate the concept of PBSEs and to examine procedures for determining PBSEs for a domestic NPP site. This report consists of three parts: procedures to compile analytical conditions for PBSEs, an assessment identifying PBSEs for a model site using Ishikawa's concept, and an examination of the uncertainties involved in the analytical conditions. The results obtained from the examination of PBSEs using Ishikawa's concept are as follows. (a) Since PBSEs are expressed by a hazard-consistent magnitude and distance at a prescribed reference probability, it is easy to obtain a concrete image of the earthquakes that determine the ground response spectrum to be considered in the design of NPPs. (b) Source contribution factors provide information on the importance of earthquake source regions and/or active faults, and allow the selection of a few PBSEs based on their importance to the site. (c) Since the analytical conditions involve uncertainty, sensitivity analyses of the uncertainties that would affect seismic hazard curves and the identification of PBSEs were performed on various aspects and provided useful insights for the assessment of PBSEs. One result of this sensitivity analysis was that, although the difference in selection of attenuation equations led to a
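The hazard-consistent magnitude and distance of points (a) and (b) can be sketched schematically: at the reference probability, each source's contribution to the exceedance frequency is deaggregated, the source contribution factors are the normalized contributions, and the hazard-consistent magnitude and distance are contribution-weighted means. This is an illustrative sketch of the deaggregation idea attributed to Ishikawa and Kameda; the data structure and numbers are invented.

```python
def hazard_consistent_scenario(sources):
    """Hazard-consistent magnitude/distance from deaggregated contributions.

    sources: list of dicts with keys 'name', 'magnitude', 'distance',
    'contribution', where 'contribution' is the source's contribution to
    the annual frequency of exceeding the reference ground-motion level.
    Returns (M_bar, R_bar, contribution_factors)."""
    total = sum(s['contribution'] for s in sources)
    factors = {s['name']: s['contribution'] / total for s in sources}
    m_bar = sum(s['magnitude'] * s['contribution'] for s in sources) / total
    r_bar = sum(s['distance'] * s['contribution'] for s in sources) / total
    return m_bar, r_bar, factors
```

A dominant source (large contribution factor) pulls the scenario magnitude and distance toward its own values, which is what makes the resulting PBSE easy to picture as a concrete earthquake.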

  2. Radiological management of patients with urinary obstruction following urinary diversion procedures: technical factors, complications, long-term management and outcome. Experience with 378 procedures.

    LENUS (Irish Health Repository)

    Maher, M M

    2012-02-03

    We aimed to assess the management, by interventional radiology techniques, of patients with urinary diversion procedures (UD) complicated by urinary obstruction (UO). A 12-year electronic database of interventional cases was searched for urinary access in patients with UD. Patients' records were assessed for aetiology of obstruction, indication for procedure, types of interventional radiology, complications and outcome. Management issues included frequency of visits for catheter care, type of catheter placement and technical problems associated with catheter maintenance. Three hundred and seventy-eight procedures were carried out in 25 patients (mean age 70 years; male:female ratio 13:12). Indications for UD were malignancy (n = 22) and neuropathic bladder (n = 3). UD included ileal conduits (n = 17), cutaneous ureterostomy (n = 3, in 2 patients) and sigmoid colon urinary conduit (n = 6). In most patients, catheters were initially placed antegradely through a nephrostomy tract, but subsequent access was through the UD. Twenty of 25 patients had unilateral stents, whereas 5 had bilateral stents (8-10-Fr pigtail catheters, 20-45 cm in length). The mean number of procedures, including catheter changes, was 15 ± 4 per patient, and 331 of 378 procedures (87 %) were carried out as outpatient procedures. Since catheter placement, 11 patients required hospital admission on 22 occasions for catheter-related complications. Ureteric strictures in patients with UD can be successfully managed by interventional radiology.

  3. Warranty claim analysis considering human factors

    International Nuclear Information System (INIS)

    Wu Shaomin

    2011-01-01

    Warranty claims are not always due to product failures. They can also be caused by two types of human factors. On the one hand, consumers might claim warranty due to misuse and/or failures caused by various human factors. Such claims might account for more than 10% of all reported claims. On the other hand, consumers might not be bothered to claim warranty for failed items that are still under warranty, or they may claim warranty after they have experienced several intermittent failures. These two types of human factors can affect warranty claim costs. However, research in this area has received rather little attention. In this paper, we propose three models to estimate the expected warranty cost when the two types of human factors are included. We consider two types of failures: intermittent and fatal failures, which might result in different claim patterns. Consumers might report claims after a fatal failure has occurred, and upon intermittent failures they might report claims after a number of failures have occurred. Numerical examples are given to validate the results derived.

  4. Chiral analysis of baryon form factors

    Energy Technology Data Exchange (ETDEWEB)

    Gail, T.A.

    2007-11-08

    This work presents an extensive theoretical investigation of the structure of the nucleon within the standard model of elementary particle physics. In particular, the long range contributions to a number of various form factors parametrizing the interactions of the nucleon with an electromagnetic probe are calculated. The theoretical framework for those calculations is chiral perturbation theory, the exact low energy limit of Quantum Chromo Dynamics, which describes such long range contributions in terms of a pion-cloud. In this theory, a nonrelativistic leading one loop order calculation of the form factors parametrizing the vector transition of a nucleon to its lowest lying resonance, the {delta}, a covariant calculation of the isovector and isoscalar vector form factors of the nucleon at next to leading one loop order and a covariant calculation of the isoscalar and isovector generalized vector form factors of the nucleon at leading one loop order are performed. In order to perform consistent loop calculations in the covariant formulation of chiral perturbation theory an appropriate renormalization scheme is defined in this work. All theoretical predictions are compared to phenomenology and results from lattice QCD simulations. These comparisons allow for a determination of the low energy constants of the theory. Furthermore, the possibility of chiral extrapolation, i.e. the extrapolation of lattice data from simulations at large pion masses down to the small physical pion mass is studied in detail. Statistical as well as systematic uncertainties are estimated for all results throughout this work. (orig.)

  5. PGDP [Paducah Gaseous Diffusion Plant]-UF6 handling, sampling, analysis and associated QC/QA and safety related procedures

    International Nuclear Information System (INIS)

    Harris, R.L.

    1987-01-01

    This document is a compilation of Paducah Gaseous Diffusion Plant procedures on UF6 handling, sampling, and analysis, along with associated QC/QA and safety-related procedures. It was assembled for transmission by the US Department of Energy to the Korean Advanced Energy Institute as a part of the US-Korea technical exchange program

  6. Development of SRC-I product analysis. Volume 3. Documentation of procedures

    Energy Technology Data Exchange (ETDEWEB)

    Schweighardt, F.K.; Kingsley, I.S.; Cooper, F.E.; Kamzelski, A.Z.; Parees, D.M.

    1983-09-01

    This section documents the BASIC computer program written to simulate Wilsonville's GC-simulated distillation (GCSD) results at APCI-CRSD Trexlertown. The GC conditions used at APCI for the Wilsonville GCSD analysis of coal-derived liquid samples were described in the SRC-I Quarterly Technical Report, April-June 1981. The approach used to simulate the Wilsonville GCSD results is also from an SRC-I Quarterly Technical Report and is reproduced in Appendix VII-A. The BASIC computer program is described in the attached Appendix VII-B. Analysis of gases produced during coal liquefaction generates key information needed to determine product yields for material balance and process control. Gas samples from the coal process development unit (CPDU) and tubing bombs are the primary samples analyzed. A Carle gas chromatographic system was used to analyze coal liquefaction gas samples. A BASIC computer program was written to convert the gas chromatographic peak-area results into mole-percent results. ICRC has employed several analytical workup procedures to determine the amount of distillate, oils, asphaltenes, preasphaltenes, and residue in SRC-I process streams. The ASE procedure was developed using Conoco's liquid column fractionation (LC/F) method as a model. In developing the ASE procedure, ICRC was able to eliminate distillation, and therefore quantify the oils fraction in one extraction step. ASE results were shown to be reproducible within ± 2 wt %, and to yield acceptable material balances. Finally, the ASE method proved to be the least affected by sample composition.
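The area-to-mole-percent conversion performed by that BASIC program can be sketched in a few lines: each peak area is scaled by a molar response factor from calibration and the scaled amounts are normalised to 100 %. Component names and response-factor values below are invented for illustration.

```python
def mole_percent(peak_areas, response_factors):
    """Convert GC peak areas to mole percent: scale each component's
    area by its molar response factor, then normalise to 100 %."""
    moles = {name: area * response_factors[name] for name, area in peak_areas.items()}
    total = sum(moles.values())
    return {name: 100.0 * n / total for name, n in moles.items()}

areas = {'H2': 1200.0, 'CH4': 800.0, 'CO2': 500.0}   # integrated peak areas
rf = {'H2': 1.10, 'CH4': 0.95, 'CO2': 1.00}          # hypothetical response factors
composition = mole_percent(areas, rf)                # percentages summing to 100
```

In practice the response factors come from running calibration gas mixtures of known composition on the same instrument.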

  7. Assessment of bone formation capacity using in vivo transplantation assays: procedure and tissue analysis

    DEFF Research Database (Denmark)

    Abdallah, Basem; Ditzel, Nicholas; Kassem, Moustapha

    2008-01-01

    In vivo assessment of bone formation (osteogenesis) potential by isolated cells is an important method for analysis of cells and factors controlling bone formation. Currently, cell implantation mixed with hydroxyapatite/tricalcium phosphate in an open system (subcutaneous implantation) in immun...

  8. Patient Dose During Carotid Artery Stenting With Embolic-Protection Devices: Evaluation With Radiochromic Films and Related Diagnostic Reference Levels According to Factors Influencing the Procedure

    Energy Technology Data Exchange (ETDEWEB)

    D'Ercole, Loredana, E-mail: l.dercole@smatteo.pv.it [Fondazione IRCCS Policlinico San Matteo, Department of Medical Physics (Italy); Quaretti, Pietro; Cionfoli, Nicola [Fondazione IRCCS Policlinico San Matteo, Department of Radiology (Italy); Klersy, Catherine [Fondazione IRCCS Policlinico San Matteo, Biometry and Clinical Epidemiology Service, Research Department (Italy); Bocchiola, Milena [Fondazione IRCCS Policlinico San Matteo, Department of Medical Physics (Italy); Rodolico, Giuseppe; Azzaretti, Andrea [Fondazione IRCCS Policlinico San Matteo, Department of Radiology (Italy); Lisciandro, Francesco [Fondazione IRCCS Policlinico San Matteo, Department of Medical Physics (Italy); Cascella, Tommaso; Zappoli Thyrion, Federico [Fondazione IRCCS Policlinico San Matteo, Department of Radiology (Italy)

    2013-04-15

    To measure the maximum entrance skin dose (MESD) in patients undergoing carotid artery stenting (CAS) with embolic-protection devices (EPD), to analyze the dependence of dose and exposure parameters on the anatomical, clinical, and technical factors affecting procedure complexity, to derive local diagnostic reference levels (DRLs), and to evaluate whether exceeding the DRLs is related to procedure complexity. MESD was evaluated with radiochromic films in 31 patients (mean age 72 ± 7 years). Five of 33 (15 %) procedures used proximal EPD, and 28 of 33 (85 %) used distal EPD. Local DRLs were derived from the exposure parameters recorded in 93 patients (65 men and 28 women, mean age 73 ± 9 years) undergoing 96 CAS procedures with proximal (33 %) or distal (67 %) EPD. Four bilateral lesions were included. MESD values (mean 0.96 ± 0.42 Gy) were <2 Gy, without relevant dependence on procedure complexity. Local DRL values for kerma-area product (KAP), fluoroscopy time (FT), and number of frames (N_FR) were 269 Gy cm², 28 min, and 251, respectively. Only simultaneous bilateral treatment was associated with KAP (odds ratio [OR] 10.14, 95 % confidence interval [CI] 1-102.7, p < 0.05) and N_FR overexposures (OR 10.8, 95 % CI 1.1-109.5, p < 0.05). Type I aortic arch decreased the risk of FT overexposure (OR 0.4, 95 % CI 0.1-0.9, p = 0.042), and stenosis ≥ 90 % increased the risk of N_FR overexposure (OR 2.8, 95 % CI 1.1-7.4, p = 0.040). At multivariable analysis, stenosis ≥ 90 % (OR 2.8, 95 % CI 1.1-7.4, p = 0.040) and bilateral treatment (OR 10.8, 95 % CI 1.1-109.5, p = 0.027) were associated with overexposure for two or more parameters. Skin doses are not problematic in CAS with EPD, because these procedures rarely lead to doses >2 Gy.

  9. Sedation for pediatric radiological procedures: analysis of potential causes of sedation failure and paradoxical reactions

    Energy Technology Data Exchange (ETDEWEB)

    Karian, V.E.; Burrows, P.E.; Connor, L. [Dept. of Radiology, Children's Hospital, Boston, MA (United States); Zurakowski, D. [Dept. of Biostatistics, Children's Hospital, Boston, MA (United States); Mason, K.P. [Dept. of Anesthesiology, Children's Hospital, Boston, MA (United States)

    1999-11-01

    Background. Sedation for diagnostic imaging and interventional radiologic procedures in pediatrics has greatly increased over the past decade. With appropriate patient selection and monitoring, serious adverse effects are infrequent, but failure to sedate and paradoxical reactions do occur. Objective. The purpose of this study was to determine, among patients undergoing sedation for radiologic procedures, the incidence of sedation failure and paradoxical reaction to pentobarbital and to identify potentially correctable causes. Materials and methods. Records of 1665 patients who were sedated in the radiology department from 1 November 1997 to 1 July 1998 were reviewed. Patients failing sedation or experiencing paradoxical reaction were compared with respect to sex, age group, diagnosis, scan type, time of day, NPO status, use of IV contrast and type of sedation agent using the Fisher exact test, Pearson chi-square, analysis of variance (ANOVA), the Student t-test, and logistic regression. Results. Data analysis revealed a sedation failure rate of 1 % and paradoxical reaction rate of 1.2 %. Stepwise multiple logistic regression revealed that the only significant independent multivariate predictor of failure was the need for the administration of a combination of pentobarbital, fentanyl, and midazolam IV. Conclusion. The low rate of sedation failure and paradoxical reactions to pentobarbital was near optimal and probably cannot be improved with the currently available sedatives. (orig.)

  10. Sedation for pediatric radiological procedures: analysis of potential causes of sedation failure and paradoxical reactions

    International Nuclear Information System (INIS)

    Karian, V.E.; Burrows, P.E.; Connor, L.; Zurakowski, D.; Mason, K.P.

    1999-01-01

    Background. Sedation for diagnostic imaging and interventional radiologic procedures in pediatrics has greatly increased over the past decade. With appropriate patient selection and monitoring, serious adverse effects are infrequent, but failure to sedate and paradoxical reactions do occur. Objective. The purpose of this study was to determine, among patients undergoing sedation for radiologic procedures, the incidence of sedation failure and paradoxical reaction to pentobarbital and to identify potentially correctable causes. Materials and methods. Records of 1665 patients who were sedated in the radiology department from 1 November 1997 to 1 July 1998 were reviewed. Patients failing sedation or experiencing paradoxical reaction were compared with respect to sex, age group, diagnosis, scan type, time of day, NPO status, use of IV contrast and type of sedation agent using the Fisher exact test, Pearson chi-square, analysis of variance (ANOVA), the Student t-test, and logistic regression. Results. Data analysis revealed a sedation failure rate of 1 % and paradoxical reaction rate of 1.2 %. Stepwise multiple logistic regression revealed that the only significant independent multivariate predictor of failure was the need for the administration of a combination of pentobarbital, fentanyl, and midazolam IV. Conclusion. The low rate of sedation failure and paradoxical reactions to pentobarbital was near optimal and probably cannot be improved with the currently available sedatives. (orig.)

  11. Optimization of procedures for mercury-203 instrumental neutron activation analysis in human urine

    Energy Technology Data Exchange (ETDEWEB)

    Blotcky, A J; Claassen, J P [Nebraska Univ., Omaha, NE (United States). Medical Center; Fung, Y K [Nebraska Univ., Lincoln, NE (United States). Dept. of Chemistry; Meade, A G; Rack, E P [Nebraska Univ., Lincoln, NE (United States)

    1995-08-01

    Mercury, a known neurotoxin, has been implicated in the etiology and pathogenesis of disease states such as Alzheimer's and Parkinson's diseases. There is concern that exposure to mercury vapor released from dental amalgam restorations is a potential health hazard. Measurement of mercury concentrations in blood or urine may be useful in the diagnosis of mercury poisoning and in assessing the extent of exposure. Because the determination of mercury can be complicated by losses via volatilization and/or permeation through containers, this study describes the optimization of pre-neutron-activation-analysis procedures such as sampling, selection of irradiation and counting vials, and acid digestion, in order to minimize such losses. In the optimized procedure, 20 mL of urine was spiked with three different concentrations of mercury, digested with concentrated nitric acid, and placed in polypropylene vials for irradiation and counting. Analysis was performed by subtracting the Se-75 photopeak contribution to the 279 keV Hg-203 photopeak and applying the method of standard additions. Urinary mercury concentrations in normal human subjects were determined to be of the order of 10 ng/mL. (author). 22 refs., 1 fig., 5 tabs.
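The method-of-standard-additions step can be sketched numerically: aliquots are spiked with known added concentrations, the instrument response is fitted linearly against the added amount, and the unknown concentration is read off as the magnitude of the x-intercept (intercept/slope). The numbers below are an invented, idealised illustration, not data from the study.

```python
import numpy as np

def standard_additions(added, signal):
    """Method of standard additions: linear fit of response vs. added
    concentration; the original concentration is intercept / slope."""
    slope, intercept = np.polyfit(added, signal, 1)
    return intercept / slope

# synthetic illustration: 10 ng/mL of Hg in urine, linear detector response
added = np.array([0.0, 10.0, 20.0, 40.0])     # ng/mL added to each aliquot
signal = 5.0 * (10.0 + added)                 # idealised net 279 keV peak counts
estimate = standard_additions(added, signal)  # ≈ 10 ng/mL
```

Standard additions is well suited here because it corrects for matrix effects of the urine itself, which a plain external calibration would miss.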

  12. Optimization of procedures for mercury-203 instrumental neutron activation analysis in human urine

    International Nuclear Information System (INIS)

    Blotcky, A.J.; Claassen, J.P.

    1995-01-01

    Mercury, a known neurotoxin, has been implicated in the etiology and pathogenesis of disease states such as Alzheimer's and Parkinson's diseases. There is concern that exposure to mercury vapor released from dental amalgam restorations is a potential health hazard. Measurement of mercury concentrations in blood or urine may be useful in the diagnosis of mercury poisoning and in assessing the extent of exposure. Because the determination of mercury can be complicated by losses via volatilization and/or permeation through containers, this study describes the optimization of pre-neutron-activation-analysis procedures such as sampling, selection of irradiation and counting vials, and acid digestion, in order to minimize such losses. In the optimized procedure, 20 mL of urine was spiked with three different concentrations of mercury, digested with concentrated nitric acid, and placed in polypropylene vials for irradiation and counting. Analysis was performed by subtracting the Se-75 photopeak contribution to the 279 keV Hg-203 photopeak and applying the method of standard additions. Urinary mercury concentrations in normal human subjects were determined to be of the order of 10 ng/mL. (author). 22 refs., 1 fig., 5 tabs

  13. Neutron activation analysis with k0-standardisation: general formalism and procedure

    Energy Technology Data Exchange (ETDEWEB)

    Pomme, S.; Hardeman, F. [Centre de l'Etude de l'Energie Nucleaire, Mol (Belgium); Robouch, P.; Etxebarria, N.; Arana, G. [European Commission, Joint Research Centre, Institute for Reference Materials and Measurements, Geel (Belgium)

    1997-09-01

    Instrumental neutron activation analysis (INAA) with k0-standardisation is a powerful tool for multi-element analysis at a broad range of trace element concentrations. An overview is given of the basic principles, fundamental equations, and general procedure of this method. Different aspects of the description of the neutron activation reaction rate are discussed, applying the Hogdahl convention. A general activation-decay formula is derived and its application to INAA is demonstrated. Relevant k0-definitions for different activation-decay schemes are summarised and upgraded to cases of extremely high fluxes. The main standardisation techniques for INAA are discussed, emphasizing the k0-standardisation. Some general aspects of the basic equipment and its calibration are discussed, such as the characterisation of the neutron field and the tuning of the spectrometry part. A method for the prediction and optimisation of the analytical performance of INAA is presented.

  14. Neutron activation analysis with k0-standardisation: general formalism and procedure

    International Nuclear Information System (INIS)

    Pomme, S.; Hardeman, F.; Robouch, P.; Etxebarria, N.; Arana, G.

    1997-09-01

    Instrumental neutron activation analysis (INAA) with k0-standardisation is a powerful tool for multi-element analysis at a broad range of trace element concentrations. An overview is given of the basic principles, fundamental equations, and general procedure of this method. Different aspects of the description of the neutron activation reaction rate are discussed, applying the Hogdahl convention. A general activation-decay formula is derived and its application to INAA is demonstrated. Relevant k0-definitions for different activation-decay schemes are summarised and upgraded to cases of extremely high fluxes. The main standardisation techniques for INAA are discussed, emphasizing the k0-standardisation. Some general aspects of the basic equipment and its calibration are discussed, such as the characterisation of the neutron field and the tuning of the spectrometry part. A method for the prediction and optimisation of the analytical performance of INAA is presented

  15. Headspace solid-phase microextraction procedures for gas chromatographic analysis of biological fluids and materials.

    Science.gov (United States)

    Mills, G A; Walker, V

    2000-12-01

    Solid-phase microextraction (SPME) is a new solventless sample preparation technique that is finding wide usage. This review provides updated information on headspace SPME with gas chromatographic separation for the extraction and measurement of volatile and semivolatile analytes in biological fluids and materials. Firstly the background to the technique is given in terms of apparatus, fibres used, extraction conditions and derivatisation procedures. Then the different matrices, urine, blood, faeces, breast milk, hair, breath and saliva are considered separately. For each, methods appropriate for the analysis of drugs and metabolites, solvents and chemicals, anaesthetics, pesticides, organometallics and endogenous compounds are reviewed and the main experimental conditions outlined with specific examples. Then finally, the future potential of SPME for the analysis of biological samples in terms of the development of new devices and fibre chemistries and its coupling with high-performance liquid chromatography is discussed.

  16. Regression analysis of nuclear plant capacity factors

    International Nuclear Information System (INIS)

    Stocks, K.J.; Faulkner, J.I.

    1980-07-01

    Operating data on all commercial nuclear power plants of the PWR, HWR, BWR and GCR types in the Western World are analysed statistically to determine whether the explanatory variables size, year of operation, vintage and reactor supplier are significant in accounting for the variation in capacity factor. The results are compared with a number of previous studies which analysed only United States reactors. The possibility of specification errors affecting the results is also examined. Although, in general, the variables considered are statistically significant, they explain only a small portion of the variation in the capacity factor. The equations thus obtained should certainly not be used to predict the lifetime performance of future large reactors

  17. The role of human error in risk analysis: Application to pre- and post-maintenance procedures of process facilities

    International Nuclear Information System (INIS)

    Noroozi, Alireza; Khakzad, Nima; Khan, Faisal; MacKinnon, Scott; Abbassi, Rouzbeh

    2013-01-01

    Human factors play an important role in the safe operation of a facility. Human factors include the systematic application of information about human characteristics and behavior to increase the safety of a process system. A significant proportion of human errors occur during the maintenance phase. However, the quantification of human error probabilities in the maintenance phase has not been given the amount of attention it deserves. This paper focuses on a human factors analysis in pre-and post- pump maintenance operations. The procedures for removing process equipment from service (pre-maintenance) and returning the equipment to service (post-maintenance) are considered for possible failure scenarios. For each scenario, human error probability is calculated for each activity using the Success Likelihood Index Method (SLIM). Consequences are also assessed in this methodology. The risk assessment is conducted for each component and the overall risk is estimated by adding individual risks. The present study is aimed at highlighting the importance of considering human error in quantitative risk analyses. The developed methodology has been applied to a case study of an offshore process facility
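
The SLIM calculation described in this abstract can be sketched as follows: a Success Likelihood Index (SLI) is formed as a weighted sum of performance shaping factor (PSF) ratings, and a log-linear relation calibrated against two anchor tasks converts it to a human error probability. The weights, ratings, and anchor values below are hypothetical illustrations, not values from the study:

```python
import math

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of PSF ratings (weights sum to 1)."""
    return sum(w * r for w, r in zip(weights, ratings))

def slim_calibrate(sli1, hep1, sli2, hep2):
    """Fit log10(HEP) = a * SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

def slim_hep(sli_value, a, b):
    """Convert an SLI back to a human error probability."""
    return 10 ** (a * sli_value + b)

# Hypothetical PSF weights and expert ratings (1-9 scale) for one
# pre-maintenance activity; the two anchor tasks are also invented.
weights = [0.3, 0.25, 0.25, 0.2]   # e.g. training, procedures, stress, time pressure
ratings = [7, 6, 4, 5]
a, b = slim_calibrate(sli1=9.0, hep1=1e-4, sli2=1.0, hep2=1e-1)
hep = slim_hep(sli(weights, ratings), a, b)
```

Summing such per-activity HEPs weighted by consequences gives the scenario risk estimates the abstract refers to.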

  18. An Empirical Analysis of Job Satisfaction Factors.

    Science.gov (United States)

    1987-09-01

    have acknowledged the importance of factors which make the Air Force attractive to its members or conversely, make other employees consider... Maslow's need hierarchy theory attempts to show that man has five basic categories of needs: physiological, safety, belongingness, esteem, and self... attained until lower-level basic needs are attained. This implies a sort of growth process where optional job environments for given employees are

  19. Simulation and analysis of data for enhancing low cycle fatigue test procedures

    Energy Technology Data Exchange (ETDEWEB)

    Sarajaervi, U.; Cronvall, O. [VTT Technical Research Centre of Finland (Finland)

    2006-04-15

    The simulation and analysis of data for enhancing low cycle fatigue test procedures are discussed in this report. The analysed materials are an austenitic stainless piping steel and an austenitic weld material. This project continues the work performed in 2003 and 2004. The fatigue test data treatment application developed within the project in 2004 for the preparation of the fatigue data has been developed further, and more fatigue test data have been analysed with the application than in 2004. In addition, numerical fatigue simulations were performed with the FEM code ABAQUS. With the fatigue test data treatment application one can, for example, calculate certain relevant characteristic values cycle by cycle (e.g. the elastic range) and form the set of cyclical parameter values needed as part of the ABAQUS analysis input files. The hardening properties of the metals were modelled with both isotropic and kinematic hardening models. The further development of the application included trimming of the analysed data, and consequently of the resulting hardening parameters. The need for trimming arose from the fact that the analysed fatigue test data present some scatter caused by the limited accuracy of the test equipment and the sampling rate. The hardening parameters obtained from the application were used in the subsequent ABAQUS analyses, and the fatigue test data were then compared with the ABAQUS simulation results. After finding a procedure to trim the result data to obtain smooth cyclic hardening curves, hardening and softening could be reproduced in the ABAQUS analyses with reasonable accuracy. The modelling of fatigue-induced crack initiation and growth was not considered in this study. On the other hand, a considerable part of the fatigue life of nuclear power plant (NPP) piping components is spent in the phase preceding the initiation and growth of cracks. (au)

  20. Simulation and analysis of data for enhancing low cycle fatigue test procedures

    International Nuclear Information System (INIS)

    Sarajaervi, U.; Cronvall, O.

    2006-04-01

    The simulation and analysis of data for enhancing low cycle fatigue test procedures are discussed in this report. The analysed materials are an austenitic stainless piping steel and an austenitic weld material. This project continues the work performed in 2003 and 2004. The fatigue test data treatment application developed within the project in 2004 for the preparation of the fatigue data has been developed further, and more fatigue test data have been analysed with the application than in 2004. In addition, numerical fatigue simulations were performed with the FEM code ABAQUS. With the fatigue test data treatment application one can, for example, calculate certain relevant characteristic values cycle by cycle (e.g. the elastic range) and form the set of cyclical parameter values needed as part of the ABAQUS analysis input files. The hardening properties of the metals were modelled with both isotropic and kinematic hardening models. The further development of the application included trimming of the analysed data, and consequently of the resulting hardening parameters. The need for trimming arose from the fact that the analysed fatigue test data present some scatter caused by the limited accuracy of the test equipment and the sampling rate. The hardening parameters obtained from the application were used in the subsequent ABAQUS analyses, and the fatigue test data were then compared with the ABAQUS simulation results. After finding a procedure to trim the result data to obtain smooth cyclic hardening curves, hardening and softening could be reproduced in the ABAQUS analyses with reasonable accuracy. The modelling of fatigue-induced crack initiation and growth was not considered in this study. On the other hand, a considerable part of the fatigue life of nuclear power plant (NPP) piping components is spent in the phase preceding the initiation and growth of cracks. (au)

  1. Analysis of the Impact of Transparency, Corruption, Openness in Competition and Tender Procedures on Public Procurement in the Czech Republic

    Directory of Open Access Journals (Sweden)

    František Ochrana

    2014-01-01

    Full Text Available This study analyses the impact of transparency and openness to competition in public procurement in the Czech Republic. The problems of the Czech procurement market are demonstrated through the analysis of a sample of contracts awarded by local government entities. From among the set of factors influencing the efficiency of public procurement, we closely analyse transparency, resilience against corruption, openness, effective administration of the award procedure, and the formulation of appropriate evaluation criteria for selecting the most suitable bid. Some assumptions were confirmed, including a positive effect of open procedures on the level of competition on the supply side, as well as the dominant use of price-only criteria. The latter is probably often caused by the low skills of workers at the contracting entities, as well as the lack of resources in public budgets. However, we have to reject the persistent legend of "undershooting" tender prices and subsequently increasing the final prices of public contracts: increases of final prices are very limited. Based on the results of the analyses presented, we argue that the main problem of the Czech public procurement market lies in the rather low competence of administrators, who are not able to use non-price criteria more often.

  2. EFFICIENCY OF MOMENT AMPLIFICATION PROCEDURES FOR THE SECOND-ORDER ANALYSIS OF STEEL FRAMES

    Directory of Open Access Journals (Sweden)

    Paschal Chiadighikaobi

    2018-02-01

    Full Text Available A beam-column is a member subjected to combined axial compression and bending. The secondary moment, an additional moment induced by the axial load, was accounted for in the design. Analysis results from two computer-aided tools (SAP2000 and a Java implementation) were compared. The moment amplification factor Af was input into the Java code; Af did not change the Java results. There are many different ways to apply amplification factors to first-order analysis results, each with its own range of applicability. The results shown in this paper compare the moment diagrams, axial forces, and shear forces. The steel used in the design and analysis is ASTM A992.
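
For readers unfamiliar with moment amplification, here is a minimal sketch of one common amplifier, the AISC-style B1 factor that scales first-order member moments for second-order (P-delta) effects. The member properties and loads below are hypothetical, chosen only to exercise the formula:

```python
import math

def euler_buckling_load(e_mod, inertia, k_factor, length):
    """Elastic critical (Euler) load: Pe = pi^2 * E * I / (K * L)^2."""
    return math.pi ** 2 * e_mod * inertia / (k_factor * length) ** 2

def amplification_factor_b1(cm, p_required, p_euler):
    """AISC-style B1 amplifier for second-order member effects:
    B1 = Cm / (1 - Pr / Pe1), taken as no less than 1.0."""
    return max(cm / (1.0 - p_required / p_euler), 1.0)

# Hypothetical member, consistent kip/inch units.
pe = euler_buckling_load(e_mod=29000.0, inertia=510.0, k_factor=1.0, length=180.0)
b1 = amplification_factor_b1(cm=0.85, p_required=2000.0, p_euler=pe)
amplified_moment = b1 * 120.0  # amplify a 120 kip-in first-order moment
```

As the axial load approaches the Euler load the denominator shrinks and B1 grows, which is exactly the second-order effect a first-order analysis alone misses.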

  3. A Factor Analysis of the BSRI and the PAQ.

    Science.gov (United States)

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  4. Analysis and optimization of the TWINKLE factoring device

    NARCIS (Netherlands)

    Lenstra, A.K.; Shamir, A.; Preneel, B.

    2000-01-01

    We describe an enhanced version of the TWINKLE factoring device and analyse to what extent it can be expected to speed up the sieving step of the Quadratic Sieve and Number Field Sieve factoring algorithms. The bottom line of our analysis is that the TWINKLE-assisted factorization of 768-bit

  5. Factors associated with patient-reported procedural memory following emergency department procedural sedation with ketamine and propofol: A prospective cohort of 563 patients.

    Science.gov (United States)

    Greer, Andrew; Treston, Greg

    2018-04-01

    To describe the proportion of patients reporting procedural memory following procedural sedation and analgesia (PSA) with ketamine and propofol (KP) administered premixed together (ketofol) or individually (sequential KP) in ED attendees. Identify any clinical or demographic variables associated with procedural memory. This was a convenience sample of 563 patients who received KP PSA as per the departmental protocol. A standardised script was used to assess for procedural memory. This was categorised as 'any' and 'unpleasant' prior to discharge (immediate memory) and at telephone follow up (delayed memory). A total of 318 patients had sequential KP and 249 premixed 1:1 ketofol. For sequential KP compared to ketofol, the proportion reporting any memory was as follows: 3.5% versus 3.3% immediate, 4.4% versus 5.5% delayed and 5.4% versus 7.4% for the sum of these. For unpleasant memory, the proportion was as follows: 1.6% versus 2.9% immediate, 1.7% versus 4.7% delayed and 2.2% versus 6.9% all unpleasant memory (odds ratio [OR] 3.3, 95% confidence interval [CI] 1.4-8.1). Memory was associated with male sex (OR 4, 95% CI 1.5-10.5), opiates (OR 3, 95% CI 1.7-7.5), a Wisconsin Sedation Scale score ≥3 (moderate sedation) (OR 4.3, 95% CI 1.1-18.2) and propofol dose 0.75 mg/kg (13% versus 3%) (OR 6, 95% CI 1.7-21). The ketofol group had 5% (95% CI 0.1-10) more respiratory events requiring intervention. Procedural memory was uncommon for both mix types; however, a greater proportion of the premixed ketofol group had unpleasant memory. Associations with sex, opiates, moderate sedation and propofol dose were identified, and respiratory adverse events were more common in the premixed ketofol group. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  6. Development of a procedure for qualitative and quantitative evaluation of human factors as a part of probabilistic safety assessments of nuclear power plants. Part B. Technical documentation

    International Nuclear Information System (INIS)

    Richei, A.

    1998-01-01

    As international studies have shown, accidents in plants are increasingly caused by combinations of technical failures and human errors. Careful investigation of man-machine interactions to determine human reliability is therefore gaining importance worldwide. For nuclear power plants, such investigations are usually carried out within the scope of probabilistic safety assessments. A great number of procedures to evaluate human factors have been developed up to now. However, none of them is able to take into account the whole spectrum of requirements (for instance, transferability of data to other plants, analysis of weak points, and evaluation of cognitive tasks) for a complete and reliable probabilistic safety assessment. Based on an advanced model of a man-machine system, the Human Error Rate Assessment and Optimizing System (HEROS) and a corresponding expert system of the same name are introduced. This expert system enables the quantification of human error probabilities for plant operator actions on the one hand, and is capable of providing quantitative statements regarding the optimization of the man-machine system in terms of minimizing human error probability on the other. Three relevant evaluation levels, i.e. 'Management Structure', 'Working Environment' and 'Man-Machine Interface', are derived from a model of the man-machine system. Linguistic variables are assigned to all performance shaping factors at these levels and are used to establish a rule-based expert system whose knowledge bases are represented by rules. Processing of these rules is carried out by means of fuzzy set theory, after provision of the relevant data for a particular personnel action to be evaluated. This procedure enables a simple and effective use of ergonomic studies as the relevant database, which is also transferable to other plants of any design. The expert system consists in total of 16 rule bases in which all ascertainable and

  7. New non-cognitive procedures for medical applicant selection: a qualitative analysis in one school.

    Science.gov (United States)

    Katz, Sara; Vinker, Shlomo

    2014-11-07

    Recent data have called into question the reliability and predictive validity of standard admission procedures to medical schools. Eliciting non-cognitive attributes of medical school applicants using qualitative tools and methods has thus become a major challenge. 299 applicants aged 18-25 formed the research group. A set of six research tools was developed in addition to the two existing ones. These included: a portfolio task, an intuitive task, a cognitive task, a personal task, an open self-efficacy questionnaire and field notes. The criteria-based methodology design used constant comparative analysis and grounded theory techniques to produce a personal attributes profile per participant, scored on a 5-point holistic rubric. Qualitative validity of data gathering was checked by comparing the profiles elicited from the existing interview against the profiles elicited from the other tools, and by comparing the two profiles of each applicant who handed in two portfolio tasks. Qualitative validity of data analysis was checked by comparing researcher results with those of an external rater (n = 10). Differences between aggregated profile groups were checked by the Npar Wilcoxon Signed Ranks Test and by the Spearman Rank Order Correlation Test. All subjects gave written informed consent to their participation. Privacy was protected by using code numbers. A concept map of 12 personal attributes emerged, the core constructs of which were motivation, sociability and cognition. A personal profile was elicited. Inter-rater agreement was 83.3%. Differences between groups by aggregated profiles were found to be significant (p < .05, p < .01, p < .001). A random sample of sixth year students (n = 12) underwent the same admission procedure as the research group. Rank order was different, and arrogance was a new construct elicited in the sixth year group. This study suggests a broadening of the methodology for selecting medical school applicants. This methodology

  8. Modification and analysis of engineering hot spot factor of HFETR

    International Nuclear Information System (INIS)

    Hu Yuechun; Deng Caiyu; Li Haitao; Xu Taozhong; Mo Zhengyu

    2014-01-01

    This paper presents the modification and analysis of the engineering hot spot factors of HFETR. The new factors are applied in the fuel temperature analysis and in estimating the safety allowable operating power of HFETR. The results show that the maximum fuel cladding temperature is lower when the new factors are used and the safety allowable operating power of HFETR is higher, thus improving the economic efficiency of HFETR. (authors)

  9. A replication of a factor analysis of motivations for trapping

    Science.gov (United States)

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using a recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.

  10. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta

    2006-01-01

    OBJECTIVE: To test a method for improving the selection of indicators of general practitioners' prescribing. METHODS: We conducted a prescription database study including all 180 general practices in the County of Funen, Denmark, approximately 472,000 inhabitants. Principal factor analysis was used ... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns.
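
The general approach (finding latent factors behind correlated indicators) can be sketched with scikit-learn. The data and the two-factor structure below are synthetic stand-ins for illustration, not the Funen prescription data:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic practice-level data: 180 practices, 6 prescribing indicators.
# Indicators 0-2 load on one latent "prescribing style", 3-5 on another
# (a hypothetical structure, chosen only to make the factors recoverable).
latent = rng.normal(size=(180, 2))
loadings = np.array([[1.0, 0.0], [0.9, 0.0], [0.8, 0.1],
                     [0.0, 1.0], [0.1, 0.9], [0.0, 0.8]])
indicators = latent @ loadings.T + 0.3 * rng.normal(size=(180, 6))

fa = FactorAnalysis(n_components=2, random_state=0).fit(indicators)

# Indicators with large loadings on the same factor co-vary across practices
# and are candidates for selection as a coherent prescribing indicator set.
print(np.round(fa.components_, 2))
```

Inspecting `fa.components_` plays the role the abstract describes: indicators clustering on the first factor point to a shared dimension of appropriate versus inappropriate prescribing.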

  11. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    Science.gov (United States)

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. Preoperative anemia was significantly associated with the
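
The contrast between complete case analysis and imputation can be sketched as follows. This uses scikit-learn's IterativeImputer as a simplified stand-in for a full multiple-imputation workflow, and the cohort is entirely synthetic (not NSQIP data):

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(1)

# Synthetic cohort: columns = [age, albumin, hematocrit]; albumin is
# missing for roughly 60% of rows, mimicking the pattern reported above.
n = 500
age = rng.normal(55, 12, n)
albumin = 4.0 - 0.01 * (age - 55) + rng.normal(0, 0.3, n)
hematocrit = 42 + 2 * (albumin - 4.0) + rng.normal(0, 2, n)
X = np.column_stack([age, albumin, hematocrit])
X_missing = X.copy()
drop = rng.random(n) < 0.6
X_missing[drop, 1] = np.nan

# Complete case analysis would discard every row in `drop`; imputation
# instead fills albumin in from the observed columns and keeps all rows.
X_imputed = IterativeImputer(random_state=0, max_iter=10).fit_transform(X_missing)
n_complete_cases = int((~drop).sum())
```

True multiple imputation would repeat this with `sample_posterior=True` to draw several imputed datasets, fit the outcome model on each, and pool the estimates via Rubin's rules; the single-imputation sketch above shows only the sample-size-preservation step.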

  12. A “Cookbook” Cost Analysis Procedure for Medical Information Systems*

    Science.gov (United States)

    Torrance, Janice L.; Torrance, George W.; Covvey, H. Dominic

    1983-01-01

    A costing procedure for medical information systems is described. The procedure incorporates state-of-the-art costing methods in an easy to follow “cookbook” format. Application of the procedure consists of filling out a series of Mac-Tor EZ-Cost forms. The procedure and forms have been field tested by application to a cardiovascular database system. This article describes the major features of the costing procedure. The forms and other details are available upon request.

  13. Human factor analysis and preventive countermeasures in nuclear power plant

    International Nuclear Information System (INIS)

    Li Ye

    2010-01-01

    Based on human error analysis theory and the characteristics of maintenance in a nuclear power plant, the human factors of maintenance in NPPs are divided into three areas: human, technology, and organization. Individual factors include psychological factors, physiological characteristics, health status, level of knowledge, and interpersonal skills. Technical factors include technology, equipment, tools, working order, etc. Organizational factors include management, information exchange, education, working environment, team building, leadership management, etc. The analysis found that organizational factors can directly or indirectly affect the behavior of staff and the technical factors, and are the most fundamental source of human error. Preventive countermeasures for reducing human error in nuclear power plants are proposed on this basis. (authors)

  14. Cement Leakage in Percutaneous Vertebral Augmentation for Osteoporotic Vertebral Compression Fractures: Analysis of Risk Factors.

    Science.gov (United States)

    Xie, Weixing; Jin, Daxiang; Ma, Hui; Ding, Jinyong; Xu, Jixi; Zhang, Shuncong; Liang, De

    2016-05-01

    The risk factors for cement leakage were retrospectively reviewed in 192 patients who underwent percutaneous vertebral augmentation (PVA). To discuss the factors related to cement leakage in the PVA procedure for the treatment of osteoporotic vertebral compression fractures. PVA is widely applied for the treatment of osteoporotic vertebral fractures. Cement leakage is a major complication of this procedure, and its risk factors remain controversial. A retrospective review of 192 patients who underwent PVA was conducted. The following data were recorded: age, sex, bone density, number of fractured vertebrae before surgery, number of treated vertebrae, severity of the treated vertebrae, operative approach, volume of injected bone cement, preoperative vertebral compression ratio, preoperative local kyphosis angle, intraosseous clefts, preoperative vertebral cortical bone defect, and ratio and type of cement leakage. To study the correlation between each factor and cement leakage ratio, bivariate regression analysis was employed for the univariate analysis, and multivariate linear regression analysis was employed for the multivariate analysis. The study included 192 patients (282 treated vertebrae), and cement leakage occurred in 100 vertebrae (35.46%). Vertebrae with preoperative cortical bone defects generally exhibited a higher cement leakage ratio, and the leakage was typically type C. Vertebrae with intact cortical bone before the procedure tended to experience type S leakage. Univariate analysis showed that patient age, bone density, number of fractured vertebrae before surgery, and vertebral cortical bone defect were associated with cement leakage ratio. In the multivariate analysis, the independent factors associated with cement leakage were bone density and vertebral cortical bone defect, with standardized partial regression coefficients of -0.085 and 0.144, respectively. High bone density and vertebral cortical bone defect are independent risk factors associated with bone cement leakage.

  15. Cost-consequence analysis of different active flowable hemostatic matrices in cardiac surgical procedures.

    Science.gov (United States)

    Makhija, D; Rock, M; Xiong, Y; Epstein, J D; Arnold, M R; Lattouf, O M; Calcaterra, D

    2017-06-01

    A recent retrospective comparative effectiveness study found that use of the FLOSEAL Hemostatic Matrix in cardiac surgery was associated with significantly lower risks of complications, blood transfusions, surgical revisions, and shorter length of surgery than use of SURGIFLO Hemostatic Matrix. These outcome improvements in cardiac surgery procedures may translate to economic savings for hospitals and payers. The objective of this study was to estimate the cost-consequence of two flowable hemostatic matrices (FLOSEAL or SURGIFLO) in cardiac surgeries for US hospitals. A cost-consequence model was constructed using clinical outcomes from a previously published retrospective comparative effectiveness study of FLOSEAL vs SURGIFLO in adult cardiac surgeries. The model accounted for the reported differences between these products in length of surgery, rates of major and minor complications, surgical revisions, and blood product transfusions. Costs were derived from the Healthcare Cost and Utilization Project's National Inpatient Sample (NIS) 2012 database and converted to 2015 US dollars. Savings were modeled for a hospital performing 245 cardiac surgeries annually, identified as the average for hospitals in the NIS dataset. One-way sensitivity analysis and probabilistic sensitivity analysis were performed to test model robustness. The results suggest that if FLOSEAL is utilized in a hospital that performs 245 mixed cardiac surgery procedures annually, 11 major complications, 31 minor complications, nine surgical revisions, 79 blood product transfusions, and 260.3 h of cumulative operating time could be avoided. These improved outcomes correspond to a net annualized saving of $1,532,896. Cost savings remained consistent between $1.3m and $1.8m and between $911k and $2.4m, even after accounting for the uncertainty around clinical and cost inputs, in a one-way and probabilistic sensitivity analysis, respectively. Outcome differences associated with FLOSEAL vs SURGIFLO

  16. Clinical Outcomes of Root Reimplantation and Bentall Procedure: Propensity Score Matching Analysis.

    Science.gov (United States)

    Lee, Heemoon; Cho, Yang Hyun; Sung, Kiick; Kim, Wook Sung; Park, Kay-Hyun; Jeong, Dong Seop; Park, Pyo Won; Lee, Young Tak

    2018-03-26

    This study aimed to evaluate the clinical outcomes of aortic root replacement (ARR) surgery: root reimplantation as valve-sparing root replacement (VSR) and the Bentall procedure. We retrospectively reviewed 216 patients who underwent ARR between 1995 and 2013 at Samsung Medical Center. Patients were divided into two groups depending on the procedure they underwent: Bentall (n=134) and VSR (n=82). The mean follow-up duration was 100.9±56.4 months. There were 2 early deaths in the Bentall group and none in the VSR group (p=0.53). Early morbidities did not differ between the groups. Overall mortality was significantly lower in the VSR group (HR=0.12, p=0.04). Despite the higher reoperation rate in the VSR group (p=0.03), major adverse valve-related events (MAVRE) did not differ between the groups (p=0.28). Bleeding events were significantly more frequent in the Bentall group during follow-up (10 in the Bentall group, 0 in the VSR group, p=0.04). There were 6 thromboembolic events, all in the Bentall group (p=0.11). We performed a propensity score matching analysis comparing the groups (134 Bentall vs 43 VSR). The matched analysis gave similar results, i.e. HR=0.17 and p=0.10 for overall mortality and HR=1.01 and p=0.99 for MAVRE. Although there was only marginal significance in the propensity-matched analysis, it is plausible to anticipate a survival benefit with VSR during long-term follow-up. Despite a higher reoperation rate for aortic valves, VSR can be a viable option in patients who decline life-long anticoagulation, especially the young or patients in whom anticoagulation is contraindicated. Copyright © 2018. Published by Elsevier Inc.
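
A minimal sketch of the propensity-score-matching idea used in such comparisons: fit a logistic model for treatment assignment, then greedily pair each treated patient with the nearest control on the estimated score. The covariates and treatment mechanism below are simulated for illustration, not the Samsung Medical Center data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Simulated covariates: younger patients are more likely to receive the
# valve-sparing repair (treated=1) than the composite graft (treated=0).
n = 300
age = rng.normal(55, 10, n)
lvef = rng.normal(55, 8, n)
p_treat = 1.0 / (1.0 + np.exp(0.1 * (age - 50)))
treated = (rng.random(n) < p_treat).astype(int)
X = np.column_stack([age, lvef])

# Step 1: propensity score = estimated P(treatment | covariates).
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: greedy 1:1 nearest-neighbour matching, without replacement.
controls = list(np.where(treated == 0)[0])
pairs = []
for i in np.where(treated == 1)[0]:
    j = min(controls, key=lambda k: abs(ps[i] - ps[k]))
    pairs.append((i, j))
    controls.remove(j)

# Covariate balance should improve after matching.
matched_t = [i for i, _ in pairs]
matched_c = [j for _, j in pairs]
gap_before = abs(age[treated == 1].mean() - age[treated == 0].mean())
gap_after = abs(age[matched_t].mean() - age[matched_c].mean())
```

Outcomes are then compared within the matched pairs, which is what lets a non-randomized comparison like Bentall vs VSR approximate like-for-like groups.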

  17. Processes and Procedures for Application of CFD to Nuclear Reactor Safety Analysis

    International Nuclear Information System (INIS)

    Richard W. Johnson; Richard R. Schultz; Patrick J. Roache; Ismail B. Celik; William D. Pointer; Yassin A. Hassan

    2006-01-01

    Traditionally, nuclear reactor safety analysis has been performed using systems analysis codes such as RELAP5, which was developed at the INL. However, goals established by the Generation IV program, especially the desire to increase efficiency, have led to an increase in operating temperatures for the reactors. This increase pushes reactor materials toward their upper temperature limits with respect to structural integrity. Because there will be some finite variation of the power density in the reactor core, there is a potential for local hot spots to occur in the reactor vessel. Hence, it has become apparent that detailed analysis will be required to ensure that local hot spots do not exceed safety limits. It is generally accepted that computational fluid dynamics (CFD) codes are intrinsically capable of simulating fluid dynamics and heat transport locally because they are based on first principles. Indeed, CFD analysis has reached a fairly mature level of development, including at the commercial level. However, CFD experts are aware that even though commercial codes are capable of simulating local fluid and thermal physics, great care must be taken in their application to avoid errors caused by such things as inappropriate grid meshing, low-order discretization schemes, lack of iterative convergence and inaccurate time-stepping. Just as important is the choice of a turbulence model for turbulent flow simulation. Turbulence models represent the effects of turbulent transport of mass, momentum and energy, but are not necessarily applicable over wide ranges of flow types. Therefore, there is a well-recognized need to establish practices and procedures for the proper application of CFD to simulate flow physics accurately and to establish the level of uncertainty of such computations. The present document represents contributions of CFD experts on what the basic practices, procedures and guidelines should be to help CFD analysts obtain accurate estimates
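A common way to quantify the grid-related numerical uncertainty this entry calls for is Roache's grid convergence index (GCI). The following is a minimal pure-Python sketch; the solution values and refinement ratio are hypothetical illustrations, not taken from the report.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from solutions on three grids
    with a constant refinement ratio r (Richardson extrapolation)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, fs=1.25):
    """Grid convergence index on the fine grid; fs is Roache's safety factor."""
    e = abs((f_medium - f_fine) / f_fine)  # relative change between the two finest grids
    return fs * e / (r ** p - 1.0)

# Hypothetical integral quantity (e.g. a normalized peak temperature) on
# coarse, medium and fine grids, each refined by a factor of 2
f3, f2, f1 = 0.9720, 0.9620, 0.9580
r = 2.0
p = observed_order(f3, f2, f1, r)   # observed order of accuracy
gci = gci_fine(f2, f1, r, p)        # relative uncertainty band on the fine grid
```

A small GCI (here a fraction of a percent) indicates that the fine-grid solution is in the asymptotic convergence range; a large one signals that further refinement is needed before local hot-spot predictions can be trusted.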

  18. Application procedures and analysis examples of the SIE ASME-NH program

    International Nuclear Information System (INIS)

    Kim, Seok Hoon; Koo, G. H.; Kim, J. B.

    2010-12-01

    In this report, the design rules of the ASME-NH Code were briefly summarized, the application procedures of the SIE ASME-NH program were analysed, and analysis examples were described. The SIE ASME-NH program was developed according to the ASME Code Section III Subsection NH rules to perform the primary stress limit, accumulated inelastic strain limit and creep-fatigue damage evaluations in the structural design of nuclear power plant components operating at high temperatures, above the creep regime, under normal operating conditions. Among the analysis examples, the benchmark problem for the high temperature reactor vessel which was discussed in the SIE ASME-NH user's seminar was described, together with a preliminary structural analysis of an Advanced Burner Test Reactor internal structure. Considering the load combinations of the various cycle types arising from significant operating conditions, the integrity of the reactor internal structure was reviewed against the stress and strain limits of the ASME-NH rules, and the analysis and evaluation results were summarized
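The creep-fatigue evaluation that such a program automates combines a Miner-type fatigue sum with a time-fraction creep sum and checks the pair against a bilinear damage envelope. Below is a minimal sketch under stated assumptions: the cycle counts are hypothetical, and the envelope intersection point (x0, y0) is material-dependent in ASME-NH, so the default used here is only illustrative.

```python
def fatigue_damage(cycles, allowable_cycles):
    """Miner-rule fatigue damage: sum of n/Nd over cycle types."""
    return sum(n / N for n, N in zip(cycles, allowable_cycles))

def creep_damage(hold_times, rupture_times):
    """Time-fraction creep damage: sum of t/Td over hold periods."""
    return sum(t / T for t, T in zip(hold_times, rupture_times))

def within_envelope(df, dc, x0=0.3, y0=0.3):
    """Check (df, dc) against a bilinear creep-fatigue damage envelope
    running from (1, 0) through the intersection (x0, y0) to (0, 1)."""
    if df >= x0:
        limit = y0 * (1.0 - df) / (1.0 - x0)
    else:
        limit = 1.0 - (1.0 - y0) * df / x0
    return dc <= limit

# Hypothetical duty cycle: two fatigue event types, one creep hold
df = fatigue_damage([200, 50], [10000, 2000])   # 0.045
dc = creep_damage([3000.0], [50000.0])          # 0.06
ok = within_envelope(df, dc)
```

The design point (df, dc) must fall on or below the envelope; points above it fail the creep-fatigue criterion.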

  19. ANALYSIS OF RISK FACTORS ECTOPIC PREGNANCY

    Directory of Open Access Journals (Sweden)

    Budi Santoso

    2017-04-01

    Introduction: Ectopic pregnancy is a pregnancy with extrauterine implantation. This situation is a gynecologic emergency that contributes to maternal mortality. Therefore, early recognition, based on identification of the risk factors for ectopic pregnancy, is needed. Methods: The design was descriptive observational. The samples were pregnant women who had ectopic pregnancy at the Maternity Room, Emergency Unit, Dr. Soetomo Hospital, Surabaya, from 1 July 2008 to 1 July 2010. The sampling technique was total sampling using medical records. Result: Patients with ectopic pregnancy numbered 99 individuals out of 2090 pregnant women who sought treatment in Dr. Soetomo Hospital. However, only 29 patients had traceable risk factors. Discussion: Most ectopic pregnancies were in the age group of 26-30 years, comprising 32 patients (32.32%), followed by the age group of 31-35 years with 25 patients (25.25%), 18 patients in the age group 21-25 years (18.18%), 17 patients in the age group 36-40 years (17.17%), 4 patients in the age group of 41 years and over (4.04%), and the fewest in the age group of 16-20 years with 3 patients (3.03%). A total of 12 patients with ectopic pregnancy (41.38%) had a history of abortion, and 6 patients (20.69%) each were found in the group who used family planning and in the group who used family planning and also had a history of abortion, while 2 patients (6.90%) had both a history of surgery and a history of abortion. The incidence rate of ectopic pregnancy was 4.73%, mostly in the second gravidity (34.34%), whereas nulliparous women had the highest prevalence of 39.39%. The acquired risk factors were: history of operations 10.34%, family planning 20.69%, history of abortion 41.38%, history of abortion and operation 6.90%, and family planning with history of abortion 20.69%.

  20. Investigating product development strategy in beverage industry using factor analysis

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

    Selecting a product development strategy that is associated with the company's current service or product innovation, based on customers' needs and a changing environment, plays an important role in increasing demand, market share, sales and profits. Therefore, it is important to extract the effective variables associated with product development to improve the performance measurement of firms. This paper investigates important factors influencing product development strategies using factor analysis. The proposed model investigates 36 variables and, using factor analysis, extracts the six most influential factors: partnership, intelligence information, introducing strategy, differentiation, research and development strategy and market survey. The first factor, partnership, includes sub-factors comprising product development partnership, partnership with foreign firms, customers' perception of competitors' products, customer involvement in product development, inter-agency coordination, a customer-oriented approach to innovation and transmission of product development change, where inter-agency coordination is considered the most important. Internal strengths are the most influential component of the second factor, intelligence information. The third factor, introducing strategy, includes four sub-criteria, of which consumer buying behavior is the most influential. Differentiation is the next important factor, with five components, where knowledge and expertise in product innovation is the most important one. In the research and development strategy factor, with four sub-criteria, reducing the product development cycle is the most influential component. Finally, market survey is the last factor, with three components, where finding new markets plays the most important role.

  1. Fatigue Analysis of Tubesheet/Shell Juncture Applying the Mitigation Factor for Over-conservatism

    International Nuclear Information System (INIS)

    Kang, Deog Ji; Kim, Kyu Hyoung; Lee, Jae Gon

    2009-01-01

    If the environmental fatigue requirements are applied to the primary components of a nuclear power plant, to which the present ASME Code fatigue curves are applied, some locations with a high CUF (Cumulative Usage Factor) are anticipated not to meet the code criteria. The application of environmental fatigue damage is still particularly controversial for plants with 60-year design lives. Therefore, it is necessary to develop a detailed fatigue analysis procedure to identify the conservatisms in the procedure and to lower the cumulative usage factor. Several factors are being considered to mitigate the conservatism, such as three-dimensional finite element modeling. In the present analysis, actual pressure transient data, instead of conservative maximum and minimum pressure data, were applied as one of the mitigation factors. Unlike in the general method, individual transient events were considered instead of grouped transient events. The tubesheet/shell juncture in the steam generator assembly is one of the weak locations and was therefore selected as a target to evaluate the mitigation factor in the present analysis
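The environmental fatigue penalty discussed here is commonly applied per transient: each partial usage factor n/N is multiplied by an environmental correction factor Fen before summation (the approach of NUREG/CR-6909). A minimal sketch with hypothetical usage factors and Fen multipliers, not values from this analysis:

```python
def cuf(partial_usages):
    """Plain cumulative usage factor: sum of per-transient n/N values."""
    return sum(partial_usages)

def cuf_environmental(partial_usages, fen_factors):
    """Environmentally assisted CUF: each partial usage factor is
    multiplied by its transient-specific Fen before summation."""
    return sum(u * f for u, f in zip(partial_usages, fen_factors))

usages = [0.05, 0.12, 0.03]   # hypothetical per-transient n/N values
fens   = [2.5, 4.0, 1.8]      # hypothetical Fen multipliers
base = cuf(usages)                       # in-air CUF
env  = cuf_environmental(usages, fens)   # environmentally adjusted CUF
```

The example shows why conservatism matters: an in-air CUF of 0.20 can exceed half the allowable of 1.0 once the Fen penalties are applied, so removing conservatism from the underlying transient grouping directly lowers the environmentally adjusted result.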

  2. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Møller, Stig Vinther; Bork, Lasse

    2017-01-01

    We examine U.S. housing price forecastability using principal component analysis (PCA), partial least squares (PLS), and sparse PLS (SPLS). We incorporate information from a large panel of 128 economic time series and show that macroeconomic fundamentals have strong predictive power for future movements in housing prices. We find that (S)PLS models systematically dominate PCA models. (S)PLS models also generate significant out-of-sample predictive power over and above the predictive power contained by the price-rent ratio, autoregressive benchmarks, and regression models based on small datasets.
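Out-of-sample predictive power of the kind reported here is commonly summarized by an out-of-sample R² that compares a model's forecast errors with those of a benchmark. A small pure-Python sketch with made-up forecast series (not the paper's data):

```python
def r2_os(actual, model_forecasts, benchmark_forecasts):
    """Out-of-sample R-squared: positive when the model's squared
    forecast errors are smaller than the benchmark's."""
    sse_model = sum((a - m) ** 2 for a, m in zip(actual, model_forecasts))
    sse_bench = sum((a - b) ** 2 for a, b in zip(actual, benchmark_forecasts))
    return 1.0 - sse_model / sse_bench

# Hypothetical quarterly housing returns, model forecasts, and a
# constant-mean benchmark forecast
y     = [0.010, -0.005, 0.020, 0.015]
m_hat = [0.008, -0.002, 0.018, 0.012]
b_hat = [0.012, 0.012, 0.012, 0.012]
score = r2_os(y, m_hat, b_hat)
```

A score above zero means the model beats the benchmark out of sample; statements like "(S)PLS dominates PCA" amount to comparing such scores across model classes.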

  3. System Requirements Analysis for a Computer-based Procedure in a Research Reactor Facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaek Wan; Jang, Gwi Sook; Seo, Sang Moon; Shin, Sung Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Operation support tools are under consideration for application to research reactors. In particular, as part of a full digitalization of the main control room, the application of a computer-based procedure (CBP) system has been required as a part of the man-machine interface system, because it has an impact on the operating staffing and human errors of a research reactor. A well-made CBP can address many of the routine problems related to human error in the use of conventional, hard-copy operating procedures, address the staffing issues of a research reactor, and reduce human errors by minimizing the operator's routine tasks. However, a CBP for a research reactor has not been proposed yet. CBPs developed for nuclear power plants have powerful and varied technical functions to cover complicated plant operation situations, but many of these functions may not be required for a research reactor. Thus, it is not reasonable to apply such a CBP to a research reactor directly, and customizing it is not cost-effective. Therefore, a compact CBP should be developed for a research reactor. To establish computer-based procedure system requirements for a research reactor, this paper addressed international standards and previous practices in nuclear plants, and it introduces the high-level requirements derived from the system requirements analysis activity as the first stage of system implementation.

  4. Confidence ellipses: A variation based on parametric bootstrapping applicable on Multiple Factor Analysis results for rapid graphical evaluation

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender L. P.

    2012-01-01

    A new way of parametric bootstrapping allows the construction of confidence ellipses applicable to all results from Multiple Factor Analysis obtained from the FactoMineR package in the statistical program R. With this procedure, a similar approach can be applied to Multiple Factor Analysis results obtained in different studies performed on the same set of products. In addition, the graphical display of confidence ellipses eases interpretation and communication of results.

  5. Item-level factor analysis of the Self-Efficacy Scale.

    Science.gov (United States)

    Bunketorp Käll, Lina

    2014-03-01

    This study explores the internal structure of the Self-Efficacy Scale (SES) using item response analysis. The SES was previously translated into Swedish and modified to encompass all types of pain, not exclusively back pain. Data on perceived self-efficacy in 47 patients with subacute whiplash-associated disorders were derived from a previously conducted randomized-controlled trial. The item-level factor analysis was carried out using a six-step procedure. To further study the item inter-relationships and to determine the underlying structure empirically, the 20 items of the SES were also subjected to principal component analysis with varimax rotation. The analyses showed two underlying factors, named 'social activities' and 'physical activities', with seven items loading on each factor. The remaining six items of the SES appeared to measure somewhat different constructs and need to be analysed further.
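The varimax rotation used in the principal component step above can be sketched as follows. The loading matrix is hypothetical (six items, two factors), and the implementation follows the standard SVD-based varimax algorithm; it is an illustration, not the SES analysis itself. numpy is assumed to be available.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a (p items x k factors) loading
    matrix, maximizing the variance of squared loadings per factor."""
    p, k = loadings.shape
    rotation = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0))))
        rotation = u @ vt
        d_new = s.sum()
        if d_new < d * (1 + tol):
            break
        d = d_new
    return loadings @ rotation

# Hypothetical two-factor loadings for six questionnaire items
L = np.array([[0.7, 0.3], [0.8, 0.2], [0.6, 0.4],
              [0.2, 0.7], [0.3, 0.8], [0.1, 0.6]])
L_rot = varimax(L)
```

Because the rotation is orthogonal, communalities (row sums of squared loadings) are unchanged; only the allocation of loading across factors is simplified, which is what makes the "seven items per factor" pattern reported above easier to read off.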

  6. Factors influencing selection for a day-case or 23-h stay procedure in transanal endoscopic microsurgery.

    Science.gov (United States)

    Ford, S J; Wheeler, J M D; Borley, N R

    2010-03-01

    Transanal endoscopic microsurgery (TEMS) is an alternative to radical resection of the rectum for benign lesions and early rectal cancer. This study aimed to identify whether day-case TEMS is safe and which factors dictate patient suitability and length of stay (LOS). Details of patients undergoing TEMS resection were retrieved from a tertiary referral prospective database. Of 96 patients, 46 (48 per cent) were day cases, 24 (25 per cent) had a 23-h stay and 26 (27 per cent) were inpatients. The frequency of day-case surgery increased significantly over the study interval (P = 0.050). Distance of the lesion from the anorectal junction, malignant potential and travel distance had no bearing on LOS. Older age (P = 0.004) and duration of surgery (P = 0.002) correlated significantly with increased LOS. Lesions covering one quadrant involved a significantly shorter stay than those covering two or more quadrants (P = 0.002). Maximum diameter (mean 5.7 cm) was strongly related to LOS (P = 0.009). Day-case and 23-h stay patients had a significantly higher proportion of lower-risk lesions (P = 0.001). High-volume day-case TEMS appears safe, even when long travel distances are involved. With advances in practice and procedural safety, traditional risk factors may not be as important as currently thought. (c) 2010 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.

  7. Factoring handedness data: I. Item analysis.

    Science.gov (United States)

    Messinger, H B; Messinger, M I

    1995-12-01

    Recently in this journal Peters and Murphy challenged the validity of factor analyses done on bimodal handedness data, suggesting instead that right- and left-handers be studied separately. But bimodality may be avoidable if attention is paid to Oldfield's questionnaire format and instructions for the subjects. Two characteristics appear crucial: a two-column LEFT-RIGHT format for the body of the instrument and what we call Oldfield's Admonition: not to indicate strong preference for a handedness item, such as "write", unless "... the preference is so strong that you would never try to use the other hand unless absolutely forced to...". Attaining unimodality of an item distribution would seem to overcome the objections of Peters and Murphy. In a 1984 survey in Boston we used Oldfield's ten-item questionnaire exactly as published. This produced unimodal item distributions. With reflection of the five-point item scale and a logarithmic transformation, we achieved a degree of normalization for the items. Two surveys elsewhere, based on Oldfield's 20-item list but with changes in the questionnaire format and the instructions, yielded markedly different item distributions, with peaks at each extreme and sometimes in the middle as well.

  8. Assessment of soil/structure interaction analysis procedures for nuclear power plant structures

    International Nuclear Information System (INIS)

    Young, G.A.; Wei, B.C.

    1977-01-01

    The paper presents an assessment of two state-of-the-art soil/structure interaction analysis procedures that are frequently used to provide seismic analyses of nuclear power plant structures. The advantages of large three-dimensional, elastic, discrete mass models and two-dimensional finite element models are compared. The discrete mass models can provide three-dimensional response capability with economical computer costs but only fair soil/structure interaction representation. The two-dimensional finite element models provide good soil/structure interaction representation, but cannot provide out-of-plane response. Three-dimensional finite element models would provide the most informative and complete analyses. For this model, computer costs would be much greater, but modeling costs would be approximately the same as those required for three-dimensional discrete mass models

  9. A Procedure for 3-D Contact Stress Analysis of Spiral Bevel Gears

    Science.gov (United States)

    Kumar, A.; Bibel, G.

    1994-01-01

    Contact stress distribution of spiral bevel gears using nonlinear finite element static analysis is presented. Procedures have been developed to solve the nonlinear equations that identify the gear and pinion surface coordinates based on the kinematics of the cutting process and orientate the pinion and the gear in space to mesh with each other. Contact is simulated by connecting GAP elements along the intersection of a line from each pinion point (parallel to the normal at the contact point) with the gear surface. A three dimensional model with four gear teeth and three pinion teeth is used to determine the contact stresses at two different contact positions in a spiral bevel gearset. A summary of the elliptical contact stress distribution is given. This information will be helpful to helicopter and aircraft transmission designers who need to minimize weight of the transmission and maximize reliability.

  10. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Daele, Timothy, Van; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

    The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration. The numerical identifiability procedure presented here is based on Walter and Pronzato (1997) and can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring, whereas practical (numerical) identifiability problems could still be detected. By using the presented approach it is possible to detect potential identifiability problems and avoid pointless calibration (and experimental!) effort.
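Practical identifiability problems of the kind detected here often show up as strongly correlated parameter sensitivities: if two parameters change the model output in nearly proportional ways over the measured range, they cannot be estimated independently. As an illustration (not the paper's actual procedure), finite-difference sensitivities of a Michaelis-Menten forward rate with assumed parameter values:

```python
def rate(S, vmax, km):
    """Michaelis-Menten forward reaction rate."""
    return vmax * S / (km + S)

def sensitivities(S_values, vmax, km, h=1e-6):
    """Central finite-difference sensitivities of the rate
    with respect to each parameter, over a grid of substrate levels."""
    s_vmax = [(rate(S, vmax + h, km) - rate(S, vmax - h, km)) / (2 * h)
              for S in S_values]
    s_km = [(rate(S, vmax, km + h) - rate(S, vmax, km - h)) / (2 * h)
            for S in S_values]
    return s_vmax, s_km

def correlation(x, y):
    """Pearson correlation between two sensitivity profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical substrate grid spanning below and above Km
S_grid = [0.1, 0.5, 1.0, 2.0, 5.0, 10.0]
sv, sk = sensitivities(S_grid, vmax=1.0, km=2.0)
rho = correlation(sv, sk)   # |rho| near 1 would signal poor identifiability
```

Here the moderate correlation indicates that Vmax and Km are jointly identifiable from this design; a grid confined to S much smaller than Km would drive |rho| toward 1 and make calibration pointless, which is exactly the situation the paper's approach is meant to flag before any fitting effort is spent.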

  11. Fundamental quantification procedure for total reflection X-ray fluorescence spectra analysis and elements determination

    International Nuclear Information System (INIS)

    Wegrzynek, D.; Holynska, B.

    1997-01-01

    A method for the determination of the concentrations of elements in particulate-like samples measured in total reflection geometry is proposed. In the proposed method the fundamental parameters are utilized for calculating the sensitivities of elements and an internal standard is used to account for the unknown mass per unit area of a sample and geometrical constant of the spectrometer. The modification of the primary excitation spectrum on its way to a sample has been taken into consideration. The concentrations of the elements to be determined are calculated simultaneously with the spectra deconvolution procedure. In the process of quantitative analysis the intensities of all X-ray peaks corresponding to K and L-series lines present in the analyzed spectrum are taken into account. (Author)
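The internal-standard step described above can be sketched as follows: because the unknown mass per unit area of the sample and the geometrical constant of the spectrometer multiply every measured intensity equally, they cancel out of the intensity ratio. The function name and all numbers below are hypothetical illustrations, with sensitivities standing in for the fundamental-parameter values.

```python
def concentration(net_intensity, sensitivity,
                  is_intensity, is_sensitivity, is_concentration):
    """Internal-standard quantification: the unknown sample mass per unit
    area and the spectrometer geometry constant cancel in the ratio."""
    return (is_concentration
            * (net_intensity / sensitivity)
            / (is_intensity / is_sensitivity))

# Hypothetical analyte and internal-standard peak data
c_analyte = concentration(net_intensity=12000, sensitivity=1.5,
                          is_intensity=8000, is_sensitivity=1.0,
                          is_concentration=10.0)  # internal standard at 10 units
```

The same ratio is evaluated for every element whose K- or L-series peaks appear in the deconvoluted spectrum, so all concentrations come out on the scale fixed by the spiked internal standard.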

  12. The significance of the probabilistic safety analysis (PSA) in administrative procedures under nuclear law

    International Nuclear Information System (INIS)

    Berg, H.P.

    1994-01-01

    The probabilistic safety analysis (PSA) is a useful tool for the safety-relevant evaluation of nuclear power plants designed on the basis of deterministic specifications. The PSA yields data identifying reliable or less reliable systems, and frequent or less frequent failure modes, to be taken into account in safety engineering. Performance of a PSA in administrative procedures under nuclear law, e.g. licensing, is an obligation laid down in a footnote to criterion 1.1 of the BMI safety criteria catalogue, which has been in force unaltered since 1977. The paper explains the application and achievements of PSA in the phase of reactor development concerned with the conceptual design basis and design features, using the novel PWR as an example. (orig./HP) [de

  13. Report on nuclear industry quality assurance procedures for safety analysis computer code development and use

    International Nuclear Information System (INIS)

    Sheron, B.W.; Rosztoczy, Z.R.

    1980-08-01

    As a result of a request from Commissioner V. Gilinsky to investigate in detail the causes of an error discovered in a vendor Emergency Core Cooling System (ECCS) computer code in March, 1978, the staff undertook an extensive investigation of the vendor quality assurance practices applied to safety analysis computer code development and use. This investigation included inspections of code development and use practices of the four major Light Water Reactor Nuclear Steam Supply System vendors and a major reload fuel supplier. The conclusion reached by the staff as a result of the investigation is that vendor practices for code development and use are basically sound. A number of areas were identified, however, where improvements to existing vendor procedures should be made. In addition, the investigation also addressed the quality assurance (QA) review and inspection process for computer codes and identified areas for improvement

  14. Advanced human-system interface design review guideline. Evaluation procedures and guidelines for human factors engineering reviews

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Brown, W.S.; Baker, C.C.; Welch, D.L.; Granda, T.M.; Vingelis, P.J.

    1994-07-01

    Advanced control rooms will use advanced human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role in the system, the method of information presentation, and the ways in which operators interact with the system. The U.S. Nuclear Regulatory Commission (NRC) reviews the HSI aspects of control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported to protect public health and safety. The principal guidance available to the NRC, however, was developed more than ten years ago, well before these technological changes. Accordingly, the human factors guidance needs to be updated to serve as the basis for NRC review of these advanced designs. The purpose of this project was to develop a general approach to advanced HSI review and the human factors guidelines to support NRC safety reviews of advanced systems. This two-volume report provides the results of the project. Volume I describes the development of the Advanced HSI Design Review Guideline (DRG) including (1) its theoretical and technical foundation, (2) a general model for the review of advanced HSIs, (3) guideline development in both hard-copy and computer-based versions, and (4) the tests and evaluations performed to develop and validate the DRG. Volume I also includes a discussion of the gaps in available guidance and a methodology for addressing them. Volume 2 provides the guidelines to be used for advanced HSI review and the procedures for their use

  15. Advanced human-system interface design review guideline. Evaluation procedures and guidelines for human factors engineering reviews

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M.; Brown, W.S. [Brookhaven National Lab., Upton, NY (United States); Baker, C.C.; Welch, D.L.; Granda, T.M.; Vingelis, P.J. [Carlow International Inc., Falls Church, VA (United States)

    1994-07-01

    Advanced control rooms will use advanced human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role in the system, the method of information presentation, and the ways in which operators interact with the system. The U.S. Nuclear Regulatory Commission (NRC) reviews the HSI aspects of control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported to protect public health and safety. The principal guidance available to the NRC, however, was developed more than ten years ago, well before these technological changes. Accordingly, the human factors guidance needs to be updated to serve as the basis for NRC review of these advanced designs. The purpose of this project was to develop a general approach to advanced HSI review and the human factors guidelines to support NRC safety reviews of advanced systems. This two-volume report provides the results of the project. Volume I describes the development of the Advanced HSI Design Review Guideline (DRG) including (1) its theoretical and technical foundation, (2) a general model for the review of advanced HSIs, (3) guideline development in both hard-copy and computer-based versions, and (4) the tests and evaluations performed to develop and validate the DRG. Volume I also includes a discussion of the gaps in available guidance and a methodology for addressing them. Volume 2 provides the guidelines to be used for advanced HSI review and the procedures for their use.

  16. Neutron activation analysis characterization procedures for fish consumed at São Paulo City

    International Nuclear Information System (INIS)

    Tappiz, Bruno; Moreira, Edson G.

    2017-01-01

    The characterization of the edible tissues of fish consumed by humans is very important for the determination of several toxic and potentially toxic elements, ensuring food safety. The Instrumental Neutron Activation Analysis (INAA) comparative method allows the determination of several of these elements, as well as others, for example of nutritional character. This study is part of the International Atomic Energy Agency (IAEA) technical cooperation project of Latin America and Caribbean countries to ensure the quality of food and the biomonitoring of contaminants in molluscs and fishes. Ten specimens of 5 of the most consumed fish species in São Paulo city were analyzed: white mouth croaker (Micropogonias furnieri), smooth weakfish (Cynoscion learchus), common snook (Centropomus undecimalis), Brazilian sardine (Sardinella brasiliensis) and bluefish (Pomatomus saltatrix). The complete analysis procedures, which include purchase at the largest warehouse in Latin America, transport to the laboratory, storage, freeze-drying, milling, weighing and other preparations of the subsamples, and the short irradiation parameters for the determination of Br, Cl, K, Mn and Na, are reported. Results obtained for macro and microelements are presented and are in agreement with analyses of oyster tissue and mussel tissue certified reference materials under the same irradiation conditions, with z-score values ranging from -3.0 to 2.2. (author)
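The z-score comparison against certified reference materials can be sketched as below. One common convention, used here, combines the laboratory and certificate standard uncertainties in quadrature; all numbers are hypothetical, not results from this study.

```python
def z_score(measured, certified, u_measured, u_certified):
    """z-score of a laboratory result against a certified reference value,
    combining laboratory and certificate standard uncertainties."""
    u = (u_measured ** 2 + u_certified ** 2) ** 0.5
    return (measured - certified) / u

# Hypothetical element result in a certified reference material (mg/kg)
z = z_score(measured=182.0, certified=180.0, u_measured=4.0, u_certified=3.0)
acceptable = abs(z) <= 2.0   # usual criterion: |z| <= 2 satisfactory
```

Results with |z| <= 2 are conventionally judged satisfactory, 2 < |z| < 3 questionable, and |z| >= 3 unsatisfactory, which gives context for the reported range of -3.0 to 2.2.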

  17. A single extraction and HPLC procedure for simultaneous analysis of phytosterols, tocopherols and lutein in soybeans.

    Science.gov (United States)

    Slavin, Margaret; Yu, Liangli Lucy

    2012-12-15

    A saponification/extraction procedure and a high performance liquid chromatography (HPLC) analysis method were developed and validated for the simultaneous analysis of phytosterols, tocopherols and lutein (a carotenoid) in soybeans. Separation was achieved on a phenyl column with a ternary, isocratic solvent system of acetonitrile, methanol and water (48:22.5:29.5, v/v/v). Evaporative light scattering detection (ELSD) was used to quantify β-sitosterol, stigmasterol, campesterol, and α-, δ- and γ-tocopherols, while lutein was quantified with visible light absorption at 450 nm. Peak identification was verified by retention times and spikes with external standards. Standard curves were constructed (R² > 0.99) to allow for sample quantification. Recovery of the saponification and extraction was demonstrated via analysis of spiked samples. Also, the accuracy of results for four soybeans using the described saponification and HPLC analytical method was validated against existing methods. This method offers a more efficient alternative to individual methods for quantifying lutein, tocopherols and sterols in soybeans. Copyright © 2012 Elsevier Ltd. All rights reserved.
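The standard-curve quantification mentioned above can be sketched as a least-squares calibration followed by inversion of the fitted line. The data are hypothetical, and a linear detector response is assumed purely for illustration (ELSD response is often nonlinear in practice, so a log-log or power-law fit may be needed).

```python
def fit_line(x, y):
    """Ordinary least-squares calibration line y = m*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = (sum((a - mx) * (c - my) for a, c in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    b = my - m * mx
    return m, b

def quantify(peak_area, m, b):
    """Invert the calibration line to recover the injected amount."""
    return (peak_area - b) / m

# Hypothetical standards (amount injected) vs. detector peak area
conc = [1.0, 2.0, 4.0, 8.0]
area = [11.0, 21.0, 41.0, 81.0]
m, b = fit_line(conc, area)
amount = quantify(51.0, m, b)   # amount in an unknown with peak area 51.0
```

In practice one would also report the curve's R² and check that unknowns fall within the calibrated range rather than extrapolating.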

  18. Neutron activation analysis characterization procedures for fish consumed at São Paulo City

    Energy Technology Data Exchange (ETDEWEB)

    Tappiz, Bruno; Moreira, Edson G., E-mail: brunotappiz2@gmail.com, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    The characterization of the edible tissues of fish consumed by humans is very important for the determination of several toxic and potentially toxic elements, ensuring food safety. The Instrumental Neutron Activation Analysis (INAA) comparative method allows the determination of several of these elements, as well as others, for example of nutritional character. This study is part of the International Atomic Energy Agency (IAEA) technical cooperation project of Latin America and Caribbean countries to ensure the quality of food and the biomonitoring of contaminants in molluscs and fishes. Ten specimens of 5 of the most consumed fish species in São Paulo city were analyzed: white mouth croaker (Micropogonias furnieri), smooth weakfish (Cynoscion learchus), common snook (Centropomus undecimalis), Brazilian sardine (Sardinella brasiliensis) and bluefish (Pomatomus saltatrix). The complete analysis procedures, which include purchase at the largest warehouse in Latin America, transport to the laboratory, storage, freeze-drying, milling, weighing and other preparations of the subsamples, and the short irradiation parameters for the determination of Br, Cl, K, Mn and Na, are reported. Results obtained for macro and microelements are presented and are in agreement with analyses of oyster tissue and mussel tissue certified reference materials under the same irradiation conditions, with z-score values ranging from -3.0 to 2.2. (author)

  19. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    Science.gov (United States)

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies, in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 +/- 6.0% between clades and of 21.0 +/- 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.
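Genetic differences between fingerprint profiles, like the percentages reported above, are often derived from a band-sharing similarity such as the Dice coefficient over the scored bands. A minimal sketch with hypothetical marker profiles (the marker names and sets are illustrative, not data from the study):

```python
def band_sharing(bands_a, bands_b):
    """Dice band-sharing similarity between two fingerprint profiles,
    each given as the set of bands scored present."""
    shared = len(bands_a & bands_b)
    return 2.0 * shared / (len(bands_a) + len(bands_b))

def genetic_difference(bands_a, bands_b):
    """Pairwise genetic difference as 1 minus band-sharing similarity."""
    return 1.0 - band_sharing(bands_a, bands_b)

# Hypothetical scored bands for two individuals
ind1 = {"m01", "m03", "m04", "m07", "m09"}
ind2 = {"m01", "m03", "m05", "m07", "m08"}
diff = genetic_difference(ind1, ind2)
```

Averaging such pairwise differences within and between groups yields the within-clade and between-clade figures quoted in the abstract, and the resulting distance matrix is the input to the multivariate cluster analysis.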

  20. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Henriksen, K.; Kaye, R.D.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was initially performed to guide subsequent evaluations in the areas of system-user interfaces, procedures, training and qualifications, and organizational policies and practices. The present work focuses solely on training and qualifications of personnel (e.g., training received before and during employment), and the potential impact of organizational factors on the performance of teletherapy. Organizational factors include such topics as adequacy of staffing, performance evaluations, commonly occurring errors, implementation of quality assurance programs, and organizational climate.

  1. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4

    International Nuclear Information System (INIS)

    Henriksen, K.; Kaye, R.D.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was initially performed to guide subsequent evaluations in the areas of system-user interfaces, procedures, training and qualifications, and organizational policies and practices. The present work focuses solely on training and qualifications of personnel (e.g., training received before and during employment), and the potential impact of organizational factors on the performance of teletherapy. Organizational factors include such topics as adequacy of staffing, performance evaluations, commonly occurring errors, implementation of quality assurance programs, and organizational climate

  2. The development of human factors technologies -The development of human behaviour analysis techniques-

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Heui; Park, Keun Ok; Chun, Se Woo; Suh, Sang Moon; Park, Jae Chang

    1995-07-01

    In order to contribute to human error reduction through studies on human-machine interaction in nuclear power plants, this project has the objectives of developing SACOM (Simulation Analyzer with a Cognitive Operator Model) and techniques for human error analysis and application. This year, we studied the following: 1) site investigation of operator tasks; 2) development of an operator task micro structure and revision of the micro structure; 3) development of knowledge representation software and a SACOM prototype; 4) development of performance assessment methodologies in task simulation and analysis of the effects of performance shaping factors; 5) classification of error shaping factors (ESFs) and development of software for ESF evaluation; 6) analysis of human error occurrences and revision of the analysis procedure; 7) an experiment for human error data collection using a compact nuclear simulator; 8) development of a prototype database system of the analyzed information on trip cases. 55 figs, 23 tabs, 33 refs. (Author)

  3. Análisis del fracaso empresarial por sectores: factores diferenciadores = Cross-industry analysis of business failure: differential factors

    Directory of Open Access Journals (Sweden)

    María Jesús Mures Quintana

    2012-12-01

    Full Text Available This paper focuses on a cross-industry analysis of business failure, in order to identify the explanatory and predictive factors of this event that differ across three of the main industries in every economy: manufacturing, building and services. For each of these industries, the same procedure is followed. First, a principal components analysis is applied in order to identify the explanatory factors of business failure in the three industries. Next, these factors are considered as independent variables in a discriminant analysis, so as to predict the firms' failure, using not only financial information expressed by ratios, but also other non-financial variables related to the firms, as well as external information that reflects the macroeconomic conditions under which they develop their activity.
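
    The two-step procedure (principal components, then discriminant analysis) can be sketched as a scikit-learn pipeline. The firm data below are simulated placeholders, not the sample used in the paper:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical data: rows = firms, columns = financial ratios plus
# non-financial and macroeconomic indicators; y = 1 if the firm failed.
rng = np.random.default_rng(1)
X_healthy = rng.normal(0.0, 1.0, size=(100, 12))
X_failed = rng.normal(1.0, 1.0, size=(100, 12))  # failed firms shifted
X = np.vstack([X_healthy, X_failed])
y = np.array([0] * 100 + [1] * 100)

# Step 1: principal components summarise the correlated ratios into factors.
# Step 2: those factors feed a discriminant analysis that predicts failure.
model = make_pipeline(StandardScaler(), PCA(n_components=5),
                      LinearDiscriminantAnalysis())
model.fit(X, y)
print(f"in-sample accuracy: {model.score(X, y):.2f}")
```

    Chaining the steps in one pipeline keeps the component extraction and the discriminant fit consistent when the model is re-estimated per industry.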

  4. Genesis of theory and analysis of practice of applying the analytical procedures in auditing

    OpenAIRE

    Сурніна, К. С.

    2012-01-01

    The article investigates how different researchers define the concept of "analytical procedures" in auditing, and presents the authors' own view of the necessity of the wide use of analytical procedures in audits. A classification of analytical procedures is presented, taking into account the specificity of the auditing process as a whole.

  5. High procedural fairness heightens the effect of outcome favorability on self-evaluations : An attributional analysis

    NARCIS (Netherlands)

    Brockner, J.; Heuer, L.; Magner, N.; Folger, R.; Umphress, E.; Bos, K. van den; Vermunt, Riël; Magner, M.; Siegel, P.

    2003-01-01

    Previous research has shown that outcome favorability and procedural fairness often interact to influence employees' work attitudes and behaviors. Moreover, the form of the interaction effect depends upon the dependent variable. Relative to when procedural fairness is low, high procedural fairness:

  6. An analysis of contingency statements in a DRO procedure: A case report.

    Science.gov (United States)

    Gerow, Stephanie; Rispoli, Mandy; Boles, Margot B; Neely, Leslie C

    2015-06-01

    To examine latency to criterion for reduction of challenging behaviour with and without stating a contingency statement immediately prior to a DRO procedure. An ABAC design in which A was baseline, B was used to evaluate the efficacy of a DRO procedure, and C was used to evaluate the efficacy of a DRO procedure with a contingency statement. The DRO with the contingency statement intervention was associated with a shorter latency to behaviour change than the DRO procedure without the contingency statement. These preliminary findings from this case study highlight the importance of examining the efficiency of behaviour change procedures. Directions for future research are provided.

  7. Laboratory manual on sample preparation procedures for x-ray micro-analysis

    International Nuclear Information System (INIS)

    1997-01-01

    X-ray micro fluorescence is a non-destructive and sensitive method for studying the microscopic distribution of different elements in almost all kinds of samples. Since the beginning of this century, x-rays and electrons have been used for the analysis of many different kinds of material. Techniques which rely on electrons are mainly developed for microscopic studies, and are used in conventional Electron Microscopy (EM) or Scanning Electron Microscopy (SEM), while x-rays are widely used for chemical analysis at the microscopic level. The first chemical analysis by fluorescence spectroscopy using small x-ray beams was conducted in 1928 by Glockner and Schreiber. Since then much work has been devoted to developing different types of optical systems for focusing an x-ray beam, but the efficiency of these systems is still inferior to the conventional electron optical systems. However, even with a poor optical efficiency, the x-ray microbeam has many advantages compared with electron or proton induced x-ray emission methods. These include: The analyses are non-destructive, losses of mass are negligible, and due to the low thermal loading of x-rays, materials which may be thermally degraded can be analysed; Samples can be analysed in air, and no vacuum is required, therefore specimens with volatile components such as water in biological samples, can be imaged at normal pressure and temperature; No charging occurs during analysis and therefore coating of the sample with a conductive layer is not necessary; With these advantages, simpler sample preparation procedures including mounting and preservation can be used

  8. Exploring Technostress: Results of a Large Sample Factor Analysis

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2016-06-01

    Full Text Available With reference to the results of a large-sample factor analysis, the article aims to propose a frame for examining technostress in a population. A survey and principal component analysis of a sample consisting of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the dispersion in answers. Based on the factor analysis, the questionnaire was reframed and prepared to analyze the respondents' answers in a statistically validated pattern, revealing the causes and consequences of technostress as well as its prevalence in the population. The key elements of technostress identified by the factor analysis can serve for the construction of technostress measurement scales in further research.
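
    A factor-analysis step of this kind can be sketched with scikit-learn. The two latent factors and the item loadings below are invented for illustration, and `rotation="varimax"` assumes scikit-learn 0.24 or later:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical responses: 300 respondents x 8 questionnaire items driven
# by two latent factors (e.g. "techno-overload" and "techno-invasion").
rng = np.random.default_rng(2)
factors = rng.normal(size=(300, 2))
loadings = np.zeros((2, 8))
loadings[0, :4] = 0.9          # items 1-4 load on factor 1
loadings[1, 4:] = 0.9          # items 5-8 load on factor 2
items = factors @ loadings + rng.normal(scale=0.3, size=(300, 8))

# Fit a two-factor model and rotate the loadings for interpretability.
fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(items)
print(np.round(fa.components_, 2))   # rotated loading matrix, 2 x 8
```

    The rotated loading matrix shows which questionnaire items belong to which factor, which is the basis for reframing a questionnaire as described above.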

  9. Economic Analysis of Factors Affecting Technical Efficiency of ...

    African Journals Online (AJOL)

    Economic Analysis of Factors Affecting Technical Efficiency of Smallholders ... socio-economic characteristics which influence technical efficiency in maize production. ... Ministry of Agriculture and livestock, records, books, reports and internet.

  10. Development of an optimized procedure bridging design and structural analysis codes for the automatized design of the SMART

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Park, Keun Bae; Choi, Suhn; Kim, Kang Soo; Jeong, Kyeong Hoon; Lee, Gyu Mahn

    1998-09-01

    In this report, an optimized design and analysis procedure is established for application to the SMART (System-integrated Modular Advanced ReacTor) development. The aim of the optimized procedure is to minimize time consumption and engineering effort by streamlining the design and feedback interactions. To achieve this goal, the data and information generated during design development should be transferred directly to the analysis programs with minimal manual operation. Verification of the design concept requires considerable effort, since communication between design and analysis involves a time-consuming stage of converting input information. In this report, an optimized procedure bridging the design and analysis stages is established utilizing IDEAS, ABAQUS and ANSYS. (author). 3 refs., 2 tabs., 5 figs

  11. Super-delta: a new differential gene expression analysis procedure with robust data normalization.

    Science.gov (United States)

    Liu, Yuhang; Zhang, Jinfeng; Qiu, Xing

    2017-12-21

    Normalization is an important data preparation step in gene expression analyses, designed to remove various sources of systematic noise. Sample variance is greatly reduced after normalization, hence the power of subsequent statistical analyses is likely to increase. On the other hand, variance reduction is made possible by borrowing information across all genes, including differentially expressed genes (DEGs) and outliers, which inevitably introduces some bias. This bias typically inflates type I error and can reduce statistical power in certain situations. In this study we propose a new differential expression analysis pipeline, dubbed super-delta, that consists of a multivariate extension of global normalization and a modified t-test. A robust procedure is designed to minimize the bias introduced by DEGs in the normalization step. The modified t-test is derived from asymptotic theory for hypothesis testing and suitably pairs with the proposed robust normalization. We first compared super-delta with four commonly used normalization methods: global, median-IQR, quantile, and cyclic loess normalization in simulation studies. Super-delta was shown to have better statistical power, with tighter control of the type I error rate, than its competitors. In many cases, the performance of super-delta is close to that of an oracle test in which datasets without technical noise were used. We then applied all methods to a collection of gene expression datasets on breast cancer patients who received neoadjuvant chemotherapy. While there is a substantial overlap among the DEGs identified by all methods, super-delta was able to identify comparatively more DEGs than its competitors. Downstream gene set enrichment analysis confirmed that all these methods selected largely consistent pathways. Detailed investigation of the relatively small differences showed that pathways identified by super-delta have better connections to breast cancer than those of the other methods. As a new pipeline, super
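
    The core idea (a normalization step that is robust to DEGs, followed by per-gene t-tests) can be sketched as follows. This is a minimal stand-in, using a trimmed mean and a plain Welch t-test in place of super-delta's actual estimators, on simulated data:

```python
import numpy as np
from scipy import stats

# Hypothetical expression matrix: 500 genes x 20 samples, two groups of 10.
rng = np.random.default_rng(3)
expr = rng.normal(8.0, 1.0, size=(500, 20))
expr[:25, 10:] += 3.0                        # 25 true DEGs, up in group 2
expr += rng.normal(0.0, 0.5, size=20)        # per-sample technical shifts

# Robust global normalization: estimate each sample's shift from the
# trimmed mean of its genes, so DEGs and outliers contribute little.
shift = stats.trim_mean(expr, proportiontocut=0.2, axis=0)
norm = expr - shift

# Per-gene two-sample t-tests on the normalized data.
t, p = stats.ttest_ind(norm[:, :10], norm[:, 10:], axis=1, equal_var=False)
n_hits = int(np.sum(p < 0.01))
print(f"{n_hits} genes flagged at p < 0.01")
```

    Trimming before estimating the per-sample shift is what limits the bias that strongly expressed DEGs would otherwise introduce into the normalization.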

  12. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  13. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  14. Sustainable Manufacturing Practices in Malaysian Automotive Industry: Confirmatory Factor Analysis

    OpenAIRE

    Habidin, Nurul Fadly; Zubir, Anis Fadzlin Mohd; Fuz, Nursyazwani Mohd; Latip, Nor Azrin Md; Azman, Mohamed Nor Azhari

    2015-01-01

    Sustainable manufacturing practices (SMPs) have received enormous attention in recent years as an effective solution to support the continuous growth and expansion of the automotive manufacturing industry. This study was conducted to examine, by confirmatory factor analysis, SMPs such as manufacturing process, supply chain management, social responsibility, and environmental management in the automotive manufacturing industry. The results of confirmatory factor analysis show that fo...

  15. A cost and time analysis of laryngology procedures in the endoscopy suite versus the operating room.

    Science.gov (United States)

    Hillel, Alexander T; Ochsner, Matthew C; Johns, Michael M; Klein, Adam M

    2016-06-01

    To assess the costs, charges, reimbursement, and efficiency of performing awake laryngology procedures in an endoscopy suite (ES) compared with like procedures performed in the operating room (OR). Retrospective review of billing records. Cost, charges, and reimbursements for the hospital, surgeon, and anesthesiologist were compared between ES injection laryngoplasty and laser excision procedures and matched case controls in the OR. Time spent in 1) the preoperative unit, 2) the operating or endoscopy suite, and 3) recovery unit were compared between OR and ES procedures. Hospital expenses were significantly less for ES procedures when compared to OR procedures. Reimbursement was similar for ES and OR injection laryngoplasty, though greater for OR laser excisions. Net balance (reimbursement-expenses) was greater for ES procedures. A predictive model of payer costs over a 3-year period showed similar costs for ES and OR laser procedures and reduced costs for ES compared to OR injection laryngoplasty. Times spent preoperatively and the procedure were significantly less for ES procedures. For individual laryngology procedures, the ES reduces time and costs compared to the OR, increasing otolaryngologist and hospital efficiency. This reveals cost and time savings of ES injection laryngoplasty, which occurs at a similar frequency as OR injection laryngoplasty. Given the increased frequency for ES laser procedures, total costs are similar for ES and OR laser excision of papilloma, which usually require repeated procedures. When regulated office space is unavailable, endoscopy rooms represent an alternative setting for unsedated laryngology procedures. NA Laryngoscope, 126:1385-1389, 2016. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.

  16. An Analysis of Construction Accident Factors Based on Bayesian Network

    OpenAIRE

    Yunsheng Zhao; Jinyong Pei

    2013-01-01

    In this study, we analyse construction accident factors using a Bayesian network. First, accident cases are analysed with the fault tree method to identify all the factors causing the accidents; the factors are then analysed qualitatively and quantitatively with the Bayesian network method; finally, a safety management program is determined to guide safety operations. The results of this study show that a bad geological environment has the largest posterio...
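
    The posterior probabilities such an analysis produces come from Bayes' rule applied over the network's conditional probability tables. A single-factor sketch, with made-up probabilities rather than the study's data:

```python
# Minimal single-parent sketch: P(bad geology | accident) via Bayes' rule,
# standing in for posterior inference over a full Bayesian network.
# All probabilities below are illustrative, not taken from the study.
p_bad_geology = 0.3                      # prior P(G = bad)
p_acc_given_bad = 0.4                    # P(accident | bad geology)
p_acc_given_good = 0.05                  # P(accident | good geology)

# Marginal probability of an accident (law of total probability).
p_acc = (p_bad_geology * p_acc_given_bad
         + (1 - p_bad_geology) * p_acc_given_good)

# Posterior: how likely bad geology is, given that an accident occurred.
posterior = p_bad_geology * p_acc_given_bad / p_acc
print(f"P(bad geology | accident) = {posterior:.3f}")
```

    Observing an accident raises the probability of bad geology above its prior, which is exactly the kind of posterior ranking of factors the study reports.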

  17. A Meta-Analysis of Bilateral Essure® Procedural Placement Success Rates on First Attempt.

    Science.gov (United States)

    Frietze, Gabriel; Leyser-Whalen, Ophra; Rahman, Mahbubur; Rouhani, Mahta; Berenson, Abbey B

    2015-12-01

    Background: The Essure® (Bayer HealthCare Pharmaceuticals, Leverkusen, Germany) female sterilization procedure entails using a hysteroscope to guide a microinsert into the Fallopian tube openings. Failed placement can lead to patient dissatisfaction, repeat procedures, unintended or ectopic pregnancy, perforation of internal organs, or need for subsequent medical interventions. Additional interventions increase women's health risks, and costs for patients and the health care industry. Demonstrated successful placement rates are 63%-100%. To date, there have not been any systematic analyses of variables associated with placement rates. Objectives: The aims of this review were: (1) to estimate the average rate of successful bilateral Essure microinsert placement on first attempt; and (2) to identify variables associated with successful placement. Materials and Methods: A meta-analysis was conducted on 64 published studies and 19 variables. Following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, all published studies between November 2001 and February 2015 that reported variables associated with successful bilateral Essure placement rates were reviewed. The studies were drawn from PubMed and Google Scholar, and by using the "snowball" method. Results: The weighted average rate of successful bilateral microinsert placement on first attempt was 92% (0.92 [95% confidence interval: 0.904-0.931]). Variables associated with successful placements were: (1) newer device models; (2) higher body mass index; and (3) a higher percent of patients who received local anesthesia. Conclusions: The data gathered for this review indicate that the highest bilateral success rates may be obtained by utilizing the newest Essure device model with local anesthesia in heavier patients. More standardized data reporting in published Essure studies is recommended. (J GYNECOL SURG 31:308).
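
    A weighted pooled rate of the kind reported here can be sketched with inverse-variance (fixed-effect) weighting. The per-study counts below are invented, and the review's actual weighting model may differ:

```python
import numpy as np

# Hypothetical per-study data: successful bilateral placements / attempts.
successes = np.array([180, 95, 300, 57])
totals = np.array([200, 100, 320, 60])

p = successes / totals
# Inverse-variance weights: larger, more precise studies count more.
var = p * (1 - p) / totals
w = 1.0 / var
pooled = np.sum(w * p) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(f"pooled rate {pooled:.3f}, 95% CI {ci[0]:.3f}-{ci[1]:.3f}")
```

    Note that studies with observed rates of exactly 0 or 1 need a continuity correction before this weighting can be applied.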

  18. Function Allocation in Complex Socio-Technical Systems: Procedure usage in nuclear power and the Context Analysis Method for Identifying Design Solutions (CAMIDS) Model

    Science.gov (United States)

    Schmitt, Kara Anne

    This research aims to prove that strict adherence to procedures and rigid compliance with process in the US nuclear industry may not prevent incidents or increase safety. According to the Institute of Nuclear Power Operations, the nuclear power industry has seen a recent rise in events, and this research claims that a contributing factor to this rise is organizational and cultural, and based on people's overreliance on procedures and policy. Understanding the proper balance of function allocation, automation and human decision-making is imperative to creating a nuclear power plant that is safe, efficient, and reliable. This research claims that new generations of operators are less engaged in active thinking because they have been instructed to follow procedures to a fault. According to operators, they were once expected to know the plant and its interrelations, but organizationally more importance is now placed on following procedure and policy. Literature reviews were performed, experts were questioned, and a model for context analysis was developed. The Context Analysis Method for Identifying Design Solutions (CAMIDS) model was created, verified and validated through both peer review and application in real-world scenarios in active nuclear power plant simulators. These experiments supported the claim that strict adherence and rigid compliance to procedures may not increase safety, by studying the industry's propensity for following incorrect procedures and when this directly affects the safety or security of the plant. The findings of this research indicate that the younger generations of operators rely heavily on procedures, and the organizational pressure of required compliance with procedures may lead to incidents within the plant, because operators feel pressured into following the rules and policy above performing the correct actions in a timely manner. The findings support computer-based procedures, efficient alarm systems, and skill-of-the-craft matrices.
The solution to

  19. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  20. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  1. Peri-procedural complications and associated risk factors in wingspan stent-assistant angioplasty of intracranial artery stenosis

    International Nuclear Information System (INIS)

    Li Zhaoshuo; Li Tianxiao; Wang Ziliang; Bai Weixing; Xue Jiangyu; Zhu Liangfu; Li Li

    2013-01-01

    Objective: To retrospectively evaluate the cerebrovascular complications of stenting for symptomatic intracranial stenosis and to detect the factors associated with complications. Methods: Medical records of Wingspan stenting were reviewed for 306 cases with symptomatic intracranial stenosis treated from July 2007 to February 2012; in-hospital clinical complications included transient ischemic attack, ischemic stroke, death and intracranial hemorrhage. The lesions were located in the middle cerebral artery segment M1 (114 lesions), the intracranial portion of the internal carotid artery (50 lesions), the vertebral artery segment V4 (75 lesions), the vertebrobasilar artery (14 lesions) and the basilar artery (76 lesions). Complications were evaluated and analyzed to determine whether they were associated with patient- or stenosis-related risk factors using the χ² test. Results: The technical success rate was 99% (303/306). The cerebrovascular complication rate was 6.9% (21/303), with 1.6% (14/303) of disabling stroke events and 0.7% (2/303) of deaths. The complications comprised hemorrhagic events (procedure-related hemorrhage, 3 cases; hyperperfusion hemorrhage, 3 cases) and ischemic events (perforator stroke, 8 cases; transient ischemic attack, 3 cases; embolization, 2 cases; in-stent thrombosis, 2 cases). Hemorrhagic events were associated with significantly higher morbidity and mortality rates (χ² = 2.908, P < 0.05) and occurred more frequently after treatment of middle cerebral artery stenosis than of other lesions (χ² = 1.168, P < 0.05). Perforator strokes occurred mainly with basilar artery lesions (χ² = 4.263, P < 0.05). Conclusion: The complication rates in this study are preliminarily consistent with previously published data. Hemorrhagic events are prone to occur in the treatment of middle cerebral artery stenosis, while perforator strokes mainly affect the basilar artery. (authors)
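
    Associations of this kind are tested with the χ² statistic on a contingency table. A sketch with invented counts (not the study's data):

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: complications by lesion site.
# Rows: middle cerebral artery vs. other locations.
# Columns: complication vs. no complication.
table = [[10, 104],
         [11, 178]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```

    `chi2_contingency` also returns the expected counts under independence, which is useful for checking that the test's assumptions (expected cell counts not too small) hold.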

  2. Analysis of factors important for the occurrence of Campylobacter in Danish broiler flocks

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Heuer, Ole Eske; Sørensen, Anna Irene Vedel

    2013-01-01

    a multivariate analysis including all 43 variables. A multivariate analysis was conducted using a generalized linear model, and the correlations between the houses from the same farms were accounted for by adding a variance structure to the model. The procedures for analyses included backward elimination...... of positive flocks/total number of flocks delivered over the 2-year period).The following factors were found to be significantly associated with the occurrence of Campylobacter in the broiler flocks: old broiler houses, late introduction of whole wheat in the feed, relatively high broiler age at slaughter...

  3. Procedure-related risk of miscarriage following amniocentesis and chorionic villus sampling: a systematic review and meta-analysis.

    Science.gov (United States)

    Akolekar, R; Beta, J; Picciarelli, G; Ogilvie, C; D'Antonio, F

    2015-01-01

    To estimate procedure-related risks of miscarriage following amniocentesis and chorionic villus sampling (CVS) based on a systematic review of the literature and a meta-analysis. A search of MEDLINE, EMBASE, CINAHL and The Cochrane Library (2000-2014) was performed to review relevant citations reporting procedure-related complications of amniocentesis and CVS. Only studies reporting data on more than 1000 procedures were included in this review to minimize the effect of bias from smaller studies. Heterogeneity between studies was estimated using Cochran's Q, the I² statistic and Egger bias. Meta-analysis of proportions was used to derive weighted pooled estimates for the risk of miscarriage before 24 weeks' gestation. Incidence-rate difference meta-analysis was used to estimate pooled procedure-related risks. The weighted pooled risks of miscarriage following invasive procedures were estimated from analysis of controlled studies including 324 losses in 42 716 women who underwent amniocentesis and 207 losses in 8899 women who underwent CVS. The risk of miscarriage prior to 24 weeks in women who underwent amniocentesis and CVS was 0.81% (95% CI, 0.58-1.08%) and 2.18% (95% CI, 1.61-2.82%), respectively. The background rates of miscarriage in women from the control group that did not undergo any procedures were 0.67% (95% CI, 0.46-0.91%) for amniocentesis and 1.79% (95% CI, 0.61-3.58%) for CVS. The weighted pooled procedure-related risks of miscarriage for amniocentesis and CVS were 0.11% (95% CI, -0.04 to 0.26%) and 0.22% (95% CI, -0.71 to 1.16%), respectively. The procedure-related risks of miscarriage following amniocentesis and CVS are much lower than currently quoted. Copyright © 2014 ISUOG. Published by John Wiley & Sons Ltd.

  4. Centre characteristics and procedure-related factors have an impact on outcomes of allogeneic transplantation for patients with CLL

    DEFF Research Database (Denmark)

    Schetelig, Johannes; de Wreede, Liesbeth C; Andersen, Niels S

    2017-01-01

    The best approach for allogeneic haematopoietic stem cell transplantations (alloHCT) in patients with chronic lymphocytic leukaemia (CLL) is unknown. We therefore analysed the impact of procedure- and centre-related factors on 5-year event-free survival (EFS) in a large retrospective study. Data...... of 684 CLL patients who received a first alloHCT between 2000 and 2011 were analysed by multivariable Cox proportional hazards models with a frailty component to investigate unexplained centre heterogeneity. Five-year EFS of the whole cohort was 37% (95% confidence interval [CI], 34-42%). Larger numbers...... of CLL alloHCTs (hazard ratio [HR] 0·96, P = 0·002), certification of quality management (HR 0·7, P = 0·045) and a higher gross national income per capita (HR 0·4, P = 0·04) improved EFS. In vivo T-cell depletion (TCD) with alemtuzumab compared to no TCD (HR 1·5, P = 0·03), and a female donor compared...

  5. Critical analysis of dose reduction trends with special reference to procedures involved in fluoroscopy

    International Nuclear Information System (INIS)

    Anderson, K.; Mattsson, O.

    1985-01-01

    Experiences of a half-year's use of dose-checking instrumentation in fluoroscopy are presented. Radiologists under training succeeded in lowering the patient dose surprisingly well - the diagnostic results remaining unchanged or even improving, because of higher image quality as a result of better diaphragming. Other factors involved in fluoroscopy are discussed. Present systems with heavy bulky intensifiers create problems for close patient contact and for the necessary manipulation, patient adjustment and application of compression. The examination will be simplified and facilitated by the use of a flat image system: proper adjustments need fewer fluoroscopic observations, and patient dose as well as examination time can be saved. Flat display principles will take over the function of the present old-fashioned intensifiers and monitors, either as single units or equipped with TV, video or digital processing accessories. A flat image system, the 'PET-scope', was tested and found to be very convenient for fluoroscopic procedures. The physical properties were studied thoroughly - the high intensification particularly gives these systems an advantage in dose reduction. New applications are possible with these light-weight low-dose units. Fluoroscopy represents a field where considerable contributions to the 'Quality Assurance' trend can be obtained. (author)

  6. Left ventricular wall motion abnormalities evaluated by factor analysis as compared with Fourier analysis

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Ikuno, Yoshiyasu; Nishikimi, Toshio

    1986-01-01

    Factor analysis was applied to multigated cardiac pool scintigraphy to evaluate its ability to detect left ventricular wall motion abnormalities in 35 patients with old myocardial infarction (MI) and in 12 control cases with normal left ventriculography. All cases were also evaluated by conventional Fourier analysis. In most cases with normal left ventriculography, the ventricular and atrial factors were extracted by factor analysis. In cases with MI, a third factor was obtained in the left ventricle corresponding to the wall motion abnormality. Each case was scored according to the coincidence between the findings of ventriculography and those of factor analysis or Fourier analysis. Scores were recorded for three items: the existence, location, and degree of asynergy. In cases of MI, the detection rate of asynergy was 94% by factor analysis and 83% by Fourier analysis, and agreement with respect to location was 71% and 66%, respectively. Factor analysis had higher scores than Fourier analysis, but the difference was not significant. The interobserver error of factor analysis was less than that of Fourier analysis. Factor analysis can display the locations and dynamic motion curves of asynergy, and it is regarded as a useful method for detecting and evaluating left ventricular wall motion abnormalities. (author)

  7. Increasing spelling achievement: an analysis of treatment procedures utilizing an alternating treatments design.

    OpenAIRE

    Ollendick, T H; Matson, J L; Esveldt-Dawson, K; Shapiro, E S

    1980-01-01

    Two studies which examine the effectiveness of spelling remediation procedures are reported. In both studies, an alternating treatment design was employed. In the first study, positive practice overcorrection plus positive reinforcement was compared to positive practice alone and a no-remediation control condition. In the second study, positive practice plus positive reinforcement was compared to a traditional corrective procedure plus positive reinforcement and a traditional procedure when u...

  8. Analysis of related risk factors for pancreatic fistula after pancreaticoduodenectomy

    Directory of Open Access Journals (Sweden)

    Qi-Song Yu

    2016-08-01

    Objective: To explore the risk factors for pancreatic fistula after pancreaticoduodenectomy, in order to provide theoretical evidence for effectively preventing its occurrence. Methods: A total of 100 patients who were admitted to our hospital from January 2012 to January 2015 and underwent pancreaticoduodenectomy were included in the study. The candidate risk factors for developing pancreatic fistula were collected for single-factor and logistic multi-factor analysis. Results: Among the included patients, 16 developed pancreatic fistula, for a total occurrence rate of 16% (16/100). The single-factor analysis showed that upper abdominal operation history, preoperative bilirubin, pancreatic texture, pancreatic duct diameter, intraoperative amount of bleeding, postoperative hemoglobin, and application of somatostatin after operation were risk factors for developing pancreatic fistula (P<0.05). The multi-factor analysis showed that upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin were independent risk factors for developing pancreatic fistula (OR=4.162, 6.104, 5.613, 4.034; P<0.05). Conclusions: The occurrence of pancreatic fistula after pancreaticoduodenectomy is closely associated with upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin; therefore, effective measures should be taken to reduce the occurrence of pancreatic fistula according to each patient's own conditions.
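
    The single-factor screening step described above is often summarized with an odds ratio per candidate factor. A minimal sketch, using hypothetical counts (not the study's data) and Woolf's log method for the confidence interval:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table (a=exposed cases, b=exposed controls,
    c=unexposed cases, d=unexposed controls) with a 95% CI (Woolf's method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# hypothetical counts: fistula vs no fistula, split by soft vs hard pancreatic texture
or_, ci = odds_ratio(12, 38, 4, 46)
print(round(or_, 2), round(ci[0], 2), round(ci[1], 2))  # → 3.63 1.08 12.18
```

A confidence interval excluding 1 flags the factor as a candidate for the multivariable logistic model.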

  9. Environmental Performance in Countries Worldwide: Determinant Factors and Multivariate Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Gallego-Alvarez

    2014-11-01

    The aim of this study is to analyze the environmental performance of countries and the variables that can influence it. We also performed a multivariate analysis using the HJ-biplot, an exploratory method that looks for hidden patterns in the data and is obtained from the usual singular value decomposition (SVD) of the data matrix, to contextualize the countries, grouped by geographical area, and the variables relating to the environmental indicators included in the environmental performance index. The sample comprises 149 countries from different geographic areas. The findings obtained from the empirical analysis emphasize that socioeconomic factors, such as economic wealth and education, as well as institutional factors represented by the style of public administration, in particular control of corruption, are determinant factors of environmental performance in the countries analyzed. In contrast, no effect on environmental performance was found for factors relating to the internal characteristics of a country or for political factors.
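
    The HJ-biplot itself comes straight out of the SVD: both row (country) and column (indicator) markers are scaled by the singular values so that both can be represented with high quality in the same plane. A minimal sketch with random stand-in data, not the study's 149-country matrix:

```python
import numpy as np

def hj_biplot_coords(X):
    """First two HJ-biplot coordinates for rows and columns of a data matrix.
    Both marker sets are scaled by the singular values (Galindo's HJ choice)."""
    Xc = X - X.mean(axis=0)              # centre each indicator
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    rows = (U * s)[:, :2]                # row markers ("countries")
    cols = (Vt.T * s)[:, :2]             # column markers ("indicators")
    return rows, cols

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))             # 10 "countries" x 4 "indicators"
rows, cols = hj_biplot_coords(X)
print(rows.shape, cols.shape)            # → (10, 2) (4, 2)
```

Plotting both coordinate sets on the same axes then shows which indicators drive the grouping of countries.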

  10. Evaluation of some procedures relevant to the determination of trace elemental components in biological materials by destructive neutron activation analysis

    International Nuclear Information System (INIS)

    Berry, D.L.

    1979-01-01

    The development of a simplified procedure for the analysis of biological materials by destructive neutron activation analysis (DNAA) is described. The sample manipulations preceding gamma ray assay were investigated as five specific stages of processing: (1) pre-irradiation treatment; (2) sample irradiation; (3) removal of the organic matrix; (4) removal of interfering radioactivities; and (5) concentration and separation of analyte activities. Each stage was evaluated with respect to susceptibility to sample contamination, loss of trace elemental components, and compatibility with other operations in the overall DNAA procedures. A complete DNAA procedure was proposed and evaluated for the analysis of standard bovine liver and blood samples. The DNAA system was effective for the determination of As, Cu, Fe, Hg, Mo, Rb, Sb, Se, and Zn without yield determinations and with a minimum turn-around time of approximately 3 days

  12. Maturation of arteriovenous fistula: Analysis of key factors

    Directory of Open Access Journals (Sweden)

    Muhammad A. Siddiqui

    2017-12-01

    The growing proportion of individuals suffering from chronic kidney disease has considerable repercussions for both kidney specialists and primary care. Progressive and permanent renal failure is most frequently treated with hemodialysis. The efficiency of hemodialysis treatment relies on the functional status of vascular access. Determining the type of vascular access has prime significance for maximizing successful maturation of a fistula and avoiding surgical revision. Despite the frequency of arteriovenous fistula procedures, there are no consistent criteria applied before creation of arteriovenous fistulae. Increased prevalence and use of arteriovenous fistulae would result if there were reliable criteria to assess which fistulae are likely to reach maturity without additional procedures. Published studies assessing predictive markers of fistula maturation vary greatly with regard to definitions, design, study size, patient sample, and clinical factors. As a result, surgeons and specialists must decide which possible risk factors are most likely to occur, as well as which parameters to employ when evaluating the likelihood of successful fistula development in patients awaiting the creation of permanent access. The purpose of this literature review is to discuss the role of patient factors and blood markers in the development of arteriovenous fistulae.

  13. Implementation and evaluation of nonparametric regression procedures for sensitivity analysis of computationally demanding models

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Swiler, Laura P.; Helton, Jon C.; Sallaberry, Cedric J.

    2009-01-01

    The analysis of many physical and engineering problems involves running complex computational models (simulation models, computer codes). With problems of this type, it is important to understand the relationships between the input variables (whose values are often imprecisely known) and the output. The goal of sensitivity analysis (SA) is to study this relationship and identify the most significant factors or variables affecting the results of the model. In this presentation, an improvement on existing methods for SA of complex computer models is described for use when the model is too computationally expensive for a standard Monte-Carlo analysis. In these situations, a meta-model or surrogate model can be used to estimate the necessary sensitivity index for each input. A sensitivity index is a measure of the variance in the response that is due to the uncertainty in an input. Most existing approaches to this problem either do not work well with a large number of input variables and/or they ignore the error involved in estimating a sensitivity index. Here, a new approach to sensitivity index estimation using meta-models and bootstrap confidence intervals is described that provides solutions to these drawbacks. Further, an efficient yet effective approach to incorporate this methodology into an actual SA is presented. Several simulated and real examples illustrate the utility of this approach. This framework can be extended to uncertainty analysis as well.
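
    The core idea above (replace the expensive code with a cheap surrogate, then bootstrap the sensitivity index to quantify estimation error) can be sketched with a toy near-linear model, for which the squared standardized regression coefficient approximates the first-order index. Every model detail below is invented for illustration:

```python
import random, statistics

random.seed(1)

def expensive_model(x1, x2):
    """Stand-in for a costly simulation code (invented for illustration)."""
    return 3.0 * x1 + 0.5 * x2 + random.gauss(0.0, 0.1)

x1s = [random.random() for _ in range(300)]
x2s = [random.random() for _ in range(300)]
ys = [expensive_model(a, b) for a, b in zip(x1s, x2s)]

def src_squared(xs, ys):
    """Squared standardized regression coefficient: approximates the first-order
    sensitivity index for a near-linear model with independent inputs."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    b = cov / statistics.variance(xs)
    return (b * statistics.stdev(xs) / statistics.stdev(ys)) ** 2

# bootstrap percentile interval for the sensitivity index of x1
n = len(ys)
boot = []
for _ in range(500):
    idx = [random.randrange(n) for _ in range(n)]
    boot.append(src_squared([x1s[i] for i in idx], [ys[i] for i in idx]))
boot.sort()
ci = (boot[12], boot[487])   # ~95% percentile interval
print(round(src_squared(x1s, ys), 2), [round(v, 2) for v in ci])
```

In the paper's setting the linear surrogate would be replaced by a nonparametric meta-model, but the bootstrap step is the same: refit the surrogate on each resample and collect the resulting index estimates.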

  14. Confirmatory factor analysis applied to the Force Concept Inventory

    Science.gov (United States)

    Eaton, Philip; Willoughby, Shannon D.

    2018-06-01

    In 1995, Huffman and Heller used exploratory factor analysis to draw into question the factors of the Force Concept Inventory (FCI). Since then several papers have been published examining the factors of the FCI on larger sets of student responses and understandable factors were extracted as a result. However, none of these proposed factor models have been verified to not be unique to their original sample through the use of independent sets of data. This paper seeks to confirm the factor models proposed by Scott et al. in 2012, and Hestenes et al. in 1992, as well as another expert model proposed within this study through the use of confirmatory factor analysis (CFA) and a sample of 20 822 postinstruction student responses to the FCI. Upon application of CFA using the full sample, all three models were found to fit the data with acceptable global fit statistics. However, when CFA was performed using these models on smaller sample sizes the models proposed by Scott et al. and Eaton and Willoughby were found to be far more stable than the model proposed by Hestenes et al. The goodness of fit of these models to the data suggests that the FCI can be scored on factors that are not unique to a single class. These scores could then be used to comment on how instruction methods affect the performance of students along a single factor, and more in-depth analyses of curriculum changes may be possible as a result.

  15. Semi-automated uranium analysis by a modified Davies--Gray procedure

    International Nuclear Information System (INIS)

    Swanson, G.C.

    1977-01-01

    To determine uranium in fuel materials rapidly and reliably, a semi-automated implementation of the Davies-Gray uranium titration was developed. The Davies-Gray method is essentially a three-step procedure. First, uranium is reduced quantitatively from the +6 to the +4 valence state by an excess of iron(II) in strong phosphoric acid in the absence of nitrite; prior to the uranium reduction, nitrite is destroyed by addition of sulfamic acid. In the second step, iron(II) is selectively oxidized to iron(III) by nitric acid in the presence of a Mo(VI) catalyst. Finally, after dilution to reduce the phosphate concentration, the uranium is titrated back to U(VI) by standard dichromate. The original sluggish colorimetric endpoint determination used by Davies and Gray is seldom employed, since New Brunswick Laboratory discovered that addition of vanadium(IV) just prior to titration improves the reaction rate sufficiently to allow a potentiometric endpoint determination. One of the advantages of the Davies-Gray titration is that it is quite specific for uranium: most common impurity elements do not interfere with the analysis, and in particular high levels of Pu, Th, and Fe are tolerated.
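
    The final titration step converts directly into a uranium mass via redox stoichiometry: each U(IV) donates two electrons and each dichromate ion accepts six, so one mole of Cr2O7(2-) titrates three moles of uranium. A sketch of that back-calculation, with invented titrant values:

```python
def uranium_mass_mg(titrant_molarity, titrant_volume_ml):
    """Uranium (mg) from the dichromate titration step of a Davies-Gray assay.
    Stoichiometry: 6 e- accepted per Cr2O7(2-) / 2 e- donated per U(IV)."""
    mol_cr2o7 = titrant_molarity * titrant_volume_ml / 1000.0
    mol_u = 3.0 * mol_cr2o7
    return mol_u * 238.03 * 1000.0   # atomic mass of natural U ~ 238.03 g/mol

# invented example: 10.0 mL of 0.025 M dichromate consumed at the endpoint
print(round(uranium_mass_mg(0.025, 10.0), 1))   # → 178.5
```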

  16. A simplified calculation procedure for mass isotopomer distribution analysis (MIDA) based on multiple linear regression.

    Science.gov (United States)

    Fernández-Fernández, Mario; Rodríguez-González, Pablo; García Alonso, J Ignacio

    2016-10-01

    We have developed a novel, rapid and easy calculation procedure for Mass Isotopomer Distribution Analysis based on multiple linear regression which allows the simultaneous calculation of the precursor pool enrichment and the fraction of newly synthesized labelled proteins (fractional synthesis) using linear algebra. To test this approach, we used the peptide RGGGLK as a model tryptic peptide containing three subunits of glycine. We selected glycine labelled in two 13C atoms (13C2-glycine) as the labelled amino acid to demonstrate that spectral overlap is not a problem in the proposed methodology. The developed methodology was tested first in vitro by changing the precursor pool enrichment from 10 to 40% of 13C2-glycine. Secondly, a simulated in vivo synthesis of proteins was designed by combining the natural-abundance RGGGLK peptide and 10 or 20% 13C2-glycine at 1:1, 1:3 and 3:1 ratios. Precursor pool enrichments and fractional synthesis values were calculated with satisfactory precision and accuracy using a simple spreadsheet. This novel approach can provide a relatively rapid and easy means to measure protein turnover based on stable isotope tracers. Copyright © 2016 John Wiley & Sons, Ltd.
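
    The regression idea can be sketched in a few lines: the measured isotopomer spectrum is modelled as a linear mix of the pre-existing (natural-abundance) pattern and the pattern of newly synthesized peptide at a given precursor enrichment, and the mixing fractions come from least squares. The basis patterns below are invented, not the RGGGLK values from the paper:

```python
import numpy as np

# columns: isotopomer pattern of (i) pre-existing peptide and
# (ii) newly synthesized peptide at one assumed precursor enrichment
A = np.array([[0.90, 0.40],
              [0.08, 0.35],
              [0.02, 0.25]])

# measured spectrum: here an exact 75/25 mixture of the two columns
b = np.array([0.775, 0.1475, 0.0775])

# least-squares mixing fractions; x[1] is the fractional synthesis
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x, 3))   # → [0.75 0.25]
```

Repeating the fit over a grid of assumed precursor enrichments (each giving a different second column) and keeping the best-fitting one recovers the enrichment and the fractional synthesis simultaneously.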

  17. Error Analysis Of Students Working About Word Problem Of Linear Program With NEA Procedure

    Science.gov (United States)

    Santoso, D. A.; Farid, A.; Ulum, B.

    2017-06-01

    Evaluation and assessment are an important part of learning. In the evaluation of learning, written tests are still commonly used; however, the tests are usually not followed up by further evaluation. The process stops at the grading stage and does not examine the solution process and the errors made by students. Yet if a student shows a pattern of errors or a process error, remedial action can be focused on the fault and on why it happened. The NEA (Newman's Error Analysis) procedure provides a way for educators to evaluate student progress more comprehensively. In this study, students' mistakes in working on word problems about linear programming have been analyzed. As a result, the mistakes students make most often lie in the modeling (transformation) phase and in process skills, with overall percentage distributions of 20% and 15%, respectively. According to the observations, these errors occur most commonly due to students' lack of precision in modeling and to hasty calculation. Through this error analysis, it is expected that educators can determine and apply the right remedial approach in the next lesson.

  18. A Sensitive Photometric Procedure for Cobalt Determination in Water Employing a Compact Multicommuted Flow Analysis System.

    Science.gov (United States)

    da Silva Magalhães, Ticiane; Reis, Boaventura F

    2017-09-01

    In this work, a multicommuted flow analysis procedure is proposed for the spectrophotometric determination of cobalt in fresh water, employing an instrument setup of downsized dimension and improved cost-effectiveness. The method is based on the catalytic effect of Co(II) on the oxidation of Tiron by hydrogen peroxide in alkaline medium, forming a complex that absorbs radiation at 425 nm. The photometric detection was accomplished using a homemade light-emitting-diode (LED)-based photometer designed to use a flow cell with an optical path length of 100 mm to improve sensitivity. After selecting adequate values for the flow system variables, adherence to the Beer-Lambert-Bouguer law was observed for standard solution concentrations in the range of 0.13-1.5 µg L⁻¹ Co(II). Other useful features were achieved, including a relative standard deviation of 2.0% (n = 11) for a sample with 0.49 µg L⁻¹ Co(II), a detection limit of 0.06 µg L⁻¹ Co(II) (n = 20), an analytical frequency of 42 sample determinations per hour, and waste generation of 1.5 mL per determination.
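
    Under the Beer-Lambert-Bouguer law the calibration is a straight line, so converting a sample absorbance back to a Co(II) concentration is a one-line inversion of a fitted line. The standards below are invented numbers, merely consistent with the reported 0.13-1.5 µg/L range:

```python
import statistics

# invented calibration data: standards (ug/L Co) vs absorbance at 425 nm
conc = [0.13, 0.40, 0.80, 1.20, 1.50]
absb = [0.021, 0.064, 0.128, 0.192, 0.240]

# ordinary least-squares calibration line
mc, ma = statistics.mean(conc), statistics.mean(absb)
slope = sum((c - mc) * (a - ma) for c, a in zip(conc, absb)) / \
        sum((c - mc) ** 2 for c in conc)
intercept = ma - slope * mc

def concentration(a):
    """Invert the calibration line: absorbance -> ug/L Co(II)."""
    return (a - intercept) / slope

print(round(concentration(0.100), 2))   # → 0.62
```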

  19. Reliable and Efficient Procedure for Steady-State Analysis of Nonautonomous and Autonomous Systems

    Directory of Open Access Journals (Sweden)

    J. Dobes

    2012-04-01

    The majority of contemporary design tools still do not contain steady-state algorithms, especially for autonomous systems. This is mainly caused by insufficient accuracy of the algorithms for numerical integration, but also by the unreliability of the steady-state algorithms themselves. Therefore, in this paper, a very stable and efficient procedure for the numerical integration of nonlinear differential-algebraic systems is defined first. Afterwards, two improved methods are defined for finding the steady state, which use this integration algorithm in their iteration loops. The first is based on the idea of extrapolation, and the second utilizes nonstandard time-domain sensitivity analysis. The two steady-state algorithms are compared by analyses of a rectifier and a class-C amplifier, and the extrapolation algorithm is selected as the more reliable alternative. Finally, the extrapolation method, cooperating naturally with the algorithm for solving the differential-algebraic systems, is thoroughly tested on various electronic circuits: Van der Pol and Colpitts oscillators, a fragment of a large bipolar logic circuit, feedback and distributed microwave oscillators, and a power amplifier. The results confirm that the extrapolation method is faster than classical plain numerical integration, especially for larger circuits with complicated transients.
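
    The extrapolation idea can be illustrated on a scalar forced, circuit-like ODE: integrate over one forcing period to obtain the period map, then apply Aitken extrapolation to three successive period-end states to jump to the map's fixed point (the periodic steady state) instead of integrating through the whole transient. Everything below (the equation, damping, step count) is an invented toy, not the paper's algorithm:

```python
import math

def one_period(x, damping=0.05, steps=2000):
    """Integrate x' = -damping*x + cos(t) over one forcing period (RK4)."""
    h = 2 * math.pi / steps
    t = 0.0
    f = lambda t, x: -damping * x + math.cos(t)
    for _ in range(steps):
        k1 = f(t, x)
        k2 = f(t + h / 2, x + h / 2 * k1)
        k3 = f(t + h / 2, x + h / 2 * k2)
        k4 = f(t + h, x + h * k3)
        x += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return x

# three successive period-end states from a transient starting point
x0 = 5.0
x1 = one_period(x0)
x2 = one_period(x1)
x3 = one_period(x2)

# Aitken extrapolation to the fixed point of the period map (steady state)
x_star = x3 - (x3 - x2) ** 2 / ((x3 - x2) - (x2 - x1))
print(round(x_star, 4))   # analytic steady state at t=0 is 0.05/(1+0.05**2) ≈ 0.0499
```

Because the toy ODE is linear, the period map is affine and the extrapolation lands on the steady state after only three periods, whereas plain integration with damping 0.05 needs dozens of periods for the transient to die out.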

  20. Hemodynamic outcomes of the Ross procedure versus other aortic valve replacement: a systematic review and meta-analysis.

    Science.gov (United States)

    Um, Kevin J; McClure, Graham R; Belley-Cote, Emilie P; Gupta, Saurabh; Bouhout, Ismail; Lortie, Hugo; Alraddadi, Hatim; Alsagheir, Ali; Bossard, Matthias; McIntyre, William F; Lengyel, Alexandra; Eikelboom, John W; Ouzounian, Maral; Chu, Michael W; Parry, Dominic; El-Hamamsy, Ismail; Whitlock, Richard P

    2018-01-09

    Life expectancy in young adults undergoing mechanical or bioprosthetic aortic valve replacement (AVR) may be reduced by up to 20 years compared to age-matched controls. The Ross procedure is a durable, anticoagulation-sparing alternative. We performed a systematic review and meta-analysis to compare the valve hemodynamics of the Ross procedure versus other AVR. We searched Cochrane CENTRAL, MEDLINE and EMBASE from inception to February 2017 for randomized controlled trials (RCTs) and observational studies (n≥10 Ross). Independently and in duplicate, we performed title and abstract screening, full-text eligibility assessment, and data collection. We evaluated the risk of bias with the Cochrane and CLARITY tools, and the quality of evidence with the GRADE framework. We identified 2 RCTs and 13 observational studies that met the eligibility criteria (n=1,412). In observational studies, the Ross procedure was associated with a lower mean aortic gradient at discharge (MD -9 mmHg, 95% CI [-13, -5], p<…). The Ross procedure was also associated with a lower mean gradient at latest follow-up (MD -15 mmHg, 95% CI [-32, 2], p=0.08, I2=99%). The mean pulmonic gradient for the Ross procedure was 18.0 mmHg (95% CI [16, 20], p<…). Overall, the Ross procedure was associated with better aortic valve hemodynamics. Future studies should evaluate the impact of the Ross procedure on exercise capacity and quality of life.
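
    The pooled mean differences quoted above come from inverse-variance weighting. A fixed-effect version is only a few lines; the study values are invented, and a real analysis with I2=99% would call for a random-effects model instead:

```python
import math

def pooled_md(studies):
    """Fixed-effect inverse-variance pooling of mean differences.
    Each study is a (mean_difference, standard_error) pair."""
    w = [1 / se ** 2 for _, se in studies]
    md = sum(wi * mi for wi, (mi, _) in zip(w, studies)) / sum(w)
    se = math.sqrt(1 / sum(w))
    return md, (md - 1.96 * se, md + 1.96 * se)

# invented discharge-gradient MDs (mmHg) from three hypothetical studies
studies = [(-8.0, 1.5), (-10.0, 2.0), (-9.5, 1.0)]
md, ci = pooled_md(studies)
print(round(md, 2), tuple(round(x, 2) for x in ci))   # → -9.18 (-10.69, -7.67)
```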

  1. Probabilistic safety analysis procedures guide, Sections 8-12. Volume 2, Rev. 1

    International Nuclear Information System (INIS)

    McCann, M.; Reed, J.; Ruger, C.; Shiu, K.; Teichmann, T.; Unione, A.; Youngblood, R.

    1985-08-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. It will be revised as comments are received, and as experience is gained from its use. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. The first volume of the guide describes the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant (i.e., intrinsic to plant operation) and from loss of off-site electric power. The scope includes human reliability analysis, a determination of the importance of various core damage accident sequences, and an explicit treatment and display of uncertainties for key accident sequences. This second volume deals with the treatment of the so-called external events including seismic disturbances, fires, floods, etc. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance). This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are valuable for regulatory decision making. For internal events, methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study. For external events, more explicit guidance is given

  2. Cost-benefit analysis model: A tool for area-wide fruit fly management. Procedures manual

    International Nuclear Information System (INIS)

    Enkerlin, W.; Mumford, J.; Leach, A.

    2007-03-01

    The Generic Fruit Fly Cost-Benefit Analysis Model assists in economic decision making associated with area-wide fruit fly control options. The FRUIT FLY COST-BENEFIT ANALYSIS PROGRAM (available on 1 CD-ROM from the Joint FAO/IAEA Programme of Nuclear Techniques in Food and Agriculture) is an Excel 2000 Windows-based program, for which all standard Windows and Excel conventions apply. The Model is user-friendly and thus largely self-explanatory. Nevertheless, it includes a procedures manual that has been prepared to guide the user and should therefore be used together with the software. Please note that the table presenting the pest management options on the introductory page of the model is controlled by spin buttons and click boxes; these controls are linked to macros that hide non-relevant tables and boxes. N.B. It is important that the medium level of security is selected from the Tools menu of Excel: go to Tools|Macros|Security and select Medium. When the file is opened, a form will appear containing three buttons; click on the middle button, 'Enable Macros', so that the macros may be used. Ideally, the model should be used as a support tool by working groups aiming to assess the economic returns of different fruit fly control options (suppression, eradication, containment and prevention). The working group should include professionals in agriculture with experience in area-wide implementation of integrated pest management programmes, an economist or at least someone with basic knowledge of economics, and, if relevant, an entomologist with some background in the application of the sterile insect technique (SIT).
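
    At its core, such a model discounts the projected cost and benefit streams of each control option and compares them, for instance as a benefit-cost ratio. A minimal sketch with invented cash flows and an assumed 5% discount rate (not figures from the FAO/IAEA model):

```python
def npv(flows, rate):
    """Net present value of yearly cash flows; year 0 is undiscounted."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# invented 5-year streams for one control option (e.g. an SIT programme)
costs = [100, 20, 20, 20, 20]
benefits = [0, 40, 60, 80, 80]

bcr = npv(benefits, 0.05) / npv(costs, 0.05)
print(round(bcr, 2))   # → 1.33  (a ratio above 1 favours the option)
```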

  3. Probabilistic safety analysis procedures guide. Sections 1-7 and appendices. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Bari, R.A.; Buslik, A.J.; Cho, N.Z.

    1985-08-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. It will be revised as comments are received, and as experience is gained from its use. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This first volume of the guide describes the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant (i.e., intrinsic to plant operation) and from loss of off-site electric power. The scope includes human reliability analysis, a determination of the importance of various core damage accident sequences, and an explicit treatment and display of uncertainties for key accident sequences. The second volume deals with the treatment of the so-called external events including seismic disturbances, fires, floods, etc. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance). This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are valuable for regulatory decision making. For internal events, methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study. For external events, more explicit guidance is given

  4. A New Variable Weighting and Selection Procedure for K-Means Cluster Analysis

    Science.gov (United States)

    Steinley, Douglas; Brusco, Michael J.

    2008-01-01

    A variance-to-range ratio variable weighting procedure is proposed. We show how this weighting method is theoretically grounded in the inherent variability found in data exhibiting cluster structure. In addition, a variable selection procedure is proposed to operate in conjunction with the variable weighting technique. The performances of these…
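
    The intuition, that a variable carrying real cluster structure has a large variance relative to its range while a noise variable does not, can be checked directly. The weight formula below is a sketch of the variance-to-range idea only, not necessarily Steinley and Brusco's exact expression:

```python
import random, statistics

random.seed(1)

def vr_weight(col):
    """Variance-to-squared-range ratio for one variable (illustrative sketch
    of a variance-to-range weighting; the published formula may differ)."""
    rng = max(col) - min(col)
    return statistics.pvariance(col) / rng ** 2

# variable 1 carries two clear clusters; variable 2 is uniform noise
x1 = [random.gauss(0, 0.3) for _ in range(30)] + \
     [random.gauss(5, 0.3) for _ in range(30)]
x2 = [random.random() for _ in range(60)]

w1, w2 = vr_weight(x1), vr_weight(x2)
print(w1 > w2)   # → True: the clustered variable earns the larger weight
```

Before running k-means, each variable would then be multiplied by the square root of its weight, so distances are dominated by the variables that actually exhibit cluster structure.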

  5. An analysis of marketing authorisation applications via the mutual recognition and decentralised procedures in Europe

    DEFF Research Database (Denmark)

    Ebbers, Hans C; Langedijk, Joris; Bouvy, Jacoline C

    2015-01-01

    …the frequency of licensing failure prior to CMDh referrals. RESULTS: During the study period, 10,392 MRP/DCP procedures were finalized. Three hundred seventy-seven (3.6%) resulted in a referral procedure, of which 70 (19%) resulted in licensing failure, defined as refusal or withdrawal of the application…

  6. Procedure for hazards analysis of plutonium gloveboxes used in analytical chemistry operations

    International Nuclear Information System (INIS)

    Delvin, W.L.

    1977-06-01

    A procedure is presented to identify and assess hazards associated with gloveboxes used for analytical chemistry operations involving plutonium. This procedure is based upon analytic tree methodology and it has been adapted from the US Energy Research and Development Administration's safety program, the Management Oversight and Risk Tree

  7. Confirmatory factor analysis of the female sexual function index.

    Science.gov (United States)

    Opperman, Emily A; Benson, Lindsay E; Milhausen, Robin R

    2013-01-01

    The Female Sexual Functioning Index (Rosen et al., 2000) was designed to assess the key dimensions of female sexual functioning using six domains: desire, arousal, lubrication, orgasm, satisfaction, and pain. A full-scale score was proposed to represent women's overall sexual function. The fifth revision of the Diagnostic and Statistical Manual (DSM) is currently underway and includes a proposal to combine desire and arousal problems. The objective of this article was to evaluate and compare four models of the Female Sexual Functioning Index: (a) a single-factor model, (b) a six-factor model, (c) a second-order factor model, and (d) a five-factor model combining the desire and arousal subscales. Cross-sectional and observational data from 85 women were used to conduct a confirmatory factor analysis on the Female Sexual Functioning Index. Local and global goodness-of-fit measures, the chi-square test of differences, squared multiple correlations, and regression weights were used. The single-factor model fit was not acceptable. The original six-factor model was confirmed, and good model fit was found for the second-order and five-factor models. Delta chi-square tests of differences supported best fit for the six-factor model, validating usage of the six domains. However, when revisions are made in the DSM-5, the Female Sexual Functioning Index can adapt to reflect these changes and remain a valid assessment tool for women's sexual functioning, as the five-factor structure was also supported.

  8. Emergency procedures

    International Nuclear Information System (INIS)

    Abd Nasir Ibrahim; Azali Muhammad; Ab Razak Hamzah; Abd Aziz Mohamed; Mohammad Pauzi Ismail

    2004-01-01

    The following subjects are discussed - Emergency Procedures: emergency equipment, emergency procedures; emergency procedure involving X-Ray equipment; emergency procedure involving radioactive sources

  9. Zinc injection implementation process at EDF: risk analysis, chemical specifications and operating procedures

    International Nuclear Information System (INIS)

    Tigeras, A.; Stutzmann, A.; Bremnes, O.; Claeys, M.; Ranchoux, G.; Segura, J.C.; Errera, J.; Bonne, S.

    2010-01-01

    This paper presents the strategy and the different measures adopted by EDF to provide the necessary tools to the French units: zinc injection procedures, risk analysis, chemistry-radiochemistry surveillance programs, and chemical specifications. This work can be useful for other utilities, assisting them in optimizing and/or implementing zinc injection in the most suitable conditions, which would help to obtain the expected results in current and future reactors. (author)

  10. Comparative cost analysis -- computed tomography vs. alternative diagnostic procedures, 1977-1980

    International Nuclear Information System (INIS)

    Gempel, P.A.; Harris, G.H.; Evans, R.G.

    1977-12-01

    In comparing the total national cost of utilizing computed tomography (CT) for medically indicated diagnoses with that of conventional x-ray, ultrasonography, nuclear medicine, and exploratory surgery, this investigation concludes that there was little, if any, added net cost from CT use in 1977, nor will there be in 1980. Computed tomography, generally recognized as a reliable and useful diagnostic modality, has the potential to reduce net costs provided that an optimal number of units can be made available to physicians and patients to achieve projected reductions in alternative procedures. This study examines the actual cost impact of CT on both cranial and body diagnostic procedures. For abdominal and mediastinal disorders, CT scanning is just beginning to emerge as a diagnostic modality. As such, clinical experience is somewhat limited and the authors assume that no significant reduction in conventional procedures took place in 1977. It is estimated that the approximately 375,000 CT body procedures performed in 1977 represent only a 5 percent cost increase over use of other diagnostic modalities. It is projected that 2,400,000 CT body procedures will be performed in 1980 and, depending on assumptions used, total body diagnostic costs will increase only slightly or be reduced. Thirty-one tables appear throughout the text presenting cost data broken down by types of diagnostic procedures used and projections by years. Appendixes present technical cost components for diagnostic procedures, the comparative efficacy of CT as revealed in abstracts of published literature, selected medical diagnoses, and references.

  11. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.

  12. Clinicopathological Analysis of Factors Related to Colorectal Tumor Perforation

    OpenAIRE

    Medina-Arana, Vicente; Martínez-Riera, Antonio; Delgado-Plasencia, Luciano; Rodríguez-González, Diana; Bravo-Gutiérrez, Alberto; Álvarez-Argüelles, Hugo; Alarcó-Hernández, Antonio; Salido-Ruiz, Eduardo; Fernández-Peralta, Antonia M.; González-Aguilera, Juan J.

    2015-01-01

    Colorectal tumor perforation is a life-threatening complication of this disease. However, little is known about the anatomopathological factors or pathophysiologic mechanisms involved. Pathological and immunohistochemical analyses of factors related to tumoral neo-angiogenesis, which could influence tumor perforation, are assessed in this study. A retrospective study of patients with perforated colon tumors (Group P) and T4a nonperforated tumors (controls) was conducted between 2001 and 20...

  13. Analysis of Key Factors Driving Japan’s Military Normalization

    Science.gov (United States)

    2017-09-01

    no change to our policy of not giving in to terrorism."40 Though the prime minister was democratically supported, Koizumi's leadership style took...of the key driving factors of Japan's normalization. The areas of prime ministerial leadership, regional security threats, alliance issues, and...

  14. Capital Cost Optimization for Prefabrication: A Factor Analysis Evaluation Model

    Directory of Open Access Journals (Sweden)

    Hong Xue

    2018-01-01

    Full Text Available High capital cost is a significant hindrance to the promotion of prefabrication. In order to optimize cost management and reduce capital cost, this study aims to explore the latent factors and a factor analysis evaluation model. Semi-structured interviews were conducted to explore potential variables, and then a questionnaire survey was employed to collect professionals’ views on their effects. After data collection, exploratory factor analysis was adopted to explore the latent factors. Seven latent factors were identified, including “Management Index”, “Construction Dissipation Index”, “Productivity Index”, “Design Efficiency Index”, “Transport Dissipation Index”, “Material Increment Index” and “Depreciation Amortization Index”. With these latent factors, a factor analysis evaluation model (FAEM), divided into a factor analysis model (FAM) and a comprehensive evaluation model (CEM), was established. The FAM was used to explore the effect of observed variables on the high capital cost of prefabrication, while the CEM was used to evaluate the comprehensive cost management level on prefabrication projects. Case studies were conducted to verify the models. The results revealed that collaborative management had a positive effect on the capital cost of prefabrication. Material increment costs and labor costs had significant impacts on production cost. This study demonstrated the potential of on-site management and standardization design to reduce capital cost. Hence, collaborative management is necessary for cost management of prefabrication. Innovation and detailed design are needed to improve cost performance. The new form of precast component factories can be explored to reduce transportation cost. Meanwhile, targeted strategies can be adopted for different prefabrication projects. The findings optimized the capital cost and improved the cost performance through providing an evaluation and optimization model, which helps managers to

  15. A pragmatic approach to estimate alpha factors for common cause failure analysis

    International Nuclear Information System (INIS)

    Hassija, Varun; Senthil Kumar, C.; Velusamy, K.

    2014-01-01

    Highlights: • Estimation of coefficients in the alpha factor model for common cause analysis. • A derivation of plant-specific alpha factors is demonstrated. • We examine sensitivity of common cause contribution to total system failure. • We compare beta factor and alpha factor models for various redundant configurations. • The use of alpha factors is preferable, especially for large redundant systems. - Abstract: Most modern technological systems are deployed with high redundancy, but they still fail mainly on account of common cause failures (CCF). Various models such as Beta Factor, Multiple Greek Letter, Binomial Failure Rate and Alpha Factor exist for estimation of risk from common cause failures. Among these, the alpha factor model is considered most suitable for highly redundant systems, as it arrives at common cause failure probabilities from a set of ratios of failures and the total component failure probability Q_T. In the present study, the alpha factor model is applied to the assessment of CCF of safety systems deployed at two nuclear power plants. A method to overcome the difficulties in estimating the coefficients, viz., the alpha factors in the model, the importance of deriving plant-specific alpha factors, and the sensitivity of the common cause contribution to the total system failure probability with respect to the hazard imposed by various CCF events are highlighted. An approach described in NUREG/CR-5500 is extended in this study to provide more explicit guidance on a statistical approach to derive plant-specific coefficients for CCF analysis, especially for highly redundant systems. The procedure is expected to aid regulators in independent safety assessment.
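The alpha factor estimates described above derive from ratios of CCF event counts; a minimal sketch using one commonly cited form of the model (the non-staggered-testing formula in the style of NUREG/CR-5485; the event counts below are illustrative assumptions, not plant data):

```python
from math import comb

def alpha_factors(event_counts):
    """Point estimates of alpha factors from CCF event counts.

    event_counts[k-1] = number of events in which exactly k components failed.
    alpha_k is the fraction of all failure events involving exactly k components.
    """
    total = sum(event_counts)
    return [n / total for n in event_counts]

def ccf_probability(k, m, alphas, q_total):
    """Probability of a CCF event failing exactly k of m components.

    Non-staggered-testing form of the alpha factor model:
    Q_k = k / C(m-1, k-1) * (alpha_k / alpha_t) * Q_T,
    with alpha_t = sum over k of k * alpha_k.
    """
    alpha_t = sum((i + 1) * a for i, a in enumerate(alphas))
    return k / comb(m - 1, k - 1) * (alphas[k - 1] / alpha_t) * q_total

# Illustrative counts for a 3-component group: 90 single, 8 double, 2 triple failures
alphas = alpha_factors([90, 8, 2])
q3 = ccf_probability(3, 3, alphas, q_total=1e-3)
print(alphas, q3)
```

Plant-specific alpha factors follow from replacing the illustrative counts with the plant's own CCF event experience, which is the core of the derivation the paper advocates.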

  16. Analysis of evacuation procedure after the accident of the Fukushima Daiichi Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Murayama, T.; Iizuka, F.; El-Asaad, H. [Tokyo Inst. of Tech., Tokyo (Japan)

    2014-07-01

    After the Great East Japan Earthquake of March 2011 struck the coast of Eastern Japan, evacuation procedures were undermined due to the unexpected magnitude and severity of the disaster. Also, communications between local and national government were weakened, leading to dismemberment between society and government. Consequently this left the affected people without sufficient information or updates regarding evacuation procedures. This paper will concentrate on evacuation procedures led by locating residents with the help of media outlets (local newspapers and news reports). Analyzing movements of evacuees will help improve the evacuation method both for local residents and government bodies. (author)

  17. Analysis of evacuation procedure after the accident of the Fukushima Daiichi Nuclear Power Plant

    International Nuclear Information System (INIS)

    Murayama, T.; Iizuka, F.; El-Asaad, H.

    2014-01-01

    After the Great East Japan Earthquake of March 2011 struck the coast of Eastern Japan, evacuation procedures were undermined due to the unexpected magnitude and severity of the disaster. Also, communications between local and national government were weakened, leading to dismemberment between society and government. Consequently this left the affected people without sufficient information or updates regarding evacuation procedures. This paper will concentrate on evacuation procedures led by locating residents with the help of media outlets (local newspapers and news reports). Analyzing movements of evacuees will help improve the evacuation method both for local residents and government bodies. (author)

  18. Procedure for Market Analysis in the Research, Development and Innovation Management Unit (R+D+I) of the University of Mindelo

    Directory of Open Access Journals (Sweden)

    Joao Dias−da Silva

    2016-12-01

    Full Text Available Research, development and innovation (R+D+i) is a competitive factor in organizations and refers to a set of specificities that determine the need for efficient and effective management. The objective of this scientific paper was to show the application of a procedure for the market analysis of the R+D+i management unit of the University of Mindelo, Republic of Cape Verde. The results achieved by applying the procedure made it possible to improve the decision-making process in response to the needs of the clients represented by the companies ENAPOR, ELECTRA, among others. They likewise strengthened projects related to the development of software to support business management and its financing from external sources, as well as the competitiveness of the organization.

  19. Factor analysis of the contextual fine motor questionnaire in children.

    Science.gov (United States)

    Lin, Chin-Kai; Meng, Ling-Fu; Yu, Ya-Wen; Chen, Che-Kuo; Li, Kuan-Hua

    2014-02-01

    Most studies treat fine motor as one subscale in a developmental test; hence, further factor analysis of fine motor has not been conducted. In fact, fine motor has been treated as a multi-dimensional domain from both clinical and theoretical perspectives, and therefore knowing its factors would be valuable. The aim of this study is to analyze the internal consistency and factor validity of the Contextual Fine Motor Questionnaire (CFMQ). Based on ecological observation and the literature, the Contextual Fine Motor Questionnaire (CFMQ) was developed and includes 5 subscales: Pen Control, Tool Use During Handicraft Activities, the Use of Dining Utensils, Connecting and Separating during Dressing and Undressing, and Opening Containers. The main purpose of this study is to establish the factorial validity of the CFMQ through this factor analysis study. Among 1208 questionnaires, 904 were successfully completed. Data from the children's CFMQ submitted by primary care providers were analyzed, including 485 females (53.6%) and 419 males (46.4%) from grades 1 to 5, ranging in age from 82 to 167 months (M=113.9, SD=16.3). Cronbach's alpha was used to measure internal consistency and exploratory factor analysis was applied to test the five-factor structure within the CFMQ. Results showed that Cronbach's alpha coefficients of the CFMQ's 5 subscales ranged from .77 to .92 and all item-total correlations with corresponding subscales were larger than .4 except for one item. The factor loadings of almost all items on their assigned factors were larger than .5, except for 3 items. There were five factors, explaining a total of 62.59% of the variance for the CFMQ. In conclusion, the remaining 24 items in the 5 subscales of the CFMQ had appropriate internal consistency, test-retest reliability and construct validity. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Using Factor Analysis to Identify Topic Preferences Within MBA Courses

    Directory of Open Access Journals (Sweden)

    Earl Chrysler

    2003-02-01

    Full Text Available This study demonstrates the role of a principal components factor analysis in conducting a gap analysis as to the desired characteristics of business alumni. Typically, gap analyses merely compare the emphases that should be given to areas of inquiry with perceptions of actual emphases. As a result, the focus is upon depth of coverage. A neglected area in need of investigation is the breadth of topic dimensions and their differences between the normative (should offer) and the descriptive (actually offer). The implications of factor structures, as well as traditional gap analyses, are developed and discussed in the context of outcomes assessment.

  1. Root coverage procedures improve patient aesthetics. A systematic review and Bayesian network meta-analysis.

    Science.gov (United States)

    Cairo, Francesco; Pagliaro, Umberto; Buti, Jacopo; Baccini, Michela; Graziani, Filippo; Tonelli, Paolo; Pagavino, Gabriella; Tonetti, Maurizio S

    2016-11-01

    The aim of this study was to perform a systematic review (SR) of randomized controlled trials (RCTs) to explore whether periodontal plastic surgery procedures for the treatment of single and multiple gingival recessions (Rec) may improve aesthetics at patient and professional levels. In order to combine evidence from direct and indirect comparisons by different trials, a Bayesian network meta-analysis (BNM) was planned. A literature search on PubMed, Cochrane libraries, EMBASE, and hand-searched journals until January 2016 was conducted to identify RCTs presenting aesthetic outcomes after root coverage using standardized evaluations at patient and professional level. A total of 16 RCTs were selected in the SR; three RCTs presenting professional aesthetic evaluation with the Root coverage Aesthetic Score (RES) and three showing final self-perception using the Visual Analogue Scale (VAS Est) could be included in a BNM model. Coronally Advanced Flap plus Connective Tissue Graft (CAF + CTG) and CAF + Acellular Dermal Matrix (ADM) and Autologous Fibroblasts (AF) were associated with the best RES outcomes (best probability = 24% and 64%, respectively), while CAF + CTG and CAF + CTG + Enamel Matrix Derivatives (EMD) obtained the highest values of the VAS Est score (best probability = 44% and 26%, respectively). Periodontal Plastic Surgery (PPS) techniques applying grafts underneath CAF, with or without the addition of EMD, are associated with improved aesthetics assessed by final patient perception and by RES as a professional evaluation system. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Analysis of IFR driver fuel hot channel factors

    International Nuclear Information System (INIS)

    Ku, J.Y.; Chang, L.K.; Mohr, D.

    1994-01-01

    Thermal-hydraulic uncertainty factors for Integral Fast Reactor (IFR) driver fuels have been determined based primarily on the database obtained from the predecessor fuels used in the IFR prototype, Experimental Breeder Reactor II. The uncertainty factors were applied to the hot channel factor (HCF) analyses to obtain separate overall HCFs for fuel and cladding for steady-state analyses. A "semistatistical horizontal method" was used in the HCF analyses. The uncertainty factor of the fuel thermal conductivity dominates the effects considered in the HCF analysis; the uncertainty in fuel thermal conductivity will be reduced as more data are obtained to expand the currently limited database for the IFR ternary metal fuel (U-20Pu-10Zr). A set of uncertainty factors to be used for transient analyses has also been derived.
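Semistatistical hot channel factor methods combine subfactors in two groups; a minimal sketch under the common convention that deterministic (direct) subfactors multiply while statistical subfactors, each expressed at the same confidence level, combine as a root sum of squares (the subfactor values below are illustrative assumptions, not the paper's):

```python
from math import sqrt

def overall_hcf(direct_factors, statistical_factors):
    """Overall hot channel factor via a semistatistical combination.

    Direct subfactors are multiplied together; statistical subfactors
    combine as 1 + sqrt(sum((f_i - 1)^2)), and the two parts multiply.
    """
    direct = 1.0
    for f in direct_factors:
        direct *= f
    statistical = 1.0 + sqrt(sum((f - 1.0) ** 2 for f in statistical_factors))
    return direct * statistical

# Illustrative subfactors only
print(round(overall_hcf([1.02, 1.03], [1.10, 1.05, 1.08]), 4))
```

The root-sum-square treatment keeps independent statistical uncertainties from stacking as pessimistically as straight multiplication would, which is the point of the semistatistical approach.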

  3. Analysis of IFR driver fuel hot channel factors

    International Nuclear Information System (INIS)

    Ku, J.Y.; Chang, L.K.; Mohr, D.

    2004-01-01

    Thermal-hydraulic uncertainty factors for Integral Fast Reactor (IFR) driver fuels have been determined based primarily on the database obtained from the predecessor fuels used in the IFR prototype, Experimental Breeder Reactor II. The uncertainty factors were applied to the hot channel factor (HCF) analyses to obtain separate overall HCFs for fuel and cladding for steady-state analyses. A 'semistatistical horizontal method' was used in the HCF analyses. The uncertainty factor of the fuel thermal conductivity dominates the effects considered in the HCF analysis; the uncertainty in fuel thermal conductivity will be reduced as more data are obtained to expand the currently limited database for the IFR ternary metal fuel (U-20Pu-10Zr). A set of uncertainty factors to be used for transient analyses has also been derived. (author)

  4. Confirmatory Factor Analysis of the Procrastination Assessment Scale for Students

    Directory of Open Access Journals (Sweden)

    Ronald D. Yockey

    2015-10-01

    Full Text Available The relative fit of one- and two-factor models of the Procrastination Assessment Scale for Students (PASS was investigated using confirmatory factor analysis on an ethnically diverse sample of 345 participants. The results indicated that although the two-factor model provided better fit to the data than the one-factor model, neither model provided optimal fit. However, a two-factor model which accounted for common item theme pairs used by Solomon and Rothblum in the creation of the scale provided good fit to the data. In addition, a significant difference by ethnicity was also found on the fear of failure subscale of the PASS, with Whites having significantly lower scores than Asian Americans or Latino/as. Implications of the results are discussed and recommendations made for future work with the scale.

  5. Two Expectation-Maximization Algorithms for Boolean Factor Analysis

    Czech Academy of Sciences Publication Activity Database

    Frolov, A. A.; Húsek, Dušan; Polyakov, P.Y.

    2014-01-01

    Roč. 130, 23 April (2014), s. 83-97 ISSN 0925-2312 R&D Projects: GA ČR GAP202/10/0262 Grant - others:GA MŠk(CZ) ED1.1.00/02.0070; GA MŠk(CZ) EE.2.3.20.0073 Program:ED Institutional research plan: CEZ:AV0Z10300504 Keywords : Boolean Factor analysis * Binary Matrix factorization * Neural networks * Binary data model * Dimension reduction * Bars problem Subject RIV: IN - Informatics, Computer Science Impact factor: 2.083, year: 2014

  6. An alternative method for noise analysis using pixel variance as part of quality control procedures on digital mammography systems.

    NARCIS (Netherlands)

    Bouwman, R.; Young, K.; Lazzari, B.; Ravaglia, V.; Broeders, M.J.M.; Engen, R. van

    2009-01-01

    According to the European Guidelines for quality assured breast cancer screening and diagnosis, noise analysis is one of the measurements that needs to be performed as part of quality control procedures on digital mammography systems. However, the method recommended in the European Guidelines does

  7. 78 FR 37463 - Expedited Approval of Alternative Test Procedures for the Analysis of Contaminants Under the Safe...

    Science.gov (United States)

    2013-06-21

    ... Approval of Alternative Test Procedures for the Analysis of Contaminants Under the Safe Drinking Water Act... page 32570, with the table entitled ``ALTERNATIVE TESTING METHODS FOR CONTAMINANTS LISTED AT 40 CFR 141... Contaminants Listed at 40 CFR 141.25(a) SM 21st Edition SM 22nd Edition Contaminant Methodology \\1\\ \\28\\ ASTM...

  8. 75 FR 32295 - Expedited Approval of Alternative Test Procedures for the Analysis of Contaminants Under the Safe...

    Science.gov (United States)

    2010-06-08

    ... Approval of Alternative Test Procedures for the Analysis of Contaminants Under the Safe Drinking Water Act... methods for use in measuring the levels of contaminants in drinking water and determining compliance with... required to measure contaminants in drinking water samples. In addition, EPA Regions as well as States and...

  9. Workplace Innovation: Exploratory and Confirmatory Factor Analysis for Construct Validation

    Directory of Open Access Journals (Sweden)

    Wipulanusat Warit

    2017-06-01

    Full Text Available Workplace innovation enables the development and improvement of products, processes and services, leading simultaneously to improvement in organisational performance. This study has the purpose of examining the factor structure of workplace innovation. Survey data, extracted from the 2014 APS employee census and comprising 3,125 engineering professionals in the Commonwealth of Australia's departments, were analysed using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). EFA returned a two-factor structure explaining 69.1% of the variance of the construct. CFA revealed that the two-factor structure was indicated as a validated model (GFI = 0.98, AGFI = 0.95, RMSEA = 0.08, RMR = 0.02, IFI = 0.98, NFI = 0.98, CFI = 0.98, and TLI = 0.96). Both factors showed good reliability of the scale (Individual creativity: α = 0.83, CR = 0.86, and AVE = 0.62; Team Innovation: α = 0.82, CR = 0.88, and AVE = 0.61). These results confirm that the two factors extracted for characterising workplace innovation included individual creativity and team innovation.

  10. Ranking insurance firms using AHP and Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Khodaei Valahzaghard

    2013-03-01

    Full Text Available The insurance industry constitutes a significant part of the economy, and it is important to learn more about the capabilities of the different firms active in this industry. In this paper, we present an empirical study to rank insurance firms using the analytic hierarchy process as well as factor analysis. The study considers four criteria: capital adequacy, quality of earnings, quality of cash flow and quality of firms' assets. The results of the implementation of factor analysis (FA) were verified using the Kaiser-Meyer-Olkin (KMO = 0.573) and Bartlett's chi-square (443.267, p-value = 0.000) tests. According to the results of FA, the first factor, capital adequacy, represents 21.557% of total variance; the second factor, quality of income, represents 20.958% of total variance; the third factor, quality of cash flow, represents 19.417% of total variance; and the last factor, quality of assets, represents 18.641% of total variance. The study has also used the analytic hierarchy process (AHP) to rank the insurance firms. The results of our survey indicate that capital adequacy (0.559) is the most important factor, followed by quality of income (0.235), quality of cash flow (0.144) and quality of assets (0.061). The results of AHP are consistent with the results of FA, which somewhat validates the overall study.
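Saaty's eigenvector method behind AHP weightings like those above can be sketched as follows; the pairwise comparison matrix below is hypothetical, chosen only so the resulting weights land near the study's reported values (0.559, 0.235, 0.144, 0.061):

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights from an AHP pairwise comparison matrix.

    Uses the principal right eigenvector (Saaty's eigenvector method),
    normalized to sum to 1.
    """
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    principal = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, principal].real)
    return w / w.sum()

def consistency_ratio(pairwise):
    """Saaty consistency ratio CR = CI / RI, with CI = (lambda_max - n) / (n - 1)."""
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    lambda_max = np.linalg.eigvals(a).real.max()
    ci = (lambda_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random-index table
    return ci / ri

# Hypothetical 4x4 matrix for the four criteria
# (capital adequacy, quality of income, quality of cash flow, quality of assets)
m = [[1,     3,   4,   7],
     [1 / 3, 1,   2,   5],
     [1 / 4, 1 / 2, 1, 3],
     [1 / 7, 1 / 5, 1 / 3, 1]]
w = ahp_priorities(m)
print(np.round(w, 3), round(consistency_ratio(m), 3))
```

A consistency ratio below 0.1 is the usual threshold for accepting the judgments encoded in the matrix.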

  11. WHY DO SOME NATIONS SUCCEED AND OTHERS FAIL IN INTERNATIONAL COMPETITION? FACTOR ANALYSIS AND CLUSTER ANALYSIS AT EUROPEAN LEVEL

    Directory of Open Access Journals (Sweden)

    Popa Ion

    2015-07-01

    Full Text Available As stated by Michael Porter (1998: 57), 'this is perhaps the most frequently asked economic question of our times.' However, a widely accepted answer is still missing. The aim of this paper is not to provide the BIG answer to such a BIG question, but rather to provide a different perspective on competitiveness at the national level. In this respect, we followed a two-step procedure called “tandem analysis” (OECD, 2008). First, we employed a factor analysis in order to reveal the underlying factors of the initial dataset, followed by a cluster analysis aimed at classifying the 35 countries according to the main characteristics of competitiveness resulting from the factor analysis. The findings revealed that clustering the 35 states on the first two factors, Smart Growth and Market Development, which recover almost 76% of the common variability of the twelve original variables, highlights four clusters, along with a series of useful information for analyzing the characteristics of the four clusters and discussing them.
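The two-step "tandem analysis" (factor extraction, then clustering on the factor scores) can be sketched with numpy alone; the data below are a synthetic stand-in, not the study's 35-country, twelve-indicator dataset:

```python
import numpy as np

def pca_factors(x, n_factors):
    """Principal-component factor scores from standardized data.

    Extracts the top eigenvectors of the correlation matrix and returns the
    factor scores plus the share of variance those factors recover.
    """
    z = (x - x.mean(axis=0)) / x.std(axis=0, ddof=1)
    corr = np.corrcoef(z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)       # eigenvalues ascending
    idx = np.argsort(eigvals)[::-1][:n_factors]   # keep the largest ones
    explained = eigvals[idx].sum() / eigvals.sum()
    return z @ eigvecs[:, idx], explained

def kmeans(points, k, n_iter=50, rng_seed=0):
    """Plain Lloyd's algorithm, sufficient for a small country-by-factor matrix."""
    rng = np.random.default_rng(rng_seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((points[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

# Synthetic stand-in: 35 "countries" observed on 12 indicators
rng = np.random.default_rng(42)
data = rng.normal(size=(35, 12))
scores, var_share = pca_factors(data, n_factors=2)
labels = kmeans(scores, k=4)
print(scores.shape, round(var_share, 2), sorted(set(labels)))
```

Clustering on factor scores rather than the raw indicators is exactly the point of the tandem approach: the noise outside the retained factors is discarded before the partitioning step.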

  12. Chronic subdural hematoma : a systematic review and meta-analysis of surgical procedures

    NARCIS (Netherlands)

    Liu, Weiming; Bakker, Nicolaas A.; Groen, Rob J. M.

    Object. In this paper the authors systematically evaluate the results of different surgical procedures for chronic subdural hematoma (CSDH). Methods. The MEDLINE, Embase, Cochrane Central Register of Controlled Trials, and other databases were scrutinized according to the PRISMA (Preferred Reporting

  13. Health economic analysis of laparoscopic lavage versus Hartmann's procedure for diverticulitis in the randomized DILALA trial

    DEFF Research Database (Denmark)

    Gehrman, J.; Angenete, E; Björholt, I.

    2016-01-01

    Background: Open surgery with resection and colostomy (Hartmann's procedure) has been the standard treatment for perforated diverticulitis with purulent peritonitis. In recent years laparoscopic lavage has emerged as an alternative, with potential benefits for patients with purulent peritonitis...

  14. Patient safety during procedural sedation using capnography monitoring : A systematic review and meta-analysis

    NARCIS (Netherlands)

    Saunders, Rhodri; Struys, Michel M. R. F.; Pollock, Richard F.; Mestek, Michael; Lightdale, Jenifer R.

    2017-01-01

    Objective: To evaluate the effect of capnography monitoring on sedation-related adverse events during procedural sedation and analgesia (PSA) administered for ambulatory surgery relative to visual assessment and pulse oximetry alone. Design and setting: Systematic literature review and random

  15. A factor analysis to find critical success factors in retail brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

    Full Text Available The present exploratory study aims to find the critical components of a retail brand among some retail stores. The study seeks to build a brand name at the retail level and looks to find the important factors affecting it. Customer behavior is largely influenced when the first retail customer experience is formed. These factors have direct impacts on customer experience and satisfaction in the retail industry. The proposed study performs an empirical investigation on two well-known retail stores located in the city of Tehran, Iran. Using a sample of 265 regular customers, the study uses factor analysis and extracts four main factors, including related brand, product benefits, customer welfare strategy and corporate profits, from the existing 31 factors in the literature.

  16. Worry About Caregiving Performance: A Confirmatory Factor Analysis

    Directory of Open Access Journals (Sweden)

    Ruijie Li

    2018-03-01

    Full Text Available Recent studies on the Zarit Burden Interview (ZBI) support the existence of a unique factor, worry about caregiving performance (WaP), beyond role and personal strain. Our current study aims to confirm the existence of WaP within the multidimensionality of the ZBI and to determine whether predictors of WaP differ from those of role and personal strain. We performed confirmatory factor analysis (CFA) on 466 caregiver-patient dyads to compare one-factor (total score), two-factor (role/personal strain), three-factor (role/personal strain and WaP), and four-factor models (role strain split into two factors). We conducted linear regression analyses to explore the relationships between different ZBI factors and socio-demographic and disease characteristics, and investigated the stage-dependent differences between WaP and role and personal strain by dyadic relationship. The four-factor structure that incorporated WaP and split role strain into two factors yielded the best fit. Linear regression analyses revealed that the variables significantly predicting WaP (adult child caregiver and Neuropsychiatric Inventory Questionnaire (NPI-Q) severity) differed from those predicting role/personal strain (adult child caregiver, instrumental activities of daily living, and NPI-Q distress). Unlike other factors, WaP was significantly endorsed in early cognitive impairment. Among spouses, WaP remained low across Clinical Dementia Rating (CDR) stages until a sharp rise at CDR 3; adult child and sibling caregivers experience a gradual rise throughout the stages. Our results affirm the existence of WaP as a unique factor. Future research should explore the potential of WaP as a possible intervention target to improve self-efficacy in the milder stages of burden.

  17. Analysis of four dental alloys following torch/centrifugal and induction/ vacuum-pressure casting procedures.

    Science.gov (United States)

    Thompson, Geoffrey A; Luo, Qing; Hefti, Arthur

    2013-12-01

    Previous studies have shown casting methodology to influence the as-cast properties of dental casting alloys. It is important to consider clinically important mechanical properties so that the influence of casting can be clarified. The purpose of this study was to evaluate how torch/centrifugal and induction/vacuum-pressure casting machines may affect the castability, microhardness, chemical composition, and microstructure of 2 high noble, 1 noble, and 1 base metal dental casting alloys. Two commonly used methods for casting were selected for comparison: torch/centrifugal casting and inductively heated/vacuum-pressure casting. One hundred and twenty castability patterns were fabricated and divided into 8 groups. Four groups were torch/centrifugally cast in Olympia (O), Jelenko O (JO), Genesis II (G), and Liberty (L) alloys. Similarly, 4 groups were cast in O, JO, G, and L by an induction/vacuum-pressure casting machine. Each specimen was evaluated for casting completeness to determine a castability value, while porosity was determined by standard x-ray techniques. Each group was metallographically prepared for further evaluation that included chemical composition, Vickers microhardness, and grain analysis of microstructure. Two-way ANOVA was used to determine significant differences among the main effects. Statistically significant effects were examined further with the Tukey HSD procedure for multiple comparisons. Data obtained from the castability experiments were non-normal and the variances were unequal. They were analyzed statistically with the Kruskal-Wallis rank sum test. Significant results were further investigated statistically with the Steel-Dwass method for multiple comparisons (α=.05). The alloy type had a significant effect on surface microhardness, whereas the casting technique did not affect the microhardness of the test specimen (P=.465). Similarly, the interaction between the alloy and casting technique was not significant (P=.119). A high

  18. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    Science.gov (United States)

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected with the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
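    The core idea, turning an arbitrarily complex point cloud into a solid of voxel elements, can be sketched in a few lines; the `voxelize` helper and the toy cloud are illustrative inventions, not the authors' implementation:

    ```python
    import numpy as np

    def voxelize(points, voxel_size):
        """Map 3-D points to the set of occupied voxel indices; each occupied
        cell would become one hexahedral finite element of the solid model."""
        idx = np.floor(np.asarray(points) / voxel_size).astype(int)
        return np.unique(idx, axis=0)

    # Toy cloud: three points spanning two stacked 1 m sections of a wall.
    pts = [[0.1, 0.1, 0.1], [0.9, 0.1, 0.1], [0.1, 0.1, 1.1]]
    voxels = voxelize(pts, 1.0)  # two occupied voxels: (0,0,0) and (0,0,1)
    ```

    In the actual procedure the cloud is first sliced into horizontal point sections, and the inner and outer surfaces bound which cells are filled.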

  19. Complications Following Common Inpatient Urological Procedures: Temporal Trend Analysis from 2000 to 2010.

    Science.gov (United States)

    Meyer, Christian P; Hollis, Michael; Cole, Alexander P; Hanske, Julian; O'Leary, James; Gupta, Soham; Löppenberg, Björn; Zavaski, Mike E; Sun, Maxine; Sammon, Jesse D; Kibel, Adam S; Fisch, Margit; Chun, Felix K H; Trinh, Quoc-Dien

    2016-04-01

    Measuring procedure-specific complication-rate trends allows for benchmarking and improvement in quality of care but must be done in a standardized fashion. Using the Nationwide Inpatient Sample, we identified all instances of eight common inpatient urologic procedures performed in the United States between 2000 and 2010. This yielded 327218 cases including both oncologic and benign diseases. Complications were identified by International Classification of Diseases, Ninth Revision codes. Each complication was cross-referenced to the procedure code and graded according to the standardized Clavien system. The Mann-Whitney and chi-square tests were used to assess the statistical significance of medians and proportions, respectively. We assessed temporal variability in the rates of overall complications (Clavien grade 1-4), length of hospital stay, and in-hospital mortality using the estimated annual percent change (EAPC) linear regression methodology. We observed an overall reduction in length of stay (EAPC: -1.59; p < 0.05), while complication trends showed a significant increase for inpatient ureterorenoscopy (EAPC: 5.53; p < 0.05). This analysis characterizes temporal trends of urologic procedures and their complications. A significant shift toward sicker patients and more complex procedures in the inpatient setting was found, but this did not result in higher mortality. These results are indicators of the high quality of care for urologic procedures in the inpatient setting. Copyright © 2015 European Association of Urology. Published by Elsevier B.V. All rights reserved.
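    The EAPC methodology referenced above fits a log-linear trend to annual rates and converts the slope to a percent change; a minimal sketch on synthetic rates (not the study's data):

    ```python
    import numpy as np

    def eapc(years, rates):
        """Estimated annual percent change: regress ln(rate) on calendar
        year, then convert the slope b to a percentage, 100 * (exp(b) - 1)."""
        slope, _intercept = np.polyfit(years, np.log(rates), 1)
        return 100.0 * (np.exp(slope) - 1.0)

    years = np.arange(2000, 2011)
    rates = 10.0 * 1.05 ** (years - 2000)  # synthetic series growing 5 %/yr
    change = eapc(years, rates)            # recovers ~5.0 % per year
    ```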

  20. Development of Procedures for the Analysis of Components of Dumped Chemical Weapons and Their Principal Transformation Products in Sea Water

    International Nuclear Information System (INIS)

    Saveleva, E. I.; Koryagina, N. L.; Radilov, A. S.; Khlebnikova, N. S.; Khrustaleva, V. S.

    2007-01-01

    A package of chemical analytical procedures was developed for the detection of products indicative of the presence of dumped chemical weapons in the Baltic Sea. The principal requirements imposed upon the procedures were the following: high sensitivity, reliable identification of target compounds, a wide range of components covered by survey analysis, and lack of interferences from sea salts. Thiodiglycol, a product of hydrolysis of sulfur mustard reportedly always detected at the sites of dumping of chemical weapons in the Baltic Sea, was considered the principal marker. We developed a high-sensitivity procedure for the determination of thiodiglycol in sea water, involving evaporation of samples to dryness in a vacuum concentrator, followed by tert-butyldimethylsilylation of the residue and GC-MS analysis in the SIM mode with meta-fluorobenzoic acid as internal reference. The detection limit of thiodiglycol was 0.001 mg/l, and the procedure throughput was up to 30 samples per day. The same procedure, but with BSTFA as derivatizing agent instead of MTBSTFA, was used for preparing samples for survey analysis of nonvolatile components. In this case, full mass spectra were measured in the GC-MS analysis. The use of BSTFA was motivated by the fact that trimethylsilyl derivatives are much more widely represented in electronic mass spectral databases. The identification of sulfur mustard, volatile transformation products of sulfur mustard and lewisite, as well as chloroacetophenone in sea water was performed by means of GC-MS in combination with SPME. The survey GC-MS analysis was focused on the identification of volatile and nonvolatile toxic chemicals whose mass spectra are included in the OPCW database (3219 toxic chemicals, precursors, and transformation products) with the use of AMDIS software (version 2.62). Using 2 GC-MS instruments, we could perform the survey analysis for volatile and nonvolatile components of up to 20 samples per day. Thus, the package comprises three complementary procedures.
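    Quantification against an internal reference, as in the thiodiglycol procedure, reduces to comparing analyte/internal-standard response ratios with a calibration line; the calibration numbers below are a hypothetical illustration, not the reported data:

    ```python
    import numpy as np

    # Hypothetical calibration: response ratio (analyte peak area divided by
    # internal-standard peak area) versus thiodiglycol concentration in mg/l.
    concs = np.array([0.001, 0.01, 0.1])
    ratios = np.array([0.02, 0.2, 2.0])
    slope, intercept = np.polyfit(concs, ratios, 1)

    # Quantify an unknown sample from its measured response ratio.
    unknown_ratio = 0.5
    conc = (unknown_ratio - intercept) / slope  # about 0.025 mg/l
    above_lod = conc >= 0.001  # reported detection limit, mg/l
    ```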

  1. Studies on thermal neutron perturbation factor needed for bulk sample activation analysis

    CERN Document Server

    Csikai, J; Sanami, T; Michikawa, T

    2002-01-01

    The spatial distribution of thermal neutrons produced by an Am-Be source in a graphite pile was measured via the activation foil method. The results obtained agree well with calculated data using the MCNP-4B code. A previous method used for the determination of the average neutron flux within thin absorbing samples has been improved and extended for a graphite moderator. A procedure developed for the determination of the flux perturbation factor renders the thermal neutron activation analysis of bulky samples of unknown composition possible both in hydrogenous and graphite moderators.

  2. Cancer risk factors in Korean news media: a content analysis.

    Science.gov (United States)

    Kye, Su Yeon; Kwon, Jeong Hyun; Kim, Yong-Chan; Shim, Minsun; Kim, Jee Hyun; Cho, Hyunsoon; Jung, Kyu Won; Park, Keeho

    2015-01-01

    Little is known about the news coverage of cancer risk factors in Korea. This study aimed to examine how the news media encompasses a wide array of content regarding cancer risk factors and related cancer sites, and investigate whether news coverage of cancer risk factors is congruent with the actual prevalence of the disease. A content analysis was conducted on 1,138 news stories covered during a 5-year period between 2008 and 2012. The news stories were selected from nationally representative media in Korea. Information was collected about cancer risk factors and cancer sites. Of various cancer risk factors, occupational and environmental exposures appeared most frequently in the news. Breast cancer was mentioned the most in relation to cancer sites. Breast, cervical, prostate, and skin cancer were overrepresented in the media in comparison to incidence and mortality cases, whereas lung, thyroid, liver, and stomach cancer were underrepresented. To our knowledge, this research is the first investigation dealing with news coverage about cancer risk factors in Korea. The study findings show occupational and environmental exposures are emphasized more than personal lifestyle factors; further, more prevalent cancers in developed countries have greater media coverage, not reflecting the realities of the disease. The findings may help health journalists and other health storytellers to develop effective ways to communicate cancer risk factors.
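    The over/under-representation comparison can be expressed as a simple ratio of a cancer's share of news coverage to its share of disease burden; the counts below are hypothetical:

    ```python
    # Hypothetical story counts and incidence counts per cancer site.
    coverage = {"breast": 300, "lung": 50}
    incidence = {"breast": 100, "lung": 200}

    cov_total = sum(coverage.values())
    inc_total = sum(incidence.values())

    # Ratio > 1: overrepresented in the media relative to disease burden.
    representation = {
        site: (coverage[site] / cov_total) / (incidence[site] / inc_total)
        for site in coverage
    }
    overrepresented = [s for s, r in representation.items() if r > 1]
    ```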

  3. Landslides geotechnical analysis. Qualitative assessment by valuation factors

    Science.gov (United States)

    Cuanalo Oscar, Sc D.; Oliva Aldo, Sc D.; Polanco Gabriel, M. E.

    2012-04-01

    In general, a landslide can cause a disaster when a number of factors combine: an extreme event related to a geological phenomenon, vulnerable elements exposed in a specific geographic area, and a probability of loss and damage, evaluated in terms of lives and economic assets, over a certain period of time. This paper presents a qualitative evaluation of slope stability through Valuation Factors, obtained from the characterization of the conditioning and triggering factors that influence instability: for the former, morphology and topography, geology, soil mechanics, hydrogeology, and vegetation; for the latter, rain, earthquakes, erosion and scour, and human activity. The resulting ranges of influence greatly facilitate the selection of the construction processes best suited to improve the behavior of a slope or hillside. The Valuation Factors are a set of parameters for assessing the influence of the conditioning and triggering factors on the stability of slopes and hillsides. The characteristics of each factor must be properly categorized to capture their effect on behavior; one way to do this is by assigning a weighted value range indicating the effect on the stability of a slope. It is proposed to use Valuation Factors with weighted values between 0 and 1 (arbitrarily selected, but guided by common sense and logic): the first corresponds to no or minimal effect on stability (no effect or very little influence) and the second to the greatest impact (a significant influence). Intermediate effects are evaluated with intermediate values.
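    One plausible way to aggregate such Valuation Factors into a single hazard score is a simple mean with qualitative bands; the paper does not prescribe an aggregation rule, so the factor values, the averaging, and the thresholds below are all assumptions:

    ```python
    # Hypothetical factor values on the proposed [0, 1] scale:
    # 0 = no or minimal effect on stability, 1 = greatest impact.
    conditioning = {"morphology": 0.7, "geology": 0.5,
                    "hydrogeology": 0.4, "vegetation": 0.2}
    triggering = {"rain": 0.9, "earthquake": 0.3,
                  "erosion": 0.6, "human_activity": 0.5}

    factors = {**conditioning, **triggering}
    score = sum(factors.values()) / len(factors)  # simple mean: an assumption

    # Assumed qualitative bands for the aggregated score.
    level = "high" if score > 0.66 else "medium" if score > 0.33 else "low"
    ```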

  4. Cost-effectiveness analysis comparing the essure tubal sterilization procedure and laparoscopic tubal sterilization.

    Science.gov (United States)

    Thiel, John A; Carson, George D

    2008-07-01

    To analyze the financial implications of establishing a hysteroscopic sterilization program using the Essure micro-insert tubal sterilization system in an ambulatory clinic. A retrospective cohort study (Canadian Task Force classification Type II-2), in an ambulatory women's health clinic in a tertiary hospital, of 108 women undergoing Essure coil insertion between 2005 and 2006, and 104 women undergoing laparoscopic tubal sterilization for permanent sterilization between 2001 and 2004. The Essure procedures used a 4 mm single channel operative hysteroscope and conscious sedation (fentanyl and midazolam); the laparoscopic tubal sterilizations were completed under general anaesthesia with a 7 mm laparoscope and either bipolar cautery or Filshie clips. Costs associated with the procedure, follow-up, and management of any complications (including nursing, hospital charges, equipment, and disposables) were tabulated. The Essure coils were successfully placed on the first attempt in 103 of 108 women (95%). Three patients required a second attempt to complete placement and two patients required laparoscopic tubal sterilization after an unsuccessful Essure. All 104 laparoscopic tubal sterilizations were completed on the first attempt with no complications reported. The total cost for the 108 Essure procedures, including follow-up evaluation, was $138,996 or $1287 per case. The total cost associated with the 104 laparoscopic tubal sterilization procedures was $148,227 or $1398 per case. The incremental cost-effectiveness ratio was $111. The Essure procedure in an ambulatory setting resulted in a statistically significant cost saving of $111 per sterilization procedure. Carrying out the Essure procedure in an ambulatory setting frees space in the operating room for other types of cases, improving access to care for more patients.
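    The cost comparison reduces to per-case arithmetic; this check reproduces the abstract's reported figures:

    ```python
    # Totals reported in the abstract.
    essure_total, essure_n = 138996, 108
    essure_per_case = essure_total / essure_n        # = 1287.0 per case

    lap_per_case = 1398.0                            # reported per-case cost
    incremental_saving = lap_per_case - essure_per_case  # = 111.0 per case
    ```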

  5. Salivary SPECT and factor analysis in Sjoegren's syndrome

    International Nuclear Information System (INIS)

    Nakamura, T.; Oshiumi, Y.; Yonetsu, K.; Muranaka, T.; Sakai, K.; Kanda, S.; National Fukuoka Central Hospital

    1991-01-01

    Salivary SPECT and factor analysis in Sjoegren's syndrome were performed in 17 patients and 6 volunteers as controls. The ability of SPECT to detect small differences in the level of uptake can be used to separate glands from background even when uptake is reduced as in the patients with Sjoegren's syndrome. In control and probable Sjoegren's syndrome groups the uptake ratio of the submandibular gland to parotid gland on salivary SPECT (S/P ratio) was less than 1.0. However, in the definite Sjoegren's syndrome group, the ratio was more than 1.0. Moreover, the ratio in all patients with sialectasia, which is characteristic of Sjoegren's syndrome, was more than 1.0. Salivary factor analysis of normal parotid glands showed slowly increasing patterns of uptake and normal submandibular glands had rapidly increasing patterns of uptake. However, in the definite Sjoegren's syndrome group, the factor analysis patterns were altered, with slowly increasing patterns dominating both in the parotid and submandibular glands. These results suggest that the S/P ratio in salivary SPECT and salivary factor analysis provide additional radiologic criteria in diagnosing Sjoegren's syndrome. (orig.)

  6. Genomewide analysis of TCP transcription factor gene family in ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Volume 93; Issue 3. Genomewide ... Teosinte branched1/cycloidea/proliferating cell factor1 (TCP) proteins are a large family of transcriptional regulators in angiosperms. They are ... To the best of our knowledge, this is the first study of a genomewide analysis of apple TCP gene family.

  7. Liquidity indicator for the Croatian economy – Factor analysis approach

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2014-12-01

    Full Text Available Croatian business surveys (BS are conducted in the manufacturing industry, retail trade and construction sector. In all of these sectors, managers' assessments of liquidity are measured. The aim of the paper was to form a new composite liquidity indicator by including business survey liquidity measures from all three covered economic sectors in the Croatian economy mentioned above. In calculating the leading indicator, a factor analysis approach was used. However, this kind of indicator does not exist in Croatia or in any other European economy. Furthermore, the issue of Croatian companies' illiquidity is highly neglected in the literature. The empirical analysis consists of two parts. In the first part the new liquidity indicator was formed using factor analysis: one factor, representing the new liquidity indicator (LI, was extracted out of the three liquidity variables in the three economic sectors. In the second part, econometric models were applied in order to investigate the forecasting properties of the new business survey liquidity indicator when predicting the direction of changes in Croatian industrial production. The quarterly data used in the research covered the period from January 2000 to April 2013. Based on the econometric analysis, it can be concluded that the LI is a leading indicator of Croatia's industrial production with better forecasting properties than the standard liquidity indicators (formed in the manufacturing industry alone.
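    Extracting one common factor from three sector liquidity series can be sketched with scikit-learn; the synthetic series below stand in for the survey balances:

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    latent = rng.normal(size=200)  # unobserved common liquidity conditions

    # Three sector series (manufacturing, retail, construction) driven by
    # the same latent factor plus idiosyncratic noise -- synthetic data.
    X = np.column_stack([latent + 0.1 * rng.normal(size=200)
                         for _ in range(3)])

    fa = FactorAnalysis(n_components=1)
    li = fa.fit_transform(X).ravel()  # the composite liquidity indicator

    # The extracted factor should track the latent series (sign is arbitrary).
    corr = abs(np.corrcoef(li, latent)[0, 1])
    ```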

  8. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  9. A Confirmatory Factor Analysis of Reilly's Role Overload Scale

    Science.gov (United States)

    Thiagarajan, Palaniappan; Chakrabarty, Subhra; Taylor, Ronald D.

    2006-01-01

    In 1982, Reilly developed a 13-item scale to measure role overload. This scale has been widely used, but most studies did not assess the unidimensionality of the scale. Given the significance of unidimensionality in scale development, the current study reports a confirmatory factor analysis of the 13-item scale in two samples. Based on the…

  10. 48 CFR 1615.404-70 - Profit analysis factors.

    Science.gov (United States)

    2010-10-01

    ... CONTRACTING BY NEGOTIATION Contract Pricing 1615.404-70 Profit analysis factors. (a) OPM contracting officers... managerial expertise and effort. Evidence of effective contract performance will receive a plus weight, and... indifference to cost control will generally result in a negative weight. (2) Contract cost risk. In assessing...

  11. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built, which is used to analyze the causal relationships between the organizational factors and human reliability. Then, an inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute the causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the application of the proposed methodology. The results show that combining the conceptual model with Bayesian networks can not only easily model the causal relationships between organizational factors and human reliability, but also, in a given context, quantitatively measure human operational reliability and identify the most likely root causes of human error, or their prioritization. (authors)
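    The two directions of Bayesian-network inference the methodology relies on can be illustrated with the smallest possible network, a single organizational-factor node influencing a single human-error node; all probabilities below are invented for the example:

    ```python
    # Prior: probability that the organizational factor (e.g. safety
    # culture) is in the "poor" state -- an invented number.
    p_poor = 0.2
    p_good = 1.0 - p_poor

    # Conditional probabilities of human error given the organizational state.
    p_err_given = {"poor": 0.10, "good": 0.01}

    # Causal (predictive) inference: marginal probability of human error.
    p_err = p_poor * p_err_given["poor"] + p_good * p_err_given["good"]

    # Diagnostic inference via Bayes' rule: given an error occurred, how
    # likely is the "poor" organizational state to be the root cause?
    p_poor_given_err = p_poor * p_err_given["poor"] / p_err
    ```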

  12. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
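    The dependency-tree idea, explaining a binary KPI outcome from process and QoS metrics with a decision tree, can be sketched with scikit-learn on synthetic monitoring data; the metric names and thresholds are invented:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    n = 300
    resp_time = rng.uniform(0.1, 2.0, n)  # QoS metric: response time (s)
    queue_len = rng.integers(0, 10, n)    # process metric: items queued

    # Synthetic KPI target: met only when both metrics are within bounds.
    kpi_met = (resp_time < 1.0) & (queue_len < 5)

    X = np.column_stack([resp_time, queue_len])
    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, kpi_met)

    # The fitted tree is the "dependency tree": its splits show which metric
    # thresholds the KPI depends on; feature_importances_ ranks the factors.
    accuracy = tree.score(X, kpi_met)
    ```

    A business analyst would read the splits top-down as the drill-down described above: the root split is the most influential factor, deeper splits refine it.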

  13. An Evaluation on Factors Influencing Decision making for Malaysia Disaster Management: The Confirmatory Factor Analysis Approach

    Science.gov (United States)

    Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.

    2017-12-01

    For the past few years, natural disaster has been the subject of debate in disaster management, especially flood disaster. Each year, natural disaster results in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management would ensure that life-saving efforts are not futile. The aim of this article is to examine the relationship of approach, decision maker, influence factor, result, and ethic to decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were studied based on the literature. Questionnaire surveys were administered among lead agencies at the East Coast of Malaysia in the states of Kelantan and Pahang. A total of 307 valid responses were obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order reflective and first-order reflective measurement model indicates that approach, decision maker, influence factor, result, and ethic have a significant and direct effect on decision making during disaster. The results from this study showed that decision-making during a disaster is an important element of disaster management and requires successful collaborative decision making. The measurement model is accepted to proceed with further analysis, known as Structural Equation Modeling (SEM), and can be assessed in future research.

  14. Contemporary analysis of the intraoperative and perioperative complications of neurosurgical procedures performed in the sitting position.

    Science.gov (United States)

    Himes, Benjamin T; Mallory, Grant W; Abcejo, Arnoley S; Pasternak, Jeffrey; Atkinson, John L D; Meyer, Fredric B; Marsh, W Richard; Link, Michael J; Clarke, Michelle J; Perkins, William; Van Gompel, Jamie J

    2017-07-01

    OBJECTIVE Historically, performing neurosurgery with the patient in the sitting position offered advantages such as improved visualization and gravity-assisted retraction. However, this position fell out of favor at many centers due to the perceived risk of venous air embolism (VAE) and other position-related complications. Some neurosurgical centers continue to perform sitting-position cases in select patients, often using modern monitoring techniques that may improve procedural safety. Therefore, this paper reports the risks associated with neurosurgical procedures performed in the sitting position in a modern series. METHODS The authors reviewed the anesthesia records for instances of clinically significant VAE and other complications for all neurosurgical procedures performed in the sitting position between January 1, 2000, and October 8, 2013. In addition, a prospectively maintained morbidity and mortality log of these procedures was reviewed for instances of subdural or intracerebral hemorrhage, tension pneumocephalus, and quadriplegia. Both overall and specific complication rates were calculated in relation to the specific type of procedure. RESULTS In a series of 1792 procedures, the overall complication rate related to the sitting position was 1.45%, which included clinically significant VAE, tension pneumocephalus, and subdural hemorrhage. The rate of any detected VAE was 4.7%, but the rate of VAE requiring clinical intervention was 1.06%. The risk of clinically significant VAE was highest in patients undergoing suboccipital craniotomy/craniectomy with a rate of 2.7% and an odds ratio (OR) of 2.8 relative to deep brain stimulator cases (95% confidence interval [CI] 1.2-70, p = 0.04). Sitting cervical spine cases had a comparatively lower complication rate of 0.7% and an OR of 0.28 as compared with all cranial procedures (95% CI 0.12-0.67, p < 0.01). Sitting cervical cases were further subdivided into extradural and intradural procedures.
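    The odds ratios quoted above come from standard 2×2-table calculations; a minimal sketch with a Wald confidence interval (the counts below are hypothetical, not the series' data):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio for the 2x2 table [[a, b], [c, d]] (exposed events,
        exposed non-events, unexposed events, unexposed non-events), with
        a Wald 95% CI computed on the log-odds scale."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts: 8 VAE events in 300 suboccipital cases versus
    # 3 events in 300 comparison cases.
    or_, lo, hi = odds_ratio_ci(8, 292, 3, 297)
    ```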

  15. Analysis of spatio-temporal variability of C-factor derived from remote sensing data

    Science.gov (United States)

    Pechanec, Vilem; Benc, Antonin; Purkyt, Jan; Cudlin, Pavel

    2016-04-01

    In some risk areas, water erosion strongly influences agriculture and can threaten inhabitants. In our country, a combination of the USLE and RUSLE models has been used for water erosion assessment (Krása et al., 2013). The role of vegetation cover is characterized by the vegetation protection factor, the so-called C-factor. The value of the C-factor is given by the ratio of soil loss on a plot with arable crops to that on a standard plot kept as fallow and regularly tilled after any rain (Janeček et al., 2012). Where the crop structure and rotation cannot be identified, determining the C-factor over large areas is a problem; in such cases the C-factor is determined only from the average crop representation. New technologies open possibilities for accelerating and refining the approach. The present-day approach to C-factor determination is based on the analysis of multispectral image data. The red and infrared bands are extracted and used to compute a series of vegetation indices (NDVI, TSAVI). Values acquired for fractional time sections (during the vegetation period) are averaged, values of the vegetation indices for forest and cleared areas are determined, and regression coefficients are computed. The final calculation uses regression equations expressing the relation between NDVI values and the C-factor (De Jong, 1994; Van der Knijff, 1999; Karaburun, 2010). An up-to-date land use layer is used to delineate erosion-threatened areas by selecting the landscape segments in erosion-susceptible land use categories. Using Landsat 7 data, the C-factor has been determined for the whole area of the Czech Republic for every month of 2014. In a model area in a small watershed, the C-factor has also been determined by the conventional (tabular) procedure. The analysis was focused on assessing the variability of C-factor values obtained by the conventional procedure and from the remote sensing data.
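    A widely used regression form linking NDVI to the C-factor is the exponential relation of Van der Knijff et al. (cited above), C = exp(−α·NDVI/(β−NDVI)); the sketch uses the commonly quoted parameter values α = 2 and β = 1, which are assumptions here, not values taken from this paper:

    ```python
    import numpy as np

    def c_factor(ndvi, alpha=2.0, beta=1.0):
        """Van der Knijff-style C-factor from NDVI; alpha=2, beta=1 are the
        commonly quoted parameter values (an assumption in this sketch)."""
        ndvi = np.asarray(ndvi, dtype=float)
        return np.exp(-alpha * ndvi / (beta - ndvi))

    dense_forest = c_factor(0.8)  # low C: near-total vegetative protection
    bare_soil = c_factor(0.1)     # high C: little protection
    ```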

  16. Restraint Procedures and Challenging Behaviours in Intellectual Disability: An Analysis of Causative Factors

    Science.gov (United States)

    Matson, Johnny L.; Boisjoli, Jessica A.

    2009-01-01

    Background: Persons with intellectual disability often evince challenging behaviours. Efforts have been underway for some time to develop prosocial or positive skill acquisition treatments to address challenging behaviours. However, physical/mechanical and chemical restraint is still commonly used in many clinical and community settings. Such…

  17. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    Science.gov (United States)

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.

  18. A human factor analysis of a radiotherapy accident

    International Nuclear Information System (INIS)

    Thellier, S.

    2009-01-01

    Since September 2005, I.R.S.N. has studied radiotherapy treatment activities from the angle of human and organizational factors in order to improve the reliability of radiotherapy treatment. Drawing on its experience in analysing nuclear industry incidents, I.R.S.N. analysed and published in March 2008, for the first time in France, a detailed study of a radiotherapy accident from the angle of human and organizational factors. The method used for the analysis is based on interviews and on documents kept by the hospital. The analysis aimed at identifying the causes of the difference recorded between the dose prescribed by the radiotherapist and the dose effectively received by the patient. Neither verbal nor written communication (intra-service meetings and protocols of treatment) allowed information to be transmitted correctly so that radiographers could adjust the irradiation zones properly. The analysis highlighted the fact that, during the preparation and the carrying out of the treatment, various factors led planned controls not to be performed. Finally, it showed that unresolved questions persist in the account of this accident, owing to a lack of traceability of a certain number of key actions. The article concludes that there must be improvement in three areas: cooperation between the practitioners, control of the actions, and traceability of the actions. (author)

  19. Evaluation of six sample preparation procedures for qualitative and quantitative proteomics analysis of milk fat globule membrane.

    Science.gov (United States)

    Yang, Yongxin; Anderson, Elizabeth; Zhang, Sheng

    2018-04-12

    Proteomic analysis of membrane proteins is challenged by the proteins' poor solubility and the incompatibility of detergents with MS analysis, and no single protocol can comprehensively characterize the proteome of a membrane fraction. Here, we used cow milk fat globule membrane (MFGM) proteome analysis to assess six sample preparation procedures, one in-gel and five in-solution digestion approaches, prior to LC-MS/MS analysis. The largest numbers of MFGM proteins were identified by the suspension trapping (S-Trap) and filter-aided sample preparation (FASP) methods, followed by the acetone precipitation without clean-up of tryptic peptides method. The highest average protein coverage was achieved by the Chloroform/MeOH, in-gel, and S-Trap methods. The most distinct proteins were identified by the FASP method, followed by S-Trap. Analyses by Venn diagram, principal-component analysis, hierarchical clustering, and the abundance ranking of quantified proteins highlight differences in the MFGM fraction across all the sample preparation procedures. These results reveal the biased protein/peptide losses that occurred in each protocol. In this study, we found several novel proteins that were not observed previously in in-depth proteomics characterizations of the MFGM fraction in milk. Thus, a combination of multiple procedures with orthogonal sample preparation properties was demonstrated to improve the protein sequence coverage and expression-level accuracy for membrane samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
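    The Venn-style comparison of identifications across protocols is just set algebra on protein accessions; the tiny ID sets below are invented placeholders:

    ```python
    # Invented accession sets for three of the six preparation protocols.
    s_trap = {"P1", "P2", "P3", "P4"}
    fasp   = {"P2", "P3", "P4", "P5"}
    in_gel = {"P3", "P4"}

    shared = s_trap & fasp & in_gel        # identified by all three methods
    fasp_unique = fasp - s_trap - in_gel   # distinct to FASP
    union = s_trap | fasp | in_gel         # gain from combining protocols
    ```

    The same operations, applied to the real identification lists, would reproduce the Venn-diagram comparison and motivate combining protocols with complementary losses.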

  20. Evaluation of ASME code flaw analysis procedure using the influence function method for application to PWR primary piping

    International Nuclear Information System (INIS)

    Hong, S.Y.; Yeater, M.L.

    1985-01-01

    This paper discusses stress intensity factor calculations and fatigue analysis for a PWR primary coolant piping system. The influence function method is applied to evaluate ASME Code Section XI Appendix A ''analysis of flaw indication'' for the application to a PWR primary piping. Results of the analysis are discussed in detail. (orig.)
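    The influence (weight) function method evaluates K_I as an integral of a crack-geometry weight function against the applied stress distribution; the sketch below checks the textbook centre-crack case, where a uniform stress must recover K = σ√(πa). The weight function and the check are standard results, not the paper's piping-specific functions:

    ```python
    import numpy as np

    def k_centre_crack(sigma, a, n=2000):
        """K_I = integral_0^a m(x, a) * sigma(x) dx with the centre-crack
        weight function m(x, a) = 2a / (sqrt(pi*a) * sqrt(a**2 - x**2)).
        The substitution x = a*sin(t) removes the integrable endpoint
        singularity, leaving the smooth integrand below over [0, pi/2]."""
        t = np.linspace(0.0, np.pi / 2, n)
        integrand = (2 * a / np.sqrt(np.pi * a)) * sigma(a * np.sin(t))
        # Trapezoidal rule on the transformed integrand.
        return float(np.sum((integrand[1:] + integrand[:-1]) / 2
                            * np.diff(t)))

    sigma = lambda x: 100.0 * np.ones_like(x)  # uniform 100 MPa
    a = 0.01                                   # crack half-length, m
    k = k_centre_crack(sigma, a)               # analytic: 100*sqrt(pi*0.01)
    ```

    The same integral with a non-uniform σ(x), e.g. a through-wall thermal stress profile in piping, gives the flaw-specific K without rerunning a full finite element analysis, which is the appeal of the method.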