WorldWideScience

Sample records for factor analysis procedures

  1. Human factoring administrative procedures

    International Nuclear Information System (INIS)

    Grider, D.A.; Sturdivant, M.H.

    1991-01-01

In nonnuclear business, administrative procedures bring to mind such mundane topics as filing correspondence and scheduling vacation time. In the nuclear industry, on the other hand, administrative procedures play a vital role in assuring the safe operation of a facility. For some time now, industry focus has been on improving technical procedures. Significant efforts are under way to produce better technical procedures. Producing a technically sound procedure requires that a validated technical, regulatory, and administrative basis be developed and that the technical process be established for each procedure. Producing usable technical procedures requires that procedure presentation be engineered to the same human factors principles used in control room design. The vital safety role of administrative procedures requires that they be just as sound, and just as rigorously formulated and documented, as technical procedures. Procedure programs at the Tennessee Valley Authority and at Boston Edison's Pilgrim Station demonstrate that human factors engineering techniques can be applied effectively to technical procedures. With a few modifications, those same techniques can be used to produce more effective administrative procedures. Efforts are under way at the US Department of Energy Nuclear Weapons Complex and at some utilities (Boston Edison, for instance) to apply human factors engineering to administrative procedures. The techniques being adapted include the following.

  2. Human Reliability Analysis For Computerized Procedures

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Gertman, David I.; Le Blanc, Katya

    2011-01-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  3. Human factor analysis related to new symptom based procedures used by control room crews during treatment of emergency states

    International Nuclear Information System (INIS)

    Holy, J.

    1999-01-01

New symptom-based emergency procedures have been developed for the Dukovany Nuclear Power Plant in the Czech Republic. As part of the verification and validation of the procedures, a specific effort was devoted to a detailed analysis of the procedures from the human factors and human reliability points of view. The course and results of the analysis are discussed in this article. Although the analyzed procedures were developed for one specific plant of the WWER-440/213 type, most of the presented results may be valid for many other procedures recently developed for the semi-automatic control of technological units operated under a measurable level of risk. (author)

  4. [Delirium in stroke patients : Critical analysis of statistical procedures for the identification of risk factors].

    Science.gov (United States)

    Nydahl, P; Margraf, N G; Ewers, A

    2017-04-01

Delirium is a relevant complication following an acute stroke. It is a multifactorial phenomenon in which numerous interacting risk factors mutually influence each other. Reported risk factors for delirium in stroke patients are often based on limited clinical studies. The statistical procedures and clinical relevance of delirium-related risk factors in adult stroke patients should therefore be questioned. This secondary analysis includes clinically relevant studies that provide evidence for the clinical relevance and statistical significance of delirium-associated risk factors in stroke patients. The quality of the reporting of regression analyses was assessed using Ottenbacher's quality criteria. The identified delirium-associated risk factors were then examined for statistical significance using the Bonferroni method of correction for multiple testing, which guards against false positive findings. This was followed by a literature-based discussion of clinical relevance. Nine clinical studies were included. None of the studies fulfilled all the prerequisites and assumptions for the reporting of regression analyses according to Ottenbacher. Of the 108 delirium-associated risk factors, 48 (44.4%) were significant, of which 28 (58.3%) were false positives after Bonferroni correction. Following a literature-based discussion of clinical relevance, both statistical significance and clinical relevance could be supported for only four risk factors (dementia or cognitive impairment, total anterior infarct, severe infarct, and infections). The statistical procedures used in the existing literature are questionable, as are their results. A post-hoc analysis and critical appraisal reduced the number of possible delirium-associated risk factors to just a few clinically relevant factors.
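The Bonferroni adjustment applied in this review can be sketched in a few lines of Python; the p-values below are illustrative, not taken from the studies reviewed:

```python
# Minimal sketch of Bonferroni correction (illustrative p-values, not study data).
def bonferroni(p_values, alpha=0.05):
    """Return, per test, whether it stays significant after Bonferroni correction."""
    threshold = alpha / len(p_values)  # divide alpha by the number of tests
    return [p < threshold for p in p_values]

# With 108 tests, as in the review, the per-test threshold shrinks to 0.05/108:
print(round(0.05 / 108, 5))        # 0.00046
print(bonferroni([0.0001, 0.03]))  # [True, False] (threshold is 0.05/2 = 0.025)
```

The point of the correction is visible in the threshold: a factor that looks significant at p = 0.03 in a single test no longer survives once the family of tests is taken into account.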

  5. Human Factors Process Task Analysis Liquid Oxygen Pump Acceptance Test Procedure for the Advanced Technology Development Center

    Science.gov (United States)

    Diorio, Kimberly A.

    2002-01-01

    A process task analysis effort was undertaken by Dynacs Inc. commencing in June 2002 under contract from NASA YA-D6. Funding was provided through NASA's Ames Research Center (ARC), Code M/HQ, and Industrial Engineering and Safety (IES). The John F. Kennedy Space Center (KSC) Engineering Development Contract (EDC) Task Order was 5SMA768. The scope of the effort was to conduct a Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) of a hazardous activity and provide recommendations to eliminate or reduce the effects of errors caused by human factors. The Liquid Oxygen (LOX) Pump Acceptance Test Procedure (ATP) was selected for this analysis. The HF PFMEA table (see appendix A) provides an analysis of six major categories evaluated for this study. These categories include Personnel Certification, Test Procedure Format, Test Procedure Safety Controls, Test Article Data, Instrumentation, and Voice Communication. For each specific requirement listed in appendix A, the following topics were addressed: Requirement, Potential Human Error, Performance-Shaping Factors, Potential Effects of the Error, Barriers and Controls, Risk Priority Numbers, and Recommended Actions. This report summarizes findings and gives recommendations as determined by the data contained in appendix A. It also includes a discussion of technology barriers and challenges to performing task analyses, as well as lessons learned. The HF PFMEA table in appendix A recommends the use of accepted and required safety criteria in order to reduce the risk of human error. The items with the highest risk priority numbers should receive the greatest amount of consideration. Implementation of the recommendations will result in a safer operation for all personnel.

  6. Abdominoplasty: Risk Factors, Complication Rates, and Safety of Combined Procedures.

    Science.gov (United States)

    Winocour, Julian; Gupta, Varun; Ramirez, J Roberto; Shack, R Bruce; Grotting, James C; Higdon, K Kye

    2015-11-01

Among aesthetic surgery procedures, abdominoplasty is associated with a higher complication rate, but previous studies are limited by small sample sizes or single-institution experience. A cohort of patients who underwent abdominoplasty between 2008 and 2013 was identified from the CosmetAssure database. Major complications were recorded. Univariate and multivariate analyses were performed to evaluate risk factors, including age, smoking, body mass index, sex, diabetes, type of surgical facility, and combined procedures. The authors identified 25,478 abdominoplasties among 183,914 procedures in the database. Of these, 8,975 patients had abdominoplasty alone and 16,503 underwent additional procedures. The number of complications recorded was 1,012 (4.0 percent overall rate versus 1.4 percent in other aesthetic surgery procedures). Of these, 31.5 percent were hematomas, 27.2 percent were infections, and 20.2 percent were suspected or confirmed venous thromboembolism. On multivariate analysis, significant risk factors included combined procedures (relative risk, 1.5) and performance of the procedure in a hospital or surgical center versus an office-based surgical suite (1.6). Combined procedures increased the risk of complication (abdominoplasty alone, 3.1 percent; with liposuction, 3.8 percent; breast procedure, 4.3 percent; liposuction and breast procedure, 4.6 percent; body-contouring procedure, 6.8 percent; liposuction and body-contouring procedure, 10.4 percent). Abdominoplasty is associated with a higher complication rate than other aesthetic procedures. Combined procedures can significantly increase complication rates and should be considered carefully in higher-risk patients. Risk, II.
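The headline figures in this abstract can be checked directly from the reported counts:

```python
# Sanity-check the counts and the overall complication rate reported above.
alone, combined_procs = 8975, 16503
total_abdominoplasties = alone + combined_procs
print(total_abdominoplasties)  # 25478, matching the reported cohort size

complications = 1012
rate = 100 * complications / total_abdominoplasties
print(round(rate, 1))  # 4.0 (percent), matching the reported overall rate
```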

  7. Current outcomes and risk factors for the Norwood procedure.

    Science.gov (United States)

    Stasik, Chad N; Gelehrter, S; Goldberg, Caren S; Bove, Edward L; Devaney, Eric J; Ohye, Richard G

    2006-02-01

    Tremendous strides have been made in the outcomes for hypoplastic left heart syndrome and other functional single-ventricle malformations over the past 25 years. This progress relates primarily to improvements in survival for patients undergoing the Norwood procedure. Previous reports on risk factors have been on smaller groups of patients or collected over relatively long periods of time, during which management has evolved. We analyzed our current results for the Norwood procedure with attention to risk factors for poor outcome. A single-institution review of all patients undergoing a Norwood procedure for a single-ventricle malformation from May 1, 2001, through April 30, 2003, was performed. Patient demographics, anatomy, clinical condition, associated anomalies, operative details, and outcomes were recorded. Of the 111 patients, there were 23 (21%) hospital deaths. Univariate analysis revealed noncardiac abnormalities (genetic or significant extracardiac diagnosis, P = .0018), gestational age (P = .03), diagnosis of unbalanced atrioventricular septal defect (P = .017), and weight of less than 2.5 kg (P = .0072) to be related to hospital death. On multivariate analysis, only weight of less than 2.5 kg and noncardiac abnormalities were found to be independent risk factors. Patients with either of these characteristics had a hospital survival of 52% (12/23), whereas those at standard risk had a survival of 86% (76/88). Although improvements in management might have lessened the effect of some of the traditionally reported risk factors related to variations in the cardiovascular anatomy, noncardiac abnormalities and low birth weight remain as a future challenge for the physician caring for the patient with single-ventricle physiology.

  8. Procedure for measurement of anisotropy factor for neutron sources

    International Nuclear Information System (INIS)

    Creazolla, Prycylla Gomes

    2017-01-01

Radioisotope neutron sources allow the production of reference fields for the calibration of neutron detectors for radiation protection and analysis purposes. When the emission rate of these sources is isotropic, no correction is necessary. However, variations in the source encapsulation and in the concentration of the radioactive material produce differences in the neutron emission rate relative to the source axis; this effect is called anisotropy. This study describes a procedure for measuring the anisotropy factor of neutron sources, performed in the Laboratório de Metrologia de Neutrons (LN) using a Precision Long Counter (PLC) detector. A measurement procedure that takes the anisotropy factor of neutron sources into account helps to resolve several issues, particularly the high uncertainties associated with neutron dosimetry. A bibliographical review was carried out based on international standards and technical regulations specific to neutron fields, and its findings were then put into practice in the LN procedure for measuring the anisotropy factor of neutron sources. The anisotropy factor is determined at an angle of 90° to the cylindrical axis of the source. This angle is the most important because it is the one most commonly used in measurements and has a higher neutron emission rate than other angles. (author)

  9. Aesthetic Surgical Procedures in Men: Major Complications and Associated Risk Factors.

    Science.gov (United States)

    Kaoutzanis, Christodoulos; Winocour, Julian; Yeslev, Max; Gupta, Varun; Asokan, Ishan; Roostaeian, Jason; Grotting, James C; Higdon, K Kye

    2018-03-14

The number of men undergoing cosmetic surgery is increasing in North America. The aims were to determine the incidence and risk factors of major complications in males undergoing cosmetic surgery, compare the complication profiles between men and women, and identify specific procedures that are associated with a higher risk of complications in males. A prospective cohort of patients undergoing cosmetic surgery between 2008 and 2013 was identified from the CosmetAssure database. Gender-specific procedures were excluded. The primary outcome was occurrence of a major complication in males requiring an emergency room visit, hospital admission, or reoperation within 30 days of the index operation. Univariate and multivariate analyses evaluated potential risk factors for major complications including age, body mass index (BMI), smoking, diabetes, type of surgical facility, type of procedure, and combined procedures. Of the 129,007 patients, 54,927 underwent gender-nonspecific procedures, of which 5801 (10.6%) were males. Women showed a higher mean age (46.4 ± 14.1 vs 45.2 ± 16.7 years, P procedures (RR 3.47), and combined procedures (RR 2.56). Aesthetic surgery in men is safe with low major complication rates. Modifiable predictors of complications included BMI and combined procedures.

  10. Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.

    Science.gov (United States)

    Stankov, L

    1979-07-01

The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher-order factors. Also, one can obtain a smaller number of factors than is produced by typical hierarchical procedures.
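For readers unfamiliar with the Schmid-Leiman procedure referenced here, the classic orthogonalization step can be sketched numerically. This is a minimal NumPy sketch with made-up loading matrices, illustrating the standard 1957 transformation rather than the image-analysis variant the abstract describes:

```python
import numpy as np

# Classic Schmid-Leiman orthogonalization, sketched with hypothetical loadings.
# P: first-order pattern matrix (6 variables x 2 correlated group factors).
# g: loadings of the two group factors on a single second-order (general) factor.
P = np.array([[0.7, 0.0], [0.6, 0.0], [0.8, 0.0],
              [0.0, 0.7], [0.0, 0.6], [0.0, 0.8]])
g = np.array([0.6, 0.7])

general = P @ g               # variable loadings on the general factor
u = np.sqrt(1.0 - g**2)       # residual standard deviations of the group factors
group = P * u                 # loadings on the orthogonalized group factors

# The orthogonal hierarchical solution [general | group] reproduces the same
# implied structure as P Phi P^T, where Phi = g g^T + diag(1 - g^2) is the
# factor correlation matrix implied by the second-order model.
L = np.column_stack([general, group])
Phi = np.outer(g, g) + np.diag(1.0 - g**2)
assert np.allclose(L @ L.T, P @ Phi @ P.T)
```

The assertion at the end verifies the defining property of the transformation: splitting variance between a general factor and residualized group factors changes the representation, not the fitted correlation structure.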

  11. Groin hematoma after electrophysiological procedures-incidence and predisposing factors.

    Science.gov (United States)

    Dalsgaard, Anja Borgen; Jakobsen, Christina Spåbæk; Riahi, Sam; Hjortshøj, Søren

    2014-10-01

We evaluated the incidence and predisposing factors of groin hematomas after electrophysiological (EP) procedures in a prospective, observational study enrolling consecutive patients after EP procedures (atrial fibrillation: n = 151; supraventricular tachycardia/diagnostic EP: n = 82; ventricular tachycardia: n = 18). Patients underwent manual compression for 10 min followed by 3 h of post-procedural bed rest. AF ablations were performed with INR 2-3, ACT > 300, and no protamine sulfate. Adhesive pressure dressings (APDs) were used if sheath size ≥ 10F, procedural time > 120 min, or BMI > 30. Patient-reported hematomas were recorded by telephone follow-up after 2 weeks. Hematoma developed immediately in 26 patients (10%), and after 14 days a significant hematoma was reported by 68 patients (27%). In regression analysis, sex, age, BMI 25, ACT 300, use of APD, sheath size and number, and complicated venous access were not associated with hematoma, either immediately after the procedure or after 14 days. Any hematoma presenting immediately after the procedure was associated with patient-reported hematoma after 14 days, odds ratio 18.7 (95% CI: 5.00-69.8). Hematoma immediately after EP procedures was the sole predictor of patient-reported hematoma after 2 weeks. Initiatives to prevent groin hematoma should focus on the procedure itself as well as on post-procedural care.
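An odds ratio of this kind comes from a 2×2 exposure-outcome table. The sketch below uses made-up counts for illustration only, not data reconstructed from this study:

```python
# Odds ratio from a 2x2 table (hypothetical counts for illustration only).
def odds_ratio(a, b, c, d):
    """a/b: outcome yes/no among exposed; c/d: outcome yes/no among unexposed."""
    return (a * d) / (b * c)

# Example: 20 of 26 exposed vs 48 of 318 unexposed develop the outcome.
print(odds_ratio(20, 6, 48, 270))  # 18.75
```

With counts this lopsided, the cross-product (a*d)/(b*c) is large, which is how a single strong predictor like an immediately visible hematoma shows up in a regression model.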

  12. A procedure for effective Dancoff factor calculation

    International Nuclear Information System (INIS)

    Milosevic, M.

    2001-01-01

In this paper, a procedure for Dancoff factor calculation based on the equivalence principle, and its application in the SCALE-4.3 code system, is described. The procedure is founded on the principle of conservation of neutron absorption in the resolved resonance range between a heterogeneous medium and an equivalent medium consisting of an infinite array of two-region pin cells, where the presence of other fuel rods is taken into account through a Dancoff factor. The neutron absorption in both media is obtained using a fine-group elastic slowing-down calculation. The procedure is implemented in a design-oriented lattice physics code, which is applicable to any geometry where the collision probability method can be applied to obtain a flux solution. The proposed procedure was benchmarked against a recent exercise representing a system with double heterogeneity of the fuel, i.e., fuel in solid form (pellets) surrounded by fissile material in solution, and against a 5x5 irregular pressurized water reactor assembly, which requires different Dancoff factors. (author)

  13. Step Complexity Measure for Emergency Operating Procedures - Determining Weighting Factors

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Kim, Jaewhan; Ha, Jaejoo

    2003-01-01

The weighting factors are determined using a nonlinear regression analysis. The results show that the SC scores quantified with the new weighting factors correlate in a statistically meaningful way with averaged step performance time data. Thus, it can be concluded that the SC measure can represent the complexity of the procedural steps included in EOPs.

  14. Design-related influencing factors of the computerized procedure system for inclusion into human reliability analysis of the advanced control room

    International Nuclear Information System (INIS)

    Kim, Jaewhan; Lee, Seung Jun; Jang, Seung Cheol; Ahn, Kwang-Il; Shin, Yeong Cheol

    2013-01-01

This paper presents the major design factors of the computerized procedure system (CPS) by task characteristics/requirements, with individual relative weights evaluated by the analytic hierarchy process (AHP) technique, for inclusion into human reliability analysis (HRA) of advanced control rooms. The task characteristics/requirements of an individual procedural step are classified into four categories according to the dynamic characteristics of an emergency situation: (1) a single-static step, (2) a single-dynamic and single-checking step, (3) a single-dynamic and continuous-monitoring step, and (4) a multiple-dynamic and continuous-monitoring step. According to the importance ranking evaluated by the AHP technique, ‘clearness of the instruction for taking action’, ‘clearness of the instruction and its structure for rule interpretation’, and ‘adequate provision of requisite information’ were rated as being of higher importance for all the task classifications. The importance of ‘adequacy of the monitoring function’ and ‘adequacy of representation of the dynamic link or relationship between procedural steps’ is dependent upon task characteristics. The results of the present study give valuable insight into which design factors of the CPS should be incorporated, with relative importance or weights between design factors, into HRA of advanced control rooms. (author)
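The AHP technique mentioned above derives relative weights from a matrix of pairwise importance judgments. Here is a minimal pure-Python sketch using power iteration on a hypothetical 3×3 reciprocal matrix, not the study's actual judgments:

```python
# AHP priority weights via power iteration on a reciprocal pairwise-comparison
# matrix. The matrix entries below are hypothetical judgments for illustration.
def ahp_weights(A, iters=200):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]  # normalize so the weights sum to 1
    return w

# "Factor 1 is moderately more important than 2, strongly more than 3," etc.
A = [[1, 3, 5],
     [1/3, 1, 3],
     [1/5, 1/3, 1]]
w = ahp_weights(A)
print([round(x, 3) for x in w])  # weights in descending order of importance
```

The normalized principal eigenvector of the comparison matrix is the weight vector; power iteration converges to it because the matrix is positive.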

  15. Risk factors for postoperative urinary tract infection following midurethral sling procedures.

    Science.gov (United States)

    Doganay, Melike; Cavkaytar, Sabri; Kokanali, Mahmut Kuntay; Ozer, Irfan; Aksakal, Orhan Seyfi; Erkaya, Salim

    2017-04-01

To identify the potential risk factors for urinary tract infections following midurethral sling procedures, 556 women who underwent a midurethral sling procedure for stress urinary incontinence over a four-year period were reviewed in this retrospective study. Of the study population, 280 women underwent TVT procedures and 276 underwent TOT procedures. Patients were evaluated at 4-8 weeks postoperatively and investigated for the occurrence of a urinary tract infection. Patients who experienced a urinary tract infection were defined as cases, and patients who did not were defined as controls. All data were collected from medical records. A multivariate logistic regression model was used to identify the risk factors for urinary tract infection. Of the 556 women, 58 (10.4%) were defined as cases and 498 (89.6%) as controls. The mean age of the cases (57.8 ± 12.9 years) was significantly greater than that of the controls (51.8 ± 11.2 years). Preoperative urinary tract infection, concomitant vaginal hysterectomy and cystocele repair, TVT procedure, and postoperative postvoiding residual bladder volume ≥100 ml were more common in cases than in controls. In the multivariate regression model, presence of preoperative urinary tract infection [OR (95% CI)=0.1 (0.1-0.7); p=0.013], TVT procedure [OR (95% CI)=8.4 (3.1-22.3); p=0.000], and postoperative postvoiding residual bladder volume ≥100 ml [OR (95% CI)=4.6 (1.1-19.2); p=0.036] were significant independent risk factors for urinary tract infection following midurethral slings. CONCLUSION: Urinary tract infection after midurethral sling procedures is a relatively common complication. The presence of preoperative urinary tract infection, TVT procedure, and postoperative postvoiding residual bladder volume ≥100 ml may increase the risk of this complication. Identification of these factors could help surgeons minimize this complication by developing effective strategies.

  16. Cost analysis of radiological interventional procedures and reimbursement within a clinic

    International Nuclear Information System (INIS)

    Strotzer, M.; Voelk, M.; Lenhart, M.; Fruend, R.; Feuerbach, S.

    2002-01-01

Purpose: Analysis of the costs of vascular radiological interventions on a per-patient basis and comparison with reimbursement based on GOAe (Gebuehrenordnung fuer Aerzte) and DKG-NT (Deutsche Krankenhausgesellschaft-Nebenkostentarif). Material and Methods: The ten procedures most frequently performed within 12 months were evaluated. Personnel costs were derived from precise costs per hour and the estimated procedure time for each intervention. Costs for medical devices were included. Reimbursement based on GOAe was calculated using the official conversion factor of 0.114 DM for each specific relative value unit and a multiplication factor of 1.0. The corresponding conversion factor for DKG-NT, determined by the DKG, was 0.168 DM. Results: A total of 832 interventional procedures were included. Marked differences between calculated costs and reimbursement rates were found. For the ten most frequently performed procedures, there was a deficit of 1.06 million DM according to the GOAe data (factor 1.0) and 0.787 million DM according to DKG-NT. The percentage of costs reimbursed was only 34.2 (GOAe; factor 1.0) and 51.3 (DKG-NT), respectively. Conclusion: Reimbursement of radiological interventional procedures based on GOAe and DKG-NT data is of limited value for economic controlling purposes within a hospital. (orig.)

  17. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

NREL develops laboratory analytical procedures (LAPs) for the analysis of raw and upgraded pyrolysis bio-oils. These standard procedures have been validated and allow for reliable bio-oil analysis.

  18. Experimental verification for standard analysis procedure of 241Am in food

    International Nuclear Information System (INIS)

    Liu Qingfen; Zhu Hongda; Liu Shutian; Pan Jingshun; Yang Dating

    2005-01-01

Objective: To briefly describe the experimental verification of 'Determination of 241Am in food'. Methods: The overall recovery, the MDL of the method, and a decontamination experiment were carried out according to the standard analysis procedure. Results: The overall recovery is 76.26 ± 4.1%. The MDL is 3.4 × 10⁻⁵ Bq/g ash; the decontamination factor is higher than 10³ for Po, 10² for U, Th, and Pu, and 60 for 237Np. Conclusion: The results show that the overall recovery is high and reliable, and that the MDL of the method is sufficient for checking 241Am limit values in foods. The decontamination factors obtained with the recommended procedure are adequate for the analysis of 241Am in food examination. The verification results of the procedure, obtained using a 243Am spike and a 241Am standard reference material, are satisfactory. (authors)

  19. Biomass Compositional Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov (United States)

NREL develops laboratory analytical procedures (LAPs) for standard biomass analysis. These procedures help scientists and analysts understand more about the chemical composition of raw biomass.

  20. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion, or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation-maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. Unfortunately, MVA has no /MATRIX subcommand and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and shows how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
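For readers outside SPSS, what the EM step computes can be illustrated directly. The following is a minimal Python/NumPy sketch of EM estimation of a mean vector and covariance matrix from multivariate-normal data with missing values; the data are simulated, and this is not the macro described in the article:

```python
import numpy as np

def em_mean_cov(X, iters=50):
    """EM estimates of the mean vector and covariance matrix for
    multivariate-normal data with missing entries coded as NaN."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    mu = np.nanmean(X, axis=0)
    sigma = np.diag(np.nanvar(X, axis=0))
    for _ in range(iters):
        filled = np.empty_like(X)
        corr = np.zeros((p, p))  # accumulated conditional covariances
        for i in range(n):
            m = np.isnan(X[i])
            o = ~m
            x = X[i].copy()
            if m.any():
                soo = sigma[np.ix_(o, o)]
                smo = sigma[np.ix_(m, o)]
                beta = np.linalg.solve(soo, smo.T).T  # Sigma_mo Sigma_oo^-1
                x[m] = mu[m] + beta @ (X[i, o] - mu[o])  # conditional-mean fill
                corr[np.ix_(m, m)] += sigma[np.ix_(m, m)] - beta @ smo.T
            filled[i] = x
        mu = filled.mean(axis=0)
        d = filled - mu
        sigma = (d.T @ d + corr) / n  # M-step: update covariance estimate
    return mu, sigma

# Tiny demo: correlated data with some entries knocked out.
rng = np.random.default_rng(0)
true_cov = np.array([[1.0, 0.6], [0.6, 1.0]])
data = rng.multivariate_normal([0.0, 0.0], true_cov, size=200)
data[::10, 1] = np.nan  # delete every 10th value of the second column
mu, sigma = em_mean_cov(data)
```

The resulting `sigma` (or the correlation matrix derived from it) plays the same role as the EM matrix that the macros write out for FACTOR: it uses all observed values rather than discarding incomplete cases.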

  1. Application of human error theory in case analysis of wrong procedures.

    Science.gov (United States)

    Duthie, Elizabeth A

    2010-06-01

    The goal of this study was to contribute to the emerging body of literature about the role of human behaviors and cognitive processes in the commission of wrong procedures. Case analysis of 5 wrong procedures in operative and nonoperative settings using James Reason's human error theory was performed. The case analysis showed that cognitive underspecification, cognitive flips, automode processing, and skill-based errors were contributory to wrong procedures. Wrong-site procedures accounted for the preponderance of the cases. Front-line supervisory staff used corrective actions that focused on the performance of the individual without taking into account cognitive factors. System fixes using human cognition concepts have a greater chance of achieving sustainable safety outcomes than those that are based on the traditional approach of counseling, education, and disciplinary action for staff.

  2. UNC Nuclear Industries' human-factored approach to the operating or maintenance procedure

    International Nuclear Information System (INIS)

    Nelson, A.A.; Clark, J.E.

    1982-01-01

The development of Human Factors Engineering (HFE) and UNC Nuclear Industries' (UNC) commitment to minimizing the potential for human error in the performance of operating or maintenance procedures have led to a procedure upgrade program. Human-factored procedures were developed using information from many sources including, but not limited to, operators, a human factors specialist, engineers, and supervisors. This has resulted in the Job Performance Aid (JPA). This paper presents UNC's approach to providing human-factored operating and maintenance procedures.

  3. Factors affecting the design of instrument flight procedures

    Directory of Open Access Journals (Sweden)

    Ivan FERENCZ

    2008-01-01

The article highlights factors which might affect the design of instrument flight procedures. An Ishikawa diagram is used to distribute the individual factors into classes such as People, Methods, Regulations, Tools, Data, and Environment.

  4. Human factors evaluation of teletherapy: Human-system interfaces and procedures. Volume 3

    International Nuclear Information System (INIS)

    Kaye, R.D.; Henriksen, K.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

A series of human factors evaluations was undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. The principal sources of radiation are a radioactive isotope, typically cobalt-60 (Co-60), or a linear accelerator device capable of producing very high energy x-ray and electron beams. A team of human factors specialists conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. In addition, a panel of radiation oncologists, medical physicists, and radiation technologists served as subject matter experts. A function and task analysis was initially performed to guide subsequent evaluations in the areas of user-system interfaces, procedures, training and qualifications, and organizational policies and practices. The present report focuses on an evaluation of the human-system interfaces in relation to the treatment machines and supporting equipment (e.g., simulators, treatment planning computers, control consoles, patient charts) found in the teletherapy environment. The report also evaluates operating, maintenance, and emergency procedures and practices involved in teletherapy. The evaluations are based on the function and task analysis and established human engineering guidelines, where applicable.

  5. Human factors evaluation of teletherapy: Human-system interfaces and procedures. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    A series of human factors evaluations was undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. The principal sources of radiation are a radioactive isotope, typically cobalt-60 (Co-60), or a linear accelerator device capable of producing very high energy x-ray and electron beams. A team of human factors specialists conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. In addition, a panel of radiation oncologists, medical physicists, and radiation technologists served as subject matter experts. A function and task analysis was initially performed to guide subsequent evaluations in the areas of user-system interfaces, procedures, training and qualifications, and organizational policies and practices. The present report focuses on an evaluation of the human-system interfaces in relation to the treatment machines and supporting equipment (e.g., simulators, treatment planning computers, control consoles, patient charts) found in the teletherapy environment. The report also evaluates operating, maintenance and emergency procedures and practices involved in teletherapy. The evaluations are based on the function and task analysis and established human engineering guidelines, where applicable.

  6. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSIS – APPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

    and drug. This study confirmed the importance of using a valid analytical procedure for the purpose of carvedilol population pharmacokinetic analysis. Identification of demographic, pathophysiological and other factors that may influence the population carvedilol PK parameters gives the physician the possibility of a more comprehensive overview of the patient and better optimization of the therapeutic regimen.

  7. Procedures for measurement of anisotropy factor of neutron sources

    International Nuclear Information System (INIS)

    Creazolla, P.G.; Camargo, A.; Astuto, A.; Silva, F.; Pereira, W.W.

    2017-01-01

    Radioisotope neutron sources allow the production of reference fields for the calibration of neutron measurement devices for radioprotection and analysis purposes. When the emission rate of these sources is isotropic, no correction is necessary. However, variations in the source capsule material and in the concentration of the emitting material may produce differences in the neutron emission rate relative to the source axis; this effect is called anisotropy. A proposed procedure for measuring the anisotropy factor of the sources belonging to the IRD/LNMRI/LN Neutron Metrology Laboratory, using a Precision Long Counter (PLC) detector, is presented

  8. Safety analysis procedures for PHWR

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae; Yoo, Kun Joong

    2004-03-01

    The methodology of safety analyses for CANDU reactors in Canada, the vendor country, uses a combination of best-estimate physical models and conservative input parameters so as to minimize the uncertainty of the plant behaviour predictions. By using conservative input parameters, the results of the safety analyses are assured to meet the regulatory requirements, such as the public dose, the integrity of the fuel and fuel channels, the integrity of the containment and reactor structures, etc. However, there are no comprehensive and systematic safety analysis procedures for CANDU reactors in Korea. In this regard, safety analysis procedures for CANDU reactors are being developed, not only to establish a safety analysis system but also to enhance the quality assurance of the safety assessment. In the first phase of this study, general procedures for deterministic safety analyses were developed. These general procedures cover the specification of the initiating event, the selection of the methodology and accident sequences, computer codes, safety analysis procedures, the verification of errors and uncertainties, etc. Finally, these general procedures are applied to the Large Break Loss Of Coolant Accident (LBLOCA) in the Final Safety Analysis Report (FSAR) for Wolsong units 2, 3, and 4

  9. Factors Affecting Patient Satisfaction During Endoscopic Procedures

    International Nuclear Information System (INIS)

    Qureshi, M. O.; Shafqat, F.; Ahmed, S.; Niazi, T. K.; Khokhar, N. K.

    2013-01-01

    Objective: To assess the quality and patient satisfaction in the Endoscopy Unit of Shifa International Hospital. Study Design: Cross-sectional survey. Place and Duration of Study: Division of Gastroenterology, Shifa International Hospital, Islamabad, Pakistan, from July 2011 to January 2012. Methodology: Quality and patient satisfaction after the endoscopic procedure were assessed using a modified GHAA-9 questionnaire. Data were analyzed using SPSS version 16. Results: A total of 1028 patients were included with a mean age of 45 ± 14.21 years. Out of all the procedures, 670 (65.17%) were gastroscopies, 181 (17.60%) were flexible sigmoidoscopies and 177 (17.21%) were colonoscopies. The most frequent unsatisfactory responses concerned the waiting time before the procedure (13.13%), followed by unsatisfactory explanation of the procedure and answers to questions (7.58%). Overall, the unsatisfied impression rate was 4.86%. The problem rate was 6.22%. Conclusion: The quality of procedures and the level of satisfaction of patients undergoing a gastroscopy or colonoscopy were generally good. The factors that influence the satisfaction of these patients are related to communication between doctor and patient, the doctor's manner, and the waiting time for the procedure. Feedback information in an endoscopy unit may be useful in improving standards, including the performance of endoscopists. (author)

  10. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jeong, Kwangsup; Jung, Wondea

    2005-01-01

    In complex systems such as nuclear and chemical plants, it is well known that the provision of understandable procedures that allow operators to clarify what needs to be done and how to do it is one of the requisites for securing safety. As a previous study toward providing understandable procedures, the step complexity (SC) measure, which can quantify the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, the need for additional complexity factors that account for cognitive aspects in evaluating the complexity of procedural steps has been raised. To this end, this study compares operators' performance data, measured in the form of step performance times, with their behaviour in carrying out the prescribed activities of procedural steps. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of the additional complexity factors, it is strongly believed that changes in operators' performance data can be explained more faithfully if the additional complexity factors are taken into consideration

  11. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jinkyun [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)]. E-mail: kshpjk@kaeri.re.kr; Jeong, Kwangsup [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of); Jung, Wondea [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)

    2005-08-01

    In complex systems such as nuclear and chemical plants, it is well known that the provision of understandable procedures that allow operators to clarify what needs to be done and how to do it is one of the requisites for securing safety. As a previous study toward providing understandable procedures, the step complexity (SC) measure, which can quantify the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, the need for additional complexity factors that account for cognitive aspects in evaluating the complexity of procedural steps has been raised. To this end, this study compares operators' performance data, measured in the form of step performance times, with their behaviour in carrying out the prescribed activities of procedural steps. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of the additional complexity factors, it is strongly believed that changes in operators' performance data can be explained more faithfully if the additional complexity factors are taken into consideration.

  12. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Jinkyun Park; Kwangsup Jeong; Wondea Jung [Korea Atomic Energy Research Institute, Taejon (Korea). Integrated Safety Assessment Division

    2005-08-15

    In complex systems such as nuclear and chemical plants, it is well known that the provision of understandable procedures that allow operators to clarify what needs to be done and how to do it is one of the requisites for securing safety. As a previous study toward providing understandable procedures, the step complexity (SC) measure, which can quantify the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, the need for additional complexity factors that account for cognitive aspects in evaluating the complexity of procedural steps has been raised. To this end, this study compares operators' performance data, measured in the form of step performance times, with their behaviour in carrying out the prescribed activities of procedural steps. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of the additional complexity factors, it is strongly believed that changes in operators' performance data can be explained more faithfully if the additional complexity factors are taken into consideration. (author)

  13. The scientific use of factor analysis in behavioral and life sciences

    National Research Council Canada - National Science Library

    Cattell, Raymond Bernard

    1978-01-01

    ...; the choice of procedures in experimentation; factor interpretation; the relationship of factor analysis to broadened psychometric concepts such as scaling, validity, and reliability, and to higher-strata models...

  14. Selective Sequential Zero-Base Budgeting Procedures Based on Total Factor Productivity Indicators

    OpenAIRE

    A. Ishikawa; E. F. Sudit

    1981-01-01

    The authors' purpose in this paper is to develop productivity-based sequential budgeting procedures designed to expedite the identification of major problem areas in budgetary performance, as well as to reduce the costs associated with comprehensive zero-base analyses. The concept of total factor productivity is reviewed and its relations to ordinary and zero-based budgeting are discussed in detail. An outline for a selective sequential analysis based on monitoring of three key indicators of (a) i...

  15. EXPLORATORY FACTOR ANALYSIS (EFA) IN CONSUMER BEHAVIOR AND MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    Marcos Pascual Soler

    2012-06-01

    Exploratory Factor Analysis (EFA) is one of the most widely used statistical procedures in social research. The main objective of this work is to describe the most common practices used by researchers in the consumer behavior and marketing area. Using a literature review methodology, the EFA practices reported in five consumer behavior and marketing journals (2000-2010) were analyzed. Then, the choices made by the researchers concerning the factor model, retention criteria, rotation, factor interpretation, and other issues relevant to factor analysis were examined. The results suggest that researchers routinely conduct analyses using questionable methods. Suggestions for improving the use of factor analysis and the reporting of results are presented, and a checklist (Exploratory Factor Analysis Checklist, EFAC) is provided to help editors, reviewers, and authors improve the reporting of exploratory factor analyses.

  16. A receptor model for urban aerosols based on oblique factor analysis

    DEFF Research Database (Denmark)

    Keiding, Kristian; Sørensen, Morten S.; Pind, Niels

    1987-01-01

    A procedure is outlined for the construction of receptor models of urban aerosols, based on factor analysis. The advantage of the procedure is that the covariation of source impacts is included in the construction of the models. The results are compared with results obtained by other receptor-modelling procedures. It was found that procedures based on correlating sources were physically sound as well as in mutual agreement. Procedures based on non-correlating sources were found to generate physically obscure models.

  17. Updating QR factorization procedure for solution of linear least squares problem with equality constraints.

    Science.gov (United States)

    Zeb, Salman; Yousaf, Muhammad

    2017-01-01

    In this article, we present a QR updating procedure as a solution approach for linear least squares problem with equality constraints. We reduce the constrained problem to unconstrained linear least squares and partition it into a small subproblem. The QR factorization of the subproblem is calculated and then we apply updating techniques to its upper triangular factor R to obtain its solution. We carry out the error analysis of the proposed algorithm to show that it is backward stable. We also illustrate the implementation and accuracy of the proposed algorithm by providing some numerical experiments with particular emphasis on dense problems.
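    The problem this record addresses can be made concrete with a small sketch. The following is not the authors' QR-updating algorithm; it is a minimal null-space elimination example (all matrices and numbers hypothetical) showing what a linear least squares solution with one equality constraint looks like for two unknowns:

    ```python
    # Equality-constrained least squares: minimize ||A x - b||^2 subject to B x = d.
    # Minimal sketch for a single constraint in two unknowns, solved by the
    # null-space method (not the QR-updating scheme of the cited article).

    def constrained_lsq_2d(A, b, B, d):
        """Solve min ||A x - b|| s.t. B x = d for x in R^2, one constraint."""
        b1, b2 = B
        # A particular solution x0 of B x = d (assumes b1 != 0).
        x0 = [d / b1, 0.0]
        # A basis vector n of the null space of B, i.e. B . n = 0.
        n = [-b2, b1]
        # Substituting x = x0 + t n reduces the problem to 1-D least squares in t:
        # minimize ||(A n) t - (b - A x0)||  =>  t = (An).(b - A x0) / ||A n||^2
        An = [row[0] * n[0] + row[1] * n[1] for row in A]
        r0 = [bi - (row[0] * x0[0] + row[1] * x0[1]) for bi, row in zip(b, A)]
        t = sum(a * r for a, r in zip(An, r0)) / sum(a * a for a in An)
        return [x0[0] + t * n[0], x0[1] + t * n[1]]

    # Fit x1 + x2 = 1 while matching three observations as well as possible.
    A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
    b = [1.0, 2.0, 2.0]
    x = constrained_lsq_2d(A, b, B=[1.0, 1.0], d=1.0)
    print(x)  # the constrained minimizer; the constraint x[0] + x[1] = 1 holds
    ```

    The QR-based approach of the article solves the same optimization problem, but updates a previously computed factorization instead of eliminating the constraint directly.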

  18. Structural Analysis of Correlated Factors: Lessons from the Verbal-Performance Dichotomy of the Wechsler Scales.

    Science.gov (United States)

    Macmann, Gregg M.; Barnett, David W.

    1994-01-01

    Describes exploratory and confirmatory analyses of verbal-performance procedures to illustrate concepts and procedures for analysis of correlated factors. Argues that, based on convergent and discriminant validity criteria, factors should have higher correlations with variables that they purport to measure than with other variables. Discusses…

  19. Minerals sampling: sensibility analysis and correction factors for Pierre Gy's equation

    International Nuclear Information System (INIS)

    Vallebuona, G.; Niedbalski, F.

    2005-01-01

    Pierre Gy's equation is widely used in ore sampling. This equation is based on four parameters: the shape factor, the size distribution factor, the mineralogical factor, and the liberation factor. The usual practice is to consider fixed values for the shape and size distribution factors. This practice does not represent several important ores well. The mineralogical factor considers only one species of interest and the gangue, leaving out other cases such as polymetallic ores, where there is more than one species of interest. A sensitivity analysis of the factors in Gy's equation was performed, and a procedure to determine specific values for them was developed and is presented in this work. Mean ore characteristics, associated with an insecure use of the current procedure, were determined. Finally, for a case study, the effects of using each alternative were evaluated. (Author) 4 refs
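    For reference, the fundamental sampling error form of Gy's equation into which the four parameters above enter is commonly written (notation varies by author) as:

    ```latex
    % Variance of the fundamental sampling error in Gy's formula:
    % f = shape factor, g = size distribution factor, c = mineralogical factor,
    % \ell = liberation factor, d = nominal top particle size,
    % M_S = sample mass, M_L = lot mass.
    \sigma^{2}_{\mathrm{FSE}} = f\, g\, c\, \ell\, d^{3}
        \left( \frac{1}{M_S} - \frac{1}{M_L} \right)
    ```

    The usual fixed choices criticized in the abstract are the default values of f and g; the sensitivity analysis examines how strongly each of the four factors drives the variance.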

  20. Accident Sequence Evaluation Program: Human reliability analysis procedure

    International Nuclear Information System (INIS)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs

  1. Solid-phase extraction procedures in systematic toxicological analysis

    NARCIS (Netherlands)

    Franke, J.P.; de Zeeuw, R.A

    1998-01-01

    In systematic toxicological analysis (STA) the substance(s) present is (are) not known at the start of the analysis. In such an undirected search, the extraction procedure cannot be directed to a given substance but must be a general procedure in which a compromise must be reached in that the substances

  2. Confirmatory factor analysis using Microsoft Excel.

    Science.gov (United States)

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out a simple CFA. It is possible, with few or no programming skills, to effectively program a CFA and, thus, to gain insight into the workings of the procedure.
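    The kind of hand calculation the article advocates can be illustrated outside Excel as well. As a hedged sketch (not the article's own spreadsheet), a one-factor model with three indicators is just-identified, so its loadings follow in closed form from the sample covariances; the covariance values below are hypothetical:

    ```python
    # One-factor CFA with three indicators (just-identified case): the model
    # implies cov(x_i, x_j) = lambda_i * lambda_j for i != j, so the loadings
    # can be solved directly from the pairwise covariances s12, s13, s23.
    import math

    def one_factor_cfa(s11, s22, s33, s12, s13, s23):
        lam1 = math.sqrt(s12 * s13 / s23)
        lam2 = s12 / lam1
        lam3 = s13 / lam1
        # Unique (error) variances: psi_i = var(x_i) - lambda_i^2
        psi = (s11 - lam1 ** 2, s22 - lam2 ** 2, s33 - lam3 ** 2)
        return (lam1, lam2, lam3), psi

    # Covariances generated from true loadings (0.8, 0.7, 0.6), unit variances:
    # s12 = 0.8*0.7 = 0.56, s13 = 0.8*0.6 = 0.48, s23 = 0.7*0.6 = 0.42.
    loadings, uniquenesses = one_factor_cfa(1.0, 1.0, 1.0, 0.56, 0.48, 0.42)
    print(loadings)      # recovers approximately (0.8, 0.7, 0.6)
    print(uniquenesses)  # approximately (0.36, 0.51, 0.64)
    ```

    With more than three indicators the model is over-identified and the loadings must be fitted by minimizing a discrepancy function, which is exactly the iterative calculation the article implements in a spreadsheet.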

  3. Text mining factor analysis (TFA) in green tea patent data

    Science.gov (United States)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

    Factor analysis has become one of the most widely used multivariate statistical procedures in applied research across a multitude of domains. There are two main types of analysis based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model the relationships among a group of observed indicators and a latent variable, but they differ fundamentally in the a priori restrictions placed on the factor model. This method is applied to patent data from the green tea technology sector to determine the development of green tea technology in the world. Patent analysis is useful for identifying future technological trends in a specific field of technology. The patent database was obtained from the European Patent Organization (EPO). In this paper, the CFA model is applied to nominal data obtained from a presence-absence matrix; the CFA for these nominal data is based on the tetrachoric correlation matrix. Meanwhile, the EFA model is applied to titles from the dominant technology sector, which are first pre-processed using text mining.

  4. Human Factors Analysis of Pipeline Monitoring and Control Operations: Final Technical Report

    Science.gov (United States)

    2008-11-26

    The purpose of the Human Factors Analysis of Pipeline Monitoring and Control Operations project was to develop procedures that could be used by liquid pipeline operators to assess and manage the human factors risks in their control rooms that may adv...

  5. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    Science.gov (United States)

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists, the decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA. Nine different criteria describing the alternatives were chosen from different groups: metrological, economical and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
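    The core of a PROMETHEE II ranking is a short calculation: pairwise preference degrees weighted per criterion, then positive, negative, and net outranking flows. The sketch below uses the "usual" (step) preference function and entirely hypothetical procedures, scores, and weights, not the paper's 25 alternatives or expert-derived weights:

    ```python
    # PROMETHEE II net-flow ranking, minimal sketch. All criteria are to be
    # maximized and the "usual" preference function is used: P = 1 if an
    # alternative scores strictly higher on a criterion, else 0.

    def promethee_ii(names, scores, weights):
        n = len(names)
        # Aggregated preference pi(a, b) = sum_j w_j * P_j(f_j(a) - f_j(b)).
        pi = [[sum(w for w, fa, fb in zip(weights, scores[a], scores[b]) if fa > fb)
               for b in range(n)] for a in range(n)]
        ranking = []
        for a in range(n):
            plus = sum(pi[a][b] for b in range(n) if b != a) / (n - 1)   # phi+
            minus = sum(pi[b][a] for b in range(n) if b != a) / (n - 1)  # phi-
            ranking.append((names[a], plus - minus))                     # net phi
        return sorted(ranking, key=lambda t: -t[1])

    names = ["SPME", "LLE", "SPE"]               # hypothetical procedures
    scores = [[5, 4, 5], [2, 5, 1], [3, 3, 3]]   # greenness, cost, precision
    weights = [0.5, 0.2, 0.3]                    # hypothetical expert weights
    ranking = promethee_ii(names, scores, weights)
    print(ranking)  # SPME ranks first, LLE last
    ```

    Changing the weight vector is how the paper's three expert scenarios are produced; re-running the same flow calculation under each scenario gives the three rankings that are then compared.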

  6. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  7. Seismic Retrofit of Reinforced Concrete Frame Buildings with Hysteretic Bracing Systems: Design Procedure and Behaviour Factor

    Directory of Open Access Journals (Sweden)

    Antonio Di Cesare

    2017-01-01

    This paper presents a design procedure to evaluate the mechanical characteristics of hysteretic Energy Dissipation Bracing (EDB) systems for the seismic retrofitting of existing reinforced concrete framed buildings. The proposed procedure, which aims at controlling the maximum interstorey drifts, imposes a maximum top displacement as a function of the seismic demand and, if needed, regularizes the stiffness and strength of the building along its elevation. In order to illustrate the application of the proposed procedure and its capacity to involve most of the devices in the energy dissipation with a similar level of ductility demand, a simple benchmark structure has been studied and nonlinear dynamic analyses have been performed. A further goal of this work is to propose a simplified approach for designing dissipating systems based on linear analysis with the application of a suitable behaviour factor, in order to encourage widespread adoption of passive control techniques. To this end, the increase in structural performance due to the addition of an EDB system designed with the above-mentioned procedure has been estimated by considering one thousand case studies designed with different combinations of the main design parameters. An analytical formulation of the behaviour factor for braced buildings is proposed.

  8. Using P-Stat, BMDP and SPSS for a cross-products factor analysis.

    Science.gov (United States)

    Tanner, B A; Leiman, J M

    1983-06-01

    The major disadvantage of the Q factor analysis with Euclidean distances described by Tanner and Koning [Comput. Progr. Biomed. 12 (1980) 201-202] is the considerable editing required. An alternative procedure with commercially distributed software, and with cross-products in place of Euclidean distances is described. This procedure does not require any editing.

  9. Analysis and optimization of blood-testing procedures.

    NARCIS (Netherlands)

    Bar-Lev, S.K.; Boxma, O.J.; Perry, D.; Vastazos, L.P.

    2017-01-01

    This paper is devoted to the performance analysis and optimization of blood testing procedures. We present a queueing model of two queues in series, representing the two stages of a blood-testing procedure. Service (testing) in stage 1 is performed in batches, whereas it is done individually in

  10. Summative Mass Analysis of Algal Biomass - Integration of Analytical Procedures: Laboratory Analytical Procedure (LAP)

    Energy Technology Data Exchange (ETDEWEB)

    Laurens, Lieve M. L.

    2016-01-13

    This procedure guides the integration of laboratory analytical procedures to measure algal biomass constituents in an unambiguous manner and ultimately achieve mass balance closure for algal biomass samples. Many of these methods build on years of research in algal biomass analysis.

  11. U.F.F.A.: A numerical procedure for fatigue analysis according to ASME code

    International Nuclear Information System (INIS)

    Bellettato, W.; Ticozzi, C.; Zucchini, C.

    1981-01-01

    A new procedure is developed, which employs some already-used methodologies and introduces some new concepts. The computer code UFFA implements this procedure. The first part of this paper describes the methodology used for the usage factor calculation, the second part gives a general description of the code, and the third part shows some examples and their respective results. We assume an elastic behaviour of the materials and do not consider the effect of the order in which the loads are applied. Moreover, we assume the hypothesis of cumulative damage to be valid, that is, we apply Miner's rule. One of the problems in the fatigue analysis of nuclear components is that the load histories contain a high number of operational cycles for which no succession in time can be specified. Therefore, the concept of a 'level' (or steady working state) was introduced, by which the load conditions can be approximated in a realistic way. As regards multiaxial cases, it can be shown that it is neither correct nor conservative to analyse the 3 stress differences separately and then take the maximum of the 3 computed usage factors as the component usage factor. Indeed, as the stresses act on the structure at the same time, a simultaneous analysis of the 3 stress difference curves is necessary. The computer code can also deal with the case of shear stresses (varying principal stress directions) through the ASME 'normalization' procedure. The results of the UFFA program, compared with the results of other programs currently in use, come up to expectations. (orig./HP)
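    The cumulative damage hypothesis invoked above (Miner's rule) reduces to a simple sum once the allowable cycles for each load pair have been read from the design fatigue curve. A minimal sketch with hypothetical cycle counts:

    ```python
    # Miner's rule: the cumulative usage factor is CUF = sum(n_i / N_i), where
    # n_i is the number of applied cycles of load pair i and N_i the allowable
    # number of cycles from the design fatigue curve. Values are hypothetical.

    def cumulative_usage_factor(cycle_pairs):
        return sum(n_applied / n_allowable for n_applied, n_allowable in cycle_pairs)

    pairs = [(1000, 10000), (200, 2000), (50, 1000)]  # (applied, allowable)
    cuf = cumulative_usage_factor(pairs)
    print(cuf)         # partial usages 0.1 + 0.1 + 0.05
    print(cuf <= 1.0)  # acceptance criterion of the cumulative damage rule
    ```

    The difficulty the abstract describes is not this sum itself but forming the stress-difference cycle pairs correctly when the ordering of operational cycles is unknown and the stress state is multiaxial.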

  12. Fatigue Analysis of Tubesheet/Shell Juncture Applying the Mitigation Factor for Over-conservatism

    International Nuclear Information System (INIS)

    Kang, Deog Ji; Kim, Kyu Hyoung; Lee, Jae Gon

    2009-01-01

    If the environmental fatigue requirements are applied to the primary components of a nuclear power plant, to which the present ASME Code fatigue curves are applied, some locations with a high cumulative usage factor (CUF) are anticipated not to meet the code criteria. The application of environmental fatigue damage is still particularly controversial for plants with 60-year design lives. Therefore, a detailed fatigue analysis procedure needs to be developed to identify the conservatisms in the procedure and to lower the cumulative usage factor. Several factors are being considered to mitigate the conservatism, such as three-dimensional finite element modeling. In the present analysis, actual pressure transient data, instead of conservative maximum and minimum pressure data, were applied as one of the mitigation factors. Unlike in the general method, individual transient events were considered instead of grouped transient events. The tubesheet/shell juncture in the steam generator assembly is one of the weak locations and was therefore selected as the target for evaluating the mitigation factor in the present analysis

  13. Human Factors Evaluation of Procedures for Periodic Safety Review of Yonggwang Unit no. 1, 2

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Lee, Jung Woon; Park, Jae Chang (and others)

    2006-01-15

    This report describes the results of a human factors assessment of the plant operating procedures, performed as part of the Periodic Safety Review (PSR) of Yonggwang Nuclear Power Plant Units 1 and 2. The suitability of the items and the appropriateness of the format and structure of the key operating procedures were investigated through a review of plant operating experience and procedure documents, a field survey, and an experimental assessment of some parts of the procedures. A checklist was used to perform this assessment and record the review results. The reviewed procedures include the EOPs (Emergency Operating Procedures), GOPs (General Operating Procedures), AOPs (Abnormal Operating Procedures), and the management procedures of some technical departments. As a result of the assessments, no significant human factors problem challenging safety was found in the operating procedures. However, several small items to be changed and improved were discovered. An action plan is recommended to accommodate the suggestions and review comments; it will enhance plant safety through the operating procedures.

  14. Procedures monitoring and MAAP analysis

    International Nuclear Information System (INIS)

    May, R.S.

    1991-01-01

    Numerous studies of severe accidents in light water reactors have shown that operator response can play a crucial role in the predicted outcomes of dominant accident scenarios. MAAP provides the capability to specify certain operator actions as input data. However, making reasonable assumptions about the nature and timing of operator response requires substantial knowledge of plant practices and procedures and of what they imply for the event being analyzed. The appearance of knowledge-based software technology in the mid-1980s provided a natural format for representing and maintaining procedures as IF-THEN rules. The boiling water reactor (BWR) Emergency Operating Procedures Tracking System (EOPTS) was composed of a rule base of procedures and a dedicated inference engine (problem solver). Based on the general approach and experience of EOPTS, the authors have developed a prototype procedures-monitoring system that reads MAAP transient output files and evaluates the EOP messages and instructions that would be implied during each transient time interval. The prototype system was built using the NEXPERT OBJECT expert system development environment, running on a 386-class personal computer with 4 MB of memory. The limited-scope prototype includes a reduced set of BWR6 EOPs evaluated on a coarse time interval, a simple text-based user interface, and a summary-report generator. The prototype, which is limited to batch-mode analysis of MAAP output, is intended to demonstrate the concept and aid in the design of a production system, which will involve a direct link to MAAP and interactive capabilities
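
    The IF-THEN rule representation described above can be sketched in a few lines. The parameter names, setpoints, and messages below are hypothetical illustrations of the pattern, not actual BWR6 EOP logic:

    ```python
    # Minimal sketch of procedure rules as IF-THEN pairs evaluated per time interval.
    # Parameter names and setpoints are hypothetical, not actual BWR6 EOP values.
    rules = [
        (lambda s: s["rpv_level_cm"] < -50.0, "RC/L: restore RPV water level"),
        (lambda s: s["drywell_pressure_kpa"] > 200.0, "PC/P: control containment pressure"),
    ]

    def evaluate_interval(state):
        """Return the procedure messages implied by one transient output record."""
        return [message for condition, message in rules if condition(state)]

    # One MAAP-style output record per coarse time interval (values illustrative)
    record = {"rpv_level_cm": -80.0, "drywell_pressure_kpa": 150.0}
    print(evaluate_interval(record))  # only the level rule fires
    ```

    In a batch-mode monitor of this kind, each time interval of the transient output file would be converted to such a state record and run through the rule base, producing the message history for the summary report.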

  15. Item-level factor analysis of the Self-Efficacy Scale.

    Science.gov (United States)

    Bunketorp Käll, Lina

    2014-03-01

    This study explores the internal structure of the Self-Efficacy Scale (SES) using item response analysis. The SES was previously translated into Swedish and modified to encompass all types of pain, not exclusively back pain. Data on perceived self-efficacy in 47 patients with subacute whiplash-associated disorders were derived from a previously conducted randomized controlled trial. The item-level factor analysis was carried out using a six-step procedure. To further study the item inter-relationships and to determine the underlying structure empirically, the 20 items of the SES were also subjected to principal component analysis with varimax rotation. The analyses showed two underlying factors, named 'social activities' and 'physical activities', with seven items loading on each factor. The remaining six items of the SES appeared to measure somewhat different constructs and need to be analysed further.
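
    The varimax rotation step used in the analysis above can be sketched generically. This is a minimal implementation of the classic Kaiser varimax algorithm applied to a made-up two-factor loading matrix, not the SES data:

    ```python
    import numpy as np

    def varimax(loadings, max_iter=100, tol=1e-8):
        """Orthogonal varimax rotation of a loading matrix (p items x k factors)."""
        p, k = loadings.shape
        rotation = np.eye(k)
        criterion_old = 0.0
        for _ in range(max_iter):
            rotated = loadings @ rotation
            # SVD step of the classic varimax algorithm (maximizes loading variance)
            u, s, vt = np.linalg.svd(
                loadings.T
                @ (rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p)
            )
            rotation = u @ vt
            criterion = s.sum()
            if criterion - criterion_old < tol:
                break
            criterion_old = criterion
        return loadings @ rotation

    # Unrotated two-factor loadings for six hypothetical items
    unrotated = np.array([
        [0.7, 0.3], [0.6, 0.4], [0.8, 0.2],
        [0.3, -0.7], [0.4, -0.6], [0.2, -0.8],
    ])
    rotated = varimax(unrotated)
    # The rotation is orthogonal, so per-item communalities are preserved
    assert np.allclose((unrotated ** 2).sum(axis=1), (rotated ** 2).sum(axis=1))
    ```

    After rotation, each item tends to load strongly on one factor and weakly on the other, which is what makes the 'social activities' vs. 'physical activities' interpretation possible.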

  16. Supplement to procedures, analysis, and comparison of groundwater velocity measurement methods for unconfined aquifers

    International Nuclear Information System (INIS)

    Zinkl, R.J.; Kearl, P.M.

    1988-09-01

    This report is a supplement to Procedures, Analysis, and Comparison of Groundwater Velocity Measurement Methods for Unconfined Aquifers and provides computer program descriptions, type curves, and calculations for the analysis of field data in determining groundwater velocity in unconfined aquifers. The computer programs analyze bail or slug tests, pumping tests, Geoflo Meter data, and borehole dilution data. Appendix A is a description of the code, instructions for using the code, an example data file, and the calculated results to allow checking the code after installation on the user's computer. Calculations, development of formulas, and correction factors for the various programs are presented in Appendices B through F. Appendix G provides a procedure for calculating transmissivity and specific yield for pumping tests performed in unconfined aquifers

  17. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures currently available and of the experimental situations and analytical problems they address. The last point is extended by the description of an in-house development of the fundamental parameter method, which makes it possible to include nonparallel beam geometries. Finally, open problems for the quantification procedures are discussed

  18. System analysis procedures for conducting PSA of nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Yoon Hwan; Jeong, Won Dae; Kim, Tae Un; Kim, Kil You; Han, Sang Hoon; Chang, Seung Chul; Sung, Tae Yong; Yang, Jun Eon; Kang, Dae Il; Park, Jin Hee; Hwang, Mi Jeong; Jin, Young Ho.

    1997-03-01

    This document, the Probabilistic Safety Assessment (PSA) procedures guide for system analysis, is intended to provide guidelines for analyzing target systems consistently and rigorously in the performance of PSA for nuclear power plants (NPPs). The guide has been prepared in accordance with the procedures and techniques of fault tree analysis (FTA) as used in system analysis. Normally, the main objective of system analysis is to assess the reliability of the systems modeled in the Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA represents the failure logic of plant systems deductively using AND, OR, and NOT gates. The fault tree should reflect all possible failure modes that may contribute to system unavailability. This should include contributions from mechanical failures of components, Common Cause Failures (CCFs), human errors, and outages for testing and maintenance. After construction of the fault tree is completed, system unavailability is calculated with the CUT module of KIRAP, and the qualitative and quantitative analyses are performed through the process stated above. The procedures for system analysis are based on the PSA procedures and methods that have been applied to the safety assessments of NPPs under construction in the country. Accordingly, the FTA method stated in this procedures guide will be applicable to PSA for NPPs to be constructed in the future. (author). 6 tabs., 11 figs., 7 refs
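
    The AND/OR gate logic can be illustrated with a minimal top-event calculation. The system and event probabilities below are hypothetical, basic events are assumed independent, and CCFs are not modeled (a real PSA tool such as KIRAP handles those via cut-set generation):

    ```python
    def and_gate(*probs):
        """AND gate: the output fails only if every input fails (independent events)."""
        result = 1.0
        for p in probs:
            result *= p
        return result

    def or_gate(*probs):
        """OR gate: the output fails if at least one input fails (independent events)."""
        survive = 1.0
        for p in probs:
            survive *= 1.0 - p
        return 1.0 - survive

    # Hypothetical top event: pump train A fails AND (pump train B fails OR valve fails)
    unavailability = and_gate(1e-3, or_gate(2e-3, 5e-4))
    print(f"system unavailability = {unavailability:.3e}")
    ```

    Nesting these two functions reproduces the deductive structure of a fault tree; quantification then reduces to evaluating the tree bottom-up from basic-event probabilities.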

  19. System analysis procedures for conducting PSA of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Jeong, Won Dae; Kim, Tae Un; Kim, Kil You; Han, Sang Hoon; Chang, Seung Chul; Sung, Tae Yong; Yang, Jun Eon; Kang, Dae Il; Park, Jin Hee; Hwang, Mi Jeong; Jin, Young Ho

    1997-03-01

    This document, the Probabilistic Safety Assessment (PSA) procedures guide for system analysis, is intended to provide guidelines for analyzing target systems consistently and rigorously in the performance of PSA for nuclear power plants (NPPs). The guide has been prepared in accordance with the procedures and techniques of fault tree analysis (FTA) as used in system analysis. Normally, the main objective of system analysis is to assess the reliability of the systems modeled in the Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA represents the failure logic of plant systems deductively using AND, OR, and NOT gates. The fault tree should reflect all possible failure modes that may contribute to system unavailability. This should include contributions from mechanical failures of components, Common Cause Failures (CCFs), human errors, and outages for testing and maintenance. After construction of the fault tree is completed, system unavailability is calculated with the CUT module of KIRAP, and the qualitative and quantitative analyses are performed through the process stated above. The procedures for system analysis are based on the PSA procedures and methods that have been applied to the safety assessments of NPPs under construction in the country. Accordingly, the FTA method stated in this procedures guide will be applicable to PSA for NPPs to be constructed in the future. (author). 6 tabs., 11 figs., 7 refs.

  20. Factors that influence length of stay for in-patient gynaecology surgery: is the Case Mix Group (CMG) or type of procedure more important?

    Science.gov (United States)

    Carey, Mark S; Victory, Rahi; Stitt, Larry; Tsang, Nicole

    2006-02-01

    To compare the association between the Case Mix Group (CMG) code and length of stay (LOS) with the association between the type of procedure and LOS in patients admitted for gynaecology surgery. We examined the records of women admitted for surgery in CMG 579 (major uterine/adnexal procedure, no malignancy) or 577 (major surgery ovary/adnexa with malignancy) between April 1997 and March 1999. Factors thought to influence LOS included age, weight, American Society of Anesthesiologists (ASA) score, physician, day of the week on which surgery was performed, and procedure type. Procedures were divided into six categories, four for CMG 579 and two for CMG 577. Data were abstracted from the hospital information costing system (T2 system) and by retrospective chart review. Multivariable analysis was performed using linear regression with backwards elimination. There were 606 patients in CMG 579 and 101 patients in CMG 577, and the corresponding median LOS was four days (range 1-19) for CMG 579 and nine days (range 3-30) for CMG 577. Combined analysis of CMGs 577 and 579 revealed the following factors as highly significant determinants of LOS: procedure, age, physician, and ASA score. Although confounded by procedure type, the CMG did not significantly account for differences in LOS in the model when procedure type was considered. Pairwise comparisons of procedure categories were all found to be statistically significant, even when controlled for other important variables. The type of procedure better accounts for differences in LOS by describing six statistically distinct procedure groups rather than the traditional two CMGs. It is therefore reasonable to consider changing the current CMG codes for gynaecology to a classification based on the type of procedure.

  1. ORNL-PWR BDHT analysis procedure: an overview

    International Nuclear Information System (INIS)

    Cliff, S.B.

    1978-01-01

    The key computer programs currently used by the analysis procedure of the ORNL-PWR Blowdown Heat Transfer Separate Effects Program are overviewed with particular emphasis placed on their interrelationships. The major modeling and calculational programs, COBRA, ORINC, ORTCAL, PINSIM, and various versions of RELAP4, are summarized and placed into the perspective of the procedure. The supportive programs, REDPLT, ORCPLT, BDHTPLOT, OXREPT, and OTOCI, and their uses are described

  2. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  3. Assessment of job stress factors and organizational personality types for procedure-based jobs in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Dae-Ho; Lee, Yong-Hee; Lee, Jung-Woon

    2008-01-01

    The purpose of this study is to assess the organizational types and the job stress factors that affect procedure-based job performances in nuclear power plants. We derived 24 organizational factors affecting job stress level in nuclear power plants from the job stress analysis models developed by NIOSH, JDI, and IOR. Considering the safety characteristics in the operating tasks of nuclear power plants, we identified the job contents and characteristics through the analyses of job assignments that appeared in the organizational chart and the results of an activity-based costing. By using questionnaire surveys and structured interviews with the plant personnel and expert panels, we assessed 70 jobs among the 777 jobs managed officially in accordance with the procedures. They consist of the representative jobs of each department and are directly related to safety. We utilized the organizational personality type indicators to characterize the personality types of each organization in nuclear power plants. (author)

  4. Flood risk analysis procedure for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.

    1982-01-01

    This paper describes a methodology and procedure for determining the impact of floods on nuclear power plant risk. The procedures are based on techniques of fault tree and event tree analysis and use the logic of these techniques to determine the effects of a flood on system failure probability and accident sequence occurrence frequency. The methodology can be applied independently or as an add-on analysis for an existing risk assessment. Each stage of the analysis yields useful results such as the critical flood level, failure flood level, and the flood's contribution to accident sequence occurrence frequency. The results of applications show the effects of floods on the risk from nuclear power plants analyzed in the Reactor Safety Study

  5. Human Factors Process Task Analysis: Liquid Oxygen Pump Acceptance Test Procedure at the Advanced Technology Development Center

    Science.gov (United States)

    Diorio, Kimberly A.; Voska, Ned (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define system; Identify human-machine; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.

  6. Long-term results of Draf 3 procedure

    NARCIS (Netherlands)

    Georgalas, C.; Hansen, F.; Videler, W. J. M.; Fokkens, W. J.

    2011-01-01

    To assess the effectiveness and factors associated with restenosis after Draf type III (Endoscopic Modified Lothrop) frontal sinus drainage procedure. Retrospective analysis of prospectively collected data. A hundred and twenty two consecutive patients undergoing Draf III procedure for recalcitrant

  7. Cement Leakage in Percutaneous Vertebral Augmentation for Osteoporotic Vertebral Compression Fractures: Analysis of Risk Factors.

    Science.gov (United States)

    Xie, Weixing; Jin, Daxiang; Ma, Hui; Ding, Jinyong; Xu, Jixi; Zhang, Shuncong; Liang, De

    2016-05-01

    The risk factors for cement leakage were retrospectively reviewed in 192 patients who underwent percutaneous vertebral augmentation (PVA). The aim was to identify the factors related to cement leakage in the PVA procedure for the treatment of osteoporotic vertebral compression fractures. PVA is widely applied for the treatment of osteoporotic vertebral fractures; cement leakage is a major complication of this procedure, and its risk factors remain controversial. A retrospective review of 192 patients who underwent PVA was conducted. The following data were recorded: age, sex, bone density, number of fractured vertebrae before surgery, number of treated vertebrae, severity of the treated vertebrae, operative approach, volume of injected bone cement, preoperative vertebral compression ratio, preoperative local kyphosis angle, intraosseous clefts, preoperative vertebral cortical bone defect, and ratio and type of cement leakage. To study the correlation between each factor and the cement leakage ratio, bivariate regression analysis was employed for the univariate analysis, and multivariate linear regression analysis for the multivariate analysis. The study included 192 patients (282 treated vertebrae), and cement leakage occurred in 100 vertebrae (35.46%). Vertebrae with preoperative cortical bone defects generally exhibited a higher cement leakage ratio, and the leakage was typically type C; vertebrae with intact cortical bone before the procedure tended to experience type S leakage. Univariate analysis showed that patient age, bone density, number of fractured vertebrae before surgery, and vertebral cortical bone defect were associated with the cement leakage ratio. The independent factors associated with cement leakage are bone density and vertebral cortical bone defect, with standardized partial regression coefficients of -0.085 and 0.144, respectively. Low bone density and vertebral cortical bone defect are independent risk factors associated with bone cement leakage.
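
    Standardized partial regression coefficients like those reported above come from ordinary least squares on z-scored variables. A minimal sketch on synthetic data; the variable names mirror the study's two retained predictors, but the values and effect sizes are invented for illustration:

    ```python
    import numpy as np

    def standardized_coefficients(X, y):
        """OLS on z-scored predictors and outcome -> standardized partial coefficients."""
        Xz = (X - X.mean(axis=0)) / X.std(axis=0)
        yz = (y - y.mean()) / y.std()
        beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
        return beta

    rng = np.random.default_rng(42)
    n = 192  # same sample size as the study, but synthetic values
    bone_density = rng.normal(size=n)                             # continuous stand-in
    cortical_defect = rng.integers(0, 2, size=n).astype(float)    # 0/1 indicator
    leakage_ratio = (-0.10 * bone_density + 0.15 * cortical_defect
                     + rng.normal(scale=0.1, size=n))             # made-up outcome

    beta = standardized_coefficients(
        np.column_stack([bone_density, cortical_defect]), leakage_ratio)
    print(beta)  # negative for density, positive for the defect indicator
    ```

    The signs mirror the study's finding: lower bone density and the presence of a cortical defect both push the leakage ratio up.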

  8. Aesthetic Breast Surgery and Concomitant Procedures: Incidence and Risk Factors for Major Complications in 73,608 Cases.

    Science.gov (United States)

    Gupta, Varun; Yeslev, Max; Winocour, Julian; Bamba, Ravinder; Rodriguez-Feo, Charles; Grotting, James C; Higdon, K Kye

    2017-05-01

    Major complications following aesthetic breast surgery are uncommon, and assessment of risk factors is therefore challenging. The aims were to determine the incidence and risk factors of major complications following aesthetic breast surgery and concomitant procedures. A prospective cohort of patients who enrolled in the CosmetAssure (Birmingham, AL) insurance program and underwent aesthetic breast surgery between 2008 and 2013 was identified. Major complications (requiring reoperation, readmission, or an emergency room visit) within 30 days of surgery were recorded. Risk factors including age, smoking, body mass index (BMI), diabetes, type of surgical facility, and combined procedures were evaluated. Among women, augmentation was the most common breast procedure (n = 41,651, 58.6%), followed by augmentation-mastopexy, mastopexy, and reduction. Overall, major complications occurred in 1.46%, with hematoma (0.99%) and infection (0.25%) being the most common. Augmentation-mastopexy had a higher risk of complications, particularly infection (relative risk [RR] 1.74). Age was the only significant predictor for hematomas (RR 1.01). Combined procedures carried a higher complication risk than breast procedures or abdominoplasty performed alone. Among men, correction of gynecomastia was the most common breast procedure (n = 1613, 64.6%), with a complication rate of 1.80% and smoking as a risk factor (RR 2.73, P = 0.03). The incidence of major complications after cosmetic breast surgical procedures is low. Risk factors for major complications include increasing age and BMI. Combining abdominoplasty with any breast procedure increases the risk of major complications.

  9. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  10. Safeguards Network Analysis Procedure (SNAP): overview

    International Nuclear Information System (INIS)

    Chapman, L.D.; Engi, D.

    1979-08-01

    Nuclear safeguards systems provide physical protection and control of nuclear materials. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard analysis methodology for the evaluation of physical protection system effectiveness. This is achieved through a standard set of symbols which characterize the various elements of safeguards systems and an analysis program to execute simulation models built using the SNAP symbology. The outputs provided by the SNAP simulation program supplement the safeguards analyst's evaluative capabilities and support the evaluation of existing sites as well as alternative design possibilities. This paper describes the SNAP modeling technique and provides an example illustrating its use

  11. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by eliminating the need to update hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  12. Risk factors for unplanned readmission within 30 days after pediatric neurosurgery: a nationwide analysis of 9799 procedures from the American College of Surgeons National Surgical Quality Improvement Program

    Science.gov (United States)

    Sherrod, Brandon A.; Johnston, James M.; Rocque, Brandon G.

    2017-01-01

    Objective: Readmission rate is increasingly used as a quality outcome measure after surgery. The purpose of this study was to establish, using a national database, the baseline readmission rates and risk factors for readmission after pediatric neurosurgical procedures. Methods: The American College of Surgeons National Surgical Quality Improvement Program-Pediatric database was queried for pediatric patients treated by a neurosurgeon from 2012 to 2013. Procedures were categorized by Current Procedural Terminology code. Patient demographics, comorbidities, preoperative laboratory values, operative variables, and postoperative complications were analyzed via univariate and multivariate techniques to find associations with unplanned readmission within 30 days of the primary procedure. Results: A total of 9799 cases met the inclusion criteria, 1098 (11.2%) of which had an unplanned readmission within 30 days. Readmission occurred 14.0 ± 7.7 days postoperatively (mean ± standard deviation). The 4 procedures with the highest unplanned readmission rates were CSF shunt revision (17.3%), repair of myelomeningocele > 5 cm in diameter (15.4%), CSF shunt creation (14.1%), and craniectomy for infratentorial tumor excision (13.9%). Spine (6.5%), craniotomy for craniosynostosis (2.1%), and skin lesion (1.0%) procedures had the lowest unplanned readmission rates. On multivariate regression analysis, the odds of readmission were greatest in patients experiencing postoperative surgical site infection (SSI; deep, organ/space, and superficial SSI and wound disruption: OR > 12), which markedly increased readmission risk. Independent patient risk factors for unplanned readmission included Native American race (OR 2.363, p = 0.019), steroid use > 10 days (OR 1.411, p = 0.010), oxygen supplementation (OR 1.645, p = 0.010), nutritional support (OR 1.403, p = 0.009), seizure disorder (OR 1.250, p = 0.021), and longer operative time (per hour increase, OR 1.059, p = 0.014). Conclusions: This study may aid in
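
    The unadjusted form of the odds ratios reported above can be computed directly from a 2x2 table. The counts below are made up for illustration; the study's adjusted ORs come from a multivariate logistic model, not this calculation:

    ```python
    def odds_ratio(exposed_event, exposed_no_event, unexposed_event, unexposed_no_event):
        """Unadjusted odds ratio from a 2x2 table.

        Rows: exposed / unexposed to the risk factor.
        Columns: readmitted / not readmitted.
        """
        return (exposed_event / exposed_no_event) / (unexposed_event / unexposed_no_event)

    # Hypothetical counts: readmission among patients with vs. without a risk factor
    print(odds_ratio(20, 80, 10, 90))  # ~2.25
    ```

    An OR above 1 means the factor is associated with higher odds of readmission; multivariate regression then adjusts each factor's OR for the others.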

  13. Human factors review for Severe Accident Sequence Analysis (SASA)

    International Nuclear Information System (INIS)

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper will discuss work being conducted during this human factors review including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate contributions to SASA analyses from human factors data and methods. A descriptive model was developed called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and which is useful for identifying needs/deficiencies in the area of accident management. The assessment of human factors issues related to ATWS required extensive coordination with SASA analysts. The analysis was consolidated primarily to six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitates a knowledge-based emergency response by the operators. The FOAM model provides a functionally-oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid/minimize core degradation. Operators must also respond to potential radiological release beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure

  14. Computation of stress intensity factors for nozzle corner cracks by various finite element procedures

    International Nuclear Information System (INIS)

    Broekhoven, M.J.G.

    1975-01-01

    The present study aims at deriving accurate K-factors for a series of 5 elliptical nozzle corner cracks of increasing size by various finite element procedures, using a three-level recursive substructuring scheme to perform the computations in an economic way on an intermediate size computer (IBM 360/65 system). A nozzle on a flat plate has been selected for subsequent experimental verification, this configuration being considered an adequate simulation of a nozzle on a shallow shell. The computations have been performed with the ASKA finite element system using mainly HEXEC-27 (incomplete quartic) elements. The geometry has been subdivided into 5 subnets with a total of 3515 nodal points and 6250 unknowns, two main nets and one hyper net. Each crack front is described by 11 nodal points and all crack front nodes are inserted in the hyper net, which allows for the realization of the successive crack geometries by changing only a relatively small hyper net (615 to 725 unknowns). Output data have been interpreted in terms of K-factors by the global energy method, the displacement method and the stress method. Besides, a stiffness derivative procedure, recently developed at Brown University, which takes full advantage of the finite element formulation to calculate local K-factors, has been applied. Finally it has been investigated whether sufficiently accurate results can be obtained by analyzing a considerably smaller part than one half of the geometry (as strictly required by symmetry considerations), using fixed boundary conditions derived from a far cheaper analysis of the uncracked structure

  15. Procedural-support music therapy in the healthcare setting: a cost-effectiveness analysis.

    Science.gov (United States)

    DeLoach Walworth, Darcy

    2005-08-01

    This comparative analysis examined the cost-effectiveness of music therapy as a procedural support in the pediatric healthcare setting. Many healthcare organizations are actively attempting to reduce the amount of sedation for pediatric patients undergoing various procedures. Patients receiving music therapy-assisted computerized tomography scans (n = 57), echocardiograms (n = 92), and other procedures (n = 17) were included in the analysis. Results of music therapy-assisted procedures indicate successful elimination of patient sedation, reduction in procedural times, and decrease in the number of staff members present for procedures. Implications for nurses and music therapists in the healthcare setting are discussed.

  16. Simple procedure for evaluating earthquake response spectra of large-event motions based on site amplification factors derived from smaller-event records

    International Nuclear Information System (INIS)

    Dan, Kazuo; Miyakoshi, Jun-ichi; Yashiro, Kazuhiko.

    1996-01-01

    A simple procedure was proposed for evaluating the earthquake response spectra of large-event motions by making use of records from smaller events. The results of a regression analysis of the response spectra were utilized to obtain the site amplification factors in the proposed procedure, and the formulation of the seismic-source term in the regression analysis was examined. A linear form of the moment magnitude, Mw, is adequate for scaling the source term of moderate earthquakes with Mw of 5.5 to 7.0, while a quadratic form of Mw and the ω-square source-spectrum model are appropriate for scaling the source terms of smaller and greater earthquakes, respectively. (author). 52 refs
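
    The ω-square source-spectrum model referenced above can be written down directly. A sketch of the standard Brune displacement spectrum together with Hanks-Kanamori moment scaling; these are the generic textbook formulas, not the paper's fitted regression:

    ```python
    import math

    def moment_from_mw(mw):
        """Seismic moment M0 in N*m from moment magnitude (Hanks-Kanamori relation)."""
        return 10.0 ** (1.5 * mw + 9.1)

    def omega_square_spectrum(f, m0, fc):
        """Brune omega-square displacement source spectrum: flat at M0 below the
        corner frequency fc, falling off as f**-2 above it."""
        return m0 / (1.0 + (f / fc) ** 2)

    m0 = moment_from_mw(6.0)
    print(omega_square_spectrum(0.01, m0, 1.0) / m0)  # ~1 on the low-frequency plateau
    print(omega_square_spectrum(10.0, m0, 1.0) / m0)  # ~1e-2 beyond the corner
    ```

    The low-frequency plateau scales with seismic moment (hence with Mw), which is what makes it possible to scale source terms for event sizes outside the recorded magnitude range.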

  17. A Quantitative Review of Functional Analysis Procedures in Public School Settings

    Science.gov (United States)

    Solnick, Mark D.; Ardoin, Scott P.

    2010-01-01

    Functional behavioral assessments can consist of indirect, descriptive and experimental procedures, such as a functional analysis. Although the research contains numerous examples demonstrating the effectiveness of functional analysis procedures, experimental conditions are often difficult to implement in classroom settings and analog conditions…

  18. Seismic analysis response factors and design margins of piping systems

    International Nuclear Information System (INIS)

    Shieh, L.C.; Tsai, N.C.; Yang, M.S.; Wong, W.L.

    1985-01-01

    The objective of the simplified methods project of the Seismic Safety Margins Research Program is to develop a simplified seismic risk methodology for general use. The goal is to reduce seismic PRA costs to roughly 60 man-months over a 6 to 8 month period, without compromising the quality of the product. To achieve the goal, it is necessary to simplify the calculational procedure of the seismic response. The response factor approach serves this purpose. The response factor relates the median level response to the design data. Through a literature survey, we identified the various seismic analysis methods adopted in the U.S. nuclear industry for the piping system. A series of seismic response calculations was performed. The response factors and their variabilities for each method of analysis were computed. A sensitivity study of the effect of piping damping, the in-structure response spectra envelope method, and the analysis method was conducted. In addition, design margins, which relate the best-estimate response to the design data, are also presented

  19. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of the multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented

  20. Experience with conventional inelastic analysis procedures in very high temperature applications

    International Nuclear Information System (INIS)

    Mallett, R.H.; Thompson, J.M.; Swindeman, R.W.

    1991-01-01

    Conventional incremental plasticity and creep analysis procedures for inelastic analysis are applied to hot flue gas cleanup system components. These flue gas systems operate at temperatures where plasticity and creep are very much intertwined, while the two phenomena are treated separately in the conventional inelastic analysis procedure. Data for RA333 material are represented in forms appropriate for the conventional inelastic analysis procedures. Behavior is predicted for typical operating cycles. Creep-fatigue damage is estimated based upon usage fractions. Excessive creep damage is predicted; the major contributions occur during high-stress short-term intervals caused by rapid temperature changes. These results are presented and their interpretation discussed in terms of creep-fatigue damage for very high temperature applications

  1. Sensitivity of the diagnostic radiological index of protection to procedural factors in fluoroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Jones, A. Kyle, E-mail: kyle.jones@mdanderson.org [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Pasciak, Alexander S. [Department of Radiology, The University of Tennessee Medical Center at Knoxville, Knoxville, Tennessee 37922 (United States); Wagner, Louis K. [Department of Diagnostic and Interventional Imaging, The John P. and Katharine G. McGovern Medical School, Houston, Texas 77030 (United States)

    2016-07-15

    Purpose: To evaluate the sensitivity of the diagnostic radiological index of protection (DRIP), used to quantify the protective value of radioprotective garments, to procedural factors in fluoroscopy in an effort to determine an appropriate set of scatter-mimicking primary beams to be used in measuring the DRIP. Methods: Monte Carlo simulations were performed to determine the shape of the scattered x-ray spectra incident on the operator in different clinical fluoroscopy scenarios, including interventional radiology and interventional cardiology (IC). Two clinical simulations studied the sensitivity of the scattered spectrum to gantry angle and patient size, while technical factors were varied according to measured automatic dose rate control (ADRC) data. Factorial simulations studied the sensitivity of the scattered spectrum to gantry angle, field of view, patient size, and beam quality for constant technical factors. Average energy (E_avg) was the figure of merit used to condense fluence in each energy bin to a single numerical index. Results: Beam quality had the strongest influence on the scattered spectrum in fluoroscopy. Many procedural factors affect the scattered spectrum indirectly through their effect on primary beam quality through ADRC, e.g., gantry angle and patient size. Lateral C-arm rotation, common in IC, increased the energy of the scattered spectrum, regardless of the direction of rotation. The effect of patient size on scattered radiation depended on ADRC characteristics, patient size, and procedure type. Conclusions: The scattered spectrum striking the operator in fluoroscopy is most strongly influenced by primary beam quality, particularly kV. Use cases for protective garments should be classified by typical procedural primary beam qualities, which are governed by the ADRC according to the impacts of patient size, anatomical location, and gantry angle.

  2. The mathematical pathogenetic factors analysis of acute inflammatory diseases development of bronchopulmonary system among infants

    Directory of Open Access Journals (Sweden)

    G. O. Lezhenko

    2017-10-01

    Full Text Available The purpose. To study the factor structure and to establish the associative interaction of pathogenetic links in the development of acute diseases of the bronchopulmonary system in infants. Materials and methods. The examination group consisted of 59 infants (average age 13.8 ± 1.4 months) sick with acute inflammatory bronchopulmonary diseases. We also tested the serum levels of 25-hydroxyvitamin D (25(OH)D), vitamin D-binding protein, hBPI, cathelicidin LL-37, β1-defensins and lactoferrin by enzyme immunoassay. Selection of prognostically important pathogenetic factors of acute bronchopulmonary disease among infants was conducted using ROC analysis. Objects were classified using hierarchical cluster analysis with centroid-based clustering. Results. Based on the results of the ROC analysis, 15 potential predictors of the development of acute inflammatory diseases of the bronchopulmonary system among infants were selected. The factor analysis made it possible to determine the 6 main components. The biggest influence on the development of the disease was exerted by "the anemia factor", "the factor of inflammation", "the maternal factor", "the vitamin D supply factor", "the immune factor" and "the phosphorus-calcium exchange factor", each with a factor load of more than 0.6. The hierarchical cluster analysis confirmed the initial role of immuno-inflammatory components. Conclusions. The highlighted factors allowed us to define a group of parameters that must be influenced to achieve a maximum effect in carrying out preventive and therapeutic measures. First of all, it is necessary to influence "the anemia factor" and "the calcium exchange factor", as well as "the vitamin D supply factor"; in other words, to correct vitamin D deficiency and carry out measures aimed at preventing the development of anemia. The prevention and treatment of the pathological course of
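ROC-based predictor screening of the kind described above can be sketched as follows: compute each candidate's area under the ROC curve (AUC) against the outcome and keep those that discriminate well. The threshold and the screening rule are illustrative assumptions, not values from the study:

```python
def auc(scores, labels):
    """Area under the ROC curve by pairwise comparison: the probability
    that a randomly chosen positive case scores higher than a randomly
    chosen negative case (ties count 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def screen_predictors(table, labels, threshold=0.7):
    """Keep candidate predictors whose AUC (or 1 - AUC, for markers that
    run in the opposite direction) exceeds the threshold. A simplified
    stand-in for ROC-based predictor selection."""
    kept = {}
    for name, scores in table.items():
        a = auc(scores, labels)
        if max(a, 1.0 - a) >= threshold:
            kept[name] = a
    return kept
```

The retained predictors would then feed the factor analysis and hierarchical clustering steps described in the abstract.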

  3. Cost analysis of robotic versus laparoscopic general surgery procedures.

    Science.gov (United States)

    Higgins, Rana M; Frelich, Matthew J; Bosler, Matthew E; Gould, Jon C

    2017-01-01

    Robotic surgical systems have been used at a rapidly increasing rate in general surgery. Many of these procedures have been performed laparoscopically for years. In a surgical encounter, a significant portion of the total costs is associated with consumable supplies. Our hospital system has invested in a software program that can track the costs of consumable surgical supplies. We sought to determine the differences in cost of consumables with elective laparoscopic and robotic procedures for our health care organization. De-identified procedural cost and equipment utilization data were collected from the Surgical Profitability Compass Procedure Cost Manager System (The Advisory Board Company, Washington, DC) for our health care system for laparoscopic and robotic cholecystectomy, fundoplication, and inguinal hernia between the years 2013 and 2015. Outcomes were length of stay, case duration, and supply cost. Statistical analysis was performed using a t-test for continuous variables, and statistical significance was defined as p < 0.05. Supply costs were significantly greater for robotic procedures. Length of stay did not differ for fundoplication or cholecystectomy. Length of stay was greater for robotic inguinal hernia repair. Case duration was similar for cholecystectomy (84.3 robotic and 75.5 min laparoscopic, p = 0.08), but significantly longer for robotic fundoplication (197.2 robotic and 162.1 min laparoscopic, p = 0.01) and inguinal hernia repair (124.0 robotic and 84.4 min laparoscopic, p < 0.01). We found a significantly increased cost of general surgery procedures for our health care system when cases commonly performed laparoscopically are instead performed robotically. Our analysis is limited by the fact that we only included costs associated with consumable surgical supplies. The initial acquisition cost (over $1 million for a robotic surgical system), depreciation, and service contract for the robotic and laparoscopic systems were not included in this analysis.
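The t-test comparison of continuous outcomes (supply cost, case duration) used above can be sketched as follows. The Welch unequal-variance form is an assumption on my part, since the abstract does not state which t-test variant was used:

```python
import math

def welch_t(a, b):
    """Welch's unequal-variance t statistic and degrees of freedom for two
    independent samples, e.g. per-case supply costs for robotic vs.
    laparoscopic groups."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2a, se2b = va / na, vb / nb                   # squared standard errors
    t = (ma - mb) / math.sqrt(se2a + se2b)
    df = (se2a + se2b) ** 2 / (se2a ** 2 / (na - 1) + se2b ** 2 / (nb - 1))
    return t, df
```

The p-value then follows from the t distribution with df degrees of freedom; a negative t here means group a has the lower mean.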

  4. Procedures as a Contributing Factor to Events in the Swedish Nuclear Power Plants. Analysis of a Database with Licensee Event Reports 1995-1999

    International Nuclear Information System (INIS)

    Bento, Jean-Pierre

    2002-12-01

    The operating experience from the twelve Swedish nuclear power units has been reviewed for the years 1995-1999 with respect to events - both scrams and Licensee Event Reports, LERs - to which a deficient procedure has been a contributing cause. In the present context 'procedure' is defined as all written documentation used for the planning, performance and control of the tasks necessary for the operation and maintenance of the plants. The study used an MTO database (Man - Technology - Organisation) containing, for the five years studied, 42 MTO-related scrams out of the 87 scrams that occurred, and about 800 MTO-related LERs out of 2000 reported LERs. On average, deficient procedures contribute to approximately 0.2 scrams/unit/year and to slightly more than three LERs/unit/year. Presented differently, procedure-related scrams amount to 15% of the total number of scrams and to 31% of the MTO-related scrams. Similarly, procedure-related LERs amount to 10% of the total number of LERs and to 25% of the MTO-related LERs. For the most frequent work types performed at the plants, procedure-related LERs are - in decreasing order - associated with tasks performed during maintenance, modification, testing and operation. However, for the latest year studied almost as many procedure-related LERs are associated with modification tasks as with the three other work types together. A further analysis indicates that 'deficient procedure content' is, by far, the dominating underlying cause contributing to procedure-related scrams and LERs. The study also discusses the coupling between procedure-related scrams/LERs, power operation and refuelling outages, and Common Cause Failures, CCF. An overall conclusion is that procedure-related events in the Swedish nuclear power plants do not, on a national scale, represent an alarming issue. Significant and sustained efforts have been and are made at most units to improve the quality of procedures. However, a few units exhibit a noticeable

  5. A study on the identification of cognitive complexity factors related to the complexity of procedural steps

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jeong, Kwang Sup; Jung, Won Dea [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    In complex systems, it is well recognized that the provision of understandable procedures that allow operators to clarify 'what needs to be done' and 'how to do it' is one of the requisites to confirm their safety. In this regard, the step complexity (SC) measure that can quantify the complexity of procedural steps in emergency operating procedures (EOPs) of a nuclear power plant (NPP) was suggested. However, the necessity of additional complexity factors that can consider a cognitive aspect in evaluating the complexity of procedural steps is evinced from the comparisons between SC scores and operators' performance data. To this end, the comparisons between operators' performance data with their behavior in conducting prescribed activities of procedural steps are conducted in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect operators' cognitive burden are identified. Although a well-designed experiment is indispensable in confirming the appropriateness of cognitive complexity factors, it is strongly believed that the change of an operator's performance can be more authentically explained if they are taken into consideration.

  6. A study on the identification of cognitive complexity factors related to the complexity of procedural steps

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jeong, Kwang Sup; Jung, Won Dea

    2004-01-01

    In complex systems, it is well recognized that the provision of understandable procedures that allow operators to clarify 'what needs to be done' and 'how to do it' is one of the requisites to confirm their safety. In this regard, the step complexity (SC) measure that can quantify the complexity of procedural steps in emergency operating procedures (EOPs) of a nuclear power plant (NPP) was suggested. However, the necessity of additional complexity factors that can consider a cognitive aspect in evaluating the complexity of procedural steps is evinced from the comparisons between SC scores and operators' performance data. To this end, the comparisons between operators' performance data with their behavior in conducting prescribed activities of procedural steps are conducted in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect operators' cognitive burden are identified. Although a well-designed experiment is indispensable in confirming the appropriateness of cognitive complexity factors, it is strongly believed that the change of an operator's performance can be more authentically explained if they are taken into consideration

  7. Building America House Performance Analysis Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Farrar-Nagy, S.; Anderson, R.; Judkoff, R.

    2001-10-29

    As the Building America Program has grown to include a large and diverse cross section of the home building industry, accurate and consistent analysis techniques have become more important to help all program partners as they perform design tradeoffs and calculate energy savings for prototype houses built as part of the program. This document illustrates some of the analysis concepts proven effective and reliable for analyzing the transient energy usage of advanced energy systems as well as entire houses. The analysis procedure described here provides a starting point for calculating energy savings of a prototype house relative to two base cases: builder standard practice and regional standard practice. It also describes building simulation analysis used to calculate annual energy savings based on side-by-side short-term field testing of a prototype house.

  8. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction: Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis. Mathematical Foundations for Factor Analysis: Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions. Composite Variables and Linear Transformations: Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi

  9. Fracture analysis procedure for cast austenitic stainless steel pipe with an axial crack

    International Nuclear Information System (INIS)

    Kamaya, Masayuki

    2012-01-01

    Since the ductility of cast austenitic stainless steel pipes decreases due to thermal aging embrittlement after long term operation, not only plastic collapse failure but also unstable ductile crack propagation (elastic-plastic failure) should be taken into account for the structural integrity assessment of cracked pipes. In the fitness-for-service code of the Japan Society of Mechanical Engineers (JSME), Z-factor is used to incorporate the reduction in failure load due to elastic-plastic failure. However, the JSME code does not provide the Z-factor for axial cracks. In this study, Z-factor for axial cracks in aged cast austenitic stainless steel pipes was derived. Then, a comparison was made for the elastic-plastic failure load obtained from different analysis procedures. It was shown that the obtained Z-factor could derive reasonable elastic-plastic failure loads, although the failure loads were more conservative than those obtained by the two-parameter method. (author)

  10. Procedure automation: the effect of automated procedure execution on situation awareness and human performance

    International Nuclear Information System (INIS)

    Andresen, Gisle; Svengren, Haakan; Heimdal, Jan O.; Nilsen, Svein; Hulsund, John-Einar; Bisio, Rossella; Debroise, Xavier

    2004-04-01

    As advised by the procedure workshop convened in Halden in 2000, the Halden Project conducted an experiment on the effect of automation of Computerised Procedure Systems (CPS) on situation awareness and human performance. The expected outcome of the study was to provide input for guidance on CPS design, and to support the Halden Project's ongoing research on human reliability analysis. The experiment was performed in HAMMLAB using the HAMBO BWR simulator and the COPMA-III CPS. Eight crews of operators from Forsmark 3 and Oskarshamn 3 participated. Three research questions were investigated: 1) Does procedure automation create Out-Of-The-Loop (OOTL) performance problems? 2) Does procedure automation affect situation awareness? 3) Does procedure automation affect crew performance? The independent variable, 'procedure configuration', had four levels: paper procedures, manual CPS, automation with breaks, and full automation. The results showed that the operators experienced OOTL problems in full automation, but that situation awareness and crew performance (response time) were not affected. One possible explanation for this is that the operators monitored the automated procedure execution conscientiously, something which may have prevented the OOTL problems from having negative effects on situation awareness and crew performance. In a debriefing session, the operators clearly expressed their dislike for the full automation condition, but said that automation with breaks could be suitable for some tasks. The main reason why the operators did not like the full automation was that they did not feel in control. A qualitative analysis addressing factors contributing to response time delays revealed that OOTL problems did not seem to cause delays, but that some delays could be explained by the operators having problems with the freeze function of the CPS. Other factors, such as teamwork and operator tendencies, were also of importance. Several design implications were drawn.

  11. Procedure for statistical analysis of one-parameter discrepant experimental data

    International Nuclear Information System (INIS)

    Badikov, Sergey A.; Chechev, Valery P.

    2012-01-01

    A new, Mandel–Paule-type procedure for statistical processing of one-parameter discrepant experimental data is described. The procedure enables one to estimate the contribution of unrecognized experimental errors to the total experimental uncertainty as well as to include it in the analysis. A definition of discrepant experimental data for an arbitrary number of measurements is introduced as an accompanying result. In the case of negligible unrecognized experimental errors, the procedure simply reduces to the calculation of the weighted average and its internal uncertainty. The procedure was applied to the statistical analysis of half-life experimental data. Mean half-lives for 20 actinides were calculated and the results were compared to the ENSDF and DDEP evaluations. On the whole, the calculated half-lives are consistent with the ENSDF and DDEP evaluations. However, the uncertainties calculated in this work substantially exceed those of the ENSDF and DDEP evaluations for discrepant experimental data. This effect can be explained by adequately taking into account unrecognized experimental errors. - Highlights: ► A new statistical procedure for processing one-parameter discrepant experimental data has been presented. ► The procedure estimates the contribution of unrecognized errors to the total experimental uncertainty. ► The procedure was applied to processing discrepant half-life experimental data. ► Results of the calculations are compared to the ENSDF and DDEP evaluations.
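A minimal sketch of a Mandel-Paule-type estimator consistent with the description above (the paper's exact algorithm may differ): each stated uncertainty is inflated by a common "unrecognized error" variance chosen so the data become statistically consistent, after which the ordinary weighted average is taken.

```python
import math

def mandel_paule(x, u, tol=1e-12, max_iter=200):
    """Weighted mean of possibly discrepant data, Mandel-Paule style.
    Each stated uncertainty u[i] is inflated by a common variance s2,
    chosen so the weighted sum of squared deviations equals n - 1
    (reduced chi-square of 1). Returns (mean, internal uncertainty, s2);
    for consistent data s2 = 0 and this is the plain weighted average."""
    n = len(x)

    def chi2(s2):
        w = [1.0 / (ui * ui + s2) for ui in u]
        xbar = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
        return sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))

    s2 = 0.0
    if chi2(0.0) > n - 1:  # data are discrepant: solve chi2(s2) = n - 1
        spread = max(x) - min(x)
        lo, hi = 0.0, n * (spread * spread + max(u) ** 2)  # chi2(hi) < n - 1
        for _ in range(max_iter):                          # bisection
            s2 = 0.5 * (lo + hi)
            if chi2(s2) > n - 1:
                lo = s2
            else:
                hi = s2
            if hi - lo < tol:
                break
    w = [1.0 / (ui * ui + s2) for ui in u]
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sum(w)
    return xbar, math.sqrt(1.0 / sum(w)), s2
```

Because s2 enters every weight, the internal uncertainty grows for discrepant data, which matches the abstract's observation that the calculated uncertainties exceed those of evaluations that ignore unrecognized errors.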

  12. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    Science.gov (United States)

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of

  13. Procedures for conducting common cause failure analysis in probabilistic safety assessment

    International Nuclear Information System (INIS)

    1992-05-01

    The principal objective of this report is to supplement the procedure developed in Mosleh et al. (1988, 1989) by providing more explicit guidance for a practical approach to common cause failures (CCF) analysis. The detailed CCF analysis following that procedure would be very labour intensive and time consuming. This document identifies a number of options for performing the more labour intensive parts of the analysis in an attempt to achieve a balance between the need for detail, the purpose of the analysis and the resources available. The document is intended to be compatible with the Agency's Procedures for Conducting Probabilistic Safety Assessments for Nuclear Power Plants (IAEA, 1992), but can be regarded as a stand-alone report to be used in conjunction with NUREG/CR-4780 (Mosleh et al., 1988, 1989) to provide additional detail, and discussion of key technical issues

  14. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    Directory of Open Access Journals (Sweden)

    James Baglin

    2014-06-01

    Full Text Available Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many guidelines have been proposed with the aim to improve application. Unfortunately, implementing recommended EFA practices has been restricted by the range of options available in commercial statistical packages and, perhaps, due to an absence of clear, practical 'how-to' demonstrations. Consequently, this article describes the application of methods recommended to get the most out of your EFA. The article focuses on dealing with the common situation of analysing ordinal data as derived from Likert-type scales. These methods are demonstrated using the free, stand-alone, easy-to-use and powerful EFA package FACTOR (http://psico.fcep.urv.es/utilitats/factor/; Lorenzo-Seva & Ferrando, 2006). The demonstration applies the recommended techniques using an accompanying dataset, based on the Big 5 personality test. The outcomes obtained by the EFA using the recommended procedures through FACTOR are compared to the default techniques currently available in SPSS.
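One of the practices commonly recommended for EFA and supported by FACTOR is parallel analysis for deciding how many factors to retain. Below is a minimal Pearson-based sketch; FACTOR itself also offers polychoric correlations for ordinal items, which this simplified version omits:

```python
import numpy as np

def parallel_analysis(data, n_sims=200, quantile=0.95, seed=0):
    """Horn's parallel analysis: retain factors whose observed
    correlation-matrix eigenvalues exceed the chosen quantile of
    eigenvalues obtained from random normal data of the same shape.
    Returns (number retained, observed eigenvalues, thresholds)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        rand = rng.standard_normal((n, p))
        sims[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False)))[::-1]
    thresh = np.quantile(sims, quantile, axis=0)
    return int(np.sum(obs > thresh)), obs, thresh
```

For ordinal Likert items, the recommended refinement is to run the same comparison on a polychoric rather than Pearson correlation matrix.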

  15. Effective Work Procedure design Using Discomfort and Effort Factor in Brick stacking operation-A case study

    Science.gov (United States)

    Rout, Biswaranjan; Dash, R. R.; Dhupal, D.

    2018-02-01

    In this work, the movement of the limbs and torso of a worker is carefully planned so as to reduce the worker's fatigue and energy expenditure. A simulation model is generated to suit the procedure and comply with the constraints in the workspace. Verifying the capability of human postures and movements in different working conditions is required to evaluate the effectiveness of the new design. This article introduces a simple human performance measure that enables a mathematical model for evaluating a cost function. The basic scheme is to evaluate performance in the form of several cost factors using AI techniques. The two main cost factors taken into consideration are the discomfort factor and the effort factor in limb movements. The discomfort factor measures the level of discomfort in moving a given limb from its most neutral position to its final position, and the effort factor is a measure of the displacement of the corresponding limbs from the original position. The basic aim is to optimize the movement of the limbs with respect to the above-mentioned cost functions. The procedure is tested on the working procedure of workers stacking fly ash bricks in a local fly ash brick manufacturing unit. The objective is to find the optimized movement of the limbs that reduces the workers' discomfort level and required effort. The effectiveness of the procedure in this case study is illustrated with the obtained results.
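The cost-function idea above can be sketched as a weighted sum of the two factors minimized over candidate movement plans. The weights, field names, and data representation are illustrative assumptions, not values from the study:

```python
def best_movement_plan(candidates, w_discomfort=0.6, w_effort=0.4):
    """Pick the candidate limb-movement plan minimizing a weighted sum of
    the discomfort and effort cost factors. Each candidate is a dict with
    hypothetical 'discomfort' and 'effort' scores; the weights are
    illustrative, not taken from the paper."""
    def cost(c):
        return w_discomfort * c["discomfort"] + w_effort * c["effort"]
    return min(candidates, key=cost)
```

In practice the candidate set would come from the simulation model, with discomfort scored by deviation from the neutral posture and effort by limb displacement, as described above.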

  16. 75 FR 72739 - Compliance Testing Procedures: Correction Factor for Room Air Conditioners

    Science.gov (United States)

    2010-11-26

    ...: Correction Factor for Room Air Conditioners AGENCY: Office of the General Counsel, Department of Energy (DOE... air conditioners. The petition seeks temporary enforcement forbearance, or in the alternative, a... procedures for room air conditioners. Public comment is requested on whether DOE should grant the petition...

  17. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

    Interactions of human error factors in NPP operation events were investigated. Of 645 WANO operation event reports from 1999 to 2008 that were analyzed, 432 were found to be related to human errors. After classifying these errors by their root causes or causal factors and applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors; forming good personnel work practices is systematic work which needs support in many respects. (2) Verbal communications, personnel work practices, man-machine interface, and written procedures and documents play great roles. These are four interacting factors which often come in a bundle; if improvements need to be made to one of them, synchronous measures are also necessary for the others. (3) Management direction and decision processes, which are related to management, have a significant interaction with personnel factors. (authors)
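For dichotomous cause-factor indicators of this kind (each event coded 1 or 0 for each causal factor), the Pearson correlation a package such as SPSS reports reduces to the phi coefficient. A minimal sketch:

```python
import math

def phi(a, b):
    """Phi coefficient between two binary indicator vectors
    (1 = the causal factor was coded for that event), the 2x2
    contingency-table analogue of Pearson's r."""
    n11 = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    n10 = sum(1 for x, y in zip(a, b) if x == 1 and y == 0)
    n01 = sum(1 for x, y in zip(a, b) if x == 0 and y == 1)
    n00 = sum(1 for x, y in zip(a, b) if x == 0 and y == 0)
    denom = math.sqrt((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0
```

Computed pairwise over all coded factors, such coefficients reveal which causes tend to "come in a bundle" in the event reports.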

  18. Pragmatic evaluation of the Toyota Production System (TPS) analysis procedure for problem solving with entry-level nurses

    Directory of Open Access Journals (Sweden)

    Lukasz Maciej Mazur

    2008-12-01

    Full Text Available Medication errors occurring in hospitals are a growing national concern. These medication errors and their related costs (or wastes) are seen as major factors leading to increased patient safety risks and increased waste in the hospital setting. This article presents a study in which sixteen entry-level nurses utilized a Toyota Production System (TPS) analysis procedure to solve medication delivery problems at one community hospital. The objective of this research was to study and evaluate the TPS analysis procedure for problem solving with entry-level nurses. Personal journals, focus group discussions, and a survey study were used to collect data about entry-level nurses' perceptions of using the TPS problem-solving approach to study medication delivery. A regression analysis was used to identify characteristics that enhance problem-solving efforts. In addition, propositions for effective problem solving by entry-level nurses to aid in the reduction of medication errors in healthcare delivery settings are offered.

  19. ORNL: PWR-BDHT analysis procedure, a preliminary overview

    International Nuclear Information System (INIS)

    Cliff, S.B.

    1978-01-01

    The computer programs currently used in the analysis of the ORNL-PWR Blowdown Heat Transfer Separate-Effects Program are overviewed. The current linkages and relationships among the programs are given along with general comments about the future directions of some of these programs. The overview is strictly from the computer science point of view with only minimal information concerning the engineering aspects of the analysis procedure

  20. Analysis of Factors Associated With Rhytidectomy Malpractice Litigation Cases.

    Science.gov (United States)

    Kandinov, Aron; Mutchnick, Sean; Nangia, Vaibhuv; Svider, Peter F; Zuliani, Giancarlo F; Shkoukani, Mahdi A; Carron, Michael A

    2017-07-01

    This study investigates the financial burden of medical malpractice litigation associated with rhytidectomies, as well as factors that contribute to litigation and poor defendant outcomes, which can help guide physician practices. To comprehensively evaluate rhytidectomy malpractice litigation. Jury verdict and settlement reports related to rhytidectomy malpractice litigation were obtained using the Westlaw Next database. Searching for medical malpractice in conjunction with several terms for rhytidectomy, to account for the various names associated with the procedure, yielded 155 court cases. Duplicate and nonrelevant cases were removed, and 89 cases were included in the analysis and reviewed for outcomes, defendant specialty, payments, and other allegations raised in proceedings. Data were collected from November 21, 2015, to December 25, 2015. Data analysis took place from December 25, 2015, to January 20, 2016. A total of 89 cases met our inclusion criteria. Most plaintiffs were female (81 of 88 with known sex [92%]), and patient age ranged from 40 to 76 years (median age, 56 years). Fifty-three (60%) were resolved in the defendant's favor, while the remaining 36 cases (40%) were resolved with either a settlement or a plaintiff verdict payment. The mean payment was $1.4 million. A greater proportion of cases involving plastic surgeon defendants were resolved with payment compared with cases involving defendants with ear, nose, and throat specialty (15 [36%] vs 4 [24%]). The most common allegations raised in litigation were intraoperative negligence (61 [69%]), poor cosmesis or disfigurement (57 [64%]), inadequate informed consent (30 [34%]), additional procedures required (14 [16%]), postoperative negligence (12 [14%]), and facial nerve injury (10 [11%]). Six cases (7%) involved alleged negligence surrounding a "lifestyle-lift" procedure, which tightens or oversews the superficial muscular aponeurosis system layer. In this study, although most cases of

  1. Sample preparation procedure for PIXE elemental analysis on soft tissues

    International Nuclear Information System (INIS)

    Kubica, B.; Kwiatek, W.M.; Dutkiewicz, E.M.; Lekka, M.

    1997-01-01

    Trace element analysis is one of the most important fields in analytical chemistry. There are several instrumental techniques applied for the determination of microscopic elemental content. The PIXE (Proton Induced X-ray Emission) technique is one of the nuclear techniques commonly applied for this purpose due to its multielemental analysis capability. The aim of this study was to establish the optimal conditions for the target preparation procedure. In this paper two different approaches to the topic are presented and widely discussed: the first is the traditional pellet technique, and the second is a mineralization procedure. Soft tissue such as liver was used for the analysis. Some results on water samples are also presented. (author)

  2. Quantitative analysis of crystalline pharmaceuticals in tablets by pattern-fitting procedure using X-ray diffraction pattern.

    Science.gov (United States)

    Takehira, Rieko; Momose, Yasunori; Yamamura, Shigeo

    2010-10-15

    A pattern-fitting procedure using an X-ray diffraction pattern was applied to the quantitative analysis of a binary system of crystalline pharmaceuticals in tablets. Orthorhombic crystals of isoniazid (INH) and mannitol (MAN) were used for the analysis. Tablets were prepared under various compression pressures using a direct compression method with various compositions of INH and MAN. Assuming that the X-ray diffraction pattern of the INH-MAN system consists of diffraction intensities from the respective crystals, the observed diffraction intensities were fitted to an analytic expression based on X-ray diffraction theory and separated into two intensities from the INH and MAN crystals by a nonlinear least-squares procedure. After separation, the INH content was determined using the optimized normalization constants for INH and MAN. A correction parameter including all the factors that are beyond experimental control was required for quantitative analysis without a calibration curve. The pattern-fitting procedure made it possible to determine crystalline phases over the range of 10-90% (w/w) INH content. Further, certain characteristics of the crystals in the tablets, such as preferred orientation, crystallite size, and lattice disorder, were determined simultaneously. This method can be adopted to analyze compounds whose crystal structures are known. It is a potentially powerful tool for the quantitative phase analysis and characterization of crystals in tablets and powders using X-ray diffraction patterns. Copyright 2010 Elsevier B.V. All rights reserved.
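The core separation step can be illustrated with a simplified sketch: if reference patterns for the two pure phases are known, the observed mixture pattern is a weighted sum of them, and the two normalization constants can be recovered by least squares. This is a hypothetical illustration (the peak positions, heights, and noise level are invented, and the full procedure in the abstract also refines peak shapes and lattice parameters nonlinearly):

```python
import numpy as np

# Simplified stand-in for the pattern-fitting step: solve for the two
# normalization constants of a two-phase mixture pattern by least squares.
two_theta = np.linspace(5, 50, 500)

def peak(center, height, width=0.3):
    # Gaussian stand-in for a diffraction peak profile
    return height * np.exp(-0.5 * ((two_theta - center) / width) ** 2)

# Invented pure-phase reference patterns (placeholders for INH and MAN)
pattern_inh = peak(12.0, 100) + peak(25.5, 60) + peak(38.0, 30)
pattern_man = peak(10.2, 80) + peak(21.0, 90) + peak(29.4, 40)

# Observed mixture: 70% phase 1 + 30% phase 2, plus measurement noise
rng = np.random.default_rng(0)
observed = 0.7 * pattern_inh + 0.3 * pattern_man \
    + rng.normal(0, 0.5, two_theta.size)

# Least-squares solution for the two normalization constants
A = np.column_stack([pattern_inh, pattern_man])
(c_inh, c_man), *_ = np.linalg.lstsq(A, observed, rcond=None)

w_inh = c_inh / (c_inh + c_man)  # model weight fraction of phase 1
print(f"estimated fraction of phase 1: {w_inh:.3f}")
```

The real procedure fits peak-shape and orientation parameters as well, which is why a nonlinear least-squares solver is needed there; the linear step above only shows how the phase contents fall out of the optimized normalization constants.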

  3. CONSIDERATIONS FOR THE TREATMENT OF COMPUTERIZED PROCEDURES IN HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-07-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  4. Procedure for the analysis of americium in complex matrices

    International Nuclear Information System (INIS)

    Knab, D.

    1978-02-01

    A radioanalytical procedure for the analysis of americium in complex matrices has been developed. Clean separations of americium can be obtained from up to 100 g of sample ash, regardless of the starting material. The ability to analyze large masses of material provides the increased sensitivity necessary to detect americium in many environmental samples. The procedure adequately decontaminates from rare earth elements and natural radioactive nuclides that interfere with the alpha spectrometric measurements

  5. Risk factors for surgical site infection following nonshunt pediatric neurosurgery: a review of 9296 procedures from a national database and comparison with a single-center experience.

    Science.gov (United States)

    Sherrod, Brandon A; Arynchyna, Anastasia A; Johnston, James M; Rozzelle, Curtis J; Blount, Jeffrey P; Oakes, W Jerry; Rocque, Brandon G

    2017-04-01

    OBJECTIVE Surgical site infection (SSI) following CSF shunt operations has been well studied, yet risk factors for nonshunt pediatric neurosurgery are less well understood. The purpose of this study was to determine SSI rates and risk factors following nonshunt pediatric neurosurgery using a nationwide patient cohort and an institutional data set specifically for better understanding SSI. METHODS The authors reviewed the American College of Surgeons National Surgical Quality Improvement Program-Pediatric (ACS NSQIP-P) database for the years 2012-2014, including all neurosurgical procedures performed on pediatric patients except CSF shunts and hematoma evacuations. SSI included deep (intracranial abscesses, meningitis, osteomyelitis, and ventriculitis) and superficial wound infections. The authors performed univariate analyses of SSI association with procedure, demographic, comorbidity, operative, and hospital variables, with subsequent multivariate logistic regression analysis to determine independent risk factors for SSI within 30 days of the index procedure. A similar analysis was performed using a detailed institutional infection database from Children's of Alabama (COA). RESULTS A total of 9296 nonshunt procedures were identified in NSQIP-P with an overall 30-day SSI rate of 2.7%. The 30-day SSI rate in the COA institutional database was similar (3.3% of 1103 procedures, p = 0.325). Postoperative time to SSI in NSQIP-P and COA was 14.6 ± 6.8 days and 14.8 ± 7.3 days, respectively (mean ± SD). Myelomeningocele (4.3% in NSQIP-P, 6.3% in COA), spine (3.5%, 4.9%), and epilepsy (3.4%, 3.1%) procedure categories had the highest SSI rates by procedure category in both NSQIP-P and COA. Independent SSI risk factors in NSQIP-P included postoperative pneumonia (OR 4.761, 95% CI 1.269-17.857, p = 0.021), immune disease/immunosuppressant use (OR 3.671, 95% CI 1.371-9.827, p = 0.010), cerebral palsy (OR 2.835, 95% CI 1.463-5.494, p = 0.002), emergency operation (OR 1
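The univariate-then-multivariate workflow described above rests on logistic regression, in which each reported odds ratio is the exponential of a fitted coefficient. A minimal sketch on synthetic data (the variable names mirror two of the abstract's risk factors, but the data and effect sizes are invented, not the NSQIP-P cohort):

```python
import numpy as np

# Fit a multivariate logistic regression by Newton-Raphson and report
# odds ratios via exp(beta), as in the abstract's risk-factor analysis.
rng = np.random.default_rng(42)
n = 5000
pneumonia = rng.binomial(1, 0.05, n)   # invented binary risk factor 1
immune = rng.binomial(1, 0.10, n)      # invented binary risk factor 2
X = np.column_stack([np.ones(n), pneumonia, immune])

true_beta = np.array([-3.5, 1.5, 1.3])       # intercept, log-odds ratios
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)                       # simulated SSI outcomes

beta = np.zeros(3)
for _ in range(25):                          # Newton-Raphson iterations
    mu = 1 / (1 + np.exp(-X @ beta))
    W = mu * (1 - mu)
    grad = X.T @ (y - mu)
    hess = X.T @ (X * W[:, None])
    beta += np.linalg.solve(hess, grad)

odds_ratios = np.exp(beta[1:])
print("OR (pneumonia, immune):", np.round(odds_ratios, 2))
```

With enough events per factor, the estimated odds ratios land near the true values exp(1.5) ≈ 4.5 and exp(1.3) ≈ 3.7, which is the sense in which the abstract's ORs summarize independent effects.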

  6. Incidence and Risk Factors for Major Hematomas in Aesthetic Surgery: Analysis of 129,007 Patients.

    Science.gov (United States)

    Kaoutzanis, Christodoulos; Winocour, Julian; Gupta, Varun; Ganesh Kumar, Nishant; Sarosiek, Konrad; Wormer, Blair; Tokin, Christopher; Grotting, James C; Higdon, K Kye

    2017-10-16

    Postoperative hematomas are one of the most frequent complications following aesthetic surgery. Identifying risk factors for hematoma has been limited by underpowered studies from single institution experiences. To examine the incidence and identify independent risk factors for postoperative hematomas following cosmetic surgery utilizing a prospective, multicenter database. A prospectively enrolled cohort of patients who underwent aesthetic surgery between 2008 and 2013 was identified from the CosmetAssure database. Primary outcome was occurrence of major hematomas requiring emergency room visit, hospital admission, or reoperation within 30 days of the index operation. Univariate and multivariate analysis was used to identify potential risk factors for hematomas including age, gender, body mass index (BMI), smoking, diabetes, type of surgical facility, procedure by body region, and combined procedures. Of 129,007 patients, 1180 (0.91%) had a major hematoma. Mean age (42.0 ± 13.0 years vs 40.9 ± 13.9 years, P hematomas. Males suffered more hematomas than females (1.4% vs 0.9%, P Hematoma rates were higher in patients undergoing combined procedures compared to single procedures (1.1% vs 0.8%, P hematoma included age (Relative Risk [RR] 1.01), male gender (RR 1.98), the procedure being performed in a hospital setting rather than an office-based setting (RR 1.68), combined procedures (RR 1.35), and breast procedures rather than the body/extremity and face procedures (RR 1.81). Major hematoma is the most common complication following aesthetic surgery. Male patients and those undergoing breast or combined procedures have a significantly higher risk of developing hematomas. 2. © 2017 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com

  7. First Outbreak with MRSA in a Danish Neonatal Intensive Care Unit: Risk Factors and Control Procedures

    Science.gov (United States)

    Ramsing, Benedicte Grenness Utke; Arpi, Magnus; Andersen, Erik Arthur; Knabe, Niels; Mogensen, Dorthe; Buhl, Dorte; Westh, Henrik; Østergaard, Christian

    2013-01-01

    Introduction The purpose of the study was to describe demographic and clinical characteristics and outbreak handling of a large methicillin-resistant Staphylococcus aureus (MRSA) outbreak in a neonatal intensive care unit (NICU) in Denmark June 25th–August 8th 2008, and to identify risk factors for MRSA transmission. Methods Data were collected retrospectively from medical records and the Danish Neobase database. All MRSA isolates obtained from neonates, relatives and NICU health care workers (HCW) as well as environmental cultures were typed. Results During the 46 day outbreak period, 102 neonates were admitted to the two neonatal wards. Ninety-nine neonates were subsequently sampled, and 32 neonates (32%) from 25 families were colonized with MRSA (spa-type t127, SCCmec V, PVL negative). Thirteen family members from 11 of those families (44%) and two of 161 HCWs (1%) were colonized with the same MRSA. No one was infected. Five environmental cultures were MRSA positive. In a multiple logistic regression analysis, nasal Continuous Positive Airway Pressure (nCPAP) treatment (p = 0.006) and Caesarean section (p = 0.016) were independent risk factors for MRSA acquisition, whereas days of exposure to MRSA was a risk factor in the unadjusted analysis (p = 0.04). Conclusions MRSA transmission occurs with high frequency in the NICU during hospitalization with unidentified MRSA neonates. Caesarean section and nCPAP treatment were identified as risk factors for MRSA colonization. The MRSA outbreak was controlled through infection control procedures. PMID:23825581

  8. First outbreak with MRSA in a Danish neonatal intensive care unit: risk factors and control procedures.

    Directory of Open Access Journals (Sweden)

    Benedicte Grenness Utke Ramsing

    Full Text Available INTRODUCTION: The purpose of the study was to describe demographic and clinical characteristics and outbreak handling of a large methicillin-resistant Staphylococcus aureus (MRSA) outbreak in a neonatal intensive care unit (NICU) in Denmark June 25th–August 8th 2008, and to identify risk factors for MRSA transmission. METHODS: Data were collected retrospectively from medical records and the Danish Neobase database. All MRSA isolates obtained from neonates, relatives and NICU health care workers (HCW) as well as environmental cultures were typed. RESULTS: During the 46 day outbreak period, 102 neonates were admitted to the two neonatal wards. Ninety-nine neonates were subsequently sampled, and 32 neonates (32%) from 25 families were colonized with MRSA (spa-type t127, SCCmec V, PVL negative). Thirteen family members from 11 of those families (44%) and two of 161 HCWs (1%) were colonized with the same MRSA. No one was infected. Five environmental cultures were MRSA positive. In a multiple logistic regression analysis, nasal Continuous Positive Airway Pressure (nCPAP) treatment (p = 0.006) and Caesarean section (p = 0.016) were independent risk factors for MRSA acquisition, whereas days of exposure to MRSA was a risk factor in the unadjusted analysis (p = 0.04). CONCLUSIONS: MRSA transmission occurs with high frequency in the NICU during hospitalization with unidentified MRSA neonates. Caesarean section and nCPAP treatment were identified as risk factors for MRSA colonization. The MRSA outbreak was controlled through infection control procedures.

  9. Development of a draft of human factors safety review procedures for the Korean Next Generation Reactor

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Moon, B. S.; Park, J. C.; Lee, Y. H.; Oh, I. S.; Lee, H. C.

    2000-02-01

    In this study, a draft of Human Factors Engineering (HFE) Safety Review Procedures (SRP) was developed for the safety review of KNGR based on HFE Safety and Regulatory Requirements and Guidelines (SRRG). This draft includes acceptance criteria, review procedure, and evaluation findings for the areas of review including HFE program management, human factors analyses, human factors design, and HFE verification and validation, based on section 15.1 'human factors engineering design process' and 15.2 'control room human factors engineering' of KNGR specific safety requirements and chapter 15 'human factors engineering' of KNGR safety regulatory guides. For the effective review, human factors concerns or issues related to advanced HSI design that have been reported so far should be extensively examined. In this study, a total of 384 human factors issues related to the advanced HSI design were collected through our review of a total of 145 documents. A summary of each issue was described and the issues were identified by specific features of HSI design. These results were implemented into a database system

  10. Procedure for measurement of anisotropy factor for neutron sources; Procedimentos para medição do fator de anisotropia de fontes de nêutrons

    Energy Technology Data Exchange (ETDEWEB)

    Creazolla, Prycylla Gomes

    2017-07-01

    Radioisotope neutron sources allow the production of reference fields for the calibration of neutron detectors for radiation protection and analysis purposes. When the emission rate of these sources is isotropic, no correction is necessary. However, variations in the source encapsulation and in the radioactive material concentration produce differences in the neutron emission rate relative to the source axis; this effect is called anisotropy. This study describes a procedure for measuring the anisotropy factor of neutron sources, performed in the Laboratório de Metrologia de Neutrons (LN) using a Precision Long Counter (PLC) detector. A measurement procedure that takes into account the anisotropy factor of neutron sources helps to resolve some issues, particularly the high uncertainties associated with neutron dosimetry. A bibliographical review was carried out based on international standards and technical regulations specific to the area of neutron fields, and the findings were later reproduced in practice by means of the LN procedure for measuring the anisotropy factor of neutron sources. The anisotropy factor is determined as a function of the angle of 90° relative to the cylindrical axis of the source. This angle is the most important because of its frequent use in measurements and its higher neutron emission rate compared with other angles. (author)
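As a rough numeric illustration of the concept (the angles and count rates below are invented, and real procedures follow ISO 8529-type definitions with additional corrections), the anisotropy factor at 90° can be taken as the measured emission rate at that angle divided by the solid-angle-weighted mean rate over all angles:

```python
import numpy as np

# Invented long-counter readings (counts/s at fixed distance, background
# corrected) at several angles to the cylindrical source axis.
angles_deg = np.array([0, 45, 90, 135, 180])
rates = np.array([95.0, 98.5, 103.0, 99.0, 94.0])

def trap(y, x):
    # Simple trapezoidal integration (kept explicit for portability)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

# Solid-angle-weighted mean rate: each angle theta weights a ring of
# solid angle proportional to sin(theta).
theta = np.radians(angles_deg)
mean_rate = trap(rates * np.sin(theta), theta) / trap(np.sin(theta), theta)

anisotropy_90 = rates[angles_deg == 90][0] / mean_rate
print(f"anisotropy factor at 90 deg: {anisotropy_90:.3f}")
```

A factor above 1 at 90° matches the abstract's observation that the emission rate perpendicular to the cylinder axis is higher than at other angles.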

  11. Factors influencing changes in levels of radiation doses received by patients during gastroduodenal series procedures in the Hospital Dr. Max Peralta de Cartago

    International Nuclear Information System (INIS)

    Guzman Campos, Jeremy; Vargas Navarro, Jonnathan

    2009-01-01

    A measurement was made of the radiation doses emitted by fluoroscopy equipment used in Hospital Dr. Max Peralta, specifically at the Centro de Deteccion de Cancer Gastrico. The analysis included the factors that could be influencing an increase in the total dose to the patient, by means of indicators that directly affect unnecessary dose increases, such as: the procedure, sequences of images, indicators of dosage levels, varying conditions of actual studies, variations in dose levels, and production process factors. [es

  12. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness.

    Science.gov (United States)

    Graneheim, U H; Lundman, B

    2004-02-01

    Qualitative content analysis as described in published literature shows conflicting opinions and unsolved issues regarding meaning and use of concepts, procedures and interpretation. This paper provides an overview of important concepts (manifest and latent content, unit of analysis, meaning unit, condensation, abstraction, content area, code, category and theme) related to qualitative content analysis; illustrates the use of concepts related to the research procedure; and proposes measures to achieve trustworthiness (credibility, dependability and transferability) throughout the steps of the research procedure. Interpretation in qualitative content analysis is discussed in light of Watzlawick et al.'s [Pragmatics of Human Communication. A Study of Interactional Patterns, Pathologies and Paradoxes. W.W. Norton & Company, New York, London] theory of communication.

  13. Method and procedure of fatigue analysis for nuclear equipment

    International Nuclear Information System (INIS)

    Wen Jing; Fang Yonggang; Lu Yan; Zhang Yue; Sun Zaozhan; Zou Mingzhong

    2014-01-01

    As an example, the fatigue analysis for the upper head of the pressurizer in one NPP was carried out using ANSYS, a finite element analysis program. According to the RCC-M code, only two kinds of typical transients, of temperature and pressure, were considered in the fatigue analysis. Meanwhile, the influence of earthquake loading was taken into account. The method and procedure of fatigue analysis for nuclear safety equipment are described in detail. This paper provides a reference for the fatigue analysis and assessment of nuclear safety grade equipment and piping. (authors)
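The end product of an RCC-M (or ASME III) fatigue analysis is a cumulative usage factor combined over transient types by Miner's rule, U = Σ nᵢ/Nᵢ, where nᵢ is the specified number of occurrences of transient i and Nᵢ the allowable cycles from the design fatigue curve at that transient's alternating stress. A minimal sketch with invented numbers (not the pressurizer study's data):

```python
# Cumulative usage factor by Miner's rule, as used in code fatigue
# evaluation. Occurrence counts and allowable cycles are invented.
transients = {
    # name: (specified occurrences n_i, allowable cycles N_i from the
    #        design fatigue curve at the transient's alternating stress)
    "heatup/cooldown": (200, 12000),
    "pressure test": (40, 2500),
}

usage = sum(n / N for n, N in transients.values())
print(f"cumulative usage factor U = {usage:.4f}")
assert usage < 1.0, "design acceptance requires U <= 1.0"
```

The acceptance criterion in such codes is that the combined usage factor remain at or below 1.0 over the plant's design life.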

  14. Landslide susceptibility assessment in the Upper Orcia Valley (Southern Tuscany, Italy) through conditional analysis: a contribution to the unbiased selection of causal factors

    Directory of Open Access Journals (Sweden)

    F. Vergari

    2011-05-01

    Full Text Available In this work conditional multivariate analysis was applied to evaluate landslide susceptibility in the Upper Orcia River Basin (Tuscany, Italy), where widespread denudation processes and agricultural practices have a mutual impact. We introduced an unbiased procedure for causal factor selection based on some intuitive statistical indices. This procedure is aimed at detecting, among different potential factors, the most discriminant ones in a given study area. Moreover, this step avoids generating too-small and statistically insignificant spatial units when intersecting the factor maps. Finally, a validation procedure was applied based on the partition of the landslide inventory derived from multi-temporal aerial photo interpretation.

    Although encompassing some sources of uncertainty, the applied susceptibility assessment method provided a satisfactory and unbiased prediction for the Upper Orcia Valley. The results confirmed the efficiency of the selection procedure as an unbiased step of the landslide susceptibility evaluation. Furthermore, we achieved the purpose of presenting a conceptually simple yet effective statistical procedure for susceptibility analysis that can also be used by decision makers in land management.
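The conditional-analysis idea can be sketched compactly: cross the causal-factor maps into unique condition units, then score each unit by the conditional probability of landslide presence given that unit. The tiny synthetic raster below is purely illustrative (two invented factor maps, not the Orcia Basin data):

```python
import numpy as np

# Conditional analysis on a synthetic raster: susceptibility of each
# unique-condition unit = P(landslide | unit), estimated by frequency.
rng = np.random.default_rng(1)
lithology = rng.integers(0, 3, size=1000)   # factor map 1 (3 classes)
slope_cls = rng.integers(0, 2, size=1000)   # factor map 2 (2 classes)
# Simulated inventory: steeper class has a higher landslide probability
landslide = rng.binomial(1, 0.1 + 0.2 * slope_cls)

unit = lithology * 2 + slope_cls            # unique-condition unit id
susceptibility = {}
for u in np.unique(unit):
    mask = unit == u
    susceptibility[int(u)] = landslide[mask].mean()  # P(landslide | unit)

for u, p in sorted(susceptibility.items()):
    print(f"unit {u}: P(landslide) = {p:.2f}")
```

The factor-selection step the abstract describes amounts to keeping only those maps whose classes produce clearly different conditional probabilities, so that the intersected units stay large enough to be statistically meaningful.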

  15. Procedure for conducting a human-reliability analysis for nuclear power plants. Final report

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1983-05-01

    This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study

  16. Confidence ellipses: A variation based on parametric bootstrapping applicable on Multiple Factor Analysis results for rapid graphical evaluation

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender L. P.

    2012-01-01

    A new way of parametric bootstrapping allows similar construction of confidence ellipses applicable on all results from Multiple Factor Analysis obtained from the FactoMineR package in the statistical program R. With this procedure, a similar approach will be applied to Multiple Factor Analysis r...... in different studies performed on the same set of products. In addition, the graphical display of confidence ellipses eases interpretation and communication of results....
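The idea of a parametric bootstrap confidence ellipse can be illustrated outside R: fit a normal model to a product's scores on the first two dimensions, resample virtual panels from that model, and take the 95% ellipse of the resampled means. The scores below are synthetic stand-ins, not FactoMineR MFA output:

```python
import numpy as np

# Parametric bootstrap of a product's mean position in a 2-D score space.
rng = np.random.default_rng(7)
scores = rng.normal([1.0, -0.5], [0.6, 0.4], size=(40, 2))  # 40 assessors

n, n_boot = scores.shape[0], 2000
mu, cov = scores.mean(axis=0), np.cov(scores.T)

# Resample whole panels from the fitted normal model; keep each mean
boot_means = np.array([
    rng.multivariate_normal(mu, cov, size=n).mean(axis=0)
    for _ in range(n_boot)
])

# 95% ellipse from the bootstrap distribution of the mean; evecs give
# the ellipse orientation, semi_axes its half-widths.
center = boot_means.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(boot_means.T))
semi_axes = np.sqrt(evals * 5.991)   # chi-square(2, 0.95) quantile
print("center:", np.round(center, 2), "semi-axes:", np.round(semi_axes, 3))
```

Plotting such ellipses for every product gives the rapid graphical check the abstract describes: non-overlapping ellipses suggest products that the panel separates reliably.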

  17. Risk factors for surgical site infection following nonshunt pediatric neurosurgery: a review of 9296 procedures from a national database and comparison with a single-center experience

    Science.gov (United States)

    Sherrod, Brandon A.; Arynchyna, Anastasia A.; Johnston, James M.; Rozzelle, Curtis J.; Blount, Jeffrey P.; Oakes, W. Jerry; Rocque, Brandon G.

    2017-01-01

    Objective Surgical site infection (SSI) following CSF shunt operations has been well studied, yet risk factors for nonshunt pediatric neurosurgery are less well understood. The purpose of this study was to determine SSI rates and risk factors following nonshunt pediatric neurosurgery using a nationwide patient cohort and an institutional dataset specifically for better understanding SSI. Methods The authors reviewed the American College of Surgeons National Surgical Quality Improvement Program Pediatric (ACS NSQIP-P) database for the years 2012–2014, including all neurosurgical procedures performed on pediatric patients except CSF shunts and hematoma evacuations. SSI included deep (intracranial abscesses, meningitis, osteomyelitis, and ventriculitis) and superficial wound infections. The authors performed univariate analyses of SSI association with procedure, demographic, comorbidity, operative, and hospital variables, with subsequent multivariate logistic regression analysis to determine independent risk factors for SSI within 30 days of the index procedure. A similar analysis was performed using a detailed institutional infection database from Children’s Hospital of Alabama (COA). Results A total of 9296 nonshunt procedures were identified in NSQIP-P with an overall 30-day SSI rate of 2.7%. The 30-day SSI rate in the COA institutional database was similar (3.3% of 1103 procedures, p = 0.325). Postoperative time to SSI in NSQIP-P and COA was 14.6 ± 6.8 days and 14.8 ± 7.3 days, respectively (mean ± SD). Myelomeningocele (4.3% in NSQIP-P, 6.3% in COA), spine (3.5%, 4.9%), and epilepsy (3.4%, 3.1%) procedure categories had the highest SSI rates by procedure category in both NSQIP-P and COA. Independent SSI risk factors in NSQIP-P included postoperative pneumonia (OR 4.761, 95% CI 1.269–17.857, p = 0.021), immune disease/immunosuppressant use (OR 3.671, 95% CI 1.371–9.827, p = 0.010), cerebral palsy (OR 2.835, 95% CI 1.463–5.494, p = 0.002), emergency

  18. Development of a draft of human factors safety review procedures for the Korean next generation reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Moon, B. S.; Park, J. C.; Lee, Y. H.; Oh, I. S.; Lee, H. C. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    In this study, a draft of human factors engineering (HFE) safety review procedures (SRP) was developed for the safety review of KNGR based on HFE Safety and Regulatory Requirements and Guidelines (SRRG). This draft includes acceptance criteria, review procedure, and evaluation findings for the areas of review including HFE Program Management, Human Factors Analyses, Human Factors Design, and HFE Verification and Validation, based on Section 15.1 'Human Factors Engineering Design Process' and 15.2 'Control Room Human Factors Engineering' of KNGR Specific Safety Requirements and Chapter 15 'Human Factors Engineering' of KNGR Safety Regulatory Guides. For the effective review, human factors concerns or issues related to advanced HSI design that have been reported so far should be extensively examined. In this study, a total of 384 human factors issues related to the advanced HSI design were collected through our review of a total of 145 documents. A summary of each issue was described and the issues were identified by specific features of HSI design. These results were implemented into a database system. 8 refs., 2 figs. (Author)

  19. WHY DO SOME NATIONS SUCCEED AND OTHERS FAIL IN INTERNATIONAL COMPETITION? FACTOR ANALYSIS AND CLUSTER ANALYSIS AT EUROPEAN LEVEL

    Directory of Open Access Journals (Sweden)

    Popa Ion

    2015-07-01

    Full Text Available As stated by Michael Porter (1998: 57), 'this is perhaps the most frequently asked economic question of our times.' However, a widely accepted answer is still missing. The aim of this paper is not to provide the BIG answer to such a BIG question, but rather to provide a different perspective on competitiveness at the national level. In this respect, we followed a two-step procedure called "tandem analysis" (OECD, 2008). First we employed a Factor Analysis to reveal the underlying factors of the initial dataset, followed by a Cluster Analysis aimed at classifying the 35 countries according to the main characteristics of competitiveness resulting from the Factor Analysis. The findings revealed that clustering the 35 states on the first two factors, Smart Growth and Market Development, which recover almost 76% of the common variability of the twelve original variables, highlights four clusters, along with a series of useful information for analyzing the characteristics of the four clusters and discussing them.
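The two-step "tandem analysis" workflow can be sketched end to end: extract a small number of factors from standardized indicators, then cluster the countries on their factor scores. This hedged sketch uses PCA as a common stand-in for the factor extraction and a bare-bones k-means; the 35 "countries" and 12 "indicators" are synthetic, not the paper's dataset:

```python
import numpy as np

# Step 0: synthetic data with 4 latent country groups
rng = np.random.default_rng(3)
group = np.repeat(np.arange(4), [9, 9, 9, 8])        # 35 countries
centers = rng.normal(0, 2, size=(4, 12))
X = centers[group] + rng.normal(0, 0.5, size=(35, 12))

# Step 1: factor extraction (PCA via SVD on standardized indicators)
Z = (X - X.mean(0)) / X.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T                                # first two factors
explained = (s[:2] ** 2).sum() / (s ** 2).sum()

# Step 2: cluster countries on the factor scores (plain k-means)
def kmeans(data, k, iters=50):
    cent = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((data[:, None] - cent) ** 2).sum(-1), axis=1)
        cent = np.array([data[lab == j].mean(0) if np.any(lab == j)
                         else cent[j] for j in range(k)])
    return lab

labels = kmeans(scores, 4)
print(f"variance recovered by 2 factors: {explained:.0%}")
print("cluster sizes:", np.bincount(labels, minlength=4))
```

The `explained` value plays the role of the paper's "almost 76% of common variability"; in practice a rotated factor solution would be used in step 1, but the tandem structure is the same.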

  20. Thermal fatigue in mixing tees: A step by step simplified procedure

    International Nuclear Information System (INIS)

    Faidy, Claude

    2003-01-01

    Following the CIVAUX 1 incident of a leak on the RHR system, EDF has developed a step-by-step procedure to screen and analyse similar locations: mixing tees with long durations at high ΔT between the two fluids. The paper presents the procedure, the background of the methodology, and some of the R and D work that supports it. The procedure is based on: screening criteria on maximum ΔT and minimum duration; screening criteria without any duration consideration, only ΔT and material; a simplified and conservative estimation of a usage factor; and a detailed analysis of usage factor and crack growth rate, based on a specific data collection of operating transients. Around this procedure EDF launched an R and D program on fatigue curves and fatigue reduction factors for high cycle fatigue. The procedure is compared with field experience and recent R and D fatigue tests. (author)

  1. A definition and evaluation procedure of generalized stress intensity factors at cracks and multi-material wedges

    International Nuclear Information System (INIS)

    Song Chongmin

    2010-01-01

    A definition of generalized stress intensity factors is proposed. It is based on a matrix function solution for singular stress fields obtained from the scaled boundary finite-element method. The dimensions of the matrix are equal to the number of singular terms. Not only real and complex power singularities but also power-logarithmic singularities are represented in a unified expression without explicitly determining the type of singularity. The generalized stress intensity factors are evaluated directly from the definition by following standard stress recovery procedures in the finite element method. Numerical examples are presented to validate the definition and evaluation procedure.

  2. A Quantile Regression Approach to Estimating the Distribution of Anesthetic Procedure Time during Induction.

    Directory of Open Access Journals (Sweden)

    Hsin-Lun Wu

    Full Text Available Although procedure time analyses are important for operating room management, it is not easy to extract useful information from clinical procedure time data. A novel approach was proposed to analyze procedure time during anesthetic induction. A two-step regression analysis was performed to explore influential factors of anesthetic induction time (AIT). Linear regression with stepwise model selection was used to select significant correlates of AIT, and quantile regression was then employed to illustrate the dynamic relationships between AIT and the selected variables at distinct quantiles. A total of 1,060 patients were analyzed. First- and second-year residents (R1-R2) required longer AIT than third- and fourth-year residents and attending anesthesiologists (p = 0.006). Factors prolonging AIT included American Society of Anesthesiologists physical status ≧ III; arterial, central venous, and epidural catheterization; and use of bronchoscopy. Presence of the surgeon before induction decreased AIT (p < 0.001). Type of surgery also had a significant influence on AIT. Quantile regression satisfactorily estimated the extra time needed to complete induction for each influential factor at distinct quantiles. Our analysis of AIT demonstrated the benefit of quantile regression analysis in providing a more comprehensive view of the relationships between procedure time and related factors. This novel two-step regression approach has potential applications to procedure time analysis in operating room management.
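Quantile regression estimates conditional quantiles by minimizing the check (pinball) loss, which for the median reduces to least absolute deviations and can be cast as a linear program. A minimal sketch on synthetic data (one invented covariate, not the induction-time cohort):

```python
import numpy as np
from scipy.optimize import linprog

# Median (tau = 0.5) regression of a simulated "procedure time" on one
# covariate, solved as an LP: minimize tau*sum(u) + (1-tau)*sum(v)
# subject to X beta + u - v = y, with u, v >= 0 and beta split as b+ - b-.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 1, n)
y = 10 + 5 * x + rng.normal(0, 1, n)   # minutes; true median line 10 + 5x

tau = 0.5
X = np.column_stack([np.ones(n), x])
p = X.shape[1]

c = np.concatenate([np.zeros(2 * p),
                    tau * np.ones(n), (1 - tau) * np.ones(n)])
A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
res = linprog(c, A_eq=A_eq, b_eq=y,
              bounds=[(0, None)] * (2 * p + 2 * n))
beta = res.x[:p] - res.x[p:2 * p]
print("median-regression coefficients:", np.round(beta, 2))
```

Refitting with tau = 0.9 instead of 0.5 would estimate the 90th-percentile induction time, which is how the abstract's "extra time needed at distinct quantiles" is obtained.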

  3. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…
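
    Two standard eigenvalue-based criteria for this first step can be sketched in a few lines of numpy. The data below are simulated with a known two-factor structure (all numbers invented); the sketch applies the Kaiser criterion and Horn's parallel analysis, which are among the commonly evaluated rules:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 500, 8, 2  # observations, variables, true factors

# Simulated data: items 0-3 load 0.8 on factor 1, items 4-7 load 0.8 on factor 2
scores = rng.normal(size=(n, k))
loadings = np.zeros((p, k))
loadings[:4, 0] = 0.8
loadings[4:, 1] = 0.8
X = scores @ loadings.T + rng.normal(scale=0.6, size=(n, p))

eig = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]

# Kaiser criterion: retain factors whose eigenvalue exceeds 1
kaiser = int((eig > 1.0).sum())

# Horn's parallel analysis: retain eigenvalues exceeding the corresponding
# mean eigenvalue from correlation matrices of pure-noise data of the same shape
sim = np.empty((50, p))
for i in range(50):
    Z = rng.normal(size=(n, p))
    sim[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]
parallel = int((eig > sim.mean(axis=0)).sum())

assert kaiser == parallel == 2  # both criteria recover the two factors
```

    For P-technique data the same arithmetic applies, but the rows are repeated occasions from one subject rather than different subjects, which is exactly why the performance of these rules needs separate evaluation there.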

  4. Risk factors for unplanned readmission within 30 days after pediatric neurosurgery: a nationwide analysis of 9799 procedures from the American College of Surgeons National Surgical Quality Improvement Program.

    Science.gov (United States)

    Sherrod, Brandon A; Johnston, James M; Rocque, Brandon G

    2016-09-01

    OBJECTIVE Hospital readmission rate is increasingly used as a quality outcome measure after surgery. The purpose of this study was to establish, using a national database, the baseline readmission rates and risk factors for patient readmission after pediatric neurosurgical procedures. METHODS The American College of Surgeons National Surgical Quality Improvement Program-Pediatric database was queried for pediatric patients treated by a neurosurgeon between 2012 and 2013. Procedures were categorized by current procedural terminology (CPT) code. Patient demographics, comorbidities, preoperative laboratory values, operative variables, and postoperative complications were analyzed via univariate and multivariate techniques to find associations with unplanned readmissions within 30 days of the primary procedure. RESULTS A total of 9799 cases met the inclusion criteria, 1098 (11.2%) of which had an unplanned readmission within 30 days. Readmission occurred 14.0 ± 7.7 days postoperatively (mean ± standard deviation). The 4 procedures with the highest unplanned readmission rates were CSF shunt revision (17.3%; CPT codes 62225 and 62230), repair of myelomeningocele > 5 cm in diameter (15.4%), CSF shunt creation (14.1%), and craniectomy for infratentorial tumor excision (13.9%). The lowest unplanned readmission rates were for spine (6.5%), craniotomy for craniosynostosis (2.1%), and skin lesion (1.0%) procedures. On multivariate regression analysis, the odds of readmission were greatest in patients experiencing postoperative surgical site infection (SSI; deep, organ/space, superficial SSI, and wound disruption: OR > 12 and p 10 days (OR 1.411, p = 0.010), oxygen supplementation (OR 1.645, p = 0.010), nutritional support (OR 1.403, p = 0.009), seizure disorder (OR 1.250, p = 0.021), and longer operative time (per hour increase, OR 1.059, p = 0.029). CONCLUSIONS This study may aid in identifying patients at risk for unplanned readmission following pediatric neurosurgery

  5. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    Full Text Available The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works and its criteria, including its main assumptions, is discussed, as well as when it should be used. Mathematical theories are explored to enlighten students on how exploratory factor analysis works, an example of how to run an exploratory factor analysis on SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.
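
    The extraction step at the core of such a tutorial can be sketched with plain numpy. This uses principal-component extraction (eigenvectors of the correlation matrix scaled by the square roots of their eigenvalues) on simulated questionnaire data with an invented two-construct structure; it is a minimal sketch, not a substitute for the SPSS workflow the paper describes:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 500, 6

# Hypothetical questionnaire: items 0-2 measure one construct strongly,
# items 3-5 a second construct more weakly (all loadings invented)
f = rng.normal(size=(n, 2))
X = np.empty((n, p))
X[:, :3] = 0.9 * f[:, [0]] + rng.normal(scale=0.44, size=(n, 3))
X[:, 3:] = 0.7 * f[:, [1]] + rng.normal(scale=0.71, size=(n, 3))

R = np.corrcoef(X, rowvar=False)
vals, vecs = np.linalg.eigh(R)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# Principal-component extraction: loadings are eigenvectors scaled by
# the square roots of their eigenvalues
L = vecs[:, :2] * np.sqrt(vals[:2])

# Each item loads strongly (|loading| > 0.6) on exactly one factor,
# which is the clean "simple structure" one hopes to see in the output
strong_per_item = (np.abs(L) > 0.6).sum(axis=1)
```

    Reading the loading matrix L column by column is the numerical counterpart of interpreting the rotated component matrix in the SPSS output.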

  6. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES, PART TWO: APPLICABILITY OF CURRENT METHODS

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-10-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  7. Replica Analysis for Portfolio Optimization with Single-Factor Model

    Science.gov (United States)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of an optimal solution for the case where the return rate is described with a single-factor model and compare the findings obtained from our proposed methods with correlated return rates with those obtained with independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures for minimizing the investment risk from operations research.
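
    The effect being assessed can be illustrated numerically without replica analysis: under a single-factor model the covariance matrix gains a rank-one term that correlates all return rates, and the minimal investment risk under the budget constraint rises relative to the independent case. A rough numpy sketch (all parameters invented, plain linear algebra rather than the paper's analytical method):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50  # number of assets (all parameters below are invented)

beta = rng.normal(1.0, 0.3, size=N)      # exposures to the common factor
sigma_f2 = 0.04                          # variance of the factor return
idio = rng.uniform(0.01, 0.05, size=N)   # idiosyncratic variances

# Single-factor covariance: the common factor correlates all return rates
cov_factor = sigma_f2 * np.outer(beta, beta) + np.diag(idio)
# Benchmark with independent return rates: factor term removed
cov_indep = np.diag(idio)

def min_risk(cov):
    """Minimal portfolio variance under the budget constraint sum(w) = 1."""
    ones = np.ones(len(cov))
    w = np.linalg.solve(cov, ones)
    w /= w.sum()
    return float(w @ cov @ w)

# Including correlation via the factor term increases the minimal risk
assert min_risk(cov_factor) > min_risk(cov_indep)
```

    The inequality holds for any weight vector not orthogonal to beta, since the factor term only adds a positive-semidefinite contribution to the covariance.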

  8. The utilisation of thermal analysis to optimise radiocarbon dating procedures

    International Nuclear Information System (INIS)

    Brandova, D.; Keller, W.A.; Maciejewski, M.

    1999-01-01

    Thermal analysis combined with mass spectrometry was applied to radiocarbon dating procedures (age determination of carbon-containing samples). Experiments carried out under an oxygen atmosphere were used to determine carbon content and combustion range of soil and wood samples. Composition of the shell sample and its decomposition were investigated. The quantification of CO2 formed by the oxidation of carbon was done by the application of pulse thermal analysis. Experiments carried out under an inert atmosphere determined the combustion range of coal with CuO as an oxygen source. To eliminate a possible source of contamination in the radiocarbon dating procedures the adsorption of CO2 by CuO was investigated. (author)

  9. Main clinical, therapeutic and technical factors related to patient's maximum skin dose in interventional cardiology procedures

    Science.gov (United States)

    Journy, N; Sinno-Tellier, S; Maccia, C; Le Tertre, A; Pirard, P; Pagès, P; Eilstein, D; Donadieu, J; Bar, O

    2012-01-01

    Objective The study aimed to characterise the factors related to the X-ray dose delivered to the patient's skin during interventional cardiology procedures. Methods We studied 177 coronary angiographies (CAs) and/or percutaneous transluminal coronary angioplasties (PTCAs) carried out in a French clinic on the same radiography table. The clinical and therapeutic characteristics, and the technical parameters of the procedures, were collected. The dose area product (DAP) and the maximum skin dose (MSD) were measured by an ionisation chamber (Diamentor; Philips, Amsterdam, The Netherlands) and radiosensitive film (Gafchromic; International Specialty Products Advanced Materials Group, Wayne, NJ). Multivariate analyses were used to assess the effects of the factors of interest on dose. Results The mean MSD and DAP were respectively 389 mGy and 65 Gy cm−2 for CAs, and 916 mGy and 69 Gy cm−2 for PTCAs. For 8% of the procedures, the MSD exceeded 2 Gy. Although a linear relationship between the MSD and the DAP was observed for CAs (r=0.93), a simple extrapolation of such a model to PTCAs would lead to an inadequate assessment of the risk, especially for the highest dose values. For PTCAs, the body mass index, the therapeutic complexity, the fluoroscopy time and the number of cine frames were independent explanatory factors of the MSD, whoever the practitioner was. Moreover, the effect of technical factors such as collimation, cinematography settings and X-ray tube orientations on the DAP was shown. Conclusion Optimising the technical options for interventional procedures and training staff on radiation protection might notably reduce the dose and ultimately avoid patient skin lesions. PMID:22457404
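
    The linear MSD-DAP relationship reported for CAs (r = 0.93) can be illustrated with an ordinary least-squares fit. The data below are simulated under an assumed slope and noise level, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical CA procedures: DAP in Gy*cm2, MSD in mGy; the slope and
# noise level are invented for illustration, not taken from the study
dap = rng.uniform(20, 120, size=100)
msd = 6.0 * dap + rng.normal(scale=40.0, size=100)

slope, intercept = np.polyfit(dap, msd, 1)
r = np.corrcoef(dap, msd)[0, 1]

# A strong linear correlation supports estimating MSD for CAs from the
# routinely recorded DAP; the abstract warns that extrapolating such a
# simple model to PTCAs would under-assess the risk at high doses.
```
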

  10. The development of human factors technologies -The development of human behaviour analysis techniques-

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Heui; Park, Keun Ok; Chun, Se Woo; Suh, Sang Moon; Park, Jae Chang [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    In order to contribute to human error reduction through studies on human-machine interaction in nuclear power plants, this project has objectives to develop SACOM (Simulation Analyzer with a Cognitive Operator Model) and techniques for human error analysis and application. In this year, we studied the following. For the development of SACOM: (1) Site investigation of operator tasks, (2) Development of the operator task micro structure and revision of the micro structure, (3) Development of knowledge representation software and a SACOM prototype, (4) Development of performance assessment methodologies in task simulation and analysis of the effects of performance shaping factors. For the development of human error analysis and application techniques: (1) Classification of error shaping factors (ESFs) and development of software for ESF evaluation, (2) Analysis of human error occurrences and revision of the analysis procedure, (3) Experiment for human error data collection using a compact nuclear simulator, (4) Development of a prototype data base system of the analyzed information on trip cases. 55 figs, 23 tabs, 33 refs. (Author).

  11. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis, a

  12. Use of safety analysis to site confirmation procedure in case of hard rock repository

    International Nuclear Information System (INIS)

    Peltonen, E.K.

    1984-02-01

    The role of safety analysis in the confirmation procedure for a candidate radioactive waste disposal site is discussed. Items dealt with include the principal reasons and practical goals of the use of safety analysis, the methodology of safety analysis and assessment, and the usefulness and adequacy of present safety analysis. Safety analysis is a tool that enables one to estimate quantitatively the possible radiological impacts of the disposal. The results can be compared with the criteria and suitability conclusions drawn. Because of its systems-analytical nature, safety analysis is an effective method to reveal the most important factors of the disposal system and the most critical site characteristics inside the lumped parameters often provided by the experimental site investigation methods. Furthermore, it gives information on the accuracy needed for different site properties. This can be utilized to judge whether the quality and quantity of the measurements for the characterization are sufficient, as well as to guide further site investigations. A more practical discussion of the applicability of safety analysis is presented through an example concerning the assessment of a Finnish candidate site for a low- and intermediate-level radioactive waste repository. (author)

  13. Sample handling and chemical procedures for efficacious trace analysis of urine by neutron activation analysis

    International Nuclear Information System (INIS)

    Blotcky, A.J.; Rack, E.P.; Roman, F.R.

    1988-01-01

    Important for the determination of trace elements, ions, or compounds in urine by chemical neutron activation analysis is the optimization of sample handling, preirradiation chemistry, and radioassay procedures necessary for viable analysis. Each element, because of its natural abundance in the earth's crust and, hence, its potential for reagent and environmental contamination, requires specific procedures for storage, handling, and preirradiation chemistry. Radioassay techniques for radionuclides vary depending on their half-lives and decay characteristics. Described in this paper are optimized procedures for aluminum and selenium. While 28Al (T1/2 = 2.24 min) and 77mSe (T1/2 = 17.4 s) have short half-lives, their gamma-ray spectra are quite different. Aluminum-28 decays by a 1779-keV gamma and 77mSe by a 162-keV gamma. Unlike selenium, aluminum is a ubiquitous element in the environment requiring special handling to minimize contamination in all phases of its analytical determination
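
    The half-lives quoted above dictate how quickly counting must begin after irradiation. A small decay-correction sketch (times chosen for illustration):

```python
def fraction_remaining(t, half_life):
    """Fraction of a radionuclide's activity left after time t (same units)."""
    return 0.5 ** (t / half_life)

# 28Al (T1/2 = 2.24 min): after one half-life, half the activity remains
assert abs(fraction_remaining(2.24, 2.24) - 0.5) < 1e-12

# 77mSe (T1/2 = 17.4 s): an illustrative 60 s delay between irradiation and
# counting leaves under 10% of the activity, so radioassay must start promptly
left = fraction_remaining(60.0, 17.4)
assert left < 0.1
```
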

  14. Human factors evaluation of teletherapy: Training and organizational analysis. Volume 4

    International Nuclear Information System (INIS)

    Henriksen, K.; Kaye, R.D.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was initially performed to guide subsequent evaluations in the areas of system-user interfaces, procedures, training and qualifications, and organizational policies and practices. The present work focuses solely on training and qualifications of personnel (e.g., training received before and during employment), and the potential impact of organizational factors on the performance of teletherapy. Organizational factors include such topics as adequacy of staffing, performance evaluations, commonly occurring errors, implementation of quality assurance programs, and organizational climate

  16. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    International Nuclear Information System (INIS)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-01-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
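
    A hypothetical example of such an XML step element, parsed with Python's standard library, illustrates how step attributes could drive the CBPS's behavior. The element and attribute names here are invented for illustration, not the actual prototype's schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical markup for one procedure step -- the element and attribute
# names are illustrative only, not the actual INL prototype schema.
STEP_XML = """
<step id="4.2" type="decision">
  <instruction>Verify pump discharge pressure is within limits.</instruction>
  <reference doc="P&amp;ID-123"/>
  <input kind="boolean" prompt="Pressure within limits?"/>
</step>
"""

step = ET.fromstring(STEP_XML)
# The 'type' attribute would tell the CBPS which functionality to generate
# for this step -- here, a decision that requests a yes/no input from the user.
step_type = step.get("type")
prompt = step.find("input").get("prompt")
```

    Because the step type, references, and required inputs are all data, the CBPS can render referential information, request a decision, or accept input without the procedure writer reprogramming the system.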

  18. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and their implementation procedures presented in this report were prepared according to the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for the quantification of APET uncertainty inputs, and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for the quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes.
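
    The statistical propagation step is typically carried out by Monte Carlo sampling of the uncertain branch probabilities. A toy two-branch sketch (the tree and the Beta distributions are invented for illustration, not taken from any actual PSA):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000  # Monte Carlo samples

# Toy accident progression event tree with two phenomenological branches.
# Analyst uncertainty in each branch probability is expressed as a Beta
# distribution; the parameters here are illustrative only.
p_melt_relocation = rng.beta(8, 2, size=n)   # mean 0.8
p_containment_fail = rng.beta(2, 8, size=n)  # mean 0.2

# Statistical propagation: each sample yields one end-state probability
p_large_release = p_melt_relocation * p_containment_fail

mean = float(p_large_release.mean())
lo, hi = np.percentile(p_large_release, [5, 95])  # 90% uncertainty band
```

    In a full Level 2 PSA the same sampling is pushed through every APET branch point and the results are binned into source term categories, yielding a distribution rather than a point estimate for each category frequency.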

  19. Development of a 3-dimensional flow analysis procedure for axial pump impellers

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Kim, Jong In; Park, Jin Seok; Huh, Houng Huh; Chang, Moon Hee

    1999-06-01

    A fluid dynamic analysis procedure was developed using the three-dimensional solid model of an axial pump impeller which was theoretically designed using I-DEAS CAD/CAM/CAE software. The CFD software FLUENT was used in the flow field analysis. The steady-state flow regime in the MCP impeller and diffuser was simulated using the developed procedure. The results of calculation were analyzed to confirm whether the design requirements were properly implemented in the impeller model. The validity of the developed procedure was demonstrated by comparing the calculation results with the experimental data available. The pump performance at the design point could be effectively predicted using the developed procedure. The computed velocity distributions have shown a good agreement with the experimental data except for the regions near the wall. The computed head, however, was over-predicted compared with the experiment. The design period and cost required for the development of an axial pump impeller can be significantly reduced by applying the proposed methodology. (author). 7 refs., 2 tabs

  20. Analysis of factors important for the occurrence of Campylobacter in Danish broiler flocks

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Heuer, Ole Eske; Sørensen, Anna Irene Vedel

    2013-01-01

    a multivariate analysis including all 43 variables. A multivariate analysis was conducted using a generalized linear model, and the correlations between the houses from the same farms were accounted for by adding a variance structure to the model. The procedures for analyses included backward elimination...... of positive flocks/total number of flocks delivered over the 2-year period).The following factors were found to be significantly associated with the occurrence of Campylobacter in the broiler flocks: old broiler houses, late introduction of whole wheat in the feed, relatively high broiler age at slaughter...

  2. Pricing of common cosmetic surgery procedures: local economic factors trump supply and demand.

    Science.gov (United States)

    Richardson, Clare; Mattison, Gennaya; Workman, Adrienne; Gupta, Subhas

    2015-02-01

    The pricing of cosmetic surgery procedures has long been thought to coincide with laws of basic economics, including the model of supply and demand. However, the highly variable prices of these procedures indicate that additional economic contributors are probable. The authors sought to reassess the fit of cosmetic surgery costs to the model of supply and demand and to determine the driving forces behind the pricing of cosmetic surgery procedures. Ten plastic surgery practices were randomly selected from each of 15 US cities of various population sizes. Average prices of breast augmentation, mastopexy, abdominoplasty, blepharoplasty, and rhytidectomy in each city were compared with economic and demographic statistics. The average price of cosmetic surgery procedures correlated substantially with population size (r = 0.767), cost-of-living index (r = 0.784), cost to own real estate (r = 0.714), and cost to rent real estate (r = 0.695) across the 15 US cities. Cosmetic surgery pricing also was found to correlate (albeit weakly) with household income (r = 0.436) and per capita income (r = 0.576). Virtually no correlations existed between pricing and the density of plastic surgeons (r = 0.185) or the average age of residents (r = 0.076). Results of this study demonstrate a correlation between costs of cosmetic surgery procedures and local economic factors. Cosmetic surgery pricing cannot be completely explained by the supply-and-demand model because no association was found between procedure cost and the density of plastic surgeons.

  3. Pertinent anatomy and analysis for midface volumizing procedures.

    Science.gov (United States)

    Surek, Christopher C; Beut, Javier; Stephens, Robert; Jelks, Glenn; Lamb, Jerome

    2015-05-01

    The study was conducted to construct an anatomically inspired midfacial analysis facilitating safe, accurate, and dynamic nonsurgical rejuvenation. Emphasis is placed on determining injection target areas and adverse event zones. Twelve hemifacial fresh cadavers were dissected in a layered fashion. Dimensional measurements between the midfacial fat compartments, prezygomatic space, mimetic muscles, and neurovascular bundles were used to develop a topographic analysis for clinical injections. A longitudinal line from the base of the alar crease to the medial edge of the levator anguli oris muscle (1.9 cm), lateral edge of the levator anguli oris muscle (2.6 cm), and zygomaticus major muscle (4.6 cm) partitions the cheek into two aesthetic regions. A six-step facial analysis outlines three target zones and two adverse event zones and triangulates the point of maximum cheek projection. The lower adverse event zone yields an anatomical explanation for inadvertent jowling during anterior cheek injection. The upper adverse event zone localizes the palpebral branch of the infraorbital artery. The medial malar target area isolates quadrants for anterior cheek projection and tear trough effacement. The middle malar target area addresses lid-cheek blending and superficial compartment turgor. The lateral malar target area highlights lateral cheek projection and locates the prezygomatic space. This stepwise analysis illustrates target areas and adverse event zones to achieve midfacial support, contour, and profile in the repose position and simultaneous molding of a natural shape during animation. This reproducible method can be used both procedurally and in record-keeping for midface volumizing procedures.

  4. Stent-Assisted Coil Embolization of Vertebrobasilar Dissecting Aneurysms: Procedural Outcomes and Factors for Recanalization.

    Science.gov (United States)

    Jeon, Jin Pyeong; Cho, Young Dae; Rhim, Jong Kook; Park, Jeong Jin; Cho, Won-Sang; Kang, Hyun-Seung; Kim, Jeong Eun; Hwang, Gyojun; Kwon, O-Ki; Han, Moon Hee

    2016-01-01

    Outcomes of stent-assisted coil embolization (SACE) have not been well established in the setting of vertebrobasilar dissecting aneurysms (VBDAs) due to the low percentage of cases that need treatment and the array of available therapeutic options. Herein, we present the clinical and radiographic results of SACE in patients with VBDAs. A total of 47 patients (M:F, 30:17; mean age ± SD, 53.7 ± 12.6 years) with a VBDA who underwent SACE between 2008 and 2014 at two institutions were evaluated retrospectively. Medical records and radiologic data were analyzed to assess the outcome of SACE procedures. Cox proportional hazards regression analysis was conducted to determine the factors that were associated with aneurysmal recanalization after SACE. Stent-assisted coil embolization technically succeeded in all patients. Three cerebellar infarctions occurred on postembolization day 1, week 2, and month 2, but no other procedure-related complications developed. Immediately following SACE, 25 aneurysms (53.2%) showed no contrast filling into the aneurysmal sac. During a mean follow-up of 20.2 months, 37 lesions (78.7%) appeared completely occluded, whereas 10 lesions showed recanalization, 5 of which required additional embolization. Overall recanalization rate was 12.64% per lesion-year, and mean postoperative time to recanalization was 18 months (range, 3-36 months). In multivariable analysis, major branch involvement (hazard ratio [HR]: 7.28; p = 0.013) and the presence of residual sac filling (HR: 8.49, p = 0.044) were identified as statistically significant independent predictors of recanalization. No bleeding was encountered in follow-up monitoring. Stent-assisted coil embolization appears feasible and safe for treatment of VBDAs. Long-term results were acceptable in a majority of patients studied, despite a relatively high rate of incomplete occlusion immediately after SACE. Major branch involvement and coiled aneurysms with residual sac filling may predispose to

  5. Risk analysis procedure for post-wildfire natural hazards in British Columbia

    Science.gov (United States)

    Jordan, Peter

    2010-05-01

    Following a severe wildfire season in 2003, and several subsequent damaging debris flow and flood events, the British Columbia Forest Service developed a procedure for analysing risks to public safety and infrastructure from such events. At the same time, the Forest Service undertook a research program to determine the extent of post-wildfire hazards, and examine the hydrologic and geomorphic processes contributing to the hazards. The risk analysis procedure follows the Canadian Standards Association decision-making framework for risk management (which in turn is based on international standards). This has several steps: identification of risk, risk analysis and estimation, evaluation of risk tolerability, developing control or mitigation strategies, and acting on these strategies. The Forest Service procedure deals only with the first two steps. The results are passed on to authorities such as the Provincial Emergency Program and local government, who are responsible for evaluating risks, warning residents, and applying mitigation strategies if appropriate. The objective of the procedure is to identify and analyse risks to public safety and infrastructure. The procedure is loosely based on the BAER (burned area emergency response) program in the USA, with some important differences. Our procedure focuses on identifying risks and warning affected parties, not on mitigation activities such as broadcast erosion control measures. Partly this is due to limited staff and financial resources. Also, our procedure is not multi-agency, but is limited to wildfires on provincial forest land; in British Columbia about 95% of forest land is in the publicly-owned provincial forest. Each fire season, wildfires are screened by size and proximity to values at risk such as populated areas. For selected fires, when the fire is largely contained, the procedure begins with an aerial reconnaissance of the fire, and photography with a hand-held camera, which can be used to make a

  6. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  7. Development of residual stress analysis procedure for fitness-for-service assessment of welded structure

    International Nuclear Information System (INIS)

    Kim, Jong Sung; Jin, Tae Eun; Dong, P.; Prager, M.

    2003-01-01

    In this study, a state-of-the-art review of existing residual stress analysis techniques and representative solutions is presented in order to develop a residual stress analysis procedure for Fitness-For-Service (FFS) assessment of welded structures. Critical issues associated with existing residual stress solutions and their treatment in performing FFS assessments are discussed. It should be recognized that detailed residual stress evolution is an extremely complicated phenomenon that typically involves material-specific thermomechanical/metallurgical response, welding process physics, and structural interactions within the component being welded. As a result, computational procedures can vary significantly, from highly complicated numerical techniques intended only to elucidate a small part of the process physics to cost-effective procedures deemed adequate for capturing the important features of a final residual stress distribution. A residual stress analysis procedure for FFS purposes belongs to the latter category. With this in mind, both residual stress analysis techniques and their adequacy for FFS are assessed based on both literature data and analyses performed in this investigation.

  8. Operating procedure automation to enhance safety of nuclear power plants

    International Nuclear Information System (INIS)

    Husseiny, A.A.; Sabri, Z.A.; Adams, S.K.; Rodriguez, R.J.; Packer, D.; Holmes, J.W.

    1989-01-01

    Use of logic statements and computer assistance are explored as means for the automation and improved design of operating procedures, including those employed in abnormal and emergency situations. Operating procedures for downpower and loss of forced circulation are used for demonstration. Human-factors analysis is performed on generic emergency operating procedures for three strategies of control: manual, semi-automatic and automatic, using standard emergency operating procedures. This preliminary analysis shows that automation of procedures is feasible, provided that fault-tolerant software and hardware become available for the design of the controllers. Recommendations are provided for tests to substantiate the promised enhancement of plant safety. Adequate design of operating procedures through automation may alleviate several major operational problems of nuclear power plants. Also, automation of procedures is necessary for partial or overall automatic control of plants. Fully automatic operation is needed for space applications, while supervised automation of land-based and offshore plants may become the thrust of a new generation of nuclear power plants. (orig.)

  9. Procedures for multielement analysis using high-flux fast-neutron activation

    International Nuclear Information System (INIS)

    Williams, R.E.; Hopke, P.K.; Meyer, R.A.

    1981-06-01

    Improvements have been made in the rabbit system used for multi-element fast-neutron activation analysis at the Lawrence Livermore National Laboratory Rotating Target Neutron Source, RTNS-I. Procedures have been developed for the analysis of 20 to 25 elements in samples with an inorganic matrix and 10 to 15 elements in biological samples, without the need for prohibitively expensive, long irradiations. Results are presented for the analysis of fly ash, orchard leaves, and bovine liver

  10. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the analysis results for emergency tasks in emergency operating procedures (EOPs) that can be observed from simulator data are introduced. The task type, component type, system type, and additional information related to the performance of the operators are described. In addition, a prospective application of the analyzed information to the HEP quantification process is discussed. In the probabilistic safety analysis (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, many HRA methods have provided basic or nominal HEPs for typical tasks, together with quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affect the HEPs. In the HRA community, however, the necessity of appropriate and sufficient human performance data has recently been indicated, because a wide range of quantitative estimates in previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database being developed at KAERI is that human errors and related PSF surrogates that are objectively observable are collected from full-scope simulator experiences. In this environment, to produce concretely grounded bases for the HEPs, the traits or attributes of tasks where significant human errors can be observed should be clearly determined, and the determined traits should make it possible to compare the HEPs for those traits with the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse type of EOPs were analyzed using defined task, component, and system taxonomies.

  11. Exploratory Bi-factor Analysis: The Oblique Case

    OpenAIRE

    Jennrich, Robert L.; Bentler, Peter M.

    2011-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading mat...

  12. Work procedures and risk factors for high radiation exposure among radiologic technologists in South Korea

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Young; Choi, Yeong Chull [Dept. of Preventive Medicine, Keimyung University College of Medicine, Daegu (Korea, Republic of); Lee, Won Jin; Cha, Eun Shil [Dept. of Preventive Medicine, Korea University College of Medicine, Seoul (Korea, Republic of)

    2016-12-15

    Radiologic technologists currently make up 31.5% of diagnostic radiation workers in South Korea and receive the highest annual and collective doses among them. A comprehensive assessment of work practices and associated radiation doses from diagnostic radiology procedures should be undertaken for effective protection of radiologic technologists. Using a national survey, this study aimed (1) to explore the distribution of work procedures performed by gender, (2) to evaluate occupational radiation exposure by work characteristics and safety compliance, and (3) to identify the primary factors influencing high radiation exposure among radiologic technologists in South Korea. The study provides detailed information on work practices, the number of procedures performed on a weekly basis, and occupational radiation doses among radiologic technologists in South Korea. The average radiation dose for radiologic technologists is higher than in other countries, and type of facility, work safety, and wearing a lead apron explained much of the increased risk in the association between radiology procedures and radiation exposure.

  13. Work procedures and risk factors for high radiation exposure among radiologic technologists in South Korea

    International Nuclear Information System (INIS)

    Kim, Jae Young; Choi, Yeong Chull; Lee, Won Jin; Cha, Eun Shil

    2016-01-01

    Radiologic technologists currently make up 31.5% of diagnostic radiation workers in South Korea and receive the highest annual and collective doses among them. A comprehensive assessment of work practices and associated radiation doses from diagnostic radiology procedures should be undertaken for effective protection of radiologic technologists. Using a national survey, this study aimed (1) to explore the distribution of work procedures performed by gender, (2) to evaluate occupational radiation exposure by work characteristics and safety compliance, and (3) to identify the primary factors influencing high radiation exposure among radiologic technologists in South Korea. The study provides detailed information on work practices, the number of procedures performed on a weekly basis, and occupational radiation doses among radiologic technologists in South Korea. The average radiation dose for radiologic technologists is higher than in other countries, and type of facility, work safety, and wearing a lead apron explained much of the increased risk in the association between radiology procedures and radiation exposure.

  14. Development of a procedure for qualitative and quantitative evaluation of human factors as a part of probabilistic safety assessments of nuclear power plants. Part A

    International Nuclear Information System (INIS)

    Richei, A.

    1998-01-01

    The objective of this project is the development of a procedure for the qualitative and quantitative evaluation of human factors in probabilistic safety assessments for nuclear power plants. The Human Error Rate Assessment and Optimizing System (HEROS) is introduced. The evaluation of a task with HEROS is realized on three evaluation levels, i.e. 'Management Structure', 'Working Environment' and 'Man-Machine Interface'. The developed expert system uses fuzzy set theory for the assessment. Evaluation criteria are also derived for cognitive tasks. The validation of the procedure is based on three examples reflecting the common practice of probabilistic safety assessments and including problems which cannot, or only insufficiently, be evaluated with the established human risk analysis procedures. HEROS applications give plausible and comprehensible results. (orig.) [de

  15. A Bayesian multidimensional scaling procedure for the spatial analysis of revealed choice data

    NARCIS (Netherlands)

    DeSarbo, WS; Kim, Y; Fong, D

    1999-01-01

    We present a new Bayesian formulation of a vector multidimensional scaling procedure for the spatial analysis of binary choice data. The Gibbs sampler is gainfully employed to estimate the posterior distribution of the specified scalar-product (bilinear) model parameters. The computational procedure
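As a generic illustration of the Gibbs sampling machinery mentioned above (not the authors' MDS model, whose full conditionals are not given here), the sketch below alternates the two full conditionals of a bivariate normal and recovers its correlation from the simulated chain.

```python
import random

# Gibbs sampler for a bivariate normal with correlation rho:
# the full conditionals are x | y ~ N(rho*y, 1 - rho^2) and symmetrically for y
rho = 0.8
sd = (1 - rho ** 2) ** 0.5
random.seed(42)

x, y = 0.0, 0.0
xs, ys = [], []
for i in range(25000):
    x = random.gauss(rho * y, sd)
    y = random.gauss(rho * x, sd)
    if i >= 5000:                    # discard burn-in draws
        xs.append(x)
        ys.append(y)

# empirical correlation of the retained draws
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
vx = sum((a - mx) ** 2 for a in xs) / n
vy = sum((b - my) ** 2 for b in ys) / n
corr = cov / (vx * vy) ** 0.5
```

The same alternating-conditional scheme, applied to the posterior of the bilinear model parameters, is what the abstract's procedure relies on.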

  16. Responding to Self-Harm: A Documentary Analysis of Agency Policy and Procedure

    Science.gov (United States)

    Paul, Sally; Hill, Malcolm

    2013-01-01

    This paper reports on the findings of a documentary analysis of policies and procedures relating to self-harm from a range of organisations working with young people in the UK. It identifies the extent to which policies and/or procedures relating to self-harm are available for service providers and offers a wider understanding of the concepts of…

  17. Factors affecting construction performance: exploratory factor analysis

    Science.gov (United States)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors with 57 items affecting construction performance. The findings further reveal ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technological aspects. It is important that a multidimensional performance evaluation framework include all key factors affecting the construction performance of a company, so that management can plan an effective performance development program matched to the mission and vision of the company.
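As a sketch of the EFA retention step described above, the fragment below simulates item responses driven by two latent factors, extracts factors from the item correlation matrix, and keeps those passing the Kaiser (eigenvalue > 1) criterion. The data and loadings are invented for illustration; they are not the study's survey data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
f = rng.standard_normal((n, 2))      # two latent factors
load = np.zeros((6, 2))
load[:3, 0] = 0.8                    # items 1-3 load on factor 1
load[3:, 1] = 0.8                    # items 4-6 load on factor 2
x = f @ load.T + 0.6 * rng.standard_normal((n, 6))

r = np.corrcoef(x, rowvar=False)     # item correlation matrix
eigval, eigvec = np.linalg.eigh(r)   # eigh returns ascending eigenvalues
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

k = int(np.sum(eigval > 1.0))        # Kaiser criterion: retain eigenvalues > 1
loadings = eigvec[:, :k] * np.sqrt(eigval[:k])
```

With these hypothetical loadings the two retained factors separate cleanly into the two item blocks; in practice a rotation (e.g. varimax) would follow before interpretation.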

  18. Limiting excessive postoperative blood transfusion after cardiac procedures. A review.

    OpenAIRE

    Ferraris, V A; Ferraris, S P

    1995-01-01

    Analysis of blood product use after cardiac operations reveals that a small minority of patients consume the majority of blood products (80%). The risk factors that predispose a minority of patients to excessive blood use include patient-related factors, transfusion practices, drug-related causes, and procedure-related factors. Multivariate studies suggest that patient age and red blood cell volume are independent patient-related variables that predict excessive blood product transfusion aft...

  19. Reliability assessment of a manual-based procedure towards learning curve modeling and fmea analysis

    Directory of Open Access Journals (Sweden)

    Gustavo Rech

    2013-03-01

    Separation procedures in drug Distribution Centers (DC) are manual-based activities prone to failures such as shipping exchanged, expired, or broken drugs to the customer. Two interventions seem promising for improving the reliability of the separation procedure: (i) selection and allocation of appropriate operators to the procedure, and (ii) analysis of potential failure modes incurred by the selected operators. This article integrates Learning Curves (LC) and FMEA (Failure Mode and Effect Analysis) with the aim of reducing the occurrence of failures in the manual separation of a drug DC. LC parameters enable generating an index to identify the recommended operators to perform the procedures. The FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. It also deploys the traditional FMEA severity index into two sub-indexes, related to financial issues and to damage to the company's image, in order to characterize failure severity. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.

  20. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Developing a national brand is one of the most important issues in the development of a brand. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, and the sample has been chosen from two major automakers in Iran, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is 0.84, well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.
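The Cronbach alpha reliability check cited above can be reproduced on toy data. The formula below is the standard one (k/(k-1) times one minus the ratio of summed item variances to total-score variance); the Likert scores are invented purely for illustration.

```python
from statistics import variance

def cronbach_alpha(items):
    """items: list of respondent rows, each a list of Likert item scores."""
    k = len(items[0])
    item_vars = [variance(col) for col in zip(*items)]   # per-item sample variance
    total_var = variance([sum(row) for row in items])    # variance of total scores
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# toy Likert responses (5 respondents x 4 items); hypothetical values
scores = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
]
alpha = cronbach_alpha(scores)
```

Values of alpha above roughly 0.7, as in the study, are conventionally taken to indicate acceptable internal consistency.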

  1. Performing MR-guided biopsies in clinical routine: factors that influence accuracy and procedure time

    International Nuclear Information System (INIS)

    Hoffmann, Ruediger; Thomas, Christoph; Rempp, Hansjoerg; Schmidt, Diethard; Claussen, Claus D.; Clasen, Stephan; Pereira, Philippe L.

    2012-01-01

    To assess the accuracy and duration of MRI-guided liver and soft-tissue biopsies, and the factors that influence their duration. Nineteen liver biopsies and 19 soft-tissue biopsies performed under 1.5-T MRI guidance were retrospectively analysed. Diagnostic performance and complications were assessed. Intervention time was subdivided into a preparation period, puncture period and control period. Correlations between procedure time and target size, skin-to-target distance, sequences used and the interventionalist's experience were analysed. Overall sensitivity, specificity and accuracy were 0.86, 1.0 and 0.92, respectively. Two minor complications occurred. Overall median procedure time was 103.5 min. Liver biopsies lasted longer than soft-tissue biopsies (mean soft-tissue: 73.0 min; mean liver: 134.1 min). The puncture period was significantly prolonged for longer skin-to-target distances (P liver = 0.048; P soft-tissue = 0.005). Lower numbers of image acquisitions (P liver = 0.0007; P soft-tissue = 0.0012) and greater interventionalist experience reduced the procedure duration significantly (P < 0.05); moreover, all false-negative results occurred during the first five biopsies of each individual radiologist. The interventionalist's experience, skin-to-target distance and number of image acquisitions influence the procedure time significantly. (orig.)

  2. A pragmatic approach to estimate alpha factors for common cause failure analysis

    International Nuclear Information System (INIS)

    Hassija, Varun; Senthil Kumar, C.; Velusamy, K.

    2014-01-01

    Highlights:
    • Estimation of coefficients in the alpha factor model for common cause analysis.
    • A derivation of plant-specific alpha factors is demonstrated.
    • We examine the sensitivity of the common cause contribution to total system failure.
    • We compare the beta factor and alpha factor models for various redundant configurations.
    • The use of alpha factors is preferable, especially for large redundant systems.
    Abstract: Most modern technological systems are deployed with high redundancy, yet they fail mainly on account of common cause failures (CCF). Various models, such as Beta Factor, Multiple Greek Letter, Binomial Failure Rate and Alpha Factor, exist for estimating the risk from common cause failures. Among these, the alpha factor model is considered most suitable for highly redundant systems, as it arrives at common cause failure probabilities from a set of ratios of failures and the total component failure probability Q_T. In the present study, the alpha factor model is applied to the assessment of CCF of safety systems deployed at two nuclear power plants. A method to overcome the difficulties in estimating the model coefficients (the alpha factors), the importance of deriving plant-specific alpha factors, and the sensitivity of the common cause contribution to the total system failure probability with respect to the hazard imposed by various CCF events are highlighted. An approach described in NUREG/CR-5500 is extended in this study to provide more explicit guidance on a statistical approach to deriving plant-specific coefficients for CCF analysis, especially for highly redundant systems. The procedure is expected to aid regulators in independent safety assessment.
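As a sketch of the alpha factor model the abstract describes, the function below converts a set of alpha factors and the total component failure probability Q_T into basic-event CCF probabilities Q_k, using the non-staggered-testing formula from NUREG/CR-5485. The alpha values for the 3-train system are hypothetical, not taken from the study.

```python
from math import comb

def ccf_probabilities(alphas, q_total):
    """Basic-event CCF probabilities Q_k from alpha factors
    (alpha factor model, non-staggered-testing formula)."""
    m = len(alphas)                                   # redundancy level
    # alpha_t = sum over k of k * alpha_k
    alpha_t = sum(k * a for k, a in enumerate(alphas, start=1))
    # Q_k = k / C(m-1, k-1) * alpha_k / alpha_t * Q_T
    return [k / comb(m - 1, k - 1) * alphas[k - 1] / alpha_t * q_total
            for k in range(1, m + 1)]

# hypothetical alpha factors for a 3-train system (they must sum to 1)
alphas = [0.95, 0.03, 0.02]
q = ccf_probabilities(alphas, q_total=1.0e-3)        # [Q_1, Q_2, Q_3]
```

Q_1 dominates, as expected for a system whose failures are mostly independent, while the small Q_2 and Q_3 terms carry the common cause contribution that drives the failure probability of redundant configurations.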

  3. Análisis del fracaso empresarial por sectores: factores diferenciadores = Cross-industry analysis of business failure: differential factors

    Directory of Open Access Journals (Sweden)

    María Jesús Mures Quintana

    2012-12-01

    This paper focuses on a cross-industry analysis of business failure, in order to identify the explanatory and predictive factors of this event that differ across three of the main industries in every economy: manufacturing, building and services. For each of these industries, the same procedure is followed. First, a principal components analysis is applied to identify the explanatory factors of business failure in the three industries. Next, these factors are treated as independent variables in a discriminant analysis applied to predict firms' failure, using not only financial information in the form of ratios but also other non-financial variables related to the firms, as well as external information reflecting the macroeconomic conditions under which they develop their activity.

  4. Using plant procedures as the basis for conducting a job and task analysis

    International Nuclear Information System (INIS)

    Haynes, F.H.; Ruth, B.W.

    1985-01-01

    Plant procedures were selected, by Northeast Utilities (NU), as the basis for conducting Job and Task Analyses (JTA). The resultant JTA was used to design procedure based simulator training programs for Millstone 1, 2, and Connecticut Yankee. The task listings were both plant specific and exhibited excellent correlation to INPO's generic PWR and BWR task analyses. Using the procedures based method enabled us to perform the JTA using plant and training staff. This proved cost effective in terms of both time and money. Learning objectives developed from the JTA were easily justified and correlated directly to job performance within the context of the plant procedures. In addition, the analysis generated a comprehensive review of plant procedures and, conversely, the plant's normal procedure revision process generated an automatic trigger for updating the task data

  5. A scaling procedure for the response of an isolated system with high modal overlap factor

    Science.gov (United States)

    De Rosa, S.; Franco, F.

    2008-10-01

    The paper deals with a numerical approach that reduces some physical sizes of the solution domain to compute the dynamic response of an isolated system; it has been named the Asymptotical Scaled Modal Analysis (ASMA). The proposed numerical procedure alters the input data needed to obtain the classic modal responses, increasing the frequency band of validity of the discrete or continuous coordinate model through the definition of a proper scaling coefficient. It is demonstrated that the computational cost remains acceptable while the frequency range of analysis increases. Moreover, with reference to the flexural vibrations of a rectangular plate, the paper compares ASMA with statistical energy analysis (SEA) and the energy distribution approach. Some insights are also given into the limits of the scaling coefficient. Finally, it is shown that the linear dynamic response predicted with the scaling procedure has the same quality and characteristics as an SEA solution, but it can be useful when the system cannot be solved appropriately by standard SEA.

  6. Factor analysis of multivariate data

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A.; Mahadevan, R.

    A brief introduction to factor analysis is presented. A FORTRAN program that can perform Q-mode and R-mode factor analysis and the singular value decomposition of a given data matrix is presented in Appendix B. This computer program uses...
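The FORTRAN program itself sits in the paper's Appendix B and is not reproduced here. As a rough modern equivalent of its SVD step, a centered data matrix can be factored with numpy, with R-mode (variable) loadings taken from the right singular vectors and Q-mode (sample) scores from the left; the data below are random and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.standard_normal((20, 5))       # samples x variables

# center the columns, as is usual before R-mode analysis
z = data - data.mean(axis=0)

u, s, vt = np.linalg.svd(z, full_matrices=False)
r_mode_loadings = vt.T * s                # variable (R-mode) loadings
q_mode_scores = u * s                     # sample (Q-mode) scores
```

The product of the three SVD factors reconstructs the centered matrix exactly, which is what makes the decomposition a lossless starting point for both analysis modes.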

  7. A finite volume procedure for fluid flow, heat transfer and solid-body stress analysis

    KAUST Repository

    Jagad, P. I.; Puranik, B. P.; Date, A. W.

    2018-01-01

    A unified cell-centered unstructured mesh finite volume procedure is presented for fluid flow, heat transfer and solid-body stress analysis. An in-house procedure (A. W. Date, Solution of Transport Equations on Unstructured Meshes with Cell

  8. Factor analysis and scintigraphy

    International Nuclear Information System (INIS)

    Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.

    1976-01-01

    The goal of factor analysis is usually to achieve a reduction of a large set of data, extracting essential features without prior hypothesis. Due to the development of computerized systems, the use of larger samples, the possibility of sequential data acquisition and the increase in dynamic studies, the problem of data compression is now encountered routinely. Thus, results obtained for the compression of scintigraphic images are presented first. Then the possibilities offered by factor analysis for scan processing are discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing [fr

  9. Gastroesophageal reflux disease after peroral endoscopic myotomy: Analysis of clinical, procedural and functional factors, associated with gastroesophageal reflux disease and esophagitis.

    Science.gov (United States)

    Familiari, Pietro; Greco, Santi; Gigante, Giovanni; Calì, Anna; Boškoski, Ivo; Onder, Graziano; Perri, Vincenzo; Costamagna, Guido

    2016-01-01

    Peroral endoscopic myotomy (POEM) does not include any antireflux procedure, resulting in a certain risk of iatrogenic gastroesophageal reflux disease (GERD). The aim of the present study was to evaluate the incidence of iatrogenic GERD after POEM and to identify preoperative, perioperative and postoperative factors associated with GERD. All patients treated at a single center who had a complete GERD evaluation after POEM were included in the study. Demographics, preoperative and follow-up data, results of functional studies and procedural data were collected and analyzed. A total of 103 patients (mean age 46.6 years, 47 males) were included. Postoperative altered esophageal acid exposure was documented in 52 patients (50.5%). A total of 19 patients (18.4%) had heartburn and 21 (20.4%) had esophagitis. Overall, clinically relevant GERD (altered esophageal acid exposure associated with heartburn and/or esophagitis) was diagnosed in 30 patients (29.1%). The severity of esophageal acid exposure correlated with heartburn and esophagitis after POEM. Patients with heartburn had a lower postoperative 4-second integrated relaxation pressure than patients without symptoms (7.6 ± 3.8 mmHg vs 10.01 ± 4.4 mmHg, p < 0.05). No correlations were identified with patient sex, age, postoperative body mass index, esophageal shape (sigmoid vs non-sigmoid), lower esophageal sphincter pressure, length of myotomy, previous therapies or type of achalasia at high-resolution manometry. Preoperative, perioperative and postoperative factors correlated only minimally with GERD after POEM. Clinically relevant GERD was identified in less than one-third of patients, and all affected patients were well controlled with medical therapy. © 2015 The Authors Digestive Endoscopy © 2015 Japan Gastroenterological Endoscopy Society.

  10. Studies on thermal neutron perturbation factor needed for bulk sample activation analysis

    CERN Document Server

    Csikai, J; Sanami, T; Michikawa, T

    2002-01-01

    The spatial distribution of thermal neutrons produced by an Am-Be source in a graphite pile was measured via the activation foil method. The results obtained agree well with calculated data using the MCNP-4B code. A previous method used for the determination of the average neutron flux within thin absorbing samples has been improved and extended for a graphite moderator. A procedure developed for the determination of the flux perturbation factor renders the thermal neutron activation analysis of bulky samples of unknown composition possible both in hydrogenous and graphite moderators.

  11. A finite volume procedure for fluid flow, heat transfer and solid-body stress analysis

    KAUST Repository

    Jagad, P. I.

    2018-04-12

    A unified cell-centered unstructured mesh finite volume procedure is presented for fluid flow, heat transfer and solid-body stress analysis. An in-house procedure (A. W. Date, Solution of Transport Equations on Unstructured Meshes with Cell-Centered Colocated Variables. Part I: Discretization, International Journal of Heat and Mass Transfer, vol. 48 (6), 1117-1127, 2005) is extended to include the solid-body stress analysis. The transport terms for a cell-face are evaluated in a structured grid-like manner. The Cartesian gradients at the center of each cell-face are evaluated using the coordinate transformation relations. The accuracy of the procedure is demonstrated by solving several benchmark problems involving different boundary conditions, source terms, and types of loading.
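The record describes a cell-centered finite volume procedure on unstructured meshes. The sketch below is a far simpler 1-D cell-centered finite volume solution of steady heat conduction, included only to illustrate the cell-face flux-balance idea; the grid, boundary values and solver choice are assumptions, not the authors' method.

```python
# 1-D cell-centered finite volume sketch for steady conduction:
# d/dx(k dT/dx) = 0 on [0, 1], T(0) = 0, T(1) = 1, uniform conductivity k = 1
n = 10                        # number of cells
dx = 1.0 / n
a = [[0.0] * n for _ in range(n)]   # coefficient matrix
b = [0.0] * n                       # right-hand side
for i in range(n):
    if i > 0:                 # interior face shared with left neighbour
        a[i][i - 1] -= 1.0 / dx
        a[i][i] += 1.0 / dx
    else:                     # left boundary: T = 0 at half-cell distance
        a[i][i] += 2.0 / dx
    if i < n - 1:             # interior face shared with right neighbour
        a[i][i + 1] -= 1.0 / dx
        a[i][i] += 1.0 / dx
    else:                     # right boundary: T = 1 at half-cell distance
        a[i][i] += 2.0 / dx
        b[i] += 2.0 / dx * 1.0

# Gauss-Seidel iteration on the flux-balance equations
T = [0.0] * n
for _ in range(2000):
    for i in range(n):
        s = b[i] - sum(a[i][j] * T[j] for j in range(n) if j != i)
        T[i] = s / a[i][i]
# the exact solution is linear: T = x evaluated at cell centres (i + 0.5) * dx
```

On an unstructured mesh, the same face-by-face flux assembly applies, but the face gradients require the coordinate-transformation treatment described in the abstract.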

  12. A simplified procedure of linear regression in a preliminary analysis

    Directory of Open Access Journals (Sweden)

    Silvia Facchinetti

    2013-05-01

    Full Text Available The analysis of a large statistical data-set can be guided by the study of a particularly interesting variable Y (the regressand) and an explanatory variable X, chosen among the remaining variables, observed jointly. The study gives a simplified procedure to obtain the functional link y = y(x) between the variables by partitioning the data-set into m subsets, in which the observations are synthesized by location indices (mean or median) of X and Y. Polynomial models for y(x) of order r are considered to verify the characteristics of the given procedure; in particular we assume r = 1 and r = 2. The distributions of the parameter estimators are obtained by simulation when the fitting is done for m = r + 1. Comparisons of the results, in terms of distribution and efficiency, are made with the results obtained by the ordinary least squares method. The study also gives some considerations on the consistency of the parameters estimated by the given procedure.
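
The partition procedure can be sketched for the simplest case, r = 1 with m = 2 subsets: order the data by x, split into two groups, summarize each group by a location index, and pass a line through the two summary points. This is an illustrative reading of the abstract, not the authors' implementation; the choice of the median as location index and the equal-size split are assumptions.

```python
# Hypothetical sketch of the partition-based fit for r = 1 (straight line):
# split the data into m = r + 1 = 2 groups ordered by x, summarize each
# group by the median of X and Y, and pass a line through the two points.

from statistics import median

def partition_fit_line(xs, ys):
    """Fit y = a + b*x through the medians of two halves of the data."""
    pairs = sorted(zip(xs, ys))                 # order observations by x
    mid = len(pairs) // 2
    groups = [pairs[:mid], pairs[mid:]]         # m = 2 subsets
    # location indices (here: medians) of X and Y in each subset
    px = [median(x for x, _ in g) for g in groups]
    py = [median(y for _, y in g) for g in groups]
    b = (py[1] - py[0]) / (px[1] - px[0])       # slope through the 2 points
    a = py[0] - b * px[0]                       # intercept
    return a, b

xs = [1, 2, 3, 4, 5, 6]
ys = [2 * x + 1 for x in xs]                    # exactly linear data
a, b = partition_fit_line(xs, ys)
print(a, b)                                     # recovers intercept 1, slope 2
```

With noisy data the medians make this fit robust to outliers, which is one reason such a shortcut can be attractive in a preliminary analysis before ordinary least squares.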

  13. SU-D-209-05: Sensitivity of the Diagnostic Radiological Index of Protection (DRIP) to Procedural Factors in Fluoroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Jones, A [UT MD Anderson Cancer Center, Houston, TX (United States); Pasciak, A [University of Tennessee Medical Center, Knoxville, TN (United States); Wagner, L [UT Medical School, Houston, TX (United States)

    2016-06-15

    Purpose: To evaluate the sensitivity of the Diagnostic Radiological Index of Protection (DRIP) to procedural factors in fluoroscopy in an effort to determine an appropriate set of scatter-mimicking primary beams (SMPB) to be used in measuring the DRIP. Methods: A series of clinical and factorial Monte Carlo simulations were conducted to determine the shape of the scattered X-ray spectra incident on the operator in different clinical fluoroscopy scenarios. Two clinical evaluations studied the sensitivity of the scattered spectrum to gantry angle and patient size while technical factors were varied according to measured automatic dose rate control (ADRC) data. Factorial evaluations studied the sensitivity of the scattered spectrum to gantry angle, field of view, patient size and beam quality for constant technical factors. Average energy was the figure of merit used to condense fluence in each energy bin to a single numerical index. Results: Beam quality had the strongest influence on the scattered spectrum in fluoroscopy. Many procedural factors affected the scattered spectrum indirectly through their effects on primary beam quality through ADRC, e.g., gantry angle and patient size. Lateral C-arm rotation, common in interventional cardiology, increased the energy of the scattered spectrum, regardless of the direction of rotation. The effect of patient size on scattered radiation depended on ADRC characteristics, patient size, and procedure type. Conclusion: The scattered spectrum striking the operator in fluoroscopy, and therefore the DRIP, is most strongly influenced by primary beam quality, particularly kV. Use cases for protective garments should be classified by typical procedural primary beam qualities, which are governed by the ADRC according to the impacts of patient size, anatomical location, and gantry angle. These results will help determine an appropriate set of SMPB to be used for measuring the DRIP.
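
The figure of merit used above, the average energy of the binned scattered spectrum, reduces to a fluence-weighted mean. A minimal sketch with toy bin values (the numbers are invented for illustration):

```python
# Fluence-weighted average energy of a binned photon spectrum: the single
# numerical index used above to condense the fluence in each energy bin.

def average_energy(bin_energies_keV, fluences):
    """Fluence-weighted mean energy of a binned spectrum."""
    total = sum(fluences)
    return sum(e * f for e, f in zip(bin_energies_keV, fluences)) / total

# toy spectrum: three energy bins with relative fluences
print(average_energy([30.0, 60.0, 90.0], [1.0, 2.0, 1.0]))  # 60.0
```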

  14. A Human Error Analysis Procedure for Identifying Potential Error Modes and Influencing Factors for Test and Maintenance Activities

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2010-01-01

    Periodic or non-periodic test and maintenance (T and M) activities in large, complex systems such as nuclear power plants (NPPs) are essential for sustaining stable and safe operation of the systems. On the other hand, it also has been raised that human erroneous actions that might occur during T and M activities has the possibility of incurring unplanned reactor trips (RTs) or power derate, making safety-related systems unavailable, or making the reliability of components degraded. Contribution of human errors during normal and abnormal activities of NPPs to the unplanned RTs is known to be about 20% of the total events. This paper introduces a procedure for predictively analyzing human error potentials when maintenance personnel perform T and M tasks based on a work procedure or their work plan. This procedure helps plant maintenance team prepare for plausible human errors. The procedure to be introduced is focusing on the recurrent error forms (or modes) in execution-based errors such as wrong object, omission, too little, and wrong action

  15. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A. [Pacific Science and Engineering Group, San Diego, CA (United States); Saunders, W.M.; Lepage, R.P.; Chin, E. [University of California San Diego Medical Center, CA (United States). Div. of Radiation Oncology; Schoenfeld, I.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses.

  16. Human factors evaluation of remote afterloading brachytherapy. Volume 2, Function and task analysis

    International Nuclear Information System (INIS)

    Callan, J.R.; Gwynne, J.W. III; Kelly, T.T.; Muckler, F.A.; Saunders, W.M.; Lepage, R.P.; Chin, E.; Schoenfeld, I.; Serig, D.I.

    1995-05-01

    A human factors project on the use of nuclear by-product material to treat cancer using remotely operated afterloaders was undertaken by the Nuclear Regulatory Commission. The purpose of the project was to identify factors that contribute to human error in the system for remote afterloading brachytherapy (RAB). This report documents the findings from the first phase of the project, which involved an extensive function and task analysis of RAB. This analysis identified the functions and tasks in RAB, made preliminary estimates of the likelihood of human error in each task, and determined the skills needed to perform each RAB task. The findings of the function and task analysis served as the foundation for the remainder of the project, which evaluated four major aspects of the RAB system linked to human error: human-system interfaces; procedures and practices; training and qualifications of RAB staff; and organizational practices and policies. At its completion, the project identified and prioritized areas for recommended NRC and industry attention based on all of the evaluations and analyses

  17. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    International Nuclear Information System (INIS)

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L.

    1995-10-01

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed and found a statistically significant factor-of-two bias on the average.

  18. An analytical inductor design procedure for three-phase PWM converters in power factor correction applications

    DEFF Research Database (Denmark)

    Kouchaki, Alireza; Niroumand, Farideh Javidi; Haase, Frerk

    2015-01-01

    This paper presents an analytical method for designing the inductor of three-phase power factor correction converters (PFCs). The complex behavior of the inductor current complicates the inductor design procedure as well as the core loss and copper loss calculations. Therefore, this paper analyzes … circuit is used to provide the inductor current harmonic spectrum. Using the harmonic spectrum, the low and high frequency copper losses are calculated. The high frequency minor B-H loops in one switching cycle are also analyzed. Then, the loss map provided by the measurement setup is used to calculate the core loss in the PFC application. To investigate the impact of the dc link voltage level, two inductors for different dc voltage levels are designed and the results are compared.

  19. Shrunken head (tsantsa): a complete forensic analysis procedure.

    Science.gov (United States)

    Charlier, P; Huynh-Charlier, I; Brun, L; Hervé, C; de la Grandmaison, G Lorin

    2012-10-10

    Based on the analysis of shrunken heads referred to our forensic laboratory for anthropological expertise, and on data from both the anthropological and the medical literature, we propose a complete forensic procedure for the analysis of such pieces. A list of 14 original morphological criteria has been developed, based on the global aspect, color, physical deformation, anatomical details, and any associated material (wood, vegetal fibers, sand, charcoals, etc.). These criteria have been tested on a control sample of 20 tsantsa (i.e. shrunken heads from the Jivaro or Shuar tribes of South America). Further complementary analyses, such as CT scanning and microscopic examination, are described. Such expertise is increasingly requested of forensic anthropologists and practitioners in a context of global repatriation of human artifacts to native communities. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. Exploratory Bi-Factor Analysis: The Oblique Case

    Science.gov (United States)

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 47:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  1. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    International Nuclear Information System (INIS)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves

    2017-01-01

    In this study optimization of procedures and standardization of the Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and better irradiation conditions. (author)

  2. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves, E-mail: uandapaula@gmail.com, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    In this study optimization of procedures and standardization of the Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied for evaluation of the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and better irradiation conditions. (author)
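
In a full 2^k design such as the 2^3 and 2^4 designs above, each variable's main effect is estimated by contrasting the runs at its high and low levels. A generic sketch of that calculation for a 2^3 design (the response values are invented for illustration, not taken from the study):

```python
from itertools import product

def main_effects(responses):
    """Main effects for a full 2^k factorial design.

    `responses` maps a tuple of factor levels (-1 or +1) to the measured
    result; the effect of factor i is the mean response at +1 minus the
    mean response at -1.
    """
    k = len(next(iter(responses)))
    effects = []
    for i in range(k):
        hi = [y for levels, y in responses.items() if levels[i] == +1]
        lo = [y for levels, y in responses.items() if levels[i] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# toy 2^3 design: response depends strongly on factor 0, weakly on factor 1
resp = {lv: 10 + 4 * lv[0] + 1 * lv[1] for lv in product((-1, +1), repeat=3)}
print(main_effects(resp))  # [8.0, 2.0, 0.0]
```

Ranking the absolute effects is what identifies which procedural variables (decay time, irradiation time, counting time, distance to detector) dominate the final mass fraction result.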

  3. THE INFLUENCE OF THE CHOSEN SOCIO-DEMOGRAPHIC FACTORS ON THE QUALITY OF LIFE IN WOMEN AFTER GYNAECOLOGICAL SURGICAL PROCEDURES

    Directory of Open Access Journals (Sweden)

    Beata Karakiewicz

    2010-09-01

    Full Text Available Background: The aim of this study was to assess how the chosen socio-demographic factors affect the quality of life in patients after gynaecological surgical procedures. Materials and Methods: Research was conducted in 2007 among 250 women operated on in the Department of Reproduction and Gynaecology, Pomeranian Medical University in Szczecin. In this survey-based study, we used a standardized quality of life questionnaire, the Women's Health Questionnaire (WHQ), developed by Dr Myra Hunter at London University. Results: The most numerous patients were those with sleep disorders (38.8%); 37.6% of the surveyed complained of troublesome menstrual symptoms, and 26.8% of respondents had disturbing somatic symptoms, poor memory and problems with concentration. The lowest percentage of women (12.4%) felt anxiety and fear associated with the past gynaecological surgical procedure. Conclusions: 1. General satisfaction and good disposition are declared by the majority of patients after gynaecological surgical procedures. 2. Age, education, having a partner, place of residence, and the number of children are the factors which have a significant effect on the quality of life in women after gynaecological procedures.

  4. Failure mode and effects analysis: an empirical comparison of failure mode scoring procedures.

    Science.gov (United States)

    Ashley, Laura; Armitage, Gerry

    2010-12-01

    To empirically compare 2 different commonly used failure mode and effects analysis (FMEA) scoring procedures with respect to their resultant failure mode scores and prioritization: a mathematical procedure, where scores are assigned independently by FMEA team members and averaged, and a consensus procedure, where scores are agreed on by the FMEA team via discussion. A multidisciplinary team undertook a Healthcare FMEA of chemotherapy administration. This included mapping the chemotherapy process, identifying and scoring failure modes (potential errors) for each process step, and generating remedial strategies to counteract them. Failure modes were scored using both an independent mathematical procedure and a team consensus procedure. Almost three-fifths of the 30 failure modes generated were scored differently by the 2 procedures, and in just over one-third of cases the score discrepancy was substantial. Using the Healthcare FMEA prioritization cutoff score, almost twice as many failure modes were prioritized by the consensus procedure as by the mathematical procedure. This is the first study to empirically demonstrate that different FMEA scoring procedures can score and prioritize failure modes differently. It found considerable variability in individual team members' opinions on scores, which highlights the subjective and qualitative nature of failure mode scoring. A consensus scoring procedure may be most appropriate for FMEA as it allows variability in individuals' scores and rationales to become apparent and to be discussed and resolved by the team. It may also yield team learning and communication benefits unlikely to result from a mathematical procedure.
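
The "mathematical" procedure contrasted above averages each member's independent ratings before combining them into a priority score. A generic sketch, assuming the classic severity x occurrence x detectability risk priority number (RPN); the Healthcare FMEA in the study uses its own hazard-scoring scale, so this is only the shape of the calculation:

```python
from statistics import mean

def mathematical_rpn(member_scores):
    """'Mathematical' FMEA scoring: average each member's severity,
    occurrence and detectability ratings, then multiply into an RPN."""
    sev = mean(s for s, o, d in member_scores)
    occ = mean(o for s, o, d in member_scores)
    det = mean(d for s, o, d in member_scores)
    return sev * occ * det

# three team members rate one failure mode independently (S, O, D on 1-10)
scores = [(8, 3, 4), (6, 5, 4), (7, 4, 4)]
print(mathematical_rpn(scores))  # 7 * 4 * 4 = 112.0
```

The consensus procedure has no closed form: the team discusses until a single (S, O, D) triple is agreed, which is exactly why the two procedures can rank the same failure mode differently.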

  5. Procedure for Market Analysis in the Research, Development and Innovation Management Unit (R+D+I) of the University of Mindelo

    Directory of Open Access Journals (Sweden)

    Joao Dias-da Silva

    2016-12-01

    Full Text Available Research, development and innovation (R+D+i) is a competitive factor in organizations and involves a set of specific features that call for efficient and effective management. The objective of this paper was to show the application of a procedure for market analysis in the R+D+i management unit of the University of Mindelo, Republic of Cape Verde. The results achieved by applying the procedure made it possible to improve the decision-making process in response to the needs of clients such as the companies ENAPOR and ELECTRA, among others. They likewise strengthened projects related to the development of software to support business management and its financing from external sources, as well as the competitiveness of the organization.

  6. A novel procedure on next generation sequencing data analysis using text mining algorithm.

    Science.gov (United States)

    Zhao, Weizhong; Chen, James J; Perkins, Roger; Wang, Yuping; Liu, Zhichao; Hong, Huixiao; Tong, Weida; Zou, Wen

    2016-05-13

    Next-generation sequencing (NGS) technologies have provided researchers with vast possibilities in various biological and biomedical research areas. Efficient data mining strategies are in high demand for large-scale comparative and evolutionary studies to be performed on the large amounts of data derived from NGS projects. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. We report a novel procedure to analyze NGS data using topic modeling. It consists of four major steps: NGS data retrieval, preprocessing, topic modeling, and data mining using Latent Dirichlet Allocation (LDA) topic outputs. The NGS data set of Salmonella enterica strains was used as a case study to show the workflow of this procedure. The perplexity measurement of the topic numbers and the convergence efficiencies of Gibbs sampling were calculated and discussed for achieving the best result from the proposed procedure. The output topics of the LDA algorithm could be treated as features of Salmonella strains to accurately describe the genetic diversity of the fliC gene in various serotypes. The results of a two-way hierarchical clustering and data matrix analysis on LDA-derived matrices successfully classified Salmonella serotypes based on the NGS data. The implementation of topic modeling in the NGS data analysis procedure provides a new way to elucidate genetic information from NGS data and to identify gene-phenotype relationships and biomarkers, especially in the era of biological and medical big data.
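
Before a text-mining model such as LDA can be applied to sequences, each sequence must be turned into a "document" of "words". A common choice for that preprocessing step, assumed here for illustration rather than taken from the paper, is overlapping k-mers:

```python
def kmer_document(sequence, k=3):
    """Represent a DNA sequence as a 'document' of overlapping k-mer
    'words' - a typical preprocessing step before feeding sequence data
    to text-mining models such as LDA."""
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

doc = kmer_document("ACGTAC", k=3)
print(doc)  # ['ACG', 'CGT', 'GTA', 'TAC']
```

Each strain's k-mer list then plays the role of a text document, and the LDA topic loadings become the per-strain features used for clustering and classification.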

  7. Chronic subdural hematoma: a systematic review and meta-analysis of surgical procedures.

    Science.gov (United States)

    Liu, Weiming; Bakker, Nicolaas A; Groen, Rob J M

    2014-09-01

    In this paper the authors systematically evaluate the results of different surgical procedures for chronic subdural hematoma (CSDH). The MEDLINE, Embase, Cochrane Central Register of Controlled Trials, and other databases were scrutinized according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement, after which only randomized controlled trials (RCTs) and quasi-RCTs were included. At least 2 different neurosurgical procedures in the management of CSDH had to be evaluated. Included studies were assessed for risk of bias. Recurrence rates, complications, and outcome including mortality were taken as outcome measures. Statistical heterogeneity in each meta-analysis was assessed using the tau-squared (T²), I², and chi-square tests. The DerSimonian-Laird method was used to calculate the summary estimates using the fixed-effect model in meta-analysis. Of the 297 studies identified, 19 RCTs were included. Of them, 7 studies evaluated the use of postoperative drainage, of which the meta-analysis showed a pooled OR of 0.36 (95% CI 0.21-0.60; p < 0.001) in favor of drainage. Four studies compared twist drill and bur hole procedures. No significant differences between the 2 methods were present, but heterogeneity was considered to be significant. Three studies directly compared the use of irrigation before drainage. A fixed-effects meta-analysis showed a pooled OR of 0.49 (95% CI 0.21-1.14; p = 0.10) in favor of irrigation. Two studies evaluated postoperative posture. The available data did not reveal a significant advantage in favor of the postoperative supine posture. Regarding positioning of the catheter used for drainage, it was shown that a frontal catheter led to a better outcome. One study compared duration of drainage, showing that 48 hours of drainage was as effective as 96 hours of drainage. Postoperative drainage has the advantage of reducing recurrence without increasing complications.
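
Pooled odds ratios like those quoted above come from inverse-variance weighting of per-study log odds ratios under a fixed-effect model. A sketch with invented 2x2 tables, not the trial data from the review:

```python
import math

def pooled_or_fixed(studies):
    """Fixed-effect (inverse-variance) pooled odds ratio.

    Each study is a 2x2 table (a, b, c, d): events/non-events in the
    treatment arm, then events/non-events in the control arm.
    Each study's weight is 1 / var(log OR)."""
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of log OR
        w = 1.0 / var
        num += w * log_or
        den += w
    return math.exp(num / den)

# two toy trials of postoperative drainage (recurrence yes/no per arm)
print(round(pooled_or_fixed([(5, 45, 12, 38), (4, 46, 10, 40)]), 2))  # 0.35
```

An OR below 1, as here, favors the treatment arm; a random-effects model such as DerSimonian-Laird would additionally inflate each variance by a between-study component.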

  8. Impacts of biological and procedural factors on semiquantification uptake value of liver in fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography imaging.

    Science.gov (United States)

    Mahmud, Mohd Hafizi; Nordin, Abdul Jalil; Ahmad Saad, Fathinul Fikri; Azman, Ahmad Zaid Fattah

    2015-10-01

    Increased metabolic activity of fluorodeoxyglucose (FDG) in tissue results not only from pathological uptake but from physiological uptake as well. This study aimed to determine the impacts of biological and procedural factors on FDG uptake of the liver in whole body positron emission tomography/computed tomography (PET/CT) imaging. Whole body fluorine-18 ((18)F) FDG PET/CT scans of 51 oncology patients were reviewed. The maximum standardized uptake value (SUVmax) of the lesion-free liver was quantified in each patient. Pearson correlation was performed to determine the association between the factors of age, body mass index (BMI), blood glucose level, FDG dose and incubation period and liver SUVmax. Multivariate regression analysis was established to determine the significant factors that best predicted the liver SUVmax. The subjects were then dichotomised into four BMI groups. Analysis of variance (ANOVA) was established for the mean difference in liver SUVmax between those BMI groups. BMI and incubation period were significantly associated with liver SUVmax; these factors accounted for 29.6% of the liver SUVmax variance. Statistically significant differences were observed in the mean SUVmax of the liver among the BMI groups. The findings support a physiological liver SUVmax value as a reference standard for different patient BMIs in PET/CT interpretation, and the use of a standard protocol for the patient incubation period to reduce variation in the physiological FDG uptake of the liver in PET/CT studies.

  9. Theoretical analysis about early detection of hepatocellular carcinoma by medical imaging procedure

    Energy Technology Data Exchange (ETDEWEB)

    Odano, Ikuo; Hinata, Hiroshi; Hara, Keiji; Sakai, Kunio [Niigata Univ. (Japan). School of Medicine

    1983-04-01

    It is well known that patients with chronic hepatitis and liver cirrhosis frequently develop hepatocellular carcinoma (hepatoma); they are called the high risk group for hepatoma. In order to detect a small hepatoma, it is reasonable to perform screening examinations on these high risk patients. The optimal screening interval, however, has not been established. In this report, a theoretical analysis was made to estimate the optimal screening interval for imaging procedures such as ultrasonography, x-ray computed tomography and scintigraphy. From the analysis of eight cases, the mean doubling time of hepatoma was estimated at about four months (73 - 143 days). If we want to detect a hepatoma no greater than 3.0 cm in diameter, a medical screening procedure combining ultrasonography and scintigraphy should be performed about once every nine months.
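
The screening-interval reasoning rests on exponential volume growth: a sphere's volume scales with the cube of its diameter, so the time to grow between two diameters follows directly from the volume doubling time. A sketch of that calculation; the 1.5 cm starting diameter is an assumed detection threshold for illustration, not a figure from the paper:

```python
import math

def growth_time_months(d_start_cm, d_end_cm, volume_doubling_months=4.0):
    """Months for a spherical tumor to grow from d_start to d_end in
    diameter, given its volume doubling time T: volume scales as the
    diameter cubed, so t = T * log2((d_end / d_start) ** 3)."""
    return volume_doubling_months * math.log2((d_end_cm / d_start_cm) ** 3)

# e.g. from an assumed 1.5 cm detection threshold to 3.0 cm in diameter
print(round(growth_time_months(1.5, 3.0), 1))  # 12.0
```

The screening interval must be shorter than this growth time so that a tumor just below the detection threshold at one examination is still no larger than the target size at the next.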

  10. A comparative examination of sample treatment procedures for ICAP-AES analysis of biological tissue

    Science.gov (United States)

    De Boer, J. L. M.; Maessen, F. J. M. J.

    The objective of this study was to contribute to the evaluation of existing sample preparation procedures for ICAP-AES analysis of biological material. Performance characteristics of current digestion procedures, comprising extraction, solubilization, pressure digestion, and wet and dry ashing methods, were established. Apart from accuracy and precision, a number of criteria of special interest for analytical practice were applied. SRM bovine liver served as the test sample; six elements were determined simultaneously in this material. The results showed that every procedure has its defects and advantages. Hence, an unambiguous recommendation of standard digestion procedures can be made only by taking into account the specific analytical problem.

  11. A comparison of various procedures in photon activation analysis with the same irradiation setup

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Z.J. [Chemical Sciences and Engineering Division, Argonne National Laboratory, 9700 S. Cass Ave., Argonne, IL 60439 (United States); Wells, D. [Physics Department, South Dakota School of Mines and Technology, 501 E. Saint Joseph St., Rapid City, SD 57701 (United States); Segebade, C. [Idaho Accelerator Center, Idaho State University, 921 S. 8th Ave., Pocatello, ID 83209 (United States); Quigley, K.; Chemerisov, S. [Chemical Sciences and Engineering Division, Argonne National Laboratory, 9700 S. Cass Ave., Argonne, IL 60439 (United States)

    2014-11-15

    A sample of known elemental concentrations was activated in the bremsstrahlung photon beam created by a pulsed electron LINAC. Several procedures of photon activation analysis, including those applied with/without reference material and with/without a photon flux monitor, were conducted to compare their precision and accuracy in practice. Experimental results indicated that: (1) relative procedures usually produce better outcomes, even though the absolute measurement is straightforward and eliminates the need for reference materials; (2) among relative procedures, the method with an internal flux monitor yields higher-quality analytical results. The pros and cons of each procedure are discussed as well.

  12. Development of a Field Management Standard for Improving Human Factors

    International Nuclear Information System (INIS)

    Yun, Young Su; Son, Il Moon; Son, Byung Chang; Kwak, Hyo Yean

    2009-07-01

    This project develops a management guideline for improving human performance as a part of the Human Factors Management System of Kori unit 1, which manages all human factors items such as man-machine system interfaces, work procedures, work environments, and human reliability in nuclear power plants. Human factors engineering includes a human factors suitability analysis and improvement of human work, an analysis of accidents caused by human error, improvement of the work environment, establishment of human factors management rules, and development of human resources to manage and perform those tasks consistently. To assist these human factors engineering tasks, we developed human factors management guidelines, checklists and work procedures to be used in staffing, qualification, training, and human information requirements and workload. We also provided a software tool for managing the above items. Additionally, contents and an item pool for a human factors qualifying examination and training programs were developed. A procedures improvement and a human factors V and V on Kori unit 1 have also been completed as part of this project.

  13. Factor analysis in optimization of formulation of high content uniformity tablets containing low dose active substance.

    Science.gov (United States)

    Lukášová, Ivana; Muselík, Jan; Franc, Aleš; Goněc, Roman; Mika, Filip; Vetchý, David

    2017-11-15

    Warfarin is an intensively discussed drug with a narrow therapeutic range. There have been cases of bleeding attributed to varying content or altered quality of the active substance. Factor analysis is useful for finding suitable technological parameters leading to high content uniformity of tablets containing a low amount of active substance. The composition of the tabletting blend and the technological procedure were set with respect to factor analysis of previously published results. The correctness of the set parameters was checked by manufacturing and evaluating tablets containing 1-10 mg of warfarin sodium. The robustness of the suggested technology was checked using a "worst case scenario" and statistical evaluation of European Pharmacopoeia (EP) content uniformity limits with respect to the Bergum division and the process capability index (Cpk). To evaluate the quality of the active substance and tablets, a dissolution method was developed (water; EP apparatus II; 25 rpm), allowing for statistical comparison of dissolution profiles. The obtained results prove the suitability of factor analysis to optimize the composition with respect to previously manufactured batches, and thus the use of meta-analysis under industrial conditions is feasible. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  15. An automated sensitivity analysis procedure for the performance assessment of nuclear waste isolation systems

    International Nuclear Information System (INIS)

    Pin, F.G.; Worley, B.A.; Oblow, E.M.; Wright, R.Q.; Harper, W.V.

    1986-01-01

    To support an effort in making large-scale sensitivity analyses feasible, cost efficient and quantitatively complete, the authors have developed an automated procedure making use of computer calculus. The procedure, called GRESS (GRadient Enhanced Software System), is embodied in a precompiler that can process Fortran computer codes and add derivative-taking capabilities to the normal calculation scheme. In this paper, the automated GRESS procedure is described and applied to the code UCB-NE-10.2, which simulates the migration through a sorption medium of the radionuclide members of a decay chain. The sensitivity calculations for a sample problem are verified using comparison with analytical and perturbation analysis results. Conclusions are drawn relative to the applicability of GRESS for more general large-scale sensitivity studies, and the role of such techniques in an overall sensitivity and uncertainty analysis program is discussed
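    The derivative-taking capability that GRESS adds to Fortran codes is a form of automatic differentiation. A minimal sketch of the same idea in Python, using forward-mode dual numbers; the `Dual` class and the one-member decay example are illustrative stand-ins, not part of GRESS itself:

```python
import math

class Dual:
    """Forward-mode AD pair: function value and derivative w.r.t. one input."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def d_exp(x):
    """exp with the chain rule applied to the derivative part."""
    return Dual(math.exp(x.val), math.exp(x.val) * x.der)

# Sensitivity of a one-member decay model A = A0 * exp(-lam * t) to lam:
def activity(lam, A0=100.0, t=2.0):
    return A0 * d_exp(-t * lam)

out = activity(Dual(0.5, 1.0))   # seed d(lam)/d(lam) = 1
# out.val is A(0.5); out.der is dA/dlam = -t * A(0.5), no finite differences
```

    As in GRESS, the derivative propagates through the normal calculation scheme, so a sensitivity study needs no perturbed reruns of the code.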

  16. Analysis of spatio-temporal variability of C-factor derived from remote sensing data

    Science.gov (United States)

    Pechanec, Vilem; Benc, Antonin; Purkyt, Jan; Cudlin, Pavel

    2016-04-01

    In some risk areas, water erosion has a strong influence on agriculture and can threaten inhabitants. In our country, a combination of the USLE and RUSLE models has been used for water erosion assessment (Krása et al., 2013). The role of vegetation cover is characterized by the vegetation protection factor, the so-called C-factor. The value of the C-factor is given by the ratio of soil loss on a plot with arable crops to that on a standard plot kept as bare fallow, regularly tilled after each rain (Janeček et al., 2012). Under conditions where the crop structure and rotation cannot be identified, determining the C-factor over large areas is a problem; in such cases it can only be determined from the average crop representation. New technologies open possibilities for accelerating and refining the approach. The present-day approach to C-factor determination is based on the analysis of multispectral image data. The red and infrared bands are extracted and used to compute a series of vegetation indices (NDVI, TSAVI). Values acquired for fractional time sections (during the vegetation period) are averaged. At the same time, vegetation index values for a forest and a cleared area are determined, and regression coefficients are computed. The final calculation uses regression equations expressing the relation between NDVI and C-factor values (De Jong, 1994; Van der Knijff, 1999; Karaburun, 2010). An up-to-date land use layer is used to determine erosion-threatened areas, based on the selection of individual landscape segments in erosion-susceptible land use categories. By means of Landsat 7 data, the C-factor has been determined for the whole area of the Czech Republic for every month of 2014. In a small model watershed, the C-factor has also been determined by the conventional (tabular) procedure. The analysis was focused on: i) variability assessment of C-factor values while using the conventional
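    The NDVI-to-C-factor regression step can be sketched as follows. The exponential form and the α, β coefficients are the ones commonly quoted from Van der Knijff (1999) and are used here illustratively, not calibrated to the Czech study area; the monthly NDVI values are hypothetical:

```python
import math

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def c_factor(v, alpha=2.0, beta=1.0):
    """C-factor from an NDVI value v via the exponential regression form
    C = exp(-alpha * v / (beta - v)); alpha, beta are empirical coefficients."""
    return math.exp(-alpha * v / (beta - v))

# Average NDVI over the vegetation period, then convert, mirroring the
# procedure described above (values are hypothetical pixel NDVIs):
monthly_ndvi = [0.35, 0.50, 0.62, 0.58]
c_mean = c_factor(sum(monthly_ndvi) / len(monthly_ndvi))

# Denser vegetation (higher NDVI) should yield better protection (lower C):
c_dense = c_factor(ndvi(nir=0.5, red=0.1))
c_sparse = c_factor(ndvi(nir=0.3, red=0.2))
```
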

  17. Analysis of Elementary School students’ algebraic perceptions and procedures

    Directory of Open Access Journals (Sweden)

    Sandra Mara Marasini

    2012-12-01

    Full Text Available This study aims to verify how students in elementary school see themselves in relation to mathematics and, at the same time, to analyze the procedures they use to solve algebraic tasks. Students in the 8th year of elementary school and in the first and third years of high school, from two state schools in Passo Fundo/RS, answered a questionnaire about their perceptions of the mathematics lessons, mathematics as a subject, and algebraic content. The analysis was based mainly on authors from the mathematical education and historic-cultural psychology areas. It was verified that even among students who claimed to be happy with the idea of having mathematics classes, several presented learning difficulties regarding algebraic content, revealed by the procedures they employed. It was concluded that it is necessary to design proposals with didactic sequences, mathematically and pedagogically grounded, which can efficiently optimize the appropriation of meaning of the concepts approached and their application in different situations.

  18. A procedure for the determination of scenario earthquakes for seismic design based on probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Hirose, Jiro; Muramatsu, Ken

    2002-03-01

    This report presents a study on procedures for the determination of scenario earthquakes for the seismic design of nuclear power plants (NPPs) based on probabilistic seismic hazard analysis (PSHA). In recent years, the use of PSHA, which is a part of seismic probabilistic safety assessment (PSA), to determine the design basis earthquake motions for NPPs has been proposed. The identified earthquakes are called probability-based scenario earthquakes (PBSEs). The concept of PBSEs originates both from a study by the US NRC and from Ishikawa and Kameda. The assessment of PBSEs is composed of seismic hazard analysis and identification of dominant earthquakes. The objectives of this study are to formulate the concept of PBSEs and to examine the procedures for determining PBSEs for a domestic NPP site. This report consists of three parts, namely, procedures to compile analytical conditions for PBSEs, an assessment to identify PBSEs for a model site using Ishikawa's concept, and an examination of the uncertainties involved in the analytical conditions. The results obtained from the examination of PBSEs using Ishikawa's concept are as follows. (a) Since PBSEs are expressed by a hazard-consistent magnitude and distance in terms of a prescribed reference probability, it is easy to obtain a concrete image of the earthquakes that determine the ground response spectrum to be considered in the design of NPPs. (b) Source contribution factors provide information on the importance of the earthquake source regions and/or active faults, and allow the selection of a couple of PBSEs based on their importance to the site. (c) Since analytical conditions involve uncertainty, sensitivity analyses on the uncertainties that would affect seismic hazard curves and the identification of PBSEs were performed on various aspects and provided useful insights for the assessment of PBSEs. A result from this sensitivity analysis was that, although the difference in selection of attenuation equations led to a

  19. Bilateral effects of hospital patient-safety procedures on nurses' job satisfaction.

    Science.gov (United States)

    Inoue, T; Karima, R; Harada, K

    2017-09-01

    The aim of this study was to examine how hospital patient-safety procedures affect the job satisfaction of hospital nurses. Additionally, we investigated the association between perceived autonomy, hospital patient-safety procedures and job satisfaction. Recently, measures for patient safety have been recognized as an essential requirement in hospitals. Hospital patient-safety procedures may enhance the job satisfaction of nurses by improving the quality of their work. However, such procedures may also decrease their job satisfaction by imposing excessive stress on nurses because they cannot make mistakes. The participants included 537 nurses at 10 private hospitals in Japan (the surveys were collected from March to July 2012). Factors related to hospital patient-safety procedures were identified using factor analysis, and the associations between these factors and nurses' self-perceived autonomy and job satisfaction were examined using structural equation modelling. Five factors regarding hospital patient-safety procedures were extracted. Additionally, structural equation modelling revealed statistically significant associations between these factors and the nurses' self-perceived autonomy and job satisfaction. The findings showed that nurses' perceived autonomy in the workplace enhanced their job satisfaction and that their perceptions of hospital patient-safety procedures promoted their job satisfaction. However, some styles of chief nurses' leadership regarding patient safety restrict nurses' independent and autonomous decision-making and actions, resulting in lower job satisfaction. This study demonstrated that hospital patient-safety procedures have ambiguous effects on nurses' job satisfaction. In particular, chief nurses' leadership relating to patient safety can have a positive or negative effect on nurses' job satisfaction. The findings indicated that hospital managers should demonstrate positive attitudes to improve patient safety for

  20. An Improved Multidimensional MPA Procedure for Bidirectional Earthquake Excitations

    Directory of Open Access Journals (Sweden)

    Feng Wang

    2014-01-01

    Full Text Available Presently, the modal pushover analysis procedure is extended to multidimensional analysis of structures subjected to multidimensional earthquake excitations. An improved multidimensional modal pushover analysis (IMMPA) method is presented in the paper to estimate the response demands of structures subjected to bidirectional earthquake excitations, in which the unidirectional earthquake excitation applied to the equivalent SDOF system is replaced by the direct superposition of the two earthquake excitation components; independent analysis in each direction is thus not required, and the application of simplified superposition formulas is avoided. The strength reduction factor spectra based on superposition of earthquake excitations are discussed and compared with the traditional strength reduction factor spectra. A step-by-step procedure is proposed to estimate the seismic demands of structures. Two examples are implemented to verify the accuracy of the method, and the results show that: (1) the IMMPA method can be used to estimate the responses of structures subjected to bidirectional earthquake excitations; (2) as the peak earthquake acceleration increases, the structural response deviation estimated with the IMMPA method may also increase; (3) as the total number of floors increases, the structural response deviation estimated with the IMMPA method may also increase.

  1. A Procedure for the Computerized Analysis of Cleft Palate Speech Transcription

    Science.gov (United States)

    Fitzsimons, David A.; Jones, David L.; Barton, Belinda; North, Kathryn N.

    2012-01-01

    The phonetic symbols used by speech-language pathologists to transcribe speech contain underlying hexadecimal values used by computers to correctly display and process transcription data. This study aimed to develop a procedure to utilise these values as the basis for subsequent computerized analysis of cleft palate speech. A computer keyboard…

  2. Policy analysis of authorisation procedures for wind energy deployment in Spain

    International Nuclear Information System (INIS)

    Iglesias, Guillermo; Rio, Pablo del; Dopico, Jesus Angel

    2011-01-01

    The aim of this paper is to analyse the administrative procedures for the granting of authorisations for the siting of wind farms in Spain, currently the competency of regional authorities. The analysis reveals some commonalities and differences between the procedures across regions. Furthermore, some aspects of these procedures have raised the concern of different stakeholders, including the central government and wind energy investors. A conflict between the interests of the central and regional governments can be observed. Lack of coordination between the different administrative levels and the 'more is better' mentality of regional authorities have led to a significant growth of wind energy requests for the (national) feed-in tariff. In turn, investors have complained about the discretionary and non-transparent nature of those procedures and their lack of homogeneity across regions. This is likely to result in delays, uncertainty for investors and higher transaction costs. Although there has been a trend towards a model involving multicriteria bidding procedures with more explicit, objective and precise project selection criteria, the aforementioned problems suggest the need to improve coordination between the different administrative levels. - Highlights: → A conflict between the interests of the central and regional governments in the granting of administrative authorisations can be observed. → Lack of coordination between different administrative levels has led to a significant growth of wind energy requests for the (national) feed-in tariff. → The resulting increase in the total costs of wind energy promotion has been a major concern for national policy-makers. → In turn, investors have complained about the discretionary and non-transparent nature of those procedures and the lack of homogeneity across regions. → Those problems suggest the need to improve coordination between the different administrative levels.

  3. Procedure-related risk of miscarriage following amniocentesis and chorionic villus sampling: a systematic review and meta-analysis.

    Science.gov (United States)

    Akolekar, R; Beta, J; Picciarelli, G; Ogilvie, C; D'Antonio, F

    2015-01-01

    To estimate procedure-related risks of miscarriage following amniocentesis and chorionic villus sampling (CVS) based on a systematic review of the literature and a meta-analysis. A search of MEDLINE, EMBASE, CINAHL and The Cochrane Library (2000-2014) was performed to review relevant citations reporting procedure-related complications of amniocentesis and CVS. Only studies reporting data on more than 1000 procedures were included in this review to minimize the effect of bias from smaller studies. Heterogeneity between studies was estimated using Cochran's Q, the I(2) statistic and Egger bias. Meta-analysis of proportions was used to derive weighted pooled estimates for the risk of miscarriage before 24 weeks' gestation. Incidence-rate difference meta-analysis was used to estimate pooled procedure-related risks. The weighted pooled risks of miscarriage following invasive procedures were estimated from analysis of controlled studies including 324 losses in 42 716 women who underwent amniocentesis and 207 losses in 8899 women who underwent CVS. The risk of miscarriage prior to 24 weeks in women who underwent amniocentesis and CVS was 0.81% (95% CI, 0.58-1.08%) and 2.18% (95% CI, 1.61-2.82%), respectively. The background rates of miscarriage in women from the control group that did not undergo any procedures were 0.67% (95% CI, 0.46-0.91%) for amniocentesis and 1.79% (95% CI, 0.61-3.58%) for CVS. The weighted pooled procedure-related risks of miscarriage for amniocentesis and CVS were 0.11% (95% CI, -0.04 to 0.26%) and 0.22% (95% CI, -0.71 to 1.16%), respectively. The procedure-related risks of miscarriage following amniocentesis and CVS are much lower than those currently quoted. Copyright © 2014 ISUOG. Published by John Wiley & Sons Ltd.
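    The weighted pooling of proportions can be sketched with a fixed-effect inverse-variance model. This is a deliberate simplification of the paper's method (which also handles heterogeneity and incidence-rate differences), and the two example studies are hypothetical:

```python
import math

def pooled_proportion(studies):
    """Fixed-effect inverse-variance pooling of event proportions.
    `studies` is a list of (events, n) pairs; returns the pooled
    proportion and a 95% normal-approximation confidence interval."""
    weights, props = [], []
    for events, n in studies:
        p = events / n
        var = p * (1.0 - p) / n          # binomial variance of a proportion
        weights.append(1.0 / var)
        props.append(p)
    total_w = sum(weights)
    pooled = sum(w * p for w, p in zip(weights, props)) / total_w
    se = math.sqrt(1.0 / total_w)        # SE of the inverse-variance estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Two hypothetical cohorts: 10 losses in 1000 women, 20 losses in 1000 women.
pooled, ci = pooled_proportion([(10, 1000), (20, 1000)])
```

    Studies with smaller variance (larger n, proportions nearer 0 or 1) receive proportionally more weight, which is what "weighted pooled estimate" means in the abstract.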

  4. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of the design of experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, yet not so simple that it loses the information from the alternative hypothesis. The first step is to transform the distributions of the different test statistics (e.g., t, chi-square or F) into the distributions of the corresponding p-values. We then use a step function to approximate each p-value distribution by matching its mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
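    The first step described above — transforming the distribution of a test statistic into the distribution of its p-value — has a closed form for a one-sided z-test, which makes a convenient reference point for any step-function approximation. A sketch under that simple setting, not the paper's general model:

```python
from statistics import NormalDist

_std = NormalDist()

def pvalue_cdf(t, delta):
    """CDF of the p-value under the alternative for a one-sided z-test:
    P = 1 - Phi(Z) with Z ~ N(delta, 1), so
    P(P <= t) = 1 - Phi(Phi^{-1}(1 - t) - delta).
    Evaluated at t = alpha this is exactly the power of the test."""
    return 1.0 - _std.cdf(_std.inv_cdf(1.0 - t) - delta)

power = pvalue_cdf(0.05, 2.8)   # power at alpha = 0.05, effect delta = 2.8
```

    A step-function model would replace this smooth CDF by a piecewise-constant density whose mean and variance match the exact p-value distribution, so that power calculations reduce to sums instead of integrals.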

  5. Simplified procedures for fast reactor fuel cycle and sensitivity analysis

    International Nuclear Information System (INIS)

    Badruzzaman, A.

    1979-01-01

    The Continuous Slowing Down-Integral Transport Theory has been extended to perform criticality calculations in a fast reactor core-blanket system, achieving excellent prediction of the spectrum and the eigenvalue. The integral transport parameters did not need recalculation with source iteration and were found to be relatively constant with exposure. Fuel cycle parameters were accurately predicted when these were not varied, thus reducing a principal potential penalty of the Integral Transport approach, in which considerable effort may be required to calculate transport parameters in more complicated geometries. The small variation of the spectrum in the central core region, and its weak dependence on exposure for this region, the core-blanket interface and the blanket region, led to the extension and development of inexpensive simplified procedures to complement exact methods. These procedures gave accurate predictions of key fuel cycle parameters such as cost and their sensitivity to variations in spectrum-averaged and multigroup cross sections. They also predicted the implications of design variations on these parameters very well. The accuracy of these procedures and their use in analyzing a wide variety of sensitivities demonstrate the potential utility of survey calculations in fast reactor analysis and fuel management

  6. An improved and explicit surrogate variable analysis procedure by coefficient adjustment.

    Science.gov (United States)

    Lee, Seunggeun; Sun, Wei; Wright, Fred A; Zou, Fei

    2017-06-01

    Unobserved environmental, demographic, and technical factors can negatively affect the estimation and testing of the effects of primary variables. Surrogate variable analysis, proposed to tackle this problem, has been widely used in genomic studies. To estimate hidden factors that are correlated with the primary variables, surrogate variable analysis performs principal component analysis either on a subset of features or on all features, but weighting each differently. However, existing approaches may fail to identify hidden factors that are strongly correlated with the primary variables, and the extra step of feature selection and weight calculation makes the theoretical investigation of surrogate variable analysis challenging. In this paper, we propose an improved surrogate variable analysis using all measured features that has a natural connection with restricted least squares, which allows us to study its theoretical properties. Simulation studies and real data analysis show that the method is competitive to state-of-the-art methods.
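    The core residual-PCA step of surrogate variable analysis can be sketched in a few lines of NumPy. This toy version (simulated data, one hidden factor independent of the primary variable) omits the feature weighting and the coefficient adjustment that the paper actually contributes:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 200
x = rng.standard_normal(n)            # primary variable
h = rng.standard_normal(n)            # hidden factor we hope to recover
# Features: some affected by x, all affected by the hidden factor h.
Y = (np.outer(x, rng.standard_normal(p) * 0.5)
     + np.outer(h, rng.standard_normal(p))
     + 0.1 * rng.standard_normal((n, p)))

# Step 1: remove the primary-variable effect by least squares.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
R = Y - X @ beta

# Step 2: leading left singular vector of the residuals = surrogate variable.
U, s, Vt = np.linalg.svd(R, full_matrices=False)
sv = U[:, 0]   # tracks h up to sign when h dominates the residuals
```

    The failure mode the paper addresses is visible here: if `h` were strongly correlated with `x`, step 1 would remove much of `h` along with the primary effect, and the residual PCA could miss it.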

  7. Contribution of the ergonomic analysis to the improvement of the design of operating procedures in nuclear power plants

    International Nuclear Information System (INIS)

    Dien, Y.; Montmayeul, R.

    1992-11-01

    The design of operating procedures for continuous processes is much too often based on implicit assumptions both concerning the operators and the operating conditions that must be dealt with. The merit of the ergonomic approach to the design of procedures is to take account of the way the various operators actually use operating procedures. The actual use is determined from the analysis of on-site operation (normal and incident operating conditions) and the analysis of full-scale simulators tests (incident operating conditions). The introduction of the ergonomic approach in the procedure design results in new design principles being proposed

  8. The Infinitesimal Jackknife with Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  9. Maturation of arteriovenous fistula: Analysis of key factors

    Directory of Open Access Journals (Sweden)

    Muhammad A. Siddiqui

    2017-12-01

    Full Text Available The growing proportion of individuals suffering from chronic kidney disease has considerable repercussions for both kidney specialists and primary care. Progressive and permanent renal failure is most frequently treated with hemodialysis. The efficiency of hemodialysis treatment relies on the functional status of vascular access. Determining the type of vascular access has prime significance for maximizing successful maturation of a fistula and avoiding surgical revision. Despite the frequency of arteriovenous fistula procedures, there are no consistent criteria applied before creation of arteriovenous fistulae. Increased prevalence and use of arteriovenous fistulae would result if there were reliable criteria to assess which arteriovenous fistulae are more likely to reach maturity without additional procedures. Published studies assessing the predictive markers of fistula maturation vary to a great extent with regard to definitions, design, study size, patient sample, and clinical factors. As a result, surgeons and specialists must decide which possible risk factors are most likely to occur, as well as which parameters to employ when evaluating the success rate of fistula development in patients awaiting the creation of permanent access. The purpose of this literature review is to discuss the role of patient factors and blood markers in the development of arteriovenous fistulae.

  10. Development of SRC-I product analysis. Volume 3. Documentation of procedures

    Energy Technology Data Exchange (ETDEWEB)

    Schweighardt, F.K.; Kingsley, I.S.; Cooper, F.E.; Kamzelski, A.Z.; Parees, D.M.

    1983-09-01

    This section documents the BASIC computer program written to simulate Wilsonville's GC-simulated distillation (GCSD) results at APCI-CRSD Trexlertown. The GC conditions used at APCI for the Wilsonville GCSD analysis of coal-derived liquid samples were described in the SRC-I Quarterly Technical Report, April-June 1981. The approach used to simulate the Wilsonville GCSD results is also from an SRC-I Quarterly Technical Report and is reproduced in Appendix VII-A. The BASIC computer program is described in the attached Appendix VII-B. Analysis of gases produced during coal liquefaction generates key information needed to determine product yields for material balance and process control. Gas samples from the coal process development unit (CPDU) and tubing bombs are the primary samples analyzed. A Carle gas chromatographic system was used to analyze coal liquefaction gas samples, and a BASIC computer program was written to convert the gas chromatographic peak areas into mole percent results. ICRC has employed several analytical workup procedures to determine the amount of distillate, oils, asphaltenes, preasphaltenes, and residue in SRC-I process streams. The ASE procedure was developed using Conoco's liquid column fractionation (LC/F) method as a model. In developing the ASE procedure, ICRC was able to eliminate distillation and therefore quantify the oils fraction in one extraction step. ASE results were shown to be reproducible within ±2 wt%, and to yield acceptable material balances. Finally, the ASE method proved to be the least affected by sample composition.
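    The peak-area-to-mole-percent conversion reduces to a response-factor-weighted normalization. A sketch in Python rather than the original BASIC; the response factors shown are hypothetical, since real ones come from calibration-gas runs:

```python
def mole_percent(areas, response_factors):
    """Convert GC peak areas to mole percent: scale each area by its
    component's molar response factor, then normalize to 100%."""
    moles = [a * rf for a, rf in zip(areas, response_factors)]
    total = sum(moles)
    return [100.0 * m / total for m in moles]

# Hypothetical two-component gas sample with unit response factors:
composition = mole_percent([30.0, 70.0], [1.0, 1.0])
```
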

  11. A highly rationalized procedure of neutron activation analysis for routine applications in dairy science

    International Nuclear Information System (INIS)

    Heine, K.; Wiechen, A.

    1976-01-01

    A rational procedure was developed for multi-element neutron activation analysis for applications in dairy science. Sample preparation prior to irradiation consists of drying, weighing, and sealing in quartz ampoules. The neutron flux to which samples are exposed during reactor irradiation is determined by the mono-comparator technique, for which the Co content of a commercial aluminium foil was chosen as the flux monitor. The constancy of the Co content and the uncomplicated handling of the foil substantially simplify the flux determination. The samples are irradiated for 72 h, dissolved in HNO3/H2SO4, and measured in the liquid state after waiting times of 1-2, 4 and 8 weeks with a Ge(Li) detector and a 4,096-channel spectrometer. The procedure was validated by investigations of the biological KALE standard and by participation in intercomparisons of biological substances of the Analytical Quality Control Service of the IAEA for the analysis of the elements Na, Ca, Cr, Fe, Co, Zn, Se, Rb, and Cs. Thus a procedure was developed that is suitable for routine multi-element analysis of biological samples, with all analytical steps optimized and rationalized. (orig./MG) [de

  12. Immunoautoradiographic analysis of epidermal growth factor receptors: a sensitive method for the in situ identification of receptor proteins and for studying receptor specificity

    International Nuclear Information System (INIS)

    Fernandez-Pol, J.A.

    1982-01-01

    The use of an immunoautoradiographic system for the detection and analysis of epidermal growth factor (EGF) receptors in human epidermoid carcinoma A-431 cells is reported. By utilizing this technique, the interaction between EGF and its membrane receptor in A-431 cells can be rapidly visualized. The procedure is simple, rapid, and very sensitive, and it provides conclusive evidence that the 150-kDa protein is the receptor for EGF in A-431 cells. In summary, the immunoautoradiographic procedure brings to the analysis of hormone receptor proteins the power that the radioimmunoassay technique has brought to the analysis of hormones. Thus, this assay system is potentially applicable in many fields of nuclear medicine and biology

  13. Left ventricular wall motion abnormalities evaluated by factor analysis as compared with Fourier analysis

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Ikuno, Yoshiyasu; Nishikimi, Toshio

    1986-01-01

    Factor analysis was applied to multigated cardiac pool scintigraphy to evaluate its ability to detect left ventricular wall motion abnormalities in 35 patients with old myocardial infarction (MI), and in 12 control cases with normal left ventriculography. All cases were also evaluated by conventional Fourier analysis. In most cases with normal left ventriculography, the ventricular and atrial factors were extracted by factor analysis. In cases with MI, the third factor was obtained in the left ventricle corresponding to wall motion abnormality. Each case was scored according to the coincidence of findings of ventriculography and those of factor analysis or Fourier analysis. Scores were recorded for three items; the existence, location, and degree of asynergy. In cases of MI, the detection rate of asynergy was 94 % by factor analysis, 83 % by Fourier analysis, and the agreement in respect to location was 71 % and 66 %, respectively. Factor analysis had higher scores than Fourier analysis, but this was not significant. The interobserver error of factor analysis was less than that of Fourier analysis. Factor analysis can display locations and dynamic motion curves of asynergy, and it is regarded as a useful method for detecting and evaluating left ventricular wall motion abnormalities. (author)
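    Conventional Fourier wall-motion analysis reduces each pixel's gated time-activity curve to the amplitude and phase of its first harmonic; phase images built from these values are what reveal asynergy. A minimal sketch, with a synthetic 8-frame curve standing in for real gated-pool data:

```python
import math

def first_harmonic(counts):
    """Amplitude and phase of the first Fourier harmonic of a gated
    time-activity curve sampled at equally spaced frames."""
    n = len(counts)
    a = sum(c * math.cos(2 * math.pi * k / n) for k, c in enumerate(counts))
    b = sum(c * math.sin(2 * math.pi * k / n) for k, c in enumerate(counts))
    return 2.0 * math.hypot(a, b) / n, math.atan2(b, a)

# Synthetic curve: baseline 10, amplitude 3, phase 1.0 rad over 8 frames.
frames = [10 + 3 * math.cos(2 * math.pi * k / 8 - 1.0) for k in range(8)]
amp, phase = first_harmonic(frames)
```

    A region whose phase differs markedly from its neighbours, or whose amplitude is depressed, is flagged as asynergic; factor analysis instead extracts such regions as separate dynamic components.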

  14. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    Science.gov (United States)

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.
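    Parallel analysis, one of the retention methods counted above, is easy to state in code. A NumPy sketch with simulated two-factor data; the simulation settings are illustrative, not taken from the reviewed articles:

```python
import numpy as np

def parallel_analysis(data, n_sims=100, seed=0):
    """Horn's parallel analysis: retain as many factors as there are
    observed correlation-matrix eigenvalues exceeding the mean
    eigenvalues obtained from random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sim = np.zeros(p)
    for _ in range(n_sims):
        r = rng.standard_normal((n, p))
        sim += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    return int(np.sum(obs > sim / n_sims))

# Simulated data: two clear latent factors across 10 variables, n = 300.
rng = np.random.default_rng(1)
f = rng.standard_normal((300, 2))
load = np.zeros((2, 10))
load[0, :5] = 1.0   # factor 1 drives the first five variables
load[1, 5:] = 1.0   # factor 2 drives the last five
data = f @ load + 0.3 * rng.standard_normal((300, 10))
k = parallel_analysis(data)
```

    Unlike the scree plot, this rule needs no visual judgment, which is one reason methodologists recommend it over eigenvalue-greater-than-one heuristics.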

  15. Estimation of the behavior factor of existing RC-MRF buildings

    Science.gov (United States)

    Vona, Marco; Mastroberti, Monica

    2018-01-01

    In recent years, several research groups have studied a new generation of analysis methods for the seismic response assessment of existing buildings. Nevertheless, many important developments are still needed in order to define more reliable and effective assessment procedures. Moreover, for existing buildings it should be highlighted that, due to the low knowledge level, linear elastic analysis is the only analysis method allowed. Codes such as NTC2008 and EC8 consider linear dynamic analysis with a behavior factor as the reference method for the evaluation of seismic demand. This type of analysis is based on a linear-elastic structural model subjected to a design spectrum, obtained by reducing the elastic spectrum through a behavior factor. The behavior factor (reduction factor, or q factor in some codes) is used to reduce the elastic spectrum ordinates or the forces obtained from a linear analysis in order to take into account the non-linear structural capacities. Behavior factors should be defined based on the several parameters that influence the seismic non-linear capacity, such as the mechanical characteristics of the materials, the structural system, irregularity, and design procedures. In practical applications, there is still an evident lack of detailed rules and of accurate behavior factor values adequate for existing buildings. In this work, the seismic capacity of the main existing RC-MRF building types has been investigated. In order to make a correct evaluation of the seismic force demand, actual behavior factor values consistent with a force-based seismic safety assessment procedure have been proposed and compared with the values reported in the Italian seismic code, NTC08.
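
    The behavior factor enters the demand calculation as a simple division of the elastic spectral ordinate, usually subject to a code-imposed lower bound. A minimal sketch; the lower-bound coefficient and the numbers are EC8-style illustrations, not values proposed by the paper:

```python
def design_ordinate(elastic_sa, q, beta=0.2, ag=1.0):
    """Design spectral ordinate: the elastic ordinate Se reduced by the
    behavior factor q, but never below the code lower bound beta*ag."""
    return max(elastic_sa / q, beta * ag)

# Se = 0.6 g: a moderately ductile frame (q = 2) vs. a high q where
# the lower bound governs
print(design_ordinate(0.6, 2.0))   # 0.3
print(design_ordinate(0.6, 3.9))   # 0.2 (lower bound governs)
```

    Overestimating q for an existing building therefore directly underestimates the design force demand, which is why the paper argues for values calibrated to existing RC-MRF types.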

  16. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    Science.gov (United States)

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected with the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
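
    The core geometric step, mapping stacked point sections onto voxels that can each become a solid finite element, can be sketched as follows. This is a simplified illustration of the idea, not the authors' semi-automatic pipeline:

```python
import numpy as np

def voxelize(points, voxel_size):
    """Map a 3-D point cloud to an occupancy grid of cubic voxels.
    Each occupied voxel could later become one hexahedral element."""
    mins = points.min(axis=0)
    idx = np.floor((points - mins) / voxel_size).astype(int)
    shape = idx.max(axis=0) + 1
    grid = np.zeros(shape, dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid

# Toy cloud: three survey points collapse into two occupied voxels
pts = np.array([[0.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.1, 0.0, 1.9]])
grid = voxelize(pts, voxel_size=1.0)
print(grid.shape, int(grid.sum()))
```

    The voxel size trades geometric fidelity against model size, which is exactly the discretization-fineness choice discussed in the paper.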

  17. Four points function fitted and first derivative procedure for determining the end points in potentiometric titration curves: statistical analysis and method comparison.

    Science.gov (United States)

    Kholeif, S A

    2001-06-01

    A new method, belonging to the differential category, for determining the end points of potentiometric titration curves is presented. It uses a preprocessing step to find first-derivative values by fitting four data points in and around the region of inflection to a non-linear function, and then locates the end point, usually a maximum or minimum, using an inverse parabolic interpolation procedure that has an analytical solution. The behavior and accuracy of the sigmoid and cumulative non-linear functions used are investigated against three factors. A statistical evaluation of the new method using linear least-squares method validation and multifactor data analysis is also covered. The new method applies generally to symmetrical and unsymmetrical potentiometric titration curves, and the end point is calculated using numerical procedures only. It outperforms the "parent" regular differential method at almost all factor levels and gives accurate results comparable to the true or estimated true end points. Calculated end points from selected experimental titration curves are also compared between the new method and the equivalence-point category of methods, such as those of Gran or Fortuin.
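
    The inverse parabolic interpolation step mentioned in the abstract has a closed form: the vertex abscissa of the parabola through three (x, y) points. A sketch of that step alone (not the paper's four-point non-linear fitting functions):

```python
def parabola_vertex(x, y):
    """Abscissa of the vertex of the parabola through three points,
    the classic inverse parabolic interpolation step used to locate
    an extremum of the first derivative."""
    x0, x1, x2 = x
    y0, y1, y2 = y
    num = (x1 - x0) ** 2 * (y1 - y2) - (x1 - x2) ** 2 * (y1 - y0)
    den = (x1 - x0) * (y1 - y2) - (x1 - x2) * (y1 - y0)
    return x1 - 0.5 * num / den

# Derivative values peaking near the end point of a titration:
ep = parabola_vertex([9.8, 10.0, 10.2], [4.0, 5.0, 4.0])
print(ep)  # 10.0 (symmetric points put the vertex at the middle abscissa)
```

    In the paper the y-values come from the fitted non-linear function rather than raw data, which is what suppresses the noise sensitivity of the regular differential method.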

  18. Lithuanian Population Aging Factors Analysis

    Directory of Open Access Journals (Sweden)

    Agnė Garlauskaitė

    2015-05-01

    Full Text Available The aim of this article is to identify the factors that determine the aging of Lithuania's population and to assess their influence. The analysis consists of two main parts: the first describes population aging and its characteristics in theoretical terms; the second is dedicated to assessing the demographic trends and factors that influence population aging in Lithuania. The analysis concludes that the decline in the birth rate and the excess of emigrants over immigrants have the greatest impact on population aging, so, in order to slow the aging of the population, considerable attention should be paid to the management of these demographic processes.

  19. Cell-based land use screening procedure for regional siting analysis

    International Nuclear Information System (INIS)

    Jalbert, J.S.; Dobson, J.E.

    1976-01-01

    An energy facility site-screening methodology which permits the land resource planner to identify candidate siting areas was developed. Through the use of spatial analysis procedures and computer graphics, a selection of candidate areas is obtained. Specific sites may then be selected from among the candidate areas for environmental impact analysis. The computerized methodology utilizes a cell-based geographic information system for specifying the suitability of candidate areas for an energy facility. The criteria to be considered may be specified by the user and weighted in terms of importance. Three primary computer programs have been developed; these programs produce thematic maps, proximity calculations, and suitability calculations. The programs are written so as to be transferable to regional planning or regulatory agencies to assist in rational and comprehensive power plant site identification and analysis.
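
    The suitability calculation described here is, at its core, a weighted overlay of criterion rasters. A minimal cell-based sketch; the layer names, scores and weights are hypothetical:

```python
import numpy as np

def suitability(layers, weights):
    """Weighted overlay: each layer scores cells 0..1 and the weights
    express the user's relative importance of each criterion."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalize user weights
    return np.tensordot(w, np.asarray(layers, dtype=float), axes=1)

slope = np.array([[1.0, 0.5], [0.2, 0.0]])   # flatter terrain scores higher
water = np.array([[0.8, 0.9], [0.1, 0.3]])   # proximity to cooling water
s = suitability([slope, water], weights=[2, 1])
print(np.round(s, 3))
```

    The highest-scoring cells become the candidate siting areas passed on to detailed environmental impact analysis.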

  20. Development of an optimized procedure bridging design and structural analysis codes for the automatized design of the SMART

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Park, Keun Bae; Choi, Suhn; Kim, Kang Soo; Jeong, Kyeong Hoon; Lee, Gyu Mahn

    1998-09-01

    In this report, an optimized design and analysis procedure is established and applied to the SMART (System-integrated Modular Advanced ReacTor) development. The aim of the optimized procedure is to minimize the time consumption and engineering effort by streamlining the design and feedback interactions. To achieve this goal, the data and information generated through design development should be transferred directly to the analysis programs with minimum operation. Verification of the design concept requires considerable effort, since communication between design and analysis involves a time-consuming stage of converting input information. In this report, an optimized procedure bridging the design and analysis stages is established utilizing IDEAS, ABAQUS and ANSYS. (author). 3 refs., 2 tabs., 5 figs

  1. Great Lakes water quality initiative technical support document for the procedure to determine bioaccumulation factors. Draft report

    International Nuclear Information System (INIS)

    1993-03-01

    The purpose of the document is to provide the technical information and rationale in support of the proposed procedures to determine bioaccumulation factors. Bioaccumulation factors, together with the quantity of aquatic organisms eaten, determine the extent to which people and wildlife are exposed to chemicals through the consumption of aquatic organisms. The more bioaccumulative a pollutant is, the more important the consumption of aquatic organisms becomes as a potential source of contaminants to humans and wildlife. Bioaccumulation factors are needed to determine both human health and wildlife tier I water quality criteria and tier II values. Also, they are used to define Bioaccumulative Chemicals of Concern among the Great Lakes Initiative universe of pollutants. Bioaccumulation factors range from less than one to several million.
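
    The quantitative point in this record is simple: the dietary dose scales with the bioaccumulation factor times the quantity of aquatic organisms eaten. A hedged sketch of that arithmetic; the concentrations and intake below are illustrative, not values from the document:

```python
def dietary_exposure(water_conc_mg_per_L, baf_L_per_kg, intake_kg_per_day):
    """Daily pollutant dose (mg/day) via fish consumption:
    tissue concentration = BAF * water concentration."""
    tissue_mg_per_kg = baf_L_per_kg * water_conc_mg_per_L
    return tissue_mg_per_kg * intake_kg_per_day

# Highly bioaccumulative pollutant (BAF = 1e6 L/kg) at 1e-9 mg/L,
# with 20 g/day fish consumption:
dose = dietary_exposure(1e-9, 1e6, 0.02)
print(dose)  # 2e-05 mg/day
```

    A BAF near one would make the same water concentration a negligible dietary source, which is why the BAF spans the criteria calculations for both human health and wildlife.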

  2. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  3. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    International Nuclear Information System (INIS)

    Kaye, R.D.; Henriksen, K.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included

  4. 40 CFR 60.50Da - Compliance determination procedures and methods.

    Science.gov (United States)

    2010-07-01

    ... probe and filter holder heating system in the sampling train may be set to provide an average gas... correction factor, integrated or grab sampling and analysis procedures of Method 3B of appendix A of this... fuel oil, etc.), coal pulverizers, and bottom and fly ash interactions. This determination is optional...

  5. Factor Economic Analysis at Forestry Enterprises

    Directory of Open Access Journals (Sweden)

    M.Yu. Chik

    2018-03-01

    Full Text Available The article studies the importance of economic analysis based on the results of research in the scientific works of domestic and foreign scientists. The influence of factors on the change in the cost of harvesting timber products is calculated by cost item. The influence of factors on the change in costs per 1 UAH of sold products is determined using the full cost of sold products. Variable and fixed costs are distinguished, and their distribution affects the calculation of the impact of factors on cost changes per 1 UAH of sold products. The paper presents the overall results of calculating the influence of factors on cost changes per 1 UAH of sold products. Based on the results of the analysis, a list of reserves for reducing the cost of production at forestry enterprises is proposed. The main sources of reserves for reducing the prime cost of forest products at forestry enterprises are investigated on the basis of the conducted factor analysis.

  6. The role of human error in risk analysis: Application to pre- and post-maintenance procedures of process facilities

    International Nuclear Information System (INIS)

    Noroozi, Alireza; Khakzad, Nima; Khan, Faisal; MacKinnon, Scott; Abbassi, Rouzbeh

    2013-01-01

    Human factors play an important role in the safe operation of a facility. Human factors include the systematic application of information about human characteristics and behavior to increase the safety of a process system. A significant proportion of human errors occur during the maintenance phase. However, the quantification of human error probabilities in the maintenance phase has not been given the amount of attention it deserves. This paper focuses on a human factors analysis of pre- and post-maintenance operations on pumps. The procedures for removing process equipment from service (pre-maintenance) and returning the equipment to service (post-maintenance) are considered for possible failure scenarios. For each scenario, a human error probability is calculated for each activity using the Success Likelihood Index Method (SLIM). Consequences are also assessed in this methodology. The risk assessment is conducted for each component, and the overall risk is estimated by adding the individual risks. The present study is aimed at highlighting the importance of considering human error in quantitative risk analyses. The developed methodology has been applied to a case study of an offshore process facility
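
    SLIM, named in the abstract, rates each task on weighted performance shaping factors, forms a Success Likelihood Index (SLI), and converts it to a human error probability via a log-linear calibration against anchor tasks with known HEPs. A sketch with hypothetical ratings, weights and anchors (not the paper's values):

```python
import math

def sli(ratings, weights):
    """Success Likelihood Index: weighted mean of PSF ratings (0..1,
    higher meaning conditions more favourable to success)."""
    return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

def calibrate(sli1, hep1, sli2, hep2):
    """Fit log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

# Hypothetical anchors: a well-supported task and a poorly supported one
a, b = calibrate(0.9, 1e-4, 0.2, 1e-1)
# Hypothetical PSF ratings (procedures, stress, interface) and weights
task_sli = sli([0.8, 0.5, 0.9], [2, 1, 1])
hep = 10 ** (a * task_sli + b)
print(f"{hep:.2e}")
```

    Per-activity HEPs obtained this way are then combined with consequence estimates to give the per-scenario risks that the paper sums into an overall figure.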

  7. Risk factor and cost accounting analysis for dialysis patients in Taiwan.

    Science.gov (United States)

    Su, Bin-Guang; Tsai, Kai-Li; Yeh, Shu-Hsing; Ho, Yi-Yi; Liu, Shin-Yi; Rivers, Patrick A

    2010-05-01

    with high blood lipids incurred significant differences in the cost of oral medication. This study identified the relationship between cost and risk factors of dialysis procedures for ESRD patients based on average variable costs per dialysis treatment. The results show that certain risk factors (e.g. age 75 and older, hypertension, bile-duct disease, cancer and high blood lipids) are associated with higher cost. The results from this study could enable health policy makers and the National Health Insurance Bureau to design a fairer and more convincing reimbursement system for dialysis procedures. This study also provides a better understanding of which risk factors play the more influential roles for ESRD patients receiving haemodialysis treatment. It will help policy makers and health-care providers better control or even prevent the disease and manage the distribution of treatment. In addition, the results of the cost analysis indicate which risk factors have more impact on dialysis cost, which will further help control the cost for high-risk dialysis patients more efficiently.

  8. Effectiveness of internet-based affect induction procedures: A systematic review and meta-analysis.

    Science.gov (United States)

    Ferrer, Rebecca A; Grenen, Emily G; Taber, Jennifer M

    2015-12-01

    Procedures used to induce affect in a laboratory are effective and well-validated. Given recent methodological and technological advances in Internet research, it is important to determine whether affect can be effectively induced using Internet methodology. We conducted a meta-analysis and systematic review of prior research that has used Internet-based affect induction procedures, and examined potential moderators of the effectiveness of affect induction procedures. Twenty-six studies were included in final analyses, with 89 independent effect sizes. Affect induction procedures effectively induced general positive affect, general negative affect, fear, disgust, anger, sadness, and guilt, but did not significantly induce happiness. Contamination of other nontarget affect did not appear to be a major concern. Video inductions resulted in greater effect sizes. Overall, results indicate that affect can be effectively induced in Internet studies, suggesting an important venue for the acceleration of affective science. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
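
    For pooling effect sizes across studies like this, a random-effects model is the standard choice. A minimal sketch of the DerSimonian-Laird estimator; the review does not state which exact estimator it used, and the study values below are hypothetical:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect and 95% CI via DerSimonian-Laird."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study effect sizes (e.g. Hedges g) and variances:
pooled, ci = dersimonian_laird([0.6, 0.4, 0.8, 0.5], [0.04, 0.05, 0.06, 0.03])
print(round(pooled, 3), [round(x, 3) for x in ci])
```

    With 89 independent effect sizes and moderators such as induction type (video vs. other), the full analysis would additionally model effect-size dependence and subgroup differences.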

  9. A finite element based substructuring procedure for design analysis of large smart structural systems

    International Nuclear Information System (INIS)

    Ashwin, U; Raja, S; Dwarakanathan, D

    2009-01-01

    A substructuring-based design analysis procedure is presented for large smart structural systems using the Craig–Bampton method. The smart structural system is distinctively characterized as an active substructure, modelled as a design problem, and a passive substructure, idealized as an analysis problem. Furthermore, a novel approach has been applied by introducing the electro-elastic coupling into the reduction scheme to solve the global structural control problem in a local domain. As an illustration, a smart composite box beam with surface-bonded actuators/sensors is considered, and results of the local-to-global control analysis are presented to show the potential use of the developed procedure. The present numerical scheme is useful for optimally designing the active substructures to study their locations and coupled structure–actuator interaction, and it provides a solution to the global design of large smart structural systems
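
    The Craig–Bampton method named above reduces a substructure by keeping its boundary DOFs physically and representing the interior DOFs by static constraint modes plus a few fixed-interface normal modes. A minimal structural-only sketch (assumes a lumped, diagonal mass matrix; the paper's electro-elastic coupling is not included):

```python
import numpy as np

def craig_bampton(K, M, boundary, n_modes):
    """Craig-Bampton reduction of (K, M) onto boundary DOFs plus
    n_modes fixed-interface modes. Assumes diagonal (lumped) mass."""
    n = K.shape[0]
    b = np.asarray(boundary)
    i = np.setdiff1d(np.arange(n), b)
    Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
    Psi = -np.linalg.solve(Kii, Kib)            # static constraint modes
    d = np.sqrt(np.diag(M[np.ix_(i, i)]))       # lumped-mass scaling
    w2, V = np.linalg.eigh(Kii / np.outer(d, d))
    Phi = (V / d[:, None])[:, :n_modes]         # fixed-interface modes
    T = np.zeros((n, len(b) + n_modes))         # reduction basis
    T[b, :len(b)] = np.eye(len(b))
    T[i, :len(b)] = Psi
    T[i, len(b):] = Phi
    return T.T @ K @ T, T.T @ M @ T

# Demo: fixed-free 4-mass spring chain, keep the end mass + 2 modes
K = np.array([[2., -1, 0, 0], [-1, 2, -1, 0], [0, -1, 2, -1], [0, 0, -1, 1]])
M = np.eye(4)
Kr, Mr = craig_bampton(K, M, boundary=[3], n_modes=2)
print(Kr.shape)  # (3, 3): one boundary DOF plus two modal DOFs
```

    The reduced system reproduces the low-frequency behaviour of the full chain while exposing only the boundary DOF, which is what allows a global control problem to be solved in the local (active) domain.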

  10. HETERO code, heterogeneous procedure for reactor calculation

    International Nuclear Information System (INIS)

    Jovanovic, S.M.; Raisic, N.M.

    1966-11-01

    This report describes a procedure for calculating the parameters of a heterogeneous reactor system, taking into account the interaction between fuel elements in an established geometry. The first part contains the analysis of a single fuel element in a diffusion medium, and the criticality condition of the reactor system described by superposition of element interactions. The possibility of performing such an analysis by determination of the heterogeneous system lattice is described in the second part. The computer code HETERO, with the code KETAP (calculation of the criticality factor η_n and flux distribution), is part of this report, together with an example of the RB reactor square lattice

  11. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    Science.gov (United States)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent, taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum resolution of a DEM required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Especially socio-economic information and the monetary transfer functions required for a damage risk analysis show a high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
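
    The Monte Carlo step described above can be sketched as follows: perturb the DEM with a random elevation error and accumulate, per cell, how often the cell falls below the water level. A Gaussian error model and the values below are assumptions for illustration, not the study's data:

```python
import numpy as np

def flood_probability(dem, water_level, elev_sigma, n_sim=500, seed=0):
    """Per-cell probability of flooding when the DEM carries a
    Gaussian elevation error with standard deviation elev_sigma."""
    rng = np.random.default_rng(seed)
    counts = np.zeros_like(dem, dtype=float)
    for _ in range(n_sim):
        noisy = dem + rng.normal(0.0, elev_sigma, size=dem.shape)
        counts += (noisy < water_level)
    return counts / n_sim

dem = np.array([[49.0, 50.0], [51.0, 53.0]])   # cell elevations, m a.s.l.
p = flood_probability(dem, water_level=50.0, elev_sigma=0.5)
print(np.round(p, 2))
```

    Cells well below the water level flood in nearly every realization, cells near it flood in roughly half, and high cells almost never, turning a deterministic flood map into a probability map.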

  12. Application of a statistical thermal design procedure to evaluate the PWR DNBR safety analysis limits

    International Nuclear Information System (INIS)

    Robeyns, J.; Parmentier, F.; Peeters, G.

    2001-01-01

    In the framework of safety analyses for the Belgian nuclear power plants and for reload compatibility studies, Tractebel Energy Engineering (TEE) has developed, to define a 95/95 DNBR criterion, a statistical thermal design method based on the analytical full statistical approach: the Statistical Thermal Design Procedure (STDP). In that methodology, each DNBR value in the core assemblies is calculated with an adapted CHF (Critical Heat Flux) correlation implemented in the sub-channel code Cobra for core thermal-hydraulic analysis. The uncertainties of the correlation are represented by statistical parameters calculated from an experimental database. The main objective of a sub-channel analysis is to prove that in all class 1 and class 2 situations, the minimum DNBR (Departure from Nucleate Boiling Ratio) remains higher than the Safety Analysis Limit (SAL). The SAL value is calculated from the Statistical Design Limit (SDL) value, adjusted with penalties and deterministic factors. The search for a realistic value of the SDL is the objective of statistical thermal design methods. In this report, we apply a full statistical approach to define the DNBR criterion or SDL (Statistical Design Limit) in strict observance of the design criteria defined in the Standard Review Plan. The same statistical approach is used to define the expected number of rods experiencing DNB. (author)
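
    The 95/95 criterion above (95% probability at 95% confidence) is often introduced through the classic nonparametric result of Wilks: the smallest number of code runs n for which the largest observed value bounds the 95th percentile at 95% confidence. A sketch of that standard reference calculation (STDP itself is an analytical full statistical method, not an order-statistics approach):

```python
def wilks_one_sided(gamma=0.95, beta=0.95):
    """Smallest n with 1 - gamma**n >= beta: the sample maximum of
    n runs is then a one-sided (gamma, beta) tolerance limit."""
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n

print(wilks_one_sided())  # 59, the classic first-order 95/95 result
```

    Analytical full statistical approaches trade this distribution-free bound for sharper limits derived from the modeled uncertainty of the CHF correlation.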

  13. Centre characteristics and procedure-related factors have an impact on outcomes of allogeneic transplantation for patients with CLL: a retrospective analysis from the European Society for Blood and Marrow Transplantation (EBMT)

    NARCIS (Netherlands)

    Schetelig, J.; Wreede, L.C. de; Andersen, N.S.; Moreno, C.; Gelder, M. van; Vitek, A.; Karas, M.; Michallet, M.; Machaczka, M.; Gramatzki, M.; Beelen, D.; Finke, J.; Delgado, J.; Volin, L.; Passweg, J.; Dreger, P.; Schaap, N.P.; Wagner, E.; Henseler, A.; Biezen, A. van; Bornhauser, M.; Iacobelli, S.; Putter, H.; Schonland, S.O.; Kroger, N.

    2017-01-01

    The best approach for allogeneic haematopoietic stem cell transplantations (alloHCT) in patients with chronic lymphocytic leukaemia (CLL) is unknown. We therefore analysed the impact of procedure- and centre-related factors on 5-year event-free survival (EFS) in a large retrospective study. Data of

  14. Human factors assessment in PRA using task analysis linked evaluation technique (TALENT)

    International Nuclear Information System (INIS)

    Wells, J.E.; Banks, W.W.

    1990-01-01

    Human error is a primary contributor to risk in complex high-reliability systems. A 1985 U.S. Nuclear Regulatory Commission (USNRC) study of licensee event reports (LERs) suggests that upwards of 65% of commercial nuclear system failures involve human error. Since then, the USNRC has initiated research to fully and properly integrate human errors into the probabilistic risk assessment (PRA) process. The resulting implementation procedure is known as the Task Analysis Linked Evaluation Technique (TALENT). As indicated, TALENT is a broad-based method for integrating human factors expertise into the PRA process. This process achieves results which: (1) provide more realistic estimates of the impact of human performance on nuclear power safety, (2) can be fully audited, (3) provide a firm technical base for equipment-centered and personnel-centered retrofit/redesign of plants enabling them to meet internally and externally imposed safety standards, and (4) yield human and hardware data capable of supporting inquiries into human performance issues that transcend the individual plant. The TALENT procedure is being field-tested to verify its effectiveness and utility. The objectives of the field-test are to examine (1) the operability of the process, (2) its acceptability to the users, and (3) its usefulness for achieving measurable improvements in the credibility of the analysis. The field-test will provide the information needed to enhance the TALENT process

  15. Evaluation and optimization of the man-machine interface in nuclear power plants using the HEROS procedure in PSA

    International Nuclear Information System (INIS)

    Richei, A.; Koch, M.K.; Unger, H.; Hauptmanns, U.

    1998-01-01

    For the probabilistic evaluation and optimization of the man-machine system, a new procedure has been developed. The procedure and the resulting expert system, which is based on fuzzy set theory, are called HEROS, an acronym for Human Error Rate Assessment and Optimizing System. Several procedures for the probabilistic evaluation of human errors already exist, but they have several disadvantages. In all of these procedures, including the best-known ones, fuzzy verbal expressions are often used for the evaluation of human factors. This use of verbal expressions for describing performance-shaping factors, enabling the evaluation of human factors, is the basic approach of HEROS. A model of a modern man-machine system is the basis of the procedure. Results from ergonomic studies are used to establish a rule-based expert system. HEROS simplifies the importance analysis for the evaluation of human factors, which is necessary for the optimization of the man-machine system. It can be used in all areas of probabilistic safety assessment. The application of HEROS within the scope of accident management procedures, and a comparison with the results of other procedures, will be shown as an example of the usefulness and the substantially wider applicability of this new procedure. (author)

  16. Radiation treatment of glottic squamous cell carcinoma, Stage I and II: analysis of factors affecting prognosis

    International Nuclear Information System (INIS)

    Franchin, Giovanni; Minatel, Emilio; Gobitti, Carlo; Talamini, Renato; Sartor, Giovanna; Caruso, Giuseppe; Grando, Giuseppe; Politi, Doriano; Gigante, Marco; Toffoli, Giuseppe; Trovo, Mauro G.; Barzan, Luigi

    1998-01-01

    Purpose: At least in some European countries, there is still considerable controversy regarding the choice between surgery and radiotherapy for the treatment of patients with early laryngeal-glottic carcinoma. Methods and Materials: Two hundred and forty-six patients with laryngeal-glottic neoplasms, Stage I-II, were treated with radical radiotherapy. Before radiotherapy, the patients were evaluated to determine the surgical procedure of choice. Either 66-68.4 Gy (33-38 fractions) or 63-65 Gy (28-29 fractions) of radiation therapy (RT) was administered. The overall disease-free survival was determined for each subgroup of patients. Univariate and multivariate analyses were performed to determine significant prognostic variables. Results: Five- and 10-year overall survival rates were 83 and 72%, respectively. At a median follow-up of 6 years, 204 patients are alive and disease free. No patient developed distant metastases. One patient died of a large local recurrence, 38 patients died of causes unrelated to their tumor, and 3 patients were lost to follow-up. The multivariate analysis confirmed that performance status (PS), macroscopic presentation of the lesion, and persistence of dysphonia after radiotherapy are significant prognostic factors. Conclusions: According to the multivariate analysis, patients with PS >80 and with exophytic lesions are eligible for radical RT. The surgical procedure proposed for each patient was not found to be an independent prognostic factor

  17. Structural analysis and optimization procedure of the TFTR device substructure

    International Nuclear Information System (INIS)

    Driesen, G.

    1975-10-01

    A structural evaluation of the TFTR device substructure is performed in order to verify the feasibility of the proposed design concept as well as to establish a design optimization procedure for minimizing the material and fabrication cost of the substructure members. A preliminary evaluation of the seismic capability is also presented. The design concept on which the analysis is based is consistent with that described in the Conceptual Design Status Briefing report dated June 18, 1975

  18. Conversion from laparoscopic to open cholecystectomy: Multivariate analysis of preoperative risk factors

    Directory of Open Access Journals (Sweden)

    Khan M

    2005-01-01

    Full Text Available BACKGROUND: Laparoscopic cholecystectomy has become the gold standard in the treatment of symptomatic cholelithiasis. Some patients require conversion to open surgery, and several preoperative variables have been identified as risk factors that are helpful in predicting the probability of conversion. However, there is a need to devise a risk-scoring system based on the identified risk factors to (a) predict the risk of conversion preoperatively for selected patients, (b) prepare the patient psychologically, (c) arrange operating schedules accordingly, and (d) minimize the procedure-related cost and help overcome financial constraints, which is a significant problem in developing countries. AIM: This study was aimed to evaluate preoperative risk factors for conversion from laparoscopic to open cholecystectomy in our setting. SETTINGS AND DESIGNS: A case-control study of patients who underwent laparoscopic surgery from January 1997 to December 2001 was conducted at the Aga Khan University Hospital, Karachi, Pakistan. MATERIALS AND METHODS: All those patients who were converted to open surgery (n = 73) were enrolled as cases. Two controls who had successful laparoscopic surgery (n = 146) were matched with each case for operating surgeon and closest date of surgery. STATISTICAL ANALYSIS USED: Descriptive statistics were computed, and univariate and multivariate analysis was done through multiple logistic regression. RESULTS: The final multivariate model identified two risk factors for conversion: ultrasonographic signs of inflammation (adjusted odds ratio [aOR] = 8.5; 95% confidence interval [CI]: 3.3, 21.9) and age > 60 years (aOR = 8.1; 95% CI: 2.9, 22.2), after adjusting for physical signs, alkaline phosphatase and BMI levels. CONCLUSION: Preoperative risk factors evaluated by the present study confirm the likelihood of conversion. Recognition of these factors is important for understanding the characteristics of patients at a higher risk of conversion.
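
    The adjusted odds ratios and confidence intervals reported above are exponentiated logistic-regression coefficients. A sketch of that conversion; the coefficient and standard error below are hypothetical, chosen only to land near the reported aOR for age > 60 years, and are not the study's data:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Adjusted odds ratio and 95% CI from a logistic-regression
    coefficient and its standard error: OR = exp(beta),
    CI = (exp(beta - z*se), exp(beta + z*se))."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# Hypothetical coefficient and SE for an "age > 60 years" indicator
or_, (lo, hi) = odds_ratio_ci(beta=2.09, se=0.51)
print(round(or_, 1), round(lo, 1), round(hi, 1))
```

    A CI whose lower bound stays above 1, as in both reported factors, is what marks the predictor as statistically significant in the final model.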

  19. An expert system-based aid for analysis of Emergency Operating Procedures in NPPs

    International Nuclear Information System (INIS)

    Jakubowski, Z.; Beraha, D.

    1996-01-01

    Emergency Operating Procedures (EOPs) in general, and accident management (AM) in particular, have played a significant part in the safety philosophy of NPPs for many years. A better methodology for the development and validation of EOPs is desired. A prototype of an Emergency Operating Procedures Analysis System (EOPAS), which has been developed at GRS, is presented in the paper. The hardware configuration and software organisation of the system are briefly reviewed. The main components of the system, such as the knowledge base of an expert system and the engineering simulator, are described. (author)

  20. First course in factor analysis

    CERN Document Server

    Comrey, Andrew L

    2013-01-01

    The goal of this book is to foster a basic understanding of factor analytic techniques so that readers can use them in their own research and critically evaluate their use by other researchers. Both the underlying theory and correct application are emphasized. The theory is presented through the mathematical basis of the most common factor analytic models and several methods used in factor analysis. On the application side, considerable attention is given to the extraction problem, the rotation problem, and the interpretation of factor analytic results. Hence, readers are given a background of

  1. Development of a procedure for qualitative and quantitative evaluation of human factors as a part of probabilistic safety assessments of nuclear power plants. Part B. Technical documentation

    International Nuclear Information System (INIS)

    Richei, A.

    1998-01-01

    As international studies have shown, accidents in plants are increasingly caused by combinations of technical failures and human errors. Therefore, careful investigations of man-machine interactions to determine human reliability are gaining importance worldwide. Regarding nuclear power plants, such investigations are usually carried out within the scope of probabilistic safety assessments. A great number of procedures to evaluate human factors have been developed up to now. However, none of them is able to take into account the whole spectrum of requirements for a complete and reliable probabilistic safety assessment, such as transferability of data to other plants, analysis of weak points, and evaluation of cognitive tasks. Based on an advanced model for a man-machine system, the Human Error Rate Assessment and Optimizing System (HEROS) and a corresponding expert system of the same name are introduced. This expert system enables the quantification of human error probabilities for plant operator actions on the one hand, and is also capable of providing quantitative statements regarding the optimization of the man-machine system in terms of human error probability minimization on the other. Three relevant evaluation levels, i.e. 'Management Structure', 'Working Environment' and 'Man-Machine Interface', are derived from a model of the man-machine system. Linguistic variables are assigned to all performance shaping factors at these levels. These variables are used to establish a rule-based expert system. The knowledge bases of this system are represented by rules. Processing of these rules is carried out by means of fuzzy set theory after provision of relevant data for a particular personnel action to be evaluated. This procedure enables a simple and effective use of ergonomic studies as the relevant database, which is also transferable to other plants with any design. The expert system consists in total of 16 rule bases in which all ascertainable and
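HEROS's actual rule bases and membership functions are not reproduced here; as a purely illustrative sketch of the fuzzy-rule mechanism described (linguistic variables over performance shaping factors, rules processed with fuzzy set theory, a quantitative human error probability out), consider a toy system with two invented PSFs:

```python
def falling(x, a, b):
    """Membership that is 1 below a, 0 above b, linear in between."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def rising(x, a, b):
    """Complementary membership: 0 below a, 1 above b."""
    return 1.0 - falling(x, a, b)

def hep(training, interface):
    """Toy human error probability from two PSFs rated on [0, 1].
    Rules and numbers are invented, not HEROS's actual knowledge base."""
    pt, gt = falling(training, 0.3, 0.7), rising(training, 0.3, 0.7)
    pi, gi = falling(interface, 0.3, 0.7), rising(interface, 0.3, 0.7)
    rules = [
        (min(pt, pi), 1e-1),                    # both PSFs poor -> high HEP
        (max(min(pt, gi), min(gt, pi)), 1e-2),  # one poor, one good
        (min(gt, gi), 1e-3),                    # both PSFs good -> low HEP
    ]
    # weighted-average (Sugeno-style) defuzzification
    num = sum(strength * value for strength, value in rules)
    den = sum(strength for strength, _ in rules)
    return num / den if den else 1e-2
```

The real system works the same way in outline: ergonomic ratings enter as linguistic variables, rule strengths are combined with fuzzy operators, and the defuzzified output is a plant-specific HEP.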

  2. Maintenance procedure upgrade programs

    International Nuclear Information System (INIS)

    Campbell, J.J.; Zimmerman, C.M.

    1988-01-01

    This paper describes a systematic approach to upgrading nuclear power plant maintenance procedures. The approach consists of four phases: diagnosis, program planning, program implementation, and program evaluation. Each phase is explained as a series of steps to ensure that all factors in a procedure upgrade program are considered

  3. Processes and Procedures for Application of CFD to Nuclear Reactor Safety Analysis

    International Nuclear Information System (INIS)

    Richard W. Johnson; Richard R. Schultz; Patrick J. Roache; Ismail B. Celik; William D. Pointer; Yassin A. Hassan

    2006-01-01

    Traditionally, nuclear reactor safety analysis has been performed using systems analysis codes such as RELAP5, which was developed at the INL. However, goals established by the Generation IV program, especially the desire to increase efficiency, have led to an increase in operating temperatures for the reactors. This increase pushes reactor materials to operate towards their upper temperature limits relative to structural integrity. Because there will be some finite variation of the power density in the reactor core, there will be a potential for local hot spots to occur in the reactor vessel. Hence, it has become apparent that detailed analysis will be required to ensure that local "hot spots" do not exceed safety limits. It is generally accepted that computational fluid dynamics (CFD) codes are intrinsically capable of simulating fluid dynamics and heat transport locally because they are based on "first principles". Indeed, CFD analysis has reached a fairly mature level of development, including at the commercial level. However, CFD experts are aware that even though commercial codes are capable of simulating local fluid and thermal physics, great care must be taken in their application to avoid errors caused by such things as inappropriate grid meshing, low-order discretization schemes, lack of iterative convergence and inaccurate time-stepping. Just as important is the choice of a turbulence model for turbulent flow simulation. Turbulence models represent the effects of turbulent transport of mass, momentum and energy, but are not necessarily applicable over wide ranges of flow types. Therefore, there is a well-recognized need to establish practices and procedures for the proper application of CFD to simulate flow physics accurately and establish the level of uncertainty of such computations. 
The present document represents contributions of CFD experts on what the basic practices, procedures and guidelines should be to aid CFD analysts to obtain accurate estimates

  4. Risk Factor Analysis for Oral Precancer among Slum Dwellers in ...

    African Journals Online (AJOL)

    risk factors for oral precancer, i.e., smoking/smokeless tobacco, chewing ... procedure was performed on a group of 10 subjects, which were ... clinical description of observed oral mucosal lesions was made ..... use and effects of cessation.

  5. Function Allocation in Complex Socio-Technical Systems: Procedure usage in nuclear power and the Context Analysis Method for Identifying Design Solutions (CAMIDS) Model

    Science.gov (United States)

    Schmitt, Kara Anne

    This research aims to prove that strict adherence to procedures and rigid compliance to process in the US nuclear industry may not prevent incidents or increase safety. According to the Institute of Nuclear Power Operations, the nuclear power industry has seen a recent rise in events, and this research claims that a contributing factor to this rise is organizational and cultural, and based on people's overreliance on procedures and policy. Understanding the proper balance of function allocation, automation and human decision-making is imperative to creating a nuclear power plant that is safe, efficient, and reliable. This research claims that new generations of operators are less engaged in active thinking because they have been instructed to follow procedures to a fault. According to operators, they were once expected to know the plant and its interrelations, but organizationally more importance is now put on following procedure and policy. Literature reviews were performed, experts were questioned, and a model for context analysis was developed. The Context Analysis Method for Identifying Design Solutions (CAMIDS) Model was created, verified and validated through both peer review and application in real-world scenarios in active nuclear power plant simulators. These experiments supported the claim that strict adherence and rigid compliance to procedures may not increase safety, by studying the industry's propensity for following incorrect procedures and when doing so directly affects the safety or security of the plant. The findings of this research indicate that the younger generations of operators rely heavily on procedures, and the organizational pressure of required compliance to procedures may lead to incidents within the plant because operators feel pressured into following the rules and policy above performing the correct actions in a timely manner. The findings support computer-based procedures, efficient alarm systems, and skill-of-the-craft matrices. 
The solution to

  6. Surgical site infection rates following laparoscopic urological procedures.

    Science.gov (United States)

    George, Arvin K; Srinivasan, Arun K; Cho, Jane; Sadek, Mostafa A; Kavoussi, Louis R

    2011-04-01

    Surgical site infections have been categorized by the Centers for Medicare and Medicaid Services as "never events". The incidence of surgical site infection following laparoscopic urological surgery and its risk factors are poorly defined. We evaluated surgical site infection following urological laparoscopic surgery and identified possible factors that may influence occurrence. Patients who underwent transperitoneal laparoscopic procedures during a 4-year period by a single laparoscopic surgeon were retrospectively reviewed. Surgical site infections were identified postoperatively and defined using the Centers for Disease Control criteria. Clinical parameters, comorbidities, smoking history, preoperative urinalysis and culture results as well as operative data were analyzed. Nonparametric testing using the Mann-Whitney U test, multivariable logistic regression and Spearman's rank correlation coefficient were used for data analysis. In 556 patients undergoing urological laparoscopic procedures 14 surgical site infections (2.5%) were identified at mean postoperative day 21.5. Of the 14 surgical site infections 10 (71.4%) were located at a specimen extraction site. Operative time, procedure type and increasing body mass index were significantly associated with the occurrence of surgical site infections (p = 0.007, p = 0.019, p = 0.038, respectively), whereas history of diabetes mellitus (p = 0.071) and intraoperative transfusion (p = 0.053) were found to trend toward significance. Age, gender, positive urine culture, steroid use, procedure type and smoking history were not significantly associated with surgical site infection. Body mass index and operative time remained significant predictors of surgical site infection on multivariate logistic regression analysis. Surgical site infection is an infrequent complication following laparoscopic surgery with the majority occurring at the specimen extraction site. Infection is associated with prolonged operative time and
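The Mann-Whitney U test used above compares an outcome (e.g. operative time) between infected and non-infected groups without assuming normality. A minimal pure-Python sketch of the statistic (midranks for ties; for samples as large as this study's, a normal approximation would then convert U to a p-value):

```python
def ranks(values):
    """1-based midranks; tied values share the average of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend over the tie run
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic (the smaller of U1 and U2, as
    conventionally reported)."""
    r = ranks(list(x) + list(y))
    u1 = sum(r[:len(x)]) - len(x) * (len(x) + 1) / 2
    return min(u1, len(x) * len(y) - u1)
```

In practice one would use a vetted implementation (e.g. `scipy.stats.mannwhitneyu`), which also supplies the tie-corrected p-value.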

  7. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by the NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of the importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study.

  8. A qualitative analysis of the determinants in the choice of a French journal reviewing procedures

    Science.gov (United States)

    Morge, Ludovic

    2015-12-01

    Between 1993 and 2010, two French journals (Aster and Didaskalia), coming from different backgrounds but belonging to the same institution, used to publish papers on research in science and technology education. The merging of these journals made it necessary for them to compare the different reviewing procedures used by each. This merging occurred at a time when research is becoming increasingly international, which partly determines some of the reviewing procedure choices. In order for a francophone international journal to survive, it needs to take this internationalization into account in a reasoned manner. The author of this article, as chief editor of RDST (Recherches en Didactique des Sciences et des Technologies), the journal resulting from the merger, analyses the social, cultural and pragmatic determinants which impacted the choices made in reviewing procedures. This paper describes how this diversity of factors led to dropping the idea of a standard reviewing procedure which would be valid for all journals.

  9. Elemental hair analysis: A review of procedures and applications

    International Nuclear Information System (INIS)

    Pozebon, D.; Scheffler, G.L.; Dressler, V.L.

    2017-01-01

    Although exogenous contamination and unreliable reference values have limited the utility of scalp hair as a biomarker of chemical elements exposure, its use in toxicological, clinical, environmental and forensic investigations is growing and becoming more extensive. Therefore, hair elemental analysis is reviewed in the current manuscript which spans articles published in the last 10 years. It starts with a general discussion of history, morphology and possible techniques for elemental analysis, where inductively coupled plasma-mass spectrometry (ICP-MS) is clearly highlighted since this technique is leading quantitative ultra-trace elemental analysis. Emphasis over sampling, quality assurance, washing procedures and sample decomposition is given with detailed protocols compiled in tables as well as the utility of hair to identify human gender, age, diseases, healthy conditions, nutrition status and contamination sites. Isotope ratio information, chemical speciation analysis and analyte preconcentration are also considered for hair. Finally, the potential of laser ablation ICP-MS (LA-ICP-MS) to provide spatial resolution and time-track the monitoring of elements in hair strands instead of conventional bulk analysis is spotlighted as a real future trend in the field. - Highlights: • Elemental analysis of hair is critically reviewed, with focus on ICP-MS employment. • Standards protocols of hair washing and sample decomposition are compiled. • The usefulness of elemental and/or isotopic analysis of hair is demonstrated. • The potential of LA-ICP-MS for elemental time tracking in hair is highlighted.

  10. IDENTIFICATION OF MEASUREMENT ITEMS OF DESIGN REQUIREMENTS FOR LEAN AND AGILE SUPPLY CHAIN-CONFIRMATORY FACTOR ANALYSIS

    Directory of Open Access Journals (Sweden)

    D.Venkata Ramana

    2013-06-01

    Full Text Available This study examines the consistency approaches by confirmatory factor analysis that determine the construct validity, convergent validity, construct reliability and internal consistency of the items of strategic design requirements. The design requirements include use of information technology, sourcing procedures, new product development, flexible manufacturing functions, demand management, supply chain network design, management commitment and inventory management policies among manufacturers of volatile and unforeseeable products in Andhra Pradesh, India. This study suggested that the seven-factor model with 20 items of the leagile supply chain design requirements had a good fit. Further, the study showed a valid and reliable measurement to identify critical items among the design requirements of leagile supply chains.

  11. Multiple factor analysis by example using R

    CERN Document Server

    Pagès, Jérôme

    2014-01-01

    Multiple factor analysis (MFA) enables users to analyze tables of individuals and variables in which the variables are structured into quantitative, qualitative, or mixed groups. Written by the co-developer of this methodology, Multiple Factor Analysis by Example Using R brings together the theoretical and methodological aspects of MFA. It also includes examples of applications and details of how to implement MFA using an R package (FactoMineR).The first two chapters cover the basic factorial analysis methods of principal component analysis (PCA) and multiple correspondence analysis (MCA). The

  12. A risk-factor analysis of medical litigation judgments related to fall injuries in Korea.

    Science.gov (United States)

    Kim, Insook; Won, Seonae; Lee, Mijin; Lee, Won

    2018-01-01

    The aim of this study was to find out the risk factors through analysis of seven medical malpractice judgments related to fall injuries. The risk factors were analysed by using a framework that approaches falls from a systems perspective and comprises people, organisational and environmental factors, with each factor being comprised of subfactors. The risk factors found in each of the seven judgments were aggregated into one framework. The risk factors related to patients (i.e. the people factor) were age, pain, related disease, activities and functional status, urination state, cognitive function impairment, past history of falls, blood transfusion, sleep endoscopy state and uncooperative attitude. The risk factors related to the medical staff and caregivers (i.e. the people factor) were observation negligence, no fall prevention activities and negligence in managing the high-risk group for falls. Organisational risk factors were a lack of workforce, a lack of training, neglecting the management of the high-risk group, neglecting the management of caregivers and the absence of a fall prevention procedure. Regarding the environment, the risk factors were found to be the emergency room, chairs without a backrest and the examination table. Identifying risk factors is essential for preventing fall accidents, since falls are preventable patient-safety incidents. Falls do not happen as a result of a single risk factor. Therefore, a systems approach is effective to identify risk factors, especially organisational and environmental factors.

  13. Container Throughput Forecasting Using Dynamic Factor Analysis and ARIMAX Model

    Directory of Open Access Journals (Sweden)

    Marko Intihar

    2017-11-01

    Full Text Available The paper examines the impact of the integration of macroeconomic indicators on the accuracy of a container throughput time series forecasting model. For this purpose, dynamic factor analysis and an AutoRegressive Integrated Moving-Average model with eXogenous inputs (ARIMAX) are used. Both methodologies are integrated into a novel four-stage heuristic procedure. Firstly, dynamic factors are extracted from external macroeconomic indicators influencing the observed throughput. Secondly, a family of ARIMAX models of different orders is generated based on the derived factors. In the third stage, diagnostic and goodness-of-fit testing is applied, which includes statistical criteria such as fit performance, information criteria, and parsimony. Finally, the best model is heuristically selected and tested on real data from the Port of Koper. The results show that by incorporating macroeconomic indicators into the forecasting model, more accurate future throughput forecasts can be achieved. The model is also used to produce forecasts for the next four years, indicating a more oscillatory behaviour in the 2018-2020 period. Hence, care must be taken concerning any bigger investment decisions initiated from the management side. It is believed that the proposed model might be a useful reinforcement of the existing forecasting module in the observed port.
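The full model in the paper couples dynamic factor extraction with ARIMAX; as a stripped-down sketch of the core idea (one exogenous indicator entering an autoregression), an ARX(1) model y_t = c + phi*y_{t-1} + beta*x_t can be fitted by ordinary least squares in pure Python. The data below are synthetic; the paper's factors, model orders and diagnostics are not reproduced:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fit_arx(y, x):
    """OLS fit of y_t = c + phi*y_{t-1} + beta*x_t via normal equations."""
    rows = [[1.0, y[t - 1], x[t]] for t in range(1, len(y))]
    targ = [y[t] for t in range(1, len(y))]
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * yt for r, yt in zip(rows, targ)) for i in range(3)]
    return solve(XtX, Xty)

# Synthetic series with known coefficients c=1, phi=0.5, beta=2
x = [0.0, 1.0, 0.0, 2.0, 1.0, 3.0, 0.0, 2.0]
y = [0.0]
for t in range(1, len(x)):
    y.append(1.0 + 0.5 * y[-1] + 2.0 * x[t])
c, phi, beta = fit_arx(y, x)
```

A production setting would instead use a library ARIMAX implementation (e.g. `statsmodels` SARIMAX with `exog`), which adds differencing, moving-average terms and information-criterion-based order selection.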

  14. The human factors and job task analysis in nuclear power plant operation

    International Nuclear Information System (INIS)

    Stefanescu, Petre; Mihailescu, Nicolae; Dragusin, Octavian

    1999-01-01

    After a long period during the development of NPP technology in which plant hardware was considered to be the main factor for safe, reliable and economic operation, the industry is now moving to an adequately balanced consideration of plant hardware and operation. Since human factors have not been discussed methodically so far, there is still a lack of improved classification systems for human errors, as well as a lack of methods for a systematic approach to designing the operator's working system, for instance by using job task analysis (J.T.A.). The J.T.A. appears to be an adequate method to study the human factor in nuclear power plant operation, enabling an easy conversion to operational improvements. While the results of the analysis of human errors tell 'what' is to be improved, the J.T.A. shows 'how' to improve, increasing the quality of the work and the safety of the operator's working system. The paper analyses the issue of setting the task and displays four criteria used to select aspects of NPP operation which require special consideration, such as personnel training, design of the control room, content and layout of the procedure manual, or organization of the operating personnel. The results are given in three tables: 1- Evaluation of Deficiencies in the Working System; 2- Evaluation of the Deficiencies of Operator's Disposition; 3- Evaluation of the Mental Structure of Operation

  15. Postoperative infections in craniofacial reconstructive procedures.

    Science.gov (United States)

    Fialkov, J A; Holy, C; Forrest, C R; Phillips, J H; Antonyshyn, O M

    2001-07-01

    The rate of, and possible risk factors for, postoperative craniofacial infection are unclear. To investigate this problem, we reviewed 349 cases of craniofacial skeletal procedures performed from 1996 to 1999 at our institution. The infection rate was determined and correlated with the use of implants, operative site, and cause of deformity. The inclusion criteria consisted of all procedures requiring autologous or prosthetic implantation in craniofacial skeletal sites, as well as all procedures involving bone or cartilage resection, osteotomies, debridement, reduction and/or fixation. Procedures that did not involve bone or cartilage surgery were excluded. The criteria for diagnosis of infection included clinical confirmation and one or more of 1) intravenous or oral antibiotic treatment outside of the prophylactic surgical regimen; 2) surgical intervention for drainage, irrigation, and/or debridement; and 3) microbiological confirmation. Among the 280 surgical cases that fit the inclusion criteria and had complete records, there were 23 cases of postoperative infection (8.2%). The most common site for postoperative infection was the mandible (infection rate = 16.7%). Multiple logistic regression analysis revealed gunshot wound to be the most significant predictor of postoperative infection. Additionally, porous polyethylene implantation through a transoral route was correlated with a significant risk of postoperative infection.

  16. Procedure of trace element analysis in oyster tissues by using X-ray fluorescence

    International Nuclear Information System (INIS)

    Vo Thi Tuong Hanh; Dinh Thi Bich Lieu; Dinh Thien Lam and Nguyen Manh Hung

    2004-01-01

    A procedure for the analysis of trace elements such as Ca, Mn, Fe, Zn, Cu and Pb in molluscs (oyster tissues) was established using X-ray fluorescence techniques. The procedure was investigated from sample collection, drying and ashing ratio through to the analytical technique, which uses a Cd-109 excitation source and a Si(Li) detector; the MCAPLUS peak-processing program was applied for this study. The procedure is based on direct comparison with certified concentrations of the international standard reference material SRM 1566b Oyster Tissue of the National Institute of Standards and Technology, Department of Commerce, United States of America, for Ca, Mn, Fe, Zn and Cu, and on the standard addition method for Pb. The accuracy of the standard addition method was estimated with CRM 281 Rye Grass of the Community Bureau of Reference (BCR), European Commission. The results of 10 samples which were collected from several markets in Hanoi are shown. (author)
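Once peak areas are extracted, the two quantification schemes mentioned, direct comparison against a certified standard and standard additions, reduce to simple arithmetic. A sketch with invented count rates and an assumed certified value (not SRM 1566b's actual figures):

```python
def conc_by_comparison(count_sample, count_standard, conc_standard):
    """Direct comparison: concentration from the ratio of peak count rates,
    assuming matched matrix, geometry and measurement conditions."""
    return count_sample / count_standard * conc_standard

def conc_by_standard_additions(added, signals):
    """Standard additions: fit signal = a + b*added by least squares and
    extrapolate to zero signal; the sample concentration is a / b."""
    n = len(added)
    mx = sum(added) / n
    my = sum(signals) / n
    b = (sum((x - mx) * (s - my) for x, s in zip(added, signals))
         / sum((x - mx) ** 2 for x in added))
    a = my - b * mx
    return a / b

# Invented count rates and an assumed certified value (illustration only)
zn = conc_by_comparison(5200.0, 4800.0, 1400.0)  # mg/kg
pb = conc_by_standard_additions([0.0, 5.0, 10.0], [16.0, 26.0, 36.0])
```

Standard additions is preferred for Pb here because it cancels matrix effects that a direct comparison against a different matrix would not.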

  17. Robust statistics and geochemical data analysis

    International Nuclear Information System (INIS)

    Di, Z.

    1987-01-01

    Advantages of robust procedures over ordinary least-squares procedures in geochemical data analysis are demonstrated using NURE data from the Hot Springs Quadrangle, South Dakota, USA. Robust principal components analysis with 5% multivariate trimming successfully guarded the analysis against perturbations by outliers and increased the number of interpretable factors. Regression with SINE estimates significantly increased the goodness-of-fit of the regression and improved the correspondence of delineated anomalies with known uranium prospects. Because of the ubiquitous existence of outliers in geochemical data, robust statistical procedures are suggested as routine procedures to replace ordinary least-squares procedures.
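The benefit of multivariate trimming can be sketched in two dimensions: drop the fraction of points farthest from a robust centre before computing principal components, so a single gross outlier cannot rotate the leading axis. This toy version trims by distance from the coordinate-wise median rather than by the iterated Mahalanobis criterion used in full robust PCA:

```python
import math

def leading_eigvec(points):
    """Unit leading principal axis of 2-D data (closed-form 2x2 eigenproblem)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    lam = (sxx + syy) / 2 + math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    if abs(sxy) < 1e-12:          # axis-aligned data (assumes sxx >= syy)
        return (1.0, 0.0)
    v = (lam - syy, sxy)          # eigenvector for the largest eigenvalue
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm)

def trimmed(points, frac=0.05):
    """Drop the `frac` of points farthest from the coordinate-wise median
    (a crude stand-in for iterated multivariate trimming)."""
    n = len(points)
    xs = sorted(p[0] for p in points)
    ys = sorted(p[1] for p in points)
    med = (xs[n // 2], ys[n // 2])
    by_dist = sorted(points,
                     key=lambda p: (p[0] - med[0]) ** 2 + (p[1] - med[1]) ** 2)
    return by_dist[:n - max(1, int(frac * n))]

# 20 points on the line y = x plus one gross outlier
data = [(float(i), float(i)) for i in range(20)] + [(0.0, 100.0)]
axis_raw = leading_eigvec(data)            # rotated by the outlier
axis_trim = leading_eigvec(trimmed(data))  # recovers the 45-degree axis
```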

  18. Analysis of mineral phases in coal utilizing factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    The mineral phase inclusions of coal are discussed. The contributions of these to a coal sample are determined utilizing several techniques. Neutron activation analysis in conjunction with coal washability studies has produced some information on the general trends of elemental variation in the mineral phases. These results have been enhanced by the use of various statistical techniques. Target transformation factor analysis (TTFA) is specifically discussed and shown to be able to produce elemental profiles of the mineral phases in coal. A data set consisting of physically fractionated coal samples was generated. These samples were analyzed by neutron activation analysis and their elemental concentrations were then examined using TTFA. Information concerning the mineral phases in coal can thus be acquired from factor analysis even with limited data. Additional data may permit the resolution of additional mineral phases as well as refinement of those already identified.

  19. Structural integrity evaluation of X52 gas pipes subjected to external corrosion defects using the SINTAP procedure

    Energy Technology Data Exchange (ETDEWEB)

    Adib-Ramezani, H. [Ecole Polytechnique de l' Universite d' Orleans, CNRS-CRMD, 8 rue Leonard de Vinci, 45072 Orleans Cedex 2 (France)]. E-mail: hradib_2000@yahoo.com; Jeong, J. [Ecole Polytechnique de l' Universite d' Orleans, CNRS-CRMD, 8 rue Leonard de Vinci, 45072 Orleans Cedex 2 (France); Pluvinage, G. [Laboratoire de Fiabilite Mecanique (LFM), Universite de Metz-ENIM, 57045 Metz (France)

    2006-06-15

    In the present study, the SINTAP procedure has been proposed as a general structural integrity tool for semi-spherical, semi-elliptical and long blunt notch defects. The notch stress intensity factor concept and the SINTAP structural integrity procedure are employed to assess gas pipeline integrity. External longitudinal defects have been investigated via elastic-plastic finite element method results. The notch stress intensity concept is implemented into the SINTAP procedure. The safety factor is calculated via SINTAP procedure levels 0B and 1B. The extracted evaluations are compared with limit load analyses based on ASME B31G, modified ASME B31G, DNV RP-F101 and a recently proposed formulation [Choi JB, Goo BK, Kim JC, Kim YJ, Kim WS. Development of limit load solutions for corroded gas pipelines. Int J Pressure Vessel Piping 2003;80(2):121-128]. The comparison among the extracted safety factors shows that the SINTAP predictions are located between the lower and upper safety factor bounds. The SINTAP procedure, including the notch-based assessment diagram (the so-called 'NFAD'), covers a wide range of defect geometries with low, moderate and high stress concentrations and relative stress gradients. Finally, some advanced and forward-looking viewpoints have been investigated.
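For comparison, the limit-load criteria that SINTAP is benchmarked against reduce to closed-form expressions. Below is the commonly quoted textbook form of the modified ASME B31G failure-pressure equation; the constants and validity limits should be verified against the current edition of the standard, and the X52 geometry and operating pressure are assumed for illustration:

```python
import math

def modified_b31g_pf(D, t, d, L, smys):
    """Failure pressure (MPa) of a blunt corrosion defect, textbook form of
    the modified ASME B31G ('0.85dL') criterion. D, t, d, L in mm; smys in
    MPa. Verify constants against the current standard before any real use."""
    z = L * L / (D * t)                  # normalized defect length
    if z > 50.0:
        raise ValueError("long-defect branch not covered by this sketch")
    M = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z * z)  # Folias factor
    s_flow = smys + 69.0                 # flow stress approximation, MPa
    r = d / t                            # relative defect depth
    return s_flow * (2.0 * t / D) * (1.0 - 0.85 * r) / (1.0 - 0.85 * r / M)

# Assumed X52 pipe (SMYS = 359 MPa) and an illustrative defect geometry
pf_corroded = modified_b31g_pf(D=610.0, t=9.5, d=4.0, L=200.0, smys=359.0)
pf_intact = modified_b31g_pf(D=610.0, t=9.5, d=0.0, L=200.0, smys=359.0)
safety_factor = pf_corroded / 5.0        # against an assumed 5 MPa operation
```

Since the bracketed term is below 1 whenever d > 0, the corroded failure pressure always falls below the intact value, which is the bound the SINTAP safety factors are compared against.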

  20. Structural integrity evaluation of X52 gas pipes subjected to external corrosion defects using the SINTAP procedure

    International Nuclear Information System (INIS)

    Adib-Ramezani, H.; Jeong, J.; Pluvinage, G.

    2006-01-01

    In the present study, the SINTAP procedure has been proposed as a general structural integrity tool for semi-spherical, semi-elliptical and long blunt notch defects. The notch stress intensity factor concept and the SINTAP structural integrity procedure are employed to assess gas pipeline integrity. External longitudinal defects have been investigated via elastic-plastic finite element method results. The notch stress intensity concept is implemented into the SINTAP procedure. The safety factor is calculated via SINTAP procedure levels 0B and 1B. The extracted evaluations are compared with limit load analyses based on ASME B31G, modified ASME B31G, DNV RP-F101 and a recently proposed formulation [Choi JB, Goo BK, Kim JC, Kim YJ, Kim WS. Development of limit load solutions for corroded gas pipelines. Int J Pressure Vessel Piping 2003;80(2):121-128]. The comparison among the extracted safety factors shows that the SINTAP predictions are located between the lower and upper safety factor bounds. The SINTAP procedure, including the notch-based assessment diagram (the so-called 'NFAD'), covers a wide range of defect geometries with low, moderate and high stress concentrations and relative stress gradients. Finally, some advanced and forward-looking viewpoints have been investigated.

  1. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others]

    2003-07-01

    This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure that were used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). A Level I PSA delineates the accident sequences leading to core damage and estimates their frequencies. It has been directly used for assessing and modifying system safety and reliability, as a key and basic part of PSA. Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report covers six major procedural steps for a Level I PSA: familiarization with the plant, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level I PSAs for NPPs. A particular aim is to promote a standardized framework, terminology and form of documentation for PSAs. On the other hand, this report would be useful for managers or regulatory persons involved in risk-informed regulation, and also for conducting PSAs in other industries.
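The final step listed, accident sequence quantification, ultimately multiplies an initiating-event frequency by the probability of the mitigating-system failures along each sequence. A minimal sketch using the rare-event approximation over minimal cut sets (all numbers invented, not from any plant PSA):

```python
def cdf_point_estimate(ie_freq, cutsets, probs):
    """Core damage frequency (/yr) by the rare-event approximation:
    initiating-event frequency times the sum over minimal cut sets of the
    product of their basic-event probabilities."""
    total = 0.0
    for cs in cutsets:
        p = 1.0
        for event in cs:
            p *= probs[event]
        total += p
    return ie_freq * total

# Invented illustrative numbers, not from any plant PSA
probs = {"pump_A": 1e-3, "pump_B": 1e-3, "ccf_pumps": 1e-5, "op_error": 5e-3}
cutsets = [("pump_A", "pump_B"), ("ccf_pumps",), ("op_error",)]
cdf = cdf_point_estimate(0.1, cutsets, probs)  # IE frequency 0.1 /yr
```

The dominant single-event cut set (here the operator error) drives the result, which is why Level I PSA results are routinely screened for importance, as the report's quantification step describes.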

  2. Review and Application of Ship Collision and Grounding Analysis Procedures

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2010-01-01

    It is the purpose of the paper to present a review of prediction and analysis tools for collision and grounding analyses and to outline a probabilistic procedure by which these tools can be used by the maritime industry to develop performance-based rules to reduce the risk associated with the human, environmental and economic costs of collision and grounding events. The main goal of collision and grounding research should be to identify the most economic risk control options associated with prevention and mitigation of collision and grounding events.

  3. Direct versus indirect revascularization procedures for moyamoya disease: a comparative effectiveness study.

    Science.gov (United States)

    Macyszyn, Luke; Attiah, Mark; Ma, Tracy S; Ali, Zarina; Faught, Ryan; Hossain, Alisha; Man, Karen; Patel, Hiren; Sobota, Rosanna; Zager, Eric L; Stein, Sherman C

    2017-05-01

    OBJECTIVE Moyamoya disease (MMD) is a chronic cerebrovascular disease that can lead to devastating neurological outcomes. Surgical intervention is the definitive treatment, with direct, indirect, and combined revascularization procedures currently employed by surgeons. The optimal surgical approach, however, remains unclear. In this decision analysis, the authors compared the effectiveness of revascularization procedures in both adult and pediatric patients with MMD. METHODS A comprehensive literature search was performed for studies of MMD. Using complication and success rates from the literature, the authors constructed a decision analysis model for treatment using direct and indirect revascularization techniques. Utility values for the various outcomes and complications were extracted from the literature examining preferences in similar clinical conditions. Sensitivity analysis was performed. RESULTS A structured literature search yielded 33 studies involving 4197 cases. Cases were divided into adult and pediatric populations and further subdivided into 3 treatment groups: indirect, direct, and combined revascularization procedures. In the pediatric population at 5- and 10-year follow-up, there was no significant difference between indirect and combination procedures, but both were superior to direct revascularization. In adults at 4-year follow-up, indirect was superior to direct revascularization. CONCLUSIONS In the absence of factors that dictate a specific approach, the present decision analysis suggests that direct revascularization procedures are inferior in terms of quality-adjusted life years in adults at 4 years and in children at 5 and 10 years postoperatively. These findings were statistically significant; indirect and combination procedures may offer optimal results at long-term follow-up.
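
    The core of such a decision analysis is an expected-utility comparison across strategies. A minimal two-branch sketch, with all probabilities, utilities and the time horizon invented (they are not the values used in the study):

```python
def expected_qalys(p_complication, u_well, u_complication, years):
    # Expected quality-adjusted life years for one strategy in a
    # two-branch decision tree (complication / no complication),
    # with no discounting: years * E[utility].
    return years * (p_complication * u_complication
                    + (1 - p_complication) * u_well)

# Invented inputs for two hypothetical strategies over a 5-year horizon:
direct = expected_qalys(0.10, u_well=0.95, u_complication=0.60, years=5)
indirect = expected_qalys(0.05, u_well=0.95, u_complication=0.60, years=5)
```

    The strategy with the higher expected QALYs is preferred; sensitivity analysis then varies the inputs over plausible ranges to test whether the ranking holds.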

  4. Evaluation of the effect of corrosion defects on the structural integrity of X52 gas pipelines using the SINTAP procedure and notch theory

    International Nuclear Information System (INIS)

    Adib, H.; Jallouf, S.; Schmitt, C.; Carmasol, A.; Pluvinage, G.

    2007-01-01

    The notch stress intensity factor concept and the structural integrity assessment procedure for European industry (SINTAP) structural integrity procedure are used to assess gas pipeline integrity using deterministic and probabilistic methods. These pipes have external longitudinal semi-elliptical corrosion defects. Stress concentration at a defect tip is investigated via elastic-plastic finite element method analysis. The notch stress intensity concept is implemented into the SINTAP procedure and a notch-based failure assessment diagram is proposed. The safety factor and security factor are calculated through the SINTAP basic level.

  5. Evaluation of the effect of corrosion defects on the structural integrity of X52 gas pipelines using the SINTAP procedure and notch theory

    Energy Technology Data Exchange (ETDEWEB)

    Adib, H. [ENIM, Laboratoire de Fiabilite Mecanique (LFM) Ile du Saulcy, 57045 Metz Cedex (France); Jallouf, S. [ENIM, Laboratoire de Fiabilite Mecanique (LFM) Ile du Saulcy, 57045 Metz Cedex (France); Schmitt, C. [ENIM, Laboratoire de Fiabilite Mecanique (LFM) Ile du Saulcy, 57045 Metz Cedex (France)]. E-mail: schmitt@enim.fr; Carmasol, A. [ENIM, Laboratoire de Fiabilite Mecanique (LFM) Ile du Saulcy, 57045 Metz Cedex (France); Pluvinage, G. [ENIM, Laboratoire de Fiabilite Mecanique (LFM) Ile du Saulcy, 57045 Metz Cedex (France)

    2007-03-15

    The notch stress intensity factor concept and the structural integrity assessment procedure for European industry (SINTAP) structural integrity procedure are used to assess gas pipeline integrity using deterministic and probabilistic methods. These pipes have external longitudinal semi-elliptical corrosion defects. Stress concentration at a defect tip is investigated via elastic-plastic finite element method analysis. The notch stress intensity concept is implemented into the SINTAP procedure and a notch-based failure assessment diagram is proposed. The safety factor and security factor are calculated through the SINTAP basic level.

  6. Scenes for Social Information Processing in Adolescence: Item and factor analytic procedures for psychometric appraisal.

    Science.gov (United States)

    Vagos, Paula; Rijo, Daniel; Santos, Isabel M

    2016-04-01

    Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two attribution-of-intent measures (hostile and neutral) were produced, along with three emotion measures focused on negative emotional states, eight response evaluation measures, and four response decision measures covering prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys tended to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, assertiveness and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems to be a valuable alternative for evaluating social information processing, even if further investigation of its internal and external validity is essential. (c) 2016 APA, all rights reserved.
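
    The internal-consistency values reported for such measures are typically Cronbach's alpha. A minimal sketch of the computation, on invented item scores (three items, five respondents), not data from the study:

```python
from statistics import variance

def cronbach_alpha(items):
    # Cronbach's alpha: items is one list of scores per item,
    # aligned across the same respondents (sample variance used).
    k = len(items)
    item_var_sum = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

items = [[3, 4, 2, 5, 4],   # invented responses, item 1
         [2, 4, 3, 5, 3],   # item 2
         [3, 5, 2, 4, 4]]   # item 3
alpha = cronbach_alpha(items)
```

    Values around 0.7 or above are conventionally read as acceptable internal consistency.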

  7. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  8. Safety of Running Two Rooms: A Systematic Review and Meta-Analysis of Overlapping Neurosurgical Procedures.

    Science.gov (United States)

    Self, D Mitchell; Ilyas, Adeel; Stetler, William R

    2018-04-27

    Overlapping surgery, a long-standing practice within academic neurosurgery centers nationwide, has recently come under scrutiny from the government and media as potentially harmful to patients. The objective of this systematic review and meta-analysis is therefore to determine the safety of overlapping neurosurgical procedures. The authors performed a systematic review and meta-analysis in accordance with PRISMA guidelines. A review of the PubMed and Medline databases was undertaken with the search phrase "overlapping surgery AND neurosurgery AND outcomes." Data regarding patient demographics, type of neurosurgical procedure, and outcomes and complications were extracted from each study. The principal summary measure was the odds ratio (OR) of the association of overlapping versus non-overlapping surgery with outcomes. The literature search yielded a total of 36 studies, of which 5 met inclusion criteria and were included in this study. These studies included a total of 25,764 patients undergoing neurosurgical procedures. Overlapping surgery was associated with an increased likelihood of being discharged home (OR = 1.32; 95% CI 1.20 to 1.44; P < 0.001) and a reduced 30-day unexpected return to the operating room (OR = 0.79; 95% CI 0.72 to 0.87; P < 0.001). Overlapping surgery did not significantly affect the OR of length of surgery, 30-day mortality, or 30-day readmission. Overlapping neurosurgical procedures were not associated with worse patient outcomes. Additional prospective studies are needed to further assess the safety of overlapping procedures. Copyright © 2018. Published by Elsevier Inc.
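
    Pooled ORs like these are commonly obtained by fixed-effect inverse-variance weighting on the log-OR scale, recovering each study's standard error from its reported 95% CI. A sketch of that mechanism (not necessarily the model the authors used), fed here with the abstract's discharge-home result as a single "study" so it simply reproduces it:

```python
import math

def pooled_or(ors, ci_lowers, ci_uppers, z=1.96):
    # Fixed-effect inverse-variance pooling on the log-OR scale.
    # Each SE is recovered from the reported 95% CI:
    # se = (ln(upper) - ln(lower)) / (2 * z).
    weights, weighted = [], []
    for or_, lo_, hi_ in zip(ors, ci_lowers, ci_uppers):
        se = (math.log(hi_) - math.log(lo_)) / (2 * z)
        w = 1.0 / se ** 2
        weights.append(w)
        weighted.append(w * math.log(or_))
    log_or = sum(weighted) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(log_or),
            math.exp(log_or - z * se_pooled),
            math.exp(log_or + z * se_pooled))

or_hat, lo, hi = pooled_or([1.32], [1.20], [1.44])
```

    With several studies, each contributes in proportion to its precision; heterogeneous study sets would instead call for a random-effects model.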

  9. [Costing nuclear medicine diagnostic procedures].

    Science.gov (United States)

    Markou, Pavlos

    2005-01-01

    To the Editor: Referring to a recent special report about the cost analysis of twenty-nine nuclear medicine procedures, I would like to clarify some basic aspects of determining the costs of nuclear medicine procedures with various costing methodologies. The activity-based costing (ABC) method, a new approach to costing imaging services, can provide the most accurate cost data but is difficult to apply to nuclear medicine diagnostic procedures, because ABC requires determining and analyzing all direct and indirect costs of each procedure according to all of its activities. Traditional costing methods, such as estimating income and expenses per procedure or fixed and variable costs per procedure (widely used in break-even point analysis), as well as the ratio-of-costs-to-charges method per procedure, may easily be applied in nuclear medicine departments to evaluate the variability and differences between costs and reimbursement charges.
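
    The break-even analysis mentioned reduces to dividing fixed costs by the contribution margin per procedure. A minimal sketch with invented figures (not costs from the cited report):

```python
def break_even_volume(fixed_costs, reimbursement, variable_cost):
    # Procedure volume at which total reimbursement equals total cost:
    # fixed_costs / (contribution margin per procedure).
    contribution = reimbursement - variable_cost
    if contribution <= 0:
        raise ValueError("procedure never breaks even")
    return fixed_costs / contribution

# Hypothetical department: 120,000 in annual fixed costs,
# 250 reimbursed and 90 variable cost per procedure.
n = break_even_volume(fixed_costs=120_000, reimbursement=250, variable_cost=90)
```

    Volumes above this value generate a surplus; below it, the procedure runs at a loss.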

  10. Predictive factors for complete removal in soft tissue sarcomas: a retrospective analysis in a series of 592 cases.

    Science.gov (United States)

    Sastre-Garau, X; Coindre, J M; Leroyer, A; Terrier, P; Ollivier, L; Stöckle, E; Bonichon, F; Collin, F; Le Doussal, V; Contesso, G; Vilain, M O; Jacquemier, J; Nguyen, B B

    1997-07-01

    In order to specify the indications for conservative surgery and preoperative therapeutic approaches in soft tissue sarcomas (STS), we looked for the clinico-pathological parameters associated with failure to obtain a complete removal (CRm) of the tumor. We retrospectively analyzed a series of 592 cases of primary non-metastatic STS. Surgery was performed in 495 cases as a primary treatment and in 88 cases after chemo- or radiotherapy. Nine patients were treated by chemotherapy-radiotherapy. In a univariate analysis, 20 parameters were tested for their association with CRm. A multivariate analysis was then used to define the independent parameters linked to the achievement of a CRm. In the univariate analysis, 15 parameters were found to be linked to the achievement of a CRm. Three of them proved to be independent in the multivariate analysis: T in the TNM classification, tumor location, and tumor necrosis. By combining these risk factors, four groups of patients were defined, with respective CRm rates of 97% (no factor), 95% (one factor), 70% (two factors), and 48% (three factors). The achievement of a CRm after surgery for STS depends not only on the accessibility of the lesion, but also on tumor aggressiveness, one reflection of which is necrosis. The detection of necrosis by imaging procedures may thus help predict the resectability of tumors and define the indications for neoadjuvant therapies, which are likely to broaden the use of conservative surgery.

  11. Identifying Human Factors Issues in Aircraft Maintenance Operations

    Science.gov (United States)

    Veinott, Elizabeth S.; Kanki, Barbara G.; Shafto, Michael G. (Technical Monitor)

    1995-01-01

    Maintenance operations incidents submitted to the Aviation Safety Reporting System (ASRS) between 1986 and 1992 were systematically analyzed in order to identify issues relevant to human factors and crew coordination. This exploratory analysis involved 95 ASRS reports which represented a wide range of maintenance incidents. The reports were coded and analyzed according to the type of error (e.g., wrong part, procedural error, non-procedural error), contributing factors (e.g., individual, within-team, cross-team, procedure, tools), result of the error (e.g., aircraft damage or not), as well as the operational impact (e.g., aircraft flown to destination, air return, delay at gate). The main findings indicate that procedural errors were most common (48.4%) and that individual and team actions contributed to the errors in more than 50% of the cases. As for operational results, most errors were either corrected after landing at the destination (51.6%) or required the flight crew to stop enroute (29.5%). Interactions among these variables are also discussed. This analysis is a first step toward developing a taxonomy of crew coordination problems in maintenance. By understanding which variables are important and how they are interrelated, we may develop intervention strategies that are better tailored to the human factors issues involved.

  12. Experimental Analysis for Factors Affecting the Repeatability of Plastics Injection Molding Tests on the Self-developed Apparatus

    Directory of Open Access Journals (Sweden)

    Yugang Huang

    2013-06-01

    Full Text Available To improve the repeatability of injection molding test results, the affecting factors were investigated by means of experiments. Besides the traditional processing parameters, test-condition factors were also considered. In order to focus on the molding process rather than the molded part, measurement of the melt pressure curve at the entrance to the nozzle was used as the output characteristic. Experiments on polypropylene (PP) showed that the injected volume was the key processing parameter. Among the test conditions, the injection number is the most important factor. Based on this analysis, the operating procedure was improved effectively. Doi: 10.12777/ijse.5.1.6-11 [How to cite this article: Huang, Y., Li, D., Liu, Y. (2013). Experimental Analysis for Factors Affecting the Repeatability of Plastics Injection Molding Tests on the Self-developed Apparatus. International Journal of Science and Engineering, 5(1), 6-11. Doi: 10.12777/ijse.5.1.6-11]

  13. Optimization of headspace experimental factors to determine chlorophenols in water by means of headspace solid-phase microextraction and gas chromatography coupled with mass spectrometry and parallel factor analysis.

    Science.gov (United States)

    Morales, Rocío; Cruz Ortiz, M; Sarabia, Luis A

    2012-11-19

    In this work an analytical procedure based on headspace solid-phase microextraction and gas chromatography coupled with mass spectrometry (HS-SPME-GC/MS) is proposed to determine chlorophenols, with a prior derivatization step to improve analyte volatility and therefore the decision limit (CCα). After optimization, the analytical procedure was applied to river water samples. The following analytes are studied: 2,4-dichlorophenol (2,4-DCP), 2,4,6-trichlorophenol (2,4,6-TrCP), 2,3,4,6-tetrachlorophenol (2,3,4,6-TeCP) and pentachlorophenol (PCP). A D-optimal design is used to study the parameters affecting the HS-SPME process and the derivatization step. Four experimental factors at two levels and one factor at three levels were considered: (i) equilibrium/extraction temperature, (ii) extraction time, (iii) sample volume, (iv) agitation time and (v) equilibrium time. In addition, two interactions between four of them were considered. The D-optimal design enables the number of experiments to be reduced from 48 to 18 while maintaining sufficient precision in the estimation of the effects. As every analysis took 1 h, the design was blocked over 2 days. The second-order property of the PARAFAC (parallel factor analysis) decomposition avoids the need to fit a new calibration model each time the experimental conditions change. Consequently, the standardized loadings in the sample mode estimated by a PARAFAC decomposition are the response used in the design, because they are proportional to the amount of analyte extracted. It was found that the block effect is significant and that a 60°C equilibrium temperature together with a 25 min extraction time is necessary to achieve the best extraction for the chlorophenols analyzed. The other factors and interactions were not significant. After that, a calibration based on a PARAFAC2 decomposition provided the following CCα values: 120, 208, 86 and 39 ng L(-1) for 2,4-DCP, 2,4,6-TrCP, 2,3,4,6-TeCP and PCP, respectively.

  14. Analysis of Academic Administrators' Attitudes: Annual Evaluations and Factors That Improve Teaching

    Science.gov (United States)

    Cherry, Brian D.; Grasse, Nathan; Kapla, Dale; Hamel, Brad

    2017-01-01

    This article examines academic administrators' attitudes towards the academic evaluation process in the US and those factors that are utilised to improve teaching. We use path regressions to examine satisfaction with evaluation procedures, as well as the direct and indirect effects of these factors on perceptions of whether the evaluation process…

  15. Power analysis for multivariate and repeated measures designs: a flexible approach using the SPSS MANOVA procedure.

    Science.gov (United States)

    D'Amico, E J; Neilands, T B; Zambarano, R

    2001-11-01

    Although power analysis is an important component in the planning and implementation of research designs, it is often ignored. Computer programs for performing power analysis are available, but most have limitations, particularly for complex multivariate designs. An SPSS procedure is presented that can be used for calculating power for univariate, multivariate, and repeated measures models with and without time-varying and time-constant covariates. Three examples provide a framework for calculating power via this method: an ANCOVA, a MANOVA, and a repeated measures ANOVA with two or more groups. The benefits and limitations of this procedure are discussed.
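
    A minimal stand-alone sketch of the kind of calculation such a procedure automates: approximate power of a two-sided, two-sample comparison via the normal approximation. This is a simplified stand-in, not the SPSS MANOVA procedure described in the abstract, and the effect size and sample size are illustrative.

```python
from statistics import NormalDist

def two_group_power(d, n_per_group, alpha=0.05):
    # Approximate power of a two-sided two-sample z test for a
    # standardized mean difference d (normal approximation to the
    # noncentral t/F machinery used by full power software).
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    ncp = d * (n_per_group / 2) ** 0.5  # noncentrality of the test statistic
    return nd.cdf(ncp - z_a) + nd.cdf(-ncp - z_a)

power = two_group_power(d=0.5, n_per_group=64)  # classic ~80% power case
```

    Multivariate and repeated measures designs replace the normal deviate with noncentral F distributions, which is exactly what makes dedicated procedures like the one described valuable.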

  16. Patient Dose During Carotid Artery Stenting With Embolic-Protection Devices: Evaluation With Radiochromic Films and Related Diagnostic Reference Levels According to Factors Influencing the Procedure

    International Nuclear Information System (INIS)

    D’Ercole, Loredana; Quaretti, Pietro; Cionfoli, Nicola; Klersy, Catherine; Bocchiola, Milena; Rodolico, Giuseppe; Azzaretti, Andrea; Lisciandro, Francesco; Cascella, Tommaso; Zappoli Thyrion, Federico

    2013-01-01

    To measure the maximum entrance skin dose (MESD) in patients undergoing carotid artery stenting (CAS) with embolic-protection devices (EPDs), to analyze the dependence of dose and exposure parameters on anatomical, clinical, and technical factors affecting procedure complexity, to derive local diagnostic reference levels (DRLs), and to evaluate whether exceeding the DRLs is related to procedure complexity. MESD was evaluated with radiochromic films in 31 patients (mean age 72 ± 7 years). Five of 33 (15 %) procedures used a proximal EPD, and 28 of 33 (85 %) used a distal EPD. Local DRLs were derived from the exposure parameters recorded in 93 patients (65 men and 28 women, mean age 73 ± 9 years) undergoing 96 CAS procedures with proximal (33 %) or distal (67 %) EPDs. Four bilateral lesions were included. MESD values averaged 0.96 ± 0.42 Gy. Local DRLs for kerma-area product (KAP), fluoroscopy time, and number of frames (FR) were 269 Gy cm2, 28 minutes, and 251, respectively. Only simultaneous bilateral treatment was associated with KAP overexposure (odds ratio [OR] 10.14, 95 % CI 1–102.7) and with FR overexposure (OR 10.8, 95 % CI 1.1–109.5); stenosis ≥ 90 % was associated with FR overexposure (OR 2.8, 95 % CI 1.1–7.4, p = 0.040). At multivariable analysis, stenosis ≥ 90 % (OR 2.8, 95 % CI 1.1–7.4, p = 0.040) and bilateral treatment (OR 10.8, 95 % CI 1.1–109.5, p = 0.027) were associated with overexposure for two or more parameters. Skin doses are not problematic in CAS with EPDs because these procedures rarely lead to doses >2 Gy.
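
    Local DRLs are conventionally set at the 75th percentile of the observed dose-indicator distribution. A sketch of that computation on invented KAP values (not the study's data):

```python
def local_drl(values):
    # Local diagnostic reference level: the 75th percentile of the
    # observed distribution, by linear interpolation between order
    # statistics (one of several common percentile conventions).
    data = sorted(values)
    k = 0.75 * (len(data) - 1)
    f = int(k)
    if f + 1 >= len(data):
        return data[f]
    return data[f] + (k - f) * (data[f + 1] - data[f])

kap_values = [120, 95, 310, 180, 260, 145, 400, 220, 175, 285]  # Gy*cm^2, invented
drl = local_drl(kap_values)
```

    Procedures whose KAP exceeds this value are then flagged and examined for complexity factors, as the study does for bilateral treatment and severe stenosis.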

  17. Comparative analysis of lockout programs and procedures applied to industrial machines

    Energy Technology Data Exchange (ETDEWEB)

    Chinniah, Y.; Champoux, M.; Burlet-Vienney, D.; Daigle, R. [Institut de recherche Robert-Sauve en sante et en securite du travail, Montreal, PQ (Canada)

    2008-09-15

    In 2005, approximately 20 workers in Quebec were killed by dangerous machines, and approximately 13,000 accidents in the province were linked to the use of machines. The cost associated with these accidents was estimated at $70 million to the Quebec Occupational Health and Safety Commission (CSST) in compensation and salary replacement. According to article 185 of the Quebec Occupational Health and Safety Regulation (RSST), workers intervening in hazardous zones of machines and processes during maintenance, repair, and unjamming activities must apply lockout procedures. Lockout is defined as the placement of a lock or tag on an energy-isolating device in accordance with an established procedure, indicating that the energy-isolating device is not to be operated until removal of the lock or tag in accordance with an established procedure. This report presented a comparative analysis of lockout programs and procedures applied to industrial machines. The study addressed several questions: the concept of lockout and its definition in the literature; the differences in legal lockout requirements among provinces and countries; the different standards on lockout; the contents of lockout programs as described in different documents; and the compliance of lockout programs in a sample of Quebec industries with the Canadian lockout standard, CSA Z460-05 (2005). The report discussed the research objectives, methodology, and results of the study. It was concluded that the concept of lockout has different meanings in the literature, especially in regulations, although the definitions found in standards show certain similarities. 50 refs., 52 tabs., 2 appendices.

  18. Methodology for LOCA analysis and its qualification procedures for PWR reload licensing

    International Nuclear Information System (INIS)

    Serrano, M.A.B.

    1986-01-01

    The methodology for LOCA analysis developed by FURNAS and its qualification procedure for PWR reload licensing are presented. Digital computer codes developed by the NRC and published collectively as the WREM package were modified to obtain versions that comply with each requirement of the Brazilian licensing criteria. This methodology is applied to the Angra-1 base case to conclude the qualification process. (Author) [pt

  19. Random analysis of bearing capacity of square footing using the LAS procedure

    Science.gov (United States)

    Kawa, Marek; Puła, Wojciech; Suska, Michał

    2016-09-01

    In the present paper, a three-dimensional problem of the bearing capacity of a square footing on a random soil medium is analyzed. The random fields of the strength parameters c and φ are generated using the LAS procedure (Local Average Subdivision, Fenton and Vanmarcke 1990). The procedure has been re-implemented by the authors in the Mathematica environment in order to combine it with a commercial program. Since the procedure is still being tested, the random field has been assumed to be one-dimensional: the strength properties of the soil are random in the vertical direction only. Individual realizations of the bearing capacity boundary problem, with the strength parameters of the medium defined by the above procedure, are solved using the FLAC3D software. The analysis is performed for two qualitatively different cases, namely purely cohesive and cohesive-frictional soils. For the latter case, the friction angle and cohesion have been assumed to be independent random variables. For these two cases, the random bearing capacity results for the square footing have been obtained for fluctuation scales ranging from 0.5 m to 10 m, with 1000 Monte Carlo realizations performed each time. The obtained results allow not only the mean and variance but also the probability density function to be estimated. An example of the application of this function to a reliability calculation is presented in the final part of the paper.
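
    The Monte Carlo estimation step can be sketched in a much-reduced form: instead of LAS fields and FLAC3D solves, a closed-form undrained capacity with a single lognormal cohesion per realization. All parameters and the design load are invented for illustration; only the mean/variance/failure-probability bookkeeping mirrors the paper's procedure.

```python
import math
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

def bearing_capacity_samples(mean_c=100.0, cov_c=0.3, n_sims=2000):
    # Undrained bearing capacity q_u = (pi + 2) * c (Prandtl),
    # with cohesion c lognormally distributed (kPa; invented values).
    sigma_ln = math.sqrt(math.log(1 + cov_c ** 2))
    mu_ln = math.log(mean_c) - 0.5 * sigma_ln ** 2
    return [(math.pi + 2) * random.lognormvariate(mu_ln, sigma_ln)
            for _ in range(n_sims)]

samples = bearing_capacity_samples()
mean_qu = statistics.mean(samples)
stdev_qu = statistics.stdev(samples)
# Failure probability against a hypothetical 300 kPa design demand:
p_failure = sum(q < 300.0 for q in samples) / len(samples)
```

    With enough realizations, the empirical distribution of the samples approximates the probability density function used for the reliability calculation.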

  20. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    Science.gov (United States)

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often causes violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
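
    Repeatability, one of the performance metrics discussed, is often summarized by the repeatability coefficient derived from test-retest differences. A sketch on invented paired measurements (not the FDG-PET data used in the paper):

```python
import math

def repeatability_coefficient(test, retest):
    # RC = 1.96 * sqrt(2) * within-subject SD, with wSD^2 estimated
    # as mean(d_i^2) / 2 over the paired test-retest differences.
    diffs = [a - b for a, b in zip(test, retest)]
    wsd = math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))
    return 1.96 * math.sqrt(2) * wsd

test = [5.0, 3.2, 7.1, 4.4, 6.0]    # invented SUV-like values
retest = [5.4, 3.0, 6.7, 4.9, 5.8]
rc = repeatability_coefficient(test, retest)
```

    A meta-analysis then pools such per-study estimates, with the small-sample corrections the paper discusses becoming important when individual studies contribute only a handful of subjects.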

  1. Analysis of nasopharyngeal carcinoma risk factors with Bayesian networks.

    Science.gov (United States)

    Aussem, Alex; de Morais, Sérgio Rodrigues; Corbex, Marilys

    2012-01-01

    We propose a new graphical framework for extracting the relevant dietary, social and environmental risk factors associated with an increased risk of nasopharyngeal carcinoma (NPC) in a case-control epidemiologic study of 1289 subjects and 150 risk factors. This framework builds on the use of Bayesian networks (BNs) for representing statistical dependencies between the random variables. We discuss a novel constraint-based procedure, called Hybrid Parents and Children (HPC), that recursively builds a local graph including all the features statistically associated with NPC, without having to learn the whole BN first. The local graph is afterwards directed by the domain expert according to his knowledge. It provides a statistical profile of the recruited population and meanwhile helps identify the risk factors associated with NPC. Extensive experiments on synthetic data sampled from known BNs show that HPC outperforms state-of-the-art algorithms from the recent literature. From a biological perspective, the present study confirms that chemical products, pesticides and domestic fume intake from incomplete combustion of coal and wood are significantly associated with NPC risk. These results suggest that industrial workers are often exposed to noxious chemicals and poisonous substances used in the course of manufacturing. This study also supports previous findings that the consumption of a number of preserved food items, like house-made proteins and sheep fat, is a major risk factor for NPC. BNs are valuable data mining tools for the analysis of epidemiologic data. They can explicitly combine both expert knowledge from the field and information inferred from the data. These techniques therefore merit consideration as valuable alternatives to traditional multivariate regression techniques in epidemiologic studies. Copyright © 2011 Elsevier B.V. All rights reserved.
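
    Constraint-based learners like HPC decide which edges to keep by running many conditional independence tests. A minimal, stand-alone sketch of one such test, the likelihood-ratio G² statistic for two discrete variables, on an invented exposure/outcome table (this is one building block, not the HPC algorithm itself):

```python
import math
from collections import Counter

def g_squared(pairs):
    # G^2 = 2 * sum O * ln(O / E) over observed (x, y) cells, with
    # E the expected count under independence; compare against a
    # chi-square quantile with (|X|-1)(|Y|-1) degrees of freedom.
    n = len(pairs)
    xy = Counter(pairs)
    x = Counter(a for a, _ in pairs)
    y = Counter(b for _, b in pairs)
    g2 = 0.0
    for (a, b), observed in xy.items():
        expected = x[a] * y[b] / n
        g2 += 2 * observed * math.log(observed / expected)
    return g2

# Invented 2x2 case-control counts showing a clear association:
pairs = ([("exposed", "case")] * 30 + [("exposed", "control")] * 10
         + [("unexposed", "case")] * 10 + [("unexposed", "control")] * 30)
g2 = g_squared(pairs)  # well above 3.84, the df=1 critical value at alpha=0.05
```

    A structure learner conditions such tests on candidate separating sets, removing an edge whenever independence cannot be rejected.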

  2. Application procedures and analysis examples of the SIE ASME-NH program

    International Nuclear Information System (INIS)

    Kim, Seok Hoon; Koo, G. H.; Kim, J. B.

    2010-12-01

    In this report, the design rules of the ASME-NH Code are briefly summarized, the application procedures of the SIE ASME-NH program are analysed, and analysis examples are described. The SIE ASME-NH program was developed according to the ASME Code Section III Subsection NH rules to perform the primary stress limit, accumulated inelastic strain limit, and creep-fatigue damage evaluations in the structural design of nuclear power plants operating at temperatures above the creep regime under normal operating conditions. Among the analysis examples, the benchmark problem for the high-temperature reactor vessel discussed in the SIE ASME-NH user's seminar is described, along with a preliminary structural analysis of an Advanced Burner Test Reactor internal structure. Considering the load combinations of the various cycle types arising from significant operating conditions, the integrity of the reactor internal structure was reviewed against the stress and strain limits of the ASME-NH rules, and the analysis and evaluation results are summarized.
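
    The creep-fatigue evaluation mentioned rests on accumulating two damage fractions over the load histogram. A simplified sketch of the bookkeeping, with invented cycle counts, hold times and allowables (ASME-NH actually checks the two sums against a bilinear damage envelope, not a plain sum):

```python
def creep_fatigue_damage(fatigue_blocks, creep_blocks):
    # Linear damage summation: D = sum(n_i / N_i) + sum(t_j / T_j),
    # fatigue fractions from applied vs. allowable cycles, creep
    # fractions from hold time vs. allowable time at stress.
    d_fatigue = sum(n / N for n, N in fatigue_blocks)
    d_creep = sum(t / T for t, T in creep_blocks)
    return d_fatigue + d_creep

D = creep_fatigue_damage(
    fatigue_blocks=[(200, 10_000), (50, 2_000)],      # (cycles, allowable cycles)
    creep_blocks=[(5_000, 100_000), (1_000, 40_000)]  # (hold h, allowable h)
)
```

    A design program like the one described automates this accumulation over every load combination and then applies the code's acceptance envelope.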

  3. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies

    International Nuclear Information System (INIS)

    Chen, Hui; Luthra, Rajyalakshmi; Goswami, Rashmi S.; Singh, Rajesh R.; Roy-Chowdhuri, Sinchita

    2015-01-01

    Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to have a response to targeted therapy. The proper selection of tumor sample for downstream NGS based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using Ion Torrent Personal Genome Machine (PGM) (Life Technologies), a NGS sequencing platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin fixed and paraffin embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.

  4. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hui [Department of Pathology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Blvd, Houston, TX 77030 (United States); Luthra, Rajyalakshmi, E-mail: rluthra@mdanderson.org; Goswami, Rashmi S.; Singh, Rajesh R. [Department of Hematopathology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Blvd, Houston, TX 77030 (United States); Roy-Chowdhuri, Sinchita [Department of Pathology, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Blvd, Houston, TX 77030 (United States)

    2015-08-28

    Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to have a response to targeted therapy. The proper selection of tumor sample for downstream NGS-based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using the Ion Torrent Personal Genome Machine (PGM) (Life Technologies), an NGS sequencing platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin-fixed and paraffin-embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS-based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.

  5. Analysis of Pre-Analytic Factors Affecting the Success of Clinical Next-Generation Sequencing of Solid Organ Malignancies

    Directory of Open Access Journals (Sweden)

    Hui Chen

    2015-08-01

    Full Text Available Application of next-generation sequencing (NGS) technology to routine clinical practice has enabled characterization of personalized cancer genomes to identify patients likely to have a response to targeted therapy. The proper selection of tumor sample for downstream NGS-based mutational analysis is critical to generate accurate results and to guide therapeutic intervention. However, multiple pre-analytic factors come into play in determining the success of NGS testing. In this review, we discuss pre-analytic requirements for AmpliSeq PCR-based sequencing using the Ion Torrent Personal Genome Machine (PGM) (Life Technologies), an NGS sequencing platform that is often used by clinical laboratories for sequencing solid tumors because of its low input DNA requirement from formalin-fixed and paraffin-embedded tissue. The success of NGS mutational analysis is affected not only by the input DNA quantity but also by several other factors, including the specimen type, the DNA quality, and the tumor cellularity. Here, we review tissue requirements for solid tumor NGS-based mutational analysis, including procedure types, tissue types, tumor volume and fraction, decalcification, and treatment effects.

  6. Research procedure and criteria for analysis and choice of variants for construction of national radioactive waste repository

    International Nuclear Information System (INIS)

    Vachev, B.

    1993-01-01

    General principles, overriding objectives and future priorities of the basic radioactive waste management strategy are considered. The research procedure is based on the system approach and analysis, decision-making theory, and the basic objectives and principles of the national repository construction. Main criteria and some basic notions (such as the radioactive waste environment and radioactive waste barriers - input and output) are introduced. Six environment elements are identified: surroundings and natural environment, economic, scientific and technical-technological, socio-psychological, legal, and institutional-political. Flow charts of the hierarchical structure of the research procedure, the decision-making levels, and the direct and feedback links are presented, and a scenario analysis is proposed as one of the tools for reflecting uncertainty. The hierarchical structure of the high-level waste repository construction scenarios and variants tree (8 levels) is defined. The methodology and methods for the analysis, screening and choice of variants are considered, and a 7-group system of criteria and constraints for the analysis, screening and choice of variants is formulated. One implementation of the proposed methodology and procedure is the technological choice for radioactive waste conditioning and the solution of a preliminary site selection problem. 4 figs., 25 refs. (author)

  7. Assessment of Various Risk Factors for Success of Delayed and Immediate Loaded Dental Implants: A Retrospective Analysis.

    Science.gov (United States)

    Prasant, M C; Thukral, Rishi; Kumar, Sachin; Sadrani, Sannishth M; Baxi, Harsh; Shah, Aditi

    2016-10-01

    Ever since the introduction of dental implants in 1977, a period of at least a few months has been required for osseointegration to take place after implant surgery. With the passage of time and advancements in the field, this healing period has grown steadily shorter, and immediate loading of dental implants has become a popular procedure in recent years. We therefore retrospectively analyzed the risk factors for failure of delayed and immediately loaded dental implants. A retrospective analysis was performed of all patients who underwent dental implant surgery with either an immediate or a delayed loading procedure. The patients were divided into two groups: one containing patients who received delayed loaded implants and the other containing patients who received immediately loaded implants. Patients with missing follow-up records or a past medical history of systemic disease were excluded from the present study. Possible associated risk factors were evaluated by classifying the predictive factors as primary or secondary. All results were analyzed with Statistical Package for the Social Sciences (SPSS) software; Kaplan-Meier survival analysis and the chi-square test were used to assess the level of significance. The mean patient age was 54.2 years in the delayed group and 54.8 years in the immediate group. Comparison of the clinical parameters of the implants in the two groups yielded statistically significant results, whereas the demographic parameters showed no significant correlation. A significantly higher risk of implant failure is associated with immediately loaded implants. Tobacco smoking, shorter implant size, and other risk factors play a significant role in predicting the success and failure of dental implants. Delayed loaded implant placement should therefore be preferred.
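The chi-square comparison used in studies like this one can be sketched with SciPy. The 2x2 outcome-by-loading-protocol table below is invented for illustration and does not reproduce the study's data.

```python
# Hypothetical 2x2 contingency table: implant outcome by loading protocol.
# Counts are invented; the test logic is the standard chi-square comparison.
from scipy.stats import chi2_contingency

table = [[180, 12],   # delayed:   successes, failures
         [160, 32]]   # immediate: successes, failures
chi2, p, dof, expected = chi2_contingency(table)
significant = p < 0.05   # here the invented failure gap is significant
```

With real data, a Kaplan-Meier analysis would additionally account for the time at which each failure occurred rather than only the final counts.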

  8. Reference levels in PTCA as a function of procedure complexity

    International Nuclear Information System (INIS)

    Peterzol, A.; Quai, E.; Padovani, R.; Bernardi, G.; Kotre, C. J.; Dowling, A.

    2005-01-01

    The multicentre assessment of a procedure complexity index (CI) for the introduction of reference levels (RLs) in percutaneous transluminal coronary angioplasty (PTCA) is presented here. PTCAs were investigated based on the methodology proposed by Bernardi et al. Multiple linear stepwise regression analysis, including clinical, anatomical and technical factors, was performed to obtain predictors of fluoroscopy time. Based on these regression coefficients, a scoring system was defined and the CI obtained. The CI was used to classify dose values into three groups: low, medium and high complexity procedures, since the CI correlated well with dose (r = 0.41). RLs were proposed for dose-area product (in Gy cm2) and for fluoroscopy time (12, 20 and 27 min) for the three CI groups. (authors)
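The scoring-system idea (regression coefficients turned into an additive complexity score) can be sketched as follows. The factors, synthetic data, and integer-weighting rule are hypothetical and are not Bernardi et al.'s actual model.

```python
# Sketch of a complexity index: regress fluoroscopy time on procedure
# factors, then weight each factor by its regression coefficient.
# Factor names, effect sizes, and data are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200
cto = rng.integers(0, 2, n)            # hypothetical: chronic total occlusion
bifurcation = rng.integers(0, 2, n)    # hypothetical: bifurcation lesion
fluoro_min = 8 + 10 * cto + 6 * bifurcation + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), cto, bifurcation])
beta, *_ = np.linalg.lstsq(X, fluoro_min, rcond=None)

# integer score per factor, proportional to its fitted coefficient
weights = np.round(beta[1:] / beta[1:].min()).astype(int)
complexity_index = X[:, 1:] @ weights  # per-procedure additive score
```

Procedures can then be binned into low, medium and high complexity by thresholding the score, and dose RLs derived within each bin.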

  9. The use of analytical procedures in the internal audit of the restaurant business expenses

    Directory of Open Access Journals (Sweden)

    T.Yu. Kopotienko

    2015-06-01

    Full Text Available Obtaining sufficient and reliable audit evidence is an important task of an internal audit of expenses, and it can be achieved by using analytical procedures in the audit process. Identifying analytical procedures with the financial analysis of business activities prevents their efficient use in the internal audit of restaurant business expenses, and internal auditors' knowledge of the techniques of analytical procedures and of their tasks at different verification steps is insufficient. The purpose of the article is to develop methods for the internal audit of restaurant business expenses based on an integrated application of analytical procedures. The article investigates the nature and purpose of analytical procedures and identifies the factors influencing the auditor's choice of a complex of analytical procedures; among them, it recommends considering the purpose of the analytical procedures, the type and structure of the enterprise, the sources of available information, the existence of financial and non-financial information, and the reliability and comparability of the available information. The tasks of analytical procedures are identified for each verification step, and a complex of analytical procedures is offered as part of the internal audit of restaurant business expenses. This complex contains a list of the analytical procedures, the analysis techniques used in each procedure, and a brief overview of each procedure's content.
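One generic analytical procedure of the kind discussed above is a fluctuation (trend) analysis: compare each expense line to a prior-period baseline and flag deviations above a materiality threshold. The figures and the 15% threshold below are illustrative assumptions, not the article's example.

```python
# Minimal fluctuation analysis for expense lines. All amounts and the
# 15% threshold are hypothetical; a real audit would set thresholds
# from materiality and follow up each flag with further procedures.

def flag_deviations(actual, prior, threshold=0.15):
    """Return expense items whose relative change exceeds the threshold."""
    flags = {}
    for item in actual:
        change = (actual[item] - prior[item]) / prior[item]
        if abs(change) > threshold:
            flags[item] = round(change, 3)
    return flags

actual = {"food": 52_000, "labour": 31_000, "utilities": 9_500}
prior  = {"food": 50_000, "labour": 24_000, "utilities": 9_000}
flags = flag_deviations(actual, prior)   # only 'labour' exceeds 15%
```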

  10. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors.

    Science.gov (United States)

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-02-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and analysis of prognostic factors. In total, 77 male and 48 female patients with spinal tumors were selected at our hospital between January 2010 and December 2015 and divided into a benign group (n=76) and a malignant group (n=49). All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly from 3 months after the operation. The McCormick grading system was used to evaluate postoperative spinal cord function, and the data were subjected to statistical analysis. Of the 125 cases, 63 showed improvement after the operation, 50 were stable, and deterioration was found in 12. The improvement rate of patients with cervical spine tumors, which reached 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after the operation. Seventy-two cases (57.6%) underwent total resection, 18 (14.4%) received subtotal resection, 23 (18.4%) received partial resection, and 12 (9.6%) were treated only with biopsy/decompression. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time was 27.49±6.09 months in the malignant group and 40.62±4.34 months in the benign group, a significant difference. Logistic regression analysis of total resection-related factors showed that total resection should be the preferred treatment for patients with benign tumors, thoracic and lumbosacral tumors, and lower McCormick grades, as well as for patients without syringomyelia or intramedullary tumors. Logistic regression analysis of recurrence-related factors revealed that the recurrence rate was relatively higher in patients with malignant, cervical, thoracic and lumbosacral, and intramedullary tumors, and with higher McCormick grades.
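The logistic-regression step in an analysis like this can be sketched from scratch. The risk factors, effect sizes, and data below are synthetic illustrations, not the study's variables or code; gradient descent on the log-loss stands in for whatever fitting routine the authors used.

```python
# Schematic logistic regression of a binary recurrence outcome on two
# invented binary risk factors. Data are synthetic; the true effects
# are chosen so that malignancy raises recurrence risk.
import numpy as np

rng = np.random.default_rng(1)
n = 500
malignant = rng.integers(0, 2, n)
intramedullary = rng.integers(0, 2, n)
logit = -1.5 + 2.0 * malignant + 1.0 * intramedullary
recurrence = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = np.column_stack([np.ones(n), malignant, intramedullary])
w = np.zeros(3)
for _ in range(5000):                      # gradient descent on log-loss
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - recurrence) / n

odds_ratio_malignant = np.exp(w[1])        # > 1: malignancy raises risk
```

Exponentiated coefficients are the odds ratios clinicians usually report for each risk factor.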

  11. Interdisciplinary analysis procedures in the modeling and control of large space-based structures

    Science.gov (United States)

    Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.

    1987-01-01

    The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.

  12. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    Science.gov (United States)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.
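The Rasch analysis mentioned above rests on a simple probabilistic model. The sketch below uses the dichotomous form for simplicity (a Likert-scale inventory would use a polytomous extension), and all parameter values are arbitrary.

```python
# Dichotomous Rasch model: probability that a respondent of ability
# theta endorses an item of difficulty b. Values are illustrative only.
import math

def rasch_probability(theta, b):
    """P(response = 1) under the dichotomous Rasch model."""
    return 1 / (1 + math.exp(-(theta - b)))

# Higher-ability students are more likely to endorse the same item;
# when theta equals b, the probability is exactly 0.5.
p_low  = rasch_probability(-1.0, 0.0)
p_mid  = rasch_probability( 0.0, 0.0)
p_high = rasch_probability( 1.0, 0.0)
```

Fitting the model to response data yields item difficulties and person abilities on a common logit scale, which is what makes Rasch fit statistics useful for judging construct validity.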

  13. Factors influencing left ventricular outflow tract obstruction following a mitral valve-in-valve or valve-in-ring procedure, part 1.

    Science.gov (United States)

    Bapat, Vinnie; Pirone, Francesco; Kapetanakis, Stam; Rajani, Ronak; Niederer, Steven

    2015-10-01

    To determine the factors influencing left ventricular outflow tract (LVOT) area reduction after a mitral valve-in-valve (VIV) or valve-in-ring (VIR) procedure. Transcatheter heart valves (THVs) are increasingly used to perform VIV and VIR procedures in high-risk patients. Although less invasive, a potential complication is LVOT obstruction, and the factors predisposing to it are ill-defined. To understand the effects of the various factors, the study was carried out in three parts: first, to understand the effect of VIV and VIR on the reduction in LVOT area, with special attention to different surgical heart valve (SHV) orientations and the depth of THV implantation; this was carried out in porcine and cadaver hearts. Second, to quantify the aorto-mitral-annular (AMA) angle in 20 patients with or without mitral disease and to derive a static computational model to predict LVOT obstruction. Third, to study the effect of SHV design on LVOT obstruction after VIV; this was carried out as a bench test. LVOT area reduction after VIV was similar irrespective of the orientation of the mitral SHV implantation, as the THV pinned open the SHV leaflets; a similar effect was seen after VIR. The degree of LVOT obstruction was partly determined by the AMA angle, to which it was inversely proportional. SHV design, ring design, and the depth of SAPIEN XT implantation also had an effect on LVOT obstruction. The possibility of LVOT obstruction should be considered when performing a VIV or VIR procedure; the type of SHV, a flexible ring, a less obtuse AMA angle, and the depth of SAPIEN XT implantation can influence the risk. © 2015 Wiley Periodicals, Inc.

  14. Risk factors of chronic periodontitis on healing response: a multilevel modelling analysis.

    Science.gov (United States)

    Song, J; Zhao, H; Pan, C; Li, C; Liu, J; Pan, Y

    2017-09-15

    Chronic periodontitis is a multifactorial polygenetic disease, and an increasing number of associated factors have been identified over recent decades. Longitudinal epidemiologic studies have demonstrated that these risk factors are related to the progression of the disease. Traditional multivariate regression models have been used to find risk factors associated with chronic periodontitis; however, standard statistical procedures require that the observations be independent. Multilevel modelling (MLM) data analysis has been widely used in recent years because it accommodates the hierarchical structure of the data, decomposes the error terms into different levels, and provides a new analytic method and framework for this problem. The purpose of our study was to investigate the relationship between clinical periodontal indices and risk factors in chronic periodontitis through MLM analysis and to identify high-risk individuals in the clinical setting. Fifty-four patients with moderate to severe periodontitis were included. They were treated by means of non-surgical periodontal therapy and then made regular follow-up visits at 3, 6, and 12 months after therapy. Each patient answered a questionnaire survey and underwent measurement of clinical periodontal parameters. Compared with baseline, probing depth (PD) and clinical attachment loss (CAL) improved significantly after non-surgical periodontal therapy with regular follow-up visits at 3, 6, and 12 months. The null model and variance component models with no independent variables were initially fitted to investigate the variance of the PD and CAL reductions across all three levels, and they showed a statistically significant difference. Non-surgical periodontal therapy with regular follow-up visits had a remarkable curative effect. All three levels had a substantial influence on the reduction of PD and CAL, with the site level having the largest effect on PD and CAL reductions.
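The core idea of the variance-component step described above is decomposing outcome variance into levels (here, between-patient versus within-patient across sites). The sketch below uses synthetic data and a method-of-moments shortcut rather than a fitted mixed model; the variances and sample sizes are invented.

```python
# Variance decomposition of a site-level outcome (e.g. PD reduction)
# into between-patient and within-patient components, on synthetic data.
# A real multilevel analysis would fit a mixed model instead.
import numpy as np

rng = np.random.default_rng(2)
patients, sites = 50, 20
patient_effect = rng.normal(0, 1.0, (patients, 1))   # level-2 sd = 1.0
site_noise = rng.normal(0, 0.5, (patients, sites))   # level-1 sd = 0.5
pd_reduction = 1.8 + patient_effect + site_noise

between = pd_reduction.mean(axis=1).var(ddof=1)      # var of patient means
within = pd_reduction.var(axis=1, ddof=1).mean()     # mean within-patient var
sigma2_patient = between - within / sites            # moment estimate
icc = sigma2_patient / (sigma2_patient + within)     # intraclass correlation
```

A large intraclass correlation means sites within the same mouth are far from independent, which is exactly why the single-level regression assumption fails for these data.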

  15. Legitimization Arguments for Procedural Reforms: a semio-linguistic analysis of statement of reasons from the Civil Procedure Code of 1939 and of the draft bill of the New Civil Procedure Code of 2010.

    Directory of Open Access Journals (Sweden)

    Matheus Guarino Sant’Anna Lima de Almeida

    2016-08-01

    Full Text Available This research aims to analyze the arguments of legitimization that were used in the reform of Brazilian procedural legal codes, by comparing the texts of the statement of reasons of the Civil Procedure Code of 1939 and the draft bill of the New Civil Procedure Code. We consider these codes as milestones: the Civil Procedure Code of 1939 was the first one with a national scope; the draft bill of the New Civil Procedure Code was the first one produced during a democratic period. Our goal is to search for similarities and contrasts between the legitimization arguments used in each historical and political period, asking whether they were only arguments to bestow legitimacy on such reforms. We use the methodological tools of the sociolinguistic analysis of speech developed by Patrick Charaudeau in his analyses of political speech in order to elucidate how the uses of language and the elements of meaning in the construction of the speeches provide justification for the concept of procedure, in both 1939 and 2010. As a result, we conclude that although the drafting of the CPC of 1939 and of the New CPC took place in very distant political and historical contexts, the two are very close in their rhetorical construction and in their attempt to find justification and adherence. On balance, some of the differences depend on the vocabulary used when the codes were developed, their justification and the need for change.

  16. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    Science.gov (United States)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation, even though the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method for ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. The case study of the BP Deepwater Horizon accident and its NPT is also discussed. The risk analysis methodology in this dissertation consists of three different approaches whose integration constitutes the big picture of the whole methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors of negative pressure test

  17. A “Cookbook” Cost Analysis Procedure for Medical Information Systems*

    Science.gov (United States)

    Torrance, Janice L.; Torrance, George W.; Covvey, H. Dominic

    1983-01-01

    A costing procedure for medical information systems is described. The procedure incorporates state-of-the-art costing methods in an easy to follow “cookbook” format. Application of the procedure consists of filling out a series of Mac-Tor EZ-Cost forms. The procedure and forms have been field tested by application to a cardiovascular database system. This article describes the major features of the costing procedure. The forms and other details are available upon request.
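The kind of computation a "cookbook" costing procedure formalizes can be illustrated generically: annualize capital outlays over their useful life and add recurring operating costs. All figures below are hypothetical, and this sketch does not reproduce the Mac-Tor EZ-Cost forms, which are considerably more detailed.

```python
# Generic cost-analysis sketch: equivalent annual cost of a capital
# purchase via the capital recovery factor, plus recurring costs.
# All amounts, lifetimes, and the discount rate are illustrative.

def annualized_capital(cost, life_years, discount_rate):
    """Equivalent annual cost using the capital recovery factor."""
    r = discount_rate
    return cost * r * (1 + r) ** life_years / ((1 + r) ** life_years - 1)

hardware = annualized_capital(120_000, 5, 0.05)   # 5-year life, 5% rate
annual_total = hardware + 30_000                  # plus yearly operating cost
```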

  18. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    Science.gov (United States)

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells are a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA, for nuclear-based cell demarcation, or with probes that react with proteins, for image analysis based on whole-cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter also provides troubleshooting guidelines for some of the common problems associated with these aspects of HCI.
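The nuclear segmentation step described above can be sketched in miniature: threshold a DNA-stain intensity image, then label connected regions as individual nuclei. The image here is synthetic, and a real HCI pipeline would add smoothing, watershed splitting of touching nuclei, and size filtering.

```python
# Toy nuclear segmentation: global threshold + connected-component
# labeling on a synthetic "DNA stain" image with two bright nuclei.
import numpy as np
from scipy import ndimage

img = np.zeros((20, 20))
img[2:6, 2:6] = 1.0          # first bright "nucleus"
img[10:15, 12:17] = 0.8      # second bright "nucleus"

mask = img > 0.5                         # intensity threshold
labels, n_nuclei = ndimage.label(mask)   # connected-component labeling
```

Each labeled nucleus then serves as the seed region from which per-cell measurements (or an expanded whole-cell mask) are derived.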

  19. Improving the efficiency of aerodynamic shape optimization procedures

    Science.gov (United States)

    Burgreen, Greg W.; Baysal, Oktay; Eleshaky, Mohamed E.

    1992-01-01

    The computational efficiency of an aerodynamic shape optimization procedure based on discrete sensitivity analysis is increased through the implementation of two improvements. The first improvement replaces a grid point-based approach to surface representation with a Bezier-Bernstein polynomial parameterization of the surface; explicit analytical expressions for the grid sensitivity terms are developed for both approaches. The second improvement proposes the use of Newton's method, in lieu of an alternating direction implicit (ADI) methodology, to calculate the highly converged flow solutions required to compute the sensitivity coefficients. The modified design procedure is demonstrated by optimizing the shape of an internal-external nozzle configuration. A substantial decrease in computational time for the optimization process, by a factor of 8, was achieved by implementing both of the design improvements.
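The payoff of the Bezier-Bernstein parameterization is that the optimizer perturbs a handful of control points instead of every surface grid point. The one-dimensional analogue below evaluates a Bezier curve from its Bernstein basis; the control-point values are arbitrary and the sketch is not drawn from the paper.

```python
# Bezier curve evaluation via the Bernstein basis (1-D analogue of the
# surface parameterization). Control-point values are illustrative.
from math import comb

def bezier(t, control):
    """Evaluate a Bezier curve of degree len(control) - 1 at parameter t."""
    n = len(control) - 1
    return sum(comb(n, i) * t**i * (1 - t)**(n - i) * p
               for i, p in enumerate(control))

control = [0.0, 0.8, 0.6, 1.0]    # 4 control points -> cubic curve
y0 = bezier(0.0, control)         # endpoints interpolate the end points
y1 = bezier(1.0, control)
ymid = bezier(0.5, control)
```

Sensitivities of the shape with respect to the few control points replace sensitivities with respect to hundreds of grid points, which is one source of the reported speedup.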

  20. Classification analysis of organization factors related to system safety

    International Nuclear Information System (INIS)

    Liu Huizhen; Zhang Li; Zhang Yuling; Guan Shihua

    2009-01-01

    This paper analyzes the different types of organization factors that influence system safety. Organization factors can be divided into interior and exterior factors. The latter include political, economic, technical, legal, socio-cultural, and geographical factors, together with the relationships among different interest groups. The former include organization culture, communication, decision making, training, process, supervision and management, and organization structure. This paper focuses on the description of the organization factors; their classification analysis is the preliminary work for quantitative analysis. (authors)

  1. Recurrent-neural-network-based Boolean factor analysis and its application to word clustering.

    Science.gov (United States)

    Frolov, Alexander A; Husek, Dusan; Polyakov, Pavel Yu

    2009-07-01

    The objective of this paper is to introduce a neural-network-based algorithm for word clustering as an extension of the neural-network-based Boolean factor analysis algorithm (Frolov, 2007). It is shown that this extended algorithm supports an even more complex model of the signals that are supposed to be related to textual documents. It is hypothesized that every topic in textual data is characterized by a set of words which coherently appear in documents dedicated to a given topic. The appearance of each word in a document is coded by the activity of a particular neuron. In accordance with the Hebbian learning rule implemented in the network, sets of coherently appearing words (treated as factors) create tightly connected groups of neurons, revealing them as attractors of the network dynamics. Found factors are eliminated from the network memory by the Hebbian unlearning rule, facilitating the search for other factors. Topics related to the found sets of words can be identified based on the words' semantics. To make the method complete, a special technique based on a Bayesian procedure has been developed for two purposes: first, to provide a complete description of factors in terms of component probability, and second, to enhance the accuracy of classifying signals, i.e., of determining whether a signal contains a given factor. Since it is assumed that every word may contribute to several topics, the proposed method might be related to fuzzy clustering. In this paper, we show that the results of Boolean factor analysis and fuzzy clustering are not contradictory, but complementary. To demonstrate the capabilities of this approach, the method is applied to two types of textual data on neural networks in two different languages. The obtained topics and corresponding words are in good agreement despite the fact that identical topics at the Russian and English conferences contain different sets of keywords.
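The Hebbian storage rule the paper builds on can be sketched in a few lines: coherently co-occurring words (the 1s of a binary pattern) strengthen their mutual connections, so the word set becomes a tightly connected group, i.e. an attractor candidate. The vocabulary size, pattern, and repetition count below are invented; this is the generic outer-product Hebbian rule, not the authors' full algorithm (which also includes the unlearning step and Bayesian factor scoring).

```python
# Generic Hebbian outer-product storage for a binary word-co-occurrence
# pattern. Pattern and sizes are invented for illustration.
import numpy as np

words = 8
W = np.zeros((words, words))
topic = np.array([1, 1, 1, 0, 0, 0, 0, 0])   # words 0-2 co-occur as a topic

for _ in range(5):                 # repeated co-occurrence across documents
    W += np.outer(topic, topic)
np.fill_diagonal(W, 0)             # no self-connections

intra = W[0, 1]   # strong link between words of the same topic
inter = W[0, 3]   # no link to an unrelated word
```

Hebbian unlearning then subtracts a found attractor's outer product from W, weakening that group so the dynamics can settle into the remaining, weaker factors.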

  2. Evaluation of six sample preparation procedures for qualitative and quantitative proteomics analysis of milk fat globule membrane.

    Science.gov (United States)

    Yang, Yongxin; Anderson, Elizabeth; Zhang, Sheng

    2018-04-12

    Proteomic analysis of membrane proteins is challenged by the proteins' poor solubility and by detergent incompatibility with MS analysis. No single protocol can comprehensively characterize the proteome of a membrane fraction. Here, we used cow milk fat globule membrane (MFGM) proteome analysis to assess six sample preparation procedures, one in-gel and five in-solution digestion approaches, prior to LC-MS/MS analysis. The largest numbers of MFGM proteins were identified by the suspension trapping (S-Trap) and filter-aided sample preparation (FASP) methods, followed by the acetone precipitation method without clean-up of tryptic peptides. The highest average protein coverage was achieved by the chloroform/MeOH, in-gel, and S-Trap methods. The most distinct proteins were identified by the FASP method, followed by S-Trap. Analyses by Venn diagram, principal-component analysis, hierarchical clustering, and the abundance ranking of quantified proteins highlight the differences in the MFGM fraction across the sample preparation procedures, revealing the biased protein/peptide losses that occurred in each protocol. In this study, we found several novel proteins that were not observed in previous in-depth proteomic characterizations of the MFGM fraction in milk. Thus, combining multiple sample preparation procedures with orthogonal properties was demonstrated to improve the protein sequence coverage and expression-level accuracy for membrane samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Development of Procedures for the Analysis of Components of Dumped Chemical Weapons and Their Principal Transformation Products in Sea Water

    International Nuclear Information System (INIS)

    Saveleva, E. I.; Koryagina, N. L.; Radilov, A. S.; Khlebnikova, N. S.; Khrustaleva, V. S.

    2007-01-01

    A package of chemical analytical procedures was developed for the detection of products indicative of the presence of dumped chemical weapons in the Baltic Sea. The principal requirements imposed upon the procedures were the following: high sensitivity, reliable identification of target compounds, a wide range of components covered by survey analysis, and a lack of interference from sea salts. Thiodiglycol, a product of hydrolysis of sulfur mustard reportedly always detected at chemical weapon dumping sites in the Baltic Sea, was considered the principal marker. We developed a high-sensitivity procedure for the determination of thiodiglycol in sea water, involving evaporation of samples to dryness in a vacuum concentrator, followed by tert-butyldimethylsilylation of the residue and GC-MS analysis in the SIM mode with meta-fluorobenzoic acid as the internal reference. The detection limit for thiodiglycol was 0.001 mg/l, and the procedure throughput was up to 30 samples per day. The same procedure, but with BSTFA as the derivatizing agent instead of MTBSTFA, was used for preparing samples for survey analysis of nonvolatile components; in this case, full mass spectra were measured in the GC-MS analysis. The use of BSTFA was motivated by the fact that trimethylsilyl derivatives are much more widely represented in electronic mass spectral databases. The identification of sulfur mustard, volatile transformation products of sulfur mustard and lewisite, as well as chloroacetophenone in sea water was performed by means of GC-MS in combination with SPME. The survey GC-MS analysis was focused on the identification of volatile and nonvolatile toxic chemicals whose mass spectra are included in the OPCW database (3219 toxic chemicals, precursors, and transformation products) with the use of AMDIS software (version 2.62). Using 2 GC-MS instruments, we could perform the survey analysis of up to 20 samples per day for volatile and nonvolatile components. Thus, the package of three procedures

  4. Unanticipated hospital admission in pediatric patients with congenital heart disease undergoing ambulatory noncardiac surgical procedures.

    Science.gov (United States)

    Yuki, Koichi; Koutsogiannaki, Sophia; Lee, Sandra; DiNardo, James A

    2018-05-18

    An increasing number of surgical and nonsurgical procedures are being performed on an ambulatory basis in children. An analysis of a large group of pediatric patients with congenital heart disease undergoing ambulatory procedures has not been undertaken. The objective of this study was to characterize the profile of children with congenital heart disease who underwent noncardiac procedures on an ambulatory basis at our institution, to determine the incidence of cardiovascular and respiratory adverse events, and to determine the risk factors for unscheduled hospital admission. This is a retrospective study of children with congenital heart disease who underwent noncardiac procedures on an ambulatory basis in a single center. Using the electronic preoperative anesthesia evaluation form, we identified 3010 patients with congenital heart disease who underwent noncardiac procedures, of which 1028 (34.1%) were scheduled to occur on an ambulatory basis. Demographic, echocardiographic and functional status data, cardiovascular and respiratory adverse events, and reasons for postprocedure admission were recorded. Univariable analysis was conducted. The unplanned hospital admission rate was 2.7%, and univariable analysis demonstrated that performance of an echocardiogram within 6 mo of the procedure and procedures performed in radiology were associated with postoperative admission. The cardiovascular adverse event incidence was 3.9%. The respiratory adverse event incidence was 1.8%. Ambulatory, noncomplex procedures can be performed in pediatric patients with congenital heart disease and good functional status with a relatively low unanticipated hospital admission rate. © 2018 John Wiley & Sons Ltd.

  5. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  6. Forensic analysis of Salvia divinorum using multivariate statistical procedures. Part I: discrimination from related Salvia species.

    Science.gov (United States)

    Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell

    2012-01-01

    Salvia divinorum is a hallucinogenic herb that is internationally regulated. In this study, salvinorin A, the active compound in S. divinorum, was extracted from S. divinorum plant leaves using a 5-min extraction with dichloromethane. Four additional Salvia species (Salvia officinalis, Salvia guaranitica, Salvia splendens, and Salvia nemorosa) were extracted using this procedure, and all extracts were analyzed by gas chromatography-mass spectrometry. Differentiation of S. divinorum from other Salvia species was successful based on visual assessment of the resulting chromatograms. To provide a more objective comparison, the total ion chromatograms (TICs) were subjected to principal components analysis (PCA). Prior to PCA, the TICs were subjected to a series of data pretreatment procedures to minimize non-chemical sources of variance in the data set. Successful discrimination of S. divinorum from the other four Salvia species was possible based on visual assessment of the PCA scores plot. To provide a numerical assessment of the discrimination, a series of statistical procedures such as Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores. The statistical procedures were then compared to determine the advantages and disadvantages for forensic applications.
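
    The pretreatment-plus-PCA discrimination step described above can be sketched in Python. This is a minimal illustration, not the study's actual pipeline: the synthetic "chromatograms," the group difference, and the `pca_scores` helper are all invented for the example.

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Mean-center each variable, then project onto the principal axes."""
        Xc = X - X.mean(axis=0)                 # column-wise mean centering
        # SVD of the centered data matrix yields the principal components
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T         # scores on the leading PCs

    rng = np.random.default_rng(0)
    tics = rng.normal(0, 0.1, size=(6, 50))     # six synthetic TICs, 50 time points
    tics[3:, 20:25] += 5.0                      # second "species" has an extra peak region
    scores = pca_scores(tics)
    print(scores[:, 0])                         # PC1 separates the two groups
    ```

    On data like this, a scores plot of PC1 versus PC2 would show the two groups as distinct clusters, mirroring the visual assessment described in the abstract.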

  7. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    Science.gov (United States)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena. Factor analysis has been widely used to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset of 804 soil samples, collected from a mining area in central Iran in the search for MVT-type Pb-Zn deposits, was used to compare factor analysis approaches. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset, after opening the data with an additive logratio (alr) transformation, to extract the mineralization factor in the dataset. A comparison between these methods indicated that sequential factor analysis more clearly revealed MVT paragenesis elements in surface samples, with nearly 50% variation in F1. In addition, staged factor analysis gave acceptable results while being easy to apply. It could detect mineralization-related elements while assigning them larger factor loadings, resulting in a clearer expression of the mineralization.
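
    The "opening" step mentioned above, an additive logratio (alr) transformation of closed compositional data, can be sketched briefly; the divisor column and the two-sample composition below are hypothetical, chosen only to show the mechanics.

    ```python
    import numpy as np

    def alr(X, divisor_col=-1):
        """alr(x)_i = log(x_i / x_D) for every part except the divisor column D."""
        X = np.asarray(X, dtype=float)
        ratios = X / X[:, [divisor_col]]                       # divide by the chosen part
        return np.log(np.delete(ratios, divisor_col % X.shape[1], axis=1))

    comp = np.array([[10.0, 30.0, 60.0],
                     [20.0, 20.0, 60.0]])                      # rows sum to 100 (closed data)
    Z = alr(comp)                                              # 2 samples x 2 logratios
    print(Z)
    ```

    The transformed matrix `Z` is what would then be fed to the factor analysis, freeing it from the constant-sum constraint of raw compositional data.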

  8. Pretreatment procedures applied to samples to be analysed by neutron activation analysis at CDTN/CNEN

    International Nuclear Information System (INIS)

    Francisco, Dovenir; Menezes, Maria Angela de Barros Correia

    2009-01-01

    The neutron activation technique - using several methods - has been applied in 80% of the analytical demand of the Division for Reactor and Analytical Techniques at CDTN/CNEN, Belo Horizonte, Minas Gerais. This scenario emphasizes the responsibility of the laboratory to provide and assure the quality of the measurements. The first step in assuring the quality of results is the preparation of the samples. Therefore, this paper describes the experimental procedures adopted at CDTN/CNEN in order to provide uniform conditions of analysis and to avoid contamination by elements present everywhere. Some of the procedures are based on methods described in the literature; others are based on many years of experience preparing samples from many kinds of matrices. The procedures described are related to geological materials - soil, sediment, rock, gems, clay, archaeological ceramics and ore - biological materials - hair, fish, plants, food - water, etc. Analytical results for sediment samples are shown as an example, pointing out the efficiency of the experimental procedure. (author)

  9. Cost-effectiveness analysis of clinic-based chloral hydrate sedation versus general anaesthesia for paediatric ophthalmological procedures.

    Science.gov (United States)

    Burnett, Heather F; Lambley, Rosemary; West, Stephanie K; Ungar, Wendy J; Mireskandari, Kamiar

    2015-11-01

    The inability of some children to tolerate detailed eye examinations often necessitates general anaesthesia (GA). The objective was to assess the incremental cost effectiveness of paediatric eye examinations carried out in an outpatient sedation unit compared with GA. An episode-of-care cost-effectiveness analysis was conducted from a societal perspective. Model inputs were based on a retrospective cross-over cohort of Canadian children. Costs ($CAN), adverse events and number of successful procedures were modelled in a decision analysis with one-way and probabilistic sensitivity analysis. The mean cost per patient was $406 (95% CI $401 to $411) for examination under sedation (EUS) and $1135 (95% CI $1125 to $1145) for examination under anaesthesia (EUA). The mean number of successful procedures per patient was 1.39 (95% CI 1.34 to 1.42) for EUS and 2.06 (95% CI 2.02 to 2.11) for EUA. EUA was $729 more costly on average than EUS (95% CI $719 to $738) but resulted in an additional 0.68 successful procedures per child. The result was robust to varying the cost assumptions. Cross-over designs offer a powerful way to assess the costs and effectiveness of two interventions because patients serve as their own control. This study demonstrated significant savings when ophthalmological exams were carried out in a hospital outpatient clinic, although with slightly fewer procedures completed. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
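
    The incremental comparison reported above reduces to a one-line calculation (incremental cost divided by incremental effect). The sketch below uses the abstract's point estimates; the `icer` helper name is ours, and the result is computed from the rounded means, so it is illustrative rather than a reported figure.

    ```python
    def icer(cost_a, eff_a, cost_b, eff_b):
        """Incremental cost per additional successful procedure (A vs. B)."""
        return (cost_a - cost_b) / (eff_a - eff_b)

    cost_eus, eff_eus = 406.0, 1.39    # examination under sedation
    cost_eua, eff_eua = 1135.0, 2.06   # examination under anaesthesia
    extra_cost_per_success = icer(cost_eua, eff_eua, cost_eus, eff_eus)
    print(round(extra_cost_per_success))   # $CAN per extra successful procedure
    ```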

  10. Exchange nailing for nonunion of diaphyseal fractures of the tibia: our results and an analysis of the risk factors for failure.

    Science.gov (United States)

    Tsang, S T J; Mills, L A; Frantzias, J; Baren, J P; Keating, J F; Simpson, A H R W

    2016-04-01

    The aim of this study was to identify risk factors for the failure of exchange nailing in nonunion of tibial diaphyseal fractures. A cohort of 102 tibial diaphyseal nonunions in 101 patients with a mean age of 36.9 years (15 to 74) was treated between January 1992 and December 2012 by exchange nailing; 33 (32%) were initially open injuries. The median time from primary fixation to exchange nailing was 6.5 months (interquartile range (IQR) 4.3 to 9.8 months). The main outcome measures were union, the number of secondary fixation procedures required to achieve union, and time to union. Univariate analysis and multiple regression were used to identify risk factors for failure to achieve union. Multiple causes for the primary nonunion were found in 28 (27%) tibiae, with infection present in 32 (31%). Six patients were lost to follow-up. Further surgical procedures were required in 35 (36%) nonunions. Other fixation modalities were required in five fractures. A single nail exchange procedure achieved union in 60/96 (63%) of all nonunions. Only 11 of 31 infected nonunions (35.4%) healed after one exchange nail procedure. Up to five repeated exchange nailings, with or without bone grafting, ultimately achieved union in 89 (93%) fractures. The median time to union after exchange nailing was 8.7 months (IQR 5.7 to 14.0 months). Univariate analysis confirmed that an oligotrophic/atrophic pattern of nonunion (p = 0.002), a bone gap of 5 mm or more (p = 0.04) and infection were associated with failure of exchange nailing. Multiple regression analysis found that infection was the strongest predictor of failure. Exchange nailing is an effective treatment for aseptic tibial diaphyseal nonunion. However, in the presence of severe infection with a highly resistant organism, or extensive sclerosis of the bone, other fixation modalities, such as Ilizarov treatment, should be considered.

  11. Predictive Factors of In-Stent Restenosis in Renal Artery Stenting: A Retrospective Analysis

    International Nuclear Information System (INIS)

    Vignali, Claudio; Bargellini, Irene; Lazzereschi, Michele; Cioni, Roberto; Petruzzi, Pasquale; Caramella, Davide; Pinto, Stefania; Napoli, Vinicio; Zampa, Virna; Bartolozzi, Carlo

    2005-01-01

    Purpose. To retrospectively evaluate the role of clinical and procedural factors in predicting in-stent restenosis in patients with renovascular disease treated by renal artery stenting. Methods. From 1995 to 2002, 147 patients underwent renal artery stenting for the treatment of significant ostial atherosclerotic stenosis. Patients underwent strict clinical and color-coded duplex ultrasound follow-up. Ninety-nine patients (111 stents), with over 6 months of continuous follow-up (mean 22±12 months, range 6-60 months), were selected and classified according to the presence (group A, 30 patients, 32 lesions) or absence (group B, 69 patients, 79 lesions) of significant in-stent restenosis. A statistical analysis was performed to identify possible preprocedural and procedural predictors of restenosis, considering the following data: sex, age, smoking habit, diabetes mellitus, hypertension, serum creatinine, cholesterol and triglyceride levels, renal artery stenosis grade, and stent type, length and diameter. Results. Comparing group A and B patients (χ² test), a statistically significant relation was demonstrated between stent diameter and length and restenosis: the risk of in-stent restenosis decreased when the stent was ≥6 mm in diameter and between 15 and 20 mm in length. This finding was confirmed by multiple logistic regression analysis. Stent diameter and length also proved to be significantly related to in-stent restenosis when evaluating only patients treated with a Palmaz stent (71 stents). Conclusion. Although based on a retrospective analysis, the present study confirms the importance of correct stent selection in increasing long-term patency, using stents of at least 6 mm in diameter and approximately 15-20 mm in length.

  12. Round robin analysis on stress intensity factor of inner surface cracks in welded stainless steel pipes

    Energy Technology Data Exchange (ETDEWEB)

    Han, Chang Gi; Chang, Yoon Suk [Dept. of Nuclear Engineering, College of Engineering, Kyung Hee University, Yongin (Korea, Republic of); Kim, Jong Sung [Dept. of Mechanical Engineering, Sunchon National University, Sunchon (Korea, Republic of); Kim, Maan Won [Central Research Institute, Korea Hydro and Nuclear Power Company, Daejeon (Korea, Republic of)

    2016-12-15

    Austenitic stainless steels (ASSs) are widely used for nuclear pipes as they exhibit a good combination of mechanical properties and corrosion resistance. However, high tensile residual stresses may occur in ASS welds because postweld heat treatment is not generally conducted, in order to avoid sensitization, which causes stress corrosion cracking. In this study, round robin analyses of stress intensity factors (SIFs) were carried out to examine the appropriateness of structural integrity assessment methods for ASS pipe welds with two types of circumferential cracks. Typical stress profiles were generated from finite element analyses by considering residual stresses and normal operating conditions. Then, SIFs of cracked ASS pipes were determined by the analytical equations given in fitness-for-service assessment codes as well as by reference finite element analyses. Discrepancies in the estimated SIFs among round robin participants were traced to different assessment procedures and relevant considerations, as well as to mistakes by participants. The effects of uncertainty factors on SIFs were deduced from sensitivity analyses and, based on similarity and conservatism compared with detailed finite element analysis results, the R6 code, taking into account the applied internal pressure and the combination of stress components, was recommended as the optimum procedure for SIF estimation.
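
    As a hedged illustration of the quantity being estimated, a mode-I stress intensity factor has the basic form K = Y·σ·√(πa), where Y is a geometry factor tabulated in fitness-for-service codes. The numbers below are made up for demonstration and do not reproduce any code-specific solution such as R6.

    ```python
    import math

    def stress_intensity_factor(sigma_mpa, a_m, Y=1.0):
        """Mode-I SIF in MPa*sqrt(m) for crack depth a (m) under stress sigma (MPa)."""
        return Y * sigma_mpa * math.sqrt(math.pi * a_m)

    # Illustrative values: 200 MPa membrane stress, 5 mm crack, assumed Y = 1.12
    K = stress_intensity_factor(sigma_mpa=200.0, a_m=0.005, Y=1.12)
    print(f"K = {K:.1f} MPa*sqrt(m)")
    ```

    In a real assessment, Y depends on crack shape, pipe geometry, and the through-wall stress distribution, which is exactly where the round robin participants' procedures diverged.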

  13. Risk factors for pedicled flap necrosis in hand soft tissue reconstruction: a multivariate logistic regression analysis.

    Science.gov (United States)

    Gong, Xu; Cui, Jianli; Jiang, Ziping; Lu, Laijin; Li, Xiucun

    2018-03-01

    Few clinical retrospective studies have reported the risk factors of pedicled flap necrosis in hand soft tissue reconstruction. The aim of this study was to identify non-technical risk factors associated with pedicled flap perioperative necrosis in hand soft tissue reconstruction via a multivariate logistic regression analysis. For patients with hand soft tissue reconstruction, we carefully reviewed hospital records and identified 163 patients who met the inclusion criteria. The characteristics of these patients, flap transfer procedures and postoperative complications were recorded. Eleven predictors were identified. The correlations between pedicled flap necrosis and risk factors were analysed using a logistic regression model. Of 163 skin flaps, 125 flaps survived completely without any complications. The pedicled flap necrosis rate in hands was 11.04%, which included partial flap necrosis (7.36%) and total flap necrosis (3.68%). Soft tissue defects in fingers were noted in 68.10% of all cases. The logistic regression analysis indicated that the soft tissue defect site (P = 0.046, odds ratio (OR) = 0.079, confidence interval (CI) (0.006, 0.959)), flap size (P = 0.020, OR = 1.024, CI (1.004, 1.045)) and postoperative wound infection (P < 0.001, OR = 17.407, CI (3.821, 79.303)) were statistically significant risk factors for pedicled flap necrosis of the hand. Soft tissue defect site, flap size and postoperative wound infection were risk factors associated with pedicled flap necrosis in hand soft tissue defect reconstruction. © 2017 Royal Australasian College of Surgeons.
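
    The multivariate logistic regression used above can be sketched on synthetic data. This is not the study's model: the two predictors, the simulated effect sizes, and the plain gradient-ascent fit are all invented for illustration (a real analysis would use a statistics package and report confidence intervals).

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, steps=2000):
        """Fit logistic regression coefficients by gradient ascent on the log-likelihood."""
        X1 = np.hstack([np.ones((len(X), 1)), X])      # prepend an intercept column
        w = np.zeros(X1.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-X1 @ w))          # predicted probabilities
            w += lr * X1.T @ (y - p) / len(y)          # average score-function step
        return w

    rng = np.random.default_rng(1)
    flap_size = rng.normal(0, 1, 200)                  # standardized continuous predictor
    infection = rng.integers(0, 2, 200).astype(float)  # binary predictor
    logit = -1.0 + 1.5 * infection + 0.5 * flap_size   # assumed true model
    y = (rng.random(200) < 1 / (1 + np.exp(-logit))).astype(float)
    w = fit_logistic(np.column_stack([infection, flap_size]), y)
    odds_ratio_infection = np.exp(w[1])                # OR for the infection predictor
    print(odds_ratio_infection)
    ```

    The exponentiated coefficient is the odds ratio reported in studies like this one; here it should recover an OR well above 1 for the simulated infection effect.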

  14. Identification of risk factors for mucosal injury during laparoscopic Heller myotomy for achalasia.

    Science.gov (United States)

    Tsuboi, Kazuto; Omura, Nobuo; Yano, Fumiaki; Hoshino, Masato; Yamamoto, Se-Ryung; Akimoto, Shusuke; Masuda, Takahiro; Kashiwagi, Hideyuki; Yanaga, Katsuhiko

    2016-02-01

    Mucosal injury during myotomy is the most frequent complication seen with the Heller-Dor procedure for achalasia. The present study aimed to examine risk factors for such mucosal injury during this procedure. This was a retrospective analysis of patients who underwent the laparoscopic Heller-Dor procedure for achalasia at a single facility. Variables for evaluation included patient characteristics, preoperative pathophysiological findings, and the surgeon's operative experience. Logistic regression was used to identify risk factors. We also examined surgical outcomes and the degree of patient satisfaction in relation to intraoperative mucosal injury. Four hundred thirty-five patients satisfied the study criteria. Intraoperative mucosal injury occurred in 67 patients (15.4%). In univariate analysis, mucosal injury was significantly associated with patient age ≥60 years, disease history ≥10 years, prior history of cardiac disease, preoperative esophageal transverse diameter ≥80 mm, and surgeon's operative experience of fewer than five cases. In multivariate analysis involving these factors, the following variables were identified as risk factors: age ≥60 years, esophageal transverse diameter ≥80 mm, and surgeon's operative experience of fewer than five cases. The mucosal injury group had a significantly longer operative time and increased blood loss. However, there were no significant differences between the two groups in the incidence of reflux esophagitis or the degree of symptom alleviation postoperatively. A fragile esophagus, caused by advanced patient age and/or esophageal dilatation, was a risk factor for mucosal injury during the laparoscopic Heller-Dor procedure, and limited surgeon experience was an additional independent risk factor.

  15. A Numerical Procedure for Analysis of W/R Contact Using Explicit Finite Element Methods

    NARCIS (Netherlands)

    Ma, Y.; Markine, V.L.

    2015-01-01

    Since no effective experimental approaches have been proposed to assess wheel and rail (W/R) contact performance till now, numerical computational analysis is known as an alternative to approximately simulate the W/R interaction. In this paper, one numerical procedure is proposed on the basis of

  16. Rates and risk factors of unplanned 30-day readmission following general and thoracic pediatric surgical procedures.

    Science.gov (United States)

    Polites, Stephanie F; Potter, Donald D; Glasgow, Amy E; Klinkner, Denise B; Moir, Christopher R; Ishitani, Michael B; Habermann, Elizabeth B

    2017-08-01

    Postoperative unplanned readmissions are costly and decrease patient satisfaction; however, little is known about this complication in pediatric surgery. The purpose of this study was to determine rates and predictors of unplanned readmission in a multi-institutional cohort of pediatric surgical patients. Unplanned 30-day readmissions following general and thoracic surgical procedures in children were identified, and rates of readmission per 30 person-days were determined to account for varied postoperative length of stay (pLOS). Patients were randomly divided into 70% derivation and 30% validation cohorts, which were used for the creation and validation of a risk model for readmission. Readmission occurred in 1948 (3.6%) of 54,870 children, for a rate of 4.3% per 30 person-days. Adjusted predictors of readmission included hepatobiliary procedures, increased wound class, operative duration, complications, and pLOS. The predictive model discriminated well in the derivation and validation cohorts (AUROC 0.710 and 0.701) with good calibration between observed and expected readmission events in both cohorts (p > .05). Unplanned readmission occurs less frequently in pediatric surgery than described in adults, calling into question its use as a quality indicator in this population. Factors that predict readmission, including type of procedure, complications, and pLOS, can be used to identify at-risk children and develop prevention strategies. III. Copyright © 2017 Elsevier Inc. All rights reserved.
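
    The derivation/validation workflow above (split the cohort, fit a risk score, check discrimination via AUROC) can be sketched with synthetic data. Everything below is a stand-in: the risk score, outcome model, and split are invented, and AUROC is computed with the rank-sum (Mann-Whitney) formulation.

    ```python
    import numpy as np

    def auroc(scores, labels):
        """AUROC via the rank-sum (Mann-Whitney U) formulation; assumes untied scores."""
        order = np.argsort(scores)
        ranks = np.empty(len(scores))
        ranks[order] = np.arange(1, len(scores) + 1)
        pos = labels == 1
        n1, n0 = pos.sum(), (~pos).sum()
        return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

    rng = np.random.default_rng(2)
    n = 1000
    risk = rng.normal(0, 1, n)                              # a fitted risk score
    readmit = (rng.random(n) < 1 / (1 + np.exp(-(risk - 2)))).astype(int)
    idx = rng.permutation(n)
    valid = idx[:int(0.3 * n)]                              # 30% validation cohort
    area = auroc(risk[valid], readmit[valid])               # discrimination on held-out data
    print(round(area, 3))
    ```

    An AUROC of 0.5 means no discrimination and 1.0 means perfect ranking; values around 0.70, as in the study, indicate modest but useful discrimination.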

  17. Using BMDP and SPSS for a Q factor analysis.

    Science.gov (United States)

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.
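
    The Q-mode idea referenced above (neither BMDP nor SPSS syntax, just the concept): treat *persons* as the entities to be grouped and compare their score profiles with Euclidean distances, which, unlike correlations, are sensitive to profile elevation as well as shape. The three profiles below are invented.

    ```python
    import numpy as np

    profiles = np.array([[1.0, 2.0, 3.0, 4.0],    # person A's item scores
                         [1.1, 2.1, 2.9, 4.2],    # person B: similar profile
                         [4.0, 3.0, 2.0, 1.0]])   # person C: reversed profile

    # Pairwise Euclidean distances between persons (small = similar type)
    diff = profiles[:, None, :] - profiles[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    print(np.round(dist, 2))
    ```

    A Q factor analysis would then factor the transposed (person-as-variable) matrix; here the distance matrix already shows A and B forming one type and C another.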

  18. The development of human factors experimental evaluation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Bong Shick; Oh, In Suk; Cha, Kyung Ho; Lee, Hyun Chul; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon

    1997-07-01

    New human factors issues, such as the evaluation of information navigation, the consideration of operator characteristics, and operator performance assessment, related to HMI designs based on VDUs are arising. In order to solve these human factors issues, this project aims to establish experimental technologies, including techniques for experimental design, experimental measurement, data collection and analysis, and to develop an ITF (Integrated Test Facility) suitable for experiments on HMI design evaluation. To establish the experimental data analysis and evaluation methodologies, we developed the following: (1) a paradigm for human factors experimentation, including experimental designs, procedures, and data analysis; (2) methods for the assessment of operators' mental workload; (3) DAEXESS (data analysis and experiment evaluation supporting system). We also established experiment execution technologies through preliminary experiments, such as the suitability evaluation of information display on an LSDP, the evaluation of computerized operation procedures, and an experiment on an advanced alarm system (ADIOS). Finally, we developed the ITF, including a human-machine simulator, a telemetry system, an eye tracking system, an audio/video data measurement system, and a three-dimensional micro-behaviour analysis system. (author). 81 refs., 68 tabs., 73 figs.

  19. Standard Procedure for Grid Interaction Analysis

    International Nuclear Information System (INIS)

    Svensson, Bertil; Lindahl, Sture; Karlsson, Daniel; Joensson, Jonas; Heyman, Fredrik

    2015-01-01

    Grid events, simultaneously affecting all safety related auxiliary systems in a nuclear power plant, are critical and must be carefully addressed in the design, upgrading and operational processes. Up to now, the connecting grid has often been treated as either fully available or totally unavailable, and too little attention has been paid to specifying grid performance criteria. This paper deals with standard procedures for grid interaction analysis, to derive tools and criteria for handling grid events that challenge the safety systems of the plant. Critical external power system events are investigated and characterised with respect to severity and rate of occurrence. These critical events are then grouped with respect to their impact on the safety systems when a disturbance propagates into the plant. It is then important to make sure that 1) the impact of the disturbance will never reach any critical system, 2) the impact of the disturbance will be eliminated before it harms any critical system, or 3) the critical systems are proven to be designed in such a way that they can withstand the impact of the disturbance, and the associated control and protection systems can withstand the voltage and frequency transients associated with the disturbances. A number of representative disturbance profiles, reflecting connecting grid conditions, are therefore derived, to be used for equipment testing. (authors)

  20. Exploring Technostress: Results of a Large Sample Factor Analysis

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2016-06-01

    With reference to the results of a large sample factor analysis, the article aims to propose a frame for examining technostress in a population. The survey and principal component analysis of a sample consisting of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the dispersion in the answers. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents' answers, revealing technostress causes and consequences as well as technostress prevalence in the population in a statistically validated pattern. Key elements of technostress identified by the factor analysis can serve for the construction of technostress measurement scales in further research.

  1. Uncertainty Evaluation of the SFR Subchannel Thermal-Hydraulic Modeling Using a Hot Channel Factors Analysis

    International Nuclear Information System (INIS)

    Choi, Sun Rock; Cho, Chung Ho; Kim, Sang Ji

    2011-01-01

    In an SFR core analysis, a hot channel factors (HCF) method is most commonly used to evaluate uncertainty. It was employed in early designs such as the CRBRP and IFR. Alternatively, the improved thermal design procedure (ITDP) calculates the overall uncertainty based on the root sum square technique and sensitivity analyses of each design parameter. The Monte Carlo method (MCM) is also employed to estimate uncertainties. In this method, all the input uncertainties are randomly sampled according to their probability density functions and the resulting distribution of the output quantity is analyzed. Since the uncertainty is basically calculated from the temperature distribution in a subassembly, the core thermal-hydraulic modeling greatly affects the resulting uncertainty. At KAERI, the SLTHEN and MATRA-LMR codes have been utilized to analyze the SFR core thermal-hydraulics. The SLTHEN (steady-state LMR core thermal hydraulics analysis code based on the ENERGY model) code is a modified version of the SUPERENERGY2 code, which conducts a multi-assembly, steady-state calculation based on a simplified ENERGY model. The detailed subchannel analysis code MATRA-LMR (Multichannel Analyzer for Steady-State and Transients in Rod Arrays for Liquid Metal Reactors), an LMR version of MATRA, was also developed specifically for SFR core thermal-hydraulic analysis. This paper describes comparative studies of core thermal-hydraulic models. The subchannel analysis and a hot channel factor based uncertainty evaluation system are established to estimate the core thermofluidic uncertainties using the MATRA-LMR code, and the results are compared to those of the SLTHEN code
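
    The Monte Carlo method described above (sample each uncertain input from its probability density, propagate through the model, read the uncertainty off the output distribution) can be sketched with a toy model. The coolant heat-up "model" and all parameter values below are invented, not taken from SLTHEN or MATRA-LMR.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    power = rng.normal(1.00, 0.02, n)        # relative pin power, assumed 2% std dev
    flow = rng.normal(1.00, 0.03, n)         # relative coolant flow, assumed 3% std dev
    delta_t = 150.0 * power / flow           # toy model: nominal 150 K coolant heat-up

    mean, p95 = delta_t.mean(), np.percentile(delta_t, 95)
    print(f"mean rise {mean:.1f} K, 95th percentile {p95:.1f} K")
    ```

    The 95th percentile of the output distribution plays the role that a multiplicative hot channel factor plays in the deterministic HCF method.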

  2. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Jonušauskas, Steponas; Raišienė, Agota Giedrė

    2016-01-01

    With reference to the results of a large sample factor analysis, the article aims to propose the frame examining technostress in a population. The survey and principal component analysis of the sample consisting of 1013 individuals who use ICT in their everyday work was implemented in the research. 13 factors combine 68 questions and explain 59.13 per cent of the answers dispersion. Based on the factor analysis, questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  3. Human factors research plan for instrument procedures : FY12 version 1.1

    Science.gov (United States)

    2012-06-19

    This research will support the development of instrument procedures for performance-based navigation (PBN) operations. These procedures include, but are not limited to, area navigation (RNAV) and required navigation performance (RNP) operations. The ...

  4. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. Knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or of potential problem areas. This paper describes how introducing Statistical Process Control and Process Capability Analysis procedures into existing shuttle processing procedures can enable identification of potential problem areas and of candidates for improvements to increase processing performance measures. Results of a case study describing the application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
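
    The core Statistical Process Control check referenced above can be sketched in a few lines: flag any new measurement outside the mean ± 3 standard deviations of the in-control history. The processing-time values below are invented for illustration.

    ```python
    import numpy as np

    def control_limits(history):
        """Return (lower, upper) 3-sigma control limits from in-control history."""
        mu, sigma = np.mean(history), np.std(history, ddof=1)
        return mu - 3 * sigma, mu + 3 * sigma

    history = [10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 10.0, 9.7]   # in-control runs (hours)
    lo, hi = control_limits(history)
    new_run = 12.5                                            # latest processing time
    out_of_control = not (lo <= new_run <= hi)
    print(lo, hi, out_of_control)
    ```

    Process capability analysis adds a second comparison, of these natural process limits against the specification limits, to judge whether the process can reliably meet its requirements.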

  5. Measuring cognitive load during procedural skills training with colonoscopy as an exemplar.

    Science.gov (United States)

    Sewell, Justin L; Boscardin, Christy K; Young, John Q; Ten Cate, Olle; O'Sullivan, Patricia S

    2016-06-01

    Few studies have investigated cognitive factors affecting learning of procedural skills in medical education. Cognitive load theory, which focuses on working memory, is highly relevant, but methods for measuring cognitive load during procedural training are not well understood. Using colonoscopy as an exemplar, we used cognitive load theory to develop a self-report instrument to measure three types of cognitive load (intrinsic, extraneous and germane load) and to provide evidence for instrument validity. We developed the instrument (the Cognitive Load Inventory for Colonoscopy [CLIC]) using a multi-step process. It included 19 items measuring three types of cognitive load, three global rating items and demographics. We then conducted a cross-sectional survey that was administered electronically to 1061 gastroenterology trainees in the USA. Participants completed the CLIC following a colonoscopy. The two study phases (exploratory and confirmatory) each lasted for 10 weeks during the 2014-2015 academic year. Exploratory factor analysis determined the most parsimonious factor structure; confirmatory factor analysis assessed model fit. Composite measures of intrinsic, extraneous and germane load were compared across years of training and with global rating items. A total of 477 (45.0%) invitees participated (116 in the exploratory study and 361 in the confirmatory study) in 154 (95.1%) training programmes. Demographics were similar to national data from the USA. The most parsimonious factor structure included three factors reflecting the three types of cognitive load. Confirmatory factor analysis verified that a three-factor model was the best fit. Intrinsic, extraneous and germane load items had high internal consistency (Cronbach's alpha 0.90, 0.87 and 0.96, respectively) and correlated as expected with year in training and global assessment of cognitive load. The CLIC measures three types of cognitive load during colonoscopy training. Evidence of validity is
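
    The internal-consistency figures quoted above are Cronbach's alpha values. A sketch of the computation on synthetic item responses follows; the single latent trait and six noisy indicators are invented, not CLIC data.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
        total_var = items.sum(axis=1).var(ddof=1)       # variance of the scale total
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(4)
    latent = rng.normal(0, 1, 500)                          # one underlying trait
    items = latent[:, None] + rng.normal(0, 0.5, (500, 6))  # 6 noisy indicators
    alpha = cronbach_alpha(items)
    print(round(alpha, 2))
    ```

    High alpha (the study reports 0.87-0.96 for its three subscales) indicates the items within a subscale move together, consistent with each measuring one type of cognitive load.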

  6. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data System, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single-entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user-friendly menu drivers.

  7. Reduction procedures for accurate analysis of MSX surveillance experiment data

    Science.gov (United States)

    Gaposchkin, E. Mike; Lane, Mark T.; Abbot, Rick I.

    1994-01-01

    Technical challenges of the Midcourse Space Experiment (MSX) science instruments require careful characterization and calibration of these sensors for analysis of surveillance experiment data. Procedures for reduction of Resident Space Object (RSO) detections will be presented which include refinement and calibration of the metric and radiometric (and photometric) data and calculation of a precise MSX ephemeris. Examples will be given which support the reduction, and these are taken from ground-test data similar in characteristics to the MSX sensors and from the IRAS satellite RSO detections. Examples to demonstrate the calculation of a precise ephemeris will be provided from satellites in similar orbits which are equipped with S-band transponders.

  8. Cath lab costs in patients undergoing percutaneous coronary angioplasty - detailed analysis of consecutive procedures.

    Science.gov (United States)

    Dziki, Beata; Miechowicz, Izabela; Iwachów, Piotr; Kuzemczak, Michał; Kałmucki, Piotr; Szyszka, Andrzej; Baszko, Artur; Siminiak, Tomasz

    2017-01-01

    Costs of percutaneous coronary interventions (PCI) have an important impact on health care expenditures. Despite the present stress upon the cost-effectiveness issues in medicine, few comprehensive data exist on costs and resource use in different clinical settings. To assess catheterisation laboratory costs related to use of drugs and single-use devices in patients undergoing PCI due to coronary artery disease. Retrospective analysis of 1500 consecutive PCIs (radial approach, n = 1103; femoral approach, n = 397) performed due to ST-segment elevation myocardial infarction (STEMI; n = 345) and non-ST-segment elevation myocardial infarction (NSTEMI; n = 426) as well as unstable angina (UA; n = 489) and stable angina (SA; n = 241) was undertaken. Comparative cost analysis was performed and shown in local currency units (PLN). The cath lab costs were higher in STEMI (4295.01 ± 2384.54 PLN, p costs were positively correlated with X-ray dose, fluoroscopy, and total procedure times. Patients' age negatively correlated with cath lab costs in STEMI/NSTEMI patients. Cath lab costs were higher in STEMI patients compared to other groups. In STEMI/NSTEMI they were lower in older patients. In all analysed groups costs were related to the level of procedural difficulty. In female patients, the costs of PCI performed via radial approach were higher compared to femoral approach. Despite younger age, male patients underwent more expensive procedures.

  9. Computer-Aided Diagnosis of Solid Breast Lesions Using an Ultrasonic Multi-Feature Analysis Procedure

    Science.gov (United States)

    2011-01-01

    ultrasound. 1. BACKGROUND AND INTRODUCTION Breast cancer affects one of every eight women; it kills one of 29 women in the United States and is the leading...feature analysis procedure for computer-aided diagnosis of solid breast lesions,” Ultrason Imag, 2010 (In Press). 22. C. B. Shakespeare, personal

  10. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta

    2006-01-01

    OBJECTIVE: To test a method for improving the selection of indicators of general practitioners' prescribing. METHODS: We conducted a prescription database study including all 180 general practices in the County of Funen, Denmark, approximately 472,000 inhabitants. Principal factor analysis was us...... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns....
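As a sketch of the kind of computation behind such a factor analysis, the first principal axis of a correlation matrix of indicators can be extracted by power iteration. The correlations below are invented for illustration, not the Funen prescribing data:

```python
def dominant_factor(corr, iters=200):
    """First principal axis of a correlation matrix via power iteration."""
    n = len(corr)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(corr[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # Rayleigh quotient gives the corresponding eigenvalue
    eigval = sum(v[i] * sum(corr[i][j] * v[j] for j in range(n)) for i in range(n))
    return eigval, v

# Hypothetical correlations among three prescribing indicators
R = [
    [1.0, 0.6, 0.5],
    [0.6, 1.0, 0.4],
    [0.5, 0.4, 1.0],
]
lam, direction = dominant_factor(R)
loadings = [lam ** 0.5 * x for x in direction]  # loadings on the first factor
```

Indicators loading together on the first factor are the ones that "move together" across practices, which is how correlated appropriate or inappropriate prescribing shows up.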

  11. What are the factors predictive of hysterosalpingogram compliance after female sterilization by the Essure procedure in a publicly insured population?

    Science.gov (United States)

    Howard, David L; Wall, Jeffrey; Strickland, Julie L

    2013-12-01

    To determine what factors are predictive of post-Essure hysterosalpingogram (HSG) compliance. We conducted a retrospective chart review of all patients who underwent the Essure procedure at the two campuses of the Truman Medical Center, Kansas City, Missouri, from January 1, 2005 through December 31, 2010. Our study population consisted primarily of women who were publicly insured (89.0 %) and unmarried (76.7 %). Of 132 patients referred for HSG, 70 (53.0 %) complied. In adjusted analyses women 35 years and older had an almost fourfold higher odds of HSG compliance (OR = 3.72, 95 % CI 1.35-10.23) and women with 3 or more living children had a 64 % lower odds of HSG compliance (OR = 0.36, 95 % CI 0.16-0.82). Women younger than 35 who had 3 or more children had the lowest compliance rate (36.4 %) suggesting an interaction between age and parity. Women undergoing the Essure procedure at the campus with a dedicated protocol to ensure compliance had an almost fourfold higher odds of HSG compliance (OR = 3.67, 95 % CI 1.01-13.40). In a population consisting largely of publicly insured, unmarried women, several factors are predictive of post-Essure HSG compliance. These include age, parity and the presence or absence of an institutional protocol to keep track of patients after their Essure procedure.
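The adjusted odds ratios above come from multivariable logistic regression; the simpler unadjusted version can be illustrated from a 2×2 table with a Wald 95% confidence interval. A sketch with invented counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table.

    a: exposed with outcome, b: exposed without outcome,
    c: unexposed with outcome, d: unexposed without outcome.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: HSG-compliant vs not, by age group (illustrative only)
or_, lo, hi = odds_ratio_ci(30, 15, 40, 47)
```

A CI that excludes 1.0 corresponds to a statistically significant association, as with the age and protocol effects reported above.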

  12. Secondary Analysis of Audio Data. Technical Procedures for Virtual Anonymization and Pseudonymization

    Directory of Open Access Journals (Sweden)

    Henning Pätzold

    2005-01-01

    Full Text Available Qualitative material presented as audio data requires a greater degree of protection of anonymity than, for example, textual data. Apart from the verbal content, it carries paraverbal aspects including voice characteristics, thus making it easier to identify the speaker. This complicates secondary analysis or reanalysis conducted by researchers who were not involved in the data collection. Difficulties increase if the chances are high that the researcher and the interviewee come into contact, for example through a meeting. This paper describes the technical procedures that are used to modify the sound of the audio source in a way that reduces the possibility of recognition (i.e. to a level similar to that of a carefully written transcript). A discussion of the technical possibilities of this procedure along with an exploration of the boundaries of anonymization is presented. URN: urn:nbn:de:0114-fqs0501249
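The abstract does not publish the exact signal-processing chain, but one crude building block of voice anonymization is a pitch shift. A minimal linear-interpolation resampling sketch (real anonymization tools would also alter formants and timing):

```python
import math

def resample(samples, factor):
    """Linear-interpolation resampling: played back at the original
    sample rate, the output's pitch is `factor` times the input's."""
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += factor
    return out

def zero_crossings(samples):
    """Count sign changes: a rough proxy for frequency."""
    return sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)

RATE = 8000
tone = [math.sin(2 * math.pi * 440 * t / RATE) for t in range(RATE)]  # 1 s at 440 Hz
shifted = resample(tone, 1.5)  # sounds roughly a fifth higher at the same rate
```

The zero-crossing rate of `shifted` is about 1.5 times that of `tone`, which is exactly the kind of voice-characteristic change that frustrates speaker recognition while leaving the verbal content intelligible.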

  13. A model for analysing factors which may influence quality management procedures in higher education

    Directory of Open Access Journals (Sweden)

    Cătălin MAICAN

    2015-12-01

    Full Text Available In all universities, the Office for Quality Assurance defines the procedure for assessing the performance of the teaching staff, with a view to establishing students' perception of the teachers' activity in terms of the quality of the teaching process, the relationship with the students, and the assistance provided for learning. The present paper aims at creating a combined model for evaluation based on data mining statistical methods: starting from the findings revealed by the evaluations teachers performed of students, and using cluster analysis and discriminant analysis, we identified the subjects which produced significant differences between students' grades; these subjects were subsequently evaluated by the students. The results of these analyses allowed the formulation of certain measures for enhancing the quality of the evaluation process.
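As a toy illustration of the cluster-analysis step (the grade data below are invented), subjects can be grouped by mean student grade with a tiny one-dimensional k-means:

```python
def kmeans_1d(values, k=2, iters=50):
    """Tiny 1-D k-means: returns cluster centres and assignments."""
    centres = sorted(values)[:: max(1, len(values) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in centres]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centres[i]))
            groups[idx].append(v)
        centres = [sum(g) / len(g) if g else c for g, c in zip(groups, centres)]
    assignments = [min(range(k), key=lambda i: abs(v - centres[i])) for v in values]
    return centres, assignments

# Hypothetical mean grades per subject (1-10 scale)
grades = [4.1, 4.3, 6.8, 7.2, 4.0, 7.5, 6.9, 4.4]
centres, assignments = kmeans_1d(grades)
```

Subjects falling into the low-grade cluster would then be the candidates for the follow-up student evaluation described in the paper.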

  14. Human factors analysis of incident/accident report

    International Nuclear Information System (INIS)

    Kuroda, Isao

    1992-01-01

    Human factors analysis of accidents and incidents involves difficulties that are not only technical but also psychosocial in background. This report introduces some experiments with the 'Variation diagram method', which can be extended to operational and managerial factors. (author)

  15. Phrenic nerve injury after radiofrequency ablation of lung tumors: retrospective evaluation of the incidence and risk factors.

    Science.gov (United States)

    Matsui, Yusuke; Hiraki, Takao; Gobara, Hideo; Uka, Mayu; Masaoka, Yoshihisa; Tada, Akihiro; Toyooka, Shinichi; Mitsuhashi, Toshiharu; Mimura, Hidefumi; Kanazawa, Susumu

    2012-06-01

    To retrospectively investigate the incidence of and risk factors for phrenic nerve injury after radiofrequency (RF) ablation of lung tumors. The study included 814 RF ablation procedures of lung tumors. To evaluate the development of phrenic nerve injury, chest radiographs obtained before and after the procedure were examined. Phrenic nerve injury was assumed to have developed if the diaphragmatic level was elevated after the procedure. To identify risk factors for phrenic nerve injury, multiple variables were compared between cases of phrenic nerve injury and randomly selected controls by using univariate analyses. Multivariate analysis was then performed to identify independent risk factors. Evaluation of phrenic nerve injury from chest radiographs was possible after 786 procedures. Evidence of phrenic nerve injury developed after 10 cases (1.3%). Univariate analysis revealed that larger tumor size (≥ 20 mm; P = .014), proximity of the phrenic nerve to the tumor (phrenic nerve injury. Multivariate analysis demonstrated that the proximity of the phrenic nerve to the tumor (phrenic nerve injury after RF ablation was 1.3%. The proximity of the phrenic nerve to the tumor was an independent risk factor for phrenic nerve injury. Copyright © 2012 SIR. Published by Elsevier Inc. All rights reserved.

  16. Factors affecting successful colonoscopy procedures: Single-center experience.

    Science.gov (United States)

    Kozan, Ramazan; Yılmaz, Tonguç Utku; Baştuğral, Uygar; Kerimoğlu, Umut; Yavuz, Yücel

    2018-01-01

    Colonoscopy is a gold standard procedure for several colon pathologies. Successful colonoscopy means demonstration of the ileocecal valve and determination of colon polyps. Here we aimed to evaluate our colonoscopy success and results. This retrospective descriptive study was performed in the İstanbul Eren hospital endoscopy unit between 2012 and 2015. Colonoscopy results and patient demographics were obtained from the hospital database. All colonoscopy procedures were performed under general anesthesia and after full bowel preparation. In all, 870 patients were included in the study. We reached the cecum in 850 (97.8%) patients. We were unable to reach the cecum in patients who were old and obese and those with previous lower abdominal operations. Angulation, inability to move forward, and tortuous colon were the reasons for inability to reach the cecum. In total, 203 polyp samplings were performed in 139 patients. We performed 1, 2, and 3 polypectomies in 97, 28, and 10 patients, respectively. There were 29 (3.3%) colorectal cancers in our series. There was no mortality or morbidity in our study. General anesthesia and full bowel preparation may be the reason for the increased success of colonoscopy. Increased experience and patient-endoscopist cooperation increased the rate of cecum access and polyp resection and decreased the complication rate.

  17. PGDP [Paducah Gaseous Diffusion Plant]-UF6 handling, sampling, analysis and associated QC/QA and safety related procedures

    International Nuclear Information System (INIS)

    Harris, R.L.

    1987-01-01

    This document is a compilation of Paducah Gaseous Diffusion Plant procedures on UF6 handling, sampling, and analysis, along with associated QC/QA and safety related procedures. It was assembled for transmission by the US Department of Energy to the Korean Advanced Energy Institute as a part of the US-Korea technical exchange program

  18. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  19. Simulation and analysis of data for enhancing low cycle fatigue test procedures

    Energy Technology Data Exchange (ETDEWEB)

    Sarajaervi, U.; Cronvall, O. [VTT Technical Research Centre of Finland (Finland)

    2006-04-15

    The simulation and analysis of data for enhancing low cycle fatigue test procedures is discussed in this report. The analysed materials are an austenitic stainless piping steel and an austenitic weld material. This project continues the work performed in 2003 and 2004. The fatigue test data treatment application developed within the project in 2004 for the preparation of the fatigue data has been developed further. Also, more fatigue test data have been analysed with the application than in 2004. In addition, numerical fatigue simulations were performed with the FEM code ABAQUS. With the fatigue test data treatment application one can, for example, calculate certain relevant characteristic values cycle by cycle, such as the elastic range, and form the sets of cyclic parameter values needed as part of ABAQUS analysis input files. The hardening properties of metals were modelled with both isotropic and kinematic hardening models. The further development of the application included trimming of the analysed data, and consequently trimming of the resulting hardening parameters. The need for trimming arose from the fact that the analysed fatigue test data present some scatter caused by the limited accuracy of the test equipment and the sampling rate. The hardening parameters obtained from the application's analysis results were used in the subsequent ABAQUS analyses, and the fatigue test data were then compared with the ABAQUS simulation results. After finding a procedure to trim the result data to obtain smooth cyclic hardening curves, hardening and softening could be reproduced in the ABAQUS analyses with reasonable accuracy. The modelling of fatigue-induced crack initiation and growth was not considered in this study. On the other hand, a considerable part of the fatigue life of nuclear power plant (NPP) piping components is spent in the phase preceding the initiation and growth of cracks. (au)
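The report does not specify the exact trimming filter used to turn scattered test data into smooth hardening curves; one common approach it could resemble is a centred moving average. A sketch on synthetic noisy cyclic data:

```python
import math
import random

def moving_average(xs, window=9):
    """Centred moving average; `window` should be odd.

    Near the ends of the series the window is shortened, so the
    output has the same length as the input.
    """
    half = window // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

random.seed(1)
ideal = [math.sin(0.1 * i) for i in range(200)]          # idealized cyclic response
noisy = [v + random.uniform(-0.2, 0.2) for v in ideal]   # measurement scatter
smooth = moving_average(noisy)
```

Smoothing of this kind suppresses the scatter introduced by equipment accuracy and sampling rate, at the cost of slightly attenuating the true cyclic amplitude.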

  20. Simulation and analysis of data for enhancing low cycle fatigue test procedures

    International Nuclear Information System (INIS)

    Sarajaervi, U.; Cronvall, O.

    2006-04-01

    The simulation and analysis of data for enhancing low cycle fatigue test procedures is discussed in this report. The analysed materials are an austenitic stainless piping steel and an austenitic weld material. This project continues the work performed in 2003 and 2004. The fatigue test data treatment application developed within the project in 2004 for the preparation of the fatigue data has been developed further. Also, more fatigue test data have been analysed with the application than in 2004. In addition, numerical fatigue simulations were performed with the FEM code ABAQUS. With the fatigue test data treatment application one can, for example, calculate certain relevant characteristic values cycle by cycle, such as the elastic range, and form the sets of cyclic parameter values needed as part of ABAQUS analysis input files. The hardening properties of metals were modelled with both isotropic and kinematic hardening models. The further development of the application included trimming of the analysed data, and consequently trimming of the resulting hardening parameters. The need for trimming arose from the fact that the analysed fatigue test data present some scatter caused by the limited accuracy of the test equipment and the sampling rate. The hardening parameters obtained from the application's analysis results were used in the subsequent ABAQUS analyses, and the fatigue test data were then compared with the ABAQUS simulation results. After finding a procedure to trim the result data to obtain smooth cyclic hardening curves, hardening and softening could be reproduced in the ABAQUS analyses with reasonable accuracy. The modelling of fatigue-induced crack initiation and growth was not considered in this study. On the other hand, a considerable part of the fatigue life of nuclear power plant (NPP) piping components is spent in the phase preceding the initiation and growth of cracks. (au)

  1. Analysis of technological, institutional and socioeconomic factors ...

    African Journals Online (AJOL)

    Analysis of technological, institutional and socioeconomic factors that influences poor reading culture among secondary school students in Nigeria. ... Proliferation and availability of smart phones, chatting culture and social media were identified as technological factors influencing poor reading culture among secondary ...

  2. Exploratory Analysis of the Factors Affecting Consumer Choice in E-Commerce: Conjoint Analysis

    Directory of Open Access Journals (Sweden)

    Elena Mazurova

    2017-05-01

    Full Text Available According to previous studies of online consumer behaviour, three factors are the most influential on purchasing behavior - brand, colour and position of the product on the screen. However, a simultaneous influence of these three factors on the consumer decision making process has not been investigated previously. In this particular work we aim to execute a comprehensive study of the influence of these three factors. In order to answer our main research questions, we conducted an experiment with 96 different combinations of the three attributes, and using statistical analysis, such as conjoint analysis, t-test analysis and Kendall analysis we identified that the most influential factor to the online consumer decision making process is brand, the second most important attribute is the colour, which was estimated half as important as brand, and the least important attribute is the position on the screen. Additionally, we identified the main differences regarding consumers stated and revealed preferences regarding these three attributes.
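The 96 attribute combinations form a full factorial design. The abstract does not list the attribute levels, so the level counts below (8 brands × 4 colours × 3 screen positions = 96) are an assumption chosen only to reproduce the total:

```python
from itertools import product

# Hypothetical levels: the study reports 96 combinations, which
# factors as, e.g., 8 brands x 4 colours x 3 screen positions.
brands = [f"brand_{i}" for i in range(1, 9)]
colours = ["red", "blue", "green", "black"]
positions = ["top", "middle", "bottom"]

# Every (brand, colour, position) profile shown to respondents
profiles = list(product(brands, colours, positions))
print(len(profiles))  # 96 candidate product profiles
```

Conjoint analysis then decomposes respondents' preferences over such profiles into part-worths for each attribute level, which is how the relative importance of brand, colour, and position was estimated.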

  3. A factor analysis to find critical success factors in retail brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

    Full Text Available The present exploratory study aims to find critical components of retail brand among some retail stores. The study seeks to build a brand name in retail level and looks to find important factors affecting it. Customer behavior is largely influenced when the first retail customer experience is formed. These factors have direct impacts on customer experience and satisfaction in retail industry. The proposed study performs an empirical investigation on two well-known retain stores located in city of Tehran, Iran. Using a sample of 265 people from regular customers, the study uses factor analysis and extracts four main factors including related brand, product benefits, customer welfare strategy and corporate profits using the existing 31 factors in the literature.

  4. Interim Reliability Evaluation Program procedures guide

    International Nuclear Information System (INIS)

    Carlson, D.D.; Gallup, D.R.; Kolaczkowski, A.M.; Kolb, G.J.; Stack, D.W.; Lofgren, E.; Horton, W.H.; Lobner, P.R.

    1983-01-01

    This document presents procedures for conducting analyses of a scope similar to those performed in Phase II of the Interim Reliability Evaluation Program (IREP). It documents the current state of the art in performing the plant systems analysis portion of a probabilistic risk assessment. Insights gained into managing such an analysis are discussed. Step-by-step procedures and methodological guidance constitute the major portion of the document. While not to be viewed as a cookbook, the procedures set forth the principal steps in performing an IREP analysis. Guidance for resolving the problems encountered in previous analyses is offered. Numerous examples and representative products from previous analyses clarify the discussion

  5. Optimization of procedures for mercury-203 instrumental neutron activation analysis in human urine

    Energy Technology Data Exchange (ETDEWEB)

    Blotcky, A J; Claassen, J P [Nebraska Univ., Omaha, NE (United States). Medical Center; Fung, Y K [Nebraska Univ., Lincoln, NE (United States). Dept. of Chemistry; Meade, A G; Rack, E P [Nebraska Univ., Lincoln, NE (United States)

    1995-08-01

    Mercury, a known neurotoxin, has been implicated in the etiology and pathogenesis of such disease states as Alzheimer's and Parkinson's diseases. There is concern that the exposure to mercury vapor released from dental amalgam restorations is a potential health hazard. Measurement of mercury concentrations in blood or urine may be useful in the diagnosis of mercury poisoning and in assessing the extent of exposure. This study describes the optimization of pre-neutron activation analysis procedures such as sampling, selection of irradiation and counting vials, and acid digestion in order to minimize mercury loss via volatilization and/or permeation through containers; the determination of mercury can otherwise be complicated by these losses. In the optimized procedure 20 mL of urine was spiked with three different concentrations of mercury, digested with concentrated nitric acid, and placed in polypropylene vials for irradiation and counting. Analysis was performed by subtracting the Se-75 photopeak contribution to the 279 keV Hg-203 photopeak and applying the method of standard additions. Urinary mercury concentrations in normal human subjects were determined to be of the order of 10 ng/mL. (author). 22 refs., 1 fig., 5 tabs.
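The method of standard additions mentioned above fits a line through the responses of spiked aliquots and reads the unknown concentration off the x-intercept. A sketch with invented count data (chosen so the answer lands near the reported ~10 ng/mL order of magnitude):

```python
def standard_additions(spikes, responses):
    """Least-squares line response = a + b * spike; the unknown
    concentration is the magnitude of the x-intercept, a / b."""
    n = len(spikes)
    sx, sy = sum(spikes), sum(responses)
    sxx = sum(x * x for x in spikes)
    sxy = sum(x * y for x, y in zip(spikes, responses))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a = (sy - b * sx) / n                          # intercept
    return a / b  # concentration, in the same units as the spikes

# Invented data: counts from an unspiked aliquot plus three spike levels (ng/mL)
spikes = [0.0, 10.0, 20.0, 30.0]
responses = [210.0, 405.0, 612.0, 808.0]
conc = standard_additions(spikes, responses)
```

Because the calibration is done in the sample's own matrix, matrix effects (here, the digested urine) cancel out, which is why the method suits this kind of trace analysis.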

  6. Optimization of procedures for mercury-203 instrumental neutron activation analysis in human urine

    International Nuclear Information System (INIS)

    Blotcky, A.J.; Claassen, J.P.

    1995-01-01

    Mercury, a known neurotoxin, has been implicated in etiology and pathogenesis of such disease states as Alzheimer's and Parkinson's diseases. There is concern that the exposure to mercury vapor released from dental amalgam restorations is a potential health hazard. Measurement of mercury concentrations in blood or urine may be useful in diagnosis of mercury poisoning and in assessing the extent exposure. This study describes the optimization of pre-neutron activation analysis procedures such as sampling, selection of irradiation and counting vials and acid digestion in order to minimize mercury loss via volatilization and/or permeation through containers. Therefore, the determination of mercury can be complicated by these potential losses. In the optimized procedure 20mL of urine was spiked with three different concentrations of mercury, digested with concentrated nitric acid, and placed in polypropylene vials for irradiation and counting. Analysis was performed by subtracting the Se-75 photopeak contribution to the 279 keV Hg-203 photopeak and applying the method of standard additions. Urinary mercury concentrations in normal human subjects were determined to be of the order of 10ng/mL. (author). 22 refs., 1 fig., 5 tabs

  7. Complications associated with transobturator sling procedures: analysis of 233 consecutive cases with a 27 months follow-up

    Directory of Open Access Journals (Sweden)

    Dubuisson Jean-Bernard

    2009-09-01

    Full Text Available Abstract Background The transobturator tape procedure (TOT) is an effective surgical treatment of female stress urinary incontinence. However, data concerning safety are rare, follow-up is often less than two years, and complications are probably underreported. The aim of this study was to describe early and late complications associated with TOT procedures and identify risk factors for erosions. Methods This was a 27-month follow-up of a cohort of 233 women who underwent TOT with three different types of slings (Aris®, Obtape®, TVT-O®). Follow-up information was available for 225 (96.6%) women. Results There were few perioperative complications. Forty-eight women (21.3%) reported late complications including de novo or worsening of preexisting urgencies (10.2%), perineal pain (2.2%), de novo dyspareunia (9%), and vaginal erosion (7.6%). The risk of erosion significantly differed between the three types of slings and was 4%, 17% and 0% for Aris®, Obtape® and TVT-O® respectively (P = 0.001). The overall proportion of women satisfied by the procedure was 72.1%. The percentage of women satisfied was significantly lower in women who experienced erosion (29.4%) compared to women who did not (78.4%) (RR 0.14, 95% CI 0.05-0.38, P Conclusion Late postoperative complications are relatively frequent after TOT and can impair patients' satisfaction. Women should be informed of these potential complications preoperatively and require careful follow-up after the procedure. Choice of the safest sling material is crucial as it is a risk factor for erosion.

  8. Analysis of the acceptance procedure and quality control a virtual simulation system

    International Nuclear Information System (INIS)

    Gonzalez Ruiz, C.; Pedrero de Aristizabal, D.; Jimenez Rojas, R.; Garcia Hernandez, M. J.; Ruiz Galan, G.; Ayala Lazaro, R.; Garcia Marcos, R.

    2011-01-01

    Acceptance testing was performed, with determination of the reference state, commissioning, and implementation of a control protocol for a virtual simulation system consisting of a computerized tomography (CT) image acquisition unit, an independent external laser locator, and a simulation module associated with the existing planner for clinical dosimetry in radiotherapy. This paper summarizes the path followed in this process, together with the established procedure for periodic monitoring, and analyses the results obtained in the two years of clinical use and quality control.

  9. Effect of music in endoscopy procedures: systematic review and meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Wang, Man Cai; Zhang, Ling Yi; Zhang, Yu Long; Zhang, Ya Wu; Xu, Xiao Dong; Zhang, You Cheng

    2014-10-01

    Endoscopies are common clinical examinations that are somewhat painful and even cause fear and anxiety for patients. We performed this systematic review and meta-analysis of randomized controlled trials to determine the effect of music on patients undergoing various endoscopic procedures. We searched the Cochrane Library, Issue 6, 2013, PubMed, and EMBASE databases up to July 2013. Randomized controlled trials comparing endoscopies, with and without the use of music, were included. Two authors independently abstracted data and assessed risk of bias. Subgroup analyses were performed to examine the impact of music on different types of endoscopic procedures. Twenty-one randomized controlled trials involving 2,134 patients were included. The overall effect of music on patients undergoing a variety of endoscopic procedures significantly improved pain score (weighted mean difference [WMD] = -1.53, 95% confidence interval [CI] [-2.53, -0.53]), anxiety (WMD = -6.04, 95% CI [-9.61, -2.48]), heart rate (P = 0.01), arterial pressure (P music group, compared with the control group. Furthermore, music had little effect for patients undergoing colposcopy and bronchoscopy in the subanalysis. Our meta-analysis suggested that music may offer benefits for patients undergoing endoscopy, except in colposcopy and bronchoscopy. Wiley Periodicals, Inc.
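The weighted mean differences quoted above come from inverse-variance pooling of the per-study effects. A fixed-effect sketch with invented per-study numbers (the review itself may have used a random-effects model):

```python
import math

def pooled_wmd(effects, variances):
    """Fixed-effect inverse-variance pooling of mean differences.

    Each study is weighted by 1/variance; the pooled standard error
    is the square root of the reciprocal of the summed weights.
    """
    weights = [1 / v for v in variances]
    wsum = sum(weights)
    wmd = sum(w * e for w, e in zip(weights, effects)) / wsum
    se = math.sqrt(1 / wsum)
    return wmd, (wmd - 1.96 * se, wmd + 1.96 * se)

# Invented per-study mean differences in pain score and their variances
effects = [-1.2, -2.0, -1.4]
variances = [0.30, 0.50, 0.25]
wmd, ci = pooled_wmd(effects, variances)
```

A pooled WMD whose 95% CI lies entirely below zero, as with the pain-score result reported above, indicates a benefit of music across the included trials.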

  10. Risk factors for postoperative complications in robotic general surgery.

    Science.gov (United States)

    Fantola, Giovanni; Brunaud, Laurent; Nguyen-Thi, Phi-Linh; Germain, Adeline; Ayav, Ahmet; Bresler, Laurent

    2017-03-01

    The feasibility and safety of robotically assisted procedures in general surgery have been reported from various groups worldwide. Because postoperative complications may lead to longer hospital stays and higher costs overall, analysis of risk factors for postoperative surgical complications in this subset of patients is clinically relevant. The goal of this study was to identify risk factors for postoperative morbidity after robotic surgical procedures in general surgery. We performed an observational monocentric retrospective study. All consecutive robotic surgical procedures from November 2001 to December 2013 were included. One thousand consecutive general surgery patients met the inclusion criteria. The mean overall postoperative morbidity and major postoperative morbidity (Clavien >III) rates were 20.4 and 6 %, respectively. This included a conversion rate of 4.4 %, reoperation rate of 4.5 %, and mortality rate of 0.2 %. Multivariate analysis showed that ASA score >3 [OR 1.7; 95 % CI (1.2-2.4)], hematocrit value surgery [OR 1.5; 95 % CI (1-2)], advanced dissection [OR 5.8; 95 % CI (3.1-10.6)], and multiquadrant surgery [OR 2.5; 95 % CI (1.7-3.8)] remained independent risk factors for overall postoperative morbidity. It also showed that advanced dissection [OR 4.4; 95 % CI (1.9-9.6)] and multiquadrant surgery [OR 4.4; 95 % CI (2.3-8.5)] remained independent risk factors for major postoperative morbidity (Clavien >III). This study identifies independent risk factors for postoperative overall and major morbidity in robotic general surgery. Because these factors independently impacted postoperative complications, we believe they could be taken into account in future studies comparing conventional versus robot-assisted laparoscopic procedures in general surgery.

  11. Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.

    Science.gov (United States)

    G.R. Johnson; J.N. King

    1998-01-01

Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work on most statistical software packages that can compute variance component estimates. The...
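The variance component arithmetic behind such an analysis can be sketched from the ANOVA mean squares. The expectations below follow one common textbook form for a half diallel (Griffing Method 4 style); the mean squares, parent count, and replication number are hypothetical, and the exact expectations should be checked against the mating design actually used:

```python
def diallel_varcomp(ms_gca, ms_sca, ms_error, p, r):
    """Variance components for a half diallel (Griffing Method 4 style).
    ms_*: ANOVA mean squares; p: number of parents; r: replications.
    One common textbook form of the expected mean squares -- verify
    against the actual design before relying on these expressions."""
    var_gca = (ms_gca - ms_sca) / (r * (p - 2))    # general combining ability
    var_sca = (ms_sca - ms_error) / r              # specific combining ability
    return {"gca": var_gca, "sca": var_sca,
            "additive": 4 * var_gca,               # V_A approx 4*sigma^2_gca
            "non_additive": 4 * var_sca}           # V_D approx 4*sigma^2_sca

print(diallel_varcomp(ms_gca=12.0, ms_sca=4.0, ms_error=1.0, p=6, r=2))
```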

  12. Measuring variability of procedure progression in proceduralized scenarios

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea

    2012-01-01

Highlights: ► The VPP measure was developed to quantify how differently operators follow the procedures. ► Sources that cause variability in ways of following a given procedure were identified. ► The VPP values for the scenarios are positively related to the scenario performance time. ► The VPP measure is meaningful for explaining characteristics of several PSFs. -- Abstract: Various performance shaping factors (PSFs) have been presented to explain the contributors to unsafe acts in a human failure event or to predict the human error probability of new human performance. However, because most of these parameters of a human reliability analysis (HRA) depend on the subjective knowledge and experience of HRA analysts, the results of an HRA do not provide unbiased standards for explaining variations in human performance or for comparing collected data with data from different analysts. To secure the validity of HRA results, we propose a quantitative measure that represents the variability of procedure progression (VPP) in proceduralized scenarios. The VPP measure shows how differently operators follow the steps of the procedures. This paper introduces the sources of the VPP measure and its relevance to PSFs. An assessment method for the VPP measure is also proposed, and application examples are shown with a comparison of performance times. Although more empirical studies should be conducted to reveal the relationship between the VPP measure and other PSFs, it is believed that the VPP measure provides evidence to quantitatively evaluate human performance variations and to cross-culturally compare collected data.
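The abstract does not give the VPP formula, but one illustrative way to quantify how differently crews progress through a procedure is the mean pairwise edit distance between their step sequences. The sketch below is such a dispersion score, not the paper's definition; the step labels are hypothetical:

```python
def edit_distance(a, b):
    """Levenshtein distance between two step sequences (rolling-row DP)."""
    dp = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, y in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,          # deletion
                                     dp[j - 1] + 1,      # insertion
                                     prev + (x != y))    # substitution
    return dp[-1]

def progression_variability(sequences):
    """Mean pairwise edit distance across operators' step sequences --
    an illustrative variability score, not the paper's VPP formula."""
    pairs = [(s, t) for i, s in enumerate(sequences) for t in sequences[i + 1:]]
    return sum(edit_distance(s, t) for s, t in pairs) / len(pairs)

crews = [["S1", "S2", "S3", "S4"],
         ["S1", "S3", "S2", "S4"],
         ["S1", "S2", "S4"]]
print(progression_variability(crews))
```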

  13. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    Science.gov (United States)

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
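The bootstrap idea underlying these tests can be illustrated on the simplest ROC summary, the AUC: resample cases with replacement and recompute the statistic to obtain a percentile interval. A sketch on synthetic scores (this is the generic bootstrap, not the npROCRegression covariate tests themselves):

```python
import numpy as np

def auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney) formulation."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 100)
scores = np.concatenate([rng.normal(0, 1, 100), rng.normal(1, 1, 100)])

boots = []
for _ in range(500):
    idx = rng.integers(0, len(labels), len(labels))
    if labels[idx].min() == labels[idx].max():   # resample must contain both classes
        continue
    boots.append(auc(scores[idx], labels[idx]))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"AUC={auc(scores, labels):.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
```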

  14. Comparative analysis of diagnostic accuracy of different brain biopsy procedures.

    Science.gov (United States)

    Jain, Deepali; Sharma, Mehar Chand; Sarkar, Chitra; Gupta, Deepak; Singh, Manmohan; Mahapatra, A K

    2006-12-01

Image-guided procedures such as computed tomography (CT) guided, neuronavigator-guided and ultrasound-guided methods can assist neurosurgeons in localizing intraparenchymal lesions of the brain. However, despite improvements in imaging techniques, an accurate diagnosis of an intrinsic lesion requires tissue sampling and histological verification. The present study was carried out to examine the reliability of diagnoses made on tumor samples obtained via different stereotactic and ultrasound-guided brain biopsy procedures. A retrospective analysis was conducted of all brain biopsies (frame-based and frameless stereotactic and ultrasound-guided) performed in a single tertiary care neurosciences center between 1995 and 2005. The overall diagnostic accuracy achieved on histopathology and its correlation with the type of biopsy technique was evaluated. A total of 130 cases were included, consisting of 82 males and 48 females. Age ranged from 4 to 75 years (mean age 39.5 years). Twenty per cent (27 patients) were in the pediatric age group, while 12% (16 patients) were ≥60 years of age. A definitive histological diagnosis was established in 109 cases (diagnostic yield 80.2%), encompassing 101 neoplastic and eight nonneoplastic lesions. Frame-based stereotactic, frameless stereotactic and ultrasound-guided biopsies were done in 95, 15 and 20 patients respectively. Although the number of cases was small, the best yield, i.e., 87% (13/15), was obtained with frameless image-guided stereotactic biopsy, in comparison to conventional frame-based CT-guided stereotactic biopsy and ultrasound-guided biopsy. Overall, a trend toward higher diagnostic yield was seen with frameless image-guided stereotactic biopsy. Thus, this small series confirms that frameless neuronavigator-guided stereotactic procedures sample the lesion sufficiently to make a histopathologic diagnosis.

  15. Factor analysis for exercise stress radionuclide ventriculography

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Yasuda, Mitsutaka; Oku, Hisao; Ikuno, Yoshiyasu; Takeuchi, Kazuhide; Takeda, Tadanao; Ochi, Hironobu

    1987-01-01

Using factor analysis, a new image processing technique in exercise stress radionuclide ventriculography, changes in factors associated with exercise were evaluated in 14 patients with angina pectoris or old myocardial infarction. The patients were imaged in the left anterior oblique projection, and three factor images were presented on a color coded scale. Abnormal factors (AF) were observed in 6 patients before exercise, 13 during exercise, and 4 after exercise. In 7 patients, the occurrence of AF was associated with exercise. Five of them became free from AF after exercise. Three patients showing AF before exercise had aggravation of AF during exercise. Overall, the occurrence or aggravation of AF was associated with exercise in ten (71 %) of the patients. The other three patients, however, had disappearance of AF during exercise. In the last patient, no AF was observed throughout the study. In view of the high incidence of AF associated with exercise, factor analysis may have potential for evaluating cardiac reserve from the viewpoint of left ventricular wall motion abnormality. (Namekawa, K.)

  16. Patient Dose During Carotid Artery Stenting With Embolic-Protection Devices: Evaluation With Radiochromic Films and Related Diagnostic Reference Levels According to Factors Influencing the Procedure

    Energy Technology Data Exchange (ETDEWEB)

    D' Ercole, Loredana, E-mail: l.dercole@smatteo.pv.it [Fondazione IRCCS Policlinico San Matteo, Department of Medical Physics (Italy); Quaretti, Pietro; Cionfoli, Nicola [Fondazione IRCCS Policlinico San Matteo, Department of Radiology (Italy); Klersy, Catherine [Fondazione IRCCS Policlinico San Matteo, Biometry and Clinical Epidemiology Service, Research Department, (Italy); Bocchiola, Milena [Fondazione IRCCS Policlinico San Matteo, Department of Medical Physics (Italy); Rodolico, Giuseppe; Azzaretti, Andrea [Fondazione IRCCS Policlinico San Matteo, Department of Radiology (Italy); Lisciandro, Francesco [Fondazione IRCCS Policlinico San Matteo, Department of Medical Physics (Italy); Cascella, Tommaso; Zappoli Thyrion, Federico [Fondazione IRCCS Policlinico San Matteo, Department of Radiology (Italy)

    2013-04-15

To measure the maximum entrance skin dose (MESD) on patients undergoing carotid artery stenting (CAS) using embolic-protection devices, to analyze the dependence of dose and exposure parameters on anatomical, clinical, and technical factors affecting the procedure complexity, to obtain some local diagnostic reference levels (DRLs), and to evaluate whether exceeding the DRLs is related to procedure complexity. MESD were evaluated with radiochromic films in 31 patients (mean age 72 ± 7 years). Five of 33 (15 %) procedures used proximal EPD, and 28 of 33 (85 %) procedures used distal EPD. Local DRLs were derived from the recorded exposure parameters in 93 patients (65 men and 28 women, mean age 73 ± 9 years) undergoing 96 CAS with proximal (33 %) or distal (67 %) EPD. Four bilateral lesions were included. MESD values (mean 0.96 ± 0.42 Gy) were <2 Gy without relevant dependence on procedure complexity. Local DRL values for kerma area product (KAP), fluoroscopy time (FT), and number of frames (N_FR) were 269 Gy·cm², 28 minutes, and 251, respectively. Only simultaneous bilateral treatment was associated with KAP (odds ratio [OR] 10.14, 95 % confidence interval [CI] 1-102.7, p < 0.05) and N_FR overexposures (OR 10.8, 95 % CI 1.1-109.5, p < 0.05). Type I aortic arch decreased the risk of FT overexposure (OR 0.4, 95 % CI 0.1-0.9, p = 0.042), and stenosis ≥90 % increased the risk of N_FR overexposure (OR 2.8, 95 % CI 1.1-7.4, p = 0.040). At multivariable analysis, stenosis ≥90 % (OR 2.8, 95 % CI 1.1-7.4, p = 0.040) and bilateral treatment (OR 10.8, 95 % CI 1.1-109.5, p = 0.027) were associated with overexposure for two or more parameters. Skin doses are not problematic in CAS with EPD because these procedures rarely lead to doses >2 Gy.
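Local DRLs such as the KAP value above are conventionally set at the 75th percentile of the locally observed dose distribution. A minimal sketch with hypothetical KAP values:

```python
import numpy as np

# Hypothetical KAP values (Gy*cm^2) from a local patient sample;
# DRLs are conventionally set at the 75th percentile of the distribution.
kap = np.array([95, 120, 150, 180, 205, 230, 260, 300, 340, 410], float)
local_drl = np.percentile(kap, 75)
print(local_drl)
```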

  17. Evaluation of some procedures relevant to the determination of trace elemental components in biological materials by destructive neutron activation analysis

    International Nuclear Information System (INIS)

    Berry, D.L.

    1979-01-01

    The development of a simplified procedure for the analysis of biological materials by destructive neutron activation analysis (DNAA) is described. The sample manipulations preceding gamma ray assay were investigated as five specific stages of processing: (1) pre-irradiation treatment; (2) sample irradiation; (3) removal of the organic matrix; (4) removal of interfering radioactivities; and (5) concentration and separation of analyte activities. Each stage was evaluated with respect to susceptibility to sample contamination, loss of trace elemental components, and compatibility with other operations in the overall DNAA procedures. A complete DNAA procedure was proposed and evaluated for the analysis of standard bovine liver and blood samples. The DNAA system was effective for the determination of As, Cu, Fe, Hg, Mo, Rb, Sb, Se, and Zn without yield determinations and with a minimum turn-around time of approximately 3 days

  18. Evaluation of some procedures relevant to the determination of trace elemental components in biological materials by destructive neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berry, D.L.

    1979-01-01

    The development of a simplified procedure for the analysis of biological materials by destructive neutron activation analysis (DNAA) is described. The sample manipulations preceding gamma ray assay were investigated as five specific stages of processing: (1) pre-irradiation treatment; (2) sample irradiation; (3) removal of the organic matrix; (4) removal of interfering radioactivities; and (5) concentration and separation of analyte activities. Each stage was evaluated with respect to susceptibility to sample contamination, loss of trace elemental components, and compatibility with other operations in the overall DNAA procedures. A complete DNAA procedure was proposed and evaluated for the analysis of standard bovine liver and blood samples. The DNAA system was effective for the determination of As, Cu, Fe, Hg, Mo, Rb, Sb, Se, and Zn without yield determinations and with a minimum turn-around time of approximately 3 days.

  19. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed description of the model input parameters. This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.

  20. Hand function evaluation: a factor analysis study.

    Science.gov (United States)

    Jarus, T; Poremba, R

    1993-05-01

    The purpose of this study was to investigate hand function evaluations. Factor analysis with varimax rotation was used to assess the fundamental characteristics of the items included in the Jebsen Hand Function Test and the Smith Hand Function Evaluation. The study sample consisted of 144 subjects without disabilities and 22 subjects with Colles fracture. Results suggest a four factor solution: Factor I--pinch movement; Factor II--grasp; Factor III--target accuracy; and Factor IV--activities of daily living. These categories differentiated the subjects without Colles fracture from the subjects with Colles fracture. A hand function evaluation consisting of these four factors would be useful. Such an evaluation that can be used for current clinical purposes is provided.

  1. Salivary SPECT and factor analysis in Sjoegren's syndrome

    International Nuclear Information System (INIS)

    Nakamura, T.; Oshiumi, Y.; Yonetsu, K.; Muranaka, T.; Sakai, K.; Kanda, S.; National Fukuoka Central Hospital

    1991-01-01

    Salivary SPECT and factor analysis in Sjoegren's syndrome were performed in 17 patients and 6 volunteers as controls. The ability of SPECT to detect small differences in the level of uptake can be used to separate glands from background even when uptake is reduced as in the patients with Sjoegren's syndrome. In control and probable Sjoegren's syndrome groups the uptake ratio of the submandibular gland to parotid gland on salivary SPECT (S/P ratio) was less than 1.0. However, in the definite Sjoergren's syndrome group, the ratio was more than 1.0. Moreover, the ratio in all patients with sialectasia, which is characteristic of Sjoegren's syndrome, was more than 1.0. Salivary factor analysis of normal parotid glands showed slowly increasing patterns of uptake and normal submandibular glands had rapidly increasing patterns of uptake. However, in the definite Sjoegren's syndrome group, the factor analysis patterns were altered, with slowly increasing patterns dominating both in the parotid and submandibular glands. These results suggest that the S/P ratio in salivary SPECT and salivary factor analysis provide additional radiologic criteria in diagnosing Sjoegren's syndrome. (orig.)

  2. Procedural learning is impaired in dyslexia: Evidence from a meta-analysis of serial reaction time studies

    Science.gov (United States)

    Lum, Jarrad A.G.; Ullman, Michael T.; Conti-Ramsden, Gina

    2013-01-01

A number of studies have investigated procedural learning in dyslexia using serial reaction time (SRT) tasks. Overall, the results have been mixed, with evidence of both impaired and intact learning reported. We undertook a systematic search of studies that examined procedural learning using SRT tasks, and synthesized the data using meta-analysis. A total of 14 studies were identified, representing data from 314 individuals with dyslexia and 317 typically developing control participants. The results indicate that, on average, individuals with dyslexia have worse procedural learning abilities than controls, as indexed by sequence learning on the SRT task. The average weighted standardized mean difference (the effect size) was 0.449 (CI95: .204, .693) and was statistically significant, consistent with impaired procedural learning in dyslexia. PMID:23920029
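The pooled effect size reported here is a weighted mean of per-study standardized mean differences. A fixed-effect (inverse-variance) sketch with hypothetical study inputs, not the 14 studies of the meta-analysis:

```python
import math

def fixed_effect_smd(effects, variances):
    """Inverse-variance weighted mean effect size with a 95% CI."""
    weights = [1 / v for v in variances]
    mean = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return mean, mean - 1.96 * se, mean + 1.96 * se

# Hypothetical per-study standardized mean differences and variances:
d, lo, hi = fixed_effect_smd([0.30, 0.55, 0.48], [0.04, 0.02, 0.05])
print(round(d, 3), round(lo, 3), round(hi, 3))
```

Random-effects models add a between-study variance term to each weight, but the pooling logic is the same.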

  3. Using exploratory factor analysis in personality research: Best-practice recommendations

    Directory of Open Access Journals (Sweden)

    Sumaya Laher

    2010-11-01

Research purpose: This article presents more objective methods to determine the number of factors, most notably parallel analysis and Velicer’s minimum average partial (MAP). The benefits of rotation are also discussed. The article argues for more consistent use of Procrustes rotation and congruence coefficients in factor analytic studies. Motivation for the study: Exploratory factor analysis is often criticised for not being rigorous and objective enough in terms of the methods used to determine the number of factors, the rotations to be used and ultimately the validity of the factor structure. Research design, approach and method: The article adopts a theoretical stance to discuss the best-practice recommendations for factor analytic research in the field of psychology. Following this, an example located within personality assessment and using the NEO-PI-R specifically is presented. A total of 425 students at the University of the Witwatersrand completed the NEO-PI-R. These responses were subjected to a principal components analysis using varimax rotation. The rotated solution was subjected to a Procrustes rotation with Costa and McCrae’s (1992) matrix as the target matrix. Congruence coefficients were also computed. Main findings: The example indicates the use of the methods recommended in the article and demonstrates an objective way of determining the number of factors. It also provides an example of Procrustes rotation with coefficients of agreement as an indication of how factor analytic results may be presented more rigorously in local research. Practical/managerial implications: It is hoped that the recommendations in this article will have best-practice implications for both researchers and practitioners in the field who employ factor analysis regularly. Contribution/value-add: This article will prove useful to all researchers employing factor analysis and has the potential to set the trend for better use of factor analysis in the South African context.
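Horn's parallel analysis, recommended above, retains a component only if its observed eigenvalue exceeds the corresponding mean eigenvalue obtained from random data of the same size. A self-contained sketch on synthetic two-factor data (the variable counts and noise level are arbitrary):

```python
import numpy as np

def parallel_analysis(data, n_iter=100, seed=0):
    """Horn's parallel analysis: retain components whose observed
    correlation-matrix eigenvalues exceed the mean eigenvalues of
    random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.zeros(p)
    for _ in range(n_iter):
        r = rng.standard_normal((n, p))
        rand += np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    rand /= n_iter
    return int(np.sum(obs > rand)), obs, rand

# Synthetic data driven by two latent factors plus noise:
rng = np.random.default_rng(1)
f = rng.standard_normal((300, 2))
load = rng.standard_normal((2, 8))
x = f @ load + 0.5 * rng.standard_normal((300, 8))
k, _, _ = parallel_analysis(x)
print(k)
```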

  4. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2005-04-28

This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''.

  5. Automated processing of first-pass radionuclide angiocardiography by factor analysis of dynamic structures.

    Science.gov (United States)

    Cavailloles, F; Bazin, J P; Capderou, A; Valette, H; Herbert, J L; Di Paola, R

    1987-05-01

A method for automatic processing of cardiac first-pass radionuclide studies is presented. This technique, factor analysis of dynamic structures (FADS), provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. FADS has been applied to 76 studies. A description of factor patterns obtained in various pathological categories is presented. FADS provides easy diagnosis of shunts and tricuspid insufficiency. Quantitative information derived from the factors (cardiac output and mean transit time) was compared with that obtained by the region of interest method. Using FADS, a higher correlation with cardiac catheterization was found for cardiac output calculation. Thus, compared with the ROI method, FADS presents obvious advantages: a good separation of overlapping cardiac chambers is obtained, and this operator-independent method provides more objective and reproducible results. A number of parameters of the cardio-pulmonary function can be assessed by first-pass radionuclide angiocardiography (RNA) [1,2]. Usually, they are calculated using time-activity curves (TAC) from regions of interest (ROI) drawn on the cardiac chambers and the lungs. This method has two main drawbacks: (1) the lack of inter- and intra-observer reproducibility; (2) the problem of crosstalk, which affects the evaluation of the cardio-pulmonary performance. The crosstalk on planar imaging is due to anatomical superimposition of the cardiac chambers and lungs. The activity measured in any ROI is the sum of the activity in several organs, and 'decontamination' of the TAC cannot easily be performed using the ROI method [3]. Factor analysis of dynamic structures (FADS) [4,5] can solve the two problems mentioned above. It provides an automatic separation of anatomical structures according to their different temporal behaviour, even if they are superimposed. The resulting factors are estimates of the time evolution of the activity in each anatomical structure.
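The linear factor model behind FADS treats each pixel's time-activity curve as a weighted sum of a few temporal factor curves. The sketch below is a simplified illustration of that mixture model on synthetic data, with known (hypothetical Gaussian-shaped) factor curves and only the spatial weights recovered; real FADS estimates both the curves and the weights blindly:

```python
import numpy as np

# Each pixel's time-activity curve = weighted sum of temporal factor curves.
t = np.linspace(0, 30, 60)
factors = np.vstack([np.exp(-((t - 8) / 3) ** 2),     # early (right-heart-like) curve
                     np.exp(-((t - 18) / 4) ** 2)])   # late (left-heart-like) curve

rng = np.random.default_rng(0)
true_weights = rng.uniform(0, 1, (100, 2))            # 100 pixels, 2 structures
curves = true_weights @ factors + 0.01 * rng.standard_normal((100, 60))

# With the factor curves known, the spatial weights follow by least squares:
est_weights, *_ = np.linalg.lstsq(factors.T, curves.T, rcond=None)
print(np.abs(est_weights.T - true_weights).max())
```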

  6. Procedural learning in Parkinson's disease, specific language impairment, dyslexia, schizophrenia, developmental coordination disorder, and autism spectrum disorders: A second-order meta-analysis.

    Science.gov (United States)

    Clark, Gillian M; Lum, Jarrad A G

    2017-10-01

    The serial reaction time task (SRTT) has been used to study procedural learning in clinical populations. In this report, second-order meta-analysis was used to investigate whether disorder type moderates performance on the SRTT. Using this approach to quantitatively summarise past research, it was tested whether autism spectrum disorder, developmental coordination disorder, dyslexia, Parkinson's disease, schizophrenia, and specific language impairment differentially affect procedural learning on the SRTT. The main analysis revealed disorder type moderated SRTT performance (p=0.010). This report demonstrates comparable levels of procedural learning impairment in developmental coordination disorder, dyslexia, Parkinson's disease, schizophrenia, and specific language impairment. However, in autism, procedural learning is spared. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Periodic tests: a human factors analysis of documentary aspects

    International Nuclear Information System (INIS)

    Perinet, Romuald; Rousseau, Jean-Marie

    2007-01-01

    Periodic tests are technical inspections aimed at verifying the availability of the safety-related systems during operation. The French licensee, Electricite de France (EDF), manages periodic tests according to procedures, methods of examination and a frequency, which were defined when the systems were designed. These requirements are defined by national authorities of EDF in a reference document composed of rules of testing and tables containing the reference values to be respected. This reference document is analyzed and transformed by each 'Centre Nucleaire de Production d'Electricite' (CNPE) into station-specific operating ranges of periodic tests. In 2003, the IRSN noted that significant events for safety (ESS) involving periodic tests represented more than 20% of ESS between 2000 and 2002. Thus, 340 ESS were related to non-compliance with the conditions of the test and errors in the implementation of the procedures. A first analysis showed that almost 26% of all ESSs from 2000 to 2002 were related to periodic tests. For many of them, the national reference document and the operating ranges of tests were involved. In this context, the 'Direction Generale de la Surete Nucleaire' (DGSNR), requested the 'Institut de Radioprotection et de Surete Nucleaire' (IRSN) to examine the process of definition and implementation of the periodic tests. The IRSN analyzed about thirty French Licensee event reports occurring during the considered period (2000-2002). The IRSN also interviewed the main persons responsible for the processes and observed the performance of 3 periodic tests. The results of this analysis were presented to a group of experts ('Groupe Permanent') charged with delivering advice to the DGSNR about the origin of the problems identified and the improvements to be implemented. The main conclusions of the IRSN addressed the quality of the prescriptive documents. In this context, EDF decided to carry out a thorough analysis of the whole process. The first

  8. Sampling, storage and sample preparation procedures for X ray fluorescence analysis of environmental materials

    International Nuclear Information System (INIS)

    1997-06-01

The X ray fluorescence (XRF) method is one of the most commonly used nuclear analytical techniques because of its multielement and non-destructive character, speed, economy and ease of operation. From the point of view of quality assurance practices, sampling and sample preparation are the most crucial steps in all analytical techniques (including X ray fluorescence) applied to the analysis of heterogeneous materials. This technical document covers recent modes of the X ray fluorescence method and recent developments in sample preparation techniques for the analysis of environmental materials. Refs, figs, tabs

  9. Factor Analysis for Clustered Observations.

    Science.gov (United States)

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)
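The simple noniterative method mentioned here rests on splitting the total sums of squares and cross-products into between-cluster and within-cluster parts. A sketch of that decomposition on synthetic clustered data (the cluster sizes and dimensions are arbitrary):

```python
import numpy as np

def decompose_sscp(x, groups):
    """Split the total sums of squares and cross-products (SSCP) of
    multivariate data into between-cluster and within-cluster parts."""
    grand = x.mean(axis=0)
    total = (x - grand).T @ (x - grand)
    between = np.zeros_like(total)
    within = np.zeros_like(total)
    for g in np.unique(groups):
        xg = x[groups == g]
        mg = xg.mean(axis=0)
        between += len(xg) * np.outer(mg - grand, mg - grand)
        within += (xg - mg).T @ (xg - mg)
    return total, between, within

rng = np.random.default_rng(0)
groups = np.repeat(np.arange(10), 20)                 # 10 clusters of 20
x = rng.standard_normal((200, 3)) + rng.standard_normal((10, 3))[groups]
total, between, within = decompose_sscp(x, groups)
print(np.allclose(total, between + within))           # the identity holds exactly
```

Two-level factor models then fit separate factor structures to the between and within covariance matrices derived from these two parts.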

  10. Completely automated modal analysis procedure based on the combination of different OMA methods

    Science.gov (United States)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

In this work a completely automated output-only Modal Analysis procedure is presented and its benefits are listed. By merging different Operational Modal Analysis methods with a statistical approach, the identification process has been made more robust, returning only the physical natural frequencies, damping ratios and mode shapes of the system. The effect of temperature can be taken into account as well, leading to a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building located in the laboratories of Politecnico di Milano.

  11. Analysis and application of ratcheting evaluation procedure of Japanese high temperature design code DDS

    International Nuclear Information System (INIS)

    Lee, H. Y.; Kim, J. B.; Lee, J. H.

    2002-01-01

In this study, the evaluation procedure of the Japanese DDS code, recently developed to assess the progressive inelastic deformation occurring under repeated secondary stresses, was analyzed, and the evaluation results according to DDS were compared with those of the thermal ratchet structural test carried out by KAERI to assess the conservatism of the code. The existing high temperature codes, US ASME-NH and French RCC-MR, provide ratcheting procedures only for load cases of cyclic secondary stresses under primary stresses, so they are not readily applicable to the actual ratcheting problem that can occur under cyclic secondary membrane stresses due to the movement of the hot free surface in a pool-type LMR. DDS explicitly provides an analysis procedure for ratcheting due to moving thermal gradients near the hot free surface. A comparison study was carried out between the results obtained with the DDS design code and those of the structural test to investigate the conservatism of the DDS code; the evaluation results by DDS were in good agreement with those of the structural test.

  12. Neutron activation analysis with k0-standardisation : general formalism and procedure

    International Nuclear Information System (INIS)

    Pomme, S.; Hardeman, F.; Robouch, P.; Etxebarria, N.; Arana, G.

    1997-09-01

Instrumental neutron activation analysis (INAA) with k0-standardisation is a powerful tool for multi-element analysis over a broad range of trace element concentrations. An overview is given of the basic principles, fundamental equations, and general procedure of this method. Different aspects of the description of the neutron activation reaction rate are discussed, applying the Hogdahl convention. A general activation-decay formula is derived and its application to INAA is demonstrated. Relevant k0-definitions for different activation-decay schemes are summarised and upgraded to cases of extremely high fluxes. The main standardisation techniques for INAA are discussed, emphasizing k0-standardisation. Some general aspects of the basic equipment and its calibration are discussed, such as the characterisation of the neutron field and the tuning of the spectrometry part. A method for the prediction and optimisation of the analytical performance of INAA is presented.
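The reaction-rate description reduces, in the simplest case, to the basic activation-decay relation A = N·σ·φ·(1 − e^(−λ·t_irr))·e^(−λ·t_d). A sketch with hypothetical irradiation parameters; the full k0 formalism (flux parameters f and α, comparator-based calibration) is considerably more involved:

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr, t_decay):
    """Activity (Bq) after irradiation and decay, from the basic
    activation-decay relation A = N*sigma*phi*(1-exp(-lam*t_i))*exp(-lam*t_d)."""
    lam = math.log(2) / half_life_s
    saturation = 1 - math.exp(-lam * t_irr)      # buildup during irradiation
    decay = math.exp(-lam * t_decay)             # decay before counting
    return n_atoms * sigma_cm2 * flux * saturation * decay

# Hypothetical: 1e18 target atoms, 1 barn cross-section, 1e13 n/cm^2/s flux,
# 15 h half-life, 2 h irradiation, 1 h decay before counting.
a = induced_activity(1e18, 1e-24, 1e13, 15 * 3600, 2 * 3600, 3600)
print(f"{a:.3e} Bq")
```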

  13. Analysis and optimization of the TWINKLE factoring device

    NARCIS (Netherlands)

    Lenstra, A.K.; Shamir, A.; Preneel, B.

    2000-01-01

    We describe an enhanced version of the TWINKLE factoring device and analyse to what extent it can be expected to speed up the sieving step of the Quadratic Sieve and Number Field Sieve factoring algorithms. The bottom line of our analysis is that the TWINKLE-assisted factorization of 768-bit

  14. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle

  15. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the "Biosphere Model Report" in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose

  16. Comparing the operators' behavior in conducting emergency operating procedures with the complexity of procedural steps

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jung, Won Dea

    2003-01-01

    Many kinds of procedures have been used to reduce the operators' workload throughout various industries. However, a significant portion of accidents or incidents has been caused by procedure-related human errors originating from non-compliance with procedures. According to related studies, several important factors for non-compliance behavior have been identified, and one of them is the complexity of procedures. This means that comparing changes in the operators' behavior with the complexity of procedures may be meaningful for investigating plausible reasons for the operators' non-compliance behavior. In this study, emergency training records were collected using a full-scope simulator in order to obtain data related to the operators' non-compliance behavior. The collected data were then compared with the complexity of procedural steps. As a result, two remarkable relationships were found, which indicate that the operators' behavior could be reasonably characterized by the complexity of procedural steps. Thus, these relationships can be used as meaningful clues not only to scrutinize the reasons for non-compliance behavior but also to suggest appropriate remedies for the reduction of non-compliance behavior that can result in procedure-related human errors

  17. Transforming Rubrics Using Factor Analysis

    Science.gov (United States)

    Baryla, Ed; Shelley, Gary; Trainor, William

    2012-01-01

    Student learning and program effectiveness is often assessed using rubrics. While much time and effort may go into their creation, it is equally important to assess how effective and efficient the rubrics actually are in terms of measuring competencies over a number of criteria. This study demonstrates the use of common factor analysis to identify…

  18. Evaluation of solution procedures for material and/or geometrically nonlinear structural analysis by the direct stiffness method.

    Science.gov (United States)

    Stricklin, J. A.; Haisler, W. E.; Von Riesemann, W. A.

    1972-01-01

    This paper presents an assessment of the solution procedures available for the analysis of inelastic and/or large-deflection structural behavior. A literature survey is given which summarizes the contributions of other researchers to the analysis of structural problems exhibiting material nonlinearities and combined geometric-material nonlinearities. Attention is focused on evaluating the available computation and solution techniques. Each of the solution techniques is developed from a common equation of equilibrium in terms of pseudo forces. The solution procedures are applied to circular plates and shells of revolution in an attempt to compare and evaluate each with respect to computational accuracy, economy, and efficiency. Based on the numerical studies, observations and comments are made with regard to the accuracy and economy of each solution technique.
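    The pseudo-force idea, keeping the linear stiffness on the left-hand side and iterating on the nonlinear terms as equivalent loads, can be illustrated on a one-degree-of-freedom problem. The sketch below assumes a hypothetical cubic-hardening spring, k0*u + k0*beta*u**3 = F, and is only a minimal illustration of the iteration scheme, not any of the paper's actual procedures:

```python
def pseudo_force_solve(k0, beta, load, tol=1e-10, max_iter=1000):
    """Fixed-point iteration on the pseudo-force form of equilibrium:
    the nonlinear term k0*beta*u**3 is moved to the load side, so every
    iteration solves the same *linear* problem k0 * u = effective load."""
    u = 0.0
    for _ in range(max_iter):
        u_new = (load - k0 * beta * u ** 3) / k0
        if abs(u_new - u) < tol:
            return u_new
        u = u_new
    raise RuntimeError("pseudo-force iteration did not converge")

u = pseudo_force_solve(k0=100.0, beta=0.05, load=50.0)
residual = 100.0 * u + 100.0 * 0.05 * u ** 3 - 50.0   # equilibrium check
```

    The appeal of the method is that the factorized linear stiffness is reused every iteration; the trade-off, as the paper's comparisons suggest, is in convergence rate for strongly nonlinear problems.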

  19. A Factor Analysis of the BSRI and the PAQ.

    Science.gov (United States)

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  20. Identification of noise in linear data sets by factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, Ph.K.

    1982-01-01

    A technique which has the ability to identify bad data points after the data have been generated is classical factor analysis. The ability of classical factor analysis to identify two different types of data errors makes it ideally suited for scanning large data sets. Since the results yielded by factor analysis indicate correlations between parameters, one must know something about the nature of the data set and the analytical techniques used to obtain it to confidently isolate errors. (author)
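    One way such a screen can work is to reproduce the data from a few retained factors and flag observations the factor model cannot reproduce. The numpy sketch below is a principal-component shortcut with synthetic data, not a full classical factor analysis; `flag_noisy_points` and its threshold are illustrative choices:

```python
import numpy as np

def flag_noisy_points(data, n_factors=1, z_thresh=4.0):
    """Crude factor-analytic screen: reproduce the data from the leading
    eigenvectors of the correlation matrix and flag rows whose residual
    is unusually large, as candidates for bad data points."""
    x = (data - data.mean(axis=0)) / data.std(axis=0)
    corr = np.corrcoef(x, rowvar=False)
    vals, vecs = np.linalg.eigh(corr)     # eigenvalues in ascending order
    top = vecs[:, -n_factors:]            # retained "factors"
    recon = x @ top @ top.T               # rank-k reproduction of the data
    resid = np.abs(x - recon).max(axis=1)
    cutoff = resid.mean() + z_thresh * resid.std()
    return np.where(resid > cutoff)[0]

# Synthetic data: four variables driven by one common factor, plus one
# deliberately corrupted measurement in row 17.
rng = np.random.default_rng(0)
common = rng.normal(size=(200, 1))
data = np.hstack([common + 0.05 * rng.normal(size=(200, 1)) for _ in range(4)])
data[17, 2] += 10.0
bad = flag_noisy_points(data)
```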

  1. The human error rate assessment and optimizing system HEROS - a new procedure for evaluating and optimizing the man-machine interface in PSA

    International Nuclear Information System (INIS)

    Richei, A.; Hauptmanns, U.; Unger, H.

    2001-01-01

    A new procedure allowing the probabilistic evaluation and optimization of the man-machine system is presented. This procedure and the resulting expert system HEROS, an acronym for Human Error Rate Assessment and Optimizing System, are based on fuzzy set theory. Most of the well-known procedures employed for the probabilistic evaluation of human factors involve the use of vague linguistic statements on performance shaping factors to select and to modify basic human error probabilities from the associated databases. This implies a large portion of subjectivity. Vague statements are expressed here in terms of fuzzy numbers or intervals, which allow mathematical operations to be performed on them. A model of the man-machine system is the basis of the procedure. A fuzzy rule-based expert system was derived from ergonomic and psychological studies. Hence, it does not rely on a database, whose transferability to situations different from its origin is questionable. In this way, subjective elements are eliminated to a large extent. HEROS facilitates the importance analysis for the evaluation of human factors, which is necessary for optimizing the man-machine system. HEROS is applied to the analysis of a simple diagnosis task of the operating personnel in a nuclear power plant
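    Representing a vague statement as a fuzzy number and propagating it through the error-probability arithmetic can be sketched as follows. This is a simplified illustration with hypothetical triangular fuzzy numbers and the usual positive-support multiplication approximation, not the HEROS rule base:

```python
class TriFuzzy:
    """Triangular fuzzy number (low, mode, high) on a positive support,
    standing in for a vague statement such as 'about 1e-3'."""
    def __init__(self, low, mode, high):
        assert 0 < low <= mode <= high
        self.low, self.mode, self.high = low, mode, high

    def __mul__(self, other):
        # standard approximation for positive triangular fuzzy numbers
        return TriFuzzy(self.low * other.low,
                        self.mode * other.mode,
                        self.high * other.high)

    def defuzzify(self):
        # centroid of the triangle -> a crisp error probability
        return (self.low + self.mode + self.high) / 3.0

# Hypothetical values: a basic error probability and a 'high stress' modifier.
hep = TriFuzzy(1e-4, 1e-3, 1e-2)
stress = TriFuzzy(2.0, 5.0, 10.0)
adjusted = hep * stress
```

    Carrying the full fuzzy number through the calculation, rather than a point value, is what preserves the stated vagueness until a final defuzzification step.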

  2. Risk factors for severity of pneumothorax after CT-guided percutaneous lung biopsy using the single-needle method.

    Science.gov (United States)

    Kakizawa, Hideaki; Toyota, Naoyuki; Hieda, Masashi; Hirai, Nobuhiko; Tachikake, Toshihiro; Matsuura, Noriaki; Oda, Miyo; Ito, Katsuhide

    2010-09-01

    The purpose of this study is to evaluate the risk factors for the severity of pneumothorax after computed tomography (CT)-guided percutaneous lung biopsy using the single-needle method. We reviewed 91 biopsy procedures for 90 intrapulmonary lesions in 89 patients. Patient factors were age, sex, history of ipsilateral lung surgery and grade of emphysema. Lesion factors were size, location and pleural contact. Procedure factors were position, needle type, needle size, number of pleural punctures, pleural angle, length of needle passes in the aerated lung and number of harvested samples. The severity of pneumothorax after biopsy was classified into 4 groups: "none", "mild", "moderate" and "severe". The risk factors for the severity of pneumothorax were determined by multivariate analysis of the factors derived from univariate analysis. Pneumothorax occurred in 39 (43%) of the 91 procedures. Mild, moderate, and severe pneumothorax occurred in 24 (26%), 8 (9%) and 7 (8%) of all procedures, respectively. Multivariate analysis showed that location, pleural contact, number of pleural punctures and number of harvested samples were significantly associated with the severity of pneumothorax (p < 0.05). In conclusion, lower locations and non-pleural-contact lesions, an increased number of pleural punctures and an increased number of harvested samples were associated with a higher severity of pneumothorax.
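    The univariate screening step that precedes such a multivariate model can be sketched as an unadjusted odds ratio from a 2x2 table. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio(exposed_event, exposed_no, unexposed_event, unexposed_no):
    """Unadjusted odds ratio with a 95% CI (Woolf log-odds method) for a
    single candidate risk factor."""
    or_ = (exposed_event * unexposed_no) / (exposed_no * unexposed_event)
    se_log = math.sqrt(1 / exposed_event + 1 / exposed_no +
                       1 / unexposed_event + 1 / unexposed_no)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# Hypothetical counts: pneumothorax by number of pleural punctures (>1 vs 1).
or_, lo, hi = odds_ratio(25, 20, 14, 32)
```

    Factors whose univariate interval excludes 1 would then be carried into the multivariate model.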

  3. Risk factors for severity of pneumothorax after CT-guided percutaneous lung biopsy using the single-needle method

    International Nuclear Information System (INIS)

    Kakizawa, Hideaki; Hieda, Masashi; Oda, Miyo; Toyota, Naoyuki; Hirai, Nobuhiko; Tachikake, Toshihiro; Matsuura, Noriaki; Ito, Katsuhide

    2010-01-01

    The purpose of this study is to evaluate the risk factors for the severity of pneumothorax after computed tomography (CT)-guided percutaneous lung biopsy using the single-needle method. We reviewed 91 biopsy procedures for 90 intrapulmonary lesions in 89 patients. Patient factors were age, sex, history of ipsilateral lung surgery and grade of emphysema. Lesion factors were size, location and pleural contact. Procedure factors were position, needle type, needle size, number of pleural punctures, pleural angle, length of needle passes in the aerated lung and number of harvested samples. The severity of pneumothorax after biopsy was classified into 4 groups: 'none', 'mild', 'moderate' and 'severe'. The risk factors for the severity of pneumothorax were determined by multivariate analysis of the factors derived from univariate analysis. Pneumothorax occurred in 39 (43%) of the 91 procedures. Mild, moderate, and severe pneumothorax occurred in 24 (26%), 8 (9%) and 7 (8%) of all procedures, respectively. Multivariate analysis showed that location, pleural contact, number of pleural punctures and number of harvested samples were significantly associated with the severity of pneumothorax (p<0.05). In conclusion, lower locations and non-pleural-contact lesions, an increased number of pleural punctures and an increased number of harvested samples were associated with a higher severity of pneumothorax. (author)

  4. Systematic review and meta-analysis of enterocolitis after one-stage transanal pull-through procedure for Hirschsprung's disease.

    LENUS (Irish Health Repository)

    Ruttenstock, Elke

    2012-02-01

    PURPOSE: The transanal one-stage pull-through procedure (TERPT) has gained worldwide popularity over open and laparoscopic-assisted one-stage techniques in children with Hirschsprung's disease (HD). It offers the advantages of avoiding laparotomy, laparoscopy, scars, abdominal contamination, and adhesions. However, enterocolitis associated with Hirschsprung's disease (HAEC) still remains a potentially life-threatening complication after pull-through operation. The reported incidence of HAEC ranges from 4.6 to 54%. This meta-analysis was designed to evaluate the postoperative incidence of HAEC following the TERPT procedure. METHODS: A meta-analysis of cases of TERPT reported between 1998 and 2009 was performed. Detailed information was recorded regarding intraoperative details and postoperative complications, with particular emphasis on the incidence of HAEC. Diagnosis of HAEC in an HD patient was based on the clinical presentation of diarrhoea, abdominal distension, and fever. RESULTS: Of the 54 published articles worldwide, 27 articles, including 899 patients, were identified as reporting entirely the TERPT procedure. Postoperative HAEC occurred in 92 patients (10.2%). Recurrent episodes of HAEC were reported in 18 patients (2%). Conservative treatment of HAEC was successful in 75 patients (81.5%), whereas in 17 patients (18.5%) surgical treatment was needed. CONCLUSIONS: This systematic review reveals that TERPT is a safe and less-invasive procedure with a low incidence of postoperative HAEC.
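    The crude pooling behind the review's 10.2% figure can be reproduced from the overall counts. The per-study split below is hypothetical; only the grand totals (92 HAEC cases among 899 TERPT patients) come from the abstract:

```python
def pooled_incidence(events_and_totals):
    """Crude pooled incidence: total events over total patients across studies."""
    events = sum(e for e, n in events_and_totals)
    total = sum(n for e, n in events_and_totals)
    return events / total

# Hypothetical split into three studies summing to the review's totals.
studies = [(30, 250), (40, 400), (22, 249)]
rate = pooled_incidence(studies)   # 92 / 899
```

    A weighted or random-effects pooling would give each study its own influence; the crude ratio is the simplest consistency check on the reported overall incidence.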

  5. Factors Associated with Anxiety About Colonoscopy: The Preparation, the Procedure, and the Anticipated Findings.

    Science.gov (United States)

    Shafer, L A; Walker, J R; Waldman, C; Yang, C; Michaud, V; Bernstein, C N; Hathout, L; Park, J; Sisler, J; Restall, G; Wittmeier, K; Singh, H

    2018-03-01

    Previous research has assessed anxiety around colonoscopy procedures but has not considered anxiety about the different aspects of the colonoscopy process. Before colonoscopy, we assessed anxiety about: bowel preparation, the procedure, and the anticipated results. We evaluated associations between patient characteristics and anxiety in each area. An anonymous survey was distributed to patients immediately prior to their outpatient colonoscopy in six hospitals and two ambulatory care centers in Winnipeg, Canada. Anxiety was assessed using a visual analog scale. For each aspect, logistic regression models were used to explore associations between patient characteristics and high anxiety. A total of 1316 respondents completed the questions about anxiety (52% female, median age 56 years). Anxiety scores > 70 (high anxiety) were reported by 18% about bowel preparation, 29% about the procedure, and 28% about the procedure results. High anxiety about bowel preparation was associated with female sex, perceived unclear instructions, unfinished laxative, and no previous colonoscopies. High anxiety about the procedure was associated with female sex, no previous colonoscopies, and confusing instructions. High anxiety about the results was associated with symptoms as an indication for colonoscopy and instructions perceived as confusing. Fewer people had high anxiety about the preparation than about the procedure and its findings. There are unique predictors of anxiety about each colonoscopy aspect. Understanding the nuanced differences in aspects of anxiety may help in designing strategies to reduce anxiety, leading to improved acceptance of the procedure, better compliance with preparation instructions, and less discomfort with the procedure.

  6. Factor Analysis Using "R"

    Directory of Open Access Journals (Sweden)

    A. Alexander Beaujean

    2013-02-01

    Full Text Available R (R Development Core Team, 2011) is a very powerful tool for analyzing data that is gaining in popularity due to its cost (it is free) and flexibility (it is open-source). This article gives a general introduction to using R (i.e., loading the program, using functions, importing data). Then, using data from Canivez, Konold, Collins, and Wilson (2009), this article walks the user through how to use the program to conduct factor analysis, from both an exploratory and a confirmatory approach.

  7. Interactions between toxic chemicals and natural environmental factors--a meta-analysis and case studies.

    Science.gov (United States)

    Laskowski, Ryszard; Bednarska, Agnieszka J; Kramarz, Paulina E; Loureiro, Susana; Scheil, Volker; Kudłek, Joanna; Holmstrup, Martin

    2010-08-15

    The paper addresses problems arising from the effects of natural environmental factors on the toxicity of pollutants to organisms. Most studies on interactions between toxicants and natural factors, including those completed in the EU project NoMiracle (Novel Methods for Integrated Risk Assessment of Cumulative Stressors in Europe) described herein, showed that the effects of toxic chemicals on organisms can differ vastly depending purely on external conditions. We compiled data from 61 studies on the effects of temperature, moisture and dissolved oxygen on the toxicity of a range of chemicals representing pesticides, polycyclic aromatic hydrocarbons, plant protection products of bacterial origin and trace metals. In 62.3% of cases, significant interactions between natural factors and chemicals were found, reaching 100% for the effect of dissolved oxygen on the toxicity of waterborne chemicals. The meta-analysis of the 61 studies showed that the null hypothesis assuming no interactions between toxic chemicals and natural environmental factors should be rejected at p=2.7 x 10^(-82) (truncated product method probability). In a few cases of more complex experimental designs, second-order interactions were also found, indicating that natural factors can modify interactions among chemicals. Such data emphasize the necessity of including information on natural factors and their variation in time and across geographic regions in ecological risk assessment. This can be done only if appropriate ecotoxicological test designs are used, in which test organisms are exposed to toxicants at a range of environmental conditions. We advocate designing such tests for the second-tier ecological risk assessment procedures. Copyright 2010 Elsevier B.V. All rights reserved.
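    In a factorial test design of the kind advocated here, the interaction between a toxicant and a natural factor is the departure from additivity of the two main effects. A minimal sketch with hypothetical survival fractions (not data from the compiled studies):

```python
def interaction_contrast(y00, y01, y10, y11):
    """Interaction in a 2x2 design (environmental factor x toxicant):
    how much the toxicant's effect changes between the reference and the
    stressful environmental condition; 0 means purely additive effects."""
    return (y11 - y10) - (y01 - y00)

# Hypothetical survival fractions: the toxicant roughly halves survival at a
# benign temperature but nearly wipes the population out at a stressful one.
effect = interaction_contrast(y00=0.95, y01=0.50, y10=0.90, y11=0.10)
```

    A strongly negative contrast like this one is exactly the situation in which a single-condition toxicity test would understate risk.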

  8. Clinical Outcomes of Root Reimplantation and Bentall Procedure: Propensity Score Matching Analysis.

    Science.gov (United States)

    Lee, Heemoon; Cho, Yang Hyun; Sung, Kiick; Kim, Wook Sung; Park, Kay-Hyun; Jeong, Dong Seop; Park, Pyo Won; Lee, Young Tak

    2018-03-26

    This study aimed to evaluate the clinical outcomes of aortic root replacement (ARR) surgery: root reimplantation as valve-sparing root replacement (VSR) and the Bentall procedure. We retrospectively reviewed 216 patients who underwent ARR between 1995 and 2013 at Samsung Medical Center. Patients were divided into two groups, depending on the procedure they underwent: Bentall (n=134) and VSR (n=82). The mean follow-up duration was 100.9±56.4 months. There were 2 early deaths in the Bentall group and none in the VSR group (p=0.53). Early morbidities were not different between the groups. Overall mortality was significantly lower in the VSR group (HR=0.12, p=0.04). Despite the higher reoperation rate in the VSR group (p=0.03), major adverse valve-related events (MAVRE) did not differ between the groups (p=0.28). Bleeding events were significantly higher in the Bentall group during follow-up (10 in the Bentall group, 0 in the VSR group, p=0.04). There were 6 thromboembolic events, all in the Bentall group (p=0.11). We performed a propensity score matching analysis comparing the groups (134 Bentall vs 43 VSR). The matched analysis gave similar results, i.e., HR=0.17 and p=0.10 for overall mortality and HR=1.01 and p=0.99 for MAVRE. Although there was only marginal significance in the propensity-matched analysis, it is plausible to anticipate a survival benefit with VSR during long-term follow-up. Despite a higher reoperation rate for aortic valves, VSR can be a viable option in patients who decline life-long anticoagulation, especially the young or patients in whom anticoagulation is contraindicated. Copyright © 2018. Published by Elsevier Inc.
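    The matching step in a propensity score analysis can be sketched as greedy 1:1 nearest-neighbour matching within a caliper. The scores and identifiers below are hypothetical, not the study's data:

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on the propensity score.
    `treated` and `controls` are (id, score) pairs; each control is used
    at most once, and matches outside the caliper are discarded."""
    available = dict(controls)
    pairs = []
    for t_id, t_score in sorted(treated, key=lambda p: p[1]):
        best = min(available.items(),
                   key=lambda c: abs(c[1] - t_score),
                   default=None)
        if best is not None and abs(best[1] - t_score) <= caliper:
            pairs.append((t_id, best[0]))
            del available[best[0]]
    return pairs

# Hypothetical propensity scores: VSR patients matched to Bentall controls.
vsr = [("v1", 0.31), ("v2", 0.62), ("v3", 0.90)]
bentall = [("b1", 0.30), ("b2", 0.60), ("b3", 0.58), ("b4", 0.10)]
pairs = greedy_match(vsr, bentall)
```

    Patients without an acceptable counterpart are dropped, which is why a matched comparison (here 134 vs 43) is typically smaller than the full cohort.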

  9. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    Science.gov (United States)

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). For most of these procedures, however, one drawback is that they involve estimation of both regression parameters and baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not need estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study is conducted and indicates that they work well for practical situations.

  10. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  11. Incidence, prognostic factors and impact of postoperative delirium after major vascular surgery: A meta-analysis and systematic review.

    Science.gov (United States)

    Aitken, Sarah Joy; Blyth, Fiona M; Naganathan, Vasi

    2017-10-01

    Although postoperative delirium is a common complication and increases patient care needs, little is known about the predictors and outcomes of delirium in patients having vascular surgery. This review aimed to determine the incidence, prognostic factors and impact of postoperative delirium in vascular surgical patients. MEDLINE and EMBASE were systematically searched for articles published between January 2000 and January 2016 on delirium after vascular surgery. The primary outcome was the incidence of delirium. Secondary outcomes were contributing prognostic factors and the impact of delirium. Study quality and risk of bias were assessed using the QUIPS tool for systematic reviews of prognostic studies, and MOOSE guidelines for reviews of observational studies. Quantitative analyses of extracted data were conducted using meta-analysis where possible to determine the incidence of delirium and prognostic factors. A qualitative review of outcomes was performed. Fifteen articles were eligible for inclusion. Delirium incidence ranged between 5% and 39%. Meta-analysis found that patients with delirium were older than those without delirium (OR 3.6). Prognostic factors for delirium included increased age (OR 1.04). Data were limited on the impact of procedure complexity, endovascular compared to open surgery, or type of anaesthetic. Postoperative delirium occurs frequently, resulting in major morbidity for vascular patients. Improved quality of prognostic studies may identify modifiable peri-operative factors to improve quality of care for vascular surgical patients.

  12. Surgeon and type of anesthesia predict variability in surgical procedure times.

    Science.gov (United States)

    Strum, D P; Sampson, A R; May, J H; Vargas, L G

    2000-05-01

    Variability in surgical procedure times increases the cost of healthcare delivery by increasing both the underutilization and overutilization of expensive surgical resources. To reduce variability in surgical procedure times, we must identify and study its sources. Our data set consisted of all surgeries performed over a 7-yr period at a large teaching hospital, resulting in 46,322 surgical cases. To study factors associated with variability in surgical procedure times, data mining techniques were used to segment and focus the data so that the analyses would be both technically and intellectually feasible. The data were subdivided into 40 representative segments of manageable size and variability based on headers adopted from the common procedural terminology classification. Each data segment was then analyzed using a main-effects linear model to identify and quantify specific sources of variability in surgical procedure times. The single most important source of variability in surgical procedure times was surgeon effect. Type of anesthesia, age, gender, and American Society of Anesthesiologists risk class were additional sources of variability. Intrinsic case-specific variability, unexplained by any of the preceding factors, was found to be highest for shorter surgeries relative to longer procedures. Variability in procedure times among surgeons was a multiplicative function (proportionate to time) of surgical time and total procedure time, such that as procedure times increased, variability in surgeons' surgical time increased proportionately. Surgeon-specific variability should be considered when building scheduling heuristics for longer surgeries. Results concerning variability in surgical procedure times due to factors such as type of anesthesia, age, gender, and American Society of Anesthesiologists risk class may be extrapolated to scheduling in other institutions, although specifics on individual surgeons may not. 
This research identifies factors associated
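    The multiplicative (proportionate-to-time) form of surgeon variability described above implies a roughly constant coefficient of variation across procedure durations. A sketch with hypothetical procedure times, not the study's 46,322-case data set:

```python
import statistics

def coefficient_of_variation(times):
    """CV = standard deviation / mean; under multiplicative variability
    the CV stays roughly constant as procedure durations grow."""
    return statistics.stdev(times) / statistics.mean(times)

# Hypothetical procedure times (minutes) for two surgeons; surgeon B's
# cases have the same shape as surgeon A's but are twice as long.
surgeon_a = [60, 66, 54, 72, 63]
surgeon_b = [t * 2 for t in surgeon_a]
```

    Equal CVs despite doubled durations is the signature of variability that scales with procedure time, which is why surgeon-specific spread matters most when scheduling long cases.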

  13. Human factor analysis and preventive countermeasures in nuclear power plant

    International Nuclear Information System (INIS)

    Li Ye

    2010-01-01

    Based on human error analysis theory and the characteristics of maintenance in a nuclear power plant, the human factors of maintenance in an NPP are divided into three areas: human, technology, and organization. Human factors are defined as individual factors, including psychological factors, physiological characteristics, health status, level of knowledge, and interpersonal skills. Technical factors include technology, equipment, tools, working order, etc. Organizational factors include management, information exchange, education, working environment, team building, and leadership, etc. The analysis found that organizational factors can directly or indirectly affect the behavior of staff as well as the technical factors, and are the most basic human error factors. On this basis, countermeasures for reducing human error in nuclear power plants are proposed. (authors)

  14. Microbial ecology laboratory procedures manual NASA/MSFC

    Science.gov (United States)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  15. 32 CFR 162.6 - Procedures.

    Science.gov (United States)

    2010-07-01

    ... Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DEFENSE CONTRACTING PRODUCTIVITY ENHANCING CAPITAL INVESTMENT (PECI) § 162.6 Procedures. The following procedures shall be followed by the DoD Components in the... documentation, pre-investment analysis, financing, and post-investment accountability of PECI projects, when DoD...

  16. Hemodynamic outcomes of the Ross procedure versus other aortic valve replacement: a systematic review and meta-analysis.

    Science.gov (United States)

    Um, Kevin J; McCLURE, Graham R; Belley-Cote, Emilie P; Gupta, Saurabh; Bouhout, Ismail; Lortie, Hugo; Alraddadi, Hatim; Alsagheir, Ali; Bossard, Matthias; McINTYRE, William F; Lengyel, Alexandra; Eikelboom, John W; Ouzounian, Maral; Chu, Michael W; Parry, Dominic; El-Hamamsy, Ismail; Whitlock, Richard P

    2018-01-09

    Life expectancy in young adults undergoing mechanical or bioprosthetic aortic valve replacement (AVR) may be reduced by up to 20 years compared to age-matched controls. The Ross procedure is a durable, anticoagulation-sparing alternative. We performed a systematic review and meta-analysis to compare the valve hemodynamics of the Ross procedure versus other AVR. We searched Cochrane CENTRAL, MEDLINE and EMBASE from inception to February 2017 for randomized controlled trials (RCTs) and observational studies (n≥10 Ross). Independently and in duplicate, we performed title and abstract screening, full-text eligibility assessment, and data collection. We evaluated the risk of bias with the Cochrane and CLARITY tools, and the quality of evidence with the GRADE framework. We identified 2 RCTs and 13 observational studies that met the eligibility criteria (n=1,412). In observational studies, the Ross procedure was associated with a lower mean aortic gradient at discharge (MD -9 mmHg, 95% CI [-13, -5]). The Ross procedure was also associated with a lower mean gradient at latest follow-up (MD -15 mmHg, 95% CI [-32, 2], p=0.08, I2=99%). The mean pulmonic gradient for the Ross procedure was 18.0 mmHg (95% CI [16, 20]). Overall, the Ross procedure was associated with better aortic valve hemodynamics. Future studies should evaluate the impact of the Ross procedure on exercise capacity and quality of life.

  17. Investigation of the factor structure of the Wechsler Adult Intelligence Scale--Fourth Edition (WAIS-IV): exploratory and higher order factor analyses.

    Science.gov (United States)

    Canivez, Gary L; Watkins, Marley W

    2010-12-01

    The present study examined the factor structure of the Wechsler Adult Intelligence Scale--Fourth Edition (WAIS-IV; D. Wechsler, 2008a) standardization sample using exploratory factor analysis, multiple factor extraction criteria, and higher order exploratory factor analysis (J. Schmid & J. M. Leiman, 1957) not included in the WAIS-IV Technical and Interpretation Manual (D. Wechsler, 2008b). Results indicated that the WAIS-IV subtests were properly associated with the theoretically proposed first-order factors, but all but one factor-extraction criterion recommended extraction of one or two factors. Hierarchical exploratory analyses with the Schmid and Leiman procedure found that the second-order g factor accounted for large portions of total and common variance, whereas the four first-order factors accounted for small portions of total and common variance. It was concluded that the WAIS-IV provides strong measurement of general intelligence, and clinical interpretation should be primarily at that level.
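    The Schmid-Leiman procedure referenced above apportions subtest variance between a second-order g factor and residualized first-order factors. A minimal numpy sketch of that orthogonalization (the loadings below are illustrative, not WAIS-IV values):

```python
import numpy as np

def schmid_leiman(first_order, second_order):
    """Schmid-Leiman orthogonalization of a higher-order factor solution.

    first_order: subtests x group-factors loading matrix.
    second_order: loadings of the group factors on the general factor g.
    Returns (subtest loadings on g, residualized group-factor loadings).
    """
    l1 = np.asarray(first_order, dtype=float)
    l2 = np.asarray(second_order, dtype=float)
    g = l1 @ l2                            # subtest loadings on g
    resid = l1 * np.sqrt(1.0 - l2**2)      # group loadings with g partialed out
    return g, resid
```

    For a subtest loading 0.8 on a group factor that itself loads 0.7 on g, the g loading is 0.56 and the residualized group loading is 0.8 x sqrt(1 - 0.49).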

  18. Effect of Previous Abdominal Surgery on Laparoscopic Liver Resection: Analysis of Feasibility and Risk Factors for Conversion.

    Science.gov (United States)

    Cipriani, Federica; Ratti, Francesca; Fiorentini, Guido; Catena, Marco; Paganelli, Michele; Aldrighetti, Luca

    2018-03-28

    Previous abdominal surgery has traditionally been considered an additional element of difficulty for subsequent laparoscopic procedures. The aim of this study was to analyze the effect of previous surgery on the feasibility and safety of laparoscopic liver resection (LLR), and its role as a risk factor for conversion. After matching, 349 LLR in patients with previous abdominal surgery (PS group) were compared with 349 LLR in patients with a virgin abdomen (NPS group). Subgroup analysis included 161 patients with previous upper abdominal surgery (UPS subgroup). Feasibility and safety were evaluated in terms of conversion rate, reasons for conversion, outcomes, and risk factors for conversion assessed via uni- and multivariable analysis. The conversion rate was 9.4%, and was higher for PS patients than for NPS patients (13.7% versus 5.1%, P = .021). Difficult adhesiolysis was the most common reason for conversion in the PS group (5.7%). However, operative time (P = .840), blood loss (P = .270), transfusion (P = .650), morbidity rate (P = .578), hospital stay (P = .780), and R1 rate (P = .130) were comparable between the PS and NPS groups. Subgroup analysis confirmed higher conversion rates for UPS patients (23%) compared with both NPS (P = .015) and PS patients (P = .041). Previous surgery emerged as an independent risk factor for conversion (P = .033), alongside posterosuperior location and major hepatectomy. LLR is feasible in patients with previous surgery and proved safe, maintaining the benefits of LLR carried out in standard settings. However, a history of surgery should be considered a risk factor for conversion.

  19. EFFICIENCY OF MOMENT AMPLIFICATION PROCEDURES FOR THE SECOND-ORDER ANALYSIS OF STEEL FRAMES

    OpenAIRE

    Paschal Chiadighikaobi

    2018-01-01

    A beam-column is a member subjected to combined axial compression and bending. The secondary moment, an additional moment induced by the axial load acting on the deflected member, was accounted for in the design. Results from two analysis tools (SAP2000 and a Java implementation) were compared. The moment amplification factor Af was input to the Java code, but Af produced no change in the Java code's outputs. There are many different ways to apply amplification factors to first-order analysis results, each with...
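    For context, one common textbook form of the moment amplification factor applied to first-order analysis results is Cm / (1 - Pu/Pe), bounded below by 1.0. The sketch below uses that form; the paper's Af may be defined differently:

```python
def moment_amplification(cm, pu, pe):
    """Moment amplification factor Cm / (1 - Pu/Pe), bounded below by 1.0.

    cm: equivalent uniform moment factor; pu: required axial compression;
    pe: Euler buckling load of the member.
    """
    if pu >= pe:
        raise ValueError("axial load must be below the Euler buckling load")
    return max(1.0, cm / (1.0 - pu / pe))

def amplified_moment(m_first_order, cm, pu, pe):
    """Second-order moment estimated from a first-order analysis result."""
    return moment_amplification(cm, pu, pe) * m_first_order
```

    At half the Euler load with cm = 1.0 the first-order moment is doubled; when the raw ratio falls below 1.0, the factor is clamped to 1.0.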

  20. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

    This report describes a task analysis methodology for advanced HSI designs. Task analyses were performed using procedure-based hierarchical task analysis and task decomposition methods, and the results were recorded in a database. Using the task analysis results, we developed a static prototype of the advanced HSI, together with human factors engineering verification and validation methods for evaluating the prototype. In addition to the procedure-based task analysis methods, workload estimation based on analysis of task performance time, as well as analyses for the design of information and interaction structures, will be necessary.

  1. Multielemental analysis of 18 essential and toxic elements in amniotic fluid samples by ICP-MS: Full procedure validation and estimation of measurement uncertainty.

    Science.gov (United States)

    Markiewicz, B; Sajnóg, A; Lorenc, W; Hanć, A; Komorowicz, I; Suliburska, J; Kocyłowski, R; Barałkiewicz, D

    2017-11-01

    Amniotic fluid is a substantial factor in the development of the embryo and fetus: water and solutes contained in it penetrate the fetal membranes hydrostatically and osmotically, and the fluid is also swallowed by the fetus. The elemental composition of amniotic fluid influences the growth and health of the fetus; its analysis is therefore important, because the results can indicate abnormal levels of minerals or toxic elements. Inductively coupled plasma mass spectrometry (ICP-MS) is often used for determination of trace- and ultra-trace-level elements in a wide range of matrices, including biological samples, because of its unique analytical capabilities. At trace and ultra-trace levels, detailed characteristics of the analytical procedure, as well as the properties of the analytical result, are particularly important. The purpose of this study was to develop a new analytical procedure for multielemental analysis of 18 elements (Al, As, Ba, Ca, Cd, Co, Cr, Cu, Mg, Mn, Ni, Pb, Sb, Se, Sr, U, V and Zn) in amniotic fluid samples using ICP-MS. A dynamic reaction cell (DRC) with two reaction gases, ammonia and oxygen, was used to eliminate spectral interferences. Detailed validation was conducted using 3 certified reference materials (CRMs) and real amniotic fluid samples collected from patients. Repeatability for all analytes ranged from 0.70% to 8.0%, and intermediate precision from 1.3% to 15%. Trueness, expressed as recovery, ranged from 80% to 125%. Traceability was assured through the analyses of CRMs. Uncertainty of the results was also evaluated using a single-laboratory validation approach. The expanded uncertainty (U) for the CRMs, expressed as a percentage of the analyte concentration, was found to be between 8.3% for V and 45% for Cd. The standard uncertainty of the precision was found to have the greater influence on the combined standard uncertainty.
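    The expanded uncertainty U quoted above is conventionally obtained by combining independent standard-uncertainty components in quadrature and applying a coverage factor k (k = 2 for roughly 95% coverage). A minimal sketch of that arithmetic (function names are ours):

```python
import math

def expanded_uncertainty(components, k=2.0):
    """Combine independent standard-uncertainty components in quadrature,
    then apply a coverage factor k (k = 2 gives ~95% coverage)."""
    return k * math.sqrt(sum(u**2 for u in components))

def relative_expanded_uncertainty(concentration, components, k=2.0):
    """Expanded uncertainty expressed as a percentage of the measured
    concentration, as reported for the CRMs in this study."""
    return 100.0 * expanded_uncertainty(components, k) / concentration
```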

  2. Deterministic effects of interventional radiology procedures

    International Nuclear Information System (INIS)

    Shope, Thomas B.

    1997-01-01

    The purpose of this paper is to describe deterministic radiation injuries reported to the Food and Drug Administration (FDA) that resulted from therapeutic, interventional procedures performed under fluoroscopic guidance, and to investigate the procedure- or equipment-related factors that may have contributed to the injuries. Reports submitted to the FDA under both mandatory and voluntary reporting requirements that described radiation-induced skin injuries from fluoroscopy were investigated. Serious skin injuries, including moist desquamation and tissue necrosis, have occurred since 1992. These injuries resulted from a variety of interventional procedures requiring extended periods of fluoroscopy compared with typical diagnostic procedures. Facilities conducting therapeutic interventional procedures need to be aware of the potential for patient radiation injury and take appropriate steps to limit the potential for injury. (author)

  3. Concrete containment integrity software: Procedure manual and guidelines

    International Nuclear Information System (INIS)

    Dameron, R.A.; Dunham, R.S.; Rashid, Y.R.

    1990-06-01

    This report is an executive summary describing the concrete containment analysis methodology and software developed in EPRI-sponsored research to predict the overpressure behavior and leakage of concrete containments. A set of guidelines has been developed for performing reliable 2D axisymmetric concrete containment analysis with a cracking-concrete constitutive model developed by ANATECH. The software package developed during this research phase is designed for use in conjunction with ABAQUS-EPGEN; it provides the concrete model and automates axisymmetric grid preparation and rebar generation for 2D and 3D grids. The software offers the option of generating pre-programmed axisymmetric grids that can be tailored to a specific containment by input of a few geometry parameters. The goal of simplified axisymmetric analysis within the framework of the containment leakage prediction methodology is to compute global liner strain histories at various locations within the containment. A simplified approach for generating peak liner strains at structural discontinuities as a function of the global liner strains has been presented in a separate leakage criteria document; the curves for strain magnification factors and liner stress triaxiality factors found in that document are intended to be applied to the global liner strain histories developed through global 2D analysis. This report summarizes the procedures for global 2D analysis and gives an overview of the constitutive model and the special-purpose concrete containment analysis software developed in this research phase. 8 refs., 10 figs

  4. Analysis of the Impact of Transparency, Corruption, Openness in Competition and Tender Procedures on Public Procurement in the Czech Republic

    Directory of Open Access Journals (Sweden)

    František Ochrana

    2014-01-01

    Full Text Available This study analyses the impact of transparency and openness to competition in public procurement in the Czech Republic. The problems of the Czech procurement market have been demonstrated on the analysis of a sample of contracts awarded by local government entities. From among a set of factors influencing the efficiency of public procurement, we closely analyse transparency, resilience against corruption, openness, effective administrative award procedure, and formulation of appropriate evaluation criteria for selecting the most suitable bid. Some assumptions were confirmed, including a positive effect of open procedures on the level of competition on the supply side as well as the dominant use of price criteria only. The latter case is probably often caused by low skills of workers at the contracting entities, as well as the lack of resources in public budgets. However, we have to reject the persistent legend of “undershooting” tender prices and subsequently increasing the final prices of public contracts. Increases of final prices are very limited. Based on the results of the analyses presented, we argue that the main problem of the Czech public procurement market lies in a rather low competence of administrators who are not able to use non-price criteria more often.

  5. Estimating the Cost of Neurosurgical Procedures in a Low-Income Setting: An Observational Economic Analysis.

    Science.gov (United States)

    Abdelgadir, Jihad; Tran, Tu; Muhindo, Alex; Obiga, Doomwin; Mukasa, John; Ssenyonjo, Hussein; Muhumza, Michael; Kiryabwire, Joel; Haglund, Michael M; Sloan, Frank A

    2017-05-01

    There are no data on cost of neurosurgery in low-income and middle-income countries. The objective of this study was to estimate the cost of neurosurgical procedures in a low-resource setting to better inform resource allocation and health sector planning. In this observational economic analysis, microcosting was used to estimate the direct and indirect costs of neurosurgical procedures at Mulago National Referral Hospital (Kampala, Uganda). During the study period, October 2014 to September 2015, 1440 charts were reviewed. Of these patients, 434 had surgery, whereas the other 1006 were treated nonsurgically. Thirteen types of procedures were performed at the hospital. The estimated mean cost of a neurosurgical procedure was $542.14 (standard deviation [SD], $253.62). The mean cost of different procedures ranged from $291 (SD, $101) for burr hole evacuations to $1,221 (SD, $473) for excision of brain tumors. For most surgeries, overhead costs represented the largest proportion of the total cost (29%-41%). This is the first study using primary data to determine the cost of neurosurgery in a low-resource setting. Operating theater capacity is likely the binding constraint on operative volume, and thus, investing in operating theaters should achieve a higher level of efficiency. Findings from this study could be used by stakeholders and policy makers for resource allocation and to perform economic analyses to establish the value of neurosurgery in achieving global health goals. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Analysis of assistance procedures to normal birth in primiparous

    Directory of Open Access Journals (Sweden)

    Joe Luiz Vieira Garcia Novo

    2016-04-01

    Full Text Available Introduction: Current medical technologies in childbirth care have increased maternal and fetal benefits, yet numerous unnecessary procedures persist. The purpose of normal childbirth care is to have healthy women and newborns, using a minimum of safe interventions. Objective: To analyze assistance to normal delivery in a secondary care maternity. Methodology: A total of 100 primiparous mothers who had vaginal delivery were included, and the care practices used were categorized (1) according to the WHO classification for assistance to normal childbirth: effective, harmful, used with caution, and used inappropriately; and (2) against the Bologna Index parameters: presence of a birth partner, partograph, no stimulation of labor, delivery in a non-supine position, and mother-newborn skin-to-skin contact. Results: Birth partners (85%), correctly filled partographs (62%), mother-newborn skin-to-skin contact (36%), use of oxytocin (87%), use of parenteral nutrition during labor (86%) and at delivery (74%), episiotomy (94%), and uterine fundal pressure in the expulsion stage (58%). The overall average Bologna Index of the mothers analyzed was 1.95. Conclusions: Some effective procedures recommended by WHO were complied with (presence of a birth partner), some effective and mandatory practices were not (completely filled partograph), potentially harmful or ineffective procedures were used (oxytocin in labor/post-partum), as well as inadequate procedures (uterine fundal pressure during the expulsion stage, use of forceps, and episiotomy). The maternity's care model did not offer procedures of excellence in natural birth to its primiparous mothers (BI = 1.95).

  7. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event; the combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has the advantage that it does not treat operator error as the sole contributor to human failure within a system, but as the result of a combination of underlying factors.

  8. Relationship of Genetic Variants With Procedural Pain, Anxiety, and Distress in Children.

    Science.gov (United States)

    Ersig, Anne L; Schutte, Debra L; Standley, Jennifer; Leslie, Elizabeth; Zimmerman, Bridget; Kleiber, Charmaine; Hanrahan, Kirsten; Murray, Jeffrey C; McCarthy, Ann Marie

    2017-05-01

    This study used a candidate gene approach to examine genomic variation associated with pain, anxiety, and distress in children undergoing a medical procedure. Children aged 4-10 years having an IV catheter insertion were recruited from three Midwestern children's hospitals. Self-report measures of pain, anxiety, and distress were obtained, as well as an observed measure of distress. Samples were collected from children and biological parents for analysis of genomic variation. Genotyped variants had known or suspected association with phenotypes of interest. Analyses included child-only association and family-based transmission disequilibrium tests. Genotype and phenotype data were available from 828 children and 376 family trios. Children were 50% male, had a mean age of 7.2 years, and were 84% White/non-Hispanic. In family-based analysis, one single-nucleotide polymorphism (SNP; rs1143629, interleukin 1β [IL1B]) was associated with observed child distress at Bonferroni-corrected levels of significance (p = .00013), while two approached significance for association with high state anxiety (rs6330, nerve growth factor beta subunit [NGFB]) and high trait anxiety (rs6265, brain-derived neurotrophic factor [BDNF]). In the child-only analysis, multiple SNPs showed nominal evidence of relationships with phenotypes of interest. rs6265 (BDNF) and rs2941026 (cholecystokinin B receptor) had possible relationships with trait anxiety in child-only and family-based analyses. Exploring genomic variation furthers our understanding of pain, anxiety, and distress and facilitates genomic screening to identify children at high risk of procedural pain, anxiety, and distress. Combined with clinical observations and knowledge, such explorations could help guide tailoring of interventions to limit procedure-related distress and identify genes and pathways of interest for future genotype-phenotype studies.

  9. Analysis of decision procedures for a sequence of inventory periods

    International Nuclear Information System (INIS)

    Avenhaus, R.

    1982-07-01

    Optimal test procedures for a sequence of inventory periods will be discussed. Starting with a game theoretical description of the conflict situation between the plant operator and the inspector, the objectives of the inspector as well as the general decision theoretical problem will be formulated. In the first part the objective of 'secure' detection will be emphasized which means that only at the end of the reference time a decision is taken by the inspector. In the second part the objective of 'timely' detection will be emphasized which will lead to sequential test procedures. At the end of the paper all procedures will be summarized, and in view of the multitude of procedures available at the moment some comments about future work will be given. (orig./HP) [de

  10. The plant operating procedure information modeling system for creation and maintenance of procedures

    International Nuclear Information System (INIS)

    Fanto, S.V.; Petras, D.S.; Reiner, R.T.; Frost, D.R.; Orendi, R.G.

    1990-01-01

    As a result of the accident at Three Mile Island, regulatory requirements were issued to upgrade emergency operating procedures (EOPs) for nuclear power plants. The use of human-factored, function-oriented EOPs was mandated to improve human reliability and to mitigate the consequences of a broad range of initiating events, subsequent failures, and operator errors, without having to first diagnose the specific events. The Westinghouse Owners Group responded by developing the Emergency Response Guidelines (ERGs) in a human-factored, two-column format to aid the transfer of the improved technical information to the operator during transients and accidents. The ERGs are a network of 43 interrelated guidelines which specify operator actions to be taken during plant emergencies to restore the plant to a safe and stable condition. Each utility then translates these guidelines into plant-specific EOPs. The creation and maintenance of this large web of interconnecting ERGs/EOPs is an extremely complex task. To aid procedure documentation specialists with this time-consuming and tedious task, the Plant Operating Procedure Information Modeling system was developed to provide a controlled and consistent means to build and maintain the ERGs/EOPs and their supporting documentation.

  11. Applications of factor analysis to electron and ion beam surface techniques

    International Nuclear Information System (INIS)

    Solomon, J.S.

    1987-01-01

    Factor analysis, a mathematical technique for extracting chemical information from matrices of data, is used to enhance Auger electron spectroscopy (AES), core level electron energy loss spectroscopy (EELS), ion scattering spectroscopy (ISS), and secondary ion mass spectrometry (SIMS) in studies of interfaces, thin films, and surfaces. Several examples of factor analysis enhancement of chemical bonding variations in thin films and at interfaces studied with AES and SIMS are presented. Factor analysis is also shown to be of great benefit in quantifying the electron and ion beam doses required to induce surface damage. Finally, examples are presented of the use of factor analysis to reconstruct elemental profiles when peaks of interest overlap each other during the course of depth profile analysis. (author)
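    The kind of factor analysis described here treats a set of spectra as a data matrix and asks how many abstract components are needed to reproduce it. A minimal SVD-based sketch of that rank-estimation step (the variance threshold is illustrative, not a value from the paper):

```python
import numpy as np

def significant_components(data, var_threshold=0.99):
    """Estimate how many abstract factors are needed to reproduce a data
    matrix (rows: spectra, e.g. sputter-depth steps; columns: channels).

    Returns the smallest rank whose singular values capture the requested
    fraction of total (column-centered) variance.
    """
    x = np.asarray(data, dtype=float)
    x = x - x.mean(axis=0)                       # center each channel
    s = np.linalg.svd(x, compute_uv=False)       # singular values, descending
    explained = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(explained, var_threshold) + 1)
```

    In depth-profile work, the retained components would then be rotated against target spectra of candidate chemical states.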

  12. Methods for testing the logical structure of plant procedure documents

    International Nuclear Information System (INIS)

    Horne, C.P.; Colley, R.; Fahley, J.M.

    1990-01-01

    This paper describes an ongoing EPRI project to investigate computer-based methods to improve the development, maintenance, and verification of plant operating procedures. The project began as an evaluation of the applicability of structured software analysis methods to operating procedures. It was found that these methods offer benefits if procedures are transformed to a structured representation, which makes them amenable to computer analysis. The next task was to investigate methods for transforming procedures into a structured representation. The use of natural language techniques to read and compile the procedure documents appears to be viable for this purpose and supports conformity to guidelines. The final task was to consider possibilities for automated verification of procedures. Methods to help verify procedures were defined and their information requirements specified; these methods take the structured representation of procedures as input. The software system being constructed in this project is called PASS, standing for Procedures Analysis Software System.

  13. An SPSS R-Menu for Ordinal Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mario Basto

    2012-01-01

    Full Text Available Exploratory factor analysis is a widely used statistical technique in the social sciences. It attempts to identify underlying factors that explain the pattern of correlations within a set of observed variables. A statistical software package is needed to perform the calculations. However, there are some limitations with popular statistical software packages, like SPSS. The R programming language is a free software package for statistical and graphical computing. It offers many packages written by contributors from all over the world, and programming resources that allow it to overcome the dialog limitations of SPSS. This paper offers an SPSS dialog written in the R programming language with the help of some packages, so that researchers with little or no knowledge of programming, or those accustomed to making their calculations through statistical dialogs, have more options when applying factor analysis to their data and hence can adopt a better approach when dealing with ordinal, Likert-type data.
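    For ordinal, Likert-type data, a common tactic is to factor a rank-based correlation matrix rather than the Pearson matrix. A minimal numpy sketch of that idea, using Spearman correlations and single-factor principal-axis factoring (a generic illustration, not the SPSS/R dialog's implementation):

```python
import numpy as np

def spearman_corr(data):
    """Spearman correlation: Pearson correlation of rank-transformed columns
    (simple version; assumes no ties within a column)."""
    ranks = np.argsort(np.argsort(data, axis=0), axis=0).astype(float)
    return np.corrcoef(ranks, rowvar=False)

def one_factor_paf(corr, n_iter=100):
    """Principal-axis loadings for a single common factor, iterating the
    communalities on the diagonal of the reduced correlation matrix."""
    corr = np.asarray(corr, dtype=float)
    r = corr.copy()
    comm = 1.0 - 1.0 / np.diag(np.linalg.inv(corr))   # SMC start values
    for _ in range(n_iter):
        np.fill_diagonal(r, comm)
        vals, vecs = np.linalg.eigh(r)
        load = vecs[:, -1] * np.sqrt(max(vals[-1], 0.0))
        comm = load ** 2
    return np.abs(load)
```

    For a 4-variable matrix with all off-diagonal correlations 0.49 (a one-factor model with equal loadings), the iteration converges to loadings of 0.7.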

  14. Electrically evoked compound action potentials artefact rejection by independent component analysis: procedure automation.

    Science.gov (United States)

    Akhoun, Idrick; McKay, Colette; El-Deredy, Wael

    2015-01-15

    Independent component analysis (ICA) successfully separated electrically evoked compound action potentials (ECAPs) from the stimulation artefact and noise (ECAP-ICA; Akhoun et al., 2013). This paper shows how to automate the ECAP-ICA artefact cancellation process. Raw ECAPs without artefact rejection were consecutively recorded for each stimulation condition from at least 8 intra-cochlear electrodes. First, amplifier-saturated recordings were discarded, and the data from different stimulus conditions (different current levels) were concatenated temporally. The key aspect of the automation procedure was the sequential deductive source categorisation after ICA was applied with a restriction to 4 sources. The stereotypical aspect of the 4 sources enables their automatic classification as two artefact components, a noise component, and the sought ECAP, based on theoretical and empirical considerations. The automatic procedure was tested using 8 cochlear implant (CI) users and one to four stimulus electrodes. The artefact and noise sources were successively identified and discarded, leaving the ECAP as the remaining source. The automated ECAP-ICA procedure successfully extracted the correct ECAPs, compared with the standard clinical forward-masking paradigm, in 22 out of 26 cases. ECAP-ICA does not require extracting the ECAP from a combination of distinct buffers, as is the case with regular methods. It is an alternative that does not have the possible bias of traditional artefact rejections such as alternate-polarity or forward-masking paradigms. The ECAP-ICA procedure bears clinical relevance, for example as the artefact rejection sub-module of automated ECAP-threshold detection techniques, which are common features of CI clinical fitting software. Copyright © 2014. Published by Elsevier B.V.
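    The ICA separation step at the heart of the procedure can be sketched as follows. This is a generic symmetric FastICA with a tanh nonlinearity, not the authors' exact pipeline, and their deductive source-categorisation heuristics are not reproduced here:

```python
import numpy as np

def fastica(x, n_components, n_iter=200, seed=0):
    """Minimal symmetric FastICA with a tanh nonlinearity.

    x: (samples x channels) data; returns (samples x components) sources.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean(axis=0)
    # whiten via eigendecomposition of the channel covariance
    d, e = np.linalg.eigh(np.cov(x, rowvar=False))
    idx = np.argsort(d)[::-1][:n_components]
    z = x @ (e[:, idx] / np.sqrt(d[idx]))          # whitened data
    rng = np.random.default_rng(seed)
    w, _ = np.linalg.qr(rng.normal(size=(n_components, n_components)))
    for _ in range(n_iter):
        y = z @ w.T
        g, g_prime = np.tanh(y), 1.0 - np.tanh(y) ** 2
        w_new = (g.T @ z) / len(z) - np.diag(g_prime.mean(axis=0)) @ w
        u, _, vt = np.linalg.svd(w_new)            # symmetric decorrelation
        w = u @ vt
    return z @ w.T
```

    Recovered sources come out in arbitrary order and sign, which is why a subsequent categorisation step (artefact, noise, ECAP) is needed.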

  15. An Aggregate IRT Procedure for Exploratory Factor Analysis

    NARCIS (Netherlands)

    Camilli, Gregory; Fox, Gerardus J.A.

    2015-01-01

    An aggregation strategy is proposed to potentially address practical limitation related to computing resources for two-level multidimensional item response theory (MIRT) models with large data sets. The aggregate model is derived by integration of the normal ogive model, and an adaptation of the

  16. An Aggregate IRT Procedure for Exploratory Factor Analysis

    Science.gov (United States)

    Camilli, Gregory; Fox, Jean-Paul

    2015-01-01

    An aggregation strategy is proposed to potentially address practical limitation related to computing resources for two-level multidimensional item response theory (MIRT) models with large data sets. The aggregate model is derived by integration of the normal ogive model, and an adaptation of the stochastic approximation expectation maximization…

  17. [Risk factors related to surgical site infection in elective surgery].

    Science.gov (United States)

    Angeles-Garay, Ulises; Morales-Márquez, Lucy Isabel; Sandoval-Balanzarios, Miguel Antonio; Velázquez-García, José Arturo; Maldonado-Torres, Lulia; Méndez-Cano, Andrea Fernanda

    2014-01-01

    Risk factors for surgical site infection in elective surgery should be measured and monitored from admission to 30 days after the surgical procedure, because 30% of surgical site infections are detected after the patient has been discharged. The objective was to calculate the relative risk of factors associated with surgical site infection in adults undergoing elective surgery. Patients were classified according to the degree of surgical contamination: patients with clean surgery were defined as non-exposed, and patients with clean-contaminated or contaminated surgery as exposed. Risk factors for infection were classified as inherent to the patient, pre-operative, intra-operative, and post-operative. Statistical analysis used Student's t or Mann-Whitney U tests, chi-square for relative risk (RR), and multivariate analysis by Cox proportional hazards. A total of 403 patients (59.8% women) were monitored for up to 30 days after surgery; 35 (8.7%) developed surgical site infections. The factors associated in multivariate analysis were smoking (RR 3.21), underweight (RR 3.4), unsuitable hand-washing technique (RR 4.61), transfusion during the procedure (RR 3.22), contaminated surgery (RR 60), intensive care stay of 8 to 14 days (RR 11.64) or of 1 to 3 days (RR 2.4), and catheter use for 1 to 3 days (RR 2.27). Avoiding all risk factors is almost impossible; therefore, close monitoring of elective surgery patients can prevent infectious complications.
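    The relative risks quoted above come from 2x2 tables of infection counts by exposure. A minimal sketch, using the generic Katz log-interval for the confidence bounds (not necessarily the authors' method):

```python
import math

def relative_risk(a, n_exposed, c, n_unexposed):
    """Relative risk from counts: a events among n_exposed subjects,
    c events among n_unexposed subjects."""
    return (a / n_exposed) / (c / n_unexposed)

def relative_risk_ci(a, n_exposed, c, n_unexposed, z=1.96):
    """RR with an approximate 95% CI via the log (Katz) method."""
    rr = relative_risk(a, n_exposed, c, n_unexposed)
    se = math.sqrt(1 / a - 1 / n_exposed + 1 / c - 1 / n_unexposed)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)
```

    For example, 20 infections in 100 exposed versus 10 in 100 non-exposed patients gives RR = 2.0.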

  18. A procedure for partitioning bulk sediments into distinct grain-size fractions for geochemical analysis

    Science.gov (United States)

    Barbanti, A.; Bothner, Michael H.

    1993-01-01

    A method to separate sediments into discrete size fractions for geochemical analysis has been tested. The procedures were chosen to minimize the destruction or formation of aggregates and involved gentle sieving and settling of wet samples. Freeze-drying and sonication pretreatments, known to influence aggregates, were used for comparison. Freeze-drying was found to increase the silt/clay ratio by an average of 180 percent compared to analysis of a wet sample that had been wet sieved only. Sonication of a wet sample decreased the silt/clay ratio by 51 percent. The concentrations of metals and organic carbon in the separated fractions changed depending on the pretreatment procedures in a manner consistent with the hypothesis that aggregates consist of fine-grained organic- and metal-rich particles. The coarse silt fraction of a freeze-dried sample contained 20–44 percent higher concentrations of Zn, Cu, and organic carbon than the coarse silt fraction of the wet sample. Sonication resulted in concentrations of these analytes that were 18–33 percent lower in the coarse silt fraction than found in the wet sample. Sonication increased the concentration of lead in the clay fraction by an average of 40 percent compared to an unsonicated sample. Understanding the magnitude of change caused by different analysis protocols is an aid in designing future studies that seek to interpret the spatial distribution of contaminated sediments and their transport mechanisms.

  19. ANAESTHESIA FOR OPHTHALMIC SURGICAL PROCEDURES

    African Journals Online (AJOL)

    Objective: To review factors influencing the choice of anaesthesia for ophthalmic surgical procedures. ... as risk associated with general anaesthesia (8) they are more .... Wilson ME, Pandey SK, Thakur J. Paediatric cataract blindness in the ...

  20. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

    The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration...... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring......) identifiability problems. By using the presented approach it is possible to detect potential identifiability problems and avoid pointless calibration (and experimental!) effort....

  1. National Trends of Simple Prostatectomy for Benign Prostatic Hyperplasia With an Analysis of Risk Factors for Adverse Perioperative Outcomes.

    Science.gov (United States)

    Pariser, Joseph J; Pearce, Shane M; Patel, Sanjay G; Bales, Gregory T

    2015-10-01

    To examine the national trends of simple prostatectomy (SP) for benign prostatic hyperplasia (BPH), focusing on perioperative outcomes and risk factors for complications. The National Inpatient Sample (2002-2012) was utilized to identify patients with BPH undergoing SP. Analysis included demographics, hospital details, associated procedures, and operative approach (open, robotic, or laparoscopic). Outcomes included complications, length of stay, charges, and mortality. Multivariate logistic regression was used to determine the risk factors for perioperative complications. Linear regression was used to assess the trends in the national annual utilization of SP. The study population included 35,171 patients. Median length of stay was 4 days (interquartile range 3-6). Cystolithotomy was performed concurrently in 6041 patients (17%). The overall complication rate was 28%, with bleeding occurring most commonly. In total, 148 (0.4%) patients experienced in-hospital mortality. On multivariate analysis, older age, black race, and overall comorbidity were associated with greater risk of complications, while the use of a minimally invasive approach and concurrent cystolithotomy were associated with decreased risk. Over the study period, the national use of SP decreased, on average, by 145 cases per year (P = .002). By 2012, 135/2580 procedures (5%) were performed using a minimally invasive approach. The nationwide utilization of SP for BPH has decreased. Bleeding complications are common, but perioperative mortality is low. Patients who are older, of black race, or who have multiple comorbidities are at higher risk of complications. Minimally invasive approaches, which are becoming increasingly utilized, may reduce perioperative morbidity. Copyright © 2015 Elsevier Inc. All rights reserved.
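
A multivariate logistic regression of the kind used above to isolate independent risk factors can be sketched as follows. The cohort, the covariates, and the effect sizes are simulated purely for illustration and do not reproduce the National Inpatient Sample analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated cohort (illustrative only): standardized age, a 0/1 flag for
# a minimally invasive approach, and a binary complication outcome whose
# true log-odds rise with age and fall with the minimally invasive flag.
n = 5000
age = rng.normal(0.0, 1.0, n)
min_inv = rng.integers(0, 2, n)
logit = -1.0 + 0.6 * age - 0.8 * min_inv
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

# Multivariate logistic regression fitted by Newton-Raphson (IRLS)
X = np.column_stack([np.ones(n), age, min_inv])
beta = np.zeros(3)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    grad = X.T @ (y - p)
    hess = X.T @ (X * W[:, None])
    beta += np.linalg.solve(hess, grad)

# Odds ratios: values > 1 increase complication risk, < 1 are protective
odds_ratios = np.exp(beta[1:])
print(dict(zip(["age", "minimally_invasive"], odds_ratios.round(2))))
```

With this construction the fitted odds ratio for age lands above 1 and the one for the minimally invasive flag below 1, mirroring the direction of the associations reported in the abstract.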

  2. Utilization of a statistical procedure for DNBR calculation and in the survey of reactor protection limits

    International Nuclear Information System (INIS)

    Pontedeiro, A.C.; Camargo, C.T.M.; Galetti, M.R. da Silva.

    1987-01-01

    A new procedure related to DNBR calculations, the Improved Thermal Design Procedure (ITDP), is applied to the Angra 1 NPP, treating the design parameters statistically. Application of the ITDP leads to the determination of the uncertainties in the input parameters, the sensitivity factors on DNBR, the DNBR limit, and new reactor protection limits. This was done for Angra 1 with the subchannel code COBRA-IIIP. The analysis of the limiting accident in terms of DNB confirmed a gain in DNBR margin and greater operating flexibility of the plant, decreasing unnecessary reactor trips. (author) [pt

  3. Error analysis for intrinsic quality factor measurement in superconducting radio frequency resonators.

    Science.gov (United States)

    Melnychuk, O; Grassellino, A; Romanenko, A

    2014-12-01

    In this paper, we discuss error analysis for intrinsic quality factor (Q0) and accelerating gradient (Eacc) measurements in superconducting radio frequency (SRF) resonators. The analysis is applicable for cavity performance tests that are routinely performed at SRF facilities worldwide. We review the sources of uncertainties along with the assumptions on their correlations and present uncertainty calculations with a more complete procedure for the treatment of correlations than in previous publications [T. Powers, in Proceedings of the 12th Workshop on RF Superconductivity, SuP02 (Elsevier, 2005), pp. 24-27]. Applying this approach to cavity data collected at the Vertical Test Stand facility at Fermilab, we estimated the total uncertainty for both Q0 and Eacc to be at the level of approximately 4% for input coupler coupling parameter β1 in the [0.5, 2.5] range. Above 2.5 (below 0.5) the Q0 uncertainty increases (decreases) with β1, whereas the Eacc uncertainty, in contrast with the results in Powers [in Proceedings of the 12th Workshop on RF Superconductivity, SuP02 (Elsevier, 2005), pp. 24-27], is independent of β1. Overall, our estimated Q0 uncertainty is approximately half as large as that in Powers [in Proceedings of the 12th Workshop on RF Superconductivity, SuP02 (Elsevier, 2005), pp. 24-27].
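
The root-sum-of-squares combination of independent relative uncertainties that underlies estimates like the ~4% figure can be sketched as follows. The individual uncertainty values below are hypothetical, and the paper's full procedure additionally carries cross-covariance terms for correlated inputs.

```python
import math

def combined_relative_uncertainty(rel_uncertainties):
    """Root-sum-of-squares combination of independent relative uncertainties.

    This is the standard propagation rule for a quantity formed as a
    product/quotient of independently measured inputs; correlated inputs
    (as treated in the paper) would add cross-covariance terms.
    """
    return math.sqrt(sum(u * u for u in rel_uncertainties))

# Hypothetical relative uncertainties of the RF power measurements that
# enter Q0 (the values are illustrative, not Fermilab's).
u_Q0 = combined_relative_uncertainty([0.02, 0.025, 0.02])
print(f"relative Q0 uncertainty ~ {100 * u_Q0:.1f}%")
```

Three independent 2-2.5% contributions combine to roughly 3.8%, the same order as the ~4% total uncertainty quoted in the abstract.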

  4. Retrospective analysis of technical success rate and procedure-related complications of 867 percutaneous CT-guided needle biopsies of lung lesions.

    Science.gov (United States)

    Mills, M; Choi, J; El-Haddad, G; Sweeney, J; Biebel, B; Robinson, L; Antonia, S; Kumar, A; Kis, B

    2017-12-01

    To investigate the technical success rate and procedure-related complications of computed tomography (CT)-guided needle biopsy of lung lesions and to identify the factors that are correlated with the occurrence of procedure-related complications. This was a single-institution retrospective study of 867 consecutive CT-guided needle biopsies of lung lesions performed on 772 patients in a tertiary cancer centre. The technical success rate and complications were correlated with patient, lung lesion, and procedure-related variables. The technical success rate was 87.2% and the mortality rate was 0.12%. Of the 867 total biopsies, 25.7% were associated with pneumothorax, and 6.5% required chest tube drainage. The haemothorax rate was 1.8%. There was a positive correlation between the development of pneumothorax and smaller lesion diameter. CT-guided needle biopsy of lung lesions achieved high technical success and a low rate of major complications. The present study has revealed several variables that can be used to identify high-risk procedures. A post-procedural chest X-ray within hours after the procedure is highly recommended to identify high-risk patients who require chest tube placement. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  5. Multi-response permutation procedure as an alternative to the analysis of variance: an SPSS implementation.

    Science.gov (United States)

    Cai, Li

    2006-02-01

    A permutation test typically requires fewer assumptions than does a comparable parametric counterpart. The multi-response permutation procedure (MRPP) is a class of multivariate permutation tests of group difference useful for the analysis of experimental data. However, psychologists seldom make use of the MRPP in data analysis, in part because the MRPP is not implemented in popular statistical packages that psychologists use. A set of SPSS macros implementing the MRPP test is provided in this article. The use of the macros is illustrated by analyzing example data sets.
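
For readers outside SPSS, the MRPP can be sketched in a few lines. This is a generic implementation of the statistic (the group-size-weighted mean of within-group average pairwise distances) with a permutation p-value, not a port of the article's macros.

```python
import numpy as np

rng = np.random.default_rng(42)

def mrpp_delta(data, labels):
    """MRPP test statistic: weighted mean of within-group average
    pairwise Euclidean distances (group weights n_i / N)."""
    deltas, weights = [], []
    for g in np.unique(labels):
        grp = data[labels == g]
        n = len(grp)
        d = np.linalg.norm(grp[:, None, :] - grp[None, :, :], axis=-1)
        deltas.append(d[np.triu_indices(n, 1)].mean())
        weights.append(n)
    w = np.array(weights) / sum(weights)
    return float(w @ np.array(deltas))

def mrpp_test(data, labels, n_perm=999):
    """Permutation p-value: fraction of label shufflings whose delta is
    at least as small as the observed one (small delta = tight groups)."""
    observed = mrpp_delta(data, labels)
    perm = [mrpp_delta(data, rng.permutation(labels)) for _ in range(n_perm)]
    p = (1 + sum(d <= observed for d in perm)) / (n_perm + 1)
    return observed, p

# Two clearly separated bivariate groups (synthetic example data)
g1 = rng.normal(0.0, 1.0, size=(20, 2))
g2 = rng.normal(3.0, 1.0, size=(20, 2))
data = np.vstack([g1, g2])
labels = np.array([0] * 20 + [1] * 20)

delta, p = mrpp_test(data, labels)
print(f"observed delta = {delta:.3f}, p = {p:.3f}")
```

Because the observed within-group spread is far smaller than under random relabelings, the permutation p-value comes out highly significant, which is the MRPP's analogue of a significant ANOVA group effect.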

  6. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during a volcanic eruption (the eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository, with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  7. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during a volcanic eruption (the eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository, with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  8. Is isomerism a risk factor for intestinal volvulus?

    Science.gov (United States)

    Landisch, Rachel M; Loomba, Rohit S; Salazar, Jose H; Buelow, Matthew W; Frommelt, Michele; Anderson, Robert H; Wagner, Amy J

    2018-03-06

    Isomerism, or heterotaxy syndrome, affects many organ systems anatomically and functionally. Intestinal malrotation is common in patients with isomerism. Despite a low reported risk of volvulus, some physicians perform routine screening and prophylactic Ladd procedures on asymptomatic patients with isomerism who are found to have intestinal malrotation. The primary aim of this study was to determine if isomerism is an independent risk factor for volvulus. Kids' Inpatient Database data from 1997 to 2012 were utilized for this study. Characteristics of admissions with and without isomerism were compared, with a particular focus on intestinal malrotation, volvulus, and the Ladd procedure. A logistic regression was conducted to determine independent risk factors for volvulus with respect to isomerism. 15,962,403 inpatient admissions were included in the analysis, of which 7970 (0.05%) involved patients with isomerism; 6 of these patients (0.1%) developed volvulus. Isomerism was associated with a 52-fold increase in the odds of intestinal malrotation by univariate analysis. Of the 251 patients with isomerism and intestinal malrotation, only 2.4% experienced volvulus. Logistic regression demonstrated that isomerism was not an independent risk factor for volvulus. Isomerism is associated with an increased risk of intestinal malrotation but is not an independent risk factor for volvulus. Prognosis study. Level III. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. A replication of a factor analysis of motivations for trapping

    Science.gov (United States)

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using a recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.

  10. A human factor analysis of a radiotherapy accident

    International Nuclear Information System (INIS)

    Thellier, S.

    2009-01-01

    Since September 2005, I.R.S.N. has studied radiotherapy treatment activities from the perspective of human and organizational factors in order to improve the reliability of radiotherapy treatment. Drawing on its experience in analysing incidents in the nuclear industry, I.R.S.N. analysed and published in March 2008, for the first time in France, a detailed study of a radiotherapy accident from this perspective. The analysis method was based on interviews and on documents kept by the hospital, and aimed at identifying the causes of the recorded difference between the dose prescribed by the radiotherapist and the dose actually received by the patient. Neither verbal nor written communication (intra-service meetings and treatment protocols) transmitted information well enough for the radiographers to adjust the irradiation zones correctly. The analysis highlighted that, during the preparation and delivery of the treatment, various factors led to planned checks not being performed. Finally, it highlighted that unresolved questions persist in the report on this accident, owing to a lack of traceability of a number of key actions. The article concluded that improvement is needed in three areas: cooperation between practitioners, checking of actions, and traceability of actions. (author)

  11. EFFICIENCY OF MOMENT AMPLIFICATION PROCEDURES FOR THE SECOND-ORDER ANALYSIS OF STEEL FRAMES

    Directory of Open Access Journals (Sweden)

    Paschal Chiadighikaobi

    2018-02-01

    Full Text Available A beam-column is a member subjected to combined axial compression and bending. The secondary moment, an additional moment induced by the axial load acting on the deflected member, was accounted for in the design. Analysis results from two computer-aided tools (SAP2000 and a Java implementation) were compared. The moment amplification factor Af was input to the Java code; Af did not create any change in the Java code result outputs. There are many different ways to apply amplification factors to first-order analysis results, each with its own range of applicability. The results shown in this paper are the comparative moment diagrams, axial forces, and shear forces. The type of steel used in the design and analysis is ASTM A992.
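
The role of a moment amplification factor can be illustrated with a minimal sketch. The B1-type formula and all loads below are generic textbook values, not the paper's SAP2000/Java models.

```python
def amplification_factor(Cm, Pu, Pe):
    """B1-type moment amplifier for a beam-column: Af = Cm / (1 - Pu/Pe),
    taken as at least 1.0.

    Cm : equivalent uniform moment coefficient
    Pu : required axial compressive load
    Pe : Euler elastic buckling load
    """
    if Pu >= Pe:
        raise ValueError("axial load at or above elastic buckling load")
    return max(Cm / (1.0 - Pu / Pe), 1.0)

# Example: a beam-column carrying 20% of its Euler buckling load
M1 = 120.0                      # first-order moment, kN*m (illustrative)
Af = amplification_factor(Cm=1.0, Pu=400.0, Pe=2000.0)
M2 = Af * M1                    # amplified (second-order) design moment
print(Af, M2)                   # 1.25 150.0
```

The amplifier approximates the secondary moment of the abstract: the closer the axial load gets to the buckling load, the more the first-order moment is magnified.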

  12. Typical NRC inspection procedures for model plant

    International Nuclear Information System (INIS)

    Blaylock, J.

    1984-01-01

    A summary of NRC inspection procedures for a model LEU fuel fabrication plant is presented. Procedures and methods for combining inventory data, seals, measurement techniques, and statistical analysis are emphasized

  13. Procedures for determination of 239,240Pu, 241Am, 237Np, 234,238U, 228,230,232Th, 99Tc and 210Pb-210Po in environmental material

    International Nuclear Information System (INIS)

    Kolstad, A. K.; Lind, B.; QingJiang Chen; Aarkrog, A.; Nielsen, S.P.; Dahlgaard, H.; Yixuan Yu

    2001-12-01

    Since 1987, the Department of Nuclear Safety Research, Risoe National Laboratory has developed procedures for the analysis of low-level amounts of radioactivity in large samples, e.g. 200 liters of seawater, 10 grams of sediment, and soil and other environmental materials. These analytical procedures provide high chemical yields, good resolution and excellent decontamination factors for large environmental samples analysed by alpha spectrometry and mass spectrometry (ICPMS). The procedures have been checked through practical analysis work and are used in Norway, the Netherlands, Germany, Spain, France and Denmark. (au)

  14. A General Procedure to Assess the Internal Structure of a Noncognitive Measure--The Student360 Insight Program (S360) Time Management Scale. Research Report. ETS RR-11-42

    Science.gov (United States)

    Ling, Guangming; Rijmen, Frank

    2011-01-01

    The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…
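
The EFA step of such a procedure can be sketched with a principal-component extraction of loadings and the Kaiser criterion. The simulated items and loadings below are illustrative, not the S360 Time Management data, and the hypothetical item set is driven by a single latent factor by construction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate item responses driven by one latent factor
# (item count and loadings are hypothetical, not the S360 scale)
n, k = 500, 6
factor = rng.normal(size=n)
loadings_true = np.array([0.8, 0.75, 0.7, 0.65, 0.6, 0.55])
items = factor[:, None] * loadings_true + rng.normal(0.0, 0.5, (n, k))

# Exploratory extraction: eigendecomposition of the item correlation
# matrix; loadings are eigenvectors scaled by sqrt(eigenvalues).
R = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Kaiser criterion: retain factors with eigenvalue > 1
n_factors = int(np.sum(eigvals > 1.0))
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print("factors retained:", n_factors)
print("first-factor loadings:", np.abs(loadings[:, 0]).round(2))
```

A CFA would then fix this one-factor pattern in advance and judge it by fit indexes rather than rediscover it from the eigenstructure.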

  15. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale

    Science.gov (United States)

    Sediyama, Cristina Y. N.; Moura, Ricardo; Garcia, Marina S.; da Silva, Antonio G.; Soraggi, Carolina; Neves, Fernando S.; Albuquerque, Maicon R.; Whiteside, Setephen P.; Malloy-Diniz, Leandro F.

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency; (b) lack of premeditation; (c) lack of perseverance; (d) sensation seeking. In the present study, 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in previous analyses. Results: Mean UPPS total scores decreased with age; the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha values were satisfactory for all subscales, with similarly high values across subscales, although confirmatory factor analysis fit indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS. PMID:28484414

  16. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale

    Directory of Open Access Journals (Sweden)

    Leandro F. Malloy-Diniz

    2017-04-01

    Full Text Available Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency; (b) lack of premeditation; (c) lack of perseverance; (d) sensation seeking. In the present study, 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in previous analyses. Results: Mean UPPS total scores decreased with age; the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha values were satisfactory for all subscales, with similarly high values across subscales, although confirmatory factor analysis fit indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS.

  17. Factor Analysis of the Brazilian Version of UPPS Impulsive Behavior Scale.

    Science.gov (United States)

    Sediyama, Cristina Y N; Moura, Ricardo; Garcia, Marina S; da Silva, Antonio G; Soraggi, Carolina; Neves, Fernando S; Albuquerque, Maicon R; Whiteside, Setephen P; Malloy-Diniz, Leandro F

    2017-01-01

    Objective: To examine the internal consistency and factor structure of the Brazilian adaptation of the UPPS Impulsive Behavior Scale. Methods: The UPPS is a self-report scale composed of 40 items assessing four factors of impulsivity: (a) urgency; (b) lack of premeditation; (c) lack of perseverance; (d) sensation seeking. In the present study, 384 participants (278 women and 106 men), recruited from schools, universities, leisure centers and workplaces, completed the UPPS scale. An exploratory factor analysis was performed using Varimax factor rotation and Kaiser normalization, and we also conducted two confirmatory analyses to test the independence of the UPPS components found in previous analyses. Results: Mean UPPS total scores decreased with age; the youngest participants (below 30 years) scored significantly higher than the groups over 30 years. No gender difference was found. Cronbach's alpha values were satisfactory for all subscales, with similarly high values across subscales, although confirmatory factor analysis fit indexes indicated a poor model fit. The results of the two exploratory factor analyses were satisfactory. Conclusion: Our results showed that the Portuguese version has the same four-factor structure as the original and previous translations of the UPPS.

  18. Factors affecting the HIV/AIDS epidemic: An ecological analysis of ...

    African Journals Online (AJOL)

    Factors affecting the HIV/AIDS epidemic: An ecological analysis of global data. ... Backward multiple linear regression analysis identified the proportion of Muslims, physician density, and adolescent fertility rate as the three most prominent factors linked with the national HIV epidemic. Conclusions: The findings support ...

  19. Analysis of Economic Factors Affecting Stock Market

    OpenAIRE

    Xie, Linyin

    2010-01-01

    This dissertation concentrates on the analysis of economic factors affecting the Chinese stock market by examining the relationship between the stock market index and economic factors. Six economic variables are examined: industrial production, money supply 1, money supply 2, exchange rate, long-term government bond yield and real estate total value. The stock market comprises fixed-interest stocks and equity shares. In this dissertation, the stock market is restricted to the equity market. The stock price in thi...

  20. Reversal of Hartmann's procedure following acute diverticulitis: is timing everything?

    LENUS (Irish Health Repository)

    Fleming, Fergal J

    2012-02-01

    BACKGROUND: Patients who undergo a Hartmann's procedure may not be offered a reversal due to concerns over the morbidity of the second procedure. The aims of this study were to examine the morbidity post reversal of Hartmann's procedure. METHODS: Patients who underwent a Hartmann's procedure for acute diverticulitis (Hinchey 3 or 4) between 1995 and 2006 were studied. Clinical factors including patient comorbidities were analysed to elucidate what preoperative factors were associated with complications following reversal of Hartmann's procedure. RESULTS: One hundred and ten patients were included. Median age was 70 years and 56% of the cohort were male (n = 61). The mortality and morbidity rate for the acute presentation was 7.3% (n = 8) and 34% (n = 37) respectively. Seventy six patients (69%) underwent a reversal at a median of 7 months (range 3-22 months) post-Hartmann's procedure. The complication rate in the reversal group was 25% (n = 18). A history of current smoking (p = 0.004), increasing time to reversal (p = 0.04) and low preoperative albumin (p = 0.003) were all associated with complications following reversal. CONCLUSIONS: Reversal of Hartmann's procedure can be offered to appropriately selected patients though with a significant (25%) morbidity rate. The identification of potential modifiable factors such as current smoking, prolonged time to reversal and low preoperative albumin may allow optimisation of such patients preoperatively.

  1. Relations among conceptual knowledge, procedural knowledge, and procedural flexibility in two samples differing in prior knowledge.

    Science.gov (United States)

    Schneider, Michael; Rittle-Johnson, Bethany; Star, Jon R

    2011-11-01

    Competence in many domains rests on children developing conceptual and procedural knowledge, as well as procedural flexibility. However, research on the developmental relations between these different types of knowledge has yielded unclear results, in part because little attention has been paid to the validity of the measures or to the effects of prior knowledge on the relations. To overcome these problems, we modeled the three constructs in the domain of equation solving as latent factors and tested (a) whether the predictive relations between conceptual and procedural knowledge were bidirectional, (b) whether these interrelations were moderated by prior knowledge, and (c) how both constructs contributed to procedural flexibility. We analyzed data from 2 measurement points each from two samples (Ns = 228 and 304) of middle school students who differed in prior knowledge. Conceptual and procedural knowledge had stable bidirectional relations that were not moderated by prior knowledge. Both kinds of knowledge contributed independently to procedural flexibility. The results demonstrate how changes in complex knowledge structures contribute to competence development.

  2. Influence of the derivatization procedure on the results of the gas chromatographic fatty acid analysis of human milk and infant formulae.

    Science.gov (United States)

    Kohn, G; van der Ploeg, P; Möbius, M; Sawatzki, G

    1996-09-01

    Many different analytical procedures for fatty acid analysis of infant formulae and human milk are described. The objective was to study possible pitfalls in the use of different acid-catalyzed procedures compared to a base-catalyzed procedure based on sodium methoxide in methanol. The influence of the different methods on the relative fatty acid composition (wt% of total fatty acids) and the total fatty acid recovery rate (expressed as % of total lipids) was studied in two experimental LCP-containing formulae and a human milk sample. MeOH/HCl procedures were found to result in an incomplete transesterification of triglycerides if an additional nonpolar solvent such as toluene or hexane is not added and a water-free preparation is not guaranteed. In infant formulae, the low transesterification of triglycerides (down to only 37%) could result in a 100% overestimation of the relative amount of LCP if these fatty acids derive primarily from phospholipids. This is the case in infant formulae containing egg lipids as raw materials. In formulae containing fish oils, and in human milk, the efficacy of esterification results in incorrect absolute amounts of fatty acids but has no remarkable effect on the relative fatty acid distribution. This is because in these samples LCP are primarily bound to triglycerides. Furthermore, in formulae based on butterfat, the derivatization procedure should be designed in such a way that losses of short-chain fatty acids due to evaporation steps can be avoided. The procedure based on sodium methoxide was found to result in a satisfactory (about 90%) conversion of formula lipids and reliable contents of all individual fatty acids. Due to a possibly high amount of free fatty acids in human milk, which are not methylated by sodium methoxide, caution is expressed about the use of this reagent for fatty acid analysis of mothers' milk. It is concluded that accurate fatty acid analysis of infant formulae and human milk requires a careful

  3. Deterministic factor analysis: methods of integro-differentiation of non-integral order

    Directory of Open Access Journals (Sweden)

    Valentina V. Tarasova

    2016-12-01

    Full Text Available Objective: to summarize the methods of deterministic factor economic analysis, namely the differential calculus and the integral method. Methods: mathematical methods for integro-differentiation of non-integral order; the theory of derivatives and integrals of fractional (non-integral) order. Results: the basic concepts are formulated and new methods are developed that take into account the memory and nonlocality effects in the quantitative description of the influence of individual factors on the change in the effective economic indicator. Two methods are proposed for integro-differentiation of non-integral order for the deterministic factor analysis of economic processes with memory and nonlocality. It is shown that the method of integro-differentiation of non-integral order can give more accurate results compared with standard methods (the method of differentiation using first-order derivatives and the integral method using first-order integration) for a wide class of functions describing effective economic indicators. Scientific novelty: new methods of deterministic factor analysis are proposed: the method of differential calculus of non-integral order and the integral method of non-integral order. Practical significance: the basic concepts and formulas of the article can be used in scientific and analytical activity for factor analysis of economic processes. The proposed method of integro-differentiation of non-integral order extends the capabilities of deterministic factor economic analysis. The new quantitative method of deterministic factor analysis may become the beginning of quantitative studies of the behavior of economic agents with memory (hereditarity) and spatial nonlocality. The proposed methods of deterministic factor analysis can be used in the study of economic processes which follow the exponential law, in which the indicators (endogenous variables) are power functions of the factors (exogenous variables), including the processes
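
For reference, the integro-differential operators of non-integral order invoked above are commonly defined via the Riemann-Liouville fractional integral and the Caputo fractional derivative; one standard convention (the authors' exact definitions may differ) is:

```latex
% Riemann-Liouville fractional integral of order \alpha > 0
(I^{\alpha}_{a} f)(t) = \frac{1}{\Gamma(\alpha)} \int_{a}^{t} (t-\tau)^{\alpha-1} f(\tau)\, d\tau

% Caputo fractional derivative of order \alpha, with n-1 < \alpha < n
({}^{C}\!D^{\alpha}_{a} f)(t) = \frac{1}{\Gamma(n-\alpha)} \int_{a}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha-n+1}}\, d\tau
```

For integral orders these reduce to ordinary repeated integration and differentiation; the power-law kernel is what introduces the memory (nonlocality in time) that the abstract emphasizes.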

  4. Structural-Vibration-Response Data Analysis

    Science.gov (United States)

    Smith, W. R.; Hechenlaible, R. N.; Perez, R. C.

    1983-01-01

    Computer program developed as structural-vibration-response data analysis tool for use in dynamic testing of Space Shuttle. Program provides fast and efficient time-domain least-squares curve-fitting procedure for reducing transient response data to obtain structural model frequencies and dampings from free-decay records. Procedure simultaneously identifies frequencies, damping values, and participation factors for noisy multiple-response records.
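
The time-domain least-squares reduction of a free-decay record can be sketched as below. The damped-sinusoid model, signal values, and noise level are illustrative, not Shuttle test data.

```python
import numpy as np
from scipy.optimize import curve_fit

def free_decay(t, A, sigma, f, phi):
    """Single-mode free-decay response: a damped sinusoid
    A * exp(-sigma*t) * cos(2*pi*f*t + phi)."""
    return A * np.exp(-sigma * t) * np.cos(2.0 * np.pi * f * t + phi)

# Synthetic free-decay record: a 5 Hz mode with ~2% damping plus noise
rng = np.random.default_rng(7)
t = np.linspace(0.0, 2.0, 2000)
y = free_decay(t, A=1.0, sigma=0.628, f=5.0, phi=0.3)
y = y + rng.normal(0.0, 0.01, t.size)

# Time-domain least-squares fit from a rough initial guess
popt, _ = curve_fit(free_decay, t, y, p0=[0.8, 0.4, 4.9, 0.0])
A_hat, sigma_hat, f_hat, phi_hat = popt

# Modal damping ratio from decay rate and damped frequency:
# sigma = zeta*wn and 2*pi*f = wn*sqrt(1 - zeta^2), hence
zeta_hat = sigma_hat / np.sqrt(sigma_hat**2 + (2.0 * np.pi * f_hat) ** 2)
print(f"f = {f_hat:.3f} Hz, damping = {100 * zeta_hat:.2f}%")
```

Multi-mode records would sum several such terms, with the fit simultaneously returning frequencies, damping values, and participation (amplitude) factors as described above.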

  5. Modification and analysis of engineering hot spot factor of HFETR

    International Nuclear Information System (INIS)

    Hu Yuechun; Deng Caiyu; Li Haitao; Xu Taozhong; Mo Zhengyu

    2014-01-01

    This paper presents the modification and analysis of the engineering hot spot factors of HFETR. The new factors are applied in the fuel temperature analysis and in estimating the safety allowable operating power of HFETR. The results show that the maximum cladding temperature of the fuel is lower when the new factors are used, while the safety allowable operating power of HFETR is higher, thus improving the economic efficiency of HFETR. (authors)

  6. Ranking insurance firms using AHP and Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Khodaei Valahzaghard

    2013-03-01

    Full Text Available Insurance constitutes a significant part of the economy, and it is important to learn more about the capabilities of the different firms active in this industry. In this paper, we present an empirical study to rank insurance firms using the analytical hierarchy process (AHP) as well as factor analysis. The study considers four criteria: capital adequacy, quality of earnings, quality of cash flow and quality of firms’ assets. The results of the implementation of factor analysis (FA) were verified using the Kaiser-Meyer-Olkin (KMO=0.573) and Bartlett's Chi-Square (443.267, P-value=0.000) tests. According to the results of FA, the first factor, capital adequacy, accounts for 21.557% of total variance; the second factor, quality of income, accounts for 20.958%; the third factor, quality of cash flow, accounts for 19.417%; and the last factor, quality of assets, accounts for 18.641% of total variance. The study also used AHP to rank the insurance firms. The results of our survey indicate that capital adequacy (0.559) is the most important factor, followed by quality of income (0.235), quality of cash flow (0.144) and quality of assets (0.061). The results of AHP are consistent with those of FA, which lends some validity to the overall study.
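
    As a sketch of the AHP step, priority weights can be computed from a pairwise-comparison matrix with the geometric-mean approximation of the principal eigenvector. The 4x4 matrix below is hypothetical (the paper's actual judgements are not reproduced here), chosen only so the resulting weights rank the criteria in the same order as the study:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority vector via the geometric-mean (approximate eigenvector) method:
    take the geometric mean of each row, then normalise to sum to one."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[0])
    return gm / gm.sum()

# Hypothetical pairwise comparisons for the four criteria, in the order:
# capital adequacy, quality of income, quality of cash flow, quality of assets
A = np.array([
    [1.0,   3.0,   5.0, 7.0],
    [1 / 3, 1.0,   2.0, 4.0],
    [1 / 5, 1 / 2, 1.0, 3.0],
    [1 / 7, 1 / 4, 1 / 3, 1.0],
])
w = ahp_weights(A)
```

    For this made-up matrix the weights come out near 0.58/0.23/0.13/0.06, matching the ordering (though not the exact values) reported in the study.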

  7. Interpreting and Correcting Cross-scale Mismatches in Resilience Analysis: a Procedure and Examples from Australia's Rangelands

    Directory of Open Access Journals (Sweden)

    John A. Ludwig

    2005-12-01

    Full Text Available Many rangelands around the globe are degraded because of mismatches between the goals and actions of managers operating at different spatial scales. In this paper, we focus on identifying, interpreting, and correcting cross-scale mismatches in rangeland management by building on an existing four-step resilience analysis procedure. Resilience analysis is an evaluation of the capacity of a system to persist in the face of disturbances. We provide three examples of cross-scale resilience analysis using a rangeland system located in northern Australia. The system was summarized in a diagram showing key interactions between three attributes (water quality, regional biodiversity, and beef quality), which can be used to indicate the degree of resilience of the system, and other components that affect these attributes at different scales. The strengths of cross-scale interactions were rated as strong or weak, and the likely causes of mismatches in strength were interpreted. Possible actions to correct cross-scale mismatches were suggested and evaluated. We found this four-step, cross-scale resilience analysis procedure very helpful because it reduced a complex problem down to manageable parts without losing sight of the larger-scale whole. To build rangeland resilience, many such cross-scale mismatches in management will need to be corrected, especially as the global use of rangelands increases over the coming decades.

  8. Risk factors and prognosis for neonatal sepsis in southeastern Mexico: analysis of a four-year historic cohort follow-up

    Directory of Open Access Journals (Sweden)

    Leal Yelda A

    2012-06-01

    Full Text Available Abstract Background Neonatal sepsis is a worldwide public health issue in which, depending on the studied population, marked variations in its risk and prognostic factors have been reported. The aim of this study was to assess the risk and prognostic factors for neonatal sepsis prevailing at a medical unit in southeastern Mexico. We used a historic cohort design to assess the association between a series of neonatal and maternal characteristics, as well as hospital course features, and the risk and prognosis of neonatal sepsis (defined by Pediatric Sepsis Consensus [PSC] criteria) in 11,790 newborns consecutively admitted to a Neonatology Service in Mérida, Mexico, between 2004 and 2007. Results Sepsis was found in 514 of 11,790 (4.3%) newborns; 387 (75.3%) of these cases were categorized as early-onset (<72 h) and 127 (24.7%) as late-onset (>72 h). After logistic regression, risk factors for sepsis included the following: low birth weight; prematurity; abnormal amniotic fluid; premature membrane rupture (PMR) at >24 h; respiratory complications; and the requirement of assisted ventilation, an O2 inspiration fraction (IF) >60%, or a surgical procedure. Some of these factors were differentially associated with early- or late-onset neonatal sepsis. The overall mortality rate of sepsis was 9.5%. A marked difference in the mortality rate was found between early- and late-onset sepsis (p < 0.0001). After Cox analysis, factors associated with mortality in newborns with sepsis comprised the following: prematurity; low birth weight; low Apgar score; perinatal asphyxia; and the requirement of any invasive medical or surgical procedure. Conclusions The incidence of neonatal sepsis in southeastern Mexico was 4.3%. A different risk and prognostic profile between early- and late-onset neonatal sepsis was found.

  9. Soil Sampling Operating Procedure

    Science.gov (United States)

    EPA Region 4 Science and Ecosystem Support Division (SESD) document that describes general and specific procedures, methods, and considerations when collecting soil samples for field screening or laboratory analysis.

  10. An XML Representation for Crew Procedures

    Science.gov (United States)

    Simpson, Richard C.

    2005-01-01

    NASA ensures safe operation of complex systems through the use of formally-documented procedures, which encode the operational knowledge of the system as derived from system experts. Crew members use procedure documentation on the ground for training purposes and on board the space shuttle and space station to guide their activities. Investigators at JSC are developing a new representation for procedures that is content-based (as opposed to display-based). Instead of specifying how a procedure should look on the printed page, the content-based representation will identify the components of a procedure and (more importantly) how the components are related (e.g., how the activities within a procedure are sequenced; what resources need to be available for each activity). This approach will allow different sets of rules to be created for displaying procedures on a computer screen, on a hand-held personal digital assistant (PDA), verbally, or on a printed page, and will also allow intelligent reasoning processes to automatically interpret and use procedure definitions. During his NASA fellowship, Dr. Simpson examined how various industries represent procedures (also called business processes or workflows) in areas such as manufacturing, accounting, shipping, or customer service. A useful method for designing and evaluating workflow representation languages is to determine their ability to encode various workflow patterns, which depict abstract relationships between the components of a procedure removed from the context of a specific procedure or industry. Investigators have used this type of analysis to evaluate how well suited existing workflow representation languages are for various industries, based on the workflow patterns that commonly arise across industry-specific procedures. Based on this type of analysis, it is already clear that existing workflow representations capture discrete flow of control (i.e., when one activity should start and stop based on when other

  11. Neutron activation analysis with k{sub 0}-standardisation : general formalism and procedure

    Energy Technology Data Exchange (ETDEWEB)

    Pomme, S.; Hardeman, F. [Centre de l`Etude de l`Energie Nucleaire, Mol (Belgium); Robouch, P.; Etxebarria, N.; Arana, G. [European Commission, Joint Research Centre, Institute for Reference Materials and Measurements, Geel (Belgium)

    1997-09-01

    Instrumental neutron activation analysis (INAA) with k{sub 0}-standardisation is a powerful tool for multi-element analysis at a broad range of trace element concentrations. An overview is given of the basic principles, fundamental equations, and general procedure of this method. Different aspects of the description of the neutron activation reaction rate are discussed, applying the Hogdahl convention. A general activation-decay formula is derived and its application to INAA is demonstrated. Relevant k{sub 0}-definitions for different activation decay schemes are summarised and upgraded to cases of extremely high fluxes. The main standardisation techniques for INAA are discussed, emphasizing the k{sub 0}-standardisation. Some general aspects of the basic equipment and its calibration are discussed, such as the characterisation of the neutron field and the tuning of the spectrometry part. A method for the prediction and optimisation of the analytical performance of INAA is presented.
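
    Leaving aside flux-gradient, geometry, and alpha corrections, the core k0 relation can be sketched as a concentration ratio between an analyte and a co-irradiated monitor. All parameter names and numbers below are illustrative placeholders, not values from the article: `asp` stands for specific count rate, `f` for the thermal-to-epithermal flux ratio, `q0` for the resonance-integral-to-thermal cross-section ratio, and `eff` for full-energy-peak detection efficiency:

```python
def k0_concentration_ratio(asp_analyte, asp_monitor, k0, f,
                           q0_analyte, q0_monitor, eff_analyte, eff_monitor):
    """Simplified k0-NAA mass-fraction ratio of analyte to monitor
    (Hogdahl convention, ignoring the epithermal shape factor alpha
    and geometry/flux-gradient corrections)."""
    return ((asp_analyte / asp_monitor) * (1.0 / k0)
            * (f + q0_monitor) / (f + q0_analyte)
            * (eff_monitor / eff_analyte))

# Illustrative numbers: with equal Q0 values and equal efficiencies the
# expression reduces to (count-rate ratio) / k0
ratio = k0_concentration_ratio(2.0, 1.0, 0.5, 40.0, 10.0, 10.0, 0.01, 0.01)
```

    The attraction of the k0 approach, as the record notes, is that the nuclide-specific constants are tabulated once, so no element-by-element standards are needed at analysis time.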

  12. New non-cognitive procedures for medical applicant selection: a qualitative analysis in one school.

    Science.gov (United States)

    Katz, Sara; Vinker, Shlomo

    2014-11-07

    Recent data have called into question the reliability and predictive validity of standard admission procedures to medical schools. Eliciting the non-cognitive attributes of medical school applicants using qualitative tools and methods has thus become a major challenge. 299 applicants aged 18-25 formed the research group. A set of six research tools was developed in addition to the two existing ones. These included: a portfolio task, an intuitive task, a cognitive task, a personal task, an open self-efficacy questionnaire and field notes. The criteria-based methodology design used constant comparative analysis and grounded theory techniques to produce a personal attributes profile per participant, scored on a 5-point holistic rubric. Qualitative validity of data gathering was checked by comparing the profiles elicited from the existing interview against the profiles elicited from the other tools, and by comparing the two profiles of each applicant who handed in two portfolio tasks. Qualitative validity of data analysis was checked by comparing researcher results with those of an external rater (n=10). Differences between aggregated profile groups were checked by the non-parametric Wilcoxon Signed Ranks Test and by the Spearman Rank Order Correlation Test. All subjects gave written informed consent to their participation. Privacy was protected by using code numbers. A concept map of 12 personal attributes emerged, the core constructs of which were motivation, sociability and cognition. A personal profile was elicited. Inter-rater agreement was 83.3%. Differences between groups by aggregated profiles were found to be significant (p < .05, p < .01, p < .001). A random sample of sixth-year students (n = 12) underwent the same admission procedure as the research group. Rank order was different, and arrogance was a new construct elicited in the sixth-year group. This study suggests a broadening of the methodology for selecting medical school applicants. This methodology

  13. An integrated portfolio optimisation procedure based on data envelopment analysis, artificial bee colony algorithm and genetic programming

    Science.gov (United States)

    Hsu, Chih-Ming

    2014-12-01

    Portfolio optimisation is an important issue in the field of investment/financial decision-making and has received considerable attention from both researchers and practitioners. However, besides portfolio optimisation, a complete investment procedure should also include the selection of profitable investment targets and the determination of the optimal timing for buying/selling them. In this study, an integrated procedure using data envelopment analysis (DEA), the artificial bee colony algorithm (ABC) and genetic programming (GP) is proposed to resolve a portfolio optimisation problem. The proposed procedure is evaluated through a case study on investing in stocks in the semiconductor sub-section of the Taiwan stock market over 4 years. The potential average 6-month return on investment of 9.31% from 1 November 2007 to 31 October 2011 indicates that the proposed procedure can be considered a feasible and effective tool for making outstanding investment plans, and thus profits, in the Taiwan stock market. Moreover, it is a strategy that can help investors to make profits even when the overall stock market suffers a loss.

  14. ANALYSIS OF THE FACTORS AFFECTING THE AVERAGE

    Directory of Open Access Journals (Sweden)

    Carmen BOGHEAN

    2013-12-01

    Full Text Available Productivity in agriculture most relevantly and concisely expresses the economic efficiency of using the factors of production. Labour productivity is affected by a considerable number of variables (including the system of relationships and interdependence between factors), which differ in each economic sector and influence it, giving rise to a series of technical, economic and organizational idiosyncrasies. The purpose of this paper is to analyse the underlying factors of the average work productivity in agriculture, forestry and fishing. The analysis takes into account data concerning the economically active population and the gross value added in agriculture, forestry and fishing in Romania during 2008-2011. The decomposition of the average work productivity among the factors affecting it is carried out by means of the u-substitution method.
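
    The u-substitution method itself is not reproduced here; as a generic illustration of deterministic factor decomposition, the classical chain-substitution method splits the change in output Q = L x W (active population times average labour productivity) into a labour effect and a productivity effect. All numbers are hypothetical:

```python
def chain_substitution(l0, w0, l1, w1):
    """Two-factor chain substitution for Q = L * W. Returns the labour effect,
    the productivity effect, and the total change; by construction the two
    effects sum exactly to the total change."""
    effect_labour = (l1 - l0) * w0          # substitute L first, W held at base
    effect_productivity = l1 * (w1 - w0)    # then substitute W at the new L
    total_change = l1 * w1 - l0 * w0
    return effect_labour, effect_productivity, total_change

# Hypothetical: workforce shrinks 100 -> 90 while productivity rises 5 -> 6
labour_eff, prod_eff, total = chain_substitution(100.0, 5.0, 90.0, 6.0)
```

    Note that chain substitution depends on the order in which factors are substituted; methods such as the integral method (and, per the abstract, the u-substitution method) aim to remove that order dependence.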

  15. Radiochemical procedures

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Modern counting instrumentation has largely obviated the need for separation processes in radiochemical analysis, but problems in low-level radioactivity measurement, environmental-type analyses, and special situations have in recent years caused a renaissance of the need for separation techniques. Most radiochemical procedures, based on the classic work of the Manhattan Project chemists of the 1940s, were published in the National Nuclear Energy Series (NNES). Improvements such as new solvent extraction and ion exchange separations have been added to these methods over the years. Recently the Los Alamos group reissued their collected Radiochemical Procedures, containing a short summary and review of basic inorganic chemistry - 'Chemistry of the Elements on the Basis of Electronic Configuration'. (A.L.)

  16. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    Science.gov (United States)

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
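
    A minimal version of parallel analysis can be sketched as follows: compare the eigenvalues of the observed correlation matrix against those of many random normal data sets of the same size, retaining components whose eigenvalue exceeds either the mean or a percentile of the simulated eigenvalues. Data, sizes, and seeds below are hypothetical:

```python
import numpy as np

def parallel_analysis(data, n_sims=200, percentile=95, seed=0):
    """Horn's parallel analysis with PCA eigenvalues of the correlation matrix.
    Returns the number of components whose observed eigenvalue exceeds the
    chosen percentile of eigenvalues from random normal data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        r = rng.standard_normal((n, p))
        sims[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    criterion = np.percentile(sims, percentile, axis=0)
    return int(np.sum(obs > criterion))

# Hypothetical data: six indicators driven by two independent latent factors
rng = np.random.default_rng(1)
latent = rng.standard_normal((300, 2))
data = np.hstack([
    latent[:, [0]] + 0.5 * rng.standard_normal((300, 3)),
    latent[:, [1]] + 0.5 * rng.standard_normal((300, 3)),
])
```

    With the 95th-percentile criterion this synthetic two-factor data set retains two components; setting `percentile=50` roughly corresponds to the mean-eigenvalue criterion compared in the study.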

  17. Protein precipitation: an expedient procedure for the routine analysis of the plasma metabolites of [123I]IBZM

    International Nuclear Information System (INIS)

    Zea-Ponce, Yolanda; Laruelle, Marc

    1999-01-01

    Plasma metabolite analysis of the single photon emission computed tomography (SPECT) D2/D3 receptor radiotracer (S)-(-)-N-[(1-ethyl-2-pyrrolidinyl)methyl]-2-hydroxy-3-[123I]iodo-6-methoxybenzamide ([123I]IBZM) is needed for the equilibrium analysis of SPECT data in brain imaging studies involving the bolus plus constant infusion paradigm. The purpose of these experiments was to find an appropriate procedure to expedite this analysis during routine determinations. The procedure was applied to the plasma analysis of 22 human subjects. Each plasma sample was subjected to acetonitrile protein precipitation. After separation of the pellet, the acetonitrile fraction contained 91%±2% (n=88) of the mixture of labeled metabolites and parent compound. The recovery coefficient of unmetabolized [123I]IBZM, determined with a standard plasma sample, was 95%±2% (n=22). The percent parent compound present in the extracted fraction, measured by high performance liquid chromatography, was 16%±9% (n=85), and the percent metabolites was 84%±9% (n=85). The free fraction (f1, fraction of radiotracer unbound to protein) was 4%±0.8% (n=22). The free fraction of parent was 15%±8% (n=85). The results indicate that acetonitrile protein precipitation is an adequate method for the analysis of [123I]IBZM plasma metabolites

  18. Normalization of RNA-seq data using factor analysis of control genes or samples

    Science.gov (United States)

    Risso, Davide; Ngai, John; Speed, Terence P.; Dudoit, Sandrine

    2015-01-01

    Normalization of RNA-seq data has proven essential to ensure accurate inference of expression levels. Here we show that usual normalization approaches mostly account for sequencing depth and fail to correct for library preparation and other more-complex unwanted effects. We evaluate the performance of the External RNA Control Consortium (ERCC) spike-in controls and investigate the possibility of using them directly for normalization. We show that the spike-ins are not reliable enough to be used in standard global-scaling or regression-based normalization procedures. We propose a normalization strategy, remove unwanted variation (RUV), that adjusts for nuisance technical effects by performing factor analysis on suitable sets of control genes (e.g., ERCC spike-ins) or samples (e.g., replicate libraries). Our approach leads to more-accurate estimates of expression fold-changes and tests of differential expression compared to state-of-the-art normalization methods. In particular, RUV promises to be valuable for large collaborative projects involving multiple labs, technicians, and/or platforms. PMID:25150836
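
    The idea behind RUV can be sketched with plain SVD: estimate the unwanted factors from control genes only, then regress those factors out of all genes. This is an illustrative simplification, not the published RUVg estimator (which handles counts, library size, and negative-control selection more carefully); all names and data are hypothetical:

```python
import numpy as np

def remove_unwanted_factors(log_expr, control_rows, k=1):
    """Estimate k unwanted factors from control genes (rows assumed to be
    biologically constant) via SVD, then project them out of every gene.
    log_expr: genes x samples matrix on a log scale."""
    y = log_expr - log_expr.mean(axis=1, keepdims=True)   # centre each gene
    # Right-singular vectors of the control submatrix = candidate factors
    _, _, vt = np.linalg.svd(y[control_rows], full_matrices=False)
    w = vt[:k].T                       # samples x k, orthonormal columns
    coef = y @ w                       # per-gene loadings on the factors
    return y - coef @ w.T              # residual expression, nuisance removed
```

    In the real method, the controls would be ERCC spike-ins or replicate libraries; here any rows known a priori to carry no biological signal play that role.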

  19. 40 CFR 1065.12 - Approval of alternate procedures.

    Science.gov (United States)

    2010-07-01

    ... Compliance Officer an initial written request describing the alternate procedure and why you believe it is... described in this section, we may ask you to submit to us in writing supplemental information showing that... allowed procedure, considering the following factors: (1) The cost, difficulty, and availability to switch...

  20. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this

  1. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  2. An inter-battery factor analysis of the comrey personality scales and the 16 personality factor questionnaire

    OpenAIRE

    Gideon P. de Bruin

    2000-01-01

    The scores of 700 Afrikaans-speaking university students on the Comrey Personality Scales and the 16 Personality Factor Questionnaire were subjected to an inter-battery factor analysis. This technique uses only the correlations between two sets of variables and reveals only the factors that they have in common. Three of the Big Five personality factors were revealed, namely Extroversion, Neuroticism and Conscientiousness. However, the Conscientiousness factor contained a relatively strong uns...

  3. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  4. Multi-Scale Factor Analysis of High-Dimensional Brain Signals

    KAUST Repository

    Ting, Chee-Ming

    2017-05-18

    In this paper, we develop an approach to modeling high-dimensional networks with a large number of nodes arranged in a hierarchical and modular structure. We propose a novel multi-scale factor analysis (MSFA) model which partitions the massive spatio-temporal data defined over the complex networks into a finite set of regional clusters. To achieve further dimension reduction, we represent the signals in each cluster by a small number of latent factors. The correlation matrix for all nodes in the network is approximated by lower-dimensional sub-structures derived from the cluster-specific factors. To estimate regional connectivity between numerous nodes (within each cluster), we apply principal components analysis (PCA) to produce factors which are derived as the optimal reconstruction of the observed signals under the squared loss. Then, we estimate global connectivity (between clusters or sub-networks) based on the factors across regions, using the RV-coefficient as the cross-dependence measure. This gives a reliable and computationally efficient multi-scale analysis of both the regional and global dependencies of the large networks. The proposed novel approach is applied to estimate brain connectivity networks using functional magnetic resonance imaging (fMRI) data. Results on resting-state fMRI reveal interesting modular and hierarchical organization of human brain networks during rest.
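
    The cross-dependence step can be illustrated with the RV coefficient, a matrix analogue of the squared correlation, applied here to two hypothetical sets of cluster factors (the data, dimensions, and noise level are made up for the example):

```python
import numpy as np

def rv_coefficient(x, y):
    """RV coefficient between two (samples x variables) matrices, ranging
    from 0 (no linear relation) to 1 (identical configurations)."""
    xc = x - x.mean(axis=0)
    yc = y - y.mean(axis=0)
    sxy = xc.T @ yc
    num = np.trace(sxy @ sxy.T)
    den = np.sqrt(np.trace((xc.T @ xc) @ (xc.T @ xc)) *
                  np.trace((yc.T @ yc) @ (yc.T @ yc)))
    return num / den

# Hypothetical cluster factors: y is a scaled, noisy copy of x, so their
# RV coefficient should be close to 1
rng = np.random.default_rng(3)
x = rng.standard_normal((200, 3))
y = 2.0 * x + 0.1 * rng.standard_normal((200, 3))
```

    Because the RV coefficient compares whole factor sets rather than individual time series, it scales to the between-cluster comparisons the abstract describes.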

  5. Ureteroneocystostomy in primary vesicoureteral reflux: critical retrospective analysis of factors affecting the postoperative urinary tract infection rates

    Directory of Open Access Journals (Sweden)

    Hasan Serkan Dogan

    2014-08-01

    Full Text Available Introduction To determine the parameters affecting the outcome of the ureteroneocystostomy (UNC) procedure for vesicoureteral reflux (VUR). Materials and Methods Data of 398 patients who underwent the UNC procedure from 2001 to 2012 were analyzed retrospectively. Different UNC techniques were used according to the laterality of reflux and ureteral orifice configuration. The effects of several parameters on outcome were examined. Disappearance of reflux on control VCUG, or absence of any kind of UTI/symptoms in patients without control VCUG, was considered clinical improvement. Results Mean age at operation was 59.2 ± 39.8 months and follow-up was 25.6 ± 23.3 months. Grade of VUR was 1-2, 3 and 4-5 in 17, 79 and 302 patients, respectively. The male to female ratio was 163/235. UNC was performed bilaterally in 235 patients and an intravesical approach was used in 373 patients. The frequency of voiding dysfunction, scar on preoperative DMSA, breakthrough infection and previous surgery was 28.4%, 70.7%, 49.3% and 22.4%, respectively. Twelve patients (8.9%) with postoperative contralateral reflux were excluded from the analysis. The overall clinical improvement rate for UNC was 92%. Gender, age at diagnosis and operation, laterality and grade of reflux, mode of presentation, breakthrough infections (BTI) under antibiotic prophylaxis, presence of voiding dysfunction and renal scar, and operation technique did not affect the surgical outcome. However, the clinical improvement rate was lower in patients with a history of previous endoscopic intervention (83.9% vs. 94%). The postoperative UTI rate was 27.2%, and the factors affecting the occurrence of postoperative UTI were previous failed endoscopic injection on univariate analysis, and gender, preoperative BTI, postoperative VUR state and voiding dysfunction on multivariate analysis. The surgery-related complication rate was 2% (8/398). These were all low-grade complications (blood transfusion in 1, hematoma under incision in 3 and prolonged

  6. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified
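
    Setting aside the absorption correction that is central to the full method, the basic zeta-factor relations reduce to rho*t*C_i = zeta_i * I_i / D_e, so mass fractions follow by normalisation. The intensities, zeta-factors, and electron dose below are illustrative placeholders, not measured values:

```python
def zeta_quantify(intensities, zetas, dose):
    """Simplified zeta-factor quantification (no absorption correction).
    intensities: characteristic X-ray counts I_i; zetas: sensitivity
    factors zeta_i; dose: total electron dose D_e.
    Returns (mass fractions, mass thickness rho*t)."""
    rho_t_terms = [z * i / dose for z, i in zip(zetas, intensities)]
    rho_t = sum(rho_t_terms)                    # mass thickness rho * t
    fractions = [term / rho_t for term in rho_t_terms]
    return fractions, rho_t

# Two-element example with made-up numbers
fractions, rho_t = zeta_quantify([100.0, 50.0], [2.0, 4.0], 1000.0)
```

    The fractions sum to one by construction; the point of the full method is that the same fit also yields rho*t, which drives the absorption correction the study relies on for light elements.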

  7. Experimental comparison between total calibration factors and components calibration factors of reference dosemeters used in secondary standard laboratory dosemeters

    International Nuclear Information System (INIS)

    Silva, T.A. da.

    1981-06-01

    A quantitative comparison of component calibration factors with the corresponding overall calibration factor was used to evaluate the adopted component calibration procedure with regard to parasitic elements. The judgement of significance is based on the experimental uncertainty of a well-established procedure for determining the overall calibration factor. The experimental results obtained for different ionization chambers and electrometers demonstrate that, for one type of electrometer, the parasitic elements have no influence on its sensitivity within the experimental uncertainty of the calibration procedures. In this case the adopted procedure for determining component calibration factors is considered equivalent to the procedure for determining the overall calibration factor, and thus might be used as a strong quality control measure in routine calibration. (Author) [pt
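
    The consistency check described above amounts to comparing the product of the component factors with the overall factor within the combined relative uncertainty. The factor values and uncertainties below are hypothetical, not taken from the report:

```python
def components_consistent(n_overall, n_chamber, n_electrometer,
                          u_rel_overall, u_rel_components):
    """Return (is_consistent, relative difference) between the product of
    the component calibration factors and the overall calibration factor,
    judged against the combined relative uncertainty (quadrature sum)."""
    n_product = n_chamber * n_electrometer
    rel_diff = abs(n_product - n_overall) / n_overall
    u_combined = (u_rel_overall ** 2 + u_rel_components ** 2) ** 0.5
    return rel_diff <= u_combined, rel_diff

# Hypothetical factors: product 0.98 * 1.045 = 1.0241 vs. overall 1.02,
# with 1% relative uncertainty on each determination
ok, rel_diff = components_consistent(1.02, 0.98, 1.045, 0.01, 0.01)
```

    When the relative difference stays within the combined uncertainty, as here, the component procedure can be treated as equivalent to the overall calibration.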

  8. Centre characteristics and procedure-related factors have an impact on outcomes of allogeneic transplantation for patients with CLL: a retrospective analysis from the European Society for Blood and Marrow Transplantation (EBMT).

    Science.gov (United States)

    Schetelig, Johannes; de Wreede, Liesbeth C; Andersen, Niels S; Moreno, Carol; van Gelder, Michel; Vitek, Antonin; Karas, Michal; Michallet, Mauricette; Machaczka, Maciej; Gramatzki, Martin; Beelen, Dietrich; Finke, Jürgen; Delgado, Julio; Volin, Liisa; Passweg, Jakob; Dreger, Peter; Schaap, Nicolaas; Wagner, Eva; Henseler, Anja; van Biezen, Anja; Bornhäuser, Martin; Iacobelli, Simona; Putter, Hein; Schönland, Stefan O; Kröger, Nicolaus

    2017-08-01

    The best approach for allogeneic haematopoietic stem cell transplantations (alloHCT) in patients with chronic lymphocytic leukaemia (CLL) is unknown. We therefore analysed the impact of procedure- and centre-related factors on 5-year event-free survival (EFS) in a large retrospective study. Data from 684 CLL patients who received a first alloHCT between 2000 and 2011 were analysed by multivariable Cox proportional hazards models with a frailty component to investigate unexplained centre heterogeneity. Five-year EFS of the whole cohort was 37% (95% confidence interval [CI], 34-42%). Larger numbers of CLL alloHCTs (hazard ratio [HR] 0·96, P = 0·002), certification of quality management (HR 0·7, P = 0·045) and a higher gross national income per capita (HR 0·4, P = 0·04) improved EFS. In vivo T-cell depletion (TCD) with alemtuzumab compared to no TCD (HR 1·5, P = 0·03), and a female donor compared to a male donor for a male patient (HR 1·4, P = 0·02), had a negative impact on EFS, whereas non-myeloablative versus more intensive conditioning did not. After correcting for patient-, procedure- and centre-characteristics, significant variation in centre outcomes persisted. In conclusion, further research on the impact of centre and procedural characteristics is warranted. Non-myeloablative conditioning appears to be the preferable approach for patients with CLL. © 2017 John Wiley & Sons Ltd.

  9. Radiochemical procedures and techniques

    International Nuclear Information System (INIS)

    Flynn, K.

    1975-04-01

    A summary is presented of the radiochemical procedures and techniques currently in use by the Chemistry Division Nuclear Chemistry Group at Argonne National Laboratory for the analysis of radioactive samples. (U.S.)

  10. Statistical design of mass spectrometry calibration procedures

    International Nuclear Information System (INIS)

    Bayne, C.K.

    1996-11-01

    The main objective of this task was to agree on calibration procedures to estimate the system parameters (i.e., dead-time correction, ion-counting conversion efficiency, and detector efficiency factors) for SAL's new Finnigan MAT-262 mass spectrometer. SAL will use this mass spectrometer in a clean laboratory, opened in December 1995, to measure uranium and plutonium isotopes in environmental samples. The Finnigan MAT-262 mass spectrometer has a multi-detector system with seven Faraday cup detectors and one ion-counter for the measurement of very small signals (e.g., in the 10^-17 Ampere range). ORNL has made preliminary estimates of the system parameters based on SAL's experimental data measured in late 1994, when the Finnigan instrument was relatively new. SAL generated additional data in 1995 to verify the calibration procedures for estimating the dead-time correction factor, the ion-counting conversion factor and the Faraday cup detector efficiency factors. The system parameters estimated from the present data will have to be re-established when the Finnigan MAT-262 is moved to the new clean laboratory, and different methods will be used to analyse environmental samples than the current measurement methods. For example, the environmental samples will be electroplated on a single filament rather than using the current two-filament system. An outline of the calibration standard operating procedure (SOP) is included.
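
One of the system parameters named above, the dead-time correction, has a standard closed form for a non-paralyzable counter: the true rate is the observed rate divided by (1 - observed_rate × τ). The sketch below illustrates that textbook formula only; it is not taken from the SAL/ORNL calibration procedure itself, and the example rate and dead time are illustrative.

```python
# Non-paralyzable dead-time correction for an ion counter:
# true_rate = observed_rate / (1 - observed_rate * tau)
def dead_time_correct(observed_rate, tau):
    """Correct an observed count rate (counts/s) for dead time tau (s)."""
    loss = observed_rate * tau
    if loss >= 1.0:
        raise ValueError("observed rate saturates the counter")
    return observed_rate / (1.0 - loss)

# Example: 1e5 counts/s with a 20 ns dead time -> ~0.2% upward correction
print(round(dead_time_correct(1e5, 20e-9), 1))
```

At low count rates the correction is tiny, which is why the dead-time factor must be calibrated carefully before small isotope-ratio signals are measured.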

  11. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  12. The Road Towards Lean Six Sigma: Sustainable Success Factors in Service Industry

    Directory of Open Access Journals (Sweden)

    Vouzas Fotis

    2014-11-01

    Full Text Available It has been widely shown that the application of operations management techniques is not only based on technical factors, but is mainly associated with organisational factors such as culture, previous policies and procedures, etc. A prime example of promising operations practices is Lean Six Sigma (L6σ). The main research question for L6σ concerns the liabilities and constraints regarding its implementation. Therefore, this paper aims to explore the critical factors related to the application of L6σ. The context of the analysis is the service industry, since it seems to have been neglected by the literature, which mainly focuses on manufacturing. The methodology was based on the qualitative exploration of three case studies from the service industry. Secondary data were collected through an analysis of companies' documents, written procedures and quality assurance policies, and primary data were collected through a number of in-depth face-to-face interviews with managers and quality experts. The findings show that there are ten (10) particular factors that influence the implementation of L6σ in service organizations.

  13. [Hereditary heterozygous factor VII deficiency in patients undergoing surgery : Clinical relevance].

    Science.gov (United States)

    Woehrle, D; Martinez, M; Bolliger, D

    2016-10-01

    A hereditary deficiency in coagulation factor VII (FVII) may affect the international normalized ratio (INR) value. However, FVII deficiency is occasionally associated with a tendency to bleed spontaneously. We hypothesized that perioperative substitution with coagulation factor concentrates might not be indicated in most patients. In this retrospective data analysis, we included all patients with hereditary heterozygous FVII deficiency who underwent surgical procedures at the University Hospital Basel between December 2010 and November 2015. In addition, by searching the literature, we identified publications reporting patients with FVII deficiency undergoing surgical procedures without perioperative substitution. We identified 22 patients undergoing 46 surgical procedures, resulting in a prevalence of 1:1500-2000. Coagulation factor concentrates were administered during the perioperative period in 15 procedures (33 %), whereas in the other 31 procedures (66 %), FVII deficiency was not substituted. No postoperative bleeding or thromboembolic events were reported. In addition, we found no differences in pre- and postoperative hemoglobin and coagulation parameters, with the exception of an improved postoperative INR value in the substituted group. In the literature review, we identified five publications, including 125 patients with FVII deficiency, undergoing 213 surgical procedures with no perioperative substitution. Preoperative substitution using coagulation factor concentrates does not seem to be mandatory in patients with an FVII level ≥15 %. For decision-making on preoperative substitution, patient history of an increased tendency to bleed may be more important than the FVII level or increased INR value.

  14. Multiple timescale analysis and factor analysis of energy ecological footprint growth in China 1953-2006

    International Nuclear Information System (INIS)

    Chen Chengzhong; Lin Zhenshan

    2008-01-01

    Scientific analysis of energy consumption and its influencing factors is of great importance for energy strategy and policy planning. The energy consumption in China 1953-2006 is estimated by applying the energy ecological footprint (EEF) method, and the fluctuation periods of the annual growth rate of China's per capita EEF (EEF_cpc) are analyzed with the empirical mode decomposition (EMD) method in this paper. EEF intensity is analyzed to depict energy efficiency in China. The main timescales of the 37 factors that affect the annual growth rate of EEF_cpc are also discussed based on EMD and factor analysis methods. Results show three obvious undulation cycles of the annual growth rate of EEF_cpc, i.e., 4.6, 14.4 and 34.2 years over the last 53 years. The analysis of the common synthesized factors of the IMF1, IMF2 and IMF3 timescales of the 37 factors suggests that China's energy policy-makers should attach more importance to stabilizing economic growth, optimizing industrial structure, regulating domestic petroleum exploitation and improving transportation efficiency.

  15. Analysis of related risk factors for pancreatic fistula after pancreaticoduodenectomy

    Directory of Open Access Journals (Sweden)

    Qi-Song Yu

    2016-08-01

    Full Text Available Objective: To explore the related risk factors for pancreatic fistula after pancreaticoduodenectomy, in order to provide theoretical evidence for effectively preventing the occurrence of pancreatic fistula. Methods: A total of 100 patients who were admitted to our hospital from January 2012 to January 2015 and underwent pancreaticoduodenectomy were included in the study. The related risk factors for developing pancreatic fistula were collected for single-factor and logistic multi-factor analysis. Results: Among the included patients, 16 had pancreatic fistula, and the total occurrence rate was 16% (16/100). The single-factor analysis showed that upper abdominal operation history, preoperative bilirubin, pancreatic texture, pancreatic duct diameter, intraoperative amount of bleeding, postoperative hemoglobin, and application of somatostatin after operation were risk factors for developing pancreatic fistula (P<0.05). The multi-factor analysis showed that upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin were independent risk factors for developing pancreatic fistula (OR=4.162, 6.104, 5.613, 4.034, P<0.05). Conclusions: The occurrence of pancreatic fistula after pancreaticoduodenectomy is closely associated with upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin; therefore, effective measures should be taken to reduce the occurrence of pancreatic fistula according to the patients' own conditions.
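
The odds ratios reported in the multi-factor analysis come from cross-tabulating exposure against outcome. As a minimal reminder of the arithmetic, the sketch below computes a crude odds ratio from a 2x2 table; the counts are purely illustrative and are not the study's data.

```python
# Crude odds ratio from a 2x2 table: (a * d) / (b * c), where a/b are
# cases/controls among the exposed and c/d among the unexposed.
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical counts for one factor (e.g. soft pancreatic texture):
# 12 fistulas among 40 soft-texture patients, 4 among 60 firm-texture patients.
print(round(odds_ratio(12, 28, 4, 56), 2))
```

A multivariable logistic model, as used in the study, adjusts such crude ratios for the other factors simultaneously.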

  16. Comparative cost analysis -- computed tomography vs. alternative diagnostic procedures, 1977-1980

    International Nuclear Information System (INIS)

    Gempel, P.A.; Harris, G.H.; Evans, R.G.

    1977-12-01

    In comparing the total national cost of utilizing computed tomography (CT) for medically indicated diagnoses with that of conventional x-ray, ultrasonography, nuclear medicine, and exploratory surgery, this investigation concludes that there was little, if any, added net cost from CT use in 1977, nor will there be in 1980. Computed tomography, generally recognized as a reliable and useful diagnostic modality, has the potential to reduce net costs provided that an optimal number of units can be made available to physicians and patients to achieve projected reductions in alternative procedures. This study examines the actual cost impact of CT on both cranial and body diagnostic procedures. For abdominal and mediastinal disorders, CT scanning is just beginning to emerge as a diagnostic modality. As such, clinical experience is somewhat limited, and the authors assume that no significant reduction in conventional procedures took place in 1977. It is estimated that the approximately 375,000 CT body procedures performed in 1977 represent only a 5 percent cost increase over use of other diagnostic modalities. It is projected that 2,400,000 CT body procedures will be performed in 1980, and, depending on the assumptions used, total body diagnostic costs will increase only slightly or be reduced. Thirty-one tables appear throughout the text presenting cost data broken down by types of diagnostic procedures used and projections by years. Appendixes present technical cost components for diagnostic procedures, the comparative efficacy of CT as revealed in abstracts of published literature, selected medical diagnoses, and references.

  17. An analysis of contingency statements in a DRO procedure: A case report.

    Science.gov (United States)

    Gerow, Stephanie; Rispoli, Mandy; Boles, Margot B; Neely, Leslie C

    2015-06-01

    To examine latency to criterion for reduction of challenging behaviour with and without stating a contingency statement immediately prior to a DRO procedure. An ABAC design in which A was baseline, B was used to evaluate the efficacy of a DRO procedure, and C was used to evaluate the efficacy of a DRO procedure with a contingency statement. The DRO with the contingency statement intervention was associated with a shorter latency to behaviour change than the DRO procedure without the contingency statement. These preliminary findings from this case study highlight the importance of examining the efficiency of behaviour change procedures. Directions for future research are provided.

  18. Two Expectation-Maximization Algorithms for Boolean Factor Analysis

    Czech Academy of Sciences Publication Activity Database

    Frolov, A. A.; Húsek, Dušan; Polyakov, P.Y.

    2014-01-01

    Roč. 130, 23 April (2014), s. 83-97 ISSN 0925-2312 R&D Projects: GA ČR GAP202/10/0262 Grant - others:GA MŠk(CZ) ED1.1.00/02.0070; GA MŠk(CZ) EE.2.3.20.0073 Program:ED Institutional research plan: CEZ:AV0Z10300504 Keywords : Boolean Factor analysis * Binary Matrix factorization * Neural networks * Binary data model * Dimension reduction * Bars problem Subject RIV: IN - Informatics, Computer Science Impact factor: 2.083, year: 2014

  19. Factor analysis of symptom profile in early onset and late onset OCD.

    Science.gov (United States)

    Grover, Sandeep; Sarkar, Siddharth; Gupta, Gourav; Kate, Natasha; Ghosh, Abhishek; Chakrabarti, Subho; Avasthi, Ajit

    2018-04-01

    This study aimed to assess the factor structure of early and late onset OCD. Additionally, cluster analysis was conducted in the same sample to assess the applicability of the factors. 345 participants were assessed with the Yale Brown Obsessive Compulsive Scale symptom checklist. Patients were classified as early onset (onset of symptoms at age ≤ 18 years) and late onset (onset at age > 18 years) OCD depending upon the age of onset of the symptoms. Factor analysis and cluster analysis of early-onset and late-onset OCD were conducted. The study sample comprised 91 early onset and 245 late onset OCD subjects. Males were more common in the early onset group. Differences in the frequency of phenomenology related to contamination-related, checking, repeating, counting and ordering/arranging compulsions were present across the early and late onset groups. Factor analysis of the YBOCS revealed a 3-factor solution for both groups, which largely concurred with each other. These factors were named hoarding and symmetry (factor-1), contamination (factor-2) and aggressive, sexual and religious factor (factor-3). To conclude, this study shows that the factor structure of OCD symptoms seems to be similar between early-onset and late-onset OCD. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan; Spell, Gregory; Carin, Lawrence

    2017-04-20

    We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor-rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks, achieving state-of-the-art results for 2D & 3D inpainting and Caltech 101. The experiments also show that atom-rank impacts both overcompleteness and sparsity.

  1. HETERO code, heterogeneous procedure for reactor calculation; Program Hetero, heterogeni postupak proracuna reaktora

    Energy Technology Data Exchange (ETDEWEB)

    Jovanovic, S M; Raisic, N M [Boris Kidric Institute of Nuclear Sciences Vinca, Beograd (Yugoslavia)

    1966-11-15

    This report describes the procedure for calculating the parameters of a heterogeneous reactor system, taking into account the interaction between fuel elements in a given geometry. The first part contains the analysis of a single fuel element in a diffusion medium, and the criticality condition of the reactor system described by superposition of element interactions. The possibility of performing such analysis by determination of the heterogeneous system lattice is described in the second part. The computer code HETERO, with the code KETAP (calculation of the criticality factor η_n and flux distribution), is part of this report, together with an example of the RB reactor square lattice.

  2. A novel sample preparation procedure for effect-directed analysis of micro-contaminants of emerging concern in surface waters.

    Science.gov (United States)

    Osorio, Victoria; Schriks, Merijn; Vughs, Dennis; de Voogt, Pim; Kolkman, Annemieke

    2018-08-15

    A novel sample preparation procedure relying on Solid Phase Extraction (SPE), combining different sorbent materials on a sequential-based cartridge, was optimized and validated for the enrichment of 117 widely diverse contaminants of emerging concern (CECs) from surface waters (SW) and further combined chemical and biological analysis of the subsequent extracts. A liquid chromatography coupled to high resolution tandem mass spectrometry LC-(HR)MS/MS protocol was optimized and validated for the quantitative analysis of organic CECs in SW extracts. A battery of in vitro CALUX bioassays for the assessment of endocrine, metabolic and genotoxic interference and oxidative stress was performed on the same SW extracts. Satisfactory recoveries ([70-130]%) and precision were achieved, with linearity (> 0.99) over three orders of magnitude. Instrumental limits of detection and method limits of quantification were [1-96] pg injected and [0.1-58] ng/L, respectively, while corresponding intra-day and inter-day precision did not exceed 11% and 20%. The developed procedure was successfully applied for the combined chemical and toxicological assessment of SW intended for drinking water supply. Levels of compounds varied from < 10 ng/L to < 500 ng/L. Endocrine (i.e. estrogenic and anti-androgenic) and metabolic interference responses were observed. Given the demonstrated reliability of the validated sample preparation method, the authors propose its integration in an effect-directed analysis procedure for a proper evaluation of SW quality and hazard assessment of CECs. Copyright © 2018 Elsevier B.V. All rights reserved.
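
The [70-130]% recovery window mentioned above is a common acceptance criterion in method validation: recovery is the measured concentration of a spiked analyte expressed as a percentage of the spiked amount. The sketch below illustrates that check only; the function name and example concentrations are hypothetical, not from the paper.

```python
# Recovery acceptance check as used in SPE method validation:
# recovery (%) = measured / spiked * 100, accepted within [low, high]%.
def recovery_ok(measured_ng_l, spiked_ng_l, low=70.0, high=130.0):
    recovery = 100.0 * measured_ng_l / spiked_ng_l
    return low <= recovery <= high, round(recovery, 1)

print(recovery_ok(85.0, 100.0))   # within the acceptance window
print(recovery_ok(60.0, 100.0))   # fails the window
```

The same pattern extends to precision criteria (e.g. relative standard deviation below a threshold) applied per analyte during validation.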

  3. Cytological preparations for molecular analysis: A review of technical procedures, advantages and limitations for referring samples for testing.

    Science.gov (United States)

    da Cunha Santos, G; Saieg, M A; Troncone, G; Zeppa, P

    2018-04-01

    Minimally invasive procedures such as endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) must yield not only good quality and quantity of material for morphological assessment, but also an adequate sample for analysis of molecular markers to guide patients to appropriate targeted therapies. In this context, cytopathologists worldwide should be familiar with the minimum requirements for referring cytological samples for testing. The present manuscript is a review comprehensively describing the content of the workshop entitled Cytological preparations for molecular analysis: pre-analytical issues for EBUS TBNA, presented at the 40th European Congress of Cytopathology in Liverpool, UK. The present review emphasises the advantages and limitations of different types of cytology substrates used for molecular analysis, such as archival smears, liquid-based preparations, archival cytospin preparations and FTA (Flinders Technology Associates) cards, as well as their technical requirements/features. These various types of cytological specimens can be successfully used for an extensive array of molecular studies, but the quality and quantity of extracted nucleic acids rely directly on adequate pre-analytical assessment of those samples. In this setting, cytopathologists must not only be familiar with the different types of specimens and associated technical procedures, but also correctly handle the material provided by minimally invasive procedures, ensuring that there is a sufficient amount of material for a precise diagnosis and correct management of the patient through personalised care. © 2018 John Wiley & Sons Ltd.

  4. Retrospective analysis of 104 histologically proven adult brainstem gliomas: clinical symptoms, therapeutic approaches and prognostic factors

    International Nuclear Information System (INIS)

    Reithmeier, Thomas; Kuzeawu, Aanyo; Hentschel, Bettina; Loeffler, Markus; Trippel, Michael; Nikkhah, Guido

    2014-01-01

    Adult brainstem gliomas are rare primary brain tumors (<2% of gliomas). The goal of this study was to analyze clinical, prognostic and therapeutic factors in a large series of histologically proven brainstem gliomas. Between 1997 and 2007, 104 patients with a histologically proven brainstem glioma were retrospectively analyzed. Data about the clinical course of disease, neuropathological findings and therapeutic approaches were analyzed. The median age at diagnosis was 41 years (range 18-89 years), median KPS before any operative procedure was 80 (range 20-100) and median survival for the whole cohort was 18.8 months. Histopathological examinations revealed 16 grade I, 31 grade II, 42 grade III and 14 grade IV gliomas. Grading was not possible in 1 patient. Therapeutic concepts differed according to the histopathology of the disease. Median overall survival for grade II tumors was 26.4 months, for grade III tumors 12.9 months and for grade IV tumors 9.8 months. On multivariate analysis, the relative risk of death increased by a factor of 6.7 for KPS ≤ 70, by 1.8 for grade III/IV gliomas, and by 1.7 for age ≥ 40. External beam radiation reduced the risk of death by a factor of 0.4. Adult brainstem gliomas present with a wide variety of neurological symptoms, and postoperative radiation remains the cornerstone of therapy, with no proven benefit of adding chemotherapy. Low KPS, age ≥ 40 and higher tumor grade have a negative impact on overall survival.

  5. Multivariate factor analysis of Girgentana goat milk composition

    Directory of Open Access Journals (Sweden)

    Pietro Giaccone

    2010-01-01

    Full Text Available The interpretation of the several variables that contribute to defining milk quality is difficult due to the high degree of correlation among them. In this case, one of the best methods of statistical processing is factor analysis, which belongs to the multivariate group; for our study this particular statistical approach was employed. A total of 1485 individual goat milk samples from 117 Girgentana goats were collected fortnightly from January to July, and analysed for physical and chemical composition, and clotting properties. Milk pH and titratable acidity were within the normal range for fresh goat milk. Morning milk yield was 704 ± 323 g, with 3.93 ± 1.23% and 3.48 ± 0.38% for fat and protein percentages, respectively. The milk urea content was 43.70 ± 8.28 mg/dl. The clotting ability of Girgentana milk was quite good, with a renneting time equal to 16.96 ± 3.08 minutes, a rate of curd formation of 2.01 ± 1.63 minutes and a curd firmness of 25.08 ± 7.67 millimetres. Factor analysis was performed by applying axis orthogonal rotation (rotation type VARIMAX); the analysis grouped the milk components into three latent or common factors. The first, which explained 51.2% of the total covariance, was defined as “slow milks”, because it was linked to r and pH. The second latent factor, which explained 36.2% of the total covariance, was defined as “milk yield”, because it is positively correlated to the morning milk yield and to the urea content, whilst negatively correlated to the fat percentage. The third latent factor, which explained 12.6% of the total covariance, was defined as “curd firmness”, because it is linked to protein percentage, a30 and titratable acidity. With the aim of evaluating the influence of environmental effects (stage of kidding, parity and type of kidding), factor scores were analysed with the mixed linear model. Results showed significant effects of the season of
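
The percentages of total covariance quoted above (51.2%, 36.2%, 12.6%) are the usual "variance explained" summary of a factor solution: each factor's eigenvalue divided by the sum of the retained eigenvalues. The sketch below shows that arithmetic only; the eigenvalues are illustrative numbers chosen to produce a similar split, not values from the study.

```python
# Proportion of total variance explained by each retained factor,
# computed from (illustrative) factor eigenvalues.
def variance_explained(eigenvalues):
    total = sum(eigenvalues)
    return [round(100.0 * ev / total, 1) for ev in eigenvalues]

print(variance_explained([4.1, 2.9, 1.0]))
```

Rotation (such as VARIMAX) redistributes loadings among factors to aid interpretation, but the total variance explained by the retained factors is unchanged.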

  6. Optimization of instrumental neutron activation analysis method by means of 2^k experimental design technique aiming at the validation of analytical procedures

    International Nuclear Information System (INIS)

    Petroni, Robson; Moreira, Edson G.

    2013-01-01

    In this study, optimization of procedures and standardization of Instrumental Neutron Activation Analysis (INAA) methods were carried out for the determination of the elements arsenic, chromium, cobalt, iron, rubidium, scandium, selenium and zinc in biological materials. The aim is to validate the analytical methods for future accreditation with the National Institute of Metrology, Quality and Technology (INMETRO). The 2^k experimental design was applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. Samples of Mussel Tissue Certified Reference Material and multi-element standards were analyzed considering the following variables: sample decay time, counting time and sample distance to detector. The standard multi-element concentration (comparator standard), mass of the sample and irradiation time were kept constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted for the validation procedure of INAA methods in the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN - CNEN/SP). Optimized conditions were estimated based on the results of z-score tests, main effects and interaction effects. The results obtained with the different experimental configurations were evaluated for accuracy (precision and trueness) for each measurement. (author)
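
A 2^k design, as used above with k = 3 variables (decay time, counting time, detector distance), runs every combination of two coded levels (-1/+1) and estimates each variable's main effect as the difference between the mean response at +1 and at -1. The sketch below is a generic illustration of that construction; the response values are invented, not the study's mass fractions.

```python
# Generic 2^k full-factorial design matrix and main-effect estimates.
from itertools import product

def full_factorial(k):
    """All 2^k runs with factor levels coded -1/+1."""
    return list(product((-1, 1), repeat=k))

def main_effects(design, responses):
    """Main effect of each factor: mean(y at +1) - mean(y at -1)."""
    k = len(design[0])
    n = len(design)
    return [sum(y * run[j] for run, y in zip(design, responses)) / (n / 2)
            for j in range(k)]

design = full_factorial(3)                       # 8 runs for 3 variables
responses = [10, 12, 10, 12, 11, 13, 11, 13]     # illustrative responses
print(main_effects(design, responses))
```

Interaction effects are estimated the same way, using the product of the coded columns in place of a single column.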

  7. Boolean Factor Analysis by Attractor Neural Network

    Czech Academy of Sciences Publication Activity Database

    Frolov, A. A.; Húsek, Dušan; Muraviev, I. P.; Polyakov, P.Y.

    2007-01-01

    Roč. 18, č. 3 (2007), s. 698-707 ISSN 1045-9227 R&D Projects: GA AV ČR 1ET100300419; GA ČR GA201/05/0079 Institutional research plan: CEZ:AV0Z10300504 Keywords : recurrent neural network * Hopfield-like neural network * associative memory * unsupervised learning * neural network architecture * neural network application * statistics * Boolean factor analysis * dimensionality reduction * features clustering * concepts search * information retrieval Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.769, year: 2007

  8. The effects of predictor method factors on selection outcomes: A modular approach to personnel selection procedures.

    Science.gov (United States)

    Lievens, Filip; Sackett, Paul R

    2017-01-01

    Past reviews and meta-analyses typically conceptualized and examined selection procedures as holistic entities. We draw on the product design literature to propose a modular approach as a complementary perspective to conceptualizing selection procedures. A modular approach means that a product is broken down into its key underlying components. Therefore, we start by presenting a modular framework that identifies the important measurement components of selection procedures. Next, we adopt this modular lens for reviewing the available evidence regarding each of these components in terms of affecting validity, subgroup differences, and applicant perceptions, as well as for identifying new research directions. As a complement to the historical focus on holistic selection procedures, we posit that the theoretical contributions of a modular approach include improved insight into the isolated workings of the different components underlying selection procedures and greater theoretical connectivity among different selection procedures and their literatures. We also outline how organizations can put a modular approach into operation to increase the variety in selection procedures and to enhance the flexibility in designing them. Overall, we believe that a modular perspective on selection procedures will provide the impetus for programmatic and theory-driven research on the different measurement components of selection procedures. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. MOOC Success Factors: Proposal of an Analysis Framework

    Directory of Open Access Journals (Sweden)

    Margarida M. Marques

    2017-10-01

    Full Text Available Aim/Purpose: From an idea of lifelong-learning-for-all to a phenomenon affecting higher education, Massive Open Online Courses (MOOCs) can be the next step to a truly universal education. Indeed, MOOC enrolment rates can be astoundingly high; still, their completion rates are frequently disappointingly low. Nevertheless, as courses, the participants’ enrolment and learning within the MOOCs must be considered when assessing their success. In this paper, the authors’ aim is to reflect on what makes a MOOC successful and to propose an analysis framework of MOOC success factors. Background: A literature review was conducted to identify reported MOOC success factors and to propose an analysis framework. Methodology: This literature-based framework was tested against data of a specific MOOC and refined, within a qualitative interpretivist methodology. The data were collected from the ‘As alterações climáticas nos média escolares - Clima@EduMedia’ course, which was developed by the project Clima@EduMedia and was submitted to content analysis. This MOOC aimed to support science and school media teachers in the use of media to teach climate change. Contribution: By proposing a MOOC success factors framework, the authors attempt to help fill a literature gap concerning criteria for considering a specific MOOC successful. Findings: This work’s major finding is a literature-based and empirically-refined MOOC success factors analysis framework. Recommendations for Practitioners: The proposed framework is also a set of best practices relevant to MOOC developers, particularly when targeting teachers as potential participants. Recommendation for Researchers: This work’s relevance is also based on its contribution to increasing empirical research on MOOCs. Impact on Society: By providing a proposal of a framework on factors to make a MOOC successful, the authors hope to contribute to the quality of MOOCs. Future Research: Future

  10. Red blood cell transfusion probability and associated costs in neurosurgical procedures.

    Science.gov (United States)

    Barth, Martin; Weiss, Christel; Schmieder, Kirsten

    2018-03-20

    The extent of red blood cell (RBC) units needed for different neurosurgical procedures and the time point of their administration are widely unknown, which results in generous cross-matching prior to surgery. However, RBC are increasingly requested in aging western populations, while blood donations are significantly reduced. Therefore, knowledge of the extent and time point of RBC administration is of major importance. This is a retrospective single-center analysis. The incidence of RBC transfusion during surgery or within 48 h after surgery was analyzed for all neurosurgical patients over a 3-year period. Costs for cross-matched and transfused RBC were calculated, and risk factors for RBC transfusion were analyzed. The risk of intraoperative RBC administration was low for spinal and intracranial tumor resections (1.87%) and exceeded 10% only in spinal fusion procedures. This risk depended on the number of fused segments, with an intraoperative transfusion risk of > 12.5% when more than three levels were fused. Multiple logistic regression analysis showed a significantly increased risk of RBC transfusion for female gender (p = 0.006; OR 1.655), higher age (N = 4812; p < 0.0001; OR 1.028), and number of fused segments (N = 737; p < 0.0001; OR 1.433). Annual costs were 783,820.88 USD for cross-matching and 121,322.13 USD for intraoperative RBC administration. Neurosurgical procedures require few RBC intraoperatively. Only elective spine fusion procedures involving ≥ 3 levels and AVM resections seem to require cross-matching of RBC. The present data may allow changing the preoperative algorithm of RBC cross-matching in neurosurgical procedures and help to save resources and costs.
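
The reported odds ratios imply a multiplicative logistic risk model; a minimal sketch of how such a model combines them is shown below. The intercept (baseline log-odds) is not reported in the abstract, so the value used here is a purely illustrative assumption, as are the example patient profiles.

```python
import math

# Odds ratios reported in the abstract (per unit of each predictor)
OR_FEMALE = 1.655    # female vs. male
OR_AGE = 1.028       # per year of age
OR_SEGMENT = 1.433   # per fused segment

# The baseline log-odds (intercept) is NOT reported in the abstract;
# the value below is a purely illustrative assumption.
INTERCEPT = -5.0

def transfusion_probability(female: bool, age: float, fused_segments: int) -> float:
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum(log(OR_i) * x_i))))."""
    log_odds = (INTERCEPT
                + math.log(OR_FEMALE) * (1 if female else 0)
                + math.log(OR_AGE) * age
                + math.log(OR_SEGMENT) * fused_segments)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Risk rises with age and with the number of fused segments
low = transfusion_probability(female=False, age=40, fused_segments=0)
high = transfusion_probability(female=True, age=75, fused_segments=4)
print(f"low-risk profile: {low:.3f}, high-risk profile: {high:.3f}")
```

Note that with a different assumed intercept the absolute probabilities change, but the ordering of patient profiles implied by the odds ratios does not.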

  11. Analysis of plant gums and saccharide materials in paint samples: comparison of GC-MS analytical procedures and databases.

    Science.gov (United States)

    Lluveras-Tenorio, Anna; Mazurek, Joy; Restivo, Annalaura; Colombini, Maria Perla; Bonaduce, Ilaria

    2012-10-10

    Saccharide materials have been used for centuries as binding media, to paint, write and illuminate manuscripts, and to apply metallic leaf decorations. Although the technical literature often reports on the use of plant gums as binders, in practice several other saccharide materials can be encountered in paint samples, not only as major binders but also as additives. The literature describes a variety of GC-MS analytical procedures for characterizing saccharide materials in paint samples; however, their chromatographic profiles are often extremely different, making it impossible to compare them and reliably identify the paint binder. This paper presents a comparison between two different GC-MS-based analytical procedures for the analysis of saccharide materials in works of art. The research presented here evaluates the influence of the analytical procedure used and its impact on the sugar profiles obtained from the analysis of paint samples that contain saccharide materials. The procedures have been developed, optimised and systematically used to characterise plant gums at the Getty Conservation Institute in Los Angeles, USA (GCI) and at the Department of Chemistry and Industrial Chemistry of the University of Pisa, Italy (DCCI). The main steps of the analytical procedures and their optimisation are discussed. The results presented highlight that the two methods give comparable sugar profiles, whether the samples analysed are simple raw materials, pigmented and unpigmented paint replicas, or paint samples collected from centuries-old polychrome art objects. A common database of sugar profiles of reference materials commonly found in paint samples was thus compiled. The database also includes materials that contain only a minor saccharide fraction. This database highlights how many sources of saccharides can be found in a paint sample, representing an important step forward in the problem of identifying polysaccharide binders in

  12. Ultra-high performance supercritical fluid chromatography-mass spectrometry procedure for analysis of monosaccharides from plant gum binders.

    Science.gov (United States)

    Pauk, Volodymyr; Pluháček, Tomáš; Havlíček, Vladimír; Lemr, Karel

    2017-10-09

    An ultra-high performance supercritical fluid chromatography-mass spectrometry (UHPSFC/MS) procedure for the analysis of native monosaccharides was developed. Chromatographic conditions were investigated to separate a mixture of four hexoses, three pentoses, two deoxyhexoses and two uronic acids. Increasing the water content of the methanol modifier to 5% and the formic acid content to 4% improved the peak shapes of neutral monosaccharides and allowed complete elution of the highly polar uronic acids in a single run. An Acquity HSS C18SB column outperformed the three other stationary phases tested (BEH (silica), BEH 2-ethylpyridine, CSH Fluoro-Phenyl) in terms of separation of isomers and analysis time (4.5 min). Limits of detection were in the range 0.01-0.12 ng μL⁻¹. Owing to the separation of anomers, identification of the critical pairs (arabinose-xylose and glucose-galactose) was possible. The feasibility of the new method was demonstrated on plant-derived polysaccharide binders. Samples of watercolor paints, painted paper and three plant gums widely encountered in painting media (Arabic, cherry and tragacanth) were decomposed prior to analysis by microwave-assisted hydrolysis at 40 bar initial pressure using 2 mol L⁻¹ trifluoroacetic acid. Among the tested temperatures, 120 °C ensured appropriate hydrolysis efficiency for the different types of gum and avoided excessive degradation of labile monosaccharides. The procedure recovery, tested on gum Arabic, was 101% with an RSD below 8%. Aqueous hydrolysates containing monosaccharides in ratios specific to each type of plant gum were diluted or analyzed directly. Filtration of samples before hydrolysis reduced interferences from the paper support, and identification of gum Arabic in watercolor-painted paper samples was demonstrated. Successful identification of pure gum Arabic was confirmed for sample quantities as small as 1 μg. Two classification approaches were compared, and principal component analysis was superior to analysis based on peak area
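
The PCA-based classification mentioned in the abstract can be sketched as nearest-reference matching in principal-component space. The monosaccharide profiles below are illustrative placeholders, not measured values from the study; only the overall approach (project sugar profiles, assign the unknown hydrolysate to the nearest reference gum) follows the described idea.

```python
import numpy as np

# Illustrative (not measured) monosaccharide peak-area fractions for three
# reference gums: columns = [arabinose, xylose, galactose, glucose,
# rhamnose, uronic acids]
reference_profiles = np.array([
    [0.40, 0.03, 0.35, 0.02, 0.12, 0.08],  # gum Arabic (assumed values)
    [0.45, 0.08, 0.20, 0.05, 0.05, 0.17],  # cherry gum (assumed values)
    [0.20, 0.25, 0.15, 0.10, 0.05, 0.25],  # tragacanth (assumed values)
])
labels = ["arabic", "cherry", "tragacanth"]

# PCA via SVD of the mean-centred profile matrix
mean = reference_profiles.mean(axis=0)
X = reference_profiles - mean
_, _, Vt = np.linalg.svd(X, full_matrices=False)
components = Vt[:2]                       # first two principal axes

ref_scores = X @ components.T             # reference gums in PC space
unknown = np.array([[0.41, 0.04, 0.33, 0.03, 0.11, 0.08]])
unknown_score = (unknown - mean) @ components.T

# Assign the unknown hydrolysate to the nearest reference gum
distances = np.linalg.norm(ref_scores - unknown_score, axis=1)
classified = labels[int(np.argmin(distances))]
print("classified as:", classified)
```

In practice the reference set would hold many replicate profiles per gum, and the unknown's distances would be compared against within-gum scatter rather than a single point.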

  13. DISRUPTIVE EVENT BIOSPHERE DOSE CONVERSION FACTOR ANALYSIS

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The Biosphere Model Report (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objective of this analysis was to develop the BDCFs for the volcanic

  14. Probabilistic safety evaluation: Development of procedures with applications on components used in nuclear power plants

    International Nuclear Information System (INIS)

    Dillstroem, P.

    2000-12-01

    A probabilistic procedure has been developed by SAQ Kontroll AB to calculate two different failure probabilities, P_F: the probability of failure with the defect size given by NDT/NDE, and the probability of failure when a defect is not detected by NDT/NDE. Based on the procedure, SAQ Kontroll AB has developed a computer program, PROPSE (PRObabilistic Program for Safety Evaluation). PROPSE implements two different algorithms for calculating the probability of failure: simple Monte Carlo simulation (MCS), with an error estimate on P_F, and the first-order reliability method (FORM), with sensitivity factors derived from the most probable point of failure in standard normal space. Using these factors, it is possible to rank the parameters within an analysis. PROPSE also estimates partial safety factors, given a target failure probability and characteristic values for fracture toughness, yield strength, tensile strength and defect depth. Extensive validation has been carried out using the probabilistic computer program STAR6 from Nuclear Electric and the deterministic program SACC from SAQ Kontroll AB. The validation showed that the results from PROPSE were correct, and that the algorithms used in STAR6 were not intended to work for a general problem when the standard deviation is either 'small' or 'large'. Distributions to be used in a probabilistic analysis are discussed, and examples of input data are given
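
The "simple Monte Carlo simulation with an error estimate on P_F" described in this record can be sketched as follows. The limit-state function and the input distributions below are illustrative assumptions, not PROPSE's actual fracture-mechanics model.

```python
import math
import random

random.seed(1)  # reproducible estimate

def limit_state(fracture_toughness: float, applied_k: float) -> float:
    """Failure when g < 0, i.e. the applied stress intensity exceeds the toughness."""
    return fracture_toughness - applied_k

# Assumed, purely illustrative input distributions (NOT PROPSE's actual inputs):
# fracture toughness ~ Normal(100, 10), applied stress intensity ~ Normal(70, 15).
N = 200_000
failures = 0
for _ in range(N):
    k_ic = random.gauss(100.0, 10.0)
    k_app = random.gauss(70.0, 15.0)
    if limit_state(k_ic, k_app) < 0.0:
        failures += 1

p_f = failures / N
# Error estimate on P_F: standard error of the binomial proportion
std_err = math.sqrt(p_f * (1.0 - p_f) / N)
print(f"P_F ~= {p_f:.4f} +/- {std_err:.4f}")
```

The binomial standard error shrinks as 1/sqrt(N), which is why plain MCS becomes expensive for very small failure probabilities and why methods such as FORM are attractive alternatives.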

  15. Probabilistic safety evaluation: Development of procedures with applications on components used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, P. [Det Norske Veritas AB, Stockholm (Sweden)

    2000-12-01

    A probabilistic procedure has been developed by SAQ Kontroll AB to calculate two different failure probabilities, P_F: the probability of failure with the defect size given by NDT/NDE, and the probability of failure when a defect is not detected by NDT/NDE. Based on the procedure, SAQ Kontroll AB has developed a computer program, PROPSE (PRObabilistic Program for Safety Evaluation). PROPSE implements two different algorithms for calculating the probability of failure: simple Monte Carlo simulation (MCS), with an error estimate on P_F, and the first-order reliability method (FORM), with sensitivity factors derived from the most probable point of failure in standard normal space. Using these factors, it is possible to rank the parameters within an analysis. PROPSE also estimates partial safety factors, given a target failure probability and characteristic values for fracture toughness, yield strength, tensile strength and defect depth. Extensive validation has been carried out using the probabilistic computer program STAR6 from Nuclear Electric and the deterministic program SACC from SAQ Kontroll AB. The validation showed that the results from PROPSE were correct, and that the algorithms used in STAR6 were not intended to work for a general problem when the standard deviation is either 'small' or 'large'. Distributions to be used in a probabilistic analysis are discussed, and examples of input data are given.

  16. Investigation of Deterioration Behavior of Hysteretic Loops in Nonlinear Static Procedure Analysis of Concrete Structures with Shear Walls

    International Nuclear Information System (INIS)

    Ghodrati Amiri, G.; Amidi, S.; Khorasani, M.

    2008-01-01

    In recent years, seismic rehabilitation of structures has advanced, and the design viewpoint has shifted from providing sufficient strength to achieving target structural performance (Performance-Based Design) in order to produce a safe design. Nonlinear Static Procedure (NSP) analysis, or pushover analysis, is a newer method chosen for its speed and simplicity of calculation; the 'Seismic Rehabilitation Code for Existing Buildings' and FEMA 356 cover this method. The result of this analysis is a target displacement, which is the basis of the performance evaluation and rehabilitation procedure for the structure. Accurate estimation of that displacement would improve the usefulness of pushover analysis. At present, the Nonlinear Dynamic Procedure (NDP) is the only method that can apply seismic ground motions exactly; however, because it is time-consuming, costly and more difficult than other methods, it is not used as widely as the NSP. One coefficient used in the NSP for determining the target displacement is C2 (the stiffness and strength degradation coefficient), which corrects for the errors introduced by neglecting stiffness and strength degradation in the hysteretic loops. In this study, three concrete frames with shear walls, designed according to the Iranian Standard 2800 (3rd edition), were analyzed under several ground-motion records scaled according to FEMA 273 and FEMA 356. Finally, after pushover analysis and comparison of the results with dynamic analysis, the calculated C2 was compared with the values given in the rehabilitation codes

  17. Analysis of the IEA-R1 reactor start-up procedures - an application of the HazOp method

    International Nuclear Information System (INIS)

    Sauer, Maria Eugenia Lago Jacques

    2000-01-01

    An analysis of technological catastrophic events that took place in this century shows that human failure and the vulnerability of risk management programs are the main causes of accidents. For example, in plants and complex systems where the man-machine interface is close, the frequency of failures tends to be higher. Thus, a comprehensive knowledge of how a specific process can be potentially hazardous is a sine qua non condition for operator training, as well as for defining and implementing more efficient plans for loss prevention and risk management. A study of the IEA-R1 research reactor start-up procedures was carried out based upon the Hazard and Operability Study (HazOp) methodology. The analytical, qualitative, multidisciplinary HazOp approach provided the means for a comprehensive review of the reactor start-up procedures, contributing to a better understanding of the potential hazards associated with deviations in performing this routine. The present work includes a historical summary and a detailed description of the HazOp technique, as well as case studies in the process industries and the use of expert systems in the application of the method. An analysis of 53 activities of the IEA-R1 reactor start-up procedures was made, resulting in 25 recommended changes covering design, operation and safety aspects of the reactor. Eleven recommendations have been implemented. (author)

  18. Capital Cost Optimization for Prefabrication: A Factor Analysis Evaluation Model

    Directory of Open Access Journals (Sweden)

    Hong Xue

    2018-01-01

    Full Text Available High capital cost is a significant hindrance to the promotion of prefabrication. In order to optimize cost management and reduce capital cost, this study aims to explore the latent factors and a factor analysis evaluation model. Semi-structured interviews were conducted to explore potential variables, and a questionnaire survey was then employed to collect professionals’ views on their effects. After data collection, exploratory factor analysis was adopted to identify the latent factors. Seven latent factors were identified: “Management Index”, “Construction Dissipation Index”, “Productivity Index”, “Design Efficiency Index”, “Transport Dissipation Index”, “Material Increment Index” and “Depreciation Amortization Index”. With these latent factors, a factor analysis evaluation model (FAEM), divided into a factor analysis model (FAM) and a comprehensive evaluation model (CEM), was established. The FAM was used to explore the effect of the observed variables on the high capital cost of prefabrication, while the CEM was used to evaluate the comprehensive cost management level of prefabrication projects. Case studies were conducted to verify the models. The results revealed that collaborative management had a positive effect on the capital cost of prefabrication, and that material increment costs and labor costs had significant impacts on production cost. This study demonstrated the potential of on-site management and standardized design to reduce capital cost. Hence, collaborative management is necessary for the cost management of prefabrication, and innovation and detailed design are needed to improve cost performance. New forms of precast component factories can be explored to reduce transportation cost, and targeted strategies can be adopted for different prefabrication projects. The findings optimize capital cost and improve cost performance by providing an evaluation and optimization model, which helps managers to
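
The exploratory-factor-analysis step described in this record can be sketched on synthetic data. The variable structure, loadings and factor labels below are illustrative assumptions, not the study's survey data; the extraction shown is a principal-axis-style eigendecomposition of the correlation matrix with the Kaiser (eigenvalue > 1) retention rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic questionnaire data (illustrative, not the study's survey):
# six observed cost variables driven by two latent factors plus noise.
n = 300
latent = rng.normal(size=(n, 2))
true_loadings = np.array([
    [0.9, 0.0], [0.8, 0.1], [0.7, 0.0],   # variables loading on factor 1
    [0.0, 0.9], [0.1, 0.8], [0.0, 0.7],   # variables loading on factor 2
])
data = latent @ true_loadings.T + 0.4 * rng.normal(size=(n, 6))

# Principal-axis-style extraction: eigendecomposition of the correlation matrix
corr = np.corrcoef(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]          # sort eigenvalues descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Kaiser criterion: retain factors with eigenvalue > 1
k = int(np.sum(eigvals > 1.0))
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
print("factors retained:", k)
```

With the two-factor structure built into the synthetic data, two eigenvalues exceed 1 and the recovered loading matrix separates the two variable groups, mirroring how the study's seven latent cost indices would emerge from real questionnaire responses.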

  19. 75 FR 32295 - Expedited Approval of Alternative Test Procedures for the Analysis of Contaminants Under the Safe...

    Science.gov (United States)

    2010-06-08

    ... Approval of Alternative Test Procedures for the Analysis of Contaminants Under the Safe Drinking Water Act... methods for use in measuring the levels of contaminants in drinking water and determining compliance with... required to measure contaminants in drinking water samples. In addition, EPA Regions as well as States and...

  20. Collected radiochemical and geochemical procedures

    Energy Technology Data Exchange (ETDEWEB)

    Kleinberg, J [comp.

    1990-05-01

    This revision of LA-1721, 4th Ed., Collected Radiochemical Procedures, reflects the activities of two groups in the Isotope and Nuclear Chemistry Division of the Los Alamos National Laboratory: INC-11, Nuclear and Radiochemistry; and INC-7, Isotope Geochemistry. The procedures fall into five categories: I. Separation of Radionuclides from Uranium, Fission-Product Solutions, and Nuclear Debris; II. Separation of Products from Irradiated Targets; III. Preparation of Samples for Mass Spectrometric Analysis; IV. Dissolution Procedures; and V. Geochemical Procedures. With one exception, the first category of procedures is ordered by the positions of the elements in the Periodic Table, with separate parts on the Representative Elements (the A groups); the d-Transition Elements (the B groups and the Transition Triads); and the Lanthanides (Rare Earths) and Actinides (the 4f- and 5f-Transition Elements). The members of Group IIIB (scandium, yttrium, and lanthanum) are included with the lanthanides, elements they resemble closely in chemistry and with which they occur in nature. The procedures dealing with the isolation of products from irradiated targets are arranged by target element.