WorldWideScience

Sample records for factor analysis demonstrated

  1. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    Directory of Open Access Journals (Sweden)

    James Baglin

    2014-06-01

    Full Text Available Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many guidelines have been proposed with the aim of improving application. Unfortunately, implementing recommended EFA practices has been restricted by the range of options available in commercial statistical packages and, perhaps, by an absence of clear, practical 'how-to' demonstrations. Consequently, this article describes the application of methods recommended to get the most out of your EFA. The article focuses on dealing with the common situation of analysing ordinal data as derived from Likert-type scales. These methods are demonstrated using the free, stand-alone, easy-to-use and powerful EFA package FACTOR (http://psico.fcep.urv.es/utilitats/factor/; Lorenzo-Seva & Ferrando, 2006). The demonstration applies the recommended techniques to an accompanying dataset based on the Big 5 personality test. The outcomes obtained by the EFA using the recommended procedures through FACTOR are compared to the default techniques currently available in SPSS.
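
    As a rough illustration of the workflow the article recommends, the sketch below runs an exploratory factor analysis in Python with the factor_analyzer package instead of FACTOR or SPSS; the five-factor target, the oblimin rotation, and the simulated Likert responses are assumptions for demonstration only, and the polychoric-correlation step advised for ordinal items is approximated by the package's default Pearson correlations.

    ```python
    # Sketch: exploratory factor analysis of Likert-type items (assumed 5-factor structure).
    # Recommended practice for ordinal data uses polychoric correlations and parallel analysis;
    # this sketch approximates that workflow with the factor_analyzer package.
    import numpy as np
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    rng = np.random.default_rng(0)
    items = pd.DataFrame(rng.integers(1, 6, size=(300, 25)),        # 300 respondents, 25 ordinal items
                         columns=[f"item{i+1}" for i in range(25)])

    fa = FactorAnalyzer(n_factors=5, method="minres", rotation="oblimin")  # oblique rotation, as commonly advised
    fa.fit(items)

    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    print(loadings.round(2))
    print("Proportional variance per factor:", fa.get_factor_variance()[1].round(3))
    ```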

  2. Unbiased proteomics analysis demonstrates significant variability in mucosal immune factor expression depending on the site and method of collection.

    Directory of Open Access Journals (Sweden)

    Kenzie M Birse

    Full Text Available Female genital tract secretions are commonly sampled by lavage of the ectocervix and vaginal vault or via a sponge inserted into the endocervix for evaluating inflammation status and immune factors critical for HIV microbicide and vaccine studies. This study uses a proteomics approach to comprehensively compare the efficacy of these methods, which sample from different compartments of the female genital tract, for the collection of immune factors. Matching sponge and lavage samples were collected from 10 healthy women and were analyzed by tandem mass spectrometry. Data were analyzed by a combination of differential protein expression analysis, hierarchical clustering and pathway analysis. Of the 385 proteins identified, endocervical sponge samples collected nearly twice as many unique proteins as cervicovaginal lavage (111 vs. 61), with 55% of proteins common to both (213). Each method/site identified 73 unique proteins that have roles in host immunity according to their gene ontology. Sponge samples enriched for specific inflammation pathways including acute phase response proteins (p = 3.37×10⁻²⁴) and LXR/RXR immune activation pathways (p = 8.82×10⁻²²), while the role of IL-17A in psoriasis pathway (p = 5.98×10⁻⁴) and the complement system pathway (p = 3.91×10⁻³) were enriched in lavage samples. Many host defense factors were differentially enriched (p<0.05) between sites, including known/potential antimicrobial factors (n = 21), S100 proteins (n = 9), and immune regulatory factors such as serpins (n = 7). Immunoglobulins (n = 6) were collected at comparable levels of abundance at each site, although 25% of those identified were unique to sponge samples. This study demonstrates significant differences in the types and quantities of immune factors and inflammation pathways collected by each sampling technique. Therefore, clinical studies that measure mucosal immune activation or factors assessing HIV transmission should utilize

  3. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  4. Environmental analysis for pipeline gas demonstration plants

    Energy Technology Data Exchange (ETDEWEB)

    Stinton, L.H.

    1978-09-01

    The Department of Energy (DOE) has implemented programs for encouraging the development and commercialization of coal-related technologies, which include coal gasification demonstration-scale activities. In support of commercialization activities the Environmental Analysis for Pipeline Gas Demonstration Plants has been prepared as a reference document to be used in evaluating potential environmental and socioeconomic effects from construction and operation of site- and process-specific projects. Effluents and associated impacts are identified for six coal gasification processes at three contrasting settings. In general, impacts from construction of a high-Btu gas demonstration plant are similar to those caused by the construction of any chemical plant of similar size. The operation of a high-Btu gas demonstration plant, however, has several unique aspects that differentiate it from other chemical plants. Offsite development (surface mining) and disposal of large quantities of waste solids constitute important sources of potential impact. In addition, air emissions require monitoring for trace metals, polycyclic aromatic hydrocarbons, phenols, and other emissions. Potential biological impacts from long-term exposure to these emissions are unknown, and additional research and data analysis may be necessary to determine such effects. Possible effects of pollutants on vegetation and human populations are discussed. The occurrence of chemical contaminants in liquid effluents and the bioaccumulation of these contaminants in aquatic organisms may lead to adverse ecological impact. Socioeconomic impacts are similar to those from a chemical plant of equivalent size and are summarized and contrasted for the three surrogate sites.

  5. Demonstration sensitivity analysis for RADTRAN III

    International Nuclear Information System (INIS)

    Neuhauser, K.S.; Reardon, P.C.

    1986-10-01

    A demonstration sensitivity analysis was performed to: quantify the relative importance of 37 variables to the total incident-free dose; assess the elasticity of seven dose subgroups to those same variables; develop density distributions of accident dose for combinations of accident data under wide-ranging variations; show the relationship between accident consequences and probabilities of occurrence; and develop limits for the variability of probability-consequence curves.
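
    The notion of elasticity used in this record (the fractional change in dose per fractional change in an input) can be sketched with a one-at-a-time perturbation; the dose function and base-case values below are invented stand-ins, not the RADTRAN III model.

    ```python
    # Sketch: one-at-a-time elasticity of a dose model with respect to its input variables.
    # The dose function is a hypothetical stand-in; elasticity is
    # (fractional change in dose) / (fractional change in input) about a base case.
    def dose(params):
        # Purely illustrative incident-free dose model.
        return (params["shipments"] * params["dose_rate"] * params["stop_time"]
                / params["velocity"])

    base = {"shipments": 100.0, "dose_rate": 0.1, "stop_time": 2.0, "velocity": 80.0}

    for name in base:
        perturbed = dict(base)
        perturbed[name] *= 1.01                      # +1% perturbation
        elasticity = ((dose(perturbed) - dose(base)) / dose(base)) / 0.01
        print(f"{name:10s} elasticity ≈ {elasticity:+.2f}")
    ```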

  6. Reliability demonstration test planning using bayesian analysis

    International Nuclear Information System (INIS)

    Chandran, Senthil Kumar; Arul, John A.

    2003-01-01

    In Nuclear Power Plants, the reliability of all the safety systems is very critical from the safety viewpoint and it is very essential that the required reliability requirements be met while satisfying the design constraints. From practical experience, it is found that the reliability of complex systems such as the Safety Rod Drive Mechanism is of the order of 10⁻⁴ with an uncertainty factor of 10. To demonstrate the reliability of such systems is prohibitive in terms of cost and time, as the number of tests needed is very large. The purpose of this paper is to develop a Bayesian reliability demonstration testing procedure, for exponentially distributed failure times with a gamma prior distribution on the failure rate, which can be easily and effectively used to demonstrate component/subsystem/system reliability conformance to stated requirements. The important questions addressed in this paper are: with zero failures, how long should the tests be performed and how many components are required to conclude, with a given degree of confidence, that the component under test meets the reliability requirement? The procedure is explained with an example. It can also be extended to demonstrations involving a larger number of failures. The approach presented is applicable for deriving test plans for demonstrating component failure rates of nuclear power plants, as failure data for similar components are becoming available from existing plants elsewhere. The advantages of this procedure are that the criterion upon which it is based is simple and pertinent, that the fitting of the prior distribution is an integral part of the procedure and is based on information regarding two percentiles of this distribution, and, finally, that the procedure is straightforward and easy to apply in practice. (author)
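
    A minimal sketch of the zero-failure test-time calculation described above, under the stated assumptions of exponentially distributed failure times and a gamma prior on the failure rate; the prior parameters, required failure rate, and confidence level are illustrative choices, not values from the paper.

    ```python
    # Sketch: Bayesian zero-failure demonstration test plan.
    # Assumptions (illustrative): exponential failure times, Gamma(a, rate=b) prior on the
    # failure rate, requirement lam_req to be demonstrated with the stated confidence.
    from scipy.stats import gamma
    from scipy.optimize import brentq

    a, b = 0.5, 1.0e3          # prior shape and rate (unit-hours)
    lam_req = 1.0e-4           # required failure rate per hour, order 1e-4 as in the paper
    confidence = 0.90

    def posterior_prob_met(total_test_time):
        # With zero failures in total_test_time, the posterior is Gamma(a, rate = b + T).
        return gamma.cdf(lam_req, a, scale=1.0 / (b + total_test_time)) - confidence

    # Total failure-free test exposure needed (can be shared among several units).
    T = brentq(posterior_prob_met, 1.0, 1.0e9)
    print(f"Required failure-free test exposure: {T:,.0f} unit-hours")
    print(f"e.g. {T / 50:,.0f} hours on each of 50 components")
    ```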

  7. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction: Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis. Mathematical Foundations for Factor Analysis: Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions. Composite Variables and Linear Transformations: Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...

  8. WIS decontamination factor demonstration test with radioactive nuclides

    International Nuclear Information System (INIS)

    Kanbe, Hiromi; Mayuzumi, Masami; Ono, Tetsuo; Nagae, Madoka; Sekiguchi, Ryosaku; Takaoku, Yoshinobu.

    1987-01-01

    A radioactive Waste Incineration System (WIS) with suspension combustion is regarded as an effective volume-reduction technology for low-level radioactive wastes, which are increasing every year. In order to demonstrate the decontamination efficiency of the ceramic filter used in the WIS, this test was carried out at the test facilities as joint research of the Central Research Institute of Electric Power Industry (CRIEPI) and Sumitomo Heavy Industries, Ltd. Miscellaneous combustible waste and powdered resin, to which 5 nuclides (Mn-54, Fe-59, Co-60, Zn-65, Cs-137) were added, were used as samples for incineration. As a result of the test, it was verified that the Decontamination Factor (DF) of the single-stage ceramic filter was usually kept over 10⁵ for every nuclide, and from these DF results, over 10⁸ is expected for a real commercial plant as a total system. It is therefore concluded that the off-gas clean-up system of the WIS, composed of only a single stage of ceramic filter, is capable of sufficiently efficient decontamination of the exhaust gas released to the stack. (author)

  9. On-farm demonstrations: consideration factors for their success and ...

    African Journals Online (AJOL)

    ... long been a key hallmark of program delivery and teaching in extension work. ... This study resulted in the development of both the Advantages and Disadvantages associated with on-farm demonstrations ...

  10. Transforming Rubrics Using Factor Analysis

    Science.gov (United States)

    Baryla, Ed; Shelley, Gary; Trainor, William

    2012-01-01

    Student learning and program effectiveness are often assessed using rubrics. While much time and effort may go into their creation, it is equally important to assess how effective and efficient the rubrics actually are in terms of measuring competencies over a number of criteria. This study demonstrates the use of common factor analysis to identify…

  11. Factor analysis and scintigraphy

    International Nuclear Information System (INIS)

    Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.

    1976-01-01

    The goal of factor analysis is usually to achieve a reduction of a large set of data, extracting essential features without a prior hypothesis. With the development of computerized systems, the use of larger samples, the possibility of sequential data acquisition and the increase in dynamic studies, the problem of data compression is now encountered routinely. Results obtained for the compression of scintigraphic images are therefore presented first. The possibilities offered by factor analysis for scan processing are then discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing. [fr]
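
    The data-compression idea can be sketched with a truncated singular value decomposition of a simulated dynamic study; the frame count, image size, and retained rank below are arbitrary choices, not parameters from the original work.

    ```python
    # Sketch: compressing a dynamic scintigraphic study by keeping a few factors.
    # A simulated sequence of 60 frames of 64x64 pixels is flattened to a (pixels x frames)
    # matrix and approximated by a rank-3 truncated SVD.
    import numpy as np

    rng = np.random.default_rng(1)
    frames, npix = 60, 64 * 64
    t = np.linspace(0, 1, frames)
    # Three underlying "physiological" time-activity curves plus noise.
    curves = np.vstack([np.exp(-3 * t), t * np.exp(-2 * t), 1 - np.exp(-5 * t)])
    images = rng.random((npix, 3))
    data = images @ curves + 0.05 * rng.standard_normal((npix, frames))

    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    k = 3
    approx = U[:, :k] * s[:k] @ Vt[:k, :]

    compressed_size = U[:, :k].size + k + Vt[:k, :].size
    err = np.linalg.norm(data - approx) / np.linalg.norm(data)
    print(f"compression ratio ≈ {data.size / compressed_size:.1f}, relative error ≈ {err:.3f}")
    ```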

  12. Data Analysis for ARRA Early Fuel Cell Market Demonstrations (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, J.; Wipke, K.; Sprik, S.; Ramsden, T.

    2010-05-01

    Presentation about ARRA Early Fuel Cell Market Demonstrations, including an overview of the ARRA Fuel Cell Project, the National Renewable Energy Laboratory's data analysis objectives, deployment composite data products, and planned analyses.

  13. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis, a

  14. AEP Ohio gridSMART Demonstration Project Real-Time Pricing Demonstration Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Widergren, Steven E.; Subbarao, Krishnappa; Fuller, Jason C.; Chassin, David P.; Somani, Abhishek; Marinovici, Maria C.; Hammerstrom, Janelle L.

    2014-02-01

    This report contributes initial findings from an analysis of significant aspects of the gridSMART® Real-Time Pricing (RTP) – Double Auction demonstration project. Over the course of four years, Pacific Northwest National Laboratory (PNNL) worked with American Electric Power (AEP), Ohio and Battelle Memorial Institute to design, build, and operate an innovative system to engage residential consumers and their end-use resources in a participatory approach to electric system operations, an incentive-based approach that has the promise of providing greater efficiency under normal operating conditions and greater flexibility to react under situations of system stress. The material contained in this report supplements the findings documented by AEP Ohio in the main body of the gridSMART report. It delves into three main areas: impacts on system operations, impacts on households, and observations about the sensitivity of load to price changes.

  15. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  16. Analysis of Skylab IV fluid mechanic science demonstration

    Science.gov (United States)

    Klett, M. G.; Bourgeois, S. V.

    1975-01-01

    Several science demonstrations performed on Skylab III and IV were concerned with the behavior of fluid drops free floating in microgravity. These demonstrations, with large liquid drops, included the oscillation, rotation, impact and coalescence, and air injection into the drops. Rayleigh's analysis of the oscillation of spherical drops of a liquid predicts accurately the effect of size and surface tension on the frequency of vibrated water globules in the Skylab demonstration. However, damping occurred much faster than predicted by Lamb's or Scriven's analyses of the damping time for spherical drops. The impact demonstrations indicated that a minimum velocity is necessary to overcome surface forces and effect a coalescence, but a precise criterion for the coalescence of liquids in low g could not be determined.
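
    For reference, Rayleigh's frequency for the fundamental (l = 2) oscillation of a free inviscid drop and Lamb's viscous damping time can be evaluated directly; the water properties are standard values and the drop radius is an assumed figure, not one taken from the Skylab report.

    ```python
    # Sketch: Rayleigh oscillation frequency and Lamb damping time for a free water drop.
    # l = 2 is the fundamental mode; the drop radius is an assumed value for illustration.
    import math

    sigma = 0.072      # surface tension of water, N/m
    rho = 1000.0       # density, kg/m^3
    mu = 1.0e-3        # dynamic viscosity, Pa*s
    R = 0.025          # drop radius, m (assumed)
    l = 2

    omega = math.sqrt(l * (l - 1) * (l + 2) * sigma / (rho * R**3))   # Rayleigh's result
    freq = omega / (2 * math.pi)
    tau = R**2 / ((l - 1) * (2 * l + 1) * (mu / rho))                 # Lamb's damping time

    print(f"fundamental frequency ≈ {freq:.2f} Hz")
    print(f"viscous damping time  ≈ {tau:.0f} s")
    ```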

  17. Performance demonstration program plan for analysis of simulated headspace gases

    International Nuclear Information System (INIS)

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP

  18. "Factor Analysis Using ""R"""

    Directory of Open Access Journals (Sweden)

    A. Alexander Beaujean

    2013-02-01

    Full Text Available R (R Development Core Team, 2011) is a very powerful tool to analyze data that is gaining in popularity due to its cost (it's free) and flexibility (it's open-source). This article gives a general introduction to using R (i.e., loading the program, using functions, importing data). Then, using data from Canivez, Konold, Collins, and Wilson (2009), this article walks the user through how to use the program to conduct factor analysis, from both an exploratory and confirmatory approach.

  19. Post mitigation impact risk analysis for asteroid deflection demonstration missions

    Science.gov (United States)

    Eggl, Siegfried; Hestroffer, Daniel; Thuillot, William; Bancelin, David; Cano, Juan L.; Cichocki, Filippo

    2015-08-01

    Even though mankind believes it has the capability to avert potentially disastrous asteroid impacts, only the realization of mitigation demonstration missions can validate this claim. Such a deflection demonstration attempt has to be cost effective, easy to validate, and safe in the sense that harmless asteroids must not be turned into potentially hazardous objects. Uncertainties in an asteroid's orbital and physical parameters, as well as those additionally introduced during a mitigation attempt, necessitate an in-depth analysis of deflection mission designs in order to dispel planetary safety concerns. We present a post-mitigation impact risk analysis of a list of potential kinetic-impactor-based deflection demonstration missions proposed in the framework of the NEOShield project. Our results confirm that mitigation-induced uncertainties have a significant influence on the deflection outcome and cannot be neglected in post-deflection impact risk studies. We show, furthermore, that deflection missions have to be assessed on an individual basis in order to ensure that asteroids are not inadvertently transported closer to the Earth at a later date. Finally, we present viable targets and mission designs for a kinetic impactor test to be launched between the years 2025 and 2032.
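
    The basic momentum-transfer arithmetic behind a kinetic impactor test can be sketched as follows; the spacecraft mass, impact speed, asteroid mass, and momentum-enhancement factor beta are assumed values for illustration, not NEOShield mission parameters.

    ```python
    # Sketch: velocity change imparted to an asteroid by a kinetic impactor,
    # delta_v = beta * m_sc * v_rel / M_ast. All numbers below are assumptions.
    m_sc = 500.0        # impactor mass, kg
    v_rel = 6.5e3       # relative impact speed, m/s
    M_ast = 3.0e9       # asteroid mass, kg (roughly a 130 m rubble pile)
    beta = 1.5          # momentum-enhancement factor from ejecta (poorly known in practice)

    delta_v = beta * m_sc * v_rel / M_ast
    print(f"delta-v ≈ {delta_v * 1e3:.2f} mm/s")

    # Crude along-track displacement after 10 years, neglecting the extra amplification
    # from the change in orbital period; it illustrates why the perturbed orbit must be
    # re-analysed for later Earth encounters.
    drift = delta_v * 10 * 365.25 * 86400
    print(f"crude 10-year along-track drift ≈ {drift / 1e3:.0f} km")
    ```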

  20. Demonstrating Interactions of Transcription Factors with DNA by Electrophoretic Mobility Shift Assay.

    Science.gov (United States)

    Yousaf, Nasim; Gould, David

    2017-01-01

    Confirming the binding of a transcription factor with a particular DNA sequence may be important in characterizing interactions with a synthetic promoter. Electrophoretic mobility shift assay is a powerful approach to demonstrate the specific DNA sequence that is bound by a transcription factor and also to confirm the specific transcription factor involved in the interaction. In this chapter we describe a method we have successfully used to demonstrate interactions of endogenous transcription factors with sequences derived from endogenous and synthetic promoters.

  1. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    International Nuclear Information System (INIS)

    2006-01-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility's compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  2. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    International Nuclear Information System (INIS)

    2007-01-01

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility's compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  3. Transforming growth factor alpha and epidermal growth factor in laryngeal carcinomas demonstrated by immunohistochemistry

    DEFF Research Database (Denmark)

    Christensen, M E; Therkildsen, M H; Poulsen, Steen Seier

    1993-01-01

    Fifteen laryngeal squamous cell carcinomas were investigated for the presence of transforming growth factor alpha (TGF-alpha) and epidermal growth factor (EGF) using immunohistochemical methods. In a recent study the same material was characterized for epidermal growth factor receptors (EGF receptors), which were confined predominantly to the undifferentiated cells. The expression of this growth factor system in malignant cells may play a role in carcinogenesis and/or tumour growth. All carcinomas were positive for TGF-alpha and 12 were positive for EGF. In moderately-to-well differentiated ... the basal cell layer. The present investigation and our previous results confirm the existence of EGF receptors, TGF-alpha and EGF in laryngeal carcinomas. In addition, we conclude that the conditions do exist for growth factors to act through an autocrine system in poorly differentiated tumours and through ...

  4. Factors affecting construction performance: exploratory factor analysis

    Science.gov (United States)

    Soewin, E.; Chinda, T.

    2018-04-01

    The present work attempts to develop a multidimensional performance evaluation framework for a construction company by considering all relevant measures of performance. Based on previous studies, this study hypothesizes nine key factors, with a total of 57 associated items. The hypothesized factors, with their associated items, are then used to develop a questionnaire survey to gather data. Exploratory factor analysis (EFA) applied to the collected data gave rise to 10 factors, with 57 items, affecting construction performance. The findings further reveal ten key performance factors (KPIs), namely: 1) Time, 2) Cost, 3) Quality, 4) Safety & Health, 5) Internal Stakeholder, 6) External Stakeholder, 7) Client Satisfaction, 8) Financial Performance, 9) Environment, and 10) Information, Technology & Innovation. The analysis helps to develop a multidimensional performance evaluation framework for effective measurement of construction performance. The 10 key performance factors can be broadly categorized into economic, social, environmental, and technology aspects. It is important to understand a multidimensional performance evaluation framework that includes all key factors affecting the construction performance of a company, so that management can effectively plan and implement a performance development plan that matches the mission and vision of the company.

  5. Geospatial analysis of food environment demonstrates associations with gestational diabetes.

    Science.gov (United States)

    Kahr, Maike K; Suter, Melissa A; Ballas, Jerasimos; Ramin, Susan M; Monga, Manju; Lee, Wesley; Hu, Min; Shope, Cindy D; Chesnokova, Arina; Krannich, Laura; Griffin, Emily N; Mastrobattista, Joan; Dildy, Gary A; Strehlow, Stacy L; Ramphul, Ryan; Hamilton, Winifred J; Aagaard, Kjersti M

    2016-01-01

    Gestational diabetes mellitus (GDM) is one of the most common complications of pregnancy, with incidence rates varying by maternal age, race/ethnicity, obesity, parity, and family history. Given its increasing prevalence in recent decades, covariant environmental and sociodemographic factors may be additional determinants of GDM occurrence. We hypothesized that environmental risk factors, in particular measures of the food environment, may be a diabetes contributor. We employed geospatial modeling in a populous US county to characterize the association of the relative availability of fast food restaurants and supermarkets to GDM. Utilizing a perinatal database with >4900 encoded antenatal and outcome variables inclusive of ZIP code data, 8912 consecutive pregnancies were analyzed for correlations between GDM and food environment based on countywide food permit registration data. Linkage between pregnancies and food environment was achieved on the basis of validated 5-digit ZIP code data. The prevalence of supermarkets and fast food restaurants per 100,000 inhabitants for each ZIP code was gathered from publicly available food permit sources. To independently authenticate our findings with objective data, we measured hemoglobin A1c levels as a function of the geospatial distribution of food environment in a matched subset (n = 80). Residence in neighborhoods with a high prevalence of fast food restaurants (fourth quartile) was significantly associated with an increased risk of developing GDM (relative to first quartile: adjusted odds ratio, 1.63; 95% confidence interval, 1.21-2.19). In multivariate analysis, this association held true after controlling for potential confounders (P = .002). Hemoglobin A1c levels measured in the matched subset were significantly increased in association with residence in a ZIP code with a higher fast food/supermarket ratio (n = 80, r = 0.251 P analysis, a relationship of food environment and risk for gestational diabetes was

  6. Photosynthesis energy factory: analysis, synthesis, and demonstration. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1978-11-01

    This quantitative assessment of the potential of a combined dry-land Energy Plantation, wood-fired power plant, and algae wastewater treatment system demonstrates the cost-effectiveness of recycling certain by-products and effluents from one subsystem to another. Designed to produce algae up to the limit of the amount of carbon in municipal wastewater, the algae pond provides a positive cash credit, resulting mainly from the wastewater treatment credit, which may be used to reduce the cost of the Photosynthesis Energy Factory (PEF)-generated electricity. The algae pond also produces fertilizer, which reduces the cost of the biomass produced on the Energy Plantation, and some gas. The cost of electricity was as low as 35 mills per kilowatt-hour for a typical municipally-owned PEF consisting of a 65-MWe power plant, a 144-acre algae pond, and a 33,000-acre Energy Plantation. Using only conventional or near-term technology, the most cost-effective algae pond for a PEF is the carbon-limited secondary treatment system. This system does not recycle CO/sub 2/ from the flue gas. Analysis of the Energy Plantation subsystem at 15 sites revealed that plantations of 24,000 to 36,000 acres produce biomass at the lowest cost per ton. The following sites are recommended for more detailed evaluation as potential demonstration sites: Pensacola, Florida; Jamestown, New York; Knoxville, Tennessee; Martinsville, Virginia, and Greenwood, South Carolina. A major possible extension of the PEF concept is to include the possibility for irrigation.

  7. Experimental Demonstration and Theoretical Analysis of Slow Light in a Semiconductor Waveguide at GHz Frequencies

    DEFF Research Database (Denmark)

    Mørk, Jesper; Kjær, Rasmus; Poel, Mike van der

    2005-01-01

    Slow-down of light by a factor of two is experimentally demonstrated and theoretically analysed in a semiconductor waveguide at room temperature, with a bandwidth of 16.7 GHz, using the effect of coherent pulsations of the carrier density...

  8. G-computation demonstration in causal mediation analysis

    International Nuclear Information System (INIS)

    Wang, Aolin; Arah, Onyebuchi A.

    2015-01-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings
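
    A minimal numerical sketch of the parametric g-computation recipe described above, for a continuous mediator and outcome with a single confounder; the data-generating model, linear regressions, and effect sizes are invented for illustration, and the bootstrap step for confidence intervals is omitted.

    ```python
    # Sketch: parametric g-computation for natural direct/indirect effects (continuous M, Y).
    # Fit models for M|A,C and Y|A,M,C, then simulate nested potential outcomes Y(a, M(a*)).
    import numpy as np

    rng = np.random.default_rng(2)
    n = 20_000
    C = rng.normal(size=n)                                  # confounder
    A = rng.binomial(1, 1 / (1 + np.exp(-0.5 * C)), n)      # exposure
    M = 1.0 * A + 0.5 * C + rng.normal(size=n)              # true mediator model
    Y = 2.0 * A + 1.5 * M + 0.3 * C + rng.normal(size=n)    # true outcome model

    def ols(X, y):
        return np.linalg.lstsq(np.column_stack([np.ones(len(y)), *X]), y, rcond=None)[0]

    bm = ols([A, C], M)          # fitted mediator model: M ~ A + C
    by = ols([A, M, C], Y)       # fitted outcome model:  Y ~ A + M + C

    def mean_Y(a_set, a_med):
        # E[Y(a_set, M(a_med))]: draw M under exposure a_med, evaluate Y under exposure a_set.
        m_sim = bm[0] + bm[1] * a_med + bm[2] * C + rng.normal(size=n)
        return np.mean(by[0] + by[1] * a_set + by[2] * m_sim + by[3] * C)

    nde = mean_Y(1, 0) - mean_Y(0, 0)     # natural direct effect
    nie = mean_Y(1, 1) - mean_Y(1, 0)     # natural indirect effect
    print(f"NDE ≈ {nde:.2f} (truth 2.0), NIE ≈ {nie:.2f} (truth 1.5)")
    ```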

  9. Factor analysis of multivariate data

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A.; Mahadevan, R.

    A brief introduction to factor analysis is presented. A FORTRAN program, which can perform Q-mode and R-mode factor analysis and the singular value decomposition of a given data matrix, is presented in Appendix B. This computer program uses...
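
    The core computation such a program performs can be sketched in NumPy rather than FORTRAN: the singular value decomposition of the standardized data matrix yields both R-mode loadings (variable space) and Q-mode scores (sample space); the data below are random and purely illustrative.

    ```python
    # Sketch: R-mode loadings and Q-mode scores from the SVD of a standardized data matrix.
    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(50, 6))                      # 50 samples, 6 variables (illustrative)
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # standardize columns

    U, s, Vt = np.linalg.svd(Z, full_matrices=False)

    k = 2                                                 # number of retained factors
    r_loadings = Vt[:k].T * s[:k] / np.sqrt(len(Z) - 1)   # R-mode: variable-factor correlations
    q_scores = U[:, :k] * s[:k]                           # Q-mode: sample coordinates on the factors

    print("R-mode loadings (variables x factors):\n", np.round(r_loadings, 2))
    print("Q-mode scores (first 5 samples):\n", np.round(q_scores[:5], 2))
    ```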

  10. Demonstration of innovative techniques for work zone safety data analysis

    Science.gov (United States)

    2009-07-15

    Based upon the results of the simulator data analysis, additional future research can be identified to validate the driving simulator in terms of similarities with Ohio work zones. For instance, the speeds observed in the simulator were greater f...

  11. Global Inventory and Analysis of Smart Grid Demonstration Projects

    Energy Technology Data Exchange (ETDEWEB)

    Mulder, W.; Kumpavat, K.; Faasen, C.; Verheij, F.; Vaessen, P [DNV KEMA Energy and Sustainability, Arnhem (Netherlands)

    2012-10-15

    As the key enabler of a more sustainable, economical and reliable energy system, the development of smart grids has received a great deal of attention in recent times. In many countries around the world the benefits of such a system have begun to be investigated through a number of demonstration projects. With such a vast array of projects it can be difficult to keep track of changes, and to understand which best practices are currently available with regard to smart grids. This report aims to address these issues through providing a comprehensive outlook on the current status of smart grid projects worldwide.

  12. First course in factor analysis

    CERN Document Server

    Comrey, Andrew L

    2013-01-01

    The goal of this book is to foster a basic understanding of factor analytic techniques so that readers can use them in their own research and critically evaluate their use by other researchers. Both the underlying theory and correct application are emphasized. The theory is presented through the mathematical basis of the most common factor analytic models and several methods used in factor analysis. On the application side, considerable attention is given to the extraction problem, the rotation problem, and the interpretation of factor analytic results. Hence, readers are given a background of

  13. Demonstration study on shielding safety analysis code (VI)

    Energy Technology Data Exchange (ETDEWEB)

    Sawamura, Sadashi [Hokkaido Univ., Sapporo (Japan). Faculty of Engineering

    1999-03-01

    Dose evaluation for direct radiation and skyshine from nuclear fuel facilities is one of the environmental evaluation items. This evaluation is carried out using shielding calculation codes. Because there are extremely few benchmark data on skyshine, the calculation has to be performed very conservatively. Therefore, benchmark data on skyshine and a well-investigated skyshine code are necessary to carry out a rational evaluation of nuclear facilities. The purpose of this study is to obtain benchmark data on skyshine and to investigate the calculation code for skyshine. In this fiscal year, the following were investigated: (1) construction and improvement of a pulsed radiation measurement system based on the gated counting method; (2) radiation monitoring, using this system, in and near the 45 MeV linear accelerator facility at Hokkaido University; (3) simulation analysis of photo-neutron production and transport using the EGS4 and MCNP codes. (author)

  14. Malaysian adolescent students' needs for enhancing thinking skills, counteracting risk factors and demonstrating academic resilience

    Science.gov (United States)

    Kuldas, Seffetullah; Hashim, Shahabuddin; Ismail, Hairul Nizam

    2015-01-01

    The adolescence period of life comes along with changes and challenges in terms of physical and cognitive development. In this hectic period, many adolescents may suffer more from various risk factors such as low socioeconomic status, substance abuse, sexual abuse and teenage pregnancy. Findings indicate that such disadvantaged backgrounds of Malaysian adolescent students lead to failure or underachievement in their academic performance. This narrative review scrutinises how some of these students are able to demonstrate academic resilience, which is satisfactory performance in cognitive or academic tasks in spite of their disadvantaged backgrounds. The review stresses the need for developing a caregiving relationship model for at-risk adolescent students in Malaysia. Such a model would allow educators to meet the students' needs for enhancing thinking skills, counteracting risk factors and demonstrating academic resilience. PMID:25663734

  15. Lithuanian Population Aging Factors Analysis

    Directory of Open Access Journals (Sweden)

    Agnė Garlauskaitė

    2015-05-01

    Full Text Available The aim of this article is to identify the factors that determine the aging of Lithuania's population and to assess the influence of these factors. The article presents an analysis of Lithuanian population aging factors, which consists of two main parts: the first describes population aging and its characteristics in theoretical terms; the second is dedicated to assessing the trends and demographic factors that influence population aging and to analysing the determinants of the aging of the population of Lithuania. The article concludes that the decline in the birth rate and the increase in the number of emigrants relative to immigrants have the greatest impact on population aging, so in order to slow the aging of the population, considerable attention should be paid to the management of these demographic processes.

  16. Factor Analysis for Clustered Observations.

    Science.gov (United States)

    Longford, N. T.; Muthen, B. O.

    1992-01-01

    A two-level model for factor analysis is defined, and formulas for a scoring algorithm for this model are derived. A simple noniterative method based on decomposition of total sums of the squares and cross-products is discussed and illustrated with simulated data and data from the Second International Mathematics Study. (SLD)
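
    The noniterative decomposition mentioned above can be illustrated directly: the total sums of squares and cross-products split exactly into a pooled within-cluster part and a between-cluster part, each of which can then be factor-analyzed at its own level; the clustered data below are simulated.

    ```python
    # Sketch: decomposition of the total SSCP matrix into within- and between-cluster parts.
    import numpy as np

    rng = np.random.default_rng(4)
    n_clusters, n_per, p = 30, 20, 4
    cluster_means = rng.normal(scale=1.0, size=(n_clusters, p))
    X = np.vstack([m + rng.normal(scale=0.5, size=(n_per, p)) for m in cluster_means])
    groups = np.repeat(np.arange(n_clusters), n_per)

    grand = X.mean(axis=0)
    S_total = (X - grand).T @ (X - grand)

    S_within = np.zeros((p, p))
    S_between = np.zeros((p, p))
    for g in range(n_clusters):
        Xg = X[groups == g]
        dg = Xg.mean(axis=0) - grand
        S_within += (Xg - Xg.mean(axis=0)).T @ (Xg - Xg.mean(axis=0))
        S_between += len(Xg) * np.outer(dg, dg)

    # The two parts reproduce the total SSCP exactly; each can be factor-analyzed separately.
    print("max |S_total - (S_within + S_between)| =",
          np.abs(S_total - (S_within + S_between)).max())
    ```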

  17. Data analysis on work activities in dismantling of Japan Power Demonstration Reactor (JPDR). Contract research

    International Nuclear Information System (INIS)

    Shiraishi, Kunio; Sukegawa, Takenori; Yanagihara, Satoshi

    1998-03-01

    The safe dismantling of a retired nuclear power plant was demonstrated by the completion, in March 1996, of the dismantling activities for the Japan Power Demonstration Reactor (JPDR), which had been conducted since 1986. This was a flagship project for the dismantling of nuclear power plants in Japan, aimed at demonstrating the applicability of the developed dismantling techniques in actual dismantling work and at building a database on work activities as well as on the dismantling of components and structures. Various data on the dismantling activities were therefore systematically collected and accumulated in computer files to build the decommissioning database, and the dismantling activities were characterized by analyzing the data. The data analysis produced general measures such as unit activity factors (for example, the manpower needed per unit weight of component to be dismantled) and simple arithmetic expressions for forecasting project management data, which can be applied to the planning of other dismantling projects once the general applicability of the analyzed data has been evaluated. The results of the data analysis can usefully be applied to the planning of future decommissioning of commercial nuclear power plants in Japan. This report describes the data collection and analysis for the JPDR dismantling activities. (author)

  18. Confirmatory factor analysis using Microsoft Excel.

    Science.gov (United States)

    Miles, Jeremy N V

    2005-11-01

    This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA analysis and, thus, to gain insight into the workings of the procedure.
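
    The underlying calculation the article demystifies can be written out in a few lines: a one-factor model implies a covariance matrix of the form lambda*lambda' + Psi, and the free parameters are chosen to minimize a discrepancy between the implied and observed covariances. The sketch below uses Python and an unweighted least-squares discrepancy rather than a spreadsheet, on simulated data.

    ```python
    # Sketch: one-factor confirmatory factor analysis by minimizing an unweighted
    # least-squares discrepancy between observed and model-implied covariances.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(5)
    true_lambda = np.array([0.8, 0.7, 0.6, 0.5])
    n, p = 1000, 4
    factor = rng.normal(size=(n, 1))
    X = factor @ true_lambda[None, :] + rng.normal(scale=np.sqrt(1 - true_lambda**2), size=(n, p))
    S = np.cov(X, rowvar=False)                       # observed covariance matrix

    def discrepancy(theta):
        lam, psi = theta[:p], theta[p:]
        sigma = np.outer(lam, lam) + np.diag(psi)     # model-implied covariance
        return np.sum((S - sigma) ** 2)

    theta0 = np.concatenate([0.5 * np.ones(p), 0.5 * np.ones(p)])
    res = minimize(discrepancy, theta0, method="L-BFGS-B",
                   bounds=[(-2, 2)] * p + [(1e-6, 2)] * p)

    print("estimated loadings:   ", np.round(res.x[:p], 2))
    print("estimated uniqueness: ", np.round(res.x[p:], 2))
    ```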

  19. Advanced reactor passive system reliability demonstration analysis for an external event

    International Nuclear Information System (INIS)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2017-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event
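
    A heavily simplified sketch of the simulation-based side of such an assessment: uncertain boundary conditions (flooding and structural damage) are sampled, a toy heat-removal model is evaluated, and the failure fraction is tallied. Every distribution, threshold, and the heat-balance model itself are invented for illustration and bear no relation to the actual Reactor Cavity Cooling System analysis.

    ```python
    # Sketch: Monte Carlo reliability estimate for a passive cooling system under
    # uncertain post-earthquake boundary conditions (all numbers are invented).
    import numpy as np

    rng = np.random.default_rng(7)
    n = 200_000

    decay_heat = rng.normal(1.0, 0.05, n)                       # normalized decay-heat demand
    flood_depth = rng.lognormal(mean=-1.0, sigma=0.8, size=n)   # metres of water around the ducts
    blockage = np.clip(flood_depth / 10.0, 0.0, 0.9)            # fraction of air flow path blocked
    duct_damage = rng.random(n) < 0.005                         # chance of seismic damage to the ducts

    # Heat removable by natural circulation, degraded by blockage and damage (toy model).
    capacity = 1.5 * (1.0 - blockage) * np.where(duct_damage, 0.6, 1.0)
    failure = capacity < decay_heat

    print(f"estimated failure probability ≈ {failure.mean():.4f}")
    ```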

  20. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    Directory of Open Access Journals (Sweden)

    Matthew Bucknor

    2017-03-01

    Full Text Available Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  1. Advanced reactor passive system reliability demonstration analysis for an external event

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)

    2017-03-15

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  2. An ergonomics action research demonstration: integrating human factors into assembly design processes.

    Science.gov (United States)

    Village, J; Greig, M; Salustri, F; Zolfaghari, S; Neumann, W P

    2014-01-01

    In action research (AR), the researcher participates 'in' the actions in an organisation, while simultaneously reflecting 'on' the actions to promote learning for both the organisation and the researchers. This paper demonstrates a longitudinal AR collaboration with an electronics manufacturing firm where the goal was to improve the organisation's ability to integrate human factors (HF) proactively into their design processes. During the three-year collaboration, all meetings, workshops, interviews and reflections were digitally recorded and qualitatively analysed to inform new 'actions'. By the end of the collaboration, HF tools with targets and sign-off by the HF specialist were integrated into several stages of the design process, and engineers were held accountable for meeting the HF targets. We conclude that the AR approach combined with targeting multiple initiatives at different stages of the design process helped the organisation find ways to integrate HF into their processes in a sustainable way. Researchers acted as a catalyst to help integrate HF into the engineering design process in a sustainable way. This paper demonstrates how an AR approach can help achieve HF integration, the benefits of using a reflective stance and one method for reporting an AR study.

  3. Standardized UXO Technology Demonstration Site Open Field Scoring Recording Number 231 (Human Factors Applications, Inc.)

    National Research Council Canada - National Science Library

    Overbay, Larry; Robitaille, George

    2005-01-01

    ...) utilizing the APG Standardized UXO Technology Demonstration Site Open Field. The scoring record was coordinated by Larry Overbay and by the Standardized UXO Technology Demonstration Site Scoring Committee...

  4. A Demonstrative Analysis of News Articles Using Fairclough’s Critical Discourse Analysis Framework

    Directory of Open Access Journals (Sweden)

    Roy Randy Y. Briones

    2017-07-01

    Full Text Available This paper attempts to demonstrate Norman Fairclough’s Critical Discourse Analysis (CDA) framework by conducting internal- and external-level analyses of two online news articles that report on the Moro Islamic Liberation Front’s (MILF) submission of its findings on the “Mamasapano Incident” that happened in the Philippines in 2015. In performing analyses using this framework, the social context and background of these texts, as well as the relationship between the internal discourse features and the external social practices and structures in which the texts were produced, are thoroughly examined. As a result, it can be noted that, in their internal discourse features, the news articles portray ideological and social distinctions among social actors such as the Philippine Senate, the SAF troopers, the MILF, the MILF fighters, and the civilians. Moreover, viewed as external social practices, the texts maintain institutional identities as news reports, but they also reveal some evaluative stance, as exemplified by the adjectival phrases that the writers employed. Having both the internal and external features examined, it can be said that the way these texts were written seems to portray the power relations that exist between the Philippine government and the MILF. Key words: Critical Discourse Analysis, discourse analysis, news articles, social practices, social structures, power relations

  5. Affinity purification of human granulocyte macrophage colony-stimulating factor receptor alpha-chain. Demonstration of binding by photoaffinity labeling

    International Nuclear Information System (INIS)

    Chiba, S.; Shibuya, K.; Miyazono, K.; Tojo, A.; Oka, Y.; Miyagawa, K.; Takaku, F.

    1990-01-01

    The human granulocyte macrophage colony-stimulating factor (GM-CSF) receptor alpha-chain, a low affinity component of the receptor, was solubilized and affinity-purified from human placenta using biotinylated GM-CSF. Scatchard analysis of ¹²⁵I-GM-CSF binding to the placental membrane extract disclosed that the GM-CSF receptor had a dissociation constant (Kd) of 0.5-0.8 nM, corresponding to the Kd value of the GM-CSF receptor alpha-chain on the intact placental membrane. Affinity labeling of the solubilized protein using a photoreactive cross-linking agent, N-hydroxysuccinimidyl-4-azidobenzoate (HSAB), demonstrated a single specific band of 70-95 kDa representing a ligand-receptor complex. Approximately 2 g of the placental membrane extract was subjected to a biotinylated GM-CSF-fixed streptavidin-agarose column, resulting in a single major band at 70 kDa on a silver-stained sodium dodecyl sulfate gel. The radioiodination for the purified material disclosed that the purified protein had an approximate molecular mass of 70 kDa and a pI of 6.6. Binding activity of the purified material was demonstrated by photoaffinity labeling using HSAB-¹²⁵I-GM-CSF, producing a similar specific band at 70-95 kDa as was demonstrated for the crude protein.
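
    The Scatchard calculation mentioned above can be illustrated with simulated saturation-binding data: plotting bound/free against bound gives a straight line with slope −1/Kd and x-intercept Bmax; the concentrations and Bmax below are invented, not the placental data.

    ```python
    # Sketch: Scatchard analysis of simulated saturation-binding data.
    # bound/free vs. bound is linear with slope -1/Kd and x-intercept Bmax.
    import numpy as np

    rng = np.random.default_rng(6)
    Kd_true = 0.6e-9                          # mol/L (assumed)
    Bmax_true = 2.0e-12                       # mol bound at saturation (assumed)
    free = np.logspace(-10.5, -8.0, 12)       # free ligand concentrations, mol/L
    bound = Bmax_true * free / (Kd_true + free) * (1 + 0.03 * rng.standard_normal(free.size))

    slope, intercept = np.polyfit(bound, bound / free, 1)
    print(f"estimated Kd   ≈ {-1.0 / slope * 1e9:.2f} nM   (true 0.60 nM)")
    print(f"estimated Bmax ≈ {-intercept / slope * 1e12:.2f} pmol (true 2.00 pmol)")
    ```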

  6. Modulation of global low-frequency motions underlies allosteric regulation: demonstration in CRP/FNR family transcription factors.

    Science.gov (United States)

    Rodgers, Thomas L; Townsend, Philip D; Burnell, David; Jones, Matthew L; Richards, Shane A; McLeish, Tom C B; Pohl, Ehmke; Wilson, Mark R; Cann, Martin J

    2013-09-01

    Allostery is a fundamental process by which ligand binding to a protein alters its activity at a distinct site. There is growing evidence that allosteric cooperativity can be communicated by modulation of protein dynamics without conformational change. The mechanisms, however, for communicating dynamic fluctuations between sites are debated. We provide a foundational theory for how allostery can occur as a function of low-frequency dynamics without a change in structure. We have generated coarse-grained models that describe the protein backbone motions of the CRP/FNR family transcription factors, CAP of Escherichia coli and GlxR of Corynebacterium glutamicum. The latter we demonstrate as a new exemplar for allostery without conformation change. We observe that binding the first molecule of cAMP ligand is correlated with modulation of the global normal modes and negative cooperativity for binding the second cAMP ligand without a change in mean structure. The theory makes key experimental predictions that are tested through an analysis of variant proteins by structural biology and isothermal calorimetry. Quantifying allostery as a free energy landscape revealed a protein "design space" that identified the inter- and intramolecular regulatory parameters that frame CRP/FNR family allostery. Furthermore, through analyzing CAP variants from diverse species, we demonstrate an evolutionary selection pressure to conserve residues crucial for allosteric control. This finding provides a link between the position of CRP/FNR transcription factors within the allosteric free energy landscapes and evolutionary selection pressures. Our study therefore reveals significant features of the mechanistic basis for allostery. Changes in low-frequency dynamics correlate with allosteric effects on ligand binding without the requirement for a defined spatial pathway. In addition to evolving suitable three-dimensional structures, CRP/FNR family transcription factors have been selected to

  7. Modulation of global low-frequency motions underlies allosteric regulation: demonstration in CRP/FNR family transcription factors.

    Directory of Open Access Journals (Sweden)

    Thomas L Rodgers

    2013-09-01

    Full Text Available Allostery is a fundamental process by which ligand binding to a protein alters its activity at a distinct site. There is growing evidence that allosteric cooperativity can be communicated by modulation of protein dynamics without conformational change. The mechanisms, however, for communicating dynamic fluctuations between sites are debated. We provide a foundational theory for how allostery can occur as a function of low-frequency dynamics without a change in structure. We have generated coarse-grained models that describe the protein backbone motions of the CRP/FNR family transcription factors, CAP of Escherichia coli and GlxR of Corynebacterium glutamicum. The latter we demonstrate as a new exemplar for allostery without conformation change. We observe that binding the first molecule of cAMP ligand is correlated with modulation of the global normal modes and negative cooperativity for binding the second cAMP ligand without a change in mean structure. The theory makes key experimental predictions that are tested through an analysis of variant proteins by structural biology and isothermal calorimetry. Quantifying allostery as a free energy landscape revealed a protein "design space" that identified the inter- and intramolecular regulatory parameters that frame CRP/FNR family allostery. Furthermore, through analyzing CAP variants from diverse species, we demonstrate an evolutionary selection pressure to conserve residues crucial for allosteric control. This finding provides a link between the position of CRP/FNR transcription factors within the allosteric free energy landscapes and evolutionary selection pressures. Our study therefore reveals significant features of the mechanistic basis for allostery. Changes in low-frequency dynamics correlate with allosteric effects on ligand binding without the requirement for a defined spatial pathway. In addition to evolving suitable three-dimensional structures, CRP/FNR family transcription factors have

  8. Integrated corridor management initiative : demonstration phase evaluation - Dallas technical capability analysis test plan.

    Science.gov (United States)

    This report presents the test plan for conducting the Technical Capability Analysis for the United States : Department of Transportation (U.S. DOT) evaluation of the Dallas U.S. 75 Integrated Corridor : Management (ICM) Initiative Demonstration. The ...

  9. Integrated corridor management initiative : demonstration phase evaluation, San Diego technical capability analysis test plan.

    Science.gov (United States)

    2012-08-01

    This report presents the test plan for conducting the Technical Capability Analysis for the United States Department of Transportation (U.S. DOT) evaluation of the San Diego Integrated Corridor Management (ICM) Initiative Demonstration. The ICM proje...

  10. Enzymatic solubilisation and degradation of soybean fibre demonstrated by viscosity, fibre analysis and microscopy

    DEFF Research Database (Denmark)

    Ravn, Jonas Laukkonen; Martens, Helle Juel; Pettersson, Dan

    2015-01-01

    The effect of a commercial multienzyme product obtained by fermentation from Aspergillus aculeatus on soybean and soybean meal was investigated using viscosity measurements, dietary fibre component analysis and different microscopy techniques utilizing histochemical dyes and antibody labelling. ... The results obtained demonstrated a strong viscosity-reducing effect of the enzyme preparation on soluble galactomannan and xyloglucan polysaccharides and, in addition, non-starch polysaccharide analysis demonstrated a notable solubilisation of all polysaccharide constituents. The degradation

  11. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Full Text Available Developing a national brand is one of the most important issues in brand development. This study uses factor analysis to detect the most important factors in building a national brand. The sample was drawn from two major auto makers in Iran, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is 0.84, well above the minimum desirable limit of 0.70. The factor analysis yields six factors, including "cultural image of customers", "exciting characteristics", "competitive pricing strategies", "perception image" and "previous perceptions".

  12. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  13. The Infinitesimal Jackknife with Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.; Jennrich, Robert I.

    2012-01-01

    The infinitesimal jackknife, a nonparametric method for estimating standard errors, has been used to obtain standard error estimates in covariance structure analysis. In this article, we adapt it for obtaining standard errors for rotated factor loadings and factor correlations in exploratory factor analysis with sample correlation matrices. Both…

  14. Demonstration of the application of weighting factors for cost and radiological impact to waste management decisions

    International Nuclear Information System (INIS)

    Barraclough, I.M.; Morrey, M.; Mobbs, S.F.

    1991-01-01

    Radioactive waste management can require difficult decisions involving many complex and often competing factors. In order to make decisions, the relevant factors need to be compared with each other and balanced, so that the resulting action produces the greatest net benefit. Decision-aiding techniques may help to carry out this balancing. A public survey has been designed and analyzed, which focused on the importance of both social values and the psychological processes likely to contribute to their formation. A method has been developed by which the preferences of the public concerning the consequence of waste management options may be obtained in a form suitable for use in multi-attribute decision-aiding techniques. It appears that this method is capable of producing useful, meaningful values for these weights, and therefore represents a major improvement on previous methods of obtaining weighting factors
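    As a rough illustration of how elicited weights of this kind might be used in a multi-attribute comparison of waste management options, the Python sketch below applies a simple weighted-sum rule. The attributes, weights, and option scores are hypothetical, not values from the survey described above.

```python
# Minimal weighted-sum multi-attribute sketch (all names and numbers hypothetical).
import numpy as np

attributes = ["cost", "worker_dose", "public_dose", "disruption"]
weights = np.array([0.25, 0.30, 0.35, 0.10])      # elicited preference weights, sum to 1

# Normalised attribute scores for three candidate options (0 = worst, 1 = best)
options = {
    "store_on_site":   np.array([0.8, 0.6, 0.5, 0.9]),
    "ship_to_repository": np.array([0.4, 0.7, 0.9, 0.5]),
    "delay_and_decay": np.array([0.9, 0.5, 0.6, 0.7]),
}

scores = {name: float(weights @ vals) for name, vals in options.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```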

  15. Analysis of Bernstein's factorization circuit

    NARCIS (Netherlands)

    Lenstra, A.K.; Shamir, A.; Tomlinson, J.; Tromer, E.; Zheng, Y.

    2002-01-01

    In [1], Bernstein proposed a circuit-based implementation of the matrix step of the number field sieve factorization algorithm. These circuits offer an asymptotic cost reduction under the measure "construction cost x run time". We evaluate the cost of these circuits, in agreement with [1], but argue

  16. Demonstration uncertainty/sensitivity analysis using the health and economic consequence model CRAC2

    International Nuclear Information System (INIS)

    Alpert, D.J.; Iman, R.L.; Johnson, J.D.; Helton, J.C.

    1985-01-01

    This paper summarizes a demonstration uncertainty/sensitivity analysis performed on the reactor accident consequence model CRAC2. The study was performed with uncertainty/sensitivity analysis techniques compiled as part of the MELCOR program. The principal objectives of the study were: 1) to demonstrate the use of the uncertainty/sensitivity analysis techniques on a health and economic consequence model, 2) to test the computer models which implement the techniques, 3) to identify possible difficulties in performing such an analysis, and 4) to explore alternative means of analyzing, displaying, and describing the results. Demonstration of the applicability of the techniques was the motivation for performing this study; thus, the results should not be taken as a definitive uncertainty analysis of health and economic consequences. Nevertheless, significant insights on health and economic consequence analysis can be drawn from the results of this type of study. Latin hypercube sampling (LHS), a modified Monte Carlo technique, was used in this study. LHS generates a multivariate input structure in which all the variables of interest are varied simultaneously and desired correlations between variables are preserved. LHS has been shown to produce estimates of output distribution functions that are comparable with results of larger random samples
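    The sketch below shows, in Python, the basic mechanics of Latin hypercube sampling as described above: each uncertain input is stratified once per bin and the inputs are varied simultaneously. The parameter names, ranges, and response function are illustrative stand-ins, not the inputs actually used in the CRAC2 study.

```python
# Minimal Latin hypercube sampling sketch for an uncertainty study
# (parameter names and ranges are hypothetical).
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=42)      # three uncertain inputs
unit_sample = sampler.random(n=50)              # 50 runs; each input stratified across 50 bins

# Scale from the unit hypercube to physical ranges, e.g.
# deposition velocity [m/s], shielding factor [-], evacuation delay [h]
l_bounds = [1e-3, 0.1, 1.0]
u_bounds = [1e-1, 0.9, 12.0]
inputs = qmc.scale(unit_sample, l_bounds, u_bounds)

# Each row is one consequence-model run; here a stand-in response function
response = inputs[:, 0] * inputs[:, 1] / inputs[:, 2]
print(inputs.shape, response.mean(), response.std())
```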

  17. Demonstration of epidermal growth factor binding sites in the adult rat pancreas by light microscopic autoradiography

    International Nuclear Information System (INIS)

    Chabot, J.G.; Walker, P.; Pelletier, G.

    1987-01-01

    The distribution of epidermal growth factor (EGF) receptors was studied in the pancreas using light microscopic autoradiography, which was performed at different time intervals (2-60 min) after injecting 125I-labeled EGF intravenously into the adult rat. In the exocrine pancreas, a labeling was found to occur over the pyramidal cells of the acini and cells lining the intercalated ducts. Moreover, substantial binding of EGF to cells of the islets of Langerhans was also revealed. At the 2-min time interval, most silver grains were found at the periphery of the target cells. The localization, as well as the diminution of silver grains over the cytoplasm of these cells, between 7 and 60 min, suggested the internalization and degradation of 125I-labeled EGF. Control experiments indicated that the autoradiography reaction was due to specific interaction of 125I-labeled EGF with its receptor. These results clearly indicate that EGF receptors are present in the acinar cells and the cells of intercalated ducts of the exocrine pancreas, as well as the cells of the endocrine pancreas. Finding that there are EGF binding sites in pancreatic acinar cells supports the physiological role of EGF in the regulation of pancreatic exocrine function. The presence of EGF receptors in cells of the islets of Langerhans suggests that EGF may play a role in the regulation of the endocrine pancreas

  18. Using Musical Intervals to Demonstrate Superposition of Waves and Fourier Analysis

    Science.gov (United States)

    LoPresto, Michael C.

    2013-01-01

    What follows is a description of a demonstration of superposition of waves and Fourier analysis using a set of four tuning forks mounted on resonance boxes and oscilloscope software to create, capture and analyze the waveforms and Fourier spectra of musical intervals.
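    A numerical analogue of that classroom demonstration can be written in a few lines: superpose several pure tones and recover their frequencies from the Fourier spectrum. The sketch below uses NumPy with tuning-fork frequencies chosen to form a chord; the specific values are illustrative.

```python
# Sketch: superpose four tuning-fork tones and inspect the Fourier spectrum.
import numpy as np

fs = 44100                            # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)         # one second of signal
freqs = [261.6, 329.6, 392.0, 523.3]  # C4, E4, G4, C5 (illustrative forks)

signal = sum(np.sin(2 * np.pi * f * t) for f in freqs)   # superposition of waves

spectrum = np.abs(np.fft.rfft(signal))
freq_axis = np.fft.rfftfreq(len(t), 1 / fs)
peaks = freq_axis[np.argsort(spectrum)[-4:]]   # four strongest spectral components
print(sorted(np.round(peaks, 1)))              # recovers the fork frequencies to ~1 Hz
```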

  19. Competing definitions: a public policy analysis of the federal recreational fee demonstration program

    Science.gov (United States)

    Thomas A. E. More

    2003-01-01

    Problem definition theory specifies that whoever controls the definition of a problem is in a unique position to control debate over the issue, influence others, and determine the problem's place on the agenda. This paper uses a rhetorical analysis and a questionnaire survey of congressional aides to examine the federal Recreational Fee Demonstration Program....

  20. Feasibility and demonstration of a cloud-based RIID analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Michael C., E-mail: wrightmc@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Hertz, Kristin L.; Johnson, William C. [Sandia National Laboratories, Livermore, CA 94551 (United States); Sword, Eric D.; Younkin, James R. [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Sadler, Lorraine E. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed. - Highlights: • A prototype cloud-based RIID analysis system was implemented and demonstrated. • A cloud-based system was shown to be feasible with currently available technology. • A system study identified the operational characteristics required by the users. • The system study showed that the user community could derive significant benefit. • An architecture was defined for field testing by users in relevant environments.

  1. Feasibility and demonstration of a cloud-based RIID analysis system

    International Nuclear Information System (INIS)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-01-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed. - Highlights: • A prototype cloud-based RIID analysis system was implemented and demonstrated. • A cloud-based system was shown to be feasible with currently available technology. • A system study identified the operational characteristics required by the users. • The system study showed that the user community could derive significant benefit. • An architecture was defined for field testing by users in relevant environments

  2. Multiple factor analysis by example using R

    CERN Document Server

    Pagès, Jérôme

    2014-01-01

    Multiple factor analysis (MFA) enables users to analyze tables of individuals and variables in which the variables are structured into quantitative, qualitative, or mixed groups. Written by the co-developer of this methodology, Multiple Factor Analysis by Example Using R brings together the theoretical and methodological aspects of MFA. It also includes examples of applications and details of how to implement MFA using an R package (FactoMineR).The first two chapters cover the basic factorial analysis methods of principal component analysis (PCA) and multiple correspondence analysis (MCA). The
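    The book itself works in R with FactoMineR; as a language-neutral illustration of the core MFA idea, the short NumPy sketch below standardizes each group of variables, down-weights it by the square root of the first eigenvalue of its own PCA so that no single group dominates, and then runs a global PCA on the concatenated table. The data and group names are simulated stand-ins.

```python
# Simplified sketch of the MFA weighting step followed by a global PCA
# (synthetic data; in practice one would use R's FactoMineR::MFA).
import numpy as np

rng = np.random.default_rng(0)
n = 100
group_a = rng.normal(size=(n, 4))   # e.g. sensory variables (hypothetical)
group_b = rng.normal(size=(n, 6))   # e.g. chemical variables (hypothetical)

def weight_group(X):
    Z = (X - X.mean(0)) / X.std(0)                                 # standardize
    lam1 = np.linalg.svd(Z, compute_uv=False)[0] ** 2 / (n - 1)    # first eigenvalue of group PCA
    return Z / np.sqrt(lam1)                                       # balance the group's influence

global_table = np.hstack([weight_group(group_a), weight_group(group_b)])
U, S, Vt = np.linalg.svd(global_table, full_matrices=False)
scores = U[:, :2] * S[:2]            # global factor scores on the first two axes
print(scores.shape)
```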

  3. Demonstration Analysis of Relationship Between R&D Investment and GDP

    Institute of Scientific and Technical Information of China (English)

    HAN Bo-tang; LIU Bai-shan; CHEN Keng

    2005-01-01

    To reveal the quantitative relationship between research and development (R&D) investment and gross domestic product (GDP) in China, we analyzed the relationship between R&D investment and science and technology (S&T) progress and, based on a large body of S&T statistical data, carried out an empirical study of the relationship between R&D investment and GDP in China using Solow and vector autoregression (VAR) models. Cubic curve fitting and cross-correlation analysis in SPSS show that there is a strong synchronous relationship between R&D investment and GDP.
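    The two computations named above, cubic curve fitting and cross-correlation, are easy to reproduce on any pair of annual series. The Python sketch below uses synthetic series standing in for R&D investment and GDP; the numbers are illustrative only and are not the Chinese statistics used in the study.

```python
# Hedged sketch of cubic curve fitting and cross-correlation on two annual series
# (synthetic data standing in for R&D investment and GDP).
import numpy as np

years = np.arange(1991, 2004)
rd = 100 * 1.15 ** (years - 1991)                                  # hypothetical R&D series
gdp = 5000 * 1.09 ** (years - 1991) \
      + np.random.default_rng(1).normal(0, 50, len(years))         # hypothetical GDP series

# Cubic curve fit of GDP on R&D investment
coeffs = np.polyfit(rd, gdp, deg=3)
gdp_fit = np.polyval(coeffs, rd)

# Cross-correlation of the standardized series to look for synchronicity or lags
a = (rd - rd.mean()) / rd.std()
b = (gdp - gdp.mean()) / gdp.std()
xcorr = np.correlate(a, b, mode="full") / len(years)
lags = np.arange(-len(years) + 1, len(years))
print("best lag:", lags[np.argmax(xcorr)], "fit coefficients:", np.round(coeffs, 3))
```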

  4. PA activity by using nuclear power plant safety demonstration and analysis

    International Nuclear Information System (INIS)

    Tsuchiya, Mitsuo; Kamimae, Rie

    1999-01-01

    INS/NUPEC presents one of the public acceptance (PA) methods for nuclear power in Japan, 'PA activity by using Nuclear Power Plant Safety Demonstration and Analysis', using a video that explains and analyzes an accident event (Loss of Coolant Accident). Safety regulations of the National Government are strictly applied in licensing at both the basic design and detailed design stages. To support the safety regulation activities conducted by the National Government, INS/NUPEC continuously performs safety demonstration and analysis. In safety demonstration and analysis, abnormal conditions are assumed and the impacts that could be produced by those conditions are forecast based on specific design data for a given nuclear power plant. When the analysis results are compared with the relevant decision criteria, the safety of nuclear power plants is confirmed. The decision criteria are designed to help judge whether or not the safety design of nuclear power plants is properly made. They are set in the safety examination guidelines with sufficient safety allowance, based on the latest technical knowledge obtained from a wide range of tests and safety studies. Safety demonstration and analysis follows the procedure summarized in this presentation. In Japan, various PA pamphlets and videos on nuclear energy have been published, but many of them focus on topics such as the necessity or importance of nuclear energy and the basic principles of nuclear power generation, and only a few describe the safety evaluation of abnormal and accident events in accordance with the regulatory requirements. Against this background, INS/NUPEC has been preparing PA pamphlets and videos that explain the safety of nuclear power plants simply and concretely, using various analytical computations for abnormal and accident events. As a result, the PA activity of INS/NUPEC is highly valued by the public.

  5. Supplement analysis 2 of environmental impacts resulting from modifications in the West Valley Demonstration Project

    International Nuclear Information System (INIS)

    1998-01-01

    The West Valley Demonstration Project, located in western New York, has approximately 600,000 gallons of liquid high-level radioactive waste (HLW) in storage in underground tanks. While corrosion analysis has revealed that only limited tank degradation has taken place, the failure of these tanks could release HLW to the environment. Congress requires DOE to demonstrate the technology for removal and solidification of HLW. DOE issued the Final Environmental Impact Statement (FEIS) in 1982. The purpose of this second supplement analysis is to re-assess the 1982 Final Environmental Impact Statement's continued adequacy. This report provides the necessary and appropriate data for DOE to determine whether the environmental impacts presented by the ongoing refinements in the design, process, and operations of the Project are considered sufficiently bounded within the envelope of impacts presented in the FEIS and supporting documentation

  6. Performance demonstration program plan for RCRA constituent analysis of solidified wastes

    International Nuclear Information System (INIS)

    1995-06-01

    Performance Demonstration Programs (PDPs) are designed to help ensure compliance with the Quality Assurance Objectives (QAOs) for the Waste Isolation Pilot Plant (WIPP). The PDPs are intended for use by the Department of Energy (DOE) Carlsbad Area Office (CAO) to assess and approve the laboratories and other measurement facilities supplying services for the characterization of WIPP TRU waste. The PDPs may also be used by CAO in qualifying laboratories proposing to supply additional analytical services that are required for other than waste characterization, such as WIPP site operations. The purpose of this PDP is to test laboratory performance for the analysis of solidified waste samples for TRU waste characterization. This performance will be demonstrated by the successful analysis of blind audit samples of simulated, solidified TRU waste according to the criteria established in this plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAOs. The concentration of analytes in the PDP samples will address levels of regulatory concern and will encompass the range of concentrations anticipated in actual waste characterization samples. Analyses that are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories that demonstrate acceptable performance in the PDP. These analyses are referred to as WIPP analyses and the samples on which they are performed are referred to as WIPP samples for the balance of this document

  7. Analysis of technological, institutional and socioeconomic factors ...

    African Journals Online (AJOL)

    Analysis of technological, institutional and socioeconomic factors that influences poor reading culture among secondary school students in Nigeria. ... Proliferation and availability of smart phones, chatting culture and social media were identified as technological factors influencing poor reading culture among secondary ...

  8. Safety analysis report for packaging (onsite) transuranic performance demonstration program sample packaging

    International Nuclear Information System (INIS)

    Mccoy, J.C.

    1997-01-01

    The Transuranic Performance Demonstration Program (TPDP) sample packaging is used to transport highway route controlled quantities of weapons grade (WG) plutonium samples from the Plutonium Finishing Plant (PFP) to the Waste Receiving and Processing (WRAP) facility and back. The purpose of these shipments is to test the nondestructive assay equipment in the WRAP facility as part of the Nondestructive Waste Assay PDP. The PDP is part of the U. S. Department of Energy (DOE) National TRU Program managed by the U. S. Department of Energy, Carlsbad Area Office, Carlsbad, New Mexico. Details of this program are found in CAO-94-1045, Performance Demonstration Program Plan for Nondestructive Assay for the TRU Waste Characterization Program (CAO 1994); INEL-96/0129, Design of Benign Matrix Drums for the Non-Destructive Assay Performance Demonstration Program for the National TRU Program (INEL 1996a); and INEL-96/0245, Design of Phase 1 Radioactive Working Reference Materials for the Nondestructive Assay Performance Demonstration Program for the National TRU Program (INEL 1996b). Other program documentation is maintained by the national TRU program and each DOE site participating in the program. This safety analysis report for packaging (SARP) provides the analyses and evaluations necessary to demonstrate that the TRU PDP sample packaging meets the onsite transportation safety requirements of WHC-CM-2-14, Hazardous Material Packaging and Shipping, for an onsite Transportation Hazard Indicator (THI) 2 packaging. This SARP, however, does not include evaluation of any operations within the PFP or WRAP facilities, including handling, maintenance, storage, or operating requirements, except as they apply directly to transportation between the gate of PFP and the gate of the WRAP facility. All other activities are subject to the requirements of the facility safety analysis reports (FSAR) of the PFP or WRAP facility and requirements of the PDP

  9. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    Science.gov (United States)

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that address both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.
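    The statistical filtering step described above can be mimicked with standard library calls: split the cohort at a candidate threshold, build a 2x2 contingency table, and apply Fisher exact, Welch t, and Kolmogorov-Smirnov tests. The Python sketch below does this on a synthetic outcomes table; the dose variable, event model, and threshold are hypothetical, and the original software combines C#.Net and R rather than Python.

```python
# Sketch of threshold-based dose-response filtering on synthetic outcomes data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
dose = rng.uniform(0, 60, 200)                                    # e.g. mean dose (Gy), hypothetical
event = (rng.uniform(size=200) < 1 / (1 + np.exp(-(dose - 30) / 5))).astype(int)

threshold = 30.0                                                  # candidate threshold, e.g. from an ROC analysis
above, below = dose >= threshold, dose < threshold

# 2x2 contingency table and Fisher exact test
table = [[int(event[above].sum()), int((1 - event[above]).sum())],
         [int(event[below].sum()), int((1 - event[below]).sum())]]
odds, p_fisher = stats.fisher_exact(table)

# Welch t-test and Kolmogorov-Smirnov test on dose grouped by outcome
t_stat, p_welch = stats.ttest_ind(dose[event == 1], dose[event == 0], equal_var=False)
ks_stat, p_ks = stats.ks_2samp(dose[event == 1], dose[event == 0])

print(f"Fisher p={p_fisher:.3g}, Welch p={p_welch:.3g}, KS p={p_ks:.3g}")
```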

  10. Feasibility and demonstration of a cloud-based RIID analysis system

    Science.gov (United States)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.

  11. Hand function evaluation: a factor analysis study.

    Science.gov (United States)

    Jarus, T; Poremba, R

    1993-05-01

    The purpose of this study was to investigate hand function evaluations. Factor analysis with varimax rotation was used to assess the fundamental characteristics of the items included in the Jebsen Hand Function Test and the Smith Hand Function Evaluation. The study sample consisted of 144 subjects without disabilities and 22 subjects with Colles fracture. Results suggest a four factor solution: Factor I--pinch movement; Factor II--grasp; Factor III--target accuracy; and Factor IV--activities of daily living. These categories differentiated the subjects without Colles fracture from the subjects with Colles fracture. A hand function evaluation consisting of these four factors would be useful. Such an evaluation that can be used for current clinical purposes is provided.
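    For readers who want to reproduce this kind of analysis, the sketch below fits a varimax-rotated factor model to simulated item scores and inspects the rotated loadings. The item names, sample size, and data are hypothetical stand-ins, not the Jebsen or Smith test data.

```python
# Minimal varimax-rotated factor analysis on simulated hand-function items.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n_subjects, n_items, n_factors = 166, 12, 4
latent = rng.normal(size=(n_subjects, n_factors))          # four underlying abilities
loadings = rng.normal(scale=0.8, size=(n_factors, n_items))
items = latent @ loadings + rng.normal(scale=0.5, size=(n_subjects, n_items))

fa = FactorAnalysis(n_components=n_factors, rotation="varimax", random_state=0)
fa.fit(items)
print(np.round(fa.components_, 2))   # rotated loadings: rows = factors, columns = items
```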

  12. Demonstration of risk-based decision analysis in remedial alternative selection and design

    International Nuclear Information System (INIS)

    Evans, E.K.; Duffield, G.M.; Massmann, J.W.; Freeze, R.A.; Stephenson, D.E.

    1993-01-01

    This study demonstrates the use of risk-based decision analysis (Massmann and Freeze 1987a, 1987b) in the selection and design of an engineering alternative for groundwater remediation at a waste site at the Savannah River Site, a US Department of Energy facility in South Carolina. The investigation focuses on the remediation and closure of the H-Area Seepage Basins, an inactive disposal site that formerly received effluent water from a nearby production facility. A previous study by Duffield et al. (1992), which used risk-based decision analysis to screen a number of ground-water remediation alternatives under consideration for this site, indicated that the most attractive remedial option is ground-water extraction by wells coupled with surface water discharge of treated effluent. The aim of the present study is to demonstrate the iterative use of risk-based decision analysis throughout the design of a particular remedial alternative. In this study, we consider the interaction between two episodes of aquifer testing over a 6-year period and the refinement of a remedial extraction well system design. Using a three-dimensional ground-water flow model, this study employs (1) geostatistics and Monte Carlo techniques to simulate hydraulic conductivity as a stochastic process and (2) Bayesian updating and conditional simulation to investigate multiple phases of aquifer testing. In our evaluation of a remedial alternative, we compute probabilistic costs associated with the failure of an alternative to completely capture a simulated contaminant plume. The results of this study demonstrate the utility of risk-based decision analysis as a tool for improving the design of a remedial alternative through the course of phased data collection at a remedial site

  13. Tank 241-AX-104 upper vadose zone cone penetrometer demonstration sampling and analysis plan

    International Nuclear Information System (INIS)

    FIELD, J.G.

    1999-01-01

    This sampling and analysis plan (SAP) is the primary document describing field and laboratory activities and requirements for the tank 241-AX-104 upper vadose zone cone penetrometer (CP) demonstration. It is written in accordance with Hanford Tank Initiative Tank 241-AX-104 Upper Vadose Zone Demonstration Data Quality Objective (Banning 1999). This technology demonstration, to be conducted at tank 241-AX-104, is being performed by the Hanford Tanks Initiative (HTI) Project as a part of Tank Waste Remediation System (TWRS) Retrieval Program (EM-30) and the Office of Science and Technology (EM-50) Tanks Focus Area. Sample results obtained as part of this demonstration will provide additional information for subsequent revisions to the Retrieval Performance Evaluation (RPE) report (Jacobs 1998). The RPE Report is the result of an evaluation of a single tank farm (AX Tank Farm) used as the basis for demonstrating a methodology for developing the data and analyses necessary to support making tank waste retrieval decisions within the context of tank farm closure requirements. The RPE includes a study of vadose zone contaminant transport mechanisms, including analysis of projected tank leak characteristics, hydrogeologic characteristics of tank farm soils, and the observed distribution of contaminants in the vadose zone in the tank farms. With limited characterization information available, large uncertainties exist as to the nature and extent of contaminants that may exist in the upper vadose zone in the AX Tank Farm. Traditionally, data has been collected from soils in the vadose zone through the installation of boreholes and wells. Soil samples are collected as the bore hole is advanced and samples are screened on site and/or sent to a laboratory for analysis. Some in-situ geophysical methods of contaminant analysis can be used to evaluate radionuclide levels in the soils adjacent to an existing borehole. However, geophysical methods require compensation for well

  14. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at the operator error as the sole contributor to the human failure within a system but a combination of all underlying factors

  15. Preliminary analysis of West Valley Waste Removal System equipment development and mock demonstration facilities

    International Nuclear Information System (INIS)

    Janicek, G.P.

    1981-06-01

    This report defines seven areas requiring further investigation to develop and demonstrate a safe and viable West Valley Waste Removal System. These areas of endeavor are discussed in terms of their minimum facility requirements. It is concluded that utilizing separated specific facilities at different points in time is of a greater advantage than an exact duplication of the West Valley tanks. Savannah River Plant's full-scale, full-circle and half-circle tanks, and their twelfth scale model tank would all be useful to varying degrees but would require modifications. Hanford's proposed full-size mock tank would be useful, but is not seriously considered because its construction may not coincide with West Valley needs. Costs of modifying existing facilities and/or constructing new facilities are assessed in terms of their benefit to the equipment development and mock demonstration. Six facilities were identified for further analysis which would benefit development of waste removal equipment

  16. Design, demonstration and analysis of a modified wavelength-correlating receiver for incoherent OCDMA system.

    Science.gov (United States)

    Zhou, Heng; Qiu, Kun; Wang, Leyang

    2011-03-28

    A novel wavelength-correlating receiver for incoherent Optical Code Division Multiple Access (OCDMA) systems is proposed and demonstrated in this paper. Enabled by the wavelength-conversion-based scheme, the proposed receiver can support various code types including one-dimensional optical codes and time-spreading/wavelength-hopping two-dimensional codes. Also, a synchronous detection scheme with time-to-wavelength based code acquisition is proposed, by which code acquisition time can be substantially reduced. Moreover, a novel data-validation methodology based on all-optical pulse-width monitoring is introduced for the wavelength-correlating receiver. Experimental demonstration of the new proposed receiver is presented and low-bit-error-rate data reception is achieved without optical hard limiting and electronic power thresholding. For the first time, a detailed theoretical performance analysis specialized for the wavelength-correlating receiver is presented. Numerical results show that the overall performance of the proposed receiver prevails over conventional OCDMA receivers.

  17. Policy Analysis Screening System (PASS) demonstration: sample queries and terminal instructions

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-10-16

    This document contains the input and output for the Policy Analysis Screening System (PASS) demonstration. This demonstration is stored on a portable disk at the Environmental Impacts Division. Sample queries presented here include: (1) how to use PASS; (2) estimated 1995 energy consumption from Mid-Range Energy-Forecasting System (MEFS) data base; (3) pollution projections from Strategic Environmental Assessment System (SEAS) data base; (4) diesel auto regulations; (5) diesel auto health effects; (6) oil shale health and safety measures; (7) water pollution effects of SRC; (8) acid rainfall from Energy Environmental Statistics (EES) data base; (9) 1990 EIA electric generation by fuel type; (10) sulfate concentrations by Federal region; (11) forecast of 1995 SO2 emissions in Region III; and (12) estimated electrical generating capacity in California to 1990. The file name for each query is included.

  18. Performance Demonstration Program Plan for RCRA Constituent Analysis of Solidified Wastes

    International Nuclear Information System (INIS)

    2006-01-01

    The Performance Demonstration Program (PDP) for Resource Conservation and Recovery Act (RCRA) constituents distributes test samples for analysis of volatile organic compounds (VOCs), semivolatile organic compounds (SVOCs), and metals in solid matrices. Each distribution of test samples is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD; DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department. The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the RCRA PDP. Participating laboratories demonstrate acceptable performance by successfully analyzing single-blind performance evaluation samples (subsequently referred to as PDP samples) according to the criteria established in this plan. PDP samples are used as an independent means to assess laboratory performance regarding compliance with the WAP quality assurance objectives (QAOs). The concentrations of analytes in the PDP samples address levels of regulatory concern and encompass the range of concentrations anticipated in waste characterization samples. The WIPP requires analyses of homogeneous solid wastes to demonstrate compliance with regulatory requirements. These analyses must be performed by laboratories that demonstrate acceptable performance in this PDP. These analyses are referred to as WIPP analyses, and the samples on which they are performed are referred to as WIPP samples. Participating laboratories must analyze PDP samples using the same procedures used for WIPP samples.

  19. AZ-101 Mixer Pump Demonstration and Tests Data Management Analysis Plan

    Energy Technology Data Exchange (ETDEWEB)

    DOUGLAS, D.G.

    2000-02-22

    This document provides a plan for the analysis of the data collected during the AZ-101 Mixer Pump Demonstration and Tests. This document was prepared after a review of the AZ-101 Mixer Pump Test Plan (Revision 4) [1] and other materials. The plan emphasizes a structured and well-ordered approach towards handling and examining the data. This plan presumes that the data will be collected and organized into a unified body of data, well annotated and bearing the date and time of each record. The analysis of this data will follow a methodical series of steps that are focused on well-defined objectives. Section 2 of this plan describes how the data analysis will proceed from the real-time monitoring of some of the key sensor data to the final analysis of the three-dimensional distribution of suspended solids. This section also identifies the various sensors or sensor systems and associates them with the various functions they serve during the test program. Section 3 provides an overview of the objectives of the AZ-101 test program and describes the data that will be analyzed to support that test. The objectives are: (1) to demonstrate that the mixer pumps can be operated within the operating requirements; (2) to demonstrate that the mixer pumps can mobilize the sludge in sufficient quantities to provide feed to the private contractor facility, and (3) to determine if the in-tank instrumentation is sufficient to monitor sludge mobilization and mixer pump operation. Section 3 also describes the interim analysis that organizes the data during the test, so the analysis can be more readily accomplished. Section 4 describes the spatial orientation of the various sensors in the tank. This section is useful in visualizing the relationship of the Sensors in terms of their location in the tank and how the data from these sensors may be related to the data from other sensors. Section 5 provides a summary of the various analyses that will be performed on the data during the test

  20. AZ-101 Mixer Pump Demonstration and Tests: Data Management (Analysis) Plan

    International Nuclear Information System (INIS)

    DOUGLAS, D.G.

    2000-01-01

    This document provides a plan for the analysis of the data collected during the AZ-101 Mixer Pump Demonstration and Tests. This document was prepared after a review of the AZ-101 Mixer Pump Test Plan (Revision 4) [1] and other materials. The plan emphasizes a structured and well-ordered approach towards handling and examining the data. This plan presumes that the data will be collected and organized into a unified body of data, well annotated and bearing the date and time of each record. The analysis of this data will follow a methodical series of steps that are focused on well-defined objectives. Section 2 of this plan describes how the data analysis will proceed from the real-time monitoring of some of the key sensor data to the final analysis of the three-dimensional distribution of suspended solids. This section also identifies the various sensors or sensor systems and associates them with the various functions they serve during the test program. Section 3 provides an overview of the objectives of the AZ-101 test program and describes the data that will be analyzed to support that test. The objectives are: (1) to demonstrate that the mixer pumps can be operated within the operating requirements; (2) to demonstrate that the mixer pumps can mobilize the sludge in sufficient quantities to provide feed to the private contractor facility, and (3) to determine if the in-tank instrumentation is sufficient to monitor sludge mobilization and mixer pump operation. Section 3 also describes the interim analysis that organizes the data during the test, so the analysis can be more readily accomplished. Section 4 describes the spatial orientation of the various sensors in the tank. This section is useful in visualizing the relationship of the Sensors in terms of their location in the tank and how the data from these sensors may be related to the data from other sensors. Section 5 provides a summary of the various analyses that will be performed on the data during the test

  1. Bayesian analysis of heat pipe life test data for reliability demonstration testing

    International Nuclear Information System (INIS)

    Bartholomew, R.J.; Martz, H.F.

    1985-01-01

    The demonstration testing duration requirements to establish a quantitative measure of assurance of expected lifetime for heat pipes were determined. The heat pipes are candidate devices for transporting heat generated in a nuclear reactor core to thermoelectric converters for use as a space-based electric power plant. A Bayesian analysis technique is employed, utilizing a limited Delphi survey, and a geometric mean accelerated test criterion involving heat pipe power (P) and temperature (T). The resulting calculations indicate that considerable test savings can be achieved by employing the method, but development testing to determine heat pipe failure mechanisms should not be circumvented
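    The flavor of such a Bayesian reliability-demonstration calculation can be conveyed with a conjugate-prior sketch: if heat-pipe lifetimes are taken as exponential, a gamma prior on the failure rate (for example informed by a Delphi survey) updates in closed form with accumulated test hours and observed failures. The prior parameters, test exposure, and lifetime goal below are illustrative assumptions, not values from the study.

```python
# Hedged sketch: Bayesian assurance of mean lifetime under an exponential
# lifetime model with a gamma prior on the failure rate (all numbers illustrative).
from scipy.stats import gamma

a0, b0 = 2.0, 2.0e5        # prior shape and rate (unit-hours), e.g. from expert elicitation
test_hours = 5.0e4         # accumulated accelerated-test exposure
failures = 0               # failures observed during the demonstration test

a_post, b_post = a0 + failures, b0 + test_hours      # conjugate gamma posterior on the rate

target_mean_life = 7.0e4   # required mean lifetime (hours)
# P(mean life > target) = P(failure rate < 1 / target)
assurance = gamma.cdf(1.0 / target_mean_life, a_post, scale=1.0 / b_post)
print(f"posterior assurance of meeting the lifetime goal: {assurance:.3f}")
```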

  2. Performance-Based Technology Selection Filter description report. INEL Buried Waste Integrated Demonstration System Analysis project

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  3. Using Factor Analysis to Identify Topic Preferences Within MBA Courses

    Directory of Open Access Journals (Sweden)

    Earl Chrysler

    2003-02-01

    Full Text Available This study demonstrates the role of a principal components factor analysis in conducting a gap analysis as to the desired characteristics of business alumni. Typically, gap analyses merely compare the emphases that should be given to areas of inquiry with perceptions of actual emphases. As a result, the focus is upon depth of coverage. A neglected area in need of investigation is the breadth of topic dimensions and their differences between the normative (should offer) and the descriptive (actually offer). The implications of factor structures, as well as traditional gap analyses, are developed and discussed in the context of outcomes assessment.

  4. Engaging Patients through Mobile Phones: Demonstrator Services, Success Factors, and Future Opportunities in Low and Middle-income Countries.

    Science.gov (United States)

    Hartzler, A; Wetter, T

    2014-08-15

    Evolving technology and infrastructure can benefit patients even in the poorest countries through mobile health (mHealth). Yet what makes mobile-phone-based services succeed in low- and middle-income countries (LMIC), and what opportunities the future holds, still need to be studied. We showcase demonstrator services that leverage mobile phones in the hands of patients to promote health and facilitate health care. We surveyed the recent biomedical literature for demonstrator services that illustrate well-considered examples of mobile phone interventions for consumer health. We draw upon those examples to discuss enabling factors, scalability, reach, and potential of mHealth, as well as obstacles, in LMIC. Among the 227 articles returned by a PubMed search, we identified 55 articles that describe services targeting health consumers equipped with mobile phones. From those articles, we showcase 19 as demonstrator services across clinical care, prevention, infectious diseases, and population health. Services range from education, reminders, reporting, and peer support to epidemiologic reporting and care management with phone communication and messages. Key achievements include timely adherence to treatment and appointments, clinical effectiveness of treatment reminders, increased vaccination coverage and uptake of screening, and capacity for efficient disease surveillance. We discuss methodologies for delivery and evaluation of mobile-phone-based mHealth in LMIC, including service design, social context, and environmental factors for success. The demonstrated promise of mobile phones in the poorest countries encourages a future in which IMIA takes a lead role in leveraging mHealth for citizen empowerment through Consumer Health Informatics.

  5. Analysis of Economic Factors Affecting Stock Market

    OpenAIRE

    Xie, Linyin

    2010-01-01

    This dissertation concentrates on the analysis of economic factors affecting the Chinese stock market by examining the relationship between the stock market index and economic factors. Six economic variables are examined: industrial production, money supply 1, money supply 2, exchange rate, long-term government bond yield and real estate total value. The stock market comprises fixed-interest stocks and equity shares. In this dissertation, the stock market is restricted to the equity market. The stock price in thi...

  6. Analysis of Return and Forward Links from STARS' Flight Demonstration 1

    Science.gov (United States)

    Gering, James A.

    2003-01-01

    Space-based Telemetry And Range Safety (STARS) is a Kennedy Space Center (KSC) led proof-of-concept demonstration, which utilizes NASA's space network of Tracking and Data Relay Satellites (TDRS) as a pathway for launch and mission related information streams. Flight Demonstration 1 concluded on July 15, 2003 with the seventh flight of a Low Power Transmitter (LPT), a Command and Data Handler (C&DH), a twelve-channel GPS receiver, and associated power supplies and amplifiers. The equipment flew on NASA's F-15 aircraft at the Dryden Flight Research Center located at Edwards Air Force Base in California. During this NASA-ASEE Faculty Fellowship, the author participated in the collection and analysis of data from the seven flights comprising Flight Demonstration 1. Specifically, the author examined the forward and return links' bit energy E_b (in watt-seconds) divided by the ambient radio frequency noise N_0 (in watts/hertz). E_b/N_0 is commonly thought of as a signal-to-noise parameter, which characterizes a particular received radio frequency (RF) link. Outputs from the data analysis include the construction of time lines for all flights, production of graphs of range safety values for all seven flights, histograms of range safety E_b/N_0 values in five dB increments, calculation of associated averages and standard deviations, production of graphs of range user E_b/N_0 values for all flights, and production of graphs of AGCs and E_b/N_0 estimates for flight 1, recorded onboard, transmitted directly to the launch head, and transmitted through TDRS. The data and graphs are being used to draw conclusions related to a lower than expected signal strength seen in the range safety return link.
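    The summary products described above (histograms in 5 dB increments plus averages and standard deviations) reduce to a few array operations. The sketch below applies them to a synthetic E_b/N_0 record; the values are illustrative stand-ins, not the STARS flight data.

```python
# Sketch of the E_b/N_0 summary statistics on a synthetic record (illustrative only).
import numpy as np

rng = np.random.default_rng(11)
ebn0_db = rng.normal(loc=15.0, scale=4.0, size=600)   # stand-in E_b/N_0 time series (dB)

bins = np.arange(0, 40, 5)                            # 5 dB increments from 0 to 35 dB
counts, _ = np.histogram(ebn0_db, bins=bins)

print(f"mean = {ebn0_db.mean():.1f} dB, std = {ebn0_db.std():.1f} dB")
for lo, hi, c in zip(bins[:-1], bins[1:], counts):
    print(f"{lo:2d}-{hi:2d} dB: {c}")
```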

  7. Demonstration of Emulator-Based Bayesian Calibration of Safety Analysis Codes: Theory and Formulation

    Directory of Open Access Journals (Sweden)

    Joseph P. Yurko

    2015-01-01

    Full Text Available System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This work uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This "function factorization" Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP based process.
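    To make the emulator-plus-MCMC workflow concrete, the sketch below fits a plain GP emulator (not the FFGP model developed in the paper) to a handful of runs of a stand-in code and then drives a short random-walk Metropolis chain entirely through the cheap emulator. The code function, prior, observation, and noise level are hypothetical.

```python
# Illustrative emulator-based calibration: GP surrogate + random-walk Metropolis.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def code(theta):                     # stand-in for an expensive safety code
    return 0.3 + 1.7 * np.tanh(theta)

# Emulator training set: a few code runs over the parameter range
theta_train = np.linspace(-2, 2, 8).reshape(-1, 1)
gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=1.0), normalize_y=True)
gp.fit(theta_train, code(theta_train).ravel())

# "Measured" data generated at a true parameter value of 0.7 (hypothetical)
y_obs, sigma = code(0.7) + 0.02, 0.05

def log_post(theta):
    if abs(theta) > 3.0:             # flat prior on [-3, 3]
        return -np.inf
    pred = gp.predict(np.array([[theta]]))[0]
    return -0.5 * ((y_obs - pred) / sigma) ** 2

rng = np.random.default_rng(0)
theta, lp, chain = 0.0, log_post(0.0), []
for _ in range(3000):                # short Metropolis chain driven by the fast emulator
    prop = theta + rng.normal(scale=0.3)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
print("posterior mean:", np.mean(chain[500:]))
```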

  8. Factor Economic Analysis at Forestry Enterprises

    Directory of Open Access Journals (Sweden)

    M.Yu. Chik

    2018-03-01

    Full Text Available The article examines the importance of economic analysis based on a review of the scientific works of domestic and foreign scientists. The influence of factors on the change in the cost of harvesting timber products is calculated by cost item. The influence of factors on the change in costs per 1 UAH of sold products is determined using the full cost of sold products, with variable and fixed costs and their distribution taken into account, since these affect the calculation. The paper summarizes the overall results of calculating the influence of factors on cost changes per 1 UAH of sold products. Based on the results of the analysis, a list of reserves for reducing the cost of production at forest enterprises is proposed. The main sources of reserves for reducing the prime cost of forest products at forest enterprises are investigated on the basis of the factor analysis conducted.

  9. An SPSS R-Menu for Ordinal Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mario Basto

    2012-01-01

    Full Text Available Exploratory factor analysis is a widely used statistical technique in the social sciences. It attempts to identify underlying factors that explain the pattern of correlations within a set of observed variables. A statistical software package is needed to perform the calculations. However, there are some limitations with popular statistical software packages, like SPSS. The R programming language is a free software package for statistical and graphical computing. It offers many packages written by contributors from all over the world and programming resources that allow it to overcome the dialog limitations of SPSS. This paper offers an SPSS dialog written in the R programming language with the help of some packages, so that researchers with little or no knowledge in programming, or those who are accustomed to making their calculations based on statistical dialogs, have more options when applying factor analysis to their data and hence can adopt a better approach when dealing with ordinal, Likert-type data.

  10. Demonstration of Mobile Auto-GPS for Large Scale Human Mobility Analysis

    Science.gov (United States)

    Horanont, Teerayut; Witayangkurn, Apichon; Shibasaki, Ryosuke

    2013-04-01

    The greater affordability of digital devices and the advancement of positioning and tracking capabilities have presided over today's age of geospatial Big Data. Besides, the emergence of massive mobile location data and the rapid increase in computational capabilities open up new opportunities for modeling large-scale urban dynamics. In this research, we demonstrate a new type of mobile location data called "Auto-GPS" and its potential use cases for urban applications. More than one million Auto-GPS mobile phone users in Japan have been observed nationwide in a completely anonymous form for over an entire year, from August 2010 to July 2011, for this analysis. A spate of natural disasters and other emergencies during the past few years has prompted new interest in how mobile location data can help enhance our security, especially in urban areas which are highly vulnerable to these impacts. New insights gleaned from mining the Auto-GPS data suggest a number of promising directions for modeling human movement during a large-scale crisis. We question how people react under critical situations and how their movement changes during severe disasters. Our results demonstrate a case of a major earthquake and explain how people who live in the Tokyo Metropolitan and vicinity area behaved and returned home after the Great East Japan Earthquake on March 11, 2011.

  11. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    International Nuclear Information System (INIS)

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-01-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application
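
    The record above applies Bayes theorem sequentially to conditionally independent test results. A minimal sketch of that update is given below, with placeholder sensitivities and specificities rather than the pooled literature values used in the study.

```python
# Sequential Bayes update from a pretest probability to a posttest probability,
# assuming the three noninvasive tests are conditionally independent given
# disease status. Sensitivity/specificity values are placeholders, not the
# pooled literature values used in the study.

def bayes_update(prior, positive, sensitivity, specificity):
    """Return P(disease | test result) from P(disease) and test characteristics."""
    if positive:
        p_result_d, p_result_nod = sensitivity, 1.0 - specificity
    else:
        p_result_d, p_result_nod = 1.0 - sensitivity, specificity
    numerator = p_result_d * prior
    return numerator / (numerator + p_result_nod * (1.0 - prior))

# Illustrative patient: intermediate pretest probability, positive stress ECG,
# positive thallium scan, negative fluoroscopy for coronary calcification.
tests = [  # (positive?, sensitivity, specificity) -- placeholder values
    (True, 0.70, 0.80),   # stress electrocardiography
    (True, 0.85, 0.90),   # stress thallium scintigraphy
    (False, 0.60, 0.85),  # cine fluoroscopy (coronary calcification)
]

p = 0.40  # pretest probability from age, sex, and chest-pain type
for positive, sens, spec in tests:
    p = bayes_update(p, positive, sens, spec)
print(f"posttest probability of coronary disease: {p:.2f}")
```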

  12. ANALYSIS OF THE FACTORS AFFECTING THE AVERAGE

    Directory of Open Access Journals (Sweden)

    Carmen BOGHEAN

    2013-12-01

    Full Text Available Productivity in agriculture most relevantly and concisely expresses the economic efficiency of using the factors of production. Labour productivity is affected by a considerable number of variables (including the relationship system and the interdependence between factors), which differ in each economic sector and give rise to a series of technical, economic and organizational idiosyncrasies. The purpose of this paper is to analyse the underlying factors of average work productivity in agriculture, forestry and fishing. The analysis takes into account data on the economically active population and the gross value added in agriculture, forestry and fishing in Romania during 2008-2011. The breakdown of average work productivity across the factors affecting it is carried out by means of the substitution method.
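
    A hedged sketch of the kind of decomposition described, treating average work productivity as gross value added per economically active person and attributing its change to the two determinants by successive substitution; the figures are illustrative placeholders, not the Romanian 2008-2011 series.

```python
# Illustrative decomposition of the change in average work productivity
# W = GVA / L (gross value added per economically active person) into the
# contributions of GVA and L, using successive substitution. The figures
# are invented, not the data analysed in the paper.

gva_0, labour_0 = 30_000.0, 2.90   # base year: GVA (million lei), active population (millions)
gva_1, labour_1 = 33_500.0, 2.75   # current year

w_0 = gva_0 / labour_0
w_1 = gva_1 / labour_1

# Substitute GVA first, then the active population.
effect_gva = gva_1 / labour_0 - gva_0 / labour_0
effect_labour = gva_1 / labour_1 - gva_1 / labour_0

print(f"productivity change: {w_1 - w_0:,.1f} lei per person")
print(f"  due to gross value added:        {effect_gva:+,.1f}")
print(f"  due to active population change: {effect_labour:+,.1f}")
assert abs((effect_gva + effect_labour) - (w_1 - w_0)) < 1e-6
```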

  13. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.
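
    A minimal sketch of how such BDCFs are applied downstream, assuming the usual form annual dose = sum over radionuclides of (groundwater concentration) x (BDCF); the radionuclides and numerical values are hypothetical placeholders, not outputs of the report.

```python
# Sketch of the downstream use of groundwater BDCFs in performance assessment.
# The radionuclides, concentrations, and BDCF values are hypothetical
# placeholders, not values taken from the report.

bdcf_sv_per_bq_per_m3 = {"Tc-99": 1.1e-9, "I-129": 2.4e-8, "Np-237": 5.6e-7}
concentration_bq_per_m3 = {"Tc-99": 3.0e2, "I-129": 4.0e0, "Np-237": 2.0e-1}

annual_dose_sv = sum(
    concentration_bq_per_m3[n] * bdcf_sv_per_bq_per_m3[n]
    for n in bdcf_sv_per_bq_per_m3
)
print(f"all-pathway annual dose: {annual_dose_sv * 1e6:.3f} microsieverts")
```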

  14. Factor analysis for exercise stress radionuclide ventriculography

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Yasuda, Mitsutaka; Oku, Hisao; Ikuno, Yoshiyasu; Takeuchi, Kazuhide; Takeda, Tadanao; Ochi, Hironobu

    1987-01-01

    Using factor analysis, a new image processing technique in exercise stress radionuclide ventriculography, changes in factors associated with exercise were evaluated in 14 patients with angina pectoris or old myocardial infarction. The patients were imaged in the left anterior oblique projection, and three factor images were presented on a color-coded scale. Abnormal factors (AF) were observed in 6 patients before exercise, 13 during exercise, and 4 after exercise. In 7 patients, the occurrence of AF was associated with exercise. Five of them became free from AF after exercise. Three patients showing AF before exercise had aggravation of AF during exercise. Overall, the occurrence or aggravation of AF was associated with exercise in ten (71%) of the patients. The other three patients, however, had disappearance of AF during exercise. In the remaining patient, no AF was observed throughout the study. In view of the high incidence of AF associated with exercise, factor analysis may have potential for evaluating cardiac reserve from the viewpoint of left ventricular wall motion abnormality. (Namekawa, K.)

  15. Correction factor for hair analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Castro Faria, L.V. de; Paschoa, A.S.

    1980-01-01

    The application of the Particle Induced X-ray Emission (PIXE) technique to analyse quantitatively the elemental composition of hair specimens brings about some difficulties in the interpretation of the data. The present paper proposes a correction factor to account for the effects of the energy loss of the incident particle with penetration depth, and X-ray self-absorption when a particular geometrical distribution of elements in hair is assumed for calculational purposes. The correction factor has been applied to the analysis of hair contents Zn, Cu and Ca as a function of the energy of the incident particle. (orig.)
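
    A rough numerical sketch of a correction factor of this general kind, integrating X-ray production over depth with energy loss and self-absorption for an assumed depth distribution; the stopping power, cross-section shape, attenuation coefficient, and specimen thickness are crude placeholder assumptions, not values from the paper.

```python
import numpy as np

# Toy correction factor: the detected X-ray yield is an integral over depth of
# (element depth profile) x (ionisation cross-section at the local proton
# energy) x (X-ray self-absorption on the way out), normalised to the
# thin-target yield at the incident energy. All parameters are placeholders.

E0 = 2.0                  # incident proton energy, MeV
thickness = 6.0e-3        # assumed specimen (hair) thickness along the beam, cm
stopping = 40.0           # assumed constant stopping power, MeV/cm
mu = 300.0                # assumed X-ray attenuation coefficient in hair, 1/cm
theta = np.radians(45.0)  # X-ray take-off angle

def sigma(E):             # toy ionisation cross-section, rising with energy
    return E ** 2

def profile(x):           # assumed element depth distribution (uniform here)
    return np.ones_like(x)

x = np.linspace(0.0, thickness, 2000)
E = np.clip(E0 - stopping * x, 0.0, None)          # local proton energy along the path
attenuated = profile(x) * sigma(E) * np.exp(-mu * x / np.sin(theta))
thin_target = profile(x) * sigma(E0)               # no energy loss, no absorption

# The depth grid is uniform, so the spacing cancels in the ratio of the integrals.
correction_factor = thin_target.sum() / attenuated.sum()
print(f"correction factor to apply to the apparent concentration: {correction_factor:.2f}")
```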

  16. Boolean Factor Analysis by Attractor Neural Network

    Czech Academy of Sciences Publication Activity Database

    Frolov, A. A.; Húsek, Dušan; Muraviev, I. P.; Polyakov, P.Y.

    2007-01-01

    Roč. 18, č. 3 (2007), s. 698-707 ISSN 1045-9227 R&D Projects: GA AV ČR 1ET100300419; GA ČR GA201/05/0079 Institutional research plan: CEZ:AV0Z10300504 Keywords : recurrent neural network * Hopfield-like neural network * associative memory * unsupervised learning * neural network architecture * neural network application * statistics * Boolean factor analysis * dimensionality reduction * features clustering * concepts search * information retrieval Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.769, year: 2007

  17. Correction factor for hair analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Castro Faria, L.V. de; Paschoa, A.S.

    1979-06-01

    The application of the Particle Induced X-ray Emission (PIXE) technique to analyse quantitatively the elemental composition of hair specimens brings about some difficulties in the interpretation of the data. The present paper proposes a correction factor to account for the effects of energy loss of the incident particle with penetration depth, and x-ray self-absorption when a particular geometrical distribution of elements in hair is assumed for calculational purposes. The correction factor has been applied to the analysis of hair contents Zn, Cu and Ca as a function of the energy of the incident particle.(Author) [pt

  18. Single cell analysis demonstrating somatic mosaicism involving 11p in a patient with paternal isodisomy and Beckwith-Wiedemann Syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Bischoff, F.Z.; McCaskill, C.; Subramanian, S. [Baylor College of Medicine, Houston, TX (United States)] [and others]

    1994-09-01

    Beckwith-Wiedemann Syndrome (BWS) is characterized by numerous growth abnormalities including exomphalos, macroglossia, gigantism, and hemihypertrophy or hemihyperplasia. The "BWS gene" appears to be maternally repressed and is suspected to function as a growth factor or regulator of somatic growth, since activation of this gene through a variety of mechanisms appears to result in somatic overgrowth and tumor development. Mosaic paternal isodisomy of 11p has been observed previously by others in patients with BWS by Southern blot analysis of genomic DNA. The interpretation of these results was primarily based on the intensities of the hybridization signals for the different alleles. In our study, we demonstrate somatic mosaicism directly through PCR and single cell analysis. Peripheral blood was obtained from a patient with BWS and initial genomic DNA analysis by PCR was suggestive of somatic mosaicism for paternal isodisomy of 11p. Through micromanipulation, single cells were isolated and subjected to primer extension preamplification. Locus-specific microsatellite marker analyses by PCR were performed to determine the chromosome 11 origins in the preamplified individual cells. Two populations of cells were detected, a population of cells with normal biparental inheritance and a population of cells with paternal isodisomy of 11p and biparental disomy of 11q. Using the powerful approach of single cell analysis, the detected somatic mosaicism provides evidence for a mitotic recombinational event that has resulted in loss of the maternal 11p region and gain of a second copy of paternal 11p in some cells. The direct demonstration of mosaicism may explain the variable phenotypes and hemihypertrophy often observed in BWS.

  19. An organisational analysis of the implementation of telecare and telehealth: the whole systems demonstrator

    Science.gov (United States)

    2012-01-01

    Background To investigate organisational factors influencing the implementation challenges of redesigning services for people with long term conditions in three locations in England, using remote care (telehealth and telecare). Methods Case-studies of three sites forming the UK Department of Health’s Whole Systems Demonstrator (WSD) Programme. Qualitative research techniques were used to obtain data from various sources, including semi-structured interviews, observation of meetings over the course of the programme and prior to its launch, and document review. Participants were managers and practitioners involved in the implementation of remote care services. Results The implementation of remote care was nested within a large pragmatic cluster randomised controlled trial (RCT), which formed a core element of the WSD programme. To produce robust benefits evidence, many aspects of the trial design could not be easily adapted to local circumstances. While remote care was successfully rolled out, wider implementation lessons and levels of organisational learning across the sites were hindered by the requirements of the RCT. Conclusions The implementation of a complex innovation such as remote care requires it to organically evolve, be responsive and adaptable to the local health and social care system, driven by support from front-line staff and management. This need for evolution was not always aligned with the imperative to gather robust benefits evidence. This tension needs to be resolved if government ambitions for the evidence-based scaling-up of remote care are to be realised. PMID:23153014

  20. An organisational analysis of the implementation of telecare and telehealth: the whole systems demonstrator.

    Science.gov (United States)

    Hendy, Jane; Chrysanthaki, Theopisti; Barlow, James; Knapp, Martin; Rogers, Anne; Sanders, Caroline; Bower, Peter; Bowen, Robert; Fitzpatrick, Ray; Bardsley, Martin; Newman, Stanton

    2012-11-15

    To investigate organisational factors influencing the implementation challenges of redesigning services for people with long term conditions in three locations in England, using remote care (telehealth and telecare). Case-studies of three sites forming the UK Department of Health's Whole Systems Demonstrator (WSD) Programme. Qualitative research techniques were used to obtain data from various sources, including semi-structured interviews, observation of meetings over the course of the programme and prior to its launch, and document review. Participants were managers and practitioners involved in the implementation of remote care services. The implementation of remote care was nested within a large pragmatic cluster randomised controlled trial (RCT), which formed a core element of the WSD programme. To produce robust benefits evidence, many aspects of the trial design could not be easily adapted to local circumstances. While remote care was successfully rolled out, wider implementation lessons and levels of organisational learning across the sites were hindered by the requirements of the RCT. The implementation of a complex innovation such as remote care requires it to organically evolve, be responsive and adaptable to the local health and social care system, driven by support from front-line staff and management. This need for evolution was not always aligned with the imperative to gather robust benefits evidence. This tension needs to be resolved if government ambitions for the evidence-based scaling-up of remote care are to be realised.

  1. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2003-07-25

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports (BSC 2003 [DIRS 160964]; BSC 2003 [DIRS 160965]; BSC 2003 [DIRS 160976]; BSC 2003 [DIRS 161239]; BSC 2003 [DIRS 161241]) contain detailed description of the model input parameters. This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs and conversion factors for the TSPA. The BDCFs will be used in performance assessment for calculating annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle activity in groundwater and the annual dose from beta- and photon-emitting radionuclides.

  2. Comparative genomic analysis of multi-subunit tethering complexes demonstrates an ancient pan-eukaryotic complement and sculpting in Apicomplexa.

    Directory of Open Access Journals (Sweden)

    Christen M Klinger

    Full Text Available Apicomplexa are obligate intracellular parasites that cause tremendous disease burden world-wide. They utilize a set of specialized secretory organelles in their invasive process that require delivery of components for their biogenesis and function, yet the precise mechanisms underpinning such processes remain unclear. One set of potentially important components is the multi-subunit tethering complexes (MTCs), factors increasingly implicated in all aspects of vesicle-target interactions. Prompted by the results of previous studies indicating a loss of membrane trafficking factors in Apicomplexa, we undertook a bioinformatic analysis of MTC conservation. Building on knowledge of the ancient presence of most MTC proteins, we demonstrate the near complete retention of MTCs in the newly available genomes for Guillardia theta and Bigelowiella natans. The latter is a key taxonomic sampling point as a basal sister taxon to the group including Apicomplexa. We also demonstrate an ancient origin of the CORVET complex subunits Vps8 and Vps3, as well as the TRAPPII subunit Tca17. Having established that the lineage leading to Apicomplexa did at one point possess the complete eukaryotic complement of MTC components, we undertook a deeper taxonomic investigation in twelve apicomplexan genomes. We observed excellent conservation of the VpsC core of the HOPS and CORVET complexes, as well as the core TRAPP subunits, but sparse conservation of TRAPPII, COG, Dsl1, and HOPS/CORVET-specific subunits. However, those subunits that we did identify appear to be expressed with similar patterns to the fully conserved MTC proteins, suggesting that they may function as minimal complexes or with analogous partners. Strikingly, we failed to identify any subunits of the exocyst complex in all twelve apicomplexan genomes, as well as the dinoflagellate Perkinsus marinus. Overall, we demonstrate reduction of MTCs in Apicomplexa and their ancestors, consistent with modification during

  3. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed description of the model input parameters, their development, and the relationship between the parameters and specific features events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis

  4. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed description of the model input parameters, their development, and the relationship between the parameters and specific features events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objectives of this analysis are to develop BDCFs for the

  5. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built, which is used to analyze the causal relationships between the organizational factors and human reliability. Then, the inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute the causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks not only makes it easy to model the causal relationship between organizational factors and human reliability, but also, in a given context, allows analysts to quantitatively measure human operational reliability and to identify and prioritize the most likely root causes of human error. (authors)
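
    A toy two-node example in the spirit of combining organizational factors with Bayesian networks, showing both causal (predictive) and diagnostic inference; the structure and probabilities are invented for illustration only.

```python
# A single organizational factor (quality of procedures) influences the
# probability of human error; the two-node network supports causal inference
# (P(error) given the organizational state) and diagnostic inference
# (P(poor procedures) given that an error occurred). Numbers are invented.

p_poor_procedures = 0.2                       # prior on the organizational factor
p_error_given = {"poor": 0.05, "good": 0.005} # conditional human error probabilities

# Causal (predictive) inference: marginal probability of human error.
p_error = (p_error_given["poor"] * p_poor_procedures
           + p_error_given["good"] * (1.0 - p_poor_procedures))
print(f"P(human error) = {p_error:.4f}")

# Diagnostic inference: given an observed error, how likely are poor procedures?
p_poor_given_error = p_error_given["poor"] * p_poor_procedures / p_error
print(f"P(poor procedures | human error) = {p_poor_given_error:.3f}")
```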

  6. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    Schölkopf et al. introduce kernel PCA. Shawe-Taylor and Cristianini provide an excellent reference for kernel methods in general. Bishop and Press et al. describe kernel methods among many other subjects. Nielsen and Canty use kernel PCA to detect change in univariate airborne digital camera images. The kernel version of PCA handles nonlinearities by implicitly transforming data into a high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis
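
    A minimal numpy sketch of kernel PCA as described (Gaussian kernel, centring of the kernel matrix, eigen-analysis in feature space); the data are random placeholders, and a kernel MAF analysis would additionally use spatially shifted data to maximise autocorrelation rather than variance.

```python
import numpy as np

# Kernel PCA sketch: map data implicitly into feature space with a Gaussian
# kernel, centre the kernel matrix, and perform the eigen-analysis there.
# The data are random placeholders, not airborne camera images.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                 # 200 samples, 5 variables

# Gaussian (RBF) kernel matrix.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
sigma2 = np.median(sq_dists)
K = np.exp(-sq_dists / (2.0 * sigma2))

# Centre the kernel matrix in feature space.
n = K.shape[0]
one_n = np.full((n, n), 1.0 / n)
Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

# Eigendecomposition; leading eigenvectors give the nonlinear component scores.
eigval, eigvec = np.linalg.eigh(Kc)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]
scores = eigvec[:, :2] * np.sqrt(np.clip(eigval[:2], 0.0, None))  # first two kernel PCs
print("share of kernel variance in first two kernel PCs:",
      np.round(eigval[:2] / eigval.sum(), 3))
```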

  7. Cell lineage analysis demonstrates an endodermal origin of the distal urethra and perineum.

    Science.gov (United States)

    Seifert, Ashley W; Harfe, Brian D; Cohn, Martin J

    2008-06-01

    Congenital malformations of anorectal and genitourinary (collectively, anogenital) organs occur at a high frequency in humans, however the lineage of cells that gives rise to anogenital organs remains poorly understood. The penile urethra has been reported to develop from two cell populations, with the proximal urethra developing from endoderm and the distal urethra forming from an apical ectodermal invagination, however this has never been tested by direct analysis of cell lineage. During gut development, endodermal cells express Sonic hedgehog (Shh), which is required for normal patterning of digestive and genitourinary organs. We have taken advantage of the properties of Shh expression to genetically label and follow the fate of posterior gut endoderm during anogenital development. We report that the entire urethra, including the distal (glandar) region, is derived from endoderm. Cloacal endoderm also gives rise to the epithelial linings of the bladder, rectum and anterior region of the anus. Surprisingly, the lineage map also revealed an endodermal origin of the perineum, which is the first demonstration that endoderm differentiates into skin. In addition, we fate mapped genital tubercle ectoderm and show that it makes no detectable contribution to the urethra. In males, formation of the urethral tube involves septation of the urethral plate by continued growth of the urorectal septum. Analysis of cell lineage following disruption of androgen signaling revealed that the urethral plate of flutamide-treated males does not undergo this septation event. Instead, urethral plate cells persist to the ventral margin of the tubercle, mimicking the pattern seen in females. Based on these spatial and temporal fate maps, we present a new model for anogenital development and suggest that disruptions at specific developmental time points can account for the association between anorectal and genitourinary defects.

  8. Demonstration uncertainty/sensitivity analysis using the health and economic consequence model CRAC2

    International Nuclear Information System (INIS)

    Alpert, D.J.; Iman, R.L.; Johnson, J.D.; Helton, J.C.

    1984-12-01

    The techniques for performing uncertainty/sensitivity analyses compiled as part of the MELCOR program appear to be well suited for use with a health and economic consequence model. Two replicate samples of size 50 gave essentially identical results, indicating that for this case, a Latin hypercube sample of size 50 seems adequate to represent the distribution of results. Though the intent of this study was a demonstration of uncertainty/sensitivity analysis techniques, a number of insights relevant to health and economic consequence modeling can be gleaned: uncertainties in early deaths are significantly greater than uncertainties in latent cancer deaths; though the magnitude of the source term is the largest source of variation in estimated distributions of early deaths, a number of additional parameters are also important; even with the release fractions for a full SST1, one quarter of the CRAC2 runs gave no early deaths; and comparison of the estimates of mean early deaths for a full SST1 release in this study with those of recent point estimates for similar conditions indicates that the recent estimates may be significant overestimations of early deaths. Estimates of latent cancer deaths, however, are roughly comparable. An analysis of the type described here can provide insights in a number of areas. First, the variability in the results gives an indication of the potential uncertainty associated with the calculations. Second, the sensitivity of the results to assumptions about the input variables can be determined. Research efforts can then be concentrated on reducing the uncertainty in the variables which are the largest contributors to uncertainty in results
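
    A minimal sketch of the Latin hypercube uncertainty/sensitivity idea with a toy consequence model and rank correlations as the sensitivity measure; the input distributions and the model are placeholders and do not represent CRAC2.

```python
import numpy as np
from scipy.stats import spearmanr

# Draw a stratified (Latin hypercube) sample of uncertain inputs, run a toy
# consequence model, and use rank correlations to see which inputs drive the
# spread of the results. Inputs and model are placeholders, not CRAC2.

def latin_hypercube(n_samples, n_vars, rng):
    """Uniform(0,1) Latin hypercube sample of shape (n_samples, n_vars)."""
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])
    return u

rng = np.random.default_rng(42)
u = latin_hypercube(50, 3, rng)                      # sample of size 50, as in the study

source_term = 10 ** (u[:, 0] * 2 - 1)                # log-uniform release scaling
wind_speed = 1.0 + 9.0 * u[:, 1]                     # uniform 1-10 m/s
dose_factor = 0.5 + u[:, 2]                          # uniform 0.5-1.5

early_deaths = 100.0 * source_term / wind_speed * dose_factor   # toy consequence model

print(f"mean early deaths: {early_deaths.mean():.1f}")
for name, var in [("source term", source_term), ("wind speed", wind_speed),
                  ("dose factor", dose_factor)]:
    rho, _ = spearmanr(var, early_deaths)
    print(f"rank correlation with {name}: {rho:+.2f}")
```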

  9. Design and analysis of electrical energy storage demonstration projects on UK distribution networks

    International Nuclear Information System (INIS)

    Lyons, P.F.; Wade, N.S.; Jiang, T.; Taylor, P.C.; Hashiesh, F.; Michel, M.; Miller, D.

    2015-01-01

    Highlights: • Results of an EES system demonstration project carried out in the UK. • Approaches to the design of trials for EES and observation on their application. • A formalised methodology for analysis of smart grids trials. • Validated models of energy storage. • Capability of EES to connect larger quantities of heat pumps and PV is evaluated. - Abstract: The UK government’s CO2 emissions targets will require electrification of much of the country’s infrastructure with low carbon technologies such as photovoltaic panels, electric vehicles and heat pumps. The large scale proliferation of these technologies will necessitate major changes to the planning and operation of distribution networks. Distribution network operators are trialling electrical energy storage (EES) across their networks to increase their understanding of the contribution that it can make to enable the expected paradigm shift in generation and consumption of electricity. In order to evaluate a range of applications for EES, including voltage control and power flow management, installations have taken place at various distribution network locations and voltage levels. This article reports on trial design approaches and their application to a UK trial of an EES system to ensure broad applicability of the results. Results from these trials of an EES system, low carbon technologies and trial distribution networks are used to develop validated power system models. These models are used to evaluate, using a formalised methodology, the impact that EES could have on the design and operation of future distribution networks

  10. From demonstration to deployment: An economic analysis of support policies for carbon capture and storage

    International Nuclear Information System (INIS)

    Krahé, Max; Heidug, Wolf; Ward, John; Smale, Robin

    2013-01-01

    This paper argues that an integrated policy architecture consisting of multiple policy phases and economic instruments is needed to support the development of carbon capture and storage (CCS) from its present demonstration phase to full-scale deployment. Building on an analysis of the different types of policy instruments to correct market failures specific to CCS in its various stages of development, we suggest a way to combine these into an integrated policy architecture. This policy architecture adapts to the need of a maturing technology, meets the requirement of policymakers to maintain flexibility to respond to changing circumstances while providing investors with the policy certainty that is needed to encourage private sector investment. This combination of flexibility and predictability is achieved through the use of ‘policy gateways’ which explicitly define rules and criteria for when and how policy settings will change. Our findings extend to bioenergy-based CCS applications (BECCS), which could potentially achieve negative emissions. We argue that within a framework of correcting the carbon externality, the added environmental benefits of BECCS should be reflected in an extra incentive. - Highlights: • Sensible aim of current climate policy: secure option of future CCS deployment. • But policy makers require flexibility while private investors require predictability. • Integrating CCS policy into an overall policy architecture can overcome this antinomy. • We describe the key features of a good policy architecture and give an example

  11. DISRUPTIVE EVENT BIOSPHERE DOSE CONVERSION FACTOR ANALYSIS

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The Biosphere Model Report (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objective of this analysis was to develop the BDCFs for the volcanic

  12. Analysis of mineral phases in coal utilizing factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    The mineral phase inclusions of coal are discussed. The contribution of these to a coal sample is determined utilizing several techniques. Neutron activation analysis in conjunction with coal washability studies has produced some information on the general trends of elemental variation in the mineral phases. These results have been enhanced by the use of various statistical techniques. The target transformation factor analysis is specifically discussed and shown to be able to produce elemental profiles of the mineral phases in coal. A data set consisting of physically fractionated coal samples was generated. These samples were analyzed by neutron activation analysis and then their elemental concentrations examined using TTFA. Information concerning the mineral phases in coal can thus be acquired from factor analysis even with limited data. Additional data may permit the resolution of additional mineral phases as well as refinement of those already identified
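
    A small synthetic sketch of the target transformation idea: retain the leading factors of a samples-by-elements concentration matrix and test candidate mineral element profiles against the retained factor space; the data matrix and candidate profiles are invented.

```python
import numpy as np

# Retain the leading principal components of a (samples x elements) matrix and
# test a candidate mineral profile by projecting it onto the retained factor
# space; a small reproduction error suggests the target is a real contributor.
# The data and the candidate profiles are synthetic placeholders.

rng = np.random.default_rng(1)
pyrite_like = np.array([0.0, 5.0, 0.0, 8.0])        # toy element profile of one phase
clay_like = np.array([6.0, 0.5, 3.0, 0.2])          # toy profile of a second phase
mixing = rng.uniform(0.1, 1.0, size=(30, 2))        # 30 samples, 2 hidden phases
X = mixing @ np.vstack([pyrite_like, clay_like]) + rng.normal(0, 0.05, (30, 4))

# PCA of the column-centred data; keep two factors.
Xc = X - X.mean(axis=0)
_, s, vt = np.linalg.svd(Xc, full_matrices=False)
V = vt[:2].T                                        # element loadings of the retained factors

def target_test(candidate):
    """Reproduce a candidate profile from the retained factor space."""
    centred = candidate - X.mean(axis=0)
    reproduced = V @ (V.T @ centred) + X.mean(axis=0)
    return np.linalg.norm(reproduced - candidate) / np.linalg.norm(candidate)

print(f"relative error for pyrite-like target: {target_test(pyrite_like):.3f}")
print(f"relative error for a random target:    {target_test(rng.uniform(0, 8, 4)):.3f}")
```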

  13. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    Full Text Available The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works and its criteria, including its main assumptions are discussed as well as when it should be used. Mathematical theories are explored to enlighten students on how exploratory factor analysis works, an example of how to run an exploratory factor analysis on SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.
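
    Since the article walks through an EFA in SPSS, a minimal equivalent sketch in Python on synthetic two-factor data may be useful; it uses scikit-learn's FactorAnalysis with a varimax rotation, and the item structure is invented for illustration.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic data with two underlying factors; each item loads mainly on one of
# them. Variable names and the data-generating model are invented.

rng = np.random.default_rng(0)
n = 500
factor1 = rng.normal(size=n)          # e.g. a "verbal" latent trait
factor2 = rng.normal(size=n)          # e.g. a "quantitative" latent trait
items = np.column_stack([
    0.8 * factor1 + rng.normal(scale=0.5, size=n),
    0.7 * factor1 + rng.normal(scale=0.5, size=n),
    0.9 * factor1 + rng.normal(scale=0.5, size=n),
    0.8 * factor2 + rng.normal(scale=0.5, size=n),
    0.7 * factor2 + rng.normal(scale=0.5, size=n),
    0.9 * factor2 + rng.normal(scale=0.5, size=n),
])

fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(items)
print("rotated loadings (items x factors):")
print(np.round(fa.components_.T, 2))   # each item should load mainly on one factor
```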

  14. Capital Cost Optimization for Prefabrication: A Factor Analysis Evaluation Model

    Directory of Open Access Journals (Sweden)

    Hong Xue

    2018-01-01

    Full Text Available High capital cost is a significant hindrance to the promotion of prefabrication. In order to optimize cost management and reduce capital cost, this study aims to explore the latent factors and a factor analysis evaluation model. Semi-structured interviews were conducted to explore potential variables, and then a questionnaire survey was employed to collect professionals’ views on their effects. After data collection, exploratory factor analysis was adopted to explore the latent factors. Seven latent factors were identified, including “Management Index”, “Construction Dissipation Index”, “Productivity Index”, “Design Efficiency Index”, “Transport Dissipation Index”, “Material Increment Index” and “Depreciation Amortization Index”. With these latent factors, a factor analysis evaluation model (FAEM), divided into a factor analysis model (FAM) and a comprehensive evaluation model (CEM), was established. The FAM was used to explore the effect of observed variables on the high capital cost of prefabrication, while the CEM was used to evaluate the comprehensive cost management level of prefabrication projects. Case studies were conducted to verify the models. The results revealed that collaborative management had a positive effect on the capital cost of prefabrication. Material increment costs and labor costs had significant impacts on production cost. This study demonstrated the potential of on-site management and standardization design to reduce capital cost. Hence, collaborative management is necessary for cost management of prefabrication. Innovation and detailed design are needed to improve cost performance. New forms of precast component factories can be explored to reduce transportation cost. Meanwhile, targeted strategies can be adopted for different prefabrication projects. The findings optimized the capital cost and improved the cost performance through providing an evaluation and optimization model, which helps managers to

  15. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed description of the model input parameters, their development, and the relationship between the parameters and specific features events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle

  16. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still a question of how these methods perform in within-subjects P-technique factor analysis. A…
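
    One of the commonly evaluated criteria is Horn's parallel analysis; a sketch for the usual R-technique case is given below (the question raised by the abstract is how such criteria behave for P-technique data). The data are synthetic placeholders.

```python
import numpy as np

# Horn's parallel analysis: retain components whose observed correlation-matrix
# eigenvalues exceed the corresponding eigenvalues obtained from random data of
# the same size. The data here are synthetic placeholders with two factors.

rng = np.random.default_rng(0)
n_obs, n_vars = 300, 8
latent = rng.normal(size=(n_obs, 2))
loadings = rng.uniform(0.5, 0.9, size=(2, n_vars))
data = latent @ loadings + rng.normal(scale=0.7, size=(n_obs, n_vars))

observed_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

n_sims = 200
random_eig = np.empty((n_sims, n_vars))
for i in range(n_sims):
    noise = rng.normal(size=(n_obs, n_vars))
    random_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(noise, rowvar=False)))[::-1]
threshold = np.percentile(random_eig, 95, axis=0)

n_factors = int(np.sum(observed_eig > threshold))
print("observed eigenvalues:", np.round(observed_eig, 2))
print("95th-percentile random eigenvalues:", np.round(threshold, 2))
print(f"factors suggested by parallel analysis: {n_factors}")
```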

  17. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis report (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  18. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis report (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  19. Exploratory Bi-Factor Analysis: The Oblique Case

    Science.gov (United States)

    Jennrich, Robert I.; Bentler, Peter M.

    2012-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford ("Psychometrika" 2:41-54, 1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler ("Psychometrika" 76:537-549, 2011) introduced an exploratory form of bi-factor…

  20. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    Science.gov (United States)

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction, rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.

  1. Performance analysis of an optical self-interference cancellation system with a directly modulated laser-based demonstration.

    Science.gov (United States)

    Yu, Yinghong; Zhang, Yunhao; Huang, Lin; Xiao, Shilin

    2018-02-20

    In this paper, two main performance indices of the optical self-interference cancellation (OSIC) system are theoretically analyzed: cancellation bandwidth and depth. Delay deviation is investigated to be the determining factor of cancellation bandwidth, based on which the bandwidth advantage of the OSIC system over electrical schemes is also proven theoretically. Cancellation depth in the narrowband is mostly influenced by attenuation and delay-adjusting deviation, while in the broadband case, the performance is mostly limited by frequency-dependent amplitude and phase mismatch. The cancellation performance analysis is suitable for most linear modulation-demodulation OSIC systems, including the directly modulated laser (DML)-based OSIC system verified experimentally in this paper. The cancellation model is well demonstrated by the agreement between experimental cancellation results and predicted performance. For over-the-air demonstration with the employment of antennas, broadband cancellation within 450 MHz bandwidth of 22 dB and 25 dB is achieved at 900 MHz and 2.4 GHz, respectively. In addition, orthogonal frequency division multiplexing signals are employed to show in-band full-duplex transmission with good performance by the DML-based OSIC system, with successful suppression of self-interference and recovery of the signal of interest.
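
    A generic sketch of how amplitude and delay mismatch bound the cancellation depth of a linear canceller, assuming the phase is already matched at the carrier so only the frequency-dependent part of the delay deviation remains; this is not the specific DML-based system of the paper, and the mismatch values are placeholders.

```python
import numpy as np

# For a signal component at offset df from the carrier, the residual after
# subtracting an amplitude- and delay-mismatched copy is
# |1 - (1 + eps) * exp(-j*2*pi*df*dtau)|^2 (phase assumed matched at the
# carrier). Mismatch values are assumptions, not measurements from the paper.

def cancellation_depth_db(freq_offset_hz, amp_error, delay_error_s):
    residual = np.abs(1.0 - (1.0 + amp_error)
                      * np.exp(-2j * np.pi * freq_offset_hz * delay_error_s)) ** 2
    return -10.0 * np.log10(residual)

offsets = np.linspace(-225e6, 225e6, 1001)     # 450 MHz band around the carrier
amp_error = 10 ** (0.1 / 20) - 1.0             # 0.1 dB attenuation mismatch (assumed)
delay_error = 10e-12                           # 10 ps delay-adjusting deviation (assumed)

depth = cancellation_depth_db(offsets, amp_error, delay_error)
print(f"depth at the carrier (amplitude error only): "
      f"{cancellation_depth_db(0.0, amp_error, 0.0):.1f} dB")
print(f"worst-case depth across the 450 MHz band:    {depth.min():.1f} dB")
```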

  2. Exploratory Bi-factor Analysis: The Oblique Case

    OpenAIRE

    Jennrich, Robert L.; Bentler, Peter M.

    2011-01-01

    Bi-factor analysis is a form of confirmatory factor analysis originally introduced by Holzinger and Swineford (1937). The bi-factor model has a general factor, a number of group factors, and an explicit bi-factor structure. Jennrich and Bentler (2011) introduced an exploratory form of bi-factor analysis that does not require one to provide an explicit bi-factor structure a priori. They use exploratory factor analysis and a bi-factor rotation criterion designed to produce a rotated loading mat...

  3. Phase 1 Characterization sampling and analysis plan West Valley demonstration project.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, R. L. (Environmental Science Division)

    2011-06-30

    The Phase 1 Characterization Sampling and Analysis Plan (CSAP) provides details about environmental data collection that will be taking place to support Phase 1 decommissioning activities described in the Phase 1 Decommissioning Plan for the West Valley Demonstration Project, Revision 2 (Phase I DP; DOE 2009). The four primary purposes of CSAP data collection are: (1) pre-design data collection, (2) remedial support, (3) post-remediation status documentation, and (4) Phase 2 decision-making support. Data collection to support these four main objectives is organized into two distinct data collection efforts. The first is data collection that will take place prior to the initiation of significant Phase 1 decommissioning activities (e.g., the Waste Management Area [WMA] 1 and WMA 2 excavations). The second is data collection that will occur during and immediately after environmental remediation in support of remediation activities. Both data collection efforts have a set of well-defined objectives that encompass the data needs of the four main CSAP data collection purposes detailed in the CSAP. The main body of the CSAP describes the overall data collection strategies that will be used to satisfy data collection objectives. The details of pre-remediation data collection are organized by WMA. The CSAP contains an appendix for each WMA that describes the details of WMA-specific pre-remediation data collection activities. The CSAP is intended to expand upon the data collection requirements identified in the Phase 1 Decommissioning Plan. The CSAP is intended to tightly integrate with the Phase 1 Final Status Survey Plan (FSSP). Data collection described by the CSAP is consistent with the FSSP where appropriate and to the extent possible.

  4. Regulatory analysis of the Underground Storage Tank-Integrated Demonstration Program

    International Nuclear Information System (INIS)

    Smith, E.H.

    1992-01-01

    The Underground Storage Tank-Integrated Demonstration (UST-ID) Program has been developed to identify, demonstrate, test, and evaluate technologies that will provide alternatives to the current underground storage tank remediation program. The UST-ID Program is a national program that consists of five participating US Department of Energy (DOE) sites where technologies can be developed and ultimately demonstrated. Once these technologies are demonstrated, the UST-ID Program will transfer the developed technology system to industry (governmental or industrial) for application or back to Research and Development for further evaluation and modification, as necessary. In order to ensure that the UST-ID Program proceeds without interruption, it will be necessary to identify regulatory requirements along with associated permitting and notification requirements early in the technology development process. This document serves as a baseline for identifying certain federal and state regulatory requirements that may impact the UST-ID Program and the demonstration of any identified technologies.

  5. Shielding design study of the demonstration fast breeder reactor. 2. Shielding design on the basis of the JASPER analysis

    International Nuclear Information System (INIS)

    Suzuoki, Zenro; Tabayashi, Masao; Handa, Hiroyuki; Iida, Masaaki; Takemura, Morio

    2000-01-01

    Conceptual shielding design has been performed for the Demonstration Fast Breeder Reactor (DFBR) to achieve further optimization and reduction of the plant construction cost. The design took into account its implications in overall plant configuration such as reduction of shields in the core, adoption of fission gas plenum in the lower portion of fuel assemblies, and adoption of gas expansion modules. Shielding criteria applied for the design are to secure fast neutron fluence on in-vessel structures as well as responses of the nuclear instrumentation system and to restrict secondary sodium activation. The design utilized the cross sections and the one- and two-dimensional discrete ordinates transport codes, whose verification had been performed by the JASPER experiment analysis. Correction factors yielded by the JASPER analysis were applied to the design calculations to obtain design values with improved accuracy. Design margins, which are defined by the ratios of the design criteria to the design values, were more than two for all shielding issues of interest, showing the adequacy of the shielding design of the DFBR. (author)
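
    A toy illustration of the two quantities mentioned, a JASPER-derived correction factor applied to a calculated response and a design margin defined as the ratio of the criterion to the corrected design value; all numbers are invented placeholders, not DFBR results.

```python
# Toy design-margin check: correct the transport-code result with a
# calculation-to-experiment factor, then compare against the criterion.
# All numbers are invented placeholders.

calculated_fluence = 8.0e21      # fast neutron fluence from the transport code, n/cm^2
correction_factor = 1.3          # calculation-to-experiment factor from benchmark analysis
criterion = 3.0e22               # allowable fast neutron fluence on the structure, n/cm^2

design_value = calculated_fluence * correction_factor
design_margin = criterion / design_value
print(f"design value: {design_value:.2e} n/cm^2, design margin: {design_margin:.1f}")
```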

  6. Prevalence of ischemic cardiopathy demonstrated by scintigraphy in persons under 40 years of age and its association with risk factors

    International Nuclear Information System (INIS)

    Cano G, M.A.; Castillo M, L.; Orea T, A.

    2005-01-01

    Coronary artery disease (EAC) is the leading cause of death among Mexicans. Among its numerous risk factors, age stands out, with risk rising from 45 years onward. The objective of this investigation was to determine the prevalence of ischemic cardiopathy (CI) and myocardial infarction (IAM) in subjects under 40 years of age and to identify risk factors. Myocardial perfusion imaging (EPM) is a non-invasive study of high sensitivity and specificity that allows obstructive coronary lesions to be detected. The method was a retrospective, cross-sectional study of 125 patients under 40 years of age. Files of patients who had undergone EPM with Technetium-99m SESTAMIBI (one-day protocol) were reviewed, and the short and long axes (vertical and horizontal) were analyzed. General data, somatometry, emotional profile analysis, and lipid and glucose profiles were collected. Results: the population comprised 53% women and 47% men, with a mean age of 31.9 years and a body mass index (IMC) of 25.1 kg/m2. Abnormal studies were obtained in 46% of cases, of which 35% were compatible with ischemic cardiopathy (CI) and 11% with myocardial infarction (IAM). The characteristics of these groups were: age 31.6±6 vs 32.6±5.9 years; IMC 25.4±7.0 vs 24.4±3.34 kg/m2; height 161.6±9.8 vs 165.5±9.7 cm; systolic blood pressure (TAS) 139.1±29.2 vs 115±13.4 mm Hg; diastolic blood pressure (TAD) 84.5±17.4 vs 75±9.4 mm Hg; marital status married 65.5% (p=0.005) vs single 57%; major depression 32% vs anxiety 28%, in the groups of patients with CI and IAM, respectively. In the IAM population, chronic renal insufficiency (IRC) was additionally found in 21% (p=0.030), systemic arterial hypertension (HAS) in 21% (p=0.025) and drug addiction in 21% (p=0.002). The remaining results showed no significant differences. Conclusion: only 6.5% of the patients referred for EPM with 99mTc-SESTAMIBI over a six-year period were under 40 years of age. 71% of them were referred for precordial pain, and in almost half of these CI or IAM was demonstrated. In this investigation, besides the

  7. Photonically wired spacecraft panels: an economic analysis and demonstrator for telecommunication satellites

    Science.gov (United States)

    Putzer, Philipp; Hurni, Andreas; Ziegler, Bent; Panopoulou, Aikaterini; Lemke, Norbert; Costa, Ivo; Pereira, Celeste

    2017-09-01

    In this paper we present the design of smart satellite panels with integrated optical fibers for sensing and data communication. The project starts with a detailed analysis of the system needs and ends with a demonstrator breadboard showing the full performance during and after environmental tests such as vibration and temperature. Future science missions will need higher bandwidth in the Gbit/s range for intra-satellite communications, so moving from electrical transmission media to fiber-optic media is the logical next step to cope with future requirements. In addition, the fibers can be used to monitor temperatures directly underneath satellite payloads, which will reduce the integration effort in a later phase. For temperature monitoring, so-called fiber Bragg gratings (FBGs) are written into special radiation-tolerant fibers; their reflection wavelength provides a direct measure of the temperature at the grating position. A read-out system for FBGs for use within satellite applications is currently under development at OHB. For this study, first the environmental requirements for the panels are derived and in a second stage the functional requirements are defined. To define the functional requirements, a telecommunication satellite platform, in this case the Small-GEO series from OHB, has been taken as the baseline. Based on the configuration of temperature sensors, communication lines and electrical signaling, a possible replacement by fiber-optic technology was defined and traded w.r.t. its economic benefit. It was found that the replacement of temperature sensors will reduce harness mass, but the greater benefit is seen in the reduction of assembly effort: once the satellite panel is manufactured, the temperature sensors are already implemented at certain positions. Another identified opportunity for mass savings is the replacement of the high-voltage or high-current high power commands (HPC) by fiber optics. Replacing some of the several
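
    The FBG temperature read-out principle mentioned above can be sketched as a simple linear wavelength-to-temperature conversion. The reference wavelength and the sensitivity of roughly 10 pm/°C are generic textbook values for silica fibres, not parameters of the OHB read-out system.

```python
# Sketch of the FBG read-out principle mentioned above: the Bragg wavelength
# shifts roughly linearly with temperature. The sensitivity used here
# (~10 pm/degC near 1550 nm) is a typical textbook value, not a value from
# the OHB system.

LAMBDA_REF_NM = 1550.000       # Bragg wavelength at the reference temperature (assumed)
T_REF_C = 20.0                 # reference temperature in degC (assumed)
SENSITIVITY_NM_PER_C = 0.010   # ~10 pm/degC, typical for silica FBGs

def fbg_temperature(measured_wavelength_nm):
    """Convert a measured Bragg wavelength into a temperature estimate."""
    delta_lambda = measured_wavelength_nm - LAMBDA_REF_NM
    return T_REF_C + delta_lambda / SENSITIVITY_NM_PER_C

print(fbg_temperature(1550.250))  # +0.25 nm shift -> roughly 45 degC
```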

  8. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this

  9. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component...... pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling...... shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...
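
    The full psFA model (ARD-based component pruning, heteroscedastic spatial noise, variational Bayes inference) is not available in standard packages. As a rough baseline only, the sketch below runs scikit-learn's plain probabilistic factor analysis on simulated group data stacked along the time axis; it omits the sparsity and noise-modeling features described above.

```python
# Rough baseline only: scikit-learn's FactorAnalysis is a maximum-likelihood
# probabilistic FA without the sparsity/ARD and heteroscedastic noise terms of
# the psFA model described above. Data here are simulated, not fMRI.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_subjects, n_time, n_voxels, n_comp = 5, 200, 50, 3

# Shared spatial components, subject-specific time courses plus noise.
components = rng.normal(size=(n_comp, n_voxels))
data = np.vstack([
    rng.normal(size=(n_time, n_comp)) @ components
    + 0.5 * rng.normal(size=(n_time, n_voxels))
    for _ in range(n_subjects)
])  # subjects stacked along the time axis, as in a group analysis

fa = FactorAnalysis(n_components=n_comp, random_state=0).fit(data)
print(fa.components_.shape)      # (3, 50) estimated spatial maps
print(fa.noise_variance_.shape)  # per-voxel noise variances
```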

  10. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  11. Kernel parameter dependence in spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2010-01-01

    kernel PCA. Shawe-Taylor and Cristianini [4] is an excellent reference for kernel methods in general. Bishop [5] and Press et al. [6] describe kernel methods among many other subjects. The kernel version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional...... feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply a kernel version of maximum autocorrelation factor (MAF) [7, 8] analysis to irregularly sampled stream sediment geochemistry data from South Greenland and illustrate the dependence...... of the kernel width. The 2,097 samples each covering on average 5 km2 are analyzed chemically for the content of 41 elements....
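
    Kernel MAF itself is not part of common libraries, but the dependence on the kernel width can be illustrated with ordinary kernel PCA in scikit-learn on synthetic two-dimensional data (not the South Greenland geochemistry); the RBF parameter gamma plays the role of the kernel width.

```python
# Illustration of the kernel-width dependence discussed above, using ordinary
# kernel PCA (scikit-learn) on synthetic 2-D data; kernel MAF itself is not
# part of scikit-learn. For an RBF kernel, gamma = 1 / (2 * sigma^2).
import numpy as np
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA

X, _ = make_moons(n_samples=300, noise=0.05, random_state=0)

for gamma in (0.1, 1.0, 10.0, 100.0):
    kpca = KernelPCA(n_components=2, kernel="rbf", gamma=gamma)
    Z = kpca.fit_transform(X)
    # The spread of the leading component changes markedly with the kernel width.
    print(f"gamma={gamma:6.1f}  var(PC1)={Z[:, 0].var():.4f}")
```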

  12. Sequential-Injection Analysis: Principles, Instrument Construction, and Demonstration by a Simple Experiment

    Science.gov (United States)

    Economou, A.; Tzanavaras, P. D.; Themelis, D. G.

    2005-01-01

    Sequential-injection analysis (SIA) is an approach to sample handling that enables the automation of manual wet-chemistry procedures in a rapid, precise and efficient manner. Experiments using SIA fit well in the course of Instrumental Chemical Analysis and especially in the section on Automatic Methods of Analysis provided by chemistry…

  13. A Markov Chain Monte Carlo Approach to Confirmatory Item Factor Analysis

    Science.gov (United States)

    Edwards, Michael C.

    2010-01-01

    Item factor analysis has a rich tradition in both the structural equation modeling and item response theory frameworks. The goal of this paper is to demonstrate a novel combination of various Markov chain Monte Carlo (MCMC) estimation routines to estimate parameters of a wide variety of confirmatory item factor analysis models. Further, I show…

  14. SDP_mharwit_1: Demonstration of HIFI Linear Polarization Analysis of Spectral Features

    Science.gov (United States)

    Harwit, M.

    2010-03-01

    We propose to observe the polarization of the 621 GHz water vapor maser in VY Canis Majoris to demonstrate the capability of HIFI to make polarization observations of Far-Infrared/Submillimeter spectral lines. The proposed Demonstration Phase would: - Show that HIFI is capable of interesting linear polarization measurements of spectral lines; - Test out the highest spectral resolving power to sort out closely spaced Doppler components; - Determine whether the relative intensities predicted by Neufeld and Melnick are correct; - Record the degree and direction of linear polarization for the closely-Doppler shifted peaks.

  15. Qualitative analysis of factors leading to clinical incidents.

    Science.gov (United States)

    Smith, Matthew D; Birch, Julian D; Renshaw, Mark; Ottewill, Melanie

    2013-01-01

    The purpose of this paper is to evaluate the common themes leading or contributing to clinical incidents in a UK teaching hospital. A root-cause analysis was conducted on patient safety incidents. Commonly occurring root causes and contributing factors were collected and correlated with incident timing and severity. In total, 65 root-cause analyses were reviewed, highlighting 202 factors implicated in the clinical incidents, from which 69 categories were identified. The 14 most commonly occurring causes (encountered in four or more incidents) were examined as key-root or contributory causes. Incident timing was also analysed; common factors were encountered more frequently out of hours, occurring as contributory rather than key-root causes. In total, 14 commonly occurring factors were identified to direct interventions that could prevent many clinical incidents. From these, an "Organisational Safety Checklist" was developed to involve departmental-level clinicians in monitoring practice. This study demonstrates that comprehensively investigating incidents highlights common factors that can be addressed at a local level. Resilience against clinical incidents is low during out-of-hours periods, where factors such as lower staffing levels and poor service provision allow problems to escalate and become clinical incidents; this adds to the literature regarding out-of-hours care provision and should prove useful to those organising hospital services at departmental and management levels.

  16. Cost-Effectiveness Analysis of Early Reading Programs: A Demonstration with Recommendations for Future Research

    Science.gov (United States)

    Hollands, Fiona M.; Kieffer, Michael J.; Shand, Robert; Pan, Yilin; Cheng, Henan; Levin, Henry M.

    2016-01-01

    We review the value of cost-effectiveness analysis for evaluation and decision making with respect to educational programs and discuss its application to early reading interventions. We describe the conditions for a rigorous cost-effectiveness analysis and illustrate the challenges of applying the method in practice, providing examples of programs…

  17. A Cost Analysis Plan for the National Preventive Dentistry Demonstration Program.

    Science.gov (United States)

    Foch, Craig B.

    The National Preventive Dentistry Demonstration Project (NPDDP) delivers school-based preventive dental care to approximately 14,000 children in ten United States cities. The program, begun in 1976, is to be conducted over a six and one-half year period. The costing definitions and allocation rules to be used in the project are the principal…

  18. Dynamic Simulation, Sensitivity and Uncertainty Analysis of a Demonstration Scale Lignocellulosic Enzymatic Hydrolysis Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Sin, Gürkan

    2014-01-01

    This study presents the uncertainty and sensitivity analysis of a lignocellulosic enzymatic hydrolysis model considering both model and feed parameters as sources of uncertainty. The dynamic model is parametrized for accommodating various types of biomass, and different enzymatic complexes...

  19. Human Factors Assessment of the UH-60M Crew Station During the Early User Demonstration Number 2 (EUD2)

    National Research Council Canada - National Science Library

    Kennedy, Joshua S; Durbin, David B

    2005-01-01

    Pilot workload, situational awareness (SA), and the pilot-vehicle interface (PVI) characteristics associated with the UH-60M Black Hawk crew station simulator were assessed during the Early User Demonstration...

  20. An analysis of the demonstration projects for renewable energy application buildings in China

    International Nuclear Information System (INIS)

    Liu, Xingmin; Ren, Hong; Wu, Yong; Kong, Deping

    2013-01-01

    During the 2006–2008 period, there were 386 demonstration projects for renewable energy application buildings (REAB) organised by the Chinese government, with a total area of approximately 40,420,000 m2. By the end of 2011, the vast majority of these projects had been completed and had passed final acceptance. This paper analyses the measures taken by the Chinese government, including economic incentive mechanisms, organising agencies, application and evaluation systems, online monitoring platforms, acceptance inspections, assessment systems, standard criteria and so forth, and then evaluates the policy effects. The paper shows that there has been a satisfactory effect on the development of the REAB market, mobilising the enthusiasm of the government, equipment manufacturers and scientific research institutions, and promoting energy conservation. In addition, this paper analyses the suitability of different technological types in different climatic zones, which provides further guidance for the development of the REAB. Finally, based on analyses of the problems met in the implementation of the demonstration projects, this paper proposes some policy suggestions concerning standard criteria, technological development, project management, incentive mechanisms and so on, to promote the development of the REAB more effectively in the future in China. - Highlights: • The policy measures to promote the development of renewable energy application buildings in China. • Evaluation of the demonstration policy effects on market development and other aspects. • Analyses of the regional applicability of renewable energy application buildings in China. • Analyses of problems met in the implementation of the demonstration projects. • Policy suggestions put forward on standards, technology, management, etc

  1. Technology demonstration: geostatistical and hydrologic analysis of salt areas. Assessment of effectiveness of geologic isolation systems

    International Nuclear Information System (INIS)

    Doctor, P.G.; Oberlander, P.L.; Rice, W.A.; Devary, J.L.; Nelson, R.W.; Tucker, P.E.

    1982-09-01

    The Office of Nuclear Waste Isolation (ONWI) requested Pacific Northwest Laboratory (PNL) to: (1) use geostatistical analyses to evaluate the adequacy of hydrologic data from three salt regions, each of which contains a potential nuclear waste repository site; and (2) demonstrate a methodology that allows quantification of the value of additional data collection. The three regions examined are the Paradox Basin in Utah, the Permian Basin in Texas, and the Mississippi Study Area. Additional and new data became available to ONWI during and following these analyses; therefore, this report must be considered a methodology demonstration, and the approach demonstrated here would apply as illustrated had the complete data sets been available. A combination of geostatistical and hydrologic analyses was used for this demonstration. Geostatistical analyses provided an optimal estimate of the potentiometric surface from the available data, a measure of the uncertainty of that estimate, and a means for selecting and evaluating the location of future data. The hydrologic analyses included the calculation of transmissivities, flow paths, travel times, and ground-water flow rates from hypothetical repository sites. Simulation techniques were used to evaluate the effect of optimally located future data on the potentiometric surface, flow lines, travel times, and flow rates. Data availability, quality, quantity, and conformance with model assumptions differed in each of the salt areas. Report highlights for the three locations are given
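
    A minimal ordinary-kriging sketch (linear variogram, plain NumPy) illustrates how the geostatistical step described above produces both an optimal estimate of the potentiometric surface at an unsampled location and a measure of its uncertainty. The well locations, head values and variogram slope are invented for illustration.

```python
# Minimal ordinary-kriging sketch (linear variogram) in plain NumPy, to
# illustrate how a potentiometric-surface estimate and its uncertainty are
# obtained from scattered measurements. Data and variogram slope are invented.
import numpy as np

def gamma(h, slope=1.0, nugget=0.0):
    """Linear semivariogram model."""
    return nugget + slope * h

def ordinary_krige(xy, z, target, slope=1.0, nugget=0.0):
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d, slope, nugget)
    A[n, n] = 0.0                              # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - target, axis=1), slope, nugget)
    w = np.linalg.solve(A, b)
    estimate = w[:n] @ z
    variance = w @ b                           # kriging (estimation) variance
    return estimate, variance

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # well locations
z = np.array([10.0, 12.0, 11.0, 13.0])                           # heads (m)
print(ordinary_krige(xy, z, target=np.array([0.5, 0.5])))
```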

  2. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  3. Necessary steps in factor analysis: Enhancing validation studies of educational instruments. The PHEEM applied to clerks as an example

    NARCIS (Netherlands)

    Schonrock-Adema, Johanna; Heijne-Penninga, Marjolein; van Hell, Elisabeth A.; Cohen-Schotanus, Janke

    2009-01-01

    Background: The validation of educational instruments, in particular the employment of factor analysis, can be improved in many instances. Aims: To demonstrate the superiority of a sophisticated method of factor analysis, implying an integration of recommendations described in the factor analysis

  4. Demonstration of uncertainty quantification and sensitivity analysis for PWR fuel performance with BISON

    International Nuclear Information System (INIS)

    Zhang, Hongbin; Zhao, Haihua; Zou, Ling; Burns, Douglas; Ladd, Jacob

    2017-01-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis. (author)
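
    The three correlation-based sensitivity measures named above can be sketched on Monte Carlo samples of a toy model (not BISON); partial correlation is computed here in the usual way, by correlating the residuals left after regressing out the remaining inputs.

```python
# Sketch of the correlation-based sensitivity measures named above, computed on
# Monte Carlo samples of a toy model (not BISON). Partial correlation is taken
# between the residuals after regressing out the remaining inputs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 2000
X = rng.uniform(size=(n, 3))                                        # uncertain inputs
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=n)   # figure of merit

def partial_corr(x, y, others):
    """Correlation of x and y after removing a linear fit on the other inputs."""
    Z = np.column_stack([np.ones(len(x)), others])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return stats.pearsonr(rx, ry)[0]

for j in range(X.shape[1]):
    others = np.delete(X, j, axis=1)
    print(f"input {j}: Pearson={stats.pearsonr(X[:, j], y)[0]:+.2f}  "
          f"Spearman={stats.spearmanr(X[:, j], y)[0]:+.2f}  "
          f"Partial={partial_corr(X[:, j], y, others):+.2f}")
```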

  5. Demonstration of Uncertainty Quantification and Sensitivity Analysis for PWR Fuel Performance with BISON

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Hongbin; Ladd, Jacob; Zhao, Haihua; Zou, Ling; Burns, Douglas

    2015-11-01

    BISON is an advanced fuels performance code being developed at Idaho National Laboratory and is the code of choice for fuels performance by the U.S. Department of Energy (DOE)’s Consortium for Advanced Simulation of Light Water Reactors (CASL) Program. An approach to uncertainty quantification and sensitivity analysis with BISON was developed and a new toolkit was created. A PWR fuel rod model was developed and simulated by BISON, and uncertainty quantification and sensitivity analysis were performed with eighteen uncertain input parameters. The maximum fuel temperature and gap conductance were selected as the figures of merit (FOM). Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis.

  6. Analysis of toroidal vacuum vessels for use in demonstration sized tokamak reactors

    International Nuclear Information System (INIS)

    Culbert, M.E.

    1978-07-01

    The vacuum vessel component of the tokamak fusion reactor is the subject of this study. The main objective of this paper was to provide guidance for the structural design of a thin-wall, externally pressurized toroidal vacuum vessel. The analyses are based on the available state-of-the-art analytical methods; the shortcomings of these methods necessitated approximations and assumptions throughout the study. A principal result of the study has been the identification of a viable vacuum vessel design for the Demonstration Tokamak Hybrid Reactor (DTHR) and The Next Step (TNS) Reactor

  7. Phylogenomic analysis demonstrates a pattern of rare and ancient horizontal gene transfer between plants and fungi.

    Science.gov (United States)

    Richards, Thomas A; Soanes, Darren M; Foster, Peter G; Leonard, Guy; Thornton, Christopher R; Talbot, Nicholas J

    2009-07-01

    Horizontal gene transfer (HGT) describes the transmission of genetic material across species boundaries and is an important evolutionary phenomenon in the ancestry of many microbes. The role of HGT in plant evolutionary history is, however, largely unexplored. Here, we compare the genomes of six plant species with those of 159 prokaryotic and eukaryotic species and identify 1689 genes that show the highest similarity to corresponding genes from fungi. We constructed a phylogeny for all 1689 genes identified and for all homolog groups available from the rice (Oryza sativa) genome (3177 gene families) and used these to define 14 candidate plant-fungi HGT events. Comprehensive phylogenetic analyses of these 14 data sets, using methods that account for site rate heterogeneity, supported nine HGT events, demonstrating an infrequent pattern of HGT between plants and fungi. Five were fungi-to-plant transfers and four were plant-to-fungi transfers. None of the fungal-to-plant HGTs involved angiosperm recipients. These results alter the current view of organismal barriers to HGT, suggesting that phagotrophy, the consumption of one whole cell by another, is not necessarily a prerequisite for HGT between eukaryotes. Putative functional annotation of the HGT candidate genes suggests that two fungi-to-plant transfers have added phenotypes important for life in a soil environment. Our study suggests that genetic exchange between plants and fungi is exceedingly rare, particularly among the angiosperms, but has occurred during their evolutionary history and has added important metabolic traits to plant lineages.

  8. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  9. Risk factor analysis of equine strongyle resistance to anthelmintics

    Directory of Open Access Journals (Sweden)

    G. Sallé

    2017-12-01

    Full Text Available Intestinal strongyles are the most problematic endoparasites of equids as a result of their wide distribution and the spread of resistant isolates throughout the world. While abundant literature can be found on the extent of anthelmintic resistance across continents, empirical knowledge about associated risk factors is missing. This study brought together results from anthelmintic efficacy testing and risk factor analysis to provide evidence-based guidelines in the field. It involved 688 horses from 39 French horse farms and riding schools, both to estimate the Faecal Egg Count Reduction (FECR) after anthelmintic treatment and to interview farm and riding school managers about their practices. Risk factors associated with reduced anthelmintic efficacy in equine strongyles were estimated across drugs using a marginal modelling approach. Results demonstrated ivermectin efficacy (96.3% ± 14.5% FECR), the inefficacy of fenbendazole (42.8% ± 33.4% FECR) and an intermediate profile for pyrantel (90.3% ± 19.6% FECR). Risk factor analysis provided support to advocate for FEC-based treatment regimens combined with individual anthelmintic dosage and the enforcement of tighter biosecurity around horse introduction. The combination of these measures resulted in a decreased risk of drug resistance (relative risk of 0.57, p = 0.02). Premises falling under this typology also relied more on their veterinarians, suggesting practitioners play an important role in the sustainability of anthelmintic usage. Similarly, drug resistance risk was halved in premises with frequent pasture rotation and with a stocking rate below five horses/ha (relative risk of 0.53, p < 0.01). This is the first empirical risk factor analysis for anthelmintic resistance in equids. Our findings should guide the implementation of more sustainable strongyle management in the field. Keywords: Horse, Nematode, Anthelmintic resistance, Strongyle, Cyathostomin
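
    The FECR figures quoted above follow the standard group-mean reduction formula, FECR = 100 × (1 − mean(post)/mean(pre)); the study's marginal modelling of risk factors is not reproduced here. Counts and the 95% efficacy threshold in the sketch are illustrative only.

```python
# The FECR figures quoted above follow the standard group-mean reduction
# formula. Counts and the 95% efficacy threshold below are illustrative only.

def fecr(pre_counts, post_counts):
    """Faecal egg count reduction, % = 100 * (1 - mean(post) / mean(pre))."""
    pre_mean = sum(pre_counts) / len(pre_counts)
    post_mean = sum(post_counts) / len(post_counts)
    return 100.0 * (1.0 - post_mean / pre_mean)

pre = [850, 1200, 400, 950, 600]   # eggs per gram before treatment (hypothetical)
post = [10, 50, 0, 25, 15]         # eggs per gram 14 days after treatment

reduction = fecr(pre, post)
print(f"FECR = {reduction:.1f}%  ->",
      "efficacious" if reduction >= 95 else "reduced efficacy")
```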

  10. Core physics analysis in support of the FNR HEU-LEU demonstration experiment

    International Nuclear Information System (INIS)

    Losey, David C.; Brown, Forrest B.; Martin, William R.; Lee, John C.

    1983-01-01

    A core neutronics analysis has been undertaken to assess the impact of low-enrichment fuel on the performance and utilization of the FNR. As part of this analytic effort, a computer code system has been assembled which will be of general use in analyzing research reactors with MTR-type fuel. The code system has been extensively tested and verified in calculations for the present high-enrichment core. The analysis presented here compares the high- and low-enrichment fuels in batch and equilibrium core configurations which model the actual FNR operating conditions. The two fuels are compared for cycle length, fuel burnup, and flux and power distributions, as well as for the reactivity effects which are important in assessing the impact of LEU fuel on reactor shutdown margin. (author)

  11. Core physics analysis in support of the FNR HEU-LEU demonstration experiment

    Energy Technology Data Exchange (ETDEWEB)

    Losey, David C; Brown, Forrest B; Martin, William R; Lee, John C [Department of Nuclear Engineering, University of Michigan (United States)

    1983-08-01

    A core neutronics analysis has been undertaken to assess the impact of low-enrichment fuel on the performance and utilization of the FNR. As part of this analytic effort, a computer code system has been assembled which will be of general use in analyzing research reactors with MTR-type fuel. The code system has been extensively tested and verified in calculations for the present high-enrichment core. The analysis presented here compares the high- and low-enrichment fuels in batch and equilibrium core configurations which model the actual FNR operating conditions. The two fuels are compared for cycle length, fuel burnup, and flux and power distributions, as well as for the reactivity effects which are important in assessing the impact of LEU fuel on reactor shutdown margin. (author)

  12. The Ford Nuclear Reactor demonstration project for the evaluation and analysis of low enrichment fuel

    International Nuclear Information System (INIS)

    Kerr, W.; King, J.S.; Lee, J.C.; Martin, W.R.; Wehe, D.K.

    1991-07-01

    The whole-core LEU fuel demonstration project at the University of Michigan was begun in 1979 as part of the Reduced Enrichment Research and Test Reactor (RERTR) Program at Argonne National Laboratory. An LEU fuel design was selected which would produce minimum perturbations in the neutronic, operations, and safety characteristics of the 2-MW Ford Nuclear Reactor (FNR). Initial criticality with a full LEU core on December 8, 1981, was followed by low- and full-power testing of the fresh LEU core, transitional operation with mixed HEU-LEU configurations, and establishment of full LEU equilibrium core operation. The transition from the HEU to the LEU configurations was achieved with negligible impact on experimental utilization and safe operation of the reactor. 78 refs., 74 figs., 84 tabs

  13. Operating experience, measurements, and analysis of the LEU whole core demonstration at the FNR

    International Nuclear Information System (INIS)

    Weha, D.K.; Drumm, C.R.; King, J.S.; Martin, W.R.; Lee, J.C.

    1984-01-01

    The 2-MW Ford Nuclear Reactor at the University of Michigan is serving as the demonstration reactor for the MTR-type low enrichment (LEU) fuel for the Reduced Enrichment for Research and Test Reactor program. Operational experience gained through six months of LEU core operation and seven months of mixed HEU-LEU core operation is presented. Subcadmium flux measurements performed with rhodium self-powered neutron detectors and iron wire activations are compared with calculations. Measured reactivity parameters are compared for HEU and LEU cores. Finally, the benchmark calculations for several HEU, LEU, and mixed HEU-LEU FNR cores and the International Atomic Energy Agency (IAEA) benchmark problem are presented. (author)

  14. Analysis of occludin trafficking, demonstrating continuous endocytosis, degradation, recycling and biosynthetic secretory trafficking.

    Directory of Open Access Journals (Sweden)

    Sarah J Fletcher

    Full Text Available Tight junctions (TJs) link adjacent cells and are critical for maintenance of apical-basolateral polarity in epithelial monolayers. The TJ protein occludin functions in disparate processes, including wound healing and Hepatitis C Virus infection. Little is known about steady-state occludin trafficking into and out of the plasma membrane. Therefore, we determined the mechanisms responsible for occludin turnover in confluent Madin-Darby canine kidney (MDCK) epithelial monolayers. Using various biotin-based trafficking assays we observed continuous and rapid endocytosis of plasma-membrane-localised occludin (the majority internalised within 30 minutes). By 120 minutes a significant reduction in internalised occludin was observed. Inhibition of lysosomal function attenuated the reduction in occludin signal post-endocytosis and promoted co-localisation with the late endocytic system. Using a similar method we demonstrated that ∼20% of internalised occludin was transported back to the cell surface. Consistent with these findings, significant co-localisation between internalised occludin and recycling endosomal compartments was observed. We then quantified the extent to which occludin synthesis and transport to the plasma membrane contribute to plasma membrane occludin homeostasis, finding that inhibition of protein synthesis led to decreased plasma-membrane-localised occludin. Significant co-localisation between occludin and the biosynthetic secretory pathway was demonstrated. Thus, under steady-state conditions occludin undergoes turnover via a continuous cycle of endocytosis, recycling and degradation, with degradation compensated for by biosynthetic exocytic trafficking. We developed a mathematical model to describe the endocytosis, recycling and degradation of occludin, utilising experimental data to provide quantitative estimates for the rates of these processes.
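
    The study's fitted model is not reproduced here, but the turnover cycle it describes can be sketched as a generic two-pool compartment model (surface and endosomal occludin, with degraded protein leaving the system) with synthesis, endocytosis, recycling and degradation rate constants; the rates below are arbitrary assumptions.

```python
# Generic compartment sketch of the turnover cycle described above: a surface
# pool and an endosomal pool, with degraded protein leaving the system. Rate
# constants are arbitrary illustrations, not the fitted values from the study.
import numpy as np
from scipy.integrate import solve_ivp

k_endo, k_rec, k_deg, k_syn = 0.05, 0.02, 0.03, 1.0   # per-minute rates (assumed)

def turnover(t, y):
    surface, endosome = y
    d_surface = k_syn + k_rec * endosome - k_endo * surface     # synthesis + recycling - endocytosis
    d_endosome = k_endo * surface - (k_rec + k_deg) * endosome  # endocytosis - recycling - degradation
    return [d_surface, d_endosome]

sol = solve_ivp(turnover, t_span=(0, 300), y0=[100.0, 0.0],
                t_eval=np.linspace(0, 300, 7))
for t, s, e in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"t={t:5.0f} min  surface={s:6.1f}  endosomal={e:6.1f}")
```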

  15. Successful Completion of FY18/Q1 ASC L2 Milestone 6355: Electrical Analysis Calibration Workflow Capability Demonstration.

    Energy Technology Data Exchange (ETDEWEB)

    Copps, Kevin D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts' use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today's SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to the electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.

  16. ALOHA—Astronomical Light Optical Hybrid Analysis - From experimental demonstrations to a MIR instrument proposal

    Science.gov (United States)

    Lehmann, L.; Darré, P.; Szemendera, L.; Gomes, J. T.; Baudoin, R.; Ceus, D.; Brustlein, S.; Delage, L.; Grossard, L.; Reynaud, F.

    2018-04-01

    This paper gives an overview of the Astronomical Light Optical Hybrid Analysis (ALOHA) project dedicated to investigate a new method for high resolution imaging in mid infrared astronomy. This proposal aims to use a non-linear frequency conversion process to shift the thermal infrared radiation to a shorter wavelength domain compatible with proven technology such as guided optics and detectors. After a description of the principle, we summarise the evolution of our study from the high flux seminal experiments to the latest results in the photon counting regime.

  17. Interfacing Dielectric-Loaded Plasmonic and Silicon Photonic Waveguides: Theoretical Analysis and Experimental Demonstration

    DEFF Research Database (Denmark)

    Tsilipakos, O.; Pitilakis, A.; Yioultsis, T. V.

    2012-01-01

    A comprehensive theoretical analysis of end-fire coupling between dielectric-loaded surface plasmon polariton and rib/wire silicon-on-insulator (SOI) waveguides is presented. Simulations are based on the 3-D vector finite element method. The geometrical parameters of the interface are varied...... in order to identify the ones leading to optimum performance, i.e., maximum coupling efficiency. Fabrication tolerances about the optimum parameter values are also assessed. In addition, the effect of a longitudinal metallic stripe gap on coupling efficiency is quantified, since such gaps have been...

  18. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Bork, Lasse; Møller, Stig Vinther

    of the model stays high at longer horizons. The estimated factors are strongly statistically significant according to a bootstrap resampling method which takes into account that the factors are estimated regressors. The simple three-factor model also contains substantial out-of-sample predictive power...
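
    As a rough, hedged illustration of the general "few factors from many predictors" setup summarised above (not the paper's estimator, data set or bootstrap procedure), the sketch below extracts three principal-component factors from a simulated predictor panel and runs a predictive regression of next-quarter housing returns on them.

```python
# Generic sketch of a few-factor forecasting setup: PCA factors extracted from
# a standardized predictor panel, then a predictive regression of next-period
# housing returns on three factors. All data are simulated placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
T, N, K = 200, 30, 3
latent = rng.normal(size=(T, K))                    # common driving factors
loadings = rng.normal(size=(K, N))
panel = latent @ loadings + 0.5 * rng.normal(size=(T, N))   # predictor panel

# Next-quarter housing return driven by the current latent factors plus noise.
y = 0.6 * latent[:-1, 0] - 0.3 * latent[:-1, 1] + 0.3 * rng.normal(size=T - 1)

X = StandardScaler().fit_transform(panel[:-1])      # predictors observed at time t
factors = PCA(n_components=K).fit_transform(X)      # the "three-factor" summary
model = LinearRegression().fit(factors, y)          # predictive regression for t+1
print("in-sample R^2 of the three-factor model:", round(model.score(factors, y), 3))
```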

  19. Erikson Psychosocial Stage Inventory: A Factor Analysis

    Science.gov (United States)

    Gray, Mary McPhail; And Others

    1986-01-01

    The 72-item Erikson Psychosocial Stage Inventory (EPSI) was factor analyzed for a group of 534 university freshman and sophomore students. Seven factors emerged, which were labeled Initiative, Industry, Identity, Friendship, Dating, Goal Clarity, and Self-Confidence. Items representing Erikson's factors, Trust and Autonomy, were dispersed across…

  20. Analysis of operational records in the bituminization demonstration facility. Investigation of the cause of fire

    International Nuclear Information System (INIS)

    Shibata, A.; Sano, Y.; Yoneya, M.; Koyama, T.

    1997-12-01

    Operational records from the 97-M46-1 campaign in the bituminization demonstration facility were analyzed in order to investigate the cause of the fire. The operational records that differed from ordinary levels in this campaign were the drum weight, the temperature at the 7th zone and the extruder torque, so past campaign data for these records were investigated. The results are as follows. 1) In some campaigns, the drum weight was lighter, the temperature at the 7th zone higher, and the torque higher. 2) When the drum weight is lighter, the temperature at the 7th zone becomes relatively higher. 3) In cases where a higher temperature was measured at the 7th zone, the drum weight was sometimes less than the past average. 4) When the extruder torque increases, it sometimes influences the drum weight and the temperature at the 7th zone. The maximum temperature of the salt and bitumen became higher from campaign 28B onward. As the heat source, both frictional resistance and exothermic chemical reaction can be considered; frictional resistance appeared with the increase of the torque. Some operational parameters were therefore checked to investigate what increases the torque. The feed rate of waste solution is related to the torque increase, while the other parameters are not. Although no cause of the torque increase from campaign 27B can yet be specified, the feed rate of waste solution is possibly one of the causes. (author)

  1. Bayes factor design analysis: Planning for compelling evidence.

    Science.gov (United States)

    Schönbrodt, Felix D; Wagenmakers, Eric-Jan

    2018-02-01

    A sizeable literature exists on the use of frequentist power analysis in the null-hypothesis significance testing (NHST) paradigm to facilitate the design of informative experiments. In contrast, there is almost no literature that discusses the design of experiments when Bayes factors (BFs) are used as a measure of evidence. Here we explore Bayes Factor Design Analysis (BFDA) as a useful tool to design studies for maximum efficiency and informativeness. We elaborate on three possible BF designs: (a) a fixed-n design, (b) an open-ended Sequential Bayes Factor (SBF) design, where researchers can test after each participant and can stop data collection whenever there is strong evidence for either the alternative or the null hypothesis, and (c) a modified SBF design that defines a maximal sample size at which data collection is stopped regardless of the current state of evidence. We demonstrate how the properties of each design (i.e., expected strength of evidence, expected sample size, expected probability of misleading evidence, expected probability of weak evidence) can be evaluated using Monte Carlo simulations and equip researchers with the necessary information to compute their own Bayesian design analyses.
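
    A fixed-n Bayes Factor Design Analysis can be sketched with a simple Monte Carlo loop. For brevity the Bayes factor below uses the BIC approximation, BF10 = exp((BIC0 - BIC1)/2), rather than the default-prior t-test Bayes factor normally used in the BFDA literature; sample sizes, effect sizes and the evidence threshold are illustrative.

```python
# Monte Carlo sketch of a fixed-n Bayes Factor Design Analysis. For simplicity
# the Bayes factor is approximated from BIC (BF10 ~ exp((BIC0 - BIC1) / 2)),
# not the default-prior t-test Bayes factor used in the BFDA literature.
import numpy as np

rng = np.random.default_rng(0)

def bf10_two_sample(x, y):
    """BIC-approximated Bayes factor for 'two means differ' vs 'equal means'."""
    data = np.concatenate([x, y])
    n = data.size
    rss0 = np.sum((data - data.mean()) ** 2)                          # H0: one common mean
    rss1 = np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)  # H1: two means
    bic0 = n * np.log(rss0 / n) + 1 * np.log(n)
    bic1 = n * np.log(rss1 / n) + 2 * np.log(n)
    return np.exp((bic0 - bic1) / 2.0)

def fixed_n_design(n_per_group, effect_size, n_sim=5000, threshold=10.0):
    bfs = np.array([
        bf10_two_sample(rng.normal(0, 1, n_per_group),
                        rng.normal(effect_size, 1, n_per_group))
        for _ in range(n_sim)
    ])
    return {"P(BF10 > 10)": np.mean(bfs > threshold),
            "P(BF10 < 1/10)": np.mean(bfs < 1.0 / threshold),
            "median BF10": float(np.median(bfs))}

print(fixed_n_design(n_per_group=50, effect_size=0.5))   # under H1, d = 0.5
print(fixed_n_design(n_per_group=50, effect_size=0.0))   # under H0
```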

  2. Ergonomic analysis of a telemanipulation technique for a pyroprocess demonstration facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Park, Byung Suk; Kim, Ki Ho; Cho, IL Je

    2014-01-01

    In this study, remote handling strategies for a large-scale argon cell facility were considered. The suggested strategies were evaluated by several types of field test. The teleoperation tasks were performed using a developed remote handling system, which enabled traveling over entire cell area using a bridge transport system. Each arm of the system had six DOFs (degrees of freedom), and the bridge transport system had four DOFs. However, despite the dexterous manipulators and redundant monitoring system, many operators, including professionals, experienced difficulties in operating the remote handling system. This was because of the lack of a strategy for handling the installed camera system, and the difficulty in recognizing the gripper pose, which might fall outside the FOV (field of vision) of the system during teleoperation. Hence, in this paper, several considerations for the remote handling tasks performed in the target facility were discussed, and the tasks were analyzed based on ergonomic factors such as the workload. Toward the development of a successful operation strategy, several ergonomic issues, such as active/passive view of the remote handling system, eye/hand alignment, and FOV were considered. Furthermore, using the method for classifying remote handling tasks, several unit tasks were defined and evaluated.

  3. Ergonomic analysis of a telemanipulation technique for a pyroprocess demonstration facility

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Seung Nam; Lee, Jong Kwang; Park, Byung Suk; Kim, Ki Ho; Cho, IL Je [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-08-15

    In this study, remote handling strategies for a large-scale argon cell facility were considered. The suggested strategies were evaluated by several types of field test. The teleoperation tasks were performed using a developed remote handling system, which enabled traveling over entire cell area using a bridge transport system. Each arm of the system had six DOFs (degrees of freedom), and the bridge transport system had four DOFs. However, despite the dexterous manipulators and redundant monitoring system, many operators, including professionals, experienced difficulties in operating the remote handling system. This was because of the lack of a strategy for handling the installed camera system, and the difficulty in recognizing the gripper pose, which might fall outside the FOV (field of vision) of the system during teleoperation. Hence, in this paper, several considerations for the remote handling tasks performed in the target facility were discussed, and the tasks were analyzed based on ergonomic factors such as the workload. Toward the development of a successful operation strategy, several ergonomic issues, such as active/passive view of the remote handling system, eye/hand alignment, and FOV were considered. Furthermore, using the method for classifying remote handling tasks, several unit tasks were defined and evaluated.

  4. Preliminary hazard analysis for the Brayton Isotope Ground Demonstration System (including vacuum test chamber)

    International Nuclear Information System (INIS)

    Miller, L.G.

    1975-01-01

    The Preliminary Hazard Analysis (PHA) of the BIPS-GDS is a tabular summary of hazards and undesired events which may lead to system damage or failure and/or hazard to personnel. The PHA reviews the GDS as it is envisioned to operate in the Vacuum Test Chamber (VTC) of the GDS Test Facility. The VTC and other equipment which will comprise the test facility are presently in an early stage of preliminary design and will undoubtedly undergo numerous changes before the design is frozen. The PHA and the FMECA to follow are intended to aid the design effort by identifying areas of concern which are critical to the safety and reliability of the BIPS-GDS and test facility

  5. Functional improvement after carotid endarterectomy: demonstrated by gait analysis and acetazolamide stress brain perfusion SPECT

    International Nuclear Information System (INIS)

    Kim, J. S.; Kim, G. E.; Yoo, J. Y.; Kim, D. G.; Moon, D. H.

    2005-01-01

    Scientific documentation of neurologic improvement following carotid endarterectomy (CEA) has not been established. The purpose of this prospective study was to investigate whether CEA performed for internal carotid artery flow lesions improves gait and cerebrovascular hemodynamic status in patients with gait disturbance. We prospectively performed pre- and post-CEA gait analysis and acetazolamide stress brain perfusion SPECT (Acz-SPECT) with Tc-99m ECD in 91 patients (M/F: 81/10, mean age: 64.1 y) who had gait disturbance before receiving CEA. Gait performance was assessed using a Vicon 370 motion analyzer. Gait improvement after CEA was correlated with cerebrovascular hemodynamic change as well as symptom duration. Twelve hemiparetic stroke patients (M/F: 9/3, mean age: 51 y) who did not receive CEA served as a control group and underwent gait analysis twice at a one-week interval to evaluate whether repeat testing of gait performance shows a learning effect. Of the 91 patients, 73 (80%) showed gait improvement (change of gait speed > 10%) and 42 (46%) showed marked improvement (change of gait speed > 20%), whereas no improvement was observed in the control group on repeat testing. Post-operative cerebrovascular hemodynamic improvement was noted in 49 (54%) of the 91 patients. Gait improvement was markedly greater in the patients with cerebrovascular hemodynamic improvement than in those without such change (p<0.05). Marked gait improvement and cerebrovascular hemodynamic improvement were noted in 53% and 61%, respectively, of the patients with a symptom history of less than 3 months, compared to 31% and 24% of the patients with a longer history (p<0.05). Marked gait improvement was obtained in patients who had improvement of cerebrovascular hemodynamic status on Acz-SPECT after CEA. These results suggest that functional improvement such as gait can result from improved perfusion of the misery perfusion area, which remains viable for a longer period than previously reported in the literature

  6. A numerical analysis and experimental demonstration of a low degradation conductive bridge resistive memory device

    KAUST Repository

    Berco, Dan

    2017-10-23

    This study investigates a low degradation metal-ion conductive bridge RAM (CBRAM) structure. The structure is based on placing a diffusion blocking layer (DBL) between the device's top electrode (TE) and the resistive switching layer (RSL), unlike conventional CBRAMs, where the TE serves as a supply reservoir for metallic species diffusing into the RSL to form a conductive filament (CF) and is kept in direct contact with the RSL. The properties of a conventional CBRAM structure (Cu/HfO2/TiN), having a Cu TE, 10 nm HfO2 RSL, and a TiN bottom electrode, are compared with a 2 nm TaN DBL incorporating structure (Cu/TaN/HfO2/TiN) for 103 programming and erase simulation cycles. The low and high resistive state values for each cycle are calculated and the analysis reveals that adding the DBL yields lower degradation. In addition, the 2D distribution plots of oxygen vacancies, O ions, and Cu species within the RSL indicate that oxidation occurring in the DBL-RSL interface results in the formation of a sub-stoichiometric tantalum oxynitride with higher blocking capabilities that suppresses further Cu insertion beyond an initial CF formation phase, as well as CF lateral widening during cycling. The higher endurance of the structure with DBL may thus be attributed to the relatively low amount of Cu migrating into the RSL during the initial CF formation. Furthermore, this isomorphic CF displays similar cycling behavior to neural ionic channels. The results of numerical analysis show a good match to experimental measurements of similar device structures as well

  7. A numerical analysis and experimental demonstration of a low degradation conductive bridge resistive memory device

    Science.gov (United States)

    Berco, Dan; Chand, Umesh; Fariborzi, Hossein

    2017-10-01

    This study investigates a low degradation metal-ion conductive bridge RAM (CBRAM) structure. The structure is based on placing a diffusion blocking layer (DBL) between the device's top electrode (TE) and the resistive switching layer (RSL), unlike conventional CBRAMs, where the TE serves as a supply reservoir for metallic species diffusing into the RSL to form a conductive filament (CF) and is kept in direct contact with the RSL. The properties of a conventional CBRAM structure (Cu/HfO2/TiN), having a Cu TE, 10 nm HfO2 RSL, and a TiN bottom electrode, are compared with a 2 nm TaN DBL incorporating structure (Cu/TaN/HfO2/TiN) for 103 programming and erase simulation cycles. The low and high resistive state values for each cycle are calculated and the analysis reveals that adding the DBL yields lower degradation. In addition, the 2D distribution plots of oxygen vacancies, O ions, and Cu species within the RSL indicate that oxidation occurring in the DBL-RSL interface results in the formation of a sub-stoichiometric tantalum oxynitride with higher blocking capabilities that suppresses further Cu insertion beyond an initial CF formation phase, as well as CF lateral widening during cycling. The higher endurance of the structure with DBL may thus be attributed to the relatively low amount of Cu migrating into the RSL during the initial CF formation. Furthermore, this isomorphic CF displays similar cycling behavior to neural ionic channels. The results of numerical analysis show a good match to experimental measurements of similar device structures as well.

  8. A numerical analysis and experimental demonstration of a low degradation conductive bridge resistive memory device

    KAUST Repository

    Berco, Dan; Chand, Umesh; Fariborzi, Hossein

    2017-01-01

    This study investigates a low degradation metal-ion conductive bridge RAM (CBRAM) structure. The structure is based on placing a diffusion blocking layer (DBL) between the device's top electrode (TE) and the resistive switching layer (RSL), unlike conventional CBRAMs, where the TE serves as a supply reservoir for metallic species diffusing into the RSL to form a conductive filament (CF) and is kept in direct contact with the RSL. The properties of a conventional CBRAM structure (Cu/HfO2/TiN), having a Cu TE, 10 nm HfO2 RSL, and a TiN bottom electrode, are compared with a 2 nm TaN DBL incorporating structure (Cu/TaN/HfO2/TiN) for 103 programming and erase simulation cycles. The low and high resistive state values for each cycle are calculated and the analysis reveals that adding the DBL yields lower degradation. In addition, the 2D distribution plots of oxygen vacancies, O ions, and Cu species within the RSL indicate that oxidation occurring in the DBL-RSL interface results in the formation of a sub-stoichiometric tantalum oxynitride with higher blocking capabilities that suppresses further Cu insertion beyond an initial CF formation phase, as well as CF lateral widening during cycling. The higher endurance of the structure with DBL may thus be attributed to the relatively low amount of Cu migrating into the RSL during the initial CF formation. Furthermore, this isomorphic CF displays similar cycling behavior to neural ionic channels. The results of numerical analysis show a good match to experimental measurements of similar device structures as well

  9. Care Management to Promote Treatment Adherence in Patients with Cognitive Impairment and Vascular Risk Factors: A Demonstration Project.

    Science.gov (United States)

    Bonner, L M; Hanson, A; Robinson, G; Lowy, E; Craft, S

    2018-01-01

    Dementia prevention is highly important. Improved control of vascular risk factors has the potential to decrease dementia risk, but may be difficult. Therefore, we developed and piloted a care management protocol for Veterans at risk for dementia. We enrolled 32 Veterans with diabetes and hypertension, at least one of which was poorly controlled, and cognitive impairment. Participants were randomly assigned to a 6-month care management intervention or to usual care. At enrollment, 6 months and 12 months, we assessed cognitive performance, mood, and diabetes and hypertension control. At follow-up, diastolic blood pressure was lower in intervention participants at 6 months (p=.041) and 12 months (p=.022); hemoglobin A1c, global mental status and mood did not differ between groups. Recall of a distractor list (p=.006) and logical memory long-delay recall (p=.036) were better at 6 months in the intervention group. Care management may contribute to improved control of dementia risk factors.

  10. The Recreational Fee Demonstration Program on the national forests: an updated analysis of public attitudes and beliefs, 1996-2001.

    Science.gov (United States)

    David N. Bengston; David P. Fan

    2002-01-01

    Analyzes trends in favorable and unfavorable attitudes toward the Recreational Fee Demonstration Program (RFDP) in the national forests, updating an earlier study using computer content analysis of the public debate. About 65 percent of the attitudes toward the RFDP were favorable, comparable to the findings of survey research.

  11. Human factors review for Severe Accident Sequence Analysis (SASA)

    International Nuclear Information System (INIS)

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper will discuss work being conducted during this human factors review including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate contributions to SASA analyses from human factors data and methods. A descriptive model was developed called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and which is useful for identifying needs/deficiencies in the area of accident management. The assessment of human factors issues related to ATWS required extensive coordination with SASA analysts. The analysis was consolidated primarily to six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitates a knowledge-based emergency response by the operators. The FOAM model provides a functionally-oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid/minimize core degradation. Operators must also respond to potential radiological release beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure

  12. Body electrical loss analysis (BELA) in the assessment of visceral fat: a demonstration

    Directory of Open Access Journals (Sweden)

    Blomqvist Kim H

    2011-11-01

    Full Text Available Abstract Background Body electrical loss analysis (BELA) is a new non-invasive way to assess visceral fat depot size through the use of electromagnetism. BELA has worked well in phantom measurements, but the technology is not yet fully validated. Methods Ten volunteers (5 men and 5 women, age: 22-60 y, BMI: 21-30 kg/m2, waist circumference: 73-108 cm) were measured with the BELA instrument and with cross-sectional magnetic resonance imaging (MRI) at the navel level, navel +5 cm and navel -5 cm. The BELA signal was compared with visceral and subcutaneous fat areas calculated from the MR images. Results The BELA signal did not correlate with subcutaneous fat area at any level, but correlated significantly with visceral fat area at the navel level and navel +5 cm. The correlation was best at the level of navel +5 cm (R2 = 0.74, LOOCV = 40.1 cm2), where LOOCV is the root mean squared error of leave-one-out style cross-validation and SEE denotes the standard error of the estimate. The average estimate of repeatability of the BELA signal observed through the study was ±9.6%. One of the volunteers had an exceptionally large amount of visceral fat, which was underestimated by BELA. Conclusions The correlation of the BELA signal with the visceral but not with the subcutaneous fat area as measured by MRI is promising. The lack of correlation with the subcutaneous fat suggests that subcutaneous fat has a minor influence on the BELA signal. Further research will show whether it is possible to develop a reliable low-cost method for the assessment of visceral fat, either using BELA only or combining it, for example, with bioelectrical impedance measurement. The combination of these measurements may help in assessing visceral fat over a wide range of body compositions. Before large-scale clinical testing and ROC analysis, the initial BELA instrumentation requires improvements, as the accuracy of the present equipment is not sufficient for such a new technology.
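
    The two error measures quoted above (SEE and the leave-one-out RMSE) can be reproduced for a simple linear calibration of visceral fat area against a sensor signal using scikit-learn; the ten data points below are simulated, not the study's measurements.

```python
# Sketch of the two error measures quoted above (SEE and leave-one-out RMSE)
# for a simple linear calibration of visceral fat area against a sensor signal.
# The ten data points are simulated, not the study's measurements.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(7)
signal = rng.uniform(1.0, 5.0, size=10).reshape(-1, 1)        # BELA-like signal
vat = 40.0 * signal.ravel() + rng.normal(0.0, 20.0, size=10)  # visceral fat area, cm^2

model = LinearRegression().fit(signal, vat)
resid = vat - model.predict(signal)
see = np.sqrt(np.sum(resid ** 2) / (len(vat) - 2))            # standard error of estimate

loo_pred = cross_val_predict(LinearRegression(), signal, vat, cv=LeaveOneOut())
loocv_rmse = np.sqrt(np.mean((vat - loo_pred) ** 2))

print(f"SEE = {see:.1f} cm^2, LOOCV RMSE = {loocv_rmse:.1f} cm^2")
```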

  13. Analysis of Monolith Cores from an Engineering Scale Demonstration of a Prospective Cast Stone Process

    International Nuclear Information System (INIS)

    Crawford, C. L.; Cozzi, A. D.; Hill, K. A.

    2016-01-01

    The primary disposition path of Low Activity Waste (LAW) at the DOE Hanford Site is vitrification. A cementitious waste form is one of the alternatives being considered for the supplemental immobilization of the LAW that will not be treated by the primary vitrification facility. Washington River Protection Solutions (WRPS) has been directed to generate and collect data on cementitious or pozzolanic waste forms such as Cast Stone. This report documents the coring and leach testing of monolithic samples cored from an engineering-scale demonstration (ES Demo) with non-radioactive simulants. The ES Demo was performed at SRNL in October of 2013 using the Scaled Continuous Processing Facility (SCPF) to fill an 8.5 ft. diameter x 3.25 ft. high container with simulated Cast Stone grout. The Cast Stone formulation was chosen from the previous screening tests. Legacy salt solution from previous Hanford salt waste testing was adjusted to correspond to the average LAW composition generated from the Hanford Tank Waste Operation Simulator (HTWOS). The dry blend materials, ordinary portland cement (OPC), Class F fly ash, and ground granulated blast furnace slag (GGBFS or BFS), were obtained from Lafarge North America in Pasco, WA. In 2014, core samples originally obtained approximately six months after filling the ES Demo were tested along with bench scale molded samples that were collected during the original pour. A later set of core samples was obtained in late March of 2015, eighteen months after completion of the original ES Demo. Core samples were obtained using a 2'' diameter x 11'' long coring bit. The ES Demo was sampled in three different regions consisting of an outer ring, a middle ring and an inner core zone. Cores from these three lateral zones were further segregated into upper, middle and lower vertical segments. Monolithic core samples were tested using the Environmental Protection Agency (EPA) Method 1315, which is designed to provide mass

  14. Analysis of Canis mitochondrial DNA demonstrates high concordance between the control region and ATPase genes

    Directory of Open Access Journals (Sweden)

    White Bradley N

    2010-07-01

    Full Text Available Abstract Background Phylogenetic studies of wild Canis species have relied heavily on the mitochondrial DNA control region (mtDNA CR) to infer species relationships and evolutionary lineages. Previous analyses of the CR provided evidence for a North American evolved eastern wolf (C. lycaon) that is more closely related to red wolves (C. rufus) and coyotes (C. latrans) than to grey wolves (C. lupus). Eastern wolf origins, however, continue to be questioned. Therefore, we analyzed mtDNA from 89 wolves and coyotes across North America and Eurasia at 347 base pairs (bp) of the CR and 1067 bp that included the ATPase6 and ATPase8 genes. Phylogenies and divergence estimates were used to clarify the evolutionary history of eastern wolves, and regional comparisons of nonsynonymous to synonymous substitutions (dN/dS) at the ATPase6 and ATPase8 genes were used to elucidate the potential role of selection in shaping mtDNA geographic distribution. Results We found high concordance across analyses between the mtDNA regions studied. Both had a high percentage of variable sites (CR = 14.6%; ATP = 9.7%) and both phylogenies clustered eastern wolf haplotypes monophyletically within a North American evolved lineage apart from coyotes. Divergence estimates suggest the putative red wolf sequence is more closely related to coyotes (DxyCR = 0.01982 ± 0.00494 SD; DxyATP = 0.00332 ± 0.00097 SD) than to the eastern wolf sequences (DxyCR = 0.03047 ± 0.00664 SD; DxyATP = 0.00931 ± 0.00205 SD). Neutrality tests on both genes were indicative of the population expansion of coyotes across eastern North America, and dN/dS ratios suggest a possible role for purifying selection in the evolution of North American lineages. dN/dS ratios were higher in European evolved lineages from northern climates compared to North American evolved lineages from temperate regions, but these differences were not statistically significant. Conclusions These results demonstrate high concordance between coding

  15. Phenotypic factor analysis of psychopathology reveals a new body-related transdiagnostic factor.

    Science.gov (United States)

    Pezzoli, Patrizia; Antfolk, Jan; Santtila, Pekka

    2017-01-01

    Comorbidity challenges the notion of mental disorders as discrete categories. An increasing body of literature shows that symptoms cut across traditional diagnostic boundaries and interact in shaping the latent structure of psychopathology. Using exploratory and confirmatory factor analysis, we reveal the latent sources of covariation among nine measures of psychopathological functioning in a population-based sample of 13024 Finnish twins and their siblings. By implementing unidimensional, multidimensional, second-order, and bifactor models, we illustrate the relationships between observed variables, specific, and general latent factors. We also provide the first investigation to date of measurement invariance of the bifactor model of psychopathology across gender and age groups. Our main result is the identification of a distinct "Body" factor, alongside the previously identified Internalizing and Externalizing factors. We also report relevant cross-disorder associations, especially between body-related psychopathology and trait anger, as well as substantial sex and age differences in observed and latent means. The findings expand the meta-structure of psychopathology, with implications for empirical and clinical practice, and demonstrate shared mechanisms underlying attitudes towards nutrition, self-image, sexuality and anger, with gender- and age-specific features.
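
    A sketch of the exploratory step behind this kind of analysis, extracting a small number of correlated latent factors from several symptom scales; it assumes the third-party factor_analyzer package and uses random stand-in data, not the twin sample above. A confirmatory bifactor model as used in the study would additionally require structural equation modelling software.

        # Sketch: exploratory factor analysis of nine symptom scales with an oblique
        # rotation, so that extracted factors (e.g. Internalizing, Externalizing,
        # "Body") are allowed to correlate. Random stand-in data; assumes the
        # third-party `factor_analyzer` package.
        import numpy as np
        import pandas as pd
        from factor_analyzer import FactorAnalyzer

        rng = np.random.default_rng(1)
        scales = [f"scale_{i}" for i in range(1, 10)]          # nine hypothetical measures
        data = pd.DataFrame(rng.normal(size=(500, 9)), columns=scales)

        efa = FactorAnalyzer(n_factors=3, rotation="oblimin")  # oblique rotation
        efa.fit(data)

        loadings = pd.DataFrame(efa.loadings_, index=scales, columns=["F1", "F2", "F3"])
        print(loadings.round(2))                               # pattern matrix
        print(efa.get_factor_variance()[2].round(2))           # cumulative variance explained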

  16. Nominal Performance Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    Wasiolek, M.

    2000-01-01

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of the different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.
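
    A toy sketch of the core relationship described above, in which a BDCF converts a groundwater activity concentration into an annual dose and input uncertainty is propagated by sampling; all values and distributions are hypothetical placeholders, not those of the report or of GENII-S.

        # Sketch: dose = groundwater concentration x BDCF, with input uncertainty
        # propagated by random sampling. All values/distributions are hypothetical
        # placeholders, not the report's radionuclide-specific results.
        import numpy as np

        rng = np.random.default_rng(42)
        n_realizations = 10_000

        bdcf = rng.lognormal(mean=np.log(1e-9), sigma=0.5, size=n_realizations)   # Sv/yr per Bq/m^3 (hypothetical)
        conc = rng.lognormal(mean=np.log(2.0e3), sigma=0.3, size=n_realizations)  # Bq/m^3 (hypothetical)

        annual_dose = conc * bdcf            # Sv/yr in each realization

        print(f"mean dose   : {annual_dose.mean():.2e} Sv/yr")
        print(f"95th pctile : {np.percentile(annual_dose, 95):.2e} Sv/yr")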

  17. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of the different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, and so is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  18. Scenarios for Benefits Analysis of Energy Research, Development, Demonstration and Deployment

    Energy Technology Data Exchange (ETDEWEB)

    Gumerman, Etan; Marnay, Chris

    2005-09-07

    For at least the last decade, evaluation of the benefits of research, development, demonstration, and deployment (RD3) by the U.S. Department of Energy has been conducted using deterministic forecasts that unrealistically presume we can precisely foresee our future 10, 25, or even 50 years hence. This effort tries, in a modest way, to begin a process of recognition that the reality of our energy future is rather one rife with uncertainty. The National Energy Modeling System (NEMS) is used by the Department of Energy's Office of Energy Efficiency and Renewable Energy (EE) and Fossil Energy (FE) for their RD3 benefits evaluation. In order to begin scoping out the uncertainty in these deterministic forecasts, EE and FE designed two futures that differ significantly from the basic NEMS forecast. A High Fuel Price Scenario and a Carbon Cap Scenario were envisioned to forecast alternative futures and the associated benefits. Ernest Orlando Lawrence Berkeley National Laboratory (LBNL) implemented these scenarios into its version of NEMS, NEMS-LBNL, in late 2004, and the Energy Information Administration created six scenarios for FE in early 2005. The creation and implementation of the EE-FE scenarios are explained in this report. Both a Carbon Cap Scenario and a High Fuel Price Scenario were implemented in NEMS-LBNL. EIA subsequently modeled similar scenarios using NEMS. While the EIA and LBNL implementations were in some ways rather different, their forecasts do not significantly diverge. Compared to the Reference Scenario, the High Fuel Price Scenario reduces energy consumption by 4 percent in 2025, while in the EIA fuel price scenario (known as Scenario 4) the reduction from its corresponding reference scenario (known as Scenario 0) in 2025 is marginal. Nonetheless, the 4 percent demand reduction does not lead to other cascading effects that would significantly differentiate the two scenarios. The LBNL and EIA carbon scenarios were mostly identical. The only major

  19. 'Omics analysis of low dose acetaminophen intake demonstrates novel response pathways in humans

    Energy Technology Data Exchange (ETDEWEB)

    Jetten, Marlon J.A.; Gaj, Stan [Department of Toxicogenomics, Maastricht University, Universitiessingel 50 6229 ER Maastricht (Netherlands); Ruiz-Aracama, Ainhoa [RIKILT, Institute of Food Safety, Wageningen UR, PO Box 230, 6700 AE, Wageningen (Netherlands); Kok, Theo M. de [Department of Toxicogenomics, Maastricht University, Universitiessingel 50 6229 ER Maastricht (Netherlands); Delft, Joost H.M. van, E-mail: j.vandelft@maastrichtuniversity.nl [Department of Toxicogenomics, Maastricht University, Universitiessingel 50 6229 ER Maastricht (Netherlands); Lommen, Arjen [RIKILT, Institute of Food Safety, Wageningen UR, PO Box 230, 6700 AE, Wageningen (Netherlands); Someren, Eugene P. van [Research Group Microbiology and Systems Biology, TNO, PO Box 360 3700 AJ Zeist (Netherlands); Jennen, Danyel G.J.; Claessen, Sandra M. [Department of Toxicogenomics, Maastricht University, Universitiessingel 50 6229 ER Maastricht (Netherlands); Peijnenburg, Ad A.C.M. [RIKILT, Institute of Food Safety, Wageningen UR, PO Box 230, 6700 AE, Wageningen (Netherlands); Stierum, Rob H. [Research Group Microbiology and Systems Biology, TNO, PO Box 360 3700 AJ Zeist (Netherlands); Kleinjans, Jos C.S. [Department of Toxicogenomics, Maastricht University, Universitiessingel 50 6229 ER Maastricht (Netherlands)

    2012-03-15

    Acetaminophen is the primary cause of acute liver toxicity in Europe/USA, which led the FDA to reconsider recommendations concerning safe acetaminophen dosage/use. Unfortunately, the current tests for liver toxicity are not ideal predictive markers for liver injury, i.e. they only measure acetaminophen exposure after profound liver toxicity has already occurred. Furthermore, these tests do not provide mechanistic information. Here, 'omics techniques (global analysis of metabolomic/gene-expression responses) may provide additional insight. To better understand acetaminophen-induced responses at low doses, we evaluated the effects of (sub-)therapeutic acetaminophen doses on metabolite formation and global gene-expression changes (including, for the first time, full-genome human miRNA expression changes) in blood/urine samples from healthy human volunteers. Many known and several new acetaminophen metabolites were detected, in particular in relation to hepatotoxicity-linked, oxidative metabolism of acetaminophen. Transcriptomic changes indicated immune-modulating effects (2 g dose) and oxidative stress responses (4 g dose). For the first time, effects of acetaminophen on full-genome human miRNA expression have been considered and confirmed the findings at the mRNA level. 'Omics techniques outperformed clinical chemistry tests and revealed novel response pathways to acetaminophen in humans. Although no definitive conclusion about potential immunotoxic effects of acetaminophen can be drawn from this study, there are clear indications that the immune system is triggered even after intake of low doses of acetaminophen. Also, oxidative stress-related gene responses, similar to those seen after high dose acetaminophen exposure, suggest the occurrence of possible pre-toxic effects of therapeutic acetaminophen doses. Possibly, these effects are related to dose-dependent increases in levels of hepatotoxicity-related metabolites. -- Highlights: ► 'Omics techniques

  20. 'Omics analysis of low dose acetaminophen intake demonstrates novel response pathways in humans

    International Nuclear Information System (INIS)

    Jetten, Marlon J.A.; Gaj, Stan; Ruiz-Aracama, Ainhoa; Kok, Theo M. de; Delft, Joost H.M. van; Lommen, Arjen; Someren, Eugene P. van; Jennen, Danyel G.J.; Claessen, Sandra M.; Peijnenburg, Ad A.C.M.; Stierum, Rob H.; Kleinjans, Jos C.S.

    2012-01-01

    Acetaminophen is the primary cause of acute liver toxicity in Europe/USA, which led the FDA to reconsider recommendations concerning safe acetaminophen dosage/use. Unfortunately, the current tests for liver toxicity are not ideal predictive markers for liver injury, i.e. they only measure acetaminophen exposure after profound liver toxicity has already occurred. Furthermore, these tests do not provide mechanistic information. Here, 'omics techniques (global analysis of metabolomic/gene-expression responses) may provide additional insight. To better understand acetaminophen-induced responses at low doses, we evaluated the effects of (sub-)therapeutic acetaminophen doses on metabolite formation and global gene-expression changes (including, for the first time, full-genome human miRNA expression changes) in blood/urine samples from healthy human volunteers. Many known and several new acetaminophen metabolites were detected, in particular in relation to hepatotoxicity-linked, oxidative metabolism of acetaminophen. Transcriptomic changes indicated immune-modulating effects (2 g dose) and oxidative stress responses (4 g dose). For the first time, effects of acetaminophen on full-genome human miRNA expression have been considered and confirmed the findings at the mRNA level. 'Omics techniques outperformed clinical chemistry tests and revealed novel response pathways to acetaminophen in humans. Although no definitive conclusion about potential immunotoxic effects of acetaminophen can be drawn from this study, there are clear indications that the immune system is triggered even after intake of low doses of acetaminophen. Also, oxidative stress-related gene responses, similar to those seen after high dose acetaminophen exposure, suggest the occurrence of possible pre-toxic effects of therapeutic acetaminophen doses. Possibly, these effects are related to dose-dependent increases in levels of hepatotoxicity-related metabolites. -- Highlights: ► 'Omics techniques outperformed

  1. A Bayesian Nonparametric Approach to Factor Analysis

    DEFF Research Database (Denmark)

    Piatek, Rémi; Papaspiliopoulos, Omiros

    2018-01-01

    This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does no...

  2. Classification analysis of organization factors related to system safety

    International Nuclear Information System (INIS)

    Liu Huizhen; Zhang Li; Zhang Yuling; Guan Shihua

    2009-01-01

    This paper analyzes the different types of organization factors which influence system safety. Organization factors can be divided into interior and exterior organization factors. The latter include political, economic, technical, legal, socio-cultural and geographical factors, as well as the relationships among different interest groups. The former include organization culture, communication, decision making, training, process, supervision and management, and organization structure. This paper focuses on the description of the organization factors. The classification analysis of the organization factors is preliminary work for a quantitative analysis. (authors)

  3. Multicistronic lentiviral vectors containing the FMDV 2A cleavage factor demonstrate robust expression of encoded genes at limiting MOI

    Directory of Open Access Journals (Sweden)

    Margison Geoffrey P

    2006-03-01

    Full Text Available Abstract Background A number of gene therapy applications would benefit from vectors capable of expressing multiple genes. In this study we explored the feasibility and efficiency of expressing two or three transgenes in an HIV-1 based lentiviral vector. Bicistronic and tricistronic self-inactivating lentiviral vectors were constructed employing the internal ribosomal entry site (IRES) sequence of encephalomyocarditis virus (EMCV) and/or the foot-and-mouth disease virus (FMDV) cleavage factor 2A. We employed enhanced green fluorescent protein (eGFP), O6-methylguanine-DNA-methyltransferase (MGMT), and the homeobox transcription factor HOXB4 as model genes, and their expression was detected by appropriate methods including fluorescence microscopy, flow cytometry, immunocytochemistry, biochemical assay, and western blotting. Results All the multigene vectors produced high titer virus and were able to simultaneously express two or three transgenes in transduced cells. However, the level of expression of individual transgenes varied depending on: the transgene itself; its position within the construct; the total number of transgenes expressed; the strategy used for multigene expression; and the average copy number of pro-viral insertions. Notably, at limiting MOI, the expression of eGFP in a bicistronic vector based on 2A was ~4 times greater than that of an IRES-based vector. Conclusion The small and efficient 2A sequence can be used alone or in combination with an IRES for the construction of multicistronic lentiviral vectors which can express encoded transgenes at functionally relevant levels in cells containing an average of one pro-viral insert.

  4. Using BMDP and SPSS for a Q factor analysis.

    Science.gov (United States)

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.
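
    A brief sketch of the idea behind a Q-type analysis with Euclidean distances, in which persons rather than variables are factored; the data are synthetic, and this is not the BMDP/SPSS command streams the article supplies.

        # Sketch: a Q-type analysis factors persons rather than variables. One route
        # is a person-by-person Euclidean distance matrix; another is to extract
        # components from the transposed person-by-item matrix. Synthetic data.
        import numpy as np
        from scipy.spatial.distance import pdist, squareform
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(7)
        scores = rng.normal(size=(30, 12))                 # 30 persons x 12 items

        person_dist = squareform(pdist(scores, metric="euclidean"))   # 30 x 30 distances
        print("person-to-person distance matrix:", person_dist.shape)

        # "Q factoring": treat items as observations and persons as variables.
        q_pca = PCA(n_components=3).fit(scores.T)
        person_loadings = q_pca.components_.T              # 30 persons x 3 person-factors
        print(person_loadings[:5].round(2))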

  5. [Text mining, a method for computer-assisted analysis of scientific texts, demonstrated by an analysis of author networks].

    Science.gov (United States)

    Hahn, P; Dullweber, F; Unglaub, F; Spies, C K

    2014-06-01

    Searching for relevant publications is becoming more difficult with the increasing number of scientific articles. Text mining, as a specific form of computer-based data analysis, may be helpful in this context. Highlighting relations between authors and finding relevant publications concerning a specific subject using text analysis programs are illustrated graphically by two worked examples. © Georg Thieme Verlag KG Stuttgart · New York.
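
    A small sketch of the kind of author-network analysis described above, assuming the third-party networkx package; the publication list is invented for illustration.

        # Sketch: a co-authorship network built from a (here invented) publication
        # list; assumes the third-party `networkx` package.
        import itertools
        import networkx as nx

        publications = [
            ["Author A", "Author B"],
            ["Author A", "Author C", "Author B"],
            ["Author D", "Author A"],
        ]

        g = nx.Graph()
        for authors in publications:
            for a, b in itertools.combinations(authors, 2):        # each co-occurring pair
                weight = g.get_edge_data(a, b, default={"weight": 0})["weight"]
                g.add_edge(a, b, weight=weight + 1)

        # Degree centrality highlights the most connected authors in the network.
        print(sorted(nx.degree_centrality(g).items(), key=lambda kv: -kv[1]))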

  6. Gene expression of fibroblast growth factors in human gliomas and meningiomas: Demonstration of cellular source of basic fibroblast growth factor mRNA and peptide in tumor tissues

    International Nuclear Information System (INIS)

    Takahashi, J.A.; Mori, Hirotaka; Fukumoto, Manabu; Oda, Yoshifumi; Kikuchi, Haruhiko; Hatanaka, Masakazu; Igarashi, Koichi; Jaye, M.

    1990-01-01

    The growth autonomy of human tumor cells is considered to be due to the endogenous production of growth factors. Transcriptional expression of candidates for autocrine stimulatory factors such as basic fibroblast growth factor (FGF), acidic FGF, and transforming growth factor type β was determined in human brain tumors. Basic FGF was expressed abundantly in 17 of 18 gliomas, 20 of 22 meningiomas, and 0 of 5 metastatic brain tumors. The level of mRNA expression of acidic FGF in gliomas was significant. In contrast, transforming growth factor type β1 was expressed in all the samples investigated. The mRNA for basic FGF and its peptide were localized in tumor cells in vivo by in situ hybridization and immunohistochemistry, showing that basic FGF is actually produced in tumor cells. The results suggest that tumor-derived basic FGF is involved in the progression of gliomas and meningiomas in vivo, whereas acidic FGF is expressed in a tumor origin-specific manner, suggesting that acidic FGF works in tandem with basic FGF in glioma tumorigenesis

  7. EXPLORATORY FACTOR ANALYSIS (EFA) IN CONSUMER BEHAVIOR AND MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    Marcos Pascual Soler

    2012-06-01

    Full Text Available Exploratory Factor Analysis (EFA) is one of the most widely used statistical procedures in social research. The main objective of this work is to describe the most common practices used by researchers in the consumer behavior and marketing area. Through a literature review methodology, the practices of EFA in five consumer behavior and marketing journals (2000-2010) were analyzed. Then, the choices made by the researchers concerning factor model, retention criteria, rotation, factor interpretation and other issues relevant to factor analysis were analyzed. The results suggest that researchers routinely conduct analyses using questionable methods. Suggestions for improving the use of factor analysis and the reporting of results are presented, and a checklist (Exploratory Factor Analysis Checklist, EFAC) is provided to help editors, reviewers, and authors improve the reporting of exploratory factor analysis.
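
    One of the practices such methodological reviews typically recommend for factor retention is Horn's parallel analysis; a compact sketch with synthetic data and plain NumPy is given below as an illustration, not as the checklist proposed in the article.

        # Sketch: Horn's parallel analysis -- retain only factors whose observed
        # eigenvalue exceeds the corresponding eigenvalue obtained from random data
        # of the same shape. Synthetic data, plain NumPy.
        import numpy as np

        rng = np.random.default_rng(3)
        n_obs, n_vars, n_sims = 400, 10, 200
        data = rng.normal(size=(n_obs, n_vars))
        data[:, :3] += rng.normal(size=(n_obs, 1))     # build in one common factor

        obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

        rand_eig = np.empty((n_sims, n_vars))
        for i in range(n_sims):
            r = rng.normal(size=(n_obs, n_vars))
            rand_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]

        threshold = np.percentile(rand_eig, 95, axis=0)  # 95th percentile of random eigenvalues
        print("factors to retain:", int(np.sum(obs_eig > threshold)))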

  8. Factor analysis of serogroups botanica and aurisina of Leptospira biflexa.

    Science.gov (United States)

    Cinco, M

    1977-11-01

    Factor analysis was performed on serovars of the Botanica and Aurisina serogroups of Leptospira biflexa. The results show the arrangement of the main serovar- and serogroup-specific factors, as well as the antigens shared with serovars of heterologous serogroups.

  9. Human factors analysis of incident/accident report

    International Nuclear Information System (INIS)

    Kuroda, Isao

    1992-01-01

    Human factors analysis of accidents and incidents involves difficulties of not only a technical but also a psychosocial nature. This report introduces some experiments with the 'variation diagram method', which can be extended to operational and managerial factors. (author)

  10. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  11. Analysis of success factors in advertising

    OpenAIRE

    Fedorchak, Oleksiy; Kedebecz, Kristina

    2017-01-01

    The essence of the success factors of advertising campaigns is investigated. The stages of conducting advertising campaigns and the stages of evaluating their effectiveness are determined. The goals and objectives of advertising campaigns are also defined.

  12. Holographic analysis of diffraction structure factors

    International Nuclear Information System (INIS)

    Marchesini, S.; Bucher, J.J.; Shuh, D.K.; Fabris, L.; Press, M.J.; West, M.W.; Hussain, Z.; Mannella, N.; Fadley, C.S.; Van Hove, M.A.; Stolte, W.C.

    2002-01-01

    We combine the theory of inside-source/inside-detector x-ray fluorescence holography and Kossel lines/x-ray standing waves in the kinematic approximation to directly obtain the phases of the diffraction structure factors. The influence of Kossel lines and standing waves on holography is also discussed. We obtain a partial phase determination from experimental data, obtaining the sign of the real part of the structure factor for several reciprocal lattice vectors of a vanadium crystal.

  13. Analysis of transfer reactions: determination of spectroscopic factors

    Energy Technology Data Exchange (ETDEWEB)

    Keeley, N. [CEA Saclay, Dept. d'Astrophysique, de Physique des Particules, de Physique Nucleaire et de l'Instrumentation Associee (DSM/DAPNIA/SPhN), 91 Gif-sur-Yvette (France); The Andrzej Sołtan Institute for Nuclear Studies, Dept. of Nuclear Reactions, Warsaw (Poland)

    2007-07-01

    An overview of the most popular models used for the analysis of direct reaction data is given, concentrating on practical aspects. The following four models are briefly described, in order of increasing sophistication: the distorted wave Born approximation (DWBA), the adiabatic model, the coupled channels Born approximation, and the coupled reaction channels. As a concrete example, the 12C(d,p)13C reaction at an incident deuteron energy of 30 MeV is analysed with progressively more physically sophisticated models. The effect of the choice of the reaction model on the spectroscopic information extracted from the data is investigated, and other sources of uncertainty in the derived spectroscopic factors are discussed. We have shown that the choice of the reaction model can significantly influence the nuclear structure information, particularly the spectroscopic factors or amplitudes but occasionally also the spin-parity, that we wish to extract from direct reaction data. We have also demonstrated that the DWBA can fail to give a satisfactory description of transfer data, but when the tenets of the theory are fulfilled the DWBA can work very well and will yield the same results as the most sophisticated models. The use of global rather than fitted optical potentials can also lead to important differences in the extracted spectroscopic factors.

  14. Identification of noise in linear data sets by factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, Ph.K.

    1982-01-01

    A technique which has the ability to identify bad data points after the data have been generated is classical factor analysis. The ability of classical factor analysis to identify two different types of data errors makes it ideally suited for scanning large data sets. Since the results yielded by factor analysis indicate correlations between parameters, one must know something about the nature of the data set and the analytical techniques used to obtain it in order to confidently isolate errors. (author)
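
    A simplified stand-in for the screening idea described above: fit a low-rank factor (here PCA) model and flag observations with unusually large reconstruction residuals; the data are synthetic, with two deliberately corrupted rows.

        # Sketch: flag suspect rows of a data table by their residuals after a
        # low-rank (factor/PCA) reconstruction. Synthetic data with two rows that
        # have deliberately injected errors.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(11)
        scores = rng.normal(size=(200, 2))
        loadings = rng.normal(size=(2, 8))
        data = scores @ loadings + 0.1 * rng.normal(size=(200, 8))
        data[17, 3] += 8.0                      # inject "bad" measurements
        data[90, 5] -= 6.0

        pca = PCA(n_components=2).fit(data)
        recon = pca.inverse_transform(pca.transform(data))
        row_resid = np.sqrt(((data - recon) ** 2).sum(axis=1))

        print("flagged rows:", sorted(np.argsort(row_resid)[-2:].tolist()))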

  15. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Jonušauskas, Steponas; Raišienė, Agota Giedrė

    2016-01-01

    With reference to the results of a large sample factor analysis, the article aims to propose a framework for examining technostress in a population. A survey and principal component analysis of a sample consisting of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the dispersion of answers. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  16. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    Science.gov (United States)

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.
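
    A toy sketch of margin-of-exposure (MoE = hazard benchmark / exposure estimate) ranking with Monte Carlo uncertainty propagation, the quantitative core of the prioritisation described above; scenario names, distributions and numbers are hypothetical, not the paper's data.

        # Sketch: rank exposure scenarios by margin of exposure, MoE = hazard
        # benchmark / exposure estimate, with Monte Carlo propagation of uncertainty.
        # Scenario names and all numbers are hypothetical.
        import numpy as np

        rng = np.random.default_rng(5)
        n = 20_000
        scenarios = {                     # (geometric mean exposure, geometric std dev)
            "bag filling":     (0.50, 2.0),
            "manual dumping":  (0.30, 2.5),
            "machine packing": (0.02, 1.8),
        }
        benchmark = rng.lognormal(np.log(5.0), 0.4, size=n)   # hypothetical hazard benchmark

        ranking = {}
        for name, (gm, gsd) in scenarios.items():
            exposure = rng.lognormal(np.log(gm), np.log(gsd), size=n)
            ranking[name] = np.percentile(benchmark / exposure, 5)   # conservative MoE

        for name, moe5 in sorted(ranking.items(), key=lambda kv: kv[1]):
            print(f"{name:15s} 5th-percentile MoE = {moe5:7.1f}")    # smaller = higher priority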

  17. Analysis of Increased Information Technology Outsourcing Factors

    Directory of Open Access Journals (Sweden)

    Brcar Franc

    2013-01-01

    Full Text Available The study explores the field of IT outsourcing. The narrow field of research is to build a model of IT outsourcing based on influential factors. The purpose of this research is to determine the factors influencing IT outsourcing expansion. A survey was conducted with 141 large-sized Slovenian companies. Data were statistically analyzed using binary logistic regression. The final model contains five factors: (1) management’s support; (2) knowledge on IT outsourcing; (3) improvement of efficiency and effectiveness; (4) quality improvement of IT services; and (5) innovation improvement of IT. Managers can immediately use the results of this research in their decision-making. Increased performance of each individual organization is to the benefit of the entire society. The examination of IT outsourcing with the methods used is the first such research in Slovenia.
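
    A sketch of a binary logistic regression of this kind, assuming the statsmodels package; the predictor names mirror the five factors listed above, but the data are simulated, not the Slovenian survey.

        # Sketch: binary logistic regression with five predictors named after the
        # factors listed above; simulated data, statsmodels assumed available.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        n = 141
        X = pd.DataFrame(rng.normal(size=(n, 5)), columns=[
            "mgmt_support", "outsourcing_knowledge", "efficiency_gain",
            "service_quality", "it_innovation"])
        logit_p = -0.3 + 0.8 * X["mgmt_support"] + 0.5 * X["efficiency_gain"]
        y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))     # simulated outsourcing decision

        result = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
        print(result.summary2().tables[1].round(3))          # coefficients and p-values
        print(np.exp(result.params).round(2))                # odds ratios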

  18. Warranty claim analysis considering human factors

    International Nuclear Information System (INIS)

    Wu Shaomin

    2011-01-01

    Warranty claims are not always due to product failures. They can also be caused by two types of human factors. On the one hand, consumers might claim warranty due to misuse and/or failures caused by various human factors. Such claims might account for more than 10% of all reported claims. On the other hand, consumers might not be bothered to claim warranty for failed items that are still under warranty, or they may claim warranty after they have experienced several intermittent failures. These two types of human factors can affect warranty claim costs. However, research in this area has received rather little attention. In this paper, we propose three models to estimate the expected warranty cost when the two types of human factors are included. We consider two types of failures: intermittent and fatal failures, which might result in different claim patterns. Consumers might report claims after a fatal failure has occurred, and upon intermittent failures they might report claims after a number of failures have occurred. Numerical examples are given to validate the results derived.

  19. Chiral analysis of baryon form factors

    Energy Technology Data Exchange (ETDEWEB)

    Gail, T.A.

    2007-11-08

    This work presents an extensive theoretical investigation of the structure of the nucleon within the standard model of elementary particle physics. In particular, the long range contributions to a number of various form factors parametrizing the interactions of the nucleon with an electromagnetic probe are calculated. The theoretical framework for those calculations is chiral perturbation theory, the exact low energy limit of Quantum Chromodynamics, which describes such long range contributions in terms of a pion cloud. In this theory, a nonrelativistic leading one loop order calculation of the form factors parametrizing the vector transition of a nucleon to its lowest lying resonance, the Δ, a covariant calculation of the isovector and isoscalar vector form factors of the nucleon at next to leading one loop order, and a covariant calculation of the isoscalar and isovector generalized vector form factors of the nucleon at leading one loop order are performed. In order to perform consistent loop calculations in the covariant formulation of chiral perturbation theory, an appropriate renormalization scheme is defined in this work. All theoretical predictions are compared to phenomenology and results from lattice QCD simulations. These comparisons allow for a determination of the low energy constants of the theory. Furthermore, the possibility of chiral extrapolation, i.e. the extrapolation of lattice data from simulations at large pion masses down to the small physical pion mass, is studied in detail. Statistical as well as systematic uncertainties are estimated for all results throughout this work. (orig.)

  20. Mammography image quality and evidence based practice: Analysis of the demonstration of the inframammary angle in the digital setting.

    Science.gov (United States)

    Spuur, Kelly; Webb, Jodi; Poulos, Ann; Nielsen, Sharon; Robinson, Wayne

    2018-03-01

    The aim of this study is to determine the clinical rates of the demonstration of the inframammary angle (IMA) on the mediolateral oblique (MLO) view of the breast on digital mammograms and to compare the outcomes with current accreditation standards for compliance. Relationships between the IMA, age, the posterior nipple line (PNL) and compressed breast thickness will be identified and the study outcomes validated using appropriate analyses of inter-reader and inter-rater reliability and variability. Differences in left versus right data were also investigated. A quantitative retrospective study of 2270 randomly selected paired digital mammograms performed by BreastScreen NSW was undertaken. Data were collected by direct measurement and visual analysis. Intra-class correlation analyses were used to evaluate inter- and intra-rater reliability. The IMA was demonstrated on 52.4% of individual and 42.6% of paired mammograms. A linear relationship was found between the posterior nipple line (PNL) and age; the PNL was predicted to increase by 0.48 mm for every one year increment in age. The odds of demonstrating the IMA reduced by 2% for every one year increase in age (p-value = 0.001); are 0.4% higher for every 1 mm increase in PNL (p-value = 0.001); and are 1.6% lower for every 1 mm increase in compressed breast thickness (p-value …). Rater agreement was assessed for the PNL, while there was 100% agreement for the demonstration of the IMA. Analysis of the demonstration of the IMA indicates clinically achievable rates (42.6%) well below those required for compliance (50%-75%) with known worldwide accreditation standards for screening mammography. These standards should be aligned with the reported evidence base. Visualisation of the IMA is impacted negatively by increasing age and compressed breast thickness but positively by breast size (PNL). Copyright © 2018 Elsevier B.V. All rights reserved.
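
    A small worked illustration of how the per-unit odds changes quoted above compound over larger age and thickness differences (arithmetic only, using the reported 2% per year and 1.6% per mm figures).

        # Worked arithmetic: compounding the reported per-unit odds changes.
        or_per_year = 0.98    # odds of demonstrating the IMA fall ~2% per year of age
        or_per_mm = 0.984     # ...and ~1.6% per mm of compressed breast thickness

        print(f"20 years older      : odds x {or_per_year ** 20:.2f}")
        print(f"30 mm thicker breast: odds x {or_per_mm ** 30:.2f}")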

  1. Analysis of Biomechanical Factors in Bend Running

    OpenAIRE

    Bing Zhang; Xinping You; Feng Li

    2013-01-01

    Sprint running is a comprehensive demonstration of technique and tactics under various conditions. However, whether it is fair to allocate lanes (tracks) to short-distance athletes from different parts of the track has been a hot topic. This study analyzes the forces involved, the differences between lanes and the influence of the bend from the perspective of sports biomechanics. The results indicate that many disadvantages exist in the inner lanes, the middle lanes are the best, and the outer ones are inferior to midd...
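
    A short illustration of the mechanical point at issue: the centripetal force (and inward lean) a sprinter needs on the bend grows as the lane radius shrinks. The radii, mass and speed below are rough assumed values for a standard 400 m track, not data from the study.

        # Illustration: required centripetal force and inward lean on the bend for
        # three lane radii. Mass, speed and radii are rough assumed values for a
        # standard 400 m track, not data from the study.
        import math

        mass = 70.0                                  # kg, assumed sprinter mass
        speed = 9.5                                  # m/s, assumed bend speed
        lane_radius = {1: 36.5, 4: 40.2, 8: 45.0}    # m, approximate bend radii by lane

        for lane, r in lane_radius.items():
            f_c = mass * speed ** 2 / r                               # centripetal force (N)
            lean = math.degrees(math.atan(speed ** 2 / (9.81 * r)))   # lean from vertical
            print(f"lane {lane}: r = {r:4.1f} m, force = {f_c:5.0f} N, lean = {lean:4.1f} deg")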

  2. Regression analysis of nuclear plant capacity factors

    International Nuclear Information System (INIS)

    Stocks, K.J.; Faulkner, J.I.

    1980-07-01

    Operating data on all commercial nuclear power plants of the PWR, HWR, BWR and GCR types in the Western World are analysed statistically to determine whether the explanatory variables size, year of operation, vintage and reactor supplier are significant in accounting for the variation in capacity factor. The results are compared with a number of previous studies which analysed only United States reactors. The possibility of specification errors affecting the results is also examined. Although, in general, the variables considered are statistically significant, they explain only a small portion of the variation in the capacity factor. The equations thus obtained should certainly not be used to predict the lifetime performance of future large reactors

  3. An Empirical Analysis of Job Satisfaction Factors.

    Science.gov (United States)

    1987-09-01

    have acknowledged the importance of factors which make the Air Force attractive to its members or conversely, make other employees consider...Maslow’s need hierarchy theory attempts to show that man has five basic categories of needs: physiological, safety, belongingness, esteem, and self...attained until lower-level basic needs are attained. This implies a sort of growth process where optional job environments for given employees are

  4. A Factor Analysis of the BSRI and the PAQ.

    Science.gov (United States)

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  5. Analysis and optimization of the TWINKLE factoring device

    NARCIS (Netherlands)

    Lenstra, A.K.; Shamir, A.; Preneel, B.

    2000-01-01

    We describe an enhanced version of the TWINKLE factoring device and analyse to what extent it can be expected to speed up the sieving step of the Quadratic Sieve and Number Field Sieve factoring algorithms. The bottom line of our analysis is that the TWINKLE-assisted factorization of 768-bit
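
    For context, the sieving that devices like TWINKLE accelerate ultimately serves to produce a congruence of squares x^2 ≡ y^2 (mod n), from which gcd(x - y, n) yields a factor. The toy search below illustrates only that final principle on a tiny n; it is nothing like the actual Quadratic Sieve or the device analysed in the paper.

        # Toy illustration of the congruence-of-squares principle that sieving-based
        # factoring methods work towards: find x, y with x^2 = y^2 (mod n), then
        # gcd(x - y, n) is a nontrivial factor. Not the Quadratic Sieve itself.
        from math import gcd, isqrt

        n = 8051                                   # small composite for illustration
        x = isqrt(n) + 1
        while True:
            y2 = x * x - n
            y = isqrt(y2)
            if y * y == y2:                        # x^2 - n is a perfect square
                p = gcd(x - y, n)
                print(f"x = {x}, y = {y}, factors: {p} * {n // p}")
                break
            x += 1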

  6. Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.

    Science.gov (United States)

    Stankov, L

    1979-07-01

    The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.

  7. A decision analysis framework to support long-term planning for nuclear fuel cycle technology research, development, demonstration and deployment

    International Nuclear Information System (INIS)

    Sowder, A.G.; Machiels, A.J.; Dykes, A.A.; Johnson, D.H.

    2013-01-01

    To address challenges and gaps in nuclear fuel cycle option assessment and to support research, development and demonstration programs oriented toward commercial deployment, EPRI (Electric Power Research Institute) is seeking to develop and maintain an independent analysis and assessment capability by building a suite of assessment tools based on a platform of software, simplified relationships, and explicit decision-making and evaluation guidelines. As a demonstration of the decision-support framework, EPRI examines a relatively near-term fuel cycle option, i.e., use of reactor-grade mixed-oxide fuel (MOX) in U.S. light water reactors. The results appear as a list of significant concerns (like cooling of spent fuels, criticality risk...) that have to be taken into account for the final decision

  8. Demonstration of anticoagulation patient self-testing feasibility at an Indian Health Service facility: A case series analysis

    Directory of Open Access Journals (Sweden)

    Schupbach RR

    2013-03-01

    Full Text Available Background: Anticoagulation patient self-testing (PST) represents an alternative approach to warfarin monitoring by enabling patients to use coagulometers to test their international normalized ratio (INR) values. PST offers several advantages that potentially improve warfarin management. Objective: To describe implementation and associated performance of a PST demonstration program at an Indian Health Service (IHS) facility. Methods: A non-consecutive case series analysis of patients from a pharmacy-managed PST demonstration program was performed at an IHS facility in Oklahoma between July 2008 and February 2009. Results: Mean time in therapeutic range (TTR) for the seven patients showed a small, absolute increase during the twelve weeks of PST compared to the twelve weeks prior to PST. Four of the seven patients had an increase in TTR during the twelve week course of PST compared to their baseline TTR. Three of four patients with increased TTR in the final eight week period of PST achieved a TTR of 100%. Of the three patients who experienced a decrease in TTR after initiating self-testing, two initially presented with a TTR of 100% prior to PST and one patient had a TTR of 100% for the final eight weeks of PST. The two patients not achieving a TTR of 100% during the twelve week PST period demonstrated an increase in TTR following the first four weeks of PST. Conclusion: Although anticoagulation guidelines now emphasize patient self-management (PSM) only, optimal PST remains an integral process in PSM delivery. In the patients studied, the results of this analysis suggest that PST at the IHS facility provided a convenient, alternative method for management of chronic warfarin therapy for qualified patients. More than half of the patients demonstrated improvement in TTR. Although there is a learning curve immediately following PST initiation, the mean TTR for the entire PST period increased modestly when compared to the time period prior to PST.
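
    Time in therapeutic range (TTR) figures like those above are usually computed with the Rosendaal linear-interpolation method; a compact sketch follows, with invented INR values and an assumed 2.0-3.0 therapeutic range.

        # Sketch: time in therapeutic range (TTR) by Rosendaal linear interpolation.
        # INR values and dates are invented; therapeutic range assumed to be 2.0-3.0.
        from datetime import date

        inr_tests = [                       # (test date, INR) for a hypothetical patient
            (date(2009, 1, 5), 1.8),
            (date(2009, 1, 19), 2.4),
            (date(2009, 2, 2), 3.4),
            (date(2009, 2, 16), 2.6),
        ]
        low, high = 2.0, 3.0

        days_total, days_in_range = 0, 0
        for (d0, i0), (d1, i1) in zip(inr_tests, inr_tests[1:]):
            span = (d1 - d0).days
            for step in range(span):                         # interpolate INR day by day
                inr = i0 + (i1 - i0) * step / span
                if low <= inr <= high:
                    days_in_range += 1
            days_total += span

        print(f"TTR = {100 * days_in_range / days_total:.0f}%")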

  9. Modification and analysis of engineering hot spot factor of HFETR

    International Nuclear Information System (INIS)

    Hu Yuechun; Deng Caiyu; Li Haitao; Xu Taozhong; Mo Zhengyu

    2014-01-01

    This paper presents the modification and analysis of the engineering hot spot factors of HFETR. The new factors are applied in the fuel temperature analysis and in the estimation of the safety allowable operating power of HFETR. The results show that the maximum cladding temperature of the fuel is lower when the new factors are used, and the safety allowable operating power of HFETR is higher, thus improving the economic efficiency of HFETR. (authors)

  10. A replication of a factor analysis of motivations for trapping

    Science.gov (United States)

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using a recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.

  11. Factor analysis improves the selection of prescribing indicators

    DEFF Research Database (Denmark)

    Rasmussen, Hanne Marie Skyggedal; Søndergaard, Jens; Sokolowski, Ineta

    2006-01-01

    OBJECTIVE: To test a method for improving the selection of indicators of general practitioners' prescribing. METHODS: We conducted a prescription database study including all 180 general practices in the County of Funen, Denmark, approximately 472,000 inhabitants. Principal factor analysis was us...... appropriate and inappropriate prescribing, as revealed by the correlation of the indicators in the first factor. CONCLUSION: Correlation and factor analysis is a feasible method that assists the selection of indicators and gives better insight into prescribing patterns....

  12. Comprehensive Behavioral Analysis of Activating Transcription Factor 5-Deficient Mice

    Directory of Open Access Journals (Sweden)

    Mariko Umemura

    2017-07-01

    Full Text Available Activating transcription factor 5 (ATF5) is a member of the CREB/ATF family of basic leucine zipper transcription factors. We previously reported that ATF5-deficient (ATF5-/-) mice demonstrated abnormal olfactory bulb development due to impaired interneuron supply. Furthermore, ATF5-/- mice were less aggressive than ATF5+/+ mice. Although ATF5 is widely expressed in the brain, and involved in the regulation of proliferation and development of neurons, the physiological role of ATF5 in the higher brain remains unknown. Our objective was to investigate the physiological role of ATF5 in the higher brain. We performed a comprehensive behavioral analysis using ATF5-/- mice and wild type littermates. ATF5-/- mice exhibited abnormal locomotor activity in the open field test. They also exhibited abnormal anxiety-like behavior in the light/dark transition test and open field test. Furthermore, ATF5-/- mice displayed reduced social interaction in Crawley’s social interaction test and increased pain sensitivity in the hot plate test compared with wild type. Finally, behavioral flexibility was reduced in the T-maze test in ATF5-/- mice compared with wild type. In addition, we demonstrated that ATF5-/- mice display disturbances of monoamine neurotransmitter levels in several brain regions. These results indicate that ATF5 deficiency elicits abnormal behaviors and the disturbance of monoamine neurotransmitter levels in the brain. The behavioral abnormalities of ATF5-/- mice may be due to the disturbance of monoamine levels. Taken together, these findings suggest that ATF5-/- mice may be a unique animal model of some psychiatric disorders.

  13. Human factor analysis and preventive countermeasures in nuclear power plant

    International Nuclear Information System (INIS)

    Li Ye

    2010-01-01

    Based on human error analysis theory and the characteristics of maintenance in a nuclear power plant, the human factors of maintenance in an NPP are divided into three different areas: human, technology, and organization. The human area covers individual factors, including psychological factors, physiological characteristics, health status, level of knowledge and interpersonal skills; the technical factors include technology, equipment, tools, working order, etc.; the organizational factors include management, information exchange, education, working environment, team building and leadership, etc. The analysis found that organizational factors can directly or indirectly affect the behavior of staff and the technical factors, and are the most basic human error factors. On this basis, countermeasures for the nuclear power plant to reduce human error are proposed. (authors)

  14. A Methodological Demonstration of Set-theoretical Approach to Social Media Maturity Models Using Necessary Condition Analysis

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Despite being widely accepted and applied across research domains, maturity models have been criticized for lacking academic rigor; in particular, methodologically rigorous and empirically grounded or tested maturity models are quite rare. Attempting to close this gap, we adopt a set-theoretic approach by applying the Necessary Condition Analysis (NCA) technique to derive maturity stages and stage boundary conditions. The ontology is to view stages (boundaries) in maturity models as a collection of necessary conditions. Using social media maturity data, we demonstrate the strength of our approach and evaluate some of the arguments presented by previous, conceptually focused social media maturity models.

  15. ANALYSIS OF RISK FACTORS ECTOPIC PREGNANCY

    Directory of Open Access Journals (Sweden)

    Budi Santoso

    2017-04-01

    Full Text Available Introduction: Ectopic pregnancy is a pregnancy with extrauterine implantation. This situation is a gynecologic emergency that contributes to maternal mortality. Therefore, early recognition, based on identification of ectopic pregnancy risk factors, is needed. Methods: The design was descriptive observational. The samples were pregnant women who had ectopic pregnancy at the Maternity Room, Emergency Unit, Dr. Soetomo Hospital, Surabaya, from 1 July 2008 to 1 July 2010. The sampling technique was total sampling using medical records. Results: There were 99 patients with ectopic pregnancy out of 2090 pregnant women who sought treatment in Dr. Soetomo Hospital. However, only 29 patients had traceable risk factors. Discussion: Most ectopic pregnancies were in the age group of 26-30 years, comprising 32 patients (32.32%), followed by 25 patients (25.25%) in the age group of 31-35 years, 18 patients (18.18%) in the age group of 21-25 years, 17 patients (17.17%) in the age group of 36-40 years, 4 patients (4.04%) in the age group of 41 years and older, and the fewest, 3 patients (3.03%), in the age group of 16-20 years. A total of 12 patients with ectopic pregnancy (41.38%) had a history of abortion, and 6 patients (20.69%) each were in the groups of ectopic pregnancy patients who used family planning and those with a history of surgery. There were 2 patients (6.90%) in the group of ectopic pregnancy patients who had both a history of surgery and a history of abortion. The incidence rate of ectopic pregnancy was 4.73%, mostly in the second gravidity (34.34%), whereas nulliparous women had the highest prevalence of 39.39%. Acquired risk factors were: history of operations, 10.34%; use of family planning, 20.69%; history of abortion, 41.38%; history of abortion and operation, 6.90%; and family planning together with history of abortion, 20.69%.

  16. What factors do patients consider most important in making lung cancer screening decisions? Findings from a demonstration project conducted in the Veterans Health Administration.

    Science.gov (United States)

    Lillie, Sarah E; Fu, Steven S; Fabbrini, Angela E; Rice, Kathryn L; Clothier, Barbara; Nelson, David B; Doro, Elizabeth A; Moughrabieh, M Anas; Partin, Melissa R

    2017-02-01

    The National Lung Screening Trial recently reported that annual low-dose computed tomography screening is associated with decreased lung cancer mortality in high-risk smokers. This study sought to identify the factors patients consider important in making lung cancer screening (LCS) decisions, and explore variations by patient characteristics and LCS participation. This observational survey study evaluated the Minneapolis VA LCS Clinical Demonstration Project in which LCS-eligible Veterans (N=1388) were randomized to either Direct LCS Invitation (mailed with decision aid, N=926) or Usual Care (provider referral, N=462). We surveyed participants three months post-randomization (response rate 44%) and report the proportion of respondents rating eight decision-making factors (benefits, harms, and neutral factors) as important by condition, patient characteristics, and LCS completion. Overall, the most important factor was personal risk of lung cancer and the least important factor was health risks from LCS. The reported importance varied by patient characteristics, including smoking status, health status, and education level. Overall, the potential harms of LCS were reported less important than the benefits or the neutral decision-making factors. Exposure to Direct LCS Invitation (with decision aid) increased Veterans' attention to specific decision-making factors; compared to Usual Care respondents, a larger proportion of Direct LCS Invitation respondents rated the chance of false-positive results, LCS knowledge, LCS convenience, and anxiety as important. Those completing LCS considered screening harms less important, with the exception of incidental findings. Decision tools influence Veterans' perceptions about LCS decision-making factors. As the factors important to LCS decision making vary by patient characteristics, targeted materials for specific subgroups may be warranted. Attention should be paid to how LCS incidental findings are communicated. Published by

  17. Investigating product development strategy in beverage industry using factor analysis

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

    Full Text Available Selecting a product development strategy that is associated with the company's current service or product innovation, based on customers’ needs and a changing environment, plays an important role in increasing demand, market share, sales and profits. It is therefore important to extract the variables associated with product development that are effective in improving the performance measurement of firms. This paper investigates important factors influencing product development strategies using factor analysis. The proposed model investigates 36 factors and, using factor analysis, we extract the six most influential factors: information sharing, intelligence information, exposure strategy, differentiation, research and development strategy and market survey. The first strategy, partnership, includes five sub-factors, including product development partnership, partnership with foreign firms, customers’ perception of competitors’ products, customer involvement in product development, inter-agency coordination, a customer-oriented approach to innovation and transmission of product development change, where inter-agency coordination is considered the most important factor. Internal strengths are the most influential factor impacting the second strategy, intelligence information. The third factor, introducing strategy, includes four sub-criteria, of which consumer buying behavior is the most influential. Differentiation is the next important factor, with five components, where knowledge and expertise in product innovation is the most important one. Research and development strategy has four sub-criteria, where reducing the product development cycle is the most influential; finally, market survey strategy is the last important factor, with three sub-criteria, where finding new markets plays the most important role.

  18. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Møller, Stig Vinther; Bork, Lasse

    2017-01-01

    We examine U.S. housing price forecastability using principal component analysis (PCA), partial least squares (PLS), and sparse PLS (SPLS). We incorporate information from a large panel of 128 economic time series and show that macroeconomic fundamentals have strong predictive power for future movements in housing prices. We find that (S)PLS models systematically dominate PCA models. (S)PLS models also generate significant out-of-sample predictive power over and above the predictive power contained by the price-rent ratio, autoregressive benchmarks, and regression models based on small datasets.
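
    A minimal sketch of the idea behind the comparison, using scikit-learn: principal components are extracted from the predictor panel without reference to the target and fed into an ordinary forecasting regression, while PLS extracts components that covary with the target directly. The data below are simulated stand-ins, not the authors' 128-series panel or housing-price series.

```python
# Sketch: PCA-factor regression vs. PLS for one-step-ahead forecasting
# (simulated data; dimensions loosely mimic a large macro panel).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
T, N = 200, 128
X = rng.standard_normal((T, N))                      # predictors observed at t
y_next = X[:, :3] @ np.array([0.5, -0.3, 0.2]) \
         + 0.5 * rng.standard_normal(T)              # target for period t+1

X_train, X_test = X[:150], X[150:]
y_train, y_test = y_next[:150], y_next[150:]

# PCA: factors extracted without looking at the target, then used in OLS.
pca = PCA(n_components=3).fit(X_train)
ols = LinearRegression().fit(pca.transform(X_train), y_train)
pca_mse = np.mean((ols.predict(pca.transform(X_test)) - y_test) ** 2)

# PLS: components chosen to covary with the forecast target itself.
pls = PLSRegression(n_components=3).fit(X_train, y_train)
pls_mse = np.mean((pls.predict(X_test).ravel() - y_test) ** 2)

print(f"PCA-OLS out-of-sample MSE: {pca_mse:.3f}")
print(f"PLS     out-of-sample MSE: {pls_mse:.3f}")
```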

  19. Magnetic Analysis of a Single-Aperture 11T Nb3Sn Demonstrator Dipole for LHC Upgrades

    Energy Technology Data Exchange (ETDEWEB)

    Auchmann, B. [CERN; Karppinen, M. [CERN; Kashikhin, V. [Fermilab; Zlobin, A. V. [Fermilab

    2012-05-01

    The planned upgrade of the LHC collimation system foresees additional collimators to be installed in the dispersion suppressor areas around points 2, 3, and 7. The necessary longitudinal space for the collimators could be provided by replacing some 8.33-T 15-m-long NbTi LHC main dipoles with shorter 11-T Nb3Sn dipoles compatible with the LHC lattice and main systems. To demonstrate this possibility, in 2011 Fermilab and CERN started a joint R&D program with the goal of building a 5.5-m-long twin-aperture dipole prototype suitable for installation in the LHC by 2014. The first step of this program is the development of a 2-m-long single-aperture demonstration dipole with the nominal field of 11 T at the LHC nominal current of ~11.85 kA and 60-mm bore with ~20% margin. This paper presents the results of magnetic analysis of the single-aperture Nb3Sn demonstrator dipole for the LHC collimation system upgrade.

  20. Intrinsic bacterial biodegradation of petroleum contamination demonstrated in situ using natural abundance, molecular-level 14C analysis

    International Nuclear Information System (INIS)

    Slater, G.F.; Nelson, R.K.; Kile, B.M.; Reddy, C.M.

    2006-01-01

    Natural abundance, molecular-level 14C analysis was combined with comprehensive gas chromatography (GC x GC) to investigate, in situ, the role of intrinsic biodegradation in the loss of petroleum hydrocarbons from the rocky, inter-tidal zone impacted by the Bouchard 120 oil spill. GC x GC analysis indicated accelerated losses of n-alkane components of the residual petroleum hydrocarbons between day 40 and day 50 after the spill. 14C analysis of bacterial phospholipid fatty acids (PLFA) from the impacted zone on day 44 showed that the polyunsaturated fatty acids attributed to the photoautotrophic component of the microbial community had the same 14C content as the local dissolved inorganic carbon (DIC), indicating that this DIC was their carbon source. In contrast, there was significant 14C depletion in the saturated and mono-unsaturated PLFA, indicating incorporation of petroleum carbon. This correlation between the observed accelerated n-alkane losses and microbial incorporation of 14C-depleted carbon directly demonstrated, in situ, that intrinsic biodegradation was affecting the petroleum. Since the majority of organic contaminants originate from petroleum feed-stocks, in situ molecular-level 14C analysis of microbial PLFA can provide insights into the occurrence and pathways of biodegradation of a wide range of organic contaminants. (Author)

  1. Evaluation of the reliability concerning the identification of human factors as contributing factors by a computer supported event analysis (CEA)

    International Nuclear Information System (INIS)

    Wilpert, B.; Maimer, H.; Loroff, C.

    2000-01-01

    The project's objective is to evaluate the reliability of identifying human factors as contributing factors using computer-supported event analysis (CEA). CEA is a computer version of SOL (Safety through Organizational Learning). The first step included interviews with experts from the nuclear power industry and an evaluation of existing computer-supported event analysis methods. This information was combined into a requirements profile for the CEA software. The next step was the implementation of the software in an iterative process of evaluation. The project concluded with the testing of the CEA software. The testing demonstrated that contributing factors can be validly identified with CEA. In addition, CEA received very positive feedback from the experts. (orig.) [de

  2. Factoring handedness data: I. Item analysis.

    Science.gov (United States)

    Messinger, H B; Messinger, M I

    1995-12-01

    Recently in this journal Peters and Murphy challenged the validity of factor analyses done on bimodal handedness data, suggesting instead that right- and left-handers be studied separately. But bimodality may be avoidable if attention is paid to Oldfield's questionnaire format and instructions for the subjects. Two characteristics appear crucial: a two-column LEFT-RIGHT format for the body of the instrument and what we call Oldfield's Admonition: not to indicate strong preference for a handedness item, such as writing, unless "... the preference is so strong that you would never try to use the other hand unless absolutely forced to...". Attaining unimodality of an item distribution would seem to overcome the objections of Peters and Murphy. In a 1984 survey in Boston we used Oldfield's ten-item questionnaire exactly as published. This produced unimodal item distributions. With reflection of the five-point item scale and a logarithmic transformation, we achieved a degree of normalization for the items. Two surveys elsewhere, based on Oldfield's 20-item list but with changes to the questionnaire format and instructions, yielded markedly different item distributions with peaks at each extreme and sometimes in the middle as well.
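
    The reflect-then-log normalization mentioned above can be sketched as follows, assuming items are coded 1 to 5 with 5 indicating strong right-hand preference; the exact coding of the original questionnaire may differ.

```python
# Sketch of the reflect-then-log step described above, assuming a 1..5 coding
# (the original survey's coding may differ).
import numpy as np

scores = np.array([1, 2, 3, 4, 5, 5, 5, 4])   # hypothetical item responses
reflected = 6 - scores                        # reflect the 5-point scale
log_scores = np.log(reflected)                # compress the long tail
print(log_scores.round(3))
```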

  3. A factor analysis of Functional Independence and Functional Assessment Measure scores among focal and diffuse brain injury patients: The importance of bi-factor models.

    Science.gov (United States)

    Gunn, Sarah; Burgess, Gerald H; Maltby, John

    2018-04-28

    To explore the factor structure of the UK Functional Independence Measure and Functional Assessment Measure (FIM+FAM) among focal and diffuse acquired brain injury patients. Criterion standard. An NHS acute acquired brain injury inpatient rehabilitation hospital. Referred sample of 447 adults (835 cases after exclusions) admitted for inpatient treatment following an acquired brain injury significant enough to justify intensive inpatient neurorehabilitation. Not applicable. Functional Independence Measure and Functional Assessment Measure. Exploratory Factor Analysis suggested a two-factor structure for FIM+FAM scores among both focal-proximate and diffuse-proximate acquired brain injury aetiologies. Confirmatory Factor Analysis suggested that a three-factor bi-factor structure presented the best fit of the FIM+FAM score data across both aetiologies. However, across both analyses, a convergence was found towards a general factor, demonstrated by high correlations between factors in the Exploratory Factor Analysis, and by a general factor explaining the majority of the variance in scores on Confirmatory Factor Analysis. Our findings suggested that although factors describing specific functional domains can be derived from FIM+FAM item scores, there is a convergence towards a single factor describing overall functioning. This single factor informs the specific group factors (e.g. motor, psychosocial and communication function) following brain injury. Further research into the comparative value of the general and group factors as evaluative/prognostic measures is indicated. Copyright © 2018. Published by Elsevier Inc.

  4. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L.

    1983-01-01

    An apparatus is described in which effects of pressure, volume, and temperature changes on a gas can be observed simultaneously. Includes use of the apparatus in demonstrating Boyle's, Gay-Lussac's, and Charles' Laws, attractive forces, Dalton's Law of Partial pressures, and in illustrating measurable vapor pressures of liquids and some solids.…

  5. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1987-01-01

    Describes two demonstrations to illustrate characteristics of substances. Outlines a method to detect the changes in pH levels during the electrolysis of water. Uses water pistols, one filled with methane gas and the other filled with water, to illustrate the differences in these two substances. (TW)

  6. Acidic preparations of lysed platelets upregulate proliferative pathways in osteoblast-like cells as demonstrated by genome-wide microarray analysis.

    Science.gov (United States)

    Wahlström, Ola; Linder, Cecilia Halling; Ansell, Anna; Kalén, Anders; Söderström, Mats; Magnusson, Per

    2011-01-01

    Platelets contain numerous growth factors essential for wound and fracture healing. We investigated the gene expression in human osteoblast-like cells stimulated with lysed platelets prepared in acidic, neutral, or alkaline buffers. Lysed platelets prepared in buffers at pH 5.4, 7.4, and 7.9, were added after neutralization to hFOB 1.19 cells. Genome-wide microarray analysis was performed using the Affymetrix GeneChip 7G Scanner. Biometric, cluster, and pathway analyses were performed with GeneSpring GX. Biometric analyses demonstrated that 53 genes were differentially regulated (p ≤ 0.005, ≥2-fold increase). Pathway analysis revealed 10 significant pathways of which eight are common ones regulating bone formation and cancer growth. Eleven genes were selected for quantitative real-time polymerase chain reaction (PCR) based on the microarray analysis of the lysed platelets prepared in the pH 5.4 experiments. In conclusion, acidic preparations of lysed platelet concentrates release factors essential for cell proliferation and particularly cell metabolism under hypoxic conditions. The genetic response from these factors was dominated by genes associated with the same pathways observed in bone formation and cancer growth. Activation of TGF-β in the acidic preparation could be a stimulatory key factor of cell proliferation. These results support the hypothesis that acidification of platelets modifies the stimulatory response of mesenchymal cells in vitro, which is analogous with the observed milieu of a low pH present in wound and fracture sites, as well as in growing tumors.

  7. Exploring Technostress: Results of a Large Sample Factor Analysis

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2016-06-01

    Full Text Available With reference to the results of a large sample factor analysis, the article aims to propose a framework for examining technostress in a population. A survey and principal component analysis of a sample of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combining 68 questions explain 59.13 per cent of the variance in the responses. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents' answers, revealing technostress causes and consequences as well as technostress prevalence in the population in a statistically validated pattern. The key elements of technostress identified by the factor analysis can serve for the construction of technostress measurement scales in further research.

  8. Economic Analysis of Factors Affecting Technical Efficiency of ...

    African Journals Online (AJOL)

    Economic Analysis of Factors Affecting Technical Efficiency of Smallholders ... socio-economic characteristics which influence technical efficiency in maize production. ... Ministry of Agriculture and livestock, records, books, reports and internet.

  9. Text mining factor analysis (TFA) in green tea patent data

    Science.gov (United States)

    Rahmawati, Sela; Suprijadi, Jadi; Zulhanif

    2017-03-01

    Factor analysis has become one of the most widely used multivariate statistical procedures in applied research endeavors across a multitude of domains. There are two main types of analyses based on factor analysis: Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA). Both EFA and CFA aim to model the relationships between a group of observed indicators and a latent variable, but they differ fundamentally in the a priori restrictions placed on the factor model. This method is applied to patent data in the green tea technology sector to determine the development of green tea technology worldwide. Patent analysis is useful in identifying future technological trends in a specific field of technology. The patent database is obtained from the European Patent Organization (EPO). In this paper, a CFA model is applied to nominal data obtained from a presence-absence matrix; the CFA for nominal data is based on a tetrachoric correlation matrix. Meanwhile, an EFA model is applied to titles from the dominant technology sector, with the titles first pre-processed using text mining.

  10. Analysis of Corrosion Residues Collected from the Aluminum Basket Rails of the High-Burnup Demonstration Cask.

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, Charles R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    In September 2015, an inspection was performed on the TN-32B cask that will be used for the high-burnup demonstration project. During the survey, wooden cribbing that had been placed within the cask eleven years earlier to prevent shifting of the basket during transport was removed, revealing two areas of residue on the aluminum basket rails, where they had contacted the cribbing. The residue appeared to be a corrosion product, and concerns were raised that similar attack could exist at more difficult-to-inspect locations in the canister. Accordingly, when the canister was reopened, samples of the residue were collected for analysis. This report presents the results of that assessment, which determined that the corrosion was due to the presence of the cribbing. The corrosion was associated with fungal material, and fungal activity likely contributed to an aggressive chemical environment. Once the cask has been cleaned, there will be no risk of further corrosion.

  11. Sustainable Manufacturing Practices in Malaysian Automotive Industry: Confirmatory Factor Analysis

    OpenAIRE

    Habidin, Nurul Fadly; Zubir, Anis Fadzlin Mohd; Fuz, Nursyazwani Mohd; Latip, Nor Azrin Md; Azman, Mohamed Nor Azhari

    2015-01-01

    Sustainable manufacturing practices (SMPs) have received enormous attention in current years as an effective solution to support the continuous growth and expansion of the automotive manufacturing industry. This reported study was conducted to examine confirmatory factor analysis for SMP such as manufacturing process, supply chain management, social responsibility, and environmental management based on automotive manufacturing industry. The results of confirmatory factor analysis show that fo...

  12. Towards automatic analysis of dynamic radionuclide studies using principal-components factor analysis

    International Nuclear Information System (INIS)

    Nigran, K.S.; Barber, D.C.

    1985-01-01

    A method is proposed for automatic analysis of dynamic radionuclide studies using the mathematical technique of principal-components factor analysis. This method is considered as a possible alternative to the conventional manual regions-of-interest method widely used. The method emphasises the importance of introducing a priori information into the analysis about the physiology of at least one of the functional structures in a study. Information is added by using suitable mathematical models to describe the underlying physiological processes. A single physiological factor is extracted representing the particular dynamic structure of interest. Two spaces 'study space, S' and 'theory space, T' are defined in the formation of the concept of intersection of spaces. A one-dimensional intersection space is computed. An example from a dynamic 99mTc-DTPA kidney study is used to demonstrate the principle inherent in the method proposed. The method requires no correction for the blood background activity, necessary when processing by the manual method. The careful isolation of the kidney by means of region of interest is not required. The method is therefore less prone to operator influence and can be automated. (author)

  13. An Analysis of Construction Accident Factors Based on Bayesian Network

    OpenAIRE

    Yunsheng Zhao; Jinyong Pei

    2013-01-01

    In this study, we analyze construction accident factors based on a Bayesian network. First, accident cases are analyzed to build a fault tree, which identifies all the factors causing the accidents; the factors are then analyzed qualitatively and quantitatively with the Bayesian network method; finally, a safety management program is determined to guide safety operations. The results of this study show that bad condition of the geological environment has the largest posterio...

  14. Likelihood-based Dynamic Factor Analysis for Measurement and Forecasting

    NARCIS (Netherlands)

    Jungbacker, B.M.J.P.; Koopman, S.J.

    2015-01-01

    We present new results for the likelihood-based analysis of the dynamic factor model. The latent factors are modelled by linear dynamic stochastic processes. The idiosyncratic disturbance series are specified as autoregressive processes with mutually correlated innovations. The new results lead to

  15. Factor Analysis for Finding Invariant Neural Descriptors of Human Emotions

    Directory of Open Access Journals (Sweden)

    Vitor Pereira

    2018-01-01

    Full Text Available A major challenge in decoding human emotions from electroencephalogram (EEG) data is finding representations that are invariant to inter- and intrasubject differences. Most of the previous studies are focused on building an individual discrimination model for every subject (a subject-dependent model). Building subject-independent models is a harder problem due to the high data variability between different subjects and different experiments with the same subject. This paper explores, for the first time, factor analysis as an efficient technique to extract temporal and spatial EEG features suitable for building a brain-computer interface for decoding human emotions across various subjects. Our findings show that early waves (temporal window of 200–400 ms after the stimulus onset) carry more information about the valence of the emotion. Also, the spatial locations of features with a stronger impact on the emotional valence occur in the parietal and occipital regions of the brain. All discrimination models (NN, SVM, kNN, and RF) demonstrate better discrimination rates for the positive valence. These results closely match the experimental psychology hypothesis that, during early periods after the stimulus presentation, the brain response to images with highly positive valence is stronger.

  16. Left ventricular wall motion abnormalities evaluated by factor analysis as compared with Fourier analysis

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Ikuno, Yoshiyasu; Nishikimi, Toshio

    1986-01-01

    Factor analysis was applied to multigated cardiac pool scintigraphy to evaluate its ability to detect left ventricular wall motion abnormalities in 35 patients with old myocardial infarction (MI) and in 12 control cases with normal left ventriculography. All cases were also evaluated by conventional Fourier analysis. In most cases with normal left ventriculography, the ventricular and atrial factors were extracted by factor analysis. In cases with MI, a third factor was obtained in the left ventricle corresponding to the wall motion abnormality. Each case was scored according to the agreement between the findings of ventriculography and those of factor analysis or Fourier analysis. Scores were recorded for three items: the existence, location, and degree of asynergy. In cases of MI, the detection rate of asynergy was 94 % by factor analysis and 83 % by Fourier analysis, and the agreement with respect to location was 71 % and 66 %, respectively. Factor analysis had higher scores than Fourier analysis, but the difference was not significant. The interobserver error of factor analysis was less than that of Fourier analysis. Factor analysis can display locations and dynamic motion curves of asynergy, and it is regarded as a useful method for detecting and evaluating left ventricular wall motion abnormalities. (author)
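
    For reference, the conventional Fourier approach that factor analysis is compared against reduces each pixel's time-activity curve to the amplitude and phase of its first harmonic. The sketch below runs on simulated gated frames, not clinical data; frame counts and noise levels are arbitrary.

```python
# Sketch of first-harmonic Fourier (amplitude/phase) analysis of a gated
# blood-pool study; the pixel time-activity curves here are simulated.
import numpy as np

frames, ny, nx = 16, 64, 64
t = np.arange(frames)
rng = np.random.default_rng(1)
true_phase = rng.uniform(0, 2 * np.pi, size=(ny, nx))
counts = (100 + 20 * np.cos(2 * np.pi * t[:, None, None] / frames + true_phase)
          + rng.normal(0, 2, (frames, ny, nx)))

spectrum = np.fft.fft(counts, axis=0)            # FFT along the cardiac cycle
first_harmonic = spectrum[1]                     # one complex value per pixel
amplitude = np.abs(first_harmonic) * 2 / frames  # wall-motion amplitude image
phase = np.angle(first_harmonic)                 # contraction-timing image
print(amplitude.shape, phase.shape)              # (64, 64) each
```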

  17. Weaving the native web: using social network analysis to demonstrate the value of a minority career development program.

    Science.gov (United States)

    Buchwald, Dedra; Dick, Rhonda Wiegman

    2011-06-01

    American Indian and Alaska Native scientists are consistently among the most underrepresented minority groups in health research. The authors used social network analysis (SNA) to evaluate the Native Investigator Development Program (NIDP), a career development program for junior Native researchers established as a collaboration between the University of Washington and the University of Colorado Denver. The study focused on 29 trainees and mentors who participated in the NIDP. Data were collected on manuscripts and grant proposals produced by participants from 1998 to 2007. Information on authorship of manuscripts and collaborations on grant applications was used to conduct social network analyses with three measures of centrality and one measure of network reach. Both visual and quantitative analyses were performed. Participants in the NIDP collaborated on 106 manuscripts and 83 grant applications. Although three highly connected individuals, with critical and central roles in the program, accounted for much of the richness of the network, both current core faculty and "graduates" of the program were heavily involved in collaborations on manuscripts and grants. This study's innovative application of SNA demonstrates that collaborative relationships can be an important outcome of career development programs for minority investigators and that an analysis of these relationships can provide a more complete assessment of the value of such programs.
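
    A minimal sketch of the centrality computations described above, using networkx on a toy collaboration graph; the node names and edges are placeholders rather than actual program participants or co-authorships.

```python
# Sketch: centrality measures for a toy co-authorship/collaboration network
# (node names are illustrative placeholders).
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("mentor_A", "trainee_1"), ("mentor_A", "trainee_2"),
    ("mentor_B", "trainee_2"), ("trainee_1", "trainee_3"),
    ("mentor_A", "mentor_B"),
])

print(nx.degree_centrality(G))        # share of possible ties each node has
print(nx.betweenness_centrality(G))   # brokerage between otherwise distant nodes
print(nx.closeness_centrality(G))     # average reachability within the network
```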

  18. Preliminary Assessment of ICRP Dose Conversion Factor Recommendations for Accident Analysis Applications

    International Nuclear Information System (INIS)

    Vincent, A.M.

    2002-01-01

    Accident analysis for U.S. Department of Energy (DOE) nuclear facilities is an integral part of the overall safety basis developed by the contractor to demonstrate that facility operation can be conducted safely. An appropriate documented safety analysis for a facility discusses accident phenomenology, quantifies source terms arising from postulated process upset conditions, and applies a standardized, internationally recognized database of dose conversion factors (DCFs) to evaluate radiological conditions for offsite receptors.

  19. Phylogenetic and functional analysis of metagenome sequence from high-temperature archaeal habitats demonstrate linkages between metabolic potential and geochemistry

    Directory of Open Access Journals (Sweden)

    William P. Inskeep

    2013-05-01

    Full Text Available Geothermal habitats in Yellowstone National Park (YNP) provide an unparalleled opportunity to understand the environmental factors that control the distribution of archaea in thermal habitats. Here we describe, analyze and synthesize metagenomic and geochemical data collected from seven high-temperature sites that contain microbial communities dominated by archaea relative to bacteria. The specific objectives of the study were to use metagenome sequencing to determine the structure and functional capacity of thermophilic archaeal-dominated microbial communities across a pH range from 2.5 to 6.4 and to discuss specific examples where the metabolic potential correlated with measured environmental parameters and geochemical processes occurring in situ. Random shotgun metagenome sequence (~40-45 Mbase Sanger sequencing per site) was obtained from environmental DNA extracted from high-temperature sediments and/or microbial mats and subjected to numerous phylogenetic and functional analyses. Analysis of individual sequences (e.g., MEGAN and G+C content) and assemblies from each habitat type revealed the presence of dominant archaeal populations in all environments, 10 of whose genomes were largely reconstructed from the sequence data. Analysis of protein family occurrence, particularly of those involved in energy conservation, electron transport and autotrophic metabolism, revealed significant differences in metabolic strategies across sites consistent with differences in major geochemical attributes (e.g., sulfide, oxygen, pH). These observations provide an ecological basis for understanding the distribution of indigenous archaeal lineages across high temperature systems of YNP.

  20. Analysis of related risk factors for pancreatic fistula after pancreaticoduodenectomy

    Directory of Open Access Journals (Sweden)

    Qi-Song Yu

    2016-08-01

    Full Text Available Objective: To explore the related risk factors for pancreatic fistula after pancreaticoduodenectomy to provide theoretical evidence for effectively preventing the occurrence of pancreatic fistula. Methods: A total of 100 patients who were admitted to our hospital from January 2012 to January 2015 and had undergone pancreaticoduodenectomy were included in the study. The related risk factors for developing pancreatic fistula were collected for single-factor and logistic multi-factor analysis. Results: Among the included patients, 16 had pancreatic fistula, and the total occurrence rate was 16% (16/100). The single-factor analysis showed that upper abdominal operation history, preoperative bilirubin, pancreatic texture, pancreatic duct diameter, intraoperative amount of bleeding, postoperative hemoglobin, and application of somatostatin after operation were risk factors for developing pancreatic fistula (P<0.05). The multi-factor analysis showed that upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin were independent risk factors for developing pancreatic fistula (OR=4.162, 6.104, 5.613, 4.034, P<0.05). Conclusions: The occurrence of pancreatic fistula after pancreaticoduodenectomy is closely associated with upper abdominal operation history, soft pancreatic texture, small pancreatic duct diameter, and low postoperative hemoglobin; therefore, effective measures should be taken to reduce the occurrence of pancreatic fistula according to the patients' own conditions.
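
    A sketch of the multi-factor step, expressed as a logistic regression with odds ratios using statsmodels; the patient data are simulated and the variable names merely mirror the abstract, so neither reproduces the study's dataset or results.

```python
# Sketch: multivariate logistic regression with odds ratios for fistula risk
# factors (simulated patient data; variable names mirror the abstract).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
df = pd.DataFrame({
    "upper_abdominal_surgery": rng.integers(0, 2, n),
    "soft_pancreas": rng.integers(0, 2, n),
    "small_duct_diameter": rng.integers(0, 2, n),
    "low_postop_hemoglobin": rng.integers(0, 2, n),
})
logit = (-3 + 1.4 * df["soft_pancreas"] + 1.2 * df["small_duct_diameter"]
         + 1.0 * df["upper_abdominal_surgery"] + 0.9 * df["low_postop_hemoglobin"])
df["fistula"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df.drop(columns="fistula"))
result = sm.Logit(df["fistula"], X).fit(disp=0)

odds_ratios = pd.DataFrame({"OR": np.exp(result.params),
                            "p": result.pvalues}).drop(index="const")
print(odds_ratios.round(3))
```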

  1. Environmental Performance in Countries Worldwide: Determinant Factors and Multivariate Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Gallego-Alvarez

    2014-11-01

    Full Text Available The aim of this study is to analyze the environmental performance of countries and the variables that can influence it. At the same time, we performed a multivariate analysis using the HJ-biplot, an exploratory method that looks for hidden patterns in the data, obtained from the usual singular value decomposition (SVD) of the data matrix, to contextualize the countries, grouped by geographical areas, and the variables relating to the environmental indicators included in the environmental performance index. The sample used comprises 149 countries from different geographic areas. The findings obtained from the empirical analysis emphasize that socioeconomic factors, such as economic wealth and education, as well as institutional factors represented by the style of public administration, in particular control of corruption, are determinant factors of environmental performance in the countries analyzed. In contrast, no effect on environmental performance was found for factors relating to the internal characteristics of a country or political factors.
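
    A sketch of the SVD step underlying an HJ-biplot: the column-standardized country-by-indicator matrix is decomposed, and both the row (country) and column (indicator) markers are scaled by the singular values. The matrix below is simulated, not the 149-country environmental dataset.

```python
# Sketch of the SVD behind an HJ-biplot: rows (countries) and columns
# (indicators) are both scaled by the singular values (simulated data).
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((149, 10))              # countries x indicators
Xc = (X - X.mean(axis=0)) / X.std(axis=0)       # column-standardize

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
row_markers = U[:, :2] * s[:2]       # country coordinates on the first 2 axes
col_markers = Vt[:2].T * s[:2]       # indicator coordinates on the same axes

explained = s**2 / np.sum(s**2)
print("variance absorbed by axes 1-2:", explained[:2].round(3))
```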

  2. Confirmatory factor analysis applied to the Force Concept Inventory

    Science.gov (United States)

    Eaton, Philip; Willoughby, Shannon D.

    2018-06-01

    In 1995, Huffman and Heller used exploratory factor analysis to draw into question the factors of the Force Concept Inventory (FCI). Since then, several papers have been published examining the factors of the FCI on larger sets of student responses, and understandable factors were extracted as a result. However, none of these proposed factor models have been verified to not be unique to their original sample through the use of independent sets of data. This paper seeks to confirm the factor models proposed by Scott et al. in 2012, and Hestenes et al. in 1992, as well as another expert model proposed within this study, through the use of confirmatory factor analysis (CFA) and a sample of 20 822 postinstruction student responses to the FCI. Upon application of CFA using the full sample, all three models were found to fit the data with acceptable global fit statistics. However, when CFA was performed using these models on smaller sample sizes, the models proposed by Scott et al. and Eaton and Willoughby were found to be far more stable than the model proposed by Hestenes et al. The goodness of fit of these models to the data suggests that the FCI can be scored on factors that are not unique to a single class. These scores could then be used to comment on how instruction methods affect the performance of students along a single factor, and more in-depth analyses of curriculum changes may be possible as a result.

  3. Using exploratory factor analysis in personality research: Best-practice recommendations

    Directory of Open Access Journals (Sweden)

    Sumaya Laher

    2010-11-01

    Research purpose: This article presents more objective methods to determine the number of factors, most notably parallel analysis and Velicer's minimum average partial (MAP). The benefits of rotation are also discussed. The article argues for more consistent use of Procrustes rotation and congruence coefficients in factor analytic studies. Motivation for the study: Exploratory factor analysis is often criticised for not being rigorous and objective enough in terms of the methods used to determine the number of factors, the rotations to be used and ultimately the validity of the factor structure. Research design, approach and method: The article adopts a theoretical stance to discuss the best-practice recommendations for factor analytic research in the field of psychology. Following this, an example located within personality assessment and using the NEO-PI-R specifically is presented. A total of 425 students at the University of the Witwatersrand completed the NEO-PI-R. These responses were subjected to a principal components analysis using varimax rotation. The rotated solution was subjected to a Procrustes rotation with Costa and McCrae's (1992) matrix as the target matrix. Congruence coefficients were also computed. Main findings: The example indicates the use of the methods recommended in the article and demonstrates an objective way of determining the number of factors. It also provides an example of Procrustes rotation with coefficients of agreement as an indication of how factor analytic results may be presented more rigorously in local research. Practical/managerial implications: It is hoped that the recommendations in this article will have best-practice implications for both researchers and practitioners in the field who employ factor analysis regularly. Contribution/value-add: This article will prove useful to all researchers employing factor analysis and has the potential to set the trend for better use of factor analysis in the South African context.
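
    A minimal sketch of Horn's parallel analysis, one of the recommended methods above: retain the components whose observed eigenvalues exceed the corresponding eigenvalues obtained from random data of the same dimensions. The responses here are simulated rather than NEO-PI-R data, and the 95th-percentile threshold is one common convention.

```python
# Sketch of Horn's parallel analysis: keep components whose eigenvalues exceed
# those from random data of the same size (simulated responses here).
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_items = 425, 30
data = rng.standard_normal((n_obs, n_items))          # stand-in for item scores

obs_eigs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]

n_sims = 100
rand_eigs = np.empty((n_sims, n_items))
for i in range(n_sims):
    random_data = rng.standard_normal((n_obs, n_items))
    rand_eigs[i] = np.linalg.eigvalsh(np.corrcoef(random_data, rowvar=False))[::-1]

threshold = np.percentile(rand_eigs, 95, axis=0)       # 95th-percentile criterion
n_factors = int(np.sum(obs_eigs > threshold))
print("factors to retain:", n_factors)
```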

  4. Factor analysis of the Hamilton Depression Rating Scale in Parkinson's disease.

    Science.gov (United States)

    Broen, M P G; Moonen, A J H; Kuijf, M L; Dujardin, K; Marsh, L; Richard, I H; Starkstein, S E; Martinez-Martin, P; Leentjens, A F G

    2015-02-01

    Several studies have validated the Hamilton Depression Rating Scale (HAMD) in patients with Parkinson's disease (PD), and reported adequate reliability and construct validity. However, the factorial validity of the HAMD has not yet been investigated. The aim of our analysis was to explore the factor structure of the HAMD in a large sample of PD patients. A principal component analysis of the 17-item HAMD was performed on data from 341 PD patients, available from a previous cross-sectional study on anxiety. An eigenvalue ≥1 was used to determine the number of factors. Factor loadings ≥0.4 in combination with oblique rotations were used to identify which variables made up the factors. The Kaiser-Meyer-Olkin measure (KMO), Cronbach's alpha, Bartlett's test, communality, percentage of non-redundant residuals and the component correlation matrix were computed to assess factor validity. KMO verified the sample's adequacy for factor analysis and Cronbach's alpha indicated a good internal consistency of the total scale. Six factors had eigenvalues ≥1 and together explained 59.19% of the variance. The number of items per factor varied from 1 to 6. Inter-item correlations within each component were low. There was a high percentage of non-redundant residuals and low communality. This analysis demonstrates that the factorial validity of the HAMD in PD is unsatisfactory. This implies that the scale is not appropriate for studying specific symptom domains of depression based on factorial structure in a PD population. Copyright © 2014 Elsevier Ltd. All rights reserved.
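
    A sketch of the adequacy checks and extraction steps reported above (Bartlett's test, KMO, eigenvalue ≥ 1 retention, oblique rotation), using the Python factor_analyzer package; the 17-item scores are simulated stand-ins for the HAMD data, so the printed values will not match the study.

```python
# Sketch: sampling-adequacy checks and eigenvalue>=1 extraction with an
# oblique rotation for 17 items (simulated scores stand in for HAMD data).
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

rng = np.random.default_rng(0)
latent = rng.normal(size=(341, 4))
items = pd.DataFrame(latent @ rng.normal(size=(4, 17))
                     + rng.normal(scale=0.7, size=(341, 17)))

chi2, p = calculate_bartlett_sphericity(items)   # H0: correlation matrix = identity
_, kmo_total = calculate_kmo(items)
print(f"Bartlett chi2={chi2:.1f} (p={p:.4f}), overall KMO={kmo_total:.2f}")

fa = FactorAnalyzer(rotation=None)
fa.fit(items)
eigenvalues, _ = fa.get_eigenvalues()
n_factors = int((eigenvalues >= 1).sum())        # Kaiser criterion

fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin")
fa.fit(items)
print(n_factors, "factors retained")
print(np.round(fa.loadings_, 2))
```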

  5. Bayesian Sensitivity Analysis of a Nonlinear Dynamic Factor Analysis Model with Nonparametric Prior and Possible Nonignorable Missingness.

    Science.gov (United States)

    Tang, Niansheng; Chow, Sy-Miin; Ibrahim, Joseph G; Zhu, Hongtu

    2017-12-01

    Many psychological concepts are unobserved and usually represented as latent factors apprehended through multiple observed indicators. When multiple-subject multivariate time series data are available, dynamic factor analysis models with random effects offer one way of modeling patterns of within- and between-person variations by combining factor analysis and time series analysis at the factor level. Using the Dirichlet process (DP) as a nonparametric prior for individual-specific time series parameters further allows the distributional forms of these parameters to deviate from commonly imposed (e.g., normal or other symmetric) functional forms, arising as a result of these parameters' restricted ranges. Given the complexity of such models, a thorough sensitivity analysis is critical but computationally prohibitive. We propose a Bayesian local influence method that allows for simultaneous sensitivity analysis of multiple modeling components within a single fitting of the model of choice. Five illustrations and an empirical example are provided to demonstrate the utility of the proposed approach in facilitating the detection of outlying cases and common sources of misspecification in dynamic factor analysis models, as well as identification of modeling components that are sensitive to changes in the DP prior specification.

  6. Coop-Seq Analysis Demonstrates that Sox2 Evokes Latent Specificities in the DNA Recognition by Pax6.

    Science.gov (United States)

    Hu, Caizhen; Malik, Vikas; Chang, Yiming Kenny; Veerapandian, Veeramohan; Srivastava, Yogesh; Huang, Yong-Heng; Hou, Linlin; Cojocaru, Vlad; Stormo, Gary D; Jauch, Ralf

    2017-11-24

    Sox2 and Pax6 co-regulate genes in neural lineages and the lens by forming a ternary complex likely facilitated allosterically through DNA. We used the quantitative and scalable cooperativity-by-sequencing (Coop-seq) approach to interrogate Sox2/Pax6 dimerization on a DNA library where five positions of the Pax6 half-site were randomized yielding 1024 cooperativity factors. Consensus positions normally required for the high-affinity DNA binding by Pax6 need to be mutated for effective dimerization with Sox2. Out of the five randomized bases, a 5' thymidine is present in most of the top ranking elements. However, this thymidine maps to a region outside of the Pax half site and is not expected to directly interact with Pax6 in known binding modes suggesting structural reconfigurations. Re-analysis of ChIP-seq data identified several genomic regions where the cooperativity promoting sequence pattern is co-bound by Sox2 and Pax6. A highly conserved Sox2/Pax6 bound site near the Sprouty2 locus was verified to promote cooperative dimerization designating Sprouty2 as a potential target reliant on Sox2/Pax6 cooperativity in several neural cell types. Collectively, the functional interplay of Sox2 and Pax6 demands the relaxation of high-affinity binding sites and is enabled by alternative DNA sequences. We conclude that this binding mode evolved to warrant that a subset of target genes is only regulated in the presence of suitable partner factors. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Performance analysis for bounded persistent disturbances in PD/PID-controlled robotic systems with its experimental demonstrations

    Science.gov (United States)

    Kim, Jung Hoon; Hur, Sung-Moon; Oh, Yonghwan

    2018-03-01

    This paper is concerned with the performance analysis of proportional-derivative/proportional-integral-derivative (PD/PID) controllers for bounded persistent disturbances in a robotic manipulator. Even though the notion of input-to-state stability (ISS) has been widely used to deal with the effect of disturbances in the control of a robotic manipulator, the corresponding studies cannot be directly applied to the treatment of persistent disturbances occurring in robotic manipulators. This is because the conventional studies relevant to ISS consider the H∞ performance for robotic systems, which is confined to the treatment of decaying disturbances, i.e. disturbances in the L2 space. To deal with the effect of persistent disturbances in robotic systems, we first provide a new treatment of ISS in the L∞ sense, because bounded persistent disturbances should be intrinsically regarded as elements of the L∞ space. We next derive state-space representations of trajectory tracking control in robotic systems, which allow us to define the problem formulations more clearly. We then propose a novel control law that has a PD/PID control form, by which the trajectory tracking system satisfies the reformulated ISS. Furthermore, we obtain a theoretical argument about the L∞ gain from the disturbance to the regulated output through the proposed control law. Finally, experimental studies on a typical 3-degree-of-freedom robotic manipulator are given to demonstrate the effectiveness of the method introduced in this paper.
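
    A numerical sketch of the L∞ viewpoint: a one-link arm under a plain PD law is driven by a bounded persistent disturbance, and the supremum of the tracking error stays bounded. The model parameters, gains, reference trajectory and disturbance are arbitrary illustrative choices, not the paper's controller or experimental setup.

```python
# Sketch: one-link arm with PD control under a bounded persistent disturbance.
# The sup-norm of the tracking error stays bounded, illustrating an ISS-style
# bound in the L-infinity sense (parameters and gains are arbitrary).
import numpy as np

dt, T = 1e-3, 10.0
steps = int(T / dt)
I, b = 0.05, 0.1                  # link inertia and viscous friction
Kp, Kd = 40.0, 6.0                # PD gains

def q_ref(t): return 0.5 * np.sin(t)    # desired joint trajectory
def dq_ref(t): return 0.5 * np.cos(t)

q, dq = 0.0, 0.0                  # joint angle and velocity
err_history = np.empty(steps)
for k in range(steps):
    t = k * dt
    e, de = q_ref(t) - q, dq_ref(t) - dq
    d = 0.3 * np.sign(np.sin(3 * t))    # bounded, persistent disturbance
    tau = Kp * e + Kd * de              # PD control law
    ddq = (tau - b * dq + d) / I        # plant dynamics
    dq += ddq * dt                      # explicit Euler integration
    q += dq * dt
    err_history[k] = abs(e)

print(f"sup-norm of tracking error: {err_history.max():.4f}")
```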

  8. Confirmatory factor analysis of the female sexual function index.

    Science.gov (United States)

    Opperman, Emily A; Benson, Lindsay E; Milhausen, Robin R

    2013-01-01

    The Female Sexual Functioning Index (Rosen et al., 2000) was designed to assess the key dimensions of female sexual functioning using six domains: desire, arousal, lubrication, orgasm, satisfaction, and pain. A full-scale score was proposed to represent women's overall sexual function. The fifth revision to the Diagnostic and Statistical Manual (DSM) is currently underway and includes a proposal to combine desire and arousal problems. The objective of this article was to evaluate and compare four models of the Female Sexual Functioning Index: (a) single-factor model, (b) six-factor model, (c) second-order factor model, and (d) five-factor model combining the desire and arousal subscales. Cross-sectional and observational data from 85 women were used to conduct a confirmatory factor analysis on the Female Sexual Functioning Index. Local and global goodness-of-fit measures, the chi-square test of differences, squared multiple correlations, and regression weights were used. The single-factor model fit was not acceptable. The original six-factor model was confirmed, and good model fit was found for the second-order and five-factor models. Delta chi-square tests of differences supported best fit for the six-factor model, validating usage of the six domains. However, when revisions are made to the DSM-5, the Female Sexual Functioning Index can adapt to reflect these changes and remain a valid assessment tool for women's sexual functioning, as the five-factor structure was also supported.

  9. Clinicopathological Analysis of Factors Related to Colorectal Tumor Perforation

    OpenAIRE

    Medina-Arana, Vicente; Martínez-Riera, Antonio; Delgado-Plasencia, Luciano; Rodríguez-González, Diana; Bravo-Gutiérrez, Alberto; Álvarez-Argüelles, Hugo; Alarcó-Hernández, Antonio; Salido-Ruiz, Eduardo; Fernández-Peralta, Antonia M.; González-Aguilera, Juan J.

    2015-01-01

    Colorectal tumor perforation is a life-threatening complication of this disease. However, little is known about the anatomopathological factors or pathophysiologic mechanisms involved. Pathological and immunohistochemical analyses of factors related to tumoral neo-angiogenesis, which could influence tumor perforation, are assessed in this study. A retrospective study of patients with perforated colon tumors (Group P) and T4a nonperforated tumors (controls) was conducted between 2001 and 20...

  10. Analysis of Key Factors Driving Japan’s Military Normalization

    Science.gov (United States)

    2017-09-01

    no change to our policy of not giving in to terrorism.”40 Though the prime minister was democratically supported, Koizumi’s leadership style took... analysis of the key driving factors of Japan’s normalization. The areas of prime ministerial leadership, regional security threats, alliance issues, and...

  11. Factor analysis of the contextual fine motor questionnaire in children.

    Science.gov (United States)

    Lin, Chin-Kai; Meng, Ling-Fu; Yu, Ya-Wen; Chen, Che-Kuo; Li, Kuan-Hua

    2014-02-01

    Most studies treat fine motor skills as one subscale in a developmental test; hence, further factor analysis of fine motor skills has not been conducted. In fact, fine motor function has been treated as a multi-dimensional domain from both clinical and theoretical perspectives, and therefore knowing its factors would be valuable. The aim of this study is to analyze the internal consistency and factor validity of the Contextual Fine Motor Questionnaire (CFMQ). Based on ecological observation and the literature, the Contextual Fine Motor Questionnaire (CFMQ) was developed and includes 5 subscales: Pen Control, Tool Use During Handicraft Activities, the Use of Dining Utensils, Connecting and Separating during Dressing and Undressing, and Opening Containers. The main purpose of this study is to establish the factorial validity of the CFMQ through this factor analysis. Among 1208 questionnaires, 904 were successfully completed. Data from the children's CFMQ submitted by primary care providers were analyzed, including 485 females (53.6%) and 419 males (46.4%) from grades 1 to 5, ranging in age from 82 to 167 months (M=113.9, SD=16.3). Cronbach's alpha was used to measure internal consistency and exploratory factor analysis was applied to test the five-factor structure of the CFMQ. Results showed that the Cronbach's alpha coefficients of the 5 CFMQ subscales ranged from .77 to .92 and all item-total correlations with corresponding subscales were larger than .4 except for one item. The factor loadings of almost all items on their corresponding factors were larger than .5, except for 3 items. There were five factors, explaining a total of 62.59% of the variance in the CFMQ. In conclusion, the remaining 24 items in the 5 subscales of the CFMQ had appropriate internal consistency, test-retest reliability and construct validity. Copyright © 2013 Elsevier Ltd. All rights reserved.
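
    A sketch of the internal-consistency computation (Cronbach's alpha) for one subscale; the item scores are simulated rather than actual CFMQ responses, so the printed value is illustrative only.

```python
# Sketch: Cronbach's alpha for one subscale (simulated item scores).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(size=(904, 1))                 # shared underlying trait
scores = ability + 0.8 * rng.normal(size=(904, 5))  # 5 correlated items
print(f"alpha = {cronbach_alpha(scores):.2f}")
```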

  12. A feasibility study on age-related factors of wrist pulse using principal component analysis.

    Science.gov (United States)

    Jang-Han Bae; Young Ju Jeon; Sanghun Lee; Jaeuk U Kim

    2016-08-01

    Various analysis methods for examining wrist pulse characteristics are needed for accurate pulse diagnosis. In this feasibility study, principal component analysis (PCA) was performed to observe age-related factors of the wrist pulse from various analysis parameters. Forty subjects in their 20s and 40s participated, and their wrist pulse and respiration signals were acquired with a pulse tonometric device. After pre-processing of the signals, twenty analysis parameters regarded as values reflecting pulse characteristics were calculated and PCA was performed. As a result, the complex parameters could be reduced to a lower dimension, and age-related factors of the wrist pulse were observed in the new combined analysis parameters derived from PCA. These results demonstrate that PCA can be a useful tool for analyzing wrist pulse signals.

  13. Factoring local sequence composition in motif significance analysis.

    Science.gov (United States)

    Ng, Patrick; Keich, Uri

    2008-01-01

    We recently introduced a biologically realistic and reliable significance analysis of the output of a popular class of motif finders. In this paper we further improve our significance analysis by incorporating local base composition information. Relying on realistic biological data simulation, as well as on FDR analysis applied to real data, we show that our method is significantly better than the increasingly popular practice of using the normal approximation to estimate the significance of a finder's output. Finally we turn to leveraging our reliable significance analysis to improve the actual motif finding task. Specifically, endowing a variant of the Gibbs Sampler with our improved significance analysis we demonstrate that de novo finders can perform better than has been perceived. Significantly, our new variant outperforms all the finders reviewed in a recently published comprehensive analysis of the Harbison genome-wide binding location data. Interestingly, many of these finders incorporate additional information such as nucleosome positioning and the significance of binding data.

  14. Analysis of IFR driver fuel hot channel factors

    International Nuclear Information System (INIS)

    Ku, J.Y.; Chang, L.K.; Mohr, D.

    1994-01-01

    Thermal-hydraulic uncertainty factors for Integral Fast Reactor (IFR) driver fuels have been determined based primarily on the database obtained from the predecessor fuels used in the IFR prototype, Experimental Breeder Reactor II. The uncertainty factors were applied to the hot channel factors (HCFs) analyses to obtain separate overall HCFs for fuel and cladding for steady-state analyses. A 'semistatistical horizontal method' was used in the HCFs analyses. The uncertainty factor of the fuel thermal conductivity dominates the effects considered in the HCFs analysis; the uncertainty in fuel thermal conductivity will be reduced as more data are obtained to expand the currently limited database for the IFR ternary metal fuel (U-20Pu-10Zr). A set of uncertainty factors to be used for transient analyses has also been derived.

  15. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

    Interactions of human error factors in NPP operation events are introduced, and 645 WANO operation event reports from 1999 to 2008 were analyzed, among which 432 were found to be related to human errors. After classifying these errors by their root causes or causal factors and applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors; forming good personnel work practices is a systematic effort which needs support in many respects. (2) Verbal communications, personnel work practices, man-machine interface and written procedures and documents play great roles. They are four interacting factors which often come in a bundle; if improvements need to be made to one of them, synchronous measures are also necessary for the others. (3) Management direction and the decision process, which are related to management, have a significant interaction with personnel factors. (authors)

  16. Analysis of IFR driver fuel hot channel factors

    International Nuclear Information System (INIS)

    Ku, J.Y.; Chang, L.K.; Mohr, D.

    2004-01-01

    Thermal-hydraulic uncertainty factors for Integral Fast Reactor (IFR) driver fuels have been determined based primarily on the database obtained from the predecessor fuels used in the IFR prototype, Experimental Breeder Reactor II. The uncertainty factors were applied to the hot channel factors (HCFs) analyses to obtain separate overall HCFs for fuel and cladding for steady-state analyses. A 'semistatistical horizontal method' was used in the HCFs analyses. The uncertainty factor of the fuel thermal conductivity dominates the effects considered in the HCFs analysis; the uncertainty in fuel thermal conductivity will be reduced as more data are obtained to expand the currently limited database for the IFR ternary metal fuel (U-20Pu-10Zr). A set of uncertainty factors to be used for transient analyses has also been derived. (author)

  17. Confirmatory Factor Analysis of the Procrastination Assessment Scale for Students

    Directory of Open Access Journals (Sweden)

    Ronald D. Yockey

    2015-10-01

    Full Text Available The relative fit of one- and two-factor models of the Procrastination Assessment Scale for Students (PASS was investigated using confirmatory factor analysis on an ethnically diverse sample of 345 participants. The results indicated that although the two-factor model provided better fit to the data than the one-factor model, neither model provided optimal fit. However, a two-factor model which accounted for common item theme pairs used by Solomon and Rothblum in the creation of the scale provided good fit to the data. In addition, a significant difference by ethnicity was also found on the fear of failure subscale of the PASS, with Whites having significantly lower scores than Asian Americans or Latino/as. Implications of the results are discussed and recommendations made for future work with the scale.

  18. Two Expectation-Maximization Algorithms for Boolean Factor Analysis

    Czech Academy of Sciences Publication Activity Database

    Frolov, A. A.; Húsek, Dušan; Polyakov, P.Y.

    2014-01-01

    Roč. 130, 23 April (2014), s. 83-97 ISSN 0925-2312 R&D Projects: GA ČR GAP202/10/0262 Grant - others:GA MŠk(CZ) ED1.1.00/02.0070; GA MŠk(CZ) EE.2.3.20.0073 Program:ED Institutional research plan: CEZ:AV0Z10300504 Keywords : Boolean Factor analysis * Binary Matrix factorization * Neural networks * Binary data model * Dimension reduction * Bars problem Subject RIV: IN - Informatics, Computer Science Impact factor: 2.083, year: 2014

  19. Workplace Innovation: Exploratory and Confirmatory Factor Analysis for Construct Validation

    Directory of Open Access Journals (Sweden)

    Wipulanusat Warit

    2017-06-01

    Full Text Available Workplace innovation enables the development and improvement of products, processes and services, leading simultaneously to improvement in organisational performance. This study has the purpose of examining the factor structure of workplace innovation. Survey data, extracted from the 2014 APS employee census and comprising 3,125 engineering professionals in the Commonwealth of Australia's departments, were analysed using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). EFA returned a two-factor structure explaining 69.1% of the variance of the construct. CFA revealed that a two-factor structure was indicated as a validated model (GFI = 0.98, AGFI = 0.95, RMSEA = 0.08, RMR = 0.02, IFI = 0.98, NFI = 0.98, CFI = 0.98, and TLI = 0.96). Both factors showed good reliability of the scale (Individual creativity: α = 0.83, CR = 0.86, and AVE = 0.62; Team Innovation: α = 0.82, CR = 0.88, and AVE = 0.61). These results confirm that the two factors extracted for characterising workplace innovation included individual creativity and team innovation.

  20. Ranking insurance firms using AHP and Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Khodaei Valahzaghard

    2013-03-01

    Full Text Available The insurance industry makes up a significant part of the economy, and it is important to learn more about the capabilities of the different firms active in this industry. In this paper, we present an empirical study to rank insurance firms using the analytical hierarchy process (AHP) as well as factor analysis. The study considers four criteria: capital adequacy, quality of earnings, quality of cash flow and quality of firms' assets. The results of the implementation of factor analysis (FA) have been verified using the Kaiser-Meyer-Olkin (KMO=0.573) and Bartlett's Chi-Square (443.267, P-value=0.000) tests. According to the results of FA, the first and most important factor, capital adequacy, represents 21.557% of total variance; the second factor, quality of income, represents 20.958% of total variance. In addition, the third factor, quality of cash flow, represents 19.417% of total variance and the last factor, quality of assets, represents 18.641% of total variance. The study has also used the analytical hierarchy process (AHP) to rank the insurance firms. The results of our survey indicate that capital adequacy (0.559) is the most important factor, followed by quality of income (0.235), quality of cash flow (0.144) and quality of assets (0.061). The results of AHP are consistent with the results of FA, which somewhat validates the overall study.
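
    A sketch of the AHP step: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix and checked with Saaty's consistency ratio. The pairwise judgments below are illustrative only, not those elicited in the study, although the criteria names follow the abstract.

```python
# Sketch: AHP criterion weights from a pairwise comparison matrix via the
# principal eigenvector, plus Saaty's consistency ratio. The judgments are
# illustrative (criteria: capital adequacy, earnings quality, cash-flow
# quality, asset quality).
import numpy as np

A = np.array([
    [1.0, 3.0, 4.0, 6.0],
    [1/3, 1.0, 2.0, 4.0],
    [1/4, 1/2, 1.0, 3.0],
    [1/6, 1/4, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)          # consistency index
ri = 0.90                                     # Saaty's random index for n = 4
print("weights:", weights.round(3))
print("consistency ratio:", round(ci / ri, 3))
```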

  1. Cloning and sequence analysis demonstrate the chromate reduction ability of a novel chromate reductase gene from Serratia sp.

    Science.gov (United States)

    Deng, Peng; Tan, Xiaoqing; Wu, Ying; Bai, Qunhua; Jia, Yan; Xiao, Hong

    2015-03-01

    The ChrT gene encodes a chromate reductase enzyme which catalyzes the reduction of Cr(VI). The chromate reductase is also known as flavin mononucleotide (FMN) reductase (FMN_red). The aim of the present study was to clone the full-length ChrT DNA from Serratia sp. CQMUS2 and analyze the deduced amino acid sequence and three-dimensional structure. The putative ChrT gene fragment of Serratia sp. CQMUS2 was isolated by polymerase chain reaction (PCR), according to the known FMN_red gene sequence from Serratia sp. AS13. The flanking sequences of the ChrT gene were obtained by high efficiency TAIL-PCR, while the full-length gene of ChrT was cloned in Escherichia coli for subsequent sequencing. The nucleotide sequence of ChrT was submitted to GenBank under the accession number, KF211434. Sequence analysis of the gene and amino acids was conducted using the Basic Local Alignment Search Tool, and open reading frame (ORF) analysis was performed using ORF Finder software. The ChrT gene was found to be an ORF of 567 bp that encodes a 188-amino acid enzyme with a calculated molecular weight of 20.4 kDa. In addition, the ChrT protein was hypothesized to be an NADPH-dependent FMN_red and a member of the flavodoxin-2 superfamily. The amino acid sequence of ChrT showed high sequence similarity to the FMN reductase genes of Klebsiella pneumoniae and Raoultella ornithinolytica, which belong to the flavodoxin-2 superfamily. Furthermore, ChrT was shown to have an 85.6% similarity to the three-dimensional structure of Escherichia coli ChrR, sharing four common enzyme active sites for chromate reduction. Therefore, ChrT gene cloning and protein structure determination demonstrated the ability of the gene for chromate reduction. The results of the present study provide a basis for further studies on ChrT gene expression and protein function.

  2. Cloning and sequence analysis demonstrate the chromate reduction ability of a novel chromate reductase gene from Serratia sp

    Science.gov (United States)

    DENG, PENG; TAN, XIAOQING; WU, YING; BAI, QUNHUA; JIA, YAN; XIAO, HONG

    2015-01-01

    The ChrT gene encodes a chromate reductase enzyme which catalyzes the reduction of Cr(VI). The chromate reductase is also known as flavin mononucleotide (FMN) reductase (FMN_red). The aim of the present study was to clone the full-length ChrT DNA from Serratia sp. CQMUS2 and analyze the deduced amino acid sequence and three-dimensional structure. The putative ChrT gene fragment of Serratia sp. CQMUS2 was isolated by polymerase chain reaction (PCR), according to the known FMN_red gene sequence from Serratia sp. AS13. The flanking sequences of the ChrT gene were obtained by high-efficiency TAIL-PCR, while the full-length ChrT gene was cloned in Escherichia coli for subsequent sequencing. The nucleotide sequence of ChrT was submitted to GenBank under the accession number KF211434. Sequence analysis of the gene and amino acids was conducted using the Basic Local Alignment Search Tool, and open reading frame (ORF) analysis was performed using ORF Finder software. The ChrT gene was found to be an ORF of 567 bp that encodes a 188-amino acid enzyme with a calculated molecular weight of 20.4 kDa. In addition, the ChrT protein was hypothesized to be an NADPH-dependent FMN_red and a member of the flavodoxin-2 superfamily. The amino acid sequence of ChrT showed high sequence similarity to the FMN reductases of Klebsiella pneumoniae and Raoultella ornithinolytica, which belong to the flavodoxin-2 superfamily. Furthermore, ChrT was shown to have an 85.6% similarity to the three-dimensional structure of Escherichia coli ChrR, sharing four common enzyme active sites for chromate reduction. Therefore, ChrT gene cloning and protein structure determination demonstrated the chromate reduction ability of the gene. The results of the present study provide a basis for further studies on ChrT gene expression and protein function. PMID:25667630

  3. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    Science.gov (United States)

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in the identification of human remains in forensic examinations. The present study aimed to compare the reliability and accuracy of stature estimation and to demonstrate the variability between estimated and actual stature using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length and foot breadth), taken on the left side in each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for the estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The stature estimated from the multiplication factors and from the regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in stature estimation from the regression analysis method is smaller than that of the multiplication factor method, thus confirming that regression analysis is better than multiplication factor analysis for stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
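
    A minimal sketch of the comparison described above, using synthetic data: a multiplication-factor estimator (mean stature-to-dimension ratio) against an ordinary least-squares regression. All numbers are invented for illustration and are not the study's measurements.

    # Sketch: multiplication factor vs. linear regression for stature from foot length.
    import numpy as np

    rng = np.random.default_rng(0)
    foot_length = rng.normal(24.5, 1.2, 200)                      # cm, hypothetical
    stature = 6.0 * foot_length + 20.0 + rng.normal(0, 3.0, 200)  # cm, hypothetical

    # Multiplication factor: mean ratio of stature to the dimension.
    mf = np.mean(stature / foot_length)
    stature_mf = mf * foot_length

    # Least-squares regression: stature = a * foot_length + b.
    a, b = np.polyfit(foot_length, stature, 1)
    stature_reg = a * foot_length + b

    print("mean abs error, multiplication factor:", np.mean(np.abs(stature - stature_mf)))
    print("mean abs error, regression:", np.mean(np.abs(stature - stature_reg)))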

  4. A factor analysis to find critical success factors in retail brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

    Full Text Available The present exploratory study aims to find the critical components of a retail brand among several retail stores. The study seeks to build a brand name at the retail level and to find the important factors affecting it. Customer behavior is largely shaped when the first retail customer experience is formed, and these factors have direct impacts on customer experience and satisfaction in the retail industry. The proposed study performs an empirical investigation of two well-known retail stores located in the city of Tehran, Iran. Using a sample of 265 regular customers, the study applies factor analysis and extracts four main factors (related brand, product benefits, customer welfare strategy and corporate profits) from the 31 factors identified in the literature.

  5. Phylogenetic analysis of Tomato spotted wilt virus (TSWV) NSs protein demonstrates the isolated emergence of resistance-breaking strains in pepper.

    Science.gov (United States)

    Almási, Asztéria; Csilléry, Gábor; Csömör, Zsófia; Nemes, Katalin; Palkovics, László; Salánki, Katalin; Tóbiás, István

    2015-02-01

    The resurgence of Tomato spotted wilt virus (TSWV) worldwide as well as in Hungary, causing heavy economic losses, directed attention to the factors contributing to the outbreak of these serious epidemics. The introgression of the Tsw resistance gene into various pepper cultivars appeared to solve TSWV control, but the widespread use of resistant pepper cultivars bearing the same, unique resistance locus provoked the rapid emergence of resistance-breaking (RB) TSWV strains. In Hungary, the sporadic appearance of RB strains in pepper-producing regions was first observed in 2010-2011, but in 2012 they were detected frequently. Previously, the non-structural protein (NSs) encoded by the small RNA (S RNA) of TSWV was verified as the avirulence factor for Tsw resistance; therefore, we analyzed the S RNA of the Hungarian RB and wild-type (WT) isolates and compared them to previously analyzed TSWV strains with RB properties from different geographical origins. Phylogenetic analysis demonstrated that the different RB strains had the closest relationship with the local WT isolates and that no conserved mutation is present in all the NSs genes of RB isolates from different geographical origins. According to these results, we concluded that the RB isolates evolved separately, both geographically and with respect to the RB mechanism.

  6. Worry About Caregiving Performance: A Confirmatory Factor Analysis

    Directory of Open Access Journals (Sweden)

    Ruijie Li

    2018-03-01

    Full Text Available Recent studies on the Zarit Burden Interview (ZBI) support the existence of a unique factor, worry about caregiving performance (WaP), beyond role and personal strain. Our current study aims to confirm the existence of WaP within the multidimensionality of the ZBI and to determine whether the predictors of WaP differ from those of role and personal strain. We performed confirmatory factor analysis (CFA) on 466 caregiver-patient dyads to compare one-factor (total score), two-factor (role/personal strain), three-factor (role/personal strain and WaP), and four-factor (role strain split into two factors) models. We conducted linear regression analyses to explore the relationships of the different ZBI factors with socio-demographic and disease characteristics, and investigated the stage-dependent differences between WaP and role and personal strain by dyadic relationship. The four-factor structure that incorporated WaP and split role strain into two factors yielded the best fit. Linear regression analyses reveal that the variables that significantly predict WaP (adult child caregiver and Neuropsychiatric Inventory Questionnaire (NPI-Q) severity) differ from those predicting role/personal strain (adult child caregiver, instrumental activities of daily living, and NPI-Q distress). Unlike other factors, WaP was significantly endorsed in early cognitive impairment. Among spouses, WaP remained low across Clinical Dementia Rating (CDR) stages until a sharp rise in CDR 3; adult child and sibling caregivers experience a gradual rise throughout the stages. Our results affirm the existence of WaP as a unique factor. Future research should explore the potential of WaP as a possible intervention target to improve self-efficacy in the milder stages of burden.
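
    A hedged sketch of how such competing factor structures can be compared, assuming the semopy package and a hypothetical item file zbi_items.csv with columns z1..z9; the item-to-factor assignments are illustrative, not the published ones.

    # Sketch: comparing a one-factor and a three-factor CFA model on the same items.
    import pandas as pd
    import semopy

    zbi = pd.read_csv("zbi_items.csv")  # hypothetical caregiver item responses

    one_factor = "Burden =~ z1 + z2 + z3 + z4 + z5 + z6 + z7 + z8 + z9"
    three_factor = """
    RoleStrain     =~ z1 + z2 + z3
    PersonalStrain =~ z4 + z5 + z6
    WaP            =~ z7 + z8 + z9
    """

    for name, desc in [("one-factor", one_factor), ("three-factor", three_factor)]:
        model = semopy.Model(desc)
        model.fit(zbi)
        stats = semopy.calc_stats(model)  # fit indices such as CFI, RMSEA, AIC
        print(name, stats[["CFI", "RMSEA", "AIC"]].round(3))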

  7. Architectural analysis and intraoperative measurements demonstrate the unique design of the multifidus muscle for lumbar spine stability.

    Science.gov (United States)

    Ward, Samuel R; Kim, Choll W; Eng, Carolyn M; Gottschalk, Lionel J; Tomiya, Akihito; Garfin, Steven R; Lieber, Richard L

    2009-01-01

    Muscular instability is an important risk factor for lumbar spine injury and chronic low-back pain. Although the lumbar multifidus muscle is considered an important paraspinal muscle, its design features are not completely understood. The purpose of the present study was to determine the architectural properties, in vivo sarcomere length operating range, and passive mechanical properties of the human multifidus muscle. We hypothesized that its architecture would be characterized by short fibers and a large physiological cross-sectional area and that it would operate over a relatively wide range of sarcomere lengths but would have very stiff passive material properties. The lumbar spines of eight cadaver specimens were excised en bloc from T12 to the sacrum. Multifidus muscles were isolated from each vertebral level, permitting the architectural measurements of mass, sarcomere length, normalized fiber length, physiological cross-sectional area, and fiber length-to-muscle length ratio. To determine the sarcomere length operating range of the muscle, sarcomere lengths were measured from intraoperative biopsy specimens that were obtained with the spine in the flexed and extended positions. The material properties of single muscle fibers were obtained from passive stress-strain tests of excised biopsy specimens. The average muscle mass (and standard error) was 146 ± 8.7 g, and the average sarcomere length was 2.27 ± 0.06 µm, yielding an average normalized fiber length of 5.66 ± 0.65 cm, an average physiological cross-sectional area of 23.9 ± 3.0 cm², and an average fiber length-to-muscle length ratio of 0.21 ± 0.03. Intraoperative sarcomere length measurements revealed that the muscle operates from 1.98 ± 0.15 µm in extension to 2.70 ± 0.11 µm in flexion. Passive mechanical data suggested that the material properties of the muscle are comparable with those of muscles of the arm or leg. The architectural design (a high cross-sectional area and

  8. Alternative diagnostic strategies for coronary artery disease in women: demonstration of the usefulness and efficiency of probability analysis

    International Nuclear Information System (INIS)

    Melin, J.A.; Wijns, W.; Vanbutsele, R.J.; Robert, A.; De Coster, P.; Brasseur, L.A.; Beckers, C.; Detry, J.M.

    1985-01-01

    Alternative strategies using conditional probability analysis for the diagnosis of coronary artery disease (CAD) were examined in 93 infarct-free women presenting with chest pain. Another group of 42 consecutive female patients was prospectively analyzed. For this latter group, the physician had access to the pretest and posttest probability of CAD before coronary angiography. These 135 women all underwent stress electrocardiographic, thallium scintigraphic, and coronary angiographic examination. The pretest and posttest probabilities of coronary disease were derived from a computerized Bayesian algorithm. Probability estimates were calculated by the four following hypothetical strategies: S0, in which history, including risk factors, was considered; S1, in which history and stress electrocardiographic results were considered; S2, in which history and stress electrocardiographic and stress thallium scintigraphic results were considered; and S3, in which history and stress electrocardiographic results were used, but in which stress scintigraphic results were considered only if the poststress probability of CAD was between 10% and 90%, i.e., if a sufficient level of diagnostic certainty could not be obtained with the electrocardiographic results alone. The strategies were compared with respect to accuracy with the coronary angiogram as the standard. For both groups of women, S2 and S3 were found to be the most accurate in predicting the presence or absence of coronary disease (p less than .05). However, it was found with use of S3 that more than one-third of the thallium scintigrams could have been avoided without loss of accuracy. It was also found that diagnostic catheterization performed to exclude CAD as a diagnosis could have been avoided in half of the patients without loss of accuracy. (ABSTRACT TRUNCATED AT 250 WORDS)
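
    The probability strategies above rest on sequential Bayesian updating of a pretest probability. The sketch below shows the idea behind a strategy like S3; the sensitivities and specificities are illustrative placeholders, not the study's values.

    # Sketch: Bayesian pretest-to-posttest probability updating for a dichotomous test.
    def posttest_probability(pretest: float, sensitivity: float, specificity: float,
                             positive: bool) -> float:
        """Apply Bayes' theorem for a positive or negative test result."""
        if positive:
            true_pos = sensitivity * pretest
            false_pos = (1.0 - specificity) * (1.0 - pretest)
            return true_pos / (true_pos + false_pos)
        false_neg = (1.0 - sensitivity) * pretest
        true_neg = specificity * (1.0 - pretest)
        return false_neg / (false_neg + true_neg)

    # Strategy S3: order thallium scintigraphy only if the post-ECG probability
    # is still in the indeterminate 10-90% range.
    p = 0.30                                                    # pretest probability from history
    p = posttest_probability(p, 0.65, 0.85, positive=True)      # stress ECG (assumed accuracy)
    if 0.10 < p < 0.90:
        p = posttest_probability(p, 0.85, 0.90, positive=True)  # thallium scan (assumed accuracy)
    print(f"posttest probability of CAD: {p:.2f}")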

  9. Cancer risk factors in Korean news media: a content analysis.

    Science.gov (United States)

    Kye, Su Yeon; Kwon, Jeong Hyun; Kim, Yong-Chan; Shim, Minsun; Kim, Jee Hyun; Cho, Hyunsoon; Jung, Kyu Won; Park, Keeho

    2015-01-01

    Little is known about the news coverage of cancer risk factors in Korea. This study aimed to examine how the news media encompasses a wide array of content regarding cancer risk factors and related cancer sites, and investigate whether news coverage of cancer risk factors is congruent with the actual prevalence of the disease. A content analysis was conducted on 1,138 news stories covered during a 5-year period between 2008 and 2012. The news stories were selected from nationally representative media in Korea. Information was collected about cancer risk factors and cancer sites. Of various cancer risk factors, occupational and environmental exposures appeared most frequently in the news. Breast cancer was mentioned the most in relation to cancer sites. Breast, cervical, prostate, and skin cancer were overrepresented in the media in comparison to incidence and mortality cases, whereas lung, thyroid, liver, and stomach cancer were underrepresented. To our knowledge, this research is the first investigation dealing with news coverage about cancer risk factors in Korea. The study findings show occupational and environmental exposures are emphasized more than personal lifestyle factors; further, more prevalent cancers in developed countries have greater media coverage, not reflecting the realities of the disease. The findings may help health journalists and other health storytellers to develop effective ways to communicate cancer risk factors.

  10. Landslides geotechnical analysis. Qualitative assessment by valuation factors

    Science.gov (United States)

    Cuanalo Oscar, Sc D.; Oliva Aldo, Sc D.; Polanco Gabriel, M. E.

    2012-04-01

    In general, a landslide can cause a disaster when a number of factors combine, such as an extreme event related to a geological phenomenon, vulnerable elements exposed in a specific geographic area, and the probability of loss and damage, evaluated in terms of lives and economic assets, in a certain period of time. This paper presents a qualitative evaluation of slope stability through Valuation Factors, obtained from the characterization of the conditioning and triggering factors that influence instability: for the former, morphology and topography, geology, soil mechanics, hydrogeology and vegetation; for the latter, rain, earthquakes, erosion and scour, and human activity; and ultimately the factors dependent on the stability analysis, together with their ranges of influence, which greatly facilitate the selection of the construction processes best suited to improve the behavior of a slope or hillside. The Valuation Factors are a set of parameters for assessing the influence of the conditioning and triggering factors on the stability of slopes and hillsides. The characteristics of each factor must be properly categorized to capture its effect on behavior; one way to do this is by assigning a weighted value range indicating its effect on the stability of a slope. It is proposed to use Valuation Factors with weighted values between 0 and 1 (arbitrarily selected, but guided by common sense and logic), where the first corresponds to no or minimal effect on stability (no effect or very little influence) and the second to the greatest impact on it (a significant influence). Intermediate effects are evaluated with intermediate values.
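
    A minimal sketch of how such 0-1 Valuation Factors could be combined into a single qualitative indicator; the factor names, weights and equal averaging are illustrative assumptions rather than the paper's scheme.

    # Sketch: combining 0-1 valuation factors into one qualitative instability score.
    conditioning = {"morphology": 0.7, "geology": 0.5, "soil_mechanics": 0.6,
                    "hydrogeology": 0.4, "vegetation": 0.2}
    triggering = {"rain": 0.8, "earthquake": 0.3, "erosion_scour": 0.5,
                  "human_activity": 0.6}

    def mean_valuation(factors: dict) -> float:
        """Average the 0-1 valuation factors (0 = negligible, 1 = strong influence)."""
        return sum(factors.values()) / len(factors)

    score = 0.5 * mean_valuation(conditioning) + 0.5 * mean_valuation(triggering)
    print(f"overall instability indicator (0-1): {score:.2f}")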

  11. Sexual Behavior, Risk Compensation, and HIV Prevention Strategies Among Participants in the San Francisco PrEP Demonstration Project: A Qualitative Analysis of Counseling Notes.

    Science.gov (United States)

    Carlo Hojilla, J; Koester, Kimberly A; Cohen, Stephanie E; Buchbinder, Susan; Ladzekpo, Deawodi; Matheson, Tim; Liu, Albert Y

    2016-07-01

    Pre-exposure prophylaxis (PrEP) is a viable HIV prevention strategy but risk compensation could undermine potential benefits. There are limited data that examine this phenomenon outside of clinical trials. We conducted a qualitative analysis of counseling notes from the San Francisco site of the US PrEP demonstration project to assess how men who have sex with men used PrEP as a prevention strategy and its impact on their sexual practices. Four major themes emerged from our analysis of 130 distinct notes associated with 26 participants. Prevention strategy decision-making was dynamic, often influenced by the context and perceived risk of a sexual encounter. Counselors noted that participants used PrEP in conjunction with other health promotion strategies like condoms, asking about HIV status of their sex partners, and seroadaptation. With few exceptions, existing risk reduction strategies were not abandoned upon initiation of PrEP. Risk-taking behavior was 'seasonal' and fluctuations were influenced by various personal, psychosocial, and health-related factors. PrEP also helped relieve anxiety regarding sex and HIV, particularly among serodiscordant partners. Understanding sexual decision-making and how PrEP is incorporated into existing prevention strategies can help inform future PrEP implementation efforts.

  12. Salivary SPECT and factor analysis in Sjoegren's syndrome

    International Nuclear Information System (INIS)

    Nakamura, T.; Oshiumi, Y.; Yonetsu, K.; Muranaka, T.; Sakai, K.; Kanda, S.; National Fukuoka Central Hospital

    1991-01-01

    Salivary SPECT and factor analysis in Sjoegren's syndrome were performed in 17 patients and 6 volunteers as controls. The ability of SPECT to detect small differences in the level of uptake can be used to separate glands from background even when uptake is reduced, as in patients with Sjoegren's syndrome. In the control and probable Sjoegren's syndrome groups the uptake ratio of the submandibular gland to parotid gland on salivary SPECT (S/P ratio) was less than 1.0. However, in the definite Sjoegren's syndrome group, the ratio was more than 1.0. Moreover, the ratio in all patients with sialectasia, which is characteristic of Sjoegren's syndrome, was more than 1.0. Salivary factor analysis of normal parotid glands showed slowly increasing patterns of uptake and normal submandibular glands had rapidly increasing patterns of uptake. However, in the definite Sjoegren's syndrome group, the factor analysis patterns were altered, with slowly increasing patterns dominating both in the parotid and submandibular glands. These results suggest that the S/P ratio in salivary SPECT and salivary factor analysis provide additional radiologic criteria in diagnosing Sjoegren's syndrome. (orig.)

  13. Genomewide analysis of TCP transcription factor gene family in ...

    Indian Academy of Sciences (India)

    Teosinte branched1/cycloidea/proliferating cell factor1 (TCP) proteins are a large family of transcriptional regulators in angiosperms. They are ... To the best of our knowledge, this is the first study of a genomewide analysis of the apple TCP gene family.

  14. Liquidity indicator for the Croatian economy – Factor analysis approach

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2014-12-01

    Full Text Available Croatian business surveys (BS) are conducted in the manufacturing industry, retail trade and the construction sector. In all of these sectors, managers' assessments of liquidity are measured. The aim of the paper was to form a new composite liquidity indicator by including business survey liquidity measures from all three economic sectors of the Croatian economy mentioned above. In calculating the leading indicator, a factor analysis approach was used. No such indicator currently exists in Croatia or in any other European economy, and the issue of Croatian companies' illiquidity is highly neglected in the literature. The empirical analysis consists of two parts. In the first part, the new liquidity indicator (LI) was formed using factor analysis: one factor, representing the new liquidity indicator, was extracted from the three liquidity variables in the three economic sectors. In the second part, econometric models were applied in order to investigate the forecasting properties of the new business survey liquidity indicator when predicting the direction of changes in Croatian industrial production. The quarterly data used in the research covered the period from January 2000 to April 2013. Based on the econometric analysis, it can be concluded that the LI is a leading indicator of Croatia's industrial production with better forecasting properties than the standard liquidity indicator (formed in the manufacturing industry).

  15. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  16. A Confirmatory Factor Analysis of Reilly's Role Overload Scale

    Science.gov (United States)

    Thiagarajan, Palaniappan; Chakrabarty, Subhra; Taylor, Ronald D.

    2006-01-01

    In 1982, Reilly developed a 13-item scale to measure role overload. This scale has been widely used, but most studies did not assess the unidimensionality of the scale. Given the significance of unidimensionality in scale development, the current study reports a confirmatory factor analysis of the 13-item scale in two samples. Based on the…

  17. 48 CFR 1615.404-70 - Profit analysis factors.

    Science.gov (United States)

    2010-10-01

    ... CONTRACTING BY NEGOTIATION Contract Pricing 1615.404-70 Profit analysis factors. (a) OPM contracting officers... managerial expertise and effort. Evidence of effective contract performance will receive a plus weight, and... indifference to cost control will generally result in a negative weight. (2) Contract cost risk. In assessing...

  18. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
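
    A small sketch of the dependency-tree idea using scikit-learn: a regression tree relating a hypothetical KPI to lower-level process and QoS metrics, with feature importances as a rough measure of influence. The file and column names are invented for illustration.

    # Sketch: a dependency tree of a KPI on process and QoS metrics.
    import pandas as pd
    from sklearn.tree import DecisionTreeRegressor, export_text

    log = pd.read_csv("metrics.csv")  # hypothetical process/QoS monitoring data
    features = ["service_response_ms", "queue_length", "retry_count", "cpu_load"]
    X, y = log[features], log["order_fulfillment_hours"]  # the KPI of interest

    tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
    print(export_text(tree, feature_names=features))       # human-readable dependency tree
    print(dict(zip(features, tree.feature_importances_)))  # rough influence of each metric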

  19. An Evaluation on Factors Influencing Decision making for Malaysia Disaster Management: The Confirmatory Factor Analysis Approach

    Science.gov (United States)

    Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.

    2017-12-01

    For the past few years, natural disasters have been the subject of debate in disaster management, especially flood disasters. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management would ensure that life-saving efforts are not wasted. The aim of this article is to examine the relationship of approach, decision maker, influence factor, result, and ethics to decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were identified from the literature. Questionnaire surveys were administered among lead agencies on the East Coast of Malaysia in the states of Kelantan and Pahang, and a total of 307 valid responses were obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order reflective and first-order reflective measurement model indicates that approach, decision maker, influence factor, result, and ethics have a significant and direct effect on decision making during a disaster. The results of this study show that decision making during a disaster is an important element of disaster management and requires successful collaborative decision making. The measurement model is accepted for further analysis using Structural Equation Modeling (SEM) and can be assessed in future research.

  20. A human factor analysis of a radiotherapy accident

    International Nuclear Information System (INIS)

    Thellier, S.

    2009-01-01

    Since September 2005, I.R.S.N. has studied radiotherapy treatment activities from the angle of human and organizational factors in order to improve the reliability of radiotherapy treatment. Drawing on its experience in analysing nuclear industry incidents, I.R.S.N. analysed and published in March 2008, for the first time in France, a detailed study of a radiotherapy accident from the angle of human and organizational factors. The method used for the analysis is based on interviews and documents kept by the hospital. The analysis aimed at identifying the causes of the difference recorded between the dose prescribed by the radiotherapist and the dose effectively received by the patient. Neither verbal nor written communication (intra-service meetings and treatment protocols) allowed information to be transmitted correctly so that radiographers could adjust the irradiation zones properly. The analysis highlighted the fact that, during the preparation and delivery of the treatment, various factors led to planned checks not being performed. Finally, it highlighted the fact that unresolved questions persist in the report on this accident, owing to a lack of traceability of a certain number of key actions. The article concluded that there must be improvement in three areas: cooperation between practitioners, checking of actions and traceability of actions. (author)

  1. Demonstration of statistical approaches to identify component's ageing by operational data analysis-A case study for the ageing PSA network

    International Nuclear Information System (INIS)

    Rodionov, Andrei; Atwood, Corwin L.; Kirchsteiger, Christian; Patrik, Milan

    2008-01-01

    The paper presents some results of a case study on 'Demonstration of statistical approaches to identify the component's ageing by operational data analysis', which was carried out within the framework of the EC JRC Ageing PSA Network. Several techniques (visual evaluation, nonparametric and parametric hypothesis tests) were proposed and applied in order to demonstrate the capacity, advantages and limitations of statistical approaches for identifying component ageing from operational data. Engineering considerations are outside the scope of the present study.
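
    One standard statistical approach in this family is the Laplace trend test for an increasing failure intensity. The sketch below applies it to invented failure times; it illustrates the general technique rather than the specific tests used in the case study.

    # Sketch: Laplace trend test for ageing (increasing failure intensity) in event data.
    import math
    from scipy.stats import norm

    failure_times = [0.8, 2.1, 3.9, 5.0, 5.6, 6.1, 6.4, 6.7]  # years since start, hypothetical
    observation_period = 7.0                                   # years of operating experience

    n = len(failure_times)
    u = (sum(failure_times) / n - observation_period / 2.0) / (
        observation_period * math.sqrt(1.0 / (12.0 * n)))
    p_value = 1.0 - norm.cdf(u)  # one-sided: a large positive u suggests ageing
    print(f"Laplace statistic u = {u:.2f}, one-sided p = {p_value:.3f}")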

  2. Computed Tomography-Guided Core-Needle Biopsy Specimens Demonstrate Epidermal Growth Factor Receptor Mutations in Patients with Non-Small-Cell Lung Cancer

    International Nuclear Information System (INIS)

    Chen, C.M.; Chang, J.W.C.; Cheung, Y.C.; Lin, G.; Hsieh, J.J.; Hsu, T.; Huang, S.F.

    2008-01-01

    Background: Targeted therapy with a new class of epidermal growth factor receptor (EGFR) inhibitors shows improved clinical response in EGFR gene-mutated lung cancers. Purpose: To evaluate the use of computed tomography (CT)-guided core-needle biopsy specimens for the assessment of EGFR gene mutation in non-small-cell lung cancer (NSCLC). Material and Methods: Seventeen (nine males, eight females) patients with advanced NSCLC were enrolled in this study. All patients underwent CT-guided core-needle biopsy of the lung tumor prior to treatment with the EGFR inhibitor gefitinib. There were no life-threatening complications of biopsy. The specimens were sent fresh-frozen for EGFR mutation analysis and histopathological study. Results: There were 12 (70.6%) EGFR gene mutants and five (29.4%) nonmutants. The objective response rate to gefitinib therapy was 73.3% (11 of 15 patients), with 91.7% (11 of 12 mutants) for the mutant group and 0% for the nonmutant group. Conclusion: CT-guided core-needle biopsy of advanced NSCLC enables the acquisition of sufficient tissue for EGFR gene mutation analysis

  3. EMPLOYMENT LEVEL ANALYSIS FROM THE DETERMINANT FACTORS PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    Elena Diana ŞERB

    2016-02-01

    Full Text Available Neglecting the human factor as part of the labor market causes losses for society, since any activity initiated within it has human intervention as both its starting point and its finishing point. The starting point of the article is the projections made by the European Commission in the 2015 Population Ageing Report (underlying assumptions and projections) and the projections of the 2015 United Nations report, which lead to several conclusions, including that for the first time the average ageing in Romania in 2015 exceeds the values measured in the EU to date, and that this is reflected in the employment level (the active ageing population). The hypothesis behind the article is that the evolution of the population and of migrants has repercussions on employment. Structured in three parts (knowledge status, the analysis of employment indicators, and information about the intensity and direction of the link between a number of factors and the employment level), the article aims to establish the determinant factors of employment through research focused on the analysis of secondary sources and on a regression model. The most important lesson learned from this research is that the labor market is shaped by a variety of factors with greater or lesser influence, and in turn influences other factors.

  4. Arabidopsis transcription factors: genome-wide comparative analysis among eukaryotes.

    Science.gov (United States)

    Riechmann, J L; Heard, J; Martin, G; Reuber, L; Jiang, C; Keddie, J; Adam, L; Pineda, O; Ratcliffe, O J; Samaha, R R; Creelman, R; Pilgrim, M; Broun, P; Zhang, J Z; Ghandehari, D; Sherman, B K; Yu, G

    2000-12-15

    The completion of the Arabidopsis thaliana genome sequence allows a comparative analysis of transcriptional regulators across the three eukaryotic kingdoms. Arabidopsis dedicates over 5% of its genome to code for more than 1500 transcription factors, about 45% of which are from families specific to plants. Arabidopsis transcription factors that belong to families common to all eukaryotes do not share significant similarity with those of the other kingdoms beyond the conserved DNA binding domains, many of which have been arranged in combinations specific to each lineage. The genome-wide comparison reveals the evolutionary generation of diversity in the regulation of transcription.

  5. Numerical analysis of the in-well vapor-stripping system demonstration at Edwards Air Force Base

    International Nuclear Information System (INIS)

    White, M.D.; Gilmore, T.J.

    1996-10-01

    Numerical simulations with the Subsurface Transport Over Multiple Phases (STOMP) simulator were applied to the field demonstration of an in-well vapor-stripping system at Edwards Air Force Base (AFB), near Mojave, California. The demonstration field site at Edwards AFB was previously contaminated by traversing groundwater that contained a varied composition of volatile organic compounds (VOCs), primarily trichloroethylene (TCE). The TCE contamination originated from a surface basin that had been used to collect runoff during the cleaning of experimental rocket-powered planes in the 1960s and 1970s. This report documents those simulations and the associated numerical analyses. A companion report documents the in-well vapor-stripping demonstration from a field perspective.

  6. DOE FY 2010 Budget Request and Recovery Act Funding for Energy Research, Development, Demonstration, and Deployment: Analysis and Recommendations

    Energy Technology Data Exchange (ETDEWEB)

    Anadon, Laura Diaz; Gallagher, Kelly Sims; Bunn, Matthew

    2009-06-01

    short-term. Energy storage may play a crucial role in the future of the power and transportation systems, which together consume two thirds of primary energy in the United States. A recent National Academy of Sciences report recommended carrying out detailed scenario assessments of the penetration of unconventional fuels from coal, and from coal and biomass with CCS. In addition, the research plan provided for nuclear fission does not justify spending as much funding as was requested. The proposed funding for FY 2010 and the resources from ARRA, however, do not guarantee that the United States will finally enjoy the predictable and consistent publicly-funded energy technology innovation effort that it needs. The Obama administration must put in place a comprehensive energy technology innovation strategy that will ensure that an expanded ERD3 effort is both sustainable and efficient. This commission would be charged with, inter alia, developing a strategy that optimizes the integration of the various stages of innovation (research, development, demonstration, early deployment), as well as integrates efforts across technology areas. The database upon which this analysis is based may be downloaded in Excel format at: http://belfercenter.ksg.harvard.edu/publication/19119/ .

  7. Factors Affecting Green Residential Building Development: Social Network Analysis

    Directory of Open Access Journals (Sweden)

    Xiaodong Yang

    2018-05-01

    Full Text Available Green residential buildings (GRBs) are one of the effective practices for energy saving and emission reduction in the construction industry. However, many real estate developers in China are less willing to develop GRBs because of the factors affecting green residential building development (GRBD). In order to promote the sustainable development of GRBs in China, this paper, from the perspective of real estate developers, identifies the influential and critical factors affecting GRBD using the method of social network analysis (SNA). Firstly, 14 factors affecting GRBD are determined from 64 preliminary factors of three main elements, and the framework is established. Secondly, the relationships between the 14 factors are analyzed by SNA. Finally, four critical factors for GRBD are identified by the social network centrality test: the local economic development level; development strategy and innovation orientation; the developer's acknowledgement and positioning of GRBD; and experience and ability in GRBD. The findings illustrate the key issues that affect the development of GRBs, and provide references for policy making by the government and strategy formulation by real estate developers.
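
    A minimal sketch of the centrality step, assuming the networkx package; the nodes and edges stand in for the paper's 14-factor network and are purely illustrative.

    # Sketch: ranking factors by degree centrality in a factor-influence network.
    import networkx as nx

    g = nx.DiGraph()
    g.add_edges_from([
        ("local_economy", "developer_positioning"),
        ("local_economy", "strategy_innovation"),
        ("strategy_innovation", "experience_ability"),
        ("developer_positioning", "experience_ability"),
        ("policy_incentives", "developer_positioning"),
    ])

    centrality = nx.degree_centrality(g)  # fraction of possible connections per node
    for factor, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
        print(f"{factor:24s} {score:.2f}")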

  8. Profile and Risk Factor Analysis of Unintentional Injuries in Children.

    Science.gov (United States)

    Bhamkar, Rahul; Seth, Bageshree; Setia, Maninder Singh

    2016-10-01

    To study the profile and various risk factors associated with unintentional injuries in children. The study is a cross-sectional analysis of data collected from 351 children presenting with unintentional injury to a tertiary care hospital in Navi Mumbai, India. Data were collected on variables based on the Haddon Phase Factor Matrix - host, environment and agent factors. Proportions for categorical variables across various groups were compared using the chi-square test or Fisher's exact test. A logistic regression model was used to evaluate the factors. Falls (36 %) were the most common injuries, followed by bites (23 %). The majority of children were school-going children (38 %), followed by preschool children (29 %). Forty-seven percent were from the lower socioeconomic class. The commonest place of injury was the home (48 %) and the commonest time was evening (49 %). Though there was a male predominance in injuries, the difference across gender was not significant (p = 0.15). Poisonings were significantly more common in infants and toddlers and in the rural population, and the risk of bites also differed significantly between rural and urban children. The profile of injuries varies widely with variations in agent, host and environmental factors. Socio-environmental and economic conditions and the infancy-toddler age groups are predisposing risk factors for bites and poisoning. Although rural areas and the lower socioeconomic class are more vulnerable to serious types of injuries, they still lack essential basic medical care.

  9. Exploratory Factor Analysis With Small Samples and Missing Data.

    Science.gov (United States)

    McNeish, Daniel

    2017-01-01

    Exploratory factor analysis (EFA) is an extremely popular method for determining the underlying factor structure for a set of variables. Due to its exploratory nature, EFA is notorious for being conducted with small sample sizes, and recent reviews of psychological research have reported that between 40% and 60% of applied studies have 200 or fewer observations. Recent methodological studies have addressed small size requirements for EFA models; however, these models have only considered complete data, which are the exception rather than the rule in psychology. Furthermore, the extant literature on missing data techniques with small samples is scant, and nearly all existing studies focus on topics that are not of primary interest to EFA models. Therefore, this article presents a simulation to assess the performance of various missing data techniques for EFA models with both small samples and missing data. Results show that deletion methods do not extract the proper number of factors and estimate the factor loadings with severe bias, even when data are missing completely at random. Predictive mean matching is the best method overall when considering extracting the correct number of factors and estimating factor loadings without bias, although 2-stage estimation was a close second.

  10. Clonal heterogeneity of small-cell anaplastic carcinoma of the lung demonstrated by flow-cytometric DNA analysis

    DEFF Research Database (Denmark)

    Vindeløv, L L; Hansen, H H; Christensen, I J

    1980-01-01

    Flow-cytometric DNA analysis yields information on ploidy and proliferative characteristics of a cell population. The analysis was implemented on small-cell anaplastic carcinoma of the lung using a rapid detergent technique for the preparation of fine-needle aspirates for DNA determination and a ...

  11. Barely Started and Already Left behind: A Descriptive Analysis of the Mathematics Ability Demonstrated by Young Deaf Children

    Science.gov (United States)

    Kritzer, Karen L.

    2009-01-01

    This study examined young deaf children's early informal/formal mathematical knowledge as measured by the Test of Early Mathematics Ability (TEMA-3). Findings from this study suggest that prior to the onset of formal schooling, young deaf children might already demonstrate evidence of academic delays. Of these 28 participants (4-6 years of age),…

  12. Demonstration and quantification of the redistribution and oxidation of carbon monoxide in the human body by tracer analysis

    Directory of Open Access Journals (Sweden)

    Makoto Sawano

    2016-01-01

    Full Text Available Numerous studies have confirmed the role of endogenous carbon monoxide (CO) gas as a signal transmitter. However, CO is considered an intracellular transmitter, as no studies have demonstrated the redistribution of CO from the blood to tissue cells. Tracer analyses of ¹³CO₂ production following ¹³CO gas inhalation demonstrated that CO is oxidized to carbon dioxide (CO₂) in the body and that CO oxidation does not occur in the circulation. However, these results could not clearly demonstrate the redistribution of CO, because oxidation may have occurred in the airway epithelium. The objective of this study, therefore, was to definitively demonstrate and quantify the redistribution and oxidation of CO using time-course analyses of CO and ¹³CO₂ production following ¹³CO-hemoglobin infusion. The subject was infused with 0.45 L of ¹³CO-saturated autologous blood. Exhaled gas was collected intermittently for 36 hours for measurement of minute volumes of CO/CO₂ exhalation and determination of the ¹³CO₂/¹²CO₂ ratio. ¹³CO₂ production significantly increased from 3 to 28 hours, peaking at 8 hours. Of the infused CO, 81% was exhaled as CO and 2.6% as ¹³CO₂. Identical time courses of ¹³CO₂ production following ¹³CO-hemoglobin infusion and ¹³CO inhalation refute the hypothesis that CO is oxidized in the airway epithelium and clearly demonstrate the redistribution of CO from the blood to the tissues. Quantitative analyses have revealed that 19% of CO in the circulating blood is redistributed to tissue cells, whereas 2.6% is oxidized there. Overall, these results suggest that CO functions as a systemic signal transmitter.

  13. Cross-Cultural Validation of the Modified Practice Attitudes Scale: Initial Factor Analysis and a New Factor Model.

    Science.gov (United States)

    Park, Heehoon; Ebesutani, Chad K; Chung, Kyong-Mee; Stanick, Cameo

    2018-01-01

    The objective of this study was to create the Korean version of the Modified Practice Attitudes Scale (K-MPAS) to measure clinicians' attitudes toward evidence-based treatments (EBTs) in the Korean mental health system. Using 189 U.S. therapists and 283 members from the Korean mental health system, we examined the reliability and validity of the MPAS scores. We also conducted the first exploratory and confirmatory factor analysis on the MPAS and compared EBT attitudes across U.S. and Korean therapists. Results revealed that the inclusion of both "reversed-worded" and "non-reversed-worded" items introduced significant method effects that compromised the integrity of the one-factor MPAS model. Problems with the one-factor structure were resolved by eliminating the "non-reversed-worded" items. Reliability and validity were adequate among both Korean and U.S. therapists. Korean therapists also reported significantly more negative attitudes toward EBTs on the MPAS than U.S. therapists. The K-MPAS is the first questionnaire designed to measure Korean service providers' attitudes toward EBTs to help advance the dissemination of EBTs in Korea. The current study also demonstrated the negative impacts that can be introduced by incorporating oppositely worded items into a scale, particularly with respect to factor structure and detecting significant group differences.

  14. Tensor-Dictionary Learning with Deep Kruskal-Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Andrew J.; Pu, Yunchen; Sun, Yannan; Spell, Gregory; Carin, Lawrence

    2017-04-20

    We introduce new dictionary learning methods for tensor-variate data of any order. We represent each data item as a sum of Kruskal decomposed dictionary atoms within the framework of beta-process factor analysis (BPFA). Our model is nonparametric and can infer the tensor-rank of each dictionary atom. This Kruskal-Factor Analysis (KFA) is a natural generalization of BPFA. We also extend KFA to a deep convolutional setting and develop online learning methods. We test our approach on image processing and classification tasks achieving state of the art results for 2D & 3D inpainting and Caltech 101. The experiments also show that atom-rank impacts both overcompleteness and sparsity.
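
    For orientation, the sketch below builds a single rank-R Kruskal (CP) atom as a sum of rank-one outer products, which is the representation KFA places on each dictionary atom; the sizes and factor matrices are arbitrary illustrations, not the learned model.

    # Sketch: a rank-R Kruskal (CP) atom, atom[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r].
    import numpy as np

    rng = np.random.default_rng(1)
    I, J, K, R = 8, 8, 3, 2  # atom dimensions and tensor rank
    A, B, C = rng.normal(size=(I, R)), rng.normal(size=(J, R)), rng.normal(size=(K, R))

    atom = np.einsum("ir,jr,kr->ijk", A, B, C)  # sum of R rank-one outer products
    print(atom.shape)                           # (8, 8, 3), e.g. a small colour patch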

  15. Real-time dynamic MR image reconstruction using compressed sensing and principal component analysis (CS-PCA): Demonstration in lung tumor tracking.

    Science.gov (United States)

    Dietz, Bryson; Yip, Eugene; Yun, Jihyun; Fallone, B Gino; Wachowicz, Keith

    2017-08-01

    This work presents a real-time dynamic image reconstruction technique, which combines compressed sensing and principal component analysis (CS-PCA), to achieve real-time adaptive radiotherapy with the use of a linac-magnetic resonance imaging system. Six retrospective fully sampled dynamic data sets of patients diagnosed with non-small-cell lung cancer were used to investigate the CS-PCA algorithm. Using a database of fully sampled k-space, principal components (PCs) were calculated to aid in the reconstruction of undersampled images. Missing k-space data were calculated by projecting the current undersampled k-space data onto the PCs to generate the corresponding PC weights. The weighted PCs were summed together, and the missing k-space was iteratively updated. To gain insight into how the reconstruction might proceed at lower fields, 6× noise was added to the 3T data to investigate how the algorithm handles noisy data. Acceleration factors ranging from 2 to 10× were investigated using CS-PCA and Split Bregman CS for comparison. Metrics to determine the reconstruction quality included the normalized mean square error (NMSE), as well as the dice coefficients (DC) and centroid displacement of the tumor segmentations. Our results demonstrate that CS-PCA performed better than CS alone. The CS-PCA patient-averaged DC for 3T and 6× noise-added data remained above 0.9 for acceleration factors up to 10×. The patient-averaged NMSE gradually increased with increasing acceleration; however, it remained below 0.06 up to an acceleration factor of 10× for both 3T and 6× noise-added data. The CS-PCA reconstruction speed ranged from 5 to 20 ms (Intel i7-4710HQ CPU @ 2.5 GHz), depending on the chosen parameters. A real-time reconstruction technique was developed for adaptive radiotherapy using a Linac-MRI system. Our CS-PCA algorithm can achieve tumor contours with DC greater than 0.9 and NMSE less than 0.06 at acceleration factors of up to, and including, 10×. The
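
    A toy sketch of the core CS-PCA step described above: principal components are learned from a database of fully sampled frames, and missing k-space points are filled by iteratively projecting the current estimate onto those components while keeping the acquired samples fixed. The arrays are synthetic real-valued stand-ins for complex k-space.

    # Sketch: PCA-based filling of undersampled k-space with data consistency.
    import numpy as np

    rng = np.random.default_rng(2)
    n_frames, n_kspace, n_pcs = 50, 256, 10
    database = rng.normal(size=(n_frames, n_kspace))  # stand-in for fully sampled frames

    mean = database.mean(axis=0)
    _, _, vt = np.linalg.svd(database - mean, full_matrices=False)
    pcs = vt[:n_pcs]                                  # leading principal components (rows)

    true_frame = mean + rng.normal(size=n_pcs) @ pcs  # a new frame lying in the PC subspace
    mask = rng.random(n_kspace) < 0.25                # ~4x undersampling pattern
    measured = true_frame * mask                      # acquired, undersampled k-space

    estimate = mean.copy()
    for _ in range(20):
        estimate[mask] = measured[mask]               # data consistency on acquired samples
        weights = pcs @ (estimate - mean)             # project onto the PC basis
        estimate = mean + weights @ pcs               # re-synthesise full k-space from weights
    estimate[mask] = measured[mask]

    print("relative error:", np.linalg.norm(estimate - true_frame) / np.linalg.norm(true_frame))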

  16. A new detrended semipartial cross-correlation analysis: Assessing the important meteorological factors affecting API

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Chen-Hua, E-mail: shenandchen01@163.com [College of Geographical Science, Nanjing Normal University, Nanjing 210046 (China); Jiangsu Center for Collaborative Innovation in Geographical Information Resource, Nanjing 210046 (China); Key Laboratory of Virtual Geographic Environment of Ministry of Education, Nanjing 210046 (China)

    2015-12-04

    To analyze the unique contribution of meteorological factors to the air pollution index (API), a new method, the detrended semipartial cross-correlation analysis (DSPCCA), is proposed. Based on both a detrended cross-correlation analysis and a DFA-based multivariate-linear-regression (DMLR), this method is improved by including a semipartial correlation technique, which is used to indicate the unique contribution of an explanatory variable to multiple correlation coefficients. The advantages of this method in handling nonstationary time series are illustrated by numerical tests. To further demonstrate the utility of this method in environmental systems, new evidence of the primary contribution of meteorological factors to API is provided through DMLR. Results show that the most important meteorological factors affecting API are wind speed and diurnal temperature range, and the explanatory ability of meteorological factors to API gradually strengthens with increasing time scales. The results suggest that DSPCCA is a useful method for addressing environmental systems. - Highlights: • A detrended multiple linear regression is shown. • A detrended semipartial cross correlation analysis is proposed. • The important meteorological factors affecting API are assessed. • The explanatory ability of meteorological factors to API gradually strengthens with increasing time scales.
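
    The semipartial idea that DSPCCA generalises can be illustrated without the detrending step: residualise one factor on the others and correlate the residual with API to isolate its unique contribution. The synthetic series below are illustrative stand-ins for wind speed, diurnal temperature range and API.

    # Sketch: ordinary (non-detrended) semipartial correlation of one factor with API.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 500
    wind = rng.normal(size=n)
    dtr = 0.4 * wind + rng.normal(size=n)            # correlated with wind speed
    api = -0.8 * wind - 0.5 * dtr + rng.normal(size=n)

    # Residualise wind on the other predictor(s), then correlate the residual with API.
    X = np.column_stack([np.ones(n), dtr])
    beta, *_ = np.linalg.lstsq(X, wind, rcond=None)
    wind_unique = wind - X @ beta
    semipartial = np.corrcoef(wind_unique, api)[0, 1]
    print(f"semipartial correlation of wind speed with API: {semipartial:.2f}")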

  17. A new detrended semipartial cross-correlation analysis: Assessing the important meteorological factors affecting API

    International Nuclear Information System (INIS)

    Shen, Chen-Hua

    2015-01-01

    To analyze the unique contribution of meteorological factors to the air pollution index (API), a new method, the detrended semipartial cross-correlation analysis (DSPCCA), is proposed. Based on both a detrended cross-correlation analysis and a DFA-based multivariate-linear-regression (DMLR), this method is improved by including a semipartial correlation technique, which is used to indicate the unique contribution of an explanatory variable to multiple correlation coefficients. The advantages of this method in handling nonstationary time series are illustrated by numerical tests. To further demonstrate the utility of this method in environmental systems, new evidence of the primary contribution of meteorological factors to API is provided through DMLR. Results show that the most important meteorological factors affecting API are wind speed and diurnal temperature range, and the explanatory ability of meteorological factors to API gradually strengthens with increasing time scales. The results suggest that DSPCCA is a useful method for addressing environmental systems. - Highlights: • A detrended multiple linear regression is shown. • A detrended semipartial cross correlation analysis is proposed. • The important meteorological factors affecting API are assessed. • The explanatory ability of meteorological factors to API gradually strengthens with increasing time scales.

  18. Human factors and fuzzy set theory for safety analysis

    International Nuclear Information System (INIS)

    Nishiwaki, Y.

    1987-01-01

    Human reliability and performance are affected by many factors: medical, physiological and psychological, etc. The uncertainty involved in human factors may not necessarily be probabilistic, but fuzzy. Therefore, it is important to develop a theory by which both the non-probabilistic uncertainties, or fuzziness, of human factors and the probabilistic properties of machines can be treated consistently. In reality, randomness and fuzziness are sometimes mixed. From the mathematical point of view, probabilistic measures may be considered a special case of fuzzy measures. Therefore, fuzzy set theory seems to be an effective tool for analysing man-machine systems. The concept 'failure possibility' based on fuzzy sets is suggested as an approach to safety analysis and fault diagnosis of a large complex system. Fuzzy measures and fuzzy integrals are introduced and their possible applications are also discussed. (author)

  19. Sea level rise and the geoid: factor analysis approach

    Directory of Open Access Journals (Sweden)

    Alexey Sadovski

    2013-08-01

    Full Text Available Sea levels are rising around the world, and this is a particular concern along most of the coasts of the United States. A 1989 EPA report shows that sea levels rose 5-6 inches more than the global average along the Mid-Atlantic and Gulf Coasts in the last century. The main reason for this is coastal land subsidence. This sea level rise is considered relative sea level rise rather than global sea level rise. Thus, instead of studying sea level rise globally, this paper describes a statistical approach using factor analysis of regional sea level rates of change. Unlike physical models and semi-empirical models that attempt to estimate how much and how fast sea levels are changing, this methodology allows for a discussion of the factor(s) that statistically affect sea level rates of change, and seeks patterns to explain spatial correlations.

  20. Memory systems, processes, and tasks: taxonomic clarification via factor analysis.

    Science.gov (United States)

    Bruss, Peter J; Mitchell, David B

    2009-01-01

    The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes.

  1. ANALYSIS OF FACTORS WHICH AFFECTING THE ECONOMIC GROWTH

    Directory of Open Access Journals (Sweden)

    Suparna Wijaya

    2017-03-01

    Full Text Available High economic growth and a sustainable growth process are the main conditions for the sustainability of a country's economic development. They have also become measures of the success of a country's economy. The factors tested in this study are economic and non-economic factors that affect economic development. The goal of this study is to explain the factors that influence Indonesia's macroeconomy, using a linear regression modeling approach. The analysis shows that the tax amnesty, the exchange rate, inflation, and the interest rate jointly account for 77.6% of the effect on economic growth, whereas the remaining 22.4% is influenced by other variables not observed in this study. Keywords: tax amnesty, exchange rates, inflation, SBI and economic growth

  2. Human Modeling for Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years in the use of human modeling for human factors engineering analysis in spacecraft design. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC) and the plans for human modeling in future spacecraft designs.

  3. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe₂O₃ and tapioca flour at various concentrations of Fe₂O₃ ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% Fe₂O₃. Kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easy to prepare. Pressed samples (0.150 /cm² and 2.76 cm in diameter) were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. For a known sample, χ can be conveniently calculated by formula; for an unknown sample, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the Fe₂O₃ content of these samples was linear. This result indicates that the correction factor can be used to improve the accuracy of the X-ray intensity. Therefore, this correction factor is essential in the quantitative analysis of the elements present in any sample by the X-ray fluorescence technique.

  4. Quantitative proteomic analysis of Streptomyces coelicolor development demonstrates that onset of secondary metabolism coincides with hyphae differentiation

    DEFF Research Database (Denmark)

    Manteca, Angel; Sanchez, Jesus; Jung, Hye Ryung

    2010-01-01

    mycelial stages: an early compartmentalized vegetative mycelium (first mycelium, MI), and a multinucleated reproductive mycelium (second mycelium, MII), arising after PCD processes. In the present study, we made a detailed proteomic analysis of the distinct developmental stages of solid confluent...... Streptomyces coelicolor cultures using iTRAQ labelling and LC-MS/MS. A new experimental approach was developed to obtain homogeneous samples at each developmental stage (temporal protein analysis) and also to obtain membrane and cytosolic protein fractions (spatial protein analysis). A total of 345 proteins...

  5. Seismic analysis response factors and design margins of piping systems

    International Nuclear Information System (INIS)

    Shieh, L.C.; Tsai, N.C.; Yang, M.S.; Wong, W.L.

    1985-01-01

    The objective of the simplified methods project of the Seismic Safety Margins Research Program is to develop a simplified seismic risk methodology for general use. The goal is to reduce seismic PRA costs to roughly 60 man-months over a 6 to 8 month period, without compromising the quality of the product. To achieve the goal, it is necessary to simplify the calculational procedure of the seismic response. The response factor approach serves this purpose: the response factor relates the median-level response to the design data. Through a literature survey, we identified the various seismic analysis methods adopted in the U.S. nuclear industry for piping systems. A series of seismic response calculations was performed, and the response factors and their variabilities for each method of analysis were computed. A sensitivity study of the effect of piping damping, the in-structure response spectra envelope method, and the analysis method was conducted. In addition, design margins, which relate the best-estimate response to the design data, are also presented

  6. Phylogenetic and Functional Analysis of Metagenome Sequence from High-Temperature Archaeal Habitats Demonstrate Linkages between Metabolic Potential and Geochemistry

    DEFF Research Database (Denmark)

    Inskeep, William P; Jay, Zackary J; Herrgard, Markus

    2013-01-01

    Geothermal habitats in Yellowstone National Park (YNP) provide an unparalleled opportunity to understand the environmental factors that control the distribution of archaea in thermal habitats. Here we describe, analyze, and synthesize metagenomic and geochemical data collected from seven high-tem...

  7. Structural analysis of closure cap barriers: A pre-test study for the Bentonite Mat Demonstration Project

    International Nuclear Information System (INIS)

    Gong, Chung; Pelfrey, J.R.

    1993-01-01

    The Bentonite Mat Demonstration Project (BMDP) is a field demonstration study to determine the construction/installation requirements, permeability, and subsidence performance characteristics of a composite barrier. The composite barrier will consist of on-site sandy clay blanketed by a bentonite mat and a flexible high-density polyethylene (HDPE) liner (also called a flexible membrane liner). Construction of one control test pad and three bentonite test pads is planned; the control test pad will be used to establish baseline data. Underneath the composite clay cap is a four-foot-thick loose sand layer in which cavities will be created by evacuation of sand. The present work provides a mathematical model for the BMDP, which will be used to simulate the mechanical and structural responses of the composite clay cap during the testing processes. Based upon engineering experience and technical references, a set of nominal soil parameters has been selected

  8. Design and Demonstration of Automated Data Analysis Algorithms for Ultrasonic Inspection of Complex Composite Panels with Bonds

    Science.gov (United States)

    2016-02-01

    To reduce the data review burden and improve the reliability of the ultrasonic inspection of large composite structures, automated data analysis (ADA) algorithms were designed and demonstrated for thickness and backwall C-scan images. All ADA-called indications were classified into three groups: true positives (TP), missed calls (MC) and false calls (FC). Subject terms: automated data analysis (ADA) algorithms; time-of-flight indications; backwall amplitude dropout

  9. Comparative analysis of species-based specificity in Sr-90 and Cs-137 accumulation demonstrated by ligneous plant forest communities

    International Nuclear Information System (INIS)

    Martinovich, B.S.; Vlasov, V.K.; Sak, M.M.; Golushko, R.M.; Afmogenov, A.M.; Kirykhin, O.V.

    2004-01-01

    The authors carried out a field study of the Sr-90 and Cs-137 absorption activity demonstrated by Pinus silvestris L.; Piceae abies (L.) Roth.; Quercus rubra L.; Acer platanoides L.; Betula pendula Roth.; Tilia cordata Mill., under identical habitat conditions. The above plants were examined after a 5-year growth period on radionuclide-contaminated soil. To a great extent, such parameters as radionuclide accumulation in the experimental plants and accumulation activity were determined by the plants' bio-ecological properties. (Authors)

  10. Exploratory Analysis of the Factors Affecting Consumer Choice in E-Commerce: Conjoint Analysis

    Directory of Open Access Journals (Sweden)

    Elena Mazurova

    2017-05-01

    Full Text Available According to previous studies of online consumer behaviour, three factors are the most influential on purchasing behaviour: brand, colour and the position of the product on the screen. However, the simultaneous influence of these three factors on the consumer decision-making process has not been investigated previously. In this work we aim to carry out a comprehensive study of the influence of these three factors. To answer our main research questions, we conducted an experiment with 96 different combinations of the three attributes. Using statistical analyses such as conjoint analysis, t-tests and Kendall's rank correlation, we found that the most influential factor in the online consumer decision-making process is brand; the second most important attribute is colour, which was estimated to be half as important as brand; and the least important attribute is the position on the screen. Additionally, we identified the main differences between consumers' stated and revealed preferences regarding these three attributes.
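
    In a ratings-based conjoint setting, attribute importances of this kind are often derived from the part-worth ranges of a linear model fitted to the profile ratings. The sketch below illustrates that calculation; the file and column names are hypothetical, and the main-effects specification is a generic choice rather than the study's exact estimator.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format data: one row per respondent-profile rating
    profiles = pd.read_csv("conjoint_ratings.csv")  # columns: rating, brand, colour, position

    fit = smf.ols("rating ~ C(brand) + C(colour) + C(position)", data=profiles).fit()

    # Relative importance = range of an attribute's part-worths / sum of all ranges
    ranges = {}
    for attr in ("brand", "colour", "position"):
        worths = [0.0] + [v for k, v in fit.params.items() if k.startswith(f"C({attr})")]
        ranges[attr] = max(worths) - min(worths)
    total = sum(ranges.values())
    print({attr: r / total for attr, r in ranges.items()})
    ```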

  11. Microbiological analysis of common preservatives used in food items and demonstration of their in vitro anti-bacterial activity

    Directory of Open Access Journals (Sweden)

    Tohora Sultana

    2014-12-01

    Full Text Available Objective: To quantify the microorganisms contaminating the common preservatives used in food as well as to detect their in vitro anti-bacterial traits. Methods: A total of 9 preservatives were subjected to conventional cultural and biochemical methods for microbial enumeration. Anti-bacterial activities were demonstrated through the agar well diffusion method. Results: All samples were found to be contaminated with bacteria up to 10⁵ CFU/g and with fungal flora within a range of 10¹-10² CFU/g. Escherichia coli, Pseudomonas spp. and Staphylococcus spp. were demonstrated in most of the samples. Sodium sulfite and citric acid possessed the strongest anti-bacterial trait against all of the test bacteria. Acetic acid exhibited activity against 6 out of 8 test bacteria while vinegar exhibited the activity against 4 bacteria. Activity of salt was demonstrated only against Listeria spp. and Bacillus spp., while activity of sugar and honey was found only against Escherichia coli and Klebsiella spp., respectively. Conclusions: According to the current investigation, sodium sulfite and citric acid samples were found to be satisfactory preservatives both in terms of microbiological criteria and their antibacterial traits.

  12. Comparative proteome approach demonstrates that platelet-derived growth factor C and D efficiently induce proliferation while maintaining multipotency of hMSCs

    Energy Technology Data Exchange (ETDEWEB)

    Sotoca, Ana M., E-mail: a.sotoca@science.ru.nl [Department of Cell and Applied Biology, Radboud University, Heijendaalseweg 135, 6525 AJ Nijmegen (Netherlands); Roelofs-Hendriks, Jose [Department of Cell and Applied Biology, Radboud University, Heijendaalseweg 135, 6525 AJ Nijmegen (Netherlands); Boeren, Sjef [Laboratory of Biochemistry, Wageningen University, Dreijenlaan 3, 6703 HA Wageningen (Netherlands); Kraan, Peter M. van der [Department of Rheumatology Research and Advanced Therapeutics, Radboud University Nijmegen Medical Centre, Nijmegen (Netherlands); Vervoort, Jacques [Laboratory of Biochemistry, Wageningen University, Dreijenlaan 3, 6703 HA Wageningen (Netherlands); Zoelen, Everardus J.J. van; Piek, Ester [Department of Cell and Applied Biology, Radboud University, Heijendaalseweg 135, 6525 AJ Nijmegen (Netherlands)

    2013-10-15

    This is the first study that comprehensively describes the effects of the platelet-derived growth factor (PDGF) isoforms C and D during in vitro expansion of human mesenchymal stem cells (hMSCs). Our results show that PDGFs can enhance proliferation of hMSCs without affecting their multipotency. It is of great value to culture and expand hMSCs in a safe and effective manner without losing their multipotency for manipulation and further development of cell-based therapies. Moreover, differential effects of PDGF isoforms have been observed on lineage-specific differentiation induced by BMP2 and Vitamin D3. Based on a label-free LC-based quantitative proteomics approach, we have furthermore identified specific pathways induced by PDGFs during the proliferation process, showing the importance of bioinformatics tools to study cell function. - Highlights: • PDGFs (C and D) significantly increased the number of multipotent undifferentiated hMSCs. • Enhanced proliferation did not impair the ability to undergo lineage-specific differentiation. • Proteomic analysis confirmed the overall signatures of the ‘intact’ cells.

  13. A pragmatic approach to estimate alpha factors for common cause failure analysis

    International Nuclear Information System (INIS)

    Hassija, Varun; Senthil Kumar, C.; Velusamy, K.

    2014-01-01

    Highlights: • Estimation of coefficients in the alpha factor model for common cause analysis. • A derivation of plant-specific alpha factors is demonstrated. • We examine sensitivity of the common cause contribution to total system failure. • We compare beta factor and alpha factor models for various redundant configurations. • The use of alpha factors is preferable, especially for large redundant systems. - Abstract: Most modern technological systems are deployed with high redundancy but still fail mainly on account of common cause failures (CCF). Various models such as Beta Factor, Multiple Greek Letter, Binomial Failure Rate and Alpha Factor exist for estimation of risk from common cause failures. Amongst these, the alpha factor model is considered most suitable for highly redundant systems, as it arrives at common cause failure probabilities from a set of ratios of failures and the total component failure probability Q_T. In the present study, the alpha factor model is applied to the assessment of CCF of safety systems deployed at two nuclear power plants. A method to overcome the difficulties in estimating the coefficients (the alpha factors) in the model, the importance of deriving plant-specific alpha factors, and the sensitivity of the common cause contribution to the total system failure probability with respect to the hazard imposed by various CCF events are highlighted. An approach described in NUREG/CR-5500 is extended in this study to provide more explicit guidance for a statistical approach to derive plant-specific coefficients for CCF analysis, especially for highly redundant systems. The procedure is expected to aid regulators in independent safety assessment
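
    For reference, a minimal sketch of how basic-event probabilities are obtained from alpha factors is given below. It assumes the widely used non-staggered-testing formulation (as in NUREG/CR-5485), Q_k = [k / C(m-1, k-1)] · (α_k / α_t) · Q_T with α_t = Σ k·α_k; the numerical values are purely illustrative and are not the plant-specific coefficients derived in the paper.

    ```python
    from math import comb

    def ccf_basic_event_probs(alphas, q_total):
        """Common cause basic-event probabilities Q_1..Q_m from alpha factors.

        alphas  : [alpha_1, ..., alpha_m], fractions of failure events involving
                  exactly k components (should sum to 1)
        q_total : total component failure probability Q_T
        """
        m = len(alphas)
        alpha_t = sum(k * a for k, a in enumerate(alphas, start=1))
        return [k / comb(m - 1, k - 1) * a / alpha_t * q_total
                for k, a in enumerate(alphas, start=1)]

    # Illustrative 4-train group with hypothetical alpha factors
    print(ccf_basic_event_probs([0.950, 0.030, 0.015, 0.005], q_total=1.0e-3))
    ```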

  14. Direct projection from the suprachiasmatic nucleus to hypophysiotrophic corticotropin-releasing factor immunoreactive cells in the paraventricular nucleus of the hypothalamus demonstrated...

    DEFF Research Database (Denmark)

    Vrang, N.; Larsen, P.J.; Mikkelsen, J.D.

    1995-01-01

    Suprachiasmatic nucleus, paraventricular nucleus, circadian rhythms, Phaseolus vulgaris-leucoagglutinin, corticotropin-releasing factor, dual immunocytochemistry

  15. Dispersion-theoretical analysis of the nucleon electromagnetic form factors

    Energy Technology Data Exchange (ETDEWEB)

    Belushkin, M.

    2007-09-29

    The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, K anti-K and the ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)

  16. Dispersion-theoretical analysis of the nucleon electromagnetic form factors

    International Nuclear Information System (INIS)

    Belushkin, M.

    2007-01-01

    The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, K anti-K and the ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)

  17. Patient Safety Culture Survey in Pediatric Complex Care Settings: A Factor Analysis.

    Science.gov (United States)

    Hessels, Amanda J; Murray, Meghan; Cohen, Bevin; Larson, Elaine L

    2017-04-19

    Children with complex medical needs are increasing in number and demanding the services of pediatric long-term care facilities (pLTC), which require a focus on patient safety culture (PSC). However, no tool to measure PSC has been tested in this unique hybrid acute care-residential setting. The objective of this study was to evaluate the psychometric properties of the Nursing Home Survey on Patient Safety Culture tool slightly modified for use in the pLTC setting. Factor analyses were performed on data collected from 239 staff at 3 pLTC in 2012. Items were screened by principal axis factoring, and the original structure was tested using confirmatory factor analysis. Exploratory factor analysis was conducted to identify the best model fit for the pLTC data, and factor reliability was assessed by Cronbach alpha. The extracted, rotated factor solution suggested items in 4 (staffing, nonpunitive response to mistakes, communication openness, and organizational learning) of the original 12 dimensions may not be a good fit for this population. Nevertheless, in the pLTC setting, both the original and the modified factor solutions demonstrated similar reliabilities to the published consistencies of the survey when tested in adult nursing homes and the items factored nearly identically as theorized. This study demonstrates that the Nursing Home Survey on Patient Safety Culture with minimal modification may be an appropriate instrument to measure PSC in pLTC settings. Additional psychometric testing is recommended to further validate the use of this instrument in this setting, including examining the relationship to safety outcomes. Increased use will yield data for benchmarking purposes across these specialized settings to inform frontline workers and organizational leaders of areas of strength and opportunity for improvement.

  18. Phasor analysis of binary diffraction gratings with different fill factors

    International Nuclear Information System (INIS)

    MartInez, Antonio; Sanchez-Lopez, Ma del Mar; Moreno, Ignacio

    2007-01-01

    In this work, we present a simple analysis of binary diffraction gratings with different slit widths relative to the grating period. The analysis is based on a simple phasor technique directly derived from the Huygens principle. By introducing a slit phasor and a grating phasor, the intensity of the diffracted orders and the grating's resolving power can be easily obtained without applying the usual Fourier transform operations required for these calculations. The proposed phasor technique is mathematically equivalent to the Fourier transform calculation of the diffraction order amplitude, and it can be useful to explain binary diffraction gratings in a simple manner in introductory physics courses. This theoretical analysis is illustrated with experimental results using a liquid crystal device to display diffraction gratings with different fill factors

  19. Phasor analysis of binary diffraction gratings with different fill factors

    Energy Technology Data Exchange (ETDEWEB)

    MartInez, Antonio [Departamento de Ciencia de Materiales, Optica y TecnologIa Electronica, Universidad Miguel Hernandez, 03202 Elche (Spain); Sanchez-Lopez, Ma del Mar [Instituto de BioingenierIa y Departamento de Fisica y Arquitectura de Computadores, Universidad Miguel Hernandez, 03202 Elche (Spain); Moreno, Ignacio [Departamento de Ciencia de Materiales, Optica y TecnologIa Electronica, Universidad Miguel Hernandez, 03202 Elche (Spain)

    2007-09-11

    In this work, we present a simple analysis of binary diffraction gratings with different slit widths relative to the grating period. The analysis is based on a simple phasor technique directly derived from the Huygens principle. By introducing a slit phasor and a grating phasor, the intensity of the diffracted orders and the grating's resolving power can be easily obtained without applying the usual Fourier transform operations required for these calculations. The proposed phasor technique is mathematically equivalent to the Fourier transform calculation of the diffraction order amplitude, and it can be useful to explain binary diffraction gratings in a simple manner in introductory physics courses. This theoretical analysis is illustrated with experimental results using a liquid crystal device to display diffraction gratings with different fill factors.
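
    The equivalence with the Fourier description can be made concrete: for a binary amplitude grating with fill factor f (slit width over period), the amplitude of order m is f·sinc(mf), so the order intensities follow directly. The short sketch below evaluates this standard textbook result for a few fill factors; it illustrates the formula rather than the phasor construction itself.

    ```python
    import numpy as np

    def order_intensities(fill_factor, orders):
        """Relative intensities of the diffraction orders of a binary amplitude
        grating: I_m = |f * sinc(m * f)|^2, with sinc(x) = sin(pi x)/(pi x)."""
        m = np.asarray(orders)
        return np.abs(fill_factor * np.sinc(m * fill_factor)) ** 2

    orders = np.arange(-3, 4)
    for f in (0.25, 0.50, 0.75):          # different slit-width-to-period ratios
        print(f, np.round(order_intensities(f, orders), 4))
    # At f = 0.5 the even orders (other than m = 0) vanish, as expected.
    ```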

  20. Biosphere dose conversion Factor Importance and Sensitivity Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This report presents importance and sensitivity analysis for the environmental radiation model for Yucca Mountain, Nevada (ERMYN). ERMYN is a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis concerns the output of the model, biosphere dose conversion factors (BDCFs) for the groundwater, and the volcanic ash exposure scenarios. It identifies important processes and parameters that influence the BDCF values and distributions, enhances understanding of the relative importance of the physical and environmental processes on the outcome of the biosphere model, includes a detailed pathway analysis for key radionuclides, and evaluates the appropriateness of selected parameter values that are not site-specific or have large uncertainty

  1. Logistic regression analysis of risk factors for postoperative recurrence of spinal tumors and analysis of prognostic factors.

    Science.gov (United States)

    Zhang, Shanyong; Yang, Lili; Peng, Chuangang; Wu, Minfei

    2018-02-01

    The aim of the present study was to investigate the risk factors for postoperative recurrence of spinal tumors by logistic regression analysis and analysis of prognostic factors. In total, 77 male and 48 female patients with spinal tumor were selected in our hospital from January, 2010 to December, 2015 and divided into the benign (n=76) and malignant groups (n=49). All the patients underwent microsurgical resection of spinal tumors and were reviewed regularly 3 months after operation. The McCormick grading system was used to evaluate the postoperative spinal cord function. Data were subjected to statistical analysis. Of the 125 cases, 63 cases showed improvement after operation, 50 cases were stable, and deterioration was found in 12 cases. The improvement rate of patients with cervical spine tumor, which reached 56.3%, was the highest. Fifty-two cases of sensory disturbance, 34 cases of pain, 30 cases of inability to exercise, 26 cases of ataxia, and 12 cases of sphincter disorders were found after operation. Seventy-two cases (57.6%) underwent total resection, 18 cases (14.4%) received subtotal resection, 23 cases (18.4%) received partial resection, and 12 cases (9.6%) were only treated with biopsy/decompression. Postoperative recurrence was found in 57 cases (45.6%). The mean recurrence time of patients in the malignant group was 27.49±6.09 months, and the mean recurrence time of patients in the benign group was 40.62±4.34 months; the difference was statistically significant. Logistic regression analysis of total resection-related factors showed that total resection should be the preferred treatment for patients with benign tumors, thoracic and lumbosacral tumors, and lower McCormick grade, as well as patients without syringomyelia and intramedullary tumors. Logistic regression analysis of recurrence-related factors revealed that the recurrence rate was relatively higher in patients with malignant, cervical, thoracic and lumbosacral, intramedullary tumors, and higher Mc
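
    A model of the kind described, relating recurrence to tumour and resection characteristics, could be fitted as sketched below. The data file, column names and covariate coding are hypothetical illustrations and do not reproduce the authors' exact variable set.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical per-patient data with binary/categorical predictors
    patients = pd.read_csv("spinal_tumours.csv")

    fit = smf.logit(
        "recurrence ~ malignant + C(tumour_site) + intramedullary"
        " + mccormick_grade + total_resection",
        data=patients,
    ).fit()

    print(fit.summary())
    print(np.exp(fit.params))  # odds ratios for each candidate risk factor
    ```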

  2. Proteome analysis demonstrates profound alterations in human dendritic cell nature by TX527, an analogue of vitamin D

    DEFF Research Database (Denmark)

    Ferreira, G. B.; van Etten, E.; Lage, K.

    2009-01-01

    Structural analogues of vitamin D have been put forward as therapeutic agents able to exploit the immunomodulatory effects of vitamin D, without its undesired calcemic side effects. We have demonstrated that TX527 affects dendritic cell (DC) maturation in vitro, resulting in the generation...... of a tolerogenic cell. In the present study, we aimed to explore the global protein changes induced by the analogue in immature DC (iDC) and mature human DC and to correlate them with alterations in DC morphology and function. Human CD14(+) monocytes were differentiated toward iDC or mature DCs, in the presence...

  3. Analysis on risk factors for post-stroke emotional incontinence

    Directory of Open Access Journals (Sweden)

    Xiao-chun ZHANG

    2018-01-01

    Full Text Available Objective: To investigate the occurrence rate and related risk factors for post-stroke emotional incontinence (PSEI). Methods: The clinical data [sex, age, body mass index (BMI), education, marital status, medical history (hypertension, heart disease, diabetes, hyperlipemia, smoking and drinking) and family history of stroke] of 162 stroke patients were recorded. Serum homocysteine (Hcy) level was examined. Head CT and/or MRI were used to determine stroke subtype, site of lesion and number of lesions. The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-Ⅴ, Chinese version) and the Hamilton Depression Rating Scale-17 Items (HAMD-17) were used to evaluate the degree of depression, and the House diagnostic standard was used to diagnose PSEI. Univariate and multivariate backward Logistic regression analysis was used to screen related risk factors for PSEI, and Spearman rank correlation analysis was used to examine the correlation between PSEI and post-stroke depression (PSD). Results: Among 162 stroke patients, 12 cases were diagnosed as PSEI (7.41%). The proportion of patients aged < 60 years in the PSEI group was significantly higher than in the non-PSEI group (P = 0.045), while the proportion of smokers in the PSEI group was significantly lower than in the non-PSEI group (P = 0.036). Univariate and multivariate backward Logistic regression analysis showed that age < 60 years was an independent risk factor for PSEI (OR = 4.000, 95% CI: 1.149-13.924; P = 0.029). Ten of the 12 PSEI patients also had PSD, giving a co-morbidity rate of PSEI and PSD of 83.33%. Spearman rank correlation analysis showed PSEI was positively related to PSD (rs = 0.305, P = 0.000). Conclusions: PSEI is a common affective disorder in stroke patients, which occurs more readily in patients under 60 years of age. DOI: 10.3969/j.issn.1672-6731.2017.12.010

  4. Mediation Analysis Demonstrates That Trans-eQTLs Are Often Explained by Cis-Mediation: A Genome-Wide Analysis among 1,800 South Asians

    Science.gov (United States)

    Pierce, Brandon L.; Tong, Lin; Chen, Lin S.; Rahaman, Ronald; Argos, Maria; Jasmine, Farzana; Roy, Shantanu; Paul-Brutus, Rachelle; Westra, Harm-Jan; Franke, Lude; Esko, Tonu; Zaman, Rakibuz; Islam, Tariqul; Rahman, Mahfuzar; Baron, John A.; Kibriya, Muhammad G.; Ahsan, Habibul

    2014-01-01

    A large fraction of human genes are regulated by genetic variation near the transcribed sequence (cis-eQTL, expression quantitative trait locus), and many cis-eQTLs have implications for human disease. Less is known regarding the effects of genetic variation on expression of distant genes (trans-eQTLs) and their biological mechanisms. In this work, we use genome-wide data on SNPs and array-based expression measures from mononuclear cells obtained from a population-based cohort of 1,799 Bangladeshi individuals to characterize cis- and trans-eQTLs and determine if observed trans-eQTL associations are mediated by expression of transcripts in cis with the SNPs showing trans-association, using Sobel tests of mediation. We observed 434 independent trans-eQTL associations at a false-discovery rate of 0.05, and 189 of these trans-eQTLs were also cis-eQTLs (a significant enrichment). For many of these associations, the cis transcript was supported as a mediator on the basis of Sobel P-values. We attempted to replicate the mediation signals in two European cohorts; while only 7 trans-eQTL associations were present in one or both cohorts, 6 showed evidence of cis-mediation. Analyses of simulated data show that complete mediation will be observed as partial mediation in the presence of mediator measurement error or imperfect LD between measured and causal variants. Our data demonstrates that trans-associations can become significantly stronger or switch directions after adjusting for a potential mediator. Using simulated data, we demonstrate that this phenomenon is expected in the presence of strong cis-trans confounding and when the measured cis-transcript is correlated with the true (unmeasured) mediator. In conclusion, by applying mediation analysis to eQTL data, we show that a substantial fraction of observed trans-eQTL associations can be explained by cis-mediation. Future studies should focus on understanding the mechanisms underlying widespread cis-mediation and their relevance to disease biology, as well as using mediation analysis to improve eQTL discovery. PMID:25474530
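
    The Sobel test used here asks whether the indirect (SNP → cis transcript → trans transcript) effect a·b differs from zero, given the two regression estimates and their standard errors. A minimal sketch of the first-order Sobel statistic is shown below; the variable names are generic and the inputs would come from the two regression fits.

    ```python
    import numpy as np
    from scipy import stats

    def sobel_test(a, se_a, b, se_b):
        """First-order Sobel z-test for mediation of an indirect effect a*b.

        a, se_a : effect of the SNP on the cis transcript and its standard error
        b, se_b : effect of the cis transcript on the trans transcript
                  (adjusted for the SNP) and its standard error
        """
        z = (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
        p = 2.0 * stats.norm.sf(abs(z))
        return z, p

    print(sobel_test(a=0.8, se_a=0.1, b=0.5, se_b=0.12))  # illustrative numbers
    ```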

  5. Demonstrations of Agency in Contemporary International Children's Literature: An Exploratory Critical Content Analysis across Personal, Social, and Cultural Dimensions

    Science.gov (United States)

    Mathis, Janelle B.

    2015-01-01

    International children's literature has the potential to create global experiences and cultural insights for young people confronted with limited and biased images of the world offered by media. The current inquiry was designed to explore, through a critical content analysis approach, international children's literature in which characters…

  6. Improved analysis of long-term monitoring data demonstrates marked regional declines of bat populations in the eastern United States

    Science.gov (United States)

    Thomas E. Ingersoll; Brent J. Sewall; Sybill K. Amelon

    2013-01-01

    Bats are diverse and ecologically important, but are also subject to a suite of severe threats. Evidence for localized bat mortality from these threats is well-documented in some cases, but long-term changes in regional populations of bats remain poorly understood. Bat hibernation surveys provide an opportunity to improve understanding, but analysis is complicated by...

  7. MOOC Success Factors: Proposal of an Analysis Framework

    Directory of Open Access Journals (Sweden)

    Margarida M. Marques

    2017-10-01

    Full Text Available Aim/Purpose: From an idea of lifelong-learning-for-all to a phenomenon affecting higher education, Massive Open Online Courses (MOOCs) can be the next step to a truly universal education. Indeed, MOOC enrolment rates can be astoundingly high; still, their completion rates are frequently disappointingly low. Nevertheless, as courses, the participants' enrolment and learning within the MOOCs must be considered when assessing their success. In this paper, the authors' aim is to reflect on what makes a MOOC successful and to propose an analysis framework of MOOC success factors. Background: A literature review was conducted to identify reported MOOC success factors and to propose an analysis framework. Methodology: This literature-based framework was tested against data from a specific MOOC and refined, within a qualitative interpretivist methodology. The data were collected from the 'As alterações climáticas nos média escolares - Clima@EduMedia' course, which was developed by the project Clima@EduMedia, and were submitted to content analysis. This MOOC aimed to support science and school media teachers in the use of media to teach climate change. Contribution: By proposing a MOOC success factors framework the authors attempt to help fill a literature gap concerning the criteria by which a specific MOOC can be considered successful. Findings: This work's major finding is a literature-based and empirically refined MOOC success factors analysis framework. Recommendations for Practitioners: The proposed framework is also a set of best practices relevant to MOOC developers, particularly when targeting teachers as potential participants. Recommendation for Researchers: This work's relevance is also based on its contribution to increasing empirical research on MOOCs. Impact on Society: By providing a proposal of a framework on factors to make a MOOC successful, the authors hope to contribute to the quality of MOOCs. Future Research: Future

  8. Inference algorithms and learning theory for Bayesian sparse factor analysis

    International Nuclear Information System (INIS)

    Rattray, Magnus; Sharp, Kevin; Stegle, Oliver; Winn, John

    2009-01-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.

  9. Inference algorithms and learning theory for Bayesian sparse factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rattray, Magnus; Sharp, Kevin [School of Computer Science, University of Manchester, Manchester M13 9PL (United Kingdom); Stegle, Oliver [Max-Planck-Institute for Biological Cybernetics, Tuebingen (Germany); Winn, John, E-mail: magnus.rattray@manchester.ac.u [Microsoft Research Cambridge, Roger Needham Building, Cambridge, CB3 0FB (United Kingdom)

    2009-12-01

    Bayesian sparse factor analysis has many applications; for example, it has been applied to the problem of inferring a sparse regulatory network from gene expression data. We describe a number of inference algorithms for Bayesian sparse factor analysis using a slab and spike mixture prior. These include well-established Markov chain Monte Carlo (MCMC) and variational Bayes (VB) algorithms as well as a novel hybrid of VB and Expectation Propagation (EP). For the case of a single latent factor we derive a theory for learning performance using the replica method. We compare the MCMC and VB/EP algorithm results with simulated data to the theoretical prediction. The results for MCMC agree closely with the theory as expected. Results for VB/EP are slightly sub-optimal but show that the new algorithm is effective for sparse inference. In large-scale problems MCMC is infeasible due to computational limitations and the VB/EP algorithm then provides a very useful computationally efficient alternative.
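
    The slab-and-spike prior referred to above makes most factor loadings exactly zero while the remainder are drawn from a broad "slab". The sketch below simulates data from such a single-factor model; it shows only the generative side, not the MCMC, VB or EP inference schemes compared in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_genes, sparsity = 200, 30, 0.2

    # Spike-and-slab loadings: zero with probability 1 - sparsity,
    # otherwise drawn from the Gaussian "slab".
    is_slab = rng.random(n_genes) < sparsity
    loadings = np.where(is_slab, rng.normal(0.0, 1.0, n_genes), 0.0)

    factor = rng.normal(size=n_samples)                 # latent factor activations
    noise = rng.normal(0.0, 0.1, (n_samples, n_genes))  # observation noise
    data = np.outer(factor, loadings) + noise           # simulated expression matrix

    print(loadings.round(2))   # most loadings are exactly zero (the "spike")
    ```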

  10. Statistical Analysis Of The Conditioning Factors Of Urban Electric Consumption

    International Nuclear Information System (INIS)

    Segura D'Rouville, Juan Joel; Suárez Carreño, Franyelit María

    2017-01-01

    This research work presents an analysis of the most important factors that condition urban residential electricity consumption, showing the quantitative parameters that condition this consumption. This sector was chosen for analysis because disaggregated information is available on the main social and technological factors that determine its behaviour and growth, with the objective of informing policies for the management of electricity consumption. Electrical demand, considered as the sum of the power of all the equipment in use at each instant of a full day, is related to electricity consumption, which is simply the power demanded by a given consumer multiplied by the time during which that demand is maintained. This report proposes the design of a probabilistic model for predicting electricity consumption, taking into account the most influential social and technological factors. The statistical processing of the database is carried out with the Stat Graphics software, version 4.1, chosen for its extensive didactic support in performing the associated calculations and methods. Finally, the correlation of the variables was computed in order to classify the determinants and thus estimate the consumption of the dwellings. (author)

  11. Analysis of risk factors of pulmonary embolism in diabetic patients

    International Nuclear Information System (INIS)

    Xie Changhui; Ma Zhihai; Zhu Lin; Chi Lianxiang

    2012-01-01

    Objective: To study the related risk factors in diabetic patients with pulmonary embolism (PE). Methods: 58 diabetic cases underwent lower-limb 99mTc-MAA vein imaging (and/or ultrasonography) and pulmonary perfusion imaging. The related laboratory data [fasting blood glucose (FBG), blood cholesterol, blood long chain triglycerides (LCT)] and clinical information [age, disease course, chest symptoms (chest pain and shortness of breath), lower-limb symptoms (swelling, varicose veins and diabetic foot) and acute complications (diabetic ketoacidosis and hyperosmolar non-ketotic diabetic coma)] were collected simultaneously. SPSS was used for χ²-tests and Logistic regression analysis. Results: (1) 28 patients (48.3%) were found to have lower-limb deep vein thrombosis (DVT), and 99mTc-MAA imaging showed PE in 10 cases (17.2%). The PE ratio in patients with DVT (32.1%) was higher than in those without DVT (3.3%) (χ² = 6.53, P < 0.05). (2) In univariate analysis, the factors significantly associated with PE had χ² ≥ 4.23 (P < 0.05), while the remaining factors had χ² ≤ 2.76 (P > 0.05), respectively. (3) Multivariate analysis indicated that the related risk factors for PE included chest symptoms (Score = 13.316, P = 0.000) and lower-limb symptoms (Score = 7.780, P = 0.005), with no significant contribution from the other factors (Score ≤ 2.494, P > 0.114), respectively. Conclusion: Severe DM with chest symptoms, lower-limb symptoms and/or DVT must be controlled as early as possible by all kinds of treatment; this will decrease the PE complication rate. (authors)

  12. Proteomics Analysis Reveals Previously Uncharacterized Virulence Factors in Vibrio proteolyticus

    Directory of Open Access Journals (Sweden)

    Ann Ray

    2016-07-01

    Full Text Available Members of the genus Vibrio include many pathogens of humans and marine animals that share genetic information via horizontal gene transfer. Hence, the Vibrio pan-genome carries the potential to establish new pathogenic strains by sharing virulence determinants, many of which have yet to be characterized. Here, we investigated the virulence properties of Vibrio proteolyticus, a Gram-negative marine bacterium previously identified as part of the Vibrio consortium isolated from diseased corals. We found that V. proteolyticus causes actin cytoskeleton rearrangements followed by cell lysis in HeLa cells in a contact-independent manner. In search of the responsible virulence factor involved, we determined the V. proteolyticus secretome. This proteomics approach revealed various putative virulence factors, including active type VI secretion systems and effectors with virulence toxin domains; however, these type VI secretion systems were not responsible for the observed cytotoxic effects. Further examination of the V. proteolyticus secretome led us to hypothesize and subsequently demonstrate that a secreted hemolysin, belonging to a previously uncharacterized clan of the leukocidin superfamily, was the toxin responsible for the V. proteolyticus-mediated cytotoxicity in both HeLa cells and macrophages. Clearly, there remains an armory of yet-to-be-discovered virulence factors in the Vibrio pan-genome that will undoubtedly provide a wealth of knowledge on how a pathogen can manipulate host cells.

  13. Analysis of psychological factors which interfere in soccer athletes’ behaviour

    Directory of Open Access Journals (Sweden)

    Constanza Pujals

    2008-06-01

    Full Text Available The aim of this study is to analyse the psychological factors that interfere with the behaviour of soccer athletes in the juvenile and infant categories. Forty athletes from a soccer school in Maringá – PR were studied, and the instruments used were inventories, interviews, questionnaires and a research diary. Data were collected individually and in groups. The intervention lasted 12 months and, through observation and evaluation, addressed the following factors: motivation, anxiety, aggression and self-confidence. The results showed that the positive emotions expressed by the athletes were good mood, happiness, relaxation, interest in improving and hope, while the negative emotions were anxiety, rage, aggressiveness, low self-confidence, lack of motivation, insecurity, feelings of failure, pessimism and group instability. Relatives and the coach were also generating factors of stress and anxiety. This sporting context therefore shows that sports psychology seems to be highly efficient in reducing anxiety and aggression and in increasing motivation and self-confidence, demonstrating the importance of psychological preparation in sports training.

  14. Contextual risk factors for low birth weight: a multilevel analysis.

    Directory of Open Access Journals (Sweden)

    Gbenga A Kayode

    Full Text Available Low birth weight (LBW remains to be a leading cause of neonatal death and a major contributor to infant and under-five mortality. Its prevalence has not declined in the last decade in sub-Saharan Africa (SSA and Asia. Some individual level factors have been identified as risk factors for LBW but knowledge is limited on contextual risk factors for LBW especially in SSA.Contextual risk factors for LBW in Ghana were identified by performing multivariable multilevel logistic regression analysis of 6,900 mothers dwelling in 412 communities that participated in the 2003 and 2008 Demographic and Health Surveys in Ghana.Contextual-level factors were significantly associated with LBW: Being a rural dweller increased the likelihood of having a LBW infant by 43% (OR 1.43; 95% CI 1.01-2.01; P-value <0.05 while living in poverty-concentrated communities increased the risk of having a LBW infant twofold (OR 2.16; 95% CI 1.29-3.61; P-value <0.01. In neighbourhoods with a high coverage of safe water supply the odds of having a LBW infant reduced by 28% (OR 0.74; 95% CI 0.57-0.96; P-value <0.05.This study showed contextual risk factors to have independent effects on the prevalence of LBW infants. Being a rural dweller, living in a community with a high concentration of poverty and a low coverage of safe water supply were found to increase the prevalence of LBW infants. Implementing appropriate community-based intervention programmes will likely reduce the occurrence of LBW infants.

  15. Ranking factors of an investment in cogeneration: sensitivity analysis ranking the technical and economical factors

    International Nuclear Information System (INIS)

    Sundberg, Gunnel

    2001-01-01

    A deregulation of the electricity market in Europe will result in increased competition among the power-producing companies. They will therefore carefully estimate the financial risk in an investment in new power-producing capability. One part of the risk assessment is to perform a sensitivity analysis. This paper presents a sensitivity analysis using factorial design, resulting in an assessment of the most important technical and economical factors affecting an investment in a gas turbine combined cycle and a steam cycle fired by wood chips. The study is performed using a simulation model that optimises the operation of existing power plants and potential new investments to fulfil the desired heat demand. The local utility system analysed is a Swedish district heating system with a heat demand of 655 GWh per year. The conclusion is that to understand which of the technical and economical factors affect the investment, it is not sufficient to investigate the parameters of the studied plant; the parameters related to the competing plants must also be considered. Both the individual effects of the factors and the effect of their interaction should be investigated. For the energy system studied, the price of natural gas, the price of wood chips and the investment cost have the major influence on the profitability of the investment. (Author)

  16. Technical factors that affect anastomotic integrity following esophagectomy: systematic review and meta-analysis.

    Science.gov (United States)

    Markar, Sheraz R; Arya, Shobhit; Karthikesalingam, Alan; Hanna, George B

    2013-12-01

    Due to the significant contribution of anastomotic leak, with its disastrous consequences to patient morbidity and mortality, multiple parameters have been proposed and individually meta-analyzed for the formation of the ideal esophagogastric anastomosis following cancer resection. The purpose of this pooled analysis was to examine the main technical parameters that impact on anastomotic integrity. Medline, Embase, trial registries, and conference proceedings were searched. Technical factors evaluated included hand-sewn versus stapled esophagogastric anastomosis (EGA), cervical versus thoracic EGA, minimally invasive versus open esophagectomy, anterior versus posterior route of reconstruction and ischemic conditioning of the gastric conduit. The outcome of interest was the incidence of anastomotic leak, for which pooled odds ratios were calculated for each technical factor. No significant difference in the incidence of anastomotic leak was demonstrated for the following technical factors: hand-sewn versus stapled EGA, minimally invasive versus open esophagectomy, anterior versus posterior route of reconstruction and ischemic conditioning of the gastric conduit. Four randomized, controlled trials comprising 298 patients were included that compared cervical and thoracic EGA. Anastomotic leak was seen more commonly in the cervical group (13.64 %) than in the thoracic group (2.96 %). Pooled analysis demonstrated a significantly increased incidence of anastomotic leak in the cervical group (pooled odds ratio = 4.73; 95 % CI 1.61-13.9; P = 0.005). A tailored surgical approach to the patient's physiology and esophageal cancer stage is the most important factor that influences anastomotic integrity after esophagectomy.
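
    For illustration, pooled odds ratios of the kind reported can be computed by inverse-variance weighting of the per-trial log odds ratios. The sketch below uses a simple fixed-effect scheme with made-up 2x2 counts; the review's actual pooling method (for example Mantel-Haenszel or random effects) may differ.

    ```python
    import numpy as np

    def pooled_odds_ratio(events_a, n_a, events_b, n_b):
        """Fixed-effect (inverse-variance) pooled OR with a 95% CI.

        events_a, n_a : leaks and totals in the first arm of each trial
        events_b, n_b : leaks and totals in the second arm of each trial
        """
        a = np.asarray(events_a, float); b = np.asarray(n_a, float) - a
        c = np.asarray(events_b, float); d = np.asarray(n_b, float) - c
        log_or = np.log((a * d) / (b * c))
        weights = 1.0 / (1/a + 1/b + 1/c + 1/d)   # inverse of each log-OR variance
        pooled = np.sum(weights * log_or) / np.sum(weights)
        se = np.sqrt(1.0 / np.sum(weights))
        return np.exp([pooled, pooled - 1.96 * se, pooled + 1.96 * se])

    # Hypothetical counts for four trials (cervical vs. thoracic anastomosis)
    print(pooled_odds_ratio([10, 8, 6, 7], [40, 35, 38, 36],
                            [2, 3, 1, 2], [42, 34, 37, 36]))
    ```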

  17. Factors influencing societal response of nanotechnology: an expert stakeholder analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, Nidhi, E-mail: nidhi.gupta@wur.nl; Fischer, Arnout R. H., E-mail: arnout.fischer@wur.nl; Lans, Ivo A. van der, E-mail: Ivo.vanderLans@wur.nl [Wageningen University, Marketing and Consumer Behaviour Group (Netherlands); Frewer, Lynn J., E-mail: lynn.frewer@newcastle.ac.uk [Newcastle University, School of Agriculture, Food and Rural Development (United Kingdom)

    2012-05-15

    Nanotechnology can be described as an emerging technology and, as has been the case with other emerging technologies such as genetic modification, different socio-psychological factors will potentially influence societal responses to its development and application. These factors will play an important role in how nanotechnology is developed and commercialised. This article aims to identify expert opinion on factors influencing societal response to applications of nanotechnology. Structured interviews with experts on nanotechnology from North West Europe were conducted using repertory grid methodology in conjunction with generalized Procrustes analysis to examine the psychological constructs underlying societal uptake of 15 key applications of nanotechnology drawn from different areas (e.g. medicine, agriculture and environment, chemical, food, military, sports, and cosmetics). Based on expert judgement, the main factors influencing societal response to different applications of nanotechnology will be the extent to which applications are perceived to be beneficial, useful, and necessary, and how 'real' and physically close to the end-user these applications are perceived to be by the public.

  18. Exploring leadership styles for innovation: an exploratory factor analysis

    Directory of Open Access Journals (Sweden)

    Wipulanusat Warit

    2017-03-01

    Full Text Available Leadership plays a vital role in building the process, structures, and climate for an organisation to become innovative and to motivate team expectations toward innovations. This study explores the leadership styles that engineers regard as significant for innovation in the public sector. Exploratory factor analysis (EFA was conducted to identify the principal leadership styles influencing innovation in the Australian Public Service (APS, using survey data extracted from the 2014 APS employee census comprising 3 125 engineering professionals in Commonwealth of Australia departments. EFA returned a two-factor structure explaining 77.6% of the variance of the leadership for innovation construct. In this study, the results from the EFA provided a clear estimation of the factor structure of the measures for leadership for innovation. From the results, the two factors extracted were transformational leadership and consideration leadership. In transformational leadership, a leader values organisational objectives, inspires subordinates to perform, and motivates followers beyond expected levels of work standards. Consideration leadership refers to the degree to which a leader shows concern and expressions of support for subordinates, takes care of their welfare, treats members as equals, and displays warmth and approachability. These findings highlight the role of leadership as the most critical predictor when considering the degree to which subordinates strive for creativity and innovation. Both transformational and consideration leadership styles are recommended to be incorporated into management training and development programs. This study also recommends that Commonwealth departments recruit supervisors who have both of these leadership styles before implementing innovative projects.
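
    An exploratory factor analysis of this form (two factors, varimax rotation) can be reproduced with the factor_analyzer package, as sketched below; the item file and column names are hypothetical stand-ins for the census items.

    ```python
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    # Hypothetical item-level responses (rows: respondents, columns: survey items)
    items = pd.read_csv("aps_leadership_items.csv")

    fa = FactorAnalyzer(n_factors=2, rotation="varimax")
    fa.fit(items)

    loadings = pd.DataFrame(fa.loadings_, index=items.columns,
                            columns=["transformational", "consideration"])
    print(loadings.round(2))

    variance, proportion, cumulative = fa.get_factor_variance()
    print(cumulative[-1])   # total variance explained (the study reports 77.6%)
    ```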

  19. Factors influencing societal response of nanotechnology: an expert stakeholder analysis

    International Nuclear Information System (INIS)

    Gupta, Nidhi; Fischer, Arnout R. H.; Lans, Ivo A. van der; Frewer, Lynn J.

    2012-01-01

    Nanotechnology can be described as an emerging technology and, as has been the case with other emerging technologies such as genetic modification, different socio-psychological factors will potentially influence societal responses to its development and application. These factors will play an important role in how nanotechnology is developed and commercialised. This article aims to identify expert opinion on factors influencing societal response to applications of nanotechnology. Structured interviews with experts on nanotechnology from North West Europe were conducted using repertory grid methodology in conjunction with generalized Procrustes analysis to examine the psychological constructs underlying societal uptake of 15 key applications of nanotechnology drawn from different areas (e.g. medicine, agriculture and environment, chemical, food, military, sports, and cosmetics). Based on expert judgement, the main factors influencing societal response to different applications of nanotechnology will be the extent to which applications are perceived to be beneficial, useful, and necessary, and how 'real' and physically close to the end-user these applications are perceived to be by the public.

  20. Factor analysis of the cost of preparing oil

    Energy Technology Data Exchange (ETDEWEB)

    Avdeyeva, L A; Kudoyarov, G Sh; Shmatova, M F

    1979-01-01

    Mathematical statistics methods (basically correlation and regression analysis) are used to study the factors which form the level of the cost of preparing oil, taking into account the mutual influence of the factors. A group of five a priori justified factors was selected for inclusion in the mathematical model: the water content of the oil being extracted (%); the specific expenditure of demulsifiers; the volume of oil preparation; the quality of oil preparation (the salt content); and the level of use of the installations' capacities (%). To construct an economic and mathematical model of the cost of the technical preparation (SPP) of the oil, all the unions which make up the Ministry of the Oil Industry were divided into two comparable totalities. The first group included unions in which the oil SPP was lower than the branch average, and the second, unions in which the SPP was higher than the branch-wide cost. Using the coefficients of regression, special elasticity coefficients and the fluctuation indicators, the basic factors were finally identified which have the greatest influence on the formation of the oil SPP level, separately for the first and second groups of unions.

  1. Factors influencing societal response of nanotechnology: an expert stakeholder analysis

    Science.gov (United States)

    Gupta, Nidhi; Fischer, Arnout R. H.; van der Lans, Ivo A.; Frewer, Lynn J.

    2012-05-01

    Nanotechnology can be described as an emerging technology and, as has been the case with other emerging technologies such as genetic modification, different socio-psychological factors will potentially influence societal responses to its development and application. These factors will play an important role in how nanotechnology is developed and commercialised. This article aims to identify expert opinion on factors influencing societal response to applications of nanotechnology. Structured interviews with experts on nanotechnology from North West Europe were conducted using repertory grid methodology in conjunction with generalized Procrustes analysis to examine the psychological constructs underlying societal uptake of 15 key applications of nanotechnology drawn from different areas (e.g. medicine, agriculture and environment, chemical, food, military, sports, and cosmetics). Based on expert judgement, the main factors influencing societal response to different applications of nanotechnology will be the extent to which applications are perceived to be beneficial, useful, and necessary, and how 'real' and physically close to the end-user these applications are perceived to be by the public.

  2. A Confirmatory Factor Analysis of the Structure of Abbreviated Math Anxiety Scale

    Directory of Open Access Journals (Sweden)

    Farahman Farrokhi

    2011-06-01

    Full Text Available Objective: The aim of this study is to explore the confirmatory factor analysis results of the Persian adaptation of the Abbreviated Math Anxiety Scale (AMAS), proposed by Hopko, Mahadevan, Bare & Hunt. Method: The validity and reliability assessments of the scale were performed on 298 college students chosen randomly from Tabriz University in Iran. Confirmatory factor analysis (CFA) was carried out to determine the factor structure of the Persian version of the AMAS. Results: As expected, the two-factor solution provided a better fit to the data than a single factor. Moreover, multi-group analyses showed that this two-factor structure was invariant across sex. Hence, the AMAS provides an equally valid measure for use among college students. Conclusions: The brief AMAS demonstrates adequate reliability and validity. AMAS scores can be used to compare symptoms of math anxiety between male and female students. The study both expands and adds support to the existing body of math anxiety literature.
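
    A two-factor CFA of this kind could be specified in Python with, for example, the semopy package, as sketched below. The split of the nine items across the learning and evaluation anxiety factors is shown only schematically, and the file and variable names are hypothetical.

    ```python
    import pandas as pd
    import semopy

    items = pd.read_csv("amas_items.csv")   # hypothetical item-level responses amas1..amas9

    model_desc = """
    LMA =~ amas1 + amas2 + amas3 + amas4 + amas5
    MEA =~ amas6 + amas7 + amas8 + amas9
    """

    model = semopy.Model(model_desc)
    model.fit(items)

    print(model.inspect())            # loadings, variances and factor covariance
    print(semopy.calc_stats(model))   # fit indices such as CFI and RMSEA
    ```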

  3. Path analysis of risk factors leading to premature birth.

    Science.gov (United States)

    Fields, S J; Livshits, G; Sirotta, L; Merlob, P

    1996-01-01

    The present study tested whether various sociodemographic, anthropometric, behavioral, and medical/physiological factors act in a direct or indirect manner on the risk of prematurity, using path analysis on a sample of Israeli births. The path model shows that medical complications, primarily toxemia, chorioamnionitis, and a previous low birth weight delivery, act directly and significantly on the risk of prematurity, as do low maternal pregnancy weight gain and ethnicity. Other medical complications, including chronic hypertension, preeclampsia, and placental abruption, although significantly correlated with prematurity, act indirectly on prematurity through toxemia. The model further shows that the commonly accepted sociodemographic, anthropometric, and behavioral risk factors act by modifying the development of medical complications that lead to prematurity, as opposed to having a direct effect on premature delivery. © 1996 Wiley-Liss, Inc.

  4. Pareto analysis of critical factors affecting technical institution evaluation

    Directory of Open Access Journals (Sweden)

    Victor Gambhir

    2012-08-01

    Full Text Available With the change of education policy in 1991, more and more technical institutions have been set up in India. Some of these institutions provide quality education, while others merely concentrate on quantity. Stakeholders are therefore confused about how to select the best institute for their higher educational studies. Although various agencies, including the print media, publish rankings of these institutions every year, their results are controversial and biased. In this paper, the authors identify the critical factors for technical institution evaluation from a literature survey. A Pareto analysis has also been performed to find the intensity of these critical factors in the evaluation. This will not only help stakeholders take the right decisions but will also help the management of institutions benchmark and identify the most important critical areas in which to improve the existing system. This will in turn help the Indian economy.
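
    As a concrete illustration of the Pareto step, the short sketch below ranks a handful of invented evaluation factors by weight and flags the "vital few" that account for roughly 80% of the total; the factor labels and counts are made up and are not taken from the paper.

    ```python
    # Toy Pareto analysis: rank factors and find the ~80% cumulative cut-off.
    import pandas as pd

    weights = pd.Series(
        {"faculty quality": 48, "placements": 35, "infrastructure": 22,
         "fees": 12, "location": 8, "alumni network": 5},
        name="weight",
    ).sort_values(ascending=False)

    pareto = weights.to_frame()
    pareto["cum_pct"] = 100 * pareto["weight"].cumsum() / pareto["weight"].sum()
    print(pareto)

    vital_few = pareto[pareto["cum_pct"] <= 80].index.tolist()
    print("vital few:", vital_few)
    ```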

  5. Multivariate factor analysis of Girgentana goat milk composition

    Directory of Open Access Journals (Sweden)

    Pietro Giaccone

    2010-01-01

    Full Text Available The interpretation of the several variables that contribute to defining milk quality is difficult due to the high degree of correlation among them. In this case, one of the best methods of statistical processing is factor analysis, which belongs to the group of multivariate methods; this particular statistical approach was employed for our study. A total of 1485 individual goat milk samples from 117 Girgentana goats were collected fortnightly from January to July and analysed for physical and chemical composition and clotting properties. Milk pH and titratable acidity were within the normal range for fresh goat milk. Morning milk yield was 704 ± 323 g, with fat and protein percentages of 3.93 ± 1.23% and 3.48 ± 0.38%, respectively. The milk urea content was 43.70 ± 8.28 mg/dl. The clotting ability of Girgentana milk was quite good, with a renneting time of 16.96 ± 3.08 minutes, a rate of curd formation of 2.01 ± 1.63 minutes and a curd firmness of 25.08 ± 7.67 millimetres. Factor analysis was performed by applying orthogonal axis rotation (rotation type VARIMAX); the analysis grouped the milk components into three latent or common factors. The first, which explained 51.2% of the total covariance, was defined as "slow milks", because it was linked to r and pH. The second latent factor, which explained 36.2% of the total covariance, was defined as "milk yield", because it is positively correlated with the morning milk yield and the urea content, whilst negatively correlated with the fat percentage. The third latent factor, which explained 12.6% of the total covariance, was defined as "curd firmness", because it is linked to protein percentage, a30 and titratable acidity. With the aim of evaluating the influence of environmental effects (stage of kidding, parity and type of kidding), factor scores were analysed with the mixed linear model. Results showed significant effects of the season of
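
    A varimax-rotated factor analysis of this kind can be sketched with the Python factor_analyzer package, as below; the column names and the input file are placeholders, and the three-factor choice simply follows the abstract.

    ```python
    # Hedged sketch of a three-factor, varimax-rotated factor analysis.
    # "girgentana_milk.csv" and its column names are assumed, not the authors' data.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    milk = pd.read_csv("girgentana_milk.csv")    # e.g., pH, acidity, fat, protein, urea, r, a30
    fa = FactorAnalyzer(n_factors=3, rotation="varimax")
    fa.fit(milk)

    loadings = pd.DataFrame(fa.loadings_, index=milk.columns,
                            columns=["factor1", "factor2", "factor3"])
    print(loadings.round(2))
    print("proportion of variance per factor:", fa.get_factor_variance()[1].round(3))
    ```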

  6. Spinal appearance questionnaire: factor analysis, scoring, reliability, and validity testing.

    Science.gov (United States)

    Carreon, Leah Y; Sanders, James O; Polly, David W; Sucato, Daniel J; Parent, Stefan; Roy-Beaudry, Marjolaine; Hopkins, Jeffrey; McClung, Anna; Bratcher, Kelly R; Diamond, Beverly E

    2011-08-15

    Cross-sectional. This study presents the factor analysis of the Spinal Appearance Questionnaire (SAQ) and its psychometric properties. Although the SAQ has been administered to a large sample of patients with adolescent idiopathic scoliosis (AIS) treated surgically, its psychometric properties have not been fully evaluated. This study presents the factor analysis and scoring of the SAQ and evaluates its psychometric properties. The SAQ and the Scoliosis Research Society-22 (SRS-22) were administered to AIS patients who were being observed, braced, or scheduled for surgery. Standard demographic data and radiographic measures, including Lenke type and curve magnitude, were also collected. Of the 1802 patients, 83% were female, with a mean age of 14.8 years and a mean initial Cobb angle of 55.8° (range, 0°-123°). From the 32 items of the SAQ, 15 loaded on two factors with consistent and significant correlations across all Lenke types: an Appearance factor (items 1-10) and an Expectations factor (items 12-15). Responses are summed, giving a range of 5 to 50 for the Appearance domain and 5 to 20 for the Expectations domain. Cronbach's α was 0.88 for both domains and the Total score, with a test-retest reliability of 0.81 for Appearance and 0.91 for Expectations. Correlations with major curve magnitude were higher for the SAQ Appearance and SAQ Total scores than for the SRS Appearance and SRS Total scores. The SAQ and SRS-22 scores were statistically significantly different in patients who were scheduled for surgery compared with those who were observed or braced. The SAQ is a valid measure of self-image in patients with AIS, with greater correlation to curve magnitude than the SRS Appearance and Total scores. It also discriminates patients who require surgery from those who do not.

  7. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    Science.gov (United States)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.

  8. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    Science.gov (United States)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena. Factor analysis has widely been used in order to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset of 804 soil samples, collected from a mining area in central Iran in the search for MVT-type Pb-Zn deposits, was used to compare different implementations of factor analysis. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset, after opening the data with an additive logratio (alr) transformation, in order to extract the mineralization factor. A comparison between these methods indicated that sequential factor analysis most clearly revealed the MVT paragenesis elements in surface samples, with nearly 50% of the variation in F1. In addition, staged factor analysis gave acceptable results while being easy to apply. It could detect mineralization-related elements while assigning larger factor loadings to these elements, resulting in a clearer expression of the mineralization.

  9. Optical interconnects for in-plane high-speed signal distribution at 10 Gb/s: Analysis and demonstration

    Science.gov (United States)

    Chang, Yin-Jung

    With decreasing transistor size, increasing chip speed, and larger numbers of processors in a system, the performance of a module/system is being limited by off-chip and off-module bandwidth-distance products. Optical links have moved from fiber-based long-distance communications to the cabinet level of 1 m-100 m, and recently to the backplane level (10 cm-1 m). Board-level inter-chip parallel optical interconnects have recently been demonstrated by researchers from Intel, IBM, Fujitsu, NTT and a few university research groups. However, the board-level signal/clock distribution function using optical interconnects, the lightwave circuits, the system design, and a practically convenient integration scheme for implementing a system prototype have not been explored or carefully investigated. In this dissertation, the development of a board-level 1 x 4 optical-to-electrical signal distribution at 10 Gb/s is presented. In contrast to other prototypes demonstrating board-level parallel optical interconnects, which have been drawing much attention for the past decade, the optical link design for high-speed signal broadcasting is even more complicated, and the pitch between receivers may vary, as opposed to the fixed-pitch designs widely used in parallel optical interconnects. New challenges for board-level high-speed signal broadcasting include, but are not limited to, a new optical link design, a lightwave circuit as a distribution network, and a novel integration scheme that may be a radical departure from the traditional assembly method. One of the key building blocks in the lightwave circuit is the distribution network, in which a 1 x 4 multimode interference (MMI) splitter is employed. MMI devices operating at high data rates are important in board-level optical interconnects and need to be characterized for the application of board-level signal broadcasting. To determine the speed limitations of MMI devices, the

  10. A Retrospective Analysis of Factors Affecting Early Stoma Complications.

    Science.gov (United States)

    Koc, Umit; Karaman, Kerem; Gomceli, Ismail; Dalgic, Tahsin; Ozer, Ilter; Ulas, Murat; Ercan, Metin; Bostanci, Erdal; Akoglu, Musa

    2017-01-01

    Despite advances in surgical techniques and products for stoma care, stoma-related complications are still common. A retrospective analysis was performed of the medical records of 462 consecutive patients (295 [63.9%] female, 167 [36.1%] male, mean age 55.5 ± 15.1 years, mean body mass index [BMI] 25.1 ± 5.2) who had undergone stoma creation at the Gastroenterological Surgery Clinic of Turkiye Yuksek İhtisas Teaching and Research Hospital between January 2008 and December 2012 to examine the incidence of early (ie, within 30 days after surgery) stoma complications and identify potential risk factors. Variables abstracted included gender, age, and BMI; existence of malignant disease; comorbidities (diabetes mellitus, hypertension, coronary artery disease, chronic respiratory disease); use of neoadjuvant chemoradiotherapy; permanent or temporary stoma; type of stoma (loop/end stoma); stoma localization; and the use of preoperative marking of the stoma site. Data were entered and analyzed using statistical software. Descriptive statistics, chi-squared, and Mann-Whitney U tests were used to describe and analyze all variables, and logistic regression analysis was used to determine independent risk factors for stoma complications. Ostomy-related complications developed in 131 patients (28.4%). Of these, superficial mucocutaneous separation was the most frequent complication (90 patients, 19.5%), followed by stoma retraction (15 patients, 3.2%). In univariate analysis, malignant disease (P = .025), creation of a colostomy (P = .002), and left lower quadrant stoma location were significantly associated with the development of a stoma complication. Only stoma location was an independent risk factor for the development of a stoma complication (P = .044). The rate of stoma complications was not significantly different between patients who underwent nonemergent surgery (30% in patients preoperatively sited versus 28.4% not sited) and patients who underwent emergency surgery (27.1%). Early stoma complication rates were higher
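
    The univariate screening followed by logistic regression described above corresponds to a standard workflow; a minimal hedged sketch is given below, with hypothetical column names and data file.

    ```python
    # Sketch of a logistic regression for independent risk factors of early stoma
    # complications. Variable names and "stoma_patients.csv" are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("stoma_patients.csv")      # one row per patient
    X = sm.add_constant(df[["malignant_disease", "colostomy", "left_lower_quadrant"]])
    y = df["early_complication"]                # 1 = complication within 30 days

    fit = sm.Logit(y, X).fit()
    print(fit.summary())
    print("odds ratios:\n", np.exp(fit.params).round(2))
    ```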

  11. Replica Analysis for Portfolio Optimization with Single-Factor Model

    Science.gov (United States)

    Shinzato, Takashi

    2017-06-01

    In this paper, we use replica analysis to investigate the influence of correlation among the return rates of assets on the solution of the portfolio optimization problem. We consider the behavior of the optimal solution for the case where the return rate is described by a single-factor model and compare the findings obtained from our proposed method for correlated return rates with those obtained for independent return rates. We then analytically assess the increase in the investment risk when correlation is included. Furthermore, we also compare our approach with analytical procedures from operations research for minimizing the investment risk.
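
    The replica calculation itself is analytic, but its setting is easy to probe numerically. The sketch below, with arbitrary illustrative parameters, generates returns from a single-factor model and compares the minimum-variance portfolio risk with and without the common factor.

    ```python
    # Numerical companion (not the replica analysis itself): minimum-variance risk
    # for single-factor vs. independent return rates. Parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 200, 1000                              # assets, return observations
    beta = rng.normal(1.0, 0.3, size=N)           # factor loadings

    def min_variance_risk(returns):
        C = np.cov(returns)                       # N x N sample covariance
        w = np.linalg.solve(C, np.ones(N))
        w /= w.sum()                              # fully invested minimum-variance weights
        return float(w @ C @ w)

    common_factor = rng.normal(size=T)
    idiosyncratic = rng.normal(size=(N, T))

    print("risk with common factor:", min_variance_risk(beta[:, None] * common_factor + idiosyncratic))
    print("risk, independent assets:", min_variance_risk(idiosyncratic))
    ```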

  12. Meta-analysis of the predictive factors of postpartum fatigue.

    Science.gov (United States)

    Badr, Hanan A; Zauszniewski, Jaclene A

    2017-08-01

    Nearly 64% of new mothers are affected by fatigue during the postpartum period, making it the most common problem that a woman faces as she adapts to motherhood. Postpartum fatigue can lead to serious negative effects on the mother's health and the newborn's development and interfere with mother-infant interaction. The aim of this meta-analysis was to identify predictive factors of postpartum fatigue and to document the magnitude of their effects using effect sizes. We used two search engines, PubMed and Google Scholar, to identify studies that met three inclusion criteria: (a) the article was written in English, (b) the article studied the predictive factors of postpartum fatigue, and (c) the article included information about the validity and reliability of the instruments used in the research. Nine articles met these inclusion criteria. The direction and strength of correlation coefficients between predictive factors and postpartum fatigue were examined across the studies to determine their effect sizes. Measurement of predictor variables occurred from 3 days to 6 months postpartum. Correlations reported between predictive factors and postpartum fatigue were as follows: small effect size (r range = 0.10 to 0.29) for education level, age, postpartum hemorrhage, infection, and child care difficulties; medium effect size (r range = 0.30 to 0.49) for physiological illness, low ferritin level, low hemoglobin level, sleeping problems, stress and anxiety, and breastfeeding problems; and large effect size (r range = 0.50+) for depression. Postpartum fatigue is a common condition that can lead to serious health problems for a new mother and her newborn. Therefore, increased knowledge concerning factors that influence the onset of postpartum fatigue is needed for early identification of new mothers who may be at risk. Appropriate treatments, interventions, information, and support can then be initiated to prevent or minimize the postpartum fatigue. Copyright © 2017 Elsevier
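
    The effect-size bands quoted above come from pooling correlation coefficients across studies; a toy Fisher-z pooling is sketched below with invented r values and sample sizes.

    ```python
    # Toy pooling of per-study correlations via Fisher's z transform.
    # The r values and sample sizes are invented for illustration only.
    import numpy as np

    r = np.array([0.52, 0.61, 0.47, 0.55])   # per-study correlations with postpartum fatigue
    n = np.array([120, 85, 150, 60])         # per-study sample sizes

    z = np.arctanh(r)                        # Fisher z transform
    w = n - 3                                # inverse-variance weights
    r_pooled = np.tanh(np.sum(w * z) / np.sum(w))

    print("pooled r:", round(r_pooled, 3))   # r >= 0.50 would fall in the 'large effect' band
    ```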

  13. Immunohistochemical Analysis Using Antipodocalyxin Monoclonal Antibody PcMab-47 Demonstrates Podocalyxin Expression in Oral Squamous Cell Carcinomas.

    Science.gov (United States)

    Itai, Shunsuke; Yamada, Shinji; Kaneko, Mika K; Harada, Hiroyuki; Kato, Yukinari

    2017-10-01

    Podocalyxin is a CD34-related type I transmembrane protein that is highly glycosylated with N-glycan, O-glycan, and keratan sulfate. Podocalyxin was originally found in the podocytes of rat kidney and is reportedly expressed in many types of tumors, including brain tumors, colorectal cancers, and breast cancers. Overexpression of podocalyxin is an independent predictor of progression, metastasis, and poor outcome. We recently immunized mice with recombinant human podocalyxin, which was produced using LN229 glioblastoma cells, and produced a novel antipodocalyxin monoclonal antibody (mAb), PcMab-47, which reacts with endogenous podocalyxin-expressing cancer cell lines and normal cell lines independent of glycosylation in Western blot, flow cytometry, and immunohistochemical analyses. In this study, we performed immunohistochemical analysis of oral cancers using PcMab-47. PcMab-47 stained oral squamous cell carcinoma cells in a cytoplasmic pattern and detected 26/38 (68.4%) of oral squamous cell carcinomas on tissue microarrays. These results indicate that PcMab-47 is useful for the immunohistochemical detection of podocalyxin in oral cancers.

  14. Information technology portfolio in supply chain management using factor analysis

    Directory of Open Access Journals (Sweden)

    Ahmad Jaafarnejad

    2013-11-01

    Full Text Available The adoption of information technology (IT) along with supply chain management (SCM) has increasingly become a necessity among most businesses. This enhances supply chain (SC) performance and helps companies achieve organizational competitiveness. IT systems capture and analyze information and enable management to make decisions by considering a global scope across the entire SC. This paper reviews the existing literature on IT in SCM and considers pertinent criteria. Using the principal component analysis (PCA) method of factor analysis (FA), a number of related criteria are divided into smaller groups. Finally, SC managers can develop an IT portfolio in SCM using the mean values of the few extracted components on the relevance-emergency matrix. A numerical example is provided to explain the details of the proposed method.

  15. Analysis of Factors Affecting Inflation in Indonesia: an Islamic Perspective

    Directory of Open Access Journals (Sweden)

    Elis Ratna Wulan

    2015-04-01

    Full Text Available This study aims to determine the factors affecting inflation. The research is descriptive and quantitative in nature. The data used are the reported exchange rate, interest rate, money supply and inflation during 2008-2012. The research data were analyzed using multiple linear regression analysis. The results showed that over the years 2008-2012 the condition of each variable was as follows: (1) the rate of inflation had a negative trend, (2) the interest rate had a negative trend, (3) the money supply had a positive trend, and (4) the exchange rate had a positive trend. Multiple linear regression analysis showed that the interest rate, the money supply and the rupiah exchange rate have a significant effect on the rate of inflation.
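
    The multiple linear regression described above is a one-liner in most statistical packages; the sketch below assumes a hypothetical monthly dataset containing the three predictors and the inflation rate.

    ```python
    # Sketch of the inflation regression; the CSV layout and column names are assumed.
    import pandas as pd
    import statsmodels.api as sm

    macro = pd.read_csv("indonesia_macro_2008_2012.csv")
    X = sm.add_constant(macro[["interest_rate", "money_supply", "exchange_rate"]])
    y = macro["inflation"]

    ols = sm.OLS(y, X).fit()
    print(ols.summary())      # coefficients, t statistics and R^2 for the three predictors
    ```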

  16. Clinical usefulness of physiological components obtained by factor analysis

    International Nuclear Information System (INIS)

    Ohtake, Eiji; Murata, Hajime; Matsuda, Hirofumi; Yokoyama, Masao; Toyama, Hinako; Satoh, Tomohiko.

    1989-01-01

    The clinical usefulness of physiological components obtained by factor analysis was assessed in 99mTc-DTPA renography. Using the physiological components thus defined, other dynamic data could be analyzed. In this paper, the dynamic renal function after ESWL (extracorporeal shock wave lithotripsy) treatment was examined using physiological components from the kidney before ESWL and/or a normal kidney. The change in renal function could easily be evaluated with this method. The usefulness of this new analysis using physiological components is summarized as follows: 1) the change in a dynamic function could be quantitatively assessed as a change in the contribution ratio; 2) the change in a diseased condition could be morphologically evaluated as a change in the functional image. (author)

  17. Constructing the Japanese version of the Maslach Burnout Inventory-Student Survey: Confirmatory factor analysis.

    Science.gov (United States)

    Tsubakita, Takashi; Shimazaki, Kazuyo

    2016-01-01

    To examine the factorial validity of the Maslach Burnout Inventory-Student Survey, using a sample of 2061 Japanese university students majoring in the medical and natural sciences (67.9% male, 31.8% female; mean age = 19.6 years, standard deviation = 1.5). The back-translated scale used unreversed items to assess inefficacy. The inventory's descriptive properties and Cronbach's alphas were calculated using SPSS software. The present authors compared fit indices of the null, one-factor, and default three-factor models via confirmatory factor analysis with maximum-likelihood estimation using AMOS software, version 21.0. Intercorrelations between exhaustion, cynicism, and inefficacy were relatively higher than in prior studies. Cronbach's alphas were 0.76, 0.85, and 0.78, respectively. Although fit indices of the hypothesized three-factor model did not meet the respective criteria, the model demonstrated better fit than did the null and one-factor models. The present authors added four paths between error variables within items, but the modified model did not show satisfactory fit. Subsequent analysis revealed that a bi-factor model fit the data better than did the hypothesized or modified three-factor models. The Japanese version of the Maslach Burnout Inventory-Student Survey needs minor changes to improve the fit of its three-factor model, but the scale as a whole can be used to adequately assess overall academic burnout in Japanese university students. Although the scale was back-translated, two items measuring exhaustion whose expressions overlapped should be modified, and all items measuring inefficacy should be reversed in order to statistically clarify the factorial difference between the scale's three factors. © 2015 The Authors. Japan Journal of Nursing Science © 2015 Japan Academy of Nursing Science.

  18. An inter-battery factor analysis of the comrey personality scales and the 16 personality factor questionnaire

    OpenAIRE

    Gideon P. de Bruin

    2000-01-01

    The scores of 700 Afrikaans-speaking university students on the Comrey Personality Scales and the 16 Personality Factor Questionnaire were subjected to an inter-battery factor analysis. This technique uses only the correlations between two sets of variables and reveals only the factors that they have in common. Three of the Big Five personality factors were revealed, namely Extroversion, Neuroticism and Conscientiousness. However, the Conscientiousness factor contained a relatively strong uns...

  19. Factor analysis of the Children's Behaviour Questionnaire in a Nigerian paediatric primary care population

    Directory of Open Access Journals (Sweden)

    O O Omigbodun

    2004-04-01

    Full Text Available Objective. This paper examines the factor structure of the Yoruba translation of the Children's Behaviour Questionnaire for Completion by Parents (CBQ) administered in a Nigerian paediatric primary care population. Design. A cross-sectional questionnaire survey. Subjects. Four hundred and seventy-eight children aged 7 - 14 years who attended a primary care clinic in Ibadan, Nigeria, over a 3-month period. Methods. Parents' ratings of the children were obtained using the Yoruba translation of the CBQ. The factor structure of this instrument was examined using principal component analysis with varimax rotation. Only factors with eigenvalues greater than 1 were examined further. Results. The first seven dimensions were readily conceptualised. These factors are conduct problem, hyperactivity, emotional problem, irritability, problems with elimination, a somatic complaint and a school problem dimension. Conclusion. These factors are similar to what has been observed in other studies involving populations of children with psychopathology, with the exception of the somatic complaint and school problem dimensions. The emergence of these two factors, which are quite different from what has been observed in other studies, may demonstrate differences that reflect the influence of language, culture and the peculiarities of a primary care setting. On the other hand, the similarity of most of the factors to those found in previous studies confirms the broad similarities in the behaviour of children across different cultures.

  20. Latent physiological factors of complex human diseases revealed by independent component analysis of clinarrays

    Directory of Open Access Journals (Sweden)

    Chen David P

    2010-10-01

    Full Text Available Abstract Background: Diagnosis and treatment of patients in the clinical setting is often driven by known symptomatic factors that distinguish one particular condition from another. Treatment based on noticeable symptoms, however, is limited to the types of clinical biomarkers collected, and is prone to overlooking dysfunctions in physiological factors not easily evident to medical practitioners. We used a vector-based representation of patient clinical biomarkers, or clinarrays, to search for latent physiological factors that underlie human diseases directly from clinical laboratory data. Knowledge of these factors could be used to improve assessment of disease severity and help to refine strategies for diagnosis and monitoring disease progression. Results: Applying Independent Component Analysis on clinarrays built from patient laboratory measurements revealed both known and novel concomitant physiological factors for asthma, types 1 and 2 diabetes, cystic fibrosis, and Duchenne muscular dystrophy. Serum sodium was found to be the most significant factor for both type 1 and type 2 diabetes, and was also significant in asthma. TSH3, a measure of thyroid function, and blood urea nitrogen, indicative of kidney function, were factors unique to type 1 diabetes and type 2 diabetes, respectively. Platelet count was significant across all the diseases analyzed. Conclusions: The results demonstrate that large-scale analyses of clinical biomarkers using unsupervised methods can offer novel insights into the pathophysiological basis of human disease, and suggest novel clinical utility of established laboratory measurements.
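
    Independent Component Analysis of a patients-by-lab-tests matrix can be sketched with scikit-learn as below; the synthetic data merely stand in for real clinarrays.

    ```python
    # Minimal ICA sketch on synthetic 'clinarray' data (500 patients x 20 lab tests).
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(1)
    sources = rng.laplace(size=(500, 5))                     # 5 hidden physiological factors (toy)
    mixing = rng.normal(size=(5, 20))
    clinarrays = sources @ mixing + 0.1 * rng.normal(size=(500, 20))

    ica = FastICA(n_components=5, random_state=0)
    scores = ica.fit_transform(clinarrays)                   # patient scores on latent components
    loadings = ica.mixing_                                   # lab-test loadings per component
    print(scores.shape, loadings.shape)                      # (500, 5) and (20, 5)
    ```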

  1. Neutronic analysis of the European reference design of the water cooled lithium lead blanket for a DEMOnstration reactor

    International Nuclear Information System (INIS)

    Petrizzi, L.

    1994-01-01

    Water cooled lithium lead blankets, using liquid Pb-17Li eutectic both as breeder and neutron multiplier material and martensitic steel as structural material, represent one of the four families under development in the European DEMO blanket programme. Two concepts were proposed, both reaching tritium breeding self-sufficiency: the 'box-shaped' and the 'cylindrical modules'. Within this scope, a new concept has also been defined: the 'single box'. A neutronic analysis of the 'single box' is presented. A full 3-D model including the whole assembly and many of the reactor details (divertors, holes, gaps) has been defined, together with a 3-D neutron source. A tritium breeding ratio (TBR) value of 1.19 confirms the tritium breeding self-sufficiency of the design. Selected power densities, calculated for the different materials and zones, are presented here. Some considerations on shielding capability with respect to the toroidal field coil system are also presented. (author) 10 refs.; 3 figs.; 3 tabs

  2. Specific inhibition of p97/VCP ATPase and kinetic analysis demonstrate interaction between D1 and D2 ATPase domains.

    Science.gov (United States)

    Chou, Tsui-Fen; Bulfer, Stacie L; Weihl, Conrad C; Li, Kelin; Lis, Lev G; Walters, Michael A; Schoenen, Frank J; Lin, Henry J; Deshaies, Raymond J; Arkin, Michelle R

    2014-07-29

    The p97 AAA (ATPase associated with diverse cellular activities), also called VCP (valosin-containing protein), is an important therapeutic target for cancer and neurodegenerative diseases. p97 forms a hexamer composed of two AAA domains (D1 and D2) that form two stacked rings and an N-terminal domain that binds numerous cofactor proteins. The interplay between the three domains in p97 is complex, and a deeper biochemical understanding is needed in order to design selective p97 inhibitors as therapeutic agents. It is clear that the D2 ATPase domain hydrolyzes ATP in vitro, but whether D1 contributes to ATPase activity is controversial. Here, we use Walker A and B mutants to demonstrate that D1 is capable of hydrolyzing ATP and show for the first time that nucleotide binding in the D2 domain increases the catalytic efficiency (kcat/Km) of D1 ATP hydrolysis 280-fold, by increasing kcat 7-fold and decreasing Km about 40-fold. We further show that an ND1 construct lacking D2 but including the linker between D1 and D2 is catalytically active, resolving a conflict in the literature. Applying enzymatic observations to small-molecule inhibitors, we show that four p97 inhibitors (DBeQ, ML240, ML241, and NMS-873) have differential responses to Walker A and B mutations, to disease-causing IBMPFD mutations, and to the presence of the N domain binding cofactor protein p47. These differential effects provide the first evidence that p97 cofactors and disease mutations can alter p97 inhibitor potency and suggest the possibility of developing context-dependent inhibitors of p97. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Functional importance of conserved domains in the flowering-time gene CONSTANS demonstrated by analysis of mutant alleles and transgenic plants.

    Science.gov (United States)

    Robson, F; Costa, M M; Hepworth, S R; Vizir, I; Piñeiro, M; Reeves, P H; Putterill, J; Coupland, G

    2001-12-01

    CONSTANS promotes flowering of Arabidopsis in response to long-day conditions. We show that CONSTANS is a member of an Arabidopsis gene family that comprises 16 other members. The CO-Like proteins encoded by these genes contain two segments of homology: a zinc finger containing region near their amino terminus and a CCT (CO, CO-Like, TOC1) domain near their carboxy terminus. Analysis of seven classical co mutant alleles demonstrated that the mutations all occur within either the zinc finger region or the CCT domain, confirming that the two regions of homology are important for CO function. The zinc fingers are most similar to those of B-boxes, which act as protein-protein interaction domains in several transcription factors described in animals. Segments of CO protein containing the CCT domain localize GFP to the nucleus, but one mutation that affects the CCT domain delays flowering without affecting the nuclear localization function, suggesting that this domain has additional functions. All eight co alleles, including one recovered by pollen irradiation in which DNA encoding both B-boxes is deleted, are shown to be semidominant. This dominance appears to be largely due to a reduction in CO dosage in the heterozygous plants. However, some alleles may also actively delay flowering, because overexpression from the CaMV 35S promoter of the co-3 allele, that has a mutation in the second B-box, delayed flowering of wild-type plants. The significance of these observations for the role of CO in the control of flowering time is discussed.

  4. Parallel factor analysis PARAFAC of process affected water

    Energy Technology Data Exchange (ETDEWEB)

    Ewanchuk, A.M.; Ulrich, A.C.; Sego, D. [Alberta Univ., Edmonton, AB (Canada). Dept. of Civil and Environmental Engineering; Alostaz, M. [Thurber Engineering Ltd., Calgary, AB (Canada)

    2010-07-01

    A parallel factor analysis (PARAFAC) of oil sands process-affected water was presented. Naphthenic acids (NA) are traditionally described as monobasic carboxylic acids. Research has indicated that oil sands NA do not fit classical definitions of NA. Oil sands organic acids have toxic and corrosive properties. When analyzed by fluorescence spectroscopy, oil sands process-affected water displays a characteristic peak at 290 nm excitation and approximately 346 nm emission. In this study, parallel factor analysis (PARAFAC) was used to decompose process-affected water multi-way data into components representing analytes, chemical compounds, and groups of compounds. Water samples from various oil sands operations were analyzed in order to obtain fluorescence excitation-emission matrices (EEMs). The EEMs were then arranged into a large matrix, in decreasing order of process-affected water content, for PARAFAC. Data were divided into 5 components. A comparison with commercially prepared NA samples suggested that oil sands NA are fundamentally different. Further research is needed to determine what each of the 5 components represents. tabs., figs.
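
    A PARAFAC decomposition of excitation-emission data can be sketched with the tensorly package, assuming a recent version; the random tensor below merely stands in for real EEMs, and the rank of 5 follows the abstract.

    ```python
    # Hedged PARAFAC sketch on a stand-in samples x excitation x emission tensor.
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    rng = np.random.default_rng(2)
    eems = tl.tensor(rng.random((30, 40, 60)))     # 30 samples, 40 excitation, 60 emission points

    weights, factors = parafac(eems, rank=5)       # 5 components, as in the study
    sample_scores, excitation_loadings, emission_loadings = factors
    print(sample_scores.shape, excitation_loadings.shape, emission_loadings.shape)
    ```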

  5. Confirmatory Factor Analysis of the ISB - Burnout Syndrome Inventory

    Directory of Open Access Journals (Sweden)

    Ana Maria T. Benevides-Pereira

    2017-05-01

    Full Text Available Aim: Burnout is a dysfunctional reaction to chronic occupational stress. The present study analyses the psychometric qualities of the Burnout Syndrome Inventory (ISB) through Confirmatory Factor Analysis (CFA). Method: Empirical study in a multi-centre and multi-occupational sample (n = 701) using the ISB. Part I assesses antecedent factors: Positive Organizational Conditions (PC) and Negative Organizational Conditions (NC). Part II assesses the syndrome: Emotional Exhaustion (EE), Dehumanization (DE), Emotional Distancing (ED) and Personal Accomplishment (PA). Results: The highest means occurred in the positive scales PC (M = 23.29, SD = 5.89) and PA (M = 14.84, SD = 4.71). Negative conditions showed the greatest variability (SD = 6.03). Reliability indexes were reasonable, with the lowest at .77 for DE and the highest at .91 for PA. The CFA revealed RMSEA = .057 and CFI = .90, with all scale regressions showing significant values (β = .73 to β = .92). Conclusion: The ISB proved to be a plausible instrument for evaluating burnout. The two parts maintained the initial model and confirmed the theoretical presupposition. This instrument makes possible a more comprehensive picture of the labour context, and either part may be used separately according to the needs and aims of the assessor.

  6. Confirmatory factor analysis of the Competitive State Anxiety Inventory-2.

    Science.gov (United States)

    Lane, A M; Sewell, D F; Terry, P C; Bartram, D; Nesti, M S

    1999-06-01

    The aim of this study was to evaluate the factor structure of the Competitive State Anxiety Inventory-2 (CSAI-2) using confirmatory factor analysis. Volunteer participants (n = 1213) completed the CSAI-2 approximately 1 h before competition and the data were analysed in two samples. The hypothesized model showed poor fit indices in both samples independently (Robust Comparative Fit Index: sample A = 0.82, sample B = 0.84) and simultaneously (Comparative Fit Index = 0.83), suggesting that the factor structure proposed by Martens et al. is flawed. Our findings suggest that a limitation of the Cognitive Anxiety scale derives from phrasing items around the word 'concerned' rather than 'worried'. We suggest that being concerned about an impending performance does not necessarily mean that an athlete is experiencing negative thoughts, but that the athlete is acknowledging the importance and difficulty of the challenge and is attempting to mobilize resources to cope. The present results question the use of the CSAI-2 as a valid measure of competitive state anxiety.

  7. Worldwide analysis of marine oil spill cleanup cost factors

    International Nuclear Information System (INIS)

    Etkin, D.S.

    2000-01-01

    The many factors that influence oil spill response costs were discussed, with particular emphasis on how spill responses differ around the world because of differing cultural values, socio-economic factors and labor costs. This paper presented an analysis of marine oil spill cleanup costs based on the country, proximity to shoreline, spill size, oil type, degree of shoreline oiling and cleanup methodology. The objective was to determine how each factor impacts per-unit cleanup costs. Near-shore and in-port spills were found to be 4-5 times more expensive to clean up than offshore spills. Responses to spills of heavy fuels also cost 10 times more than responses to spills of lighter crudes and diesel. On a per-unit basis, responses to spills under 30 tonnes are 10 times more costly than responses to spills of 300 tonnes. A newly developed modelling technique that can be used on different types of marine spills was described. It is based on updated cost data acquired from case studies of more than 300 spills in 40 countries. The model produces a per-unit cleanup cost estimate by taking into consideration oil type, location, spill size, cleanup methodology, and shoreline oiling. It was concluded that actual spill costs depend entirely on the specific circumstances of the spill. 13 refs., 10 tabs., 3 figs

  8. Container Throughput Forecasting Using Dynamic Factor Analysis and ARIMAX Model

    Directory of Open Access Journals (Sweden)

    Marko Intihar

    2017-11-01

    Full Text Available The paper examines the impact of integrating macroeconomic indicators on the accuracy of a container throughput time series forecasting model. For this purpose, dynamic factor analysis and an AutoRegressive Integrated Moving-Average model with eXogenous inputs (ARIMAX) are used. Both methodologies are integrated into a novel four-stage heuristic procedure. Firstly, dynamic factors are extracted from external macroeconomic indicators influencing the observed throughput. Secondly, a family of ARIMAX models of different orders is generated based on the derived factors. In the third stage, diagnostic and goodness-of-fit testing is applied, including statistical criteria such as fit performance, information criteria, and parsimony. Finally, the best model is heuristically selected and tested on real data from the Port of Koper. The results show that by incorporating macroeconomic indicators into the forecasting model, more accurate future throughput forecasts can be achieved. The model is also used to produce forecasts for the next four years, indicating more oscillatory behaviour in the period 2018-2020. Hence, care must be taken concerning any major investment decisions initiated by management. It is believed that the proposed model might be a useful reinforcement of the existing forecasting module in the observed port.
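
    As a rough stand-in for the four-stage procedure (not the authors' implementation), the sketch below extracts a single common factor from macroeconomic indicators with PCA and feeds it to an ARIMAX model as an exogenous regressor; the file and column names are hypothetical.

    ```python
    # Simplified factor + ARIMAX sketch; the data file and columns are assumed.
    import pandas as pd
    from sklearn.decomposition import PCA
    from statsmodels.tsa.arima.model import ARIMA

    data = pd.read_csv("port_koper_monthly.csv", index_col=0)
    macro = data[["gdp_index", "trade_volume", "industrial_production"]]
    factor = PCA(n_components=1).fit_transform((macro - macro.mean()) / macro.std())

    arimax = ARIMA(data["throughput"], exog=factor, order=(1, 1, 1)).fit()
    print(arimax.summary())
    # Out-of-sample forecasts would additionally require projected values of the factor:
    # arimax.forecast(steps=4, exog=future_factor_values)
    ```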

  9. Isotopic Analysis of Fingernails as a USGS Open House Demonstration of the Use of Stable Isotopes in Foodweb Studies

    Science.gov (United States)

    Silva, S. R.; Kendall, C.; Young, M. B.; Choy, D.

    2011-12-01

    The USGS Isotope Tracers Project uses stable isotopes and tritium to add a unique dimension of chemical information to a wide range of environmental investigations. The use and application of isotopes is usually an unfamiliar and even esoteric topic to the general public. Therefore, during three USGS open house events, as a public outreach effort, we demonstrated the use of stable isotopes by analyzing nitrogen and carbon isotopes from very small fragments of fingernail from willing participants. We titled the exhibit "You Are What You Eat". The results from all participants were plotted on a graph indicating the general influence of different food groups on the composition of body tissues as represented by fingernails. All participants were assigned a number, and no personal-identification information was collected. A subset of participants provided us with an estimate of the number of days a week various foods were eaten and indicated whether they were vegetarians, vegans or non-vegetarians. Volunteers from our research group were on hand to explain and discuss fundamental concepts such as how foods attain their isotopic composition, the difference between C3 and C4 plants, the effects of assimilation, trophic enrichment, and the various uses of stable isotopes in environmental studies. The results of the fingernail analyses showed the variation in the range of isotopic compositions among about 400 people at each event, the distinct influence of C4 plants (mainly corn and cane sugar) on our carbon isotopic composition, and the isotopic differences between vegetarians and non-vegetarians, among other details (http://wwwrcamnl.wr.usgs.gov/isoig/projects/fingernails/). A poll of visitors attending the open house event in 2006 indicated that "You Are What You Eat" was among the most popular exhibits. Following the first two open house events we were contacted by a group of researchers from Brazil who had completed a very similar study. Our collaboration resulted in a publication in

  10. Chimeric analysis of EGFP and DsRed2 transgenic mice demonstrates polyclonal maintenance of pancreatic acini.

    Science.gov (United States)

    Ryu, Je-Young; Siswanto, Antoni; Harimoto, Kenichi; Tagawa, Yoh-ichi

    2013-06-01

    The pancreatic islet is an assembly of specific endocrine cells. There are many conflicting reports regarding whether the acinus develops from single or multiple progenitor cells. This study investigated the clonality of development and maintenance of the pancreatic acinus and duct using a chimeric analysis with EGFP and DsRed2 transgenic mice. Chimeric mice (G-R mice) were obtained by the aggregation method, using 8-cell stage embryos from EGFP and DsRed2 transgenic mice. The islets from the G-R mice were chimeric and mosaic, consisting of either EGFP- or DsRed2-positive populations, as in previous reports. On the other hand, most acini developed from either an EGFP or a DsRed2 origin, but some were chimeric. Interestingly, these chimeric acini were clearly separated into two-color regions and were not mosaic. Some large intralobular pancreatic ducts consisting of more than 10 cells were found to be chimeric, but no small ducts made up of fewer than 9 cells were chimeric. Our histological observations suggest that the pancreatic acinus is maintained polyclonally and directionally by multiple progenitor cells. Large pancreatic ducts also seem to develop polyclonally and might result from the assembly of small ducts that develop from a single origin. These findings provide useful information for further understanding pancreatic maintenance.

  11. In situ demonstration and characteristic analysis of the protease components from marine bacteria using substrate immersing zymography.

    Science.gov (United States)

    Liu, Dan; Yang, XingHao; Huang, JiaFeng; Wu, RiBang; Wu, CuiLing; He, HaiLun; Li, Hao

    2015-01-01

    Zymography is a widely used technique for the study of proteolytic activities on the basis of protein substrate degradation. In this study, substrate immersing zymography was used in analyzing proteolysis of extracellular proteases. Instead of being added directly into a sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) gel, the substrates were added into the immersing solution after electrophoresis. Substrate immersing zymography could accurately determine the molecular weight of trypsin, and band intensities were linearly related to the amount of protease. The diversity of extracellular proteases produced by different marine bacteria was analyzed by substrate immersing zymography, and large variations of proteolysis were evidenced. The proteolytic activity of Pseudoalteromonas strains was more complicated than that of other strains. Five Pseudoalteromonas strains and five Vibrio strains were further analyzed by substrate immersing zymography with different substrates (casein and gelatin), and multiple caseinolytic and gelatinolytic profiles were detected. The extracellular proteolytic profiles of Pseudoalteromonas strains exhibited a large intraspecific variation. Molecular weight (Mw) of the main protease secreted by Vibrio was 35 kDa. Additionally, the time-related change trends of the activities of extracellular proteases produced by Pseudoalteromonas sp. SJN2 were analyzed by substrate immersing zymography. These results implied the potential application of substrate immersing zymography for the analysis of the diversity of bacterial extracellular proteases.

  12. Preliminary Failure Modes, Effects and Criticality Analysis (FMECA) of the Brayton Isotope Power System (BIPS) Ground Demonstration System. Report 76-311965

    International Nuclear Information System (INIS)

    Miller, L.G.

    1976-01-01

    A Failure Modes, Effects and Criticality Analysis (FMECA) has been made of the Brayton Isotope Power System Ground Demonstration System (BIPS-GDS). Details of the analysis are discussed. The BIPS Flight System was recently analyzed in an AIRPHX report. Since the results of the Flight System FMECA are directly applicable to the BIPS to be tested in the GDS mode, the contents of the earlier FMECA have not been repeated in this current analysis. The BIPS-FS FMECA has been reviewed and determined to be essentially current

  13. A Factor Analysis of The Social Interest Index--Revised.

    Science.gov (United States)

    Zarski, John J.; And Others

    1983-01-01

    Factor analyzed the Social Interest Index-Revised (SII-R), which measures levels of social interest attained in each of four life task areas. Four factors (N=308) were defined, i.e., a self-significance factor, a love factor, a friendship factor, and a work factor. Results support the empirical validity of the scale. (Author/PAS)

  14. Analysis of nasopharyngeal carcinoma risk factors with Bayesian networks.

    Science.gov (United States)

    Aussem, Alex; de Morais, Sérgio Rodrigues; Corbex, Marilys

    2012-01-01

    We propose a new graphical framework for extracting the relevant dietary, social and environmental risk factors that are associated with an increased risk of nasopharyngeal carcinoma (NPC) in a case-control epidemiologic study comprising 1289 subjects and 150 risk factors. This framework builds on the use of Bayesian networks (BNs) for representing statistical dependencies between random variables. We discuss a novel constraint-based procedure, called Hybrid Parents and Children (HPC), that recursively builds a local graph including all the relevant features statistically associated with NPC, without having to find the whole BN first. The local graph is afterwards directed by the domain expert according to his knowledge. It provides a statistical profile of the recruited population and helps identify the risk factors associated with NPC. Extensive experiments on synthetic data sampled from known BNs show that HPC outperforms state-of-the-art algorithms that have appeared in the recent literature. From a biological perspective, the present study confirms that chemical products, pesticides and domestic fume intake from incomplete combustion of coal and wood are significantly associated with NPC risk. These results suggest that industrial workers are often exposed to noxious chemicals and poisonous substances used in the course of manufacturing. This study also supports previous findings that the consumption of a number of preserved food items, like house-made proteins and sheep fat, is a major risk factor for NPC. BNs are valuable data mining tools for the analysis of epidemiologic data. They can explicitly combine both expert knowledge from the field and information inferred from the data. These techniques therefore merit consideration as valuable alternatives to traditional multivariate regression techniques in epidemiologic studies. Copyright © 2011 Elsevier B.V. All rights reserved.
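
    The HPC algorithm itself is not available in common Python libraries; as a rough stand-in, the sketch below learns a Bayesian-network structure around the outcome with pgmpy's score-based hill climbing (API details vary somewhat across pgmpy versions), using hypothetical column names and data.

    ```python
    # Stand-in structure-learning sketch (score-based, not the constraint-based HPC).
    import pandas as pd
    from pgmpy.estimators import BicScore, HillClimbSearch

    df = pd.read_csv("npc_case_control.csv")   # binary risk factors plus an 'NPC' status column
    dag = HillClimbSearch(df).estimate(scoring_method=BicScore(df))

    # Risk factors directly connected to the NPC node in the learned graph
    neighbours = {u if v == "NPC" else v for u, v in dag.edges() if "NPC" in (u, v)}
    print(sorted(neighbours))
    ```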

  15. Making the invisible visible: bioelectrical impedance analysis demonstrates unfavourable body composition in rheumatoid arthritis patients in clinical practice.

    Science.gov (United States)

    Konijn, N P C; van Tuyl, L H D; Bultink, I E M; Lems, W F; Earthman, C P; van Bokhorst-de van der Schueren, M A E

    2014-01-01

    To examine differences between the assessment of body composition by body mass index (BMI) and bioelectrical impedance analysis (BIA) in patients with rheumatoid arthritis (RA). The body composition of RA patients was assessed during their visit to the outpatient department of a Dutch academic hospital using BMI, fat-free mass index (FFMI), and fat mass index (FMI). FFMI and FMI were determined by single-frequency BIA. Sixty-five consecutive RA patients (83% women, mean age 58 years, median disease duration 7 years) with moderately active disease [mean Disease Activity Score using 28 joint counts (DAS28) = 3.40; mean Rheumatoid Arthritis Disease Activity Index (RADAI) score = 3.49] and moderate disability [mean Health Assessment Questionnaire (HAQ) score = 0.87] were included. Based on BMI, 2% of our study population were underweight, 45% had a healthy body composition, and 54% were overweight or obese. Based on BIA, 18% of the patients showed a low FFMI and 74% had a high or very high FMI. Low FFMI was found in 44% of the women with a normal BMI, and high FMI was found in 40% of the women and 75% of the men with a normal BMI. A high frequency of unfavourable body composition, predominantly reduced FFMI and elevated FMI, was found in a cohort of RA patients with moderately active disease, turning BMI into an unreliable method for assessment of body composition in RA. BIA, however, might be the preferred method to assess FFMI and FMI in RA patients in clinical practice, as it is easy to use and relatively inexpensive.
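
    The indices contrasted above are simple ratios of mass to height squared; the made-up example below shows how a BMI in the healthy range can coexist with a low fat-free mass index and a high fat mass index (cut-off values vary between references).

    ```python
    # Worked example with invented values: normal BMI, low FFMI, high FMI.
    height_m = 1.65
    weight_kg = 62.0
    fat_mass_kg = 24.0                        # e.g., as estimated by BIA (hypothetical)
    fat_free_mass_kg = weight_kg - fat_mass_kg

    bmi = weight_kg / height_m ** 2           # about 22.8 kg/m^2 -> 'healthy' BMI range
    ffmi = fat_free_mass_kg / height_m ** 2   # about 14.0 kg/m^2 -> low by common female cut-offs
    fmi = fat_mass_kg / height_m ** 2         # about 8.8 kg/m^2  -> toward the high end

    print(f"BMI={bmi:.1f}  FFMI={ffmi:.1f}  FMI={fmi:.1f}")
    ```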

  16. Analysis of hepatic transcriptome demonstrates altered lipid metabolism following Lactobacillus johnsonii BS15 prevention in chickens with subclinical necrotic enteritis.

    Science.gov (United States)

    Qing, Xiaodan; Zeng, Dong; Wang, Hesong; Ni, Xueqin; Lai, Jing; Liu, Lei; Khalique, Abdul; Pan, Kangcheng; Jing, Bo

    2018-04-20

    Subclinical necrotic enteritis (SNE) breaks out widely in chickens, slowing growth and causing enormous social and economic burdens. To better understand the molecular underpinnings of SNE on lipid metabolism and to explore novel preventative strategies against SNE, we studied the regulatory mechanism of a potential probiotic, Lactobacillus johnsonii BS15, on the lipid metabolism pathways involved in chickens with SNE. One hundred eighty one-day-old chickens were randomly divided into three groups and arranged on a basal diet (control and SNE groups), supplemented with BS15 (1 × 10⁶ cfu/g) or de Man, Rogosa and Sharpe (MRS) liquid medium for 28 days. The hepatic gene expression of each group was then measured using a high-throughput method (RNA-Seq). Quantitative real-time PCR (qRT-PCR) was used to detect the expression changes of the related genes. The results showed that eleven lipid metabolic pathways were identified during BS15 prevention in SNE chickens by RNA-Seq, including the peroxisome proliferator-activated receptor (PPAR) signaling pathway and arachidonic acid metabolism. BS15 notably facilitated the expression of fatty acid binding protein 2 (FABP2), acyl-CoA synthetase bubblegum family member 1 (ACSBG1), perilipin 1 (PLIN1) and perilipin 2 (PLIN2), which are involved in the PPAR signaling pathway of SNE chickens. Besides, suppression of phospholipase A2 group IVA (PLA2G4A) in arachidonic acid metabolism was observed in SNE chickens after BS15 prevention. The expression patterns of FABP2, ACSBG1, PLIN1, PLIN2 and PLA2G4A in qRT-PCR validation were consistent with the RNA-Seq results. These findings indicate that SNE may affect the hepatic lipid metabolism of chickens. Meanwhile, BS15 pretreatment may provide a prospective natural prophylaxis strategy against SNE by improving the PPAR signaling pathway and arachidonic acid metabolism.

  17. A Magnetic Circuit Demonstration.

    Science.gov (United States)

    Vanderkooy, John; Lowe, June

    1995-01-01

    Presents a demonstration designed to illustrate Faraday's, Ampere's, and Lenz's laws and to reinforce the concepts through the analysis of a two-loop magnetic circuit. Can be made dramatic and challenging for sophisticated students but is suitable for an introductory course in electricity and magnetism. (JRH)

  18. Immunohistochemical analysis of mechanoreceptors in the human posterior cruciate ligament: a demonstration of its proprioceptive role and clinical relevance.

    Science.gov (United States)

    Del Valle, M E; Harwin, S F; Maestro, A; Murcia, A; Vega, J A

    1998-12-01

    Although long-term studies report successful results with total knee arthroplasty (TKA), performed with or without posterior cruciate ligament (PCL) retention, controversy exists as to which is preferable in regard to patient outcome and satisfaction. The possible proprioceptive role of the PCL may account for a more normal feeling of the arthroplasty. Although the PCL has been examined using various histological techniques, immunohistochemical techniques are the most sensitive for neural elements. Therefore an immunohistochemical study was designed to determine the patterns of innervation, the morphological types of the proprioceptors, and their immunohistochemical profile. During TKA, samples were obtained from 22 osteoarthritic PCLs and subjected to immunohistochemical analysis with mouse monoclonal antibodies against neurofilament protein (NFP), S100 protein (S100P), epithelial membrane antigen (EMA), and vimentin (all present in neuromechanoreceptors). Three normal PCLs from cadaveric specimens were also obtained and analyzed for comparison. Five types of sensory corpuscles were observed in both the normal and the arthritic PCLs: simple lamellar, Pacini-like, Ruffini, Krause-like, and morphologically unclassified. Their structure included a central axon, inner core, and capsule in lamellar and Pacini corpuscles and variable intracorpuscular axons and periaxonal cells in the Ruffini and Krause-like corpuscles. The immunohistochemical profile showed the central axon to have NFP immunoreactivity, periaxonal cells to have S100P and vimentin immunoreactivity, and the capsule to have EMA and vimentin immunoreactivity. Nerve fibers and free nerve endings displayed NFP and S100P immunoreactivity. The immunohistochemical profile of the PCL sensory corpuscles is almost identical to that of cutaneous sensory corpuscles. Some prior histological studies of the PCL reported Golgi-like mechanoreceptors, and others found encapsulated corpuscles but no Golgi-like structures

  19. The scientific use of factor analysis in behavioral and life sciences

    National Research Council Canada - National Science Library

    Cattell, Raymond Bernard

    1978-01-01

    ...; the choice of procedures in experimentation; factor interpretation; the relationship of factor analysis to broadened psychometric concepts such as scaling, validity, and reliability, and to higher- strata models...

  20. A Brief Pre-Intervention Analysis and Demonstration of the Effects of a Behavioral Safety Package on Postural Behaviors of Pharmacy Employees

    Science.gov (United States)

    Fante, Rhiannon; Gravina, Nicole; Austin, John

    2007-01-01

    This study employed a pre-intervention analysis to determine factors that contributed to safe ergonomic postures in a small pharmacy. The pharmacy was located on a university campus and employed both pharmacists and pharmacy technicians. Three of the eight pharmacy employees had experienced various repetitive motion injuries that resulted in a…

  1. Exploratory Factor Analysis of the Beck Anxiety Inventory and the Beck Depression Inventory-II in a Psychiatric Outpatient Population

    Science.gov (United States)

    2018-01-01

    Background To further understand the relationship between anxiety and depression, this study examined the factor structure of the combined items from two validated measures for anxiety and depression. Methods The participants were 406 patients with mixed psychiatric diagnoses including anxiety and depressive disorders from a psychiatric outpatient unit at a university-affiliated medical center. Responses of the Beck Anxiety Inventory (BAI), Beck Depression Inventory (BDI)-II, and Symptom Checklist-90-Revised (SCL-90-R) were analyzed. We conducted an exploratory factor analysis of 42 items from the BAI and BDI-II. Correlational analyses were performed between subscale scores of the SCL-90-R and factors derived from the factor analysis. Scores of individual items of the BAI and BDI-II were also compared between groups of anxiety disorder (n = 185) and depressive disorder (n = 123). Results Exploratory factor analysis revealed the following five factors explaining 56.2% of the total variance: somatic anxiety (factor 1), cognitive depression (factor 2), somatic depression (factor 3), subjective anxiety (factor 4), and autonomic anxiety (factor 5). The depression group had significantly higher scores for 12 items on the BDI while the anxiety group demonstrated higher scores for six items on the BAI. Conclusion Our results suggest that anxiety and depressive symptoms as measured by the BAI and BDI-II can be empirically differentiated and that particularly items of the cognitive domain in depression and those of physical domain in anxiety are noteworthy. PMID:29651821

  2. Analysis of Entrepreneurship barriers in Moravia-Silesian Region by VRIO and Factor analysis application

    OpenAIRE

    Šebestová, Jarmila

    2007-01-01

Small and medium-sized enterprises (SMEs) are often considered a phenomenon of our times. Why have so many authors dedicated their work to this field? The main reason is that SMEs influence the life of society and contribute to the economic development of the region in which they establish their business. The same holds for the Moravia-Silesian Region, where the factor analysis was applied. VRIO and Porter's analyses were used to interpret the research findings clearly.

  3. Confirmatory Factor Analysis of the Finnish Job Content Questionnaire (JCQ in 590 Professional Musicians

    Directory of Open Access Journals (Sweden)

    Heidi Vastamäki

    2017-07-01

Background: Poorly functioning work environments may lead to dissatisfaction for the employees and financial loss for the employers. The Job Content Questionnaire (JCQ) was designed to measure social and psychological characteristics of work environments. Objective: To investigate the factor construct of the Finnish 14-item version of JCQ when applied to professional orchestra musicians. Methods: In a cross-sectional survey, the questionnaire was sent by mail to 1550 orchestra musicians and students. 630 responses were received. Full data were available for 590 respondents (response rate 38%). The questionnaire also contained questions on demographics, job satisfaction, health status, health behaviors, and intensity of playing music. Confirmatory factor analysis of the 2-factor model of JCQ was conducted. Results: Of the 5 estimates, JCQ items in the “job demand” construct, the “conflicting demands” item (question 5) explained most of the total variance in this construct (79%), demonstrating an almost perfect correlation of 0.63. In the construct of “job control,” “repetitive work” (question 10) demonstrated a perfect correlation index of 0.84, and the items “little decision freedom” (question 14) and “allows own decisions” (question 6) showed substantial correlations of 0.77 and 0.65. Conclusion: The 2-factor model of the Finnish 14-item version of JCQ proposed in this study fitted the observed data well. The “conflicting demands,” “repetitive work,” “little decision freedom,” and “allows own decisions” items demonstrated the strongest correlations with the latent factors, suggesting that in a population similar to the one studied, these items in particular should be taken into account when responses from such a population are examined.

  4. Confirmatory Factor Analysis of the Finnish Job Content Questionnaire (JCQ) in 590 Professional Musicians.

    Science.gov (United States)

    Vastamäki, Heidi; Vastamäki, Martti; Laimi, Katri; Saltychev, Michail

    2017-07-01

Poorly functioning work environments may lead to dissatisfaction for the employees and financial loss for the employers. The Job Content Questionnaire (JCQ) was designed to measure social and psychological characteristics of work environments. To investigate the factor construct of the Finnish 14-item version of JCQ when applied to professional orchestra musicians. In a cross-sectional survey, the questionnaire was sent by mail to 1550 orchestra musicians and students. 630 responses were received. Full data were available for 590 respondents (response rate 38%). The questionnaire also contained questions on demographics, job satisfaction, health status, health behaviors, and intensity of playing music. Confirmatory factor analysis of the 2-factor model of JCQ was conducted. Of the 5 estimates, JCQ items in the "job demand" construct, the "conflicting demands" item (question 5) explained most of the total variance in this construct (79%), demonstrating an almost perfect correlation of 0.63. In the construct of "job control," "opinions influential" (question 10) demonstrated a perfect correlation index of 0.84, and the items "little decision freedom" (question 14) and "allows own decisions" (question 6) showed substantial correlations of 0.77 and 0.65. The 2-factor model of the Finnish 14-item version of JCQ proposed in this study fitted the observed data well. The "conflicting demands," "opinions influential," "little decision freedom," and "allows own decisions" items demonstrated the strongest correlations with the latent factors, suggesting that in a population similar to the one studied, these items in particular should be taken into account when responses from such a population are examined.
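
    The two-factor confirmatory model described here can be sketched in Python with the semopy package. This is an assumption on my part: the authors' analysis was presumably run in dedicated SEM software, semopy's API may differ between versions, and the file `jcq.csv`, the item names `q2 ... q14`, and the item-to-factor assignment below are illustrative placeholders rather than the paper's specification.

```python
# Hedged sketch of a two-factor CFA (job demand / job control), not the
# authors' code. Requires: pip install semopy pandas
import pandas as pd
import semopy

# Hypothetical dataset: one column per JCQ item (q1 ... q14).
data = pd.read_csv("jcq.csv")

# lavaan-style model description; the item assignment is illustrative only.
desc = """
demand  =~ q2 + q3 + q4 + q5 + q13
control =~ q6 + q7 + q9 + q10 + q14
"""

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())           # loadings, variances, covariances
print(semopy.calc_stats(model))  # fit indices such as CFI and RMSEA
```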

  5. Mice with diet-induced obesity demonstrate a relative prothrombotic factor profile and a thicker aorta with reduced ex-vivo function.

    Science.gov (United States)

    Uner, Aykut G; Unsal, Cengiz; Unsal, Humeyra; Erdogan, Mumin A; Koc, Ece; Ekici, Mehmet; Avci, Hamdi; Balkaya, Muharrem; Belge, Ferda; Tarin, Lokman

    2018-04-01

Classical risk factors such as cholesterol and lipoproteins are currently not sufficient to explain all physiopathological processes of obesity-related vascular dysfunction, atherosclerosis and arteriosclerosis. Therefore, the discovery of potential markers involved in vascular dysfunction in the obese state is still needed. Disturbances in hemostatic factors may be involved in the developmental processes associated with obesity-related cardiovascular disorders. We hypothesized that alterations of several hemostatic factors in the obese state could correlate with the function and morphology of the aorta and could play an important role in the development of vascular dysfunction. To test this, we fed mice a high-fat diet for 18 weeks and investigated the relationships between selected hemostatic factors (in either plasma or the liver), metabolic hormones, and the morphology and ex-vivo function of the aorta. Here, we show that 18-week exposure to a high-fat diet results in higher plasma fibrinogen and a prolonged prothrombin time in diet-induced obese mice compared with controls. In addition, liver levels or activities of FII, FX, activated protein C, AT-III, and protein S are significantly different in diet-induced obese mice compared with controls. Curiously, FII, FVIII, FX, activated protein C, PTT, and protein S are correlated with both aortic histology (aortic thickness and diameter) and ex-vivo aortic function. Notably, ex-vivo studies revealed that diet-induced obese mice show a marked attenuation of aortic function. Taken together, the aforementioned hemostatic factors may be considered critical markers of obesity-related vascular dysfunction, and they could play important roles in diagnosing the dysfunction.

  6. Investigation and analysis of aircrew ametropia and related factors

    Directory of Open Access Journals (Sweden)

    Li-Juan Zheng

    2014-10-01

AIM: To investigate the refractive distribution and analyze risk factors for aircrew ametropia. METHODS: 49 cases of ametropia among 1031 aircrew members examined between May 2013 and May 2014 were reviewed. Refraction composition, age, aircraft type, position and flight time, together with the subjective assessments of the aircrew, were analyzed and compared. RESULTS: Of the 49 cases, 43 (88%) were myopic and 6 (12%) were hypermetropic. Detection rates were higher in aircrew over 50 years of age and in those with more than 3000 flight hours; they were lower in aircrew with marked subjective symptoms, in fighter aircrew, and in those with good eye-use habits. CONCLUSION: The incidence of myopia is higher in aircrew older than 50 years and with long flight times than in fighter pilots and in those with good eye-use habits. Attention should be paid to the increasing late-onset myopia of aviators, and to their eye-use habits, work intensity and time spent using the eyes.

  7. [Rhabdomyolysis in a Bipolar Adolescent. Analysis of Associated Factors].

    Science.gov (United States)

    Restrepo, Diana; Montoya, Pablo; Giraldo, Laura; Gaviria, Génesis; Mejía, Catalina

    2015-01-01

To describe a case of rhabdomyolysis associated with the use of quetiapine and lamotrigine in an adolescent treated for bipolar disorder. Description of the clinical case, analysis of the associated factors and a non-systematic review of the relevant literature. An 18-year-old male with bipolar disorder, treated pharmacologically with quetiapine and lamotrigine, presented with rhabdomyolysis after two weeks of physical activity. Quetiapine and exercise have both been associated with rhabdomyolysis. The mechanism mediating this association has not been identified, although it has been established that there is neuromuscular dysfunction and an increase in sarcomere permeability. This clinical case allowed the complex interaction between antipsychotic agents and increased physical activity to be observed in an adolescent psychiatric patient, as well as the appearance of a potentially lethal medical complication. Copyright © 2014 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.

  8. A Retrospective Analysis of Neonatal Encephalocele Predisposing Factors and Outcomes.

    Science.gov (United States)

    Yucetas, Seyho Cem; Uçler, Necati

    2017-01-01

    This study evaluates the predisposing factors and outcomes of surgical management of encephaloceles at our institution. A retrospective analysis of 32 occipital encephaloceles managed operatively at the Neurosurgery Department Clinics of the Faculty of Medicine, Adıyaman University, was performed between 2011 and 2015. Among the study population, 19 mothers had been exposed to TORCH infections (toxoplasma, rubella, cytomegalovirus, herpes simplex virus), 18 were in consanguineous marriages, and 3 had regular prenatal screening. Associated congenital anomalies were common. Eight infants required reoperation, and 9 died during follow-up. The study identified key areas for prevention. Knowledge of the intracranial and associated anomalies can guide management. © 2016 S. Karger AG, Basel.

  9. Theory of sampling: four critical success factors before analysis.

    Science.gov (United States)

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.

  10. Dispersive analysis of the scalar form factor of the nucleon

    Science.gov (United States)

    Hoferichter, M.; Ditsche, C.; Kubis, B.; Meißner, U.-G.

    2012-06-01

Based on the recently proposed Roy-Steiner equations for pion-nucleon ($\pi N$) scattering [1], we derive a system of coupled integral equations for the $\pi\pi \to \bar{N}N$ and $\bar{K}K \to \bar{N}N$ S-waves. These equations take the form of a two-channel Muskhelishvili-Omnès problem, whose solution in the presence of a finite matching point is discussed. We use these results to update the dispersive analysis of the scalar form factor of the nucleon, fully including $\bar{K}K$ intermediate states. In particular, we determine the correction $\Delta_\sigma = \sigma(2M_\pi^2) - \sigma_{\pi N}$, which is needed for the extraction of the pion-nucleon $\sigma$ term from $\pi N$ scattering, as a function of the pion-nucleon subthreshold parameters and the $\pi N$ coupling constant.
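
    For readers less familiar with the notation, the quantities discussed above can be written out explicitly. The LaTeX lines below give the textbook definitions of the scalar form factor, the pion-nucleon sigma term and the correction quoted in the abstract; they are standard definitions, not equations reproduced from the cited paper.

```latex
% Standard definitions: scalar form factor of the nucleon, sigma term,
% and the correction Delta_sigma needed at the Cheng-Dashen point.
\begin{align}
  \sigma(t) &= \frac{m_u + m_d}{2}\,
               \langle N(p')\,|\,\bar{u}u + \bar{d}d\,|\,N(p)\rangle ,
  \qquad t = (p' - p)^2 , \\
  \sigma_{\pi N} &= \sigma(0) , \qquad
  \Delta_\sigma = \sigma\!\left(2 M_\pi^2\right) - \sigma_{\pi N} .
\end{align}
```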

  11. Bayesian analysis of factors associated with fibromyalgia syndrome subjects

    Science.gov (United States)

    Jayawardana, Veroni; Mondal, Sumona; Russek, Leslie

    2015-01-01

Factors contributing to movement-related fear in subjects with fibromyalgia (FM) were assessed by Russek et al. (2014), based on data collected through a national internet survey of community-based individuals. The study focused on the Activities-Specific Balance Confidence scale (ABC), the Primary Care Post-Traumatic Stress Disorder screen (PC-PTSD), the Tampa Scale of Kinesiophobia (TSK), a Joint Hypermobility Syndrome screen (JHS), the Vertigo Symptom Scale (VSS-SF), Obsessive-Compulsive Personality Disorder (OCPD), and pain, work status and physical activity measures taken from the Revised Fibromyalgia Impact Questionnaire (FIQR). The study presented in this paper revisits the same data with a Bayesian analysis in which appropriate priors were introduced for the variables selected in Russek's paper.
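
    A Bayesian re-analysis of this kind boils down to combining priors on regression coefficients with the likelihood of the observed data. The sketch below shows the simplest conjugate case (normal prior on the coefficients, known noise variance) in Python with NumPy; the predictors, the noise level and the simulated data are illustrative assumptions, not values from the study.

```python
# Minimal conjugate Bayesian linear regression sketch (normal prior on the
# coefficients, known noise variance). Not the analysis used in the paper.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative predictors (e.g., standardized ABC, TSK, VSS-SF scores) and
# an outcome (e.g., an FIQR-derived score); simulated here for demonstration.
n, p = 200, 3
X = rng.normal(size=(n, p))
true_beta = np.array([0.8, -0.4, 0.2])
y = X @ true_beta + rng.normal(scale=1.0, size=n)

sigma2 = 1.0                      # assumed (known) noise variance
prior_mean = np.zeros(p)          # weakly informative prior centred at 0
prior_cov = 10.0 * np.eye(p)

# Conjugate update: posterior covariance and mean of the coefficients.
post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + X.T @ X / sigma2)
post_mean = post_cov @ (np.linalg.inv(prior_cov) @ prior_mean + X.T @ y / sigma2)

print("posterior mean:", post_mean.round(3))
print("posterior sd:  ", np.sqrt(np.diag(post_cov)).round(3))
```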

  12. Analysis of factors affecting the effect of stope leaching

    International Nuclear Information System (INIS)

    Xie Wangnan; Dong Chunming

    2014-01-01

The industrial test and industrial trial production of stope leaching were carried out at the Taoshan orefield of the Dabu deposit. The results of the test and of the trial production showed obvious differences in leaching rate and leaching time: compared with the industrial trial production, the industrial test achieved a higher leaching rate and a shorter leaching time. According to the analysis, the blasting method and the liquid arrangement were the main factors affecting leaching rate and leaching time. We therefore put forward the following suggestions: use deep-hole slicing tight-face blasting to reduce the yield of lump ore, adopt effective liquid arrangement methods so that the lixiviant infiltrates throughout the whole ore heap, and introduce bacterial leaching. (authors)

  13. Measuring coalition functioning: refining constructs through factor analysis.

    Science.gov (United States)

    Brown, Louis D; Feinberg, Mark E; Greenberg, Mark T

    2012-08-01

    Internal and external coalition functioning is an important predictor of coalition success that has been linked to perceived coalition effectiveness, coalition goal achievement, coalition ability to support evidence-based programs, and coalition sustainability. Understanding which aspects of coalition functioning best predict coalition success requires the development of valid measures of empirically unique coalition functioning constructs. The goal of the present study is to examine and refine the psychometric properties of coalition functioning constructs in the following six domains: leadership, interpersonal relationships, task focus, participation benefits/costs, sustainability planning, and community support. The authors used factor analysis to identify problematic items in our original measure and then piloted new items and scales to create a more robust, psychometrically sound, multidimensional measure of coalition functioning. Scales displayed good construct validity through correlations with other measures. Discussion considers the strengths and weaknesses of the refined instrument.

  14. Postpartum Depression in Women: A Risk Factor Analysis.

    Science.gov (United States)

    Zaidi, Farheen; Nigam, Aruna; Anjum, Ruby; Agarwalla, Rashmi

    2017-08-01

Postpartum Depression (PPD) is a known entity affecting not only the woman but the whole family. It affects women more harshly and chronically because of their increased stress sensitivity, maladaptive coping strategies and multiple social roles in the community. The aim was to estimate the risk factors commonly associated with PPD among women attending a tertiary hospital in New Delhi, India. It was a longitudinal study conducted at the antenatal clinic over a period of one year. A total of 260 women were screened at >36 weeks of gestation, of whom 149 postnatal women completed the questionnaire for PPD at six weeks after delivery. Informed consent, demographic data and obstetric details were obtained from each participant before screening began. The association of each risk factor was assessed with odds ratios, significance being accepted at a predefined threshold; in order to identify the most important confounding variables, logistic regression analysis was used. PPD is a common mental health problem among postnatal women, being found in 12.75% (19 out of 149) of subjects at six weeks after delivery. Moreover, it showed a significant association with young maternal age (p-value=0.040), birth of a female child (p-value=0.015), previous stressful life events (p-value=0.003), and low self-esteem and feelings of loneliness (p-value=0.007). This study provides important information regarding the risk factors associated with the development of PPD in this region of India. Female sex of the newborn and younger maternal age play an important role in the development of PPD.
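
    Risk-factor screens of this type are usually a two-step exercise: unadjusted odds ratios per factor, followed by a multivariable logistic regression to adjust for confounding. A minimal, hedged Python sketch of that workflow is shown below; the file `ppd.csv` and the column names are hypothetical placeholders and do not come from the study.

```python
# Sketch of an odds-ratio screen plus adjusted logistic regression for a
# binary PPD outcome. Illustrative only; not the authors' analysis code.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical dataset: binary outcome 'ppd' and binary/continuous predictors.
df = pd.read_csv("ppd.csv")
predictors = ["young_age", "female_infant", "stressful_event", "low_self_esteem"]

# Unadjusted odds ratio for one binary factor from a 2x2 table.
def crude_odds_ratio(data, factor, outcome="ppd"):
    a = ((data[factor] == 1) & (data[outcome] == 1)).sum()
    b = ((data[factor] == 1) & (data[outcome] == 0)).sum()
    c = ((data[factor] == 0) & (data[outcome] == 1)).sum()
    d = ((data[factor] == 0) & (data[outcome] == 0)).sum()
    return (a * d) / (b * c)

for f in predictors:
    print(f, round(crude_odds_ratio(df, f), 2))

# Adjusted analysis: logistic regression with all predictors entered together.
model = LogisticRegression(max_iter=1000).fit(df[predictors], df["ppd"])
adjusted_or = np.exp(model.coef_[0])   # exponentiated coefficients = odds ratios
print(dict(zip(predictors, adjusted_or.round(2))))
```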

  15. Analysis of vector boson production within TMD factorization

    Energy Technology Data Exchange (ETDEWEB)

    Scimemi, Ignazio [Universidad Complutense de Madrid, Departamento de Fisica Teorica, Madrid (Spain); Vladimirov, Alexey [Universitaet Regensburg, Institut fuer Theoretische Physik, Regensburg (Germany)

    2018-02-15

We present a comprehensive analysis and extraction of the unpolarized transverse momentum dependent (TMD) parton distribution functions, which are fundamental constituents of the TMD factorization theorem. We provide a general review of the theory of TMD distributions and present a new scheme of scale fixation. This scheme, called the ζ-prescription, allows one to minimize the impact of perturbative logarithms over a large range of scales and does not generate undesired power corrections. Within the ζ-prescription we consistently include the perturbatively calculable parts up to next-to-next-to-leading order (NNLO), and perform a global fit of Drell-Yan and Z-boson production, which includes data from the E288, Tevatron and LHC experiments. The non-perturbative parts of the TMDs are explored by checking a variety of models. We support the obtained results by a study of theoretical uncertainties, perturbative convergence, and a dedicated study of the range of applicability of the TMD factorization theorem. The considered non-perturbative models show significant differences in fitting behavior, which allow us to clearly disfavor most of them. The numerical evaluations are provided by the arTeMiDe code, which is introduced in this work and can be used for current and future TMD phenomenology. (orig.)

  16. Analysis of vector boson production within TMD factorization

    International Nuclear Information System (INIS)

    Scimemi, Ignazio; Vladimirov, Alexey

    2018-01-01

We present a comprehensive analysis and extraction of the unpolarized transverse momentum dependent (TMD) parton distribution functions, which are fundamental constituents of the TMD factorization theorem. We provide a general review of the theory of TMD distributions and present a new scheme of scale fixation. This scheme, called the ζ-prescription, allows one to minimize the impact of perturbative logarithms over a large range of scales and does not generate undesired power corrections. Within the ζ-prescription we consistently include the perturbatively calculable parts up to next-to-next-to-leading order (NNLO), and perform a global fit of Drell-Yan and Z-boson production, which includes data from the E288, Tevatron and LHC experiments. The non-perturbative parts of the TMDs are explored by checking a variety of models. We support the obtained results by a study of theoretical uncertainties, perturbative convergence, and a dedicated study of the range of applicability of the TMD factorization theorem. The considered non-perturbative models show significant differences in fitting behavior, which allow us to clearly disfavor most of them. The numerical evaluations are provided by the arTeMiDe code, which is introduced in this work and can be used for current and future TMD phenomenology. (orig.)

  17. Structural and functional analysis of coral Hypoxia Inducible Factor.

    Science.gov (United States)

    Zoccola, Didier; Morain, Jonas; Pagès, Gilles; Caminiti-Segonds, Natacha; Giuliano, Sandy; Tambutté, Sylvie; Allemand, Denis

    2017-01-01

    Tissues of symbiotic Cnidarians are exposed to wide, rapid and daily variations of oxygen concentration. Indeed, during daytime, intracellular O2 concentration increases due to symbiont photosynthesis, while during night, respiration of both host cells and symbionts leads to intra-tissue hypoxia. The Hypoxia Inducible Factor 1 (HIF-1) is a heterodimeric transcription factor used for maintenance of oxygen homeostasis and adaptation to hypoxia. Here, we carried out a mechanistic study of the response to variations of O2 concentrations of the coral model Stylophora pistillata. In silico analysis showed that homologs of HIF-1 α (SpiHIF-1α) and HIF-1β (SpiHIF-1β) exist in coral. A specific SpiHIF-1 DNA binding on mammalian Hypoxia Response Element (HRE) sequences was shown in extracts from coral exposed to dark conditions. Then, we cloned the coral HIF-1α and β genes and determined their expression and transcriptional activity. Although HIF-1α has an incomplete Oxygen-dependent Degradation Domain (ODD) relative to its human homolog, its protein level is increased under hypoxia when tested in mammalian cells. Moreover, co-transfection of SpiHIF-1α and β in mammalian cells stimulated an artificial promoter containing HRE only in hypoxic conditions. This study shows the strong conservation of molecular mechanisms involved in adaptation to O2 concentration between Cnidarians and Mammals whose ancestors diverged about 1,200-1,500 million years ago.

  18. Structural and functional analysis of coral Hypoxia Inducible Factor.

    Directory of Open Access Journals (Sweden)

    Didier Zoccola

Tissues of symbiotic Cnidarians are exposed to wide, rapid and daily variations of oxygen concentration. Indeed, during daytime, intracellular O2 concentration increases due to symbiont photosynthesis, while during night, respiration of both host cells and symbionts leads to intra-tissue hypoxia. The Hypoxia Inducible Factor 1 (HIF-1) is a heterodimeric transcription factor used for maintenance of oxygen homeostasis and adaptation to hypoxia. Here, we carried out a mechanistic study of the response to variations of O2 concentrations of the coral model Stylophora pistillata. In silico analysis showed that homologs of HIF-1α (SpiHIF-1α) and HIF-1β (SpiHIF-1β) exist in coral. A specific SpiHIF-1 DNA binding on mammalian Hypoxia Response Element (HRE) sequences was shown in extracts from coral exposed to dark conditions. Then, we cloned the coral HIF-1α and β genes and determined their expression and transcriptional activity. Although HIF-1α has an incomplete Oxygen-dependent Degradation Domain (ODD) relative to its human homolog, its protein level is increased under hypoxia when tested in mammalian cells. Moreover, co-transfection of SpiHIF-1α and β in mammalian cells stimulated an artificial promoter containing HRE only in hypoxic conditions. This study shows the strong conservation of molecular mechanisms involved in adaptation to O2 concentration between Cnidarians and Mammals, whose ancestors diverged about 1,200-1,500 million years ago.

  19. Multiple timescale analysis and factor analysis of energy ecological footprint growth in China 1953-2006

    International Nuclear Information System (INIS)

    Chen Chengzhong; Lin Zhenshan

    2008-01-01

Scientific analysis of energy consumption and its influencing factors is of great importance for energy strategy and policy planning. The energy consumption in China 1953-2006 is estimated by applying the energy ecological footprint (EEF) method, and the fluctuation periods of the annual growth rate of China's per capita EEF (EEF_cpc) are analyzed with the empirical mode decomposition (EMD) method in this paper. EEF intensity is analyzed to depict energy efficiency in China. The main timescales of the 37 factors that affect the annual growth rate of EEF_cpc are also discussed based on EMD and factor analysis methods. Results show three obvious undulation cycles of the annual growth rate of EEF_cpc, i.e., 4.6, 14.4 and 34.2 years over the last 53 years. The analysis findings from the common synthesized factors of the IMF1, IMF2 and IMF3 timescales of the 37 factors suggest that China's energy policy-makers should attach more importance to stabilizing economic growth, optimizing industrial structure, regulating domestic petroleum exploitation and improving transportation efficiency
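
    Empirical mode decomposition of an annual growth-rate series can be sketched in a few lines of Python, assuming the third-party PyEMD package (an assumption: the original analysis was not necessarily done this way, and PyEMD's API may differ between versions). The synthetic series below merely stands in for the EEF_cpc growth-rate data; the three embedded periods echo the cycles reported in the abstract purely for illustration.

```python
# Hedged EMD sketch: decompose an annual growth-rate series into intrinsic
# mode functions (IMFs), whose mean periods indicate the dominant timescales.
# Requires: pip install EMD-signal   (imported as PyEMD)
import numpy as np
from PyEMD import EMD

years = np.arange(1953, 2007)
# Synthetic stand-in for the EEF_cpc annual growth rate (three cycles + noise).
rng = np.random.default_rng(1)
signal = (np.sin(2 * np.pi * years / 4.6)
          + 0.5 * np.sin(2 * np.pi * years / 14.4)
          + 0.3 * np.sin(2 * np.pi * years / 34.2)
          + 0.2 * rng.normal(size=years.size))

imfs = EMD().emd(signal)   # one component per row, fastest oscillations first

# Crude mean period of each component from its number of zero crossings.
for i, imf in enumerate(imfs, start=1):
    crossings = np.sum(np.abs(np.diff(np.sign(imf))) > 0)
    period = 2 * years.size / crossings if crossings else float("inf")
    print(f"component {i}: mean period ~ {period:.1f} years")
```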

  20. Integrative Analysis of Transcription Factor Combinatorial Interactions Using a Bayesian Tensor Factorization Approach

    Science.gov (United States)

    Ye, Yusen; Gao, Lin; Zhang, Shihua

    2017-01-01

Transcription factors (TFs) play a key role in the transcriptional regulation of genes and in the determination of cellular identity through combinatorial interactions. However, current studies of combinatorial regulation are limited by the lack of experimental data obtained in the same cellular environment and by extensive data noise. Here, we adopt a Bayesian CANDECOMP/PARAFAC (CP) factorization approach (BCPF) to integrate multiple datasets in a network paradigm for determining precise TF interaction landscapes. In our first application, we apply BCPF to integrate three networks, each built from diverse ENCODE datasets of multiple cell lines, to predict a global and precise TF interaction network. This network yields 38 novel TF interactions with distinct biological functions. In our second application, we apply BCPF to seven cell-type TF regulatory networks and predict seven cell-lineage TF interaction networks. By further exploring their dynamics and modularity, we find that cell-lineage-specific hub TFs participate in cell-type- or lineage-specific regulation by interacting with non-specific TFs. Furthermore, we illustrate the biological function of hub TFs by taking those of the cancer and blood lineages as examples. Taken together, our integrative analysis reveals a more precise and extensive description of human TF combinatorial interactions. PMID:29033978
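
    To make the idea of CP tensor factorization concrete, here is a plain alternating-least-squares (ALS) CP decomposition in NumPy/SciPy. It is deliberately the non-Bayesian textbook version, not the BCPF method of the paper, and the random low-rank tensor is only a stand-in for something like a TF-by-TF-by-cell-type interaction tensor.

```python
# Plain ALS CP decomposition of a 3-way tensor (textbook version, not BCPF).
import numpy as np
from scipy.linalg import khatri_rao

def cp_als(T, rank, n_iter=200, seed=0):
    """Approximate T[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r]."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.normal(size=(I, rank))
    B = rng.normal(size=(J, rank))
    C = rng.normal(size=(K, rank))
    for _ in range(n_iter):
        # Mode-n unfolding (C-order reshape) pairs with the Khatri-Rao product
        # of the remaining factor matrices, in the remaining axis order.
        A = T.reshape(I, -1) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = np.moveaxis(T, 1, 0).reshape(J, -1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = np.moveaxis(T, 2, 0).reshape(K, -1) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Stand-in data: a noisy rank-3 tensor (e.g., TF x TF x cell type).
rng = np.random.default_rng(42)
A0, B0, C0 = (rng.normal(size=(20, 3)), rng.normal(size=(20, 3)), rng.normal(size=(7, 3)))
T = np.einsum("ir,jr,kr->ijk", A0, B0, C0) + 0.01 * rng.normal(size=(20, 20, 7))

A, B, C = cp_als(T, rank=3)
fit = 1 - np.linalg.norm(T - np.einsum("ir,jr,kr->ijk", A, B, C)) / np.linalg.norm(T)
print(f"fit: {fit:.3f}")   # close to 1 when the rank-3 model explains the tensor
```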

  1. Recurrent-neural-network-based Boolean factor analysis and its application to word clustering.

    Science.gov (United States)

    Frolov, Alexander A; Husek, Dusan; Polyakov, Pavel Yu

    2009-07-01

The objective of this paper is to introduce a neural-network-based algorithm for word clustering as an extension of the neural-network-based Boolean factor analysis algorithm (Frolov, 2007). It is shown that this extended algorithm supports an even more complex model of signals that are assumed to be related to textual documents. It is hypothesized that every topic in textual data is characterized by a set of words which coherently appear in documents dedicated to a given topic. The appearance of each word in a document is coded by the activity of a particular neuron. In accordance with the Hebbian learning rule implemented in the network, sets of coherently appearing words (treated as factors) create tightly connected groups of neurons, hence revealing them as attractors of the network dynamics. The found factors are eliminated from the network memory by a Hebbian unlearning rule, facilitating the search for other factors. Topics related to the found sets of words can be identified based on the words' semantics. To make the method complete, a special technique based on a Bayesian procedure has been developed for two purposes: first, to provide a complete description of factors in terms of component probability, and second, to enhance the accuracy of classifying whether a signal contains a given factor. Since it is assumed that every word may contribute to several topics, the proposed method is related to fuzzy clustering. In this paper, we show that the results of Boolean factor analysis and fuzzy clustering are not contradictory, but complementary. To demonstrate the capabilities of this approach, the method is applied to two types of textual data on neural networks in two different languages. The obtained topics and corresponding words show a good level of agreement despite the fact that identical topics at the Russian and English conferences contain different sets of keywords.

  2. A meta-analysis of peripheral blood nerve growth factor levels in patients with schizophrenia.

    Science.gov (United States)

    Qin, X-Y; Wu, H-T; Cao, C; Loh, Y P; Cheng, Y

    2017-09-01

Neurotrophins, particularly brain-derived neurotrophic factor (BDNF) and nerve growth factor (NGF), are crucial modulators in the neurodevelopment and maintenance of the central and peripheral nervous systems. The neurotrophin hypothesis of schizophrenia (SCZ) postulates that the changes in the brains of SCZ patients are the result of disturbances in developmental processes involving neurotrophic factors. This hypothesis has been supported mainly by the abnormal regulation of BDNF in SCZ, especially the decreased peripheral blood BDNF levels in SCZ patients validated by several meta-analyses. However, the regulation of NGF in SCZ remains unclear because of inconsistent findings from clinical studies. Therefore, we undertook, to the best of our knowledge, the first systematic review with a meta-analysis to quantitatively summarize the peripheral blood NGF data in SCZ patients compared with healthy control (HC) subjects. A systematic search of PubMed, PsycINFO and Web of Science identified 13 articles encompassing a sample of 1693 individuals for the meta-analysis. Random-effects meta-analysis showed that patients with SCZ had significantly decreased peripheral blood levels of NGF when compared with the HC subjects (Hedges's g = -0.633, 95% confidence interval (CI) = -0.948 to -0.318). Meta-regression analyses showed that age, gender and sample size had no moderating effects on the outcome of the meta-analysis, whereas disease severity might be a confounding factor. These results demonstrate that patients with SCZ have decreased peripheral blood NGF levels, strengthening the clinical evidence of an abnormal neurotrophin profile in patients with SCZ.
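
    The effect-size machinery behind such a meta-analysis is compact enough to show directly. The sketch below computes Hedges's g for each study and pools the studies with a DerSimonian-Laird random-effects model in Python; the three example studies are invented numbers, not data from the meta-analysis.

```python
# Hedges's g per study plus a DerSimonian-Laird random-effects pooled estimate.
# Illustrative sketch with made-up study data, not the paper's dataset.
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Bias-corrected standardized mean difference and its variance."""
    sp = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    g = j * d
    var = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var

# (mean_SCZ, sd_SCZ, n_SCZ, mean_HC, sd_HC, n_HC) -- invented example numbers.
studies = [(24.1, 8.0, 60, 30.5, 9.0, 55),
           (18.7, 6.5, 80, 22.0, 7.1, 82),
           (40.2, 12.3, 45, 44.8, 11.0, 50)]

g, v = np.array([hedges_g(*s) for s in studies]).T

# DerSimonian-Laird estimate of the between-study variance tau^2.
w = 1 / v
q = np.sum(w * (g - np.sum(w * g) / np.sum(w))**2)
tau2 = max(0.0, (q - (len(g) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_star = 1 / (v + tau2)
pooled = np.sum(w_star * g) / np.sum(w_star)
se = np.sqrt(1 / np.sum(w_star))
print(f"pooled g = {pooled:.3f}, 95% CI = [{pooled - 1.96*se:.3f}, {pooled + 1.96*se:.3f}]")
```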

  3. Analysis of Factors Associated With Rhytidectomy Malpractice Litigation Cases.

    Science.gov (United States)

    Kandinov, Aron; Mutchnick, Sean; Nangia, Vaibhuv; Svider, Peter F; Zuliani, Giancarlo F; Shkoukani, Mahdi A; Carron, Michael A

    2017-07-01

    This study investigates the financial burden of medical malpractice litigation associated with rhytidectomies, as well as factors that contribute to litigation and poor defendant outcomes, which can help guide physician practices. To comprehensively evaluate rhytidectomy malpractice litigation. Jury verdict and settlement reports related to rhytidectomy malpractice litigations were obtained using the Westlaw Next database. Use of medical malpractice in conjunction with several terms for rhytidectomy, to account for the various procedure names associated with the procedure, yielded 155 court cases. Duplicate and nonrelevant cases were removed, and 89 cases were included in the analysis and reviewed for outcomes, defendant specialty, payments, and other allegations raised in proceedings. Data were collected from November 21, 2015, to December 25, 2015. Data analysis took place from December 25, 2015, to January 20, 2016. A total of 89 cases met our inclusion criteria. Most plaintiffs were female (81 of 88 with known sex [92%]), and patient age ranged from 40 to 76 years (median age, 56 years). Fifty-three (60%) were resolved in the defendant's favor, while the remaining 36 cases (40%) were resolved with either a settlement or a plaintiff verdict payment. The mean payment was $1.4 million. A greater proportion of cases involving plastic surgeon defendants were resolved with payment compared with cases involving defendants with ear, nose, and throat specialty (15 [36%] vs 4 [24%]). The most common allegations raised in litigation were intraoperative negligence (61 [69%]), poor cosmesis or disfigurement (57 [64%]), inadequate informed consent (30 [34%]), additional procedures required (14 [16%]), postoperative negligence (12 [14%]), and facial nerve injury (10 [11%]). Six cases (7%) involved alleged negligence surrounding a "lifestyle-lift" procedure, which tightens or oversews the superficial muscular aponeurosis system layer. In this study, although most cases of

  4. Weightlifter Lumbar Physiology Health Influence Factor Analysis of Sports Medicine.

    Science.gov (United States)

    Zhang, Xiangyang

    2015-01-01

Chinese women's weightlifting has long been at the top world level, which suggests that Chinese coaches and athletes have accumulated considerable successful experience in weightlifting training. Weightlifting is nevertheless a high-risk sport with respect to lumbar spine injury; some promising young athletes have had to retire because of lumbar trauma, wasting both national investment and the athletes' effort. This article analyzes the training situation of weightlifting athletes from the perspective of sports medicine and puts forward suggestions aimed at avoiding lumbar injury and safeguarding athletes' health. A survey of 50 professional female weightlifters found that 82% suffered from symptoms of lumbar disease, caused mainly by three factors: lumbar strain, excessive training intensity and faulty technique. From the standpoint of sports medicine, and taking into account the structural characteristics of the human skeleton, a structural-mechanics analysis of the athletes' lumbar spine identifies the two technical movements that load the lumbar region most heavily, and these movements are studied and standardized so as to minimize lumbar loading and contribute to the health of the athletes' lumbar spine.

  5. Efficiency limit factor analysis for the Francis-99 hydraulic turbine

    Science.gov (United States)

    Zeng, Y.; Zhang, L. X.; Guo, J. P.; Guo, Y. K.; Pan, Q. L.; Qian, J.

    2017-01-01

Energy loss in a hydraulic turbine is the most direct factor affecting its efficiency. Based on the theory of internal energy loss analysis for hydraulic turbines, and combining it with the measurement data of the Francis-99 test case, this paper calculates the characteristic parameters of the turbine's internal energy losses and establishes a calculation model of hydraulic turbine power. Taking the start-up test conditions given by Francis-99 as a case study, the transient characteristics and transformation law of the turbine's internal energy are investigated. Further, from an analysis of mechanical friction in the turbine, we consider the main component of mechanical friction loss to be the rotational friction loss between the rotating runner and the surrounding water, defined here as the internal mechanical friction loss, and a rough calculation method for it is given. Our purpose is to explore methods for increasing the energy conversion efficiency of the water flow by analyzing the energy losses in the hydraulic turbine.

  6. Network-based prediction and analysis of HIV dependency factors.

    Directory of Open Access Journals (Sweden)

    T M Murali

    2011-09-01

HIV Dependency Factors (HDFs) are a class of human proteins that are essential for HIV replication, but are not lethal to the host cell when silenced. Three previous genome-wide RNAi experiments identified HDF sets with little overlap. We combine data from these three studies with a human protein interaction network to predict new HDFs, using an intuitive algorithm called SinkSource and four other algorithms published in the literature. Our algorithm achieves high precision and recall upon cross validation, as do the other methods. A number of HDFs that we predict are known to interact with HIV proteins. They belong to multiple protein complexes and biological processes that are known to be manipulated by HIV. We also demonstrate that many predicted HDF genes show significantly different programs of expression in early response to SIV infection in two non-human primate species that differ in AIDS progression. Our results suggest that many HDFs are yet to be discovered and that they have potential value as prognostic markers to determine pathological outcome and the likelihood of AIDS development. More generally, if multiple genome-wide gene-level studies have been performed at independent labs to study the same biological system or phenomenon, our methodology is applicable to interpret these studies simultaneously in the context of molecular interaction networks and to ask if they reinforce or contradict each other.
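
    The general flavour of the network-propagation scoring used here can be illustrated with a few lines of NumPy. The sketch below is a generic guilt-by-association score on a weighted protein network (known HDFs pinned to 1, known negatives to 0, unlabeled scores iterated to convergence); it conveys the spirit of algorithms such as SinkSource, but it is not the authors' implementation, and the toy adjacency matrix and node indices are invented.

```python
# Generic network-propagation sketch: scores of unlabeled proteins become the
# weighted average of their neighbours' scores, with labeled nodes held fixed.
# Not the authors' SinkSource code; a toy random graph stands in for the
# human protein interaction network.
import numpy as np

rng = np.random.default_rng(0)
n = 50
W = rng.random((n, n)) * (rng.random((n, n)) < 0.1)     # sparse random weights
W = np.triu(W, 1); W = W + W.T                          # symmetric, no self-loops

scores = np.full(n, 0.5)
positives = np.array([0, 1, 2])        # known HDFs (toy indices)
negatives = np.array([3, 4])           # known non-HDFs (toy indices)
scores[positives], scores[negatives] = 1.0, 0.0
fixed = np.zeros(n, dtype=bool); fixed[positives] = fixed[negatives] = True

deg = W.sum(axis=1)
for _ in range(200):                   # iterate to (approximate) convergence
    new = np.where(deg > 0, W @ scores / np.maximum(deg, 1e-12), scores)
    new[fixed] = scores[fixed]         # labeled nodes keep their values
    if np.max(np.abs(new - scores)) < 1e-6:
        scores = new
        break
    scores = new

top = np.argsort(-scores)[:10]
print("top-scoring candidate proteins:", [i for i in top if not fixed[i]])
```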

  7. Evidence for genetic factors explaining the association between birth weight and low-density lipoprotein cholesterol and possible intrauterine factors influencing the association between birth weight and high-density lipoprotein cholesterol: Analysis in twins

    NARCIS (Netherlands)

    IJzerman, R.G.; Stehouwer, C.D.A.; van Weissenbruch, M.M.; de Geus, E.J.C.; Boomsma, D.I.

    2001-01-01

    Recent studies have demonstrated an association between low weight at birth and an atherogenic lipid profile in later life. To examine the influences of intrauterine and genetic factors, we investigated 53 dizygotic and 61 monozygotic adolescent twin pairs. Regression analysis demonstrated that low

  8. A Rasch and factor analysis of the Functional Assessment of Cancer Therapy-General (FACT-G

    Directory of Open Access Journals (Sweden)

    Selby Peter J

    2007-04-01

Background: Although the Functional Assessment of Cancer Therapy – General questionnaire (FACT-G) has been validated, few studies have explored the factor structure of the instrument, in particular using non-sample-dependent measurement techniques such as Rasch models. Furthermore, few studies have explored the relationship between item fit to the Rasch model and clinical utility. The aim of this study was to investigate the dimensionality and measurement properties of the FACT-G with Rasch models and factor analysis. Methods: A factor analysis and Rasch analysis (Partial Credit Model) were carried out on the FACT-G completed by a heterogeneous sample of cancer patients (n = 465). For the Rasch analysis, item fit (infit mean squares ≥ 1.30), dimensionality and item invariance were assessed. The impact of removing misfitting items on the clinical utility of the subscales and the FACT-G total scale was also assessed. Results: The factor analysis demonstrated a four-factor structure of the FACT-G which broadly corresponded to the four subscales of the instrument. Internal consistency for these four scales was very good (Cronbach's alpha 0.72 – 0.85). The Rasch analysis demonstrated that each of the subscales and the FACT-G total scale had misfitting items (infit mean square ≥ 1.30). All these scales, with the exception of the Social & Family Well-being Scale (SFWB), were unidimensional. When misfitting items were removed, the effect sizes and the clinical utility of the instrument were maintained for the subscales and the total FACT-G scores. Conclusion: The results of the traditional factor analysis and the Rasch analysis of the FACT-G broadly agreed. Caution should be exercised when utilising the Social & Family Well-being scale, and further work is required to determine whether this scale is best represented by two factors. Additionally, removing misfitting items from scales should be performed alongside an assessment of the impact on clinical utility.
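
    Item fit in a Rasch analysis is judged from residual-based statistics such as the infit mean square quoted above. The fragment below computes infit for dichotomous Rasch items in Python, given person abilities and item difficulties; it is a simplified dichotomous illustration (the study itself used the polytomous Partial Credit Model), and the abilities, difficulties and responses are simulated rather than FACT-G data.

```python
# Infit mean-square statistics for dichotomous Rasch items (simplified
# illustration; the study used the Partial Credit Model).
import numpy as np

rng = np.random.default_rng(3)
n_persons, n_items = 465, 27
theta = rng.normal(size=n_persons)          # person abilities (simulated)
beta = rng.normal(size=n_items)             # item difficulties (simulated)

# Rasch model probabilities and simulated 0/1 responses.
P = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
X = (rng.random((n_persons, n_items)) < P).astype(float)

# Infit MnSq per item: sum of squared residuals over sum of model variances.
resid2 = (X - P) ** 2
info = P * (1 - P)
infit = resid2.sum(axis=0) / info.sum(axis=0)

flagged = np.where(infit >= 1.30)[0]        # the misfit threshold used above
print("infit range:", infit.min().round(2), "-", infit.max().round(2))
print("items flagged as misfitting:", flagged)
```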

  9. Self-Compassion Scale: IRT Psychometric Analysis, Validation, and Factor Structure – Slovak Translation

    Directory of Open Access Journals (Sweden)

    Júlia Halamová

    2018-01-01

The present study verifies the psychometric properties of the Slovak version of the Self-Compassion Scale through item response theory, factor analysis, validity analyses and norm development. The surveyed sample consisted of 1,181 participants (34% men and 66% women) with a mean age of 30.30 years (SD = 12.40). Two general factors (Self-compassionate responding and Self-uncompassionate responding) were identified, whereas there was no support for a single general factor of the scale or for six subscales. The results of the factor analysis were supported by an independent sample of 676 participants. Therefore, the use of a total score for the whole scale would be inappropriate. In Slovak, the Self-Compassion Scale should be used in the form of two general subscales (Self-compassionate responding and Self-uncompassionate responding). In line with our theoretical assumptions, we obtained relatively high Spearman's correlation coefficients between the Self-Compassion Scale and related external variables, demonstrating construct validity for the scale. To sum up, the Slovak translation of the Self-Compassion Scale is a reliable and valid instrument that measures Self-compassionate responding and Self-uncompassionate responding.

  10. Confirmatory Factor Analysis of a Questionnaire Measure of Managerial Stigma Towards Employee Depression.

    Science.gov (United States)

    Martin, Angela J; Giallo, Rebecca

    2016-12-01

Managers' attitudes play a key role in how organizations respond to employees with depression. We examine the measurement properties of a questionnaire designed to assess managerial stigma towards employees with depression. Using data from a sample of 469 Australian managers representing a wide range of industries and work settings, we conducted a confirmatory factor analysis to assess three proposed subscales representing affective, cognitive and behavioural forms of stigma. Results were equivocal, indicating acceptable fit for two-factor (affective and cognitive + behavioural), three-factor (affective, cognitive and behavioural) and higher-order models. Failure to demonstrate the discriminant validity of the cognitive and behavioural dimensions, even though they are theoretically distinct, suggests that further work on the scale is warranted. These results extend the psychometric profile of this measure (exploratory factor analysis; Martin). Development of strategies to operationalize this construct will benefit occupational health research and practice, particularly in interventions that aim to reduce the stigma of mental health issues in the workplace or where managers' attitudes are a key mechanism in intervention efficacy. We encourage future research on this measure, pertaining in particular to further enhancing all aspects of its construct validity. Copyright © 2015 John Wiley & Sons, Ltd.

  11. Periodic tests: a human factors analysis of documentary aspects

    International Nuclear Information System (INIS)

    Perinet, Romuald; Rousseau, Jean-Marie

    2007-01-01

conclusions of this analysis are presented in this paper. The analysis carried out by the IRSN showed that the complexity of the design and implementation process of periodic tests is due to the diversity of the organizations and participants, the number and the heterogeneity of the documents, and the technical and regulatory complexity of operation. In this context, defects related to the quality of the national reference document updates and to the conditions of their delivery were at the origin of difficulties in CNPEs. These difficulties address the integration of the updates by the participants, the overall vision of the rules to be respected, and the management of the workload to deal with these tasks. The analysis showed that CNPEs made efforts to produce reliable, station-specific updates, but improvements could still be made concerning the organization, the communication and the ergonomics of the operating ranges. More generally, from a human and organizational factors point of view, such an analysis surpasses the search for responsibility for the dysfunctions and allows for a more objective explanation of the encountered difficulties (inapplicable rules, delays of delivery, etc.). It also leads to a consolidated needs analysis in order to improve the global process. On the basis of a preliminary analysis, EDF has identified a plan for improvement. EDF has decided to deal with the improvement of the process within the framework of a thorough study. (authors)

  12. Pore pressure measurement plan of near field rock used on three dimensional groundwater flow analysis in demonstration test of cavern type disposal facility

    International Nuclear Information System (INIS)

    Onuma, Kazuhiro; Terada, Kenji; Matsumura, Katsuhide; Koyama, Toshihiro; Yajima, Kazuaki

    2008-01-01

A demonstration test of an underground cavern-type disposal facility is planned, carried out by constructing a full-scale engineered barrier system that simulates the facility in an underground space at full scale and under actual environmental conditions. The test consists of three parts: a construction test, a performance test and a measurement test. The hydrological behaviour of the near-field rock mass is measured during and after construction to evaluate effects at the test facility. To draw up the pore pressure measurement plan, a three-dimensional groundwater flow analysis has been carried out, and a detailed plan has been studied based on a comparison of the analyses before and after the test. (author)

  13. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
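
    The same idea, estimating the correlation structure from incomplete data before factoring it rather than deleting cases, can be sketched outside SPSS. The Python fragment below uses scikit-learn's IterativeImputer followed by a factor analysis; this is an imputation-based stand-in for the EM-covariance approach the article describes, not a reproduction of the MVA/FACTOR macros, and the file `survey_items.csv` and the three-factor choice are hypothetical.

```python
# Imputation-based stand-in for the EM-matrix workflow described above:
# estimate plausible values for missing item responses, then run EFA.
# Not equivalent to the SPSS MVA + FACTOR macros, just the same general idea.
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.decomposition import FactorAnalysis

# Hypothetical questionnaire data with scattered missing responses.
items = pd.read_csv("survey_items.csv")

imputed = IterativeImputer(max_iter=20, random_state=0).fit_transform(items)

fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(imputed)
loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                        columns=["F1", "F2", "F3"])
print(loadings.round(2))
```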

  14. Complementing approaches to demonstrate chlorinated solvent biodegradation in a complex pollution plume: mass balance, PCR and compound-specific stable isotope analysis.

    OpenAIRE

    Courbet Christelle; Rivière Agnès; Jeannottat Simon; Rinaldi Sandro; Hunkeler Daniel; Bendjoudi Hocine; De Marsily Ghislain

    2011-01-01

This work describes the use of different complementary methods (mass balance, polymerase chain reaction assays and compound-specific stable isotope analysis) to demonstrate the existence and effectiveness of biodegradation of chlorinated solvents in an alluvial aquifer. The solvent-contaminated site is an old chemical factory located in an alluvial plain in France. As most of the chlorinated contaminants currently found in the groundwater at this site were produced by local industries at vario...

  15. Prevailing of ischemia cardiopathy, demonstrated by gammagraphy in less than 40 years old persons and its association with risk factors; Prevalencia de cardiopatia isquemica, demostrada por gammagrafia en menores de 40 anos y su asociacion con factores de riesgo

    Energy Technology Data Exchange (ETDEWEB)

    Cano G, M.A.; Castillo M, L.; Orea T, A. [Departamento de Medicina Nuclear, Departamento de Cardioiogia del Insitituto Nacional de Ciencias Medicas y Nutricion Salvador Zubiran. Mexico Distrito Federal (Mexico)

    2005-07-01

Coronary artery disease (EAC) is the leading cause of death among Mexicans. Among its numerous risk factors, age stands out, with risk rising markedly from 45 years onward. The objective of this investigation was to determine the prevalence of ischemic cardiopathy (CI) and myocardial infarction (IAM) in subjects younger than 40 years and to identify risk factors. Myocardial perfusion imaging (EPM) is a non-invasive study of high sensitivity and specificity that allows obstructive coronary lesions to be detected. The method used was a retrospective cross-sectional study of 125 patients younger than 40 years. Files of patients who had undergone EPM with Technetium-99m-SESTAMIBI (one-day protocol) were reviewed, and the short and long axes (vertical and horizontal) were analyzed. General data, somatometry, emotional profile, and lipid and glucose profiles were gathered. Results: the population comprised 53% women and 47% men, with a mean age of 31.9 years and a body mass index (IMC) of 25.1 kg/m². Abnormal studies were obtained in 46% of cases, of which 35% were compatible with ischemic cardiopathy (CI) and 11% with myocardial infarction (IAM). The characteristics of these groups were: age 31.6±6 vs 32.6±5.9 years; IMC 25.4±7.0 vs 24.4±3.34 kg/m²; height 161.6±9.8 vs 165.5±9.7 cm; systolic blood pressure (TAS) 139.1±29.2 vs 115±13.4 mm Hg; diastolic blood pressure (TAD) 84.5±17.4 vs 75±9.4 mm Hg; married civil status 65.5% (p=0.005) vs single 57%; major depression 32% vs anxiety 28%, in the CI and IAM groups, respectively. In the IAM population, chronic renal failure (IRC) 21% (p=0.030), systemic arterial hypertension (HAS) 21% (p=0.025) and drug addiction 21% (p=0.002) were additionally found. The rest of the results showed no significant differences. Conclusion: only 6.5% of the patients referred for EPM-99mTc-SESTAMIBI over a 6-year period were younger than 40 years; 71% of them were referred for precordial pain, and in almost half of these CI or IAM was evidenced

  16. Prevailing of ischemia cardiopathy, demonstrated by gammagraphy in less than 40 years old persons and its association with risk factors; Prevalencia de cardiopatia isquemica, demostrada por gammagrafia en menores de 40 anos y su asociacion con factores de riesgo

    Energy Technology Data Exchange (ETDEWEB)

    Cano G, M A; Castillo M, L; Orea T, A [Departamento de Medicina Nuclear, Departamento de Cardioiogia del Insitituto Nacional de Ciencias Medicas y Nutricion Salvador Zubiran. Mexico Distrito Federal (Mexico)

    2005-07-01

Coronary artery disease (EAC) is the leading cause of death among Mexicans. Among its numerous risk factors, age stands out, with risk rising markedly from 45 years onward. The objective of this investigation was to determine the prevalence of ischemic cardiopathy (CI) and myocardial infarction (IAM) in subjects younger than 40 years and to identify risk factors. Myocardial perfusion imaging (EPM) is a non-invasive study of high sensitivity and specificity that allows obstructive coronary lesions to be detected. The method used was a retrospective cross-sectional study of 125 patients younger than 40 years. Files of patients who had undergone EPM with Technetium-99m-SESTAMIBI (one-day protocol) were reviewed, and the short and long axes (vertical and horizontal) were analyzed. General data, somatometry, emotional profile, and lipid and glucose profiles were gathered. Results: the population comprised 53% women and 47% men, with a mean age of 31.9 years and a body mass index (IMC) of 25.1 kg/m². Abnormal studies were obtained in 46% of cases, of which 35% were compatible with ischemic cardiopathy (CI) and 11% with myocardial infarction (IAM). The characteristics of these groups were: age 31.6±6 vs 32.6±5.9 years; IMC 25.4±7.0 vs 24.4±3.34 kg/m²; height 161.6±9.8 vs 165.5±9.7 cm; systolic blood pressure (TAS) 139.1±29.2 vs 115±13.4 mm Hg; diastolic blood pressure (TAD) 84.5±17.4 vs 75±9.4 mm Hg; married civil status 65.5% (p=0.005) vs single 57%; major depression 32% vs anxiety 28%, in the CI and IAM groups, respectively. In the IAM population, chronic renal failure (IRC) 21% (p=0.030), systemic arterial hypertension (HAS) 21% (p=0.025) and drug addiction 21% (p=0.002) were additionally found. The rest of the results showed no significant differences. Conclusion: only 6.5% of the patients referred for EPM-99mTc-SESTAMIBI over a 6-year period were younger than 40 years; 71% of them were referred for precordial pain, and in almost half of these CI or IAM was evidenced

  17. Analysis of Performance Factors for Accounting and Finance Related Business Courses in a Distance Education Environment

    Science.gov (United States)

    Benligiray, Serdar; Onay, Ahmet

    2017-01-01

The objective of this study is to explore performance factors for business courses, with a focus on accounting and finance. Course score interrelations are assumed to represent interpretable constructs of these factors. Factor analysis is proposed to identify the constructs that explain the correlations. The factor analysis results identify three…

  18. Direct demonstration of rapid insulin-like growth factor II receptor internalization and recycling in rat adipocytes. Insulin stimulates 125I-insulin-like growth factor II degradation by modulating the IGF-II receptor recycling process

    International Nuclear Information System (INIS)

    Oka, Y.; Rozek, L.M.; Czech, M.P.

    1985-01-01

The photoactive insulin-like growth factor (IGF)-II analogue 4-azidobenzoyl-125I-IGF-II was synthesized and used to label specifically and covalently the Mr = 250,000 Type II IGF receptor. When rat adipocytes are irradiated after a 10-min incubation with 4-azidobenzoyl-125I-IGF-II at 10 degrees C and immediately homogenized, most of the labeled IGF-II receptors are associated with the plasma membrane fraction, indicating that receptors accessible to the labeling reagent at low temperature are on the cell surface. However, when the photolabeled cells are incubated at 37 degrees C for various times before homogenization, labeled IGF-II receptors are rapidly internalized with a half-time of 3.5 min, as evidenced by a loss from the plasma membrane fraction and a concomitant appearance in the low density microsome fraction. The steady state level of cell surface IGF-II receptors in the presence or absence of IGF-II remains constant under these conditions, demonstrating that IGF-II receptors rapidly recycle back to the cell surface at the same rate as receptor internalization. Using the above methodology, it is shown that acute insulin action: 1) increases the steady state number of cell surface IGF-II receptors; 2) increases the number of ligand-bound IGF-II receptors that are internalized per unit of time; and 3) increases the rate of cellular 125I-IGF-II degradation by a process that is blocked by anti-IGF-II receptor antibody

  19. Electric vehicle demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Ouellet, M. [National Centre for Advanced Transportation, Saint-Jerome, PQ (Canada)

    2010-07-01

    The desirable characteristics of Canadian projects that demonstrate vehicle use in real-world operation and the appropriate mechanism to collect and disseminate the monitoring data were discussed in this presentation. The scope of the project was on passenger cars and light duty trucks operating in plug-in electric vehicle (PHEV) or battery electric vehicle modes. The presentation also discussed the funding, stakeholders involved, Canadian travel pattern analysis, regulatory framework, current and recent electric vehicle demonstration projects, and project guidelines. It was concluded that some demonstration project activities may have been duplicated as communication between the proponents was insufficient. It was recommended that data monitoring using automatic data logging with minimum reliance on logbooks and other user entry should be emphasized. figs.

  20. Trenton ICES: demonstration of a grid-connected integrated community energy system. Phase II. Volumes 1 and 2. Preliminary design of ICES system and analysis of community ownership

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-22

    Preliminary design and evaluation for the system has been carried out. The findings of this study are: (1) it is technically feasible, utilizing commercially available hardware; (2) under utility ownership and operation, it will not be economically competitive with conventional alternatives for heating and cooling buildings (analysis contained in companion report under separate cover); (3) under utility ownership and operation, no restrictions have been identified that would prevent the project from proceeding; (4) under community ownership, preliminary analysis indicates that thermal energy produced by Trenton ICES will be approximately 12 percent less expensive than thermal energy produced by oil-fired boilers; and (5) a review and update of institutional analyses performed during Phase 2 has identified no factors that would preclude community ownership and operation of the Trenton ICES. The background data produced for the analysis of the Trenton ICES based on utility ownership and operation can, in large part, be used as the bases for a detailed analysis of community ownership.

  1. The sun protection factor (SPF) inadequately defines broad spectrum photoprotection: demonstration using skin reconstructed in vitro exposed to UVA, UVB or UV-solar simulated radiation.

    Science.gov (United States)

    Bernerd, Françoise; Vioux, Corinne; Lejeune, François; Asselineau, Daniel

    2003-01-01

    Wavelength specific biological damage has been previously identified in human skin reconstructed in vitro. Sunburn cells and pyrimidine dimers were found after UVB exposure, and alterations of dermal fibroblasts after UVA exposure. These types of damage made it possible to discriminate between single UVB and UVA absorbers. The present study shows that these biological effects can be obtained simultaneously by a combined UVB + UVA exposure using ultraviolet solar simulated light (UV-SSR), which represents a relevant UV source. In addition, the protection afforded by two broad spectrum sunscreen complex formulations was assessed after topical application. These two formulations displayed the same sun protection factor but different UVA protection factors determined by the persistent pigment darkening (PPD) method. Dose-response experiments with UVA or UV-SSR showed that the preparation with the highest PF-UVA provided better protection with regard to dermal damage compared to the other formulation. Using an original UVB source to obtain the UVB portion of the SSR spectrum, the preparations provided the same protection. This study strikingly illustrates the fact that the photoprotection afforded by two sunscreen formulations having similar SPF values is not equal with regard to dermal damage related to photoaging.

  2. Factor Analysis of Drawings: Application to College Student Models of the Greenhouse Effect

    Science.gov (United States)

    Libarkin, Julie C.; Thomas, Stephen R.; Ording, Gabriel

    2015-01-01

    Exploratory factor analysis was used to identify models underlying drawings of the greenhouse effect made by over 200 entering university freshmen. Initial content analysis allowed deconstruction of drawings into salient features, with grouping of these features via factor analysis. A resulting 4-factor solution explains 62% of the data variance,…

  3. Multiple factor analysis of metachronous upper urinary tract transitional cell carcinoma after radical cystectomy

    Directory of Open Access Journals (Sweden)

    P. Wang

    2007-07-01

    Full Text Available Transitional cell carcinoma (TCC) of the urothelium is often multifocal and subsequent tumors may occur anywhere in the urinary tract after the treatment of a primary carcinoma. Patients initially presenting with bladder cancer are at significant risk of developing metachronous tumors in the upper urinary tract (UUT). We evaluated the prognostic factors of primary invasive bladder cancer that may predict a metachronous UUT TCC after radical cystectomy. The records of 476 patients who underwent radical cystectomy for primary invasive bladder TCC from 1989 to 2001 were reviewed retrospectively. The prognostic factors of UUT TCC were determined by multivariate analysis using the Cox proportional hazards regression model. Kaplan-Meier analysis was also used to assess the variable incidence of UUT TCC according to different risk factors. Twenty-two patients (4.6%) developed metachronous UUT TCC. Multiplicity, prostatic urethral involvement by the bladder cancer and the associated carcinoma in situ (CIS) were significant and independent factors affecting the occurrence of metachronous UUT TCC (P = 0.0425, 0.0082, and 0.0006, respectively). These results were supported, to some extent, by analysis of the UUT TCC disease-free rate by the Kaplan-Meier method, whereby patients with prostatic urethral involvement or with associated CIS demonstrated a significantly lower metachronous UUT TCC disease-free rate than patients without prostatic urethral involvement or without associated CIS (log-rank test, P = 0.0116 and 0.0075, respectively). Multiple tumors, prostatic urethral involvement and associated CIS were risk factors for metachronous UUT TCC, a conclusion that may be useful for designing follow-up strategies for primary invasive bladder cancer after radical cystectomy.
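
    As a rough illustration of the type of analysis described above (a multivariate Cox proportional hazards model plus Kaplan-Meier comparison of disease-free rates), the sketch below uses the Python lifelines package; the file name and column names are hypothetical placeholders, not taken from the study.

```python
# Minimal sketch: Cox proportional hazards for candidate risk factors and
# Kaplan-Meier curves stratified by one factor. All column names are
# hypothetical; `lifelines` is a general-purpose survival-analysis package.
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.read_csv("cystectomy_cohort.csv")  # hypothetical: one row per patient

# Multivariate Cox model: time to metachronous UUT TCC vs. candidate risk factors
cph = CoxPHFitter()
cph.fit(
    df[["followup_months", "uut_tcc_event", "multiplicity", "prostatic_urethra", "cis"]],
    duration_col="followup_months",
    event_col="uut_tcc_event",
)
cph.print_summary()  # hazard ratios, confidence intervals and p-values per factor

# Kaplan-Meier disease-free curves stratified by associated CIS status
kmf = KaplanMeierFitter()
for label, group in df.groupby("cis"):
    kmf.fit(group["followup_months"], group["uut_tcc_event"], label=f"CIS={label}")
    kmf.plot_survival_function()
```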

  4. Factors influencing crime rates: an econometric analysis approach

    Science.gov (United States)

    Bothos, John M. A.; Thomopoulos, Stelios C. A.

    2016-05-01

    The scope of the present study is to research the dynamics that determine the commission of crimes in US society. Our study is part of a model we are developing to understand urban crime dynamics and to enhance citizens' "perception of security" in large urban environments. The main targets of our research are to highlight the dependence of crime rates on certain social and economic factors and on basic elements of state anticrime policies. In conducting our research, we use as guides previous relevant studies on crime dependence that have applied similar quantitative analyses, using statistics and econometric modelling, to the dependence of crime on certain social and economic factors. Our first approach consists of conceptual state space dynamic cross-sectional econometric models that incorporate a feedback loop that describes crime as a feedback process. In order to define dynamically the model variables, we use statistical analysis on crime records and on records about social and economic conditions and policing characteristics (like police force and policing results - crime arrests), to determine their influence as independent variables on crime, as the dependent variable of our model. The econometric models we apply in this first approach are an exponential log linear model and a logit model. In a second approach, we study the evolution of violent crime through time in the US, independently as an autonomous social phenomenon, using autoregressive and moving average time-series econometric models. Our findings show that there are certain social and economic characteristics that affect the formation of crime rates in the US, either positively or negatively. Furthermore, the results of our time-series econometric modelling show that violent crime, viewed solely and independently as a social phenomenon, correlates with previous years' crime rates and depends on the social and economic conditions of previous years.
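
    The two modelling strands mentioned above (a cross-sectional limited-dependent-variable model and an autoregressive moving-average model of violent crime over time) can be sketched with statsmodels as follows; the data files and variable names are hypothetical placeholders, not the authors' actual specification.

```python
# Sketch of a cross-sectional logit on socio-economic covariates and an ARMA
# model of a yearly violent-crime series. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.arima.model import ARIMA

panel = pd.read_csv("crime_cross_section.csv")           # hypothetical data
X = sm.add_constant(panel[["unemployment", "median_income",
                           "police_per_capita", "arrest_rate"]])
logit = sm.Logit(panel["high_crime_rate"], X).fit()      # binary crime indicator
print(logit.summary())

rates = pd.read_csv("violent_crime_rate.csv", index_col=0).squeeze()  # yearly series
arma = ARIMA(rates, order=(1, 0, 1)).fit()               # ARMA(1,1): AR and MA terms
print(arma.summary())
```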

  5. Antares: preliminary demonstrator results

    International Nuclear Information System (INIS)

    Kouchner, A.

    2000-05-01

    The ANTARES collaboration is building an undersea neutrino telescope off Toulon (Mediterranean sea) with an effective area of ∼0.1 km². An extensive study of the site properties has been carried out, together with software analysis, in order to optimize the performance of the detector. Results are summarized here. An instrumented line, linked to shore for the first time via an electro-optical cable, was immersed in late 1999. The preliminary results of this demonstrator line are reported. (author)

  6. The Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Aguayo, Estanislao; Fast, James E.; Hoppe, Eric W.; Keillor, Martin E.; Kephart, Jeremy D.; Kouzes, Richard T.; LaFerriere, Brian D.; Merriman, Jason H.; Orrell, John L.; Overman, Nicole R.; Avignone, Frank T.; Back, Henning O.; Combs, Dustin C.; Leviner, L.; Young, A.; Barabash, Alexander S.; Konovalov, S.; Vanyushin, I.; Yumatov, Vladimir; Bergevin, M.; Chan, Yuen-Dat; Detwiler, Jason A.; Loach, J. C.; Martin, R. D.; Poon, Alan; Prior, Gersende; Vetter, Kai; Bertrand, F.; Cooper, R. J.; Radford, D. C.; Varner, R. L.; Yu, Chang-Hong; Boswell, M.; Elliott, S.; Gehman, Victor M.; Hime, Andrew; Kidd, M. F.; LaRoque, B. H.; Rielage, Keith; Ronquest, M. C.; Steele, David; Brudanin, V.; Egorov, Viatcheslav; Gusey, K.; Kochetov, Oleg; Shirchenko, M.; Timkin, V.; Yakushev, E.; Busch, Matthew; Esterline, James H.; Tornow, Werner; Christofferson, Cabot-Ann; Horton, Mark; Howard, S.; Sobolev, V.; Collar, J. I.; Fields, N.; Creswick, R.; Doe, Peter J.; Johnson, R. A.; Knecht, A.; Leon, Jonathan D.; Marino, Michael G.; Miller, M. L.; Robertson, R. G. H.; Schubert, Alexis G.; Wolfe, B. A.; Efremenko, Yuri; Ejiri, H.; Hazama, R.; Nomachi, Masaharu; Shima, T.; Finnerty, P.; Fraenkle, Florian; Giovanetti, G. K.; Green, M.; Henning, Reyco; Howe, M. A.; MacMullin, S.; Phillips, D.; Snavely, Kyle J.; Strain, J.; Vorren, Kris R.; Guiseppe, Vincente; Keller, C.; Mei, Dong-Ming; Perumpilly, Gopakumar; Thomas, K.; Zhang, C.; Hallin, A. L.; Keeter, K.; Mizouni, Leila; Wilkerson, J. F.

    2011-09-03

    A brief review of the history and neutrino physics of double beta decay is given. A description of the MAJORANA DEMONSTRATOR research and development program, including background reduction techniques, is presented in some detail. The application of point contact (PC) detectors to the experiment is discussed, including the effectiveness of pulse shape analysis. The predicted sensitivity of a PC detector array enriched to 86% in 76Ge is given.

  7. A Rasch and confirmatory factor analysis of the General Health Questionnaire (GHQ-12)

    Directory of Open Access Journals (Sweden)

    Velikova Galina

    2010-04-01

    Full Text Available Abstract Background The General Health Questionnaire (GHQ-12) was designed as a short questionnaire to assess psychiatric morbidity. Despite the fact that studies have suggested a number of competing multidimensional factor structures, it continues to be largely used as a unidimensional instrument. This may have an impact on the identification of psychiatric morbidity in target populations. The aim of this study was to explore the dimensionality of the GHQ-12 and to evaluate a number of alternative models for the instrument. Methods The data were drawn from a large heterogeneous sample of cancer patients. The Partial Credit Model (Rasch) was applied to the 12-item GHQ. Item misfit (infit mean square ≥ 1.3) was identified, misfitting items removed and unidimensionality and differential item functioning (age, gender, and treatment aims) were assessed. The factor structures of the various alternative models proposed in the literature were explored and optimum model fit evaluated using Confirmatory Factor Analysis. Results The Rasch analysis of the 12-item GHQ identified six misfitting items. Removal of these items produced a six-item instrument which was not unidimensional. The Rasch analysis of an 8-item GHQ demonstrated two unidimensional structures corresponding to Anxiety/Depression and Social Dysfunction. No significant differential item functioning was observed by age, gender and treatment aims for the six- and eight-item GHQ. Two models competed for best fit from the confirmatory factor analysis, namely the GHQ-8 and Hankin's (2008) unidimensional model, however, the GHQ-8 produced the best overall fit statistics. Conclusions The results are consistent with the evidence that the GHQ-12 is a multi-dimensional instrument. Use of the summated scores for the GHQ-12 could potentially lead to an incorrect assessment of patients' psychiatric morbidity. Further evaluation of the GHQ-12 with different target populations is warranted.

  8. Common Factor Analysis Versus Principal Component Analysis: Choice for Symptom Cluster Research

    Directory of Open Access Journals (Sweden)

    Hee-Ju Kim, PhD, RN

    2008-03-01

    Conclusion: If the study purpose is to explain correlations among variables and to examine the structure of the data (this is usual for most cases in symptom cluster research), CFA provides a more accurate result. If the purpose of a study is to summarize data with a smaller number of variables, PCA is the choice. PCA can also be used as an initial step in CFA because it provides information regarding the maximum number and nature of factors. In using factor analysis for symptom cluster research, several issues need to be considered, including subjectivity of solution, sample size, symptom selection, and level of measurement.
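
    The distinction drawn in this conclusion can be made concrete with scikit-learn, where principal component analysis models total variance while common factor analysis models only the shared variance plus per-variable noise; the symptom matrix below is a random placeholder, not real patient data.

```python
# Illustrative contrast between PCA (summarizes total variance) and common
# factor analysis (models shared variance plus unique, per-variable noise).
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(0)
symptoms = rng.normal(size=(200, 10))   # placeholder (patients x symptoms) matrix

pca = PCA(n_components=3).fit(symptoms)
fa = FactorAnalysis(n_components=3).fit(symptoms)

print("PCA explained variance ratios:", pca.explained_variance_ratio_.round(2))
print("FA loadings:\n", fa.components_.round(2))
print("FA unique (noise) variances:", fa.noise_variance_.round(2))
```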

  9. Demonstration of immunochemical identity between the nerve growth factor-inducible large external (NILE) glycoprotein and the cell adhesion molecule L1

    DEFF Research Database (Denmark)

    Bock, E; Richter-Landsberg, C; Faissner, A

    1985-01-01

    The nerve growth factor-inducible large external (NILE) glycoprotein and the neural cell adhesion molecule L1 were shown to be immunochemically identical. Immunoprecipitation with L1 and NILE antibodies of [3H]fucose-labeled material from culture supernatants and detergent extracts of NGF-treated rat PC12 pheochromocytoma cells yielded comigrating bands by SDS-PAGE. NILE antibodies reacted with immunopurified L1 antigen, but not with N-CAM and other L2 epitope-bearing glycoproteins from adult mouse brain. Finally, by sequential immunoprecipitation from detergent extracts of [35S]methionine-labeled early post-natal cerebellar cell cultures or [3H]fucose-labeled NGF-treated PC12 cells, all immunoreactivity for NILE antibody could be removed by pre-clearing with L1 antibody and vice versa.

  10. Postauthorization safety surveillance of ADVATE [antihaemophilic factor (recombinant), plasma/albumin-free method] demonstrates efficacy, safety and low-risk for immunogenicity in routine clinical practice.

    Science.gov (United States)

    Oldenburg, J; Goudemand, J; Valentino, L; Richards, M; Luu, H; Kriukov, A; Gajek, H; Spotts, G; Ewenstein, B

    2010-11-01

      Postauthorization safety surveillance of factor VIII (FVIII) concentrates is essential for assessing rare adverse event incidence. We determined safety and efficacy of ADVATE [antihaemophilic factor (recombinant), plasma/albumin-free method, (rAHF-PFM)] during routine clinical practice. Subjects with differing haemophilia A severities and medical histories were monitored during 12 months of prophylactic and/or on-demand therapy. Among 408 evaluable subjects, 386 (95%) received excellent/good efficacy ratings for all on-demand assessments; the corresponding number for subjects with previous FVIII inhibitors was 36/41 (88%). Among 276 evaluable subjects receiving prophylaxis continuously in the study, 255 (92%) had excellent/good ratings for all prophylactic assessments; the corresponding number for subjects with previous FVIII inhibitors was 41/46 (89%). Efficacy of surgical prophylaxis was excellent/good in 16/16 evaluable procedures. Among previously treated patients (PTPs) with >50 exposure days (EDs) and FVIII≤2%, three (0.75%) developed low-titre inhibitors. Two of these subjects had a positive inhibitor history; thus, the incidence of de novo inhibitor formation in PTPs with FVIII≤2% and no inhibitor history was 1/348 (0.29%; 95% CI, 0.01-1.59%). A PTP with moderate haemophilia developed a low-titre inhibitor. High-titre inhibitors were reported in a PTP with mild disease (following surgery), a previously untreated patient (PUP) with moderate disease (following surgery) and a PUP with severe disease. The favourable benefit/risk profile of rAHF-PFM previously documented in prospective clinical trials has been extended to include a broader range of haemophilia patients, many of whom would have been ineligible for registration studies. © 2010 Blackwell Publishing Ltd.

  11. Industry Application ECCS / LOCA Integrated Cladding/Emergency Core Cooling System Performance: Demonstration of LOTUS-Baseline Coupled Analysis of the South Texas Plant Model

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Hongbin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Szilard, Ronaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Epiney, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States); Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Vaghetto, Rodolfo [Texas A & M Univ., College Station, TX (United States); Vanni, Alessandro [Texas A & M Univ., College Station, TX (United States); Neptune, Kaleb [Texas A & M Univ., College Station, TX (United States)

    2017-06-01

    Under the auspices of the DOE LWRS Program RISMC Industry Application ECCS/LOCA, INL has engaged staff from both the South Texas Project (STP) and Texas A&M University (TAMU) to produce a generic pressurized water reactor (PWR) model including reactor core, clad/fuel design and systems thermal hydraulics based on the South Texas Project (STP) nuclear power plant, a 4-Loop Westinghouse PWR. A RISMC toolkit, named LOCA Toolkit for the U.S. (LOTUS), has been developed for use in this generic PWR plant model to assess safety margins for the proposed NRC 10 CFR 50.46c rule on Emergency Core Cooling System (ECCS) performance during LOCA. This demonstration includes coupled analysis of core design, fuel design, thermal hydraulics and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results. Within this context, a multi-physics best estimate plus uncertainty (MPBEPU) methodology framework is proposed.

  12. Factor analysis for imperfect maintenance planning at nuclear power plants by cognitive task analysis

    International Nuclear Information System (INIS)

    Takagawa, Kenichi; Iida, Hiroyasu

    2011-01-01

    Imperfect maintenance planning was frequently identified in domestic nuclear power plants. To prevent such an event, we analyzed causal factors in maintenance planning stages and showed the directionality of countermeasures in this study. There is a pragmatic limit in finding the causal factors from the items based on report descriptions. Therefore, the idea of the systemic accident model, which is used to monitor the performance variability in normal circumstances, is taken as a new concept instead of investigating negative factors. As an actual method for analyzing usual activities, cognitive task analysis (CTA) was applied. Persons who experienced various maintenance activities at one electric power company were interviewed about sources related to decision making during maintenance planning, and then usual factors affecting planning were extracted as performance variability factors. The tendency of domestic events was analyzed using the classification item of those factors, and the directionality of countermeasures was shown. The following are critical for preventing imperfect maintenance planning: the persons in charge should fully understand the situation of the equipment for which they are responsible in the work planning and maintenance evaluation stages, and they should definitely understand, for example, the maintenance bases of that equipment. (author)

  13. Scale-Free Nonparametric Factor Analysis: A User-Friendly Introduction with Concrete Heuristic Examples.

    Science.gov (United States)

    Mittag, Kathleen Cage

    Most researchers using factor analysis extract factors from a matrix of Pearson product-moment correlation coefficients. A method is presented for extracting factors in a non-parametric way, by extracting factors from a matrix of Spearman rho (rank correlation) coefficients. It is possible to factor analyze a matrix of association such that…

  14. Confirmatory Factor Analysis of the Combined Social Phobia Scale and Social Interaction Anxiety Scale: Support for a Bifactor Model

    Science.gov (United States)

    Gomez, Rapson; Watson, Shaun D.

    2017-01-01

    For the Social Phobia Scale (SPS) and the Social Interaction Anxiety Scale (SIAS) together, this study examined support for a bifactor model, and also the internal consistency reliability and external validity of the factors in this model. Participants (N = 526) were adults from the general community who completed the SPS and SIAS. Confirmatory factor analysis (CFA) of their ratings indicated good support for the bifactor model. For this model, the loadings for all but six items were higher on the general factor than the specific factors. The three positively worded items had negligible loadings on the general factor. The general factor explained most of the common variance in the SPS and SIAS, and demonstrated good model-based internal consistency reliability (omega hierarchical) and a strong association with fear of negative evaluation and extraversion. The practical implications of the findings for the utilization of the SPS and SIAS, and the theoretical and clinical implications for social anxiety are discussed. PMID:28210232

  15. Confirmatory Factor Analysis of the Combined Social Phobia Scale and Social Interaction Anxiety Scale: Support for a Bifactor Model.

    Science.gov (United States)

    Gomez, Rapson; Watson, Shaun D

    2017-01-01

    For the Social Phobia Scale (SPS) and the Social Interaction Anxiety Scale (SIAS) together, this study examined support for a bifactor model, and also the internal consistency reliability and external validity of the factors in this model. Participants ( N = 526) were adults from the general community who completed the SPS and SIAS. Confirmatory factor analysis (CFA) of their ratings indicated good support for the bifactor model. For this model, the loadings for all but six items were higher on the general factor than the specific factors. The three positively worded items had negligible loadings on the general factor. The general factor explained most of the common variance in the SPS and SIAS, and demonstrated good model-based internal consistency reliability (omega hierarchical) and a strong association with fear of negative evaluation and extraversion. The practical implications of the findings for the utilization of the SPS and SIAS, and the theoretical and clinical implications for social anxiety are discussed.

  16. Safety analysis factors for environmental restoration and decontamination and decommissioning

    International Nuclear Information System (INIS)

    Ellingson, D.R.

    1993-04-01

    Environmental restoration (ER) and facility decontamination/decommissioning (D&D) operations can be grouped into two general categories. "Nonstationary cleanup" or simply "cleanup" activities are those where the operation must relocate to the site of new contaminated material at the completion of each task (i.e., the operation moves to the contaminated material). "Stationary production" or simply "production" activities are those where the contaminated material is moved to a centralized location (i.e., the contaminated material is moved to the operation) for analysis, sorting, treatment, storage, and disposal. This paper addresses the issue of nonstationary cleanup design. The following are the specific assigned action items: Collect and compile a list of special safety-related ER/D&D design factors, especially ones that don't follow DOE Order 6430.1A requirements. Develop a proposal of what makes sense to recommend to designers; especially consider recommendations for short-term projects. Present the proposal at the January meeting. To achieve the action items, applicable US Department of Energy (DOE) design requirements, and cleanup operations and their differences from production activities, are reviewed and summarized; basic safety requirements influencing design are summarized; and finally, approaches, considerations, and methods for safe, cost-effective design of cleanup activities are discussed.

  17. Regression and kriging analysis for grid power factor estimation

    Directory of Open Access Journals (Sweden)

    Rajesh Guntaka

    2014-12-01

    Full Text Available The measurement of power factor (PF) in electrical utility grids is a mainstay of load balancing and is also a critical element of transmission and distribution efficiency. The measurement of PF dates back to the earliest periods of electrical power distribution to public grids. In the wide-area distribution grid, measurement of current waveforms is trivial and may be accomplished at any point in the grid using a current tap transformer. However, voltage measurement requires reference to ground and so is more problematic, and measurements are normally constrained to points that have ready and easy access to a ground source. We present two mathematical analysis methods, based on kriging and linear least squares estimation (LLSE, i.e. regression), to derive PF at nodes with unknown voltages that are within a perimeter of sample nodes with ground reference across a selected power grid. Our results indicate an average error of 1.884%, which is within acceptable tolerances for PF measurements that are used in load balancing tasks.
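
    A minimal numerical sketch of the two estimators named above (linear least squares on node coordinates, and a simple kriging-type interpolator with an assumed Gaussian covariance model) is given below; node coordinates, measured power factors and the covariance range are all hypothetical.

```python
# LLSE and a simple kriging-style estimate of power factor at a node without a
# ground-referenced voltage measurement. All numbers are placeholders.
import numpy as np

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # sampled nodes
pf = np.array([0.95, 0.93, 0.96, 0.92])                          # measured PF
target = np.array([0.5, 0.5])                                    # node to estimate

# LLSE: fit PF ~ a + b*x + c*y, then predict at the target node
A = np.column_stack([np.ones(len(xy)), xy])
coef, *_ = np.linalg.lstsq(A, pf, rcond=None)
pf_llse = np.array([1.0, *target]) @ coef

# Simple kriging with an assumed Gaussian covariance C(h) = exp(-(h/r)^2)
def cov(a, b, r=1.0):
    h = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.exp(-(h / r) ** 2)

weights = np.linalg.solve(cov(xy, xy), cov(xy, target[None, :])[:, 0])
pf_kriging = pf.mean() + weights @ (pf - pf.mean())

print(f"LLSE estimate: {pf_llse:.4f}, kriging-style estimate: {pf_kriging:.4f}")
```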

  18. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts across the whole product life cycle (PLC). The traditional LCA is restricted to assessing the environmental impacts of a product, and the results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach that can reflect the relationship between the design parameters and a product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.

  19. Application of factor analysis to the explosive detection

    International Nuclear Information System (INIS)

    Park, Yong Joon; Song, Byung Chul; Im, Hee Jung; Kim, Won Ho; Cho, Jung Hwan

    2005-01-01

    The detection of explosive devices hidden in airline baggage is a significant problem, particularly in view of the development of modern plastic explosives, which can be formed into various innocent-appearing shapes and which are sufficiently powerful that small quantities can destroy an aircraft in flight. In addition, the biggest difficulty arises from the long detection time required by explosive detection systems based on thermal neutron interrogation, which involve exposing baggage to slow neutrons with an energy on the order of 0.025 eV. The elemental compositions of explosives can be determined by the Neutron Induced Prompt gamma Spectroscopy (NIPS) system, which has been installed at the Korea Atomic Energy Research Institute as a tool for the detection of explosives in passenger baggage. In this work, factor analysis has been applied to the NIPS system to increase the signal-to-noise ratio of the prompt gamma spectrum for the detection of explosives hidden in a passenger's baggage, especially for noisy prompt gamma spectra obtained with short measurement times.
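
    As a sketch of how a factor model can raise the signal-to-noise ratio of short-acquisition spectra, the snippet below reconstructs a set of spectra from a small number of common factors and discards channel-wise noise; the spectra are synthetic placeholders, not NIPS measurements.

```python
# Low-rank factor reconstruction of noisy gamma-ray spectra. `spectra` is a
# hypothetical (measurements x channels) count matrix; the reconstruction keeps
# only variation shared across channels, suppressing independent channel noise.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(5)
spectra = rng.poisson(lam=20.0, size=(50, 256)).astype(float)  # placeholder data

fa = FactorAnalysis(n_components=5).fit(spectra)
scores = fa.transform(spectra)
denoised = scores @ fa.components_ + fa.mean_   # low-rank reconstruction

print(f"mean per-channel variance, raw:      {spectra.var(axis=0).mean():.2f}")
print(f"mean per-channel variance, denoised: {denoised.var(axis=0).mean():.2f}")
```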

  20. Factor Analysis of the Coopersmith Self-Esteem Inventory

    OpenAIRE

    Güloğlu, Berna; Aydın, Gül

    2001-01-01

    This study investigated the factor structure of the Turkish version of the Coopersmith Self-Esteem Inventory. The results showed that the inventory had a highly complex 21-factor structure. However, of the empirically found 21 factors, only 10 seemed theoretically meaningful. The results were discussed in comparison to the findings obtained from the studies that were carried out with the original version of the Coopersmith Self-Esteem Inventory.

  1. Sparse multivariate factor analysis regression models and its applications to integrative genomics analysis.

    Science.gov (United States)

    Zhou, Yan; Wang, Pei; Wang, Xianlong; Zhu, Ji; Song, Peter X-K

    2017-01-01

    The multivariate regression model is a useful tool to explore complex associations between two kinds of molecular markers, which enables the understanding of the biological pathways underlying disease etiology. For a set of correlated response variables, accounting for such dependency can increase statistical power. Motivated by integrative genomic data analyses, we propose a new methodology, the sparse multivariate factor analysis regression model (smFARM), in which correlations of response variables are assumed to follow a factor analysis model with latent factors. This proposed method allows us not only to address the challenge that the number of association parameters is larger than the sample size, but also to adjust for unobserved genetic and/or nongenetic factors that potentially conceal the underlying response-predictor associations. The proposed smFARM is implemented by the EM algorithm and the blockwise coordinate descent algorithm. The proposed methodology is evaluated and compared to the existing methods through extensive simulation studies. Our results show that accounting for latent factors through the proposed smFARM can improve sensitivity of signal detection and accuracy of sparse association map estimation. We illustrate smFARM by two integrative genomics analysis examples, a breast cancer dataset, and an ovarian cancer dataset, to assess the relationship between DNA copy numbers and gene expression arrays to understand genetic regulatory patterns relevant to the disease. We identify two trans-hub regions: one in cytoband 17q12 whose amplification influences the RNA expression levels of important breast cancer genes, and the other in cytoband 9q21.32-33, which is associated with chemoresistance in ovarian cancer. © 2016 WILEY PERIODICALS, INC.
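
    The core idea described above (a sparse association map between two sets of markers plus a latent-factor model for the correlated responses) can be approximated with off-the-shelf tools, for example sparse multivariate regression followed by factor analysis of the residuals; this is only a rough stand-in for the authors' EM and blockwise coordinate-descent implementation, with placeholder data.

```python
# Rough approximation of the smFARM idea: sparse multivariate regression of
# expression (Y) on copy number (X), then a factor model for the residual
# correlation among responses. Data are random placeholders.
import numpy as np
from sklearn.linear_model import MultiTaskLasso
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))   # placeholder copy-number predictors
Y = rng.normal(size=(100, 20))   # placeholder expression responses

reg = MultiTaskLasso(alpha=0.1).fit(X, Y)     # sparse association map
B = reg.coef_.T                               # (predictors x responses)

resid = Y - reg.predict(X)                    # correlated residuals
fa = FactorAnalysis(n_components=3).fit(resid)

print("non-zero association coefficients:", int(np.count_nonzero(B)))
print("residual factor loadings shape:", fa.components_.shape)
```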

  2. Dynamic Factor Analysis of Nonstationary Multivariate Time Series.

    Science.gov (United States)

    Molenaar, Peter C. M.; And Others

    1992-01-01

    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  3. Confirmatory factor analysis of the neck disability index in a whiplash population indicates a one-factor model is viable

    OpenAIRE

    Gabel, Charles P.; Cuesta-Vargas, Antonio I.; Barr, Sebastian; Winkeljohn Black, Stephanie; Osborne, Jason W.; Melloh, Markus

    2016-01-01

    Purpose The neck disability index (NDI) as a 10-item patient reported outcome (PRO) measure is the most commonly used whiplash associated disorders (WAD) assessment tool. However, statistical rigor and factor structure are not definitive. To date, confirmatory factor analysis (CFA) has not examined whether the factor structure generalizes across different groups (e.g., WAD versus non-WAD). This study aimed to determine the psychometric properties of the NDI in these population groups.

  4. Financial consumer protection and customer satisfaction. A relationship study by using factor analysis and discriminant analysis

    Directory of Open Access Journals (Sweden)

    Marimuthu SELVAKUMAR

    2015-11-01

    Full Text Available This paper attempts to study the relationship between financial consumer protection and customer satisfaction by using factor analysis and discriminant analysis. The main objectives of the study are to analyze financial consumer protection in commercial banks, to examine the customer satisfaction of commercial banks and to identify the factors of financial consumer protection that lead to customer satisfaction. Much research work has been carried out on financial consumer protection and financial literacy, but identifying the factors which drive financial consumer protection, and the relationship between financial consumer protection and customer satisfaction, is very important, particularly for banks seeking to improve their quality and increase customer satisfaction. Therefore this study was carried out with the aim of identifying the factors of financial consumer protection and their influence on customer satisfaction. This study is both descriptive and analytical in nature. It covers both primary and secondary data. The primary data has been collected from the customers of commercial banks using a pre-tested interview schedule and the secondary data has been collected from standard books, journals, magazines, websites and so on.

  5. Risk analysis-based food safety policy: scientific factors versus socio-cultural factors

    NARCIS (Netherlands)

    Rosa, P.; Knapen, van F.; Brom, F.W.A.

    2008-01-01

    The purpose of this article is to illustrate the importance of socio-cultural factors in risk management and the need to incorporate these factors in a standard, internationally recognized (WTO) framework. This was achieved by analysing the relevance of these factors in 3 cases

  6. Real-time PCR Demonstrates Ancylostoma duodenale Is a Key Factor in the Etiology of Severe Anemia and Iron Deficiency in Malawian Pre-school Children

    Science.gov (United States)

    Jonker, Femkje A. M.; Calis, Job C. J.; Phiri, Kamija; Brienen, Eric A. T.; Khoffi, Harriet; Brabin, Bernard J.; Verweij, Jaco J.; van Hensbroek, Michael Boele; van Lieshout, Lisette

    2012-01-01

    Background Hookworm infections are an important cause of (severe) anemia and iron deficiency in children in the tropics. Type of hookworm species (Ancylostoma duodenale or Necator americanus) and infection load are considered associated with disease burden, although these parameters are rarely assessed due to limitations of currently used diagnostic methods. Using multiplex real-time PCR, we evaluated hookworm species-specific prevalence, infection load and their contribution towards severe anemia and iron deficiency in pre-school children in Malawi. Methodology and Findings A. duodenale and N. americanus DNA loads were determined in 830 fecal samples of pre-school children participating in a case control study investigating severe anemia. Using multiplex real-time PCR, hookworm infections were found in 34.1% of the severely anemic cases and in 27.0% of the non-severely anemic controls. A. duodenale infection load showed a load-dependent association with severe anemia (adjusted odds ratio: 2.49 (95%CI 1.16–5.33) and 9.04 (95%CI 2.52–32.47) respectively). Iron deficiency (assessed through bone marrow examination) was positively associated with intensity of A. duodenale infection (adjusted odds ratio: 3.63 (95%CI 1.18–11.20); 16.98 (95%CI 3.88–74.35) and 44.91 (95%CI 5.23–385.77) for low, moderate and high load respectively). Conclusions/Significance This is the first report assessing the association of hookworm load and species differentiation with severe anemia and bone marrow iron deficiency. By revealing a much higher than expected prevalence of A. duodenale and its significant and load-dependent association with severe anemia and iron deficiency in pre-school children in Malawi, we demonstrated the need for quantitative and species-specific screening of hookworm infections. Multiplex real-time PCR is a powerful diagnostic tool for public health research to combat (severe) anemia and iron deficiency in children living in resource poor settings. PMID:22514750

  7. Network based transcription factor analysis of regenerating axolotl limbs

    Directory of Open Access Journals (Sweden)

    Cameron Jo Ann

    2011-03-01

    Full Text Available Abstract Background Studies on amphibian limb regeneration began in the early 1700's but we still do not completely understand the cellular and molecular events of this unique process. Understanding a complex biological process such as limb regeneration is more complicated than the knowledge of the individual genes or proteins involved. Here we followed a systems biology approach in an effort to construct the networks and pathways of protein interactions involved in formation of the accumulation blastema in regenerating axolotl limbs. Results We used the human orthologs of proteins previously identified by our research team as bait to identify the transcription factor (TF) pathways and networks that regulate blastema formation in amputated axolotl limbs. The five most connected factors, c-Myc, SP1, HNF4A, ESR1 and p53, regulate ~50% of the proteins in our data. Among these, c-Myc and SP1 regulate 36.2% of the proteins. c-Myc was the most highly connected TF (71 targets). Network analysis showed that TGF-β1 and fibronectin (FN) lead to the activation of these TFs. We found that other TFs known to be involved in epigenetic reprogramming, such as Klf4, Oct4, and Lin28, are also connected to c-Myc and SP1. Conclusions Our study provides a systems biology approach to how different molecular entities inter-connect with each other during the formation of an accumulation blastema in regenerating axolotl limbs. This approach provides an in silico methodology to identify proteins that are not detected by experimental methods such as proteomics but are potentially important to blastema formation. We found that the TFs, c-Myc and SP1 and their target genes could potentially play a central role in limb regeneration. Systems biology has the potential to map out numerous other pathways that are crucial to blastema formation in regeneration-competent limbs, to compare these to the pathways that characterize regeneration-deficient limbs and finally, to identify stem

  8. No study left behind: a network meta-analysis in non-small-cell lung cancer demonstrating the importance of considering all relevant data.

    Science.gov (United States)

    Hawkins, Neil; Scott, David A; Woods, Beth S; Thatcher, Nicholas

    2009-09-01

    To demonstrate the importance of considering all relevant indirect data in a network meta-analysis of treatments for non-small-cell lung cancer (NSCLC). A recent National Institute for Health and Clinical Excellence appraisal focussed on the indirect comparison of docetaxel with erlotinib in second-line treatment of NSCLC based on trials including a common comparator. We compared the results of this analysis to a network meta-analysis including other trials that formed a network of evidence. We also examined the importance of allowing for the correlations between the estimated treatment effects that can arise when analysing such networks. The analysis of the restricted network including only trials of docetaxel and erlotinib linked via the common placebo comparator produced an estimated mean hazard ratio (HR) for erlotinib compared with docetaxel of 1.55 (95% confidence interval [CI] 0.72-2.97). In contrast, the network meta-analysis produced an estimated HR for erlotinib compared with docetaxel of 0.83 (95% CI 0.65-1.06). Analyzing the wider network improved the precision of estimated treatment effects, altered their rankings and also allowed further treatments to be compared. Some of the estimated treatment effects from the wider network were highly correlated. This empirical example shows the importance of considering all potentially relevant data when comparing treatments. Care should therefore be taken to consider all relevant information, including correlations induced by the network of trial data, when comparing treatments.
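
    For reference, the restricted comparison described above, which links erlotinib and docetaxel only through their common placebo arms, corresponds to the standard adjusted indirect comparison; on the log hazard-ratio scale it takes the textbook form below (this formula is general background, not reproduced from the paper).

```latex
% Adjusted (Bucher-type) indirect comparison of erlotinib (E) versus
% docetaxel (D) through a common placebo comparator (P).
\[
  \log\widehat{HR}_{E/D} = \log\widehat{HR}_{E/P} - \log\widehat{HR}_{D/P},
  \qquad
  \operatorname{SE}\!\left(\log\widehat{HR}_{E/D}\right)
    = \sqrt{\operatorname{SE}^{2}_{E/P} + \operatorname{SE}^{2}_{D/P}} .
\]
```

    A full network meta-analysis generalizes this by estimating all treatment contrasts jointly across the evidence network, which is what widens the comparison set and induces the correlated treatment-effect estimates mentioned in the abstract.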

  9. [An analysis of clinical characteristic and related risk factors in 208 cirrhotic patients complicated with infections].

    Science.gov (United States)

    Zhang, G H; Wang, M; Wang, L; Wang, X M; Wang, Y; Ou, X J; Jia, J D

    2018-02-01

    Objective: To analyze the clinical features and risk factors of cirrhotic patients complicated with infections. Methods: The clinical and laboratory characteristics of cirrhotic patients complicated with infections hospitalized from April 2014 to June 2017 were retrospectively analyzed. Relevant risk factors for infection and mortality were explored. Results: The overall incidence of infections was 17.6% in 1 670 hospitalized cirrhotic patients. Among the recruited 208 patients in this study, alcoholic, viral hepatitis B or C and autoimmune liver diseases accounted for 29.8% (62/208), 26.0% (54/208), and 22.1% (46/208), respectively. The most common infection site was the respiratory tract (70.2%), followed by urinary tract, intestinal and intra-abdominal infections. Forty-six pathogens were isolated from 32 patients, including 22 (47.8%) Gram-negative bacteria, 16 (34.8%) Gram-positive bacteria, 2 (4.3%) Mycobacterium tuberculosis, 5 (10.9%) fungi and 1 (2.2%) mycoplasma. The mortality in patients with nosocomial infections (16.7%, 7/42) was higher than that in patients with community-acquired infections (6.0%, 10/166; P = 0.025). All 17 deaths occurred in patients with decompensated cirrhosis. Multivariate analysis demonstrated that hepatic encephalopathy and prothrombin time were independent risk factors of mortality. Conclusions: Patients with decompensated cirrhosis are more susceptible to infections. Hepatic encephalopathy and prothrombin time are independent risk factors for death.

  10. A hierarchical factor analysis of a safety culture survey.

    Science.gov (United States)

    Frazier, Christopher B; Ludwig, Timothy D; Whitaker, Brian; Roberts, D Steve

    2013-06-01

    Recent reviews of safety culture measures have revealed a host of potential factors that could make up a safety culture (Flin, Mearns, O'Connor, & Bryden, 2000; Guldenmund, 2000). However, there is still little consensus regarding what the core factors of safety culture are. The purpose of the current research was to determine the core factors, as well as the structure of those factors that make up a safety culture, and establish which factors add meaningful value by factor analyzing a widely used safety culture survey. A 92-item survey was constructed by subject matter experts and was administered to 25,574 workers across five multi-national organizations in five different industries. Exploratory and hierarchical confirmatory factor analyses were conducted, revealing four second-order factors of a Safety Culture consisting of Management Concern, Personal Responsibility for Safety, Peer Support for Safety, and Safety Management Systems. Additionally, a total of 12 first-order factors were found: three on Management Concern, three on Personal Responsibility, two on Peer Support, and four on Safety Management Systems. The resulting safety culture model addresses gaps in the literature by identifying the core constructs which make up a safety culture. This clarification of the major factors emerging in the measurement of safety cultures should impact the industry through a more accurate description, measurement, and tracking of safety cultures to reduce loss due to injury. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
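
    A two-level exploratory analogue of the hierarchical structure reported above can be sketched by factoring the items and then factoring the resulting first-order scores; the survey data below are placeholders, the factor counts (12 first-order, 4 second-order) follow the abstract, and a confirmatory hierarchical model in a dedicated SEM package would be the closer analogue of the authors' approach.

```python
# Exploratory two-level factor analysis: items -> 12 first-order factors ->
# 4 second-order factors. Survey responses are random placeholders.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
items = rng.normal(size=(1000, 92))           # placeholder 92-item responses

first_order = FactorAnalysis(n_components=12).fit(items)
scores_1 = first_order.transform(items)       # first-order factor scores

second_order = FactorAnalysis(n_components=4).fit(scores_1)
print("second-order loadings on the 12 first-order factors:\n",
      second_order.components_.round(2))
```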

  11. Using sensitivity analysis to identify key factors for the propagation of a plant epidemic.

    Science.gov (United States)

    Rimbaud, Loup; Bruchou, Claude; Dallot, Sylvie; Pleydell, David R J; Jacquot, Emmanuel; Soubeyrand, Samuel; Thébaud, Gaël

    2018-01-01

    Identifying the key factors underlying the spread of a disease is an essential but challenging prerequisite to design management strategies. To tackle this issue, we propose an approach based on sensitivity analyses of a spatiotemporal stochastic model simulating the spread of a plant epidemic. This work is motivated by the spread of sharka, caused by plum pox virus, in a real landscape. We first carried out a broad-range sensitivity analysis, ignoring any prior information on six epidemiological parameters, to assess their intrinsic influence on model behaviour. A second analysis benefited from the available knowledge on sharka epidemiology and was thus restricted to more realistic values. The broad-range analysis revealed that the mean duration of the latent period is the most influential parameter of the model, whereas the sharka-specific analysis uncovered the strong impact of the connectivity of the first infected orchard. In addition to demonstrating the interest of sensitivity analyses for a stochastic model, this study highlights the impact of variation ranges of target parameters on the outcome of a sensitivity analysis. With regard to sharka management, our results suggest that sharka surveillance may benefit from paying closer attention to highly connected patches whose infection could trigger serious epidemics.
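
    A minimal version of the broad-range, variance-based sensitivity analysis described above can be sketched for a generic stochastic simulator; the toy model, parameter names, ranges and the crude binned estimator below are all assumptions for illustration, and a dedicated library with many more model runs would be used in practice.

```python
# Crude variance-based sensitivity analysis of a stand-in stochastic simulator:
# sample parameters uniformly over broad ranges, run the model, and estimate
# first-order indices as Var(E[Y | parameter]) / Var(Y) via binning.
import numpy as np

rng = np.random.default_rng(3)

def simulate(latent_duration, connectivity):
    # placeholder for the spatiotemporal epidemic model: returns a scalar output
    return 100.0 * connectivity / latent_duration + rng.normal(scale=5.0)

names = ["latent_duration", "connectivity"]
bounds = np.array([[10.0, 100.0], [0.1, 1.0]])        # assumed broad ranges
n = 2000
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n, len(names)))
Y = np.array([simulate(*x) for x in X])

for j, name in enumerate(names):
    edges = np.quantile(X[:, j], np.linspace(0.0, 1.0, 11))
    idx = np.clip(np.digitize(X[:, j], edges) - 1, 0, 9)
    cond_means = np.array([Y[idx == b].mean() for b in range(10)])
    print(f"{name}: first-order index ~ {cond_means.var() / Y.var():.2f}")
```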

  12. Epigenetic clock analysis of diet, exercise, education, and lifestyle factors.

    Science.gov (United States)

    Quach, Austin; Levine, Morgan E; Tanaka, Toshiko; Lu, Ake T; Chen, Brian H; Ferrucci, Luigi; Ritz, Beate; Bandinelli, Stefania; Neuhouser, Marian L; Beasley, Jeannette M; Snetselaar, Linda; Wallace, Robert B; Tsao, Philip S; Absher, Devin; Assimes, Themistocles L; Stewart, James D; Li, Yun; Hou, Lifang; Baccarelli, Andrea A; Whitsel, Eric A; Horvath, Steve

    2017-02-14

    Behavioral and lifestyle factors have been shown to relate to a number of health-related outcomes, yet there is a need for studies that examine their relationship to molecular aging rates. Toward this end, we use recent epigenetic biomarkers of age that have previously been shown to predict all-cause mortality, chronic conditions, and age-related functional decline. We analyze cross-sectional data from 4,173 postmenopausal female participants from the Women's Health Initiative, as well as 402 male and female participants from the Italian cohort study, Invecchiare nel Chianti. Extrinsic epigenetic age acceleration (EEAA) exhibits significant associations with fish intake (p=0.02), moderate alcohol consumption (p=0.01), education (p=3×10⁻⁵), BMI (p=0.01), and blood carotenoid levels (p=1×10⁻⁵), an indicator of fruit and vegetable consumption, whereas intrinsic epigenetic age acceleration (IEAA) is associated with poultry intake (p=0.03) and BMI (p=0.05). Both EEAA and IEAA were also found to relate to indicators of metabolic syndrome, which appear to mediate their associations with BMI. Metformin, the first-line medication for the treatment of type 2 diabetes, does not delay epigenetic aging in this observational study. Finally, longitudinal data suggest that an increase in BMI is associated with an increase in both EEAA and IEAA. Overall, the epigenetic age analysis of blood confirms the conventional wisdom regarding the benefits of eating a high plant diet with lean meats, moderate alcohol consumption, physical activity, and education, as well as the health risks of obesity and metabolic syndrome.

  13. Analysis Of Critical Factors Of Microfinance Institutions Of Pakistan

    Directory of Open Access Journals (Sweden)

    Ather Azim Khan

    2010-12-01

    Full Text Available This article is about the performance of Microfinance Institutions (MFIs) of Pakistan. The types of MFIs operating in Pakistan are discussed in detail, i.e. microfinance banks, rural support programs and NGOs. Some other organizations are also involved in microfinancing, but their share is very low. It is found that Rural Support Programs (RSPs) are not wholly devoted to microfinance but hold a large share of microfinance funds. Micro loans are given for various purposes, including starting a new business. The real theme of microcredit is to give money to a poor person to start a small or micro business and increase his family income, but micro loans are often used for many other purposes, such as paying off another expensive loan, paying for medical expenses of the bread earner of the family, marriages, construction, etc. In this research work the researcher has tried to analyze the performance of MFIs of Pakistan and to find out the factors which contribute to their effectiveness. Two approaches to microfinance, i.e. the Institutionists Approach and the Welfarists Approach, are discussed. To analyze the performance of MFIs both approaches are considered. Seventeen parameters are selected, many of which are financial ratios, and these are divided into four groups, i.e. sustainability, transparency, outreach and efficiency. Some ratios/figures from each of these areas of the MFIs are taken as data and analysis is performed to find out which ones contribute more or less. This research can be helpful for MFIs which want to improve their performance and to identify their areas of significance for further improvement and development, considering their approach to alleviating poverty in society.

  14. Human Factors Considerations in New Nuclear Power Plants: Detailed Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.; Higgins, J.; Brown, W.; Fink, R.

    2008-02-14

    This Nuclear Regulatory Commission (NRC) sponsored study has identified human-performance issues in new and advanced nuclear power plants. To identify the issues, current industry developments and trends were evaluated in the areas of reactor technology, instrumentation and control technology, human-system integration technology, and human factors engineering (HFE) methods and tools. The issues were organized into seven high-level HFE topic areas: Role of Personnel and Automation, Staffing and Training, Normal Operations Management, Disturbance and Emergency Management, Maintenance and Change Management, Plant Design and Construction, and HFE Methods and Tools. The issues were then prioritized into four categories using a 'Phenomena Identification and Ranking Table' methodology based on evaluations provided by 14 independent subject matter experts. The subject matter experts were knowledgeable in a variety of disciplines. Vendors, utilities, research organizations and regulators all participated. Twenty issues were categorized into the top priority category. This Brookhaven National Laboratory (BNL) technical report provides the detailed methodology, issue analysis, and results. A summary of the results of this study can be found in NUREG/CR-6947. The research performed for this project has identified a large number of human-performance issues for new control stations and new nuclear power plant designs. The information gathered in this project can serve as input to the development of a long-term strategy and plan for addressing human performance in these areas through regulatory research. Addressing human-performance issues will provide the technical basis from which regulatory review guidance can be developed to meet these challenges. The availability of this review guidance will help set clear expectations for how the NRC staff will evaluate new designs, reduce regulatory uncertainty, and provide a well-defined path to new nuclear power plant

  15. Bayesian switching factor analysis for estimating time-varying functional connectivity in fMRI.

    Science.gov (United States)

    Taghia, Jalil; Ryali, Srikanth; Chen, Tianwen; Supekar, Kaustubh; Cai, Weidong; Menon, Vinod

    2017-07-15

    There is growing interest in understanding the dynamical properties of functional interactions between distributed brain regions. However, robust estimation of temporal dynamics from functional magnetic resonance imaging (fMRI) data remains challenging due to limitations in extant multivariate methods for modeling time-varying functional interactions between multiple brain areas. Here, we develop a Bayesian generative model for fMRI time-series within the framework of hidden Markov models (HMMs). The model is a dynamic variant of the static factor analysis model (Ghahramani and Beal, 2000). We refer to this model as Bayesian switching factor analysis (BSFA) as it integrates factor analysis into a generative HMM in a unified Bayesian framework. In BSFA, brain dynamic functional networks are represented by latent states which are learnt from the data. Crucially, BSFA is a generative model which estimates the temporal evolution of brain states and transition probabilities between states as a function of time. An attractive feature of BSFA is the automatic determination of the number of latent states via Bayesian model selection arising from penalization of excessively complex models. Key features of BSFA are validated using extensive simulations on carefully designed synthetic data. We further validate BSFA using fingerprint analysis of multisession resting-state fMRI data from the Human Connectome Project (HCP). Our results show that modeling temporal dependencies in the generative model of BSFA results in improved fingerprinting of individual participants. Finally, we apply BSFA to elucidate the dynamic functional organization of the salience, central-executive, and default mode networks, three core neurocognitive systems with a central role in cognitive and affective information processing (Menon, 2011). Across two HCP sessions, we demonstrate a high level of dynamic interactions between these networks and determine that the salience network has the highest temporal
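
    A non-Bayesian stand-in for the switching component of BSFA can be sketched with a Gaussian hidden Markov model, which recovers a latent state sequence and transition matrix from a multivariate time series; this omits the factor-analysis observation model and the Bayesian model selection described above, and the fMRI-like data are placeholders.

```python
# Fit a Gaussian HMM to a placeholder (time points x regions) series and read
# off the most likely state sequence and the state transition matrix.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(4)
ts = rng.normal(size=(500, 10))          # placeholder multivariate time series

model = hmm.GaussianHMM(n_components=3, covariance_type="full", n_iter=100)
model.fit(ts)
states = model.predict(ts)               # estimated latent brain-state sequence

print("transition matrix:\n", model.transmat_.round(2))
print("state occupancy counts:", np.bincount(states))
```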

  16. Students' motivation to study dentistry in Malaysia: an analysis using confirmatory factor analysis.

    Science.gov (United States)

    Musa, Muhd Firdaus Che; Bernabé, Eduardo; Gallagher, Jennifer E

    2015-06-12

    Malaysia has experienced a significant expansion of dental schools over the past decade. Research into students' motivation may inform recruitment and retention of the future dental workforce. The objectives of this study were to explore students' motivation to study dentistry and whether that motivation varied by students' and school characteristics. All 530 final-year students in 11 dental schools (6 public and 5 private) in Malaysia were invited to participate at the end of 2013. The self-administered questionnaire, developed at King's College London, collected information on students' motivation to study dentistry and demographic background. Responses on students' motivation were collected using five-point ordinal scales. Confirmatory factor analysis (CFA) was used to evaluate the underlying structure of students' motivation to study dentistry. Multivariate analysis of variance (MANOVA) was used to compare factor scores for overall motivation and sub-domains by students' and school characteristics. Three hundred and fifty-six final-year students in eight schools (all public and two private) participated in the survey, representing an 83% response rate for these schools and 67% of all final-year students nationally. The majority of participants were 24 years old (47%), female (70%), Malay (56%) and from middle-income families (41%) and public schools (78%). CFA supported a model with five first-order factors (professional job, healthcare and people, academic, careers advising and family and friends) which were linked to a single second-order factor representing overall students' motivation. Academic factors and healthcare and people had the highest standardized factor loadings (0.90 and 0.71, respectively), suggesting they were the main motivation to study dentistry. MANOVA showed that students from private schools had higher scores for healthcare and people than those in public schools whereas Malay students had lower scores for family and friends than those

  17. Implementation of the k0-standardization Method for an Instrumental Neutron Activation Analysis: Use-k0-IAEA Software as a Demonstration

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Kim, Hark Rho; Ho, Manh Dung

    2006-03-01

    Under the RCA post-doctoral program, from May 2005 through February 2006, there was an opportunity to review the work being carried out in the Neutron Activation Analysis Laboratory, HANARO Center, KAERI. The scope of this research included a calibration of the counting system, a characterization of the irradiation facility and a validation of the established k0-NAA procedure. The k0-standardization method for instrumental neutron activation analysis (k0-NAA), which is becoming increasingly popular and widespread, is an absolute calibration technique in which the nuclear data are replaced by compound nuclear constants that are experimentally determined. The k0-IAEA software distributed by the IAEA in 2005 was used as a demonstration for this work. The NAA no. 3 irradiation hole in the HANARO research reactor and the gamma-ray spectrometers No. 1 and 5 in the NAA Laboratory were used.

  18. Implementation of the k0-standardization Method for an Instrumental Neutron Activation Analysis: Use-k0-IAEA Software as a Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Kim, Hark Rho [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of); Ho, Manh Dung [Nuclear Research Institute, Dalat (Viet Nam)

    2006-03-15

    Under the RCA post-doctoral program, from May 2005 through February 2006, there was an opportunity to review the work being carried out in the Neutron Activation Analysis Laboratory, HANARO Center, KAERI. The scope of this research included a calibration of the counting system, a characterization of the irradiation facility and a validation of the established k0-NAA procedure. The k0-standardization method for instrumental neutron activation analysis (k0-NAA), which is becoming increasingly popular and widespread, is an absolute calibration technique in which the nuclear data are replaced by compound nuclear constants that are experimentally determined. The k0-IAEA software distributed by the IAEA in 2005 was used as a demonstration for this work. The NAA no. 3 irradiation hole in the HANARO research reactor and the gamma-ray spectrometers No. 1 and 5 in the NAA Laboratory were used.

  19. [Analysis of risk factors associated with professional drivers’ work].

    Science.gov (United States)

    Czerwińska, Maja; Hołowko, Joanna; Stachowska, Ewa

    Professional driving is an occupation associated with high health risks. The factors which increase the risk of developing lifestyle diseases are closely related to working conditions. The aim of this study was to analyse the risk factors which are associated with professional drivers’ lifestyle. The material consisted of 23 articles from PubMed.gov. Risk factors related to drivers’ work have a significant impact on their health.

  20. Factors influencing societal response of nanotechnology : an expert stakeholder analysis

    OpenAIRE

    Gupta, N.; Fischer, A.R.H.; Lans, van der, I.A.; Frewer, L.J.

    2012-01-01

    Nanotechnology can be described as an emerging technology and, as has been the case with other emerging technologies such as genetic modification, different socio-psychological factors will potentially influence societal responses to its development and application. These factors will play an important role in how nanotechnology is developed and commercialised. This article aims to identify expert opinion on factors influencing societal response to applications of nanotechnology. Structured i...

  1. Factors influencing societal response of nanotechnology: an expert stakeholder analysis

    OpenAIRE

    Gupta, Nidhi; Fischer, Arnout R. H.; van der Lans, Ivo A.; Frewer, Lynn J.

    2012-01-01

    Nanotechnology can be described as an emerging technology and, as has been the case with other emerging technologies such as genetic modification, different socio-psychological factors will potentially influence societal responses to its development and application. These factors will play an important role in how nanotechnology is developed and commercialised. This article aims to identify expert opinion on factors influencing societal response to applications of nanotechnology. Structured i...

  2. Factor Analysis of the Community Balance and Mobility Scale in Individuals with Knee Osteoarthritis.

    Science.gov (United States)

    Takacs, Judit; Krowchuk, Natasha M; Goldsmith, Charles H; Hunt, Michael A

    2017-10-01

    The clinical assessment of balance is an important first step in characterizing the risk of falls. The Community Balance and Mobility Scale (CB&M) is a test of balance and mobility that was designed to assess performance on advanced tasks necessary for independence in the community. However, other factors that can affect balancing ability may also be present during performance of the real-world tasks on the CB&M. It is important for clinicians to understand fully what other modifiable factors the CB&M may encompass. The purpose of this study was to evaluate the underlying constructs in the CB&M in individuals with knee osteoarthritis (OA). This was an observational study, with a single testing session. Participants with knee OA aged 50 years and older completed the CB&M, a clinical test of balance and mobility. Confirmatory factor analysis was then used to examine whether the tasks on the CB&M measure distinct factors. Three a priori theory-driven models with three (strength, balance, mobility), four (range of motion added) and six (pain and fear added) constructs were evaluated using multiple fit indices. A total of 131 participants (mean [SD] age 66.3 [8.5] years, BMI 27.3 [5.2] kg m⁻²) participated. A three-factor model in which all tasks loaded on these three factors explained 65% of the variance and yielded the optimal model, as determined using scree plots, chi-squared values and explained variance. The first factor accounted for 49% of the variance and was interpreted as lower limb muscle strength. The second and third factors were interpreted as mobility and balance, respectively. The CB&M demonstrated the measurement of three distinct factors, interpreted as lower limb strength, balance and mobility, supporting the use of the CB&M with people with knee OA for evaluation of these important factors in falls risk and functional mobility. Copyright © 2016 John Wiley & Sons, Ltd.

  3. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI

    International Nuclear Information System (INIS)

    Bierry, Guillaume; Venkatasamy, Aina; Kremer, Stephane; Dosch, Jean-Claude; Dietemann, Jean-Louis

    2014-01-01

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation (''CT edema'') on VNC DECT images; specificity, sensitivity, predictive values, intra and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84 %, specificity of 97 %, and accuracy of 95 %, intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85 % (77 %) and specificity of 82 % (74 %) for ''CT edema'' on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation. (orig.)

  4. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI

    Energy Technology Data Exchange (ETDEWEB)

    Bierry, Guillaume; Venkatasamy, Aina; Kremer, Stephane; Dosch, Jean-Claude; Dietemann, Jean-Louis [University Hospital of Strasbourg, Department of Radiology, Strasbourg (France)

    2014-04-15

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation (''CT edema'') on VNC DECT images; specificity, sensitivity, predictive values, intra and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84 %, specificity of 97 %, and accuracy of 95 %, intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85 % (77 %) and specificity of 82 % (74 %) for ''CT edema'' on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation. (orig.)

  5. Dual-energy CT in vertebral compression fractures: performance of visual and quantitative analysis for bone marrow edema demonstration with comparison to MRI.

    Science.gov (United States)

    Bierry, Guillaume; Venkatasamy, Aïna; Kremer, Stéphane; Dosch, Jean-Claude; Dietemann, Jean-Louis

    2014-04-01

    To prospectively evaluate the performance of virtual non-calcium (VNC) dual-energy CT (DECT) images for the demonstration of trauma-related abnormal marrow attenuation in collapsed and non-collapsed vertebral compression fractures (VCF) with MRI as a reference standard. Twenty patients presenting with non-tumoral VCF were consecutively and prospectively included in this IRB-approved study, and underwent MRI and DECT of the spine. MR examination served as a reference standard. Two independent readers visually evaluated all vertebrae for abnormal marrow attenuation ("CT edema") on VNC DECT images; specificity, sensitivity, predictive values, intra and inter-observer agreements were calculated. A last reader performed a quantitative evaluation of CT numbers; cut-off values were calculated using ROC analysis. In the visual analysis, VNC DECT images had an overall sensitivity of 84%, specificity of 97%, and accuracy of 95%, intra- and inter-observer agreements ranged from k = 0.74 to k = 0.90. CT numbers were significantly different between vertebrae with edema on MR and those without (p < 0.0001). Cut-off values provided sensitivity of 85% (77%) and specificity of 82% (74%) for "CT edema" on thoracic (lumbar) vertebrae. VNC DECT images allowed an accurate demonstration of trauma-related abnormal attenuation in VCF, revealing the acute nature of the fracture, on both visual and quantitative evaluation.
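
    As a hedged sketch of the quantitative step described in these records, the code below derives a CT-number cut-off for "CT edema" by ROC analysis against an MRI reference standard, using the Youden index on synthetic attenuation values; the distributions and variable names are assumptions, not the study's data.

```python
# Hedged sketch of the quantitative step: choosing a CT-number cut-off for
# "CT edema" by ROC analysis against the MRI reference standard.
# The CT numbers below are synthetic; only the procedure is illustrated.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
# 1 = edema on MRI (reference standard), 0 = no edema
mri_edema = np.r_[np.ones(60, dtype=int), np.zeros(140, dtype=int)]
# Assumed: edematous marrow shows higher attenuation on VNC images
ct_numbers = np.r_[rng.normal(-20, 25, 60), rng.normal(-70, 25, 140)]

fpr, tpr, thresholds = roc_curve(mri_edema, ct_numbers)
youden = tpr - fpr                 # Youden index at each candidate threshold
best = np.argmax(youden)

print(f"AUC        : {roc_auc_score(mri_edema, ct_numbers):.2f}")
print(f"Cut-off    : {thresholds[best]:.1f} HU")
print(f"Sensitivity: {tpr[best]:.2f}  Specificity: {1 - fpr[best]:.2f}")
```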

  6. Pyrene conjugation and spectroscopic analysis of hydroxypropyl methylcellulose compounds successfully demonstrated a local dielectric difference associated with in vivo anti-prion activity.

    Directory of Open Access Journals (Sweden)

    Kenta Teruya

    Full Text Available Our previous study on prion-infected rodents revealed that hydroxypropyl methylcellulose compounds (HPMCs with different molecular weights but similar composition and degree of substitution have different levels of long-lasting anti-prion activity. In this study, we searched these HPMCs for a parameter specifically associated with in vivo anti-prion activity by analyzing in vitro chemical properties and in vivo tissue distributions. Infrared spectroscopic and thermal analyses revealed no differences among HPMCs, whereas pyrene conjugation and spectroscopic analysis revealed that the fluorescence intensity ratio of peak III/peak I correlated with anti-prion activity. This correlation was more clearly demonstrated in the anti-prion activity of the 1-year pre-infection treatment than that of the immediate post-infection treatment. In addition, the intensity ratio of peak III/peak I negatively correlated with the macrophage uptake level of HPMCs in our previous study. However, the in vivo distribution pattern was apparently not associated with anti-prion activity and was different in the representative tissues. These findings suggest that pyrene conjugation and spectroscopic analysis are powerful methods to successfully demonstrate local dielectric differences in HPMCs and provide a feasible parameter denoting the long-lasting anti-prion activity of HPMCs in vivo.

  7. Comorbid depression and associated factors in PNES versus epilepsy: Systematic review and meta-analysis.

    Science.gov (United States)

    Walsh, Sean; Levita, Liat; Reuber, Markus

    2018-05-24

    This systematic review aims to contrast levels, manifestations and associations of depression in patients with psychogenic non-epileptic seizures (PNES) and those with epilepsy. ScienceDirect and Web of Science were searched for primary research reports describing quantitative studies involving separate epilepsy and PNES samples (age 16+) and using a validated measure of depression. While 34 studies were identified, most were of low quality and had small sample sizes. Studies consistently found higher levels of self-reported depression in the PNES than epilepsy groups, with a meta-analysis demonstrating a significant difference between the groups. Although patients with PNES were also more likely to have a clinical diagnosis of depression than those with epilepsy, the difference between the groups was less pronounced in studies based on such diagnoses rather than self-report. Patients with PNES were more likely to report physical symptoms of depression than those with epilepsy. Interpersonal factors explained more variation in depression levels in patients with PNES than those with epilepsy, for whom illness related factors were more influential, but in both patient groups, depression had a significant impact on health related quality of life. This systematic review demonstrates a higher prevalence of depression in patients with PNES compared to patients with epilepsy and suggests differences in the expression and possible causes of depression between these groups. Copyright © 2018 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  8. Proteomic analysis of polyribosomes identifies splicing factors as potential regulators of translation during mitosis.

    Science.gov (United States)

    Aviner, Ranen; Hofmann, Sarah; Elman, Tamar; Shenoy, Anjana; Geiger, Tamar; Elkon, Ran; Ehrlich, Marcelo; Elroy-Stein, Orna

    2017-06-02

    Precise regulation of mRNA translation is critical for proper cell division, but little is known about the factors that mediate it. To identify mRNA-binding proteins that regulate translation during mitosis, we analyzed the composition of polysomes from interphase and mitotic cells using unbiased quantitative mass-spectrometry (LC-MS/MS). We found that mitotic polysomes are enriched with a subset of proteins involved in RNA processing, including alternative splicing and RNA export. To demonstrate that these may indeed be regulators of translation, we focused on heterogeneous nuclear ribonucleoprotein C (hnRNP C) as a test case and confirmed that it is recruited to elongating ribosomes during mitosis. Then, using a combination of pulsed SILAC, metabolic labeling and ribosome profiling, we showed that knockdown of hnRNP C affects both global and transcript-specific translation rates and found that hnRNP C is specifically important for translation of mRNAs that encode ribosomal proteins and translation factors. Taken together, our results demonstrate how proteomic analysis of polysomes can provide insight into translation regulation under various cellular conditions of interest and suggest that hnRNP C facilitates production of translation machinery components during mitosis to provide daughter cells with the ability to efficiently synthesize proteins as they enter G1 phase. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. A configurational analysis of success factors in crowdfunding video campaigns

    DEFF Research Database (Denmark)

    Lomberg, Carina; Li-Ying, Jason; Alkærsig, Lars

    Recent discussions of success factors in crowdfunding campaigns highlight a plenitude of diverse factors that stem from different, partly contradictory theories. We focus on campaign videos and assume that there is more than one way of creating a successful crowdfunding video. We generate data on 1000 randomly...

  10. The Self-Report Family Inventory: An Exploratory Factor Analysis

    Science.gov (United States)

    Goodrich, Kristopher M.; Selig, James P.; Trahan, Don P., Jr.

    2012-01-01

    Researchers explored the factor structure of the Self-Report Family Inventory with a sample of heterosexual parents who have a son or daughter who self-identifies as lesbian, gay, or bisexual. Results suggest that a two-factor solution is appropriate. Research and clinical implications are offered. (Contains 1 figure and 2 tables.)

  11. Dimensions of assertiveness: factor analysis of five assertion inventories.

    Science.gov (United States)

    Henderson, M; Furnham, A

    1983-09-01

    Five self-report assertiveness inventories were factor analyzed. In each case two major factors emerged, accounting for approximately one-quarter to one-third of the variance. The findings emphasize the multidimensional nature of current measures of assertiveness, and suggest the construction of a more systematic and psychometrically evaluated scale that would yield subscale scores assessing the separate dimensions of assertiveness.

  12. Factors influencing societal response of nanotechnology : an expert stakeholder analysis

    NARCIS (Netherlands)

    Gupta, N.; Fischer, A.R.H.; Lans, van der I.A.; Frewer, L.J.

    2012-01-01

    Nanotechnology can be described as an emerging technology and, as has been the case with other emerging technologies such as genetic modification, different socio-psychological factors will potentially influence societal responses to its development and application. These factors will play an

  13. Confirmatory Factor Analysis of the WISC-III with Child Psychiatric Inpatients.

    Science.gov (United States)

    Tupa, David J.; Wright, Margaret O'Dougherty; Fristad, Mary A.

    1997-01-01

    Factor models of the Wechsler Intelligence Scale for Children-Third Edition (WISC-III) for one, two, three, and four factors were tested using confirmatory factor analysis with a sample of 177 child psychiatric inpatients. The four-factor model proposed in the WISC-III manual provided the best fit to the data. (SLD)

  14. Smoking among American adolescents: a risk and protective factor analysis.

    Science.gov (United States)

    Scal, Peter; Ireland, Marjorie; Borowsky, Iris Wagman

    2003-04-01

    Cigarette smoking remains a substantial threat to the current and future health of America's youth. The purpose of this study was to identify the risk and protective factors for cigarette smoking among US adolescents. Data from the National Longitudinal Study of Adolescent Health were used, comparing the responses of all non-smokers at Time 1 for their ability to predict the likelihood of smoking at Time 2, one year later. Data were stratified into four gender-by-grade-group cohorts. Cross-cutting risk factors for smoking among all four cohorts were: use of alcohol, marijuana, and other illicit drugs; violence involvement; having had sex; having friends who smoke; and learning problems. Having a higher grade point average and family connectedness were protective across all cohorts. Other gender- and grade-group-specific risk and protective factors were identified. The estimated probability of initiating smoking decreased by 19.2% to 54.1%, in situations of both high and low risk, as the number of protective factors present increased. Of the factors that predict or protect against smoking, some are influential across all gender and grade group cohorts studied, while others are specific to gender and developmental stage. Prevention efforts that target both the reduction of risk factors and the enhancement of protective factors at the individual, family, peer group and community levels are likely to reduce the likelihood of smoking initiation.
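
    As a hedged sketch of this kind of longitudinal risk/protective-factor analysis, the code below fits a logistic regression of smoking initiation at Time 2 on binary risk and protective factors measured at Time 1 and shows how the estimated initiation probability falls as protective factors accumulate; the data and factor names are synthetic illustrations, not Add Health variables.

```python
# Hedged sketch of the longitudinal prediction setup: logistic regression of
# smoking initiation at Time 2 on risk and protective factors at Time 1.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 5000

risk = rng.binomial(1, 0.3, size=(n, 3))     # e.g. alcohol use, friends smoke, violence
protect = rng.binomial(1, 0.5, size=(n, 2))  # e.g. high GPA, family connectedness
logit = -2.0 + risk.sum(axis=1) * 0.8 - protect.sum(axis=1) * 0.6
smokes_t2 = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.hstack([risk, protect])
model = LogisticRegression().fit(X, smokes_t2)
odds_ratios = np.exp(model.coef_[0])
print("Odds ratios (3 risk, then 2 protective factors):", np.round(odds_ratios, 2))

# Estimated probability of initiation as protective factors accumulate,
# holding all three risk factors present (a "high-risk" profile).
for k in range(3):
    profile = np.array([[1, 1, 1] + [1] * k + [0] * (2 - k)])
    print(f"{k} protective factor(s):",
          np.round(model.predict_proba(profile)[0, 1], 2))
```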

  15. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    Science.gov (United States)

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
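
    A minimal sketch of the PA-PCA variant of parallel analysis is given below: observed eigenvalues of the item correlation matrix are retained while they exceed the mean (or 95th percentile) eigenvalue obtained from random normal data of the same size. The principal-axis-factoring variant compared in the article is not implemented, and the toy two-factor data are an assumption.

```python
# Hedged sketch of parallel analysis with principal components (PA-PCA).
import numpy as np

def parallel_analysis(data, n_sims=500, percentile=95, seed=0):
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

    rand_eig = np.empty((n_sims, p))
    for s in range(n_sims):
        sim = rng.standard_normal((n, p))
        rand_eig[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]

    mean_crit = rand_eig.mean(axis=0)
    pct_crit = np.percentile(rand_eig, percentile, axis=0)
    return {
        "n_factors_mean_criterion": int(np.sum(obs_eig > mean_crit)),
        "n_factors_percentile_criterion": int(np.sum(obs_eig > pct_crit)),
    }

# Toy example: 300 cases, 12 items generated from 2 underlying factors.
rng = np.random.default_rng(1)
loadings = np.zeros((12, 2)); loadings[:6, 0] = 0.7; loadings[6:, 1] = 0.7
scores = rng.standard_normal((300, 2))
items = scores @ loadings.T + 0.6 * rng.standard_normal((300, 12))
print(parallel_analysis(items))
```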

  16. Prognostic factors in canine appendicular osteosarcoma - a meta-analysis.

    Science.gov (United States)

    Boerman, Ilse; Selvarajah, Gayathri T; Nielen, Mirjam; Kirpensteijn, Jolle

    2012-05-15

    Appendicular osteosarcoma is the most common malignant primary canine bone tumor. When treated by amputation or tumor removal alone, median survival times (MST) do not exceed 5 months, with the majority of dogs suffering from metastatic disease. This period can be extended with adequate local intervention and adjuvant chemotherapy, which has become common practice. Several prognostic factors have been reported in many different studies, e.g. age, breed, weight, sex, neuter status, location of tumor, serum alkaline phosphatase (SALP), bone alkaline phosphatase (BALP), infection, percentage of bone length affected, histological grade or histological subtype of tumor. Most of these factors are, however, only reported as confounding factors in larger studies. Insight into truly significant prognostic factors at the time of diagnosis may contribute to tailoring adjuvant therapy for individual dogs suffering from osteosarcoma. The objective of this study was to systematically review the prognostic factors that are described for canine appendicular osteosarcoma and validate their scientific importance. A literature review was performed on selected studies and eligible data were extracted. Meta-analyses were done for two of the three selected possible prognostic factors (SALP and location), looking at both survival time (ST) and disease-free interval (DFI). The third factor (age) was studied in a qualitative manner. Both elevated SALP level and the (proximal) humerus as location of the primary tumor are significant negative prognostic factors for both ST and DFI in dogs with appendicular osteosarcoma. Increasing age was associated with shorter ST and DFI; however, this association was not statistically significant because information on this factor was available in only a limited number of papers. Elevated SALP and proximal humeral location are significant negative prognosticators for canine osteosarcoma.

  17. Lunar Water Resource Demonstration

    Science.gov (United States)

    Muscatello, Anthony C.

    2008-01-01

    In cooperation with the Canadian Space Agency, the Northern Centre for Advanced Technology, Inc., the Carnegie-Mellon University, JPL, and NEPTEC, NASA has undertaken the In-Situ Resource Utilization (ISRU) project called RESOLVE. This project is a ground demonstration of a system that would be sent to explore permanently shadowed polar lunar craters, drill into the regolith, determine what volatiles are present, and quantify them in addition to recovering oxygen by hydrogen reduction. The Lunar Prospector has determined these craters contain enhanced hydrogen concentrations averaging about 0.1%. If the hydrogen is in the form of water, the water concentration would be around 1%, which would translate into billions of tons of water on the Moon, a tremendous resource. The Lunar Water Resource Demonstration (LWRD) is a part of RESOLVE designed to capture lunar water and hydrogen and quantify them as a backup to gas chromatography analysis. This presentation will briefly review the design of LWRD and some of the results of testing the subsystem. RESOLVE is to be integrated with the Scarab rover from CMU, and the whole system demonstrated on Mauna Kea in Hawaii in November 2008. The implications of lunar water for Mars exploration are two-fold: 1) RESOLVE and LWRD could be used in a similar fashion on Mars to locate and quantify water resources, and 2) electrolysis of lunar water could provide large amounts of liquid oxygen in LEO, leading to lower costs for travel to Mars, in addition to being very useful at lunar outposts.

  18. Risk factors of chronic periodontitis on healing response: a multilevel modelling analysis.

    Science.gov (United States)

    Song, J; Zhao, H; Pan, C; Li, C; Liu, J; Pan, Y

    2017-09-15

    Chronic periodontitis is a multifactorial polygenetic disease with an increasing number of associated factors that have been identified over recent decades. Longitudinal epidemiologic studies have demonstrated that these risk factors are related to the progression of the disease. A traditional multivariate regression model has typically been used to find risk factors associated with chronic periodontitis. However, standard statistical procedures require that observations be independent. Multilevel modelling (MLM) has been widely used in recent years, as it accommodates the hierarchical structure of the data, decomposes the error terms into different levels, and provides a new analytic method and framework for solving this problem. The purpose of our study was to investigate the relationship between clinical periodontal indices and the risk factors in chronic periodontitis through MLM analysis and to identify high-risk individuals in the clinical setting. Fifty-four patients with moderate to severe periodontitis were included. They were treated by means of non-surgical periodontal therapy, and then made regular follow-up visits at 3, 6, and 12 months after therapy. Each patient answered a questionnaire survey and underwent measurement of clinical periodontal parameters. Compared with baseline, probing depth (PD) and clinical attachment loss (CAL) improved significantly after non-surgical periodontal therapy with regular follow-up visits at 3, 6, and 12 months after therapy. The null model and variance component models with no independent variables included were initially obtained to investigate the variance of the PD and CAL reductions across all three levels, and they showed a statistically significant difference. Non-surgical periodontal therapy with regular follow-up visits had a remarkable curative effect. All three levels had a substantial influence on the reduction of PD and CAL. The site level had the largest effect on PD and CAL reductions.
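
    As a hedged, two-level illustration of the multilevel idea (the study itself modelled three levels: site within tooth within patient), the sketch below fits a random-intercept mixed model for probing-depth reduction with statsmodels; the patient-level and site-level covariates and all values are assumptions.

```python
# Hedged two-level illustration: probing-depth (PD) reduction modelled with a
# random intercept per patient (the study used three nested levels).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_patients, sites_per_patient = 54, 20

patient = np.repeat(np.arange(n_patients), sites_per_patient)
patient_effect = rng.normal(0, 0.5, n_patients)[patient]
smoker = rng.binomial(1, 0.4, n_patients)[patient]        # assumed patient-level factor
baseline_pd = rng.normal(5.5, 1.0, len(patient))          # assumed site-level covariate
pd_reduction = (1.8 - 0.4 * smoker + 0.3 * (baseline_pd - 5.5)
                + patient_effect + rng.normal(0, 0.6, len(patient)))

df = pd.DataFrame({"patient": patient, "smoker": smoker,
                   "baseline_pd": baseline_pd, "pd_reduction": pd_reduction})

model = smf.mixedlm("pd_reduction ~ smoker + baseline_pd", df, groups=df["patient"])
result = model.fit()
print(result.summary())  # fixed effects plus the between-patient variance component
```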

  19. University student depression inventory (USDI): confirmatory factor analysis and review of psychometric properties.

    Science.gov (United States)

    Romaniuk, Madeline; Khawaja, Nigar G

    2013-09-25

    The 30-item USDI is a self-report measure that assesses depressive symptoms among university students. It consists of three correlated factors: lethargy, cognitive-emotional and academic motivation. The current research used confirmatory factor analysis to assess construct validity and determine whether the original factor structure would be replicated in a different sample. Psychometric properties were also examined. Participants were 1148 students (mean age 22.84 years, SD=6.85) across all faculties from a large Australian metropolitan university. Students completed a questionnaire comprising the USDI, the Depression Anxiety Stress Scale (DASS) and the Life Satisfaction Scale (LSS). The three correlated factor model was shown to be an acceptable fit to the data, indicating sound construct validity. Internal consistency of the scale was also demonstrated to be sound, with high Cronbach alpha values. Temporal stability of the scale was also shown to be strong through test-retest analysis. Finally, concurrent and discriminant validity was examined with correlations between the USDI and DASS subscales as well as the LSS, with sound results further supporting the construct validity of the scale. Cut-off points were also developed to aid total score interpretation. Response rates are unclear. In addition, the representativeness of the sample could potentially be improved through targeted recruitment (i.e. reviewing the online sample statistics during data collection, examining the representativeness trends and addressing particular faculties within the university that were underrepresented). The USDI provides a valid and reliable method of assessing depressive symptoms found among university students. © 2013 Elsevier B.V. All rights reserved.
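
    As a hedged sketch of the internal-consistency step reported here, the helper below computes Cronbach's alpha from a respondents-by-items matrix; the toy one-factor data are an assumption, not the USDI sample.

```python
# Hedged sketch of the internal-consistency check: Cronbach's alpha on toy data.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(5)
latent = rng.normal(size=(200, 1))
responses = latent + 0.8 * rng.normal(size=(200, 10))  # 10 items tapping one construct
print(f"alpha = {cronbach_alpha(responses):.2f}")
```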

  20. Anatomical specificity of vascular endothelial growth factor expression in glioblastomas: a voxel-based mapping analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Xing [Capital Medical University, Department of Neurosurgery, Beijing Tiantan Hospital, Beijing (China); Wang, Yinyan [Capital Medical University, Department of Neurosurgery, Beijing Tiantan Hospital, Beijing (China); Capital Medical University, Department of Neuropathology, Beijing Neurosurgical Institute, Beijing (China); Wang, Kai; Ma, Jun; Li, Shaowu [Capital Medical University, Department of Neuroradiology, Beijing Tiantan Hospital, Beijing (China); Liu, Shuai [Chinese Academy of Medical Sciences and Peking Union Medical College, Departments of Neurosurgery, Peking Union Medical College Hospital, Beijing (China); Liu, Yong [Chinese Academy of Sciences, Brainnetome Center, Institute of Automation, Beijing (China); Jiang, Tao [Capital Medical University, Department of Neurosurgery, Beijing Tiantan Hospital, Beijing (China); Beijing Academy of Critical Illness in Brain, Department of Clinical Oncology, Beijing (China)

    2016-01-15

    The expression of vascular endothelial growth factor (VEGF) is a common genetic alteration in malignant gliomas and contributes to the angiogenesis of tumors. This study aimed to investigate the anatomical specificity of VEGF expression levels in glioblastomas using voxel-based neuroimaging analysis. Clinical information, MR scans, and immunohistochemistry stains of 209 patients with glioblastomas were reviewed. All tumor lesions were segmented manually and subsequently registered to standard brain space. Voxel-based regression analysis was performed to correlate the brain regions of tumor involvement with the level of VEGF expression. Brain regions identified as significantly associated with high or low VEGF expression were preserved following permutation correction. High VEGF expression was detected in 123 (58.9 %) of the 209 patients. Voxel-based statistical analysis demonstrated that high VEGF expression was more likely in tumors located in the left frontal lobe and the right caudate and low VEGF expression was more likely in tumors that occurred in the posterior region of the right lateral ventricle. Voxel-based neuroimaging analysis revealed the anatomic specificity of VEGF expression in glioblastoma, which may further our understanding of genetic heterogeneity during tumor origination. This finding provides primary theoretical support for potential future application of customized antiangiogenic therapy. (orig.)

  1. A load factor based mean-variance analysis for fuel diversification

    Energy Technology Data Exchange (ETDEWEB)

    Gotham, Douglas; Preckel, Paul; Ruangpattana, Suriya [State Utility Forecasting Group, Purdue University, West Lafayette, IN (United States); Muthuraman, Kumar [McCombs School of Business, University of Texas, Austin, TX (United States); Rardin, Ronald [Department of Industrial Engineering, University of Arkansas, Fayetteville, AR (United States)

    2009-03-15

    Fuel diversification implies the selection of a mix of generation technologies for long-term electricity generation. The goal is to strike a good balance between reduced costs and reduced risk. The method of analysis that has been advocated and adopted for such studies is the mean-variance portfolio analysis pioneered by Markowitz (Markowitz, H., 1952. Portfolio selection. Journal of Finance 7(1), 77-91). However, the standard mean-variance methodology does not account for the ability of various fuels/technologies to adapt to varying loads. Such analysis often provides results that are easily dismissed by regulators and practitioners as unacceptable, since load cycles play critical roles in fuel selection. To account for such issues and still retain the convenience and elegance of the mean-variance approach, we propose a variant of the mean-variance analysis using the decomposition of the load into various types and utilizing the load factors of each load type. We also illustrate the approach using data for the state of Indiana and demonstrate the ability of the model to provide useful insights. (author)
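
    The paper's load-factor decomposition is not reproduced here. As a hedged sketch of the underlying Markowitz step it builds on, the code below chooses generation shares that minimise the variance of fuel cost subject to a target expected cost, with illustrative (assumed) cost means and covariances.

```python
# Hedged sketch of the underlying Markowitz step: generation shares that
# minimise cost variance for a target expected cost. The paper's refinement
# (decomposing load into types via load factors) is not reproduced.
import numpy as np
from scipy.optimize import minimize

mean_cost = np.array([45.0, 60.0, 30.0])          # $/MWh: coal, gas, nuclear (assumed)
cov_cost = np.array([[25.0,  5.0,  1.0],
                     [ 5.0, 90.0,  2.0],
                     [ 1.0,  2.0, 10.0]])          # assumed cost covariance
target = 45.0                                      # target expected cost

def variance(w):
    return w @ cov_cost @ w

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "eq", "fun": lambda w: w @ mean_cost - target})
res = minimize(variance, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3, constraints=cons)

print("Optimal generation mix:", np.round(res.x, 2))
print("Cost std deviation    :", np.round(np.sqrt(res.fun), 2))
```

    Sweeping the target expected cost over a range of values traces out the efficient frontier that mean-variance studies of fuel diversification typically report.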

  2. Anatomical specificity of vascular endothelial growth factor expression in glioblastomas: a voxel-based mapping analysis

    International Nuclear Information System (INIS)

    Fan, Xing; Wang, Yinyan; Wang, Kai; Ma, Jun; Li, Shaowu; Liu, Shuai; Liu, Yong; Jiang, Tao

    2016-01-01

    The expression of vascular endothelial growth factor (VEGF) is a common genetic alteration in malignant gliomas and contributes to the angiogenesis of tumors. This study aimed to investigate the anatomical specificity of VEGF expression levels in glioblastomas using voxel-based neuroimaging analysis. Clinical information, MR scans, and immunohistochemistry stains of 209 patients with glioblastomas were reviewed. All tumor lesions were segmented manually and subsequently registered to standard brain space. Voxel-based regression analysis was performed to correlate the brain regions of tumor involvement with the level of VEGF expression. Brain regions identified as significantly associated with high or low VEGF expression were preserved following permutation correction. High VEGF expression was detected in 123 (58.9 %) of the 209 patients. Voxel-based statistical analysis demonstrated that high VEGF expression was more likely in tumors located in the left frontal lobe and the right caudate and low VEGF expression was more likely in tumors that occurred in the posterior region of the right lateral ventricle. Voxel-based neuroimaging analysis revealed the anatomic specificity of VEGF expression in glioblastoma, which may further our understanding of genetic heterogeneity during tumor origination. This finding provides primary theoretical support for potential future application of customized antiangiogenic therapy. (orig.)

  3. Garlic powder intake and cardiovascular risk factors: a meta-analysis of randomized controlled clinical trials.

    Science.gov (United States)

    Kwak, Jin Sook; Kim, Ji Yeon; Paek, Ju Eun; Lee, You Jin; Kim, Haeng-Ran; Park, Dong-Sik; Kwon, Oran

    2014-12-01

    Although preclinical studies suggest that garlic has potential preventive effects on cardiovascular disease (CVD) risk factors, clinical trials and reports from systematic reviews or meta-analyses present inconsistent results. The contradiction might be attributed to variations in the manufacturing process that can markedly influence the composition of garlic products. To investigate this issue further, we performed a meta-analysis of the effects of garlic powder on CVD risk factors. We searched PubMed, Cochrane, Science Direct and EMBASE through May 2014. A random-effects meta-analysis was performed on 22 trials reporting total cholesterol (TC), 17 trials reporting LDL cholesterol (LDL-C), 18 trials reporting HDL cholesterol (HDL-C), 4 trials reporting fasting blood glucose (FBG), 9 trials reporting systolic blood pressure (SBP) and 10 trials reporting diastolic blood pressure (DBP). The overall garlic powder intake significantly reduced blood TC and LDL-C by -0.41 mmol/L (95% confidence interval [CI], -0.69, -0.12) (-15.83 mg/dL [95% CI, -26.64, -4.63]) and -0.21 mmol/L (95% CI, -0.40, -0.03) (-8.11 mg/dL [95% CI, -15.44, -1.16]), respectively. The mean difference in the reduction of FBG levels was -0.96 mmol/L (95% CI, -1.91, -0.01) (-17.30 mg/dL [95% CI, -34.41, -0.18]). Evidence for SBP and DBP reduction in the garlic supplementation group was also demonstrated by decreases of -4.34 mmHg (95% CI, -8.38, -0.29) and -2.36 mmHg (95% CI, -4.56, -0.15), respectively. This meta-analysis provides consistent evidence that garlic powder intake reduces the CVD risk factors of TC, LDL-C, FBG and BP.
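
    As a hedged sketch of the pooling step in such a meta-analysis, the code below implements a DerSimonian-Laird random-effects combination of trial-level mean differences; the effect sizes and standard errors are illustrative assumptions, not the extracted garlic-trial data.

```python
# Hedged sketch of a DerSimonian-Laird random-effects meta-analysis of mean
# differences (e.g., change in total cholesterol, mmol/L) on invented trials.
import numpy as np

effects = np.array([-0.55, -0.30, -0.70, -0.10, -0.45, -0.25])  # per-trial mean differences
se = np.array([0.20, 0.15, 0.30, 0.18, 0.25, 0.22])             # their standard errors

w_fixed = 1 / se**2
pooled_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

# Between-trial heterogeneity (DerSimonian-Laird estimator of tau^2)
q = np.sum(w_fixed * (effects - pooled_fixed) ** 2)
dof = len(effects) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - dof) / c)

w_random = 1 / (se**2 + tau2)
pooled = np.sum(w_random * effects) / np.sum(w_random)
pooled_se = np.sqrt(1 / np.sum(w_random))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

print(f"Pooled mean difference: {pooled:.2f} mmol/L "
      f"(95% CI {ci[0]:.2f}, {ci[1]:.2f}); tau^2 = {tau2:.3f}")
```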

  4. Weighing up the weighted case mix tool (WCMT): a psychometric investigation using confirmatory factor analysis.

    Science.gov (United States)

    Duane, B G; Humphris, G; Richards, D; Okeefe, E J; Gordon, K; Freeman, R

    2014-12-01

    To assess the use of the WCMT in two Scottish health boards and to consider the impact of simplifying the tool to improve efficient use. A retrospective analysis of routine WCMT data (47,276 cases). Public Dental Service (PDS) within NHS Lothian and Highland. The WCMT consists of six criteria. Each criterion is measured independently on a four-point scale to assess patient complexity and the dental care for the disabled/impaired patient. Psychometric analyses of the dataset were conducted. Conventional internal consistency coefficients were calculated. Latent variable modelling was performed to assess the 'fit' of the raw data to a pre-specified measurement model. A Confirmatory Factor Analysis (CFA) was used to test three potential changes to the existing WCMT: the removal of the oral risk factor question, the removal of the original weightings for scoring the tool, and collapsing the 4-point rating scale to three categories. The removal of the oral risk factor question had little impact on the reliability of the proposed simplified CMT to discriminate between levels of patient complexity. The removal of weighting and collapsing each item's rating scale to three categories had limited impact on the reliability of the revised tool. The CFA provided strong evidence that a new, proposed simplified Case Mix Tool (sCMT) would operate closely to the pre-specified measurement model (the WCMT). A modified sCMT can provide, without reducing reliability, a useful measure of the complexity of patient care. The proposed sCMT may be implemented within primary care dentistry to record patient complexity as part of an oral health assessment.

  5. Gastric lymphomas in Turkey. Analysis of prognostic factors with special emphasis on flow cytometric DNA content.

    Science.gov (United States)

    Aydin, Z D; Barista, I; Canpinar, H; Sungur, A; Tekuzman, G

    2000-07-01

    In contrast to DNA ploidy, to the authors' knowledge the prognostic significance of S-phase fraction (SPF) in gastric lymphomas has not been determined. In the current study, the prognostic significance of various parameters including SPF and DNA aneuploidy were analyzed and some distinct epidemiologic and biologic features of gastric lymphomas in Turkey were found. A series of 78 gastric lymphoma patients followed at Hacettepe University is reported. DNA flow cytometry was performed for 34 patients. The influence of various parameters on survival was investigated with the log rank test. The Cox proportional hazards model was fitted to identify independent prognostic factors. The median age of the patients was 50 years. There was no correlation between patient age and tumor grade. DNA content analysis revealed 4 of the 34 cases to be aneuploid with DNA index values < 1.0. The mean SPF was 33.5%. In the univariate analysis, surgical resection of the tumor, modified Ann Arbor stage, performance status, response to first-line chemotherapy, lactate dehydrogenase (LDH) level, and SPF were important prognostic factors for disease free survival (DFS). The same parameters, excluding LDH level, were important for determining overall survival (OS). In the multivariate analysis, surgical resection of the tumor, disease stage, performance status, and age were found to be important prognostic factors for OS. To the authors' knowledge the current study is the first to demonstrate the prognostic significance of SPF in gastric lymphomas. The distinguishing features of Turkish gastric lymphoma patients are 1) DNA indices of aneuploid cases that all are < 1.0, which is a unique feature; 2) a lower percentage of aneuploid cases; 3) a higher SPF; 4) a younger age distribution; and 5) lack of an age-grade correlation. The authors conclude that gastric lymphomas in Turkey have distinct biologic and epidemiologic characteristics. Copyright 2000 American Cancer Society.
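
    As a hedged sketch of the survival methods named in this record, the code below runs a univariate log-rank comparison (resected vs. not resected) and a multivariate Cox proportional hazards model with lifelines on synthetic data; the covariates, follow-up times, and event indicators are assumptions.

```python
# Hedged sketch of the survival analysis steps: a univariate log-rank test and
# a multivariate Cox proportional hazards model. Data are synthetic; the
# lifelines package is assumed to be available.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(6)
n = 78
df = pd.DataFrame({
    "resected": rng.binomial(1, 0.6, n),
    "stage": rng.integers(1, 5, n),
    "age": rng.normal(50, 12, n),
})
hazard = np.exp(-0.7 * df["resected"] + 0.4 * df["stage"] + 0.02 * (df["age"] - 50))
df["months"] = rng.exponential(24 / hazard)
df["event"] = rng.binomial(1, 0.8, n)   # 1 = death observed, 0 = censored

# Univariate comparison (log-rank test): resected vs. not resected
a, b = df[df.resected == 1], df[df.resected == 0]
print(logrank_test(a["months"], b["months"],
                   event_observed_A=a["event"], event_observed_B=b["event"]).p_value)

# Multivariate Cox proportional hazards model
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
cph.print_summary()
```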

  6. Analysis of The Factors Influencing the Private Cost of Teacher ...

    African Journals Online (AJOL)

    info

    programme of study, level of study, place of students’ residence and ownership status ... factor in the process of economic growth and development of nations. ... project/assignment, teaching practice, study tour/excursion, textbook, stationery, ...

  7. Risk Factor Analysis for Oral Precancer among Slum Dwellers in ...

    African Journals Online (AJOL)

    risk factors for oral precancer, i.e., smoking/smokeless tobacco, chewing ... procedure was performed on a group of 10 subjects, which were ... clinical description of observed oral mucosal lesions was made ..... use and effects of cessation.

  8. Analysis of Factors Affecting Decisions to Participate and Levels of ...

    African Journals Online (AJOL)

    ... among Heads of Households in Minituber Yam Marketing in Abia State, Nigeria. ... in negative effects of socio economic factors on market participation as well as ... These results called for public policy for increased gender access to good ...

  9. Dispersive analysis of the pion transition form factor

    Science.gov (United States)

    Hoferichter, M.; Kubis, B.; Leupold, S.; Niecknig, F.; Schneider, S. P.

    2014-11-01

    We analyze the pion transition form factor using dispersion theory. We calculate the singly-virtual form factor in the time-like region based on data for the cross section, generalizing previous studies on decays and scattering, and verify our result by comparing to data. We perform the analytic continuation to the space-like region, predicting the poorly-constrained space-like transition form factor, and extract the slope of the form factor at vanishing momentum transfer. We derive the dispersive formalism necessary for the extension of these results to the doubly-virtual case, as required for the pion-pole contribution to hadronic light-by-light scattering in the anomalous magnetic moment of the muon.

  10. Analysis of corrosive environmental factors of seabed sediment

    Indian Academy of Sciences (India)

    Unknown

    Seabed sediment; corrosion; environmental factors. The corrosion ... plays an important role in the corrosion behaviour of steel in sediment. Figure 2b shows the change in oxidation-reduction potential, Eh, with distance from ...

  11. Low-energy analysis of the nucleon electromagnetic form factors

    International Nuclear Information System (INIS)

    Kubis, Bastian.; Meissner, Ulf-G.

    2001-01-01

    We analyze the electromagnetic form factors of the nucleon to fourth order in relativistic baryon chiral perturbation theory. We employ the recently proposed infrared regularization scheme and show that the convergence of the chiral expansion is improved as compared to the heavy-fermion approach. We also discuss the inclusion of vector mesons and obtain an accurate description of all four nucleon electromagnetic form factors for momentum transfer squared up to Q² ≅ 0.4 GeV².

  12. A comparative analysis of foreign direct investment factors

    OpenAIRE

    Miškinis, Algirdas; Juozėnaitė, Ilma

    2015-01-01

    The paper identifies factors affecting the foreign direct investment (FDI) inflow. It analyzes the determinants of FDI in recent empirical evidence as well as determines differences among FDI factors in Greece, Ireland, and the Netherlands. The determinants being examined are the gross domestic product (GDP) per capita, exchange rate, unit labor costs, trade openness as well as inflation. The analyzed period is 1974–2012. Data were collected from the World Bank and the Organization for Econom...

  13. An Analysis of the Factors Impacting Employee's Specific Investment

    Institute of Scientific and Technical Information of China (English)

    WU Ai-hua; GE Wen-lei

    2008-01-01

    The amount of specific investment made by employees is limited, and the reasons for this under-investment are analyzed in this paper. Based on the relationship between specific investment and employee resignation, an empirical study was conducted focusing on the factors influencing employee turnover and specific investment. A theoretical model of the factors influencing employees' specific investment is given.

  14. Quantitative risk analysis offshore-Human and organizational factors

    International Nuclear Information System (INIS)

    Espen Skogdalen, Jon; Vinnem, Jan Erik

    2011-01-01

    Quantitative Risk Analyses (QRAs) are one of the main tools for risk management within the Norwegian and UK oil and gas industry. Much criticism has been directed at the limitations of the QRA models and at the fact that QRAs do not include human and organizational factors (HOF-factors). Norwegian and UK offshore legislation and guidelines require that the HOF-factors are included in the QRAs. A study of 15 QRAs shows that the factors are included to some extent, and that there are large differences between the QRAs. The QRAs are categorized into four levels according to the findings. Level 1 QRAs do not describe or comment on the HOF-factors at all. Relevant research projects have been conducted to fulfill the requirements of Level 3 analyses. At this level, there is a systematic collection of data related to HOF. The methods are systematic and documented, and the QRAs are adjusted. None of the QRAs fulfill the Level 4 requirements. Level 4 QRAs include the model and describe the HOF-factors as well as explain how the results should be followed up in the overall risk management. Safety audits by regulatory authorities are probably necessary to point out the direction for QRA and speed up the development.

  15. [Analysis of citations and national and international impact factor of Farmacia Hospitalaria (2001-2005)].

    Science.gov (United States)

    Aleixandre-Benavent, R; González Alcaide, G; Miguel-Dasit, A; González de Dios, J; de Granda Orive, J I; Valderrama Zurián, J C

    2007-01-01

    The objective of this study is to analyse the citation patterns and the impact and immediacy indicators of the Farmacia Hospitalaria journal during the period 2001-2005. An analysis of citations chosen from 101 Spanish health science journals was carried out in order to determine the citing and cited journals and the national and international impact and immediacy indicators. A methodology similar to that used by Thomson ISI in the Science Citation Index (SCI) and Journal Citation Reports (JCR) was applied. Farmacia Hospitalaria made 1,370 citations to 316 different journals. The percentage of self-citations was 9%. The national impact factor increased from 0.178 points in 2001 to 0.663 points in 2005, while the international impact factor increased from 0.178 to 0.806 over the same period. The analysis of citation patterns demonstrates the multidisciplinary nature of Farmacia Hospitalaria and a significant growth in the impact indicators over recent years. These indicators are higher than those of some other pharmacy journals included in Journal Citation Reports. Self-citation was not excessive and was similar to that of other journals.
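
    As a hedged sketch of the calculation behind these indicators, the function below computes a JCR-style two-year impact factor: citations received in year Y to items published in the two preceding years, divided by the number of citable items published in those years. The citation and item counts are invented for illustration.

```python
# Hedged sketch of a JCR-style two-year impact factor calculation.
def impact_factor(citations_in_year, items_published, year):
    cites = sum(citations_in_year[year].get(y, 0) for y in (year - 1, year - 2))
    items = sum(items_published.get(y, 0) for y in (year - 1, year - 2))
    return cites / items

# citations_in_year[2005][2004] = citations made in 2005 to articles from 2004
citations_in_year = {2005: {2004: 40, 2003: 28}}   # invented counts
items_published = {2003: 55, 2004: 48}             # invented counts
print(f"IF 2005 = {impact_factor(citations_in_year, items_published, 2005):.3f}")
```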

  16. Analysis of Performance Factors for Accounting and Finance Related Business Courses in A Distance Education Environment

    OpenAIRE

    BENLIGIRAY, Serdar; ONAY, Ahmet

    2017-01-01

    The objective of this study is to explore business courses performance factors with a focus on accounting and finance. Course score interrelations are assumed to represent interpretable constructs of these factors. Factor analysis is proposed to identify the constructs that explain the correlations. Factor analysis results identify three sub-groups of business core courses. The first group is labeled as management-oriented courses. Accounting, finance and economics courses are separated in tw...

  17. Dissecting high-dimensional phenotypes with bayesian sparse factor analysis of genetic covariance matrices.

    Science.gov (United States)

    Runcie, Daniel E; Mukherjee, Sayan

    2013-07-01

    Quantitative genetic studies that model complex, multivariate phenotypes are important for both evolutionary prediction and artificial selection. For example, changes in gene expression can provide insight into developmental and physiological mechanisms that link genotype and phenotype. However, classical analytical techniques are poorly suited to quantitative genetic studies of gene expression where the number of traits assayed per individual can reach many thousand. Here, we derive a Bayesian genetic sparse factor model for estimating the genetic covariance matrix (G-matrix) of high-dimensional traits, such as gene expression, in a mixed-effects model. The key idea of our model is that we need consider only G-matrices that are biologically plausible. An organism's entire phenotype is the result of processes that are modular and have limited complexity. This implies that the G-matrix will be highly structured. In particular, we assume that a limited number of intermediate traits (or factors, e.g., variations in development or physiology) control the variation in the high-dimensional phenotype, and that each of these intermediate traits is sparse - affecting only a few observed traits. The advantages of this approach are twofold. First, sparse factors are interpretable and provide biological insight into mechanisms underlying the genetic architecture. Second, enforcing sparsity helps prevent sampling errors from swamping out the true signal in high-dimensional data. We demonstrate the advantages of our model on simulated data and in an analysis of a published Drosophila melanogaster gene expression data set.
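
    The Bayesian sparse factor model itself is not reproduced here. As a hedged, non-sparse baseline for the idea that a few latent intermediate traits drive many observed traits, the sketch below runs ordinary maximum-likelihood factor analysis on simulated expression-like data with scikit-learn; the sparsity priors on the loadings and the genetic (mixed-model) decomposition described in the abstract are omitted, and all dimensions are assumptions.

```python
# Hedged, non-sparse baseline: ordinary factor analysis on simulated traits.
# The paper's model adds sparsity priors and a genetic mixed-model structure,
# which this sketch does not implement.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
n_individuals, n_traits, n_factors = 200, 500, 5

# A few latent "intermediate traits" drive many observed traits sparsely.
loadings = np.zeros((n_traits, n_factors))
for j in range(n_factors):
    idx = rng.choice(n_traits, size=30, replace=False)  # each factor hits ~30 traits
    loadings[idx, j] = rng.normal(0, 1, 30)
latent = rng.standard_normal((n_individuals, n_factors))
traits = latent @ loadings.T + 0.5 * rng.standard_normal((n_individuals, n_traits))

fa = FactorAnalysis(n_components=n_factors, random_state=0).fit(traits)
est_loadings = fa.components_.T                  # traits x factors
print("Estimated loadings shape:", est_loadings.shape)
print("Traits with |loading| > 0.3 per factor:", (np.abs(est_loadings) > 0.3).sum(axis=0))
```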

  18. A factor analysis to find critical success factors in retail brand

    OpenAIRE

    Naser Azad; Seyed Foad Zarifi; Somayeh Hozouri

    2013-01-01

    The present exploratory study aims to find critical components of retail brand among some retail stores. The study seeks to build a brand name at the retail level and looks to find important factors affecting it. Customer behavior is largely influenced when the first retail customer experience is formed. These factors have direct impacts on customer experience and satisfaction in the retail industry. The proposed study performs an empirical investigation on two well-known retail stores located in cit...

  19. Analysis of human factors in incidents reported by Swiss nuclear power plants to the inspectorate

    International Nuclear Information System (INIS)

    Alder, H.P.; Hausmann, W.

    1997-01-01

    197 reported incidents in Swiss Nuclear Power Plants were analyzed by a team of the Swiss Federal Nuclear Safety Inspectorate (HSK) using the OECD/NEA Incident Reporting System. The following conclusions could be drawn from this exercise. While the cause reported by the plant was 'technical failure' in about 90% of the incidents, the HSK team identified 'human factors' as the root cause in more than 60% of the incidents. Further analysis of this root cause showed that only a small contribution came from the operators; the more important shares were caused by plant maintenance, vendors/constructors and plant management, with procedural and organizational deficiencies. These findings demonstrate that root cause analysis of incidents by means of the IRS-Code is a most useful tool for analyzing incidents and finding weak points in plant performance. (author). 5 tabs

  20. Analysis of the effect of meteorological factors on dewfall

    International Nuclear Information System (INIS)

    Xiao, Huijie; Meissner, Ralph; Seeger, Juliane; Rupp, Holger; Borg, Heinz; Zhang, Yuqing

    2013-01-01

    To get an insight into when dewfall will occur and how much to expect, we carried out extensive calculations with the energy balance equation for a crop surface to 1) identify the meteorological factors which determine dewfall, 2) establish the relationship between dewfall and each of them, and 3) analyse how these relationships are influenced by changes in these factors. The meteorological factors which determine dewfall were found to be air temperature (T_a), cloud cover (N), wind speed (u), soil heat flux (G), and relative humidity (h_r). Net radiation is also a relevant factor; we did not consider it explicitly, but indirectly through the effect of temperature on the night-time radiation balance. The temperature of the surface (T_s) on which dew forms is also important. However, it is not a meteorological factor, but is determined by the aforementioned parameters. All other conditions being equal, our study revealed that dewfall increases linearly with decreasing N or G, and with increasing h_r. The effect of T_a and u on dewfall is non-linear: dewfall initially increases with increasing T_a or u, and then decreases. All five meteorological factors can lead to variations in dewfall between 0 and 25 W m⁻² over the range of their values we studied. The magnitude of the variation due to one factor depends on the value of the others. Dewfall is highest at N = 0, G = 0, and h_r = 1. The T_a at which dewfall is highest depends on u and vice versa. The change in dewfall for a unit change in N, G or h_r is not affected by the value of N, G or h_r, but increases as T_a or u increase. The change in dewfall for a unit change in T_a or u depends on the value of the other four meteorological factors. - Highlights: • Process of dewfall is examined for a wide range of meteorological conditions. • Effect of meteorological factors on dewfall is individually elucidated. • Interaction between factors and their combined effect on dewfall is assessed. • Extensive