WorldWideScience

Sample records for regression results demonstrate

  1. Demonstration of a Fiber Optic Regression Probe

    Science.gov (United States)

    Korman, Valentin; Polzin, Kurt A.

    2010-01-01

    empirically anchoring any analysis geared towards lifetime qualification. Erosion rate data over an operating envelope could also be useful in modeling detailed physical processes. The sensor has been embedded in many regressing media for the purposes of proof-of-concept testing. A gross demonstration of its capabilities was performed using a sanding wheel to remove layers of metal. A longer-term demonstration measurement involved the placement of the sensor in a brake pad, monitoring the removal of pad material associated with the normal wear-and-tear of driving. It was used to measure the regression rates of the combustible media in small model rocket motors and road flares. Finally, a test was performed using a sand blaster to remove small amounts of material at a time. This test was aimed at demonstrating the unit's present resolution, and its output is compared with laser profilometry data obtained simultaneously. At the lowest resolution levels, this unit should be useful in locally quantifying the erosion rates of the channel walls in plasma thrusters.

  2. Antares: preliminary demonstrator results

    International Nuclear Information System (INIS)

    Kouchner, A.

    2000-05-01

    The ANTARES collaboration is building an undersea neutrino telescope off Toulon (Mediterranean Sea) with an effective area of ∼0.1 km². An extensive study of the site properties has been carried out, together with software analysis, in order to optimize the performance of the detector. Results are summarized here. An instrumented line, linked to shore for the first time via an electro-optical cable, was immersed in late 1999. The preliminary results of this demonstrator line are reported. (author)

  3. A Demonstration of Regression False Positive Selection in Data Mining

    Science.gov (United States)

    Pinder, Jonathan P.

    2014-01-01

    Business analytics courses, such as marketing research, data mining, forecasting, and advanced financial modeling, have substantial predictive modeling components. The predictive modeling in these courses requires students to estimate and test many linear regressions. As a result, false positive variable selection ("type I errors") is…

  4. Salt decontamination demonstration test results

    International Nuclear Information System (INIS)

    Snell, E.B.; Heng, C.J.

    1983-06-01

    The Salt Decontamination Demonstration confirmed that the precipitation process could be used for large-scale decontamination of radioactive waste salt solution. Although a number of refinements are necessary to safely process the long-term requirement of 5 million gallons of waste salt solution per year, there were no observations to suggest that any fundamentals of the process require re-evaluation. Major accomplishments were: (1) 518,000 gallons of decontaminated filtrate were produced from 427,000 gallons of waste salt solution from tank 24H. The demonstration goal was to produce a minimum of 200,000 gallons of decontaminated salt solution; (2) cesium activity in the filtrate was reduced by a factor of 43,000 below the cesium activity in the tank 24 solution. This decontamination factor (DF) exceeded the demonstration goal of a DF greater than 10,000; (3) average strontium-90 activity in the filtrate was reduced by a factor of 26 to less than 10³ d/m/ml versus a goal of less than 10⁴ d/m/ml; and (4) the concentrated precipitate was washed to a final sodium ion concentration of 0.15 M, well below the 0.225 M upper limit for DWPF feed. These accomplishments were achieved on schedule and without incident. Total radiation exposure to personnel was less than 350 mrem and resulted primarily from sampling precipitate slurry inside tank 48. 3 references, 6 figures, 2 tables

  5. Repeated Results Analysis for Middleware Regression Benchmarking

    Czech Academy of Sciences Publication Activity Database

    Bulej, Lubomír; Kalibera, T.; Tůma, P.

    2005-01-01

    Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005

  6. Augmenting Data with Published Results in Bayesian Linear Regression

    Science.gov (United States)

    de Leeuw, Christiaan; Klugkist, Irene

    2012-01-01

    In most research, linear regression analyses are performed without taking into account published results (i.e., reported summary statistics) of similar previous studies. Although the prior density in Bayesian linear regression could accommodate such prior knowledge, formal models for doing so are absent from the literature. The goal of this…

  7. The Majorana Demonstrator Status and Preliminary Results

    Science.gov (United States)

    Yu, C.-H.; Alvis, S. I.; Arnquist, I. J.; Avignone, F. T.; Barabash, A. S.; Barton, C. J.; Bertrand, F. E.; Bode, T.; Brudanin, V.; Busch, M.; Buuck, M.; Caldwell, T. S.; Chan, Y.-D.; Christofferson, C. D.; Chu, P.-H.; Cuesta, C.; Detwiler, J. A.; Dunagan, C.; Efremenko, Yu; Ejiri, H.; Elliott, S. R.; Gilliss, T.; Giovanetti, G. K.; Green, M.; Gruszko, J.; Guinn, I. S.; Guiseppe, V. E.; Haufe, C. R.; Hehn, L.; Henning, R.; Hoppe, E. W.; Howe, M. A.; Keeter, K. J.; Kidd, M. F.; Konovalov, S. I.; Kouzes, R. T.; Lopez, A. M.; Martin, R. D.; Massarczyk, R.; Meijer, S. J.; Mertens, S.; Myslik, J.; Othman, G.; Pettus, W.; Poon, A. W. P.; Radford, D. C.; Rager, J.; Reine, A. L.; Rielage, K.; Ruof, N. W.; Shanks, B.; Shirchenko, M.; Suriano, A. M.; Tedeschi, D.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Wiseman, C.; Xu, W.; Yakushev, E.; Yumatov, V.; Zhitnikov, I.; Zhu, B. Z.

    2018-05-01

    The MAJORANA Collaboration is using an array of high-purity Ge detectors to search for neutrinoless double-beta decay in ⁷⁶Ge. Searches for neutrinoless double-beta decay are understood to be the only viable experimental method for testing the Majorana nature of the neutrino. Observation of this decay would imply violation of lepton number, that neutrinos are Majorana in nature, and provide information on the neutrino mass. The MAJORANA DEMONSTRATOR comprises 44.1 kg of p-type point-contact Ge detectors (29.7 kg enriched in ⁷⁶Ge) surrounded by a low-background shield system. The experiment achieved a high efficiency of converting raw Ge material to detectors and an unprecedented detector energy resolution of 2.5 keV FWHM at Qββ. The MAJORANA collaboration began taking physics data in 2016. This paper summarizes key construction aspects of the Demonstrator and shows preliminary results from initial data.

  8. Arc melter demonstration baseline test results

    International Nuclear Information System (INIS)

    Soelberg, N.R.; Chambers, A.G.; Anderson, G.L.; Oden, L.L.; O'Connor, W.K.; Turner, P.C.

    1994-07-01

    This report describes the test results and evaluation for the Phase 1 (baseline) arc melter vitrification test series conducted for the Buried Waste Integrated Demonstration program (BWID). Phase 1 tests were conducted on surrogate mixtures of as-incinerated wastes and soil. Some buried wastes, soils, and stored wastes at the INEL and other DOE sites are contaminated with transuranic (TRU) radionuclides and hazardous organics and metals. The high temperature environment in an electric arc furnace may be used to process these wastes to produce materials suitable for final disposal. An electric arc furnace system can treat heterogeneous wastes and contaminated soils by (a) dissolving and retaining TRU elements and selected toxic metals as oxides in the slag phase, (b) destroying organic materials by dissociation, pyrolyzation, and combustion, and (c) capturing separated volatilized metals in the offgas system for further treatment. Structural metals in the waste may be melted and tapped separately for recycle or disposal, or these metals may be oxidized and dissolved into the slag. The molten slag, after cooling, will provide a glass/ceramic final waste form that is homogeneous, highly nonleachable, and extremely durable. These features make this waste form suitable for immobilization of TRU radionuclides and toxic metals for geologic timeframes. Further, the volume of contaminated wastes and soils will be substantially reduced in the process.

  9. MODIL cryocooler producibility demonstration project results

    International Nuclear Information System (INIS)

    Cruz, G.E.; Franks, R.M.

    1993-01-01

    The production of large quantities of spacecraft needed by SDIO will require a cultural change in design and production practices. Low-rate production and the need for exceedingly high reliability have driven the industry to custom-designed, hand-crafted, and exhaustively tested satellites. These factors have militated against employing design and manufacturing cost reduction methods commonly used in tactical missile production. Additional challenges to achieving production efficiencies are presented by the SDI spacecraft mission requirements. IR sensor systems, for example, comprise subassemblies and components that require the design, manufacture, and maintenance of ultra-precision tolerances over challenging operational lifetimes. These IR sensors demand the use of reliable, closed-loop, cryogenic refrigerators or active cryocoolers to meet stringent system acquisition and pointing requirements. The authors summarize some spacecraft cryocooler requirements and discuss observations regarding industry's current production capabilities for cryocoolers. The results of the Lawrence Livermore National Laboratory (LLNL) Spacecraft Fabrication and Test (SF and T) MODIL's Phase I producibility demonstration project are presented.

  10. Demonstration of a Fiber Optic Regression Probe in a High-Temperature Flow

    Science.gov (United States)

    Korman, Valentin; Polzin, Kurt

    2011-01-01

    empirically anchoring any analysis geared towards lifetime qualification. Erosion rate data over an operating envelope could also be useful in modeling detailed physical processes. The sensor has been embedded in many regressing media to demonstrate its capabilities in a number of regressing environments. In the present work, sensors were installed in the eroding/regressing throat region of a converging-diverging flow, with the working gas heated to high temperatures by means of a high-pressure arc discharge at steady-state discharge power levels up to 500 kW. The amount of regression observed in each material sample was quantified using a laser profilometer, and the results were compared with the in-situ erosion measurements to demonstrate the efficacy of the measurement technique in very harsh, high-temperature environments.

  11. The Development and Demonstration of Multiple Regression Models for Operant Conditioning Questions.

    Science.gov (United States)

    Fanning, Fred; Newman, Isadore

    Based on the assumption that inferential statistics can make the operant conditioner more sensitive to possible significant relationships, regression models were developed to test the statistical significance of differences between the slopes and Y-intercepts of the experimental and control group subjects. These results were then compared to the traditional operant…
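The comparison this abstract describes — testing whether experimental and control subjects differ in slope and Y-intercept — is typically carried out with a dummy-coded interaction regression. A minimal sketch on simulated data (group labels, effect sizes, and noise level are all hypothetical, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical operant-conditioning data: response rate vs. session,
# for one control (group=0) and one experimental (group=1) subject.
sessions = np.tile(np.arange(10.0), 2)
group = np.repeat([0.0, 1.0], 10)
rate = 5 + 0.5 * sessions + group * (2 + 1.0 * sessions) \
       + rng.normal(0, 0.5, 20)

# Dummy-coded interaction model:
#   rate = b0 + b1*session + b2*group + b3*(session*group)
# b2 tests the Y-intercept difference, b3 tests the slope difference.
X = np.column_stack([np.ones(20), sessions, group, sessions * group])
b, *_ = np.linalg.lstsq(X, rate, rcond=None)

# Standard errors and t-statistics for each coefficient (df = 16).
resid = rate - X @ b
s2 = resid @ resid / (20 - 4)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t = b / se
print(np.round(b, 2), np.round(t, 1))
```

The t-statistic on the interaction term tests the slope difference directly, which is the inferential complement to the visual inspection traditional in operant work.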

  12. Mapping the results of local statistics: Using geographically weighted regression

    Directory of Open Access Journals (Sweden)

    Stephen A. Matthews

    2012-03-01

    Full Text Available BACKGROUND The application of geographically weighted regression (GWR) - a local spatial statistical technique used to test for spatial nonstationarity - has grown rapidly in the social, health, and demographic sciences. GWR is a useful exploratory analytical tool that generates a set of location-specific parameter estimates which can be mapped and analysed to provide information on spatial nonstationarity in the relationships between predictors and the outcome variable. OBJECTIVE A major challenge to users of GWR methods is how best to present and synthesize the large number of mappable results, specifically the local parameter estimates and local t-values, generated from local GWR models. We offer an elegant solution. METHODS This paper introduces a mapping technique to simultaneously display local parameter estimates and local t-values on one map, based on the use of data selection and transparency techniques. We integrate GWR software and a GIS software package (ArcGIS) and adapt earlier work in cartography on bivariate mapping. We compare traditional mapping strategies (i.e., side-by-side comparison and isoline overlay maps) with our method, using an illustration focusing on US county infant mortality data. CONCLUSIONS The resultant map design is more elegant than methods used to date. This type of map presentation can facilitate the exploration and interpretation of nonstationarity, focusing map reader attention on the areas of primary interest.

  13. Two SPSS programs for interpreting multiple regression results.

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Ferrando, Pere J; Chico, Eliseo

    2010-02-01

    When multiple regression is used in explanation-oriented designs, it is very important to determine both the usefulness of the predictor variables and their relative importance. Standardized regression coefficients are routinely provided by commercial programs. However, they generally function rather poorly as indicators of relative importance, especially in the presence of substantially correlated predictors. We provide two user-friendly SPSS programs that implement currently recommended techniques and recent developments for assessing the relevance of the predictors. The programs also allow the user to take into account the effects of measurement error. The first program, MIMR-Corr.sps, uses a correlation matrix as input, whereas the second program, MIMR-Raw.sps, uses the raw data and computes bootstrap confidence intervals of different statistics. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from http://brm.psychonomic-journals.org/content/supplemental.
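Standardized betas and relative-importance indices of the kind these SPSS programs compute can both be derived from a correlation matrix. One widely used index, Pratt's importance measure (the standardized beta times the zero-order correlation, which partitions R²), can be sketched in a few lines; the correlation values below are purely illustrative:

```python
import numpy as np

# Hypothetical correlation matrix, ordered (x1, x2, y); values invented.
R = np.array([[1.0, 0.5, 0.6],
              [0.5, 1.0, 0.5],
              [0.6, 0.5, 1.0]])

Rxx = R[:2, :2]   # predictor intercorrelations
rxy = R[:2, 2]    # predictor-criterion correlations

beta = np.linalg.solve(Rxx, rxy)   # standardized regression weights
pratt = beta * rxy                 # Pratt's importance measure
r2 = pratt.sum()                   # Pratt measures sum to R-squared

print(beta.round(3), pratt.round(3), round(r2, 3))
```

Note how the correlated predictors shrink the standardized betas, while the Pratt measures still give an additive decomposition of the explained variance; the actual MIMR programs implement additional and more recent indices.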

  14. Linear regression metamodeling as a tool to summarize and present simulation model results.

    Science.gov (United States)

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
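The core mechanic the abstract describes — regressing probabilistic sensitivity analysis (PSA) outcomes on standardized input parameters so that the intercept estimates the base case and the slopes rank parameter influence — can be sketched with a toy model (all numbers hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a PSA: 10,000 draws of three standardized input
# parameters, fed through a toy "simulation model" with noise.
n = 10_000
params = rng.normal(size=(n, 3))
outcome = 100.0 + params @ np.array([5.0, -2.0, 0.5]) \
          + rng.normal(scale=0.1, size=n)

# Metamodel: ordinary least squares of the outcome on the inputs.
X = np.column_stack([np.ones(n), params])
coefs, *_ = np.linalg.lstsq(X, outcome, rcond=None)

# The intercept estimates the base-case outcome; the slope magnitudes
# rank the inputs by influence, like a one-shot sensitivity analysis.
print(coefs.round(2))
```

Because every PSA draw contributes to the fit, the slopes summarize sensitivity over the whole parameter space rather than at a handful of hand-picked values, which is the reliability advantage the authors claim over one-way designs.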

  15. Integrating scientific results for a post-closure safety demonstration

    International Nuclear Information System (INIS)

    Taylor, E.C.; Ramspott, L.D.; Sinnock, S.; Sprecher, W.M.

    1994-01-01

    The U.S. Department of Energy (DOE) is developing a nuclear waste management system that will accept high-level radioactive waste, transport it, store it, and ultimately emplace it in a deep geologic repository. The key activity now is determining whether Yucca Mountain, Nevada, is suitable as a site for the repository. If so, the crucial technological advance will be the demonstration that disposal of nuclear waste will be safe for thousands of years after closure. Recent regulatory, legal, and scientific developments imply that the safety demonstration must be simple. The scientific developments taken together support a simple set of hypotheses that constitute a post-closure safety argument for a repository at Yucca Mountain. If the understanding of Yucca Mountain hydrology presented in the Site Characterization Plan proves correct, then these hypotheses might be confirmed by combining results of Surface-Based Testing with early testing results in the Exploratory Studies Facility.

  16. Graphite electrode arc melter demonstration Phase 2 test results

    International Nuclear Information System (INIS)

    Soelberg, N.R.; Chambers, A.G.; Anderson, G.L.; O'Connor, W.K.; Oden, L.L.; Turner, P.C.

    1996-06-01

    Several U.S. Department of Energy organizations and the U.S. Bureau of Mines have been collaboratively conducting mixed waste treatment process demonstration testing on the near full-scale graphite electrode submerged arc melter system at the Bureau's Albany (Oregon) Research Center. An initial test series successfully demonstrated arc melter capability for treating surrogate incinerator ash of buried mixed wastes with soil. The conceptual treatment process for that test series assumed that buried waste would be retrieved and incinerated, and that the incinerator ash would be vitrified in an arc melter. This report presents results from a recently completed second series of tests, undertaken to determine the ability of the arc melter system to stably process a wide range of "as-received" heterogeneous solid mixed wastes containing high levels of organics, representative of the wastes buried and stored at the Idaho National Engineering Laboratory (INEL). The Phase 2 demonstration test results indicate that an arc melter system is capable of directly processing these wastes and could enable elimination of an up-front incineration step in the conceptual treatment process.

  18. Attitude Operation Results of Solar Sail Demonstrator IKAROS

    Science.gov (United States)

    Saiki, Takanao; Tsuda, Yuichi; Funase, Ryu; Mimasu, Yuya; Shirasawa, Yoji; Ikaros Demonstration Team

    This paper presents the attitude operation results of the Japanese interplanetary solar sail demonstration spacecraft IKAROS. IKAROS was launched on 21 May 2010 (JST) aboard an H-IIA rocket, together with the AKATSUKI Venus climate orbiter. Because IKAROS was a secondary payload, the development cost and schedule were constrained, and the onboard attitude system is very simple. This paper introduces the attitude determination and control system. Because IKAROS is a spin-type spacecraft with a large membrane, attitude control is not easy, and it is very important to determine the long-term attitude plan in advance. This paper also outlines the IKAROS attitude operation plan and its operation results.

  19. Foothill Transit Battery Electric Bus Demonstration Results: Second Report

    Energy Technology Data Exchange (ETDEWEB)

    Eudy, Leslie [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jeffers, Matthew [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-06-28

    This report summarizes results of a battery electric bus (BEB) evaluation at Foothill Transit, located in the San Gabriel and Pomona Valley region of Los Angeles County, California. Foothill Transit is collaborating with the California Air Resources Board and the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory to evaluate its fleet of Proterra BEBs in revenue service. The focus of this evaluation is to compare performance of the BEBs to that of conventional technology and to track progress over time toward meeting performance targets. This project has also provided an opportunity for DOE to conduct a detailed evaluation of the BEBs and charging infrastructure. This is the second report summarizing the results of the BEB demonstration at Foothill Transit and it provides data on the buses from August 2015 through December 2016. Data are provided on a selection of compressed natural gas buses as a baseline comparison.

  20. King County Metro Battery Electric Bus Demonstration: Preliminary Project Results

    Energy Technology Data Exchange (ETDEWEB)

    2017-05-22

    The U.S. Federal Transit Administration (FTA) funds a variety of research projects that support the commercialization of zero-emission bus technology. To evaluate projects funded through these programs, FTA has enlisted the help of the National Renewable Energy Laboratory (NREL) to conduct third-party evaluations of the technologies deployed under the FTA programs. NREL works with the selected agencies to evaluate the performance of the zero-emission buses compared to baseline conventional buses in similar service. The evaluation effort will advance the knowledge base of zero-emission technologies in transit bus applications and provide 'lessons learned' to aid other fleets in incrementally introducing next-generation zero-emission buses into their operations. This report provides preliminary performance evaluation results from a demonstration of three zero-emission battery electric buses at King County Metro in King County, Washington. NREL developed this preliminary results report to quickly disseminate evaluation results to stakeholders. Detailed evaluation results will be published in future reports.

  1. The CMS event builder demonstrator and results with Myrinet

    CERN Document Server

    Antchev, G; Cittolin, Sergio; Erhan, S; Faure, B; Gigi, D; Gutleber, J; Jacobs, C; Meijers, F; Meschi, E; Ninane, A; Orsini, L; Pollet, Lucien; Rácz, A; Samyn, D; Schleifer, W; Sinanis, N; Sphicas, Paris

    2001-01-01

    The data acquisition system for the CMS experiment at the Large Hadron Collider (LHC) will require a large and high-performance event building network. Several switch technologies are currently being evaluated in order to compare different architectures for the event builder. One candidate is Myrinet. This paper describes the demonstrator which has been set up to study a small-scale (16*16) event builder based on PCs running Linux connected to Myrinet and Ethernet switches. A detailed study of the Myrinet switch performance has been performed for various traffic conditions, including the behaviour of composite switches. Results from event building studies are presented, including measurements on throughput, overhead and scaling. Traffic shaping techniques have been implemented and the effect on the event building performance has been investigated. The paper reports on the performance and maximum event rate obtainable using custom software, not described, for the Myrinet control program and the low-level communica...

  2. Fission Surface Power Technology Demonstration Unit Test Results

    Science.gov (United States)

    Briggs, Maxwell H.; Gibson, Marc A.; Geng, Steven M.; Sanzi, James L.

    2016-01-01

    The Fission Surface Power (FSP) Technology Demonstration Unit (TDU) is a system-level demonstration of fission power technology intended for use on manned missions to Mars. The baseline FSP system consists of a 190 kWt UO2 fast-spectrum reactor cooled by a primary pumped liquid metal loop. This liquid metal loop transfers heat to two intermediate liquid metal loops designed to isolate fission products in the primary loop from the balance of plant. The intermediate liquid metal loops transfer heat to four Stirling Power Conversion Units (PCUs), each of which produces 12 kWe (48 kW total) and rejects waste heat to two pumped water loops, which transfer the waste heat to titanium-water heat pipe radiators. The FSP TDU simulates a single leg of the baseline FSP system using an electrically heated core simulator, a single liquid metal loop, a single PCU, and a pumped water loop which rejects the waste heat to a Facility Cooling System (FCS). When operated at the nominal operating conditions (modified for low liquid metal flow) during TDU testing, the PCU produced 8.9 kW of power at an efficiency of 21.7 percent, resulting in a net system power of 8.1 kW and a system-level efficiency of 17.2 percent. The reduction in PCU power from levels seen during electrically heated testing is the result of insufficient heat transfer from the NaK heater head to the Stirling acceptor, which could not be tested at Sunpower prior to delivery to the NASA Glenn Research Center (GRC). The maximum PCU power of 10.4 kW was achieved at the maximum liquid metal temperature of 875 K, minimum water temperature of 350 K, 1.1 kg/s liquid metal flow, 0.39 kg/s water flow, and 15.0 mm amplitude, at an efficiency of 23.3 percent. This resulted in a system net power of 9.7 kW and a system efficiency of 18.7 percent.

  3. CUSUM-Logistic Regression analysis for the rapid detection of errors in clinical laboratory test results.

    Science.gov (United States)

    Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T

    2016-02-01

    The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in time periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient-based QC procedure for the more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which resulted in correlation coefficients between the predicted and measured result of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week, and the time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20, with a mean sensitivity of 97%, which was considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that only used the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can be an effective approach for the rapid and sensitive detection of clinical laboratory errors. Published by Elsevier Inc.
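The first stage of the method — predicting each analyte from the rest of the panel and flagging results that deviate too far from the prediction — can be sketched as below. The simulation uses two correlated stand-in analytes and a plain residual threshold; the analyte names, numbers, and threshold are hypothetical, and the paper's full model additionally uses day/time covariates, logistic regression, and a daily CUSUM tally:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated in-control historical data: two analytes that track each
# other, as stand-ins for members of a Chem-14 panel.
n = 500
sodium = rng.normal(140, 3, n)
chloride = 0.8 * sodium + rng.normal(0, 1.5, n)

# Fit: predict chloride from sodium on the historical data.
A = np.column_stack([np.ones(n), sodium])
beta, *_ = np.linalg.lstsq(A, chloride, rcond=None)
resid_sd = np.std(chloride - A @ beta)

def flag(measured_na, measured_cl, z=3.0):
    """Flag a result whose deviation from the panel-based
    prediction exceeds z residual standard deviations."""
    pred = beta[0] + beta[1] * measured_na
    return abs(measured_cl - pred) > z * resid_sd

print(flag(140.0, 112.0))   # internally consistent pair
print(flag(140.0, 132.0))   # gross single-analyte error
```

The advantage over periodic QC material is that every patient result is checked against the correlation structure of its own panel, so a drifting or erroneous channel can be caught between scheduled QC runs.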

  4. The Pixel-TPC. Demonstration of the concept and results

    Energy Technology Data Exchange (ETDEWEB)

    Lupberger, Michael [Universitaet Bonn (Germany); Collaboration: LCTPC-Deutschland-Collaboration

    2016-07-01

    A Time Projection Chamber (TPC) is foreseen as tracker for the ILD, one of the two detector concepts at the planned International Linear Collider (ILC). At the TPC endplates, Micromegas or GEMs will be used as gas amplification structure. Besides segmented anodes, an active endplate with pixel chips, in our experiments the Timepix ASIC, is also considered as a readout option. In a photolithographic process a grid has been produced on top of the chip to form a so-called InGrid, which is a Micromegas-like gas amplification structure. Several thousand InGrids are necessary to equip a complete TPC endplate. For demonstration of the concept, three endplate modules have been built with a total of 160 InGrids covering an active area of about 300 cm². To read out the 10.5 million channels, the Timepix ASIC was implemented in a general readout system. A dedicated powering scheme, DAQ and online event display were developed by our group. The feasibility of the Pixel-TPC could be proven in a test beam campaign at DESY early 2015. The data has partly been analysed and shows the potential of this new type of detector. An overview of the developments necessary to build the detector is presented, followed by impressions from the test beam and some of the results from the data analysis.

  5. Michigan Oncology Medical Home Demonstration Project: First-Year Results.

    Science.gov (United States)

    Kuntz, Gordon; Tozer, Jane; Snegosky, Jeff; Fox, John; Neumann, Kurt

    2014-03-01

    The Michigan Oncology Medical Home Demonstration Project (MOMHDP) is an innovative multipractice oncology medical home model, supported by payment reform. Sponsored by Priority Health, Physician Resource Management, and ION Solutions, MOMHDP includes four oncology practices and 29 physicians. Oncology practices used existing technologies, with MOMHDP providing evidence-based treatment guideline selection and compliance tracking, automated physician order entry, a patient portal, symptom management/standardized nurse triage, and advance care planning. To support changes in care and administrative models and to focus on quality, MOMHDP modifies provider payments. The program replaces the average sales price payment methodology with a drug acquisition reimbursement plus a care management fee, calculated to increase total drug reimbursement. Additionally, it reimburses for chemotherapy and treatment planning and advance care planning consultation. There is also a shared savings opportunity. MOMHDP will be enhanced in its second year to include a survivorship program, patient distress screening, imaging guidelines, and standardized patient satisfaction surveys. Priority Health patients receiving chemotherapy for a cancer diagnosis were recruited to the program. Results for this group were compared with a control group of patients from a prior period. In addition to the financial results, the project also accomplished the following: (1) adherence to practice-selected guidelines, (2) institution of advance care planning, (3) effective and standardized symptom management; and (4) payment reform. We have identified a number of critical success factors: strong payer/provider collaboration built on trust through transparent use and cost data; timing of clinical standardization must come from the practices, so they can effectively absorb new approaches; having comprehensive, written program documentation and consistently applied training facilitate practice understanding

  6. Bias in logistic regression due to imperfect diagnostic test results and practical correction approaches.

    Science.gov (United States)

    Valle, Denis; Lima, Joanna M Tucker; Millar, Justin; Amratia, Punam; Haque, Ubydul

    2015-11-04

    Logistic regression is a statistical model widely used in cross-sectional and cohort studies to identify and quantify the effects of potential disease risk factors. However, the impact of imperfect tests on adjusted odds ratios (and thus on the identification of risk factors) is under-appreciated. The purpose of this article is to draw attention to the problem associated with modelling imperfect diagnostic tests, and propose simple Bayesian models to adequately address this issue. A systematic literature review was conducted to determine the proportion of malaria studies that appropriately accounted for false-negatives/false-positives in a logistic regression setting. Inference from the standard logistic regression was also compared with that from three proposed Bayesian models using simulations and malaria data from the western Brazilian Amazon. A systematic literature review suggests that malaria epidemiologists are largely unaware of the problem of using logistic regression to model imperfect diagnostic test results. Simulation results reveal that statistical inference can be substantially improved when using the proposed Bayesian models versus the standard logistic regression. Finally, analysis of original malaria data with one of the proposed Bayesian models reveals that microscopy sensitivity is strongly influenced by how long people have lived in the study region, and an important risk factor (i.e., participation in forest extractivism) is identified that would have been missed by standard logistic regression. Given the numerous diagnostic methods employed by malaria researchers and the ubiquitous use of logistic regression to model the results of these diagnostic tests, this paper provides critical guidelines to improve data analysis practice in the presence of misclassification error. Easy-to-use code that can be readily adapted to WinBUGS is provided, enabling straightforward implementation of the proposed Bayesian models.
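The bias the abstract describes, and the likelihood-based fix, can be sketched numerically. This is not the paper's Bayesian model: it is a minimal maximum-likelihood version in which sensitivity and specificity are assumed known, so that the probability of a positive test is `se*p + (1-sp)*(1-p)` rather than the true infection probability `p`. All data below are simulated; the true slope is 1.0.

```python
# Sketch (not the article's exact model): correcting logistic regression for an
# imperfect diagnostic test. Sensitivity (se) and specificity (sp) are assumed
# known here; the article's Bayesian models instead place priors on them.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, beta0, beta1 = 20000, -1.0, 1.0      # true intercept and risk-factor effect
se, sp = 0.80, 0.95                     # test sensitivity and specificity

x = rng.normal(size=n)
p_true = 1 / (1 + np.exp(-(beta0 + beta1 * x)))     # true infection probability
infected = rng.random(n) < p_true
# Observed result: positive with prob. se if infected, with prob. 1-sp if not
test_pos = np.where(infected, rng.random(n) < se, rng.random(n) < (1 - sp))

def nll(theta, corrected):
    b0, b1 = theta
    p = 1 / (1 + np.exp(-(b0 + b1 * x)))
    q = se * p + (1 - sp) * (1 - p) if corrected else p   # P(test positive)
    q = np.clip(q, 1e-10, 1 - 1e-10)
    return -np.sum(test_pos * np.log(q) + (~test_pos) * np.log(1 - q))

naive = minimize(nll, [0.0, 0.0], args=(False,)).x
adjusted = minimize(nll, [0.0, 0.0], args=(True,)).x
print(naive[1], adjusted[1])   # the naive slope is attenuated toward zero
```

The naive fit treats the test result as the disease status and underestimates the risk-factor effect; the corrected likelihood recovers a slope close to the generating value.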

  7. Results from an in-plant demonstration of intelligent control

    International Nuclear Information System (INIS)

    Edwards, R.M.; Garcia, H.E.; Messick, N.

    1993-01-01

    A learning systems-based reconfigurable controller was demonstrated on the deaerating feedwater heater at the Experimental Breeder Reactor II (EBR-II) on April 1, 1993. Failures of the normal pressure regulating process were introduced by reducing the steam flow to the heater by as much as 10%. The controller maintained pressure in the heater at acceptable levels for several minutes, whereas operator intervention would have otherwise been required within a few seconds. This experiment demonstrates the potential of advanced control techniques for improving safety, reliability, and performance of power plant operations as well as the utility of EBR-II as an experimental power plant controls facility

  8. FINAL SIMULATION RESULTS FOR DEMONSTRATION CASE 1 AND 2

    Energy Technology Data Exchange (ETDEWEB)

    David Sloan; Woodrow Fiveland

    2003-10-15

The goal of this DOE Vision-21 project work scope was to develop an integrated suite of software tools that could be used to simulate and visualize advanced plant concepts. Existing process simulation software did not meet the DOE's objective of "virtual simulation," which was needed to evaluate complex cycles. The overall intent of the DOE was to improve predictive tools for cycle analysis, and to improve the component models that are used in turn to simulate equipment in the cycle. Advanced component models are available; however, a generic coupling capability that would link the advanced component models to the cycle simulation software remained to be developed. In the current project, the coupling of the cycle analysis and cycle component simulation software was based on an existing suite of programs. The challenge was to develop a general-purpose software and communications link between the cycle analysis software Aspen Plus® (marketed by Aspen Technology, Inc.) and specialized component modeling packages, as exemplified by industrial proprietary codes (utilized by ALSTOM Power Inc.) and the FLUENT® computational fluid dynamics (CFD) code (provided by Fluent Inc.). A software interface and controller, based on the open CAPE-OPEN standard, has been developed and extensively tested. Various test runs and demonstration cases have been utilized to confirm the viability and reliability of the software. ALSTOM Power was tasked with selecting and running two demonstration cases to test the software: (1) a conventional steam cycle (designated as Demonstration Case 1), and (2) a combined cycle test case (designated as Demonstration Case 2). Demonstration Case 1 is a 30 MWe coal-fired power plant for municipal electricity generation, while Demonstration Case 2 is a 270 MWe, natural gas-fired, combined cycle power plant. Sufficient data was available from the operation of both power plants to complete the cycle

  9. Last results of MADRAS, a space active optics demonstrator

    Science.gov (United States)

    Laslandes, Marie; Hourtoule, Claire; Hugot, Emmanuel; Ferrari, Marc; Devilliers, Christophe; Liotard, Arnaud; Lopez, Céline; Chazallet, Frédéric

    2017-11-01

The goal of the MADRAS project (Mirror Active, Deformable and Regulated for Applications in Space) is to highlight the interest of active optics for the next generation of space telescopes and instrumentation. Wave-front errors in future space telescopes will mainly come from thermal dilatation and zero gravity, which induce large deformations of lightweight primary mirrors. To compensate for these effects, a 24-actuator, 100 mm diameter deformable mirror has been designed to be inserted in a pupil relay. Within the project, such a system has been optimized, integrated and experimentally characterized. The system is designed for the wave-front errors expected in 3 m-class primary mirrors, taking into account space constraints such as compactness, low weight, low power consumption and mechanical strength. Finite element analysis allowed an optimization of the system in order to reach a precision of correction better than 10 nm rms. A dedicated test bed has been designed to fully characterize the integrated mirror performance in representative conditions. The test set-up is made of three main parts: a telescope aberration generator, a correction loop with the MADRAS mirror and a Shack-Hartmann wave-front sensor, and PSF imaging. In addition, Fizeau interferometry monitors the optical surface shape. We have developed and characterized an active optics system with a limited number of actuators and a design fitting space requirements. All the conducted tests demonstrate the efficiency of such a system for real-time, in situ wave-front correction. It would allow a significant improvement in the optical performance of future space telescopes while relaxing the specifications on the other components.
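The core computation in a correction loop like the one described above can be sketched generically: given an influence matrix (the wave-front change produced by a unit command on each actuator), the commands that best flatten a measured wave-front error follow from a least-squares pseudoinverse. This is a standard active-optics step, not the actual MADRAS control code; apart from the 24-actuator count, every value below is invented for illustration.

```python
# Generic active-optics correction step (illustrative, not MADRAS software):
# least-squares actuator commands from an influence matrix and a measured
# wave-front error. Shapes and values are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_act = 500, 24                 # wave-front samples; 24 actuators as in MADRAS
IF = rng.normal(size=(n_pix, n_act))   # influence matrix: wave-front per unit command

true_cmd = rng.normal(size=n_act)
wfe = IF @ true_cmd                    # measured wave-front error to be cancelled

cmd = -np.linalg.pinv(IF) @ wfe        # least-squares commands that flatten it
residual = wfe + IF @ cmd
print(np.sqrt(np.mean(residual**2)))   # residual rms; ~0 for a perfectly known IF
```

In practice the influence matrix is measured interferometrically and the residual is limited by fitting error and noise, which is what the 10 nm rms specification quantifies.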

  10. Recent results of the National Ignition Facility Beamlet demonstration project

    International Nuclear Information System (INIS)

    Van Wonterghem, B.M.; Caird, J.A.; Barker, C.E.; Campbell, J.H.; Murray, J.R.; Speck, D.R.

    1995-01-01

The activation of a full-scale, single-beam prototype of a multipass-amplifier-cavity-based, fusion-class laser has been completed. A 35 × 35 cm² beam is amplified during four passes through an 11-slab-long amplifier in the cavity, and is switched out using a full-aperture Pockels cell and polarizer. Further amplification is achieved in a five-slab-long booster amplifier, before the beam is frequency tripled by a Type I/Type II frequency converter. We present initial performance results of this laser system, called Beamlet. At 1ω, energies up to 17.3 kJ have been generated in a 10 ns pulse, and frequency-tripled beams up to 8.3 kJ in a 3 ns pulse

  11. Ground Operations Demonstration Unit for Liquid Hydrogen Initial Test Results

    Science.gov (United States)

    Notardonato, W. U.; Johnson, W. L.; Swanger, A. M.; Tomsik, T.

    2015-01-01

    NASA operations for handling cryogens in ground support equipment have not changed substantially in 50 years, despite major technology advances in the field of cryogenics. NASA loses approximately 50% of the hydrogen purchased because of a continuous heat leak into ground and flight vessels, transient chill down of warm cryogenic equipment, liquid bleeds, and vent losses. NASA Kennedy Space Center (KSC) needs to develop energy-efficient cryogenic ground systems to minimize propellant losses, simplify operations, and reduce cost associated with hydrogen usage. The GODU LH2 project has designed, assembled, and started testing of a prototype storage and distribution system for liquid hydrogen that represents an advanced end-to-end cryogenic propellant system for a ground launch complex. The project has multiple objectives including zero loss storage and transfer, liquefaction of gaseous hydrogen, and densification of liquid hydrogen. The system is unique because it uses an integrated refrigeration and storage system (IRAS) to control the state of the fluid. This paper will present and discuss the results of the initial phase of testing of the GODU LH2 system.

  12. Variances in the projections, resulting from CLIMEX, Boosted Regression Trees and Random Forests techniques

    Science.gov (United States)

    Shabani, Farzin; Kumar, Lalit; Solhjouy-fard, Samaneh

    2017-08-01

The aim of this study was to comparatively investigate and evaluate the capabilities of correlative and mechanistic modeling processes, applied to the projection of future distributions of date palm in novel environments, and to establish a method of minimizing uncertainty in the projections of differing techniques. Geographically, the study covers Middle Eastern countries. We compared the mechanistic model CLIMEX (CL) with the correlative models MaxEnt (MX), Boosted Regression Trees (BRT), and Random Forests (RF) to project current and future distributions of date palm (Phoenix dactylifera L.). The Global Climate Model (GCM) CSIRO-Mk3.0 (CS), using the A2 emissions scenario, was selected for making projections. Both indigenous and alien distribution data of the species were utilized in the modeling process. The common areas predicted by MX, BRT, RF, and CL from the CS GCM were extracted and compared to ascertain the projection uncertainty levels of each individual technique. The common areas identified by all four modeling techniques were used to produce a map indicating suitable and unsuitable areas for date palm cultivation in Middle Eastern countries, for the present and for the year 2100. The four modeling approaches predict fairly different distributions. Projections from CL were more conservative than those from MX, and BRT and RF were the most conservative methods in terms of projections for the current time. The combination of the final CL and MX projections for the present and 2100 provides higher certainty concerning those areas that will become highly suitable for future date palm cultivation. According to the four models, cold, hot, and wet stress, with regional differences, appear to be the major restrictions on future date palm distribution. The results demonstrate variances in the projections resulting from the different techniques. The assessment and interpretation of model projections requires reservations
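The "common areas" step in the abstract reduces to intersecting binary suitability maps: a grid cell counts as suitable only where every technique agrees. A minimal sketch, with invented one-dimensional toy maps standing in for the real rasters:

```python
# Minimal sketch of extracting the areas common to several model projections.
# The 1-D boolean "maps" below are invented toy data, not the study's rasters.
import numpy as np

# Suitability (True/False) per grid cell for the four modeling techniques
mx  = np.array([1, 1, 0, 1, 0, 1], dtype=bool)   # MaxEnt
brt = np.array([1, 1, 0, 0, 0, 1], dtype=bool)   # Boosted Regression Trees
rf  = np.array([1, 0, 0, 1, 0, 1], dtype=bool)   # Random Forests
cl  = np.array([1, 1, 1, 1, 0, 1], dtype=bool)   # CLIMEX

common = mx & brt & rf & cl        # cells all four techniques call suitable
print(common)                      # [ True False False False False  True]
```

The same elementwise `&` applies unchanged to 2-D rasters, and the fraction of cells where models disagree gives a simple per-technique uncertainty measure.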

  13. Celiac Disease Associated with a Benign Granulomatous Mass Demonstrating Self-Regression after Initiation of a Gluten-Free Diet.

    Science.gov (United States)

    Tiwari, Abhinav; Sharma, Himani; Qamar, Khola; Khan, Zubair; Darr, Umar; Renno, Anas; Nawras, Ali

    2017-01-01

    Celiac disease is a chronic immune-mediated enteropathy in which dietary gluten induces an inflammatory reaction predominantly in the duodenum. Celiac disease is known to be associated with benign small bowel thickening and reactive lymphadenopathy that often regresses after the institution of a gluten-free diet. A 66-year-old male patient with celiac disease presented with abdominal pain and diarrheal illness. Computerized tomography of the abdomen revealed a duodenal mass. Endoscopic ultrasound-guided fine needle aspiration of the mass revealed bizarre stromal cells which represent a nonspecific tissue reaction to inflammation. This inflammatory mass regressed after the institution of a gluten-free diet. This case report describes a unique presentation of celiac disease in the form of a granulomatous self-regressing mass. Also, this is the first reported case of bizarre stromal cells found in association with celiac disease. In addition to lymphoma and small bowel adenocarcinoma, celiac disease can present with a benign inflammatory mass, which should be serially monitored for resolution with a gluten-free diet.

  14. Celiac Disease Associated with a Benign Granulomatous Mass Demonstrating Self-Regression after Initiation of a Gluten-Free Diet

    Directory of Open Access Journals (Sweden)

    Abhinav Tiwari

    2017-08-01

    Full Text Available Celiac disease is a chronic immune-mediated enteropathy in which dietary gluten induces an inflammatory reaction predominantly in the duodenum. Celiac disease is known to be associated with benign small bowel thickening and reactive lymphadenopathy that often regresses after the institution of a gluten-free diet. A 66-year-old male patient with celiac disease presented with abdominal pain and diarrheal illness. Computerized tomography of the abdomen revealed a duodenal mass. Endoscopic ultrasound-guided fine needle aspiration of the mass revealed bizarre stromal cells which represent a nonspecific tissue reaction to inflammation. This inflammatory mass regressed after the institution of a gluten-free diet. This case report describes a unique presentation of celiac disease in the form of a granulomatous self-regressing mass. Also, this is the first reported case of bizarre stromal cells found in association with celiac disease. In addition to lymphoma and small bowel adenocarcinoma, celiac disease can present with a benign inflammatory mass, which should be serially monitored for resolution with a gluten-free diet.

  15. Posterior consistency for Bayesian inverse problems through stability and regression results

    International Nuclear Information System (INIS)

    Vollmer, Sebastian J

    2013-01-01

    In the Bayesian approach, the a priori knowledge about the input of a mathematical model is described via a probability measure. The joint distribution of the unknown input and the data is then conditioned, using Bayes’ formula, giving rise to the posterior distribution on the unknown input. In this setting we prove posterior consistency for nonlinear inverse problems: a sequence of data is considered, with diminishing fluctuations around a single truth and it is then of interest to show that the resulting sequence of posterior measures arising from this sequence of data concentrates around the truth used to generate the data. Posterior consistency justifies the use of the Bayesian approach very much in the same way as error bounds and convergence results for regularization techniques do. As a guiding example, we consider the inverse problem of reconstructing the diffusion coefficient from noisy observations of the solution to an elliptic PDE in divergence form. This problem is approached by splitting the forward operator into the underlying continuum model and a simpler observation operator based on the output of the model. In general, these splittings allow us to conclude posterior consistency provided a deterministic stability result for the underlying inverse problem and a posterior consistency result for the Bayesian regression problem with the push-forward prior. Moreover, we prove posterior consistency for the Bayesian regression problem based on the regularity, the tail behaviour and the small ball probabilities of the prior. (paper)

  16. Financial analysis and forecasting of the results of small businesses performance based on regression model

    Directory of Open Access Journals (Sweden)

    Svetlana O. Musienko

    2017-03-01

Full Text Available Objective: to develop an economic-mathematical model of the dependence of revenue on other balance sheet items, taking into account the sectoral affiliation of the companies. Methods: using comparative analysis, the article studies the existing approaches to the construction of company management models. Applying regression analysis and the least squares method, which is widely used for financial management of enterprises in Russia and abroad, the author builds a model of the dependence of revenue on other balance sheet items, taking into account the sectoral affiliation of the companies, which can be used in the financial analysis and prediction of small enterprises' performance. Results: the article states the need to identify factors affecting financial management efficiency. The author analyzed scientific research and revealed the lack of comprehensive studies on the methodology for assessing small enterprises' management, while the methods used for large companies are not always suitable for the task. The systematized approaches of various authors to the formation of regression models describe the influence of certain factors on company activity. It is revealed that the resulting indicators in the studies were revenue, profit, or the company's relative profitability. The main drawback of most models is the mathematical, not economic, approach to the definition of the dependent and independent variables. Basing on the analysis, it was determined that the most correct is the model of dependence between revenues and total assets of the company using the decimal logarithm. The model was built using data on the activities of 507 small businesses operating in three spheres of economic activity. Using the presented model, it was proved that there is a direct dependence between the sales proceeds and the main items of the asset balance, as well as differences in the degree of this effect depending on the economic activity of small
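The model form the abstract settles on, a linear relation between decimal logarithms of revenue and total assets, can be fitted by ordinary least squares in a few lines. The data below are synthetic (generated with slope 0.9 and intercept 0.5), not the 507 firms from the study:

```python
# Sketch of the abstract's chosen model: log10(revenue) regressed on
# log10(total assets) by least squares. Synthetic data, not the study's firms.
import numpy as np

rng = np.random.default_rng(2)
assets = 10 ** rng.uniform(4, 7, size=200)        # total assets, hypothetical units
log_assets = np.log10(assets)
log_revenue = 0.9 * log_assets + 0.5 + rng.normal(0, 0.1, size=200)

slope, intercept = np.polyfit(log_assets, log_revenue, 1)
print(slope, intercept)    # close to the generating values 0.9 and 0.5
```

Fitting the same form separately per sector would reproduce the study's point that the strength of the dependence differs across spheres of economic activity.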

  17. Zero Emission Bay Area (ZEBA) Fuel Cell Bus Demonstration Results: Sixth Report

    Science.gov (United States)

    2017-09-01

    This report presents results of a demonstration of fuel cell electric buses (FCEBs) operating in Oakland, California. Alameda-Contra Costa Transit District (AC Transit) leads the Zero Emission Bay Area (ZEBA) demonstration that includes 13 advanced-d...

  18. An illustration of harmonic regression based on the results of the fast Fourier transformation

    Directory of Open Access Journals (Sweden)

    Bertfai Imre

    2002-01-01

Full Text Available The well-known methodology of Fourier analysis was pushed into the background in the second half of the century, parallel to the development of the time-domain approach in the analysis of mainly economic time series. However, from the author's point of view, the former possesses some hidden analytical advantages which deserve to be re-introduced into the toolbox of analysts. This paper, through several case studies, reports research results for a computer algorithm providing a harmonic model for time series. The starting point of the particular method is a harmonic analysis (Fourier analysis or Lomb periodogram). The results are optimized in a multifold manner, resulting in a model which is easy to handle and able to forecast the underlying data. The results provided are largely free of the limitations characteristic of those methods. Furthermore, the calculated results are easy to interpret and use for further decisions. Nevertheless, the author intends to enhance the procedure in several ways. The method shown seems to be very effective and useful in modeling time series consisting of periodic terms. An additional advantage is the easy interpretation of the obtained parameters.
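The workflow described above, periodogram first, harmonic model second, can be sketched in a minimal form: the FFT locates the dominant frequency, a least-squares fit refines amplitude and phase at that frequency, and the fitted harmonics extrapolate beyond the sample. The synthetic monthly series with an annual cycle stands in for the paper's case studies:

```python
# Minimal harmonic-regression sketch: FFT peak -> least-squares harmonic fit
# -> forecast. Synthetic data (monthly series, annual cycle, amplitude 2).
import numpy as np

n, f_true = 240, 1 / 12                       # 20 years of monthly data
t = np.arange(n)
rng = np.random.default_rng(3)
y = 2.0 * np.sin(2 * np.pi * f_true * t + 0.7) + rng.normal(0, 0.3, n)

# 1) Fourier analysis: pick the strongest peak of the periodogram
freqs = np.fft.rfftfreq(n)
power = np.abs(np.fft.rfft(y - y.mean()))
f_hat = freqs[np.argmax(power[1:]) + 1]       # skip the zero-frequency bin

# 2) Harmonic regression at that frequency: y ~ a*sin + b*cos + c
X = np.column_stack([np.sin(2 * np.pi * f_hat * t),
                     np.cos(2 * np.pi * f_hat * t),
                     np.ones(n)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
amplitude = np.hypot(coef[0], coef[1])
print(f_hat, amplitude)                       # ~1/12 and ~2.0

# 3) Forecast: evaluate the fitted harmonics beyond the sample
t_fut = np.arange(n, n + 12)
forecast = (coef[0] * np.sin(2 * np.pi * f_hat * t_fut)
            + coef[1] * np.cos(2 * np.pi * f_hat * t_fut) + coef[2])
```

Real series need several harmonics and a detrending step, and frequencies off the FFT grid call for a Lomb periodogram or nonlinear refinement, which is where the paper's multifold optimization comes in.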

  19. Land surface temperature downscaling using random forest regression: primary result and sensitivity analysis

    Science.gov (United States)

    Pan, Xin; Cao, Chen; Yang, Yingbao; Li, Xiaolong; Shan, Liangliang; Zhu, Xi

    2018-04-01

The land surface temperature (LST) derived from thermal infrared satellite images is a meaningful variable in many remote sensing applications. At present, however, the spatial resolution of satellite thermal infrared sensors is too coarse to meet the needs of many applications. In this study, an LST image was downscaled with a random forest model relating LST to multiple predictors in an arid region with an oasis-desert ecotone. The proposed downscaling approach was evaluated using LST derived from the MODIS LST product of Zhangye City in the Heihe Basin. The primary result of the LST downscaling shows that the distribution of downscaled LST matched that of the oasis and desert ecosystems. A sensitivity analysis showed that the factors most sensitive for LST downscaling were the modified normalized difference water index (MNDWI)/normalized multi-band drought index (NMDI), soil adjusted vegetation index (SAVI)/shortwave infrared reflectance (SWIR)/normalized difference vegetation index (NDVI), normalized difference building index (NDBI)/SAVI and SWIR/NDBI/MNDWI/NDWI for the regions of water, vegetation, building and desert, with LST variation (at most) of 0.20/-0.22 K, 0.92/0.62/0.46 K, 0.28/-0.29 K and 3.87/-1.53/-0.64/-0.25 K under +/-0.02 predictor perturbations, respectively.
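The downscaling logic itself is independent of the regression engine: fit a model between LST and predictors at the coarse resolution, apply it to the fine-resolution predictors, and add back the coarse-scale residual, as in classic statistical LST sharpening schemes. The sketch below uses a plain linear fit where the paper uses a random forest, so it stays dependency-free; all grids are synthetic, with NDVI as the single predictor.

```python
# Schematic LST downscaling (the paper uses a random forest; a linear fit
# stands in here): coarse-scale model -> fine-scale prediction -> residual
# reinjection. All grids are synthetic toy data.
import numpy as np

rng = np.random.default_rng(4)
ndvi_fine = rng.uniform(0.0, 0.8, size=(8, 8))          # 8x8 fine-resolution grid
lst_fine_true = 310.0 - 15.0 * ndvi_fine + rng.normal(0, 0.3, (8, 8))  # in K

agg = lambda a: a.reshape(4, 2, 4, 2).mean(axis=(1, 3))  # 2x2 block average
ndvi_coarse, lst_coarse = agg(ndvi_fine), agg(lst_fine_true)

# 1) Fit the LST-predictor model at coarse scale (RF regression in the paper)
slope, intercept = np.polyfit(ndvi_coarse.ravel(), lst_coarse.ravel(), 1)

# 2) Apply it at fine scale, then 3) reinject the coarse residual so block
#    averages of the downscaled field match the coarse observations
residual_coarse = lst_coarse - (slope * ndvi_coarse + intercept)
lst_downscaled = (slope * ndvi_fine + intercept
                  + np.kron(residual_coarse, np.ones((2, 2))))

print(np.abs(lst_downscaled - lst_fine_true).mean())   # mean error in K, small
```

The paper's sensitivity analysis corresponds to perturbing one predictor by ±0.02, rerunning step 2, and recording the change in downscaled LST per land-cover class.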

  20. Demonstrations in Solute Transport Using Dyes: Part I. Procedures and Results.

    Science.gov (United States)

    Butters, Greg; Bandaranayake, Wije

    1993-01-01

    Presents the general theory to explain chemical movement in soil. Describes classroom demonstrations with visually stimulating results that show the effects of soil structure, soil texture, soil pH, and soluble organic matter on that movement. (MDH)

  1. Natural Attenuation of Chlorinated Solvents Performance and Cost Results from Multiple Air Force Demonstration Sites, Technology Demonstration Slide Presentation

    National Research Council Canada - National Science Library

    Wiedemeier, Todd

    1999-01-01

    This slide presentation summarizes the results of natural attenuation treatability studies at 14 Air Force sites contaminated with chlorinated solvents and their associated biodegradation daughter products...

  2. Non-Flow-Through Fuel Cell System Test Results and Demonstration on the SCARAB Rover

    Science.gov (United States)

    Scheidegger, Brianne, T.; Burke, Kenneth A.; Jakupca, Ian J.

    2012-01-01

This paper describes the results of the demonstration of a non-flow-through PEM fuel cell as part of a power system on the SCARAB rover. A 16-cell non-flow-through fuel cell stack from Infinity Fuel Cell and Hydrogen, Inc. was incorporated into a power system designed to act as a range extender by providing power to the rover's hotel loads. This work represents the first attempt at a ground demonstration of this new technology aboard a mobile test platform. Development and demonstration were supported by the Office of the Chief Technologist's Space Power Systems Project and the Advanced Exploration System Modular Power Systems Project.

  3. Zero Emission Bay Area (ZEBA) Fuel Cell Bus Demonstration: Second Results Report

    Energy Technology Data Exchange (ETDEWEB)

    Eudy, L.; Chandler, K.

    2012-07-01

    This report presents results of a demonstration of 12 new fuel cell electric buses (FCEB) operating in Oakland, California. The 12 FCEBs operate as a part of the Zero Emission Bay Area (ZEBA) Demonstration, which also includes two new hydrogen fueling stations. This effort is the largest FCEB demonstration in the United States and involves five participating transit agencies. The ZEBA partners are collaborating with the U.S. Department of Energy (DOE) and DOE's National Renewable Energy Laboratory (NREL) to evaluate the buses in revenue service. The first results report was published in August 2011, describing operation of these new FCEBs from September 2010 through May 2011. New results in this report provide an update through April 2012.

  4. Zero Emission Bay Area (ZEBA) Fuel Cell Bus Demonstration Results: Fifth Report

    Energy Technology Data Exchange (ETDEWEB)

    Eudy, Leslie [National Renewable Energy Lab. (NREL), Golden, CO (United States); Post, Matthew [National Renewable Energy Lab. (NREL), Golden, CO (United States); Jeffers, Matthew [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-06-01

    This report presents results of a demonstration of fuel cell electric buses (FCEB) operating in Oakland, California. Alameda-Contra Costa Transit District (AC Transit) leads the Zero Emission Bay Area (ZEBA) demonstration, which includes 13 advanced-design fuel cell buses and two hydrogen fueling stations. The ZEBA partners are collaborating with the U.S. Department of Energy (DOE) and DOE's National Renewable Energy Laboratory (NREL) to evaluate the buses in revenue service. NREL has published four previous reports describing operation of these buses. This report presents new and updated results covering data from January 2015 through December 2015.

  5. Zero Emission Bay Area (ZEBA) Fuel Cell Bus Demonstration Results: Sixth Report

    Energy Technology Data Exchange (ETDEWEB)

    Eudy, Leslie [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Post, Matthew B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jeffers, Matthew A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-11

    This report presents results of a demonstration of fuel cell electric buses (FCEB) operating in Oakland, California. Alameda-Contra Costa Transit District (AC Transit) leads the Zero Emission Bay Area (ZEBA) demonstration, which includes 13 advanced-design fuel cell buses and two hydrogen fueling stations. The ZEBA partners are collaborating with the U.S. Department of Energy (DOE) and DOE's National Renewable Energy Laboratory (NREL) to evaluate the buses in revenue service. NREL has published five previous reports describing operation of these buses. This report presents new and updated results covering data from January 2016 through December 2016.

  6. Zero Emission Bay Area (ZEBA) Fuel Cell Bus Demonstration Results: Third Report

    Energy Technology Data Exchange (ETDEWEB)

    Eudy, L.; Post, M.

    2014-05-01

    This report presents results of a demonstration of 12 fuel cell electric buses (FCEB) operating in Oakland, California. The 12 FCEBs operate as a part of the Zero Emission Bay Area (ZEBA) Demonstration, which also includes two new hydrogen fueling stations. This effort is the largest FCEB demonstration in the United States and involves five participating transit agencies. The ZEBA partners are collaborating with the U.S. Department of Energy (DOE) and DOE's National Renewable Energy Laboratory (NREL) to evaluate the buses in revenue service. NREL has published two previous reports, in August 2011 and July 2012, describing operation of these buses. New results in this report provide an update covering eight months through October 2013.

  7. Road surface erosion on the Jackson Demonstration State Forest: results of a pilot study

    Science.gov (United States)

    Brian Barrett; Rosemary Kosaka; David. Tomberlin

    2012-01-01

    This paper presents results of a 3 year pilot study of surface erosion on forest roads in the Jackson Demonstration State Forest in California’s coastal redwood region. Ten road segments representing a range of surface, grade, and ditch conditions were selected for the study. At each segment, settling basins with tipping buckets were installed to measure...

  8. Development of large aperture telescope technology (LATT): test results on a demonstrator bread-board

    Science.gov (United States)

    Briguglio, R.; Xompero, M.; Riccardi, A.; Lisi, F.; Duò, F.; Vettore, C.; Gallieni, D.; Tintori, M.; Lazzarini, P.; Patauner, C.; Biasi, R.; D'Amato, F.; Pucci, M.; Pereira do Carmo, João.

    2017-11-01

The concept of a low areal density primary mirror, actively controlled by actuators, has been investigated through a demonstration prototype. A spherical mirror (400 mm diameter, 2.7 kg mass) has been manufactured and tested in the laboratory and on the optical bench to verify performance, controllability and optical quality. In the present paper we describe the prototype and the test results.

  9. Zero Emission Bay Area (ZEBA) Fuel Cell Bus Demonstration Results. Fourth Report

    Energy Technology Data Exchange (ETDEWEB)

    Eudy, Leslie [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Post, Matthew [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2015-07-02

    This report presents results of a demonstration of fuel cell electric buses (FCEB) operating in Oakland, California. Alameda-Contra Costa Transit District (AC Transit) leads the Zero Emission Bay Area (ZEBA) demonstration, which includes 12 advanced-design fuel cell buses and two hydrogen fueling stations. The FCEBs in service at AC Transit are 40-foot, low-floor buses built by Van Hool with a hybrid electric propulsion system that includes a US Hybrid fuel cell power system and EnerDel lithium-based energy storage system. The buses began revenue service in May 2010.

  10. Does the high-tech industry consistently reduce CO₂ emissions? Results from nonparametric additive regression model

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Bin [School of Statistics, Jiangxi University of Finance and Economics, Nanchang, Jiangxi 330013 (China); Research Center of Applied Statistics, Jiangxi University of Finance and Economics, Nanchang, Jiangxi 330013 (China); Lin, Boqiang, E-mail: bqlin@xmu.edu.cn [Collaborative Innovation Center for Energy Economics and Energy Policy, China Institute for Studies in Energy Policy, Xiamen University, Xiamen, Fujian 361005 (China)

    2017-03-15

    China is currently the world's largest carbon dioxide (CO₂) emitter. Moreover, total energy consumption and CO₂ emissions in China will continue to increase due to the rapid growth of industrialization and urbanization. Therefore, vigorously developing the high-tech industry becomes an inevitable choice to reduce CO₂ emissions now and in the future. However, ignoring the existing nonlinear links between economic variables, most scholars use traditional linear models to explore the impact of the high-tech industry on CO₂ emissions from an aggregate perspective. Few studies have focused on nonlinear relationships and regional differences in China. Based on panel data for 1998-2014, this study uses a nonparametric additive regression model to explore the nonlinear effect of the high-tech industry from a regional perspective. The estimated results show that the residual sums of squares (SSR) of the nonparametric additive regression model in the eastern, central and western regions are 0.693, 0.054 and 0.085 respectively, much smaller than those of the traditional linear regression model (3.158, 4.227 and 7.196). This verifies that the nonparametric additive regression model has a better fitting effect. Specifically, the high-tech industry produces an inverted "U-shaped" nonlinear impact on CO₂ emissions in the eastern region, but a positive "U-shaped" nonlinear effect in the central and western regions. Therefore, the nonlinear impact of the high-tech industry on CO₂ emissions in the three regions should be given adequate attention in developing effective abatement policies. - Highlights: • The nonlinear effect of the high-tech industry on CO₂ emissions was investigated. • The high-tech industry yields an inverted "U-shaped" effect in the eastern region. • The high-tech industry has a positive "U-shaped" nonlinear effect in other regions. • The linear impact
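The SSR comparison in the abstract can be illustrated with a toy example: when the true effect is inverted-U-shaped, a nonparametric fit attains a far smaller residual sum of squares than a straight line. The single-predictor smoother below is a simple bin average standing in for the spline or kernel smooths of a real additive model; the data are synthetic, not the study's panel.

```python
# Toy illustration of the abstract's SSR comparison: linear regression versus
# a nonparametric fit when the true relation is an inverted U. The bin-average
# smoother stands in for a proper additive-model smoother; data are synthetic.
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(0, 1, 400)
y = -4 * (x - 0.5) ** 2 + rng.normal(0, 0.1, 400)   # inverted U plus noise

# Linear regression and its residual sum of squares (SSR)
b, a = np.polyfit(x, y, 1)
ssr_linear = np.sum((y - (b * x + a)) ** 2)

# Nonparametric fit: average y within 20 equal-width bins of x
bins = np.clip((x * 20).astype(int), 0, 19)
bin_means = np.array([y[bins == k].mean() for k in range(20)])
ssr_nonpar = np.sum((y - bin_means[bins]) ** 2)

print(ssr_linear, ssr_nonpar)   # the nonparametric SSR is much smaller
```

With several predictors, a backfitting loop applies such a smoother to each variable in turn, which is the structure of the nonparametric additive regression model the study uses.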

  11. Hyperprolactinaemia as a result of immaturity or regression: the concept of maternal subroutine. A new model of psychoendocrine interactions.

    Science.gov (United States)

    Sobrinho, L G; Almeida-Costa, J M

    1992-01-01

    Pathological hyperprolactinaemia (PH) is significantly associated with: (1) paternal deprivation during childhood, (2) depression, (3) non-specific symptoms including obesity and weight gain. The clinical onset of the symptoms often follows pregnancy or a loss. Prolactin is an insulin antagonist which does not promote weight gain. Hyperprolactinaemia and increased metabolic efficiency are parts of a system of interdependent behavioural and metabolic mechanisms necessary for the care of the young. We call this system, which is available as a whole package, maternal subroutine (MS). An important number of cases of PH are due to activation of the MS that is not induced by pregnancy. The same occurs in surrogate maternity and in some animal models. Most women with PH developed a malignant symbiotic relationship with their mothers in the setting of absence, alcoholism or devaluation of the father. These women may regress to early developmental stages to the point that they identify themselves both with their lactating mother and with the nursing infant as has been found in psychoanalysed patients and in the paradigmatic condition of pseudopregnancy. Such regression can be associated with activation of the MS. Prolactinomas represent the extreme of the spectrum of PH and may result from somatic mutations occurring in hyperstimulated lactotrophs.

  12. Results and implications of the EBR-II inherent safety demonstration tests

    International Nuclear Information System (INIS)

    Planchon, H.P.; Golden, G.H.; Sackett, J.I.; Mohr, D.; Chang, L.K.; Feldman, E.E.; Betten, P.R.

    1987-01-01

    On April 3, 1986, two milestone tests were conducted in the Experimental Breeder Reactor-II (EBR-II). The first test was a loss of flow without scram and the second was a loss of heat sink without scram. Both tests were initiated from 100% power, and in both tests the reactor was shut down by natural processes, principally thermal expansion, without automatic scram, operator intervention or the help of special in-core devices. The temperature transients during the tests were mild, as predicted, and there was no damage to the core or reactor plant structures. In a general sense, therefore, the tests plus supporting analysis demonstrated the feasibility of inherent passive shutdown for undercooling accidents in metal-fueled LMRs. The results provide a technical basis for future experiments in EBR-II to demonstrate inherent safety for overpower accidents and provide data for validation of computer codes used for design and safety analysis of inherently safe reactor plants.

  13. Final test results for the ground operations demonstration unit for liquid hydrogen

    Science.gov (United States)

    Notardonato, W. U.; Swanger, A. M.; Fesmire, J. E.; Jumper, K. M.; Johnson, W. L.; Tomsik, T. M.

    2017-12-01

    Described herein is a comprehensive project: a large-scale test of an integrated refrigeration and storage system called the Ground Operations and Demonstration Unit for Liquid Hydrogen (GODU LH2), sponsored by the Advanced Exploration Systems Program and constructed at Kennedy Space Center. A commercial cryogenic refrigerator interfaced with a 125,000 l liquid hydrogen tank and auxiliary systems in a manner that enabled control of the propellant state by extracting heat via a closed loop Brayton cycle refrigerator coupled to a novel internal heat exchanger. The three primary objectives were demonstrating zero-loss storage and transfer, gaseous liquefaction, and propellant densification. Testing was performed at three different liquid hydrogen fill levels. Data were collected on tank pressure, internal tank temperature profiles, mass flow in and out of the system, and refrigeration system performance. All test objectives were successfully achieved during approximately two years of testing. A summary of the final results is presented in this paper.

  14. Automatically explaining machine learning prediction results: a demonstration on type 2 diabetes risk prediction.

    Science.gov (United States)

    Luo, Gang

    2016-01-01

    Predictive modeling is a key component of solutions to many healthcare problems. Among all predictive modeling approaches, machine learning methods often achieve the highest prediction accuracy, but suffer from a long-standing open problem precluding their widespread use in healthcare: most machine learning models give no explanation for their prediction results, whereas interpretability is essential for a predictive model to be adopted in typical healthcare settings. This paper presents the first complete method for automatically explaining results for any machine learning predictive model without degrading accuracy. We implemented the method in software. Using the electronic medical record data set from the Practice Fusion diabetes classification competition, containing patient records from all 50 states in the United States, we demonstrated the method on predicting type 2 diabetes diagnosis within the next year. For the champion machine learning model of the competition, our method explained prediction results for 87.4% of patients who were correctly predicted by the model to have a type 2 diabetes diagnosis within the next year. Our demonstration showed the feasibility of automatically explaining results for any machine learning predictive model without degrading accuracy.

  15. Measuring the statistical validity of summary meta‐analysis and meta‐regression results for use in clinical practice

    Science.gov (United States)

    Riley, Richard D.

    2017-01-01

    An important question for clinicians appraising a meta‐analysis is: are the findings likely to be valid in their own practice—does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity—where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple (‘leave‐one‐out’) cross‐validation technique, we demonstrate how we may test meta‐analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta‐analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta‐analysis and a tailored meta‐regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within‐study variance, between‐study variance, study sample size, and the number of studies in the meta‐analysis. Finally, we apply Vn to two published meta‐analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta‐analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:28620945
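
    The leave-one-out idea behind Vn can be sketched in a few lines: refit the meta-analysis without study i and ask whether study i looks like a draw from the refitted model. The following uses simple fixed-effect (inverse-variance) pooling on made-up study data and computes standardized leave-one-out residuals; it illustrates the cross-validation logic only, not the exact Vn statistic or its derived null distribution:

```python
import numpy as np

# Illustrative study effect estimates and within-study variances
# (invented numbers, not from any published meta-analysis).
theta = np.array([0.30, 0.25, 0.42, 0.18, 0.35, 0.28])
var = np.array([0.010, 0.015, 0.020, 0.012, 0.018, 0.011])

def pooled(theta, var):
    """Inverse-variance (fixed-effect) pooled estimate and its variance."""
    w = 1.0 / var
    est = np.sum(w * theta) / np.sum(w)
    return est, 1.0 / np.sum(w)

# Leave-one-out: hold out each study, pool the rest, and standardize
# the held-out study against the refitted summary.
z = []
for i in range(len(theta)):
    keep = np.arange(len(theta)) != i
    est, pooled_var = pooled(theta[keep], var[keep])
    z.append((theta[i] - est) / np.sqrt(var[i] + pooled_var))
z = np.array(z)
print(np.round(z, 2))
```

    Large standardized residuals flag studies the pooled model fails to predict, which is the same signal a validation statistic aggregates across all leave-one-out splits.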

  16. Experimental-demonstrative system for energy conversion using hydrogen fuel cell - preliminary results

    International Nuclear Information System (INIS)

    Stoenescu, D.; Stefanescu, I.; Patularu, I.; Culcer, M.; Lazar, R.E.; Carcadea, E.; Mirica, D. (E-mail address of corresponding author: daniela@icsi.ro; Stoenescu, D.)

    2005-01-01

    It is well known that hydrogen is the most promising energy solution of the future, for both long- and medium-term strategies. Hydrogen can be produced from many primary sources (natural gas, methane, biomass, etc.), and it can be burned or reacted chemically with a high yield of energy conversion, as a non-polluting fuel. This paper presents the preliminary results obtained by ICSI Rm. Valcea with an experimental-demonstrative energy conversion system consisting of a sequence of hydrogen purification units and CO-removal reactors that bring the CO level below 10 ppm before feeding a hydrogen fuel cell stack. (author)

  17. Performance and cost results from a DOE Micro-CHP demonstration facility at Mississippi State University

    International Nuclear Information System (INIS)

    Giffin, Paxton K.

    2013-01-01

    Highlights: ► We examine the cost and performance results of a Micro-CHP demonstration facility. ► Evaluation includes both summer and winter performance. ► Evaluation in comparison to a conventional HVAC system using grid power. ► Influence of improperly sized equipment. ► Influence of natural gas prices on the viability of CHP projects using that fuel. - Abstract: Cooling, Heating, and Power (CHP) systems have been around for decades, but systems that utilize 20 kW or less, designated as Micro-CHP, are relatively new. A demonstration site has been constructed at Mississippi State University (MSU) to show the advantages of these micro-scale systems. This study is designed to evaluate the performance of a Micro-CHP system as opposed to a conventional high-efficiency Heating, Ventilation, and Air Conditioning (HVAC) system that utilizes electrical power from the existing power grid. Raw data were collected for 7 months to produce the following results. The combined cycle efficiency from the demonstration site averaged 29%. The average combined boiler and engine cost was $1.8 h −1 of operation for the heating season and $3.9 h −1 of operation for the cooling season. The cooling technology used, an absorption chiller, exhibited an average Coefficient of Performance (COP) of 0.27. The conventional high-efficiency system, during the cooling season, had a COP of 4.7 with a combined cooling and building cost of $0.2 h −1 of operation. During heating mode, the conventional system had an efficiency of 47% with a fuel and building electrical cost of $0.28 h −1 of operation.
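
    The reported figures imply the economic comparison directly: dividing hourly cost by the cooling COP gives a rough cost per unit of cooling delivered. This back-of-envelope calculation is our own simplification (it ignores that the absorption chiller's COP is referenced to heat input while the conventional COP is referenced to electricity), but it reproduces the abstract's conclusion that the grid-powered system was far cheaper to operate:

```python
# Reported cooling-season averages from the demonstration facility.
chp_cop = 0.27         # absorption chiller COP
chp_cost_per_h = 3.9   # combined boiler and engine cost, $/h
conv_cop = 4.7         # conventional high-efficiency HVAC COP
conv_cost_per_h = 0.2  # combined cooling and building cost, $/h

# Rough $ per unit of cooling: hourly cost scaled by 1/COP. This is a
# crude proxy, since the two COPs are referenced to different inputs.
chp_cost_per_cooling = chp_cost_per_h / chp_cop
conv_cost_per_cooling = conv_cost_per_h / conv_cop
print(round(chp_cost_per_cooling, 1), round(conv_cost_per_cooling, 3))
# prints 14.4 0.043
```

    Even allowing for the mismatched COP definitions, the two-order-of-magnitude gap per unit of cooling explains why the Micro-CHP system struggled against the conventional baseline at the observed natural gas prices.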

  18. Coupling solar photo-Fenton and biotreatment at industrial scale: Main results of a demonstration plant

    International Nuclear Information System (INIS)

    Malato, Sixto; Blanco, Julian; Maldonado, Manuel I.; Oller, Isabel; Gernjak, Wolfgang; Perez-Estrada, Leonidas

    2007-01-01

    This paper reports on the combined solar photo-Fenton/biological treatment of an industrial effluent (initial total organic carbon, TOC, around 500 mg L -1 ) containing a non-biodegradable organic substance (α-methylphenylglycine at 500 mg L -1 ), focusing on pilot plant tests performed for design of an industrial plant, the design itself and the plant layout. Pilot plant tests have demonstrated that biodegradability enhancement is closely related to disappearance of the parent compound, for which a certain illumination time and hydrogen peroxide consumption are required, working at pH 2.8 and adding Fe 2+ = 20 mg L -1 . Based on pilot plant results, an industrial plant with 100 m 2 of CPC collectors for a 250 L/h treatment capacity has been designed. The solar system discharges the wastewater (WW) pre-treated by photo-Fenton into a biotreatment based on an immobilized biomass reactor. First results from the industrial plant are also presented, demonstrating that it is able to treat up to 500 L h -1 at an average solar ultraviolet radiation of 22.9 W m -2 , under the same conditions (pH, hydrogen peroxide consumption) tested in the pilot plant.

  19. SEXTANT X-Ray Pulsar Navigation Demonstration: Flight System and Test Results

    Science.gov (United States)

    Winternitz, Luke; Mitchell, Jason W.; Hassouneh, Munther A.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven

    2016-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a technology demonstration enhancement to the Neutron-star Interior Composition Explorer (NICER) mission. NICER is a NASA Explorer Mission of Opportunity that will be hosted on the International Space Station (ISS). SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper gives an overview of the SEXTANT system architecture and describes progress prior to environmental testing of the NICER flight instrument. It provides descriptions and development status of the SEXTANT flight software and ground system, as well as a detailed description and results from the flight software functional and performance testing within the high-fidelity Goddard Space Flight Center (GSFC) X-ray Navigation Laboratory Testbed (GXLT) software and hardware simulation environment. Hardware-in-the-loop simulation results are presented, using the engineering model of the NICER timing electronics and the GXLT pulsar simulator; the GXLT precisely controls NASA GSFC's unique Modulated X-ray Source to produce X-rays that make the NICER detector electronics appear as if they were aboard the ISS viewing a sequence of millisecond pulsars.

  20. Dual Regression

    OpenAIRE

    Spady, Richard; Stouli, Sami

    2012-01-01

    We propose dual regression as an alternative to the quantile regression process for the global estimation of conditional distribution functions under minimal assumptions. Dual regression provides all the interpretational power of the quantile regression process while avoiding the need for repairing the intersecting conditional quantile surfaces that quantile regression often produces in practice. Our approach introduces a mathematical programming characterization of conditional distribution f...
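
    For context, the quantile regression process that dual regression is positioned against estimates each conditional quantile by minimizing the "check" (pinball) loss separately for each quantile level, which is what allows fitted quantile surfaces to cross. A minimal numpy illustration of that loss (our own toy example, not the dual regression estimator): the constant minimizing it over a sample is the empirical tau-quantile.

```python
import numpy as np

def pinball(u, tau):
    """Check (pinball) loss averaged over residuals u at quantile level tau."""
    return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

rng = np.random.default_rng(1)
y = rng.normal(size=10_000)
tau = 0.9

# Minimize the loss over constant fits on a grid; the minimizer should
# land (up to grid resolution) on the empirical 0.9-quantile.
grid = np.linspace(-3.0, 3.0, 601)
best = grid[int(np.argmin([pinball(y - c, tau) for c in grid]))]
print(abs(best - np.quantile(y, tau)) < 0.05)  # prints True
```

    Repeating this per-tau optimization with covariates yields the quantile regression process; because each tau is fitted independently, nothing enforces monotonicity across tau, hence the crossing problem the abstract mentions.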

  1. The progress and results of a demonstration test of a cavern-type disposal facility

    International Nuclear Information System (INIS)

    Akiyama, Yoshihiro; Terada, Kenji; Oda, Nobuaki; Yada, Tsutomu; Nakajima, Takahiro

    2011-01-01

    The cavern-type disposal facilities for low-level waste (LLW) with relatively high radioactivity levels, mainly generated from power reactor decommissioning, and for part of the transuranic (TRU) waste, mainly from spent fuel reprocessing, are designed in Japan to be constructed in a cavern 50 to 100 meters below ground and to employ an engineered barrier system (EBS) combining bentonite and cement materials. In order to advance the feasibility study for this disposal concept, a government-commissioned research project named Demonstration Test of Cavern-Type Disposal Facility started in fiscal 2005, and since fiscal 2007 a full-scale mock-up test facility has been constructed under an actual subsurface environment. The main objective of the test is to establish construction methodology and procedures which ensure the required quality of the EBS on-site. By fiscal 2009 some parts of the facility had been constructed, and the test demonstrated both the practicability of the construction and achievement of the quality targets, taken respectively as a low permeability of less than 5x10 -13 m/s and a low diffusivity of less than 1x10 -12 m 2 /s at the time of completion of construction. This paper covers the project outline and the test results obtained during construction of some parts of the bentonite and cement materials. (author)

  2. Status of disposal techniques for spent fuel in Germany: Results of demonstration tests for direct disposal

    International Nuclear Information System (INIS)

    Engelmann, H.J.; Filbert, W.

    1993-01-01

    According to the Atomic Energy Act (1985), the Federal Government is responsible for establishing facilities to secure and dispose of radioactive waste. According to Art. 9b of the Atomic Energy Act (1986), the construction and operation of such a repository requires approval of a plan. According to the safety criteria applicable to disposal of radioactive waste in mines, construction and operation of repository mines require application of acknowledged rules of technology, laws, ordinances and other regulations to protect operating staff and the population from radiation damage. Shaft hoisting equipment for the transportation of radioactive waste in a repository mine must satisfy normal operational tasks and meet special safety requirements; its failure may result in danger to persons and release of radioactive substances into the plant and environment. That means shaft hoisting equipment must be designed to satisfy the necessary safety requirements and reflect the state of the art of science and technology. The aim of these demonstration tests is verification of the technical feasibility of shaft hoisting equipment with a payload of 85 t, of underground drift disposal of POLLUX casks, and of essential machine and mine-technical systems and components. The demonstration also includes safe radiation protection during transport and disposal operations. The investigations assume that radioactive waste is transported in containers that satisfy transport requirements for dangerous goods and have a type-B certificate.

  3. Demonstration Results on the Effects of Mercury Speciation on the Stabilization of Wastes

    International Nuclear Information System (INIS)

    Conley, T.B.; Hulet, G.A.; Morris, M.I.; Osborne-Lee, I.W.

    1999-01-01

    Mercury-contaminated wastes are currently being stored at approximately 19 Department of Energy sites; their volume is estimated to be about 16 m{sup 3}. These wastes exist in various forms, including soil, sludges, and debris, which present a particular challenge for possible mercury stabilization methods. This report provides the test results of three vendors, Allied Technology Group, IT Corporation, and Nuclear Fuel Services, Inc., that demonstrate the effects of mercury speciation on the stabilization of the mercury wastes. Mercury present in concentrations that exceed 260 parts per million must be removed by extraction methods and requires stabilization to ensure that the final waste forms leach less than 0.2 mg/L of mercury by the Toxicity Characteristic Leaching Procedure or 0.025 mg/L using the Universal Treatment Standard.

  4. The public visits a nuclear waste site: Survey results from the West Valley Demonstration Project

    International Nuclear Information System (INIS)

    Hoffman, W.D.

    1987-01-01

    This paper discusses the results of the 1986 survey taken at the West Valley Demonstration Project Open House where a major nuclear waste cleanup is in progress. Over 1400 people were polled on what they think is most effective in educating the public on nuclear waste. A demographic analysis describes the population attending the event and their major interests in the project. Responses to attitudinal questions are examined to evaluate the importance of radioactive waste cleanup as an environmental issue and a fiscal responsibility. Additionally, nuclear power is evaluated on its public perception as an energy resource. The purpose of the study is to find out who visits a nuclear waste site and why, and to measure their attitudes on nuclear issues

  5. RAVAN CubeSat Results: Technologies and Science Demonstrated On Orbit

    Science.gov (United States)

    Swartz, W. H.; Lorentz, S. R.; Huang, P. M.; Smith, A. W.; Yu, Y.; Briscoe, J. S.; Reilly, N.; Reilly, S.; Reynolds, E.; Carvo, J.; Wu, D.

    2017-12-01

    Elucidating Earth's energy budget is vital to understanding and predicting climate, particularly the small imbalance between the incident solar irradiance and Earth-leaving fluxes of total and solar-reflected energy. Accurately quantifying the spatial and temporal variation of Earth's outgoing energy from space is a challenge—one potentially rendered more tractable with the advent of multipoint measurements from small satellite or hosted payload constellations. The Radiometer Assessment using Vertically Aligned Nanotubes (RAVAN) 3U CubeSat, launched November 11, 2016, is a pathfinder for a constellation to measure the Earth's energy imbalance. The objective of RAVAN is to establish that compact, broadband radiometers absolutely calibrated to high accuracy can be built and operated in space for low cost. RAVAN demonstrates two key technologies: (1) vertically aligned carbon nanotubes as spectrally flat radiometer absorbers and (2) gallium phase-change cells for on-board calibration and degradation monitoring of RAVAN's radiometer sensors. We show on-orbit results, including calibrated irradiance measurements at both shortwave, solar-reflected wavelengths and in the thermal infrared. These results are compared with both modeled upwelling fluxes and those measured by independent Earth energy instruments in low-Earth orbit. Further, we show the performance of two gallium phase-change cells that are used to monitor the degradation of RAVAN's radiometer sensors. In addition to Earth energy budget technology and science, RAVAN also demonstrates partnering with a commercial vendor for the CubeSat bus, payload integration and test, and mission operations. We conclude with a discussion of how a RAVAN-type constellation could enable a breakthrough in the measurement of Earth's energy budget and lead to superior predictions of future climate.

  6. Results From The Salt Disposition Project Next Generation Solvent Demonstration Plan

    Energy Technology Data Exchange (ETDEWEB)

    Peters, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Fondeur, F. F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Taylor-Pashow, K. M.L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2014-04-02

    Strip Effluent Hold Tank (SEHT), Decontaminated Salt Solution Hold Tank (DSSHT), Caustic Wash Tank (CWT) and Solvent Hold Tank (SHT) samples were taken throughout the Next Generation Solvent (NGS) Demonstration Plan. These samples were analyzed and the results are reported. SHT: The solvent behaved as expected, with no bulk changes in the composition over time, with the exception of the TOA and TiDG. The TiDG depletion is higher than expected, and the required rate of replenishment must be taken into consideration. Monthly sampling of the SHT is warranted. If possible, additional SHT samples for TiDG analysis (only) would help SRNL refine the TiDG degradation model. CWT: The CWT samples show the expected behavior in terms of bulk chemistry. The 137Cs deposited into the CWT varies somewhat, but generally appears to be lower than during operations with the BOBCalix solvent. While a few minor organic components were noted to be present in the Preliminary sample, at this time these are thought to be artifacts of the sample preparation or may be due to the preceding solvent superwash. DSSHT: The DSSHT samples show the predicted bulk chemistry, although they point towards significant dilution at the front end of the Demonstration. The 137Cs levels in the DSSHT are much lower than during the BOBCalix operations, which is the expected observation. SEHT: The SEHT samples represent the most distinctive of the four outputs from MCU. While the bulk chemistry is as expected, something is causing the pH of the SEHT to be higher than what would be predicted from a pure stream of 0.01 M boric acid. There are several possible reasons for this, and SRNL is investigating. Other than the pH issue, the SEHT is as predicted. In summary, the NGS Demonstration Plan samples indicate that the MCU system, with the Blend Solvent, is operating as expected. The only issue of concern regards the pH of the SEHT, and SRNL is in the process of investigating it.

  7. Demonstrating the European capability for airborne gamma spectrometry: results from the eccomags exercise

    International Nuclear Information System (INIS)

    Sanderson, D.C.W.

    2003-01-01

    Full text: Airborne gamma spectrometry has emerged over recent years as an important means for mapping deposited activity and dose-rates in the environment. Its importance to emergency response has been increasingly recognised in defining contaminated areas at rates of measurement and spatial density which simply cannot be matched using ground based methods. In Europe the practical capability for AGS has developed markedly over the period since the Chernobyl accident, and significant progress towards cooperation between AGS teams, and convergence of methodology has been made, largely as the result of work conducted with support from the Commission under Frameworks IV and V. As an important part of the ECCOMAGS project an international comparison was undertaken in SW Scotland in 2002 to evaluate the performance of AGS teams in comparison with established groundbased methods. The work included a composite mapping task whereby different AGS teams recorded data over adjacent areas, to demonstrate the potential for collective actions in emergency response. More than 70,000 observations were made from a 90x40 km area during a 3 day period by several European teams, and compiled within days to produce regional scale maps for 137-Cs contamination and environmental gamma dose-rates. In three disjoint common areas AGS and ground based observations were taken. The ground based work was split for practical reasons into two phases; a pre-characterization stage to define calibration sites, and a second stage where sampling of additional sites was undertake synchronously with the exercise to provide the basis for blind intercomparison with AGS. More than 800 laboratory gamma spectrometry measurements were taken in support of this activity, which in combination with in-situ gamma spectrometry and instrumental dose-rate measurements were used to define the ground-based estimates of 137-Cs activity per unit area and dose-rate. AGS for the common areas, included points for ground

  8. Laser Spectroscopy Multi-Gas Monitor: Results of Technology Demonstration on ISS

    Science.gov (United States)

    Mudgett, Paul D.; Pilgrim, Jeffrey S.

    2015-01-01

    Tunable diode laser spectroscopy (TDLS) is an up and coming trace and major gas monitoring technology with unmatched selectivity, range and stability. The technology demonstration of the 4 gas Multi-Gas Monitor (MGM), reported at the 2014 ICES conference, operated continuously on the International Space Station (ISS) for nearly a year. The MGM is designed to measure oxygen, carbon dioxide, ammonia and water vapor in ambient cabin air in a low power, relatively compact device. While on board, the MGM experienced a number of challenges, unplanned and planned, including a test of the ammonia channel using a commercial medical ammonia inhalant. Data from the unit was downlinked once per week and compared with other analytical resources on board, notably the Major Constituent Analyzer (MCA), a magnetic sector mass spectrometer. MGM spent the majority of the time installed in the Nanoracks Frame 2 payload facility in front breathing mode (sampling the ambient environment of the Japanese Experiment Module), but was also used to analyze recirculated rack air. The capability of the MGM to be operated in portable mode (via internal rechargeable lithium ion polymer batteries or by plugging into any Express Rack 28VDC connector) was a part of the usability demonstration. Results to date show unprecedented stability and accuracy of the MGM vs. the MCA for oxygen and carbon dioxide. The ammonia challenge (approx. 75 ppm) was successful as well, showing very rapid response time in both directions. Work on an expansion of capability in a next generation MGM has just begun. Combustion products and hydrazine are being added to the measurable target analytes. An 8 to 10 gas monitor (aka Gas Tricorder 1.0) is envisioned for use on ISS, Orion and Exploration missions.

  9. Results from the Big Spring basin water quality monitoring and demonstration projects, Iowa, USA

    Science.gov (United States)

    Rowden, R.D.; Liu, H.; Libra, R.D.

    2001-01-01

    Agricultural practices, hydrology, and water quality of the 267-km2 Big Spring groundwater drainage basin in Clayton County, Iowa, have been monitored since 1981. Land use is agricultural; nitrate-nitrogen (NO3-N) and herbicides are the resulting contaminants in groundwater and surface water. Ordovician Galena Group carbonate rocks comprise the main aquifer in the basin. Recharge to this karstic aquifer is by infiltration, augmented by sinkhole-captured runoff. Groundwater is discharged at Big Spring, where quantity and quality of the discharge are monitored. Monitoring has shown a threefold increase in groundwater nitrate-N concentrations from the 1960s to the early 1980s. The nitrate-N discharged from the basin typically is equivalent to over one-third of the nitrogen fertilizer applied, with larger losses during wetter years. Atrazine is present in groundwater all year; however, contaminant concentrations in the groundwater respond directly to recharge events, and unique chemical signatures of infiltration versus runoff recharge are detectable in the discharge from Big Spring. Education and demonstration efforts have reduced nitrogen fertilizer application rates by one-third since 1981. Relating declines in nitrate and pesticide concentrations to inputs of nitrogen fertilizer and pesticides at Big Spring is problematic: annual recharge has varied five-fold during monitoring, overshadowing any water-quality improvements resulting from incrementally decreased inputs. © Springer-Verlag 2001.

  10. Correlation of results obtained by in-vivo optical spectroscopy with measured blood oxygen saturation using a positive linear regression fit

    Science.gov (United States)

    McCormick, Patrick W.; Lewis, Gary D.; Dujovny, Manuel; Ausman, James I.; Stewart, Mick; Widman, Ronald A.

    1992-05-01

    Near infrared light generated by specialized instrumentation was passed through artificially oxygenated human blood during simultaneous sampling by a co-oximeter. Characteristic absorption spectra were analyzed to calculate the ratio of oxygenated to reduced hemoglobin. A positive linear regression fit between diffuse transmission oximetry and measured blood oxygenation over the range 23% to 99% (r2 equals .98, p signal was observed in the patient over time. The procedure could be performed clinically without difficulty; rSO2 values recorded continuously demonstrate the usefulness of the technique. Using the same instrumentation, arterial input and cerebral response functions, generated by IV tracer bolus, were deconvoluted to measure mean cerebral transit time. Data collected over time provided a sensitive index of changes in cerebral blood flow as a result of therapeutic maneuvers.
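
    A positive linear regression fit with an r2 of this kind is straightforward to reproduce. The sketch below fits ordinary least squares to synthetic paired readings over the same 23% to 99% range (the data, slope and noise level are invented for illustration; the instrument's actual calibration data are not reproduced here):

```python
import numpy as np

# Synthetic stand-in for paired readings: co-oximeter saturation (x)
# versus optical estimate (y), percent. Values are illustrative only.
rng = np.random.default_rng(2)
x = rng.uniform(23.0, 99.0, 50)
y = 1.02 * x - 1.0 + rng.normal(0.0, 2.0, 50)

# Ordinary least squares fit and coefficient of determination.
slope, intercept = np.polyfit(x, y, 1)
pred = slope * x + intercept
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(r2 > 0.9)  # prints True
```

    With noise small relative to the 23-99% span of the predictor, r2 lands close to 1, which is the regime the abstract reports for the diffuse transmission oximetry comparison.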

  11. Demonstration project Klaipeda, Lithuania. Final evaluation of 1. year results after rehabilitation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-07-01

    In order to investigate various options for reduction of the district heating return temperature and for introduction of energy saving measures in the buildings, a demonstration area comprising eight buildings was selected. The eight buildings selected were typical 5-floor buildings built in 1982-89. This enabled the experience to be transferred to a maximum number of buildings in Klaipeda and in Lithuania in general. All buildings had a one-string radiator system installed. Since 1991, the supply temperature in the Klaipeda district heating system has been reduced. This has led to a lower indoor temperature, which has decreased from the design value of 18 deg. C to 15 deg. C due to shortage of fuel. With such a low room temperature, the improved systems may result in better comfort rather than in energy savings. The project has been implemented during a two-year period from heating season 97/98 to 98/99. In the first year, before the heating season 97/98, energy and flow meters were installed for measuring the existing heating and hot water consumption. For the next heating season, 1998/1999, various energy saving measures were installed, and measurements were carried out again. After that, the energy consumption and temperature levels were analysed and compared for the two seasons. (au)

  12. Results of a Long-Term Demonstration of an Optical Multi-Gas Monitor on ISS

    Science.gov (United States)

    Mudgett, Paul; Pilgrim, Jeffrey S.

    2015-01-01

    Previously at SAMAP we reported on the development of tunable diode laser spectroscopy (TDLS) based instruments for measuring small gas molecules in real time. TDLS technology has matured rapidly over the last 5 years as a result of advances in low power diode lasers as well as better detection schemes. In collaboration with two small businesses, Vista Photonics, Inc. and Nanoracks LLC, NASA developed a four-gas TDLS-based monitor for an experimental demonstration of the technology on the International Space Station (ISS). Vista invented and constructed the core TDLS sensor. Nanoracks designed and built the enclosure, and certified the integrated monitor as a payload. The device, which measures oxygen, carbon dioxide, ammonia and water vapor, is called the Multi-Gas Monitor (MGM). MGM measures the four gases every few seconds and records a 30 second moving average of the concentrations. The relatively small unit draws only 2.5 W. MGM was calibrated at NASA Johnson Space Center in July 2013 and launched to ISS on a Soyuz vehicle in November 2013. Installation and activation of MGM occurred in February 2014, and the unit has been operating nearly continuously ever since in the Japanese Experiment Module. Data are downlinked from ISS about once per week. Oxygen and carbon dioxide data are compared with those from the central Major Constituents Analyzer. Water vapor data are compared with dew point measurements made by sensors in the Columbus module. The ammonia channel was tested by the crew using a commercial ammonia inhalant. MGM has been remarkably stable to date. Results of 18 months of operation are presented, and future applications including combustion product monitoring are discussed.

  13. Percentile-Based ETCCDI Temperature Extremes Indices for CMIP5 Model Output: New Results through Semiparametric Quantile Regression Approach

    Science.gov (United States)

    Li, L.; Yang, C.

    2017-12-01

    Climate extremes often manifest as rare events in terms of surface air temperature and precipitation with an annual reoccurrence period. In order to represent the manifold characteristics of climate extremes for monitoring and analysis, the Expert Team on Climate Change Detection and Indices (ETCCDI) worked out a set of 27 core indices based on daily temperature and precipitation data, describing extreme weather and climate events on an annual basis. The CLIMDEX project (http://www.climdex.org) has produced public domain datasets of such indices for data from a variety of sources, including output from global climate models (GCM) participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Among the 27 ETCCDI indices, there are six percentile-based temperature extremes indices that fall into two groups: exceedance rates (ER) (TN10p, TN90p, TX10p and TX90p) and durations (CSDI and WSDI). Percentiles must be estimated prior to the calculation of the indices, and may be biased to varying degrees by the adopted algorithm. Such biases are in turn propagated to the final values of the indices. CLIMDEX used an empirical quantile estimator combined with a bootstrap resampling procedure to reduce the inhomogeneity in the annual series of the ER indices. However, some problems remain in the CLIMDEX datasets, namely overestimated climate variability due to unaccounted autocorrelation in the daily temperature data, seasonally varying biases, and inconsistency between the algorithms applied to the ER indices and to the duration indices. We now present new results for the six indices through a semiparametric quantile regression approach for the CMIP5 model output. By using the base-period data as a whole and taking seasonality and autocorrelation into account, this approach successfully addresses the aforementioned issues and yields consistent results.
The new datasets cover the historical and three projected (RCP2.6, RCP4.5 and RCP
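The percentile-based ER indices described above can be illustrated with a minimal empirical-quantile sketch on synthetic data; this shows the baseline calendar-day estimator in the CLIMDEX style, not the paper's semiparametric quantile-regression approach:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical daily maximum temperatures: a 30-year base period plus one target year
days = np.arange(30 * 365)
base = 15 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 3, days.size)
year = 16 + 10 * np.sin(2 * np.pi * np.arange(365) / 365) + rng.normal(0, 3, 365)

# Calendar-day 90th percentile from the base period, using a 5-day window
# centred on each day (empirical estimator)
base = base.reshape(30, 365)
thresholds = np.empty(365)
for d in range(365):
    window = np.take(base, range(d - 2, d + 3), axis=1, mode='wrap')
    thresholds[d] = np.percentile(window, 90)

# TX90p-style exceedance rate: percentage of days above the calendar-day threshold
tx90p = 100.0 * np.mean(year > thresholds)
print(round(tx90p, 1))
```

Because the target year is warmer than the base period, the exceedance rate comes out well above the nominal 10%; seasonality and autocorrelation in the daily data are exactly what the semiparametric approach is designed to handle better.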

  14. Large-scale dynamic compaction demonstration using WIPP salt: Fielding and preliminary results

    International Nuclear Information System (INIS)

    Ahrens, E.H.; Hansen, F.D.

    1995-10-01

    Reconsolidation of crushed rock salt is a phenomenon of great interest to programs studying isolation of hazardous materials in natural salt geologic settings. Of particular interest is the potential for disaggregated salt to be restored to nearly an impermeable state. For example, reconsolidated crushed salt is proposed as a major shaft seal component for the Waste Isolation Pilot Plant (WIPP) Project. The concept for a permanent shaft seal component of the WIPP repository is to densely compact crushed salt in the four shafts; an effective seal will then be developed as the surrounding salt creeps into the shafts, further consolidating the crushed salt. Fundamental information on placement density and permeability is required to ensure attainment of the design function. The work reported here is the first large-scale compaction demonstration to provide information on initial salt properties applicable to design, construction, and performance expectations. The shaft seals must function for 10,000 years. Over this period a crushed salt mass will become less permeable as it is compressed by creep closure of salt surrounding the shaft. These facts preclude the possibility of conducting a full-scale, real-time field test. Because permanent seals taking advantage of salt reconsolidation have never been constructed, performance measurements have not been made on an appropriately large scale. An understanding of potential construction methods, achievable initial density and permeability, and performance of reconsolidated salt over time is required for seal design and performance assessment. This report discusses fielding and operations of a nearly full-scale dynamic compaction of mine-run WIPP salt, and presents preliminary density and in situ (in place) gas permeability results.

  15. Diet influenced tooth erosion prevalence in children and adolescents: Results of a meta-analysis and meta-regression

    NARCIS (Netherlands)

    Salas, M.M.; Nascimento, G.G.; Vargas-Ferreira, F.; Tarquinio, S.B.; Huysmans, M.C.D.N.J.M.; Demarco, F.F.

    2015-01-01

    OBJECTIVE: The aim of the present study was to assess the influence of diet on the presence of tooth erosion in children and adolescents by meta-analysis and meta-regression. DATA: Two reviewers independently performed the selection process, and the quality of studies was assessed. SOURCES: Studies

  16. Cost results from the 1994 Fernald characterization field demonstration for uranium-contaminated soils

    International Nuclear Information System (INIS)

    Douthat, D.M.; Stewart, R.N.; Armstrong, A.Q.

    1995-04-01

    One of the principal objectives of the US Department of Energy (DOE) Office of Technology Development is to develop an optimum integrated system of technologies for removing uranium substances from soil. This system of technologies, through demonstration, must be proven in terms of cost reduction, waste minimization, risk reduction, and user applicability. To evaluate the effectiveness of these technologies, a field demonstration was conducted at the Fernald site in the summer of 1994. Fernald was selected as the host site for the demonstration based on environmental problems stemming from past production of uranium metal for defense-related applications. The following six alternative technologies were developed and/or demonstrated by the principal investigators in the Characterization Task Group at the field demonstration: (1) beta scintillation detector by Pacific Northwest Laboratory (PNL), (2) in situ gamma detector by PNL, (3) mobile laser ablation-inductively coupled plasma/atomic emission spectrometry (LA-ICP/AES) laboratory by Ames Laboratory, (4) long-range alpha detector (LRAD) by Los Alamos National Laboratory (LANL), (5) passive radon monitoring by ORNL, and (6) electret ion chamber by ORNL

  17. Zero Emission Bay Area (ZEBA) Fuel Cell Bus Demonstration: First Results Report

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, K.; Eudy, L.

    2011-08-01

    This report documents the early implementation experience for the Zero Emission Bay Area (ZEBA) Demonstration, the largest fleet of fuel cell buses in the United States. The ZEBA Demonstration group includes five participating transit agencies: AC Transit (lead transit agency), Santa Clara Valley Transportation Authority (VTA), Golden Gate Transit (GGT), San Mateo County Transit District (SamTrans), and San Francisco Municipal Railway (Muni). The ZEBA partners are collaborating with the U.S. Department of Energy (DOE) and DOE's National Renewable Energy Laboratory (NREL) to evaluate the buses in revenue service.

  18. Differentiating regressed melanoma from regressed lichenoid keratosis.

    Science.gov (United States)

    Chan, Aegean H; Shulman, Kenneth J; Lee, Bonnie A

    2017-04-01

    Distinguishing regressed lichen planus-like keratosis (LPLK) from regressed melanoma can be difficult on histopathologic examination, potentially resulting in mismanagement of patients. We aimed to identify histopathologic features by which regressed melanoma can be differentiated from regressed LPLK. Twenty actively inflamed LPLK, 12 LPLK with regression and 15 melanomas with regression were compared and evaluated by hematoxylin and eosin staining as well as Melan-A, microphthalmia transcription factor (MiTF) and cytokeratin (AE1/AE3) immunostaining. (1) A total of 40% of regressed melanomas showed complete or near complete loss of melanocytes within the epidermis with Melan-A and MiTF immunostaining, while 8% of regressed LPLK exhibited this finding. (2) Necrotic keratinocytes were seen in the epidermis in 33% of regressed melanomas as opposed to all of the regressed LPLK. (3) A dense infiltrate of melanophages in the papillary dermis was seen in 40% of regressed melanomas, a feature not seen in regressed LPLK. In summary, our findings suggest that a complete or near complete loss of melanocytes within the epidermis strongly favors a regressed melanoma over a regressed LPLK. In addition, necrotic epidermal keratinocytes and the presence of a dense band-like distribution of dermal melanophages can be helpful in differentiating these lesions. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. Selection of Children for the KEEP Demonstration School: Criteria, Procedures, and Results. Technical Report #13.

    Science.gov (United States)

    Mays, Violet; And Others

    This brief report describes the selection of the pupil population of the Kamehameha Early Education Program (KEEP) Demonstration School. The pupil population must be representative of the Kalihi community (an urban area of Honolulu) from which it is drawn. An attempt was made to include 75% Hawaiian and 25% non-Hawaiian children, to select equal…

  20. Monitoring results of two PBS demonstration vehicles in the forestry industry in South Africa

    CSIR Research Space (South Africa)

    Nordengen, Paul A

    2010-03-01

    and manufactured to comply with the Level 2 safety standards of the Australian PBS system. This paper presents a summary of the monitoring data compiled during the first nine months of operation of the two PBS demonstration vehicles, which were commissioned...

  1. POSTERS FOR WORKSHOP ON EPA’S ARSENIC REMOVAL DEMONSTRATION PROGRAM: RESULTS AND LESSONS LEARNED.

    Science.gov (United States)

    The Workshop included posters on 21 different arsenic demonstration projects. Each poster included information on raw water quality, cost of the system, a schematic of the layout of the system and several graphs and tables on the performance of the system for the removal of arsen...

  2. Teacher and Principal Survey Results in the National Preventive Dentistry Demonstration Program.

    Science.gov (United States)

    Klein, Stephen P.; And Others

    The National Preventive Dentistry Demonstration Program was conducted to assess the costs and benefits of combinations of school-based preventive dental care procedures. The program involved almost 30,000 elementary school children from 10 sites across the country. Classroom procedures, such as weekly fluoride mouthrinse, were administered or…

  3. Retinal microaneurysm count predicts progression and regression of diabetic retinopathy. Post-hoc results from the DIRECT Programme.

    Science.gov (United States)

    Sjølie, A K; Klein, R; Porta, M; Orchard, T; Fuller, J; Parving, H H; Bilous, R; Aldington, S; Chaturvedi, N

    2011-03-01

    To study the association between baseline retinal microaneurysm score and progression and regression of diabetic retinopathy, and the response to treatment with candesartan, in people with diabetes. This was a multicenter randomized clinical trial. The progression analysis included 893 patients with Type 1 diabetes and 526 patients with Type 2 diabetes with retinal microaneurysms only at baseline. For regression, 438 patients with Type 1 and 216 with Type 2 diabetes qualified. Microaneurysms were scored from yearly retinal photographs according to the Early Treatment Diabetic Retinopathy Study (ETDRS) protocol. Retinopathy progression and regression were defined as a change of two or more steps on the ETDRS scale from baseline. Patients were normoalbuminuric and normotensive with Type 1 and Type 2 diabetes, or treated hypertensive with Type 2 diabetes. They were randomized to treatment with candesartan 32 mg daily or placebo and followed for 4.6 years. A higher microaneurysm score at baseline predicted an increased risk of retinopathy progression (HR per microaneurysm score 1.08, P diabetes; HR 1.07, P = 0.0174 in Type 2 diabetes) and reduced the likelihood of regression (HR 0.79, P diabetes; HR 0.85, P = 0.0009 in Type 2 diabetes), all adjusted for baseline variables and treatment. Candesartan reduced the risk of microaneurysm score progression. Microaneurysm counts are important prognostic indicators for worsening of retinopathy; thus microaneurysms are not benign. Treatment with renin-angiotensin system inhibitors is effective in the early stages and may improve mild diabetic retinopathy. Microaneurysm scores may be useful surrogate endpoints in clinical trials. © 2011 The Authors. Diabetic Medicine © 2011 Diabetes UK.
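A hazard ratio quoted per unit of a score compounds multiplicatively under a proportional-hazards model. A small illustrative calculation (only the 1.08 figure is from the abstract; the helper function is hypothetical):

```python
# Under a proportional-hazards model with a log-linear score effect,
# the hazard relative to a score of zero is HR_per_unit ** score.
hr_per_unit = 1.08  # HR per microaneurysm score for progression, as quoted

def relative_hazard(score, hr=hr_per_unit):
    """Hazard of progression relative to a score of zero, assuming
    log-linearity of the hazard in the microaneurysm score."""
    return hr ** score

# A score of 10 roughly doubles the modelled progression hazard
for score in (1, 5, 10):
    print(score, round(relative_hazard(score), 2))
```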

  4. ADVANCED SIMULATION CAPABILITY FOR ENVIRONMENTAL MANAGEMENT- CURRENT STATUS AND PHASE II DEMONSTRATION RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R.

    2013-02-26

    The U.S. Department of Energy (USDOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and High-Performance Computing (HPC) Multiprocess Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial toolsets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  5. Supplement analysis 2 of environmental impacts resulting from modifications in the West Valley Demonstration Project

    International Nuclear Information System (INIS)

    1998-01-01

    The West Valley Demonstration Project, located in western New York, has approximately 600,000 gallons of liquid high-level radioactive waste (HLW) in storage in underground tanks. While corrosion analysis has revealed that only limited tank degradation has taken place, the failure of these tanks could release HLW to the environment. Congress requires DOE to demonstrate the technology for removal and solidification of HLW. DOE issued the Final Environmental Impact Statement (FEIS) in 1982. The purpose of this second supplement analysis is to re-assess the 1982 Final Environmental Impact Statement's continued adequacy. This report provides the necessary and appropriate data for DOE to determine whether the environmental impacts presented by the ongoing refinements in the design, process, and operations of the Project are considered sufficiently bounded within the envelope of impacts presented in the FEIS and supporting documentation

  6. Haida Gwaii / Queen Charlotte Islands demonstration tidal power plant feasibility study : summary results

    Energy Technology Data Exchange (ETDEWEB)

    Tu, A. [BC Hydro, Burnaby, BC (Canada)

    2008-07-01

    Remote communities may benefit from using tidal energy in terms of reduced diesel fuel consumption and the associated greenhouse gas emissions. A study was conducted to assess the feasibility for a tidal demonstration project on the Haida Gwaii, Queen Charlotte Islands. Candidate communities were scanned for resource potential, load profile, infrastructure distribution and community interest. This presentation focused on choosing an appropriate site for a given tidal power technology. Three hotspots in Masset Sound were identified as well as one hotspot at Juskatla Narrows. Technology providers were solicited for information on unit performance, cost, and trials to date. The presentation noted that demonstration or future commercial deployment is limited by resource and by the ability of the grid to accommodate tidal power. The presentation concluded with next steps which include publishing the study. tabs., figs.

  7. Advanced Simulation Capability for Environmental Management - Current Status and Phase II Demonstration Results - 13161

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, Roger R.; Flach, Greg [Savannah River National Laboratory, Savannah River Site, Bldg 773-43A, Aiken, SC 29808 (United States); Freshley, Mark D.; Freedman, Vicky; Gorton, Ian [Pacific Northwest National Laboratory, MSIN K9-33, P.O. Box 999, Richland, WA 99352 (United States); Dixon, Paul; Moulton, J. David [Los Alamos National Laboratory, MS B284, P.O. Box 1663, Los Alamos, NM 87544 (United States); Hubbard, Susan S.; Faybishenko, Boris; Steefel, Carl I.; Finsterle, Stefan [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, MS 50B-4230, Berkeley, CA 94720 (United States); Marble, Justin [Department of Energy, 19901 Germantown Road, Germantown, MD 20874-1290 (United States)

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Tool-sets and High-Performance Computing (HPC) Multi-process Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, tool-sets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial tool-sets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  8. Advanced Simulation Capability for Environmental Management - Current Status and Phase II Demonstration Results - 13161

    International Nuclear Information System (INIS)

    Seitz, Roger R.; Flach, Greg; Freshley, Mark D.; Freedman, Vicky; Gorton, Ian; Dixon, Paul; Moulton, J. David; Hubbard, Susan S.; Faybishenko, Boris; Steefel, Carl I.; Finsterle, Stefan; Marble, Justin

    2013-01-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater, is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. The modular and open source high-performance computing tool facilitates integrated approaches to modeling and site characterization that enable robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in development of computer software capabilities with an emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Tool-sets and High-Performance Computing (HPC) Multi-process Simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, tool-sets for interaction with Platform, and verification and model confidence testing. The Platform and HPC capabilities are being tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities. The Phase I demonstration focusing on individual capabilities of the initial tool-sets was completed in 2010. The Phase II demonstration completed in 2012 focused on showcasing integrated ASCEM capabilities. For Phase II, the Hanford Site deep vadose zone (BC Cribs) served as an application site for an end-to-end demonstration of capabilities, with emphasis on integration and linkages between the Platform and HPC components. Other demonstrations

  9. Elder mediation in theory and practice: study results from a national caregiver mediation demonstration project.

    Science.gov (United States)

    Crampton, Alexandra

    2013-01-01

    Mediation is a process through which a third party facilitates discussion among disputing parties to help them identify interests and ideally reach an amicable solution. Elder mediation is a growing subspecialty to address conflicts involving older adults, primarily involving caregiving or finances. Mediation is theorized to empower participants but critics argue that it can exacerbate power imbalances among parties and coerce consensus. These contested claims are examined through study of a national caregiver mediation demonstration project. Study implications underscore the importance of gerontological social work expertise to ensure the empowerment of vulnerable older adults in mediation sessions.

  10. Regression to Causality : Regression-style presentation influences causal attribution

    DEFF Research Database (Denmark)

    Bordacconi, Mats Joe; Larsen, Martin Vinæs

    2014-01-01

    of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally. Our experiment implies that scholars using regression models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity
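The equivalence the experiment trades on (the OLS slope on a 0/1 group dummy equals the difference of the two sample means) can be checked numerically on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)
group_a = rng.normal(10.0, 2.0, 50)   # hypothetical outcome, group A
group_b = rng.normal(12.0, 2.0, 50)   # hypothetical outcome, group B

# Presentation 1: comparison of two sample means
mean_difference = group_b.mean() - group_a.mean()

# Presentation 2: OLS regression of the outcome on a 0/1 group dummy
y = np.concatenate([group_a, group_b])
x = np.concatenate([np.zeros(50), np.ones(50)])
slope, intercept = np.polyfit(x, y, 1)

# The slope on the dummy equals the difference in means (up to floating error)
print(np.isclose(slope, mean_difference))
```

The two presentations are statistically identical; only their framing differs, which is the point of the experiment.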

  11. In situ permeable flow sensors at the Savannah River Integrated Demonstration: Phase 2 results

    International Nuclear Information System (INIS)

    Ballard, S.

    1994-08-01

    A suite of In Situ Permeable Flow Sensors was deployed at the site of the Savannah River Integrated Demonstration to monitor the interaction between the groundwater flow regime and air injected into the saturated subsurface through a horizontal well. One of the goals of the experiment was to determine if a groundwater circulation system was induced by the air injection process. The data suggest that no such circulation system was established, perhaps due to the heterogeneous nature of the sediments through which the injected gas has to travel. The steady state and transient groundwater flow patterns observed suggest that the injected air followed high permeability pathways from the injection well to the water table. The preferential pathways through the essentially horizontal impermeable layers appear to have been created by drilling activities at the site

  12. EPA program to demonstrate mitigation measures for indoor radon: initial results

    International Nuclear Information System (INIS)

    Henschel, D.B.; Scott, A.G.

    1986-01-01

    EPA has installed radon mitigation techniques in 18 concrete block basement homes in the Reading Prong region of eastern Pennsylvania. Three alternative active soil ventilation approaches were tested: suction on the void network within the concrete block basement walls; suction on the footing drain tile system; and suction on the aggregate underneath the concrete slab. The initial 18 mitigation installations were designed to demonstrate techniques with low to moderate installation and operating costs. Where effective closure of major openings in the block walls is possible, suction on the wall voids has proved extremely effective, able to reduce homes with very high radon working levels (up to 7 WL) to 0.02 WL and less. However, where inaccessible major openings are concealed within the wall, it is more difficult and/or more expensive to develop adequate suction on the void network, and performance is reduced. Testing is continuing to demonstrate the steps required to achieve high performance with wall suction in homes with such difficult-to-close walls. Drain tile suction can be very effective where the drain tiles completely surround the home; it is the least expensive and most aesthetic of the active soil ventilation approaches, but appears susceptible to spikes in radon levels when the basement is depressurized. Sub-slab suction as tested in this study - with one or two individual suction points in the slab - does not appear adequate to ensure sustained high levels of reduction in block wall basement homes; it appears to effectively treat slab-related soil gas entry routes so long as a uniform layer of aggregate is present, but it does not appear to effectively treat the wall-related entry routes. Closure of major openings might have improved sub-slab suction performance. 5 figures, 3 tables

  13. Regression Phalanxes

    OpenAIRE

    Zhang, Hongyang; Welch, William J.; Zamar, Ruben H.

    2017-01-01

    Tomal et al. (2015) introduced the notion of "phalanxes" in the context of rare-class detection in two-class classification problems. A phalanx is a subset of features that work well for classification tasks. In this paper, we propose a different class of phalanxes for application in regression settings. We define a "Regression Phalanx" - a subset of features that work well together for prediction. We propose a novel algorithm which automatically chooses Regression Phalanxes from high-dimensi...

  14. Research and demonstration results for a new "Double-Solution" technology for municipal solid waste treatment.

    Science.gov (United States)

    Erping, Li; Haoyun, Chen; Yanyang, Shang; Jun, Pan; Qing, Hu

    2017-11-01

    In this paper, the pyrolysis characteristics of six typical components in municipal solid waste (MSW) were investigated through a combined TG-FTIR technique. The main pyrolysis process of the biomass components (food residues, sawdust and paper) occurred at 150-600°C, and the main volatiles were a multi-component gas including H2O, CO2 and CO. The main pyrolysis temperatures of three artificial products (PP, PVC and leather) ranged from 200 to 500°C. Absorption bands of small-molecule gases (CH4, CO2 and CO) and of the chemical bonds (CO and CC) were observed in the infrared spectra. Based on the pyrolysis temperature intervals and volatile constituents, a new "double-solution" process of pyrolysis and oxygen-enriched decomposition of MSW was designed. To implement this process, a double-solution project was built for the direct treatment of MSW (10 t/d). The complete setup of the equipment and an analysis of the byproducts are reported in this paper to indicate the performance of the process. The energy balance and economic benefits were analysed to support the process. It was successfully demonstrated that the double-solution process is an environmentally friendly alternative for MSW treatment in Chinese rural areas. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Results of a search for neutrinoless double-beta decay using the COBRA demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Quante, Thomas; Goessling, Claus; Kroeninger, Kevin [TU Dortmund, Exp. Physik IV, Dortmund (Germany)

    2016-07-01

    COBRA is an experiment aiming to search for neutrinoless double-beta-decay (0νββ-decay) using CdZnTe semiconductor detectors. The main focus is on {sup 116}Cd, with a Q-value of 2813.5 keV well above the highest dominant naturally occurring gamma lines. By measuring the half-life of the 0νββ-decay, it is possible to clarify the nature of the neutrino as either Dirac or Majorana particle and furthermore to determine its effective Majorana mass. The COBRA collaboration operates a demonstrator to search for these decays at the Laboratori Nazionali del Gran Sasso in Italy. The exposure of 234.7 kg d considered in this analysis was collected between September 2011 and February 2015. The analysis focuses on the decay of the nuclides {sup 114}Cd, {sup 128}Te, {sup 70}Zn, {sup 130}Te and {sup 116}Cd. A Bayesian analysis is performed to estimate the signal strength of 0νββ-decay.

  16. Characterization and demonstration results of a SQUID magnetometer system developed for geomagnetic field measurements

    Science.gov (United States)

    Kawai, J.; Miyamoto, M.; Kawabata, M.; Nosé, M.; Haruta, Y.; Uehara, G.

    2017-08-01

    We characterized a low temperature superconducting quantum interference device (SQUID) magnetometer system developed for high-sensitivity geomagnetic field measurement, and demonstrated the detection of weak geomagnetic signals. The SQUID magnetometer system is comprised of three-axis SQUID magnetometers housed in a glass fiber reinforced plastic cryostat, readout electronics with flux locked loop (FLL), a 24-bit data logger with a global positioning system and batteries. The system noise was approximately 0.2 pT √Hz- 1/2 in the 1-50 Hz frequency range. This performance was determined by including the thermal noise and the shielding effect of the copper shield, which covered the SQUID magnetometers to eliminate high-frequency interference. The temperature drift of the system was ˜0.8 pT °C- 1 in an FLL operation. The system operated for a month using 33 l liquid helium. Using this system, we performed the measurements of geomagnetic field in the open-air, far away from the city. The system could detect weak geomagnetic signals such as the Schumann resonance with sixth harmonics, and the ionospheric Alfvén resonance appearing at night, for the north-south and east-west components of the geomagnetic field. We confirm that the system was capable of high-sensitivity measurement of the weak geomagnetic activities.

  17. Financial performance among adult day centers: results of a national demonstration program.

    Science.gov (United States)

    Reifler, B V; Henry, R S; Rushing, J; Yates, M K; Cox, N J; Bradham, D D; McFarlane, M

    1997-02-01

    This paper describes the financial performance (defined as percent of total expenses covered by net operating revenue) of 16 adult day centers participating in a national demonstration program on day services for people with dementia, including examination of possible predictors of financial performance. Participating sites submitted quarterly financial and utilization reports to the National Program Office. Descriptive statistics summarize the factors believed to influence financial performance. Sites averaged meeting 35% of expenses from self-pay and 29% from government (mainly Medicaid) revenue, totaling 64% of all (cash plus in-kind) expenses met by operating revenue. Examination of center characteristics suggests that factors related to meeting consumer needs, such as being open a full day (i.e., 7:30 am to 6:00 pm) rather than shorter hours, and providing transportation, may be related to improved utilization and, thus, improved financial performance. Higher fees were not related to lower enrollment, census, or revenue. Adult day centers are able to achieve financial viability through a combination of operating (i.e., fee-for-service) and non-operating revenue. Operating revenue is enhanced by placing emphasis on consumer responsiveness, such as being open a full day. Because higher fees were not related to lower utilization, centers should set fees to reflect actual costs. The figure of 64% of expenses met by operating revenue is conservative inasmuch as sites included in-kind revenue as expenses in their budgeting calculations, and percent of cash expenses met by operating revenue would be higher (approximately 75% for this group of centers).

  18. Structural Area Inspection Frequency Evaluation (SAIFE). Volume 5. Results of Model Demonstration

    Science.gov (United States)

    1978-04-01

    serlV ice 11 i Stot’)’ .i ’ base~d Otil narrow -bodly at ir-c t’a ftA wh ich have fewer 0 1 euient~s than thle hypothet i cal widve - A body aircaf...kP1 c u TABLE 14. DL40MRATION RESULTS .M WING - SPAR, FMD Defects Per Million Flight Hours Crack Detected Preflight 0.00 0104 Service 0.00 0.49 Phase

  19. A multicenter study demonstrating discordant results from electronic prostate-specific antigen biochemical failure calculation systems

    International Nuclear Information System (INIS)

    Williams, Scott G.; Pickles, Tom; Kestin, Larry; Potters, Louis; Fearn, Paul; Smith, Ryan; Pratt, Gary

    2006-01-01

    Purpose: To evaluate the interobserver variation of four electronic biochemical failure (bF) calculators using three bF definitions. Methods and Materials: The data of 1200 men were analyzed using the electronic bF calculators of four institutions. Three bF definitions were examined for their concordance of bF identification across the centers: the American Society for Therapeutic Radiology and Oncology consensus definition (ACD), the lowest prostate-specific antigen (PSA) level to date plus 2 ng/mL (L2), and a threshold of 3 ng/mL (T3). Results: Unanimous agreement regarding bF status using the ACD, L2, and T3 definitions occurred in 87.3%, 96.4%, and 92.7% of cases, respectively. Using the ACD, 63% of the variation was from one institution, which allowed the bF status to be reversed if a PSA decline was seen after bF (PSA 'bounce'). A total of 270 men had an ACD bF time variation of >2 months across the calculators, and the 5-year freedom from bF rate was 49.8-60.9%. The L2 definition had a 20.5% rate of calculated bF times; which varied by >2 months (median, 6.4; range, 2.1-75.6) and a corresponding 5-year freedom from bF rate of 55.9-61.0%. The T3 definition had a 2.0% range in the 5-year freedom from bF. Fifteen definition interpretation variations were identified. Conclusion: Reported bF results vary not only because of bF definition differences, but because of variations in how those definitions are written into computer-based calculators, with multiple interpretations most prevalent for the ACD. An algorithm to avoid misinterpretations is proposed for the L2 definition. A verification system to guarantee consistent electronic bF results requires development

  20. Some results from the demonstration of indoor radon reduction measures in block basement houses

    International Nuclear Information System (INIS)

    Henschel, D.B.; Scott, A.G.

    1989-01-01

    Active soil ventilation techniques have been tested in 26 block-wall basement houses in eastern Pennsylvania with significantly elevated indoor radon concentrations, generally above 740 Bq/m 3 , and the results indicate that radon levels can be reduced substantially often below the U.S. Environmental Protection Agency (EPA) guideline of 148 Bq/m 3 , if effective suction can be drawn on the soil underneath the concrete slabs of these houses. Such effective suction appears achievable when either: (1) the house has a complete loop of drain tile around its footings for water drainage purposes, and suction is drawn on that loop; or (2) a sufficient number of suction pipes can be inserted at the proper locations into the crushed rock or the soil underneath the slab

  1. Strategic alliance for environmental restoration - results of the Chicago Pile 5 large scale demonstration project

    International Nuclear Information System (INIS)

    Aker, R.E.; Bradley, T.L.; Bhattacharyya, S.

    1998-01-01

    The world's largest environmental cleanup effort is focused upon the DOE weapons complex. These cleanup efforts parallel those which will be required as the commercial nuclear industry reaches the end of licensed life. The strategic Alliance for Environmental Restoration (Strategic Alliance), reflects the cooperative interest of industry, commercial nuclear utilities, university and national laboratory team members to bring a collaborative best-in-class approach to finding, and providing effective delivery of innovative environmental remediation technologies to the DOE Complex and subsequently to industry. The Strategic Alliance is comprised of team members from ComEd, Duke Engineering and Services, 3M, ICF Kaiser, Florida International University, and Argonne National Laboratory in concert with DOE. This team tested and evaluated over twenty innovative technologies in an effort to help provide cost effective technology solutions to DOE/Industry needs for decontamination and decommissioning. This paper summarizes the approach used by the Strategic Alliance and describes the results of this DOE funded project

  2. Predictors of adherence with self-care guidelines among persons with type 2 diabetes: results from a logistic regression tree analysis.

    Science.gov (United States)

    Yamashita, Takashi; Kart, Cary S; Noe, Douglas A

    2012-12-01

    Type 2 diabetes is known to contribute to health disparities in the U.S. and failure to adhere to recommended self-care behaviors is a contributing factor. Intervention programs face difficulties as a result of patient diversity and limited resources. With data from the 2005 Behavioral Risk Factor Surveillance System, this study employs a logistic regression tree algorithm to identify characteristics of sub-populations with type 2 diabetes according to their reported frequency of adherence to four recommended diabetes self-care behaviors including blood glucose monitoring, foot examination, eye examination and HbA1c testing. Using Andersen's health behavior model, need factors appear to dominate the definition of which sub-groups were at greatest risk for low as well as high adherence. Findings demonstrate the utility of easily interpreted tree diagrams to design specific culturally appropriate intervention programs targeting sub-populations of diabetes patients who need to improve their self-care behaviors. Limitations and contributions of the study are discussed.

  3. Latest results of SEE measurements obtained by the STRURED demonstrator ASIC

    Energy Technology Data Exchange (ETDEWEB)

    Candelori, A. [INFN, Section of Padova, Via Marzolo 8, c.a.p. 35131, Padova (Italy); De Robertis, G. [INFN Section of Bari, Via Orabona 4, c.a.p. 70126, Bari (Italy); Gabrielli, A. [Physics Department, University of Bologna, Viale Berti Pichat 6/2, c.a.p. 40127, Bologna (Italy); Mattiazzo, S.; Pantano, D. [INFN, Section of Padova, Via Marzolo 8, c.a.p. 35131, Padova (Italy); Ranieri, A., E-mail: antonio.ranieri@ba.infn.i [INFN Section of Bari, Via Orabona 4, c.a.p. 70126, Bari (Italy); Tessaro, M. [INFN, Section of Padova, Via Marzolo 8, c.a.p. 35131, Padova (Italy)

    2011-01-21

    With the perspective to develop a radiation-tolerant circuit for High Energy Physics (HEP) applications, a test digital ASIC VLSI chip, called STRURED, has been designed and fabricated using a standard-cell library of commercial 130 nm CMOS technology by implementing three different radiation-tolerant architectures (Hamming, Triple Modular Redundancy and Triple Time Redundancy) in order to correct circuit malfunctions induced by the occurrence of Soft Errors (SEs). SEs are one of the main reasons of failures affecting electronic digital circuits operating in harsh radiation environments, such as in experiments performed at HEP colliders or in apparatus to be operated in space. In this paper we present and discuss the latest results of SE cross-section measurements performed using the STRURED digital device, exposed to high energy heavy ions at the SIRAD irradiation facility of the INFN National Laboratories of Legnaro (Padova, Italy). In particular the different behaviors of the input part and the core of the three radiation-tolerant architectures are analyzed in detail.

  4. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: ""This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable."" -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  5. Autistic Regression

    Science.gov (United States)

    Matson, Johnny L.; Kozlowski, Alison M.

    2010-01-01

    Autistic regression is one of the many mysteries in the developmental course of autism and pervasive developmental disorders not otherwise specified (PDD-NOS). Various definitions of this phenomenon have been used, further clouding the study of the topic. Despite this problem, some efforts at establishing prevalence have been made. The purpose of…

  6. "Logits and Tigers and Bears, Oh My! A Brief Look at the Simple Math of Logistic Regression and How It Can Improve Dissemination of Results"

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2012-06-01

    Full Text Available Logistic regression is slowly gaining acceptance in the social sciences, and fills an important niche in the researcher's toolkit: being able to predict important outcomes that are not continuous in nature. While OLS regression is a valuable tool, it cannot routinely be used to predict outcomes that are binary or categorical in nature. These outcomes represent important social science lines of research: retention in, or dropout from school, using illicit drugs, underage alcohol consumption, antisocial behavior, purchasing decisions, voting patterns, risky behavior, and so on. The goal of this paper is to briefly lead the reader through the surprisingly simple mathematics that underpins logistic regression: probabilities, odds, odds ratios, and logits. Anyone with spreadsheet software or a scientific calculator can follow along, and in turn, this knowledge can be used to make much more interesting, clear, and accurate presentations of results (especially to non-technical audiences. In particular, I will share an example of an interaction in logistic regression, how it was originally graphed, and how the graph was made substantially more user-friendly by converting the original metric (logits to a more readily interpretable metric (probability through three simple steps.

  7. Linear regression

    CERN Document Server

    Olive, David J

    2017-01-01

    This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...

  8. CT demonstration of chicken trachea resulting from complete cartilaginous rings of the trachea in ring-sling complex

    International Nuclear Information System (INIS)

    Calcagni, Giulio; Bonnet, Damien; Sidi, Daniel; Brunelle, Francis; Vouhe, Pascal; Ou, Phalla

    2008-01-01

    We report a 10-month-old infant who presented with tetralogy of Fallot and respiratory disease in whom the suspicion of a ring-sling complex was confirmed by high-resolution CT. CT demonstrated the typical association of left pulmonary artery sling and the ''chicken trachea'' resulting from complete cartilaginous rings of the trachea. (orig.)

  9. CT demonstration of chicken trachea resulting from complete cartilaginous rings of the trachea in ring-sling complex

    Energy Technology Data Exchange (ETDEWEB)

    Calcagni, Giulio; Bonnet, Damien; Sidi, Daniel [University Paris Descartes, Department of Paediatric Cardiology, Hopital Necker-Enfants Malades, AP-HP, Paris (France); Brunelle, Francis [University Paris Descartes, Department of Paediatric Radiology, Hopital Necker-Enfants Malades, AP-HP, Paris Cedex 15 (France); Vouhe, Pascal [University Paris Descartes, Department of Paediatric Cardiovascular Surgery, Hopital Necker-Enfants Malades, AP-HP, Paris (France); Ou, Phalla [University Paris Descartes, Department of Paediatric Cardiology, Hopital Necker-Enfants Malades, AP-HP, Paris (France); University Paris Descartes, Department of Paediatric Radiology, Hopital Necker-Enfants Malades, AP-HP, Paris Cedex 15 (France)

    2008-07-15

    We report a 10-month-old infant who presented with tetralogy of Fallot and respiratory disease in whom the suspicion of a ring-sling complex was confirmed by high-resolution CT. CT demonstrated the typical association of left pulmonary artery sling and the ''chicken trachea'' resulting from complete cartilaginous rings of the trachea. (orig.)

  10. Development and demonstration of a stabilization system for buried mixed waste tanks: Initital results of the tank V-9 hot demonstration

    International Nuclear Information System (INIS)

    Matthern, G.E.; Kuhns, D.J.; Meservey, R.H.; Farnsworth, R.K.

    1996-01-01

    This paper describes a systematic approach for the stabilization of buried mixed waste tanks and presents the status of an application of this approach to a specific hot waste tank demonstration to be performed in FY-96. The approach uses the cradle-to-grave concept and includes technical, health and safety, and regulatory considerations and requirements. It starts with the identification of the tank and continues to the final disposition and monitoring of the tank

  11. Suppression Situations in Multiple Linear Regression

    Science.gov (United States)

    Shieh, Gwowen

    2006-01-01

    This article proposes alternative expressions for the two most prevailing definitions of suppression without resorting to the standardized regression modeling. The formulation provides a simple basis for the examination of their relationship. For the two-predictor regression, the author demonstrates that the previous results in the literature are…

  12. Preliminary test results from a free-piston Stirling engine technology demonstration program to support advanced radioisotope space power applications

    International Nuclear Information System (INIS)

    White, Maurice A.; Qiu Songgang; Augenblick, Jack E.

    2000-01-01

    Free-piston Stirling engines offer a relatively mature, proven, long-life technology that is well-suited for advanced, high-efficiency radioisotope space power systems. Contracts from DOE and NASA are being conducted by Stirling Technology Company (STC) for the purpose of demonstrating the Stirling technology in a configuration and power level that is representative of an eventual space power system. The long-term objective is to develop a power system with an efficiency exceeding 20% that can function with a high degree of reliability for up to 15 years on deep space missions. The current technology demonstration convertors (TDC's) are completing shakedown testing and have recently demonstrated performance levels that are virtually identical to projections made during the preliminary design phase. This paper describes preliminary test results for power output, efficiency, and vibration levels. These early results demonstrate the ability of the free-piston Stirling technology to exceed objectives by approximately quadrupling the efficiency of conventional radioisotope thermoelectric generators (RTG's)

  13. Preliminary test results from a free-piston Stirling engine technology demonstration program to support advanced radioisotope space power applications

    Science.gov (United States)

    White, Maurice A.; Qiu, Songgang; Augenblick, Jack E.

    2000-01-01

    Free-piston Stirling engines offer a relatively mature, proven, long-life technology that is well-suited for advanced, high-efficiency radioisotope space power systems. Contracts from DOE and NASA are being conducted by Stirling Technology Company (STC) for the purpose of demonstrating the Stirling technology in a configuration and power level that is representative of an eventual space power system. The long-term objective is to develop a power system with an efficiency exceeding 20% that can function with a high degree of reliability for up to 15 years on deep space missions. The current technology demonstration convertors (TDC's) are completing shakedown testing and have recently demonstrated performance levels that are virtually identical to projections made during the preliminary design phase. This paper describes preliminary test results for power output, efficiency, and vibration levels. These early results demonstrate the ability of the free-piston Stirling technology to exceed objectives by approximately quadrupling the efficiency of conventional radioisotope thermoelectric generators (RTG's). .

  14. In situ vitrification demonstration at Pit 1, Oak Ridge National Laboratory. Volume 1: Results of treatability study

    International Nuclear Information System (INIS)

    Spalding, B.P.; Naney, M.T.; Cline, S.R.; Bogle, M.A.

    1997-12-01

    A treatability study was initiated in October 1993 to apply in situ vitrification (ISV) to at least two segments of Oak Ridge National Laboratory (ORNL) seepage Pit 1 by the end of fiscal year (FY) 1995. This treatability study was later extended to include all of Pit 1 and was performed to support a possible Interim Record of Decision or removal action for closure of one or more of the seepage pits and trenches beginning as early as FY 1997. This treatability study was carried out to establish the field-scale technical performance of ISV for (1) attaining the required depth, nominally 15 ft, to incorporate source contamination within and beneath the pits; (2) demonstrating field capability for the overlap of melt settings which will be necessary to achieve fused, melted segments of the source contamination; (3) demonstrating off-gas handling technology for accommodating and minimizing the volatilization of 137 Cs; (4) demonstrating adequate site characterization techniques to predict ISV melting kinetics, processing temperatures, and product durability; and (5) promoting public acceptance of ISV technology by demonstrating its safety, implementability, site impacts, and air emissions and by coordinating the treatability study within the regulatory closure process. In April 1996 an expulsion of an estimated 10% of the 196 Mg (216 tons) melt body occurred resulting in significant damage to ISV equipment and, ultimately, led to an indefinite suspension of further ISV operations at Pit 1. This report summarizes the technical accomplishments and status of the project in fulfilling these objectives through September 1997

  15. In situ vitrification demonstration at Pit 1, Oak Ridge National Laboratory. Volume 1: Results of treatability study

    Energy Technology Data Exchange (ETDEWEB)

    Spalding, B.P.; Naney, M.T.; Cline, S.R.; Bogle, M.A. [Oak Ridge National Lab., TN (United States). Environmental Sciences Div.; Tixier, J.S. [Pacific Northwest National Lab., Richland, WA (United States)

    1997-12-01

    A treatability study was initiated in October 1993 to apply in situ vitrification (ISV) to at least two segments of Oak Ridge National Laboratory (ORNL) seepage Pit 1 by the end of fiscal year (FY) 1995. This treatability study was later extended to include all of Pit 1 and was performed to support a possible Interim Record of Decision or removal action for closure of one or more of the seepage pits and trenches beginning as early as FY 1997. This treatability study was carried out to establish the field-scale technical performance of ISV for (1) attaining the required depth, nominally 15 ft, to incorporate source contamination within and beneath the pits; (2) demonstrating field capability for the overlap of melt settings which will be necessary to achieve fused, melted segments of the source contamination; (3) demonstrating off-gas handling technology for accommodating and minimizing the volatilization of {sup 137}Cs; (4) demonstrating adequate site characterization techniques to predict ISV melting kinetics, processing temperatures, and product durability; and (5) promoting public acceptance of ISV technology by demonstrating its safety, implementability, site impacts, and air emissions and by coordinating the treatability study within the regulatory closure process. In April 1996 an expulsion of an estimated 10% of the 196 Mg (216 tons) melt body occurred resulting in significant damage to ISV equipment and, ultimately, led to an indefinite suspension of further ISV operations at Pit 1. This report summarizes the technical accomplishments and status of the project in fulfilling these objectives through September 1997.

  16. Understanding logistic regression analysis

    OpenAIRE

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratio in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is to avoid confounding effects by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using ex...

  17. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition ""...this is an excellent book which could easily be used as a course text...""-International Statistical Institute The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus

  18. Summary Report on Phase I Results from the 3D Printing in Zero G Technology Demonstration Mission, Volume I

    Science.gov (United States)

    Prater, T. J.; Bean, Q. A.; Beshears, R. D.; Rolin, T. D.; Werkheiser, N. J.; Ordonez, E. A.; Ryan, R. M.; Ledbetter, F. E., III

    2016-01-01

    Human space exploration to date has been confined to low-Earth orbit and the Moon. The International Space Station (ISS) provides a unique opportunity for researchers to prove out the technologies that will enable humans to safely live and work in space for longer periods of time and venture beyond the Earth/Moon system. The ability to manufacture parts in-space rather than launch them from Earth represents a fundamental shift in the current risk and logistics paradigm for human spaceflight. In September 2014, NASA, in partnership with Made In Space, Inc., launched the 3D Printing in Zero-G technology demonstration mission to explore the potential of additive manufacturing for in-space applications and demonstrate the capability to manufacture parts and tools on orbit using fused deposition modeling. This Technical Publication summarizes the results of testing to date of the ground control and flight prints from the first phase of this ISS payload.

  19. FY16 Status of Immersion Phased Array Ultrasonic Probe Development and Performance Demonstration Results for Under Sodium Viewing

    International Nuclear Information System (INIS)

    Diaz, Aaron A.; Chamberlin, Clyde E.; Edwards, Matthew K.; Hagge, Tobias J.; Hughes, Michael S.; Larche, Michael R.; Mathews, Royce A.; Neill, Kevin J.; Prowant, Matthew S.

    2016-01-01

    This section of the Joint summary technical letter report (TLR) describes work conducted at the Pacific Northwest National Laboratory (PNNL) during FY 2016 (FY16) on the under-sodium viewing (USV) PNNL project 58745, work package AT-16PN230102. This section of the TLR satisfies PNNL's M3AT-16PN2301025 milestone and is focused on summarizing the design, development, and evaluation of two different phased-array ultrasonic testing (PA-UT) probe designs - a two-dimensional (2D) matrix phased-array probe, and two one-dimensional (1D) linear array probes, referred to as serial number 4 (SN4) engineering test units (ETUs). The 2D probe is a pulse-echo (PE), 32x2, 64-element matrix phased-array ETU. The 1D probes are 32x1 element linear array ETUs. This TLR also provides the results from a performance demonstration (PD) of in-sodium target detection trials at 260°C using both probe designs. This effort continues the iterative evolution supporting the longer term goal of producing and demonstrating a pre-manufacturing prototype ultrasonic probe that possesses the fundamental performance characteristics necessary to enable the development of a high-temperature sodium-cooled fast reactor (SFR) inspection system for in-sodium detection and imaging.

  20. FY16 Status of Immersion Phased Array Ultrasonic Probe Development and Performance Demonstration Results for Under Sodium Viewing

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, Aaron A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chamberlin, Clyde E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Edwards, Matthew K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hagge, Tobias J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hughes, Michael S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Larche, Michael R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mathews, Royce A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Neill, Kevin J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prowant, Matthew S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-08-31

    This section of the Joint summary technical letter report (TLR) describes work conducted at the Pacific Northwest National Laboratory (PNNL) during FY 2016 (FY16) on the under-sodium viewing (USV) PNNL project 58745, work package AT-16PN230102. This section of the TLR satisfies PNNL’s M3AT-16PN2301025 milestone and is focused on summarizing the design, development, and evaluation of two different phased-array ultrasonic testing (PA-UT) probe designs—a two-dimensional (2D) matrix phased-array probe, and two one-dimensional (1D) linear array probes, referred to as serial number 4 (SN4) engineering test units (ETUs). The 2D probe is a pulse-echo (PE), 32×2, 64-element matrix phased-array ETU. The 1D probes are 32×1 element linear array ETUs. This TLR also provides the results from a performance demonstration (PD) of in-sodium target detection trials at 260°C using both probe designs. This effort continues the iterative evolution supporting the longer term goal of producing and demonstrating a pre-manufacturing prototype ultrasonic probe that possesses the fundamental performance characteristics necessary to enable the development of a high-temperature sodium-cooled fast reactor (SFR) inspection system for in-sodium detection and imaging.

  1. FY15 Status of Immersion Phased Array Ultrasonic Probe Development and Performance Demonstration Results for Under Sodium Viewing

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, Aaron A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Larche, Michael R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mathews, Royce [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Neill, Kevin J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Baldwin, David L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prowant, Matthew S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Edwards, Matthew K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Chamberlin, Clyde E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-09-01

    This Technical Letter Report (TLR) describes work conducted at the Pacific Northwest National Laboratory (PNNL) during FY 2015 on the under-sodium viewing (USV) PNNL project 58745, Work Package AT-15PN230102. This TLR satisfies PNNL’s M3AT-15PN2301027 milestone, and is focused on summarizing the design, development, and evaluation of a two-dimensional matrix phased-array probe referred to as serial number 3 (SN3). In addition, this TLR also provides the results from a performance demonstration of in-sodium target detection trials at 260°C using a one-dimensional 22-element linear array developed in FY14 and referred to as serial number 2 (SN2).

  2. Understanding logistic regression analysis.

    Science.gov (United States)

    Sperandei, Sandro

    2014-01-01

    Logistic regression is used to obtain odds ratios in the presence of more than one explanatory variable. The procedure is quite similar to multiple linear regression, with the exception that the response variable is binomial. The result is the impact of each variable on the odds ratio of the observed event of interest. The main advantage is that confounding effects are avoided by analyzing the association of all variables together. In this article, we explain the logistic regression procedure using examples to make it as simple as possible. After defining the technique, the basic interpretation of the results is highlighted and some special issues are then discussed.
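
    The core computation described above, exponentiating a fitted coefficient into an odds ratio, can be sketched in a few lines. The following is an illustrative example, not code from the article; the synthetic data, learning rate, and hand-rolled gradient-ascent fitter are all assumptions chosen to keep the sketch dependency-free:

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit y ~ sigmoid(b0 + b1*x) by plain gradient ascent on the log-likelihood."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p        # d(log-likelihood)/d b0
            g1 += (y - p) * x  # d(log-likelihood)/d b1
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Hypothetical binary exposure: event probability 0.33 unexposed, 0.60 exposed,
# so the true odds ratio is (0.6/0.4) / (0.33/0.67), roughly 3.
random.seed(0)
xs = [i % 2 for i in range(400)]
ys = [1 if random.random() < (0.60 if x else 0.33) else 0 for x in xs]
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)  # exponentiated coefficient = odds ratio
print(round(odds_ratio, 2))
```

    In practice a statistics package does the fitting; the odds ratio is always the exponential of the fitted coefficient, as here.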

  3. Logits and Tigers and Bears, Oh My! A Brief Look at the Simple Math of Logistic Regression and How It Can Improve Dissemination of Results

    Science.gov (United States)

    Osborne, Jason W.

    2012-01-01

    Logistic regression is slowly gaining acceptance in the social sciences and fills an important niche in the researcher's toolkit: the ability to predict important outcomes that are not continuous in nature. While OLS regression is a valuable tool, it cannot routinely be used to predict outcomes that are binary or categorical in nature. These…

  4. The Mammographic Head Demonstrator Developed in the Framework of the “IMI” Project:. First Imaging Tests Results

    Science.gov (United States)

    Bisogni, Maria Giuseppina

    2006-04-01

    In this paper we report on the performance and the first imaging test results of a digital mammographic demonstrator based on GaAs pixel detectors. The heart of this prototype is the X-ray detection unit, a GaAs pixel sensor read out by the PCC/MEDIPIXI circuit. Since the active area of the sensor is 1 cm2, 18 detectors have been organized in two staggered rows of nine chips each. To cover the typical mammographic format (18 × 24 cm2), a linear scan is performed by means of a stepper motor. The system is integrated in mammographic equipment comprising the X-ray tube, the bias and data acquisition systems, and the PC-based control system. The prototype has been developed in the framework of the Integrated Mammographic Imaging (IMI) project, an industrial research activity aiming to develop innovative instrumentation for morphologic and functional imaging. The project has been supported by the Italian Ministry of Education, University and Research (MIUR) and by five Italian High Tech companies in collaboration with the universities of Ferrara, Roma “La Sapienza”, Pisa and the INFN.

  5. Improving quantitative gas chromatography-electron ionization mass spectrometry results using a modified ion source: demonstration for a pharmaceutical application.

    Science.gov (United States)

    D'Autry, Ward; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Van Schepdael, Ann

    2011-07-01

    Gas chromatography-mass spectrometry is a well established analytical technique. However, mass spectrometers with electron ionization sources may suffer from signal drift, thereby negatively influencing quantitative performance. To demonstrate this phenomenon for a real application, a static headspace-gas chromatography method in combination with electron ionization-quadrupole mass spectrometry was optimized for the determination of residual dichloromethane in coronary stent coatings. During method validation, the quantitative performance of the original stainless steel ion source was compared with that of a modified ion source. Ion source modification included the application of a gold coating on the repeller and exit plate. Several validation aspects such as limit of detection, limit of quantification, linearity and precision were evaluated using both ion sources. It was found that, as expected, the stainless steel ion source suffered from signal drift. As a consequence, non-linearity and high RSD values for repeated analyses were obtained. An additional experiment was performed to check whether an internal standard compound would lead to better results. It was found that the signal drift patterns of the analyte and internal standard were different, consequently leading to high RSD values for the response factor. With the modified ion source, however, a more stable signal was observed, resulting in acceptable linearity and precision. Moreover, it was also found that sensitivity improved compared to the stainless steel ion source. Finally, the optimized method with the modified ion source was applied to determine residual dichloromethane in the coating of coronary stents. The solvent was detected but found to be below the limit of quantification. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Radar Cross Section (RCS) Certification for Static and Dynamic RCS Measurement Facilities. Volume 2: DOD RCS Demonstration Program Results

    National Research Council Canada - National Science Library

    2001-01-01

    ...) 46 Test Group, in cooperation with the RCC/SMSG Radar Committee, the demonstration program described herein was entirely successful and should lay the groundwork for similar technical or laboratory...

  7. Polynomial regression analysis and significance test of the regression function

    International Nuclear Information System (INIS)

    Gao Zhengming; Zhao Juan; He Shengping

    2012-01-01

    In order to analyze the decay heating power of a certain radioactive isotope per kilogram with the polynomial regression method, the paper first demonstrates the broad usage of polynomial functions and derives their parameters with the ordinary least squares estimate. Then a significance test method for the polynomial regression function is derived, exploiting the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram, in accordance with the authors' real work. (authors)
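
    The least-squares step the abstract refers to, solving the normal equations (X^T X) b = X^T y for the polynomial coefficients, can be sketched as follows. This is an illustrative, dependency-free example and not the authors' code; the quadratic test data are made up:

```python
def polyfit_ols(xs, ys, degree):
    """Ordinary least squares polynomial fit: solve the normal equations (X^T X) b = X^T y."""
    n = degree + 1
    # Moment matrix and right-hand side of the normal equations.
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum((x ** i) * y for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    coefs = [0.0] * n
    for i in reversed(range(n)):
        coefs[i] = (b[i] - sum(A[i][j] * coefs[j] for j in range(i + 1, n))) / A[i][i]
    return coefs  # [b0, b1, ..., b_degree]

# Exact quadratic data: y = 2 + 3x + 0.5x^2, so the fit should recover (2, 3, 0.5).
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2 + 3 * x + 0.5 * x * x for x in xs]
coefs = polyfit_ols(xs, ys, 2)
print([round(c, 6) for c in coefs])
```

    For real decay-heat data one would also carry out the significance test the paper derives; the fitting step itself is exactly this linear solve.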

  8. Vector regression introduced

    Directory of Open Access Journals (Sweden)

    Mok Tik

    2014-06-01

    Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
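
    The complex-number representation described above can be illustrated with a single independent vector variable: writing each 2-D vector as a complex number, the vector coefficient is recovered by complex least squares. A minimal sketch (the synthetic "track" data and variable names are assumptions, not taken from the paper):

```python
def complex_ls(xs, ys):
    """Least-squares fit y ≈ a + b*x where x, y, a, b are all complex (i.e., 2-D vectors)."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    # Complex analogue of slope = cov(x, y) / var(x).
    b = sum((x - xbar).conjugate() * (y - ybar) for x, y in zip(xs, ys)) \
        / sum(abs(x - xbar) ** 2 for x in xs)
    a = ybar - b * xbar
    return a, b

# Synthetic data: each output rotates the input vector 90 degrees (multiply by 1j)
# and shifts it by the vector (1, 2).
xs = [complex(t, t * t * 0.1) for t in range(10)]
true_a, true_b = complex(1, 2), 1j
ys = [true_a + true_b * x for x in xs]
a, b = complex_ls(xs, ys)
print(abs(a - true_a) < 1e-9, abs(b - true_b) < 1e-9)
```

    Because the coefficient b is itself complex, it encodes both a scaling and a rotation, which is the property that scalar-coefficient multivariate regression cannot express.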

  9. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study concerns indicators of predictive power for the Generalized Linear Model (GLM), which are widely used but often subject to restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model, where the dependent variable is Poisson distributed. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional one in terms of bias and the Root Mean Square Error (RMSE).
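
    The traditional quantity the abstract builds on, the correlation between Y and the fitted E(Y|X) of a Poisson regression, can be computed as sketched below. This is an illustrative reconstruction of that baseline quantity only (the paper's proposed modification is not reproduced); the simulated data, sampler, and gradient-ascent fitter are assumptions:

```python
import math
import random

def fit_poisson(xs, ys, lr=0.01, steps=8000):
    """Poisson regression with log link, mu = exp(b0 + b1*x), fit by gradient ascent."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu
            g1 += (y - mu) * x
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

def rpois(lam):
    """Knuth's multiplicative Poisson sampler (adequate for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

random.seed(1)
xs = [random.uniform(0.0, 2.0) for _ in range(200)]
ys = [rpois(math.exp(0.2 + 0.8 * x)) for x in xs]

b0, b1 = fit_poisson(xs, ys)
fitted = [math.exp(b0 + b1 * x) for x in xs]
r = pearson(ys, fitted)  # regression correlation coefficient: corr(Y, fitted E(Y|X))
print(round(r, 3))
```

    The coefficient stays well below 1 even for a correctly specified model, because Poisson noise around E(Y|X) is irreducible; that inherent ceiling is one motivation for refining the measure.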

  10. Results from a multi aperture Fizeau interferometer ground testbed: demonstrator for a future space-based interferometer

    Science.gov (United States)

    Baccichet, Nicola; Caillat, Amandine; Rakotonimbahy, Eddy; Dohlen, Kjetil; Savini, Giorgio; Marcos, Michel

    2016-08-01

    In the framework of the European FP7-FISICA (Far Infrared Space Interferometer Critical Assessment) program, we developed a miniaturized version of the hyper-telescope to demonstrate multi-aperture interferometry on ground. This setup would be ultimately integrated into a CubeSat platform, therefore providing the first real demonstrator of a multi aperture Fizeau interferometer in space. In this paper, we describe the optical design of the ground testbed and the data processing pipeline implemented to reconstruct the object image from interferometric data. As a scientific application, we measured the Sun diameter by fitting a limb-darkening model to our data. Finally, we present the design of a CubeSat platform carrying this miniature Fizeau interferometer, which could be used to monitor the Sun diameter over a long in-orbit period.

  11. Results of a demonstration experiment: Hydrogenation of pyrolysis oils from biomass; Ergebnisse eines Demonstrationsversuchs zur Hydrierung von Pyrolyseoelen aus Biomassen

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, M [DMT-Gesellschaft fuer Forschung und Pruefung mbH, Essen (Germany)

    1998-09-01

    Sump phase hydrogenation is a technique specially developed for coal liquefaction; it provides a possibility of processing the liquid products of biomass pyrolysis into high-grade carburettor fuels. A demonstration experiment was carried out at the hydrogenation plant of DMT. The plant has a capacity of 10 kg/h. The technical feasibility of hydrogenation of biomass oils was demonstrated in a continuous experiment. The contribution describes the experimental conditions, yields, and product qualities. (orig.)

  12. Demonstration test results of organic materials' volumetric reduction using bio-ethanol, thermal decomposition and burning

    International Nuclear Information System (INIS)

    Tagawa, Akihiro; Watanabe, Masahisa

    2013-01-01

    To discover technologies that can be utilized for decontamination work and to verify their effects, economic feasibility, safety, and other factors, the Ministry of the Environment launched the 'FY2011 Decontamination Technology Demonstrations Project' to publicly solicit decontamination technologies for verification in demonstration tests; 22 candidates were adopted. JAEA was commissioned by the Ministry of the Environment to provide technical assistance for these demonstrations. This paper describes the volume reduction of organic materials by bio-ethanol production, thermal decomposition, and burning. The purpose of this study was to evaluate techniques that can serve as biomass energy sources while reducing the volume of contaminated organic matter generated by decontamination. An important point for volume reduction of contaminated organic matter is to evaluate the mass balance in the system, confirming where the radioactive material ends up and remains. Common to all technologies is ensuring that radioactive cesium is not released in exhaust gas or other effluents. In addition, the cost balance and energy balance were evaluated in order to understand the applicability of the volume reduction technologies to decontamination. Radioactive cesium remains in the carbides when organic materials are carbonized, and it does not transfer to bio-ethanol when organic materials are processed for bio-ethanol production. While plant operating costs are greater if radioactive materials must be treated, if income is expected from business such as power generation, depreciation may be calculated over approximately 15 years. (authors)

  13. Surgery for the correction of hallux valgus: minimum five-year results with a validated patient-reported outcome tool and regression analysis.

    Science.gov (United States)

    Chong, A; Nazarian, N; Chandrananth, J; Tacey, M; Shepherd, D; Tran, P

    2015-02-01

    This study sought to determine the medium-term patient-reported and radiographic outcomes in patients undergoing surgery for hallux valgus. A total of 118 patients (162 feet) underwent surgery for hallux valgus between January 2008 and June 2009. The Manchester-Oxford Foot Questionnaire (MOXFQ), a validated tool for the assessment of outcome after surgery for hallux valgus, was used and patient satisfaction was sought. The medical records and radiographs were reviewed retrospectively. At a mean of 5.2 years (4.7 to 6.0) post-operatively, the median combined MOXFQ score was 7.8 (IQR:0 to 32.8). The median domain scores for pain, walking/standing, and social interaction were 10 (IQR: 0 to 45), 0 (IQR: 0 to 32.1) and 6.3 (IQR: 0 to 25) respectively. A total of 119 procedures (73.9%, in 90 patients) were reported as satisfactory but only 53 feet (32.7%, in 43 patients) were completely asymptomatic. The mean (SD) correction of hallux valgus, intermetatarsal, and distal metatarsal articular angles was 18.5° (8.8°), 5.7° (3.3°), and 16.6° (8.8°), respectively. Multivariable regression analysis identified that an American Association of Anesthesiologists grade of >1 (Incident Rate Ratio (IRR) = 1.67, p-value = 0.011) and recurrent deformity (IRR = 1.77, p-value = 0.003) were associated with significantly worse MOXFQ scores. No correlation was found between the severity of deformity, the type, or degree of surgical correction and the outcome. When using a validated outcome score for the assessment of outcome after surgery for hallux valgus, the long-term results are worse than expected when compared with the short- and mid-term outcomes, with 25.9% of patients dissatisfied at a mean follow-up of 5.2 years. ©2015 The British Editorial Society of Bone & Joint Surgery.

  14. KEA-144: Final Results of the Ground Operations Demonstration Unit for Liquid Hydrogen (GODU-LH2) Project

    Science.gov (United States)

    Notardonato, William; Fesmire, James; Swanger, Adam; Jumper, Kevin; Johnson, Wesley; Tomsik, Thomas

    2017-01-01

    The GODU-LH2 system has successfully met all test objectives at the 33%, 67%, and 100% tank fill levels. Complete control over the state of the fluid has been demonstrated using Integrated Refrigeration and Storage (IRAS). Almost any desired point along the H2 saturation curve can essentially be "dialed in" and maintained indefinitely. The system can also be used to produce densified hydrogen in large quantities down to the triple point. Multiple technology infusion paths are being explored, including implementation of IRAS technology into a new LH2 sphere for EM-2 at LC39B. Technical interchange is also occurring with STMD, LSP, ULA, DoE, KIST, Kawasaki, Shell Oil, SpaceX, the US Coast Guard, and Virgin Galactic.

  15. and Multinomial Logistic Regression

    African Journals Online (AJOL)

    This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for classifying students based on their academic performance. The predictive accuracy for each model was measured by their average Classification Correct Rate (CCR).

  16. Regression: A Bibliography.

    Science.gov (United States)

    Pedrini, D. T.; Pedrini, Bonnie C.

    Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…

  17. The Catalonia World Health Organization demonstration project for palliative care implementation: quantitative and qualitative results at 20 years.

    Science.gov (United States)

    Gómez-Batiste, Xavier; Caja, Carmen; Espinosa, Jose; Bullich, Ingrid; Martínez-Muñoz, Marisa; Porta-Sales, Josep; Trelis, Jordi; Esperalba, Joaquim; Stjernsward, Jan

    2012-04-01

    Catalonia (Spain) has a total population of 7.3 million citizens for whom the National Health Service (NHS) provides health care that is free at the point of access. The prevalence of terminally ill patients is between 30,100 and 39,600. Twenty years ago, the World Health Organization (WHO), in collaboration with the Catalan Department of Health and the Catalan Institute of Oncology, began a demonstration project (WHO Demonstration Project) in palliative care (PC) with the aim of implementing specialist PC services, generating experience in this field, identifying areas for improvement, and introducing educative procedures (clinical and nonclinical). Over the past 20 years, 237 PC clinical services (72 home care support teams, 49 hospital support teams, 60 units with 742 dedicated beds, 50 outpatient clinics, and six psychosocial support teams) have been implemented. In the five years since the previous evaluation, 57 new clinical services (15 new hospital support teams, 36 outpatient clinics, and six psychosocial support teams among others) and four nonclinical services (education, research, WHO Collaborating Center, and planning) have been implemented. During the year 2010, a total of 46,200 processes were undertaken for the care of 23,100 patients, of whom 12,100 (52%) had cancer and 11,000 (48%) had other chronic advanced diseases. The overall yearly costs are around €52,568,000, with an overall savings of €69,300,000 (€2275 per patient, net savings to the NHS of €16,732,000). In the last five years, three qualitative evaluations and a benchmarking process have been performed to identify weak points and inequities in care provision among districts. Systematic assessments indicate high cost-effectiveness of care as well as high levels of satisfaction by patients and their relatives, thus reinforcing the principle that access to PC under the auspices of the NHS at the end of life is a basic human right. Copyright © 2012 U.S. Cancer Pain Relief Committee

  18. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  19. Results of Tank-Leak Detection Demonstration Using Geophysical Techniques at the Hanford Mock Tank Site-Fiscal Year 2001

    International Nuclear Information System (INIS)

    Barnett, D BRENT.; Gee, Glendon W.; Sweeney, Mark D.

    2002-01-01

    During July and August of 2001, Pacific Northwest National Laboratory (PNNL) hosted researchers from Lawrence Livermore and Lawrence Berkeley National laboratories, and a private contractor, HydroGEOPHYSICS, Inc., for deployment of the following five geophysical leak-detection technologies at the Hanford Site Mock Tank in a Tank Leak Detection Demonstration (TLDD): Electrical Resistivity Tomography (ERT); Cross-Borehole Electromagnetic Induction (CEMI); High-Resolution Resistivity (HRR); Cross-Borehole Radar (XBR); Cross-Borehole Seismic Tomography (XBS). Under a ''Tri-party Agreement'' with Federal and state regulators, the U.S. Department of Energy will remove wastes from single-shell tanks (SSTs) and other miscellaneous underground tanks for storage in the double-shell tank system. Waste retrieval methods are being considered that use very little, if any, liquid to dislodge, mobilize, and remove the wastes. As additional assurance of protection of the vadose zone beneath the SSTs, tank wastes and tank conditions may be aggressively monitored during retrieval operations by methods that are deployed outside the SSTs in the vadose zone.

  1. Estimating Engineering and Manufacturing Development Cost Risk Using Logistic and Multiple Regression

    National Research Council Canada - National Science Library

    Bielecki, John

    2003-01-01

    .... Previous research has demonstrated that using a two-step logistic and multiple regression methodology to predict cost growth produces desirable results versus traditional single-step regression...

  2. A school-based human papillomavirus vaccination program in barretos, Brazil: final results of a demonstrative study.

    Directory of Open Access Journals (Sweden)

    José Humberto Tavares Guerreiro Fregnani

    Full Text Available The implementation of a public HPV vaccination program in several developing countries, especially in Latin America, is a great challenge for health care specialists. The aim was to evaluate the uptake and the three-dose completion rates of a school-based HPV vaccination program in Barretos (Brazil). The study included girls who were enrolled in public and private schools and who regularly attended the sixth and seventh grades of elementary school (mean age: 11.9 years). A meeting with the parents or guardians occurred approximately one week before the vaccination in order to explain the project and clarify doubts. The quadrivalent vaccine was administered using the same schedule as in the product package (0-2-6 months). The school visits for regular vaccination occurred on previously scheduled dates. The vaccine was also made available at Barretos Cancer Hospital for the girls who could not be vaccinated on the day when the team visited the school. Among the potential candidates for vaccination (n = 1,574), the parents or guardians of 1,513 girls (96.1%) responded to the invitation to participate in the study. A total of 1,389 parents or guardians agreed to participate in the program (acceptance rate = 91.8%). The main reason for refusing to participate in the vaccination program was fear of adverse events. The vaccine uptake rates for the first, second, and third doses were 87.5%, 86.3% and 85.0%, respectively. The three-dose completion rate was 97.2%. This demonstrative study achieved high rates of vaccination uptake and completion of three vaccine doses in children 10-16 years old from Brazil. The feasibility and success of an HPV vaccination program for adolescents in a developing country may depend on the integration between the public health and schooling systems.

  3. GNSS Wave Glider: First results from Loch Ness and demonstration of its suitability for determining the marine geoid

    Science.gov (United States)

    Penna, N. T.; Morales Maqueda, M.; Williams, S. D.; Foden, P.; Martin, I.; Pugh, J.

    2013-12-01

    We report on a first deployment of a GNSS Wave Glider designed for precise, unmanned, autonomous, mobile self-propelled sea level and sea state measurement in the open ocean. The Wave Glider, equipped with a dual frequency GPS+GLONASS receiver, was deployed in Loch Ness, Scotland, autonomously travelling 32 km in a north-easterly direction along the length of the loch in 26 hours, propelled by energy generated from waves of typical amplitude only 100-150 mm and frequency on the order of 0.5-1 Hz. The Wave Glider GNSS data were analysed using a post-processed kinematic GPS+GLONASS precise point positioning (PPP) approach, with quality control provided by double-difference GPS kinematic processing with respect to onshore reference stations at either end of the loch. The PPP heights of the loch's surface revealed a clear geoid gradient of about 30 mm/km (i.e. just under 1 m over the whole length of the loch), very similar to both the EGM2008 and OSGM02 geoid models, demonstrating the potential use of a GNSS Wave Glider for marine geoid determination. After applying a low pass filter, the GNSS heights showed local deviations from both EGM2008 and OSGM02, potentially caused by omission errors or a lack of gravity data over Loch Ness. In addition to dual frequency GNSS data, the Wave Glider also recorded inclinometer data, bathymetry, and surface currents, which, in combination with tide gauge and wind data, were used to further control and interpret the GNSS time series.

  4. Agronomic and economic potential of sweet sorghum and Kenaf: Preliminary results of the California Industrial Crops Demonstration Program

    International Nuclear Information System (INIS)

    Shaffer, S.D.; Jenkins, B.M.; Brink, D.L.; Merriman, M.M.; Mouser, B.; Campbell, M.L.; Frate, C.; Schmierer, J.

    1992-01-01

    Sweet sorghum is proving to have excellent potential as a biomass energy crop for the production of fuel alcohol and/or electricity. Its advantages include high biomass and fermentables production per unit area of land, relatively low input requirements, and good suitability to a variety of California growing conditions. Average biomass yield for twelve projects involving nine growers and eight cultivars was 7.6 bone dry tons per acre (bdt/ac) (17 t/ha) at an average cost of production of $58/bdt ($64/t), ready for harvest. With an ethanol yield of 89 gal/bdt (371 L/t), feedstock costs would be about $0.65/gal ($0.17/L). Improved crop yields at reduced costs can be expected in the future. Kenaf is a potential paper pulp and fiber feedstock which produces a long bast fiber and a short-fiber core material. About 30% of the stem material is long fiber, and the remaining 70% is short fiber. The current cost of production, given demonstration project yields of 4 bdt/ac (9 t/ha), is about $222/bdt ($245/t), and available higher-value uses command prices of $300/bdt ($330/t) for long fiber for cordage and $160/bdt ($175/t) for core material as poultry litter, precluding its use directly as an energy feedstock. However, reusing the poultry litter core material for energy production may be economically feasible. This material may be obtained for about $15/bdt ($17/t), and with an ethanol yield of 34 gal/bdt (142 L/t), feedstock cost may be about $0.44/gal ($0.12/L).

  5. Some results from the demonstration of indoor radon reduction measures in block basement houses. Report for June 1985-February 1987

    International Nuclear Information System (INIS)

    Henschel, D.B.; Scott, A.G.

    1987-03-01

    The paper gives results of tests of active soil-ventilation techniques in 24 block-wall basement houses in eastern Pennsylvania having significantly elevated indoor radon concentrations, generally above 740 Bq/cu m. The results indicate that radon levels can be reduced substantially (often below the U.S. EPA guideline of 148 Bq/cu m) if effective suction can be drawn on the soil underneath the concrete slabs of these houses. Such effective suction appears achievable when either: (a) the house has a complete loop of drain tile around its footings for water drainage purposes, and suction is drawn on that loop; or (b) a sufficient number of suction pipes can be inserted at the proper locations into the crushed rock or soil underneath the slab

  6. Demonstration Results for the Phytoextraction of Lead-Contaminated Soil at the Twin Cities Army Ammunition Plant, Arden Hills, Minnesota

    Science.gov (United States)

    2000-07-01

    resulted in corresponding blood lead levels, USEPA has not developed a cancer slope factor and has focused on the non-carcinogenic effects. Lead...threshold value. USEPA's alternative approach to the use of cancer slope factors and RfDs to evaluate lead exposure is to consider the effect of...the sites, the grass in the 90- x 90-foot farm plots was eradicated with an application of Roundup™ (glyphosate) [Figure 4-3]. These activities were

  7. SEPARATION PHENOMENA LOGISTIC REGRESSION

    Directory of Open Access Journals (Sweden)

    Ikaro Daniel de Carvalho Barreto

    2014-03-01

    Full Text Available This paper applies concepts from maximum likelihood estimation of the binomial logistic regression model to the separation phenomenon. Separation generates bias in the estimation, leads to different interpretations of the estimates under the different statistical tests (Wald, Likelihood Ratio, and Score), and yields different estimates under different iterative methods (Newton-Raphson and Fisher Scoring). The paper also presents an example that demonstrates the direct implications for the validation of the model and of its variables, and for the estimates of odds ratios and confidence intervals generated from the Wald statistics. Furthermore, we briefly present the Firth correction for circumventing the separation phenomenon.
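
    The separation problem described above is easy to reproduce. The following minimal numpy sketch (illustrative data, not from the paper) fits a one-parameter logistic model to perfectly separated data by gradient ascent; because the likelihood has no finite maximizer, the coefficient simply keeps growing with more iterations:

```python
import numpy as np

# Perfectly separated data: x < 0 always gives y = 0, x > 0 always gives y = 1.
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 1.0, 1.0])

def fit_logistic(n_iter, lr=0.5):
    """Gradient ascent on the Bernoulli log-likelihood for p = sigmoid(b * x)."""
    b = 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-b * x))
        b += lr * np.sum((y - p) * x)  # score (gradient) of the log-likelihood
    return b

b_short, b_long = fit_logistic(100), fit_logistic(10000)
print(b_short, b_long)  # the MLE does not exist: b grows without bound
```

This is the behavior the Firth penalty is designed to repair: the penalized likelihood regains a finite maximizer even under complete separation.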

  8. Results of the German alternative fuel cycle evaluation and further efforts geared toward demonstration of direct disposal

    International Nuclear Information System (INIS)

    Papp, R.; Closs, K.D.

    1986-01-01

    In a comparative study initiated by the German Federal Ministry for Research and Technology, which was carried out by Karlsruhe Nuclear Research Center in the period from 1981 to 1985, direct disposal of spent fuel was contrasted with the traditional fuel cycle with reprocessing and recycle. The results of the study did not exhibit decisive advantages of direct disposal over fuel reprocessing. Due to this fact and the legal requirements of the German Atomic Energy Act, the cabinet concluded to continue to adhere to fuel reprocessing as the preferred version of ''Entsorgung''. But the door was left ajar for the direct disposal alternative, which, under present atomic law, is permissible for fuel for which reprocessing is neither technically feasible nor economically justified. An ambitious program has been launched in the Federal Republic of Germany (FRG), geared to bring direct disposal to a point of technical maturity.

  9. Impact of performance grading on annual numbers of acute myocardial infarction-associated emergency department visits in Taiwan: Results of segmented regression analysis.

    Science.gov (United States)

    Tzeng, I-Shiang; Liu, Su-Hsun; Chen, Kuan-Fu; Wu, Chin-Chieh; Chen, Jih-Chang

    2016-10-01

    To reduce patient boarding time at the emergency department (ED) and to improve the overall quality of the emergent care system in Taiwan, the Ministry of Health and Welfare of Taiwan (MOHW) piloted the Grading Responsible Hospitals for Acute Care (GRHAC) audit program in 2007-2009. The aim of the study was to evaluate the impact of the GRHAC audit program on the identification and management of acute myocardial infarction (AMI)-associated ED visits by describing and comparing the incidence of AMI-associated ED visits before (2003-2007), during (2007-2009), and after (2009-2012) the initial audit program implementation. Using aggregated data from the MOHW of Taiwan, we estimated the annual incidence of AMI-associated ED visits by Poisson regression models. We used segmented regression techniques to evaluate differences in the annual rates and in the year-to-year changes in AMI-associated ED visits between 2003 and 2012. Medical comorbidities such as diabetes mellitus, hyperlipidemia, and hypertensive disease were considered as potential confounders. Overall, the number of AMI-associated patient visits increased from 8130 visits in 2003 to 12,695 visits in 2012 (P-value for trend …), suggesting an improved capacity for timely and correct diagnosis and management of patients presenting with AMI-associated symptoms or signs at the ED.
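
    Segmented (interrupted time-series) regression of the kind used in this study can be sketched with a design matrix containing an intercept, a time trend, a post-intervention level shift, and a post-intervention slope change. The data below are hypothetical annual counts, not the study's AMI figures:

```python
import numpy as np

# Hypothetical annual visit counts with a level shift (+20) and a slope
# change (+3/year) after an intervention at year index 5.
years = np.arange(10)
post = (years >= 5).astype(float)
counts = 100 + 5 * years + 20 * post + 3 * post * (years - 5)

# Segmented-regression design: intercept, time, post-intervention level
# shift, and post-intervention change in slope.
X = np.column_stack([np.ones(10),
                     years.astype(float),
                     post,
                     post * (years - 5)])
beta, *_ = np.linalg.lstsq(X, counts.astype(float), rcond=None)
print(np.round(beta, 6))  # recovers [100, 5, 20, 3] on this noiseless example
```

The study fits the same kind of design within Poisson regression models for counts; the OLS version above only illustrates how the pre/during/post contrasts enter the model.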

  10. Porous reactive wall for prevention of acid mine drainage: Results of a full-scale field demonstration

    International Nuclear Information System (INIS)

    Benner, S.G.; Blowes, D.W.; Ptacek, C.J.

    1997-01-01

    A porous reactive wall was installed in August 1995 to treat mine drainage flowing within an aquifer at the Nickel Rim mine site, near Sudbury, Ontario. The reactive mixture was designed to maximize removal of metals and acid-generating capacity from the groundwater by enhancing sulfate reduction and metal sulfide precipitation. The installed structure, composed of a mixed organic substrate, is 15 meters long and 3.6 meters deep, and the flow path length (wall width) is 4 meters. Results of sampling nine months after installation indicate that sulfate reduction and metal sulfide precipitation are occurring. Comparing the chemistry of water entering the wall to treated water exiting the wall nine months after installation: SO4 concentrations decreased by >50% (from 2400-4800 mg/L to 60-3600 mg/L), Fe concentrations decreased by >95% (from 260-1300 mg/L to 1.0-40 mg/L), pH increased from 5.8 to 7.0, and alkalinity increased from 0-60 mg/L to 700-3200 mg/L as CaCO3. After passing through the reactive wall, the net acid-generating potential of the aquifer water was converted from acid producing to acid consuming.

  11. Bayesian ARTMAP for regression.

    Science.gov (United States)

    Sasu, L M; Andonie, R

    2013-10-01

    Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA has been used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Results of a demonstration project

    International Nuclear Information System (INIS)

    Poli, F. De

    1992-01-01

    A consortial (cooperative) plant for the treatment of pig wastes was built in the Marsciano area (Perugia, Italy). The system was designed to recover all the available resources, both in terms of energy and in terms of materials. It is composed of an anaerobic digestion system (CSTR reactor, 15,000 m3 working volume) producing over 20,000 m3 of biogas/day, and a number of biogas-operated devices (generators of electricity and heat, hay and crop dryer, tobacco dryer, greenhouses) that allow all the biogas to be utilized profitably. The solid fraction of the digested effluent is sold as fertilizer; the liquid is stored in lagoons for 9 months and utilized during summer for fertilizing irrigation. The environmental problems of the area, both water pollution and odour emission, were completely solved at a low cost to the farmers, and the Municipality is likely to allow the realization of new pig farms. (au)

  13. Results of the 1998 Field Demonstration and Preliminary Implementation Guidance for Phytoremediation of Lead-Contaminated Soil at the Twin Cities Army Ammunition Plant, Arden Hills, Minnesota

    National Research Council Canada - National Science Library

    Behel, A

    1999-01-01

    This report describes the first-year (1998) results of a two-year field demonstration conducted to determine if phytoextraction is a viable and feasible technology for remediation of metals (specifically lead) in soil...

  14. Fungible weights in logistic regression.

    Science.gov (United States)

    Jones, Jeff A; Waller, Niels G

    2016-06-01

    In this article we develop methods for assessing parameter sensitivity in logistic regression models. To set the stage for this work, we first review Waller's (2008) equations for computing fungible weights in linear regression. Next, we describe 2 methods for computing fungible weights in logistic regression. To demonstrate the utility of these methods, we compute fungible logistic regression weights using data from the Centers for Disease Control and Prevention's (2010) Youth Risk Behavior Surveillance Survey, and we illustrate how these alternate weights can be used to evaluate parameter sensitivity. To make our work accessible to the research community, we provide R code (R Core Team, 2015) that will generate both kinds of fungible logistic regression weights. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
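
    Waller's closed-form equations are not reproduced here, but the idea of fungible weights, alternative weight vectors that predict the criterion nearly as well as the OLS weights, can be illustrated numerically. The sketch below (simulated data, and a random search rather than the analytic solution) keeps perturbed weight vectors whose model-implied correlation stays within 2% of the OLS value and reports how widely they differ:

```python
import numpy as np

rng = np.random.default_rng(42)
n, p = 500, 4
X = rng.normal(size=(n, p))
b_true = np.array([1.0, 0.8, 0.5, 0.2])
y = X @ b_true + rng.normal(scale=2.0, size=n)

b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
r_ols = np.corrcoef(X @ b_ols, y)[0, 1]

# Keep randomly perturbed weight vectors whose model-implied correlation
# is within 2% of the OLS value -- "fungible" alternatives in spirit.
kept = []
for _ in range(5000):
    w = b_ols + rng.normal(scale=0.2, size=p)
    if np.corrcoef(X @ w, y)[0, 1] >= 0.98 * r_ols:
        kept.append(w)
kept = np.array(kept)

spread = kept.max(axis=0) - kept.min(axis=0)  # per-coefficient range
print(len(kept), np.round(spread, 2))
```

Even with an almost negligible loss in predictive correlation, many visibly different weight vectors survive, which is exactly why parameter-sensitivity analyses of this kind are informative.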

  15. Better Autologistic Regression

    Directory of Open Access Journals (Sweden)

    Mark A. Wolters

    2017-11-01

    Full Text Available Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding, that is, the two numbers used to represent the two possible states of the variables, might differ. Common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.
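
    The paper's central point, that changing the variable coding produces a genuinely different probability model, can be checked directly on a tiny example. The sketch below enumerates the joint distribution of a two-node autologistic model under (0, 1) and (-1, +1) coding with identical parameter values (illustrative parameters, not from the paper):

```python
import numpy as np
from itertools import product

def joint(coding, alpha=0.5, lam=1.0):
    """Joint pmf of a 2-node autologistic model P(z) ~ exp(alpha*(z1+z2) + lam*z1*z2)."""
    states = list(product(coding, repeat=2))
    w = np.array([np.exp(alpha * (z1 + z2) + lam * z1 * z2)
                  for z1, z2 in states])
    return w / w.sum()   # state order: (lo,lo), (lo,hi), (hi,lo), (hi,hi)

p01 = joint((0.0, 1.0))       # (zero, one) coding
ppm = joint((-1.0, 1.0))      # (minus one, plus one) coding
print(np.round(p01, 3))
print(np.round(ppm, 3))  # same parameters, visibly different distributions
```

Because the two pmfs over the four (low/high, low/high) outcomes differ, no relabeling of parameters makes the codings equivalent, which is the non-nestedness the paper formalizes.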

  16. Reduced Rank Regression

    DEFF Research Database (Denmark)

    Johansen, Søren

    2008-01-01

    The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...
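
    A standard way to compute the reduced rank regression estimator (with identity weighting) is to project the OLS fit onto its leading singular directions, which is where the connection to canonical correlations enters. A minimal numpy sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 200, 6, 5, 2
# Coefficient matrix with true rank r.
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))
X = rng.normal(size=(n, p))
Y = X @ B_true + 0.1 * rng.normal(size=(n, q))

# Step 1: unrestricted ordinary least squares.
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Step 2: project the fitted values' coefficient matrix onto the top-r
# right singular directions of the OLS fit (identity-weighted RRR).
_, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
P = Vt[:r].T @ Vt[:r]            # rank-r projection in response space
B_rrr = B_ols @ P

print(np.linalg.matrix_rank(B_rrr))  # 2
```

On this low-noise example the rank-2 estimate lands close to the true coefficient matrix while using far fewer effective parameters than the unrestricted OLS fit.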

  17. Used Nuclear Fuel Loading and Structural Performance Under Normal Conditions of Transport- Demonstration of Approach and Results on Used Fuel Performance Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Adkins, Harold [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Geelhood, Ken [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Koeppel, Brian [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bignell, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flores, Gregg [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wang, Jy-An [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sanborn, Scott [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Spears, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States); Klymyshyn, Nick [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-30

    This document addresses Oak Ridge National Laboratory milestone M2FT-13OR0822015, Demonstration of Approach and Results on Used Nuclear Fuel Performance Characterization. This report provides results of the initial demonstration of the modeling capability developed to perform preliminary deterministic evaluations of moderate-to-high-burnup used nuclear fuel (UNF) mechanical performance under normal conditions of storage (NCS) and normal conditions of transport (NCT). This report also provides results from the sensitivity studies that have been performed. Finally, discussion of the long-term goals and objectives of this initiative is provided.

  18. Advantages and Limitations of Anticipating Laboratory Test Results from Regression- and Tree-Based Rules Derived from Electronic Health-Record Data

    OpenAIRE

    Mohammad, Fahim; Theisen-Toupal, Jesse C.; Arnaout, Ramy

    2014-01-01

    Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to thei...

  19. Discriminative Elastic-Net Regularized Linear Regression.

    Science.gov (United States)

    Zhang, Zheng; Lai, Zhihui; Xu, Yong; Shao, Ling; Wu, Jian; Xie, Guo-Sen

    2017-03-01

    In this paper, we aim at learning compact and discriminative linear regression models. Linear regression has been widely used in different problems. However, most of the existing linear regression methods exploit the conventional zero-one matrix as the regression targets, which greatly narrows the flexibility of the regression model. Another major limitation of these methods is that the learned projection matrix fails to precisely project the image features to the target space due to their weak discriminative capability. To this end, we present an elastic-net regularized linear regression (ENLR) framework, and develop two robust linear regression models which possess the following special characteristics. First, our methods exploit two particular strategies to enlarge the margins of different classes by relaxing the strict binary targets into a more feasible variable matrix. Second, a robust elastic-net regularization of singular values is introduced to enhance the compactness and effectiveness of the learned projection matrix. Third, the resulting optimization problem of ENLR has a closed-form solution in each iteration, which can be solved efficiently. Finally, rather than directly exploiting the projection matrix for recognition, our methods employ the transformed features as the new discriminative representations to make final image classification. Compared with the traditional linear regression model and some of its variants, our method is much more accurate in image classification. Extensive experiments conducted on publicly available data sets well demonstrate that the proposed framework can outperform the state-of-the-art methods. The MATLAB codes of our methods are available at http://www.yongxu.org/lunwen.html.
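
    The ENLR models themselves are not reproduced here, but the elastic-net penalty at their core can be illustrated with a generic coordinate-descent solver for elastic-net penalized least squares (simulated data; `lam` and `alpha` are illustrative choices, not values from the paper):

```python
import numpy as np

def elastic_net(X, y, lam=0.1, alpha=0.5, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*(alpha*||b||_1
    + 0.5*(1 - alpha)*||b||_2^2), with roughly unit-variance columns."""
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]        # partial residual
            rho = X[:, j] @ r / n
            # Soft-thresholding update: L1 part thresholds, L2 part shrinks.
            b[j] = np.sign(rho) * max(abs(rho) - lam * alpha, 0.0) / (
                col_ss[j] + lam * (1 - alpha))
    return b

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))
b_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
y = X @ b_true + rng.normal(scale=0.5, size=300)
b_hat = elastic_net(X, y)
print(np.round(b_hat, 2))  # large true effects kept, noise terms shrunk toward 0
```

The combination of sparsity (from the L1 term) and grouped shrinkage (from the L2 term) is what the paper applies, in singular-value form, to the projection matrix.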

  20. Advantages and limitations of anticipating laboratory test results from regression- and tree-based rules derived from electronic health-record data.

    Directory of Open Access Journals (Sweden)

    Fahim Mohammad

    Full Text Available Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to their positive and negative predictive value (PPV and NPV) and area under the receiver-operator characteristic curve (ROC AUCs). Using a stringent cutoff of PPV and/or NPV≥0.95, standard techniques yield few rules for sendout tests but several for in-house tests, mostly for repeat laboratory tests that are part of the complete blood count and basic metabolic panel. Most rules were clinically and pathophysiologically plausible, and several seemed clinically useful for informing pre-test probability of a given result. But overall, rules were unlikely to be able to function as a general substitute for actually ordering a test. Improving laboratory utilization will likely require different input data and/or alternative methods.

  1. Advantages and limitations of anticipating laboratory test results from regression- and tree-based rules derived from electronic health-record data.

    Science.gov (United States)

    Mohammad, Fahim; Theisen-Toupal, Jesse C; Arnaout, Ramy

    2014-01-01

    Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to their positive and negative predictive value (PPV and NPV) and area under the receiver-operator characteristic curve (ROC AUCs). Using a stringent cutoff of PPV and/or NPV≥0.95, standard techniques yield few rules for sendout tests but several for in-house tests, mostly for repeat laboratory tests that are part of the complete blood count and basic metabolic panel. Most rules were clinically and pathophysiologically plausible, and several seemed clinically useful for informing pre-test probability of a given result. But overall, rules were unlikely to be able to function as a general substitute for actually ordering a test. Improving laboratory utilization will likely require different input data and/or alternative methods.

  2. Predicting Engineering Student Attrition Risk Using a Probabilistic Neural Network and Comparing Results with a Backpropagation Neural Network and Logistic Regression

    Science.gov (United States)

    Mason, Cindi; Twomey, Janet; Wright, David; Whitman, Lawrence

    2018-01-01

    As the need for engineers continues to increase, a growing focus has been placed on recruiting students into the field of engineering and retaining the students who select engineering as their field of study. As a result of this concentration on student retention, numerous studies have been conducted to identify, understand, and confirm…

  3. Retro-regression--another important multivariate regression improvement.

    Science.gov (United States)

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.
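
    The MRA "nightmare of the first kind" is easy to demonstrate: with two nearly collinear descriptors, adding the second to a regression sharply changes the coefficient of the first. A sketch on simulated data (not the nonane boiling-point data used in the paper):

```python
import numpy as np

# Two highly correlated descriptors: including the second one in the
# regression drastically shifts the coefficient of the first.
rng = np.random.default_rng(7)
n = 1000
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)   # nearly collinear with x1
y = x1 + x2 + rng.normal(scale=0.5, size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_one = ols(np.column_stack([np.ones(n), x1]), y)        # y ~ x1
b_two = ols(np.column_stack([np.ones(n), x1, x2]), y)    # y ~ x1 + x2
print(b_one[1], b_two[1])  # coefficient of x1 shifts sharply (about 2 vs about 1)
```

An orthogonalized (ordered) basis of the kind the paper advocates removes exactly this instability: each descriptor's coefficient no longer depends on which other descriptors enter the equation.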

  4. Benefits from flywheel energy storage for area regulation in California - demonstration results : a study for the DOE Energy Storage Systems program.

    Energy Technology Data Exchange (ETDEWEB)

    Eyer, James M. (Distributed Utility Associates, Livermore, CA)

    2009-10-01

    This report documents a high-level analysis of the benefit and cost of flywheel energy storage used to provide area regulation for the electricity supply and transmission system in California. Area regulation is an 'ancillary service' needed for a reliable and stable regional electricity grid. The analysis was based on results from a demonstration, in California, of flywheel energy storage developed by Beacon Power Corporation (the system's manufacturer). The demonstration showed the flywheel storage system's ability to provide 'rapid-response' regulation. Flywheel storage output can be varied much more rapidly than the output from conventional regulation sources, making flywheels more attractive than conventional regulation resources. The performance of the flywheel storage system demonstrated was generally consistent with requirements for a possible new class of regulation resources - 'rapid-response' energy-storage-based regulation - in California. In short, it was demonstrated that Beacon Power Corporation's flywheel system follows a rapidly changing control signal (the ACE, which changes every four seconds). Based on the results and on expected plant cost and performance, the Beacon Power flywheel storage system has a good chance of being a financially viable regulation resource. Results indicate a benefit/cost ratio of 1.5 to 1.8 using what may be somewhat conservative assumptions. A benefit/cost ratio of one indicates that, based on the financial assumptions used, the investment's financial returns just meet the investor's target.

  5. Quantile Regression Methods

    DEFF Research Database (Denmark)

    Fitzenberger, Bernd; Wilke, Ralf Andreas

    2015-01-01

    Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter only focuses on one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression, which includes an illustrative application from empirical labor market research. This is followed by briefly sketching the underlying statistical model for linear quantile regression.
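
    The defining feature of quantile regression, that slopes may differ across quantiles, can be shown with a brute-force fit minimizing the pinball (check) loss on heteroscedastic simulated data; production code would use a linear-programming or specialized solver instead of this illustrative grid search:

```python
import numpy as np

def pinball(res, tau):
    """Check loss: tau-weighted for positive residuals, (tau-1) for negative."""
    return np.where(res >= 0, tau * res, (tau - 1) * res).sum()

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0, 1 + 0.3 * x)  # noise grows with x

def fit_quantile(x, y, tau, a_grid, b_grid):
    """Grid search for the line a + b*x minimizing the tau-pinball loss."""
    best = (np.inf, 0.0, 0.0)
    for a in a_grid:
        for b in b_grid:
            loss = pinball(y - a - b * x, tau)
            if loss < best[0]:
                best = (loss, a, b)
    return best[1], best[2]

a_grid = np.linspace(-2, 6, 81)
b_grid = np.linspace(-0.5, 1.5, 81)
a10, b10 = fit_quantile(x, y, 0.1, a_grid, b_grid)
a90, b90 = fit_quantile(x, y, 0.9, a_grid, b_grid)
print(b10, b90)  # under heteroscedasticity the 0.9-quantile slope exceeds the 0.1 slope
```

A mean regression on the same data would report a single slope near 0.5 and miss the fanning-out of the conditional distribution entirely.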

  6. Aid and growth regressions

    DEFF Research Database (Denmark)

    Hansen, Henrik; Tarp, Finn

    2001-01-01

    This paper examines the relationship between foreign aid and growth in real GDP per capita as it emerges from simple augmentations of popular cross country growth specifications. It is shown that aid in all likelihood increases the growth rate, and this result is not conditional on ‘good’ policy. … investment. We conclude by stressing the need for more theoretical work before this kind of cross-country regressions are used for policy purposes.

  7. Spontaneous regression of a congenital melanocytic nevus

    Directory of Open Access Journals (Sweden)

    Amiya Kumar Nath

    2011-01-01

    Full Text Available Congenital melanocytic nevus (CMN may rarely regress which may also be associated with a halo or vitiligo. We describe a 10-year-old girl who presented with CMN on the left leg since birth, which recently started to regress spontaneously with associated depigmentation in the lesion and at a distant site. Dermoscopy performed at different sites of the regressing lesion demonstrated loss of epidermal pigments first followed by loss of dermal pigments. Histopathology and Masson-Fontana stain demonstrated lymphocytic infiltration and loss of pigment production in the regressing area. Immunohistochemistry staining (S100 and HMB-45, however, showed that nevus cells were present in the regressing areas.

  8. Canonical variate regression.

    Science.gov (United States)

    Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun

    2016-07-01

    In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an [Formula: see text] intercross mice study and an alcohol dependence study. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Tumor regression patterns in retinoblastoma

    International Nuclear Information System (INIS)

    Zafar, S.N.; Siddique, S.N.; Zaheer, N.

    2016-01-01

    To observe the types of tumor regression after treatment, and identify the common pattern of regression in our patients. Study Design: Descriptive study. Place and Duration of Study: Department of Pediatric Ophthalmology and Strabismus, Al-Shifa Trust Eye Hospital, Rawalpindi, Pakistan, from October 2011 to October 2014. Methodology: Children with unilateral and bilateral retinoblastoma were included in the study. Patients were referred to Pakistan Institute of Medical Sciences, Islamabad, for chemotherapy. After every cycle of chemotherapy, a dilated fundus examination under anesthesia was performed to record the response to treatment. Regression patterns were recorded on RetCam II. Results: Seventy-four tumors were included in the study. Out of 74 tumors, 3 were ICRB group A tumors, 43 were ICRB group B tumors, 14 tumors belonged to ICRB group C, and the remaining 14 were ICRB group D tumors. Type IV regression was seen in 39.1% (n=29) tumors, type II in 29.7% (n=22), type III in 25.6% (n=19), and type I in 5.4% (n=4). All group A tumors (100%) showed type IV regression. Seventeen (39.5%) group B tumors showed type IV regression. In group C, 5 tumors (35.7%) showed type II regression and 5 tumors (35.7%) showed type IV regression. In group D, 6 tumors (42.9%) regressed to type II non-calcified remnants. Conclusion: The response and success of the focal and systemic treatment, as judged by the appearance of different patterns of tumor regression, varies with the ICRB grouping of the tumor. (author)

  10. Short-term load forecasting with increment regression tree

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jingfei; Stenzel, Juergen [Darmstadt University of Techonology, Darmstadt 64283 (Germany)

    2006-06-15

    This paper presents a new regression tree method for short-term load forecasting. Both increment and non-increment trees are built according to the historical data to provide the data space partition and input variable selection. A support vector machine is employed on the samples of regression tree nodes for further fine regression. Results of different tree nodes are integrated through a weighted average method to obtain the comprehensive forecasting result. The effectiveness of the proposed method is demonstrated through its application to an actual system. (author)
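
    The paper's scheme, partitioning the data with a tree and then fitting a finer regression model within each node, can be caricatured in a few lines. The sketch below uses a single SSE-minimizing split and a per-leaf linear fit standing in for the support vector machine stage (a simulated kinked "load curve", not the authors' system data):

```python
import numpy as np

def best_split(x, y):
    """Find the threshold on x that minimizes total within-leaf squared error."""
    best = (np.inf, None)
    for t in np.unique(x)[1:]:
        left, right = y[x < t], y[x >= t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, t)
    return best[1]

def fit_leaf(x, y):
    """Per-leaf linear model, standing in for the SVM fine-regression stage."""
    A = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(A, y, rcond=None)[0]

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 400)
y = np.where(x < 5, 2 * x, 30 - x) + 0.2 * rng.normal(size=400)  # kinked "load curve"

t = best_split(x, y)
coefs = {side: fit_leaf(x[m], y[m])
         for side, m in [("left", x < t), ("right", x >= t)]}

def predict(xq):
    a, b = coefs["left"] if xq < t else coefs["right"]
    return a + b * xq

print(round(t, 1), round(predict(2.0), 1), round(predict(8.0), 1))
```

The tree finds the regime change near x = 5 and each leaf's linear model then captures the local slope, which a single global linear fit could not.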

  11. Introduction to regression graphics

    CERN Document Server

    Cook, R Dennis

    2009-01-01

    Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques like plot rotation. The authors have composed their own regression code, using Xlisp-Stat language called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is ava

  12. Alternative Methods of Regression

    CERN Document Server

    Birkes, David

    2011-01-01

    Of related interest. Nonlinear Regression Analysis and its Applications Douglas M. Bates and Donald G. Watts ".an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models.highly recommend[ed].for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data s

  13. Correlation and simple linear regression.

    Science.gov (United States)

    Zou, Kelly H; Tuncali, Kemal; Silverman, Stuart G

    2003-06-01

    In this tutorial article, the concepts of correlation and regression are reviewed and demonstrated. The authors review and compare two correlation coefficients, the Pearson correlation coefficient and the Spearman rho, for measuring linear and nonlinear relationships between two continuous variables. In the case of measuring the linear relationship between a predictor and an outcome variable, simple linear regression analysis is conducted. These statistical concepts are illustrated by using a data set from published literature to assess a computed tomography-guided interventional technique. These statistical methods are important for exploring the relationships between variables and can be applied to many radiologic studies.
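
    The contrast between the two coefficients reviewed in the tutorial can be demonstrated on simulated data: for a monotone but nonlinear relationship, the Spearman rho (a rank correlation) exceeds the Pearson coefficient, while simple linear regression still summarizes the overall trend:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.uniform(0, 1, 100)
y = np.exp(3 * x) + 0.1 * rng.normal(size=100)   # monotone but strongly nonlinear

# Pearson r measures linear association only.
pearson = np.corrcoef(x, y)[0, 1]

# Spearman rho is the Pearson correlation of the ranks, so it captures
# any monotone relationship, linear or not.
def rank(v):
    return np.argsort(np.argsort(v)).astype(float)

spearman = np.corrcoef(rank(x), rank(y))[0, 1]

# Simple linear regression of y on x via least squares.
slope, intercept = np.polyfit(x, y, 1)
print(round(pearson, 2), round(spearman, 2))
```

Here Spearman's rho is close to 1 while Pearson's r is noticeably lower, the classic signature of a monotone nonlinear association.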

  14. IMPROVING LEARNING PROCESS AND STUDENT LEARNING RESULTS IN MOTORCYCLE TUNE-UP USING THE DEMONSTRATION METHOD IN CLASS XI SMA N 1 PLAYEN, ACADEMIC YEAR 2013/2014

    Directory of Open Access Journals (Sweden)

    Haryono Haryono

    2014-12-01

    Full Text Available This classroom action research (PTK) aims to improve the learning process and learning outcomes for motorcycle tune-up instruction in class XI of SMA Negeri 1 Playen, using the demonstration method. The study was carried out over three cycles, each consisting of planning, implementation (actuating), observation (observing), and reflection (reflecting). Data were collected through observation of the student learning process, test data (pre-test and post-tests I, II, and III), and supporting documentation. The observation data on positive student learning activity and the test data were then analyzed for comparison. The indicator of success was an increase in positive student learning from the pre-cycle stage through cycles I, II, and III. The results show that the demonstration method improved positive student learning, from 30% in the first cycle to 50% in the second and 80% in the third. Learning also became more effective, with students adapting more quickly to positive activities, particularly asking questions, taking notes, and working on problems. The demonstration method likewise improved learning outcomes, as evidenced by an increase in average final test scores from 64.09 in the first cycle to 77.82 in the second and 78.86 in the third. The increase in positive student learning thus also improved student learning outcomes.

  15. Abstract Expression Grammar Symbolic Regression

    Science.gov (United States)

    Korns, Michael F.

    This chapter examines the use of Abstract Expression Grammars to perform the entire Symbolic Regression process without the use of Genetic Programming per se. The techniques explored produce a symbolic regression engine that has no bloat, allows total user control of the search space and output formulas, and is faster and more accurate than the engines produced in our previous papers using Genetic Programming. The genome is an all-vector structure with four chromosomes plus additional epigenetic and constraint vectors, allowing total user control of the search space and the final output formulas. A combination of specialized compiler techniques, genetic algorithms, particle swarm, age-layered populations, plus discrete and continuous differential evolution is used to produce an improved symbolic regression system. Nine base test cases from the literature are used to test the improvement in speed and accuracy. The improved results indicate that these techniques move us a big step closer toward future industrial-strength symbolic regression systems.

  16. Regression analysis with categorized regression calibrated exposure: some interesting findings

    Directory of Open Access Journals (Sweden)

    Hjartåker Anette

    2006-07-01

    Full Text Available Abstract Background Regression calibration as a method for handling measurement error is becoming increasingly well known and used in epidemiologic research. However, the standard version of the method is not appropriate for exposure analyzed on a categorical (e.g. quintile) scale, an approach commonly used in epidemiologic studies. A tempting solution could then be to use the predicted continuous exposure obtained through the regression calibration method and treat it as an approximation to the true exposure, that is, include the categorized calibrated exposure in the main regression analysis. Methods We use semi-analytical calculations and simulations to evaluate the performance of the proposed approach compared to the naive approach of not correcting for measurement error, in situations where analyses are performed on a quintile scale and when incorporating the original scale into the categorical variables, respectively. We also present analyses of real data, containing measures of folate intake and depression, from the Norwegian Women and Cancer study (NOWAC). Results In cases where extra information is available through replicated measurements and not validation data, regression calibration does not maintain important qualities of the true exposure distribution; thus estimates of variance and percentiles can be severely biased. We show that the outlined approach maintains much, in some cases all, of the misclassification found in the observed exposure. For that reason, regression analysis with the corrected variable included on a categorical scale is still biased. In some cases the corrected estimates are analytically equal to those obtained by the naive approach. Regression calibration is however vastly superior to the naive method when applying the medians of each category in the analysis. Conclusion Regression calibration in its most well-known form is not appropriate for measurement error correction when the exposure is analyzed on a
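    The attenuation that regression calibration corrects, and the rescaling step itself, can be illustrated with a small simulation (hypothetical data; the reliability ratio is treated as known by construction here, whereas in the NOWAC setting it would be estimated from replicate measurements):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5000
x = rng.normal(size=n)            # true exposure (unobserved in practice)
w = x + rng.normal(size=n)        # error-prone measurement of x
y = 1.0 * x + rng.normal(size=n)  # outcome generated from the true exposure

naive = np.polyfit(w, y, 1)[0]    # slope attenuated toward zero
lam = 0.5                         # reliability ratio var(x)/var(w), known by construction
calibrated = np.polyfit(lam * w, y, 1)[0]  # regression calibration rescales the exposure
# Note: categorizing lam*w into quintiles preserves the ranking of w, and hence
# its misclassification -- the point made in the abstract.
```

    The calibrated slope recovers the true unit effect, but the quintile ranks of the calibrated exposure are identical to those of the raw measurement, which is why categorized analyses remain biased.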

  17. Boosted beta regression.

    Directory of Open Access Journals (Sweden)

    Matthias Schmid

    Full Text Available Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fitting a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.
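    The boosting machinery of the paper is not reproduced here, but the classical maximum-likelihood beta regression it builds on can be sketched directly (synthetic data; a logit link for the mean and a constant precision φ are assumptions of this sketch, not details from the paper):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
beta_true, phi_true = np.array([0.2, 0.8]), 30.0
mu = 1.0 / (1.0 + np.exp(-X @ beta_true))           # logit link for the mean
y = rng.beta(mu * phi_true, (1.0 - mu) * phi_true)  # bounded response in (0,1)

def nll(params):
    """Negative log-likelihood of the beta regression model."""
    b, phi = params[:2], np.exp(params[2])  # log-parametrize precision to keep it positive
    m = 1.0 / (1.0 + np.exp(-X @ b))
    return -np.sum(stats.beta.logpdf(y, m * phi, (1.0 - m) * phi))

res = optimize.minimize(nll, x0=np.zeros(3), method="BFGS")
b_hat = res.x[:2]
```

    The boosted variant of the paper replaces this single joint optimization with component-wise gradient boosting, which is what allows variable selection and nonlinear effects for both the mean and the precision.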

  18. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-

  19. Understanding poisson regression.

    Science.gov (United States)

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
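    The Poisson model described above can be fitted without specialist software; a minimal Newton-Raphson fit with a Pearson check for the overdispersion that motivates the negative binomial alternative (synthetic counts, not the ENSPIRE data) might look like:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ beta_true))  # count outcome with log link

beta = np.zeros(2)
for _ in range(25):  # Newton-Raphson for the Poisson log-likelihood
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)
    hess = X.T @ (X * mu[:, None])  # Poisson variance equals the mean
    beta = beta + np.linalg.solve(hess, grad)

# Pearson chi-square / residual df; values well above 1 signal overdispersion.
mu = np.exp(X @ beta)
dispersion = np.sum((y - mu) ** 2 / mu) / (n - 2)
```

    In practice a dispersion statistic well above 1 would motivate the overdispersion parameter or the negative binomial model mentioned in the abstract; here the data are truly Poisson, so it sits near 1.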

  20. Significance testing in ridge regression for genetic data

    Directory of Open Access Journals (Sweden)

    De Iorio Maria

    2011-09-01

    Full Text Available Abstract Background Technological developments have increased the feasibility of large scale genetic association studies. Densely typed genetic markers are obtained using SNP arrays, next-generation sequencing technologies and imputation. However, SNPs typed using these methods can be highly correlated due to linkage disequilibrium among them, and standard multiple regression techniques fail with these data sets due to their high dimensionality and correlation structure. There has been increasing interest in using penalised regression in the analysis of high dimensional data. Ridge regression is one such penalised regression technique which does not perform variable selection, instead estimating a regression coefficient for each predictor variable. It is therefore desirable to obtain an estimate of the significance of each ridge regression coefficient. Results We develop and evaluate a test of significance for ridge regression coefficients. Using simulation studies, we demonstrate that the performance of the test is comparable to that of a permutation test, with the advantage of a much-reduced computational cost. We introduce the p-value trace, a plot of the negative logarithm of the p-values of ridge regression coefficients with increasing shrinkage parameter, which enables the visualisation of the change in p-value of the regression coefficients with increasing penalisation. We apply the proposed method to a lung cancer case-control data set from EPIC, the European Prospective Investigation into Cancer and Nutrition. Conclusions The proposed test is a useful alternative to a permutation test for the estimation of the significance of ridge regression coefficients, at a much-reduced computational cost. The p-value trace is an informative graphical tool for evaluating the results of a test of significance of ridge regression coefficients as the shrinkage parameter increases, and the proposed test makes its production computationally feasible.
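    The ridge estimator on which the test is built has the closed form β̂(k) = (XᵀX + kI)⁻¹Xᵀy; a sketch with two nearly collinear predictors standing in for SNPs in linkage disequilibrium (illustrative only; the paper's significance test and p-value trace are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 200, 10
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=n)  # near-duplicate predictor, like SNPs in LD
beta_true = np.zeros(p)
beta_true[0] = 1.0
y = X @ beta_true + rng.normal(size=n)

def ridge(X, y, k):
    """Closed-form ridge solution (X'X + kI)^-1 X'y; k = 0 recovers OLS."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

b_ols = ridge(X, y, 0.0)
b_ridge = ridge(X, y, 10.0)  # penalisation shrinks and stabilises the estimates
```

    Unlike OLS, the ridge fit spreads the signal across the correlated pair while shrinking the overall coefficient vector, which is why a dedicated significance test for the individual coefficients is needed.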

  1. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L.

    1983-01-01

    An apparatus is described in which effects of pressure, volume, and temperature changes on a gas can be observed simultaneously. Includes use of the apparatus in demonstrating Boyle's, Gay-Lussac's, and Charles' Laws, attractive forces, Dalton's Law of Partial pressures, and in illustrating measurable vapor pressures of liquids and some solids.…

  2. Tested Demonstrations.

    Science.gov (United States)

    Gilbert, George L., Ed.

    1987-01-01

    Describes two demonstrations to illustrate characteristics of substances. Outlines a method to detect the changes in pH levels during the electrolysis of water. Uses water pistols, one filled with methane gas and the other filled with water, to illustrate the differences in these two substances. (TW)

  3. Multicollinearity and Regression Analysis

    Science.gov (United States)

    Daoud, Jamal I.

    2017-12-01

    In regression analysis a correlation between the response and the predictor(s) is expected, but correlation among the predictors themselves is undesirable. The number of predictors included in the regression model depends on many factors, among them historical data and experience; in the end, the selection of the most important predictors is a judgment made by the researcher. Multicollinearity is a phenomenon in which two or more predictors are correlated; when this happens, the standard errors of the coefficients increase [8]. Inflated standard errors mean that the coefficients for some or all independent variables may not be found to be significantly different from zero. In other words, by overinflating the standard errors, multicollinearity makes some variables statistically insignificant when they should be significant. In this paper we focus on multicollinearity, its causes, and its consequences for the reliability of the regression model.
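    The inflation described in the abstract is usually quantified with the variance inflation factor, VIF_j = 1/(1 − R_j²), where R_j² comes from regressing predictor j on the remaining predictors. A small sketch with hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)  # nearly collinear with x1
x3 = rng.normal(size=n)             # independent predictor
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF_j = 1 / (1 - R_j^2), regressing column j on the other columns."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])
    coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ coef
    r2 = 1.0 - resid.var() / X[:, j].var()
    return 1.0 / (1.0 - r2)
```

    A common rule of thumb flags VIF values above 5 or 10; here the collinear pair produces VIFs near 100 while the independent predictor stays near 1.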

  4. Hermod: optical payload technology demonstrator flying on PROBA-V: overview of the payload development, testing and results after 1 year in orbit exploitation

    Science.gov (United States)

    Hernandez, S.; Blasco, J.; Henriksen, V.; Samuelsson, H.; Navasquillo, O.; Grimsgaard, M.; Mellab, K.

    2017-11-01

    and environmental tests before the integration in a very limited time. The telemetry data is currently sent to ground on a daily basis. All the channels have survived the launch and no BER has been measured, with the exception of channel 2, currently recording a BER of 3.06×10^-16, which exhibits from time to time a burst of errors due to synchronization issues in the initial data frame. The first errors are expected during the operating life of the payload within channel 4, which was designed on purpose with a reduced power margin. This paper presents the full overview of the HERMOD technology demonstrator, including the development, testing, validation activity, integration, commissioning and 1 year of in-orbit exploitation results.

  5. Wireless Transmission of Monitoring Data out of an Underground Repository: Results of Field Demonstrations Performed at the HADES Underground Laboratory - 13589

    International Nuclear Information System (INIS)

    Schroeder, T.J.; Rosca-Bocancea, E.; Hart, J.

    2013-01-01

    As part of the European 7th Framework project MoDeRn, the Nuclear Research and Consultancy Group (NRG) performed experiments to demonstrate the feasibility of wireless data transmission through the subsurface over large distances by low-frequency magnetic fields, in the context of the geological disposal of radioactive waste. The main objective of NRG's contribution is to characterize and optimize the energy use of this technique within the specific context of post-closure monitoring of a repository. To that end, measurements have been performed in the HADES Underground Research Laboratory (URL) located at Mol, Belgium, at 225 m depth. The experimental set-up utilizes a loop antenna for the transmitter that has been matched to the existing infrastructure of the HADES. Between 2010 and 2012 NRG carried out several experiments at the HADES URL in order to test the technical set-up and to characterize the propagation behavior of the geological medium and the local background noise pattern. Transmission channels have been identified and data transmission has been demonstrated at several frequencies, with data rates up to 10 bit/s and bit error rates <1%. A mathematical model description that includes the most relevant characteristics of the transmitter, transmission path, and receiver has been developed and applied to analyze possible options to optimize the set-up. With respect to energy efficiency, results so far have shown that data transmission over larger distances through the subsurface is a feasible option. To support the conclusions on the energy need per bit of transmitted data, additional experiments are foreseen. (authors)

  6. Minimax Regression Quantiles

    DEFF Research Database (Denmark)

    Bache, Stefan Holst

    A new and alternative quantile regression estimator is developed and it is shown that the estimator is root-n-consistent and asymptotically normal. The estimator is based on a minimax ‘deviance function’ and has asymptotically equivalent properties to the usual quantile regression estimator. It is, however, a different and therefore new estimator. It allows for both linear and nonlinear model specifications. A simple algorithm for computing the estimates is proposed. It seems to work quite well in practice, but whether it has theoretical justification is still an open question.

  7. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks, a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory-optimized C++ functions with an R interface for predicting the covariate-specific absolute risks, their confidence intervals, and their confidence bands based on right-censored time-to-event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by… functionals. The software presented here is implemented in the riskRegression package.

  8. Regression Analysis by Example. 5th Edition

    Science.gov (United States)

    Chatterjee, Samprit; Hadi, Ali S.

    2012-01-01

    Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…

  9. Greenhouse gas profiling by infrared-laser and microwave occultation: retrieval algorithm and demonstration results from end-to-end simulations

    Directory of Open Access Journals (Sweden)

    V. Proschek

    2011-10-01

    Full Text Available Measuring greenhouse gas (GHG) profiles with global coverage and high accuracy and vertical resolution in the upper troposphere and lower stratosphere (UTLS) is key for improved monitoring of GHG concentrations in the free atmosphere. In this respect a new satellite mission concept, adding an infrared-laser part to the already well-studied microwave occultation technique, exploits the joint propagation of infrared-laser and microwave signals between Low Earth Orbit (LEO) satellites. This synergetic combination, referred to as the LEO-LEO microwave and infrared-laser occultation (LMIO) method, enables retrieval of thermodynamic profiles (pressure, temperature, humidity) and accurate altitude levels from the microwave signals, and GHG profiles from the simultaneously measured infrared-laser signals. However, due to the novelty of the LMIO method, a retrieval algorithm for GHG profiling is not yet available. Here we introduce such an algorithm for retrieving GHGs from LEO-LEO infrared-laser occultation (LIO) data, applied as a second step after retrieving thermodynamic profiles from LEO-LEO microwave occultation (LMO) data. We thoroughly describe the LIO retrieval algorithm and unveil the synergy with the LMO-retrieved pressure, temperature, and altitude information. We furthermore demonstrate the effective independence of the GHG retrieval results from background (a priori) information in discussing demonstration results from LMIO end-to-end simulations for a representative set of GHG profiles, including carbon dioxide (CO2), water vapor (H2O), methane (CH4), and ozone (O3). The GHGs except for ozone are well retrieved throughout the UTLS, while ozone is well retrieved from about 10 km to 15 km upwards, since the ozone layer resides in the lower stratosphere. The GHG retrieval errors are generally smaller than 1% to 3% r.m.s., at a vertical resolution of about 1 km. The retrieved profiles also appear unbiased, which points

  10. Logistic regression for dichotomized counts.

    Science.gov (United States)

    Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W

    2016-12-01

    Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
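    The first model part, a logistic regression on the indicator that a count is positive, can be sketched as follows (synthetic Poisson counts and Newton-Raphson estimation; the shared-parameter hurdle machinery of the paper is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
counts = rng.poisson(np.exp(0.1 + 0.5 * x))  # underlying count process
z = (counts > 0).astype(float)               # dichotomized outcome: any event vs none

beta = np.zeros(2)
for _ in range(25):  # Newton-Raphson for the logistic log-likelihood
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (z - p)
    hess = X.T @ (X * (p * (1.0 - p))[:, None])
    beta = beta + np.linalg.solve(hess, grad)
```

    Fitting only this logistic part discards the information in the count magnitudes, which is exactly the efficiency loss the two-part hurdle model is designed to recover.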

  11. Multiple linear regression analysis

    Science.gov (United States)

    Edwards, T. R.

    1980-01-01

    Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.
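    The stepwise procedure the program implements can be sketched as forward selection with a partial F-test (hypothetical data and a 5% entry threshold standing in for the user-specified confidence level; the original FORTRAN IV code is not reproduced):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, p = 150, 6
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(size=n)  # only predictors 0 and 3 matter

def rss(cols):
    """Residual sum of squares of an OLS fit on the given columns (plus intercept)."""
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ coef
    return r @ r

selected, remaining = [], list(range(p))
while remaining:
    df = n - len(selected) - 2  # residual df after adding one more term
    # partial F statistic for each candidate entering the current model
    scores = {c: (rss(selected) - rss(selected + [c])) / (rss(selected + [c]) / df)
              for c in remaining}
    best = max(scores, key=scores.get)
    if stats.f.sf(scores[best], 1, df) > 0.05:
        break  # no remaining variable is statistically significant
    selected.append(best)
    remaining.remove(best)
```

    The final model retains only predictors that pass the significance threshold, mirroring the program's goal of relating a minimal set of variables to the observations.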

  12. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an

  13. Linear Regression Analysis

    CERN Document Server

    Seber, George A F

    2012-01-01

    Concise, mathematically clear, and comprehensive treatment of the subject. * Expanded coverage of diagnostics and methods of model fitting. * Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models. * More than 200 problems throughout the book plus outline solutions for the exercises. * This revision has been extensively class-tested.

  14. Nonlinear Regression with R

    CERN Document Server

    Ritz, Christian; Parmigiani, Giovanni

    2009-01-01

    R is a rapidly evolving lingua franca of graphical display and statistical analysis of experiments from the applied sciences. This book provides a coherent treatment of nonlinear regression with R by means of examples from a diversity of applied sciences such as biology, chemistry, engineering, medicine and toxicology.

  15. Bounded Gaussian process regression

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand; Nielsen, Jens Brehm; Larsen, Jan

    2013-01-01

    We extend the Gaussian process (GP) framework for bounded regression by introducing two bounded likelihood functions that model the noise on the dependent variable explicitly. This is fundamentally different from the implicit noise assumption in the previously suggested warped GP framework. We … with the proposed explicit noise-model extension.

  16. Mechanisms of neuroblastoma regression

    Science.gov (United States)

    Brodeur, Garrett M.; Bagatell, Rochelle

    2014-01-01

    Recent genomic and biological studies of neuroblastoma have shed light on the dramatic heterogeneity in the clinical behaviour of this disease, which spans from spontaneous regression or differentiation in some patients, to relentless disease progression in others, despite intensive multimodality therapy. This evidence also suggests several possible mechanisms to explain the phenomena of spontaneous regression in neuroblastomas, including neurotrophin deprivation, humoral or cellular immunity, loss of telomerase activity and alterations in epigenetic regulation. A better understanding of the mechanisms of spontaneous regression might help to identify optimal therapeutic approaches for patients with these tumours. Currently, the most druggable mechanism is the delayed activation of developmentally programmed cell death regulated by the tropomyosin receptor kinase A pathway. Indeed, targeted therapy aimed at inhibiting neurotrophin receptors might be used in lieu of conventional chemotherapy or radiation in infants with biologically favourable tumours that require treatment. Alternative approaches consist of breaking immune tolerance to tumour antigens or activating neurotrophin receptor pathways to induce neuronal differentiation. These approaches are likely to be most effective against biologically favourable tumours, but they might also provide insights into treatment of biologically unfavourable tumours. We describe the different mechanisms of spontaneous neuroblastoma regression and the consequent therapeutic approaches. PMID:25331179

  17. Thousands and thousands of kilowatt-hours saved: Results from The Energy Efficiency McDonalds (TEEM) demonstration project in Bay Point, California

    Energy Technology Data Exchange (ETDEWEB)

    Allen, T.; Young, R.; Spata, T.; Smith, V.

    1998-07-01

    Food service operations use more energy per square foot than any other commercial building type, and yet the opportunity to build energy-efficient restaurants is often overlooked due to a lack of information and education within the industry. To meet this challenge and stimulate energy-efficient restaurant design, McDonald's Corporation, the nation's largest restaurant chain, and Pacific Gas and Electric (PG and E), one of the largest combined fuel utilities, are working together in a program called The Energy Efficient McDonald's, or TEEM. TEEM will identify, demonstrate and evaluate energy-saving technologies with the goal of integrating cost-effective energy-efficient technologies into McDonald's universal building specification and giving existing store operators the opportunity to improve their operations. Technologies installed at the TEEM store in Bay Point include: direct evaporative cooler, evaporative precooler, high-efficiency air conditioners, high-efficiency and two-speed exhaust fans, advanced glazing systems, tubular skylights, low-cost dimming controller and electronic ballasts, T-8 fluorescent fixtures, low-temperature occupancy sensors for the walk-in cooler/freezer, and an energy management system. An extensive data collection system has been collecting data since the store opened in June 1996. This paper presents the performance results of the energy-efficiency measures installed, using measured-data analysis techniques.

  18. Ridge Regression Signal Processing

    Science.gov (United States)

    Kuhl, Mark R.

    1990-01-01

    The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.

  19. Subset selection in regression

    CERN Document Server

    Miller, Alan

    2002-01-01

    Originally published in 1990, the first edition of Subset Selection in Regression filled a significant gap in the literature, and its critical and popular success has continued for more than a decade. Thoroughly revised to reflect progress in theory, methods, and computing power, the second edition promises to continue that tradition. The author has thoroughly updated each chapter, incorporated new material on recent developments, and included more examples and references. New in the Second Edition: a separate chapter on Bayesian methods; complete revision of the chapter on estimation; a major example from the field of near infrared spectroscopy; more emphasis on cross-validation; greater focus on bootstrapping; stochastic algorithms for finding good subsets from large numbers of predictors when an exhaustive search is not feasible; software available on the Internet for implementing many of the algorithms presented; more examples. Subset Selection in Regression, Second Edition remains dedicated to the techniques for fitting...

  20. Regression in organizational leadership.

    Science.gov (United States)

    Kernberg, O F

    1979-02-01

    The choice of good leaders is a major task for all organizations. Information regarding the prospective administrator's personality should complement questions regarding his previous experience, his general conceptual skills, his technical knowledge, and the specific skills in the area for which he is being selected. The growing psychoanalytic knowledge about the crucial importance of internal, in contrast to external, object relations, and about the mutual relationships of regression in individuals and in groups, constitutes an important practical tool for the selection of leaders.

  1. Classification and regression trees

    CERN Document Server

    Breiman, Leo; Olshen, Richard A; Stone, Charles J

    1984-01-01

    The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.

  2. General regression and representation model for classification.

    Directory of Open Access Journals (Sweden)

    Jianjun Qian

    Full Text Available Recently, regularized coding-based classification methods (e.g. SRC and CRC) have shown great potential for pattern classification. However, most existing coding methods assume that the representation residuals are uncorrelated. In real-world applications, this assumption does not hold. In this paper, we take account of the correlations of the representation residuals and develop a general regression and representation model (GRR) for classification. GRR not only has the advantages of CRC, but also makes full use of the prior information (e.g. the correlations between representation residuals and representation coefficients) and the specific information (the weight matrix of image pixels) to enhance classification performance. GRR uses generalized Tikhonov regularization and K Nearest Neighbors to learn the prior information from the training data. Meanwhile, the specific information is obtained by using an iterative algorithm to update the feature (or image pixel) weights of the test sample. With the proposed model as a platform, we design two classifiers: the basic general regression and representation classifier (B-GRR) and the robust general regression and representation classifier (R-GRR). The experimental results demonstrate the performance advantages of the proposed methods over state-of-the-art algorithms.

  3. Application of a Weighted Regression Model for Reporting Nutrient and Sediment Concentrations, Fluxes, and Trends in Concentration and Flux for the Chesapeake Bay Nontidal Water-Quality Monitoring Network, Results Through Water Year 2012

    Science.gov (United States)

    Chanat, Jeffrey G.; Moyer, Douglas L.; Blomquist, Joel D.; Hyer, Kenneth E.; Langland, Michael J.

    2016-01-13

    In the Chesapeake Bay watershed, estimated fluxes of nutrients and sediment from the bay’s nontidal tributaries into the estuary are the foundation of decision making to meet reductions prescribed by the Chesapeake Bay Total Maximum Daily Load (TMDL) and are often the basis for refining scientific understanding of the watershed-scale processes that influence the delivery of these constituents to the bay. Two regression-based flux and trend estimation models, ESTIMATOR and Weighted Regressions on Time, Discharge, and Season (WRTDS), were compared using data from 80 watersheds in the Chesapeake Bay Nontidal Water-Quality Monitoring Network (CBNTN). The watersheds range in size from 62 to 70,189 square kilometers and record lengths range from 6 to 28 years. ESTIMATOR is a constant-parameter model that estimates trends only in concentration; WRTDS uses variable parameters estimated with weighted regression, and estimates trends in both concentration and flux. WRTDS had greater explanatory power than ESTIMATOR, with the greatest degree of improvement evident for records longer than 25 years (30 stations; improvement in median model R2= 0.06 for total nitrogen, 0.08 for total phosphorus, and 0.05 for sediment) and the least degree of improvement for records of less than 10 years, for which the two models performed nearly equally. Flux bias statistics were comparable or lower (more favorable) for WRTDS for any record length; for 30 stations with records longer than 25 years, the greatest degree of improvement was evident for sediment (decrease of 0.17 in median statistic) and total phosphorus (decrease of 0.05). The overall between-station pattern in concentration trend direction and magnitude for all constituents was roughly similar for both models. A detailed case study revealed that trends in concentration estimated by WRTDS can operationally be viewed as a less-constrained equivalent to trends in concentration estimated by ESTIMATOR. Estimates of annual mean flow
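The core mechanic of WRTDS, re-fitting a regression at each estimation point with weights that decay with distance in time, discharge, and season, can be sketched roughly as follows. The data and half-window widths are invented for illustration; the real model carries additional seasonal terms and carefully chosen windows:

```python
import numpy as np

def tricube(d, h):
    """Tricube weight: maximal at distance 0, zero beyond half-window h."""
    u = np.clip(np.abs(d) / h, 0.0, 1.0)
    return (1.0 - u ** 3) ** 3

def weighted_least_squares(X, y, w):
    """Solve the weighted normal equations (X'WX) b = X'Wy."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

# Hypothetical monitoring record: decimal time and log discharge explain
# log concentration.
rng = np.random.default_rng(1)
t = np.linspace(2000.0, 2010.0, 120)        # decimal years
logQ = rng.normal(size=120)                  # log discharge
logC = 0.05 * (t - 2005.0) + 0.4 * logQ + 0.01 * rng.normal(size=120)

# Fit at one target (time, discharge) point, weighting observations by
# closeness in time and in discharge.
t0, q0 = 2005.0, 0.0
w = tricube(t - t0, 7.0) * tricube(logQ - q0, 2.0)
X = np.column_stack([np.ones_like(t), t - t0, logQ])
b = weighted_least_squares(X, logC, w)
```

Repeating the fit over a grid of target points is what gives the method its "variable parameters estimated with weighted regression" character.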

  4. Logistic regression models

    CERN Document Server

    Hilbe, Joseph M

    2009-01-01

    This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect-great clarity.The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … .-Annette J. Dobson, Biometric...

  5. Long-term response of total ozone content at different latitudes of the Northern and Southern Hemispheres caused by solar activity during 1958-2006 (results of regression analysis)

    Science.gov (United States)

    Krivolutsky, Alexei A.; Nazarova, Margarita; Knyazeva, Galina

Solar activity influences the atmospheric photochemical system via its changeable electromagnetic flux, with an eleven-year period, and also by energetic particles during solar proton events (SPEs). Energetic particles penetrate mostly into polar regions and induce additional production of NOx and HOx chemical compounds, which can destroy ozone in photochemical catalytic cycles. Solar irradiance variations cause in-phase variability of ozone in accordance with photochemical theory. However, the real ozone response caused by these two factors, which have different physical natures, is not so clear on long-term time scales. To understand the situation, a multiple linear regression statistical method was used. Three data series covering the period 1958-2006 were used in the analysis: yearly averaged total ozone at different latitudes (World Ozone Data Centre, Canada, WMO); yearly averaged proton fluxes with E ≥ 10 MeV (IMP, GOES, METEOR satellites); and yearly averaged sunspot numbers (Solar Data). Before the analysis, data sets of ozone deviations from the mean values for the whole period (1958-2006) at each latitudinal belt were prepared. The results of the multiple regression analysis (two factors) revealed rather complicated time-dependent behavior of the ozone response, with clear negative peaks for the years of strong SPEs. The magnitudes of such peaks on an annual mean basis are no greater than 10 DU. An unusual effect, a positive response of ozone to solar proton activity near both poles, was discovered by the statistical analysis. The possible photochemical nature of the found effect is discussed. This work was supported by the Russian Science Foundation for Basic Research (grant 09-05-009949) and by contract 1-6-08 under the Russian Sub-Program "Research and Investigation of Antarctica".
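A two-factor multiple linear regression of the kind described, ozone deviations regressed on sunspot number and proton flux, can be sketched with synthetic series (the coefficients and series below are invented for illustration, not the study's data):

```python
import numpy as np

# Hypothetical yearly series standing in for the two regression factors:
# a sunspot-like 11-year cycle and a proton-fluence series, plus an
# invented ozone response (in-phase irradiance term, negative proton term).
rng = np.random.default_rng(2)
years = np.arange(1958, 2007)
sunspots = 80.0 + 60.0 * np.sin(2.0 * np.pi * (years - 1958) / 11.0)
protons = rng.exponential(scale=1.0, size=years.size)

ozone_dev = (0.05 * (sunspots - sunspots.mean())
             - 3.0 * (protons - protons.mean())
             + 0.1 * rng.normal(size=years.size))

# Two-factor multiple linear regression on the deviations.
X = np.column_stack([np.ones(years.size),
                     sunspots - sunspots.mean(),
                     protons - protons.mean()])
coef, *_ = np.linalg.lstsq(X, ozone_dev, rcond=None)
```

The fitted coefficients separate the in-phase irradiance response from the negative proton-event response, which is exactly the decomposition the two-factor analysis relies on.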

  6. HIV pre-exposure prophylaxis and early antiretroviral treatment among female sex workers in South Africa: Results from a prospective observational demonstration project.

    Directory of Open Access Journals (Sweden)

    Robyn Eakle

    2017-11-01

There were no seroconversions on PrEP and 7 virological failures on early ART among women remaining in the study. Reported adherence to PrEP varied over time between 70% and 85%, whereas over 90% of participants reported taking pills daily while on early ART. Data on provider-side costs were also collected and analysed. The total cost of service delivery was approximately US$126 for PrEP and US$406 for early ART per person-year. The main limitations of this study include the lack of a control group, which was not included due to ethical considerations; clinical study requirements imposed when PrEP was not approved through the regulatory system, which could have affected uptake; and the timing of the implementation of a national sex worker HIV programme, which could have also affected uptake and retention. PrEP and early ART services can be implemented within FSW routine services in high-prevalence urban settings. We observed good uptake for both PrEP and early ART; however, retention rates for PrEP were low. Retention rates for early ART were similar to retention rates for the current standard of care. While the cost of the interventions was higher than previously published, there is potential for cost reduction at scale. The TAPS Demonstration Project results provided the basis for the first government PrEP and early ART guidelines and the rollout of the national sex worker HIV programme in South Africa.

  7. Use of probabilistic weights to enhance linear regression myoelectric control.

    Science.gov (United States)

    Smith, Lauren H; Kuiken, Todd A; Hargrove, Levi J

    2015-12-01

Clinically available prostheses for transradial amputees do not allow simultaneous myoelectric control of degrees of freedom (DOFs). Linear regression methods can provide simultaneous myoelectric control, but frequently also result in difficulty with isolating individual DOFs when desired. This study evaluated the potential of using probabilistic estimates of categories of gross prosthesis movement, which are commonly used in classification-based myoelectric control, to enhance linear regression myoelectric control. Gaussian models were fit to electromyogram (EMG) feature distributions for three movement classes at each DOF (no movement, or movement in either direction) and used to weight the output of linear regression models by the probability that the user intended the movement. Eight able-bodied and two transradial amputee subjects worked in a virtual Fitts' law task to evaluate differences in controllability between linear regression and probability-weighted regression for an intramuscular EMG-based three-DOF wrist and hand system. Real-time and offline analyses in able-bodied subjects demonstrated that probability weighting improved performance during single-DOF tasks (p < 0.05) by preventing extraneous movement at additional DOFs. Similar results were seen in experiments with two transradial amputees. Use of probability weights can improve the ability to isolate individual DOFs during linear regression myoelectric control, while maintaining the ability to simultaneously control multiple DOFs.

  8. On Weighted Support Vector Regression

    DEFF Research Database (Denmark)

    Han, Xixuan; Clemmensen, Line Katrine Harder

    2014-01-01

    We propose a new type of weighted support vector regression (SVR), motivated by modeling local dependencies in time and space in prediction of house prices. The classic weights of the weighted SVR are added to the slack variables in the objective function (OF‐weights). This procedure directly...... shrinks the coefficient of each observation in the estimated functions; thus, it is widely used for minimizing influence of outliers. We propose to additionally add weights to the slack variables in the constraints (CF‐weights) and call the combination of weights the doubly weighted SVR. We illustrate...... the differences and similarities of the two types of weights by demonstrating the connection between the Least Absolute Shrinkage and Selection Operator (LASSO) and the SVR. We show that an SVR problem can be transformed to a LASSO problem plus a linear constraint and a box constraint. We demonstrate...
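The effect of OF-weights, scaling each observation's slack penalty so that outliers can be down-weighted, can be imitated outside an SVR solver. The sketch below uses iteratively reweighted least squares for observation-weighted L1 regression (the ε→0 limit of the ε-insensitive loss); it is an illustrative approximation on invented data, not the authors' doubly weighted SVR:

```python
import numpy as np

def weighted_l1_fit(X, y, v, iters=50):
    """IRLS for observation-weighted L1 regression: weight v[i] scales the
    penalty on observation i, so v[i] = 0 removes its influence entirely."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        r = y - X @ beta
        u = v / np.maximum(np.abs(r), 1e-8)   # IRLS reweighting for |r|
        Xu = X * u[:, None]
        beta = np.linalg.solve(Xu.T @ X, Xu.T @ y)
    return beta

x = np.linspace(0.0, 1.0, 20)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
y[0] += 10.0                                  # one gross outlier

v = np.ones(20)
v[0] = 0.0                                    # zero OF-weight for the outlier
beta_w = weighted_l1_fit(X, y, v)
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)  # unweighted least squares
```

Zeroing the outlier's weight recovers the generating line almost exactly, while ordinary least squares is pulled far off by the single corrupted point.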

  9. AN APPLICATION OF FUNCTIONAL MULTIVARIATE REGRESSION MODEL TO MULTICLASS CLASSIFICATION

    OpenAIRE

    Krzyśko, Mirosław; Smaga, Łukasz

    2017-01-01

    In this paper, the scale response functional multivariate regression model is considered. By using the basis functions representation of functional predictors and regression coefficients, this model is rewritten as a multivariate regression model. This representation of the functional multivariate regression model is used for multiclass classification for multivariate functional data. Computational experiments performed on real labelled data sets demonstrate the effectiveness of the proposed ...

  10. MR 201104: Evaluation of Discrimination Technologies and Classification Results and MR 201157: Demonstration of MetalMapper Static Data Acquisition and Data Analysis

    Science.gov (United States)

    2016-09-23

favorable at this site and included a single TOI (4.2-inch mortar) and benign topography and geology. All of the demonstrated classification approaches … Stokes mortars (PMTMA); 3.5-inch rockets (Fort Sill, MMR); antitank land mines (Fort Sill, WMA); hand grenades (Fort Sill, WMA).

  11. Test results of full-scale high temperature superconductors cable models destined for a 36 kV, 2 kA(rms) utility demonstration

    DEFF Research Database (Denmark)

    Daumling, M.; Rasmussen, C.N.; Hansen, F.

    2001-01-01

    Power cable systems using high temperature superconductors (HTS) are nearing technical feasibility. This presentation summarises the advancements and status of a project aimed at demonstrating a 36 kV, 2 kA(rms) AC cable system by installing a 30 m long full-scale functional model in a power...

  12. Steganalysis using logistic regression

    Science.gov (United States)

    Lubenko, Ivans; Ker, Andrew D.

    2011-02-01

We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study, comparing accuracy and speed of SVM and LR classifiers in detection of LSB Matching and other related spatial-domain image steganography, through the state-of-the-art 686-dimensional SPAM feature set, in three image sets.
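The advantage claimed here, that LR yields class probabilities rather than only labels, is easy to see in a minimal gradient-descent implementation. The two-dimensional synthetic features below merely stand in for SPAM-style steganalysis features; a 686-dimensional feature set would work the same way:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain batch-gradient-descent logistic regression."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
        g = p - y                                 # gradient of the log-loss
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

rng = np.random.default_rng(4)
# Two synthetic feature clusters standing in for cover vs stego images.
X = np.vstack([rng.normal(-1.0, 1.0, size=(100, 2)),
               rng.normal(+1.0, 1.0, size=(100, 2))])
y = np.r_[np.zeros(100), np.ones(100)]

w, b = fit_logistic(X, y)
proba = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # probabilities, not just labels
acc = float(((proba > 0.5) == y).mean())
```

Thresholding `proba` at 0.5 gives the hard SVM-style decision, but the probabilities themselves carry the extra information the abstract highlights.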

  13. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface......-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical...

  14. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

    Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate...... regression by minimising a cross-validation estimate of the generalisation error. This allows to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...

  15. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression...... by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
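The idea of adapting the input metric by minimising a cross-validation estimate can be sketched with a Nadaraya-Watson smoother and a per-dimension bandwidth vector. The data and bandwidth values are hypothetical, and a real implementation would minimise the criterion over the metric rather than compare two fixed settings:

```python
import numpy as np

def loo_cv_error(X, y, h):
    """Leave-one-out CV error of a Nadaraya-Watson smoother whose Gaussian
    kernel uses a separate bandwidth h[d] for each input dimension d."""
    d2 = (((X[:, None, :] - X[None, :, :]) / h) ** 2).sum(axis=-1)
    K = np.exp(-0.5 * d2)
    np.fill_diagonal(K, 0.0)          # exclude each point from its own fit
    pred = (K @ y) / K.sum(axis=1)
    return ((y - pred) ** 2).mean()

rng = np.random.default_rng(5)
X = rng.uniform(-1.0, 1.0, size=(300, 2))
y = np.sin(3.0 * X[:, 0]) + 0.5 * rng.normal(size=300)  # dim 1 is irrelevant

# An isotropic metric wastes neighbours on the irrelevant dimension; widening
# its bandwidth (down-weighting that dimension) lowers the CV error.
err_iso     = loo_cv_error(X, y, h=np.array([0.05, 0.05]))
err_adapted = loo_cv_error(X, y, h=np.array([0.05, 5.0]))
```

Minimising the leave-one-out error over `h` automatically adjusts the importance of each dimension, which is the variable-selection behaviour the abstract describes.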

  16. Summary Report on Phase I and Phase II Results From the 3D Printing in Zero-G Technology Demonstration Mission. Volume II

    Science.gov (United States)

    Prater, T. J.; Werkheiser, N. J.; Ledbetter, F. E., III

    2018-01-01

In-space manufacturing seeks to develop the processes, skill sets, and certification architecture needed to provide a rapid-response manufacturing capability on long-duration exploration missions. The first 3D printer on the Space Station was developed by Made in Space, Inc. and completed two rounds of operation on orbit as part of the 3D Printing in Zero-G Technology Demonstration Mission. This Technical Publication provides a comprehensive overview of the technical objectives of the mission, the two phases of hardware operation conducted on orbit, and the subsequent detailed analysis of specimens produced. No engineering-significant evidence of microgravity effects on material outcomes was noted. This technology demonstration mission represents the first step in developing a suite of manufacturing capabilities to meet future mission needs.

  17. Robust analysis of trends in noisy tokamak confinement data using geodesic least squares regression

    Energy Technology Data Exchange (ETDEWEB)

    Verdoolaege, G., E-mail: geert.verdoolaege@ugent.be [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium); Laboratory for Plasma Physics, Royal Military Academy, B-1000 Brussels (Belgium); Shabbir, A. [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium); Max Planck Institute for Plasma Physics, Boltzmannstr. 2, 85748 Garching (Germany); Hornung, G. [Department of Applied Physics, Ghent University, B-9000 Ghent (Belgium)

    2016-11-15

    Regression analysis is a very common activity in fusion science for unveiling trends and parametric dependencies, but it can be a difficult matter. We have recently developed the method of geodesic least squares (GLS) regression that is able to handle errors in all variables, is robust against data outliers and uncertainty in the regression model, and can be used with arbitrary distribution models and regression functions. We here report on first results of application of GLS to estimation of the multi-machine scaling law for the energy confinement time in tokamaks, demonstrating improved consistency of the GLS results compared to standard least squares.

  18. New Technology Demonstration Program - Results of an Attempted Field Test of Multi-Layer Light Polarizing Panels in an Office Space

    Energy Technology Data Exchange (ETDEWEB)

    Richman, Eric E.

    2001-06-14

    An assessment of the potential energy savings associated with the use of multi-layer light polarizing panels in an office space was initiated as part of the Department of Energy's (DOE) Federal Energy Management Program (FEMP) New Technology Demonstration Program (NTDP) in 1997. This project was intended to provide information on the effectiveness and application of this technology that could help federal energy managers and other interested individuals determine whether this technology had benefits for their occupied spaces. The use of an actual working office area provided the capability of evaluating the technology's effectiveness in the real world.

  19. results

    Directory of Open Access Journals (Sweden)

    Salabura Piotr

    2017-01-01

The HADES experiment at GSI is the only high-precision experiment probing nuclear matter in the beam energy range of a few AGeV. Pion, proton and ion beams are used to study rare dielectron and strangeness probes to diagnose properties of strongly interacting matter in this energy regime. Selected results from p + A and A + A collisions are presented and discussed.

  20. Demonstrating managed aquifer recharge as a solution for climate change adaptation: results from Gabardine project and asemwaterNet coordination action in the Algarve region (Portugal

    Directory of Open Access Journals (Sweden)

    João Paulo Lobo Ferreira

    2014-09-01

In the Algarve region of southern Portugal, Managed Aquifer Recharge (MAR) research activities have been developed to provide not only storage of water surplus in aquifers during wet years, focusing on the Querença-Silves aquifer (FP6 ASEMWATERNet Coordination Action), but also groundwater quality rehabilitation in the Campina de Faro aquifer (FP6 Gabardine Project). Following the MAR research potential in southern Portugal, this paper describes the objectives, conceptual demonstration, background, and capabilities of one of the selected Circum-Mediterranean pilot sites (in Portugal) that will be researched in the new FP7-ENV-2013-WATER-INNO-DEMO MARSOL project, which started Dec. 1st, 2013. In the Algarve pilot site, several case-study areas will be located in the Querença-Silves aquifer and in the Campina de Faro aquifer.

  1. Spontaneous regression of metastatic Merkel cell carcinoma.

    LENUS (Irish Health Repository)

    Hassan, S J

    2010-01-01

    Merkel cell carcinoma is a rare aggressive neuroendocrine carcinoma of the skin predominantly affecting elderly Caucasians. It has a high rate of local recurrence and regional lymph node metastases. It is associated with a poor prognosis. Complete spontaneous regression of Merkel cell carcinoma has been reported but is a poorly understood phenomenon. Here we present a case of complete spontaneous regression of metastatic Merkel cell carcinoma demonstrating a markedly different pattern of events from those previously published.

  2. Multicollinearity in Regression Analyses Conducted in Epidemiologic Studies.

    Science.gov (United States)

    Vatcheva, Kristina P; Lee, MinJae; McCormick, Joseph B; Rahbar, Mohammad H

    2016-04-01

The adverse impact of ignoring multicollinearity on findings and data interpretation in regression analysis is very well documented in the statistical literature. The failure to identify and report multicollinearity could result in misleading interpretations of the results. A review of epidemiological literature in PubMed from January 2004 to December 2013 illustrated the need for greater attention to identifying and minimizing the effect of multicollinearity in analyses of data from epidemiologic studies. We used simulated datasets and real-life data from the Cameron County Hispanic Cohort to demonstrate the adverse effects of multicollinearity in regression analysis and encourage researchers to consider diagnostics for multicollinearity as one of the steps in regression analysis.
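The standard diagnostic alluded to, the variance inflation factor, regresses each predictor on the others; the sketch below computes VIFs on simulated data with one near-collinear pair (all values hypothetical):

```python
import numpy as np

def vif(X):
    """Variance inflation factor per column: 1 / (1 - R^2_j), where R^2_j
    comes from regressing column j on the remaining columns (with intercept)."""
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(X)), others])
        coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ coef
        r2 = 1.0 - resid.var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(6)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = x1 + 0.05 * rng.normal(size=200)   # nearly collinear with x1
V = vif(np.column_stack([x1, x2, x3]))
```

The inflated values flag the collinear pair long before coefficient estimates become visibly unstable, which is why VIF screening is a useful routine step.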

  3. Polylinear regression analysis in radiochemistry

    International Nuclear Information System (INIS)

    Kopyrin, A.A.; Terent'eva, T.N.; Khramov, N.N.

    1995-01-01

A number of radiochemical problems have been formulated in the framework of polylinear regression analysis, which permits the use of conventional mathematical methods for their solution. The authors have considered features of the use of polylinear regression analysis for estimating the contributions of various sources to atmospheric pollution, for studying irradiated nuclear fuel, for estimating concentrations from spectral data, for measuring neutron fields of a nuclear reactor, for estimating crystal lattice parameters from X-ray diffraction patterns, for interpreting data of X-ray fluorescence analysis, for estimating complex-formation constants, and for analyzing results of radiometric measurements. The problem of estimating the target parameters can be ill-posed for certain properties of the system under study. The authors showed the possibility of regularization by adding a fictitious set of data "obtained" from the orthogonal design. To estimate only a part of the parameters under consideration, the authors used incomplete-rank models. In this case, it is necessary to take into account the possibility of confounding estimates. An algorithm for evaluating the degree of confounding is presented which is realized using standard software for regression analysis.
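The regularization-by-fictitious-data trick described here has a compact linear-algebra form: appending a scaled identity block (one simple orthogonal design) with zero responses to the data reproduces Tikhonov/ridge estimates exactly. A sketch with invented data:

```python
import numpy as np

def ridge_by_augmentation(X, y, lam):
    """Ordinary least squares on a design augmented with sqrt(lam) * I and
    zero responses: the added rows contribute lam * I to X'X and nothing
    to X'y, which is exactly ridge regression."""
    p = X.shape[1]
    X_aug = np.vstack([X, np.sqrt(lam) * np.eye(p)])   # fictitious design rows
    y_aug = np.concatenate([y, np.zeros(p)])           # fictitious responses
    coef, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)
    return coef

rng = np.random.default_rng(7)
X = rng.normal(size=(40, 3))
y = X @ np.array([1.0, -1.0, 0.5]) + 0.1 * rng.normal(size=40)

b_aug = ridge_by_augmentation(X, y, lam=2.0)
b_direct = np.linalg.solve(X.T @ X + 2.0 * np.eye(3), X.T @ y)
```

Because the augmentation is algebraically identical to the penalized normal equations, any standard regression software can produce regularized estimates this way, which is the point the abstract makes.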

  4. Case management to reduce cardiovascular disease risk in American Indians and Alaska Natives with diabetes: results from the Special Diabetes Program for Indians Healthy Heart Demonstration Project.

    Science.gov (United States)

    Moore, Kelly; Jiang, Luohua; Manson, Spero M; Beals, Janette; Henderson, William; Pratte, Katherine; Acton, Kelly J; Roubideaux, Yvette

    2014-11-01

We evaluated cardiovascular disease (CVD) risk factors in American Indians/Alaska Natives (AI/ANs) with diabetes in the Special Diabetes Program for Indians Healthy Heart (SDPI-HH) Demonstration Project. Multidisciplinary teams implemented an intensive case management intervention among 30 health care programs serving 138 tribes. The project recruited 3373 participants, with and without current CVD, between 2006 and 2009. We examined data collected at baseline and 1 year later to determine whether improvements occurred in CVD risk factors and in Framingham coronary heart disease (CHD) risk scores, aspirin use, and smoking status. A1c levels decreased an average of 0.2%, and Framingham CHD risk scores also decreased significantly. Aspirin therapy increased significantly, and smoking decreased. Participants with more case management visits had significantly greater reductions in LDL cholesterol and A1c values. SDPI-HH successfully translated an intensive case management intervention. Creative retention strategies and an improved understanding of organizational challenges are needed for future Indian health translational efforts.

  5. Women with polycystic ovary syndrome demonstrate worsening markers of cardiovascular risk over the short-term despite declining hyperandrogenaemia: Results of a longitudinal study with community controls.

    Science.gov (United States)

    Huddleston, Heather G; Quinn, Molly M; Kao, Chia-Ning; Lenhart, Nikolaus; Rosen, Mitchell P; Cedars, Marcelle I

    2017-12-01

To compare age-associated changes in cardiovascular risk markers in lean and obese reproductive-aged women with polycystic ovary syndrome (PCOS) with community controls. Longitudinal study at an academic medical centre. PATIENTS: Patients diagnosed with PCOS by 2004 Rotterdam criteria in a multidisciplinary clinic were systematically enrolled from 2006-2014 in a PCOS cohort study and subsequently agreed to participate in a longitudinal study. The comparison controls were from the prospective, longitudinal Ovarian Aging (OVA) study, which consists of healthy women with regular menstrual cycles recruited from 2006 to 2011. Cardiovascular risk markers and hormone parameters were measured at baseline and follow-up. Obese and lean PCOS (n = 38) and control women (n = 296) completed two study visits. The follow-up time (3.5 ± 1.5 vs 4.0 ± 0.8 years, P = .06) and magnitude of BMI gain (+0.1 kg/m²/y [-0.11, 0.36] vs +0.26 [-0.18, 0.87], P = .19) did not differ between obese and lean PCOS and controls. In PCOS subjects, total testosterone decreased in both obese and lean, but the decrease was greater in obese subjects (-0.09 nmol/L per year; 95% CI: -0.16, -0.02 vs -0.04 nmol/L per year; 95% CI: -0.11, 0.03). Compared to their respective controls, obese and lean PCOS saw worsening triglyceride (TG) levels. Women with PCOS demonstrated declines in biochemical hyperandrogenaemia over time. Despite this, PCOS subjects experienced steeper increases in cardiovascular risk factors associated with insulin resistance, including triglycerides and HOMA-IR. © 2017 John Wiley & Sons Ltd.

  6. A Simulation Investigation of Principal Component Regression.

    Science.gov (United States)

    Allen, David E.

    Regression analysis is one of the more common analytic tools used by researchers. However, multicollinearity between the predictor variables can cause problems in using the results of regression analyses. Problems associated with multicollinearity include entanglement of relative influences of variables due to reduced precision of estimation,…
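Principal component regression addresses the entanglement described above by regressing on leading components instead of the raw collinear predictors; a minimal sketch on simulated collinear data (all values hypothetical):

```python
import numpy as np

def pcr(X, y, k):
    """Principal component regression: project onto the top-k principal
    directions, regress there, then map the coefficients back."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Vk = Vt[:k].T                      # top-k principal directions
    Z = Xc @ Vk                        # component scores
    gamma, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
    beta = Vk @ gamma                  # back to the original predictor space
    return beta, y.mean() - X.mean(axis=0) @ beta

rng = np.random.default_rng(8)
x1 = rng.normal(size=100)
x2 = x1 + 0.01 * rng.normal(size=100)      # severe collinearity
X = np.column_stack([x1, x2])
y = x1 + x2 + 0.1 * rng.normal(size=100)

beta, b0 = pcr(X, y, k=1)                  # drop the unstable minor component
pred = X @ beta + b0
```

Dropping the minor component discards exactly the direction along which the collinear predictors are indistinguishable, stabilizing the coefficient estimates.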

  7. Use of probabilistic weights to enhance linear regression myoelectric control

    Science.gov (United States)

    Smith, Lauren H.; Kuiken, Todd A.; Hargrove, Levi J.

    2015-12-01

Objective. Clinically available prostheses for transradial amputees do not allow simultaneous myoelectric control of degrees of freedom (DOFs). Linear regression methods can provide simultaneous myoelectric control, but frequently also result in difficulty with isolating individual DOFs when desired. This study evaluated the potential of using probabilistic estimates of categories of gross prosthesis movement, which are commonly used in classification-based myoelectric control, to enhance linear regression myoelectric control. Approach. Gaussian models were fit to electromyogram (EMG) feature distributions for three movement classes at each DOF (no movement, or movement in either direction) and used to weight the output of linear regression models by the probability that the user intended the movement. Eight able-bodied and two transradial amputee subjects worked in a virtual Fitts’ law task to evaluate differences in controllability between linear regression and probability-weighted regression for an intramuscular EMG-based three-DOF wrist and hand system. Main results. Real-time and offline analyses in able-bodied subjects demonstrated that probability weighting improved performance during single-DOF tasks (p < 0.05) by preventing extraneous movement at additional DOFs. Similar results were seen in experiments with two transradial amputees. Though goodness-of-fit evaluations suggested that the EMG feature distributions showed some deviations from the Gaussian, equal-covariance assumptions used in this experiment, the assumptions were sufficiently met to provide improved performance compared to linear regression control. Significance. Use of probability weights can improve the ability to isolate individual DOFs during linear regression myoelectric control, while maintaining the ability to simultaneously control multiple DOFs.
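The weighting scheme described, scaling the regression output by the probability that movement was intended, can be sketched for a single hypothetical DOF with a one-dimensional EMG feature. The class means, variances, and regression gain below are invented; the study used multivariate Gaussians over EMG feature vectors:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def probability_weighted_output(feature, reg_coef, class_models):
    """Scale a linear-regression velocity by P(movement | feature).

    class_models: dict of label -> (mu, sigma) for a 1-D EMG feature; the
    labels 'rest', 'flex', 'extend' form a hypothetical single-DOF example.
    """
    raw = reg_coef * feature                       # linear-regression output
    likelihoods = {k: gaussian_pdf(feature, mu, s)
                   for k, (mu, s) in class_models.items()}
    total = sum(likelihoods.values())
    p_move = (likelihoods['flex'] + likelihoods['extend']) / total
    return raw * p_move

models = {'rest': (0.0, 1.0), 'flex': (5.0, 1.0), 'extend': (-5.0, 1.0)}
near_rest = probability_weighted_output(0.1, reg_coef=0.5, class_models=models)
clear_flex = probability_weighted_output(5.0, reg_coef=0.5, class_models=models)
```

Near the rest class the weight suppresses the regression output almost to zero (preventing extraneous movement), while a clearly intended movement passes through essentially unchanged.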

  8. Development and results of a test program to demonstrate compliance with IEEE STD 384 and R.G. 1.75 electrical separation requirements

    International Nuclear Information System (INIS)

    Eckert, G.P.; Heneberry, E.F.; Walker, F.P.; Konkus, J.F.

    1987-01-01

The IEEE Std 384-1974, entitled "Criteria for Separation of Class 1E Equipment and Circuits," contains criteria to ensure the independence of redundant Class 1E equipment when designing electrical systems in nuclear plants. The NRC, in R.G. 1.75 Rev. 2, 1978, endorses, with comments, IEEE-384 as the means of achieving independence. One method given in IEEE-384 is that of maintaining a specified separation between components; another method utilizes a combination of separation and barriers. The standard also allows alternative methods to be used when justified by test-based analyses. This paper is a report of a test program undertaken to provide a basis for analysis in the development of alternative methods of achieving separation. The test parameters developed and used, and the results obtained, should prove useful in determining alternative methods of complying with R.G. 1.75 requirements

  9. Recursive Algorithm For Linear Regression

    Science.gov (United States)

    Varanasi, S. V.

    1988-01-01

Order of model determined easily. Linear-regression algorithm includes recursive equations for coefficients of model of increased order. Algorithm eliminates duplicative calculations, facilitates search for minimum order of linear-regression model fitting set of data satisfactorily.
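An order-recursive fit of the kind described can be sketched with Gram-Schmidt updates: each added regressor contributes one orthogonalised direction and one inner product, so lower-order results are reused instead of refit from scratch (illustrative code with invented data, not the Tech Brief's algorithm):

```python
import numpy as np

def order_recursive_fit(X, y):
    """Add regressors one at a time; each new contribution comes from the
    part of the new column orthogonal to the columns already in the model."""
    n, p = X.shape
    Q = np.zeros((n, 0))            # orthonormal basis built so far
    yhat = np.zeros(n)
    fits = []                        # fitted values after each model order
    for j in range(p):
        v = X[:, j] - Q @ (Q.T @ X[:, j])   # orthogonalise against the model
        q = v / np.linalg.norm(v)
        yhat = yhat + q * (q @ y)           # one inner product updates the fit
        Q = np.column_stack([Q, q])
        fits.append(yhat.copy())
    return fits

rng = np.random.default_rng(9)
X = rng.normal(size=(60, 3))
y = X @ np.array([2.0, 0.0, -1.0]) + 0.05 * rng.normal(size=60)

fits = order_recursive_fit(X, y)
# The order-3 fit matches a direct least-squares fit of all three regressors.
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Comparing the residual after each order is how the minimum satisfactory model order can be found without repeating earlier work.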

  10. Demonstration of safety in Alzheimer's patients for intervention with an anti-hypertensive drug Nilvadipine: results from a 6-week open label study.

    LENUS (Irish Health Repository)

    Kennelly, S P

    2012-02-01

BACKGROUND: Nilvadipine may lower rates of conversion from mild cognitive impairment to Alzheimer's disease (AD) in hypertensive patients. However, it remains to be determined whether treatment with nilvadipine is safe in AD patients, given the higher incidence of orthostatic hypotension (OH) in this population, who may be more likely to suffer from symptoms associated with the further exaggeration of a drop in BP. OBJECTIVE: The aim of this study was to investigate the safety and tolerability of nilvadipine in AD patients. METHODS: AD patients in the intervention group (n = 56) received nilvadipine 8 mg daily over 6 weeks, compared to the control group (n = 30) who received no intervention. Differences in systolic (SBP) and diastolic (DBP) blood pressure, before and after intervention, were assessed using automated sphygmomanometer readings and ambulatory BP monitors (ABP), and change in OH using a finometer. Reporting of adverse events was monitored throughout the study. RESULTS: There was a significant reduction in the SBP of treated patients compared to non-treated patients but no significant change in DBP. Individuals with higher initial blood pressure (BP) had greater reduction in BP, but individuals with normal BP did not experience much change in their BP. While OH was present in 84% of the patients, there was no further drop in BP recorded on active stand studies. There were no significant differences in adverse event reporting between groups. CONCLUSION: Nilvadipine was well tolerated by patients with AD. This study supports further investigation of its efficacy as a potential treatment for AD.

  11. Results from a 'Proof-of-Concept' Demonstration of RF-Based Tracking of UF6 Cylinders during a Processing Operation at a Uranium Enrichment Plant

    International Nuclear Information System (INIS)

    Pickett, Chris A; Kovacic, Donald N; Whitaker, J Michael; Younkin, James R; Hines, Jairus B; Laughter, Mark D; Morgan, Jim; Carrick, Bernie; Boyer, Brian; Whittle, K.

    2008-01-01

    Approved industry-standard cylinders are used globally for processing, storing, and transporting uranium hexafluoride (UF6) at uranium enrichment plants. To ensure that cylinder movements at enrichment facilities occur as declared, the International Atomic Energy Agency (IAEA) must conduct time-consuming periodic physical inspections to validate facility records, cylinder identity, and containment. By using a robust system design that includes the capability for real-time unattended monitoring (of cylinder movements), site-specific rules-based event detection algorithms, and the capability to integrate with other types of monitoring technologies, one can build a system that will improve overall inspector effectiveness. This type of monitoring system can provide timely detection of safeguard events that could be used to ensure more timely and appropriate responses by the IAEA. It also could reduce reliance on facility records and have the additional benefit of enhancing domestic safeguards at the installed facilities. This paper will discuss the installation and evaluation of a radio-frequency- (RF-) based cylinder tracking system that was installed at a United States Enrichment Corporation Centrifuge Facility. This system was installed primarily to evaluate the feasibility of using RF technology at a site and the operational durability of the components under harsh processing conditions. The installation included a basic system that is designed to support layering with other safeguard system technologies and that applies fundamental rules-based event processing methodologies. This paper will discuss the fundamental elements of the system design, the results from this site installation, and future efforts needed to make this technology ready for IAEA consideration.

  12. Combining Alphas via Bounded Regression

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-11-01

    We give an explicit algorithm and source code for combining alpha streams via bounded regression. In practical applications, typically, there is insufficient history to compute a sample covariance matrix (SCM) for a large number of alphas. To compute alpha allocation weights, one then resorts to (weighted) regression over SCM principal components. Regression often produces alpha weights with insufficient diversification and/or skewed distribution against, e.g., turnover. This can be rectified by imposing bounds on alpha weights within the regression procedure. Bounded regression can also be applied to stock and other asset portfolio construction. We discuss illustrative examples.

  13. Regression in autistic spectrum disorders.

    Science.gov (United States)

    Stefanatos, Gerry A

    2008-12-01

    A significant proportion of children diagnosed with Autistic Spectrum Disorder experience a developmental regression characterized by a loss of previously-acquired skills. This may involve a loss of speech or social responsivity, but often entails both. This paper critically reviews the phenomenon of regression in autistic spectrum disorders, highlighting the characteristics of regression, age of onset, temporal course, and long-term outcome. Important considerations for diagnosis are discussed and multiple etiological factors currently hypothesized to underlie the phenomenon are reviewed. It is argued that regressive autistic spectrum disorders can be conceptualized on a spectrum with other regressive disorders that may share common pathophysiological features. The implications of this viewpoint are discussed.

  14. Linear regression in astronomy. I

    Science.gov (United States)

    Isobe, Takashi; Feigelson, Eric D.; Akritas, Michael G.; Babu, Gutti Jogesh

    1990-01-01

    Five methods for obtaining linear regression fits to bivariate data with unknown or insignificant measurement errors are discussed: ordinary least-squares (OLS) regression of Y on X, OLS regression of X on Y, the bisector of the two OLS lines, orthogonal regression, and 'reduced major-axis' regression. These methods have been used by various researchers in observational astronomy, most importantly in cosmic distance scale applications. Formulas for calculating the slope and intercept coefficients and their uncertainties are given for all the methods, including a new general form of the OLS variance estimates. The accuracy of the formulas was confirmed using numerical simulations. The applicability of the procedures is discussed with respect to their mathematical properties, the nature of the astronomical data under consideration, and the scientific purpose of the regression. It is found that, for problems needing symmetrical treatment of the variables, the OLS bisector performs significantly better than orthogonal or reduced major-axis regression.
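
    The symmetric line-fitting methods surveyed in this record can be illustrated with a short sketch (an illustration on synthetic data, not code from the paper; the function name is ours): it computes the OLS(Y|X) slope, the slope implied by OLS(X|Y), and the slope of their bisector, which the record recommends for problems needing symmetrical treatment of the variables.

```python
import numpy as np

def ols_bisector(x, y):
    """Slopes of OLS(Y|X), OLS(X|Y), and the bisector of the two OLS lines."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    sxx, syy, sxy = np.sum(x * x), np.sum(y * y), np.sum(x * y)
    b1 = sxy / sxx                      # OLS regression of Y on X
    b2 = syy / sxy                      # slope implied by OLS regression of X on Y
    # Bisector of the two OLS lines (symmetric in the two variables)
    b3 = (b1 * b2 - 1.0 + np.sqrt((1.0 + b1 ** 2) * (1.0 + b2 ** 2))) / (b1 + b2)
    return b1, b2, b3

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(scale=0.5, size=200)
b1, b2, b3 = ols_bisector(x, y)
```

    For positively correlated data the bisector slope always lies strictly between the two OLS slopes.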

  15. Foothill Transit Battery Electric Bus Demonstration Results

    Energy Technology Data Exchange (ETDEWEB)

    Eudy, Leslie [National Renewable Energy Lab. (NREL), Golden, CO (United States); Prohaska, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kelly, Kenneth [National Renewable Energy Lab. (NREL), Golden, CO (United States); Post, Matthew [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-27

    Foothill Transit is collaborating with the California Air Resources Board and the U.S. Department of Energy's (DOE's) National Renewable Energy Laboratory (NREL) to evaluate its fleet of Proterra battery electric buses (BEBs) in revenue service. The focus of this evaluation is to compare performance of the BEBs to that of conventional technology and to track progress over time toward meeting performance targets. This project has also provided an opportunity for DOE to conduct a detailed evaluation of the BEBs and charging infrastructure. This report provides data on the buses from April 2014 through July 2015. Data are provided on a selection of compressed natural gas buses as a baseline comparison.

  16. Finite Algorithms for Robust Linear Regression

    DEFF Research Database (Denmark)

    Madsen, Kaj; Nielsen, Hans Bruun

    1990-01-01

    The Huber M-estimator for robust linear regression is analyzed. Newton type methods for solution of the problem are defined and analyzed, and finite convergence is proved. Numerical experiments with a large number of test problems demonstrate efficiency and indicate that this kind of approach may...

  17. Regression of environmental noise in LIGO data

    International Nuclear Information System (INIS)

    Tiwari, V; Klimenko, S; Mitselmakher, G; Necula, V; Drago, M; Prodi, G; Frolov, V; Yakushin, I; Re, V; Salemi, F; Vedovato, G

    2015-01-01

    We address the problem of noise regression in the output of gravitational-wave (GW) interferometers, using data from the physical environmental monitors (PEM). The objective of the regression analysis is to predict environmental noise in the GW channel from the PEM measurements. One of the most promising regression methods is based on the construction of Wiener–Kolmogorov (WK) filters. Using this method, the seismic noise cancellation from the LIGO GW channel has already been performed. In the presented approach the WK method has been extended, incorporating banks of Wiener filters in the time–frequency domain, multi-channel analysis and regulation schemes, which greatly enhance the versatility of the regression analysis. Also we present the first results on regression of the bi-coherent noise in the LIGO data. (paper)
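
    A minimal sketch of the Wiener-filter idea described above, on invented synthetic data (the witness channel, coupling kernel, and lag window are assumptions for illustration): an FIR filter predicting the target channel from lagged copies of a witness channel is obtained by least squares, and the predicted environmental noise is subtracted.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
pem = rng.normal(size=n)                       # witness (environmental) channel
# Assumed coupling: the GW channel contains a filtered copy of the witness
# plus uncorrelated instrumental noise.
coupling = np.convolve(pem, [0.8, -0.3, 0.1], mode="same")
gw = coupling + 0.2 * rng.normal(size=n)

# FIR Wiener filter over a small window of lags, solved by least squares
# (the least-squares solution of the Wiener-Kolmogorov normal equations).
X = np.column_stack([np.roll(pem, k) for k in range(-2, 3)])
w, *_ = np.linalg.lstsq(X, gw, rcond=None)
residual = gw - X @ w                          # GW channel after noise regression
var_ratio = residual.var() / gw.var()
```

    The residual variance approaches the variance of the uncorrelated instrumental noise, which is the best any linear regression on the witness can achieve.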

  18. Advanced statistics: linear regression, part I: simple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
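
    The method of least squares mentioned in this record has a simple closed form for the slope and intercept; a minimal sketch (the function name and data are ours, for illustration):

```python
import numpy as np

def least_squares_line(x, y):
    """Closed-form least-squares fit of y = a + b*x."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    a = y.mean() - b * x.mean()
    return a, b

# Data lying exactly on y = 1 + 2x, so the fit recovers the coefficients exactly
a, b = least_squares_line([1, 2, 3, 4], [3, 5, 7, 9])
```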

  19. T-Cell Therapy Using Interleukin-21-Primed Cytotoxic T-Cell Lymphocytes Combined With Cytotoxic T-Cell Lymphocyte Antigen-4 Blockade Results in Long-Term Cell Persistence and Durable Tumor Regression.

    Science.gov (United States)

    Chapuis, Aude G; Roberts, Ilana M; Thompson, John A; Margolin, Kim A; Bhatia, Shailender; Lee, Sylvia M; Sloan, Heather L; Lai, Ivy P; Farrar, Erik A; Wagener, Felecia; Shibuya, Kendall C; Cao, Jianhong; Wolchok, Jedd D; Greenberg, Philip D; Yee, Cassian

    2016-11-01

    Purpose Peripheral blood-derived antigen-specific cytotoxic T cells (CTLs) provide a readily available source of effector cells that can be administered with minimal toxicity in an outpatient setting. In metastatic melanoma, this approach results in measurable albeit modest clinical responses in patients resistant to conventional therapy. We reasoned that concurrent cytotoxic T-cell lymphocyte antigen-4 (CTLA-4) checkpoint blockade might enhance the antitumor activity of adoptively transferred CTLs. Patients and Methods Autologous MART1-specific CTLs were generated by priming with peptide-pulsed dendritic cells in the presence of interleukin-21 and enriched by peptide-major histocompatibility complex multimer-guided cell sorting. This expeditiously yielded polyclonal CTL lines uniformly expressing markers associated with an enhanced survival potential. In this first-in-human strategy, 10 patients with stage IV melanoma received the MART1-specific CTLs followed by a standard course of anti-CTLA-4 (ipilimumab). Results The toxicity profile of the combined treatment was comparable to that of ipilimumab monotherapy. Evaluation of best responses at 12 weeks yielded two continuous complete remissions, one partial response (PR) using RECIST criteria (two PRs using immune-related response criteria), and three instances of stable disease. Infused CTLs persisted with frequencies up to 2.9% of CD8+ T cells for as long as the patients were monitored (up to 40 weeks). In patients who experienced complete remissions, PRs, or stable disease, the persisting CTLs acquired phenotypic and functional characteristics of long-lived memory cells. Moreover, these patients also developed responses to nontargeted tumor antigens (epitope spreading). Conclusion We demonstrate that combining antigen-specific CTLs with CTLA-4 blockade is safe and produces durable clinical responses, likely reflecting both enhanced activity of transferred cells and improved recruitment of new responses.

  20. T-Cell Therapy Using Interleukin-21–Primed Cytotoxic T-Cell Lymphocytes Combined With Cytotoxic T-Cell Lymphocyte Antigen-4 Blockade Results in Long-Term Cell Persistence and Durable Tumor Regression

    Science.gov (United States)

    Chapuis, Aude G.; Roberts, Ilana M.; Thompson, John A.; Margolin, Kim A.; Bhatia, Shailender; Lee, Sylvia M.; Sloan, Heather L.; Lai, Ivy P.; Farrar, Erik A.; Wagener, Felecia; Shibuya, Kendall C.; Cao, Jianhong; Wolchok, Jedd D.; Greenberg, Philip D.

    2016-01-01

    Purpose Peripheral blood–derived antigen-specific cytotoxic T cells (CTLs) provide a readily available source of effector cells that can be administered with minimal toxicity in an outpatient setting. In metastatic melanoma, this approach results in measurable albeit modest clinical responses in patients resistant to conventional therapy. We reasoned that concurrent cytotoxic T-cell lymphocyte antigen-4 (CTLA-4) checkpoint blockade might enhance the antitumor activity of adoptively transferred CTLs. Patients and Methods Autologous MART1-specific CTLs were generated by priming with peptide-pulsed dendritic cells in the presence of interleukin-21 and enriched by peptide-major histocompatibility complex multimer-guided cell sorting. This expeditiously yielded polyclonal CTL lines uniformly expressing markers associated with an enhanced survival potential. In this first-in-human strategy, 10 patients with stage IV melanoma received the MART1-specific CTLs followed by a standard course of anti–CTLA-4 (ipilimumab). Results The toxicity profile of the combined treatment was comparable to that of ipilimumab monotherapy. Evaluation of best responses at 12 weeks yielded two continuous complete remissions, one partial response (PR) using RECIST criteria (two PRs using immune-related response criteria), and three instances of stable disease. Infused CTLs persisted with frequencies up to 2.9% of CD8+ T cells for as long as the patients were monitored (up to 40 weeks). In patients who experienced complete remissions, PRs, or stable disease, the persisting CTLs acquired phenotypic and functional characteristics of long-lived memory cells. Moreover, these patients also developed responses to nontargeted tumor antigens (epitope spreading). Conclusion We demonstrate that combining antigen-specific CTLs with CTLA-4 blockade is safe and produces durable clinical responses, likely reflecting both enhanced activity of transferred cells and improved recruitment of new responses.

  1. Quality of life in breast cancer patients--a quantile regression analysis.

    Science.gov (United States)

    Pourhoseingholi, Mohamad Amin; Safaee, Azadeh; Moghimi-Dehkordi, Bijan; Zeighami, Bahram; Faghihzadeh, Soghrat; Tabatabaee, Hamid Reza; Pourhoseingholi, Asma

    2008-01-01

    Quality of life studies play an important role in health care, especially in chronic diseases, in clinical judgment, and in the allocation of medical resources. Statistical tools like linear regression are widely used to assess the predictors of quality of life, but when the response is not normally distributed the results are misleading. The aim of this study is to determine the predictors of quality of life in breast cancer patients using a quantile regression model and to compare it with linear regression. A cross-sectional study was conducted on 119 breast cancer patients admitted and treated in the chemotherapy ward of Namazi hospital in Shiraz. We used the QLQ-C30 questionnaire to assess quality of life in these patients. A quantile regression was employed to assess the associated factors, and the results were compared to linear regression. All analyses were carried out using SAS. The mean score for the global health status for breast cancer patients was 64.92+/-11.42. Linear regression showed that only grade of tumor, occupational status, menopausal status, financial difficulties, and dyspnea were statistically significant. In contrast to linear regression, financial difficulties were not significant in the quantile regression analysis, and dyspnea was significant only for the first quartile. Emotional functioning and duration of disease also statistically predicted the QOL score in the third quartile. The results demonstrate that using quantile regression leads to better interpretation and richer inference about predictors of breast cancer patient quality of life.
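
    The pinball (check) loss that quantile regression minimizes can be demonstrated in a few lines (synthetic data, assumed for illustration): minimizing it over a constant recovers the sample quantile, which is the special case that quantile regression generalizes by letting the quantile depend on covariates.

```python
import numpy as np

def pinball_loss(u, q):
    """Check (pinball) loss: the objective that quantile regression minimizes."""
    return np.where(u >= 0, q * u, (q - 1.0) * u)

rng = np.random.default_rng(2)
y = rng.exponential(size=5000)

# Minimizing the mean pinball loss over a constant c recovers the sample
# q-quantile; quantile regression replaces the constant with a linear predictor.
q = 0.9
grid = np.linspace(0.0, 5.0, 2001)
losses = [pinball_loss(y - c, q).mean() for c in grid]
c_star = grid[int(np.argmin(losses))]
```

    For the skewed (exponential) response used here, the 0.9-quantile sits far from the mean, which is exactly the situation where a mean-based linear regression can mislead.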

  2. Linear regression in astronomy. II

    Science.gov (United States)

    Feigelson, Eric D.; Babu, Gutti J.

    1992-01-01

    A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.

  3. Time-adaptive quantile regression

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik

    2008-01-01

    An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.

  4. Quantile regression theory and applications

    CERN Document Server

    Davino, Cristina; Vistocco, Domenico

    2013-01-01

    A guide to the implementation and interpretation of quantile regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues on validity of the model and diagnostic tools. Each methodological aspect is explored and

  5. Variable and subset selection in PLS regression

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar

    2001-01-01

    The purpose of this paper is to present some useful methods for introductory analysis of variables and subsets in relation to PLS regression. We present here methods that are efficient in finding the appropriate variables or subset to use in the PLS regression. The general conclusion is that variable selection is important for successful analysis of chemometric data. An important aspect of the results presented is that lack of variable selection can spoil the PLS regression, and that cross-validation measures using a test set can show larger variation, when we use different subsets of X, than

  6. Removing Malmquist bias from linear regressions

    Science.gov (United States)

    Verter, Frances

    1993-01-01

    Malmquist bias is present in all astronomical surveys where sources are observed above an apparent brightness threshold. Those sources which can be detected at progressively larger distances are progressively more limited to the intrinsically luminous portion of the true distribution. This bias does not distort any of the measurements, but distorts the sample composition. We have developed the first treatment to correct for Malmquist bias in linear regressions of astronomical data. A demonstration of the corrected linear regression that is computed in four steps is presented.

  7. Two Paradoxes in Linear Regression Analysis

    Science.gov (United States)

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  8. Caudal regression syndrome : a case report

    International Nuclear Information System (INIS)

    Lee, Eun Joo; Kim, Hi Hye; Kim, Hyung Sik; Park, So Young; Han, Hye Young; Lee, Kwang Hun

    1998-01-01

    Caudal regression syndrome is a rare congenital anomaly, which results from a developmental failure of the caudal mesoderm during the fetal period. We present a case of caudal regression syndrome composed of a spectrum of anomalies including sirenomelia, dysplasia of the lower lumbar vertebrae, sacrum, coccyx and pelvic bones, genitourinary and anorectal anomalies, and dysplasia of the lung, as seen during infantography and MR imaging.

  9. Caudal regression syndrome : a case report

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eun Joo; Kim, Hi Hye; Kim, Hyung Sik; Park, So Young; Han, Hye Young; Lee, Kwang Hun [Chungang Gil Hospital, Incheon (Korea, Republic of)

    1998-07-01

    Caudal regression syndrome is a rare congenital anomaly, which results from a developmental failure of the caudal mesoderm during the fetal period. We present a case of caudal regression syndrome composed of a spectrum of anomalies including sirenomelia, dysplasia of the lower lumbar vertebrae, sacrum, coccyx and pelvic bones, genitourinary and anorectal anomalies, and dysplasia of the lung, as seen during infantography and MR imaging.

  10. Panel Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    González, Andrés; Terasvirta, Timo; Dijk, Dick van

    We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...

  11. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin

    2017-01-19

    In nonparametric regression, one often needs to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13] (H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi: 10.1214/aos/1018031100).

  12. Testing discontinuities in nonparametric regression

    KAUST Repository

    Dai, Wenlin; Zhou, Yuejin; Tong, Tiejun

    2017-01-01

    In nonparametric regression, one often needs to detect whether there are jump discontinuities in the mean function. In this paper, we revisit the difference-based method in [13] (H.-G. Müller and U. Stadtmüller, Discontinuous versus smooth regression, Ann. Stat. 27 (1999), pp. 299–337, doi: 10.1214/aos/1018031100).

  13. Logistic Regression: Concept and Application

    Science.gov (United States)

    Cokluk, Omay

    2010-01-01

    The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…
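
    A sketch of the binary logistic regression described above, fit by Newton-Raphson (the standard IRLS approach); the function name, data, and coefficients are simulated for illustration and are not from the study:

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Binary logistic regression fit by Newton-Raphson (IRLS)."""
    X = np.column_stack([np.ones(len(X)), X])       # prepend an intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))         # predicted group probabilities
        H = X.T @ (X * (p * (1.0 - p))[:, None])    # observed information matrix
        beta += np.linalg.solve(H, X.T @ (y - p))   # Newton step
    return beta

rng = np.random.default_rng(3)
x = rng.normal(size=500)
p_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * x)))     # true model: logit = 0.5 + 1.5x
y = (rng.random(500) < p_true).astype(float)        # dichotomous group membership
beta = fit_logistic(x[:, None], y)
```

    The fitted coefficients approximate the true logit parameters, and thresholding the fitted probabilities at 0.5 classifies individuals into the two groups.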

  14. Ordinary least square regression, orthogonal regression, geometric mean regression and their applications in aerosol science

    International Nuclear Information System (INIS)

    Leng Ling; Zhang Tianyi; Kleinman, Lawrence; Zhu Wei

    2007-01-01

    Regression analysis, especially the ordinary least squares method which assumes that errors are confined to the dependent variable, has seen a fair share of its applications in aerosol science. The ordinary least squares approach, however, could be problematic due to the fact that atmospheric data often does not lend itself to calling one variable independent and the other dependent. Errors often exist for both measurements. In this work, we examine two regression approaches available to accommodate this situation. They are orthogonal regression and geometric mean regression. Comparisons are made theoretically as well as numerically through an aerosol study examining whether the ratio of organic aerosol to CO would change with age.
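
    The three approaches compared in this record can be sketched side by side (synthetic errors-in-variables data, assumed for illustration; the function name is ours): ordinary least squares attenuates the slope when both variables carry error, while geometric mean and orthogonal regression treat the variables symmetrically.

```python
import numpy as np

def symmetric_slopes(x, y):
    """OLS(Y|X), geometric mean, and orthogonal regression slopes."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    sxx, syy, sxy = np.sum(x * x), np.sum(y * y), np.sum(x * y)
    b_ols = sxy / sxx
    b_gmr = np.sign(sxy) * np.sqrt(syy / sxx)       # geometric mean regression
    # Orthogonal regression: slope of the major axis of the scatter
    b_orth = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4.0 * sxy ** 2)) / (2.0 * sxy)
    return b_ols, b_gmr, b_orth

rng = np.random.default_rng(4)
t = rng.normal(size=1000)                 # latent true values; true slope is 1
x = t + 0.3 * rng.normal(size=1000)       # measurement error in x
y = t + 0.3 * rng.normal(size=1000)       # measurement error in y
b_ols, b_gmr, b_orth = symmetric_slopes(x, y)
```

    With equal error variances in both variables, the symmetric estimators recover a slope near 1 while OLS is biased low.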

  15. Fiscal 1999 research result report on energy and environment technology demonstration research support project (International joint demonstration research project). Japan- Russia joint demonstration research on large-capacity long- distant DC power transmission technology; 1999 nendo daiyoryo denryoku no chokyori chokuryu soden gijutsu ni kansuru Russia kenkyu kikan tono kyodo jissho kenkyu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Under the assumption of a large-capacity DC power transmission model project in the Far East and Siberia districts, a technical study was made of the basic design of the project, considering selection of concrete power generation and consumption sites and power transmission routes, power transmission/transformation equipment, cables, and environmental impact. A study was also made of its applicability to similar projects in Japan. The model project aims at integration of the Eastern Integrated Power System in the Far East with the Russian Unified Power System, and at development of the abundant undeveloped hydraulic and tidal power generation in the Far East. The study results showed that (1) construction of a high-voltage DC power transmission (HVDC) system of the model project's class of voltage and capacity in Eastern Siberia is technically feasible, (2) the total construction cost of the model project, scheduled to be put into operation in 2025, amounts to nearly $4.7 billion, and (3) the model project is environment-friendly, with no CO2 gas emission, because of its reliance on hydraulic and tidal power generation. (NEDO)

  16. Sparse Regression by Projection and Sparse Discriminant Analysis

    KAUST Repository

    Qi, Xin

    2015-04-03

    Recent years have seen active developments of various penalized regression methods, such as LASSO and elastic net, to analyze high-dimensional data. In these approaches, the direction and length of the regression coefficients are determined simultaneously. Due to the introduction of penalties, the length of the estimates can be far from being optimal for accurate predictions. We introduce a new framework, regression by projection, and its sparse version to analyze high-dimensional data. The unique nature of this framework is that the directions of the regression coefficients are inferred first, and the lengths and the tuning parameters are determined by a cross-validation procedure to achieve the largest prediction accuracy. We provide a theoretical result for simultaneous model selection consistency and parameter estimation consistency of our method in high dimension. This new framework is then generalized such that it can be applied to principal components analysis, partial least squares, and canonical correlation analysis. We also adapt this framework for discriminant analysis. Compared with the existing methods, where there is relatively little control of the dependency among the sparse components, our method can control the relationships among the components. We present efficient algorithms and related theory for solving the sparse regression by projection problem. Based on extensive simulations and real data analysis, we demonstrate that our method achieves good predictive performance and variable selection in the regression setting, and the ability to control relationships between the sparse components leads to more accurate classification. In supplementary materials available online, the details of the algorithms and theoretical proofs, and R codes for all simulation studies are provided.

  17. Advanced statistics: linear regression, part II: multiple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
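
    A minimal sketch of multiple linear regression with two predictors (simulated data, for illustration only): the coefficients are solved jointly, so each one is adjusted for the other predictor, as the record discusses.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# True model: y = 1 + 2*x1 - 0.5*x2 + small noise
y = 1.0 + 2.0 * x1 - 0.5 * x2 + 0.1 * rng.normal(size=n)

# Design matrix with an intercept column; least squares yields a unique
# solution when the predictors are not collinear.
X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

    Strong correlation between x1 and x2 (multicollinearity) would inflate the variance of these estimates even though the least-squares solution remains unique.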

  18. Logic regression and its extensions.

    Science.gov (United States)

    Schwender, Holger; Ruczinski, Ingo

    2010-01-01

    Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature.

  19. A Solution to Separation and Multicollinearity in Multiple Logistic Regression.

    Science.gov (United States)

    Shen, Jianzhao; Gao, Sujuan

    2008-10-01

    In dementia screening tests, item selection for shortening an existing screening test can be achieved using multiple logistic regression. However, maximum likelihood estimates for such logistic regression models often suffer serious bias, or may even fail to exist, because of separation and multicollinearity problems resulting from a large number of highly correlated items. Firth (1993, Biometrika, 80(1), 27-38) proposed a penalized likelihood estimator for generalized linear models, which was shown to reduce bias and the non-existence problem. Ridge regression has been used in logistic regression to stabilize the estimates in cases of multicollinearity. However, neither method resolves both problems on its own. In this paper, we propose a double penalized maximum likelihood estimator combining Firth's penalized likelihood equation with a ridge parameter. We present a simulation study evaluating the empirical performance of the double penalized likelihood estimator in small to moderate sample sizes. We demonstrate the proposed approach using current screening data from a community-based dementia study.
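
    As a rough illustration of the ridge idea discussed above (not Firth's double-penalized estimator), the following pure-Python sketch fits a logistic regression by gradient ascent on an L2-penalized log-likelihood; all function and variable names are hypothetical:

```python
import math

def fit_ridge_logistic(X, y, lam=1.0, lr=0.1, iters=5000):
    """Logistic regression with an L2 (ridge) penalty, fitted by plain
    gradient ascent on the penalized log-likelihood.
    X: list of feature vectors, y: list of 0/1 labels.
    For simplicity the penalty is applied to every coefficient."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        grad = [0.0] * p
        for xi, yi in zip(X, y):
            z = sum(b * x for b, x in zip(beta, xi))
            mu = 1.0 / (1.0 + math.exp(-z))
            for j in range(p):
                grad[j] += (yi - mu) * xi[j]
        # the ridge term shrinks coefficients toward zero, keeping the
        # estimate finite even when the classes are perfectly separated
        for j in range(p):
            beta[j] += lr * (grad[j] / n - lam * beta[j] / n)
    return beta
```

    With separated data the unpenalized maximum likelihood estimate diverges, while the ridge-penalized estimate stays finite and shrinks further as the penalty grows.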

  20. Geographically Weighted Logistic Regression Applied to Credit Scoring Models

    Directory of Open Access Journals (Sweden)

    Pedro Henrique Melo Albuquerque

    Full Text Available Abstract This study used real data from a Brazilian financial institution on transactions involving Consumer Direct Credit (CDC), granted to clients residing in the Distrito Federal (DF), to construct credit scoring models via Logistic Regression and Geographically Weighted Logistic Regression (GWLR) techniques. The aims were: to verify whether the factors that influence credit risk differ according to the borrower’s geographic location; to compare the set of models estimated via GWLR with the global model estimated via Logistic Regression, in terms of predictive power and financial losses for the institution; and to verify the viability of using the GWLR technique to develop credit scoring models. The metrics used to compare the models developed via the two techniques were the AICc informational criterion, the accuracy of the models, the percentage of false positives, the sum of the value of false positive debt, and the expected monetary value of portfolio default compared with the monetary value of defaults observed. The models estimated for each region in the DF were distinct in their variables and coefficients (parameters), leading to the conclusion that credit risk was influenced differently in each region in the study. The Logistic Regression and GWLR methodologies presented very close results in terms of predictive power and financial losses for the institution, and the study demonstrated the viability of using the GWLR technique to develop credit scoring models for the target population in the study.

  1. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying; Carroll, Raymond J.

    2009-01-01

    . The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a

  2. From Rasch scores to regression

    DEFF Research Database (Denmark)

    Christensen, Karl Bang

    2006-01-01

    Rasch models provide a framework for measurement and modelling of latent variables. Having measured a latent variable in a population, a comparison of groups will often be of interest. For this purpose the use of observed raw scores will often be inadequate because these lack interval scale properties.... This paper compares two approaches to group comparison: linear regression models using estimated person locations as outcome variables and latent regression models based on the distribution of the score....

  3. Testing Heteroscedasticity in Robust Regression

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2011-01-01

    Roč. 1, č. 4 (2011), s. 25-28 ISSN 2045-3345 Grant - others:GA ČR(CZ) GA402/09/0557 Institutional research plan: CEZ:AV0Z10300504 Keywords: robust regression * heteroscedasticity * regression quantiles * diagnostics Subject RIV: BB - Applied Statistics, Operational Research http://www.researchjournals.co.uk/documents/Vol4/06%20Kalina.pdf

  4. Regression methods for medical research

    CERN Document Server

    Tai, Bee Choo

    2013-01-01

    Regression Methods for Medical Research provides medical researchers with the skills they need to critically read and interpret research using more advanced statistical methods. The statistical requirements of interpreting and publishing in medical journals, together with rapid changes in science and technology, increasingly demand an understanding of more complex and sophisticated analytic procedures. The text explains the application of statistical models to a wide variety of practical medical investigative studies and clinical trials. Regression methods are used to appropriately answer the

  5. Forecasting with Dynamic Regression Models

    CERN Document Server

    Pankratz, Alan

    2012-01-01

    One of the most widely used tools in statistical forecasting, the single-equation regression model, is examined here. A companion to the author's earlier work, Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, the present text pulls together recent time series ideas and gives special attention to possible intertemporal patterns, distributed lag responses of output to input series, and the autocorrelation patterns of regression disturbances. It also includes six case studies.

  6. Five-year results from a prospective multicentre study of percutaneous pulmonary valve implantation demonstrate sustained removal of significant pulmonary regurgitation, improved right ventricular outflow tract obstruction and improved quality of life

    DEFF Research Database (Denmark)

    Hager, Alfred; Schubert, Stephan; Ewert, Peter

    2017-01-01

    . The EQ-5D quality of life utility index and visual analogue scale scores were both significantly improved six months post PPVI and remained so at five years. CONCLUSIONS: Five-year results following PPVI demonstrate resolved moderate or severe pulmonary regurgitation, improved right ventricular outflow...

  7. Distributed picture compilation demonstration

    Science.gov (United States)

    Alexander, Richard; Anderson, John; Leal, Jeff; Mullin, David; Nicholson, David; Watson, Graham

    2004-08-01

    A physical demonstration of distributed surveillance and tracking is described. The demonstration environment is an outdoor car park overlooked by a system of four rooftop cameras. The cameras extract moving objects from the scene, and these objects are tracked in a decentralized way, over a real communication network, using the information form of the standard Kalman filter. Each node therefore has timely access to the complete global picture and because there is no single point of failure in the system, it is robust. The demonstration system and its main components are described here, with an emphasis on some of the lessons we have learned as a result of applying a corpus of distributed data fusion theory and algorithms in practice. Initial results are presented and future plans to scale up the network are also outlined.
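
    The decentralized tracking scheme rests on the additivity of the information form of the Kalman filter: each node contributes an information pair, and the pairs simply sum, so no node is a single point of failure. A minimal scalar sketch of that fusion step, with hypothetical names (not the demonstration system's code):

```python
def fuse_information(measurements, variances):
    """Scalar information-form fusion: each node contributes
    information Y_i = 1/R_i and information state y_i = z_i/R_i;
    the sums give the global estimate without a central node."""
    Y = sum(1.0 / r for r in variances)
    y = sum(z / r for z, r in zip(measurements, variances))
    return y / Y, 1.0 / Y  # fused estimate and its variance
```

    This is the familiar inverse-variance weighting: precise sensors dominate the fused estimate, and the fused variance is smaller than any single sensor's.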

  8. Improving asthma-related health outcomes among low-income, multiethnic, school-aged children: results of a demonstration project that combined continuous quality improvement and community health worker strategies.

    Science.gov (United States)

    Fox, Patrick; Porter, Patricia G; Lob, Sibylle H; Boer, Jennifer Holloman; Rocha, David A; Adelson, Joel W

    2007-10-01

    The purpose of this work was to improve asthma-related health outcomes in an ethnically and geographically disparate population of economically disadvantaged school-aged children, using a team-based approach that combined continuous quality improvement and community health workers. A demonstration project was conducted with 7 community clinics treating approximately 3000 children with asthma 5 to 18 years of age. The overall clinic population with asthma was assessed for care-process changes through random cross-sectional chart reviews at baseline and 24 months (N = 560). A subset of patients with either moderate or severe persistent asthma or poorly controlled asthma (N = 405) was followed longitudinally for specific asthma-related clinical outcomes, satisfaction with care, and confidence managing asthma by family interview at baseline and at 12 or 24 months. Patient-centered and care-process outcomes included patient/parent assessment of quality of care and confidence in self-management, asthma action plan review, and documentation of guideline-based indicators of quality of care. Direct clinical outcomes included daytime and nighttime symptoms, use of rescue medications, acute care and emergency department visits, hospitalizations, and missed school days. Each clinic site's degree of adherence to the intervention model was evaluated and ranked to examine the correlation between model adherence and outcomes. Cross-sectional data showed clinic-wide improvements in the documentation of asthma severity, review of action plans, health services use, and asthma symptoms. At follow-up in the longitudinal sample, fewer patients reported acute visits, emergency department visits, hospitalizations, frequent daytime and nighttime symptoms, and missed school days compared with baseline. More patients reported excellent or very good quality of care and confidence in asthma self-management. Linear regression analysis of the clinical sites' model adherence ranks against site

  9. Inseparable Phone Books Demonstration

    Science.gov (United States)

    Balta, Nuri; Çetin, Ali

    2017-01-01

    This study is aimed at, first, introducing a well-known discrepant event, inseparable phone books, and second, turning it into an experiment for high school or middle school students. This discrepant event could be used especially to indicate how friction force can produce an unexpected result. Demonstration, discussion, explanation…

  10. Comparison of multinomial logistic regression and logistic regression: which is more efficient in allocating land use?

    Science.gov (United States)

    Lin, Yingzhi; Deng, Xiangzheng; Li, Xing; Ma, Enjun

    2014-12-01

    Spatially explicit simulation of land use change is the basis for estimating the effects of land use and cover change on energy fluxes, ecology and the environment. At the pixel level, logistic regression is one of the most common approaches used in spatially explicit land use allocation models to determine the relationship between land use and its causal factors in driving land use change, and thereby to evaluate land use suitability. However, these models have a drawback in that they do not determine/allocate land use based on the direct relationship between land use change and its driving factors. Consequently, a multinomial logistic regression method was introduced to address this flaw, and thereby to judge the suitability of a type of land use in any given pixel in a case study area of the Jiangxi Province, China. A comparison of the two regression methods indicated that the proportion of correctly allocated pixels using multinomial logistic regression was 92.98%, which was 8.47 percentage points higher than that obtained using logistic regression. Paired t-test results also showed that pixels were more clearly distinguished by multinomial logistic regression than by logistic regression. In conclusion, multinomial logistic regression is a more efficient and accurate method for the spatial allocation of land use changes. The application of this method in future land use change studies may improve the accuracy of predicting the effects of land use and cover change on energy fluxes, ecology, and environment.
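
    A multinomial (softmax) model scores every land-use class at once and allocates the class with the highest probability. A schematic sketch of that allocation step; the coefficients are invented for illustration and are not from the Jiangxi study:

```python
import math

def softmax_probs(x, betas):
    """Multinomial-logit class probabilities for feature vector x,
    given one coefficient vector per class (a reference class can be
    represented by an all-zero coefficient vector)."""
    scores = [sum(b * xi for b, xi in zip(beta, x)) for beta in betas]
    mx = max(scores)                      # subtract max for stability
    exps = [math.exp(s - mx) for s in scores]
    tot = sum(exps)
    return [e / tot for e in exps]

def allocate(x, betas):
    """Allocate the pixel to the class with the highest probability."""
    probs = softmax_probs(x, betas)
    return max(range(len(probs)), key=lambda k: probs[k])
```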

  11. Optimized support vector regression for drilling rate of penetration estimation

    Science.gov (United States)

    Bodaghi, Asadollah; Ansari, Hamid Reza; Gholami, Mahsa

    2015-12-01

    In the petroleum industry, drilling optimization involves the selection of operating conditions for achieving the desired depth with the minimum expenditure while requirements of personal safety, environment protection, adequate information of penetrated formations and productivity are fulfilled. Since drilling optimization is highly dependent on the rate of penetration (ROP), estimation of this parameter is of great importance during well planning. In this research, a novel approach called "optimized support vector regression" is employed for making a formulation between input variables and ROP. Algorithms used for optimizing the support vector regression are the genetic algorithm (GA) and the cuckoo search algorithm (CS). Optimization implementation improved the support vector regression performance by virtue of selecting proper values for its parameters. In order to evaluate the ability of optimization algorithms in enhancing SVR performance, their results were compared to the hybrid of pattern search and grid search (HPG), which is conventionally employed for optimizing SVR. The results demonstrated that the CS algorithm achieved further improvement in the prediction accuracy of SVR compared to both the GA and HPG. Moreover, the predictive model derived from back propagation neural network (BPNN), which is the traditional approach for estimating ROP, is selected for comparison with CSSVR. The comparative results revealed the superiority of CSSVR. This study inferred that CSSVR is a viable option for precise estimation of ROP.

  12. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
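
    The equivalent two-sample problem described above can be fed into the standard normal-approximation sample-size formula for comparing two proportions; in the logistic case the two proportions can be taken as the response probabilities at covariate values one standard deviation below and above the mean, so that the parameters differ by the slope times twice the standard deviation. A sketch (the inverse-normal quantile is computed by bisection purely for self-containment; function names are hypothetical, and this is the generic formula, not the authors' exact derivation):

```python
import math

def normal_quantile(p):
    """Inverse standard normal CDF via bisection on math.erf
    (adequate for a sketch, not a production implementation)."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def n_per_group(p1, p2, alpha=0.05, power=0.8):
    """Normal-approximation sample size per group for a two-sided
    test comparing two proportions."""
    za = normal_quantile(1 - alpha / 2)
    zb = normal_quantile(power)
    pbar = (p1 + p2) / 2
    num = (za * math.sqrt(2 * pbar * (1 - pbar))
           + zb * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)
```

    Larger effects (proportions further apart) need fewer subjects per group, as the second assertion below checks.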

  13. Functional data analysis of generalized regression quantiles

    KAUST Repository

    Guo, Mengmeng

    2013-11-05

    Generalized regression quantiles, including the conditional quantiles and expectiles as special cases, are useful alternatives to the conditional means for characterizing a conditional distribution, especially when the interest lies in the tails. We develop a functional data analysis approach to jointly estimate a family of generalized regression quantiles. Our approach assumes that the generalized regression quantiles share some common features that can be summarized by a small number of principal component functions. The principal component functions are modeled as splines and are estimated by minimizing a penalized asymmetric loss measure. An iterative least asymmetrically weighted squares algorithm is developed for computation. While separate estimation of individual generalized regression quantiles usually suffers from large variability due to lack of sufficient data, by borrowing strength across data sets, our joint estimation approach significantly improves the estimation efficiency, which is demonstrated in a simulation study. The proposed method is applied to data from 159 weather stations in China to obtain the generalized quantile curves of the volatility of the temperature at these stations. © 2013 Springer Science+Business Media New York.
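
    Expectiles generalize the mean through an asymmetric squared loss, much as quantiles generalize the median. A scalar sketch of the asymmetric-weighting idea behind generalized regression quantiles (not the paper's functional principal-component estimator; names are hypothetical):

```python
def expectile(ys, tau, iters=100):
    """Scalar tau-expectile via iteratively reweighted means:
    points above the current estimate get weight tau, points at or
    below it get weight 1 - tau. tau = 0.5 recovers the mean."""
    m = sum(ys) / len(ys)
    for _ in range(iters):
        w = [tau if y > m else 1 - tau for y in ys]
        m = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    return m
```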

  14. Functional data analysis of generalized regression quantiles

    KAUST Repository

    Guo, Mengmeng; Zhou, Lan; Huang, Jianhua Z.; Härdle, Wolfgang Karl

    2013-01-01

    Generalized regression quantiles, including the conditional quantiles and expectiles as special cases, are useful alternatives to the conditional means for characterizing a conditional distribution, especially when the interest lies in the tails. We develop a functional data analysis approach to jointly estimate a family of generalized regression quantiles. Our approach assumes that the generalized regression quantiles share some common features that can be summarized by a small number of principal component functions. The principal component functions are modeled as splines and are estimated by minimizing a penalized asymmetric loss measure. An iterative least asymmetrically weighted squares algorithm is developed for computation. While separate estimation of individual generalized regression quantiles usually suffers from large variability due to lack of sufficient data, by borrowing strength across data sets, our joint estimation approach significantly improves the estimation efficiency, which is demonstrated in a simulation study. The proposed method is applied to data from 159 weather stations in China to obtain the generalized quantile curves of the volatility of the temperature at these stations. © 2013 Springer Science+Business Media New York.

  15. Solving Dynamic Traveling Salesman Problem Using Dynamic Gaussian Process Regression

    Directory of Open Access Journals (Sweden)

    Stephen M. Akandwanaho

    2014-01-01

    Full Text Available This paper solves the dynamic traveling salesman problem (DTSP) using the dynamic Gaussian Process Regression (DGPR) method. The problem of varying correlation among tours is alleviated by a nonstationary covariance function interleaved with DGPR to generate a predictive distribution for the DTSP tour. This approach is conjoined with the Nearest Neighbor (NN) method and iterated local search to track dynamic optima. Experimental results were obtained on DTSP instances. Comparisons were performed with Genetic Algorithm and Simulated Annealing. The proposed approach demonstrates superiority in finding a good traveling salesman problem (TSP) tour and requires less computational time in nonstationary conditions.
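
    The Nearest Neighbor tour-construction step mentioned above can be sketched in a few lines of Python (the DGPR prediction layer and the iterated local search are omitted; names are hypothetical):

```python
import math

def nearest_neighbour_tour(cities, start=0):
    """Greedy nearest-neighbour construction of a TSP tour.
    cities: list of (x, y) points. Returns the visiting order."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(cities, tour):
    """Total length of the closed tour (returns to the start city)."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))
```

    The greedy tour is a cheap starting point that a local search (such as the iterated local search in the paper) can then refine as the instance changes.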

  16. Detecting nonsense for Chinese comments based on logistic regression

    Science.gov (United States)

    Zhuolin, Ren; Guang, Chen; Shu, Chen

    2016-07-01

    To understand cyber citizens' opinions accurately from Chinese news comments, a clear definition of nonsense is presented, and a detection model based on logistic regression (LR) is proposed. The detection of nonsense can be treated as a binary classification problem. Besides traditional lexical features, we propose three kinds of features in terms of emotion, structure and relevance. With these features, we train an LR model and demonstrate its effect in understanding Chinese news comments. We find that each of the proposed features significantly improves the result. In our experiments, we achieve a prediction accuracy of 84.3%, which improves on the baseline of 77.3% by 7 percentage points.

  17. Gigashot Optical Laser Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Deri, R. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-13

    The Gigashot Optical Laser Demonstrator (GOLD) project has demonstrated a novel optical amplifier for high energy pulsed lasers operating at high repetition rates. The amplifier stores enough pump energy to support >10 J of laser output, and employs conduction cooling for thermal management to avoid the need for expensive and bulky high-pressure helium subsystems. A prototype amplifier was fabricated, pumped with diode light at 885 nm, and characterized. Experimental results show that the amplifier provides sufficient small-signal gain and sufficiently low wavefront and birefringence impairments to prove useful in laser systems, at repetition rates up to 60 Hz.

  18. Producing The New Regressive Left

    DEFF Research Database (Denmark)

    Crone, Christine

    members, this thesis investigates a growing political trend and ideological discourse in the Arab world that I have called The New Regressive Left. On the premise that a media outlet can function as a forum for ideology production, the thesis argues that an analysis of this material can help to trace...... the contexture of The New Regressive Left. If the first part of the thesis lays out the theoretical approach and draws the contextual framework, through an exploration of the surrounding Arab media- and ideoscapes, the second part is an analytical investigation of the discourse that permeates the programmes aired...... What becomes clear from the analytical chapters is the emergence of the new cross-ideological alliance of The New Regressive Left. This emerging coalition between Shia Muslims, religious minorities, parts of the Arab Left, secular cultural producers, and the remnants of the political, strategic resistance...

  19. A Matlab program for stepwise regression

    Directory of Open Access Journals (Sweden)

    Yanhong Qi

    2016-03-01

    Full Text Available Stepwise linear regression is a multi-variable regression technique for identifying statistically significant variables in a linear regression equation. In the present study, we present a Matlab program for stepwise regression.
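
    The Matlab listing itself is not reproduced in the abstract; to convey the idea, here is a minimal forward-stepwise sketch in Python that at each step adds the predictor that most reduces the residual sum of squares (a simplification of the usual F-test entry criterion; all names are hypothetical):

```python
def ols_sse(X_cols, y):
    """Sum of squared errors of an OLS fit with an intercept, solved
    via the normal equations and Gaussian elimination."""
    n = len(y)
    cols = [[1.0] * n] + X_cols          # prepend intercept column
    p = len(cols)
    A = [[sum(cols[i][k] * cols[j][k] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(cols[i][k] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):                    # elimination with pivoting
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * p
    for i in reversed(range(p)):          # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, p))) / A[i][i]
    fitted = [sum(beta[j] * cols[j][k] for j in range(p)) for k in range(n)]
    return sum((y[k] - fitted[k]) ** 2 for k in range(n))

def forward_stepwise(X_cols, y):
    """Greedy forward selection: repeatedly add the predictor that most
    reduces the SSE, stopping when no addition helps."""
    selected, remaining = [], list(range(len(X_cols)))
    best_sse = ols_sse([], y)             # intercept-only model
    while remaining:
        sse, j = min((ols_sse([X_cols[i] for i in selected + [cand]], y), cand)
                     for cand in remaining)
        if sse >= best_sse - 1e-12:
            break
        selected.append(j)
        remaining.remove(j)
        best_sse = sse
    return selected
```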

  20. Regression filter for signal resolution

    International Nuclear Information System (INIS)

    Matthes, W.

    1975-01-01

    The problem considered is that of resolving a measured pulse height spectrum of a material mixture, e.g. gamma ray spectrum, Raman spectrum, into a weighted sum of the spectra of the individual constituents. The model on which the analytical formulation is based is described. The problem reduces to that of a multiple linear regression. A stepwise linear regression procedure was constructed. The efficiency of this method was then tested by implementing the procedure as a computer programme, which was used to unfold test spectra obtained by mixing some spectra from a library of arbitrarily chosen spectra and adding a noise component. (U.K.)
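
    The core model, a measured spectrum expressed as a weighted sum of known component spectra, reduces to ordinary least squares. A two-component sketch via the normal equations (the spectra below are made up for illustration):

```python
def unfold(measured, components):
    """Least-squares weights expressing a measured spectrum as a
    weighted sum of two known component spectra, by solving the
    2x2 normal equations in closed form."""
    s1, s2 = components
    a11 = sum(a * a for a in s1)
    a12 = sum(a * b for a, b in zip(s1, s2))
    a22 = sum(b * b for b in s2)
    b1 = sum(a * m for a, m in zip(s1, measured))
    b2 = sum(b * m for b, m in zip(s2, measured))
    det = a11 * a22 - a12 * a12           # nonzero if spectra differ
    return ((a22 * b1 - a12 * b2) / det,
            (a11 * b2 - a12 * b1) / det)
```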

  1. Nonparametric Mixture of Regression Models.

    Science.gov (United States)

    Huang, Mian; Li, Runze; Wang, Shaoli

    2013-07-01

    Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.

  2. Biodenitrification demonstration test report

    International Nuclear Information System (INIS)

    Benear, A.K.; Murray, S.J.; Lahoda, E.J.; Leslie, J.W.; Patton, J.B.; Menako, C.R.

    1987-08-01

    A two-column biodenitrification (BDN) facility was constructed at the Feed Materials Production Center (FMPC) in 1985 and 1986 to test the feasibility of biological treatment for industrial nitrate-bearing waste water generated at FMPC. This demonstration facility comprises one-half of the proposed four-column production facility. A demonstration test was conducted over a four month period in 1987. The results indicate the proposed BDN production facility can process FMPC industrial wastewater in a continuous manner while maintaining an effluent that will consistently meet the proposed NPDES limits for combined nitrate nitrogen (NO3-N) and nitrite nitrogen (NO2-N). The proposed NPDES limits are 62 kg/day average and 124 kg/day maximum. These limits were proportioned to determine that the two-column demonstration facility should meet limits of 31 kg/day average and 62 kg/day maximum.

  3. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    Bulej, Lubomír

    2005-01-01

    The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking on a real software project and conclude with a glimpse at the challenges for the fu...

  4. Classifying machinery condition using oil samples and binary logistic regression

    Science.gov (United States)

    Phillips, J.; Cripps, E.; Lau, John W.; Hodkiewicz, M. R.

    2015-08-01

    The era of big data has resulted in an explosion of condition monitoring information. The result is an increasing motivation to automate the costly and time consuming human elements involved in the classification of machine health. When working with industry it is important to build an understanding of, and hence some trust in, the classification scheme for those who use the analysis to initiate maintenance tasks. Typical "black box" approaches such as artificial neural networks (ANN) and support vector machines (SVM) can be difficult to interpret. In contrast, this paper argues that logistic regression offers easy interpretability to industry experts, providing insight into the drivers of the human classification process and into the ramifications of potential misclassification. Of course, accuracy is of foremost importance in any automated classification scheme, so we also provide a comparative study based on the predictive performance of logistic regression, ANN and SVM. A real world oil analysis data set from engines on mining trucks is presented, and using cross-validation we demonstrate that logistic regression outperforms the ANN and SVM approaches in terms of prediction for healthy/not healthy engines.
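
    The interpretability argument rests on the fact that exp(beta) in a logistic model is an odds ratio. A tiny sketch with invented coefficients (not values from the mining-truck data set):

```python
import math

# Hypothetical fitted coefficients for a healthy/not-healthy model;
# illustrative values only, not from the paper's data set.
intercept, beta_iron = -3.0, 0.8

def p_not_healthy(iron_level):
    """Predicted probability of 'not healthy' at a given iron level."""
    z = intercept + beta_iron * iron_level
    return 1.0 / (1.0 + math.exp(-z))

# exp(beta) is the odds ratio per unit increase in the predictor:
# the quantity an industry expert can sanity-check directly.
odds_ratio = math.exp(beta_iron)
```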

  5. Panel data specifications in nonparametric kernel regression

    DEFF Research Database (Denmark)

    Czekaj, Tomasz Gerard; Henningsen, Arne

    parametric panel data estimators to analyse the production technology of Polish crop farms. The results of our nonparametric kernel regressions generally differ from the estimates of the parametric models but they only slightly depend on the choice of the kernel functions. Based on economic reasoning, we...

  6. Influence diagnostics in meta-regression model.

    Science.gov (United States)

    Shi, Lei; Zuo, ShanShan; Yu, Dalei; Zhou, Xiaohua

    2017-09-01

    This paper studies influence diagnostics in the meta-regression model, including case-deletion diagnostics and local influence analysis. We derive the subset deletion formulae for the estimation of the regression coefficient and heterogeneity variance and obtain the corresponding influence measures. The DerSimonian and Laird estimation and maximum likelihood estimation methods in meta-regression are considered, respectively, to derive the results. Internal and external residual and leverage measures are defined. Local influence analysis based on the case-weights perturbation scheme, response perturbation scheme, covariate perturbation scheme, and within-variance perturbation scheme is explored. We introduce a method of simultaneously perturbing responses, covariates, and within-variance to obtain a local influence measure, which has the advantage of being able to compare the influence magnitudes of influential studies across different perturbations. An example is used to illustrate the proposed methodology. Copyright © 2017 John Wiley & Sons, Ltd.
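
    The case-deletion idea can be illustrated on ordinary simple linear regression (the meta-regression formulae with a heterogeneity variance are more involved): refit with each observation left out in turn and record the change in the fitted slope, a DFBETA-style measure. Names are hypothetical:

```python
def slope(xs, ys):
    """OLS slope of a simple linear regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def case_deletion_influence(xs, ys):
    """Change in the fitted slope when each observation is deleted
    in turn; large entries flag influential cases."""
    full = slope(xs, ys)
    out = []
    for i in range(len(xs)):
        out.append(full - slope(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]))
    return out
```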

  7. Principal component regression for crop yield estimation

    CERN Document Server

    Suryanarayana, T M V

    2016-01-01

    This book highlights the estimation of crop yield in Central Gujarat, especially with regard to the development of Multiple Regression Models and Principal Component Regression (PCR) models using climatological parameters as independent variables and crop yield as a dependent variable. It subsequently compares the multiple linear regression (MLR) and PCR results, and discusses the significance of PCR for crop yield estimation. In this context, the book also covers Principal Component Analysis (PCA), a statistical procedure used to reduce a number of correlated variables to a smaller number of uncorrelated variables called principal components (PC). This book will be helpful to students and researchers starting their work on climate and agriculture, mainly focusing on estimation models. The flow of chapters takes the readers along a smooth path, from understanding climate and weather and the impact of climate change, gradually proceeding towards downscaling techniques and finally towards the development of ...

  8. Cactus: An Introduction to Regression

    Science.gov (United States)

    Hyde, Hartley

    2008-01-01

    When the author first used "VisiCalc," he thought it a very useful tool when he had the formulas. But how could he design a spreadsheet if there was no known formula for the quantities he was trying to predict? A few months later, the author relates, he learned to use multiple linear regression software and suddenly it all clicked into…

  9. Regression Models for Repairable Systems

    Czech Academy of Sciences Publication Activity Database

    Novák, Petr

    2015-01-01

    Roč. 17, č. 4 (2015), s. 963-972 ISSN 1387-5841 Institutional support: RVO:67985556 Keywords : Reliability analysis * Repair models * Regression Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.782, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/novak-0450902.pdf

  10. Survival analysis II: Cox regression

    NARCIS (Netherlands)

    Stel, Vianda S.; Dekker, Friedo W.; Tripepi, Giovanni; Zoccali, Carmine; Jager, Kitty J.

    2011-01-01

    In contrast to the Kaplan-Meier method, Cox proportional hazards regression can provide an effect estimate by quantifying the difference in survival between patient groups and can adjust for confounding effects of other variables. The purpose of this article is to explain the basic concepts of the

  11. Kernel regression with functional response

    OpenAIRE

    Ferraty, Frédéric; Laksaci, Ali; Tadj, Amel; Vieu, Philippe

    2011-01-01

    We consider kernel regression estimate when both the response variable and the explanatory one are functional. The rates of uniform almost complete convergence are stated as function of the small ball probability of the predictor and as function of the entropy of the set on which uniformity is obtained.
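
    For the scalar case, the classical kernel regression estimate is the Nadaraya-Watson locally weighted average; a sketch is below (the functional-response setting of the paper replaces the scalar response by a curve; names are hypothetical):

```python
import math

def nw_kernel_regression(x0, xs, ys, h=1.0):
    """Nadaraya-Watson estimator with a Gaussian kernel: a locally
    weighted average of the responses around x0, with bandwidth h."""
    w = [math.exp(-((x0 - x) / h) ** 2 / 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
```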

  12. AVNG system demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Thron, Jonathan Louis [Los Alamos National Laboratory; Mac Arthur, Duncan W [Los Alamos National Laboratory; Kondratov, Sergey [VNIIEF; Livke, Alexander [VNIIEF; Razinkov, Sergey [VNIIEF

    2010-01-01

    An attribute measurement system (AMS) measures a number of unclassified attributes of potentially classified material. By only displaying these unclassified results as red or green lights, the AMS protects potentially classified information while still generating confidence in the measurement result. The AVNG implementation that we describe is an AMS built by RFNC - VNIIEF in Sarov, Russia. To provide additional confidence, the AVNG was designed with two modes of operation. In the secure mode, potentially classified measurements can be made with only the simple red light/green light display. In the open mode, known unclassified material can be measured with complete display of the information collected from the radiation detectors. The AVNG demonstration, which occurred in Sarov, Russia in June 2009 for a joint US/Russian audience, included exercising both modes of AVNG operation using a number of multi-kg plutonium sources. In addition to describing the demonstration, we will show photographs and/or video taken of AVNG operation.

  13. On a Robust MaxEnt Process Regression Model with Sample-Selection

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2018-04-01

    In a regression analysis, a sample-selection bias arises when a dependent variable is partially observed as a result of the sample selection. This study introduces a Maximum Entropy (MaxEnt) process regression model that assumes a MaxEnt prior distribution for its nonparametric regression function and finds that the MaxEnt process regression model includes the well-known Gaussian process regression (GPR) model as a special case. This special MaxEnt process regression model, i.e., the GPR model, is then generalized to obtain a robust sample-selection Gaussian process regression (RSGPR) model that deals with non-normal data in the sample selection. Various properties of the RSGPR model are established, including the stochastic representation, distributional hierarchy, and magnitude of the sample-selection bias. These properties are used in the paper to develop a hierarchical Bayesian methodology to estimate the model. This involves a simple and computationally feasible Markov chain Monte Carlo algorithm that avoids analytical or numerical derivatives of the log-likelihood function of the model. The performance of the RSGPR model in terms of sample-selection bias correction, robustness to non-normality, and prediction is demonstrated through simulation results that attest to its good finite-sample performance.

  14. Poisson Mixture Regression Models for Heart Disease Prediction.

    Science.gov (United States)

    Mufudza, Chipo; Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed here under two different classes: standard and concomitant-variable mixture regression models. Results show that a two-component concomitant-variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model, due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best overall model for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks component-wise using a Poisson mixture regression model.
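The baseline named above can be made concrete. A minimal sketch of an ordinary Poisson regression fitted by iteratively reweighted least squares (IRLS) on synthetic data (coefficients and data are assumed for illustration; the paper's mixture and zero-inflated variants build on this baseline):

```python
import numpy as np

# Synthetic counts from a log-linear Poisson model (illustrative, not the
# paper's heart-disease data).
rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 2, n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([0.5, 1.0])
y = rng.poisson(np.exp(X @ beta_true))

# IRLS for the Poisson GLM with log link.
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)            # mean under the log link
    W = mu                           # Poisson working weights
    z = X @ beta + (y - mu) / mu     # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print(np.round(beta, 2))             # close to beta_true
```

A mixture model would run this fit per component, weighted by cluster responsibilities.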

  15. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were used on five-year data (1998-2002) of average humidity, rainfall, and maximum and minimum temperatures. Relationships for regression analysis time series (RATS) were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination, as a measure of goodness of fit, for our polynomial regression analysis time series (PRATS). Correlations for multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, and indicated homoscedasticity. We also employed Bartlett's test for homogeneity of variances on five-year data of rainfall and humidity, which showed that the variances in the rainfall data were not homogeneous, while those for humidity were. Our results on regression and regression analysis time series show the best fit for prediction modeling on the climatic data of Quetta, Pakistan. (author)
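The PRATS goodness-of-fit step can be sketched on a synthetic monthly series (values assumed, not the Quetta records): fit a polynomial trend and compute the coefficient of determination R².

```python
import numpy as np

# Synthetic 5-year monthly series: linear trend + seasonality + noise.
t = np.arange(60)
y = 20 + 0.05 * t + 5 * np.sin(2 * np.pi * t / 12)
y = y + np.random.default_rng(1).normal(0, 0.5, t.size)

# Polynomial regression (cubic trend) and R^2 as goodness of fit.
coef = np.polyfit(t, y, deg=3)
y_hat = np.polyval(coef, t)
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))
```

A low R² here signals, as in the paper, that the polynomial trend alone misses structure (seasonality) that a richer model must capture.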

  16. Logistic regression applied to natural hazards: rare event logistic regression with replications

    OpenAIRE

    Guns, M.; Vanacker, Veerle

    2012-01-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that the ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations in rare event logistic regression. This technique, so-called rare event logisti...

  17. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying

    2009-08-27

    Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.
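The bias being corrected can be demonstrated in a few lines. A sketch of classical measurement-error attenuation, shown here for an ordinary least-squares slope on synthetic data (the same attenuation afflicts regression quantiles, which the paper's joint estimating equations address):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
x = rng.normal(0, 1, n)           # true covariate
w = x + rng.normal(0, 1, n)       # observed with error of equal variance
y = 2.0 * x + rng.normal(0, 0.5, n)

slope_true = np.polyfit(x, y, 1)[0]   # regression on the true covariate
slope_naive = np.polyfit(w, y, 1)[0]  # attenuated by var(x)/(var(x)+var(u)) = 0.5
print(round(slope_true, 2), round(slope_naive, 2))
```

With equal signal and error variances the naive slope is biased toward zero by a factor of one half, which is the kind of bias a corrected estimator must undo.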

  18. Multivariate and semiparametric kernel regression

    OpenAIRE

    Härdle, Wolfgang; Müller, Marlene

    1997-01-01

    The paper gives an introduction to theory and application of multivariate and semiparametric kernel smoothing. Multivariate nonparametric density estimation is an often used pilot tool for examining the structure of data. Regression smoothing helps in investigating the association between covariates and responses. We concentrate on kernel smoothing using local polynomial fitting which includes the Nadaraya-Watson estimator. Some theory on the asymptotic behavior and bandwidth selection is pro...
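The Nadaraya-Watson estimator mentioned above can be sketched as follows (scalar covariate, Gaussian kernel; the paper's local polynomial fitting generalizes this local-constant case):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Local-constant kernel regression with a Gaussian kernel."""
    # kernel weights K((x - xi)/h), one row per evaluation point
    d = (x_eval[:, None] - x_train[None, :]) / h
    k = np.exp(-0.5 * d ** 2)
    return (k @ y_train) / k.sum(axis=1)

# Synthetic data: noisy sine curve (illustrative only).
rng = np.random.default_rng(3)
x = rng.uniform(0, 1, 300)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)
x0 = np.array([0.25, 0.75])
print(np.round(nadaraya_watson(x, y, x0, h=0.05), 2))
```

The bandwidth `h` plays the role discussed in the paper's bandwidth-selection theory: too small gives noisy estimates, too large oversmooths.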

  19. Regression algorithm for emotion detection

    OpenAIRE

    Berthelon, Franck; Sander, Peter

    2013-01-01

    We present here two components of a computational system for emotion detection. PEMs (Personalized Emotion Maps) store links between bodily expressions and emotion values, and are individually calibrated to capture each person's emotion profile. They are an implementation based on aspects of Scherer's theoretical complex-system model of emotion (Scherer 2000; Scherer 2009). We also present a regression algorithm that determines a person's emotional feeling from sensor m...

  20. Directional quantile regression in R

    Czech Academy of Sciences Publication Activity Database

    Boček, Pavel; Šiman, Miroslav

    2017-01-01

    Roč. 53, č. 3 (2017), s. 480-492 ISSN 0023-5954 R&D Projects: GA ČR GA14-07234S Institutional support: RVO:67985556 Keywords : multivariate quantile * regression quantile * halfspace depth * depth contour Subject RIV: BD - Theory of Information OBOR OECD: Applied mathematics Impact factor: 0.379, year: 2016 http://library.utia.cas.cz/separaty/2017/SI/bocek-0476587.pdf

  1. Multiple regression analysis of anthropometric measurements influencing the cephalic index of male Japanese university students.

    Science.gov (United States)

    Hossain, Md Golam; Saw, Aik; Alam, Rashidul; Ohtsuki, Fumio; Kamarul, Tunku

    2013-09-01

    Cephalic index (CI), the ratio of head breadth to head length, is widely used to categorise human populations. The aim of this study was to assess the impact of anthropometric measurements on the CI of male Japanese university students. This study included 1,215 male university students from Tokyo and Kyoto, selected using convenience sampling. Multiple regression analysis was used to determine the effect of anthropometric measurements on CI. The variance inflation factor (VIF) showed no evidence of a multicollinearity problem among independent variables. The coefficients of the regression line demonstrated a significant positive relationship between CI and minimum frontal breadth (p < …). Regression analysis showed a greater likelihood for minimum frontal breadth (p < …). Regression analysis revealed bizygomatic breadth, head circumference, minimum frontal breadth, head height and morphological facial height to be the best predictor craniofacial measurements with respect to CI. The results suggest that most of the variables considered in this study appear to influence the CI of adult male Japanese students.
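The multicollinearity check used in the study can be sketched generically (synthetic predictors, not the anthropometric data): VIF_j = 1/(1 - R_j²), where R_j² comes from regressing predictor j on the remaining predictors.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of the design matrix X."""
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(X)), others])   # intercept + others
        beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ beta
        r2 = 1 - resid.var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 3))
X[:, 2] += 0.9 * X[:, 0]          # induce correlation -> inflated VIF
print(np.round(vif(X), 2))
```

Values near 1 indicate no multicollinearity; common rules of thumb flag VIFs above 5 or 10.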

  2. Multiple regression approach to predict turbine-generator output for Chinshan nuclear power plant

    International Nuclear Information System (INIS)

    Chan, Yea-Kuang; Tsai, Yu-Ching

    2017-01-01

    The objective of this study is to develop a turbine cycle model using the multiple regression approach to estimate the turbine-generator output for the Chinshan Nuclear Power Plant (NPP). The plant operating data was verified using a linear regression model with a corresponding 95% confidence interval for the operating data. In this study, the key parameters were selected as inputs for the multiple regression based turbine cycle model. The proposed model was used to estimate the turbine-generator output. The effectiveness of the proposed turbine cycle model was demonstrated by using plant operating data obtained from the Chinshan NPP Unit 2. The results show that this multiple regression based turbine cycle model can be used to accurately estimate the turbine-generator output. In addition, this study also provides an alternative approach with simple and easy features to evaluate the thermal performance for nuclear power plants.

  3. Multiple regression approach to predict turbine-generator output for Chinshan nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Yea-Kuang; Tsai, Yu-Ching [Institute of Nuclear Energy Research, Taoyuan City, Taiwan (China). Nuclear Engineering Division

    2017-03-15

    The objective of this study is to develop a turbine cycle model using the multiple regression approach to estimate the turbine-generator output for the Chinshan Nuclear Power Plant (NPP). The plant operating data was verified using a linear regression model with a corresponding 95% confidence interval for the operating data. In this study, the key parameters were selected as inputs for the multiple regression based turbine cycle model. The proposed model was used to estimate the turbine-generator output. The effectiveness of the proposed turbine cycle model was demonstrated by using plant operating data obtained from the Chinshan NPP Unit 2. The results show that this multiple regression based turbine cycle model can be used to accurately estimate the turbine-generator output. In addition, this study also provides an alternative approach with simple and easy features to evaluate the thermal performance for nuclear power plants.

  4. The EU CONCERTO project Class 1 - Demonstrating cost-effective low-energy buildings - Recent results with special focus on comparison of calculated and measured energy performance of Danish buildings

    DEFF Research Database (Denmark)

    Mørck, Ove; Thomsen, K.E.; Rose, J.

    2012-01-01

    In 2007 the Class1 project commenced. Originally, 442 dwellings were to be designed and constructed as "low-energy class 1" houses according to requirements set by the Municipality of Egedal/Denmark. This means that the energy consumption is 50% below the existing energy regulations. 65 dwellings … -chip heating plant has been added. The project demonstrates the benefits of ultra-low-energy buildings integrated with biomass- and solar heating energy supply. The CLASS1 project involves 4 other countries: Estonia, France, Italy and Romania. These countries develop training activities based on the results and experiences gained from the Danish housing projects. This paper describes the comparisons between measured and calculated energy consumption in a social housing settlement and in a detached single-family house. Results show relatively large discrepancies between measured and calculated results…

  5. A study of toxic emissions from a coal-fired power plant utilizing the SNOX innovative clean coal technology demonstration. Volume 1, Sampling/results/special topics: Final report

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    This study was one of a group of assessments of toxic emissions from coal-fired power plants, conducted for DOE during 1993. The motivation for those assessments was the mandate in the 1990 Clean Air Act Amendments that a study be made of emissions of hazardous air pollutants (HAPs) from electric utilities. The report is organized in two volumes. Volume 1: Sampling describes the sampling effort conducted as the basis for this study; Results presents the concentration data on HAPs in the several power plant streams, and reports the results of evaluations and calculations conducted with those data; and Special Topics reports on issues such as comparison of sampling methods and vapor/solid distributions of HAPs. Volume 2: Appendices include quality assurance/quality control results, uncertainty analysis for emission factors, and data sheets. This study involved measurements of a variety of substances in solid, liquid, and gaseous samples from input, output, and process streams at the Innovative Clean Coal Technology Demonstration (ICCT) of the Wet Sulfuric Acid-Selective Catalytic Reduction (SNOX) process. The SNOX demonstration is being conducted at Ohio Edison's Niles Boiler No. 2, which uses cyclone burners to burn bituminous coal. A 35 megawatt slipstream of flue gas from the boiler is used to demonstrate SNOX. The substances measured at the SNOX process were the following: 1. Five major and 16 trace elements, including mercury, chromium, cadmium, lead, selenium, arsenic, beryllium, and nickel; 2. Acids and corresponding anions (HCl, HF, chloride, fluoride, phosphate, sulfate); 3. Ammonia and cyanide; 4. Elemental carbon; 5. Radionuclides; 6. Volatile organic compounds (VOC); 7. Semi-volatile compounds (SVOC) including polynuclear aromatic hydrocarbons (PAH); and 8. Aldehydes.

  6. Gaussian Process Regression Model in Spatial Logistic Regression

    Science.gov (United States)

    Sofro, A.; Oktaviarina, A.

    2018-01-01

    Spatial analysis has developed very quickly in the last decade. One of the favorite approaches is based on the neighbourhood of the region. Unfortunately, there are some limitations such as difficulty in prediction. Therefore, we offer Gaussian process regression (GPR) to accommodate the issue. In this paper, we will focus on spatial modeling with GPR for binomial data with logit link function. The performance of the model will be investigated. We will discuss the inference of how to estimate the parameters and hyper-parameters and to predict as well. Furthermore, simulation studies will be explained in the last section.
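The GPR backbone of the approach can be sketched for the simpler Gaussian-likelihood case (squared-exponential kernel; the binomial/logit setting of the paper requires approximate inference on top of this):

```python
import numpy as np

def rbf(a, b, ell=0.1, sf=1.0):
    """Squared-exponential kernel for 1-D inputs."""
    return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

# Synthetic training data (illustrative only).
rng = np.random.default_rng(5)
x = rng.uniform(0, 1, 50)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)
xs = np.array([0.25, 0.75])                     # prediction points

K = rbf(x, x) + 0.1**2 * np.eye(x.size)         # kernel matrix + noise
alpha = np.linalg.solve(K, y)
mean = rbf(xs, x) @ alpha                       # posterior predictive mean
print(np.round(mean, 2))
```

The hyper-parameters (length scale, signal and noise variances) are hard-coded here; in practice they are estimated, e.g. by maximizing the marginal likelihood, as the inference discussion above indicates.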

  7. Photovoltaic demonstration projects

    Energy Technology Data Exchange (ETDEWEB)

    Kaut, W [Commission of the European Communities, Brussels (Belgium); Gillett, W B; Hacker, R J [Halcrow Gilbert Associates Ltd., Swindon (GB)

    1992-12-31

    This publication, comprising the proceedings of the fifth contractors' meeting organized by the Commission of the European Communities, Directorate-General for Energy, provides an overview of the photovoltaic demonstration projects which have been supported in the framework of the energy demonstration programme since 1983. It includes reports by each of the contractors who submitted proposals in 1987 and 1988, describing progress within their projects. Projects accepted from earlier calls for proposals and not yet completed were reviewed by a rapporteur and are discussed in the summary section. The results of the performance monitoring of all projects and the lessons drawn from the practical experience of the projects are also presented in the summaries and conclusions. Contractors whose projects were submitted in 1989 were also present at the meeting and contributed to the reported discussions. These proceedings are divided into four sessions (General, Housing, Technical presentations, Other applications) and contain 24 papers.

  8. Multitask Quantile Regression under the Transnormal Model.

    Science.gov (United States)

    Fan, Jianqing; Xue, Lingzhou; Zou, Hui

    2016-01-01

    We consider estimating multi-task quantile regression under the transnormal model, with focus on the high-dimensional setting. We derive a surprisingly simple closed-form solution through rank-based covariance regularization. In particular, we propose the rank-based ℓ1 penalization with positive definite constraints for estimating sparse covariance matrices, and the rank-based banded Cholesky decomposition regularization for estimating banded precision matrices. By taking advantage of the alternating direction method of multipliers, a nearest correlation matrix projection is introduced that inherits the sampling properties of the unprojected estimator. Our work combines the strengths of quantile regression and rank-based covariance regularization to simultaneously deal with nonlinearity and nonnormality in high-dimensional regression. Furthermore, the proposed method strikes a good balance between robustness and efficiency, achieves the "oracle"-like convergence rate, and provides a provable prediction interval in the high-dimensional setting. The finite-sample performance of the proposed method is also examined. The performance of our proposed rank-based method is demonstrated in a real application to analyze protein mass spectroscopy data.

  9. Complex regression Doppler optical coherence tomography

    Science.gov (United States)

    Elahi, Sahar; Gu, Shi; Thrane, Lars; Rollins, Andrew M.; Jenkins, Michael W.

    2018-04-01

    We introduce a new method to measure Doppler shifts more accurately and extend the dynamic range of Doppler optical coherence tomography (OCT). The two-point estimate of the conventional Doppler method is replaced with a regression that is applied to high-density B-scans in polar coordinates. We built a high-speed OCT system using a 1.68-MHz Fourier domain mode locked laser to acquire high-density B-scans (16,000 A-lines) at high enough frame rates (~100 fps) to accurately capture the dynamics of the beating embryonic heart. Flow phantom experiments confirm that the complex regression lowers the minimum detectable velocity from 12.25 mm/s to 374 μm/s, whereas the maximum velocity of 400 mm/s is measured without phase wrapping. Complex regression Doppler OCT also demonstrates higher accuracy and precision compared with the conventional method, particularly when the signal-to-noise ratio is low. The extended dynamic range allows monitoring of blood flow over several stages of development in embryos without adjusting the imaging parameters. In addition, applying complex averaging recovers hidden features in structural images.

  10. Satellite rainfall retrieval by logistic regression

    Science.gov (United States)

    Chiu, Long S.

    1986-01-01

    The potential use of logistic regression in rainfall estimation from satellite measurements is investigated. Satellite measurements provide covariate information in terms of radiances from different remote sensors. The logistic regression technique can effectively accommodate many covariates and test their significance in the estimation. The outcome from the logistic model is the probability that the rainrate of a satellite pixel is above a certain threshold. By varying the thresholds, a rainrate histogram can be obtained, from which the mean and the variance can be estimated. A logistic model is developed and applied to rainfall data collected during GATE, using as covariates the fractional rain area and a radiance measurement deduced from a microwave temperature-rainrate relation. It is demonstrated that the fractional rain area is an important covariate in the model, consistent with the use of the so-called Area Time Integral in estimating total rain volume in other studies. To calibrate the logistic model, simulated rain fields generated by rainfield models with prescribed parameters are needed. A stringent test of the logistic model is its ability to recover the prescribed parameters of simulated rain fields. A rain field simulation model which preserves the fractional rain area and lognormality of rainrates as found in GATE is developed. A stochastic regression model of branching and immigration, whose solutions are lognormally distributed in some asymptotic limits, has also been developed.
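The core logistic model can be sketched on synthetic data (covariate and coefficients are assumed, not GATE values): the fitted output is the probability that a pixel's rainrate exceeds a threshold, estimated here by Newton-Raphson.

```python
import numpy as np

# Synthetic pixels: fractional rain area as the single covariate.
rng = np.random.default_rng(6)
n = 2000
frac_area = rng.uniform(0, 1, n)
p_true = 1 / (1 + np.exp(-(-2 + 4 * frac_area)))
y = rng.binomial(1, p_true)          # 1 = rainrate above threshold

# Newton-Raphson for the logistic regression MLE.
X = np.column_stack([np.ones(n), frac_area])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    W = p * (1 - p)
    grad = X.T @ (y - p)
    H = X.T @ (W[:, None] * X)
    beta = beta + np.linalg.solve(H, grad)
print(np.round(beta, 1))
```

Refitting with a sweep of thresholds, as the abstract describes, yields the exceedance probabilities from which a rainrate histogram is assembled.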

  11. Few crystal balls are crystal clear : eyeballing regression

    International Nuclear Information System (INIS)

    Wittebrood, R.T.

    1998-01-01

    The theory of regression and statistical analysis as it applies to reservoir analysis was discussed. It was argued that regression lines are not always the final truth. It was suggested that regression lines and eyeballed lines are often equally accurate. The many conditions that must be fulfilled to calculate a proper regression were discussed. Mentioned among these conditions were the distribution of the data, hidden variables, knowledge of how the data was obtained, the need for causal correlation of the variables, and knowledge of the manner in which the regression results are going to be used. 1 tab., 13 figs

  12. Short-term electricity prices forecasting based on support vector regression and Auto-regressive integrated moving average modeling

    International Nuclear Information System (INIS)

    Che Jinxing; Wang Jianzhou

    2010-01-01

    In this paper, we present the use of different mathematical models to forecast electricity prices under deregulated power markets. A successful prediction tool for electricity prices can help both power producers and consumers plan their bidding strategies. Inspired by the fact that the support vector regression (SVR) model, with the ε-insensitive loss function, admits residuals within the boundary of the ε-tube, we propose a hybrid model that combines both SVR and auto-regressive integrated moving average (ARIMA) models to take advantage of the unique strengths of SVR and ARIMA in nonlinear and linear modeling; the hybrid is called SVRARIMA. A nonlinear analysis of the time series indicates the suitability of nonlinear modeling, and the SVR is applied to capture the nonlinear patterns. ARIMA models have been successfully applied to the residual regression estimation problem. The experimental results demonstrate that the proposed model outperforms existing neural-network approaches, traditional ARIMA models, and other hybrid models in terms of root mean square error and mean absolute percentage error.
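The hybrid idea can be sketched with stand-ins: an AR(1) fit captures the linear structure, and kernel ridge regression (substituting for SVR, which needs a QP solver) models the residuals. Both components and the data are illustrative assumptions, not the paper's exact models.

```python
import numpy as np

# Synthetic "price" series: level + daily cycle + noise (illustrative).
rng = np.random.default_rng(7)
t = np.arange(400)
price = 50 + 5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 1, t.size)

# Linear part: AR(1) via least squares on the lagged series.
y, ylag = price[1:], price[:-1]
a, b = np.polyfit(ylag, y, 1)
resid = y - (a * ylag + b)

# Nonlinear part: kernel ridge regression from time index to residual.
def krr_fit_predict(x, r, xq, ell=10.0, lam=1e-2):
    K = np.exp(-0.5 * (x[:, None] - x[None, :])**2 / ell**2)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), r)
    Kq = np.exp(-0.5 * (xq[:, None] - x[None, :])**2 / ell**2)
    return Kq @ alpha

x = t[1:].astype(float)
fitted = a * ylag + b + krr_fit_predict(x, resid, x)
rmse_lin = np.sqrt(np.mean(resid**2))
rmse_hyb = np.sqrt(np.mean((y - fitted)**2))
print(round(rmse_lin, 3), round(rmse_hyb, 3))
```

The residual model mops up the nonlinear pattern the linear stage misses, which is the division of labor the SVRARIMA hybrid exploits (there ARIMA handles the linear part and SVR the nonlinear one).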

  13. Piecewise linear regression splines with hyperbolic covariates

    International Nuclear Information System (INIS)

    Cologne, John B.; Sposto, Richard

    1992-09-01

    Consider the problem of fitting a curve to data that exhibit a multiphase linear response with smooth transitions between phases. We propose substituting hyperbolas as covariates in piecewise linear regression splines to obtain curves that are smoothly joined. The method provides an intuitive and easy way to extend the two-phase linear hyperbolic response model of Griffiths and Miller and Watts and Bacon to accommodate more than two linear segments. The resulting regression spline with hyperbolic covariates may be fit by nonlinear regression methods to estimate the degree of curvature between adjoining linear segments. The added complexity of fitting nonlinear, as opposed to linear, regression models is not great. The extra effort is particularly worthwhile when investigators are unwilling to assume that the slope of the response changes abruptly at the join points. We can also estimate the join points (the values of the abscissas where the linear segments would intersect if extrapolated) if their number and approximate locations may be presumed known. An example using data on changing age at menarche in a cohort of Japanese women illustrates the use of the method for exploratory data analysis. (author)
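A sketch of the hyperbolic-covariate idea (notation assumed): for a fixed join point tau and curvature parameter g, the model y = b0 + b1*x + b2*sqrt((x - tau)² + g²) is linear in the betas, so a small grid search over (tau, g) with ordinary least squares inside can stand in for full nonlinear regression.

```python
import numpy as np

# Synthetic two-phase data with a smooth bend near x = 5 (illustrative).
rng = np.random.default_rng(8)
x = np.linspace(0, 10, 200)
y_true = 1 + 0.5 * x + 1.0 * np.sqrt((x - 5)**2 + 1.0)
y = y_true + rng.normal(0, 0.1, x.size)

# Grid-search (tau, g); solve the betas by least squares at each grid point.
best = (np.inf, None)
for tau in np.linspace(3, 7, 41):
    for g in (0.5, 1.0, 2.0):
        A = np.column_stack([np.ones_like(x), x, np.sqrt((x - tau)**2 + g**2)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        sse = np.sum((y - A @ beta)**2)
        if sse < best[0]:
            best = (sse, (tau, g, beta))
sse, (tau, g, beta) = best
print(round(tau, 1))                 # estimated join point
```

The paper fits (tau, g) by proper nonlinear regression instead, which also yields standard errors for the join points; the grid search only illustrates the model structure.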

  14. Targeting: Logistic Regression, Special Cases and Extensions

    Directory of Open Access Journals (Sweden)

    Helmut Schaeben

    2014-12-01

    Logistic regression is a classical linear model for logit-transformed conditional probabilities of a binary target variable. It recovers the true conditional probabilities if the joint distribution of predictors and the target is of log-linear form. Weights-of-evidence is an ordinary logistic regression with parameters equal to the differences of the weights of evidence if all predictor variables are discrete and conditionally independent given the target variable. The hypothesis of conditional independence can be tested in terms of log-linear models. If the assumption of conditional independence is violated, the application of weights-of-evidence not only corrupts the predicted conditional probabilities, but also their rank transform. Logistic regression models including interaction terms can account for the lack of conditional independence; appropriate interaction terms compensate exactly for violations of conditional independence. Multilayer artificial neural nets may be seen as nested regression-like models with some sigmoidal activation function. Most often, the logistic function is used as the activation function. If the net topology, i.e., its control, is sufficiently versatile to mimic interaction terms, artificial neural nets are able to account for violations of conditional independence and yield very similar results. Weights-of-evidence cannot reasonably include interaction terms; subsequent modifications of the weights, as often suggested, cannot emulate the effect of interaction terms.

  15. Lunar Water Resource Demonstration

    Science.gov (United States)

    Muscatello, Anthony C.

    2008-01-01

    In cooperation with the Canadian Space Agency, the Northern Centre for Advanced Technology, Inc., the Carnegie-Mellon University, JPL, and NEPTEC, NASA has undertaken the In-Situ Resource Utilization (ISRU) project called RESOLVE. This project is a ground demonstration of a system that would be sent to explore permanently shadowed polar lunar craters, drill into the regolith, determine what volatiles are present, and quantify them in addition to recovering oxygen by hydrogen reduction. The Lunar Prospector has determined these craters contain enhanced hydrogen concentrations averaging about 0.1%. If the hydrogen is in the form of water, the water concentration would be around 1%, which would translate into billions of tons of water on the Moon, a tremendous resource. The Lunar Water Resource Demonstration (LWRD) is a part of RESOLVE designed to capture lunar water and hydrogen and quantify them as a backup to gas chromatography analysis. This presentation will briefly review the design of LWRD and some of the results of testing the subsystem. RESOLVE is to be integrated with the Scarab rover from CMU and the whole system demonstrated on Mauna Kea in Hawaii in November 2008. The implications of lunar water for Mars exploration are two-fold: 1) RESOLVE and LWRD could be used in a similar fashion on Mars to locate and quantify water resources, and 2) electrolysis of lunar water could provide large amounts of liquid oxygen in LEO, leading to lower costs for travel to Mars, in addition to being very useful at lunar outposts.

  16. ROBOT LEARNING OF OBJECT MANIPULATION TASK ACTIONS FROM HUMAN DEMONSTRATIONS

    Directory of Open Access Journals (Sweden)

    Maria Kyrarini

    2017-08-01

    Robot learning from demonstration is a method which enables robots to learn in a similar way as humans. In this paper, a framework that enables robots to learn from multiple human demonstrations via kinesthetic teaching is presented. The subject of learning is a high-level sequence of actions, as well as the low-level trajectories the robot must follow to perform the object manipulation task. The multiple human demonstrations are recorded, and only the most similar demonstrations are selected for robot learning. The high-level learning module identifies the sequence of actions of the demonstrated task. Using Dynamic Time Warping (DTW) and a Gaussian Mixture Model (GMM), the model of the demonstrated trajectories is learned. The learned trajectory is generated by Gaussian mixture regression (GMR) from the learned Gaussian mixture model. In the online working phase, the sequence of actions is identified, and experimental results show that the robot performs the learned task successfully.
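The GMR step can be sketched with a hand-set two-component GMM over a scalar input t and output y (the parameters are illustrative, not learned from demonstrations): the regression mixes the per-component conditional means, weighted by each component's responsibility for t.

```python
import numpy as np

# Hand-set GMM over (t, y): two components on a rising path.
means = np.array([[0.0, 0.0], [1.0, 1.0]])           # per component: (t, y)
covs = np.array([[[0.05, 0.02], [0.02, 0.05]],
                 [[0.05, 0.02], [0.02, 0.05]]])
w = np.array([0.5, 0.5])

def gmr(t):
    """Conditional mean E[y | t] under the mixture (scalar case)."""
    var_t = covs[:, 0, 0]
    # responsibilities from the marginal density over t
    h = w * np.exp(-0.5 * (t - means[:, 0])**2 / var_t) / np.sqrt(var_t)
    h = h / h.sum()
    # per-component conditional mean of y given t
    cond = means[:, 1] + covs[:, 0, 1] / var_t * (t - means[:, 0])
    return float(h @ cond)

print(round(gmr(0.0), 2), round(gmr(1.0), 2))
```

In the framework above, t would be the time index (after DTW alignment) and y a trajectory coordinate, with the GMM parameters fitted by EM rather than set by hand.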

  17. Commercial incineration demonstration

    International Nuclear Information System (INIS)

    Vavruska, J.S.; Borduin, L.C.

    1982-01-01

    Low-level radioactive wastes (LLW) generated by nuclear utilities presently are shipped to commercial burial grounds for disposal. Increasing transportation and disposal costs have caused industry to consider incineration as a cost-effective means of volume reduction of combustible LLW. Repeated inquiries from the nuclear industry regarding the applicability of the Los Alamos controlled air incineration (CAI) design led the DOE to initiate a commercial demonstration program in FY-1980. Development studies and results in support of this program involving ion exchange resin incineration and fission/activation product distributions within the Los Alamos CAI are described

  18. Prediction of unwanted pregnancies using logistic regression, probit regression and discriminant analysis.

    Science.gov (United States)

    Ebrahimzadeh, Farzad; Hajizadeh, Ebrahim; Vahabi, Nasim; Almasian, Mohammad; Bakhteyar, Katayoon

    2015-01-01

    Unwanted pregnancy, not intended by at least one of the parents, has undesirable consequences for the family and the society. In the present study, three classification models were used and compared to predict unwanted pregnancies in an urban population. In this cross-sectional study, 887 pregnant mothers referred to health centers in Khorramabad, Iran, in 2012 were selected by stratified and cluster sampling; relevant variables were measured, and logistic regression, discriminant analysis, and probit regression models (SPSS software version 21) were used to predict unwanted pregnancy. To compare these models, indicators such as sensitivity, specificity, the area under the ROC curve, and the percentage of correct predictions were used. The prevalence of unwanted pregnancies was 25.3%. The logistic and probit regression models indicated that parity, pregnancy spacing, contraceptive methods, household income, and number of living male children were related to unwanted pregnancy. The performance of the models based on the area under the ROC curve was 0.735, 0.733, and 0.680 for logistic regression, probit regression, and linear discriminant analysis, respectively. Given the relatively high prevalence of unwanted pregnancies in Khorramabad, it seems necessary to revise family planning programs. Despite the similar accuracy of the models, if the researcher is interested in the interpretability of the results, the use of the logistic regression model is recommended.
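The comparison indicators named above can be computed directly from labels and predicted probabilities; a sketch using the rank (Mann-Whitney) formulation of the area under the ROC curve, with example values assumed:

```python
import numpy as np

def sens_spec_auc(y, p, cut=0.5):
    """Sensitivity and specificity at a cutoff, plus rank-based ROC AUC."""
    pred = (p >= cut).astype(int)
    sens = np.mean(pred[y == 1] == 1)    # true positive rate
    spec = np.mean(pred[y == 0] == 0)    # true negative rate
    ranks = np.argsort(np.argsort(p)) + 1          # 1-based ranks (no ties)
    n1, n0 = (y == 1).sum(), (y == 0).sum()
    auc = (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)
    return sens, spec, auc

# Toy labels and predicted probabilities (illustrative only).
y = np.array([0, 0, 0, 1, 1, 1, 0, 1])
p = np.array([0.1, 0.4, 0.35, 0.8, 0.65, 0.9, 0.6, 0.3])
print(sens_spec_auc(y, p))
```

Sensitivity and specificity depend on the chosen cutoff, whereas the AUC summarizes discrimination across all cutoffs, which is why the study reports it for the model comparison.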

  19. Fiscal 1999 report on results of joint demonstrative project for environmentally benign coal utilization system. Demonstrative project concerning coal preparation technology (China); 1999 nendo kankyo chowagata sekitan riyo system kyodo jissho jigyo seika hokokusho. Sentan gijutsu ni kakawaru jissho jigyo (Chugoku)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    This paper describes the demonstrative project for coal preparation technology, part of the measures against environmental pollution through the structuring of a demonstration and dissemination basis for clean coal technologies in China; the results for fiscal 1999 are reported. In the utilization of coal in China, a problem of urgency is the highly efficient selection and removal of sulfur contents in raw coal. Coal production in Chongqing City is 30 million tons yearly, of which 90% contains sulfur contents of 3% or higher. At Jinjia Colliery of Panjiang Coal and Electric Co. Ltd., Guizhou Province, the site for the present project, a number of coal seams are unsuitable for single utilization because of high sulfur contents. The coal preparation technologies to be introduced are expected to improve coal preparation efficiency and desulfurization ratio in terms of both the washability of raw coal and the accuracy of the coal washer. This is the third year of the project, with the following activities performed: research/design, manufacturing/procurement of equipment, design for construction work, training of operators, and documentation. The manufacturing and procurement cover such equipment as a vacuum disk filter with accessories, waste water thickener, pressure filter for tailings with accessories, flocculant pump/piping, slurry tank/pump, high-shear mixer with accessories, and electric instrumentation. All the equipment arrived at the site in January 2001. (NEDO)

  20. Regression: The Apple Does Not Fall Far From the Tree.

    Science.gov (United States)

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques which, when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause-and-effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
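
    The link between correlation and simple linear regression described in this tutorial can be made concrete with a short sketch (the numbers below are invented for illustration):

```python
import math

def simple_linear_regression(xs, ys):
    """OLS estimates: slope = Sxy/Sxx, intercept = ybar - slope * xbar."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Pearson correlation r quantifies the strength of the association;
    # r**2 is the share of variance in y explained by the fitted line.
    r = sxy / math.sqrt(sxx * syy)
    return intercept, slope, r

xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]
b0, b1, r = simple_linear_regression(xs, ys)
```

    Multiple, logistic, ordinal, and Poisson regression generalize this same idea: a linear predictor in the x's, connected to the outcome through an appropriate link function.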

  1. Spontaneous regression of pulmonary bullae

    International Nuclear Information System (INIS)

    Satoh, H.; Ishikawa, H.; Ohtsuka, M.; Sekizawa, K.

    2002-01-01

    The natural history of pulmonary bullae is often characterized by gradual, progressive enlargement. Spontaneous regression of bullae is, however, very rare. We report a case in which complete resolution of pulmonary bullae in the left upper lung occurred spontaneously. The management of pulmonary bullae is occasionally made difficult because of gradual progressive enlargement associated with abnormal pulmonary function. Some patients have multiple bullae in both lungs and/or have a history of pulmonary emphysema. Others have a giant bulla without emphysematous change in the lungs. Our patient had previously been treated for lung cancer, with no evidence of local recurrence. He showed no emphysematous change on lung function testing and had no complaints, although high-resolution CT showed evidence of underlying minimal emphysematous changes. Ortin and Gurney presented three cases of spontaneous reduction in the size of bullae. Interestingly, one of them had a marked decrease in the size of a bulla in association with thickening of the wall of the bulla, which was also observed in our patient. The case we describe is of interest not only because of the rarity with which regression of a pulmonary bulla has been reported in the literature, but also because of the spontaneous improvement in the radiological picture in the absence of overt infection or tumor. Copyright (2002) Blackwell Science Pty Ltd

  2. Quantum algorithm for linear regression

    Science.gov (United States)

    Wang, Guoming

    2017-07-01

    We present a quantum algorithm for fitting a linear regression model to a given data set using the least-squares approach. Unlike previous algorithms, which yield a quantum state encoding the optimal parameters, our algorithm outputs these numbers in classical form. So by running it once, one completely determines the fitted model and can then use it to make predictions on new data at little cost. Moreover, our algorithm works in the standard oracle model and can handle data sets with nonsparse design matrices. It runs in time poly(log2(N), d, κ, 1/ε), where N is the size of the data set, d is the number of adjustable parameters, κ is the condition number of the design matrix, and ε is the desired precision in the output. We also show that the polynomial dependence on d and κ is necessary; thus, our algorithm cannot be significantly improved. Furthermore, we give a quantum algorithm that estimates the quality of the least-squares fit (without computing its parameters explicitly). This algorithm runs faster than the one for finding the fit, and can be used to check whether the given data set qualifies for linear regression in the first place.

  3. Interpretation of commonly used statistical regression models.

    Science.gov (United States)

    Kasza, Jessica; Wolfe, Rory

    2014-01-01

    A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.

  4. Fiscal 2000 report on result of R and D of nonmetallic material recycling promotion technology (demonstration test and research, total system technology); 2000 nendo hitetsu kinzokukei sozai recycle sokushin gijutsu kenkyu kaihatsu seika hokokusho. Jissho shiken kenkyu, total system gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    R and D was conducted on advanced recycling technology for aluminum and base metal/rare metal based materials, with fiscal 2000 results compiled. In the research of aluminum recycling technology, the existing facilities for the continuous fractional crystallization process and the purification-by-zinc-removal process each demonstrated that they could simulate an aluminum scrap melting process with a capacity of 1,000 t/month, with a series of initial conditions determined. In the research of total system technology, combined test facilities were completed in which a purification process and a melt cleaning process were integrated. In the research of recycling technology for base metal/rare metal based materials, a test was carried out at demonstration facilities with the aim of establishing copper regeneration technology in which high-grade copper is produced from metal/resin based scraps, such as automobile shredder dust, as raw materials. In structuring the total system technology, a preliminary survey and environmental load measures were carried out toward the practical realization of a comprehensive copper metal collection and recycling system. (NEDO)

  5. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings such transformations are often unnecessary and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage, i.e., the number of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10), violations of this normality assumption often do not noticeably impact results. In contrast, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates owing to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
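
    The coverage argument can be reproduced in miniature: simulate a linear model with deliberately non-normal (exponential) errors and count how often the nominal 95% interval covers the true slope. This is a sketch with invented parameters, not the paper's simulation design:

```python
import math
import random

def ols_ci(xs, ys):
    """Slope estimate with a 95% CI from the usual OLS standard error."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b1 = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    b0 = my - b1 * mx
    resid = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
    s2 = sum(e * e for e in resid) / (n - 2)   # residual variance
    se = math.sqrt(s2 / sxx)
    return b1 - 1.96 * se, b1 + 1.96 * se

random.seed(1)
true_slope = 2.0
trials, hits = 500, 0
for _ in range(trials):
    xs = [random.uniform(0.0, 1.0) for _ in range(200)]
    # Mean-zero but skewed (exponential) errors: normality is violated
    ys = [1.0 + true_slope * x + random.expovariate(1.0) - 1.0 for x in xs]
    lo, hi = ols_ci(xs, ys)
    hits += lo <= true_slope <= hi
coverage = hits / trials   # stays close to the nominal 0.95
```

    With n = 200 per fit, the slope estimator is close to normal by the central limit theorem, so coverage remains near 95% despite the skewed errors, which is the commentary's point.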

  6. Ridge regression estimator: combining unbiased and ordinary ridge regression methods of estimation

    Directory of Open Access Journals (Sweden)

    Sharad Damodar Gore

    2009-10-01

    Full Text Available Statistical literature has several methods for coping with multicollinearity. This paper introduces a new shrinkage estimator, called modified unbiased ridge (MUR). This estimator is obtained from unbiased ridge regression (URR) in the same way that ordinary ridge regression (ORR) is obtained from ordinary least squares (OLS). Properties of MUR are derived. Results on its matrix mean squared error (MMSE) are obtained. MUR is compared with ORR and URR in terms of MMSE. These results are illustrated with an example based on data generated by Hoerl and Kennard (1975).
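
    A one-predictor sketch of the shrinkage mechanic that ORR (and hence URR and MUR) builds on: the ridge constant k inflates the denominator of the OLS slope, trading a little bias for reduced variance when Sxx is small, as under near-collinearity. The data and the value of k below are invented:

```python
def centered_sums(xs, ys):
    """Centered sums of squares and cross-products for one predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxx, sxy

def ols_slope(xs, ys):
    sxx, sxy = centered_sums(xs, ys)
    return sxy / sxx

def ridge_slope(xs, ys, k):
    """Ridge shrinks the OLS slope by adding k >= 0 to the denominator."""
    sxx, sxy = centered_sums(xs, ys)
    return sxy / (sxx + k)

xs = [0.1, 0.2, 0.3, 0.4, 0.5]      # small Sxx: unstable OLS scale
ys = [1.02, 1.21, 1.38, 1.62, 1.79]
b_ols = ols_slope(xs, ys)
b_ridge = ridge_slope(xs, ys, k=0.05)   # |b_ridge| < |b_ols|
```

    In the multivariate case the same idea reads b(k) = (X'X + kI)^{-1} X'y; URR and MUR modify this formula to remove or reduce the bias, as derived in the paper.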

  7. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social phenomena. This position is fundamentally problematic to public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly seen as necessary in order to identify aggregate level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds...

  8. Two biased estimation techniques in linear regression: Application to aircraft

    Science.gov (United States)

    Klein, Vladislav

    1988-01-01

    Several ways for detection and assessment of collinearity in measured data are discussed. Because data collinearity usually results in poor least squares estimates, two estimation techniques which can limit a damaging effect of collinearity are presented. These two techniques, the principal components regression and mixed estimation, belong to a class of biased estimation techniques. Detection and assessment of data collinearity and the two biased estimation techniques are demonstrated in two examples using flight test data from longitudinal maneuvers of an experimental aircraft. The eigensystem analysis and parameter variance decomposition appeared to be a promising tool for collinearity evaluation. The biased estimators had far better accuracy than the results from the ordinary least squares technique.

  9. Modeling oil production based on symbolic regression

    International Nuclear Information System (INIS)

    Yang, Guangfei; Li, Xianneng; Wang, Jianliang; Lian, Lian; Ma, Tieju

    2015-01-01

    Numerous models have been proposed to forecast the future trends of oil production and almost all of them are based on some predefined assumptions with various uncertainties. In this study, we propose a novel data-driven approach that uses symbolic regression to model oil production. We validate our approach on both synthetic and real data, and the results prove that symbolic regression could effectively identify the true models beneath the oil production data and also make reliable predictions. Symbolic regression indicates that world oil production will peak in 2021, which broadly agrees with other techniques used by researchers. Our results also show that the rate of decline after the peak is almost half the rate of increase before the peak, and it takes nearly 12 years to drop 4% from the peak. These predictions are more optimistic than those in several other reports, and the smoother decline will provide the world, especially the developing countries, with more time to orchestrate mitigation plans. -- Highlights: •A data-driven approach has been shown to be effective at modeling the oil production. •The Hubbert model could be discovered automatically from data. •The peak of world oil production is predicted to appear in 2021. •The decline rate after peak is half of the increase rate before peak. •Oil production projected to decline 4% post-peak

  10. Image superresolution using support vector regression.

    Science.gov (United States)

    Ni, Karl S; Nguyen, Truong Q

    2007-06-01

    A thorough investigation of the application of support vector regression (SVR) to the superresolution problem is conducted through various frameworks. Prior to the study, the SVR problem is enhanced by finding the optimal kernel. This is done by formulating the kernel learning problem in SVR form as a convex optimization problem, specifically a semi-definite programming (SDP) problem. An additional constraint is added to reduce the SDP to a quadratically constrained quadratic programming (QCQP) problem. After this optimization, investigation of the relevancy of SVR to superresolution proceeds with the possibility of using a single and general support vector regression for all image content, and the results are impressive for small training sets. This idea is improved upon by observing structural properties in the discrete cosine transform (DCT) domain to aid in learning the regression. Further improvement involves a combination of classification and SVR-based techniques, extending works in resolution synthesis. This method, termed kernel resolution synthesis, uses specific regressors for isolated image content to describe the domain through a partitioned look of the vector space, thereby yielding good results.

  11. Tutorial on Using Regression Models with Count Outcomes Using R

    Directory of Open Access Journals (Sweden)

    A. Alexander Beaujean

    2016-02-01

    Full Text Available Education researchers often study count variables, such as times a student reached a goal, discipline referrals, and absences. Most researchers who study these variables use typical regression methods (i.e., ordinary least-squares), either with or without transforming the count variables. In either case, using typical regression for count data can produce parameter estimates that are biased, thus diminishing any inferences made from such data. As count-variable regression models are seldom taught in training programs, we present a tutorial to help educational researchers use such methods in their own research. We demonstrate analyzing and interpreting count data using Poisson, negative binomial, zero-inflated Poisson, and zero-inflated negative binomial regression models. The count regression methods are introduced through an example using the number of times students skipped class. The data for this example are freely available, and the R syntax used to run the example analyses is included in the Appendix.
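
    The tutorial's worked examples are in R; as a language-neutral sketch of the simplest of these models, a one-predictor Poisson regression can be fitted by Newton-Raphson in a few lines. The synthetic counts and coefficients below are invented, not the skipped-class data:

```python
import math
import random

def rpois(lam):
    """Draw a Poisson variate (Knuth's multiplication method)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def fit_poisson(xs, ys, iters=25):
    """Poisson regression log(mu) = b0 + b1*x via Newton-Raphson."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu            # score (gradient of log-likelihood)
            g1 += (y - mu) * x
            h00 += mu               # Fisher information matrix entries
            h01 += mu * x
            h11 += mu * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det   # Newton step: I^{-1} * score
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

random.seed(42)
xs = [random.uniform(0.0, 1.0) for _ in range(400)]
ys = [rpois(math.exp(0.2 + 1.0 * x)) for x in xs]   # true b0=0.2, b1=1.0
b0, b1 = fit_poisson(xs, ys)
```

    Negative binomial and zero-inflated variants extend this same likelihood machinery to handle overdispersion and excess zeros, as the tutorial explains.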

  12. Studies to demonstrate the adequacy of testing results of the qualification tests for the actuator of main steam safety relive valves (MSSRV) in an advanced boiling water reactor (ABWR)

    International Nuclear Information System (INIS)

    Gou, P.F.; Patel, R.; Curran, G.; Henrie, D.; Solorzano, E.

    2005-01-01

    This paper presents several studies performed to demonstrate that the testing results from the qualification tests for the actuator of the Main Steam Safety Relief Valves (MSSRV; also called SRV in this paper) in GE's Advanced Boiling Water Reactor (ABWR) are in compliance with the qualification guidelines stipulated in the applicable IEEE standards. The safety-related function of the MSSRV is to relieve pressure in order to protect the reactor pressure vessel from an over-pressurization condition during normal operation and design basis events. In order to perform this function, the SRV must actuate at a given set pressure while maintaining the pressure and structural integrity of the SRV. The valves are provided with an electro-pneumatic actuator assembly that opens the valve upon receipt of an automatic or manually initiated electric signal to allow depressurization of the reactor pressure vessel (RPV). To assure that the SRV can perform its intended safety-related functions properly, qualification tests are needed, in addition to analysis, to demonstrate that the SRV can withstand the specified environmental, dynamic and seismic design basis conditions without impairing its safety-related function throughout its installed life under the design conditions, including postulated design basis events such as OBE loads and Faulted (SSE) events. The guidelines used for the test methods, procedures and acceptance criteria for the qualification tests are established in IEEE Std 344-1987 and IEEE Std 382-1985. In the qualification tests, the specimen consists of the actuator, control valve assembly, limit switches, and limit switch support structure. During the functional, dynamic and seismic tests, the test specimen was mounted on an SRV. Qualification of safety-related equipment to meet the guidelines of the IEEE standards is typically a two-step process: 1) environmental aging and 2) design basis events qualification. The purpose of the first step is to put the equipment in an

  13. Biodiesel Mass Transit Demonstration

    Science.gov (United States)

    2010-04-01

    The Biodiesel Mass Transit Demonstration report is intended for mass transit decision makers and fleet managers considering biodiesel use. This is the final report for the demonstration project implemented by the National Biodiesel Board under a gran...

  14. Authoring Effective Demonstrations

    National Research Council Canada - National Science Library

    Fu, Dan; Jensen, Randy; Salas, Eduardo; Rosen, Michael A; Ramachandran, Sowmya; Upshaw, Christin L; Hinkelman, Elizabeth; Lampton, Don

    2007-01-01

    ... or human role-players for each training event. We report our ongoing efforts to (1) research the nature and purpose of demonstration, articulating guidelines for effective demonstration within a training context, and (2...

  15. Comparing Demonstratives in Kwa

    African Journals Online (AJOL)

    This paper is a comparative study of demonstrative forms in three K wa languages, ... relative distance from the deictic centre, such as English this and that, here and there. ... Mostly, the referents of demonstratives are 'activated' or at least.

  16. Polarized Light Corridor Demonstrations.

    Science.gov (United States)

    Davies, G. R.

    1990-01-01

    Eleven demonstrations of light polarization are presented. Each includes a brief description of the apparatus and the effect demonstrated. Illustrated are strain patterns, reflection, scattering, the Faraday Effect, interference, double refraction, the polarizing microscope, and optical activity. (CW)

  17. Robust Regression and its Application in Financial Data Analysis

    OpenAIRE

    Mansoor Momeni; Mahmoud Dehghan Nayeri; Ali Faal Ghayoumi; Hoda Ghorbani

    2010-01-01

    This research aims to describe the application of robust regression and its advantages over the least squares regression method in analyzing financial data. To do this, the relationship between earnings per share, book value of equity per share and share price (the price model), and between earnings per share, annual change of earnings per share and stock return (the return model), is examined using both robust and least squares regressions, and finally the outcomes are compared. Comparing the results from th...

  18. Phase Space Prediction of Chaotic Time Series with Nu-Support Vector Machine Regression

    International Nuclear Information System (INIS)

    Ye Meiying; Wang Xiaodong

    2005-01-01

    A new class of support vector machine, the nu-support vector machine, is discussed, which can handle both classification and regression. We focus on nu-support vector machine regression and use it for phase space prediction of chaotic time series. The effectiveness of the method is demonstrated by applying it to the Hénon map. This study also compares the nu-support vector machine with back propagation (BP) networks in order to better evaluate the performance of the proposed method. The experimental results show that nu-support vector machine regression obtains a lower root mean squared error than the BP networks and provides accurate chaotic time series prediction. These results are attributable to the fact that the nu-support vector machine implements the structural risk minimization principle, which leads to better generalization than the BP networks.

  19. No rationale for 1 variable per 10 events criterion for binary logistic regression analysis

    Directory of Open Access Journals (Sweden)

    Maarten van Smeden

    2016-11-01

    Full Text Available Abstract Background Ten events per variable (EPV) is a widely advocated minimal criterion for sample size considerations in logistic regression analysis. Of three previous simulation studies that examined this minimal EPV criterion, only one supports the use of a minimum of 10 EPV. In this paper, we examine the reasons for the substantial differences between these extensive simulation studies. Methods The current study uses Monte Carlo simulations to evaluate small-sample bias, coverage of confidence intervals and mean square error of logit coefficients. Logistic regression models fitted by maximum likelihood and by a modified estimation procedure, known as Firth's correction, are compared. Results The results show that, besides EPV, the problems associated with low EPV depend on other factors such as the total sample size. It is also demonstrated that simulation results can be dominated by even a few simulated data sets for which the prediction of the outcome by the covariates is perfect ('separation'). We reveal that different approaches for identifying and handling separation lead to substantially different simulation results. We further show that Firth's correction can be used to improve the accuracy of regression coefficients and alleviate the problems associated with separation. Conclusions The current evidence supporting EPV rules for binary logistic regression is weak. Given our findings, there is an urgent need for new research to provide guidance for sample size considerations for binary logistic regression analysis.

  20. Support Vector Regression Model Based on Empirical Mode Decomposition and Auto Regression for Electric Load Forecasting

    Directory of Open Access Journals (Sweden)

    Hong-Juan Li

    2013-04-01

    Full Text Available Electric load forecasting is an important issue for a power utility, associated with the management of daily operations such as energy transfer scheduling, unit commitment, and load dispatch. Inspired by strong non-linear learning capability of support vector regression (SVR, this paper presents a SVR model hybridized with the empirical mode decomposition (EMD method and auto regression (AR for electric load forecasting. The electric load data of the New South Wales (Australia market are employed for comparing the forecasting performances of different forecasting models. The results confirm the validity of the idea that the proposed model can simultaneously provide forecasting with good accuracy and interpretability.

  1. Using the Ridge Regression Procedures to Estimate the Multiple Linear Regression Coefficients

    Science.gov (United States)

    Gorgees, HazimMansoor; Mahdi, FatimahAssim

    2018-05-01

    This article is concerned with comparing the performance of different types of ordinary ridge regression estimators that have already been proposed to estimate the regression parameters when near-exact linear relationships among the explanatory variables are present. For this situation, we employ data obtained from the tagi gas filling company during the period 2008-2010. The main result we reached is that the method based on the condition number performs better than the other stated methods, since it has a smaller mean square error (MSE).

  2. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

    Full Text Available We explore Bayesian inference of a multivariate linear regression model with the use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior: a multivariate normal distribution for the regression coefficients and an inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.

  3. Face Alignment via Regressing Local Binary Features.

    Science.gov (United States)

    Ren, Shaoqing; Cao, Xudong; Wei, Yichen; Sun, Jian

    2016-03-01

    This paper presents a highly efficient and accurate regression approach for face alignment. Our approach has two novel components: 1) a set of local binary features and 2) a locality principle for learning those features. The locality principle guides us to learn a set of highly discriminative local binary features for each facial landmark independently. The obtained local binary features are used to jointly learn a linear regression for the final output. This approach achieves state-of-the-art results when tested on the most challenging benchmarks to date. Furthermore, because extracting and regressing local binary features are computationally very cheap, our system is much faster than previous methods. It achieves over 3000 frames per second (FPS) on a desktop or 300 FPS on a mobile phone for locating a few dozen landmarks. We also study a key issue that is important but has received little attention in previous research: the face detector used to initialize alignment. We investigate several face detectors and perform quantitative evaluation on how they affect alignment accuracy. We find that an alignment-friendly detector can further greatly boost the accuracy of our alignment method, reducing the error by up to 16% in relative terms. To facilitate practical usage of face detection/alignment methods, we also propose a convenient metric to measure how good a detector is for alignment initialization.

  4. Geographically weighted regression model on poverty indicator

    Science.gov (United States)

    Slamet, I.; Nugroho, N. F. T. A.; Muslich

    2017-12-01

    In this research, we applied geographically weighted regression (GWR) to analyze poverty in Central Java, using a Gaussian kernel as the weighting function. GWR uses the diagonal matrix that results from evaluating the Gaussian kernel function as the weight matrix in the regression model. The kernel weights are used to handle spatial effects in the data so that a model can be obtained for each location. The purpose of this paper is to model poverty percentage data in Central Java province using GWR with a Gaussian kernel weighting function and to determine the influencing factors in each regency/city in Central Java province. Based on the research, we obtained a geographically weighted regression model with a Gaussian kernel weighting function for poverty percentage data in Central Java province. We found that the percentage of the population working as farmers, the population growth rate, the percentage of households with regular sanitation, and BPJS beneficiaries are the variables that affect the percentage of poverty in Central Java province. We found a coefficient of determination R2 of 68.64%. The regencies/cities fall into two categories, each influenced by a different set of significant factors.
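
    The core GWR step, a weighted least-squares fit at each focal location with Gaussian kernel weights, can be sketched for one predictor on synthetic spatial data (locations, slopes, and the bandwidth below are invented, not the Central Java data):

```python
import math
import random

def gwr_local_fit(locs, xs, ys, focal, bandwidth):
    """Weighted least squares at one focal point with Gaussian kernel weights."""
    # Gaussian kernel: weight decays with distance from the focal location
    w = [math.exp(-0.5 * (math.dist(p, focal) / bandwidth) ** 2) for p in locs]
    sw = sum(w)
    mx = sum(wi * x for wi, x in zip(w, xs)) / sw
    my = sum(wi * y for wi, y in zip(w, ys)) / sw
    sxx = sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs))
    sxy = sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

random.seed(7)
locs = [(i, j) for i in range(10) for j in range(10)]   # a 10x10 grid
xs = [random.uniform(0.0, 1.0) for _ in locs]
# True slope drifts from 1 (west, i=0) to 3 (east, i=9): a spatial effect
ys = [(1.0 + 2.0 * i / 9.0) * x + random.gauss(0.0, 0.05)
      for (i, j), x in zip(locs, xs)]
_, west = gwr_local_fit(locs, xs, ys, focal=(0.0, 4.5), bandwidth=2.0)
_, east = gwr_local_fit(locs, xs, ys, focal=(9.0, 4.5), bandwidth=2.0)
```

    Repeating the local fit at every observation location yields one coefficient vector per regency/city; the bandwidth controls how local the estimates are and is typically chosen by cross-validation.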

  5. Spontaneous regression of intracranial malignant lymphoma

    International Nuclear Information System (INIS)

    Kojo, Nobuto; Tokutomi, Takashi; Eguchi, Gihachirou; Takagi, Shigeyuki; Matsumoto, Tomie; Sasaguri, Yasuyuki; Shigemori, Minoru.

    1988-01-01

    In a 46-year-old female with a 1-month history of gait and speech disturbances, computed tomography (CT) demonstrated mass lesions of slightly high density in the left basal ganglia and left frontal lobe. The lesions were markedly enhanced by contrast medium. The patient received no specific treatment, but her clinical manifestations gradually abated and the lesions decreased in size. Five months after her initial examination, the lesions were absent on CT scans; only a small area of low density remained. Residual clinical symptoms included mild right hemiparesis and aphasia. After 14 months the patient again deteriorated, and a CT scan revealed mass lesions in the right frontal lobe and the pons. However, no enhancement was observed in the previously affected regions. A biopsy revealed malignant lymphoma. Despite treatment with steroids and radiation, the patient's clinical status progressively worsened and she died 27 months after initial presentation. Seven other cases of spontaneous regression of primary malignant lymphoma have been reported. In this case, the mechanism of the spontaneous regression was not clear, but changes in immunologic status may have been involved. (author)

  6. Supporting Regularized Logistic Regression Privately and Efficiently

    Science.gov (United States)

    Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei

    2016-01-01

    As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely used statistical model that has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc. PMID:27271738

  8. Hyperspectral Unmixing with Robust Collaborative Sparse Regression

    Directory of Open Access Journals (Sweden)

    Chang Li

    2016-07-01

    Full Text Available Recently, sparse unmixing (SU) of hyperspectral data has received particular attention for analyzing remote sensing images. However, most SU methods are based on the commonly admitted linear mixing model (LMM), which ignores possible nonlinear effects. In this paper, we propose a new method named robust collaborative sparse regression (RCSR), based on the robust LMM (rLMM), for hyperspectral unmixing. The rLMM takes the nonlinearity into consideration, treating it as an outlier term with an underlying sparse property. The RCSR simultaneously takes into consideration the collaborative sparse property of the abundance and the sparsely distributed additive property of the outlier, so the problem can be formulated as a robust joint sparse regression problem. The inexact augmented Lagrangian method (IALM) is used to optimize the proposed RCSR. Qualitative and quantitative experiments on synthetic datasets and real hyperspectral images demonstrate that the proposed RCSR is efficient for solving the hyperspectral SU problem compared with four state-of-the-art algorithms.

  9. Supporting Regularized Logistic Regression Privately and Efficiently.

    Directory of Open Access Journals (Sweden)

    Wenfa Li

    Full Text Available As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, social sciences, information technology, and so on. These domains often involve data of human subjects that are contingent upon strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work here focuses on safeguarding regularized logistic regression, a widely used statistical model that has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as research consortia or networks as widely seen in genetics, epidemiology, social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications from various disciplines, including genetic and biomedical studies, smart grid, network analysis, etc.

  10. Credit Scoring Problem Based on Regression Analysis

    OpenAIRE

    Khassawneh, Bashar Suhil Jad Allah

    2014-01-01

    ABSTRACT: This thesis provides an explanatory introduction to the regression models of data mining and contains basic definitions of key terms in the linear, multiple and logistic regression models. The aim of this study is to illustrate fitting models for the credit scoring problem using simple linear, multiple linear and logistic regression models, and to analyze the resulting model functions with statistical tools. Keywords: Data mining, linear regression, logistic regression....

  11. Logistic regression applied to natural hazards: rare event logistic regression with replications

    Directory of Open Access Journals (Sweden)

    M. Guns

    2012-06-01

    Full Text Available Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
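
    The replication idea above can be sketched numerically: fit a logistic model on many bootstrap resamples of one data set and measure how stable each coefficient's sign is across replications. The simulated rare-event data, the plain Newton solver, and the sign-stability summary are illustrative assumptions, not the authors' implementation.

```python
# Rare-event logistic regression with replications, minimal sketch:
# refit on bootstrap resamples and track the stability of coefficient signs.
import numpy as np

rng = np.random.default_rng(1)
n, p = 400, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta_true = np.array([-3.0, 1.5, 0.0, 0.0])   # rare events; one real factor
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ beta_true)))).astype(float)

def fit_logistic(X, y, iters=40, ridge=1e-3):
    """Newton-Raphson for the logistic MLE, lightly ridged for stability."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = np.clip(X @ beta, -30, 30)
        p_hat = 1.0 / (1.0 + np.exp(-eta))
        H = X.T @ (X * (p_hat * (1 - p_hat))[:, None]) + ridge * np.eye(X.shape[1])
        beta = beta + np.linalg.solve(H, X.T @ (y - p_hat))
    return beta

reps = 200
signs = np.zeros((reps, p + 1))
for r in range(reps):
    idx = rng.integers(0, n, size=n)          # bootstrap replication
    signs[r] = np.sign(fit_logistic(X[idx], y[idx]))
stability = np.abs(signs.mean(axis=0))        # 1.0 = perfectly stable sign
```

    A genuinely controlling factor (here the second column) keeps a stable sign across replications; coefficients of noise variables are typically far less stable, which is the sample-dependence the abstract warns about.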

  12. Logistic regression applied to natural hazards: rare event logistic regression with replications

    Science.gov (United States)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.

  13. Strategy Guideline: Demonstration Home

    Energy Technology Data Exchange (ETDEWEB)

    Savage, C.; Hunt, A.

    2012-12-01

    This guideline will provide a general overview of the different kinds of demonstration home projects, a basic understanding of the different roles and responsibilities involved in the successful completion of a demonstration home, and an introduction into some of the lessons learned from actual demonstration home projects. Also, this guideline will specifically look at the communication methods employed during demonstration home projects. And lastly, we will focus on how to best create a communication plan for including an energy efficient message in a demonstration home project and carry that message to successful completion.

  15. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    Science.gov (United States)

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in the identification of human remains in forensic examinations. The present study aims to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated stature and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length and foot breadth), taken on the left side of each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The stature estimated from the multiplication factors and from regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in stature estimation by the regression analysis method is smaller than that of the multiplication factor method, confirming that regression analysis is better than multiplication factor analysis for stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
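
    The two methods being compared can be illustrated on simulated measurements: a multiplication factor (mean stature-to-measurement ratio) versus a fitted regression line. All numbers below are synthetic placeholders, not the study's North Indian data.

```python
# Toy comparison of stature estimation by multiplication factor vs regression.
# Measurements are simulated; coefficients are not from the study.
import numpy as np

rng = np.random.default_rng(2)
n = 246
foot_length = rng.normal(24.5, 1.4, size=n)                       # cm
stature = 60.0 + 4.2 * foot_length + rng.normal(0, 3.0, size=n)   # cm

# Multiplication factor method: mean stature/measurement ratio.
mf = np.mean(stature / foot_length)
est_mf = mf * foot_length

# Regression method: least-squares line stature = a + b * foot_length.
b, a = np.polyfit(foot_length, stature, 1)
est_reg = a + b * foot_length

err_mf = np.mean(np.abs(stature - est_mf))    # mean absolute error, MF method
err_reg = np.mean(np.abs(stature - est_reg))  # mean absolute error, regression
```

    Because the simulated relation has a nonzero intercept, the multiplication factor is systematically biased for short and long feet, so the regression error is smaller, mirroring the study's conclusion.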

  16. Regularized Label Relaxation Linear Regression.

    Science.gov (United States)

    Fang, Xiaozhao; Xu, Yong; Li, Xuelong; Lai, Zhihui; Wong, Wai Keung; Fang, Bingwu

    2018-04-01

    Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs a class compactness graph based on manifold learning and uses it as the regularization term to avoid overfitting. The class compactness graph is used to ensure that samples sharing the same labels are kept close after they are transformed. Two different algorithms, based respectively on two different norm loss functions, are devised. Both algorithms have compact closed-form solutions in each iteration, so they are easily implemented. Extensive experiments show that these two algorithms outperform the state-of-the-art algorithms in terms of classification accuracy and running time.

  17. Weighted SGD for ℓp Regression with Randomized Preconditioning*

    Science.gov (United States)

    Yang, Jiyan; Chow, Yin-Lam; Ré, Christopher; Mahoney, Michael W.

    2018-01-01

    …prediction norm in 𝒪(log n·nnz(A)+poly(d) log(1/ε)/ε) time. We show that for unconstrained ℓ2 regression, this complexity is comparable to that of RLA and is asymptotically better than several state-of-the-art solvers in the regime where the desired accuracy ε, high dimension n and low dimension d satisfy d ≥ 1/ε and n ≥ d²/ε. We also provide lower bounds on the coreset complexity for more general regression problems, indicating that new ideas will still be needed to extend similar RLA preconditioning ideas to weighted SGD algorithms for more general regression problems. Finally, the effectiveness of such algorithms is illustrated numerically on both synthetic and real datasets, and the results are consistent with our theoretical findings and demonstrate that pwSGD converges to a medium-precision solution, e.g., ε = 10⁻³, more quickly. PMID:29782626

  18. A comparison of random forest regression and multiple linear regression for prediction in neuroscience.

    Science.gov (United States)

    Smith, Paul F; Ganesh, Siva; Liu, Ping

    2013-10-30

    Regression is a common statistical tool for prediction in neuroscience. However, linear regression is by far the most common form of regression used, with regression trees receiving comparatively little attention. In this study, the results of conventional multiple linear regression (MLR) were compared with those of random forest regression (RFR), in the prediction of the concentrations of 9 neurochemicals in the vestibular nucleus complex and cerebellum that are part of the l-arginine biochemical pathway (agmatine, putrescine, spermidine, spermine, l-arginine, l-ornithine, l-citrulline, glutamate and γ-aminobutyric acid (GABA)). The R² values for the MLRs were higher than the proportion of variance explained values for the RFRs: 6/9 of them were ≥ 0.70, compared to 4/9 for the RFRs. Even the variables that had the lowest R² values for the MLRs, e.g. ornithine (0.50) and glutamate (0.61), had much lower proportion of variance explained values for the RFRs (0.27 and 0.49, respectively). The RSE values for the MLRs were lower than those for the RFRs in all but two cases. For this data set, MLR appeared to be superior to RFR in terms of predictive and explanatory value and error. This result suggests that MLR may have advantages over RFR for prediction in neuroscience with this kind of data set, but that RFR can still have good predictive value in some cases. Copyright © 2013 Elsevier B.V. All rights reserved.
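
    A minimal version of the MLR-versus-RFR comparison can be run on synthetic data with a purely linear ground truth, a setting that should favour MLR. It uses scikit-learn and is an illustration, not the study's actual pipeline.

```python
# Compare multiple linear regression (MLR) with random forest regression (RFR)
# on synthetic data with a linear ground truth. Illustrative settings only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.5, size=400)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
rfr = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

mlr_r2 = mlr.score(X_te, y_te)   # held-out R^2 for MLR
rfr_r2 = rfr.score(X_te, y_te)   # held-out R^2 for RFR
```

    On a linear signal the MLR reaches higher test R² than the forest; with strongly nonlinear or interaction-heavy data the ordering can reverse, which is why the abstract's conclusion is tied to "this kind of data set".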

  19. Robust Mediation Analysis Based on Median Regression

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925

  20. Least square regularized regression in sum space.

    Science.gov (United States)

    Xu, Yong-Li; Chen, Di-Rong; Li, Han-Xiong; Liu, Lu

    2013-04-01

    This paper proposes a least square regularized regression algorithm in the sum space of reproducing kernel Hilbert spaces (RKHSs) for nonflat function approximation, and obtains the solution of the algorithm by solving a system of linear equations. This algorithm can approximate the low- and high-frequency components of the target function with large and small scale kernels, respectively. The convergence and learning rate are analyzed. We measure the complexity of the sum space by its covering number and demonstrate that the covering number can be bounded by the product of the covering numbers of the basic RKHSs. For a sum space of RKHSs with Gaussian kernels, by choosing appropriate parameters we trade off the sample error and regularization error, and obtain a polynomial learning rate, which is better than that in any single RKHS. The utility of this method is illustrated with two simulated data sets and five real-life databases.
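
    The large-kernel/small-kernel idea can be illustrated with regularized kernel regression on a sum of two Gaussian kernels, one wide (low frequency) and one narrow (high frequency). This is a toy sketch solved by a single linear system, not the paper's algorithm or learning-rate analysis.

```python
# Regularized least-squares regression with a sum of two Gaussian kernels:
# the wide kernel captures the low-frequency part, the narrow one the
# high-frequency part. Scales and data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 120)
y = np.sin(2 * np.pi * x) + 0.3 * np.sin(20 * np.pi * x)  # low + high frequency

def gauss_kernel(a, b, s):
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * s ** 2))

K = gauss_kernel(x, x, 0.3) + gauss_kernel(x, x, 0.02)    # sum-space kernel
lam = 1e-4
alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)      # regularized solve
y_fit = K @ alpha

train_rmse = np.sqrt(np.mean((y_fit - y) ** 2))
```

    With either scale alone the fit misses one frequency component; the sum kernel captures both, which is the nonflat-approximation point made above.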

  1. Electrodynamic Dust Shield Demonstrator

    Science.gov (United States)

    Stankie, Charles G.

    2013-01-01

    The objective of the project was to design and manufacture a device to demonstrate a new technology developed by NASA's Electrostatics and Surface Physics Laboratory. The technology itself is a system that uses electrostatic principles to remove regolith dust from its surface. This project was to create an enclosure that will be used to demonstrate the effectiveness of the invention to The Office of the Chief Technologist. One of the most important challenges of space exploration is actually caused by something very small and seemingly insignificant. Dust in space, most notably on the moon and Mars, has caused many unforeseen issues. Dirt and dust on Earth, while a nuisance, can be easily cleaned and kept at bay. However, there is considerably less weathering and erosion in space. As a result, the microscopic particles are extremely rough and abrasive. They are also electrostatically charged, so they cling to everything they make contact with. This was first noted to be a major problem during the Apollo missions. Dust would stick to the spacesuits, and could not be wiped off as predicted. Dust was brought back into the spacecraft, and was even inhaled by astronauts. This is a major health hazard. Atmospheric storms and other events can also cause dust to coat surfaces of spacecraft. This can cause abrasive damage to the craft. The coating can also reduce the effectiveness of thermal insulation and solar panels. A group of engineers at Kennedy Space Center's Electrostatics and Surface Physics Laboratory has developed a new technology, called the Electrodynamic Dust Shield, to help alleviate these problems. It is based on the electric curtain concept developed at NASA in 1967. "The EDS is an active dust mitigation technology that uses traveling electric fields to transport electrostatically charged dust particles along surfaces. To generate the traveling electric fields, the EDS consists of a multilayer dielectric coating with an embedded thin electrode grid

  2. A Monte Carlo simulation study comparing linear regression, beta regression, variable-dispersion beta regression and fractional logit regression at recovering average difference measures in a two sample design.

    Science.gov (United States)

    Meaney, Christopher; Moineddin, Rahim

    2014-01-24

    In biomedical research, response variables are often encountered which have bounded support on the open unit interval--(0,1). Traditionally, researchers have attempted to estimate covariate effects on these types of response data using linear regression. Alternative modelling strategies may include: beta regression, variable-dispersion beta regression, and fractional logit regression models. This study employs a Monte Carlo simulation design to compare the statistical properties of the linear regression model to that of the more novel beta regression, variable-dispersion beta regression, and fractional logit regression models. In the Monte Carlo experiment we assume a simple two sample design. We assume observations are realizations of independent draws from their respective probability models. The randomly simulated draws from the various probability models are chosen to emulate average proportion/percentage/rate differences of pre-specified magnitudes. Following simulation of the experimental data we estimate average proportion/percentage/rate differences. We compare the estimators in terms of bias, variance, type-1 error and power. Estimates of Monte Carlo error associated with these quantities are provided. If response data are beta distributed with constant dispersion parameters across the two samples, then all models are unbiased and have reasonable type-1 error rates and power profiles. If the response data in the two samples have different dispersion parameters, then the simple beta regression model is biased. When the sample size is small (N0 = N1 = 25) linear regression has superior type-1 error rates compared to the other models. Small sample type-1 error rates can be improved in beta regression models using bias correction/reduction methods. In the power experiments, variable-dispersion beta regression and fractional logit regression models have slightly elevated power compared to linear regression models. Similar results were observed if the
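
    A toy Monte Carlo in the same two-sample spirit: simulate beta-distributed responses with a common dispersion in both groups and check that the linear-regression estimator (which here reduces to the difference in group means) recovers the true average difference. Parameters are illustrative, not the paper's settings.

```python
# Two-sample Monte Carlo with beta-distributed responses on (0,1):
# estimate the average difference with a group-mean contrast (equivalent to
# OLS on a group dummy) and check its bias. Illustrative parameters.
import numpy as np

rng = np.random.default_rng(6)
n0 = n1 = 25
mu0, mu1, phi = 0.4, 0.6, 10.0   # group means and common dispersion
true_diff = mu1 - mu0

def draw(mu, n):
    # Beta distribution parameterized by mean mu and precision phi.
    return rng.beta(mu * phi, (1 - mu) * phi, size=n)

reps = 2000
est = np.empty(reps)
for r in range(reps):
    y0, y1 = draw(mu0, n0), draw(mu1, n1)
    est[r] = y1.mean() - y0.mean()   # OLS slope on a 0/1 group indicator

bias = est.mean() - true_diff
```

    With constant dispersion across groups the simple estimator is essentially unbiased, consistent with the abstract's finding that all models behave well in that case; the interesting differences appear once the dispersions differ.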

  3. Fiscal 1999 international energy conservation model project. Report on result of demonstrative research concerning cement clinker cooling system; 1999 nendo kokusai energy shohi koritsuka nado model jigyo seika hokokusho. Cement clinker reikyaku sochi ni kakawaru jissho kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    For the purpose of reducing energy consumption and CO2 discharge at a cement plant in Indonesia, R and D was conducted on a new clinker cooling system, a high performance kiln combustion system, and technology for steady kiln operation and control, with the fiscal 1999 results reported. In the research on the optimum clinker cooling system, a new type clinker cooling system (CCS) was developed in which air beams are applied only to stationary grate rows, in an air beam type clinker cooling system where cooling air is fed to each block, with grate plates used as the air duct. This year, in an actual machine testing facility (capacity 2,500 t/d), the whole heat recuperation area was modified for the CCS, with operation started in February, 1999, aiming at the optimal clinker cooling effect and high heat recovery efficiency. The heat consumption rate for the entire system decreased by 60 kcal/kg through the CCS modification, kiln burner adjustment, etc. As far as the demonstration plant is concerned, design of a new type burner and study/design for kiln stabilization were nearly completed. (NEDO)

  4. FY 1994 Report on the feasibility study results of the geothermal exploitation technologies for the international joint demonstration research; 1994 nendo chinetsu tansa gijutsu no kaigai kyodo jissho kenkyu kanosei chosa hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    Described herein are the FY 1994 results of the feasibility study of the geothermal exploitation technologies for the international joint demonstration research with Indonesia. Survey methods are considered for areas difficult to access by land transportation (e.g., tropical rain forests), to promote development of geothermal resources in remote areas (small- to medium-scale geothermal power generation plans). Satellite and airborne remote sensing are used for the wide-area survey. The data obtained by satellite are analyzed using the JERS-1 data, and the selected areas are then surveyed in detail by airborne remote sensing to find, e.g., abnormal ground temperature regions, faults, volcanoes, geothermally altered regions and landslide regions. These are surveyed in more detail by airborne electromagnetic and magnetic exploration methods. Although these methods have high resolution, their application tends to be hindered by the hot and humid climates of the prospective exploitation areas. The GEMS-aided resource analysis is used to establish geothermal models and to help extract the promising areas. These techniques are basically common, but it is necessary to take into consideration, e.g., the environments and regional characteristics of these areas when they are actually used. Diversification of fossil fuel supply sources is advantageous for Japan, and its energy security will be improved by supporting geothermal resource development promotion in the supply sources. (NEDO)

  5. Innovative Clean Coal Technology (ICCT): 180 MW demonstration of advanced tangentially-fired combustion techniques for the reduction of nitrogen oxide (NO{sub x}) emissions from coal-fired boilers. Topical report, LNCFS Levels 1 and 3 test results

    Energy Technology Data Exchange (ETDEWEB)

    1993-08-17

    This report presents results from the third phase of an Innovative Clean Coal Technology (ICCT) project demonstrating advanced tangentially-fired combustion techniques for the reduction of nitrogen oxide (NO{sub x}) emissions from a coal-fired boiler. The purpose of this project was to study the NO{sub x} emissions characteristics of ABB Combustion Engineering's (ABB CE) Low NO{sub x} Concentric Firing System (LNCFS) Levels I, II, and III. These technologies were installed and tested in a stepwise fashion at Gulf Power Company's Plant Lansing Smith Unit 2. The objective of this report is to provide the results from Phase III. During that phase, Levels I and III of the ABB C-E Services Low NO{sub x} Concentric Firing System were tested. The LNCFS Level III technology includes separated overfire air, close coupled overfire air, clustered coal nozzles, flame attachment coal nozzle tips, and concentric firing. The LNCFS Level I was simulated by closing the separated overfire air nozzles of the LNCFS Level III system. Based upon long-term data, LNCFS Level III reduced NO{sub x} emissions by 45 percent at full load. LOI levels with LNCFS Level III increased slightly; however, tests showed that LOI levels with LNCFS Level III were highly dependent upon coal fineness. After correcting for leakage air through the separated overfire air system, the simulated LNCFS Level I reduced NO{sub x} emissions by 37 percent. There was no increase in LOI with LNCFS Level I.

  6. Intelligent Quality Prediction Using Weighted Least Square Support Vector Regression

    Science.gov (United States)

    Yu, Yaojun

    A novel quality prediction method with a mobile time window is proposed for small-batch production processes, based on weighted least squares support vector regression (LS-SVR). The design steps and learning algorithm are also addressed. In the method, weighted LS-SVR is taken as the intelligent kernel, with which small-batch learning is handled well: nearer samples in the history data are given larger weights, while farther samples are given smaller weights. A typical machining process, cutting bearing outer races, is carried out and the real measured data are used for a comparison experiment. The experimental results demonstrate that the prediction error of the weighted LS-SVR based model is only 20%-30% of that of the standard LS-SVR based one under the same conditions. It provides a better candidate for quality prediction of small-batch production processes.
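
    A minimal numpy sketch of the weighted LS-SVR mechanism: the standard LS-SVR dual linear system, with per-sample weights so that newer history samples are penalized less for slack than older ones. The data, kernel width, and linear weighting scheme are illustrative assumptions, not the paper's settings.

```python
# Weighted LS-SVR sketch on a small "history" batch: newer samples get larger
# weights v_i, so their effective regularization 1/(gamma*v_i) is smaller.
import numpy as np

rng = np.random.default_rng(7)
n = 40
x = np.linspace(0, 4, n)
y = np.sin(x) + rng.normal(scale=0.05, size=n)

sigma, gamma = 0.5, 100.0
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * sigma ** 2))
v = np.linspace(0.2, 1.0, n)   # older samples -> smaller weight

# LS-SVR dual: [[0, 1^T], [1, K + diag(1/(gamma*v))]] [b; alpha] = [0; y]
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.diag(1.0 / (gamma * v))
sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
b, alpha = sol[0], sol[1:]

def predict(x_new):
    k = np.exp(-((x_new[:, None] - x[None, :]) ** 2) / (2 * sigma ** 2))
    return k @ alpha + b

resid = np.abs(predict(x) - y)
```

    The single (n+1)-dimensional linear solve is what makes LS-SVR cheap enough to refit every time the mobile window slides.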

  7. No rationale for 1 variable per 10 events criterion for binary logistic regression analysis.

    Science.gov (United States)

    van Smeden, Maarten; de Groot, Joris A H; Moons, Karel G M; Collins, Gary S; Altman, Douglas G; Eijkemans, Marinus J C; Reitsma, Johannes B

    2016-11-24

    Ten events per variable (EPV) is a widely advocated minimal criterion for sample size considerations in logistic regression analysis. Of three previous simulation studies that examined this minimal EPV criterion, only one supports the use of a minimum of 10 EPV. In this paper, we examine the reasons for the substantial differences between these extensive simulation studies. The current study uses Monte Carlo simulations to evaluate small sample bias, coverage of confidence intervals and mean square error of logit coefficients. Logistic regression models fitted by maximum likelihood and by a modified estimation procedure, known as Firth's correction, are compared. The results show that besides EPV, the problems associated with low EPV depend on other factors such as the total sample size. It is also demonstrated that simulation results can be dominated by even a few simulated data sets for which the prediction of the outcome by the covariates is perfect ('separation'). We reveal that different approaches to identifying and handling separation lead to substantially different simulation results. We further show that Firth's correction can be used to improve the accuracy of regression coefficients and alleviate the problems associated with separation. The current evidence supporting EPV rules for binary logistic regression is weak. Given our findings, there is an urgent need for new research to provide guidance on sample size considerations for binary logistic regression analysis.
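
    The small-sample bias discussed above is easy to reproduce: at low EPV, maximum-likelihood logit coefficients tend to be too large in absolute value. The settings below (sample size, effect sizes, a plain Newton solver without Firth's correction) are illustrative assumptions, not the paper's simulation design.

```python
# Monte Carlo illustration of low-EPV bias in maximum-likelihood logistic
# regression: estimated coefficients are biased away from zero.
import numpy as np

rng = np.random.default_rng(8)

def fit_logit(X, y, iters=30, ridge=1e-4):
    """Newton-Raphson logistic MLE with a tiny ridge and clipped predictor."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = np.clip(X @ beta, -30, 30)
        p = 1.0 / (1.0 + np.exp(-eta))
        H = X.T @ (X * (p * (1 - p))[:, None]) + ridge * np.eye(X.shape[1])
        beta = beta + np.linalg.solve(H, X.T @ (y - p))
    return beta

n, p_cov, true_b = 60, 4, 1.0   # small n, several covariates -> low EPV
reps = 300
est = np.empty(reps)
for r in range(reps):
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p_cov))])
    beta = np.concatenate([[-1.5], np.full(p_cov, true_b)])
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X @ beta)))).astype(float)
    est[r] = fit_logit(X, y)[1]   # first covariate's coefficient

median_est = np.median(est)       # tends to exceed true_b at low EPV
```

    Firth's correction (a penalized likelihood, not shown here) is the remedy the abstract recommends for both this bias and the separation problem.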

  8. Manufacturing Demonstration Facility (MDF)

    Data.gov (United States)

    Federal Laboratory Consortium — The U.S. Department of Energy Manufacturing Demonstration Facility (MDF) at Oak Ridge National Laboratory (ORNL) provides a collaborative, shared infrastructure to...

  9. Testing the equality of nonparametric regression curves based on ...

    African Journals Online (AJOL)

    Abstract. In this work we propose a new methodology for the comparison of two regression functions f1 and f2 in the case of homoscedastic error structure and a fixed design. Our approach is based on the empirical Fourier coefficients of the regression functions f1 and f2 respectively. As our main results we obtain the ...

  10. A Methodology for Generating Placement Rules that Utilizes Logistic Regression

    Science.gov (United States)

    Wurtz, Keith

    2008-01-01

    The purpose of this article is to provide the necessary tools for institutional researchers to conduct a logistic regression analysis and interpret the results. Aspects of the logistic regression procedure that are necessary to evaluate models are presented and discussed with an emphasis on cutoff values and choosing the appropriate number of…

  11. Bayesian regression of piecewise homogeneous Poisson processes

    Directory of Open Access Journals (Sweden)

    Diego Sevilla

    2015-12-01

    Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle counting process data distributed as Poisson. A numerical code in Mathematica is developed and tested analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015, Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia; DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015).
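    As a simpler, non-Bayesian stand-in for the breakpoint detection described above, a maximum-likelihood scan over candidate split points of a piecewise-constant Poisson rate can be sketched as:

```python
import numpy as np

def poisson_loglik(counts):
    """Profile log-likelihood of a constant-rate Poisson segment
    (terms not depending on the rate are dropped)."""
    lam = counts.mean()
    if lam == 0:
        return 0.0
    return counts.sum() * np.log(lam) - len(counts) * lam

def best_breakpoint(counts, min_seg=5):
    """Scan all split points and return the one maximizing the
    two-segment Poisson log-likelihood."""
    n = len(counts)
    scores = {k: poisson_loglik(counts[:k]) + poisson_loglik(counts[k:])
              for k in range(min_seg, n - min_seg)}
    return max(scores, key=scores.get)

rng = np.random.default_rng(42)
counts = np.concatenate([rng.poisson(2.0, 100),    # rate 2 before the break
                         rng.poisson(8.0, 100)])   # rate 8 after it
k_hat = best_breakpoint(counts)
```

    A Bayesian treatment as in the paper would put a prior over the breakpoint and segment rates instead of maximizing, but the segmentation idea is the same.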

  12. SPE dose prediction using locally weighted regression

    International Nuclear Information System (INIS)

    Hines, J. W.; Townsend, L. W.; Nichols, T. F.

    2005-01-01

    When astronauts are outside earth's protective magnetosphere, they are subject to large radiation doses resulting from solar particle events (SPEs). The total dose received from a major SPE in deep space could cause severe radiation poisoning. The dose is usually received over a 20-40 h time interval but the event's effects may be mitigated with an early warning system. This paper presents a method to predict the total dose early in the event. It uses a locally weighted regression model, which is easier to train and provides predictions as accurate as neural network models previously used. (authors)
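    A minimal sketch of locally weighted (kernel-weighted linear) regression of the kind referred to above; the Gaussian kernel, bandwidth and toy dose-like curve are illustrative assumptions:

```python
import numpy as np

def lwr_predict(x0, x, y, tau=0.3):
    """Locally weighted linear regression at a single query point x0,
    using Gaussian kernel weights of bandwidth tau."""
    w = np.exp(-(x - x0) ** 2 / (2.0 * tau ** 2))
    X = np.column_stack([np.ones_like(x), x])
    A = X.T @ (w[:, None] * X)       # weighted normal equations
    b = X.T @ (w * y)
    beta = np.linalg.solve(A, b)
    return beta[0] + beta[1] * x0

# Toy cumulative-dose-like curve (sigmoid in time); illustrative only
x = np.linspace(0.0, 10.0, 200)
y = 1.0 / (1.0 + np.exp(-(x - 5.0)))

pred = lwr_predict(4.0, x, y)
```

    Because the model is refit locally around each query point, there is no global training phase, which is the "easier to train" property the abstract contrasts with neural network models.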

  13. SPE dose prediction using locally weighted regression

    International Nuclear Information System (INIS)

    Hines, J. W.; Townsend, L. W.; Nichols, T. F.

    2005-01-01

    When astronauts are outside Earth's protective magnetosphere, they are subject to large radiation doses resulting from solar particle events. The total dose received from a major solar particle event in deep space could cause severe radiation poisoning. The dose is usually received over a 20-40 h time interval but the event's effects may be reduced with an early warning system. This paper presents a method to predict the total dose early in the event. It uses a locally weighted regression model, which is easier to train, and provides predictions as accurate as the neural network models that were used previously. (authors)

  14. Principal component regression analysis with SPSS.

    Science.gov (United States)

    Liu, R X; Kuang, J; Gong, Q; Hou, X L

    2003-06-01

    The paper introduces the indices used to diagnose multicollinearity, the basic principle of principal component regression, and the method for determining the 'best' equation. The paper uses an example to describe how to perform principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity. A simplified, faster and accurate statistical analysis is achieved through principal component regression with SPSS.
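    The same principal component regression steps can be sketched outside SPSS, e.g. in numpy (standardize, project onto the leading components, regress on the scores); this is a generic PCR sketch, not a transcription of the SPSS procedure:

```python
import numpy as np

def pcr_predict(X, y, n_components):
    """Principal component regression: standardize X, project onto the
    leading principal components, regress y on the component scores."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    T = Z @ Vt[:n_components].T                      # component scores
    D = np.column_stack([np.ones(len(y)), T])
    gamma, *_ = np.linalg.lstsq(D, y, rcond=None)
    return D @ gamma

rng = np.random.default_rng(1)
n = 200
x1 = rng.standard_normal(n)
x2 = x1 + 0.01 * rng.standard_normal(n)              # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 3.0 + 2.0 * x1 + 2.0 * x2

pred = pcr_predict(X, y, n_components=1)             # drop the unstable direction
```

    With `n_components` equal to the number of columns, PCR reproduces ordinary least squares; dropping the small-variance components is what suppresses the multicollinearity disturbance.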

  15. Gradient descent for robust kernel-based regression

    Science.gov (United States)

    Guo, Zheng-Chu; Hu, Ting; Shi, Lei

    2018-06-01

    In this paper, we study the gradient descent algorithm generated by a robust loss function over a reproducing kernel Hilbert space (RKHS). The loss function is defined by a windowing function G and a scale parameter σ, and can include a wide range of commonly used robust losses for regression. There is still a gap between the theoretical analysis and the optimization process of empirical risk minimization based on such losses: the estimator needs to be globally optimal in the theoretical analysis, while the optimization method cannot ensure the global optimality of its solutions. In this paper, we aim to fill this gap by developing a novel theoretical analysis of the performance of estimators generated by the gradient descent algorithm. We demonstrate that with an appropriately chosen scale parameter σ, the gradient update with early stopping rules can approximate the regression function. Our error analysis leads to convergence in the standard L2 norm and the strong RKHS norm, both of which are optimal in the minimax sense. We show that the scale parameter σ plays an important role in providing robustness as well as fast convergence. Numerical experiments on synthetic examples and a real data set also support our theoretical results.
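    A rough sketch of gradient descent on a windowed robust empirical risk over an RKHS; the Welsch-type window, kernel width, step size and iteration count are illustrative assumptions, not the paper's tuned choices:

```python
import numpy as np

def gaussian_kernel(a, b, width=0.5):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * width ** 2))

def robust_kernel_gd(x, y, sigma=1.0, eta=0.05, n_iter=300, width=0.5):
    """Gradient descent on the empirical Welsch-type risk
    (1/n) sum rho(f(x_i) - y_i), rho(r) = (sigma^2/2)(1 - exp(-r^2/sigma^2)),
    over functions f = K @ alpha in the RKHS of a Gaussian kernel."""
    K = gaussian_kernel(x, x, width)
    n = len(x)
    alpha = np.zeros(n)
    losses = []
    for _ in range(n_iter):
        r = K @ alpha - y
        losses.append(float(np.mean(0.5 * sigma ** 2
                                    * (1.0 - np.exp(-r ** 2 / sigma ** 2)))))
        # rho'(r) = r * exp(-r^2/sigma^2) is redescending, so gross outliers
        # contribute almost nothing to the gradient
        grad = K.T @ (r * np.exp(-r ** 2 / sigma ** 2)) / n
        alpha -= eta * grad
    return alpha, losses

rng = np.random.default_rng(7)
x = np.linspace(-2.0, 2.0, 60)
y = np.sin(x) + 0.1 * rng.standard_normal(60)
y[10] += 5.0                                   # one gross outlier

alpha, losses = robust_kernel_gd(x, y)
fit = gaussian_kernel(x, x) @ alpha
```

    Stopping after a fixed number of iterations plays the role of the early stopping rule in the paper's analysis.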

  16. Evaluation of Linear Regression Simultaneous Myoelectric Control Using Intramuscular EMG.

    Science.gov (United States)

    Smith, Lauren H; Kuiken, Todd A; Hargrove, Levi J

    2016-04-01

    The objective of this study was to evaluate the ability of linear regression models to decode patterns of muscle coactivation from intramuscular electromyogram (EMG) and provide simultaneous myoelectric control of a virtual 3-DOF wrist/hand system. Performance was compared to the simultaneous control of conventional myoelectric prosthesis methods using intramuscular EMG (parallel dual-site control)-an approach that requires users to independently modulate individual muscles in the residual limb, which can be challenging for amputees. Linear regression control was evaluated in eight able-bodied subjects during a virtual Fitts' law task and was compared to performance of eight subjects using parallel dual-site control. An offline analysis also evaluated how different types of training data affected prediction accuracy of linear regression control. The two control systems demonstrated similar overall performance; however, the linear regression method demonstrated improved performance for targets requiring use of all three DOFs, whereas parallel dual-site control demonstrated improved performance for targets that required use of only one DOF. Subjects using linear regression control could more easily activate multiple DOFs simultaneously, but often experienced unintended movements when trying to isolate individual DOFs. Offline analyses also suggested that the method used to train linear regression systems may influence controllability. Linear regression myoelectric control using intramuscular EMG provided an alternative to parallel dual-site control for 3-DOF simultaneous control at the wrist and hand. The two methods demonstrated different strengths in controllability, highlighting the tradeoff between providing simultaneous control and the ability to isolate individual DOFs when desired.

  17. Demonstration exercise 'Cavtat 09'

    International Nuclear Information System (INIS)

    Trut, D.

    2009-01-01

    The demonstration exercise shows a terrorist attack in an urban area resulting in a number of injured people. On 7 April 2009 a terrorist group, HAL 9000, arrives in Cavtat and sets up explosive devices with chemical reagents at several spots, intending to activate them and cause a great number of victims. On the same day, in the area of the Cavtat Croatia Hotel, which is hosting the world CBMTS Congress, the Cavtat Police Station notices several masked persons escaping. Hotel personnel alert the County 112 Center about devices noticed next to the chlorine dioxide tanks used for water conditioning. Intervention police arrive to block the entrance to this area and evacuate the hotel's guests and congress members. An explosion and fire occur at the position of the water-conditioning plant and the chlorine dioxide tank. The 112 Center alerts fire-fighters for the fire-fighting and decontamination action, together with the HAZMAT Civil Support Team from Georgia (participating in the congress). In the meantime, guests are instructed not to leave their rooms and to seal doors and windows with available material to keep out potentially toxic fumes. Decision makers from the County Protection and Rescue Headquarters monitor the situation until the end of the alert for the population in the area of Cavtat. (author)

  18. Detection of epistatic effects with logic regression and a classical linear regression model.

    Science.gov (United States)

    Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata

    2014-02-01

    To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes that cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of the genotypes of several QTLs, the Cockerham approach is often not capable of identifying them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though a larger number of models has to be considered with the logic regression approach (requiring more stringent multiple testing correction), the efficient representation of higher-order logic interactions in logic regression models leads to a significant increase in power to detect such interactions compared to the Cockerham approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and real data analysis.
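    The power advantage of an explicit Boolean feature can be illustrated with a toy simulation; for 0/1 genotype indicators the logical AND coincides with the product, and the purely additive model omits it. This is an illustrative sketch, not the logic regression machinery itself:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
x1 = rng.integers(0, 2, n).astype(float)    # genotype indicator at locus 1
x2 = rng.integers(0, 2, n).astype(float)    # genotype indicator at locus 2
# Phenotype driven by the Boolean combination (x1 AND x2) plus noise
y = 2.0 * (x1 * x2) + 0.3 * rng.standard_normal(n)

def ols_r2(F, y):
    """R-squared of an OLS fit of y on an intercept plus the columns of F."""
    D = np.column_stack([np.ones(len(y)), F])
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)
    resid = y - D @ beta
    return 1.0 - resid.var() / y.var()

r2_additive = ols_r2(np.column_stack([x1, x2]), y)   # additive terms only
r2_logic = ols_r2((x1 * x2)[:, None], y)             # the Boolean AND feature
```

    The additive model can capture at most part of the AND signal, while the single logic feature explains essentially all of it.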

  19. A flexible fuzzy regression algorithm for forecasting oil consumption estimation

    International Nuclear Information System (INIS)

    Azadeh, A.; Khakestani, M.; Saberi, M.

    2009-01-01

    Oil consumption plays a vital role in the socio-economic development of most countries. This study presents a flexible fuzzy regression algorithm for forecasting oil consumption based on standard economic indicators. The standard indicators are annual population, cost of crude oil import, gross domestic production (GDP) and annual oil production in the last period. The proposed algorithm uses analysis of variance (ANOVA) to select either fuzzy regression or conventional regression for future demand estimation. The significance of the proposed algorithm is threefold. First, it is flexible and identifies the best model based on the results of ANOVA and the minimum mean absolute percentage error (MAPE), whereas previous studies consider the best-fitted fuzzy regression model based on MAPE or other relative error results. Second, the proposed model may identify conventional regression as the best model for future oil consumption forecasting because of its dynamic structure, whereas previous studies assume that fuzzy regression always provides the best solutions and estimation. Third, it utilizes the most standard independent variables for the regression models. To show the applicability and superiority of the proposed flexible fuzzy regression algorithm, data for oil consumption in Canada, United States, Japan and Australia from 1990 to 2005 are used. The results show that the flexible algorithm provides an accurate solution for the oil consumption estimation problem. The algorithm may be used by policy makers to accurately foresee the behavior of oil consumption in various regions.
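    The MAPE-based selection step (choosing between candidate models by minimum relative error) can be sketched as below; the series and the two candidate fits are hypothetical numbers for illustration, not the study's data:

```python
import numpy as np

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Hypothetical annual consumption series and two candidate model estimates
actual = np.array([102.0, 105.0, 111.0, 118.0, 126.0])
model_a = np.array([101.0, 106.0, 110.0, 119.0, 125.0])  # e.g. conventional regression
model_b = np.array([98.0, 109.0, 104.0, 124.0, 133.0])   # e.g. a fuzzy regression fit

# Pick whichever candidate has the smaller MAPE
best = 'A' if mape(actual, model_a) < mape(actual, model_b) else 'B'
```

    In the proposed algorithm this comparison is combined with ANOVA, so either the fuzzy or the conventional model can win.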

  20. Regression away from the mean: Theory and examples.

    Science.gov (United States)

    Schwarz, Wolf; Reike, Dennis

    2018-02-01

    Using a standard repeated measures model with arbitrary true score distribution and normal error variables, we present some fundamental closed-form results which explicitly indicate the conditions under which regression effects towards (RTM) and away from the mean are expected. Specifically, we show that for skewed and bimodal distributions many or even most cases will show a regression effect that is in expectation away from the mean, or that is not just towards but actually beyond the mean. We illustrate our results in quantitative detail with typical examples from experimental and biometric applications, which exhibit a clear regression away from the mean ('egression from the mean') signature. We aim not to repeal cautionary advice against potential RTM effects, but to present a balanced view of regression effects, based on a clear identification of the conditions governing the form that regression effects take in repeated measures designs. © 2017 The British Psychological Society.
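    The 'egression from the mean' signature for a bimodal true-score distribution can be reproduced in a small simulation: with modes at ±2 and unit normal error, E[T | X = x] = 2 tanh(2x), so a first score of about 0.5 predicts a second score near 1.5, i.e. further from the grand mean of 0. A sketch under these assumed parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

T = rng.choice([-2.0, 2.0], size=n)        # bimodal true scores, grand mean 0
X1 = T + rng.standard_normal(n)            # first measurement
X2 = T + rng.standard_normal(n)            # independent repeated measurement

# Condition on a first score lying between the grand mean and the upper mode
sel = (X1 > 0.4) & (X1 < 0.6)
expected_second = X2[sel].mean()           # close to 2*tanh(2x), i.e. about 1.5
```

    With a unimodal (e.g. normal) true-score distribution the same conditioning would produce the familiar regression towards the mean instead.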

  1. The prediction of intelligence in preschool children using alternative models to regression.

    Science.gov (United States)

    Finch, W Holmes; Chang, Mei; Davis, Andrew S; Holden, Jocelyn E; Rothlisberg, Barbara A; McIntosh, David E

    2011-12-01

    Statistical prediction of an outcome variable using multiple independent variables is a common practice in the social and behavioral sciences. For example, neuropsychologists are sometimes called upon to provide predictions of preinjury cognitive functioning for individuals who have suffered a traumatic brain injury. Typically, these predictions are made using standard multiple linear regression models with several demographic variables (e.g., gender, ethnicity, education level) as predictors. Prior research has shown conflicting evidence regarding the ability of such models to provide accurate predictions of outcome variables such as full-scale intelligence (FSIQ) test scores. The present study had two goals: (1) to demonstrate the utility of a set of alternative prediction methods that have been applied extensively in the natural sciences and business but have not been frequently explored in the social sciences and (2) to develop models that can be used to predict premorbid cognitive functioning in preschool children. Predictions of Stanford-Binet 5 FSIQ scores for preschool-aged children are used to compare the performance of a multiple regression model with several of these alternative methods. Results demonstrate that classification and regression trees provided more accurate predictions of FSIQ scores than the more traditional regression approach. Implications of these results are discussed.
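    The regression-tree advantage on threshold-like outcomes can be illustrated with a one-split regression stump versus an ordinary least-squares line; this toy numpy sketch stands in for the CART models of the study:

```python
import numpy as np

def fit_stump(x, y):
    """One-split regression tree: choose the threshold minimizing total SSE."""
    best = (np.inf, None, None, None)
    for t in np.unique(x)[1:]:
        left, right = y[x < t], y[x >= t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    return best[1:]                        # threshold, left mean, right mean

def stump_predict(x, t, lo, hi):
    return np.where(x < t, lo, hi)

rng = np.random.default_rng(5)
x = rng.uniform(0.0, 1.0, 300)             # a single standardized predictor
y = np.where(x < 0.5, 90.0, 110.0) + rng.standard_normal(300)  # step-like scores

t, lo, hi = fit_stump(x, y)
mse_tree = np.mean((y - stump_predict(x, t, lo, hi)) ** 2)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
mse_linear = np.mean((y - X @ beta) ** 2)
```

    A full CART model recursively repeats this split search; even the single split beats the straight line whenever the outcome changes abruptly rather than linearly.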

  2. Unbalanced Regressions and the Predictive Equation

    DEFF Research Database (Denmark)

    Osterrieder, Daniela; Ventosa-Santaulària, Daniel; Vera-Valdés, J. Eduardo

    Predictive return regressions with persistent regressors are typically plagued by (asymptotically) biased/inconsistent estimates of the slope, non-standard or potentially even spurious statistical inference, and regression unbalancedness. We alleviate the problem of unbalancedness in the theoreti…

  3. Semiparametric regression during 2003–2007

    KAUST Repository

    Ruppert, David; Wand, M.P.; Carroll, Raymond J.

    2009-01-01

    Semiparametric regression is a fusion between parametric regression and nonparametric regression that integrates low-rank penalized splines, mixed model and hierarchical Bayesian methodology – thus allowing more streamlined handling of longitudinal and spatial correlation. We review progress in the field over the five-year period between 2003 and 2007. We find semiparametric regression to be a vibrant field with substantial involvement and activity, continual enhancement and widespread application.

  4. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime
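    The core computation of Gaussian process regression, the posterior mean under an RBF kernel prior, can be sketched in a few lines; the kernel choice, length-scale and jitter are illustrative assumptions, not tied to the book's examples:

```python
import numpy as np

def rbf(a, b, ell=1.0):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2.0 * ell ** 2))

def gp_posterior_mean(x_train, y_train, x_test, ell=1.0, noise=1e-4):
    """Posterior mean of a zero-mean GP with an RBF covariance,
    conditioned on (x_train, y_train)."""
    K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, ell)
    return Ks @ np.linalg.solve(K, y_train)

x_train = np.linspace(0.0, 2.0 * np.pi, 15)
y_train = np.sin(x_train)
x_test = np.array([np.pi / 2.0])           # a point between training inputs

mean = gp_posterior_mean(x_train, y_train, x_test)
```

    Functional-response models as in the book stack many such curves and add mixed covariates, but each fitted curve reduces to this conditioning step.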

  5. Toward Customer-Centric Organizational Science: A Common Language Effect Size Indicator for Multiple Linear Regressions and Regressions With Higher-Order Terms.

    Science.gov (United States)

    Krasikova, Dina V; Le, Huy; Bachura, Eric

    2018-01-22

    To address a long-standing concern regarding a gap between organizational science and practice, scholars called for more intuitive and meaningful ways of communicating research results to users of academic research. In this article, we develop a common language effect size index (CLβ) that can help translate research results to practice. We demonstrate how CLβ can be computed and used to interpret the effects of continuous and categorical predictors in multiple linear regression models. We also elaborate on how the proposed CLβ index is computed and used to interpret interactions and nonlinear effects in regression models. In addition, we test the robustness of the proposed index to violations of normality and provide means for computing standard errors and constructing confidence intervals around its estimates. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  6. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…

  7. A Seemingly Unrelated Poisson Regression Model

    OpenAIRE

    King, Gary

    1989-01-01

    This article introduces a new estimator for the analysis of two contemporaneously correlated endogenous event count variables. This seemingly unrelated Poisson regression model (SUPREME) estimator combines the efficiencies created by single equation Poisson regression model estimators and insights from "seemingly unrelated" linear regression models.

  8. Estimating Loess Plateau Average Annual Precipitation with Multiple Linear Regression Kriging and Geographically Weighted Regression Kriging

    Directory of Open Access Journals (Sweden)

    Qiutong Jin

    2016-06-01

    Full Text Available Estimating the spatial distribution of precipitation is an important and challenging task in hydrology, climatology, ecology, and environmental science. In order to generate a highly accurate distribution map of average annual precipitation for the Loess Plateau in China, multiple linear regression Kriging (MLRK and geographically weighted regression Kriging (GWRK methods were employed using precipitation data from the period 1980–2010 from 435 meteorological stations. The predictors in regression Kriging were selected by stepwise regression analysis from many auxiliary environmental factors, such as elevation (DEM, normalized difference vegetation index (NDVI, solar radiation, slope, and aspect. All predictor distribution maps had a 500 m spatial resolution. Validation precipitation data from 130 hydrometeorological stations were used to assess the prediction accuracies of the MLRK and GWRK approaches. Results showed that both prediction maps with a 500 m spatial resolution interpolated by MLRK and GWRK had a high accuracy and captured detailed spatial distribution data; however, MLRK produced a lower prediction error and a higher variance explanation than GWRK, although the differences were small, in contrast to conclusions from similar studies.
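    The two-part structure of regression kriging (a regression trend on auxiliary predictors plus kriging of its residuals) can be sketched in one dimension; the exponential covariance, its range and the toy "station" data are illustrative assumptions, not the MLRK/GWRK setup of the study:

```python
import numpy as np

def regression_kriging(x_obs, aux, y, x0, aux0, ell=1.0, nugget=1e-8):
    """Regression kriging in 1-D: an OLS trend on auxiliary predictors,
    plus simple kriging of the OLS residuals (exponential covariance)."""
    F = np.column_stack([np.ones(len(y)), aux])
    beta, *_ = np.linalg.lstsq(F, y, rcond=None)
    resid = y - F @ beta

    C = np.exp(-np.abs(x_obs[:, None] - x_obs[None, :]) / ell)
    C += nugget * np.eye(len(y))               # jitter for numerical stability
    c0 = np.exp(-np.abs(x_obs - x0) / ell)
    resid_hat = c0 @ np.linalg.solve(C, resid) # kriged residual at x0

    trend = np.concatenate([[1.0], np.atleast_1d(aux0)]) @ beta
    return trend + resid_hat

rng = np.random.default_rng(2)
x_obs = np.sort(rng.uniform(0.0, 10.0, 40))    # "station" locations
elev = np.sin(x_obs)                           # auxiliary predictor (e.g. elevation)
y = 5.0 + 2.0 * elev + np.cos(0.7 * x_obs)     # trend + spatially structured part

pred = regression_kriging(x_obs, elev, y, x_obs[10], elev[10])
pred_new = regression_kriging(x_obs, elev, y, 5.05, np.sin(5.05))
```

    With a negligible nugget the predictor interpolates the observations exactly, which is the property that makes the final map honor the station data.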

  9. Innovative technology demonstration

    International Nuclear Information System (INIS)

    Anderson, D.B.; Luttrell, S.P.; Hartley, J.N.; Hinchee, R.

    1992-04-01

    The Innovative Technology Demonstration (ITD) program at Tinker Air Force Base (TAFB), Oklahoma City, Oklahoma, will demonstrate the overall utility and effectiveness of innovative technologies for site characterization, monitoring, and remediation of selected contaminated test sites. The current demonstration test sites include a CERCLA site on the NPL list, located under a building (Building 3001) that houses a large active industrial complex used for rebuilding military aircraft, and a site beneath and surrounding an abandoned underground tank vault used for storage of jet fuels and solvents. The site under Building 3001 (the NW Test Site) is contaminated with TCE and Cr⁶⁺; the site with the fuel storage vault (the SW Tanks Site) is contaminated with fuels, BTEX and TCE. These sites and others have been identified for cleanup under the Air Force's Installation Restoration Program (IRP). This document describes the demonstrations that have been conducted or are planned for TAFB.

  10. Laser Communications Relay Demonstration

    Data.gov (United States)

    National Aeronautics and Space Administration — LCRD is a minimum two year flight demonstration in geosynchronous Earth orbit to advance optical communications technology toward infusion into Deep Space and Near...

  11. Spacecraft Fire Safety Demonstration

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the Spacecraft Fire Safety Demonstration project is to develop and conduct large-scale fire safety experiments on an International Space Station...

  12. [Application of negative binomial regression and modified Poisson regression in the research of risk factors for injury frequency].

    Science.gov (United States)

    Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan

    2011-11-01

    To explore the application of negative binomial regression and modified Poisson regression analysis in analyzing the influential factors for injury frequency and the risk factors leading to an increase in injury frequency. 2917 primary and secondary school students were selected from Hefei by the cluster random sampling method and surveyed by questionnaire. The count data on injury events were used to fit modified Poisson regression and negative binomial regression models. The risk factors leading to an increase in the frequency of unintentional injury among juvenile students were explored, so as to probe the efficiency of these two models in studying the influential factors for injury frequency. The Poisson model exhibited over-dispersion; the modified Poisson regression and the negative binomial regression model both fitted the data better. Both showed that male gender, younger age, a father working outside of the hometown, a guardian with education above junior high school level, and smoking might result in higher injury frequencies. For clustered frequency data on injury events, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better and could give a more accurate interpretation of the relevant factors affecting the frequency of injury.
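    The modified Poisson idea, a Poisson fit whose standard errors are replaced by robust sandwich estimates to cope with over-dispersion, can be sketched in numpy; the simulated over-dispersed counts and the single covariate are illustrative assumptions:

```python
import numpy as np

def poisson_fit_robust(X, y, n_iter=25):
    """Poisson regression by Newton-Raphson, returning the coefficients
    together with model-based and robust (sandwich) standard errors."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        info = X.T @ (mu[:, None] * X)
        beta += np.linalg.solve(info, X.T @ (y - mu))
    mu = np.exp(X @ beta)
    bread = np.linalg.inv(X.T @ (mu[:, None] * X))
    meat = X.T @ (((y - mu) ** 2)[:, None] * X)
    se_model = np.sqrt(np.diag(bread))                  # assumes Var = mean
    se_robust = np.sqrt(np.diag(bread @ meat @ bread))  # sandwich estimator
    return beta, se_model, se_robust

rng = np.random.default_rng(11)
n = 2000
x = rng.uniform(0.0, 1.0, n)
X = np.column_stack([np.ones(n), x])
mu = np.exp(0.5 + 0.8 * x)
k = 1.0                                    # strong over-dispersion
y = rng.negative_binomial(k, k / (k + mu)).astype(float)

beta, se_model, se_robust = poisson_fit_robust(X, y)
```

    Because the counts are negative-binomial, the naive Poisson standard errors are too small, while the sandwich errors widen accordingly; a negative binomial fit would model the extra variance explicitly instead.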

  13. Education Payload Operation - Demonstrations

    Science.gov (United States)

    Keil, Matthew

    2009-01-01

    Education Payload Operation - Demonstrations (EPO-Demos) are recorded video education demonstrations performed on the International Space Station (ISS) by crewmembers using hardware already onboard the ISS. EPO-Demos are videotaped, edited, and used to enhance existing NASA education resources and programs for educators and students in grades K-12. EPO-Demos are designed to support the NASA mission to inspire the next generation of explorers.

  14. Moderation analysis using a two-level regression model.

    Science.gov (United States)

    Yuan, Ke-Hai; Cheng, Ying; Maxwell, Scott

    2014-10-01

    Moderation analysis is widely used in social and behavioral research. The most commonly used model for moderation analysis is moderated multiple regression (MMR) in which the explanatory variables of the regression model include product terms, and the model is typically estimated by least squares (LS). This paper argues for a two-level regression model in which the regression coefficients of a criterion variable on predictors are further regressed on moderator variables. An algorithm for estimating the parameters of the two-level model by normal-distribution-based maximum likelihood (NML) is developed. Formulas for the standard errors (SEs) of the parameter estimates are provided and studied. Results indicate that, when heteroscedasticity exists, NML with the two-level model gives more efficient and more accurate parameter estimates than the LS analysis of the MMR model. When error variances are homoscedastic, NML with the two-level model leads to essentially the same results as LS with the MMR model. Most importantly, the two-level regression model permits estimating the percentage of variance of each regression coefficient that is due to moderator variables. When applied to data from General Social Surveys 1991, NML with the two-level model identified a significant moderation effect of race on the regression of job prestige on years of education while LS with the MMR model did not. An R package is also developed and documented to facilitate the application of the two-level model.
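    The MMR baseline that the two-level model is compared against can be sketched directly: the moderation effect is the coefficient of the product term. The variables and effect sizes below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 500
x = rng.standard_normal(n)                 # predictor (e.g. years of education)
m = rng.standard_normal(n)                 # moderator variable
# The 0.4 coefficient on x*m is the moderation effect to be recovered
y = 1.0 + 0.5 * x + 0.3 * m + 0.4 * x * m + 0.2 * rng.standard_normal(n)

# Moderated multiple regression: include the product term among the predictors
F = np.column_stack([np.ones(n), x, m, x * m])
beta, *_ = np.linalg.lstsq(F, y, rcond=None)
```

    The paper's two-level model instead regresses the coefficients of y on x directly on the moderators, which additionally yields the share of coefficient variance due to moderation.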

  15. Buried Waste Integrated Demonstration

    International Nuclear Information System (INIS)

    1994-03-01

    The Buried Waste Integrated Demonstration (BWID) supports the applied research, development, demonstration, and evaluation of a suite of advanced technologies that offer promising solutions to the problems associated with the remediation of buried waste. BWID addresses the difficult remediation problems associated with DOE complex-wide buried waste, particularly transuranic (TRU) contaminated buried waste. BWID has implemented a systems approach to the development and demonstration of technologies that will characterize, retrieve, treat, and dispose of DOE buried wastes. This approach encompasses the entire remediation process from characterization to post-monitoring. The development and demonstration of the technology is predicated on how a technology fits into the total remediation process. To address all of these technological issues, BWID has enlisted the scientific expertise of individuals and groups from within the DOE Complex, as well as experts from universities and private industry. The BWID mission is to support development and demonstration of a suite of technologies that, when integrated with commercially available technologies, forms a comprehensive remediation system for the effective and efficient remediation of buried waste throughout the DOE Complex. BWID will evaluate and validate demonstrated technologies and transfer this information and equipment to private industry to support the Office of Environmental Restoration (ER), Office of Waste Management (WM), and Office of Facility Transition (FT) remediation planning and implementation activities

  16. Regression with Sparse Approximations of Data

    DEFF Research Database (Denmark)

    Noorzad, Pardis; Sturm, Bob L.

    2012-01-01

    We propose sparse approximation weighted regression (SPARROW), a method for local estimation of the regression function that uses sparse approximation with a dictionary of measurements. SPARROW estimates the regression function at a point with a linear combination of a few regressands selected by a sparse approximation of the point in terms of the regressors. We show SPARROW can be considered a variant of \(k\)-nearest neighbors regression (\(k\)-NNR), and more generally, local polynomial kernel regression. Unlike \(k\)-NNR, however, SPARROW can adapt the number of regressors to use based…

  17. Nonlinear regression analysis for evaluating tracer binding parameters using the programmable K1003 desk computer

    International Nuclear Information System (INIS)

    Sarrach, D.; Strohner, P.

    1986-01-01

    The Gauss-Newton algorithm has been used to evaluate tracer binding parameters of RIA by nonlinear regression analysis. The calculations were carried out on the K1003 desk computer. Equations for simple binding models and its derivatives are presented. The advantages of nonlinear regression analysis over linear regression are demonstrated
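    A generic Gauss-Newton iteration of the kind the abstract describes can be sketched for an assumed exponential-decay binding model (the actual RIA model is not given in the record):

```python
import numpy as np

def gauss_newton(x, y, beta0, n_iter=50):
    """Gauss-Newton for the assumed model f(x; a, b) = a * exp(-b x),
    with the analytic Jacobian of the residuals."""
    a, b = beta0
    for _ in range(n_iter):
        f = a * np.exp(-b * x)
        J = np.column_stack([np.exp(-b * x),            # df/da
                             -a * x * np.exp(-b * x)])  # df/db
        r = y - f
        step = np.linalg.solve(J.T @ J, J.T @ r)        # normal equations
        a, b = a + step[0], b + step[1]
    return np.array([a, b])

x = np.linspace(0.0, 4.0, 30)
y = 2.5 * np.exp(-1.3 * x)                 # noiseless synthetic decay curve

est = gauss_newton(x, y, beta0=(1.0, 1.0))
```

    Unlike linearized (log-transform) regression, the iteration fits the nonlinear model directly, which is the advantage the abstract refers to.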

  18. Mapping geogenic radon potential by regression kriging

    Energy Technology Data Exchange (ETDEWEB)

    Pásztor, László [Institute for Soil Sciences and Agricultural Chemistry, Centre for Agricultural Research, Hungarian Academy of Sciences, Department of Environmental Informatics, Herman Ottó út 15, 1022 Budapest (Hungary); Szabó, Katalin Zsuzsanna, E-mail: sz_k_zs@yahoo.de [Department of Chemistry, Institute of Environmental Science, Szent István University, Páter Károly u. 1, Gödöllő 2100 (Hungary); Szatmári, Gábor; Laborczi, Annamária [Institute for Soil Sciences and Agricultural Chemistry, Centre for Agricultural Research, Hungarian Academy of Sciences, Department of Environmental Informatics, Herman Ottó út 15, 1022 Budapest (Hungary); Horváth, Ákos [Department of Atomic Physics, Eötvös University, Pázmány Péter sétány 1/A, 1117 Budapest (Hungary)

    2016-02-15

    Radon (²²²Rn) gas is produced in the radioactive decay chain of uranium (²³⁸U), an element that is naturally present in soils. Radon is transported mainly by diffusion and convection mechanisms through the soil, depending mainly on the physical and meteorological parameters of the soil, and can enter and accumulate in buildings. Health risks originating from indoor radon concentration can be attributed to natural factors and are characterized by the geogenic radon potential (GRP). Identification of areas with high health risks requires spatial modeling, that is, mapping of radon risk. In addition to geology and meteorology, physical soil properties play a significant role in the determination of GRP. In order to compile a reliable GRP map for a model area in Central Hungary, spatial auxiliary information representing GRP-forming environmental factors was taken into account to support the spatial inference of the locally measured GRP values. Since the number of measured sites was limited, efficient spatial prediction methodologies were searched for to construct a reliable map for a larger area. Regression kriging (RK) was applied for the interpolation using spatially exhaustive auxiliary data on soil, geology, topography, land use and climate. RK divides the spatial inference into two parts. Firstly, the deterministic component of the target variable is determined by a regression model. The residuals of the multiple linear regression analysis represent the spatially varying but dependent stochastic component, which are interpolated by kriging. The final map is the sum of the two component predictions. Overall accuracy of the map was tested by Leave-One-Out Cross-Validation. Furthermore, the spatial reliability of the resultant map is also estimated by the calculation of the 90% prediction interval of the local prediction values. The applicability of the applied method as well as that of the map is discussed briefly. - Highlights: • A new method

  19. Mapping geogenic radon potential by regression kriging

    International Nuclear Information System (INIS)

    Pásztor, László; Szabó, Katalin Zsuzsanna; Szatmári, Gábor; Laborczi, Annamária; Horváth, Ákos

    2016-01-01

    Radon (²²²Rn) gas is produced in the radioactive decay chain of uranium (²³⁸U), an element naturally present in soils. Radon is transported mainly by diffusion and convection through the soil, depending chiefly on the soil's physical and meteorological parameters, and can enter and accumulate in buildings. Health risks originating from indoor radon concentration can be attributed to natural factors and are characterized by the geogenic radon potential (GRP). Identification of areas with high health risks requires spatial modeling, that is, mapping of radon risk. In addition to geology and meteorology, physical soil properties play a significant role in determining GRP. To compile a reliable GRP map for a model area in Central Hungary, spatial auxiliary information representing GRP-forming environmental factors was taken into account to support the spatial inference of the locally measured GRP values. Since the number of measured sites was limited, efficient spatial prediction methodologies were sought to construct a reliable map for a larger area. Regression kriging (RK) was applied for the interpolation, using spatially exhaustive auxiliary data on soil, geology, topography, land use and climate. RK divides the spatial inference into two parts. First, the deterministic component of the target variable is determined by a regression model. The residuals of the multiple linear regression analysis represent the spatially varying but dependent stochastic component, which is interpolated by kriging. The final map is the sum of the two component predictions. Overall accuracy of the map was tested by leave-one-out cross-validation. Furthermore, the spatial reliability of the resulting map is estimated by calculating the 90% prediction interval of the local prediction values. The applicability of the applied method, as well as that of the map, is discussed briefly. - Highlights: • A new method, regression
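The two-stage logic of regression kriging described above (a regression trend plus kriged residuals) can be sketched in a few lines. The sketch below uses synthetic data, a single covariate, and an exponential covariance model with arbitrarily chosen parameters; in practice the covariance would be fitted from a variogram of the residuals, and none of the values come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: 60 sites with coordinates, one covariate, and a target
# whose trend depends linearly on the covariate plus a spatially smooth residual.
coords = rng.uniform(0, 10, size=(60, 2))
covariate = rng.normal(size=60)
trend = 2.0 + 1.5 * covariate
target = trend + np.sin(coords[:, 0] / 3.0)  # spatially structured residual

# Step 1: deterministic component via multiple linear regression (here: one covariate).
X = np.column_stack([np.ones(60), covariate])
beta, *_ = np.linalg.lstsq(X, target, rcond=None)
residuals = target - X @ beta

# Step 2: interpolate the residuals by simple kriging with an assumed
# exponential covariance model (range and sill chosen arbitrarily here).
def cov(h, sill=1.0, range_param=3.0):
    return sill * np.exp(-h / range_param)

def simple_krige(coords, residuals, query):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    C = cov(d) + 1e-8 * np.eye(len(coords))   # jitter for numerical stability
    c0 = cov(np.linalg.norm(coords - query, axis=1))
    weights = np.linalg.solve(C, c0)
    return weights @ residuals

# Final RK prediction at a new location = regression trend + kriged residual.
query_xy = np.array([5.0, 5.0])
query_cov = 0.3  # covariate value at the query point (assumed known)
prediction = np.array([1.0, query_cov]) @ beta + simple_krige(coords, residuals, query_xy)
```

The sum of the two components mirrors the paper's decomposition: the regression captures what the exhaustive auxiliary data explain, and kriging fills in the spatially correlated remainder.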

  20. Regression calibration with more surrogates than mismeasured variables

    KAUST Repository

    Kipnis, Victor

    2012-06-29

    In a recent paper (Weller EA, Milton DK, Eisen EA, Spiegelman D. Regression calibration for logistic regression with multiple surrogates for one exposure. Journal of Statistical Planning and Inference 2007; 137: 449-461), the authors discussed fitting logistic regression models when a scalar main explanatory variable is measured with error by several surrogates, that is, a situation with more surrogates than variables measured with error. They compared two methods of adjusting for measurement error using a regression calibration approximate model as if it were exact. One is the standard regression calibration approach consisting of substituting an estimated conditional expectation of the true covariate given observed data in the logistic regression. The other is a novel two-stage approach when the logistic regression is fitted to multiple surrogates, and then a linear combination of estimated slopes is formed as the estimate of interest. Applying estimated asymptotic variances for both methods in a single data set with some sensitivity analysis, the authors asserted superiority of their two-stage approach. We investigate this claim in some detail. A troubling aspect of the proposed two-stage method is that, unlike standard regression calibration and a natural form of maximum likelihood, the resulting estimates are not invariant to reparameterization of nuisance parameters in the model. We show, however, that, under the regression calibration approximation, the two-stage method is asymptotically equivalent to a maximum likelihood formulation, and is therefore in theory superior to standard regression calibration. However, our extensive finite-sample simulations in the practically important parameter space where the regression calibration model provides a good approximation failed to uncover such superiority of the two-stage method. We also discuss extensions to different data structures.

  1. Regression calibration with more surrogates than mismeasured variables

    KAUST Repository

    Kipnis, Victor; Midthune, Douglas; Freedman, Laurence S.; Carroll, Raymond J.

    2012-01-01

    In a recent paper (Weller EA, Milton DK, Eisen EA, Spiegelman D. Regression calibration for logistic regression with multiple surrogates for one exposure. Journal of Statistical Planning and Inference 2007; 137: 449-461), the authors discussed fitting logistic regression models when a scalar main explanatory variable is measured with error by several surrogates, that is, a situation with more surrogates than variables measured with error. They compared two methods of adjusting for measurement error using a regression calibration approximate model as if it were exact. One is the standard regression calibration approach consisting of substituting an estimated conditional expectation of the true covariate given observed data in the logistic regression. The other is a novel two-stage approach when the logistic regression is fitted to multiple surrogates, and then a linear combination of estimated slopes is formed as the estimate of interest. Applying estimated asymptotic variances for both methods in a single data set with some sensitivity analysis, the authors asserted superiority of their two-stage approach. We investigate this claim in some detail. A troubling aspect of the proposed two-stage method is that, unlike standard regression calibration and a natural form of maximum likelihood, the resulting estimates are not invariant to reparameterization of nuisance parameters in the model. We show, however, that, under the regression calibration approximation, the two-stage method is asymptotically equivalent to a maximum likelihood formulation, and is therefore in theory superior to standard regression calibration. However, our extensive finite-sample simulations in the practically important parameter space where the regression calibration model provides a good approximation failed to uncover such superiority of the two-stage method. We also discuss extensions to different data structures.
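The standard regression calibration approach discussed above — substitute an estimated E[X | W] for the mismeasured covariate, then fit the logistic regression — can be illustrated on simulated data. Everything below (sample sizes, error variances, the assumption of a calibration subsample in which the true exposure is observed) is an illustrative assumption, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000

# Simulated setting: scalar true exposure X, two error-prone surrogates W1, W2,
# and a binary outcome Y generated from a logistic model in X (true slope 1.0).
X = rng.normal(size=n)
W = np.column_stack([X + rng.normal(scale=0.8, size=n),
                     X + rng.normal(scale=0.8, size=n)])
p = 1.0 / (1.0 + np.exp(-(0.2 + 1.0 * X)))
Y = rng.binomial(1, p)

# Standard regression calibration, assuming a calibration subsample in which
# the true X is observed (first 500 subjects here) so E[X | W] can be estimated.
cal = slice(0, 500)
Wc = np.column_stack([np.ones(500), W[cal]])
alpha, *_ = np.linalg.lstsq(Wc, X[cal], rcond=None)
X_hat = np.column_stack([np.ones(n), W]) @ alpha   # plug-in estimate of E[X | W]

# Fit the logistic regression by Newton-Raphson (IRLS).
def logistic_fit(design, y, iters=25):
    b = np.zeros(design.shape[1])
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-(design @ b)))
        H = design.T @ (design * (mu * (1.0 - mu))[:, None])
        b += np.linalg.solve(H, design.T @ (y - mu))
    return b

beta = logistic_fit(np.column_stack([np.ones(n), X_hat]), Y)
# Naive fit on a single surrogate, for contrast: its slope is attenuated
# toward zero by the measurement error.
beta_naive = logistic_fit(np.column_stack([np.ones(n), W[:, 0]]), Y)
```

The calibrated slope `beta[1]` should sit near the true value 1.0, while `beta_naive[1]` is visibly attenuated — the bias the calibration step corrects.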

  2. BANK FAILURE PREDICTION WITH LOGISTIC REGRESSION

    Directory of Open Access Journals (Sweden)

    Taha Zaghdoudi

    2013-04-01

    In recent years, the economic and financial world has been shaken by a wave of financial crises that produced fairly heavy losses for banks. Several authors have focused on the study of these crises in order to develop early-warning models, and it is in the same spirit that our work takes its inspiration. Indeed, we have tried to develop a predictive model of Tunisian bank failures using the binary logistic regression method. The specificity of our prediction model is that it takes into account microeconomic indicators of bank failure. The results obtained using our provisional model show that a bank's ability to repay its debt, the coefficient of banking operations, bank profitability per employee and the financial leverage ratio have a negative impact on the probability of failure.

  3. Caudal regression with sirenomelia and dysplasia renofacialis (Potter's syndrome)

    International Nuclear Information System (INIS)

    Noeldge, G.; Billmann, P.; Boehm, N.; Freiburg Univ.

    1982-01-01

    A case of caudal regression in combination with sirenomelia and dysplasia renofacialis (Potter's syndrome) is reported. The formal pathogenesis of these malformations and the clinical findings are presented and discussed. Findings from plain films and postmortem angiography, as well as the pathologic-anatomical changes, are demonstrated. (orig.)

  4. On the Occurrence of Standardized Regression Coefficients Greater than One.

    Science.gov (United States)

    Deegan, John, Jr.

    1978-01-01

    It is demonstrated here that standardized regression coefficients greater than one can legitimately occur. Furthermore, the relationship between the occurrence of such coefficients and the extent of multicollinearity present among the set of predictor variables in an equation is examined. Comments on the interpretation of these coefficients are…

  5. Regression testing in the TOTEM DCS

    International Nuclear Information System (INIS)

    Rodríguez, F Lucas; Atanassov, I; Burkimsher, P; Frost, O; Taskinen, J; Tulimaki, V

    2012-01-01

    The Detector Control System of the TOTEM experiment at the LHC is built with the industrial product WinCC OA (PVSS). The TOTEM system is generated automatically through scripts using as input the detector Product Breakdown Structure (PBS) and its pinout connectivity, archiving and alarm metainformation, and some other heuristics based on the naming conventions. When those initial parameters and automation code are modified to include new features, the resulting PVSS system can also introduce side-effects. On a daily basis, a custom-developed regression testing tool takes the most recent code from a Subversion (SVN) repository and builds a new control system from scratch. This system is exported in plain-text format using the PVSS export tool and compared with a system previously validated by a human. A report is sent to the developers with any differences highlighted, in readiness for validation and acceptance as a new stable version. This regression approach is not dependent on any development framework or methodology. The process has run satisfactorily for several months, proving to be a very valuable tool for vetting new versions before they are deployed to the production systems.
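The core export-and-compare step can be sketched with Python's standard difflib. The export format and file names below are invented for illustration and do not reflect the actual PVSS export layout.

```python
import difflib

# Hypothetical exported control-system dumps: the previously validated
# ("golden") export and the freshly generated candidate.
golden = """datapoint: TOTEM/T1/HV/channel001
alarm: limit_high 1500
archive: enabled
""".splitlines()

fresh = """datapoint: TOTEM/T1/HV/channel001
alarm: limit_high 1600
archive: enabled
""".splitlines()

# unified_diff yields only the changed regions, ready to send to developers.
diff = list(difflib.unified_diff(golden, fresh,
                                 fromfile="validated.export",
                                 tofile="candidate.export",
                                 lineterm=""))
report = "\n".join(diff)
needs_validation = bool(diff)   # any difference triggers human review
```

An empty diff means the candidate system can be accepted automatically; any difference is highlighted for a human to validate, mirroring the daily workflow described.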

  6. Parameters Estimation of Geographically Weighted Ordinal Logistic Regression (GWOLR) Model

    Science.gov (United States)

    Zuhdi, Shaifudin; Retno Sari Saputro, Dewi; Widyaningsih, Purnami

    2017-06-01

    A regression model represents the relationship between independent variables and a dependent variable. When the dependent variable is categorical, a logistic regression model is used to calculate the odds; when its categories are ordered, the model is ordinal logistic regression. The GWOLR model is an ordinal logistic regression model influenced by the geographical location of the observation sites. Parameter estimation is needed to determine population values from a sample. The purpose of this research is parameter estimation of the GWOLR model using R software. The estimation uses data on the number of dengue fever patients in Semarang City; the observation units are 144 villages in Semarang City. The research yields a local GWOLR model for each village and the probability of each category of the number of dengue fever patients.
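The geographically weighted idea — one local, kernel-weighted fit per observation site — can be sketched as follows. For brevity the sketch uses a binary (not ordinal) logistic model with a small ridge penalty for numerical stability, on synthetic data; the bandwidth and all other settings are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(6)
m = 100  # observation sites (standing in for villages)

# Toy data: site coordinates, one covariate, and a binary response standing in
# for a dichotomized case-count category; the true slope drifts across space.
xy = rng.uniform(0, 10, size=(m, 2))
x = rng.normal(size=m)
slope = 0.5 + 0.2 * xy[:, 0]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-slope * x)))
X = np.column_stack([np.ones(m), x])

def gw_logistic(site, bandwidth=3.0, lam=0.1, iters=40):
    """Geographically weighted logistic fit at one site: Gaussian kernel
    weights by distance, then penalized Newton-Raphson (IRLS)."""
    d = np.linalg.norm(xy - xy[site], axis=1)
    w = np.exp(-(d / bandwidth) ** 2)
    b = np.zeros(2)
    for _ in range(iters):
        mu = 1.0 / (1.0 + np.exp(-np.clip(X @ b, -30, 30)))
        grad = X.T @ (w * (y - mu)) - lam * b          # ridge-penalized score
        H = X.T @ (X * (w * mu * (1 - mu))[:, None]) + lam * np.eye(2)
        b += np.linalg.solve(H, grad)
    return b

# One local model per site, as in the village-level GWOLR map.
local_betas = np.array([gw_logistic(s) for s in range(m)])
```

The array of per-site coefficients is exactly what gets mapped: each site's local slope reflects the relationship in its spatial neighbourhood rather than a single global fit.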

  7. Zero-Shot Learning via Attribute Regression and Class Prototype Rectification.

    Science.gov (United States)

    Luo, Changzhi; Li, Zhetao; Huang, Kaizhu; Feng, Jiashi; Wang, Meng

    2018-02-01

    Zero-shot learning (ZSL) aims at classifying examples for unseen classes (with no training examples) given some other seen classes (with training examples). Most existing approaches exploit intermediate-level information (e.g., attributes) to transfer knowledge from seen classes to unseen classes. A common practice is to first learn projections from samples to attributes on seen classes via a regression method, and then apply such projections to unseen classes directly. However, it turns out that such a learning strategy easily causes the projection domain shift problem and the hubness problem, which hinder performance on the ZSL task. In this paper, we also formulate ZSL as an attribute regression problem. However, different from general regression-based solutions, the proposed approach is novel in three aspects. First, a class prototype rectification method is proposed to connect the unseen classes to the seen classes. Here, a class prototype refers to a vector representation of a class, and it is also known as a class center, class signature, or class exemplar. Second, an alternating learning scheme is proposed for jointly performing attribute regression and rectifying the class prototypes. Finally, a new objective function which takes into consideration both the attribute regression accuracy and the class prototype discrimination is proposed. With this solution, the domain shift and hubness problems can be mitigated. Experimental results on three public datasets (i.e., CUB200-2011, SUN Attribute, and aPaY) well demonstrate the effectiveness of our approach.
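The common regression-based ZSL baseline described above (learn a feature-to-attribute projection on seen classes, then classify unseen samples by nearest attribute prototype) can be sketched on toy data. The prototypes, dimensions, and ridge penalty below are illustrative; this is the baseline the paper improves upon, not the proposed rectification method itself.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setup: 3 seen classes and 2 unseen classes, each described by a
# 4-dimensional binary attribute vector (its class prototype).
seen_proto = np.array([[1, 0, 1, 0], [0, 1, 0, 1], [1, 1, 0, 0]], float)
unseen_proto = np.array([[0, 0, 1, 1], [1, 0, 0, 1]], float)

# Seen-class training samples: noisy feature vectors lying near a linear
# image of their class attributes (features = attributes @ M + noise).
M_true = rng.normal(size=(4, 6))
labels = rng.integers(0, 3, size=300)
feats = seen_proto[labels] @ M_true + 0.1 * rng.normal(size=(300, 6))

# Learn the feature-to-attribute projection by ridge regression on seen classes.
lam = 1e-2
A = feats.T @ feats + lam * np.eye(6)
proj = np.linalg.solve(A, feats.T @ seen_proto[labels])   # 6 x 4 projection

# Classify a test sample from an unseen class: project it to attribute space,
# then pick the nearest unseen-class prototype.
test = unseen_proto[0] @ M_true + 0.1 * rng.normal(size=6)
pred_attr = test @ proj
pred_class = int(np.argmin(np.linalg.norm(unseen_proto - pred_attr, axis=1)))
```

Applying the seen-class projection directly to unseen classes is precisely where the projection domain shift the paper targets arises; the rectification step adjusts the prototypes to compensate.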

  8. Performance and strategy comparisons of human listeners and logistic regression in discriminating underwater targets.

    Science.gov (United States)

    Yang, Lixue; Chen, Kean

    2015-11-01

    To improve the design of underwater target recognition systems based on auditory perception, this study compared human listeners with automatic classifiers, using performance measures and strategies from three discrimination experiments: between man-made and natural targets, between ships and submarines, and among three types of ships. In the experiments, the subjects were asked to assign a score to each sound based on how confident they were about the category to which it belonged, and logistic regression, representing linear discriminative models, completed three similar tasks utilizing many auditory features. The results indicated that the performance of logistic regression improved as the ratio between inter- and intra-class differences became larger, whereas the performance of the human subjects was limited by their unfamiliarity with the targets. Logistic regression performed better than the human subjects in all tasks except the discrimination between man-made and natural targets, and the strategies employed by excellent human subjects were similar to that of logistic regression. Logistic regression and several human subjects demonstrated similar performances when discriminating man-made from natural targets, but in this case their strategies were not similar. An appropriate fusion of their strategies led to further improvement in recognition accuracy.

  9. Learning From Demonstration?

    DEFF Research Database (Denmark)

    Koch, Christian; Bertelsen, Niels Haldor

    2014-01-01

    Demonstration projects are often used in the building sector to provide a basis for using new processes and/or products. The climate change agenda implies that construction is not only required to deliver value for the customer, cost reductions and efficiency, but also sustainable buildings. This paper reports on an early demonstration project, the building of a passive-house dormitory in the Central Region of Denmark in 2006-2009. The project was supposed to deliver value, lean design, prefabrication, quality in sustainability, certification according to German standards for passive houses, and micro combined heat and power using hydrogen. Using sociological and business-economic theories of innovation, the paper discusses how early movers of innovation tend to obtain only partial success when demonstrating their products and often feel obstructed by minor details. The empirical work

  10. Solar renovation demonstration projects

    Energy Technology Data Exchange (ETDEWEB)

    Bruun Joergensen, O [ed.

    1998-10-01

    In the framework of the IEA SHC Programme, a Task on building renovation was initiated, 'Task 20, Solar Energy in Building Renovation'. In a part of the task, Subtask C 'Design of Solar Renovation Projects', different solar renovation demonstration projects were developed. The objective of Subtask C was to demonstrate the application of advanced solar renovation concepts on real buildings. This report documents 16 different solar renovation demonstration projects including the design processes of the projects. The projects include the renovation of houses, schools, laboratories, and factories. Several solar techniques were used: building integrated solar collectors, glazed balconies, ventilated solar walls, transparent insulation, second skin facades, daylight elements and photovoltaic systems. These techniques are used in several simple as well as more complex system designs. (au)

  11. Photovoltaic demonstration projects

    Energy Technology Data Exchange (ETDEWEB)

    Gillett, W B; Hacker, R J; Kaut, W [eds.

    1991-01-01

    This book, the proceedings of the fourth PV-Contractors' Meeting organized by the Commission of the European Communities, Directorate-General for Energy, held at Brussels on 21 and 22 November 1989, provides an overview of the photovoltaic demonstration projects which have been supported in the framework of the Energy Demonstration Program since 1983. It includes reports by each of the contractors who submitted proposals in 1983, 1984, 1985 and 1986, describing progress with their projects. Summaries of the discussions held at the meeting, which included contractors whose projects were submitted in 1987, are also presented. The different technologies which are being demonstrated concern the modules, the cabling of the array, structure design, storage strategy and power conditioning. The various applications include desalination, communications, dairy farms, water pumping, and warning systems. Papers have been processed separately for inclusion on the data base.

  12. Electric vehicle demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Ouellet, M. [National Centre for Advanced Transportation, Saint-Jerome, PQ (Canada)

    2010-07-01

    The desirable characteristics of Canadian projects that demonstrate vehicle use in real-world operation and the appropriate mechanism to collect and disseminate the monitoring data were discussed in this presentation. The scope of the project was on passenger cars and light duty trucks operating in plug-in electric vehicle (PHEV) or battery electric vehicle modes. The presentation also discussed the funding, stakeholders involved, Canadian travel pattern analysis, regulatory framework, current and recent electric vehicle demonstration projects, and project guidelines. It was concluded that some demonstration project activities may have been duplicated as communication between the proponents was insufficient. It was recommended that data monitoring using automatic data logging with minimum reliance on logbooks and other user entry should be emphasized. figs.

  13. Innovative technology demonstrations

    International Nuclear Information System (INIS)

    Anderson, D.B.; Luttrell, S.P.; Hartley, J.N.

    1992-08-01

    Environmental Management Operations (EMO) is conducting an Innovative Technology Demonstration Program for Tinker Air Force Base (TAFB). Several innovative technologies are being demonstrated to address specific problems associated with remediating two contaminated test sites at the base. Cone penetrometer testing (CPT) is a form of testing that can rapidly characterize a site. This technology was selected to evaluate its applicability in the tight clay soils and consolidated sandstone sediments found at TAFB. Directionally drilled horizontal wells was selected as a method that may be effective in accessing contamination beneath Building 3001 without disrupting the mission of the building, and in enhancing the extraction of contamination both in ground water and in soil. A soil gas extraction (SGE) demonstration, also known as soil vapor extraction, will evaluate the effectiveness of SGE in remediating fuels and TCE contamination contained in the tight clay soil formations surrounding the abandoned underground fuel storage vault located at the SW Tanks Site. In situ sensors have recently received much acclaim as a technology that can be effective in remediating hazardous waste sites. Sensors can be useful for determining real-time, in situ contaminant concentrations during the remediation process for performance monitoring and in providing feedback for controlling the remediation process. Following the SGE demonstration, the SGE system and SW Tanks test site will be modified to demonstrate bioremediation as an effective means of degrading the remaining contaminants in situ. The bioremediation demonstration will evaluate a bioventing process in which the naturally occurring consortium of soil bacteria will be stimulated to aerobically degrade soil contaminants, including fuel and TCE, in situ

  14. Innovative technology demonstrations

    International Nuclear Information System (INIS)

    Anderson, D.B.; Hartley, J.N.; Luttrell, S.P.

    1992-04-01

    Currently, several innovative technologies are being demonstrated at Tinker Air Force Base (TAFB) to address specific problems associated with remediating two contaminated test sites at the base. Cone penetrometer testing (CPT) is a form of testing that can rapidly characterize a site. This technology was selected to evaluate its applicability in the tight clay soils and consolidated sandstone sediments found at TAFB. Directionally drilled horizontal wells have been successfully installed at the US Department of Energy's (DOE) Savannah River Site to test new methods of in situ remediation of soils and ground water. This emerging technology was selected as a method that may be effective in accessing contamination beneath Building 3001 without disrupting the mission of the building, and in enhancing the extraction of contamination both in ground water and in soil. A soil gas extraction (SGE) demonstration, also known as soil vapor extraction, will evaluate the effectiveness of SGE in remediating fuels and TCE contamination contained in the tight clay soil formations surrounding the abandoned underground fuel storage vault located at the SW Tanks Site. In situ sensors have recently received much acclaim as a technology that can be effective in remediating hazardous waste sites. Sensors can be useful for determining real-time, in situ contaminant concentrations during the remediation process for performance monitoring and in providing feedback for controlling the remediation process. A demonstration of two in situ sensor systems capable of providing real-time data on contamination levels will be conducted and evaluated concurrently with the SGE demonstration activities. Following the SGE demonstration, the SGE system and SW Tanks test site will be modified to demonstrate bioremediation as an effective means of degrading the remaining contaminants in situ

  15. Poisson Mixture Regression Models for Heart Disease Prediction

    Science.gov (United States)

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is addressed under two different classes: standard and concomitant-variable mixture regression models. Results show that a two-component concomitant-variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model, owing to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease component-wise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks component-wise using a Poisson mixture regression model. PMID:27999611
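The model-selection step above — comparing Poisson regression models by their Bayesian Information Criterion — can be illustrated with a plain Newton-Raphson fit on simulated counts. The mixture and concomitant-variable machinery of the paper is omitted, and all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500

# Simulated counts from a log-linear Poisson model (true slope 0.7 assumed).
x = rng.normal(size=n)
y = rng.poisson(np.exp(0.5 + 0.7 * x))

def poisson_fit(X, y, iters=40):
    """Newton-Raphson for Poisson regression; returns (beta, log-likelihood)."""
    b = np.zeros(X.shape[1])
    b[0] = np.log(y.mean() + 1e-9)   # start near the intercept-only solution
    for _ in range(iters):
        mu = np.exp(X @ b)
        H = X.T @ (X * mu[:, None])
        b += np.linalg.solve(H, X.T @ (y - mu))
    mu = np.exp(X @ b)
    # Log-likelihood up to the constant sum(log y!), shared by all models.
    ll = np.sum(y * (X @ b) - mu)
    return b, ll

def bic(ll, k, n):
    return -2.0 * ll + k * np.log(n)

# Compare an intercept-only model with the model that includes x;
# the lower BIC should identify the richer (true) model.
X0 = np.ones((n, 1))
X1 = np.column_stack([np.ones(n), x])
_, ll0 = poisson_fit(X0, y)
beta1, ll1 = poisson_fit(X1, y)
bic0, bic1 = bic(ll0, 1, n), bic(ll1, 2, n)
```

Dropping the `sum(log y!)` constant is harmless here because it is identical across models, so BIC differences are unaffected.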

  16. Photovoltaic demonstration projects 2

    Energy Technology Data Exchange (ETDEWEB)

    Gillett, W B; Hacker, R J [Halcrow (William) and Partners, Swindon (UK); Kaut, W [eds.

    1989-01-01

    This book, the proceedings of the third Photovoltaic Contractors' Meeting organised by the Commission of the European Communities, Directorate-General for Energy provides an overview of the photovoltaic demonstration projects which have been supported by the Energy Directorate of the Commission of the European Communities since 1983. It includes reports by each of the contractors who submitted proposals in 1983, 1984 and 1985, describing progress with their projects. The different technologies which are being demonstrated concern the modules, the cabling of the array, structure design, storage strategy and power conditioning. The various applications include powering of houses, villages, recreation centres, water desalination, communications, dairy farms, water pumping and warning systems. (author).

  17. Face Hallucination with Linear Regression Model in Semi-Orthogonal Multilinear PCA Method

    Science.gov (United States)

    Asavaskulkiet, Krissada

    2018-04-01

    In this paper, we propose a new face hallucination technique: face image reconstruction in HSV color space with a semi-orthogonal multilinear principal component analysis (SO-MPCA) method. This novel hallucination technique operates directly on tensors via tensor-to-vector projection, imposing the orthogonality constraint in only one mode. In our experiments, we use facial images from the FERET database to test our hallucination approach in extensive experiments. The experimental results clearly demonstrate that we can generate photorealistic, high-quality color face images by using the SO-MPCA subspace with a linear regression model.

  18. Binary logistic regression-Instrument for assessing museum indoor air impact on exhibits.

    Science.gov (United States)

    Bucur, Elena; Danet, Andrei Florin; Lehr, Carol Blaziu; Lehr, Elena; Nita-Lazar, Mihai

    2017-04-01

    This paper presents a new way to assess the environmental impact on historical artifacts using binary logistic regression. The impact on the exhibits under given pollution scenarios (environmental impact) was predicted by a mathematical model based on binary logistic regression; the model identifies, from a multitude of possible parameters, those environmental parameters with a significant impact on exhibits and ranks them according to the severity of their effect. Air quality (NO2, SO2, O3 and PM2.5) and microclimate (temperature, humidity) monitoring data from a case study conducted within the exhibition and storage spaces of the Romanian National Aviation Museum Bucharest were used for developing and validating the binary logistic regression method and the mathematical model. The logistic regression analysis was applied to 794 data combinations (715 to develop the model and 79 to validate it) using the Statistical Package for the Social Sciences (SPSS 20.0). The results of the binary logistic regression analysis demonstrated that, of the six parameters taken into consideration, four have a significant effect upon the exhibits, in the following order: O3 > PM2.5 > NO2 > humidity, followed at a significant distance by the effects of SO2 and temperature. The mathematical model correctly predicted 95.1% of the cumulated effect of the environmental parameters upon the exhibits. Moreover, this model could also be used in the decision-making process regarding the preventive preservation measures that should be implemented within the exhibition space, establishing the best measures for pollution reduction and preventive

  19. Comparing lagged linear correlation, lagged regression, Granger causality, and vector autoregression for uncovering associations in EHR data.

    Science.gov (United States)

    Levine, Matthew E; Albers, David J; Hripcsak, George

    2016-01-01

    Time series analysis methods have been shown to reveal clinical and biological associations in data collected in the electronic health record. We wish to develop reliable high-throughput methods for identifying adverse drug effects that are easy to implement and produce readily interpretable results. To move toward this goal, we used univariate and multivariate lagged regression models to investigate associations between twenty pairs of drug orders and laboratory measurements. Multivariate lagged regression models exhibited higher sensitivity and specificity than univariate lagged regression in the 20 examples, and incorporating autoregressive terms for labs and drugs produced more robust signals in cases of known associations among the 20 example pairings. Moreover, including inpatient admission terms in the model attenuated the signals for some cases of unlikely associations, demonstrating how multivariate lagged regression models' explicit handling of context-based variables can provide a simple way to probe for health-care processes that confound analyses of EHR data.
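A minimal multivariate lagged regression with an autoregressive term, in the spirit described above, might look like this on synthetic data. The two-day drug-to-lab delay and all coefficients are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 400

# Toy series standing in for a daily drug-exposure indicator and a lab value:
# the lab responds to the drug two days later, plus autoregressive carryover.
drug = (rng.uniform(size=T) < 0.3).astype(float)
lab = np.zeros(T)
for t in range(2, T):
    lab[t] = 0.5 * lab[t - 1] + 1.5 * drug[t - 2] + rng.normal(scale=0.3)

# Multivariate lagged regression: regress lab[t] on its own lag (the
# autoregressive term) and on lagged drug exposures, in one least-squares fit.
max_lag = 3
rows, targets = [], []
for t in range(max_lag, T):
    rows.append([1.0, lab[t - 1]] + [drug[t - k] for k in range(1, max_lag + 1)])
    targets.append(lab[t])
X = np.array(rows)
y = np.array(targets)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# coef layout: [intercept, lab AR(1), drug lag-1, drug lag-2, drug lag-3];
# the lag-2 drug coefficient should dominate, recovering the injected delay.
best_lag = int(np.argmax(coef[2:])) + 1
```

Including the autoregressive `lab[t-1]` term is what makes the drug-lag coefficients interpretable here, analogous to the paper's finding that autoregressive terms produce more robust signals.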

  20. PHARUS ASAR demonstrator

    NARCIS (Netherlands)

    Smith, A.J.E.; Bree, R.J.P. van; Calkoen, C.J.; Dekker, R.J.; Otten, M.P.G.; Rossum, W.L. van

    2001-01-01

    PHARUS is a polarimetric phased array C-band Synthetic Aperture Radar (SAR), designed and built for airborne use. Advanced SAR (ASAR) data in image and alternating polarization mode have been simulated with PHARUS to demonstrate the use of Envisat for a number of typical SAR applications that are

  1. Demonstrating the Gas Laws.

    Science.gov (United States)

    Holko, David A.

    1982-01-01

    Presents a complete computer program demonstrating the relationship between volume/pressure for Boyle's Law, volume/temperature for Charles' Law, and volume/moles of gas for Avogadro's Law. The programming reinforces students' application of the gas laws and equates a simulated moving piston to theoretical values derived using the ideal gas law.…
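The original program is not reproduced here, but a modern sketch of the same demonstration is straightforward: solve the ideal gas law for volume and vary one quantity at a time to exhibit Boyle's, Charles' and Avogadro's laws.

```python
# Ideal gas constant in L·atm/(mol·K).
R = 0.082057

def volume(n_mol, temp_k, pressure_atm):
    """Ideal gas law solved for V: V = nRT / P."""
    return n_mol * R * temp_k / pressure_atm

v0 = volume(1.0, 273.15, 1.0)             # ~22.41 L for 1 mol at STP
v_boyle = volume(1.0, 273.15, 2.0)        # Boyle: doubling P halves V
v_charles = volume(1.0, 2 * 273.15, 1.0)  # Charles: doubling T doubles V
v_avogadro = volume(2.0, 273.15, 1.0)     # Avogadro: doubling n doubles V
```

Each comparison against `v0` isolates one law, which is the pedagogical point of the original piston simulation.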

  2. Astronomy LITE Demonstrations

    Science.gov (United States)

    Brecher, Kenneth

    2006-12-01

    Project LITE (Light Inquiry Through Experiments) is a materials, software, and curriculum development project. It focuses on light, optics, color and visual perception. According to two recent surveys of college astronomy faculty members, these are among the topics most often included in the large introductory astronomy courses. The project has aimed largely at the design and implementation of hands-on experiences for students. However, it has also included the development of lecture demonstrations that employ novel light sources and materials. In this presentation, we will show some of our new lecture demonstrations concerning geometrical and physical optics, fluorescence, phosphorescence and polarization. We have developed over 200 Flash and Java applets that can be used either by teachers in lecture settings or by students at home. They are all posted on the web at http://lite.bu.edu. For either purpose they can be downloaded directly to the user's computer or run off line. In lecture demonstrations, some of these applets can be used to control the light emitted by video projectors to produce physical effects in materials (e.g. fluorescence). Other applets can be used, for example, to demonstrate that the human percept of color does not have a simple relationship with the physical frequency of the stimulating source of light. Project LITE is supported by Grant #DUE-0125992 from the NSF Division of Undergraduate Education.

  3. A Magnetic Circuit Demonstration.

    Science.gov (United States)

    Vanderkooy, John; Lowe, June

    1995-01-01

    Presents a demonstration designed to illustrate Faraday's, Ampere's, and Lenz's laws and to reinforce the concepts through the analysis of a two-loop magnetic circuit. Can be made dramatic and challenging for sophisticated students but is suitable for an introductory course in electricity and magnetism. (JRH)

  4. Robust mislabel logistic regression without modeling mislabel probabilities.

    Science.gov (United States)

    Hung, Hung; Jou, Zhi-Yu; Huang, Su-Yun

    2018-03-01

    Logistic regression is among the most widely used statistical methods for linear discriminant analysis. In many applications, we only observe possibly mislabeled responses. Fitting a conventional logistic regression can then lead to biased estimation. One common resolution is to fit a mislabel logistic regression model, which takes mislabeled responses into consideration. Another common method is to adopt a robust M-estimation by down-weighting suspected instances. In this work, we propose a new robust mislabel logistic regression based on γ-divergence. Our proposal possesses two advantageous features: (1) It does not need to model the mislabel probabilities. (2) The minimum γ-divergence estimation leads to a weighted estimating equation without the need to include any bias correction term; that is, it is automatically bias-corrected. These features make the proposed γ-logistic regression more robust in model fitting and more intuitive for model interpretation through a simple weighting scheme. Our method is also easy to implement, and two types of algorithms are included. Simulation studies and the Pima data application are presented to demonstrate the performance of γ-logistic regression. © 2017, The International Biometric Society.
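
The likelihood-based weighting scheme described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes NumPy, synthetic data, and a plain gradient-ascent fit in which each observation is weighted by its own model likelihood raised to the power γ, so poorly fitting (possibly mislabeled) points are automatically down-weighted; γ = 0 recovers ordinary maximum-likelihood logistic regression.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gamma_logistic(X, y, gamma=0.5, lr=0.1, n_iter=2000):
    """Sketch of a gamma-weighted robust logistic regression.

    Each observation is weighted by the model's own likelihood raised to
    the power gamma, so badly fitting (possibly mislabeled) points carry
    less weight in the score equation.  gamma=0 gives ordinary logistic
    regression.  This is an illustrative simplification, not the exact
    minimum gamma-divergence estimating equation from the paper.
    """
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        p = sigmoid(X @ beta)
        lik = np.where(y == 1, p, 1 - p)   # per-sample likelihood
        w = lik ** gamma                   # down-weighting scheme
        beta += lr * (X.T @ (w * (y - p)) / n)
    return beta
```

On data with a fraction of flipped labels, the weighted fit remains usable where a naive fit would be more strongly biased toward zero.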

  5. Logistic regression for risk factor modelling in stuttering research.

    Science.gov (United States)

    Reed, Phil; Wu, Yaqionq

    2013-06-01

    To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. Conjoined legs: Sirenomelia or caudal regression syndrome?

    Directory of Open Access Journals (Sweden)

    Sakti Prasad Das

    2013-01-01

    Full Text Available Presence of a single umbilical persistent vitelline artery distinguishes sirenomelia from caudal regression syndrome. We report a case of a 12-year-old boy who had bilateral umbilical arteries and presented with fusion of both legs in the lower one third of the leg. Both feet were rudimentary. The right foot had a valgus rocker-bottom deformity. All toes were present but rudimentary. The left foot showed absence of all toes. Physical examination showed left tibia vara. The chest evaluation in sitting revealed pigeon chest and an elevated right shoulder. Posterior examination of the trunk showed thoracic scoliosis with convexity to the right. The patient was operated on, and at 1-year follow-up the boy had two separate legs with a good aesthetic and functional result.

  7. Conjoined legs: Sirenomelia or caudal regression syndrome?

    Science.gov (United States)

    Das, Sakti Prasad; Ojha, Niranjan; Ganesh, G Shankar; Mohanty, Ram Narayan

    2013-07-01

    Presence of a single umbilical persistent vitelline artery distinguishes sirenomelia from caudal regression syndrome. We report a case of a 12-year-old boy who had bilateral umbilical arteries and presented with fusion of both legs in the lower one third of the leg. Both feet were rudimentary. The right foot had a valgus rocker-bottom deformity. All toes were present but rudimentary. The left foot showed absence of all toes. Physical examination showed left tibia vara. The chest evaluation in sitting revealed pigeon chest and an elevated right shoulder. Posterior examination of the trunk showed thoracic scoliosis with convexity to the right. The patient was operated on, and at 1-year follow-up the boy had two separate legs with a good aesthetic and functional result.

  8. Logistic regression against a divergent Bayesian network

    Directory of Open Access Journals (Sweden)

    Noel Antonio Sánchez Trujillo

    2015-01-01

    Full Text Available This article is a discussion of two statistical tools used for prediction and causality assessment: logistic regression and Bayesian networks. Using data from a simulated example from a study assessing factors that might predict pulmonary emphysema (where fingertip pigmentation and smoking are considered), we posed the following questions. Is pigmentation a confounding, causal or predictive factor? Is there perhaps another factor, like smoking, that confounds? Is there a synergy between pigmentation and smoking? The results, in terms of prediction, are similar with the two techniques; regarding causation, differences arise. We conclude that, in decision-making, the combination of a statistical tool used with common sense and prior evidence, which may take years or even centuries to accumulate, is better than the automatic and exclusive use of statistical resources.

  9. Entrepreneurial intention modeling using hierarchical multiple regression

    Directory of Open Access Journals (Sweden)

    Marina Jeger

    2014-12-01

    Full Text Available The goal of this study is to identify the contribution of effectuation dimensions to the predictive power of the entrepreneurial intention model over and above that which can be accounted for by other predictors selected and confirmed in previous studies. As is often the case in social and behavioral studies, some variables are likely to be highly correlated with each other. Therefore, the relative amount of variance in the criterion variable explained by each of the predictors depends on several factors such as the order of variable entry and sample specifics. The results show the modest predictive power of two dimensions of effectuation prior to the introduction of the theory of planned behavior elements. The article highlights the main advantages of applying hierarchical regression in social sciences as well as in the specific context of entrepreneurial intention formation, and addresses some of the potential pitfalls that this type of analysis entails.

  10. Gaussian process regression for geometry optimization

    Science.gov (United States)

    Denzel, Alexander; Kästner, Johannes

    2018-03-01

    We implemented a geometry optimizer based on Gaussian process regression (GPR) to find minimum structures on potential energy surfaces. We tested both a two times differentiable form of the Matérn kernel and the squared exponential kernel. The Matérn kernel performs much better. We give a detailed description of the optimization procedures. These include overshooting the step resulting from GPR in order to obtain a higher degree of interpolation vs. extrapolation. In a benchmark against the Limited-memory Broyden-Fletcher-Goldfarb-Shanno optimizer of the DL-FIND library on 26 test systems, we found the new optimizer to generally reduce the number of required optimization steps.
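
The general idea of a GPR surrogate for locating a minimum can be illustrated on a one-dimensional toy "potential". The sketch below is not the authors' optimizer: it assumes NumPy, the squared exponential kernel (the simpler of the two kernels the paper compares), a zero-mean prior, a synthetic quadratic energy curve, and simply takes the minimum of the posterior mean on a dense grid instead of performing iterative optimization steps.

```python
import numpy as np

def sq_exp_kernel(a, b, length=1.0):
    """Squared exponential (RBF) covariance between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gpr_predict(x_train, y_train, x_query, length=1.0, noise=1e-6):
    """Posterior mean of a zero-mean GP with squared exponential kernel."""
    K = sq_exp_kernel(x_train, x_train, length) + noise * np.eye(len(x_train))
    Ks = sq_exp_kernel(x_query, x_train, length)
    return Ks @ np.linalg.solve(K, y_train)

# Surrogate minimisation of a toy "potential energy" curve.
x_train = np.linspace(-2.0, 2.0, 9)
y_train = (x_train - 0.5) ** 2          # true minimum at x = 0.5
grid = np.linspace(-2.0, 2.0, 401)
mean = gpr_predict(x_train, y_train, grid, length=0.8)
x_min = grid[np.argmin(mean)]           # minimum of the GPR surrogate
```

A production optimizer would also use the posterior variance and gradients, but the surrogate-minimum step above is the core of the approach.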

  11. Statistical learning from a regression perspective

    CERN Document Server

    Berk, Richard A

    2016-01-01

    This textbook considers statistical learning applications when interest centers on the conditional distribution of the response variable, given a set of predictors, and when it is important to characterize how the predictors are related to the response. As a first approximation, this can be seen as an extension of nonparametric regression. This fully revised new edition includes important developments over the past 8 years. Consistent with modern data analytics, it emphasizes that a proper statistical learning data analysis derives from sound data collection, intelligent data management, appropriate statistical procedures, and an accessible interpretation of results. A continued emphasis on the implications for practice runs through the text. Among the statistical learning procedures examined are bagging, random forests, boosting, support vector machines and neural networks. Response variables may be quantitative or categorical. As in the first edition, a unifying theme is supervised learning that can be trea...

  12. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  13. Remote monitoring demonstration

    International Nuclear Information System (INIS)

    Caskey, Susan; Olsen, John

    2006-01-01

    The recently upgraded remote monitoring system at the Joyo Experimental Reactor uses a DCM-14 camera module and GEMINI software. The final data is compatible both with the IAEA-approved GARS review software and the ALIS software that was used for this demonstration. Features of the remote monitoring upgrade emphasized compatibility with IAEA practice. This presentation gives particular attention to the selection process for meeting network security considerations at the O'arai site. The Joyo system is different from the NNCA's ACPF system, in that it emphasizes use of IAEA standard camera technology and data acquisition and transmission software. In the demonstration itself, a temporary virtual private network (VPN) between the meeting room and the server at Sandia in Albuquerque allowed attendees to observe data stored from routine transmissions from the Joyo Fresh Fuel Storage to Sandia. Image files from a fuel movement earlier in the month showed Joyo workers and IAEA inspectors carrying out a transfer. (author)

  14. Commercial incineration demonstration

    International Nuclear Information System (INIS)

    Borduin, L.C.; Neuls, A.S.

    1981-01-01

    Low-level radioactive wastes (LLW) generated by nuclear utilities presently are shipped to commercial burial grounds for disposal. Substantially increasing shipping and disposal charges have sparked renewed industry interest in incineration and other advanced volume reduction techniques as potential cost-saving measures. Repeated inquiries from industry sources regarding LLW applicability of the Los Alamos controlled-air incineration (CAI) design led DOE to initiate this commercial demonstration program in FY-1980. The selected program approach to achieving CAI demonstration at a utility site is a DOE sponsored joint effort involving Los Alamos, a nuclear utility, and a liaison subcontractor. Required development tasks and responsibilities of the participants are described. Target date for project completion is the end of FY-1985

  15. Regression models of reactor diagnostic signals

    International Nuclear Information System (INIS)

    Vavrin, J.

    1989-01-01

    The application of an autoregression model, the simplest regression model of diagnostic signals, is described for the experimental analysis of diagnostic systems and for in-service monitoring of normal and anomalous conditions and their diagnostics. A diagnostic method using a regression-type diagnostic data base and regression spectral diagnostics is described, as is the diagnostics of neutron noise signals from anomalous modes in the experimental fuel assembly of a reactor. (author)

  16. The Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Aguayo, Estanislao; Fast, James E.; Hoppe, Eric W.; Keillor, Martin E.; Kephart, Jeremy D.; Kouzes, Richard T.; LaFerriere, Brian D.; Merriman, Jason H.; Orrell, John L.; Overman, Nicole R.; Avignone, Frank T.; Back, Henning O.; Combs, Dustin C.; Leviner, L.; Young, A.; Barabash, Alexander S.; Konovalov, S.; Vanyushin, I.; Yumatov, Vladimir; Bergevin, M.; Chan, Yuen-Dat; Detwiler, Jason A.; Loach, J. C.; Martin, R. D.; Poon, Alan; Prior, Gersende; Vetter, Kai; Bertrand, F.; Cooper, R. J.; Radford, D. C.; Varner, R. L.; Yu, Chang-Hong; Boswell, M.; Elliott, S.; Gehman, Victor M.; Hime, Andrew; Kidd, M. F.; LaRoque, B. H.; Rielage, Keith; Ronquest, M. C.; Steele, David; Brudanin, V.; Egorov, Viatcheslav; Gusey, K.; Kochetov, Oleg; Shirchenko, M.; Timkin, V.; Yakushev, E.; Busch, Matthew; Esterline, James H.; Tornow, Werner; Christofferson, Cabot-Ann; Horton, Mark; Howard, S.; Sobolev, V.; Collar, J. I.; Fields, N.; Creswick, R.; Doe, Peter J.; Johnson, R. A.; Knecht, A.; Leon, Jonathan D.; Marino, Michael G.; Miller, M. L.; Robertson, R. G. H.; Schubert, Alexis G.; Wolfe, B. A.; Efremenko, Yuri; Ejiri, H.; Hazama, R.; Nomachi, Masaharu; Shima, T.; Finnerty, P.; Fraenkle, Florian; Giovanetti, G. K.; Green, M.; Henning, Reyco; Howe, M. A.; MacMullin, S.; Phillips, D.; Snavely, Kyle J.; Strain, J.; Vorren, Kris R.; Guiseppe, Vincente; Keller, C.; Mei, Dong-Ming; Perumpilly, Gopakumar; Thomas, K.; Zhang, C.; Hallin, A. L.; Keeter, K.; Mizouni, Leila; Wilkerson, J. F.

    2011-09-03

    A brief review of the history and neutrino physics of double beta decay is given. A description of the MAJORANA DEMONSTRATOR research and development program, including background reduction techniques, is presented in some detail. The application of point contact (PC) detectors to the experiment is discussed, including the effectiveness of pulse shape analysis. The predicted sensitivity of a PC detector array enriched to 86% in 76Ge is given.

  17. IGCC technology and demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Palonen, J [A. Ahlstrom Corporation, Karhula (Finland). Hans Ahlstrom Lab.; Lundqvist, R G [A. Ahlstrom Corporation, Helsinki (Finland); Staahl, K [Sydkraft AB, Malmoe (Sweden)

    1997-12-31

    Future energy production will be performed by advanced technologies that are more efficient, more environmentally friendly and less expensive than current technologies. Integrated gasification combined cycle (IGCC) power plants have been proposed as one of these systems. Utilising biofuels in future energy production will also be emphasised since this lowers substantially carbon dioxide emissions into the atmosphere due to the fact that biomass is a renewable form of energy. Combining advanced technology and biomass utilisation is for this reason something that should and will be encouraged. A. Ahlstrom Corporation of Finland and Sydkraft AB of Sweden have as one part of company strategies adopted this approach for the future. The companies have joined their resources in developing a biomass-based IGCC system with the gasification part based on pressurised circulating fluidized-bed technology. With this kind of technology electrical efficiency can be substantially increased compared to conventional power plants. As a first concrete step, a decision has been made to build a demonstration plant. This plant, located in Vaernamo, Sweden, has already been built and is now in commissioning and demonstration stage. The system comprises a fuel drying plant, a pressurised CFB gasifier with gas cooling and cleaning, a gas turbine, a waste heat recovery unit and a steam turbine. The plant is the first in the world where the integration of a pressurised gasifier with a gas turbine will be realised utilising a low calorific gas produced from biomass. The capacity of the Vaernamo plant is 6 MW of electricity and 9 MW of district heating. Technology development is in progress for design of plants of sizes from 20 to 120 MWe. The paper describes the Bioflow IGCC system, the Vaernamo demonstration plant and experiences from the commissioning and demonstration stages. (orig.)

  18. IGCC technology and demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Palonen, J. [A. Ahlstrom Corporation, Karhula (Finland). Hans Ahlstrom Lab.; Lundqvist, R.G. [A. Ahlstrom Corporation, Helsinki (Finland); Staahl, K. [Sydkraft AB, Malmoe (Sweden)

    1996-12-31

    Future energy production will be performed by advanced technologies that are more efficient, more environmentally friendly and less expensive than current technologies. Integrated gasification combined cycle (IGCC) power plants have been proposed as one of these systems. Utilising biofuels in future energy production will also be emphasised since this lowers substantially carbon dioxide emissions into the atmosphere due to the fact that biomass is a renewable form of energy. Combining advanced technology and biomass utilisation is for this reason something that should and will be encouraged. A. Ahlstrom Corporation of Finland and Sydkraft AB of Sweden have as one part of company strategies adopted this approach for the future. The companies have joined their resources in developing a biomass-based IGCC system with the gasification part based on pressurised circulating fluidized-bed technology. With this kind of technology electrical efficiency can be substantially increased compared to conventional power plants. As a first concrete step, a decision has been made to build a demonstration plant. This plant, located in Vaernamo, Sweden, has already been built and is now in commissioning and demonstration stage. The system comprises a fuel drying plant, a pressurised CFB gasifier with gas cooling and cleaning, a gas turbine, a waste heat recovery unit and a steam turbine. The plant is the first in the world where the integration of a pressurised gasifier with a gas turbine will be realised utilising a low calorific gas produced from biomass. The capacity of the Vaernamo plant is 6 MW of electricity and 9 MW of district heating. Technology development is in progress for design of plants of sizes from 20 to 120 MWe. The paper describes the Bioflow IGCC system, the Vaernamo demonstration plant and experiences from the commissioning and demonstration stages. (orig.)

  19. Waste and Disposal: Demonstration

    International Nuclear Information System (INIS)

    Neerdael, B.; Buyens, M.; De Bruyn, D.; Volckaert, G.

    2002-01-01

    Within the Belgian R and D programme on geological disposal, demonstration experiments have become increasingly important. In this contribution to the scientific report 2001, an overview is given of SCK-CEN's activities and achievements in the field of large-scale demonstration experiments. In 2001, main emphasis was on the PRACLAY project, which is a large-scale experiment to demonstrate the construction and the operation of a gallery for the disposal of HLW in a clay formation. The PRACLAY experiment will contribute to enhance understanding of water flow and mass transport in dense clay-based materials as well as to improve the design of the reference disposal concept. In the context of PRACLAY, a surface experiment (OPHELIE) has been developed to prepare and to complement PRACLAY-related experimental work in the HADES Underground Research Laboratory. In 2001, efforts were focussed on the operation of the OPHELIE mock-up. SCK-CEN also contributed to the SELFRAC project which studies the self-healing of fractures in a clay formation

  20. Normalization Ridge Regression in Practice I: Comparisons Between Ordinary Least Squares, Ridge Regression and Normalization Ridge Regression.

    Science.gov (United States)

    Bulcock, J. W.

    The problem of model estimation when the data are collinear was examined. Though ridge regression (RR) outperforms ordinary least squares (OLS) regression in the presence of acute multicollinearity, it is not a problem-free technique for reducing the variance of the estimates. It is a stochastic procedure when it should be nonstochastic and it…
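
The OLS-versus-ridge contrast under acute multicollinearity is easy to reproduce. This is a minimal sketch, assuming NumPy and synthetic data with two nearly collinear predictors; the ridge penalty k = 1 is an arbitrary illustrative choice, not a tuned value.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.5, size=n)

# OLS: (X'X)^{-1} X'y -- unstable when X'X is nearly singular
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: (X'X + kI)^{-1} X'y -- biased, but with much smaller variance
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(2), X.T @ y)
```

The ridge estimate stays close to the true coefficients (1, 1), while the individual OLS coefficients can take large offsetting values even though their sum is well determined.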

  1. Multivariate Regression Analysis and Slaughter Livestock,

    Science.gov (United States)

    (*AGRICULTURE, *ECONOMICS), (*MEAT, PRODUCTION), MULTIVARIATE ANALYSIS, REGRESSION ANALYSIS, ANIMALS, WEIGHT, COSTS, PREDICTIONS, STABILITY, MATHEMATICAL MODELS, STORAGE, BEEF, PORK, FOOD, STATISTICAL DATA, ACCURACY

  2. [From clinical judgment to linear regression model].

    Science.gov (United States)

    Palacios-Cruz, Lino; Pérez, Marcela; Rivas-Ruiz, Rodolfo; Talavera, Juan O

    2013-01-01

    When we think about mathematical models, such as the linear regression model, we think that these terms are only used by those engaged in research, a notion that is far from the truth. Legendre described the first mathematical model in 1805, and Galton introduced the formal term in 1886. Linear regression is one of the most commonly used regression models in clinical practice. It is useful to predict or show the relationship between two or more variables as long as the dependent variable is quantitative and has normal distribution. Stated in another way, the regression is used to predict a measure based on the knowledge of at least one other variable. Linear regression has as its first objective to determine the slope or inclination of the regression line: Y = a + bx, where "a" is the intercept or regression constant and it is equivalent to the "Y" value when "X" equals 0, and "b" (also called the slope) indicates the increase or decrease that occurs when the variable "x" increases or decreases in one unit. In the regression line, "b" is called the regression coefficient. The coefficient of determination (R²) indicates the importance of the independent variables in the outcome.
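
The quantities defined in the abstract (intercept a, slope or regression coefficient b, and the coefficient of determination R²) follow directly from the least-squares formulas. A minimal sketch, assuming NumPy and made-up data:

```python
import numpy as np

# Least-squares fit of Y = a + b*x: "a" is the value of Y when x = 0,
# and "b" is the change in Y per unit increase in x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

# Coefficient of determination R^2: the share of the variance in Y
# explained by the regression line.
y_hat = a + b * x
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

For this data the fitted line is approximately Y = 0.09 + 1.99x, with R² close to 1 because the points lie nearly on a line.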

  3. Geographically weighted regression and multicollinearity: dispelling the myth

    Science.gov (United States)

    Fotheringham, A. Stewart; Oshan, Taylor M.

    2016-10-01

    Geographically weighted regression (GWR) extends the familiar regression framework by estimating a set of parameters for any number of locations within a study area, rather than producing a single parameter estimate for each relationship specified in the model. Recent literature has suggested that GWR is highly susceptible to the effects of multicollinearity between explanatory variables and has proposed a series of local measures of multicollinearity as an indicator of potential problems. In this paper, we employ a controlled simulation to demonstrate that GWR is in fact very robust to the effects of multicollinearity. Consequently, the contention that GWR is highly susceptible to multicollinearity issues needs rethinking.

  4. Forecasting daily meteorological time series using ARIMA and regression models

    Science.gov (United States)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

    The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 in four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used the Box-Jenkins and Holt-Winters seasonal autoregressive integrated moving-average methods, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
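
Although the study used R, the simplest of the approaches it lists, time series regression on trend plus seasonal (Fourier) terms, fits by ordinary least squares. The sketch below uses NumPy on synthetic "daily" data; the series, period, and coefficients are invented for illustration.

```python
import numpy as np

n = 730                                    # two "years" of daily data
t = np.arange(n)
period = 365.0
rng = np.random.default_rng(0)
# Synthetic series: intercept + linear trend + one seasonal harmonic + noise
y = 10 + 0.01 * t + 5 * np.sin(2 * np.pi * t / period) + rng.normal(0, 1, n)

# Design matrix: intercept, trend, and one Fourier harmonic (sin, cos)
X = np.column_stack([
    np.ones(n),
    t,
    np.sin(2 * np.pi * t / period),
    np.cos(2 * np.pi * t / period),
])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ coef                          # in-sample fit
```

Extra harmonics (sin and cos at multiples of the base frequency) can be appended as further columns to capture sharper seasonal shapes, which is the role the Fourier external regressors play in the ARIMA variant.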

  5. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  6. Examination of influential observations in penalized spline regression

    Science.gov (United States)

    Türkan, Semra

    2013-10-01

    In parametric or nonparametric regression models, the results of regression analysis are affected by anomalous observations in the data set. Thus, detection of these observations is one of the major steps in regression analysis. These observations are precisely detected by well-known influence measures, of which Pena's statistic is one. In this study, Pena's approach is formulated for penalized spline regression in terms of ordinary residuals and leverages. Real and artificial data are used to illustrate the effectiveness of Pena's statistic relative to Cook's distance in detecting influential observations. The results of the study clearly reveal that the proposed measure is superior to Cook's distance for detecting these observations in large data sets.

  7. Demonstration of HITEX

    International Nuclear Information System (INIS)

    Morrison, H.D.; Woodall, K.B.

    1993-01-01

    A model reactor for HITEX successfully demonstrated the concept of high-temperature isotopic exchange in a closed loop simulating the conditions for fusion fuel cleanup. The catalyst of platinum on alumina pellets provided a surface area large enough to operate the reactor at 400 degrees Celsius with flow rates up to 2 L/min. A 15-L tank containing a mixture of 4% CD 4 in H 2 was depleted in deuterium within 75 minutes down to 100 ppm HD above the natural concentration of HD in the make-up hydrogen stream. The application to tritium removal from tritiated impurities in a hydrogen stream will work as well or better

  8. Visual Electricity Demonstrator

    Science.gov (United States)

    Lincoln, James

    2017-09-01

    The Visual Electricity Demonstrator (VED) is a linear diode array that serves as a dynamic alternative to an ammeter. A string of 48 red light-emitting diodes (LEDs) blink one after another to create the illusion of a moving current. Having the current represented visually builds an intuitive and qualitative understanding about what is happening in a circuit. In this article, I describe several activities for this device and explain how using this technology in the classroom can enhance the understanding and appreciation of physics.

  9. Exploration Medical System Demonstration

    Science.gov (United States)

    Rubin, D. A.; Watkins, S. D.

    2014-01-01

    BACKGROUND: Exploration class missions will present significant new challenges and hazards to the health of the astronauts. Regardless of the intended destination, beyond low Earth orbit a greater degree of crew autonomy will be required to diagnose medical conditions, develop treatment plans, and implement procedures due to limited communications with ground-based personnel. SCOPE: The Exploration Medical System Demonstration (EMSD) project will act as a test bed on the International Space Station (ISS) to demonstrate to crew and ground personnel that an end-to-end medical system can assist clinician and non-clinician crew members in optimizing medical care delivery and data management during an exploration mission. Challenges facing exploration mission medical care include limited resources, inability to evacuate to Earth during many mission phases, and potential rendering of medical care by non-clinicians. This system demonstrates the integration of medical devices and informatics tools for managing evidence and decision making and can be designed to assist crewmembers in nominal, non-emergent situations and in emergent situations when they may be suffering from performance decrements due to environmental, physiological or other factors. PROJECT OBJECTIVES: The objectives of the EMSD project are to: a. Reduce or eliminate the time required of an on-orbit crew and ground personnel to access, transfer, and manipulate medical data. b. Demonstrate that the on-orbit crew has the ability to access medical data/information via an intuitive and crew-friendly solution to aid in the treatment of a medical condition. c. Develop a common data management framework that can be ubiquitously used to automate repetitive data collection, management, and communications tasks for all activities pertaining to crew health and life sciences. d. Ensure crew access to medical data during periods of restricted ground communication. e. Develop a common data management framework that

  10. Demonstration tokamak power plant

    International Nuclear Information System (INIS)

    Abdou, M.; Baker, C.; Brooks, J.; Ehst, D.; Mattas, R.; Smith, D.L.; DeFreece, D.; Morgan, G.D.; Trachsel, C.

    1983-01-01

    A conceptual design for a tokamak demonstration power plant (DEMO) was developed. A large part of the study focused on examining the key issues and identifying the R and D needs for: (1) current drive for steady-state operation, (2) impurity control and exhaust, (3) tritium breeding blanket, and (4) reactor configuration and maintenance. Impurity control and exhaust will not be covered in this paper but is discussed in another paper in these proceedings, entitled Key Issues of FED/INTOR Impurity Control System

  11. Collaborative regression-based anatomical landmark detection

    International Nuclear Information System (INIS)

    Gao, Yaozong; Shen, Dinggang

    2015-01-01

    Anatomical landmark detection plays an important role in medical image analysis, e.g. for registration, segmentation and quantitative analysis. Among the various existing methods for landmark detection, regression-based methods have recently attracted much attention due to their robustness and efficiency. In these methods, landmarks are localised through voting from all image voxels, which is completely different from the classification-based methods that use voxel-wise classification to detect landmarks. Despite their robustness, the accuracy of regression-based landmark detection methods is often limited due to (1) the inclusion of uninformative image voxels in the voting procedure, and (2) the lack of effective ways to incorporate inter-landmark spatial dependency into the detection step. In this paper, we propose a collaborative landmark detection framework to address these limitations. The concept of collaboration is reflected in two aspects. (1) Multi-resolution collaboration. A multi-resolution strategy is proposed to hierarchically localise landmarks by gradually excluding uninformative votes from faraway voxels. Moreover, for informative voxels near the landmark, a spherical sampling strategy is also designed at the training stage to improve their prediction accuracy. (2) Inter-landmark collaboration. A confidence-based landmark detection strategy is proposed to improve the detection accuracy of ‘difficult-to-detect’ landmarks by using spatial guidance from ‘easy-to-detect’ landmarks. To evaluate our method, we conducted experiments extensively on three datasets for detecting prostate landmarks and head and neck landmarks in computed tomography images, and also dental landmarks in cone beam computed tomography images. The results show the effectiveness of our collaborative landmark detection framework in improving landmark detection accuracy, compared to other state-of-the-art methods. (paper)

  12. Impact of multicollinearity on small sample hydrologic regression models

    Science.gov (United States)

    Kroll, Charles N.; Song, Peter

    2013-06-01

Often hydrologic regression models are developed with ordinary least squares (OLS) procedures. The use of OLS with highly correlated explanatory variables produces multicollinearity, which creates highly sensitive parameter estimators with inflated variances and improper model selection. It is not clear how to best address multicollinearity in hydrologic regression models. Here a Monte Carlo simulation is developed to compare four techniques to address multicollinearity: OLS, OLS with variance inflation factor screening (VIF), principal component regression (PCR), and partial least squares regression (PLS). The performance of these four techniques was observed for varying sample sizes, correlation coefficients between the explanatory variables, and model error variances consistent with hydrologic regional regression models. The negative effects of multicollinearity are magnified at smaller sample sizes, higher correlations between the variables, and larger model error variances (smaller R2). The Monte Carlo simulation indicates that if the true model is known, multicollinearity is present, and the estimation and statistical testing of regression parameters are of interest, then PCR or PLS should be employed. If the model is unknown, or if the interest is solely on model predictions, it is recommended that OLS be employed since using more complicated techniques did not produce any improvement in model performance. A leave-one-out cross-validation case study was also performed using low-streamflow data sets from the eastern United States. Results indicate that OLS with stepwise selection generally produces models across study regions with varying levels of multicollinearity that are as good as biased regression techniques such as PCR and PLS.
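The simulation's central finding, that collinearity inflates OLS coefficient variance while a biased method such as PCR trades bias for stability, can be sketched in a few lines. This is an illustrative Monte Carlo on synthetic data, not the study's hydrologic data; the sample size, correlation, and one-component PCR are arbitrary choices.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def simulate(n=25, rho=0.95, sigma=1.0):
    # Two highly correlated explanatory variables, one response.
    x1 = rng.standard_normal(n)
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    X = np.column_stack([x1, x2])
    y = 1.0 * x1 + 1.0 * x2 + sigma * rng.standard_normal(n)
    return X, y

ols_b1, pcr_b1 = [], []
for _ in range(500):
    X, y = simulate()
    ols_b1.append(LinearRegression().fit(X, y).coef_[0])
    # PCR: regress y on the leading principal component only,
    # then map the coefficient back to the original variables.
    pca = PCA(n_components=1)
    scores = pca.fit_transform(X)
    beta_pc = LinearRegression().fit(scores, y).coef_
    pcr_b1.append((pca.components_.T @ beta_pc)[0])

print(np.var(ols_b1), np.var(pcr_b1))  # OLS slope variance is inflated
```

With rho = 0.95 the sampling variance of the OLS slope is much larger than the PCR slope's, at the cost of some bias in the PCR estimate, which mirrors the trade-off the abstract describes.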

  13. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...

  14. Inferring gene expression dynamics via functional regression analysis

    Directory of Open Access Journals (Sweden)

    Leng Xiaoyan

    2008-01-01

Full Text Available Abstract Background Temporal gene expression profiles characterize the time-dynamics of expression of specific genes and are increasingly collected in current gene expression experiments. In the analysis of experiments where gene expression is obtained over the life cycle, it is of interest to relate temporal patterns of gene expression associated with different developmental stages to each other to study patterns of long-term developmental gene regulation. We use tools from functional data analysis to study dynamic changes by relating temporal gene expression profiles of different developmental stages to each other. Results We demonstrate that functional regression methodology can pinpoint relationships that exist between temporal gene expression profiles for different life cycle phases and incorporates dimension reduction as needed for these high-dimensional data. By applying these tools, gene expression profiles for pupa and adult phases are found to be strongly related to the profiles of the same genes obtained during the embryo phase. Moreover, one can distinguish between gene groups that exhibit relationships with positive and others with negative associations between later life and embryonal expression profiles. Specifically, we find a positive relationship in expression for muscle development related genes, and a negative relationship for strictly maternal genes for Drosophila, using temporal gene expression profiles. Conclusion Our findings point to specific reactivation patterns of gene expression during the Drosophila life cycle which differ in characteristic ways between various gene groups. Functional regression emerges as a useful tool for relating gene expression patterns from different developmental stages, and avoids the problems with large numbers of parameters and multiple testing that affect alternative approaches.
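The core idea, reducing each phase's profiles to a few functional principal component scores and then regressing scores on scores, can be sketched with synthetic profiles. The data, the SVD-based FPCA, and the two-component truncation below are illustrative assumptions, not the paper's actual procedure or Drosophila data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy temporal expression profiles: 100 genes observed at 12 time points
# in an "embryo" phase and a later phase (entirely synthetic data).
genes = 100
t = np.linspace(0, 1, 12)
latent = rng.standard_normal(genes)                 # per-gene latent level
embryo = np.outer(latent, np.sin(np.pi * t)) + 0.1 * rng.standard_normal((genes, 12))
later = np.outer(0.8 * latent, np.cos(np.pi * t)) + 0.1 * rng.standard_normal((genes, 12))

def fpca_scores(X, k=2):
    """Functional PCA by SVD of the centered profiles (dimension reduction)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * S[:k]

E, A = fpca_scores(embryo), fpca_scores(later)
# Functional regression then reduces to regressing score vectors on score vectors.
B, *_ = np.linalg.lstsq(E, A, rcond=None)
r2 = 1 - np.sum((A - E @ B) ** 2) / np.sum((A - A.mean(0)) ** 2)
print(r2)   # strong relationship between embryo and later-phase scores
```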

  15. Regression and Sparse Regression Methods for Viscosity Estimation of Acid Milk From its SLS Features

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara; Skytte, Jacob Lercke; Nielsen, Otto Højager Attermann

    2012-01-01

Statistical solutions find widespread use in food and medicine quality control. We investigate the effect of different regression and sparse regression methods for a viscosity estimation problem using the spectro-temporal features from a new Sub-Surface Laser Scattering (SLS) vision system. From...... with sparse LAR, lasso and Elastic Net (EN) sparse regression methods. Due to the inconsistent measurement conditions, Locally Weighted Scatterplot Smoothing (Loess) has been employed to alleviate the undesired variation in the estimated viscosity. The experimental results of applying different methods show...

  16. Smart Grid Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Craig [National Rural Electric Cooperative Association, Arlington, VA (United States); Carroll, Paul [National Rural Electric Cooperative Association, Arlington, VA (United States); Bell, Abigail [National Rural Electric Cooperative Association, Arlington, VA (United States)

    2015-03-11

    The National Rural Electric Cooperative Association (NRECA) organized the NRECA-U.S. Department of Energy (DOE) Smart Grid Demonstration Project (DE-OE0000222) to install and study a broad range of advanced smart grid technologies in a demonstration that spanned 23 electric cooperatives in 12 states. More than 205,444 pieces of electronic equipment and more than 100,000 minor items (bracket, labels, mounting hardware, fiber optic cable, etc.) were installed to upgrade and enhance the efficiency, reliability, and resiliency of the power networks at the participating co-ops. The objective of this project was to build a path for other electric utilities, and particularly electrical cooperatives, to adopt emerging smart grid technology when it can improve utility operations, thus advancing the co-ops’ familiarity and comfort with such technology. Specifically, the project executed multiple subprojects employing a range of emerging smart grid technologies to test their cost-effectiveness and, where the technology demonstrated value, provided case studies that will enable other electric utilities—particularly electric cooperatives— to use these technologies. NRECA structured the project according to the following three areas: Demonstration of smart grid technology; Advancement of standards to enable the interoperability of components; and Improvement of grid cyber security. We termed these three areas Technology Deployment Study, Interoperability, and Cyber Security. Although the deployment of technology and studying the demonstration projects at coops accounted for the largest portion of the project budget by far, we see our accomplishments in each of the areas as critical to advancing the smart grid. All project deliverables have been published. Technology Deployment Study: The deliverable was a set of 11 single-topic technical reports in areas related to the listed technologies. Each of these reports has already been submitted to DOE, distributed to co-ops, and

  17. Impact of the Processes of Total Testicular Regression and Recrudescence on the Epididymal Physiology of the Bat Myotis nigricans (Chiroptera: Vespertilionidae).

    Directory of Open Access Journals (Sweden)

    Mateus R Beguelini

Full Text Available Myotis nigricans is a species of vespertilionid bat whose males show two periods of total testicular regression within the same annual reproductive cycle in the northwest of São Paulo State, Brazil. Studies have demonstrated that its epididymis has an elongation of the caudal portion, which stores spermatozoa during the period of testicular regression in July, but that it holds no sperm during the regression in November. Thus, the aim of this study was to analyze the impact of total testicular regression on epididymal morphophysiology and the patterns of its hormonal regulation. The results demonstrate continuous activity of the epididymis from the Active to the Regressing periods; a morphofunctional regression of the epididymis in the Regressed period; and a slow recrudescence process. Thus, we conclude that the processes of total testicular regression and posterior recrudescence undergone by M. nigricans also impact the physiology of the epididymis, but with a delay in the epididymal response. Epididymal physiology is regulated by testosterone and estrogen: through the production and secretion of testosterone by the testes; its conduction to the epididymis (mainly through luminal fluid); the conversion of testosterone to dihydrotestosterone by the 5α-reductase enzyme (mainly in epithelial cells) and to estrogen by aromatase; and through the activation/deactivation of the androgen receptor and estrogen receptor α in epithelial cells, which regulate epithelial cell morphophysiology, prevent cell death and regulate protein expression and secretion, ensuring the maturation and storage of the spermatozoa.

  18. IMPROVING CORRELATION FUNCTION FITTING WITH RIDGE REGRESSION: APPLICATION TO CROSS-CORRELATION RECONSTRUCTION

    International Nuclear Information System (INIS)

    Matthews, Daniel J.; Newman, Jeffrey A.

    2012-01-01

    Cross-correlation techniques provide a promising avenue for calibrating photometric redshifts and determining redshift distributions using spectroscopy which is systematically incomplete (e.g., current deep spectroscopic surveys fail to obtain secure redshifts for 30%-50% or more of the galaxies targeted). In this paper, we improve on the redshift distribution reconstruction methods from our previous work by incorporating full covariance information into our correlation function fits. Correlation function measurements are strongly covariant between angular or spatial bins, and accounting for this in fitting can yield substantial reduction in errors. However, frequently the covariance matrices used in these calculations are determined from a relatively small set (dozens rather than hundreds) of subsamples or mock catalogs, resulting in noisy covariance matrices whose inversion is ill-conditioned and numerically unstable. We present here a method of conditioning the covariance matrix known as ridge regression which results in a more well behaved inversion than other techniques common in large-scale structure studies. We demonstrate that ridge regression significantly improves the determination of correlation function parameters. We then apply these improved techniques to the problem of reconstructing redshift distributions. By incorporating full covariance information, applying ridge regression, and changing the weighting of fields in obtaining average correlation functions, we obtain reductions in the mean redshift distribution reconstruction error of as much as ∼40% compared to previous methods. We provide a description of POWERFIT, an IDL code for performing power-law fits to correlation functions with ridge regression conditioning that we are making publicly available.
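The conditioning step itself is simple: add a constant to the diagonal of the noisy covariance estimate before inverting it, which bounds the smallest eigenvalue away from zero. A minimal numpy sketch of the idea (synthetic covariance, arbitrary ridge parameter; this is not the POWERFIT code):

```python
import numpy as np

rng = np.random.default_rng(1)

# True covariance of correlation-function measurements in 10 angular bins.
nbins, nmocks = 10, 12          # few mocks => a noisy covariance estimate
true_cov = 0.5 ** np.abs(np.subtract.outer(np.arange(nbins), np.arange(nbins)))
L = np.linalg.cholesky(true_cov)
samples = (L @ rng.standard_normal((nbins, nmocks))).T
noisy_cov = np.cov(samples, rowvar=False)   # noisy, near-singular estimate

def ridge_condition(cov, lam):
    """Ridge conditioning: add a small constant to the diagonal before inversion."""
    return cov + lam * np.eye(cov.shape[0])

print(np.linalg.cond(noisy_cov))                      # ill-conditioned
print(np.linalg.cond(ridge_condition(noisy_cov, 0.1)))  # better behaved
```

Since the sample covariance is positive semi-definite, adding lam > 0 shifts every eigenvalue upward and strictly reduces the condition number, making the subsequent chi-square fit numerically stable.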

  19. RAWS II: A MULTIPLE REGRESSION ANALYSIS PROGRAM,

    Science.gov (United States)

This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program...of preprocessed data, the directed retention of variables, listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)

  20. Hierarchical regression analysis in structural Equation Modeling

    NARCIS (Netherlands)

    de Jong, P.F.

    1999-01-01

    In a hierarchical or fixed-order regression analysis, the independent variables are entered into the regression equation in a prespecified order. Such an analysis is often performed when the extra amount of variance accounted for in a dependent variable by a specific independent variable is the main

  1. Categorical regression dose-response modeling

    Science.gov (United States)

The goal of this training is to provide participants with training on the use of the U.S. EPA’s Categorical Regression software (CatReg) and its application to risk assessment. Categorical regression fits mathematical models to toxicity data that have been assigned ord...

  2. Variable importance in latent variable regression models

    NARCIS (Netherlands)

    Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.

    2014-01-01

    The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable

  3. Stepwise versus Hierarchical Regression: Pros and Cons

    Science.gov (United States)

    Lewis, Mitzi

    2007-01-01

    Multiple regression is commonly used in social and behavioral data analysis. In multiple regression contexts, researchers are very often interested in determining the "best" predictors in the analysis. This focus may stem from a need to identify those predictors that are supportive of theory. Alternatively, the researcher may simply be interested…

  4. Gibrat’s law and quantile regressions

    DEFF Research Database (Denmark)

    Distante, Roberta; Petrella, Ivan; Santoro, Emiliano

    2017-01-01

    The nexus between firm growth, size and age in U.S. manufacturing is examined through the lens of quantile regression models. This methodology allows us to overcome serious shortcomings entailed by linear regression models employed by much of the existing literature, unveiling a number of important...

  5. Regression Analysis and the Sociological Imagination

    Science.gov (United States)

    De Maio, Fernando

    2014-01-01

    Regression analysis is an important aspect of most introductory statistics courses in sociology but is often presented in contexts divorced from the central concerns that bring students into the discipline. Consequently, we present five lesson ideas that emerge from a regression analysis of income inequality and mortality in the USA and Canada.

  6. Principles of Quantile Regression and an Application

    Science.gov (United States)

    Chen, Fang; Chalhoub-Deville, Micheline

    2014-01-01

    Newer statistical procedures are typically introduced to help address the limitations of those already in practice or to deal with emerging research needs. Quantile regression (QR) is introduced in this paper as a relatively new methodology, which is intended to overcome some of the limitations of least squares mean regression (LMR). QR is more…

  7. ON REGRESSION REPRESENTATIONS OF STOCHASTIC-PROCESSES

    NARCIS (Netherlands)

    RUSCHENDORF, L; DEVALK, [No Value

    We construct a.s. nonlinear regression representations of general stochastic processes (X(n))n is-an-element-of N. As a consequence we obtain in particular special regression representations of Markov chains and of certain m-dependent sequences. For m-dependent sequences we obtain a constructive

  8. Bayesian Regression of Thermodynamic Models of Redox Active Materials

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, Katherine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

Finding a suitable functional redox material is a critical challenge to achieving scalable, economically viable technologies for storing concentrated solar energy in the form of a defected oxide. Demonstrating effectiveness for thermal storage or solar fuel is largely accomplished by using a thermodynamic model derived from experimental data. The purpose of this project is to test the accuracy of our regression model on representative data sets. Determining the accuracy of the model includes parameter fitting the model to the data, comparing the model using different numbers of parameters, and analyzing the entropy and enthalpy calculated from the model. Three data sets were considered in this project: two demonstrating materials for solar fuels by water splitting and the other of a material for thermal storage. Using Bayesian inference and Markov Chain Monte Carlo (MCMC), parameter estimation was performed on the three data sets. Good results were achieved, except for some deviations at the edges of the data input ranges. The evidence values were then calculated in a variety of ways and used to compare models with different numbers of parameters. It was believed that at least one of the parameters was unnecessary, and comparing evidence values demonstrated that the parameter was needed on one data set and not significantly helpful on another. The entropy was calculated by taking the derivative in one variable and integrating over another, and its uncertainty was also calculated by evaluating the entropy over multiple MCMC samples. Afterwards, all the parts were written up as a tutorial for the Uncertainty Quantification Toolkit (UQTk).
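The parameter-estimation machinery described here, Bayesian inference via MCMC, can be illustrated with a toy Metropolis sampler on a one-parameter model. The model, data, noise level, and step size below are invented for illustration and are unrelated to the thermodynamic models or to UQTk.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic "experimental" data with one unknown parameter (a slope).
x = np.linspace(0, 1, 30)
y = 2.5 * x + 0.1 * rng.standard_normal(30)

def log_post(theta):
    # Flat prior; Gaussian likelihood with known noise sigma = 0.1.
    return -0.5 * np.sum((y - theta * x) ** 2) / 0.1 ** 2

# Metropolis MCMC: random-walk proposals, accept with posterior ratio.
samples, theta = [], 0.0
for _ in range(5000):
    prop = theta + 0.05 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

post = np.array(samples[1000:])     # discard burn-in
print(post.mean(), post.std())      # posterior mean near 2.5, small spread
```

Evidence computation and posterior uncertainty propagation (as done for the entropy in the abstract) are built on exactly this kind of sample set.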

  9. Pathological assessment of liver fibrosis regression

    Directory of Open Access Journals (Sweden)

    WANG Bingqiong

    2017-03-01

    Full Text Available Hepatic fibrosis is the common pathological outcome of chronic hepatic diseases. An accurate assessment of fibrosis degree provides an important reference for a definite diagnosis of diseases, treatment decision-making, treatment outcome monitoring, and prognostic evaluation. At present, many clinical studies have proven that regression of hepatic fibrosis and early-stage liver cirrhosis can be achieved by effective treatment, and a correct evaluation of fibrosis regression has become a hot topic in clinical research. Liver biopsy has long been regarded as the gold standard for the assessment of hepatic fibrosis, and thus it plays an important role in the evaluation of fibrosis regression. This article reviews the clinical application of current pathological staging systems in the evaluation of fibrosis regression from the perspectives of semi-quantitative scoring system, quantitative approach, and qualitative approach, in order to propose a better pathological evaluation system for the assessment of fibrosis regression.

  10. Should metacognition be measured by logistic regression?

    Science.gov (United States)

    Rausch, Manuel; Zehetleitner, Michael

    2017-03-01

    Are logistic regression slopes suitable to quantify metacognitive sensitivity, i.e. the efficiency with which subjective reports differentiate between correct and incorrect task responses? We analytically show that logistic regression slopes are independent from rating criteria in one specific model of metacognition, which assumes (i) that rating decisions are based on sensory evidence generated independently of the sensory evidence used for primary task responses and (ii) that the distributions of evidence are logistic. Given a hierarchical model of metacognition, logistic regression slopes depend on rating criteria. According to all considered models, regression slopes depend on the primary task criterion. A reanalysis of previous data revealed that massive numbers of trials are required to distinguish between hierarchical and independent models with tolerable accuracy. It is argued that researchers who wish to use logistic regression as measure of metacognitive sensitivity need to control the primary task criterion and rating criteria. Copyright © 2017 Elsevier Inc. All rights reserved.
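To make the measure concrete, here is a hedged sketch: simulate trials whose confidence carries information about accuracy, then fit a logistic regression of accuracy on the rating; the slope is the candidate sensitivity measure the paper scrutinizes. The generative model below (logistic rating noise, fixed rating bins) is one of the assumptions the paper examines, not a recommendation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

n = 2000
correct = rng.integers(0, 2, n)           # primary-task accuracy (0/1)
# Confidence evidence: higher on correct trials, with logistic noise.
conf = 0.8 * correct + rng.logistic(size=n)
rating = np.digitize(conf, [-1, 0, 1, 2])  # discretize to a 5-point scale

model = LogisticRegression().fit(rating.reshape(-1, 1), correct)
print(model.coef_[0][0])   # slope: larger => ratings better separate correct/incorrect
```

Under the hierarchical alternative the paper considers, this slope would also shift with the observer's rating criteria, which is exactly why criterion control matters.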

  11. Fuel Cell Demonstration Program

    Energy Technology Data Exchange (ETDEWEB)

    Gerald Brun

    2006-09-15

    In an effort to promote clean energy projects and aid in the commercialization of new fuel cell technologies the Long Island Power Authority (LIPA) initiated a Fuel Cell Demonstration Program in 1999 with six month deployments of Proton Exchange Membrane (PEM) non-commercial Beta model systems at partnering sites throughout Long Island. These projects facilitated significant developments in the technology, providing operating experience that allowed the manufacturer to produce fuel cells that were half the size of the Beta units and suitable for outdoor installations. In 2001, LIPA embarked on a large-scale effort to identify and develop measures that could improve the reliability and performance of future fuel cell technologies for electric utility applications and the concept to establish a fuel cell farm (Farm) of 75 units was developed. By the end of October of 2001, 75 Lorax 2.0 fuel cells had been installed at the West Babylon substation on Long Island, making it the first fuel cell demonstration of its kind and size anywhere in the world at the time. Designed to help LIPA study the feasibility of using fuel cells to operate in parallel with LIPA's electric grid system, the Farm operated 120 fuel cells over its lifetime of over 3 years including 3 generations of Plug Power fuel cells (Lorax 2.0, Lorax 3.0, Lorax 4.5). Of these 120 fuel cells, 20 Lorax 3.0 units operated under this Award from June 2002 to September 2004. In parallel with the operation of the Farm, LIPA recruited government and commercial/industrial customers to demonstrate fuel cells as on-site distributed generation. From December 2002 to February 2005, 17 fuel cells were tested and monitored at various customer sites throughout Long Island. The 37 fuel cells operated under this Award produced a total of 712,635 kWh. As fuel cell technology became more mature, performance improvements included a 1% increase in system efficiency. Including equipment, design, fuel, maintenance

  12. An application of robust ridge regression model in the presence of outliers to real data problem

    Science.gov (United States)

    Shariff, N. S. Md.; Ferdaos, N. A.

    2017-09-01

Multicollinearity and outliers often lead to inconsistent and unreliable parameter estimates in regression analysis. The well-known procedure that is robust to the multicollinearity problem is the ridge regression method. This method, however, is believed to be affected by the presence of outliers. The combination of GM-estimation and the ridge parameter that is robust towards both problems is of interest in this study. As such, both techniques are employed to investigate the relationship between stock market price and macroeconomic variables in Malaysia, since the data set involves both multicollinearity and outlier problems. There are four macroeconomic factors selected for this study, which are the Consumer Price Index (CPI), Gross Domestic Product (GDP), Base Lending Rate (BLR) and Money Supply (M1). The results demonstrate that the proposed procedure is able to produce reliable results in the presence of multicollinearity and outliers in the real data.
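The combination described, ridge shrinkage against multicollinearity plus outlier-downweighting in the spirit of GM-estimation, can be sketched as iteratively reweighted ridge with Huber weights. This is a simplified stand-in for the paper's GM-estimator, run on synthetic data rather than the Malaysian macroeconomic series.

```python
import numpy as np

rng = np.random.default_rng(3)

# Correlated regressors with a few gross outliers in y.
n = 100
x1 = rng.standard_normal(n)
x2 = 0.95 * x1 + 0.3 * rng.standard_normal(n)
X = np.column_stack([x1, x2])
y = 2.0 * x1 + 1.0 * x2 + 0.5 * rng.standard_normal(n)
y[:5] += 15.0                      # inject outliers

def robust_ridge(X, y, lam=1.0, n_iter=20, c=1.345):
    """Iteratively reweighted ridge with Huber weights (a simple sketch,
    not the GM-estimator of the paper)."""
    beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 or 1.0        # robust scale (MAD)
        u = r / (s * c)
        w = np.where(np.abs(u) <= 1, 1.0, 1.0 / np.abs(u))  # Huber weights
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X + lam * np.eye(X.shape[1]),
                               X.T @ W @ y)
    return beta

plain = np.linalg.solve(X.T @ X + np.eye(2), X.T @ y)   # ordinary ridge
robust = robust_ridge(X, y)
print(plain, robust)   # robust fit stays near the true (2, 1)
```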

  13. Fusion-power demonstration

    International Nuclear Information System (INIS)

    Henning, C.D.; Logan, B.G.; Carlson, G.A.; Neef, W.S.; Moir, R.W.; Campbell, R.B.; Botwin, R.; Clarkson, I.R.; Carpenter, T.J.

    1983-01-01

    As a satellite to the MARS (Mirror Advanced Reactor Study) a smaller, near-term device has been scoped, called the FPD (Fusion Power Demonstration). Envisioned as the next logical step toward a power reactor, it would advance the mirror fusion program beyond MFTF-B and provide an intermediate step toward commercial fusion power. Breakeven net electric power capability would be the goal such that no net utility power would be required to sustain the operation. A phased implementation is envisioned, with a deuterium checkout first to verify the plasma systems before significant neutron activation has occurred. Major tritium-related facilities would be installed with the second phase to produce sufficient fusion power to supply the recirculating power to maintain the neutral beams, ECRH, magnets and other auxiliary equipment

  14. Spent fuel pyroprocessing demonstration

    International Nuclear Information System (INIS)

    McFarlane, L.F.; Lineberry, M.J.

    1995-01-01

    A major element of the shutdown of the US liquid metal reactor development program is managing the sodium-bonded spent metallic fuel from the Experimental Breeder Reactor-II to meet US environmental laws. Argonne National Laboratory has refurbished and equipped an existing hot cell facility for treating the spent fuel by a high-temperature electrochemical process commonly called pyroprocessing. Four products will be produced for storage and disposal. Two high-level waste forms will be produced and qualified for disposal of the fission and activation products. Uranium and transuranium alloys will be produced for storage pending a decision by the US Department of Energy on the fate of its plutonium and enriched uranium. Together these activities will demonstrate a unique electrochemical treatment technology for spent nuclear fuel. This technology potentially has significant economic and technical advantages over either conventional reprocessing or direct disposal as a high-level waste option

  15. Industrial demonstration trials

    International Nuclear Information System (INIS)

    Gelee, M.; Fabre, C.; Villepoix, R. de; Fra, J.; Le Foulgoc, L.; Morel, Y.; Querite, P.; Roques, R.

    1975-01-01

Prototypes of the plant components, meeting the specifications set by the process and built by industrial firms in collaboration with the supervisor and the C.E.A., are subjected to trial runs on the UF6 test bench of the Pierrelatte testing zone. These items of equipment (diffuser, compressor, exchanger) are placed in an industrial operating context very similar to that of an enrichment plant. Their performance is measured within a broad region around the working point and their reliability observed over periods of up to several tens of thousands of hours. Between 1969 and 1973 six industrial demonstration test benches were built, marking the stages in the technical preparation of the 1973 file on the basis of which the decision to build was taken by Eurodif [fr

  16. Fusion Power Demonstration III

    International Nuclear Information System (INIS)

    Lee, J.D.

    1985-07-01

This is the third in the series of reports covering the Fusion Power Demonstration (FPD) design study. This volume considers the FPD-III configuration that incorporates an octopole end plug. As compared with the quadrupole end-plugged designs of FPD-I and FPD-II, this octopole configuration reduces the number of end cell magnets and shortens the minimum ignition length of the central cell. The end-cell plasma length is also reduced, which in turn reduces the size and cost of the end cell magnets and shielding. As a continuation in the series of documents covering the FPD, this report does not stand alone as a design description of FPD-III. Design details of FPD-III subsystems that do not differ significantly from those of the FPD-II configuration are not duplicated in this report

  17. TPA device for demonstration

    International Nuclear Information System (INIS)

    1980-02-01

The TPA (torus plasma for amateurs) is a small race-track type device made by the technical service division to demonstrate basic properties of plasma, such as electron temperature, conductivity, the effect of a helical field on toroidal drift, and the shape of the plasma in mirror and cusp magnetic fields in the linear section. The plasmas are produced by RF discharge (-500W) and/or DC discharge (-30 mA) within a glass discharge tube. The major radius is 50 cm, the length of the linear section is 50 cm, and the toroidal magnetic field is 200 gauss. The device has been designed to be compact, with only a 100 V power source (-3.2 KW for the case without a helical field), and to have a fully automatic sequence of operation. (author)

  18. Fusion power demonstration

    International Nuclear Information System (INIS)

    Henning, C.D.; Logan, B.G.

    1983-01-01

    As a satellite to the MARS (Mirror Advanced Reactor Study) a smaller, near-term device has been scoped, called the FPD (Fusion Power Demonstration). Envisioned as the next logical step toward a power reactor, it would advance the mirror fusion program beyond MFTF-B and provide an intermediate step toward commercial fusion power. Breakeven net electric power capability would be the goal such that no net utility power would be required to sustain the operation. A phased implementation is envisioned, with a deuterium checkout first to verify the plasma systems before significant neutron activation has occurred. Major tritium-related facilities would be installed with the second phase to produce sufficient fusion power to supply the recirculating power to maintain the neutral beams, ECRH, magnets and other auxiliary equipment

  19. Dynamic wall demonstration project

    Energy Technology Data Exchange (ETDEWEB)

    Nakatsui, L.; Mayhew, W.

    1990-12-01

    The dynamic wall concept is a ventilation strategy that can be applied to a single family dwelling. With suitable construction, outside air can be admitted through the exterior walls of the house to the interior space to function as ventilation air. The construction and performance monitoring of a demonstration house built to test the dynamic wall concept in Sherwood Park, Alberta, is described. The project had the objectives of demonstrating and assessing the construction methods; determining the cost-effectiveness of the concept in Alberta; analyzing the operation of the dynamic wall system; and determining how other components and systems in the house interact with the dynamic wall. The exterior wall construction consisted of vinyl siding, spun-bonded polyolefin-backed (SBPO) rigid fiberglass sheathing, 38 mm by 89 mm framing, fiberglass batt insulation and 12.7 mm drywall. The mechanical system was designed to operate in the dynamic (negative pressure) mode, however flexibility was provided to allow operation in the static (balanced pressure) mode to permit monitoring of the walls as if they were in a conventional house. The house was monitored by an extensive computerized monitoring system. Dynamic wall operation was dependent on pressure and temperature differentials between indoor and outdoor as well as wind speed and direction. The degree of heat gain was found to be ca 74% of the indoor-outdoor temperature differential. Temperature of incoming dynamic air was significantly affected by solar radiation and measurement of indoor air pollutants found no significant levels. 4 refs., 34 figs., 11 tabs.

  20. Application of principal component regression and partial least squares regression in ultraviolet spectrum water quality detection

    Science.gov (United States)

    Li, Jiangtong; Luo, Yongdao; Dai, Honglin

    2018-01-01

Water is the source of life and the essential foundation of all life. With the development of industrialization, the phenomenon of water pollution is becoming more and more frequent, which directly affects human survival and development. Water quality detection is one of the necessary measures to protect water resources. Ultraviolet (UV) spectral analysis is an important research method in the field of water quality detection, in which the partial least squares regression (PLSR) analysis method has become the predominant technology; however, in some special cases, PLSR analysis produces considerable errors. In order to solve this problem, the traditional principal component regression (PCR) analysis method was improved by using the principle of PLSR in this paper. The experimental results show that for some special experimental data sets, the improved PCR analysis method performs better than PLSR. PCR and PLSR are the focus of this paper. Firstly, principal component analysis (PCA) is performed in MATLAB to reduce the dimensionality of the spectral data; on the basis of a large number of experiments, the optimized principal components are extracted by using the principle of PLSR, carrying most of the original data information. Secondly, linear regression analysis of the principal components is carried out with the Statistical Package for the Social Sciences (SPSS), from which the coefficients and relations of the principal components can be obtained. Finally, the same water spectral data set is calculated by both PLSR and improved PCR, and the two results are analyzed and compared: improved PCR and PLSR are similar for most data, but improved PCR is better than PLSR for data near the detection limit. Both PLSR and improved PCR can be used in ultraviolet spectral analysis of water, but for data near the detection limit, the improved PCR result is better than that of PLSR.

  1. Characteristics and Properties of a Simple Linear Regression Model

    Directory of Open Access Journals (Sweden)

    Kowal Robert

    2016-12-01

    Full Text Available The simple linear regression model is one of the pillars of classic econometrics. Despite the passage of time, it continues to attract interest both on the theoretical side and on the application side. Referring to the first of these aspects, one of the many fundamental questions about the model concerns determining its derived characteristics and studying their properties. The literature provides several classic solutions in that regard. In this paper, a completely new approach is proposed, based on the direct application of variance and its properties resulting from the non-correlation of certain estimators with the mean; within this framework, some fundamental dependencies among the model characteristics are obtained in a much more compact manner. The apparatus allows a simple, uniform, and intuitive demonstration of multiple dependencies and fundamental properties of the model. The results were obtained in a classic, traditional area, where everything, as it might seem, has already been thoroughly studied and discovered.
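
    The non-correlation property the abstract alludes to can be checked numerically. A minimal Monte Carlo sketch, assuming the classical model y = 2 + 3x + noise with a fixed design (all values illustrative): the OLS slope estimator should be uncorrelated with the sample mean of y.

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0.0, 1.0, 20)            # fixed design
sxx = np.sum((x - x.mean()) ** 2)

slopes, ymeans = [], []
for _ in range(20000):
    y = 2.0 + 3.0 * x + rng.normal(size=x.size)
    slopes.append(np.sum((x - x.mean()) * y) / sxx)  # OLS slope estimator
    ymeans.append(y.mean())                          # sample mean of y

corr = np.corrcoef(slopes, ymeans)[0, 1]
print(f"corr(slope_hat, y_bar) = {corr:.4f}")        # near zero
```

    Analytically, Cov(b̂₁, ȳ) is proportional to Σ(xᵢ − x̄), which is zero by construction; the simulation merely illustrates this.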

  2. Research and analyze of physical health using multiple regression analysis

    Directory of Open Access Journals (Sweden)

    T. S. Kyi

    2014-01-01

    Full Text Available This paper presents research aimed at creating a mathematical model of "healthy people" using regression analysis. The factors are physical parameters of the person (such as heart rate, lung capacity, blood pressure, breath holding, weight-height coefficient, flexibility of the spine, muscles of the shoulder belt, abdominal muscles, squatting, etc.), and the response variable is an indicator of physical working capacity. Multiple regression analysis yielded useful models that can predict the physical performance of boys aged fourteen to seventeen. The paper presents the development of the regression model for sixteen-year-old boys and analyzes the results.
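
    A multiple-regression model of this kind can be sketched as follows. The predictors echo three of the physical parameters listed in the abstract, but the data and coefficients are synthetic, not the study's measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 100
heart_rate = rng.normal(70, 8, n)        # beats per minute
lung_capacity = rng.normal(4.0, 0.5, n)  # litres
breath_holding = rng.normal(45, 10, n)   # seconds

# Hypothetical "physical working capacity" response with made-up coefficients
capacity = (120 - 0.4 * heart_rate + 15 * lung_capacity
            + 0.3 * breath_holding + rng.normal(0, 2, n))

X = np.column_stack([heart_rate, lung_capacity, breath_holding])
model = LinearRegression().fit(X, capacity)
print("coefficients:", model.coef_)
print("R^2:", model.score(X, capacity))
```
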

  3. Radiation regression patterns after cobalt plaque insertion for retinoblastoma

    International Nuclear Information System (INIS)

    Buys, R.J.; Abramson, D.H.; Ellsworth, R.M.; Haik, B.

    1983-01-01

    An analysis of 31 eyes of 30 patients who had been treated with cobalt plaques for retinoblastoma disclosed that a type I radiation regression pattern developed in 15 patients; type II, in one patient; and type III, in five patients. Nine patients had a regression pattern characterized by complete destruction of the tumor, the surrounding choroid, and all of the vessels in the area into which the plaque was inserted. This resulting white scar, corresponding to the sclera only, was classified as a type IV radiation regression pattern. There was no evidence of tumor recurrence in patients with type IV regression patterns, with an average follow-up of 6.5 years after cobalt plaque therapy. Twenty-nine of these 30 patients had been unsuccessfully treated with at least one other modality (i.e., light coagulation, cryotherapy, external beam radiation, or chemotherapy)

  4. Radiation regression patterns after cobalt plaque insertion for retinoblastoma

    Energy Technology Data Exchange (ETDEWEB)

    Buys, R.J.; Abramson, D.H.; Ellsworth, R.M.; Haik, B.

    1983-08-01

    An analysis of 31 eyes of 30 patients who had been treated with cobalt plaques for retinoblastoma disclosed that a type I radiation regression pattern developed in 15 patients; type II, in one patient; and type III, in five patients. Nine patients had a regression pattern characterized by complete destruction of the tumor, the surrounding choroid, and all of the vessels in the area into which the plaque was inserted. This resulting white scar, corresponding to the sclera only, was classified as a type IV radiation regression pattern. There was no evidence of tumor recurrence in patients with type IV regression patterns, with an average follow-up of 6.5 years after cobalt plaque therapy. Twenty-nine of these 30 patients had been unsuccessfully treated with at least one other modality (i.e., light coagulation, cryotherapy, external beam radiation, or chemotherapy).

  5. FY 1992 report on the results of the demonstration test on the methanol conversion at oil-fired power plant. Demonstration test on a methanol reformation type power generation total system; 1992 nendo sekiyu karyoku hatsudensho metanoru tenkan tou jissho shiken. Metanoru kaishitsu gata hatsuden total system jissho shiken

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-03-01

    To promote the introduction of methanol at oil-fired power plants, an operational study was conducted, based on the results of the element studies, of a 1,000 kW class total-system plant combining each of the elements; the FY 1992 results are summarized here. In the operational study, operational data were sampled for each of the simple and regeneration cycles of liquid methanol and the simple and regeneration cycles of gaseous methanol. As to the reformed gas/water injection/regeneration cycle, all functions of the total-system plant worked normally, and it was confirmed that operation in this cycle is possible. In addition, the following were conducted: a bench-scale confirmation test of the performance of the developmental catalyst used in the operational study, trial operation for gas turbine adjustment and combustion studies such as performance tests in each cycle, manufacture and study of the catalyst for the total system, and a study of the longevity of that catalyst. (NEDO)

  6. Fiscal 1999 research result report on energy and environment technology demonstration research support project (International joint demonstration research project). Improvement of long-distant power transmission efficiency and reliability, and its environmental impact assessment; 1999 nendo chokyori soden hoshiki ni kansuru soden koritsu to soden shinraido no kojo oyobi kankyo eno eikyo hyoka seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    Japan-Russia international joint research was conducted on overhead ultrahigh-voltage DC power transmission lines, aiming at transmission-loss reduction and reliability improvement, and on an optimum international power-supply cable system, considering energy saving and environmental conservation. Using the European-Russian DC power transmission line of ±500 kV and 4 GW as a model, a comparison was made between a design using Russian round strands and glass insulators and a design using Japanese low-loss wires and insulators. As for reliability improvement in cold districts, Russian design techniques for tower structure and ice loading were found reasonable for counteracting galloping oscillation and ice load. In evaluating and selecting optimum underground and marine cables, cable specifications were studied using the Turkey-Russia power transmission route as a model. The environmental assessment of these cables showed that the XLPE cable under development is optimum. (NEDO)

  7. Regression modeling of ground-water flow

    Science.gov (United States)

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. The text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of the basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to give the student practice with nearly all the methods presented for modeling and statistical analysis. Three computer programs implement the more complex methods: a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium; a program to calculate a measure of model nonlinearity with respect to the regression parameters; and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)
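
    As a small taste of nonlinear regression in groundwater flow (not one of the three programs described above), one can fit the classical Theis drawdown solution to pumping-test observations. The well parameters and observations below are synthetic and purely illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import exp1  # exponential integral E1, i.e. the Theis well function W(u)

Q, r = 0.01, 50.0  # assumed pumping rate (m^3/s) and observation distance (m)

def theis_drawdown(t, T, S):
    """Drawdown s(t) at distance r from a well pumping at constant rate Q."""
    u = r ** 2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Synthetic "observations" from true T = 1e-3 m^2/s, S = 1e-4, plus 2% noise
t_obs = np.logspace(2, 5, 30)  # seconds
rng = np.random.default_rng(2)
s_obs = theis_drawdown(t_obs, 1e-3, 1e-4) * (1 + 0.02 * rng.normal(size=30))

# Nonlinear least squares recovers transmissivity T and storativity S
(T_fit, S_fit), _ = curve_fit(theis_drawdown, t_obs, s_obs, p0=(5e-4, 5e-5))
print(f"T = {T_fit:.2e} m^2/s, S = {S_fit:.2e}")
```
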

  8. Development of a computer program to support an efficient non-regression test of a thermal-hydraulic system code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jun Yeob; Jeong, Jae Jun [School of Mechanical Engineering, Pusan National University, Busan (Korea, Republic of); Suh, Jae Seung [System Engineering and Technology Co., Daejeon (Korea, Republic of); Kim, Kyung Doo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    During the development process of a thermal-hydraulic system code, a non-regression test (NRT) must be performed repeatedly in order to prevent software regression. The NRT process, however, is time-consuming and labor-intensive. Thus, automation of this process is an ideal solution. In this study, we have developed a program to support an efficient NRT for the SPACE code and demonstrated its usability. This results in a high degree of efficiency for code development. The program was developed using the Visual Basic for Applications and designed so that it can be easily customized for the NRT of other computer codes.
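
    The core of such an NRT harness is comparing new code outputs against stored baselines. A minimal sketch in Python (the actual tool was written in Visual Basic for Applications; the variable names and tolerance here are illustrative, not those of the SPACE code):

```python
import math

def compare_results(baseline, current, rel_tol=1e-6):
    """Return the variables whose current values drifted beyond rel_tol."""
    return [key for key in baseline
            if not math.isclose(baseline[key],
                                current.get(key, float("nan")),
                                rel_tol=rel_tol)]

baseline = {"peak_pressure": 15.51e6, "clad_temp": 620.4}
current  = {"peak_pressure": 15.51e6, "clad_temp": 621.9}  # drifted result

drifted = compare_results(baseline, current)
print("regressed variables:", drifted)  # -> ['clad_temp']
```

    A real harness would also run each test case, collect its outputs, and report the diffs per case; the comparison step above is the part that catches a regression automatically.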

  9. A Demonstration of Lusail

    KAUST Repository

    Mansour, Essam; Abdelaziz, Ibrahim; Ouzzani, Mourad; Aboulnaga, Ashraf; Kalnis, Panos

    2017-01-01

    There has been a proliferation of datasets available as interlinked RDF data accessible through SPARQL endpoints. This has led to the emergence of various applications in life science, distributed social networks, and the Internet of Things that need to integrate data from multiple endpoints. We will demonstrate Lusail, a system that supports the need of emerging applications to access tens to hundreds of geo-distributed datasets. Lusail is a geo-distributed graph engine for querying linked RDF data. Lusail delivers outstanding performance using (i) a novel locality-aware query decomposition technique that minimizes the intermediate data to be accessed by the subqueries, and (ii) selectivity awareness and parallel query execution to reduce network latency and increase parallelism. During the demo, the audience will be able to query actually deployed RDF endpoints as well as large synthetic and real benchmarks that we have deployed in the public cloud. The demo will also show that Lusail outperforms state-of-the-art systems by orders of magnitude in terms of scalability and response time.

  10. A Demonstration of Lusail

    KAUST Repository

    Mansour, Essam

    2017-05-10

    There has been a proliferation of datasets available as interlinked RDF data accessible through SPARQL endpoints. This has led to the emergence of various applications in life science, distributed social networks, and the Internet of Things that need to integrate data from multiple endpoints. We will demonstrate Lusail, a system that supports the need of emerging applications to access tens to hundreds of geo-distributed datasets. Lusail is a geo-distributed graph engine for querying linked RDF data. Lusail delivers outstanding performance using (i) a novel locality-aware query decomposition technique that minimizes the intermediate data to be accessed by the subqueries, and (ii) selectivity awareness and parallel query execution to reduce network latency and increase parallelism. During the demo, the audience will be able to query actually deployed RDF endpoints as well as large synthetic and real benchmarks that we have deployed in the public cloud. The demo will also show that Lusail outperforms state-of-the-art systems by orders of magnitude in terms of scalability and response time.

  11. Tidd PFBC demonstration project

    Energy Technology Data Exchange (ETDEWEB)

    Marrocco, M. [American Electric Power, Columbus, OH (United States)

    1997-12-31

    The Tidd project was one of the first joint government-industry ventures to be approved by the US Department of Energy (DOE) in its Clean Coal Technology Program. In March 1987, DOE signed an agreement with the Ohio Power Company, a subsidiary of American Electric Power, to refurbish the then-idle Tidd plant on the banks of the Ohio River with advanced pressurized fluidized bed technology. Testing ended after 49 months of operation, 100 individual tests, and the generation of more than 500,000 megawatt-hours of electricity. The demonstration plant has met its objectives. The project showed that more than 95 percent of sulfur dioxide pollutants could be removed inside the advanced boiler using the advanced combustion technology, giving future power plants an attractive alternative to expensive, add-on scrubber technology. In addition to its sulfur removal effectiveness, the plant's sustained periods of steady-state operation boosted its availability significantly above design projections, heightening confidence that pressurized fluidized bed technology will be a reliable, baseload technology for future power plants. The technology also controlled the release of nitrogen oxides to levels well below the allowable limits set by federal air quality standards. It also produced a dry waste product that is much easier to handle than wastes from conventional power plants and will likely have commercial value when produced by future power plants.

  12. Demonstration of the Safety and Feasibility of Robotically Assisted Percutaneous Coronary Intervention in Complex Coronary Lesions: Results of the CORA-PCI Study (Complex Robotically Assisted Percutaneous Coronary Intervention).

    Science.gov (United States)

    Mahmud, Ehtisham; Naghi, Jesse; Ang, Lawrence; Harrison, Jonathan; Behnamfar, Omid; Pourdjabbar, Ali; Reeves, Ryan; Patel, Mitul

    2017-07-10

    in the robotic group (42:59 ± 26:14 min:s with R-PCI vs. 34:01 ± 17:14 min:s with M-PCI; p = 0.007), although clinical success remained similar (98.8% with R-PCI vs. 100% with M-PCI; p = 1.00). This study demonstrates the feasibility, safety, and high technical success of R-PCI for the treatment of complex coronary disease. Furthermore, comparable clinical outcomes, without an adverse effect on stent use or fluoroscopy time, were observed with R-PCI and M-PCI. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  13. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  14. Fiscal 2000 report on result of international joint demonstrative development of photovoltaic power generation system. Demonstrative research on photovoltaic power generation system interconnection system (Myanmar); 2000 nendo taiyoko hatsuden system kokusai kyodo jissho kaihatsu seika hokokusho. Taiyoko hatsuden keito renkei system jissho kenkyu (Myanmar)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Joint research was conducted with Myanmar on a photovoltaic power generation system interconnection system; the fiscal 2000 results are described in this paper. Power generating facilities were set up consisting of 80 kW photovoltaic, 40 kW wind, and 60 kW diesel systems. With the photovoltaic and wind power generation connected to a small-scale power system with a ballast load, the system interconnection is formed through load-adjusting equipment such as storage batteries. The hybrid system supplies power from 6:00 in the morning until 23:00 at night; the diesel power generation is free from restrictions. The operating method was based on system control (demand-side management) through adjustment on the load side, with ballast-load control that adjustably operates an ice-machine load. The basic design specified a storage battery capacity of 1,000 Ah and an ice-machine load of 32 kW, with the daytime load assumed to be 25% of the night load. Based on the equipment specifications set in this basic design, arrangement design was conducted for the equipment on the premises, together with a land development plan and a basic construction plan covering a temporary work site, construction steps, and transportation and delivery of the equipment. Solar radiation and wind data were observed continuously. (NEDO)

  15. A SAS-macro for estimation of the cumulative incidence using Poisson regression

    DEFF Research Database (Denmark)

    Waltoft, Berit Lindum

    2009-01-01

    the hazard rates, and the hazard rates are often estimated by the Cox regression. This procedure may not be suitable for large studies due to limited computer resources. Instead one uses Poisson regression, which approximates the Cox regression. Rosthøj et al. presented a SAS-macro for the estimation...... of the cumulative incidences based on the Cox regression. I present the functional form of the probabilities and variances when using piecewise constant hazard rates and a SAS-macro for the estimation using Poisson regression. The use of the macro is demonstrated through examples and compared to the macro presented...

  16. Characterization of AVHRR global cloud detection sensitivity based on CALIPSO-CALIOP cloud optical thickness information: demonstration of results based on the CM SAF CLARA-A2 climate data record

    Science.gov (United States)

    Karlsson, Karl-Göran; Håkansson, Nina

    2018-02-01

    The sensitivity in detecting thin clouds of the cloud screening method being used in the CM SAF cloud, albedo and surface radiation data set from AVHRR data (CLARA-A2) cloud climate data record (CDR) has been evaluated using cloud information from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) onboard the CALIPSO satellite. The sensitivity, including its global variation, has been studied based on collocations of Advanced Very High Resolution Radiometer (AVHRR) and CALIOP measurements over a 10-year period (2006-2015). The cloud detection sensitivity has been defined as the minimum cloud optical thickness for which 50 % of clouds could be detected, with the global average sensitivity estimated to be 0.225. After using this value to reduce the CALIOP cloud mask (i.e. clouds with optical thickness below this threshold were interpreted as cloud-free cases), cloudiness results were found to be basically unbiased over most of the globe except over the polar regions where a considerable underestimation of cloudiness could be seen during the polar winter. The overall probability of detecting clouds in the polar winter could be as low as 50 % over the highest and coldest parts of Greenland and Antarctica, showing that a large fraction of optically thick clouds also remains undetected here. The study included an in-depth analysis of the probability of detecting a cloud as a function of the vertically integrated cloud optical thickness as well as of the cloud's geographical position. Best results were achieved over oceanic surfaces at mid- to high latitudes where at least 50 % of all clouds with an optical thickness down to a value of 0.075 were detected. Corresponding cloud detection sensitivities over land surfaces outside of the polar regions were generally larger than 0.2 with maximum values of approximately 0.5 over the Sahara and the Arabian Peninsula. For polar land surfaces the values were close to 1 or higher with maximum values of 4.5 for the parts

  17. Kinesthetic Transverse Wave Demonstration

    Science.gov (United States)

    Pantidos, Panagiotis; Patapis, Stamatis

    2005-09-01

    This is a variation on the String and Sticky Tape demonstration "The Wave Game," suggested by Ron Edge. A group of students stand side by side, each one holding a card chest high with both hands. The teacher cues the first student to begin raising and lowering his card. When he starts lowering his card, the next student begins to raise his. As succeeding students move their cards up and down, a wave such as that shown in the figure is produced. To facilitate the process, students' motions were synchronized with the ticks of a metronome (without such synchronization it was nearly impossible to generate a satisfactory wave). Our waves typically had a frequency of about 1 Hz and a wavelength of around 3 m. We videotaped the activity so that the students could analyze the motions. The (17-year-old) students had not received any prior instruction regarding wave motion and did not know beforehand the nature of the exercise they were about to carry out. During the activity they were asked what a transverse wave is. Most of them quickly realized, without teacher input, that while the wave propagated horizontally, the only motion of the transmitting medium (them) was vertical. They located the equilibrium points of the oscillations, the crests and troughs of the waves, and identified the wavelength. The teacher defined for them the period of the oscillations of the motion of a card to be the total time for one cycle. The students measured this time and then several asserted that it was the same as the wave period. Knowing the length of the waves and the number of waves per second, the next step can easily be to find the wave speed.
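
    The closing calculation the students are led to can be made explicit. Using the classroom values quoted above (about 1 Hz and about 3 m), the wave speed follows from v = f · λ:

```python
# Classroom values quoted in the text (approximate)
frequency = 1.0   # Hz: each card completes about one up-down cycle per second
wavelength = 3.0  # m: measured along the line of students

speed = frequency * wavelength  # v = f * lambda
print(f"wave speed = {speed:.1f} m/s")  # -> 3.0 m/s
```
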

  18. Multilayer perceptron for robust nonlinear interval regression analysis using genetic algorithms.

    Science.gov (United States)

    Hu, Yi-Chung

    2014-01-01

    On the basis of fuzzy regression, computational intelligence models such as neural networks can be applied to nonlinear interval regression analysis to deal with uncertain and imprecise data. When training data are not contaminated by outliers, such models perform well, including almost all of the given training data within the data interval. Since training data are often corrupted by outliers, however, robust learning algorithms that resist outliers in interval regression analysis have been an interesting area of research. Several computational-intelligence approaches are effective at resisting outliers, but their required parameters depend on whether or not the collected data contain outliers, and it seems difficult to prespecify the degree of contamination beforehand. This paper therefore uses a multilayer perceptron to construct a robust nonlinear interval regression model trained by a genetic algorithm. Outliers beyond or beneath the data interval impose only a slight effect on the determination of the data interval. Simulation results demonstrate that the proposed method performs well for contaminated datasets.

  19. Composite marginal quantile regression analysis for longitudinal adolescent body mass index data.

    Science.gov (United States)

    Yang, Chi-Chuan; Chen, Yi-Hau; Chang, Hsing-Yi

    2017-09-20

    Childhood and adolescent overweight or obesity, which may be quantified through the body mass index (BMI), is strongly associated with adult obesity and other health problems. Motivated by the child and adolescent behaviors in long-term evolution (CABLE) study, we are interested in individual, family, and school factors associated with marginal quantiles of longitudinal adolescent BMI values. We propose a new method for composite marginal quantile regression analysis for longitudinal outcome data, which performs marginal quantile regressions at multiple quantile levels simultaneously. The proposed method extends the quantile regression coefficient modeling method introduced by Frumento and Bottai (Biometrics 2016; 72:74-84) to longitudinal data, accounting suitably for the correlation structure in longitudinal observations. A goodness-of-fit test for the proposed modeling is also developed. Simulation results show that the proposed method can be much more efficient than the analysis without taking correlation into account and the analysis performing separate quantile regressions at different quantile levels. The application to the longitudinal adolescent BMI data from the CABLE study demonstrates the practical utility of our proposal. Copyright © 2017 John Wiley & Sons, Ltd.

  20. Simultaneous Force Regression and Movement Classification of Fingers via Surface EMG within a Unified Bayesian Framework.

    Science.gov (United States)

    Baldacchino, Tara; Jacobs, William R; Anderson, Sean R; Worden, Keith; Rowson, Jennifer

    2018-01-01

    This contribution presents a novel methodology for myoelectric-based control using surface electromyographic (sEMG) signals recorded during finger movements. A multivariate Bayesian mixture of experts (MoE) model is introduced which provides a powerful method for modeling force regression at the fingertips, while also performing finger movement classification as a by-product of the modeling algorithm. Bayesian inference of the model allows uncertainties to be naturally incorporated into the model structure. The method is tested using data from the publicly released NinaPro database, which consists of sEMG recordings of 6 degree-of-freedom force activations for 40 intact subjects. The results demonstrate that the MoE model achieves performance similar to the benchmark set by the authors of NinaPro for finger force regression. Additionally, inherent to the Bayesian framework is the inclusion of uncertainty in the model parameters, naturally providing confidence bounds on the force regression predictions. Furthermore, the integrated clustering step allows a detailed investigation into classification of the finger movements without incurring any extra computational effort. Subsequently, a systematic assessment of the number of electrodes needed for accurate control is performed via sensitivity analysis techniques. A slight degradation in regression performance is observed for a reduced number of electrodes, while classification performance is unaffected.