WorldWideScience

Sample records for grounded empirically tested

  1. Empirical Test Case Specification

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    This document includes the empirical specification for the IEA task of evaluating building energy simulation computer programs for Double Skin Facade (DSF) constructions. Two approaches are involved in this procedure: one comparative and one empirical. In the comparative approach the outcomes of different software tools are compared, while in the empirical approach the modelling results are compared with the results of experimental test cases....

  2. Hot Ground Vibration Tests

    Data.gov (United States)

    National Aeronautics and Space Administration — Ground vibration tests or modal surveys are routinely conducted to support flutter analysis for subsonic and supersonic vehicles. However, vibration testing...

  3. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Tingleff, Ellen B.

    2015-01-01

    Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend not only to give answers as to how an analysis is conducted, but also to provide the answer as to why it is conducted in the way that it is. The problem, however, may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirically Testing Thematic Analysis (ETTA), a step-by-step approach to thematic text analysis, discussing its strengths and weaknesses so that others might assess its potential as an approach they might utilize or develop for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material....

  4. Final Empirical Test Case Specification

    DEFF Research Database (Denmark)

    Kalyanova, Olena; Heiselberg, Per

    This document includes the empirical specification for the IEA task of evaluating building energy simulation computer programs for Double Skin Facade (DSF) constructions. Two approaches are involved in this procedure: one comparative and one empirical....

  5. 2012 Ground Testing Highlights

    Science.gov (United States)

    Buchholz, Steven J.

    2012-01-01

    As part of the Fundamental Aeronautics Program and a collaborative effort with Boeing and Lockheed Martin, a series of sonic boom tests was completed this past year in the NASA Ames Unitary Plan Wind Tunnel (UPWT). One of the goals was to develop new test techniques and hardware for measuring sonic boom signatures in the transonic and supersonic regimes. Data for various model designs and configurations were collected and will be used to validate CFD predictions of sonic boom signatures. Reactivation of the NASA Ames Mitsubishi compressor system was completed this past year. The compressor is intended to replace and augment the existing UPWT Clark Compressor as the primary Make-Up Air (MUA) source. The MUA system provides air and vacuum pumping capability to the Ames UPWT. It will improve productivity and reliability of the UPWT as a vital testing and research facility for the U.S. aerospace industry and NASA. Funding for this task was provided by the American Recovery and Reinvestment Act (ARRA). Installation and validation of a Noncontact Stress Monitoring System (NSMS) for the 3-stage compressor was completed at the 11-Foot Transonic Wind Tunnel. The system, originally developed at AEDC, consists of 36 pairs of LED light sources with optic-beam send and receive probes, along with a 1-per-rev signal. The new system allows for continuous monitoring and recording of compressor blade bending and torsion stress during normal test operations. A very unusual test was completed in the 11-Foot TWT to acquire aerodynamic and flow field data for the Crew Exploration Vehicle (CEV) Parachute Assembly System (CPAS) to validate CFD methods and tools. Surface pressure distribution measurements and velocity measurements in the wake of the command module back to the drogue parachute location were acquired. Testing methods included Particle Image Velocimetry (PIV), Pressure Sensitive Paint (PSP), Schlieren, Infrared Imaging (IR), boundary layer surveys, and skin friction measurements.

  6. Empirical recurrence rates for ground motion signals on planetary surfaces

    Science.gov (United States)

    Lorenz, Ralph D.; Panning, Mark

    2018-03-01

    We determine the recurrence rates of ground motion events as a function of sensed velocity amplitude at several terrestrial locations, and make a first interplanetary comparison with measurements on the Moon, Mars, Venus and Titan. This empirical approach gives an intuitive order-of-magnitude guide to the observed ground motion (including both tectonic and ocean- and atmosphere-forced signals) of these locations as a guide to instrument expectations on future missions, without invoking interior models and specific sources: for example, a Venera-14 observation of possible ground motion indicates a microseismic environment midway between noisy and quiet terrestrial locations. Quiet terrestrial regions see a peak velocity amplitude in mm/s roughly equal to 0.3*N^(-0.7), where N is the number of "events" (half-hour intervals in which a given peak ground motion is exceeded) observed per year. The Apollo data show endogenous seismic signals for a given recurrence rate that are typically about 10,000 times smaller in amplitude than at a quiet site on Earth, although local thermally-induced moonquakes are much more common. Viking data masked for low-wind periods appear comparable with a quiet terrestrial site, whereas a Venera observation of microseisms suggests ground motion more similar to a more active terrestrial location. Recurrence rate plots from in-situ measurements provide a context for seismic instrumentation on future planetary missions, e.g. to guide formulation of data compression schemes. While even small geophones can discriminate terrestrial activity rates, observations with guidance accelerometers are typically too insensitive to provide meaningful constraints (i.e. a non-zero number of "events") on actual ground motion observations unless operated for very long periods.
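    A minimal sketch (not from the paper) of how the quoted empirical scaling for quiet terrestrial sites could be evaluated; the function name and the example N values are illustrative only:

    ```python
    # Hypothetical helper evaluating the relation quoted above for quiet terrestrial
    # regions: peak velocity (mm/s) ~ 0.3 * N**(-0.7), where N is the number of
    # half-hour "events" per year in which that peak ground motion is exceeded.
    def peak_velocity_mm_per_s(events_per_year: float) -> float:
        return 0.3 * events_per_year ** -0.7

    for n in (1, 10, 100, 1000):
        print(f"N = {n:>4} events/yr -> ~{peak_velocity_mm_per_s(n):.4f} mm/s")
    ```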

  7. 2011 Ground Testing Highlights Article

    Science.gov (United States)

    Ross, James C.; Buchholz, Steven J.

    2011-01-01

    Two tests supporting development of the launch abort system for the Orion Multi-Purpose Crew Vehicle were run in the NASA Ames Unitary Plan Wind Tunnel last year. The first test used a fully metric model to examine the stability and controllability of the Launch Abort Vehicle during potential abort scenarios for Mach numbers ranging from 0.3 to 2.5. The aerodynamic effects of the Abort Motor and Attitude Control Motor plumes were simulated using high-pressure air flowing through independent paths. The aerodynamic effects of proximity to the launch vehicle during the early moments of an abort were simulated with a remotely actuated Service Module that allowed the position relative to the Crew Module to be varied appropriately. The second test simulated the acoustic environment around the Launch Abort Vehicle caused by the plumes from the 400,000-pound-thrust, solid-fueled Abort Motor. To obtain the proper acoustic characteristics of the hot rocket plumes for the flight vehicle, heated helium was used. A custom helium supply system was developed for the test, consisting of two jumbo high-pressure helium trailers, a twelve-tube accumulator, and a 13 MW gas-fired heater borrowed from the Propulsion Simulation Laboratory at NASA Glenn Research Center. The test provided fluctuating surface pressure measurements at over 200 points on the vehicle surface that have now been used to define the ground-testing requirements for the Orion Launch Abort Vehicle.

  8. Psychological Models of Art Reception must be Empirically Grounded

    DEFF Research Database (Denmark)

    Nadal, Marcos; Vartanian, Oshin; Skov, Martin

    2017-01-01

    We commend Menninghaus et al. for tackling the role of negative emotions in art reception. However, their model suffers from shortcomings that reduce its applicability to empirical studies of the arts: poor use of evidence, lack of integration with other models, and limited derivation of testable hypotheses. We argue that theories about art experiences should be based on empirical evidence....

  9. An Empirical Study of Atmospheric Correction Procedures for Regional Infrasound Amplitudes with Ground Truth.

    Science.gov (United States)

    Howard, J. E.

    2014-12-01

    This study focuses on improving methods of accounting for atmospheric effects on infrasound amplitudes observed on arrays at regional distances in the southwestern United States. Recordings at ranges of 150 to nearly 300 km from a repeating ground truth source of small HE explosions are used. The explosions range in actual weight from approximately 2000-4000 lbs. and are detonated year-round, which provides signals for a wide range of atmospheric conditions. Three methods of correcting the observed amplitudes for atmospheric effects are investigated with the data set. The first corrects amplitudes for upper stratospheric wind, as developed by Mutschlecner and Whitaker (1999), and uses the average wind speed between 45-55 km altitude in the direction of propagation to derive an empirical correction formula. This approach was developed using large chemical and nuclear explosions and is tested here with the smaller explosions, for which shorter wavelengths cause the energy to be scattered by the smaller-scale structure of the atmosphere. The second approach is a semi-empirical method using ray tracing to determine wind speed at ray turning heights, where these wind estimates replace the wind values in the existing formula. Finally, parabolic equation (PE) modeling is used to predict the amplitudes at the arrays at 1 Hz. The PE amplitudes are compared to the observed amplitudes with a narrow band filter centered at 1 Hz. An analysis is performed of the conditions under which the empirical and semi-empirical methods fail and full wave methods must be used.
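    A hedged sketch of the first (Mutschlecner and Whitaker-style) wind correction described above, assuming a log-linear dependence on the 45-55 km directional wind average; the coefficient K and all function names are placeholders, not values from the study:

    ```python
    import numpy as np

    def directional_wind_average(alt_km, wind_speed, wind_azimuth_deg, prop_azimuth_deg):
        """Mean wind component (m/s) along the propagation azimuth, 45-55 km altitude."""
        alt_km, wind_speed = np.asarray(alt_km), np.asarray(wind_speed)
        mask = (alt_km >= 45.0) & (alt_km <= 55.0)
        component = wind_speed[mask] * np.cos(np.radians(np.asarray(wind_azimuth_deg)[mask] - prop_azimuth_deg))
        return float(component.mean())

    def wind_corrected_amplitude(observed_amplitude, v_d, k=0.02):
        """Remove an assumed log-linear wind effect: log10(A_corr) = log10(A_obs) - K * v_d.

        K = 0.02 is a placeholder coefficient, not the value derived in the study."""
        return observed_amplitude * 10.0 ** (-k * v_d)
    ```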

  10. Empirical and pragmatic adequacy of grounded theory: Advancing nurse empowerment theory for nurses' practice.

    Science.gov (United States)

    Udod, Sonia A; Racine, Louise

    2017-12-01

    To draw on the findings of a grounded theory study aimed at exploring how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership because of the dominance of quantitative approaches to research. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research as it provides a fine and detailed analysis of the process underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory by Corbin and Strauss as a method for analysing and solving problems in nurses' practice is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful to address real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice. The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that

  11. Advanced Testing Method for Ground Thermal Conductivity

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiaobing [ORNL]; Clemenzi, Rick [Geothermal Design Center Inc.]; Liu, Su [University of Tennessee (UT)]

    2017-04-01

    A new method is developed that can quickly and more accurately determine the effective ground thermal conductivity (GTC) based on thermal response test (TRT) results. Ground thermal conductivity is an important parameter for sizing ground heat exchangers (GHEXs) used by geothermal heat pump systems. The conventional GTC test method usually requires a TRT for 48 hours with a very stable electric power supply throughout the entire test. In contrast, the new method reduces the required test time by 40%–60% or more, and it can determine GTC even with an unstable or intermittent power supply. Consequently, it can significantly reduce the cost of GTC testing and increase its use, which will enable optimal design of geothermal heat pump systems. Further, this new method provides more information about the thermal properties of the GHEX and the ground than previous techniques. It can verify the installation quality of GHEXs and has the potential, if developed, to characterize the heterogeneous thermal properties of the ground formation surrounding the GHEXs.
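    For context, a minimal sketch of the conventional line-source analysis of a thermal response test that the new method improves on (this is not the ORNL method itself); the function name and the 10-hour early-time cutoff are assumptions:

    ```python
    import numpy as np

    def conductivity_from_trt(time_s, mean_fluid_temp_c, heat_rate_w, borehole_length_m,
                              t_min_s=10 * 3600):
        """Effective ground thermal conductivity (W/m-K) from the late-time slope of T vs ln(t).

        Line-source model: T(t) ~ (q / (4*pi*k)) * ln(t) + const, with q in W per metre of bore.
        Early-time (borehole-dominated) data are discarded via t_min_s."""
        q = heat_rate_w / borehole_length_m
        time_s = np.asarray(time_s, dtype=float)
        temp = np.asarray(mean_fluid_temp_c, dtype=float)
        mask = time_s >= t_min_s
        slope, _ = np.polyfit(np.log(time_s[mask]), temp[mask], 1)
        return q / (4.0 * np.pi * slope)
    ```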

  12. Stewardship and risk: An empirically grounded theory of organic fish farming in Scotland

    NARCIS (Netherlands)

    Georgakopoulos, G.; Ciancanelli, P.; Coulson, A.; Kaldis, P.E.

    2008-01-01

    It has long been assumed ownership gives farmers incentives to act as stewards of the land. On this basis, quasi-property rights are granted to fish farmers to encourage them to manage risks to the aquatic environment. This paper offers an empirically grounded theorisation of fish farmers’

  13. Coping with the Stigma of Mental Illness: Empirically-Grounded Hypotheses from Computer Simulations

    Science.gov (United States)

    Kroska, Amy; Har, Sarah K.

    2011-01-01

    This research demonstrates how affect control theory and its computer program, "Interact", can be used to develop empirically-grounded hypotheses regarding the connection between cultural labels and behaviors. Our demonstration focuses on propositions in the modified labeling theory of mental illness. According to the MLT, negative societal…

  14. Single-event effect ground test issues

    International Nuclear Information System (INIS)

    Koga, R.

    1996-01-01

    Ground-based single event effect (SEE) testing of microcircuits permits characterization of device susceptibility to various radiation induced disturbances, including: (1) single event upset (SEU) and single event latchup (SEL) in digital microcircuits; (2) single event gate rupture (SEGR), and single event burnout (SEB) in power transistors; and (3) bit errors in photonic devices. These characterizations can then be used to generate predictions of device performance in the space radiation environment. This paper provides a general overview of ground-based SEE testing and examines in critical depth several underlying conceptual constructs relevant to the conduct of such tests and to the proper interpretation of results. These more traditional issues are contrasted with emerging concerns related to the testing of modern, advanced microcircuits

  15. Where Are the Grounds for Grounded Theory? A Troubled Empirical Methodology Meets Wittgenstein

    Science.gov (United States)

    James, Fiona

    2018-01-01

    This article provides a critical exposition of the epistemological underpinnings of a recent redevelopment of Grounded Theory (GT) methodology, "Constructivist" GT. Although proffered as freed from the "objectivist" tenets of the original version, critical examination exposes the essentialism threaded through its integral…

  16. Empirical Evaluation of Directional-Dependence Tests

    Science.gov (United States)

    Thoemmes, Felix

    2015-01-01

    Testing of directional dependence is a method to infer causal direction that recently has attracted some attention. Previous examples by e.g. von Eye and DeShon (2012a) and extensive simulation studies by Pornprasertmanit and Little (2012) have demonstrated that under specific assumptions, directional-dependence tests can recover the true causal…

  17. Ground test for vibration control demonstrator

    Science.gov (United States)

    Meyer, C.; Prodigue, J.; Broux, G.; Cantinaud, O.; Poussot-Vassal, C.

    2016-09-01

    With the objective of maximizing comfort in Falcon jets, Dassault Aviation is developing an innovative vibration control technology. Vibrations of the structure are measured at several locations and sent to a dedicated high-performance vibration control computer. Control laws implemented in this computer analyse the vibrations in real time and then compute commands that are sent to the existing control surfaces to counteract the vibrations. After detailing the technology principles, this paper focuses on the vibration control ground demonstration that was performed by Dassault Aviation in May 2015 on a Falcon 7X business jet. The goal of this test was to attenuate vibrations resulting from fixed forced excitation delivered by shakers. The ground test demonstrated the capability to implement an efficient closed-loop vibration control with a significant vibration level reduction and validated the vibration control law design methodology. This successful ground test was a prerequisite before the flight test demonstration that is now being prepared. This study has been partly supported by the JTI CleanSky SFWA-ITD.

  18. The discovered preference hypothesis - an empirical test

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Ladenburg, Jacob; Olsen, Søren Bøye

    Using stated preference methods for valuation of non-market goods is known to be vulnerable to a range of biases. Some authors claim that these so-called anomalies in effect render the methods useless for the purpose. However, the Discovered Preference Hypothesis (DPH), as put forth by Plott [31], offers an interpretation and explanation of biases which entails that the stated preference methods need not be completely written off. In this paper we conduct a test of the validity and relevance of the DPH interpretation of biases. In a choice experiment concerning preferences for protection of Danish nature areas... as respondents evaluate more and more choice sets. This finding supports the Discovered Preference Hypothesis interpretation and explanation of starting point bias....

  19. Empirical Ground Motion Characterization of Induced Seismicity in Alberta and Oklahoma

    Science.gov (United States)

    Novakovic, M.; Atkinson, G. M.; Assatourians, K.

    2017-12-01

    We develop empirical ground-motion prediction equations (GMPEs) for ground motions from induced earthquakes in Alberta and Oklahoma following the stochastic-model-based method of Atkinson et al. (2015 BSSA). The Oklahoma ground-motion database is compiled from over 13,000 small to moderate seismic events (M 1 to 5.8) recorded at 1600 seismic stations, at distances from 1 to 750 km. The Alberta database is compiled from over 200 small to moderate seismic events (M 1 to 4.2) recorded at 50 regional stations, at distances from 30 to 500 km. A generalized inversion is used to solve for regional source, attenuation and site parameters. The obtained parameters describe the regional attenuation, stress parameter and site amplification. Resolving these parameters allows for the derivation of regionally-calibrated GMPEs that can be used to compare ground motion observations between waste water injection (Oklahoma) and hydraulic fracture induced events (Alberta), and further compare induced observations with ground motions resulting from natural sources (California, NGAWest2). The derived GMPEs have applications for the evaluation of hazards from induced seismicity and can be used to track amplitudes across the regions in real time, which is useful for ground-motion-based alerting systems and traffic light protocols.
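    A hedged sketch of the kind of generalized inversion mentioned above, in which log amplitudes are decomposed into event, attenuation, and station terms; the functional form, parameter names, and least-squares setup are assumptions rather than the authors' implementation (in practice a reference-site constraint is added to remove the event/station trade-off):

    ```python
    import numpy as np

    def generalized_inversion(event_idx, station_idx, log_amp, dist_km):
        """Solve log10 A_ij = E_i + S_j + c1*log10(r_ij) + c2*r_ij in a least-squares sense."""
        event_idx, station_idx = np.asarray(event_idx), np.asarray(station_idx)
        dist_km = np.asarray(dist_km, dtype=float)
        n_events = int(event_idx.max()) + 1
        n_stations = int(station_idx.max()) + 1
        n_obs = len(log_amp)
        G = np.zeros((n_obs, n_events + n_stations + 2))
        for k in range(n_obs):
            G[k, event_idx[k]] = 1.0                       # event (source) term
            G[k, n_events + station_idx[k]] = 1.0          # station (site) term
            G[k, -2] = np.log10(dist_km[k])                # geometrical spreading
            G[k, -1] = dist_km[k]                          # anelastic attenuation
        m, *_ = np.linalg.lstsq(G, np.asarray(log_amp, dtype=float), rcond=None)
        return m[:n_events], m[n_events:-2], m[-2], m[-1]  # E, S, c1, c2
    ```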

  20. Large Payload Ground Transportation and Test Considerations

    Science.gov (United States)

    Rucker, Michelle A.

    2016-01-01

    Many spacecraft concepts under consideration by the National Aeronautics and Space Administration's (NASA's) Evolvable Mars Campaign take advantage of a Space Launch System payload shroud that may be 8 to 10 meters in diameter. Large payloads can theoretically save cost by reducing the number of launches needed, but only if it is possible to build, test, and transport a large payload to the launch site in the first place. Analysis performed previously for the Altair project identified several transportation and test issues with an 8.973-meter-diameter payload. Although the entire Constellation Program, including Altair, has since been canceled, these issues serve as important lessons learned for spacecraft designers and program managers considering large payloads for future programs. A transportation feasibility study found that, even broken up into an Ascent and Descent Module, the Altair spacecraft would not fit inside available aircraft. Ground transportation of such large payloads over extended distances is not generally permitted, so overland transportation alone would not be an option. Limited ground transportation to the nearest waterway may be possible, but water transportation could take as long as 67 days per production unit, depending on point of origin and acceptance test facility; transportation from the western United States would require transit through the Panama Canal to access the Kennedy Space Center launch site. Large payloads also pose acceptance test and ground processing challenges. Although propulsion, mechanical vibration, and reverberant acoustic test facilities at NASA's Plum Brook Station have been designed to accommodate large spacecraft, special handling and test work-arounds may be necessary, which could increase cost, schedule, and technical risk. Once at the launch site, there are no facilities currently capable of accommodating the combination of large payload size and hazardous processing such as hypergolic fuels

  1. Ground test facility for nuclear testing of space reactor subsystems

    International Nuclear Information System (INIS)

    Quapp, W.J.; Watts, K.D.

    1985-01-01

    Two major reactor facilities at the INEL have been identified as easily adaptable for supporting the nuclear testing of the SP-100 reactor subsystem. They are the Engineering Test Reactor (ETR) and the Loss of Fluid Test Reactor (LOFT). In addition, there are machine shops, analytical laboratories, hot cells, and the supporting services (fire protection, safety, security, medical, waste management, etc.) necessary to conducting a nuclear test program. This paper presents the conceptual approach for modifying these reactor facilities for the ground engineering test facility for the SP-100 nuclear subsystem. 4 figs

  2. Empirical testing of forecast update procedure for seasonal products

    DEFF Research Database (Denmark)

    Wong, Chee Yew; Johansen, John

    2008-01-01

    Updating of forecasts is essential for successful collaborative forecasting, especially for seasonal products. This paper discusses the results of a theoretical simulation and an empirical test of a proposed time-series forecast updating procedure. It involves a two-stage longitudinal case study of a toy supply chain. The theoretical simulation involves historical weekly consumer demand data for 122 toy products. The empirical test is then carried out in real time with 291 toy products. The results show that the proposed forecast updating procedure: 1) reduced forecast errors of the annual... provided less forecast accuracy improvement and needed a longer time to achieve relatively acceptable forecast uncertainty....

  3. "Emergence" vs. "Forcing" of Empirical Data? A Crucial Problem of "Grounded Theory" Reconsidered

    Directory of Open Access Journals (Sweden)

    Udo Kelle

    2005-05-01

    Full Text Available Since the late 1960s Barney GLASER and Anselm STRAUSS, developers of the methodology of "Grounded Theory" have made several attempts to explicate, clarify and reconceptualise some of the basic tenets of their methodological approach. Diverging concepts and understandings of Grounded Theory have arisen from these attempts which have led to a split between its founders. Much of the explication and reworking of Grounded Theory surrounds the relation between data and theory and the role of previous theoretical assumptions. The book which initially established the popularity of GLASER's and STRAUSS' methodological ideas, "The Discovery of Grounded Theory", contains two conflicting understandings of the relation between data and theory—the concept of "emergence" on the one hand and the concept of "theoretical sensitivity" on the other hand. Much of the later developments of Grounded Theory can be seen as attempts to reconcile these prima facie diverging concepts. Thereby GLASER recommends to draw on a variety of "coding families" while STRAUSS proposes the use of a general theory of action to build an axis for an emerging theory. This paper first summarises the most important developments within "Grounded Theory" concerning the understanding of the relation between empirical data and theoretical statements. Thereby special emphasis will be laid on differences between GLASER's and STRAUSS' concepts and on GLASER's current critique that the concepts of "coding paradigm" and "axial coding" described by STRAUSS and Juliet CORBIN lead to the "forcing" of data. It will be argued that GLASER's critique points out some existing weaknesses of STRAUSS' concepts but vastly exaggerates the risks of the STRAUSSian approach. A main argument of this paper is that basic problems of empirically grounded theory construction can be treated much more effectively if one draws on certain results of contemporary philosophical and epistemological discussions and on widely

  4. Development of Response Spectral Ground Motion Prediction Equations from Empirical Models for Fourier Spectra and Duration of Ground Motion

    Science.gov (United States)

    Bora, S. S.; Scherbaum, F.; Kuehn, N. M.; Stafford, P.; Edwards, B.

    2014-12-01

    In a probabilistic seismic hazard assessment (PSHA) framework, it still remains a challenge to adjust ground motion prediction equations (GMPEs) for application in different seismological environments. In this context, this study presents a complete framework for the development of a response spectral GMPE that is easily adjustable to different seismological conditions and does not suffer from the technical problems associated with adjustment in the response spectral domain. Essentially, the approach consists of an empirical FAS (Fourier Amplitude Spectrum) model and a duration model for ground motion, which are combined within the random vibration theory (RVT) framework to obtain the full response spectral ordinates. Additionally, the FAS corresponding to individual acceleration records are extrapolated beyond the frequency range defined by the data using the stochastic FAS model obtained by inversion, as described in Edwards & Faeh (2013). To that end, an empirical duration model, tuned to optimize the fit between RVT-based and observed response spectral ordinates at each oscillator frequency, is derived. Although the main motive of the presented approach was to address the adjustability issues of response spectral GMPEs, comparison of median predicted response spectra with other regional models indicates that the presented approach can also be used as a stand-alone model. Besides that, a significantly lower aleatory variability (σ) makes it a potentially viable alternative to classical regression-based GMPEs (on response spectral ordinates) for seismic hazard studies in the near future. The dataset used for the presented analysis is a subset of the recently compiled database RESORCE-2012 across Europe, the Middle East and the Mediterranean region.
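    A minimal sketch (assumptions, not the authors' code) of the RVT step referenced above: given a one-sided Fourier amplitude spectrum of the oscillator response and a ground-motion duration, the expected peak follows from spectral moments and an asymptotic peak factor:

    ```python
    import numpy as np

    def _trapz(y, x):
        """Simple trapezoidal integration, kept explicit for clarity."""
        y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    def rvt_expected_peak(freq_hz, fas, duration_s):
        """Expected peak of a stationary signal with one-sided FAS `fas` and duration D (s)."""
        w = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
        fas = np.asarray(fas, dtype=float)
        m0 = 2.0 * _trapz(fas ** 2, freq_hz)            # zeroth spectral moment
        m2 = 2.0 * _trapz((w * fas) ** 2, freq_hz)      # second spectral moment
        rms = np.sqrt(m0 / duration_s)                  # root-mean-square amplitude
        n_z = max(duration_s * np.sqrt(m2 / m0) / np.pi, 2.0)  # expected zero crossings
        peak_factor = np.sqrt(2.0 * np.log(n_z)) + 0.5772 / np.sqrt(2.0 * np.log(n_z))
        return peak_factor * rms
    ```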

  5. GES [Ground Engineering System] test site preparation

    International Nuclear Information System (INIS)

    Cox, C.M.; Mahaffey, M.K.; Miller, W.C.; Schade, A.R.; Toyoda, K.G.

    1987-10-01

    Activities are under way at Hanford to convert the 309 containment building and its associated service wing to a nuclear test facility for the Ground Engineering System (GES) test. Conceptual design is about 80% complete, encompassing facility modifications, a secondary heat transport system, a large vacuum system, a test article cell and handling system, control and data handling systems, and safety and auxiliary systems. The design makes extensive use of existing equipment to minimize technical risk and cost. Refurbishment of this equipment is 25% complete. Cleanout of some 1000 m³ of equipment from the earlier reactor test in the facility is 85% complete. An Environmental Assessment was prepared and revised to incorporate Department of Energy (DOE) comments. It is now in the DOE approval chain, where a Finding of No Significant Impact is expected. During the next year, definitive design will be well advanced, long-lead procurements will be initiated, construction planning will be completed, an operator training plan will be prepared, and the preliminary site safety analysis report will be drafted

  6. Commissioning of the Ground Test Accelerator RFQ

    International Nuclear Information System (INIS)

    Johnson, K.F.; Sander, O.R.; Atkins, W.H.; Bolme, G.O.; Brown, S.; Connolly, R.; Garnett, R.; Gilpatrick, J.D.; Guy, F.W.; Ingalls, W.B.; Little, C.; Lohson, R.A.; Lloyd, S.; Neuschaefer, G.; Power, J.; Saadatmand, K.; Sandoval, D.P.; Stevens, R.R.; Vaughn, G.; Wadlinger, E.A.; Weiss, R.; Yuan, V.

    1992-01-01

    The Ground Test Accelerator (GTA) has the objective of verifying much of the technology (physics and engineering) required for producing high-brightness, high-current H⁻ beams. GTA commissioning is staged to verify the beam dynamics design of each major accelerator component as it is brought on-line. The commissioning stages are the 35-keV H⁻ injector, the 2.5 MeV Radio Frequency Quadrupole (RFQ), the Intertank Matching Section (IMS), the 3.2 MeV first 2βλ Drift Tube Linac (DTL-1) module, the 8.7 MeV 2βλ DTL (modules 1-5), and the 24 MeV GTA (all 10 DTL modules). Commissioning results from the RFQ beam experiments will be presented along with comparisons to simulations

  7. Commissioning of the ground test accelerator RFQ

    International Nuclear Information System (INIS)

    Johnson, K.F.; Sander, O.R.; Atkins, W.H.; Bolme, G.O.; Brown, S.; Garnott, R.; Gilpatrick, J.D.; Guy, F.W.; Ingalls, W.B.; Little, C.; Lohsen, R.A.; Lloyd, S.; Neuschaefer, G.; Power, J.; Sandoval, D.P.; Saadatmand, K.; Stevens, R.R.Jr.; Vaughn, G.; Wadlinger, E.A.; Yuan, V.; Connolly, R.; Weiss, R.

    1992-01-01

    The Ground Test Accelerator (GTA) has the objective of verifying much of the technology (physics and engineering) required for producing high-brightness, high-current H⁻ beams. GTA commissioning is staged to verify the beam dynamics design of each major accelerator component as it is brought on line. The commissioning stages are the 35-keV H⁻ injector, the 2.5-MeV radio-frequency quadrupole (RFQ), the intertank matching section (IMS), the 3.2-MeV first 2-βλ drift tube linac (DTL-1) module, the 8.7-MeV 2-βλ DTL (modules 1-5), and the 24-MeV GTA (all 10 DTL modules). Commissioning results from the RFQ beam experiments are presented along with comparisons with simulations. (Author) 8 refs., 9 figs

  8. Context, Experience, Expectation, and Action—Towards an Empirically Grounded, General Model for Analyzing Biographical Uncertainty

    Directory of Open Access Journals (Sweden)

    Herwig Reiter

    2010-01-01

    Full Text Available The article proposes a general, empirically grounded model for analyzing biographical uncertainty. The model is based on findings from a qualitative-explorative study of transforming meanings of unemployment among young people in post-Soviet Lithuania. In a first step, the particular features of the uncertainty puzzle in post-communist youth transitions are briefly discussed. A historical event like the collapse of state socialism in Europe, similar to the recent financial and economic crisis, is a generator of uncertainty par excellence: it undermines the foundations of societies and the taken-for-grantedness of related expectations. Against this background, the case of a young woman and how she responds to the novel threat of unemployment in the transition to the world of work is introduced. Her uncertainty management in the specific time perspective of certainty production is then conceptually rephrased by distinguishing three types or levels of biographical uncertainty: knowledge, outcome, and recognition uncertainty. Biographical uncertainty, it is argued, is empirically observable through the analysis of acting and projecting at the biographical level. The final part synthesizes the empirical findings and the conceptual discussion into a stratification model of biographical uncertainty as a general tool for the biographical analysis of uncertainty phenomena. URN: urn:nbn:de:0114-fqs100120

  9. The efficient market hypothesis: problems with interpretations of empirical tests

    Directory of Open Access Journals (Sweden)

    Denis Alajbeg

    2012-03-01

    Despite many "refutations" in empirical tests, the efficient market hypothesis (EMH) remains the central concept of financial economics. The EMH's resistance to the results of empirical testing emerges from the fact that the EMH is not a falsifiable theory. Its axiomatic definition shows how asset prices would behave under assumed conditions. Testing for this price behavior does not make much sense as the conditions in the financial markets are much more complex than the simplified conditions of perfect competition, zero transaction costs and free information used in the formulation of the EMH. Some recent developments within the tradition of the adaptive market hypothesis are promising regarding development of a falsifiable theory of price formation in financial markets, but are far from giving assurance that we are approaching a new formulation. The most that can be done in the meantime is to be very cautious while interpreting the empirical evidence that is presented as "testing" the EMH.

  10. X-43A Vehicle During Ground Testing

    Science.gov (United States)

    1999-01-01

    The X-43A Hypersonic Experimental Vehicle, or 'Hyper-X' is seen here undergoing ground testing at NASA's Dryden Flight Research Center, Edwards, California in December 1999. The X-43A was developed to research a dual-mode ramjet/scramjet propulsion system at speeds from Mach 7 up to Mach 10 (7 to 10 times the speed of sound, which varies with temperature and altitude). Hyper-X, the flight vehicle for which is designated as X-43A, is an experimental flight-research program seeking to demonstrate airframe-integrated, 'air-breathing' engine technologies that promise to increase payload capacity for future vehicles, including hypersonic aircraft (faster than Mach 5) and reusable space launchers. This multiyear program is currently underway at NASA Dryden Flight Research Center, Edwards, California. The Hyper-X schedule calls for its first flight later this year (2000). Hyper-X is a joint program, with Dryden sharing responsibility with NASA's Langley Research Center, Hampton, Virginia. Dryden's primary role is to fly three unpiloted X-43A research vehicles to validate engine technologies and hypersonic design tools as well as the hypersonic test facility at Langley. Langley manages the program and leads the technology development effort. The Hyper-X Program seeks to significantly expand the speed boundaries of air-breathing propulsion by being the first aircraft to demonstrate an airframe-integrated, scramjet-powered free flight. Scramjets (supersonic-combustion ramjets) are ramjet engines in which the airflow through the whole engine remains supersonic. Scramjet technology is challenging because only limited testing can be performed in ground facilities. Long duration, full-scale testing requires flight research. Scramjet engines are air-breathing, capturing their oxygen from the atmosphere. Current spacecraft, such as the Space Shuttle, are rocket powered, so they must carry both fuel and oxygen for propulsion. Scramjet technology-based vehicles need to carry only

  11. Empirical Comparison of Publication Bias Tests in Meta-Analysis.

    Science.gov (United States)

    Lin, Lifeng; Chu, Haitao; Murad, Mohammad Hassan; Hong, Chuan; Qu, Zhiyong; Cole, Stephen R; Chen, Yong

    2018-04-16

    Decision makers rely on meta-analytic estimates to trade off benefits and harms. Publication bias impairs the validity and generalizability of such estimates. The performance of various statistical tests for publication bias has been largely compared using simulation studies and has not been systematically evaluated in empirical data. This study compares seven commonly used publication bias tests (i.e., Begg's rank test, trim-and-fill, Egger's, Tang's, Macaskill's, Deeks', and Peters' regression tests) based on 28,655 meta-analyses available in the Cochrane Library. Egger's regression test detected publication bias more frequently than other tests (15.7% in meta-analyses of binary outcomes and 13.5% in meta-analyses of non-binary outcomes). The proportion of statistically significant publication bias tests was greater for larger meta-analyses, especially for Begg's rank test and the trim-and-fill method. The agreement among Tang's, Macaskill's, Deeks', and Peters' regression tests for binary outcomes was moderately strong (most κ's were around 0.6). Tang's and Deeks' tests had fairly similar performance (κ > 0.9). The agreement among Begg's rank test, the trim-and-fill method, and Egger's regression test was weak or moderate (κ < 0.5). Given the relatively low agreement between many publication bias tests, meta-analysts should not rely on a single test and may apply multiple tests with various assumptions. Non-statistical approaches to evaluating publication bias (e.g., searching clinical trials registries, records of drug approving agencies, and scientific conference proceedings) remain essential.
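    As a point of reference, a minimal sketch of one of the compared procedures, Egger's regression test, implemented here under the usual textbook formulation (regress the standardized effect on precision and test the intercept); this is not the code used in the study:

    ```python
    import numpy as np
    from scipy import stats

    def eggers_test(effects, standard_errors):
        """Egger's test: intercept of (effect / SE) regressed on (1 / SE), with two-sided p-value."""
        effects = np.asarray(effects, dtype=float)
        se = np.asarray(standard_errors, dtype=float)
        z = effects / se                       # standardized effects
        precision = 1.0 / se
        res = stats.linregress(precision, z)   # z = intercept + slope * precision
        t_stat = res.intercept / res.intercept_stderr
        p_value = 2.0 * stats.t.sf(abs(t_stat), df=len(effects) - 2)
        return res.intercept, p_value
    ```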

  12. Hypersonic Inflatable Aerodynamic Decelerator Ground Test Development

    Science.gov (United States)

    Del Corso, Joseph A.; Hughes, Stephen; Cheatwood, Neil; Johnson, Keith; Calomino, Anthony

    2015-01-01

    Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology readiness levels have been incrementally matured by NASA over the last thirteen years, with most recent support from NASA's Space Technology Mission Directorate (STMD) Game Changing Development Program (GCDP). Recently STMD GCDP has authorized funding and support through fiscal year 2015 (FY15) for continued HIAD ground developments which support a Mars Entry, Descent, and Landing (EDL) study. The Mars study will assess the viability of various EDL architectures to enable a Mars human architecture pathfinder mission planned for mid-2020. At its conclusion in November 2014, NASA's first HIAD ground development effort had demonstrated success with fabricating a 50 W/cm2 modular thermal protection system, a 400 C capable inflatable structure, a 10-meter scale aeroshell manufacturing capability, together with calibrated thermal and structural models. Despite the unquestionable success of the first HIAD ground development effort, it was recognized that additional investment was needed in order to realize the full potential of the HIAD technology capability to enable future flight opportunities. The second HIAD ground development effort will focus on extending performance capability in key technology areas that include thermal protection system, lifting-body structures, inflation systems, flight control, stage transitions, and 15-meter aeroshell scalability. This paper presents an overview of the accomplishments under the baseline HIAD development effort and current plans for a follow-on development effort focused on extending those critical technologies needed to enable a Mars Pathfinder mission.

  13. Ground/Flight Test Techniques and Correlation.

    Science.gov (United States)

    1983-02-01

    Excerpts from the scanned report: the photogrammetric analysis system developed at AEDC [6] uses 70-mm Hasselblad cameras and a Keuffel & Esser DSC-3/80 analytical stereocompiler; the model transmits data to a ground receiver by telemetry and is tracked by accurate radar scanners and/or kinetheodolite cameras as required; Materials Panel Meeting, Ottawa, Canada, Sept. 25-27, 1967; also Jahrbuch 1967 der Wissenschaftlichen Gesellschaft für Luft- und Raumfahrt, pp. 211

  14. History of ground motion programs at the Nevada Test Site

    International Nuclear Information System (INIS)

    Banister, J.R.

    1984-01-01

    Some measurements were made in the atmospheric testing era, but the study of ground motion from nuclear tests became of wider interest after the start of underground testing. The ground motion generated by underground nuclear tests has been investigated for a number of reasons, including understanding basic phenomena, operational and safety concerns, yield determination, concerns about earthquake stimulation, and developing methods to aid in treaty verification. This history of ground motion programs includes discussion of early studies, high-yield programs, Peaceful Nuclear Explosions tests, and some more recent developments. 6 references, 10 figures

  15. Testing the gravity p-median model empirically

    Directory of Open Access Journals (Sweden)

    Kenneth Carling

    2015-12-01

    Regarding the location of a facility, the presumption in the widely used p-median model is that the customer opts for the shortest route to the nearest facility. However, this assumption is problematic in free markets, since the customer is presumed to gravitate to a facility according to both the distance to it and its attractiveness. The recently introduced gravity p-median model offers an extension to the p-median model that accounts for this. The model is therefore potentially interesting, although it has not yet been implemented and tested empirically. In this paper, we have implemented the model in an empirical problem of locating vehicle inspections, locksmiths, and retail stores of vehicle spare parts for the purpose of investigating its superiority to the p-median model. We found, however, the gravity p-median model to be of limited use for the problem of locating facilities, as it either gives solutions similar to the p-median model or gives unstable solutions due to a non-concave objective function.
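    A hedged sketch contrasting the two objectives discussed above for a fixed set of p candidate facilities: the p-median assigns each customer to the nearest facility, while the gravity p-median spreads demand over facilities with Huff-type probabilities. The attractiveness values, the distance-decay form exp(-beta*d), and the value of beta are illustrative assumptions, not the paper's calibration:

    ```python
    import numpy as np

    def p_median_cost(dist, demand):
        """dist: (n_customers, p) distances to chosen facilities; demand: (n_customers,)."""
        return float(np.sum(demand * dist.min(axis=1)))

    def gravity_p_median_cost(dist, demand, attractiveness, beta=0.1):
        """Expected demand-weighted travel distance under Huff-type allocation probabilities."""
        utility = attractiveness * np.exp(-beta * dist)        # (n_customers, p)
        prob = utility / utility.sum(axis=1, keepdims=True)    # allocation probabilities
        return float(np.sum(demand * (prob * dist).sum(axis=1)))
    ```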

  16. Australia - a nuclear weapons testing ground

    International Nuclear Information System (INIS)

    Dobbs, Michael.

    1993-01-01

    Between 1952 and 1958 Britain conducted five separate nuclear weapons trials in Australia. Australia had the uninhabited wide open spaces and the facilities which such tests need and Britain was able to use its special relationship with Australia to get agreement to conduct atomic tests in Australia and establish a permanent test site at Maralinga. Other non-nuclear tests were conducted between 1953-1963. The story of Britain's involvement in atomic weapons testing in Australia is told through its postal history. Both official and private covers are used to show how the postal communications were established and maintained throughout the test years. (UK)

  17. BUSTED BUTTE TEST FACILITY GROUND SUPPORT CONFIRMATION ANALYSIS

    International Nuclear Information System (INIS)

    Bonabian, S.

    1998-01-01

    The main purpose and objective of this analysis is to confirm the validity of the ground support design for the Busted Butte Test Facility (BBTF). Highwall stability and the adequacy of the highwall and tunnel ground support are addressed. The design of the BBTF, including the ground support system, was performed in a separate document (Reference 5.3). Both in situ and seismic loads are considered in the evaluation of the highwall and the tunnel ground support system. This analysis addresses only the ground support designed in Reference 5.3; the additional ground support installed by the constructor (work still in progress) is not covered by the calculations, but was evaluated by the A/E during a site visit, and those findings and recommendations are addressed in this analysis

  18. Quantum optics as a conceptual testing ground

    International Nuclear Information System (INIS)

    Bergon, J.A.

    1997-01-01

    Entangled states provide the necessary tools for conceptual tests of quantum mechanics and other alternative theories. Here our focus is on a test of time-symmetric, pre- and post-selective quantum mechanics and its relation to the consistent histories interpretation. First, we show how to produce a nonlocal entangled state, necessary for the test, in which there is precisely one photon hiding in three cavities. This state can be produced by sending appropriately prepared atoms through the cavities. Then, we briefly review the proposal for an experimental test of pre- and post-selective quantum mechanics using the three-cavity state. Finally, we show that the outcome of such an experiment can be discussed from the viewpoint of the consistent histories interpretation of quantum mechanics and therefore provides an opportunity to subject quantum cosmological ideas to laboratory tests. (author)

  19. Ground testing of an SP-100 prototypic reactor

    International Nuclear Information System (INIS)

    Motwani, K.; Pflasterer, G.R.; Upton, H.; Lazarus, J.D.; Gluck, R.

    1988-01-01

    SP-100 is a space power system which is being developed by GE to meet future space electrical power requirements. The ground testing of an SP-100 prototypic reactor system will be conducted at the Westinghouse Hanford Company site located at Richland, Washington. The objective of this test is to demonstrate the performance of a full scale prototypic reactor system, including the reactor, control system and flight shield. The ground test system is designed to simulate the flight operating conditions while meeting all the necessary nuclear safety requirements in a gravity environment. The goal of the reactor ground test system is to establish confidence in the design maturity of the SP-100 space reactor power system and resolve the technical issues necessary for the development of a flight mission design

  20. On the Relationship between Fourier and Response Spectra: Implications for the Adjustment of Empirical Ground-Motion Prediction Equations (GMPEs)

    Science.gov (United States)

    Bora, Sanjay; Scherbaum, Frank; Kuehn, Nicolas; Stafford, Peter

    2016-04-01

    Often, the scaling of response spectral amplitudes (e.g., spectral acceleration) obtained from empirical ground motion prediction equations (GMPEs) with respect to commonly used seismological parameters, such as magnitude, distance and site condition, is assumed to represent a similar scaling of Fourier spectral amplitudes. For instance, the distance scaling of response spectral amplitudes is associated with the geometrical spreading of seismic waves. Such comparison of the scaling of response spectral amplitudes with that of the corresponding Fourier spectral amplitudes is motivated by the fact that the functional forms of response spectral GMPEs are often derived using concepts borrowed from Fourier spectral modelling of ground motion. As these GMPEs are subsequently calibrated with empirical observations, this may not appear to pose any major problems in the prediction of ground motion for a particular earthquake scenario. However, the assumption that Fourier spectral concepts persist for response spectra can lead to undesirable consequences when it comes to the adjustment of response spectral GMPEs to represent conditions not covered in the original empirical data set. In this context, a couple of important questions arise: What are the distinctions and/or similarities between Fourier and response spectra of ground motions? And, if they are different, what mechanism is responsible for such differences, and how do adjustments made to the FAS manifest in response spectra? We explore the relationship between the Fourier and response spectrum of ground motion by using random vibration theory (RVT). With a simple Brune (1970, 1971) source model, RVT-generated acceleration spectra for a fixed magnitude and distance scenario are used. The RVT analyses reveal that the scaling of low oscillator-frequency response spectral ordinates can be treated as being equivalent to the scaling of the corresponding Fourier spectral ordinates. However, the high...
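    For reference, the standard single-degree-of-freedom relation usually assumed when linking Fourier and response spectra in RVT-based analyses (a textbook relation, not an equation quoted from the abstract):

    ```latex
    % Oscillator with natural frequency f_0 and damping ratio \zeta acting on the
    % ground-acceleration Fourier amplitude spectrum |A(f)|:
    \[
      |X(f)| \;=\; \frac{|A(f)|}{(2\pi)^{2}\sqrt{\bigl(f_0^{2}-f^{2}\bigr)^{2}
                  + \bigl(2\zeta f_0 f\bigr)^{2}}},
      \qquad
      \mathrm{PSA}(f_0) \;=\; (2\pi f_0)^{2}\, x_{\max},
    \]
    % where |X(f)| is the Fourier spectrum of the oscillator's relative displacement
    % and x_{max} is its peak value (estimated via RVT from |X(f)| and the duration).
    ```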

  1. Principles of Work Sample Testing. 1. A Non-Empirical Taxonomy of Test Uses

    Science.gov (United States)

    1979-04-01

    UNIVERSITY % ’ Bowling Green , Ohio 43403 e April 1979 Contract DAHC 19-77-C-0007 Cj Prepared for U.S. ARMY RESEARCH INSTITUTEfor the BEHAVIORAL and...inferring changes in enotionality, and GSR is said to be a measure of emotion . Much of psychological measurement is derived measurement, but it is...ARI TECHNICAL REPORT TR-79-A8 Principles of Work Sample Testingi I. A Non-Empirical Taxonomy of Test Uses b y Robert M. Guion BOWLING GREEN STATE

  2. Synthetic strong ground motions for engineering design utilizing empirical Green's functions

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L.J.; Jarpe, S.P.; Kasameyer, P.W.; Foxall, W.

    1996-04-11

    We present a methodology for developing realistic synthetic strong ground motions for specific sites from specific earthquakes. We analyzed the possible ground motion resulting from a M = 7.25 earthquake that ruptures 82 km of the Hayward fault for a site 1.4 km from the fault in the eastern San Francisco Bay area. We developed a suite of 100 rupture scenarios for the Hayward fault earthquake and computed the corresponding strong ground motion time histories. We synthesized strong ground motion with physics-based solutions of earthquake rupture and applied physical bounds on rupture parameters. By having a suite of rupture scenarios of hazardous earthquakes for a fixed magnitude and identifying the hazard to the site from the statistical distribution of engineering parameters, we introduce a probabilistic component into the deterministic hazard calculation. Engineering parameters of synthesized ground motions agree with those recorded from the 1995 Kobe, Japan and the 1992 Landers, California earthquakes at similar distances and site geologies.
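    A heavily simplified, hedged sketch of the general empirical-Green's-function summation idea referenced above (not the LLNL code): a large-event synthetic is built by lagging and scaling small-event recordings distributed over the rupture, here collapsed to a single record reused for every subfault; all names and arguments are illustrative:

    ```python
    import numpy as np

    def egf_synthesis(g, dt, subfault_delays_s, subfault_scales):
        """Sum lagged, scaled copies of the small-event record g (sampled at dt seconds)."""
        g = np.asarray(g, dtype=float)
        n_out = len(g) + int(np.ceil(max(subfault_delays_s) / dt)) + 1
        s = np.zeros(n_out)
        for delay, scale in zip(subfault_delays_s, subfault_scales):
            i0 = int(round(delay / dt))           # rupture-time lag in samples
            s[i0:i0 + len(g)] += scale * g        # moment-scaled contribution
        return s
    ```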

  3. Space Shuttle Boundary Layer Transition Flight Experiment Ground Testing Overview

    Science.gov (United States)

    Berger, Karen T.; Anderson, Brian P.; Campbell, Charles H.

    2014-01-01

    In support of the Boundary Layer Transition (BLT) Flight Experiment (FE) Project, in which a manufactured protuberance tile was installed on the port wing of Space Shuttle Orbiter Discovery for STS-119, STS-128, STS-131 and STS-133, as well as Space Shuttle Orbiter Endeavour for STS-134, a significant ground test campaign was completed. The primary goals of the test campaign were to provide ground test data to support the planning and safety certification efforts required to fly the flight experiment, as well as validation for the collected flight data. These tests included arcjet testing of the tile protuberance, aerothermal testing to determine the boundary layer transition behavior and resultant surface heating, and planar laser induced fluorescence (PLIF) testing in order to gain a better understanding of the flow field characteristics associated with the flight experiment. This paper provides an overview of the BLT FE Project ground testing. High-level overviews of the facilities, models, test techniques and data are presented, along with a summary of the insights gained from each test.

  4. Educational mismatches and skills: New empirical tests of old hypotheses

    NARCIS (Netherlands)

    Levels, M.; van der Velden, R.K.W.; Allen, J.P.

    2013-01-01

    In this paper, we empirically explore how the often reported relationship between overeducation and wages can best be understood. Exploiting the newly published Programme for International Assessment of Adult Competencies (PIAAC) data (OECD 2013), we are able to achieve a better estimation of the

  5. An empirical test of spatial competition in the audit market

    NARCIS (Netherlands)

    Numan, W.A.; Willekens, M.M.T.A.

    2012-01-01

    This study empirically examines the effects of competition through differentiation on audit pricing. Based on prior economic theory on differentiated-product markets (e.g., Hotelling, 1929; Tirole, 1988), we hypothesize that audit fees are affected by an auditor's relative location in a market

  6. Handling effluent from nuclear thermal propulsion system ground tests

    International Nuclear Information System (INIS)

    Shipers, L.R.; Allen, G.C.

    1992-01-01

    A variety of approaches for handling effluent from nuclear thermal propulsion system ground tests in an environmentally acceptable manner are discussed. The functional requirements of effluent treatment are defined and concept options are presented within the framework of these requirements. System concepts differ primarily in the choice of fission-product retention and waste handling concepts. The concept options considered range from closed cycle (venting the exhaust to a closed volume or recirculating the hydrogen in a closed loop) to open cycle (real time processing and venting of the effluent). This paper reviews the different methods to handle effluent from nuclear thermal propulsion system ground tests

  7. Effluent treatment options for nuclear thermal propulsion system ground tests

    International Nuclear Information System (INIS)

    Shipers, L.R.; Brockmann, J.E.

    1992-01-01

    A variety of approaches for handling effluent from nuclear thermal propulsion system ground tests in an environmentally acceptable manner are discussed. The functional requirements of effluent treatment are defined and concept options are presented within the framework of these requirements. System concepts differ primarily in the choice of fission-product retention and waste handling concepts. The concept options considered range from closed cycle (venting the exhaust to a closed volume or recirculating the hydrogen in a closed loop) to open cycle (real time processing and venting of the effluent). This paper reviews the strengths and weaknesses of different methods to handle effluent from nuclear thermal propulsion system ground tests

  8. Searching for a Common Ground--A Literature Review of Empirical Research on Scientific Inquiry Activities

    Science.gov (United States)

    Rönnebeck, Silke; Bernholt, Sascha; Ropohl, Mathias

    2016-01-01

    Despite the importance of scientific inquiry in science education, researchers and educators disagree considerably regarding what features define this instructional approach. While a large body of literature addresses theoretical considerations, numerous empirical studies investigate scientific inquiry on quite different levels of detail and also…

  9. An assessment of testing requirement impacts on nuclear thermal propulsion ground test facility design

    International Nuclear Information System (INIS)

    Shipers, L.R.; Ottinger, C.A.; Sanchez, L.C.

    1993-01-01

    Programs to develop solid core nuclear thermal propulsion (NTP) systems have been under way at the Department of Defense (DoD), the National Aeronautics and Space Administration (NASA), and the Department of Energy (DOE). These programs have recognized the need for a new ground test facility to support development of NTP systems. However, the different military and civilian applications have led to different ground test facility requirements. The Department of Energy (DOE) in its role as landlord and operator of the proposed research reactor test facilities has initiated an effort to explore opportunities for a common ground test facility to meet both DoD and NASA needs. The baseline design and operating limits of the proposed DoD NTP ground test facility are described. The NASA ground test facility requirements are reviewed and their potential impact on the DoD facility baseline is discussed

  10. The provincial administration of the Ottoman Empire in the XVI century on the ground of the kanunname

    Directory of Open Access Journals (Sweden)

    M. V. Kovalchuk

    2016-11-01

    Full Text Available The purpose of the article is to characterize the structure of the provincial administration of the Ottoman Empire in the XVI century on the ground of the sultans' kanunname. The administrative divisions of the Ottoman Empire are examined: the largest Ottoman provinces, the eyalets, were divided into two types: timarli, in which the system of sanjaks, Ottoman law and the timar system were established; and salyaneli, where the timar system was absent and all taxes were gathered and distributed to local needs. The second administrative-military territorial units were the sanjaks; the subdivisions of a sanjak were kazas and kadiluks, and then nahiyes. Moreover, outside this system there were various types of vassal and tributary states. The governance of such a large territory, control over the execution of orders, and the establishment of relations between the different groups of the empire's multi-ethnic population, which lived mostly not in the metropolitan area but in the provinces, required the constant functioning of local authorities. The main task of the provincial administration was to organize and strengthen communications between the province and the center. Special attention is given to the place and role, rights and duties of the beylerbey, sanjakbey, qadi (judge), defterdar and other local officials in the system of Ottoman provincial administration. In conclusion, the system of local authorities was formed by representatives of the military-administrative (sipahi-timarli, sanjakbey, beylerbey, the Grand Vizier), judicial (qadis, kazaskers, sheikh-ul-Islam) and financial (defterdars) branches of the Ottoman government. Legal prohibitions and mutual cooperation supported a sustainable balance between these branches of executive power.

  11. The Locus of Metaphorical Persuasion: An Empirical Test.

    Science.gov (United States)

    Hitchon, Jacqueline C.

    1997-01-01

    Investigates whether the locus of persuasion of the metaphor "A is B" lies in the valence of B, as widely assumed, or in the valence of the metaphor ground, what A and B share. Indicates that global affect toward B does not transfer onto A and that metaphorical persuasion is a distinct process meriting further investigation. (SR)

  12. Forced vibration tests of a model foundation on rock ground

    International Nuclear Information System (INIS)

    Kisaki, N.; Siota, M.; Yamada, M.; Ikeda, A.; Tsuchiya, H.; Kitazawa, K.; Kuwabara, Y.; Ogiwara, Y.

    1983-01-01

    The response of very stiff structures, such as nuclear reactor buildings, to earthquake ground motion is significantly affected by radiation damping due to the soil-structure interaction. The radiation damping can be computed by vibration admittance theory or dynamical ground compliance theory. In order to apply the values derived from these theories to the practical problems, comparative studies between theoretical results and experimental results concerning the soil-structure interaction, especially if the ground is rock, are urgently needed. However, experimental results for rock are less easily obtained than theoretical ones. The purpose of this paper is to describe the harmonic excitation tests of a model foundation on rock and to describe the results of comparative studies. (orig./HP)

  13. Change-based test selection : An empirical evaluation

    NARCIS (Netherlands)

    Soetens, Quinten; Demeyer, Serge; Zaidman, A.E.; Perez, Javier

    2015-01-01

    Regression test selection (i.e., selecting a subset of a given regression test suite) is a problem that has been studied intensely over the last decade. However, with the increasing popularity of developer tests as the driver of the test process, more fine-grained solutions that work well within the

  14. Ground Operations Demonstration Unit for Liquid Hydrogen Initial Test Results

    Science.gov (United States)

    Notardonato, W. U.; Johnson, W. L.; Swanger, A. M.; Tomsik, T.

    2015-01-01

    NASA operations for handling cryogens in ground support equipment have not changed substantially in 50 years, despite major technology advances in the field of cryogenics. NASA loses approximately 50% of the hydrogen purchased because of a continuous heat leak into ground and flight vessels, transient chill down of warm cryogenic equipment, liquid bleeds, and vent losses. NASA Kennedy Space Center (KSC) needs to develop energy-efficient cryogenic ground systems to minimize propellant losses, simplify operations, and reduce cost associated with hydrogen usage. The GODU LH2 project has designed, assembled, and started testing of a prototype storage and distribution system for liquid hydrogen that represents an advanced end-to-end cryogenic propellant system for a ground launch complex. The project has multiple objectives including zero loss storage and transfer, liquefaction of gaseous hydrogen, and densification of liquid hydrogen. The system is unique because it uses an integrated refrigeration and storage system (IRAS) to control the state of the fluid. This paper will present and discuss the results of the initial phase of testing of the GODU LH2 system.
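
    Because the record above quantifies losses in terms of heat leak into ground and flight vessels, a back-of-the-envelope boil-off calculation helps put such numbers in perspective. The sketch below is a generic illustration only: the heat-leak value and tank size are hypothetical and are not GODU LH2 test results; the latent heat and density of liquid hydrogen are standard approximate property values.

```python
# Rough boil-off estimate for a liquid hydrogen storage vessel.
# Hypothetical numbers for illustration only; not GODU LH2 test data.

H_FG_LH2 = 446e3        # latent heat of vaporization of LH2, J/kg (approx., at ~20 K)
RHO_LH2 = 70.8          # liquid hydrogen density, kg/m^3 (approx.)

def daily_boiloff_fraction(heat_leak_w, tank_volume_m3, fill_fraction=0.95):
    """Return the fraction of the stored LH2 mass boiled off per day
    for a given steady heat leak into the tank."""
    stored_mass = RHO_LH2 * tank_volume_m3 * fill_fraction    # kg
    boiloff_rate = heat_leak_w / H_FG_LH2                     # kg/s
    return boiloff_rate * 86400.0 / stored_mass               # fraction per day

# Example: a 125 m^3 tank with a 100 W total heat leak (hypothetical values).
print(f"{100.0 / H_FG_LH2 * 86400.0:.1f} kg/day boiled off")
print(f"{daily_boiloff_fraction(100.0, 125.0) * 100:.3f} % of stored mass per day")
```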

  15. Cryogenic actuator testing for the SAFARI ground calibration setup

    Science.gov (United States)

    de Jonge, C.; Eggens, M.; Nieuwenhuizen, A. C. T.; Detrain, A.; Smit, H.; Dieleman, P.

    2012-09-01

    For the on-ground calibration setup of the SAFARI instrument, cryogenic mechanisms are being developed at SRON Netherlands Institute for Space Research, including a filter wheel, XYZ-scanner and a flipmirror mechanism. Due to the extremely low background radiation requirement of the SAFARI instrument, all of these mechanisms will have to perform their work at 4.5 K, and low-dissipative cryogenic actuators are required to drive them. In this paper, the performance of stepper motors, piezoelectric actuators and brushless DC motors as cryogenic actuators is compared. We tested stepper motor mechanical performance and electrical dissipation at 4 K. The actuator requirements, test setup and test results are presented. Furthermore, design considerations and early performance tests of the flipmirror mechanism are discussed. This flipmirror features a 102 x 72 mm aluminum mirror that can be rotated 45°. A Phytron stepper motor with reduction gearbox has been chosen to drive the flipmirror. Testing showed that this motor has a dissipation of 49 mW at 4 K with a torque of 60 Nmm at 100 rpm. Thermal modeling of the flipmirror mechanism predicts that with proper thermal strapping the peak temperature of the flipmirror after a single action will be within the background level requirements of the SAFARI instrument. Early tests confirm this result. For low-duty-cycle operations, commercial stepper motors appear suitable as actuators for test equipment in the SAFARI on-ground calibration setup.

  16. Review of Nuclear Thermal Propulsion Ground Test Options

    Science.gov (United States)

    Coote, David J.; Power, Kevin P.; Gerrish, Harold P.; Doughty, Glen

    2015-01-01

    High-efficiency rocket propulsion systems are essential for humanity to venture beyond the Moon. Nuclear Thermal Propulsion (NTP) is a promising alternative to conventional chemical rockets, with relatively high thrust and twice the efficiency of the highest performing chemical propellant engines. NTP utilizes the coolant of a nuclear reactor to produce propulsive thrust. An NTP engine produces thrust by flowing hydrogen through a nuclear reactor to cool the reactor, heating the hydrogen and expelling it through a rocket nozzle. The hot gaseous hydrogen is nominally expected to be free of radioactive byproducts from the nuclear reactor; however, it has the potential to be contaminated due to off-nominal engine reactor performance. NTP ground testing is more difficult than chemical engine testing since current environmental regulations do not permit open-air testing of NTP as was done in the 1960s and 1970s for the Rover/NERVA program. A new and innovative approach to rocket engine ground test is required to mitigate the unique health and safety risks associated with the potential entrainment of radioactive waste from the NTP engine reactor core into the engine exhaust. Several studies have been conducted since the Rover/NERVA program in the 1970s investigating NTP engine ground test options to understand the technical feasibility, identify technical challenges and associated risks, and provide rough order of magnitude cost estimates for facility development and test operations. The options can be divided into two distinct schemes: (1) real-time filtering of the engine exhaust and its release to the environment, or (2) capture and storage of engine exhaust for subsequent processing.

  17. A procedure for empirical initialization of adaptive testing algorithms

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1997-01-01

    In constrained adaptive testing, the numbers of constraints needed to control the content of the tests can easily run into the hundreds. Proper initialization of the algorithm becomes a requirement because the presence of large numbers of constraints slows down the convergence of the ability

  18. Integrated Vehicle Ground Vibration Testing of Manned Spacecraft: Historical Precedent

    Science.gov (United States)

    Lemke, Paul R.; Tuma, Margaret L.; Askins, Bruce R.

    2008-01-01

    For the first time in nearly 30 years, NASA is developing a new manned space flight launch system. The Ares I will carry crew and cargo to not only the International Space Station, but onward for the future exploration of the Moon and Mars. The Ares I control system and structural designs use complex computer models for their development. An Integrated Vehicle Ground Vibration Test (IVGVT) will validate the efficacy of these computer models. The IVGVT will reduce the technical risk of unexpected conditions that could place the vehicle or crew in jeopardy. The Ares Project Office's Flight and Integrated Test Office commissioned a study to determine how historical programs, such as Saturn and Space Shuttle, validated the structural dynamics of an integrated flight vehicle. The study methodology was to examine the historical record and seek out members of the engineering community who recall the development of historic manned launch vehicles. These records and interviews provided insight into the best practices and lessons learned from these historic development programs. The information that was gathered allowed the creation of timelines of the historic development programs. The timelines trace the programs from the development of test articles through test preparation, test operations, and test data reduction efforts. These timelines also demonstrate how the historical tests fit within their overall vehicle development programs. Finally, the study was able to quantify approximate staffing levels during historic development programs. Using this study, the Flight and Integrated Test Office was able to evaluate the Ares I Integrated Vehicle Ground Vibration Test schedule and workforce budgets in light of the historical precedents to determine if the test had schedule or cost risks associated with it.

  19. Empirical evidence of design-related bias in studies of diagnostic tests

    NARCIS (Netherlands)

    Lijmer, J. G.; Mol, B. W.; Heisterkamp, S.; Bonsel, G. J.; Prins, M. H.; van der Meulen, J. H.; Bossuyt, P. M.

    1999-01-01

    CONTEXT: The literature contains a large number of potential biases in the evaluation of diagnostic tests. Strict application of appropriate methodological criteria would invalidate the clinical application of most study results. OBJECTIVE: To empirically determine the quantitative effect of study

  20. CRYogenic Orbital TEstbed Ground Test Article Thermal Analysis

    Science.gov (United States)

    Piryk, David; Schallhorn, Paul; Walls, Laurie; Stopnitzky, Benny; Rhys, Noah; Wollen, Mark

    2012-01-01

    The purpose of this study was to anchor thermal and fluid system models to CRYOTE ground test data. The CRYOTE ground test article was jointly developed by Innovative Engineering Solutions, United Launch Alliance and NASA KSC. The test article was constructed out of a titanium alloy tank, Sapphire 77 composite skin (similar to G10), an external secondary payload adapter ring, a thermal vent system, multilayer insulation and various data acquisition instrumentation. In an effort to understand heat loads throughout this system, the GTA (filled with liquid nitrogen for safety purposes) was subjected to a series of tests in a vacuum chamber at Marshall Space Flight Center. By anchoring analytical models against test data, higher fidelity thermal environment predictions can be made for future flight articles which would eventually demonstrate critical cryogenic fluid management technologies such as system chilldown, transfer, pressure control and long term storage. Significant factors that influenced heat loads included radiative environments, multilayer insulation performance, tank fill levels and pressures, and even contact conductance coefficients. This report demonstrates how analytical thermal/fluid networks were established and includes supporting rationale for specific thermal responses.

  1. Deep in Data. Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository

    Energy Technology Data Exchange (ETDEWEB)

    Neymark, J. [J.Neymark and Associates, Golden, CO (United States); Roberts, D. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-06-01

    This paper describes progress toward developing a usable, standardized, empirical data-based software accuracy test suite using home energy consumption and building description data. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This could allow for modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data.

  2. Advanced stellar compass deep space navigation, ground testing results

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Jørgensen, Peter Siegbjørn

    2006-01-01

    Deep space exploration is in the agenda of the major space agencies worldwide and at least the European Space Agency (SMART & Aurora Programs) and the American NASA (New Millennium Program) have set up programs to allow the development and the demonstration of technologies that can reduce the risks...... and the costs of the deep space missions. Navigation is the Achilles' heel of deep space. Being performed on ground, it imposes considerable constraints on the system and the operations, it is very expensive to execute, especially when the mission lasts several years and, above all, it is not failure tolerant...... to determine the orbit of a spacecraft autonomously, on-board and without any a priori knowledge of any kind. The solution is robust, elegant and fast. This paper presents the preliminary performances obtained during the ground tests. The results are very positive and encouraging....

  3. Development of Ground Test System For RKX-200EB

    Science.gov (United States)

    Yudhi Irwanto, Herma

    2018-04-01

    After being postponed for seven years, the development of RKX-200EB has now restarted with a ground test, preceding the real flight test. The development series starts with a simulation test using the real vehicle and its components, focusing on a flight-sequence test using hardware-in-the-loop simulation. The simulation results show that the autonomous control system under development is able to control the X-tail-fin vehicle from takeoff using the booster, through booster-sustainer separation, through flight maneuvers using the sustainer at an average cruise speed of 1000 km/h, and through bank maneuvers of up to ±40 deg while heading to the target. The simulation results also show that the presence of the sustainer in vehicle control can expand the range by 162% (12.6 km) over the ballistic range achieved using only the booster.

  4. Prediction of mandibular rotation: an empirical test of clinician performance.

    Science.gov (United States)

    Baumrind, S; Korn, E L; West, E E

    1984-11-01

    An experiment was conducted in an attempt to determine empirically how effective a number of expert clinicians were at differentiating "backward rotators" from "forward rotators" on the basis of head-film information which might reasonably have been available to them prior to instituting treatment for the correction of Class II malocclusion. As a result of a previously reported ongoing study, pre- and posttreatment head films were available for 188 patients treated in the mixed dentition for the correction of Class II malocclusion and for 50 untreated Class II subjects. These subjects were divided into 14 groups (average size of group, 17; range, 6 to 23) solely on the basis of type of treatment and the clinician from whose clinic the records had originated. From within each group, we selected the two or three subjects who had exhibited the most extreme backward rotation and the two or three subjects who had exhibited the most extreme forward rotation of the mandible during the interval between films. The sole criterion for classification was magnitude of change in the mandibular plane angle of Downs between the pre- and posttreatment films of each patient. The resulting sample contained 32 backward-rotator subjects and 32 forward-rotator subjects. Five expert judges (mean clinical experience, 28 years) were asked to identify the backward-rotator subjects by examination of the pretreatment films. The findings may be summarized as follows: (1) No judge performed significantly better than chance. (2) There was strong evidence that the judges used a shared, though relatively ineffective, set of rules in making their discriminations between forward and backward rotators. (3) Statistical analysis of the predictive power of a set of standard cephalometric measurements which had previously been made for this set of subjects indicated that the numerical data also failed to identify potential backward rotators at a rate significantly better than chance. We infer from these

  5. What Happened to Remote Usability Testing? An Empirical Study of Three Methods

    DEFF Research Database (Denmark)

    Stage, Jan; Andreasen, M. S.; Nielsen, H. V.

    2007-01-01

    The idea of conducting usability tests remotely emerged ten years ago. Since then, it has been studied empirically, and some software organizations employ remote methods. Yet there are still few comparisons involving more than one remote method. This paper presents results from a systematic... empirical comparison of three methods for remote usability testing and a conventional laboratory-based think-aloud method. The three remote methods are a remote synchronous condition, where testing is conducted in real time but the test monitor is separated spatially from the test subjects, and two remote...

  6. Sex-Specific Arrival Times on the Breeding Grounds: Hybridizing Migratory Skuas Provide Empirical Support for the Role of Sex Ratios.

    Science.gov (United States)

    Lisovski, Simeon; Fröhlich, Anne; von Tersch, Matthew; Klaassen, Marcel; Peter, Hans-Ulrich; Ritz, Markus S

    2016-04-01

    In migratory animals, protandry (earlier arrival of males on the breeding grounds) prevails over protogyny (females preceding males). In theory, sex differences in timing of arrival should be driven by the operational sex ratio, shifting toward protogyny in female-biased populations. However, empirical support for this hypothesis is, to date, lacking. To test this hypothesis, we analyzed arrival data from three populations of the long-distance migratory south polar skua (Catharacta maccormicki). These populations differed in their operational sex ratio caused by the unidirectional hybridization of male south polar skuas with female brown skuas (Catharacta antarctica lonnbergi). We found that arrival times were protandrous in allopatry, shifting toward protogyny in female-biased populations when breeding in sympatry. This unique observation is consistent with theoretical predictions that sex-specific arrival times should be influenced by sex ratio and that protogyny should be observed in populations with female-biased operational sex ratio.

  7. The Empirical Testing of a Musical Performance Assessment Paradigm

    Science.gov (United States)

    Russell, Brian E.

    2010-01-01

    The purpose of this study was to test a hypothesized model of aurally perceived performer-controlled musical factors that influence assessments of performance quality. Previous research studies on musical performance constructs, musical achievement, musical expression, and scale construction were examined to identify the factors that influence…

  8. An Empirical Test of a Model of Resistance to Persuasion.

    Science.gov (United States)

    And Others; Burgoon, Michael

    1978-01-01

    Tests a model of resistance to persuasion based upon variables not considered by earlier congruity and inoculation models. Supports the prediction that the kind of critical response set induced and the target of the criticism are mediators of resistance to persuasion. (JMF)

  9. Bridge Testing With Ground-Based Interferometric Radar: Experimental Results

    International Nuclear Information System (INIS)

    Chiara, P.; Morelli, A.

    2010-01-01

    The research of innovative non-contact techniques aimed at the vibration measurement of civil engineering structures (also for damage detection and structural health monitoring) is continuously directed to the optimization of measures and methods. Ground-Based Radar Interferometry (GBRI) represents the more recent technique available for static and dynamic control of structures and ground movements. Dynamic testing of bridges and buildings in operational conditions are currently performed: (a) to assess the conformity of the structure to the project design at the end of construction; (b) to identify the modal parameters (i.e. natural frequencies, mode shapes and damping ratios) and to check the variation of any modal parameters over the years; (c) to evaluate the amplitude of the structural response to special load conditions (i.e. strong winds, earthquakes, heavy railway or roadway loads). If such tests are carried out by using a non-contact technique (like GBRI), the classical issues of contact sensors (like accelerometers) are easily overtaken. This paper presents and discusses the results of various tests carried out on full-scale bridges by using a Stepped Frequency-Continuous Wave radar system.

  10. Bridge Testing With Ground-Based Interferometric Radar: Experimental Results

    Science.gov (United States)

    Chiara, P.; Morelli, A.

    2010-05-01

    The research of innovative non-contact techniques aimed at the vibration measurement of civil engineering structures (also for damage detection and structural health monitoring) is continuously directed to the optimization of measures and methods. Ground-Based Radar Interferometry (GBRI) represents the more recent technique available for static and dynamic control of structures and ground movements. Dynamic testing of bridges and buildings in operational conditions are currently performed: (a) to assess the conformity of the structure to the project design at the end of construction; (b) to identify the modal parameters (i.e. natural frequencies, mode shapes and damping ratios) and to check the variation of any modal parameters over the years; (c) to evaluate the amplitude of the structural response to special load conditions (i.e. strong winds, earthquakes, heavy railway or roadway loads). If such tests are carried out by using a non-contact technique (like GBRI), the classical issues of contact sensors (like accelerometers) are easily overtaken. This paper presents and discusses the results of various tests carried out on full-scale bridges by using a Stepped Frequency-Continuous Wave radar system.

  11. Price vector effects in choice experiments: an empirical test

    International Nuclear Information System (INIS)

    Hanley, Nick; Wright, Robert E.; Adamowicz, Wiktor

    2005-01-01

    This paper investigates whether the preference and willingness-to-pay estimates obtained from the choice experiment method of estimating non-market values are sensitive to the vector of prices used in the experimental design. We undertake this test in the context of water quality improvements under the European Union's new Water Framework Directive. Using a mixed logit model, which allows for differing scale between the two samples, we find no significant impact of changing the price vector on estimates of preferences or willingness-to-pay. (author) (Choice modelling; Non-market valuation; Design effects; Water Framework Directive)
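
    For readers unfamiliar with how willingness-to-pay estimates are recovered from choice-experiment models like the one above, the sketch below shows the standard ratio calculation, first from fixed coefficients and then over simulated draws of a random coefficient as in a mixed logit. The coefficient values are made up for illustration and are not the estimates from this study.

```python
import numpy as np

# Toy conditional-logit coefficients (hypothetical, for illustration only).
beta_price = -0.08          # marginal utility of price (per currency unit)
beta_quality = 0.65         # marginal utility of a water-quality improvement

# Marginal willingness-to-pay: the money amount whose utility loss exactly
# offsets the utility gain from the attribute change.
wtp_quality = -beta_quality / beta_price
print(f"WTP for quality improvement: {wtp_quality:.2f} currency units")

# With simulated draws for a random (mixed logit) coefficient, WTP is often
# summarized over the draws rather than from a single point estimate.
rng = np.random.default_rng(0)
beta_quality_draws = rng.normal(loc=0.65, scale=0.20, size=10_000)
wtp_draws = -beta_quality_draws / beta_price
print(f"Mean WTP over draws: {wtp_draws.mean():.2f}")
```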

  12. A grand canonical genetic algorithm for the prediction of multi-component phase diagrams and testing of empirical potentials

    International Nuclear Information System (INIS)

    Tipton, William W; Hennig, Richard G

    2013-01-01

    We present an evolutionary algorithm which predicts stable atomic structures and phase diagrams by searching the energy landscape of empirical and ab initio Hamiltonians. Composition and geometrical degrees of freedom may be varied simultaneously. We show that this method utilizes information from favorable local structure at one composition to predict that at others, achieving far greater efficiency of phase diagram prediction than a method which relies on sampling compositions individually. We detail this and a number of other efficiency-improving techniques implemented in the genetic algorithm for structure prediction code that is now publicly available. We test the efficiency of the software by searching the ternary Zr–Cu–Al system using an empirical embedded-atom model potential. In addition to testing the algorithm, we also evaluate the accuracy of the potential itself. We find that the potential stabilizes several correct ternary phases, while a few of the predicted ground states are unphysical. Our results suggest that genetic algorithm searches can be used to improve the methodology of empirical potential design. (paper)

  13. A grand canonical genetic algorithm for the prediction of multi-component phase diagrams and testing of empirical potentials.

    Science.gov (United States)

    Tipton, William W; Hennig, Richard G

    2013-12-11

    We present an evolutionary algorithm which predicts stable atomic structures and phase diagrams by searching the energy landscape of empirical and ab initio Hamiltonians. Composition and geometrical degrees of freedom may be varied simultaneously. We show that this method utilizes information from favorable local structure at one composition to predict that at others, achieving far greater efficiency of phase diagram prediction than a method which relies on sampling compositions individually. We detail this and a number of other efficiency-improving techniques implemented in the genetic algorithm for structure prediction code that is now publicly available. We test the efficiency of the software by searching the ternary Zr-Cu-Al system using an empirical embedded-atom model potential. In addition to testing the algorithm, we also evaluate the accuracy of the potential itself. We find that the potential stabilizes several correct ternary phases, while a few of the predicted ground states are unphysical. Our results suggest that genetic algorithm searches can be used to improve the methodology of empirical potential design.
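
    The algorithm described in these two records searches composition and geometry simultaneously and retains a pool of low-energy candidates from generation to generation. The sketch below is a toy illustration of that grand-canonical genetic-algorithm bookkeeping on a one-dimensional, made-up energy model; it is not the authors' publicly available structure-prediction code, and the energy function merely stands in for an empirical or ab initio Hamiltonian.

```python
import random

# Toy "formation energy" per atom as a function of composition x in A(1-x)B(x).
# This stands in for an empirical or ab initio energy evaluation.
def toy_energy(x):
    return 0.5 * (x - 0.3) ** 2 * (x - 0.8) ** 2 - 0.05 * x

def mutate(x):
    # Grand-canonical move: vary the composition (the A:B ratio).
    return min(1.0, max(0.0, x + random.uniform(-0.1, 0.1)))

def crossover(x1, x2):
    # Combine information from two parents at (possibly) different compositions.
    w = random.random()
    return w * x1 + (1.0 - w) * x2

def ga_search(pop_size=20, generations=100):
    population = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=toy_energy)
        survivors = population[: pop_size // 2]           # keep the fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            children.append(mutate(crossover(p1, p2)))
        population = survivors + children
    return min(population, key=toy_energy)

best = ga_search()
print(f"lowest-energy composition found: x = {best:.3f}")
```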

  14. [Mes differ by positioning: empirical testing of decentralized dynamics of the self].

    Science.gov (United States)

    Mizokami, Shinichi

    2013-10-01

    The present study empirically tested the conceptualization of the decentralized dynamics of the self proposed by Hermans & Kempen (1993), which they developed theoretically and from clinical cases, not from large samples of empirical data. They posited that worldviews and images of the self could vary by positioning even in the same individual, and denied that the ego was an omniscient entity that knew and controlled all aspects of the self (centralized ego). Study 1 tested their conceptualization empirically with 47 university students in an experimental group and 17 as a control group. The results showed that the scores on the Rosenberg's self-esteem scale and images of the Mes in the experimental group significantly varied by positioning, but those in the control group did not. Similar results were found in Study 2 with a sample of 120 university students. These results empirically supported the conceptualization of the decentralized dynamics of the self.

  15. Portuguese validation of the Internet Addiction Test: An empirical study.

    Science.gov (United States)

    Pontes, Halley M; Patrão, Ivone M; Griffiths, Mark D

    2014-06-01

    Research into Internet addiction (IA) has increased greatly over the last decade. Despite its various definitions and general lack of consensus regarding its conceptualisation amongst researchers, instruments for measuring this phenomenon have proliferated in a number of countries. There has been little research on IA in Portugal and this may be partly due to the absence of standardised measurement tools for assessing IA. This study attempted to address this issue by adapting a Portuguese version of the Internet Addiction Test (IAT) via a translation-back translation process and Confirmatory Factor Analysis in a sample of 593 Portuguese students that completed a Portuguese version of the IAT along with questions related to socio-demographic variables. The findings suggested that the IAT appears to be a valid and reliable instrument for measuring IA among Portuguese young adults as demonstrated by its satisfactory psychometric properties. However, the present findings also suggest the need to reword and update some of the IAT's items. Prevalence of IA found in the sample was 1.2% and is discussed alongside findings relating to socio-demographic correlates. Limitations and implications of the present study are also discussed. The present study calls for a reflection of the IAT while also contributing to a better understanding of the basic aspects of IA in the Portuguese community since many health practitioners are starting to realise that Internet use may pose a risk for some individuals.

  16. Test of Weak Form Efficiency: An Empirical Study of the Malaysian Foreign Exchange Market

    OpenAIRE

    Lim, Pei Mun

    2011-01-01

    This paper empirically tests the Efficient Market Hypothesis (EMH) in the weak sense for the Malaysian foreign exchange market. The hypothesis is tested using two ways. First is by testing the random walk hypothesis based on individual unit root test and second is by testing the profitability of simple technical trading rules. The study covers the high frequency daily data from January 1997 to December 2010 and the spot exchange rates are quoted as Malaysian Ringgit per unit of US Dollar. Due...

  17. Recent Ground Hold and Rapid Depressurization Testing of Multilayer Systems

    Science.gov (United States)

    Johnson, Wesley L.

    2014-01-01

    In the development of flight insulation systems for large cryogenic orbital storage (spray-on foam and multilayer insulation), testing needs to include all environments that are experienced during flight. While large efforts have been expended on studying, bounding, and modeling the orbital performance of the insulation systems, little effort has been expended on the ground hold and ascent phases of a mission. Historical cryogenic in-space systems that have flown have been able to ignore these phases of flight due to the insulation system being within a vacuum jacket. In the development phase of the Nuclear Mars Vehicle and the Shuttle Nuclear Vehicle, several insulation systems were evaluated for the full mission cycle. Since that time there had been minimal work on these phases of flight until the Constellation program began investigating cryogenic service modules and long duration upper stages. With the inception of the Cryogenic Propellant Storage and Transfer Technology Demonstration Mission, a specific need was seen for the data, and as such several tests were added to the Cryogenic Boil-off Reduction System liquid hydrogen test matrix to provide more data on an insulation system. Testing was attempted with both gaseous nitrogen (GN2) and gaseous helium (GHe) backfills. The initial tests with nitrogen backfill were not successfully completed due to nitrogen liquefaction and solidification preventing the rapid pumpdown of the vacuum chamber. Subsequent helium backfill tests were successful and showed minimal degradation. The results are compared to the historical data.

  18. Encouraging Sustainable Transport Choices in American Households: Results from an Empirically Grounded Agent-Based Model

    Directory of Open Access Journals (Sweden)

    Davide Natalini

    2013-12-01

    Full Text Available The transport sector needs to go through an extended process of decarbonisation to counter the threat of climate change. Unfortunately, the International Energy Agency forecasts an enormous growth in the number of cars and greenhouse gas emissions by 2050. Two issues can thus be identified: (1) the need for a new methodology that could evaluate the policy performances ex-ante and (2) the need for more effective policies. To help address these issues, we developed an Agent-Based Model called Mobility USA aimed at: (1) testing whether this could be an effective approach in analysing ex-ante policy implementation in the transport sector; and (2) evaluating the effects of alternative policy scenarios on commuting behaviours in the USA. Particularly, we tested the effects of two sets of policies, namely market-based and preference-change ones. The model results suggest that this type of agent-based approach will provide a useful tool for testing policy interventions and their effectiveness.

  19. The microelectronics and photonics test bed (MPTB) space, ground test and modeling experiments

    International Nuclear Information System (INIS)

    Campbell, A.

    1999-01-01

    This paper is an overview of the MPTB (microelectronics and photonics test bed) experiment, a combination of a space experiment, ground test and modeling programs looking at the response of advanced electronic and photonic technologies to the natural radiation environment of space. (author)

  20. Reusable LH2 tank technology demonstration through ground test

    Science.gov (United States)

    Bianca, C.; Greenberg, H. S.; Johnson, S. E.

    1995-01-01

    The paper presents the project plan to demonstrate, by March 1997, the reusability of an integrated composite LH2 tank structure, cryogenic insulation, and thermal protection system (TPS). The plan includes establishment of design requirements and a comprehensive trade study to select the most suitable Reusable Hydrogen Composite Tank System (RHCTS) within the most suitable of 4 candidate structural configurations. The 4 candidate vehicles are winged bodies with the capability to deliver 25,000 lbs of payload to a circular 220 nmi, 51.6 degree inclined orbit (and 40,000 lbs to a 28.5 degree inclined 150 nmi orbit). A prototype design of the selected RHCTS is established to identify the construction, fabrication, and stress simulation and test requirements necessary for an 8 foot diameter tank structure/insulation/TPS test article. A comprehensive development test program supports the 8 foot test article development and involves the composite tank itself, cryogenic insulation, and integrated tank/insulation/TPS designs. The 8 foot diameter tank will contain the integrated cryogenic insulation and TPS designs resulting from this development and that of the concurrent lightweight durable TPS program. Tank ground testing will include 330 cycles of LH2 filling, pressurization, body loading, depressurization, draining, and entry heating.

  1. Empirical ground-motion relations for subduction-zone earthquakes and their application to Cascadia and other regions

    Science.gov (United States)

    Atkinson, G.M.; Boore, D.M.

    2003-01-01

    Ground-motion relations for earthquakes that occur in subduction zones are an important input to seismic-hazard analyses in many parts of the world. In the Cascadia region (Washington, Oregon, northern California, and British Columbia), for example, there is a significant hazard from megathrust earthquakes along the subduction interface and from large events within the subducting slab. These hazards are in addition to the hazard from shallow earthquakes in the overlying crust. We have compiled a response spectra database from thousands of strong-motion recordings from events of moment magnitude (M) 5-8.3 occurring in subduction zones around the world, including both interface and in-slab events. The 2001 M 6.8 Nisqually and 1999 M 5.9 Satsop earthquakes are included in the database, as are many records from subduction zones in Japan (Kyoshin-Net data), Mexico (Guerrero data), and Central America. The size of the database is four times larger than that available for previous empirical regressions to determine ground-motion relations for subduction-zone earthquakes. The large dataset enables improved determination of attenuation parameters and magnitude scaling, for both interface and in-slab events. Soil response parameters are also better determined by the data. We use the database to develop global ground-motion relations for interface and in-slab earthquakes, using a maximum likelihood regression method. We analyze regional variability of ground-motion amplitudes across the global database and find that there are significant regional differences. In particular, amplitudes in Cascadia differ by more than a factor of 2 from those in Japan for the same magnitude, distance, event type, and National Earthquake Hazards Reduction Program (NEHRP) soil class. This is believed to be due to regional differences in the depth of the soil profile, which are not captured by the NEHRP site classification scheme. Regional correction factors to account for these differences are
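
    Ground-motion relations of the kind regressed in this record are functional forms in magnitude and distance fitted to large response-spectra databases. The sketch below fits a deliberately simplified attenuation form to synthetic data by ordinary least squares; the functional form, the coefficients and the use of least squares are illustrative assumptions and do not reproduce the maximum likelihood regression or the interface/in-slab terms of the actual study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "observations": magnitude M, distance R (km), and log10 peak
# ground acceleration generated from an assumed true model plus noise.
n = 500
M = rng.uniform(5.0, 8.3, n)
R = rng.uniform(20.0, 300.0, n)
true_c = np.array([-2.0, 0.6, -1.3, -0.002])     # c0 + c1*M + c2*log10(R) + c3*R
logY = true_c[0] + true_c[1] * M + true_c[2] * np.log10(R) + true_c[3] * R
logY += rng.normal(0.0, 0.25, n)                 # aleatory scatter

# Least-squares fit of the simplified attenuation form.
X = np.column_stack([np.ones(n), M, np.log10(R), R])
coef, *_ = np.linalg.lstsq(X, logY, rcond=None)
print("recovered coefficients:", np.round(coef, 4))

# Predicted median log10(PGA) for an M 8.0 event at 100 km under this toy model.
pred = coef @ np.array([1.0, 8.0, np.log10(100.0), 100.0])
print(f"predicted log10(PGA) at M 8.0, R = 100 km: {pred:.2f}")
```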

  2. Monitoring of natural revegetation of Semipalatinsk nuclear testing ground

    International Nuclear Information System (INIS)

    Sultanova, B.M.

    2002-01-01

    Monitoring of the natural revegetation of the Semipalatinsk test site (STS) was carried out during 1994-2002 at the test areas (Experimental Field, Balapan, Degelen). This paper describes the peculiarities of the vegetation cover of these test areas. The vegetation cover of the Experimental Field ground at the epicentre is completely destroyed. At present, different stages of rehabilitation of the zonal steppe communities can be observed: in zones with γ-irradiation of 11000-14000 μR/h no revegetation is found; on plots with 8200-10000 μR/h rare specimens of Artemisia frigida are found; at 6000-7000 μR/h aggregations of plants are observed; at 80-200 μR/h rarefied bunch-grass communities similar to the zonal steppe are formed; and at 18-25 μR/h zonal bunch-grass communities have developed. The vegetation cover of the Degelen hill tops and of the ground near the tunnel mouths was completely destroyed as a result of the underground nuclear explosions. Three main situations are found there: very stony gallery areas hardly revegetate at all; on the technogenic mounds near the galleries there are single plants, rare field groups and unclosed micro-phytocoenoses of weed and adventive species (Amaranthus retroflexus, Artemisia dracunculus, Lactuca serriola, Chorispora sibirica, etc.); and on the Balapan area revegetation is limited by the high level of radioactive contamination, with coenosis rehabilitation represented by Artemisia marshalliana, Stipa sareptana and Festuca valesiaca. The floristic and phytocoenotic diversity of the transformation of the STS flora is studied, and the pattern of distribution and migration of radionuclides in soils and vegetation cover is presented.

  3. Supervisory Adaptive Network-Based Fuzzy Inference System (SANFIS) Design for Empirical Test of Mobile Robot

    Directory of Open Access Journals (Sweden)

    Yi-Jen Mon

    2012-10-01

    Full Text Available A supervisory Adaptive Network-based Fuzzy Inference System (SANFIS) is proposed for the empirical control of a mobile robot. This controller includes an ANFIS controller and a supervisory controller. The ANFIS controller is off-line tuned by an adaptive fuzzy inference system; the supervisory controller is designed to compensate for the approximation error between the ANFIS controller and the ideal controller, and to drive the trajectory of the system onto a specified surface (called the sliding surface or switching surface) while maintaining the trajectory on this switching surface continuously to guarantee system stability. This SANFIS controller achieves favourable empirical control performance of the mobile robot in tests of driving the mobile robot along a square path. Practical experimental results demonstrate that the proposed SANFIS can achieve better control performance than that achieved using an ANFIS controller alone for empirical control of the mobile robot.
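
    The supervisory part of the controller described above pushes the tracking error onto a sliding (switching) surface and keeps it there, compensating for the ANFIS approximation error. The sketch below illustrates only that supervisory idea on a toy error model; the surface definition, the gains and the stand-in "ANFIS output" are assumptions for illustration and are not the controller reported in the paper.

```python
import numpy as np

LAMBDA = 2.0      # sliding-surface slope (assumed)
ETA = 0.5         # supervisory switching gain (assumed)

def sliding_surface(e, e_dot):
    # s = de/dt + lambda * e : s = 0 defines the switching surface.
    return e_dot + LAMBDA * e

def supervisory_term(s, boundary=0.05):
    # Smoothed sign function (saturation) to reduce chattering near s = 0.
    return -ETA * np.clip(s / boundary, -1.0, 1.0)

def anfis_output_stub(e, e_dot):
    # Stand-in for the off-line tuned ANFIS controller output.
    return -1.5 * e - 0.4 * e_dot

def control(e, e_dot):
    # Total command: ANFIS term plus supervisory compensation.
    s = sliding_surface(e, e_dot)
    return anfis_output_stub(e, e_dot) + supervisory_term(s)

# Quick check: the supervisory term always pushes s back toward zero.
for e, e_dot in [(0.3, 0.0), (-0.2, 0.1), (0.0, -0.4)]:
    s = sliding_surface(e, e_dot)
    print(f"s = {s:+.2f}, supervisory term = {supervisory_term(s):+.2f}, "
          f"total command = {control(e, e_dot):+.2f}")
```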

  4. Ground-Motion Simulations of the 2008 Ms8.0 Wenchuan, China, Earthquake Using Empirical Green's Function Method

    Science.gov (United States)

    Zhang, W.; Zhang, Y.; Yao, X.

    2010-12-01

    On May 12, 2008, a huge earthquake with magnitude Ms8.0 occurred in Wenchuan, Sichuan Province of China. This event was the most devastating earthquake in the mainland of China since the 1976 M7.8 Tangshan earthquake. It resulted in tremendous losses of life and property; about 90,000 persons were killed. Because it occurred in a mountainous area, this great earthquake and the thousands of aftershocks that followed also caused many other geological disasters, such as landslides, mud-rock flows and "quake lakes" formed by landslide-induced damming. This earthquake occurred along the Longmenshan fault, as the result of motion on a northeast-striking reverse fault or thrust fault on the northwestern margin of the Sichuan Basin. The earthquake's epicenter and focal mechanism are consistent with it having occurred as the result of movement on the Longmenshan fault or a tectonically related fault. The earthquake reflects tectonic stresses resulting from the convergence of crustal material slowly moving from the high Tibetan Plateau, to the west, against strong crust underlying the Sichuan Basin and southeastern China. In this study, we simulate the near-field strong ground motions of this great event based on the empirical Green's function (EGF) method. Referring to the published inversion source models, we first assume that there are three asperities on the rupture area and choose three different small events as the EGFs. Then, we identify the parameters of the source model using a genetic algorithm (GA). We calculate the synthetic waveforms based on the obtained source model and compare them with the observed records. Our results show that most of the synthetic waveforms agree very well with the observed ones, which demonstrates the validity and stability of the method. Finally, we forward-model the near-field strong ground motions near the source region and try to explain the damage distribution caused by the great earthquake.
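
    The empirical Green's function method used in this record synthesizes the large-event motion by summing small-event records delayed according to when and where each part of the fault ruptures. The sketch below is a heavily simplified illustration of that superposition with a synthetic wavelet standing in for the small-event record; the subfault grid, rupture velocity, wave speed and scaling are placeholder assumptions, and the corrections of a full Irikura-type summation recipe are omitted.

```python
import numpy as np

DT = 0.01                       # sample interval of the records, s
VR = 2.8                        # assumed rupture velocity, km/s
BETA = 3.5                      # assumed shear-wave speed, km/s

def synthesize(egf, subfault_xy, hypocenter_xy, station_xy, scale):
    """Sum time-shifted copies of a small-event record (the EGF) to
    approximate the motion from a larger rupture made of subfaults."""
    delays = []
    for xy in subfault_xy:
        t_rupture = np.linalg.norm(xy - hypocenter_xy) / VR   # rupture propagation
        t_travel = np.linalg.norm(xy - station_xy) / BETA     # wave travel time
        delays.append(t_rupture + t_travel)
    n_out = len(egf) + int(max(delays) / DT) + 1
    out = np.zeros(n_out)
    for d in delays:
        i = int(round(d / DT))
        out[i:i + len(egf)] += scale * egf
    return out

# Synthetic EGF: a short decaying wavelet standing in for a small-event record.
t = np.arange(0.0, 5.0, DT)
egf = np.exp(-t) * np.sin(2 * np.pi * 2.0 * t)

# A 3 x 3 grid of subfaults (km coordinates), hypocenter at one corner.
subfaults = np.array([[i * 5.0, j * 5.0] for i in range(3) for j in range(3)])
synthetic = synthesize(egf, subfaults, np.array([0.0, 0.0]),
                       np.array([40.0, 10.0]), scale=1.0)
print(f"synthetic record length: {len(synthetic)} samples")
```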

  5. Beam loading and cavity compensation for the ground test accelerator

    International Nuclear Information System (INIS)

    Jachim, S.P.; Natter, E.F.

    1989-01-01

    The Ground Test Accelerator (GTA) will be a heavily beam-loaded H⁻ linac with tight tolerances on accelerating field parameters. The methods used in modeling the effects of beam loading in this machine are described. The response of the cavity to both beam and radio-frequency (RF) drive stimulus is derived, including the effects of cavity detuning. This derivation is not restricted to a small-signal approximation. An analytical method for synthesizing a predistortion network that decouples the amplitude and phase responses of the cavity is also outlined. Simulation of performance, including beam loading, is achieved through use of a control system analysis software package. A straightforward method is presented for extrapolating this work to model large coupled structures with closely spaced parasitic modes. Results to date have enabled the RF control system designs for GTA to be optimized and have given insight into their operation. 6 refs., 10 figs

  6. Commissioning of the Ground Test Accelerator Intertank Matching Section

    International Nuclear Information System (INIS)

    Johnson, K.F.; Sander, O.R.; Atkins, W.H.; Bolme, G.O.; Cole, R.; Connolly, R.; Gilpatrick, J.D.; Ingalls, W.B.; Kersteins, D.; Little, C.; Lohsen, R.A.; Lysenko, W.P.; Mottershead, C.T.; Power, J.; Rusthoi, D.P.; Sandoval, D.P.; Stevens, R.R.; Vaughn, G.; Wadlinger, E.A.; Weiss, R.; Yuan, V.

    1992-01-01

    The Ground Test Accelerator (GTA) has the objective of verifying much of the technology (physics and engineering) required for producing high-brightness, high-current H⁻ beams. GTA commissioning is staged to verify the beam dynamics design of each major accelerator component as it is brought on-line. The commissioning stages are the 35 keV H⁻ injector, the 2.5 MeV Radio Frequency Quadrupole (RFQ), the Intertank Matching Section (IMS), the 3.2 MeV first 2βγ Drift Tube Linac (DTL-1) module, the 8.7 MeV 2βγ DTL (modules 1-5), and the 24 MeV GTA (all 10 DTL modules). Commissioning results from the IMS beam experiments will be presented.

  7. Testing the tests--an empirical evaluation of screening tests for the detection of cognitive impairment in aviators.

    Science.gov (United States)

    Stokes, A F; Banich, M T; Elledge, V C

    1991-08-01

    The FAA has expressed concern that flight safety could be compromised by undetected cognitive impairment in pilots due to conditions such as substance abuse, mental illness, and neuropsychological problems. Interest has been shown in the possibility of adding a brief "mini-mental exam," or a simple automated test-battery to the standard flight medical to screen for such conditions. The research reported here involved the empirical evaluation of two "mini-mental exams," two paper-and-pencil test batteries, and a prototype version of an automated screening battery. Sensitivity, specificity, and positive predictive value were calculated for each sub-task in a discriminant study of 54 pilots and 62 individuals from a heterogeneous clinical population. Results suggest that the "mini-mental exams" are poor candidates for a screening test. The automated battery showed the best discrimination performance, in part because of the incorporation of dual-task tests of divided attention performance. These tests appear to be particularly sensitive to otherwise difficult-to-detect cognitive impairments of a mild or subtle nature. The use of an automated battery of tests as a screening instrument does appear to be feasible in principle, but the practical success of a screening program is heavily dependent upon the actual prevalence of cognitive impairment in the medical applicant population.
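
    Since the record above reports sensitivity, specificity, and positive predictive value for each sub-task, the short sketch below recalls how those three screening metrics follow from a 2x2 confusion matrix. The counts are invented for illustration and are not the study's pilot and clinical data.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a
    2x2 confusion matrix (impaired = 'positive')."""
    sensitivity = tp / (tp + fn)        # impaired cases correctly flagged
    specificity = tn / (tn + fp)        # unimpaired cases correctly passed
    ppv = tp / (tp + fp)                # flagged cases that are truly impaired
    return sensitivity, specificity, ppv

# Hypothetical counts for one sub-task (not the study's data).
sens, spec, ppv = screening_metrics(tp=40, fp=6, fn=22, tn=48)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}, PPV = {ppv:.2f}")
```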

  8. Vision-based Ground Test for Active Debris Removal

    Directory of Open Access Journals (Sweden)

    Seong-Min Lim

    2013-12-01

    Full Text Available Due to continuous space development by mankind, the number of space objects, including space debris, in orbit around the Earth has increased, and difficulties for space development and activities are accordingly expected in the near future. In this study, among the stages of space debris removal, the implementation of a vision-based technique for approaching space debris from a far-range rendezvous state to a proximity state is described, together with the ground test results. For vision-based object tracking, the CAM-shift algorithm, which offers high speed and robust performance, was combined with a Kalman filter. A stereo camera was used to measure the distance to the tracked object. A sun simulator was used to construct a low-cost space-environment simulation test bed, and a two-dimensional mobile robot served as the approach platform. The tracking status was examined while changing the position of the sun simulator, and the results indicated that CAM-shift achieved a tracking rate of about 87% and that the relative distance could be measured down to 0.9 m. In addition, considerations for future space-environment simulation tests are proposed.
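
    The tracking pipeline described above couples a colour-histogram CAM-shift tracker with a Kalman filter on the track-window centre. The OpenCV-based sketch below shows one plausible way to wire the two together; the video file, initial window and noise covariance are assumptions, and the stereo distance measurement used in the paper is not included.

```python
import cv2
import numpy as np

# Constant-velocity Kalman filter for the (x, y) centre of the track window.
kf = cv2.KalmanFilter(4, 2)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * np.float32(1e-3)

cap = cv2.VideoCapture("debris_mockup.avi")          # hypothetical test video
ok, frame = cap.read()
if not ok:
    raise SystemExit("could not read the first frame")

x, y, w, h = 300, 200, 60, 60                         # assumed initial window around the target
roi = frame[y:y + h, x:x + w]
hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
track_window = (x, y, w, h)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)

    # CAM-shift updates the window from the colour back-projection.
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)

    # Kalman filter predicts, then is corrected with the CAM-shift centre.
    prediction = kf.predict()
    cx, cy = rot_rect[0]
    kf.correct(np.array([[np.float32(cx)], [np.float32(cy)]]))
    print(f"predicted centre: ({prediction[0, 0]:.1f}, {prediction[1, 0]:.1f}), "
          f"measured centre: ({cx:.1f}, {cy:.1f})")

cap.release()
```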

  9. A Reliability Test of a Complex System Based on Empirical Likelihood

    OpenAIRE

    Zhou, Yan; Fu, Liya; Zhang, Jun; Hui, Yongchang

    2016-01-01

    To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic of the complex system and extract the limit distribution of the test statistic. Therefore, we can obtain the confidence interval for reliability and make statistical inferences. The simulation studies also demonstrate the theorem results.

  10. The VUV instrument SPICE for Solar Orbiter: performance ground testing

    Science.gov (United States)

    Caldwell, Martin E.; Morris, Nigel; Griffin, Douglas K.; Eccleston, Paul; Anderson, Mark; Pastor Santos, Carmen; Bruzzi, Davide; Tustain, Samuel; Howe, Chris; Davenne, Jenny; Grundy, Timothy; Speight, Roisin; Sidher, Sunil D.; Giunta, Alessandra; Fludra, Andrzej; Philippon, Anne; Auchere, Frederic; Hassler, Don; Davila, Joseph M.; Thompson, William T.; Schuehle, Udo H.; Meining, Stefan; Walls, Buddy; Phelan, P.; Dunn, Greg; Klein, Roman M.; Reichel, Thomas; Gyo, Manfred; Munro, Grant J.; Holmes, William; Doyle, Peter

    2017-08-01

    SPICE is an imaging spectrometer operating at vacuum ultraviolet (VUV) wavelengths, 70.4 - 79.0 nm and 97.3 - 104.9 nm. It is a facility instrument on the Solar Orbiter mission, which carries 10 science instruments in all, to make observations of the Sun's atmosphere and heliosphere at close proximity to the Sun, i.e. to 0.28 AU at perihelion. SPICE's role is to make VUV measurements of plasma in the solar atmosphere. SPICE is designed to achieve spectral imaging at a spectral resolution >1500, a spatial resolution of several arcsec, and a two-dimensional FOV of 11 x 16 arcmin. The many strong constraints on the instrument design imposed by the mission requirements prevent the imaging performance from exceeding that of previous instruments, but being closer to the Sun gives a gain in spatial resolution. The price which is paid is the harsher environment, particularly thermal. This leads to some novel features in the design, which needed to be proven by ground test programs. These include a dichroic solar-transmitting primary mirror to dump the solar heat, a high in-flight temperature (60 °C) and gradients in the optics box, and a bespoke variable-line-spacing grating to minimise the number of reflective components used. The tests culminate in the system-level test of VUV imaging performance and pointing stability. We will describe how our dedicated facility, with heritage from previous solar instruments, is used to make these tests, and show the results, firstly on the Engineering Model of the optics unit, and more recently on the Flight Model.

  11. Motor sport in France: testing-ground for the world.

    Science.gov (United States)

    Cofaigh, Eamon O

    2011-01-01

    The birth of the automobile in the late nineteenth century was greeted with a mixture of awe, scepticism and sometimes even disdain from sections of the European public. In this article, the steps taken in France to pioneer and promote this new invention are examined. Unreliable and noisy, the early automobile owes a debt of gratitude to the French aristocracy who organised and codified motor racing in an effort to test these new inventions while at the same time introduce them to a wider public. City-to-city races demonstrated the potential of the automobile before the initiative of Gordon Bennett proved to be the catalyst for the birth of international motor sport as we recognise it today. Finally this article looks at the special connection between Le Mans and the automobile. Le Mans has, through its 24-hour race, maintained a strong link with the development of everyday automobile tourism and offers the enthusiast an alternative to the machines that reach incredible speeds on modern-day closed circuits. This article examines how French roads were veritable testing grounds for the earliest cars and how the public roads of Le Mans maintain the tradition to this day.

  12. DIF Testing with an Empirical-Histogram Approximation of the Latent Density for Each Group

    Science.gov (United States)

    Woods, Carol M.

    2011-01-01

    This research introduces, illustrates, and tests a variation of IRT-LR-DIF, called EH-DIF-2, in which the latent density for each group is estimated simultaneously with the item parameters as an empirical histogram (EH). IRT-LR-DIF is used to evaluate the degree to which items have different measurement properties for one group of people versus…

  13. Understanding users’ motivations to engage in virtual worlds: A multipurpose model and empirical testing

    NARCIS (Netherlands)

    Verhagen, T.; Feldberg, J.F.M.; van den Hooff, B.J.; Meents, S.; Merikivi, J.

    2012-01-01

    Despite the growth and commercial potential of virtual worlds, relatively little is known about what drives users' motivations to engage in virtual worlds. This paper proposes and empirically tests a conceptual model aimed at filling this research gap. Given the multipurpose nature of virtual worlds

  14. General Strain Theory and School Bullying: An Empirical Test in South Korea

    Science.gov (United States)

    Moon, Byongook; Morash, Merry; McCluskey, John D.

    2012-01-01

    Despite recognition of bullying as a serious school and social problem with negative effects on students' well-being and safety, and the overlap between aggressive bullying acts and delinquent behavior, few empirical studies test the applicability of criminological theories to explaining bullying. This limitation in research is especially evident…

  15. Empirical Investigation of Job Applicants' Reactions to Taking a Pre-Employment Honesty Test.

    Science.gov (United States)

    Jones, John W.; Joy, Dennis

    Employee theft is widespread and difficult to detect. Many companies have attempted to control the employee theft problem through pre-employment screening. The use of paper-and-pencil honesty tests in this process has become increasingly common. These two studies empirically investigated job applicants' (N=450) reactions to taking a pre-employment…

  16. An empirical test of stage models of e-government development: evidence from Dutch municipalities

    NARCIS (Netherlands)

    Rooks, G.; Matzat, U.; Sadowski, B.M.

    2017-01-01

    In this article we empirically test stage models of e-government development. We use Lee's classification to make a distinction between four stages of e-government: informational, requests, personal, and e-democracy. We draw on a comprehensive data set on the adoption and development of e-government

  17. An empirical test of reference price theories using a semiparametric approach

    DEFF Research Database (Denmark)

    Boztug, Yasemin; Hildebrandt, Lutz

      In this paper we estimate and empirically test different behavioral theories of consumer reference price formation. Two major theories are proposed to model the reference price reaction: assimilation contrast theory and prospect theory. We assume that different consumer segments will use...

  18. Embracing Safe Ground Test Facility Operations and Maintenance

    Science.gov (United States)

    Dunn, Steven C.; Green, Donald R.

    2010-01-01

    Conducting integrated operations and maintenance in wind tunnel ground test facilities requires a balance of meeting due dates, efficient operation, responsiveness to the test customer, data quality, effective maintenance (relating to readiness and reliability), and personnel and facility safety. Safety is non-negotiable, so the balance must be an "and" with other requirements and needs. Pressure to deliver services faster at increasing levels of quality in under-maintained facilities is typical. A challenge for management is to balance the "need for speed" with safety and quality. It is especially important to communicate this balance across the organization - workers, with a desire to perform, can be tempted to cut corners on defined processes to increase speed. A lean staff can extend the time required for pre-test preparations, and both the safety of facility personnel and the stewardship of expensive national capabilities can be put at risk by one well-intending person engaging in at-risk behavior. This paper documents a specific, though typical, operational environment and cites management and worker safety initiatives and tools used to provide a safe work environment. Results are presented and clearly show that the work environment is a relatively safe one, though still not good enough to prevent all injuries. So, the journey to a zero-injury work environment - both in measured reality and in the minds of each employee - continues. The intent of this paper is to provide a benchmark for others with similar operational environments and to stimulate additional sharing and discussion on having and keeping a safe work environment.

  19. An empirical test of Maslow's theory of need hierarchy using hologeistic comparison by statistical sampling.

    Science.gov (United States)

    Davis-Sharts, J

    1986-10-01

    Maslow's hierarchy of basic human needs provides a major theoretical framework in nursing science. The purpose of this study was to empirically test Maslow's need theory, specifically at the levels of physiological and security needs, using a hologeistic comparative method. Thirty cultures taken from the 60 cultural units in the Health Relations Area Files (HRAF) Probability Sample were found to have data available for examining hypotheses about thermoregulatory (physiological) and protective (security) behaviors practiced prior to sleep onset. The findings demonstrate there is initial worldwide empirical evidence to support Maslow's need hierarchy.

  20. Project size and common pool size: An empirical test using Danish municipal mergers

    DEFF Research Database (Denmark)

    Hansen, Sune Welling

    The paper examines the proposition that project size tends to increase with common pool size from the law of 1 over n (Weingast et al., 1981). This remains under-investigated, and a recent study conducted by Primo & Snyder (2008) argues, and empirically substantiates, a reverse law of 1 over n...... across two research designs, two outcome variables, two subsamples, and several model specifications. The implications of the findings, combined with the limited potential for empirically testing Primo & Snyder’s alternative model, suggest a re-appreciation of the law of 1 over n as it was originally...

  1. Some Preliminary Notes on an Empirical Test of Freud’s Theory on Depression

    Science.gov (United States)

    Desmet, Mattias

    2012-01-01

    A review of the literature indicates that empirical researchers have difficulty translating Freud’s theory on depression into appropriate research questions and hypotheses. In their attempt to do so, the level of complexity in Freud’s work is often lost. As a result, what is empirically tested is no more than a caricature of the original theory. To help researchers avoid such problems, this study presents a conceptual analysis of Freud’s theory of depression as it is presented in Mourning and Melancholia (Freud, 1917). In analyzing Freud’s theory on the etiology of depression, it is essential to differentiate between (1) an identification with the satisfying and frustrating aspects of the love object, (2) the inter- and intrapersonal loss of the love object, and (3) conscious and unconscious dynamics. A schematic representation of the mechanism of depression is put forward and a research design by which this schema can be empirically investigated is outlined. PMID:23675357

  2. Empirical Statistical Power for Testing Multilocus Genotypic Effects under Unbalanced Designs Using a Gibbs Sampler

    Directory of Open Access Journals (Sweden)

    Chaeyoung Lee

    2012-11-01

    Full Text Available Epistasis that may explain a large portion of the phenotypic variation for complex economic traits of animals has been ignored in many genetic association studies. A Bayesian method was introduced to draw inferences about multilocus genotypic effects based on their marginal posterior distributions by a Gibbs sampler. A simulation study was conducted to provide statistical powers under various unbalanced designs by using this method. Data were simulated by combined designs of number of loci, within genotype variance, and sample size in unbalanced designs with or without null combined genotype cells. Mean empirical statistical power was estimated for testing the posterior mean estimate of the combined genotype effect. A practical example for obtaining empirical statistical power estimates with a given sample size was provided under unbalanced designs. The empirical statistical powers would be useful for determining an optimal design when interactive associations of multiple loci with complex phenotypes were examined.
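    The power estimation described above lends itself to a simple Monte Carlo illustration. The sketch below is a simplified frequentist stand-in (a one-way ANOVA across unbalanced combined-genotype cells), not the Bayesian Gibbs-sampler procedure of the paper; all cell means, sizes and settings are hypothetical.

```python
# Illustrative sketch (not the paper's Bayesian Gibbs-sampler procedure):
# Monte Carlo estimate of empirical power for detecting a combined-genotype
# effect under an unbalanced design, using a one-way ANOVA as a stand-in test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def empirical_power(cell_means, cell_sizes, sigma=1.0, alpha=0.05, n_sim=2000):
    """Proportion of simulated datasets in which the test rejects H0."""
    rejections = 0
    for _ in range(n_sim):
        groups = [rng.normal(m, sigma, size=n)            # unbalanced cell sizes
                  for m, n in zip(cell_means, cell_sizes) if n > 0]
        _, p = stats.f_oneway(*groups)
        rejections += p < alpha
    return rejections / n_sim

# Example: 9 two-locus genotype cells, one empty (null) cell, unequal sizes.
means = [0, 0, 0, 0, 0.5, 0, 0, 0, 1.0]        # hypothetical genotype effects
sizes = [30, 5, 12, 8, 40, 3, 0, 15, 6]        # unbalanced design with a null cell
print(f"empirical power ~ {empirical_power(means, sizes):.3f}")
```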

  3. Some Preliminary Notes on an Empirical Test of Freud’s Theory on Depression

    Directory of Open Access Journals (Sweden)

    Mattias eDesmet

    2013-05-01

    Full Text Available A review of the literature indicates that empirical researchers have difficulty translating Freud’s theory on depression into appropriate research questions and hypotheses. In their attempt to do so, the level of complexity in Freud’s work is often lost. As a result, what is empirically tested is no more than a caricature of the original theory. To help researchers avoid such problems, this study presents a conceptual analysis of Freud’s theory of depression as it is presented in Mourning and Melancholia (Freud, 1917). In analyzing Freud’s theory on the etiology of depression, it is essential to differentiate between (1) an identification with the satisfying and frustrating aspects of the love object, (2) the inter- and intrapersonal loss of the love object, and (3) conscious and unconscious dynamics. A schematic representation of the mechanism of depression is put forward and a research design by which this schema can be empirically investigated is outlined.

  4. An empirical test of pseudo random number generators by means of an exponential decaying process

    International Nuclear Information System (INIS)

    Coronel B, H.F.; Hernandez M, A.R.; Jimenez M, M.A.; Mora F, L.E.

    2007-01-01

    Empirical tests for pseudo random number generators based on the use of processes or physical models have been successfully used and are considered as complementary to theoretical tests of randomness. In this work a statistical methodology for evaluating the quality of pseudo random number generators is presented. The method is illustrated in the context of the so-called exponential decay process, using some pseudo random number generators commonly used in physics. (Author)

  5. Lessons learned on the Ground Test Accelerator control system

    International Nuclear Information System (INIS)

    Kozubal, A.J.; Weiss, R.E.

    1994-01-01

    When we initiated the control system design for the Ground Test Accelerator (GTA), we envisioned a system that would be flexible enough to handle the changing requirements of an experimental project. This control system would use a developers' toolkit to reduce the cost and time to develop applications for GTA, and through the use of open standards, the system would accommodate unforeseen requirements as they arose. Furthermore, we would attempt to demonstrate on GTA a level of automation far beyond that achieved by existing accelerator control systems. How well did we achieve these goals? What were the stumbling blocks to deploying the control system, and what assumptions did we make about requirements that turned out to be incorrect? In this paper we look at the process of developing a control system that evolved into what is now the "Experimental Physics and Industrial Control System" (EPICS). Also, we assess the impact of this system on the GTA project, as well as the impact of GTA on EPICS. The lessons learned on GTA will be valuable for future projects.

  6. Empirical investigation of purchasing power parity for Turkey: Evidence from recent nonlinear unit root tests

    Directory of Open Access Journals (Sweden)

    Dilem Yıldırım

    2017-06-01

    Full Text Available This study explores the empirical validity of the purchasing power parity (PPP) hypothesis between Turkey and its four major trading partners, the European Union, Russia, China and the US. Accounting for the nonlinear nature of real exchange rates, we employ a battery of recently developed nonlinear unit root tests. Our empirical results reveal that nonlinear unit root tests deliver stronger evidence in favour of the PPP hypothesis when compared to the conventional unit root tests only if nonlinearities in real exchange rates are correctly specified. Furthermore, it emerges from our findings that the real exchange rates of the countries having a free trade agreement are more likely to behave as linear stationary processes.
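    The nonlinear (e.g., ESTAR-type) unit root tests favoured in the study are not part of standard Python libraries, so the hedged sketch below shows only the conventional linear ADF baseline that such studies compare against, applied to a simulated real exchange rate series.

```python
# A minimal sketch of the conventional (linear) ADF unit root test used as a
# baseline in PPP studies; the nonlinear tests favoured in the abstract are
# not shown here.  The series is simulated, not actual exchange rate data.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
# Hypothetical log real exchange rate: a slowly mean-reverting AR(1) process.
q = np.zeros(300)
for t in range(1, 300):
    q[t] = 0.97 * q[t - 1] + rng.normal(scale=0.02)

stat, pvalue, *_ = adfuller(q, regression="c", autolag="AIC")
print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")
# Rejecting the unit root (small p-value) supports PPP, i.e. a stationary
# real exchange rate; failure to reject is evidence against it.
```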

  7. X-36 on Ground after Radio and Telemetry Tests

    Science.gov (United States)

    1996-01-01

    A UH-1 helicopter lowers the X-36 Tailless Fighter Agility Research Aircraft to the ground after radio frequency and telemetry tests above Rogers Dry Lake at NASA Dryden Flight Research Center, Edwards, California, in November 1996. The purpose of taking the X-36 aloft for the radio and telemetry system checkouts was to test the systems more realistically while airborne. More taxi and radio frequency tests were conducted before the aircraft's first flight in early 1997. The NASA/Boeing X-36 Tailless Fighter Agility Research Aircraft program successfully demonstrated the tailless fighter design using advanced technologies to improve the maneuverability and survivability of possible future fighter aircraft. The program met or exceeded all project goals. For 31 flights during 1997 at the Dryden Flight Research Center, Edwards, California, the project team examined the aircraft's agility at low speed / high angles of attack and at high speed / low angles of attack. The aircraft's speed envelope reached up to 206 knots (234 mph). This aircraft was very stable and maneuverable. It handled very well. The X-36 vehicle was designed to fly without the traditional tail surfaces common on most aircraft. Instead, a canard forward of the wing was used as well as split ailerons and an advanced thrust-vectoring nozzle for directional control. The X-36 was unstable in both pitch and yaw axes, so an advanced, single-channel digital fly-by-wire control system (developed with some commercially available components) was put in place to stabilize the aircraft. Using a video camera mounted in the nose of the aircraft and an onboard microphone, the X-36 was remotely controlled by a pilot in a ground station virtual cockpit. A standard fighter-type head-up display (HUD) and a moving-map representation of the vehicle's position within the range in which it flew provided excellent situational awareness for the pilot. This pilot-in-the-loop approach eliminated the need for expensive and

  8. Integrated Ground Operations Demonstration Units Testing Plans and Status

    Science.gov (United States)

    Johnson, Robert G.; Notardonato, William U.; Currin, Kelly M.; Orozco-Smith, Evelyn M.

    2012-01-01

    Cryogenic propellant loading operations with their associated flight and ground systems are some of the most complex, critical activities in launch operations. Consequently, these systems and operations account for a sizeable portion of the life cycle costs of any launch program. NASA operations for handling cryogens in ground support equipment have not changed substantially in 50 years, despite advances in cryogenics, system health management and command and control technologies. This project was developed to mature, integrate and demonstrate advancement in the current state of the art in these areas using two distinct integrated ground operations demonstration units (GODU): GODU Integrated Refrigeration and Storage (IRAS) and GODU Autonomous Control

  9. Ground testing and simulation. II - Aerodynamic testing and simulation: Saving lives, time, and money

    Science.gov (United States)

    Dayman, B., Jr.; Fiore, A. W.

    1974-01-01

    The present work discusses in general terms the various kinds of ground facilities, in particular, wind tunnels, which support aerodynamic testing. Since not all flight parameters can be simulated simultaneously, an important problem consists in matching parameters. It is pointed out that there is a lack of wind tunnels for a complete Reynolds-number simulation. Using a computer to simulate flow fields can result in considerable reduction of wind-tunnel hours required to develop a given flight vehicle.

  10. Neutralizer Hollow Cathode Simulations and Comparisons with Ground Test Data

    Science.gov (United States)

    Mikellides, Ioannis G.; Snyder, John S.; Goebel, Dan M.; Katz, Ira; Herman, Daniel A.

    2009-01-01

    The fidelity of electric propulsion physics-based models depends largely on the validity of their predictions over a range of operating conditions and geometries. In general, increased complexity of the physics requires more extensive comparisons with laboratory data to identify the region(s) that lie outside the validity of the model assumptions and to quantify the uncertainties within its range of application. This paper presents numerical simulations of neutralizer hollow cathodes at various operating conditions and orifice sizes. The simulations were performed using a two-dimensional axisymmetric model that solves numerically a relatively extensive system of conservation laws for the partially ionized gas in these devices. A summary of the comparisons between simulation results and Langmuir probe measurements is provided. The model has also been employed to provide insight into recent ground test observations of the neutralizer cathode in NEXT. It is found that a likely cause of the observed keeper voltage drop is cathode orifice erosion. However, due to the small magnitude of this change, approximately 0.5 V (less than 5% of the beginning-of-life value) over 10 khrs, and in light of the large uncertainties in the cathode material sputtering yield at low ion energies, other causes cannot be excluded. Preliminary simulations to understand transition to plume mode suggest that in the range of 3-5 sccm the existing 2-D model reproduces fairly well the rise of the keeper voltage in the NEXT neutralizer as observed in the laboratory. At lower flow rates the simulation produces oscillations in the keeper current and voltage that require prohibitively small time-steps to resolve with the existing algorithms.

  11. The impact of leverage on stock returns: an empirical test on the Australian stock market

    OpenAIRE

    Thuy Linh, Doan

    2009-01-01

    Asset pricing is no longer a new topic in theoretical finance, but it continues to attract researchers' interest. The role of firm characteristics in explaining stock returns has become more and more significant in empirical studies. The Fama-French three-factor model is the best-known model for testing the effects of firm characteristics, namely the size effect and the book-to-market effect, on stock returns. However, this model does not include leverage, one of the most important firm characteristics. Starti...

  12. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    Science.gov (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
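    As a rough illustration of the kind of calculation such calculators perform, the sketch below applies a single-sided "k-method" variables acceptance decision and estimates its probability of acceptance by Monte Carlo; the plan parameters and the non-normal lot distribution are hypothetical, not those evaluated in the assessment.

```python
# A hedged sketch of a single-sided "k-method" variables acceptance decision
# and a Monte Carlo check of its probability of acceptance -- the kind of
# empirical verification described in the abstract.  Plan parameters (n, k)
# and the lognormal population below are hypothetical.
import numpy as np

rng = np.random.default_rng(2)

def accept_lot(sample, usl, k):
    """Accept if the standardized margin to the upper spec limit is >= k."""
    xbar, s = sample.mean(), sample.std(ddof=1)
    return (usl - xbar) / s >= k

def empirical_prob_accept(draw_lot, n, usl, k, n_sim=20000):
    """Monte Carlo estimate of the plan's probability of acceptance."""
    accepted = sum(accept_lot(draw_lot(n), usl, k) for _ in range(n_sim))
    return accepted / n_sim

n, k, usl = 30, 1.69, 10.0                                    # hypothetical plan
draw = lambda m: rng.lognormal(mean=1.8, sigma=0.25, size=m)  # non-normal lot
print(f"empirical P(accept) ~ {empirical_prob_accept(draw, n, usl, k):.3f}")
```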

  13. The Information Content of Financial and Economic Variables: Empirical Tests of Information Variables in Japan

    OpenAIRE

    Kengo Kato

    1991-01-01

    The main topic of this paper is "information variables" (or "indicators") of monetary policy, which work as criteria for setting the direction of monetary policy. After briefly surveying the notion and candidates of information variables, according to the studies mainly in the United States, empirical tests using Japan's data are conducted. It can be said that some information variables seem to be useful, but the results are mixed in general.

  14. Ground test facilities for evaluating nuclear thermal propulsion engines and fuel elements

    International Nuclear Information System (INIS)

    Allen, G.C.; Beck, D.F.; Harmon, C.D.; Shipers, L.R.

    1992-01-01

    Interagency panels evaluating nuclear thermal propulsion development options have consistently recognized the need for constructing a major new ground test facility to support fuel element and engine testing. This paper summarizes the requirements, configuration, and design issues of a proposed ground test complex for evaluating nuclear thermal propulsion engines and fuel elements being developed for the Space Nuclear Thermal Propulsion (SNTP) program. 2 refs

  15. Flow Quality Analysis of Shape Morphing Structures for Hypersonic Ground Testing Applications

    Data.gov (United States)

    National Aeronautics and Space Administration — Background: Shape morphing, high temperature, ceramic structural materials are now becoming available and can revolutionize ground testing by providing dynamic flow...

  16. AMS Ground Truth Measurements: Calibration and Test Lines

    International Nuclear Information System (INIS)

    Wasiolek, P.

    2013-01-01

    Airborne gamma spectrometry is one of the primary techniques used to define the extent of ground contamination after a radiological incident. Its usefulness was demonstrated extensively during the response to the Fukushima nuclear power plant (NPP) accident in March-May 2011. To map ground contamination a set of scintillation detectors is mounted on an airborne platform (airplane or helicopter) and flown over contaminated areas. The acquisition system collects spectral information together with the aircraft position and altitude every second. To provide useful information to decision makers, the count rate data expressed in counts per second (cps) needs to be converted to the terrestrial component of the exposure rate 1 m above ground, or surface activity of isotopes of concern. This is done using conversion coefficients derived from calibration flights. During a large scale radiological event, multiple flights may be necessary and may require use of assets from different agencies. However, as the production of a single, consistent map product depicting the ground contamination is the primary goal, it is critical to establish very early into the event a common calibration line. Such a line should be flown periodically in order to normalize data collected from different aerial acquisition systems and potentially flown at different flight altitudes and speeds. In order to verify and validate individual aerial systems, the calibration line needs to be characterized in terms of ground truth measurements. This is especially important if the contamination is due to short-lived radionuclides. The process of establishing such a line, as well as necessary ground truth measurements, is described in this document.

  17. AMS Ground Truth Measurements: Calibrations and Test Lines

    Energy Technology Data Exchange (ETDEWEB)

    Wasiolek, Piotr T. [National Security Technologies, LLC

    2015-12-01

    Airborne gamma spectrometry is one of the primary techniques used to define the extent of ground contamination after a radiological incident. Its usefulness was demonstrated extensively during the response to the Fukushima NPP accident in March-May 2011. To map ground contamination, a set of scintillation detectors is mounted on an airborne platform (airplane or helicopter) and flown over contaminated areas. The acquisition system collects spectral information together with the aircraft position and altitude every second. To provide useful information to decision makers, the count data, expressed in counts per second (cps), need to be converted to a terrestrial component of the exposure rate at 1 meter (m) above ground, or surface activity of the isotopes of concern. This is done using conversion coefficients derived from calibration flights. During a large-scale radiological event, multiple flights may be necessary and may require use of assets from different agencies. However, because production of a single, consistent map product depicting the ground contamination is the primary goal, it is critical to establish a common calibration line very early into the event. Such a line should be flown periodically in order to normalize data collected from different aerial acquisition systems and that are potentially flown at different flight altitudes and speeds. In order to verify and validate individual aerial systems, the calibration line needs to be characterized in terms of ground truth measurements. This is especially important if the contamination is due to short-lived radionuclides. The process of establishing such a line, as well as necessary ground truth measurements, is described in this document.
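    As a purely illustrative sketch of the conversion step described above, the function below applies an exponential altitude correction to the net count rate and multiplies by a calibration-derived conversion factor; the numerical coefficients are hypothetical placeholders, not AMS calibration values.

```python
# Illustrative only: the general form of converting an aerial net count rate
# to terrestrial exposure rate at 1 m above ground, using an exponential
# altitude correction and a conversion factor of the kind derived from
# calibration-line flights.  All coefficients below are hypothetical.
import math

def exposure_rate_uR_per_h(gross_cps, background_cps, altitude_m,
                           ref_altitude_m=50.0, air_atten_per_m=0.007,
                           conversion_uR_per_h_per_cps=0.0025):
    """Net count rate, normalized to the reference altitude, times a
    calibration-derived conversion factor."""
    net = max(gross_cps - background_cps, 0.0)
    # Correct the measured rate back to the reference survey altitude.
    net_at_ref = net * math.exp(air_atten_per_m * (altitude_m - ref_altitude_m))
    return net_at_ref * conversion_uR_per_h_per_cps

print(exposure_rate_uR_per_h(gross_cps=5200, background_cps=800, altitude_m=80))
```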

  18. The development and testing of pulsed detonation engine ground demonstrators

    Science.gov (United States)

    Panicker, Philip Koshy

    2008-10-01

    The successful implementation of a PDE running on fuel and air mixtures will require fast-acting fuel-air injection and mixing techniques, detonation initiation techniques such as DDT enhancing devices or a pre-detonator, an effective ignition system that can sustain repeated firing at high rates, and a fast, capable, closed-loop control system. The control system requires high-speed transducers for real-time monitoring of the PDE and the detection of the detonation wave speed. It is widely accepted that the detonation properties predicted by C-J detonation relations are fairly accurate in comparison to experimental values. The post-detonation flow properties can also be expressed as a function of wave speed or Mach number. Therefore, the PDE control system can use C-J relations to predict the post-detonation flow properties based on measured initial conditions and compare the values with those obtained from using the wave speed. The controller can then vary the initial conditions within the combustor for the subsequent cycle, by modulating the frequency and duty cycle of the valves, to obtain optimum air and fuel flow rates, as well as modulate the energy and timing of the ignition to achieve the required detonation properties. Five different PDE ground demonstrators were designed, built and tested to study a number of the required sub-systems. This work presents a review of all the systems that were tested, along with suggestions for their improvement. The PDE setups ranged from a compact PDE with a 19 mm (3/4 in.) i.d., to two 25 mm (1 in.) i.d. setups, to a 101 mm (4 in.) i.d. dual-stage PDE setup with a pre-detonator. Propane-oxygen mixtures were used in the smaller PDEs. In the dual-stage PDE, propane-oxygen was used in the pre-detonator, while propane-air mixtures were used in the main combustor. Both rotary valves and solenoid valve injectors were studied. The rotary valve setups were tested at 10 Hz, while the solenoid valves were tested at up to 30 Hz.
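    As a hedged illustration of how C-J relations can turn a measured wave speed into post-detonation flow properties, the sketch below uses the textbook one-gamma ideal-gas Chapman-Jouguet relations; real C-J calculations use equilibrium chemistry, and the fill conditions and gas properties shown are illustrative only.

```python
# A minimal sketch of the one-gamma ideal-gas Chapman-Jouguet relations that a
# PDE controller could evaluate in real time from a measured wave speed.  Full
# C-J calculations use equilibrium chemistry; this is the textbook
# strong-detonation approximation with a single, constant gamma.
import math

def cj_state(wave_speed, p1, T1, gamma=1.2, R=296.0):
    """Post-detonation (C-J) pressure and temperature from the wave speed.

    p1 [Pa], T1 [K] are the pre-detonation fill conditions; R [J/(kg K)] is the
    mixture gas constant (values here are illustrative, not measured).
    """
    a1 = math.sqrt(gamma * R * T1)          # speed of sound in the unburned gas
    M1 = wave_speed / a1                    # detonation Mach number
    p2 = p1 * (1.0 + gamma * M1**2) / (1.0 + gamma)
    rho_ratio = (1.0 + gamma) * M1**2 / (1.0 + gamma * M1**2)   # rho2/rho1
    T2 = T1 * (p2 / p1) / rho_ratio
    return M1, p2, T2

M1, p2, T2 = cj_state(wave_speed=1800.0, p1=101325.0, T1=300.0)
print(f"M_CJ = {M1:.2f}, p2 = {p2/1e5:.1f} bar, T2 = {T2:.0f} K")
```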

  19. Probabilistic income-maximizing behavior in regional migration: an empirical test.

    Science.gov (United States)

    Weinstein, R I; Evans, R D

    1980-01-01

    "This paper provides an empirical test of the hypothesis that migrants consider income risk in their evaluation of the returns to migration in that higher levels of income risk for a given region reduce the rate of net migration into that region." Theoretical approaches to the incorporation of income risk in migration models are first considered, and alternative approaches are then tested using data on white and nonwhite net migration in the United States between 1960 and 1970. The results indicate support for the hypothesis that the extent of net migration is inversely affected by income risk. excerpt

  20. Tacit knowledge: A refinement and empirical test of the Academic Tacit Knowledge Scale.

    Science.gov (United States)

    Insch, Gary S; McIntyre, Nancy; Dawley, David

    2008-11-01

    Researchers have linked tacit knowledge to improved organizational performance, but research on how to measure tacit knowledge is scarce. In the present study, the authors proposed and empirically tested a model of tacit knowledge and an accompanying measurement scale of academic tacit knowledge. They present 6 hypotheses that support the proposed tacit knowledge model regarding the role of cognitive (self-motivation, self-organization); technical (individual task, institutional task); and social (task-related, general) skills. The authors tested these hypotheses with 542 responses to the Academic Tacit Knowledge Scale, which included the respondents' grade point average (the performance variable). All 6 hypotheses were supported.

  1. A nonparametric empirical Bayes framework for large-scale multiple testing.

    Science.gov (United States)

    Martin, Ryan; Tokdar, Surya T

    2012-07-01

    We propose a flexible and identifiable version of the 2-groups model, motivated by hierarchical Bayes considerations, that features an empirical null and a semiparametric mixture model for the nonnull cases. We use a computationally efficient predictive recursion (PR) marginal likelihood procedure to estimate the model parameters, even the nonparametric mixing distribution. This leads to a nonparametric empirical Bayes testing procedure, which we call PRtest, based on thresholding the estimated local false discovery rates. Simulations and real data examples demonstrate that, compared to existing approaches, PRtest's careful handling of the nonnull density can give a much better fit in the tails of the mixture distribution which, in turn, can lead to more realistic conclusions.
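    A much-simplified sketch of the two-groups local false discovery rate thresholding underlying PRtest is given below; it estimates the marginal density with a Gaussian kernel rather than predictive recursion and assumes a plug-in null proportion, so it is illustrative only.

```python
# A simplified sketch of two-groups local false discovery rate thresholding.
# PRtest estimates the mixture via predictive recursion; here the marginal
# density is estimated with a Gaussian kernel instead, and pi0 is a plug-in
# guess, so this is illustrative rather than the paper's procedure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Simulated z-values: 95% null N(0,1), 5% non-null shifted to the right.
z = np.concatenate([rng.normal(0, 1, 9500), rng.normal(3, 1, 500)])

pi0 = 0.95                                   # plug-in null proportion (assumed)
f0 = stats.norm.pdf(z)                       # theoretical null density
f = stats.gaussian_kde(z)(z)                 # estimated marginal mixture density
lfdr = np.clip(pi0 * f0 / f, 0, 1)           # local false discovery rate

discoveries = np.flatnonzero(lfdr < 0.20)    # threshold the estimated lfdr
print(f"{discoveries.size} cases flagged at lfdr < 0.20")
```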

  2. Seismic Data for Evaluation of Ground Motion Hazards in Las Vegas in Support of Test Site Readiness Ground Motion

    Energy Technology Data Exchange (ETDEWEB)

    Rodgers, A

    2008-01-16

    In this report we describe the data sets used to evaluate ground motion hazards in Las Vegas from nuclear tests at the Nevada Test Site. This analysis is presented in Rodgers et al. (2005, 2006) and includes 13 nuclear explosions recorded at the John Blume and Associates network, the Little Skull Mountain earthquake and a temporary deployment of broadband stations in Las Vegas. The data are available in SAC format on CD-ROM as an appendix to this report.

  3. Strategy for a Rock Mechanics Site Descriptive Model. Development and testing of the empirical approach

    International Nuclear Information System (INIS)

    Roeshoff, Kennert; Lanaro, Flavio; Lanru Jing

    2002-05-01

    This report presents the results of one part of a wider project to establish a methodology for determining the rock mechanics properties of the rock mass for the so-called Aespoe Test Case. The Project consists of three major parts: an empirical part dealing with the characterisation of the rock mass by applying empirical methods, a part determining the rock mechanics properties of the rock mass through numerical modelling, and a third part carrying out numerical modelling for the determination of the stress state at Aespoe. All parts of the Project were performed based on a limited amount of data about the geology and mechanical tests on samples selected from the Aespoe Database. This report only considers the empirical approach. The purpose of the project is the development of a descriptive rock mechanics model for SKB's rock mass investigations for a final repository site. The empirical characterisation of the rock mass provides correlations with some of the rock mechanics properties of the rock mass, such as the deformation modulus, the friction angle and cohesion for a certain stress interval, and the uniaxial compressive strength. For the characterisation of the rock mass, several empirical methods were analysed and reviewed. Among those methods, some were chosen because they are robust, applicable and widespread in modern rock mechanics. Major weight was given to the well-known Tunnel Quality Index (Q) and Rock Mass Rating (RMR), but the Rock Mass Index (RMi), the Geological Strength Index (GSI) and Ramamurthy's Criterion were also applied for comparison with the two classical methods. The process of i) sorting the geometrical/geological/rock mechanics data, ii) identifying homogeneous rock volumes, iii) determining the input parameters for the empirical ratings for rock mass characterisation, and iv) evaluating the mechanical properties by using empirical relations with the rock mass ratings was considered. By comparing the methodologies involved by the

  4. Strategy for a Rock Mechanics Site Descriptive Model. Development and testing of the empirical approach

    Energy Technology Data Exchange (ETDEWEB)

    Roeshoff, Kennert; Lanaro, Flavio [Berg Bygg Konsult AB, Stockholm (Sweden); Lanru Jing [Royal Inst. of Techn., Stockholm (Sweden). Div. of Engineering Geology

    2002-05-01

    This report presents the results of one part of a wider project to establish a methodology for determining the rock mechanics properties of the rock mass for the so-called Aespoe Test Case. The Project consists of three major parts: an empirical part dealing with the characterisation of the rock mass by applying empirical methods, a part determining the rock mechanics properties of the rock mass through numerical modelling, and a third part carrying out numerical modelling for the determination of the stress state at Aespoe. All parts of the Project were performed based on a limited amount of data about the geology and mechanical tests on samples selected from the Aespoe Database. This report only considers the empirical approach. The purpose of the project is the development of a descriptive rock mechanics model for SKB's rock mass investigations for a final repository site. The empirical characterisation of the rock mass provides correlations with some of the rock mechanics properties of the rock mass, such as the deformation modulus, the friction angle and cohesion for a certain stress interval, and the uniaxial compressive strength. For the characterisation of the rock mass, several empirical methods were analysed and reviewed. Among those methods, some were chosen because they are robust, applicable and widespread in modern rock mechanics. Major weight was given to the well-known Tunnel Quality Index (Q) and Rock Mass Rating (RMR), but the Rock Mass Index (RMi), the Geological Strength Index (GSI) and Ramamurthy's Criterion were also applied for comparison with the two classical methods. The process of i) sorting the geometrical/geological/rock mechanics data, ii) identifying homogeneous rock volumes, iii) determining the input parameters for the empirical ratings for rock mass characterisation, and iv) evaluating the mechanical properties by using empirical relations with the rock mass ratings was considered. By comparing the methodologies involved
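    As hedged examples of the kind of published empirical relations such a characterisation applies, the sketch below uses Bieniawski's RMR-Q correlation and the Serafim-Pereira estimate of the deformation modulus; these are widely quoted relations, not necessarily the exact correlations selected for the Aespoe Test Case.

```python
# Illustrative examples of widely published empirical relations of the kind the
# report applies; they are not necessarily the exact correlations selected for
# the Aespoe Test Case.
import math

def rmr_from_q(q):
    """Bieniawski (1976) correlation between the Q index and RMR."""
    return 9.0 * math.log(q) + 44.0

def deformation_modulus_gpa(rmr):
    """Serafim & Pereira (1983): rock mass deformation modulus from RMR [GPa],
    commonly quoted for RMR below roughly 80."""
    return 10.0 ** ((rmr - 10.0) / 40.0)

q = 12.0                                   # hypothetical Tunnel Quality Index
rmr = rmr_from_q(q)
print(f"Q = {q} -> RMR ~ {rmr:.0f} -> Em ~ {deformation_modulus_gpa(rmr):.0f} GPa")
```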

  5. Empirical Modeling of Lithium-ion Batteries Based on Electrochemical Impedance Spectroscopy Tests

    International Nuclear Information System (INIS)

    Samadani, Ehsan; Farhad, Siamak; Scott, William; Mastali, Mehrdad; Gimenez, Leonardo E.; Fowler, Michael; Fraser, Roydon A.

    2015-01-01

    Highlights: • Two commercial Lithium-ion batteries are studied through HPPC and EIS tests. • An equivalent circuit model is developed for a range of operating conditions. • This model improves on current empirical battery models for vehicle applications. • This model proves efficient in predicting HPPC test resistances. - ABSTRACT: An empirical model for commercial lithium-ion batteries is developed based on electrochemical impedance spectroscopy (EIS) tests. An equivalent circuit is established according to EIS test observations at various battery states of charge and temperatures. A Laplace transfer, time-based model is developed from the circuit which can predict the battery operating output potential difference in battery electric and plug-in hybrid vehicles at various operating conditions. This model demonstrates up to 6% improvement compared to simple resistance and Thevenin models and is suitable for modeling and on-board controller purposes. Results also show that this model can be used to predict the battery internal resistance obtained from hybrid pulse power characterization (HPPC) tests to within 20 percent, making it suitable for low to medium fidelity powertrain design purposes. In total, this simple battery model can be employed as a real-time model in electrified vehicle battery management systems.
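    For orientation, the sketch below simulates the kind of first-order Thevenin-type equivalent circuit (a series resistance plus one RC branch) that the EIS-based model improves upon; the parameter values are hypothetical placeholders rather than identified cell parameters.

```python
# A minimal sketch of a first-order Thevenin-type equivalent circuit -- the
# kind of simple model the EIS-based model in the abstract improves upon.
# Parameter values (OCV, R0, R1, C1) are hypothetical placeholders.
import numpy as np

def terminal_voltage(current, dt, ocv=3.7, r0=0.015, r1=0.020, c1=1500.0):
    """Simulate terminal voltage for a discharge current profile [A].

    V(t) = OCV - I*R0 - V1(t),  dV1/dt = -V1/(R1*C1) + I/C1  (forward Euler).
    """
    v1 = 0.0
    v = np.empty_like(current, dtype=float)
    for k, i in enumerate(current):
        v1 += dt * (-v1 / (r1 * c1) + i / c1)   # RC branch polarization voltage
        v[k] = ocv - i * r0 - v1
    return v

# Example: a 10 s, 20 A discharge pulse (similar in spirit to an HPPC pulse).
t = np.arange(0, 30, 0.1)
i_profile = np.where((t >= 5) & (t < 15), 20.0, 0.0)
v = terminal_voltage(i_profile, dt=0.1)
print(f"voltage sag at end of pulse: {3.7 - v[t.searchsorted(15) - 1]:.3f} V")
```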

  6. An Empirical Examination of the Relationship Between Test Factor Structure and Test Hierarchical Structure.

    Science.gov (United States)

    Bart, William M.; Airasian, Peter W.

    The question of whether test factor structure is indicative of the test item hierarchy was examined. Data from 1,000 subjects on two sets of five bivalued Law School Admission Test items, which were analyzed with latent trait methods of Bock and Lieberman and of Christoffersson in Psychometrika, were analyzed with an ordering-theoretic method to…

  7. Orion Ground Test Article Water Impact Tests: Photogrammetric Evaluation of Impact Conditions

    Science.gov (United States)

    Vassilakos, Gregory J.; Mark, Stephen D.

    2018-01-01

    The Ground Test Article (GTA) is an early production version of the Orion Crew Module (CM). The structural design of the Orion CM is being developed based on LS-DYNA water landing simulations. As part of the process of confirming the accuracy of LS-DYNA water landing simulations, the GTA water impact test series was conducted at NASA Langley Research Center (LaRC) to gather data for comparison with simulations. The simulation of the GTA water impact tests requires the accurate determination of the impact conditions. To accomplish this, the GTA was outfitted with an array of photogrammetry targets. The photogrammetry system utilizes images from two cameras with specialized tracking software to determine time histories for the 3-D coordinates of each target. The impact conditions can then be determined from the target location data.
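    As a hedged sketch of how impact conditions can be recovered from such target time histories, the example below fits linear trends to simulated target positions over the final fraction of a second to obtain velocities and derives a pitch angle from the vector between two targets; the data, sample rate and target layout are made up.

```python
# A hedged sketch of one way impact conditions can be recovered from
# photogrammetry target time histories: fit linear trends to target positions
# over a short window before splashdown to get velocities, and use the vector
# between two targets to get pitch attitude.  Data here are simulated.
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(-0.25, 0.0, 1 / 200.0)              # 200 Hz, last 0.25 s of drop

def target_track(x0, z0, vx=-1.2, vz=-7.5, noise=0.002):
    """Simulated 2-D (x, z) track of one target with measurement noise."""
    x = x0 + vx * t + rng.normal(0, noise, t.size)
    z = z0 + vz * t + rng.normal(0, noise, t.size)
    return x, z

fwd = target_track(2.0, 1.5)                       # target near the forward edge
aft = target_track(-2.0, 1.2)                      # target near the aft edge

# Velocities from least-squares linear fits (slope of position vs. time).
vx = np.polyfit(t, fwd[0], 1)[0]
vz = np.polyfit(t, fwd[1], 1)[0]
# Pitch angle from the line joining the two targets at the last frame.
dx, dz = fwd[0][-1] - aft[0][-1], fwd[1][-1] - aft[1][-1]
pitch_deg = np.degrees(np.arctan2(dz, dx))

print(f"vx ~ {vx:.2f} m/s, vz ~ {vz:.2f} m/s, pitch ~ {pitch_deg:.1f} deg")
```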

  8. An empirical comparison of Item Response Theory and Classical Test Theory

    Directory of Open Access Journals (Sweden)

    Špela Progar

    2008-11-01

    Full Text Available Based on nonlinear models between the measured latent variable and the item response, item response theory (IRT) enables independent estimation of item and person parameters and local estimation of measurement error. These properties of IRT are also the main theoretical advantages of IRT over classical test theory (CTT). Empirical evidence, however, often failed to discover consistent differences between IRT and CTT parameters and between invariance measures of CTT and IRT parameter estimates. In this empirical study a real data set from the Third International Mathematics and Science Study (TIMSS) 1995 was used to address the following questions: (1) How comparable are CTT and IRT based item and person parameters? (2) How invariant are CTT and IRT based item parameters across different participant groups? (3) How invariant are CTT and IRT based item and person parameters across different item sets? The findings indicate that the CTT and the IRT item/person parameters are very comparable, that the CTT and the IRT item parameters show similar invariance property when estimated across different groups of participants, that the IRT person parameters are more invariant across different item sets, and that the CTT item parameters are at least as much invariant in different item sets as the IRT item parameters. The results furthermore demonstrate that, with regards to the invariance property, IRT item/person parameters are in general empirically superior to CTT parameters, but only if the appropriate IRT model is used for modelling the data.

  9. Person fit for test speededness: normal curvatures, likelihood ratio tests and empirical Bayes estimates

    NARCIS (Netherlands)

    Goegebeur, Y.; de Boeck, P.; Molenberghs, G.

    2010-01-01

    The local influence diagnostics, proposed by Cook (1986), provide a flexible way to assess the impact of minor model perturbations on key model parameters’ estimates. In this paper, we apply the local influence idea to the detection of test speededness in a model describing nonresponse in test data,

  10. Definitive design status of the SP-100 Ground Engineering System Test Site

    International Nuclear Information System (INIS)

    Renkey, E.J. Jr.; Bazinet, G.D.; Bitten, E.J.; Brackenbury, P.J.; Carlson, W.F.; Irwin, J.J.; Edwards, P.A.; Shen, E.J.; Titzler, P.A.

    1989-05-01

    The SP-100 reactor will be ground tested at the SP-100 Ground Engineering System (GES) Test Site on the US Department of Energy (DOE) Hanford Site near Richland, Washington. Project direction and the flight system design evolution have resulted in a smaller reactor size and the consequential revision to Test Site features to accommodate the design changes and reduce Test Site costs. The significant design events since the completion of the Conceptual Design are discussed in this paper

  11. Definitive design status of the SP-100 Ground Engineering System Test Site

    Energy Technology Data Exchange (ETDEWEB)

    Renkey, E.J. Jr.; Bazinet, G.D.; Bitten, E.J.; Brackenbury, P.J.; Carlson, W.F.; Irwin, J.J.; Edwards, P.A.; Shen, E.J.; Titzler, P.A.

    1989-05-01

    The SP-100 reactor will be ground tested at the SP-100 Ground Engineering System (GES) Test Site on the US Department of Energy (DOE) Hanford Site near Richland, Washington. Project direction and the flight system design evolution have resulted in a smaller reactor size and the consequential revision to Test Site features to accommodate the design changes and reduce Test Site costs. The significant design events since the completion of the Conceptual Design are discussed in this paper.

  12. Anechoic Chamber test of the Electromagnetic Measurement System ground test unit

    Science.gov (United States)

    Stevenson, L. E.; Scott, L. D.; Oakes, E. T.

    1987-04-01

    The Electromagnetic Measurement System (EMMS) will acquire data on electromagnetic (EM) environments at key weapon locations on various aircraft certified for nuclear weapons. The high-frequency ground unit of the EMMS consists of an instrumented B61 bomb case that will measure (with current probes) the localized current density resulting from an applied EM field. For this portion of the EMMS, the first system test was performed in the Anechoic Chamber Facility at Sandia National Laboratories, Albuquerque, New Mexico. The EMMS pod was subjected to EM radiation at microwave frequencies of 1, 3, and 10 GHz. At each frequency, the EMMS pod was rotated at many positions relative to the microwave source so that the individual current probes were exposed to a direct line-of-sight illumination. The variations between the measured and calculated electric fields for the current probes with direct illumination by the EM source are within a few dB. The results obtained from the anechoic test were better than expected and verify that the high-frequency ground portion of the EMMS will accurately measure the EM environments for which it was designed.

  13. A Model of Managerial Effectiveness in Information Security: From Grounded Theory to Empirical Test

    National Research Council Canada - National Science Library

    Knapp, Kenneth J

    2005-01-01

    Information security is a critical issue facing organizations worldwide. In order to mitigate risk and protect valuable information, organizations need to operate and manage effective information security programs...

  14. A Model of Managerial Effectiveness in Information Security: From Grounded Theory to Empirical Test

    Science.gov (United States)

    2005-09-13


  15. Strategies for Ground Based Testing of Manned Lunar Surface Systems

    Science.gov (United States)

    Beyer, Jeff; Peacock, Mike; Gill, Tracy

    2009-01-01

    Integrated testing (such as Multi-Element Integrated Test (MEIT)) is critical to reducing risks and minimizing problems encountered during assembly, activation, and on-orbit operation of large, complex manned spacecraft. It provides the best implementation of "Test Like You Fly." Planning for integrated testing needs to begin at the earliest stages of Program definition. Program leadership needs to fully understand and buy into what integrated testing is and why it needs to be performed. As the Program evolves and designs and schedules mature, the project should continually look for suitable opportunities to perform testing whenever enough components are together in one place at one time. The benefits to be gained are well worth the costs.

  16. Testing of the Ricardian Equivalence proposition: An Empirical Examination for Malaysia (1962-2006

    Directory of Open Access Journals (Sweden)

    Ismadi Ismail

    2008-06-01

    Full Text Available This paper investigates the effects of debts and budgetary deficit on real variables using a structural Vector Error Correction Model (VECM) method with long-run restrictions. We compare our estimates of the impulse responses with those based on a levels Vector Auto-Regressive (VAR) model with standard recursive order restrictions. The test is conducted on Malaysian data covering the period 1962-2006. The empirical results do not support the existence of the “Ricardian Equivalence” hypothesis. The effects of budgetary deficit and government spending have a significant influence on private consumption and private investment.

  17. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected return models are widely used in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies conducted with Brazilian stock market data concentrate only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor and 4-factor models using a predictive methodology, considering two steps – time-series and cross-section regressions – with standard errors obtained by the techniques of Fama and Macbeth (1973). The results indicated the superiority of the 4-factor model as compared to the 3-factor model, and the superiority of the 3-factor model as compared to the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects do not seem to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor for explaining expected returns. These findings raise some questions, mainly due to the originality of the methodology on the local market and the fact that this subject is still incipient and polemic in the Brazilian academic environment.
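    The two-step predictive methodology referred to above can be illustrated with a compact Fama-MacBeth sketch on simulated data: time-series regressions give factor loadings, then period-by-period cross-sectional regressions give risk-premium estimates and their t-statistics. The factor structure and sample sizes below are hypothetical.

```python
# A minimal sketch of the two-step Fama-MacBeth (1973) procedure on simulated
# data: (1) time-series regressions give each asset's factor loadings, then
# (2) period-by-period cross-sectional regressions of returns on those
# loadings give risk-premium estimates, averaged over time with t-statistics.
import numpy as np

rng = np.random.default_rng(5)
T, N, K = 240, 50, 3                      # months, assets, factors (e.g. 3-factor)
factors = rng.normal(0.005, 0.04, (T, K))
true_beta = rng.normal(1.0, 0.4, (N, K))
returns = factors @ true_beta.T + rng.normal(0, 0.05, (T, N))

# Step 1: time-series regressions -> estimated betas (with an intercept).
X = np.column_stack([np.ones(T), factors])
beta_hat = np.linalg.lstsq(X, returns, rcond=None)[0][1:].T     # (N, K)

# Step 2: cross-sectional regression each period -> lambda_t.
Z = np.column_stack([np.ones(N), beta_hat])
lambdas = np.array([np.linalg.lstsq(Z, returns[t_], rcond=None)[0]
                    for t_ in range(T)])                         # (T, K+1)

prem = lambdas.mean(axis=0)
t_stats = prem / (lambdas.std(axis=0, ddof=1) / np.sqrt(T))      # Fama-MacBeth t
print("risk premia:", np.round(prem[1:], 4), "t-stats:", np.round(t_stats[1:], 2))
```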

  18. Theories of coalition formation: An empirical test using data from Danish local government

    DEFF Research Database (Denmark)

    Skjæveland, Asbjørn; Serritzlew, Søren; Blom-Hansen, Jens

    2007-01-01

    Theories of coalition formation represent a diverse set of arguments about why some government coalitions form while others do not. In this article, the authors present a systematic empirical test of the relative importance of the various arguments. The test is designed to avoid a circularity...... problem present in many coalition studies - namely that the theories are tested on data of national government coalitions in postwar Europe: the very data that gave rise to the theories in the first place. Instead, the authors focus on government coalitions at the municipal level. They base their analysis...... on office and policy motives. At the same time, the analysis raises the question of whether actors really seek minimal coalitions....

  19. Testing a ground-based canopy model using the wind river canopy crane

    Science.gov (United States)

    Robert Van Pelt; Malcolm P. North

    1999-01-01

    A ground-based canopy model that estimates the volume of occupied space in forest canopies was tested using the Wind River Canopy Crane. A total of 126 trees in a 0.25 ha area were measured from the ground and directly from a gondola suspended from the crane. The trees were located in a low elevation, old-growth forest in the southern Washington Cascades. The ground-...

  20. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  1. Plenoptic Flow Imaging for Ground Testing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Instantaneous volumetric flow imaging is crucial to aerodynamic development and testing. Simultaneous volumetric measurement of flow parameters enables accurate...

  2. Adoption of renewable heating systems: An empirical test of the diffusion of innovation theory

    International Nuclear Information System (INIS)

    Franceschinis, Cristiano; Thiene, Mara; Scarpa, Riccardo; Rose, John; Moretto, Michele; Cavalli, Raffaele

    2017-01-01

    The implementation of heating technologies based on renewable resources is an important part of Italy's energy policy. Yet, despite efforts to promote the uptake of such technologies, their diffusion is still limited while heating systems based on fossil fuels are still predominant. Theory suggests that beliefs and attitudes of individual consumers play a crucial role in the diffusion of innovative products. However, empirical studies corroborating such observations are still thin on the ground. We use a Choice Experiment and a Latent Class-Random Parameter model to analyze preferences of households in the Veneto region (North-East Italy) for key features of ambient heating systems. We evaluate the coherence of the underlying preference structure using as criteria psychological constructs from the Theory of Diffusion of Innovation by Rogers. Our results broadly support this theory by providing evidence of segmentation of the population consistent with the individuals' propensity to adopt innovations. We found that preferences for heating systems and respondents' willingness to pay for their key features vary across segments. These results enabled us to generate maps that show how willingness to pay estimates vary across the region and can guide local policy design aimed at stimulating adoption of sustainable solutions. - Highlights: • We relate preferences for wood pellet heating systems to Diffusion of Innovation theory. • We found a segmentation of the population according to individual innovativeness. • Preferences for wood pellet heating systems vary across population segments. • Public intervention seems necessary to foster adoption among late adopters.

  3. Single-shell tank riser resistance to ground test plan

    International Nuclear Information System (INIS)

    Kiewert, L.R.

    1996-01-01

    This Test Procedure provides the general directions for conducting Single-Shell Tank Riser to Earth Measurements which will be used by engineering as a step towards providing closure for the Lightning Hazard Issue

  4. Preliminary site design for the SP-100 ground engineering test

    International Nuclear Information System (INIS)

    Cox, C.M.; Miller, W.C.; Mahaffey, M.K.

    1986-04-01

    In November, 1985, Hanford was selected by the Department of Energy (DOE) as the preferred site for a full-scale test of the integrated nuclear subsystem for SP-100. The Hanford Engineering Development Laboratory, operated by Westinghouse Hanford Company, was assigned as the lead contractor for the Test Site. The nuclear subsystem, which includes the reactor and its primary heat transport system, will be provided by the System Developer, another contractor to be selected by DOE in late FY-1986. In addition to reactor operations, test site responsibilities include preparation of the facility plus design, procurement and installation of a vacuum chamber to house the reactor, a secondary heat transport system to dispose of the reactor heat, a facility control system, and postirradiation examination. At the conclusion of the test program, waste disposal and facility decommissioning are required. The test site must also prepare appropriate environmental and safety evaluations. This paper summarizes the preliminary design requirements, the status of design, and plans to achieve full power operation of the test reactor in September, 1990

  5. Empirical model for mean temperature for Indian zone and estimation of precipitable water vapor from ground based GPS measurements

    Directory of Open Access Journals (Sweden)

    C. Suresh Raju

    2007-10-01

    Full Text Available Estimation of precipitable water (PW) in the atmosphere from ground-based Global Positioning System (GPS) essentially involves modeling the zenith hydrostatic delay (ZHD) in terms of surface pressure (Ps) and subtracting it from the corresponding values of zenith tropospheric delay (ZTD) to estimate the zenith wet (non-hydrostatic) delay (ZWD). This further involves establishing an appropriate model connecting PW and ZWD, which in the simplest case is assumed to be similar to that of ZHD. But when the temperature variations are large, the variation of the proportionality constant connecting PW and ZWD must be accounted for in order to estimate PW accurately. For this, a water vapor weighted mean temperature (Tm) has been defined by many investigations, which has to be modeled on a regional basis. For estimating PW over the Indian region from GPS data, a region-specific model for Tm in terms of surface temperature (Ts) is developed using the radiosonde measurements from eight India Meteorological Department (IMD) stations spread over the sub-continent within a latitude range of 8.5°–32.6° N. Following a similar procedure, Tm-based models are also developed for each of these stations, and the features of these site-specific models are compared with those of the region-specific model. Applicability of the region-specific and site-specific Tm-based models in retrieving PW from GPS data recorded at the IGS sites Bangalore and Hyderabad is tested by comparing the retrieved values of PW with those estimated from the altitude profile of water vapor measured using radiosonde. The values of ZWD estimated at 00:00 UTC and 12:00 UTC are used to test the validity of the models by estimating the PW using the models and comparing it with those obtained from radiosonde data. The region-specific Tm-based model is found to be on par with if not better than a
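    A hedged sketch of the standard GPS-PW processing chain described above follows, using a Saastamoinen-type ZHD, ZWD = ZTD - ZHD, a linear Tm(Ts) model and the conversion factor Pi; the coefficients are the widely cited global (Bevis-type) values, not the region-specific Indian model developed in the paper, and the input values are illustrative.

```python
# A hedged sketch of the standard GPS-PW processing chain the abstract
# describes: Saastamoinen-type ZHD from surface pressure, ZWD = ZTD - ZHD, a
# linear Tm(Ts) model, and the conversion factor Pi.  The Tm coefficients used
# here are the widely cited global Bevis values, NOT the region-specific
# Indian model developed in the paper; inputs are illustrative.
import math

def zhd_saastamoinen(ps_hpa, lat_deg, height_m):
    """Zenith hydrostatic delay [m] from surface pressure."""
    f = 1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg)) - 2.8e-7 * height_m
    return 0.0022768 * ps_hpa / f

def precipitable_water_mm(ztd_m, ps_hpa, ts_k, lat_deg, height_m):
    zwd = ztd_m - zhd_saastamoinen(ps_hpa, lat_deg, height_m)
    tm = 70.2 + 0.72 * ts_k                 # global Bevis-type Tm(Ts) model
    k2_prime, k3 = 22.1, 3.739e5            # K/hPa, K^2/hPa
    rho_w, r_v = 1000.0, 461.5              # kg/m^3, J/(kg K)
    pi = 1.0e8 / (rho_w * r_v * (k2_prime + k3 / tm))   # dimensionless factor
    return pi * zwd * 1000.0                # metres of delay -> millimetres of PW

# Illustrative values loosely representative of a low-latitude, elevated site.
print(f"PW ~ {precipitable_water_mm(2.35, 912.0, 300.0, 12.97, 840.0):.1f} mm")
```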

  6. EMPIRICAL WEIGHTED MODELLING ON INTER-COUNTY INEQUALITIES EVOLUTION AND TO TEST ECONOMICAL CONVERGENCE IN ROMANIA

    Directory of Open Access Journals (Sweden)

    Natalia MOROIANU-DUMITRESCU

    2015-06-01

    Full Text Available During the last decades, the regional convergence process in Europe has attracted considerable interest as a highly significant issue, especially after EU enlargement with the New Member States from Central and Eastern Europe. The most common empirical approaches use β- and σ-convergence, originally developed in a series of neo-classical models. To date, the EU integration process has proven to be accompanied by an increase of regional inequalities. In order to determine whether a similar increase of inequalities exists between the administrative counties (NUTS3) included in the NUTS2 and NUTS1 regions of Romania, this paper provides an empirical modelling of economic convergence that evaluates the level and evolution of inter-regional inequalities over a period of more than a decade, from 1995 to 2011. The paper presents the results of a large cross-sectional study of σ-convergence and the weighted coefficient of variation, using GDP and population data obtained from the National Institute of Statistics of Romania. Both the graphical representations, including non-linear regression, and the associated tables summarizing numerical values of the main statistical tests demonstrate the impact of pre-accession policy on the economic development of all Romanian NUTS types. The clearly emphasised convergence in the middle time subinterval can be correlated with the drastic pre-accession changes at the economic, political and social level, and with the opening of the Schengen borders to the Romanian labour force in 2002.
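    The weighted coefficient of variation used in the σ-convergence analysis can be sketched as below; a declining value across years indicates convergence. The county-level data shown are made up for illustration.

```python
# A minimal sketch of the population-weighted coefficient of variation
# (a Williamson-type index) used to trace sigma-convergence across counties;
# a declining value over the years indicates convergence.  Data are made up.
import numpy as np

def weighted_cv(gdp_per_capita, population):
    """sqrt(sum_i w_i (y_i - ybar)^2) / ybar, with population weights w_i."""
    y = np.asarray(gdp_per_capita, dtype=float)
    w = np.asarray(population, dtype=float)
    w = w / w.sum()
    ybar = np.sum(w * y)                      # population-weighted mean
    return np.sqrt(np.sum(w * (y - ybar) ** 2)) / ybar

# Hypothetical county-level data for two years.
pop = [650, 420, 380, 510, 900, 300]          # thousands of inhabitants
gdp_1995 = [2.1, 1.4, 1.1, 1.8, 3.0, 0.9]     # thousand EUR per capita
gdp_2011 = [6.4, 5.1, 4.6, 5.9, 8.0, 4.0]

for year, gdp in (("1995", gdp_1995), ("2011", gdp_2011)):
    print(year, round(weighted_cv(gdp, pop), 3))
# Sigma-convergence if the 2011 value is lower than the 1995 value.
```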

  7. Distribution of ground rigidity and ground model for seismic response analysis in Hualian project of large scale seismic test

    International Nuclear Information System (INIS)

    Kokusho, T.; Nishi, K.; Okamoto, T.; Tanaka, Y.; Ueshima, T.; Kudo, K.; Kataoka, T.; Ikemi, M.; Kawai, T.; Sawada, Y.; Suzuki, K.; Yajima, K.; Higashi, S.

    1997-01-01

    An international joint research program called HLSST is proceeding. HLSST is a large-scale seismic test (LSST) to investigate soil-structure interaction (SSI) during large earthquakes at a field site in Hualien, a highly seismic region of Taiwan. A 1/4-scale model building was constructed on the gravelly soil at this site, and backfill material of crushed stone was placed around the model plant after excavation for the construction. The model building and the foundation ground were extensively instrumented to monitor structure and ground response. To accurately evaluate SSI during earthquakes, geotechnical investigations and forced vibration tests were performed during the construction process, namely before/after base excavation, after structure construction and after backfilling. The distributions of the mechanical properties of the gravelly soil and the backfill were measured after the completion of the construction by penetration tests, PS-logging, etc. This paper describes the distribution and the change of the shear wave velocity (Vs) measured by the field tests. Discussion is made on the effect of overburden pressure during the construction process on Vs in the neighbouring soil and, further, on the numerical soil model for SSI analysis. (orig.)

  8. Single event effect ground test results for a fiber optic data interconnect and associated electronics

    International Nuclear Information System (INIS)

    LaBel, K.A.; Hawkins, D.K.; Cooley, J.A.; Stassinopoulos, E.G.; Seidleck, C.M.; Marshall, P.; Dale, C.; Gates, M.M.; Kim, H.S.

    1994-01-01

    As spacecraft unlock the potential of fiber optics for spaceflight applications, system level bit error rates become of concern to the system designer. The authors present ground test data and analysis on candidate system components

  9. Joint ACE ground penetrating radar antenna test facility at the Technical University of Denmark

    DEFF Research Database (Denmark)

    Lenler-Eriksen, Hans-Rudolph; Meincke, Peter; Sarri, A.

    2005-01-01

    A ground penetrating radar (GPR) antenna test facility, established within the ACE network at the Technical University of Denmark (DTU), is described. Examples of results from the facility obtained from measurements of eight different GPR antennas are presented.

  10. Guidelines of the Design of Electropyrotechnic Firing Circuit for Unmanned Flight and Ground Test Projects

    Science.gov (United States)

    Gonzalez, Guillermo A.; Lucy, Melvin H.; Massie, Jeffrey J.

    2013-01-01

    The NASA Langley Research Center, Engineering Directorate, Electronic System Branch, is responsible for providing pyrotechnic support capabilities to Langley Research Center unmanned flight and ground test projects. These capabilities include device selection, procurement, testing, and problem solving; firing system design, fabrication and testing; ground support equipment design, fabrication and testing; and checkout procedures and procedures training for pyro technicians. This technical memorandum will serve as a guideline for the design, fabrication and testing of electropyrotechnic firing systems. The guidelines discuss the entire process, beginning with requirements definition and ending with development and execution.

  11. Current Ground Test Options for Nuclear Thermal Propulsion (NTP)

    Science.gov (United States)

    Gerrish, Harold P., Jr.

    2014-01-01

    About 20 different NTP engines/reactors were tested from 1959 to 1972 as part of the Rover and Nuclear Engine for Rocket Vehicle Application (NERVA) program. Most were tested in open air at test cell A or test cell C at the Nevada Test Site (NTS). Even after serious engine breakdowns of the reactor (e.g., Phoebus 1A), the test cells were cleaned up for other engine tests. The engine test stand (ETS) was built for high-altitude (approximately 1 psia) testing of an NTP engine in a flight configuration, but still had the exhaust released to open air. The Rover/NERVA program became aware of new environmental regulations which would prohibit the release of any significant quantity of radioactive particulates and noble gases into the open air. The nuclear furnace (NF-1) was the last reactor tested before the program was cancelled in 1973, but it successfully demonstrated a scrubber concept for filtering the NTP exhaust. The NF-1 was demonstrated in the summer of 1972. The NF-1 used a 44 MW reactor and operated for approximately 90 minutes per run. The system cooled the hot hydrogen exhaust from the engine with a water spray before it entered a particle filter. The exhaust then passed through a series of heat exchangers and water separators to help remove water from the exhaust and further reduce the exhaust temperatures. The exhaust was next prepared for the charcoal trap by passing through a dryer and effluent cooler to bring exhaust temperatures close to liquid nitrogen temperature. At those low temperatures, most of the noble gases (e.g., Xe and Kr produced as fission products) are captured in the charcoal trap. The filtered hydrogen is finally passed through a flare stack and released to the air. The concept was overall successful but did show La plating on some surfaces, and multiple recommendations for improvement were made. The most recent detailed study of the NTP scrubber concept was performed by the ARES Corporation in 2006. The concept is based on a 50,000 lbf thrust engine

  12. A Hydrogen Containment Process for Nuclear Thermal Engine Ground testing

    Science.gov (United States)

    Wang, Ten-See; Stewart, Eric; Canabal, Francisco

    2016-01-01

    The objective of this study is to propose a new total hydrogen containment process to enable the testing required for NTP engine development. This H2 removal process comprises two unit operations: an oxygen-rich burner and a shell-and-tube heat exchanger. The new process is demonstrated by simulation of the steady-state operation of the engine firing at nominal conditions.

  13. Seismic behavior of breakwaters on complex ground by numerical tests: Liquefaction and post liquefaction ground settlements

    Science.gov (United States)

    Gu, Linlin; Zhang, Feng; Bao, Xiaohua; Shi, Zhenming; Ye, Guanlin; Ling, Xianzhang

    2018-04-01

    A large number of breakwaters have been constructed along coasts to protect humans and infrastructure from tsunamis. There is a risk that the foundation soils of these structures may liquefy, or partially liquefy, during the earthquake preceding a tsunami, which would greatly reduce the structures' capacity to resist the tsunami. It is necessary to consider not only the soil's liquefaction behavior due to earthquake motions but also its post-liquefaction behavior, because this behavior affects the breakwater's capacity to resist an incoming tsunami. In this study, numerical tests based on a sophisticated constitutive model and a soil-water coupled finite element method are used to predict the mechanical behavior of breakwaters and the surrounding soils. Two real breakwaters subjected to two different seismic excitations are examined through numerical simulation. The simulation results show that earthquakes affect not only the immediate behavior of breakwaters and the surrounding soils but also their long-term settlements due to post-earthquake consolidation. A soil profile with thick clayey layers beneath liquefied soil is more vulnerable to tsunami than a soil profile with only sandy layers. Therefore, quantitatively evaluating the seismic behavior of breakwaters and surrounding soils is important for the design of breakwater structures to resist tsunamis.

  14. SuperAGILE onboard electronics and ground test instrumentation

    International Nuclear Information System (INIS)

    Pacciani, Luigi; Morelli, Ennio; Rubini, Alda; Mastropietro, Marcello; Porrovecchio, Geiland; Costa, Enrico; Del Monte, Ettore; Donnarumma, Immacolata; Evangelista, Yuri; Feroci, Marco; Lazzarotto, Francesco; Rapisarda, Massimo; Soffitta, Paolo

    2007-01-01

    In this paper we describe the electronics of the SuperAGILE X-ray imager on board the AGILE satellite and the instrumentation developed to test and improve the front-end and digital electronics of the flight model of the imager. Although the working principle of the instrument is very well established and the conceptual scheme simple, the budget and mechanical constraints of the AGILE small mission made it necessary to introduce new elements in SuperAGILE, regarding both the mechanics and the electronics. In fact the instrument is contained in a ~44×44×16 cm³ volume, but the required performance is quite ambitious, leading us to equip a sensitive area of ~1350 cm² with 6144 silicon microstrip detectors with a pitch of 121 μm and a total length of ~18.2 cm. The result is a very light and power-efficient imager with good sensitivity (~15 mCrab in 1 day in the 15–45 keV band), high angular resolution (6 arcmin) and gross spectral resolution. The test equipment is versatile, and can be easily modified to test FEE based on self-triggered, data-driven and sparse-readout ASICs such as the XA family of chips.

  15. Guidance on the Stand Down, Mothball, and Reactivation of Ground Test Facilities

    Science.gov (United States)

    Volkman, Gregrey T.; Dunn, Steven C.

    2013-01-01

    The development of aerospace and aeronautics products typically requires three distinct types of testing resources across research, development, test, and evaluation: experimental ground testing, computational "testing" and development, and flight testing. Over the last twenty-plus years, computational methods have replaced some physical experiments, and this trend is continuing. The result is decreased utilization of ground test capabilities, which, along with market forces, industry consolidation, and other factors, has led to the stand down and oftentimes closure of many ground test facilities. Ground test capabilities are (and very likely will continue to be for many years) required to verify computational results and to provide information for regimes where computational methods remain immature. Ground test capabilities are very costly to build and to maintain, so once constructed and operational it may be desirable to retain access to those capabilities even if they are not currently needed. One means of doing this while reducing ongoing sustainment costs is to stand down the facility into a "mothball" status - keeping it alive to bring it back when needed. Both NASA and the US Department of Defense have policies to accomplish the mothballing of a facility, but with little detail. This paper offers a generic process to follow that can be tailored based on the needs of the owner and the applicable facility.

  16. A Meta-Analysis of Empirically Tested School-Based Dating Violence Prevention Programs

    Directory of Open Access Journals (Sweden)

    Sarah R. Edwards

    2014-05-01

    Full Text Available Teen dating violence prevention programs implemented in schools and empirically tested were subjected to meta-analysis. Eight studies met criteria for inclusion, consisting of both within- and between-subjects designs. Overall, the weighted mean effect size (ES) across studies was significant, ESr = .11, 95% confidence interval (CI) = [.08, .15], p < .0001, showing an overall positive effect of the studied prevention programs. However, 25% of the studies showed an effect in the negative direction, meaning students appeared to be more supportive of dating violence after participating in a dating violence prevention program. This heightens the need for thorough program evaluation as well as the need for decision makers to have access to data about the effectiveness of programs they are considering implementing. Further implications of the results and recommendations for future research are discussed.
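
    A weighted mean effect size of the kind reported above can be obtained by inverse-variance weighting of Fisher-z-transformed correlations. The sketch below is one conventional fixed-effect version of that calculation; the per-study effect sizes and sample sizes are hypothetical, not the eight reviewed studies.

```python
import numpy as np
from scipy import stats

def weighted_mean_effect(r_values, sample_sizes, alpha=0.05):
    """Fixed-effect weighted mean of correlation-type effect sizes via Fisher's z,
    with a normal-approximation confidence interval."""
    r = np.asarray(r_values, dtype=float)
    n = np.asarray(sample_sizes, dtype=float)
    z = np.arctanh(r)          # Fisher z-transform of each study's effect size
    w = n - 3.0                # inverse of Var(z) = 1/(n - 3)
    z_bar = np.sum(w * z) / np.sum(w)
    se = 1.0 / np.sqrt(np.sum(w))
    crit = stats.norm.ppf(1.0 - alpha / 2.0)
    lo, hi = z_bar - crit * se, z_bar + crit * se
    return np.tanh(z_bar), (np.tanh(lo), np.tanh(hi))

# Hypothetical per-study effect sizes and sample sizes (not the reviewed studies)
es, ci = weighted_mean_effect([0.15, 0.08, -0.05, 0.20], [250, 400, 180, 320])
print(round(es, 3), tuple(round(x, 3) for x in ci))
```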

  17. Empirical testing of Kotler's high-performance factors to increase sales growth

    Directory of Open Access Journals (Sweden)

    Oren Dayan

    2010-12-01

    Full Text Available Purpose and/or objectives: The primary objective of this study is to empirically test Kotler's (2003) high-performance model, which ensures an increase in sales growth. More specifically, the study explores the influence of process variables (as measured by marketing strategies), resources management (as measured by the management of labour, materials, machines, information technology and energy) and organisational variables (as measured by TQM and organisational culture) on sales growth in the food, motorcar and high-technology manufacturing industries. Problem investigated: Various research studies suggest that the managers of firms are continuously challenged in their attempts to increase their sales (Morre, 2007; Pauwels, Silva Risso, Srinivasan & Hanssens, 2004: 142-143; Gray & Hayes, 2007: 1). Kotler (2003) suggests a model that leads to a high-performing business. The question is posed as to whether this model can be used to increase sales growth in all businesses. This study seeks to develop a generic model to increase sales growth across industries by using an adapted version of Kotler's (2003) high-performance model. The study investigates the application of this adapted model in the food, motorcar and high-technology manufacturing industries. Design and/or methodology and/or approach: An empirical causal research design that includes 770 marketing and product development practitioners from multinational food, motorcar and high-technology manufacturing firms was used in this study. A response rate of 76.1% was achieved, as only 571 useable questionnaires were returned. The internal reliability and discriminant validity of the measuring instrument were assessed by calculating Cronbach alpha coefficients and conducting an exploratory factor analysis, respectively. Structural Equation Modelling (SEM) was used to statistically test the relationships between the independent variables (marketing strategies, resource management, TQM and

  18. Tests of selection in pooled case-control data: an empirical study.

    Science.gov (United States)

    Udpa, Nitin; Zhou, Dan; Haddad, Gabriel G; Bafna, Vineet

    2011-01-01

    For smaller organisms with faster breeding cycles, artificial selection can be used to create sub-populations with different phenotypic traits. Genetic tests can be employed to identify the causal markers for the phenotypes, as a precursor to engineering strains with a combination of traits. Traditional approaches involve analyzing crosses of inbred strains to test for co-segregation with genetic markers. Here we take advantage of cheaper next generation sequencing techniques to identify genetic signatures of adaptation to the selection constraints. Obtaining individual sequencing data is often unrealistic due to cost and sample issues, so we focus on pooled genomic data. We explore a series of statistical tests for selection using pooled case (under selection) and control populations. The tests generally capture skews in the scaled frequency spectrum of alleles in a region, which are indicative of a selective sweep. Extensive simulations are used to show that these approaches work well for a wide range of population divergence times and strong selective pressures. Control vs control simulations are used to determine an empirical False Positive Rate, and regions under selection are determined using a 1% FPR level. We show that pooling does not have a significant impact on statistical power. The tests are also robust to reasonable variations in several different parameters, including window size, base-calling error rate, and sequencing coverage. We then demonstrate the viability (and the challenges) of one of these methods in two independent Drosophila populations (Drosophila melanogaster) bred under selection for hypoxia and accelerated development, respectively. Testing for extreme hypoxia tolerance showed clear signals of selection, pointing to loci that are important for hypoxia adaptation. Overall, we outline a strategy for finding regions under selection using pooled sequences, then devise optimal tests for that strategy. The approaches show promise for
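
    As a concrete illustration of comparing pooled case and control sequence data, the sketch below applies a per-site Fisher's exact test to pooled read counts and derives an empirical 1% false-positive threshold from null (control-vs-control) comparisons. This is only the simplest member of the family of frequency-spectrum statistics the study explores, and the counts and null p-values are synthetic placeholders.

```python
import numpy as np
from scipy import stats

def pooled_allele_test(case_alt, case_depth, ctrl_alt, ctrl_depth):
    """Per-site test for an allele-frequency difference between pooled case and
    control sequencing data: Fisher's exact test on alt/ref read counts."""
    table = [[case_alt, case_depth - case_alt],
             [ctrl_alt, ctrl_depth - ctrl_alt]]
    _, p_value = stats.fisher_exact(table)
    return p_value

def empirical_threshold(null_pvalues, fpr=0.01):
    """P-value cutoff such that only `fpr` of null (control-vs-control) sites
    would be called significant."""
    return np.quantile(np.asarray(null_pvalues, dtype=float), fpr)

# Toy example: one site plus a synthetic null distribution of p-values
p_site = pooled_allele_test(case_alt=48, case_depth=100, ctrl_alt=20, ctrl_depth=100)
null_ps = np.random.default_rng(0).uniform(size=10_000)   # stand-in null p-values
print(p_site, p_site < empirical_threshold(null_ps))
```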

  19. Economic evaluation of empirical antisecretory therapy versus Helicobacter pylori test for management of dyspepsia: a randomized trial in primary care.

    Science.gov (United States)

    Jarbol, Dorte Ejg; Bech, Mickael; Kragstrup, Jakob; Havelund, Troels; Schaffalitzky de Muckadell, Ove B

    2006-01-01

    An economic evaluation was performed of empirical antisecretory therapy versus testing for Helicobacter pylori in the management of dyspepsia patients presenting in primary care. A randomized trial in 106 general practices in the County of Funen, Denmark, was designed to include prospective collection of clinical outcome measures and resource utilization data. Dyspepsia patients (n = 722) presenting in general practice with more than 2 weeks of epigastric pain or discomfort were managed according to one of three initial management strategies: (i) empirical antisecretory therapy, (ii) testing for Helicobacter pylori, or (iii) empirical antisecretory therapy, followed by Helicobacter pylori testing if symptoms improved. Cost-effectiveness and incremental cost-effectiveness ratios of the strategies were determined. The mean proportion of days without dyspeptic symptoms during the 1-year follow-up was 0.59 in the group treated with empirical antisecretory therapy, 0.57 in the H. pylori test-and-eradicate group, and 0.53 in the combination group. After 1 year, 23 percent, 26 percent, and 22 percent, respectively, were symptom-free. Applying the proportion of days without dyspeptic symptoms, the cost-effectiveness ratios for empirical treatment, the H. pylori test and the combination were 12,131 Danish kroner (DKK), 9,576 DKK, and 7,301 DKK, respectively. The incremental cost-effectiveness of going from the combination strategy to empirical antisecretory treatment or the H. pylori test alone was 54,783 DKK and 39,700 DKK, respectively, per additional proportion of days without dyspeptic symptoms. Empirical antisecretory therapy confers a small, insignificant benefit but costs more than strategies based on testing for H. pylori and is probably not a cost-effective strategy for the management of dyspepsia in primary care.
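
    The incremental figures quoted above follow the definition ICER = change in cost divided by change in effect. In the sketch below, mean per-patient costs are back-calculated from the reported cost-effectiveness ratios (ratio times effect), since only the ratios appear in the abstract; with those inputs the calculation approximately reproduces the two reported incremental values (about 54,800 and 39,700 DKK), the small differences being rounding in the published ratios.

```python
def icer(cost_a, effect_a, cost_b, effect_b):
    """Incremental cost-effectiveness ratio: extra cost (DKK) per additional
    proportion of days without dyspeptic symptoms."""
    return (cost_a - cost_b) / (effect_a - effect_b)

# Mean per-patient costs are back-calculated from the reported
# cost-effectiveness ratios (ratio x effect); only the ratios are quoted above.
effects = {"empirical": 0.59, "hp_test": 0.57, "combination": 0.53}
ce_ratio = {"empirical": 12131, "hp_test": 9576, "combination": 7301}
costs = {k: ce_ratio[k] * effects[k] for k in effects}

for strategy in ("empirical", "hp_test"):
    print(strategy, round(icer(costs[strategy], effects[strategy],
                               costs["combination"], effects["combination"])))
```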

  20. [Using projective tests in forensic psychiatry may lead to wrong conclusions. Only empirically tested tests should be used].

    Science.gov (United States)

    Trygg, L; Dåderman, A M; Wiklund, N; Meurling, A W; Lindgren, M; Lidberg, L; Levander, S

    2001-06-27

    The use of projective and psychometric psychological tests at the Department of Forensic Psychiatry in Stockholm (Huddinge), Sweden, was studied for a population of 60 men, including many patients with neuropsychological disabilities and multiple psychiatric disorders. The results showed that projective tests like the Rorschach, the Object Relations Test, and House-Tree-Person were used more frequently than objective psychometric tests. Neuropsychological test batteries like the Halstead-Reitan Neuropsychological Test Battery or the Luria-Nebraska Neuropsychological Battery were not used. The majority of patients were, however, assessed with intelligence scales like the WAIS-R. The questionable reliability and validity of the projective tests, and the risk of subjective interpretations, pose a problem when the tests are used in a forensic setting, since the courts' decisions about a sentence to prison or to psychiatric care are based on the forensic psychiatric assessment. The use of objective psychometric neuropsychological tests and personality tests is recommended.

  1. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geolocated Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning, instrument and product calibration, data quality support and monitoring, and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  2. Ground vibration test results of a JetStar airplane using impulsive sine excitation

    Science.gov (United States)

    Kehoe, Michael W.; Voracek, David F.

    1989-01-01

    Structural excitation is important for both ground vibration and flight flutter testing. The structural responses caused by this excitation are analyzed to determine frequency, damping, and mode shape information. Many excitation waveforms have been used throughout the years. The use of impulsive sine, (sin ωt)/ωt, as an excitation waveform for ground vibration testing and the advantages of using this waveform for flight flutter testing are discussed. The ground vibration test results of a modified JetStar airplane using impulsive sine as an excitation waveform are compared with the test results of the same airplane using multiple-input random excitation. The results indicated that the structure was sufficiently excited using the impulsive sine waveform. Comparisons of input force spectrums, mode shape plots, and frequency and damping values for the two methods of excitation are presented.
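
    For reference, the (sin ωt)/ωt waveform named above is a sinc-type burst and is straightforward to generate, as in the minimal sketch below; the center frequency, duration and sample rate are illustrative assumptions, since the actual JetStar test settings are not given in the abstract.

```python
import numpy as np

def impulsive_sine(center_freq_hz, duration_s, sample_rate_hz=2000.0):
    """Generate a (sin wt)/wt excitation burst centred on t = 0.
    Parameters are illustrative; the JetStar test settings are not given here."""
    t = np.arange(-duration_s / 2.0, duration_s / 2.0, 1.0 / sample_rate_hz)
    w = 2.0 * np.pi * center_freq_hz
    return t, np.sinc(w * t / np.pi)   # np.sinc(x) = sin(pi x)/(pi x)

t, force = impulsive_sine(center_freq_hz=10.0, duration_s=2.0)
print(force.max(), force[:3])
```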

  3. Empirical tests of natural selection-based evolutionary accounts of ADHD: a systematic review.

    Science.gov (United States)

    Thagaard, Marthe S; Faraone, Stephen V; Sonuga-Barke, Edmund J; Østergaard, Søren D

    2016-10-01

    ADHD is a prevalent and highly heritable mental disorder associated with significant impairment, morbidity and increased rates of mortality. This combination of high prevalence and high morbidity/mortality seen in ADHD and other mental disorders presents a challenge to natural selection-based models of human evolution. Several hypotheses have been proposed in an attempt to resolve this apparent paradox. The aim of this study was to review the evidence for these hypotheses. We conducted a systematic review of the literature on empirical investigations of natural selection-based evolutionary accounts for ADHD in adherence with the PRISMA guideline. The PubMed, Embase, and PsycINFO databases were screened for relevant publications, by combining search terms covering evolution/selection with search terms covering ADHD. The search identified 790 records. Of these, 15 full-text articles were assessed for eligibility, and three were included in the review. Two of these reported on the evolution of the seven-repeat allele of the ADHD-associated dopamine receptor D4 gene, and one reported on the results of a simulation study of the effect of suggested ADHD-traits on group survival. The authors of the three studies interpreted their findings as favouring the notion that ADHD-traits may have been associated with increased fitness during human evolution. However, we argue that none of the three studies really tap into the core symptoms of ADHD, and that their conclusions therefore lack validity for the disorder. This review indicates that the natural selection-based accounts of ADHD have not been subjected to empirical test and therefore remain hypothetical.

  4. Respiration testing for bioventing and biosparging remediation of petroleum contaminated soil and ground water

    International Nuclear Information System (INIS)

    Gray, A.L.; Brown, A.; Moore, B.J.; Payne, R.E.

    1996-01-01

    Respiration tests were performed to measure the effect of subsurface aeration on the biodegradation rates of petroleum hydrocarbon contamination in vadose zone soils (bioventing) and ground water (biosparging). The aerobic biodegradation of petroleum contamination is typically limited by the absence of oxygen in the soil and ground water. Therefore, the goal of these bioremediation technologies is to increase the oxygen concentration in the subsurface and thereby enhance the natural aerobic biodegradation of the organic contamination. One case study for biosparging bioremediation testing is presented. At this site atmospheric air was injected into the ground water to increase the dissolved oxygen concentration in the ground water surrounding a well, and to aerate the smear zone above the ground water table. Aeration flow rates of 3 to 8 cfm (0.09 to 0.23 m³/min) were sufficient to increase the dissolved oxygen concentration. Petroleum hydrocarbon biodegradation rates of 32 to 47 µg/L/hour were calculated based on measurements of dissolved oxygen concentration in ground water. The results of this test have demonstrated that biosparging enhances the biodegradation of petroleum hydrocarbons, but the results as they apply to remediation are not known. Two case studies for bioventing respiration testing are presented.

  5. Testing the robustness of the anthropogenic climate change detection statements using different empirical models

    KAUST Repository

    Imbers, J.; Lopez, A.; Huntingford, C.; Allen, M. R.

    2013-01-01

    This paper aims to test the robustness of the detection and attribution of anthropogenic climate change using four different empirical models that were previously developed to explain the observed global mean temperature changes over the last few decades. These studies postulated that the main drivers of these changes included not only the usual natural forcings, such as solar and volcanic, and anthropogenic forcings, such as greenhouse gases and sulfates, but also other known Earth system oscillations such as El Niño Southern Oscillation (ENSO) or the Atlantic Multidecadal Oscillation (AMO). In this paper, we consider these signals, or forced responses, and test whether or not the anthropogenic signal can be robustly detected under different assumptions for the internal variability of the climate system. We assume that the internal variability of the global mean surface temperature can be described by simple stochastic models that explore a wide range of plausible temporal autocorrelations, ranging from short memory processes exemplified by an AR(1) model to long memory processes, represented by a fractional differenced model. In all instances, we conclude that human-induced changes to atmospheric gas composition is affecting global mean surface temperature changes. ©2013. American Geophysical Union. All Rights Reserved.
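
    One way to make the robustness check described above concrete is a Monte Carlo detection test: model the internal variability with a simple stochastic process (here an AR(1), the short-memory end of the range considered) and ask how often that noise alone reproduces the observed regression onto the forced signal. The sketch below uses a synthetic "observed" record and illustrative parameters; it is not the paper's statistical framework, which also covers long-memory (fractionally differenced) noise models.

```python
import numpy as np

def ar1_series(n, phi, sigma, rng):
    """AR(1) process x_t = phi * x_{t-1} + e_t as a short-memory stand-in for
    internal (unforced) variability of global mean temperature."""
    x = np.zeros(n)
    e = rng.normal(scale=sigma, size=n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

def detection_pvalue(obs, forced_signal, phi, sigma, n_sim=2000, seed=0):
    """Monte Carlo test: how often does AR(1) noise alone give a regression
    coefficient onto the forced signal as large as the observed one?"""
    rng = np.random.default_rng(seed)
    x = forced_signal - forced_signal.mean()
    beta_obs = np.dot(x, obs - obs.mean()) / np.dot(x, x)
    beta_null = np.array([np.dot(x, ar1_series(len(obs), phi, sigma, rng)) / np.dot(x, x)
                          for _ in range(n_sim)])
    return np.mean(np.abs(beta_null) >= abs(beta_obs))

# Synthetic "observed" record = idealized forced trend + AR(1) internal variability
rng = np.random.default_rng(1)
years = np.arange(120)
signal = 0.008 * years                    # illustrative forced warming, deg C per year
obs = signal + ar1_series(len(years), phi=0.6, sigma=0.1, rng=rng)
print(detection_pvalue(obs, signal, phi=0.6, sigma=0.1))
```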

  6. Testing the robustness of the anthropogenic climate change detection statements using different empirical models

    KAUST Repository

    Imbers, J.

    2013-04-27

    This paper aims to test the robustness of the detection and attribution of anthropogenic climate change using four different empirical models that were previously developed to explain the observed global mean temperature changes over the last few decades. These studies postulated that the main drivers of these changes included not only the usual natural forcings, such as solar and volcanic, and anthropogenic forcings, such as greenhouse gases and sulfates, but also other known Earth system oscillations such as El Niño Southern Oscillation (ENSO) or the Atlantic Multidecadal Oscillation (AMO). In this paper, we consider these signals, or forced responses, and test whether or not the anthropogenic signal can be robustly detected under different assumptions for the internal variability of the climate system. We assume that the internal variability of the global mean surface temperature can be described by simple stochastic models that explore a wide range of plausible temporal autocorrelations, ranging from short memory processes exemplified by an AR(1) model to long memory processes, represented by a fractional differenced model. In all instances, we conclude that human-induced changes to atmospheric gas composition is affecting global mean surface temperature changes. ©2013. American Geophysical Union. All Rights Reserved.

  7. Direct-to-Consumer Genetic Testing and Personal Genomics Services: A Review of Recent Empirical Studies

    Science.gov (United States)

    Ostergren, Jenny

    2013-01-01

    Direct-to-consumer genetic testing (DTC-GT) has sparked much controversy and undergone dramatic changes in its brief history. Debates over appropriate health policies regarding DTC-GT would benefit from empirical research on its benefits, harms, and limitations. We review the recent literature (2011-present) and summarize findings across (1) content analyses of DTC-GT websites, (2) studies of consumer perspectives and experiences, and (3) surveys of relevant health care providers. Findings suggest that neither the health benefits envisioned by DTC-GT proponents (e.g., significant improvements in positive health behaviors) nor the worst fears expressed by its critics (e.g., catastrophic psychological distress and misunderstanding of test results, undue burden on the health care system) have materialized to date. However, research in this area is in its early stages and possesses numerous key limitations. We note needs for future studies to illuminate the impact of DTC-GT and thereby guide practice and policy regarding this rapidly evolving approach to personal genomics. PMID:24058877

  8. Including values in evidence-based policy making for breast screening: An empirically grounded tool to assist expert decision makers.

    Science.gov (United States)

    Parker, Lisa

    2017-07-01

    Values are an important part of evidence-based decision making for health policy: they guide the type of evidence that is collected, how it is interpreted, and how important the conclusions are considered to be. Experts in breast screening (including clinicians, researchers, consumer advocates and senior administrators) hold differing values in relation to what is important in breast screening policy and practice, and committees may find it difficult to incorporate the complexity and variety of values into policy decisions. The decision making tool provided here is intended to assist with this process. The tool is modified from more general frameworks that are intended to assist with ethical decision making in public health, and informed by data drawn from previous empirical studies on values amongst Australian breast screening experts. It provides a structured format for breast screening committees to consider and discuss the values of themselves and others, suggests relevant topics for further inquiry and highlights areas of need for future research into the values of the public. It enables committees to publicly explain and justify their decisions with reference to values, improving transparency and accountability. It is intended to act alongside practices that seek to accommodate the values of individual women in the informed decision making process for personal decision making about participation in breast screening. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Near-Fault Broadband Ground Motion Simulations Using Empirical Green's Functions: Application to the Upper Rhine Graben (France-Germany) Case Study

    Science.gov (United States)

    Del Gaudio, Sergio; Hok, Sebastien; Festa, Gaetano; Causse, Mathieu; Lancieri, Maria

    2017-09-01

    Seismic hazard estimation relies classically on data-based ground motion prediction equations (GMPEs) giving the expected motion level as a function of several parameters characterizing the source and the sites of interest. However, records of moderate to large earthquakes at short distances from the faults are still rare. For this reason, it is difficult to obtain a reliable ground motion prediction for such a class of events and distances, where the largest amount of damage is also usually observed. A possible strategy to fill this lack of information is to generate synthetic accelerograms based on accurate modeling of both the extended fault rupture and the wave propagation process. The development of such modeling strategies is essential for estimating seismic hazard close to faults in moderate seismic activity zones, where data are even scarcer. For that reason, we selected a target site in the Upper Rhine Graben (URG), at the French-German border. The URG is a region where faults producing micro-seismic activity are very close to sites of interest (e.g., critical infrastructures like supply lines, nuclear power plants, etc.) requiring careful investigation of seismic hazard. In this work, we demonstrate the feasibility of performing near-fault broadband ground motion numerical simulations in a moderate seismic activity region such as the URG and discuss some of the challenges related to such an application. The modeling strategy is to couple the multi-empirical Green's function technique (multi-EGFt) with a k⁻² kinematic source model. One of the advantages of the multi-EGFt is that it does not require a detailed knowledge of the propagation medium, since records of small events are used as the medium transfer function, provided that, at the target site, records of small earthquakes located on the target fault are available. The selection of suitable events to be used as multi-EGFs is detailed and discussed for our specific situation, where fewer events are available. We

  10. Marketing-oriented strategy concept and its empirical testing with large sawmills.

    OpenAIRE

    Niemelä, Juha S.

    1993-01-01

    The objectives of this study are both theoretical and empirical. On the theoretical level strategy concept, its operationalization and measurement are analyzed and clarified. On the empirical level marketing strategies and competitive strategies are described by country, and the study also identifies the strategic marketing decisions characterizing different countries or competitive strategies. Furthermore, the relationships between strategies and marketing structures and functions are analyz...

  11. Thermal and Fluid Modeling of the CRYogenic Orbital TEstbed (CRYOTE) Ground Test Article (GTA)

    Science.gov (United States)

    Piryk, David; Schallhorn, Paul; Walls, Laurie; Stopnitzky, Benny; Rhys, Noah; Wollen, Mark

    2012-01-01

    The purpose of this study was to anchor thermal and fluid system models to data acquired from a ground test article (GTA) for the CRYogenic Orbital TEstbed (CRYOTE). The analysis was broken into four primary tasks: model development, pre-test predictions, testing support at Marshall Space Flight Center (MSFC) and post-test correlations. Information from MSFC facilitated the task of refining and correlating the initial models. The primary goal of the modeling/testing/correlating efforts was to characterize heat loads throughout the ground test article. Significant factors impacting the heat loads included radiative environments, multi-layer insulation (MLI) performance, tank fill levels, tank pressures, and even contact conductance coefficients. This paper demonstrates how analytical thermal/fluid networks were established, and it includes supporting rationale for specific thermal responses seen during testing.

  12. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence.

    Science.gov (United States)

    Jaspers, Monique W M

    2009-05-01

    Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the human-computer interaction field, we provide an overview of the methodological and empirical research available on the three usability inspection and testing methods most often used. We describe two 'expert-based' and one 'user-based' usability method: (1) the heuristic evaluation, (2) the cognitive walkthrough, and (3) the think aloud. All three usability evaluation methods are applied in laboratory settings. Heuristic evaluation is a relatively efficient usability evaluation method with a high benefit-cost ratio, but it requires high skill and usability experience of the evaluators to produce reliable results. The cognitive walkthrough is a more structured approach than the heuristic evaluation, with a stronger focus on the learnability of a computer application. Major drawbacks of the cognitive walkthrough are the level of detail of the task and user background descriptions required for an adequate application of the latest version of the technique. The think aloud is a very direct method for gaining deep insight into the problems end users encounter in interaction with a system, but data analysis is extensive and requires a high level of expertise in both cognitive ergonomics and the computer system application domain. Each of the three usability evaluation methods has shown its usefulness and has its own advantages and disadvantages; no single method has revealed any significant results indicating that it is singularly effective in all circumstances. A combination of different techniques that complement one another should preferably be used, as their collective application will be more powerful than any applied in isolation. Innovative mobile and automated solutions to support end-user testing have

  13. Variations in radon-222 in soil and ground water at the Nevada Test Site

    International Nuclear Information System (INIS)

    Wollenberg, H.; Straume, T.; Smith, A.; King, C.Y.

    1977-01-01

    To help evaluate the applicability of variations of radon-222 in ground water and soil gas as a possible earthquake predictor, measurements were conducted in conjunction with underground explosions at the Nevada Test Site (NTS). Radon fluctuations in ground water have been observed during a sequence of aftershocks following the Oroville, California earthquake of 1 August 1975. The NTS measurements were designed to show if these fluctuations were in response to ground shaking; if not, they could be attributed to changes in earth strain prior to the aftershocks. Well waters were periodically sampled and soil-gas ²²²Rn monitored prior to and following seven underground explosions of varying strength and distance from sampling and detector locations. Soil-gas ²²²Rn contents were measured by the alpha-track method; well-water ²²²Rn by gamma-ray spectrometry. There was no clearly identifiable correlation between well-water radon fluctuations and individual underground tests. One prominent variation in soil-gas radon corresponded to ground shaking from a pair of underground tests in alluvium; otherwise, there was no apparent correlation between radon emanation and other explosions. Markedly lower soil-gas radon contents following the tests were probably caused by consolidation of alluvium in response to ground shaking.

  14. Testing the role of meander cutoff in promoting gene flow across a riverine barrier in ground skinks (Scincella lateralis.

    Directory of Open Access Journals (Sweden)

    Nathan D Jackson

    Full Text Available Despite considerable attention, the long-term impact of rivers on species diversification remains uncertain. Meander loop cutoff (MLC) is one river phenomenon that may compromise a river's diversifying effects by passively transferring organisms from one side of the river to the other. However, the ability of MLC to promote gene flow across rivers has not been demonstrated empirically. Here, we test several predictions of MLC-mediated gene flow in populations of North American ground skinks (Scincella lateralis) separated by a well-established riverine barrier, the Mississippi River: (1) individuals collected from within meander cutoffs should be more closely related to individuals across the river than on the same side, (2) individuals within meander cutoffs should contain more immigrants than individuals away from meander cutoffs, (3) immigration rates estimated across the river should be highest in the direction of the cutoff event, and (4) the distribution of alleles native to one side of the river should be better predicted by the historical rather than current path of the river. To test these predictions we sampled 13 microsatellite loci and mitochondrial DNA from ground skinks collected near three ancient meander loops. These predictions were generally supported by genetic data, although support was stronger for mtDNA than for microsatellite data. Partial support for genetic divergence of samples within ancient meander loops also provides evidence for the MLC hypothesis. Although a role for MLC-mediated gene flow was supported here for ground skinks, the transient nature of river channels and morphologies may limit the long-term importance of MLC in stemming population divergence across major rivers.

  15. Gender Roles, Gender (In)equality and Fertility: An Empirical Test of Five Gender Equity Indices

    Directory of Open Access Journals (Sweden)

    Melinda Mills

    2010-01-01

    Full Text Available The division of gender roles in the household and societal-level gender (in)equality have been situated as among the most powerful factors underlying fertility behaviour. Despite continued theoretical attention to this issue by demographers, empirical research integrating gender roles and equity in relation to fertility remains surprisingly sparse. This paper first provides a brief review of previous research that has examined gender roles and fertility, followed by a comparison of six prominent gender equality indices: the Gender-related Development Index (GDI), Gender Empowerment Measure (GEM), Gender Gap Index (GGI), Gender Equality Index (GEI), the European Union Gender Equality Index (EU-GEI) and the Social Institutions and Gender Index (SIGI). The paper then tests how five of these indices impact fertility intentions and behaviour using a series of multilevel (random-coefficient) logistic regression models, applying the European Social Survey (2004/5). The GDI, with its emphasis on human development adjusted for gender, has the strongest and most significant effect on fertility intentions. The EU-GEI, which focuses on the universal caregiver model, uncovers that more equity significantly lowers fertility intentions, but only for women. The remaining indicators show no significant impact. The paper concludes with a reflection and suggestions for future research.
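
    A minimal sketch of the kind of model named above (individuals nested in countries, with a country-level equity index predicting a binary fertility intention) is given below. For simplicity it fits a pooled logistic regression with statsmodels on synthetic data; the paper's actual models additionally include country random effects (random-coefficient terms), and all country labels, index values and coefficients here are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy data: individuals nested in countries, each country carrying one value of
# a gender equity index (labelled "gdi" here). All values are hypothetical.
rng = np.random.default_rng(0)
countries = pd.DataFrame({"country": list("ABCDE"),
                          "gdi": [0.78, 0.85, 0.91, 0.72, 0.88]})
df = countries.loc[rng.integers(0, 5, size=2000)].reset_index(drop=True)
df["female"] = rng.integers(0, 2, size=len(df))
df["age"] = rng.uniform(20, 45, size=len(df))
linpred = -4.0 + 4.0 * df["gdi"] - 0.02 * df["age"] + 0.1 * df["female"]
df["intends_child"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

# Pooled logistic regression of the fertility-intention indicator on the index;
# the paper's models additionally include country-level random effects.
model = smf.logit("intends_child ~ gdi + female + age", data=df).fit(disp=False)
print(model.params)
```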

  16. Selfish mothers? An empirical test of parent-offspring conflict over extended parental care.

    Science.gov (United States)

    Paul, Manabi; Sen Majumder, Sreejani; Bhadra, Anindita

    2014-03-01

    Parent-offspring conflict (POC) theory is an interesting conceptual framework for understanding the dynamics of parental care. However, this theory is not easy to test empirically, as exact measures of parental investment in an experimental set-up are difficult to obtain. We have used free-ranging dogs Canis familiaris in India, to study POC in the context of extended parental care. We observed females and their pups in their natural habitat for the mother's tendency to share food given by humans with her pups in the weaning and post-weaning stages. Since these dogs are scavengers, and depend largely on human provided food for their sustenance, voluntary sharing of food by the mother with her pups is a good surrogate for extended parental care. Our behavioural observations convincingly demonstrate an increase of conflict and decrease of cooperation by the mother with her offspring over given food within a span of 4-6 weeks. We also demonstrate that the competition among the pups in a litter scales with litter size, an indicator of sib-sib competition. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Aerothermal Ground Testing of Flexible Thermal Protection Systems for Hypersonic Inflatable Aerodynamic Decelerators

    Science.gov (United States)

    Bruce, Walter E., III; Mesick, Nathaniel J.; Ferlemann, Paul G.; Siemers, Paul M., III; DelCorso, Joseph A.; Hughes, Stephen J.; Tobin, Steven A.; Kardell, Matthew P.

    2012-01-01

    Flexible TPS (FTPS) development involves the ground testing and analysis necessary to characterize performance of the FTPS candidates prior to flight testing. This paper provides an overview of the analysis and ground testing efforts performed over the last year at the NASA Langley Research Center and in the Boeing Large-Core Arc Tunnel (LCAT). In the LCAT test series, material layups were subjected to aerothermal loads commensurate with peak re-entry conditions enveloping a range of HIAD mission trajectories. The FTPS layups were tested over a heat flux range from 20 to 50 W/cm² with associated surface pressures of 3 to 8 kPa. To support the testing effort, a significant redesign of the existing shear (wedge) model holder from previous testing efforts was undertaken to develop a new test technique for supporting and evaluating the FTPS in the high-temperature arc jet flow. Since the FTPS test samples typically experience a geometry change during testing, computational fluid dynamic (CFD) models of the arc jet flow field and test model were developed to support the testing effort. The CFD results were used to help determine the test conditions experienced by the test samples as the surface geometry changes. This paper includes an overview of the Boeing LCAT facility, the general approach for testing FTPS, the CFD analysis methodology and results, the model holder design and test methodology, and selected thermal results of several FTPS layups.

  18. An SINS/GNSS Ground Vehicle Gravimetry Test Based on SGA-WZ02

    Directory of Open Access Journals (Sweden)

    Ruihang Yu

    2015-09-01

    Full Text Available In March 2015, a ground vehicle gravimetry test was carried out in eastern Changsha to assess the repeatability and accuracy of the ground vehicle SINS/GNSS gravimeter SGA-WZ02. The gravity system developed by NUDT consisted of a Strapdown Inertial Navigation System (SINS), a Global Navigation Satellite System (GNSS) remote station on the test vehicle, a GNSS static master station on the ground, and a data logging subsystem. A south-north profile of 35 km along the highway in eastern Changsha was chosen, and four repeated usable measurement lines were obtained. The average speed of the vehicle was 40 km/h. To assess the external accuracy, a precise ground gravity reference was built with a CG-5 gravimeter. Under relatively smooth conditions, the internal accuracy among repeated lines shows an average agreement at the level of 1.86 mGal for half wavelengths of about 1.1 km, and 1.22 mGal for 1.7 km. The root-mean-square (RMS) of the difference between the calculated gravity data and the reference data is about 2.27 mGal/1.1 km, and 1.74 mGal/1.7 km. Not all of the noise caused by the vehicle itself and the experimental environment was eliminated in these primary results. By selecting reasonable filters and improving the GNSS observation conditions, further developments in ground vehicle gravimetry are promising.
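
    The two accuracy figures reported above, internal agreement among repeated lines and RMS difference against a ground-truth profile, are plain RMS statistics and can be computed as in the sketch below; the numbers in the example are toy values, not the survey data.

```python
import numpy as np

def line_repeatability(lines_mgal):
    """Internal RMS agreement of repeated survey lines: deviation of each line
    from the point-wise mean of all lines, pooled into a single RMS."""
    arr = np.asarray(lines_mgal, dtype=float)        # shape (n_lines, n_points)
    dev = arr - arr.mean(axis=0, keepdims=True)
    return np.sqrt(np.mean(dev ** 2))

def rms_vs_reference(line_mgal, reference_mgal):
    """RMS difference between one measured line and the ground-truth profile."""
    d = np.asarray(line_mgal, dtype=float) - np.asarray(reference_mgal, dtype=float)
    return np.sqrt(np.mean(d ** 2))

# Toy values in mGal (the real test used four repeated 35 km lines)
lines = [[1.0, 2.5, 3.1], [1.4, 2.1, 3.4], [0.8, 2.7, 2.9], [1.2, 2.4, 3.3]]
print(round(line_repeatability(lines), 2),
      round(rms_vs_reference(lines[0], [1.1, 2.4, 3.2]), 2))
```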

  19. SOAC - State-of-the-Art Car Engineering Tests at Department of Transportation High Speed Ground Test Center : Volume 2. Performance Tests.

    Science.gov (United States)

    1975-01-01

    The six-volume report presents the technical methodology, data samples, and results of tests conducted on the SOAC on the Rail Transit Test Track at the High Speed Ground Test Center in Pueblo, Colorado during the period April to July 1973. The Test ...

  20. Using Whole-House Field Tests to Empirically Derive Moisture Buffering Model Inputs

    Energy Technology Data Exchange (ETDEWEB)

    Woods, J.; Winkler, J.; Christensen, D.; Hancock, E.

    2014-08-01

    Building energy simulations can be used to predict a building's interior conditions, along with the energy use associated with keeping these conditions comfortable. These models simulate the loads on the building (e.g., internal gains, envelope heat transfer), determine the operation of the space conditioning equipment, and then calculate the building's temperature and humidity throughout the year. The indoor temperature and humidity are affected not only by the loads and the space conditioning equipment, but also by the capacitance of the building materials, which buffer changes in temperature and humidity. This research developed an empirical method to extract whole-house model inputs for use with a more accurate moisture capacitance model (the effective moisture penetration depth model). The experimental approach was to subject the materials in the house to a square-wave relative humidity profile, measure all of the moisture transfer terms (e.g., infiltration, air conditioner condensate) and calculate the only unmeasured term: the moisture absorption into the materials. After validating the method with laboratory measurements, we performed the tests in a field house. A least-squares fit of an analytical solution to the measured moisture absorption curves was used to determine the three independent model parameters representing the moisture buffering potential of this house and its furnishings. Follow-on tests with realistic latent and sensible loads showed good agreement with the derived parameters, especially compared to the commonly used effective capacitance approach. These results show that the EMPD model, once the inputs are known, is an accurate moisture buffering model.
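
    The parameter-extraction step described above, a least-squares fit of an analytical response to the measured absorption curve, can be illustrated with scipy's curve_fit. The sketch below fits a simplified two-parameter first-order uptake model to a synthetic absorption curve; the paper's actual EMPD analytical solution and its three parameters are not reproduced here, so both the model form and the data are stand-ins.

```python
import numpy as np
from scipy.optimize import curve_fit

def absorption_model(t_hours, capacity_kg, tau_hours):
    """Simplified first-order moisture uptake after a step change in indoor RH.
    The paper fits the EMPD analytical solution with three parameters; this
    two-parameter stand-in only illustrates the least-squares step."""
    return capacity_kg * (1.0 - np.exp(-t_hours / tau_hours))

# Hypothetical "measured" absorption curve (kg of water taken up by materials)
t = np.linspace(0.0, 12.0, 25)
measured = absorption_model(t, 3.2, 4.0) + np.random.default_rng(2).normal(0, 0.05, t.size)

params, _ = curve_fit(absorption_model, t, measured, p0=(1.0, 1.0))
print(params)   # fitted moisture capacity and time constant
```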

  1. An Empirical Test of Causal Inference Between Role Perceptions, Satisfaction with Work, Performance and Organizational Level

    Science.gov (United States)

    Szilagyi, Andrew D.

    1977-01-01

    Attempts to empirically verify the causal source and direction of causal influence between role ambiguity, role conflict and job satisfaction and performance for three organizational levels in a hospital environment. (Author/RK)

  2. Empirical tests of Zipf's law mechanism in open source Linux distribution.

    Science.gov (United States)

    Maillart, T; Sornette, D; Spaeth, S; von Krogh, G

    2008-11-21

    Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
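
    Zipf's law in this setting is a rank-size relation (project size roughly proportional to 1/rank). A quick empirical check, sketched below on synthetic Pareto-distributed "package sizes", is to regress log size on log rank; a maximum-likelihood (Hill-type) estimator would be preferable for real data, and nothing here uses the Linux-distribution data analysed in the paper.

```python
import numpy as np

def zipf_exponent(sizes):
    """Estimate the rank-size (Zipf) exponent by regressing log size on log rank.
    A maximum-likelihood (Hill-type) estimator is preferable for real data; this
    ordinary least-squares version is only an illustration."""
    s = np.sort(np.asarray(sizes, dtype=float))[::-1]
    ranks = np.arange(1, len(s) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(s), 1)
    return -slope

# Toy "package sizes" drawn from a Pareto distribution with unit tail exponent
rng = np.random.default_rng(3)
sizes = (1.0 + rng.pareto(a=1.0, size=5000)) * 10.0
print(round(zipf_exponent(sizes), 2))   # roughly 1, as Zipf's law would predict
```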

  3. An Empirical Study on Washback Effects of the Internet-Based College English Test Band 4 in China

    Science.gov (United States)

    Wang, Chao; Yan, Jiaolan; Liu, Bao

    2014-01-01

    Based on Bailey's washback model, in respect of participants, process and products, the present empirical study was conducted to find the actual washback effects of the internet-based College English Test Band 4 (IB CET-4). The methods adopted are questionnaires, class observation, interview and the analysis of both the CET-4 teaching and testing…

  4. Empirical Testing of a Theoretical Extension of the Technology Acceptance Model: An Exploratory Study of Educational Wikis

    Science.gov (United States)

    Liu, Xun

    2010-01-01

    This study extended the technology acceptance model and empirically tested the new model with wikis, a new type of educational technology. Based on social cognitive theory and the theory of planned behavior, three new variables, wiki self-efficacy, online posting anxiety, and perceived behavioral control, were added to the original technology…

  5. How "Does" the Comforting Process Work? An Empirical Test of an Appraisal-Based Model of Comforting

    Science.gov (United States)

    Jones, Susanne M.; Wirtz, John G.

    2006-01-01

    Burleson and Goldsmith's (1998) comforting model suggests an appraisal-based mechanism through which comforting messages can bring about a positive change in emotional states. This study is a first empirical test of three causal linkages implied by the appraisal-based comforting model. Participants (N=258) talked about an upsetting event with a…

  6. A study on the nondestructive test optimum design for a ground tracked combat vehicle

    Energy Technology Data Exchange (ETDEWEB)

    Kim Byeong Ho; Seo, Jae Hyun; Gil, Hyeon Jun [Defence Agency for Technology and Quality, Seoul (Korea, Republic of); Kim, Seon Hyeong [Hanwha Techwin Co.,Ltd., Changwon (Korea, Republic of); Seo, Sang Chul [Changwon National University, Changwon (Korea, Republic of)

    2015-10-15

    In this study, nondestructive testing (NDT) is performed to inspect the optimal design of ground tracked combat vehicles such as self-propelled artillery, tanks, and armored vehicles. The minimum qualification required for personnel performing NDT of a ground tracked combat vehicle was initially established in US military standards and then applied to the Korean defense specifications used to develop ground tracked combat vehicles. However, the qualification standards for NDT inspectors have since been integrated into NAS410, the public standard applied in the existing aerospace/defense industry, through the military and commercial specifications unification project. The design method of this study was verified by applying the optimal design to the liquid penetrant testing of an Al forging used in self-propelled artillery. This confirmed the reliability and soundness of the product.

  7. Empirical tests of the Chicago model and the Easterlin hypothesis: a case study of Japan.

    Science.gov (United States)

    Ohbuchi, H

    1982-05-01

    The objective of this discussion is to test the applicability of the economic theory of fertility, with special reference to postwar Japan, and to find a clue for forecasting the future trend of fertility. The theories examined are the "Chicago model" and the "Easterlin hypothesis." The major conclusion common among the leading economic theories of fertility, which have their origin with Gary S. Becker (1960, 1965) and Richard A. Easterlin (1966), is the positive income effect, i.e., that the relationship between income and fertility is positive despite the evidence that higher-income families have fewer children and that fertility has declined with economic development. To bridge the gap between theory and fact is the primary purpose of the economic theory of fertility, and each theory offers a different interpretation of it. The point of the Chicago model, particularly of the household decision-making model of the "new home economics," is the mechanism by which a positive effect of husband's income growth on fertility is offset by a negative price effect caused by the opportunity cost of the wife's time. While the opportunity cost of the wife's time is independent of the female wage rate for an unemployed wife, it is directly associated with the wage rate for a gainfully employed wife. Thus, the fertility response to female wages occurs only among families with an employed wife. The primary concern of empirical efforts to test the Chicago model has been the determination of income and price elasticities. An attempt is made to test the relevance of the Chicago model and the Easterlin hypothesis in explaining the fertility movement in postwar Japan. In the case of the Chicago model, the statistical results appeared fairly successful but did not match the theory. The effect on fertility of a rise in women's real wage (and, therefore, in the opportunity cost of a mother's time) and of a rise in the labor force participation rate of married women of childbearing age in recent years could not

  8. Deep in Data: Empirical Data Based Software Accuracy Testing Using the Building America Field Data Repository: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Neymark, J.; Roberts, D.

    2013-06-01

    An opportunity is available for using home energy consumption and building description data to develop a standardized accuracy test for residential energy analysis tools. That is, to test the ability of uncalibrated simulations to match real utility bills. Empirical data collected from around the United States have been translated into a uniform Home Performance Extensible Markup Language format that may enable software developers to create translators to their input schemes for efficient access to the data. This may facilitate the possibility of modeling many homes expediently, and thus implementing software accuracy test cases by applying the translated data. This paper describes progress toward, and issues related to, developing a usable, standardized, empirical data-based software accuracy test suite.

  9. Ground vibration test results for Drones for Aerodynamic and Structural Testing (DAST)/Aeroelastic Research Wing (ARW-1R) aircraft

    Science.gov (United States)

    Cox, T. H.; Gilyard, G. B.

    1986-01-01

    The drones for aerodynamic and structural testing (DAST) project was designed to control flutter actively at high subsonic speeds. Accurate knowledge of the structural model was critical for the successful design of the control system. A ground vibration test was conducted on the DAST vehicle to determine the structural model characteristics. This report presents and discusses the vibration and test equipment, the test setup and procedures, and the antisymmetric and symmetric mode shape results. The modal characteristics were subsequently used to update the structural model employed in the control law design process.

  10. The ACE-DTU Planar Near-Field Ground Penetrating Radar Antenna Test Facility

    DEFF Research Database (Denmark)

    Lenler-Eriksen, Hans-Rudolph; Meincke, Peter

    2004-01-01

    The ACE-DTU planar near-field ground penetrating radar (GPR) antenna test facility is used to measure the plane-wave transmitting spectrum of a GPR loop antenna close to the air-soil interface by means of a probe buried in soil. Probe correction is implemented using knowledge about the complex...

  11. Dynamic load testing on the bearing capacity of prestressed tubular concrete piles in soft ground

    Science.gov (United States)

    Yu, Chuang; Liu, Songyu

    2008-11-01

    Dynamic load testing (DLT) is a high strain test method for assessing pile performance. The shaft capacity of a driven PTC (prestressed tubular concrete) pile in marine soft ground varies with time after installation. The DLT method has been successfully applied to the testing of prestressed pipe piles in the marine soft clay of the Lianyungang area in China. DLT was used to determine the ultimate bearing capacity of single piles at different periods after pile installation. The ultimate bearing capacity of a single pile was found to increase by more than 70% during the intervening 3 months, which demonstrates the time effect on rigid pile bearing capacity in marine soft ground. Furthermore, the skin friction and axial force along the pile shaft are presented as well, which illustrate the load transfer mechanism of pipe piles in soft clay. The results show the economy and efficiency of the DLT method compared to the static load testing method.

  12. Environmental assessment of SP-100 ground engineering system test site: Hanford Site, Richland, Washington

    Energy Technology Data Exchange (ETDEWEB)

    1988-12-01

    The US Department of Energy (DOE) proposes to modify an existing reactor containment building (decommissioned Plutonium Recycle Test Reactor (PRTR) 309 Building) to provide ground test capability for the prototype SP-100 reactor. The 309 Building (Figure 1.1) is located in the 300 Area on the Hanford Site in Washington State. The National Environmental Policy Act (NEPA) requires that Federal agencies assess the potential impacts that their actions may have on the environment. This Environmental Assessment describes the consideration given to environmental impacts during reactor concept and test site selection, examines the environmental effects of the DOE proposal to ground test the nuclear subsystem, describes alternatives to the proposed action, and examines radiological risks of potential SP-100 use in space. 73 refs., 19 figs., 7 tabs.

  13. Large scale vibration tests on pile-group effects using blast-induced ground motion

    International Nuclear Information System (INIS)

    Katsuichirou Hijikata; Hideo Tanaka; Takayuki Hashimoto; Kazushige Fujiwara; Yuji Miyamoto; Osamu Kontani

    2005-01-01

    Extensive vibration tests have been performed on pile-supported structures at a large-scale mining site. Ground motions induced by large-scale blasting operations were used as excitation forces for the vibration tests. The main objective of this research is to investigate the dynamic behavior of pile-supported structures, in particular, pile-group effects. Two test structures were constructed in an excavated 4 m deep pit. Their superstructures were exactly the same; one structure had 25 steel piles and the other had 4 piles. The test pit was backfilled with sand of appropriate grain size distribution to obtain good compaction, especially between the 25 piles. Accelerations were measured at the structures, in the test pit and in the adjacent free field, and pile strains were measured. Dynamic modal tests of the pile-supported structures and PS measurements of the test pit were performed before and after the vibration tests to detect changes in the natural frequencies of the soil-pile-structure systems and in the soil stiffness. The vibration tests were performed six times with different levels of input motion. The maximum horizontal acceleration recorded at the adjacent ground surface varied from 57 cm/s² to 1,683 cm/s² according to the distances between the test site and the blast areas. (authors)

  14. In-flight and ground testing of single event upset sensitivity in static RAMs

    International Nuclear Information System (INIS)

    Johansson, K.; Dyreklev, P.; Granbom, B.; Calvet, C.; Fourtine, S.; Feuillatre, O.

    1998-01-01

    This paper presents the results from in-flight measurements of single event upsets (SEU) in static random access memories (SRAM) caused by the atmospheric radiation environment at aircraft altitudes. The memory devices were carried on commercial airliners at high altitudes and mainly at high latitudes. The SEUs were monitored by a Component Upset Test Equipment (CUTE), designed for this experiment. The in-flight results are compared to ground-based testing with neutrons from three different sources

  15. Saturn V First Stage Lowered to the Ground After Static Test

    Science.gov (United States)

    1966-01-01

    This vintage photograph shows the 138-foot long first stage of the Saturn V being lowered to the ground following a successful static test firing at Marshall Space Flight Center's S-IC test stand. The firing provided NASA engineers information on the booster's systems. The towering 363-foot Saturn V was a multi-stage, multi-engine launch vehicle standing taller than the Statue of Liberty. Altogether, the Saturn V engines produced as much power as 85 Hoover Dams.

  16. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, William [Brooks Engineering, Vacaville, CA (United States); Basso, Thomas [National Renewable Energy Lab. (NREL), Golden, CO (United States); Coddington, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-10-01

    Ground faults and arc faults are the two most common reasons for fires in photovoltaic (PV) arrays and methods exist that can mitigate the hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high resolution ground fault and arc fault detectors in existing and new PV system designs.

  17. A statistical test of the stability assumption inherent in empirical estimates of economic depreciation.

    Science.gov (United States)

    Shriver, K A

    1986-01-01

    Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that economic depreciation rates of decline may be reasonably stable over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.

  18. Ground-based self-gravity tests for LISA Pathfinder and LISA

    International Nuclear Information System (INIS)

    Trenkel, C; Warren, C; Wealthy, D

    2009-01-01

    Gravitational coupling between the free-falling test masses and the surrounding spacecraft is one of the dominant noise sources for both LISA Pathfinder and LISA. At present, there are no plans to verify any of the self-gravity requirements by test, on the ground. Here, we explore the possibilities of conducting such tests, using a customised torsion balance. We discuss the main sources of systematic and statistical uncertainty present in such a set-up. Our preliminary assessment indicates that the sensitivity is sufficient to carry out meaningful self-gravity tests.

  19. An empirical likelihood ratio test robust to individual heterogeneity for differential expression analysis of RNA-seq.

    Science.gov (United States)

    Xu, Maoqi; Chen, Liang

    2018-01-01

    Individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about the read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance the transcriptomic studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
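
    As background for the empirical likelihood machinery that ELTSeq builds on, the sketch below implements the classic Owen-style empirical likelihood ratio test for a single mean in Python. It is illustrative only: it omits ELTSeq's mean-variance constraint and all RNA-seq specifics, and the function name and sample data are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(x, mu0):
    """-2 log empirical likelihood ratio for H0: E[X] = mu0 (Owen-style EL)."""
    z = np.asarray(x, dtype=float) - mu0
    if z.min() >= 0 or z.max() <= 0:
        return np.inf  # mu0 outside the convex hull of the data: H0 untenable
    n = len(z)
    # Solve sum(z / (1 + lam*z)) = 0 for the Lagrange multiplier lam.
    lo = (-1.0 / z.max()) + 1e-10
    hi = (-1.0 / z.min()) - 1e-10
    lam = brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
    w = 1.0 / (n * (1.0 + lam * z))          # implied empirical probabilities
    return -2.0 * np.sum(np.log(n * w))      # asymptotically chi2(1) under H0

rng = np.random.default_rng(0)
sample = rng.gamma(shape=2.0, scale=3.0, size=50)  # skewed, non-normal data
stat = el_log_ratio(sample, mu0=6.0)
print("p-value:", chi2.sf(stat, df=1))
```

    Under the null hypothesis the statistic is referred to a chi-squared distribution with one degree of freedom, which is where the distribution-free character of the method comes from.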

  20. A Field Test of Electromigration as a Method for Remediating Sulfate from Shallow Ground Water

    Science.gov (United States)

    Patterson, C.G.; Runnells, D.D.

    1996-01-01

    Electromigration offers a potential tool for remediating ground water contaminated with highly soluble components, such as Na⁺, Cl⁻, NO₃⁻, and SO₄²⁻. A field experiment was designed to test the efficacy of electromigration for preconcentrating dissolved SO₄²⁻ in ground water associated with a fossil-fuel power plant. Two shallow wells, 25 feet apart (one 25 feet deep, the other 47 feet deep), were constructed in the upper portion of an unconfined alluvial aquifer. The wells were constructed with a double-wall design, with an outer casing of 4-inch PVC and an inner tube of 2-inch PVC; both were fully slotted (0.01 inch). Electrodes were constructed by wrapping the inner tubing with a 100-foot length of rare-earth metal oxide/copper wire. An electrical potential of 10.65 volts DC was applied, and tests were run for periods of 12, 44, and 216 hours. Results showed large changes in the pH from the initial pH of ground water of about 7.5 to values of approximately 2 and 12 at the anode and cathode, respectively. Despite the fact that the test conditions were far from ideal, dissolved SO₄²⁻ was significantly concentrated at the anode. Over a period of approximately nine days, the concentration of SO₄²⁻ at the anode reached what appeared to be a steady-state value of 2200 mg/L, compared to the initial value in ground water of approximately 1150 mg/L. The results of this field test should encourage further investigation of electromigration as a tool in the remediation of contaminated ground water.

  1. Effluent Containment System for space thermal nuclear propulsion ground test facilities

    International Nuclear Information System (INIS)

    1995-08-01

    This report presents the research and development study work performed for the Space Reactor Power System Division of the U.S. Department of Energy on an innovative ECS that would be used during ground testing of a space nuclear thermal rocket engine. The effluent treatment and containment systems constitute a significant portion of the ground test facilities for a space nuclear thermal propulsion engine. The proposed ECS configuration recycles all engine coolant media and does not impact the environment by venting radioactive material. All coolant media, hydrogen and water, are collected, treated for removal of radioactive particulates, and recycled for use in subsequent tests until the end of the facility life. Radioactive materials removed by the treatment systems are recovered, stored for decay of short-lived isotopes, or packaged for disposal as waste. At the end of its useful life, the facility will be decontaminated and dismantled for disposal.

  2. Gender Roles, Gender (In)equality and Fertility: An Empirical Test of Five Gender Equity Indices

    NARCIS (Netherlands)

    Mills, M.

    2010-01-01

    The division of gender roles in the household and societal level gender (in)equality have been situated as one of the most powerful factors underlying fertility behaviour. Despite continued theoretical attention to this issue by demographers, empirical research integrating gender roles and equity in

  3. An empirical test of new developments in coalition theory for the design of international environmental agreements

    NARCIS (Netherlands)

    Finus, M.; Sáiz Pérez, M.E.; Hendrix, E.M.T.

    2009-01-01

    We consider new developments in coalition theory for the design of international environmental agreements (IEAs). Applying an empirical model on climate change that comprises benefit and cost estimates from abatement for 12 world regions, we analyze how the design of an agreement affects the success

  4. What determines crime rates? An empirical test of integrated economic and sociological theories of criminal behavior

    NARCIS (Netherlands)

    Engelen, Peter Jan; Lander, Michel W.; van Essen, Marc

    Research on crime has by no means reached a definitive conclusion on which factors are related to crime rates. We contribute to the crime literature by providing an integrated empirical model of economic and sociological theories of criminal behavior and by using a very comprehensive set of

  5. Marx and Dahrendorf on Income Inequality, Class Consciousness and Class Conflict: An Empirical Test.

    Science.gov (United States)

    Robinson, Robert V.; Kelley, Jonathan

    The issue addressed by this paper is the lack of empirical research on the class theories of Karl Marx and Ralf Dahrendorf. In order to bridge this gap, data are analyzed on the theoretical and statistical implications of Marx's theory (which focuses on ownership of the means of production) and Dahrendorf's theory (which focuses on authority in…

  6. An equivalent ground thermal test method for single-phase fluid loop space radiator

    Directory of Open Access Journals (Sweden)

    Xianwen Ning

    2015-02-01

    Full Text Available Thermal vacuum testing is widely used for the ground validation of spacecraft thermal control systems. However, conduction and convection can be simulated completely in a normal ground-pressure environment. With the use of pumped fluid loop thermal control technology on spacecraft, conduction and convection become the main heat transfer mechanisms between the radiator and the cabin interior. As long as the heat transfer behavior between the radiator and outer space can be equivalently simulated at normal pressure, the thermal vacuum test can be replaced by a normal-pressure ground thermal test. In this paper, an equivalent normal-pressure thermal test method for a spacecraft single-phase fluid loop radiator is proposed. The heat radiation between the radiator and outer space is equivalently simulated by a combination of a group of refrigerators and a thermoelectric cooler (TEC) array. By adjusting the heat rejection of each device, the relationship between heat flux and surface temperature of the radiator can be maintained. To verify this method, a validating system was built and experiments were carried out. The results indicate that the proposed equivalent ground thermal test method can simulate the heat rejection performance of the radiator correctly, and the temperature error between the in-orbit theoretical value and the experimental result for the radiator is less than 0.5 °C, except for the equipment startup period. This provides a potential method for the thermal testing of space systems, especially extra-large spacecraft that employ a single-phase fluid loop radiator as the thermal control approach.
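
    The equivalence argued for above rests on reproducing the radiator's in-orbit heat-flux-versus-surface-temperature relationship with the refrigerator/TEC array. A minimal Python sketch of that radiative balance follows; the emissivity, area, and sink temperature are illustrative assumptions, not values from the paper.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def required_heat_extraction(t_surface_k, emissivity=0.85, area_m2=1.0, t_sink_k=4.0):
    """Radiative heat rejection the refrigerator/TEC array must reproduce (W).

    Matches q = eps * sigma * A * (Ts^4 - Tsink^4), i.e. the in-orbit relationship
    between radiator surface temperature and rejected heat. All parameter values
    here are illustrative assumptions.
    """
    return emissivity * SIGMA * area_m2 * (t_surface_k**4 - t_sink_k**4)

# Sweep an assumed radiator temperature range to set the extraction set points
for ts in (280.0, 290.0, 300.0):
    print(f"Ts = {ts:.0f} K -> extract {required_heat_extraction(ts):.1f} W per m^2")
```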

  7. First in situ operation performance test of ground source heat pump in Tunisia

    International Nuclear Information System (INIS)

    Naili, Nabiha; Attar, Issam; Hazami, Majdi; Farhat, Abdelhamid

    2013-01-01

    Highlights: • Evaluate the geothermal energy potential in Tunisia. • Study the performance of a GSHP system for space cooling. • GSHP is a promising alternative for building cooling in Tunisia. - Abstract: The main purpose of this paper is to study the energetic potential of deploying the Ground Source Heat Pump (GSHP) system in Tunisia for cooling mode applications. Therefore, a pilot GSHP system using a horizontal Ground Heat Exchanger (GHE) was installed and tested at the Research and Technology Center of Energy (CRTEn), Borj Cédria. The experiment is conducted in a test room with a floor area of about 12 m². In the floor of the tested room is integrated a polyethylene exchanger (PEX) used as a radiant floor cooling (RFC) system. The experimental setup mainly measures the ground temperature, the temperature and flow rate of the water circulating in the heat pump and the GHE, as well as the power consumption of the heat pump and circulating pumps. These experimental data are used to evaluate the coefficient of performance of the heat pump (COP_hp) and of the overall system (COP_sys) for continuous operation mode. The COP_hp and the COP_sys were found to be 4.25 and 2.88, respectively. These results reveal that the use of the ground source heat pump is very appropriate for building cooling in Tunisia.
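
    As a rough illustration of how coefficients of performance such as those reported above are derived from the described measurements (water temperatures, flow rate, and electrical power), a hedged sketch follows. The variable names and sample values are hypothetical and are not the paper's data.

```python
CP_WATER = 4186.0  # specific heat of water, J/(kg K)

def cooling_capacity_w(flow_kg_s, t_return_c, t_supply_c):
    """Heat extracted from the conditioned space by the circulating water (W)."""
    return flow_kg_s * CP_WATER * (t_return_c - t_supply_c)

# Hypothetical measurements for one continuous-operation interval
flow = 0.25                       # kg/s of water through the heat pump
t_return, t_supply = 12.0, 7.0    # deg C entering / leaving the heat pump
p_heat_pump = 1100.0              # W electrical power drawn by the heat pump
p_circ_pumps = 520.0              # W drawn by the circulating pumps

q_cool = cooling_capacity_w(flow, t_return, t_supply)
cop_hp = q_cool / p_heat_pump                      # heat pump alone
cop_sys = q_cool / (p_heat_pump + p_circ_pumps)    # heat pump plus circulating pumps
print(f"COP_hp = {cop_hp:.2f}, COP_sys = {cop_sys:.2f}")
```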

  8. Standard Test Methods for Insulation Integrity and Ground Path Continuity of Photovoltaic Modules

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2000-01-01

    1.1 These test methods cover procedures for (1) testing for current leakage between the electrical circuit of a photovoltaic module and its external components while a user-specified voltage is applied and (2) for testing for possible module insulation breakdown (dielectric voltage withstand test). 1.2 A procedure is described for measuring the insulation resistance between the electrical circuit of a photovoltaic module and its external components (insulation resistance test). 1.3 A procedure is provided for verifying that electrical continuity exists between the exposed external conductive surfaces of the module, such as the frame, structural members, or edge closures, and its grounding point (ground path continuity test). 1.4 This test method does not establish pass or fail levels. The determination of acceptable or unacceptable results is beyond the scope of this test method. 1.5 There is no similar or equivalent ISO standard. This standard does not purport to address all of the safety concerns, if a...

  9. Ground test program for a full-size solar dynamic heat receiver

    Science.gov (United States)

    Sedgwick, L. M.; Kaufmann, K. J.; Mclallin, K. L.; Kerslake, T. W.

    1991-01-01

    Test hardware, facilities, and procedures were developed to conduct ground testing of a full-size, solar dynamic heat receiver in a partially simulated, low earth orbit environment. The heat receiver was designed to supply 102 kW of thermal energy to a helium and xenon gas mixture continuously over a 94 minute orbit, including up to 36 minutes of eclipse. The purpose of the test program was to quantify the receiver thermodynamic performance, its operating temperatures, and thermal response to changes in environmental and power module interface boundary conditions. The heat receiver was tested in a vacuum chamber using liquid nitrogen cold shrouds and an aperture cold plate. Special test equipment was designed to provide the required ranges in interface boundary conditions that typify those expected or required for operation as part of the solar dynamic power module on the Space Station Freedom. The support hardware includes an infrared quartz lamp heater with 30 independently controllable zones and a closed-Brayton cycle engine simulator to circulate and condition the helium-xenon gas mixture. The test article, test support hardware, facilities, and instrumentation developed to conduct the ground test program are all described.

  10. Tests of Parameters Instability: Theoretical Study and Empirical Applications on Two Types of Models (ARMA Model and Market Model)

    Directory of Open Access Journals (Sweden)

    Sahbi FARHANI

    2012-01-01

    Full Text Available This paper considers tests of parameter instability and structural change with known, unknown, or multiple breakpoints. The results apply to a wide class of parametric models that are suitable for estimation by strong rules for detecting the number of breaks in a time series. For that, we use the Chow, CUSUM, CUSUM of squares, Wald, likelihood ratio and Lagrange multiplier tests. Each test implicitly uses an estimate of a change point. We conclude with an empirical analysis of two different models (an ARMA model and a simple linear regression model).
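
    Of the tests listed, the Chow test with a known breakpoint is the most compact to illustrate. The sketch below is a generic ordinary-least-squares implementation on hypothetical data, not the paper's models or series.

```python
import numpy as np
from scipy import stats

def chow_test(y, X, break_idx):
    """Chow test for a structural break at a known index (OLS, k regressors incl. intercept)."""
    def rss(yv, Xv):
        beta, _, _, _ = np.linalg.lstsq(Xv, yv, rcond=None)
        resid = yv - Xv @ beta
        return resid @ resid
    n, k = X.shape
    rss_pooled = rss(y, X)
    rss_1 = rss(y[:break_idx], X[:break_idx])
    rss_2 = rss(y[break_idx:], X[break_idx:])
    f = ((rss_pooled - rss_1 - rss_2) / k) / ((rss_1 + rss_2) / (n - 2 * k))
    return f, stats.f.sf(f, k, n - 2 * k)

# Hypothetical series with a slope change halfway through
rng = np.random.default_rng(1)
t = np.arange(120.0)
y = np.where(t < 60, 1.0 + 0.5 * t, 31.0 + 0.9 * (t - 60)) + rng.normal(0, 2.0, t.size)
X = np.column_stack([np.ones_like(t), t])
f_stat, p_value = chow_test(y, X, break_idx=60)
print(f"Chow F = {f_stat:.2f}, p = {p_value:.4f}")
```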

  11. Do Pre-Entry Tests Predict Competencies Required to Excel Academically in Law School?: An Empirical Investigation

    Science.gov (United States)

    Wamala, Robert

    2016-01-01

    Purpose: Prospective students of law are required to demonstrate competence in certain disciplines to attain admission to law school. The grounding in the disciplines is expected to demonstrate competencies required to excel academically in law school. The purpose of this study is to investigate the relevance of the law school admission test to…

  12. Evolution of a test article handling system for the SP-100 ground engineering system test

    International Nuclear Information System (INIS)

    Shen, E.J.; Schweiger, L.J.; Miller, W.C.; Gluck, R.; Devies, S.M.

    1987-04-01

    A simulated space environment test of a flight prototypic SP-100 reactor, control system, and flight shield will be conducted at the Hanford Engineering Development Laboratory (HEDL). The flight prototypic components and the supporting primary heat removal system are collectively known as the Nuclear Assembly Test Article (TA). The unique configuration and materials of fabrication for the Test Article require a specialized handling facility to support installation, maintenance, and final disposal operations. Westinghouse Hanford Company, the Test Site Operator, working in conjunction with General Electric Company, the Test Article supplier, developed and evaluated several handling concepts resulting in the selection of a reference Test Article Handling System. The development of the reference concept for the handling system is presented

  13. Is Asian American Parenting Controlling and Harsh? Empirical Testing of Relationships between Korean American and Western Parenting Measures

    OpenAIRE

    Choi, Yoonsun; Kim, You Seung; Kim, Su Yeong; Park, Irene Kim

    2013-01-01

    Asian American parenting is often portrayed as highly controlling and even harsh. This study empirically tested the associations between a set of recently developed Korean ga-jung-kyo-yuk measures and several commonly used Western parenting measures to accurately describe Asian American family processes, specifically those of Korean Americans. The results show a nuanced and detailed picture of Korean American parenting as a blend of Western authoritative and authoritarian styles with pos...

  14. The Design Dimensions of the Just Organization: An Empirical Test of the Relation Between Organization Design and Corporate Social Performance

    OpenAIRE

    Gerde, Virginia Woods

    1998-01-01

    Although organization design to bring about corporate social performance (CSP) is a critical issue in the business and society field, little research has been conducted. This study is an empirical test of the general model of the just organization presented by Stephens and colleagues (1991; Stephens, et al., 1997). The theoretical development describes organizational design principles from John Rawls' (1971) Theory of Justice, chosen for its emphasis on economic organizations and structure, i...

  15. On the Limitations of Government Borrowing: A Framework for Empirical Testing

    OpenAIRE

    James D. Hamilton; Marjorie A. Flavin

    1985-01-01

    This paper seeks to distinguish empirically between two views on the limitations of government borrowing. According to one view, nothing precludes the government from running a permanent budget deficit, paying interest due on the growing debt load simply by issuing new debt. An alternative perspective holds that creditors would be unwilling to purchase government debt unless the government made a credible commitment to balance its budget in present value terms. We show that distinguishing bet...

  16. Testing the impact of unemployment on self-employment: empirical evidence from OECD countries

    OpenAIRE

    Halicioglu, Ferda; Yolac, Sema

    2015-01-01

    The impact of unemployment on self-employment is rather an ambiguous issue in economics. According to refugee effect approach, there are two counter arguments: the theory of income choice argument suggests that increased unemployment may lead to increased self-employment activities whereas the counter argument defends the view that an increase in unemployment rates may decrease the endowments of human capital and entrepreneurial talent causing a rise in unemployment rates further. The empiric...

  17. Concept study of a hydrogen containment process during nuclear thermal engine ground testing

    Directory of Open Access Journals (Sweden)

    Ten-See Wang

    Full Text Available A new hydrogen containment process was proposed for ground testing of a nuclear thermal engine. It utilizes two thermophysical steps to contain the hydrogen exhaust. First, the decomposition of hydrogen through oxygen-rich combustion at higher temperature; second, the recombination of remaining hydrogen with radicals at low temperature. This is achieved with two unit operations: an oxygen-rich burner and a tubular heat exchanger. A computational fluid dynamics methodology was used to analyze the entire process on a three-dimensional domain. The computed flammability at the exit of the heat exchanger was less than the lower flammability limit, confirming the hydrogen containment capability of the proposed process. Keywords: Hydrogen decomposition reactions, Hydrogen recombination reactions, Hydrogen containment process, Nuclear thermal propulsion, Ground testing

  18. Planning for Plume Diagnostics for Ground Testing of J-2X Engines at the SSC

    Science.gov (United States)

    SaintCyr, William W.; Tejwani, Gopal D.; McVay, Gregory P.; Langford, Lester A.; SaintCyr, William W.

    2010-01-01

    John C. Stennis Space Center (SSC) is the premier test facility for liquid rocket engine development and certification for the National Aeronautics and Space Administration (NASA). Therefore, it is no surprise that the SSC will play the most prominent role in engine development testing and certification for the J-2X engine. The Pratt & Whitney Rocketdyne J-2X engine has been selected by the Constellation Program to power the Ares I Upper Stage Element and the Ares V Earth Departure Stage in NASA's strategy of risk mitigation for hardware development by building on the Apollo program and other lessons learned to deliver a human-rated engine that is on an aggressive development schedule, with first demonstration flight in 2010 and human test flights in 2012. Accordingly, J-2X engine design, development, test, and evaluation is to build upon heritage hardware and apply valuable experience gained from past development and testing efforts. In order to leverage SSC's successful and innovative expertise in plume diagnostics for space shuttle main engine (SSME) health monitoring [1-10], this paper will present a blueprint for plume diagnostics for various proposed ground testing activities for the J-2X at SSC. A complete description of SSC's test facilities and supporting infrastructure is available in Ref. 11. The A-1 Test Stand is currently being prepared for testing the J-2X engine at sea level conditions. The A-2 Test Stand is currently being used for testing the SSME and may also be used for testing the J-2X engine at sea level conditions in the future. Very recently, the ground-breaking ceremony for the new A-3 rocket engine test stand took place at SSC on August 23, 2007. A-3 is the first large-scale test stand to be built at SSC since the A and B stands were constructed in the 1960s. The A-3 Test Stand will be used for testing J-2X engines under vacuum conditions simulating high-altitude operation at approximately 30,480 m (100,000 ft)

  19. Concept study of a hydrogen containment process during nuclear thermal engine ground testing

    OpenAIRE

    Wang, Ten-See; Stewart, Eric T.; Canabal, Francisco

    2016-01-01

    A new hydrogen containment process was proposed for ground testing of a nuclear thermal engine. It utilizes two thermophysical steps to contain the hydrogen exhaust. First, the decomposition of hydrogen through oxygen-rich combustion at higher temperature; second, the recombination of remaining hydrogen with radicals at low temperature. This is achieved with two unit operations: an oxygen-rich burner and a tubular heat exchanger. A computational fluid dynamics methodology was used to analyze ...

  20. Use of Ground Penetrating Radar at the FAA's National Airport Pavement Test Facility

    Science.gov (United States)

    Injun, Song

    2015-04-01

    The Federal Aviation Administration (FAA) in the United States has used a ground-coupled Ground Penetrating Radar (GPR) at the National Airport Pavement Test Facility (NAPTF) since 2005. One of the primary objectives of the testing at the facility is to provide full-scale pavement response and failure information for use in airplane landing gear design and configuration studies. During the traffic testing at the facility, a GSSI GPR system was used to develop new procedures for monitoring Hot Mix Asphalt (HMA) pavement density changes, which are directly related to pavement failure. After reviewing current setups for data acquisition software and procedures for identifying different pavement layers, dielectric constant and pavement thickness were selected as the dominant parameters characterizing HMA properties obtainable from GPR. A new methodology showing HMA density changes in terms of dielectric constant variations, called the dielectric sweep test, was developed and applied in full-scale pavement tests. The dielectric constant changes were successfully monitored with increasing airplane traffic numbers. The changes were compared to pavement performance data (permanent deformation). The measured dielectric constants based on the known HMA thicknesses were also compared with computed dielectric constants using an equation from ASTM D4748-98, Standard Test Method for Determining the Thickness of Bound Pavement Layers Using Short-Pulse Radar. Six-inch-diameter cylindrical cores were taken after construction and traffic testing to determine the HMA layer bulk specific gravity. The measured bulk specific gravity was also compared in order to monitor HMA density changes caused by aircraft traffic conditions. Additionally, this presentation will review applications of the FAA's ground-coupled GPR to embedded rebar identification in concrete pavement, sewer pipes in soil, and gage identification in 3D plots.
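
    The ASTM D4748-style relationship underlying the thickness/dielectric comparison described above is the two-way travel-time equation for a bound layer of relative dielectric constant εr. The sketch below applies it in both directions (thickness from an assumed dielectric, and dielectric from a cored thickness); the numerical values are illustrative, not NAPTF data.

```python
C = 0.2998  # speed of light in free space, m/ns

def layer_thickness_m(two_way_time_ns, dielectric):
    """Bound-layer thickness from GPR two-way travel time (ASTM D4748-style relation)."""
    return C * two_way_time_ns / (2.0 * dielectric ** 0.5)

def dielectric_from_core(two_way_time_ns, known_thickness_m):
    """Invert the same relation: dielectric constant implied by a cored (known) thickness."""
    return (C * two_way_time_ns / (2.0 * known_thickness_m)) ** 2

# Illustrative numbers: a 1.2 ns two-way reflection through an HMA lift cored at 0.10 m
t_ns = 1.2
print("thickness at er = 5.0:", layer_thickness_m(t_ns, 5.0), "m")
print("dielectric from 0.10 m core:", dielectric_from_core(t_ns, 0.10))
```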

  1. Component design challenges for the ground-based SP-100 nuclear assembly test

    International Nuclear Information System (INIS)

    Markley, R.A.; Disney, R.K.; Brown, G.B.

    1989-01-01

    The SP-100 ground engineering system (GES) program involves a ground test of the nuclear subsystems to demonstrate their design. The GES nuclear assembly test (NAT) will be performed in a simulated space environment within a vessel maintained at ultrahigh vacuum. The NAT employs a radiation shielding system that is comprised of both prototypical and nonprototypical shield subsystems to attenuate the reactor radiation leakage and also nonprototypical heat transport subsystems to remove the heat generated by the reactor. The reactor is cooled by liquid lithium, which will operate at temperatures prototypical of the flight system. In designing the components for these systems, a number of design challenges were encountered in meeting the operational requirements of the simulated space environment (and where necessary, prototypical requirements) while also accommodating the restrictions of a ground-based test facility with its limited available space. This paper presents a discussion of the design challenges associated with the radiation shield subsystem components and key components of the heat transport systems

  2. Brief communication: why sleep in a nest? Empirical testing of the function of simple shelters made by wild chimpanzees.

    Science.gov (United States)

    Stewart, F A

    2011-10-01

    All great apes build nightly a structure ("nest" or "bed") that is assumed to function primarily as a sleeping-platform. However, several other nest function hypotheses have been proposed: antipredation, antipathogen, and thermoregulation. I tested these simple shelter functions of chimpanzee nests in an experiment for which I was the subject in Fongoli, Senegal. I slept 11 nights in chimpanzee nests and on the bare ground to test for differences in sleep quality, potential exposure to disease through bites from possible vectors, and insulation. No difference was found in the total amount of sleep nor in sleep quality; however, sleep was more disturbed on the ground. Differences in sleep disturbance between arboreal and ground conditions seemed primarily due to causes of anxiety and alertness, e.g., vocalizations of terrestrial mammals. Arboreal nest-sleeping seems to reduce risk of bites from possible disease vectors and provide insulation in cold conditions. This preliminary, but direct, test of chimpanzee nest function has implications for the evolutionary transition from limb-roosting to nest-reclining sleep in the hominoids, and from tree-to-ground sleep in the genus Homo. Copyright © 2011 Wiley-Liss, Inc.

  3. Simulation analyses of vibration tests on pile-group effects using blast-induced ground motions

    International Nuclear Information System (INIS)

    Takayuki Hashimoto; Kazushige Fujiwara; Katsuichirou Hijikata; Hideo Tanaka; Kohji Koyamada; Atsushi Suzuki; Osamu Kontani

    2005-01-01

    Extensive vibration tests have been performed on pile-supported structures at a large-scale mining site to promote better understanding of the dynamic behavior of pile-supported structures, especially pile-group effects. Two test structures were constructed in an excavated pit. One structure was supported on 25 tubular steel piles and the other on 4. The test pit was backfilled with sand of an appropriate grain size distribution to ensure good compaction. Ground motions induced by large-scale blasting operations were used as excitation forces for the tests. The 3D Finite Element Method (3D FEM) and a Genetic Algorithm (GA) were employed to identify the shear wave velocities and damping factors of the compacted sand, especially of the surface layer. A beam-interaction spring model was employed to simulate the test results of the piles and the pile-supported structures. The superstructure and pile foundation were modeled by a one-stick model comprising lumped masses and beam elements. The pile foundations were modeled just as they were, with lumped masses and beam elements, to simulate the test results showing that, for the 25-pile structure, piles at different locations showed different responses. It was confirmed that the analysis methods employed were very useful for evaluating the nonlinear behavior of the soil-pile-structure system, even under severe ground motions. (authors)

  4. An empirical test of competing theories of hazard-related trust: the case of GM food.

    Science.gov (United States)

    Allum, Nick

    2007-08-01

    Few scholars doubt the importance of trust in explaining variation in public perception of technological risk. Relatively little, however, is known about the particular types of judgments that people use in granting or withholding trust. This article presents findings from an empirical study that explores several dimensions of trust relevant for citizens' judgments of scientists involved in the development of GM food. The relationship between particular dimensions of trust and perceptions of GM food risk is also explored, using structural equation modeling. Results suggest that trust judgments based on the perception of shared values are most important in relation to GM food risk, but that judgments about scientists' technical competence are also important.

  5. Criterion for traffic phases in single vehicle data and empirical test of a microscopic three-phase traffic theory

    International Nuclear Information System (INIS)

    Kerner, Boris S; Klenov, Sergey L; Hiller, Andreas

    2006-01-01

    Based on empirical and numerical microscopic analyses, the physical nature of a qualitatively different behaviour of the wide moving jam phase in comparison with the synchronized flow phase, namely microscopic traffic flow interruption within the wide moving jam phase, is found. A microscopic criterion for distinguishing the synchronized flow and wide moving jam phases in single vehicle data measured at a single freeway location is presented. Based on this criterion, an empirical microscopic classification of different local congested traffic states is performed. Simulations show that the microscopic criterion and the macroscopic spatiotemporal objective criteria lead to the same identification of the synchronized flow and wide moving jam phases in congested traffic. Microscopic models in the context of three-phase traffic theory have been tested against the microscopic criterion for the phases in congested traffic. It is found that microscopic three-phase traffic models can explain both microscopic and macroscopic empirical congested pattern features. It is also found that microscopic frequency distributions for vehicle speed difference, as well as fundamental diagrams and speed correlation functions, can depend considerably on the spatial co-ordinate. It turns out that microscopic optimal velocity (OV) functions and time headway distributions are not necessarily qualitatively different, even if local congested traffic states are qualitatively different. The reason for this is that important spatiotemporal features of congested traffic patterns are lost in these, as well as in many other macroscopic and microscopic traffic characteristics, which are widely used as the empirical basis for testing traffic flow models, specifically cellular automata traffic flow models
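
    As a rough illustration of how a flow-interruption criterion might be applied to single vehicle data, the sketch below flags intervals whose time headway greatly exceeds the mean delay time of vehicles accelerating from a stop. The threshold multiple and the sample headways are assumptions for illustration and do not reproduce Kerner et al.'s exact formulation.

```python
def flow_interruption_intervals(headways_s, mean_delay_s=1.8, factor=10.0):
    """Flag time headways that greatly exceed the mean acceleration delay time.

    A long interruption of flow (headway much larger than the mean delay) is taken
    here as a simplified indicator of the wide moving jam phase; the factor is an
    illustrative assumption, not the published criterion.
    """
    threshold = factor * mean_delay_s
    return [(i, h) for i, h in enumerate(headways_s) if h > threshold]

# Hypothetical single-vehicle time headways (seconds) at one detector location
headways = [1.6, 2.1, 1.9, 2.4, 35.0, 1.7, 2.0, 48.5, 2.2, 1.8]
print(flow_interruption_intervals(headways))
```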

  6. Empirical likelihood

    CERN Document Server

    Owen, Art B

    2001-01-01

    Empirical likelihood provides inferences whose validity does not depend on specifying a parametric model for the data. Because it uses a likelihood, the method has certain inherent advantages over resampling methods: it uses the data to determine the shape of the confidence regions, and it makes it easy to combine data from multiple sources. It also facilitates incorporating side information, and it simplifies accounting for censored, truncated, or biased sampling. One of the first books published on the subject, Empirical Likelihood offers an in-depth treatment of this method for constructing confidence regions and testing hypotheses. The author applies empirical likelihood to a range of problems, from those as simple as setting a confidence region for a univariate mean under IID sampling, to problems defined through smooth functions of means, regression models, generalized linear models, estimating equations, or kernel smooths, and to sampling with non-identically distributed data. Abundant figures offer vi...

  7. Empirical Power Comparison Of Goodness of Fit Tests for Normality In The Presence of Outliers

    International Nuclear Information System (INIS)

    Saculinggan, Mayette; Balase, Emily Amor

    2013-01-01

    Most statistical tests, such as t-tests, linear regression analysis and Analysis of Variance (ANOVA), require the normality assumption. When the normality assumption is violated, interpretation and inferences may not be reliable. Therefore it is important to assess this assumption before using any appropriate statistical test. One of the commonly used procedures for determining whether a random sample of size n comes from a normal population is the goodness-of-fit test for normality. Several studies have already been conducted on the comparison of the different goodness-of-fit tests (see, for example [2]), but they are generally limited in sample size or in the number of GOF tests being compared (see, for example [2] [5] [6] [7] [8]). This paper compares the power of six formal tests of normality: the Kolmogorov-Smirnov test (see [3]), Anderson-Darling test, Shapiro-Wilk test, Lilliefors test, Chi-Square test (see [1]) and D'Agostino-Pearson test. Small, moderate and large sample sizes and various contamination levels were used to obtain the power of each test via Monte Carlo simulation. Ten thousand samples of each sample size and contamination level at a fixed type I error rate α were generated from the given alternative distribution. The power of each test was then obtained by comparing the normality test statistics with the respective critical values. Results show that the power of all six tests is low for small sample sizes (see, for example [2]). But for n = 20, the Shapiro-Wilk test and Anderson-Darling test have achieved high power. For n = 60, the Shapiro-Wilk test and Lilliefors test are most powerful. For large sample sizes, the Shapiro-Wilk test is most powerful (see, for example [5]). However, the test that achieves the highest power under all conditions for large sample sizes is the D'Agostino-Pearson test (see, for example [9]).
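
    The Monte Carlo power estimation described can be reproduced in outline with the sketch below. It covers only three normality tests readily available in scipy (Shapiro-Wilk, a standardized Kolmogorov-Smirnov approximation of the Lilliefors test, and Anderson-Darling) against a contaminated-normal alternative, so it is a simplification of the six-test comparison, with illustrative parameter choices rather than the paper's settings.

```python
import numpy as np
from scipy import stats

def power_of_normality_tests(n, contamination=0.1, alpha=0.05, reps=2000, seed=0):
    """Monte Carlo power of three normality tests against a contaminated normal."""
    rng = np.random.default_rng(seed)
    rejections = {"shapiro": 0, "ks": 0, "anderson": 0}
    for _ in range(reps):
        # Contaminated normal: a fraction of observations comes from a wide outlier component.
        outlier = rng.random(n) < contamination
        x = np.where(outlier, rng.normal(0.0, 5.0, n), rng.normal(0.0, 1.0, n))
        if stats.shapiro(x)[1] < alpha:
            rejections["shapiro"] += 1
        # Standardizing first makes this only an approximation to the Lilliefors test.
        z = (x - x.mean()) / x.std(ddof=1)
        if stats.kstest(z, "norm")[1] < alpha:
            rejections["ks"] += 1
        ad = stats.anderson(x, dist="norm")
        if ad.statistic > ad.critical_values[2]:  # index 2 is the 5% significance level
            rejections["anderson"] += 1
    return {name: count / reps for name, count in rejections.items()}

print(power_of_normality_tests(n=60))
```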

  8. Final test results for the ground operations demonstration unit for liquid hydrogen

    Science.gov (United States)

    Notardonato, W. U.; Swanger, A. M.; Fesmire, J. E.; Jumper, K. M.; Johnson, W. L.; Tomsik, T. M.

    2017-12-01

    Described herein is a comprehensive project: a large-scale test of an integrated refrigeration and storage system called the Ground Operations and Demonstration Unit for Liquid Hydrogen (GODU LH2), sponsored by the Advanced Exploration Systems Program and constructed at Kennedy Space Center. A commercial cryogenic refrigerator interfaced with a 125,000 L liquid hydrogen tank and auxiliary systems in a manner that enabled control of the propellant state by extracting heat via a closed-loop Brayton cycle refrigerator coupled to a novel internal heat exchanger. The three primary objectives were demonstrating zero-loss storage and transfer, gaseous hydrogen liquefaction, and propellant densification. Testing was performed at three different liquid hydrogen fill levels. Data were collected on tank pressure, internal tank temperature profiles, mass flow in and out of the system, and refrigeration system performance. All test objectives were successfully achieved during approximately two years of testing. A summary of the final results is presented in this paper.

  9. Hyper-X Mach 7 Scramjet Design, Ground Test and Flight Results

    Science.gov (United States)

    Ferlemann, Shelly M.; McClinton, Charles R.; Rock, Ken E.; Voland, Randy T.

    2005-01-01

    The successful Mach 7 flight test of the Hyper-X (X-43) research vehicle has provided the major, essential demonstration of the capability of the airframe integrated scramjet engine. This flight was a crucial first step toward realizing the potential for airbreathing hypersonic propulsion for application to space launch vehicles. However, it is not sufficient to have just achieved a successful flight. The more useful knowledge gained from the flight is how well the prediction methods matched the actual test results in order to have confidence that these methods can be applied to the design of other scramjet engines and powered vehicles. The propulsion predictions for the Mach 7 flight test were calculated using the computer code, SRGULL, with input from computational fluid dynamics (CFD) and wind tunnel tests. This paper will discuss the evolution of the Mach 7 Hyper-X engine, ground wind tunnel experiments, propulsion prediction methodology, flight results and validation of design methods.

  10. USB environment measurements based on full-scale static engine ground tests

    Science.gov (United States)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle, and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data, and to establish a basis for future flight test comparisons.

  11. Ground motion effects of underground nuclear testing on perennial vegetation at Nevada Test Site

    International Nuclear Information System (INIS)

    Rhoads, W.A.

    1976-07-01

    In this study to estimate the potential injury to vegetation from earth movement caused by underground nuclear detonations and to estimate the extent to which this may have occurred at NTS, two explosions in the megaton range on Pahute Mesa were studied in some detail: Boxcar, which caused a surface subsidence, and Benham, which did not. Because of the subsidence phenomenology, shock propagation through the earth and along the surface, and the resulting fractures, shrubs were killed at Boxcar around the perimeter of the subsidence crater. Both trees and shrubs were killed along tectonic faults, which became the path for earth fractures, and along fractures and rock falls elsewhere. There was also evidence at Boxcar of tree damage which antedated the nuclear testing program, presumably from natural earthquakes. With the possible exception of damage to aged junipers this investigation did not reveal any good evidence of immediate effects from underground testing on vegetation beyond that recognized earlier as the edge effect

  12. The Impact of Test Case Summaries on Bug Fixing Performance : An Empirical Investigation

    NARCIS (Netherlands)

    Panichella, S.; Panichella, A.; Beller, M.; Zaidman, A.E.; Gall, H.

    2015-01-01

    Automated test generation tools have been widely investigated with the goal of reducing the cost of testing activities. However, generated tests have been shown not to help developers in detecting and finding more bugs even though they reach higher structural coverage compared to manual testing. The

  13. The Impact of Test Case Summaries on Bug Fixing Performance : An Empirical Investigation

    NARCIS (Netherlands)

    Panichella, Sebastiano; Panichella, A.; Beller, M.M.; Zaidman, A.E.; Gall, Harald C.

    2016-01-01

    Automated test generation tools have been widely investigated with the goal of reducing the cost of testing activities. However, generated tests have been shown not to help developers in detecting and finding more bugs even though they reach higher structural coverage compared to manual testing.

  14. Taking advantage of ground data systems attributes to achieve quality results in testing software

    Science.gov (United States)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  15. Free Flight Ground Testing of ADEPT in Advance of the Sounding Rocket One Flight Experiment

    Science.gov (United States)

    Smith, B. P.; Dutta, S.

    2017-01-01

    The Adaptable Deployable Entry and Placement Technology (ADEPT) project will be conducting the first flight test of ADEPT, titled Sounding Rocket One (SR-1), in just two months. The need for this flight test stems from the fact that ADEPT's supersonic dynamic stability has not yet been characterized. The SR-1 flight test will provide critical data describing the flight mechanics of ADEPT in ballistic flight. These data will feed decision making on future ADEPT mission designs. This presentation will describe the SR-1 scientific data products, possible flight test outcomes, and the implications of those outcomes on future ADEPT development. In addition, this presentation will describe free-flight ground testing performed in advance of the flight test. A subsonic flight dynamics test conducted at the Vertical Spin Tunnel located at NASA Langley Research Center provided subsonic flight dynamics data at high and low altitudes for multiple center of mass (CoM) locations. A ballistic range test at the Hypervelocity Free Flight Aerodynamics Facility (HFFAF) located at NASA Ames Research Center provided supersonic flight dynamics data at low supersonic Mach numbers. Execution and outcomes of these tests will be discussed. Finally, a hypothesized trajectory estimate for the SR-1 flight will be presented.

  16. Test anxiety and the validity of cognitive tests: A confirmatory factor analysis perspective and some empirical findings

    NARCIS (Netherlands)

    Wicherts, J.M.; Zand Scholten, A.

    2010-01-01

    The validity of cognitive ability tests is often interpreted solely as a function of the cognitive abilities that these tests are supposed to measure, but other factors may be at play. The effects of test anxiety on the criterion related validity (CRV) of tests was the topic of a recent study by

  17. Verification of mechanistic-empirical design models for flexible pavements through accelerated pavement testing : technical summary.

    Science.gov (United States)

    2014-08-01

    Midwest States Accelerated Pavement Testing Pooled-Fund Program, financed by the highway departments of Kansas, Iowa, and Missouri, has supported an accelerated pavement testing (APT) project to validate several models incorporated in the NCHRP ...

  18. Verification of mechanistic-empirical design models for flexible pavements through accelerated pavement testing.

    Science.gov (United States)

    2014-08-01

    The Midwest States Accelerated Pavement Testing Pooled Fund Program, financed by the highway departments of Kansas, Iowa, and Missouri, has supported an accelerated pavement testing (APT) project to validate several models incorporated in the NCH...

  19. HEAVY METALS IN THE ECOSYSTEM COMPONENTS AT "DEGELEN" TESTING GROUND OF THE FORMER SEMIPALATINSK TEST SITE

    Directory of Open Access Journals (Sweden)

    A.B. Yankauskas

    2012-06-01

    Full Text Available The ecological situation at the former Semipalatinsk test site is characterized by a combination of both radiative and "non-radiative" factors. Near-portal areas of the tunnels with water seepage at the "Degelen" site were investigated. All the tunnel waters are characterized by higher concentrations of uranium, beryllium, and molybdenum. The watercourse of tunnel #504 is unique in its elemental composition, in particular the content of rare earth elements, whose concentrations in the water are in the range n×10⁻⁵ – n×10⁻⁷ %. Of all the rare earth elements, 13 were found in the samples; the concentrations of aluminum, manganese, and zinc are comparable to the concentrations of macro-components. Concentrations of ²³⁸U in the studied waters lie in the range n×10⁻⁴ – n×10⁻⁶ %, which suggests that uranium is significant not only as a toxic element but also as a radiation factor.

  20. SSE software test management STM capability: Using STM in the Ground Systems Development Environment (GSDE)

    Science.gov (United States)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. This is intended to supplement the formal documentation of STM provided by the SSE Project. How STM can be used to integrate contractor CM and formal CM for software before delivery to operations is described. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.

  1. Future aerospace ground test facility requirements for the Arnold Engineering Development Center

    Science.gov (United States)

    Kirchner, Mark E.; Baron, Judson R.; Bogdonoff, Seymour M.; Carter, Donald I.; Couch, Lana M.; Fanning, Arthur E.; Heiser, William H.; Koff, Bernard L.; Melnik, Robert E.; Mercer, Stephen C.

    1992-01-01

    Arnold Engineering Development Center (AEDC) was conceived at the close of World War II, when major new developments in flight technology were presaged by new aerodynamic and propulsion concepts. During the past 40 years, AEDC has played a significant part in the development of many aerospace systems. The original plans were extended through the years by some additional facilities, particularly in the area of propulsion testing. AEDC now has undertaken development of a master plan in an attempt to project requirements and to plan for ground test and computational facilities over the coming 20 to 30 years. This report was prepared in response to an AEDC request that the National Research Council (NRC) assemble a committee to prepare guidance for planning and modernizing AEDC facilities for the development and testing of future classes of aerospace systems as envisaged by the U.S. Air Force.

  2. Integrated Human-in-the-Loop Ground Testing - Value, History, and the Future

    Science.gov (United States)

    Henninger, Donald L.

    2016-01-01

    Systems for very long-duration human missions to Mars will be designed to operate reliably for many years and many of these systems will never be returned to Earth. The need for high reliability is driven by the requirement for safe functioning of remote, long-duration crewed systems and also by unsympathetic abort scenarios. Abort from a Mars mission could be as long as 450 days to return to Earth. The key to developing a human-in-the-loop architecture is a development process that allows for a logical sequence of validating successful development in a stepwise manner, with assessment of key performance parameters (KPPs) at each step; especially important are KPPs for technologies evaluated in a full systems context with human crews on Earth and on space platforms such as the ISS. This presentation will explore the implications of such an approach to technology development and validation including the roles of ground and space-based testing necessary to develop a highly reliable system for long duration human exploration missions. Historical development and systems testing from Mercury to the International Space Station (ISS) to ground testing will be reviewed. Current work as well as recommendations for future work will be described.

  3. TRL Assessment of Solar Sail Technology Development Following the 20-Meter System Ground Demonstrator Hardware Testing

    Science.gov (United States)

    Young, Roy M.; Adams, Charles L.

    2010-01-01

    The NASA In-Space Propulsion Technology (ISPT) Projects Office sponsored two separate, independent solar sail system design and development demonstration activities during 2002-2005. ATK Space Systems of Goleta, CA was the prime contractor for one development team and L'Garde, Inc. of Tustin, CA was the prime contractor for the other development team. The goal of these activities was to advance the technology readiness level (TRL) of solar sail propulsion from 3 towards 6 by the year 2006. Component and subsystem fabrication and testing were completed successfully, including the ground deployment of 10-meter and 20-meter demonstration hardware systems under vacuum conditions. The deployment and structural testing of the 20-meter solar sail systems was conducted in the 30-meter-diameter Space Power Facility thermal-vacuum chamber at NASA Glenn Plum Brook from April through August 2005. This paper will present the results of the TRL assessment following the solar sail technology development activities associated with the design, development, analysis and testing of the 20-meter system ground demonstrators.

  4. EMPIRICAL TESTING OF MODIFIED BLACK-SCHOLES OPTION PRICING MODEL FORMULA ON NSE DERIVATIVE MARKET IN INDIA

    Directory of Open Access Journals (Sweden)

    Ambrish Gupta

    2013-01-01

    Full Text Available The main objectives of this paper are to modify the Black-Scholes option pricing formula by adding new variables based on a stated assumption about the risk-free interest rate, and to show how the new risk-free interest rate is calculated from the modified variable. The paper also identifies the various situations that arise in empirical testing of the modified and original Black-Scholes formulas against market values, using both the assumed and the calculated risk-free interest rates.
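
    For orientation, the standard (unmodified) Black-Scholes call price that the paper takes as its starting point can be sketched as below. The paper's modified formula and its recalculated risk-free rate are not reproduced here; all input values are illustrative only.

```python
# Minimal sketch of the standard (unmodified) Black-Scholes European call price.
# The paper's modified formula and recalculated risk-free rate are not shown;
# every input value below is an invented example, not market data.
from math import log, sqrt, exp
from statistics import NormalDist

def black_scholes_call(spot, strike, rate, vol, maturity):
    """European call price under the standard Black-Scholes assumptions."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * maturity) / (vol * sqrt(maturity))
    d2 = d1 - vol * sqrt(maturity)
    n = NormalDist().cdf
    return spot * n(d1) - strike * exp(-rate * maturity) * n(d2)

# Example with made-up parameters for an index option.
print(round(black_scholes_call(spot=100.0, strike=105.0, rate=0.07, vol=0.25, maturity=0.5), 2))
```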

  5. DISCLOSURE POLICY AND PRICE VOLATILITY: A THEORETICAL DESCRIPTION AND EMPIRICAL TESTS OF THE 'FILTER EFFECT'

    Institute of Scientific and Technical Information of China (English)

    XiangminChen; JianghuiLin

    2004-01-01

    In the context of a push towards full disclosure by the regulatory authorities of securities markets, we evaluate the effectiveness of corporate disclosure policy by examining the 'filter effect'. Controlling for firm size and earnings changes, we conduct an empirical test of various disclosure options. Our results show that the recent increase in disclosure frequency in mainland China's securities markets has not yet achieved its anticipated objective. Disclosure quality remains low and small firms often manipulate their stock prices through selective release of information.

  6. Concept study of a hydrogen containment process during nuclear thermal engine ground testing

    Science.gov (United States)

    Wang, Ten-See; Stewart, Eric T.; Canabal, Francisco

    A new hydrogen containment process was proposed for ground testing of a nuclear thermal engine. It utilizes two thermophysical steps to contain the hydrogen exhaust. First, the decomposition of hydrogen through oxygen-rich combustion at higher temperature; second, the recombination of remaining hydrogen with radicals at low temperature. This is achieved with two unit operations: an oxygen-rich burner and a tubular heat exchanger. A computational fluid dynamics methodology was used to analyze the entire process on a three-dimensional domain. The computed flammability at the exit of the heat exchanger was less than the lower flammability limit, confirming the hydrogen containment capability of the proposed process.

  7. Tests of the gravitational redshift effect in space-born and ground-based experiments

    Science.gov (United States)

    Vavilova, I. B.

    2018-02-01

    This paper provides a brief overview of tests of the gravitational redshift (GRS) effect in ground-based and space-borne experiments. In particular, we consider the GRS effect in the gravitational fields of the Earth, the major planets of the Solar system, and compact stars (white dwarfs and neutron stars), where the effect is confirmed with higher accuracy. We also discuss the prospects of confirming the GRS effect for galaxies and galaxy clusters in the visible and X-ray ranges of the electromagnetic spectrum.
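
    As a reminder of the quantity such experiments measure, the weak-field gravitational redshift of a photon climbing out of the potential of a body of mass M from radius r is, to first order (a standard textbook expression added here for orientation, not taken from the record):

```latex
% Weak-field (first-order) gravitational redshift
z \;=\; \frac{\Delta\nu}{\nu} \;\approx\; \frac{GM}{rc^{2}},
\qquad \text{e.g. at the solar surface } z \approx 2.1\times10^{-6}.
```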

  8. Heavy metals in the ecosystem components at 'Degelen' testing ground of the former Semipalatinsk test site

    International Nuclear Information System (INIS)

    Yankauskas, A.B.; Lukashenko, S.N.; Amirov, A.A.; Govenko, P.V.

    2012-01-01

    The ecological situation at the former Semipalatinsk test site is characterized by a combination of both radiative and nonradiative factors. Near-portal areas of tunnels with water seepage were investigated at the 'Degelen' site. All the tunnel waters are characterized by elevated concentrations of uranium, beryllium, and molybdenum. The watercourse of tunnel number 504 is unique in its elemental composition, in particular its content of rare earth elements, whose concentrations in the water lie in the range n×10⁻⁵ – n×10⁻⁷ %. Thirteen rare earth elements were found in the samples, and the concentrations of aluminum, manganese, and zinc are comparable to those of the macro-components. Concentrations of ²³⁸U in the studied waters lie in the range n×10⁻⁴ – n×10⁻⁶ %, which points to the significance of uranium not only as a toxic element but also as a radiation factor. Analysis of the complex data obtained showed that the elevated concentrations of heavy metals in the soils of the areas under study are, as a rule, a consequence of the transport of these metals by water flows and their subsequent deposition in the sediments. (authors)

  9. Force Limiting Vibration Tests Evaluated from both Ground Acoustic Tests and FEM Simulations of a Flight Like Vehicle System Assembly

    Science.gov (United States)

    Smith, Andrew; LaVerde, Bruce; Waldon, James; Hunt, Ron

    2014-01-01

    Marshall Space Flight Center has conducted a series of ground acoustic tests with the dual goals of informing analytical judgment, and validating analytical methods when estimating vibroacoustic responses of launch vehicle subsystems. The process of repeatedly correlating finite element-simulated responses with test-measured responses has assisted in the development of best practices for modeling and post-processing. In recent work, force transducers were integrated to measure interface forces at the base of avionics box equipment. Other force data was indirectly measured using strain gauges. The combination of these direct and indirect force measurements has been used to support and illustrate the advantages of implementing the Force Limiting approach for equipment qualification tests. The comparison of force response from integrated system level tests to measurements at the same locations during component level vibration tests provides an excellent illustration. A second comparison of the measured response cases from the system level acoustic tests to finite element simulations has also produced some principles for assessing the suitability of Finite Element Models (FEMs) for making vibroacoustics estimates. The results indicate that when FEM models are employed to guide force limiting choices, they should include sufficient detail to represent the apparent mass of the system in the frequency range of interest.

  10. The effects of traited and situational impression management on a personality test: an empirical analysis

    Directory of Open Access Journals (Sweden)

    MICHAEL S. HENRY

    2006-09-01

    Full Text Available Studies examining impression management (IM) in self-report measures typically assume that impression management is either a (1) trait or (2) situational variable, which has led to often conflicting results (Stark, Chernyshenko, Chan, Lee and Drasgow, 2001). This study examined the item-level and scale-level responses on six empirically-derived facets of conscientiousness from the California Psychological Inventory (CPI) between high and low IM groups. Subjects (N = 6,220) were participants in a management assessment conducted by an external consulting firm. Subjects participated in the assessment as part of either (1) a selection or promotional process, or (2) a feedback and development process, and two specific occupational groups (sales/marketing and accounting/finance) were examined. Using the IRT-based DFIT framework (Raju, Van der Linden & Fleer, 1995), the item-level and scale-level differences were examined for the situational IM and traited IM approaches. The results indicated that relatively little DIF/DTF was present and that the differences between the two approaches to examining IM may not be as great as previously suggested.

  11. The Social Consequences of Poverty: An Empirical Test on Longitudinal Data.

    Science.gov (United States)

    Mood, Carina; Jonsson, Jan O

    Poverty is commonly defined as a lack of economic resources that has negative social consequences, but surprisingly little is known about the importance of economic hardship for social outcomes. This article offers an empirical investigation into this issue. We apply panel data methods on longitudinal data from the Swedish Level-of-Living Survey 2000 and 2010 (n = 3089) to study whether poverty affects four social outcomes: close social relations (social support), other social relations (friends and relatives), political participation, and activity in organizations. We also compare these effects across five different poverty indicators. Our main conclusion is that poverty in general has negative effects on social life. It has more harmful effects on relations with friends and relatives than on social support, and more on political participation than on organizational activity. The poverty indicator that shows the greatest impact is material deprivation (lack of cash margin), while the most prevalent poverty indicators (absolute income poverty, and especially relative income poverty) appear to have the least effect on social outcomes.

  12. Testing an empirically derived mental health training model featuring small groups, distributed practice and patient discussion.

    Science.gov (United States)

    Murrihy, Rachael C; Byrne, Mitchell K; Gonsalvez, Craig J

    2009-02-01

    Internationally, family doctors seeking to enhance their skills in evidence-based mental health treatment are attending brief training workshops, despite clear evidence in the literature that short-term, massed formats are not likely to improve skills in this complex area. Reviews of the educational literature suggest that an optimal model of training would incorporate distributed practice techniques: repeated practice over a lengthy time period, small-group interactive learning, mentoring relationships, skills-based training and an ongoing discussion of actual patients. This study investigates the potential role of group-based training incorporating multiple aspects of good pedagogy for training doctors in basic competencies in brief cognitive behaviour therapy (BCBT). Six groups of family doctors (n = 32) completed eight 2-hour sessions of BCBT group training over a 6-month period. A baseline control design was utilised with pre- and post-training measures of doctors' BCBT skills, knowledge and engagement in BCBT treatment. Family doctors' knowledge, skills in and actual use of BCBT with patients improved significantly over the course of training compared with the control period. This research demonstrates preliminary support for the efficacy of an empirically derived group training model for family doctors. Brief CBT group-based training could prove to be an effective and viable model for future doctor training.

  13. Empirical approach based on centrifuge testing for cyclic deformations of laterally loaded piles in sand

    DEFF Research Database (Denmark)

    Truong, P.; Lehane, B. M.; Zania, Varvara

    2018-01-01

    A systematic study into the response of monopiles to lateral cyclic loading in medium dense and dense sand was performed in beam and drum centrifuge tests. The centrifuge tests were carried out at different cyclic load and magnitude ratios, while the cyclic load sequence was also varied...

  14. The Empirical Dimension of Communicative Language Tests: The Case of Selected Philippine Universities

    Science.gov (United States)

    Bernardo, Alejandro S.

    2011-01-01

    This study examined the "communicativeness" of 22 English language tests designed and administered by 22 English instructors from 22 different colleges and universities in the Philippines. Its key objective was to answer the question "How communicative are the language tests used in assessing students' competence (knowledge of the…

  15. Ground Testing a Nuclear Thermal Rocket: Design of a sub-scale demonstration experiment

    Energy Technology Data Exchange (ETDEWEB)

    David Bedsun; Debra Lee; Margaret Townsend; Clay A. Cooper; Jennifer Chapman; Ronald Samborsky; Mel Bulman; Daniel Brasuell; Stanley K. Borowski

    2012-07-01

    In 2008, the NASA Mars Architecture Team found that the Nuclear Thermal Rocket (NTR) was the preferred propulsion system out of all the combinations of chemical propulsion, solar electric, nuclear electric, aerobrake, and NTR studied. Recently, the National Research Council committee reviewing the NASA Technology Roadmaps recommended the NTR as one of the top 16 technologies that should be pursued by NASA. One of the main issues with developing an NTR for future missions is the ability to economically test the full system on the ground. In the late 1990s, the Sub-surface Active Filtering of Exhaust (SAFE) concept was first proposed by Howe as a method to test NTRs at full power and full duration. The concept relied on firing the NTR into one of the test holes at the Nevada Test Site which had been constructed to test nuclear weapons. In 2011, the cost of testing an NTR and the cost of performing a proof-of-concept experiment were evaluated.

  16. Adding Theoretical Grounding to Grounded Theory: Toward Multi-Grounded Theory

    OpenAIRE

    Göran Goldkuhl; Stefan Cronholm

    2010-01-01

    The purpose of this paper is to challenge some of the cornerstones of the grounded theory approach and propose an extended and alternative approach for data analysis and theory development, which the authors call multi-grounded theory (MGT). A multi-grounded theory is not only empirically grounded; it is also grounded in other ways. Three different grounding processes are acknowledged: theoretical, empirical, and internal grounding. The authors go beyond the pure inductivist approach in GT an...

  17. HIV testing and care in Burkina Faso, Kenya, Malawi and Uganda: ethics on the ground

    Directory of Open Access Journals (Sweden)

    Obermeyer Carla Makhlouf

    2013-01-01

    Full Text Available Abstract Background The ethical discourse about HIV testing has undergone a profound transformation in recent years. The greater availability of antiretroviral therapy (ART) has led to a global scaling up of HIV testing and counseling as a gateway to prevention, treatment and care. In response, critics raised important ethical questions, including: How do different testing policies and practices undermine or strengthen informed consent and medical confidentiality? How well do different modalities of testing provide benefits that outweigh risks of harm? To what degree do current testing policies and programs provide equitable access to HIV services? And finally, what lessons have been learned from the field about how to improve the delivery of HIV services to achieve public health objectives and protections for human rights? This article reviews the empirical evidence that has emerged to answer these questions, from four sub-Saharan African countries, namely: Burkina Faso, Kenya, Malawi and Uganda. Discussion Expanding access to treatment and prevention in these four countries has made the biomedical benefits of HIV testing increasingly clear. But serious challenges remain with regard to protecting human rights, informed consent and ensuring linkages to care. Policy makers and practitioners are grappling with difficult ethical issues, including how to protect confidentiality, how to strengthen linkages to care, and how to provide equitable access to services, especially for most at risk populations, including men who have sex with men. Summary The most salient policy questions about HIV testing in these countries no longer address whether to scale up routine PITC (and other strategies), but how. Instead, individuals, health care providers and policy makers are struggling with a host of difficult ethical questions about how to protect rights, maximize benefits, and mitigate risks in the face of resource scarcity.

  18. The evolution of intra-organizational trust networks : The case of a German paper factory: An empirical test of six trust mechanisms

    NARCIS (Netherlands)

    Bunt, Gerhard G. van der; Wittek, Rafael P. M.; Klepper, Maurits C. de

    Based on the distinction between expressive and instrumental motives, six theoretical mechanisms for the formation of trust relationships are elaborated and empirically tested. When expressive motives drive tie formation, individuals primarily attach emotional value to social relationships. Three

  19. Empirical component model to predict the overall performance of heating coils: Calibrations and tests based on manufacturer catalogue data

    International Nuclear Information System (INIS)

    Ruivo, Celestino R.; Angrisani, Giovanni

    2015-01-01

    Highlights:
    • An empirical model for predicting the performance of heating coils is presented.
    • Low and high heating capacity cases are used for calibration.
    • Versions based on several effectiveness correlations are tested.
    • Catalogue data are considered in approach testing.
    • The approach is a suitable component model to be used in dynamic simulation tools.

    Abstract: A simplified methodology for predicting the overall behaviour of heating coils is presented in this paper. The coil performance is predicted by the ε-NTU method. Manufacturers usually do not provide the overall thermal resistance or the geometric details required either for device selection or for applying known empirical correlations to estimate the thermal resistances involved. In the present work, heating capacity tables from the manufacturer catalogue are used to calibrate simplified approaches based on the classical theory of heat exchangers, namely the effectiveness method. Only two reference operating cases are required to calibrate each approach. The validity of the simplified approaches is investigated for a relatively large number of operating cases listed in the technical catalogue of a manufacturer. Four types of coils from three sizes of air handling units are considered. A comparison is conducted between the heating coil capacities provided by the methodology and the values given by the manufacturer catalogue. The results show that several of the proposed approaches are suitable component models to be integrated in dynamic simulation tools of air conditioning systems such as TRNSYS or EnergyPlus.
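
    A minimal sketch of the effectiveness (ε-NTU) calculation referred to above is given below, assuming a cross-flow coil with both fluids unmixed and a UA value that, in the paper's approach, would be calibrated from two catalogue operating points; the correlation choice and all numbers are illustrative, not the authors' calibrated model.

```python
# Sketch of an effectiveness-NTU (epsilon-NTU) heating-coil capacity estimate.
# Assumptions: cross-flow heat exchanger with both fluids unmixed; UA is
# treated as known (in the paper it is calibrated from two catalogue points).
from math import exp

def coil_capacity(ua, m_air, cp_air, m_water, cp_water, t_water_in, t_air_in):
    c_air, c_water = m_air * cp_air, m_water * cp_water
    c_min, c_max = min(c_air, c_water), max(c_air, c_water)
    c_r = c_min / c_max
    ntu = ua / c_min
    # Standard approximate correlation for cross-flow, both fluids unmixed.
    eff = 1.0 - exp((exp(-c_r * ntu**0.78) - 1.0) / (c_r * ntu**-0.22))
    return eff * c_min * (t_water_in - t_air_in)   # heating capacity, W

# Illustrative numbers only (not catalogue data).
print(coil_capacity(ua=2500.0, m_air=1.2, cp_air=1006.0,
                    m_water=0.4, cp_water=4186.0,
                    t_water_in=70.0, t_air_in=20.0))
```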

  20. Ensemble Empirical Mode Decomposition based methodology for ultrasonic testing of coarse grain austenitic stainless steels.

    Science.gov (United States)

    Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N

    2015-03-01

    A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and application of a signal minimisation algorithm on selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and signal data length on the EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. The methodology is successfully employed for detection of defects in 50 mm thick coarse grain austenitic stainless steel specimens. A signal to noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved compared with the approach of summing the selected IMFs. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves to be effective for adaptive signal reconstruction with improved signal to noise ratio. This methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
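
    The decomposition-and-selection idea described above can be sketched as follows, assuming the third-party PyEMD package for the EEMD step; the fixed band of IMFs kept here is a simple stand-in for the paper's signal-minimisation algorithm, and the synthetic A-scan is invented for illustration.

```python
# Sketch: EEMD-based reconstruction of an ultrasonic A-scan.
# Assumes the third-party PyEMD package (pip install EMD-signal); keeping a
# fixed band of IMFs stands in for the paper's minimisation-based selection.
import numpy as np
from PyEMD import EEMD

def eemd_reconstruct(signal, keep=(1, 4)):
    """Decompose with EEMD and rebuild the signal from a band of IMFs."""
    eemd = EEMD(trials=50)          # 50 noise-added ensemble members
    imfs = eemd.eemd(signal)        # rows are IMFs, fastest oscillations first
    lo, hi = keep
    return imfs[lo:hi + 1].sum(axis=0)

# Synthetic example: a 5 MHz echo buried in broadband grain-like noise,
# sampled at 100 MHz (all values invented for illustration).
fs = 100e6
t = np.arange(0, 20e-6, 1 / fs)
echo = np.exp(-((t - 10e-6) ** 2) / (2 * (0.3e-6) ** 2)) * np.sin(2 * np.pi * 5e6 * t)
noisy = echo + 0.5 * np.random.default_rng(0).standard_normal(t.size)
reconstructed = eemd_reconstruct(noisy)
```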

  1. A Test of the Empirical Profile and Coherence of the DSM-5 Psychopathy Specifier.

    Science.gov (United States)

    Miller, Joshua D; Lamkin, Joanna; Maples-Keller, Jessica L; Sleep, Chelsea E; Lynam, Donald R

    2017-11-13

    The Diagnostic and Statistical Manual of Mental Disorders-5th edition (DSM-5; American Psychiatric Association, 2013) introduced a psychopathy specifier (DSM-5 PS) as part of the Section III diagnostic model of antisocial personality disorder. Designed to capture the construct of fearless dominance/boldness, the DSM-5 PS is assessed on the basis of the presence of low scores on traits of withdrawal and anxiousness, and high scores on attention seeking. These constructs have garnered attention in the past decade but are the subject of substantial debate as to their role in the conceptualization and assessment of psychopathy, given their limited relations to the maladaptive outcomes typically associated with this personality disorder. In the current study (N = 340 undergraduates; 170 informants), we examined the DSM-5 PS, both in composite form and in its trait subscales, to investigate the degree to which the DSM-5 PS manifested empirical profiles associated with psychopathy and its maladaptive correlates. Consistent with prior fearless dominance/boldness research, the DSM-5 PS manifested limited relations with other components of psychopathy, symptoms of DSM-5 Section II and III antisocial personality disorder, and self- and informant-rated impairment scores. When examined at the individual subscale level, the three DSM-5 PS subscales manifested only partially overlapping profiles, and only one of the three, Attention Seeking, demonstrated an association with maladaptivity (e.g., externalizing behaviors). These findings raise important concerns about the coherence and utility of the DSM-5 PS as a diagnostic specifier included in a psychiatric nosology. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  2. Empirical validation of the CRAFFT Abuse Screening Test in a Spanish sample.

    Science.gov (United States)

    Rial, Antonio; Kim-Harris, Sion; Knight, John R; Araujo, Manuel; Gómez, Patricia; Braña, Teresa; Varela, Jesús; Golpe, Sandra

    2018-01-15

    The CRAFFT Substance Abuse Screening Instrument, developed by the Center for Adolescent Substance Abuse Research (CeASAR) (Knight et al., 1999), is a screening tool for high-risk alcohol and drug consumption designed for use with adolescents. Since its publication it has been the subject of translations and validations in different countries, populations and contexts that have demonstrated its considerable potential. However, there is still no empirical validation study that would ensure its good psychometric performance in Spain. The aim of this paper is to develop an adapted version of the CRAFFT in Spanish and to analyze its psychometric properties in a sample of Spanish adolescents. For this purpose, an individual interview was conducted with 312 adolescents aged between 12 and 18 years (M = 15.01; SD = 1.83) from the Galician community. The interview included part of the Adolescent Diagnostic Interview (ADI) and the Problem Oriented Screening Instrument for Teenagers (POSIT). The results obtained, similar to those found in other countries, show that the Spanish version of the CRAFFT has good psychometric properties. It was found to have satisfactory internal consistency, with a Cronbach's alpha value of .74. In terms of sensitivity and specificity, values of 74.4% and 96.4%, respectively, were obtained, and the area under the ROC curve was .946. The Spanish version of the CRAFFT is made available to researchers and professionals in the field of addictive behaviors, so that it can be used with the necessary psychometric guarantees.
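
    For readers less familiar with the indices reported (Cronbach's alpha, sensitivity, specificity, area under the ROC curve), a small sketch of how they can be computed from item-level screening data follows; the arrays are random placeholders rather than the validation sample, and the cut-off of 2 is only the conventional CRAFFT threshold.

```python
# Sketch of the psychometric indices reported above (Cronbach's alpha,
# sensitivity/specificity at a cut-off, ROC AUC). Data are random
# placeholders, not the CRAFFT validation sample.
import numpy as np
from sklearn.metrics import roc_auc_score

def cronbach_alpha(items):
    """items: (n_subjects, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
items = rng.integers(0, 2, size=(312, 6))        # 6 yes/no CRAFFT-style items
total = items.sum(axis=1)
diagnosis = rng.integers(0, 2, size=312)          # reference-standard outcome

alpha = cronbach_alpha(items)
auc = roc_auc_score(diagnosis, total)
screen_pos = total >= 2                           # conventional CRAFFT cut-off
sensitivity = (screen_pos & (diagnosis == 1)).sum() / (diagnosis == 1).sum()
specificity = (~screen_pos & (diagnosis == 0)).sum() / (diagnosis == 0).sum()
print(alpha, auc, sensitivity, specificity)
```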

  3. SmartUnit: Empirical Evaluations for Automated Unit Testing of Embedded Software in Industry

    OpenAIRE

    Zhang, Chengyu; Yan, Yichen; Zhou, Hanru; Yao, Yinbo; Wu, Ke; Su, Ting; Miao, Weikai; Pu, Geguang

    2018-01-01

    In this paper, we address automated coverage-based unit testing for embedded software. To achieve this goal, building on an analysis of industrial requirements and our previous work on the automated unit testing tool CAUT, we built a new tool, SmartUnit, to meet the engineering requirements that arise in our partner companies. SmartUnit is a dynamic symbolic execution implementation, which supports statement, branch, boundary value and MC/DC coverage. SmartUnit has been used to test more than one...

  4. Image processing and computer controls for video profile diagnostic system in the ground test accelerator (GTA)

    International Nuclear Information System (INIS)

    Wright, R.; Zander, M.; Brown, S.; Sandoval, D.; Gilpatrick, D.; Gibson, H.

    1992-01-01

    This paper describes the application of video image processing to beam profile measurements on the Ground Test Accelerator (GTA). A diagnostic was needed to measure beam profiles in the intermediate matching section (IMS) between the radio-frequency quadrupole (RFQ) and the drift tube linac (DTL). Beam profiles are measured by injecting puffs of gas into the beam. The light emitted from the beam-gas interaction is captured and processed by a video image processing system, generating the beam profile data. A general purpose, modular and flexible video image processing system, imagetool, was used for the GTA image profile measurement. The development of both software and hardware for imagetool and its integration with the GTA control system (GTACS) is discussed. The software includes specialized algorithms for analyzing data and calibrating the system. The underlying design philosophy of imagetool was tested by the experience of building and using the system, pointing the way for future improvements. (Author) (3 figs., 4 refs.)
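
    As an illustration of the kind of processing such a diagnostic performs, the sketch below collapses a single video frame of beam-gas light into a one-dimensional profile and fits a Gaussian to estimate the beam centre and width; this is a generic approach with synthetic data, not the GTA imagetool implementation.

```python
# Generic sketch of turning a beam-gas fluorescence image into a profile:
# project the frame onto one axis, then fit a Gaussian for centre and width.
# Illustrative only; this is not the GTA imagetool algorithm.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

def beam_profile(frame):
    """frame: 2-D array (video image). Returns (centre, rms width) in pixels."""
    profile = frame.sum(axis=0).astype(float)        # project onto the x axis
    x = np.arange(profile.size)
    p0 = [profile.max() - profile.min(), profile.argmax(), 10.0, profile.min()]
    (amp, mu, sigma, offset), _ = curve_fit(gaussian, x, profile, p0=p0)
    return mu, abs(sigma)

# Synthetic 480x640 frame: a Gaussian beam spot plus background noise.
y, x = np.mgrid[0:480, 0:640]
frame = 200.0 * np.exp(-((x - 320) ** 2) / (2 * 25 ** 2) - ((y - 240) ** 2) / (2 * 40 ** 2))
frame += np.random.default_rng(1).poisson(5, frame.shape)
print(beam_profile(frame))
```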

  5. Treatability tests on water from a low-level waste burial ground

    International Nuclear Information System (INIS)

    Taylor, P.A.

    1990-01-01

    Lab-scale treatability tests on trench water from a low-level waste burial ground have shown that the water can be successfully treated by existing wastewater treatment plants at Oak Ridge National Laboratory. Water from the four most highly contaminated trenches that had been identified to date was used in the treatability tests. The softening and ion exchange processes used in the Process Wastewater Treatment Plant removed Sr-90 from the trench water, which was the only radionuclide present at above the discharge limits. The air stripping and activated carbon adsorption processes used in the Nonradiological Wastewater Treatment Plant removed volatile and semi-volatile organics, which were the main contaminants in the trench water, to below detection limits. 6 refs., 2 figs., 7 tabs

  6. The ground testing of a 2 kWe solar dynamic space power system

    International Nuclear Information System (INIS)

    Calogeras, J.E.

    1992-01-01

    Over the past 25 years Space Solar Dynamic component development has advanced to the point where it is considered a leading candidate power source technology for the evolutionary phases of the Space Station Freedom (SSF) program. Selection of SD power was based on studies and analyses which indicated significant savings in life cycle costs, launch mass and EVA requirements were possible when the system is compared to more conventional photovoltaic/battery power systems. Issues associated with micro-gravity operation such as the behavior of the thermal energy storage materials are being addressed in other programs. This paper reports that a ground test of a 2 kWe solar dynamic system is being planned by the NASA Office of Aeronautics and Space Technology to address the integration issues. The test will be scalable up to 25 kWe, will be flight configured and will incorporate relevant features of the SSF Solar Dynamic Power Module design

  7. Construction management at the SP-100 ground engineering system test site

    International Nuclear Information System (INIS)

    Burchell, G.P.; Wilson, L.R.

    1991-01-01

    Contractors under the U.S. Department of Energy management have implemented a comprehensive approach to the management of design and construction of the complex facility modifications at the SP-100 Ground Engineering System Test Site on the Hanford Reservation. The SP-100 Test Site employs a multi-organizational integrated management approach with clearly defined responsibilities to assure success. This approach allows for thorough planning and analysis before the project kick off, thus minimizing the number and magnitude of problems which arise during the course of the project. When combined with a comprehensive cost and schedule/project management reporting system the problems which do occur are recognized early enough to assure timely intervention and resolution

  8. Pre-Flight Ground Testing of the Full-Scale HIFiRE-1 at Fully Duplicated Flight Conditions

    National Research Council Canada - National Science Library

    Wadhams, Tim P; MacLean, Matthew G; Holden, Michael S; Mundy, Erik

    2008-01-01

    As part of an experimental study to obtain detailed heating and pressure data over the full-scale HIFiRE-1 flight geometry, CUBRC has completed a 30-run matrix of ground tests, sponsored by the AFOSR...

  9. Mechanistic-empirical subgrade design model based on heavy vehicle simulator test results

    CSIR Research Space (South Africa)

    Theyse, HL

    2006-06-01

    Full Text Available Although Accelerated Pavement Testing (APT) is often done with specific objectives, valuable pavement performance data is generated over the long-term that may be used to investigate pavement behaviour in general and calibrate mechanistic...

  10. [Willingness of Students of Economics to Pay for Predictive Oncological Genetic Testing - An Empirical Analysis].

    Science.gov (United States)

    Siol, V; Lange, A; Prenzler, A; Neubauer, S; Frank, M

    2017-05-01

    Objectives: The present study aims to investigate the interest of young adults in predictive oncological genetic testing and their willingness to pay for such a test. Furthermore, major determinants of the 2 variables of interest were identified. Methods: 348 students of economics from the Leibniz University of Hanover were queried in July 2013 using an extensive questionnaire. Among other things, the participants were asked whether they were interested in information about their probability of developing cancer in the future and about their willingness to pay for such information. Data were analysed using descriptive statistics and ordinal probit regressions. Additionally, marginal effects were calculated. Results: About 50% of the students were interested in predictive oncological genetic testing and were willing to pay for the test. Moreover, the participants who were willing to pay for the test partly attached high monetary values to the information that could thus be obtained. The study shows that the interest of the students and their willingness to pay were primarily influenced by individual attitudes and perceptions. Conclusions: The study demonstrates that young adults are interested in predictive genetic testing and appreciate information about their probability of developing cancer someday. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Edge effects and geometric constraints: a landscape-level empirical test.

    Science.gov (United States)

    Ribeiro, Suzy E; Prevedello, Jayme A; Delciellos, Ana Cláudia; Vieira, Marcus Vinícius

    2016-01-01

    Edge effects are pervasive in landscapes yet their causal mechanisms are still poorly understood. Traditionally, edge effects have been attributed to differences in habitat quality along the edge-interior gradient of habitat patches, under the assumption that no edge effects would occur if habitat quality was uniform. This assumption was questioned recently after the recognition that geometric constraints tend to reduce population abundances near the edges of habitat patches, the so-called geometric edge effect (GEE). Here, we present the first empirical, landscape-level evaluation of the importance of the GEE in shaping abundance patterns in fragmented landscapes. Using a data set on the distribution of small mammals across 18 forest fragments, we assessed whether the incorporation of the GEE into the analysis changes the interpretation of edge effects and the degree to which predictions based on the GEE match observed responses. Quantitative predictions were generated for each fragment using simulations that took into account home range, density and matrix use for each species. The incorporation of the GEE into the analysis changed substantially the interpretation of overall observed edge responses at the landscape scale. Observed abundances alone would lead to the conclusion that the small mammals as a group have no consistent preference for forest edges or interiors and that the black-eared opossum Didelphis aurita (a numerically dominant species in the community) has on average a preference for forest interiors. In contrast, incorporation of the GEE suggested that the small mammal community as a whole has a preference for forest edges, whereas D. aurita has no preference for forest edges or interiors. Unexplained variance in edge responses was reduced by the incorporation of GEE, but remained large, varying greatly on a fragment-by-fragment basis. This study demonstrates how to model and incorporate the GEE in analyses of edge effects and that this
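
    The geometric edge effect invoked above can be illustrated with a toy simulation, much simpler than the fragment-specific simulations used in the study: assuming circular home ranges, uniformly placed range centres and no use of the matrix, a sample point near the edge overlaps fewer home ranges than a point in the interior even though habitat quality is uniform.

```python
# Toy illustration of the geometric edge effect (GEE): with uniform habitat
# quality, home-range centres confined to the patch, and circular home
# ranges, fewer ranges overlap a sample point near the edge than one in the
# interior. All geometry and numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
patch = 1000.0          # square patch side length (m)
radius = 100.0          # home-range radius (m)
n_ind = 5000

centres = rng.uniform(0, patch, size=(n_ind, 2))
edge_point = np.array([radius / 2, patch / 2])       # 50 m from the edge
core_point = np.array([patch / 2, patch / 2])        # patch centre

def captures(point):
    """Individuals whose home range contains the sample point."""
    return int((np.linalg.norm(centres - point, axis=1) < radius).sum())

print("edge:", captures(edge_point), "interior:", captures(core_point))
```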

  12. Empirical tests of pre-main-sequence stellar evolution models with eclipsing binaries

    Science.gov (United States)

    Stassun, Keivan G.; Feiden, Gregory A.; Torres, Guillermo

    2014-06-01

    We examine the performance of standard pre-main-sequence (PMS) stellar evolution models against the accurately measured properties of a benchmark sample of 26 PMS stars in 13 eclipsing binary (EB) systems having masses 0.04-4.0 M⊙ and nominal ages ≈1-20 Myr. We provide a definitive compilation of all fundamental properties for the EBs, with a careful and consistent reassessment of observational uncertainties. We also provide a definitive compilation of the various PMS model sets, including physical ingredients and limits of applicability. No set of model isochrones is able to successfully reproduce all of the measured properties of all of the EBs. In the H-R diagram, the masses inferred for the individual stars by the models are accurate to better than 10% at ≳1 M⊙, but below 1 M⊙ they are discrepant by 50-100%. Adjusting the observed radii and temperatures using empirical relations for the effects of magnetic activity helps to resolve the discrepancies in a few cases, but fails as a general solution. We find evidence that the failure of the models to match the data is linked to the triples in the EB sample; at least half of the EBs possess tertiary companions. Excluding the triples, the models reproduce the stellar masses to better than ∼10% in the H-R diagram, down to 0.5 M⊙, below which the current sample is fully contaminated by tertiaries. We consider several mechanisms by which a tertiary might cause changes in the EB properties and thus corrupt the agreement with stellar model predictions. We show that the energies of the tertiary orbits are comparable to that needed to potentially explain the scatter in the EB properties through injection of heat, perhaps involving tidal interaction. It seems from the evidence at hand that this mechanism, however it operates in detail, has more influence on the surface properties of the stars than on their internal structure, as the lithium abundances are broadly in good agreement with model predictions. The

  13. Tests of Selection in Pooled Case-Control Data: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Nitin eUdpa

    2011-11-01

    Full Text Available For smaller organisms with faster breeding cycles, artificial selection can be used to create sub-populations with different phenotypic traits. Genetic tests can be employed to identify the causal markers for the phenotypes, as a precursor to engineering strains with a combination of traits. Traditional approaches involve analyzing crosses of inbred strains to test for co-segregation with genetic markers. Here we take advantage of cheaper next generation sequencing techniques to identify genetic signatures of adaptation to the selection constraints. Obtaining individual sequencing data is often unrealistic due to cost and sample issues, so we focus on pooled genomic data. In this paper, we explore a series of statistical tests for selection using pooled case (under selection) and control populations. Extensive simulations are used to show that these approaches work well for a wide range of population divergence times and strong selective pressures. We show that pooling does not have a significant impact on statistical power. The tests are also robust to reasonable variations in several different parameters, including window size, base-calling error rate, and sequencing coverage. We then demonstrate the viability (and the challenges) of one of these methods in two independent Drosophila populations (Drosophila melanogaster) bred under selection for hypoxia and accelerated development, respectively. Testing for extreme hypoxia tolerance showed clear signals of selection, pointing to loci that are important for hypoxia adaptation. Overall, we outline a strategy for finding regions under selection using pooled sequences, then devise optimal tests for that strategy. The approaches show promise for detecting selection, even several generations after fixation of the beneficial allele has occurred.
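
    One simple per-site statistic of the kind such test suites build on can be sketched as follows: compare the alternate-allele read fraction between the case and control pools with a two-proportion z-test, with read depth acting as the sample size. The counts are invented and the test is a generic illustration, not the authors' specific methods.

```python
# Sketch of a per-site test on pooled case/control sequencing data: compare
# alternate-allele read fractions between pools with a two-proportion z-test
# (read depth acts as the sample size). Counts are made up; this is a generic
# illustration, not the paper's test suite.
import numpy as np
from scipy.stats import norm

def pooled_site_test(alt_case, depth_case, alt_ctrl, depth_ctrl):
    p1, p2 = alt_case / depth_case, alt_ctrl / depth_ctrl
    p = (alt_case + alt_ctrl) / (depth_case + depth_ctrl)
    se = np.sqrt(p * (1 - p) * (1 / depth_case + 1 / depth_ctrl))
    z = (p1 - p2) / se
    return z, 2 * norm.sf(abs(z))            # two-sided p-value

# Example: one site with 80x coverage in each pool.
z, pval = pooled_site_test(alt_case=62, depth_case=80, alt_ctrl=30, depth_ctrl=80)
print(z, pval)
```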

  14. A blind test of nondestructive underground void detection by ground penetrating radar (GPR)

    Science.gov (United States)

    Lai, Wallace W. L.; Chang, Ray K. W.; Sham, Janet F. C.

    2018-02-01

    Blind tests/experiments are widely adopted in scientific disciplines such as drug testing, clinical trials and psychology, but are not popular in nondestructive testing and evaluation (NDTE) or near-surface geophysics (NSG). This paper introduces a blind test of nondestructive underground void detection in highways/pavements using ground penetrating radar (GPR). Its purpose is to help the Highways Department (HyD) of the Hong Kong Government evaluate the feasibility of large-scale, nationwide application and examine the ability of appropriate service providers to carry out such works. In past failure cases of such NDTE/NSG work awarded on lowest bid price, it is not easy to know which part(s) of SWIMS (S - service provider, i.e. people; W - work procedure; I - instrumentation; M - materials in the complex underground; S - specifications by client) failed, and how. This work carried out the blind test by burying fit balls (as voids) under a site with a reinforced concrete road and paving blocks, done by PolyU team A. The blind test on void centroid, spread and cover depth was then carried out by PolyU team B without prior information. With this baseline, a marking scheme, acceptance criteria and passing mark were set to test six local commercial service providers, determine their scores and evaluate their performance. A pass is a prerequisite for the award of a service contract of similar nature. In this first attempt at the blind test, results were not satisfactory and it is concluded that 'S - service provider' and 'W - work procedure' within SWIMS contributed most to the unsatisfactory performance.

  15. Design And Ground Testing For The Expert PL4/PL5 'Natural And Roughness Induced Transition'

    Science.gov (United States)

    Masutti, Davie; Chazot, Olivier; Donelli, Raffaele; de Rosa, Donato

    2011-05-01

    Unpredicted boundary layer transition can dramatically impact the stability of the vehicle and its aerodynamic coefficients and reduce the efficiency of the thermal protection system. In this frame, ESA started the EXPERT (European eXPErimental Reentry Testbed) program to provide and perform in-flight experiments in order to obtain aerothermodynamic data for the validation of numerical models and of ground-to-flight extrapolation methodologies. For the boundary layer transition investigation, the EXPERT vehicle is equipped with two specific payloads, PL4 and PL5, concerning respectively the study of natural and roughness-induced transition. The paper is a survey of the design process of these two in-flight experiments and covers the major analyses and findings encountered during development of the payloads. A large number of transition criteria have been investigated and used to estimate either the severity of the distributed roughness height arising from nose erosion or the effectiveness of the isolated roughness element height in forcing boundary layer transition. Supporting the PL4 design, linear stability computations and CFD analyses have been performed by CIRA on the EXPERT flight vehicle to determine the amplification factor of boundary layer instabilities at different points of the re-entry trajectory. Ground test experiments for PL5 are carried out in the Mach 6 VKI H3 Hypersonic Wind Tunnel with Reynolds numbers ranging from 18E6/m to 26E6/m. Infrared measurements (Stanton number) and flow visualization are used on a 1/16-scale model of the EXPERT vehicle and a flat plate to validate the Potter and Whitfield criterion as a suitable methodology for ground-to-flight extrapolation and for the payload design.

  16. What Drives Emergency Department Patient Satisfaction? An Empirical Test using Structural Equation Modeling

    DEFF Research Database (Denmark)

    Sørup, Christian Michel; Jacobsen, Peter

    2013-01-01

    Patient satisfaction determinants in emergency departments (EDs) have for decades been heavily investigated. Despite great focus, a lack of consensus about which parameters are deemed most important remains. This study proposes an integrated framework for ED patient satisfaction, testing four key...

  17. A comparison of usability methods for testing interactive health technologies: Methodological aspects and empirical evidence

    NARCIS (Netherlands)

    Jaspers, Monique W. M.

    2009-01-01

    OBJECTIVE: Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the

  18. Is high employee turnover really harmful?: An empirical test using company records

    NARCIS (Netherlands)

    Glebbeek, A.C.; Bax, E.H.

    2004-01-01

    We tested the hypothesis that employee turnover and firm performance have an inverted U-shaped relationship: overly high or low turnover is harmful. Our analysis was based on economic performance data from 110 offices of a temporary employment agency. These offices had high variation in turnover but

  19. Sex Ratios, Economic Power, and Women's Roles: A Theoretical Extension and Empirical Test.

    Science.gov (United States)

    South, Scott J.

    1988-01-01

    Tested hypotheses concerning sex ratios, women's roles, and economic power with data from 111 countries. Found undersupply of women positively associated with proportion of women who marry and fertility rate; inversely associated with women's average age at marriage, literacy rate, and divorce rate. Suggests women's economic power may counteract…

  20. Testing an integrated model of operations capabilities An empirical study of Australian airlines

    NARCIS (Netherlands)

    Nand, Alka Ashwini; Singh, Prakash J.; Power, Damien

    2013-01-01

    Purpose - The purpose of this paper is to test the integrated model of operations strategy as proposed by Schmenner and Swink to explain whether firms trade off or accumulate capabilities, taking into account their positions relative to their asset and operating frontiers.

  1. An Empirical Test of Anchoring the NEP Scale in Environmental Ethics

    Science.gov (United States)

    Noblet, Caroline L.; Anderson, Mark; Teisl, Mario F.

    2013-01-01

    Some argue that the new ecological paradigm (NEP) scale is incomplete and does not adequately reflect contemporary debates in environmental ethics. We focus on one specific shortcoming of the NEP, its lack of an item to reflect an ecocentric viewpoint. To test this concern, we administered the NEP to three different audiences and included one…

  2. An Empirical Test of Cameron's Dimensions of Effectiveness: Implications for Australian Tertiary Institutions.

    Science.gov (United States)

    Lysons, Art; Ryder, Paul

    1988-01-01

    A theory postulating nine dimensions of effectiveness in U.S. higher education institutions was tested in an Australian sample. Findings suggest that administrators should use caution when extrapolating from results in at least the Australian context. Areas for future research are also suggested by cross-cultural comparisons. (Author/MSE)

  3. Labour turnover and its effects on performance : an empirical test using firm data

    NARCIS (Netherlands)

    Glebbeek, A.C.; Bax, E.H.

    2002-01-01

    In this article we test the hypothesis that the relationship between labour turnover and the economic performance of the firm is bell-shaped: a turnover level that is too low has a negative effect, and so does a level that is too high. Our analysis is based on economic performance data of 110 offices of a temp

  4. Ground testing and flight demonstration of charge management of insulated test masses using UV-LED electron photoemission

    Science.gov (United States)

    Saraf, Shailendhar; Buchman, Sasha; Balakrishnan, Karthik; Lui, Chin Yang; Soulage, Michael; Faied, Dohy; Hanson, John; Ling, Kuok; Jaroux, Belgacem; Suwaidan, Badr Al; AlRashed, Abdullah; Al-Nassban, Badr; Alaqeel, Faisal; Harbi, Mohammed Al; Salamah, Badr Bin; Othman, Mohammed Bin; Qasim, Bandar Bin; Alfauwaz, Abdulrahman; Al-Majed, Mohammed; DeBra, Daniel; Byer, Robert

    2016-12-01

    The UV-LED mission demonstrates the precise control of the potential of electrically isolated test masses. Test mass charge control is essential for the operation of space accelerometers and drag-free sensors which are at the core of geodesy, aeronomy and precision navigation missions as well as gravitational wave experiments and observatories. Charge management using photoelectrons generated by the 254 nm UV line of Hg was first demonstrated on Gravity Probe B and is presently part of the LISA Pathfinder technology demonstration. The UV-LED mission and prior ground testing demonstrates that AlGaN UVLEDs operating at 255 nm are superior to Hg lamps because of their smaller size, lower power draw, higher dynamic range, and higher control authority. We show laboratory data demonstrating the effectiveness and survivability of the UV-LED devices and performance of the charge management system. We also show flight data from a small satellite experiment that was one of the payloads on KACST’s SaudiSat-4 mission that demonstrates ‘AC charge control’ (UV-LEDs and bias are AC modulated with adjustable relative phase) between a spherical test mass and its housing. The result of the mission brings the UV-LED device Technology Readiness Level (TRL) to TRL-9 and the charge management system to TRL-7. We demonstrate the ability to control the test mass potential on an 89 mm diameter spherical test mass over a 20 mm gap in a drag-free system configuration, with potential measured using an ultra-high impedance contact probe. Finally, the key electrical and optical characteristics of the UV-LEDs showed less than 7.5% change in performance after 12 months in orbit.

  5. Ground testing and flight demonstration of charge management of insulated test masses using UV-LED electron photoemission

    International Nuclear Information System (INIS)

    Saraf, Shailendhar; Buchman, Sasha; Balakrishnan, Karthik; Lui, Chin Yang; Alfauwaz, Abdulrahman; DeBra, Daniel; Soulage, Michael; Faied, Dohy; Hanson, John; Ling, Kuok; Jaroux, Belgacem; Suwaidan, Badr Al; AlRashed, Abdullah; Al-Nassban, Badr; Alaqeel, Faisal; Harbi, Mohammed Al; Salamah, Badr Bin; Othman, Mohammed Bin; Qasim, Bandar Bin; Al-Majed, Mohammed

    2016-01-01

    The UV-LED mission demonstrates the precise control of the potential of electrically isolated test masses. Test mass charge control is essential for the operation of space accelerometers and drag-free sensors which are at the core of geodesy, aeronomy and precision navigation missions as well as gravitational wave experiments and observatories. Charge management using photoelectrons generated by the 254 nm UV line of Hg was first demonstrated on Gravity Probe B and is presently part of the LISA Pathfinder technology demonstration. The UV-LED mission and prior ground testing demonstrates that AlGaN UVLEDs operating at 255 nm are superior to Hg lamps because of their smaller size, lower power draw, higher dynamic range, and higher control authority. We show laboratory data demonstrating the effectiveness and survivability of the UV-LED devices and performance of the charge management system. We also show flight data from a small satellite experiment that was one of the payloads on KACST’s SaudiSat-4 mission that demonstrates ‘AC charge control’ (UV-LEDs and bias are AC modulated with adjustable relative phase) between a spherical test mass and its housing. The result of the mission brings the UV-LED device Technology Readiness Level (TRL) to TRL-9 and the charge management system to TRL-7. We demonstrate the ability to control the test mass potential on an 89 mm diameter spherical test mass over a 20 mm gap in a drag-free system configuration, with potential measured using an ultra-high impedance contact probe. Finally, the key electrical and optical characteristics of the UV-LEDs showed less than 7.5% change in performance after 12 months in orbit. (paper)

  6. An empirical comparison of effective concentration estimators for evaluating aquatic toxicity test responses

    Energy Technology Data Exchange (ETDEWEB)

    Bailer, A.J.; Hughes, M.R.; Denton, D.L.; Oris, J.T.

    2000-01-01

    Aquatic toxicity tests are statistically evaluated either by hypothesis testing procedures to derive a no-observed-effect concentration or by inverting regression models to calculate the concentration associated with a specific reduction from the control response. These latter methods can be described as potency estimation methods. Standard US Environmental Protection Agency (USEPA) potency estimation methods are based on two different techniques. For continuous or count response data, a nominally nonparametric method that assumes monotonically decreasing responses and piecewise linear patterns between successive concentration groups is used. For quantal responses, a probit regression model with a linear dose term is fit. These techniques were compared with a recently developed parametric regression-based estimator, the relative inhibition estimator, RIp. This method is based on fitting generalized linear models, followed by estimation of the concentration associated with a particular decrement relative to control responses. These estimators, with levels of inhibition (p) of 25 and 50%, were applied to a series of chronic toxicity tests in a US EPA Region 9 database of reference toxicity tests. Biological responses evaluated in these toxicity tests included the number of young produced in three broods by the water flea (Ceriodaphnia dubia) and germination success and tube length data from the giant kelp (Macrocystis pyrifera). The greatest discrepancy between the RIp and standard US EPA estimators was observed for C. dubia. The concentration-response pattern for this biological endpoint exhibited nonmonotonicity more frequently than any of the other endpoints. Future work should consider optimal experimental designs to estimate these quantities, methods for constructing confidence intervals, and simulation studies to explore the behavior of these estimators under known conditions.
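
    The regression-inversion idea behind potency estimators such as RIp can be sketched as follows: fit a concentration-response curve with a log link and solve for the concentration giving a p% reduction from the fitted control response. The data and the simple least-squares fit below are illustrative stand-ins; the paper's RIp estimator is based on generalized linear models fitted to the actual endpoints.

```python
# Sketch of a regression-based potency estimate (in the spirit of RIp):
# fit a log-linear concentration-response model, then invert it to find the
# concentration giving a p% reduction from the fitted control response.
# Counts are invented; the paper fits GLMs to the actual endpoints.
import numpy as np

conc = np.array([0.0, 6.25, 12.5, 25.0, 50.0, 100.0])      # e.g. % effluent
young = np.array([28.0, 27.0, 25.0, 21.0, 15.0, 8.0])      # mean offspring

# Log-linear model: log(mu) = b0 + b1 * conc (least squares on log responses,
# a quick stand-in for a Poisson GLM with log link).
b1, b0 = np.polyfit(conc, np.log(young), 1)

def ec(p):
    """Concentration giving a p% reduction from the fitted control response."""
    # mu(c)/mu(0) = exp(b1 * c) = 1 - p/100  =>  c = log(1 - p/100) / b1
    return np.log(1.0 - p / 100.0) / b1

print("EC25 =", round(ec(25), 1), " EC50 =", round(ec(50), 1))
```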

  7. Ground penetrating radar and direct current resistivity evaluation of the desiccation test cap, Savannah River Site

    International Nuclear Information System (INIS)

    Wyatt, D.E.; Cumbest, R.J.

    1996-04-01

    The Savannah River Site (SRS) has a variety of waste units that may be temporarily or permanently stabilized by closure using an impermeable cover to prevent groundwater infiltration. The placement of an engineered kaolin clay layer over a waste unit is an accepted and economical technique for providing an impermeable cover, but the long-term stability and integrity of the clay in non-arid conditions is unknown. A simulated kaolin cap has been constructed at the SRS adjacent to the Burial Ground Complex. The cap is designed to evaluate the effects of desiccation on clay integrity; therefore half of the cap is covered with native soil to prevent drying, while the remainder of the cap is exposed. Measurements of the continuing impermeability of a clay cap are difficult because intrusive techniques may locally compromise the structure. Point measurements made to evaluate clay integrity, such as those from grid sampling or coring through a soil cover, may miss cracks, joints or fissures, and may not allow mapping of the lateral extent of elongate features. Because of these problems, a non-invasive technique is needed to map clay integrity below a soil or vegetation cover, one that is capable of moderate to rapid investigation speeds. Two non-intrusive geophysical techniques, direct current resistivity and ground penetrating radar (GPR), have been successful at the SRS in geologically mapping shallow subsurface clay layers. The applicability of each technique in detecting the clay layer in the desiccation test cap and associated anomalies was investigated.

  8. Pre-Flight Tests with Astronauts, Flight and Ground Hardware, to Assure On-Orbit Success

    Science.gov (United States)

    Haddad, Michael E.

    2010-01-01

    On-Orbit Constraints Testing (OOCT) refers to mating flight hardware together on the ground before it is mated on-orbit or on the Lunar surface. The concept seems simple, but operations like this can be difficult to perform on the ground when the flight hardware is designed to be mated in the zero-g/vacuum environment of space or the low-g/vacuum environment of the Lunar/Mars surface. In addition, some of the items are manufactured years apart, raising the question of how mating tasks can be performed on these components when one piece is already on-orbit or on the Lunar/Mars surface before its mating piece has even been built. Both the Intra-Vehicular Activity (IVA) and Extra-Vehicular Activity (EVA) OOCTs performed at Kennedy Space Center are presented in this paper. Details include how OOCTs should mimic on-orbit/Lunar/Mars surface operational scenarios; a series of photographs taken during OOCTs performed on International Space Station (ISS) flight elements; lessons learned as a result of the OOCTs; and possible applications to Moon and Mars surface operations planned for the Constellation Program.

  9. Small UAV Automatic Ground Collision Avoidance System Design Considerations and Flight Test Results

    Science.gov (United States)

    Sorokowski, Paul; Skoog, Mark; Burrows, Scott; Thomas, SaraKatie

    2015-01-01

    The National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center Small Unmanned Aerial Vehicle (SUAV) Automatic Ground Collision Avoidance System (Auto GCAS) project demonstrated several important collision avoidance technologies. First, the SUAV Auto GCAS design included capabilities to take advantage of terrain avoidance maneuvers flying turns to either side as well as straight over terrain. Second, the design also included innovative digital elevation model (DEM) scanning methods. The combination of multi-trajectory options and new scanning methods demonstrated the ability to reduce the nuisance potential of the SUAV while maintaining robust terrain avoidance. Third, the Auto GCAS algorithms were hosted on the processor inside a smartphone, providing a lightweight hardware configuration for use in either the ground control station or on board the test aircraft. Finally, compression of DEM data for the entire Earth and successful hosting of that data on the smartphone was demonstrated. The SUAV Auto GCAS project demonstrated that together these methods and technologies have the potential to dramatically reduce the number of controlled flight into terrain mishaps across a wide range of aviation platforms with similar capabilities including UAVs, general aviation aircraft, helicopters, and model aircraft.

  10. Prescribed differences in exercise intensity based on the TCAR test over sandy ground and grass.

    Directory of Open Access Journals (Sweden)

    Juliano Fernandes da Silva

    2010-01-01

    Full Text Available The intensity of training might be influenced by exercise mode and type of terrain. Thus, the objective of this study was (a) to compare the physiological indices determined in the TCAR test carried out on natural grass (NG) and sandy ground (SG), and (b) to analyze heart rate (HR) and blood lactate responses during constant exercise on SG and NG. Ten soccer players (15.11 ± 1.1 years, 168 ± 4.0 cm, 60 ± 4.0 kg) were submitted to the TCAR test to determine peak velocity (PV) and the intensity corresponding to 80.4% PV (V80.4) on NG and SG. The second evaluation consisted of two constant load tests (CLT) (80.4% PV) on NG and SG with a duration of 27 min. The paired Student t-test was used to compare the tests carried out on NG and SG. ANOVA (two-way), complemented by the Tukey test, was used to compare lactate concentrations [La] at 9, 18 and 27 min between the two types of terrain. A p value <0.05 was adopted. PV and V80.4 were significantly higher on grass (15.3±1.0 and 12.3±0.6 km/h) than on sand (14.3±1.0 and 11.5±0.4 km/h). Lactate concentration during the CLT [LaV80.4] was significantly higher on sand (4.1±0.9 mmol/L) than on grass (2.8±0.7 mmol/L). In the CLT, no significant difference in mean HR was observed between the two terrains, whereas there was a difference in [La]. In conclusion, the type of terrain interferes with indicators associated with aerobic power and capacity obtained by the TCAR test.

  11. Ground and surface water for drinking: a laboratory study on genotoxicity using plant tests

    Directory of Open Access Journals (Sweden)

    Donatella Feretti

    2012-02-01

    Full Text Available Surface waters are increasingly utilized for drinking water because groundwater sources are often polluted. Several monitoring studies have detected the presence of mutagenicity in drinking water, especially from surface sources, due to the reaction of natural organic matter with disinfectant. The study aimed to investigate the genotoxic potential of the products of reaction between humic substances, which are naturally present in surface water, and three disinfectants: chlorine dioxide, sodium hypochlorite and peracetic acid. Commercial humic acids dissolved in distilled water at different total organic carbon (TOC) concentrations were studied in order to simulate natural conditions of both ground water (TOC=2.5 mg/L) and surface water (TOC=7.5 mg/L). These solutions were treated with the biocides at a 1:1 molar ratio of C:disinfectant and tested for genotoxicity using the anaphase chromosomal aberration and micronucleus tests in Allium cepa, and the Vicia faba and Tradescantia micronucleus tests. The tests were carried out after different times and with different modes of exposure, and at 1:1 and 1:10 dilutions of disinfected and undisinfected humic acid solutions. A genotoxic effect was found for sodium hypochlorite in all plant tests, at both TOCs considered, while chlorine dioxide gave positive results only with the A. cepa tests. Some positive effects were also detected for PAA (A. cepa and Tradescantia). No relevant differences were found in samples with different TOC values. The significant increase in all genotoxicity end-points induced by all tested disinfectants indicates that a genotoxic potential is exerted even in the presence of organic substances at similar concentrations to those frequently present in drinking water.

  12. Effects of arousal on cognitive control: empirical tests of the conflict-modulated Hebbian-learning hypothesis.

    Science.gov (United States)

    Brown, Stephen B R E; van Steenbergen, Henk; Kedar, Tomer; Nieuwenhuis, Sander

    2014-01-01

    An increasing number of empirical phenomena that were previously interpreted as a result of cognitive control turn out to reflect (in part) simple associative-learning effects. A prime example is the proportion congruency effect, the finding that interference effects (such as the Stroop effect) decrease as the proportion of incongruent stimuli increases. While this was previously regarded as strong evidence for a global conflict monitoring-cognitive control loop, recent evidence has shown that the proportion congruency effect is largely item-specific and hence must be due to associative learning. The goal of our research was to test a recent hypothesis about the mechanism underlying such associative-learning effects, the conflict-modulated Hebbian-learning hypothesis, which proposes that the effect of conflict on associative learning is mediated by phasic arousal responses. In Experiment 1, we examined in detail the relationship between the item-specific proportion congruency effect and an autonomic measure of phasic arousal: task-evoked pupillary responses. In Experiment 2, we used a task-irrelevant phasic arousal manipulation and examined the effect on item-specific learning of incongruent stimulus-response associations. The results provide little evidence for the conflict-modulated Hebbian-learning hypothesis, which requires additional empirical support to remain tenable.

  13. Effects of arousal on cognitive control: Empirical tests of the conflict-modulated Hebbian-learning hypothesis

    Directory of Open Access Journals (Sweden)

    Stephen B.R.E. Brown

    2014-01-01

    Full Text Available An increasing number of empirical phenomena that were previously interpreted as a result of cognitive control turn out to reflect (in part) simple associative-learning effects. A prime example is the proportion congruency effect, the finding that interference effects (such as the Stroop effect) decrease as the proportion of incongruent stimuli increases. While this was previously regarded as strong evidence for a global conflict monitoring-cognitive control loop, recent evidence has shown that the proportion congruency effect is largely item-specific and hence must be due to associative learning. The goal of our research was to test a recent hypothesis about the mechanism underlying such associative-learning effects, the conflict-modulated Hebbian-learning hypothesis, which proposes that the effect of conflict on associative learning is mediated by phasic arousal responses. In Experiment 1, we examined in detail the relationship between the item-specific proportion congruency effect and an autonomic measure of phasic arousal: task-evoked pupillary responses. In Experiment 2, we used a task-irrelevant phasic arousal manipulation and examined the effect on item-specific learning of incongruent stimulus-response associations. The results provide little evidence for the conflict-modulated Hebbian-learning hypothesis, which requires additional empirical support to remain tenable.

  14. Empirical test of Capital Asset Pricing Model on Selected Banking Shares from Borsa Istanbul

    Directory of Open Access Journals (Sweden)

    Fuzuli Aliyev

    2018-03-01

    Full Text Available In this paper we tested the Capital Asset Pricing Model (CAPM hereafter) on selected banking stocks of Borsa Istanbul. Here we tried to explain how to price financial assets based on their risks in the case of the BIST-100 index. The CAPM is an important model in portfolio management theory used by economic agents for the selection of financial assets. We used monthly return data of 12 randomly selected banking stocks for the 2001–2010 period. To test the validity of the CAPM, we first derived the regression equation for the relationship between the risk-free interest rate and the risk premium using January 2001–December 2009 data. Then, we estimated January–December 2010 returns with the equation. Comparing the forecasted returns with the actual returns, we concluded that the CAPM is valid for the portfolio consisting of the 12 banks traded on the ISE, i.e., the model could predict the overall outcome of the portfolio of selected banking shares.
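
    To make the testing procedure above concrete, the following sketch estimates a CAPM beta by regressing a stock's excess returns on the market's excess returns over an estimation window and then uses the fitted equation to forecast returns for a hold-out period. The return series are synthetic placeholders, and the split into estimation and forecast windows mirrors the 2001–2009 / 2010 split described in the abstract only in spirit.

```python
# Sketch: simple CAPM time-series test (illustrative data, not the study's dataset).
import numpy as np

rng = np.random.default_rng(0)
rf = 0.005                                     # assumed constant monthly risk-free rate
r_mkt = rng.normal(0.01, 0.05, 120)            # synthetic market returns (120 months)
r_stock = rf + 1.2 * (r_mkt - rf) + rng.normal(0, 0.02, 120)  # synthetic stock returns

est, hold = slice(0, 108), slice(108, 120)     # estimation window vs. forecast window

# OLS fit of excess stock return on excess market return: r_i - rf = a + b*(r_m - rf)
x, y = r_mkt[est] - rf, r_stock[est] - rf
beta, alpha = np.polyfit(x, y, 1)

# Forecast hold-out returns from the fitted CAPM line and compare with actuals.
forecast = rf + alpha + beta * (r_mkt[hold] - rf)
mae = np.mean(np.abs(forecast - r_stock[hold]))
print(f"alpha={alpha:.4f}, beta={beta:.3f}, hold-out MAE={mae:.4f}")
```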

  15. More than friendship is required : an empirical test of cooperative firm strategies

    OpenAIRE

    Pesämaa, Ossi; Hair Jr, Joseph F

    2006-01-01

    Purpose - The purpose of this paper is to examine a proposed six-construct theoretical model of factors influencing successful cooperative relationships and strategy development. Design/methodology/approach - A theoretical model of strategy development and cooperative relationships was tested. Qualitative research among key experts identified 15 successful regional tourism networks. Two successful cooperative networks were selected based on annual revenues. A sample of 254 small and mediu...

  16. Shareholder Protection and Stock Market Development: An Empirical Test of the Legal Origins Hypothesis

    OpenAIRE

    John Armour; Simon Deakin; Prabirjit Sarkar; Mathias Siems; Ajit Singh

    2008-01-01

    We test the ‘law matters’ and ‘legal origin’ claims using a newly created panel dataset measuring legal change over time in a sample of developed and developing countries. Our dataset improves on previous ones by avoiding country-specific variables in favour of functional and generic descriptors, by taking into account a wider range of legal data, and by considering the effects of weighting variables in different ways, thereby ensuring greater consistency of coding....

  17. Complex biological testing of ground water quality in the area of sewage settler filtration fields of JSC 'Almaty Kanty'

    International Nuclear Information System (INIS)

    Vetrinskaya, N.I.; Goldobina, E.A.; Kosmukhambetov, A.R.; Kulikova, O.V.; Kozlova, N.V.; Ismailova, Zh.B.

    2001-01-01

    Results are given on the ecological quality of ground water from operating survey boreholes at the filtration fields of the JSC 'Almaty Kanty' industrial enterprise, assessed using different methods of biological testing. It was shown that various biological test objects reacted differently to the toxins present in the water. In short-period testing, the toxic effect was masked (stimulation) in several of the test objects. Long-period tests revealed that ground water from all the boreholes surveyed is not ecologically clean and, if migration occurs, could damage the ecosystems of adjacent water reservoirs and sources of drinking water. (author)

  18. Empirical usability testing in a component-based environment : improving test efficiency with component-specific usability measures

    NARCIS (Netherlands)

    Brinkman, W.P.; Haakma, R.; Bouwhuis, D.G.; Bastide, R.; Palanque, P.; Roth, J.

    2005-01-01

    This paper addresses the issue of usability testing in a component-based software engineering environment, specifically measuring the usability of different versions of a component in a more powerful manner than other, more holistic, usability methods. Three component-specific usability measures are

  19. RF System description for the ground test accelerator radio-frequency quadrupole

    International Nuclear Information System (INIS)

    Regan, A.H.; Brittain, D.; Rees, D.E.; Ziomek, D.

    1992-01-01

    This paper describes the RF system being used to provide RF power and to control the cavity field for the ground test accelerator (GTA) radio-frequency quadrupole (RFQ). The RF system consists of a low-level RF (LLRF) control system, an RF reference generation subsystem, and a tetrode as a high-power amplifier (HPA) that can deliver up to 300 kW of peak power to the RFQ cavity at a 2% duty factor. The LLRF control system implements in-phase and quadrature (I and Q) control to maintain the cavity field within tolerances of 0.5% in amplitude and 0.5 degrees in phase in the presence of beam-induced instabilities.

  20. Rf system description for the ground test accelerator radio-frequency quadrupole

    International Nuclear Information System (INIS)

    Regan, A.H.; Brittain, D.; Rees, D.E.; Ziomek, D.

    1992-01-01

    This paper describes the RF system being used to provide RF power and to control the cavity field used for the ground test accelerator (GTA) radio-frequency quadrupole (RFQ). The RF system consists of a low-level RF (LLRF) control system that uses a tetrode as a high-power amplifier (HPA) as part of its plant to deliver up to 300 kW of peak power to the RFQ at a 2% duty factor. The LLRF control system implements in-phase and quadrature (I and Q) control to maintain the cavity field within tolerances of 0.5% in amplitude and 0.5 degrees in phase in the presence of beam-induced instabilities. This paper describes the identified components and presents measured performance data. The user interface with the systems is described, and cavity field measurements are included.

  1. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-10-01

    This document is the first volume of a 3 volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of an introduction, summary/conclusion, site description and assessment, description of facility, and description of operation.

  2. Radiative Heating in MSL Entry: Comparison of Flight Heating Discrepancy to Ground Test and Predictive Models

    Science.gov (United States)

    Cruden, Brett A.; Brandis, Aaron M.; White, Todd R.; Mahzari, Milad; Bose, Deepak

    2014-01-01

    During the recent entry of the Mars Science Laboratory (MSL), the heat shield was equipped with thermocouple stacks to measure in-depth heating of the thermal protection system (TPS). When only convective heating was considered, the derived heat flux from gauges in the stagnation region was found to be underpredicted by as much as 17 W/sq cm, which is significant compared to the peak heating of 32 W/sq cm. In order to quantify the contribution of radiative heating phenomena to the discrepancy, ground tests and predictive simulations that replicated the MSL entry trajectory were performed. An analysis is carried through to assess the quality of the radiation model and the impact to stagnation line heating. The impact is shown to be significant, but does not fully explain the heating discrepancy.

  3. An empirical test of the Theory of Planned Behaviour applied to contraceptive use in rural Uganda.

    Science.gov (United States)

    Kiene, Susan M; Hopwood, Sarah; Lule, Haruna; Wanyenze, Rhoda K

    2014-12-01

    There is a high unmet need for contraceptives in developing countries such as Uganda, with high population growth, where efforts are needed to promote family planning and contraceptive use. Despite this high need, little research has investigated applications of health-behaviour-change theories to contraceptive use among this population. This study tested the Theory of Planned Behaviour's ability to predict contraceptive-use-related behaviours among post-partum women in rural Uganda. Results gave modest support to the theory's application and suggest an urgent need for improved theory-based interventions to promote contraceptive use in the populations of developing countries. © The Author(s) 2013.

  4. Market structure and the role of consumer information in the physician services industry: an empirical test.

    Science.gov (United States)

    Wong, H S

    1996-04-01

    This paper applies Panzar and Rosse's (1987) econometric test of market structure to examine two long-debated issues: What is the market structure for physician services? Do more physicians in a market area raise the search cost of obtaining consumer information and increase prices (Satterthwaite, 1979, 1985)? For primary care and general and family practice physicians, the monopolistically competitive model prevailed over the competing hypotheses--monopoly, perfect competition, and monopolistic competition characterized by consumer informational confusion. Although less conclusive, there is some evidence to support the monopolistically competitive model for surgeons and the consumer informational confusion model for internal medicine physicians.

  5. E85 and fuel efficiency: An empirical analysis of 2007 EPA test data

    International Nuclear Information System (INIS)

    Roberts, Matthew C.

    2008-01-01

    It is well known that ethanol has less energy per unit volume than gasoline. Differences in engine design and fuel characteristics affect the efficiency with which the chemical energy in gasoline and ethanol is converted into mechanical energy, so that the change in fuel economy may not be a linear function of energy content. This study analyzes the fuel economy tests performed by the US Environmental Protection Agency (EPA) on 2007 model year E85-compliant vehicles and finds that the difference in average fuel economy is not statistically different from the differential in energy content
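
    A minimal way to reproduce the kind of comparison described above is to compute, for each flex-fuel vehicle, the ratio of its E85 fuel economy to its gasoline fuel economy and test whether that ratio differs from the ratio of the fuels' energy contents. The sketch below does this with a one-sample t-test; the MPG values and the assumed energy-content ratio (about 0.72 for E85 versus gasoline) are illustrative stand-ins, not the EPA test data analyzed in the study.

```python
# Sketch: does the E85 fuel-economy penalty track the energy-content differential?
# The paired MPG figures below are made-up placeholders for EPA-style test results.
import numpy as np
from scipy import stats

gasoline_mpg = np.array([21.0, 18.5, 23.2, 17.8, 20.4, 19.6])
e85_mpg      = np.array([15.6, 13.9, 16.5, 12.7, 14.8, 14.3])

energy_ratio = 0.72                    # assumed E85/gasoline energy content per gallon
mpg_ratio = e85_mpg / gasoline_mpg     # observed fuel-economy ratio per vehicle

# H0: mean MPG ratio equals the energy-content ratio (i.e., efficiency scales linearly).
t_stat, p_value = stats.ttest_1samp(mpg_ratio, popmean=energy_ratio)
print(f"mean MPG ratio = {mpg_ratio.mean():.3f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```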

  6. ACCESS, Absolute Color Calibration Experiment for Standard Stars: Integration, Test, and Ground Performance

    Science.gov (United States)

    Kaiser, Mary Elizabeth; Morris, Matthew; Aldoroty, Lauren; Kurucz, Robert; McCandliss, Stephan; Rauscher, Bernard; Kimble, Randy; Kruk, Jeffrey; Wright, Edward L.; Feldman, Paul; Riess, Adam; Gardner, Jonathon; Bohlin, Ralph; Deustua, Susana; Dixon, Van; Sahnow, David J.; Perlmutter, Saul

    2018-01-01

    Establishing improved spectrophotometric standards is important for a broad range of missions and is relevant to many astrophysical problems. Systematic errors associated with astrophysical data used to probe fundamental astrophysical questions, such as SNeIa observations used to constrain dark energy theories, now exceed the statistical errors associated with merged databases of these measurements. ACCESS, “Absolute Color Calibration Experiment for Standard Stars”, is a series of rocket-borne sub-orbital missions and ground-based experiments designed to enable improvements in the precision of the astrophysical flux scale through the transfer of absolute laboratory detector standards from the National Institute of Standards and Technology (NIST) to a network of stellar standards with a calibration accuracy of 1% and a spectral resolving power of 500 across the 0.35-1.7 μm bandpass. To achieve this goal, ACCESS (1) observes HST/Calspec stars (2) above the atmosphere to eliminate telluric spectral contaminants (e.g. OH) (3) using a single optical path and (HgCdTe) detector (4) that is calibrated to NIST laboratory standards and (5) monitored on the ground and in-flight using an on-board calibration monitor. The observations are (6) cross-checked and extended through the generation of stellar atmosphere models for the targets. The ACCESS telescope and spectrograph have been designed, fabricated, and integrated. Subsystems have been tested. Performance results for subsystems, operations testing, and the integrated spectrograph will be presented. NASA sounding rocket grant NNX17AC83G supports this work.

  7. Online social networks for crowdsourced multimedia-involved behavioral testing: An empirical study

    Directory of Open Access Journals (Sweden)

    Jun-Ho eChoi

    2016-01-01

    Full Text Available Online social networks have emerged as effective crowdsourcing media to recruit participants in recent days. However, issues regarding how to effectively exploit them have not been adequately addressed yet. In this paper, we investigate the reliability and effectiveness of multimedia-involved behavioral testing via social network-based crowdsourcing, especially focused on Facebook as a medium to recruit participants. We conduct a crowdsourcing-based experiment for a music recommendation problem. It is shown that different advertisement methods yield different degrees of efficiency and there exist significant differences in behavioral patterns across different genders and different age groups. In addition, we perform a comparison of our experiment with other multimedia-involved crowdsourcing experiments built on Amazon Mechanical Turk (MTurk), which suggests that crowdsourcing-based experiments using social networks for recruitment can achieve comparable efficiency. Based on the analysis results, advantages and disadvantages of social network-based crowdsourcing and suggestions for successful experiments are also discussed. We conclude that social networks have the potential to support multimedia-involved behavioral tests to gather in-depth data even for long-term periods.

  8. Discrimination of tenants with a visual impairment on the housing market: Empirical evidence from correspondence tests.

    Science.gov (United States)

    Verhaeghe, Pieter-Paul; Van der Bracht, Koen; Van de Putte, Bart

    2016-04-01

    According to the social model of disability, physical 'impairments' become disabilities through exclusion in social relations. An obvious form of social exclusion might be discrimination, for instance on the rental housing market. Although discrimination has detrimental health effects, very few studies have examined discrimination of people with a visual impairment. We aim to study (1) the extent of discrimination of individuals with a visual impairment on the rental housing market and (2) differences in rates of discrimination between landowners and real estate agents. We conducted correspondence tests among 268 properties on the Belgian rental housing market. Using matched tests, we compared reactions by realtors and landowners to tenants with and tenants without a visual impairment. The results show that individuals with a visual impairment are substantially discriminated against in the rental housing market: at least one in three lessors discriminate against individuals with a visual impairment. We further discern differences in the propensity toward discrimination according to the type of lessor. Private landlords are at least twice as likely to discriminate against tenants with a visual impairment as real estate agents. At the same time, realtors still discriminate against one in five tenants with a visual impairment. This study shows the substantial discrimination against people with a visual impairment. Given the important consequences discrimination might have for physical and mental health, further research into this topic is needed. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Online Social Networks for Crowdsourced Multimedia-Involved Behavioral Testing: An Empirical Study

    Science.gov (United States)

    Choi, Jun-Ho; Lee, Jong-Seok

    2016-01-01

    Online social networks have emerged as effective crowdsourcing media to recruit participants in recent days. However, issues regarding how to effectively exploit them have not been adequately addressed yet. In this paper, we investigate the reliability and effectiveness of multimedia-involved behavioral testing via social network-based crowdsourcing, especially focused on Facebook as a medium to recruit participants. We conduct a crowdsourcing-based experiment for a music recommendation problem. It is shown that different advertisement methods yield different degrees of efficiency and there exist significant differences in behavioral patterns across different genders and different age groups. In addition, we perform a comparison of our experiment with other multimedia-involved crowdsourcing experiments built on Amazon Mechanical Turk (MTurk), which suggests that crowdsourcing-based experiments using social networks for recruitment can achieve comparable efficiency. Based on the analysis results, advantages and disadvantages of social network-based crowdsourcing and suggestions for successful experiments are also discussed. We conclude that social networks have the potential to support multimedia-involved behavioral tests to gather in-depth data even for long-term periods. PMID:26793137

  10. Online Social Networks for Crowdsourced Multimedia-Involved Behavioral Testing: An Empirical Study.

    Science.gov (United States)

    Choi, Jun-Ho; Lee, Jong-Seok

    2015-01-01

    Online social networks have emerged as effective crowdsourcing media to recruit participants in recent days. However, issues regarding how to effectively exploit them have not been adequately addressed yet. In this paper, we investigate the reliability and effectiveness of multimedia-involved behavioral testing via social network-based crowdsourcing, especially focused on Facebook as a medium to recruit participants. We conduct a crowdsourcing-based experiment for a music recommendation problem. It is shown that different advertisement methods yield different degrees of efficiency and there exist significant differences in behavioral patterns across different genders and different age groups. In addition, we perform a comparison of our experiment with other multimedia-involved crowdsourcing experiments built on Amazon Mechanical Turk (MTurk), which suggests that crowdsourcing-based experiments using social networks for recruitment can achieve comparable efficiency. Based on the analysis results, advantages and disadvantages of social network-based crowdsourcing and suggestions for successful experiments are also discussed. We conclude that social networks have the potential to support multimedia-involved behavioral tests to gather in-depth data even for long-term periods.

  11. Clinical trial: a randomized trial of early endoscopy, Helicobacter pylori testing and empirical therapy for the management of dyspepsia in primary care.

    Science.gov (United States)

    Duggan, A E; Elliott, C A; Miller, P; Hawkey, C J; Logan, R F A

    2009-01-01

    Early endoscopy, Helicobacter pylori eradication and empirical acid suppression are commonly used dyspepsia management strategies in primary care but have not been directly compared in a single trial. To compare endoscopy, H. pylori test and refer, H. pylori test and treat, and empirical acid suppression for dyspepsia in primary care. Patients presenting to their general practitioner with dyspepsia were randomized to endoscopy, H. pylori 'test and treat', H. pylori test and endoscope positives, or empirical therapy, with symptoms, patient satisfaction, healthcare costs and cost-effectiveness at 12 months as the outcomes. At 2 months, the proportion of patients reporting no or minimal dyspeptic symptoms ranged from 74% for those having early endoscopy to 55% for those on empirical therapy (P = 0.009), but at 1 year, there was little difference among the four strategies. Early endoscopy was associated with fewer subsequent consultations for dyspepsia (P = 0.003). 'Test and treat' resulted in fewer endoscopies overall and was most cost-effective over a range of cost assumptions. Empirical therapy resulted in the lowest initial costs, but the highest rate of subsequent endoscopy. Gastro-oesophageal cancers were found in four patients randomized to the H. pylori testing strategies. While early endoscopy offered some advantages, 'test and treat' was the most cost-effective strategy. In older patients, early endoscopy may be an appropriate strategy in view of the greater risk of malignant disease. © 2008 The Authors. Journal compilation © 2008 Blackwell Publishing Ltd.

  12. Calculated concentrations of any radionuclide deposited on the ground by release from underground nuclear detonations, tests of nuclear rockets, and tests of nuclear ramjet engines

    International Nuclear Information System (INIS)

    Hicks, H.G.

    1981-11-01

    This report presents calculated gamma radiation exposure rates and ground deposition of related radionuclides resulting from three types of event that deposited detectable radioactivity outside the Nevada Test Site complex, namely, underground nuclear detonations, tests of nuclear rocket engines and tests of nuclear ramjet engines

  13. Shelf-Life Prediction of Extra Virgin Olive Oils Using an Empirical Model Based on Standard Quality Tests

    Directory of Open Access Journals (Sweden)

    Claudia Guillaume

    2016-01-01

    Full Text Available Extra virgin olive oil shelf-life could be defined as the length of time under normal storage conditions within which no off-flavours or defects are developed and quality parameters such as peroxide value and specific absorbance are retained within accepted limits for this commercial category. Prediction of shelf-life is a desirable goal in the food industry. Even though shelf-life is one of the most important quality markers for extra virgin olive oil, it is not recognised as a legal parameter in most regulations and standards around the world. The proposed empirical formula to be evaluated in the present study is based on common quality tests with known and predictable result changes over time, influenced by different aspects of extra virgin olive oil with a meaningful influence over its shelf-life. The basic quality tests considered in the formula are Rancimat® or induction time (IND); 1,2-diacylglycerols (DAGs); pyropheophytin a (PPP); and free fatty acids (FFA). This paper reports research into the actual shelf-life of commercially packaged extra virgin olive oils versus the predicted shelf-life of those oils determined by analysing the expected deterioration curves for the three basic quality tests detailed above. Based on the proposed model, shelf-life is predicted by choosing the lowest predicted shelf-life of any of those three tests.

  14. Is Asian American Parenting Controlling and Harsh? Empirical Testing of Relationships between Korean American and Western Parenting Measures.

    Science.gov (United States)

    Choi, Yoonsun; Kim, You Seung; Kim, Su Yeong; Park, Irene Kim

    2013-03-01

    Asian American parenting is often portrayed as highly controlling and even harsh. This study empirically tested the associations between a set of recently developed Korean ga-jung-kyo-yuk measures and several commonly used Western parenting measures to accurately describe Asian American family processes, specifically those of Korean Americans. The results show a much more nuanced and detailed picture of Korean American parenting as a blend of Western authoritative and authoritarian styles with positive and, although very limited, negative parenting. Certain aspects of ga-jung-kyo-yuk are positively associated with authoritative style or authoritarian style, or even with both of them simultaneously. They were positively associated with positive parenting (warmth, acceptance, and communication) but not with harsh parenting (rejection and negative discipline). Exceptions to this general pattern were Korean traditional disciplinary practices and the later age of separate sleeping of children. The study discusses implications of these findings and provides suggestions for future research.

  15. Is Asian American Parenting Controlling and Harsh? Empirical Testing of Relationships between Korean American and Western Parenting Measures

    Science.gov (United States)

    Choi, Yoonsun; Kim, You Seung; Kim, Su Yeong; Park, Irene Kim

    2013-01-01

    Asian American parenting is often portrayed as highly controlling and even harsh. This study empirically tested the associations between a set of recently developed Korean ga-jung-kyo-yuk measures and several commonly used Western parenting measures to accurately describe Asian American family processes, specifically those of Korean Americans. The results show a much more nuanced and detailed picture of Korean American parenting as a blend of Western authoritative and authoritarian styles with positive and—although very limited—negative parenting. Certain aspects of ga-jung-kyo-yuk are positively associated with authoritative style or authoritarian style, or even with both of them simultaneously. They were positively associated with positive parenting (warmth, acceptance, and communication) but not with harsh parenting (rejection and negative discipline). Exceptions to this general pattern were Korean traditional disciplinary practices and the later age of separate sleeping of children. The study discusses implications of these findings and provides suggestions for future research. PMID:23977415

  16. Shake table test of soil-pile groups-bridge structure interaction in liquefiable ground

    Science.gov (United States)

    Tang, Liang; Ling, Xianzhang; Xu, Pengju; Gao, Xia; Wang, Dongsheng

    2010-03-01

    This paper describes a shake table test study on the seismic response of low-cap pile groups and a bridge structure in liquefiable ground. The soil profile, contained in a large-scale laminar shear box, consisted of a horizontally saturated sand layer overlaid with a silty clay layer, with the simulated low-cap pile groups embedded. The container was excited in three El Centro earthquake events of different levels. Test results indicate that excessive pore pressure (EPP) during slight shaking only slightly accumulated, and the accumulation mainly occurred during strong shaking. The EPP was gradually enhanced as the amplitude and duration of the input acceleration increased. The acceleration response of the sand was remarkably influenced by soil liquefaction. As soil liquefaction occurred, the peak sand displacement gradually lagged behind the input acceleration; meanwhile, the sand displacement exhibited an increasing effect on the bending moment of the pile, and acceleration responses of the pile and the sand layer gradually changed from decreasing to increasing in the vertical direction from the bottom to the top. A jump variation of the bending moment on the pile was observed near the soil interface in all three input earthquake events. It is thought that the shake table tests could provide the groundwork for further seismic performance studies of low-cap pile groups used in bridges located on liquefiable ground.

  17. Feasibility study of a nonequilibrium MHD accelerator concept for hypersonic propulsion ground testing

    International Nuclear Information System (INIS)

    Lee, Ying-Ming; Simmons, G.A.; Nelson, G.L.

    1995-01-01

    A National Aeronautics and Space Administration (NASA) funded research study to evaluate the feasibility of using magnetohydrodynamic (MHD) body force accelerators to produce true air simulation for hypersonic propulsion ground testing is discussed in this paper. Testing over the airbreathing portion of a transatmospheric vehicle (TAV) hypersonic flight regime will require high quality air simulation for actual flight conditions behind a bow shock wave (forebody, pre-inlet region) for flight velocities up to Mach 16 and perhaps beyond. Material limits and chemical dissociation at high temperature limit the simulated flight Mach numbers in conventional facilities to less than Mach 12 for continuous and semi-continuous testing and less than Mach 7 for applications requiring true air chemistry. By adding kinetic energy directly to the flow, MHD accelerators avoid the high temperatures and pressures required in the reservoir region of conventional expansion facilities, allowing MHD to produce true flight conditions in flight regimes impossible with conventional facilities. The present study is intended to resolve some of the critical technical issues related to the operation of MHD at high pressure. Funding has been provided only for the first phase of a three to four year feasibility study that would culminate in the demonstration of MHD acceleration under conditions required to produce true flight conditions behind a bow shock wave to flight Mach numbers of 16 or greater. MHD critical issues and a program plan to resolve these are discussed

  18. Vent System Analysis for the Cryogenic Propellant Storage Transfer Ground Test Article

    Science.gov (United States)

    Hedayat, A.

    2013-01-01

    To test and validate key capabilities and technologies required for future exploration elements such as large cryogenic propulsion stages and propellant depots, NASA is leading the efforts to develop and design the Cryogenic Propellant Storage and Transfer (CPST) Cryogenic Fluid Management (CFM) payload. The primary objectives of CPST payload are to demonstrate: 1) in-space storage of cryogenic propellants for long duration applications; and 2) in-space transfer of cryogenic propellants. The Ground Test Article (GTA) is a technology development version of the CPST payload. The GTA consists of flight-sized and flight-like storage and transfer tanks, liquid acquisition devices, transfer, and pressurization systems with all of the CPST functionality. The GTA is designed to perform integrated passive and active thermal storage and transfer performance testing with liquid hydrogen (LH2) in a vacuum environment. The GTA storage tank is designed to store liquid hydrogen and the transfer tank is designed to be 5% of the storage tank volume. The LH2 transfer subsystem is designed to transfer propellant from one tank to the other utilizing pressure or a pump. The LH2 vent subsystem is designed to prevent over-pressurization of the storage and transfer tanks. An in-house general-purpose computer program was utilized to model and simulate the vent subsystem operation. The modeling, analysis, and the results will be presented in the final paper.

  19. Protestant Clergy and the Culture Wars: An Empirical Test of Hunter's Thesis.

    Science.gov (United States)

    Uecker, Jeremy E; Lucke, Glenn

    2011-12-01

    This study instead focuses on culture wars among religious elites, namely clergy, and tests three aspects of the culture wars thesis: (1) whether culture wars exist at all among religious elites, (2) whether clergy attitudes are polarized on these issues, and (3) whether religious authority or religious affiliation is more salient in creating culture wars cleavages. Using data from a large random sample of Protestant clergy, we find a substantial amount of engagement in culture wars by all types of Protestant clergy. The amount of polarization is more attributable to views of religious authority (i.e., biblical inerrancy) than to religious tradition. Moreover, polarization among clergy is somewhat more evident on culture wars issues than on other social and political issues. These findings are generally supportive of the culture wars thesis and should help return examinations of culture wars back to where they were originally theorized to be waged: among elites.

  20. Box-Cox Test: the theoretical justification and US-China empirical study

    Directory of Open Access Journals (Sweden)

    Tam Bang Vu

    2011-01-01

    Full Text Available In econometrics, the derivation of a theoretical model sometimes leads to two econometric models, each of which can be considered justified on the basis of its own approximation approach. Hence, the decision of choosing between the two hinges on applied econometric tools. In this paper, the authors develop a theoretical consumer maximization model to measure the flow of durables' expenditures, in which depreciation is added to the former classical econometric model. The proposed model was formulated in both linear and logarithmic forms, and Box-Cox tests were used to choose the more appropriate of the two. The proposed model was then applied to historical data from the U.S. and China for a comparative study, and the results are discussed.
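
    As a sketch of how a Box-Cox test can adjudicate between a linear and a logarithmic specification, the code below estimates the Box-Cox transformation parameter lambda for a synthetic expenditure series by maximum likelihood and compares the log-likelihoods at lambda = 1 (linear) and lambda = 0 (logarithmic) with likelihood-ratio statistics. The data are fabricated for illustration; the study's actual model and data are not reproduced here.

```python
# Sketch: Box-Cox test of functional form (linear: lambda = 1, log: lambda = 0).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
y = np.exp(rng.normal(3.0, 0.5, 200))          # synthetic, strictly positive data

_, lam_hat = stats.boxcox(y)                   # MLE of the transformation parameter
llf_hat = stats.boxcox_llf(lam_hat, y)         # log-likelihood at the MLE

# Likelihood-ratio statistics against the two candidate functional forms.
for lam0, label in [(1.0, "linear"), (0.0, "logarithmic")]:
    lr = 2 * (llf_hat - stats.boxcox_llf(lam0, y))
    p = stats.chi2.sf(lr, df=1)
    print(f"{label:11s}: LR = {lr:6.2f}, p = {p:.4f}")

print(f"estimated lambda = {lam_hat:.3f}")
```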

  1. A test and re-estimation of Taylor's empirical capacity-reserve relationship

    Science.gov (United States)

    Long, K.R.

    2009-01-01

    In 1977, Taylor proposed a constant elasticity model relating capacity choice in mines to reserves. A test of this model using a very large (n = 1,195) dataset confirms its validity but obtains significantly different estimated values for the model coefficients. Capacity is somewhat inelastic with respect to reserves, with an elasticity of 0.65 estimated for open-pit plus block-cave underground mines and 0.56 for all other underground mines. These new estimates should be useful for capacity determinations in scoping studies and as a starting point for feasibility studies. The results are robust over a wide range of deposit types, deposit sizes, and time, consistent with physical constraints on mine capacity that are largely independent of technology. © 2009 International Association for Mathematical Geology.
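
    Taylor's relationship is a constant-elasticity (power-law) model, capacity = k * reserves^e, so the elasticity e can be recovered as the slope of an ordinary least-squares fit in log-log space. The sketch below does exactly that on made-up tonnage figures; the coefficient values it prints are illustrative and should not be confused with the paper's estimates of 0.65 and 0.56.

```python
# Sketch: re-estimating a Taylor-style capacity-reserve elasticity by log-log OLS.
# Reserve and capacity tonnages below are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(7)
reserves = 10 ** rng.uniform(5, 9, 300)                          # tonnes of ore in reserve
capacity = 0.12 * reserves ** 0.6 * rng.lognormal(0, 0.3, 300)   # tonnes/day capacity

# log(capacity) = log(k) + e * log(reserves): the slope e is the elasticity.
e, log_k = np.polyfit(np.log(reserves), np.log(capacity), 1)
print(f"estimated elasticity e = {e:.2f}, scale k = {np.exp(log_k):.3f}")
```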

  2. Neighborhood social capital and adult health: an empirical test of a Bourdieu-based model.

    Science.gov (United States)

    Carpiano, Richard M

    2007-09-01

    Drawing upon Bourdieu's [1986. The forms of capital. In: Richardson, J.G. (Ed.), Handbook of Theory and Research for the Sociology of Education. Greenwood, New York, pp. 241-258.] social capital theory, I test a conceptual model of neighborhood conditions and social capital - considering relationships between neighborhood social capital forms (social support, social leverage, informal social control, and neighborhood organization participation) and adult health behaviors (smoking, binge drinking) and perceived health, as well as interactions between neighborhood social capital and individuals' access to that social capital. Analyzing Los Angeles Family and Neighborhood Survey data linked with tract level census data, results suggest that specific social capital forms were directly associated with both positive and negative health outcomes. Additionally, residents' neighborhood attachment moderated relationships between various social capital forms and health. Future studies should consider social capital resources and the role of differential access to such resources for promoting or compromising health.

  3. Endogenous Versus Exogenous Shocks in Complex Networks: An Empirical Test Using Book Sale Rankings

    Science.gov (United States)

    Sornette, D.; Deschâtres, F.; Gilbert, T.; Ageon, Y.

    2004-11-01

    We study the precursory and recovery signatures accompanying shocks in complex networks, which we test on a unique database of the Amazon.com ranking of book sales. We find clear distinguishing signatures classifying two types of sales peaks. Exogenous peaks occur abruptly and are followed by a power law relaxation, while endogenous peaks occur after a progressively accelerating power law growth followed by an approximately symmetrical power law relaxation which is slower than for exogenous peaks. These results are rationalized quantitatively by a simple model of epidemic propagation of interactions with long memory within a network of acquaintances. The observed relaxation of sales implies that the sales dynamics is dominated by cascades rather than by the direct effects of news or advertisements, indicating that the social network is close to critical.
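
    The exogenous/endogenous classification above rests on measuring the exponent of the power-law relaxation of sales after a peak, S(t) ~ (t - t_peak)^(-theta). The snippet below shows one hedged way to estimate such an exponent by a linear fit in log-log coordinates; the sales series is synthetic and the choice of fitting window is an assumption of this illustration.

```python
# Sketch: estimating the post-peak power-law relaxation exponent of a sales series.
import numpy as np

rng = np.random.default_rng(3)
days = np.arange(1, 61)                                            # days after the peak
sales = 1000.0 * days ** -0.7 * rng.lognormal(0, 0.1, days.size)   # synthetic relaxation

# A power law S(t) ~ t^-theta is a straight line in log-log space with slope -theta.
slope, intercept = np.polyfit(np.log(days), np.log(sales), 1)
theta = -slope
print(f"estimated relaxation exponent theta = {theta:.2f}")
# In the paper's framework, a slower decay following a progressively accelerating
# build-up points to an endogenous peak; an abrupt peak with a faster decay points
# to an exogenous shock.
```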

  4. Testing the performance of empirical remote sensing algorithms in the Baltic Sea waters with modelled and in situ reflectance data

    Directory of Open Access Journals (Sweden)

    Martin Ligi

    2017-01-01

    Full Text Available Remote sensing studies published up to now show that the performance of empirical (band-ratio type) algorithms in different parts of the Baltic Sea is highly variable. The best performing algorithms are different in the different regions of the Baltic Sea. Moreover, there is indication that the algorithms have to be seasonal, as the optical properties of the phytoplankton assemblages dominating in spring and summer are different. We modelled 15,600 reflectance spectra using the HydroLight radiative transfer model to test 58 previously published empirical algorithms. 7200 of the spectra were modelled using specific inherent optical properties (SIOPs) of the open parts of the Baltic Sea in summer and 8400 with SIOPs of the spring season. The concentration ranges of chlorophyll-a, coloured dissolved organic matter (CDOM) and suspended matter used in the model simulations were based on the actually measured values available in the literature. For each optically active constituent we added one concentration below the actually measured minimum and one concentration above the actually measured maximum value in order to test the performance of the algorithms over a wider range. 77 in situ reflectance spectra from rocky (Sweden) and sandy (Estonia, Latvia) coastal areas were used to evaluate the performance of the algorithms also in coastal waters. Seasonal differences in the algorithm performance were confirmed, but we found also algorithms that can be used in both spring and summer conditions. The algorithms that use bands available on OLCI, launched in February 2016, are highlighted as this sensor will be available for Baltic Sea monitoring for coming decades.
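
    Empirical band-ratio algorithms of the kind tested above typically relate the logarithm of a blue-green reflectance ratio to the logarithm of chlorophyll-a concentration through a low-order polynomial. The function below shows the generic form; the coefficients and the band choice are placeholder assumptions for illustration, not any of the 58 published algorithms evaluated in the study.

```python
# Sketch: generic two-band empirical (band-ratio) chlorophyll-a algorithm.
# Coefficients a0, a1 and the band choice are illustrative assumptions only.
import numpy as np

def chl_band_ratio(r_blue, r_green, a0=0.3, a1=-2.5):
    """Estimate chlorophyll-a (mg/m^3) from a blue/green reflectance ratio.

    Implements the common log-log form: log10(chl) = a0 + a1 * log10(Rblue/Rgreen).
    """
    ratio = np.log10(np.asarray(r_blue) / np.asarray(r_green))
    return 10 ** (a0 + a1 * ratio)

# Example: hypothetical remote-sensing reflectances at roughly 490 nm and 560 nm.
print(chl_band_ratio(r_blue=[0.004, 0.006], r_green=[0.005, 0.005]))
```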

  5. Dynamic Time Warping Distance Method for Similarity Test of Multipoint Ground Motion Field

    Directory of Open Access Journals (Sweden)

    Yingmin Li

    2010-01-01

    Full Text Available The reasonableness of artificial multi-point ground motions and the identification of abnormal records in seismic array observations are two important issues in the application and analysis of multi-point ground motion fields. Based on the dynamic time warping (DTW) distance method, this paper discusses the application of similarity measurement to the analysis of simulated multi-point ground motions and actual seismic array records. Analysis results show that the DTW distance method not only can quantitatively reflect the similarity of a simulated ground motion field, but also offers advantages in clustering analysis and singularity recognition of actual multi-point ground motion fields.
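
    For reference, a minimal dynamic programming implementation of the DTW distance used as the similarity measure above is sketched below; it returns the cumulative alignment cost between two ground-motion records sampled as 1-D arrays. The absolute-difference local cost and the absence of any warping-window constraint are simplifying assumptions of this illustration.

```python
# Sketch: dynamic time warping (DTW) distance between two 1-D ground-motion records.
import numpy as np

def dtw_distance(a, b):
    """Cumulative DTW alignment cost with an absolute-difference local cost."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, m]

# Example: two synthetic accelerograms, one a slightly time-shifted copy of the other.
t = np.linspace(0, 10, 500)
rec1 = np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)
rec2 = np.sin(2 * np.pi * 1.0 * (t - 0.2)) * np.exp(-0.3 * (t - 0.2))
print(f"DTW distance = {dtw_distance(rec1, rec2):.3f}")
```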

  6. Innovative Approaches to Development and Ground Testing of Advanced Bimodal Space Power and Propulsion Systems

    International Nuclear Information System (INIS)

    Hill, T.; Noble, C.; Martinell, J.; Borowski, S.

    2000-01-01

    The last major development effort for nuclear power and propulsion systems ended in 1993. Currently, there is not an initiative at either the National Aeronautical and Space Administration (NASA) or the U.S. Department of Energy (DOE) that requires the development of new nuclear power and propulsion systems. Studies continue to show nuclear technology as a strong technical candidate to lead the way toward human exploration of adjacent planets or provide power for deep space missions, particularly a 15,000 lbf bimodal nuclear system with 115 kW power capability. The development of nuclear technology for space applications would require technology development in some areas and a major flight qualification program. The last major ground test facility considered for nuclear propulsion qualification was the U.S. Air Force/DOE Space Nuclear Thermal Propulsion Project. Seven years have passed since that effort, and the questions remain the same, how to qualify nuclear power and propulsion systems for future space flight. It can be reasonably assumed that much of the nuclear testing required to qualify a nuclear system for space application will be performed at DOE facilities as demonstrated by the Nuclear Rocket Engine Reactor Experiment (NERVA) and Space Nuclear Thermal Propulsion (SNTP) programs. The nuclear infrastructure to support testing in this country is aging and getting smaller, though facilities still exist to support many of the technology development needs. By renewing efforts, an innovative approach to qualifying these systems through the use of existing facilities either in the U.S. (DOE's Advance Test Reactor, High Flux Irradiation Facility and the Contained Test Facility) or overseas should be possible

  7. Innovation Approaches to Development and Ground Testing of Advanced Bimodal Space Power and Propulsion Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hill, T.; Noble, C.; Martinell, J. (INEEL); Borowski, S. (NASA Glenn Research Center)

    2000-07-14

    The last major development effort for nuclear power and propulsion systems ended in 1993. Currently, there is not an initiative at either the National Aeronautical and Space Administration (NASA) or the U.S. Department of Energy (DOE) that requires the development of new nuclear power and propulsion systems. Studies continue to show nuclear technology as a strong technical candidate to lead the way toward human exploration of adjacent planets or provide power for deep space missions, particularly a 15,000 lbf bimodal nuclear system with 115 kW power capability. The development of nuclear technology for space applications would require technology development in some areas and a major flight qualification program. The last major ground test facility considered for nuclear propulsion qualification was the U.S. Air Force/DOE Space Nuclear Thermal Propulsion Project. Seven years have passed since that effort, and the questions remain the same, how to qualify nuclear power and propulsion systems for future space flight. It can be reasonably assumed that much of the nuclear testing required to qualify a nuclear system for space application will be performed at DOE facilities as demonstrated by the Nuclear Rocket Engine Reactor Experiment (NERVA) and Space Nuclear Thermal Propulsion (SNTP) programs. The nuclear infrastructure to support testing in this country is aging and getting smaller, though facilities still exist to support many of the technology development needs. By renewing efforts, an innovative approach to qualifying these systems through the use of existing facilities either in the U.S. (DOE's Advance Test Reactor, High Flux Irradiation Facility and the Contained Test Facility) or overseas should be possible.

  8. Innovative Approaches to Development and Ground Testing of Advanced Bimodal Space Power and Propulsion Systems

    Energy Technology Data Exchange (ETDEWEB)

    Hill, Thomas Johnathan; Noble, Cheryl Ann; Noble, C.; Martinell, John Stephen; Borowski, S.

    2000-07-01

    The last major development effort for nuclear power and propulsion systems ended in 1993. Currently, there is not an initiative at either the National Aeronautical and Space Administration (NASA) or the U.S. Department of Energy (DOE) that requires the development of new nuclear power and propulsion systems. Studies continue to show nuclear technology as a strong technical candidate to lead the way toward human exploration of adjacent planets or provide power for deep space missions, particularly a 15,000 lbf bimodal nuclear system with 115 kW power capability. The development of nuclear technology for space applications would require technology development in some areas and a major flight qualification program. The last major ground test facility considered for nuclear propulsion qualification was the U.S. Air Force/DOE Space Nuclear Thermal Propulsion Project. Seven years have passed since that effort, and the questions remain the same, how to qualify nuclear power and propulsion systems for future space flight. It can be reasonably assumed that much of the nuclear testing required to qualify a nuclear system for space application will be performed at DOE facilities as demonstrated by the Nuclear Rocket Engine Reactor Experiment (NERVA) and Space Nuclear Thermal Propulsion (SNTP) programs. The nuclear infrastructure to support testing in this country is aging and getting smaller, though facilities still exist to support many of the technology development needs. By renewing efforts, an innovative approach to qualifying these systems through the use of existing facilities either in the U.S. (DOE's Advance Test Reactor, High Flux Irradiation Facility and the Contained Test Facility) or overseas should be possible.

  9. Accurate relative location estimates for the North Korean nuclear tests using empirical slowness corrections

    Science.gov (United States)

    Gibbons, S. J.; Pabian, F.; Näsholm, S. P.; Kværna, T.; Mykkeltveit, S.

    2017-01-01

    Declared North Korean nuclear tests in 2006, 2009, 2013 and 2016 were observed seismically at regional and teleseismic distances. Waveform similarity allows the events to be located relatively with far greater accuracy than the absolute locations can be determined from seismic data alone. There is now significant redundancy in the data given the large number of regional and teleseismic stations that have recorded multiple events, and relative location estimates can be confirmed independently by performing calculations on many mutually exclusive sets of measurements. Using a 1-D global velocity model, the distances between the events estimated using teleseismic P phases are found to be approximately 25 per cent shorter than the distances between events estimated using regional Pn phases. The 2009, 2013 and 2016 events all take place within 1 km of each other and the discrepancy between the regional and teleseismic relative location estimates is no more than about 150 m. The discrepancy is much more significant when estimating the location of the more distant 2006 event relative to the later explosions with regional and teleseismic estimates varying by many hundreds of metres. The relative location of the 2006 event is challenging given the smaller number of observing stations, the lower signal-to-noise ratio and significant waveform dissimilarity at some regional stations. The 2006 event is however highly significant in constraining the absolute locations in the terrain at the Punggye-ri test-site in relation to observed surface infrastructure. For each seismic arrival used to estimate the relative locations, we define a slowness scaling factor which multiplies the gradient of seismic traveltime versus distance, evaluated at the source, relative to the applied 1-D velocity model. A procedure for estimating correction terms which reduce the double-difference time residual vector norms is presented together with a discussion of the associated uncertainty. The modified
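
    The core of the relative location procedure described above is that a small offset dx between two events maps into differential arrival times through the horizontal slowness (the traveltime gradient) toward each station, with an empirical scaling factor correcting the model slowness. The least-squares sketch below illustrates that first-order relation on synthetic numbers; the station azimuths, model slownesses, scaling factors, and the "true" offset are all invented for the example and are not the values used for the Punggye-ri events.

```python
# Sketch: first-order relative relocation of one event with respect to another
# from differential arrival times, with empirical slowness scaling factors.
import numpy as np

rng = np.random.default_rng(5)
az = np.deg2rad(np.array([10.0, 75.0, 140.0, 200.0, 260.0, 310.0]))  # station azimuths
slow_model = np.array([13.6, 13.6, 8.0, 8.0, 4.5, 4.5])    # model slowness (s/deg), made up
scale = np.array([1.00, 1.00, 0.95, 0.95, 0.75, 0.75])     # empirical slowness corrections

# Horizontal slowness vectors pointing from the source toward each station.
s_vec = (scale * slow_model)[:, None] * np.column_stack([np.sin(az), np.cos(az)])

dx_true = np.array([0.004, -0.007])                          # "true" offset in degrees
dt = s_vec @ dx_true + rng.normal(0, 0.01, len(az))          # differential times + noise

# Least-squares solve dt ~= S * dx for the relative offset dx.
dx_est, *_ = np.linalg.lstsq(s_vec, dt, rcond=None)
print(f"estimated offset (deg): {dx_est}")
```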

  10. Patients' Acceptance of Smartphone Health Technology for Chronic Disease Management: A Theoretical Model and Empirical Test.

    Science.gov (United States)

    Dou, Kaili; Yu, Ping; Deng, Ning; Liu, Fang; Guan, YingPing; Li, Zhenye; Ji, Yumeng; Du, Ningkai; Lu, Xudong; Duan, Huilong

    2017-12-06

    Chronic disease patients often face multiple challenges from difficult comorbidities. Smartphone health technology can be used to help them manage their conditions only if they accept and use the technology. The aim of this study was to develop and test a theoretical model to predict and explain the factors influencing patients' acceptance of smartphone health technology for chronic disease management. Multiple theories and factors that may influence patients' acceptance of smartphone health technology have been reviewed. A hybrid theoretical model was built based on the technology acceptance model, dual-factor model, health belief model, and the factors identified from interviews that might influence patients' acceptance of smartphone health technology for chronic disease management. Data were collected from patient questionnaire surveys and computer log records about 157 hypertensive patients' actual use of a smartphone health app. The partial least square method was used to test the theoretical model. The model accounted for .412 of the variance in patients' intention to adopt the smartphone health technology. Intention to use accounted for .111 of the variance in actual use and had a significant weak relationship with the latter. Perceived ease of use was affected by patients' smartphone usage experience, relationship with doctor, and self-efficacy. Although without a significant effect on intention to use, perceived ease of use had a significant positive influence on perceived usefulness. Relationship with doctor and perceived health threat had significant positive effects on perceived usefulness, countering the negative influence of resistance to change. Perceived usefulness, perceived health threat, and resistance to change significantly predicted patients' intentions to use the technology. Age and gender had no significant influence on patients' acceptance of smartphone technology. The study also confirmed the positive relationship between intention to use

  11. Patients’ Acceptance of Smartphone Health Technology for Chronic Disease Management: A Theoretical Model and Empirical Test

    Science.gov (United States)

    Dou, Kaili; Yu, Ping; Liu, Fang; Guan, YingPing; Li, Zhenye; Ji, Yumeng; Du, Ningkai; Lu, Xudong; Duan, Huilong

    2017-01-01

    Background Chronic disease patients often face multiple challenges from difficult comorbidities. Smartphone health technology can be used to help them manage their conditions only if they accept and use the technology. Objective The aim of this study was to develop and test a theoretical model to predict and explain the factors influencing patients’ acceptance of smartphone health technology for chronic disease management. Methods Multiple theories and factors that may influence patients’ acceptance of smartphone health technology have been reviewed. A hybrid theoretical model was built based on the technology acceptance model, dual-factor model, health belief model, and the factors identified from interviews that might influence patients’ acceptance of smartphone health technology for chronic disease management. Data were collected from patient questionnaire surveys and computer log records about 157 hypertensive patients’ actual use of a smartphone health app. The partial least square method was used to test the theoretical model. Results The model accounted for .412 of the variance in patients’ intention to adopt the smartphone health technology. Intention to use accounted for .111 of the variance in actual use and had a significant weak relationship with the latter. Perceived ease of use was affected by patients’ smartphone usage experience, relationship with doctor, and self-efficacy. Although without a significant effect on intention to use, perceived ease of use had a significant positive influence on perceived usefulness. Relationship with doctor and perceived health threat had significant positive effects on perceived usefulness, countering the negative influence of resistance to change. Perceived usefulness, perceived health threat, and resistance to change significantly predicted patients’ intentions to use the technology. Age and gender had no significant influence on patients’ acceptance of smartphone technology. The study also

  12. An empirical test of Lanchester's square law: mortality during battles of the fire ant Solenopsis invicta

    Science.gov (United States)

    Plowes, Nicola J.R; Adams, Eldridge S

    2005-01-01

    Lanchester's models of attrition describe casualty rates during battles between groups as functions of the numbers of individuals and their fighting abilities. Originally developed to describe human warfare, Lanchester's square law has been hypothesized to apply broadly to social animals as well, with important consequences for their aggressive behaviour and social structure. According to the square law, the fighting ability of a group is proportional to the square of the number of individuals, but rises only linearly with fighting ability of individuals within the group. By analyzing mortality rates of fire ants (Solenopsis invicta) fighting in different numerical ratios, we provide the first quantitative test of Lanchester's model for a non-human animal. Casualty rates of fire ants were not consistent with the square law; instead, group fighting ability was an approximately linear function of group size. This implies that the relative numbers of casualties incurred by two fighting groups are not strongly affected by relative group sizes and that battles do not disproportionately favour group size over individual prowess. PMID:16096093
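
    A minimal numerical sketch of the two attrition laws being contrasted (group sizes and fighting abilities below are hypothetical, not the fire-ant data):

      # Compare casualty dynamics under Lanchester's square law (aimed fire:
      # dx/dt = -b*y, dy/dt = -a*x) and the linear law (dx/dt = -b*x*y, dy/dt = -a*x*y).
      def fight(x, y, a, b, square=True, dt=0.001, steps=20000):
          for _ in range(steps):
              if x <= 0 or y <= 0:
                  break
              dx = -b * y if square else -b * x * y
              dy = -a * x if square else -a * x * y
              x, y = x + dx * dt, y + dy * dt
          return max(x, 0.0), max(y, 0.0)

      # Under the square law a 2:1 numerical advantage wins even against individually
      # stronger opponents; under the linear law the same advantage is not enough.
      print(fight(200, 100, a=1.0, b=3.0, square=True))
      print(fight(200, 100, a=1.0, b=3.0, square=False))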

  13. Reward rate optimization in two-alternative decision making: empirical tests of theoretical predictions.

    Science.gov (United States)

    Simen, Patrick; Contreras, David; Buck, Cara; Hu, Peter; Holmes, Philip; Cohen, Jonathan D

    2009-12-01

    The drift-diffusion model (DDM) implements an optimal decision procedure for stationary, 2-alternative forced-choice tasks. The height of a decision threshold applied to accumulating information on each trial determines a speed-accuracy tradeoff (SAT) for the DDM, thereby accounting for a ubiquitous feature of human performance in speeded response tasks. However, little is known about how participants settle on particular tradeoffs. One possibility is that they select SATs that maximize a subjective rate of reward earned for performance. For the DDM, there exist unique, reward-rate-maximizing values for its threshold and starting point parameters in free-response tasks that reward correct responses (R. Bogacz, E. Brown, J. Moehlis, P. Holmes, & J. D. Cohen, 2006). These optimal values vary as a function of response-stimulus interval, prior stimulus probability, and relative reward magnitude for correct responses. We tested the resulting quantitative predictions regarding response time, accuracy, and response bias under these task manipulations and found that grouped data conformed well to the predictions of an optimally parameterized DDM.
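
    The reward-rate optimization being tested can be sketched with the closed-form error rate and mean decision time commonly attributed to Bogacz et al. (2006) for an unbiased DDM, assuming no extra penalty delay after errors; the parameter values below are illustrative only.

      import numpy as np

      A, c = 0.2, 1.0          # drift rate and noise (hypothetical)
      T0, RSI = 0.3, 1.0       # non-decision time and response-stimulus interval (s)

      z = np.linspace(0.01, 3.0, 300)                 # candidate decision thresholds
      ER = 1.0 / (1.0 + np.exp(2.0 * A * z / c**2))   # error rate
      DT = (z / A) * np.tanh(A * z / c**2)            # mean decision time
      RR = (1.0 - ER) / (DT + T0 + RSI)               # correct responses per second

      print("reward-rate-maximizing threshold ~", z[np.argmax(RR)])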

  14. Gender, general theory of crime and computer crime: an empirical test.

    Science.gov (United States)

    Moon, Byongook; McCluskey, John D; McCluskey, Cynthia P; Lee, Sangwon

    2013-04-01

    Regarding the gender gap in computer crime, studies consistently indicate that boys are more likely than girls to engage in various types of computer crime; however, few studies have examined the extent to which traditional criminology theories account for gender differences in computer crime and the applicability of these theories in explaining computer crime across gender. Using a panel of 2,751 Korean youths, the current study tests the applicability of the general theory of crime in explaining the gender gap in computer crime and assesses the theory's utility in explaining computer crime across gender. Analyses show that self-control theory performs well in predicting illegal use of others' resident registration number (RRN) online for both boys and girls, as predicted by the theory. However, low self-control, a dominant criminogenic factor in the theory, fails to mediate the relationship between gender and computer crime and is inadequate in explaining illegal downloading of software in both boy and girl models. Theoretical implication of the findings and the directions for future research are discussed.

  15. Testing Capital Asset Pricing Model: Empirical Evidences from Indian Equity Market

    Directory of Open Access Journals (Sweden)

    Kapil CHOUDHARY

    2010-11-01

    The present study examines the Capital Asset Pricing Model (CAPM) for the Indian stock market using monthly stock returns from 278 companies of the BSE 500 Index listed on the Bombay Stock Exchange for the period January 1996 to December 2009. The findings of this study do not substantiate the theory's basic result that higher risk (beta) is associated with higher levels of return. The model does, however, explain excess returns and thus lends support to the linear structure of the CAPM equation. The theory's prediction for the intercept is that it should equal zero, and the slope should equal the excess returns on the market portfolio. The results of the study negate the above hypotheses and offer evidence against the CAPM. The tests conducted to examine the nonlinearity of the relationship between return and betas bolster the hypothesis that the expected return-beta relationship is linear. Additionally, this study investigates whether the CAPM adequately captures all important determinants of returns, including the residual variance of stocks. The results show that residual risk has no effect on the expected returns of portfolios.
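
    A schematic two-pass CAPM test of the kind described can be sketched as follows; the data are synthetic, not the BSE 500 returns, and the code is only meant to show where the intercept and slope hypotheses enter.

      import numpy as np

      rng = np.random.default_rng(0)
      T, N = 168, 50                              # months, stocks (hypothetical sizes)
      mkt = rng.normal(0.01, 0.05, T)             # market excess returns
      true_beta = rng.uniform(0.5, 1.5, N)
      stocks = true_beta[None, :] * mkt[:, None] + rng.normal(0, 0.06, (T, N))

      # Pass 1: time-series regressions give each stock's beta.
      X = np.column_stack([np.ones(T), mkt])
      betas = np.linalg.lstsq(X, stocks, rcond=None)[0][1]

      # Pass 2: cross-sectional regression of mean excess returns on beta.
      # Under the CAPM the intercept should be ~0 and the slope ~the market excess return.
      Xc = np.column_stack([np.ones(N), betas])
      gamma0, gamma1 = np.linalg.lstsq(Xc, stocks.mean(axis=0), rcond=None)[0]
      print(f"intercept {gamma0:.4f} (CAPM: 0), slope {gamma1:.4f} (CAPM: {mkt.mean():.4f})")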

  16. Pre-installation empirical testing of room shielding for high dose rate remote afterloaders

    International Nuclear Information System (INIS)

    Klein, E.E.; Grigsby, P.W.; Williamson, J.F.; Meigooni, A.S.

    1993-01-01

    PURPOSE: Many facilities are acquiring high dose rate remote afterloading units. It is economical to place these units in existing shielded teletherapy rooms. Scatter-radiation barriers marginally protect uncontrolled areas from a high dose rate source, especially in a room that houses a non-dynamic Cobalt-60 unit. In addition, the exact thickness and material composition of the barriers are unknown, and therefore a calculation technique may give misleading results. Also, it would be impossible to evaluate an entire wall barrier by taking isolated core samples in order to assist in the calculations. A quick and inexpensive measurement of dose equivalent using a rented high activity 192Ir source evaluates the barriers and locates shielding deficiencies. METHODS AND MATERIALS: We performed transmission calculations for primary and scattered radiation based on National Council on Radiation Protection and Measurements Reports 49 and 51, respectively. We then rented a high activity 21.7 Ci (8.03 x 10^11 Bq) Ir-192 source to assess our existing teletherapy room shielding for adequacy and voids. This source was placed at the proposed location for clinical high dose rate treatment and measurements were performed. RESULTS: No deficiencies were found in controlled areas surrounding the room, but large differences were found between the calculated and measured values. Our survey located a region in the uncontrolled area above the room requiring augmented shielding, which was not predicted by the calculations. A canopy shield was designed to potentially augment the shielding in the ceiling direction. CONCLUSION: Pre-installation testing by measurement is an invaluable method for locating shielding deficiencies and avoiding unnecessary enhancement of shielding, particularly when there is a lack of information about the inherent shielding.
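
    For orientation only, a point-source estimate of the dose-equivalent rate behind a barrier can be sketched as below; the gamma-ray constant and concrete half-value layer are assumed values, and such a calculation is no substitute for the NCRP 49/51 procedures or the measurements described above.

      GAMMA = 0.11         # mSv*m^2 / (GBq*h), assumed Ir-192 gamma-ray constant
      HVL_CONCRETE = 4.5   # cm, assumed broad-beam half-value layer for Ir-192 in concrete

      def dose_rate(activity_gbq, distance_m, barrier_cm):
          """Dose-equivalent rate (mSv/h) behind 'barrier_cm' of concrete at 'distance_m'."""
          transmission = 0.5 ** (barrier_cm / HVL_CONCRETE)
          return GAMMA * activity_gbq / distance_m**2 * transmission

      # 21.7 Ci ~ 803 GBq source, 5 m away, behind 30 cm of concrete (illustrative geometry only).
      print(f"{dose_rate(803, 5.0, 30.0) * 1000:.1f} uSv/h")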

  17. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA, HANFORD, WASHINGTON

    International Nuclear Information System (INIS)

    Petersen, S.W.

    2010-01-01

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected with the densest coverage occurring adjacent to the Columbia River Corridor. Time domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m (328 ft) and 200 m (656 ft)) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground based geophysical methods, and to link results to the underlying geology and/or hydrogeology. Specific goals of this project are as follows: (1) Test ground based geophysical techniques for the efficacy in delineating underlying geology; (2) Use ground

  18. TESTING GROUND BASED GEOPHYSICAL TECHNIQUES TO REFINE ELECTROMAGNETIC SURVEYS NORTH OF THE 300 AREA HANFORD WASHINGTON

    Energy Technology Data Exchange (ETDEWEB)

    PETERSEN SW

    2010-12-02

    Airborne electromagnetic (AEM) surveys were flown during fiscal year (FY) 2008 within the 600 Area in an attempt to characterize the underlying subsurface and to aid in the closure and remediation design study goals for the 200-PO-1 Groundwater Operable Unit (OU). The rationale for using the AEM surveys was that airborne surveys can cover large areas rapidly at relatively low costs with minimal cultural impact, and observed geo-electrical anomalies could be correlated with important subsurface geologic and hydrogeologic features. Initial interpretation of the AEM surveys indicated a tenuous correlation with the underlying geology, from which several anomalous zones likely associated with channels/erosional features incised into the Ringold units were identified near the River Corridor. Preliminary modeling resulted in a slightly improved correlation but revealed that more information was required to constrain the modeling (SGW-39674, Airborne Electromagnetic Survey Report, 200-PO-1 Groundwater Operable Unit, 600 Area, Hanford Site). Both time- and frequency-domain AEM surveys were collected with the densest coverage occurring adjacent to the Columbia River Corridor. Time domain surveys targeted deeper subsurface features (e.g., top-of-basalt) and were acquired using the HeliGEOTEM® system along north-south flight lines with a nominal 400 m (1,312 ft) spacing. The frequency domain RESOLVE system acquired electromagnetic (EM) data along tighter spaced (100 m [328 ft] and 200 m [656 ft]) north-south profiles in the eastern fifth of the 200-PO-1 Groundwater OU (immediately adjacent to the River Corridor). The overall goal of this study is to provide further quantification of the AEM survey results, using ground based geophysical methods, and to link results to the underlying geology and/or hydrogeology. Specific goals of this project are as follows: (1) Test ground based geophysical techniques for the efficacy in delineating underlying geology; (2) Use ground

  19. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    Science.gov (United States)

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

    Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches from a previous report and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate impacting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause for decrease in power or inflated false positive rate although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis compared to LRT. We proposed a three-step covariate modeling approach for population PK analysis to utilize the advantages of EBEs while overcoming their shortcomings, which allows not only markedly reducing the run time for population PK analysis, but also providing more accurate covariate tests.
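
    The two covariate tests being compared can be sketched schematically as follows; the objective function values and empirical Bayes estimates are invented for illustration and do not reproduce the simulation study.

      import numpy as np
      from scipy import stats

      # Likelihood ratio test: the drop in the objective function (OFV = -2*log-likelihood)
      # between models with and without the covariate is referred to a chi-square with 1 df.
      ofv_base, ofv_cov = 2514.6, 2508.9          # hypothetical OFVs
      p_lrt = stats.chi2.sf(ofv_base - ofv_cov, df=1)

      # EBE-based test: regress individual (empirical Bayes) clearance estimates on the covariate.
      rng = np.random.default_rng(1)
      wt = rng.normal(70, 12, 100)                                      # covariate, e.g. body weight
      log_cl_ebe = 1.6 + 0.004 * (wt - 70) + rng.normal(0, 0.1, 100)    # simulated EBEs of log(CL)
      slope, _, _, p_ebe, _ = stats.linregress(wt - 70, log_cl_ebe)

      print(f"LRT p = {p_lrt:.3f}, EBE regression p = {p_ebe:.3g}")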

  20. Development and Testing of Techniques for In-Ground Stabilization, Size Reduction and Safe Removal of Radioactive Wastes Stored in Large Containments in Burial Grounds - 13591

    International Nuclear Information System (INIS)

    Halliwell, Stephen

    2013-01-01

    Radioactive waste materials, including Transuranic (TRU) wastes from laboratories, have been stored below ground in large containments at a number of sites in the US DOE Complex, and at nuclear sites in Europe. These containments are generally referred to as caissons or shafts. The containments are in a range of sizes and depths below grade. The caissons at the DOE's Hanford site are cylindrical, of the order of 2,500 mm in diameter and 3,050 mm in height, and are buried about 6,000 mm below grade. One type of caisson is made out of corrugated pipe, whereas others are made of concrete with standard re-bar. However, the larger shafts in the UK are of the order of 4,600 mm in diameter, 53,500 mm deep, and 12,000 mm below grade. This paper describes the R and D work and testing activities performed to date to evaluate the concept of in-ground size reduction and stabilization of the contents of large containments similar to those at Hanford. In practice, the height of the Test Facility provided for a test cell that was approximately 22' deep. That prevented a 'full scale mockup' test in the sense that the Hanford Caisson configuration would be an identical replication. Therefore, the project was conducted in two phases. The first phase tested a simulated Caisson with surrogate contents, and part of a Chute section, and the second phase tested a full chute section. These tests were performed at the VJ Technologies Test Facility located in East Haven, CT, as part of the Proof of Design Concept program for studying the feasibility of an in-situ grout/grind/mix/stabilize technology for the remediation of four caissons at the 618-11 Burial Ground at the US Department of Energy Hanford Site. The test site was constructed such that multiple testing areas were provided for the evaluation of various tools, equipment and procedures under conditions that simulated the Hanford site, with representative soils and layout dimensions. (authors)

  1. Development and Testing of Techniques for In-Ground Stabilization, Size Reduction and Safe Removal of Radioactive Wastes Stored in Large Containments in Burial Grounds - 13591

    Energy Technology Data Exchange (ETDEWEB)

    Halliwell, Stephen [VJ Technologies Inc, 89 Carlough Road, Bohemia, NY (United States)

    2013-07-01

    Radioactive waste materials, including Transuranic (TRU) wastes from laboratories, have been stored below ground in large containments at a number of sites in the US DOE Complex, and at nuclear sites in Europe. These containments are generally referred to as caissons or shafts. The containments are in a range of sizes and depths below grade. The caissons at the DOE's Hanford site are cylindrical, of the order of 2,500 mm in diameter and 3,050 mm in height, and are buried about 6,000 mm below grade. One type of caisson is made out of corrugated pipe, whereas others are made of concrete with standard re-bar. However, the larger shafts in the UK are of the order of 4,600 mm in diameter, 53,500 mm deep, and 12,000 mm below grade. This paper describes the R and D work and testing activities performed to date to evaluate the concept of in-ground size reduction and stabilization of the contents of large containments similar to those at Hanford. In practice, the height of the Test Facility provided for a test cell that was approximately 22' deep. That prevented a 'full scale mockup' test in the sense that the Hanford Caisson configuration would be an identical replication. Therefore, the project was conducted in two phases. The first phase tested a simulated Caisson with surrogate contents, and part of a Chute section, and the second phase tested a full chute section. These tests were performed at the VJ Technologies Test Facility located in East Haven, CT, as part of the Proof of Design Concept program for studying the feasibility of an in-situ grout/grind/mix/stabilize technology for the remediation of four caissons at the 618-11 Burial Ground at the US Department of Energy Hanford Site. The test site was constructed such that multiple testing areas were provided for the evaluation of various tools, equipment and procedures under conditions that simulated the Hanford site, with representative soils and layout dimensions. (authors)

  2. About condition of soil ground at locations of the former Azgir nuclear test site

    International Nuclear Information System (INIS)

    Akhmetov, E.Z.; Adymov, Zh.I.; Ermatov, A.S.

    2003-01-01

    Full text: The condition of the soil after underground nuclear explosions at the test site locations is considered. The region lies in the northern desert zone and is characterized by a prevalence of greyish-brown soils under a sharply continental climate, with salt present in the soil-forming complex, which includes tertiary clays, loess-like loam, loamy sands and sands. Such soils contain only a small quantity of humus. When investigating soil characteristics and the ability of soil particles to form conglomerates with different properties, it is necessary to know both the elemental and the phase composition, which largely determine such physical and mechanical macro-characteristics as density, stickiness, air and water permeability, solubility, chemical resistance, granulometric composition and others. The phase composition of soil samples can be determined to a sufficient extent by X-ray diffractometry using ordinary X-ray experimental facilities. The phase composition of the soil includes gypsum, quartz, calcium, potash feldspar, hematite, kaolin, peach and mica in different quantities. Data on the elemental composition of soil samples taken from the technological locations of the test site were obtained by X-ray fluorescence analysis. The granulometric composition of the soil has been investigated by dry and wet sieving in order to determine the radionuclide distribution across different fractions of soil particles. With dry sieving of soil samples, fine fractions tend to stick together and the stuck-together particles adhere to larger ones, so fine fractions cannot be separated completely by dry sieving. The use of a water jet in wet sieving overcomes the shortcomings of the dry method and, by separating the granulometric fractions sufficiently effectively, gives more precise results of investigations of granulometric

  3. PhoneSat: Ground Testing of a Phone-Based Prototype Bus

    Science.gov (United States)

    Felix, Carmen; Howard, Benjamin; Reyes, Matthew; Snarskiy, Fedor; Hickman, Ryan; Boshuizen, Christopher; Marshall, William

    2010-01-01

    Most of the key capabilities required of a satellite bus are housed in today's smartphones. PhoneSat refers to an initiative to build a ground-based prototype vehicle that could provide all the basic functionality of a satellite, including attitude control, using a smartphone as its central hardware. All components used were low-cost commercial off-the-shelf (COTS) parts. In summer 2009, an initial prototype was created using the LEGO Mindstorms toolkit, demonstrating simple attitude control. Here we report on a follow-up initiative to design, build and test a vehicle based on Google's Nexus One smartphone. The report includes results from initial thermal-vacuum chamber tests and low-altitude sub-orbital rocket flights, which show that, at least for short durations, the Nexus One phone is able to withstand key aspects of the space environment without failure. We compare the sensor data from the phone's accelerometers and magnetometers with that of an external microelectronic inertial measurement unit.

  4. Development of a low background test facility for the SPICA-SAFARI on-ground calibration

    Science.gov (United States)

    Dieleman, P.; Laauwen, W. M.; Ferrari, L.; Ferlet, M.; Vandenbussche, B.; Meinsma, L.; Huisman, R.

    2012-09-01

    SAFARI is a far-infrared camera to be launched in 2021 onboard the SPICA satellite. SAFARI offers imaging spectroscopy and imaging photometry in the wavelength range of 34 to 210 μm with a detector NEP of 2 x 10^-19 W/√Hz. A cryogenic test facility for SAFARI on-ground calibration and characterization is being developed. The main design driver is the required low background of a few attowatts per pixel. This prohibits optical access to room temperature, and hence all test equipment needs to be inside the cryostat at 4.5 K. The instrument parameters to be verified are interfaces with the SPICA satellite, sensitivity, alignment, image quality, spectral response, frequency calibration, and point spread function. The instrument sensitivity is calibrated by a calibration source providing a spatially homogeneous signal at the attowatt level. This low light intensity is achieved by geometrical dilution of a 150 K source into an integrating sphere. The beam quality and point spread function are measured by a pinhole/mask plate wheel, back-illuminated by a second integrating sphere. This sphere is fed by a stable wide-band source, providing spectral lines via a cryogenic etalon.

  5. Species trees from consensus single nucleotide polymorphism (SNP) data: Testing phylogenetic approaches with simulated and empirical data.

    Science.gov (United States)

    Schmidt-Lebuhn, Alexander N; Aitken, Nicola C; Chuah, Aaron

    2017-11-01

    Datasets of hundreds or thousands of SNPs (Single Nucleotide Polymorphisms) from multiple individuals per species are increasingly used to study population structure, species delimitation and shallow phylogenetics. The principal software tool to infer species or population trees from SNP data is currently the BEAST template SNAPP which uses a Bayesian coalescent analysis. However, it is computationally extremely demanding and tolerates only small amounts of missing data. We used simulated and empirical SNPs from plants (Australian Craspedia, Asteraceae, and Pelargonium, Geraniaceae) to compare species trees produced (1) by SNAPP, (2) using SVD quartets, and (3) using Bayesian and parsimony analysis with several different approaches to summarising data from multiple samples into one set of traits per species. Our aims were to explore the impact of tree topology and missing data on the results, and to test which data summarising and analyses approaches would best approximate the results obtained from SNAPP for empirical data. SVD quartets retrieved the correct topology from simulated data, as did SNAPP except in the case of a very unbalanced phylogeny. Both methods failed to retrieve the correct topology when large amounts of data were missing. Bayesian analysis of species level summary data scoring the two alleles of each SNP as independent characters and parsimony analysis of data scoring each SNP as one character produced trees with branch length distributions closest to the true trees on which SNPs were simulated. For empirical data, Bayesian inference and Dollo parsimony analysis of data scored allele-wise produced phylogenies most congruent with the results of SNAPP. In the case of study groups divergent enough for missing data to be phylogenetically informative (because of additional mutations preventing amplification of genomic fragments or bioinformatic establishment of homology), scoring of SNP data as a presence/absence matrix irrespective of allele

  6. [Drug susceptibility test guided therapy and novel empirical quadruple therapy for Helicobacter pylori infection: a network Meta-analysis].

    Science.gov (United States)

    Gou, Q Y; Yu, R B; Shi, R H

    2017-05-10

    Objective: To compare the efficacy and the risk of adverse effects of drug susceptibility test guided therapy and novel empirical quadruple therapy for Helicobacter (H.) pylori infection. Methods: Literature retrieval was conducted using major databases. Related papers published up to June 2015 were considered eligible if they were randomized controlled trials comparing different pharmacological regimens for H. pylori infection; they were used in a network meta-analysis and a single-rate meta-analysis to evaluate the relative and absolute rates of H. pylori eradication and the risk of adverse effects. The Jadad score was used to evaluate methodological quality. A funnel plot was constructed to evaluate the risk of publication bias, and Begg's rank correlation test or Egger's regression intercept test was used to assess funnel plot asymmetry. Results: Twenty randomized controlled trials covering the treatment of 6,753 initially treated patients with H. pylori infection were included. Drug susceptibility test guided therapy was significantly superior to concomitant therapy, hybrid therapy, sequential therapy and bismuth quadruple therapy. The culture-based therapy had the highest likelihood of improving clinical efficacy, with the lowest risk of adverse effects. Concomitant therapy had the highest probability of causing adverse effects despite its effectiveness. Hybrid therapy and bismuth quadruple therapy were associated with a lower risk of adverse effects and higher effectiveness. Conclusion: Drug susceptibility test guided therapy showed superiority to the other 4 interventions for H. pylori eradication mentioned above. Hybrid therapy and bismuth quadruple therapy might be applied in settings where the culture-based strategy is not available.

  7. Summary of ground motion prediction results for Nevada Test Site underground nuclear explosions related to the Yucca Mountain project

    International Nuclear Information System (INIS)

    Walck, M.C.

    1996-10-01

    This report summarizes available data on ground motions from underground nuclear explosions recorded on and near the Nevada Test Site, with emphasis on the ground motions recorded at stations on Yucca Mountain, the site of a potential high-level radioactive waste repository. Sandia National Laboratories, through the Weapons Test Seismic Investigations project, collected and analyzed ground motion data from NTS explosions over a 14-year period, from 1977 through 1990. By combining these data with available data from earlier, larger explosions, prediction equations for several ground motion parameters have been developed for the Test Site area for underground nuclear explosion sources. Also presented are available analyses of the relationship between surface and downhole motions and spectra and relevant crustal velocity structure information for Yucca Mountain derived from the explosion data. The data and associated analyses demonstrate that ground motions at Yucca Mountain from nuclear tests have been at levels lower than would be expected from moderate to large earthquakes in the region; thus nuclear explosions, while located relatively close, would not control seismic design criteria for the potential repository
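
    Ground motion prediction equations of the kind summarized here typically take a log-linear form in source size and distance; the sketch below uses placeholder coefficients for illustration, not the Sandia/NTS regression values.

      import math

      def predicted_pga(magnitude, distance_km, c0=-3.5, c1=1.0, c2=1.6, c3=10.0):
          """Peak ground acceleration (g) from a schematic attenuation relation:
          ln(PGA) = c0 + c1*M - c2*ln(R + c3), with purely illustrative coefficients."""
          return math.exp(c0 + c1 * magnitude - c2 * math.log(distance_km + c3))

      for r in (5, 20, 50):
          print(f"M 5.5 at {r:3d} km -> {predicted_pga(5.5, r):.3f} g")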

  8. Summary of ground motion prediction results for Nevada Test Site underground nuclear explosions related to the Yucca Mountain project

    Energy Technology Data Exchange (ETDEWEB)

    Walck, M.C.

    1996-10-01

    This report summarizes available data on ground motions from underground nuclear explosions recorded on and near the Nevada Test Site, with emphasis on the ground motions recorded at stations on Yucca Mountain, the site of a potential high-level radioactive waste repository. Sandia National Laboratories, through the Weapons Test Seismic Investigations project, collected and analyzed ground motion data from NTS explosions over a 14-year period, from 1977 through 1990. By combining these data with available data from earlier, larger explosions, prediction equations for several ground motion parameters have been developed for the Test Site area for underground nuclear explosion sources. Also presented are available analyses of the relationship between surface and downhole motions and spectra and relevant crustal velocity structure information for Yucca Mountain derived from the explosion data. The data and associated analyses demonstrate that ground motions at Yucca Mountain from nuclear tests have been at levels lower than would be expected from moderate to large earthquakes in the region; thus nuclear explosions, while located relatively close, would not control seismic design criteria for the potential repository.

  9. Peculiarities and opportunities of restoration of vegetation of experimental ground 'Experimental field' of Semipalatinsk Test Site

    International Nuclear Information System (INIS)

    Plisak, R.P.; Plisak, S. V.

    2003-01-01

    Full text: Geobotanical research at the 'Experimental Field' testing ground of the Semipalatinsk Test Site was conducted in 1994-2000. At this territory, 26 ground and 87 air nuclear tests were conducted in 1949-1962. For the deluvial-proluvial plain it was found that the high level of radioactive contamination of soils at the epicentres of the nuclear explosions is the limiting factor for vegetation rehabilitation. At a PED of γ-irradiation of 14,000-16,000 μR/h, vegetation restoration has not yet begun. Only single individuals of Artemisia frigida appear at a PED of γ-irradiation of 10,000-13,000 μR/h. Sparse plant aggregations formed by annual-biennial weed species appear at a PED of γ-irradiation of 3,600-8,000 μR/h. Natural rehabilitation of vegetation proceeds more intensively at a PED of γ-irradiation of 60-200 μR/h; in these conditions vegetation aggregations close to the initial zonal coenosis develop. For the hilly terrain it was found that vegetation restoration on hilltops starts with the invasion of weed species. Plant aggregations dominated by Caragana pumila and Ephedra distachya develop on accumulations of fine earth in cracks of the rock, while lichens and mosses colonize the rock outcrops. Plant aggregations dominated by Spiraea hypericifolia, Caragana pumila and Artemisia frigida develop on the upper parts of hill slopes. Craters of nuclear explosions have not yet been colonized by higher plants. Sparse plant aggregations formed by Psathyrostachys juncea and Artemisia frigida appear on the lower parts of hill slopes, and single individuals of Medicago falcata, Galium ruthenicum and Melilotus dentatus are found on the sides of explosion craters. Vegetation recovers slowly in trenches on gentle hill slopes. The following measures are necessary to intensify the restoration of vegetation destroyed or damaged by the nuclear explosions: to clear the hill slopes of numerous fragments of metallic and plastic

  10. Arboreality and morphological evolution in ground beetles (Carabidae: Harpalinae): testing the taxon pulse model.

    Science.gov (United States)

    Ober, Karen A

    2003-06-01

    One-third to two-thirds of all tropical carabids, or ground beetles, are arboreal, and evolution of arboreality has been proposed to be a dead end in this group. Many arboreal carabids have unusual morphological features that have been proposed to be adaptations for life on vegetation, including large, hemispheric eyes; an elongated prothorax; long elytra; long legs; bilobed fourth tarsomeres; adhesive setae on tarsi; and pectinate claws. However, correlations between these features and arboreality have not been rigorously tested previously. I examined the evolution of arboreality and morphological features often associated with this habitat in a phylogenetic context. The number and rates of origins and losses of arboreality in carabids in the subfamily Harpalinae were inferred with parsimony and maximum-likelihood on a variety of phylogenetic hypotheses. Correlated evolution in arboreality and morphological characters was tested with concentrated changes tests, maximum-likelihood, and independent contrasts on optimal phylogenies. There is strong evidence that both arboreality and the morphological features examined originated multiple times and can be reversed, and in no case could the hypothesis of equal rates of gains and losses be rejected. Several features are associated with arboreality: adhesive setae on the tarsi, bilobed tarsomeres, and possibly pectinate claws and an elongated prothorax. Bulgy eyes, long legs, and long elytra were not correlated with arboreality and are probably not arboreal adaptations. The evolution of arboreal carabids has not been unidirectional. These beetles have experienced multiple gains and losses of arboreality and the morphological characters commonly associated with the arboreal habitat. The evolutionary process of unidirectional character change may not be as widespread as previously thought and reversal from specialized lifestyles or habitats may be common.

  11. VLE measurements using a static cell vapor phase manual sampling method accompanied with an empirical data consistency test

    International Nuclear Information System (INIS)

    Freitag, Joerg; Kosuge, Hitoshi; Schmelzer, Juergen P.; Kato, Satoru

    2015-01-01

    Highlights: • We use a new, simple static cell vapor phase manual sampling method (SCVMS) for VLE (x, y, T) measurement. • The method is applied to non-azeotropic, asymmetric and two-liquid phase forming azeotropic binaries. • The method is approved by a data consistency test, i.e., a plot of the polarity exclusion factor vs. pressure. • The consistency test reveals that with the new SCVMS method accurate VLE near ambient temperature can be measured. • Moreover, the consistency test approves that the effect of air in the SCVMS system is negligible. - Abstract: A new static cell vapor phase manual sampling (SCVMS) method is used for the simple measurement of constant temperature x, y (vapor + liquid) equilibria (VLE). The method was applied to the VLE measurements of the (methanol + water) binary at T/K = (283.2, 298.2, 308.2 and 322.9), asymmetric (acetone + 1-butanol) binary at T/K = (283.2, 295.2, 308.2 and 324.2) and two-liquid phase forming azeotropic (water + 1-butanol) binary at T/K = (283.2 and 298.2). The accuracy of the experimental data was approved by a data consistency test, that is, an empirical plot of the polarity exclusion factor, β, vs. the system pressure, P. The SCVMS data are accurate, because the VLE data converge to the same lnβ vs. lnP straight line determined from conventional distillation-still method and a headspace gas chromatography method
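
    The consistency test described (a straight-line check of ln β against ln P) can be sketched as follows; the polarity exclusion factor and pressure values are invented placeholders, not the measured data.

      import numpy as np

      P_kpa = np.array([5.2, 9.8, 16.4, 31.0, 55.7])     # hypothetical system pressures
      beta  = np.array([0.81, 0.62, 0.47, 0.31, 0.21])   # hypothetical polarity exclusion factors

      # Fit ln(beta) vs ln(P) and check how close the points lie to a straight line.
      slope, intercept = np.polyfit(np.log(P_kpa), np.log(beta), 1)
      fit = slope * np.log(P_kpa) + intercept
      ss_res = np.sum((np.log(beta) - fit) ** 2)
      ss_tot = np.sum((np.log(beta) - np.log(beta).mean()) ** 2)
      print(f"ln(beta) = {slope:.3f} ln(P) + {intercept:.3f}, R^2 = {1 - ss_res / ss_tot:.4f}")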

  12. An empirical assessment of near-source strong ground motion for a 6.6 mb (7.5 MS) earthquake in the Eastern United States

    International Nuclear Information System (INIS)

    Campbell, Kenneth W.

    1984-06-01

    To help assess the impact of the current U.S. Geological Survey position on the seismic safety of nuclear power plants in the Eastern United States (EUS), several techniques for estimating near-source strong ground motion for a Charleston-size earthquake were evaluated. The techniques assessed for estimating the near-source strong ground motion of a 6.6 mb (7.5 MS) earthquake in the Eastern United States are methods based on site-specific analyses, semi-theoretical scaling techniques, and intensity-based estimates. The first involves the statistical analysis of ground motion records from earthquakes and recording stations having the same general characteristics (earthquakes with magnitudes of 7.5 MS or larger, epicentral distances of 25 km or less, and sites of either soil or rock). Some recommendations for source and site characterization scaling, and for the bias resulting primarily from an inadequate sample of near-source recordings from earthquakes of large magnitude, are discussed. The second technique evaluated requires that semi-theoretical estimates of peak ground motion parameters for a 6.6 mb (7.5 MS) earthquake be obtained from scaling relations. Each relation uses a theoretical expression relating peak acceleration to magnitude and distance, together with available strong motion data (the majority coming from California), to develop a scaling relation appropriate for the Eastern United States. None of the existing ground motion models for the EUS include the potential effects of source or site characteristics. Adjustments to account for fault mechanisms, site topography, site geology, and the size and embedment of buildings are discussed. The final approach used relations between strong ground motion parameters and Modified Mercalli Intensity in conjunction with two methods to estimate peak parameters for a 6.6 mb (7.5 MS) earthquake. As with the other techniques, adjustments of peak acceleration estimates are discussed. Each method differently approaches the problem

  13. How Funding and Policy Affect Access to and Modernization of Major Air Force Ground Test Infrastructure Assets

    Science.gov (United States)

    2017-04-06

    annually for the DoD, other government agencies, allies, and commercial customers at the world’s largest ground test flight simulation facility ... center’s wind tunnels, gas turbine sea level and altitude test cells, space chambers, altitude rocket cells, ballistic ranges, arc heaters and other ... complex, and the second was an altitude solid rocket motor test facility called J6. The first was the result of a herculean effort that took

  14. Analytic model for surface ground motion with spall induced by underground nuclear tests

    International Nuclear Information System (INIS)

    MacQueen, D.H.

    1982-04-01

    This report provides a detailed presentation and critique of a model used to characterize the surface ground motion following a contained, spalling underground nuclear explosion intended for calculation of the resulting atmospheric acoustic pulse. Some examples of its use are included. Some discussion of the general approach of ground motion model parameter extraction, not dependent on the specific model, is also presented
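
    As a rough illustration of the ballistic phase that spall models of this kind describe (a generic free-fall approximation, not necessarily the report's parameterization), the surface is in free fall at about -1 g between breakaway and slapdown, so the dwell time scales as 2*v0/g for a take-off velocity v0:

      import numpy as np

      g = 9.81
      v0 = 2.0                          # hypothetical surface take-off velocity, m/s
      t_dwell = 2 * v0 / g              # time aloft before slapdown

      t = np.linspace(0, 1.2 * t_dwell, 200)
      z = np.where(t < t_dwell, v0 * t - 0.5 * g * t**2, 0.0)   # surface displacement history
      print(f"dwell time ~ {t_dwell:.2f} s, peak rise ~ {z.max():.2f} m")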

  15. HV Test of the CTS Edgeless Silicon Detector in Vacuum and Close to a Grounded Plate

    CERN Document Server

    Eremin, Vladimir; Ruggiero, Gennaro

    2007-01-01

    The TOTEM Roman Pot Silicon sensors will be operated in vacuum to minimise the mechanical stress of the thin metal window which separates the detector package from the ultra high vacuum of the beam. To approach the beam axis as closely as possible, the detectors will be mounted with their edge at a distance of the order of 100-200 μm from the thin metal window. As the detectors will be run in overdepletion mode to allow the full charge collection within the shaping time of the readout electronics, there will be a potential drop of more than 100 V across their edge. Moreover, this potential drop might need to be further increased with the accumulated radiation dose. The main goals of the tests described in this note are: - Characterisation of the voltage-current characteristics when the detector edge is in the direct vicinity of a grounded metal plate which simulates the above mentioned vacuum window; - Demonstration of the detector operation in vacuum at different pressures.

  16. Performance report on the ground test accelerator radio-frequency quadrupole

    International Nuclear Information System (INIS)

    Sander, O.R.; Atkins, W.H.; Bolme, G.O.; Brown, S.; Cole, R.; Connolly, R.; Gilpatrick, J.D.; Garnett, R.; Guy, F.W.; Ingalls, W.B.

    1994-01-01

    The Ground Test Accelerator (GTA) uses a radio-frequency quadrupole (RFQ) to bunch and accelerate a 35 keV input beam to a final energy of 2.5 MeV. Most measured parameters of the GTA RFQ agreed with simulated predictions. The relative shape of the transmission versus the vane-voltage relationship and the Courant-Snyder (CS) parameters of the output beam's transverse and longitudinal phase spaces agreed well with predictions. However, the transmission of the RFQ was significantly lower than expected. Improved simulation studies included image charges and multipole effects in the RFQ. Most of the predicted properties of the RFQ, such as input matched-beam conditions and output-beam shapes were unaffected by these additional effects. However, the comparison of measured with predicted absolute values of transmitted beam was much improved by the inclusion of these effects in the simulations. The comparison implied a value for the input emittance that is consistent with measurements

  17. Final safety analysis report for the Ground Test Accelerator (GTA), Phase 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-10-01

    This document is the second volume of a 3 volume safety analysis report on the Ground Test Accelerator (GTA). The GTA program at the Los Alamos National Laboratory (LANL) is the major element of the national Neutral Particle Beam (NPB) program, which is supported by the Strategic Defense Initiative Office (SDIO). A principal goal of the national NPB program is to assess the feasibility of using hydrogen and deuterium neutral particle beams outside the Earth's atmosphere. The main effort of the NPB program at Los Alamos concentrates on developing the GTA. The GTA is classified as a low-hazard facility, except for the cryogenic-cooling system, which is classified as a moderate-hazard facility. This volume consists of failure modes and effects analysis; accident analysis; operational safety requirements; quality assurance program; ES&H management program; environmental, safety, and health systems critical to safety; summary of waste-management program; environmental monitoring program; facility expansion, decontamination, and decommissioning; summary of emergency response plan; summary plan for employee training; summary plan for operating procedures; glossary; and appendices A and B.

  18. A Rigorous Test of the Fit of the Circumplex Model to Big Five Personality Data: Theoretical and Methodological Issues and Two Large Sample Empirical Tests.

    Science.gov (United States)

    DeGeest, David Scott; Schmidt, Frank

    2015-01-01

    Our objective was to apply the rigorous test developed by Browne (1992) to determine whether the circumplex model fits Big Five personality data. This test has yet to be applied to personality data. Another objective was to determine whether blended items explained correlations among the Big Five traits. We used two working adult samples, the Eugene-Springfield Community Sample and the Professional Worker Career Experience Survey. Fit to the circumplex was tested via Browne's (1992) procedure. Circumplexes were graphed to identify items with loadings on multiple traits (blended items), and to determine whether removing these items changed five-factor model (FFM) trait intercorrelations. In both samples, the circumplex structure fit the FFM traits well. Each sample had items with dual-factor loadings (8 items in the first sample, 21 in the second). Removing blended items had little effect on construct-level intercorrelations among FFM traits. We conclude that rigorous tests show that the fit of personality data to the circumplex model is good. This finding means the circumplex model is competitive with the factor model in understanding the organization of personality traits. The circumplex structure also provides a theoretically and empirically sound rationale for evaluating intercorrelations among FFM traits. Even after eliminating blended items, FFM personality traits remained correlated.

  19. Two-dimensional models as testing ground for principles and concepts of local quantum physics

    International Nuclear Information System (INIS)

    Schroer, Bert

    2005-04-01

    In the past, two-dimensional models of QFT have served as theoretical laboratories for testing new concepts under mathematically controllable conditions. In more recent times low-dimensional models (e.g. chiral models, factoring models) often have been treated by special recipes in a way which sometimes led to a loss of unity of QFT. In the present work I try to counteract this apartheid tendency by reviewing past results within the setting of the general principles of QFT. To this I add two new ideas: (1) a modular interpretation of the chiral model Diff(S)-covariance with a close connection to the recently formulated local covariance principle for QFT in curved spacetime and (2) a derivation of the chiral model temperature duality from a suitable operator formulation of the angular Wick rotation (in analogy to the Nelson-Symanzik duality in the Osterwalder-Schrader setting) for rational chiral theories. The SL(2,Z) modular Verlinde relation is a special case of this thermal duality and (within the family of rational models) the matrix S appearing in the thermal duality relation becomes identified with the statistics character matrix S. The relevant angular 'Euclideanization' is done in the setting of the Tomita-Takesaki modular formalism of operator algebras. I find it appropriate to dedicate this work to the memory of J. A. Swieca with whom I shared the interest in two-dimensional models as a testing ground for QFT for more than one decade. This is a significantly extended version of an 'Encyclopedia of Mathematical Physics' contribution hep-th/0502125. (author)

  20. Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests

    Science.gov (United States)

    Cowardin, H.

    2015-01-01

    Existing DoD and NASA satellite breakup models are based on a key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), which has supported many applications and matched on-orbit events involving older satellite designs reasonably well over the years. In order to update and improve the break-up models and the NASA Size Estimation Model (SEM) for events involving more modern satellite designs, the NASA Orbital Debris Program Office has worked in collaboration with the University of Florida to replicate a hypervelocity impact using a satellite built with modern-day spacecraft materials and construction techniques. The spacecraft, called DebriSat, was intended to be representative of modern LEO satellites, and all major design decisions were reviewed and approved by subject matter experts at Aerospace Corporation. DebriSat is composed of 7 major subsystems including the attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. All fragments down to 2 mm in size will be characterized by material, size, shape and bulk density, and the associated data will be stored in a database for multiple users to access. Laboratory radar and optical measurements will be performed on a subset of fragments to provide a better understanding of the data products from orbital debris acquired from ground-based radars and telescopes. The resulting data analysis from DebriSat will be used to update break-up models and to develop the first optical SEM in conjunction with updates to the current NASA SEM. The characterization of the fragmentation will be discussed in the subsequent presentation.

  1. Two-dimensional models as testing ground for principles and concepts of local quantum physics

    Energy Technology Data Exchange (ETDEWEB)

    Schroer, Bert [FU Berlin (Germany). Institut fuer Theoretische Physik

    2005-04-15

    In the past, two-dimensional models of QFT have served as theoretical laboratories for testing new concepts under mathematically controllable conditions. In more recent times low-dimensional models (e.g. chiral models, factoring models) often have been treated by special recipes in a way which sometimes led to a loss of unity of QFT. In the present work I try to counteract this apartheid tendency by reviewing past results within the setting of the general principles of QFT. To this I add two new ideas: (1) a modular interpretation of the chiral model Diff(S)-covariance with a close connection to the recently formulated local covariance principle for QFT in curved spacetime and (2) a derivation of the chiral model temperature duality from a suitable operator formulation of the angular Wick rotation (in analogy to the Nelson-Symanzik duality in the Osterwalder-Schrader setting) for rational chiral theories. The SL(2,Z) modular Verlinde relation is a special case of this thermal duality and (within the family of rational models) the matrix S appearing in the thermal duality relation becomes identified with the statistics character matrix S. The relevant angular 'Euclideanization' is done in the setting of the Tomita-Takesaki modular formalism of operator algebras. I find it appropriate to dedicate this work to the memory of J. A. Swieca with whom I shared the interest in two-dimensional models as a testing ground for QFT for more than one decade. This is a significantly extended version of an 'Encyclopedia of Mathematical Physics' contribution hep-th/0502125. (author)

  2. Life-Cycle Assessments of Selected NASA Ground-Based Test Facilities

    Science.gov (United States)

    Sydnor, George Honeycutt

    2012-01-01

    In the past two years, two separate facility-specific life cycle assessments (LCAs) have been performed as summer student projects. The first project focused on 13 facilities managed by NASA's Aeronautics Test Program (ATP), an organization responsible for large, high-energy ground test facilities that accomplish the nation's most advanced aerospace research. A facility inventory was created for each facility, and the operational-phase carbon footprint and environmental impact were calculated. The largest impacts stemmed from electricity and natural gas used directly at the facility and to generate support processes such as compressed air and steam. However, in specialized facilities that use unique inputs like R-134a, R-14, jet fuels, or nitrogen gas, these sometimes had a considerable effect on the facility's overall environmental impact. The second LCA project was conducted on the NASA Ames Arc Jet Complex and also involved creating a facility inventory and calculating the carbon footprint and environmental impact. In addition, operational alternatives were analyzed for their effectiveness at reducing impact. Overall, the Arc Jet Complex impact is dominated by the natural-gas fired boiler producing steam on-site, but alternatives were provided that could reduce the impact of the boiler operation, some of which are already being implemented. The data and results provided by these LCA projects are beneficial to both the individual facilities and NASA as a whole; the results have already been used in a proposal to reduce carbon footprint at Ames Research Center. To help future life cycle projects, several lessons learned have been recommended as simple and effective infrastructure improvements to NASA, including better utility metering and data recording and standardization of modeling choices and methods. These studies also increased sensitivity to and appreciation for quantifying the impact of NASA's activities.

  3. Testing of ground fault relay response during the energisation of megawatt range electric boilers in thermal power plants

    DEFF Research Database (Denmark)

    Silva, Filipe Miguel Faria da; Bak, Claus Leth; Davidsen, Troels

    2015-01-01

    , with the advantage that the warmed water can be reused in a thermal power plant or at regional heating, thus, minimising the overall losses. However, one problem was raised by those purchasing the boilers, mainly the possibility of an unwanted triggering of the protections relays, especially ground fault protection...... for the testing of two ground fault protection relays, in order to assure that they are not triggered by the energisation of the boiler. The test is performed via an OMICRON CMC 256 with Advanced TransPlay SW, which generates the signals that would be present at the secondary of the instrumentation transformers......, during the energisation of a boiler. A special case for concern was the presence of an electric arc between the electrodes of the boiler and the water in the boiler during approximately 2s at the energisation, which can in theory be seen as a ground fault by the relay. The voltage and current transient...

  4. A study on seismic behavior of pile foundations of bridge abutment on liquefiable ground through shaking table tests

    Science.gov (United States)

    Nakata, Mitsuhiko; Tanimoto, Shunsuke; Ishida, Shuichi; Ohsumi, Michio; Hoshikuma, Jun-ichi

    2017-10-01

    There is a risk of bridge foundations being damaged by liquefaction-induced lateral spreading of the ground. Once bridge foundations have been damaged, restoration takes a long time. Therefore, it is important to assess the seismic behavior of foundations on liquefiable ground appropriately. In this study, shaking table tests of models on a scale of 1/10 were conducted on the large shaking table of the Public Works Research Institute, Japan, to investigate the seismic behavior of pile-supported bridge abutments on liquefiable ground. The shaking table tests were conducted for three types of model. Two are models of an existing bridge that was built without design for liquefaction, and the other is a model of a bridge designed based on the current Japanese design specifications for highway bridges. As a result, the bending strains of the piles of the abutment designed based on the current design specifications were less than those of the existing bridge.

  5. Development and testing of techniques for in-ground stabilization, size reduction, and safe removal of radioactive wastes stored in containments buried in ground

    International Nuclear Information System (INIS)

    Halliwell, Stephen; Christodoulou, Apostolos

    2013-01-01

    Since the 1950s radioactive wastes from a number of laboratories have been stored below ground at the Hanford site, Washington State, USA, in vertical pipe units (VPUs) made of five 200 litre drums without tops or bottoms, and in caissons, made out of corrugated pipe, or concrete and typically 2,500 mm in diameter. The VPUs are buried of the order of 2,100 mm below grade, and the caissons are buried of the order of 6,000 mm below grade. The waste contains fuel pieces, fission products, and a range of chemicals used in the laboratory processes. This can include various energetic reactants such as un-reacted sodium potassium (NaK), potassium superoxide (KO₂), and picric acid, as well as quantities of other liquids. The integrity of the containments is considered to present unacceptable risks from leakage of radioactivity to the environment. This paper describes the successful development and full scale testing of in-ground augering equipment, grouting systems and removal equipment for remediation and removal of the VPUs, and the initial development work to test the utilization of the same basic augering and grouting techniques for the stabilization, size reduction and removal of caissons. (authors)

  6. Sensitivity is not an intrinsic property of a diagnostic test: empirical evidence from histological diagnosis of Helicobacter pylori infection

    Directory of Open Access Journals (Sweden)

    Carrilho Carla

    2009-12-01

    Full Text Available Abstract Background We aimed to provide empirical evidence of how spectrum effects can affect the sensitivity of histological assessment of Helicobacter pylori infection, which may contribute to explain the heterogeneity in prevalence estimates across populations with expectedly similar prevalence. Methods Cross-sectional evaluation of dyspeptic subjects undergoing upper digestive endoscopy, including collection of biopsy specimens from the greater curvature of the antrum for assessment of H. pylori infection by histopathological study and polymerase chain reaction (PCR), from Portugal (n = 106) and Mozambique (n = 102) following the same standardized protocol. Results In the Portuguese sample the prevalence of infection was 95.3% by histological assessment and 98.1% by PCR. In the Mozambican sample the prevalence was 63.7% and 93.1%, respectively. Among those classified as infected by PCR, the sensitivity of histological assessment was 96.2% among the Portuguese and 66.3% among the Mozambican. Among those testing positive by both methods, 5.0% of the Portuguese and 20.6% of the Mozambican had mild density of colonization. Conclusions This study shows a lower sensitivity of histological assessment of H. pylori infection in Mozambican dyspeptic patients compared to the Portuguese, which may be explained by differences in the density of colonization, and may contribute to explain the heterogeneity in prevalence estimates across African settings.
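
    The sensitivity figures above follow from the cross-classification of histology against PCR taken as the reference standard. A minimal sketch of that calculation is given below; the counts are hypothetical round numbers chosen for illustration, not the study's actual table.

        # Sensitivity of histology with PCR as the reference standard:
        # sensitivity = (histology+ among PCR+) / (all PCR+).
        # The counts below are hypothetical, not the study's 2x2 table.

        def sensitivity(true_pos, false_neg):
            return true_pos / (true_pos + false_neg)

        hist_pos_pcr_pos = 66   # histology positive, PCR positive
        hist_neg_pcr_pos = 34   # histology negative, PCR positive (missed infections)

        print(f"Sensitivity of histology vs PCR: {sensitivity(hist_pos_pcr_pos, hist_neg_pcr_pos):.1%}")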

  7. Ground penetrating radar for determining volumetric soil water content ; results of comparative measurements at two test sites

    NARCIS (Netherlands)

    Overmeeren, R.A. van; Sariowan, S.V.; Gehrels, J.C.

    1997-01-01

    Ground penetrating radar (GPR) can provide information on the soil water content of the unsaturated zone in sandy deposits via measurements from the surface, and so avoids drilling. Proof of this was found from measurements of radar wave velocities carried out ten times over 13 months at two test
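
    The truncated abstract stops before the velocity-to-water-content step. One common route, not necessarily the one used by the authors, is to convert radar velocity to an apparent dielectric constant and then apply Topp's empirical relation, sketched below.

        # Radar velocity -> apparent dielectric constant -> volumetric water content
        # using Topp's empirical polynomial. Illustrative only; the paper's own
        # petrophysical relation may differ.
        C = 0.2998                                      # speed of light in vacuum, m/ns

        def water_content_from_velocity(v_m_per_ns):
            ka = (C / v_m_per_ns) ** 2                  # apparent dielectric constant
            return (-5.3e-2 + 2.92e-2 * ka
                    - 5.5e-4 * ka ** 2 + 4.3e-6 * ka ** 3)

        for v in (0.15, 0.10, 0.07):                    # typical dry-to-wet sand velocities
            print(f"v = {v:.2f} m/ns  ->  theta = {water_content_from_velocity(v):.3f}")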

  8. Development of a Ground Test and Analysis Protocol for NASA's NextSTEP Phase 2 Habitation Concepts

    Science.gov (United States)

    Gernhardt, Michael L.; Beaton, Kara H.; Chappell, Steven P.; Bekdash, Omar S.; Abercromby, Andrew F. J.

    2018-01-01

    The NASA Next Space Technologies for Exploration Partnerships (NextSTEP) program is a public-private partnership model that seeks commercial development of deep space exploration capabilities to support human spaceflight missions around and beyond cislunar space. NASA first issued the Phase 1 NextSTEP Broad Agency Announcement to U.S. industries in 2014, which called for innovative cislunar habitation concepts that leveraged commercialization plans for low-Earth orbit. These habitats will be part of the Deep Space Gateway (DSG), the cislunar space station planned by NASA for construction in the 2020s. In 2016, Phase 2 of the NextSTEP program selected five commercial partners to develop ground prototypes. A team of NASA research engineers and subject matter experts (SMEs) have been tasked with developing the ground-test protocol that will serve as the primary means by which these Phase 2 prototypes will be evaluated. Since 2008, this core test team has successfully conducted multiple spaceflight analog mission evaluations utilizing a consistent set of operational tools, methods, and metrics to enable the iterative development, testing, analysis, and validation of evolving exploration architectures, operations concepts, and vehicle designs. The purpose of implementing a similar evaluation process for the Phase 2 Habitation Concepts is to consistently evaluate different commercial partner ground prototypes to provide data-driven, actionable recommendations for Phase 3. This paper describes the process by which the ground test protocol was developed and the objectives, methods, and metrics by which the NextSTEP Phase 2 Habitation Concepts will be rigorously and systematically evaluated. The protocol has been developed using both a top-down and bottom-up approach. Top-down development began with the Human Exploration and Operations Mission Directorate (HEOMD) exploration objectives and ISS Exploration Capability Study Team (IECST) candidate flight objectives. Strategic

  9. Risk-based screening analysis of ground water contaminated by radionuclides introduced at the Nevada Test Site (NTS)

    International Nuclear Information System (INIS)

    Daniels, J.I.; Anspaugh, L.R.; Andricevic, R.; Jacobson, R.L.

    1993-06-01

    The Nevada Test Site (NTS) is located in the southwestern part of Nevada, about 105 km (65 mi) northwest of the city of Las Vegas. Underground tests of nuclear weapons devices have been conducted at the NTS since late 1962 and ground water beneath the NTS has been contaminated with radionuclides produced by these tests. This concern prompted the present examination of the potential health risk to individuals from drinking the contaminated ground water either at a location on the NTS (assuming loss of institutional control after 100 y) or at one offsite (considering groundwater migration). For the purpose of this assessment, a representative mix of the radionuclides of importance and their concentrations in ground water beneath the NTS were identified from measurements of radionuclide concentrations in groundwater samples-of-opportunity collected at the NTS. Transport of radionuclide-contaminated ground water offsite was evaluated using a travel-time-transport approach. At both locations of interest, potential human-health risk was calculated for an individual ingesting radionuclide-contaminated ground water over the course of a 70-y lifetime. Uncertainties about human physiological attributes, as well as about estimates of physical detriment per unit of radioactive material, were quantified and incorporated into the estimates of risk. The maximum potential excess lifetime risk of cancer mortality estimated for an individual at the offsite location ranges from 7 × 10⁻⁷ to 1 × 10⁻⁵, and at the onsite location ranges from 3 × 10⁻³ to 2 × 10⁻². Both the offsite and the onsite estimates of risk are dominated by the lifetime doses from tritium. For the assessment of radionuclides in ground water, the critical uncertainty is their concentration today under the entire NTS
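
    The risk estimates above come from a chain of the form concentration → lifetime intake → dose → risk. The toy calculation below shows the shape of that chain only; the tritium concentration, dose coefficient and risk factor are illustrative round values, not the assessment's inputs.

        # Toy screening chain: lifetime cancer risk from drinking tritiated ground water.
        # All numbers are illustrative placeholders, not the NTS assessment inputs.
        concentration_bq_per_l = 1.0e4     # tritium concentration in ground water
        intake_l_per_day = 2.0             # drinking-water intake
        years = 70                         # lifetime exposure duration
        dose_coeff_sv_per_bq = 1.8e-11     # ingestion dose coefficient for tritiated water
        risk_per_sv = 5.0e-2               # nominal lifetime fatal-cancer risk per sievert

        lifetime_intake_bq = concentration_bq_per_l * intake_l_per_day * 365 * years
        lifetime_dose_sv = lifetime_intake_bq * dose_coeff_sv_per_bq
        lifetime_risk = lifetime_dose_sv * risk_per_sv

        print(f"intake {lifetime_intake_bq:.2e} Bq, dose {lifetime_dose_sv:.2e} Sv, risk {lifetime_risk:.1e}")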

  10. Identifying nurse staffing research in Medline: development and testing of empirically derived search strategies with the PubMed interface.

    Science.gov (United States)

    Simon, Michael; Hausner, Elke; Klaus, Susan F; Dunton, Nancy E

    2010-08-23

    term "nurse staffing") could improve the precision of future searches in this field. Empirically selected search terms can help to develop effective search strategies. The high consistency between all test sets confirmed the validity of our approach.

  11. Syndemics of psychosocial problems and HIV risk: A systematic review of empirical tests of the disease interaction concept.

    Science.gov (United States)

    Tsai, Alexander C; Burns, Bridget F O

    2015-08-01

    In the theory of syndemics, diseases co-occur in particular temporal or geographical contexts due to harmful social conditions (disease concentration) and interact at the level of populations and individuals, with mutually enhancing deleterious consequences for health (disease interaction). This theory has widespread adherents in the field, but the extent to which there is empirical support for the concept of disease interaction remains unclear. In January 2015 we systematically searched 7 bibliographic databases and tracked citations to highly cited publications associated with the theory of syndemics. Of the 783 records, we ultimately included 34 published journal articles, 5 dissertations, and 1 conference abstract. Most studies were based on a cross-sectional design (32 [80%]), were conducted in the U.S. (32 [80%]), and focused on men who have sex with men (21 [53%]). The most frequently studied psychosocial problems were related to mental health (33 [83%]), substance abuse (36 [90%]), and violence (27 [68%]); while the most frequently studied outcome variables were HIV transmission risk behaviors (29 [73%]) or HIV infection (9 [23%]). To test the disease interaction concept, 11 (28%) studies used some variation of a product term, with less than half of these (5/11 [45%]) providing sufficient information to interpret interaction both on an additive and on a multiplicative scale. The most frequently used specification (31 [78%]) to test the disease interaction concept was the sum score corresponding to the total count of psychosocial problems. Although the count variable approach does not test hypotheses about interactions between psychosocial problems, these studies were much more likely than others (14/31 [45%] vs. 0/9 [0%]; χ² = 6.25, P = 0.01) to incorporate language about "synergy" or "interaction" that was inconsistent with the statistical models used. Therefore, more evidence is needed to assess the extent to which diseases interact, either at the
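
    The methodological point above — that a sum score of psychosocial problems tests a dose-response trend rather than interaction, whereas a product term tests interaction directly — can be made concrete with a small logistic-regression sketch on simulated data; the variable names and data-generating process are invented.

        # Sum-score vs product-term tests of "disease interaction" on simulated data.
        # Requires numpy and statsmodels; the data-generating process is invented.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 2000
        depression = rng.binomial(1, 0.3, n)
        substance_use = rng.binomial(1, 0.3, n)
        logit = -2.0 + 0.5 * depression + 0.5 * substance_use + 0.7 * depression * substance_use
        risk_behavior = rng.binomial(1, 1 / (1 + np.exp(-logit)))

        # (a) Count-variable approach: tests a dose-response trend, not interaction.
        count = depression + substance_use
        m_count = sm.Logit(risk_behavior, sm.add_constant(count)).fit(disp=False)

        # (b) Product-term approach: directly tests multiplicative interaction.
        X = np.column_stack([depression, substance_use, depression * substance_use])
        m_inter = sm.Logit(risk_behavior, sm.add_constant(X)).fit(disp=False)

        print(m_count.params)                    # slope of the sum score
        print(m_inter.params, m_inter.pvalues)   # last coefficient = interaction term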

  12. Occupations at Risk and Organizational Well-Being: An Empirical Test of a Job Insecurity Integrated Model.

    Science.gov (United States)

    Chirumbolo, Antonio; Urbini, Flavio; Callea, Antonino; Lo Presti, Alessandro; Talamo, Alessandra

    2017-01-01

    One of the more visible effects of the societal changes is the increased feelings of uncertainty in the workforce. In fact, job insecurity represents a crucial occupational risk factor and a major job stressor that has negative consequences on both organizational well-being and individual health. Many studies have focused on the consequences about the fear and the perception of losing the job as a whole (called quantitative job insecurity), while more recently research has begun to examine more extensively the worries and the perceptions of losing valued job features (called qualitative job insecurity). The vast majority of the studies, however, have investigated the effects of quantitative and qualitative job insecurity separately. In this paper, we proposed the Job Insecurity Integrated Model aimed to examine the effects of quantitative job insecurity and qualitative job insecurity on their short-term and long-term outcomes. This model was empirically tested in two independent studies, hypothesizing that qualitative job insecurity mediated the effects of quantitative job insecurity on different outcomes, such as work engagement and organizational identification (Study 1), and job satisfaction, commitment, psychological stress and turnover intention (Study 2). Study 1 was conducted on 329 employees in private firms, while Study 2 on 278 employees in both public sector and private firms. Results robustly showed that qualitative job insecurity totally mediated the effects of quantitative on all the considered outcomes. By showing that the effects of quantitative job insecurity on its outcomes passed through qualitative job insecurity, the Job Insecurity Integrated Model contributes to clarifying previous findings in job insecurity research and puts forward a framework that could profitably produce new investigations with important theoretical and practical implications.
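
    The mediation claim (quantitative job insecurity affecting outcomes through qualitative job insecurity) is commonly summarized by an indirect effect computed as a product of coefficients. The sketch below does this on simulated data; the variable names and effect sizes are invented, and the code is not the structural models used in the two studies.

        # Simple mediation sketch: X (quantitative JI) -> M (qualitative JI) -> Y (outcome).
        # Data and effect sizes are simulated for illustration; requires numpy and statsmodels.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 300
        x = rng.normal(size=n)                        # quantitative job insecurity
        m = 0.6 * x + rng.normal(scale=0.8, size=n)   # qualitative job insecurity
        y = -0.5 * m + rng.normal(scale=0.8, size=n)  # outcome, affected only through M

        a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                 # X -> M
        model_y = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()
        c_prime, b = model_y.params[1], model_y.params[2]                 # direct X -> Y, M -> Y

        print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")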

  13. Occupations at Risk and Organizational Well-Being: An Empirical Test of a Job Insecurity Integrated Model

    Science.gov (United States)

    Chirumbolo, Antonio; Urbini, Flavio; Callea, Antonino; Lo Presti, Alessandro; Talamo, Alessandra

    2017-01-01

    One of the more visible effects of the societal changes is the increased feelings of uncertainty in the workforce. In fact, job insecurity represents a crucial occupational risk factor and a major job stressor that has negative consequences on both organizational well-being and individual health. Many studies have focused on the consequences about the fear and the perception of losing the job as a whole (called quantitative job insecurity), while more recently research has begun to examine more extensively the worries and the perceptions of losing valued job features (called qualitative job insecurity). The vast majority of the studies, however, have investigated the effects of quantitative and qualitative job insecurity separately. In this paper, we proposed the Job Insecurity Integrated Model aimed to examine the effects of quantitative job insecurity and qualitative job insecurity on their short-term and long-term outcomes. This model was empirically tested in two independent studies, hypothesizing that qualitative job insecurity mediated the effects of quantitative job insecurity on different outcomes, such as work engagement and organizational identification (Study 1), and job satisfaction, commitment, psychological stress and turnover intention (Study 2). Study 1 was conducted on 329 employees in private firms, while Study 2 on 278 employees in both public sector and private firms. Results robustly showed that qualitative job insecurity totally mediated the effects of quantitative on all the considered outcomes. By showing that the effects of quantitative job insecurity on its outcomes passed through qualitative job insecurity, the Job Insecurity Integrated Model contributes to clarifying previous findings in job insecurity research and puts forward a framework that could profitably produce new investigations with important theoretical and practical implications. PMID:29250013

  14. Occupations at Risk and Organizational Well-Being: An Empirical Test of a Job Insecurity Integrated Model

    Directory of Open Access Journals (Sweden)

    Antonio Chirumbolo

    2017-11-01

    Full Text Available One of the more visible effects of the societal changes is the increased feelings of uncertainty in the workforce. In fact, job insecurity represents a crucial occupational risk factor and a major job stressor that has negative consequences on both organizational well-being and individual health. Many studies have focused on the consequences about the fear and the perception of losing the job as a whole (called quantitative job insecurity), while more recently research has begun to examine more extensively the worries and the perceptions of losing valued job features (called qualitative job insecurity). The vast majority of the studies, however, have investigated the effects of quantitative and qualitative job insecurity separately. In this paper, we proposed the Job Insecurity Integrated Model aimed to examine the effects of quantitative job insecurity and qualitative job insecurity on their short-term and long-term outcomes. This model was empirically tested in two independent studies, hypothesizing that qualitative job insecurity mediated the effects of quantitative job insecurity on different outcomes, such as work engagement and organizational identification (Study 1), and job satisfaction, commitment, psychological stress and turnover intention (Study 2). Study 1 was conducted on 329 employees in private firms, while Study 2 on 278 employees in both public sector and private firms. Results robustly showed that qualitative job insecurity totally mediated the effects of quantitative on all the considered outcomes. By showing that the effects of quantitative job insecurity on its outcomes passed through qualitative job insecurity, the Job Insecurity Integrated Model contributes to clarifying previous findings in job insecurity research and puts forward a framework that could profitably produce new investigations with important theoretical and practical implications.

  15. Empirical likelihood-based confidence intervals for the sensitivity of a continuous-scale diagnostic test at a fixed level of specificity.

    Science.gov (United States)

    Gengsheng Qin; Davis, Angela E; Jing, Bing-Yi

    2011-06-01

    For a continuous-scale diagnostic test, it is often of interest to find the range of the sensitivity of the test at the cut-off that yields a desired specificity. In this article, we first define a profile empirical likelihood ratio for the sensitivity of a continuous-scale diagnostic test and show that its limiting distribution is a scaled chi-square distribution. We then propose two new empirical likelihood-based confidence intervals for the sensitivity of the test at a fixed level of specificity by using the scaled chi-square distribution. Simulation studies are conducted to compare the finite sample performance of the newly proposed intervals with the existing intervals for the sensitivity in terms of coverage probability. A real example is used to illustrate the application of the recommended methods.
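
    The estimand above — sensitivity at the cut-off that yields a fixed specificity — can be illustrated on simulated scores. The sketch below uses a plain percentile bootstrap rather than the paper's empirical likelihood interval, purely to show the quantity being interval-estimated.

        # Sensitivity at the cut-off giving a fixed specificity, with a percentile
        # bootstrap CI. This is NOT the paper's empirical-likelihood interval; it is
        # an illustration of the estimand on simulated diagnostic scores.
        import numpy as np

        rng = np.random.default_rng(2)
        healthy = rng.normal(0.0, 1.0, 200)      # scores for non-diseased subjects
        diseased = rng.normal(1.5, 1.0, 150)     # scores for diseased subjects
        spec = 0.90                              # fixed specificity

        def sens_at_spec(healthy, diseased, spec):
            cutoff = np.quantile(healthy, spec)  # this fraction of healthy falls below the cutoff
            return np.mean(diseased > cutoff)

        est = sens_at_spec(healthy, diseased, spec)
        boot = [sens_at_spec(rng.choice(healthy, healthy.size, replace=True),
                             rng.choice(diseased, diseased.size, replace=True), spec)
                for _ in range(2000)]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"sensitivity at {spec:.0%} specificity: {est:.3f} (95% CI {lo:.3f}-{hi:.3f})")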

  16. Using empirical Bayes predictors from generalized linear mixed models to test and visualize associations among longitudinal outcomes.

    Science.gov (United States)

    Mikulich-Gilbertson, Susan K; Wagner, Brandie D; Grunwald, Gary K; Riggs, Paula D; Zerbe, Gary O

    2018-01-01

    Medical research is often designed to investigate changes in a collection of response variables that are measured repeatedly on the same subjects. The multivariate generalized linear mixed model (MGLMM) can be used to evaluate random coefficient associations (e.g. simple correlations, partial regression coefficients) among outcomes that may be non-normal and differently distributed by specifying a multivariate normal distribution for their random effects and then evaluating the latent relationship between them. Empirical Bayes predictors are readily available for each subject from any mixed model and are observable and hence, plotable. Here, we evaluate whether second-stage association analyses of empirical Bayes predictors from a MGLMM, provide a good approximation and visual representation of these latent association analyses using medical examples and simulations. Additionally, we compare these results with association analyses of empirical Bayes predictors generated from separate mixed models for each outcome, a procedure that could circumvent computational problems that arise when the dimension of the joint covariance matrix of random effects is large and prohibits estimation of latent associations. As has been shown in other analytic contexts, the p-values for all second-stage coefficients that were determined by naively assuming normality of empirical Bayes predictors provide a good approximation to p-values determined via permutation analysis. Analyzing outcomes that are interrelated with separate models in the first stage and then associating the resulting empirical Bayes predictors in a second stage results in different mean and covariance parameter estimates from the maximum likelihood estimates generated by a MGLMM. The potential for erroneous inference from using results from these separate models increases as the magnitude of the association among the outcomes increases. Thus if computable, scatterplots of the conditionally independent empirical Bayes
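
    The "separate models" route mentioned above — fitting a mixed model per outcome, extracting each subject's empirical Bayes (random-effect) predictions, and associating them in a second stage — can be sketched as follows on simulated longitudinal data. The code uses statsmodels' linear MixedLM rather than the multivariate GLMM of the paper, and all names and values are invented.

        # Second-stage association of empirical Bayes (random-intercept) predictors
        # from two separately fitted mixed models, on simulated longitudinal data.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        subjects, visits = 100, 5
        u1 = rng.normal(size=subjects)                    # latent subject effects, correlated
        u2 = 0.6 * u1 + rng.normal(scale=0.8, size=subjects)

        rows = []
        for i in range(subjects):
            for t in range(visits):
                rows.append({"id": i, "t": t,
                             "y1": 1.0 + u1[i] + 0.2 * t + rng.normal(scale=0.5),
                             "y2": 2.0 + u2[i] - 0.1 * t + rng.normal(scale=0.5)})
        df = pd.DataFrame(rows)

        def eb_intercepts(outcome):
            """Random-intercept (empirical Bayes) prediction per subject from one mixed model."""
            exog = sm.add_constant(df["t"])
            res = sm.MixedLM(df[outcome], exog, groups=df["id"]).fit()
            return np.array([r.iloc[0] for r in res.random_effects.values()])

        eb1, eb2 = eb_intercepts("y1"), eb_intercepts("y2")
        print("correlation of EB predictors:", np.corrcoef(eb1, eb2)[0, 1])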

  17. Verification of the Performance of a Vertical Ground Heat Exchanger Applied to a Test House in Melbourne, Australia

    Directory of Open Access Journals (Sweden)

    Koon Beng Ooi

    2017-10-01

    Full Text Available The ground heat exchanger is traditionally used as a heat source or sink for the heat pump that raises the temperature of water to about 50 °C to heat houses. However, in winter, the heating thermostat (the temperature at which heating begins) in the Australian Nationwide House Energy Rating Scheme (NatHERS) is only 20 °C during daytime and 15 °C at night. In South-East Melbourne, the temperature at the bottom of a 50-meter-deep borehole has been recorded with an Emerson™ recorder at 17 °C. Melbourne has an annual average temperature of 15 °C, so the ground temperature increases by 2 °C per 50 m of depth. A linear projection gives 23 °C at 200 m of depth, taken as the average undisturbed ground temperature for a 400-m-deep vertical ground heat exchanger (VGHE). This study, by simulation and experimentation, aims to verify that the circulation of water in the VGHE's U-tube to low-temperature radiators (LTRs) could heat a house to thermal comfort. A literature review is included in the introduction. A simulation, using a model of a 60-m² experimental house, shows that the daytime circulation of water in this VGHE/LTR-on-opposite-walls system during the 8-month cold half of the year heats the indoors to NatHERS settings. Simulation for the hot half shows that this VGHE-LTR system could cool the indoors. Instead, a fan creating a cooling sensation of up to 4 °C is used so that the VGHE is available for the regeneration of heat extracted from the ground during the cold portion. Simulations for this hot portion show that a 3.4-m² flat plate solar collector can collect more than twice the heat extracted from the ground in the cold portion. Thus, it can replenish the ground heat extracted for houses double the size of this 60-m² experimental house. Therefore, ground heat is sustainable for family-size homes. Since no heat pump is used, the cost of VGHE-LTR systems could be comparable to systems using the ground source heat pump. Water
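
    The 23 °C figure quoted above is a linear extrapolation of the measured gradient (15 °C annual average at the surface, 17 °C at 50 m, about 2 °C per 50 m); 200 m is the mid-depth of a 400-m borehole. A one-line check of the arithmetic:

        # Linear extrapolation of ground temperature with depth, as described above.
        t_surface, t_50m, depth_ref = 15.0, 17.0, 50.0   # deg C, deg C, m
        gradient = (t_50m - t_surface) / depth_ref        # about 0.04 deg C per metre

        def ground_temperature(depth_m):
            return t_surface + gradient * depth_m

        print(ground_temperature(200.0))   # -> 23.0 deg C, the mid-depth of a 400 m VGHE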

  18. Vessel grounding in entrance channels: case studies and physical model tests

    CSIR Research Space (South Africa)

    Tulsi, K

    2014-05-01

    Full Text Available . It was found that high speed impacts of 8 to 12 knots at 10° to the channel side slopes have the potential to damage the hull and require enormous tug forces to re-float the grounded vessel....

  19. Impact of seasonal forecast use on agricultural income in a system with varying crop costs and returns: an empirically-grounded simulation

    Science.gov (United States)

    Gunda, T.; Bazuin, J. T.; Nay, J.; Yeung, K. L.

    2017-03-01

    Access to seasonal climate forecasts can benefit farmers by allowing them to make more informed decisions about their farming practices. However, it is unclear whether farmers realize these benefits when crop choices available to farmers have different and variable costs and returns; multiple countries have programs that incentivize production of certain crops while other crops are subject to market fluctuations. We hypothesize that the benefits of forecasts on farmer livelihoods will be moderated by the combined impact of differing crop economics and changing climate. Drawing upon methods and insights from both physical and social sciences, we develop a model of farmer decision-making to evaluate this hypothesis. The model dynamics are explored using empirical data from Sri Lanka; primary sources include survey and interview information as well as game-based experiments conducted with farmers in the field. Our simulations show that a farmer using seasonal forecasts has more diversified crop selections, which drive increases in average agricultural income. Increases in income are particularly notable under a drier climate scenario, when a farmer using seasonal forecasts is more likely to plant onions, a crop with higher possible returns. Our results indicate that, when water resources are scarce (i.e. drier climate scenario), farmer incomes could become stratified, potentially compounding existing disparities in farmers’ financial and technical abilities to use forecasts to inform their crop selections. This analysis highlights that while programs that promote production of certain crops may ensure food security in the short-term, the long-term implications of these dynamics need careful evaluation.
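
    The decision logic described above — a farmer choosing among crops with different costs and returns, conditional on a seasonal forecast — can be caricatured with a simple expected-profit rule. The crop names, costs, returns and forecast probabilities below are invented, not the Sri Lankan survey or game data.

        # Toy crop-choice rule conditioned on a seasonal forecast.
        # Crops, costs, returns and forecast probabilities are invented placeholders.
        CROPS = {
            # per-hectare cost and returns under wet vs dry seasons
            "rice":   {"cost": 300, "return_wet": 900, "return_dry": 400},
            "onions": {"cost": 500, "return_wet": 800, "return_dry": 1500},
        }

        def expected_profit(crop, p_dry):
            c = CROPS[crop]
            expected_return = (1 - p_dry) * c["return_wet"] + p_dry * c["return_dry"]
            return expected_return - c["cost"]

        def choose_crop(p_dry):
            return max(CROPS, key=lambda crop: expected_profit(crop, p_dry))

        for p_dry in (0.2, 0.5, 0.8):   # forecast probability of a dry season
            crop = choose_crop(p_dry)
            print(p_dry, crop, round(expected_profit(crop, p_dry)))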

  20. From Regional Hazard Assessment to Nuclear-Test-Ban Treaty Support - InSAR Ground Motion Services

    Science.gov (United States)

    Lege, T.; Kalia, A.; Gruenberg, I.; Frei, M.

    2016-12-01

    There are numerous scientific applications of InSAR methods in tectonics, earthquake analysis and other geologic and geophysical fields. Ground motion on local and regional scale measured and monitored via the application of the InSAR techniques provide scientists and engineers with plenty of new insights and further understanding of subsurface processes. However, the operational use of InSAR is not yet very widespread. To foster the operational utilization of the Copernicus Sentinel Satellites in the day-to-day business of federal, state and municipal work and planning BGR (Federal Institute for Geosciences and Natural Resources) initiated workshops with potential user groups. Through extensive reconcilement of interests and demands with scientific, technical, economic and governmental stakeholders (e.g. Ministries, Mining Authorities, Geological Surveys, Geodetic Surveys and Environmental Agencies on federal and state level, SMEs, German Aerospace Center) BGR developed the concept of the InSAR based German National Ground Motion Service. One important backbone for the nationwide ground motion service is the so-called Persistent Scatterer Interferometry Wide Area Product (WAP) approach developed with grants of European research funds. The presentation shows the implementation of the ground motion service and examples for product developments for operational supervision of mining, water resources management and spatial planning. Furthermore the contributions of Copernicus Sentinel 1 radar data in the context of CTBT are discussed. The DInSAR processing of Sentinel 1 IW (Interferometric Wide Swath) SAR acquisitions from January 1st and 13th Jan. 2016 allow for the first time a near real time ground motion measurement of the North Korean nuclear test site. The measured ground displacements show a strong spatio-temporal correlation to the calculated epicenter measured by teleseismic stations. We are convinced this way another space technique will soon contribute even

  1. First observations of tritium in ground water outside chimneys of underground nuclear explosions, Yucca Flat, Nevada Test Site

    International Nuclear Information System (INIS)

    Crow, N.B.

    1976-01-01

    Abnormal levels of radionuclides had not been detected in ground water at the Nevada Test Site beyond the immediate vicinity of underground nuclear explosions until April 1974, when above-background tritium activity levels were detected in ground-water inflow from the tuff beneath Yucca Flat to an emplacement chamber being mined in hole U2aw in the east-central part of Area 2. No other radionuclides were detected in a sample of water from the chamber. In comparison with the amount of tritium estimated to be present in the ground water in nearby nuclear chimneys, the activity level at U2aw is very low. To put the tritium activity levels at U2aw into proper perspective, the maximum tritium activity level observed was significantly less than the maximum permissible concentration (MPC) for a restricted area, though from mid-April 1974 until the emplacement chamber was expended in September 1974, the tritium activity exceeded the MPC for the general public. Above-background tritium activity was also detected in ground water from the adjacent exploratory hole, Ue2aw. The nearest underground nuclear explosion detonated beneath the water table, believed to be the source of the tritium observed, is Commodore (U2am), located 465 m southeast of the emplacement chamber in U2aw. Commodore was detonated in May 1967. In May 1975, tritium activity significantly higher than regional background was detected in ground water from hole Ue2ar, 980 m south of the emplacement chamber in U2aw and 361 m from a second underground nuclear explosion, Agile (U2v), also detonated below the water table, in February 1967. This paper describes these occurrences of tritium in the ground water. A mechanism to account for the movement of tritium is postulated

  2. Status of the Correlation Process of the V-HAB Simulation with Ground Tests and ISS Telemetry Data

    Science.gov (United States)

    Ploetner, P.; Roth, C.; Zhukov, A.; Czupalla, M.; Anderson, M.; Ewert, M.

    2013-01-01

    The Virtual Habitat (V-HAB) is a dynamic Life Support System (LSS) simulation, created for investigation of future human spaceflight missions. It provides the capability to optimize LSS during early design phases. The focal point of the paper is the correlation and validation of V-HAB against ground test and flight data. In order to utilize V-HAB to design an Environmental Control and Life Support System (ECLSS) it is important to know the accuracy of simulations, strengths and weaknesses. Therefore, simulations of real systems are essential. The modeling of the International Space Station (ISS) ECLSS in terms of single technologies as well as an integrated system and correlation against ground and flight test data is described. The results of the simulations make it possible to prove the approach taken by V-HAB.

  3. Ground tests with prototype of CeBr₃ active gamma ray spectrometer proposed for future Venus surface missions

    Energy Technology Data Exchange (ETDEWEB)

    Litvak, M.L., E-mail: litvak@mx.iki.rssi.ru [Space Research Institute, RAS, Moscow 117997 (Russian Federation); Sanin, A.B.; Golovin, D.V. [Space Research Institute, RAS, Moscow 117997 (Russian Federation); Jun, I. [Jet Propulsion Laboratory, Pasadena, CA (United States); Mitrofanov, I.G. [Space Research Institute, RAS, Moscow 117997 (Russian Federation); Shvetsov, V.N.; Timoshenko, G.N. [Joint Institute for Nuclear Research, Dubna (Russian Federation); Vostrukhin, A.A. [Space Research Institute, RAS, Moscow 117997 (Russian Federation)

    2017-03-11

    The results of a series of ground tests with a prototype of an active gamma-ray spectrometer based on a new generation of scintillation crystal (CeBr₃) are presented together with a consideration of its applicability to future Venus landing missions. We evaluated the instrument's capability to distinguish the subsurface elemental composition of primary rock forming elements such as O, Na, Mg, Al, Si, K and Fe. Our study uses heritage from previous ground and field tests and applies to the analysis of gamma lines from activation reaction products generated by a pulsed neutron generator. We have estimated that the expected accuracies achieved in this approach could be as high as 1–10% for the particular chemical element being studied.

  4. Summary of hydrogeologic controls on ground-water flow at the Nevada Test Site, Nye County, Nevada

    Science.gov (United States)

    Laczniak, R.J.; Cole, J.C.; Sawyer, D.A.; Trudeau, D.A.

    1996-01-01

    The underground testing of nuclear devices has generated substantial volumes of radioactive and other chemical contaminants below ground at the Nevada Test Site (NTS). Many of the more radioactive contaminants are highly toxic and are known to persist in the environment for thousands of years. In response to concerns about potential health hazards, the U.S. Department of Energy, under its Environmental Restoration Program, has made NTS the subject of a long-term investigation. Efforts supported through the U.S. Department of Energy program will assess whether byproducts of underground testing pose a potential hazard to the health and safety of the public and, if necessary, will evaluate and implement steps to remediate any of the identified dangers. Test-generated contaminants have been introduced over large areas and at variable depths above and below the water table throughout NTS. Evaluating the risks associated with these byproducts of underground testing presupposes a knowledge of the source, transport, and potential receptors of these contaminants. Ground-water flow is the primary mechanism by which contaminants can be transported significant distances away from the initial point of injection. Flow paths between contaminant sources and potential receptors are separated by remote areas that span tens of miles. The diversity and structural complexity of the rocks along these flow paths complicates the hydrology of the region. Although the hydrology has been studied in some detail, much still remains uncertain about flow rates and directions through the fractured-rock aquifers that transmit water great distances across this arid region. Unique to the hydrology of NTS are the effects of underground testing, which severely alter local rock characteristics and affect hydrologic conditions throughout the region. Any assessment of the risk must rely in part on the current understanding of ground-water flow, and the assessment will be only as good as the understanding

  5. Adoption of a service innovation in the business market : An empirical test of supply-side variables

    NARCIS (Netherlands)

    Frambach, Ruud T.; Barkema, Harry G.; Nooteboom, Bart; Wedel, Michel

    1998-01-01

    The objective of this article is to assess the influence of variables over which suppliers have control (supply-side variables) on the adoption of innovations in addition to adopter-side variables. The empirical study focused on the adoption of electronic banking in the Dutch business market. A

  6. Adoption of a service innovation in the business market : An empirical test of supply-side variables

    NARCIS (Netherlands)

    Frambach, RT; Nooteboom, B; Wedel, M; Barkema, H.W.

    The objective of this article is to assess the influence of variables over which suppliers have control (supply-side variables) on the adoption of innovations in addition to adopter-side variables. The empirical study focused on the adoption of electronic banking in the Dutch business market. A

  7. An Attitudinal Explanation of Biases in the Criminal Justice System: An Empirical Testing of Defensive Attribution Theory

    Science.gov (United States)

    Herzog, Sergio

    2008-01-01

    Theoretical perspectives, supported by empirical evidence, have consistently argued that the judicial treatment of offenders by criminal justice agents is sometimes biased by extralegal factors, such as offenders' sociodemographic characteristics. According to defensive attribution theory, individuals tend to protect themselves against unfortunate…

  8. Summary of hydrogeologic controls on ground-water flow at the Nevada Test Site, Nye County, Nevada

    International Nuclear Information System (INIS)

    Laczniak, R.J.; Cole, J.C.; Sawyer, D.A.; Trudeau, D.A.

    1996-01-01

    The underground testing of nuclear devices has generated substantial volumes of radioactive and other chemical contaminants below ground at the Nevada Test Site (NTS). Many of the more radioactive contaminants are highly toxic and are known to persist in the environment for thousands of years. In response to concerns about potential health hazards, the US Department of Energy, under its Environmental Restoration Program, has made NTS the subject of a long-term investigation. Efforts will assess whether byproducts of underground testing pose a potential hazard to the health and safety of the public and, if necessary, will evaluate and implement steps to remediate any of the identified dangers. Ground-water flow is the primary mechanism by which contaminants can be transported significant distances away from the initial point of injection. Flow paths between contaminant sources and potential receptors are separated by remote areas that span tens of miles. The diversity and structural complexity of the rocks along these flow paths complicates the hydrology of the region. Although the hydrology has been studied in some detail, much still remains uncertain about flow rates and directions through the fractured-rock aquifers that transmit water great distances across this arid region. Unique to the hydrology of NTS are the effects of underground testing, which severely alter local rock characteristics and affect hydrologic conditions throughout the region. This report summarizes what is known and inferred about ground-water flow throughout the NTS region. The report identifies and updates what is known about some of the major controls on ground-water flow, highlights some of the uncertainties in the current understanding, and prioritizes some of the technical needs as related to the Environmental Restoration Program. 113 refs

  9. Inglorious Empire

    DEFF Research Database (Denmark)

    Khair, Tabish

    2017-01-01

    Review of 'Inglorious Empire: What the British did to India' by Shashi Tharoor, London, Hurst Publishers, 2017, 296 pp., £20.00

  10. Large scale seismic test research at Hualien site in Taiwan. Results of site investigation and characterization of the foundation ground

    International Nuclear Information System (INIS)

    Okamoto, Toshiro; Kokusho, Takeharu; Nishi, Koichi

    1998-01-01

    An international joint research program called 'HLSST' is under way. A Large-Scale Seismic Test (LSST) is to be conducted to investigate Soil-Structure Interaction (SSI) during large earthquakes in the field in Hualien, a high-seismicity region of Taiwan. A 1/4-scale model building was constructed on the excavated gravelly ground, and backfill material of crushed stones was placed around the model plant. The model building and the foundation ground were extensively instrumented to monitor structure and ground response. To accurately evaluate SSI during earthquakes, geotechnical investigations and forced vibration tests were performed during the construction process, namely before and after the base excavation, after the structure construction and after the backfilling. Main results are as follows. (1) The distribution of the mechanical properties of the gravelly soil was measured by various techniques including penetration tests and PS-logging; the shear wave velocities (Vs) were found to change clearly, depending on the changing overburden pressures during the construction process. (2) Measurements of Vs in the surrounding soils showed that Vs there is smaller than at almost the same depth in locations farther from the structure. Further discussion is given of the numerical soil model for SSI analysis. (author)

  11. Development of a Ground Test and Analysis Protocol to Support NASA's NextSTEP Phase 2 Habitation Concepts

    Science.gov (United States)

    Beaton, Kara H.; Chappell, Steven P.; Bekdash, Omar S.; Gernhardt, Michael L.

    2018-01-01

    The NASA Next Space Technologies for Exploration Partnerships (NextSTEP) program is a public-private partnership model that seeks commercial development of deep space exploration capabilities to support extensive human spaceflight missions around and beyond cislunar space. NASA first issued the Phase 1 NextSTEP Broad Agency Announcement to U.S. industries in 2014, which called for innovative cislunar habitation concepts that leveraged commercialization plans for low Earth orbit. These habitats will be part of the Deep Space Gateway (DSG), the cislunar space station planned by NASA for construction in the 2020s. In 2016, Phase 2 of the NextSTEP program selected five commercial partners to develop ground prototypes. A team of NASA research engineers and subject matter experts have been tasked with developing the ground test protocol that will serve as the primary means by which these Phase 2 prototype habitats will be evaluated. Since 2008, this core test team has successfully conducted multiple spaceflight analog mission evaluations utilizing a consistent set of operational products, tools, methods, and metrics to enable the iterative development, testing, analysis, and validation of evolving exploration architectures, operations concepts, and vehicle designs. The purpose of implementing a similar evaluation process for the NextSTEP Phase 2 Habitation Concepts is to consistently evaluate the different commercial partner ground prototypes to provide data-driven, actionable recommendations for Phase 3.

  12. Testing and ground calibration of DREAMS-H relative humidity device

    Science.gov (United States)

    Genzer, Maria; Hieta, Maria; Nikkanen, Timo; Schmidt, Walter; Kemppinen, Osku; Harri, Ari-Matti; Haukka, Harri

    2015-04-01

    DREAMS (Dust Characterization, Risk Assessment and Environmental Analyzer on the Martian Surface) instrument suite is to be launched as part of the ESA ExoMars 2016/Schiaparelli lander. DREAMS consists of an environmental package for monitoring temperature, pressure, relative humidity, winds and dust opacity, as well as atmospheric electricity of the Martian atmosphere. The DREAMS instruments and scientific goals are described in [1]. Here we describe testing and ground calibration of the relative humidity device, DREAMS-H, provided to the DREAMS payload by the Finnish Meteorological Institute and based on proprietary technology of Vaisala, Inc. The same kind of device is part of the REMS instrument package onboard the MSL Curiosity Rover [2][3]. DREAMS-H is based on Vaisala Humicap® technology adapted for use in the Martian environment by the Finnish Meteorological Institute. The device is very small and lightweight, with a total mass of less than 20 g, and consumes only 15 mW of power. The Humicap® sensor heads contain an active polymer film that changes its capacitance as a function of relative humidity, with a 0% to 100% RH measurement range. The dynamic range of the device decreases with sensor temperature, being at -70°C approximately 30% of the dynamic range at 0°C [3]. Good-quality relative humidity measurements require knowing the temperature of the environment in which relative humidity is measured. An important part of DREAMS-H calibration was temperature calibration of the Vaisala Thermocap® temperature sensors used for housekeeping temperature measurements of the DREAMS-H device. For this, several temperature points in the desired operational range were measured with 0.1°C accuracy traceable to national standards. The main part of humidity calibration of DREAMS-H flight models was done in subzero temperatures in a humidity generator of the Finnish Center of Metrology and Accreditation (MIKES). Several relative humidity points ranging from almost dry to almost wet

  13. Unmanned Ground Vehicle for Autonomous Non-Destructive Testing of FRP Bridge Decks

    Science.gov (United States)

    Klinkhachorn, P.; Mercer, A. Scott; Halabe, Udaya B.; GangaRao, Hota V. S.

    2007-03-01

    Current non-destructive techniques for defect analysis of FRP bridge decks have a narrow scope. These techniques are very good at detecting certain types of defects but are not robust enough to detect all defects by themselves. For example, infrared thermography (IRT) can detect air-filled defects and Ground Penetrating Radar (GPR) is good at detecting water-filled ones. These technologies can be combined to create a more robust defect detection scheme. To accomplish this, an Unmanned Ground Vehicle (UGV) has been designed that incorporates both IR and GPR analysis to create a comprehensive defect map of a bridge deck. The UGV autonomously surveys the deck surface and acquires data. The UGV has two 1.5 GHz ground-coupled GPR antennas that are mounted on the front of the UGV to collect GPR data. It also incorporates an active heating source and a radiometric IR camera to capture IR images of the deck, even in less than ideal weather scenarios such as cold cloudy days. The UGV is designed so that it can collect data in an assembly-line fashion. It moves in 1-foot increments. When moving, it collects GPR data from the two antennas. When it stops it heats a section of the deck. The next time it stops to heat a section, the IR camera analyzes the preheated deck section while preparing for the next section. Because the data is being continually collected using this method, the UGV can survey the entire deck in an efficient and timely manner.
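
    The comprehensive defect map described above — IRT flagging air-filled defects, GPR flagging water-filled ones — amounts to overlaying the two detection masks on a common grid. A toy fusion sketch, with random stand-in grids rather than survey output:

        # Toy fusion of IRT and GPR defect masks into one defect map.
        # The two binary grids are random stand-ins for real survey output.
        import numpy as np

        rng = np.random.default_rng(5)
        irt_defects = rng.random((20, 40)) > 0.97   # air-filled defects seen by IR thermography
        gpr_defects = rng.random((20, 40)) > 0.97   # water-filled defects seen by GPR

        defect_map = np.where(irt_defects & gpr_defects, 3,   # flagged by both methods
                     np.where(irt_defects, 1,                 # IRT only (air-filled)
                     np.where(gpr_defects, 2, 0)))            # GPR only (water-filled)

        print("cells flagged:", int(np.count_nonzero(defect_map)), "of", defect_map.size)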

  14. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Tracked Vehicle Movement across Desert Pavement

    International Nuclear Information System (INIS)

    Peterson, Mark J; Efroymson, Rebecca Ann; Hargrove, William Walter

    2008-01-01

    A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the tracked vehicle movement component of the testing program. The principal stressor associated with tracked vehicle movement was soil disturbance, and a resulting, secondary stressor was hydrological change. Water loss to washes and wash vegetation was expected to result from increased infiltration and/or evaporation associated with disturbances to desert pavement. The simulated exposure of wash vegetation to water loss was quantified using estimates of exposed land area from a digital ortho quarter quad aerial photo and field observations, a 30 × 30 m digital elevation model, the flow accumulation feature of ESRI ArcInfo, and a two-step process in which runoff was estimated from direct precipitation to a land area and from water that flowed from upgradient to a land area. In all simulated scenarios, absolute water loss decreased with distance from the disturbance, downgradient in the washes; however, percentage water loss was greatest in land areas immediately downgradient of a disturbance. Potential effects on growth and survival of wash trees were quantified by using an empirical relationship derived from a local unpublished study of water infiltration rates. The risk characterization concluded that neither risk to wash vegetation growth or survival nor risk to mule deer abundance and reproduction was expected. The risk characterization was negative for both the incremental risk of the test program and the combination of the test and pretest disturbances

  15. Settlement mechanism of the backfilled ground around nuclear power plant buildings. Part 1. A series of 1G shaking table tests

    International Nuclear Information System (INIS)

    Ishimaru, Makoto; Kawai, Tadashi

    2008-01-01

    Large ground settlement occurred locally in the backfilled ground around the Kashiwazaki-Kariwa Nuclear Power Plant buildings during the Niigataken Chuetsu-oki Earthquake in 2007. The purposes of this study are to verify the assumed mechanism of the settlement and to discuss the factors influencing the settlement. For these purposes, we conducted a series of 1G shaking table tests using a rigid structure and sand. In the tests, the parameters that were varied relate to two factors; one is the horizontal ground displacement relative to the structure, the other is the ground strength against sliding failure. The following results were obtained: (1) All the results showed that the ground settlement near the structure was larger than the ground settlement far from the structure, (2) From video observation of the ground near the structure, it was found that the settlement occurred locally due to sliding failure after the ground separated from the structure, (3) The ground settlement near the structure was larger as the horizontal ground displacement was larger, and the soil strength arising from fines affected the size of the ground settlement near the structure. (author)

  16. Burst and inter-burst duration statistics as empirical test of long-range memory in the financial markets

    Science.gov (United States)

    Gontis, V.; Kononovicius, A.

    2017-10-01

    We address the problem of long-range memory in the financial markets. There are two conceptually different ways to reproduce power-law decay of the auto-correlation function: using fractional Brownian motion, or non-linear stochastic differential equations. In this contribution we address this problem by analyzing empirical return and trading activity time series from the Forex. From the empirical time series we obtain probability density functions of burst and inter-burst duration. Our analysis reveals that the power-law exponents of the obtained probability density functions are close to 3/2, which is a characteristic feature of one-dimensional stochastic processes. This is in good agreement with an earlier proposed model of absolute return based on non-linear stochastic differential equations derived from the agent-based herding model.
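
    Burst and inter-burst durations of the kind analysed above can be extracted from a series by thresholding and measuring the lengths of excursions, after which a tail exponent can be estimated. The sketch below does this with a Hill-type estimator on simulated i.i.d. noise, which shows the mechanics only and will not reproduce the 3/2 exponent of the correlated Forex series.

        # Extract burst / inter-burst durations by thresholding a series, then estimate
        # the power-law exponent of their distribution with a Hill-type MLE.
        # The input series is simulated noise, for illustration only.
        import numpy as np

        rng = np.random.default_rng(4)
        series = np.abs(rng.standard_t(df=3, size=200_000))   # stand-in for absolute returns
        threshold = np.quantile(series, 0.9)

        def run_lengths(mask):
            """Lengths of consecutive runs of True in a boolean array."""
            padded = np.concatenate(([False], mask, [False]))
            changes = np.flatnonzero(np.diff(padded.astype(int)))
            starts, ends = changes[::2], changes[1::2]
            return ends - starts

        burst = run_lengths(series > threshold)           # durations above threshold
        inter_burst = run_lengths(series <= threshold)    # durations below threshold

        def hill_exponent(durations, x_min=2):
            """Continuous-MLE approximation of alpha in P(x) ~ x**(-alpha) for x >= x_min."""
            x = durations[durations >= x_min].astype(float)
            return 1.0 + x.size / np.sum(np.log(x / x_min))

        print("burst-duration exponent:      ", hill_exponent(burst))
        print("inter-burst-duration exponent:", hill_exponent(inter_burst))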

  17. Grounding the Innovation of Future Technologies

    Directory of Open Access Journals (Sweden)

    Antti Oulasvirta

    2005-01-01

    Full Text Available Mobile and ubiquitous technologies can potentially change the role of information and communication technology in human lives. Empirical, human-centered approaches are emerging as an alternative to technology-driven approaches in the innovation of these technologies. Three necessary empirical stages, intertwined with analytical ones and with each informing and grounding the succeeding stages, are analyzed. First, needfinding is utilized to discover societal and individual demands for technology. Second, observational and experimental studies examine the social and cognitive preconditions for interaction. From these two steps, a hypothesis is formulated regarding how technology will change existing practices. Finally, this hypothesis, embodied in the design of a prototype, is tested in a field trial. Four design cases illustrate the value of empirical grounding.

  18. On the relationship between financial measures and contractor pricing strategy: Empirical tests in the defense aerospace industry

    OpenAIRE

    Moses, O. Douglas

    1987-01-01

    This report includes two separate but related empirical studies of the relationship between financial measures for defense aerospace contractors and pricing strategies adopted by contractors. Two pricing strategies are identified: skimming and penetration. Collectively the findings indicate that the adoption of a particular pricing strategy is associated with the financial condition of the contractor as reflected in measures of risk, asset utilization and organizational slack. Keywords: Finan...

  19. A test of radon ground measurements as a geothermal prospecting tool in New Zealand

    International Nuclear Information System (INIS)

    Whitehead, N.E.

    1981-01-01

    Surveys using cellulose nitrate films to detect ground radon by the alpha track method were carried out at Wairakei, Mokai, and Broadlands geothermal fields. A correlation with fault structure was found, but no correlation with resistivity measurements. Grid spacing larger than 60 m is unlikely to detect the faults. The method confirms the presence of faults for exploratory drilling rather than being a stand-alone method. Correlations of track-etch results with bore enthalpies and radon contents of the bores were probably due to local steam leakage or thermal gradients arising since the exploitation of the geothermal field. (auth)

  20. Ground Motion Characteristics of the 2015 Gorkha Earthquake, Survey of Damage to Stone Masonry Structures and Structural Field Tests

    Directory of Open Access Journals (Sweden)

    Rishi Ram Parajuli

    2015-11-01

    Full Text Available On April 25, 2015, a M7.8 earthquake rattled central Nepal; ground motion recorded in Kantipath, Kathmandu, 76.86 km east of the epicenter, suggested that the low frequency component was dominant. We consider data from eight aftershocks following the Gorkha earthquake and analyze ground motion characteristics; we found that most of the ground motion records are dominated by low frequencies for events with a moment magnitude greater than 6. The Gorkha earthquake devastated hundreds of thousands of structures. In the countryside, and especially in rural mountainous areas, most of the buildings that collapsed were stone masonry constructions. Detailed damage assessments of stone masonry buildings were carried out in Harmi, Gorkha, at an epicentral distance of about 17 km. Structures were categorized as large, medium and small depending on their plinth area size and number of stories. Most of the structures in the area were damaged; interestingly, all ridge-line structures were heavily damaged. Moreover, Schmidt hammer tests were undertaken to determine the compressive strength of stone masonry and brick masonry with mud mortar for normal buildings and historical monuments. The compressive strengths of stone and brick masonry were found to be 12.38 and 18.75 MPa, respectively. Historical structures constructed with special bricks had a compressive strength of 29.50 MPa. Pullout tests were also conducted to determine the stone masonry-mud mortar bond strength. The cohesive strength of mud mortar and the coefficient of friction were determined.

  1. Economic evaluation of test-and-treat and empirical treatment strategies in the eradication of Helicobacter pylori infection; A Markov model in an Iranian adult population.

    Science.gov (United States)

    Mazdaki, Alireza; Ghiasvand, Hesam; Sarabi Asiabar, Ali; Naghdi, Seyran; Aryankhesal, Aidin

    2016-01-01

    Helicobacter pylori may cause many gastrointestinal problems in developing countries such as Iran. We aimed to analyze the cost-effectiveness and cost-utility of the test-and-treat and empirical treatment strategies in managing Helicobacter pylori infection. This was a Markov-based economic evaluation. Effectiveness was defined as the number of symptom-free persons and the QALYs gained in a hypothetical cohort of 100,000 adults. The sensitivity analysis was based on a Monte Carlo approach. In the test-and-treat strategy, if serology is the first diagnostic test vs. histology, the cost per symptom-free person would be 291,736.1 Rials and the cost per QALY 339,226.1 Rials. The cost per symptom-free person and cost per QALY when 13C-UBT was used as the first diagnostic test vs. serology were 1,283,200 and 1,492,103 Rials, respectively. In the empirical strategy, if histology is used as the first diagnostic test vs. 13C-UBT, the cost per symptom-free person and cost per QALY would be 793,234 and 955,698 Rials, respectively. If serology were used as the first diagnostic test vs. histology, the cost per symptom-free person and cost per QALY would be 793,234 and 368,941 Rials, respectively. No significant dominance was found among the alternatives and the diagnostic tests.
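
    The Markov-based cost-utility logic above can be illustrated with a minimal sketch. The following Python fragment is purely illustrative: the two-state cohort, the transition probability, the per-cycle costs and utilities, and the monthly cycle length are assumptions invented for the example, not parameters of the study.

        # Minimal two-state Markov cohort sketch (infected -> symptom-free) with
        # hypothetical inputs; none of these numbers come from the study above.
        def markov_cost_utility(p_eradicate, cost_per_cycle, utility_sick, utility_well,
                                cohort=100_000, cycles=12):
            """Return total cost and total QALYs over monthly cycles."""
            infected, well = float(cohort), 0.0
            total_cost, total_qalys = 0.0, 0.0
            for _ in range(cycles):
                newly_well = infected * p_eradicate
                infected -= newly_well
                well += newly_well
                total_cost += infected * cost_per_cycle              # only the infected accrue cost
                total_qalys += (infected * utility_sick + well * utility_well) / 12.0
            return total_cost, total_qalys

        # Hypothetical comparison of two strategies; the ICER is the extra cost per QALY gained.
        c1, q1 = markov_cost_utility(0.70, 200_000, 0.85, 0.95)   # e.g. test-and-treat
        c2, q2 = markov_cost_utility(0.60, 120_000, 0.85, 0.95)   # e.g. empirical treatment
        print(f"ICER: {(c1 - c2) / (q1 - q2):,.0f} Rials per QALY gained")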

  2. USB environment measurements based on full-scale static engine ground tests. [Upper Surface Blowing for YC-14

    Science.gov (United States)

    Sussman, M. B.; Harkonen, D. L.; Reed, J. B.

    1976-01-01

    Flow turning parameters, static pressures, surface temperatures, surface fluctuating pressures and acceleration levels were measured in the environment of a full-scale upper surface blowing (USB) propulsive-lift test configuration. The test components included a flightworthy CF6-50D engine, nacelle and USB flap assembly utilized in conjunction with ground verification testing of the USAF YC-14 Advanced Medium STOL Transport propulsion system. Results, based on a preliminary analysis of the data, generally show reasonable agreement with predicted levels based on model data. However, additional detailed analysis is required to confirm the preliminary evaluation, to help delineate certain discrepancies with model data and to establish a basis for future flight test comparisons.

  3. Test of ground-based Lidar instrument WLS200S-10

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula

    This report presents the result of the test performed for the given Windcube at DTU’s test site for large wind turbine at Høvsøre, Denmark. The test aims at establishing a relation between the reference wind measurements and corresponding lidar wind indications, and evaluating a set of quality...

  4. Test of ground-based Lidar instrument WLS200S-11

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula

    This report presents the result of the test performed for the given Windcube at DTU’s test site for large wind turbine at Høvsøre, Denmark. The test aims at establishing a relation between the reference wind measurements and corresponding lidar wind indications, and evaluating a set of quality...

  5. Test of ground-based Lidar instrument WLS7-106

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula

    This report presents the result of the test performed for the given Windcube at DTU’s test site for large wind turbine at Høvsøre, Denmark. The test aims at establishing a relation between the reference wind measurements and corresponding lidar wind indications, and evaluating a set of quality...

  6. Test of ground-based lidar instrument WLS7-159

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Wagner, Rozenn

    This report presents the result of the test performed for the given Windcube at DTU’s test site for large wind turbine at Høvsøre, Denmark. The test aims at establishing a relation between the reference wind measurements and corresponding lidar wind indications, and evaluating a set of quality...

  7. Comparison of the Effects of using Tygon Tubing in Rocket Propulsion Ground Test Pressure Transducer Measurements

    Science.gov (United States)

    Farr, Rebecca A.; Wiley, John T.; Vitarius, Patrick

    2005-01-01

    This paper documents acoustic environment data collected during liquid oxygen-ethanol hot-fire rocket testing at NASA Marshall Space Flight Center in November-December 2003. The test program was conducted during development testing of the RS-88 development engine thrust chamber assembly in support of the Orbital Space Plane Crew Escape System Propulsion Program Pad Abort Demonstrator. In addition to induced environments analysis support, coincident data collected using other sensors and methods have allowed benchmarking of specific acoustic test measurement methodologies during propulsion tests. Qualitative effects on data characteristics caused by using Tygon sense lines of various lengths in pressure transducer measurements are discussed here.

  8. Test plan: Brayton Isotope Power System Ground Demonstration System (BIPS-GDS)

    International Nuclear Information System (INIS)

    1976-01-01

    The purpose of this test plan is to provide an overall outline of all testing to be accomplished on the GDS. Included in this test plan are administrative requirements, instrumentation accuracies, instrumentation, equipment definitions, system test setup, and facility installation. The test program will enable collection of sufficient data to establish material, component, and system design integrity. The data will also be used to establish and evaluate component and system performance and reliability characteristics, verification of proper system component integration prior to initiation of Phase II, and flight system (FS) development

  9. Rotating Arc Jet Test Model: Time-Accurate Trajectory Heat Flux Replication in a Ground Test Environment

    Science.gov (United States)

    Laub, Bernard; Grinstead, Jay; Dyakonov, Artem; Venkatapathy, Ethiraj

    2011-01-01

    Though arc jet testing has been the proven method employed for development testing and certification of TPS and TPS instrumentation, the operational aspects of arc jets limit testing to selected, but constant, conditions. Flight, on the other hand, produces time-varying entry conditions in which the heat flux increases, peaks, and recedes as a vehicle descends through an atmosphere. As a result, we are unable to "test as we fly." Attempts to replicate the time-dependent aerothermal environment of atmospheric entry by varying the arc jet facility operating conditions during a test have proven to be difficult, expensive, and only partially successful. A promising alternative is to rotate the test model exposed to a constant-condition arc jet flow to yield a time-varying test condition at a point on a test article (Fig. 1). The model shape and rotation rate can be engineered so that the heat flux at a point on the model replicates the predicted profile for a particular point on a flight vehicle. This simple concept will enable, for example, calibration of the TPS sensors on the Mars Science Laboratory (MSL) aeroshell for anticipated flight environments.

  10. Isotope Brayton ground demonstration testing and flight qualification. Volume 1. Technical program

    Energy Technology Data Exchange (ETDEWEB)

    1974-12-09

    A program is proposed for the ground demonstration, development, and flight qualification of a radioisotope nuclear heated dynamic power system for use on space missions beginning in the 1980's. This type of electrical power system is based upon and combines two aerospace technologies currently under intense development; namely, the MHW isotope heat source and the closed Brayton cycle gas turbine. This power system represents the next generation of reliable, efficient economic electrical power equipment for space, and will be capable of providing 0.5 to 2.0 kW of electric power to a wide variety of spacecraft for earth orbital and interplanetary missions. The immediate design will be based upon the requirements for the Air Force SURVSATCOM mission. The proposal is presented in three volumes plus an Executive Summary. This volume describes the tasks in the technical program.

  11. Nonlinear Schrodinger equation: A testing ground for the quantization of nonlinear waves

    International Nuclear Information System (INIS)

    Klein, A.; Krejs, F.

    1976-01-01

    Quantization of the nonlinear Schrodinger equation is carried out by the method due to Kerman and Klein. A viable procedure is inferred from the quantum interpretation of the classical (soliton) solution. The ground-state energy for a system with n particles is calculated to an accuracy which includes the first quantum correction to the semiclassical result. It is demonstrated that the exact answer can be obtained systematically only at the next level of approximation. For the calculation of the first quantum correction, the quantum theory of the stability of periodic orbits in field theory is developed and discussed. Since one is dealing with a finite many-body problem, the field theory can be written so that no infinite terms are encountered, but the Hamiltonian can also be artificially rearranged so as to destroy this feature. For learning purposes the calculations are carried out with the various alternatives, and our methods prove capable of providing a uniform final result
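
    For reference, a standard dimensionless form of the classical (focusing) nonlinear Schrodinger equation and its one-soliton solution is sketched below; normalization conventions vary, and this form is not necessarily the one used in the paper.

        i\,\partial_t\psi + \partial_x^2\psi + 2\kappa\,|\psi|^2\psi = 0,
        \qquad
        \psi_{\mathrm{sol}}(x,t) = \frac{\eta}{\sqrt{\kappa}}\,
            \operatorname{sech}\!\big[\eta\,(x - v t)\big]\,
            \exp\!\Big[\,i\Big(\tfrac{v}{2}\,x + \big(\eta^{2} - \tfrac{v^{2}}{4}\big)\,t\Big)\Big],

    where η sets the soliton amplitude and inverse width and v its velocity; in the semiclassical treatment discussed above, such a classical soliton provides the leading approximation to the n-particle ground state.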

  12. Kinks and antikinks of buckled graphene: A testing ground for the φ4 field model

    Science.gov (United States)

    Yamaletdinov, R. D.; Slipko, V. A.; Pershin, Y. V.

    2017-09-01

    Kinks and antikinks of the classical φ4 field model are topological solutions connecting its two distinct ground states. Here we establish an analogy between the excitations of a long graphene nanoribbon buckled in the transverse direction and φ4 model results. Using molecular dynamics simulations, we investigated the dynamics of a buckled graphene nanoribbon with a single kink and with a kink-antikink pair. Several features of the φ4 model have been observed including the kink-antikink capture at low energies, kink-antikink reflection at high energies, and a bounce resonance. Our results pave the way towards the experimental observation of a rich variety of φ4 model predictions based on graphene.
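
    For orientation, the standard form of the φ4 field model onto which the buckled nanoribbon is mapped is sketched below; the effective values of λ and v realized by the graphene system are not given here and would have to be extracted from the simulations.

        \mathcal{L} = \tfrac{1}{2}\,(\partial_\mu\phi)^2 - \tfrac{\lambda}{4}\,\big(\phi^2 - v^2\big)^2,
        \qquad
        \phi_{K,\bar{K}}(x) = \pm\, v\,\tanh\!\Big[\sqrt{\tfrac{\lambda}{2}}\;v\,(x - x_0)\Big],
        \qquad
        M_K = \tfrac{2\sqrt{2}}{3}\,\sqrt{\lambda}\,v^{3},

    where ±v are the two degenerate ground states (the two buckled configurations of the ribbon), the kink and antikink interpolate between them, and M_K is the static kink energy.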

  13. Human subjects concerns in ground based ECLSS testing - Managing uncertainty in closely recycled systems

    Science.gov (United States)

    Crump, William J.; Janik, Daniel S.; Thomas, L. Dale

    1990-01-01

    U.S. space missions have to this point used water either made on board or carried from earth and discarded after use. For Space Station Freedom, long duration life support will include air and water recycling using a series of physical-chemical subsystems. The Environmental Control and Life Support System (ECLSS) designed for this application must be tested extensively at all stages of hardware maturity. Human test subjects are required to conduct some of these tests, and the risks associated with the use of development hardware must be addressed. Federal guidelines for protection of human subjects require careful consideration of risks and potential benefits by an Institutional Review Board (IRB) before and during testing. This paper reviews the ethical principles guiding this consideration, details the problems and uncertainties inherent in current hardware testing, and presents an incremental approach to risk assessment for ECLSS testing.

  14. Contamination and UV ageing of diffuser targets used in satellite inflight and ground reference test site calibrations

    Science.gov (United States)

    Vaskuri, Anna; Greenwell, Claire; Hessey, Isabel; Tompkins, Jordan; Woolliams, Emma

    2018-02-01

    Diffuser reflectance targets are key components in in-orbit calibrations and for verifying ground reference test sites. In this work, Spectralon, Diffusil, and Heraeus diffusers were exposed to exhaust gases and ultraviolet (UV) radiation under ambient air conditions, and their degradation was monitored by measuring changes in spectral reflectance. Spectralon is a state-of-the-art diffuser made of polytetrafluoroethylene, and Diffusil and Heraeus diffusers are made of fused silica with gas bubbles inside. Based on the contamination tests, Spectralon degrades faster than fused silica diffusers. For the samples exposed to contamination for 20 minutes, the 250 nm - 400 nm total diffuse spectral reflectance of Spectralon degraded 3-5 times more when exposed to petrol-like emission and 16-23 times more when exposed to diesel-like emission, compared with Diffusil. When the reflectance changes of Spectralon were compared with those of Heraeus, Spectralon degraded 3-4 times more when exposed to petrol-like emission for 20 minutes and 5-7 times more when exposed to diesel-like emission for 7.5 minutes. When the contaminated samples were exposed to UV radiation in ambient air, their reflectance gradually recovered to its original level. In conclusion, fused silica diffusers are more resistant to hydrocarbon contaminants present in ground reference test sites, and thus more stable under UV radiation in the air.

  15. Aerodynamic Drag Analysis of 3-DOF Flex-Gimbal GyroWheel System in the Sense of Ground Test

    Science.gov (United States)

    Huo, Xin; Feng, Sizhao; Liu, Kangzhi; Wang, Libin; Chen, Weishan

    2016-01-01

    GyroWheel is an innovative device that combines the actuating capabilities of a control moment gyro with the rate sensing capabilities of a tuned rotor gyro by using a spinning flex-gimbal system. However, in the process of the ground test, the existence of aerodynamic disturbance is inevitable, which hinders the improvement of the specification performance and control accuracy. A vacuum tank test is a possible candidate but is sometimes unrealistic due to the substantial increase in costs and complexity involved. In this paper, the aerodynamic drag problem with respect to the 3-DOF flex-gimbal GyroWheel system is investigated by simulation analysis and experimental verification. Concretely, the angular momentum envelope property of the spinning rotor system is studied and its integral dynamical model is deduced based on the physical configuration of the GyroWheel system with an appropriately defined coordinate system. In the sequel, the fluid numerical model is established and the model geometries are checked with FLUENT software. According to the diversity and time-varying properties of the rotor motions in three-dimensions, the airflow field around the GyroWheel rotor is analyzed by simulation with respect to its varying angular velocity and tilt angle. The IPC-based experimental platform is introduced, and the properties of aerodynamic drag in the ground test condition are obtained through comparing the simulation with experimental results.

  16. Testing seasonal and long-term controls of streamwater DOC using empirical and process-based models.

    Science.gov (United States)

    Futter, Martyn N; de Wit, Heleen A

    2008-12-15

    Concentrations of dissolved organic carbon (DOC) in surface waters are increasing across Europe and parts of North America. Several mechanisms have been proposed to explain these increases including reductions in acid deposition, change in frequency of winter storms and changes in temperature and precipitation patterns. We used two modelling approaches to identify the mechanisms responsible for changing surface water DOC concentrations. Empirical regression analysis and INCA-C, a process-based model of stream-water DOC, were used to simulate long-term (1986-2003) patterns in stream water DOC concentrations in a small boreal stream. Both modelling approaches successfully simulated seasonal and inter-annual patterns in DOC concentration. In both models, seasonal patterns of DOC concentration were controlled by hydrology and inter-annual patterns were explained by climatic variation. There was a non-linear relationship between warmer summer temperatures and INCA-C predicted DOC. Only the empirical model was able to satisfactorily simulate the observed long-term increase in DOC. The observed long-term trends in DOC are likely to be driven by in-soil processes controlled by SO4(2-) and Cl(-) deposition, and to a lesser extent by temperature-controlled processes. Given the projected changes in climate and deposition, future modelling and experimental research should focus on the possible effects of soil temperature and moisture on organic carbon production, sorption and desorption rates, and chemical controls on organic matter solubility.
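
    As a purely illustrative sketch of the empirical approach, the fragment below fits a simple least-squares regression of DOC on discharge and temperature; the choice of predictors, the log transform, and the synthetic data are assumptions for the example and do not reproduce the paper's actual regression model.

        # Hypothetical empirical DOC regression: DOC ~ a + b*log(discharge) + c*temperature.
        import numpy as np

        def fit_doc_regression(doc, discharge, temperature):
            """Ordinary least-squares fit; returns coefficients and R^2."""
            X = np.column_stack([np.ones_like(doc), np.log(discharge), temperature])
            coeffs, *_ = np.linalg.lstsq(X, doc, rcond=None)
            resid = doc - X @ coeffs
            r2 = 1.0 - np.sum(resid**2) / np.sum((doc - doc.mean())**2)
            return coeffs, r2

        # Synthetic example data (mg/L, m3/s, deg C), only to show the call signature.
        rng = np.random.default_rng(0)
        q = rng.uniform(0.05, 2.0, 200)
        t = rng.uniform(-5.0, 20.0, 200)
        doc = 8.0 + 1.5 * np.log(q) + 0.2 * t + rng.normal(0.0, 0.5, 200)
        coeffs, r2 = fit_doc_regression(doc, q, t)
        print("intercept, log-Q slope, T slope:", np.round(coeffs, 2), " R2:", round(r2, 3))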

  17. Hypersonic ground test capabilities for T and E testing above Mach 8: "a case where S and T meets T and E"

    International Nuclear Information System (INIS)

    Constantino, M; Miles, R; Brown, G; Laster, M; Nelson, G

    1999-01-01

    Simulation of hypersonic flight in ground test and evaluation (T and E) facilities is a challenging and formidable task, especially when the flight environment above approximately Mach 8 must be fully duplicated, as it must for almost all hypersonic flight systems that have been developed, conceived, or envisioned. Basically, and for many years, the enabling technology to build such a ground test wind tunnel facility has been severely limited in the area of high-temperature, high-strength materials and thermal protection approaches. To circumvent the problems, various approaches have been used, including partial simulation, use of similarity laws, and reduced test time. These approaches often are not satisfactory, e.g., for operability and durability testing in air-breathing propulsion development and for thermal protection development of many flight systems. Thus, there is a strong need for science and technology (S and T) community involvement in technology development to address these problems. This paper discusses a specific case where this need exists and where significant S and T involvement has made and continues to make significant contributions. The case discussed is an Air Force research program currently underway to develop enabling technologies for a Mach 8-15 hypersonic true-temperature wind tunnel with relatively long run time. The research is based on a concept proposed by Princeton University using radiant or beamed energy into the supersonic nozzle flow

  18. Aircraft and ground vehicle friction correlation test results obtained under winter runway conditions during joint FAA/NASA Runway Friction Program

    Science.gov (United States)

    Yager, Thomas J.; Vogler, William A.; Baldasare, Paul

    1988-01-01

    Aircraft and ground vehicle friction data collected during the Joint FAA/NASA Runway Friction Program under winter runway conditions are discussed and test results are summarized. The relationship between the different ground vehicle friction measurements obtained on compacted snow- and ice-covered conditions is defined together with the correlation to aircraft tire friction performance under similar runway conditions.

  19. Preliminary hazard analysis for the Brayton Isotope Ground Demonstration System (including vacuum test chamber)

    International Nuclear Information System (INIS)

    Miller, L.G.

    1975-01-01

    The Preliminary Hazard Analysis (PHA) of the BIPS-GDS is a tabular summary of hazards and undesired events which may lead to system damage or failure and/or hazard to personnel. The PHA reviews the GDS as it is envisioned to operate in the Vacuum Test Chamber (VTC) of the GDS Test Facility. The VTC and other equipment which will comprise the test facility are presently in an early stage of preliminary design and will undoubtedly undergo numerous changes before the design is frozen. The PHA and the FMECA to follow are intended to aid the design effort by identifying areas of concern which are critical to the safety and reliability of the BIPS-GDS and test facility

  20. Develop and Implement Operational Ground Testing Protocols to Individualize Astronaut Sleep Medication Efficacy and Individual Effects

    Data.gov (United States)

    National Aeronautics and Space Administration — The study protocol was successfully pilot tested with N=7 subjects (6 NASA flight surgeons and 1 Behavioral Health and Performance element Operations professional)...

  1. Operational Ground Testing Protocol to Optimize Astronaut Sleep Medication Efficacy and Individual Effects

    Data.gov (United States)

    National Aeronautics and Space Administration — As of July 3, 2014, data collection has been completed for 29 participants, with each participant completing testing across three nights at the Astronaut Quarantine...

  2. Comparison between ground tests and flight data for two static 32 KB memories

    International Nuclear Information System (INIS)

    Cheynet, Ph.; Velazco, R.; Cheynet, Ph.; Ecoffet, R.; Duzellier, S.; David, J.P.; Loquet, J.G.

    1999-01-01

    The study concerns two 32 K-byte static memories, one from Hitachi (HM62256) and the other (HM65756) from Matra-MHS. The results correspond to around one year of measurement in a high-radiation orbit, during which a total of 268 upsets were detected. As a preliminary conclusion, it can be stated that the MHS SRAM is probably at least 4 times more sensitive to SEU (single event upset) than the Hitachi SRAM. The Hitachi memory has exhibited what we call "stuck-at" bit errors. This kind of event is identified when the same address and data are found in error (fixed read data) for several consecutive read cycles. A comparison of SEU rates derived from predictions with those measured in flight has shown that: - the error rate is underestimated for HM62256 using standard prediction models, - the error rate can be under- or over-estimated for HM65756, but the dispersion in heavy-ion ground results does not allow us to conclude. (A.C.)
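
    The "stuck-at" identification rule described above can be sketched in a few lines; the record format, the three-cycle threshold, and the example log entries below are assumptions for illustration, not the actual flight telemetry format.

        # Hypothetical classification of memory error observations into SEUs and
        # stuck-at bits: a persistent (address, data) error over several consecutive
        # read cycles is treated as a stuck-at candidate.
        def classify_errors(error_log, min_consecutive=3):
            """error_log: list of (cycle, address, data) tuples for erroneous reads."""
            runs = {}                     # (address, data) -> (last_cycle, run_length)
            stuck, seu = set(), []
            for cycle, addr, data in sorted(error_log):
                key = (addr, data)
                last, length = runs.get(key, (None, 0))
                length = length + 1 if last == cycle - 1 else 1
                runs[key] = (cycle, length)
                if length >= min_consecutive:
                    stuck.add(key)        # persistent error -> stuck-at candidate
                else:
                    seu.append((cycle, addr, data))
            # drop SEU records that later turned out to belong to a stuck-at run
            seu = [e for e in seu if (e[1], e[2]) not in stuck]
            return stuck, seu

        stuck, seu = classify_errors([(10, 0x1F3A, 0x40), (11, 0x1F3A, 0x40),
                                      (12, 0x1F3A, 0x40), (57, 0x0B11, 0x02)])
        print(len(stuck), "stuck-at candidates,", len(seu), "single-event upsets")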

  3. Modelling ground rupture due to groundwater withdrawal: applications to test cases in China and Mexico

    Science.gov (United States)

    Franceschini, A.; Teatini, P.; Janna, C.; Ferronato, M.; Gambolati, G.; Ye, S.; Carreón-Freyre, D.

    2015-11-01

    The stress variation induced by aquifer overdraft in sedimentary basins with shallow bedrock may cause rupture in the form of pre-existing fault activation or earth fissure generation. The process is causing major detrimental effects in many areas of China and Mexico. Ruptures yield discontinuities in both the displacement and stress fields that classic continuous finite element (FE) models cannot address. Interface finite elements (IE), typically used in contact mechanics, may be of great help and are implemented herein to simulate the fault geomechanical behaviour. Two main approaches, i.e. Penalty and Lagrangian, are developed to enforce the contact condition on the element interface. The incorporation of IEs into a three-dimensional (3-D) FE geomechanical simulator shows that the Lagrangian approach is numerically more robust and stable than the Penalty approach, thus providing more reliable solutions. Furthermore, the use of a Newton-Raphson scheme to deal with the non-linear elasto-plastic fault behaviour allows for quadratic convergence. The FE-IE model is applied to investigate the likely ground rupture in realistic 3-D geologic settings. The case studies are representative of the City of Wuxi in the Jiangsu Province (China), and of the City of Queretaro, Mexico, where significant land subsidence has been accompanied by the generation of several earth fissures jeopardizing the stability and integrity of the overlying structures and infrastructure.
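
    In generic terms (sign conventions vary between codes), the two enforcement strategies compared above can be summarized as follows; this is a textbook sketch, not the specific formulation implemented in the simulator.

        p_N^{\mathrm{pen}} = \kappa\,\langle -\,g_N \rangle
        \qquad\text{versus}\qquad
        g_N \ge 0,\quad p_N \ge 0,\quad p_N\, g_N = 0 \;\;(\text{Lagrange multiplier } p_N),

    where g_N is the normal gap across the fault interface, p_N the contact pressure, κ the penalty stiffness, and ⟨·⟩ the Macaulay bracket (positive part): the penalty method approximates the contact pressure as proportional to the interpenetration, whereas the Lagrangian approach enforces the non-penetration conditions exactly at the cost of extra unknowns.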

  4. Handling and disposal of SP-100 ground test nuclear fuel and equipment

    International Nuclear Information System (INIS)

    Wilson, C.E.; Potter, J.D.; Hodgson, R.D.

    1990-05-01

    The post SP-100 reactor testing period will focus on defueling the reactor, packaging the various radioactive waste forms, and shipping this material to the appropriate locations. Remote-handling techniques will be developed to defuel the reactor. Packaging the spent fuel and activated reactor components is a challenge in itself. This paper presents an overview of the strategy, methods, and equipment that will be used during the closeout phase of nuclear testing

  5. Handling and disposal of SP-100 ground test nuclear fuel and equipment

    International Nuclear Information System (INIS)

    Wilson, C.E.; Potter, J.D.; Hodgson, R.D.

    1991-01-01

    The post SP-100 reactor testing period will focus on defueling the reactor, packaging the various radioactive waste forms, and shipping this material to the appropriate locations. Remote-handling techniques will be developed to defuel the reactor. Packaging the spent fuel and activated reactor components is a challenge in itself. This paper presents an overview of the strategy, methods, and equipment that will be used during the closeout phase of nuclear testing

  6. Ground Testing and Flight Demonstration of Charge Management of Insulated Test Masses Using UV LED Electron Photoemission

    OpenAIRE

    Saraf, Shailendhar; Buchman, Sasha; Balakrishnan, Karthik; Lui, Chin Yang; Soulage, Michael; Faied, Dohy; Hanson, John; Ling, Kuok; Jaroux, Belgacem; AlRashed, Abdullah; Nassban, Badr Al; Suwaidan, Badr Al; Harbi, Mohammed Al; Salamah, Badr Bin; Othman, Mohammed Bin

    2016-01-01

    The UV LED mission demonstrates the precise control of the potential of electrically isolated test masses that is essential for the operation of space accelerometers and drag free sensors. Accelerometers and drag free sensors were and remain at the core of geodesy, aeronomy, and precision navigation missions as well as gravitational science experiments and gravitational wave observatories. Charge management using photoelectrons generated by the 254 nm UV line of Hg was first demonstrated on G...

  7. Preliminary Results From a Heavily Instrumented Engine Ice Crystal Icing Test in a Ground Based Altitude Test Facility

    Science.gov (United States)

    Flegel, Ashlie B.; Oliver, Michael J.

    2016-01-01

    Preliminary results from the heavily instrumented ALF502R-5 engine test conducted in the NASA Glenn Research Center Propulsion Systems Laboratory are discussed. The effects of ice crystal icing on a full scale engine is examined and documented. This same model engine, serial number LF01, was used during the inaugural icing test in the Propulsion Systems Laboratory facility. The uncommanded reduction of thrust (rollback) events experienced by this engine in flight were simulated in the facility. Limited instrumentation was used to detect icing on the LF01 engine. Metal temperatures on the exit guide vanes and outer shroud and the load measurement were the only indicators of ice formation. The current study features a similar engine, serial number LF11, which is instrumented to characterize the cloud entering the engine, detect/characterize ice accretion, and visualize the ice accretion in the region of interest. Data were acquired at key LF01 test points and additional points that explored: icing threshold regions, low altitude, high altitude, spinner heat effects, and the influence of varying the facility and engine parameters. For each condition of interest, data were obtained from some selected variations of ice particle median volumetric diameter, total water content, fan speed, and ambient temperature. For several cases the NASA in-house engine icing risk assessment code was used to find conditions that would lead to a rollback event. This study further helped NASA develop necessary icing diagnostic instrumentation, expand the capabilities of the Propulsion Systems Laboratory, and generate a dataset that will be used to develop and validate in-house icing prediction and risk mitigation computational tools. The ice accretion on the outer shroud region was acquired by internal video cameras. The heavily instrumented engine showed good repeatability of icing responses when compared to the key LF01 test points and during day-to-day operation. Other noticeable

  8. Ground penetrating radar results at the Box Canyon Site - 1996 survey as part of infiltration test

    International Nuclear Information System (INIS)

    Peterson, J.E. Jr.; Williams, K.H.

    1997-08-01

    This data report presents a discussion of the borehole radar tomography experiment conducted at Box Canyon, Idaho. The discussion concentrates on the survey methodology, data acquisition procedures, and the resulting tomographic images and interpretations. The entire geophysics field effort for FY96 centered on the collection of borehole radar data within the inclined boreholes R1, R2, R3, and R4 before, during, and after the ponded infiltration experiment. The well pairs R1-R2, R2-R4, and R3-R4 comprised the bulk of the field survey; however, additional data were collected between vertical boreholes within and around the infiltration basin. The intent of the inclined boreholes was to allow access beneath the infiltration basin and to enhance the ability of the radar method to image both vertical and horizontal features where flow may dominate. This data report concentrates on the inclined borehole data and the resulting tomograms. The borehole radar method is one in which modified ground penetrating radar antennas are lowered into boreholes and high frequency electromagnetic signals are transmitted through subsurface material to a receiving antenna. The transmitted signals may be represented as multiple raypaths crossing through the zone of interest. If sufficient raypaths are recorded, a tomographic image may be obtained through computer processing. The data normally recorded are signal amplitude versus time. The information extracted from such data includes: (a) the transit time, which depends on the wave velocity; (b) the amplitude, which depends on the wave attenuation; and (c) the dispersion, which indicates a change in velocity and attenuation with frequency.
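
    In the usual straight-ray treatment of such data, the recorded transit times and (spreading-corrected) amplitudes are related to cellwise slowness and attenuation through line integrals along each transmitter-receiver raypath; the discretized form below is a generic sketch of that inversion, not the specific processing chain used for these surveys.

        t_i = \sum_j L_{ij}\, s_j,
        \qquad
        -\ln\!\left(\frac{A_i\,r_i}{A_0}\right) = \sum_j L_{ij}\, \alpha_j,

    where L_ij is the length of ray i in cell j, s_j = 1/v_j the cell slowness, α_j the cell attenuation coefficient, A_0 a source amplitude, and r_i a geometrical-spreading correction; solving these linear systems for s_j and α_j yields the velocity and attenuation tomograms.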

  9. Monitoring the ground water level change during the pump test by using the Electric resistivity tomography

    Science.gov (United States)

    Hsu, H.; Chang, P. Y.; Yao, H. J.

    2017-12-01

    For a hydrodynamics study of the unconfined aquifer in a gravel formation, a pumping test was carried out to estimate the hydraulic conductivity in the midstream reach of the Zhoushui River in Taiwan. The hydraulic parameters and the cone of depression could be estimated by monitoring the groundwater drawdown in an observation well located a short distance from the pumping well. In this study we carried out electrical resistivity image monitoring during the whole pumping test. The electrical resistivity data were measured with surface and downhole electrodes, which produce a clear subsurface image of the groundwater level over a much larger distance than that between the pumping and observation wells. The 2-D electrical image can also show how the cone of depression actually develops in the subsurface. The continuous records capture the change in groundwater level during the whole pumping test, which yields hydraulic parameters at a larger scale.

  10. Configuration color vision tests: the interaction between aging and the complexity of figure-ground segregation.

    Science.gov (United States)

    Stanford, T; Pollack, R H

    1984-09-01

    A cross-sectional study comparing response time and the percentage of items correctly identified in three color vision tests (Pflügertrident, HRR-AO pseudoisochromatic plates, and AO pseudoisochromatic plates) was carried out on 72 women (12 in each decade) ranging from ages 20 to 79 years. Overall, time scores increased across the age groups. Analysis of the correctness scores indicated that the AO pseudoisochromatic plates requiring the identification of numbers was more difficult than the other tests which consisted of geometric forms or the letter E. This differential difficulty increased as a function of age. There was no indication of color defect per se which led to the conclusion that figure complexity may be the key variable determining performance. The results were similar to those obtained by Lee and Pollack (1978) in their study of the Embedded Figures Test.

  11. Utility of Intelligence Tests for Treatment Planning, Classification, and Placement Decisions: Recent Empirical Findings and Future Directions.

    Science.gov (United States)

    Gresham, Frank M.; Witt, Joseph C.

    1997-01-01

    Maintains that intelligence tests contribute little to the planning, implementation, and evaluation of instructional interventions for children. Suggests that intelligence tests are not useful in making differential diagnostic and classification determinations for children with mild learning problems and that such testing is not a cost-beneficial…

  12. Finite element analysis of an underground protective test station subjected to severe ground motion

    International Nuclear Information System (INIS)

    Burchett, S.N.; Milloy, J.A.; Von Riesemann, W.A.

    1974-01-01

    A recoverable test station, to be used at a location very close to the detonation point of an underground nuclear test, was designed and tested. The design required that the station survive the very severe free-field stress and that the acceleration of the station be limited to a tolerable level. The predicted magnitude and time history of the free-field stress and acceleration indicated that the volcanic tuff medium and the materials of the proposed structure would exhibit nonlinear behavior, so that a transient dynamic analysis of a composite type structure involving nonlinear materials and large deformations was necessary. Parameter studies using a linear elastic dynamic finite element technique were first done to gain an understanding of the effect of the material properties and to study the response of various configurations of the protective structure to the transient loading. Based upon the results of these parameter studies and upon fielding considerations, a tentative design was chosen. This design was then evaluated using a newly developed finite element computer program. The results obtained from this analysis are compared to the field test results. (U.S.)

  13. Blended-Wing-Body Transonic Aerodynamics: Summary of Ground Tests and Sample Results

    Science.gov (United States)

    Carter, Melissa B.; Vicroy, Dan D.; Patel, Dharmendra

    2009-01-01

    The Blended-Wing-Body (BWB) concept has shown substantial performance benefits over conventional aircraft configurations, with part of the benefit being derived from the absence of a conventional empennage arrangement. The configuration instead relies upon a bank of trailing edge devices to provide control authority and augment stability. To determine the aerodynamic characteristics of the aircraft, several wind tunnel tests were conducted with a 2% model of Boeing's BWB-450-1L configuration. The tests were conducted in the NASA Langley Research Center's National Transonic Facility and the Arnold Engineering Development Center's 16-Foot Transonic Tunnel. Characteristics of the configuration and the effectiveness of the elevons, drag rudders and winglet rudders were measured at various angles of attack, yaw angles, and Mach numbers (subsonic to transonic speeds). The data from these tests will be used to develop a high fidelity simulation model for flight dynamics analysis and also serve as a reference for CFD comparisons. This paper provides an overview of the wind tunnel tests and examines the effects of Reynolds number, Mach number, pitch-pause versus continuous sweep data acquisition and compares the data from the two wind tunnels.

  14. Ground tests of 120 kW(heat) biomass fired gasifier diesel installation

    Energy Technology Data Exchange (ETDEWEB)

    Zyssin, L.V.; Maronet, I.J.; Morshin, V.N. [Energotechnology Ltd., St. Petersburg (Russian Federation)

    1996-12-31

    For the power range of 1 MW and below, diesel gasifier power plants can be considered one of the main energy sources. Brief information about work carried out in Russia in this direction is presented. Data from preliminary tests of gas-diesel installations are presented. (orig.)

  15. Ground tests of 120 kW(heat) biomass fired gasifier diesel installation

    Energy Technology Data Exchange (ETDEWEB)

    Zyssin, L V; Maronet, I J; Morshin, V N [Energotechnology Ltd., St. Petersburg (Russian Federation)

    1997-12-31

    For the power range of 1 MW and below, diesel gasifier power plants can be considered one of the main energy sources. Brief information about work carried out in Russia in this direction is presented. Data from preliminary tests of gas-diesel installations are presented. (orig.)

  16. Standard practice for guided wave testing of above ground steel pipework using piezoelectric effect transduction

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This practice provides a procedure for the use of guided wave testing (GWT), also previously known as long range ultrasonic testing (LRUT) or guided wave ultrasonic testing (GWUT). 1.2 GWT utilizes ultrasonic guided waves, sent in the axial direction of the pipe, to non-destructively test pipes for defects or other features by detecting changes in the cross-section and/or stiffness of the pipe. 1.3 GWT is a screening tool. The method does not provide a direct measurement of wall thickness or the exact dimensions of defects/defected area; an estimate of the defect severity however can be provided. 1.4 This practice is intended for use with tubular carbon steel or low-alloy steel products having Nominal Pipe size (NPS) 2 to 48 corresponding to 60.3 to 1219.2 mm (2.375 to 48 in.) outer diameter, and wall thickness between 3.81 and 25.4 mm (0.15 and 1 in.). 1.5 This practice covers GWT using piezoelectric transduction technology. 1.6 This practice only applies to GWT of basic pipe configuration. This inc...

  17. Self Diagnostic Accelerometer Ground Testing on a C-17 Aircraft Engine

    Science.gov (United States)

    Tokars, Roger P.; Lekki, John D.

    2013-01-01

    The self diagnostic accelerometer (SDA) developed by the NASA Glenn Research Center was tested for the first time in an aircraft engine environment as part of the Vehicle Integrated Propulsion Research (VIPR) program. The VIPR program includes testing multiple critical flight sensor technologies. One such sensor, the accelerometer, measures vibrations to detect faults in the engine. In order to rely upon the accelerometer, the health of the accelerometer must be ensured. Sensor system malfunction is a significant contributor to propulsion in-flight shutdowns (IFSDs), which can lead to aircraft accidents when the issue is compounded with an inappropriate crew response. The development of the SDA is important both for reducing the IFSD rate, and hence the rate at which this component failure type can put an aircraft in jeopardy, and as a critical enabling technology for future automated malfunction diagnostic systems. The SDA is a sensor system designed to actively determine the accelerometer structural health and attachment condition, in addition to making vibration measurements. The SDA uses a signal conditioning unit that sends an electrical chirp to the accelerometer and recognizes changes in the response due to changes in the accelerometer health and attachment condition. In an effort toward demonstrating the SDA's flightworthiness and robustness, multiple SDAs were mounted and tested on a C-17 aircraft engine. The engine test conditions varied from engine off, to idle, to maximum power. The two SDA attachment conditions used were fully tight and loose. The newly developed SDA health algorithm described herein uses cross-correlation pattern recognition to discriminate a healthy from a faulty SDA. The VIPR test results demonstrate for the first time the robustness of the SDA in an engine environment characterized by high vibration levels.
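
    A minimal sketch of the cross-correlation idea is given below; the chirp shape, the simulated "loose" response, and the 0.8 decision threshold are illustrative assumptions, not the parameters of the SDA health algorithm itself.

        # Hypothetical cross-correlation health check: compare a measured chirp
        # response against a known-healthy baseline and flag low-similarity cases.
        import numpy as np

        def health_score(response, baseline):
            """Peak of the normalized cross-correlation between response and baseline."""
            r = (response - response.mean()) / (np.std(response) * len(response))
            b = (baseline - baseline.mean()) / np.std(baseline)
            return float(np.max(np.correlate(r, b, mode="full")))

        t = np.linspace(0.0, 0.02, 1000)
        chirp = np.sin(2 * np.pi * (2_000 + 200_000 * t) * t)      # swept-sine excitation
        healthy = chirp + 0.02 * np.random.default_rng(1).normal(size=t.size)
        loose = 0.4 * np.roll(chirp, 40) + 0.3 * np.random.default_rng(2).normal(size=t.size)

        for name, sig in [("healthy", healthy), ("loose", loose)]:
            score = health_score(sig, healthy)
            print(name, round(score, 2), "OK" if score > 0.8 else "FLAG")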

  18. Measuring structure deformations of a composite glider by optical means with on-ground and in-flight testing

    Science.gov (United States)

    Bakunowicz, Jerzy; Święch, Łukasz; Meyer, Ralf

    2016-12-01

    In aeronautical research experimental data sets of high quality are essential to verify and improve simulation algorithms. For this reason the experimental techniques need to be constantly refined. The shape, movement or deformation of structural aircraft elements can be measured implicitly in multiple ways; however, only optical, correlation-based techniques are able to deliver direct high-order and spatial results. In this paper two different optical metrologies are used for on-ground preparation and the actual execution of in-flight wing deformation measurements on a PW-6U glider. Firstly, the commercial PONTOS system is used for static tests on the ground and for wind tunnel investigations to successfully certify an experimental sensor pod mounted on top of the test bed fuselage. Secondly, a modification of the glider is necessary to implement the optical method named image pattern correlation technique (IPCT), which has been developed by the German Aerospace Center DLR. This scientific technology uses a stereoscopic camera set-up placed inside the experimental pod and a stochastic dot matrix applied to the area of interest on the glider wing to measure the deformation of the upper wing surface in-flight. The flight test installation, including the preparation, is described and results are presented briefly. Focussing on the compensation for typical error sources, the paper concludes with a recommended procedure to enhance the data processing for better results. Within the presented project IPCT has been developed and optimized for a new type of test bed. Adapted to the special requirements of the glider, the IPCT measurements were able to deliver a valuable wing deformation data base which now can be used to improve corresponding numerical models and simulations.

  19. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hell Missile Test at Yuma Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R.A.

    2002-05-09

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights) may be implemented at a military installation. MERAF was developed with funding from the Strategic Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  20. Experimental Results of Thin-Film Photovoltaic Cells in a Low Density LEO Plasma Environment: Ground Tests

    Science.gov (United States)

    Galofaro, Joel T.; Vayner, Boris V.

    2006-01-01

    Plasma ground testing results, conducted at the Glenn Research Center (GRC) National Plasma Interaction (N-PI) Facility, are presented for a number of thin-film photovoltaic cells. The cells represent a mix of promising new technologies identified by the Air Force Research Laboratory (AFRL) under the CYGNUS Space Science Technology Experiment (SSTE-4) Program. The current ground tests are aimed at characterizing the performance and survivability of thin-film technologies in the harsh low Earth orbital space environment where they will be flown. Measurements of parasitic current loss, charging/dielectric breakdown of cover-slide coatings, and arcing threshold tests are performed for each individual cell. These measurements are followed by a series of experiments designed to test for catastrophic arc failure mechanisms. A special type of power supply, called a solar array simulator (SAS), with adjustable voltage and current limits on the supply's output, is employed to bias two adjacent cells at a predetermined voltage and current. The bias voltage is incrementally ramped up until a sustained arc results. Sustained arcs are precursors to catastrophic arc failure, in which the arc current rises to a maximum value over long timescales, often ranging between 30 and 100 sec. Normal arcs, by comparison, are short-lived events with timescales between 10 and 30 sec. Sustained arcs lead to pyrolization with extreme cell damage and have been shown to cause the loss of entire array strings in solar arrays. The collected data will be used to evaluate the suitability of thin-film photovoltaic technologies for future space operations.

  1. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hell Missile Test at Yuma Proving Ground

    International Nuclear Information System (INIS)

    Efroymson, R.A.

    2002-01-01

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights) may be implemented at a military installation. MERAF was developed with funding from the Strategic Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  2. Advancing Empirical Scholarship to Further Develop Evaluation Theory and Practice

    Science.gov (United States)

    Christie, Christina A.

    2011-01-01

    Good theory development is grounded in empirical inquiry. In the context of educational evaluation, the development of empirically grounded theory has important benefits for the field and the practitioner. In particular, a shift to empirically derived theory will assist in advancing more systematic and contextually relevant evaluation practice, as…

  3. On-ground testing of the role of adhesion in the LISA-Pathfinder test mass injection phase

    Science.gov (United States)

    Bortoluzzi, D.; Zanoni, C.; Conklin, J. W.

    2017-05-01

    Many space missions share the need to fly a free-falling body inside the spacecraft, as a reference for navigation and/or as a probe for the local gravitational field. When a mechanism is required to cage such an object during the launch phase, the need arises to release it to free-fall once the operational phase must be initiated in orbit. The criticality of this phase increases when the mechanical interfaces between the body and the mechanism are affected by adhesion and the actuation authority of the control system on the free-falling body is limited. Both conditions are realized in the LISA Pathfinder mission, which aims at injecting a gold-coated 2 kg cubic test mass into a nearly perfect geodesic trajectory to demonstrate the readiness of the developed technology for in-space gravitational wave detection. The criticality of adhesion is widely recognized in space technology, because it can affect and jeopardize the functionality of mechanisms when it arises between moving parts. In the LISA Pathfinder case, metallic adhesion potentially plays a relevant role, mainly for two reasons. First, thanks to its properties (ductility, high surface energy), the gold coating on the proof mass easily produces cold welds, especially in vacuum conditions. Second, the detachment of the proof mass from the releasing device occurs abruptly, and the separation velocity is expected to have a significant influence on the strength of the weld. This can leave the proof mass, once the releasing device is retracted, with a velocity too high for the subsequent capture and centring phase performed by the control system. A testing activity is performed to characterize the dynamic behaviour of the adhesive bonds between the proof mass and the releasing device, which can be used to predict their contribution to the residual velocity of the proof mass after in-flight release. The study of such a dynamic phenomenon sets some challenging requirements on the measurement technique, both on the

  4. Multi-offset ground-penetrating radar imaging of a lab-scale infiltration test

    Directory of Open Access Journals (Sweden)

    A. R. Mangel

    2012-11-01

    Full Text Available A lab scale infiltration experiment was conducted in a sand tank to evaluate the use of time-lapse multi-offset ground-penetrating radar (GPR data for monitoring dynamic hydrologic events in the vadose zone. Sets of 21 GPR traces at offsets between 0.44–0.9 m were recorded every 30 s during a 3 h infiltration experiment to produce a data cube that can be viewed as multi-offset gathers at unique times or common offset images, tracking changes in arrivals through time. Specifically, we investigated whether this data can be used to estimate changes in average soil water content during wetting and drying and to track the migration of the wetting front during an infiltration event. For the first problem we found that normal-moveout (NMO analysis of the GPR reflection from the bottom of the sand layer provided water content estimates ranging between 0.10–0.30 volumetric water content, which underestimated the value determined by depth averaging a vertical array of six moisture probes by 0.03–0.05 volumetric water content. Relative errors in the estimated depth to the bottom of the 0.6 m thick sand layer were typically on the order of 2%, though increased as high as 25% as the wetting front approached the bottom of the tank. NMO analysis of the wetting front reflection during the infiltration event generally underestimated the depth of the front with discrepancies between GPR and moisture probe estimates approaching 0.15 m. The analysis also resulted in underestimates of water content in the wetted zone on the order of 0.06 volumetric water content and a wetting front velocity equal to about half the rate inferred from the probe measurements. In a parallel modeling effort we found that HYDRUS-1D also underestimates the observed average tank water content determined from the probes by approximately 0.01–0.03 volumetric water content, despite the fact that the model was calibrated to the probe data. This error suggests that the assumed conceptual
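
    The velocity analysis referred to above rests on the standard normal-moveout relation for a reflection recorded at multiple offsets, with the radar velocity then converted to a dielectric permittivity and, via a petrophysical relation such as Topp's equation (whose coefficients are not reproduced here), to volumetric water content; a generic sketch is:

        t(x) = \sqrt{t_0^{\,2} + \frac{x^{2}}{v^{2}}},
        \qquad
        v \approx \frac{c}{\sqrt{\kappa}}
        \;\;\Rightarrow\;\;
        \kappa \approx \left(\frac{c}{v}\right)^{2},

    where x is the antenna offset, t_0 the zero-offset two-way travel time, v the velocity fitted to the reflection hyperbola, c the speed of light in vacuum, and κ the relative dielectric permittivity of the wetted sand.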

  5. What is the place of empirical proton pump inhibitor testing in the diagnosis of gastroesophageal reflux disease? (Description, duration, and dosage).

    Science.gov (United States)

    Vardar, Rukiye; Keskin, Muharrem

    2017-12-01

    Empirical acid suppression tests performed with proton pump inhibitors (PPIs) are used to detect both the presence of acid-related gastrointestinal symptoms and gastroesophageal reflux disease (GERD). In comparison to other diagnostic methods, the test is non-invasive, easily applicable, and cost-effective in the diagnosis of GERD. In addition to typical reflux symptoms, it can also be used for diagnostic purposes in patients with non-cardiac chest pain (NCCP). If the symptom response obtained with the PPI test in patients with NCCP is 50% or above, the test can be considered positive and the treatment should be continued. The sensitivity of the PPI test in patients with typical symptoms of GERD is 27%-89%, while its specificity is 35%-83%. Although there are differences related to the duration and dosage of the PPI test, no significant difference has been found according to the type of PPI. When PPI test sensitivity and specificity were calculated by cumulatively evaluating the data regarding the PPI test in the literature, a sensitivity of 82.3% and a specificity of 51.5% were obtained. High doses of PPI were mostly used in the studies, and the median test duration was 14 days. In conclusion, the sensitivity of the PPI trial test in the diagnosis of GERD in patients with typical reflux symptoms is good, but the specificity is low.
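
    The pooled figures quoted above amount to summing 2x2 diagnostic-test counts across studies; a minimal sketch with hypothetical counts is:

        # Hypothetical pooling of per-study (TP, FP, FN, TN) counts; the numbers
        # below are invented and do not reproduce the literature data cited above.
        def pooled_sens_spec(studies):
            tp = sum(s[0] for s in studies); fp = sum(s[1] for s in studies)
            fn = sum(s[2] for s in studies); tn = sum(s[3] for s in studies)
            return tp / (tp + fn), tn / (tn + fp)

        sens, spec = pooled_sens_spec([(45, 20, 10, 25), (60, 30, 12, 30)])
        print(f"pooled sensitivity {sens:.1%}, specificity {spec:.1%}")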

  6. Narrative and emotion process in psychotherapy: an empirical test of the Narrative-Emotion Process Coding System (NEPCS).

    Science.gov (United States)

    Boritz, Tali Z; Bryntwick, Emily; Angus, Lynne; Greenberg, Leslie S; Constantino, Michael J

    2014-01-01

    While the individual contributions of narrative and emotion processes to psychotherapy outcome have been the focus of recent interest in psychotherapy research literature, the empirical analysis of narrative and emotion integration has rarely been addressed. The Narrative-Emotion Processes Coding System (NEPCS) was developed to provide researchers with a systematic method for identifying specific narrative and emotion process markers, for application to therapy session videos. The present study examined the relationship between NEPCS-derived problem markers (same old storytelling, empty storytelling, unstoried emotion, abstract storytelling) and change markers (competing plotlines storytelling, inchoate storytelling, unexpected outcome storytelling, and discovery storytelling), and treatment outcome (recovered versus unchanged at therapy termination) and stage of therapy (early, middle, late) in brief emotion-focused (EFT), client-centred (CCT), and cognitive (CT) therapies for depression. Hierarchical linear modelling analyses demonstrated a significant Outcome effect for inchoate storytelling (p = .037) and discovery storytelling (p = .002), a Stage × Outcome effect for abstract storytelling (p = .05), and a Stage × Outcome × Treatment effect for competing plotlines storytelling (p = .001). There was also a significant Stage × Outcome effect for NEPCS problem markers (p = .007) and change markers (p = .03). The results provide preliminary support for the importance of assessing the contribution of narrative-emotion processes to efficacious treatment outcomes in EFT, CCT, and CT treatments of depression.

  7. Tests of regional elemental tracers of pollution aerosols. 1. Distinctness of regional signatures, stability during transport, and empirical validation

    International Nuclear Information System (INIS)

    Lowenthal, D.H.; Wunschel, K.R.; Rahn, K.A.

    1988-01-01

    The two major requirements for a successful regional tracer system are distinctness of signatures and stability of signatures during transport. Dissimilarity of the five regional signatures from eastern North America is shown by collinearity diagnostics and by apportionment of synthetic samples generated randomly. Stability of regional signatures during transport is shown first by use of tracer elements in coarse and fine aerosol to predict the maximum possible change of ratios from particle-size effects alone and then by examination of actual changes in signatures during transport from the Midwest to Underhill, VT. Two recent empirical validations of the tracer system are presented: qualitative agreement of pulses of mid-western aerosol in Vermont with pulses of perfluorocarbon tracer gas released in Ohio during CAPTEX '83 and reproduction of our three major northeastern and mid-western signatures by other investigators. The tracer system currently uses the seven elements As, Se, Sb, Zn, In, noncrustal Mn, and noncrustal V as measured by instrumental neutron activation

  8. Safety assessment for the 118-B-1 Burial Ground excavation treatability tests. Revision 2

    International Nuclear Information System (INIS)

    Zimmer, J.J.; Frain, J.M.

    1994-12-01

    This revision of the Safety Assessment provides an auditable safety analysis of the hazards for the proposed treatability test activities per DOE-EM-STD-5502-94, DOE Limited Standard, Hazard Baseline Documentation (DOE 1994). The proposed activities are classified as radiological activities and as such, no longer require Operational Safety Limits (OSLs). The OSLs, Prudent Actions, and Institutional and Organization Controls have been removed from this revision and replaced with ''Administrative Actions Important to Safety,'' as determined by the hazards analysis. Those Administrative Actions Important to Safety are summarized in Section 1.1, ''Assessment Summary.''

  9. Validation of new CFD release by Ground-Coupled Heat Transfer Test Cases

    Directory of Open Access Journals (Sweden)

    Sehnalek Stanislav

    2017-01-01

    Full Text Available This article presents the validation of ANSYS Fluent against IEA BESTEST Task 34. The article starts with an overview of the topic, after which the steady-state cases used for validation are described. The implementation of these cases in the CFD code is then outlined. The article concludes with a presentation of the simulated results and a comparison with results from simulation software already validated by the IEA. The validation shows high correlation with an older version of the tested ANSYS release as well as with the other main software packages. The paper ends with a discussion and an outline of future research.

  10. A Monte Carlo Study of Levene's Test of Homogeneity of Variance: Empirical Frequencies of Type I Error in Normal Distributions.

    Science.gov (United States)

    Neel, John H.; Stallings, William M.

    An influential statistics text recommends Levene's test for homogeneity of variance. A recent note suggests that Levene's test is upwardly biased for small samples. Another report shows inflated alpha estimates and low power. Neither study utilized more than two sample sizes. This Monte Carlo study involved sampling from a normal population for…
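
    A minimal Monte Carlo sketch of the kind of experiment described above: groups are repeatedly drawn from a single normal population (so the null hypothesis of equal variances is true) and the proportion of rejections by Levene's test at the nominal level is recorded. The sample sizes, replication count, and alpha used here are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def levene_type1_rate(n_per_group=10, n_groups=2, alpha=0.05, reps=10_000, seed=1):
    """Estimate the empirical Type I error rate of Levene's test when all
    groups come from the same normal population (H0 true)."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(reps):
        groups = [rng.normal(0.0, 1.0, n_per_group) for _ in range(n_groups)]
        # center='mean' gives Levene's original statistic; 'median' is Brown-Forsythe
        _, p = stats.levene(*groups, center='mean')
        rejections += (p < alpha)
    return rejections / reps

for n in (5, 10, 30):
    print(n, levene_type1_rate(n_per_group=n))
```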

  11. Impact assessment of fly ash on ground water quality: An experimental study using batch leaching tests.

    Science.gov (United States)

    Dandautiya, Rahul; Singh, Ajit Pratap; Kundu, Sanghamitra

    2018-05-01

    Fly ash generated at coal-based thermal power plants is a long-standing concern to environmentalists owing to its adverse impact on air, water and land, and there is a high environmental risk when it is disposed of in the environment. Thus, two different types of fly ash samples (FA-1 and FA-2) have been considered in this study to examine the leaching potential of the elements magnesium, aluminium, silicon, calcium, titanium, vanadium, chromium, manganese, iron, nickel, cobalt, copper, zinc, arsenic, selenium, strontium, cadmium, barium and lead for different types of leachant. Toxicity characteristic leaching procedure and ASTM tests were performed in the laboratory to simulate different natural leaching scenarios. Characterisation of the samples was done through X-ray diffraction and field emission gun scanning electron microscopy. The effect of different liquid-to-solid ratios (i.e. 5, 10, 20 and 50) on the mobilisation of elements was analysed. The results indicated that the maximum leaching of all elements occurred at a liquid-to-solid ratio of 5, except for arsenic, barium and silicon. Groundwater analysis was also carried out to understand the actual effects of the leachate. The elements showing the highest leachability in the two fly ash samples under all tested conditions were magnesium, aluminium, silicon and calcium, with calcium exhibiting greater leaching than all other constituents. The study is useful for assessing contamination levels in groundwater owing to the leaching of fly ash under different scenarios, which can help prevent spreading of the contaminants through efficient management of fly ash.

  12. Aerodynamic Characterization of a Thin, High-Performance Airfoil for Use in Ground Fluids Testing

    Science.gov (United States)

    Broeren, Andy P.; Lee, Sam; Clark, Catherine

    2013-01-01

    The FAA has worked with Transport Canada and others to develop allowance times for aircraft operating in ice-pellet precipitation. Wind-tunnel testing has been carried out to better understand the flow-off characteristics and resulting aerodynamic effects of anti-icing fluids contaminated with ice pellets using a thin, high-performance wing section at the National Research Council of Canada Propulsion and Icing Wind Tunnel. The objective of this paper is to characterize the aerodynamic behavior of this wing section in order to better understand the adverse aerodynamic effects of anti-icing fluids and ice-pellet contamination. Aerodynamic performance measurements, boundary-layer surveys, and flow visualization were conducted at a Reynolds number of approximately 6.0×10^6 and a Mach number of 0.12. The clean, baseline model exhibited leading-edge stall characteristics including a leading-edge laminar separation bubble and minimal or no separation on the trailing edge of the main element or flap. These results were consistent with expected 2-D aerodynamics and showed no anomalies that could adversely affect the evaluation of anti-icing fluids and ice-pellet contamination on the wing. Tests conducted with roughness and leading-edge flow disturbances helped to explain the aerodynamic impact of the anti-icing fluids and contamination. The stalling characteristics of the wing section with fluid and contamination appear to be driven at least partially by the effects of a secondary wave of fluid that forms near the leading edge as the wing is rotated in the simulated takeoff profile. These results have provided a much more complete understanding of the adverse aerodynamic effects of anti-icing fluids and ice-pellet contamination on this wing section. This is important since these results are used, in part, to develop the ice-pellet allowance times that are applicable to many different airplanes.

  13. Empirical Hamiltonians

    International Nuclear Information System (INIS)

    Peggs, S.; Talman, R.

    1987-01-01

    As proton accelerators get larger, and include more magnets, the conventional tracking programs which simulate them run slower. The purpose of this paper is to describe a method, still under development, in which element-by-element tracking around one turn is replaced by a single map, which can be processed far faster. It is assumed for this method that a conventional program exists which can perform faithful tracking in the lattice under study for some hundreds of turns, with all lattice parameters held constant. An empirical map is then generated by comparison with the tracking program. A procedure has been outlined for determining an empirical Hamiltonian, which can represent motion through many nonlinear kicks, by taking data from a conventional tracking program. Though derived by an approximate method, this Hamiltonian is analytic in form and can be subjected to further analysis of varying degrees of mathematical rigor. Even though the empirical procedure has only been described in one transverse dimension, there is good reason to hope that it can be extended to include two transverse dimensions, so that it can become a more practical tool in realistic cases.

  14. Employer-provided health insurance and the incidence of job lock: a literature review and empirical test.

    Science.gov (United States)

    Rashad, Inas; Sarpong, Eric

    2008-12-01

    The incidence of 'job lock' in the health insurance context has long been viewed as a potential problem with employer-provided health insurance, a concept that was instrumental in the passage of the United States Consolidated Omnibus Budget Reconciliation Act of 1986, and later, the Health Insurance Portability and Accountability Act in 1996. Several recent developments in healthcare in the USA include declining healthcare coverage and a noticeable shift in the burden of medical care costs to employees. If these developments cause employees with employer-provided health insurance to feel locked into their jobs, optimal job matches in the labor force may not take place. A summary of the seminal papers in the current literature on the topic of job lock is given, followed by an empirical exercise using single individuals from the National Health Interview Survey (1997-2003) and the 1979 cohort of the National Longitudinal Survey of Youth (1989-2000). Econometric methods used include difference in differences, ordinary least squares and individual fixed effects models, in gauging the potential effect that employer-provided health insurance may have on job tenure and voluntary job departure. Our findings are consistent with recent assertions that there is some evidence of job lock. Individuals with employer-provided health insurance stay on the job 16% longer and are 60% less likely to voluntarily leave their jobs than those with insurance that is not provided by their employers. Productivity may not be optimal if incentives are altered owing to the existence of fringe benefits, such as health insurance. Further research in this area should determine whether legislation beyond the Consolidated Omnibus Budget Reconciliation Act and Health Insurance Portability and Accountability Act laws is needed.

  15. An Empirical Analysis of the Physical Aptitude Exam as a Predictor of Performance on the Physical Readiness Test

    National Research Council Canada - National Science Library

    Patrick, Robert

    2000-01-01

    ... shuttle run, a standing long jump, and a kneeling basketball throw. The Physical Readiness Test, administered semi-annually to all naval personnel including midshipmen, consists of modified situps, pushups, and a 1.5-mile run...

  16. Rydberg Atoms in Strong Fields: a Testing Ground for Quantum Chaos.

    Science.gov (United States)

    Courtney, Michael

    1995-01-01

    Rydberg atoms in strong static electric and magnetic fields provide experimentally accessible systems for studying the connections between classical chaos and quantum mechanics in the semiclassical limit. This experimental accessibility has motivated the development of reliable quantum mechanical solutions. This thesis uses both experimental and computed quantum spectra to test the central approaches to quantum chaos. These central approaches consist mainly of developing methods to compute the spectra of quantum systems in non -perturbative regimes, correlating statistical descriptions of eigenvalues with the classical behavior of the same Hamiltonian, and the development of semiclassical methods such as periodic-orbit theory. Particular emphasis is given to identifying the spectral signature of recurrences --quantum wave packets which follow classical orbits. The new findings include: the breakdown of the connection between energy-level statistics and classical chaos in odd-parity diamagnetic lithium, the discovery of the signature of very long period orbits in atomic spectra, quantitative evidence for the scattering of recurrences by the alkali -metal core, quantitative description of the behavior of recurrences near bifurcations, and a semiclassical interpretation of the evolution of continuum Stark spectra. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.).

  17. X-ray test system for detection of under grounds in concrete

    International Nuclear Information System (INIS)

    Katou, Kiyoshi

    1997-01-01

    Previously, the durability of concrete structures was considered nearly semi-permanent as long as their strength was secured; since then, however, durability diagnosis has attracted considerable attention because buildings erected after about 1955 were constructed under the scrap-and-build approach. In addition to conventional non-destructive methods, development of another method became necessary, and roentgenoscopic imaging has been used to examine internal concrete conditions. Here, a radiographic method was introduced to determine the position of materials buried in a concrete building. This method is not well suited to examining a wide area in terms of working efficiency and cost, so the examination must be done after selecting a restricted test site. For the examination of a wide area, combined use with a laser light survey is necessary. The roentgenographic method is often used to examine the type and distribution of reinforcing rods. At present, this method is used for diagnosing the durability of static constructions. (M.N.)

  18. Testing ground for fluctuation theorems: The one-dimensional Ising model

    Science.gov (United States)

    Lemos, C. G. O.; Santos, M.; Ferreira, A. L.; Figueiredo, W.

    2018-04-01

    In this paper we determine the nonequilibrium magnetic work performed on an Ising model and relate it to the fluctuation theorem derived some years ago by Jarzynski. The basic idea behind this theorem is the relationship connecting the free energy difference between two thermodynamic states of a system and the average work performed by an external agent, in a finite time, through nonequilibrium paths between the same thermodynamic states. We test the validity of this theorem by considering the one-dimensional Ising model, where the free energy is exactly determined as a function of temperature and magnetic field. We have found that the Jarzynski theorem remains valid for all the values of the rate of variation of the magnetic field applied to the system. We have also determined the probability distribution function for the work performed on the system for the forward and reverse processes and verified that predictions based on the Crooks relation are equally correct. We also propose a method to calculate the lag between the current state of the system and that of equilibrium based on macroscopic variables. We have shown that the lag increases with the sweeping rate of the field at its final value for the reverse process, while it decreases in the case of the forward process. The lag increases linearly with the size of the chain and with a slope decreasing with the inverse of the rate of variation of the field.
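
    The following self-contained sketch illustrates the kind of Jarzynski-equality check described above, on a short one-dimensional Ising chain with free ends: the field is ramped from 0 to 1, the work done on the system is accumulated along Metropolis trajectories, and the exponential work average is compared with an exact free-energy difference obtained by direct enumeration. The chain length, temperature, ramp protocol, and sample size are illustrative assumptions, not the parameters used in the paper.

```python
import itertools
import numpy as np

J, BETA, N = 1.0, 1.0, 8                  # coupling, inverse temperature, spins (free ends)
H0, H1, STEPS, SWEEPS = 0.0, 1.0, 50, 5   # field ramp 0 -> 1; relaxation sweeps per step

def energy(s, h):
    return -J * np.sum(s[:-1] * s[1:]) - h * np.sum(s)

def free_energy(h):
    """Exact F(h) by summing over all 2^N configurations."""
    es = [energy(np.array(c), h) for c in itertools.product((-1, 1), repeat=N)]
    return -np.log(np.sum(np.exp(-BETA * np.array(es)))) / BETA

def metropolis_sweep(s, h, rng):
    for i in rng.integers(0, N, N):
        trial = s.copy()
        trial[i] = -trial[i]
        if rng.random() < np.exp(-BETA * (energy(trial, h) - energy(s, h))):
            s = trial
    return s

def work_one_trajectory(rng):
    """Equilibrate at H0, then ramp the field and accumulate the work done on the chain."""
    s = rng.choice(np.array([-1, 1]), N)
    for _ in range(100):
        s = metropolis_sweep(s, H0, rng)
    work, h, dh = 0.0, H0, (H1 - H0) / STEPS
    for _ in range(STEPS):
        work += energy(s, h + dh) - energy(s, h)   # field changed at fixed configuration
        h += dh
        for _ in range(SWEEPS):
            s = metropolis_sweep(s, h, rng)
    return work

rng = np.random.default_rng(0)
works = np.array([work_one_trajectory(rng) for _ in range(1000)])
dF_exact = free_energy(H1) - free_energy(H0)
dF_jarz = -np.log(np.mean(np.exp(-BETA * works))) / BETA
print(f"exact dF = {dF_exact:.3f}  Jarzynski estimate = {dF_jarz:.3f}  <W> = {works.mean():.3f}")
```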

  19. Gravitational-Wave Tests of General Relativity with Ground-Based Detectors and Pulsar-Timing Arrays

    Directory of Open Access Journals (Sweden)

    Nicolás Yunes

    2013-11-01

    Full Text Available This review is focused on tests of Einstein's theory of general relativity with gravitational waves that are detectable by ground-based interferometers and pulsar-timing experiments. Einstein’s theory has been greatly constrained in the quasi-linear, quasi-stationary regime, where gravity is weak and velocities are small. Gravitational waves will allow us to probe a complementary, yet previously unexplored regime: the non-linear and dynamical strong-field regime. Such a regime is, for example, applicable to compact binaries coalescing, where characteristic velocities can reach fifty percent the speed of light and gravitational fields are large and dynamical. This review begins with the theoretical basis and the predicted gravitational-wave observables of modified gravity theories. The review continues with a brief description of the detectors, including both gravitational-wave interferometers and pulsar-timing arrays, leading to a discussion of the data analysis formalism that is applicable for such tests. The review ends with a discussion of gravitational-wave tests for compact binary systems.

  20. Deformed Shape Calculation of a Full-Scale Wing Using Fiber Optic Strain Data from a Ground Loads Test

    Science.gov (United States)

    Jutte, Christine V.; Ko, William L.; Stephens, Craig A.; Bakalyar, John A.; Richards, W. Lance

    2011-01-01

    A ground loads test of a full-scale wing (175-ft span) was conducted using a fiber optic strain-sensing system to obtain distributed surface strain data. These data were input into previously developed deformed shape equations to calculate the wing's bending and twist deformation. A photogrammetry system measured actual shape deformation. The wing deflections reached 100 percent of the positive design limit load (equivalent to 3 g) and 97 percent of the negative design limit load (equivalent to -1 g). The calculated wing bending results were in excellent agreement with the actual bending; tip deflections were within +/- 2.7 in. (out of 155-in. max deflection) for 91 percent of the load steps. Experimental testing revealed valuable opportunities for improving the deformed shape equations' robustness to real world (not perfect) strain data, which previous analytical testing did not detect. These improvements, which include filtering methods developed in this work, minimize errors due to numerical anomalies discovered in the remaining 9 percent of the load steps. As a result, all load steps attained +/- 2.7 in. accuracy. Wing twist results were very sensitive to errors in bending and require further development. A sensitivity analysis and recommendations for fiber implementation practices, along with effective filtering methods, are included.
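
    The deformed-shape equations referenced above are not reproduced here; the sketch below only illustrates the underlying classical-beam idea, in which surface strain divided by the local half-depth gives curvature, and two numerical integrations from a clamped root give the bending deflection. The strain distribution, half-depths, and station spacing are invented for illustration.

```python
import numpy as np

def deflection_from_strain(x, strain, half_depth):
    """Curvature = strain / half_depth; integrate twice (trapezoidal rule) with a
    clamped root: slope(x[0]) = 0 and deflection(x[0]) = 0."""
    curvature = strain / half_depth
    slope = np.concatenate(([0.0],
        np.cumsum(0.5 * (curvature[1:] + curvature[:-1]) * np.diff(x))))
    defl = np.concatenate(([0.0],
        np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))
    return defl

# Hypothetical spanwise stations (in), surface bending strains, and section half-depths (in)
x = np.linspace(0.0, 1050.0, 22)             # root to tip of an ~87.5 ft semispan
strain = 2500e-6 * (1.0 - x / x[-1]) ** 2    # strain decaying toward the tip
half_depth = np.linspace(15.0, 3.0, x.size)  # section half-depth tapering outboard

print(f"estimated tip deflection: {deflection_from_strain(x, strain, half_depth)[-1]:.1f} in")
```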

  1. Gravitational-Wave Tests of General Relativity with Ground-Based Detectors and Pulsar-Timing Arrays.

    Science.gov (United States)

    Yunes, Nicolás; Siemens, Xavier

    2013-01-01

    This review is focused on tests of Einstein's theory of general relativity with gravitational waves that are detectable by ground-based interferometers and pulsar-timing experiments. Einstein's theory has been greatly constrained in the quasi-linear, quasi-stationary regime, where gravity is weak and velocities are small. Gravitational waves will allow us to probe a complementary, yet previously unexplored regime: the non-linear and dynamical strong-field regime. Such a regime is, for example, applicable to compact binaries coalescing, where characteristic velocities can reach fifty percent the speed of light and gravitational fields are large and dynamical. This review begins with the theoretical basis and the predicted gravitational-wave observables of modified gravity theories. The review continues with a brief description of the detectors, including both gravitational-wave interferometers and pulsar-timing arrays, leading to a discussion of the data analysis formalism that is applicable for such tests. The review ends with a discussion of gravitational-wave tests for compact binary systems.

  2. Empirical versus Random Item Selection in the Design of Intelligence Test Short Forms--The WISC-R Example.

    Science.gov (United States)

    Goh, David S.

    1979-01-01

    The advantages of using psychometric theory to design short forms of intelligence tests are demonstrated by comparing such usage to a systematic random procedure that has previously been used. The Wechsler Intelligence Scale for Children Revised (WISC-R) Short Form is presented as an example. (JKS)

  3. Does It Matter if I Take My Mathematics Test on Computer? A Second Empirical Study of Mode Effects in NAEP

    Science.gov (United States)

    Bennett, Randy Elliot; Braswell, James; Oranje, Andreas; Sandene, Brent; Kaplan, Bruce; Yan, Fred

    2008-01-01

    This article describes selected results from the Math Online (MOL) study, one of three field investigations sponsored by the National Center for Education Statistics (NCES) to explore the use of new technology in NAEP. Of particular interest in the MOL study was the comparability of scores from paper- and computer-based tests. A nationally…

  4. An Empirical Study of the Weigl-Goldstein-Scheerer Color-Form Test According to a Developmental Frame of Reference.

    Science.gov (United States)

    Strauss, Helen; Lewin, Isaac

    1982-01-01

    Analyzed the Weigl-Goldstein-Scheerer Color-Form Test using a sample of Danish children. Distinguished three dimensions: configuration of sorting, verbalization of the sorting principle, and the flexibility of switching sorting principle. The three dimensions proved to constitute the a priori defined gradients. Results indicated a…

  5. Using Standards and Empirical Evidence to Develop Academic English Proficiency Test Items in Reading. CSE Technical Report 664

    Science.gov (United States)

    Bailey, Alison L.; Stevens, Robin; Butler, Frances A.; Huang, Becky; Miyoshi, Judy N.

    2005-01-01

    The work we report focuses on utilizing linguistic profiles of mathematics, science and social studies textbook selections for the creation of reading test specifications. Once we determined that a text and associated tasks fit within the parameters established in Butler et al. (2004), they underwent both internal and external review by language…

  6. Does corporate social responsibility put reputation at risk by inviting activist targeting? An empirical test among European SMEs.

    NARCIS (Netherlands)

    Graafland, Johan

    Corporate social responsibility (CSR) is believed to improve a company’s reputation. However, CSR may also put reputation at risk by making the company a more attractive target for activists’ campaigns. We test this effect on a sample of 1355 European small and medium-sized enterprises (SMEs). We find

  7. Empirical Specification of Utility Functions.

    Science.gov (United States)

    Mellenbergh, Gideon J.

    Decision theory can be applied to four types of decision situations in education and psychology: (1) selection; (2) placement; (3) classification; and (4) mastery. For the application of the theory, a utility function must be specified. Usually the utility function is chosen on a priori grounds. In this paper methods for the empirical assessment…

  8. An empirical test of the relative and combined effects of land-cover and climate change on local colonization and extinction.

    Science.gov (United States)

    Yalcin, Semra; Leroux, Shawn James

    2018-04-14

    Land-cover and climate change are two main drivers of changes in species ranges. Yet, the majority of studies investigating the impacts of global change on biodiversity focus on one global change driver and usually use simulations to project biodiversity responses to future conditions. We conduct an empirical test of the relative and combined effects of land-cover and climate change on species occurrence changes. Specifically, we examine whether observed local colonizations and extinctions of North American birds between 1981-85 and 2001-05 are correlated with land-cover and climate change and whether bird life history and ecological traits explain interspecific variation in observed occurrence changes. We fit logistic regression models to test the impact of physical land-cover change, changes in net primary productivity, winter precipitation, mean summer temperature, and mean winter temperature on the probability of Ontario breeding bird local colonization and extinction. Models with climate change, land-cover change, and the combination of these two drivers were the top-ranked models of local colonization for 30%, 27%, and 29% of species, respectively. Conversely, models with climate change, land-cover change, and the combination of these two drivers were the top-ranked models of local extinction for 61%, 7%, and 9% of species, respectively. The quantitative impacts of land-cover and climate change variables also vary among bird species. We then fit linear regression models to test whether the variation in regional colonization and extinction rate could be explained by mean body mass, migratory strategy, and habitat preference of birds. Overall, species traits were weakly correlated with heterogeneity in species occurrence changes. We provide empirical evidence showing that land-cover change, climate change, and the combination of multiple global change drivers can differentially explain observed species local colonization and extinction.
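
    A minimal sketch of the model-comparison logic described above: logistic regressions of local colonization on climate-change covariates, land-cover-change covariates, and their combination are fitted and ranked by AIC. The covariate names and simulated data are invented for illustration and do not reproduce the study's variables.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 500
d_temp = rng.normal(0.8, 0.4, n)      # change in mean summer temperature (hypothetical)
d_precip = rng.normal(0.0, 1.0, n)    # change in winter precipitation (hypothetical)
d_cover = rng.normal(0.0, 1.0, n)     # physical land-cover change (hypothetical)
logit_p = -0.5 + 0.9 * d_temp + 0.4 * d_cover
colonized = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

def fit(cols):
    X = sm.add_constant(np.column_stack(cols))
    return sm.Logit(colonized, X).fit(disp=0)

models = {
    "climate": fit([d_temp, d_precip]),
    "land cover": fit([d_cover]),
    "combined": fit([d_temp, d_precip, d_cover]),
}
for name, res in sorted(models.items(), key=lambda kv: kv[1].aic):
    print(f"{name:10s} AIC = {res.aic:.1f}")
```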

  9. An empirical analysis of the physical aptitude exam as a predictor of performance on the Physical Readiness Test

    OpenAIRE

    Patrick, Robert W.

    2000-01-01

    The Physical Aptitude Exam, administered to candidates in the Naval Academy admissions process to measure physical aptitude, consists of pullups for men or the flexed arm hang for women, a 300-yard shuttle run, a standing long jump, and a kneeling basketball throw. The Physical Readiness Test, administered semi-annually to all naval personnel including midshipmen, consists of modified situps, pushups, and a 1.5-mile run. The purpose of this research is to determine if the Physical Aptitude Exa...

  10. Computer-aided assessment of aviation pilots attention: Design of an integrated test and its empirical validation

    Directory of Open Access Journals (Sweden)

    Rosario Cannavò

    2016-01-01

    In this paper, we present a battery of seven computerized tests, encompassing classical and innovative solutions inspired by the literature in the field, for the integrated measurement of the attention factors of aviation pilots. The computer software is validated by means of an experimental trial with 50 experienced aviation pilots and 50 untrained people as controls. Statistical analyses confirm that the instrument can effectively classify aviation pilots, and identify a subset of distinctive attention factors that could be used for monitoring their duty.

  11. Economic evaluation of medical tests at the early phases of development: a systematic review of empirical studies.

    Science.gov (United States)

    Frempong, Samuel N; Sutton, Andrew J; Davenport, Clare; Barton, Pelham

    2018-02-01

    There is little specific guidance on the implementation of cost-effectiveness modelling at the early stage of test development. The aim of this study was to review the literature in this field to examine the methodologies and tools that have been employed to date. Areas Covered: A systematic review to identify relevant studies in established literature databases. Five studies were identified and included for narrative synthesis. These studies revealed that there is no consistent approach in this growing field. The perspective of patients and the potential for value of information (VOI) to provide information on the value of future research is often overlooked. Test accuracy is an essential consideration, with most studies having described and included all possible test results in their analysis, and conducted extensive sensitivity analyses on important parameters. Headroom analysis was considered in some instances but at the early development stage (not the concept stage). Expert commentary: The techniques available to modellers that can demonstrate the value of conducting further research and product development (i.e. VOI analysis, headroom analysis) should be better utilized. There is the need for concerted efforts to develop rigorous methodology in this growing field to maximize the value and quality of such analysis.
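
    The headroom analysis mentioned in the abstract reduces to simple arithmetic: the maximum justifiable per-patient price of a new test is the value of the incremental health gain at the willingness-to-pay threshold minus any incremental downstream costs. The sketch below illustrates this with invented numbers.

```python
def headroom(delta_qaly, wtp_per_qaly, delta_other_costs):
    """Maximum reimbursable price per patient for a new test:
    value of the incremental health gain minus incremental non-test costs."""
    return delta_qaly * wtp_per_qaly - delta_other_costs

# Hypothetical early-stage estimates for a new diagnostic
print(headroom(delta_qaly=0.02, wtp_per_qaly=30_000.0, delta_other_costs=150.0))  # 450.0
```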

  12. Evaluation of the Internal and Borehole Resistances during Thermal Response Tests and Impact on Ground Heat Exchanger Design

    Directory of Open Access Journals (Sweden)

    Louis Lamarche

    2017-12-01

    Full Text Available The main parameters evaluated with a conventional thermal response test (TRT) are the subsurface thermal conductivity surrounding the borehole and the effective borehole thermal resistance, when averaging the inlet and outlet temperature of a ground heat exchanger with the arithmetic mean. This effective resistance depends on two resistances: the 2D borehole resistance (Rb) and the 2D internal resistance (Ra), which is associated with the short-circuit effect between pipes in the borehole. This paper presents a field method to evaluate these two components separately. Two approaches are proposed. In the first case, the temperature at the bottom of the borehole is measured at the same time as the inlet and outlet temperatures, as done in a conventional TRT. In the second case, different flow rates are used during the experiment to infer the internal resistance. Both approaches assumed a predefined temperature profile inside the borehole. The methods were applied to real experimental tests and compared with numerical simulations. Interesting results were found by comparison with theoretical resistances calculated with the multipole method. The motivation for this work is evidenced by analyzing the impact of the internal resistance on a typical geothermal system design. It is shown to be important to know both resistance components to predict the variation of the effective resistance when the flow rate and the height of the boreholes are changed during the design process.
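
    The internal/borehole resistance separation proposed above is not reproduced here; the sketch below only illustrates the standard infinite-line-source evaluation of a conventional TRT, in which the slope of mean fluid temperature versus ln(time) yields the ground conductivity and the offset yields an effective borehole resistance. The heat injection rate, borehole geometry, and diffusivity values are illustrative assumptions.

```python
import numpy as np

GAMMA = 0.5772157  # Euler-Mascheroni constant

def line_source_trt(t_s, t_fluid, t_ground, q_per_m, r_b, alpha):
    """Infinite-line-source evaluation of a thermal response test: the slope of mean
    fluid temperature vs ln(t) gives conductivity; the offset gives the effective
    borehole resistance."""
    slope, intercept = np.polyfit(np.log(t_s), t_fluid, 1)
    k = q_per_m / (4.0 * np.pi * slope)                               # W/(m K)
    r_eff = (intercept - t_ground) / q_per_m \
        - (np.log(4.0 * alpha / r_b**2) - GAMMA) / (4.0 * np.pi * k)  # m K/W
    return k, r_eff

# Hypothetical test: 6 kW injected over a 150 m borehole, evaluated on hours 10-72
q = 6000.0 / 150.0                                  # W per metre of borehole
t = np.linspace(10 * 3600.0, 72 * 3600.0, 200)      # late-time data only, s
k_true, rb_true, alpha, r_b, t0 = 2.5, 0.10, 1.0e-6, 0.075, 10.0
temps = t0 + q * ((np.log(4 * alpha * t / r_b**2) - GAMMA) / (4 * np.pi * k_true) + rb_true)

k_est, rb_est = line_source_trt(t, temps, t0, q, r_b, alpha)
print(f"k = {k_est:.2f} W/mK, effective Rb = {rb_est:.3f} m K/W")
```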

  13. Integration of ground-penetrating radar, ultrasonic tests and infrared thermography for the analysis of a precious medieval rose window

    Science.gov (United States)

    Nuzzo, L.; Calia, A.; Liberatore, D.; Masini, N.; Rizzo, E.

    2010-04-01

    The integration of high-resolution, non-invasive geophysical techniques (such as ground-penetrating radar or GPR) with emerging sensing techniques (acoustics, thermography) can complement limited destructive tests to provide a suitable methodology for a multi-scale assessment of the state of preservation, material and construction components of monuments. This paper presents the results of the application of GPR, infrared thermography (IRT) and ultrasonic tests to the 13th-century rose window of Troia Cathedral (Apulia, Italy), affected by widespread decay and instability problems caused by the 1731 earthquake and reactivated by recent seismic activity. This integrated approach provided a wealth of complementary information at different scales, ranging from the sub-centimetre size of the metallic joints between the various architectural elements, narrow fractures and thin mortar fillings, up to the sub-metre scale of the internal masonry structure of the circular ashlar curb linking the rose window to the façade, which was essential to understand the original building technique and to design an effective restoration strategy.

  14. Ground-water sampling of the NNWSI (Nevada Nuclear Waste Storage Investigation) water table test wells surrounding Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Matuska, N.A.

    1988-12-01

    The US Geological Survey (USGS), as part of the Nevada Nuclear Waste Storage Investigation (NNWSI) study of the water table in the vicinity of Yucca Mountain, completed 16 test holes on the Nevada Test Site and Bureau of Land Management-administered lands surrounding Yucca Mountain. These 16 wells are monitored by the USGS for water-level data; however, they had not been sampled for ground-water chemistry or isotopic composition. As part of the review of the proposed Yucca Mountain high-level nuclear waste repository, the Desert Research Institute (DRI) sampled six of these wells. The goal of this sampling program was to measure field-dependent parameters of the water such as electrical conductivity, pH, temperature and dissolved oxygen, and to collect samples for major and minor element chemistry and isotopic analysis. This information will be used as part of a program to geochemically model the flow direction between the volcanic tuff aquifers and the underlying regional carbonate aquifer.

  15. Flight Test Result for the Ground-Based Radio Navigation System Sensor with an Unmanned Air Vehicle.

    Science.gov (United States)

    Jang, Jaegyu; Ahn, Woo-Guen; Seo, Seungwoo; Lee, Jang Yong; Park, Jun-Pyo

    2015-11-11

    The Ground-based Radio Navigation System (GRNS) is an alternative/backup navigation system based on time synchronized pseudolites. It has been studied for some years due to the potential vulnerability issue of satellite navigation systems (e.g., GPS or Galileo). In the framework of our study, a periodic pulsed sequence was used instead of the randomized pulse sequence recommended as the RTCM (radio technical commission for maritime services) SC (special committee)-104 pseudolite signal, as a randomized pulse sequence with a long dwell time is not suitable for applications requiring high dynamics. This paper introduces a mathematical model of the post-correlation output in a navigation sensor, showing that the aliasing caused by the additional frequency term of a periodic pulsed signal leads to a false lock (i.e., Doppler frequency bias) during the signal acquisition process or in the carrier tracking loop of the navigation sensor. We suggest algorithms to resolve the frequency false lock issue in this paper, relying on the use of a multi-correlator. A flight test with an unmanned helicopter was conducted to verify the implemented navigation sensor. The results of this analysis show that there were no false locks during the flight test and that outliers stem from bad dilution of precision (DOP) or fluctuations in the received signal quality.

  16. Study of the Pierre Auger Observatory ground detectors: tests, simulation and calibration; Etude des detecteurs de surface de l'observatoire Pierre Auger: tests, simulation et etalonnage

    Energy Technology Data Exchange (ETDEWEB)

    Creusot, A

    2004-10-01

    The Pierre Auger Observatory is dedicated to the study of ultra-high-energy cosmic rays. This study is carried out through the particle showers produced by the interaction between cosmic rays and the atmosphere. Ground detection of these showers requires a comprehensive understanding of the detectors. Several test tanks have been built for this purpose, in particular the Orsay one. The first chapter is dedicated to the presentation of cosmic rays and of the Pierre Auger Observatory. The second describes the detectors used for the Observatory's surface array. The Orsay test tank is then presented in detail. We study the results obtained with the Orsay test tank in the fourth chapter and compare these results with those of the Observatory detectors in the fifth chapter. The sixth chapter is dedicated to the validation of the results through simulation (GEANT4 software). Finally, the first detected particle showers are presented in the seventh chapter. Data acquisition began this year. Construction will be finished by the end of 2005. From that moment, the Pierre Auger Observatory will allow us to contribute to solving the cosmic-ray puzzle. (author)

  17. Automatic apparatus and data transmission for field response tests of the ground; Automatisation et teletransmission des donnees pour les tests de reponse du terrain

    Energy Technology Data Exchange (ETDEWEB)

    Laloui, L.; Steinmann, G.

    2004-07-01

    This is the report on the third part of a development started in 1998 at the Swiss Federal Institute of Technology Lausanne (EPFL) in Lausanne, Switzerland. Energy piles are increasingly being used as heat exchangers and heat storage devices, as are geothermal probes. Their design and sizing are subject to some uncertainty because the planner has to estimate the thermal and mechanical properties of the ground surrounding the piles or probes. The aim of the project was to develop an apparatus for field measurements of the thermal and mechanical properties of an energy pile or a geothermal probe (thermal response tests). In the reported third phase of the project, the portable apparatus was equipped with a data transmission device using the Internet. Real-time data acquisition and supervision are now implemented and data processing has been improved. Another goal of the project was to obtain official accreditation of such response tests according to the European standard EN 45000. First operating experience from a test in Lyon, France, is reported.

  19. An Empirical Taxonomy of Crowdfunding Intermediaries

    OpenAIRE

    Haas, Philipp; Blohm, Ivo; Leimeister, Jan Marco

    2014-01-01

    Due to the recent popularity of crowdfunding, a large number of crowdfunding intermediaries have emerged, while research on crowdfunding intermediaries has been largely neglected. As a consequence, existing classifications of crowdfunding intermediaries are conceptual, lack theoretical grounding, and are not empirically validated. Thus, we develop an empirical taxonomy of crowdfunding intermediaries, which is grounded in the theories of two-sided markets and financial intermediation. Integr...

  20. Effects of resource supplements on mature ciliate biofilms: an empirical test using a new type of flow cell.

    Science.gov (United States)

    Norf, Helge; Arndt, Hartmut; Weitere, Markus

    2009-11-01

    Biofilm-dwelling consumer communities play an important role in the matter flux of many aquatic ecosystems. Due to their poor accessibility, little is as yet known about the regulation of natural biofilms. Here, a new type of flow cell is presented which facilitates both experimental manipulation and live observation of natural, pre-grown biofilms. These flow cells were used to study the dynamics of mature ciliate biofilms in response to supplementation of planktonic bacteria. The results suggest that enhanced ciliate productivity could be quickly transferred to micrometazoans (ciliate grazers), making the effects on the standing stock of the ciliates detectable only for a short time. Likewise, no effect on ciliates appeared when micrometazoan consumers were ab initio abundant. This indicates the importance of 'top-down' control of natural ciliate biofilms. The flow cells used here offer great potential for experimentally testing such control mechanisms within naturally cultivated biofilms.

  1. An empirical test of evolutionary theories for reproductive senescence and reproductive effort in the garter snake Thamnophis elegans.

    Science.gov (United States)

    Sparkman, Amanda M; Arnold, Stevan J; Bronikowski, Anne M

    2007-04-07

    Evolutionary theory predicts that differential reproductive effort and rate of reproductive senescence will evolve under different rates of external mortality. We examine the evolutionary divergence of age-specific reproduction in two life-history ecotypes of the western terrestrial garter snake, Thamnophis elegans. We test for the signature of reproductive senescence (decreasing fecundity with age) and increasing reproductive effort with age (increasing reproductive productivity per gram female) in replicate populations of two life-history ecotypes: snakes that grow fast, mature young and have shorter lifespans, and snakes that grow slow, mature late and have long lives. The difference between life-history ecotypes is due to genetic divergence in growth rate. We find (i) reproductive success (live litter mass) increases with age in both ecotypes, but does so more rapidly in the fast-growth ecotype, (ii) reproductive failure increases with age in both ecotypes, but the proportion of reproductive failure to total reproductive output remains invariant, and (iii) reproductive effort remains constant in fast-growth individuals with age, but declines in slow-growth individuals. This illustration of increasing fecundity with age, even at the latest ages, deviates from standard expectations for reproductive senescence, as does the lack of increases in reproductive effort. We discuss our findings in light of recent theories regarding the phenomenon of increased reproduction throughout life in organisms with indeterminate growth and its potential to offset theoretical expectations for the ubiquity of senescence.

  2. Becker's rational addiction theory: An empirical test with price elasticities for distilled spirits in Denmark 1911-31.

    Science.gov (United States)

    Skog, Ole-Jørgen; Melberg, Hans Olav

    2006-10-01

    To test an implication of Becker's rational addiction theory, namely that price changes will lead both to simultaneous consumption changes and to lagged changes (and potentially also immediate changes if future changes in prices are anticipated). Time-series analysis, first of aggregate sales of distilled spirits and prices, controlled for gross national product (GNP), and secondly of deaths from delirium tremens. Denmark 1911-31. Price changes were very large in the period 1916-18 due to shortages during World War I, and the Danish case can be conceived as a natural experiment. No evidence for lagged price effects in the expected direction was found. On the contrary, the evidence pointed in the opposite direction. The immediate reduction in sales following rising prices is, to some degree, counteracted by an adjustment in the opposite direction the following year. The delirium tremens data confirm this pattern. Becker's theory is not confirmed. Several possible explanations are discussed. If the pattern observed in these data is representative of a more general mechanism, current price elasticity estimates may be too high, by ignoring lagged compensatory effects.

  3. South African physicians’ acceptance of e-prescribing technology: an empirical test of a modified UTAUT model

    Directory of Open Access Journals (Sweden)

    Jason Cohen

    2013-07-01

    Full Text Available E-prescribing systems hold promise for improving the quality and efficiency of the scripting process. Yet, the use of the technology has been associated with a number of challenges. The diffusion of e-prescribing into physician practices and the consequent realisation of its potential benefits will depend on whether physicians are willing to accept and engage with the technology. This study draws on the Unified Theory of Acceptance and Use of Technology (UTAUT) and recent literature on user trust in technology to develop and test a model of the factors influencing South African physicians’ acceptance of e-prescribing. Data was collected from a sample of 72 physicians. Results indicate a general acceptance of e-prescribing amongst physicians who on average reported strong intentions to use e-prescribing technologies if given the opportunity. PLS analysis revealed that physicians’ performance expectancies and perceptions of facilitating conditions had significant direct effects on acceptance whilst trust and effort expectancy had important indirect effects. Social influence and price value perceptions did not add additional explanatory power. The model explained 63% of the variation in physician acceptance.

  4. Aberrant pain perception in direct and indirect non-suicidal self-injury: an empirical test of Joiner's interpersonal theory.

    Science.gov (United States)

    St Germain, Sarah A; Hooley, Jill M

    2013-08-01

    Using a community sample (N=148) we examined pressure pain perception in 3 study groups--people who engaged in non-suicidal self-injury, people who engaged in indirect forms of self-injury, and non-self-injuring controls. In so doing we tested hypotheses derived from Joiner's (2005) interpersonal theory of suicide. Consistent with previous studies and with Joiner's model, people who engaged in NSSI endured pain for significantly longer than non-self-injuring controls. Importantly, pain endurance in the Indirect self-injury group was comparable to that found in the NSSI group and significantly elevated relative to controls. This pattern of results suggests that abnormal pain perception may not be specific to forms of self-injury (e.g., NSSI) that involve immediate physical pain (e.g., cutting). Our findings further suggest that the concept of acquired capability for suicide might have relevance for both direct and indirect forms of self-injurious behavior. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. The decision of out-of-home placement in residential care after parental neglect: Empirically testing a psychosocial model.

    Science.gov (United States)

    Rodrigues, Leonor; Calheiros, Manuela; Pereira, Cícero

    2015-11-01

    Out-of-home placement decisions in residential care are complex, ambiguous and full of uncertainty, especially in cases of parental neglect. The literature on this topic has so far been unable to understand and demonstrate the source of errors involved in those decisions and still fails to focus on professionals' decision-making process. Therefore, this work intends to test a socio-psychological model of decision-making that is a more integrated, dualistic and ecological version of the Theory of Planned Behavior's model. It describes the process through which the decision maker takes into account personal, contextual and social factors of the Decision-Making Ecology in the definition of his/her decision threshold. One hundred and ninety-five professionals from different Children and Youth Protection Units, throughout the Portuguese territory, participated in this online study. After reading a vignette of a (psychological and physical) neglect case toward a one-year-old child, participants were presented with a group of questions that measured workers' assessment of risk, intention, attitude, subjective norm, behavior control and beliefs toward the residential care placement decision, as well as workers' behavior experience, emotions and family/child-related values involved in that decision. A set of structural equation modeling analyses demonstrated the good fit of the proposed model. The intention to propose a residential care placement decision was determined by cognitive, social, affective, value-laden and experience variables and the perceived risk. Altogether, our model explained 61% of professionals' decisions toward a parental neglect case. The theoretical and practical implications of these results are discussed, namely the importance of raising awareness about the existence of these biased psychosocial determinants. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. An empirical test of the decision to lie component of the Activation-Decision-Construction-Action Theory (ADCAT).

    Science.gov (United States)

    Masip, Jaume; Blandón-Gitlin, Iris; de la Riva, Clara; Herrero, Carmen

    2016-09-01

    Meta-analyses reveal that behavioral differences between liars and truth tellers are small. To facilitate lie detection, researchers are currently developing interviewing approaches to increase these differences. Some of these approaches assume that lying is cognitively more difficult than truth telling; however, they are not based on specific cognitive theories of lie production, which are rare. Here we examined one existing theory, Walczyk et al.'s (2014) Activation-Decision-Construction-Action Theory (ADCAT). We tested the Decision component. According to ADCAT, people decide whether to lie or tell the truth as if they were using a specific mathematical formula to calculate the motivation to lie from (a) the probability of a number of outcomes derived from lying vs. telling the truth, and (b) the costs/benefits associated with each outcome. In this study, participants read several hypothetical scenarios and indicated whether they would lie or tell the truth in each scenario (Questionnaire 1). Next, they answered several questions about the consequences of lying vs. telling the truth in each scenario, and rated the probability and valence of each consequence (Questionnaire 2). Significant associations were found between the participants' dichotomous decision to lie/tell the truth in Questionnaire 1 and their motivation to lie scores calculated from the Questionnaire 2 data. However, interestingly, whereas the expected consequences of truth telling were associated with the decision to lie vs. tell the truth, the expected consequences of lying were not. Suggestions are made to refine ADCAT, which can be a useful theoretical framework to guide deception research. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. pKWmEB: integration of Kruskal-Wallis test with empirical Bayes under polygenic background control for multi-locus genome-wide association study.

    Science.gov (United States)

    Ren, Wen-Long; Wen, Yang-Jun; Dunwell, Jim M; Zhang, Yuan-Ming

    2018-03-01

    Although nonparametric methods in genome-wide association studies (GWAS) are robust in quantitative trait nucleotide (QTN) detection, the absence of polygenic background control in single-marker association in genome-wide scans results in a high false positive rate. To overcome this issue, we proposed an integrated nonparametric method for multi-locus GWAS. First, a new model transformation was used to whiten the covariance matrix of the polygenic matrix K and environmental noise. Using the transformed model, the Kruskal-Wallis test, along with least angle regression, was then used to select all the markers that were potentially associated with the trait. Finally, all the selected markers were placed into a multi-locus model, their effects were estimated by empirical Bayes, and all the nonzero effects were further identified by a likelihood ratio test for true QTN detection. This method, named pKWmEB, was validated by a series of Monte Carlo simulation studies. As a result, pKWmEB effectively controlled the false positive rate, although a less stringent significance criterion was adopted. More importantly, pKWmEB retained the high power of the Kruskal-Wallis test, and provided QTN effect estimates. To further validate pKWmEB, we re-analyzed four flowering-time-related traits in Arabidopsis thaliana, and detected some previously reported genes that were not identified by the other methods.
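
    Only the first, nonparametric ingredient of pKWmEB is sketched below: a single-marker Kruskal-Wallis scan over simulated genotypes. The polygenic whitening, least angle regression selection, and empirical Bayes multi-locus steps of the published method are not reproduced, and the simulated data are purely illustrative.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(42)
n_ind, n_marker = 200, 1000
geno = rng.integers(0, 3, size=(n_ind, n_marker))      # genotypes coded 0/1/2
pheno = rng.normal(size=n_ind) + 0.8 * geno[:, 500]     # one simulated QTN at marker 500

pvals = np.ones(n_marker)
for j in range(n_marker):
    groups = [pheno[geno[:, j] == g] for g in (0, 1, 2)]
    groups = [g for g in groups if g.size > 1]           # skip empty/singleton classes
    if len(groups) >= 2:
        pvals[j] = kruskal(*groups).pvalue

print("strongest markers:", np.argsort(pvals)[:5])
```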

  8. A chi-square goodness-of-fit test for non-identically distributed random variables: with application to empirical Bayes

    International Nuclear Information System (INIS)

    Conover, W.J.; Cox, D.D.; Martz, H.F.

    1997-12-01

    When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs, which are provided.
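
    A simplified sketch of the idea behind such a test, for the beta-binomial case with equal sample sizes (the paper's tests handle varying sample sizes): fit a beta prior by the method of moments, compute beta-binomial expected cell counts, and form a chi-square statistic with the degrees of freedom reduced by the two estimated parameters. All data below are simulated.

```python
import numpy as np
from scipy.stats import betabinom, chi2

def beta_mom(p_hat):
    """Method-of-moments beta(a, b) fit to observed proportions."""
    m, v = p_hat.mean(), p_hat.var(ddof=1)
    common = m * (1 - m) / v - 1.0
    return m * common, (1 - m) * common

def beta_prior_gof(successes, n):
    """Chi-square goodness of fit of a beta prior for binomial(n, p_i) data,
    assuming every unit has the same exposure n (a simplification)."""
    a, b = beta_mom(successes / n)
    observed = np.bincount(successes, minlength=n + 1).astype(float)
    expected = len(successes) * betabinom.pmf(np.arange(n + 1), n, a, b)
    keep = expected >= 1.0                       # pool sparse cells into one
    if keep.all():
        obs, exp = observed, expected
    else:
        obs = np.append(observed[keep], observed[~keep].sum())
        exp = np.append(expected[keep], expected[~keep].sum())
    stat = np.sum((obs - exp) ** 2 / exp)
    dof = len(obs) - 1 - 2                       # two beta parameters estimated
    return stat, chi2.sf(stat, dof)

rng = np.random.default_rng(3)
n, units = 20, 60
p = rng.beta(2.0, 8.0, units)                    # simulated "true" beta prior
x = rng.binomial(n, p)
print(beta_prior_gof(x, n))
```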

  9. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Missile Firing

    International Nuclear Information System (INIS)

    Jones, Daniel Steven; Efroymson, Rebecca Ann; Hargrove, William Walter; Suter, Glenn; Pater, Larry

    2008-01-01

    A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the missile launch and detonation. The primary stressor associated with this activity was sound. Other minor stressors included the detonation impact, shrapnel, and fire. Exposure to desert mule deer (Odocoileus hemionus crooki) was quantified using the Army sound contour program BNOISE2, as well as distances from the explosion to deer. Few effects data were available from related studies. Exposure-response models for the characterization of effects consisted of human 'disturbance' and hearing damage thresholds in units of C-weighted decibels (sound exposure level) and a distance-based No Observed Adverse Effects Level for moose and cannonfire. The risk characterization used a weight-of-evidence approach and concluded that risk to mule deer behavior from the missile firing was likely for a negligible number of deer, but that no risk to mule deer abundance and reproduction is expected

  10. Nuclear multifragmentation, its relation to general physics. A rich test ground of the fundamentals of statistical mechanics

    International Nuclear Information System (INIS)

    Gross, D.H.E.

    2006-01-01

    Heat can flow from cold to hot at any phase separation, even in macroscopic systems. Therefore, Lynden-Bell's famous gravothermal catastrophe must also be reconsidered. In contrast to traditional canonical Boltzmann-Gibbs statistics, this is correctly described only by microcanonical statistics. Systems studied in chemical thermodynamics (ChTh) by using canonical statistics consist of several homogeneous macroscopic phases. Evidently, macroscopic statistics as in chemistry cannot and should not be applied to non-extensive or inhomogeneous systems like nuclei or galaxies. Nuclei are small and inhomogeneous. Multifragmented nuclei are even more inhomogeneous and the fragments even smaller. Phase transitions of first order and especially phase separations therefore cannot be described by a (homogeneous) canonical ensemble. Taking this seriously, fascinating perspectives open up for statistical nuclear fragmentation as a testing ground for the basic principles of statistical mechanics, especially of phase transitions, without the use of the thermodynamic limit. Moreover, there is also a lot of similarity between the accessible phase space of fragmenting nuclei and inhomogeneous multistellar systems. This underlines the fundamental significance for statistical physics in general. (orig.)

  11. Space-Borne and Ground-Based InSAR Data Integration: The Åknes Test Site

    Directory of Open Access Journals (Sweden)

    Federica Bardi

    2016-03-01

    Full Text Available This work proposes an integration of InSAR (Interferometric Synthetic Aperture Radar) data acquired by ground-based (GB) and satellite platforms. The selected test site is the Åknes rockslide, which affects the western Norwegian coast. The availability of GB-InSAR and satellite InSAR data, together with an extensive literature, makes the landslide suitable for testing the proposed procedure. The first step consists of organizing a geodatabase in a GIS environment containing all of the available data. The second step concerns the separate analysis of the satellite and GB-InSAR data. Two satellite datasets are available, both in ascending orbit and processed with the SBAS (Small BAseline Subset) method: one acquired by RADARSAT-2 (covering October 2008 to August 2013) and one by a combination of TerraSAR-X and TanDEM-X (acquired between July 2010 and October 2012). GB-InSAR data from five measurement campaigns, carried out during the summer seasons of 2006, 2008, 2009, 2010 and 2012, are also available. The third step relies on data integration, performed first qualitatively and then semi-quantitatively. The results of the proposed procedure have been validated against GPS (Global Positioning System) data. The procedure allowed us to better define landslide sectors in terms of different ranges of displacement. Qualitatively, stable and unstable areas were distinguished. Within the moving area, the semi-quantitative integration step allowed two sectors to be defined: a first sector with displacements exceeding 10 mm, and a second sector where displacements did not exceed 10 mm over the analyzed period.
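
    A minimal sketch of the semi-quantitative step described above: combining co-located satellite and ground-based displacement estimates and splitting the moving area at the 10-mm threshold quoted in the abstract. The fusion rule (a simple per-pixel average), the stability threshold, and all names are illustrative assumptions, not the published workflow.

    import numpy as np

    STABLE_MM = 2.0          # assumed noise floor below which a point is called stable
    SECTOR_SPLIT_MM = 10.0   # sector threshold quoted in the abstract

    def classify_sectors(sat_disp_mm, gb_disp_mm):
        """Label each pixel: -1 no data, 0 stable, 1 moving <= 10 mm, 2 moving > 10 mm."""
        stack = np.stack([sat_disp_mm, gb_disp_mm])
        combined = np.nanmean(stack, axis=0)   # simple fusion: average where both sensors see the pixel
        magnitude = np.abs(combined)
        labels = np.zeros(magnitude.shape, dtype=int)
        labels[magnitude > STABLE_MM] = 1
        labels[magnitude > SECTOR_SPLIT_MM] = 2
        labels[np.isnan(magnitude)] = -1       # neither sensor covers this pixel
        return labels

    # Illustrative use with two tiny 2 x 3 displacement maps (mm), NaN = no coverage:
    sat = np.array([[0.5, 4.0, 15.0], [np.nan, 12.0, 1.0]])
    gb = np.array([[1.0, 6.0, np.nan], [3.0, 14.0, np.nan]])
    print(classify_sectors(sat, gb))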

  12. Testing the generality of above-ground biomass allometry across plant functional types at the continent scale.

    Science.gov (United States)

    Paul, Keryn I; Roxburgh, Stephen H; Chave, Jerome; England, Jacqueline R; Zerihun, Ayalsew; Specht, Alison; Lewis, Tom; Bennett, Lauren T; Baker, Thomas G; Adams, Mark A; Huxtable, Dan; Montagu, Kelvin D; Falster, Daniel S; Feller, Mike; Sochacki, Stan; Ritson, Peter; Bastin, Gary; Bartle, John; Wildy, Dan; Hobbs, Trevor; Larmour, John; Waterworth, Rob; Stewart, Hugh T L; Jonson, Justin; Forrester, David I; Applegate, Grahame; Mendham, Daniel; Bradford, Matt; O'Grady, Anthony; Green, Daryl; Sudmeyer, Rob; Rance, Stan J; Turner, John; Barton, Craig; Wenk, Elizabeth H; Grove, Tim; Attiwill, Peter M; Pinkard, Elizabeth; Butler, Don; Brooksbank, Kim; Spencer, Beren; Snowdon, Peter; O'Brien, Nick; Battaglia, Michael; Cameron, David M; Hamilton, Steve; McAuthur, Geoff; Sinclair, Jenny

    2016-06-01

    Accurate ground-based estimation of the carbon stored in terrestrial ecosystems is critical to quantifying the global carbon budget. Allometric models provide cost-effective methods for biomass prediction. But do such models vary with ecoregion or plant functional type? We compiled 15 054 measurements of individual tree or shrub biomass from across Australia to examine the generality of allometric models for above-ground biomass prediction. This provided a robust case study because Australia includes ecoregions ranging from arid shrublands to tropical rainforests, and has a rich history of biomass research, particularly in planted forests. Regardless of ecoregion, for five broad categories of plant functional type (shrubs; multistemmed trees; trees of the genus Eucalyptus and closely related genera; other trees of high wood density; and other trees of low wood density), relationships between biomass and stem diameter were generic. Simple power-law models explained 84-95% of the variation in biomass, with little improvement in model performance when other plant variables (height, bole wood density), or site characteristics (climate, age, management) were included. Predictions of stand-based biomass from allometric models of varying levels of generalization (species-specific, plant functional type) were validated using whole-plot harvest data from 17 contrasting stands (range: 9-356 Mg ha(-1) ). Losses in efficiency of prediction were <1% if generalized models were used in place of species-specific models. Furthermore, application of generalized multispecies models did not introduce significant bias in biomass prediction in 92% of the 53 species tested. Further, overall efficiency of stand-level biomass prediction was 99%, with a mean absolute prediction error of only 13%. Hence, for cost-effective prediction of biomass across a wide range of stands, we recommend use of generic allometric models based on plant functional types. Development of new species
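
    A minimal sketch of fitting the kind of power-law allometric model described above (biomass = a · diameter^b) by ordinary least squares on log-log axes, with a Baskerville-type correction for back-transformation bias. The example data and variable names are illustrative, not the study's dataset.

    import numpy as np

    def fit_power_law(diameter_cm, biomass_kg):
        """Fit ln(B) = ln(a) + b*ln(D); return (a, b, bias correction factor)."""
        x = np.log(diameter_cm)
        y = np.log(biomass_kg)
        b, ln_a = np.polyfit(x, y, 1)
        residuals = y - (ln_a + b * x)
        cf = np.exp(np.var(residuals, ddof=2) / 2.0)   # Baskerville-type bias correction
        return np.exp(ln_a), b, cf

    def predict_biomass(diameter_cm, a, b, cf=1.0):
        """Back-transformed prediction of above-ground biomass from stem diameter."""
        return cf * a * diameter_cm ** b

    # Illustrative use with made-up measurements:
    d = np.array([5.0, 12.0, 20.0, 35.0, 50.0])
    m = np.array([4.1, 38.0, 140.0, 620.0, 1700.0])
    a, b, cf = fit_power_law(d, m)
    print(f"biomass ~ {cf * a:.3f} * D^{b:.2f}")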

  13. Choosing the correct empirical antibiotic for urinary tract infection in pediatric: Surveillance of antimicrobial susceptibility pattern of Escherichia coli by E-Test method.

    Science.gov (United States)

    Sedighi, Iraj; Solgi, Abbas; Amanati, Ali; Alikhani, Mohammad Yousef

    2014-12-01

    Urinary tract infections (UTIs) are among the most common bacterial diseases worldwide. We investigated the antibiotic susceptibility patterns of Escherichia coli (E. coli) strains isolated from pediatric patients with community-acquired urinary tract infection (UTI) to provide clinical guidance for choosing the right empirical antibiotic in these patients. In this cross-sectional study, 100 urine specimens positive for E. coli were investigated for antibiotic susceptibility patterns. Susceptibility to co-trimoxazole (25 μg), amikacin (30 μg), ceftriaxone (30 μg), nalidixic acid (30 μg), cefixime (5 μg), and nitrofurantoin (300 μg) was tested by disk diffusion agar, and MICs were determined with the E-test. The mean patient age was 38 months. Girls accounted for a greater proportion than boys (74% versus 26%). With the disk diffusion method, 26% of the isolates were susceptible to co-trimoxazole; susceptibility to amikacin, ceftriaxone, nitrofurantoin, nalidixic acid and cefixime was 94%, 66%, 97%, 62% and 52%, respectively. By the E-test method and according to CLSI criteria, susceptibility to co-trimoxazole, amikacin, ceftriaxone and nalidixic acid was 37%, 97%, 67% and 50%, respectively. The highest percentage of agreement between the disk diffusion and E-test methods was found for amikacin (96%) and the lowest for co-trimoxazole (89%). Treatment failure, prolonged or repeated hospitalization, increased costs of care, and increased mortality are some consequences of bacterial resistance in UTIs. Misuse of antibiotics in a given geographic location directly affects the local antibiotic resistance pattern. In the treatment of UTI, selection of antimicrobial agents should be guided by surveillance of bacterial susceptibility testing. According to our results, amikacin as an injectable drug and nitrofurantoin as an oral agent could be used as drugs of choice in our region for children with UTIs.
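
    A minimal sketch of the percent-agreement comparison reported above between disk diffusion and E-test susceptibility calls. The lists of calls are hypothetical placeholders, not study data.

    def percent_agreement(calls_disk, calls_etest):
        """Share of isolates given the same susceptible/resistant call by both methods."""
        if len(calls_disk) != len(calls_etest):
            raise ValueError("both methods must be scored on the same isolates")
        matches = sum(a == b for a, b in zip(calls_disk, calls_etest))
        return 100.0 * matches / len(calls_disk)

    # Example with hypothetical calls for 10 isolates ("S" susceptible, "R" resistant):
    disk = ["S", "S", "R", "R", "S", "R", "S", "S", "R", "S"]
    etest = ["S", "R", "R", "R", "S", "R", "S", "S", "R", "S"]
    print(f"agreement: {percent_agreement(disk, etest):.0f}%")   # 90%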

  14. Empirical Hamiltonians

    International Nuclear Information System (INIS)

    Peggs, S.; Talman, R.

    1986-08-01

    As proton accelerators get larger, and include more magnets, the conventional tracking programs which simulate them run slower. At the same time, to more carefully optimize these increasingly costly accelerators, the programs must return more accurate results, even in the presence of a longer list of realistic effects, such as magnet errors and misalignments. For these reasons conventional tracking programs continue to be computationally bound, despite the continually increasing computing power available. This limitation is especially severe for a class of problems in which some lattice parameter is slowly varying, when a faithful description is only obtained by tracking for an exceedingly large number of turns. Examples are synchrotron oscillations in which the energy varies slowly with a period of, say, hundreds of turns, or magnet ripple or noise on a comparably slow time scale. In these cases one may wish to track for hundreds of periods of the slowly varying parameter. The purpose of this paper is to describe a method, still under development, in which element-by-element tracking around one turn is replaced by a single map, which can be processed far faster. Similar programs have already been written in which successive elements are "concatenated" with truncation to linear, sextupole, or octupole order, et cetera, using Lie algebraic techniques to preserve symplecticity. The method described here is rather more empirical than this but, in principle, contains information to all orders and is able to handle resonances in a more straightforward fashion.
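
    A toy sketch of the idea described above: once a single one-turn map is available, many turns can be tracked by iterating that map instead of stepping through every magnet. Here the map is an assumed linear phase-space rotation followed by a thin sextupole kick; the tune and sextupole strength are illustrative, and this is not the paper's empirical map construction.

    import numpy as np

    def one_turn_map(x, xp, tune=0.31, k2=1.2):
        """Apply one turn: linear phase-space rotation, then a thin sextupole kick."""
        mu = 2.0 * np.pi * tune
        x_new = np.cos(mu) * x + np.sin(mu) * xp
        xp_new = -np.sin(mu) * x + np.cos(mu) * xp
        xp_new += k2 * x_new ** 2          # thin-lens sextupole kick
        return x_new, xp_new

    def track(x0, xp0, n_turns=10000):
        """Iterate the one-turn map; far cheaper than element-by-element tracking."""
        coords = np.empty((n_turns, 2))
        x, xp = x0, xp0
        for turn in range(n_turns):
            x, xp = one_turn_map(x, xp)
            coords[turn] = (x, xp)
        return coords

    orbit = track(0.01, 0.0)
    print("final amplitude:", np.hypot(*orbit[-1]))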

  15. Evaluation of two transport aircraft and several ground test vehicle friction measurements obtained for various runway surface types and conditions. A summary of test results from joint FAA/NASA Runway Friction Program

    Science.gov (United States)

    Yager, Thomas J.; Vogler, William A.; Baldasare, Paul

    1990-01-01

    Tests with specially instrumented NASA Boeing 737 and 727 aircraft, together with several different ground friction measuring devices, were conducted for a variety of runway surface types and conditions. These tests are part of the joint FAA/NASA Aircraft/Ground Vehicle Runway Friction Program aimed at obtaining a better understanding of aircraft ground handling performance under adverse weather conditions and defining relationships between aircraft and ground vehicle tire friction measurements. Aircraft braking performance on dry, wet, snow- and ice-covered runways is discussed, as well as ground vehicle friction data obtained under similar runway conditions. For a given contaminated runway surface condition, the correlation between ground vehicle and aircraft friction data is identified. The influence of major test parameters on friction measurements, such as speed, test tire characteristics, type and amount of surface contaminant, and ambient temperature, is discussed. The effect of surface type on wet friction levels is also evaluated from comparative data collected on grooved and ungrooved concrete and asphalt surfaces.

  16. Ground-water data for the Nevada Test Site and selected other areas in South-Central Nevada, 1992--1993

    International Nuclear Information System (INIS)

    1995-01-01

    The US Geological Survey, in support of the US Department of Energy Environmental Restoration and Hydrologic Resources Management Programs, collects and compiles hydrogeologic data to aid in characterizing the regional and local ground-water flow systems underlying the Nevada Test Site and vicinity. This report presents selected ground-water data collected from wells and test holes at and in the vicinity of the Nevada Test Site. Depth-to-water measurements were made during water year 1993 at 55 sites at the Nevada Test Site and 43 regional sites in the vicinity of the Nevada Test Site. Depth to water ranged from 87.7 to 674.6 meters below land surface at the Nevada Test Site and from 6.0 to 444.7 meters below land surface at sites in the vicinity of the Nevada Test Site. Depth-to-water measurements were obtained using the wire-line, electric-tape, air-line, and steel-tape devices. Total measured ground-water withdrawal from the Nevada Test Site during the 1993 calendar year was 1,888.04 million liters. Annual ground-water withdrawals from 14 wells ranged from 0.80 million to 417.20 million liters. Tritium concentrations from four samples at the Nevada Test Site and from three samples in the vicinity of the Nevada Test Site collected during water year 1993 ranged from near 0 to 27,676.0 becquerels per liter and from near 0 to 3.9 becquerels per liter, respectively

  17. Aerothermodynamic Design, Review on Ground Testing and CFD (Conception aerothermodynamique, revue sur les essais au sol et dynamique des fluides informatisee)

    Science.gov (United States)

    2010-04-01

    Aerothermodynamic Design, Review on Ground Testing and CFD (RTO-EN-AVT-186) Executive Summary: The Lecture Series focuses on the presentation of... the ITAM impulse facilities and the DLR HEG shock tubes. Fast-response probes and unsteady measurement techniques were presented, as well as the tools of...

  18. A system to test the ground surface conditions of construction sites--for safe and efficient work without physical strain.

    Science.gov (United States)

    Koningsveld, Ernst; van der Grinten, Maarten; van der Molen, Henk; Krause, Frank

    2005-07-01

    Ground surface conditions on construction sites have an important influence on the health and safety of workers and their productivity. The development of an expert-based "working conditions evaluation" system is described, intended to assist site managers in recognising unsatisfactory ground conditions and remedying these. The system was evaluated in the period 2002-2003. The evaluation shows that companies recognize poor soil/ground conditions as problematic, but are not aware of the specific physical workload hazards. The developed methods allow assessment of the ground surface quality and selection of appropriate measures for improvement. However, barriers exist at present to wide implementation of the system across the industry. Most significant of these is that responsibility for a site's condition is not clearly located within contracting arrangements, nor is it a topic of serious negotiation.

  19. Shaking Table Tests on the Seismic Behavior of Steel Frame Structures Subjected to Various Earthquake Ground Motions

    International Nuclear Information System (INIS)

    Choi, In Kil; Kim, Min Kyu; Choun, Young Sun; Seo, Jeong Moon

    2004-05-01

    The standard response spectrum proposed by the US NRC has been used as the design earthquake for the design of Korean nuclear power plant structures. Recent large earthquakes in near-fault zones have caused significant damage and loss of life in the affected areas. A survey of some of the Quaternary fault segments near the Korean nuclear power plants is ongoing. If the faults are confirmed as active, it will be necessary to reevaluate the seismic safety of the nuclear power plants located near them. In this study, shaking table tests of three steel frame structures were performed. Three types of input motion were used: artificial time histories enveloping the US NRC Regulatory Guide 1.60 spectrum, motions matching the probability-based scenario earthquake spectra developed for a Korean nuclear power plant site, and a typical near-fault record from the Chi-Chi earthquake. The acceleration and displacement responses of the structures due to the design earthquake were larger than those due to the other input earthquakes. It appears that the design earthquake for the Korean nuclear power plants is conservative, and that the near-fault and scenario earthquakes are not particularly damaging to the nuclear power plant structures, because their fundamental frequencies are generally greater than 5 Hz. The high-frequency ground motions that appeared in the scenario earthquake can, however, be more damaging to equipment installed on the upper floors of a building. This means that the design earthquake is not as conservative for the safety of safety-related nuclear power plant equipment.
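
    A minimal sketch of why frequency content matters for the comparison above: the peak response of a single-degree-of-freedom oscillator depends strongly on how the ground motion's energy is distributed relative to the oscillator's natural frequency. The Newmark constant-average-acceleration integration and the toy input motions are illustrative assumptions, not the test program's analysis.

    import numpy as np

    def sdof_peak_accel(ag, dt, f_n, zeta=0.05):
        """Peak pseudo-acceleration of an SDOF oscillator driven by ground accel ag(t)."""
        wn = 2.0 * np.pi * f_n
        k, c = wn ** 2, 2.0 * zeta * wn          # stiffness and damping per unit mass
        u = v = 0.0
        a = -ag[0] - c * v - k * u
        u_max = 0.0
        for ag_i in ag[1:]:
            # Newmark constant-average-acceleration step (gamma = 1/2, beta = 1/4)
            u_pred = u + dt * v + dt ** 2 / 4.0 * a
            v_pred = v + dt / 2.0 * a
            a_new = (-ag_i - c * v_pred - k * u_pred) / (1.0 + c * dt / 2.0 + k * dt ** 2 / 4.0)
            u = u_pred + dt ** 2 / 4.0 * a_new
            v = v_pred + dt / 2.0 * a_new
            a = a_new
            u_max = max(u_max, abs(u))
        return wn ** 2 * u_max

    # Illustrative use: a 5 Hz structure excited by a low- and a high-frequency pulse
    dt = 0.005
    t = np.arange(0.0, 10.0, dt)
    low = np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)    # mostly 1 Hz energy
    high = np.sin(2 * np.pi * 8.0 * t) * np.exp(-0.3 * t)   # mostly 8 Hz energy
    print(sdof_peak_accel(low, dt, f_n=5.0), sdof_peak_accel(high, dt, f_n=5.0))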

  20. Settlement mechanism of the backfilled ground around nuclear power plant buildings. Part 2. A series of centrifuge tests and a numerical simulation by using FEM about a typical test result

    International Nuclear Information System (INIS)

    Kawai, Tadashi; Ishimaru, Makoto

    2009-01-01

    During the Niigataken Chuetsu-oki earthquake, rather large settlements of the backfill ground around rigid, stable buildings were observed. In this study, five centrifuge tests with shaking events were conducted to reproduce similar settlements and examine their mechanism. The test results showed that settlement occurred throughout the model ground due to the negative dilatancy of the sandy soils, and that additional settlement occurred suddenly when the backfill separated from the rigid wall modeling the rigid, stable buildings; that is, a sliding failure in an active state occurred in the backfill near the structure. It was confirmed that these settlements could be estimated by a simple method proposed in this report: only the difference between the self-weight of the sliding block and the soil strength calculated at the initial stress conditions is considered as the driving force of the sliding failure, and the acceleration obtained by dividing this force by the mass of the sliding block is simply integrated twice with respect to time over the interval during which the backfill is separated from the structure. Further, an FEM simulation of a typical test result was conducted, and the settlements were well reproduced. (author)
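
    A minimal sketch of the simple estimate described above: the net force on the sliding backfill block (self-weight driving component minus soil strength at the initial stress state) divided by the block mass gives an acceleration, which is integrated twice over the interval during which the backfill has separated from the structure. The inputs and the separation history below are illustrative assumptions, not values from the centrifuge tests.

    import numpy as np

    def sliding_settlement(driving_force_N, resisting_force_N, mass_kg, t, separated):
        """Double-integrate the net sliding acceleration while `separated` is True."""
        net = np.where(separated, driving_force_N - resisting_force_N, 0.0)
        accel = np.clip(net, 0.0, None) / mass_kg   # no sliding while strength governs
        dt = np.diff(t, prepend=t[0])
        velocity = np.cumsum(accel * dt)
        displacement = np.cumsum(velocity * dt)
        return displacement[-1]

    # Illustrative use over a 2 s window, with separation assumed after 0.5 s:
    t = np.arange(0.0, 2.0, 0.01)
    separated = t > 0.5
    settle = sliding_settlement(driving_force_N=4.0e4, resisting_force_N=3.2e4,
                                mass_kg=2.0e4, t=t, separated=separated)
    print(f"estimated sliding displacement: {settle:.3f} m")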