WorldWideScience

Sample records for project gasbuggy consisted

  1. Gas quality analysis and evaluation program for project Gasbuggy

    Energy Technology Data Exchange (ETDEWEB)

    Smith, C F [Lawrence Radiation Laboratory, University of California, Livermore, CA (United States)]

    1970-05-01

    Experimental results of the gas quality analysis program for Project Gasbuggy through August 1969 are presented graphically, addressing the questions raised by the preshot program goals. The chemical composition and the concentrations of tritium, krypton-85, carbon-14 and argon-37, 39 are presented as a function of time and gas production from the nuclear chimney. Chemically, the presence of CO{sub 2}, CO and H{sub 2} served to dilute the formation gas and caused reactions which significantly altered the gas composition at early times. The radionuclide content of the chimney gas at reentry was some 800 pCi/cm{sup 3} of which about 80% was CH{sub 3}T. Lesser quantities of tritium were observed as HT, C{sub 2}H{sub 5}T and C{sub 3}H{sub 7}T. The other major contaminant was Kr{sup 85} which was present at about one-fifth the level of CH{sub 3}T. Small quantities of carbon-14 and argon-39 were also identified. The only other radionuclides identified in the gas were relatively short-lived rare gases. During the production testing, about two and one-half chimney volumes of gas at formation pressure were removed. This removal, accompanied by dilution, has reduced the radionuclide concentrations to about 7% of their levels at reentry. The production characteristics of the Gasbuggy environment prevented an adequate test of the effectiveness of chimney flushing. However, the rapid drawdown concept is supported by the available data as an effective means of reducing contaminant levels. The changes in composition during production or testing are seen to be consistent with a model involving a non-uniform gas influx rate and flow distribution over the chimney region. Mixing times are estimated to be on the order of a few days, so that increasing concentrations following a sudden gas influx can be explained. (author)
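
    As a rough cross-check of the drawdown figures above, the sketch below treats the chimney as a single well-mixed volume flushed by clean formation gas, an idealization not stated in the abstract; under that assumption, cycling about two and one-half chimney volumes leaves roughly exp(-2.5), or about 8%, of the initial contaminant concentration, close to the reported 7%.

        # Well-mixed (stirred-tank) flushing sketch; an idealization, not the report's model.
        import math

        chimney_volumes_removed = 2.5                        # from the abstract
        residual_fraction = math.exp(-chimney_volumes_removed)
        print(f"predicted residual contaminant fraction ~ {residual_fraction:.1%}")  # ~8%, vs ~7% reported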

  2. Gasbuggy in perspective

    International Nuclear Information System (INIS)

    Holzer, Alfred

    1970-01-01

    The Gasbuggy experiment set out to answer a number of questions: To what degree could a low-permeability, gas-bearing formation be stimulated? What were the mechanisms responsible for stimulation of gas? What were the problems of product contamination and potential ground shock damage? After two years of postshot work, some of these questions are being answered; more precisely, pressures, temperatures and concentrations of radioactive and non-radioactive constituents of the gas are being established. However, analyzing these quantities and their dependence on variables such as flow rates in terms of a self-consistent model of all the detonation phenomena has been a difficult and slow process. The validity of such a model must be tested by data from other detonations with geologies, reservoir properties and, perhaps, explosive yields different than those of Gasbuggy. The gas stimulation projects now being planned must be capable of furnishing some of these data before they can be called experiments in the fullest sense. (author)

  3. Gasbuggy in perspective

    Energy Technology Data Exchange (ETDEWEB)

    Holzer, Alfred [Lawrence Radiation Laboratory, University of California, Livermore, CA (United States)]

    1970-05-01

    The Gasbuggy experiment set out to answer a number of questions: To what degree could a low-permeability, gas-bearing formation be stimulated? What were the mechanisms responsible for stimulation of gas? What were the problems of product contamination and potential ground shock damage? After two years of postshot work, some of these questions are being answered; more precisely, pressures, temperatures and concentrations of radioactive and non-radioactive constituents of the gas are being established. However, analyzing these quantities and their dependence on variables such as flow rates in terms of a self-consistent model of all the detonation phenomena has been a difficult and slow process. The validity of such a model must be tested by data from other detonations with geologies, reservoir properties and, perhaps, explosive yields different than those of Gasbuggy. The gas stimulation projects now being planned must be capable of furnishing some of these data before they can be called experiments in the fullest sense. (author)

  4. Long-Term Hydrologic Monitoring Program: Project Gasbuggy Rio Arriba County, New Mexico

    International Nuclear Information System (INIS)

    1986-10-01

    The Gasbuggy site is located in Rio Arriba County, New Mexico, approximately 55 air miles (88.6 kilometers) east of Farmington, New Mexico. The Gasbuggy device, with a yield of 29 kilotons, was detonated December 10, 1967. It was the first US underground nuclear experiment for the stimulation of low-productivity natural gas reservoirs. The purpose of the Long-Term Hydrologic Monitoring Program at the Gasbuggy site is to obtain data that will assure public safety; to inform the public, the news media, and the scientific community relative to radiological contamination; and to document compliance with federal, state, and local antipollution requirements. The Gasbuggy site geographical setting, climate, geology, and hydrology are described. Site history, including Gasbuggy event information and Gasbuggy monitoring by the US Public Health Service, is described. Site cleanup activities conducted in 1978 are described. Postoperational surveys indicate that residual contamination at the Gasbuggy site is well below the established decontamination criteria and that no hazard exists or is likely to occur during public use of the land surface of the Gasbuggy site.

  5. An aerial radiological survey of Project Gasbuggy and surrounding area, Rio Arriba County, New Mexico. Date of survey: October 27, 1994

    International Nuclear Information System (INIS)

    1995-08-01

    An aerial radiological survey was conducted over the Project Gasbuggy site, 55 miles (89 kilometers) east of Farmington, New Mexico, on October 27, 1994. Parallel lines were flown at intervals of 300 feet (91 meters) over a 16-square-mile (41-square-kilometer) area at a 150-foot (46-meter) altitude centered on the Gasbuggy site. The gamma energy spectra obtained were reduced to an exposure rate contour map overlaid on a high altitude aerial photograph of the area. The terrestrial exposure rate varied from 14 to 20 microR/h at 1 meter above ground level. No anomalous or man-made isotopes were found

  6. Site Characterization Work Plan for Gasbuggy, New Mexico (Rev.1, Jan. 2002)

    Energy Technology Data Exchange (ETDEWEB)

    U.S. Department of Energy, National Nuclear Security Administration Nevada Operations Office (NNSA/NV)

    2002-01-25

    Project Gasbuggy was the first of three joint government-industry experiments conducted to test the effectiveness of nuclear explosives in fracturing deeply buried, low-permeability natural gas reservoirs to stimulate production. The scope of this work plan is to document the environmental objectives and the proposed technical site investigation strategies that will be utilized for the site characterization of the Project Gasbuggy Site. Its goal is the collection of data in sufficient quantity and quality to determine current site conditions, support a risk assessment for the site surfaces, and evaluate whether further remedial action is required to achieve permanent closure of the site that is protective of both human health and the environment. The Gasbuggy Site is located approximately 55 air miles east of Farmington, New Mexico, in Rio Arriba County within the Carson National Forest in the northeast portion of the San Juan Basin. Historically, Project Gasbuggy consisted of the joint government-industry detonation of a nuclear device on December 10, 1967, followed by reentry drilling, gas production testing, and project evaluation activities in post-detonation operations from 1967 to 1976. Based on historical documentation, no chemical release sites other than the mud pits were identified; additionally, there was no material buried at the Gasbuggy Site other than drilling fluids and construction debris. Although previous characterization and restoration activities, including sensitive species surveys, cultural resources surveys, surface geophysical surveys, and limited soil sampling and analysis, were performed in 1978 and again in 2000, no formal closure of the site was achieved. Also, these efforts did not adequately address the site's potential for chemical contamination at the surface/shallow subsurface ground levels or the subsurface hazards for potential migration outside of the current site subsurface intrusion restrictions. Additional investigation

  7. Gasbuggy reservoir evaluation - 1969 report

    International Nuclear Information System (INIS)

    Atkinson, C.H.; Ward, Don C.; Lemon, R.F.

    1970-01-01

    The December 10, 1967, Project Gasbuggy nuclear detonation followed the drilling and testing of two exploratory wells which confirmed reservoir characteristics and suitability of the site. Reentry and gas production testing of the explosive emplacement hole indicated a collapse chimney about 150 feet in diameter extending from the 4,240-foot detonation depth to about 3,900 feet, the top of the 300-foot-thick Pictured Cliffs gas sand. Production tests of the chimney well in the summer of 1968 and during the last 12 months have resulted in a cumulative production of 213 million cubic feet of hydrocarbons, and gas recovery in 20 years is estimated to be 900 million cubic feet, which would be an increase by a factor of at least 5 over estimated recovery from conventional field wells in this low permeability area. At the end of production tests the flow rate was 160,000 cubic feet per day, which is 6 to 7 times that of an average field well in the area. Data from reentry of a pre-shot test well and a new postshot well at distances from the detonation of 300 and 250 feet, respectively, indicate low productivity and consequently low permeability in any fractures at these locations. (author)

  8. Gasbuggy reservoir evaluation - 1969 report

    Energy Technology Data Exchange (ETDEWEB)

    Atkinson, C H; Ward, Don C [Bureau of Mines, U.S. Department of the Interior (United States)]; Lemon, R F [El Paso Natural Gas Company (United States)]

    1970-05-01

    The December 10, 1967, Project Gasbuggy nuclear detonation followed the drilling and testing of two exploratory wells which confirmed reservoir characteristics and suitability of the site. Reentry and gas production testing of the explosive emplacement hole indicated a collapse chimney about 150 feet in diameter extending from the 4,240-foot detonation depth to about 3,900 feet, the top of the 300-foot-thick Pictured Cliffs gas sand. Production tests of the chimney well in the summer of 1968 and during the last 12 months have resulted in a cumulative production of 213 million cubic feet of hydrocarbons, and gas recovery in 20 years is estimated to be 900 million cubic feet, which would be an increase by a factor of at least 5 over estimated recovery from conventional field wells in this low permeability area. At the end of production tests the flow rate was 160,000 cubic feet per day, which is 6 to 7 times that of an average field well in the area. Data from reentry of a pre-shot test well and a new postshot well at distances from the detonation of 300 and 250 feet, respectively, indicate low productivity and consequently low permeability in any fractures at these locations. (author)

  9. Gasbuggy, New Mexico Long-Term Hydrologic Monitoring Program Evaluation Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2009-06-01

    This report summarizes an evaluation of the Long-Term Hydrologic Monitoring Program (LTHMP) that has been conducted since 1972 at the Gasbuggy, New Mexico, underground nuclear detonation site. The nuclear testing was conducted by the U.S. Atomic Energy Commission under the Plowshare program, which is discussed in greater detail in Appendix A. The detonation at Gasbuggy took place in 1967, 4,240 feet below ground surface, and was designed to fracture the host rock of a low-permeability natural gas-bearing formation in an effort to improve gas production. The site has historically been managed under the Nevada Offsites Project. These underground nuclear detonation sites are within the United States but outside of the Nevada Test Site, where most of the experimental nuclear detonations conducted by the U.S. Government took place. Gasbuggy is managed by the U.S. Department of Energy (DOE) Office of Legacy Management (LM).

  10. Gasbuggy, New Mexico, Hydrologic and Natural Gas Sampling and Analysis Results for 2009

    International Nuclear Information System (INIS)

    2009-11-01

    The U.S. Department of Energy (DOE) Office of Legacy Management conducted hydrologic and natural gas sampling for the Gasbuggy, New Mexico, site on June 16 and 17, 2009. Hydrologic sampling consists of collecting water samples from water wells and surface water locations. Natural gas sampling consists of collecting both gas samples and samples of produced water from gas production wells. The water well samples were analyzed for gamma-emitting radionuclides and tritium. Surface water samples were analyzed for tritium. Water samples from gas production wells were analyzed for gamma-emitting radionuclides, gross alpha, gross beta, and tritium. Natural gas samples were analyzed for tritium and carbon-14. Water samples were analyzed by ALS Laboratory Group in Fort Collins, Colorado, and natural gas samples were analyzed by Isotech Laboratories in Champaign, Illinois. Concentrations of tritium and gamma-emitting radionuclides in water samples collected in the vicinity of the Gasbuggy site continue to demonstrate that the sample locations have not been impacted by detonation-related contaminants. Results from the sampling of natural gas from producing wells demonstrate that the gas wells nearest the Gasbuggy site are not currently impacted by detonation-related contaminants. Annual sampling of the gas production wells nearest the Gasbuggy site for gas and produced water will continue for the foreseeable future. The sampling frequency of water wells and surface water sources in the surrounding area will be reduced to once every 5 years. The next hydrologic sampling event at water wells, springs, and ponds will be in 2014.

  11. Gasbuggy Site Assessment and Risk Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2011-03-01

    This report describes the geologic and hydrologic conditions and evaluates potential health risks to workers in the natural gas industry in the vicinity of the Gasbuggy, New Mexico, site, where the U.S. Atomic Energy Commission detonated an underground nuclear device in 1967. The 29-kiloton detonation took place 4,240 feet below ground surface and was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation in the San Juan Basin, Rio Arriba County, New Mexico, on land administered by Carson National Forest. A site-specific conceptual model was developed based on current understanding of the hydrologic and geologic environment. This conceptual model was used for establishing plausible contaminant exposure scenarios, which were then evaluated for human health risk potential. The most mobile and, therefore, the most probable contaminant that could result in human exposure is tritium. Natural gas production wells were identified as having the greatest potential for bringing detonation-derived contaminants (tritium) to the ground surface in the form of tritiated produced water. Three exposure scenarios addressing potential contamination from gas wells were considered in the risk evaluation: a gas well worker during gas-well-drilling operations, a gas well worker performing routine maintenance, and a residential exposure. The residential exposure scenario was evaluated only for comparison; permanent residences on national forest lands at the Gasbuggy site are prohibited

  12. Gasbuggy Site Assessment and Risk Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    None

    2011-03-01

    The Gasbuggy site is in northern New Mexico in the San Juan Basin, Rio Arriba County (Figure 1-1). The Gasbuggy experiment was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation, a tight, gas-bearing sandstone formation. The 29-kiloton-yield nuclear device was placed in a 17.5-inch wellbore at 4,240 feet (ft) below ground surface (bgs), approximately 40 ft below the Pictured Cliffs/Lewis shale contact, in an attempt to force the cavity/chimney formed by the detonation up into the Pictured Cliffs Sandstone. The test was conducted below the southwest quarter of Section 36, Township 29 North, Range 4 West, New Mexico Principal Meridian. The device was detonated on December 10, 1967, creating a 335-ft-high chimney above the detonation point and a cavity 160 ft in diameter. The gas produced from GB-ER (the emplacement and reentry well) during the post-detonation production tests was radioactive and diluted, primarily by carbon dioxide. After 2 years, the energy content of the gas had recovered to 80 percent of the value of gas in conventionally developed wells in the area. There is currently no technology capable of remediating deep underground nuclear detonation cavities and chimneys. Consequently, the U.S. Department of Energy (DOE) must continue to manage the Gasbuggy site to ensure that no inadvertent intrusion into the residual contamination occurs. DOE has complete control over the 1/4 section (160 acres) containing the shot cavity, and no drilling is permitted on that property. However, oil and gas leases are on the surrounding land. Therefore, the most likely route of intrusion and potential exposure would be through contaminated natural gas or contaminated water migrating into a producing natural gas well outside the immediate vicinity of ground zero. The purpose of this report is to describe the current site conditions and evaluate the potential health risks posed by the most plausible

  13. Interpreting the chemical results of the Gasbuggy experiment

    International Nuclear Information System (INIS)

    Taylor, R.W.; Lee, E.L.; Hill, J.H.

    1970-01-01

    Nuclear explosions in carbonate-bearing rocks release large amounts of CO{sub 2}. In some cases, for example, when the explosion is contained and dolomite is the principal carbonate mineral, sufficient CO{sub 2} may be generated to drive the formation gas away from the chimney. Rocks which contain free carbon, such as the shales of the recent Gasbuggy and proposed Bronco and Dragon Trail experiments, will liberate CO and H{sub 2} in amounts predicted from the yield of the explosive and the C, CO{sub 2} and H{sub 2}O concentration in the rock. In general, the greater the amount of free carbon in a rock, the more H{sub 2} will be produced and the higher will be the fraction of tritium in the gas phase. (author)

  14. Interpreting the chemical results of the Gasbuggy experiment

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, R W; Lee, E L; Hill, J H [Lawrence Radiation Laboratory, University of California, Livermore, CA (United States)]

    1970-05-01

    Nuclear explosions in carbonate-bearing rocks release large amounts of CO{sub 2}. In some cases, for example, when the explosion is contained and dolomite is the principal carbonate mineral, sufficient CO{sub 2} may be generated to drive the formation gas away from the chimney. Rocks which contain free carbon, such as the shales of the recent Gasbuggy and proposed Bronco and Dragon Trail experiments, will liberate CO and H{sub 2} in amounts predicted from the yield of the explosive and the C, CO{sub 2} and H{sub 2}O concentration in the rock. In general, the greater the amount of free carbon in a rock, the more H{sub 2} will be produced and the higher will be the fraction of tritium in the gas phase. (author)
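
    As a back-of-the-envelope illustration of how gas release scales with explosive yield and rock composition (an energy bound with assumed numbers, not the authors' prediction method), the sketch below asks how much CO2 could be produced if a given fraction of the 29-kiloton yield went into decomposing calcite; the enthalpy value and the 10% energy fraction are assumptions.

        # Energy-bound sketch with assumed calcite chemistry and energy partition (illustrative only).
        YIELD_KT = 29.0                  # Gasbuggy explosive yield, kilotons
        J_PER_KT = 4.184e12              # joules per kiloton of TNT equivalent
        DH_CACO3_J_PER_MOL = 1.78e5      # approx. enthalpy to decompose CaCO3 -> CaO + CO2
        FRACTION_TO_CARBONATE = 0.10     # assumed share of the yield heating carbonate rock

        mol_co2 = YIELD_KT * J_PER_KT * FRACTION_TO_CARBONATE / DH_CACO3_J_PER_MOL
        print(f"upper-bound CO2 release ~ {mol_co2:.1e} mol (~{mol_co2 * 0.044 / 1000:.0f} tonnes)")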

  15. Project Gasbuggy well plugging and site restoration plan

    International Nuclear Information System (INIS)

    1978-07-01

    The operational plan for conducting the final restoration work at the site of the first U.S. underground nuclear experiment for the stimulation of low-productivity natural gas reservoirs is given. The plan includes well plugging procedures, surface facilities decontamination and removal procedures, radiological guidelines, and environmental considerations

  16. An evaluation of water production from the Gasbuggy reentry well

    Energy Technology Data Exchange (ETDEWEB)

    Power, Dean V; Bowman, Charles R [El Paso Natural Gas Company (United States)]

    1970-05-01

    During the gas production testing of the Gasbuggy chimney, water production rates increased from an initial 4 to 5 barrels per 10{sup 6} standard cubic feet of gas to 40 to 50 barrels per 10{sup 6} standard cubic feet of gas. This unexpected occurrence hampered operations and increased waste disposal costs. A model is developed which calculates the amount of water produced from condensation of water vapor through the cooling and expansion of the gas in the production tubing. Results from this model are compared with the observed water production from November of 1968 through May of 1969. This comparison shows that up to seven times more water is being produced at high gas flow rates than can be explained by condensed vapor, indicating that water is being introduced into the production tubing in particulate or liquid form. A correlation of excess water with the pressure, temperature and gas flow velocity parameters is performed to determine the relationship between this excess water and these parameters. It is found that the excess produced water varied linearly with downhole pressure when a threshold gas flow velocity was exceeded. The relationship is expressed by the equation H{sub 2}O (in barrels per day) = 126.5 - 0.1473 BHP (in pounds per square inch). The threshold gas velocity for excess water production was found to be about 6 feet per second in the 7-in casing or 40 feet per second in the 2 7/8-in tubing. An examination of the radioactivity of the gas and water produced from GB-E indicates that the tritiated water vapor in the chimney and tubing has been diluted by extraneous water. The tritium in the gas decreased as expected from about 10.9 {mu}Ci/SCF in November 1968 to 6.2 {mu}Ci/SCF in late February 1969. During this same period, the tritium in the water decreased from about 1.2 {mu}Ci/ml to 0.12 {mu}Ci/ml. Examination of water chemistry, preshot and during the production testing, indicates that at early times when there was no excess water, the produced

  17. An evaluation of water production from the Gasbuggy reentry well

    International Nuclear Information System (INIS)

    Power, Dean V.; Bowman, Charles R.

    1970-01-01

    During the gas production testing of the Gasbuggy chimney, water production rates increased from an initial 4 to 5 barrels per 10{sup 6} standard cubic feet of gas to 40 to 50 barrels per 10{sup 6} standard cubic feet of gas. This unexpected occurrence hampered operations and increased waste disposal costs. A model is developed which calculates the amount of water produced from condensation of water vapor through the cooling and expansion of the gas in the production tubing. Results from this model are compared with the observed water production from November of 1968 through May of 1969. This comparison shows that up to seven times more water is being produced at high gas flow rates than can be explained by condensed vapor, indicating that water is being introduced into the production tubing in particulate or liquid form. A correlation of excess water with the pressure, temperature and gas flow velocity parameters is performed to determine the relationship between this excess water and these parameters. It is found that the excess produced water varied linearly with downhole pressure when a threshold gas flow velocity was exceeded. The relationship is expressed by the equation H{sub 2}O (in barrels per day) = 126.5 - 0.1473 BHP (in pounds per square inch). The threshold gas velocity for excess water production was found to be about 6 feet per second in the 7-in casing or 40 feet per second in the 2 7/8-in tubing. An examination of the radioactivity of the gas and water produced from GB-E indicates that the tritiated water vapor in the chimney and tubing has been diluted by extraneous water. The tritium in the gas decreased as expected from about 10.9 μCi/SCF in November 1968 to 6.2 μCi/SCF in late February 1969. During this same period, the tritium in the water decreased from about 1.2 μCi/ml to 0.12 μCi/ml. Examination of water chemistry, preshot and during the production testing, indicates that at early times when there was no excess water, the produced water was distilled
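
    The linear excess-water fit quoted above is simple enough to state as code. The sketch below reproduces only the reported correlation and its threshold behaviour (using the tubing threshold of about 40 feet per second from the abstract); the underlying condensation model is not reproduced.

        # Reported fit: excess water (bbl/day) = 126.5 - 0.1473 * BHP (psi), above a threshold gas velocity.
        def excess_water_bbl_per_day(bottomhole_pressure_psi, gas_velocity_ft_per_s,
                                     threshold_ft_per_s=40.0):
            if gas_velocity_ft_per_s < threshold_ft_per_s:
                return 0.0                                    # below threshold: no liquid carry-over
            return max(0.0, 126.5 - 0.1473 * bottomhole_pressure_psi)

        print(excess_water_bbl_per_day(500.0, 45.0))          # ~52.9 bbl/day at 500 psi bottomhole pressure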

  18. Maintaining project consistency with transportation plans throughout the project life cycle with an emphasis on maintaining air quality conformity.

    Science.gov (United States)

    2014-11-01

    This document was developed for transportation professionals responsible for project development and has three basic goals: 1. Define project consistency and identify the causes of project inconsistencies and the critical junctures in the proje...

  19. Assessment of hydrologic transport of radionuclides from the Gasbuggy underground nuclear test site, New Mexico

    International Nuclear Information System (INIS)

    Earman, S.; Chapman, J.; Andricevic, R.

    1996-09-01

    The U.S. Department of Energy (DOE) is operating an environmental restoration program to characterize, remediate, and close non-Nevada Test Site locations that were used for nuclear testing. Evaluation of radionuclide transport by groundwater from these sites is an important part of the preliminary risk analysis. These evaluations are undertaken to allow prioritization of the test areas in terms of risk, provide a quantitative basis for discussions with regulators and the public about future work at the sites, and provide a framework for assessing data needs to be filled by site characterization. The Gasbuggy site in northwestern New Mexico was the location of an underground detonation of a 29-kiloton nuclear device in 1967. The test took place in the Lewis Shale, approximately 182 m below the Ojo Alamo Sandstone, which is the aquifer closest to the detonation horizon. The conservative assumption was made that tritium was injected from the blast-created cavity into the Ojo Alamo Sandstone by the force of the explosion, via fractures created by the shot. Model results suggest that if radionuclides produced by the shot entered the Ojo Alamo, they are most likely contained within the area currently administered by DOE. The transport calculations are most sensitive to changes in the mean groundwater velocity, followed by the variance in hydraulic conductivity, the correlation scale of hydraulic conductivity, the transverse hydrodynamic dispersion coefficient, and uncertainty in the source size. This modeling was performed to investigate how the uncertainty in various physical parameters affects calculations of radionuclide transport at the Gasbuggy site, and to serve as a starting point for discussion regarding further investigation at the site; it was not intended to be a definitive simulation of migration pathways or radionuclide concentration values
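
    To illustrate the kind of parameter sensitivity described above, the sketch below runs a toy Monte Carlo on advective travel time; every parameter value (conductivity distribution, gradient, porosity, distance) is an assumed placeholder, not a value from the report.

        # Toy Monte Carlo on advective travel time (all parameter values are assumed placeholders).
        import numpy as np

        rng = np.random.default_rng(0)
        K = rng.lognormal(mean=np.log(1e-6), sigma=1.0, size=10_000)   # hydraulic conductivity, m/s
        gradient, porosity, distance_m = 0.005, 0.2, 1000.0            # assumed gradient, porosity, path length

        velocity = K * gradient / porosity                             # seepage (pore) velocity, m/s
        travel_yr = distance_m / velocity / (3600 * 24 * 365.25)
        print(f"median {np.median(travel_yr):,.0f} yr; "
              f"5th-95th percentile {np.percentile(travel_yr, 5):,.0f}-{np.percentile(travel_yr, 95):,.0f} yr")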

  20. Consistency of climate change projections from multiple global and regional model intercomparison projects

    Science.gov (United States)

    Fernández, J.; Frías, M. D.; Cabos, W. D.; Cofiño, A. S.; Domínguez, M.; Fita, L.; Gaertner, M. A.; García-Díez, M.; Gutiérrez, J. M.; Jiménez-Guerrero, P.; Liguori, G.; Montávez, J. P.; Romera, R.; Sánchez, E.

    2018-03-01

    We present an unprecedented ensemble of 196 future climate projections arising from different global and regional model intercomparison projects (MIPs): CMIP3, CMIP5, ENSEMBLES, ESCENA, EURO- and Med-CORDEX. This multi-MIP ensemble includes all regional climate model (RCM) projections publicly available to date, along with their driving global climate models (GCMs). We illustrate consistent and conflicting messages using continental Spain and the Balearic Islands as the target region. The study considers near future (2021-2050) changes and their dependence on several uncertainty sources sampled in the multi-MIP ensemble: GCM, future scenario, internal variability, RCM, and spatial resolution. This initial work focuses on mean seasonal precipitation and temperature changes. The results show that the potential GCM-RCM combinations have been explored very unevenly, with favoured GCMs and large ensembles of a few RCMs that do not respond to any ensemble design. Therefore, the grand-ensemble is weighted towards a few models. The selection of a balanced, credible sub-ensemble is challenged in this study by illustrating several conflicting responses between the RCM and its driving GCM and among different RCMs. Sub-ensembles from different initiatives are dominated by different uncertainty sources, with the driving GCM being the main contributor to uncertainty in the grand-ensemble. For this analysis of near future changes, the emission scenario is not a strong source of uncertainty. Despite the extra computational effort, for mean seasonal changes, the increase in resolution does not lead to important changes.

  1. Full data consistency conditions for cone-beam projections with sources on a plane

    International Nuclear Information System (INIS)

    Clackdoyle, Rolf; Desbat, Laurent

    2013-01-01

    Cone-beam consistency conditions (also known as range conditions) are mathematical relationships between different cone-beam projections, and they therefore describe the redundancy or overlap of information between projections. These redundancies have often been exploited for applications in image reconstruction. In this work we describe new consistency conditions for cone-beam projections whose source positions lie on a plane. A further restriction is that the target object must not intersect this plane. The conditions require that moments of the cone-beam projections be polynomial functions of the source positions, with some additional constraints on the coefficients of the polynomials. A precise description of the consistency conditions is that the four parameters of the cone-beam projections (two for the detector, two for the source position) can be expressed with just three variables, using a certain formulation involving homogeneous polynomials. The main contribution of this work is our demonstration that these conditions are not only necessary, but also sufficient. Thus the consistency conditions completely characterize all redundancies, so no other independent conditions are possible and in this sense the conditions are full. The idea of the proof is to use the known consistency conditions for 3D parallel projections, and to then apply a 1996 theorem of Edholm and Danielsson that links parallel to cone-beam projections. The consistency conditions are illustrated with a simulation example. (paper)
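
    The cone-beam conditions above are obtained by linking to the classical parallel-beam (Helgason-Ludwig) moment conditions through the Edholm-Danielsson relation mentioned in the abstract. The sketch below numerically checks only the parallel-beam case, for a point-mass phantom, as an illustration; it is not an implementation of the paper's cone-beam conditions.

        # Check of the parallel-beam moment conditions: the k-th projection moment is a
        # degree-k polynomial in (cos t, sin t); M0 is constant, M1 is linear in the direction.
        import numpy as np

        rng = np.random.default_rng(1)
        pts = rng.uniform(-1.0, 1.0, size=(200, 2))          # point-mass phantom positions
        w = rng.uniform(0.5, 1.5, size=200)                  # point-mass weights
        bins = np.linspace(-1.6, 1.6, 201)
        centers = 0.5 * (bins[:-1] + bins[1:])

        for t in np.linspace(0.0, np.pi, 4, endpoint=False):
            n_vec = np.array([np.cos(t), np.sin(t)])
            proj, _ = np.histogram(pts @ n_vec, bins=bins, weights=w)   # discrete parallel projection
            M0, M1 = proj.sum(), proj @ centers
            print(f"t={t:.2f}  M0={M0:.3f} (constant)  M1={M1:.3f}  predicted {(w @ pts) @ n_vec:.3f}")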

  2. Tritium migration at the Gasbuggy site: Evaluation of possible hydrologic pathways

    International Nuclear Information System (INIS)

    Chapman, J.; Mihevc, T.; Lyles, B.

    1996-09-01

    An underground nuclear test named Gasbuggy was conducted in northwestern New Mexico in 1967. Subsequent groundwater monitoring in an overlying aquifer by the U.S. Environmental Protection Agency revealed increasing levels of tritium in monitoring well EPNG 10-36, located 132 m from the test, suggesting migration of contaminants from the nuclear cavity. There are three basic scenarios that could explain the occurrence of tritium in well 10-36: (1) introduction of tritium into the well from the land surface, (2) migration of tritium through the Ojo Alamo Formation, and (3) migration through the Pictured Cliffs Formation. The two subsurface transport scenarios were evaluated with a travel time analysis. In one, transport occurs to the Ojo Alamo sandstone either up the emplacement hole or through fractures created by the blast, and then laterally through the aquifer to the monitoring well. In the other, lateral transport occurs through fractures in the underlying Pictured Cliffs detonation horizon and then migrates up the monitoring well through plugged casing connecting the two formations. The travel time analysis indicates that the hydraulic conductivity measured in the Ojo Alamo Formation is too low for lateral transport to account for the observed arrival of tritium at the monitoring well. This suggests transport either through fractures intersecting the Ojo Alamo close to well EPNG 10-36, or through fractures in the Pictured Cliffs and up through the bottom plug in the well. The transport scenarios were investigated using hydrologic logging techniques and sampling at the monitoring well, with the fieldwork conducted after the removal of a string of 0.05-m-diameter tubing that had previously provided the only monitoring access
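
    A minimal travel-time comparison of the kind used in the analysis is sketched below with placeholder hydraulic parameters; only the 132 m distance to well EPNG 10-36 comes from the abstract. Low matrix conductivity gives arrival times of millennia, whereas a fracture-like pathway with a much higher effective conductivity arrives within a year or so.

        # Plug-flow advective travel time; the hydraulic parameters below are placeholders.
        def travel_time_years(distance_m, K_m_per_s, gradient, porosity):
            seepage_velocity = K_m_per_s * gradient / porosity          # m/s
            return distance_m / seepage_velocity / (3600 * 24 * 365.25)

        print(travel_time_years(132.0, K_m_per_s=1e-8, gradient=0.01, porosity=0.15))   # ~6,300 yr (matrix)
        print(travel_time_years(132.0, K_m_per_s=1e-4, gradient=0.01, porosity=0.15))   # ~0.6 yr (fracture-like)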

  3. Implementing a Systematic Process for Consistent Nursing Care in a NICU: A Quality Improvement Project.

    Science.gov (United States)

    McCarley, Renay Marie; Dowling, Donna A; Dolansky, Mary A; Bieda, Amy

    2018-03-01

    The global aim of this quality improvement project was to develop and implement a systematic process to assign and maintain consistent bedside nurses for infants and families. A systematic process based on a primary care nursing model was implemented to assign consistent care for a 48-bed, single-family room NICU. Four PDSA cycles were necessary to obtain agreement from the nursing staff as to the best process for assigning primary nurses. Post-intervention data revealed a 9.5 percent decrease of consistent caregivers for infants in the NICU ≤ 28 days and a 2.3 percent increase of consistent caregivers for infants in the NICU ≥ 29 days. Although these findings did not meet the goal of the specific aim, a systematic process was created to assign bedside nurses to infants. Further PDSAs will be needed to refine the process to reach the aim.

  4. Large rainfall changes consistently projected over substantial areas of tropical land

    Science.gov (United States)

    Chadwick, Robin; Good, Peter; Martin, Gill; Rowell, David P.

    2016-02-01

    Many tropical countries are exceptionally vulnerable to changes in rainfall patterns, with floods or droughts often severely affecting human life and health, food and water supplies, ecosystems and infrastructure. There is widespread disagreement among climate model projections of how and where rainfall will change over tropical land at the regional scales relevant to impacts, with different models predicting the position of current tropical wet and dry regions to shift in different ways. Here we show that despite uncertainty in the location of future rainfall shifts, climate models consistently project that large rainfall changes will occur for a considerable proportion of tropical land over the twenty-first century. The area of semi-arid land affected by large changes under a higher emissions scenario is likely to be greater than during even the most extreme regional wet or dry periods of the twentieth century, such as the Sahel drought of the late 1960s to 1990s. Substantial changes are projected to occur by mid-century--earlier than previously expected--and to intensify in line with global temperature rise. Therefore, current climate projections contain quantitative, decision-relevant information on future regional rainfall changes, particularly with regard to climate change mitigation policy.

  5. A Consistent Fuzzy Preference Relations Based ANP Model for R&D Project Selection

    Directory of Open Access Journals (Sweden)

    Chia-Hua Cheng

    2017-08-01

    In today’s rapidly changing economy, technology companies have to make decisions on research and development (R&D) project investment on a routine basis, with such decisions having a direct impact on that company’s profitability, sustainability and future growth. Companies seeking profitable opportunities for investment and project selection must consider many factors such as resource limitations and differences in assessment, with consideration of both qualitative and quantitative criteria. Often, differences in perception by the various stakeholders hinder the attainment of a consensus of opinion and coordination efforts. Thus, in this study, a hybrid model is developed for the consideration of the complex criteria, taking into account the different opinions of the various stakeholders, who often come from different departments within the company and have different opinions about which direction to take. The decision-making trial and evaluation laboratory (DEMATEL) approach is used to convert the cause and effect relations representing the criteria into a visual network structure. A consistent fuzzy preference relations based analytic network process (CFPR-ANP) method is developed to calculate the preference-weights of the criteria based on the derived network structure. The CFPR-ANP is an improvement over the original analytic network process (ANP) method in that it reduces the problem of inconsistency as well as the number of pairwise comparisons. The combined complex proportional assessment (COPRAS-G) method is applied with fuzzy grey relations to resolve conflicts arising from differences in information and opinions provided by the different stakeholders about the selection of the most suitable R&D projects. This novel combination approach is then used to assist an international brand-name company to prioritize projects and make project decisions that will maximize returns and ensure sustainability for the company.
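
    As a sketch of the CFPR building block only (the generic additive-transitivity construction, not the paper's full CFPR-ANP workflow): a complete, consistent preference matrix can be filled in from just the n-1 comparisons between adjacent alternatives, which is how the approach reduces both the number of pairwise comparisons and the scope for inconsistency. The example values are made up.

        # Generic consistent-fuzzy-preference-relation completion (illustrative values, not the paper's data).
        import numpy as np

        def cfpr_complete(adjacent):
            """adjacent[i] = preference of alternative i over i+1 on a 0-1 scale."""
            n = len(adjacent) + 1
            P = np.full((n, n), 0.5)
            for i in range(n):
                for k in range(i + 1, n):
                    P[i, k] = sum(adjacent[i:k]) - (k - i - 1) / 2    # additive transitivity
                    P[k, i] = 1.0 - P[i, k]                           # additive reciprocity
            c = max(-P.min(), P.max() - 1.0, 0.0)                     # rescale into [0, 1] if needed
            return (P + c) / (1 + 2 * c) if c > 0 else P

        print(np.round(cfpr_complete([0.7, 0.6, 0.8]), 2))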

  6. Project consistency with transportation plans and air quality conformity workshops : technical report.

    Science.gov (United States)

    2015-04-01

    This implementation project supports streamlined project delivery, one of the goals outlined by the Texas Department of Transportation (TxDOT) leadership to achieve an efficient and effective transportation system in Texas. The project benefits T...

  7. Maintaining project consistency with transportation plans throughout the project life cycle with an emphasis on maintaining air quality conformity: technical report.

    Science.gov (United States)

    2016-07-01

    Federal and state transportation planning statutory and regulatory laws require transportation projects to be consistent with transportation plans and improvement programs before a federal action can be taken on a project requiring one. Significa...

  8. Self-consistent EXAFS PDF Projection Method by Matched Correction of Fourier Filter Signal Distortion

    International Nuclear Information System (INIS)

    Lee, Jay Min; Yang, Dong-Seok

    2007-01-01

    Inverse-problem computation was performed to solve for the PDF (pair distribution function) from simulated EXAFS data generated with FEFF. For a realistic comparison with experimental data, we chose a model of the first sub-shell Mn-O pair showing the Jahn-Teller distortion in crystalline LaMnO3. To restore the Fourier-filtering signal distortion involved when the first sub-shell information is isolated from higher-shell contents, the relevant distortion matching function was computed initially from the proximity model, and iteratively from the prior guess during consecutive regularization computations. Adaptive computation of the EXAFS background correction is an issue for algorithm development, but our preliminary test was performed under a simulated background correction that perfectly excludes the higher-shell interference. In our numerical results, efficient convergence of the iterative solution indicates a self-consistent tendency: a true PDF solution is confirmed as a counterpart of the genuine chi-data, provided that a background correction function is iteratively solved using an extended algorithm of MEPP (Matched EXAFS PDF Projection) under development

  9. Funding Medical Research Projects: Taking into Account Referees' Severity and Consistency through Many-Faceted Rasch Modeling of Projects' Scores.

    Science.gov (United States)

    Tesio, Luigi; Simone, Anna; Grzeda, Mariuzs T; Ponzio, Michela; Dati, Gabriele; Zaratin, Paola; Perucca, Laura; Battaglia, Mario A

    2015-01-01

    The funding policy of research projects often relies on scores assigned by a panel of experts (referees). The non-linear nature of raw scores and the severity and inconsistency of individual raters may generate unfair numeric project rankings. Rasch measurement (many-facets version, MFRM) provides a valid alternative to scoring. MFRM was applied to the scores achieved by 75 research projects on multiple sclerosis sent in response to a previous annual call by FISM-Italian Foundation for Multiple Sclerosis. This made it possible to simulate, a posteriori, the impact of MFRM on the funding scenario. The applications were each scored by 2 to 4 independent referees (total = 131) on a 10-item, 0-3 rating scale called FISM-ProQual-P. The rotation plan assured "connection" of all pairs of projects through at least 1 shared referee. The questionnaire satisfactorily fulfilled the stringent criteria of Rasch measurement for psychometric quality (unidimensionality, reliability and data-model fit). Arbitrarily, 2 acceptability thresholds were set at a raw score of 21/30 and at the equivalent Rasch measure of 61.5/100, respectively. When the cut-off was switched from score to measure, 8 out of 18 acceptable projects had to be rejected, while 15 rejected projects became eligible for funding. Some referees, of various severity, were grossly inconsistent (z-std fit indexes less than -1.9 or greater than 1.9). The FISM-ProQual-P questionnaire seems a valid and reliable scale. MFRM may help the decision-making process for allocating funds to MS research projects, and also in other fields. In repeated assessment exercises it can help the selection of reliable referees. Their severity can be steadily calibrated, thus obviating the need to connect them with other referees assessing the same projects.
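
    For readers unfamiliar with MFRM, the sketch below evaluates many-facet Rasch (rating-scale) category probabilities for a single project-item-referee combination, using made-up logit values rather than the calibrated FISM-ProQual-P estimates; it shows how a more severe referee shifts probability toward lower score categories, which is the effect the measure-based ranking corrects for.

        # Many-facet Rasch (rating-scale form) category probabilities; parameter values are illustrative.
        import numpy as np

        def mfrm_category_probs(project_measure, item_difficulty, referee_severity, thresholds):
            """P(score = k), k = 0..K, for one project, one item, one referee (all in logits)."""
            eta = project_measure - item_difficulty - referee_severity
            cumulative = np.concatenate(([0.0], np.cumsum(eta - np.asarray(thresholds))))
            p = np.exp(cumulative - cumulative.max())         # subtract the max for numerical stability
            return p / p.sum()

        thresholds = [-1.0, 0.0, 1.0]                         # three thresholds for the 0-3 rating scale
        print(np.round(mfrm_category_probs(0.5, 0.0, +1.0, thresholds), 3))   # severe referee
        print(np.round(mfrm_category_probs(0.5, 0.0, -1.0, thresholds), 3))   # lenient referee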

  10. Consistency of lattice definitions of U(1) flux in Abelian projected SU(2) gauge theory

    International Nuclear Information System (INIS)

    Matsuki, Takayuki; Haymaker, Richard W.

    2004-01-01

    We reexamine the dual Abrikosov vortex under the requirement that the lattice averages of the fields satisfy exact Maxwell equations [ME]. The electric ME accounts for the total flux and the magnetic ME determines the shape of the confining string. This leads to unique and consistent definitions of flux and electric and magnetic currents at finite lattice spacing. The resulting modification of the standard DeGrand-Toussaint construction gives a magnetic current comprised of smeared monopoles
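
    For context, the sketch below shows the standard DeGrand-Toussaint decomposition that the abstract says is being modified: each compact U(1) plaquette angle is split into a physical flux in (-pi, pi] plus 2*pi times an integer Dirac-string content, and the monopole charge in an elementary cube is the oriented sum of those integers over its faces. This is the textbook construction, not the paper's smeared variant.

        # Standard DeGrand-Toussaint split of a plaquette angle (not the paper's modified construction).
        import math

        def dgt_decompose(theta_plaquette):
            """Return (physical flux in (-pi, pi], integer Dirac-string number)."""
            n = math.floor((theta_plaquette + math.pi) / (2 * math.pi))
            return theta_plaquette - 2 * math.pi * n, n

        print(dgt_decompose(3.5))     # -> (-2.78..., 1): one Dirac string pierces this plaquette
        print(dgt_decompose(-0.4))    # -> (-0.4, 0): no string content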

  11. Samurai project: Verifying the consistency of black-hole-binary waveforms for gravitational-wave detection

    OpenAIRE

    Hannam, Mark; Husa, Sascha; Baker, John G.; Boyle, Michael; Brügmann, Bernd; Chu, Tony; Dorband, Nils; Herrmann, Frank; Hinder, Ian; Kelly, Bernard J.; Kidder, Lawrence E.; Laguna, Pablo; Matthews, Keith D.; van-Meter, James R.; Pfeiffer, Harald P.

    2009-01-01

    We quantify the consistency of numerical-relativity black-hole-binary waveforms for use in gravitational-wave (GW) searches with current and planned ground-based detectors. We compare previously published results for the (ℓ=2, |m|=2) mode of the gravitational waves from an equal-mass nonspinning binary, calculated by five numerical codes. We focus on the 1000M (about six orbits, or 12 GW cycles) before the peak of the GW amplitude and the subsequent ringdown. W...

  12. Samurai project: Verifying the consistency of black-hole-binary waveforms for gravitational-wave detection

    Science.gov (United States)

    Hannam, Mark; Husa, Sascha; Baker, John G.; Boyle, Michael; Brügmann, Bernd; Chu, Tony; Dorband, Nils; Herrmann, Frank; Hinder, Ian; Kelly, Bernard J.; Kidder, Lawrence E.; Laguna, Pablo; Matthews, Keith D.; van Meter, James R.; Pfeiffer, Harald P.; Pollney, Denis; Reisswig, Christian; Scheel, Mark A.; Shoemaker, Deirdre

    2009-04-01

    We quantify the consistency of numerical-relativity black-hole-binary waveforms for use in gravitational-wave (GW) searches with current and planned ground-based detectors. We compare previously published results for the (ℓ=2,|m|=2) mode of the gravitational waves from an equal-mass nonspinning binary, calculated by five numerical codes. We focus on the 1000M (about six orbits, or 12 GW cycles) before the peak of the GW amplitude and the subsequent ringdown. We find that the phase and amplitude agree within each code’s uncertainty estimates. The mismatch between the (ℓ=2,|m|=2) modes is better than 10⁻³ for binary masses above 60M⊙ with respect to the Enhanced LIGO detector noise curve, and for masses above 180M⊙ with respect to Advanced LIGO, Virgo, and Advanced Virgo. Between the waveforms with the best agreement, the mismatch is below 2×10⁻⁴. We find that the waveforms would be indistinguishable in all ground-based detectors (and for the masses we consider) if detected with a signal-to-noise ratio of less than ≈14, or less than ≈25 in the best cases.

  13. Samurai project: Verifying the consistency of black-hole-binary waveforms for gravitational-wave detection

    International Nuclear Information System (INIS)

    Hannam, Mark; Husa, Sascha; Baker, John G.; Kelly, Bernard J.; Boyle, Michael; Bruegmann, Bernd; Chu, Tony; Matthews, Keith D.; Pfeiffer, Harald P.; Scheel, Mark A.; Dorband, Nils; Pollney, Denis; Reisswig, Christian; Herrmann, Frank; Hinder, Ian; Kidder, Lawrence E.; Laguna, Pablo; Shoemaker, Deirdre

    2009-01-01

    We quantify the consistency of numerical-relativity black-hole-binary waveforms for use in gravitational-wave (GW) searches with current and planned ground-based detectors. We compare previously published results for the (ℓ=2,|m|=2) mode of the gravitational waves from an equal-mass nonspinning binary, calculated by five numerical codes. We focus on the 1000M (about six orbits, or 12 GW cycles) before the peak of the GW amplitude and the subsequent ringdown. We find that the phase and amplitude agree within each code's uncertainty estimates. The mismatch between the (ℓ=2,|m|=2) modes is better than 10⁻³ for binary masses above 60M⊙ with respect to the Enhanced LIGO detector noise curve, and for masses above 180M⊙ with respect to Advanced LIGO, Virgo, and Advanced Virgo. Between the waveforms with the best agreement, the mismatch is below 2×10⁻⁴. We find that the waveforms would be indistinguishable in all ground-based detectors (and for the masses we consider) if detected with a signal-to-noise ratio of less than ≅14, or less than ≅25 in the best cases.
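
    The mismatch figures quoted in the three records above come from a noise-weighted overlap between waveforms. The sketch below computes a simplified version with a flat (white) noise spectrum and maximisation over relative time shift only, so it illustrates the quantity itself rather than reproducing the detector-noise-weighted values; the toy waveforms are made up.

        # Simplified waveform mismatch: white noise PSD, maximised over time shift (illustration only).
        import numpy as np

        def mismatch(h1, h2):
            H1, H2 = np.fft.fft(h1), np.fft.fft(h2)
            corr = np.fft.ifft(H1 * np.conj(H2))              # inner product at every relative time lag
            norm = np.sqrt(np.sum(h1 ** 2) * np.sum(h2 ** 2))
            return 1.0 - np.max(np.abs(corr)) / norm

        t = np.linspace(0.0, 1.0, 4096, endpoint=False)
        h_a = np.exp(-2 * t) * np.sin(2 * np.pi * 50.0 * t)           # toy damped sinusoid
        h_b = np.exp(-2 * t) * np.sin(2 * np.pi * 50.5 * t + 0.3)     # slightly detuned copy
        print(f"mismatch between the detuned waveforms ~ {mismatch(h_a, h_b):.3f}")
        print(f"self-mismatch ~ {mismatch(h_a, h_a):.1e}")            # ~0 by construction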

  14. Determining the explosion effects on the Gasbuggy reservoir from computer simulation of the postshot gas production history

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, Leo A [El Paso Natural Gas Company (United States)]

    1970-05-01

    Analysis of the gas production data from Gasbuggy to deduce reservoir properties outside the chimney is complicated by the large gas storage volume in the chimney because the gas flow from the surrounding reservoir into the chimney cannot be directly measured. This problem was overcome by developing a chimney volume factor F (M{sup 2}CF/PSI) based upon analysis of rapid drawdowns during the production tests. The chimney volume factor was in turn used to construct the time history of the required influx of gas into the chimney from the surrounding reservoir. The most probable value of F to describe the chimney is found to be 0.150 M{sup 2}CF/PSI. Postulated models of the reservoir properties outside the chimney are examined by calculating the pressure distribution and flow of gas through the reservoir with the experimentally observed chimney pressure history applied to the cavity wall. The calculated influx from the reservoir into the chimney is then compared to the required influx and the calculated pressure at a radius of 300 feet is compared to the observed pressures in a shut-in satellite well (GB-2RS) which intersects the gas-bearing formation 300 feet from the center of the chimney. A description of the mathematics in the computer program used to perform the calculations is given. Gas flow for a radial model wherein permeability and porosity are uniform through the gas producing sand outside the chimney was calculated for several values of permeability. These calculations indicated that for the first drawdown test (July 1968) the permeability-producing height product (kh) was in the region of 15 to 30 millidarcy-feet (md-ft) and that after several months of testing, the effective kh had dropped to less than 8 md-ft. Calculations wherein (1) the permeability decreases from the chimney out to the 'fracture' radius, and (2) an increased production height is used near the chimney, match the data better than the simple radial model. Reasonable fits to the data for

  15. Determining the explosion effects on the Gasbuggy reservoir from computer simulation of the postshot gas production history

    International Nuclear Information System (INIS)

    Rogers, Leo A.

    1970-01-01

    Analysis of the gas production data from Gasbuggy to deduce reservoir properties outside the chimney is complicated by the large gas storage volume in the chimney because the gas flow from the surrounding reservoir into the chimney cannot be directly measured. This problem was overcome by developing a chimney volume factor F (M{sup 2}CF/PSI) based upon analysis of rapid drawdowns during the production tests. The chimney volume factor was in turn used to construct the time history of the required influx of gas into the chimney from the surrounding reservoir. The most probable value of F to describe the chimney is found to be 0.150 M{sup 2}CF/PSI. Postulated models of the reservoir properties outside the chimney are examined by calculating the pressure distribution and flow of gas through the reservoir with the experimentally observed chimney pressure history applied to the cavity wall. The calculated influx from the reservoir into the chimney is then compared to the required influx and the calculated pressure at a radius of 300 feet is compared to the observed pressures in a shut-in satellite well (GB-2RS) which intersects the gas-bearing formation 300 feet from the center of the chimney. A description of the mathematics in the computer program used to perform the calculations is given. Gas flow for a radial model wherein permeability and porosity are uniform through the gas producing sand outside the chimney was calculated for several values of permeability. These calculations indicated that for the first drawdown test (July 1968) the permeability-producing height product (kh) was in the region of 15 to 30 millidarcy-feet (md-ft) and that after several months of testing, the effective kh had dropped to less than 8 md-ft. Calculations wherein (1) the permeability decreases from the chimney out to the 'fracture' radius, and (2) an increased production height is used near the chimney, match the data better than the simple radial model. Reasonable fits to the data for the
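
    The chimney volume factor acts as a storage coefficient, so the required influx follows from a simple material balance; the sketch below shows that balance with made-up pressure and production numbers (only the value of F is taken from the abstract, read here as 0.150 million cubic feet per psi).

        # Chimney material balance: stored gas ~ F * P, so reservoir influx = production + F * dP/dt.
        # The pressure and production values below are placeholders, not Gasbuggy test data.
        F_CF_PER_PSI = 0.150e6            # 0.150 M{sup 2}CF/psi read as 150,000 cubic feet per psi

        def required_influx_cf_per_day(p_start_psi, p_end_psi, produced_cf, dt_days):
            dP_per_day = (p_end_psi - p_start_psi) / dt_days
            return produced_cf / dt_days + F_CF_PER_PSI * dP_per_day

        # Pressure falling 5 psi/day while producing 1 MMcf/day -> ~0.25 MMcf/day entering the chimney.
        print(required_influx_cf_per_day(900.0, 895.0, produced_cf=1.0e6, dt_days=1.0))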

  16. WE-AB-207A-02: John’s Equation Based Consistency Condition and Incomplete Projection Restoration Upon Circular Orbit CBCT

    International Nuclear Information System (INIS)

    Ma, J; Qi, H; Wu, S; Xu, Y; Zhou, L; Yan, H

    2016-01-01

    Purpose: In transmitted X-ray tomography imaging, projections are sometimes incomplete for a variety of reasons, such as geometry inaccuracy, defective detector cells, etc. To address this issue, we have derived a direct consistency condition based on John’s Equation, and proposed a method to effectively restore incomplete projections based on this consistency condition. Methods: Through parameter substitutions, we have derived a direct consistency condition equation from John’s equation, in which the left side involves only the derivative of the projection with respect to the view angle and the right side involves derivatives of the projection with respect to the other geometrical parameters. Based on this consistency condition, a projection restoration method is proposed, which includes five steps: 1) Forward projecting the reconstructed image and using linear interpolation to estimate the incomplete projections as the initial result; 2) Performing Fourier transform on the projections; 3) Restoring the incomplete frequency data using the consistency condition equation; 4) Performing inverse Fourier transform; 5) Repeating steps 2) through 4) until our termination criterion is met. Results: A beam-blocking-based scatter correction case and a bad-pixel correction case were used to demonstrate the efficacy and robustness of our restoration method. The mean absolute error (MAE), signal-to-noise ratio (SNR) and mean square error (MSE) were employed as our evaluation metrics of the reconstructed images. For the scatter correction case, the MAE is reduced from 63.3% to 71.7% with 4 iterations. Compared with the existing Patch’s method, the MAE of our method is further reduced by 8.72%. For the bad-pixel case, the SNR of the reconstructed image by our method is increased from 13.49% to 21.48%, with the MSE being decreased by 45.95%, compared with the linear interpolation method. Conclusion: Our studies have demonstrated that our restoration method based on the new consistency condition could effectively restore the incomplete projections
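
    The five-step procedure above reads naturally as an iterative loop. The skeleton below mirrors only that structure; the reconstruction, forward-projection, and John's-equation consistency operators are left as caller-supplied placeholders and are not reproduced here, so this is an outline of the workflow rather than the authors' implementation.

        # Skeleton of the five-step restoration loop; the three operator arguments are placeholders.
        import numpy as np

        def restore_projections(measured, bad_mask, reconstruct, forward_project,
                                apply_consistency, n_iter=4):
            proj = measured.astype(float)
            # Step 1: estimate the missing rays from a first-pass reconstruction.
            proj[bad_mask] = forward_project(reconstruct(proj))[bad_mask]
            for _ in range(n_iter):                           # Step 5: iterate until the criterion is met
                spectrum = np.fft.fft2(proj)                  # Step 2: Fourier transform
                spectrum = apply_consistency(spectrum)        # Step 3: enforce the consistency condition
                restored = np.fft.ifft2(spectrum).real        # Step 4: inverse Fourier transform
                proj[bad_mask] = restored[bad_mask]           # keep measured rays, update the bad ones
            return proj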

  17. 0-6758 : maintaining project consistency with transportation plans through the project life cycle with an emphasis on maintaining air quality conformity.

    Science.gov (United States)

    2014-03-01

    Streamlined project delivery is a federally mandated goal that the Texas Department of Transportation (TxDOT) leadership supports to achieve a more efficient and effective transportation system in Texas. Federal and state transportation pla...

  18. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.

  19. Consistency in PERT problems

    OpenAIRE

    Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan

    2016-01-01

    The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.
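
    As a hedged illustration of the Shapley rule being characterized (the generic definition applied to a toy delay-cost function, not the paper's PERT-specific model): each activity is charged its average marginal contribution to the joint delay cost over all orderings of the activities.

        # Generic Shapley allocation of a joint delay cost; the cost function and delays are assumed.
        from itertools import permutations

        def shapley(players, cost):
            shares = {p: 0.0 for p in players}
            orders = list(permutations(players))
            for order in orders:
                coalition = []
                for p in order:
                    before = cost(tuple(coalition))
                    coalition.append(p)
                    shares[p] += cost(tuple(coalition)) - before      # marginal contribution of p
            return {p: round(v / len(orders), 3) for p, v in shares.items()}

        delays = {"A": 2, "B": 5, "C": 3}                             # assumed activity delays
        def total_cost(S):                                            # cost = slowest activity in the group
            return max((delays[p] for p in S), default=0)

        print(shapley(list(delays), total_cost))                      # {'A': 0.667, 'B': 3.167, 'C': 1.167}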

  20. Economic Effectiveness Evaluation In Projects Consisting In Automating Settlement Systems. The Example Of A Financial Centre Of An International Automotive Company

    OpenAIRE

    Miroslaw Dyczkowski

    2010-01-01

    Economic effectiveness has become a decisive factor in feasibility studies for IT projects due to the deteriorating economic situation. This paper characterises such an assessment using the example of an automated settlement system for a supply chain. The described project was carried out by a financial centre – located in Poland – of an international company operating in the automotive industry. It aimed at applying EDI technologies to automate logistics. The project...

  1. Improving consistency in findings from pharmacoepidemiological studies: The IMI-protect (Pharmacoepidemiological research on outcomes of therapeutics by a European consortium) project

    NARCIS (Netherlands)

    De Groot, Mark C.H.; Schlienger, Raymond; Reynolds, Robert; Gardarsdottir, Helga; Juhaeri, Juhaeri; Hesse, Ulrik; Gasse, Christiane; Rottenkolber, Marietta; Schuerch, Markus; Kurz, Xavier; Klungel, Olaf H.

    2013-01-01

    Background: Pharmacoepidemiological (PE) research should provide consistent, reliable and reproducible results to contribute to the benefit-risk assessment of medicines. IMI-PROTECT aims to identify sources of methodological variations in PE studies using a common protocol and analysis plan across

  2. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  3. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  4. Bitcoin Meets Strong Consistency

    OpenAIRE

    Decker, Christian; Seidel, Jochen; Wattenhofer, Roger

    2014-01-01

    The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...

  5. Consistent classical supergravity theories

    International Nuclear Information System (INIS)

    Muller, M.

    1989-01-01

    This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included

  6. Projectables

    DEFF Research Database (Denmark)

    Rasmussen, Troels A.; Merritt, Timothy R.

    2017-01-01

    CNC cutting machines have become essential tools for designers and architects, enabling rapid prototyping, model-building and production of high quality components. Designers often cut from new materials, discarding the irregularly shaped remains. We introduce ProjecTables, a visual augmented reality system for interactive packing of model parts onto sheet materials. ProjecTables enables designers to (re)use scrap materials for CNC cutting that would previously have been thrown away, at the same time supporting aesthetic choices related to wood grain, avoiding surface blemishes, and other relevant material properties. We conducted evaluations of ProjecTables with design students from Aarhus School of Architecture, demonstrating that participants could quickly and easily place and orient model parts, reducing material waste. Contextual interviews and ideation sessions led to a deeper...

  7. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of their argument to more complex systems.

  8. Quasiparticles and thermodynamical consistency

    International Nuclear Information System (INIS)

    Shanenko, A.A.; Biro, T.S.; Toneev, V.D.

    2003-01-01

    A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)
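
    Schematically, and with conventions that vary between papers, the consistency requirement for a quasiparticle model with a medium-dependent effective mass m*(T, mu) can be written (in LaTeX) as a stationarity condition on the pressure:

    p(T,\mu) = p_{\mathrm{id}}\bigl(T,\mu;\,m^{*}(T,\mu)\bigr) - B(T,\mu),
    \qquad
    \left.\frac{\partial p}{\partial m^{*}}\right|_{T,\mu} = 0,

    which fixes the background function B(T, mu) and guarantees that the entropy and particle densities s = \partial p/\partial T and n = \partial p/\partial\mu keep their quasiparticle form, so that the Maxwell relations invoked via the finite-temperature Hellmann-Feynman argument are respected. This is only a sketch of the generic relation; the precise form used in the article may differ.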

  9. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and Stakeholder Relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...

  10. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  11. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges with large-scale data management systems is to ensure consistency between the global file catalog and what is physically present on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka Dark Data). The system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we will present this system, explain its internals and give some results.
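
    At its core, the detection of lost files and Dark Data described above is a comparison between the file catalog and a dump of the storage element. The toy sketch below (file names invented, and far simpler than the actual Rucio consistency machinery) shows the two set differences involved.

    def classify_inconsistencies(catalog_files, storage_files):
        """Compare catalog entries against a storage dump.
        Returns (lost, dark): files to recover and files eligible for deletion."""
        catalog = set(catalog_files)
        storage = set(storage_files)
        lost = catalog - storage   # registered in the catalog but physically missing
        dark = storage - catalog   # physically present but unknown to the catalog
        return lost, dark

    lost, dark = classify_inconsistencies(
        ["/data/f1", "/data/f2", "/data/f3"],       # catalog entries (hypothetical)
        ["/data/f2", "/data/f3", "/data/orphan"])   # storage dump (hypothetical)
    print("recover:", lost, "delete:", dark)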

  12. Is cosmology consistent?

    International Nuclear Information System (INIS)

    Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias

    2002-01-01

    We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest neutrino mass range 0.04-4.2 eV and the sharpest constraints to date on gravity waves which (together with preference for a slight red-tilt) favor 'small-field' inflation models

  13. Consistent Quantum Theory

    Science.gov (United States)

    Griffiths, Robert B.

    2001-11-01

    Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account written by one of the main figures in the field; paperback edition of a successful work on the philosophy of quantum mechanics.

  14. Reasons for decision in the matter of Enbridge Southern Lights GP on behalf of Enbridge Southern Lights LP and Enbridge Pipelines Inc. : facilities[Application dated 9 March 2007 for the Southern Lights Project consisting of the: 1. Diluent Pipeline Project, and 2. Capacity Replacement Project

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-02-15

    In March 2007, Enbridge Southern Lights (ESL) GP on behalf of Enbridge Southern Lights LP and Enbridge Pipelines Inc. (EPI) applied for approvals related to the Southern Lights Project. The first component of the project involves the construction of a pipeline to transport diluent from Chicago, Illinois to Edmonton on Line 13, an existing EPI Mainline pipeline. The second component of the project involves a Capacity Replacement Project to replace the loss of southbound capacity on the EPI Mainline system resulting from the transfer of Line 13 to diluent service. The application required Board approvals for several related project components, including the transfer of ownership of EPI's Line 13 to ESL; the removal of Line 13 from southbound crude oil delivery service; reversing the flow on Line 13 to carry diluent from the Canada/US border northbound to Edmonton, Alberta; constructing a new oil pipeline to transport light sour crude oil; physical changes and alterations to EPI's Line 2; and, appropriate tolls and tariffs. The Board determined that the Southern Lights Project is an innovative and cost-effective solution to transport diluent. The applicants demonstrated sufficient diluent shipping commitments to ensure the long term viability of the pipeline. The Board found that the proposal to build new facilities on existing EPI sites and right-of-way should minimize negative impacts on area landowners, and also judged that mitigation planned for the construction phase will minimize potential adverse effects. The Board will require ESL to conduct an emergency response exercise where Line 13 crosses the South Saskatchewan River. Ongoing discussions between the applicants and Aboriginal groups, and a Heritage Resource Discovery Contingency Plan, will reduce potential impacts to traditional use sites. Having reviewed all evidence, the Board approved applications for the Southern Lights Pipeline Project, worth an estimated $247.5 million in Canadian spending

  15. Replica consistency in a Data Grid

    International Nuclear Information System (INIS)

    Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt

    2004-01-01

    A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented
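
    As a much-simplified illustration of the replica consistency problem (a toy model, not the Replica Consistency Service described above), the sketch below tags each copy with a version number so that stale replicas can be detected and re-synchronised after an update.

    class Replica:
        """A copy of a data set (or catalogue) held on one Grid node."""
        def __init__(self, node):
            self.node = node
            self.version = 0
            self.data = None

    def write(replicas, data):
        # Synchronous update: bump the version and push the new content everywhere.
        new_version = max(r.version for r in replicas) + 1
        for r in replicas:
            r.data, r.version = data, new_version

    def stale_replicas(replicas):
        # Copies whose version lags behind the newest one are inconsistent.
        latest = max(r.version for r in replicas)
        return [r.node for r in replicas if r.version < latest]

    nodes = [Replica("cern"), Replica("fnal"), Replica("ral")]
    write(nodes, "catalogue-v1")
    nodes[2].version -= 1           # simulate a node that missed the update
    print(stale_replicas(nodes))    # -> ['ral']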

  16. Consistency of canonical formulation of Horava gravity

    International Nuclear Information System (INIS)

    Soo, Chopin

    2011-01-01

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  17. Consistency of canonical formulation of Horava gravity

    Energy Technology Data Exchange (ETDEWEB)

    Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)

    2011-09-22

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  18. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may reflect only one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire...

  19. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

    Problem solving for physics concepts through consistency arguments can improve students' thinking skills and is an important element of science. The study aims to assess the consistency of students' argumentation on fluid material. The population of this study consists of college students of PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a multiple-choice test and reasoned interviews. The fluid problems were modified from [9] and [1]. The results of the study show an average argumentation consistency of 4.85% for correct consistency, 29.93% for incorrect consistency, and 65.23% for inconsistency. The data point to a lack of understanding of the fluid material, which, when argued with full consistency, would ideally support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies in order to obtain a positive change in the consistency of argumentation.

  20. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  1. Choice, internal consistency, and rationality

    OpenAIRE

    Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu

    2010-01-01

    The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...

  2. Self-consistent quark bags

    International Nuclear Information System (INIS)

    Rafelski, J.

    1979-01-01

    After an introductory overview of the bag model, the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the vivial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI)

  3. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance in portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from

  4. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  5. The Principle of Energetic Consistency

    Science.gov (United States)

    Cohn, Stephen E.

    2009-01-01

    A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
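
    In symbols (a sketch, assuming the quadratic total energy of the natural energy variables referred to above), the principle follows from the second-moment decomposition of the conditional expectation (in LaTeX):

    \mathbb{E}\bigl[\tfrac{1}{2}\,x^{\mathsf T}x \,\big|\, y\bigr]
    = \tfrac{1}{2}\,\bigl\|\mathbb{E}[x \mid y]\bigr\|^{2}
    + \tfrac{1}{2}\,\operatorname{tr}\operatorname{Cov}(x \mid y),

    so if the dynamics conserve the total energy on the left-hand side, then the energy of the conditional mean plus the total variance on the right-hand side must remain constant between observations.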

  6. Consistent guiding center drift theories

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-04-01

    Various guiding-center drift theories are presented that are optimized in respect of consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)

  7. Weak consistency and strong paraconsistency

    Directory of Open Access Journals (Sweden)

    Gemma Robles

    2009-11-01

    In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and the absence of the ECQ (“E contradictione quodlibet”) rule that allows us to conclude any well-formed formula from any contradiction. The aim of this paper is to explain concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them, and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.

  8. Consistent force fields for saccharides

    DEFF Research Database (Denmark)

    Rasmussen, Kjeld

    1999-01-01

    Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x-ray ... anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field, which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...

  9. Glass consistency and glass performance

    International Nuclear Information System (INIS)

    Plodinec, M.J.; Ramsey, W.G.

    1994-01-01

    Glass produced by the Defense Waste Processing Facility (DWPF) will have to be consistently more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has to long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. However, the release of insoluble radionuclides to the environment will be limited by their solubility, and not by glass durability.

  10. Time-consistent actuarial valuations

    NARCIS (Netherlands)

    Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.

    2016-01-01

    Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an

  11. Dynamically consistent oil import tariffs

    International Nuclear Information System (INIS)

    Karp, L.; Newbery, D.M.

    1992-01-01

    The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and the resulting tariff is found to rise at the rate of interest. This tariff was found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff was characterized and found to differ markedly from the time-inconsistent open-loop tariff. It was shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs

  12. Consistently violating the non-Gaussian consistency relation

    International Nuclear Information System (INIS)

    Mooij, Sander; Palma, Gonzalo A.

    2015-01-01

    Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations
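
    For reference, the single-field consistency relation whose violation is at issue is usually quoted in the squeezed limit of the curvature bispectrum; in one common convention (signs and factors vary in the literature) it reads (in LaTeX):

    \lim_{k_{3}\to 0} B_{\zeta}(k_{1},k_{2},k_{3})
    = (1-n_{s})\, P_{\zeta}(k_{1})\, P_{\zeta}(k_{3}),

    so that in attractor models the squeezed bispectrum is fixed entirely by the spectral tilt, while the super-horizon evolution of the curvature perturbation in non-attractor models allows a departure from this value.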

  13. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    Inconsistency in firewall/VPN (Virtual Private Network) rules creates a huge maintenance cost. With the growth of multinational companies, SOHO offices and e-government, the number of firewalls/VPNs will increase rapidly. Rule tables, whether on a stand-alone device or across a network, will grow geometrically as a result. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and provide a theoretical foundation for intelligent management of rule tables. In this paper, a formalization of host and network rules for automatic rule validation, based on set theory, is proposed and a rule-validation scheme is defined. The analysis results show the superior performance of the method and demonstrate its potential for intelligent management based on rule tables.
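
    One concrete instance of semantic rule-table consistency is the absence of shadowing, where an earlier rule completely covers a later rule that specifies a different action. The toy check below uses plain numeric address ranges in place of real CIDR prefixes and is only a sketch of the set-theoretic formulation, not the paper's full scheme.

    def covers(r1, r2):
        """True if rule r1's address range contains r2's range."""
        return r1["lo"] <= r2["lo"] and r2["hi"] <= r1["hi"]

    def shadowed_rules(rules):
        """Return pairs (i, j) where the earlier rule i shadows the later rule j."""
        issues = []
        for i, earlier in enumerate(rules):
            for j in range(i + 1, len(rules)):
                later = rules[j]
                if covers(earlier, later) and earlier["action"] != later["action"]:
                    issues.append((i, j))
        return issues

    # Hypothetical rule table: numeric ranges stand in for source-address prefixes.
    rules = [
        {"lo": 0,  "hi": 255, "action": "deny"},
        {"lo": 16, "hi": 31,  "action": "allow"},   # never reached: shadowed by rule 0
    ]
    print(shadowed_rules(rules))   # -> [(0, 1)]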

  14. Self-consistent radial sheath

    International Nuclear Information System (INIS)

    Hazeltine, R.D.

    1988-12-01

    The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E /times/ B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig

  15. Lagrangian multiforms and multidimensional consistency

    Energy Technology Data Exchange (ETDEWEB)

    Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2009-10-30

    We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.

  16. Consistency and Communication in Committees

    OpenAIRE

    Inga Deimen; Felix Ketelaar; Mark T. Le Quement

    2013-01-01

    This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...

  17. Deep Feature Consistent Variational Autoencoder

    OpenAIRE

    Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping

    2016-01-01

    We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
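
    A hedged sketch of the feature-consistency idea in PyTorch follows; the choice of VGG-19, the layer cut-off and the KL weighting are illustrative assumptions rather than the paper's exact configuration.

    import torch
    import torch.nn.functional as F
    from torchvision import models

    # Frozen convolutional feature extractor used to compare the input and the
    # reconstruction in feature space instead of pixel space.
    vgg = models.vgg19(pretrained=True).features[:16].eval()
    for p in vgg.parameters():
        p.requires_grad_(False)

    def feature_consistency_loss(x, x_hat):
        """Deep-feature reconstruction loss between input x and VAE output x_hat."""
        return F.mse_loss(vgg(x_hat), vgg(x))

    def vae_loss(x, x_hat, mu, logvar, beta=1.0):
        # KL divergence of q(z|x) = N(mu, diag(exp(logvar))) against the unit Gaussian prior.
        kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return feature_consistency_loss(x, x_hat) + beta * kld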

  18. Maintaining consistency in distributed systems

    Science.gov (United States)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.

  19. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  20. Risks management in project planning

    OpenAIRE

    Stankevičiūtė, Roberta

    2017-01-01

    Project management consists of two very important aspects – managing the right project and managing the project right. To know that you are managing the right project, you need to ensure that your project is based on an actual requirement and that your project goal is relevant and beneficial. Professional project planning assists in managing the project the right way. The project planning process is very time consuming and is one of the most important parts of the project management process. T...

  1. Projective measure without projective Baire

    DEFF Research Database (Denmark)

    Schrittesser, David; Friedman, Sy David

    We prove that it is consistent (relative to a Mahlo cardinal) that all projective sets of reals are Lebesgue measurable, but there is a Δ^1_3 set without the Baire property. The complexity of the set which provides a counterexample to the Baire property is optimal.

  2. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel; Houborg, Rasmus; McCabe, Matthew

    2017-01-01

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  3. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel

    2017-01-18

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  4. The least weighted squares II. Consistency and asymptotic normality

    Czech Academy of Sciences Publication Activity Database

    Víšek, Jan Ámos

    2002-01-01

    Vol. 9, No. 16 (2002), p. 1-28 ISSN 1212-074X R&D Projects: GA AV ČR KSK1019101 Grant - others: GA UK(CR) 255/2000/A EK/FSV Institutional research plan: CEZ:AV0Z1075907 Keywords: robust regression * consistency * asymptotic normality Subject RIV: BA - General Mathematics

  5. Consistency Anchor Formalization and Correctness Proofs

    OpenAIRE

    Miguel, Correia; Bessani, Alysson

    2014-01-01

    This report contains the formal proofs for the techniques for increasing the consistency of cloud storage presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...

  6. High-performance speech recognition using consistency modeling

    Science.gov (United States)

    Digalakis, Vassilios; Murveit, Hy; Monaco, Peter; Neumeyer, Leo; Sankar, Ananth

    1994-12-01

    The goal of SRI's consistency modeling project is to improve the raw acoustic modeling component of SRI's DECIPHER speech recognition system and develop consistency modeling technology. Consistency modeling aims to reduce the number of improper independence assumptions used in traditional speech recognition algorithms so that the resulting speech recognition hypotheses are more self-consistent and, therefore, more accurate. At the initial stages of this effort, SRI focused on developing the appropriate base technologies for consistency modeling. We first developed the Progressive Search technology that allowed us to perform large-vocabulary continuous speech recognition (LVCSR) experiments. Since its conception and development at SRI, this technique has been adopted by most laboratories, including other ARPA contracting sites, doing research on LVCSR. Another goal of the consistency modeling project is to attack difficult modeling problems, where there is a mismatch between the training and testing phases. Such mismatches may include outlier speakers, different microphones and additive noise. We were able to either develop new, or transfer and evaluate existing, technologies that adapted our baseline genonic HMM recognizer to such difficult conditions.

  7. Project delivery system (PDS)

    CERN Document Server

    2001-01-01

    As business environments become increasingly competitive, companies seek more comprehensive solutions to the delivery of their projects. "Project Delivery System: Fourth Edition" describes the process-driven project delivery system which incorporates best practices from Total Quality, is aligned with the Project Management Institute and ISO quality standards, and is the means by which projects are consistently and efficiently planned, executed and completed to the satisfaction of clients and customers.

  8. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

    Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested which consists in treating simultaneously several equations with respect to the same number of variables.
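
    For a single equation, the narrowing step behind hull consistency can be sketched in a few lines; the constraint x + y = c below is a hypothetical example and does not reproduce the poster's simultaneous multi-equation scheme.

    def hull_consistency_sum(x, y, c):
        """Narrow the intervals x = [xl, xh] and y = [yl, yh] under x + y = c."""
        xl, xh = x
        yl, yh = y
        # x must lie in c - y and y must lie in c - x (interval arithmetic).
        xl, xh = max(xl, c - yh), min(xh, c - yl)
        yl, yh = max(yl, c - xh), min(yh, c - xl)
        return (xl, xh), (yl, yh)

    # Example: x in [0, 10], y in [4, 9], x + y = 12  ->  x narrows to [3, 8].
    print(hull_consistency_sum((0, 10), (4, 9), 12))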

  9. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  10. Translationally invariant self-consistent field theories

    International Nuclear Information System (INIS)

    Shakin, C.M.; Weiss, M.S.

    1977-01-01

    We present a self-consistent field theory which is translationally invariant. The equations obtained go over to the usual Hartree-Fock equations in the limit of large particle number. In addition to deriving the dynamic equations for the self-consistent amplitudes we discuss the calculation of form factors and various other observables

  11. Sticky continuous processes have consistent price systems

    DEFF Research Database (Denmark)

    Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan

    Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under...

  12. Consistent-handed individuals are more authoritarian.

    Science.gov (United States)

    Lyle, Keith B; Grillo, Michael C

    2014-01-01

    Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.

  13. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    van der Geest, Thea; Loorbach, N.R.

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to

  14. Consistent spectroscopy for a extended gauge model

    International Nuclear Information System (INIS)

    Oliveira Neto, G. de.

    1990-11-01

    A consistent spectroscopy was obtained with a Lagrangian constructed from vector fields with an extended U(1) group symmetry. By consistent spectroscopy is understood the determination of the quantum physical properties described by the model in a manner independent of the particular parametrizations adopted in their description. (L.C.J.A.)

  15. Consistency of eye movements in MOT using horizontally flipped trials

    Czech Academy of Sciences Publication Activity Database

    Děchtěrenko, F.; Lukavský, Jiří

    2013-01-01

    Vol. 42, Suppl. (2013), p. 42-42 ISSN 0301-0066. [36th European Conference on Visual Perception, 25.08.2013-29.08.2013, Bremen] R&D Projects: GA ČR GA13-28709S Institutional support: RVO:68081740 Keywords: eye movements * symmetry * consistency Subject RIV: AN - Psychology http://www.ecvp.uni-bremen.de/~ecvpprog/abstract164.html

  16. Self-consistent T-matrix theory of superconductivity

    Czech Academy of Sciences Publication Activity Database

    Šopík, B.; Lipavský, Pavel; Männel, M.; Morawetz, K.; Matlock, P.

    2011-01-01

    Vol. 84, No. 9 (2011), 094529/1-094529/13 ISSN 1098-0121 R&D Projects: GA ČR GAP204/10/0212; GA ČR(CZ) GAP204/11/0015 Institutional research plan: CEZ:AV0Z10100521 Keywords: superconductivity * T-matrix * superconducting gap * restricted self-consistency Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 3.691, year: 2011

  17. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking of legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  18. Managing Projects with KPRO

    Science.gov (United States)

    Braden, Barry M.

    2004-01-01

    How does a Project Management Office provide: consistent, familiar, easily used scheduling tools to project managers and project team members? Provide a complete list of organization resources available for use on the project? Facilitate resource tracking and visibility? Provide the myriad reports that the organization requires? Facilitate consistent budget planning and cost performance information? Provide all of this to the entire organization? Provide for the unique requirements of the organization? And get people to use it? Answer: implementation of the Kennedy Space Center Projects and Resources Online (KPRO), a modified COTS solution.

  19. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how ... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  20. Consistent histories and operational quantum theory

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail

  1. Self-consistent areas law in QCD

    International Nuclear Information System (INIS)

    Makeenko, Yu.M.; Migdal, A.A.

    1980-01-01

    The problem of obtaining a self-consistent areas law in quantum chromodynamics (QCD) is considered from the point of view of quark confinement. The exact equation for the loop average in multicolor QCD is reduced to a bootstrap form. Its iterations yield a new, manifestly gauge-invariant perturbation theory in loop space, reproducing asymptotic freedom. For large loops, the areas law appears to be a self-consistent solution.

  2. Consistency of the MLE under mixture models

    OpenAIRE

    Chen, Jiahua

    2016-01-01

    The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...

  3. Self-consistent asset pricing models

    Science.gov (United States)

    Malevergne, Y.; Sornette, D.

    2007-08-01

    We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the
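
    A compact way to see the orthogonality and normality conditions mentioned above is to regress synthetic asset returns on a proxy portfolio built from those very assets. In the sketch below (all weights, factor loadings and noise are invented) the proxy-weighted alphas come out as zero and the proxy-weighted betas sum to one, exactly as self-consistency requires; this is a numerical illustration, not the article's derivation.

    import numpy as np

    rng = np.random.default_rng(0)
    T, N = 500, 5                        # periods, assets (synthetic)
    factor = rng.normal(0, 0.02, T)      # common driver of returns
    true_betas = rng.uniform(0.5, 1.5, N)
    returns = np.outer(factor, true_betas) + rng.normal(0, 0.01, (T, N))

    w = np.full(N, 1.0 / N)              # proxy portfolio: equal weights of the same assets
    proxy = returns @ w                  # self-consistency: the proxy is built from the regressands

    alphas, betas = [], []
    X = np.column_stack([np.ones(T), proxy])
    for i in range(N):
        a, b = np.linalg.lstsq(X, returns[:, i], rcond=None)[0]
        alphas.append(a)
        betas.append(b)

    # Weighted alphas vanish and weighted betas sum to one (up to round-off).
    print(np.dot(w, alphas), np.dot(w, betas))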

  4. The CHPRC Columbia River Protection Project Quality Assurance Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fix, N. J.

    2008-11-30

    Pacific Northwest National Laboratory researchers are working on the CHPRC Columbia River Protection Project (hereafter referred to as the Columbia River Project). This is a follow-on project, funded by CH2M Hill Plateau Remediation Company, LLC (CHPRC), to the Fluor Hanford, Inc. Columbia River Protection Project. The work scope consists of a number of CHPRC funded, related projects that are managed under a master project (project number 55109). All contract releases associated with the Fluor Hanford Columbia River Project (Fluor Hanford, Inc. Contract 27647) and the CHPRC Columbia River Project (Contract 36402) will be collected under this master project. Each project within the master project is authorized by a CHPRC contract release that contains the project-specific statement of work. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the Columbia River Project staff.

  5. The CHPRC Columbia River Protection Project Quality Assurance Project Plan

    International Nuclear Information System (INIS)

    Fix, N.J.

    2008-01-01

    Pacific Northwest National Laboratory researchers are working on the CHPRC Columbia River Protection Project (hereafter referred to as the Columbia River Project). This is a follow-on project, funded by CH2M Hill Plateau Remediation Company, LLC (CHPRC), to the Fluor Hanford, Inc. Columbia River Protection Project. The work scope consists of a number of CHPRC funded, related projects that are managed under a master project (project number 55109). All contract releases associated with the Fluor Hanford Columbia River Project (Fluor Hanford, Inc. Contract 27647) and the CHPRC Columbia River Project (Contract 36402) will be collected under this master project. Each project within the master project is authorized by a CHPRC contract release that contains the project-specific statement of work. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the Columbia River Project staff

  6. Towards thermodynamical consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasi-particle-based description of the quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics

  7. Toward thermodynamic consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Toneev, V.D.; Shanenko, A.A.

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics
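
    For context (an addition, stated in the commonly used Gorenstein-Yang form rather than quoted from either record), the thermodynamic consistency condition for a quasiparticle gas with a medium-dependent effective mass m*(T) and a bag-like function B(T) can be written as:

```latex
p(T) \;=\; p_{\mathrm{id}}\big(T, m^{*}(T)\big) \;-\; B(T),
\qquad
\frac{dB}{dT} \;=\; \frac{\partial p_{\mathrm{id}}}{\partial m^{*}}\,\frac{dm^{*}}{dT}
```

    which guarantees that the entropy density s = ∂p/∂T reduces to the ideal quasiparticle expression evaluated at the effective mass.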

  8. Toward a consistent RHA-RPA

    International Nuclear Information System (INIS)

    Shepard, J.R.

    1991-01-01

    The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data

  9. Personalized recommendation based on unbiased consistence

    Science.gov (United States)

    Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao

    2015-08-01

    Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite networks have provided an efficient solution by automatically pushing possibly relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms focus only on unidirectional mass diffusion, from objects that have already been collected to those that should be recommended, resulting in a biased causal similarity estimation and mediocre performance. In this letter, we argue that in many cases a user's interests are stable, and thus bidirectional mass diffusion abilities, whether originating from objects already collected or from those to be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming state-of-the-art recommendation algorithms on disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
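
    To make the mechanism concrete, the following is a minimal sketch of mass diffusion on a user-item bipartite network with a naive symmetrised ("bidirectional") score; it illustrates the general idea only and is not the authors' exact algorithm:

```python
import numpy as np

# Minimal sketch of mass diffusion on a user-item bipartite network, with a
# naive symmetrised ("bidirectional") score. Illustrates the general mechanism
# only; it is not the authors' exact algorithm.
def diffusion_matrix(A):
    """ProbS-style item-to-item mass-diffusion transfer matrix."""
    k_item = A.sum(axis=0)                  # item degrees
    k_user = A.sum(axis=1)                  # user degrees
    # resource of item j -> its users (split by user degree) -> item i (split by item degree)
    return (A / k_user[:, None]).T @ (A / k_item[None, :])

def recommend(A, user, top_k=3):
    W = diffusion_matrix(A)
    collected = A[user]
    forward = W @ collected                 # diffusion from collected items to candidates
    backward = W.T @ collected              # diffusion in the reverse direction
    score = 0.5 * (forward + backward)      # symmetrised combination
    score[collected > 0] = -np.inf          # never re-recommend collected items
    return np.argsort(score)[::-1][:top_k]

# toy usage: 4 users x 6 items
A = np.array([[1, 1, 0, 0, 1, 0],
              [1, 0, 1, 0, 0, 0],
              [0, 1, 1, 1, 0, 0],
              [0, 0, 0, 1, 1, 1]], dtype=float)
print(recommend(A, user=0))
```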

  10. Financial model calibration using consistency hints.

    Science.gov (United States)

    Abu-Mostafa, Y S

    2001-01-01

    We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to Japanese Yen swaps market and US dollar yield market.
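
    The general recipe, an ordinary curve-fitting error augmented with a Kullback-Leibler "consistency hint" term, can be sketched on a toy problem as follows (this is an illustration of the idea, not the paper's Vasicek calibration or its EM-type optimizer):

```python
import numpy as np
from scipy.optimize import minimize

# Toy illustration of the recipe (not the paper's Vasicek calibration or its
# EM-type optimizer): an ordinary curve-fitting error is augmented with a
# "consistency hint" penalty, here the Kullback-Leibler distance between the
# model-implied Gaussian and a reference Gaussian the parameters should respect.
def kl_gauss(mu1, s1, mu2, s2):
    """KL divergence N(mu1, s1^2) || N(mu2, s2^2)."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

def objective(theta, x, y, hint=(0.0, 1.0), lam=0.1):
    mu, log_s = theta
    s = np.exp(log_s)                                  # keep the scale positive
    fit_err = np.mean((y - (mu + s * x)) ** 2)         # plain curve-fitting error
    hint_err = kl_gauss(mu, s, hint[0], hint[1])       # consistency-hint error
    return fit_err + lam * hint_err

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 0.3 + 1.2 * x + 0.1 * rng.normal(size=200)         # synthetic "market" data

res = minimize(objective, x0=np.array([0.0, 0.0]), args=(x, y))
print("calibrated mu, s:", res.x[0], np.exp(res.x[1]))
```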

  11. Proteolysis and consistency of Meshanger cheese

    NARCIS (Netherlands)

    Jong, de L.

    1978-01-01

    Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of αs1-casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of

  12. Developing consistent pronunciation models for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-09-01

    Full Text Available Pronunciation lexicons often contain pronunciation variants. This can create two problems: It can be difficult to define these variants in an internally consistent way and it can also be difficult to extract generalised grapheme-to-phoneme rule sets...

  13. Image recognition and consistency of response

    Science.gov (United States)

    Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.

    2012-02-01

    Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked, for each image, whether it had been included in the first set. For this study, we are evaluating only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response with respect to images they did not recognize than with respect to those they did recognize. Conclusion: Radiologists' recognition of previously encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.
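
    A sketch of the kind of association test described, with hypothetical counts rather than the study's data:

```python
import numpy as np
from scipy.stats import fisher_exact

# Hypothetical counts (not the study's data): cross-tabulate whether a repeated
# image was consciously recognized against whether the two readings of it were
# consistent, then test for association.
#                      consistent  inconsistent
table = np.array([[ 90,          30],     # image recognized
                  [110,          50]])    # image not recognized
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```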

  14. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    Full Text Available The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets. As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

  15. Guided color consistency optimization for image mosaicking

    Science.gov (United States)

    Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li

    2018-01-01

    This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among images under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color difference between images is large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First of all, to obtain reliable intensity correspondences in overlapping regions between image pairs, we propose the histogram extreme point matching algorithm, which is robust to geometrical misalignment between images to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by selecting an image subset as the reference, whose color characteristics are transferred to the others via the paths of a graph analysis. Thus, the final results after global adjustment take on a consistent color similar to the appearance of the reference image subset. Several groups of experiments on both a synthetic dataset and challenging real ones demonstrate that the proposed approach can achieve as good or even better results compared with the state-of-the-art approaches.
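
    A much-simplified stand-in for the pairwise correction step, estimating a per-channel gain and offset from the overlap region so that one image's color statistics match the reference (the paper's histogram extreme point matching and guided global optimization are considerably more elaborate):

```python
import numpy as np

# Much-simplified pairwise color correction: estimate a per-channel gain and
# offset from the overlap region so that one image's color statistics match the
# reference. The paper's histogram extreme point matching and guided global
# optimization are considerably more elaborate.
def color_transfer(src, ref, overlap_mask):
    """Map src's colors so its overlap statistics match ref's (uint8 RGB)."""
    out = src.astype(np.float64)
    for c in range(3):
        s = src[..., c][overlap_mask].astype(np.float64)
        r = ref[..., c][overlap_mask].astype(np.float64)
        gain = r.std() / (s.std() + 1e-9)
        offset = r.mean() - gain * s.mean()
        out[..., c] = gain * out[..., c] + offset
    return np.clip(out, 0, 255).astype(np.uint8)

# toy usage: two random "images" sharing a 20-pixel-wide overlap strip
rng = np.random.default_rng(2)
img_a = rng.integers(0, 256, size=(100, 100, 3), dtype=np.uint8)
img_b = np.clip(img_a * 0.8 + 20, 0, 255).astype(np.uint8)   # color-shifted copy
mask = np.zeros((100, 100), dtype=bool)
mask[:, -20:] = True                                         # shared region
corrected_b = color_transfer(img_b, img_a, mask)
```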

  16. Consistent application of codes and standards

    International Nuclear Information System (INIS)

    Scott, M.A.

    1989-01-01

    The guidelines presented in the US Department of Energy, General Design Criteria (DOE 6430.1A), and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well defined approach to determine the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of loads combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify the unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines

  17. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, R.M.; Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.

  18. Consistent Visual Analyses of Intrasubject Data

    Science.gov (United States)

    Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli

    2010-01-01

    Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…

  19. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...

  20. On the existence of consistent price systems

    DEFF Research Database (Denmark)

    Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan

    2014-01-01

    We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...

  1. Dynamic phonon exchange requires consistent dressing

    International Nuclear Information System (INIS)

    Hahne, F.J.W.; Engelbrecht, C.A.; Heiss, W.D.

    1976-01-01

    It is shown that states with undesirable properties (such as ghosts, states with complex eigenenergies and states with unrestricted normalization) emerge from two-body calculations using dynamic effective interactions if one is not careful in introducing single-particle self-energy insertions in a consistent manner

  2. Consistent feeding positions of great tit parents

    NARCIS (Netherlands)

    Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, Ph.

    2006-01-01

    When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is

  3. Consistency of the postulates of special relativity

    International Nuclear Information System (INIS)

    Gron, O.; Nicola, M.

    1976-01-01

    In a recent article in this journal, Kingsley has tried to show that the postulates of special relativity contradict each other. It is shown that the arguments of Kingsley are invalid because of an erroneous appeal to symmetry in a nonsymmetric situation. The consistency of the postulates of special relativity and the relativistic kinematics deduced from them is restated

  4. Consistency of Network Traffic Repositories: An Overview

    NARCIS (Netherlands)

    Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  5. Consistency analysis of network traffic repositories

    NARCIS (Netherlands)

    Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  6. CAREM Project

    International Nuclear Information System (INIS)

    Ishida, Viviana; Gomez, Silvia

    2001-01-01

    The CAREM project consists of the development and design of an advanced nuclear power plant. CAREM is a very low power innovative reactor conceived with new generation design solutions. Based on an indirect cycle integrated light water reactor using enriched uranium, CAREM has some distinctive features that greatly simplify the reactor and also contribute to a high level of safety: integrated primary system, primary system cooling by natural convection, self-pressurization, and passive safety systems. In order to verify its innovative features, the construction of a prototype is planned. (author)

  7. A consistent interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Omnes, Roland

    1990-01-01

    Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character which is based upon Griffiths' consistent histories. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and to prove accordingly the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, whereas one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation but they can be proved never to give rise to any logical inconsistency or paradox. (author)

  8. Self-consistency in Capital Markets

    Science.gov (United States)

    Benbrahim, Hamid

    2013-03-01

    Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.

  9. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
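
    A sketch of the kind of regression reported above, run on synthetic data with hypothetical variable names (effort as total minutes online, consistency as the standard deviation of time spent across sessions, GPA, and marginal learning as the post-test minus pre-test score):

```python
import numpy as np
import statsmodels.api as sm

# Sketch of this kind of regression on synthetic data with hypothetical
# variables: effort = total minutes online, consistency = std. deviation of
# time spent across sessions (lower = more consistent), gpa = motivation proxy,
# learning = post-test minus pre-test score.
rng = np.random.default_rng(3)
n = 212
effort = rng.normal(600, 150, n)
consistency = rng.normal(60, 20, n)
gpa = np.clip(rng.normal(3.0, 0.5, n), 0.0, 4.0)
learning = rng.normal(10, 5, n)
grade = 55 - 0.08 * consistency + 6 * gpa + 0.4 * learning + rng.normal(0, 5, n)

X = sm.add_constant(np.column_stack([effort, consistency, gpa, learning]))
fit = sm.OLS(grade, X).fit()
print(fit.params)     # x1 = effort, x2 = consistency, x3 = gpa, x4 = learning
print(fit.pvalues)
```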

  10. Consistent thermodynamic properties of lipids systems

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures ... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for the Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging the experimental databank of lipids systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of the original UNIFAC model. For solid-liquid equilibrium (SLE) data, new consistency tests have been developed [2]. Some of the developed tests were based on the quality tests proposed for VLE data...

  11. Consistency relation for cosmic magnetic fields

    DEFF Research Database (Denmark)

    Jain, R. K.; Sloth, M. S.

    2012-01-01

    If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.

  12. Consistent Estimation of Partition Markov Models

    Directory of Open Access Journals (Sweden)

    Jesús E. García

    2017-04-01

    Full Text Available The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element in the alphabet. This model aims to answer the following questions: what is the minimal number of parameters needed to specify a Markov chain, and how can these parameters be estimated? In order to answer these questions, we build a consistent strategy for model selection which consists of: given a size-n realization of the process, finding a model within the Partition Markov class, with a minimal number of parts, to represent the process law. From the strategy, we derive a measure that establishes a metric in the state space. In addition, we show that if the law of the process is Markovian, then, eventually, when n goes to infinity, L will be retrieved. We show an application to model internet navigation patterns.

  13. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non... constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the nonnormative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such a communication creates the organizational conditions adequate to sustain...

  14. Self-consistent velocity dependent effective interactions

    International Nuclear Information System (INIS)

    Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.

    1993-09-01

    The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of 148,154Sm, of the first excited 2+ states of Sn isotopes and of the first excited 3- states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy-weighted sum rule values, and in reducing B(Eλ) values. (author)

  15. Evaluating Temporal Consistency in Marine Biodiversity Hotspots

    OpenAIRE

    Piacenza, Susan E.; Thurman, Lindsey L.; Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monito...

  16. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing, enabling hardware infrastructure and software applications to be delivered as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  17. Consistency Analysis of Nearest Subspace Classifier

    OpenAIRE

    Wang, Yi

    2015-01-01

    The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...
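
    A minimal sketch of a nearest subspace rule of the kind analyzed here: fit a low-dimensional subspace per class via an SVD of the centered class samples and assign a test point to the class with the smallest reconstruction residual (illustrative only; the paper concerns the asymptotic consistency of such rules):

```python
import numpy as np

# Minimal nearest subspace rule: fit a low-dimensional subspace per class via
# an SVD of the centered class samples; assign a test point to the class whose
# subspace reconstructs it with the smallest residual. Illustrative only.
class NearestSubspace:
    def __init__(self, dim=2):
        self.dim = dim
        self.bases, self.means, self.labels = [], [], []

    def fit(self, X, y):
        for label in np.unique(y):
            Xc = X[y == label]
            mean = Xc.mean(axis=0)
            # right singular vectors span the best-fit subspace of the class
            _, _, Vt = np.linalg.svd(Xc - mean, full_matrices=False)
            self.bases.append(Vt[: self.dim].T)      # (n_features, dim)
            self.means.append(mean)
            self.labels.append(label)
        return self

    def predict(self, X):
        residuals = []
        for B, m in zip(self.bases, self.means):
            Z = X - m
            residuals.append(np.linalg.norm(Z - Z @ B @ B.T, axis=1))
        return np.array(self.labels)[np.argmin(np.vstack(residuals), axis=0)]

# toy usage: two classes living near different 2-D subspaces of R^5
rng = np.random.default_rng(4)
X0 = rng.normal(size=(50, 5)) @ np.diag([3, 2, 0.1, 0.1, 0.1])
X1 = rng.normal(size=(50, 5)) @ np.diag([0.1, 0.1, 3, 2, 0.1]) + 1.0
X, y = np.vstack([X0, X1]), np.array([0] * 50 + [1] * 50)
print((NearestSubspace(dim=2).fit(X, y).predict(X) == y).mean())
```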

  18. Consistency relations in effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)

    2017-06-01

    The consistency relations in large scale structure relate the lower-order correlation functions with their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of velocity θ̄. Assuming a ΛCDM background cosmology, we find that the correction to SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression of the EFT results relative to SPT, which scales as the square of the wavenumber k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed-limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.

  19. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  20. Orthology and paralogy constraints: satisfiability and consistency.

    Science.gov (United States)

    Lafond, Manuel; El-Mabrouk, Nadia

    2014-01-01

    A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided into two parts: (1) Is C satisfiable, i.e. can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the Graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm in the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.

  1. Consistency of color representation in smart phones.

    Science.gov (United States)

    Dain, Stephen J; Kwan, Benjamin; Wong, Leslie

    2016-03-01

    One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones are white LED backlit LCD and the Samsung are OLEDs. The color gamut varies between models and comparison with sRGB space shows 61%, 85%, 117%, and 110%, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4 and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone4 and ±0.002 for the others, although the spread of white points between models was u'v'±0.007. The differences are essentially the same for primaries at low luminance. The variation of colors intermediate between the primaries (e.g., red-purple, orange) mirror the variation in the primaries. The variation in

  2. Do Health Systems Have Consistent Performance Across Locations and Is Consistency Associated With Higher Performance?

    Science.gov (United States)

    Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D

    This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.

  3. Management of NASA's major projects

    Science.gov (United States)

    James, L. B.

    1973-01-01

    Approaches used to manage major projects are studied and the existing documents on NASA management are reviewed. The work consists of: (1) the project manager's role, (2) request for proposal, (3) project plan, (4) management information system, (5) project organizational thinking, (6) management disciplines, (7) important decisions, and (8) low cost approach.

  4. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
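
    The threshold-and-persistence idea can be sketched as follows on synthetic data (not the trawl survey): compute a diversity index per grid cell and year, flag cells above the annual mean threshold as hotspots, and then measure how often each cell keeps that designation:

```python
import numpy as np

# Sketch of the threshold-and-persistence idea on synthetic data (not the trawl
# survey): Shannon diversity per grid cell and year, an annual mean threshold,
# hotspot designation, and the fraction of years each cell keeps it.
def shannon(counts):
    total = counts.sum()
    if total == 0:
        return 0.0
    p = counts / total
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(5)
n_years, n_cells, n_species = 8, 200, 30
counts = rng.poisson(3, size=(n_years, n_cells, n_species))

H = np.apply_along_axis(shannon, 2, counts)       # (years, cells) diversity
threshold = H.mean(axis=1, keepdims=True)         # one mean threshold per year
hotspot = H > threshold                           # annual hotspot designation
persistence = hotspot.mean(axis=0)                # fraction of years per cell

print("cells designated hotspots in >50% of years:", int((persistence > 0.5).sum()))
```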

  5. Self-consistent gravitational self-force

    International Nuclear Information System (INIS)

    Pound, Adam

    2010-01-01

    I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.

  6. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata, and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult to analyze fault...

  7. A method for consistent precision radiation therapy

    International Nuclear Information System (INIS)

    Leong, J.

    1985-01-01

    Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned, which may approach this degree of precision. (orig.)

  8. Gentzen's centenary the quest for consistency

    CERN Document Server

    Rathjen, Michael

    2015-01-01

    Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.

  9. Two consistent calculations of the Weinberg angle

    International Nuclear Information System (INIS)

    Fairlie, D.B.

    1979-01-01

    The Weinberg-Salam theory is reformulated as a pure Yang-Mills theory in a six-dimensional space, the Higgs field being interpreted as gauge potentials in the additional dimensions. Viewed in this way, the condition that the Higgs field transforms as a U(1) representation of charge one is equivalent to requiring a value of 30° for the Weinberg angle. A second consistent determination comes from the idea borrowed from monopole theory that the electromagnetic field is in the direction of the Higgs field. (Author)

  10. Autonomous Propellant Loading Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Autonomous Propellant Loading (APL) project consists of three activities. The first is to develop software that will automatically control loading of...

  11. Consistent resolution of some relativistic quantum paradoxes

    International Nuclear Information System (INIS)

    Griffiths, Robert B.

    2002-01-01

    A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike separate regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise because of using modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics

  12. Self-consistent model of confinement

    International Nuclear Information System (INIS)

    Swift, A.R.

    1988-01-01

    A model of the large-spatial-distance, zero three-momentum limit of QCD is developed from the hypothesis that there is an infrared singularity. Single quarks and gluons do not propagate because they have infinite energy after renormalization. The Hamiltonian formulation of the path integral is used to quantize QCD with physical, nonpropagating fields. Perturbation theory in the infrared limit is simplified by the absence of self-energy insertions and by the suppression of large classes of diagrams due to vanishing propagators. Remaining terms in the perturbation series are resummed to produce a set of nonlinear, renormalizable integral equations which fix both the confining interaction and the physical propagators. Solutions demonstrate the self-consistency of the concepts of an infrared singularity and nonpropagating fields. The Wilson loop is calculated to provide a general proof of confinement. Bethe-Salpeter equations for quark-antiquark pairs and for two gluons have finite-energy solutions in the color-singlet channel. The choice of gauge is addressed in detail. Large classes of corrections to the model are discussed and shown to support self-consistency

  13. Subgame consistent cooperation a comprehensive treatise

    CERN Document Server

    Yeung, David W K

    2016-01-01

    Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior could lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and the calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustainable if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the cooperation duration. It is due to the lack of such guarantees that cooperative schemes fail to last to the end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this “classic” problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation covering up-to-date, state-of-the-art analyses of this important topic. It sets out to provide the theory, solution tec...

  14. Sludge characterization: the role of physical consistency

    Energy Technology Data Exchange (ETDEWEB)

    Spinosa, Ludovico; Wichmann, Knut

    2003-07-01

    Physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated to fulfil regulatory requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or not, or solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed with sludges, so the development of analytical procedures to define the boundary between liquid and paste-like behaviour (flowability) and between solid and paste-like behaviour (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements and a Vicat needle for solidity measurements. (author)

  15. Consistent mutational paths predict eukaryotic thermostability

    Directory of Open Access Journals (Sweden)

    van Noort Vera

    2013-01-01

    Full Text Available Background: Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results: Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions: The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.

  16. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, the "true value" of an extreme flood being not observable. Anyway, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in different methods. In this study, the following three different approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
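
    As an illustration of the first, purely statistical approach (synthetic annual maxima, not real gauge data), a generalized extreme value distribution can be fitted to annual maximum discharges and a low-probability design flood read off as a return level:

```python
import numpy as np
from scipy.stats import genextreme

# Illustration of the purely statistical approach: fit a generalized extreme
# value (GEV) distribution to annual maximum discharges and read off a
# low-probability design flood as a return level. Synthetic annual maxima are
# used here in place of real gauge data.
rng = np.random.default_rng(6)
annual_maxima = genextreme.rvs(c=-0.1, loc=300, scale=80, size=60, random_state=rng)

shape, loc, scale = genextreme.fit(annual_maxima)
return_period = 1000
q1000 = genextreme.ppf(1 - 1 / return_period, shape, loc=loc, scale=scale)
print(f"estimated {return_period}-year flood: {q1000:.0f} m^3/s")
```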

  17. Consistent biokinetic models for the actinide elements

    International Nuclear Information System (INIS)

    Leggett, R.W.

    2001-01-01

    The biokinetic models for Th, Np, Pu, Am and Cm currently recommended by the International Commission on Radiological Protection (ICRP) were developed within a generic framework that depicts gradual burial of skeletal activity in bone volume, depicts recycling of activity released to blood and links excretion to retention and translocation of activity. For other actinide elements such as Ac, Pa, Bk, Cf and Es, the ICRP still uses simplistic retention models that assign all skeletal activity to bone surface and depicts one-directional flow of activity from blood to long-term depositories to excreta. This mixture of updated and older models in ICRP documents has led to inconsistencies in dose estimates and interpretation of bioassay for radionuclides with reasonably similar biokinetics. This paper proposes new biokinetic models for Ac, Pa, Bk, Cf and Es that are consistent with the updated models for Th, Np, Pu, Am and Cm. The proposed models are developed within the ICRP's generic model framework for bone-surface-seeking radionuclides, and an effort has been made to develop parameter values that are consistent with results of comparative biokinetic data on the different actinide elements. (author)

  18. Project Notes

    Science.gov (United States)

    School Science Review, 1978

    1978-01-01

    Presents sixteen project notes developed by pupils of Chipping Norton School and Bristol Grammar School in the United Kingdom. These projects include eight biology A-level projects and eight chemistry A-level projects. (HM)

  19. LANDFIRE: A nationally consistent vegetation, wildland fire, and fuel assessment

    Science.gov (United States)

    Rollins, Matthew G.

    2009-01-01

    LANDFIRE is a 5-year, multipartner project producing consistent and comprehensive maps and data describing vegetation, wildland fuel, fire regimes and ecological departure from historical conditions across the United States. It is a shared project between the wildland fire management and research and development programs of the US Department of Agriculture Forest Service and US Department of the Interior. LANDFIRE meets agency and partner needs for comprehensive, integrated data to support landscape-level fire management planning and prioritization, community and firefighter protection, effective resource allocation, and collaboration between agencies and the public. The LANDFIRE data production framework is interdisciplinary, science-based and fully repeatable, and integrates many geospatial technologies including biophysical gradient analyses, remote sensing, vegetation modelling, ecological simulation, and landscape disturbance and successional modelling. LANDFIRE data products are created as 30-m raster grids and are available over the internet at www.landfire.gov, accessed 22 April 2009. The data products are produced at scales that may be useful for prioritizing and planning individual hazardous fuel reduction and ecosystem restoration projects; however, the applicability of data products varies by location and specific use, and products may need to be adjusted by local users.

  20. A consistent thermodynamic database for cement minerals

    International Nuclear Information System (INIS)

    Blanc, P.; Claret, F.; Burnol, A.; Marty, N.; Gaboreau, S.; Tournassat, C.; Gaucher, E.C.; Giffault, E.; Bourbon, X.

    2010-01-01

    work - the formation enthalpy and the Cp(T) function are taken from the literature or estimated - finally, the Log K(T) function is calculated, based on the selected dataset, and compared to experimental data gathered at different temperatures. Each experimental point is extracted from solution compositions by using PHREEQC with a selection of aqueous complexes consistent with the Thermochimie database. The selection was tested notably by drawing activity diagrams, which allow phase relations to be assessed. An example of such a diagram, drawn in the CaO-Al2O3-SiO2-H2O system, is displayed. It can be seen that low-pH concrete alteration proceeds essentially by decreasing the C/S ratio in C-S-H phases to the point where C-S-H are no longer stable and are replaced by zeolite, then clay minerals. This evolution corresponds to a decrease in silica activity, which is consistent with the pH decrease, as silica concentration depends essentially on pH. Rather consistent phase relations have been obtained for the SO3-Al2O3-CaO-CO2-H2O system. Addition of iron(III) enlarges the AFm-SO4 stability field towards the low-temperature domain, whereas it decreases the pH domain where ettringite is stable. On the other hand, the stability field of katoite remains largely ambiguous, namely with respect to a hydrogarnet/grossular solid solution. With respect to other databases, this work was carried out in consistency with a larger mineral selection, so that it can be used for modelling work in the cement-clay interaction context

  1. Self-consistent modelling of ICRH

    International Nuclear Information System (INIS)

    Hellsten, T.; Hedin, J.; Johnson, T.; Laxaaback, M.; Tennfors, E.

    2001-01-01

    The performance of ICRH is often sensitive to the shape of the high energy part of the distribution functions of the resonating species. This requires self-consistent calculations of the distribution functions and the wave-field. In addition to the wave-particle interactions and Coulomb collisions the effects of the finite orbit width and the RF-induced spatial transport are found to be important. The inward drift dominates in general even for a symmetric toroidal wave spectrum in the centre of the plasma. An inward drift does not necessarily produce a more peaked heating profile. On the contrary, for low concentrations of hydrogen minority in deuterium plasmas it can even give rise to broader profiles. (author)

  2. Non linear self consistency of microtearing modes

    International Nuclear Information System (INIS)

    Garbet, X.; Mourgues, F.; Samain, A.

    1987-01-01

    The self-consistency of a microtearing turbulence is studied in non-linear regimes where the ergodicity of the flux lines determines the electron response. The current which sustains the magnetic perturbation via the Ampere law results from the combined action of the radial electric field, in the frame where the island chains are static, and of the thermal electron diamagnetism. Numerical calculations show that at usual values of β_pol in tokamaks the turbulence can create a diffusion coefficient of order ν_th ρ_i^2, where ρ_i is the ion Larmor radius and ν_th the electron-ion collision frequency. On the other hand, collisionless regimes involving special profiles of each mode near the resonant surface seem possible.

  3. Consistent evolution in a pedestrian flow

    Science.gov (United States)

    Guan, Junbiao; Wang, Kaihua

    2016-03-01

    In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. It is found from a large number of numerical simulations that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typically different initial conditions, which may largely be due to a self-organization effect. Moreover, an appropriate proportion of initial defectors who exhibit herding behavior, coupled with an appropriate proportion of initial defectors who exhibit rationally independent thinking, are two necessary factors for a short evacuation time.
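
    To make the CA-plus-snowdrift-game idea concrete, the sketch below implements a toy spatial snowdrift game with an imitate-the-best update on a periodic lattice. It only illustrates the basic ingredients (grid, payoff matrix, imitation rule) and is not the evacuation model of the paper; the payoff values, grid size and update rule are assumptions.

    ```python
    import numpy as np

    # Toy spatial snowdrift game on a lattice cellular automaton: each cell is a
    # cooperator (1) or defector (0), plays its four nearest neighbours, and then
    # imitates the best-scoring cell in its neighbourhood (including itself).
    B, C = 1.0, 0.6                                   # benefit and cost, b > c > 0
    PAYOFF = {(1, 1): B - C / 2, (1, 0): B - C, (0, 1): B, (0, 0): 0.0}
    NEIGHBOURS = ((1, 0), (-1, 0), (0, 1), (0, -1))

    def step(grid: np.ndarray) -> np.ndarray:
        n = grid.shape[0]
        score = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                for dx, dy in NEIGHBOURS:
                    score[i, j] += PAYOFF[(int(grid[i, j]), int(grid[(i + dx) % n, (j + dy) % n]))]
        new = grid.copy()
        for i in range(n):
            for j in range(n):
                best_score, best_strategy = score[i, j], grid[i, j]
                for dx, dy in NEIGHBOURS:
                    ii, jj = (i + dx) % n, (j + dy) % n
                    if score[ii, jj] > best_score:
                        best_score, best_strategy = score[ii, jj], grid[ii, jj]
                new[i, j] = best_strategy
        return new

    rng = np.random.default_rng(0)
    for initial_cooperator_fraction in (0.2, 0.5, 0.8):     # different initial conditions
        grid = (rng.random((30, 30)) < initial_cooperator_fraction).astype(int)
        for _ in range(50):
            grid = step(grid)
        print(f"initial {initial_cooperator_fraction:.1f} -> final cooperator fraction {grid.mean():.2f}")
    ```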

  4. PROJECT SCOPE MANAGEMENT PROCESS

    Directory of Open Access Journals (Sweden)

    Yana Derenskaya

    2018-01-01

    : analysis of results of planning the project scope; study of templates of project work structures and recommendations for the formation of the levels of the structure; decomposition of the totality of project work; creation of a dictionary of the project work structure; updating of the description of the project scope and the project scope management plan. Practical importance. In order to improve the management of the scope of projects in the pharmacy, the components of subprocesses, the participants, and the input and output documents are investigated, and an algorithm for managing the project scope is built. It is determined that the starting elements of project scope management are the justification of the initial data, i.e. the project purpose, the impacts of the environment, and the internal potential of the enterprise in relation to the project implementation (assets of the organizational process). It is recommended to create the structure of project work starting from the analytical research existing at the enterprise, or from the approaches to building a hierarchical structure of works and the templates of project work structures recommended by project management standards and guidelines. It is noted that the created structure of project work should be audited by the participants of the project office. Based on the results of planning the sequence and duration of operations for managing the scope of works, a precedence diagram of the investigated process is constructed. Value/originality. The developed recommendations regarding the consistency and structure of subprocesses and operations of project scope management will allow the enterprise to save significant time when planning the scope of subsequent projects, by using the database created in previous periods and statistics on the implementation of the described operations.

  5. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
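
    The core idea of imposing thermodynamic feasibility during calibration can be illustrated with a tiny constrained least-squares fit: around any reaction cycle, the log-equilibrium constants must sum to zero (a Wegscheider-type condition). The sketch below is not the TCMC method or its EGF/ERK model; the three-reaction cycle and the "observed" values are made up.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy 3-reaction cycle A <-> B <-> C <-> A. Thermodynamic consistency requires the
    # log-equilibrium constants around the cycle to sum to zero; the raw observations
    # below deliberately violate that condition.
    observed_log_ratios = np.array([1.2, -0.4, -0.9])   # sums to -0.1, i.e. inconsistent

    def objective(log_ratios):
        # Ordinary least squares against the (inconsistent) observations.
        return np.sum((log_ratios - observed_log_ratios) ** 2)

    cycle_constraint = {"type": "eq", "fun": lambda log_ratios: np.sum(log_ratios)}

    result = minimize(objective, x0=observed_log_ratios, constraints=[cycle_constraint])
    print("calibrated log K values:", result.x, "cycle sum:", result.x.sum())
    ```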

  6. Organization of project management

    International Nuclear Information System (INIS)

    Schmidt, R.

    1975-01-01

    When speaking about interfaces within a project and their management, one has to understand and define what an interface is. In general, each component facing another one and each person working on a project with another person represents an interface. Therefore a project will consist, practically in its entirety, of interfaces, with components and people sandwiched between them. This paper is limited to the most important interfaces, with a focus on the problems occurring at them and their resolution. (orig.) [de

  7. Merging By Decentralized Eventual Consistency Algorithms

    Directory of Open Access Journals (Sweden)

    Ahmed-Nacer Mehdi

    2015-12-01

    Full Text Available The merging mechanism is an essential operation for version control systems. When each member of a collaborative development works on an individual copy of the project, software merging allows modifications made concurrently to be reconciled, as well as managing software change through branching. The collaborative system is in charge of proposing a merge result that includes the users' modifications. The users then have to check and adapt this result. The adaptation should be as effortless as possible; otherwise, the users may get frustrated and quit the collaboration. This paper aims to reduce conflicts during collaboration and improve productivity. It has three objectives: to study the users' behavior during collaboration, to evaluate the quality of textual merge results produced by specific algorithms, and to propose a solution to improve the result quality produced by the default merge tool of distributed version control systems. Through a study of eight open-source repositories totaling more than 3 million lines of code, we observe the behavior of concurrent modifications during the merge procedure. We identify when the existing merge techniques under-perform, and we propose solutions to improve the quality of the merge. We finally compare with the traditional merge tool through a large corpus of collaborative editing.

  8. The Development of Australia's National Training System: A Dynamic Tension between Consistency and Flexibility. Occasional Paper

    Science.gov (United States)

    Bowman, Kaye; McKenna, Suzy

    2016-01-01

    This occasional paper provides an overview of the development of Australia's national training system and is a key knowledge document of a wider research project "Consistency with flexibility in the Australian national training system." This research project investigates the various approaches undertaken by each of the jurisdictions to…

  9. CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.

    Science.gov (United States)

    Shalizi, Cosma Rohilla; Rinaldo, Alessandro

    2013-04-01

    The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consists only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling , or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGM's expressive power. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.

  10. Exploring the Consistent behavior of Information Services

    Directory of Open Access Journals (Sweden)

    Kapidakis Sarantos

    2016-01-01

    Full Text Available Computer services are normally assumed to work well all the time. This usually happens for crucial services like bank electronic services, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that will help predict the consistency of their behavior and the quality of the harvesting, which is harder because of the transient conditions, the many services and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many other services occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others also always, or sometimes, fail, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes, to study their behavior in more detail.
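
    A harvesting study of this kind ultimately reduces to issuing the same protocol request repeatedly and classifying the outcomes. The sketch below sends a single OAI-PMH Identify request and sorts the result into a few coarse categories; the endpoint URL and the failure categories are illustrative assumptions, not those used in the study.

    ```python
    import urllib.error
    import urllib.request

    def check_oai_service(base_url: str, timeout: float = 10.0) -> str:
        """Classify one OAI-PMH 'Identify' request as 'ok', 'http-error', 'bad-response' or 'unreachable'.

        The categories are illustrative; a real study would log much more detail over many runs.
        """
        url = f"{base_url}?verb=Identify"
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except urllib.error.HTTPError:
            return "http-error"
        except (urllib.error.URLError, OSError):
            return "unreachable"
        return "ok" if "<Identify>" in body else "bad-response"

    # Hypothetical endpoint; repeated polling over time would reveal intermittent failures.
    print(check_oai_service("https://example.org/oai"))
    ```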

  11. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356

  12. [Consistent Declarative Memory with Depressive Symptomatology].

    Science.gov (United States)

    Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez

    2012-12-01

    Some studies have suggested that potentiated remembrance of negative events in people with depressive disorders seems to be an important factor in the etiology, course and maintenance of depression. Objective: to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. 73 university students were evaluated, male and female, between 18 and 40 years old, distributed in two groups: with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) and a cut-off point of 20. There were no meaningful differences in free and voluntary recall between participants with and without depressive symptomatology, in spite of the fact that both groups had granted a higher emotional value to the audio-visual test and had associated it with sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the hypothesis of emotional consistency was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.

  13. Self consistent field theory of virus assembly

    Science.gov (United States)

    Li, Siyu; Orland, Henri; Zandi, Roya

    2018-04-01

    The ground state dominance approximation (GSDA) has been extensively used to study the assembly of viral shells. In this work we employ the self-consistent field theory (SCFT) to investigate the adsorption of RNA onto positively charged spherical viral shells and examine the conditions when GSDA does not apply and SCFT has to be used to obtain a reliable solution. We find that there are two regimes in which GSDA does work. First, when the genomic RNA length is long enough compared to the capsid radius, and second, when the interaction between the genome and capsid is so strong that the genome is basically localized next to the wall. We find that for the case in which RNA is more or less distributed uniformly in the shell, regardless of the length of RNA, GSDA is not a good approximation. We observe that as the polymer-shell interaction becomes stronger, the energy gap between the ground state and first excited state increases and thus GSDA becomes a better approximation. We also present our results corresponding to the genome persistence length obtained through the tangent-tangent correlation length and show that it is zero in case of GSDA but is equal to the inverse of the energy gap when using SCFT.

  14. Consistency based correlations for tailings consolidation

    Energy Technology Data Exchange (ETDEWEB)

    Azam, S.; Paul, A.C. [Regina Univ., Regina, SK (Canada). Environmental Systems Engineering

    2010-07-01

    The extraction of oil, uranium, metals and mineral resources from the earth generates significant amounts of tailings slurry. The tailings are contained in a disposal area with perimeter dykes constructed from the coarser fraction of the slurry. There are many unique challenges pertaining to the management of the containment facilities for several decades beyond mine closure that are a result of the slow settling rates of the fines and the high standing toxic waters. Many tailings dam failures in different parts of the world have been reported to result in significant contaminant releases causing public concern over the conventional practice of tailings disposal. Therefore, in order to reduce and minimize the environmental footprint, the fluid tailings need to undergo efficient consolidation. This paper presented an investigation into the consolidation behaviour of tailings in conjunction with soil consistency that captured physicochemical interactions. The paper discussed the large strain consolidation behaviour (volume compressibility and hydraulic conductivity) of six fine-grained soil slurries based on published data. The paper provided background information on the study and presented the research methodology. The geotechnical index properties of the selected materials were also presented. The large strain consolidation, volume compressibility correlations, and hydraulic conductivity correlations were provided. It was concluded that the normalized void ratio best described volume compressibility whereas liquidity index best explained the hydraulic conductivity. 17 refs., 3 tabs., 4 figs.

  15. Consistency between GRUAN sondes, LBLRTM and IASI

    Directory of Open Access Journals (Sweden)

    X. Calbet

    2017-06-01

    Full Text Available Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Interferometer (IASI)-measured radiances via the LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.

  16. Self-consistent nuclear energy systems

    International Nuclear Information System (INIS)

    Shimizu, A.; Fujiie, Y.

    1995-01-01

    A concept of self-consistent nuclear energy systems (SCNES) has been proposed as an ultimate goal of the nuclear energy system in the coming centuries. SCNES should realize a stable and unlimited energy supply without endangering the human race and the global environment. It is defined as a system that realizes at least the following four objectives simultaneously: (a) energy generation - attain high efficiency in the utilization of fission energy; (b) fuel production - secure an inexhaustible energy source: breeding of fissile material with a breeding ratio greater than one and complete burning of transuranium elements through recycling; (c) burning of radionuclides - zero release of radionuclides from the system: complete burning of transuranium elements and elimination of radioactive fission products by neutron capture reactions through recycling; (d) system safety - achieve system safety both for the public and experts: eliminate criticality-related safety issues by using natural laws and simple logic. This paper describes the concept of SCNES and discusses the feasibility of the system. Both the ''neutron balance'' and the ''energy balance'' of the system are introduced as necessary conditions to be satisfied at least by SCNES. Evaluations made so far indicate that both the neutron balance and the energy balance can be realized by fast reactors but not by thermal reactors. Concerning system safety, two safety concepts, ''self-controllability'' and ''self-terminability'', are introduced to eliminate the criticality-related safety issues in fast reactors. (author)

  17. Toward a consistent model for glass dissolution

    International Nuclear Information System (INIS)

    Strachan, D.M.; McGrail, B.P.; Bourcier, W.L.

    1994-01-01

    Understanding of the process of glass dissolution in aqueous media has advanced significantly over the last 10 years through the efforts of many scientists around the world. Mathematical models describing the glass dissolution process have also advanced from simple empirical functions to structured models based on fundamental principles of physics, chemistry, and thermodynamics. Although borosilicate glass has been selected as the waste form for disposal of high-level wastes in at least 5 countries, there is no international consensus on a fundamental methodology for modeling glass dissolution that could be used in assessing the long-term performance of waste glasses in a geologic repository setting. Each repository program is developing its own model and supporting experimental data. In this paper, we critically evaluate a selected set of these structured models and show that a consistent methodology for modeling glass dissolution processes is available. We also propose a strategy for a future coordinated effort to obtain the model input parameters that are needed for long-term performance assessments of glass in a geologic repository. (author) 4 figs., tabs., 75 refs

  18. Algorithms for assessing person-based consistency among linked records for the investigation of maternal use of medications and safety

    Directory of Open Access Journals (Sweden)

    Duong Tran

    2017-04-01

    Quality assessment indicated high consistency among linked records. The set of algorithms developed in this project can be applied to similar linked perinatal datasets to promote a consistent approach and comparability across studies.

  19. View from Europe: stability, consistency or pragmatism

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1988-01-01

    The last few years of this decade look like a period of reappraisal of radiation protection standards. The revised risk estimates from Japan will be available, and the United Nations Scientific Committee on the Effects of Atomic Radiation will be publishing new reports on biological topics. The International Commission on Radiological Protection (ICRP) has started a review of its basic recommendations, and the new specification for dose equivalent in radiation fields of the International Commission on Radiation Units and Measurements (ICRU) will be coming into use. All this is occurring at a time when some countries are still trying to catch up with committed dose equivalent and the recently recommended change in the value of the quality factor for neutrons. In Europe, the problems of adapting to new ICRP recommendations are considerable. The European Community, including 12 states and nine languages, takes ICRP recommendations as a basis and develops council directives that are binding on member states, which have then to arrange for their own regulatory changes. Any substantial adjustments could take 5 y or more to work through the system. Clearly, the regulatory preference is for stability. Equally clearly, trade unions and public interest groups favor a rapid response to scientific developments (provided that the change is downward). Organizations such as the ICRP have to balance their desire for internal consistency and intellectual purity against the practical problems of their clients in adjusting to change. This paper indicates some of the changes that might be necessary over the next few years and how, given a pragmatic approach, they might be accommodated in Europe without too much regulatory confusion

  20. The Consistency Between Clinical and Electrophysiological Diagnoses

    Directory of Open Access Journals (Sweden)

    Esra E. Okuyucu

    2009-09-01

    Full Text Available OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics where the requests were made, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists: 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedics, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients' referrals were related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. Regarding the relation between the referral diagnosis and the electrophysiological diagnosis according to the clinics where the requests were made, there was no statistically significant difference (p = 0.794), but there were statistically significant differences in the support of different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy, based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG

  1. Self-consistent meson mass spectrum

    International Nuclear Information System (INIS)

    Balazs, L.A.P.

    1982-01-01

    A dual-topological-unitarization (or dual-fragmentation) approach to the calculation of hadron masses is presented, in which the effect of planar ''sea''-quark loops is taken into account from the beginning. Using techniques based on analyticity and generalized ladder-graph dynamics, we first derive the approximate ''generic'' Regge-trajectory formula α(t) = max(S_1 + S_2, S_3 + S_4) - 1/2 + 2α̂'[s_a + (1/2)(t - Σ m_i^2)] for any given hadronic process 1+2→3+4, where S_i and m_i are the spins and masses of i = 1,2,3,4, and √s_a is the effective mass of the lowest non-vanishing contribution (a) exchanged in the crossed channel. By requiring a minimization of secondary (background, etc.) contributions to a, and demanding simultaneous consistency for entire sets of such processes, we are then able to calculate the masses of all the lowest pseudoscalar and vector qq-bar states with q = u,d,s and the Regge trajectories on which they lie. By making certain additional assumptions we are also able to do this with q = u,d,c and q = u,d,b. Our only arbitrary parameters are m_ρ, m_K*, m_ψ, and m_Υ, one of which merely serves to fix the energy scale. In contrast to many other approaches, a small m_π^2/m_ρ^2 ratio arises quite naturally in the present scheme.

  2. Speed Consistency in the Smart Tachograph.

    Science.gov (United States)

    Borio, Daniele; Cano, Eduardo; Baldini, Gianmarco

    2018-05-16

    In the transportation sector, safety risks can be significantly reduced by monitoring the behaviour of drivers and by discouraging possible misconduct that entails fatigue and can increase the possibility of accidents. The Smart Tachograph (ST), the new revision of the Digital Tachograph (DT), has been designed with this purpose: to verify that speed limits and compulsory rest periods are respected by drivers. In order to operate properly, the ST periodically checks the consistency of data from different sensors, which can potentially be manipulated to avoid the monitoring of driver behaviour. In this respect, the ST regulation specifies a test procedure to detect motion conflicts originating from inconsistencies between Global Navigation Satellite System (GNSS) and odometry data. This paper provides an experimental evaluation of the speed verification procedure specified by the ST regulation. Several hours of data were collected using three vehicles and considering light urban and highway environments. The vehicles were equipped with an On-Board Diagnostics (OBD) data reader and a GPS/Galileo receiver. The tests prescribed by the regulation were implemented with specific focus on synchronization aspects. The experimental analysis also considered aspects such as the impact of tunnels and the presence of data gaps. The analysis shows that the metrics selected for the tests are resilient to data gaps, latencies between GNSS and odometry data, and simplistic manipulations such as data scaling. The new ST forces an attacker to falsify data from both sensors at the same time and in a coherent way. This makes the implementation of fraud more difficult than with the current version of the DT.
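
    The motion-conflict idea can be illustrated by comparing two time-aligned speed series against a tolerance and a minimum duration. The sketch below is a simplified stand-in, not the test procedure prescribed by the ST regulation; the 10 km/h tolerance, the 5-sample rule and the scaled odometry trace are assumptions.

    ```python
    import numpy as np

    def motion_conflicts(gnss_speed_kmh: np.ndarray,
                         odo_speed_kmh: np.ndarray,
                         tolerance_kmh: float = 10.0,
                         min_consecutive: int = 5) -> list[tuple[int, int]]:
        """Return (start, end) index ranges where the two speed series disagree.

        A conflict is flagged only when the absolute difference exceeds the tolerance
        for at least `min_consecutive` consecutive samples. The thresholds here are
        illustrative, not the values prescribed by the Smart Tachograph regulation.
        """
        diff = np.abs(gnss_speed_kmh - odo_speed_kmh) > tolerance_kmh
        conflicts, start = [], None
        for i, bad in enumerate(diff):
            if bad and start is None:
                start = i
            elif not bad and start is not None:
                if i - start >= min_consecutive:
                    conflicts.append((start, i))
                start = None
        if start is not None and len(diff) - start >= min_consecutive:
            conflicts.append((start, len(diff)))
        return conflicts

    # Toy example: the odometry trace is scaled down (a simplistic manipulation).
    t = np.arange(0, 60)                      # one sample per second
    gnss = 80.0 + 5.0 * np.sin(t / 10.0)
    odo = 0.8 * gnss                          # scaled, so the two traces diverge
    print(motion_conflicts(gnss, odo))
    ```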

  3. Thermodynamically consistent mesoscopic model of the ferro/paramagnetic transition

    Czech Academy of Sciences Publication Activity Database

    Benešová, Barbora; Kružík, Martin; Roubíček, Tomáš

    2013-01-01

    Vol. 64, No. 1 (2013), pp. 1-28 ISSN 0044-2275 R&D Projects: GA AV ČR IAA100750802; GA ČR GA106/09/1573; GA ČR GAP201/10/0357 Grant - others: GA ČR(CZ) GA106/08/1397; GA MŠk(CZ) LC06052 Program: GA; LC Institutional support: RVO:67985556 Keywords: ferro-para-magnetism * evolution * thermodynamics Subject RIV: BA - General Mathematics; BA - General Mathematics (UT-L) Impact factor: 1.214, year: 2013 http://library.utia.cas.cz/separaty/2012/MTR/kruzik-thermodynamically consistent mesoscopic model of the ferro-paramagnetic transition.pdf

  4. Project management. A discipline which contributes to project success

    International Nuclear Information System (INIS)

    Hoch, G.

    2008-01-01

    The presentation covers the following topics: description of the Project - a contract was signed between KNPP and European Consortium Kozloduy (ECK) consisting of Framatome ANP GmbH as Leader (63%), Framatome ANP S.A.S. (17%), Atomenergoexport (20%) ; Project management in the modernization of NPP Kozloduy units 5 and 6; Project management process within AREVA NP GmbH; current status

  5. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  6. Astro tourism: Astro Izery project

    Science.gov (United States)

    Mrozek, Tomasz; Kołomański, Sylwester; Żakowicz, Grzegorz; Kornafel, Stanisław; Czarnecki, Tomasz L.; Suchan, Pavel; Kamiński, Zbigniew

    2015-03-01

    The Astro Izery project is carried out by several institutions from Poland and the Czech Republic. Its aim is to educate and inform tourists who visit the Izery Mountains about astronomy and light pollution. The project consists of two kinds of activities: permanent (sundials, a planetary path, etc.) and periodic (meetings, workshops). After five years the project is in good health and will gain more elements in the coming years.

  7. Define Project

    DEFF Research Database (Denmark)

    Munk-Madsen, Andreas

    2005-01-01

    "Project" is a key concept in IS management. The word is frequently used in textbooks and standards. Yet we seldom find a precise definition of the concept. This paper discusses how to define the concept of a project. The proposed definition covers both heavily formalized projects and informally...... organized, agile projects. Based on the proposed definition popular existing definitions are discussed....

  8. Project Management

    DEFF Research Database (Denmark)

    Project Management Theory Meets Practice contains the proceedings from the 1st Danish Project Management Research Conference (DAPMARC 2015), held in Copenhagen, Denmark, on May 21st, 2015.

  9. Project Management

    DEFF Research Database (Denmark)

    Pilkington, Alan; Chai, Kah-Hin; Le, Yang

    2015-01-01

    This paper identifies the true coverage of PM theory through a bibliometric analysis of the International Journal of Project Management from 1996 to 2012. We identify six persistent research themes: project time management, project risk management, programme management, large-scale project management, project success/failure and practitioner development. These differ from those presented in review and editorial articles in the literature. In addition, topics missing from the PM BOK (knowledge management, project-based organization and project portfolio management) have become more popular topics...

  10. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation

    Science.gov (United States)

    Lindell, Annukka K.

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora. PMID:28270790
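
    The intra-individual consistency figure quoted above (α = 0.72) is a standard internal-consistency statistic. The sketch below computes Cronbach's alpha for a made-up subject-by-selfie matrix of binary pose codes; the coding scheme and the data are illustrative, not the study's.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
        n_items = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return n_items / (n_items - 1) * (1.0 - item_vars.sum() / total_var)

    # Illustrative coding: rows are selfie-takers, columns their 10 selfies,
    # 1 = left-cheek pose, 0 = otherwise (values below are simulated).
    rng = np.random.default_rng(1)
    personal_bias = rng.random((50, 1))                  # each person's own tendency
    poses = (rng.random((50, 10)) < personal_bias).astype(float)
    print(f"alpha = {cronbach_alpha(poses):.2f}")
    ```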

  11. 78 FR 58288 - Consistency Certification for a Proposed Project in Sterling, New York; Notice of Appeal

    Science.gov (United States)

    2013-09-23

    ... and Dates: You may submit written comments concerning this appeal or requests for a public hearing to..., Room 6111, Silver Spring, MD 20910, or via email to [email protected] . Comments or requests for a public hearing must be sent in writing postmarked or emailed no later than October 23, 2013...

  12. Consistent Practices for the Probability of Detection (POD) of Fracture Critical Metallic Components Project

    Science.gov (United States)

    Hughitt, Brian; Generazio, Edward (Principal Investigator); Nichols, Charles; Myers, Mika (Principal Investigator); Spencer, Floyd (Principal Investigator); Waller, Jess (Principal Investigator); Wladyka, Jordan (Principal Investigator); Aldrin, John; Burke, Eric; Cerecerez, Laura

    2016-01-01

    NASA-STD-5009 requires that successful flaw detection by NDE methods be statistically qualified for use on fracture critical metallic components, but does not standardize practices. This task works towards standardizing calculations and record retention with a web-based tool, the NNWG POD Standards Library or NPSL. Test methods will also be standardized with an appropriately flexible appendix to -5009 identifying best practices. Additionally, this appendix will describe how specimens used to qualify NDE systems will be cataloged, stored and protected from corrosion, damage, or loss.

  13. Project financing

    International Nuclear Information System (INIS)

    Cowan, A.

    1998-01-01

    Project financing was defined ('where a lender to a specific project has recourse only to the cash flow and assets of that project for repayment and security respectively') and its attributes were described. Project financing was said to be particularly well suited to power, pipeline, mining, telecommunications, petro-chemicals, road construction, and oil and gas projects, i.e. large infrastructure projects that are difficult to fund on-balance sheet, where the risk profile of a project does not fit the corporation's risk appetite, or where higher leverage is required. Sources of project financing were identified. The need to analyze and mitigate risks, and being aware that lenders always take a conservative view and gravitate towards the lowest common denominator, were considered the key to success in obtaining project financing funds. TransAlta Corporation's project financing experiences were used to illustrate the potential of this source of financing

  14. Project descriptions

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    This part specifies the activities and project tasks of each project broken down according to types of financing, listing the current projects Lw 1 through 3 funded by long-term provisions (budget), the current projects LB 1 and 2, LG 1 through 5, LK1, LM1, and LU 1 through 6 financed from special funds, and the planned projects ZG 1 through 4 and ZU 1, also financed from special funds. (DG) [de

  15. SISCAL project

    Science.gov (United States)

    Santer, Richard P.; Fell, Frank

    2003-05-01

    ), combining satellite data, evaluation algorithms and value-adding ancillary digital information. This prevents the end user from investing funds into expensive equipment or to hire specialized personnel. The data processor shall be a generic tool, which may be applied to a large variety of operationally gathered satellite data. In the frame of SISCAL, the processor shall be applied to remotely sensed data of selected coastal areas and lakes in Central Europe and the Eastern Mediterranean, according to the needs of the end users within the SISCAL consortium. A number of measures are required to achieve the objective of the proposed project: (1) Identification and specification of the SISCAL end user needs for NRT water related data products accessible to EO techniques. (2) Selection of the most appropriate instruments, evaluation algorithms and ancillary data bases required to provide the identified data products. (3) Development of the actual Near-Real-Time data processor for the specified EO data products. (4) Development of the GIS processor adding ancillary digital information to the satellite images and providing the required geographical projections. (5) Development of a product retrieval and management system to handle ordering and distribution of data products between the SISCAL server and the end users, including payment and invoicing. (6) Evaluation of the derived data products in terms of accuracy and usefulness by comparison with available in-situ measurements and by making use of the local expertise of the end users. (7) Establishing an Internet server dedicated to internal communication between the consortium members as well as presenting the SISCAL project to a larger public. (8) Marketing activities, presentation of data processor to potential external customers, identification of their exact needs. The innovative aspect of the SISCAL project consists in the generation of NRT data products on water quality parameters from EO data. This article mainly deals

  16. Project studies

    DEFF Research Database (Denmark)

    Geraldi, Joana; Söderlund, Jonas

    2018-01-01

    Project organising is a growing field of scholarly inquiry and management practice. In recent years, two important developments have influenced this field: (1) the study and practice of projects have extended their level of analysis from mainly focussing on individual projects to focussing on micro......, and of the explanations of project practices they could offer. To discuss avenues for future research on projects and project practice, this paper suggests the notion of project studies to better grasp the status of our field. We combine these two sets of ideas to analyse the status and future options for advancing...... project research: (1) levels of analysis; and (2) type of research. Analysing recent developments within project studies, we observe the emergence of what we refer to as type 3 research, which reconciles the need for theoretical development and engagement with practice. Type 3 research suggests pragmatic...

  17. Project Longshot

    Science.gov (United States)

    West, J. Curtis; Chamberlain, Sally A.; Stevens, Robert; Pagan, Neftali

    1989-01-01

    Project Longshot is an unmanned probe to our nearest star system, Alpha Centauri, 4.3 light years away. The Centauri system is a trinary system consisting of two central stars (A and B) orbiting a barycenter, and a third (Proxima Centauri) orbiting the two. The system is at a declination of -67 degrees. The goal is to reach the Centauri system in 50 years. This time span was chosen because any shorter time would be impossible because of the relativistic velocities involved, and any greater time would be impossible because of the difficulty of creating a spacecraft with such a long lifetime. Therefore, the following mission profile is proposed: (1) the spacecraft is assembled in Earth orbit; (2) the spacecraft escapes Earth and Sun in the ecliptic with a single impulse maneuver; (3) the spacecraft changes declination to point toward the Centauri system; (4) the spacecraft accelerates to 0.1c; (5) the spacecraft coasts at 0.1c for 41 years; (6) the spacecraft decelerates upon reaching the Centauri system; and (7) the spacecraft orbits the Centauri system, conducts investigations, and relays data to Earth. The total time to reach the Centauri system, taking into consideration acceleration and deceleration, will be approximately 50 years.
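
    A quick back-of-the-envelope check of the quoted figures, ignoring relativistic corrections (which are small at 0.1c) and the details of the acceleration profile:

    ```python
    LIGHT_YEARS = 4.3      # distance to the Alpha Centauri system
    CRUISE_SPEED_C = 0.1   # coast speed as a fraction of the speed of light

    # Distance in light-years divided by speed in c gives time in years directly.
    coast_years = LIGHT_YEARS / CRUISE_SPEED_C
    # ~43 years; the mission profile quotes a 41-year coast because part of the
    # distance is covered while accelerating and decelerating, for ~50 years total.
    print(f"coast phase at 0.1c over the full distance: {coast_years:.0f} years")
    ```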

  18. Virtual projects

    DEFF Research Database (Denmark)

    Svejvig, Per; Commisso, Trine Hald

    2012-01-01

    Virtual projects are common, with global competition, market development, and not least the financial crisis forcing organizations to reduce their costs drastically. Organizations therefore have to place high importance on ways to carry out virtual projects and consider appropriate practices. We find that best-practice knowledge has not permeated sufficiently into practice. Furthermore, the appropriate application of information and communication technology (ICT) remains a big challenge, and finally project managers are not sufficiently trained in organizing and conducting virtual projects. The overall implications for research and practice are to acknowledge virtual project management as very different from traditional project management and to address this difference.

  19. Project financing

    International Nuclear Information System (INIS)

    Alvarez, M.U.

    1990-01-01

    This paper presents the basic concepts and components of the project financing of large industrial facilities. Diagrams of a simple partnership structure and a simple leveraged lease structure are included. Finally, a Hypothetical Project is described with basic issues identified for discussion purposes. The topics of the paper include non-recourse financing, principal advantages and objectives, disadvantages, project financing participants and agreements, feasibility studies, organization of the project company, principal agreements in a project financing, insurance, and an examination of a hypothetical project

  20. Microsoft project

    OpenAIRE

    Markić, Lucija; Mandušić, Dubravka; Grbavac, Vitomir

    2005-01-01

    Microsoft Project is a tool whose advantages are irreplaceable in everyday work. Microsoft Project enables resource management, the creation of reports on projects over time, and the analysis of different scenarios. It comes in three versions: Microsoft Project Professional, Microsoft Project Server and Microsoft Project Server Client Access Licenses. The current trend is that modern business people entrust their tasks to Microsoft Project, because it considerably increases work productivity. These advantages...

  1. Project ethics

    CERN Document Server

    Jonasson, Haukur Ingi

    2013-01-01

    How relevant is ethics to project management? The book - which aims to demystify the field of ethics for project managers and managers in general - takes both a critical and a practical look at project management in terms of success criteria, and ethical opportunities and risks. The goal is to help the reader to use ethical theory to further identify opportunities and risks within their projects and thereby to advance more directly along the path of mature and sustainable managerial practice.

  2. Project Temporalities

    DEFF Research Database (Denmark)

    Tryggestad, Kjell; Justesen, Lise; Mouritsen, Jan

    2013-01-01

    Purpose – The purpose of this paper is to explore how animals can become stakeholders in interaction with project management technologies and what happens with project temporalities when new and surprising stakeholders become part of a project and a recognized matter of concern to be taken into account. Design/methodology/approach – The paper is based on a qualitative case study of a project in the building industry. The authors use actor-network theory (ANT) to analyze the emergence of animal stakeholders, stakes and temporalities. Findings – The study shows how project temporalities can multiply in interaction with project management technologies and how conventional linear conceptions of project time may be contested with the emergence of new non-human stakeholders and temporalities. Research limitations/implications – The study draws on ANT to show how animals can become stakeholders

  3. Network Visualization Project (NVP)

    Science.gov (United States)

    2016-07-01

    (Figure captions from the report: Application data flow; Sample JSON data.) ... interface supporting improved network analysis and network communication visualization. 2. Application Design: NVP consists of 2 parts: a back-end data ... notation (JSON) format. This JSON is provided as input to the front-end application of the project. This interaction of the user with the back-end

  4. 14 CFR 151.41 - Project costs.

    Science.gov (United States)

    2010-01-01

    ... 14 CFR Aeronautics and Space, Federal Aid to Airports, Rules and Procedures for Airport Development Projects, § 151.41 Project costs. (a) For the purposes of subparts B and C, project costs consist of any costs involved in accomplishing a...

  5. Project Management Methodology in Human Resource Management

    Science.gov (United States)

    Josler, Cheryl; Burger, James

    2005-01-01

    When charged with overseeing a project, how can one ensure that the project will be completed on time, within budget, and to the satisfaction of everyone involved? In this article, the authors examine project management methodology as a means of ensuring that projects are conducted in a disciplined, well-managed and consistent manner that serves…

  6. Multiplicative Consistency for Interval Valued Reciprocal Preference Relations

    OpenAIRE

    Wu, Jian; Chiclana, Francisco

    2014-01-01

    The multiplicative consistency (MC) property of interval additive reciprocal preference relations (IARPRs) is explored, and then the consistency index is quantified by the multiplicative consistency estimated IARPR. The MC property is used to measure the level of consistency of the information provided by the experts and also to propose the consistency index induced ordered weighted averaging (CI-IOWA) operator. The novelty of this operator is that it aggregates individual IARPRs in such ...
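
    As a concrete illustration of multiplicative consistency, the sketch below checks the ordinary (non-interval) multiplicative-transitivity condition for a reciprocal fuzzy preference relation and reports a simple deviation-based index. It does not implement the interval-valued machinery or the CI-IOWA operator of the paper; the 3x3 matrix is made up.

    ```python
    import numpy as np
    from itertools import permutations

    def multiplicative_inconsistency(r: np.ndarray) -> float:
        """Average deviation from multiplicative transitivity for a reciprocal
        fuzzy preference relation (r[i, j] + r[j, i] = 1, r[i, i] = 0.5).

        Perfect multiplicative consistency gives 0; larger values mean less consistency.
        """
        n = r.shape[0]
        deviations = [
            abs(r[i, j] * r[j, k] * r[k, i] - r[i, k] * r[k, j] * r[j, i])
            for i, j, k in permutations(range(n), 3)
        ]
        return float(np.mean(deviations))

    # A 3x3 reciprocal preference relation (made-up values).
    R = np.array([[0.5, 0.7, 0.8],
                  [0.3, 0.5, 0.6],
                  [0.2, 0.4, 0.5]])
    print(f"inconsistency index: {multiplicative_inconsistency(R):.3f}")
    ```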

  7. A new self-consistent model for thermodynamics of binary solutions

    Czech Academy of Sciences Publication Activity Database

    Svoboda, Jiří; Shan, Y. V.; Fischer, F. D.

    2015-01-01

    Vol. 108, NOV (2015), pp. 27-30 ISSN 1359-6462 R&D Projects: GA ČR(CZ) GA14-24252S Institutional support: RVO:68081723 Keywords: Thermodynamics * Analytical methods * CALPHAD * Phase diagram * Self-consistent model Subject RIV: BJ - Thermodynamics Impact factor: 3.305, year: 2015

  8. Efficient and Effective Project Management

    Directory of Open Access Journals (Sweden)

    Dusan Pene

    2014-03-01

    Full Text Available The purpose of the article is to investigate the different authorities and responsibilities of a project manager and of a project leader. Considering the fact that nowadays project management is becoming an important factor in performing and leading investments, which are modified by modern leadership theories, we can say that the key element is the sovereign leadership of a manager and a project leader. Current multi-project environments and modern techniques in the project management area require an interdisciplinary leadership approach, and at the same time they enable the strengthening of a company's competitive features by consistently satisfying the high project expectations of the project investor or client.

  9. LEX Project

    DEFF Research Database (Denmark)

    Damkilde, Lars; Larsen, Torben J.; Walbjørn, Jacob

    This document is aimed at helping all parties involved in the LEX project to get a common understanding of words, process, levels and the overall concept.

  10. OMEGA project

    International Nuclear Information System (INIS)

    Shibuya, E.H.

    1989-01-01

    The OMEGA (Observation of Multiple particle production, Exotic Interactions and Gamma-ray Air Showers) project is presented. The project tries to associate photosensitive detectors from hadronic-interaction experiments with the electronic detectors used by experiments that investigate extensive atmospheric showers. (M.C.K.)

  11. Privacy, Time Consistent Optimal Labour Income Taxation and Education Policy

    OpenAIRE

    Konrad, Kai A.

    1999-01-01

    Incomplete information is a commitment device for time consistency problems. In the context of time consistent labour income taxation privacy reduces welfare losses and increases the effectiveness of public education as a second best policy.

  12. Generalized contexts and consistent histories in quantum mechanics

    International Nuclear Information System (INIS)

    Losada, Marcelo; Laura, Roberto

    2014-01-01

    We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions which involve properties at different times.

  13. Personality and Situation Predictors of Consistent Eating Patterns

    OpenAIRE

    Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K.

    2015-01-01

    Introduction A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studi...

  14. Two Impossibility Results on the Converse Consistency Principle in Bargaining

    OpenAIRE

    Youngsub Chun

    1999-01-01

    We present two impossibility results on the converse consistency principle in the context of bargaining. First, we show that there is no solution satisfying Pareto optimality, contraction independence, and converse consistency. Next, we show that there is no solution satisfying Pareto optimality, strong individual rationality, individual monotonicity, and converse consistency.

  15. Personality consistency analysis in cloned quarantine dog candidates

    Directory of Open Access Journals (Sweden)

    Jin Choi

    2017-01-01

    Full Text Available In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups followed by the modified Puppy Aptitude Test, which consists of ten subtests to ascertain the influence of genetic identity. In this test, puppies are exposed to stranger, restraint, prey-like object, noise, startling object, etc. Six cloned and four control puppies participated and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991, those of the control group were not (P = 0.0089. Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group, however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned. This study implies that personality consistency could be one of the ways to analyse traits of puppies.

  16. Checking Consistency of Pedigree Information is NP-complete

    DEFF Research Database (Denmark)

    Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna

    Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...
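
    To make the notion of 'consistency with the Mendelian laws' concrete, the toy sketch below checks a single parent-parent-child trio at one autosomal locus; the NP-completeness result above concerns whole pedigrees with partially missing genotype information, which is much harder. The genotypes are illustrative.

    # Toy sketch: single-locus Mendelian consistency for one parent-parent-child trio.
    # (The NP-completeness result concerns whole pedigrees with missing genotypes.)
    def trio_consistent(mother, father, child):
        """Each genotype is a tuple of two alleles, e.g. ('A', 'a')."""
        a, b = child
        return ((a in mother and b in father) or
                (b in mother and a in father))

    print(trio_consistent(('A', 'a'), ('a', 'a'), ('A', 'a')))  # True
    print(trio_consistent(('A', 'A'), ('A', 'A'), ('A', 'a')))  # False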

  17. 26 CFR 1.338-8 - Asset and stock consistency.

    Science.gov (United States)

    2010-04-01

    26 CFR 1.338-8 Asset and stock consistency (Income Taxes, Effects on Corporation; 2010-04-01 edition). (a) Introduction — (1) … (6) Stock consistency. This section limits the application of … that are controlled foreign corporations.

  18. Watchdog Project

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Rhett [Schweitzer Engineering Laboratories, Inc., Pullman, WA (United States); Campbell, Jack [CenterPoint Energy Houston Electric, TX (United States); Hadley, Mark [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-12-30

    The Watchdog Project completed 100% of the project Statement of Project Objective (SOPO). The Watchdog project was a very aggressive project looking to accomplish commercialization of technology that had never been commercialized; as a result, it took six years to complete rather than the original three that were planned. No additional federal funds were requested beyond the original proposal, and SEL contributed the additional cost share required to complete the project. The result of the Watchdog Project is the world's first industrially rated Software Defined Network (SDN) switch commercially available. This technology achieved the SOPO and DOE Roadmap goals to provide strong network access control, improve reliability and network performance, and give the asset owner the ability to minimize the attack surface before and during an attack. The Watchdog project is an alliance between CenterPoint Energy Houston Electric, Pacific Northwest National Laboratory (PNNL), and Schweitzer Engineering Laboratories, Inc. (SEL). SEL is the world's leader in microprocessor-based electronic equipment for protecting electric power systems. PNNL performs basic and applied research to deliver energy, environmental, and national security for our nation. CenterPoint Energy is the third largest publicly traded natural gas delivery company in the U.S. and the third largest combined electricity and natural gas delivery company. The Watchdog Project efforts were combined with the SDN Project efforts to produce the entire SDN system solution for the critical infrastructure. The Watchdog project addresses Topic Area of Interest 5: Secure Communications, for DE-FOA-0000359 by protecting the control system local area network itself and the communications coming from and going to the electronic devices on the local network. Local area networks usually are not routed and have little or no filtering capabilities. Combine this with the fact that control system protocols are designed with inherent trust the control

  19. Freedom Project

    Directory of Open Access Journals (Sweden)

    Alejandra Suarez

    2014-02-01

    Full Text Available Freedom Project trains prisoners in nonviolent communication and meditation. Two complementary studies of its effects are reported in this article. The first study is correlational; we found decreased recidivism rates among prisoners trained by Freedom Project compared with recidivism rates in Washington state. The second study compared trained prisoners with a matched-pair control group and found improvement in self-reported anger, self-compassion, and certain forms of mindfulness among the trained group. Ratings of role-plays simulating difficult interactions show greater social skills in the group trained by Freedom Project than in the matched controls.

  20. EBFA project

    International Nuclear Information System (INIS)

    Anon.

    1979-01-01

    An engineering project office was established during the fall of 1976 to manage and coordinate all of the activities of the Electron Beam Fusion Project. The goal of the project is to develop the Electron Beam Fusion Accelerator (EBFA) and its supporting systems, and integrate these systems into the new Electron Beam Fusion Facility (EBFF). Supporting systems for EBFA include a control/monitor system, a data acquisition/automatic data processing system, the liquid transfer systems, the insulating gas transfer systems, etc. Engineers and technicians were assigned to the project office to carry out the engineering design, initiate procurement, monitor the fabrication, perform the assembly and to assist the pulsed power research group in the activation of the EBFA.

  1. Project Reptile!

    Science.gov (United States)

    Diffily, Deborah

    2001-01-01

    Integrating curriculum is important in helping children make connections within and among areas. Presents a class project for kindergarten children which came out of the students' interests and desire to build a reptile exhibit. (ASK)

  2. Project Soar.

    Science.gov (United States)

    Austin, Marion

    1982-01-01

    Project Soar, a Saturday enrichment program for gifted students (6-14 years old), allows students to work intensively in a single area of interest. Examples are cited of students' work in crewel embroidery, creative writing, and biochemistry. (CL)

  3. EUROFANCOLEN Project

    International Nuclear Information System (INIS)

    Bueren, J. A.

    2014-01-01

    The first follow-up report of European Project EUROFANCOLEN, the purpose of which is to develop a gene therapy clinical trial to resolve bone marrow failure in patients with a genetic disease known as Fanconi anemia (FA), was sent to the European Commission in September. The main objective of project EUROFANCOLEN is to develop a gene therapy trial for patients with Fanconi anemia Type A (FA-A), which affects 80% of the patients with FA in Spain. (Author)

  4. Project Management

    DEFF Research Database (Denmark)

    Kampf, Constance

    2009-01-01

    In this video Associate Professor Constance Kampf talks about the importance of project management: not only as a tool in implementation, but also as a way of thinking, and as something that needs to be considered from idea conception.

  5. The Impact of Project Management Maturity upon IT/IS Project Management Outcomes

    Science.gov (United States)

    Carcillo, Anthony Joseph, Jr.

    2013-01-01

    Although it is assumed that increasing the institutionalization (or maturity) of project management in an organization leads to greater project success, the literature has diverse views. The purpose of this mixed methods study was to examine the correlation between project management maturity and IT/IS project outcomes. The sample consisted of two…

  6. The random projection method

    CERN Document Server

    Vempala, Santosh S

    2005-01-01

    Random projection is a simple geometric technique for reducing the dimensionality of a set of points in Euclidean space while preserving pairwise distances approximately. The technique plays a key role in several breakthrough developments in the field of algorithms. In other cases, it provides elegant alternative proofs. The book begins with an elementary description of the technique and its basic properties. Then it develops the method in the context of applications, which are divided into three groups. The first group consists of combinatorial optimization problems such as maxcut, graph coloring, minimum multicut, graph bandwidth and VLSI layout. Presented in this context is the theory of Euclidean embeddings of graphs. The next group is machine learning problems, specifically, learning intersections of halfspaces and learning large margin hypotheses. The projection method is further refined for the latter application. The last set consists of problems inspired by information retrieval, namely, nearest neig...
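
    A minimal sketch of the basic technique, assuming a Gaussian random matrix scaled by 1/sqrt(k): pairwise distances of the projected points stay close to the originals. The dimensions and sample size below are arbitrary.

    # Minimal sketch of random projection: pairwise distances are approximately
    # preserved when d-dimensional points are mapped to k << d dimensions.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, k = 50, 1000, 100                    # sample size and dimensions (arbitrary)
    X = rng.normal(size=(n, d))                # original points
    P = rng.normal(size=(d, k)) / np.sqrt(k)   # random projection matrix
    Y = X @ P                                  # projected points

    def pdist(A):
        diff = A[:, None, :] - A[None, :, :]
        return np.sqrt((diff ** 2).sum(-1))

    ratio = pdist(Y)[np.triu_indices(n, 1)] / pdist(X)[np.triu_indices(n, 1)]
    print(ratio.min(), ratio.max())            # close to 1 for large enough k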

  7. Consistent Regulation of Infrastructure Businesses: Some Economic Issues

    OpenAIRE

    Flavio M. Menezes

    2008-01-01

    This paper examines some important economic aspects associated with the notion that consistency in the regulation of infrastructure businesses is a desirable feature. It makes two important points. First, it is not easy to measure consistency. In particular, one cannot simply point to different regulatory parameters as evidence of inconsistent regulatory policy. Second, even if one does observe consistency emerging from decisions made by different regulators, it does not necessarily mean that...

  8. Geospace exploration project: Arase (ERG)

    Science.gov (United States)

    Miyoshi, Y.; Kasaba, Y.; Shinohara, I.; Takashima, T.; Asamura, K.; Matsumoto, H.; Higashio, N.; Mitani, T.; Kasahara, S.; Yokota, S.; Wang, S.; Kazama, Y.; Kasahara, Y.; Yagitani, S.; Matsuoka, A.; Kojima, H.; Katoh, Y.; Shiokawa, K.; Seki, K.; Fujimoto, M.; Ono, T.; ERG project Group

    2017-06-01

    The ERG (Exploration of energization and Radiation in Geospace) is a Japanese geospace exploration project. The project focuses on the relativistic electron acceleration mechanism of the outer belt and on the dynamics of space storms in the context of cross-energy coupling via wave-particle interactions. The project consists of the satellite observation team, the ground-based network observation team, and the integrated-data analysis/simulation team. The satellite was launched on December 20, 2016 and has been nicknamed "Arase". This paper describes an overview of the project and future plans for observations.

  9. HIRFL-CSR project

    International Nuclear Information System (INIS)

    Zhan, W.L.; Xia, J.W.; Wei, B.W.; Yuan, Y.J.; Zhao, H.W.; Man, K.T.; Dang, J.R.; Yuan, P.; Gao, D.Q.; Yang, X.T.; Song, M.T.; Zhang, W.Z.; Xiao, G.Q.; Cai, X.H.; Tang, J.Y.; Qiao, W.M.; Yang, X.D.; Wang, Y.F.

    2001-01-01

    HIRFL-CSR, the project that was proposed to upgrade the HIRFL facility, is a multifunctional Cooling Storage Ring (CSR) system consisting of a main ring (CSRm) and an experimental ring (CSRe). The heavy ion beams from the HIRFL will be injected, accumulated, cooled and accelerated to high energy in the CSRm, then fast-extracted to produce radioactive ion beams (RIB) or highly charged ions, or slow-extracted for experiments. The secondary beams will be accepted by the CSRe and used for internal-target experiments and for high-sensitivity, high-precision spectroscopy with cooled beams. The CSR project was started at the end of 1999 and will finish at the end of 2004. The period from the beginning of 2000 to the summer of 2001 is devoted to building construction, fabrication design and prototype experiments. In this paper, the outline and status of the project are reported.

  10. Self-consistent descriptions of vector mesons in hot matter reexamined

    International Nuclear Information System (INIS)

    Riek, Felix; Knoll, Joern

    2010-01-01

    Technical concepts are presented that improve the self-consistent treatment of vector mesons in a hot and dense medium. First applications concern an interacting gas of pions and ρ mesons. As an extension of earlier studies, we thereby include random-phase-approximation-type vertex corrections and further use dispersion relations to calculate the real part of the vector-meson self-energy. An improved projection method preserves the four-transversality of the vector-meson polarization tensor throughout the self-consistent calculations, thereby keeping the scheme void of kinematical singularities.

  11. Project mobilisation

    International Nuclear Information System (INIS)

    Clark, J.; Limbrick, A.

    1996-01-01

    This paper identifies and reviews the issues to be addressed and the procedures to be followed during the mobilisation of projects using LFG as an energy source. Knowledge of the procedures involved in project mobilisation, their sequence and probable timescales, is essential for efficient project management. It is assumed that the majority of projects will be situated on existing, licensed landfill sites and, in addition to complying with the relevant conditions of the waste management licence and original planning consent, any proposed developments on the site will require a separate planning consent. Experience in the UK indicates that obtaining planning permission rarely constitutes a barrier to the development of schemes for the utilisation of LFG. Even so, an appreciation of the applicable environmental and planning legislation is essential as this will enable the developer to recognise the main concerns of the relevant planning authority at an early stage of the project, resulting in the preparation of an informed and well-structured application for planning permission. For an LFG utilisation scheme on an existing landfill site, the need to carry out an environmental assessment (EA) as part of the application for planning permission will, in virtually all cases, be discretionary. Even if not deemed necessary by the planning authority, an EA is a useful tool at the planning application stage, to identify and address potential problems and to support discussions with bodies such as the Environment Agency, from whom consents or authorisations may be required. Carrying out an EA can thus provide for more cost-effective project development and enhanced environmental protection. Typically, the principal contractual arrangements, such as the purchase of gas or the sale of electricity, will have been established before the project mobilisation phase. However, there are many other contractual arrangements that must be established, and consents and permits that may be

  12. Personality consistency in dogs: a meta-analysis.

    Science.gov (United States)

    Fratkin, Jamie L; Sinn, David L; Patall, Erika A; Gosling, Samuel D

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
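
    As a rough illustration of how per-study consistency correlations can be pooled, the sketch below uses a simple fixed-effect Fisher-z average with (n - 3) weights; the meta-analysis above uses a more elaborate model, and the numbers here are made up rather than taken from the 31 studies.

    # Rough sketch: pooling consistency correlations across studies via Fisher's z
    # with inverse-variance (n - 3) weights. Values below are made up, not the
    # actual study data behind r = 0.43.
    import numpy as np

    r = np.array([0.35, 0.50, 0.42, 0.48])   # per-study consistency correlations
    n = np.array([40, 120, 75, 60])          # per-study sample sizes

    z = np.arctanh(r)                        # Fisher z-transform
    w = n - 3                                # inverse-variance weights
    pooled_r = np.tanh((w * z).sum() / w.sum())
    print(round(pooled_r, 3))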

  13. Personality consistency in dogs: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Jamie L Fratkin

    Full Text Available Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.

  14. Personality Consistency in Dogs: A Meta-Analysis

    Science.gov (United States)

    Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787

  15. Team work on international projects

    International Nuclear Information System (INIS)

    Hayfield, F.

    1983-01-01

    A successful team will result in Project efficiency and so lead to a better achievement of the Project objectives. Such a team will be self-motivating and have a high level of morale. An effective team will also create a better context for transfer of know-how and so better prepare its members for greater roles on future Project teams. The nature of Project work forces the process of team building to recognize several facts of life. A Project team can have a life as short as one year and as long as ten years. A team usually consists of people on temporary transfer from different departments yet retaining a link of some sort to their departments of origin. It may consist of members of one company only or of several as in a joint-venture and may include Client personnel. On International Projects, the members of a team may have different nationalities and be working in a language foreign to many of them. Many of the Project people may be expatriates to the Project area on a bachelor or on a married status well away from their head or usual office. Team building is a complex organizational and human process, with no mathematical formula for the ideal solution. It starts with the selection of the right Project Manager who should be a leader, a technocrat manager and an integrator all at the same time. The Project Manager must have the authority to create the organizational and human climate that will motivate to a maximum each member of the team. Each member must understand clearly his role and realize that this contribution to the Project will influence his career development. Loyalty to the Project Manager must be possible and the Departmental Manager has to recognize this necessity. This presentation will indicate the basic steps of a team building process on a typical major international Project

  16. Student Consistency and Implications for Feedback in Online Assessment Systems

    Science.gov (United States)

    Madhyastha, Tara M.; Tanimoto, Steven

    2009-01-01

    Most of the emphasis on mining online assessment logs has been to identify content-specific errors. However, the pattern of general "consistency" is domain independent, strongly related to performance, and can itself be a target of educational data mining. We demonstrate that simple consistency indicators are related to student outcomes,…

  17. 26 CFR 301.6224(c)-3 - Consistent settlements.

    Science.gov (United States)

    2010-04-01

    26 CFR 301.6224(c)-3 Consistent settlements (2010-04-01 edition). (a) In general. If the Internal Revenue Service enters into a settlement agreement with any …, settlement terms consistent with those contained in the settlement agreement entered into. (b) Requirements …

  18. Self-consistent calculation of atomic structure for mixture

    International Nuclear Information System (INIS)

    Meng Xujun; Bai Yun; Sun Yongsheng; Zhang Jinglin; Zong Xiaoping

    2000-01-01

    Based on a relativistic Hartree-Fock-Slater self-consistent average atomic model, the atomic structure of a mixture is studied by summing up the component volumes in the mixture. The algorithmic procedure for solving both the group of Thomas-Fermi equations and the self-consistent atomic structure is presented in detail, and some numerical results are discussed.
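
    The 'self-consistent' part of such calculations is, schematically, a fixed-point iteration: the output generated from the current input must reproduce that input. The toy scalar sketch below illustrates the idea with linear mixing; it is not the relativistic Hartree-Fock-Slater solver described in the record.

    # Schematic sketch of a self-consistent-field style fixed-point iteration with
    # linear mixing. This is a toy scalar example only.
    import math

    def scf(update, x0, mixing=0.5, tol=1e-10, max_iter=200):
        x = x0
        for _ in range(max_iter):
            x_new = update(x)                       # "output" from current "input"
            if abs(x_new - x) < tol:                # self-consistency reached
                return x_new
            x = (1 - mixing) * x + mixing * x_new   # mix old and new to stabilise
        raise RuntimeError("did not converge")

    # Toy update map with fixed point x = cos(x) ~= 0.739
    print(scf(math.cos, x0=1.0))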

  19. A Preliminary Study toward Consistent Soil Moisture from AMSR2

    NARCIS (Netherlands)

    Parinussa, R.M.; Holmes, T.R.H.; Wanders, N.; Dorigo, W.A.; de Jeu, R.A.M.

    2015-01-01

    A preliminary study toward consistent soil moisture products from the Advanced Microwave Scanning Radiometer 2 (AMSR2) is presented. Its predecessor, the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), has provided Earth scientists with a consistent and continuous global

  20. Consistency and Inconsistency in PhD Thesis Examination

    Science.gov (United States)

    Holbrook, Allyson; Bourke, Sid; Lovat, Terry; Fairbairn, Hedy

    2008-01-01

    This is a mixed methods investigation of consistency in PhD examination. At its core is the quantification of the content and conceptual analysis of examiner reports for 804 Australian theses. First, the level of consistency between what examiners say in their reports and the recommendation they provide for a thesis is explored, followed by an…

  1. Delimiting Coefficient α from Internal Consistency and Unidimensionality

    Science.gov (United States)

    Sijtsma, Klaas

    2015-01-01

    I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that concepts of internal consistency and…
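
    For reference, coefficient α is computed from the item variances and the variance of the total score, α = k/(k−1)·(1 − Σs²_item / s²_total). A minimal sketch with made-up item scores:

    # Minimal sketch of coefficient alpha (Cronbach's alpha) from an items matrix
    # (rows = respondents, columns = items). Data below are made up.
    import numpy as np

    def cronbach_alpha(items):
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    scores = [[3, 4, 3, 5],
              [2, 2, 3, 3],
              [4, 5, 4, 5],
              [1, 2, 2, 2]]
    print(round(cronbach_alpha(scores), 3))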

  2. Risk aversion vs. the Omega ratio : Consistency results

    NARCIS (Netherlands)

    Balder, Sven; Schweizer, Nikolaus

    This paper clarifies when the Omega ratio and related performance measures are consistent with second order stochastic dominance and when they are not. To avoid consistency problems, the threshold parameter in the ratio should be chosen as the expected return of some benchmark – as is commonly done
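
    The Omega ratio referred to above is the expected gain above a threshold θ divided by the expected loss below it; the consistency question hinges on how θ is chosen. A minimal sketch with made-up returns (per the paper, the threshold would be set to a benchmark's expected return):

    # Minimal sketch of the Omega ratio: expected gains above a threshold divided
    # by expected losses below it. Returns and threshold below are made up.
    import numpy as np

    def omega_ratio(returns, threshold):
        returns = np.asarray(returns, dtype=float)
        gains = np.clip(returns - threshold, 0, None).mean()
        losses = np.clip(threshold - returns, 0, None).mean()
        return gains / losses

    r = np.array([0.02, -0.01, 0.03, 0.00, -0.02, 0.05])
    print(omega_ratio(r, threshold=0.0))   # threshold chosen as a benchmark return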

  3. Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.

    Science.gov (United States)

    Edwards, H. P.; And Others

    1982-01-01

    Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)

  4. Policy consistency and the achievement of Nigeria's foreign policy ...

    African Journals Online (AJOL)

    This study is an attempt to investigate the policy consistency of Nigeria's foreign policy and to understand the basis for this consistency; and also to see whether peacekeeping/peace-enforcement is a key instrument in the achievement of Nigeria's foreign policy goals. The objective of the study was to examine whether the ...

  5. Decentralized Consistency Checking in Cross-organizational Workflows

    NARCIS (Netherlands)

    Wombacher, Andreas

    Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which

  6. Consistency of a system of equations: What does that mean?

    NARCIS (Netherlands)

    Still, Georg J.; Kern, Walter; Koelewijn, Jaap; Bomhoff, M.J.

    2010-01-01

    The concept of (structural) consistency, also called structural solvability, is an important basic tool for analyzing the structure of systems of equations. Our aim is to provide a sound and practically relevant meaning to this concept. The implications of consistency are expressed in terms of

  7. Quasi-Particle Self-Consistent GW for Molecules.

    Science.gov (United States)

    Kaplan, F; Harding, M E; Seiler, C; Weigend, F; Evers, F; van Setten, M J

    2016-06-14

    We present the formalism and implementation of quasi-particle self-consistent GW (qsGW) and eigenvalue only quasi-particle self-consistent GW (evGW) adapted to standard quantum chemistry packages. Our implementation is benchmarked against high-level quantum chemistry computations (coupled-cluster theory) and experimental results using a representative set of molecules. Furthermore, we compare the qsGW approach for five molecules relevant for organic photovoltaics to self-consistent GW results (scGW) and analyze the effects of the self-consistency on the ground state density by comparing calculated dipole moments to their experimental values. We show that qsGW makes a significant improvement over conventional G0W0 and that partially self-consistent flavors (in particular evGW) can be excellent alternatives.

  8. Consistency of hand preference: predictions to intelligence and school achievement.

    Science.gov (United States)

    Kee, D W; Gottfried, A; Bathurst, K

    1991-05-01

    Gottfried and Bathurst (1983) reported that hand preference consistency measured over time during infancy and early childhood predicts intellectual precocity for females, but not for males. In the present study longitudinal assessments of children previously classified by Gottfried and Bathurst as consistent or nonconsistent in cross-time hand preference were conducted during middle childhood (ages 5 to 9). Findings show that (a) early measurement of hand preference consistency for females predicts school-age intellectual precocity, (b) the locus of the difference between consistent vs. nonconsistent females is in verbal intelligence, and (c) the precocity of the consistent females was also revealed on tests of school achievement, particularly tests of reading and mathematics.

  9. Putting humans in ecology: consistency in science and management.

    Science.gov (United States)

    Hobbs, Larry; Fowler, Charles W

    2008-03-01

    Normal and abnormal levels of human participation in ecosystems can be revealed through the use of macro-ecological patterns. Such patterns also provide consistent and objective guidance that will lead to achieving and maintaining ecosystem health and sustainability. This paper focuses on the consistency of this type of guidance and management. Such management, in sharp contrast to current management practices, ensures that our actions as individuals, institutions, political groups, societies, and as a species are applied consistently across all temporal, spatial, and organizational scales. This approach supplants management of today, where inconsistency results from debate, politics, and legal and religious polarity. Consistency is achieved when human endeavors are guided by natural patterns. Pattern-based management meets long-standing demands for enlightened management that requires humans to participate in complex systems in consistent and sustainable ways.

  10. Projective mapping

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender Laurentius Petrus

    2012-01-01

    Projective Mapping (Risvik et al., 1994) and its Napping (Pagès, 2003) variations have become increasingly popular in the sensory field for rapid collection of spontaneous product perceptions. It has been applied in variations which sometimes are caused by the purpose of the analysis and sometimes by the practical testing environment. As a result of the changes, a reasonable assumption would be to question the consequences caused by the variations in method procedures. Here, the aim is to highlight the proven or hypothetic consequences of variations of Projective Mapping. Presented variations will include … instructions and influence heavily the product placements and the descriptive vocabulary (Dehlholm et al., 2012b). The type of assessors performing the method influences results with an extra aspect in Projective Mapping compared to more analytical tests, as the given spontaneous perceptions are much dependent …

  11. Isotopes Project

    International Nuclear Information System (INIS)

    Dairiki, J.M.; Browne, E.; Firestone, R.B.; Lederer, C.M.; Shirley, V.S.

    1984-01-01

    The Isotopes Project compiles and evaluates nuclear structure and decay data and disseminates these data to the scientific community. From 1940-1978 the Project had as its main objective the production of the Table of Isotopes. Since publication of the seventh (and last) edition in 1978, the group now coordinates its nuclear data evaluation efforts with those of other data centers via national and international nuclear data networks. The group is currently responsible for the evaluation of mass chains A = 167-194. All evaluated data are entered into the International Evaluated Nuclear Structure Data File (ENSDF) and are published in Nuclear Data Sheets. In addition to the evaluation effort, the Isotopes Project is responsible for production of the Radioactivity Handbook

  12. Project evaluation: one framework - four approaches

    DEFF Research Database (Denmark)

    Rode, Anna Le Gerstrøm; Svejvig, Per

    There are many theoretical and practical reasons for evaluating projects – including explorative arguments focusing on expanding descriptive knowledge on projects as well as normative arguments focusing on developing prescriptive knowledge of project management. Despite the need for effective project management and research methods that can assess effective project management methodologies, extant literature on evaluation procedures or guidelines on how to evaluate projects and/or project management is scarce. To address this challenge, this paper introduces an evaluation framework consisting … Introducing a framework that can help structure such evaluations, the aim of this paper is to contribute to project theory and practice by inspiring project researchers and aiding project workers in their efforts to open up the black box of projects and deliver relevant and valuable results.

  13. Personality and Situation Predictors of Consistent Eating Patterns.

    Science.gov (United States)

    Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K

    2015-01-01

    A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  14. Personality and Situation Predictors of Consistent Eating Patterns.

    Directory of Open Access Journals (Sweden)

    Uku Vainik

    Full Text Available A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  15. LLAMA Project

    Science.gov (United States)

    Arnal, E. M.; Abraham, Z.; Giménez de Castro, G.; de Gouveia dal Pino, E. M.; Larrarte, J. J.; Lepine, J.; Morras, R.; Viramonte, J.

    2014-10-01

    The project LLAMA, acronym of Long Latin American Millimetre Array, is very briefly described in this paper. This project is a joint scientific and technological undertaking of Argentina and Brazil on the basis of an equal investment share, whose main goal is both to install and to operate an observing facility capable of exploring the Universe at millimetre and sub-millimetre wavelengths. This facility will be erected in the Argentine province of Salta, at a site located 4830 m above sea level.

  16. Self-consistent ECCD calculations with bootstrap current

    International Nuclear Information System (INIS)

    Decker, J.; Bers, A.; Ram, A. K; Peysson, Y.

    2003-01-01

    To achieve high-performance, steady-state operation in tokamaks, it is increasingly important to find the appropriate means for modifying and sustaining the pressure and magnetic shear profiles in the plasma. In such advanced scenarios, especially in the vicinity of an internal transport barrier, RF-induced currents have to be calculated self-consistently with the bootstrap current, thus taking into account possible synergistic effects resulting from the momentum-space distortion of the electron distribution function f_e. Since RF waves can cause the distribution of electrons to become non-Maxwellian, the associated changes in parallel diffusion of momentum between trapped and passing particles can be expected to modify the bootstrap current fraction; conversely, the bootstrap current distribution function can enhance the current driven by RF waves. For this purpose, a new, fast and fully implicit solver has recently been developed to carry out computations including new and detailed evaluations of the interactions between bootstrap current (BC) and Electron Cyclotron current drive (ECCD). Moreover, Ohkawa current drive (OKCD) appears to be an efficient method for driving current when the fraction of trapped particles is large. OKCD in the presence of BC is also investigated. Here, results are illustrated around projected tokamak parameters in high-performance scenarios of Alcator C-Mod. It is shown that by increasing n//, the EC wave penetration into the bulk of the electron distribution is greater, and since the resonance extends up to high p// values, this situation is the usual ECCD based on the Fisch-Boozer mechanism concerning passing particles. However, because of the close vicinity of the trapped boundary at r/a = 0.7, this process is counterbalanced by the Ohkawa effect, possibly leading to a negative net current. Therefore, by injecting the EC wave in the opposite toroidal direction (n//RF by OKCD may be 70% larger than that of ECCD, with a choice of EC

  17. The Kyoto University tandem upgrading project

    International Nuclear Information System (INIS)

    Nakamura, Masanobu; Shimoura, Susumu; Takimoto, Kiyohiko; Sakaguchi, Harutaka; Kobayashi, Shinsaku

    1988-01-01

    A brief description of the Kyoto University tandem upgrading project is given. The project consists of replacing the old 5 MV tandem Van de Graaff with an 8UDH Pelletron. The old pressure vessel and beam lines are reused without significant modification. The project is planned to be completed at the end of 1989. (orig.)

  18. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Science.gov (United States)

    Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P

    2015-01-01

    This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  19. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Directory of Open Access Journals (Sweden)

    Alexander J Kirkham

    Full Text Available This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  20. Quasiparticle self-consistent GW method: a short summary

    International Nuclear Information System (INIS)

    Kotani, Takao; Schilfgaarde, Mark van; Faleev, Sergey V; Chantis, Athanasios

    2007-01-01

    We have developed a quasiparticle self-consistent GW method (QSGW), which is a new self-consistent method to calculate the electronic structure within the GW approximation. The method is formulated based on the idea of a self-consistent perturbation; the non-interacting Green function G 0 , which is the starting point for GWA to obtain G, is determined self-consistently so as to minimize the perturbative correction generated by GWA. After self-consistency is attained, we have G 0 , W (the screened Coulomb interaction) and G self-consistently. This G 0 can be interpreted as the optimum non-interacting propagator for the quasiparticles. We will summarize some theoretical discussions to justify QSGW. Then we will survey results which have been obtained up to now: e.g., band gaps for normal semiconductors are predicted to a precision of 0.1-0.3 eV; the self-consistency including the off-diagonal part is required for NiO and MnO; and so on. There are still some remaining disagreements with experiments; however, they are very systematic, and can be explained from the neglect of excitonic effects

  1. Protective Factors, Risk Indicators, and Contraceptive Consistency Among College Women.

    Science.gov (United States)

    Morrison, Leslie F; Sieving, Renee E; Pettingell, Sandra L; Hellerstedt, Wendy L; McMorris, Barbara J; Bearinger, Linda H

    2016-01-01

    To explore risk and protective factors associated with consistent contraceptive use among emerging adult female college students and whether effects of risk indicators were moderated by protective factors. Secondary analysis of National Longitudinal Study of Adolescent to Adult Health Wave III data. Data collected through in-home interviews in 2001 and 2002. National sample of 18- to 25-year-old women (N = 842) attending 4-year colleges. We examined relationships between protective factors, risk indicators, and consistent contraceptive use. Consistent contraceptive use was defined as use all of the time during intercourse in the past 12 months. Protective factors included external supports of parental closeness and relationship with caring nonparental adult and internal assets of self-esteem, confidence, independence, and life satisfaction. Risk indicators included heavy episodic drinking, marijuana use, and depression symptoms. Multivariable logistic regression models were used to evaluate relationships between protective factors and consistent contraceptive use and between risk indicators and contraceptive use. Self-esteem, confidence, independence, and life satisfaction were significantly associated with more consistent contraceptive use. In a final model including all internal assets, life satisfaction was significantly related to consistent contraceptive use. Marijuana use and depression symptoms were significantly associated with less consistent use. With one exception, protective factors did not moderate relationships between risk indicators and consistent use. Based on our findings, we suggest that risk and protective factors may have largely independent influences on consistent contraceptive use among college women. A focus on risk and protective factors may improve contraceptive use rates and thereby reduce unintended pregnancy among college students. Copyright © 2016 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses. Published
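
    To make the analytic approach concrete, here is a hedged sketch of a logistic regression with a protective-factor × risk-indicator interaction (moderation) term. The variable names and synthetic data are hypothetical and are not the Add Health measures used in the study.

    # Hedged sketch of the kind of moderation model described above: logistic
    # regression of consistent contraceptive use on a risk indicator, a protective
    # factor, and their interaction. Variable names and data are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "heavy_drinking": rng.integers(0, 2, n),       # risk indicator (0/1)
        "life_satisfaction": rng.normal(0, 1, n),      # protective factor (z-score)
    })
    logit_p = -0.2 + 0.6 * df.life_satisfaction - 0.8 * df.heavy_drinking
    df["consistent_use"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    # "*" expands to both main effects plus their interaction (the moderation term)
    model = smf.logit("consistent_use ~ heavy_drinking * life_satisfaction", data=df).fit()
    print(model.summary())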

  2. The Consistent Preferences Approach to Deductive Reasoning in Games

    CERN Document Server

    Asheim, Geir B

    2006-01-01

    "The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif

  3. Project Avatar

    DEFF Research Database (Denmark)

    Juhlin, Jonas Alastair

    'Project Avatar' takes as its starting point the intelligence discipline known as Open Source Intelligence, which covers all the information that is freely available in open sources. With the spread of social media, entirely new types of information sources are opening up. The question is: how useful is …

  4. Project Baltia

    Index Scriptorium Estoniae

    2007-01-01

    The new architecture journal "Project Baltia" presents the architecture, urban planning and design of the Baltic states, Finland and the St. Petersburg region. It is published four times a year in English and Russian. Publisher: the Balticum publishing house in St. Petersburg, in cooperation with the Amsterdam and Moscow publishing house A-Fond. Editor-in-chief: Vladimir Frolov.

  5. Tedese Project

    Science.gov (United States)

    Buforn, E.; Davila, J. Martin; Bock, G.; Pazos, A.; Udias, A.; Hanka, W.

    The TEDESE (Terremotos y Deformacion Cortical en el Sur de España) project is a joint project of the Universidad Complutense de Madrid (UCM) and the Real Instituto y Observatorio de la Armada de San Fernando, Cadiz (ROA), supported by the Spanish Ministerio de Ciencia y Tecnologia with the participation of the GeoForschungsZentrum, Potsdam (GFZ). The aim is to carry out a study of the characteristics of the occurrence and mechanism of earthquakes, together with measurements of crustal structure and deformation, in order to obtain an integrated evaluation of seismic risk in southern Spain. As part of this project, a temporary network of 10 broad-band seismological stations, which will complement those already existing in the zone, has been installed in southern Spain and northern Africa for one year beginning in October 2001. The objectives of the project are the detailed study of the focal mechanisms of earthquakes in this area, of the structure of the crust and upper mantle, and of seismic anisotropy in the crust and mantle as an indicator of tectonic deformation processes, together with measurements of crustal deformation using permanent GPS and SLR stations and temporary GPS surveys. From these studies, seismotectonic models and maps will be elaborated and the seismic risk in the zone will be evaluated.

  6. Project Boomerang

    Science.gov (United States)

    King, Allen L.

    1975-01-01

    Describes an experimental project on boomerangs designed for an undergraduate course in classical mechanics. The students designed and made their own boomerangs, devised their own procedures, and carried out suitable measurements. Presents some of their data and a simple analysis for the two-bladed boomerang. (Author/MLH)

  7. Project Narrative

    Energy Technology Data Exchange (ETDEWEB)

    Driscoll, Mary C. [St. Bonaventure University, St Bonaventure, NY(United States)

    2012-07-12

    The Project Narrative describes how the funds from the DOE grant were used to purchase equipment for the biology, chemistry, physics and mathematics departments. The Narrative also describes how the equipment is being used. There is also a list of the positive outcomes as a result of having the equipment that was purchased with the DOE grant.

  8. Radiochemistry Project

    International Nuclear Information System (INIS)

    Anon.

    Research carried out in the 'Radiochemistry Project' of the Agricultural Nuclear Energy Center, Piracicaba, Sao Paulo State, Brazil, is described. Such research comprises: dosimetry and radiological protection; development of techniques and methods of chemical analysis and radiochemistry. (M.A.) [pt

  9. FLOAT Project

    DEFF Research Database (Denmark)

    Sørensen, Eigil V.; Aarup, Bendt

    The objective of the FLOAT project is to study the reliability of high-performance fibre-reinforced concrete, also known as Compact Reinforced Composite (CRC), for the floats of wave energy converters. In order to reach a commercial breakthrough, wave energy converters need to achieve a lower price...

  10. Hydrology Project

    International Nuclear Information System (INIS)

    Anon.

    Research carried out in the 'Hydrology Project' of the 'Centro de Energia Nuclear na Agricultura', Piracicaba, Sao Paulo State, Brazil, is described. Such research comprises: Amazon hydrology and Northeast hydrology. Techniques for the measurement of isotope ratios are used. (M.A.) [pt

  11. CHEMVAL project

    International Nuclear Information System (INIS)

    Chandratillake, M.; Falck, W.E.; Read, D.

    1992-01-01

    This report summarises the development history of the CHEMVAL Thermodynamic Database, the criteria employed for data selection and the contents of Version 4.0, issued to participants on the completion of the project. It accompanies a listing of the database constructed using the dBase III+/IV database management package. (Author)

  12. Project COLD.

    Science.gov (United States)

    Kazanjian, Wendy C.

    1982-01-01

    Describes Project COLD (Climate, Ocean, Land, Discovery), a scientific study of the Polar Regions, a collection of 35 modules used within the framework of existing subjects: oceanography, biology, geology, meteorology, geography, social science. Includes a partial list of topics and one activity (geodesic dome) from a module. (Author/SK)

  13. Swedish projects

    International Nuclear Information System (INIS)

    Thunell, J.

    1993-01-01

    The main sources of the financing of Swedish research on gas technology are listed in addition to names of organizations which carry out this research. The titles and descriptions of the projects carried out are presented in addition to lists of reports published with information on prices. (AB)

  14. Multiphase flows of N immiscible incompressible fluids: A reduction-consistent and thermodynamically-consistent formulation and associated algorithm

    Science.gov (United States)

    Dong, S.

    2018-05-01

    We present a reduction-consistent and thermodynamically consistent formulation and an associated numerical algorithm for simulating the dynamics of an isothermal mixture consisting of N (N ⩾ 2) immiscible incompressible fluids with different physical properties (densities, viscosities, and pair-wise surface tensions). By reduction consistency we refer to the property that if only a set of M (1 ⩽ M ⩽ N - 1) fluids are present in the system then the N-phase governing equations and boundary conditions will exactly reduce to those for the corresponding M-phase system. By thermodynamic consistency we refer to the property that the formulation honors the thermodynamic principles. Our N-phase formulation is developed based on a more general method that allows for the systematic construction of reduction-consistent formulations, and the method suggests the existence of many possible forms of reduction-consistent and thermodynamically consistent N-phase formulations. Extensive numerical experiments have been presented for flow problems involving multiple fluid components and large density ratios and large viscosity ratios, and the simulation results are compared with the physical theories or the available physical solutions. The comparisons demonstrate that our method produces physically accurate results for this class of problems.
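
    As a rough, schematic illustration of the reduction-consistency property described above (the notation here is ours, not the authors'), the requirement can be written as follows.

```latex
% Schematic statement of reduction consistency (our notation, not the paper's):
% G_N denotes the full N-phase system of governing equations and boundary
% conditions in the velocity u, pressure p and phase variables c_1,...,c_N.
\text{If } c_i \equiv 0 \ \text{for all } i \notin S, \quad S \subseteq \{1,\dots,N\},\ |S| = M,
\qquad\text{then}\qquad
\mathcal{G}_N\!\left(u, p, c_1, \dots, c_N\right) \;=\; \mathcal{G}_M\!\left(u, p, (c_i)_{i \in S}\right).
```

    In words: removing the fluids that are absent from the system must leave the equations governing the remaining M fluids unchanged.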

  15. SDN Project

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Rhett [Schweitzer Engineering Laboratories Inc, Pullman, WA (United States)

    2016-12-23

    The SDN Project completed on time and on budget and successfully accomplished 100% of the scope of work outlined in the original Statement of Project Objective (SOPO). The SDN Project formed an alliance between Ameren Corporation, University of Illinois Urbana-Champaign (UIUC), Pacific Northwest National Laboratories (PNNL), and Schweitzer Engineering Laboratories, Inc. (SEL). The objective of the SDN Project is to address Topic Area of Interest 2: Sustain critical energy delivery functions while responding to a cyber-intrusion, under Funding Opportunity Announcement DE-FOA-0000797. The goal of the project is to design and commercially release technology that provides a method to sustain critical energy delivery functions during a cyber intrusion; to do this, control system operators need the ability to quickly identify and isolate the affected network areas and re-route critical information and control flows around them. The objective of the SDN Project is to develop a Flow Controller that monitors, configures, and maintains the safe, reliable network traffic flows of all the local area networks (LANs) on a control system in the Energy sector. The SDN team identified the core attributes of a control system and produced an SDN flow controller with the same core attributes, enabling networks to be designed, configured and deployed that maximize whitelisted, deny-by-default and purpose-built networks. This project researched, developed and commercially released technology that: Enables all field networks to be configured and monitored as if they are a single asset to be protected; Enables greatly improved and even precalculated response actions to reliability and cyber events; Supports pre-configured localized response actions tailored to provide resilience against failures and centralized response to cyber-attacks that improve network reliability and availability; Architecturally enables the right subject matter experts, who are usually the information

  16. On the consistent histories approach to quantum mechanics

    International Nuclear Information System (INIS)

    Dowker, F.; Kent, A.

    1996-01-01

    We review the consistent histories formulations of quantum mechanics developed by Griffiths, Omnes, Gell-Mann, and Hartle, and we describe the classifications of consistent sets. We illustrate some general features of consistent sets by a few lemmas and examples. We also consider various interpretations of the formalism, and we examine the new problems which arise in reconstructing the past and predicting the future. It is shown that Omnes' characterization of true statements (statements that can be deduced unconditionally in his interpretation) is incorrect. We examine critically Gell-Mann and Hartle's interpretation of the formalism, and in particular their discussions of communication, prediction, and retrodiction, and we conclude that their explanation of the apparent persistence of quasiclassicality relies on assumptions about an as-yet-unknown theory of experience. Our overall conclusion is that the consistent histories approach illustrates the need to supplement quantum mechanics by some selection principle in order to produce a fundamental theory capable of unconditional predictions.

  17. Consistency of Trend Break Point Estimator with Underspecified Break Number

    Directory of Open Access Journals (Sweden)

    Jingjing Yang

    2017-01-01

    This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
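
    A minimal numerical sketch of the phenomenon discussed above (our own toy simulation, not the paper's code): a series with two positive trend shifts is fitted with a deliberately underspecified single-break model, and the resulting least-squares break estimate need not coincide with either true break date.

```python
# Hypothetical sketch (not the paper's code): simulate a series with TWO trend
# breaks but estimate only ONE break point by least squares, illustrating how
# an underspecified break number can distort the estimated break location.
import numpy as np

rng = np.random.default_rng(0)
T = 300
t = np.arange(T)
true_breaks = (100, 200)                          # two positive trend shifts
y = 0.05 * t
y += 0.05 * np.maximum(t - true_breaks[0], 0)     # slope change at t = 100
y += 0.05 * np.maximum(t - true_breaks[1], 0)     # slope change at t = 200
y += rng.normal(scale=1.0, size=T)

def ssr_single_break(y, t, b):
    """Sum of squared residuals of a one-break linear trend model."""
    X = np.column_stack([np.ones_like(t), t, np.maximum(t - b, 0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

candidates = range(20, T - 20)                    # trim the sample ends
b_hat = min(candidates, key=lambda b: ssr_single_break(y, t, b))
print("estimated single break:", b_hat, "true breaks:", true_breaks)
```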

  18. Liking for Evaluators: Consistency and Self-Esteem Theories

    Science.gov (United States)

    Regan, Judith Weiner

    1976-01-01

    Consistency and self-esteem theories make contrasting predictions about the relationship between a person's self-evaluation and his liking for an evaluator. Laboratory experiments confirmed predictions about these theories. (Editor/RK)

  19. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering

    KAUST Repository

    Sicat, Ronell Barrera; Kruger, Jens; Moller, Torsten; Hadwiger, Markus

    2014-01-01

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined
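
    As a loose illustration of the underlying idea (a sketch of our own, not the paper's algorithm), each coarse voxel of a downsampled volume can store a normalized histogram of the fine-resolution values in its neighborhood instead of a single filtered value.

```python
# Minimal illustration (our own sketch, not the paper's method): store a pdf
# (here a normalized histogram) of the fine-resolution voxel values in each
# coarse voxel's neighborhood when downsampling a volume.
import numpy as np

def neighborhood_pdfs(volume, block=2, bins=16):
    """Return an array of shape (nz, ny, nx, bins) of per-block value pdfs."""
    vmin, vmax = float(volume.min()), float(volume.max())
    nz, ny, nx = (s // block for s in volume.shape)
    pdfs = np.zeros((nz, ny, nx, bins))
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                cube = volume[k*block:(k+1)*block,
                              j*block:(j+1)*block,
                              i*block:(i+1)*block]
                hist, _ = np.histogram(cube, bins=bins, range=(vmin, vmax))
                pdfs[k, j, i] = hist / hist.sum()
    return pdfs

vol = np.random.default_rng(1).random((8, 8, 8))
print(neighborhood_pdfs(vol).shape)   # (4, 4, 4, 16)
```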

  20. Structures, profile consistency, and transport scaling in electrostatic convection

    DEFF Research Database (Denmark)

    Bian, N.H.; Garcia, O.E.

    2005-01-01

    Two mechanisms at the origin of profile consistency in models of electrostatic turbulence in magnetized plasmas are considered. One involves turbulent diffusion in collisionless plasmas and the subsequent turbulent equipartition of Lagrangian invariants. By the very nature of its definition...

  1. 15 CFR 930.36 - Consistency determinations for proposed activities.

    Science.gov (United States)

    2010-01-01

    ... necessity of issuing separate consistency determinations for each incremental action controlled by the major... plans), and that affect any coastal use or resource of more than one State. Many States share common...

  2. Decentralized Consistent Network Updates in SDN with ez-Segway

    KAUST Repository

    Nguyen, Thanh Dang; Chiesa, Marco; Canini, Marco

    2017-01-01

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes

  3. The utility of theory of planned behavior in predicting consistent ...

    African Journals Online (AJOL)

    admin

    disease. Objective: To examine the utility of theory of planned behavior in predicting consistent condom use intention of HIV .... (24-25), making subjective norms as better predictors of intention ..... Organizational Behavior and Human Decision.

  4. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    Canavarros, Otacilio Borges; Silva, Ennio Peres da

    1999-01-01

    The article introduces a methodology for consistency analysis of regional energy consumption data. The work is based on recent studies by several cited authors and addresses the Brazilian energy matrices and regional energy balances. The results are compared and analyzed

  5. Island of Stability for Consistent Deformations of Einstein's Gravity

    DEFF Research Database (Denmark)

    Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan

    2012-01-01

    We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...

  6. Projective geometry and projective metrics

    CERN Document Server

    Busemann, Herbert

    2005-01-01

    The basic results and methods of projective and non-Euclidean geometry are indispensable for the geometer, and this book--different in content, methods, and point of view from traditional texts--attempts to emphasize that fact. Results of special theorems are discussed in detail only when they are needed to develop a feeling for the subject or when they illustrate a general method. On the other hand, an unusual amount of space is devoted to the discussion of the fundamental concepts of distance, motion, area, and perpendicularity.Topics include the projective plane, polarities and conic sectio

  7. Self-consistent normal ordering of gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1987-01-01

    Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs

  8. Consistency of the least weighted squares under heteroscedasticity

    Czech Academy of Sciences Publication Activity Database

    Víšek, Jan Ámos

    2011-01-01

    Roč. 2011, č. 47 (2011), s. 179-206 ISSN 0023-5954 Grant - others:GA UK(CZ) GA402/09/055 Institutional research plan: CEZ:AV0Z10750506 Keywords : Regression * Consistency * The least weighted squares * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-consistency of the least weighted squares under heteroscedasticity.pdf

  9. Cosmological consistency tests of gravity theory and cosmic acceleration

    Science.gov (United States)

    Ishak-Boushaki, Mustapha B.

    2017-01-01

    Testing general relativity at cosmological scales and probing the cause of cosmic acceleration are among the important objectives targeted by incoming and future astronomical surveys and experiments. I present our recent results on consistency tests that can provide insights about the underlying gravity theory and cosmic acceleration using cosmological data sets. We use statistical measures, the rate of cosmic expansion, the growth rate of large scale structure, and the physical consistency of these probes with one another.

  10. Self-consistency corrections in effective-interaction calculations

    International Nuclear Information System (INIS)

    Starkand, Y.; Kirson, M.W.

    1975-01-01

    Large-matrix extended-shell-model calculations are used to compute self-consistency corrections to the effective interaction and to the linked-cluster effective interaction. The corrections are found to be numerically significant and to affect the rate of convergence of the corresponding perturbation series. The influence of various partial corrections is tested. It is concluded that self-consistency is an important effect in determining the effective interaction and improving the rate of convergence. (author)

  11. Parquet equations for numerical self-consistent-field theory

    International Nuclear Information System (INIS)

    Bickers, N.E.

    1991-01-01

    In recent years increases in computational power have provided new motivation for the study of self-consistent-field theories for interacting electrons. In this set of notes, the so-called parquet equations for electron systems are derived pedagogically. The principal advantages of the parquet approach are outlined, and its relationship to simpler self-consistent-field methods, including the Baym-Kadanoff technique, is discussed in detail. (author). 14 refs, 9 figs

  12. Consistent Estimation of Pricing Kernels from Noisy Price Data

    OpenAIRE

    Vladislav Kargin

    2003-01-01

    If pricing kernels are assumed non-negative then the inverse problem of finding the pricing kernel is well-posed. The constrained least squares method provides a consistent estimate of the pricing kernel. When the data are limited, a new method is suggested: relaxed maximization of the relative entropy. This estimator is also consistent. Keywords: $\\epsilon$-entropy, non-parametric estimation, pricing kernel, inverse problems.

  13. Measuring consistency of autobiographical memory recall in depression.

    LENUS (Irish Health Repository)

    Semkovska, Maria

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.

  14. Measuring consistency of autobiographical memory recall in depression.

    Science.gov (United States)

    Semkovska, Maria; Noone, Martha; Carton, Mary; McLoughlin, Declan M

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  15. Autonomous Navigation with Constrained Consistency for C-Ranger

    Directory of Open Access Journals (Sweden)

    Shujing Zhang

    2014-06-01

    Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their ability to carry out localization autonomously and build an environmental map concurrently, in other words, simultaneous localization and mapping (SLAM), is considered a pivotal requirement for AUVs to have truly autonomous navigation. However, the consistency problem of the SLAM system has been largely ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which is developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.

  16. Are prescription drug insurance choices consistent with expected utility theory?

    Science.gov (United States)

    Bundorf, M Kate; Mata, Rui; Schoenbaum, Michael; Bhattacharya, Jay

    2013-09-01

    To determine the extent to which people make choices inconsistent with expected utility theory when choosing among prescription drug insurance plans and whether tabular or graphical presentation format influences the consistency of their choices. Members of an Internet-enabled panel chose between two Medicare prescription drug plans. The "low variance" plan required higher out-of-pocket payments for the drugs respondents usually took but lower out-of-pocket payments for the drugs they might need if they developed a new health condition than the "high variance" plan. The probability of a change in health varied within subjects and the presentation format (text vs. graphical) and the affective salience of the clinical condition (abstract vs. risk related to specific clinical condition) varied between subjects. Respondents were classified based on whether they consistently chose either the low or high variance plan. Logistic regression models were estimated to examine the relationship between decision outcomes and task characteristics. The majority of respondents consistently chose either the low or high variance plan, consistent with expected utility theory. Half of respondents consistently chose the low variance plan. Respondents were less likely to make discrepant choices when information was presented in graphical format. Many people, although not all, make choices consistent with expected utility theory when they have information on differences among plans in the variance of out-of-pocket spending. Medicare beneficiaries would benefit from information on the extent to which prescription drug plans provide risk protection. PsycINFO Database Record (c) 2013 APA, all rights reserved.
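
    For readers unfamiliar with the criterion being tested, the following toy calculation (entirely made-up numbers, not the study's data) shows what an expected-utility comparison of a "low variance" and a "high variance" plan looks like.

```python
# Hypothetical numbers (not from the study): compare two drug plans under
# expected utility with a CRRA utility over residual wealth.
wealth = 20_000.0
p_new_condition = 0.2
# out-of-pocket costs: (usual drugs only, usual drugs + new condition)
plans = {"low variance": (1200.0, 2000.0), "high variance": (600.0, 4000.0)}

def crra(w, gamma=3.0):
    """Constant relative risk aversion utility (higher is better)."""
    return w ** (1 - gamma) / (1 - gamma)

for name, (oop_usual, oop_new) in plans.items():
    eu = (1 - p_new_condition) * crra(wealth - oop_usual) \
         + p_new_condition * crra(wealth - oop_new)
    print(f"{name:>13}: expected utility = {eu:.3e}")
```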

  17. Forestry and biomass energy projects

    DEFF Research Database (Denmark)

    Swisher, J.N.

    1994-01-01

    This paper presents a comprehensive and consistent methodology to account for the costs and net carbon flows of different categories of forestry and biomass energy projects and describes the application of the methodology to several sets of projects in Latin America. The results suggest that both biomass energy development and forestry measures including reforestation and forest protection can contribute significantly to the reduction of global CO2 emissions, and that local land-use capacity must determine the type of project that is appropriate in specific cases. No single approach alone is sufficient as either a national or global strategy for sustainable land use or carbon emission reduction. The methodology allows consistent comparisons of the costs and quantities of carbon stored in different types of projects and/or national programs, facilitating the inclusion of forestry and biomass...

  18. A thermodynamically consistent phenomenological model for ferroelectric and ferroelastic hysteresis

    Czech Academy of Sciences Publication Activity Database

    Kaltenbacher, B.; Krejčí, Pavel

    2016-01-01

    Roč. 96, č. 7 (2016), s. 874-891 ISSN 0044-2267 R&D Projects: GA ČR(CZ) GA15-12227S Institutional support: RVO:67985840 Keywords : piezoelectricity * hysteresis * ferroelasticity Subject RIV: BA - General Mathematics Impact factor: 1.332, year: 2016 http://onlinelibrary.wiley.com/doi/10.1002/zamm.201400292/abstract

  19. Moderation and Consistency of Teacher Judgement: Teachers' Views

    Science.gov (United States)

    Connolly, Stephen; Klenowski, Valentina; Wyatt-Smith, Claire Maree

    2012-01-01

    Major curriculum and assessment reforms in Australia have generated research interest in issues related to standards, teacher judgement and moderation. This article is based on one related inquiry of a large-scale Australian Research Council Linkage project conducted in Queensland. This qualitative study analysed interview data to identify…

  20. A thermodynamically consistent phenomenological model for ferroelectric and ferroelastic hysteresis

    Czech Academy of Sciences Publication Activity Database

    Kaltenbacher, B.; Krejčí, Pavel

    2016-01-01

    Roč. 96, č. 7 (2016), s. 874-891 ISSN 0044-2267 R&D Projects: GA ČR(CZ) GA15-12227S Institutional support: RVO:67985840 Keywords : piezoelectricity * hysteresis * ferroelasticity Subject RIV: BA - General Mathematics Impact factor: 1.332, year: 2016 http://onlinelibrary.wiley.com/doi/10.1002/zamm.201400292/abstract

  1. A thermodynamically consistent model of shape-memory alloys

    Czech Academy of Sciences Publication Activity Database

    Benešová, Barbora

    2011-01-01

    Roč. 11, č. 1 (2011), s. 355-356 ISSN 1617-7061 R&D Projects: GA ČR GAP201/10/0357 Institutional research plan: CEZ:AV0Z20760514 Keywords : shape memory alloys * model based on relaxation * thermomechanic coupling Subject RIV: BA - General Mathematics http://onlinelibrary.wiley.com/doi/10.1002/pamm.201110169/abstract

  2. Project Radon

    International Nuclear Information System (INIS)

    Ekholm, S.

    1988-01-01

    The project started in March 1987. The objective is to perform radon monitoring in 2000 dwellings occupied by people employed by the State Power Board and to continue to contribute to the development of radon filters. The project participates in developing methods for radon measurement and decontamination and in adapting the methods to large-scale application. About 400 so-called radon trace measurements (coarse measurements) and about 10 action measurements (decontamination measurements) have been made so far. Experience shows that the methods are fully applicable and that the recommended decontamination measures give perfectly satisfactory results. It is also established that most of the houses with high radon levels have poor ventilation. Many of them suffer from moisture and mould problems. The work planned for 1988 and 1989 will, in addition to measurements, be directed towards improvement of the measuring methods. An activity catalogue will be prepared in cooperation with ventilation enterprises. (O.S.)

  3. PARTNER Project

    CERN Multimedia

    Ballantine, A; Dixon-Altaber, H; Dosanjh, M; Kuchina, L

    2011-01-01

    Hadrontherapy uses particle beams to treat tumours located near critical organs and tumours that respond poorly to conventional radiation therapy. It has become evident that there is an emerging need to reinforce research in hadrontherapy and that it is essential to train professionals in this rapidly developing field. PARTNER is a 4-year Marie Curie Training project funded by the European Commission with 5.6 million euros, aimed at creating the next generation of experts. Ten academic institutes and research centres and two leading companies are participating in PARTNER, which is coordinated by CERN, forming a unique multidisciplinary and multinational European network. The project offers research and training opportunities to 25 young biologists, engineers, physicians and physicists, allowing them to actively develop modern techniques for treating cancer in close collaboration with leading European institutions. For this purpose PARTNER relies on cutting-edge research and technology development, ef...

  4. Swedish projects

    International Nuclear Information System (INIS)

    Thunell, J.

    1992-01-01

    A description is given of research activities concerning heating systems carried out in Sweden during 1991. The main subject areas dealt with by the gas technology group within the area of heating systems were catalytic combustion, polyethylene materials, and gas applications within the paper and pulp industries. A list is given of the titles of project reports published during 1991 and of those begun during that year. Under the Swedish Centre for Gas Technology (SGC), the main areas of research regarding gas applications were polyethylene materials, industrial applications and the reduction of pollutant emissions. A detailed list is given of research projects which were in progress or proposed by March 1992 under the heating system gas technology research group in Sweden. This list also presents the aims and descriptions of the methods, etc. (AB)

  5. AVE project

    International Nuclear Information System (INIS)

    2004-01-01

    During 1998, ANAV began to optimize Human Resources to cope with the ERE and ANA-ANV integration. Project AVE was intended to achieve an orderly transfer of know-how, skills, attitudes and experiences. The most complex part was the renewal of personnel with Operating Licenses. Nearly 140 people had joined the organization by late December 2003. This opportunity was seized to draw up a new Training Manual, and a common Initial Training Plan was designed for the two plants, accounting for the singularities of each one. The plan is divided into 5 modules: Common Training, Specific Training, PEI/CAT, Management, and On-the-Job Training. The training environments were defined according to the nature of the capabilities to be acquired. Project AVE resulted in the merger of the Asco and Vandellos II Training services. (Author)

  6. Spent Nuclear Fuel project, project management plan

    International Nuclear Information System (INIS)

    Fuquay, B.J.

    1995-01-01

    The Hanford Spent Nuclear Fuel Project has been established to safely store spent nuclear fuel at the Hanford Site. This Project Management Plan sets forth the management basis for the Spent Nuclear Fuel Project. The plan applies to all fabrication and construction projects, operation of the Spent Nuclear Fuel Project facilities, and necessary engineering and management functions within the scope of the project

  7. ARTIST Project

    CSIR Research Space (South Africa)

    Ferguson, K

    2012-10-01

    Presentation from the CSIR Biennial Conference (Keith Ferguson, 9 October 2012) on the ARTIST mobile IPTV broadcasting platform, developed by a consortium of CSIR, UCT and ECA and funded by TIA (2008-2011). The slides contrast minimum-time versus maximum-quality delivery trade-offs in the application context and outline the platform architecture: advertiser and client applications connected over the Internet to transcoder servers, media switching servers, and advert and transaction databases serving sport, news and wildlife channels, with channel viewing and advert upload workflows.

  8. CARA project

    International Nuclear Information System (INIS)

    Bergallo, Juan E.; Brasnarof, Daniel O.

    2000-01-01

    The CARA (Advanced Fuels for Argentine Reactors) Project successfully completed its first stage, phase one, last year. The performance of this fuel has been partially examined using CNEA and CONUAR facilities and personnel. Based on the results obtained in this stage, determined by the corresponding tests and verification of the fuel behavior, the second stage started immediately afterwards. The work performed and the results obtained during the development of the second stage are described in general terms in this paper. (author)

  9. Polytope projects

    CERN Document Server

    Iordache, Octavian

    2013-01-01

    How do you know what works and what doesn't? This book contains case studies highlighting the power of polytope projects for complex problem solving. Any sort of combinatorial problem characterized by a large variety of possibly complex constructions and deconstructions based on simple building blocks can be studied in a similar way. Although the majority of case studies are related to chemistry, the method is general and equally applicable to other fields of engineering or science.

  10. Projection Methods

    DEFF Research Database (Denmark)

    Wagner, Falko Jens; Poulsen, Mikael Zebbelin

    1999-01-01

    When a high-index DAE problem is solved with more traditional methods, instability often appears in some of the variables and finally leads to breakdown of convergence and of the integration of the solution. This is nicely shown in [ESF98, p. 152 ff.]. This chapter introduces projection methods as a way of handling these special problems. It is assumed that we have methods for solving normal ODE systems and index-1 systems....
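
    The following toy example (ours, not the chapter's code) sketches the projection idea for the planar pendulum written as a DAE: after each explicit Euler step on the index-reduced equations, the numerical solution is projected back onto the position and velocity constraint manifolds.

```python
# A minimal sketch of the projection idea (our own toy example, not the
# chapter's code): integrate the index-reduced pendulum ODE with explicit
# Euler and project position and velocity back onto the constraints
# |q| = L and q.v = 0 after every step.
import numpy as np

L, g, dt, steps = 1.0, 9.81, 1e-3, 5000
q = np.array([L, 0.0])          # position of the pendulum bob (pivot at origin)
v = np.array([0.0, 0.0])        # velocity

for _ in range(steps):
    # Lagrange multiplier of the index-reduced formulation (unit mass)
    lam = (v @ v - g * q[1]) / (q @ q)
    a = np.array([0.0, -g]) - lam * q
    q, v = q + dt * v, v + dt * a
    # projection step: enforce the position and velocity constraints
    q = L * q / np.linalg.norm(q)
    v = v - (q @ v) / (q @ q) * q

print("constraint residuals:", np.linalg.norm(q) - L, q @ v)
```

    Without the two projection lines, the constraint residuals drift steadily; with them, the residuals stay at machine precision, which is the behaviour the chapter exploits.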

  11. Project Phaseolus

    International Nuclear Information System (INIS)

    Anon.

    Research carried out through the Phaseolus Project of the 'Centro de Energia Nuclear na Agricultura' (CENA), Piracicaba, Sao Paulo State, Brazil, is described. It comprises the following subjects: plant breeding; nitrogen fixation; tissue cultures; proteins; photosynthetic efficiency; soil-plant interactions; electron microscopy of the golden mosaic virus; pest control; production of 15N-enriched ammonium sulfate; and determination of elements in the bean plant. (M.A.) [pt

  12. Numatron project

    Energy Technology Data Exchange (ETDEWEB)

    Sugimoto, K [Osaka Univ., Toyonaka (Japan). Dept. of Physics; Hirao, Yasuo

    1977-04-01

    A project for high energy heavy ion acceleration is under consideration. High energy heavy ions can produce highly condensed states of nuclei. A new phase of the nucleus would be seen at incident energies higher than 140 MeV/nucleon. High energy heavy ions causing high density states and meson emission will produce various new nuclides. The process of formation of the atomic elements will be studied. Various fields of science can also be investigated with high energy heavy ions. Spectroscopic study of multi-valent ions will be made with high energy uranium. The study of materials for the fusion reactor is important: impurity heavy ions from the wall of the fusion reactor may cause energy losses in the reactor, so the characteristic features of heavy ions should be investigated. Highly ionized states of atoms are also produced by heavy ion injection into material. Several heavy ion acceleration projects are in progress around the world. The Numatron project in Japan is to construct a combination of a Cockcroft type machine, three linear accelerators and a synchrotron. The planned energy of the machine is 670 MeV/nucleon. Technical problems are under investigation.

  13. Business Intelligence Support For Project Management

    OpenAIRE

    Muntean, Mihaela; Cabau, Liviu Gabiel

    2013-01-01

    With respect to the project management framework, a project life cycle consists of phases such as initiation, planning, execution, monitoring & control and closing. Monitoring implies measuring the progress and performance of the project during its execution and communicating the status. Actual performance is compared with the planned one. Therefore, a minimal set of key performance indicators will be proposed. Monitoring the schedule progress, the project budget and the scope will be possible....

  14. The accuracy and consistency of nutrition care process terminology use in cases of refeeding syndrome.

    Science.gov (United States)

    Matthews, Kylie L; Palmer, Michelle A; Capra, Sandra M

    2017-11-08

    Using standardised terminology in acute care has encouraged consistency in patient care and the evaluation of outcomes. As such, the Nutrition Care Process (NCP) and Nutrition Care Process Terminology (NCPT) may assist dietitian nutritionists in the delivery of high quality nutrition care worldwide; however, limited research has been conducted examining the consistency and accuracy of its use. We aimed to examine the NCPT that dietitian nutritionists would use to formulate a diagnostic statement relating to refeeding syndrome (RFS). A multimethod action research approach was used, incorporating two projects. The first was a survey examining Australian dietitian nutritionists' (n = 195) opinions regarding NCPT use in cases of RFS. To establish if results were similar internationally, an interview was then conducted with 22 dietitian nutritionists working within 10 different countries. 'Imbalance of nutrients' was only identified as a correct code by 17% of respondents in project 1. No mention of this term was made in project 2. Also 86% of respondents incorrectly selected more than one diagnostic code. The majority of respondents (80%, n = 52/65) who incorrectly selected 'Malnutrition', without also selecting 'Imbalance of nutrients', selected 'reduce intake' as an intervention, suggesting some misunderstanding in the requirement for interrelated diagnoses, interventions and goals. Our findings demonstrate that there is limited accuracy and consistency in selecting nutritional diagnostic codes in relation to RFS. Respondents also demonstrated limited knowledge regarding appropriate application of the NCP and NCPT. Implementation practices may require further refinement, as accurate and consistent use is required to procure the benefits of standardised terminology. © 2017 Dietitians Association of Australia.

  15. Martial arts striking hand peak acceleration, accuracy and consistency.

    Science.gov (United States)

    Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A

    2013-01-01

    The goal of this paper was to investigate the possible trade-off between peak hand acceleration and accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes' mean squared distance from their centroid. We found that training experience was significantly correlated to hand peak acceleration prior to impact (r² = 0.456, p = 0.032) and accuracy (r² = 0.621, p = 0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated to consistency (r² = 0.085, p = 0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
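
    The two outcome measures described above are simple to compute; the sketch below (with made-up strike coordinates) follows the definitions given in the abstract.

```python
# Sketch of the two metrics as defined in the abstract (hypothetical data):
# accuracy = radial distance between the centroid of the strikes and the
# target; consistency = root mean squared distance of the strikes from their
# own centroid.
import numpy as np

target = np.array([0.0, 0.0])
strikes = np.array([[1.2, -0.5], [0.8, 0.3], [1.0, -0.1],   # (x, y) impact
                    [1.5, 0.2], [0.9, -0.4], [1.1, 0.0]])   # locations in cm

centroid = strikes.mean(axis=0)
accuracy = np.linalg.norm(centroid - target)                 # smaller = better
consistency = np.sqrt(np.mean(np.sum((strikes - centroid) ** 2, axis=1)))

print(f"accuracy: {accuracy:.2f} cm, consistency: {consistency:.2f} cm")
```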

  16. Self-consistent electrodynamic scattering in the symmetric Bragg case

    International Nuclear Information System (INIS)

    Campos, H.S.

    1988-01-01

    We have analyzed the symmetric Bragg case, introducing a model of self-consistent scattering for two elliptically polarized beams. The crystal is taken as a set of mathematical planes, each of them defined by a surface density of dipoles. We have considered the mesofield and the epifield differently from Ewald's theory, and we assumed a plane of dipoles and the associated fields as a self-consistent scattering unit. The exact analytical treatment, when applied to any two neighbouring planes, results in a general and self-consistent Bragg equation in terms of the amplitude and phase variations. The generalized solution for the set of N planes was obtained after introducing an absorption factor in the incident radiation, in two ways: (i) analytically, through a rule of field similarity, which states that incidence occurs on both faces of all the crystal planes, and also through a matrix development with the Chebyshev polynomials; (ii) numerically, by calculating iteratively the reflectivity, the reflection phase, the transmissivity, the transmission phase and the energy. The results are shown through reflection and transmission curves, which display characteristics of both the kinematical and dynamical theories. The conservation of energy resulting from Ewald's self-consistency principle is used. In the absorption case, the results show that absorption is not the only cause of the asymmetric form of the reflection curves. The model contains basic elements for a unified, microscopic, self-consistent, vectorial and exact formulation for interpreting X-ray diffraction in perfect crystals. (author)

  17. Cognitive consistency and math-gender stereotypes in Singaporean children.

    Science.gov (United States)

    Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu

    2014-01-01

    In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. A consistent response spectrum analysis including the resonance range

    International Nuclear Information System (INIS)

    Schmitz, D.; Simmchen, A.

    1983-01-01

    The report provides a complete, consistent Response Spectrum Analysis for any component. The effect of supports with different excitation is taken into consideration, as is the description of the resonance ranges. It includes information explaining how the contributions of the eigenforms with higher eigenfrequencies are to be considered. Stocking of floor response spectra is also possible using the method described here. However, modified floor response spectra must now be calculated for each building mode. Once these have been prepared, the calculation of the dynamic component values is practically no more complicated than with the conventional, non-consistent methods. The consistent Response Spectrum Analysis can supply smaller and larger values than the conventional theory, a fact which can be demonstrated using simple examples. The report contains a consistent Response Spectrum Analysis (RSA), which, as far as we know, has been formulated in this way for the first time. A consistent RSA is important because this method is today preferentially applied as a tool for the seismic verification of components in nuclear power plants. (orig./HP)

  19. GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY

    International Nuclear Information System (INIS)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.

    2013-01-01

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
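
    A greatly simplified sketch of the consistency idea (ours, not the publicly released consistent-trees code) is to predict each halo's position at the next snapshot from its current position and velocity and to flag links whose actual position deviates strongly from the prediction.

```python
# Highly simplified illustration (not the public consistent-trees code):
# predict halo positions at the next snapshot from current positions and
# velocities, then flag links with anomalously large residuals as suspect.
import numpy as np

rng = np.random.default_rng(2)
n, dt = 1000, 0.1                        # number of halos, time step (arbitrary units)
pos0 = rng.uniform(0, 100, (n, 3))
vel0 = rng.normal(0, 1, (n, 3))
pos1 = pos0 + vel0 * dt + rng.normal(0, 0.02, (n, 3))   # next snapshot
pos1[:10] += 5.0                         # inject a few spurious/mismatched links

predicted = pos0 + vel0 * dt
residual = np.linalg.norm(pos1 - predicted, axis=1)
threshold = 10 * np.median(residual)
print("inconsistent links:", np.sum(residual > threshold))
```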

  20. Project as a System and its Management

    Directory of Open Access Journals (Sweden)

    Jiří Skalický

    2017-06-01

    The contribution aims to describe the project as a system and to define the project control goal and strategy, the control variables and their relationships. The three common control variables represented by the project triangle are extended by two other important variables: project risk and quality. The control system consists of two components: a social one (the project manager and project team) and a technical one (a dynamic project simulation model supporting the project manager's decision making at project milestones). In the project planning phase, the project baseline with the planned controlled variables is created. At milestones after project launch, the actual values of these variables are measured. If the actual values deviate from the planned ones, corrective actions are proposed and a new baseline for the following control interval is created. The project plan takes into account the actual project progress, and optimum corrective actions are determined by simulation, respecting the control strategy and the availability of resources. The contribution presents a list of references to articles dealing with the project as a system and its simulation. In most cases, they refer to project control using the Earned Value Management method and its derivatives. Using a dynamic simulation model for project monitoring and control, as suggested in this contribution, presents a novel approach. The proposed model can serve as a point of departure for the authors' future research and for the development of an appropriate and applicable tool.
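
    As a concrete illustration of the kind of control indicators referred to above, the standard Earned Value Management quantities can be computed from three milestone inputs (the numbers here are invented).

```python
# Earned Value Management (EVM) indicators on made-up numbers: planned value
# (PV), earned value (EV) and actual cost (AC) at a milestone yield the
# schedule and cost variances and the corresponding performance indices.
pv, ev, ac = 120_000.0, 100_000.0, 110_000.0   # currency units at the milestone

sv, cv = ev - pv, ev - ac            # schedule variance, cost variance
spi, cpi = ev / pv, ev / ac          # schedule and cost performance indices

print(f"SV={sv:+.0f}  CV={cv:+.0f}  SPI={spi:.2f}  CPI={cpi:.2f}")
# SPI < 1 and CPI < 1 would trigger corrective actions for the next control
# interval, in line with the control loop described above.
```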

  1. Coloss project

    International Nuclear Information System (INIS)

    2005-01-01

    The COLOSS project was a shared-cost action, co-ordinated by IRSN within the Euratom Research Framework Programme 1998-2002. Started in February 2000, the project lasted three years. The work programme performed by 19 partners was shaped around complementary activities aimed at improving severe accident codes. Unresolved risk-relevant issues regarding H2 production, melt generation and the source term were studied through a large number of experiments such as a) dissolution of fresh and high burn-up UO2 and MOX by molten Zircaloy, b) simultaneous dissolution of UO2 and ZrO2 by molten Zircaloy, c) oxidation of U-O-Zr mixtures by steam, and d) degradation-oxidation of B4C control rods. Significant results have been produced from separate-effects, semi-global and large-scale tests on COLOSS topics. Breakthroughs were achieved on some issues. Nevertheless, more data are needed for consolidation of the modelling of burn-up effects on UO2 and MOX dissolution and of the oxidation of U-O-Zr and B4C-metal mixtures. There was experimental evidence that the oxidation of these mixtures can contribute significantly to the large H2 production observed during the reflooding of degraded cores under severe accident conditions. Based on the experimental results obtained on the COLOSS topics, corresponding models were developed and successfully implemented in several severe accident codes. Upgraded codes were then used for plant calculations to evaluate the consequences of the new models on key severe accident sequences occurring in different plant designs involving B4C control rods (EPR, BWR, VVER-1000) as well as in the TMI-2 accident. The large series of plant calculations involved sensitivity studies and code benchmarks. The main severe accident codes in use in the EU for safety studies were used, such as ICARE/CATHARE, SCDAP/RELAP5, ASTEC, MELCOR and MAAP4. This activity enabled: a) the assessment of codes to calculate core degradation, b) the identification of main

  2. BAO Plate Archive Project

    Science.gov (United States)

    Mickaelian, A. M.; Gigoyan, K. S.; Gyulzadyan, M. V.; Paronyan, G. M.; Abrahamyan, H. V.; Andreasyan, H. R.; Azatyan, N. M.; Kostandyan, G. R.; Samsonyan, A. L.; Mikayelyan, G. A.; Farmanyan, S. V.; Harutyunyan, V. L.

    2017-12-01

    We present the Byurakan Astrophysical Observatory (BAO) Plate Archive Project, which is aimed at the digitization, extraction and analysis of archival data and at building an electronic database and interactive sky map. The BAO Plate Archive consists of 37,500 photographic plates and films obtained with the 2.6m telescope, the 1m and 0.5m Schmidt telescopes and other smaller ones during 1947-1991. The 2000 plates of the famous Markarian Survey (or the First Byurakan Survey, FBS) were digitized in 2002-2005 and the Digitized FBS (DFBS, www.aras.am/Dfbs/dfbs.html) was created. New science projects have been conducted based on this low-dispersion spectroscopic material. Several other smaller digitization projects have been carried out as well, such as part of the Second Byurakan Survey (SBS) plates, the photographic chain plates in Coma, where the blazar ON 231 is located, and 2.6m film spectra of FBS Blue Stellar Objects. However, most of the plates and films are not digitized. In 2015, we started a project to digitize the whole BAO Plate Archive, create an electronic database and put it to scientific use. The Armenian Virtual Observatory (ArVO, www.aras.am/Arvo/arvo.htm) database will accommodate all new data. The project runs in collaboration with the Armenian Institute of Informatics and Automation Problems (IIAP) and will continue for 4 years, 2015-2018. The final result will be an electronic database and an online interactive sky map to be used for further research projects. ArVO will provide all standards and tools for efficient usage of the scientific output and its integration into international databases.

  3. Consistent Partial Least Squares Path Modeling via Regularization.

    Science.gov (United States)

    Jung, Sunho; Park, JaeHong

    2018-01-01

    Partial least squares (PLS) path modeling is a component-based structural equation modeling that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc has yet no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
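
    The ridge idea itself is easy to illustrate on correlation-based path estimation (a sketch of our own, not the authors' regularized PLSc implementation): with nearly collinear predictors, adding a penalty λI to the correlation matrix stabilizes the solved path coefficients.

```python
# Illustration of the ridge idea applied to correlation-based path estimation
# (our own sketch, not the authors' regularized PLSc): path coefficients solved
# from the correlation matrix of nearly collinear predictors become unstable
# unless a ridge penalty lambda*I is added to the matrix before solving.
import numpy as np

# Correlation matrix of three (nearly collinear) independent latent variables
R_xx = np.array([[1.00, 0.95, 0.90],
                 [0.95, 1.00, 0.93],
                 [0.90, 0.93, 1.00]])
r_xy = np.array([0.60, 0.62, 0.58])   # correlations with the dependent variable

for lam in (0.0, 0.05, 0.2):
    beta = np.linalg.solve(R_xx + lam * np.eye(3), r_xy)
    print(f"lambda={lam:<4}  path coefficients: {np.round(beta, 3)}")
```

    Larger penalties shrink the coefficients towards more stable values, at the cost of some bias; choosing λ is the usual bias-variance trade-off.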

  4. Context-dependent individual behavioral consistency in Daphnia

    DEFF Research Database (Denmark)

    Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe

    2017-01-01

    The understanding of consistent individual differences in behavior, often termed "personality," for adapting and coping with threats and novel environmental conditions has advanced considerably during the last decade. However, advancements are almost exclusively associated with higher-order animals, whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming... that of adults. Overall, we show that aquatic invertebrates are far from being identical robots, but instead they show considerable individual differences in behavior that can be attributed to both ontogenetic development and individual consistency. Our study also demonstrates, for the first time...

  5. Consistent forcing scheme in the cascaded lattice Boltzmann method

    Science.gov (United States)

    Fei, Linlin; Luo, Kai Hong

    2017-11-01

    In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.

  6. Application of consistent fluid added mass matrix to core seismic

    International Nuclear Information System (INIS)

    Koo, K. H.; Lee, J. H.

    2003-01-01

    In this paper, an algorithm for applying a consistent fluid added mass matrix, including the coupling terms, to core seismic analysis is developed and installed in the SAC-CORE3.0 code. As an example, we assumed a 7-hexagon system of the LMR core and carried out the vibration modal analysis and the nonlinear time history seismic response analysis using SAC-CORE3.0. The consistent fluid added mass matrix used is obtained with the finite element program of the FAMD (Fluid Added Mass and Damping) code. The results of the vibration modal analysis show that the core duct assemblies reveal strongly coupled vibration modes, which are very different from those of the in-air condition. From the results of the time history seismic analysis, it was verified that the effects of the coupling terms of the consistent fluid added mass matrix are significant in the impact responses and the dynamic responses.

  7. Self-consistent approximations beyond the CPA: Part II

    International Nuclear Information System (INIS)

    Kaplan, T.; Gray, L.J.

    1982-01-01

    This paper concentrates on a self-consistent approximation for random alloys developed by Kaplan, Leath, Gray, and Diehl. The construction of the augmented space formalism for a binary alloy is sketched, and the notation to be used is derived. Using the operator methods of the augmented space, the self-consistent approximation is derived for the average Green's function and for evaluating the self-energy, taking into account the scattering by clusters of excitations. The particular cluster approximation desired is derived by treating the scattering by the excitations with S_T exactly. Fourier transforms on the disorder-space cluster-site labels solve the self-consistent set of equations. Expansion to short range order in the alloy is also discussed. A method to reduce the problem to a computationally tractable form is described.

  8. Linear augmented plane wave method for self-consistent calculations

    International Nuclear Information System (INIS)

    Takeda, T.; Kuebler, J.

    1979-01-01

    O.K. Andersen has recently introduced a linear augmented plane wave method (LAPW) for the calculation of electronic structure that was shown to be computationally fast. A more general formulation of an LAPW method is presented here. It makes use of a freely disposable number of eigenfunctions of the radial Schroedinger equation. These eigenfunctions can be selected in a self-consistent way. The present formulation also results in a computationally fast method. It is shown that Andersen's LAPW is obtained in a special limit from the present formulation. Self-consistent test calculations for copper show the present method to be remarkably accurate. As an application, scalar-relativistic self-consistent calculations are presented for the band structure of FCC lanthanum. (author)
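
    For reference, the textbook form of an LAPW basis function (standard notation, not taken from the paper itself) is a plane wave in the interstitial region matched inside each muffin-tin sphere to the radial solutions u_l and their energy derivatives:

```latex
% Textbook form of an LAPW basis function (standard notation, not taken from
% the cited paper). Omega is the unit-cell volume, u_l the radial solution at
% the linearization energy E_l and \dot{u}_l its energy derivative.
\phi_{\mathbf{k}+\mathbf{G}}(\mathbf{r}) =
\begin{cases}
\dfrac{1}{\sqrt{\Omega}}\, e^{i(\mathbf{k}+\mathbf{G})\cdot\mathbf{r}}, &
\mathbf{r}\ \text{in the interstitial region},\\[1ex]
\displaystyle\sum_{lm}\left[A_{lm}\, u_l(r, E_l) + B_{lm}\, \dot{u}_l(r, E_l)\right]
Y_{lm}(\hat{\mathbf{r}}), &
\mathbf{r}\ \text{inside a muffin-tin sphere},
\end{cases}
```

    where the coefficients A_lm and B_lm are fixed by matching the value and slope of the plane wave at the sphere boundary; the formulation described in the abstract generalizes this construction by allowing a freely disposable number of radial eigenfunctions.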

  9. Self-consistency and coherent effects in nonlinear resonances

    International Nuclear Information System (INIS)

    Hofmann, I.; Franchetti, G.; Qiang, J.; Ryne, R. D.

    2003-01-01

    The influence of space charge on emittance growth is studied in simulations of a coasting beam exposed to a strong octupolar perturbation in an otherwise linear lattice, and under stationary parameters. We explore the importance of self-consistency by comparing results with a non-self-consistent model, where the space charge electric field is kept 'frozen-in' to its initial values. For Gaussian distribution functions we find that the 'frozen-in' model results in a good approximation of the self-consistent model, hence coherent response is practically absent and the emittance growth is self-limiting due to space charge de-tuning. For KV or waterbag distributions, instead, strong coherent response is found, which we explain in terms of absence of Landau damping

  10. A consistent time frame for Chaucer's Canterbury Pilgrimage

    Science.gov (United States)

    Kummerer, K. R.

    2001-08-01

    A consistent time frame for the pilgrimage that Geoffrey Chaucer describes in The Canterbury Tales can be established if the seven celestial assertions related to the journey mentioned in the text can be reconciled with each other and the date of April 18 that is also mentioned. Past attempts to establish such a consistency for all seven celestial assertions have not been successful. The analysis herein, however, indicates that in The Canterbury Tales Chaucer accurately describes the celestial conditions he observed in the April sky above the London/Canterbury region of England in the latter half of the fourteenth century. All seven celestial assertions are in agreement with each other and consistent with the April 18 date. The actual words of Chaucer indicate that the Canterbury journey began during the 'seson' he defines in the General Prologue and ends under the light of the full Moon on the night of April 18, 1391.

  11. An approach to a self-consistent nuclear energy system

    International Nuclear Information System (INIS)

    Fujii-e, Yoichi; Arie, Kazuo; Endo, Hiroshi

    1992-01-01

    A nuclear energy system should provide a stable supply of energy without endangering the environment or humans. If there is fear about exhausting world energy resources, accumulating radionuclides, and nuclear reactor safety, tension is created in human society. Nuclear energy systems of the future should be able to eliminate fear from people's minds. In other words, the whole system, including the nuclear fuel cycle, should be self-consistent. This is the ultimate goal of nuclear energy. If it can be realized, public acceptance of nuclear energy will increase significantly. In a self-consistent nuclear energy system, misunderstandings between experts on nuclear energy and the public should be minimized. The way to achieve this goal is to explain using simple logic. This paper proposes specific targets for self-consistent nuclear energy systems and shows that the fast breeder reactor (FBR) lies on the route to attaining the final goal

  12. Consistent forcing scheme in the cascaded lattice Boltzmann method.

    Science.gov (United States)

    Fei, Linlin; Luo, Kai Hong

    2017-11-01

    In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme are demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.
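
    A schematic way to read the shift-matrix construction (a hedged sketch of the general structure only, with the forcing term left symbolic rather than the paper's exact expression) is

      \mathbf{f}^{\ast} \;=\; \mathbf{f} \;-\; \mathbf{M}^{-1}\mathbf{N}^{-1}\,\mathbf{S}\,\mathbf{N}\,\mathbf{M}\,\big(\mathbf{f}-\mathbf{f}^{\mathrm{eq}}\big) \;+\; \text{(forcing term)},
      \qquad \mathbf{N}=\mathbf{I} \;\Rightarrow\; \text{raw-moment MRT collision},

    where M maps the distributions to raw moments, the shift matrix N maps raw moments to central moments, and S is the diagonal relaxation matrix; setting N to the identity recovers the ordinary MRT update, which is the degeneracy noted in the abstract.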

  13. Consistency and Reconciliation Model In Regional Development Planning

    Directory of Open Access Journals (Sweden)

    Dina Suryawati

    2016-10-01

    The aim of this study was to identify the problems of regional development planning and to determine its conceptual model. Regional development planning is a systemic, complex and unstructured process; therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and with inter-regional planning documents, and the integration and consistency of regional planning documents are very important for achieving the development goals that have been set. At the same time, development planning in the region involves a technocratic system together with both top-down and bottom-up participation; these must be balanced, without overlapping or dominating each other. Keywords: regional, development, planning, consistency, reconciliation

  14. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed.

  15. Self-consistent modelling of resonant tunnelling structures

    DEFF Research Database (Denmark)

    Fiig, T.; Jauho, A.P.

    1992-01-01

    We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated ... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges, and our qualitative estimates seem consistent with recent experimental studies. The intrinsic bistability of resonant tunnelling diodes is analyzed within several different approximation schemes.

  16. Summary of results of underground engineering experience

    Energy Technology Data Exchange (ETDEWEB)

    Holzer, F [Lawrence Radiation Laboratory, University of California, Livermore, CA (United States)

    1969-07-01

    Results pertinent to the use of nuclear explosives in underground engineering applications have been accumulating for the past 10 years from the Plowshare and Weapons tests of the AEC. Thus, predictive and measurement techniques of shock effects and chimney formation were developed in the course of analyzing explosions in granite, salt, and dolomite. The ability to predict effects related specifically to safety has resulted from many measurements on detonations at the Nevada Test Site, where also many of the techniques for handling, emplacing, and firing the explosive have been developed. This gestation period culminated in the execution of Project Gasbuggy, jointly sponsored by industry and government, and the first nuclear explosion in a gas-bearing formation. The Gasbuggy explosive had a nominal yield of 25 kt and was detonated 4240 ft below the surface in the San Juan Basin in northwestern New Mexico on December 10, 1967. The shot point was 40 ft below the lower boundary of a 285-ft-thick gas-bearing sandstone formation of very low permeability. No radioactive venting occurred, and no damage to surrounding gas wells or structures resulted. Post-shot geophysical exploration and gas production tests have revealed that the nuclear explosion created a subsurface chimney approximately 160 ft in diameter and 335 ft high. Fractures appear to extend to about 400 ft symmetrically from the detonation point, with shifts or offsets along geological weaknesses extending out to perhaps 750 ft. Presently, radioactive constituents in the gas consist of tritium and krypton-85, with concentrations of approximately 10 μCi/ft³ and 1.5 μCi/ft³ respectively. These concentrations are decreasing as gas withdrawn from the chimney is replaced by formation gas. Tests to evaluate the increase in productivity and ultimate recovery are currently in progress. (author)

  17. Summary of results of underground engineering experience

    International Nuclear Information System (INIS)

    Holzer, F.

    1969-01-01

    Results pertinent to the use of nuclear explosives in underground engineering applications have been accumulating for the past 10 years from the Plowshare and Weapons tests of the AEC. Thus, predictive and measurement techniques of shock effects and chimney formation were developed in the course of analyzing explosions in granite, salt, and dolomite. The ability to predict effects related specifically to safety has resulted from many measurements on detonations at the Nevada Test Site, where also many of the techniques for handling, emplacing, and firing the explosive have been developed. This gestation period culminated in the execution of Project Gasbuggy, jointly sponsored by industry and government, and the first nuclear explosion in a gas-bearing formation. The Gasbuggy explosive had a nominal yield of 25 kt and was detonated 4240 ft below the surface in the San Juan Basin in northwestern New Mexico on December 10, 1967. The shot point was 40 ft below the lower boundary of a 285-ft-thick gas-bearing sandstone formation of very low permeability. No radioactive venting occurred, and no damage to surrounding gas wells or structures resulted. Post-shot geophysical exploration and gas production tests have revealed that the nuclear explosion created a subsurface chimney approximately 160 ft in diameter and 335 ft high. Fractures appear to extend to about 400 ft symmetrically from the detonation point, with shifts or offsets along geological weaknesses extending out to perhaps 750 ft. Presently, radioactive constituents in the gas consist of tritium and krypton-85, with concentrations of approximately 10 μCi/ft³ and 1.5 μCi/ft³ respectively. These concentrations are decreasing as gas withdrawn from the chimney is replaced by formation gas. Tests to evaluate the increase in productivity and ultimate recovery are currently in progress. (author)

  18. An Explicit Consistent Geometric Stiffness Matrix for the DKT Element

    Directory of Open Access Journals (Sweden)

    Eliseu Lucena Neto

    A large number of references dealing with the geometric stiffness matrix of the DKT finite element exist in the literature, where nearly all of them adopt an inconsistent form. While such a matrix may be part of the element to treat nonlinear problems in general, it is of crucial importance for linearized buckling analysis. The present work seems to be the first to obtain an explicit expression for this matrix in a consistent way. Numerical results on linear buckling of plates assess the element performance either with the proposed explicit consistent matrix, or with the most commonly used inconsistent matrix.

  19. The cluster bootstrap consistency in generalized estimating equations

    KAUST Repository

    Cheng, Guang

    2013-03-01

    The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
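
    The resampling idea can be sketched in a few lines of Python. The example below is only an illustration on synthetic data: it bootstraps whole clusters for a plain OLS slope rather than refitting an actual GEE, so the estimator is a stand-in for the GEE regression estimate discussed in the paper, and all names and numbers are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      def cluster_bootstrap_se(y, X, cluster_ids, n_boot=500):
          """Resample whole clusters with replacement, preserving the
          within-cluster dependence, and return the bootstrap SE of the slope."""
          clusters = np.unique(cluster_ids)
          estimates = []
          for _ in range(n_boot):
              picked = rng.choice(clusters, size=len(clusters), replace=True)
              rows = np.concatenate([np.flatnonzero(cluster_ids == c) for c in picked])
              beta, *_ = np.linalg.lstsq(X[rows], y[rows], rcond=None)
              estimates.append(beta[1])
          return np.std(estimates, ddof=1)

      # Synthetic clustered data (illustrative only)
      n_clusters, m = 30, 5
      cluster_ids = np.repeat(np.arange(n_clusters), m)
      u = rng.normal(size=n_clusters)[cluster_ids]        # shared cluster effect
      x = rng.normal(size=n_clusters * m)
      y = 1.0 + 0.5 * x + u + rng.normal(size=n_clusters * m)
      X = np.column_stack([np.ones_like(x), x])
      print("cluster-bootstrap SE of slope:", cluster_bootstrap_se(y, X, cluster_ids))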

  20. Consistency in the description of diffusion in compacted bentonite

    International Nuclear Information System (INIS)

    Lehikoinen, J.; Muurinen, A.

    2009-01-01

    A macro-level diffusion model, which aims to provide a unifying framework for explaining the experimentally observed co-ion exclusion and greatly controversial counter-ion surface diffusion in a consistent fashion, is presented. It is explained in detail why a term accounting for the non-zero mobility of the counter-ion surface excess is required in the mathematical form of the macroscopic diffusion flux. The prerequisites for the consistency of the model and the problems associated with the interpretation of diffusion in such complex pore geometries as in compacted smectite clays are discussed. (author)

  1. An energetically consistent vertical mixing parameterization in CCSM4

    DEFF Research Database (Denmark)

    Nielsen, Søren Borg; Jochum, Markus; Eden, Carsten

    2018-01-01

    An energetically consistent stratification-dependent vertical mixing parameterization is implemented in the Community Climate System Model 4 and forced with energy conversion from the barotropic tides to internal waves. The structures of the resulting dissipation and diffusivity fields are compared ..., however, depends greatly on the details of the vertical mixing parameterizations, where the new energetically consistent parameterization results in low thermocline diffusivities and a sharper and shallower thermocline. It is also investigated if the ocean state is more sensitive to a change in forcing ...

  2. The consistency service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2011-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  3. The Consistency Service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2010-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures, is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  4. Consistency among integral measurements of aggregate decay heat power

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)

    1998-03-01

    Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of ²³²Th, ²³³U, ²³⁵U, ²³⁸U and ²³⁹Pu at YAYOI. The consistency among the measured values is found to be satisfactory for the β component and fairly good for the γ component, except for cooling times longer than 4000 s. (author)

  5. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions.

  6. STP: A mathematically and physically consistent library of steam properties

    International Nuclear Information System (INIS)

    Aguilar, F.; Hutter, A.C.; Tuttle, P.G.

    1982-01-01

    A new FORTRAN library of subroutines has been developed from the fundamental equation of Keenan et al. to evaluate a large set of water properties including derivatives such as sound speed and isothermal compressibility. The STP library uses the true saturation envelope of the Keenan et al. fundamental equation. The evaluation of the true envelope by a continuation method is explained. This envelope, along with other design features, imparts an exceptionally high degree of thermodynamic and mathematical consistency to the STP library, even at the critical point. Accuracy and smoothness, library self-consistency, and designed user convenience make the STP library a reliable and versatile water property package

  7. Weyl consistency conditions in non-relativistic quantum field theory

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Sridip; Grinstein, Benjamín [Department of Physics, University of California,San Diego, 9500 Gilman Drive, La Jolla, CA 92093 (United States)

    2016-12-05

    Weyl consistency conditions have been used in unitary relativistic quantum field theory to impose constraints on the renormalization group flow of certain quantities. We classify the Weyl anomalies and their renormalization scheme ambiguities for generic non-relativistic theories in 2+1 dimensions with anisotropic scaling exponent z=2; the extension to other values of z is discussed as well. We give the consistency conditions among these anomalies. As an application we find several candidates for a C-theorem. We comment on possible candidates for a C-theorem in higher dimensions.

  8. A Van Atta reflector consisting of half-wave dipoles

    DEFF Research Database (Denmark)

    Appel-Hansen, Jørgen

    1966-01-01

    The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern first is deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence ...

  9. A self-consistent theory of the magnetic polaron

    International Nuclear Information System (INIS)

    Marvakov, D.I.; Kuzemsky, A.L.; Vlahov, J.P.

    1984-10-01

    A finite temperature self-consistent theory of the magnetic polaron in the s-f model of ferromagnetic semiconductors is developed. The calculations are based on the novel approach of the thermodynamic two-time Green function methods. This approach consists in the introduction of the "irreducible" Green functions (IGF) and derivation of the exact Dyson equation and exact self-energy operator. It is shown that the IGF method gives a unified and natural approach for a calculation of the magnetic polaron states by taking explicitly into account the damping effects and finite lifetime. (author)

  10. Evidence for Consistency of the Glycation Gap in Diabetes

    OpenAIRE

    Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.

    2011-01-01

    OBJECTIVE: Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS: We include...

  11. Diagnostic language consistency among multicultural English-speaking nurses.

    Science.gov (United States)

    Wieck, K L

    1996-01-01

    Cultural differences among nurses may influence the choice of terminology applicable to use of a nursing diagnostic statement. This study explored whether defining characteristics are consistently applied by culturally varied nurses in an English language setting. Two diagnoses, pain and high risk for altered skin integrity, were studied within six cultures: African, Asian, Filipino, East Indian, African-American, and Anglo-American nurses. Overall, there was consistency between the cultural groups. Analysis of variance for the pain scale demonstrated differences among cultures on two characteristics of pain, restlessness and grimace. The only difference on the high risk for altered skin integrity scale was found on the descriptor, supple skin.

  12. Atomic power project, Kakrapar, Gujarat

    International Nuclear Information System (INIS)

    Varadarajan, G.

    1992-01-01

    The atomic power project at Kakrapar, comprising two units of 235 MW each, went critical very recently, in September 1992. The work consisted of construction of reactor and turbine buildings, outer and inner containment walls, calandria vault, natural draught cooling tower, etc. Nearly 152,000 m³ of normal aggregate concrete and 3,500 m³ of heavy aggregate concrete were produced and poured. The paper describes salient innovative construction features of the project. Incidentally, the project received a Certificate of Merit in the Excellence in Concrete competition held by the Maharashtra India Chapter of the American Concrete Institute. (author). 7 figs

  13. Examination of the relationship between project management critical success factors and project success of oil and gas drilling projects

    Science.gov (United States)

    Alagba, Tonye J.

    Oil and gas drilling projects are the primary means by which oil companies recover large volumes of commercially available hydrocarbons from deep reservoirs. These types of projects are complex in nature, involving management of multiple stakeholder interfaces, multidisciplinary personnel, complex contractor relationships, and turbulent environmental and market conditions, necessitating the application of proven project management best practices and critical success factors (CSFs) to achieve success. Although there is some practitioner-oriented literature on project management CSFs for drilling projects, none of it is based on empirical evidence from research. In addition, the literature has reported alarming rates of oil and gas drilling project failure, which is attributable not to technical factors, but to failure of project management. The aim of this quantitative correlational study, therefore, was to discover an empirically verified list of project management CSFs whose consistent application leads to successful implementation of oil and gas drilling projects. The study collected survey data online from a random sample of 127 oil and gas drilling personnel who were members of LinkedIn's online community "Drilling Supervisors, Managers, and Engineers". The results of the study indicated that 10 project management factors are individually related to project success of oil and gas drilling projects. These 10 CSFs are: Project mission, Top management support, Project schedule/plan, Client consultation, Personnel, Technical tasks, Client acceptance, Monitoring and feedback, Communication, and Troubleshooting. In addition, the study found that the relationships between the 10 CSFs and drilling project success are unaffected by participant and project demographics (role of project personnel, and project location). The significance of these findings is both practical and theoretical. Practically, application of an empirically verified CSFs list to oil...

  14. Shippingport: Overall project progress

    International Nuclear Information System (INIS)

    Crimi, F.P.

    1989-01-01

    The Shippingport atomic power station (SAPS) consisted of the nuclear steam supply system and associated radioactive waste processing systems, which were owned by the US Department of Energy (DOE), and the balance of plant, owned by the Duquesne Light Company. The station is located at Shippingport, Pennsylvania, on 7 acres of land leased by DOE from Duquesne Light Company. The Shippingport Station Decommissioning Project (SSDP) is being performed under contract to the DOE by the General Electric Company (GE) and its preselected subcontractor, MK-Ferguson Company, as the decommissioning operations contractor (DOC). This paper describes the decommissioning work that has been accomplished since July 1988, and the project's cost and schedule status. As the first decommissioning of a commercial, full-scale nuclear power plant, the SSDP is expected to set the standards for the demolition of future nuclear power plants

  15. Calculation of projected ranges

    International Nuclear Information System (INIS)

    Biersack, J.P.

    1980-09-01

    The concept of multiple scattering is reconsidered for obtaining the directional spreading of ion motion as a function of energy loss. From this, the mean projection of each pathlength element of the ion trajectory is derived which - upon summation or integration - leads to the desired mean projected range. In special cases, the calculation can be carried out analytically; otherwise a simple general algorithm is derived which is suitable even for the smallest programmable calculators. Necessary input for the present treatment consists only of generally accessible stopping power and straggling formulas. The procedure does not rely on scattering cross sections, e.g. power-potential or f(t^1/2) approximations. The present approach lends itself easily to include electronic straggling or to treat composed target materials, or even to account for the so-called time integral. (orig.)
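
    The summation idea can be illustrated numerically. The sketch below is only a toy: the stopping power and the mean directional cosine are invented placeholder functions (in the treatment above the directional spreading follows from multiple-scattering theory), but it shows how each pathlength element dE/S(E) is projected onto the initial direction and accumulated.

      import numpy as np

      def stopping_power(E):              # eV/nm, invented illustrative form
          return 5.0 * np.sqrt(E) + 0.02 * E

      def mean_cosine(E, E0):             # directional spreading vs. energy loss (toy)
          return (E / E0) ** 0.2

      E0 = 1.0e5                          # eV, initial ion energy (illustrative)
      E = np.linspace(E0, 1.0, 20000)     # integrate from E0 down to ~0

      dR_dE = 1.0 / stopping_power(E)                       # pathlength per unit energy loss
      total_range = abs(np.trapz(dR_dE, E))                 # total path length (nm)
      projected_range = abs(np.trapz(mean_cosine(E, E0) * dR_dE, E))
      print(f"total path length ~ {total_range:.1f} nm, "
            f"mean projected range ~ {projected_range:.1f} nm")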

  16. Projecte WebSat

    OpenAIRE

    Pascual Aventí, Guillem

    2004-01-01

    The framework of this project is the technical support service of one of the distributors of a well-known office automation and consumer electronics brand. One of its business processes is the repair of this type of equipment.

  17. Thermodynamically consistent description of criticality in models of correlated electrons

    Czech Academy of Sciences Publication Activity Database

    Janiš, Václav; Kauch, Anna; Pokorný, Vladislav

    2017-01-01

    Vol. 95, No. 4 (2017), pp. 1-14, article No. 045108. ISSN 2469-9950. R&D Projects: GA ČR GA15-14259S. Institutional support: RVO:68378271. Keywords: conserving approximations * Anderson model * Hubbard model * parquet equations. Subject RIV: BM - Solid Matter Physics; Magnetism. OECD field: Condensed matter physics (including formerly solid state physics, supercond.). Impact factor: 3.836, year: 2016

  18. Project Exodus

    Science.gov (United States)

    1990-01-01

    Project Exodus is an in-depth study to identify and address the basic problems of a manned mission to Mars. The most important problems concern propulsion, life support, structure, trajectory, and finance. Exodus will employ a passenger ship, cargo ship, and landing craft for the journey to Mars. These three major components of the mission design are discussed separately. Within each component the design characteristics of structures, trajectory, and propulsion are addressed. The design characteristics of life support are mentioned only in those sections requiring it.

  19. Consistency relation for the Lorentz invariant single-field inflation

    International Nuclear Information System (INIS)

    Huang, Qing-Guo

    2010-01-01

    In this paper we compute the sizes of equilateral and orthogonal shape bispectrum for the general Lorentz invariant single-field inflation. The stability of field theory implies a non-negative square of sound speed which leads to a consistency relation between the sizes of orthogonal and equilateral shape bispectrum, namely f_NL^{orth} ≤ −0.054 f_NL^{equil}. In particular, for the single-field Dirac-Born-Infeld (DBI) inflation, the consistency relation becomes f_NL^{orth} = 0.070 f_NL^{equil} ≤ 0. These consistency relations are also valid in the mixed scenario where the quantum fluctuations of some other light scalar fields contribute to a part of total curvature perturbation on the super-horizon scale and may generate a local form bispectrum. A distinguishing prediction of the mixed scenario is τ_NL^{loc} > ((6/5) f_NL^{loc})². Comparing these consistency relations to WMAP 7yr data, there is still a big room for the Lorentz invariant inflation, but DBI inflation has been disfavored at more than 68% CL

  20. Short-Cut Estimators of Criterion-Referenced Test Consistency.

    Science.gov (United States)

    Brown, James Dean

    1990-01-01

    Presents simplified methods for deriving estimates of the consistency of criterion-referenced, English-as-a-Second-Language tests, including (1) the threshold loss agreement approach using agreement or kappa coefficients, (2) the squared-error loss agreement approach using the Φ(λ) dependability approach, and (3) the domain score…
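
    A minimal sketch of the threshold-loss agreement indices mentioned in (1), computed on invented scores from two hypothetical parallel administrations (the cut score, data and function name are illustrative only):

      import numpy as np

      def threshold_agreement(scores1, scores2, cut):
          """Raw agreement p0 and Cohen's kappa for master/non-master
          classifications from two administrations of the same test."""
          m1 = np.asarray(scores1) >= cut
          m2 = np.asarray(scores2) >= cut
          p0 = np.mean(m1 == m2)                              # observed agreement
          pc = np.mean(m1) * np.mean(m2) + (1 - np.mean(m1)) * (1 - np.mean(m2))
          kappa = (p0 - pc) / (1 - pc)                        # chance-corrected agreement
          return p0, kappa

      form_a = [55, 80, 90, 60, 72, 95, 40, 66, 83, 78]       # percent-correct scores
      form_b = [50, 85, 88, 65, 70, 97, 45, 58, 80, 81]
      print(threshold_agreement(form_a, form_b, cut=70))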

  1. SOCIAL COMPARISON, SELF-CONSISTENCY AND THE PRESENTATION OF SELF.

    Science.gov (United States)

    MORSE, STANLEY J.; GERGEN, KENNETH J.

    To discover how a person's (P) self-concept is affected by the characteristics of another (O) who suddenly appears in the same social environment, several questionnaires, including the Gergen-Morse (1967) self-consistency scale and half the Coopersmith self-esteem inventory, were administered to 78 undergraduate men who had answered an ad for work...

  2. Consistency of the Takens estimator for the correlation dimension

    NARCIS (Netherlands)

    Borovkova, S.; Burton, Robert; Dehling, H.

    Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, thereby extending earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We ...

  3. Assessing atmospheric bias correction for dynamical consistency using potential vorticity

    International Nuclear Information System (INIS)

    Rocheta, Eytan; Sharma, Ashish; Evans, Jason P

    2014-01-01

    Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases as dynamical consistency between the 'corrected' fields is not maintained. Use of these bias-corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency—the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques—an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exist between the multiple variables that are corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, which are improved when using the alternative proposed. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to a need for a dynamically consistent atmospheric bias correction technique which results in fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications. (letter)

  4. An algebraic method for constructing stable and consistent autoregressive filters

    International Nuclear Information System (INIS)

    Harlim, John; Hong, Hoon; Robbins, Jacob L.

    2015-01-01

    In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern
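
    As a hedged, simplified illustration of building an AR model from long-time average statistics alone and then checking the classical stability condition (this Yule-Walker stand-in is not the algebraic construction of the paper, and the autocovariance values are invented):

      import numpy as np

      def ar2_from_autocovariances(c0, c1, c2):
          """Fit x_t = a1 x_{t-1} + a2 x_{t-2} + e_t from lag-0,1,2
          autocovariances and check that the AR(2) model is stable."""
          A = np.array([[c0, c1],
                        [c1, c0]])
          a1, a2 = np.linalg.solve(A, np.array([c1, c2]))      # Yule-Walker equations
          noise_var = c0 - a1 * c1 - a2 * c2
          roots = np.roots([1.0, -a1, -a2])                    # characteristic polynomial
          stable = bool(np.all(np.abs(roots) < 1.0))
          return (a1, a2), noise_var, stable

      # Long-time average statistics of some signal (illustrative numbers)
      print(ar2_from_autocovariances(c0=1.0, c1=0.6, c2=0.3))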

  5. Delimiting coefficient alpha from internal consistency and unidimensionality

    NARCIS (Netherlands)

    Sijtsma, K.

    2015-01-01

    I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and

  6. Challenges of Predictability and Consistency in the First ...

    African Journals Online (AJOL)

    This article aims to investigate some features of Endemann's (1911) Wörterbuch der Sotho-Sprache (Dictionary of the Sotho language) with the focus on challenges of predictability and consistency in the lemmatization approach, the access alphabet, cross references and article treatments. The dictionary has hitherto ...

  7. The Impact of Orthographic Consistency on German Spoken Word Identification

    Science.gov (United States)

    Beyermann, Sandra; Penke, Martina

    2014-01-01

    An auditory lexical decision experiment was conducted to find out whether sound-to-spelling consistency has an impact on German spoken word processing, and whether such an impact is different at different stages of reading development. Four groups of readers (school children in the second, third and fifth grades, and university students)…

  8. Final Report Fermionic Symmetries and Self consistent Shell Model

    International Nuclear Information System (INIS)

    Zamick, Larry

    2008-01-01

    In this final report in the field of theoretical nuclear physics we note important accomplishments. We were confronted with 'anomalous' magnetic moments by the experimentalists and were able to explain them. We found unexpected partial dynamical symmetries, completely unknown before, and were able to a large extent to explain them. The importance of a self-consistent shell model was emphasized.

  9. Using the Perceptron Algorithm to Find Consistent Hypotheses

    OpenAIRE

    Anthony, M.; Shawe-Taylor, J.

    1993-01-01

    The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient, in general.
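
    For concreteness, a minimal version of the consistent-hypothesis construction referred to above is sketched below on the boolean OR function; the data and the epoch cap are illustrative, and the point of the abstract is precisely that the number of updates needed is not polynomially bounded in general.

      import numpy as np

      def perceptron_consistent(X, y, max_epochs=1000):
          """Run the perceptron rule until the sample is classified correctly,
          returning a hypothesis (w, b) consistent with the sample."""
          X = np.asarray(X, dtype=float)
          y = np.where(np.asarray(y) > 0, 1, -1)
          w, b = np.zeros(X.shape[1]), 0.0
          for _ in range(max_epochs):
              mistakes = 0
              for xi, yi in zip(X, y):
                  if yi * (w @ xi + b) <= 0:       # misclassified example: update
                      w, b, mistakes = w + yi * xi, b + yi, mistakes + 1
              if mistakes == 0:
                  return w, b                      # consistent with the sample
          raise RuntimeError("no consistent hypothesis found within the epoch limit")

      # Sample of the boolean OR function on {0,1}^2 (linearly separable)
      print(perceptron_consistent([[0, 0], [0, 1], [1, 0], [1, 1]], [0, 1, 1, 1]))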

  10. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
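
    The idea of treating the 0/1 response with a consistent nonparametric regression learner can be sketched as follows; this is an illustrative Python/scikit-learn analogue of the approach (the paper itself points to R packages), with synthetic data and an invented true probability curve.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.neighbors import KNeighborsRegressor

      rng = np.random.default_rng(1)

      # Synthetic binary-response data with a known P(Y=1|x) (illustrative only)
      n = 2000
      x = rng.uniform(-3, 3, size=(n, 1))
      p_true = 1.0 / (1.0 + np.exp(-2.0 * x[:, 0]))
      y = rng.binomial(1, p_true)

      # Regressing on the 0/1 response (not classifying) yields probability estimates
      rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(x, y)
      knn = KNeighborsRegressor(n_neighbors=50).fit(x, y)

      grid = np.linspace(-3, 3, 7).reshape(-1, 1)
      true_p = 1.0 / (1.0 + np.exp(-2.0 * grid[:, 0]))
      print(np.c_[grid[:, 0], true_p, rf.predict(grid), knn.predict(grid)].round(2))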

  11. Consistent seasonal snow cover depth and duration variability over ...

    Indian Academy of Sciences (India)

    Decline in consistent seasonal snow cover depth and duration, and a changing snow cover build-up pattern over the WH in recent decades, indicate that the WH has undergone considerable climate change and that winter weather patterns are changing.

  12. Is There a Future for Education Consistent with Agenda 21?

    Science.gov (United States)

    Smyth, John

    1999-01-01

    Discusses recent experiences in developing and implementing strategies for education consistent with the concept of sustainable development at two different levels: (1) the international level characterized by Agenda 21 along with the efforts of the United Nations Commission on Sustainable Development to foster its progress; and (2) the national…

  13. Diagnosing a Strong-Fault Model by Conflict and Consistency

    Directory of Open Access Journals (Sweden)

    Wenfeng Zhang

    2018-03-01

    The diagnosis method for a weak-fault model with only normal behaviors of each component has evolved over decades. However, many systems now demand strong-fault models, the fault modes of which have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Currently, diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. At the beginning, the original strong-fault model is encoded by Boolean variables and converted into Conjunctive Normal Form (CNF). Then the proposed LTMS is employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidate efficiency based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strength and weakness. Finally, the proposed approaches are demonstrated by applying them to a real-world domain—the heat control unit of a spacecraft—where the proposed methods are significantly better than best-first and conflict-directed A* search methods.
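
    The consistency-based intuition (explicit fault-mode behaviors can both implicate and exonerate components) can be illustrated with a deliberately tiny brute-force sketch; this is not the LTMS/CNF machinery of the paper, and the two-buffer model, mode names and behaviors are all invented.

      from itertools import product

      # Toy strong-fault model: two buffers A and B in series, each with an
      # explicit behavior in every mode, including the fault modes.
      MODES = {"ok": lambda v: v, "stuck0": lambda v: 0, "invert": lambda v: 1 - v}

      def predict(mode_a, mode_b, system_input):
          return MODES[mode_b](MODES[mode_a](system_input))

      def diagnose(system_input, observed_output):
          """Enumerate the mode assignments consistent with the observation."""
          return [(ma, mb) for ma, mb in product(MODES, repeat=2)
                  if predict(ma, mb, system_input) == observed_output]

      # Input 1 observed as output 0: ("ok", "ok") is ruled out by conflict,
      # while assignments such as ("ok", "stuck0") or ("invert", "ok") remain consistent.
      print(diagnose(1, 0))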

  14. Consistent dynamical and statistical description of fission and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    A survey of research on the consistent dynamical and statistical description of fission is briefly presented. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al., are compared and analyzed. (2 figs.).

  15. Brief Report: Consistency of Search Engine Rankings for Autism Websites

    Science.gov (United States)

    Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.

    2012-01-01

    The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…

  16. Consistency of the Self-Schema in Depression.

    Science.gov (United States)

    Ross, Michael J.; Mueller, John H.

    Depressed individuals may filter or distort environmental information in direct relationship to their self perceptions. To investigate the degree of uncertainty about oneself and others, as measured by consistent/inconsistent responses, 72 college students (32 depressed and 40 nondepressed) rated selected adjectives from the Derry and Kuiper…

  17. Composition consisting of a dendrimer and an active substance

    NARCIS (Netherlands)

    1995-01-01

    The invention relates to a composition consisting of a dendrimer provided with blocking agents and an active substance occluded in the dendrimer. According to the invention a blocking agent is a compound which is sterically of sufficient size, which readily enters into a chemical bond with the

  18. Analytical relativistic self-consistent-field calculations for atoms

    International Nuclear Information System (INIS)

    Barthelat, J.C.; Pelissier, M.; Durand, P.

    1980-01-01

    A new second-order representation of the Dirac equation is presented. This representation which is exact for a hydrogen atom is applied to approximate analytical self-consistent-field calculations for atoms. Results are given for the rare-gas atoms from helium to radon and for lead. The results compare favorably with numerical Dirac-Hartree-Fock solutions

  19. A consistent analysis for the quark condensate in QCD

    International Nuclear Information System (INIS)

    Huang Zheng; Huang Tao

    1988-08-01

    The dynamical symmetry breaking in QCD is analysed based on the vacuum condensates. A self-consistent equation for the quark condensate ⟨φ̄φ⟩ is derived. A nontrivial solution for ⟨φ̄φ⟩ ≠ 0 is given in terms of the QCD scale parameter Λ

  20. The consistency assessment of topological relations in cartographic generalization

    Science.gov (United States)

    Zheng, Chunyan; Guo, Qingsheng; Du, Xiaochu

    2006-10-01

    The field of research in generalization assessment has been less studied than the generalization process itself, and it is very important to keep topological relations consistent in order to meet generalization quality requirements. This paper proposes a methodology to assess the quality of a generalized map in terms of the consistency of topological relations. Taking roads (including railways) and residential areas as examples, some issues of topological consistency at different map scales are analyzed from the viewpoint of spatial cognition. The statistical information about inconsistent topological relations can be obtained by comparing two matrices: one is the matrix of the topological relations in the generalized map; the other is the theoretical matrix of the topological relations that should be maintained after generalization. Based on fuzzy set theory and the classification of map object types, a consistency evaluation model for topological relations is established. Finally, the paper proves the feasibility of the method through an example of evaluating the local topological relations between simple roads and residential areas.
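
    A toy version of the matrix-comparison step might look like the following; the relation coding, the two matrices and the simple unweighted consistency ratio are invented for illustration and ignore the fuzzy weighting described above.

      import numpy as np

      # Hypothetical topological relations between three map objects, coded as
      # 0 = disjoint, 1 = touches, 2 = crosses/overlaps (illustrative coding).
      expected = np.array([[0, 1, 0],       # relations that should be maintained
                           [1, 0, 2],
                           [0, 2, 0]])
      generalized = np.array([[0, 1, 0],    # relations found in the generalized map
                              [1, 0, 1],    # a crossing degraded to a touch
                              [0, 1, 0]])

      mismatch = expected != generalized
      consistency_ratio = 1.0 - mismatch.sum() / mismatch.size
      print(f"inconsistent entries: {int(mismatch.sum())}, "
            f"topological consistency: {consistency_ratio:.2f}")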

  1. Numerical consistency check between two approaches to radiative ...

    Indian Academy of Sciences (India)

    approaches for a consistency check on numerical accuracy, and find out the stabil... ln(M_R/1 GeV) to the top-quark mass scale t_0 (= ln(m_t/1 GeV)), where t_0 ≤ t ≤ t_R, we ... It is in general to tone down the solar mixing angle through further fine...

  2. Consistency or Discrepancy? Rethinking Schools from Organizational Hypocrisy to Integrity

    Science.gov (United States)

    Kiliçoglu, Gökhan

    2017-01-01

    Consistency in statements, decisions and practices is highly important both for organization members and for the image of an organization. Organizations, and especially their administrators, are expected to "walk the talk", in other words to try to practise what they preach. However, in the process of gaining legitimacy and adapting…

  3. Diagnosing a Strong-Fault Model by Conflict and Consistency.

    Science.gov (United States)

    Zhang, Wenfeng; Zhao, Qi; Zhao, Hongbo; Zhou, Gan; Feng, Wenquan

    2018-03-29

    The diagnosis method for a weak-fault model with only normal behaviors of each component has evolved over decades. However, many systems now demand strong-fault models, the fault modes of which have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Currently, diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. At the beginning, the original strong-fault model is encoded by Boolean variables and converted into Conjunctive Normal Form (CNF). Then the proposed LTMS is employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidate efficiency based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strength and weakness. Finally, the proposed approaches are demonstrated by applying them to a real-world domain (the heat control unit of a spacecraft), where the proposed methods are significantly better than best-first and conflict-directed A* search methods.

  4. Consistency Check for the Bin Packing Constraint Revisited

    Science.gov (United States)

    Dupuis, Julien; Schaus, Pierre; Deville, Yves

    The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C. The bin capacity is common for all the bins.

  5. Matrix analysis for associated consistency in cooperative game theory

    NARCIS (Netherlands)

    Xu, G.; Driessen, Theo; Sun, H.; Sun, H.

    Hamiache's recent axiomatization of the well-known Shapley value for TU games states that the Shapley value is the unique solution verifying the following three axioms: the inessential game property, continuity and associated consistency. Driessen extended Hamiache's axiomatization to the enlarged

  6. Matrix analysis for associated consistency in cooperative game theory

    NARCIS (Netherlands)

    Xu Genjiu, G.; Driessen, Theo; Sun, H.; Sun, H.

    Hamiache axiomatized the Shapley value as the unique solution verifying the inessential game property, continuity and associated consistency. Driessen extended Hamiache’s axiomatization to the enlarged class of efficient, symmetric, and linear values. In this paper, we introduce the notion of row

  7. Consistent measurements comparing the drift features of noble gas mixtures

    CERN Document Server

    Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y

    1999-01-01

    We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.

  8. Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

    KAUST Repository

    Zhang, Tianzhu

    2014-06-19

    Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.

  9. Structural covariance networks across healthy young adults and their consistency.

    Science.gov (United States)

    Guo, Xiaojuan; Wang, Yan; Guo, Taomei; Chen, Kewei; Zhang, Jiacai; Li, Ke; Jin, Zhen; Yao, Li

    2015-08-01

    To investigate structural covariance networks (SCNs) as measured by regional gray matter volumes with structural magnetic resonance imaging (MRI) from healthy young adults, and to examine their consistency and stability. Two independent cohorts were included in this study: Group 1 (82 healthy subjects aged 18-28 years) and Group 2 (109 healthy subjects aged 20-28 years). Structural MRI data were acquired at 3.0T and 1.5T using a magnetization prepared rapid-acquisition gradient echo sequence for these two groups, respectively. We applied independent component analysis (ICA) to construct SCNs and further applied the spatial overlap ratio and correlation coefficient to evaluate the spatial consistency of the SCNs between these two datasets. Seven and six independent components were identified for Group 1 and Group 2, respectively. Moreover, six SCNs including the posterior default mode network, the visual and auditory networks consistently existed across the two datasets. The overlap ratios and correlation coefficients of the visual network reached the maximums of 72% and 0.71. This study demonstrates the existence of consistent SCNs corresponding to general functional networks. These structural covariance findings may provide insight into the underlying organizational principles of brain anatomy. © 2014 Wiley Periodicals, Inc.
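
    A schematic of the ICA-based construction and the between-cohort consistency check, using random placeholder data in place of the segmented gray-matter volumes (the cohort sizes match the abstract; everything else is illustrative):

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(2)

      # Placeholder subjects-by-regions gray matter volume matrices for two cohorts
      group1 = rng.normal(size=(82, 90))
      group2 = rng.normal(size=(109, 90))

      def scn_maps(data, n_components=6):
          """Each independent component is a spatial map of regions whose
          gray matter volumes covary across subjects."""
          ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
          ica.fit(data)
          return ica.components_            # shape: (n_components, n_regions)

      maps1, maps2 = scn_maps(group1), scn_maps(group2)

      # Spatial consistency between cohorts: best absolute correlation per component
      corr = np.abs(np.corrcoef(np.vstack([maps1, maps2])))[:6, 6:]
      print("best matching spatial correlations:", corr.max(axis=1).round(2))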

  10. Consistency in behavior of the CEO regarding corporate social responsibility

    NARCIS (Netherlands)

    Elving, W.J.L.; Kartal, D.

    2012-01-01

    Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was

  11. Self-consistent description of the isospin mixing

    International Nuclear Information System (INIS)

    Gabrakov, S.I.; Pyatov, N.I.; Baznat, M.I.; Salamov, D.I.

    1978-03-01

    The properties of collective 0⁺ states built of unlike particle-hole excitations in spherical nuclei have been investigated in a self-consistent microscopic approach. These states arise when the broken isospin symmetry of the nuclear shell model Hamiltonian is restored. The numerical calculations were performed with Woods-Saxon wave functions

  12. Potential application of the consistency approach for vaccine potency testing.

    Science.gov (United States)

    Arciniega, J; Sirota, L A

    2012-01-01

    The Consistency Approach offers the possibility of reducing the number of animals used for a potency test. However, it is critical to assess the effect that such a reduction may have on assay performance. Consistency of production, sometimes referred to as consistency of manufacture or manufacturing, is an old concept implicit in regulation, which aims to ensure the uninterrupted release of safe and effective products. Consistency of manufacture can be described in terms of process capability, or the ability of a process to produce output within specification limits. For example, the standard method for potency testing of inactivated rabies vaccines is a multiple-dilution vaccination challenge test in mice that gives a quantitative, although highly variable, estimate. On the other hand, a single-dilution test that does not give a quantitative estimate, but rather shows whether the vaccine meets the specification, has been proposed. This simplified test can lead to a considerable reduction in the number of animals used. However, traditional indices of process capability assume that the output population (potency values) is normally distributed, which clearly is not the case for the simplified approach. Appropriate computation of capability indices for the latter case will require special statistical considerations.
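
    A short sketch of the classical capability computation mentioned above, on invented potency values; the point of the abstract is that such an index presumes an approximately normal output, which a pass/fail single-dilution test does not provide.

      import numpy as np

      def cpk(values, lower_spec, upper_spec):
          """Classical process capability index Cpk, assuming a roughly
          normally distributed output within two-sided specification limits."""
          mu, sigma = np.mean(values), np.std(values, ddof=1)
          return min(upper_spec - mu, mu - lower_spec) / (3.0 * sigma)

      # Hypothetical relative potencies from consecutive lots (illustrative only)
      potencies = [1.05, 0.98, 1.10, 1.02, 0.95, 1.08, 1.01, 0.99, 1.04, 1.06]
      print(f"Cpk = {cpk(potencies, lower_spec=0.8, upper_spec=1.3):.2f}")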

  13. Performance and consistency of indicator groups in two biodiversity hotspots.

    Directory of Open Access Journals (Sweden)

    Joaquim Trindade-Filho

    In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.

  14. Performance and consistency of indicator groups in two biodiversity hotspots.

    Science.gov (United States)

    Trindade-Filho, Joaquim; Loyola, Rafael Dias

    2011-01-01

    In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held up as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.

  15. Conformal consistency relations for single-field inflation

    International Nuclear Information System (INIS)

    Creminelli, Paolo; Noreña, Jorge; Simonović, Marko

    2012-01-01

    We generalize the single-field consistency relations to capture not only the leading term in the squeezed limit — going as 1/q{sup 3}, where q is the small wavevector — but also the subleading one, going as 1/q{sup 2}. This term, for an (n+1)-point function, is fixed in terms of the variation of the n-point function under a special conformal transformation; this parallels the fact that the 1/q{sup 3} term is related to the scale dependence of the n-point function. For the squeezed limit of the 3-point function, this conformal consistency relation implies that there are no terms going as 1/q{sup 2}. We verify that the squeezed limit of the 4-point function is related to the conformal variation of the 3-point function both in the case of canonical slow-roll inflation and in models with reduced speed of sound. In the second case the conformal consistency conditions capture, at the level of observables, the relation among operators induced by the non-linear realization of Lorentz invariance in the Lagrangian. These results mean that, in any single-field model, primordial correlation functions of ζ are endowed with an SO(4,1) symmetry, with dilations and special conformal transformations non-linearly realized by ζ. We also verify the conformal consistency relations for any n-point function in models with a modulation of the inflaton potential, where the scale dependence is not negligible. Finally, we generalize (some of) the consistency relations involving tensors and soft internal momenta.
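
    For orientation, the leading (dilation) part of the single-field consistency relation discussed above can be written schematically, in commonly used conventions, as

        \lim_{q \to 0} \langle \zeta_{\mathbf{q}} \zeta_{\mathbf{k}_1} \cdots \zeta_{\mathbf{k}_n} \rangle'
          \simeq - P_\zeta(q) \Big[ 3(n-1) + \sum_i \mathbf{k}_i \cdot \frac{\partial}{\partial \mathbf{k}_i} \Big]
          \langle \zeta_{\mathbf{k}_1} \cdots \zeta_{\mathbf{k}_n} \rangle' ,

    which is the 1/q{sup 3} piece tied to the scale dependence of the n-point function; the subleading 1/q{sup 2} piece is the one the paper fixes in terms of a special conformal transformation.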

  16. Work Project Report

    CERN Document Server

    Sallinen, Roosa-Maria

    2015-01-01

    I worked in High Power Converters section (HPC). My supervisors were Karsten Kahle and Charles-Mathieu Genton. Our team consisted of us and Francisco Rafael Blanquez Delgado who also helped me if I had any problems. The team’s main assignment is to design the new Static Var Compensator (SVC) for MEQ59 in Meyrin. The idea is to standardise all the SVCs needed at CERN in order to make the design, installation and maintenance easier and more cost effective. This report describes my project at CERN.

  17. Myanmar Model Project

    International Nuclear Information System (INIS)

    Le Heron, John

    1998-01-01

    The National Radiation Laboratory was approached by the IAEA in 1997 to provide assistance to the government of Myanmar, as part of the Model Project, in setting up an appropriate regulatory framework for radiation protection. To this end John Le Heron spent 3 weeks in late 1997 based at the Atomic Energy Department of the Ministry of Science and Technology, Yangon, assessing the existing legal framework, assisting with the review and design of the legal framework for consistency with the Basic Safety Standards, and assisting in the preparation of a system of notification, authorisation and inspection of radiation practices. (author)

  18. PORTNUS Project

    Energy Technology Data Exchange (ETDEWEB)

    Loyal, Rebecca E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-14

    The objective of the Portunus Project is to create large, automated offshore ports that will increase the pace and scale of international trade. Additionally, these ports would increase the number of U.S. domestic trade vessels needed, as the imported goods would need to be transported from these offshore platforms to land-based ports such as Boston, Los Angeles, and Newark. Currently, domestic trade in the United States can only be conducted by vessels that abide by the Merchant Marine Act of 1920 – also referred to as the Jones Act. The Jones Act stipulates that vessels involved in domestic trade must be U.S. owned, U.S. built, and manned by a crew made up of U.S. citizens. The Portunus Project would increase the number of Jones Act vessels needed, which raises an interesting economic concern. Are Jones Act ships more expensive to operate than foreign vessels? Would it be more economically efficient to modify the Jones Act and allow vessels manned by foreign crews to engage in U.S. domestic trade? While opposition to altering the Jones Act is strong, it is important to consider the possibility that ship-owners who employ foreign crews will lobby for the chance to enter a growing domestic trade market. Their success would mean potential job loss for thousands of Americans currently employed in maritime trade.

  19. FLORAM project

    Energy Technology Data Exchange (ETDEWEB)

    Zulauf, W E [Sao Paolos Environmental Secretariat, Sao Paolo (Brazil); Goelho, A S.R. [Riocell, S.A. (Brazil); Saber, A [IEA-Instituto de Estudos Avancados (Brazil); and others

    1996-12-31

    The FLORAM project was formulated at the 'Institute for Advanced Studies' of the University of Sao Paulo. It aims at decreasing the level of carbon dioxide in the atmosphere, and thus curbing the greenhouse effect, by way of a huge effort of forestation and reforestation. The resulting forests, when the trees mature, will be responsible for the absorption of about 6 billion tons of excess carbon. This represents 5 % of the total amount of CO{sub 2} which is in excess in the earth's atmosphere, and 5 % of the available continental surfaces which can be forested as well. Therefore, if similar projects are implemented throughout the world, in theory all the excess CO{sub 2} responsible for the greenhouse effect (27 % or 115 billion tons of carbon) would be absorbed. Accordingly, there would be a 400 million hectare increase of growing forests. FLORAM in Brazil aims to plant 20.000.000 ha in 2 years at a cost of 20 billion dollars. If it reaches its goals, Brazil will have reforested an area almost half as big as France. (author)

  20. FLORAM project

    Energy Technology Data Exchange (ETDEWEB)

    Zulauf, W.E. [Sao Paolos Environmental Secretariat, Sao Paolo (Brazil); Goelho, A.S.R. [Riocell, S.A. (Brazil); Saber, A. [IEA-Instituto de Estudos Avancados (Brazil)] [and others

    1995-12-31

    The FLORAM project was formulated at the 'Institute for Advanced Studies' of the University of Sao Paulo. It aims at decreasing the level of carbon dioxide in the atmosphere, and thus curbing the greenhouse effect, by way of a huge effort of forestation and reforestation. The resulting forests, when the trees mature, will be responsible for the absorption of about 6 billion tons of excess carbon. This represents 5 % of the total amount of CO{sub 2} which is in excess in the earth's atmosphere, and 5 % of the available continental surfaces which can be forested as well. Therefore, if similar projects are implemented throughout the world, in theory all the excess CO{sub 2} responsible for the greenhouse effect (27 % or 115 billion tons of carbon) would be absorbed. Accordingly, there would be a 400 million hectare increase of growing forests. FLORAM in Brazil aims to plant 20.000.000 ha in 2 years at a cost of 20 billion dollars. If it reaches its goals, Brazil will have reforested an area almost half as big as France. (author)

  1. Project Financing

    OpenAIRE

    S. GATTI

    2005-01-01

    In the introduction of the present work, the definition of project financing, its historical background and its market trends are given. The first chapter explains why companies prefer financing through project financing, discussing its advantages over direct financing as well as its disadvantages. The second chapter presents the financial elements and the role of the financial advisor. In the third section, the identific...

  2. Dispersion Differences and Consistency of Artificial Periodic Structures.

    Science.gov (United States)

    Cheng, Zhi-Bao; Lin, Wen-Kai; Shi, Zhi-Fei

    2017-10-01

    Dispersion differences and consistency of artificial periodic structures, including phononic crystals, elastic metamaterials, as well as periodic structures composed of phononic crystals and elastic metamaterials, are investigated in this paper. By developing a K(ω) method, complex dispersion relations and group/phase velocity curves of both the single-mechanism periodic structures and the mixing-mechanism periodic structures are first calculated, from which dispersion differences of artificial periodic structures are discussed. Then, based on a unified formulation, dispersion consistency of artificial periodic structures is investigated. Through a comprehensive comparison study, the correctness of the unified formulation is verified. Mathematical derivations of the unified formulation for different artificial periodic structures are presented. Furthermore, physical meanings of the unified formulation are discussed in the energy-state space.

  3. Consistent Conformal Extensions of the Standard Model arXiv

    CERN Document Server

    Loebbert, Florian; Plefka, Jan

    The question of whether classically conformal modifications of the standard model are consistent with experimental observations has recently been subject to renewed interest. The method of Gildener and Weinberg provides a natural framework for the study of the effective potential of the resulting multi-scalar standard model extensions. This approach relies on the assumption of the ordinary loop hierarchy $\lambda_\text{s} \sim g^2_\text{g}$ of scalar and gauge couplings. On the other hand, Andreassen, Frost and Schwartz recently argued that in the (single-scalar) standard model, gauge invariant results require the consistent scaling $\lambda_\text{s} \sim g^4_\text{g}$. In the present paper we contrast these two hierarchy assumptions and illustrate the differences in the phenomenological predictions of minimal conformal extensions of the standard model.

  4. Surfactant modified clays’ consistency limits and contact angles

    Directory of Open Access Journals (Sweden)

    S Akbulut

    2012-07-01

    Full Text Available This study was aimed at preparing a surfactant modified clay (SMC) and researching the effect of surfactants on clays' contact angles and consistency limits; clay was thus modified by surfactants for modifying its engineering properties. Seven surfactants (trimethylglycine, hydroxyethylcellulose, octyl phenol ethoxylate, linear alkylbenzene sulfonic acid, sodium lauryl ether sulfate, cetyl trimethylammonium chloride and quaternised ethoxylated fatty amine) were used in this study. The experimental results indicated that the SMC consistency limits (liquid and plastic limits) changed significantly compared to those of natural clay. Plasticity index and liquid limit (PI-LL) values representing soil class approached the A-line when the zwitterionic, nonionic, and anionic surfactant percentage increased. However, cationic SMC was transformed from the CH (high plasticity clay) to the MH (high plasticity silt) class, according to the unified soil classification system (USCS). Clay modified with cationic and anionic surfactants gave higher and lower contact angles than natural clay, respectively.
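
    For context, the plasticity chart referred to above classifies fine-grained soils by their liquid limit (LL) and plasticity index (PI = LL − PL) relative to the A-line, PI = 0.73(LL − 20). The sketch below applies that standard USCS rule to hypothetical limit values; it ignores borderline cases such as the CL-ML zone.

        def uscs_fine_grained(liquid_limit, plastic_limit):
            """Rough USCS classification of a fine-grained soil from its Atterberg
            limits (hypothetical input values; borderline zones are ignored).

            A-line: PI = 0.73 * (LL - 20); LL = 50 separates low (L) from high (H) plasticity.
            """
            pi = liquid_limit - plastic_limit            # plasticity index
            a_line = 0.73 * (liquid_limit - 20.0)
            plasticity = "H" if liquid_limit >= 50 else "L"
            soil = "C" if pi >= a_line else "M"          # clay above, silt below the A-line
            return soil + plasticity                     # e.g. "CH", "MH", "CL", "ML"

        print(uscs_fine_grained(liquid_limit=62, plastic_limit=30))   # above the A-line -> "CH"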

  5. Rotating D0-branes and consistent truncations of supergravity

    International Nuclear Information System (INIS)

    Anabalón, Andrés; Ortiz, Thomas; Samtleben, Henning

    2013-01-01

    The fluctuations around the D0-brane near-horizon geometry are described by two-dimensional SO(9) gauged maximal supergravity. We work out the U(1){sup 4} truncation of this theory whose scalar sector consists of five dilaton and four axion fields. We construct the full non-linear Kaluza–Klein ansatz for the embedding of the dilaton sector into type IIA supergravity. This yields a consistent truncation around a geometry which is the warped product of a two-dimensional domain wall and the sphere S{sup 8}. As an application, we consider the solutions corresponding to rotating D0-branes which in the near-horizon limit approach AdS{sub 2}×M{sub 8} geometries, and discuss their thermodynamical properties. More generally, we study the appearance of such solutions in the presence of non-vanishing axion fields.

  6. Substituting fields within the action: Consistency issues and some applications

    International Nuclear Information System (INIS)

    Pons, Josep M.

    2010-01-01

    In field theory, as well as in mechanics, the substitution of some fields in terms of other fields at the level of the action raises an issue of consistency with respect to the equations of motion. We discuss this issue and give an expression which neatly displays the difference between doing the substitution at the level of the Lagrangian or at the level of the equations of motion. Both operations do not commute in general. A very relevant exception is the case of auxiliary variables, which are discussed in detail together with some of their relevant applications. We discuss the conditions for the preservation of symmetries--Noether as well as non-Noether--under the reduction of degrees of freedom provided by the mechanism of substitution. We also examine how the gauge fixing procedures fit in our framework and give simple examples on the issue of consistency in this case.

  7. Design of a Turbulence Generator of Medium Consistency Pulp Pumps

    Directory of Open Access Journals (Sweden)

    Hong Li

    2012-01-01

    Full Text Available The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. The structural sizes of the generator affect the hydraulic performance. The radius and the blade laying angle are two important structural sizes of a turbulence generator. Starting with research on the flow inside and the shearing characteristics of the MC pulp, a simple mathematical model of the flow section of the shearing chamber is built, and the formula and procedure to calculate the radius of the turbulence generator are established. The blade laying angle is referenced from the turbine agitator, which has a shape similar to the turbulence generator, and CFD simulation is applied to study the flow fields obtained with different blade laying angles. The recommended blade laying angle of the turbulence generator is then determined to be between 60° and 75°.

  8. On the consistent effect histories approach to quantum mechanics

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    A formulation of the consistent histories approach to quantum mechanics in terms of generalized observables (POV measures) and effect operators is provided. The usual notion of "history" is generalized to the notion of "effect history." The space of effect histories carries the structure of a D-poset. Recent results of J. D. Maitland Wright imply that every decoherence functional defined for ordinary histories can be uniquely extended to a bi-additive decoherence functional on the space of effect histories. Omnès' logical interpretation is generalized to the present context. The result of this work considerably generalizes and simplifies the earlier formulation of the consistent effect histories approach to quantum mechanics communicated in a previous work of this author. Copyright 1996 American Institute of Physics.

  9. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, the danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include: NASA Procedural Requirements 8705.2B (which identifies human rating standards and requirements), draft health and medical standards for human rating, what has been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality, and appendices of government and non-governmental human factors standards.

  10. Quantitative verification of ab initio self-consistent laser theory.

    Science.gov (United States)

    Ge, Li; Tandy, Robert J; Stone, A D; Türeci, Hakan E

    2008-10-13

    We generalize and test the recent "ab initio" self-consistent (AISC) time-independent semiclassical laser theory. This self-consistent formalism generates all the stationary lasing properties in the multimode regime (frequencies, thresholds, internal and external fields, output power and emission pattern) from simple inputs: the dielectric function of the passive cavity, the atomic transition frequency, and the transverse relaxation time of the lasing transition. We find that the theory gives excellent quantitative agreement with full time-dependent simulations of the Maxwell-Bloch equations after it has been generalized to drop the slowly-varying envelope approximation. The theory is infinite order in the non-linear hole-burning interaction; the widely used third order approximation is shown to fail badly.

  11. Self-consistent studies of magnetic thin film Ni (001)

    International Nuclear Information System (INIS)

    Wang, C.S.; Freeman, A.J.

    1979-01-01

    Advances in experimental methods for studying surface phenomena have provided the stimulus to develop theoretical methods capable of interpreting this wealth of new information. Of particular interest have been the relative roles of bulk and surface contributions since in several important cases agreement between experiment and bulk self-consistent (SC) calculations within the local spin density functional formalism (LSDF) is lacking. We discuss our recent extension of the (LSDF) approach to the study of thin films (slabs) and the role of surface effects on magnetic properties. Results are described for Ni (001) films using our new SC numerical basis set LCAO method. Self-consistency within the superposition of overlapping spherical atomic charge density model is obtained iteratively with the atomic configuration as the adjustable parameter. Results are presented for the electronic charge densities and local density of states. The origin and role of (magnetic) surface states is discussed by comparison with results of earlier bulk calculations

  12. Self-consistent equilibria in the pulsar magnetosphere

    International Nuclear Information System (INIS)

    Endean, V.G.

    1976-01-01

    For a 'collisionless' pulsar magnetosphere the self-consistent equilibrium particle distribution functions are functions of the constants of the motion only. Reasons are given for concluding that to a good approximation they will be functions of the rotating frame Hamiltonian only. This is shown to result in a rigid rotation of the plasma, which therefore becomes trapped inside the velocity of light cylinder. The self-consistent field equations are derived, and a method of solving them is illustrated. The axial component of the magnetic field decays to zero at the plasma boundary. In practice, some streaming of particles into the wind zone may occur as a second-order effect. Acceleration of such particles to very high energies is expected when they approach the velocity of light cylinder, but they cannot be accelerated to very high energies near the star. (author)

  13. Consistent creep and rupture properties for creep-fatigue evaluation

    International Nuclear Information System (INIS)

    Schultz, C.C.

    1978-01-01

    The currently accepted practice of using inconsistent representations of creep and rupture behaviors in the prediction of creep-fatigue life is shown to introduce a factor of safety beyond that specified in current ASME Code design rules for 304 stainless steel Class 1 nuclear components. Accurate predictions of creep-fatigue life for uniaxial tests on a given heat of material are obtained by using creep and rupture properties for that same heat of material. The use of a consistent representation of creep and rupture properties for a minimum strength heat is also shown to provide adequate predictions. The viability of using consistent properties (either actual or those of a minimum heat) to predict creep-fatigue life thus identifies significant design uses for the results of characterization tests and improved creep and rupture correlations.
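
    As a hedged illustration of the damage bookkeeping that such creep and rupture properties feed into, the sketch below performs a plain linear summation of cycle fractions and time fractions against a hypothetical allowable damage envelope. The actual ASME Code procedure uses a bilinear creep-fatigue interaction diagram and design allowables, not this simplification.

        def creep_fatigue_damage(cycle_blocks, creep_blocks, damage_limit=1.0):
            """Linear damage summation sketch (hypothetical data, simplified envelope).

            cycle_blocks: list of (n_applied, N_allowed) fatigue pairs
            creep_blocks: list of (t_hold_hours, t_rupture_hours) creep pairs
            """
            fatigue = sum(n / N for n, N in cycle_blocks)     # cycle fraction
            creep = sum(t / tr for t, tr in creep_blocks)     # time fraction
            total = fatigue + creep
            return fatigue, creep, total <= damage_limit

        # Hypothetical service history: 2,000 cycles against 100,000 allowed,
        # 5,000 hold hours against a 40,000 h rupture life
        print(creep_fatigue_damage([(2.0e3, 1.0e5)], [(5.0e3, 4.0e4)]))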

  14. Lagrangian space consistency relation for large scale structure

    International Nuclear Information System (INIS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-01-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space

  15. Rotating D0-branes and consistent truncations of supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Anabalón, Andrés [Departamento de Ciencias, Facultad de Artes Liberales, Facultad de Ingeniería y Ciencias, Universidad Adolfo Ibáñez, Av. Padre Hurtado 750, Viña del Mar (Chile); Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS École Normale Supérieure de Lyon 46, allée d' Italie, F-69364 Lyon cedex 07 (France); Ortiz, Thomas; Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS École Normale Supérieure de Lyon 46, allée d' Italie, F-69364 Lyon cedex 07 (France)

    2013-12-18

    The fluctuations around the D0-brane near-horizon geometry are described by two-dimensional SO(9) gauged maximal supergravity. We work out the U(1){sup 4} truncation of this theory whose scalar sector consists of five dilaton and four axion fields. We construct the full non-linear Kaluza–Klein ansatz for the embedding of the dilaton sector into type IIA supergravity. This yields a consistent truncation around a geometry which is the warped product of a two-dimensional domain wall and the sphere S{sup 8}. As an application, we consider the solutions corresponding to rotating D0-branes which in the near-horizon limit approach AdS{sub 2}×M{sub 8} geometries, and discuss their thermodynamical properties. More generally, we study the appearance of such solutions in the presence of non-vanishing axion fields.

  16. Consistent creep and rupture properties for creep-fatigue evaluation

    International Nuclear Information System (INIS)

    Schultz, C.C.

    1979-01-01

    The currently accepted practice of using inconsistent representations of creep and rupture behaviors in the prediction of creep-fatigue life is shown to introduce a factor of safety beyond that specified in current ASME Code design rules for 304 stainless steel Class 1 nuclear components. Accurate predictions of creep-fatigue life for uniaxial tests on a given heat of material are obtained by using creep and rupture properties for that same heat of material. The use of a consistent representation of creep and rupture properties for a minimum strength heat is also shown to provide reasonable predictions. The viability of using consistent properties (either actual or those of a minimum strength heat) to predict creep-fatigue life thus identifies significant design uses for the results of characterization tests and improved creep and rupture correlations. 12 refs

  17. Full self-consistency versus quasiparticle self-consistency in diagrammatic approaches: exactly solvable two-site Hubbard model.

    Science.gov (United States)

    Kutepov, A L

    2015-08-12

    Self-consistent solutions of Hedin's equations (HE) for the two-site Hubbard model (HM) have been studied. They have been found for three-point vertices of increasing complexity (Γ = 1 (GW approximation), Γ1 from first-order perturbation theory, and the exact vertex Γ(E)). Comparison is made between the cases when an additional quasiparticle (QP) approximation for Green's functions is applied during the self-consistent iterative solving of HE and when the QP approximation is not applied. The results obtained with the exact vertex are directly related to the present open question of which approximation is more advantageous for future implementations, GW + DMFT or QPGW + DMFT. It is shown that in a regime of strong correlations only the originally proposed GW + DMFT scheme is able to provide reliable results. Vertex corrections based on perturbation theory (PT) systematically improve the GW results when full self-consistency is applied. The application of QP self-consistency combined with PT vertex corrections shows problems similar to the case when the exact vertex is applied combined with QP self-consistency. An analysis of Ward Identity violation is performed for all approximations studied in this work, and its relation to the general accuracy of the schemes used is provided.
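
    For reference, the closed set of Hedin's equations being iterated here can be written schematically (suppressing space-time arguments and integrations) as

        G = G_0 + G_0 \Sigma G, \qquad
        \Sigma = i\, G W \Gamma, \qquad
        W = v + v P W, \qquad
        P = -i\, G G \Gamma, \qquad
        \Gamma = 1 + \frac{\delta \Sigma}{\delta G}\, G G\, \Gamma ,

    where the GW approximation mentioned above corresponds to truncating the vertex to Γ = 1.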

  18. Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2011-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from

  19. Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2012-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from

  20. Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency

    OpenAIRE

    Sammie eTarenskeen; Mirjam eBroersma; Mirjam eBroersma; Bart eGeurts

    2015-01-01

    The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a bet...

  1. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented, a framework based purely on the sensor physics and on prior assumptions about the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.

  2. Monetary Poverty, Material Deprivation and Consistent Poverty in Portugal

    OpenAIRE

    Carlos Farinha Rodrigues; Isabel Andrade

    2012-01-01

    In this paper we use the Portuguese component of the European Union Statistics on Income and Living Conditions (EU-SILC) to develop a measure of consistent poverty in Portugal. It is widely agreed that being poor does not simply mean not having enough monetary resources. It also reflects a lack of access to the resources required to enjoy a minimum standard of living and participation in the society one belongs to. The coexistence of material deprivation and monetary poverty leads ...

  3. Consistency requirements on Δ contributions to the NN potential

    International Nuclear Information System (INIS)

    Rinat, A.S.

    1982-04-01

    We discuss theories leading to intermediate state NΔ and ΔΔ contributions to V{sub NN}. We focus on the customary addition of L{sub ΔNπ}' to L{sub πNN}' in a conventional field theory and argue that overcounting of contributions to t{sub πN} and V{sub NN} will be the rule. We then discuss the cloudy bag model where a similar interaction naturally arises and which leads to a consistent theory. (author)

  4. Quark mean field theory and consistency with nuclear matter

    International Nuclear Information System (INIS)

    Dey, J.; Tomio, L.; Dey, M.; Frederico, T.

    1989-01-01

    The 1/N{sub c} expansion in QCD (with N{sub c} the number of colours) suggests using a potential from the meson sector (e.g. Richardson) for baryons. For light quarks a σ field has to be introduced to ensure chiral symmetry breaking (χSB). It is found that nuclear matter properties can be used to pin down the χSB modelling. All masses, M{sub N}, m{sub σ}, m{sub ω}, are found to scale with density. The equations are solved self-consistently. (author)

  5. Overspecification of color, pattern, and size: salience, absoluteness, and consistency

    OpenAIRE

    Tarenskeen, S.L.; Broersma, M.; Geurts, B.

    2015-01-01

    The rates of overspecification of color, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Color and pattern are absolute and salient attributes, whereas size is relative and less salient. Additionally, a tendency toward consistent responses is assessed. Using a within-participants design, we find similar rates of color and pattern overspecification, which are both higher than the rate of size overspecification. Usi...

  6. The consistent histories interpretation of quantum fields in curved spacetime

    International Nuclear Information System (INIS)

    Blencowe, M.

    1991-01-01

    As an initial attempt to address some of the foundation problems of quantum mechanics, the author formulates the consistent histories interpretation of quantum field theory on a globally hyperbolic curved spacetime. He then constructs quasiclassical histories for a free, massive scalar field. In the final part, he points out the shortcomings of the theory and conjectures that one must take into account the fact that gravity is quantized in order to overcome them.

  7. EVALUATION OF CONSISTENCY AND SETTING TIME OF IRANIAN DENTAL STONES

    Directory of Open Access Journals (Sweden)

    F GOL BIDI

    2000-09-01

    Full Text Available Introduction. Dental stones are widely used in dentistry, and the success or failure of many dental treatments depends on the accuracy of these gypsum products. The purpose of this study was the evaluation of Iranian dental stones and a comparison between Iranian and foreign ones. In this investigation, consistency and setting time were compared between Pars Dandan, Almas and Hinrizit stones. The latter is accepted by the ADA (American Dental Association). Consistency and setting time are 2 of the 5 properties required by both ADA specification No. 25 and Iranian Standard Organization specification No. 2569 for the evaluation of dental stones. Methods. In this study, the number and preparation of specimens and the test conditions followed ADA specification No. 25, and all measurements were made with a Vicat apparatus. Results. The results showed that the standard consistency of Almas stone was obtained with 42 ml of water per 100 g of powder, and the setting time of this stone was 11±0.03 min, which was within the limits of the ADA specification (12±4 min). The standard consistency of Pars Dandan stone was obtained with 31 ml of water per 100 g of powder, but the setting time of this stone was 5±0.16 min, which was not within the limits of the ADA specification. Discussion. Comparison of the properties of the Iranian and Hinrizit stones showed that two probable problems of the Iranian stones are: 1- inhomogeneity of the Iranian stone powder, caused by uncontrolled temperature, pressure and humidity in the production process of the stone; 2- impurities such as sodium chloride, which were responsible for shortening the setting time of Pars Dandan.

  8. Modeling a Consistent Behavior of PLC-Sensors

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2014-01-01

    Full Text Available The article extends the cycle of papers dedicated to programming and verification of PLC-programs by LTL-specification. This approach makes it possible to analyze the correctness of PLC-programs by the model checking method. The model checking method requires the construction of a finite model of a PLC program. For successful verification of the required properties it is important to take into consideration that not all combinations of input signals from the sensors can occur while the PLC works with a control object. This fact requires more attention to the construction of the PLC-program model. In this paper we propose to describe a consistent behavior of sensors by three groups of LTL-formulas. They affect the program model, bringing it closer to the actual behavior of the PLC program. The idea of the LTL-requirements is shown by an example. A PLC program is a description of reactions to input signals from sensors, switches and buttons. In constructing a PLC-program model, the approach to modeling a consistent behavior of PLC sensors allows one to focus on modeling precisely these reactions without extending the program model with additional structures for realizing a realistic behavior of sensors. The consistent behavior of sensors is taken into account only at the stage of checking the conformity of the program model to the required properties, i.e. the property satisfaction proof for the constructed model is carried out under the condition that the model contains only those executions of the program that comply with the consistent behavior of sensors.
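
    As a hedged illustration of the kind of LTL constraints described above, a pair of mutually exclusive limit switches on a motor-driven carriage might be constrained as follows (the signal names and the exact formula set are hypothetical, not those of the paper):

        \mathbf{G}\, \neg(s\_bottom \wedge s\_top)                                              % the two limit switches never fire together
        \mathbf{G}\, \big( (\neg motor\_on \wedge s\_bottom) \rightarrow \mathbf{X}\, s\_bottom \big)   % with the drive off, the reading persists
        \mathbf{G}\, \big( s\_top \rightarrow \neg \mathbf{X}\, s\_bottom \big)                 % the carriage cannot jump between limits in one scan

    Formulas of this kind restrict the model-checking run to executions with physically plausible sensor behavior, without adding extra structures to the program model itself.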

  9. Sun light European Project

    Science.gov (United States)

    Soubielle, Marie-Laure

    2015-04-01

    2015 has been declared the year of light. Sunlight plays a major role in the world. From the sunbeams that heat our planet and feed our plants to the optical analysis of the sun or the modern use of sun particles in technologies, sunlight is everywhere and it is vital. This project aims to understand better the light of the Sun in a variety of fields. The experiments are carried out by students aged 15 to 20 in order to share their discoveries with Italian students from primary and secondary schools. The experiments will also be presented to a group of Danish students visiting our school in January. All experiments are carried out in English and involve teams of teachers. This project is threefold: part 1: Biological project = what are the mechanisms of photosynthesis? part 2: Optical project = what are the components of sunlight and how can they be used? part 3: Technical project = how can the energy of sunlight be used for modern devices? Photosynthesis project (Biology and English). Context: Photosynthesis is a process used by plants and other organisms to convert light energy, normally from the Sun, into chemical energy that can later fuel the organisms' activities. This chemical energy is stored in molecules which are synthesized from carbon dioxide and water. In most cases, oxygen is released as a waste product. Most plants perform photosynthesis. Photosynthesis maintains atmospheric oxygen levels and supplies all of the organic compounds and most of the energy necessary for life on Earth. Outcome: Our project consists of understanding the various steps of photosynthesis. Students will shoot a DVD of the experiments presenting the equipment required, the steps of the experiments and the results they have obtained, for a better understanding of photosynthesis. Digital pen project (Electricity, Optics and English). Context: Sunlight is a complex source of light based on white light that can be decomposed to explain light radiations or colours. This light is a precious source to create

  10. Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency

    Directory of Open Access Journals (Sweden)

    Sammie eTarenskeen

    2015-11-01

    Full Text Available The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a between-participants design, however, we find similar rates of pattern and size overspecification, which are both lower than the rate of colour overspecification. This indicates that although many speakers are more likely to include colour than pattern (probably because colour is more salient), they may also treat pattern like colour due to a tendency towards consistency. We find no increase in size overspecification when the salience of size is increased, suggesting that speakers are more likely to include absolute than relative attributes. However, we do find an increase in size overspecification when mentioning the attributes is triggered, which again shows that speakers tend to refer in a consistent manner, and that there are circumstances in which even size overspecification is frequently produced.

  11. Consistent Partial Least Squares Path Modeling via Regularization

    Directory of Open Access Journals (Sweden)

    Sunho Jung

    2018-02-01

    Full Text Available Partial least squares (PLS) path modeling is a component-based structural equation modeling approach that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
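
    In its simplest form, the ridge-type fix described above amounts to adding λI to the (consistent) correlation matrix of the exogenous latent variables before solving for the path coefficients. The snippet below is a minimal sketch of that single step on hypothetical correlations; it is not the full regularized PLSc algorithm.

        import numpy as np

        def ridge_path_coefficients(R_xx, r_xy, ridge=0.1):
            """Ridge-regularized path coefficients from latent-variable correlations.

            R_xx : correlation matrix of the exogenous latent variables (hypothetical)
            r_xy : their correlations with the endogenous latent variable
            """
            p = R_xx.shape[0]
            return np.linalg.solve(R_xx + ridge * np.eye(p), r_xy)

        # Nearly collinear predictors (hypothetical disattenuated correlations)
        R_xx = np.array([[1.00, 0.95], [0.95, 1.00]])
        r_xy = np.array([0.60, 0.58])
        print(ridge_path_coefficients(R_xx, r_xy, ridge=0.0))   # sensitive to small changes in the correlations
        print(ridge_path_coefficients(R_xx, r_xy, ridge=0.1))   # shrunken, more stable estimates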

  12. Temporal and contextual consistency of leadership in homing pigeon flocks.

    Directory of Open Access Journals (Sweden)

    Carlos D Santos

    Full Text Available Organized flight of homing pigeons (Columba livia) was previously shown to rely on simple leadership rules between flock mates, yet the stability of this social structuring over time and across different contexts remains unclear. We quantified the repeatability of leadership-based flock structures within a flight and across multiple flights conducted with the same animals. We compared two contexts of flock composition: flocks of birds of the same age and flight experience, and flocks of birds of different ages and flight experience. All flocks displayed consistent leadership-based structures over time, showing that individuals have stable roles in the navigational decisions of the flock. However, flocks of balanced age and flight experience exhibited reduced leadership stability, indicating that these factors promote flock structuring. Our study empirically demonstrates that leadership and followership are consistent behaviours in homing pigeon flocks, but such consistency is affected by the heterogeneity of individual flight experiences and/or age. Similar evidence from other species suggests leadership as an important mechanism for coordinated motion in small groups of animals with strong social bonds.

  13. Consistency checks in beam emission modeling for neutral beam injectors

    International Nuclear Information System (INIS)

    Punyapu, Bharathi; Vattipalle, Prahlad; Sharma, Sanjeev Kumar; Baruah, Ujjwal Kumar; Crowley, Brendan

    2015-01-01

    In positive neutral beam systems, the beam parameters such as ion species fractions, power fractions and beam divergence are routinely measured using the Doppler shifted beam emission spectrum. The accuracy with which these parameters are estimated depends on the accuracy of the atomic modeling involved in these estimations. In this work, an effective procedure to check the consistency of the beam emission modeling in neutral beam injectors is proposed. As a first consistency check, at a constant beam voltage and current, the intensity of the beam emission spectrum is measured by varying the pressure in the neutralizer. Then the scaling of the measured intensities of the un-shifted (target) and Doppler-shifted (projectile) components of the beam emission spectrum with pressure is studied. If the un-shifted component scales with pressure, then the intensity of this component is used as a second consistency check on the beam emission modeling. As a further check, the modeled beam fractions and the emission cross sections of projectile and target are used to predict the intensity of the un-shifted component, which is then compared with the measured target intensity. The agreement between the predicted and measured target intensities provides the degree of discrepancy in the beam emission modeling. In order to test this methodology, a systematic analysis of Doppler shift spectroscopy data obtained on the JET neutral beam test stand was carried out.

  14. A dynamical mechanism for large volumes with consistent couplings

    Energy Technology Data Exchange (ETDEWEB)

    Abel, Steven [IPPP, Durham University,Durham, DH1 3LE (United Kingdom)

    2016-11-14

    A mechanism for addressing the “decompactification problem” is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N=2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks and allow one to follow soft terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is net Bose-Fermi degeneracy in the massless sector. In such cases, because the original Casimir energy is generated entirely by excited and/or non-physical string modes, it is completely immune to the non-perturbative IR physics. Such a separation between UV and IR contributions to the potential greatly simplifies the analysis of stabilisation, and is a general possibility that has not been considered before.

  15. Consistency of variables in PCS and JASTRO great area database

    International Nuclear Information System (INIS)

    Nishino, Tomohiro; Teshima, Teruki; Abe, Mitsuyuki

    1998-01-01

    To examine whether the Patterns of Care Study (PCS) reflects the data for the major areas in Japan, the consistency of variables in the PCS and in the major area database of the Japanese Society for Therapeutic Radiology and Oncology (JASTRO) was compared. Patients with esophageal or uterine cervical cancer were sampled from the PCS and JASTRO databases. From the JASTRO database, 147 patients with esophageal cancer and 95 patients with uterine cervical cancer were selected according to the eligibility criteria for the PCS. From the PCS, 455 esophageal and 432 uterine cervical cancer patients were surveyed. Six items for esophageal cancer and five items for uterine cervical cancer were selected for a comparative analysis of the PCS and JASTRO databases. Esophageal cancer: Age (p=.0777), combination of radiation and surgery (p=.2136), and energy of the external beam (p=.6400) were consistent for PCS and JASTRO. However, the dose of the external beam for the non-surgery group showed inconsistency (p=.0467). Uterine cervical cancer: Age (p=.6301) and clinical stage (p=.8555) were consistent for the two sets of data. However, the energy of the external beam (p<.0001), dose rate of brachytherapy (p<.0001), and brachytherapy utilization by clinical stage (p<.0001) showed inconsistencies. It appears possible that the JASTRO major area database could not account for all patients' backgrounds and factors, and that both surveys might have an imbalance in the stratification of institutions, including differences in equipment and staffing patterns. (author)

  16. Self-assessment: Strategy for higher standards, consistency, and performance

    International Nuclear Information System (INIS)

    Ide, W.E.

    1996-01-01

    In late 1994, Palo Verde operations underwent a transformation from a unitized structure to a single functional unit. It was necessary to build consistency in watchstanding practices and create a shared mission. Because there was a lack of focus on actual plant operations and because personnel were deeply involved with administrative tasks, command and control of evolutions were weak. Improvement was needed. Consistent performance standards have been set for all three operating units. These expectations focus on nuclear, radiological, and industrial safety. Straightforward descriptions of watchstanding and monitoring practices have been provided to all department personnel. The desired professional and leadership qualities for employee conduct have been defined and communicated thoroughly. A healthy and competitive atmosphere developed with the successful implementation of these standards. Overall performance improved. The auxiliary operators demonstrated increased pride and ownership in the performance of their work activities. In addition, their morale improved. Crew teamwork improved as well as the quality of shift briefs. There was a decrease in the noise level and the administrative functions in the control room. The use of self-assessment helped to anchor and define higher and more consistent standards. The proof of Palo Verde's success was evident when an Institute of Nuclear Power Operations finding was turned into a strength within 1 yr.

  17. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. Here, a novel scheme called double topological relationship consistency (DCTR) is presented. The combined double topological configuration includes the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many of the problems of traditional methods that depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras are located in very different orientations. Also, the epipolar geometry can be recovered using RANSAC, by far the most widely adopted method. With this method, correspondences can be obtained with high precision on wide-baseline matching problems. Finally, the effectiveness and reliability of the method are demonstrated in wide-baseline experiments on the image pairs.
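
    For comparison with the RANSAC step mentioned above, a generic OpenCV sketch of recovering the epipolar geometry from putative feature matches might look as follows. The image paths and parameter values are placeholders, and the DCTR scheme itself is not implemented here.

        import cv2
        import numpy as np

        # Placeholder images; any wide-baseline pair would do.
        img1 = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
        img2 = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(img1, None)
        kp2, des2 = sift.detectAndCompute(img2, None)

        # Putative matches filtered by the nearest-neighbour ratio test
        matcher = cv2.BFMatcher()
        good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]

        pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

        # RANSAC estimate of the fundamental matrix; the mask flags surviving inliers
        F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
        inliers1 = pts1[mask.ravel() == 1]
        inliers2 = pts2[mask.ravel() == 1]
        print(F, len(inliers1))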

  18. Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control

    Directory of Open Access Journals (Sweden)

    Y.A. Ahmed

    2015-09-01

    Full Text Available In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named ‘virtual window’ is introduced. Such consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for the command rudder and propeller revolution outputs. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After acceptable percentages of success are obtained, the trained networks are implemented in the free-running experiment system to judge the networks’ real-time response for the Esso Osaka 3-m model ship. The networks’ behaviour during such experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study on automatic tug assistance for the final alignment of the ship with the actual pier is also discussed.
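
    As a loose sketch of the supervised setup described above (two separate feed-forward networks mapping the ship state to a rudder command and a propeller revolution), one could write the following; the state variables, ranges and training data are entirely hypothetical placeholders, not the teaching data of the paper.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # Hypothetical teaching data: each row is a ship state
        # (x, y, heading, surge speed, yaw rate); in the paper the targets come
        # from consistent minimum-time course-changing manoeuvres.
        states = np.random.rand(500, 5)
        rudder_cmd = np.random.uniform(-35, 35, 500)     # degrees (placeholder)
        prop_rev = np.random.uniform(0, 20, 500)         # rps (placeholder)

        rudder_net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000).fit(states, rudder_cmd)
        prop_net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000).fit(states, prop_rev)

        new_state = np.random.rand(1, 5)
        print(rudder_net.predict(new_state), prop_net.predict(new_state))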

  19. Consistency in color parameters of a commonly used shade guide.

    Science.gov (United States)

    Tashkandi, Esam

    2010-01-01

    The use of shade guides to assess the color of natural teeth subjectively remains one of the most common means of dental shade assessment. Any variation in the color parameters of the different shade guides may lead to significant clinical implications, particularly since the communication between the clinic and the dental laboratory is based on the shade guide designation. The purpose of this study was to investigate the consistency of the L∗a∗b∗ color parameters of a sample of a commonly used shade guide. The color parameters of a total of 100 VITAPAN Classical Vacuum shade guides (VITA Zahnfabrik, Bad Säckingen, Germany) were measured using an X-Rite ColorEye 7000A Spectrophotometer (Grand Rapids, Michigan, USA). Each shade guide consists of 16 tabs with different designations. Each shade tab was measured five times and the average values were calculated. The ΔE between the average L∗a∗b∗ value for each shade tab and the average of the 100 shade tabs of the same designation was calculated. Using Student t-test analysis, no significant differences were found among the measured sample. The measured VITAPAN Classical Vacuum shade guide sample showed a high level of consistency in its color parameters.
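
    The ΔE figure used above is, in its simplest (CIE76) form, the Euclidean distance in L*a*b* space. A minimal sketch with hypothetical shade-tab readings:

        import math

        def delta_e_cie76(lab1, lab2):
            """CIE76 colour difference: Euclidean distance in L*a*b* space."""
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

        # Hypothetical readings: one shade tab and the mean of its designation group
        tab = (78.2, 1.4, 15.8)
        group_mean = (77.9, 1.6, 16.3)
        print(round(delta_e_cie76(tab, group_mean), 2))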

  20. Context-specific metabolic networks are consistent with experiments.

    Directory of Open Access Journals (Sweden)

    Scott A Becker

    2008-05-01

    Full Text Available Reconstructions of cellular metabolism are publicly available for a variety of different microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where the expression profiling data is available.

  1. The study of consistent properties of gelatinous shampoo with minoxidil

    Directory of Open Access Journals (Sweden)

    I. V. Gnitko

    2016-04-01

    Full Text Available The aim of the work is the study of the consistency properties of a gelatinous shampoo with minoxidil 1% for the complex therapy and prevention of alopecia. This shampoo with minoxidil was selected on the basis of complex physical-chemical, biopharmaceutical and microbiological investigations. Methods and results. It has been established that the consistency properties of the gelatinous minoxidil 1% shampoo and its «mechanical stability» (1.70) describe the formulation as an exceptionally thixotropic composition with the possibility of restoration after mechanical loads. This also allows one to predict the stability of the consistency properties during long storage. Conclusion. The dynamic flow factors of the foam detergent gel with minoxidil (Kd1=38.9%; Kd2=78.06%) quantitatively confirm a sufficient degree of distribution when the composition is spread on the skin surface of the hairy part of the head or during technological manufacturing operations. The insignificant difference in «mechanical stability» between the gelatinous minoxidil 1% shampoo and its base indicates the absence of interactions between the active substance and the base.

  2. Consistent Kaluza-Klein truncations via exceptional field theory

    Energy Technology Data Exchange (ETDEWEB)

    Hohm, Olaf [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States); Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS,École Normale Supérieure de Lyon, 46, allée d’Italie, F-69364 Lyon cedex 07 (France)

    2015-01-26

    We present the generalized Scherk-Schwarz reduction ansatz for the full supersymmetric exceptional field theory in terms of group valued twist matrices subject to consistency equations. With this ansatz the field equations precisely reduce to those of lower-dimensional gauged supergravity parametrized by an embedding tensor. We explicitly construct a family of twist matrices as solutions of the consistency equations. They induce gauged supergravities with gauge groups SO(p,q) and CSO(p,q,r). Geometrically, they describe compactifications on internal spaces given by spheres and (warped) hyperboloids H{sup p,q}, thus extending the applicability of generalized Scherk-Schwarz reductions beyond homogeneous spaces. Together with the dictionary that relates exceptional field theory to D=11 and IIB supergravity, respectively, the construction defines an entirely new family of consistent truncations of the original theories. These include not only compactifications on spheres of different dimensions (such as AdS{sub 5}×S{sup 5}), but also various hyperboloid compactifications giving rise to a higher-dimensional embedding of supergravities with non-compact and non-semisimple gauge groups.

  3. Marginal Consistency: Upper-Bounding Partition Functions over Commutative Semirings.

    Science.gov (United States)

    Werner, Tomás

    2015-07-01

    Many inference tasks in pattern recognition and artificial intelligence lead to partition functions in which addition and multiplication are abstract binary operations forming a commutative semiring. By generalizing max-sum diffusion (one of the convergent message-passing algorithms for approximate MAP inference in graphical models), we propose an iterative algorithm to upper bound such partition functions over commutative semirings. The iteration of the algorithm is remarkably simple: change any two factors of the partition function such that their product remains the same and their overlapping marginals become equal. In many commutative semirings, repeating this iteration for different pairs of factors converges to a fixed point at which the overlapping marginals of every pair of factors coincide. We call this state marginal consistency. During this process, an upper bound on the partition function monotonically decreases. This abstract algorithm unifies several existing algorithms, including max-sum diffusion and basic constraint propagation (or local consistency) algorithms in constraint programming. We further construct a hierarchy of marginal consistencies of increasingly higher levels and show that any such level can be enforced by adding identity factors of higher arity (order). Finally, we discuss instances of the framework for several semirings, including the distributive lattice and the max-sum and sum-product semirings.
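
    A minimal sketch of that iteration in the sum-product semiring, for just two nonnegative factors sharing one variable (a toy of the general semiring algorithm, not the paper's implementation):

        import numpy as np

        rng = np.random.default_rng(0)
        f = rng.random((3, 4)) + 0.1      # factor f(x, y)
        g = rng.random((4, 5)) + 0.1      # factor g(y, z); y is the overlapping variable

        def partition(f, g):              # Z = sum_{x,y,z} f(x,y) g(y,z)
            return float(np.einsum('xy,yz->', f, g))

        def upper_bound(f, g):            # trivial bound (sum f)(sum g) >= Z for nonnegative factors
            return float(f.sum() * g.sum())

        Z0, B0 = partition(f, g), upper_bound(f, g)

        # One step: rescale both factors so their product is unchanged and the
        # overlapping marginals over y become equal.
        mu_f = f.sum(axis=0)
        mu_g = g.sum(axis=1)
        scale = np.sqrt(mu_g / mu_f)
        f = f * scale[None, :]
        g = g / scale[:, None]

        assert np.isclose(partition(f, g), Z0)                 # partition function unchanged
        assert np.allclose(f.sum(axis=0), g.sum(axis=1))       # marginal consistency reached
        print(B0, upper_bound(f, g))                           # bound does not increase (Cauchy-Schwarz)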

  4. Consistency relation in power law G-inflation

    International Nuclear Information System (INIS)

    Unnikrishnan, Sanil; Shankaranarayanan, S.

    2014-01-01

    In the standard inflationary scenario based on a minimally coupled scalar field, canonical or non-canonical, the subluminal propagation speed of scalar perturbations ensures the following consistency relation: r ≤ −8n_T, where r is the tensor-to-scalar ratio and n_T is the spectral index for tensor perturbations. However, it has recently been demonstrated that this consistency relation can be violated in Galilean inflation models even in the absence of superluminal propagation of scalar perturbations. It is therefore interesting to investigate whether the subluminal propagation of scalar field perturbations imposes any bound on the ratio r/|n_T| in G-inflation models. In this paper, we derive the consistency relation for a class of G-inflation models that lead to power law inflation. Within this class of models, it turns out that one can have r > −8n_T or r ≤ −8n_T depending on the model parameters. However, the subluminal propagation speed of scalar field perturbations, as required by causality, restricts r ≤ −(32/3) n_T.
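
    For orientation, the r ≤ −8n_T relation quoted above follows from the leading-order slow-roll expressions of standard single-field models with a constant sound speed c_s (a textbook result, not this paper's G-inflation derivation):

        r = 16\,\epsilon\,c_s, \qquad n_T = -2\,\epsilon
        \quad\Longrightarrow\quad
        r = -8\,c_s\,n_T \;\le\; -8\,n_T \qquad \text{for } 0 < c_s \le 1 .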

  5. On the consistency of risk acceptance criteria with normative theories for decision-making

    Energy Technology Data Exchange (ETDEWEB)

    Abrahamsen, E.B. [University of Stavanger, 4036 Stavanger (Norway)], E-mail: eirik.abrahamsen@uis.no; Aven, T. [University of Stavanger, 4036 Stavanger (Norway)

    2008-12-15

    In the evaluation of safety in projects it is common to use risk acceptance criteria to support decision-making. In this paper, we discuss to what extent risk acceptance criteria are in accordance with the normative theoretical framework of the expected utility theory and the rank-dependent utility theory. We show that the use of risk acceptance criteria may violate the independence axiom of the expected utility theory and the comonotonic independence axiom of the rank-dependent utility theory. Hence the use of risk acceptance criteria is not in general consistent with these theories. The level of inconsistency is highest for the expected utility theory.
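
    A hypothetical numerical illustration (not taken from the paper) of how a fixed probability-based acceptance criterion can rank options differently from an expected-utility calculation:

        # Two hypothetical design options: (failure probability, loss if failure, extra mitigation cost).
        options = {
            "A": (2e-4, 1_000_000, 0),    # cheaper, but exceeds the acceptance threshold
            "B": (5e-5, 1_000_000, 500),  # meets the threshold at extra cost
        }
        P_MAX = 1e-4                      # risk acceptance criterion on the failure probability

        for name, (p, loss, cost) in options.items():
            expected_cost = p * loss + cost   # expected-utility (expected-cost) ranking
            accepted = p <= P_MAX             # ranking implied by the acceptance criterion
            print(name, expected_cost, accepted)

        # Expected cost favours A (200 vs. 550), while the acceptance criterion admits only B;
        # the two decision rules need not agree, which is the kind of tension discussed above.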

  6. On the consistency of risk acceptance criteria with normative theories for decision-making

    International Nuclear Information System (INIS)

    Abrahamsen, E.B.; Aven, T.

    2008-01-01

    In the evaluation of safety in projects it is common to use risk acceptance criteria to support decision-making. In this paper, we discuss to what extent risk acceptance criteria are in accordance with the normative theoretical framework of the expected utility theory and the rank-dependent utility theory. We show that the use of risk acceptance criteria may violate the independence axiom of the expected utility theory and the comonotonic independence axiom of the rank-dependent utility theory. Hence the use of risk acceptance criteria is not in general consistent with these theories. The level of inconsistency is highest for the expected utility theory.

  7. A Large Tracking Detector In Vacuum Consisting Of Self-Supporting Straw Tubes

    Science.gov (United States)

    Wintz, P.

    2004-02-01

    A novel technique to stretch the anode wire simply by the gas over-pressure inside straw drift tubes reduces the necessary straw weight to an absolute minimum. Our detector will consist of more than 3000 straws filling a cylindrical tracking volume of 1 m diameter and 30 cm length. The projected spatial resolution is 200 μm. The detector, with a total mass of less than 15 kg, will be operated in vacuum, but will have an added wall thickness of only 3 mm of mylar. The detector design, production experience and first results will be discussed.

  8. A Large Tracking Detector In Vacuum Consisting Of Self-Supporting Straw Tubes

    International Nuclear Information System (INIS)

    Wintz, P.

    2004-01-01

    A novel technique to stretch the anode wire simply by the gas over-pressure inside straw drift tubes reduces the necessary straw weight to an absolute minimum. Our detector will consist of more than 3000 straws filling a cylindrical tracking volume of 1 m diameter and 30 cm length. The projected spatial resolution is 200 μm. The detector, with a total mass of less than 15 kg, will be operated in vacuum, but will have an added wall thickness of only 3 mm of mylar. The detector design, production experience and first results will be discussed.

  9. Project: Ultracentrifuges

    International Nuclear Information System (INIS)

    Olea C, O.

    1990-07-01

    The transelastic magnetic-suspension ultracentrifuge is an instrument that originated with an interdisciplinary group directed by Dr. James Clark Keith, in which a previously non-existent centrifuge was conceived, designed and built to be applied to the forced diffusion of uranium, one of its many fields of application. The purpose of this paper is to present the fundamental physical principles of this technology, its main design characteristics and its application to the isotope separation process, together with the preliminary studies and the essential control parameters of the experimental processes, as well as the most outstanding results, the detection systems used for their confirmation and, finally, the potential applications of the principles of ultracentrifugation technology. (Author)

  10. ENVISION Project

    CERN Multimedia

    Ballantine, A; Dixon-Altaber, H; Dosanjh, M; Kuchina, L

    2011-01-01

    Hadrontherapy is a highly advanced technique of cancer radiotherapy that uses beams of charged particles (ions) to destroy tumour cells. While conventional X-rays traverse the human body depositing radiation as they pass through, ions deliver most of their energy at one point. Hadrontherapy is most advantageous once the position of the tumour is accurately known, so that healthy tissues can be protected. Accurate positioning is a crucial challenge for targeting moving organs, as in lung cancer, and for adapting the irradiation as the tumour shrinks with treatment. Therefore, quality assurance becomes one of the most relevant issues for an effective outcome of the cancer treatment. In order to improve the quality assurance tools for hadrontherapy, the European Commission is funding ENVISION, a 4-year project that aims at developing solutions for: • real-time non-invasive monitoring • quantitative imaging • precise determination of delivered dose • fast feedback for optimal treatment planning • real-t...

  11. EUROTRAC projects

    International Nuclear Information System (INIS)

    Slanina, J.; Arends, B.G.; Wyers, G.P.

    1992-07-01

    The projects discussed are BIATEX (BIosphere-ATmosphere EXchange of pollutants), ACE (Acidity in Clouds Experiment) and GCE (Ground-based Cloud Experiment). ECN also coordinates BIATEX and contributes to the coordination of EUROTRAC. Research in BIATEX is aimed at the development of equipment by which atmosphere-surface interactions of air pollution can be quantified. An ion chromatograph, connected to a rotating denuder, has been developed for field application in the on-line analysis of denuder extracts and other samples. To investigate dry deposition of ammonia, a continuous-flow denuder has been developed. A thermodenuder system to measure the concentrations of HNO3 and NH4NO3 in the ambient air has been optimized to determine depositions and is part of the ECN monitoring station in Zegveld, Netherlands. An aerosol separation technique, based on a cyclone separator, has also been developed. All this equipment has been used in field experiments above wheat and heather. An automated monitoring station for long-term investigations of NH3, HNO3 and SO2 dry deposition on grassland and the impact of the deposition on the presence and composition of water films has been set up and fully tested. Research in GCE concerns the uptake and conversion of air pollution in clouds (cloud chemistry). Measuring equipment from several collaborating institutes has been specified and calibrated in a cloud chamber at ECN. The ECN contribution is the determination of the gas phase composition and the micro-physical characterization of the clouds. Measurement campaigns were carried out in the Po area (Italy) in fog, and at Kleiner Feldberg near Frankfurt, Germany, in orographic clouds. Estimates are given of the deposition of fog water and cloud water on forests in the Netherlands and the low mountain range in Germany. The project ACE was not started for financial reasons and will be reconsidered. 26 figs., 1 tab., 3 apps., 34 refs

  12. Group sparse multiview patch alignment framework with view consistency for image classification.

    Science.gov (United States)

    Gui, Jie; Tao, Dacheng; Sun, Zhenan; Luo, Yong; You, Xinge; Tang, Yuan Yan

    2014-07-01

    No single feature can satisfactorily characterize the semantic concepts of an image. Multiview learning aims to unify different kinds of features to produce a consensual and efficient representation. This paper redefines part optimization in the patch alignment framework (PAF) and develops a group sparse multiview patch alignment framework (GSM-PAF). The new part optimization considers not only the complementary properties of different views, but also view consistency. In particular, view consistency models the correlations between all possible combinations of any two kinds of view. In contrast to conventional dimensionality reduction algorithms that perform feature extraction and feature selection independently, GSM-PAF performs joint feature extraction and feature selection by exploiting the l(2,1)-norm on the projection matrix to achieve row sparsity, which leads to the simultaneous selection of relevant features and learning of the transformation, and thus makes the algorithm more discriminative. Experiments on two real-world image data sets demonstrate the effectiveness of GSM-PAF for image classification.
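
    The l(2,1)-norm referred to above is the sum of the Euclidean norms of the projection matrix's rows, so penalizing it drives entire rows (i.e. features) to zero. A small sketch with a made-up matrix, not the GSM-PAF solver itself:

        import numpy as np

        def l21_norm(W):
            """||W||_{2,1} = sum over rows i of ||W_i||_2."""
            return float(np.sqrt((W ** 2).sum(axis=1)).sum())

        W = np.array([[0.0, 0.0, 0.0],    # an all-zero row: this feature is dropped for every projection direction
                      [0.3, -0.1, 0.2],
                      [1.2, 0.4, -0.5]])

        print(l21_norm(W))                # penalizing this value encourages row sparsity,
                                          # i.e. joint feature selection and transformation learning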

  13. Project Success in IT Project Management

    OpenAIRE

    Siddiqui, Farhan Ahmed

    2010-01-01

    The rate of failed and challenged Information Technology (IT) projects is too high according to the CHAOS Studies by the Standish Group and the literature on project management (Standish Group, 2008). The CHAOS Studies define project success as meeting the triple constraints of scope, time, and cost. The criteria for project success need to be agreed by all parties before the start of the project and constantly reviewed as the project progresses. Assessing critical success factors is another ...

  14. The Atlas upgrade project

    International Nuclear Information System (INIS)

    Bollinger, L.M.

    1988-01-01

    ATLAS is a heavy-ion accelerator system consisting of a 9-MV tandem electrostatic injector coupled to a superconducting linac. A project now well advanced will upgrade the capabilities of ATLAS immensely by replacing the tandem and its negative-ion source with a positive-ion injector that consists of an electron-cyclotron resonance (ECR) ion source and a 12-MV superconducting injector linac of novel design. This project will increase the beam intensity 100 fold and will extend the projectile-mass range up to uranium. Phase 1 of the work, which is nearing completion in late 1988, will provide an injector comprising the ECR source and its 350-kV voltage platform, beam analysis and bunching systems, beam lines, and a prototype 3-MV linac. The ECR source and its voltage platform are operational, development of the new class of low-frequency interdigital superconducting resonators required for the injector linac has been completed, and assembly of the whole system is in progress. Test runs and then routine use of the Phase 1 injector systems are planned for early 1989, and the final 12-MV injector linac will be commissioned in 1990. 12 refs., 6 figs

  15. Response of coral assemblages to thermal stress: are bleaching intensity and spatial patterns consistent between events?

    Science.gov (United States)

    Penin, Lucie; Vidal-Dupiol, Jeremie; Adjeroud, Mehdi

    2013-06-01

    Mass bleaching events resulting in coral mortality are among the greatest threats to coral reefs, and are projected to increase in frequency and intensity with global warming. Achieving a better understanding of the consistency of the response of coral assemblages to thermal stress, both spatially and temporally, is essential to determine which reefs are more able to tolerate climate change. We compared variations in spatial and taxonomic patterns between two bleaching events at the scale of an island (Moorea Island, French Polynesia). Despite similar thermal stress and light conditions, bleaching intensity was significantly lower in 2007 (approximately 37 % of colonies showed signs of bleaching) than in 2002, when 55 % of the colonies bleached. Variations in the spatial patterns of bleaching intensity were consistent between the two events. Among nine sampling stations at three locations and three depths, the stations at which the bleaching response was lowest in 2002 were those that showed the lowest levels of bleaching in 2007. The taxonomic patterns of susceptibility to bleaching were also consistent between the two events. These findings have important implications for conservation because they indicate that corals are capable of acclimatization and/or adaptation and that, even at small spatial scales, some areas are consistently more susceptible to bleaching than others.

  16. Consonance in Information System Projects: A Relationship Marketing Perspective

    Science.gov (United States)

    Lin, Pei-Ying

    2010-01-01

    Different stakeholders in the information system project usually have different perceptions and expectations of the projects. There is seldom consistency in the stakeholders' evaluations of the project outcome. Thus the outcomes of information system projects are usually disappointing to one or more stakeholders. Consonance is a process that can…

  17. Dictionary-based fiber orientation estimation with improved spatial consistency.

    Science.gov (United States)

    Ye, Chuyang; Prince, Jerry L

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) has enabled in vivo investigation of white matter tracts. Fiber orientation (FO) estimation is a key step in tract reconstruction and has been a popular research topic in dMRI analysis. In particular, the sparsity assumption has been used in conjunction with a dictionary-based framework to achieve reliable FO estimation with a reduced number of gradient directions. Because image noise can have a deleterious effect on the accuracy of FO estimation, previous works have incorporated spatial consistency of FOs in the dictionary-based framework to improve the estimation. However, because FOs are only indirectly determined from the mixture fractions of dictionary atoms and not modeled as variables in the objective function, these methods do not incorporate FO smoothness directly, and their ability to produce smooth FOs could be limited. In this work, we propose an improvement to Fiber Orientation Reconstruction using Neighborhood Information (FORNI), which we call FORNI+; this method estimates FOs in a dictionary-based framework where FO smoothness is better enforced than in FORNI alone. We describe an objective function that explicitly models the actual FOs and the mixture fractions of dictionary atoms. Specifically, it consists of data fidelity between the observed signals and the signals represented by the dictionary, pairwise FO dissimilarity that encourages FO smoothness, and weighted ℓ 1 -norm terms that ensure the consistency between the actual FOs and the FO configuration suggested by the dictionary representation. The FOs and mixture fractions are then jointly estimated by minimizing the objective function using an iterative alternating optimization strategy. FORNI+ was evaluated on a simulation phantom, a physical phantom, and real brain dMRI data. In particular, in the real brain dMRI experiment, we have qualitatively and quantitatively evaluated the reproducibility of the proposed method. Results demonstrate that
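
    A schematic of an objective with the three ingredients named above (data fidelity, pairwise FO dissimilarity, weighted l1 consistency terms), written for generic vectors; the function names, dissimilarity choice and weights are illustrative assumptions, not the FORNI+ formulation.

        import numpy as np

        def toy_objective(D, x, y, f, f_neighbors, w, lam_smooth, lam_l1):
            """Schematic cost: ||y - D x||^2 + lam_smooth * sum_j d(f, f_j) + lam_l1 * sum_k w_k |x_k|."""
            fidelity = np.sum((y - D @ x) ** 2)                            # dictionary signal vs. observation
            smooth = sum(1.0 - abs(np.dot(f, fj)) for fj in f_neighbors)   # 1 - |cos angle| to neighboring FOs
            sparsity = np.sum(w * np.abs(x))                               # weighted l1 tying mixture fractions to the FOs
            return fidelity + lam_smooth * smooth + lam_l1 * sparsity

        rng = np.random.default_rng(1)
        D = rng.standard_normal((8, 5))            # dictionary atoms as columns
        x = np.abs(rng.standard_normal(5))         # mixture fractions
        y = D @ x + 0.01 * rng.standard_normal(8)  # noisy observed signal
        f = np.array([1.0, 0.0, 0.0])              # a unit fiber-orientation vector
        fn = np.array([0.95, 0.31, 0.0]); f_neighbors = [fn / np.linalg.norm(fn)]
        w = np.ones(5)

        print(toy_objective(D, x, y, f, f_neighbors, w, lam_smooth=0.5, lam_l1=0.1))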

  18. Nonlinear and self-consistent treatment of ECRH

    Energy Technology Data Exchange (ETDEWEB)

    Tsironis, C.; Vlahos, L.

    2005-07-01

    A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied to the description of ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and an electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory for the wave absorption and the quasilinear theory for the electron distribution function are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely in cases where nonlinear effects, such as particle trapping in the wave field, are dominant in the particle phase-space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived which takes into account the effects of the charged particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)

  19. Nonlinear and self-consistent treatment of ECRH

    International Nuclear Information System (INIS)

    Tsironis, C.; Vlahos, L.

    2005-01-01

    A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied to the description of ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and an electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory for the wave absorption and the quasilinear theory for the electron distribution function are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely in cases where nonlinear effects, such as particle trapping in the wave field, are dominant in the particle phase-space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived which takes into account the effects of the charged particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)

  20. Development of a Consistent and Reproducible Porcine Scald Burn Model

    Science.gov (United States)

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations; 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1, 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  1. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Directory of Open Access Journals (Sweden)

    Laura R. STEIN, Alison M. BELL

    2012-02-01

    Full Text Available There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1: 45–52, 2012].

  2. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Institute of Scientific and Technical Information of China (English)

    Laura R. STEIN; Alison M. BELL

    2012-01-01

    There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45–52, 2012].

  3. Flood damage curves for consistent global risk assessments

    Science.gov (United States)

    de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek

    2016-04-01

    Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves which are based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to different methodologies employed for various damage models in different countries, damage assessments cannot be directly compared with each other, obstructing also supra-national flood damage assessments. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth as well as maximum damage values for a variety of assets and land use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building material, etc. This dataset can be used for consistent supra
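
    Applying such a curve reduces to interpolating a damage fraction at the flood depth and scaling it by the country-specific maximum damage; a minimal sketch with invented curve points and values, not the dataset described above:

        import numpy as np

        # hypothetical residential depth-damage curve: water depth (m) -> fraction of maximum damage
        depths = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 6.0])
        fractions = np.array([0.00, 0.25, 0.40, 0.60, 0.75, 1.00])   # concave shape, as described above

        def flood_damage(depth_m, max_damage_per_m2, exposed_area_m2):
            frac = np.interp(depth_m, depths, fractions)   # values outside the curve are clamped to its ends
            return frac * max_damage_per_m2 * exposed_area_m2

        print(flood_damage(1.5, max_damage_per_m2=600.0, exposed_area_m2=120.0))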

  4. Coagulation of Agglomerates Consisting of Polydisperse Primary Particles.

    Science.gov (United States)

    Goudeli, E; Eggersdorfer, M L; Pratsinis, S E

    2016-09-13

    The ballistic agglomeration of polydisperse particles is investigated by an event-driven (ED) method and compared to the coagulation of spherical particles and agglomerates consisting of monodisperse primary particles (PPs). It is shown for the first time to our knowledge that increasing the width or polydispersity of the PP size distribution initially accelerates the coagulation rate of their agglomerates but delays the attainment of their asymptotic fractal-like structure and self-preserving size distribution (SPSD) without altering them, provided that sufficiently large numbers of PPs are employed. For example, the standard asymptotic mass fractal dimension, Df, of 1.91 is attained when clusters are formed containing, on average, about 15 monodisperse PPs, consistent with fractal theory and the literature. In contrast, when polydisperse PPs with a geometric standard deviation of 3 are employed, about 500 PPs are needed to attain that Df. Even though the same asymptotic Df and mass-mobility exponent, Dfm, are attained regardless of PP polydispersity, the asymptotic prefactors or lacunarities of Df and Dfm increase with PP polydispersity. For monodisperse PPs, the average agglomerate radius of gyration, rg, becomes larger than the mobility radius, rm, when agglomerates consist of more than 15 PPs. Increasing PP polydispersity increases that number of PPs similarly to the above for the attainment of the asymptotic Df or Dfm. The agglomeration kinetics are quantified by the overall collision frequency function. When the SPSD is attained, the collision frequency is independent of PP polydispersity. Accounting for the SPSD polydispersity in the overall agglomerate collision frequency is in good agreement with that frequency from detailed ED simulations once the SPSD is reached. Most importantly, the coagulation of agglomerates is described well by a monodisperse model for agglomerate and PP sizes, whereas the detailed agglomerate size distribution can be obtained by

  5. Near-resonant absorption in the time-dependent self-consistent field and multiconfigurational self-consistent field approximations

    DEFF Research Database (Denmark)

    Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa

    2001-01-01

    Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered and the linear response function was shown to be convergent in the whole frequency region. This was achieved through the incorporation of phenomenological damping factors that lead to complex response function values...

  6. Consistent and efficient processing of ADCP streamflow measurements

    Science.gov (United States)

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements, all independent of the ADCP manufacturer, are being developed in a software program that can process moving-boat discharge measurements regardless of the ADCP used to collect the data.
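
    The shared computational core of such processing is the velocity-area integration of discharge over the measured ensembles, with edge discharge estimated separately; the numbers and the edge coefficient below are illustrative only, not any manufacturer's algorithm.

        import numpy as np

        # hypothetical per-ensemble data from one moving-boat transect
        depth = np.array([1.8, 2.1, 2.4, 2.2, 1.9])          # mean depth of each ensemble (m)
        velocity = np.array([0.42, 0.55, 0.61, 0.50, 0.40])  # depth-averaged velocity normal to the boat track (m/s)
        track_width = np.array([0.9, 1.0, 1.1, 1.0, 0.9])    # boat displacement per ensemble (m)

        # measured (middle) portion of the discharge: Q = sum(v_i * d_i * w_i)
        q_measured = float(np.sum(velocity * depth * track_width))

        # a simple triangular-edge estimate, one common choice for the unmeasured edges
        edge_coeff, edge_dist = 0.35, 2.0
        q_edge = edge_coeff * velocity[0] * depth[0] * edge_dist

        print(q_measured, q_measured + 2 * q_edge)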

  7. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection ... is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating ...

  8. Designing apps for success developing consistent app design practices

    CERN Document Server

    David, Matthew

    2014-01-01

    In 2007, Apple released the iPhone. With this release came tools as revolutionary as the internet was to businesses and individuals back in the mid- and late-nineties: Apps. Much like websites drove (and still drive) business, so too do apps drive sales, efficiencies and communication between people. But also like web design and development, in its early years and iterations, guidelines and best practices for apps are few and far between. Designing Apps for Success provides web/app designers and developers with consistent app design practices that result in timely, appropriate, and efficiently

  9. The numerical multiconfiguration self-consistent field approach for atoms

    International Nuclear Information System (INIS)

    Stiehler, Johannes

    1995-12-01

    The dissertation uses the Multiconfiguration Self-Consistent Field Approach to specify the electronic wave function of N-electron atoms in a static electric field. It presents numerical approaches to describe the wave functions and introduces new methods to compute the numerical Fock equations. Based on results computed with the implemented computer program, the universal applicability, flexibility and high numerical precision of the presented approach are demonstrated. RHF results and, for the first time, MCSCF results for polarizabilities and hyperpolarizabilities of various states of the atoms He to Kr are discussed. In addition, an application to the interpretation of a plasma spectrum of gallium is presented. (orig.)

  10. Self-consistent potential variations in magnetic wells

    International Nuclear Information System (INIS)

    Kesner, J.; Knorr, G.; Nicholson, D.R.

    1981-01-01

    Self-consistent electrostatic potential variations are considered in a spatial region of weak magnetic field, as in the proposed tandem mirror thermal barriers (with no trapped ions). For some conditions, equivalent to ion distributions with a sufficiently high net drift speed along the magnetic field, the desired potential depressions are found. When the net drift speed is not high enough, potential depressions are found only in combination with strong electric fields on the boundaries of the system. These potential depressions are not directly related to the magnetic field depression. (author)

  11. Applicability of self-consistent mean-field theory

    International Nuclear Information System (INIS)

    Guo Lu; Sakata, Fumihiko; Zhao Enguang

    2005-01-01

    Within the constrained Hartree-Fock (CHF) theory, an analytic condition is derived to estimate whether a concept of the self-consistent mean field is realized in the level repulsive region. The derived condition states that an iterative calculation of the CHF equation does not converge when the quantum fluctuations coming from two-body residual interaction and quadrupole deformation become larger than a single-particle energy difference between two avoided crossing orbits. By means of numerical calculation, it is shown that the analytic condition works well for a realistic case

  12. Island of stability for consistent deformations of Einstein's gravity.

    Science.gov (United States)

    Berkhahn, Felix; Dietrich, Dennis D; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin

    2012-03-30

    We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incorporate a background curvature induced self-stabilizing mechanism. Self-stabilization is essential in order to guarantee hyperbolic evolution in and unitarity of the covariantized theory, as well as the deformation's uniqueness. We show that the deformation's parameter space contains islands of absolute stability that are persistent through the entire cosmic evolution.

  13. The self-consistent dynamic pole tide in global oceans

    Science.gov (United States)

    Dickman, S. R.

    1985-01-01

    The dynamic pole tide is characterized in a self-consistent manner by means of introducing a single nondifferential matrix equation compatible with the Liouville equation, modelling the ocean as global and of uniform depth. The deviations of the theory from the realistic ocean, associated with the nonglobality of the latter, are also given consideration, with an inference that in realistic oceans long-period modes of resonances would be increasingly likely to exist. The analysis of the nature of the pole tide and its effects on the Chandler wobble indicate that departures of the pole tide from the equilibrium may indeed be minimal.

  14. Simplified models for dark matter face their consistent completions

    Energy Technology Data Exchange (ETDEWEB)

    Gonçalves, Dorival; Machado, Pedro A. N.; No, Jose Miguel

    2017-03-01

    Simplified dark matter models have recently been advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent SU(2)_L × U(1)_Y gauge-invariant completions. We discuss the key physics the simplified models fail to capture, and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide competitive sensitivities to standard mono-jet analyses at the 13 TeV LHC.

  15. Two-particle self-consistent approach to unconventional superconductivity

    Energy Technology Data Exchange (ETDEWEB)

    Otsuki, Junya [Department of Physics, Tohoku University, Sendai (Japan); Theoretische Physik III, Zentrum fuer Elektronische Korrelationen und Magnetismus, Universitaet Augsburg (Germany)

    2013-07-01

    A non-perturbative approach to unconventional superconductivity is developed based on the idea of the two-particle self-consistent (TPSC) theory. An exact sum-rule which the momentum-dependent pairing susceptibility satisfies is derived. Effective pairing interactions between quasiparticles are determined so that an approximate susceptibility should fulfill this sum-rule, in which fluctuations belonging to different symmetries mix at finite momentum. The mixing leads to a suppression of the d{sub x{sup 2}-y{sup 2}} pairing close to the half-filling, resulting in a maximum of T{sub c} away from half-filling.

  16. Correlations and self-consistency in pion scattering. II

    International Nuclear Information System (INIS)

    Johnson, M.B.; Keister, B.D.

    1978-01-01

    In an attempt to overcome certain difficulties of summing higher order processes in pion multiple scattering theories, a new, systematic expansion for the interaction of a pion in nuclear matter is derived within the context of the Foldy-Walecka theory, incorporating nucleon-nucleon correlations and an idea of self-consistency. The first two orders in the expansion are evaluated as a function of the nonlocality range; the expansion appears to be rapidly converging, in contrast to expansion schemes previously examined. (Auth.)

  17. Quark mean field theory and consistency with nuclear matter

    International Nuclear Information System (INIS)

    Dey, J.; Dey, M.; Frederico, T.; Tomio, L.

    1990-09-01

    The 1/N_c expansion in QCD (with N_c the number of colours) suggests using a potential from the meson sector (e.g. Richardson) for baryons. For light quarks a σ field has to be introduced to ensure chiral symmetry breaking (χSB). It is found that nuclear matter properties can be used to pin down the χSB modelling. All masses, M_N, m_σ, m_ω, are found to scale with density. The equations are solved self-consistently. (author). 29 refs, 2 tabs

  18. A self-consistent model of an isothermal tokamak

    Science.gov (United States)

    McNamara, Steven; Lilley, Matthew

    2014-10-01

    Continued progress in liquid lithium coating technologies has made the development of a beam-driven tokamak with minimal edge recycling a feasible possibility. Such devices are characterised by improved confinement due to their inherent stability and the suppression of thermal conduction. Particle and energy confinement become intrinsically linked and the plasma thermal energy content is governed by the injected beam. A self-consistent model of a purely beam-fuelled isothermal tokamak is presented, including calculations of the density profile, bulk species temperature ratios and the fusion output. Stability considerations constrain the operating parameters, and regions of stable operation are identified and their suitability for potential reactor applications discussed.

  19. Self-consistent calculation of 208Pb spectrum

    International Nuclear Information System (INIS)

    Pal'chik, V.V.; Pyatov, N.I.; Fayans, S.A.

    1981-01-01

    The self-consistent model with exact accounting for the one-particle continuum is applied to calculate all discrete particle-hole natural-parity states of the 208Pb nucleus (up to the neutron emission threshold, 7.4 MeV). Contributions of the first collective levels to the energy-weighted sum rules S(EL), and the total contributions of all discrete levels, are evaluated. The collectivization is manifested most strongly for octupole states. With growing multipolarity L, the contributions of discrete levels are sharply reduced. The results are compared with other models and with the experimental data obtained in (e, e') and (p, p') reactions and other data.

  20. Poisson solvers for self-consistent multi-particle simulations

    International Nuclear Information System (INIS)

    Qiang, J; Paret, S

    2014-01-01

    Self-consistent multi-particle simulation plays an important role in studying beam-beam effects and space charge effects in high-intensity beams. The Poisson equation has to be solved at each time-step based on the particle density distribution in the multi-particle simulation. In this paper, we review a number of numerical methods that can be used to solve the Poisson equation efficiently. The computational complexity of those numerical methods will be O(N log(N)) or O(N) instead of O(N²), where N is the total number of grid points used to solve the Poisson equation.
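
    As an illustration of the O(N log(N)) class mentioned above, a spectral (FFT-based) solve of the two-dimensional Poisson equation with periodic boundaries; this is a generic sketch, not the solvers reviewed in the paper.

        import numpy as np

        def poisson_periodic_2d(rho, lx=1.0, ly=1.0):
            """Solve -laplacian(phi) = rho on a periodic box with FFTs (O(N log N) in the grid size)."""
            ny, nx = rho.shape
            kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=lx / nx)
            ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=ly / ny)
            k2 = kx[None, :] ** 2 + ky[:, None] ** 2
            rho_hat = np.fft.fft2(rho)
            phi_hat = np.zeros_like(rho_hat)
            nonzero = k2 > 0
            phi_hat[nonzero] = rho_hat[nonzero] / k2[nonzero]   # zero mode dropped (overall neutrality assumed)
            return np.real(np.fft.ifft2(phi_hat))

        # quick self-check: rho = sin(2*pi*x) gives phi = rho / (2*pi)^2
        n = 64
        x = np.linspace(0.0, 1.0, n, endpoint=False)
        rho = np.sin(2.0 * np.pi * x)[None, :].repeat(n, axis=0)
        phi = poisson_periodic_2d(rho)
        assert np.allclose(phi, rho / (2.0 * np.pi) ** 2)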

  1. Consistency of differential and integral thermonuclear neutronics data

    International Nuclear Information System (INIS)

    Reupke, W.A.

    1978-01-01

    To increase the accuracy of the neutronics analysis of nuclear reactors, physicists and engineers have employed a variety of techniques, including the adjustment of multigroup differential data to improve consistency with integral data. Of the various adjustment strategies, a generalized least-squares procedure which adjusts the combined differential and integral data can significantly improve the accuracy of neutronics calculations compared to calculations employing only differential data. This investigation analyzes 14 MeV neutron-driven integral experiments, using a more extensively developed methodology and a newly developed computer code, to extend the domain of adjustment from the energy range of fission reactors to the energy range of fusion reactors
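
    The generalized least-squares adjustment referred to above has a compact closed form; the sketch below uses hypothetical sensitivities, covariances and measured values, not the report's data or code.

        import numpy as np

        def gls_adjust(x, Cx, S, y, Cy):
            """Adjust differential data x (covariance Cx) to integral measurements y ~ S x (covariance Cy)."""
            R = S @ Cx @ S.T + Cy                  # covariance of the calculated-vs-measured residual
            K = Cx @ S.T @ np.linalg.inv(R)        # gain matrix
            x_adj = x + K @ (y - S @ x)            # adjusted differential data
            Cx_adj = Cx - K @ S @ Cx               # reduced covariance after adjustment
            return x_adj, Cx_adj

        x = np.array([1.00, 2.00])                 # prior multigroup (differential) values
        Cx = np.diag([0.05**2, 0.10**2])           # prior covariance
        S = np.array([[0.6, 0.4]])                 # sensitivity of one 14 MeV-driven integral result to x
        y = np.array([1.50])                       # measured integral value
        Cy = np.array([[0.02**2]])                 # integral measurement covariance

        x_adj, Cx_adj = gls_adjust(x, Cx, S, y, Cy)
        print(x_adj, np.sqrt(np.diag(Cx_adj)))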

  2. Consistent treatment of one-body dynamics and collective fluctuations

    International Nuclear Information System (INIS)

    Pfitzner, A.

    1986-09-01

    We show how the residual coupling δV between collective and intrinsic motion induces correlations, which lead to fluctuations of the collective variables and to a redistribution of single-particle occupation numbers ρ_α. The evolution of ρ_α and of the collective fluctuations is consistently described by a coupled system of equations, which accounts for the dependence of the transport coefficients on ρ_α, and for the dependence of the transition rates in the master equation on the collective variances. (author)

  3. Mean-field theory and self-consistent dynamo modeling

    International Nuclear Information System (INIS)

    Yoshizawa, Akira; Yokoi, Nobumitsu

    2001-12-01

    Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)

  4. The Consistency Of High Attorney Of Papua In Corruption Investigation

    Directory of Open Access Journals (Sweden)

    Samsul Tamher

    2015-08-01

    Full Text Available This study aimed to determine the consistency of the High Attorney of Papua in investigating corruption and in efforts to recover state financial losses. The study is of a normative-juridical and empirical-juridical type. The results showed that the corruption investigations of the High Attorney of Papua are not optimal because of political interference in cases involving local officials, so that the High Attorney's decisions in such cases are not in accordance with the rule of law. Efforts by the High Attorney of Papua to recover state financial losses are pursued through the State Auction Body under civil and criminal law.

  5. Wavelets in self-consistent electronic structure calculations

    International Nuclear Information System (INIS)

    Wei, S.; Chou, M.Y.

    1996-01-01

    We report the first implementation of orthonormal wavelet bases in self-consistent electronic structure calculations within the local-density approximation. These local bases of different scales efficiently describe localized orbitals of interest. As an example, we studied two molecules, H 2 and O 2 , using pseudopotentials and supercells. Considerably fewer bases are needed compared with conventional plane-wave approaches, yet calculated binding properties are similar. Our implementation employs fast wavelet and Fourier transforms, avoiding evaluating any three-dimensional integral numerically. copyright 1996 The American Physical Society

  6. Self-consistent electronic-structure calculations for interface geometries

    International Nuclear Information System (INIS)

    Sowa, E.C.; Gonis, A.; MacLaren, J.M.; Zhang, X.G.

    1992-01-01

    This paper describes a technique for computing self-consistent electronic structures and total energies of planar defects, such as interfaces, which are embedded in an otherwise perfect crystal. As in the Layer Korringa-Kohn-Rostoker approach, the solid is treated as a set of coupled layers of atoms, using Bloch's theorem to take advantage of the two-dimensional periodicity of the individual layers. The layers are coupled using the techniques of the Real-Space Multiple-Scattering Theory, avoiding artificial slab or supercell boundary conditions. A total-energy calculation on a Cu crystal, which has been split apart at a (111) plane, is used to illustrate the method

  7. Sensor and control for consistent seed drill coulter depth

    DEFF Research Database (Denmark)

    Kirkegaard Nielsen, Søren; Nørremark, Michael; Green, Ole

    2016-01-01

    The consistent depth placement of seeds is vital for achieving the optimum yield of agricultural crops. In state-of-the-art seeding machines, the depth of drill coulters will vary with changes in soil resistance. This paper presents the retrofitting of an angle sensor to the pivoting point...... by a sub-millimetre accurate positioning system (iGPS, Nikon Metrology NV, Belgium) mounted on the drill coulter. At a drill coulter depth of 55 mm and controlled by an ordinary fixed spring loaded down force only, the change in soil resistance decreased the mean depth by 23 mm. By dynamically controlling...

  8. SIMPLE AND STRONGLY CONSISTENT ESTIMATOR OF STABLE DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Cira E. Guevara Otiniano

    2016-06-01

    Full Text Available Stable distributions are extensively used to analyze the returns of financial assets, such as exchange rates and stock prices. In this paper we propose a simple and strongly consistent estimator for the scale parameter of a symmetric stable Lévy distribution. The advantage of this estimator is that its computational time is minimal, so it can be used to initialize computationally intensive procedures such as maximum likelihood. Using random samples of size n, we tested the efficacy of the estimator by the Monte Carlo method. We also include applications to three data sets.

  9. Tunneling in a self-consistent dynamic image potential

    International Nuclear Information System (INIS)

    Rudberg, B.G.R.; Jonson, M.

    1991-01-01

    We have calculated the self-consistent effective potential for an electron tunneling through a square barrier while interacting with surface plasmons. This potential reduces to the classical image potential in the static limit. In the opposite limit, when the ''velocity'' of the tunneling electron is large, it reduces to the unperturbed square-barrier potential. For a wide variety of parameters the dynamic effects on the transmission coefficient T = |t|² can, for instance, be related to the Buettiker-Landauer traversal time for tunneling, given by τ_BL = ℏ|d ln t/dV|.

  10. On the hydrodynamic limit of self-consistent field equations

    International Nuclear Information System (INIS)

    Pauli, H.C.

    1980-01-01

    As an approximation to the nuclear many-body problem, the hydrodynamical limit of self-consistent field equations is worked out and applied to the treatment of vibrational and rotational motion. Its validity is tied to the value of a smallness parameter, behaving as 20 A^(-2/3) with the number of nucleons A. For finite nuclei this number is not small compared to 1, and indeed one observes a discrepancy of roughly a factor of 5 between the hydrodynamic frequencies and the relevant experimental numbers. (orig.)

  11. Multiconfigurational self-consistent reaction field theory for nonequilibrium solvation

    DEFF Research Database (Denmark)

    Mikkelsen, Kurt V.; Cesar, Amary; Ågren, Hans

    1995-01-01

    electronic structure whereas the inertial polarization vector is not necessarily in equilibrium with the actual electronic structure. The electronic structure of the compound is described by a correlated electronic wave function - a multiconfigurational self-consistent field (MCSCF) wave function. This wave......, open-shell, excited, and transition states. We demonstrate the theory by computing solvatochromatic shifts in optical/UV spectra of some small molecules and electron ionization and electron detachment energies of the benzene molecule. It is shown that the dependency of the solvent induced affinity...

  12. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Kokholm, Thomas

    to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...... on the underlying asset. The model has the convenient feature of decoupling the vanilla skews from spot/volatility correlations and allowing for different conditional correlations in large and small spot/volatility moves. We show that our model can simultaneously fit prices of European options on S&P 500 across...

  13. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Cont, Rama; Kokholm, Thomas

    2013-01-01

    to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...... on the underlying asset. The model has the convenient feature of decoupling the vanilla skews from spot/volatility correlations and allowing for different conditional correlations in large and small spot/volatility moves. We show that our model can simultaneously fit prices of European options on S&P 500 across...

  14. Consistent vapour-liquid equilibrium data containing lipids

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Consistent physical and thermodynamic properties of pure components and their mixtures are important for process design, simulation, and optimization as well as design of chemical based products. In the case of lipids, it was observed a lack of experimental data for pure compounds and also...... for their mixtures in open literature, what makes necessary the development of reliable predictive models based on limited data. To contribute to the missing data, measurements of isobaric vapour-liquid equilibrium (VLE) data of three binary mixtures at two different pressures were performed at State University...

  15. 4onse D1.3 - Project Identity Manual

    OpenAIRE

    Cannata Massimiliano; Strigaro Daniele

    2016-01-01

    This document describes the corporate identity that has been developed for the 4onse project. The corporate identity consists of a logo for the overall project and templates for written and presentation materials and printed communication materials.

  16. The Danish Agenda for Rethinking Project Management

    DEFF Research Database (Denmark)

    Svejvig, Per; Grex, Sara

    2016-01-01

    Purpose – The purpose of this paper is to analyze the similarities and differences between the Danish Rethinking Project Management initiative named Project Half Double (PHD) and the rethinking project management (RPM) research stream. The paper furthermore discusses how PHD and RPM can inspire each other in research and practice. Design/methodology/approach – This is an empirical paper based on collaborative research between industry and researchers. PHD has developed principles and practices driven by industry, consisting of 10 leading stars and the impact, leadership and flow (ILF) method ... a foundation for further development of both rethinking project management and Project Half Double.

  17. Project LASER

    Science.gov (United States)

    1990-01-01

    NASA formally launched Project LASER (Learning About Science, Engineering and Research) in March 1990, a program designed to help teachers improve science and mathematics education and to provide 'hands on' experiences. It featured the first LASER Mobile Teacher Resource Center (MTRC), which is designed to reach educators all over the nation. NASA hopes to operate several MTRCs with funds provided by private industry. The mobile unit is a 22-ton tractor-trailer stocked with NASA educational publications and outfitted with six work stations. Each work station, which can accommodate two teachers at a time, has a computer providing access to NASA Spacelink. Each also has video recorders and photocopy/photographic equipment for the teachers' use. The MTRC is only one of five major elements within LASER. The others are: a Space Technology Course, to promote integration of space science studies with traditional courses; the Volunteer Databank, in which NASA employees are encouraged to volunteer as tutors, instructors, etc.; Mobile Discovery Laboratories that will carry simple laboratory equipment and computers to provide hands-on activities for students and demonstrations of classroom activities for teachers; and the Public Library Science Program, which will present library-based science and math programs.

  18. Self-consistent viscous heating of rapidly compressed turbulence

    Science.gov (United States)

    Campos, Alejandro; Morgan, Brandon

    2017-11-01

    Given turbulence subjected to infinitely rapid deformations, linear terms representing interactions between the mean flow and the turbulence dictate the evolution of the flow, whereas non-linear terms corresponding to turbulence-turbulence interactions are safely ignored. For rapidly deformed flows where the turbulence Reynolds number is not sufficiently large, viscous effects cannot be neglected and tend to play a prominent role, as shown in the study of Davidovits & Fisch (2016). In such a case, the rapid increase of viscosity in a plasma, compared to the weaker scaling of viscosity in an ordinary fluid, leads to the sudden viscous dissipation of turbulent kinetic energy. As shown by Davidovits & Fisch, increases in temperature caused by the direct compression of the plasma drive sufficiently large values of viscosity. We report on numerical simulations of turbulence where the increase in temperature is the result of both the direct compression (an inviscid mechanism) and the self-consistent viscous transfer of energy from the turbulent scales towards the thermal energy. A comparison of implicit large-eddy simulations against well-resolved direct numerical simulations is included to assess the effect of the numerical and subgrid-scale dissipation on the self-consistent viscous heating. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
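
    A zero-dimensional cartoon of the sudden-dissipation mechanism described above is sketched below: compression heats the gas, a plasma-like viscosity grows as T^(5/2), and the turbulent kinetic energy is abruptly drained into heat once the viscosity becomes large. The equations and coefficients are illustrative assumptions, not the simulations of the record.

```python
import math

# Toy model of "sudden viscous dissipation" of turbulent kinetic energy (TKE)
# under rapid compression. Every equation and coefficient below is an
# illustrative assumption, not the simulations of the record.

t_final, n = 1.0, 20000
dt = t_final / n
frac = 0.95                # fraction of the box compressed away by t_final
nu0, lam = 1.0e-4, 0.05    # reference viscosity and fixed eddy length scale
k = 1.0                    # normalized TKE
q_visc = 0.0               # heat accumulated from viscous dissipation of TKE

for i in range(1, n + 1):
    t = i * dt
    L = 1.0 - frac * t / t_final        # box size shrinks linearly in time
    dlnL_dt = -(frac / t_final) / L
    T = (1.0 / L) ** 2 + q_visc         # compressive heating + viscous heating
    nu = nu0 * T ** 2.5                 # plasma-like viscosity, nu ~ T^(5/2)
    production = -2.0 * dlnL_dt         # rapid-distortion-like TKE production
    dissipation = nu / lam ** 2         # crude viscous drain rate
    q_visc += dissipation * k * dt      # dissipated TKE becomes heat
    k *= math.exp((production - dissipation) * dt)   # stable exponential update
    if i % (n // 10) == 0:
        print(f"t={t:.2f}  L={L:.3f}  T={T:8.1f}  nu={nu:10.3e}  k={k:9.3e}")
```

    With these made-up coefficients the TKE first grows under compression and then collapses abruptly once the temperature-dependent viscosity takes over, which is the qualitative behaviour the record describes.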

  19. Parton Distributions based on a Maximally Consistent Dataset

    Science.gov (United States)

    Rojo, Juan

    2016-04-01

    The choice of data that enters a global QCD analysis can have a substantial impact on the resulting parton distributions and their predictions for collider observables. One of the main reasons for this has to do with the possible presence of inconsistencies, either internal within an experiment or external between different experiments. In order to assess the robustness of the global fit, different definitions of a conservative PDF set, that is, a PDF set based on a maximally consistent dataset, have been introduced. However, these approaches are typically affected by theory biases in the selection of the dataset. In this contribution, after a brief overview of recent NNPDF developments, we propose a new, fully objective definition of a conservative PDF set, based on the Bayesian reweighting approach. Using the new NNPDF3.0 framework, we produce various conservative sets, which turn out to be mutually in agreement within the respective PDF uncertainties, as well as with the global fit. We explore some of their implications for LHC phenomenology, also finding good consistency with the global fit result. These results provide a non-trivial validation test of the new NNPDF3.0 fitting methodology, and indicate that possible inconsistencies in the fitted dataset do not substantially affect the global-fit PDFs.
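
    The Bayesian reweighting step mentioned above can be illustrated with a toy calculation. The weight and effective-replica formulas below follow the published NNPDF reweighting prescription as commonly stated and should be checked against the original papers; the chi-squared values and per-replica predictions are random stand-ins, not real data.

```python
import numpy as np

# Sketch of Bayesian (Giele-Keller-style) reweighting of Monte Carlo PDF
# replicas. Formulas quoted from the standard reweighting prescription;
# all inputs below are fake stand-ins for illustration.

rng = np.random.default_rng(1)
n_rep, n_data = 100, 30
chi2 = rng.chisquare(n_data, size=n_rep)   # fake chi^2 of each replica vs new data

# w_k ~ chi2_k^((n-1)/2) * exp(-chi2_k / 2), normalised so sum(w) = n_rep
logw = 0.5 * (n_data - 1) * np.log(chi2) - 0.5 * chi2
logw -= logw.max()                         # avoid overflow before exponentiating
w = np.exp(logw)
w *= n_rep / w.sum()

# Effective number of replicas surviving the reweighting (entropy-based)
n_eff = np.exp(np.sum(w * np.log(n_rep / w)) / n_rep)
print(f"N_eff = {n_eff:.1f} out of {n_rep} replicas")

# A reweighted prediction is a weighted average over replicas, e.g.:
observable = rng.normal(1.0, 0.05, size=n_rep)   # fake per-replica predictions
print(f"reweighted mean = {np.mean(w * observable):.4f}")
```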

  20. Self-consistent modeling of electron cyclotron resonance ion sources

    International Nuclear Information System (INIS)

    Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lecot, C.

    2004-01-01

    In order to predict the performance of an electron cyclotron resonance ion source (ECRIS), it is necessary to model the different parts of these sources accurately: (i) the magnetic configuration; (ii) the plasma characteristics; (iii) the extraction system. The magnetic configuration is easily calculated with commercial codes; other codes simulate the ion extraction, either in two dimensions or in three dimensions (to take into account the shape of the plasma at extraction, which is influenced by the hexapole). However, the characteristics of the plasma are not always well understood. This article describes the self-consistent modeling of an ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether or not a biased probe is installed. These input parameters feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.
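
    To show the structure of such a self-consistent calculation (not the physics of the actual ECRIS code), the sketch below iterates a zero-dimensional plasma model with made-up particle and energy balances until the density and temperature stop changing.

```python
# Schematic zero-dimensional self-consistent iteration, illustrating how a
# code can adjust plasma parameters until its balance relations close.
# The closures and constants below are made up for illustration; they are
# not the ECRIS model of the record.

e_charge = 1.602e-19     # J per eV
heating_power = 100.0    # W, assumed absorbed RF power
volume = 1.0e-3          # m^3, assumed plasma volume
source_rate = 2.0e20     # m^-3 s^-1, assumed volumetric ionisation source
tau0 = 1.0e-3            # s, assumed reference confinement time

n_e, T_e = 1.0e17, 5.0   # initial guesses: density (m^-3), temperature (eV)
for it in range(500):
    tau = tau0 * (T_e / 5.0) ** 0.5                  # made-up confinement closure
    # energy balance:  P * tau = 1.5 * n_e * V * e * T_e
    T_e_new = heating_power * tau / (1.5 * e_charge * n_e * volume)
    # particle balance: n_e = source rate * confinement time
    n_e_new = source_rate * tau
    dT, dn = T_e_new - T_e, n_e_new - n_e
    T_e += 0.3 * dT                                  # under-relaxed fixed-point update
    n_e += 0.3 * dn
    if abs(dT) / T_e < 1e-6 and abs(dn) / n_e < 1e-6:
        break

print(f"converged after {it + 1} iterations:"
      f" n_e = {n_e:.2e} m^-3, T_e = {T_e:.0f} eV")
```

    The under-relaxation factor keeps the coupled density/temperature updates from overshooting; the actual code described in the record solves a much richer set of equations, including the charge state distribution and plasma potential.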