WorldWideScience

Sample records for providing quantitative input

  1. Quantitative security analysis for programs with low input and noisy output

    NARCIS (Netherlands)

    Ngo, Minh Tri; Huisman, Marieke

    Classical quantitative information flow analysis often considers a system as an information-theoretic channel, where private data are the only inputs and public data are the outputs. However, for systems where an attacker is able to influence the initial values of public data, these should also be

  2. Phase-based vascular input function: Improved quantitative DCE-MRI of atherosclerotic plaques

    NARCIS (Netherlands)

    van Hoof, R. H. M.; Hermeling, E.; Truijman, M. T. B.; van Oostenbrugge, R. J.; Daemen, J. W. H.; van der Geest, R. J.; van Orshoven, N. P.; Schreuder, A. H.; Backes, W. H.; Daemen, M. J. A. P.; Wildberger, J. E.; Kooi, M. E.

    2015-01-01

    Purpose: Quantitative pharmacokinetic modeling of dynamic contrast-enhanced (DCE)-MRI can be used to assess atherosclerotic plaque microvasculature, which is an important marker of plaque vulnerability. The purpose of the present study was (1) to compare magnitude- versus phase-based vascular input

  3. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of the information processed during a certain control task under input information overload. We primarily develop the information processing model having multiple stages, which contains information flow. Then the uncertainty of the information is quantified using the Conant's model, a kind of information theory. We also investigate the applicability of this approach to quantifying the information reduction of operators under the input information overload
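    The information-theoretic core of the approach above is the entropy of the operator's input stream; a minimal sketch (the indicator stream and its state labels are invented for illustration, and this is not the authors' full Conant decomposition):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy, in bits per symbol, of an observed symbol stream."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical stream of indicator states an operator must process.
stream = ["normal"] * 6 + ["warning"] + ["alarm"]
h = shannon_entropy(stream)   # ~1.06 bits per observation
total_bits = h * len(stream)  # total information over the task period
```

    Under input information overload, the amount actually processed is capped by the operator's channel capacity rather than by `total_bits`, which is where the reduction the authors quantify comes in.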

  4. PROVIDING ENGLISH LANGUAGE INPUT: DECREASING STUDENTS’ ANXIETY IN READING COMPREHENSION PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Elva Yohana

    2016-11-01

    The primary condition for success in second or foreign language learning is providing an adequate environment, which serves as a medium for increasing students' language exposure so that they can acquire second or foreign language proficiency. This study was designed to propose adequate English language input that can decrease students' anxiety in reading comprehension performance. Of the four skills, reading can be regarded as especially important because it is assumed to be the central means for learning new information. Some students, however, still encounter many problems in reading because of the anxiety they feel while reading. Providing and creating interesting, contextual reading materials and qualified teachers can address this problem, which occurs mostly in Indonesian classrooms. The study revealed that younger learners of English there do not receive an adequate amount of target language input in their learning of English. Hence, it suggested the adoption of extensive reading programs as the most effective means of creating an input-rich environment in EFL learning contexts. It also suggested that book writers and publishers provide a wide range of books that are appropriate and readable for their students.

  5. Image-derived and arterial blood sampled input functions for quantitative PET imaging of the angiotensin II subtype 1 receptor in the kidney

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Tao; Tsui, Benjamin M. W.; Li, Xin; Vranesic, Melin; Lodge, Martin A.; Gulaldi, Nedim C. M.; Szabo, Zsolt, E-mail: zszabo@jhmi.edu [Russell H. Morgan Department of Radiology and Radiological Science, The Johns Hopkins School of Medicine, Baltimore, Maryland 21287 (United States)

    2015-11-15

    Purpose: The radioligand ¹¹C-KR31173 has been introduced for positron emission tomography (PET) imaging of the angiotensin II subtype 1 receptor in the kidney in vivo. To study the biokinetics of ¹¹C-KR31173 with a compartmental model, the input function is needed. Collection and analysis of arterial blood samples are the established approach to obtaining the input function, but they are not feasible in patients with renal disease. The goal of this study was to develop a quantitative technique that can provide an accurate image-derived input function (ID-IF) to replace conventional invasive arterial sampling, and to test the method in pigs with the goal of translation into human studies. Methods: The experimental animals were injected with [¹¹C]KR31173 and scanned for up to 90 min with dynamic PET. Arterial blood samples were collected for the artery-derived input function (AD-IF), which was used as a gold standard for the ID-IF. Before PET, magnetic resonance angiography of the kidneys was obtained to provide the anatomical information required for derivation of the recovery coefficients in the abdominal aorta, a requirement for partial volume correction of the ID-IF. Different image reconstruction methods, filtered back projection (FBP) and ordered subset expectation maximization (OS-EM), were investigated for the best trade-off between bias and variance of the ID-IF. The effect of kidney uptake on the quantitative accuracy of the ID-IF was also studied, and biological variables such as red blood cell binding and radioligand metabolism were taken into consideration. A single blood sample was used for calibration in the later phase of the input function. Results: In the first 2 min after injection, the OS-EM based ID-IF was found to be biased, and the bias was found to be induced by kidney uptake. No such bias was found with the FBP based image reconstruction method. However, the OS-EM based image reconstruction was found to reduce variance in the subsequent
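    The two corrections described above (partial volume correction via an MRA-derived recovery coefficient, then calibration of the late phase to a single blood sample) can be sketched as follows; the function and all numbers are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def corrected_idif(t, idif_measured, recovery_coeff, sample_time, sample_value):
    """Image-derived input function after (1) partial volume correction with a
    recovery coefficient derived from MR angiography of the aorta and
    (2) rescaling so the curve passes through one late blood sample."""
    t = np.asarray(t, dtype=float)
    idif = np.asarray(idif_measured, dtype=float) / recovery_coeff
    value_at_sample = np.interp(sample_time, t, idif)  # curve value at draw time
    return idif * (sample_value / value_at_sample)

# Illustrative numbers only: a coarse 4-frame curve, RC = 0.5, and one
# blood sample drawn at t = 3 min reading 16 kBq/mL.
curve = corrected_idif([0, 1, 2, 3], [0, 2, 4, 2], 0.5, 3.0, 16.0)
```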

  6. How citizen advisory boards provide input into major waste policy decisions

    International Nuclear Information System (INIS)

    Rogers, E.; Murakami, L.; Hanson, L.

    1995-01-01

    Volunteer citizen boards, such as Site Specific Advisory Boards, can be a very important key to success for the Department of Energy's (DOE's) Waste Management program. These boards can provide informed, independent recommendations reflecting the diversity of the community and its values. A successful volunteer process requires collaboration among regulators, DOE and other boards; knowing how and when to interface with the broader public; understanding the diversity and representational issues of a citizens group; knowing the "ins and outs" of working with volunteers; education and training; and, most importantly, planning. Citizen boards were created to tackle the big-picture policy decisions. The chair of the Rocky Flats Citizens Advisory Board will describe her Board's successes, including the challenges in reaching consensus agreements, as well as the need for integration with other boards and the sites' on-going public involvement programs to provide the input the department is seeking. Finally, one of the greatest challenges for the boards is interfacing with the greater public-at-large; the presentation will show how the CAB has overcome this challenge and integrated broader public input into its decisions

  7. How citizen advisory boards provide input into major waste policy decisions

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, E.; Murakami, L.; Hanson, L. [Rocky Flats Citizen Advisory Board, Westminster, CO (United States)

    1995-12-31

    Volunteer citizen boards, such as Site Specific Advisory Boards, can be a very important key to success for the Department of Energy's (DOE's) Waste Management program. These boards can provide informed, independent recommendations reflecting the diversity of the community and its values. A successful volunteer process requires collaboration among regulators, DOE and other boards; knowing how and when to interface with the broader public; understanding the diversity and representational issues of a citizens group; knowing the "ins and outs" of working with volunteers; education and training; and, most importantly, planning. Citizen boards were created to tackle the big-picture policy decisions. The chair of the Rocky Flats Citizens Advisory Board will describe her Board's successes, including the challenges in reaching consensus agreements, as well as the need for integration with other boards and the sites' on-going public involvement programs to provide the input the department is seeking. Finally, one of the greatest challenges for the boards is interfacing with the greater public-at-large; the presentation will show how the CAB has overcome this challenge and integrated broader public input into its decisions.

  8. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.|info:eu-repo/dai/nl/322847796

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior, which allows for subjective input and thereby provides an alternative way to deal with the difficulties associated with the joint power prior.
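    For a flavor of the conditional power prior the thesis builds on: with a binomial likelihood and a Beta initial prior, raising the historical likelihood to a fixed power a0 in [0, 1] simply discounts the historical counts. A sketch of the conjugate case only (the function name and numbers are mine):

```python
def power_prior_posterior(a, b, x0, n0, x, n, a0):
    """Posterior Beta(a', b') parameters for a binomial success probability
    when historical data (x0 successes out of n0) enter through a conditional
    power prior with discounting factor a0, followed by a standard update
    with the current data (x successes out of n)."""
    return a + a0 * x0 + x, b + a0 * (n0 - x0) + (n - x)

# a0 = 0 ignores the historical trial entirely; a0 = 1 pools it fully.
post = power_prior_posterior(a=1, b=1, x0=30, n0=100, x=20, n=50, a0=0.5)
```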

  9. Quantitative Hydraulic Models Of Early Land Plants Provide Insight Into Middle Paleozoic Terrestrial Paleoenvironmental Conditions

    Science.gov (United States)

    Wilson, J. P.; Fischer, W. W.

    2010-12-01

    Fossil plants provide useful proxies of Earth’s climate because plants are closely connected, through physiology and morphology, to the environments in which they lived. Recent advances in quantitative hydraulic models of plant water transport provide new insight into the history of climate by allowing fossils to speak directly to environmental conditions based on preserved internal anatomy. We report results of a quantitative hydraulic model applied to one of the earliest terrestrial plants preserved in three dimensions, the ~396 million-year-old vascular plant Asteroxylon mackei. This model combines equations describing the rate of fluid flow through plant tissues with detailed observations of plant anatomy; this allows quantitative estimates of two critical aspects of plant function. First and foremost, results from these models quantify the supply of water to evaporative surfaces; second, results describe the ability of plant vascular systems to resist tensile damage from extreme environmental events, such as drought or frost. This approach permits quantitative comparisons of functional aspects of Asteroxylon with other extinct and extant plants, informs the quality of plant-based environmental proxies, and provides concrete data that can be input into climate models. Results indicate that despite their small size, water transport cells in Asteroxylon could supply a large volume of water to the plant's leaves--even greater than cells from some later-evolved seed plants. The smallest Asteroxylon tracheids have conductivities exceeding 0.015 m^2 / MPa * s, whereas Paleozoic conifer tracheids do not reach this threshold until they are three times wider. However, this increase in conductivity came at the cost of little to no adaptations for transport safety, placing the plant’s vegetative organs in jeopardy during drought events. Analysis of the thickness-to-span ratio of Asteroxylon’s tracheids suggests that environmental conditions of reduced relative
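    The conductivity figures quoted above come from Hagen-Poiseuille-type flow models. As a rough sketch of where such numbers come from, here is the area-specific conductivity of an idealized circular conduit (an upper bound, not the authors' anatomically detailed model, which also accounts for wall structure):

```python
def tracheid_conductivity(diameter_m, viscosity_pa_s=1.002e-3):
    """Area-specific hydraulic conductivity k = d^2 / (32 * eta) of an ideal
    cylindrical conduit (Hagen-Poiseuille), returned in m^2 / (MPa * s).
    Real tracheids fall below this ideal because of end walls and pit
    resistance, so treat the value as an upper bound."""
    k = diameter_m ** 2 / (32.0 * viscosity_pa_s)  # m^2 / (Pa * s)
    return k * 1.0e6                               # per MPa instead of per Pa

# A hypothetical 20-micron conduit already sits near the 0.015 m^2/(MPa*s)
# threshold quoted in the abstract.
k20 = tracheid_conductivity(20e-6)
```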

  10. 76 FR 11980 - Stakeholder Input: Listening Session to Provide Information and Solicit Suggestions for...

    Science.gov (United States)

    2011-03-04

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Chapter I [Docket EPA-HQ-OW-2011-0119; FRL-9275-4] Stakeholder Input: Listening Session to Provide Information and Solicit Suggestions for Regulations... stakeholders. DATES: The listening sessions will be held at 210 Holiday Court, Annapolis, Maryland 21401, on...

  11. Separation of input function for rapid measurement of quantitative CMRO2 and CBF in a single PET scan with a dual tracer administration method

    International Nuclear Information System (INIS)

    Kudomi, Nobuyuki; Watabe, Hiroshi; Hayashi, Takuya; Iida, Hidehiro

    2007-01-01

    Cerebral metabolic rate of oxygen (CMRO₂), oxygen extraction fraction (OEF) and cerebral blood flow (CBF) images can be quantified using positron emission tomography (PET) by administering ¹⁵O-labelled water (H₂¹⁵O) and oxygen (¹⁵O₂). Conventionally, these images are measured with separate scans for three tracers, C¹⁵O for CBV, H₂¹⁵O for CBF and ¹⁵O₂ for CMRO₂, with additional waiting times between the scans to minimize the influence of residual radioactivity from the previous tracer, which results in a relatively long study period. We have proposed a dual tracer autoradiographic (DARG) approach (Kudomi et al 2005), which enables CBF, OEF and CMRO₂ to be measured rapidly by sequentially administering H₂¹⁵O and ¹⁵O₂ within a short time. Because quantitative CBF and CMRO₂ values are sensitive to the arterial input function, an accurate input function must be obtained, and a drawback of this approach is that the measured arterial blood time-activity curve (TAC) must be separated into pure water and oxygen input functions in the presence of residual radioactivity from the first injected tracer. This separation previously required frequent manual sampling. The present paper describes two calculation methods, a linear method and a model-based method, to separate the measured arterial TAC into its water and oxygen components. To validate these methods, we first generated a blood TAC for the DARG approach by combining the water and oxygen input functions obtained in a series of PET studies on normal human subjects. The combined data were then separated into water and oxygen components by the present methods, and CBF and CMRO₂ were calculated using the separated input functions and tissue TACs. The quantitative accuracy of the CBF and CMRO₂ values obtained by the DARG approach remained within the acceptable range, i.e., errors in those values were within 5%, when the area under the curve in the input function of the second tracer
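    A linear separation in the spirit described above can be sketched as follows: fit the first tracer's washout (assumed mono-exponential over a hand-picked window) just before the second administration, extrapolate it past that point, and subtract. This is an illustration, not the paper's exact linear or model-based method:

```python
import numpy as np

def separate_dual_tracer(t, tac, t_second):
    """Split a measured whole-blood TAC into first-tracer and second-tracer
    components by extrapolating the first tracer's mono-exponential washout
    past the second administration time t_second and subtracting it.
    The fit window (60-100% of t_second) is an assumption."""
    t, tac = np.asarray(t, dtype=float), np.asarray(tac, dtype=float)
    window = (t > 0.6 * t_second) & (t < t_second)
    slope, intercept = np.polyfit(t[window], np.log(tac[window]), 1)
    first = np.where(t < t_second, tac, np.exp(intercept + slope * t))
    second = np.clip(tac - first, 0.0, None)
    return first, second
```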

  12. Semiautomatic determination of arterial input functions for quantitative dynamic contrast-enhanced magnetic resonance imaging in non-small cell lung cancer patients.

    Science.gov (United States)

    Chung, Julius; Kim, Jae-Hun; Lee, Eun Ju; Kim, Yoo Na; Yi, Chin A

    2015-03-01

    The aim of this study was to validate a semiautomatic detection method for the arterial input functions (AIFs) using the Kendall coefficient of concordance (KCC) for quantitative analysis of dynamic contrast-enhanced magnetic resonance imaging in non-small cell lung cancer patients. We prospectively enrolled 28 patients (17 men, 11 women; mean age, 62 years) who had biopsy-proven non-small cell lung cancer. All enrolled patients underwent dynamic contrast-enhanced magnetic resonance imaging of the entire thorax. For the quantitative measurement of the pharmacokinetic parameters Ktrans and ve of the lung cancers, AIFs were determined in 2 different ways: a manual method, in which 3 independent thoracic radiologists selected a region of interest (ROI) within the aortic arch in the 2D coronal plane, and a semiautomatic method that used in-house software to establish a KCC score, which provided a measure of similarity to a typical AIF pattern. Three independent readers selected voxel clusters with high KCC scores calculated 3-dimensionally across planes in the data set. Ktrans and ve were correlated using intraclass correlation coefficients (ICCs), and Bland-Altman plots were used to examine agreement across methods and reproducibility within a method. Arterial input functions were determined using the data from ROI volumes that were significantly larger in the semiautomatic method (mean ± SD, 3360 ± 768 mm³) than in the manual method (677 ± 380 mm³) (P < 0.001). Ktrans showed very strong agreement (ICC, 0.927) and ve showed moderately strong agreement (ICC, 0.718) between the semiautomatic and manual methods. The reproducibility of Ktrans (ICC manual, 0.813 and ICC semiautomatic, 0.998; P < 0.001) and ve (ICC manual, 0.455 and ICC semiautomatic, 0.985; P < 0.001) was significantly better with the semiautomatic method than the manual method. We found semiautomated detection using the KCC to be a robust method for determining the AIF. This method allows for larger ROIs specified in 3D across planes
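    Kendall's coefficient of concordance over the dynamic frames can be computed as below. Treating each candidate voxel curve and a template AIF as "raters" ranking the time frames is one plausible reading of the scoring, not necessarily the in-house software's exact formulation (ties are not handled here):

```python
import numpy as np

def kendalls_w(curves):
    """Kendall's coefficient of concordance W for m curves sampled at n time
    frames. Each curve 'ranks' the frames by intensity; W = 1 when every
    curve orders the frames identically (same temporal shape), W = 0 when
    the rank sums are perfectly balanced."""
    curves = np.asarray(curves, dtype=float)
    m, n = curves.shape
    ranks = curves.argsort(axis=1).argsort(axis=1) + 1  # per-curve frame ranks
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

# Score a candidate voxel against a template AIF shape (toy curves):
template = [0, 8, 5, 3, 2]  # fast peak, then washout
voxel = [0, 6, 7, 3, 1]     # similar but imperfect shape -> W = 0.95
score = kendalls_w([template, voxel])
```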

  13. Use of microinterrupts to provide an instrument oriented input/output structure

    International Nuclear Information System (INIS)

    Zaky, S.G.

    1981-01-01

    This paper describes the design of a bit-slice based computer, which has been developed for use in data acquisition and control applications. The main design goals have been to provide fast response to external events and sufficient processing capability to perform data reduction in real time. The initial application of this computer has been in airborne geophysical surveying, where such instruments as gamma-ray spectrometers, magnetometers and navigation equipment are involved. In order to meet the response requirement mentioned above, a microinterrupt facility has been incorporated. Microinterrupts are serviced in microcode routines, which can be initiated within a maximum of two microinstruction cycle times of an external event. This facility makes it possible to implement powerful input/output control functions without the need for complex and specialized hardware interfaces for each instrument. (orig.)

  14. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reaction (qPCR) has become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA, cDNA), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms, the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes, where cell counts are readily available.
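    The core normalization the abstract describes (efficiency correction, a universal reference sample, and division by input quantity) reduces to a one-line formula; the function below is an illustrative reading of the approach, not the published algorithm:

```python
def expression_per_input(ct_sample, ct_reference, efficiency, input_quantity):
    """Efficiency-corrected expression level, normalized to a universal
    reference cDNA (run with every batch/instrument) and divided by the
    input sample quantity, e.g. the cell count. Illustrative sketch only."""
    relative_quantity = efficiency ** (ct_reference - ct_sample)
    return relative_quantity / input_quantity

# A sample crossing threshold 2 cycles before the reference, with perfect
# doubling per cycle (E = 2), measured from 1000 input cells:
level = expression_per_input(ct_sample=20.0, ct_reference=22.0,
                             efficiency=2.0, input_quantity=1000)
```

    Because both sample and reference are measured on the same run, batch- and instrument-level shifts in Ct cancel in the exponent.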

  15. Comparison of Linear Microinstability Calculations of Varying Input Realism

    International Nuclear Information System (INIS)

    Rewoldt, G.

    2003-01-01

    The effect of varying 'input realism' or varying completeness of the input data for linear microinstability calculations, in particular on the critical value of the ion temperature gradient for the ion temperature gradient mode, is investigated using gyrokinetic and gyrofluid approaches. The calculations show that varying input realism can have a substantial quantitative effect on the results

  16. Comparison of linear microinstability calculations of varying input realism

    International Nuclear Information System (INIS)

    Rewoldt, G.; Kinsey, J.E.

    2004-01-01

    The effect of varying 'input realism' or varying completeness of the input data for linear microinstability calculations, in particular on the critical value of the ion temperature gradient for the ion temperature gradient mode, is investigated using gyrokinetic and gyrofluid approaches. The calculations show that varying input realism can have a substantial quantitative effect on the results

  17. Preliminary evaluation of MRI-derived input function for quantitative measurement of glucose metabolism in an integrated PET-MRI

    International Nuclear Information System (INIS)

    Anazodo, Udunna; Kewin, Matthew; Finger, Elizabeth; Thiessen, Jonathan; Hadway, Jennifer; Butler, John; Pavlosky, William; Prato, Frank; Thompson, Terry; St Lawrence, Keith

    2015-01-01

    PET semi-quantitative methods such as relative uptake value can be robust but offer no biological information and do not account for intra-subject variability in tracer administration or clearance. Simultaneous multimodal measurements that combine PET and MRI not only permit crucial multiparametric measurements but also provide a means of applying tracer kinetic modelling without the need for serial arterial blood sampling. In this study we adapted an image-derived input function (IDIF) method to improve characterization of glucose metabolism in an ongoing dementia study. Here we present preliminary results in a small group of frontotemporal dementia (FTD) patients and controls. The IDIF was obtained directly from dynamic PET data, guided by regions of interest drawn on the carotid vessels on high-resolution T1-weighted MR images, and was corrected for contamination by non-arterial voxels. A validation of the method was performed in a porcine model in a PET-CT scanner, comparing the IDIF to direct arterial blood samples. The metabolic rate of glucose (CMRglc) was measured voxel-by-voxel in gray matter, producing maps that were compared between groups. The net influx rate (Ki) and global mean CMRglc are reported. A good correlation (r = 0.9, p < 0.0001) was found between the corrected IDIF and the input function measured from direct arterial blood sampling in the validation study. In 3 FTD patients and 3 controls, a trend towards hypometabolism was found in the frontal, temporal and parietal lobes, similar to significant differences previously reported by other groups. The global mean CMRglc and Ki observed in control subjects are in line with previous reports. In general, kinetic modelling of FDG-PET using an MR-IDIF can improve characterization of glucose metabolism in dementia. This method is feasible in multimodal studies that aim to combine PET molecular imaging with MRI, as dynamic PET can be acquired along with multiple MRI measurements.
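    For FDG, the net influx rate Ki is typically obtained by Patlak graphical analysis and converted to CMRglc using the plasma glucose level and a lumped constant. A sketch under those standard assumptions (the abstract does not state which model the authors used, and the fit window and LC value here are placeholders):

```python
import numpy as np

def patlak_ki(t, ct, cp, t_star_frac=0.5):
    """Net influx rate Ki from Patlak graphical analysis: regress Ct/Cp on
    (integral of Cp)/Cp and take the slope over the late, linear portion.
    The fit-window choice (t_star_frac) is an arbitrary assumption."""
    t, ct, cp = (np.asarray(a, dtype=float) for a in (t, ct, cp))
    cp_int = np.concatenate(
        ([0.0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))))  # trapezoid
    x, y = cp_int / cp, ct / cp
    late = x > t_star_frac * x.max()
    slope, _ = np.polyfit(x[late], y[late], 1)
    return slope

def cmr_glc(ki, plasma_glucose, lumped_constant=0.65):
    """CMRglc = Ki * [glucose]_plasma / LC; the LC value is a placeholder."""
    return ki * plasma_glucose / lumped_constant
```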

  18. Preliminary evaluation of MRI-derived input function for quantitative measurement of glucose metabolism in an integrated PET-MRI

    Energy Technology Data Exchange (ETDEWEB)

    Anazodo, Udunna; Kewin, Matthew [Lawson Health Research Institute, Department of Medical Biophysics, Western University, London, Ontario (Canada); Finger, Elizabeth [Department of Clinical Neurological Sciences, Western University, London, Ontario (Canada); Thiessen, Jonathan; Hadway, Jennifer; Butler, John [Lawson Health Research Institute, Department of Medical Biophysics, Western University, London, Ontario (Canada); Pavlosky, William [Diagnostic Imaging, St Joseph' s Health Care, London, Ontario (Canada); Prato, Frank; Thompson, Terry; St Lawrence, Keith [Lawson Health Research Institute, Department of Medical Biophysics, Western University, London, Ontario (Canada)

    2015-05-18

    PET semi-quantitative methods such as relative uptake value can be robust but offer no biological information and do not account for intra-subject variability in tracer administration or clearance. Simultaneous multimodal measurements that combine PET and MRI not only permit crucial multiparametric measurements but also provide a means of applying tracer kinetic modelling without the need for serial arterial blood sampling. In this study we adapted an image-derived input function (IDIF) method to improve characterization of glucose metabolism in an ongoing dementia study. Here we present preliminary results in a small group of frontotemporal dementia (FTD) patients and controls. The IDIF was obtained directly from dynamic PET data, guided by regions of interest drawn on the carotid vessels on high-resolution T1-weighted MR images, and was corrected for contamination by non-arterial voxels. A validation of the method was performed in a porcine model in a PET-CT scanner, comparing the IDIF to direct arterial blood samples. The metabolic rate of glucose (CMRglc) was measured voxel-by-voxel in gray matter, producing maps that were compared between groups. The net influx rate (Ki) and global mean CMRglc are reported. A good correlation (r = 0.9, p < 0.0001) was found between the corrected IDIF and the input function measured from direct arterial blood sampling in the validation study. In 3 FTD patients and 3 controls, a trend towards hypometabolism was found in the frontal, temporal and parietal lobes, similar to significant differences previously reported by other groups. The global mean CMRglc and Ki observed in control subjects are in line with previous reports. In general, kinetic modelling of FDG-PET using an MR-IDIF can improve characterization of glucose metabolism in dementia. This method is feasible in multimodal studies that aim to combine PET molecular imaging with MRI, as dynamic PET can be acquired along with multiple MRI measurements.

  19. Evaluation of the use of a standard input function for compartment analysis of [123I]iomazenil data. Factors influencing the quantitative results

    International Nuclear Information System (INIS)

    Seike, Yujiro; Hashikawa, Kazuo; Oku, Naohiko

    2004-01-01

    Adoption of a standard input function (SIF) has been proposed for kinetic analysis of receptor binding potential (BP), instead of invasive frequent arterial sampling. The purpose of this study was to assess the SIF method in quantitative analysis of [123I]iomazenil (IMZ), a central benzodiazepine antagonist, for SPECT. SPECT studies were performed on 10 patients with cerebrovascular disease or Alzheimer disease. Intermittent dynamic SPECT scans were performed from 0 to 201 min after IMZ injection. BPs calculated from SIFs obtained from normal volunteers (BP_S) were compared with those obtained from individual arterial sampling (BP_O). Good correlations between BP_O and BP_S were shown in 9 subjects, but the maximum BP_S was four times larger than the corresponding BP_O in one case. There were no abnormal laboratory data in this patient, but the relative arterial input count in the late period was higher than in the SIF. Simulation studies with modified input functions revealed that the height of the input function in the late period can produce significant errors in estimated BPs. These results suggested that the simplified method with one-point arterial sampling and a SIF cannot be applied clinically; one additional arterial sample in the late period may be useful. (author)

  20. Nuclear Facility Isotopic Content (NFIC) Waste Management System to provide input for safety envelope definition

    International Nuclear Information System (INIS)

    Genser, J.R.

    1992-01-01

    The Westinghouse Savannah River Company (WSRC) is aggressively applying environmental remediation and radioactive waste management activities at the US Department of Energy's Savannah River Site (SRS) to ensure compliance with today's challenging governmental laws and regulatory requirements. This report discusses a computer-based Nuclear Facility Isotopic Content (NFIC) Waste Management System developed to provide input for the safety envelope definition and assessment of site-wide facilities. Information was formulated describing the SRS ''Nuclear Facilities'' and their respective bounding inventories of nuclear materials and radioactive waste using the NFIC Waste Management System

  1. Qualitative and Quantitative Analysis for US Army Recruiting Input Allocation

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

    .... An objective study of the quantitative and qualitative aspects of recruiting is necessary to meet the future needs of the Army, in light of strong possibilities of recruiting resource reduction...

  2. Input and execution

    International Nuclear Information System (INIS)

    Carr, S.; Lane, G.; Rowling, G.

    1986-11-01

    This document describes the input procedures, input data files and operating instructions for the SYVAC A/C 1.03 computer program. SYVAC A/C 1.03 simulates the groundwater mediated movement of radionuclides from underground facilities for the disposal of low and intermediate level wastes to the accessible environment, and provides an estimate of the subsequent radiological risk to man. (author)

  3. Quantitative measures of walking and strength provide insight into brain corticospinal tract pathology in multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Nora E Fritz

    2017-01-01

    Quantitative measures of strength and walking are associated with brain corticospinal tract pathology. The addition of these quantitative measures to basic clinical information explains more of the variance in corticospinal tract fractional anisotropy and magnetization transfer ratio than the basic clinical information alone. Outcome measurement for multiple sclerosis clinical trials has been notoriously challenging; the use of quantitative measures of strength and walking along with tract-specific imaging methods may improve our ability to monitor disease change over time and with intervention, and may provide needed guidelines for developing more effective targeted rehabilitation strategies.
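    The "explains more of the variance" claim is the usual hierarchical-regression comparison: fit the imaging outcome on clinical covariates alone, then add the strength/walking measures and compare R². A sketch on simulated data (all variables and effect sizes invented):

```python
import numpy as np

def r_squared(X, y):
    """In-sample R^2 of an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

# Simulated stand-in for the study design: does adding strength/walking
# measures to basic clinical covariates explain extra variance in a
# corticospinal-tract imaging outcome (e.g. fractional anisotropy)?
rng = np.random.default_rng(0)
clinical = rng.normal(size=(60, 2))  # e.g. age, disease duration (simulated)
strength = rng.normal(size=(60, 1))  # e.g. dynamometry z-score (simulated)
fa = 0.4 * strength[:, 0] + 0.2 * clinical[:, 0] + rng.normal(scale=0.5, size=60)
delta_r2 = (r_squared(np.hstack([clinical, strength]), fa)
            - r_squared(clinical, fa))  # variance explained by the addition
```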

  4. Effect of simultaneous application of mycorrhiza with compost, vermicompost and sulfural geranole on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.) in a low input cropping system

    Directory of Open Access Journals (Sweden)

    P. Rezvani Moghaddam

    2016-03-01

    The effect of simultaneous application of mycorrhiza with compost, vermicompost and sulfural geranole on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.) in a low input cropping system was investigated. Materials and methods: In order to evaluate the effects of simultaneous application of mycorrhiza and organic fertilizers on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.), an experiment was conducted based on a randomized complete block design with three replications at the Agricultural Research Farm, Ferdowsi University of Mashhad, Iran, during the 2009-2010 growing season. Treatments were mycorrhiza (Glomus mosseae), mycorrhiza + compost, mycorrhiza + vermicompost, mycorrhiza + organic sulfural geranole, compost, vermicompost, organic sulfural geranole and control (no fertilizer). Data analysis was done using SAS 9.1 and means were compared by Duncan's multiple range test at the 5% level of probability. Results and discussion: The results showed that the different organic and biological fertilizers had a significant effect on seed yield. Seed yield increased significantly with mycorrhiza, both alone and combined with organic sulfural geranole or vermicompost, compared to the control treatment. Biological yield increased significantly under simultaneous application of vermicompost or organic sulfural geranole with mycorrhiza compared to separate use of these fertilizers. All of the studied organic fertilizers combined with mycorrhiza significantly increased the oil content of sesame. Seed oil increased under simultaneous application of mycorrhiza with compost, vermicompost and organic sulfural geranole by 12, 13 and 10 percent, respectively, compared to separate application of mycorrhiza. It seems that mycorrhiza and organic fertilizers improved the quantitative and qualitative characteristics of sesame because they provided better conditions for the absorption and transport of nutrients to the plant (Hawkes et al., 2008). Conclusion: In general, the results showed that the simultaneous use of ecological inputs can improve

  5. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    Science.gov (United States)

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main

  6. Providing Open-Access Know How for Directors of Quantitative and Mathematics Support Centers

    Directory of Open Access Journals (Sweden)

    Michael Schuckers

    2017-01-01

    The purpose of this editorial is to introduce the quantitative literacy community to the newly published A Handbook for Directors of Quantitative and Mathematics Centers. QMaSCs (pronounced "Q-masks") can be broadly defined as centers that have supporting students in quantitative fields of study as part of their mission. Some focus only on calculus or mathematics; others concentrate on numeracy or quantitative literacy, and some do all of that. A QMaSC may be embedded in a mathematics department, part of a learning commons, or a stand-alone center. There are hundreds of these centers in the U.S. The new handbook, which is the outgrowth of a 2013 NSF-sponsored national workshop attended by 23 QMaSC directors from all quarters of the U.S., is available open access on the USF Scholar Commons and in hard copy from Amazon.com. This editorial by the handbook's editors provides background on and an overview of the 20 detailed chapters on center leadership and management; community interactions; staffing, hiring, and training; center assessment; and starting a center, followed by a collection of ten case studies from research universities, four-year state colleges, liberal arts colleges, and a community college. The editorial ends by pointing out the need for, and potential benefits of, a professional organization for QMaSC directors.

  7. Quantitative assessment of CA1 local circuits: knowledge base for interneuron-pyramidal cell connectivity.

    Science.gov (United States)

    Bezaire, Marianne J; Soltesz, Ivan

    2013-09-01

    In this work, through a detailed literature review, data-mining, and extensive calculations, we provide a current, quantitative estimate of the cellular and synaptic constituents of the CA1 region of the rat hippocampus. Beyond estimating the cell numbers of GABAergic interneuron types, we calculate their convergence onto CA1 pyramidal cells and compare it with the known input synapses on CA1 pyramidal cells. The convergence calculation and comparison are also made for excitatory inputs to CA1 pyramidal cells. In addition, we provide a summary of the excitatory and inhibitory convergence onto interneurons. The quantitative knowledge base assembled and synthesized here forms the basis for data-driven, large-scale computational modeling efforts. Additionally, this work highlights specific instances where the available data are incomplete, which should inspire targeted experimental projects toward a more complete quantification of the CA1 neurons and their connectivity. Copyright © 2013 Wiley Periodicals, Inc.

  8. MDS MIC Catalog Inputs

    Science.gov (United States)

    Johnson-Throop, Kathy A.; Vowell, C. W.; Smith, Byron; Darcy, Jeannette

    2006-01-01

    This viewgraph presentation reviews the inputs to the MDS Medical Information Communique (MIC) catalog. The purpose of the group is to provide input for updating the MDS MIC Catalog and to request that MMOP assign Action Items to other working groups and FSs to support the MITWG Process for developing MIC-DDs.

  9. Quantitative analysis of the publishing landscape in high-energy physics

    International Nuclear Information System (INIS)

    Mele, Salvatore; Dallman, David; Vigen, Jens; Yeomans, Joanne

    2006-01-01

    World-wide collaboration in high-energy physics (HEP) is a tradition which dates back several decades, with scientific publications mostly coauthored by scientists from different countries. This coauthorship phenomenon makes it difficult to identify precisely the 'share' of each country in HEP scientific production. One year's worth of HEP scientific articles published in peer-reviewed journals is analysed and their authors are uniquely assigned to countries. This method allows the first correct estimation on a pro rata basis of the share of HEP scientific publishing among several countries and institutions. The results provide an interesting insight into the geographical collaborative patterns of the HEP community. The HEP publishing landscape is further analysed to provide information on the journals favoured by the HEP community and on the geographical variation of their author bases. These results provide quantitative input to the ongoing debate on the possible transition of HEP publishing to an Open Access model. Foreword. This paper reports the results of a recent detailed study of the publishing landscape in high energy physics. We thought that because of its direct relevance to the high energy physics community, this important quantitative input to the debate on the transition to Open Access naturally finds its place in our journal. Marc Henneaux, JHEP Scientific Director
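The pro rata assignment described in this abstract, where each paper's unit weight is split equally among its authors and credited to their countries, can be sketched as follows (a minimal illustration; the country codes and paper lists are hypothetical):

```python
from collections import defaultdict

def pro_rata_shares(papers):
    """Split each paper's unit weight equally among its authors,
    then credit each author's country with that fraction."""
    shares = defaultdict(float)
    for author_countries in papers:
        weight = 1.0 / len(author_countries)
        for country in author_countries:
            shares[country] += weight
    return dict(shares)

# Two papers: one coauthored from CH, DE, US; one solely from IT.
papers = [["CH", "DE", "US"], ["IT"]]
print(pro_rata_shares(papers))
```

Under this scheme the shares sum to the number of papers, so national shares can be compared directly to publication counts.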

  10. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with the extension of the ECB's quantitative easing programme of asset purchases. The notes were requested by the Committee on Economic and Monetary Affairs as input for the February 2017 session of the Monetary Dialogue.

  11. Industrial ecology: Quantitative methods for exploring a lower carbon future

    Science.gov (United States)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
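The levelized cost of energy mentioned above is a ratio of discounted costs to discounted energy output, built from the same present-value machinery as NPV; a minimal sketch with hypothetical cash flows:

```python
def npv(cash_flows, rate):
    """Net present value of year-indexed cash flows (year 0 first)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def lcoe(costs, energy, rate):
    """Levelized cost of energy: discounted costs per discounted unit of energy."""
    return npv(costs, rate) / npv(energy, rate)

# Hypothetical plant: 1000 capital in year 0, 50/yr O&M,
# 400 MWh/yr delivered for three years, 5% discount rate.
costs = [1000, 50, 50, 50]
energy = [0, 400, 400, 400]
print(round(lcoe(costs, energy, 0.05), 3))
```

Discounting the energy as well as the costs is the standard convention; it makes the metric equal the constant price per MWh at which the project would break even.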

  12. Development of Input Function Measurement System for Small Animal PET Study

    International Nuclear Information System (INIS)

    Kim, Jong Guk; Kim, Byung Su; Kim, Jin Su

    2010-01-01

    Quantitative measurement of radioactivity concentration in tissue, together with a validated tracer kinetic model, requires a highly sensitive detection system for blood sampling. Accurate measurement of the time-activity curves (TACs) of labeled compounds in blood (plasma) provides quantitative information on biological parameters of interest in local tissue. In particular, the development of new tracers for PET imaging requires knowledge of the kinetics of the tracer in the body and in arterial blood and plasma. The conventional approach to obtaining an input function is to sample arterial blood sequentially, by hand, as a function of time. Several continuous blood sampling systems have been developed and used in the nuclear medicine research field to overcome the limited temporal resolution of sampling by the conventional method. In this work, we developed a highly sensitive GSO detector with a unique geometric design for small animal blood activity measurement.

  13. Modeling and generating input processes

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, M.E.

    1987-01-01

    This tutorial paper provides information relevant to the selection and generation of stochastic inputs to simulation studies. The primary area considered is multivariate inputs, but much of the philosophy is relevant to univariate inputs as well. 14 refs.

  14. 7 CFR 3430.15 - Stakeholder input.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Stakeholder input. 3430.15 Section 3430.15... Stakeholder input. Section 103(c)(2) of the Agricultural Research, Extension, and Education Reform Act of 1998... RFAs for competitive programs. CSREES will provide instructions for submission of stakeholder input in...

  15. Quantitative myocardial perfusion from static cardiac and dynamic arterial CT

    Science.gov (United States)

    Bindschadler, Michael; Branch, Kelley R.; Alessio, Adam M.

    2018-05-01

    Quantitative myocardial blood flow (MBF) estimation by dynamic contrast enhanced cardiac computed tomography (CT) requires multi-frame acquisition of contrast transit through the blood pool and myocardium to inform the arterial input and tissue response functions. Both the input and the tissue response functions for the entire myocardium are sampled with each acquisition. However, the long breath holds and frequent sampling can result in significant motion artifacts and relatively high radiation dose. To address these limitations, we propose and evaluate a new static cardiac and dynamic arterial (SCDA) quantitative MBF approach where (1) the input function is well sampled using either prediction from pre-scan timing bolus data or measured from dynamic thin slice ‘bolus tracking’ acquisitions, and (2) the whole-heart tissue response data is limited to one contrast enhanced CT acquisition. A perfusion model uses the dynamic arterial input function to generate a family of possible myocardial contrast enhancement curves corresponding to a range of MBF values. Combined with the timing of the single whole-heart acquisition, these curves generate a lookup table relating myocardial contrast enhancement to quantitative MBF. We tested the SCDA approach in 28 patients that underwent a full dynamic CT protocol both at rest and vasodilator stress conditions. Using measured input function plus single (enhanced CT only) or plus double (enhanced and contrast free baseline CT’s) myocardial acquisitions yielded MBF estimates with root mean square (RMS) error of 1.2 ml/min/g and 0.35 ml/min/g, and radiation dose reductions of 90% and 83%, respectively. The prediction of the input function based on timing bolus data and the static acquisition had an RMS error compared to the measured input function of 26.0% which led to MBF estimation errors greater than threefold higher than using the measured input function. SCDA presents a new, simplified approach for quantitative

  16. Posterior Inferotemporal Cortex Cells Use Multiple Input Pathways for Shape Encoding.

    Science.gov (United States)

    Ponce, Carlos R; Lomber, Stephen G; Livingstone, Margaret S

    2017-05-10

    In the macaque monkey brain, posterior inferior temporal (PIT) cortex cells contribute to visual object recognition. They receive concurrent inputs from visual areas V4, V3, and V2. We asked how these different anatomical pathways shape PIT response properties by deactivating them while monitoring PIT activity in two male macaques. We found that cooling of V4 or V2|3 did not lead to consistent changes in population excitatory drive; however, population pattern analyses showed that V4-based pathways were more important than V2|3-based pathways. We did not find any image features that predicted decoding accuracy differences between both interventions. Using the HMAX hierarchical model of visual recognition, we found that different groups of simulated "PIT" units with different input histories (lacking "V2|3" or "V4" input) allowed for comparable levels of object-decoding performance and that removing a large fraction of "PIT" activity resulted in similar drops in performance as in the cooling experiments. We conclude that distinct input pathways to PIT relay similar types of shape information, with V1-dependent V4 cells providing more quantitatively useful information for overall encoding than cells in V2 projecting directly to PIT. SIGNIFICANCE STATEMENT Convolutional neural networks are the best models of the visual system, but most emphasize input transformations across a serial hierarchy akin to the primary "ventral stream" (V1 → V2 → V4 → IT). However, the ventral stream also comprises parallel "bypass" pathways: V1 also connects to V4, and V2 to IT. To explore the advantages of mixing long and short pathways in the macaque brain, we used cortical cooling to silence inputs to posterior IT and compared the findings with an HMAX model with parallel pathways. Copyright © 2017 the authors 0270-6474/17/375019-16$15.00/0.

  17. Reproducibility of CSF quantitative culture methods for estimating rate of clearance in cryptococcal meningitis.

    Science.gov (United States)

    Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R

    2016-05-01

    Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) the St. George's method, using a 100 mcl input volume of CSF with five 1:10 serial dilutions; (2) the AIDS Clinical Trials Group (ACTG) method, using 1000, 100, and 10 mcl input volumes, and two 1:100 dilutions with 100 and 10 mcl input volume per dilution, on seven agar plates; and (3) a 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P = .09) but did differ for the St. George and 10 mcl loop methods. Correlation between methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (∼10%) difficulties in quantification due to colony clumping. For CSF clearance rate, the St. George and ACTG methods did not differ overall (mean -0.05 ± 0.07 log10 CFU/ml/day; P = .14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
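The plate-count arithmetic behind quantitative cultures, and a least-squares clearance rate in log10 CFU/ml/day, can be sketched as follows (the colony counts, dilutions, and time points below are hypothetical, not data from the study):

```python
import math

def cfu_per_ml(colonies, dilution_fraction, plated_volume_ml):
    """Concentration in the undiluted sample from a single plate count.
    dilution_fraction is the remaining concentration fraction,
    e.g. 0.01 for a 1:100 dilution."""
    return colonies / (plated_volume_ml * dilution_fraction)

def clearance_rate(days, cfu_counts):
    """Least-squares slope of log10(CFU/ml) versus day,
    in log10 CFU/ml/day; a negative slope means clearing."""
    logs = [math.log10(c) for c in cfu_counts]
    n = len(days)
    mx = sum(days) / n
    my = sum(logs) / n
    num = sum((x - mx) * (y - my) for x, y in zip(days, logs))
    den = sum((x - mx) ** 2 for x in days)
    return num / den

# 52 colonies on a 1:100 dilution plate, 0.1 ml (100 mcl) plated:
print(cfu_per_ml(52, 0.01, 0.1))
# Serial CSF samples on days 0, 3, 7, 14:
print(clearance_rate([0, 3, 7, 14], [1e5, 3e4, 5e3, 2e2]))
```

Regressing on the log scale is what makes the slope directly comparable across patients with very different starting fungal burdens.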

  18. Quantitative contrast-enhanced first-pass cardiac perfusion MRI at 3 tesla with accurate arterial input function and myocardial wall enhancement.

    Science.gov (United States)

    Breton, Elodie; Kim, Daniel; Chung, Sohae; Axel, Leon

    2011-09-01

    To develop, and validate in vivo, a robust quantitative first-pass perfusion cardiovascular MR (CMR) method with accurate arterial input function (AIF) and myocardial wall enhancement. A saturation-recovery (SR) pulse sequence was modified to sequentially acquire multiple slices after a single nonselective saturation pulse at 3 Tesla. In each heartbeat, an AIF image is acquired in the aortic root with a short time delay (TD) (50 ms), followed by the acquisition of myocardial images with longer TD values (∼150-400 ms). Longitudinal relaxation rates (R1 = 1/T1) were calculated using an ideal saturation recovery equation based on the Bloch equation, and corresponding gadolinium contrast concentrations were calculated assuming the fast water exchange condition. The proposed method was validated against a reference multi-point SR method by comparing their respective R1 measurements in the blood and left ventricular myocardium, before and at multiple time points following contrast injections, in 7 volunteers. R1 measurements with the proposed method and the reference multi-point method were strongly correlated (r > 0.88, P < 10^-5) and in good agreement (mean difference ± 1.96 standard deviations: 0.131 ± 0.317 s^-1 for blood and 0.018 ± 0.140 s^-1 for myocardium). The proposed quantitative first-pass perfusion CMR method measured accurate R1 values for quantification of AIF and myocardial wall contrast agent concentrations in 3 cardiac short-axis slices, in a total acquisition time of 523 ms per heartbeat. Copyright © 2011 Wiley-Liss, Inc.
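The ideal saturation-recovery relation underlying the R1 calculation, S = M0(1 - exp(-TD·R1)), can be inverted in closed form, and the fast-exchange assumption makes contrast concentration proportional to the R1 change. A sketch with synthetic numbers (the signal values and the relaxivity of 4.5 /s/mM are illustrative assumptions, not study data):

```python
import math

def r1_from_sr(signal, m0, td_s):
    """Invert the ideal saturation-recovery equation
    S = M0 * (1 - exp(-TD * R1)) for R1, in s^-1."""
    return -math.log(1.0 - signal / m0) / td_s

def gd_concentration(r1_post, r1_pre, relaxivity):
    """Contrast concentration under fast water exchange:
    [Gd] = (R1_post - R1_pre) / relaxivity, in mM."""
    return (r1_post - r1_pre) / relaxivity

# Synthesize a signal with known R1 = 2 /s at the short AIF delay TD = 50 ms,
# then recover R1 from it.
m0 = 1000.0
td = 0.05
signal = m0 * (1.0 - math.exp(-td * 2.0))
print(round(r1_from_sr(signal, m0, td), 3))
print(gd_concentration(3.8, 0.6, 4.5))
```

The short TD keeps the AIF signal in the near-linear part of the recovery curve, which is why the aortic measurement stays accurate at peak contrast concentrations.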

  19. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol' indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows estimation of the sensitivity indices of each scalar model input, while the 'dispersion model' allows derivation of the total sensitivity index of the functional model inputs. The proposed approach is compared to some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.

  20. How input fluctuations reshape the dynamics of a biological switching system

    Science.gov (United States)

    Hu, Bo; Kessler, David A.; Rappel, Wouter-Jan; Levine, Herbert

    2012-12-01

    An important task in quantitative biology is to understand the role of stochasticity in biochemical regulation. Here, as an extension of our recent work [Phys. Rev. Lett. 107, 148101 (2011)], we study how input fluctuations affect the stochastic dynamics of a simple biological switch. In our model, the on transition rate of the switch is directly regulated by a noisy input signal, which is described as a non-negative mean-reverting diffusion process. This continuous process can be a good approximation of the discrete birth-death process and is much more analytically tractable. Within this setup, we apply the Feynman-Kac theorem to investigate the statistical features of the output switching dynamics. Consistent with our previous findings, the input noise is found to effectively suppress the input-dependent transitions. We show analytically that this effect becomes significant when the input signal fluctuates greatly in amplitude and reverts slowly to its mean.
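A minimal simulation in the spirit of this model can be built from an Euler-Maruyama discretization of a CIR-type non-negative mean-reverting input driving the switch's on rate (all parameter values below are hypothetical, and this is a numerical sketch, not the paper's Feynman-Kac analysis):

```python
import math, random

def simulate_switch(t_end=200.0, dt=0.01, k=0.5, mu=1.0, sigma=0.4,
                    k_on=1.0, k_off=1.0, seed=1):
    """Simulate a two-state switch whose ON rate is proportional to a
    non-negative mean-reverting (CIR-type) input signal.
    Returns the fraction of time spent in the ON state."""
    rng = random.Random(seed)
    x = mu              # input signal, reverting toward mean mu
    on = False
    t_on = 0.0
    for _ in range(int(t_end / dt)):
        # Euler-Maruyama step; sqrt(x) noise keeps the signal non-negative
        x += k * (mu - x) * dt + sigma * math.sqrt(max(x, 0.0) * dt) * rng.gauss(0, 1)
        x = max(x, 0.0)
        if on:
            if rng.random() < k_off * dt:
                on = False
            else:
                t_on += dt
        elif rng.random() < k_on * x * dt:   # input-dependent ON transition
            on = True
    return t_on / t_end

print(simulate_switch())
```

Sweeping `sigma` upward or `k` downward in such a simulation is one way to observe the suppression effect the abstract describes: large, slowly reverting input fluctuations reduce the effective ON occupancy below the noise-free expectation.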

  1. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  2. Screening important inputs in models with strong interaction properties

    International Nuclear Information System (INIS)

    Saltelli, Andrea; Campolongo, Francesca; Cariboni, Jessica

    2009-01-01

    We introduce a new method for screening inputs in mathematical or computational models with large numbers of inputs. The method proposed here represents an improvement over the best available practice for this setting when dealing with models having strong interaction effects. When the sample size is sufficiently high the same design can also be used to obtain accurate quantitative estimates of the variance-based sensitivity measures: the same simulations can be used to obtain estimates of the variance-based measures according to the Sobol' and the Jansen formulas. Results demonstrate that Sobol' is more efficient for the computation of the first-order indices, while Jansen performs better for the computation of the total indices.

  3. Screening important inputs in models with strong interaction properties

    Energy Technology Data Exchange (ETDEWEB)

    Saltelli, Andrea [European Commission, Joint Research Centre, 21020 Ispra, Varese (Italy); Campolongo, Francesca [European Commission, Joint Research Centre, 21020 Ispra, Varese (Italy)], E-mail: francesca.campolongo@jrc.it; Cariboni, Jessica [European Commission, Joint Research Centre, 21020 Ispra, Varese (Italy)

    2009-07-15

    We introduce a new method for screening inputs in mathematical or computational models with large numbers of inputs. The method proposed here represents an improvement over the best available practice for this setting when dealing with models having strong interaction effects. When the sample size is sufficiently high the same design can also be used to obtain accurate quantitative estimates of the variance-based sensitivity measures: the same simulations can be used to obtain estimates of the variance-based measures according to the Sobol' and the Jansen formulas. Results demonstrate that Sobol' is more efficient for the computation of the first-order indices, while Jansen performs better for the computation of the total indices.
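The Sobol' and Jansen estimators referred to in this abstract can be sketched with the standard two-matrix Monte Carlo design (matrices A and B, plus hybrid matrices taking one column of B into A). The test model, sample size, and the assumption of i.i.d. Uniform(0, 1) inputs are illustrative choices, not the paper's setup:

```python
import random, statistics

def sobol_jansen(model, d, n=20000, seed=0):
    """Monte Carlo estimates of first-order (Sobol') and total (Jansen)
    sensitivity indices for a scalar-output model of d uniform inputs."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(d)] for _ in range(n)]
    B = [[rng.random() for _ in range(d)] for _ in range(n)]
    yA = [model(a) for a in A]
    yB = [model(b) for b in B]
    var_y = statistics.pvariance(yA + yB)
    S, ST = [], []
    for i in range(d):
        # Hybrid matrix: A with column i taken from B
        yABi = [model(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        # Sobol' estimator for the first-order index
        S.append(statistics.mean(
            yb * (yabi - ya) for ya, yb, yabi in zip(yA, yB, yABi)) / var_y)
        # Jansen estimator for the total index
        ST.append(statistics.mean(
            (ya - yabi) ** 2 for ya, yabi in zip(yA, yABi)) / (2 * var_y))
    return S, ST

# Additive test model Y = X1 + 2*X2: analytically S1 = 0.2, S2 = 0.8,
# and total indices equal the first-order ones (no interactions).
S, ST = sobol_jansen(lambda x: x[0] + 2 * x[1], d=2)
print(S, ST)
```

With an interaction-free model like this one, the two estimators should agree; a gap between ST and S for some input is the signature of strong interaction effects that motivates the screening design in the abstract.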

  4. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with minimum laboratory measurements for network training, as long as all the elements of the analysed solution figure in the training set and provided that adequate scaling of input data is performed. Once the network has been trained, analysis is carried out in a few seconds. In a test between several well-known laboratories, where unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce, and 141Ce present in a sample had to be determined, the results yielded by our network placed it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of relevant input parameters, input data scaling, and network training. Main results are presented together with a statistical model allowing prediction of network errors.

  5. Global quantitative indices reflecting provider process-of-care: data-base derivation.

    Science.gov (United States)

    Moran, John L; Solomon, Patricia J

    2010-04-19

    Controversy has attended the relationship between risk-adjusted mortality and process-of-care. There would be advantage in the establishment, at the data-base level, of global quantitative indices subsuming the diversity of process-of-care. A retrospective cohort study of patients identified in the Australian and New Zealand Intensive Care Society Adult Patient Database, 1993-2003, at the level of geographic and ICU-level descriptors (n = 35), for both hospital survivors and non-survivors. Process-of-care indices were established by analysis of: (i) the smoothed time-hazard curve of individual patient discharge, determined by pharmaco-kinetic methods as area under the hazard curve (AUC), reflecting the integrated experience of the discharge process, and time-to-peak-hazard (TMAX, in days), reflecting the time to maximum rate of hospital discharge; and (ii) individual patient ability to optimize output (as length-of-stay) for recorded data-base physiological inputs, estimated as a technical production efficiency (TE, scaled [0, (maximum) 1]) via the econometric technique of stochastic frontier analysis. For each descriptor, multivariate correlation relationships between indices and summed mortality probability were determined. The data set consisted of 223,129 patients from 99 ICUs with mean (SD) age and APACHE III score of 59.2 (18.9) years and 52.7 (30.6), respectively; 41.7% were female and 45.7% were mechanically ventilated within the first 24 hours post-admission. For survivors, AUC was maximal in rural and for-profit ICUs, whereas TMAX (≥ 7.8 days) and TE (≥ 0.74) were maximal in tertiary ICUs. For non-survivors, AUC was maximal in tertiary ICUs, but TMAX (≥ 4.2 days) and TE (≥ 0.69) were maximal in for-profit ICUs. Across descriptors, significant differences in indices were demonstrated (analysis-of-variance). Explained variance, for survivors (0.89) and non-survivors (0.89), was maximized by combinations of indices demonstrating a low correlation with

  6. Quantitative assessment of multiple sclerosis lesion load using CAD and expert input

    Science.gov (United States)

    Gertych, Arkadiusz; Wong, Alexis; Sangnil, Alan; Liu, Brent J.

    2008-03-01

    Multiple sclerosis (MS) is a frequently encountered neurological disease with a progressive but variable course affecting the central nervous system. Outline-based lesion quantification in the assessment of lesion load (LL), performed on magnetic resonance (MR) images, is clinically useful and provides information about development and change reflecting overall disease burden. Methods of LL assessment that rely on human input are tedious, have higher intra- and inter-observer variability, and are more time-consuming than computerized automatic (CAD) techniques. At present, it seems that methods based on human lesion identification preceded by non-interactive outlining by CAD are the best LL quantification strategies. We have developed a CAD system that automatically quantifies MS lesions, displays a 3-D lesion map, and appends radiological findings to original images according to the current DICOM standard. The CAD system is also capable of displaying and tracking changes and making comparisons between a patient's separate MRI studies to determine disease progression. The findings are exported to a separate imaging tool for review and final approval by an expert. Capturing and standardized archiving of manual contours is also implemented. Similarity coefficients calculated from LL quantities in the collected exams show a good correlation of CAD-derived results vs. those incorporated from the expert's reading. Combining the CAD approach with expert interaction may benefit the diagnostic work-up of MS patients because of improved reproducibility in LL assessment and reduced reading time for single MR or comparative exams. Inclusion of CAD-generated outlines as DICOM-compliant overlays in the image data can serve as a better reference in MS progression tracking.
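Similarity coefficients between CAD-derived and expert lesion outlines are commonly computed as a Dice overlap on binary masks; a hedged sketch (the abstract does not specify which coefficient was used, and the toy masks below are hypothetical):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity between two binary masks (flat 0/1 sequences):
    2|A ∩ B| / (|A| + |B|)."""
    inter = sum(a & b for a, b in zip(mask_a, mask_b))
    size = sum(mask_a) + sum(mask_b)
    return 2.0 * inter / size if size else 1.0

# Toy 1-D example standing in for flattened lesion masks:
cad    = [0, 1, 1, 1, 0, 0, 1, 0]
expert = [0, 1, 1, 0, 0, 0, 1, 1]
print(dice_coefficient(cad, expert))
```

A Dice value of 1.0 means perfect agreement between the two outlines and 0.0 means no overlap, which makes it a convenient scale for inter-observer reproducibility comparisons.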

  7. Input Enhancement and L2 Question Formation.

    Science.gov (United States)

    White, Lydia; And Others

    1991-01-01

    Investigated the extent to which form-focused instruction and corrective feedback (i.e., "input enhancement"), provided within a primarily communicative program, contribute to learners' accuracy in question formation. Study results are interpreted as evidence that input enhancement can bring about genuine changes in learners' interlanguage…

  8. Prioritizing Interdependent Production Processes using Leontief Input-Output Model

    Directory of Open Access Journals (Sweden)

    Masbad Jesah Grace

    2016-03-01

    This paper proposes a methodology for identifying key production processes in an interdependent production system. Previous approaches in this domain have drawbacks that may potentially affect the reliability of decision-making. The proposed approach adopts the Leontief input-output model (L-IOM), which has proven successful in analyzing interdependent economic systems. The motivation behind such adoption lies in the strength of the L-IOM in providing a rigorous quantitative framework for identifying key components of interdependent systems. In this proposed approach, the consumption and production flows of each process are represented respectively by the material inventory produced by the prior process and the material inventory produced by the current process, both in monetary values. A case study of a furniture production system located in the central Philippines was carried out to elucidate the proposed approach. Results of the case study are reported in this work.
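At the heart of the L-IOM is the balance x = Ax + d, whose solution x = (I - A)^-1 d gives the gross output each process must supply to meet final demand d. A stdlib-only sketch using the convergent Neumann-series expansion of the Leontief inverse, with a hypothetical two-process coefficient matrix:

```python
def leontief_output(A, final_demand):
    """Gross output x solving x = A x + d, i.e. x = (I - A)^-1 d,
    via the Neumann series x = d + A d + A^2 d + ...
    (valid for a productive system, spectral radius of A below 1)."""
    n = len(final_demand)
    x = list(final_demand)
    term = list(final_demand)
    for _ in range(200):   # fixed-point iteration; converges geometrically
        term = [sum(A[i][j] * term[j] for j in range(n)) for i in range(n)]
        x = [xi + ti for xi, ti in zip(x, term)]
    return x

# Hypothetical technical coefficients: A[i][j] is the input from
# process i needed per monetary unit of output of process j.
A = [[0.1, 0.3],
     [0.2, 0.1]]
demand = [10.0, 20.0]
print(leontief_output(A, demand))
```

The gap between gross output x and final demand d measures how strongly each process is pulled on by the rest of the system, which is the kind of quantity used to rank key processes.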

  9. Quantitative population epigenetics - a catalyst for sustainable agriculture

    Directory of Open Access Journals (Sweden)

    Stauß, Reinhold

    2014-02-01

    The use of quantitative population epigenetics, and the related importance of stress, can lead to a paradigm shift: away from a high-input, high-output agriculture that maximally exploits the genetic potential, towards ecological intensification, i.e. a low-input, high-output agriculture that optimizes and harmonizes limiting stress factors to achieve maximum results with limited environmental and ecological resources.

  10. BBN based Quantitative Assessment of Software Design Specification

    International Nuclear Information System (INIS)

    Eom, Heung-Seop; Park, Gee-Yong; Kang, Hyun-Gook; Kwon, Kee-Choon; Chang, Seung-Cheol

    2007-01-01

    Probabilistic Safety Assessment (PSA), one of the important methods for assessing the overall safety of a nuclear power plant (NPP), requires quantitative reliability information for safety-critical software, but conventional reliability assessment methods cannot provide enough information for the PSA of an NPP. Therefore, current PSAs that include safety-critical software either do not consider the reliability of the software or use arbitrary values for it. To address this situation, this paper proposes a method that can produce quantitative reliability information for safety-critical software for PSA by making use of Bayesian Belief Networks (BBN). BBNs have generally been used to model uncertain systems in many research fields, including the safety assessment of software. The proposed method was constructed by utilizing a BBN that can combine the qualitative and quantitative evidence relevant to the reliability of safety-critical software. The constructed BBN model can infer a conclusion in a formal and quantitative way. A case study was carried out with the proposed method to assess the quality of the software design specification (SDS) of safety-critical software that will be embedded in a reactor protection system. The intermediate verification and validation (V&V) results for the software design specification were used as inputs to the BBN model
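
    At its core, the evidence-combination step of a BBN is Bayes' rule applied along the network's edges. The two-node sketch below uses hypothetical probabilities (not the paper's model) to show how a passing V&V check would update a prior belief about SDS quality:

```python
# Hypothetical two-node BBN: Quality -> V&V result (numbers are illustrative).
p_quality_high = 0.7       # prior belief that SDS quality is high
p_pass_given_high = 0.9    # probability the V&V check passes if quality is high
p_pass_given_low = 0.3     # probability it passes if quality is low

# Evidence: the V&V check passed. Posterior via Bayes' rule.
p_pass = (p_pass_given_high * p_quality_high
          + p_pass_given_low * (1.0 - p_quality_high))
posterior_high = p_pass_given_high * p_quality_high / p_pass
print(f"P(quality high | V&V passed) = {posterior_high:.3f}")
```

    A full BBN chains such updates over many nodes; inference engines automate the bookkeeping that this two-node case does by hand.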

  11. INPUT-OUTPUT ANALYSIS : THE NEXT 25 YEARS

    NARCIS (Netherlands)

    Dietzenbacher, Erik; Lenzen, Manfred; Los, Bart; Guan, Dabo; Lahr, Michael L.; Sancho, Ferran; Suh, Sangwon; Yang, Cuihong; Sancho, S.

    2013-01-01

    This year marks the 25th anniversary of the International Input-Output Association and the 25th volume of Economic Systems Research. To celebrate this anniversary, a group of eight experts provide their views on the future of input-output. Looking forward, they foresee progress in terms of data

  12. Smart mobility solution with multiple input Output interface.

    Science.gov (United States)

    Sethi, Aartika; Deb, Sujay; Ranjan, Prabhat; Sardar, Arghya

    2017-07-01

    Smart wheelchairs are commonly used to provide a solution for mobility impairment. However, their usage is limited, primarily due to the high cost of the sensors required for input, a lack of adaptability to different categories of input, and limited functionality. In this paper we propose a smart mobility solution using a smartphone with inbuilt sensors (accelerometer, camera and speaker) as the input interface. An Emotiv EPOC+ is also used for motor-imagery-based input control, synced with facial expressions, in cases of extreme disability. Apart from traction, additional functions like home security and automation are provided using Internet of Things (IoT) and web interfaces. Although preliminary, our results suggest that this system can be used as an integrated and efficient solution for people suffering from mobility impairment. The results also indicate that decent accuracy is obtained for the overall system.

  13. Quantitative reliability assessment for safety critical system software

    International Nuclear Information System (INIS)

    Chung, Dae Won; Kwon, Soon Man

    2005-01-01

    An essential issue in the replacement of old analogue instrumentation and control (I&C) systems with computer-based digital systems in nuclear power plants is quantitative software reliability assessment. Software reliability models have been successfully applied to many industrial applications, but have the unfortunate drawback of requiring data from which one can formulate a model. Software developed for safety-critical applications is frequently unable to produce such data, for at least two reasons. First, the software is frequently one-of-a-kind, and second, it rarely fails. Safety-critical software is normally expected to pass every unit test, producing precious little failure data. The basic premise of the rare-events approach is that well-tested software does not fail under normal, routine input signals, which means that failures must be triggered by unusual input data and computer states. The failure data found under reasonable testing cases, and the testing time for these conditions, should be considered for the quantitative reliability assessment. In this paper we present a quantitative reliability assessment methodology for safety-critical software in the rare-failure case
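
    The "rarely fails" problem still admits a quantitative statement: after N failure-free tests one can bound the per-demand failure probability at a chosen confidence level. The sketch below is the generic textbook bound (the so-called rule of three), shown for illustration rather than as the paper's methodology:

```python
def failure_prob_upper_bound(num_tests: int, confidence: float = 0.95) -> float:
    """Upper confidence bound on the per-demand failure probability after
    num_tests independent, failure-free tests: solve (1 - p)**N = 1 - c."""
    return 1.0 - (1.0 - confidence) ** (1.0 / num_tests)

# With 3000 failure-free tests, the exact 95% bound is close to 3/N.
n = 3000
exact = failure_prob_upper_bound(n)
rule_of_three = 3.0 / n
print(f"exact bound: {exact:.2e}, rule-of-three: {rule_of_three:.2e}")
```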

  14. Finding identifiable parameter combinations in nonlinear ODE models and the rational reparameterization of their input-output equations.

    Science.gov (United States)

    Meshkat, Nicolette; Anderson, Chris; Distefano, Joseph J

    2011-09-01

    When examining the structural identifiability properties of dynamic system models, some parameters can take on an infinite number of values and yet yield identical input-output data. These parameters, and the model, are then said to be unidentifiable. Finding identifiable combinations of parameters with which to reparameterize the model provides a means for quantitatively analyzing the model and computing solutions in terms of the combinations. In this paper, we revisit and explore the properties of an algorithm for finding identifiable parameter combinations using Gröbner bases and prove useful theoretical properties of these parameter combinations. We prove that a set of M algebraically independent identifiable parameter combinations can be found using this algorithm and that there exists a unique rational reparameterization of the input-output equations over these parameter combinations. We also demonstrate application of the procedure to a nonlinear biomodel.

  15. Quantitative estimates of Asian dust input to the western Philippine Sea in the mid-late Quaternary and its potential significance for paleoenvironment

    Science.gov (United States)

    Xu, Zhaokai; Li, Tiegang; Clift, Peter D.; Lim, Dhongil; Wan, Shiming; Chen, Hongjin; Tang, Zheng; Jiang, Fuqing; Xiong, Zhifang

    2015-09-01

    We present a new high-resolution multiproxy data set of Sr-Nd isotopes, rare earth element, soluble iron, and total organic carbon data from International Marine Global Change Study Core MD06-3047 located in the western Philippine Sea. We integrate our new data with published clay mineralogy, rare earth element chemistry, thermocline depth, and δ13C differences between benthic and planktonic foraminifera, in order to quantitatively constrain Asian dust input to the basin. We explore the relationship between Philippine Sea and high-latitude Pacific eolian fluxes, as well as its significance for marine productivity and atmospheric CO2 during the mid-late Quaternary. Three different indices indicate that Asian dust contributes between ˜15% and ˜50% to the detrital fraction of the sediments. Eolian dust flux in Core MD06-3047 is similar to that in polar southern Pacific sediments. Coherent changes for most dust flux maxima/minima indicate that dust generation in interhemispheric source areas might have a common response to climatic variation over the mid-late Quaternary. Furthermore, we note relatively good coherence between Asian dust input, soluble iron concentration, local marine productivity, and even global atmospheric CO2 concentration over the entire study interval. This suggests that dust-borne iron fertilization of marine phytoplankton might have been a periodic process operating at glacial/interglacial time scales over the past 700 ka. We suggest that strengthening of the biological pump in the Philippine Sea, and elsewhere in the tropical western Pacific during the mid-late Quaternary glacial periods, may have contributed to the lowering of atmospheric CO2 concentrations during ice ages.

  16. Comparing Jupiter and Saturn: dimensionless input rates from plasma sources within the magnetosphere

    Directory of Open Access Journals (Sweden)

    V. M. Vasyliūnas

    2008-06-01

    The quantitative significance for a planetary magnetosphere of plasma sources associated with a moon of the planet can be assessed only by expressing the plasma mass input rate in dimensionless form, as the ratio of the actual mass input to some reference value. Traditionally, the solar wind mass flux through an area equal to the cross-section of the magnetosphere has been used. Here I identify another reference value of mass input, independent of the solar wind and constructed from planetary parameters alone, which can be shown to represent a mass input sufficiently large to prevent corotation already at the source location. The source rate from Enceladus at Saturn has been reported to be an order of magnitude smaller (in absolute numbers) than that from Io at Jupiter. Both reference values, however, are also smaller at Saturn than at Jupiter, by factors of ~40 to 60; expressed in dimensionless form, the estimated mass input from Enceladus may be larger than that from Io by factors of ~4 to 6. The magnetosphere of Saturn may thus, despite a lower mass input in kg s−1, intrinsically be more heavily mass-loaded than the magnetosphere of Jupiter.
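
    The arithmetic of the comparison is simple to sketch. The absolute values below are placeholders chosen only to reproduce the ratios quoted in the abstract (a source an order of magnitude smaller, reference values roughly 50 times smaller); they are not measured rates:

```python
# Placeholder absolute source rates (arbitrary units, not measurements).
io_rate = 1000.0          # mass input from Io at Jupiter
enceladus_rate = 100.0    # an order of magnitude smaller at Saturn

# Placeholder planetary reference values; Saturn's is ~50x smaller.
jupiter_reference = 1.0e5
saturn_reference = jupiter_reference / 50.0

io_dimensionless = io_rate / jupiter_reference
enceladus_dimensionless = enceladus_rate / saturn_reference

# Despite the smaller absolute rate, the dimensionless loading is larger.
ratio = enceladus_dimensionless / io_dimensionless
print(f"dimensionless loading, Enceladus relative to Io: {ratio:.1f}")
```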

  18. The Quantitative Linear-Time–Branching-Time Spectrum

    DEFF Research Database (Denmark)

    Thrane, Claus; Fahrenberg, Uli; Legay, Axel

    2011-01-01

    We present a distance-agnostic approach to quantitative verification. Taking as input an unspecified distance on system traces, or executions, we develop a game-based framework which allows us to define a spectrum of different interesting system distances corresponding to the given trace distance...

  19. Global quantitative indices reflecting provider process-of-care: data-base derivation

    Directory of Open Access Journals (Sweden)

    Solomon Patricia J

    2010-04-01

    Background: Controversy has attended the relationship between risk-adjusted mortality and process-of-care. There would be advantage in the establishment, at the data-base level, of global quantitative indices subsuming the diversity of process-of-care. Methods: A retrospective cohort study of patients identified in the Australian and New Zealand Intensive Care Society Adult Patient Database, 1993-2003, at the level of geographic and ICU-level descriptors (n = 35), for both hospital survivors and non-survivors. Process-of-care indices were established by analysis of: (i) the smoothed time-hazard curve of individual patient discharge, determined by pharmaco-kinetic methods as area under the hazard curve (AUC), reflecting the integrated experience of the discharge process, and time-to-peak-hazard (TMAX, in days), reflecting the time to the maximum rate of hospital discharge; and (ii) individual patient ability to optimize output (as length-of-stay) for recorded data-base physiological inputs, estimated as a technical production-efficiency (TE, scaled [0, 1], 1 being the maximum), via the econometric technique of stochastic frontier analysis. For each descriptor, multivariate correlation-relationships between indices and summed mortality probability were determined. Results: The data-set consisted of 223129 patients from 99 ICUs with mean (SD) age and APACHE III score of 59.2 (18.9) years and 52.7 (30.6), respectively; 41.7% were female and 45.7% were mechanically ventilated within the first 24 hours post-admission. For survivors, AUC was maximal in rural and for-profit ICUs, whereas TMAX (≥ 7.8 days) and TE (≥ 0.74) were maximal in tertiary ICUs. For non-survivors, AUC was maximal in tertiary ICUs, but TMAX (≥ 4.2 days) and TE (≥ 0.69) were maximal in for-profit ICUs. Across descriptors, significant differences in indices were demonstrated (analysis-of-variance, P ≤ 0.0001). Total explained variance, for survivors (0.89) and non-survivors (0.89), was maximized by…
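
    The two hazard-curve indices are straightforward to compute once a smoothed hazard is available. The curve below is synthetic (not ANZICS data), used only to show AUC by trapezoidal integration and TMAX as the location of the peak:

```python
import numpy as np

# Synthetic smoothed discharge hazard h(t) over 14 days (illustrative only).
t = np.linspace(0.0, 14.0, 141)     # days since admission
h = 0.3 * t * np.exp(-t / 4.0)      # hazard of hospital discharge per day

# AUC: trapezoidal area under the hazard curve.
auc = float(np.sum((h[1:] + h[:-1]) * np.diff(t)) / 2.0)
# TMAX: time-to-peak-hazard, in days.
t_max = float(t[np.argmax(h)])

print(f"AUC = {auc:.2f}, TMAX = {t_max:.1f} days")
```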

  20. Usability Improvement for Data Input into the Fatigue Avoidance Scheduling Tool (FAST)

    National Research Council Canada - National Science Library

    Miller, James C

    2005-01-01

    ...) data input mode than using the graphic schedule input mode. The Grid input mode provided both a statistically and an operationally significant reduction in data input time, compared to the Graphic mode for both novice...

  1. Providing disabled persons in developing countries access to computer games through a novel gaming input device

    CSIR Research Space (South Africa)

    Smith, Andrew C

    2011-01-01

    A novel input device is presented for use with a personal computer by persons with physical disabilities who would otherwise not be able to enjoy computer gaming. The device is simple to manufacture and low in cost. A gaming application...

  2. Evaluation of input and process components of quality of child health services provided at 24 × 7 primary health centers of a district in Central Gujarat

    Directory of Open Access Journals (Sweden)

    Paragkumar Chavda

    2015-01-01

    Context: With the critical challenge India faces on child survival and health, the time is ripe to focus on the quality of services, in addition to measuring coverage, in order to bring about improvements. Aims: To assess the quality of child health services provided at 24 × 7 Primary Health Centers (PHCs) of Vadodara District in Gujarat in terms of input and process indicators. Settings and Design: The study was carried out in 12 randomly chosen 24 × 7 PHCs of Vadodara district using a modified quality assessment checklist of the Program on District Quality Assurance for Reproductive and Child Health (RCH) services, with the use of scores, from May 2010 to June 2011. Subjects and Methods: Input assessment was done by facility survey. Process assessment for the four child health service components used actual observation of services, review of records, and interviews of service providers and clients. Results: The mean score obtained by the facilities in the input section was 65%. The highest score was obtained for Drugs and Consumables (86%), followed by Equipment and Supplies (74%). The score obtained for Infrastructure was 65%, Personnel and Training 56%, and Essential Protocols and Guidelines 43%. The mean score obtained in the process section was 55%. The highest score was obtained for immunization (76%), followed by newborn care (52%), growth monitoring (52%), and management of the sick child (41%). Conclusion: Quality improvement efforts should focus not only on resource-intensive structural improvements, but also on cost-effective measures to improve the service delivery process, especially adherence to service guidelines by providers.

  3. FLUTAN input specifications

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Baumann, W.; Willerding, G.

    1991-05-01

    FLUTAN is a highly vectorized computer code for 3-D fluid-dynamic and thermal-hydraulic analyses in Cartesian and cylinder coordinates. It is related to the family of COMMIX codes originally developed at Argonne National Laboratory, USA. To a large extent, FLUTAN relies on basic concepts and structures imported from COMMIX-1B and COMMIX-2, which were made available to KfK in the frame of cooperation contracts in the fast reactor safety field. While on the one hand not all features of the original COMMIX versions have been implemented in FLUTAN, the code on the other hand includes some essential innovative options like the CRESOR solution algorithm, a general 3-dimensional rebalancing scheme for solving the pressure equation, and LECUSSO-QUICK-FRAM techniques suitable for reducing 'numerical diffusion' in both the enthalpy and momentum equations. This report provides users with detailed input instructions, presents formulations of the various model options, and explains, by means of comprehensive sample input, how to use the code. (orig.)

  4. Semi-parametric arterial input functions for quantitative dynamic contrast enhanced magnetic resonance imaging in mice

    Czech Academy of Sciences Publication Activity Database

    Taxt, T.; Reed, R. K.; Pavlin, T.; Rygh, C. B.; Andersen, E.; Jiřík, Radovan

    2018-01-01

    Roč. 46, FEB (2018), s. 10-20 ISSN 0730-725X R&D Projects: GA ČR GA17-13830S; GA MŠk(CZ) LO1212 Institutional support: RVO:68081731 Keywords: DCE-MRI * blind deconvolution * arterial input function Subject RIV: FA - Cardiovascular Diseases incl. Cardiothoracic Surgery Impact factor: 2.225, year: 2016

  5. Mars 2.2 code manual: input requirements

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Lee, Won Jae; Jeong, Jae Jun; Lee, Young Jin; Hwang, Moon Kyu; Kim, Kyung Doo; Lee, Seung Wook; Bae, Sung Won

    2003-07-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art, realistic thermal-hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one-dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF code. The integration of the two codes is based on dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the Equation-of-State (EOS) for light water was unified by replacing the EOS of COBRA-TF with that of RELAP5. This input manual provides a complete list of the input required to run MARS. The manual is divided largely into two parts, namely the one-dimensional part and the multi-dimensional part. The inputs for auxiliary parts, such as minor edit requests and graph formatting inputs, are shared by the two parts, and as such mixed input is possible. The overall structure of the input is modeled on that of RELAP5, and as such the layout of the manual is very similar to that of the RELAP manual. This similarity to the RELAP5 input is intentional, as this input scheme allows minimum modification between the inputs of RELAP5 and MARS. The MARS development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible.

  6. MARS code manual volume II: input requirements

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Bae, Sung Won; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu

    2010-02-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art, realistic thermal-hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one-dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF code. The integration of the two codes is based on dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the Equation-Of-State (EOS) for light water was unified by replacing the EOS of COBRA-TF with that of RELAP5. This input manual provides a complete list of the input required to run MARS. The manual is divided largely into two parts, namely the one-dimensional part and the multi-dimensional part. The inputs for auxiliary parts, such as minor edit requests and graph formatting inputs, are shared by the two parts, and as such mixed input is possible. The overall structure of the input is modeled on that of RELAP5, and as such the layout of the manual is very similar to that of the RELAP manual. This similarity to the RELAP5 input is intentional, as this input scheme allows minimum modification between the inputs of RELAP5 and MARS3.1. The MARS3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible.

  7. The perspective of healthcare providers and patients on health literacy: a systematic review of the quantitative and qualitative studies.

    Science.gov (United States)

    Rajah, Retha; Ahmad Hassali, Mohamed Azmi; Jou, Lim Ching; Murugiah, Muthu Kumar

    2018-03-01

    Health literacy (HL) is a multifaceted concept; thus, understanding the perspectives of healthcare providers, patients, and the system is vital. This systematic review examines and synthesises the available studies on HL-related knowledge, attitude, practice, and perceived barriers. CINAHL and Medline (via EBSCOhost), Google Scholar, PubMed, ProQuest, Sage Journals, and Science Direct were searched. Quantitative and/or qualitative studies in the English language were included. Intervention studies and studies focusing on HL assessment tools and the prevalence of low HL were excluded. The risk of bias was reduced by having two reviewers independently assess study eligibility and quality. A total of 30 studies were included, consisting of 19 quantitative, 9 qualitative, and 2 mixed-method studies. Thirteen of 17 studies reported a deficiency of HL-related knowledge among healthcare providers, and 1 among patients. Three studies showed a positive attitude of healthcare providers towards learning about HL. Another three studies demonstrated that patients feel shame in exposing their literacy and undergoing HL assessment. Common HL communication techniques reportedly practiced by healthcare providers were the use of everyday language, the teach-back method, and providing patients with reading materials and aids, while time constraints were the perceived HL barrier most reported by both healthcare providers and patients. Significant gaps exist in HL knowledge among healthcare providers and patients that need immediate intervention, such as greater effort placed in creating a health system that provides an opportunity for healthcare providers to learn about HL and for patients to access health information, taking into consideration their perceived barriers.

  8. Input data required for specific performance assessment codes

    International Nuclear Information System (INIS)

    Seitz, R.R.; Garcia, R.S.; Starmer, R.J.; Dicke, C.A.; Leonard, P.R.; Maheras, S.J.; Rood, A.S.; Smith, R.W.

    1992-02-01

    The Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory generated this report on input data requirements for computer codes to assist States and compacts in their performance assessments. This report gives generators, developers, operators, and users some guidelines on what input data are required to satisfy 22 common performance assessment codes. Each of the codes is summarized, and a matrix table is provided to allow comparison of the various inputs required by the codes. This report does not determine or recommend which codes are preferable

  9. Post-BEMUSE Reflood Model Input Uncertainty Methods (PREMIUM) Benchmark Phase II: Identification of Influential Parameters

    International Nuclear Information System (INIS)

    Kovtonyuk, A.; Petruzzi, A.; D'Auria, F.

    2015-01-01

    The objective of the Post-BEMUSE Reflood Model Input Uncertainty Methods (PREMIUM) benchmark is to make progress on the issue of quantifying the uncertainty of the physical models in system thermal-hydraulic codes by considering a concrete case: the physical models involved in the prediction of core reflooding. The PREMIUM benchmark consists of five phases. This report presents the results of Phase II, dedicated to the identification of the uncertain code parameters associated with physical models used in the simulation of reflooding conditions. This identification is made on the basis of Test 216 of the FEBA/SEFLEX programme, according to the following steps: - identification of influential phenomena; - identification of the associated physical models and parameters, depending on the code used; - quantification of the variation range of the identified input parameters through a series of sensitivity calculations. A procedure for the identification of potentially influential code input parameters was set up in the Specifications of Phase II of the PREMIUM benchmark. A set of quantitative criteria was also proposed for the identification of influential input parameters (IP) and their respective variation ranges. Thirteen participating organisations, using 8 different codes (7 system thermal-hydraulic codes and 1 sub-channel module of a system thermal-hydraulic code), submitted Phase II results. The base-case calculations show a spread in predicted cladding temperatures and quench front propagation that has been characterized. All the participants except one predict a too-fast quench front progression. Besides, the cladding temperature time trends obtained by almost all the participants show oscillatory behaviour, which may have numeric origins. The criteria adopted for the identification of influential input parameters differ between the participants: some organisations used the set of criteria proposed in the Specifications 'as is', some modified the quantitative thresholds

  10. Providing Quantitative Information and a Nudge to Undergo Stool Testing in a Colorectal Cancer Screening Decision Aid: A Randomized Clinical Trial.

    Science.gov (United States)

    Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M

    2017-08-01

    Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not (P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not (P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not (P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT (P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.

  11. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The sizes of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  12. Making coarse grained polymer simulations quantitatively predictive for statics and dynamics

    Science.gov (United States)

    Kremer, Kurt

    2010-03-01

    By combining input from short simulation runs of rather small systems in full atomistic detail with properly adapted coarse-grained models, we are able to quantitatively predict static and especially dynamical properties, both of pure melts of long, fully entangled polymers and of systems with low-molecular-weight additives. Comparisons to rather different experiments, such as diffusion constant measurements or NMR relaxation experiments, show remarkable quantitative agreement without any adjustable parameter. Reintroduction of chemical details into the coarse-grained trajectories allows the study of long-time trajectories in full atomistic detail, providing the opportunity for rather different means of data analysis. References: V. Harmandaris, K. Kremer, Macromolecules, in press (2009); V. Harmandaris et al., Macromolecules 40, 7026 (2007); B. Hess, S. Leon, N. van der Vegt, K. Kremer, Soft Matter 2, 409 (2006); D. Fritz et al., Soft Matter 5, 4556 (2009)

  13. A Quantitative Version of a Theorem due to Borwein-Reich-Shafrir

    DEFF Research Database (Denmark)

    Kohlenbach, Ulrich

    2001-01-01

    We give a quantitative analysis of a result due to Borwein, Reich and Shafrir on the asymptotic behaviour of the general Krasnoselski-Mann iteration for nonexpansive self-mappings of convex sets in arbitrary normed spaces. Besides providing explicit bounds we also get new qualitative results...... bounds were known in that bounded case. For the unbounded case, no quantitative information was known before. Our results were obtained in a case study of analysing non-effective proofs in analysis by certain logical methods. General logical meta-theorems of the author guarantee (at least under some...... concerning the independence of the rate of asymptotic regularity of that iteration from various input data. In the special case of bounded convex sets, where by well-known results of Ishikawa, Edelstein/O'Brien and Goebel/Kirk the norm of the iteration converges to zero, we obtain uniform bounds which do...

  14. TART input manual

    International Nuclear Information System (INIS)

    Kimlinger, J.R.; Plechaty, E.F.

    1982-01-01

    The TART code is a Monte Carlo neutron/photon transport code that runs only on the CRAY computer. All the input cards for the TART code are listed, and definitions for all input parameters are given. The execution and limitations of the code are described, and input for two sample problems is given.

  15. A combined usage of stochastic and quantitative risk assessment methods in the worksites: Application on an electric power provider

    International Nuclear Information System (INIS)

    Marhavilas, P.K.; Koulouriotis, D.E.

    2012-01-01

    An individual method can build neither a realistic forecasting model nor a risk assessment process for worksites; future perspectives should therefore focus on combined forecasting/estimation approaches. The main purpose of this paper is to gain insight into a risk prediction and estimation methodological framework that combines three different methods: the proportional quantitative-risk-assessment technique (PRAT), the time-series stochastic process (TSP), and the method of estimating societal risk (SRE) by F–N curves. To demonstrate the usefulness of this combined usage of stochastic and quantitative risk assessment methods, an application to an electric power provider is presented, using empirical data.
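    Of the three combined methods, the societal-risk estimation via F–N curves is the easiest to sketch: F(N) is the cumulative annual frequency of accidents causing N or more fatalities. The event data below are invented for illustration and are not from the paper:

```python
# Toy societal-risk (F-N) calculation: F(N) = annual frequency of accidents
# with N or more fatalities. Event list is illustrative only.

def fn_curve(events):
    """events: list of (annual_frequency, fatalities). Returns sorted (N, F(N)) pairs."""
    ns = sorted({n for _, n in events})
    return [(n, sum(f for f, m in events if m >= n)) for n in ns]

events = [(1e-2, 1), (1e-3, 10), (1e-5, 100)]
curve = fn_curve(events)
# F(1) = 1e-2 + 1e-3 + 1e-5, F(10) = 1e-3 + 1e-5, F(100) = 1e-5
print(curve)
```

Plotting these (N, F(N)) pairs on log-log axes gives the familiar F–N diagram against which societal-risk acceptance criteria are drawn.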

  16. Sensory Synergy as Environmental Input Integration

    Directory of Open Access Journals (Sweden)

    Fady eAlnajjar

    2015-01-01

    Full Text Available The development of a method to feed proper environmental inputs back to the central nervous system (CNS) remains one of the challenges in achieving natural movement when part of the body is replaced with an artificial device. Muscle synergies are widely accepted as a biologically plausible interpretation of the neural dynamics between the CNS and the muscular system. Yet the sensorineural dynamics of environmental feedback to the CNS has not been investigated in detail. In this study, we address this issue by exploring the concept of sensory synergy. In contrast to muscle synergy, we hypothesize that sensory synergy plays an essential role in integrating the overall environmental inputs to provide low-dimensional information to the CNS. We assume that sensory synergy and muscle synergy communicate using these low-dimensional signals. To examine our hypothesis, we conducted posture-control experiments involving lateral disturbance with nine healthy participants. Proprioceptive information, represented by changes in muscle lengths, was estimated using the musculoskeletal model analysis software SIMM. The changes in muscle lengths were then used to compute sensory synergies. The experimental results indicate that the environmental inputs were translated into two-dimensional signals and used to move the upper limb to the desired position immediately after the lateral disturbance. Participants who showed high skill in posture control were likely to have a strong correlation between sensory and muscle signaling, as well as high coordination between the utilized sensory synergies. These results suggest the importance of integrating environmental inputs into suitable low-dimensional signals before providing them to the CNS. This mechanism should be essential when designing a prosthesis' sensory system to make the controller simpler.

  17. Sensory synergy as environmental input integration.

    Science.gov (United States)

    Alnajjar, Fady; Itkonen, Matti; Berenz, Vincent; Tournier, Maxime; Nagai, Chikara; Shimoda, Shingo

    2014-01-01

    The development of a method to feed proper environmental inputs back to the central nervous system (CNS) remains one of the challenges in achieving natural movement when part of the body is replaced with an artificial device. Muscle synergies are widely accepted as a biologically plausible interpretation of the neural dynamics between the CNS and the muscular system. Yet the sensorineural dynamics of environmental feedback to the CNS has not been investigated in detail. In this study, we address this issue by exploring the concept of sensory synergy. In contrast to muscle synergy, we hypothesize that sensory synergy plays an essential role in integrating the overall environmental inputs to provide low-dimensional information to the CNS. We assume that sensory synergy and muscle synergy communicate using these low-dimensional signals. To examine our hypothesis, we conducted posture-control experiments involving lateral disturbance with nine healthy participants. Proprioceptive information, represented by changes in muscle lengths, was estimated using the musculoskeletal model analysis software SIMM. The changes in muscle lengths were then used to compute sensory synergies. The experimental results indicate that the environmental inputs were translated into two-dimensional signals and used to move the upper limb to the desired position immediately after the lateral disturbance. Participants who showed high skill in posture control were likely to have a strong correlation between sensory and muscle signaling, as well as high coordination between the utilized sensory synergies. These results suggest the importance of integrating environmental inputs into suitable low-dimensional signals before providing them to the CNS. This mechanism should be essential when designing the prosthesis' sensory system to make the controller simpler.
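    The kind of dimensionality reduction described, from many muscle-length signals down to two "sensory synergy" signals, can be sketched with PCA. The snippet below uses synthetic data and SVD-based PCA as a generic stand-in; the study's actual muscle lengths came from SIMM, and its exact synergy-extraction algorithm is not reproduced here:

```python
import numpy as np

# Illustrative synergy extraction: reduce simulated muscle-length changes
# (time x muscles) to two low-dimensional components via PCA (SVD).
# All data here are synthetic stand-ins for the SIMM-derived muscle lengths.

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
# Two hidden low-dimensional drive signals mixed into 8 "muscles"
drives = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
mixing = rng.normal(size=(2, 8))
lengths = drives.T @ mixing + 0.01 * rng.normal(size=(200, 8))

X = lengths - lengths.mean(axis=0)            # center the data
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S**2 / np.sum(S**2)               # variance fraction per component
synergies = Vt[:2]                            # two synergy vectors (muscle weights)
signals = X @ synergies.T                     # the two-dimensional signals

print(explained[:2].sum())                    # ~1.0: two components suffice
```

With two underlying drives, the first two components capture nearly all the variance, mirroring the paper's finding that the environmental inputs reduce to two-dimensional signals.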

  18. Input-output supervisor

    International Nuclear Information System (INIS)

    Dupuy, R.

    1970-01-01

    The input-output supervisor is the program which monitors the flow of information between core storage and the peripheral equipment of a computer. This work is composed of three parts: 1 - study of a generalized input-output supervisor; with simple modifications it resembles most of the input-output supervisors currently running on computers; 2 - application of this theory to a magnetic drum; 3 - hardware requirements for time-sharing. (author) [fr]

  19. Input data for inferring species distributions in Kyphosidae world-wide

    Directory of Open Access Journals (Sweden)

    Steen Wilhelm Knudsen

    2016-09-01

    Full Text Available Input data files for inferring the relationships within the family Kyphosidae, as presented in Knudsen and Clements (2016) [1], are provided here together with the resulting topologies, to allow the reader to explore the topologies in detail. The input data files comprise seven nexus files with sequence alignments of mtDNA and nDNA markers for performing Bayesian analysis. A matrix of recoded character states, inferred from the morphology of museum specimens representing Dichistiidae, Girellidae, Kyphosidae, Microcanthidae and Scorpididae, is also provided, and can be used for a parsimony analysis to infer the relationships among these perciform families. The nucleotide input data files comprise both multiple and single representatives of the various species, to allow inference of the relationships among the species in Kyphosidae and between the families closely related to Kyphosidae. The '.xml' files with various constrained relationships among the families potentially closely related to Kyphosidae are also provided, to allow the reader to rerun and explore the results of the stepping-stone analysis. The resulting topologies are supplied in newick file format, together with the input data files for Bayesian analysis and the '.xml' files. Re-running the input data files in the appropriate software will enable the reader to examine the log files and tree files themselves. Keywords: Sea chub, Drummer, Kyphosus, Scorpis, Girella

  20. A guidance on MELCOR input preparation : An input deck for Ul-Chin 3 and 4 Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Song Won

    1997-02-01

    The objective of this study is to enhance the capability of assessing severe accident sequences and containment behavior using the MELCOR computer code, and to provide a guideline for its efficient use. This report shows the method of input deck preparation as well as the assessment strategy for the MELCOR code. MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. The code is being developed at Sandia National Laboratories for the U.S. NRC as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. The accident sequence of the reference input deck prepared in this study for the Ulchin unit 3 and 4 nuclear power plants is the total loss of feedwater (TLOFW) without any success of safety systems, which is similar to station blackout (TLMB). It is very useful to simulate a well-known sequence with a best-estimate code or experiment, because the results of the simulation before core melt can be compared with the FSAR; no data are available after core melt. The precalculation for the TLOFW using the reference input deck was performed successfully, as expected. The other sequences will be carried out with minor changes to the reference input. This input deck will be improved continually by adding the safety systems not yet included, and through sensitivity and uncertainty analyses. (author). 19 refs., 10 tabs., 55 figs.

  1. A guidance on MELCOR input preparation : An input deck for Ul-Chin 3 and 4 Nuclear Power Plant

    International Nuclear Information System (INIS)

    Cho, Song Won.

    1997-02-01

    The objective of this study is to enhance the capability of assessing severe accident sequences and containment behavior using the MELCOR computer code, and to provide a guideline for its efficient use. This report shows the method of input deck preparation as well as the assessment strategy for the MELCOR code. MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. The code is being developed at Sandia National Laboratories for the U.S. NRC as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. The accident sequence of the reference input deck prepared in this study for the Ulchin unit 3 and 4 nuclear power plants is the total loss of feedwater (TLOFW) without any success of safety systems, which is similar to station blackout (TLMB). It is very useful to simulate a well-known sequence with a best-estimate code or experiment, because the results of the simulation before core melt can be compared with the FSAR; no data are available after core melt. The precalculation for the TLOFW using the reference input deck was performed successfully, as expected. The other sequences will be carried out with minor changes to the reference input. This input deck will be improved continually by adding the safety systems not yet included, and through sensitivity and uncertainty analyses. (author). 19 refs., 10 tabs., 55 figs.

  2. The UK waste input-output table: Linking waste generation to the UK economy.

    Science.gov (United States)

    Salemdeeb, Ramy; Al-Tabbaa, Abir; Reynolds, Christian

    2016-10-01

    In order to achieve a circular economy, there must be a greater understanding of the links between economic activity and waste generation. This study introduces the first version of the UK waste input-output table, which can be used to quantify both direct and indirect waste arisings across the supply chain. The proposed waste input-output table features 21 industrial sectors and 34 waste types, for the 2010 time period. Using the waste input-output table, the study results quantitatively confirm that sectors with a long supply chain (i.e. manufacturing and services sectors) have higher indirect waste generation rates than industrial primary sectors (e.g. mining and quarrying) and sectors with a shorter supply chain (e.g. construction). Results also reveal that the construction sector and the mining and quarrying sector have the highest waste generation rates: 742 and 694 tonnes per £1m of final demand, respectively. Owing to the aggregated format of this first version of the waste input-output table, the model does not address the relationship between waste generation and recycling activities; an updated version of the waste input-output table addressing this issue is therefore expected to be developed. The expanded model would lead to a better understanding of waste and resource flows in the supply chain. © The Author(s) 2016.

  3. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Key words: Polymerase chain reaction (PCR), quantitative real-time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger.

  4. WetLab-2: Providing Quantitative PCR Capabilities on ISS

    Science.gov (United States)

    Parra, Macarena; Jung, Jimmy Kar Chuen; Almeida, Eduardo; Boone, Travis David; Schonfeld, Julie; Tran, Luan Hoang

    2015-01-01

    The objective of NASA Ames Research Center's WetLab-2 Project is to place on the ISS a system capable of conducting gene expression analysis via quantitative real-time PCR (qRT-PCR) of biological specimens sampled or cultured on orbit. The WetLab-2 system is capable of processing sample types ranging from microbial cultures to animal tissues dissected on-orbit. The project has developed an RNA preparation module that can lyse cells and extract RNA of sufficient quality and quantity for use as templates in qRT-PCR reactions. Our protocol has the advantage that it uses no toxic chemicals, alcohols or other organics. The resulting RNA is transferred into a pipette and then dispensed into reaction tubes that contain all the lyophilized reagents needed to perform qRT-PCR reactions. These reaction tubes are mounted on rotors, and the liquid is centrifuged to the reaction window of the tube using a cordless drill. System operations require simple and limited crew actions, including syringe pushes, valve turns and pipette dispenses; the resulting process takes less than 30 min to have tubes ready for loading into the qRT-PCR unit. The project has selected a Commercial-Off-The-Shelf (COTS) qRT-PCR unit, the Cepheid SmartCycler, that will fly in its COTS configuration. The SmartCycler has a number of advantages, including a modular design (16 independent PCR modules), low power consumption, rapid thermal ramp times and four-color detection. The ability to detect up to four fluorescent channels will enable multiplex assays that can be used to normalize for RNA concentration and integrity, and to study multiple genes of interest in each module. The WetLab-2 system will have the capability to downlink data from the ISS to the ground after a completed run and to uplink new programs. The ability to conduct qRT-PCR on-orbit eliminates the confounding effects on gene expression of reentry stresses and shock acting on live cells and organisms, as well as the concern of RNA degradation in fixed samples.

  5. Quantitative analysis of the publishing landscape in High-Energy Physics

    CERN Document Server

    Mele, S; Vigen, Jens; Yeomans, Joanne

    2006-01-01

    World-wide collaboration in high-energy physics (HEP) is a tradition dating back several decades, with scientific publications mostly coauthored by scientists from different countries. This coauthorship phenomenon makes it difficult to identify precisely the "share" of each country in HEP scientific production. One year's worth of HEP scientific articles published in peer-reviewed journals is analysed and their authors are uniquely assigned to countries. This method allows the first correct estimation, on a pro rata basis, of the share of HEP scientific publishing among several countries and institutions. The results provide an interesting insight into the geographical collaborative patterns of the HEP community. The HEP publishing landscape is further analysed to provide information on the journals favoured by the HEP community and on the geographical variation of their author bases. These results provide quantitative input to the ongoing debate on the possible transition of HEP publishing to an Open Access…
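    The pro-rata attribution described can be sketched very simply: each paper counts as one unit, split equally among its authors, who each carry a country tag. The sample records below are invented for illustration:

```python
from collections import defaultdict

# Pro-rata country shares of a publication corpus: each paper contributes
# one unit, divided equally among its authors' country affiliations.
# The sample "papers" are invented.

def country_shares(papers):
    shares = defaultdict(float)
    for authors in papers:               # authors: list of country codes
        for country in authors:
            shares[country] += 1.0 / len(authors)
    return dict(shares)

papers = [["CH", "DE"], ["US"], ["DE", "DE", "IT"]]
print(country_shares(papers))
```

Summed over a full year of articles, such shares give each country's fractional contribution while the totals still add up to the number of papers.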

  6. Measuring the Efficiency of Financial Inputs for Entrepreneurship

    OpenAIRE

    Lakshmi Balasubramanyan

    2009-01-01

    This study employs data on small businesses from the Office of Advocacy of the U.S. Small Business Administration, along with Federal Deposit Insurance Corporation Call Report data for U.S. commercial banks. It examines the efficiency with which financial inputs translate into small-business entrepreneurial output, and provides a metric to capture the financial-input efficiency of the entrepreneurial process. The metric obtained from this analysis is useful for identification and adoption…

  7. Conceptual Design of GRIG (GUI Based RETRAN Input Generator)

    International Nuclear Information System (INIS)

    Lee, Gyung Jin; Hwang, Su Hyun; Hong, Soon Joon; Lee, Byung Chul; Jang, Chan Su; Um, Kil Sup

    2007-01-01

    For the development of a high-performance methodology using an advanced transient analysis code, it is essential to generate the basic input of the transient analysis code under rigorous QA procedures. There are various types of operating NPPs (Nuclear Power Plants) in Korea, such as Westinghouse plants, KSNP (Korea Standard Nuclear Power Plant), and APR1400 (Advanced Power Reactor), so it is difficult to systematically generate and manage transient-analysis-code inputs reflecting the inherent characteristics of each type of NPP. To minimize user faults and manpower investment, and to generate the basic inputs of the transient analysis code effectively and accurately for all domestic NPPs, a program is needed that can automatically generate, from the NPP design material, a basic input that can be directly applied to transient analysis. ViRRE (Visual RETRAN Running Environment), developed by KEPCO (Korea Electric Power Corporation) and KAERI (Korea Atomic Energy Research Institute), provides a convenient working environment for Kori Units 1/2. ViRRE shows the calculated results through an on-line display, but its capability is limited to the convenient execution of RETRAN, so it cannot be used as an input generator. ViSA (Visual System Analyzer), developed by KAERI, is an NPA (Nuclear Plant Analyzer) using the RETRAN and MARS codes as its thermal-hydraulic engines. ViSA contains both pre-processing and post-processing functions, but in pre-processing only the trip data cards and boundary conditions can be changed through the GUI, based on a pre-prepared text input, so its input-generation capability is very limited. SNAP (Symbolic Nuclear Analysis Package), developed by Applied Programming Technology, Inc. and the NRC (Nuclear Regulatory Commission), provides an efficient working environment for nuclear safety analysis codes such as the RELAP5 and TRAC-M codes. SNAP covers wide aspects of thermal-hydraulic analysis, from model creation through data analysis.

  8. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    Science.gov (United States)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

    Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten people's health, so fast and sensitive techniques for detecting these residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was used in this paper for the quantitative determination of AO, combined with an improved partial least-squares regression (PLSR) model. The absorbance of herbal samples with different concentrations was obtained by THz time-domain spectroscopy (THz-TDS) in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between concentrations, provided a clear criterion for selecting the input interval, and improved the accuracy of the detection result. The experimental results indicate that the combination of THz spectroscopy and 2DCOS-PLSR is an excellent quantitative analysis method.
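    As a rough sketch of the calibration step, the snippet below implements a minimal PLS1 (NIPALS) regression on synthetic THz spectra. The peak shape, noise level and concentrations are invented, and the paper's 2DCOS-based interval selection is not reproduced:

```python
import numpy as np

def pls1_fit_predict(X, y, Xnew, ncomp=2):
    """Minimal PLS1 (NIPALS) regression sketch: fit on (X, y), predict for Xnew."""
    xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - xm, y - ym
    W, P, Q = [], [], []
    for _ in range(ncomp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector
        t = Xc @ w                        # score vector
        tt = t @ t
        p = Xc.T @ t / tt                 # X loading
        q = yc @ t / tt                   # y loading
        Xc = Xc - np.outer(t, p)          # deflate
        yc = yc - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)   # regression coefficients
    return (Xnew - xm) @ B + ym

# Synthetic "THz spectra": one absorption feature scaling with concentration.
rng = np.random.default_rng(1)
freq = np.linspace(0.2, 1.6, 50)                   # THz axis
peak = np.exp(-((freq - 0.8) ** 2) / 0.01)         # invented absorption band
conc = rng.uniform(0, 1, 30)
spectra = np.outer(conc, peak) + 0.01 * rng.normal(size=(30, 50))

pred = pls1_fit_predict(spectra, conc, spectra, ncomp=2)
print(np.corrcoef(pred, conc)[0, 1])               # close to 1
```

On this nearly rank-one synthetic data the predicted concentrations correlate almost perfectly with the true ones; the paper's 2DCOS step would additionally restrict the model to the most concentration-sensitive spectral interval.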

  9. Sound effects: Multimodal input helps infants find displaced objects.

    Science.gov (United States)

    Shinskey, Jeanne L

    2017-09-01

    Before 9 months, infants use sound to retrieve a stationary object hidden by darkness but not one hidden by occlusion, suggesting auditory input is more salient in the absence of visual input. This article addresses how audiovisual input affects 10-month-olds' search for displaced objects. In AB tasks, infants who previously retrieved an object at A subsequently fail to find it after it is displaced to B, especially following a delay between hiding and retrieval. Experiment 1 manipulated auditory input by keeping the hidden object audible versus silent, and visual input by presenting the delay in the light versus dark. Infants succeeded more at B with audible than silent objects and, unexpectedly, more after delays in the light than in the dark. Experiment 2 presented both the delay and search phases in darkness. The unexpected light-dark difference disappeared. Across experiments, the presence of auditory input helped infants find displaced objects, whereas the absence of visual input did not. Sound might help by strengthening object representation, reducing memory load, or focusing attention. This work provides new evidence on when bimodal input aids object processing, corroborates claims that audiovisual processing improves over the first year of life, and contributes to multisensory approaches to studying cognition. Statement of contribution: What is already known on this subject: Before 9 months, infants use sound to retrieve a stationary object hidden by darkness but not one hidden by occlusion. This suggests they find auditory input more salient in the absence of visual input in simple search tasks. After 9 months, infants' object processing appears more sensitive to multimodal (e.g., audiovisual) input. What does this study add? This study tested how audiovisual input affects 10-month-olds' search for an object displaced in an AB task. Sound helped infants find displaced objects in both the presence and absence of visual input. Object processing becomes more…

  10. Simplified quantitative treatment of uncertainty and interindividual variability in health risk assessment

    International Nuclear Information System (INIS)

    Bogen, K.T.

    1993-01-01

    A distinction between uncertainty (the extent of lack of knowledge) and interindividual variability (the extent of person-to-person heterogeneity) in the values of input variates must be maintained if a quantitative characterization of uncertainty in population risk or in individual risk is sought. Here, some practical methods are presented that should facilitate implementation of the analytic framework for uncertainty and variability proposed by Bogen and Spear. (1,2) Two types of methodology are discussed: one that facilitates the distinction between uncertainty and variability per se, and another that may be used to simplify quantitative analysis of distributed inputs representing either uncertainty or variability. A simple and a complex form of modeled increased risk are presented and then used to illustrate methods facilitating the distinction between uncertainty and variability in the characterization of both population and individual risk. Finally, a simple form of discrete probability calculus is proposed as an easily implemented, practical alternative to Monte Carlo-based procedures for the quantitative integration of uncertainty and variability in risk assessment.
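    One common way to keep the two dimensions distinct, shown here only as a generic illustration and not as the paper's discrete probability calculus, is a nested ("two-dimensional") Monte Carlo: an outer loop samples uncertain model parameters, an inner loop samples person-to-person variability. The risk model and distributions below are invented:

```python
import random
import statistics

# Nested two-dimensional Monte Carlo sketch: outer loop = uncertainty
# (imperfectly known potency), inner loop = interindividual variability
# (person-to-person dose). Model and distributions are illustrative only.

random.seed(0)
outer, inner = 200, 500
pop_risks = []
for _ in range(outer):
    potency = random.lognormvariate(mu=-3.0, sigma=0.5)      # uncertain parameter
    doses = [random.lognormvariate(mu=0.0, sigma=0.8)        # variable exposure
             for _ in range(inner)]
    pop_risks.append(statistics.mean(min(potency * d, 1.0) for d in doses))

# Each outer draw yields one population-average risk; the spread of pop_risks
# reflects uncertainty only, variability having been averaged over individuals.
print(statistics.median(pop_risks))
```

Reporting a percentile band of `pop_risks` separates "what we do not know" from "how people differ", which is exactly the distinction the paper insists on preserving.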

  11. ColloInputGenerator

    DEFF Research Database (Denmark)

    2013-01-01

    This is a very simple program to help you put together input files for use in Gries' (2007) R-based collostruction analysis program. It basically puts together a text file with a frequency list of lexemes in the construction and inserts a column where you can add the corpus frequencies. It requires … it as input for basic collexeme collostructional analysis (Stefanowitsch & Gries 2003) in Gries' (2007) program. ColloInputGenerator is, in its current state, based on programming commands introduced in Gries (2009). Projected updates: generation of complete, work-ready frequency lists.
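    In any language, the core of such a generator is a frequency table with a blank column for corpus frequencies. A Python sketch of the idea follows; the column layout and token data are assumptions, not Gries' actual file format:

```python
from collections import Counter

# Sketch of an input-file generator for collostructional analysis: tabulate
# lexeme frequencies observed in a construction and emit a tab-separated
# list with an empty column to be filled with corpus frequencies.
# Token data and column names are invented for illustration.

tokens = ["give", "send", "give", "tell", "send", "give"]
freq = Counter(tokens)

lines = ["lexeme\tfreq_in_cx\tfreq_in_corpus"]
for lexeme, n in freq.most_common():
    lines.append(f"{lexeme}\t{n}\t")      # corpus column left blank to fill in

print("\n".join(lines))
```

Writing `lines` to a text file yields the two filled columns plus the empty corpus-frequency column that the analyst completes before running the collostructional analysis.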

  12. Self-Structured Organizing Single-Input CMAC Control for Robot Manipulator

    Directory of Open Access Journals (Sweden)

    ThanhQuyen Ngo

    2011-09-01

    Full Text Available This paper presents a self-structured organizing single-input control system based on a differentiable cerebellar model articulation controller (CMAC) for an n-link robot manipulator, to achieve high-precision position tracking. In the proposed scheme, the single-input CMAC controller is solely used to control the plant, so the input-space dimension of the CMAC can be simplified and no conventional controller is needed. The structure of the single-input CMAC is also self-organized; that is, the layers of the single-input CMAC grow or prune systematically and their receptive functions are automatically adjusted. The online tuning laws of the single-input CMAC parameters are derived using the gradient-descent learning method, and a discrete-type Lyapunov function is applied to determine the learning rates of the proposed control system so that its stability can be guaranteed. Simulation results for a robot manipulator are provided to verify the effectiveness of the proposed control methodology.
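    A minimal fixed-structure single-input CMAC can be sketched as overlapping quantized receptive fields whose weights are tuned by gradient descent. The self-organizing growth/pruning, differentiable receptive functions and Lyapunov-derived learning rates of the paper are all omitted, and every parameter below is invented:

```python
import random

# Minimal single-input CMAC sketch: each of 8 layers quantizes the scalar
# input with a shifted grid; the output is the sum of one weight per layer,
# trained by gradient descent. Structure is fixed (no growing/pruning).

N_LAYERS, N_CELLS, LO, HI = 8, 32, -1.0, 1.0
SPAN = (HI - LO) / N_CELLS
weights = [0.0] * (N_LAYERS * (N_CELLS + 1))

def active_cells(x):
    """Indices of the cells activated by input x, one per shifted layer."""
    idx = []
    for layer in range(N_LAYERS):
        offset = layer * SPAN / N_LAYERS          # shifted quantization per layer
        cell = int((x - LO + offset) / SPAN)
        idx.append(layer * (N_CELLS + 1) + min(cell, N_CELLS))
    return idx

def cmac_output(x):
    return sum(weights[i] for i in active_cells(x))

def cmac_train(x, target, lr=0.5):
    err = target - cmac_output(x)
    for i in active_cells(x):
        weights[i] += lr * err / N_LAYERS         # spread correction over layers

# Learn the toy map y = x^2 on [-1, 1].
random.seed(0)
for _ in range(5000):
    x = random.uniform(-1, 1)
    cmac_train(x, x * x)

print(abs(cmac_output(0.5) - 0.25))               # small approximation error
```

Because only the few active cells are updated per sample, learning is local and fast, which is the property that makes CMAC attractive for online robot control.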

  13. Incorporating uncertainty in RADTRAN 6.0 input files.

    Energy Technology Data Exchange (ETDEWEB)

    Dennis, Matthew L.; Weiner, Ruth F.; Heames, Terence John (Alion Science and Technology)

    2010-02-01

    Uncertainty may be introduced into RADTRAN analyses by distributing input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the shape and the minimum and maximum of each parameter's distribution, to sample from the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporating uncertainty into RADTRAN. Gauntt and Erickson (2004) provide installation instructions as well as a description and user guide for the uncertainty engine.
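    The sampling step can be sketched generically: draw each distributed parameter (for example from a triangular distribution defined by its minimum, mode and maximum) and emit one entry per batch run. The parameter name and line format below are placeholders, not actual RADTRAN card syntax:

```python
import random

# Generic sketch of distributing one input parameter over a batch of runs:
# sample a triangular(min, mode, max) distribution and emit one line per run.
# "RELEASE_FRACTION" and the line format are placeholders, not RADTRAN syntax.

random.seed(42)

def sample_batch(name, low, mode, high, n_runs):
    lines = []
    for run in range(1, n_runs + 1):
        value = random.triangular(low, high, mode)
        lines.append(f"RUN {run}: {name} = {value:.4g}")
    return lines

batch = sample_batch("RELEASE_FRACTION", 1e-4, 1e-3, 1e-2, 5)
print("\n".join(batch))
```

Running the transport code once per sampled line and collecting the outputs gives the distribution of results from which uncertainty statistics are drawn.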

  14. Investing in biogas: Timing, technological choice and the value of flexibility from input mix

    International Nuclear Information System (INIS)

    Di Corato, Luca; Moretto, Michele

    2011-01-01

    In a stochastic dynamic framework, we study the technology choice problem of a continuous co-digestion biogas plant whose input factors are substitutes but must be mixed together to provide output. Given any initial rule for the composition of the feedstock, we consider the possibility of revising it if economic circumstances make this profitable. Flexibility in the mix is an advantage under randomly fluctuating input costs, but comes at a higher investment cost. We show that the degree of flexibility in the installed productive technology depends on the value of the option to profitably re-arrange the input mix. This option adds value to the project in that it provides a device for hedging against fluctuations in the relative convenience of the inputs. Accounting for this value, we discuss the trade-off between investment timing and profit-smoothing flexibility. - Research highlights: ► We study the technology choice problem of a continuous co-digestion biogas plant where input factors are substitutes but need to be mixed together to provide output. ► We show that the degree of flexibility in the productive technology installed depends on the value of the option to profitably re-arrange the input mix. ► Such an option adds value to the project in that it provides a device for hedging against fluctuations in the relative convenience of the inputs.
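    The value of input-mix flexibility can be illustrated with a static toy calculation, far simpler than the paper's stochastic dynamic model: a fully flexible plant pays the expectation of the cheaper input, E[min(c1, c2)], while a fixed-mix plant pays at best min(E[c1], E[c2]), and by Jensen's inequality the former is smaller. The cost distributions below are invented:

```python
import random
import statistics

# Toy illustration of the hedging value of input-mix flexibility under
# random input costs. Independent lognormal costs are invented stand-ins
# for the fluctuating feedstock prices in the paper's model.

random.seed(7)
draws = [(random.lognormvariate(0, 0.4), random.lognormvariate(0, 0.4))
         for _ in range(100_000)]

flexible = statistics.mean(min(c1, c2) for c1, c2 in draws)   # switch each period
fixed = min(statistics.mean(c1 for c1, _ in draws),           # commit to one mix
            statistics.mean(c2 for _, c2 in draws))

print(fixed - flexible)   # positive: the option to switch inputs saves cost
```

The positive gap is the per-unit value of the switching option; in the paper this value is traded off against the higher investment cost of the flexible technology and the timing of the investment.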

  15. Agricultural input subsidies in sub-Saharan Africa – the case of Tanzania

    OpenAIRE

    Kato, Tamahi

    2016-01-01

    This thesis investigates the design, implementation and impacts of the market-smart input subsidy (NAIVS) in Tanzania's Ruvuma Region. The research uses a mixed-methods approach, in which quantitative data analysis is complemented by qualitative research. Using four waves of household panel data, I found that voucher receipt had no statistically significant impact on maize yields, income poverty or the household assets owned by recipient households. The qualitative research finds that this was…

  16. Nuclear magnetic resonance provides a quantitative description of protein conformational flexibility on physiologically important time scales.

    Science.gov (United States)

    Salmon, Loïc; Bouvignies, Guillaume; Markwick, Phineus; Blackledge, Martin

    2011-04-12

    A complete description of biomolecular activity requires an understanding of the nature and the role of protein conformational dynamics. In recent years, novel nuclear magnetic resonance-based techniques that provide hitherto inaccessible detail concerning biomolecular motions occurring on physiologically important time scales have emerged. Residual dipolar couplings (RDCs) provide precise information about time- and ensemble-averaged structural and dynamic processes with correlation times up to the millisecond and thereby encode key information for understanding biological activity. In this review, we present the application of two very different approaches to the quantitative description of protein motion using RDCs. The first is purely analytical, describing backbone dynamics in terms of diffusive motions of each peptide plane, using extensive statistical analysis to validate the proposed dynamic modes. The second is based on restraint-free accelerated molecular dynamics simulation, providing statistically sampled free energy-weighted ensembles that describe conformational fluctuations occurring on time scales from pico- to milliseconds, at atomic resolution. Remarkably, the results from these two approaches converge closely in terms of distribution and absolute amplitude of motions, suggesting that this kind of combination of analytical and numerical models is now capable of providing a unified description of protein conformational dynamics in solution.

  17. Input measurements in reprocessing plants

    International Nuclear Information System (INIS)

    Trincherini, P.R.; Facchetti, S.

    1980-01-01

    The aim of this work is to give a review of the methods and problems encountered in measurements in the 'input accountability tanks' of irradiated fuel treatment plants. This study was prompted by the conviction that more and more precise techniques and methods should be at the service of safeguards organizations, and that ever greater efforts should be directed towards promoting knowledge of them among operators and all those whose general area of interest includes the nuclear fuel cycle. The overall intent is to show the necessity of selecting methods which produce measurements that are not only more precise but absolutely reliable, both for routine plant operation and for safety checks in the input area. A description and a critical evaluation of the most common physical and chemical methods are provided, together with an estimate of the precision and accuracy obtained under real operating conditions.

  18. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  19. Comparing in-service multi-input loads applied on non-stiff components submitted to vibration fatigue to provide specifications for robust design

    Directory of Open Access Journals (Sweden)

    Le Corre Gwenaëlle

    2018-01-01

    Full Text Available This study focuses on applications from the automotive industry, on mechanical components submitted to vibration loads. On the one hand, the characterization of loading for dimensioning new structures in fatigue is enriched and updated by customer data analysis. On the other hand, the load characterization also aims to provide robust specifications for simulation or physical tests. These specifications are needed early in a project, in order to perform the first durability verification activities, at a time when detailed information about the geometry and the material is rare. Vibration specifications need to be adapted to a calculation time or physical test duration in accordance with the pace imposed by the project timeframe. In the truck industry, the dynamic behaviour can vary significantly from one configuration of truck to another, as the truck architecture impacts the load environment of the components. The vibration specifications therefore need to be robust by accounting for the diversity of vehicles and markets considered in the scope of the projects. For non-stiff structures, the lifetime depends, among other things, on the frequency content of the loads, as well as on the interactions between the components of the multi-input loads. In this context, this paper proposes an approach to compare sets of variable-amplitude multi-input loads applied on non-stiff structures. The comparison is done in terms of damage, with limited information on the structure to which the load sets are applied. The methodology is presented, along with an application. Activities planned to validate the methodology are also presented.

  20. Comparison Study on Empirical Correlation for Mass Transfer Coefficient with Gas Hold-up and Input Power of Aeration Process

    International Nuclear Information System (INIS)

    Park, Sang Kyoo; Yang, Hei Cheon

    2017-01-01

    As stricter environmental regulations have led to an increase in water treatment costs, it is necessary to quantitatively study the input power of the aeration process to improve the energy efficiency of water treatment processes. The objective of this study is to propose empirical correlations for the mass transfer coefficient with the gas hold-up and input power, in order to investigate the mass transfer characteristics of the aeration process. It was found that as the input power increases, the mass transfer coefficient increases because of the decrease of gas hold-up and the increase of Reynolds number, penetration length, and dispersion of the mixed flow. The correlations for the volumetric mass transfer coefficients with gas hold-up and input power were consistent with the experimental data, with a maximum deviation of less than approximately ±10.0%.
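Empirical mass transfer correlations of this kind are commonly expressed as power laws in the specific input power and gas hold-up. The sketch below shows the general form only; the prefactor and exponents are purely illustrative placeholders, not the coefficients fitted in the study.

```python
# Hypothetical power-law correlation of the general form used for aeration:
#   kLa = A * (P/V)**B * eps_g**C
# where P/V is the specific input power (W/m^3) and eps_g is the gas
# hold-up. A, B, C below are illustrative, not fitted values.
A, B, C = 0.02, 0.6, 0.4

def kla(power_per_volume, gas_holdup):
    """Volumetric mass transfer coefficient (illustrative units)."""
    return A * power_per_volume**B * gas_holdup**C

# Example: doubling the specific input power at fixed gas hold-up scales
# kLa by 2**B.
low = kla(50.0, 0.05)
high = kla(100.0, 0.05)
print(f"kLa ratio for doubled power: {high / low:.3f}")
```

With this form, sensitivity studies of the kind reported above reduce to reading off the exponents: a ±10% deviation band corresponds to the scatter of measured kLa around this curve.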

  1. Comparison Study on Empirical Correlation for Mass Transfer Coefficient with Gas Hold-up and Input Power of Aeration Process

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Kyoo; Yang, Hei Cheon [Chonnam Nat’l Univ., Gwangju (Korea, Republic of)

    2017-06-15

    As stricter environmental regulations have led to an increase in water treatment costs, it is necessary to quantitatively study the input power of the aeration process to improve the energy efficiency of water treatment processes. The objective of this study is to propose empirical correlations for the mass transfer coefficient with the gas hold-up and input power, in order to investigate the mass transfer characteristics of the aeration process. It was found that as the input power increases, the mass transfer coefficient increases because of the decrease of gas hold-up and the increase of Reynolds number, penetration length, and dispersion of the mixed flow. The correlations for the volumetric mass transfer coefficients with gas hold-up and input power were consistent with the experimental data, with a maximum deviation of less than approximately ±10.0%.

  2. Total dose induced increase in input offset voltage in JFET input operational amplifiers

    International Nuclear Information System (INIS)

    Pease, R.L.; Krieg, J.; Gehlhausen, M.; Black, J.

    1999-01-01

    Four different types of commercial JFET input operational amplifiers were irradiated with ionizing radiation under a variety of test conditions. All experienced significant increases in input offset voltage (Vos). Microprobe measurement of the electrical characteristics of the de-coupled input JFETs demonstrates that the increase in Vos is a result of the mismatch of the degraded JFETs. (authors)

  3. Numerical Investigation of the Influence of the Input Air Irregularity on the Performance of Turbofan Jet Engine

    Science.gov (United States)

    Novikova, Y.; Zubanov, V.

    2018-01-01

    The article describes a numerical investigation of the influence of input air irregularity on the characteristics of a turbofan engine. The investigated fan has wide blades, an inlet diameter of about 2 meters, a pressure ratio of about 1.6 and a bypass ratio of about 4.8. The flow irregularity was simulated by a flap inserted into the fan inlet channel. The flap insertion ranged from 10 to 22.5% of the input channel diameter, in increments of 2.5%. A nonlinear harmonic analysis (NLH analysis) in the NUMECA Fine/Turbo software was used to study the flow irregularity. The behavior of the calculated LPC characteristics follows the experimental behavior, but there is a quantitative difference: the calculated efficiency and pressure ratio of the booster are consistent with the experimental data to within 3% and 2%, respectively, and the calculated efficiency and pressure ratio of the fan duct to within 4% and 2.5%, respectively. Increasing the level of air irregularity at the input stage of the fan reduces the calculated mass flow, maximum pressure ratio and efficiency. With a flap insertion of 12.5%, the maximum air flow is reduced by 1.44%, the maximum pressure ratio by 2.6%, and the efficiency by 3.1%.

  4. A method to quantitate cerebral blood flow using a rotating gamma camera and iodine-123 iodoamphetamine with one blood sampling

    International Nuclear Information System (INIS)

    Iida, Hidehiro; Itoh, Hiroshi; Bloomfield, P.M.; Munaka, Masahiro; Higano, Shuichi; Murakami, Matsutaro; Inugami, Atsushi; Eberl, S.; Aizawa, Yasuo; Kanno, Iwao; Uemura, Kazuo

    1994-01-01

    A method has been developed to quantitate regional cerebral blood flow (rCBF) using iodine-123-labelled N-isopropyl-p-iodoamphetamine (IMP). This technique requires only two single-photon emission tomography (SPET) scans and one blood sample. Based on a two-compartment model, radioactivity concentrations in the brain are calculated for each scan time. To avoid frequent arterial blood sampling, a standard input function was generated by combining the input functions from 12 independent studies prior to this work, and one blood sample is taken 10 min after IMP administration to calibrate the standard arterial input function. This calibration time was determined such that the integration of the first 40 min of the calibrated, combined input function agreed best with those from the 12 individual input functions (the difference was 5.3% on average). The method was applied to eight subjects (two normals and six patients with cerebral infarction), and yielded rCBF values that agreed well with those obtained by a positron emission tomography H₂¹⁵O autoradiography method. The method was also found to provide rCBF values consistent with those obtained by the non-linear least squares fitting technique and by conventional microsphere model analysis. The optimum SPET scan times were found to be 40 and 180 min for the early and delayed scans, respectively. These scan times allow the use of a conventional rotating gamma camera for clinical purposes. Vd values ranged between 10 and 40 ml/g depending on the pathological condition, suggesting the importance of measuring Vd for each ROI. In conclusion, optimization of the blood sampling time and the scanning time enabled quantitative measurement of rCBF with two SPET scans and one blood sample. (orig.)
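The one-point calibration step described above can be sketched numerically: the population-averaged ("standard") input function is scaled so that it passes through the single measured blood sample. The curve shape and sample value below are synthetic; only the scaling logic follows the method.

```python
import numpy as np

# Synthetic stand-in for a population-averaged arterial input function,
# sampled at 1-min intervals over 180 min. The gamma-variate-like shape
# and all numbers are illustrative, not data from the study.
t = np.linspace(0.0, 180.0, 181)              # minutes
standard_if = 100.0 * t * np.exp(-t / 8.0)

def calibrate(standard_curve, t, sample_time, sample_value):
    """Scale the standard input function by the ratio of the measured
    blood concentration to the standard value at the sampling time."""
    ref = np.interp(sample_time, t, standard_curve)
    return standard_curve * (sample_value / ref)

# One venous/arterial sample drawn 10 min after injection is used
# for the scaling, as in the method above.
cal = calibrate(standard_if, t, 10.0, 150.0)
print(np.interp(10.0, t, cal))  # passes through the measured sample
```

The validation reported above amounts to checking that the 0-40 min integral of such a calibrated curve agrees with individually measured input functions.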

  5. Optimally decoding the input rate from an observation of the interspike intervals

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jianfeng [COGS, University of Sussex at Brighton (United Kingdom) and Computational Neuroscience Laboratory, Babraham Institute, Cambridge (United Kingdom)]. E-mail: jf218@cam.ac.uk

    2001-09-21

    A neuron receives extensive inhibitory and excitatory inputs. What is the ratio r between these two types of input such that the neuron can most accurately read out the input information (rate)? We explore this issue under the assumption that the neuron is an ideal observer, decoding the input information by attaining the Cramér-Rao bound. It is found that, in general, adding a certain amount of inhibitory input to a neuron improves its capability of accurately decoding the input information. By calculating the Fisher information of an integrate-and-fire neuron, we determine the optimal ratio r for decoding the input information from an observation of the efferent interspike intervals. Surprisingly, the Fisher information can be zero for certain values of the ratio, seemingly implying that it is impossible to read out the encoded information at these values. By analysing the maximum likelihood estimate of the input information, it is concluded that the input information is in fact most easily estimated at the points where the Fisher information vanishes. (author)
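The Fisher-information machinery behind the Cramér-Rao argument can be illustrated numerically. As a simplifying assumption (not the integrate-and-fire model analysed in the paper), the interspike interval is taken to be gamma-distributed with a rate parameter, and the information per interval is estimated by Monte Carlo from the score function.

```python
import numpy as np

# Assumed surrogate ISI model: T ~ Gamma(shape=k, rate=lam).
# For this family the score is d/dlam log p(T; lam) = k/lam - T, and the
# exact Fisher information about lam is k / lam**2.
rng = np.random.default_rng(0)
k, lam = 4.0, 2.0

samples = rng.gamma(shape=k, scale=1.0 / lam, size=200_000)
score = k / lam - samples
fisher_mc = float(np.mean(score**2))   # Monte Carlo estimate E[score^2]
fisher_exact = k / lam**2              # analytic value

print(fisher_mc, fisher_exact)         # both close to 1.0 here
```

The Cramér-Rao bound then says any unbiased estimate of the rate from n intervals has variance at least 1/(n * fisher_exact); the paper's surprise is that for its model this information can vanish at particular excitation/inhibition ratios.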

  6. On Optimal Input Design and Model Selection for Communication Channels

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yanyan [ORNL; Djouadi, Seddik M [ORNL; Olama, Mohammed M [ORNL

    2013-01-01

    In this paper, the optimal model (structure) selection and input design that minimize the worst-case identification error for communication systems are provided. The problem is formulated using metric complexity theory in a Hilbert space setting. It is pointed out that model selection and input design can be handled independently. The Kolmogorov n-width is used to characterize the representation error introduced by model selection, while the Gel'fand and time n-widths are used to represent the inherent error introduced by input design. After the model is selected, an optimal input that minimizes the worst-case identification error is shown to exist. In particular, it is proven that the optimal model for reducing the representation error is a Finite Impulse Response (FIR) model, and that the optimal input is an impulse at the start of the observation interval. FIR models are widely used in communication systems, for example in Orthogonal Frequency Division Multiplexing (OFDM) systems.
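The FIR-plus-impulse result has a simple numerical reading: with an impulse applied at the start of the observation interval, the channel outputs are the impulse-response taps themselves, so identification reduces to copying the measurements. The channel taps below are arbitrary and noise is omitted; this illustrates the structure, not the worst-case error analysis of the paper.

```python
import numpy as np

h = np.array([0.9, 0.5, -0.2, 0.1])   # arbitrary true FIR impulse response
n = len(h)

u = np.zeros(16)
u[0] = 1.0                            # impulse at the start of observation
y = np.convolve(u, h)[: len(u)]       # noiseless channel output

h_hat = y[:n]                         # the first n outputs ARE the taps
print(np.allclose(h_hat, h))          # True
```

Under bounded noise, the same structure means each tap estimate inherits the noise bound directly, which is why the impulse input minimizes the worst-case identification error in this setting.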

  7. Input or intimacy

    Directory of Open Access Journals (Sweden)

    Judit Navracsics

    2014-01-01

    Full Text Available According to the critical period hypothesis, the earlier the acquisition of a second language starts, the better. Owing to the plasticity of the brain, up until a certain age a second language can, on this view, be acquired successfully. Early second language learners are commonly said to have an advantage over later ones, especially in phonetic/phonological acquisition, and native-like pronunciation is said to be most likely achieved by young learners. However, there is evidence of accent-free speech in second languages learnt after puberty as well. Occasionally, on the other hand, a non-native accent may appear even in early second (or third) language acquisition. Cross-linguistic influences are natural in multilingual development, and we would expect the dominant language to have an impact on the weaker one(s). The dominant language is usually the one that provides the largest amount of input for the child. But is it always the amount that counts? Perhaps sometimes other factors, such as emotions, come into play? In this paper, data obtained from an English-Persian-Hungarian trilingual pair of siblings (aged under 4 and 3, respectively) are analyzed, with a special focus on cross-linguistic influences at the phonetic/phonological level. It will be shown that beyond the amount of input there are more important factors that trigger interference in multilingual development.

  8. Patient and healthcare provider barriers to hypertension awareness, treatment and follow up: a systematic review and meta-analysis of qualitative and quantitative studies.

    Directory of Open Access Journals (Sweden)

    Rasha Khatib

    Full Text Available BACKGROUND: Although the importance of detecting, treating, and controlling hypertension has been recognized for decades, the majority of patients with hypertension remain uncontrolled. The path from evidence to practice contains many potential barriers, but their role has not been reviewed systematically. This review aimed to synthesize and identify important barriers to hypertension control as reported by patients and healthcare providers. METHODS: Electronic databases MEDLINE, EMBASE and Global Health were searched systematically up to February 2013. Two reviewers independently selected eligible studies. Two reviewers categorized barriers based on a theoretical framework of behavior change. The theoretical framework suggests that a change in behavior requires a strong commitment to change [intention], the necessary skills and abilities to adopt the behavior [capability], and an absence of health system and support constraints. FINDINGS: Twenty-five qualitative studies and 44 quantitative studies met the inclusion criteria. In qualitative studies, health system barriers were most commonly discussed in studies of patients and health care providers. Quantitative studies identified disagreement with clinical recommendations as the most common barrier among health care providers. Quantitative studies of patients yielded different results: lack of knowledge was the most common barrier to hypertension awareness. Stress, anxiety and depression were most commonly reported as barriers that hindered or delayed adoption of a healthier lifestyle. In terms of hypertension treatment adherence, patients mostly reported forgetting to take their medication. Finally, priority setting barriers were most commonly reported by patients in terms of following up with their health care providers. 
CONCLUSIONS: This review identified a wide range of barriers facing patients and health care providers pursuing hypertension control, indicating the need for targeted multi

  9. Current Practices in Defining Seismic Input for Nuclear Facilities

    International Nuclear Information System (INIS)

    2015-05-01

    This report has been written in the framework of the seismic subgroup of the OECD/NEA CSNI Working Group on Integrity and Ageing of Components and Structures (WGIAGE) to provide a brief review of current practices regarding the definition of the seismic input for the design and reevaluation of nuclear power plants. It is taken for granted that, prior to conducting the seismic design of a nuclear facility, a seismic hazard analysis (SHA) has been conducted for the site where the facility is located. This provides reference motions for defining those that will later be used as input for the dynamic analyses of the facility. The objective of the report is to clarify the current practices in various OECD Member States for defining the seismic input to be used in the dynamic calculations of NPPs, once the SHA results are already at hand. Current practices have been summarized for Canada, the Czech Republic, Finland, France, Germany, Japan, Slovenia, South Korea, Spain, Sweden, the Netherlands, the United Kingdom and the United States. The main findings of the report are: a) The approaches followed by the regulatory bodies of OECD Member States differ substantially, particularly in the consideration of site effects, but also in the probability level of the event that a nuclear facility should be required to withstand. b) In many countries a probabilistic approach is adopted for the design, in some cases combined with a deterministic one; in other cases, such as France, Japan or South Korea, a deterministic approach is followed. c) The US and Japan have the most complete guidelines in relation to site effects. The former provides specific approaches for the definition of the seismic input; the latter clearly recognizes the need to propagate the bedrock motion to foundation level, thereby introducing the site effect. d) The definition of bedrock is very heterogeneous across countries, although this should not constitute a serious problem if the starting

  10. Comparing apples and oranges: fold-change detection of multiple simultaneous inputs.

    Directory of Open Access Journals (Sweden)

    Yuval Hart

    Full Text Available Sensory systems often detect multiple types of inputs. For example, a receptor in a cell-signaling system often binds multiple kinds of ligands, and sensory neurons can respond to different types of stimuli. How do sensory systems compare these different kinds of signals? Here, we consider this question in a class of sensory systems - including bacterial chemotaxis - which have a property known as fold-change detection: their output dynamics, including amplitude and response time, depend only on relative changes in signal, rather than absolute changes, over a range of several decades of signal. We analyze how fold-change detection systems respond to multiple signals, using mathematical models. Suppose that a step of fold F1 is made in input 1, together with a step of fold F2 in input 2. What total response does the system provide? We show that when both input signals impact the same receptor with an equal number of binding sites, the integrated response is multiplicative: the response dynamics depend only on the product of the two fold changes, F1F2. When the inputs bind the same receptor with different numbers of sites n1 and n2, the dynamics depend on a product of power laws, F1^n1 F2^n2. Thus, two input signals which vary over time in an inverse way can lead to no response. When the two inputs affect two different receptors, other types of integration may be found, and in general the system is not constrained to respond according to the product of the fold change of each signal. These predictions can readily be tested experimentally, by providing cells with two simultaneously varying input signals. The present study suggests how cells can compare apples and oranges, namely by comparing each to its own background level, and then multiplying these two fold-changes.
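The multiplicative prediction can be checked on a minimal fold-change-detection circuit. The sketch below uses an assumed "signal over slowly adapting variable" model (one standard FCD motif), with the sensed signal taken as the product of the two inputs hitting the same receptor; parameters are illustrative, not from the paper.

```python
# Minimal FCD model: internal variable x slowly adapts to the sensed
# signal s, and the output is y = s / x. Before the step, both inputs sit
# at background level 1 and x is adapted (x = s). When both inputs bind
# the same receptor with equal numbers of sites, s is the product u1*u2,
# so the response depends only on F1*F2, as stated above.
def response(F1, F2, alpha=1.0, dt=1e-3, t_end=10.0):
    u1, u2 = 1.0, 1.0            # pre-step background levels
    x = u1 * u2                  # adapted to background
    peak = 0.0
    for _ in range(int(t_end / dt)):
        s = (u1 * F1) * (u2 * F2)   # post-step sensed signal
        y = s / x                    # fold-change output
        peak = max(peak, y)
        x += dt * alpha * (s - x)    # slow adaptation toward s
    return peak

# Two very different fold pairs with the same product give the same response:
print(response(2.0, 3.0), response(6.0, 1.0))
```

The same simulation with F1 = 2 and F2 = 0.5 gives no response (peak 1), which is the "inverse variation" cancellation noted in the abstract.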

  11. Interdependence of PRECIS Role Operators: A Quantitative Analysis of Their Associations.

    Science.gov (United States)

    Mahapatra, Manoranjan; Biswas, Subal Chandra

    1986-01-01

    Analyzes associations among different role operators quantitatively by taking input strings from 200 abstracts, each related to the subject fields of taxation, genetic psychology and Shakespearean drama, and subjecting them to the chi-square test. Significant associations involving other differencing operators and connectives are discussed. A schema of role…

  12. Feed and manure use in low-N-input and high-N-input dairy cattle production systems

    Science.gov (United States)

    Powell, J. Mark

    2014-11-01

    In most parts of Sub-Saharan Africa, fertilizers and feeds are costly, not readily available, and used sparingly in agricultural production. In many parts of Western Europe, North America and Oceania, fertilizers and feeds are relatively inexpensive, readily available, and used abundantly to maximize profitable agricultural production. A case-study, dairy-systems approach was used to illustrate how differences in feed and manure management in a low-N-input dairy cattle system (Niger, West Africa) and a high-N-input dairy production system (Wisconsin, USA) impact agricultural production and environmental N loss. In Niger, an additional daily feed N intake of 114 g per dairy animal unit (AU, 1000 kg live weight) could increase annual milk production from 560 to 1320 kg AU⁻¹, and the additional manure N could greatly increase millet production. In Wisconsin, a reduction in daily feed N intake of 100 g AU⁻¹ would not greatly impact milk production but would decrease urinary N excretion by 25% and ammonia and nitrous oxide emissions from manure by 18% to 30%. In Niger, compared to the practice of housing livestock and applying dung only onto fields, corralling cattle or sheep on cropland (to capture urinary N) increased millet yields by 25% to 95%. The additional millet grain due to dung applications or corralling would satisfy the annual food grain requirements of 2-5 persons; the additional forage would provide 120-300 more days of feed for a typical head of cattle; and 850 to 1600 kg ha⁻¹ more biomass would be available for soil conservation. In Wisconsin, compared to application of barn manure only, corralling heifers in fields increased forage production by only 8% to 11%. The application of barn manure or corralling increased forage production by 20% to 70%. This additional forage would provide 350-580 more days of feed for a typical dairy heifer. Study results demonstrate how different approaches to feed and manure management in low-N-input and high-N-input dairy cattle

  13. Robust Fault Detection for Switched Fuzzy Systems With Unknown Input.

    Science.gov (United States)

    Han, Jian; Zhang, Huaguang; Wang, Yingchun; Sun, Xun

    2017-10-03

    This paper investigates the fault detection problem for a class of switched nonlinear systems in the T-S fuzzy framework, in the presence of an unknown input. A novel fault-detection unknown-input-observer design method is proposed; based on the proposed observer, the unknown input can be removed from the fault detection residual. A weighted H∞ performance level is imposed to ensure robustness, and a weighted H₋ performance level is introduced to increase the sensitivity of the proposed detection method. To verify the proposed scheme, a numerical simulation example and an electromechanical system simulation example are provided at the end of the paper.

  14. Estimation of the pulmonary input function in dynamic whole body PET

    International Nuclear Information System (INIS)

    Ho-Shon, K.; Buchen, P.; Meikle, S.R.; Fulham, M.J.; University of Sydney, Sydney, NSW

    1998-01-01

    Full text: Dynamic data acquisition in whole-body PET (WB-PET) has the potential to measure the metabolic rate of glucose (MRGlc) in tissue in vivo. Estimation of changes in tumoral MRGlc may be a valuable tool in cancer by providing a quantitative index of response to treatment. A necessary requirement is an input function (IF), which can be obtained from arterial, 'arterialised' venous or, in the case of lung tumours, pulmonary arterial blood. Our aim was to extract the pulmonary input function from dynamic WB-PET data using Principal Component Analysis (PCA), Factor Analysis (FA) and Maximum Entropy (ME), for the evaluation of patients undergoing induction chemotherapy for non-small cell lung cancer. PCA is first used as a method of dimension reduction to obtain a signal space, defined by an optimal metric and a set of vectors. FA is then used, together with an ME constraint, to rotate these vectors to obtain 'physiological' factors. A form of entropy function that does not require normalised data was used; this enabled the introduction of a penalty function based on the blood concentration at the last time point, which provides an additional constraint. Tissue functions from 10 planes through normal lung were simulated. The model was a linear combination of an IF and a tissue time-activity curve (TAC). The proportion of IF to TAC was varied over the planes to simulate the apical-to-basal gradient in the vascularity of the lung, and pseudo-Poisson noise was added. The method accurately extracted the IF at noise levels spanning the expected range for dynamic ROI data acquired with the interplane septa extended. Our method is minimally invasive because it requires only one late venous blood sample, and it is applicable to a wide range of tracers since it does not assume a particular compartmental model. Pilot data from 2 patients have been collected, enabling comparison of the estimated IF with direct blood sampling from the pulmonary artery.
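The dimension-reduction step can be sketched with a simulation in the spirit of the one described above: each plane's curve is a linear combination of an input function and a tissue curve, so the dynamic data occupy an essentially two-dimensional signal subspace, which PCA (here via an SVD) recovers. The curve shapes, mixing fractions and noise level below are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 60.0, 30)                  # frame mid-times (min)
input_fn = t * np.exp(-t / 3.0)                 # sharp vascular peak
tissue = 1.0 - np.exp(-t / 20.0)                # slow tissue uptake

# Ten planes with an apical-to-basal gradient in vascular fraction,
# plus additive noise (a crude stand-in for pseudo-Poisson noise).
frac = np.linspace(0.2, 0.8, 10)
data = np.outer(frac, input_fn) + np.outer(1.0 - frac, tissue)
data += rng.normal(scale=0.01, size=data.shape)

# SVD of the raw dynamic data: nearly all variance should fall in the
# first two components, the signal space to be rotated by FA afterwards.
s = np.linalg.svd(data, compute_uv=False)
explained = s**2 / np.sum(s**2)
print(explained[:3])
```

The FA/ME rotation that turns these abstract components into 'physiological' factors (one resembling the IF, one the tissue TAC) is the part that requires the entropy penalty described in the abstract and is not reproduced here.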

  15. PLEXOS Input Data Generator

    Energy Technology Data Exchange (ETDEWEB)

    2017-02-01

    The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data that is in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.

  16. Input and language development in bilingually developing children.

    Science.gov (United States)

    Hoff, Erika; Core, Cynthia

    2013-11-01

    Language skills in young bilingual children are highly varied as a result of the variability in their language experiences, making it difficult for speech-language pathologists to differentiate language disorder from language difference in bilingual children. Understanding the sources of variability in bilingual contexts and the resulting variability in children's skills will help improve language assessment practices by speech-language pathologists. In this article, we review literature on bilingual first language development for children under 5 years of age. We describe the rate of development in single and total language growth, we describe effects of quantity of input and quality of input on growth, and we describe effects of family composition on language input and language growth in bilingual children. We provide recommendations for language assessment of young bilingual children and consider implications for optimizing children's dual language development.

  17. Keeping community health workers in Uganda motivated: key challenges, facilitators, and preferred program inputs

    Science.gov (United States)

    Brunie, Aurélie; Wamala-Mucheri, Patricia; Otterness, Conrad; Akol, Angela; Chen, Mario; Bufumbo, Leonard; Weaver, Mark

    2014-01-01

    Introduction: In the face of global health worker shortages, community health workers (CHWs) are an important health care delivery strategy for underserved populations. In Uganda, community-based programs often use volunteer CHWs to extend services, including family planning, in rural areas. This study examined factors related to CHW motivation and level of activity in 3 family planning programs in Uganda. Methods: Data were collected between July and August 2011, and sources comprised 183 surveys with active CHWs, in-depth interviews (IDIs) with 43 active CHWs and 5 former CHWs, and service statistics records. Surveys included a discrete choice experiment (DCE) to elicit CHW preferences for selected program inputs. Results: Service statistics indicated an average of 56 visits with family planning clients per surveyed CHW over the 3-month period prior to data collection. In the survey, new skills and knowledge, perceived impact on the community, and enhanced status were the main positive aspects of the job reported by CHWs; the main challenges related to transportation. Multivariate analyses identified 2 correlates of CHWs being highly vs. less active (in terms of number of client visits): experiencing problems with supplies and not collaborating with peers. DCE results showed that provision of a package including a T-shirt, badge, and bicycle was the program input CHWs preferred, followed by a mobile phone (without airtime). IDI data reinforced and supplemented these quantitative findings. Social prestige, social responsibility, and aspirations for other opportunities were important motivators, while main challenges related to transportation and commodity stockouts. CHWs had complex motivations for wanting better compensation, including offsetting time and transportation costs, providing for their families, and feeling appreciated for their efforts. Conclusion: Volunteer CHW programs in Uganda and elsewhere need to carefully consider appropriate combinations of

  18. Input-profile-based software failure probability quantification for safety signal generation systems

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol

    2009-01-01

    Approaches to software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs encountered in actual use. The test inputs for a safety-critical application such as the reactor protection system (RPS) of a nuclear power plant are the inputs that cause the activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter, and the input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that the test need not be repeated for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
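The determinism argument above can be made concrete: the operational profile is discretized over the digital (ADC) input range, each distinct input value is tested once, and the failure probability is the profile-weighted fraction of failing inputs. The toy failure region and profile shape below are hypothetical stand-ins, not the RPS profile developed in the paper.

```python
import numpy as np

values = np.arange(1024)                 # 10-bit ADC input range

def fails_on(x):
    """Toy deterministic test oracle: the software produces a wrong
    output on a narrow band of input values (purely hypothetical)."""
    return 100 <= x <= 101

# Hypothetical operational input profile, concentrated near the region
# of sensor values that activates a protective action.
profile = np.exp(-0.5 * ((values - 110) / 40.0) ** 2)
profile /= profile.sum()

# Because the response is deterministic, each input is tested exactly
# once; the failure probability is the profile mass on failing inputs.
failing = np.array([fails_on(v) for v in values])
p_failure = float(profile[failing].sum())
print(f"estimated failure probability: {p_failure:.4f}")
```

This is why the input profile, not the number of repeated test runs, dominates the quantification: adding repeats of an already-tested input value adds no information.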

  19. Robust fault detection and isolation technique for single-input/single-output closed-loop control systems that exhibit actuator and sensor faults

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh; Alavi, S. M. Mahdi; Hayes, M. J.

    2008-01-01

    An integrated quantitative feedback design and frequency-based fault detection and isolation (FDI) approach is presented for single-input/single-output systems. A novel design methodology, based on shaping the system frequency response, is proposed to generate an appropriate residual signal...

  20. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy, and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  1. Control Board Digital Interface Input Devices – Touchscreen, Trackpad, or Mouse?

    Energy Technology Data Exchange (ETDEWEB)

    Thomas A. Ulrich; Ronald L. Boring; Roger Lew

    2015-08-01

    The authors collaborated with a power utility to evaluate input devices for use in the human system interface (HSI) for a new digital Turbine Control System (TCS) at a nuclear power plant (NPP) undergoing a TCS upgrade. A standalone dynamic software simulation of the new digital TCS and a mobile kiosk were developed to conduct an input device study to evaluate operator preference and input device effectiveness. The TCS software presented the anticipated HSI for the TCS and mimicked (i.e., simulated) the turbine systems’ responses to operator commands. Twenty-four licensed operators from the two nuclear power units participated in the study. Three input devices were tested: a trackpad, mouse, and touchscreen. The subjective feedback from the survey indicates the operators preferred the touchscreen interface. The operators subjectively rated the touchscreen as the fastest and most comfortable input device given the range of tasks they performed during the study, but also noted a lack of accuracy for selecting small targets. The empirical data suggest the mouse input device provides the most consistent performance for screen navigation and manipulating on screen controls. The trackpad input device was both empirically and subjectively found to be the least effective and least desired input device.

  2. High Input Voltage, Silicon Carbide Power Processing Unit Performance Demonstration

    Science.gov (United States)

    Bozak, Karin E.; Pinero, Luis R.; Scheidegger, Robert J.; Aulisio, Michael V.; Gonzalez, Marcelo C.; Birchenough, Arthur G.

    2015-01-01

    A silicon carbide brassboard power processing unit has been developed by the NASA Glenn Research Center in Cleveland, Ohio. The power processing unit operates from two sources: a nominal 300 Volt high voltage input bus and a nominal 28 Volt low voltage input bus. The design of the power processing unit includes four low voltage, low power auxiliary supplies, and two parallel 7.5 kilowatt (kW) discharge power supplies that are capable of providing up to 15 kilowatts of total power at 300 to 500 Volts (V) to the thruster. Additionally, the unit contains a housekeeping supply, high voltage input filter, low voltage input filter, and master control board, such that the complete brassboard unit is capable of operating a 12.5 kilowatt Hall effect thruster. The performance of the unit was characterized under both ambient and thermal vacuum test conditions, and the results demonstrate exceptional performance with full power efficiencies exceeding 97%. The unit was also tested with a 12.5kW Hall effect thruster to verify compatibility and output filter specifications. With space-qualified silicon carbide or similar high voltage, high efficiency power devices, this would provide a design solution to address the need for high power electric propulsion systems.

  3. Provider Opinions Regarding the Development of a Stigma-Reduction Intervention Tailored for Providers

    Science.gov (United States)

    Mittal, Dinesh; Corrigan, Patrick; Drummond, Karen L.; Porchia, Sylvia; Sullivan, Greer

    2016-01-01

    Interventions involving contact with a person who has recovered from mental illness are most effective at reducing stigma. This study sought input from health care providers to inform the design of a contact intervention intended to reduce provider stigma toward persons with serious mental illness. Using a purposive sampling strategy, data were…

  4. Development and validation of gui based input file generation code for relap

    International Nuclear Information System (INIS)

    Anwar, M.M.; Khan, A.A.; Chughati, I.R.; Chaudri, K.S.; Inyat, M.H.; Hayat, T.

    2009-01-01

    Reactor Excursion and Leak Analysis Program (RELAP) is a widely accepted computer code for thermal-hydraulics modeling of Nuclear Power Plants. It calculates thermal-hydraulic transients in water-cooled nuclear reactors by solving approximations to the one-dimensional, two-phase equations of hydraulics in an arbitrarily connected system of nodes. However, the preparation of the input file and subsequent analysis of results in this code is a tedious task. This work presents the development of a Graphical User Interface (GUI) for preparation of the RELAP-5 input file, together with validation of the GUI-generated input file. The GUI is developed in Microsoft Visual Studio using Visual C Sharp (C#) as the programming language. The nodalization diagram is drawn graphically, and the program contains various component forms along with a starting data form, which are launched for property assignment to generate the input file cards. The GUI provides an Open/Save function to store and recall the nodalization diagram along with the components' properties. The GUI-generated input file is validated for several case studies, and individual component cards are compared with the originally required format. The generated input file is found consistent with the requirements of RELAP. The GUI provides a useful platform for simulating complex hydrodynamic problems efficiently with RELAP. (author)

  5. Speaker Input Variability Does Not Explain Why Larger Populations Have Simpler Languages.

    Science.gov (United States)

    Atkinson, Mark; Kirby, Simon; Smith, Kenny

    2015-01-01

    A learner's linguistic input is more variable if it comes from a greater number of speakers. Higher speaker input variability has been shown to facilitate the acquisition of phonemic boundaries, since data drawn from multiple speakers provides more information about the distribution of phonemes in a speech community. It has also been proposed that speaker input variability may have a systematic influence on individual-level learning of morphology, which can in turn influence the group-level characteristics of a language. Languages spoken by larger groups of people have less complex morphology than those spoken in smaller communities. While a mechanism by which the number of speakers could have such an effect is yet to be convincingly identified, differences in speaker input variability, which is thought to be larger in larger groups, may provide an explanation. By hindering the acquisition, and hence faithful cross-generational transfer, of complex morphology, higher speaker input variability may result in structural simplification. We assess this claim in two experiments which investigate the effect of such variability on language learning, considering its influence on a learner's ability to segment a continuous speech stream and acquire a morphologically complex miniature language. We ultimately find no evidence to support the proposal that speaker input variability influences language learning and so cannot support the hypothesis that it explains how population size determines the structural properties of language.

  6. Examination of Information Technology (IT) Certification and the Human Resources (HR) Professional Perception of Job Performance: A Quantitative Study

    Science.gov (United States)

    O'Horo, Neal O.

    2013-01-01

    The purpose of this quantitative survey study was to test the Leontief input/output theory relating the input of IT certification to the output of English-speaking U.S. human resource professionals' perceived IT professional job performance. Participants (N = 104) rated their perceptions of IT-certified vs. non-IT-certified professionals' job…

  7. Energy Input Flux in the Global Quiet-Sun Corona

    Energy Technology Data Exchange (ETDEWEB)

    Mac Cormack, Cecilia; Vásquez, Alberto M.; López Fuentes, Marcelo; Nuevo, Federico A. [Instituto de Astronomía y Física del Espacio (IAFE), CONICET-UBA, CC 67—Suc 28, (C1428ZAA) Ciudad Autónoma de Buenos Aires (Argentina); Landi, Enrico; Frazin, Richard A. [Department of Climate and Space Sciences and Engineering (CLaSP), University of Michigan, 2455 Hayward Street, Ann Arbor, MI 48109-2143 (United States)

    2017-07-01

    We present first results of a novel technique that provides, for the first time, constraints on the energy input flux at the coronal base ( r ∼ 1.025 R {sub ⊙}) of the quiet Sun at a global scale. By combining differential emission measure tomography of EUV images, with global models of the coronal magnetic field, we estimate the energy input flux at the coronal base that is required to maintain thermodynamically stable structures. The technique is described in detail and first applied to data provided by the Extreme Ultraviolet Imager instrument, on board the Solar TErrestrial RElations Observatory mission, and the Atmospheric Imaging Assembly instrument, on board the Solar Dynamics Observatory mission, for two solar rotations with different levels of activity. Our analysis indicates that the typical energy input flux at the coronal base of magnetic loops in the quiet Sun is in the range ∼0.5–2.0 × 10{sup 5} (erg s{sup −1} cm{sup −2}), depending on the structure size and level of activity. A large fraction of this energy input, or even its totality, could be accounted for by Alfvén waves, as shown by recent independent observational estimates derived from determinations of the non-thermal broadening of spectral lines in the coronal base of quiet-Sun regions. This new tomography product will be useful for the validation of coronal heating models in magnetohydrodynamic simulations of the global corona.

  8. SSYST-3. Input description

    International Nuclear Information System (INIS)

    Meyder, R.

    1983-12-01

    The code system SSYST-3 is designed to analyse the thermal and mechanical behaviour of a fuel rod during a LOCA. The report contains a complete input-list for all modules and several tested inputs for a LOCA analysis. (orig.)

  9. A framework for quantitative assessment of impacts related to energy and mineral resource development

    Science.gov (United States)

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example, one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
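    The Monte Carlo combination of probabilistic inputs described above can be sketched generically: draw from each input distribution, multiply through the quantified relationship, and summarize the output as percentiles. The sampler arguments and the P5/P50/P95 summary are illustrative conventions, not the authors' specific algorithm:

```python
import random

def monte_carlo_impact(sample_resource, sample_impact_per_unit, n=10000, seed=42):
    """Propagate uncertainty from probabilistic inputs to a probabilistic
    output by Monte Carlo sampling; report P5/P50/P95 of the impact."""
    rng = random.Random(seed)
    draws = sorted(sample_resource(rng) * sample_impact_per_unit(rng)
                   for _ in range(n))
    pick = lambda q: draws[int(q * (n - 1))]   # empirical percentile
    return {"P5": pick(0.05), "P50": pick(0.50), "P95": pick(0.95)}
```

For instance, `sample_resource` might draw gas volumes from an assessment's distribution and `sample_impact_per_unit` might draw disturbed habitat area per unit of gas produced; the returned percentiles then convey the uncertainty inherent in both inputs.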

  10. Material input of nuclear fuel

    International Nuclear Information System (INIS)

    Rissanen, S.; Tarjanne, R.

    2001-01-01

    The Material Input (MI) of nuclear fuel, expressed in terms of the total amount of natural material needed for manufacturing a product, is examined. The suitability of the MI method for assessing the environmental impacts of fuels is also discussed. Material input is expressed as a Material Input Coefficient (MIC), equalling the total mass of natural material divided by the mass of the completed product. The material input coefficient is, however, only an intermediate result, which should not be used as such for the comparison of different fuels, because the energy content of nuclear fuel is about 100 000-fold that of fossil fuels. As a final result, the material input is expressed in proportion to the amount of generated electricity, which is called MIPS (Material Input Per Service unit). Material input is a simplified and commensurable indicator for the use of natural material, but because it does not take into account the harmfulness of materials or the way the residual material is processed, it does not alone express the amount of environmental impacts. Examination of the amount alone does not differentiate between, for example, coal, natural gas or waste rock, which usually contains just sand. Natural gas is, however, substantially more harmful for the ecosystem than sand. Therefore, other methods should also be used to consider the environmental load of a product. The material input coefficient of nuclear fuel is calculated using data from different types of mines. The calculations are based, among others, on data from an open-pit mine (Key Lake, Canada), an underground mine (McArthur River, Canada) and a by-product mine (Olympic Dam, Australia). Furthermore, the coefficient is calculated for nuclear fuel corresponding to the nuclear fuel supply of the Teollisuuden Voima (TVO) company in 2001. Because there is some uncertainty in the initial data, the inaccuracy of the final results can be as much as 20-50 per cent.
The value
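    The MIC-to-MIPS arithmetic described above reduces to a multiplication and a division. The figures used in the example are placeholders for illustration, not data from the study:

```python
def mips(material_input_coefficient, product_mass_kg, electricity_kwh):
    """Material Input Per Service unit: total natural material moved for a
    product, expressed per unit of electricity it generates."""
    # MIC x product mass gives the total natural material input...
    total_natural_material_kg = material_input_coefficient * product_mass_kg
    # ...which is then normalized by the service delivered (kWh).
    return total_natural_material_kg / electricity_kwh
```

With hypothetical numbers, a fuel batch of 2 kg with an MIC of 500 that generates 1000 kWh gives `mips(500.0, 2.0, 1000.0)`, i.e. 1 kg of natural material per kWh.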

  11. Litter input controls on soil carbon in a temperate deciduous forest

    DEFF Research Database (Denmark)

    Bowden, Richard D.; Deem, Lauren; Plante, Alain F.

    2014-01-01

    Above- and belowground litter inputs in a temperate deciduous forest were altered for 20 yr to determine the importance of leaves and roots on soil C and soil organic matter (SOM) quantity and quality. Carbon and SOM quantity and quality were measured in the O horizon and mineral soil to 50 cm...... soil C, but decreases in litter inputs resulted in rapid soil C declines. Root litter may ultimately provide more stable sources of soil C. Management activities or environmental alterations that decrease litter inputs in mature forests can lower soil C content; however, increases in forest...

  12. Chemical sensors are hybrid-input memristors

    Science.gov (United States)

    Sysoev, V. I.; Arkhipov, V. E.; Okotrub, A. V.; Pershin, Y. V.

    2018-04-01

    Memristors are two-terminal electronic devices whose resistance depends on the history of the input signal (voltage or current). Here we demonstrate that chemical gas sensors can be considered as memristors with a generalized (hybrid) input, namely, with the input consisting of the voltage, analyte concentrations and applied temperature. The concept of hybrid-input memristors is demonstrated experimentally using a single-walled carbon nanotube chemical sensor. It is shown that with respect to the hybrid input, the sensor exhibits some features common with memristors, such as hysteretic input-output characteristics. This different perspective on chemical gas sensors may open new possibilities for smart sensor applications.
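    The history dependence at the heart of the memristor concept can be illustrated with a toy model: an internal state (resistance) integrates the input, so identical inputs applied at different times produce different outputs. This is a generic threshold-drift sketch, not the sensor model from the paper:

```python
def memristor_current(voltages, r_on=100.0, r_off=1000.0, k=50.0):
    """Current response of a toy memristor whose resistance drifts with the
    history of the applied voltage, bounded between r_on and r_off."""
    r = r_off
    currents = []
    for v in voltages:
        r = min(r_off, max(r_on, r - k * v))  # state update: input history
        currents.append(v / r)                # Ohmic readout at current state
    return currents
```

Driving the device with the same voltage repeatedly yields a rising current as the resistance drifts downward, which is the hysteretic input-output behavior the abstract refers to.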

  13. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    Science.gov (United States)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
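    An m-sequence is generated by a maximal-length linear-feedback shift register (LFSR). The sketch below, a generic Fibonacci LFSR rather than the authors' transmitter code, produces the ±1 chip sequence whose sharp periodic autocorrelation underlies the anti-noise behavior analyzed above:

```python
def m_sequence(taps, n_bits):
    """Return the +/-1 chips of a maximal-length LFSR sequence.

    taps: 1-indexed feedback tap positions, e.g. (4, 3) for a primitive
    degree-4 polynomial, giving a period of 2**n_bits - 1.
    """
    state = [1] * n_bits               # any nonzero seed works
    chips = []
    for _ in range(2 ** n_bits - 1):
        out = state[-1]
        feedback = 0
        for t in taps:
            feedback ^= state[t - 1]   # XOR of the tapped stages
        state = [feedback] + state[:-1]  # shift-register update
        chips.append(1 if out else -1)
    return chips
```

For `taps=(4, 3)` the period is 15, and the periodic autocorrelation equals the full period at lag 0 but −1 at every other lag; this sharp correlation peak is what makes matched-filter detection of the coded waveform robust against noise.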

  14. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  15. Quantum nondemolition measurement with a nonclassical meter input and an electro-optic enhancement

    DEFF Research Database (Denmark)

    Andersen, Ulrik Lund; Buchler, B.C.; Bachor, H.A.

    2002-01-01

    Optical quantum nondemolition measurements are performed using a beamsplitter with a nonclassical meter input and an electro-optic feedforward loop. The nonclassical meter input is provided by a stable 4.5 dB amplitude squeezed source generated by an optical parametric amplifier. We show...

  16. Using Non-Invasive Multi-Spectral Imaging to Quantitatively Assess Tissue Vasculature

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, A; Chernomordik, V; Riley, J; Hassan, M; Amyot, F; Dasgeb, B; Demos, S G; Pursley, R; Little, R; Yarchoan, R; Tao, Y; Gandjbakhche, A H

    2007-10-04

    This research describes a non-invasive, non-contact method used to quantitatively analyze the functional characteristics of tissue. Multi-spectral images collected at several near-infrared wavelengths are input into a mathematical optical skin model that considers the contributions from different analytes in the epidermis and dermis skin layers. Through a reconstruction algorithm, we can quantify the percent of blood in a given area of tissue and the fraction of that blood that is oxygenated. Imaging normal tissue confirms previously reported values for the percent of blood in tissue and the percent of blood that is oxygenated in tissue and surrounding vasculature, for the normal state and when ischemia is induced. This methodology has been applied to assess vascular Kaposi's sarcoma lesions and the surrounding tissue before and during experimental therapies. The multi-spectral imaging technique has been combined with laser Doppler imaging to gain additional information. Results indicate that these techniques are able to provide quantitative and functional information about tissue changes during experimental drug therapy and investigate progression of disease before changes are visibly apparent, suggesting a potential for them to be used as complementary imaging techniques to clinical assessment.

  17. Evaluation of two population-based input functions for quantitative neurological FDG PET studies

    International Nuclear Information System (INIS)

    Eberl, S.; Anayat, A.R.; Fulton, R.R.; Hooper, P.K.; Fulham, M.J.

    1997-01-01

    The conventional measurement of the regional cerebral metabolic rate of glucose (rCMRGlc) with fluorodeoxyglucose (FDG) and positron emission tomography (PET) requires arterial or arterialised-venous (a-v) blood sampling at frequent intervals to obtain the plasma input function (IF). We evaluated the accuracy of rCMRGlc measurements using population-based IFs that were calibrated with two a-v blood samples. Population-based IFs were derived from: (1) the average of a-v IFs from 26 patients (Standard IF) and (2) a published model of FDG plasma concentration (Feng IF). Values for rCMRGlc calculated from the population-based IFs were compared with values obtained with IFs derived from frequent a-v blood sampling in 20 non-diabetic and six diabetic patients. Values for rCMRGlc calculated with the different IFs were highly correlated for both patient groups (r≥0.992) and root mean square residuals about the regression line were less than 0.24 mg/min/100 g. The Feng IF tended to underestimate high rCMRGlc. Both population-based IFs simplify the measurement of rCMRGlc with minimal loss in accuracy and require only two a-v blood samples for calibration. The reduced blood sampling requirements markedly reduce radiation exposure to the blood sampler. (orig.)
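    The two-sample calibration can be pictured as fitting a single scale factor so that the population-average curve passes through the measured a-v samples in a least-squares sense. The function name and data layout below are illustrative assumptions, not the study's implementation:

```python
def calibrate_input_function(pop_t, pop_c, sample_t, sample_c):
    """Scale a population-average input function so it best fits two (or
    more) measured arterialised-venous samples (least-squares scale)."""
    def interp(t):  # linear interpolation of the population curve
        for i in range(len(pop_t) - 1):
            if pop_t[i] <= t <= pop_t[i + 1]:
                frac = (t - pop_t[i]) / (pop_t[i + 1] - pop_t[i])
                return pop_c[i] + frac * (pop_c[i + 1] - pop_c[i])
        return pop_c[-1]
    predicted = [interp(t) for t in sample_t]
    # Closed-form least-squares scale factor for a one-parameter fit.
    scale = (sum(p * m for p, m in zip(predicted, sample_c))
             / sum(p * p for p in predicted))
    return [scale * c for c in pop_c]
```

With only two blood samples, the scale factor anchors the population shape to the individual patient, which is why the accuracy loss reported above is so small.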

  18. Whole-Brain Monosynaptic Afferent Inputs to Basal Forebrain Cholinergic System

    Directory of Open Access Journals (Sweden)

    Rongfeng Hu

    2016-10-01

    The basal forebrain cholinergic system (BFCS) robustly modulates many important behaviors, such as arousal, attention, learning and memory, through heavy projections to cortex and hippocampus. However, the presynaptic partners governing BFCS activity still remain poorly understood. Here, we utilized a recently developed rabies virus-based cell-type-specific retrograde tracing system to map the whole-brain afferent inputs of the BFCS. We found that the BFCS receives inputs from multiple cortical areas, such as orbital frontal cortex, motor cortex, and insular cortex, and that the BFCS also receives dense inputs from several subcortical nuclei related to motivation and stress, including lateral septum (LS), central amygdala (CeA), paraventricular nucleus of hypothalamus (PVH), dorsal raphe (DRN) and parabrachial nucleus (PBN). Interestingly, we found that the BFCS receives inputs from the olfactory areas and the entorhinal-hippocampal system. These results greatly expand our knowledge about the connectivity of the mouse BFCS and provide important preliminary indications for future exploration of circuit function.

  19. Input variable selection for data-driven models of Coriolis flowmeters for two-phase flow measurement

    International Nuclear Information System (INIS)

    Wang, Lijuan; Yan, Yong; Wang, Xue; Wang, Tao

    2017-01-01

    Input variable selection is an essential step in the development of data-driven models for environmental, biological and industrial applications. Through input variable selection to eliminate the irrelevant or redundant variables, a suitable subset of variables is identified as the input of a model. Meanwhile, through input variable selection the complexity of the model structure is simplified and the computational efficiency is improved. This paper describes the procedures of the input variable selection for the data-driven models for the measurement of liquid mass flowrate and gas volume fraction under two-phase flow conditions using Coriolis flowmeters. Three advanced input variable selection methods, including partial mutual information (PMI), genetic algorithm-artificial neural network (GA-ANN) and tree-based iterative input selection (IIS) are applied in this study. Typical data-driven models incorporating support vector machine (SVM) are established individually based on the input candidates resulting from the selection methods. The validity of the selection outcomes is assessed through an output performance comparison of the SVM based data-driven models and sensitivity analysis. The validation and analysis results suggest that the input variables selected from the PMI algorithm provide more effective information for the models to measure liquid mass flowrate while the IIS algorithm provides a fewer but more effective variables for the models to predict gas volume fraction. (paper)
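    As a minimal stand-in for the filter stage of input variable selection (far simpler than the PMI, GA-ANN, or IIS methods the paper compares), candidate variables can be ranked by the strength of their dependence on the output; here plain Pearson correlation is used as the relevance measure:

```python
def rank_input_variables(candidates, target):
    """Rank candidate input variables by |Pearson correlation| with the
    target, strongest first. candidates: dict of name -> sample list."""
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)
    return sorted(candidates, key=lambda k: -abs(corr(candidates[k], target)))
```

A true PMI scheme would additionally discount candidates that are redundant given already-selected variables; the ranking above captures only the relevance half of that trade-off.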

  20. StreamQRE: Modular Specification and Efficient Evaluation of Quantitative Queries over Streaming Data.

    Science.gov (United States)

    Mamouras, Konstantinos; Raghothaman, Mukund; Alur, Rajeev; Ives, Zachary G; Khanna, Sanjeev

    2017-06-01

    Real-time decision making in emerging IoT applications typically relies on computing quantitative summaries of large data streams in an efficient and incremental manner. To simplify the task of programming the desired logic, we propose StreamQRE, which provides natural and high-level constructs for processing streaming data. Our language has a novel integration of linguistic constructs from two distinct programming paradigms: streaming extensions of relational query languages and quantitative extensions of regular expressions. The former allows the programmer to employ relational constructs to partition the input data by keys and to integrate data streams from different sources, while the latter can be used to exploit the logical hierarchy in the input stream for modular specifications. We first present the core language with a small set of combinators, formal semantics, and a decidable type system. We then show how to express a number of common patterns with illustrative examples. Our compilation algorithm translates the high-level query into a streaming algorithm with precise complexity bounds on per-item processing time and total memory footprint. We also show how to integrate approximation algorithms into our framework. We report on an implementation in Java, and evaluate it with respect to existing high-performance engines for processing streaming data. Our experimental evaluation shows that (1) StreamQRE allows more natural and succinct specification of queries compared to existing frameworks, (2) the throughput of our implementation is higher than comparable systems (for example, two-to-four times greater than RxJava), and (3) the approximation algorithms supported by our implementation can lead to substantial memory savings.
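    The key-based partitioning of a stream described above can be illustrated with a small incremental aggregator that bounds per-item processing time and memory, in the spirit of (but not using) the StreamQRE Java API:

```python
class KeyedStreamAggregator:
    """Maintain an incrementally updated aggregate per key, with O(1)
    per-item processing time and memory proportional to the key count."""
    def __init__(self, init, step):
        self.init = init    # () -> initial state for a new key
        self.step = step    # (state, item) -> updated state
        self.state = {}
    def push(self, key, item):
        self.state[key] = self.step(self.state.get(key, self.init()), item)
        return self.state[key]

# Example: running (count, sum) per key, e.g. per-sensor averages.
running = KeyedStreamAggregator(lambda: (0, 0.0),
                                lambda s, x: (s[0] + 1, s[1] + x))
```

Each `push` folds one stream item into the state for its key, so the query result is always available without buffering the stream.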

  1. Enhanced Input in LCTL Pedagogy

    Directory of Open Access Journals (Sweden)

    Marilyn S. Manley

    2009-08-01

    Language materials for the more-commonly-taught languages (MCTLs) often include visual input enhancement (Sharwood Smith 1991, 1993), which makes use of typographical cues like bolding and underlining to enhance the saliency of targeted forms. For a variety of reasons, this paper argues that the use of enhanced input, both visual and oral, is especially important as a tool for the less-commonly-taught languages (LCTLs). As there continues to be a scarcity of teaching resources for the LCTLs, individual teachers must take it upon themselves to incorporate enhanced input into their own self-made materials. Specific examples of how to incorporate both visual and oral enhanced input into language teaching are drawn from the author’s own experiences teaching Cuzco Quechua. Additionally, survey results are presented from the author’s Fall 2010 semester Cuzco Quechua language students, supporting the use of both visual and oral enhanced input.

  3. The Anthropogenic Effects of Hydrocarbon Inputs to Coastal Seas: Are There Potential Biogeochemical Impacts?

    Science.gov (United States)

    Anderson, M. R.; Rivkin, R. B.

    2016-02-01

    Petroleum hydrocarbon discharges related to fossil fuel exploitation have the potential to alter microbial processes in the upper ocean. While the ecotoxicological effects of such inputs are commonly evaluated, the potential for eutrophication from the constituent organic and inorganic nutrients has been largely ignored. Hydrocarbons from natural seeps and anthropogenic sources represent a measurable source of organic carbon for surface waters. The most recent (1989-1997) estimate of average world-wide input of hydrocarbons to the sea is 1.250 × 10¹² g/yr ≈ 1.0 × 10¹² g C/yr. Produced water from offshore platforms is the largest waste stream from oil and gas exploitation and contributes significant quantities of inorganic nutrients such as N, P and Fe. In coastal areas where such inputs are a significant source of these nutrients, model studies show the potential to shift production toward smaller cells and net heterotrophy. The consequences of these nutrient sources for coastal systems and semi-enclosed seas are complex and difficult to predict, because (1) there is a lack of comprehensive data on inputs and in situ concentrations and (2) there is no conceptual or quantitative framework to consider their effects on ocean biogeochemical processes. Here we use examples from the North Sea (produced water discharges 1% of total riverine input and NH4 3% of the annual riverine nitrogen load), the South China Sea (total petroleum hydrocarbons = 10-1750 μg/l in offshore waters), and the Gulf of Mexico (seeps = 76-106 × 10⁹ g C/yr, Macondo blowout 545 × 10⁹ g C) to demonstrate how hydrocarbon and produced water inputs can influence basin-scale biogeochemical and ecosystem processes and to propose a framework to consider these effects on larger scales.

  4. The Impact of Quantitative Data Provided by a Multi-spectral Digital Skin Lesion Analysis Device on Dermatologists' Decisions to Biopsy Pigmented Lesions.

    Science.gov (United States)

    Farberg, Aaron S; Winkelmann, Richard R; Tucker, Natalie; White, Richard; Rigel, Darrell S

    2017-09-01

    BACKGROUND: Early diagnosis of melanoma is critical to survival. New technologies, such as a multi-spectral digital skin lesion analysis (MSDSLA) device [MelaFind, STRATA Skin Sciences, Horsham, Pennsylvania], may be useful to enhance clinician evaluation of concerning pigmented skin lesions. Previous studies evaluated the effect of only the binary output. OBJECTIVE: The objective of this study was to determine how decisions dermatologists make regarding pigmented lesion biopsies are impacted by providing both the underlying classifier score (CS) and the associated probability of risk provided by multi-spectral digital skin lesion analysis. This outcome was also compared against the improvement reported with the provision of only the binary output. METHODS: Dermatologists attending an educational conference evaluated 50 pigmented lesions (25 melanomas and 25 benign lesions). Participants were asked if they would biopsy the lesion based on clinical images, and were asked this question again after being shown multi-spectral digital skin lesion analysis data that included the probability graphs and classifier score. RESULTS: Data were analyzed from a total of 160 United States board-certified dermatologists. Biopsy sensitivity for melanoma improved from 76 percent following clinical evaluation to 92 percent after quantitative multi-spectral digital skin lesion analysis information was provided. Negative predictive value also increased (68% vs. 91%), as did specificity following analysis (64% vs. 86%). Incorporating quantitative data into physician evaluation of pigmented lesions led to both increased sensitivity and specificity, thereby resulting in more accurate biopsy decisions.

  5. Dynamic Contrast-Enhanced Perfusion MRI of High Grade Brain Gliomas Obtained with Arterial or Venous Waveform Input Function.

    Science.gov (United States)

    Filice, Silvano; Crisi, Girolamo

    2016-01-01

    The aim of this study was to evaluate the differences in dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) perfusion estimates of high-grade brain gliomas (HGG) due to the use of an input function (IF) obtained from arterial (AIF) and venous (VIF) approaches, respectively, by two different commercially available software applications. This prospective study included 20 patients with a pathologically confirmed diagnosis of high-grade glioma. The source data were processed using two dedicated commercial DCE packages, both based on the extended Tofts model, the first customized to obtain the input function from arterial measurement and the second from sagittal sinus sampling. The quantitative parametric perfusion maps estimated by the two software packages were compared by means of a region of interest (ROI) analysis. The resulting input functions from venous and arterial data were also compared. No significant difference was found between the perfusion parameters obtained with the two software packages at the .05 significance level, and the comparison of the VIFs and AIFs obtained by the two packages likewise showed no statistical differences. Direct comparison of DCE-MRI measurements with an IF generated from arterial or venous waveforms thus led to no statistical difference in quantitative metrics for evaluating HGG. However, additional research involving DCE-MRI acquisition protocols and post-processing would be beneficial to further substantiate the effectiveness of the venous approach as the IF method compared with arterial-based IF measurement. Copyright © 2015 by the American Society of Neuroimaging.
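
    Both packages are based on the extended Tofts model. As a point of reference, here is a minimal numerical sketch of that model (forward computation of tissue concentration from a given input function; the parameter names Ktrans, ve, vp are standard, but the simple discretization is an assumption, not either vendor's implementation):

```python
import numpy as np

def tofts_extended(t, cp, ktrans, ve, vp):
    """Extended Tofts model: Ct(t) = vp*Cp(t) + Ktrans * (Cp conv exp(-kep*t))(t),
    where kep = Ktrans/ve.  t in minutes on a uniform grid, cp sampled on t."""
    kep = ktrans / ve
    dt = t[1] - t[0]
    irf = np.exp(-kep * t)                     # impulse response of the EES
    conv = np.convolve(cp, irf)[:len(t)] * dt  # discrete convolution integral
    return vp * cp + ktrans * conv
```

    A quick sanity check on any implementation: with a constant input Cp = 1, Ct approaches vp + ve at long times.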

  6. Cancer and the LGBTQ Population: Quantitative and Qualitative Results from an Oncology Providers' Survey on Knowledge, Attitudes, and Practice Behaviors.

    Science.gov (United States)

    Tamargo, Christina L; Quinn, Gwendolyn P; Sanchez, Julian A; Schabath, Matthew B

    2017-10-07

    Despite growing social acceptance, the LGBTQ population continues to face barriers to healthcare, including fear of stigmatization by healthcare providers and providers' lack of knowledge about LGBTQ-specific health issues. This analysis focuses on the assessment of quantitative and qualitative responses from a subset of providers who identified as specialists treating one or more of the seven cancers that may be disproportionately common in LGBTQ patients. A 32-item web-based survey was emailed to 388 oncology providers at a single institution. The survey assessed demographics, knowledge, attitudes, and practice behaviors. Oncology providers specializing in these seven cancer types had poor knowledge of LGBTQ-specific health needs, with fewer than half of the surveyed providers (49.5%) correctly answering the knowledge questions. Most providers had overall positive attitudes toward LGBTQ patients, with 91.7% agreeing they would be comfortable treating this population and would support education and/or training on LGBTQ-related cancer health issues. Results suggest that despite generally positive attitudes toward the LGBTQ population, oncology providers who treat the cancer types most prevalent in this population lack knowledge of its unique health issues. Knowledge and practice behaviors may improve with enhanced education and training on this population's specific needs.

  7. Cost efficiency with triangular fuzzy number input prices: An application of DEA

    International Nuclear Information System (INIS)

    Bagherzadeh Valami, H.

    2009-01-01

    The cost efficiency (CE) model has been considered by researchers as a Data Envelopment Analysis (DEA) model for evaluating the efficiency of DMUs. In this model, the possibility of producing the outputs of a target DMU is evaluated at the input prices of that DMU, which provides a criterion for evaluating the CE of DMUs. The main contribution of this paper is an approach for generalizing the CE of DMUs to the case where their input prices are triangular fuzzy numbers, drawing directly on preliminary concepts of fuzzy theory and CE.
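
    As background, the crisp cost-efficiency model underlying this generalization can be sketched as a linear program. The sketch below assumes a constant-returns-to-scale technology and uses scipy's linprog with invented illustrative data; it is not the paper's fuzzy formulation, which would additionally evaluate the model at the vertices of the triangular price numbers:

```python
import numpy as np
from scipy.optimize import linprog

def cost_efficiency(X, Y, prices, j):
    """Crisp DEA cost efficiency of DMU j under constant returns to scale.
    X: (m, n) input matrix, Y: (s, n) output matrix, prices: (m,) input
    prices of DMU j.  CE = (minimal cost of producing DMU j's outputs)
    divided by DMU j's observed cost."""
    m, n = X.shape
    s = Y.shape[0]
    # decision variables: [x (m target inputs), lam (n intensity weights)]
    c = np.concatenate([prices, np.zeros(n)])
    # constraints X lam <= x and Y lam >= Y[:, j], rewritten as A_ub z <= b_ub
    A_ub = np.block([[-np.eye(m), X],
                     [np.zeros((s, m)), -Y]])
    b_ub = np.concatenate([np.zeros(m), -Y[:, j]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (m + n))
    return res.fun / (prices @ X[:, j])
```

    A fuzzy extension in the spirit of the paper would run this three times, at the lower, modal and upper price vertices, yielding a triangular CE value.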

  8. Input preshaping with frequency domain information for flexible-link manipulator control

    Science.gov (United States)

    Tzes, Anthony; Englehart, Matthew J.; Yurkovich, Stephen

    1989-01-01

    The application of an input preshaping scheme to flexible manipulators is considered. The resulting control corresponds to a feedforward term that convolves the desired reference input in real time with a sequence of impulses and produces a vibration-free output. The robustness of the algorithm with respect to injected disturbances and modal frequency variations is not satisfactory on its own, but can be improved by convolving the input with a longer sequence of impulses. Incorporating the preshaping scheme into a closed-loop plant using acceleration feedback offers satisfactory disturbance rejection, due to the feedback, and cancellation of the flexible-mode effects, due to the preshaping. A frequency-domain identification scheme is used to estimate the modal frequencies on-line and subsequently update the spacing between the impulses. The combined adaptive input preshaping scheme provides the fastest possible slew that results in a vibration-free output.
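
    A common concrete instance of such preshaping is the two-impulse zero-vibration (ZV) shaper. The sketch below uses the standard textbook ZV formulas (not necessarily the exact scheme of this paper) to build the impulse pair for one flexible mode and convolve it with a sampled reference:

```python
import numpy as np

def zv_impulses(wn, zeta):
    """Two-impulse zero-vibration (ZV) shaper for a mode with natural
    frequency wn (rad/s) and damping ratio zeta: amplitudes sum to one,
    second impulse delayed by half the damped period."""
    wd = wn * np.sqrt(1.0 - zeta**2)
    K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta**2))
    return [(0.0, 1.0 / (1.0 + K)), (np.pi / wd, K / (1.0 + K))]

def preshape(ref, dt, impulses):
    """Convolve a sampled reference trajectory with the impulse sequence."""
    n = int(round(max(t for t, _ in impulses) / dt)) + 1
    h = np.zeros(n)
    for t, a in impulses:
        h[int(round(t / dt))] += a
    return np.convolve(ref, h)[:len(ref)]
```

    Updating wn on-line from a frequency-domain identification, as described above, amounts to recomputing the impulse spacing pi/wd.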

  9. Enhancing the Quantitative Representation of Socioeconomic Conditions in the Shared Socio-economic Pathways (SSPs) using the International Futures Model

    Science.gov (United States)

    Rothman, D. S.; Siraj, A.; Hughes, B.

    2013-12-01

    The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios is the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented as narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time there have been calls, particularly from the IAV community, for the SSPs to include additional quantitative information on other key social factors discussed in the narratives, such as income inequality, governance, health, and access to key infrastructure. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, while also incorporating assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.

  10. Quantitation of cerebral blood flow using HMPAO tomography

    International Nuclear Information System (INIS)

    Bruyant, P.; Mallet, J.J.; Sau, J.; Teyssier, R.; Bonmartin, A.

    1997-01-01

    A method has been developed to quantitate regional cerebral blood flow (rCBF) using 99mTc-HMPAO. It relies on the application of the bolus distribution principle. The rCBF is determined using compartmental analysis, by measuring the amount of tracer retained in the parenchyma and the input function. The values for the blood:brain partition coefficient and for the conversion rate from the lipophilic to the hydrophilic form of the tracer are taken from the literature. Mean values for rCBF in eight patients are 41.1 ± 6.4 and 25.6 ± 5.8 mL.min^-1 for the grey matter and the white matter, respectively (mean ± standard deviation). This method allows rCBF to be quantitated with one SPET scan and one venous blood sample. (authors)
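
    The underlying flow estimate can be illustrated with the simplest (microsphere) limit of such a compartmental model: tracer retained at time T divided by the time integral of the arterial input function. This sketch deliberately ignores the back-diffusion and lipophilic-to-hydrophilic conversion corrections that the authors take from the literature, so treat it as illustrative only:

```python
import numpy as np

def rcbf_microsphere(q_brain, t, ca):
    """Microsphere approximation of regional CBF: brain activity q_brain
    retained at time t[-1] divided by the time integral of the arterial
    input function ca(t).  If q_brain is in kBq/100 g and the integral in
    kBq*min/mL, the result is in mL/min/100 g."""
    integral = np.sum(0.5 * (ca[1:] + ca[:-1]) * np.diff(t))  # trapezoid rule
    return q_brain / integral
```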

  11. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  12. Mobile gaze input system for pervasive interaction

    DEFF Research Database (Denmark)

    2017-01-01

    The gaze tracking unit provides feedback to the user in response to the received command input, and guides the user on how to position the mobile unit in front of his eyes. The gaze tracking unit interacts with one or more controlled devices via wireless or wired communications; example devices include a lock, a thermostat, a light or a TV. The connection between the gaze tracking unit and a controlled device may be temporary or longer-lasting. The gaze tracking unit may detect features of the eye that provide information about the identity of the user.

  13. Envisioning a Quantitative Studies Center: A Liberal Arts Perspective

    Directory of Open Access Journals (Sweden)

    Gizem Karaali

    2010-01-01

    Several academic institutions are searching for ways to help students develop their quantitative reasoning abilities and become more adept at higher-level tasks that involve quantitative skills. In this note we study the particular way Pomona College has framed this issue within its own context and what it plans to do about it. To this end we describe our efforts as members of a campus-wide committee that was assigned the duty of investigating the feasibility of founding a quantitative studies center on our campus. These efforts involved analysis of data collected through a faculty questionnaire, discipline-specific input obtained from each departmental representative, and a survey of what some of our peer institutions are doing to tackle these issues. In our studies, we identified three critical needs where quantitative support would be most useful in our case: tutoring and mentoring for entry-level courses; support for various specialized and analytic software tools for upper-level courses; and uniform basic training for student tutors and mentors. We surmise that our challenges can be mitigated effectively via the formation of a well-focused and well-planned quantitative studies center. We believe our process, findings and final proposal will be helpful to others who are looking to resolve similar issues on their own campuses.

  14. FLUTAN 2.0. Input specifications

    International Nuclear Information System (INIS)

    Willerding, G.; Baumann, W.

    1996-05-01

    FLUTAN is a highly vectorized computer code for 3D fluid-dynamic and thermal-hydraulic analyses in Cartesian or cylinder coordinates. It is related to the family of COMMIX codes originally developed at Argonne National Laboratory, USA, and particularly to COMMIX-1A and COMMIX-1B, which were made available to FZK in the frame of cooperation contracts within the fast reactor safety field. FLUTAN 2.0 is an improved version of the FLUTAN code released in 1992. It offers some additional innovations, e.g. the QUICK-LECUSSO-FRAM techniques for reducing numerical diffusion in the k-ε turbulence model equations; a more sophisticated wall model for specifying a mass flow outside the surface walls together with its flow path and its associated inlet and outlet flow temperatures; and a revised and upgraded pressure boundary condition that fully includes the outlet cells in the solution process of the conservation equations. Last but not least, a so-called visualization option based on VISART standards has been provided. This report contains detailed input instructions, presents formulations of the various model options, and explains how to use the code by means of comprehensive sample input. (orig.) [de]

  15. Simultaneous acquisition of dynamic PET-MRI: arterial input function using DSC-MRI and [18F]-FET

    Energy Technology Data Exchange (ETDEWEB)

    Caldeira, Liliana; Yun, Seong Dae; Silva, Nuno da; Filss, Christian; Scheins, Juergen; Telmann, Lutz; Herzog, Hans; Shah, Jon [Institute of Neuroscience and Medicine - 4, Forschungszentrum Juelich GmbH (Germany)

    2015-05-18

    This work focuses on the study of simultaneous dynamic MR-PET acquisition in brain tumour patients. MR-based perfusion-weighted imaging (PWI) and [18F]-FET PET are dynamic methods which allow tumour metabolism to be evaluated quantitatively. In both methods, an arterial input function (AIF) is necessary for quantification; however, AIF estimation is a challenging task. In this work, we explore the possibilities of combining dynamic MR and PET AIFs.

  16. Simultaneous acquisition of dynamic PET-MRI: arterial input function using DSC-MRI and [18F]-FET

    International Nuclear Information System (INIS)

    Caldeira, Liliana; Yun, Seong Dae; Silva, Nuno da; Filss, Christian; Scheins, Juergen; Telmann, Lutz; Herzog, Hans; Shah, Jon

    2015-01-01

    This work focuses on the study of simultaneous dynamic MR-PET acquisition in brain tumour patients. MR-based perfusion-weighted imaging (PWI) and [18F]-FET PET are dynamic methods which allow tumour metabolism to be evaluated quantitatively. In both methods, an arterial input function (AIF) is necessary for quantification; however, AIF estimation is a challenging task. In this work, we explore the possibilities of combining dynamic MR and PET AIFs.

  17. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative, index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction, with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point-value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies, as required. (author)
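
    The calibration step described here, mapping semi-quantitative likelihood scores onto peer-system failure rates, can be sketched as a log-linear interpolation. The anchor points below are invented for illustration; the paper's mapping of the full score and failure-rate distributions is more involved:

```python
import numpy as np

def scores_to_frequencies(scores, anchor_scores, anchor_rates):
    """Map semi-quantitative likelihood scores to failure frequencies by
    interpolating linearly in log10 of failure rates calibrated against
    peer pipeline systems."""
    return 10.0 ** np.interp(scores, anchor_scores, np.log10(anchor_rates))
```

    For example, with anchors of 1e-6 failures per km-year at score 0 and 1e-2 at score 100, a mid-range score of 50 maps to 1e-4.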

  18. EPICS Input/Output Controller (IOC) application developer's guide. APS Release 3.12

    International Nuclear Information System (INIS)

    Kraimer, M.R.

    1994-11-01

    This document describes the core software that resides in an Input/Output Controller (IOC), one of the major components of EPICS. The basic components are: (OPI) Operator Interface, a UNIX-based workstation which can run various EPICS tools; (IOC) Input/Output Controller, a VME/VXI-based chassis containing a Motorola 68xxx processor, various I/O modules, and VME modules that provide access to other I/O buses such as GPIB; and (LAN) Local Area Network, the communication network which allows the IOCs and OPIs to communicate. EPICS provides a software component, Channel Access, which provides network-transparent communication between a Channel Access client and an arbitrary number of Channel Access servers.

  19. A Review of Quantitative Situation Assessment Models for Nuclear Power Plant Operators

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Situation assessment is the process of developing situation awareness, and situation awareness is defined as 'the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.' Situation awareness is an important element influencing human actions, because human decision making is based on the result of situation assessment or situation awareness. There are many models of situation awareness, and they can be categorized as qualitative or quantitative. Because the effects of input factors on situation awareness can be investigated through quantitative models, these are more useful than qualitative models for the design of operator interfaces, automation strategies, training programs, and so on. This study presents a review of two quantitative models of situation assessment (SA) for nuclear power plant operators.

  20. Input filter compensation for switching regulators

    Science.gov (United States)

    Lee, F. C.; Kelkar, S. S.

    1982-01-01

    The problems caused by the interaction between the input filter, output filter, and the control loop are discussed. The input filter design is made more complicated by the need to avoid performance degradation while staying within the weight and loss limitations. Conventional input filter design techniques are then discussed. The concept of pole-zero cancellation is reviewed; this concept is the basis for an approach to control the peaking of the output impedance of the input filter and thus mitigate some of the problems the filter causes. The proposed approach for controlling this peaking is to use a feedforward loop working in conjunction with feedback loops, thus forming a total state control scheme. The design of the feedforward loop for a buck regulator is described, and a possible implementation of the feedforward loop design is suggested.
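
    The peaking being controlled here is that of the input filter's output impedance. A quick numerical sketch (single-stage LC filter with illustrative component values, not taken from the paper) shows how the peak depends on the damping resistance:

```python
import numpy as np

def output_impedance(f, L, C, R):
    """|Z_out| of a single-stage LC input filter: the parallel combination
    of the source branch (R + jwL) and the shunt capacitor 1/(jwC)."""
    w = 2 * np.pi * f
    zl = R + 1j * w * L
    zc = 1.0 / (1j * w * C)
    return np.abs(zl * zc / (zl + zc))

f = np.logspace(2, 5, 4000)                        # 100 Hz .. 100 kHz
z = output_impedance(f, L=100e-6, C=50e-6, R=0.05)
print(f"peak |Z_out| ~ {z.max():.1f} ohm near {f[z.argmax()]:.0f} Hz")
```

    The peak sits near the resonance f0 = 1/(2*pi*sqrt(L*C)) and is what interacts with the regulator's negative input impedance; raising R, or emulating damping with a feedforward loop as proposed, flattens it.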

  1. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    Science.gov (United States)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify the quantitative standards for assessing upset recovery performance. This review covers current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, as well as whether that input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle and maximum g loading are reviewed as well.

  2. Isolating Graphical Failure-Inducing Input for Privacy Protection in Error Reporting Systems

    Directory of Open Access Journals (Sweden)

    Matos João

    2016-04-01

    This work proposes a new privacy-enhancing system that minimizes the disclosure of information in error reports. Error reporting mechanisms are of the utmost importance for correcting software bugs but, unfortunately, the transmission of an error report may reveal users' private information. Some privacy-enhancing systems for error reporting have been presented in past years, yet they rely on path condition analysis, which we show in this paper to be ineffective when it comes to graphical-based input. Given that numerous applications have graphical user interfaces (GUIs), it is very important to overcome this limitation. This work describes a new privacy-enhancing error reporting system, based on a new input minimization algorithm called GUIᴍɪɴ that is geared towards GUIs, to remove input that is unnecessary to reproduce the observed failure. Before deciding whether to submit the error report, the user is provided with a step-by-step graphical replay of the minimized input, to evaluate whether it still yields sensitive information. We also provide an open source implementation of the proposed system and evaluate it with well-known applications.
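
    For contrast with path-condition approaches, the core idea of failure-preserving input minimization can be sketched as a greedy, ddmin-style loop over GUI events. This is a generic sketch, not the GUIᴍɪɴ algorithm itself, and `fails` stands for a hypothetical oracle that replays an event sequence and reports whether the failure reproduces:

```python
def minimize_events(events, fails):
    """Greedily drop single events while the failure still reproduces.
    `events` is a list of GUI events; `fails(seq)` replays seq and returns
    True iff the original failure is observed.  Returns a subsequence in
    which every remaining single event is necessary."""
    assert fails(events), "the full sequence must reproduce the failure"
    changed = True
    while changed:
        changed = False
        for i in range(len(events)):
            candidate = events[:i] + events[i + 1:]
            if fails(candidate):
                events, changed = candidate, True
                break
    return events
```

    Only the minimized sequence would then be replayed step by step for the user before submission.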

  3. Soil organic carbon dynamics jointly controlled by climate, carbon inputs, soil properties and soil carbon fractions.

    Science.gov (United States)

    Luo, Zhongkui; Feng, Wenting; Luo, Yiqi; Baldock, Jeff; Wang, Enli

    2017-10-01

    Soil organic carbon (SOC) dynamics are regulated by the complex interplay of climatic, edaphic and biotic conditions. However, the interrelation of SOC and these drivers, and their potential connection networks, are rarely assessed quantitatively. Using observations of SOC dynamics with detailed soil properties from 90 field trials at 28 sites under different agroecosystems across the Australian cropping regions, we investigated the direct and indirect effects of climate, soil properties, carbon (C) inputs and soil C pools (a total of 17 variables) on the SOC change rate (rC, Mg C ha^-1 yr^-1). Among these variables, we found that the most influential on rC were the average C input amount, annual precipitation, and the total SOC stock at the beginning of the trials. Overall, C inputs (including C input amount and pasture frequency in the crop rotation system) accounted for 27% of the relative influence on rC, followed by climate (precipitation and temperature) at 25%, soil C pools (pool size and composition) at 24%, and soil properties (such as cation exchange capacity, clay content and bulk density) at 24%. Path analysis identified a network of intercorrelations of climate, soil properties, C inputs and soil C pools in determining rC. The direct correlation of rC with climate was significantly weakened when the effects of soil properties and C pools were removed, and vice versa. These results reveal the relative importance of climate, soil properties, C inputs and C pools, and their complex interconnections, in regulating SOC dynamics. Ignoring the impact of changes in soil properties, C pool composition and C input (quantity and quality) on SOC dynamics is likely one of the main sources of uncertainty in SOC predictions from process-based SOC models. © 2017 John Wiley & Sons Ltd.

  4. Reconstruction of neuronal input through modeling single-neuron dynamics and computations

    International Nuclear Information System (INIS)

    Qin, Qing; Wang, Jiang; Yu, Haitao; Deng, Bin; Chan, Wai-lok

    2016-01-01

    Mathematical models provide a mathematical description of neuron activity, which can be used to better understand and quantify neural computations and the corresponding biophysical mechanisms evoked by a stimulus. In this paper, based on the output spike train evoked by an acupuncture mechanical stimulus, we present two different levels of models describing the input-output system, to achieve the reconstruction of the neuronal input. The reconstruction process is divided into two steps. First, the neuronal spiking event is considered as a Gamma stochastic process; the scale parameter and the shape parameter of the Gamma process are defined as two spiking characteristics, which are estimated by a state-space method. Then, a leaky integrate-and-fire (LIF) model is used to mimic the response system, and the estimated spiking characteristics are transformed into two temporal input parameters of the LIF model through two conversion formulas. We test this reconstruction method on three different groups of simulated data; all three groups of estimates reconstruct the input parameters with fairly high accuracy. We then use this reconstruction method to estimate the non-measurable acupuncture input parameters. Results show that under three different frequencies of acupuncture stimulus, the estimated input parameters differ markedly: the higher the frequency of the acupuncture stimulus, the higher the accuracy of the reconstruction.

  5. Reconstruction of neuronal input through modeling single-neuron dynamics and computations

    Energy Technology Data Exchange (ETDEWEB)

    Qin, Qing; Wang, Jiang; Yu, Haitao; Deng, Bin, E-mail: dengbin@tju.edu.cn; Chan, Wai-lok [School of Electrical Engineering and Automation, Tianjin University, Tianjin 300072 (China)

    2016-06-15

    Mathematical models provide a mathematical description of neuron activity, which can be used to better understand and quantify neural computations and the corresponding biophysical mechanisms evoked by a stimulus. In this paper, based on the output spike train evoked by an acupuncture mechanical stimulus, we present two different levels of models describing the input-output system, to achieve the reconstruction of the neuronal input. The reconstruction process is divided into two steps. First, the neuronal spiking event is considered as a Gamma stochastic process; the scale parameter and the shape parameter of the Gamma process are defined as two spiking characteristics, which are estimated by a state-space method. Then, a leaky integrate-and-fire (LIF) model is used to mimic the response system, and the estimated spiking characteristics are transformed into two temporal input parameters of the LIF model through two conversion formulas. We test this reconstruction method on three different groups of simulated data; all three groups of estimates reconstruct the input parameters with fairly high accuracy. We then use this reconstruction method to estimate the non-measurable acupuncture input parameters. Results show that under three different frequencies of acupuncture stimulus, the estimated input parameters differ markedly: the higher the frequency of the acupuncture stimulus, the higher the accuracy of the reconstruction.
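
    The forward half of this pipeline, the LIF response model, can be sketched as follows. This is a generic textbook LIF with assumed parameter values; the paper's conversion formulas from Gamma-process characteristics to the two input parameters are not reproduced here:

```python
import numpy as np

def lif_spike_times(i_mean, dt=0.1, t_end=200.0, tau=10.0, r=1.0,
                    v_rest=0.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron, tau*dV/dt = -(V - v_rest) + r*I.
    Times in ms; returns the list of spike times for a constant input."""
    v, spikes = v_rest, []
    for step in range(int(t_end / dt)):
        v += (dt / tau) * (-(v - v_rest) + r * i_mean)
        if v >= v_th:                 # threshold crossing: spike and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes
```

    For r*I above threshold, the analytic interspike interval is tau*ln(r*I/(r*I - v_th)), about 6.9 ms for the settings above with I = 2, which the simulation approximates; for r*I below threshold the neuron stays silent.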

  6. [Input on monitoring and evaluation practices of government management of Brazilian Municipal Health Departments].

    Science.gov (United States)

    Miranda, Alcides Silva de; Carvalho, André Luis Bonifácio de; Cavalcante, Caio Garcia Correia Sá

    2012-04-01

    What do the leaders of Municipal Health Services (SMS) report and say about the systematic monitoring and evaluation of their own government management? The purpose of this paper is to provide input for the formulation of plausible hypotheses about such institutional processes and practices, based on information produced in an exploratory study. This is a multiple case study with quantitative and qualitative analysis of answers to a semi-structured questionnaire given to government officials of a systematic sample of 577 Municipal Health Services (10.4% of the total in Brazil), selected and stratified by proportional distribution among states and by the population size of municipalities. In general, the results show that approximately half of the respondents use information from health monitoring and evaluation to orient decision-making, planning and other management approaches; this proportion tends to decrease in cities with smaller populations. There are specific and significant gaps in financial, personnel and crisis management. The evidence supports the hypothesis that these processes and practices are still at an early stage.

  7. [Automatic control system with a single-input-dual-output model for controlling instrument service-life efficiency]

    Directory of Open Access Journals (Sweden)

    S.N.M.P. Simamora

    2014-10-01

    Efficiency is achieved when the ratio of useful output to the total resources consumed approaches 1 (the absolute limit). An instrument achieves efficiency if its power consumption over its service life decreases significantly compared with the previous condition, in which the instrument was not equipped with the additional system (the proposed improved model). It is even more effective if the inputs are used in unison to achieve a homogeneous output. In this research, an automatic control system for a single-input-dual-output model has been designed and implemented, with a lamp and a fan as the sample instruments. The source voltage used is AC (alternating current), and the system was tested using quantitative research and instrumentation methods (with observed measuring instruments). The results demonstrate a significant efficiency gain for the single-input-dual-output model when the trial instruments (lamp and fan) are compared with the previous condition, and show that the design runs well.

  8. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    Science.gov (United States)

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  9. Low cost data acquisition module for evaluating the quantitative performance of daylight systems

    Energy Technology Data Exchange (ETDEWEB)

    Ciampini, F.; Scarazzato, P.S. [Universidade Estadual de Campinas, Faculdade de Engenharia Civil, Arquitetura e Urbanismo, P.O. Box 6021, CEP 13083-852 Campinas (Brazil); Neves, A.A.R. [Universidade Estadual de Campinas, Instituto de Fisica Gleb Wataghin, Departamento de Eletronica Quantica, P.O. Box 6165, 13083-970 Campinas (Brazil); Pereira, D.C.L.; Yamanaka, M.H. [Universidade de Sao Paulo, Faculdade de Arquitetura e Urbanismo, Departamento de Tecnologia da Arquitetura, Rua do Lago, 878 CEP 05508-080, Sao Paulo (Brazil)

    2007-09-15

    The search for efficient, sustainable constructions that allow the user contact with the outer environment has stimulated the development of advanced daylight-exploitation strategies in various devices. A low-cost data acquisition system was developed in this study to observe the distribution of natural light inside a prototype and to evaluate the quantitative performance of light-redirecting systems. The light sensor is a light-dependent resistor, whose resistance decreases with illuminance following a log-log dependence. Calibration curves are set up to relate the change in resistance to absolute illuminance. The system therefore provides continuous measurement of the illuminance at various sampled points in the interior test space, with a 0.03% digitization error due to the 12-bit resolution; the final measured error of 5% is mainly due to the system calibration and resistance memory history. The circuit connects to a standard parallel port of any personal computer and supplies 64 analog inputs, one for each light sensor, and can easily be modified to serve different numbers of analog inputs or communication ports. (author)
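
    The log-log calibration amounts to a straight-line fit in log space. The sketch below uses synthetic calibration pairs; in practice the coefficients come from the per-sensor calibration curves the authors describe:

```python
import numpy as np

def fit_ldr(resistance_ohm, illuminance_lux):
    """Fit log10(lux) = a*log10(R) + b to calibration pairs; returns (a, b).
    For an LDR, a is negative: resistance falls as illuminance rises."""
    a, b = np.polyfit(np.log10(resistance_ohm), np.log10(illuminance_lux), 1)
    return a, b

def to_lux(r_ohm, a, b):
    """Convert a measured resistance to illuminance with the fitted law."""
    return 10.0 ** (a * np.log10(r_ohm) + b)
```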

  10. 7 CFR 3430.607 - Stakeholder input.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Stakeholder input. 3430.607 Section 3430.607 Agriculture Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION... § 3430.607 Stakeholder input. CSREES shall seek and obtain stakeholder input through a variety of forums...

  11. Can Simulation Credibility Be Improved Using Sensitivity Analysis to Understand Input Data Effects on Model Outcome?

    Science.gov (United States)

    Myers, Jerry G.; Young, M.; Goodenow, Debra A.; Keenan, A.; Walton, M.; Boley, L.

    2015-01-01

    Model and simulation (MS) credibility is defined as the quality to elicit belief or trust in MS results. NASA-STD-7009 [1] delineates eight components (Verification, Validation, Input Pedigree, Results Uncertainty, Results Robustness, Use History, MS Management, People Qualifications) that address quantifying model credibility, and provides guidance to model developers, analysts, and end users for assessing MS credibility. Of the eight characteristics, input pedigree, or the quality of the data used to develop model input parameters, governing functions, or initial conditions, can vary significantly. These data quality differences have varying consequences across the range of MS applications. NASA-STD-7009 requires that the lowest input data quality be used to represent the entire set of input data when scoring the input pedigree credibility of the model. This requirement provides a conservative assessment of model inputs and maximizes the communication of the potential risk of using model outputs. Unfortunately, in practice, this may result in overly pessimistic communication of the MS output, undermining the credibility of simulation predictions to decision makers. This presentation proposes an alternative assessment mechanism, utilizing results parameter robustness, also known as model input sensitivity, to improve the credibility scoring process for specific simulations.

  12. Noninvasive quantification of cerebral metabolic rate for glucose in rats using 18F-FDG PET and standard input function

    Science.gov (United States)

    Hori, Yuki; Ihara, Naoki; Teramoto, Noboru; Kunimi, Masako; Honda, Manabu; Kato, Koichi; Hanakawa, Takashi

    2015-01-01

    Measurement of arterial input function (AIF) for quantitative positron emission tomography (PET) studies is technically challenging. The present study aimed to develop a method based on a standard arterial input function (SIF) to estimate the input function without blood sampling. We performed 18F-fluorodeoxyglucose studies accompanied by continuous blood sampling for measurement of AIF in 11 rats. The SIF was calculated by averaging AIFs from eight anesthetized rats, after normalization with body mass (BM) and injected dose (ID). Then, the individual input function was estimated using two types of SIF: (1) SIF calibrated by the individual's BM and ID (estimated individual input function, EIFNS) and (2) SIF calibrated by a single blood sampling as proposed previously (EIF1S). No significant differences in area under the curve (AUC) or cerebral metabolic rate for glucose (CMRGlc) were found across the AIF-, EIFNS-, and EIF1S-based methods using repeated measures analysis of variance. In the correlation analysis, AUC or CMRGlc derived from EIFNS was highly correlated with those derived from AIF and EIF1S. A preliminary comparison between AIF and EIFNS in three awake rats supported the idea that the method might be applicable to behaving animals. The present study suggests that the EIFNS method might serve as a noninvasive substitute for individual AIF measurement. PMID:25966947
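    The SIF construction described above, averaging body-mass- and dose-normalized AIFs from a reference group and then rescaling for a new animal, can be sketched as follows; the normalization convention and the toy curves are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def make_sif(aifs, body_masses, injected_doses):
    """Average AIFs from a reference group after normalizing each curve
    by the animal's body mass (BM) and injected dose (ID)."""
    normalized = [aif * bm / dose
                  for aif, bm, dose in zip(aifs, body_masses, injected_doses)]
    return np.mean(normalized, axis=0)

def estimate_individual_input(sif, body_mass, injected_dose):
    """EIF_NS: rescale the standard input function by the individual's BM and ID."""
    return sif * injected_dose / body_mass
```

    With consistent normalization, rescaling the SIF by an animal's own BM and ID recovers an input function on that animal's scale without blood sampling.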

  13. Automation of Geometry Input for Building Code Compliance Check

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Johansen, Peter Lind; Jensen, Rasmus Lund

    2017-01-01

    Documentation of compliance with the energy performance regulations at the end of the detailed design phase is mandatory for building owners in Denmark. Therefore, besides multidisciplinary input, the building design process requires various iterative analyses, so that the optimal solutions can....... That has left the industry in constant pursuit of possibilities for integration of the tool within the Building Information Modelling environment so that the potential provided by the latter can be harvested and the processes can be optimized. This paper presents a solution for automated data extraction...... from building geometry created in Autodesk Revit and its translation to input for compliance check analysis....

  14. World Input-Output Network.

    Directory of Open Access Journals (Sweden)

    Federica Cerina

    Full Text Available Production systems, traditionally analyzed as almost independent national systems, are increasingly connected on a global scale. Only recently becoming available, the World Input-Output Database (WIOD) is one of the first efforts to construct the global multi-regional input-output (GMRIO) tables. By viewing the world input-output system as an interdependent network where the nodes are the individual industries in different economies and the edges are the monetary goods flows between industries, we analyze respectively the global, regional, and local network properties of the so-called world input-output network (WION) and document its evolution over time. At global level, we find that the industries are highly but asymmetrically connected, which implies that micro shocks can lead to macro fluctuations. At regional level, we find that the world production is still operated nationally or at most regionally as the communities detected are either individual economies or geographically well defined regions. Finally, at local level, for each industry we compare the network-based measures with the traditional methods of backward linkages. We find that the network-based measures such as PageRank centrality and community coreness measure can give valuable insights into identifying the key industries.
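    The PageRank-style centrality mentioned above can be illustrated on a toy flow matrix. A minimal power-iteration sketch; the three-industry monetary flows are hypothetical, and a real analysis would use the WIOD tables:

```python
import numpy as np

# Hypothetical 3-industry flow matrix: flows[i, j] is the monetary value
# of goods shipped from industry i to industry j.
flows = np.array([[0.0, 5.0, 1.0],
                  [2.0, 0.0, 4.0],
                  [3.0, 1.0, 0.0]])

def pagerank(flow, damping=0.85, tol=1e-12):
    """Weighted PageRank by power iteration on the row-normalized flows."""
    n = flow.shape[0]
    transition = flow / flow.sum(axis=1, keepdims=True)  # outflow shares
    rank = np.full(n, 1.0 / n)                           # uniform start
    while True:
        new = (1.0 - damping) / n + damping * transition.T @ rank
        if np.abs(new - rank).sum() < tol:
            return new
        rank = new

ranks = pagerank(flows)
```

    This sketch assumes every industry has some outgoing flow (no dangling nodes), which holds for input-output tables.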

  15. Automation of RELAP5 input calibration and code validation using genetic algorithm

    International Nuclear Information System (INIS)

    Phung, Viet-Anh; Kööp, Kaspar; Grishchenko, Dmitry; Vorobyev, Yury; Kudinov, Pavel

    2016-01-01

    SRQ of primary interest. The ranges of the input parameters were defined based on the experimental data and the results of the calibration process. Then the GA was used to identify combinations of the uncertain input parameters that provide the maximum deviation of code prediction results from the experimental data. Such an approach provides a conservative estimate of the possible discrepancy between the code results and the experimental data.

  16. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    SRQ of primary interest. The ranges of the input parameters were defined based on the experimental data and the results of the calibration process. Then the GA was used to identify combinations of the uncertain input parameters that provide the maximum deviation of code prediction results from the experimental data. Such an approach provides a conservative estimate of the possible discrepancy between the code results and the experimental data.
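    The search strategy described in the two records above, using a GA to find input-parameter combinations that maximize the deviation of code predictions from experiment, can be sketched with a toy surrogate standing in for RELAP5; the model, parameter bounds, and target value are all hypothetical:

```python
import random

random.seed(0)

# Toy surrogate standing in for a RELAP5 run: output depends on two
# uncertain input parameters (entirely hypothetical).
def code_prediction(x, y):
    return 2.0 * x + 0.5 * y * y

EXPERIMENT = 3.0                      # "measured" value of the SRQ
BOUNDS = [(-1.0, 1.0), (-1.0, 1.0)]   # plausible ranges of the inputs

def deviation(ind):
    return abs(code_prediction(*ind) - EXPERIMENT)

def genetic_search(pop_size=30, generations=60, sigma=0.1):
    pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=deviation, reverse=True)       # fitness = deviation
        parents = pop[: pop_size // 2]              # elitist truncation
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [(ga + gb) / 2 + random.gauss(0.0, sigma)
                     for ga, gb in zip(a, b)]       # crossover + mutation
            child = [min(max(g, lo), hi)            # clip to bounds
                     for g, (lo, hi) in zip(child, BOUNDS)]
            children.append(child)
        pop = parents + children
    return max(pop, key=deviation)

worst_case = genetic_search()
```

    The returned individual is the most conservative input combination found: the point within the plausible ranges where the surrogate disagrees most with the "experiment".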

  17. SplicePlot: a utility for visualizing splicing quantitative trait loci.

    Science.gov (United States)

    Wu, Eric; Nance, Tracy; Montgomery, Stephen B

    2014-04-01

    RNA sequencing has provided unprecedented resolution of alternative splicing and splicing quantitative trait loci (sQTL). However, there are few tools available for visualizing the genotype-dependent effects of splicing at a population level. SplicePlot is a simple command line utility that produces intuitive visualization of sQTLs and their effects. SplicePlot takes mapped RNA sequencing reads in BAM format and genotype data in VCF format as input and outputs publication-quality Sashimi plots, hive plots and structure plots, enabling better investigation and understanding of the role of genetics on alternative splicing and transcript structure. Source code and detailed documentation are available at http://montgomerylab.stanford.edu/spliceplot/index.html under Resources and at Github. SplicePlot is implemented in Python and is supported on Linux and Mac OS. A VirtualBox virtual machine running Ubuntu with SplicePlot already installed is also available.

  18. Residual N effects from livestock manure inputs to soils

    DEFF Research Database (Denmark)

    Schröder, Jaap; Bechini, Luca; Bittman, Shabtai

    Organic inputs including livestock manures provide nitrogen (N) to crops beyond the year of their application. This so-called residual N effect should be taken into account when making decisions on N rates for individual fields, but also when interpreting N response trials in preparation...

  19. Input description for BIOPATH

    International Nuclear Information System (INIS)

    Marklund, J.E.; Bergstroem, U.; Edlund, O.

    1980-01-01

    The computer program BIOPATH describes the flow of radioactivity within a given ecosystem after a postulated release of radioactive material and the resulting dose for specified population groups. The present report accounts for the input data necessary to run BIOPATH. The report also contains descriptions of possible control cards and an input example as well as a short summary of the basic theory.(author)

  20. The human motor neuron pools receive a dominant slow‐varying common synaptic input

    Science.gov (United States)

    Negro, Francesco; Yavuz, Utku Şükrü

    2016-01-01

    Key points Motor neurons in a pool receive both common and independent synaptic inputs, although the proportion and role of their common synaptic input is debated.Classic correlation techniques between motor unit spike trains do not measure the absolute proportion of common input and have limitations as a result of the non‐linearity of motor neurons.We propose a method that for the first time allows an accurate quantification of the absolute proportion of low frequency common synaptic input (60%) of common input, irrespective of their different functional and control properties.These results increase our knowledge about the role of common and independent input to motor neurons in force control. Abstract Motor neurons receive both common and independent synaptic inputs. This observation is classically based on the presence of a significant correlation between pairs of motor unit spike trains. The functional significance of different relative proportions of common input across muscles, individuals and conditions is still debated. One of the limitations in our understanding of correlated input to motor neurons is that it has not been possible so far to quantify the absolute proportion of common input with respect to the total synaptic input received by the motor neurons. Indeed, correlation measures of pairs of output spike trains only allow for relative comparisons. In the present study, we report for the first time an approach for measuring the proportion of common input in the low frequency bandwidth (60%) proportion of common low frequency oscillations with respect to their total synaptic input. These results suggest that the central nervous system provides a large amount of common input to motor neuron pools, in a similar way to that for muscles with different functional and control properties. PMID:27151459

  1. Quantitative global sensitivity analysis of a biologically based dose-response pregnancy model for the thyroid endocrine system.

    Science.gov (United States)

    Lumen, Annie; McNally, Kevin; George, Nysia; Fisher, Jeffrey W; Loizou, George D

    2015-01-01

    A deterministic biologically based dose-response model for the thyroidal system in a near-term pregnant woman and the fetus was recently developed to evaluate quantitatively thyroid hormone perturbations. The current work focuses on conducting a quantitative global sensitivity analysis on this complex model to identify and characterize the sources and contributions of uncertainties in the predicted model output. The workflow and methodologies suitable for computationally expensive models, such as the Morris screening method and Gaussian Emulation processes, were used for the implementation of the global sensitivity analysis. Sensitivity indices, such as main, total and interaction effects, were computed for a screened set of the total thyroidal system descriptive model input parameters. Furthermore, a narrower sub-set of the most influential parameters affecting the model output of maternal thyroid hormone levels were identified in addition to the characterization of their overall and pair-wise parameter interaction quotients. The characteristic trends of influence in model output for each of these individual model input parameters over their plausible ranges were elucidated using Gaussian Emulation processes. Through global sensitivity analysis we have gained a better understanding of the model behavior and performance beyond the domains of observation by the simultaneous variation in model inputs over their range of plausible uncertainties. The sensitivity analysis helped identify parameters that determine the driving mechanisms of the maternal and fetal iodide kinetics, thyroid function and their interactions, and contributed to an improved understanding of the system modeled. We have thus demonstrated the use and application of global sensitivity analysis for a biologically based dose-response model for sensitive life-stages such as pregnancy that provides richer information on the model and the thyroidal system modeled compared to local sensitivity analysis.
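    The Morris screening idea mentioned above can be illustrated with a toy function standing in for the BBDR model. This sketch computes a mean absolute elementary effect per input over random base points (a simplification of full Morris trajectories; the model and input count are hypothetical):

```python
import random

random.seed(1)

# Toy stand-in for the model output as a function of three inputs in [0, 1].
def model(x):
    return 4.0 * x[0] + x[1] ** 2 + 0.1 * x[2]

def mean_elementary_effects(f, k=3, samples=20, delta=0.25):
    """Mean absolute elementary effect (mu*) for each of k inputs."""
    totals = [0.0] * k
    for _ in range(samples):
        base = [random.uniform(0.0, 1.0 - delta) for _ in range(k)]
        y0 = f(base)
        for i in range(k):
            perturbed = list(base)
            perturbed[i] += delta                  # one-at-a-time step
            totals[i] += abs(f(perturbed) - y0) / delta
    return [t / samples for t in totals]

mu_star = mean_elementary_effects(model)
```

    Inputs with small mu* would be fixed at nominal values, and only the influential subset passed on to the more expensive Gaussian-emulation stage.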

  2. Quantitative global sensitivity analysis of a biologically based dose-response pregnancy model for the thyroid endocrine system

    Directory of Open Access Journals (Sweden)

    Annie eLumen

    2015-05-01

    Full Text Available A deterministic biologically based dose-response model for the thyroidal system in a near-term pregnant woman and the fetus was recently developed to evaluate quantitatively thyroid hormone perturbations. The current work focuses on conducting a quantitative global sensitivity analysis on this complex model to identify and characterize the sources and contributions of uncertainties in the predicted model output. The workflow and methodologies suitable for computationally expensive models, such as the Morris screening method and Gaussian Emulation processes, were used for the implementation of the global sensitivity analysis. Sensitivity indices, such as main, total and interaction effects, were computed for a screened set of the total thyroidal system descriptive model input parameters. Furthermore, a narrower sub-set of the most influential parameters affecting the model output of maternal thyroid hormone levels were identified in addition to the characterization of their overall and pair-wise parameter interaction quotients. The characteristic trends of influence in model output for each of these individual model input parameters over their plausible ranges were elucidated using Gaussian Emulation processes. Through global sensitivity analysis we have gained a better understanding of the model behavior and performance beyond the domains of observation by the simultaneous variation in model inputs over their range of plausible uncertainties. The sensitivity analysis helped identify parameters that determine the driving mechanisms of the maternal and fetal iodide kinetics, thyroid function and their interactions, and contributed to an improved understanding of the system modeled. We have thus demonstrated the use and application of global sensitivity analysis for a biologically based dose-response model for sensitive life-stages such as pregnancy that provides richer information on the model and the thyroidal system modeled compared to local

  3. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    Science.gov (United States)

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

    The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code posted to Github, and (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core, and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.

  4. Residual N effects from livestock manure inputs to soils

    NARCIS (Netherlands)

    Schroder, J.J.; Bechini, L.; Bittman, S.; Brito, M.P.; Delin, S.; Lalor, S.T.J.; Morvan, T.; Chambers, B.J.; Sakrabani, R.; Sørensen, P.B.

    2013-01-01

    Organic inputs including livestock manures provide nitrogen (N) to crops beyond the year of their application. This so-called residual N effect should be taken into account when making decisions on N rates for individual fields, but also when interpreting N response trials in preparation of

  5. Wave energy input into the Ekman layer

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper is concerned with the wave energy input into the Ekman layer, motivated by observational evidence that surface waves can significantly affect the profile of the Ekman layer. Under the assumption of constant vertical diffusivity, the analytical form of the wave energy input into the Ekman layer is derived. Analysis of the energy balance shows that the energy input to the Ekman layer through the wind stress and the interaction of the Stokes drift with the planetary vorticity can be divided into two kinds: the wind energy input, and the wave energy input, which depends on wind speed, wave characteristics and the wind direction relative to the wave direction. Estimates show that, compared with the wind energy input into the classical Ekman layer, the wave energy input can be up to 10% in high-latitude, high-wind-speed areas and higher than 20% in the Antarctic Circumpolar Current. The results of this paper are of significance to the study of wave-induced large-scale effects.

  6. CBM First-level Event Selector Input Interface Demonstrator

    Science.gov (United States)

    Hutter, Dirk; de Cuveland, Jan; Lindenstruth, Volker

    2017-10-01

    CBM is a heavy-ion experiment at the future FAIR facility in Darmstadt, Germany. Because it features self-triggered front-end electronics and free-streaming read-out, event selection will be done exclusively by the First Level Event Selector (FLES). Designed as an HPC cluster with several hundred nodes, the FLES performs online analysis and selection of the physics data at a total input data rate exceeding 1 TByte/s. To allow efficient event selection, the FLES performs timeslice building, which combines the data from all given input links into self-contained, potentially overlapping processing intervals and distributes them to compute nodes. Partitioning the input data streams into specialized containers allows this task to be performed very efficiently. The FLES Input Interface defines the linkage between the FEE and the FLES data transport framework. A custom FPGA PCIe board, the FLES Interface Board (FLIB), is used to receive data via optical links and transfer them via DMA to the host's memory. The current FLIB prototype features a Kintex-7 FPGA and provides up to eight 10 GBit/s optical links. A custom FPGA design has been developed for this board, with DMA transfers and data structures optimized for subsequent timeslice building. Index tables generated by the FPGA enable fast random access to the written data containers. In addition, the DMA target buffers can directly serve as InfiniBand RDMA source buffers without copying the data, and the use of POSIX shared memory for these buffers allows data access from multiple processes. An accompanying HDL module has been developed to integrate the FLES link into the front-end FPGA designs; it implements the front-end logic interface as well as the link protocol. Prototypes of all Input Interface components have been implemented and integrated into the FLES test framework, allowing the implementation and evaluation of the foreseen CBM read-out chain.

  7. Convergence and periodic solutions for the input impedance of a standard ladder network

    International Nuclear Information System (INIS)

    Ucak, C; Acar, C

    2007-01-01

    The input impedance of an infinite ladder network is commonly computed by using the recursive relation and by assuming that the input impedance does not change when a new block is added to the network. However, this assumption is not true in general, and standard textbooks do not always treat these networks correctly. This paper develops a general solution for the input impedance of a standard ladder network of impedances and admittances with any number of blocks. This result is then used to derive the convergence condition for the infinite ladder network. The conditions which lead to a periodic input impedance are explored. It is shown that there are infinitely many periodic points and that no paradoxical behaviour exists in the standard ladder network.
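    The recursion behind this result can be written out directly. A sketch for a ladder of n identical blocks (series impedance z followed by shunt admittance y), together with the fixed point obtained by assuming the input impedance is unchanged when a block is added; the element values in the usage are purely resistive for simplicity:

```python
def ladder_input_impedance(z, y, n):
    """Input impedance of an n-block ladder, built by the recursion
    Z_in(k+1) = z + 1 / (y + 1 / Z_in(k))."""
    zin = z + 1.0 / y                      # one block: series z into shunt 1/y
    for _ in range(n - 1):
        zin = z + 1.0 / (y + 1.0 / zin)    # prepend another identical block
    return zin

def fixed_point(z, y):
    """Positive root of Zin = z + 1/(y + 1/Zin), i.e. Zin**2 - z*Zin - z/y = 0."""
    return z / 2.0 + (z * z / 4.0 + z / y) ** 0.5
```

    For z = y = 1 (a resistive ladder) the recursion converges to the golden ratio, the classic textbook example; the paper's point is that for general complex z and y this convergence must be checked rather than assumed.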

  8. Quantitative imaging of protein targets in the human brain with PET

    International Nuclear Information System (INIS)

    Gunn, Roger N; Slifstein, Mark; Searle, Graham E; Price, Julie C

    2015-01-01

    PET imaging of proteins in the human brain with high affinity radiolabelled molecules has a history stretching back over 30 years. During this period the portfolio of protein targets that can be imaged has increased significantly through successes in radioligand discovery and development. This portfolio now spans six major categories of proteins; G-protein coupled receptors, membrane transporters, ligand gated ion channels, enzymes, misfolded proteins and tryptophan-rich sensory proteins. In parallel to these achievements in radiochemical sciences there have also been significant advances in the quantitative analysis and interpretation of the imaging data including the development of methods for image registration, image segmentation, tracer compartmental modeling, reference tissue kinetic analysis and partial volume correction. In this review, we analyze the activity of the field around each of the protein targets in order to give a perspective on the historical focus and the possible future trajectory of the field. The important neurobiology and pharmacology is introduced for each of the six protein classes and we present established radioligands for each that have successfully transitioned to quantitative imaging in humans. We present a standard quantitative analysis workflow for these radioligands which takes the dynamic PET data, associated blood and anatomical MRI data as the inputs to a series of image processing and bio-mathematical modeling steps before outputting the outcome measure of interest on either a regional or parametric image basis. The quantitative outcome measures are then used in a range of different imaging studies including tracer discovery and development studies, cross sectional studies, classification studies, intervention studies and longitudinal studies. Finally we consider some of the confounds, challenges and subtleties that arise in practice when trying to quantify and interpret PET neuroimaging data including motion artifacts

  9. Quantitative imaging of protein targets in the human brain with PET

    Science.gov (United States)

    Gunn, Roger N.; Slifstein, Mark; Searle, Graham E.; Price, Julie C.

    2015-11-01

    PET imaging of proteins in the human brain with high affinity radiolabelled molecules has a history stretching back over 30 years. During this period the portfolio of protein targets that can be imaged has increased significantly through successes in radioligand discovery and development. This portfolio now spans six major categories of proteins; G-protein coupled receptors, membrane transporters, ligand gated ion channels, enzymes, misfolded proteins and tryptophan-rich sensory proteins. In parallel to these achievements in radiochemical sciences there have also been significant advances in the quantitative analysis and interpretation of the imaging data including the development of methods for image registration, image segmentation, tracer compartmental modeling, reference tissue kinetic analysis and partial volume correction. In this review, we analyze the activity of the field around each of the protein targets in order to give a perspective on the historical focus and the possible future trajectory of the field. The important neurobiology and pharmacology is introduced for each of the six protein classes and we present established radioligands for each that have successfully transitioned to quantitative imaging in humans. We present a standard quantitative analysis workflow for these radioligands which takes the dynamic PET data, associated blood and anatomical MRI data as the inputs to a series of image processing and bio-mathematical modeling steps before outputting the outcome measure of interest on either a regional or parametric image basis. The quantitative outcome measures are then used in a range of different imaging studies including tracer discovery and development studies, cross sectional studies, classification studies, intervention studies and longitudinal studies. Finally we consider some of the confounds, challenges and subtleties that arise in practice when trying to quantify and interpret PET neuroimaging data including motion artifacts

  10. Pandemic recovery analysis using the dynamic inoperability input-output model.

    Science.gov (United States)

    Santos, Joost R; Orsi, Mark J; Bond, Erik J

    2009-12-01

    Economists have long conceptualized and modeled the inherent interdependent relationships among different sectors of the economy. This concept paved the way for input-output modeling, a methodology that accounts for sector interdependencies governing the magnitude and extent of ripple effects due to changes in the economic structure of a region or nation. Recent extensions to input-output modeling have enhanced the model's capabilities to account for the impact of an economic perturbation; two such examples are the inoperability input-output model (1,2) and the dynamic inoperability input-output model (DIIM). (3) These models introduced sector inoperability, or the inability to satisfy as-planned production levels, into input-output modeling. While these models provide insights for understanding the impacts of inoperability, there are several aspects of the current formulation that do not account for complexities associated with certain disasters, such as a pandemic. This article proposes further enhancements to the DIIM to account for economic productivity losses resulting primarily from workforce disruptions. A pandemic is a unique disaster because the majority of its direct impacts are workforce related. The article develops a modeling framework to account for workforce inoperability and recovery factors. The proposed workforce-explicit enhancements to the DIIM are demonstrated in a case study to simulate a pandemic scenario in the Commonwealth of Virginia.
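    The DIIM recurrence underlying this analysis, q(t+1) = q(t) + K[A*q(t) + c*(t) - q(t)], can be sketched for two sectors; the interdependency matrix A*, the resilience coefficients K, and the initial inoperability below are illustrative values, not calibrated to any economy:

```python
import numpy as np

A_star = np.array([[0.0, 0.3],     # interdependency matrix (hypothetical)
                   [0.2, 0.0]])
K = np.diag([0.4, 0.25])           # sector resilience/recovery coefficients
c_star = np.zeros(2)               # demand perturbation has ended
q = np.array([0.5, 0.3])           # initial inoperability (fraction of output lost)

trajectory = [q.copy()]
for _ in range(60):
    q = q + K @ (A_star @ q + c_star - q)   # discrete-time DIIM update
    trajectory.append(q.copy())
```

    With the spectral radius of I + K(A* - I) below one, as here, inoperability decays toward full recovery; a workforce-explicit variant would drive c*(t) with the epidemic curve instead of setting it to zero.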

  11. Relating interesting quantitative time series patterns with text events and text features

    Science.gov (United States)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other

  12. Multivariate Self-Exciting Threshold Autoregressive Models with eXogenous Input

    OpenAIRE

    Addo, Peter Martey

    2014-01-01

    This study defines a class of multivariate Self-Exciting Threshold Autoregressive with eXogenous input (MSETARX) models and presents an estimation procedure for the parameters. Conditions for stationarity of the nonlinear MSETARX models are provided. In particular, the efficiency of an adaptive parameter estimation algorithm and of the LSE (least squares estimate) algorithm for this class of models is then demonstrated via simulations.
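    The least-squares estimation idea can be illustrated on a minimal univariate two-regime SETAR(1) model, a deliberately simplified stand-in for the full MSETARX class; the simulated series, threshold grid and coefficient values below are illustrative assumptions, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-regime SETAR(1) series: the AR coefficient switches when
# the lagged value crosses a threshold r (true values are illustrative).
n, r_true = 2000, 0.0
y = np.zeros(n)
for t in range(1, n):
    phi = 0.8 if y[t - 1] <= r_true else -0.5
    y[t] = phi * y[t - 1] + rng.normal(scale=0.3)

def setar_lse(series, candidates):
    """Grid-search the threshold; fit each regime's slope by least squares."""
    lag, cur = series[:-1], series[1:]
    best = None
    for r in candidates:
        low = lag <= r
        high = ~low
        if low.sum() < 10 or high.sum() < 10:
            continue  # skip thresholds that leave a regime nearly empty
        phi1 = np.dot(lag[low], cur[low]) / np.dot(lag[low], lag[low])
        phi2 = np.dot(lag[high], cur[high]) / np.dot(lag[high], lag[high])
        sse = (np.sum((cur[low] - phi1 * lag[low]) ** 2)
               + np.sum((cur[high] - phi2 * lag[high]) ** 2))
        if best is None or sse < best[0]:
            best = (sse, r, phi1, phi2)
    return best

_, r_hat, phi1_hat, phi2_hat = setar_lse(y, np.linspace(-1, 1, 81))
print(r_hat, phi1_hat, phi2_hat)  # estimates near the true (0.0, 0.8, -0.5)
```

    The threshold enters the model non-smoothly, which is why it is profiled out by grid search while the per-regime slopes have closed-form LSE solutions.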

  13. Estimated anthropogenic nitrogen and phosphorus inputs to the land surface of the conterminous United States--1992, 1997, and 2002

    Science.gov (United States)

    Sprague, Lori A.; Gronberg, Jo Ann M.

    2013-01-01

    Anthropogenic inputs of nitrogen and phosphorus to each county in the conterminous United States and to the watersheds of 495 surface-water sites studied as part of the U.S. Geological Survey National Water-Quality Assessment Program were quantified for the years 1992, 1997, and 2002. Estimates of inputs of nitrogen and phosphorus from biological fixation by crops (for nitrogen only), human consumption, crop production for human consumption, animal production for human consumption, animal consumption, and crop production for animal consumption for each county are provided in a tabular dataset. These county-level estimates were allocated to the watersheds of the surface-water sites to estimate watershed-level inputs from the same sources; these estimates also are provided in a tabular dataset, together with calculated estimates of net import of food and net import of feed and previously published estimates of inputs from atmospheric deposition, fertilizer, and recoverable manure. The previously published inputs are provided for each watershed so that final estimates of total anthropogenic nutrient inputs could be calculated. Estimates of total anthropogenic inputs are presented together with previously published estimates of riverine loads of total nitrogen and total phosphorus for reference.

  14. Statistical identification of effective input variables

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1982-09-01

    A statistical sensitivity analysis procedure has been developed for ranking the input data of large computer codes in order of sensitivity-importance. The method is economical for large codes with many input variables, since it uses a relatively small number of computer runs. No prior judgemental elimination of input variables is needed. The screening method is based on stagewise correlation and extensive regression analysis of output values calculated with selected input value combinations. The regression process deals with multivariate nonlinear functions, and statistical tests are also available for identifying input variables that contribute to threshold effects, i.e., discontinuities in the output variables. A computer code SCREEN has been developed to implement the screening techniques. The efficiency has been demonstrated by several examples, and the code has been applied to a fast reactor safety analysis code (Venus-II). However, the methods and the coding are general and not limited to such applications
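    The first screening stage described above, ranking inputs by their correlation with the output over a modest number of runs, can be sketched as follows. The toy model, sample size and variable names are assumptions for illustration, not the SCREEN code itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "code": the output depends strongly on x0, weakly (and nonlinearly)
# on x1, and not at all on x2.
def model(x):
    return 3.0 * x[:, 0] + 0.5 * x[:, 1] ** 2

# A relatively small number of runs with randomly sampled input combinations.
X = rng.uniform(-1, 1, size=(60, 3))
y = model(X)

# First-stage ranking: absolute linear correlation of each input with the
# output.  Later regression stages would probe nonlinear and joint effects.
scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]
print(ranking)  # x0 should rank first
```

    Note that a purely linear correlation stage misses the symmetric x1**2 effect, which is exactly why the procedure follows up with nonlinear regression and threshold tests.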

  15. Reduced basis ANOVA methods for partial differential equations with high-dimensional random inputs

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Qifeng, E-mail: liaoqf@shanghaitech.edu.cn [School of Information Science and Technology, ShanghaiTech University, Shanghai 200031 (China); Lin, Guang, E-mail: guanglin@purdue.edu [Department of Mathematics & School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907 (United States)

    2016-07-15

    In this paper we present a reduced basis ANOVA approach for partial differential equations (PDEs) with random inputs. The ANOVA method combined with stochastic collocation methods provides model reduction in high-dimensional parameter space through decomposing high-dimensional inputs into unions of low-dimensional inputs. In this work, to further reduce the computational cost, we investigate spatial low-rank structures in the ANOVA-collocation method, and develop efficient spatial model reduction techniques using hierarchically generated reduced bases. We present a general mathematical framework of the methodology, validate its accuracy and demonstrate its efficiency with numerical experiments.
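    The dimension reduction rests on the standard ANOVA decomposition of a function of the random inputs (standard notation, not reproduced from the paper):

```latex
f(x_1,\dots,x_d) = f_0 + \sum_{i=1}^{d} f_i(x_i)
  + \sum_{1 \le i < j \le d} f_{ij}(x_i, x_j) + \cdots
```

    Truncating the sum at low interaction order leaves only terms depending on one or two coordinates, so each can be treated by a cheap low-dimensional stochastic collocation grid (and, in the paper's method, a reduced basis in space).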

  16. Input parameters to codes which analyze LMFBR wire-wrapped bundles

    International Nuclear Information System (INIS)

    Hawley, J.T.; Chan, Y.N.; Todreas, N.E.

    1980-12-01

    This report provides a current summary of recommended values of key input parameters required by ENERGY code analysis of LMFBR wire-wrapped bundles. These data are based on the interpretation of experimental results from the MIT and other available laboratory programs

  17. Gestures and multimodal input

    OpenAIRE

    Keates, Simeon; Robinson, Peter

    1999-01-01

    For users with motion impairments, the standard keyboard and mouse arrangement for computer access often presents problems. Other approaches have to be adopted to overcome this. In this paper, we will describe the development of a prototype multimodal input system based on two gestural input channels. Results from extensive user trials of this system are presented. These trials showed that the physical and cognitive loads on the user can quickly become excessive and detrimental to the interac...

  18. Inhibitory Gating of Basolateral Amygdala Inputs to the Prefrontal Cortex.

    Science.gov (United States)

    McGarry, Laura M; Carter, Adam G

    2016-09-07

    Interactions between the prefrontal cortex (PFC) and basolateral amygdala (BLA) regulate emotional behaviors. However, a circuit-level understanding of functional connections between these brain regions remains incomplete. The BLA sends prominent glutamatergic projections to the PFC, but the overall influence of these inputs is predominantly inhibitory. Here we combine targeted recordings and optogenetics to examine the synaptic underpinnings of this inhibition in the mouse infralimbic PFC. We find that BLA inputs preferentially target layer 2 corticoamygdala over neighboring corticostriatal neurons. However, these inputs make even stronger connections onto neighboring parvalbumin and somatostatin expressing interneurons. Inhibitory connections from these two populations of interneurons are also much stronger onto corticoamygdala neurons. Consequently, BLA inputs are able to drive robust feedforward inhibition via two parallel interneuron pathways. Moreover, the contributions of these interneurons shift during repetitive activity, due to differences in short-term synaptic dynamics. Thus, parvalbumin interneurons are activated at the start of stimulus trains, whereas somatostatin interneuron activation builds during these trains. Together, these results reveal how the BLA impacts the PFC through a complex interplay of direct excitation and feedforward inhibition. They also highlight the roles of targeted connections onto multiple projection neurons and interneurons in this cortical circuit. Our findings provide a mechanistic understanding for how the BLA can influence the PFC circuit, with important implications for how this circuit participates in the regulation of emotion. The prefrontal cortex (PFC) and basolateral amygdala (BLA) interact to control emotional behaviors. Here we show that BLA inputs elicit direct excitation and feedforward inhibition of layer 2 projection neurons in infralimbic PFC. 
BLA inputs are much stronger at corticoamygdala neurons compared

  19. The Importance of Input and Interaction in SLA

    Institute of Scientific and Technical Information of China (English)

    党春花

    2009-01-01

    As is known to us, input and interaction play crucial roles in second language acquisition (SLA). Different linguistic schools offer different explanations of input and interaction. Behaviorist theories hold the view that input is composed of stimuli and responses, putting more emphasis on the importance of input, while mentalist theories regard input as a necessary but not a sufficient condition for SLA. At present, social interaction theories, one type of cognitive linguistics, suggest that besides input, interaction is also essential to language acquisition. This essay then discusses how input and interaction result in SLA.

  20. Bottom-up and Top-down Input Augment the Variability of Cortical Neurons

    Science.gov (United States)

    Nassi, Jonathan J.; Kreiman, Gabriel; Born, Richard T.

    2016-01-01

    Neurons in the cerebral cortex respond inconsistently to a repeated sensory stimulus, yet they underlie our stable sensory experiences. Although the nature of this variability is unknown, its ubiquity has encouraged the general view that each cell produces random spike patterns that noisily represent its response rate. In contrast, here we show that reversibly inactivating distant sources of either bottom-up or top-down input to cortical visual areas in the alert primate reduces both the spike train irregularity and the trial-to-trial variability of single neurons. A simple model in which a fraction of the pre-synaptic input is silenced can reproduce this reduction in variability, provided that there exist temporal correlations primarily within, but not between, excitatory and inhibitory input pools. A large component of the variability of cortical neurons may therefore arise from synchronous input produced by signals arriving from multiple sources. PMID:27427459

  1. Facilitating mathematics learning for students with upper extremity disabilities using touch-input system.

    Science.gov (United States)

    Choi, Kup-Sze; Chan, Tak-Yin

    2015-03-01

    To investigate the feasibility of using a tablet device as user interface for students with upper extremity disabilities to input mathematics efficiently into a computer. A touch-input system using a tablet device as user interface was proposed to assist these students to write mathematics. User-switchable and context-specific keyboard layouts were designed to streamline the input process. The system could be integrated with conventional computer systems with only minor software setup. A two-week pre-post test study involving five participants was conducted to evaluate the performance of the system and collect user feedback. The mathematics input efficiency of the participants was found to improve during the experiment sessions. In particular, their performance in entering trigonometric expressions by using the touch-input system was significantly better than that by using conventional mathematics editing software with a keyboard and mouse. The participants rated the touch-input system positively and were confident that they could operate it with ease given more practice. The proposed touch-input system provides a convenient way for students with hand impairment to write mathematics and has the potential to facilitate their mathematics learning. Implications for Rehabilitation Students with upper extremity disabilities often face barriers to learning mathematics, which is largely based on handwriting. Conventional computer user interfaces are inefficient for them to input mathematics into a computer. A touch-input system with context-specific and user-switchable keyboard layouts was designed to improve the efficiency of mathematics input. Experimental results and user feedback suggested that the system has the potential to facilitate mathematics learning for the students.

  2. An artificial neural network approach to laser-induced breakdown spectroscopy quantitative analysis

    International Nuclear Information System (INIS)

    D’Andrea, Eleonora; Pagnotta, Stefano; Grifoni, Emanuela; Lorenzetti, Giulia; Legnaioli, Stefano; Palleschi, Vincenzo; Lazzerini, Beatrice

    2014-01-01

    The usual approach to laser-induced breakdown spectroscopy (LIBS) quantitative analysis is based on the use of calibration curves, suitably built using appropriate reference standards. More recently, statistical methods relying on the principles of artificial neural networks (ANN) are increasingly used. However, ANN analysis is often used as a ‘black box’ system and the peculiarities of the LIBS spectra are not exploited fully. An a priori exploration of the raw data contained in the LIBS spectra, carried out by a neural network that learns which areas of the spectrum are significant for a subsequent neural network delegated to the calibration, is able to throw light upon important information initially unknown, although already contained within the spectrum. This communication will demonstrate that an approach based on neural networks specially tailored for dealing with LIBS spectra would provide a viable, fast and robust method for LIBS quantitative analysis. This would allow the use of a relatively limited number of reference samples for the training of the network, with respect to the current approaches, and provide a fully automatable approach for the analysis of a large number of samples. - Highlights: • A methodological approach to neural network analysis of LIBS spectra is proposed. • The architecture of the network and the number of inputs are optimized. • The method is tested on bronze samples already analyzed using a calibration-free LIBS approach. • The results are validated, compared and discussed

  3. Interfield dysbalances in research input and output benchmarking: Visualisation by density equalizing procedures

    Directory of Open Access Journals (Sweden)

    Fischer Axel

    2008-08-01

    Background Historical, social and economic reasons can lead to major differences in the allocation of health system resources and research funding. These differences might endanger progress in diagnostic and therapeutic approaches to socio-economically important diseases. The present study aimed to assess different benchmarking approaches that might be used to analyse these disproportions. Research in two categories was analysed for various output parameters and compared to input parameters. Germany was used as a high-income model country. For the areas of cardiovascular and respiratory medicine, density equalizing mapping procedures visualized major geographical differences in both input and output markers. Results An imbalance in state financial input was present, with 36 cardiovascular versus 8 respiratory medicine state-financed full clinical university departments at the C4/W3 salary level. The imbalance in financial input is paralleled by an imbalance in overall quantitative output figures: the 36 cardiology chairs published 2708 articles in comparison to 453 articles published by the 8 respiratory medicine chairs in the period between 2002 and 2006. This is a ratio of 75.2 articles per cardiology chair and 56.63 articles per respiratory medicine chair. A similar trend is also present in the qualitative measures. Here, the 2708 cardiology publications were cited 48337 times (7290 times for respiratory medicine), which is an average of 17.85 citations per publication vs. 16.09 for respiratory medicine. The average number of citations per cardiology chair was 1342.69, in contrast to 911.25 citations per respiratory medicine chair. Further comparison of the contribution of the 16 different German states revealed major geographical differences concerning numbers of chairs, published items, total number of citations and average citations. Conclusion Despite similar significances of cardiovascular and respiratory diseases for the global

  4. Framework for Modelling Multiple Input Complex Aggregations for Interactive Installations

    DEFF Research Database (Denmark)

    Padfield, Nicolas; Andreasen, Troels

    2012-01-01

    on fuzzy logic and provides a method for variably balancing interaction and user input with the intention of the artist or director. An experimental design is presented, demonstrating an intuitive interface for parametric modelling of a complex aggregation function. The aggregation function unifies...

  5. Observer-Based Perturbation Extremum Seeking Control with Input Constraints for Direct-Contact Membrane Distillation Process

    KAUST Repository

    Eleiwi, Fadi

    2017-05-08

    An Observer-based Perturbation Extremum Seeking Control (PESC) is proposed for a Direct-Contact Membrane Distillation (DCMD) process. The process is described with a dynamic model that is based on a 2D Advection-Diffusion Equation (ADE) model which has pump flow rates as process inputs. The objective of the controller is to optimize the trade-off between the permeate mass flux and the energy consumption by the pumps inside the process. Cases of single and multiple control inputs are considered through the use of only the feed pump flow rate or both the feed and the permeate pump flow rates. A nonlinear Lyapunov-based observer is designed to provide an estimation for the temperature distribution all over the designated domain of the DCMD process. Moreover, control inputs are constrained with an anti-windup technique to be within feasible and physical ranges. Performance of the proposed structure is analyzed, and simulations based on real DCMD process parameters for each control input are provided.

  6. Observer-based perturbation extremum seeking control with input constraints for direct-contact membrane distillation process

    Science.gov (United States)

    Eleiwi, Fadi; Laleg-Kirati, Taous Meriem

    2018-06-01

    An observer-based perturbation extremum seeking control is proposed for a direct-contact membrane distillation (DCMD) process. The process is described with a dynamic model that is based on a 2D advection-diffusion equation model which has pump flow rates as process inputs. The objective of the controller is to optimise the trade-off between the permeate mass flux and the energy consumption by the pumps inside the process. Cases of single and multiple control inputs are considered through the use of only the feed pump flow rate or both the feed and the permeate pump flow rates. A nonlinear Lyapunov-based observer is designed to provide an estimation for the temperature distribution all over the designated domain of the DCMD process. Moreover, control inputs are constrained with an anti-windup technique to be within feasible and physical ranges. Performance of the proposed structure is analysed, and simulations based on real DCMD process parameters for each control input are provided.

  7. Intraglomerular inhibition maintains mitral cell response contrast across input frequencies.

    Science.gov (United States)

    Shao, Zuoyi; Puche, Adam C; Shipley, Michael T

    2013-11-01

    Odor signals are transmitted to the olfactory bulb by olfactory nerve (ON) synapses onto mitral/tufted cells (MTCs) and external tufted cells (ETCs); ETCs provide additional feed-forward excitation to MTCs. Both are strongly regulated by intraglomerular inhibition that can last up to 1 s and, when blocked, dramatically increases ON-evoked MC spiking. Intraglomerular inhibition thus limits the magnitude and duration of MC spike responses to sensory input. In vivo, sensory input is repetitive, dictated by sniffing rates from 1 to 8 Hz, potentially summing intraglomerular inhibition. To investigate this, we recorded MTC responses to 1- to 8-Hz ON stimulation in slices. Inhibitory postsynaptic current area (charge) following each ON stimulation was unchanged from 1 to 5 Hz and modestly paired-pulse attenuated at 8 Hz, suggesting there is no summation and only limited decrement at the highest input frequencies. Next, we investigated frequency independence of intraglomerular inhibition on MC spiking. MCs respond to single ON shocks with an initial spike burst followed by reduced spiking decaying to baseline. Upon repetitive ON stimulation peak spiking is identical across input frequencies but the ratio of peak-to-minimum rate before the stimulus (max-min) diminishes from 30:1 at 1 Hz to 15:1 at 8 Hz. When intraglomerular inhibition is selectively blocked, peak spike rate is unchanged but trough spiking increases markedly decreasing max-min firing ratios from 30:1 at 1 Hz to 2:1 at 8 Hz. Together, these results suggest intraglomerular inhibition is relatively frequency independent and can "sharpen" MC responses to input across the range of frequencies. This suggests that glomerular circuits can maintain "contrast" in MC encoding during sniff-sampled inputs.

  8. An Interface Theory for Input/Output Automata

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Nyman, Ulrik; Wasowski, Andrzej

    Building on the theory of interface automata by de Alfaro and Henzinger, we design an interface language for Lynch's Input/Output Automata, a popular formalism used in the development of distributed asynchronous systems, not addressed by previous interface research. We introduce an explicit....... We also present a method for solving systems of relativized behavioral inequalities as used in our setup and draw a formal correspondence between our work and interface automata. Proofs are provided in an appendix....

  9. Conceptualizing, Understanding, and Predicting Responsible Decisions and Quality Input

    Science.gov (United States)

    Wall, N.; PytlikZillig, L. M.

    2012-12-01

    In areas such as climate change, where uncertainty is high, it is arguably less difficult to tell when efforts have resulted in changes in knowledge, than when those efforts have resulted in responsible decisions. What is a responsible decision? More broadly, when it comes to citizen input, what is "high quality" input? And most importantly, how are responsible decisions and quality input enhanced? The aim of this paper is to contribute to the understanding of the different dimensions of "responsible" or "quality" public input and citizen decisions by comparing and contrasting the different predictors of those different dimensions. We first present different possibilities for defining, operationalizing and assessing responsible or high quality decisions. For example, responsible decisions or quality input might be defined as using specific content (e.g., using climate change information in decisions appropriately), as using specific processes (e.g., investing time and effort in learning about and discussing the issues prior to making decisions), or on the basis of some judgment of the decision or input itself (e.g., judgments of the rationale provided for the decisions, or number of issues considered when giving input). Second, we present results from our work engaging people with science policy topics, and the different ways that we have tried to define these two constructs. In the area of climate change specifically, we describe the development of a short survey that assesses exposure to climate information, knowledge of and attitudes toward climate change, and use of climate information in one's decisions. Specifically, the short survey was developed based on a review of common surveys of climate change related knowledge, attitudes, and behaviors, and extensive piloting and cognitive interviews. 
Next, we analyze more than 200 responses to that survey (data collection is currently ongoing and will be complete after the AGU deadline), and report the predictors of

  10. Analysis on relation between safety input and accidents

    Institute of Scientific and Technical Information of China (English)

    YAO Qing-guo; ZHANG Xue-mu; LI Chun-hui

    2007-01-01

    The amount of safety input directly determines the level of safety, and there exists a dialectical and unified relationship between safety input and accidents. Based on field investigation and reliable data, this paper studied the dialectical relationship between safety input and accidents in depth and drew its conclusions: the safety situation of coal enterprises was related to the safety input rate and was affected little by the safety input scale. On this basis, a relationship model between safety input and accidents, i.e. the accident model, was built.

  11. TRANSIT: model for providing generic transportation input for preliminary siting analysis

    International Nuclear Information System (INIS)

    McNair, G.W.; Cashwell, J.W.

    1985-02-01

    To assist the US Department of Energy's efforts in potential facility site screening in the nuclear waste management program, a computerized model, TRANSIT, is being developed. Utilizing existing data on the location and inventory characteristics of spent nuclear fuel at reactor sites, TRANSIT derives isopleths of transportation mileage, costs, risks and fleet requirements for shipments to storage sites and/or repository sites. This technique provides a graphic, first-order method for use by the Department in future site screening efforts. 2 refs

  12. Yield and technological quality of ecological and low-input production of potatoes

    Directory of Open Access Journals (Sweden)

    MILAN MACÁK

    2012-09-01

    The yield and other quantitative parameters (number of plants, number of tubers, weight of tubers per 1 m2) and qualitative parameters (content of vitamin C, starch, nitrogen and dry matter) of the Solanum tuberosum L. early variety “Collete” were studied in ecological and low-input farming systems with two levels of organic fertilization during 2003-2005. The experiment was situated in a water-protected zone of western Slovakia on Luvi-Haplic Chernozem. After harvest of the forecrop, a catch crop (phacelia and mustard) was grown in the higher-level organic fertilization treatment. Highly significant differences in each studied parameter of potato tubers between particular years were ascertained; thus the great influence of weather conditions on the quality and quantity of potatoes was confirmed. Yield was also highly significantly influenced by the farming system: in the low-input system the average yield was 21.38 t ha-1 and in the ecological system 20.02 t ha-1. Green manure management did not influence yield significantly: in the treatment without green manure an average yield of 20.47 t ha-1 was reached, compared to 20.93 t ha-1 in the green manure application treatments. In the low-input system a significantly higher vitamin C content (4.23 mg 100g-1) was ascertained compared to the ecological one (3.53 mg 100g-1). Other qualitative parameters were more or less at the same level. We recommend both farming systems for growing potatoes in water-vulnerable zones and to better fulfil Good Agricultural Practices under Slovak conditions.

  13. Fine Sediment Input and Benthic Fauna Interactions at the Confluence of Two Large Rivers

    International Nuclear Information System (INIS)

    Blettler, M. C. M.; Amsler, M. L.; Ezcurra De Drago, I.; Drago, E.; Paira, A.; Espinola, L. A.; Eberle, E.; Szupiany, R.

    2016-01-01

    Several studies suggest that invertebrate abundance and richness are disrupted and reset at confluences. Thus, junctions contribute disproportionately to the overall aquatic biodiversity of the river. In general terms, authors have reported high abundance and diversity due to the major physical heterogeneity at junctions. However, data are still scarce and uncertainties are plentiful. The impact of a large input of fine sediments on the distribution patterns of benthic invertebrates at a river confluence was quantitatively analyzed herein. The junction of the subtropical Bermejo River (high suspended sediment load) with the large Paraguay River was selected as the study area. While diversity increased slightly downstream of the junction (from 0.21 to 0.36), density and richness of the macroinvertebrate assemblage diminished significantly downstream of the confluence (from 29050 to 410 ind/m2; p < 0.05) due to the input of fine sediment from the Bermejo River (mean fine sediment increased downstream from 6.3 to 10.2 mg/L), which negatively impacted the invertebrate assemblage. This study highlights the ecological importance of sediment input effects on benthic invertebrates, a topic still poorly explored in river ecology. It is speculated that the spatial extent of the impact depends on the hydrological and sedimentological context, which is highly unequal between the two rivers. New hypotheses should be tested in further studies considering different hydrological stages.

  14. Input-output model for MACCS nuclear accident impacts estimation¹

    Energy Technology Data Exchange (ETDEWEB)

    Outkin, Alexander V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bixler, Nathan E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vargas, Vanessa N [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-27

    Since the original economic model for MACCS was developed, better quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
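    The core input-output mechanics behind such loss estimates can be sketched with a toy Leontief model. The 3-sector coefficient matrix and demand vectors below are hypothetical, not REAcct data, and total output loss is used as a rough proxy for the GDP-type loss that a value-added weighting would give:

```python
import numpy as np

# Hypothetical 3-sector technical coefficients matrix A: column j gives the
# inputs required from each sector per unit of sector j's output.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
d_base = np.array([100.0, 80.0, 60.0])   # baseline final demand
d_shock = np.array([100.0, 40.0, 60.0])  # accident halves sector-2 demand

L = np.linalg.inv(np.eye(3) - A)         # Leontief inverse (I - A)^-1
x_base, x_shock = L @ d_base, L @ d_shock
loss = x_base - x_shock                  # per-sector loss, direct + indirect
print(loss, loss.sum())
```

    The Leontief inverse propagates the direct demand shock through inter-sector purchases, so every sector shows some loss and the total exceeds the direct 40-unit demand reduction.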

  15. A Practical pedestrian approach to parsimonious regression with inaccurate inputs

    Directory of Open Access Journals (Sweden)

    Seppo Karrila

    2014-04-01

    A measurement result often dictates an interval containing the correct value. Interval data are also created by roundoff, truncation, and binning. We focus on such common interval uncertainty in data. Inaccuracy in model inputs is typically ignored in model fitting. We provide a practical approach for regression with inaccurate data: the mathematics is easy, and the linear programming formulations are simple to use, even in a spreadsheet. This self-contained elementary presentation introduces interval linear systems and requires only basic knowledge of algebra. Feature selection is automatic, but can be controlled to find only a few most relevant inputs, and joint feature selection is enabled for multiple modeled outputs. With more features than cases, a novel connection to compressed sensing emerges: robustness against interval errors-in-variables implies model parsimony, and the input inaccuracies determine the regularization term. A small numerical example highlights counterintuitive results and a dramatic difference to total least squares.
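    A minimal sketch of the kind of linear program involved: predictions must stay inside the output intervals while the L1 norm of the coefficients is minimized, which favors parsimony. The data are synthetic and the formulation is one plausible reading of the abstract, not the authors' exact method:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)

# Synthetic data: only 2 of 6 features matter; outputs are known only to
# within an interval [y - 0.5, y + 0.5] (e.g., from roundoff or binning).
n, p = 40, 6
X = rng.uniform(-1, 1, size=(n, p))
y = 2.0 * X[:, 0] - 1.0 * X[:, 3]
lo, hi = y - 0.5, y + 0.5

# Minimize ||w||_1 subject to lo <= X w <= hi.  Split w = u - v with
# u, v >= 0 so both the objective and the constraints are linear.
c = np.ones(2 * p)
A_ub = np.vstack([np.hstack([X, -X]),    #  X w <= hi
                  np.hstack([-X, X])])   # -X w <= -lo
b_ub = np.concatenate([hi, -lo])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * p))
w = res.x[:p] - res.x[p:]
print(np.round(w, 2))  # the irrelevant coefficients are driven toward zero
```

    The interval half-width plays the role of the regularization budget: wider intervals allow a smaller-norm (sparser) coefficient vector to remain feasible.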

  16. Measuring Input Thresholds on an Existing Board

    Science.gov (United States)

    Kuperman, Igor; Gutrich, Daniel G.; Berkun, Andrew C.

    2011-01-01

    A critical PECL (positive emitter-coupled logic) interface to Xilinx interface needed to be changed on an existing flight board. The new Xilinx input interface used a CMOS (complementary metal-oxide semiconductor) type of input, and the driver could meet its thresholds typically, but not in worst-case, according to the data sheet. The previous interface had been based on comparison with an external reference, but the CMOS input is based on comparison with an internal divider from the power supply. A way was needed to measure the exact input threshold of this device for 64 inputs on a flight board. The measurement technique allowed an accurate measurement of the voltage required to switch a Xilinx input from high to low for each of the 64 lines, while only probing two of them. Directly driving an external voltage was considered too risky, and tests done on any other unit could not be used to qualify the flight board. The two lines directly probed gave an absolute voltage threshold calibration, while data collected on the remaining 62 lines without probing gave relative measurements that could be used to identify any outliers. The PECL interface was forced to a long-period square wave by driving a saturated square wave into the ADC (analog to digital converter). The active pull-down circuit was turned off, causing each line to rise rapidly and fall slowly according to the input's weak pull-down circuitry. The fall time shows up as a change in the pulse width of the signal read by the Xilinx. This change in pulse width is a function of capacitance, pull-down current, and input threshold. Capacitance was known from the different trace lengths, plus a gate input capacitance, which is the same for all inputs. The pull-down current is the same for all inputs including the two that are probed directly. The data was combined, and the Excel solver tool was used to find input thresholds for the 62 lines. This was repeated over different supply voltages and
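    Assuming the weak pull-down acts as a roughly constant current source, the arithmetic that recovers a threshold from the measured pulse-width change can be sketched as follows; all component values are illustrative, not taken from the article:

```python
# With a weak constant pull-down current I discharging line capacitance C
# from V0, the line falls linearly: V(t) = V0 - (I/C) * t.  The input flips
# low after dt = C * (V0 - Vth) / I, which appears as extra pulse width,
# so the threshold is Vth = V0 - I * dt / C.

V0 = 2.5          # volts at the start of the fall (illustrative)
I_pd = 50e-6      # pull-down current in amps (same for all inputs)
C = 10e-12        # line capacitance in farads (known from trace length)

def threshold_from_pulse_width(dt, v0=V0, i=I_pd, c=C):
    """Recover the input threshold from the measured pulse-width change."""
    return v0 - i * dt / c

dt_measured = 260e-9  # seconds of extra pulse width on one line
vth = threshold_from_pulse_width(dt_measured)
print(round(vth, 3))  # -> 1.2 V for these illustrative numbers
```

    The two directly probed lines pin down the absolute calibration (effectively V0 and I/C), after which the unprobed lines need only their pulse widths and known per-line capacitances.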

  17. Patient participation in patient safety and nursing input - a systematic review.

    Science.gov (United States)

    Vaismoradi, Mojtaba; Jordan, Sue; Kangasniemi, Mari

    2015-03-01

    This systematic review aims to synthesise the existing research on how patients participate in patient safety initiatives. Ambiguities remain about how patients participate in routine measures designed to promote patient safety. Systematic review using integrative methods. Electronic databases were searched using keywords describing patient involvement, nursing input and patient safety initiatives to retrieve empirical research published between 2007 and 2013. Findings were synthesized using the theoretical domains of Vincent's framework for analysing risk and safety in clinical practice: "patient", "healthcare provider", "task", "work environment", "organisation & management". We identified 17 empirical research papers: four qualitative, one mixed-method and 12 quantitative designs. All 17 papers indicated that patients can participate in safety initiatives. Improving patient participation in patient safety necessitates considering the patient as a person, the nurse as healthcare provider, the task of participation and the clinical environment. Patients' knowledge, health conditions, beliefs and experiences influence their decisions to engage in patient safety initiatives. An important component of the management of long-term conditions is to ensure that patients have sufficient knowledge to participate. Healthcare providers may need further professional development in patient education and patient care management to promote patient involvement in patient safety, and ensure that patients understand that they are 'allowed' to inform nurses of adverse events or errors. A healthcare system characterised by patient-centredness and mutual acknowledgement will support patient participation in safety practices. Further research is required to improve international knowledge of patient participation in patient safety in different disciplines, contexts and cultures. Patients have a significant role to play in enhancing their own safety while receiving hospital care. 

  18. Spatiotemporal coding of inputs for a system of globally coupled phase oscillators

    Science.gov (United States)

    Wordsworth, John; Ashwin, Peter

    2008-12-01

    We investigate the spatiotemporal coding of low amplitude inputs to a simple system of globally coupled phase oscillators with coupling function g(ϕ)=-sin(ϕ+α)+rsin(2ϕ+β) that has robust heteroclinic cycles (slow switching between cluster states). The inputs correspond to detuning of the oscillators. It was recently noted that globally coupled phase oscillators can encode their frequencies in the form of spatiotemporal codes of a sequence of cluster states [P. Ashwin, G. Orosz, J. Wordsworth, and S. Townley, SIAM J. Appl. Dyn. Syst. 6, 728 (2007)]. Concentrating on the case of N=5 oscillators, we show in detail how the spatiotemporal coding can be used to resolve all of the information that relates the individual inputs to each other, provided that a long enough time series is considered. We investigate robustness to the addition of noise and find remarkable stability, especially of the temporal coding, even for noise of a magnitude comparable to the inputs.

  19. Fuzzy logic algorithm for quantitative tissue characterization of diffuse liver diseases from ultrasound images.

    Science.gov (United States)

    Badawi, A M; Derbala, A S; Youssef, A M

    1999-08-01

    Computerized ultrasound tissue characterization has become an objective means for the diagnosis of liver diseases. It is difficult to differentiate diffuse liver diseases, namely cirrhotic and fatty liver, by visual inspection of ultrasound images. The visual criteria for differentiating diffuse diseases are rather confusing and highly dependent upon the sonographer's experience. This often introduces bias into the diagnostic procedure and limits its objectivity and reproducibility. Computerized tissue characterization, to quantitatively assist the sonographer in accurate differentiation and to minimize the degree of risk, is thus justified. Fuzzy logic has emerged as one of the most active areas in classification. In this paper, we present an approach that employs fuzzy reasoning techniques to automatically differentiate diffuse liver diseases using numerical quantitative features measured from the ultrasound images. Fuzzy rules were generated from over 140 cases consisting of normal, fatty, and cirrhotic livers. The input to the fuzzy system is an eight-dimensional vector of feature values: the mean gray level (MGL), the percentile 10%, the contrast (CON), the angular second moment (ASM), the entropy (ENT), the correlation (COR), the attenuation (ATTEN) and the speckle separation. The output of the fuzzy system is one of three categories: cirrhosis, fatty or normal. The steps for differentiating the pathologies are data acquisition and feature extraction, then dividing the input spaces of the measured quantitative data into fuzzy sets. Based on expert knowledge, the fuzzy rules are generated and applied using fuzzy inference procedures to determine the pathology. Different membership functions are developed for the input spaces. This approach has resulted in very good sensitivity and specificity for classifying diffuse liver pathologies. This classification technique can be used in the diagnostic process, together with the history
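A minimal sketch of this style of rule-based fuzzy inference, using invented triangular membership functions over two of the eight features; the paper's actual rules, memberships, and feature ranges are not reproduced here.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative fuzzy sets for two features (mean gray level and attenuation);
# the breakpoints are made up for this sketch.
mgl = {"low": lambda x: tri(x, 0, 20, 40),
       "mid": lambda x: tri(x, 30, 50, 70),
       "high": lambda x: tri(x, 60, 80, 100)}
atten = {"low": lambda x: tri(x, 0.0, 0.4, 0.8),
         "high": lambda x: tri(x, 0.6, 1.0, 1.4)}

# Each rule pairs an antecedent (feature, fuzzy term) conjunction with a class;
# AND is taken as min, and rules firing for the same class combine with max.
rules = [((("mgl", "mid"), ("atten", "low")), "normal"),
         ((("mgl", "high"), ("atten", "high")), "fatty"),
         ((("mgl", "low"), ("atten", "high")), "cirrhosis")]

def classify(features):
    sets = {"mgl": mgl, "atten": atten}
    scores = {}
    for antecedent, label in rules:
        strength = min(sets[f][term](features[f]) for f, term in antecedent)
        scores[label] = max(scores.get(label, 0.0), strength)
    return max(scores, key=scores.get)
```

The full system would apply the same machinery over all eight features with rules derived from the 140 training cases.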

  20. Robust input design for nonlinear dynamic modeling of AUV.

    Science.gov (United States)

    Nouri, Nowrouz Mohammad; Valadi, Mehrdad

    2017-09-01

    Input design has a dominant role in developing the dynamic model of autonomous underwater vehicles (AUVs) through system identification. Optimal input design is the process of generating informative inputs that can be used to build a good-quality dynamic model of AUVs. In optimal input design, the desired input signal depends on the unknown system that is to be identified. In this paper, an input design approach that is robust to uncertainties in model parameters is used. The Bayesian robust design strategy is applied to design input signals for dynamic modeling of AUVs. The employed approach can design multiple inputs and apply constraints on an AUV system's inputs and outputs. Particle swarm optimization (PSO) is employed to solve the constrained robust optimization problem. The presented algorithm is used for designing the input signals for an AUV, and the estimate obtained by robust input design is compared with that of the optimal input design. According to the results, the proposed input design satisfies both robustness of constraints and optimality. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
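The optimization step can be illustrated with a generic particle swarm optimizer under box constraints. This is a sketch of plain PSO itself, minimizing an arbitrary objective; the paper's Bayesian robust design criterion and the AUV input/output constraints are not reproduced.

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer with box (clip) constraints.

    f      : objective mapping a 1-D parameter vector to a scalar to minimize
    bounds : sequence of (lo, hi) pairs, one per dimension
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(lo)))   # positions
    v = np.zeros_like(x)                              # velocities
    pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pval.argmin()]                          # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)                    # enforce box constraints
        val = np.apply_along_axis(f, 1, x)
        improved = val < pval
        pbest[improved], pval[improved] = x[improved], val[improved]
        g = pbest[pval.argmin()]
    return g, pval.min()
```

In the paper's setting, `f` would be the (negated) robust design criterion evaluated over the prior on model parameters, with additional penalty terms for signal constraints.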

  1. Augmented Input Reveals Word Deafness in a Man with Frontotemporal Dementia

    Directory of Open Access Journals (Sweden)

    Chris Gibbons

    2012-01-01

    Full Text Available We describe a 57-year-old, right-handed, English-speaking man initially diagnosed with progressive aphasia. Language assessment revealed inconsistent performance in key areas. Expressive language was reduced to a few short, perseverative phrases. Speech was severely apraxic. Primary modes of communication included gesture, pointing, gaze, physical touch and leading. Responses were 100% accurate when he was provided with written words, with random or inaccurate responses for strictly auditory/verbal input. When instructions to subsequent neuropsychological tests were written instead of spoken, performance improved markedly. A comprehensive audiology assessment revealed no hearing impairment. Neuroimaging was unremarkable. Neurobehavioral evaluation utilizing written input led to diagnoses of word deafness and frontotemporal dementia, resulting in very different management. We highlight the need for alternative modes of language input for assessment and treatment of patients with language comprehension symptoms.

  2. PCC/SRC, PCC and SRC Calculation from Multivariate Input for Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.; Johnson, J.D.

    1995-01-01

    1 - Description of program or function: PCC/SRC is designed for use in conjunction with sensitivity analyses of complex computer models. PCC/SRC calculates the partial correlation coefficients (PCC) and the standardized regression coefficients (SRC) from the multivariate input to, and output from, a computer model. 2 - Method of solution: PCC/SRC calculates the coefficients on either the original observations or on the ranks of the original observations. These coefficients provide alternative measures of the relative contribution (importance) of each of the various input variables to the observed variations in output. Relationships between the coefficients and differences in their interpretations are identified. If the computer model output has an associated time or spatial history, PCC/SRC will generate a graph of the coefficients over time or space for each input-variable, output-variable combination of interest, indicating the importance of each input value over time or space. 3 - Restrictions on the complexity of the problem - Maxima of: 100 observations, 100 different time steps or intervals between successive dependent variable readings, 50 independent variables (model input), 20 dependent variables (model output), 10 ordered triples specifying intervals between dependent variable readings
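The two coefficient types can be computed from an input matrix X and output vector y roughly as follows. This is a sketch of the standard statistical definitions, not of the PCC/SRC program itself; applying the same functions to the column-wise ranks of X and y would give the rank-based variants the abstract mentions.

```python
import numpy as np

def src(X, y):
    """Standardized regression coefficients: OLS betas on z-scored data."""
    Xs = (X - X.mean(0)) / X.std(0, ddof=1)
    ys = (y - y.mean()) / y.std(ddof=1)
    A = np.column_stack([np.ones(len(ys)), Xs])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef[1:]  # drop the intercept

def pcc(X, y):
    """Partial correlation of each input with y, controlling for the others.

    Computed as the correlation between the residuals of X[:, j] and y after
    each is regressed on the remaining input columns.
    """
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        rx = X[:, j] - others @ np.linalg.lstsq(others, X[:, j], rcond=None)[0]
        ry = y - others @ np.linalg.lstsq(others, y, rcond=None)[0]
        out[j] = np.corrcoef(rx, ry)[0, 1]
    return out
```

For a time or spatial history, one would simply evaluate these per time step to produce the coefficient-versus-time graphs the program generates.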

  3. A Study on Quantitative Assessment of Design Specification of Reactor Protection System Software Using Bayesian Belief Networks

    International Nuclear Information System (INIS)

    Eom, H. S.; Kang, H. G.; Chang, S. C.; Park, G. Y.; Kwon, K. C.

    2007-02-01

    This report proposes a method that can produce a quantitative reliability estimate of safety-critical software for PSA by making use of Bayesian Belief Networks (BBN). BBNs have generally been used to model uncertain systems in many research fields. The proposed method was constructed by utilizing a BBN that can combine the qualitative and quantitative evidence relevant to the reliability of safety-critical software, and can then infer a conclusion in a formal and quantitative way. A case study was also carried out with the proposed method to assess the quality of the software design specification of safety-critical software that will be embedded in a reactor protection system. The V and V results of the software were used as inputs for the BBN model. The calculation results of the BBN model showed that its conclusion is mostly equivalent to that of the V and V expert for a given input data set. The method and the results of the case study will be utilized in the PSA of NPPs. The method can also support the V and V expert's decision making process in controlling further V and V activities
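The core idea of combining evidence by Bayesian updating can be illustrated with a two-node toy network: a latent software-quality variable and one observed V and V outcome. The structure and all probabilities below are invented for the sketch, not those of the report's BBN model.

```python
# Toy two-node Bayesian belief network: quality -> V&V outcome.
# Priors and conditional probabilities are illustrative only.

p_quality = {"high": 0.7, "low": 0.3}    # prior belief about design quality
p_vv_pass = {"high": 0.95, "low": 0.40}  # P(V&V passes | quality)

def posterior_quality(vv_passed):
    """Bayes update of the quality belief given one V&V observation."""
    like = {q: (p_vv_pass[q] if vv_passed else 1.0 - p_vv_pass[q])
            for q in p_quality}
    joint = {q: like[q] * p_quality[q] for q in p_quality}
    z = sum(joint.values())
    return {q: joint[q] / z for q in joint}
```

A real BBN for this purpose would chain many such nodes (development process quality, document quality, multiple V and V activities) and propagate evidence through all of them, but each local update works exactly like this.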

  4. Six axis force feedback input device

    Science.gov (United States)

    Ohm, Timothy (Inventor)

    1998-01-01

    The present invention is a low friction, low inertia, six-axis force feedback input device comprising an arm with double-jointed, tendon-driven revolute joints, a decoupled tendon-driven wrist, and a base with encoders and motors. The input device functions as a master robot manipulator of a microsurgical teleoperated robot system including a slave robot manipulator coupled to an amplifier chassis, which is coupled to a control chassis, which is coupled to a workstation with a graphical user interface. The amplifier chassis is coupled to the motors of the master robot manipulator and the control chassis is coupled to the encoders of the master robot manipulator. A force feedback can be applied to the input device and can be generated from the slave robot to enable a user to operate the slave robot via the input device without physically viewing the slave robot. Also, the force feedback can be generated from the workstation to represent fictitious forces to constrain the input device's control of the slave robot to be within imaginary predetermined boundaries.

  5. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  6. Investigation of RADTRAN Stop Model input parameters for truck stops

    International Nuclear Information System (INIS)

    Griego, N.R.; Smith, J.D.; Neuhauser, K.S.

    1996-01-01

    RADTRAN is a computer code for estimating the risks and consequences of the transport of radioactive materials (RAM). RADTRAN was developed and is maintained by Sandia National Laboratories for the US Department of Energy (DOE). For incident-free transportation, the dose to persons exposed while the shipment is stopped is frequently a major percentage of the overall dose. This dose is referred to as Stop Dose and is calculated by the Stop Model. Because stop dose is a significant portion of the overall dose associated with RAM transport, the values used as input for the Stop Model are important. Therefore, an investigation of typical values for RADTRAN Stop Model parameters at truck stops was performed. The resulting data from these investigations were analyzed to provide mean values, standard deviations, and histograms. The mean values can be used when an analyst does not have a basis for selecting other input values for the Stop Model. In addition, the histograms and their characteristics can be used to guide statistical sampling techniques to measure the sensitivity of the RADTRAN-calculated Stop Dose to the uncertainties in the stop model input parameters. This paper discusses the details and presents the results of the investigation of stop model input parameters at truck stops
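The summary statistics and the sampling-based sensitivity idea can be sketched as follows. The stop-duration data and the dose-rate constant are invented for illustration; this is not RADTRAN's actual stop-dose formula.

```python
import numpy as np

# Hypothetical stop-duration samples (hours) gathered at truck stops.
stop_hours = np.array([0.4, 0.5, 0.5, 0.6, 0.75, 0.8, 1.0, 1.0, 1.2, 2.0])

# Summary statistics an analyst could fall back on when no better data exist.
mean = stop_hours.mean()
std = stop_hours.std(ddof=1)
counts, edges = np.histogram(stop_hours, bins=4)

# Resample the empirical distribution to propagate stop-time uncertainty
# into a toy stop-dose estimate: dose = rate * duration.
DOSE_RATE = 0.01  # person-Sv per hour of stop time, illustrative only
rng = np.random.default_rng(42)
doses = DOSE_RATE * rng.choice(stop_hours, size=10_000, replace=True)
```

The spread of `doses` (rather than a single point value) is what lets the analyst quantify how stop-parameter uncertainty flows into the calculated Stop Dose.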

  7. Undergraduate medical education programme renewal: a longitudinal context, input, process and product evaluation study.

    Science.gov (United States)

    Mirzazadeh, Azim; Gandomkar, Roghayeh; Hejri, Sara Mortaz; Hassanzadeh, Gholamreza; Koochak, Hamid Emadi; Golestani, Abolfazl; Jafarian, Ali; Jalili, Mohammad; Nayeri, Fatemeh; Saleh, Narges; Shahi, Farhad; Razavi, Seyed Hasan Emami

    2016-02-01

    The purpose of this study was to utilize the Context, Input, Process and Product (CIPP) evaluation model as a comprehensive framework to guide initiating, planning, implementing and evaluating a revised undergraduate medical education programme. The eight-year longitudinal evaluation study consisted of four phases compatible with the four components of the CIPP model. In the first phase, we explored the strengths and weaknesses of the traditional programme as well as contextual needs, assets, and resources. For the second phase, we proposed a model for the programme considering contextual features. During the process phase, we provided formative information for revisions and adjustments. Finally, in the fourth phase, we evaluated the outcomes of the new undergraduate medical education programme in the basic sciences phase. Information was collected from different sources such as medical students, faculty members, administrators, and graduates, using various qualitative and quantitative methods including focus groups, questionnaires, and performance measures. The CIPP model has the potential to guide policy makers to systematically collect evaluation data and to manage stakeholders' reactions at each stage of the reform in order to make informed decisions. However, the model may result in evaluation burden and fail to address some unplanned evaluation questions.

  8. Leaders’ receptivity to subordinates’ creative input: the role of achievement goals and composition of creative input

    NARCIS (Netherlands)

    Sijbom, R.B.L.; Janssen, O.; van Yperen, N.W.

    2015-01-01

    We identified leaders’ achievement goals and composition of creative input as important factors that can clarify when and why leaders are receptive to, and supportive of, subordinates’ creative input. As hypothesized, in two experimental studies, we found that relative to mastery goal leaders,

  9. Quantitative Image Restoration in Bright Field Optical Microscopy.

    Science.gov (United States)

    Gutiérrez-Medina, Braulio; Sánchez Miranda, Manuel de Jesús

    2017-11-07

    Bright field (BF) optical microscopy is regarded as a poor method to observe unstained biological samples due to intrinsic low image contrast. We introduce quantitative image restoration in bright field (QRBF), a digital image processing method that restores out-of-focus BF images of unstained cells. Our procedure is based on deconvolution, using a point spread function modeled from theory. By comparing with reference images of bacteria observed in fluorescence, we show that QRBF faithfully recovers shape and enables quantifying the size of individual cells, even from a single input image. We applied QRBF in a high-throughput image cytometer to assess shape changes in Escherichia coli during hyperosmotic shock, finding size heterogeneity. We demonstrate that QRBF is also applicable to eukaryotic cells (yeast). Altogether, digital restoration emerges as a straightforward alternative to methods designed to generate contrast in BF imaging for quantitative analysis. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
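Deconvolution against a theoretical PSF can be sketched with a frequency-domain Wiener filter. The paper's exact PSF model and regularization are not reproduced here; the Gaussian PSF and the constant noise-to-signal term `k` below are assumptions made for the sketch.

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered Gaussian PSF on the full image grid, normalized to sum to 1."""
    y = np.arange(shape[0]) - shape[0] // 2
    x = np.arange(shape[1]) - shape[1] // 2
    yy, xx = np.meshgrid(y, x, indexing="ij")
    p = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return p / p.sum()

def wiener_deconvolve(image, psf, k=0.01):
    """Restore a blurred image given a same-shape PSF.

    Wiener filter in the frequency domain: conj(H) / (|H|^2 + k), where k
    stands in for the noise-to-signal power ratio (assumed constant here).
    """
    H = np.fft.fft2(np.fft.ifftshift(psf))  # shift PSF center to the origin
    G = np.fft.fft2(image)
    F = np.conj(H) / (np.abs(H) ** 2 + k) * G
    return np.real(np.fft.ifft2(F))
```

With a PSF modeled from defocus theory instead of the Gaussian placeholder, the same filter structure recovers shape well enough for quantitative sizing.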

  10. Quantitative prediction of solute strengthening in aluminium alloys.

    Science.gov (United States)

    Leyson, Gerard Paul M; Curtin, William A; Hector, Louis G; Woodward, Christopher F

    2010-09-01

    Despite significant advances in computational materials science, a quantitative, parameter-free prediction of the mechanical properties of alloys has been difficult to achieve from first principles. Here, we present a new analytic theory that, with input from first-principles calculations, is able to predict the strengthening of aluminium by substitutional solute atoms. Solute-dislocation interaction energies in and around the dislocation core are first calculated using density functional theory and a flexible-boundary-condition method. An analytic model for the strength, or stress to move a dislocation, owing to the random field of solutes, is then presented. The theory, which has no adjustable parameters and is extendable to other metallic alloys, predicts both the energy barriers to dislocation motion and the zero-temperature flow stress, allowing for predictions of finite-temperature flow stresses. Quantitative comparisons with experimental flow stresses at temperature T=78 K are made for Al-X alloys (X=Mg, Si, Cu, Cr) and good agreement is obtained.

  11. Regional Capital Inputs in Chinese Industry and Manufacturing, 1978-2003

    NARCIS (Netherlands)

    Wang, Lili; Szirmai, Adam

    2008-01-01

    This paper provides new estimates of capital inputs in the Chinese economy. Estimates are made for the total economy (1953-2003), for the industrial sector (1978-2003) and for the manufacturing sector (1985-2003). The estimates for industry and manufacturing are broken down by thirty regions. The

  12. High-frequency matrix converter with square wave input

    Science.gov (United States)

    Carr, Joseph Alexander; Balda, Juan Carlos

    2015-03-31

    A device for producing an alternating current output voltage from a high-frequency, square-wave input voltage, comprising a high-frequency, square-wave input, a matrix converter and a control system. The matrix converter comprises a plurality of electrical switches. The high-frequency input and the matrix converter are electrically connected to each other. The control system is connected to each switch of the matrix converter. The control system is electrically connected to the input of the matrix converter. The control system is configured to operate each electrical switch of the matrix converter, converting a high-frequency, square-wave input voltage across the first input port of the matrix converter and the second input port of the matrix converter to an alternating current output voltage at the output of the matrix converter.

  13. Field inoculation of arbuscular mycorrhiza on maize (Zea mays L.) under low inputs: preliminary study on quantitative and qualitative aspects

    Directory of Open Access Journals (Sweden)

    Emilio Sabia

    2015-03-01

    Full Text Available Arbuscular mycorrhizal symbiosis contributes to the sustainability of the soil-plant system. A field experiment was conducted to examine the effect of arbuscular mycorrhiza (AM) on quantitative and qualitative performance in forage maize (Zea mays L.). Within the project Sviluppo di modelli zootecnici ai fini della sostenibilità (SOS-ZOOT), a trial was conducted at the experimental farm of the Agricultural Research Council in Bella (PZ), located in the Basilicata region (Southern Italy) at 360 m asl, characterised by an annual rainfall of approximately 650 mm. For spring sowing, two plots of 2500 m2 were used, one sown with seeds inoculated with AM (M, 1.0 kg/ha) and the other one with non-inoculated seeds (NM). At 120 days after sowing, when plants showed 30% dry matter, five replicates of 1 m2 per plot were used to estimate dry matter yield (DMY), while half the plot was dedicated to the assessment of grain production. For each replicate, three representative plants were considered; each plant was measured for height and was divided into leaves, stem and ear. For each plot, the following constituents were determined: crude protein, ash, ether extract, crude fibre (CF), fibre fractions [neutral detergent fibre (NDF), acid detergent fibre (ADF) and sulphuric acid lignin] and phosphorus (P). Throughout the period of the plants' growth, no herbicides, organic or inorganic fertilisation, or irrigation water were applied. The preliminary results revealed a significant effect of AM inoculation on forage maize DMY, on P content in the whole plant and in the leaves, and on the quality of the stem. The M thesis showed a significant increase in terms of DMY in comparison with the NM thesis: 21.2 vs 17.9 t/ha (P<0.05). The mycorrhized whole plants [0.22 vs 0.17% dry matter (DM), P<0.05] and leaves (0.14 vs 0.09% DM, P<0.05) showed an increased P content. The stems of M plants showed a content of CF, NDF, ADF and ash significantly lower compared with NM plants. No significant

  14. Intelligent RF-Based Gesture Input Devices Implemented Using e-Textiles

    Directory of Open Access Journals (Sweden)

    Dana Hughes

    2017-01-01

    Full Text Available We present a radio-frequency (RF)-based approach to gesture detection and recognition, using e-textile versions of common transmission lines used in microwave circuits. This approach allows for easy fabrication of input swatches that can detect a continuum of finger positions and similar basic gestures, using a single measurement line. We demonstrate that the swatches can perform gesture detection when under thin layers of cloth or when weatherproofed, providing a high level of versatility not present with other types of approaches. Additionally, using small convolutional neural networks, low-level gestures can be identified with a high level of accuracy using a small, inexpensive microcontroller, allowing for an intelligent fabric that reports only gestures of interest, rather than a simple sensor requiring constant surveillance from an external computing device. The resulting e-textile smart composite has applications in controlling wearable devices by providing a simple, eyes-free mechanism to input simple gestures.

  15. Agricultural and Environmental Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    Kaylie Rasmuson; Kurt Rautenstrauch

    2003-01-01

    This analysis is one of nine technical reports that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN) biosphere model. It documents input parameters for the biosphere model, and supports the use of the model to develop Biosphere Dose Conversion Factors (BDCF). The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the repository at Yucca Mountain. The ERMYN provides the TSPA with the capability to perform dose assessments. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships between the major activities and their products (the analysis and model reports) that were planned in the biosphere Technical Work Plan (TWP, BSC 2003a). It should be noted that some documents identified in Figure 1-1 may be under development and therefore not available at the time this document is issued. The "Biosphere Model Report" (BSC 2003b) describes the ERMYN and its input parameters. This analysis report, ANL-MGR-MD-000006, "Agricultural and Environmental Input Parameters for the Biosphere Model", is one of the five reports that develop input parameters for the biosphere model. This report defines and justifies values for twelve parameters required in the biosphere model. These parameters are related to use of contaminated groundwater to grow crops. The parameter values recommended in this report are used in the soil, plant, and carbon-14 submodels of the ERMYN

  16. Snapshot Views of the Romanian Economy on Regional Level Using Input-Output Methodology

    Directory of Open Access Journals (Sweden)

    BORÓKA-JÚLIA BÍRÓ

    2014-06-01

    Full Text Available Our present paper proposes to give snapshot views of the status quo of the Romanian economy at the level of development regions. From a methodological perspective, the study is based on the construction of an aggregated national Input-Output table from the more detailed one of the National Institute of Statistics, followed by the derivation of regional tables using the non-survey GRIT technique. Quantitative sectoral interrelationships are analysed based on multipliers and backward and forward linkages in order to identify key sectors within regional economies. This could serve as a baseline for assessing the impact of several policies of the European Union on the Romanian economy, such as the Cohesion Policy and the Common Agricultural Policy. The lower territorial approach – i.e. the construction of regional Input-Output models – used within the present study is in accordance with the European Union’s NUTS2 level policy design and planning philosophy on the one hand. On the other hand, this analytic direction makes possible the use of the results as a base for regional economic development strategy design, highlighting structural specificities and discrepancies among regions of the same country.
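The multiplier and linkage computations follow directly from the Leontief inverse. A sketch with an invented three-sector technical coefficients matrix (not the Romanian data):

```python
import numpy as np

# Illustrative 3-sector technical coefficients matrix A:
# A[i, j] = input from sector i required per unit output of sector j.
A = np.array([[0.20, 0.10, 0.05],
              [0.15, 0.25, 0.10],
              [0.10, 0.05, 0.30]])

# Leontief inverse (I - A)^-1: total (direct + indirect) output per unit
# of final demand.
L = np.linalg.inv(np.eye(3) - A)

backward = L.sum(axis=0)  # column sums: output multipliers / backward linkages
forward = L.sum(axis=1)   # row sums: forward linkages

# A common key-sector criterion: both linkages above the economy-wide average.
key = (backward > backward.mean()) & (forward > forward.mean())
```

Repeating this per region on the GRIT-derived tables yields the regional key-sector comparisons the paper describes.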

  17. Integration of hydrothermal-energy economics: related quantitative studies

    Energy Technology Data Exchange (ETDEWEB)

    1982-08-01

    A comparison of ten models for computing the cost of hydrothermal energy is presented. This comparison involved a detailed examination of a number of technical and economic parameters of the various quantitative models with the objective of identifying the most important parameters in the context of accurate estimates of the cost of hydrothermal energy. Important features of the various models, such as focus of study, applications, market sectors covered, methodology, input data requirements, and output are compared in the document. A detailed sensitivity analysis of all the important engineering and economic parameters is carried out to determine the effect of non-consideration of individual parameters.

  18. Textual Enhancement of Input: Issues and Possibilities

    Science.gov (United States)

    Han, ZhaoHong; Park, Eun Sung; Combs, Charles

    2008-01-01

    The input enhancement hypothesis proposed by Sharwood Smith (1991, 1993) has stimulated considerable research over the last 15 years. This article reviews the research on textual enhancement of input (TE), an area where the majority of input enhancement studies have aggregated. Methodological idiosyncrasies are the norm of this body of research.…

  19. Responses of tree and insect herbivores to elevated nitrogen inputs: A meta-analysis

    Science.gov (United States)

    Li, Furong; Dudley, Tom L.; Chen, Baoming; Chang, Xiaoyu; Liang, Liyin; Peng, Shaolin

    2016-11-01

    Increasing atmospheric nitrogen (N) inputs have the potential to alter terrestrial ecosystem function through impacts on plant-herbivore interactions. The goal of our study is to search for a general pattern in the responses of tree characteristics important for herbivores, and of insect herbivore performance, to elevated N inputs. We conducted a meta-analysis based on 109 papers describing impacts of nitrogen inputs on tree characteristics and 16 papers on insect performance. The differences in plant characteristics and insect performance between broadleaves and conifers were also explored. Tree aboveground biomass, leaf biomass and leaf N concentration significantly increased under elevated N inputs. Elevated N inputs had no significant overall effect on concentrations of phenolic compounds and lignin but adversely affected tannins, which serve as defensive chemicals against insect herbivores. Additionally, overall insect herbivore performance (including development time, insect biomass, relative growth rate, and so on) was significantly increased by elevated N inputs. According to the inconsistent responses between broadleaves and conifers, broadleaves would be more likely than conifers to respond to elevated N inputs by increasing growth through light interception and photosynthesis rather than by producing more defensive chemicals. Moreover, the overall carbohydrate concentration was significantly reduced by 13.12% in broadleaves while it increased slightly in conifers. The overall tannin concentration decreased significantly by 39.21% in broadleaves, but a 5.8% decrease in conifers was not significant. The results of the analysis indicated that elevated N inputs would provide more food sources and ameliorate tree palatability for insects, while the resistance of trees against their insect herbivores was weakened, especially for broadleaves. Thus, global forest insect pest problems would be aggravated by elevated N inputs. As N inputs continue to rise in the future, forest
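Percent changes like those reported above are typically derived from log response ratios in ecological meta-analysis. A minimal sketch of that effect size and its back-transformation; the paper's exact metric and weighting scheme are assumed, not confirmed from the abstract.

```python
import math

def log_response_ratio(mean_treat, mean_ctrl):
    """Effect size lnRR = ln(mean_treatment / mean_control)."""
    return math.log(mean_treat / mean_ctrl)

def percent_change(lnrr):
    """Back-transform a (pooled) log response ratio to a percent change."""
    return (math.exp(lnrr) - 1.0) * 100.0

def pooled_lnrr(lnrrs, weights):
    """Weighted mean effect across studies (weights ~ inverse variances)."""
    total = sum(weights)
    return sum(e * w for e, w in zip(lnrrs, weights)) / total
```

For example, a pooled lnRR of about -0.498 back-transforms to roughly the -39% tannin change reported for broadleaves.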

  20. Effects of allochthonous inputs in the control of infectious disease of prey

    International Nuclear Information System (INIS)

    Sahoo, Banshidhar; Poria, Swarup

    2015-01-01

    Highlights: •An infected predator–prey model with allochthonous inputs is proposed. •Stability and persistence conditions are derived. •Bifurcation is determined with respect to allochthonous inputs. •Results show that the system cannot be disease free without allochthonous inputs. •Hopf and its continuation bifurcations are analysed numerically. -- Abstract: Allochthonous inputs are important sources of productivity in many food webs, and their influence on food chain models demands further investigation. In this paper, assuming the existence of allochthonous inputs for the intermediate predator, a food chain model is formulated with disease in the prey. The stability and persistence conditions of the equilibrium points are determined. An extinction criterion for the infected prey population is obtained. It is shown that a suitable amount of allochthonous input to the intermediate predator can control infectious disease of the prey population, provided the initial intermediate predator population is above a critical value. This critical intermediate population size increases monotonically with the increase of the infection rate. It is also shown that control of infectious disease of the prey is possible in some cases of seasonally varying contact rate. Dynamical behaviours of the model are investigated numerically through one- and two-parameter bifurcation analysis using the MATCONT 2.5.1 package. The occurrence of Hopf bifurcation and its continuation curves are noted with the variation of the infection rate and allochthonous food availability. The continuation curves of the limit point cycle and Neimark-Sacker bifurcation are drawn by varying the rate of infection and allochthonous inputs. This study introduces a novel natural non-toxic method for controlling infectious disease of prey in a food chain model
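The qualitative effect of an allochthonous input can be explored numerically with a simplified tri-trophic chain. This sketch deliberately omits the paper's susceptible/infected prey split and uses invented parameter values; it only illustrates the simulation setup, not the published model.

```python
import numpy as np

# Simplified tri-trophic chain: prey x, intermediate predator y (receiving a
# constant allochthonous input A), top predator z. All parameters invented.
r, K = 1.0, 10.0      # prey growth rate and carrying capacity
a1, a2 = 0.5, 0.3     # predation rates
e1, e2 = 0.5, 0.4     # conversion efficiencies
d1, d2 = 0.2, 0.1     # predator mortality rates
A = 0.15              # allochthonous input to the intermediate predator

def rhs(s):
    x, y, z = s
    return np.array([
        r * x * (1.0 - x / K) - a1 * x * y,
        e1 * a1 * x * y + A - a2 * y * z - d1 * y,
        e2 * a2 * y * z - d2 * z,
    ])

def rk4(f, s0, h, steps):
    """Classic fixed-step fourth-order Runge-Kutta integration."""
    s = np.asarray(s0, dtype=float)
    traj = [s]
    for _ in range(steps):
        k1 = f(s)
        k2 = f(s + h / 2 * k1)
        k3 = f(s + h / 2 * k2)
        k4 = f(s + h * k3)
        s = s + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(s)
    return np.array(traj)

traj = rk4(rhs, [5.0, 1.0, 0.5], h=0.05, steps=4000)
```

Sweeping `A` (and, in the full model, the infection rate) over such simulations is how the bifurcation structure described above would be traced numerically.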

  1. A comprehensive estimation of the economic effects of meteorological services based on the input-output method.

    Science.gov (United States)

    Wu, Xianhua; Wei, Guo; Yang, Lingjuan; Guo, Ji; Lu, Huaguo; Chen, Yunfeng; Sun, Jian

    2014-01-01

    Concentrating on consuming coefficient, partition coefficient, and Leontief inverse matrix, relevant concepts and algorithms are developed for estimating the impact of meteorological services including the associated (indirect, complete) economic effect. Subsequently, quantitative estimations are particularly obtained for the meteorological services in Jiangxi province by utilizing the input-output method. It is found that the economic effects are noticeably rescued by the preventive strategies developed from both the meteorological information and internal relevance (interdependency) in the industrial economic system. Another finding is that the ratio range of input in the complete economic effect on meteorological services is about 1 : 108.27-1 : 183.06, remarkably different from a previous estimation based on the Delphi method (1 : 30-1 : 51). Particularly, economic effects of meteorological services are higher for nontraditional users of manufacturing, wholesale and retail trades, services sector, tourism and culture, and art and lower for traditional users of agriculture, forestry, livestock, fishery, and construction industries.
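    The input-output machinery behind such estimates reduces to the Leontief inverse. A minimal numerical sketch with an invented two-sector coefficient matrix (not the Jiangxi data):

```python
import numpy as np

# Hypothetical two-sector economy: consuming (technical) coefficient matrix A,
# where A[i, j] is the input from sector i needed per unit output of sector j.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])

# Leontief inverse: total (direct + indirect) output needed per unit of final demand.
L = np.linalg.inv(np.eye(2) - A)

# Invented final demand vector (e.g., demand induced by meteorological services).
d = np.array([100.0, 50.0])

# Total output across all sectors required to satisfy d, including indirect effects.
x = L @ d
```

    The gap between x and d is the indirect (interdependency) effect that the abstract credits for much of the estimated benefit.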

  2. A Comprehensive Estimation of the Economic Effects of Meteorological Services Based on the Input-Output Method

    Science.gov (United States)

    Wu, Xianhua; Yang, Lingjuan; Guo, Ji; Lu, Huaguo; Chen, Yunfeng; Sun, Jian

    2014-01-01

    Concentrating on consuming coefficient, partition coefficient, and Leontief inverse matrix, relevant concepts and algorithms are developed for estimating the impact of meteorological services including the associated (indirect, complete) economic effect. Subsequently, quantitative estimations are particularly obtained for the meteorological services in Jiangxi province by utilizing the input-output method. It is found that the economic effects are noticeably rescued by the preventive strategies developed from both the meteorological information and internal relevance (interdependency) in the industrial economic system. Another finding is that the ratio range of input in the complete economic effect on meteorological services is about 1 : 108.27–1 : 183.06, remarkably different from a previous estimation based on the Delphi method (1 : 30–1 : 51). Particularly, economic effects of meteorological services are higher for nontraditional users of manufacturing, wholesale and retail trades, services sector, tourism and culture, and art and lower for traditional users of agriculture, forestry, livestock, fishery, and construction industries. PMID:24578666

  3. High-order sliding mode observer for fractional commensurate linear systems with unknown input

    KAUST Repository

    Belkhatir, Zehor; Laleg-Kirati, Taous-Meriem

    2017-01-01

    In this paper, a high-order sliding mode observer (HOSMO) is proposed for the joint estimation of the pseudo-state and the unknown input of fractional commensurate linear systems with single unknown input and a single output. The convergence of the proposed observer is proved using a Lyapunov-based approach. In addition, an enhanced variant of the proposed fractional-HOSMO is introduced to avoid the peaking phenomenon and thus to improve the estimation results in the transient phase. Simulation results are provided to illustrate the performance of the proposed fractional observer in both noise-free and noisy cases. The effect of the observer’s gains on the estimated pseudo-state and unknown input is also discussed.

  4. High-order sliding mode observer for fractional commensurate linear systems with unknown input

    KAUST Repository

    Belkhatir, Zehor

    2017-05-20

    In this paper, a high-order sliding mode observer (HOSMO) is proposed for the joint estimation of the pseudo-state and the unknown input of fractional commensurate linear systems with single unknown input and a single output. The convergence of the proposed observer is proved using a Lyapunov-based approach. In addition, an enhanced variant of the proposed fractional-HOSMO is introduced to avoid the peaking phenomenon and thus to improve the estimation results in the transient phase. Simulation results are provided to illustrate the performance of the proposed fractional observer in both noise-free and noisy cases. The effect of the observer’s gains on the estimated pseudo-state and unknown input is also discussed.
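    As a rough illustration of the sliding-mode idea (not the paper's fractional-order HOSMO), the sketch below runs a first-order sliding mode observer on an integer-order plant and reconstructs a constant unknown input from the low-pass-filtered injection term; all gains and dynamics are invented:

```python
import numpy as np

# Plant: x' = a*x + d(t), measured output y = x, unknown input d to estimate.
a, d_true = -1.0, 0.5          # plant pole and (constant) unknown input
k, tau, dt = 2.0, 0.01, 1e-4   # injection gain (> |d|), filter constant, Euler step

x, xhat, dhat = 1.0, 0.0, 0.0
for _ in range(20000):                     # simulate 2 s with explicit Euler
    inj = k * np.sign(x - xhat)            # discontinuous output injection
    x += dt * (a * x + d_true)             # plant
    xhat += dt * (a * xhat + inj)          # sliding mode observer
    dhat += dt * (inj - dhat) / tau        # filtered injection ~ equivalent input
```

    Once the error reaches the sliding surface, the filtered injection term tracks the unknown input, which is the same mechanism the fractional HOSMO exploits at higher order.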

  5. Estimating the input function non-invasively for FDG-PET quantification with multiple linear regression analysis: simulation and verification with in vivo data

    International Nuclear Information System (INIS)

    Fang, Yu-Hua; Kao, Tsair; Liu, Ren-Shyan; Wu, Liang-Chih

    2004-01-01

    A novel statistical method, namely Regression-Estimated Input Function (REIF), is proposed in this study for the purpose of non-invasive estimation of the input function for fluorine-18 2-fluoro-2-deoxy-d-glucose positron emission tomography (FDG-PET) quantitative analysis. We collected data from 44 patients who had undergone a blood sampling procedure during their FDG-PET scans. First, we generated tissue time-activity curves of the grey matter and the whole brain with a segmentation technique for every subject. Summations of different intervals of these two curves were used as a feature vector, which also included the net injection dose. Multiple linear regression analysis was then applied to find the correlation between the input function and the feature vector. After a simulation study with in vivo data, the data of 29 patients were applied to calculate the regression coefficients, which were then used to estimate the input functions of the other 15 subjects. Comparing the estimated input functions with the corresponding real input functions, the averaged error percentages of the area under the curve and the cerebral metabolic rate of glucose (CMRGlc) were 12.13±8.85% and 16.60±9.61%, respectively. Regression analysis of the CMRGlc values derived from the real and estimated input functions revealed a high correlation (r=0.91). No significant difference was found between the real CMRGlc and that derived from our regression-estimated input function (Student's t test, P>0.05). The proposed REIF method demonstrated good abilities for input function and CMRGlc estimation, and represents a reliable replacement for the blood sampling procedures in FDG-PET quantification. (orig.)
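    The core of REIF is an ordinary multiple linear regression from a feature vector (interval sums of tissue curves plus injected dose) to the input function. A hedged sketch on synthetic data, with invented dimensions and a single regressed time point rather than the full curve:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: 29 subjects, feature vector standing in for interval
# sums of two tissue time-activity curves plus net injected dose (all invented).
n_train, n_feat = 29, 7
X = rng.uniform(1.0, 10.0, size=(n_train, n_feat))

# Hypothetical "true" linear map from features to the input-function value at
# one sample time (a method like REIF fits one such regression per time point).
w_true = rng.normal(size=n_feat)
y = X @ w_true + rng.normal(scale=0.05, size=n_train)

# Ordinary least squares with an intercept term (multiple linear regression).
X1 = np.column_stack([np.ones(n_train), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Estimate the input-function value for a new subject from its features alone,
# i.e. without blood sampling.
x_new = rng.uniform(1.0, 10.0, size=n_feat)
y_hat = coef[0] + x_new @ coef[1:]
```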

  6. Multiple-Input Subject-Specific Modeling of Plasma Glucose Concentration for Feedforward Control.

    Science.gov (United States)

    Kotz, Kaylee; Cinar, Ali; Mei, Yong; Roggendorf, Amy; Littlejohn, Elizabeth; Quinn, Laurie; Rollins, Derrick K

    2014-11-26

    The ability to accurately develop subject-specific input causation models for blood glucose concentration (BGC) over large input sets can have a significant impact on tightening control for insulin-dependent diabetes. More specifically, for Type 1 diabetics (T1Ds), it can lead to an effective artificial pancreas (i.e., an automatic control system that delivers exogenous insulin) under extreme changes in critical disturbances. These disturbances include food consumption, activity variations, and physiological stress changes. Thus, this paper presents a free-living, outpatient, multiple-input modeling method for BGC with strong causation attributes that is stable and guards against overfitting, providing an effective modeling approach for feedforward control (FFC). This approach is a Wiener block-oriented methodology, which has unique attributes for meeting critical requirements for effective, long-term FFC.
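    A Wiener block-oriented model is a linear dynamic block followed by a static output nonlinearity. The sketch below simulates a toy instance with invented parameters, not the paper's fitted BGC model:

```python
import numpy as np

def wiener_response(u, a=0.9, b=0.1, f=lambda v: v ** 2):
    """Simulate a Wiener model: first-order linear dynamics followed by a
    static nonlinearity f. All parameter values are illustrative."""
    v = 0.0
    y = np.empty(len(u))
    for k, uk in enumerate(u):
        v = a * v + b * uk          # linear dynamic block (single pole at a)
        y[k] = f(v)                 # static output nonlinearity
    return y

# Step response: the intermediate signal settles at b / (1 - a) = 1, so the
# output settles at f(1) = 1.
y = wiener_response(np.ones(300))
```

    In a feedforward-control setting, each measured input (meal, activity, stress) would drive its own linear block, with the nonlinearity capturing the saturating effect on BGC.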

  7. Effect of input compression and input frequency response on music perception in cochlear implant users.

    Science.gov (United States)

    Halliwell, Emily R; Jones, Linor L; Fraser, Matthew; Lockley, Morag; Hill-Feltham, Penelope; McKay, Colette M

    2015-06-01

    A study was conducted to determine whether modifications to input compression and input frequency response characteristics can improve music-listening satisfaction in cochlear implant users. Experiment 1 compared three pre-processed versions of music and speech stimuli in a laboratory setting: original, compressed, and flattened frequency response. Music excerpts comprised three music genres (classical, country, and jazz), and a running speech excerpt was compared. Experiment 2 implemented a flattened input frequency response in the speech processor program. In a take-home trial, participants compared unaltered and flattened frequency responses. Ten and twelve adult Nucleus Freedom cochlear implant users participated in Experiments 1 and 2, respectively. Experiment 1 revealed a significant preference for music stimuli with a flattened frequency response compared to both original and compressed stimuli, whereas there was a significant preference for the original (rising) frequency response for speech stimuli. Experiment 2 revealed no significant mean preference for the flattened frequency response, with 9 of 11 subjects preferring the rising frequency response. Input compression did not alter music enjoyment. Comparison of the two experiments indicated that individual frequency response preferences may depend on the genre or familiarity, and particularly whether the music contained lyrics.

  8. A novel three-input monomolecular logic circuit on a rhodamine inspired bio-compatible bi-compartmental molecular platform

    International Nuclear Information System (INIS)

    Mistri, Tarun; Bhowmick, Rahul; Katarkar, Atul; Chaudhuri, Keya; Ali, Mahammad

    2017-01-01

    Methodological synthesis of a new biocompatible bi-compartmental rhodamine-based probe (L3) provides a multi-input, multi-output molecular logic circuit based on simple chemosensing phenomena. Spectroscopic responses of Cu2+ and Hg2+ towards L3, together with the reversible binding of S2− with the L3–Cu2+ and L3–Hg2+ complexes, allow us to construct a three-input molecular circuit by their controlled and sequential addition to a solution of L3 in a mixed organo-aqueous medium. We have further successfully encoded binary digits from these inputs and outputs, converting a three-digit input string into a two-digit output string and resulting in a simple monomolecular logic circuit. Such a molecular 'Boolean' logic operation may increase the complexity of logic gate circuitry and computational speed, and may be useful in potential biocompatible molecular logic platforms. - Graphical abstract: A new bi-compartmental molecular system equipped with a rhodamine fluorophore unit provides a multi-input, multi-output molecular logic circuit based on a very simple observation of chemosensing activities.

  9. Isotopic signatures of eelgrass (Zostera marina L.) as bioindicator of anthropogenic nutrient input in the western Baltic Sea

    International Nuclear Information System (INIS)

    Schubert, Philipp R.; Karez, Rolf; Reusch, Thorsten B.H.; Dierking, Jan

    2013-01-01

    Highlights: • Anthropogenic nitrogen (N) inputs are a global problem, but difficult to quantify. • We tested the use of eelgrass δ15N as a proxy of such inputs in the Baltic Sea. • The method revealed distinct spatial patterns in sewage N across a eutrophic bay. • Traditional eutrophication measures corroborated the results from δ15N values. • Eelgrass δ15N ratios have high potential as a proxy of sewage-derived N in the Baltic. -- Abstract: Eutrophication is a global environmental problem. Better management of this threat requires more accurate assessments of anthropogenic nitrogen (N) inputs to coastal systems than can be obtained with traditional measures. Recently, primary producer N isotopic signatures have emerged as a useful proxy of such inputs. Here, we demonstrated for the first time the applicability of this method using the widespread eelgrass (Zostera marina) in the highly eutrophic Baltic Sea. The availability of sewage N across a bay with one major sewage outflow, as indicated by eelgrass δ15N, was high near and downstream of the outflow compared to upstream, but returned to upstream levels within 4 km downstream of the outfall. General conclusions were corroborated by traditional eutrophication measures but, in contrast to these measures, were fully quantitative. Eelgrass N isotope ratios therefore show high potential for coastal screening of eutrophication in the Baltic Sea, and in other areas with eelgrass meadows
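    The quantitative step behind such isotopic proxies is typically a two-endmember mixing calculation; the sketch below uses invented endmember δ15N values, not those of the Baltic study:

```python
# Two-endmember mixing: fraction of sewage-derived N in eelgrass tissue
# inferred from its δ15N signature. All values are invented for illustration.
delta_background = 5.0   # δ15N (per mil) of the marine background endmember
delta_sewage = 15.0      # δ15N (per mil) of sewage-derived nitrogen
delta_sample = 12.0      # measured eelgrass δ15N at a station near the outfall

# Linear mixing: the sample signature is a weighted average of the endmembers.
f_sewage = (delta_sample - delta_background) / (delta_sewage - delta_background)
```

    This linear-mixing assumption is what makes the isotopic proxy "fully quantitative" relative to traditional eutrophication measures.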

  10. Quantitative 3-D imaging topogrammetry for telemedicine applications

    Science.gov (United States)

    Altschuler, Bruce R.

    1994-01-01

    The technology to reliably transmit high-resolution visual imagery over short to medium distances in real time has led to serious consideration of the use of telemedicine, telepresence, and telerobotics in the delivery of health care. These concepts may involve, and evolve toward: consultation from remote expert teaching centers; diagnosis; triage; real-time remote advice to the surgeon; and real-time remote surgical instrument manipulation (telerobotics with virtual reality). Further extrapolation leads to teledesign and telereplication of spare surgical parts through quantitative teleimaging of 3-D surfaces tied to CAD/CAM devices and an artificially intelligent archival database of 'normal' shapes. The ability to generate 'topograms' or 3-D surface numerical tables of coordinate values capable of creating computer-generated virtual holographic-like displays, machine part replication, and statistical diagnostic shape assessment is critical to the progression of telemedicine. Any virtual reality simulation will remain in the 'video-game' realm until realistic dimensional and spatial relational inputs from real measurements in vivo during surgeries are added to an ever-growing statistical data archive. The challenges of managing and interpreting this 3-D database, which would include radiographic and surface quantitative data, are considerable. As technology drives toward dynamic and continuous 3-D surface measurements, presenting millions of X, Y, Z data points per second of flexing, stretching, moving human organs, the knowledge base and interpretive capabilities of 'brilliant robots' working as a surgeon's tireless assistants become imaginable. The brilliant robot would 'see' what the surgeon sees--and more, for the robot could quantify its 3-D sensing and would 'see' in a wider spectral range than humans, and could zoom its 'eyes' from the macro world to long-distance microscopy. Unerring robot hands could rapidly perform machine-aided suturing with

  11. PERSPECTIVES ON A DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Thoman, D.C.; Lowrie, J.; Keller, A.

    2008-01-01

    Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases

  12. Parallel factor ChIP provides essential internal control for quantitative differential ChIP-seq.

    Science.gov (United States)

    Guertin, Michael J; Cullen, Amy E; Markowetz, Florian; Holding, Andrew N

    2018-04-17

    A key challenge in quantitative ChIP combined with high-throughput sequencing (ChIP-seq) is the normalization of data in the presence of genome-wide changes in occupancy. Analysis-based normalization methods were developed for transcriptomic data and these are dependent on the underlying assumption that total transcription does not change between conditions. For genome-wide changes in transcription factor (TF) binding, these assumptions do not hold true. The challenges in normalization are confounded by experimental variability during sample preparation, processing and recovery. We present a novel normalization strategy utilizing an internal standard of unchanged peaks for reference. Our method can be readily applied to monitor genome-wide changes by ChIP-seq that are otherwise lost or misrepresented through analytical normalization. We compare our approach to normalization by total read depth and two alternative methods that utilize external experimental controls to study TF binding. We successfully resolve the key challenges in quantitative ChIP-seq analysis and demonstrate its application by monitoring the loss of Estrogen Receptor-alpha (ER) binding upon fulvestrant treatment, ER binding in response to estradiol, ER-mediated change in H4K12 acetylation and profiling ER binding in patient-derived xenografts. This is supported by an adaptable pipeline to normalize and quantify differential TF binding genome-wide and generate metrics for differential binding at individual sites.
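    The normalization idea can be sketched in a few lines: scale each condition so that an internal standard of unchanged peaks matches, then compute fold changes at the remaining peaks. All counts below are invented:

```python
import numpy as np

# Invented read counts at peaks for two conditions; the first four peaks are an
# internal standard assumed to be unchanged between conditions.
control   = np.array([100., 120., 80., 200., 500., 450., 300.])
treatment = np.array([ 60.,  70., 50., 115.,  90.,  85.,  60.])
reference = slice(0, 4)  # indices of the unchanged reference peaks

# Scale factor chosen so the reference peaks match the control on average.
scale = control[reference].sum() / treatment[reference].sum()
treatment_norm = treatment * scale

# After normalization, a genuine genome-wide loss of binding at the remaining
# peaks is preserved rather than averaged away by total-read-depth scaling.
fold_change = treatment_norm[4:] / control[4:]
```

    Total-read-depth normalization on the same data would shrink the apparent loss, because the global signal drop is itself part of the biology being measured.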

  13. Input-Admittance Passivity Compliance for Grid-Connected Converters with LCL Filter

    DEFF Research Database (Denmark)

    Diaz, Enrique Rodriguez; Freijedo, Francisco D.; Guerrero, Josep M.

    2018-01-01

    This work presents a design methodology and its experimental validation for the input-admittance passivity compliance of LCL grid-connected converters. The designs of the LCL filter parameters and the discrete controller are addressed systematically, and suitable design guidelines are provided for input-admittance passivity compliance.

  14. Stochastic weather inputs for improved urban water demand forecasting: application of nonlinear input variable selection and machine learning methods

    Science.gov (United States)

    Quilty, J.; Adamowski, J. F.

    2015-12-01

    Urban water supply systems are often stressed during seasonal outdoor water use, as water demands related to the climate are variable in nature, making it difficult to optimize the operation of the water supply system. Urban water demand (UWD) forecasts failing to include meteorological conditions as inputs to the forecast model may produce poor forecasts, as they cannot account for the increase or decrease in demand related to meteorological conditions. Meteorological records stochastically simulated into the future can be used as inputs to data-driven UWD forecasts, generally resulting in improved forecast accuracy. This study aims to produce data-driven UWD forecasts for two different Canadian water utilities (Montreal and Victoria) using machine learning methods, by first selecting historical UWD and meteorological records, derived from a stochastic weather generator, using nonlinear input variable selection. The nonlinear input variable selection methods considered in this work derive from the concept of conditional mutual information, a nonlinear dependency measure based on (multivariate) probability density functions that accounts for relevancy, conditional relevancy, and redundancy in a potential set of input variables. The results of our study indicate that stochastic weather inputs can improve UWD forecast accuracy for the two sites considered in this work. Nonlinear input variable selection is suggested as a means to identify which meteorological conditions should be utilized in the forecast.
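    A minimal stand-in for such input variable selection is a histogram-based mutual information estimate. The paper's criteria use density-based conditional mutual information; this sketch, on invented synthetic data, only ranks candidate inputs by plain MI:

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """Crude histogram-based estimate of I(X; Y) in nats, standing in for the
    density-based (conditional) mutual information criteria in the abstract."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
n = 5000
temperature = rng.normal(size=n)
rainfall = rng.normal(size=n)
# Synthetic demand driven by temperature only (a deliberately simple test case).
demand = 2.0 * temperature + 0.1 * rng.normal(size=n)

# Rank candidate inputs by their estimated dependency with the target.
scores = {name: mutual_information(x, demand)
          for name, x in [("temperature", temperature), ("rainfall", rainfall)]}
best = max(scores, key=scores.get)
```

    A full implementation would iterate this selection, conditioning each new candidate on the inputs already chosen to penalize redundancy.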

  15. Environmental Transport Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports documenting the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment for the license application (TSPA-LA) for the geologic repository at Yucca Mountain. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows relationships among the reports developed for biosphere modeling and biosphere abstraction products for the TSPA-LA, as identified in the ''Technical Work Plan for Biosphere Modeling and Expert Support'' (BSC 2004 [DIRS 169573]) (TWP). This figure provides an understanding of how this report contributes to biosphere modeling in support of the license application (LA). This report is one of the five reports that develop input parameter values for the biosphere model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes the conceptual model and the mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed description of the model input parameters. The output of this report is used as direct input in the ''Nominal Performance Biosphere Dose Conversion Factor Analysis'' and in the ''Disruptive Event Biosphere Dose Conversion Factor Analysis'' that calculate the values of biosphere dose conversion factors (BDCFs) for the groundwater and volcanic ash exposure scenarios, respectively. The purpose of this analysis was to develop biosphere model parameter values related to radionuclide transport and accumulation in the environment. These parameters support calculations of radionuclide concentrations in the environmental media (e.g., soil, crops, animal products, and air) resulting from a given radionuclide concentration at the source of contamination (i.e., either in groundwater or in volcanic ash). The analysis was performed in accordance with the TWP (BSC 2004 [DIRS 169573])

  16. Environmental Transport Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-10

    This analysis report is one of the technical reports documenting the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment for the license application (TSPA-LA) for the geologic repository at Yucca Mountain. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows relationships among the reports developed for biosphere modeling and biosphere abstraction products for the TSPA-LA, as identified in the ''Technical Work Plan for Biosphere Modeling and Expert Support'' (BSC 2004 [DIRS 169573]) (TWP). This figure provides an understanding of how this report contributes to biosphere modeling in support of the license application (LA). This report is one of the five reports that develop input parameter values for the biosphere model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes the conceptual model and the mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed description of the model input parameters. The output of this report is used as direct input in the ''Nominal Performance Biosphere Dose Conversion Factor Analysis'' and in the ''Disruptive Event Biosphere Dose Conversion Factor Analysis'' that calculate the values of biosphere dose conversion factors (BDCFs) for the groundwater and volcanic ash exposure scenarios, respectively. The purpose of this analysis was to develop biosphere model parameter values related to radionuclide transport and accumulation in the environment. These parameters support calculations of radionuclide concentrations in the environmental media (e.g., soil, crops, animal products, and air) resulting from a given radionuclide concentration at the source of contamination (i.e., either in groundwater or in volcanic ash). The analysis

  17. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  18. An Analysis of the Effect of Quantitative and Qualitative Admissions Factors in Determining Student Performance at the U.S. Naval Academy

    National Research Council Canada - National Science Library

    Phillips, Barton

    2004-01-01

    .... The Candidate Multiple (CM) is the quantitative input to the admissions process derived from a statistics-based scoring model anchored in proven high school performance measures such as the SAT and high school GPA...

  19. The profile of quantitative risk indicators in Krsko NPP

    International Nuclear Information System (INIS)

    Vrbanic, I.; Basic, I.; Bilic-Zabric, T.; Spiler, J.

    2004-01-01

    During the past decade a strong initiative was observed, aimed at incorporating information on risk into various aspects of the operation of nuclear power plants. The initiative was observable in activities carried out by regulators as well as utilities and industry. It resulted in establishing the process, or procedure, often referred to as integrated decision making or risk-informed decision making. In this process, engineering analyses and evaluations usually termed traditional, which rely on considerations of safety margins and defense in depth, are supplemented by quantitative indicators of risk. Throughout the process, plant risk was most commonly expressed in terms of the likelihood of events involving damage to the reactor core and events with radiological releases to the environment. These became two commonly used quantitative indicators, or metrics, of plant risk (or, reciprocally, plant safety). They were evaluated for their magnitude (e.g. the expected number of events per specified time interval) as well as their profile (e.g. the types of contributing events). The information for quantitative risk indicators (to be used in the risk-informing process) is obtained from the plant's probabilistic safety analyses or analyses of hazards. It depends on issues such as the availability of input data and the quality of the model or analysis. Nuclear power plant Krsko has recently performed a Periodic Safety Review, which was a good opportunity to evaluate and integrate the plant-specific information on quantitative plant risk indicators and their profile. The paper discusses some aspects of the quantitative plant risk profile and its perception.(author)

  20. A low-offset low-voltage CMOS Op Amp with rail-to-rail input and output ranges

    NARCIS (Netherlands)

    Holzmann, Peter J.; Wiegerink, Remco J.; Gierkink, Sander L.J.; Wassenaar, R.F.; Stroet, Peter; Stroet, P.M.

    1996-01-01

    A low voltage CMOS op amp is presented. The circuit uses complementary input pairs to achieve a rail-to-rail common mode input voltage range. Special attention has been given to the reduction of the op amp's systematic offset voltage. Gain boost amplifiers are connected in a special way to provide

  1. Analysis of network motifs in cellular regulation: Structural similarities, input-output relations and signal integration.

    Science.gov (United States)

    Straube, Ronny

    2017-12-01

    Much of the complexity of regulatory networks derives from the necessity to integrate multiple signals and to avoid malfunction due to cross-talk or harmful perturbations. Hence, one may expect that the input-output behavior of larger networks is not necessarily more complex than that of smaller network motifs, which suggests that both can, under certain conditions, be described by similar equations. In this review, we illustrate this approach by discussing the similarities that exist in the steady state descriptions of a simple bimolecular reaction, covalent modification cycles and bacterial two-component systems. Interestingly, in all three systems fundamental input-output characteristics such as thresholds, ultrasensitivity or concentration robustness are described by structurally similar equations. Depending on the system, the meaning of the parameters can differ, ranging from protein concentrations and affinity constants to complex parameter combinations, which allows for a quantitative understanding of signal integration in these systems. We argue that this approach may also be extended to larger regulatory networks. Copyright © 2017 Elsevier B.V. All rights reserved.
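    For covalent modification cycles, the steady-state input-output curve is the classic Goldbeter–Koshland function, whose zero-order ultrasensitivity illustrates the thresholds mentioned above; parameter values below are illustrative:

```python
import math

def goldbeter_koshland(u, v, J, K):
    """Steady-state modified fraction of a covalent modification cycle:
    u, v are the (scaled) kinase and phosphatase activities, J and K their
    scaled Michaelis constants. Small J, K gives zero-order ultrasensitivity."""
    B = v - u + v * J + u * K
    return 2 * u * K / (B + math.sqrt(B * B - 4 * (v - u) * u * K))

# With J = K = 0.01 the response switches sharply as u crosses v = 1:
low = goldbeter_koshland(0.9, 1.0, 0.01, 0.01)   # just below threshold
high = goldbeter_koshland(1.1, 1.0, 0.01, 0.01)  # just above threshold
```

    A 20% change in the kinase activity around the threshold flips the modified fraction from nearly zero to nearly one, which is the switch-like behavior shared across the motifs discussed in the review.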

  2. LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.

    Science.gov (United States)

    Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei

    2012-12-01

    Database searching based methods for label-free quantification aim to reconstruct the peptide extracted ion chromatogram based on the identification information, which can limit the search space and thus make the data processing much faster. The random effect of the MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW), and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem), as input data. LFQuant can handle large-scale label-free data with fractionation such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on the replication data set and the UPS1 standard data set. The results show that LFQuant performs better than them in terms of both precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. The ATLAS Fast Tracker Processing Units - input and output data preparation

    CERN Document Server

    Bolz, Arthur Eugen; The ATLAS collaboration

    2016-01-01

    The ATLAS Fast Tracker is a hardware processor built to reconstruct tracks at a rate of up to 100 kHz and provide them to the high level trigger system. The Fast Tracker will allow the trigger to utilize tracking information from the entire detector at an earlier event selection stage than ever before, allowing for more efficient event rejection. The connections of the system to the detector read-outs and to the high level trigger computing farms are made through custom boards implementing the Advanced Telecommunications Computing Architecture standard. The input is processed by the Input Mezzanine and Data Formatter boards, designed to receive and sort the data coming from the Pixel and Semi-conductor Tracker. The Fast Tracker to Level-2 Interface Card connects the system to the computing farm. The Input Mezzanines are 128 boards that perform clustering, placed on the 32 Data Formatter mother boards, which sort the information into the 64 logical regions required by the downstream processing units. This necessitat...

  4. Agricultural and Environmental Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    Kaylie Rasmuson; Kurt Rautenstrauch

    2003-06-20

    This analysis is one of nine technical reports that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN) biosphere model. It documents input parameters for the biosphere model, and supports the use of the model to develop Biosphere Dose Conversion Factors (BDCF). The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the repository at Yucca Mountain. The ERMYN provides the TSPA with the capability to perform dose assessments. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships between the major activities and their products (the analysis and model reports) that were planned in the biosphere Technical Work Plan (TWP, BSC 2003a). It should be noted that some documents identified in Figure 1-1 may be under development and therefore not available at the time this document is issued. The "Biosphere Model Report" (BSC 2003b) describes the ERMYN and its input parameters. This analysis report, ANL-MGR-MD-000006, "Agricultural and Environmental Input Parameters for the Biosphere Model", is one of the five reports that develop input parameters for the biosphere model. This report defines and justifies values for twelve parameters required in the biosphere model. These parameters are related to the use of contaminated groundwater to grow crops. The parameter values recommended in this report are used in the soil, plant, and carbon-14 submodels of the ERMYN.

  5. Quantitative Resistance to Plant Pathogens in Pyramiding Strategies for Durable Crop Protection

    Directory of Open Access Journals (Sweden)

    Marie-Laure Pilet-Nayel

    2017-10-01

    Quantitative resistance has gained interest in plant breeding for pathogen control in low-input cropping systems. Although quantitative resistance frequently has only a partial effect and is difficult to select, it is considered more durable than major resistance (R) genes. With the exponential development of molecular markers over the past 20 years, resistance QTL have been more accurately detected and better integrated into breeding strategies for resistant varieties with increased potential for durability. This review summarizes current knowledge on the genetic inheritance, molecular basis, and durability of quantitative resistance. Based on this knowledge, we discuss how strategies that combine major R genes and QTL in crops can maintain the effectiveness of plant resistance to pathogens. Combining resistance QTL with complementary modes of action appears to be an interesting strategy for breeding effective and potentially durable resistance. Combining quantitative resistance with major R genes has proven to be a valuable approach for extending the effectiveness of major genes. In the plant genomics era, improved tools and methods are becoming available to better integrate quantitative resistance into breeding strategies. Nevertheless, optimal combinations of resistance loci will still have to be identified to preserve resistance effectiveness over time for durable crop protection.

  6. SUS in nuclear medicine in Brazil: analysis and comparison of data provided by Datasus and CNEN.

    Science.gov (United States)

    Pozzo, Lorena; Coura Filho, George; Osso Júnior, João Alberto; Squair, Peterson Lima

    2014-01-01

    To investigate the outpatient access to nuclear medicine procedures by means of the Brazilian Unified Health System (SUS), analyzing the correspondence between data provided by this system and those from Comissão Nacional de Energia Nuclear (CNEN) (National Commission of Nuclear Energy). Data provided by Datasus regarding number of scintillation chambers, outpatient procedures performed from 2008 to 2012, administrative responsibility for such procedures, type of service providers and outsourced services were retrieved and evaluated. Also, such data were compared with those from institutions certified by CNEN. The present study demonstrated that the system still lacks maturity in terms of correct data input, particularly regarding equipment available. It was possible to list the most common procedures and check the growth of the specialty along the study period. Private centers are responsible for most of the procedures covered and reimbursed by SUS. However, many healthcare facilities are not certified by CNEN. Datasus provides relevant data for analysis as done in the present study, although some issues still require attention. The present study has quantitatively depicted the Brazilian reality regarding access to nuclear medicine procedures offered by/for SUS.

  8. Using a quantitative risk register to promote learning from a patient safety reporting system.

    Science.gov (United States)

    Mansfield, James G; Caplan, Robert A; Campos, John S; Dreis, David F; Furman, Cathie

    2015-02-01

    Patient safety reporting systems are now used in most health care delivery organizations. These systems, such as the one in use at Virginia Mason (Seattle) since 2002, can provide valuable reports of risk and harm from the front lines of patient care. In response to the challenge of how to quantify and prioritize safety opportunities, a risk register system was developed and implemented. Basic risk register concepts were refined to provide a systematic way to understand risks reported by staff. The risk register uses a comprehensive taxonomy of patient risk and algorithmically assigns each patient safety report to 1 of 27 risk categories in three major domains (Evaluation, Treatment, and Critical Interactions). For each category, a composite score was calculated on the basis of event rate, harm, and cost. The composite scores were used to identify the "top five" risk categories, and patient safety reports in these categories were analyzed in greater depth to find recurrent patterns of risk and associated opportunities for improvement. The top five categories of risk were easy to identify and had distinctive "profiles" of rate, harm, and cost. The ability to categorize and rank risks across multiple dimensions yielded insights not previously available. These results were shared with leadership and served as input for planning quality and safety initiatives. This approach provided actionable input for the strategic planning process, while at the same time strengthening the Virginia Mason culture of safety. The quantitative patient safety risk register serves as one solution to the challenge of extracting valuable safety lessons from large numbers of incident reports and could profitably be adopted by other organizations.
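    The ranking mechanics described above can be illustrated with a hedged sketch: per-category rate, harm, and cost scores are combined into a composite, and categories are sorted to surface the top risks. The category names, scores, and equal weighting below are hypothetical; the article does not publish its exact scoring formula.

```python
# Hypothetical risk-register ranking: combine event rate, harm, and cost
# into a composite score per category, then sort categories by that score.
def composite(rate, harm, cost, weights=(1.0, 1.0, 1.0)):
    """Weighted sum of the three normalized dimensions (weights assumed)."""
    return weights[0] * rate + weights[1] * harm + weights[2] * cost

# Hypothetical normalized scores (0-10 scale) for a few example categories.
categories = {
    "medication": (8, 6, 5),
    "falls": (5, 7, 3),
    "diagnosis-delay": (3, 9, 6),
    "handoff": (6, 4, 2),
}
ranked = sorted(categories, key=lambda c: composite(*categories[c]), reverse=True)
print(ranked)  # highest composite risk first
```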

  9. Progress towards in vitro quantitative imaging of human femur using compound quantitative ultrasonic tomography

    International Nuclear Information System (INIS)

    Lasaygues, Philippe; Ouedraogo, Edgard; Lefebvre, Jean-Pierre; Gindre, Marcel; Talmant, Marilyne; Laugier, Pascal

    2005-01-01

    The objective of this study is to make cross-sectional ultrasonic quantitative tomography of the diaphysis of long bones. Ultrasonic propagation in bones is affected by the severe mismatch between the acoustic properties of this biological solid and those of the surrounding soft medium, namely, the soft tissues in vivo or water in vitro. Bone imaging is then a nonlinear inverse-scattering problem. In this paper, we showed that in vitro quantitative images of sound velocities in a human femur cross section could be reconstructed by combining ultrasonic reflection tomography (URT), which provides images of the macroscopic structure of the bone, and ultrasonic transmission tomography (UTT), which provides quantitative images of the sound velocity. For the shape, we developed an image-processing tool to extract the external and internal boundaries and cortical thickness measurements. For velocity mapping, we used a wavelet analysis tool adapted to ultrasound, which allowed us to detect precisely the time of flight from the transmitted signals. A brief review of the ultrasonic tomography that we developed, using correction algorithms for the wave paths and compensation procedures, is presented. Also shown are the first results of our analyses on models and specimens of long bone using our new iterative quantitative protocol.

  10. Quantifying input uncertainty in an assemble-to-order system simulation with correlated input variables of mixed types

    NARCIS (Netherlands)

    Akçay, A.E.; Biller, B.

    2014-01-01

    We consider an assemble-to-order production system where the product demands and the time since the last customer arrival are not independent. The simulation of this system requires a multivariate input model that generates random input vectors with correlated discrete and continuous components. In

  11. On the Nature of the Input in Optimality Theory

    DEFF Research Database (Denmark)

    Heck, Fabian; Müller, Gereon; Vogel, Ralf

    2002-01-01

    The input has two main functions in optimality theory (Prince and Smolensky 1993). First, the input defines the candidate set, in other words it determines which output candidates compete for optimality, and which do not. Second, the input is referred to by faithfulness constraints that prohibit output candidates from deviating from specifications in the input. Whereas there is general agreement concerning the relevance of the input in phonology, the nature of the input in syntax is notoriously unclear. In this article, we show that the input should not be taken to define syntactic candidate … and syntax is due to a basic, irreducible difference between these two components of grammar: Syntax is an information-preserving system, phonology is not.

  12. Quantitative measurement of pathogen specific human memory T cell repertoire diversity using a CDR3β-specific microarray

    Directory of Open Access Journals (Sweden)

    Gorski Jack

    2007-09-01

    Background: Providing quantitative microarray data that is sensitive to very small differences in target sequence would be a useful tool in any number of venues where a sample can consist of multiple related sequences present in various abundances. Examples of such applications would include measurement of pseudo-species in viral infections and the measurement of species of antibodies or T cell receptors that constitute immune repertoires. Difficulties that must be overcome in such a method would be to account for cross-hybridization and for differences in hybridization efficiencies between the arrayed probes and their corresponding targets. We have used the memory T cell repertoire to an influenza-derived peptide as a test case for developing such a method. Results: The arrayed probes corresponded to a 17-nucleotide TCR-specific region that distinguished sequences differing by as little as a single nucleotide. Hybridization efficiency between highly related Cy5-labeled subject sequences was normalized by including an equimolar mixture of Cy3-labeled synthetic targets representing all 108 arrayed probes. The same synthetic targets were used to measure the degree of cross-hybridization between probes. Reconstitution studies found the system sensitive to input ratios as low as 0.5% and accurate in measuring known input percentages (R2 = 0.81, R = 0.90, p < 0.05). Conclusion: This novel strategy appears to be robust and can be adapted to any situation where complex mixtures of highly similar sequences need to be quantitatively resolved.

  13. Phasing Out a Polluting Input

    OpenAIRE

    Eriksson, Clas

    2015-01-01

    This paper explores economic policies related to the potential conflict between economic growth and the environment. It applies a model with directed technological change and focuses on the case with low elasticity of substitution between clean and dirty inputs in production. New technology is substituted for the polluting input, which results in a gradual decline in pollution along the optimal long-run growth path. In contrast to some recent work, the era of pollution and environmental polic...

  14. Distinguishing Representations as Origin and Representations as Input: Roles for Individual Cells

    Directory of Open Access Journals (Sweden)

    Jonathan C.W. Edwards

    2016-09-01

    It is widely perceived that there is a problem in giving a naturalistic account of mental representation that deals adequately with meaning, interpretation or significance (semantic content). It is suggested here that this problem may arise partly from the conflation of two vernacular senses of representation: representation-as-origin and representation-as-input. The flash of a neon sign may in one sense represent a popular drink, but to function as representation it must provide an input to a ‘consumer’ in the street. The arguments presented draw on two principles – the neuron doctrine and the need for a venue for ‘presentation’ or ‘reception’ of a representation at a specified site, consistent with the locality principle. It is also argued that domains of representation cannot be defined by signal traffic, since they can be expected to include ‘null’ elements based on non-firing cells. In this analysis, mental representations-as-origin are distributed patterns of cell firing. Each firing cell is given semantic value in its own right – some form of atomic propositional significance – since different axonal branches may contribute to integration with different populations of signals at different downstream sites. Representations-as-input are patterns of local co-arrival of signals in the form of synaptic potentials in dendrites. Meaning then draws on the relationships between active and null inputs, forming ‘scenarios’ comprising a molecular combination of ‘premises’ from which a new output with atomic propositional significance is generated. In both types of representation, meaning, interpretation or significance pivots on events in an individual cell. (This analysis only applies to ‘occurrent’ representations based on current neural activity. The concept of representations-as-input emphasises the need for a ‘consumer’ of a representation and the dependence of meaning on the co-relationships involved in an

  15. Transcutaneous measurement of the arterial input function in positron emission tomography

    International Nuclear Information System (INIS)

    Litton, J.E.; Eriksson, L.

    1990-01-01

    Positron emission tomography (PET) provides a powerful tool in medical research. Biochemical function can be both precisely localized and quantitatively measured. To achieve reliable quantitation it is necessary to know the time course of activity concentration in the arterial blood during the measurement. In this study the arterial blood curve from the brachial artery is compared to the activity measured in the internal carotid artery with a new transcutaneous detector

  16. Input/Output linearizing control of a nuclear reactor

    International Nuclear Information System (INIS)

    Perez C, V.

    1994-01-01

    The feedback linearization technique is an approach to nonlinear control design. The basic idea is to transform, by means of algebraic methods, the dynamics of a nonlinear control system into a full or partial linear system. As a result of this linearization process, the well-known basic linear control techniques can be used to obtain some desired dynamic characteristics. When full linearization is achieved, the method is referred to as input-state linearization, whereas when partial linearization is achieved, the method is referred to as input-output linearization. We will deal with the latter. By means of input-output linearization, the dynamics of a nonlinear system can be decomposed into an external part (input-output) and an internal part (unobservable). Since the external part consists of a linear relationship between the output of the plant and an auxiliary control input, it is easy to design such an auxiliary control input so that the output behaves in a predetermined way. Since the internal dynamics of the system is known, we can check its dynamic behavior in order to ensure that the internal states are bounded. The linearization method described here can be applied to systems with one input/one output, as well as to systems with multiple inputs/multiple outputs. Typical control problems such as stabilization and reference-path tracking can be solved using this technique. In this work, the input/output linearization theory is presented, as well as the problem of getting the output variable to track some desired trajectories. Further, the design of an input/output control system applied to the nonlinear model of a research nuclear reactor is included, along with the results obtained by computer simulation. (Author)
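    As a compact reminder, the input-output linearization described above follows the standard textbook construction for an affine system (this is the general formulation, not the reactor model of the report):

```latex
\dot{x} = f(x) + g(x)\,u, \qquad y = h(x), \qquad
y^{(r)} = L_f^{r} h(x) + L_g L_f^{r-1} h(x)\,u
```

    Here $r$ is the relative degree, the smallest order of differentiation of $y$ at which the input $u$ appears. Choosing the feedback

```latex
u = \frac{v - L_f^{r} h(x)}{L_g L_f^{r-1} h(x)}
\quad\Longrightarrow\quad y^{(r)} = v
```

    reduces the external dynamics to a chain of $r$ integrators driven by the auxiliary input $v$, while the remaining $n-r$ states form the internal dynamics whose boundedness must be checked separately.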

  17. WORM: A general-purpose input deck specification language

    International Nuclear Information System (INIS)

    Jones, T.

    1999-01-01

    Using computer codes to perform criticality safety calculations has become common practice in the industry. The vast majority of these codes use simple text-based input decks to represent the geometry, materials, and other parameters that describe the problem. However, the data specified in input files are usually processed results themselves. For example, input decks tend to require the geometry specification in linear dimensions and materials in atom or weight fractions, while the parameter of interest might be mass or concentration. The calculations needed to convert from the item of interest to the required parameter in the input deck are usually performed separately and then incorporated into the input deck. This process of calculating, editing, and renaming files to perform a simple parameter study is tedious at best. In addition, most computer codes require dimensions to be specified in centimeters, while drawings or other materials used to create the input decks might be in other units. This also requires additional calculation or conversion prior to composition of the input deck. These additional calculations, while extremely simple, introduce a source for error in both the calculations and transcriptions. To overcome these difficulties, WORM (Write One, Run Many) was created. It is an easy-to-use programming language to describe input decks and can be used with any computer code that uses standard text files for input. WORM is available, via the Internet, at worm.lanl.gov. A user's guide, tutorials, example models, and other WORM-related materials are also available at this Web site. Questions regarding WORM should be directed to worm@lanl.gov.

  18. Genetic algorithm based input selection for a neural network function approximator with applications to SSME health monitoring

    Science.gov (United States)

    Peck, Charles C.; Dhawan, Atam P.; Meyer, Claudia M.

    1991-01-01

    A genetic algorithm is used to select the inputs to a neural network function approximator. In the application considered, modeling critical parameters of the space shuttle main engine (SSME), the functional relationship between measured parameters is unknown and complex. Furthermore, the number of possible input parameters is quite large. Many approaches have been used for input selection, but they are either subjective or do not consider the complex multivariate relationships between parameters. Due to the optimization and space-searching capabilities of genetic algorithms, they were employed to systematize the input selection process. The results suggest that the genetic algorithm can generate parameter lists of high quality without the explicit use of problem-domain knowledge. Suggestions for improving the performance of the input selection process are also provided.
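    The general mechanism can be sketched on toy data: each candidate solution is a bit mask over the inputs, fitness rewards a good fit while penalizing extra inputs, and selection, crossover, and mutation evolve the population. This is a hedged illustration only: the SSME data and neural-network fitness of the paper are replaced here by synthetic data and a linear least-squares fitness.

```python
import random
import numpy as np

# Toy GA-based input selection. Synthetic data: 8 candidate inputs, of which
# only columns 1 and 4 actually drive the target.
random.seed(0)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = 3 * X[:, 1] - 2 * X[:, 4] + 0.01 * rng.normal(size=200)

def fitness(mask):
    """Negative fitting error minus a complexity penalty per selected input."""
    cols = [i for i, m in enumerate(mask) if m]
    if not cols:
        return -np.inf
    coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    sse = float(np.sum((y - X[:, cols] @ coef) ** 2))
    return -sse - 0.1 * len(cols)

def evolve(pop_size=30, gens=40, n=8):
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:           # bit-flip mutation
                child[random.randrange(n)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best)  # the mask should retain inputs 1 and 4 (the informative columns)
```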

  19. Substitution elasticities between GHG-polluting and nonpolluting inputs in agricultural production: A meta-regression

    International Nuclear Information System (INIS)

    Liu, Boying; Richard Shumway, C.

    2016-01-01

    This paper reports meta-regressions of substitution elasticities between greenhouse gas (GHG) polluting and nonpolluting inputs in agricultural production, which is the main feedstock source for biofuel in the U.S. We treat energy, fertilizer, and manure collectively as the “polluting input” and labor, land, and capital as nonpolluting inputs. We estimate meta-regressions for samples of Morishima substitution elasticities for labor, land, and capital vs. the polluting input. Much of the heterogeneity of Morishima elasticities can be explained by type of primal or dual function, functional form, type and observational level of data, input categories, number of outputs, type of output, time period, and country categories. Each estimated long-run elasticity for the reference case, which is most relevant for assessing GHG emissions through life-cycle analysis, is greater than 1.0 and significantly different from zero. Most predicted long-run elasticities remain significantly different from zero at the data means. These findings imply that life-cycle analysis based on fixed-proportion production functions could provide grossly inaccurate measures of the GHG emissions of biofuel.

  20. Visual Working Memory Enhances the Neural Response to Matching Visual Input.

    Science.gov (United States)

    Gayet, Surya; Guggenmos, Matthias; Christophel, Thomas B; Haynes, John-Dylan; Paffen, Chris L E; Van der Stigchel, Stefan; Sterzer, Philipp

    2017-07-12

    Visual working memory (VWM) is used to maintain visual information available for subsequent goal-directed behavior. The content of VWM has been shown to affect the behavioral response to concurrent visual input, suggesting that visual representations originating from VWM and from sensory input draw upon a shared neural substrate (i.e., a sensory recruitment stance on VWM storage). Here, we hypothesized that visual information maintained in VWM would enhance the neural response to concurrent visual input that matches the content of VWM. To test this hypothesis, we measured fMRI BOLD responses to task-irrelevant stimuli acquired from 15 human participants (three males) performing a concurrent delayed match-to-sample task. In this task, observers were sequentially presented with two shape stimuli and a retro-cue indicating which of the two shapes should be memorized for subsequent recognition. During the retention interval, a task-irrelevant shape (the probe) was briefly presented in the peripheral visual field, which could either match or mismatch the shape category of the memorized stimulus. We show that this probe stimulus elicited a stronger BOLD response, and allowed for increased shape-classification performance, when it matched rather than mismatched the concurrently memorized content, despite identical visual stimulation. Our results demonstrate that VWM enhances the neural response to concurrent visual input in a content-specific way. This finding is consistent with the view that neural populations involved in sensory processing are recruited for VWM storage, and it provides a common explanation for a plethora of behavioral studies in which VWM-matching visual input elicits a stronger behavioral and perceptual response. SIGNIFICANCE STATEMENT Humans heavily rely on visual information to interact with their environment and frequently must memorize such information for later use. Visual working memory allows for maintaining such visual information in the mind

  1. Learning from input and memory evolution: points of vulnerability on a pathway to mastery in word learning.

    Science.gov (United States)

    Storkel, Holly L

    2015-02-01

    Word learning consists of at least two neurocognitive processes: learning from input during training and memory evolution during gaps between training sessions. Fine-grained analysis of word learning by normal adults provides evidence that learning from input is swift and stable, whereas memory evolution is a point of potential vulnerability on the pathway to mastery. Moreover, success during learning from input is linked to positive outcomes from memory evolution. These two neurocognitive processes can be overlaid on to components of clinical treatment with within-session variables (i.e. dose form and dose) potentially linked to learning from input and between-session variables (i.e. dose frequency) linked to memory evolution. Collecting data at the beginning and end of a treatment session can be used to identify the point of vulnerability in word learning for a given client and the appropriate treatment component can then be adjusted to improve the client's word learning. Two clinical cases are provided to illustrate this approach.

  2. Quantitative Methods in the Study of Local History

    Science.gov (United States)

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  3. IMPORT COMPONENTS AND IMPORT MULTIPLIERS IN INDONESIAN ECONOMY: WORLD INPUT-OUTPUT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Muchdie Muchdie

    2018-03-01

    This paper calculates, presents, and discusses import components and the impact of final-demand change on Indonesian imports, using Indonesian 36-sector input-output tables for the years 2000, 2005, 2010, and 2014 from the World Input-Output Tables. The results showed that, firstly, Indonesian import components of input were, on average, more than 20 percent, meaning that inputs provided locally were less than 80 percent. Secondly, Indonesian imports of inputs increased significantly from US$ 36,011 million in 2000 to US$ 151,505 million in 2014. Thirdly, Indonesian imports have been dominated by Sector-3: Manufacture of food products, beverages and tobacco products; Sector-4: Manufacture of textiles, wearing apparel and leather products; Sector-24: Construction; Sector-25: Wholesale and retail trade and repair; and Sector-26: Transportation and post services. Fourthly, by country of origin, Indonesian imports have been dominated by Japan, Korea, the USA, Australia, and China. Imports from Australia, Japan, and the US have decreased significantly, but imports from China have steadily increased. Finally, the highest sectoral import multipliers occurred when final demand changed in Sector-1: Crop and animal production, forestry, fishing and aquaculture; Sector-2: Mining and quarrying; Sector-23: Water collection, sewerage, and waste collection, treatment and disposal activities; and Sector-30: Real estate activities, but there was no significant difference in import multipliers by country of origin of imports.
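    The import-multiplier computation behind such an analysis is standard input-output algebra: with domestic technical coefficients A and import coefficients m per unit of output, the imports induced by a unit of final demand in each sector are m'(I - A)^-1. The 3-sector numbers below are toy values for illustration, not the Indonesian 36-sector data.

```python
import numpy as np

# Toy sectoral import multipliers from an input-output table
# (hypothetical 3-sector coefficients, not the paper's data).
A = np.array([[0.10, 0.20, 0.05],   # domestic technical coefficients
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
m = np.array([0.08, 0.12, 0.04])    # import coefficients per unit of output

L = np.linalg.inv(np.eye(3) - A)    # Leontief inverse (I - A)^-1
import_multipliers = m @ L          # imports induced per unit of final demand
print(np.round(import_multipliers, 4))
```

    Because the Leontief inverse captures both direct and indirect input requirements, each multiplier is at least as large as the sector's direct import coefficient.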

  4. A Three-Phase Dual-Input Matrix Converter for Grid Integration of Two AC Type Energy Resources

    DEFF Research Database (Denmark)

    Liu, Xiong; Wang, Peng; Chiang Loh, Poh

    2013-01-01

    This paper proposes a novel dual-input matrix converter (DIMC) to integrate two three-phase ac type energy resources to a power grid. The proposed matrix converter is developed based on the traditional indirect matrix converter under reverse power flow operation mode, but with its six-switch voltage source converter replaced by a nine-switch configuration. With the additional three switches, the proposed DIMC can provide six input terminals, which make it possible to integrate two independent ac sources into a single grid-tied power electronics interface. The proposed converter has input-to-output voltage boost capability since power flows from the converter's voltage source side to its current source side. Commanded currents can be extracted from the two input sources to the grid. The proposed control and modulation schemes guarantee sinusoidal input and output waveforms as well as unity input...

  5. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  6. A novel three-input monomolecular logic circuit on a rhodamine inspired bio-compatible bi-compartmental molecular platform

    Energy Technology Data Exchange (ETDEWEB)

    Mistri, Tarun; Bhowmick, Rahul [Department of Chemistry, Jadavpur University, 188 Raja S.C. Mullick Road, Kolkata 700032 (India); Katarkar, Atul; Chaudhuri, Keya [Molecular & Human Genetics Division, CSIR-Indian Institute of Chemical Biology, 4 Raja S.C. Mullick Road, Kolkata 700032 (India); Ali, Mahammad, E-mail: mali@chemistry.jdvu.ac.in [Department of Chemistry, Jadavpur University, 188 Raja S.C. Mullick Road, Kolkata 700032 (India)

    2017-05-15

    Methodological synthesis of a new biocompatible bi-compartmental rhodamine-based probe (L{sup 3}) provides a multi-input and multi-output molecular logic circuit based on simple chemosensing phenomena. Spectroscopic responses of Cu{sup 2+} and Hg{sup 2+} towards L{sup 3}, together with reversible binding of S{sup 2-} with the L{sup 3}-Cu{sup 2+} and L{sup 3}-Hg{sup 2+} complexes, allow us to construct a three-input molecular circuit based on their controlled and sequential addition to a solution of L{sup 3} in a mixed organo-aqueous medium. We have further successfully encoded binary digits from these inputs and outputs, which may convert a three-digit input string into a two-digit output string, resulting in a simple monomolecular logic circuit. Such a molecular ‘Boolean’ logic operation may improve the complexity of logic-gate circuitry and computational speed and may be useful in potential biocompatible molecular logic platforms. - Graphical abstract: A new bi-compartmental molecular system equipped with a rhodamine fluorophore unit provides a multi-input and multi-output molecular logic circuit based on a very simple observation of chemosensing activities.

  7. Using Popular Culture to Teach Quantitative Reasoning

    Science.gov (United States)

    Hillyard, Cinnamon

    2007-01-01

    Popular culture provides many opportunities to develop quantitative reasoning. This article describes a junior-level, interdisciplinary, quantitative reasoning course that uses examples from movies, cartoons, television, magazine advertisements, and children's literature. Some benefits from and cautions to using popular culture to teach…

  8. Minimally invasive input function for 2-{sup 18}F-fluoro-A-85380 brain PET studies

    Energy Technology Data Exchange (ETDEWEB)

    Zanotti-Fregonara, Paolo [National Institute of Mental Health, NIH, Molecular Imaging Branch, Bethesda, MD (United States); Maroy, Renaud; Peyronneau, Marie-Anne; Trebossen, Regine [CEA, DSV, I2BM, Service Hospitalier Frederic Joliot, Orsay (France); Bottlaender, Michel [CEA, DSV, I2BM, NeuroSpin, Gif-sur-Yvette (France)

    2012-04-15

    Quantitative neuroreceptor positron emission tomography (PET) studies often require arterial cannulation to measure input function. While population-based input function (PBIF) would be a less invasive alternative, it has only rarely been used in conjunction with neuroreceptor PET tracers. The aims of this study were (1) to validate the use of PBIF for 2-{sup 18}F-fluoro-A-85380, a tracer for nicotinic receptors; (2) to compare the accuracy of measures obtained via PBIF to those obtained via blood-scaled image-derived input function (IDIF) from carotid arteries; and (3) to explore the possibility of using venous instead of arterial samples for both PBIF and IDIF. Ten healthy volunteers underwent a dynamic 2-{sup 18}F-fluoro-A-85380 brain PET scan with arterial and, in seven subjects, concurrent venous serial blood sampling. PBIF was obtained by averaging the normalized metabolite-corrected arterial input function and subsequently scaling each curve with individual blood samples. IDIF was obtained from the carotid arteries using a blood-scaling method. Estimated Logan distribution volume (V{sub T}) values were compared to the reference values obtained from arterial cannulation. For all subjects, PBIF curves scaled with arterial samples were similar in shape and magnitude to the reference arterial input function. The Logan V{sub T} ratio was 1.00 {+-} 0.05; all subjects had an estimation error <10%. IDIF gave slightly less accurate results (V{sub T} ratio 1.03 {+-} 0.07; eight of ten subjects had an error <10%). PBIF scaled with venous samples yielded inaccurate results (V{sub T} ratio 1.13 {+-} 0.13; only three of seven subjects had an error <10%). Due to arteriovenous differences at early time points, IDIF could not be calculated using venous samples. PBIF scaled with arterial samples accurately estimates Logan V{sub T} for 2-{sup 18}F-fluoro-A-85380. Results obtained with PBIF were slightly better than those obtained with IDIF. 
Due to arteriovenous concentration…

  9. 7 CFR 3431.4 - Solicitation of stakeholder input.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Solicitation of stakeholder input. 3431.4 Section... Designation of Veterinarian Shortage Situations § 3431.4 Solicitation of stakeholder input. The Secretary will solicit stakeholder input on the process and procedures used to designate veterinarian shortage situations...

  10. Quantified carbon input for maintaining existing soil organic carbon stocks in global wheat systems

    Science.gov (United States)

    Wang, G.

    2017-12-01

    Soil organic carbon (SOC) dynamics in croplands is a crucial component of global carbon (C) cycle. Depending on local environmental conditions and management practices, typical C input is generally required to reduce or reverse C loss in agricultural soils. No studies have quantified the critical C input for maintaining SOC at global scale with high resolution. Such information will provide a baseline map for assessing soil C dynamics under potential changes in management practices and climate, and thus enable development of management strategies to reduce C footprint from farm to regional scales. We used the soil C model RothC to simulate the critical C input rates needed to maintain existing soil C level at 0.1°× 0.1° resolution in global wheat systems. On average, the critical C input was estimated to be 2.0 Mg C ha-1 yr-1, with large spatial variability depending on local soil and climatic conditions. Higher C inputs are required in wheat system of central United States and western Europe, mainly due to the higher current soil C stocks present in these regions. The critical C input could be effectively estimated using a summary model driven by current SOC level, mean annual temperature, precipitation, and soil clay content.

  11. Critical carbon input to maintain current soil organic carbon stocks in global wheat systems.

    Science.gov (United States)

    Wang, Guocheng; Luo, Zhongkui; Han, Pengfei; Chen, Huansheng; Xu, Jingjing

    2016-01-13

    Soil organic carbon (SOC) dynamics in croplands is a crucial component of global carbon (C) cycle. Depending on local environmental conditions and management practices, typical C input is generally required to reduce or reverse C loss in agricultural soils. No studies have quantified the critical C input for maintaining SOC at global scale with high resolution. Such information will provide a baseline map for assessing soil C dynamics under potential changes in management practices and climate, and thus enable development of management strategies to reduce C footprint from farm to regional scales. We used the soil C model RothC to simulate the critical C input rates needed to maintain existing soil C level at 0.1° × 0.1° resolution in global wheat systems. On average, the critical C input was estimated to be 2.0 Mg C ha(-1) yr(-1), with large spatial variability depending on local soil and climatic conditions. Higher C inputs are required in wheat system of central United States and western Europe, mainly due to the higher current soil C stocks present in these regions. The critical C input could be effectively estimated using a summary model driven by current SOC level, mean annual temperature, precipitation, and soil clay content.
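The abstract states that the critical C input can be estimated from current SOC level, mean annual temperature, precipitation, and soil clay content, but does not give the fitted functional form. A sketch in Python of such a summary model, assuming a simple linear form with placeholder coefficients (not the coefficients fitted in the paper), might look like:

```python
def critical_c_input(soc_stock, mat, precip, clay, coeffs):
    """Summary-model sketch: critical C input (Mg C/ha/yr) from current
    SOC stock (Mg C/ha), mean annual temperature (deg C), precipitation
    (mm), and clay content (%). The linear form and all coefficients
    are placeholders, not the fitted model from the paper."""
    b0, b_soc, b_mat, b_p, b_clay = coeffs
    return b0 + b_soc * soc_stock + b_mat * mat + b_p * precip + b_clay * clay

# Illustrative inputs only; the reported global mean is ~2.0 Mg C/ha/yr
demo = critical_c_input(soc_stock=40.0, mat=10.0, precip=600.0, clay=20.0,
                        coeffs=(0.2, 0.03, 0.02, 0.0005, 0.01))
```

A positive SOC coefficient is consistent with the abstract's finding that regions with higher current SOC stocks (central United States, western Europe) require higher maintenance inputs.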

  12. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment (ERT) is a recent advance of modern biotechnology that has been used successfully in recent years. Evaluating the optimal dose for each patient is important for both health and economic reasons: enzyme replacement is the most expensive treatment, and it must be given continuously and without interruption. Since 2001, enzyme replacement therapy with Cerezyme (Genzyme) has been formally introduced in Bulgaria, but it was at times interrupted for 1-2 months, and patients' doses were not optimal. The aim of our work is to find a mathematical model for quantitative evaluation of ERT for Gaucher disease. The model uses the software package "Statistika 6" with input of the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The output of the model made it possible to quantitatively evaluate the individual trends in the development of each child's disease and their correlations. On the basis of these results, we can recommend suitable changes in ERT.

  13. International P/L Insurance Output, Input, and Productivity Comparisons

    OpenAIRE

    Mary A. Weiss

    1991-01-01

    This research provides (bilateral) divisia and multilateral divisia indexes of output, input, and productivity for the property-liability (P-L) insurance industry for the following countries: United States, West Germany, Switzerland, France, and Japan. The time period studied is 1975 to 1987. The results indicate that considerable diversity exists among different countries, with Japan showing the weakest productivity growth. The United States and West Germany are associated overall with high ...

  14. Organization of sensory input to the nociceptive-specific cutaneous trunk muscle reflex in rat, an effective experimental system for examining nociception and plasticity

    Science.gov (United States)

    Petruska, Jeffrey C.; Barker, Darrell F.; Garraway, Sandra M.; Trainer, Robert; Fransen, James W.; Seidman, Peggy A.; Soto, Roy G.; Mendell, Lorne M.; Johnson, Richard D.

    2013-01-01

    Detailed characterization of neural circuitries furthers our understanding of how nervous systems perform specific functions and enables the use of those systems to test hypotheses. We have characterized the sensory input to the cutaneous trunk muscle (CTM; also cutaneus trunci (rat) or cutaneus maximus (mouse)) reflex (CTMR), which manifests as a puckering of the dorsal thoracolumbar skin and is selectively driven by noxious stimuli. CTM electromyography (EMG) and neurogram recordings in naïve rats revealed that CTMR responses were elicited by natural stimuli and electrical stimulation of all segments from C4 to L6, a much greater extent of segmental drive to the CTMR than previously described. Stimulation of some subcutaneous paraspinal tissue can also elicit this reflex. Using a selective neurotoxin, we also demonstrate differential drive of the CTMR by trkA-expressing and non-expressing small diameter afferents. These observations highlight aspects of the organization of the CTMR system which make it attractive for studies of nociception and anesthesiology and plasticity of primary afferents, motoneurons, and the propriospinal system. We use the CTMR system to qualitatively and quantitatively demonstrate that experimental pharmacological treatments can be compared to controls applied either to the contralateral side or to another segment, with the remaining segments providing controls for systemic or other treatment effects. These data indicate the potential for using the CTMR system as both an invasive and non-invasive quantitative assessment tool providing improved statistical power and reduced animal use. PMID:23983104

  15. ASSIST - a package of Fortran routines for handling input under specified syntax rules and for management of data structures

    International Nuclear Information System (INIS)

    Sinclair, J.E.

    1991-02-01

    The ASSIST package (A Structured Storage and Input Syntax Tool) provides for Fortran programs a means for handling data structures more general than those provided by the Fortran language, and for obtaining input to the program from a file or terminal according to specified syntax rules. The syntax-controlled input can be interactive, with automatic generation of prompts, and dialogue to correct any input errors. The range of syntax rules possible is sufficient to handle lists of numbers and character strings, keywords, commands with optional clauses, and many kinds of variable-format constructions, such as algebraic expressions. ASSIST was developed for use in two large programs for the analysis of safety of radioactive waste disposal facilities, but it should prove useful for a wide variety of applications. (author)
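ASSIST itself is a Fortran package; as a language-agnostic illustration (sketched here in Python, with an invented rule format, not ASSIST's actual API) of syntax-controlled input with automatic prompting and error-correction dialogue, a minimal reader might look like:

```python
# Illustrative sketch of syntax-controlled input: each rule names an
# expected token type, and invalid input triggers a corrective dialogue
# before reading continues, as ASSIST does for interactive input.
def read_by_rule(rule, tokens):
    """rule: ('number', None) or ('keyword', allowed_set);
    tokens: iterator of input strings (file or terminal)."""
    kind, allowed = rule
    for tok in tokens:
        if kind == 'number':
            try:
                return float(tok)
            except ValueError:
                print(f"'{tok}' is not a number, please re-enter")
        elif kind == 'keyword':
            if tok.upper() in allowed:
                return tok.upper()
            print(f"'{tok}' is not one of {sorted(allowed)}, please re-enter")
    raise EOFError("input exhausted before a valid token was read")

vals = iter(["abc", "3.5"])
x = read_by_rule(('number', None), vals)  # recovers after the bad token
```

Real syntax rules of the kind ASSIST supports (lists, optional clauses, algebraic expressions) would compose such primitive readers into a grammar.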

  16. Uncertainty of input data for room acoustic simulations

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho; Marbjerg, Gerd; Brunskog, Jonas

    2016-01-01

    Although many room acoustic simulation models have been well established, simulation results will never be accurate with inaccurate and uncertain input data. This study addresses inappropriateness and uncertainty of input data for room acoustic simulations. Firstly, the random incidence absorption...... and scattering coefficients are insufficient when simulating highly non-diffuse rooms. More detailed information, such as the phase and angle dependence, can greatly improve the simulation results of pressure-based geometrical and wave-based models at frequencies well below the Schroeder frequency. Phase...... summarizes potential advanced absorption measurement techniques that can improve the quality of input data for room acoustic simulations. Lastly, plenty of uncertain input data are copied from unreliable sources. Software developers and users should be careful when spreading such uncertain input data. More...

  17. Effect of heat input on the microstructure, residual stresses and corrosion resistance of 304L austenitic stainless steel weldments

    Energy Technology Data Exchange (ETDEWEB)

    Unnikrishnan, Rahul, E-mail: rahulunnikrishnannair@gmail.com [Department of Metallurgical and Materials Engineering, Visvesvaraya National Institute of Technology (VNIT), South Ambazari Road, Nagpur 440010, Maharashtra (India); Idury, K.S.N. Satish, E-mail: satishidury@gmail.com [Department of Metallurgical and Materials Engineering, Visvesvaraya National Institute of Technology (VNIT), South Ambazari Road, Nagpur 440010, Maharashtra (India); Ismail, T.P., E-mail: tpisma@gmail.com [Department of Metallurgical and Materials Engineering, Visvesvaraya National Institute of Technology (VNIT), South Ambazari Road, Nagpur 440010, Maharashtra (India); Bhadauria, Alok, E-mail: alokbhadauria1@gmail.com [Department of Metallurgical and Materials Engineering, Visvesvaraya National Institute of Technology (VNIT), South Ambazari Road, Nagpur 440010, Maharashtra (India); Shekhawat, S.K., E-mail: satishshekhawat@gmail.com [Department of Metallurgical Engineering and Materials Science, Indian Institute of Technology Bombay (IITB), Powai, Mumbai 400076, Maharashtra (India); Khatirkar, Rajesh K., E-mail: rajesh.khatirkar@gmail.com [Department of Metallurgical and Materials Engineering, Visvesvaraya National Institute of Technology (VNIT), South Ambazari Road, Nagpur 440010, Maharashtra (India); Sapate, Sanjay G., E-mail: sgsapate@yahoo.com [Department of Metallurgical and Materials Engineering, Visvesvaraya National Institute of Technology (VNIT), South Ambazari Road, Nagpur 440010, Maharashtra (India)

    2014-07-01

    Austenitic stainless steels are widely used in high performance pressure vessels, nuclear, chemical, process and medical industry due to their very good corrosion resistance and superior mechanical properties. However, austenitic stainless steels are prone to sensitization when subjected to higher temperatures (673 K to 1173 K) during the manufacturing process (e.g. welding) and/or certain applications (e.g. pressure vessels). During sensitization, chromium in the matrix precipitates out as carbides and intermetallic compounds (sigma, chi and Laves phases) decreasing the corrosion resistance and mechanical properties. In the present investigation, 304L austenitic stainless steel was subjected to different heat inputs by shielded metal arc welding process using a standard 308L electrode. The microstructural developments were characterized by using optical microscopy and electron backscattered diffraction, while the residual stresses were measured by X-ray diffraction using the sin{sup 2}ψ method. It was observed that even at the highest heat input, shielded metal arc welding process does not result in significant precipitation of carbides or intermetallic phases. The ferrite content and grain size increased with increase in heat input. The grain size variation in the fusion zone/heat affected zone was not effectively captured by optical microscopy. This study shows that electron backscattered diffraction is necessary to bring out changes in the grain size quantitatively in the fusion zone/heat affected zone as it can consider twin boundaries as a part of grain in the calculation of grain size. The residual stresses were compressive in nature for the lowest heat input, while they were tensile at the highest heat input near the weld bead. The significant feature of the welded region and the base metal was the presence of a very strong texture. The texture in the heat affected zone was almost random. - Highlights: • Effect of heat input on microstructure, residual

  18. Response of spiking neurons to correlated inputs

    International Nuclear Information System (INIS)

    Moreno, Ruben; Rocha, Jaime de la; Renart, Alfonso; Parga, Nestor

    2002-01-01

    The effect of a temporally correlated afferent current on the firing rate of a leaky integrate-and-fire neuron is studied. This current is characterized in terms of rates, autocorrelations, and cross correlations, and correlation time scale τ c of excitatory and inhibitory inputs. The output rate ν out is calculated in the Fokker-Planck formalism in the limit of both small and large τ c compared to the membrane time constant τ of the neuron. By simulations we check the analytical results, provide an interpolation valid for all τ c , and study the neuron's response to rapid changes in the correlation magnitude
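The kind of simulation used to check the analytical results can be sketched as follows: a leaky integrate-and-fire neuron driven by an Ornstein-Uhlenbeck current whose correlation time τ_c can be set small or large relative to the membrane time constant τ. This Python sketch uses illustrative parameter values, not those of the cited study:

```python
import numpy as np

# Leaky integrate-and-fire neuron with temporally correlated
# (Ornstein-Uhlenbeck) input current; parameters are illustrative.
def simulate_lif(tau_m=20.0, tau_c=5.0, mu=1.3, sigma=0.2,
                 v_th=1.0, v_reset=0.0, dt=0.1, t_max=1000.0, seed=0):
    rng = np.random.default_rng(seed)
    v, i_syn = 0.0, mu
    spikes = []
    for k in range(int(t_max / dt)):
        # OU input: fluctuations correlated on timescale tau_c (ms)
        i_syn += dt / tau_c * (mu - i_syn) \
                 + sigma * np.sqrt(2 * dt / tau_c) * rng.standard_normal()
        # membrane integration on timescale tau_m (ms)
        v += dt / tau_m * (-v + i_syn)
        if v >= v_th:            # threshold crossing: emit spike, reset
            spikes.append(k * dt)
            v = v_reset
    return np.array(spikes)

spike_times = simulate_lif()     # output rate follows from len(spike_times)
```

Sweeping `tau_c` below and above `tau_m` in such a simulation is how one would probe the two analytical limits and the interpolation between them.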

  19. PRE-CASKETSS: an input data generation computer program for thermal and structural analysis of nuclear fuel shipping casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-12-01

    A computer program PRE-CASKETSS has been developed for the purpose of input data generation for the thermal and structural analysis computer code system CASKETSS (CASKETSS means a modular code system for CASK Evaluation code system for Thermal and Structural Safety). The main features of PRE-CASKETSS are as follows: (1) input data generation for thermal and structural analysis computer programs is provided; (2) two- and three-dimensional mesh generation for finite element and finite difference programs is available; (3) material input data generation is provided; (4) boundary conditions, load conditions and initial conditions can be specified; (5) the program operates under both the time-sharing system and the batch system. In the paper, a brief illustration of the calculation method, the input data and sample calculations are presented. (author)

  20. RIP INPUT TABLES FROM WAPDEG FOR LA DESIGN SELECTION: ENHANCED DESIGN ALTERNATIVE V

    International Nuclear Information System (INIS)

    K. Mon

    1999-01-01

    The purpose of this calculation is to document (1) the Waste Package Degradation (WAPDEG) version 3.09 (CRWMS M and O 1998b, Software Routine Report for WAPDEG (Version 3.09)) simulations used to analyze degradation and failure of 2-cm thick titanium grade 7 corrosion resistant material (CRM) drip shields (that are placed over waste packages composed of a 2-cm thick Alloy 22 corrosion resistant material (CRM) as the outer barrier and an unspecified material to provide structural support as the inner barrier) as well as degradation and failure of the waste packages themselves, and (2) post-processing of these results into tables of drip shield/waste package degradation time histories suitable for use as input into the Integrated Probabilistic Simulator for Environmental Systems (RIP) version 5.19.01 (Golder Associates 1998) computer code. Performance credit of the inner barrier material is not taken in this calculation. This calculation supports Performance Assessment analysis of the License Application Design Selection (LADS) Enhanced Design Alternative V. Additional details concerning the Enhanced Design Alternative V are provided in a Design Input Request (CRWMS M and O 1999e, Design Input Request for LADS Phase II EDA Evaluations, Item 3)

  1. Reactor protection system software test-case selection based on input-profile considering concurrent events and uncertainties

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Lee, Seung Jun; Cho, Jaehyun; Jung, Wondea

    2016-01-01

    Recently, input-profile-based testing for safety-critical software has been proposed for determining the number of test cases and quantifying the failure probability of the software. The input profile of reactor protection system (RPS) software is the set of inputs that cause activation of the system for emergency shutdown of a reactor. This paper presents a method to determine the input profile of RPS software that considers concurrent events/transients. A deviation of a process parameter value begins with an event and increases owing to concurrent multi-events, depending on the correlation of process parameters and the severity of incidents. A case of reactor trip caused by feedwater loss and a main steam line break is simulated and analyzed to determine the RPS software input profile and estimate the number of test cases. Different sizes of main steam line break (e.g., small, medium, large) combined with total loss of feedwater supply are considered in constructing the input profile. The uncertainties of the simulation related to input-profile-based software testing are also included. Our study is expected to provide an option for determining test cases and quantifying RPS software failure probability. (author)
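The selection step can be sketched as sampling test scenarios in proportion to their occurrence probabilities in the input profile. The scenario names and probabilities in this Python sketch are invented for illustration, not taken from the paper:

```python
import random

# Sketch of input-profile-based test-case sampling: concurrent-event
# scenarios (e.g., main steam line break sizes combined with total loss
# of feedwater) are drawn in proportion to assumed probabilities.
# Scenario names and weights are illustrative, not from the paper.
profile = {
    "small_MSLB_with_LOFW": 0.60,
    "medium_MSLB_with_LOFW": 0.30,
    "large_MSLB_with_LOFW": 0.10,
}

def draw_test_cases(profile, n, seed=1):
    rng = random.Random(seed)
    names = list(profile)
    weights = [profile[k] for k in names]
    return [rng.choices(names, weights=weights)[0] for _ in range(n)]

cases = draw_test_cases(profile, 1000)  # 1000 profile-weighted test cases
```

Running the software against such a profile-weighted sample is what allows the observed failure count to be converted into a failure-probability estimate.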

  2. A Method to Select Software Test Cases in Consideration of Past Input Sequence

    International Nuclear Information System (INIS)

    Kim, Hee Eun; Kim, Bo Gyung; Kang, Hyun Gook

    2015-01-01

    In the Korea Nuclear I and C Systems (KNICS) project, the software for the fully digitalized reactor protection system (RPS) was developed under a strict procedure. Even though the behavior of the software is deterministic, the randomness of the input sequence produces probabilistic behavior of the software. A software failure occurs when certain inputs to the software occur and interact with the internal state of the digital system to trigger a fault that was introduced into the software during the software lifecycle. In this paper, a method to select a test set for software failure probability estimation is suggested. This test set reflects the past input sequence of the software and covers all possible cases. To obtain the profile of paired state variables, the relationships of the variables need to be considered, and the effect of input from the human operator also has to be considered. As an example, a test set for the PZR-PR-Lo-Trip logic was examined. This method provides a framework for selecting test cases of safety-critical software

  3. High-Voltage-Input Level Translator Using Standard CMOS

    Science.gov (United States)

    Yager, Jeremy A.; Mojarradi, Mohammad M.; Vo, Tuan A.; Blalock, Benjamin J.

    2011-01-01

    The proposed integrated circuit would translate (1) a pair of input signals having a low differential potential and a possibly high common-mode potential into (2) a pair of output signals having the same low differential potential and a low common-mode potential. As used here, "low" and "high" refer to potentials that are, respectively, below or above the nominal supply potential (3.3 V) at which standard complementary metal oxide/semiconductor (CMOS) integrated circuits are designed to operate. The input common-mode potential could lie between 0 and 10 V; the output common-mode potential would be 2 V. This translation would make it possible to process the pair of signals by use of standard 3.3-V CMOS analog and/or mixed-signal (analog and digital) circuitry on the same integrated-circuit chip. A schematic of the circuit is shown in the figure. Standard 3.3-V CMOS circuitry cannot withstand input potentials greater than about 4 V. However, there are many applications that involve low-differential-potential, high-common-mode-potential input signal pairs and in which standard 3.3-V CMOS circuitry, which is relatively inexpensive, would be the most appropriate circuitry for performing other functions on the integrated-circuit chip that handles the high-potential input signals. Thus, there is a need to combine high-voltage input circuitry with standard low-voltage CMOS circuitry on the same integrated-circuit chip. The proposed circuit would satisfy this need. In the proposed circuit, the input signals would be coupled into both a level-shifting pair and a common-mode-sensing pair of CMOS transistors. The output of the level-shifting pair would be fed as input to a differential pair of transistors. The resulting differential current output would pass through six standoff transistors to be mirrored into an output branch by four heterojunction bipolar transistors.
The mirrored differential current would be converted back to potential by a pair of diode-connected transistors

  4. A Comprehensive Estimation of the Economic Effects of Meteorological Services Based on the Input-Output Method

    Directory of Open Access Journals (Sweden)

    Xianhua Wu

    2014-01-01

    Full Text Available Concentrating on consuming coefficient, partition coefficient, and Leontief inverse matrix, relevant concepts and algorithms are developed for estimating the impact of meteorological services including the associated (indirect, complete economic effect. Subsequently, quantitative estimations are particularly obtained for the meteorological services in Jiangxi province by utilizing the input-output method. It is found that the economic effects are noticeably rescued by the preventive strategies developed from both the meteorological information and internal relevance (interdependency in the industrial economic system. Another finding is that the ratio range of input in the complete economic effect on meteorological services is about 1 : 108.27–1 : 183.06, remarkably different from a previous estimation based on the Delphi method (1 : 30–1 : 51. Particularly, economic effects of meteorological services are higher for nontraditional users of manufacturing, wholesale and retail trades, services sector, tourism and culture, and art and lower for traditional users of agriculture, forestry, livestock, fishery, and construction industries.
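The complete (direct plus indirect) effect in the input-output method comes from the Leontief inverse: total output x required to meet final demand d satisfies x = (I − A)⁻¹ d, where A is the consuming-coefficient matrix. A sketch in Python with NumPy, using an illustrative 3-sector matrix rather than Jiangxi's actual table:

```python
import numpy as np

# Consuming-coefficient matrix A: A[i, j] is the input from sector i
# needed per unit of output of sector j. Values are illustrative.
A = np.array([[0.2, 0.1, 0.0],
              [0.1, 0.3, 0.2],
              [0.0, 0.1, 0.1]])
d = np.array([100.0, 50.0, 30.0])   # final demand by sector

leontief_inverse = np.linalg.inv(np.eye(3) - A)
x = leontief_inverse @ d            # total output by sector
indirect = x - d                    # output induced beyond final demand
```

The "complete economic effect" of a service (such as a meteorological warning) is then obtained by propagating the directly affected demand through the Leontief inverse, which is why it exceeds the direct effect alone.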

  5. Enhanced Sensitivity to Rapid Input Fluctuations by Nonlinear Threshold Dynamics in Neocortical Pyramidal Neurons.

    Science.gov (United States)

    Mensi, Skander; Hagens, Olivier; Gerstner, Wulfram; Pozzorini, Christian

    2016-02-01

    The way in which single neurons transform input into output spike trains has fundamental consequences for network coding. Theories and modeling studies based on standard Integrate-and-Fire models implicitly assume that, in response to increasingly strong inputs, neurons modify their coding strategy by progressively reducing their selective sensitivity to rapid input fluctuations. Combining mathematical modeling with in vitro experiments, we demonstrate that, in L5 pyramidal neurons, the firing threshold dynamics adaptively adjust the effective timescale of somatic integration in order to preserve sensitivity to rapid signals over a broad range of input statistics. For that, a new Generalized Integrate-and-Fire model featuring nonlinear firing threshold dynamics and conductance-based adaptation is introduced that outperforms state-of-the-art neuron models in predicting the spiking activity of neurons responding to a variety of in vivo-like fluctuating currents. Our model allows for efficient parameter extraction and can be analytically mapped to a Generalized Linear Model in which both the input filter--describing somatic integration--and the spike-history filter--accounting for spike-frequency adaptation--dynamically adapt to the input statistics, as experimentally observed. Overall, our results provide new insights on the computational role of different biophysical processes known to underlie adaptive coding in single neurons and support previous theoretical findings indicating that the nonlinear dynamics of the firing threshold due to Na+-channel inactivation regulate the sensitivity to rapid input fluctuations.
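The core mechanism, a firing threshold that relaxes toward a value coupled nonlinearly to the membrane potential, can be sketched numerically. In this Python sketch the form of the coupling and all parameter values are assumptions for illustration, not the fitted model of the paper:

```python
import numpy as np

# Sketch of nonlinear firing-threshold dynamics: theta relaxes toward a
# target that grows with depolarization, so a strong mean input raises
# the threshold while sensitivity to fast fluctuations is preserved.
# The form of theta_inf and all parameters are illustrative assumptions.
def threshold_trace(v_trace, theta0=1.0, alpha=0.5, tau_theta=10.0, dt=0.1):
    theta = np.empty_like(v_trace)
    th = theta0
    for k, v in enumerate(v_trace):
        theta_inf = theta0 + alpha * max(v, 0.0)   # nonlinear coupling to V
        th += dt / tau_theta * (theta_inf - th)    # relaxation toward target
        theta[k] = th
    return theta

v = np.full(2000, 0.8)        # sustained depolarization (arbitrary units)
theta = threshold_trace(v)    # threshold climbs toward theta0 + alpha * 0.8
```

Because the threshold tracks the slow (mean) component of the voltage, only rapid excursions above it trigger spikes, which is the adaptive-coding behavior the abstract describes.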

  6. Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements

    Science.gov (United States)

    Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.

    2017-12-01

    Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on discussing the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, that are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and the non-storm-time acceleration events respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.

  7. Repositioning Recitation Input in College English Teaching

    Science.gov (United States)

    Xu, Qing

    2009-01-01

    This paper tries to discuss how recitation input helps overcome the negative influences on the basis of second language acquisition theory and confirms the important role that recitation input plays in improving college students' oral and written English.

  8. Deliberative Political Leaders: The Role of Policy Input in Political Leadership

    Directory of Open Access Journals (Sweden)

    Jennifer Lees-Marshment

    2016-06-01

    Full Text Available This article provides a fresh perspective on political leadership by demonstrating that government ministers take a deliberative approach to decision making. Getting behind the closed doors of government through 51 elite interviews in the UK, US, Australia, Canada and New Zealand, the article demonstrates that modern political leadership is much more collaborative than media and public critique usually suggest. Politicians are commonly perceived to be power-hungry, autocratic, elite figures who, once they have won power, seek to implement their vision. But as previous research has noted, not only is formal power circumscribed by the media, public opinion, and the unpredictability of government, but more collaborative approaches to leadership are also needed given the rise of wicked problems and citizens' increasing demands for more say in government decisions and policy making. This article shows that politicians are responding to this challenging environment by accepting that they do not know everything and cannot do everything by themselves, and by moving towards a leadership style that incorporates public input. It puts forward a new model of Deliberative Political Leadership, in which politicians consider input from inside and outside government from a diverse range of sources, evaluate the relative quality of that input, and integrate it into their deliberations on the best way forward before making their final decision. This rare insight into politicians' perspectives provides a refreshing view of governmental leadership in practice and a new model for future research.

  9. Workflow Optimization for Tuning Prostheses with High Input Channel

    Science.gov (United States)

    2017-10-01

    AWARD NUMBER: W81XWH-16-1-0767. TITLE: Workflow Optimization for Tuning Prostheses with High Input Channel. PRINCIPAL INVESTIGATOR: Daniel Merrill. Distribution Unlimited. The high-level control system will provide analog signals ... of Specific Aim 1 by driving a commercially available two-DoF wrist and a single-DoF hand. The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department ...

  10. Aspects of input processing in the numerical control of electron beam machines

    International Nuclear Information System (INIS)

    Chowdhury, A.K.

    1981-01-01

    A high-performance Numerical Control has been developed for an Electron Beam Machine. The system is structured into 3 hierarchical levels: Input Processing, Realtime Processing (such as Geometry Interpolation), and the Interfaces to the Electron Beam Machine. The author considers the Input Processing. In conventional Numerical Controls the interface to the control is given by the control language as defined in DIN 66025. The state of the art in NC technology offers programming systems of differing capability, covering the spectrum from manual programming in the control language to highly sophisticated systems such as APT. This software interface has been used to define an Input Processor that, in cooperation with the host computer, meets the requirements of a sophisticated NC system but at the same time provides a modest stand-alone system with all the basic functions, such as interactive program editing, program storage, and program execution simultaneous with the development of another program. Software aspects, such as adapting DIN 66025 for Electron Beam Machining and the organisation and modularisation of the Input Processor software, have been considered and solutions have been proposed. Hardware aspects considered are the interconnections of the Input Processor with the Host and the Realtime Processors. For reasons of economy and development time, available software and hardware have been used liberally and in-house development has been kept to a minimum. The proposed system is modular in software and hardware and therefore very flexible and open-ended for future expansion. (Auth.)

  11. Remote media vision-based computer input device

    Science.gov (United States)

    Arabnia, Hamid R.; Chen, Ching-Yi

    1991-11-01

    In this paper, we introduce a vision-based computer input device which has been built at the University of Georgia. The user of this system gives commands to the computer without touching any physical device. The system receives input through a CCD camera; it is PC-based and is built on top of the DOS operating system. The major components of the input device are: a monitor, an image capturing board, a CCD camera, and some software (developed by us). These are interfaced with a standard PC running under the DOS operating system.

  12. Development and operation of K-URT data input system

    International Nuclear Information System (INIS)

    Kim, Yun Jae; Myoung, Noh Hoon; Kim, Jong Hyun; Han, Jae Jun

    2010-05-01

    Activities for TSPA (Total System Performance Assessment) of the permanent disposal of high-level radioactive waste include production of input data, safety assessment using the input data, the licensing procedure, and others. These activities are performed in 5 steps: (1) adequate planning, (2) controlled execution, (3) complete documentation, (4) thorough review, and (5) independent oversight. For confidence building, it is very important to record and manage the materials obtained from research work transparently. For the documentation of disposal research work from the planning stage to the data management stage, KAERI developed CYPRUS, the CYBER R and D Platform for Radwaste Disposal in Underground System, with a QA (Quality Assurance) system. In CYPRUS, the QA system affects other functions such as data management, project management, and others. This report analyzes the structure of CYPRUS and proposes to accumulate qualified data, provide a convenient application, and promote access to and use of CYPRUS for a future-oriented system

  13. Agricultural and Environmental Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Rasmuson; K. Rautenstrauch

    2004-09-14

    This analysis is one of 10 technical reports that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN) (i.e., the biosphere model). It documents development of agricultural and environmental input parameters for the biosphere model, and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for the repository at Yucca Mountain. The ERMYN provides the TSPA with the capability to perform dose assessments. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships between the major activities and their products (the analysis and model reports) that were planned in ''Technical Work Plan for Biosphere Modeling and Expert Support'' (BSC 2004 [DIRS 169573]). The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes the ERMYN and its input parameters.

  14. Agricultural and Environmental Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    K. Rasmuson; K. Rautenstrauch

    2004-01-01

    This analysis is one of 10 technical reports that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN) (i.e., the biosphere model). It documents development of agricultural and environmental input parameters for the biosphere model, and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the total system performance assessment (TSPA) for the repository at Yucca Mountain. The ERMYN provides the TSPA with the capability to perform dose assessments. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships between the major activities and their products (the analysis and model reports) that were planned in ''Technical Work Plan for Biosphere Modeling and Expert Support'' (BSC 2004 [DIRS 169573]). The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes the ERMYN and its input parameters

  15. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  16. Laterodorsal nucleus of the thalamus: A processor of somatosensory inputs.

    Science.gov (United States)

    Bezdudnaya, Tatiana; Keller, Asaf

    2008-04-20

    The laterodorsal (LD) nucleus of the thalamus has been considered a "higher order" nucleus that provides inputs to limbic cortical areas. Although its functions are largely unknown, it is often considered to be involved in spatial learning and memory. Here we provide evidence that LD is part of a hitherto unknown pathway for processing somatosensory information. Juxtacellular and extracellular recordings from LD neurons reveal that they respond to vibrissa stimulation with short latency (median = 7 ms) and large magnitude responses (median = 1.2 spikes/stimulus). Most neurons (62%) had large receptive fields, responding to six or more individual vibrissae. Electrical stimulation of the trigeminal nucleus interpolaris (SpVi) evoked short latency responses (median = 3.8 ms) in vibrissa-responsive LD neurons. Labeling produced by anterograde and retrograde neuroanatomical tracers confirmed that LD neurons receive direct inputs from SpVi. Electrophysiological and neuroanatomical analyses revealed also that LD projects upon the cingulate and retrosplenial cortex, but has only sparse projections to the barrel cortex. These findings suggest that LD is part of a novel processing stream involved in spatial orientation and learning related to somatosensory cues. (c) 2008 Wiley-Liss, Inc.

  17. Quantitative analysis of oyster larval proteome provides new insights into the effects of multiple climate change stressors

    KAUST Repository

    Dineshram, Ramadoss; Chandramouli, Kondethimmanahalli; Ko, Ginger Wai Kuen; Zhang, Huoming; Qian, Pei-Yuan; Ravasi, Timothy; Thiyagarajan, Vengatesen

    2016-01-01

    To assess how oyster larvae might be affected in a future ocean, we examined changes in the proteome of metamorphosing larvae under multiple stressors: decreased pH (pH 7.4), increased temperature (30 °C), and reduced salinity (15 psu), using quantitative protein expression profiling

  18. Atmospheric Inputs of Nitrogen, Carbon, and Phosphorus across an Urban Area: Unaccounted Fluxes and Canopy Influences

    Science.gov (United States)

    Decina, Stephen M.; Templer, Pamela H.; Hutyra, Lucy R.

    2018-02-01

    Rates of atmospheric deposition are declining across the United States, yet urban areas remain hotspots of atmospheric deposition. While past studies show elevated rates of inorganic nitrogen (N) deposition in cities, less is known about atmospheric inputs of organic N, organic carbon (C), and organic and inorganic phosphorus (P), all of which can affect ecosystem processes, water quality, and air quality. Further, the effect of the tree canopy on amounts and forms of nutrients reaching urban ground surfaces is not well-characterized. We measured growing season rates of total N, organic C, and total P in bulk atmospheric inputs, throughfall, and soil solution around the greater Boston area. We found that organic N constitutes a third of total N inputs, organic C inputs are comparable to rural inputs, and inorganic P inputs are 1.2 times higher than those in sewage effluent. Atmospheric inputs are enhanced two to eight times in late spring and are elevated beneath tree canopies, suggesting that trees augment atmospheric inputs to ground surfaces. Additionally, throughfall inputs may directly enter runoff when trees extend above impervious surfaces, as is the case with 26.1% of Boston's tree canopy. Our results indicate that the urban atmosphere is a significant source of elemental inputs that may impact urban ecosystems and efforts to improve water quality, particularly in terms of P. Further, as cities create policies encouraging tree planting to provide ecosystem services, locating trees above permeable surfaces to reduce runoff nutrient loads may be essential to managing urban biogeochemical cycling and water quality.

  19. Response sensitivity of barrel neuron subpopulations to simulated thalamic input.

    Science.gov (United States)

    Pesavento, Michael J; Rittenhouse, Cynthia D; Pinto, David J

    2010-06-01

    Our goal is to examine the relationship between neuron- and network-level processing in the context of a well-studied cortical function, the processing of thalamic input by whisker-barrel circuits in rodent neocortex. Here we focus on neuron-level processing and investigate the responses of excitatory and inhibitory barrel neurons to simulated thalamic inputs applied using the dynamic clamp method in brain slices. Simulated inputs are modeled after real thalamic inputs recorded in vivo in response to brief whisker deflections. Our results suggest that inhibitory neurons require more input to reach firing threshold, but then fire earlier, with less variability, and respond to a broader range of inputs than do excitatory neurons. Differences in the responses of barrel neuron subtypes depend on their intrinsic membrane properties. Neurons with a low input resistance require more input to reach threshold but then fire earlier than neurons with a higher input resistance, regardless of the neuron's classification. Our results also suggest that the response properties of excitatory versus inhibitory barrel neurons are consistent with the response sensitivities of the ensemble barrel network. The short response latency of inhibitory neurons may serve to suppress ensemble barrel responses to asynchronous thalamic input. Correspondingly, whereas neurons acting as part of the barrel circuit in vivo are highly selective for temporally correlated thalamic input, excitatory barrel neurons acting alone in vitro are less so. These data suggest that network-level processing of thalamic input in barrel cortex depends on neuron-level processing of the same input by excitatory and inhibitory barrel neurons.

  20. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

    Full Text Available Abstract Background Mass spectrometry (MS)-based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine-learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open-source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
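    The APEX correction described in this abstract can be sketched in a few lines. The sketch below uses invented protein names and Oi values; in the real tool, each Oi comes from a machine-learning classifier trained on peptide detection probabilities.

    ```python
    # Sketch of the APEX correction step: raw spectral counts are divided by each
    # protein's predicted count O_i and renormalized. All data here are invented.
    def apex_abundances(observed_counts, expected_oi, total=1.0):
        """Return APEX-style abundance estimates.

        observed_counts: protein -> observed MS/MS spectral count
        expected_oi:     protein -> predicted spectral count per molecule (O_i)
        total:           normalization constant (e.g., molecules per cell)
        """
        corrected = {p: n / expected_oi[p] for p, n in observed_counts.items()}
        norm = sum(corrected.values())
        return {p: total * v / norm for p, v in corrected.items()}

    counts = {"protA": 40, "protB": 10}   # protA looks 4x more abundant by raw counts
    oi = {"protA": 4.0, "protB": 2.0}     # but its peptides are easier to detect
    print(apex_abundances(counts, oi))    # corrected ratio is only 2:1
    ```

    This illustrates why spectral counting alone is biased: a protein whose peptides ionize well accumulates more spectra per molecule, and dividing by Oi removes that bias before abundances are compared.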

  1. Comprehensive Information Retrieval and Model Input Sequence (CIRMIS)

    International Nuclear Information System (INIS)

    Friedrichs, D.R.

    1977-04-01

    The Comprehensive Information Retrieval and Model Input Sequence (CIRMIS) was developed to provide the research scientist with man-machine interactive capabilities in a real-time environment, and thereby produce results more quickly and efficiently. The CIRMIS system was originally developed to increase data storage and retrieval capabilities and ground-water model control for the Hanford site. The overall configuration, however, can be used in other areas. The CIRMIS system provides the user with three major functions: retrieval of well-based data, special application for manipulating surface data or background maps, and the manipulation and control of ground-water models. These programs comprise only a portion of the entire CIRMIS system. A complete description of the CIRMIS system is given in this report. 25 figures, 7 tables

  2. Quantitative Microbial Risk Assessment Tutorial - Primer

    Science.gov (United States)

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  3. A quantitative and dynamic model of the Arabidopsis flowering time gene regulatory network.

    Directory of Open Access Journals (Sweden)

    Felipe Leal Valentim

    Full Text Available Various environmental signals integrate into a network of floral regulatory genes, leading to the final decision on when to flower. Although a wealth of qualitative knowledge is available on how flowering time genes regulate each other, only a few studies have incorporated this knowledge into predictive models. Such models are invaluable because they enable one to investigate how various types of inputs are combined to give a quantitative readout. To investigate the effect of gene expression disturbances on flowering time, we developed a dynamic model for the regulation of flowering time in Arabidopsis thaliana. Model parameters were estimated based on expression time-courses for relevant genes and a consistent set of flowering times for plants of various genetic backgrounds. Validation was performed by predicting changes in expression level in mutant backgrounds and comparing these predictions with independent expression data, and by comparing predicted and experimental flowering times for several double mutants. Remarkably, the model predicts that a disturbance in a particular gene does not necessarily have the largest impact on directly connected genes. For example, the model predicts that a SUPPRESSOR OF OVEREXPRESSION OF CONSTANS (SOC1) mutation has a larger impact on APETALA1 (AP1), which is not directly regulated by SOC1, than on LEAFY (LFY), which is under direct control of SOC1. This was confirmed by expression data. Another model prediction involves the importance of cooperativity in the regulation of AP1 by LFY, a prediction supported by experimental evidence. In conclusion, our model for flowering time gene regulation makes it possible to address how different quantitative inputs are combined into one quantitative output: flowering time.
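    The kind of cooperative regulation the abstract highlights can be sketched with a Hill-type ODE. This is an illustrative toy, not the paper's fitted model: the gene names come from the abstract, but every parameter value below is invented.

    ```python
    # Toy model: LFY activates AP1 with Hill cooperativity n.
    # dAP1/dt = beta * LFY^n / (K^n + LFY^n) - delta * AP1, integrated by forward Euler.
    def ap1_response(hill_n=2.0, k=1.0, beta=1.0, delta=0.5, lfy=1.5,
                     t_end=20.0, dt=0.01):
        ap1 = 0.0
        activation = beta * lfy**hill_n / (k**hill_n + lfy**hill_n)
        for _ in range(int(t_end / dt)):
            ap1 += dt * (activation - delta * ap1)   # production minus degradation
        return ap1

    # Cooperativity (n > 1) sharpens the steady-state response when LFY exceeds K:
    print(ap1_response(hill_n=1.0), ap1_response(hill_n=4.0))
    ```

    With a constant LFY level, AP1 relaxes to the steady state activation/delta; raising the Hill coefficient makes the output switch more steeply around the threshold K, which is the quantitative signature of cooperative regulation.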

  4. Inputs and spatial distribution patterns of Cr in Jiaozhou Bay

    Science.gov (United States)

    Yang, Dongfang; Miao, Zhenqing; Huang, Xinmin; Wei, Linzhen; Feng, Ming

    2018-03-01

    Cr pollution in marine bays is a critical environmental issue, and understanding input and spatial distribution patterns is essential to pollution control. According to the source strengths of the major pollution sources, the input patterns of pollutants to a marine bay can be classified as slight, moderate, or heavy, with the spatial distributions corresponding to three block models, respectively. This paper analyzed the input patterns and distributions of Cr in Jiaozhou Bay, eastern China, based on investigations of Cr in surface waters during 1979-1983. Results showed that the input strengths of Cr in Jiaozhou Bay could be classified as moderate (32.32-112.30 μg L-1) and slight (4.17-19.76 μg L-1). The input patterns of Cr thus included the moderate and slight patterns, and the horizontal distributions could be described by means of Block Model 2 and Block Model 3, respectively. In the case of moderate input via overland runoff, Cr contents decreased from the estuaries to the bay mouth, and the distribution pattern was parallel. In the case of moderate input via marine currents, Cr contents decreased from the bay mouth into the bay, and the distribution pattern was parallel to circular. The block models reveal the transfer processes of various pollutants and help explain the distributions of pollutants in a marine bay.

  5. The effect of output-input isolation on the scaling and energy consumption of all-spin logic devices

    International Nuclear Information System (INIS)

    Hu, Jiaxi; Haratipour, Nazila; Koester, Steven J.

    2015-01-01

    All-spin logic (ASL) is a novel approach to digital logic in which spin, rather than charge, is used as the state variable. One of the challenges in realizing a practical ASL system is the need to ensure non-reciprocity, meaning that information flows from input to output and not vice versa. One approach, described previously, is to introduce an asymmetric ground contact; while this approach was shown to be effective, the optimal way to achieve non-reciprocity in ASL remains unclear. In this study, we quantitatively analyze techniques to achieve non-reciprocity in ASL devices, specifically comparing the effect of an asymmetric ground position with dipole-coupled output/input isolation. For this analysis, we simulate the switching dynamics of multiple-stage logic devices with FePt and FePd perpendicular magnetic anisotropy materials using a matrix-based spin circuit model coupled to the Landau–Lifshitz–Gilbert equation. The dipole field is included in this model and can act both as a desirable means of coupling magnets and as a source of noise. The dynamic energy consumption has been calculated for these schemes as a function of input/output magnet separation, and the results show that a scheme that electrically isolates logic stages produces superior non-reciprocity, allowing both improved scaling and reduced energy consumption

  6. The effect of output-input isolation on the scaling and energy consumption of all-spin logic devices

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Jiaxi; Haratipour, Nazila; Koester, Steven J., E-mail: skoester@umn.edu [Department of Electrical and Computer Engineering, University of Minnesota-Twin Cities, 200 Union St. SE, Minneapolis, Minnesota 55455 (United States)

    2015-05-07

    All-spin logic (ASL) is a novel approach to digital logic in which spin, rather than charge, is used as the state variable. One of the challenges in realizing a practical ASL system is the need to ensure non-reciprocity, meaning that information flows from input to output and not vice versa. One approach, described previously, is to introduce an asymmetric ground contact; while this approach was shown to be effective, the optimal way to achieve non-reciprocity in ASL remains unclear. In this study, we quantitatively analyze techniques to achieve non-reciprocity in ASL devices, specifically comparing the effect of an asymmetric ground position with dipole-coupled output/input isolation. For this analysis, we simulate the switching dynamics of multiple-stage logic devices with FePt and FePd perpendicular magnetic anisotropy materials using a matrix-based spin circuit model coupled to the Landau–Lifshitz–Gilbert equation. The dipole field is included in this model and can act both as a desirable means of coupling magnets and as a source of noise. The dynamic energy consumption has been calculated for these schemes as a function of input/output magnet separation, and the results show that a scheme that electrically isolates logic stages produces superior non-reciprocity, allowing both improved scaling and reduced energy consumption.
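    The Landau–Lifshitz–Gilbert dynamics at the core of these ASL simulations can be sketched for a single macrospin. This omits the paper's spin-circuit coupling and dipole fields entirely; it is a minimal illustration in reduced units, with all parameter values invented.

    ```python
    # Minimal macrospin LLG sketch: damped precessional switching of a unit
    # magnetization in a static effective field. Reduced units, invented parameters.
    import numpy as np

    def llg_step(m, h_eff, gamma=1.0, alpha=0.1, dt=0.01):
        """One explicit Landau-Lifshitz-Gilbert step for a unit magnetization m."""
        pre = -gamma / (1.0 + alpha**2)
        dmdt = pre * (np.cross(m, h_eff) + alpha * np.cross(m, np.cross(m, h_eff)))
        m = m + dt * dmdt
        return m / np.linalg.norm(m)      # renormalize to keep |m| = 1

    m = np.array([0.0, 0.1, 0.995])       # start slightly tilted from +z
    h = np.array([0.0, 0.0, -1.0])        # effective field favoring -z
    for _ in range(10_000):               # precession spirals m down toward -z
        m = llg_step(m, h)
    print(m[2])                           # mz has relaxed close to -1
    ```

    The precession term (m x H) conserves the angle to the field, while the Gilbert damping term (alpha m x (m x H)) dissipates energy and pulls m toward the field direction; their ratio sets the switching time, which is what the energy-consumption comparison in the paper ultimately depends on.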

  7. Convergent input from brainstem coincidence detectors onto delay-sensitive neurons in the inferior colliculus.

    Science.gov (United States)

    McAlpine, D; Jiang, D; Shackleton, T M; Palmer, A R

    1998-08-01

    Responses of low-frequency neurons in the inferior colliculus (IC) of anesthetized guinea pigs were studied with binaural beats to assess their mean best interaural phase (BP) over a range of stimulating frequencies. Phase plots (stimulating frequency vs BP) were produced, from which measures of characteristic delay (CD) and characteristic phase (CP) for each neuron were obtained. The CD provides an estimate of the difference in travel time from each ear to coincidence-detector neurons in the brainstem. The CP indicates the mechanism underpinning the coincidence-detector responses. A linear phase plot indicates a single, constant delay between the coincidence-detector inputs from the two ears. In more than half (54 of 90) of the neurons, the phase plot was not linear. We hypothesized that neurons with nonlinear phase plots received convergent input from brainstem coincidence detectors with different CDs. Presentation of a second tone with a fixed, unfavorable delay suppressed the response of one input, linearizing the phase plot and revealing other inputs to be relatively simple coincidence detectors. For some neurons with highly complex phase plots, the suppressor tone altered BP values but did not resolve the nature of the inputs. For neurons with linear phase plots, the suppressor tone either completely abolished their responses or reduced their discharge rate with no change in BP. By selectively suppressing inputs with a second tone, we are able to reveal the nature of the underlying binaural inputs to IC neurons, confirming the hypothesis that the complex phase plots of many IC neurons are a result of convergence from simple brainstem coincidence detectors.
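    The CD/CP extraction from a linear phase plot amounts to a straight-line fit of best phase against frequency: the slope is the characteristic delay and the intercept the characteristic phase. A short sketch on synthetic data (real phase plots would need phase unwrapping before fitting):

    ```python
    # Fit BP = CD * f + CP to a (synthetic) linear phase plot.
    import numpy as np

    freqs = np.array([100.0, 200.0, 300.0, 400.0, 500.0])   # stimulating frequency (Hz)
    cd_true, cp_true = 300e-6, 0.1                          # 300 us delay, CP = 0.1 cycles
    bp = cd_true * freqs + cp_true                          # mean best interaural phase

    cd_est, cp_est = np.polyfit(freqs, bp, 1)               # slope = CD (s), intercept = CP
    print(cd_est * 1e6, cp_est)                             # ~300 us, ~0.1 cycles
    ```

    A CP near 0 (peak-type) or 0.5 cycles (trough-type) is what distinguishes the underlying coincidence mechanisms, while departures from linearity, as in the 54 of 90 neurons above, signal convergent inputs with different CDs.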

  8. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A complementary perspective

    Science.gov (United States)

    Kahveci, Ajda

    2010-07-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses. An unobtrusive research method, content analysis, was used by coding the manifest content and counting the frequency of words, photographs, drawings, and questions by cognitive level. The context was an undergraduate chemistry teacher preparation program at a large public university in a metropolitan area in northwestern Turkey. Forty preservice chemistry teachers were guided to analyze 10 middle school science and 10 high school chemistry textbooks. Overall, the textbooks included unfair gender representations, considerably more input- and processing-level questions than output-level questions, and a high load of science terminology. The textbooks failed to provide sufficient empirical evidence to be considered gender equitable and inquiry-based. The quantitative approach employed for evaluation contrasts with a more interpretive approach and has the potential to depict textbook profiles more reliably, complementing the commonly employed qualitative procedures. Implications suggest that further work in this line is needed on calibrating the analysis procedures with science textbooks used in different international settings. The procedures could be modified and improved to meet specific evaluation needs. In the Turkish context, a next step may be the analysis of science textbooks being rewritten for the reform-based curricula, to make cross-comparisons and evaluate a possible progression.

  9. Modal Parameter Identification from Responses of General Unknown Random Inputs

    DEFF Research Database (Denmark)

    Ibrahim, S. R.; Asmussen, J. C.; Brincker, Rune

    1996-01-01

    Modal parameter identification from ambient responses due to general unknown random inputs is investigated. Existing identification techniques, which are based on assumptions of white-noise or stationary random inputs, are utilized even though these input conditions are not satisfied. This is accomplished by adding, in cascade, a force conversion system to the structural system under consideration. The input to the force conversion system is white noise, and its output is the actual force(s) applied to the structure. The white-noise input(s) and the structure's responses are then used...

  10. Off-line learning from clustered input examples

    NARCIS (Netherlands)

    Marangi, Carmela; Solla, Sara A.; Biehl, Michael; Riegler, Peter; Marinaro, Maria; Tagliaferri, Roberto

    1996-01-01

    We analyze the generalization ability of a simple perceptron acting on a structured input distribution, for the simple case of two clusters of input data and a linearly separable rule. The generalization ability, computed for three learning scenarios (maximal stability, Gibbs, and optimal learning), is

  11. Input reduction for long-term morphodynamic simulations

    NARCIS (Netherlands)

    Walstra, D.J.R.; Ruessink, G.; Hoekstra, R.; Tonnon, P.K.

    2013-01-01

    Input reduction is imperative to long-term (> years) morphodynamic simulations to avoid excessive computation times. Here, we discuss the input-reduction framework for wave-dominated coastal settings introduced by Walstra et al. (2013). The framework comprised 4 steps, viz. (1) the selection of the

  12. Smart-Guard: Defending User Input from Malware

    DEFF Research Database (Denmark)

    Denzel, Michael; Bruni, Alessandro; Ryan, Mark

    2016-01-01

    Trusted input techniques can profoundly enhance a variety of scenarios like online banking, electronic voting, Virtual Private Networks, and even commands to a server or Industrial Control System. To protect the system from malware of the sender’s computer, input needs to be reliably authenticated...

  13. ORIGNATE: PC input processor for ORIGEN-S

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1992-01-01

    ORIGNATE is a personal computer program that serves as a user-friendly interface for the ORIGEN-S isotopic generation and depletion code. It is designed to assist an ORIGEN-S user in preparing an input file for execution of light-water-reactor fuel depletion and decay cases. Output from ORIGNATE is a card-image input file that may be uploaded to a mainframe computer to execute ORIGEN-S in SCALE-4. ORIGNATE features a pulldown menu system that accesses sophisticated data-entry screens. The program allows the user to quickly set up an ORIGEN-S input file and perform error checking

  14. Reprocessing input data validation

    International Nuclear Information System (INIS)

    Persiani, P.J.; Bucher, R.G.; Pond, R.B.; Cornella, R.J.

    1990-01-01

    The Isotope Correlation Technique (ICT), in conjunction with the gravimetric (Pu/U ratio) method for mass determination, provides an independent verification of the input accountancy at the dissolver or accountancy stage of the reprocessing plant. The Isotope Correlation Technique has been applied to many classes of domestic and international reactor systems (light-water, heavy-water, graphite, and liquid-metal) operating in a variety of modes (power, research, production, and breeder), and for a variety of reprocessing fuel cycle management strategies. Analysis of reprocessing operations data, based on isotopic correlations derived for assemblies in a PWR environment and fuel management scheme, yielded differences between the measurement-derived and ICT-derived plutonium mass determinations of (-0.02 ± 0.23)% for the measured U-235 and (+0.50 ± 0.31)% for the measured Pu-239 for a core campaign. The ICT analysis has been implemented for the plutonium isotopics in a depleted uranium assembly in a heavy-water, enriched-uranium system, and for the uranium isotopes in fuel assemblies in light-water, highly enriched systems. 7 refs., 5 figs., 4 tabs

  15. Patient input into the development and enhancement of ED discharge instructions: a focus group study.

    Science.gov (United States)

    Buckley, Barbara A; McCarthy, Danielle M; Forth, Victoria E; Tanabe, Paula; Schmidt, Michael J; Adams, James G; Engel, Kirsten G

    2013-11-01

    Previous research indicates that patients have difficulty understanding ED discharge instructions; these findings have important implications for adherence and outcomes. The objective of this study was to obtain direct patient input to inform specific revisions to discharge documents created through a literacy-guided approach and to identify common themes within patient feedback that can serve as a framework for the creation of discharge documents in the future. Based on extensive literature review and input from ED providers, subspecialists, and health literacy and communication experts, discharge instructions were created for 5 common ED diagnoses. Participants were recruited from a federally qualified health center to participate in a series of 5 focus group sessions. Demographic information was obtained and a Rapid Estimate of Adult Literacy in Medicine (REALM) assessment was performed. During each of the 1-hour focus group sessions, participants reviewed discharge instructions for 1 of 5 diagnoses. Participants were asked to provide input into the content, organization, and presentation of the documents. Using qualitative techniques, latent and manifest content analysis was performed to code for emergent themes across all 5 diagnoses. Fifty-seven percent of participants were female and the average age was 32 years. The average REALM score was 57.3. Through qualitative analysis, 8 emergent themes were identified from the focus groups. Patient input provides meaningful guidance in the development of diagnosis-specific discharge instructions. Several themes and patterns were identified, with broad significance for the design of ED discharge instructions. Copyright © 2013 Emergency Nurses Association. Published by Mosby, Inc. All rights reserved.

  16. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives---solar in particular---that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative---especially with nonscience students or the general public---is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, ``back-of-the-envelope'' approach to quantitative understanding of energy issues.
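    The ~10,000× solar-to-human ratio quoted above is itself a good back-of-the-envelope exercise. A minimal sketch, assuming round numbers (the solar constant, Earth's radius, and a rough ~18 TW global primary energy consumption rate are assumptions, not figures from the talk):

```python
import math

# Back-of-the-envelope check of the ~10,000x solar-to-human energy ratio.
# Assumed round numbers, not values from the abstract:
SOLAR_CONSTANT = 1361.0   # W/m^2 at top of atmosphere
EARTH_RADIUS = 6.371e6    # m
HUMAN_POWER = 18e12       # W, rough global primary energy consumption rate

# Sunlight intercepted by Earth's cross-sectional disc
intercepted = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2
ratio = intercepted / HUMAN_POWER
print(f"solar input {intercepted:.3g} W, about {ratio:.0f}x human consumption")
```

With these inputs the ratio comes out near 10^4, consistent with the order of magnitude cited in the abstract.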

  17. CREATING INPUT TABLES FROM WAPDEG FOR RIP

    International Nuclear Information System (INIS)

    K.G. Mon

    1998-01-01

    The purpose of this calculation is to create tables for input into RIP ver. 5.18 (Integrated Probabilistic Simulator for Environmental Systems) from WAPDEG ver. 3.06 (Waste Package Degradation) output. This calculation details the creation of the RIP input tables for TSPA-VA REV.00

  18. GGDC Productivity Level Database : International Comparisons of Output, Inputs and Productivity at the Industry Level

    NARCIS (Netherlands)

    Inklaar, Robert; Timmer, Marcel P.

    2008-01-01

    In this paper we introduce the GGDC Productivity Level database. This database provides comparisons of output, inputs and productivity at a detailed industry level for a set of thirty OECD countries. It complements the EU KLEMS growth and productivity accounts by providing comparative levels and

  19. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  20. Harmonize input selection for sediment transport prediction

    Science.gov (United States)

    Afan, Haitham Abdulmohsin; Keshtegar, Behrooz; Mohtar, Wan Hanna Melini Wan; El-Shafie, Ahmed

    2017-09-01

    In this paper, three modeling approaches using a Neural Network (NN), the Response Surface Method (RSM) and a Response Surface Method based on the Global Harmony Search (GHS) are applied to predict the daily time series suspended sediment load. Generally, the input variables for forecasting the suspended sediment load are manually selected based on the maximum correlations of input variables in the modeling approaches based on NN and RSM. The RSM is improved to select the input variables by using the error terms of the training data based on the GHS, termed the response surface method with global harmony search (RSM-GHS) modeling method. The second-order polynomial function with cross terms is applied to calibrate the time series suspended sediment load with three, four and five input variables in the proposed RSM-GHS. The linear, square and cross terms of twenty input variables of antecedent values of suspended sediment load and water discharge are investigated to achieve the best predictions of the RSM based on the GHS method. The performances of the NN, RSM and proposed RSM-GHS, including both accuracy and simplicity, are compared through several comparative predicted and error statistics. The results illustrate that the proposed RSM-GHS is as uncomplicated as the RSM but performed better, with fewer errors and better correlation (R = 0.95, MAE = 18.09 ton/day, RMSE = 25.16 ton/day) compared to the ANN (R = 0.91, MAE = 20.17 ton/day, RMSE = 33.09 ton/day) and RSM (R = 0.91, MAE = 20.06 ton/day, RMSE = 31.92 ton/day) for all types of input variables.
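    The RSM step described above — a second-order polynomial with linear, square, and cross terms fitted to lagged (antecedent) inputs by least squares — can be sketched as follows. The synthetic series, lag count, and variable names are illustrative assumptions, not the study's data or code, and the GHS input-selection step is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for a daily sediment-load series: smooth signal plus noise.
s = np.sin(np.arange(300) * 0.1) + 0.05 * rng.standard_normal(300)

def design(lagged):
    """Build linear, square, and cross terms from a (n_samples, n_lags) matrix."""
    cols = [np.ones(len(lagged))]
    n = lagged.shape[1]
    for i in range(n):
        cols.append(lagged[:, i])
        cols.append(lagged[:, i] ** 2)
        for j in range(i + 1, n):
            cols.append(lagged[:, i] * lagged[:, j])
    return np.column_stack(cols)

lags = 3
X = np.column_stack([s[i:len(s) - lags + i] for i in range(lags)])  # antecedent values
y = s[lags:]                                                        # next-day target
beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)                # least-squares fit
pred = design(X) @ beta
r = np.corrcoef(pred, y)[0, 1]
print(f"in-sample R = {r:.3f}")
```

In the paper, the GHS metaheuristic replaces the manual correlation-based choice of which lagged variables enter this polynomial.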

  1. A quantitative x-ray diffraction inventory of volcaniclastic inputs into the marine sediment archives off Iceland: a contribution to the Volcanoes in the Arctic System programme

    Directory of Open Access Journals (Sweden)

    Dennis D. Eberl

    2013-02-01

    Full Text Available This paper re-evaluates how well quantitative x-ray diffraction (qXRD) can be used as an exploratory method for estimating the weight percentage (wt%) of volcaniclastic sediment and for identifying tephra events in marine cores. In the widely used RockJock v6 software programme, qXRD tephra and glass standards include the rhyodacite White River tephra (Alaska), a rhyolitic tephra (Hekla-4) and the basaltic Saksunarvatn tephra. Experiments adding known wt% of tephra to felsic bedrock samples indicated that additions ≥10 wt% are accurately detected, but reliable estimates of lesser amounts are masked by amorphous material produced by milling. Volcaniclastic inputs range between 20 and 50 wt%. Primary tephra events are identified as peaks in residual qXRD glass wt% from fourth-order polynomial fits. In cores where tephras have been identified by shard counts in the >150 µm fraction, there is a positive correlation (validation) with peaks in the wt% glass estimated by qXRD. Geochemistry of tephra shards confirms the presence of several Hekla-sourced tephras in cores B997-317PC1 and -319PC2 on the northern Iceland shelf. In core B997-338 (north-west Iceland), there are two rhyolitic tephras separated by ca. 100 cm with uncorrected radiocarbon dates on articulated shells of around 13 000 yr B.P. These tephras may be correlatives of the Borrobol and Penifiler tephras found in Scotland. The number of Holocene tephra events per 1000 yr was estimated from qXRD on 16 cores and showed a bimodal distribution with an increased number of events in both the late and early Holocene.
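    The peak-finding step described above — tephra events flagged as peaks in the residuals of a fourth-order polynomial fit to downcore glass wt% — can be sketched as below. The synthetic profile and the 2-sigma threshold are assumptions for illustration, not the paper's data:

```python
import numpy as np

# Synthetic downcore glass wt% profile: slowly varying background plus
# two spike-like tephra layers (illustrative, not core data).
depth = np.arange(0, 200, 2.0)               # cm
glass = 30 + 5 * np.sin(depth / 60.0)        # background glass wt%
glass[25] += 15.0                            # buried tephra layer at 50 cm
glass[70] += 12.0                            # second event at 140 cm

# Fourth-order polynomial fit; residual peaks flag candidate tephra events.
coef = np.polyfit(depth, glass, 4)
resid = glass - np.polyval(coef, depth)
peaks = depth[resid > 2 * resid.std()]       # assumed 2-sigma threshold
print("candidate tephra depths (cm):", peaks)
```

The polynomial absorbs the smooth volcaniclastic background, so only the spike-like primary tephra inputs survive in the residuals.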

  2. Variance-based sensitivity indices for models with dependent inputs

    International Nuclear Information System (INIS)

    Mara, Thierry A.; Tarantola, Stefano

    2012-01-01

    Computational models are intensively used in engineering for risk analysis or prediction of future outcomes. Uncertainty and sensitivity analyses are of great help for these purposes. Although several methods exist to perform variance-based sensitivity analysis of model output with independent inputs, only a few have been proposed in the literature for the case of dependent inputs. This is explained by the fact that the theoretical framework for the independent case is set and a univocal set of variance-based sensitivity indices is defined. In the present work, we propose a set of variance-based sensitivity indices to perform sensitivity analysis of models with dependent inputs. These measures allow us to distinguish between the mutual dependent contribution and the independent contribution of an input to the model response variance. Their definition relies on a specific orthogonalisation of the inputs and ANOVA-representations of the model output. In the applications, we show the interest of the new sensitivity indices in a model simplification setting. - Highlights: ► Uncertainty and sensitivity analyses are of great help in engineering. ► Several methods exist to perform variance-based sensitivity analysis of model output with independent inputs. ► We define a set of variance-based sensitivity indices for models with dependent inputs. ► Inputs' mutual contributions are distinguished from their independent contributions. ► Analytical and computational tests are performed and discussed.
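    For the independent-input baseline that the abstract contrasts with, a first-order variance-based index S_i = V_i / V can be estimated by the standard pick-and-freeze approach. A minimal sketch on a toy linear model (the model, sample size, and input distributions are illustrative; the paper's orthogonalisation for dependent inputs is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

def model(x1, x2):
    # Toy linear model with independent standard-normal inputs;
    # analytically S1 = Var(x1) / Var(y) = 1 / (1 + 0.3**2).
    return x1 + 0.3 * x2

x1, x2 = rng.standard_normal((2, N))
x2b = rng.standard_normal(N)        # resample x2, "freeze" x1
y = model(x1, x2)
yb = model(x1, x2b)

# Pick-and-freeze estimator: Cov(y, yb) / Var(y) estimates S1.
S1 = np.cov(y, yb)[0, 1] / y.var()
print(f"estimated S1 = {S1:.3f} (analytic {1 / 1.09:.3f})")
```

For dependent inputs this estimator no longer isolates an input's own contribution, which is exactly the gap the paper's orthogonalised indices address.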

  3. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    Science.gov (United States)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
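    The structure of the error model behind the formal likelihood — residuals that are lag-1 autocorrelated and heteroscedastic — can be sketched as below. A Gaussian density stands in for the paper's Skew Exponential Power distribution, and the parameter names (phi, s0, s1) and toy data are assumptions for illustration:

```python
import numpy as np

def log_likelihood(obs, sim, phi, s0, s1):
    """Gaussian stand-in for an SEP likelihood with AR(1), heteroscedastic errors."""
    resid = obs - sim
    innov = resid[1:] - phi * resid[:-1]        # remove lag-1 autocorrelation
    sigma = s0 + s1 * np.abs(sim[1:])           # error std grows with simulated value
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * (innov / sigma) ** 2)

rng = np.random.default_rng(2)
sim = np.linspace(1.0, 5.0, 200)                # toy simulated concentrations
obs = sim + 0.2 * rng.standard_normal(200)      # toy observations

ll_good = log_likelihood(obs, sim, phi=0.0, s0=0.2, s1=0.0)
ll_bad = log_likelihood(obs, sim + 1.0, phi=0.0, s0=0.2, s1=0.0)
print(ll_good > ll_bad)   # the unbiased simulation scores higher
```

In the BAIPU an MCMC sampler such as DREAM(ZS) explores parameters (and input assumptions) by repeatedly evaluating a likelihood of this general form.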

  4. Combining symbolic cues with sensory input and prior experience in an iterative Bayesian framework

    Directory of Open Access Journals (Sweden)

    Frederike Hermi Petzschner

    2012-08-01

    Full Text Available Perception and action are the result of an integration of various sources of information, such as current sensory input, prior experience, or the context in which a stimulus occurs. Often, the interpretation is not trivial and hence needs to be learned from the co-occurrence of stimuli. Yet, how do we combine such diverse information to guide our action? Here we use a distance production-reproduction task to investigate the influence of auxiliary, symbolic cues, sensory input, and prior experience on human performance under three different conditions that vary in the information provided. Our results indicate that subjects can (1) learn the mapping of a verbal, symbolic cue onto the stimulus dimension and (2) integrate symbolic information and prior experience into their estimate of displacements. The behavioral results are explained by two distinct generative models that represent different structural approaches of how a Bayesian observer would combine prior experience, sensory input, and symbolic cue information into a single estimate of displacement. The first model interprets the symbolic cue in the context of categorization, assuming that it reflects information about a distinct underlying stimulus range (categorical model). The second model applies a multi-modal integration approach and treats the symbolic cue as additional sensory input to the system, which is combined with the current sensory measurement and the subjects' prior experience (cue-combination model). Notably, both models account equally well for the observed behavior despite their different structural assumptions. The present work thus provides evidence that humans can interpret abstract symbolic information and combine it with other types of information such as sensory input and prior experience. The similar explanatory power of the two models further suggests that issues such as categorization and cue-combination could be explained by alternative probabilistic approaches.
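    The core operation of the cue-combination model — fusing prior experience, a sensory measurement, and a symbolic cue as independent Gaussian sources by precision weighting — can be sketched in a few lines. The means and variances below are illustrative, not fitted values from the study:

```python
def fuse(sources):
    """Precision-weighted fusion of independent Gaussian (mean, variance) sources."""
    precision = sum(1.0 / v for _, v in sources)
    mean = sum(m / v for m, v in sources) / precision
    return mean, 1.0 / precision

# Illustrative numbers for a displacement estimate (units arbitrary):
prior = (10.0, 4.0)      # prior experience
sensory = (12.0, 1.0)    # current sensory measurement (most reliable)
symbolic = (14.0, 9.0)   # coarse symbolic cue treated as extra sensory input

m, v = fuse([prior, sensory, symbolic])
print(f"posterior mean {m:.2f}, variance {v:.2f}")
```

The fused estimate sits closest to the most reliable source, and its variance is smaller than any single source's, which is the signature behavior of this class of Bayesian models.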

  5. Automated input data management in manufacturing process simulation

    OpenAIRE

    Ettefaghian, Alireza

    2015-01-01

    Input Data Management (IDM) is a time-consuming and costly process for Discrete Event Simulation (DES) projects. Input Data Management is considered the basis of real-time process simulation (Bergmann, Stelzer and Strassburger, 2011). According to Bengtsson et al. (2009), the data input phase constitutes, on average, about 31% of the time of an entire simulation project. Moreover, the lack of interoperability between manufacturing applications and simulation software leads to a high cost to ...

  6. Safety analysis code input automation using the Nuclear Plant Data Bank

    International Nuclear Information System (INIS)

    Kopp, H.; Leung, J.; Tajbakhsh, A.; Viles, F.

    1985-01-01

    The Nuclear Plant Data Bank (NPDB) is a computer-based system that organizes a nuclear power plant's technical data, providing mechanisms for data storage, retrieval, and computer-aided engineering analysis. It has the specific objective of describing thermohydraulic systems in order to support rapid information retrieval and display, and thermohydraulic analysis modeling. The NPDB system fully automates storage and analysis based on these data. The system combines the benefits of a structured data base system and computer-aided modeling with links to large-scale codes for engineering analysis. Emphasis on a friendly and highly graphical user interface facilitates both initial use and longer-term efficiency. Specific features are: organization and storage of thermohydraulic data items, ease in locating specific data items, graphical and tabular display capabilities, interactive model construction, organization and display of model input parameters, input deck construction for TRAC and RELAP analysis programs, and traceability of plant data, user model assumptions, and codes used in the input deck construction process. The major accomplishments of this past year were the development of a RELAP model generation capability and the development of a CRAY version of the code.

  7. OFFSCALE: A PC input processor for the SCALE code system. The ORIGNATE processor for ORIGEN-S

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1994-11-01

    OFFSCALE is a suite of personal computer input processor programs developed at Oak Ridge National Laboratory to provide an easy-to-use interface for modules in the SCALE-4 code system. ORIGNATE is a program in the OFFSCALE suite that serves as a user-friendly interface for the ORIGEN-S isotopic generation and depletion code. It is designed to assist an ORIGEN-S user in preparing an input file for execution of light-water-reactor (LWR) fuel depletion and decay cases. ORIGNATE generates an input file that may be used to execute ORIGEN-S in SCALE-4. ORIGNATE features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up an ORIGEN-S input file and perform error checking. This capability increases productivity and decreases the chance of user error

  8. Radioactive inputs to the North Sea and the Channel

    International Nuclear Information System (INIS)

    1984-01-01

    The subject is covered in sections: introduction (radioactivity; radioisotopes; discharges from nuclear establishments); data sources (statutory requirements); sources of liquid radioactive waste (figure showing location of principal sources of radioactive discharges; tables listing principal discharges by activity and by nature of radioisotope); Central Electricity Generating Board nuclear power stations; research and industrial establishments; Ministry of Defence establishments; other UK inputs of radioactive waste; total inputs to the North Sea and the Channel (direct inputs; river inputs; adjacent sea areas); conclusions. (U.K.)

  9. Crop Breeding for Low Input Agriculture: A Sustainable Response to Feed a Growing World Population

    Directory of Open Access Journals (Sweden)

    Vagner A. Benedito

    2011-10-01

    Full Text Available World population is projected to reach its maximum (~10 billion people) by the year 2050. This 45% increase over the current world population (approaching seven billion people) will boost the demand for food and raw materials. However, we live in a historical moment when supplies of phosphate, water, and oil are at their peaks. Modern agriculture is fundamentally based on varieties bred for high performance under high-input systems (fertilizers, water, oil, pesticides), which generally do not perform well under low-input situations. We propose a shift of research goals and plant breeding objectives from high-performance agriculture at high energy input to those with an improved rationalization between yield and energy input. Crop breeding programs that are more focused on nutrient economy and local environmental fitness will help reduce energy demands for crop production while still providing adequate amounts of high-quality food as global resources decline and population increases.

  10. Non-perturbative inputs for gluon distributions in the hadrons

    International Nuclear Information System (INIS)

    Ermolaev, B.I.; Troyan, S.I.

    2017-01-01

    Description of hadronic reactions at high energies is conventionally done in the framework of QCD factorization. All factorization convolutions comprise non-perturbative inputs mimicking non-perturbative contributions and perturbative evolution of those inputs. We construct inputs for the gluon-hadron scattering amplitudes in the forward kinematics and, using the optical theorem, convert them into inputs for gluon distributions in the hadrons, embracing the cases of polarized and unpolarized hadrons. In the first place, we formulate mathematical criteria which any model for the inputs should obey and then suggest a model satisfying those criteria. This model is based on a simple reasoning: after emitting an active parton off the hadron, the remaining set of spectators becomes unstable and therefore it can be described through factors of the resonance type, so we call it the resonance model. We use it to obtain non-perturbative inputs for gluon distributions in unpolarized and polarized hadrons for all available types of QCD factorization: basic, K_T- and collinear factorizations. (orig.)

  11. SO2 policy and input substitution under spatial monopoly

    International Nuclear Information System (INIS)

    Gerking, Shelby; Hamilton, Stephen F.

    2010-01-01

    Following the U.S. Clean Air Act Amendments of 1990, electric utilities dramatically increased their utilization of low-sulfur coal from the Powder River Basin (PRB). Recent studies indicate that railroads hauling PRB coal exercise a substantial degree of market power and that relative price changes in the mining and transportation sectors were contributing factors to the observed pattern of input substitution. This paper asks the related question: To what extent does more stringent SO2 policy stimulate input substitution from high-sulfur coal to low-sulfur coal when railroads hauling low-sulfur coal exercise spatial monopoly power? The question underpins the effectiveness of incentive-based environmental policies given the essential role of market performance in input, output, and abatement markets in determining the social cost of regulation. Our analysis indicates that environmental regulation leads to negligible input substitution effects when clean and dirty inputs are highly substitutable and the clean input market is mediated by a spatial monopolist. (author)

  12. Non-perturbative inputs for gluon distributions in the hadrons

    Energy Technology Data Exchange (ETDEWEB)

    Ermolaev, B.I. [Ioffe Physico-Technical Institute, Saint Petersburg (Russian Federation); Troyan, S.I. [St. Petersburg Institute of Nuclear Physics, Gatchina (Russian Federation)

    2017-03-15

    Description of hadronic reactions at high energies is conventionally done in the framework of QCD factorization. All factorization convolutions comprise non-perturbative inputs mimicking non-perturbative contributions and perturbative evolution of those inputs. We construct inputs for the gluon-hadron scattering amplitudes in the forward kinematics and, using the optical theorem, convert them into inputs for gluon distributions in the hadrons, embracing the cases of polarized and unpolarized hadrons. In the first place, we formulate mathematical criteria which any model for the inputs should obey and then suggest a model satisfying those criteria. This model is based on a simple reasoning: after emitting an active parton off the hadron, the remaining set of spectators becomes unstable and therefore it can be described through factors of the resonance type, so we call it the resonance model. We use it to obtain non-perturbative inputs for gluon distributions in unpolarized and polarized hadrons for all available types of QCD factorization: basic, K_T- and collinear factorizations. (orig.)

  13. CBM first-level event selector input interface

    Energy Technology Data Exchange (ETDEWEB)

    Hutter, Dirk [Frankfurt Institute for Advanced Studies, Goethe University, Frankfurt (Germany); Collaboration: CBM-Collaboration

    2016-07-01

    The CBM First-level Event Selector (FLES) is the central event selection system of the upcoming CBM experiment at FAIR. Designed as a high-performance computing cluster, its task is an online analysis of the physics data at a total data rate exceeding 1 TByte/s. To allow efficient event selection, the FLES performs timeslice building, which combines the data from all given input links into self-contained, overlapping processing intervals and distributes them to compute nodes. Partitioning the input data streams into specialized containers makes it possible to perform this task very efficiently. The FLES Input Interface defines the linkage between the FEE and the FLES data transport framework. Utilizing a custom FPGA board, it receives data via optical links, prepares them for subsequent timeslice building, and transfers the data via DMA to the PC's memory. An accompanying HDL module implements the front-end logic interface and FLES link protocol in the front-end FPGAs. Prototypes of all Input Interface components have been implemented and integrated into the FLES framework. In contrast to earlier prototypes, which included components to work without an FPGA layer between FLES and FEE, the structure matches the foreseen final setup. This allows the implementation and evaluation of the final CBM read-out chain. An overview of the FLES Input Interface as well as studies on system integration and system start-up are presented.
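    The timeslice-building idea — cutting a time-ordered stream into self-contained, overlapping intervals that compute nodes can analyse independently — can be sketched as below. The slice length, overlap, and message format are illustrative parameters, not CBM's actual values:

```python
def build_timeslices(stream, core=100, overlap=10):
    """Yield (start, items) slices covering [start, start + core + overlap).

    stream: time-ordered list of (timestamp, datum) pairs.
    Consecutive slices overlap by `overlap` time units, so activity that
    straddles a slice boundary is fully contained in at least one slice.
    """
    if not stream:
        return
    t_end = stream[-1][0]
    start = 0
    while start <= t_end:
        hi = start + core + overlap
        yield start, [x for x in stream if start <= x[0] < hi]
        start += core

# Toy stream: one message every 7 time units.
stream = [(t, f"msg{t}") for t in range(0, 250, 7)]
slices = list(build_timeslices(stream))
print(len(slices), "timeslices")
```

Each slice is self-contained, so a compute node needs no data beyond the slice it receives; the overlap region is the price paid for that independence.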

  14. Reconstructing historical trends in metal input in heavily-disturbed, contaminated estuaries: studies from Bilbao, Southampton Water and Sicily

    International Nuclear Information System (INIS)

    Cundy, A.B.; Croudace, I.W.; Cearreta, A.; Irabien, M.J.

    2003-01-01

    Estuaries may be important reservoirs for contaminants as they tend to act as sinks for fine, contaminant-reactive sediments, and, historically, they have acted as centres for industrial and urban development. Analysis of dated sediment cores from these areas may allow historical trends in heavy metal input to be reconstructed, and recent and historical inputs of metal contaminants to be compared. Undisturbed saltmarsh settings have been used widely in the reconstruction of historical trends in metal input as saltmarshes provide a stable, vegetated substrate of dominantly fine sediments, and are less prone to erosion and reworking than adjacent mudflat areas. In comparison, much less research on historical pollution trends has been undertaken at estuarine sites which are prone to severe local disturbance, such as intertidal areas which are routinely dredged or where sedimentary processes have been modified by human activities such as shipping, salt working, port activities, land claim etc. This paper assesses the usefulness of 210Pb and 137Cs dating, combined with geochemical studies, in reconstructing historical trends in heavy metal input and sediment accretion in 3 heavily-modified, industrialised estuarine areas in Europe: the Bilbao estuary (Spain), Southampton Water (UK), and the Mulinello estuary (Sicily). Of these sites, only a salt marsh core from the Mulinello estuary provides a high-resolution record of recent heavy metal inputs. In Southampton Water only a partial record of changing metal inputs over time is retained due to land-claim and possible early-diagenetic remobilisation, while at Bilbao the vertical distribution of heavy metals in intertidal flats is mainly controlled by input on reworked sediment particles and variations in sediment composition. Where 137Cs and 210Pb distributions with depth allow a chronology of sediment deposition to be established, and early-diagenetic remobilisation has been minimal, mudflat and saltmarsh cores from
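    The arithmetic behind 210Pb dating is simple radioactive decay: the age of a layer follows from how far its excess 210Pb activity has decayed from the value at deposition (a CIC-style assumption). A minimal sketch with illustrative activities, not values from the cores studied:

```python
import math

HALF_LIFE_PB210 = 22.3                    # years
LAMBDA = math.log(2) / HALF_LIFE_PB210    # decay constant, 1/yr

def layer_age(activity_at_deposition, activity_now):
    """Age (yr) of a layer whose excess 210-Pb decayed from its initial value."""
    return math.log(activity_at_deposition / activity_now) / LAMBDA

# Illustrative: activity fell to a quarter of the surface value,
# i.e. two half-lives have elapsed.
age = layer_age(100.0, 25.0)
print(f"age = {age:.1f} yr")
```

Sediment reworking of the kind described in the abstract violates the assumption that activity at deposition is constant, which is why disturbed cores yield only partial or distorted chronologies.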

  15. Multi-Input Convolutional Neural Network for Flower Grading

    Directory of Open Access Journals (Sweden)

    Yu Sun

    2017-01-01

    Full Text Available Flower grading is a significant task because it is extremely convenient for managing flowers in the greenhouse and the market. With the development of computer vision, flower grading has become an interdisciplinary focus in both botany and computer vision. A new dataset named BjfuGloxinia contains three quality grades; each grade consists of 107 samples and 321 images. A multi-input convolutional neural network is designed for large-scale flower grading. The multi-input CNN achieves a satisfactory accuracy of 89.6% on BjfuGloxinia after data augmentation. Compared with a single-input CNN, the accuracy of the multi-input CNN is increased by 5% on average, demonstrating that a multi-input convolutional neural network is a promising model for flower grading. Although data augmentation contributes to the model, the accuracy is still limited by a lack of sample diversity. The majority of misclassifications derive from the medium class. Image-processing-based bud detection is useful for reducing misclassification, increasing the accuracy of flower grading to approximately 93.9%.
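    The multi-input idea — several views of one flower pass through separate branches whose features are concatenated before the classifier head — can be sketched with plain NumPy. Dense ReLU layers stand in for the paper's convolutional branches, and all shapes and weights are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def branch(x, w):
    """One dense layer with ReLU, standing in for a convolutional branch."""
    return np.maximum(0.0, x @ w)

# Two views per sample, batch of 4, 8 features per view (illustrative).
view_a = rng.standard_normal((4, 8))
view_b = rng.standard_normal((4, 8))
w_a, w_b = rng.standard_normal((2, 8, 5))     # separate branch weights

# Late fusion: concatenate branch features, then a shared classifier head.
fused = np.concatenate([branch(view_a, w_a), branch(view_b, w_b)], axis=1)
w_head = rng.standard_normal((10, 3))         # 3 quality grades
logits = fused @ w_head
print("logits shape:", logits.shape)          # (4, 3)
```

The design choice shown is late fusion: each input keeps its own feature extractor, and only the fused representation feeds the grade classifier.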

  16. Input Shaping to Reduce Solar Array Structural Vibrations

    Science.gov (United States)

    Doherty, Michael J.; Tolson, Robert J.

    1998-01-01

    Structural vibrations induced by actuators can be minimized using input shaping. Input shaping is a feedforward method in which actuator commands are convolved with shaping functions to yield a shaped set of commands. These commands are designed to perform the maneuver while minimizing the residual structural vibration. In this report, input shaping is extended to stepper motor actuators. As a demonstration, an input-shaping technique based on pole-zero cancellation was used to modify the Solar Array Drive Assembly (SADA) actuator commands for the Lewis satellite. A series of impulses were calculated as the ideal SADA output for vibration control. These impulses were then discretized for use by the SADA stepper motor actuator and simulated actuator outputs were used to calculate the structural response. The effectiveness of input shaping is limited by the accuracy of the knowledge of the modal frequencies. Assuming perfect knowledge resulted in significant vibration reduction. Errors of 10% in the modal frequencies caused notably higher levels of vibration. Controller robustness was improved by incorporating additional zeros in the shaping function. The additional zeros did not require increased performance from the actuator. Despite the identification errors, the resulting feedforward controller reduced residual vibrations to the level of the exactly modeled input shaper and well below the baseline cases. These results could be easily applied to many other vibration-sensitive applications involving stepper motor actuators.
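    The report's shaper is derived by pole-zero cancellation; a closely related textbook construction is the zero-vibration (ZV) two-impulse shaper, sketched below for a mode with natural frequency f and damping ratio zeta. The convolution of these impulses with any command suppresses residual vibration at that mode. The frequency and damping values are illustrative, not the Lewis SADA modes:

```python
import math

def zv_shaper(f, zeta):
    """Two-impulse ZV shaper as (time, amplitude) pairs for mode (f Hz, zeta)."""
    K = math.exp(-zeta * math.pi / math.sqrt(1 - zeta**2))
    wd = 2 * math.pi * f * math.sqrt(1 - zeta**2)   # damped frequency, rad/s
    dt = math.pi / wd                               # half the damped period
    # Amplitudes sum to 1 so the shaped command reaches the same final value.
    return [(0.0, 1 / (1 + K)), (dt, K / (1 + K))]

impulses = zv_shaper(f=0.5, zeta=0.05)
print(impulses)
amps = sum(a for _, a in impulses)
```

As the abstract notes, the suppression degrades when the assumed modal frequency is wrong; robust variants add further impulses (extra zeros) at the cost of a longer shaper.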

  17. WIMS-D use by ZfK - Data input, experience, and examples

    International Nuclear Information System (INIS)

    Wand, H.

    1983-05-01

    The report provides users of the ZfK version (EC-1055 computer) of the cell program WIMS-D with all the information necessary to compile input data sets on their own. The data compilation is explained by means of 10 examples which cover a large number of the practically most important program options. Some experience in using special options is given. (author)

  18. Adaptive control of a quadrotor aerial vehicle with input constraints and uncertain parameters

    Science.gov (United States)

    Tran, Trong-Toan; Ge, Shuzhi Sam; He, Wei

    2018-05-01

    In this paper, we address the problem of adaptive bounded control for the trajectory tracking of a Quadrotor Aerial Vehicle (QAV) while the input saturations and uncertain parameters with the known bounds are simultaneously taken into account. First, to deal with the underactuated property of the QAV model, we decouple and construct the QAV model as a cascaded structure which consists of two fully actuated subsystems. Second, to handle the input constraints and uncertain parameters, we use a combination of the smooth saturation function and smooth projection operator in the control design. Third, to ensure the stability of the overall system of the QAV, we develop the technique for the cascaded system in the presence of both the input constraints and uncertain parameters. Finally, the region of stability of the closed-loop system is constructed explicitly, and our design ensures the asymptotic convergence of the tracking errors to the origin. The simulation results are provided to illustrate the effectiveness of the proposed method.

  19. Plasticity of the cis-regulatory input function of a gene.

    Directory of Open Access Journals (Sweden)

    Avraham E Mayo

    2006-04-01

Full Text Available The transcription rate of a gene is often controlled by several regulators that bind specific sites in the gene's cis-regulatory region. The combined effect of these regulators is described by a cis-regulatory input function. What determines the form of an input function, and how variable is it with respect to mutations? To address this, we employ the well-characterized lac operon of Escherichia coli, which has an elaborate input function, intermediate between Boolean AND-gate and OR-gate logic. We mapped in detail the input functions of 12 variants of the lac promoter, each with different point mutations in the regulator binding sites, by means of accurate expression measurements from living cells. We find that even a few mutations can significantly change the input function, resulting in functions that resemble pure AND gates, OR gates, or single-input switches. Other types of gates were not found. The variant input functions can be described in a unified manner by a mathematical model. The model also lets us predict which functions cannot be reached by point mutations. The input function that we studied thus appears to be plastic, in the sense that many of the mutations do not ruin the regulation completely but rather result in new ways to integrate the inputs.
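The AND-gate and OR-gate limits that bracket the measured input functions can be written as Hill-type functions of the two (suitably scaled) regulator inputs. The Hill coefficients below are generic illustrative values, not parameters fitted to the lac data:

```python
def and_gate(x, y, n=2, m=2):
    """AND-like input function: promoter activity (relative units) is high
    only when both scaled inducer inputs x and y are high."""
    return (x ** n / (1.0 + x ** n)) * (y ** m / (1.0 + y ** m))

def or_gate(x, y, n=2, m=2):
    """OR-like input function: activity is high when either input is high."""
    return (x ** n + y ** m) / (1.0 + x ** n + y ** m)
```

Point mutations in the binding sites change effective affinities (rescaling x and y) and cooperativity (changing n and m), which is how a single promoter can be nudged between AND-like, OR-like, and single-input behavior.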

  20. Computer Generated Inputs for NMIS Processor Verification

    International Nuclear Information System (INIS)

    J. A. Mullens; J. E. Breeding; J. A. McEvers; R. W. Wysor; L. G. Chiang; J. R. Lenarduzzi; J. T. Mihalczo; J. K. Mattingly

    2001-01-01

Proper operation of the Nuclear Materials Identification System (NMIS) processor can be verified using computer-generated inputs [BIST (Built-In Self-Test)] at the digital inputs. Preselected sequences of input pulses to all channels with known correlation functions are compared to the output of the processor. Verifications of this type have been used in NMIS-type correlation processors at the Oak Ridge National Laboratory since 1984. The use of this test confirmed a malfunction in an NMIS processor at the All-Russian Scientific Research Institute of Experimental Physics (VNIIEF) in 1998. The NMIS processor boards were returned to the U.S. for repair and subsequently used in NMIS passive and active measurements with Pu at VNIIEF in 1999.

  1. Structural consequences of carbon taxes: An input-output analysis

    International Nuclear Information System (INIS)

    Che Yuhu.

    1992-01-01

A model system is provided for examining the structural consequences of carbon taxes on economic, energy, and environmental issues. The key component is the Iterative Multi-Optimization (IMO) Process model, which describes, within an Input-Output (I-O) framework, the feedback between price changes and substitution. The IMO process is designed to capture this feedback by allowing the input coefficients in an I-O table to change while retaining the I-O price model. The theoretical problems of convergence to a limit in the iterative process and of uniqueness (which requires all IMO processes starting from different initial prices to converge to a unique point for a given level of carbon taxes) are addressed. The empirical analysis also examines the effects of carbon taxes on the US economy as described by a 78-sector I-O model. Findings are compared with those of other models that assess the effects of carbon taxes, and the similarities and differences are interpreted in terms of differences in scope, sectoral detail, time frame, and policy assumptions among the models.
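At the core of the price feedback is the Leontief price identity p = pA + v, with a carbon tax entering on the value-added side. A minimal fixed-point sketch with a made-up two-sector table (not the paper's 78-sector data) looks like:

```python
# Illustrative 2-sector Leontief price model; all coefficients invented.
A = [[0.2, 0.3],        # A[i][j]: input of sector i per unit output of sector j
     [0.1, 0.4]]
v = [0.5, 0.3]          # value added per unit output
carbon_tax = [0.1, 0.0] # tax falls on sector 0 (the "energy" sector)

def solve_prices(A, v, tax, iters=200):
    """Solve p = p.A + v + tax by fixed-point iteration (converges here
    because the column sums of A are below one)."""
    n = len(v)
    p = [1.0] * n
    for _ in range(iters):
        p = [sum(p[i] * A[i][j] for i in range(n)) + v[j] + tax[j]
             for j in range(n)]
    return p

p = solve_prices(A, v, carbon_tax)
```

Each pass reprices outputs given the current input prices, mirroring the iterative feedback between taxes, prices, and substitution that the IMO process formalizes; the full model additionally updates the coefficients of A between price solves.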

  2. Optimizing Input/Output Using Adaptive File System Policies

    Science.gov (United States)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.

  3. Manual input device for controlling a robot arm

    International Nuclear Information System (INIS)

    Fischer, P.J.; Siva, K.V.

    1990-01-01

    A six-axis input device, eg joystick, is supported by a mechanism which enables the joystick to be aligned with any desired orientation, eg parallel to the tool. The mechanism can then be locked to provide a rigid support of the joystick. The mechanism may include three pivotal joints whose axes are perpendicular, each incorporating a clutch. The clutches may be electromagnetic or mechanical and may be operable jointly or independently. The robot arm comprises a base rotatable about a vertical axis, an upper arm, a forearm and a tool or grip rotatable about three perpendicular axes relative to the forearm. (author)

  4. Rethinking the Numerate Citizen: Quantitative Literacy and Public Issues

    Directory of Open Access Journals (Sweden)

    Ander W. Erickson

    2016-07-01

    Full Text Available Does a citizen need to possess quantitative literacy in order to make responsible decisions on behalf of the public good? If so, how much is enough? This paper presents an analysis of the quantitative claims made on behalf of ballot measures in order to better delineate the role of quantitative literacy for the citizen. I argue that this role is surprisingly limited due to the contextualized nature of quantitative claims that are encountered outside of a school setting. Instead, rational dependence, or the reasoned dependence on the knowledge of others, is proposed as an educational goal that can supplement quantitative literacy and, in so doing, provide a more realistic plan for informed evaluations of quantitative claims.

  5. Method and System for Physiologically Modulating Videogames and Simulations which Use Gesture and Body Image Sensing Control Input Devices

    Science.gov (United States)

    Pope, Alan T. (Inventor); Stephens, Chad L. (Inventor); Habowski, Tyler (Inventor)

    2017-01-01

    Method for physiologically modulating videogames and simulations includes utilizing input from a motion-sensing video game system and input from a physiological signal acquisition device. The inputs from the physiological signal sensors are utilized to change the response of a user's avatar to inputs from the motion-sensing sensors. The motion-sensing system comprises a 3D sensor system having full-body 3D motion capture of a user's body. This arrangement encourages health-enhancing physiological self-regulation skills or therapeutic amplification of healthful physiological characteristics. The system provides increased motivation for users to utilize biofeedback as may be desired for treatment of various conditions.

  6. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

Computer models for various risk assessment applications are closely scrutinized, both by questioning the correctness of the underlying mathematical model with respect to the process it attempts to represent and by verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. This paper reviews model characteristics that have a direct bearing on the model input process and gives reasons for using probability-based modeling of the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present.
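A common concrete technique for modeling individual input distributions in risk assessment codes is Latin hypercube sampling, which stratifies each input's range so that every stratum is sampled exactly once. The sketch below uses uniform marginals and induces no dependence between inputs; it illustrates the idea rather than the paper's specific procedure:

```python
import random

def latin_hypercube(n_samples, n_vars, seed=0):
    """Latin hypercube sample on [0, 1]^n_vars: each variable's range is
    split into n_samples equal strata and each stratum is used exactly once."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_vars):
        # one random point per stratum, then shuffle the pairing across variables
        col = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))  # one (x_1, ..., x_n_vars) tuple per sample

pts = latin_hypercube(10, 2)
```

Non-uniform marginals follow by pushing each coordinate through the inverse CDF of the desired distribution; inducing a target rank correlation between columns, as needed for dependent inputs, requires a further restricted-pairing step.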

  7. Inhalation Exposure Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

This analysis is one of the nine reports that support the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN) biosphere model. The "Biosphere Model Report" (BSC 2003a) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents a set of input parameters for the biosphere model, and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for a Yucca Mountain repository. This report, "Inhalation Exposure Input Parameters for the Biosphere Model", is one of the five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the "Technical Work Plan for Biosphere Modeling and Expert Support" (BSC 2003b). It should be noted that some documents identified in Figure 1-1 may be under development at the time this report is issued and therefore not available at that time. This figure is included to provide an understanding of how this analysis report contributes to biosphere modeling in support of the license application, and is not intended to imply that access to the listed documents is required to understand the contents of this analysis report. This analysis report defines and justifies values of mass loading, which is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Measurements of mass loading are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air surrounding crops and concentrations in air inhaled by a receptor. Concentrations in air to which the

  8. Remote input/output station

    CERN Multimedia

    1972-01-01

    A general view of the remote input/output station installed in building 112 (ISR) and used for submitting jobs to the CDC 6500 and 6600. The card reader on the left and the line printer on the right are operated by programmers on a self-service basis.

  9. Inverse Tasks In The Tsunami Problem: Nonlinear Regression With Inaccurate Input Data

    Science.gov (United States)

    Lavrentiev, M.; Shchemel, A.; Simonov, K.

The problem can be formally stated this way: a distribution of the various combinations of observed values should be estimated, with the totality of combinations represented by the set of variables and the results of observations determining a sample of outputs. Within this problem, a continuous (along with its derivatives) homomorphic mapping from the space of hidden parameters to the space of observed parameters should be found. This allows the missing input information to be reconstructed when the number of inputs is not less than the number of hidden parameters, and the distribution to be estimated when the information is not sufficient for an unambiguous prediction of the unknown inputs. The following approach to building an approximation from the sample is suggested: the sample is supplemented with hidden parameters distributed uniformly in a bounded multidimensional space, and one then seeks the correspondence between model and observed outputs for which the best approximation is the most accurate. In the odd iterations, the dependence between hidden inputs and outputs is optimized (as in the conventional problem). The correspondence between tasks changes when the error is reduced while the distribution of inputs remains intact; a special transform is therefore applied to reduce the error at every iteration. If the measure of the distribution is constant, the condition on the transformations is simplified. Such transforms are called "canonical" or "volume-invariant" transforms and are well known. This approach is suggested for solving the main inverse task of the tsunami problem: estimating the parameters of the tsunami source from tsunami records at the seaside and on the shelf.

  10. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  11. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

Full Text Available Background: The opportunity for automated histological analysis offered by whole-slide scanners implies an ever-increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantify cardiac tissue components in histological microsections. Data acquisition using a commercially available whole-slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point-grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  12. Input and Age-Dependent Variation in Second Language Learning: A Connectionist Account.

    Science.gov (United States)

    Janciauskas, Marius; Chang, Franklin

    2017-07-26

    Language learning requires linguistic input, but several studies have found that knowledge of second language (L2) rules does not seem to improve with more language exposure (e.g., Johnson & Newport, 1989). One reason for this is that previous studies did not factor out variation due to the different rules tested. To examine this issue, we reanalyzed grammaticality judgment scores in Flege, Yeni-Komshian, and Liu's (1999) study of L2 learners using rule-related predictors and found that, in addition to the overall drop in performance due to a sensitive period, L2 knowledge increased with years of input. Knowledge of different grammar rules was negatively associated with input frequency of those rules. To better understand these effects, we modeled the results using a connectionist model that was trained using Korean as a first language (L1) and then English as an L2. To explain the sensitive period in L2 learning, the model's learning rate was reduced in an age-related manner. By assigning different learning rates for syntax and lexical learning, we were able to model the difference between early and late L2 learners in input sensitivity. The model's learning mechanism allowed transfer between the L1 and L2, and this helped to explain the differences between different rules in the grammaticality judgment task. This work demonstrates that an L1 model of learning and processing can be adapted to provide an explicit account of how the input and the sensitive period interact in L2 learning. © 2017 The Authors. Cognitive Science - A Multidisciplinary Journal published by Wiley Periodicals, Inc.

  13. Identifying the relevant dependencies of the neural network response on characteristics of the input space

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    This talk presents an approach to identify those characteristics of the neural network inputs that are most relevant for the response and therefore provides essential information to determine the systematic uncertainties.

  14. Use of a D17Z1 oligonucleotide probe for human DNA quantitation prior to PCR analysis of polymorphic DNA markers

    Energy Technology Data Exchange (ETDEWEB)

Walsh, S.; Alavaren, M.; Varlaro, J. [Roche Molecular Systems, Alameda, CA (United States)] [and others]

    1994-09-01

The alpha-satellite DNA locus D17Z1 contains primate-specific sequences which are repeated several hundred times per chromosome 17. A probe that was designed to hybridize to a subset of the D17Z1 sequence can be used for very sensitive and specific quantitation of human DNA. Sample human genomic DNA is immobilized on nylon membrane using a slot blot apparatus, and then hybridized with a biotinylated D17Z1 oligonucleotide probe. The subsequent binding of streptavidin-horseradish peroxidase to the bound probe allows for either colorimetric (TMB) or chemiluminescent (ECL) detection. Signals obtained for sample DNAs are then compared to the signals obtained for a series of human DNA standards. For either detection method, forty samples can be quantitated in less than two hours, with a sensitivity of 150 pg. As little as 20 pg of DNA can be quantitated when using chemiluminescent detection with longer film exposures. PCR analysis of several VNTR and STR markers has indicated that optimal typing results are generally obtained within a relatively narrow range of input DNA quantities. Too much input DNA can lead to PCR artifacts such as preferential amplification of smaller alleles, non-specific amplification products, and exaggeration of the DNA synthesis slippage products that are seen with STR markers. Careful quantitation of human genomic DNA prior to PCR can avoid or minimize these problems and ultimately give cleaner, more unambiguous PCR results.
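The final step, reading a sample's DNA amount off the dilution series of standards, amounts to interpolating against the standards' signals. The amounts and signal values below are invented for illustration, not real slot-blot data:

```python
def interpolate_amount(signal, standards):
    """Linearly interpolate a sample's DNA amount from (amount, signal)
    standard pairs; returns None outside the calibrated range."""
    pts = sorted(standards, key=lambda p: p[1])     # sort by signal
    for (a0, s0), (a1, s1) in zip(pts, pts[1:]):
        if s0 <= signal <= s1:
            frac = (signal - s0) / (s1 - s0)
            return a0 + frac * (a1 - a0)
    return None

# Hypothetical standard series: (ng of human DNA, measured signal)
standards = [(0.15, 10.0), (0.6, 35.0), (2.5, 120.0), (10.0, 400.0)]
amount = interpolate_amount(72.0, standards)
```

In practice the standards would span the 150 pg sensitivity limit mentioned above, and samples falling outside the calibrated range would be re-run at a different dilution rather than extrapolated.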

  15. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    CERN Document Server

    Vekemans, B; Somogyi, A; Drakopoulos, M; Kempenaers, L; Simionovici, A; Adams, F

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative u...

  16. Authentic Language Input Through Audiovisual Technology and Second Language Acquisition

    Directory of Open Access Journals (Sweden)

    Taher Bahrani

    2014-09-01

    Full Text Available Second language acquisition cannot take place without having exposure to language input. With regard to this, the present research aimed at providing empirical evidence about the low and the upper-intermediate language learners’ preferred type of audiovisual programs and language proficiency development outside the classroom. To this end, 60 language learners (30 low level and 30 upper-intermediate level were asked to have exposure to their preferred types of audiovisual program(s outside the classroom and keep a diary of the amount and the type of exposure. The obtained data indicated that the low-level participants preferred cartoons and the upper-intermediate participants preferred news more. To find out which language proficiency level could improve its language proficiency significantly, a post-test was administered. The results indicated that only the upper-intermediate language learners gained significant improvement. Based on the findings, the quality of the language input should be given priority over the amount of exposure.

  17. Quantitative verification of ab initio self-consistent laser theory.

    Science.gov (United States)

    Ge, Li; Tandy, Robert J; Stone, A D; Türeci, Hakan E

    2008-10-13

We generalize and test the recent "ab initio" self-consistent (AISC) time-independent semiclassical laser theory. This self-consistent formalism generates all the stationary lasing properties in the multimode regime (frequencies, thresholds, internal and external fields, output power and emission pattern) from simple inputs: the dielectric function of the passive cavity, the atomic transition frequency, and the transverse relaxation time of the lasing transition. We find that the theory gives excellent quantitative agreement with full time-dependent simulations of the Maxwell-Bloch equations after it has been generalized to drop the slowly-varying envelope approximation. The theory is infinite order in the non-linear hole-burning interaction; the widely used third order approximation is shown to fail badly.

  18. The expert surgical assistant. An intelligent virtual environment with multimodal input.

    Science.gov (United States)

    Billinghurst, M; Savage, J; Oppenheimer, P; Edmond, C

    1996-01-01

    Virtual Reality has made computer interfaces more intuitive but not more intelligent. This paper shows how an expert system can be coupled with multimodal input in a virtual environment to provide an intelligent simulation tool or surgical assistant. This is accomplished in three steps. First, voice and gestural input is interpreted and represented in a common semantic form. Second, a rule-based expert system is used to infer context and user actions from this semantic representation. Finally, the inferred user actions are matched against steps in a surgical procedure to monitor the user's progress and provide automatic feedback. In addition, the system can respond immediately to multimodal commands for navigational assistance and/or identification of critical anatomical structures. To show how these methods are used we present a prototype sinus surgery interface. The approach described here may easily be extended to a wide variety of medical and non-medical training applications by making simple changes to the expert system database and virtual environment models. Successful implementation of an expert system in both simulated and real surgery has enormous potential for the surgeon both in training and clinical practice.

  19. Quantitative approaches in climate change ecology

    DEFF Research Database (Denmark)

    Brown, Christopher J.; Schoeman, David S.; Sydeman, William J.

    2011-01-01

Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships

  20. International trade inoperability input-output model (IT-IIM): theory and application.

    Science.gov (United States)

    Jung, Jeesang; Santos, Joost R; Haimes, Yacov Y

    2009-01-01

    The inoperability input-output model (IIM) has been used for analyzing disruptions due to man-made or natural disasters that can adversely affect the operation of economic systems or critical infrastructures. Taking economic perturbation for each sector as inputs, the IIM provides the degree of economic production impacts on all industry sectors as the outputs for the model. The current version of the IIM does not provide a separate analysis for the international trade component of the inoperability. If an important port of entry (e.g., Port of Los Angeles) is disrupted, then international trade inoperability becomes a highly relevant subject for analysis. To complement the current IIM, this article develops the International Trade-IIM (IT-IIM). The IT-IIM investigates the resulting international trade inoperability for all industry sectors resulting from disruptions to a major port of entry. Similar to traditional IIM analysis, the inoperability metrics that the IT-IIM provides can be used to prioritize economic sectors based on the losses they could potentially incur. The IT-IIM is used to analyze two types of direct perturbations: (1) the reduced capacity of ports of entry, including harbors and airports (e.g., a shutdown of any port of entry); and (2) restrictions on commercial goods that foreign countries trade with the base nation (e.g., embargo).
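The IIM recursion behind this takes a direct-perturbation vector c* and an interdependency matrix A* and returns the sector inoperabilities q solving q = A*q + c*. The three-sector numbers below are invented for illustration, not calibrated trade data:

```python
# Interdependency matrix A*[i][j]: inoperability passed to sector i per unit
# inoperability of sector j (illustrative values only).
A_star = [[0.0, 0.2, 0.1],
          [0.3, 0.0, 0.2],
          [0.1, 0.1, 0.0]]
c_star = [0.1, 0.0, 0.0]  # direct perturbation: 10% inoperability in sector 0

def inoperability(A, c, iters=500):
    """Solve q = A.q + c by fixed-point iteration (converges here because
    the row sums of A are below one)."""
    n = len(c)
    q = c[:]
    for _ in range(iters):
        q = [sum(A[i][j] * q[j] for j in range(n)) + c[i] for i in range(n)]
    return q

q = inoperability(A_star, c_star)
```

Sectors can then be ranked by q (or by q weighted by sector output) to prioritize them by potential losses, as the article describes; the IT-IIM extension derives c* from disruptions to ports of entry rather than assuming it directly.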

  1. Speech graphs provide a quantitative measure of thought disorder in psychosis.

    Science.gov (United States)

    Mota, Natalia B; Vasconcelos, Nivaldo A P; Lemos, Nathalia; Pieretti, Ana C; Kinouchi, Osame; Cecchi, Guillermo A; Copelli, Mauro; Ribeiro, Sidarta

    2012-01-01

    Psychosis has various causes, including mania and schizophrenia. Since the differential diagnosis of psychosis is exclusively based on subjective assessments of oral interviews with patients, an objective quantification of the speech disturbances that characterize mania and schizophrenia is in order. In principle, such quantification could be achieved by the analysis of speech graphs. A graph represents a network with nodes connected by edges; in speech graphs, nodes correspond to words and edges correspond to semantic and grammatical relationships. To quantify speech differences related to psychosis, interviews with schizophrenics, manics and normal subjects were recorded and represented as graphs. Manics scored significantly higher than schizophrenics in ten graph measures. Psychopathological symptoms such as logorrhea, poor speech, and flight of thoughts were grasped by the analysis even when verbosity differences were discounted. Binary classifiers based on speech graph measures sorted schizophrenics from manics with up to 93.8% of sensitivity and 93.7% of specificity. In contrast, sorting based on the scores of two standard psychiatric scales (BPRS and PANSS) reached only 62.5% of sensitivity and specificity. The results demonstrate that alterations of the thought process manifested in the speech of psychotic patients can be objectively measured using graph-theoretical tools, developed to capture specific features of the normal and dysfunctional flow of thought, such as divergence and recurrence. The quantitative analysis of speech graphs is not redundant with standard psychometric scales but rather complementary, as it yields a very accurate sorting of schizophrenics and manics. Overall, the results point to automated psychiatric diagnosis based not on what is said, but on how it is said.
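A minimal version of the graph construction (nodes as words, directed edges between consecutive words) can be sketched as below; the study itself also encodes grammatical and semantic links and computes a richer set of graph measures:

```python
def speech_graph(text):
    """Build a toy word-adjacency speech graph: nodes are distinct words,
    directed edges connect consecutive words in the transcript."""
    words = text.lower().split()
    nodes = set(words)
    edges = set(zip(words, words[1:]))
    return nodes, edges

def graph_measures(nodes, edges):
    """A few simple measures used in graph analyses of speech."""
    loops = sum(1 for a, b in edges if a == b)  # self-loops: immediate word repeats
    return {"nodes": len(nodes), "edges": len(edges), "self_loops": loops}

nodes, edges = speech_graph("the dog chased the cat and the cat ran")
m = graph_measures(nodes, edges)
```

Recurrence (perseveration) shows up as self-loops and short cycles, while divergence (flight of thoughts) shows up as nodes with high out-degree; computing measures over fixed-length word windows is one way such analyses discount raw verbosity differences.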

  2. Speech graphs provide a quantitative measure of thought disorder in psychosis.

    Directory of Open Access Journals (Sweden)

    Natalia B Mota

Full Text Available BACKGROUND: Psychosis has various causes, including mania and schizophrenia. Since the differential diagnosis of psychosis is exclusively based on subjective assessments of oral interviews with patients, an objective quantification of the speech disturbances that characterize mania and schizophrenia is in order. In principle, such quantification could be achieved by the analysis of speech graphs. A graph represents a network with nodes connected by edges; in speech graphs, nodes correspond to words and edges correspond to semantic and grammatical relationships. METHODOLOGY/PRINCIPAL FINDINGS: To quantify speech differences related to psychosis, interviews with schizophrenics, manics and normal subjects were recorded and represented as graphs. Manics scored significantly higher than schizophrenics in ten graph measures. Psychopathological symptoms such as logorrhea, poor speech, and flight of thoughts were grasped by the analysis even when verbosity differences were discounted. Binary classifiers based on speech graph measures sorted schizophrenics from manics with up to 93.8% of sensitivity and 93.7% of specificity. In contrast, sorting based on the scores of two standard psychiatric scales (BPRS and PANSS) reached only 62.5% of sensitivity and specificity. CONCLUSIONS/SIGNIFICANCE: The results demonstrate that alterations of the thought process manifested in the speech of psychotic patients can be objectively measured using graph-theoretical tools, developed to capture specific features of the normal and dysfunctional flow of thought, such as divergence and recurrence. The quantitative analysis of speech graphs is not redundant with standard psychometric scales but rather complementary, as it yields a very accurate sorting of schizophrenics and manics. Overall, the results point to automated psychiatric diagnosis based not on what is said, but on how it is said.

  3. READDATA: a FORTRAN 77 codeword input package

    International Nuclear Information System (INIS)

    Lander, P.A.

    1983-07-01

    A new codeword input package has been produced as a result of the incompatibility between different dialects of FORTRAN, especially when character variables are passed as parameters. This report is for those who wish to use a codeword input package with FORTRAN 77. The package, called ''Readdata'', attempts to combine the best features of its predecessors such as BINPUT and pseudo-BINPUT. (author)

  4. Originate: PC input processor for origen-S

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1994-01-01

ORIGINATE is a personal computer program developed at Oak Ridge National Laboratory to serve as a user-friendly interface for the ORIGEN-S isotopic generation and depletion code. It is designed to assist an ORIGEN-S user in preparing an input file for execution of light-water-reactor fuel depletion and decay cases. Output from ORIGINATE is a card-image input file that may be uploaded to a mainframe computer to execute ORIGEN-S in SCALE-4. ORIGINATE features a pull-down menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up an ORIGEN-S input file and perform error checking. This capability increases productivity and decreases the chance of user error. (authors). 6 refs., 3 tabs

  5. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [3H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against 3H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [14C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time is used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparison of densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)
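    The core quantitative step is converting a film optical density reading into an amount of bound ligand via standards whose radioactivity is known from scintillation counting. A minimal sketch, with purely hypothetical calibration values:

```python
import numpy as np

# Hypothetical calibration pairs: optical density of co-exposed standards
# versus radioactivity measured by scintillation counting (fmol/mg protein).
od_standards   = np.array([0.05, 0.15, 0.30, 0.55, 0.80])
fmol_standards = np.array([10.0, 40.0, 90.0, 180.0, 270.0])

def od_to_binding(od):
    """Interpolate a film optical density reading to bound ligand using
    the standard curve (piecewise-linear interpolation; sketch only)."""
    return np.interp(od, od_standards, fmol_standards)

# Densitometer readings from two brain regions (illustrative)
region_od = np.array([0.10, 0.42])
binding = od_to_binding(region_od)
```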

  6. Low-level waste shallow land disposal source term model: Data input guides

    International Nuclear Information System (INIS)

    Sullivan, T.M.; Suen, C.J.

    1989-07-01

    This report provides an input guide for the computational models developed to predict the rate of radionuclide release from shallow land disposal of low-level waste. Release of contaminants depends on four processes: water flow, container degradation, waste form leaching, and contaminant transport. The computer code FEMWATER has been selected to predict the movement of water in unsaturated porous media. The computer code BLT (Breach, Leach, and Transport), a modification of FEMWASTE, has been selected to predict the processes of container degradation (Breach), contaminant release from the waste form (Leach), and contaminant migration (Transport). In conjunction, these two codes have the capability to account for the effects of disposal geometry, unsaturated water flow, container degradation, waste form leaching, and migration of contaminant releases within a single disposal trench. In addition to the input requirements, this report presents the fundamental equations and relationships used to model the four different processes previously discussed. Further, the appendices provide a representative sample of data required by the different models. 14 figs., 27 tabs

  7. Effects of Heat Input on the Mechanical and Metallurgical Characteristics of Tig Welded Incoloy 800Ht Joints

    Directory of Open Access Journals (Sweden)

    Kumar S. Arun

    2017-09-01

    Full Text Available This study focuses on the effect of heat input on the quality characteristics of tungsten inert gas (TIG) arc welded Incoloy 800HT joints using Inconel-82 filler wire. Butt welding was done on specimens with four different heat inputs by varying process parameters such as welding current and speed. The results indicated that higher heat input levels led to the formation of a coarser grain structure, reduced mechanical properties and sensitization issues in the weldments. The formation of titanium nitrides provided resistance to fracture and increased the tensile strength of the joints at high temperatures. Further aging was done on the welded sample at a temperature of 750°C for 500 hours, and the metallographic results showed the formation of carbides along the grain boundaries in a chain of discrete and globular form, which increased the hardness of the material. The formation of spinel NiCr2O4 provided oxidation resistance to the material during elevated-temperature service.
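    Varying current and travel speed changes heat input through the standard arc-welding relation HI = η·V·I·60/(1000·v), with v in mm/min. The parameter values below are illustrative, not the study's actual welding schedules:

```python
def heat_input_kj_per_mm(voltage_v, current_a, travel_speed_mm_per_min, efficiency=0.6):
    """Standard arc-welding heat input formula:
    HI [kJ/mm] = eta * V * I * 60 / (1000 * v).
    The arc efficiency of ~0.6 is a typical value assumed for TIG."""
    return efficiency * voltage_v * current_a * 60.0 / (1000.0 * travel_speed_mm_per_min)

# Four hypothetical (voltage, current, speed) sets giving increasing heat input
trials = [(12, 80, 120), (12, 100, 120), (14, 120, 100), (14, 140, 90)]
heat_inputs = [heat_input_kj_per_mm(v, i, s) for v, i, s in trials]
```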

  8. Evaluating the efficiency of municipalities in collecting and processing municipal solid waste: A shared input DEA-model

    International Nuclear Information System (INIS)

    Rogge, Nicky; De Jaeger, Simon

    2012-01-01

    Highlights: ► Complexity in local waste management calls for more in-depth efficiency analysis. ► Shared-input Data Envelopment Analysis can provide a solution. ► Considerable room for the Flemish municipalities to improve their cost efficiency. - Abstract: This paper proposes an adjusted “shared-input” version of the popular efficiency measurement technique Data Envelopment Analysis (DEA) that enables evaluating municipal waste collection and processing performance in settings in which one input (waste costs) is shared among treatment efforts of multiple municipal solid waste fractions. The main advantage of this version of DEA is that it not only provides an estimate of the municipalities' overall cost efficiency but also estimates of the municipalities' cost efficiency in the treatment of the different fractions of municipal solid waste (MSW). To illustrate the practical usefulness of the shared-input DEA-model, we apply the model to data on 293 municipalities in Flanders, Belgium, for the year 2008.
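    The basic DEA idea can be shown in the single-input, single-output special case, where the constant-returns-to-scale efficiency score reduces to each unit's output/input ratio divided by the best observed ratio. This sketch does not reproduce the paper's shared-input model (which splits one cost input across several waste fractions via a linear program); the data are invented:

```python
def dea_ccr_efficiency(inputs, outputs):
    """CRS DEA efficiency for the single-input, single-output case:
    score_j = (y_j / x_j) / max_k (y_k / x_k)."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical municipalities: waste cost (input) vs tonnes collected (output)
cost   = [100.0, 150.0, 120.0]
tonnes = [500.0, 600.0, 660.0]
scores = dea_ccr_efficiency(cost, tonnes)
```

    The third municipality defines the frontier (5.5 tonnes per cost unit) and scores 1.0; the others are scored relative to it.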

  9. Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P

    2013-12-16

    Mass spectrometry is already a well-established protein identification tool and recent methodological and technological developments have also made possible the extraction of quantitative data of protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and therefore, different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable efforts to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI, namely the MIAPE Quant guidelines, which have been developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata, required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and

  10. OFFSCALE: A PC input processor for the SCALE code system. The CSASIN processor for the criticality sequences

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1994-11-01

    OFFSCALE is a suite of personal computer input processor programs developed at Oak Ridge National Laboratory to provide an easy-to-use interface for modules in the SCALE-4 code system. CSASIN (formerly known as OFFSCALE) is a program in the OFFSCALE suite that serves as a user-friendly interface for the Criticality Safety Analysis Sequences (CSAS) available in SCALE-4. It is designed to assist a SCALE-4 user in preparing an input file for execution of criticality safety problems. Output from CSASIN generates an input file that may be used to execute the CSAS control module in SCALE-4. CSASIN features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up a CSAS input file and perform data checking. This capability increases productivity and decreases the chance of user error

  11. Distance-Ranked Fault Identification of Reconfigurable Hardware Bitstreams via Functional Input

    Directory of Open Access Journals (Sweden)

    Naveed Imran

    2014-01-01

    Full Text Available Distance-Ranked Fault Identification (DRFI) is a dynamic reconfiguration technique which employs runtime inputs to conduct online functional testing of fielded FPGA logic and interconnect resources without test vectors. At design time, a diverse set of functionally identical bitstream configurations are created which utilize alternate hardware resources in the FPGA fabric. An ordering is imposed on the configuration pool, updated by the PageRank indexing precedence. The configurations which utilize permanently damaged resources, and hence manifest discrepant outputs, receive a lower rank and are thus less preferred for instantiation on the FPGA. Results indicate accurate identification of fault-free configurations in a pool of pregenerated bitstreams with a low number of reconfigurations and input evaluations. For MCNC benchmark circuits, the observed reduction in input evaluations is up to 75% when comparing the DRFI technique to unguided evaluation. The DRFI diagnosis method is seen to isolate all 14 healthy configurations from a pool of 100 pregenerated configurations, thereby offering 100% isolation accuracy provided the fault-free configurations exist in the design pool. When a complete recovery is not feasible, graceful degradation may be realized, which is demonstrated by the PSNR improvement of images processed in a video encoder case study.
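    The underlying intuition (configurations whose outputs disagree with their peers are demoted) can be sketched with a simple majority-agreement ranking. This is a simplified stand-in for DRFI's PageRank-based ordering, with invented data:

```python
from collections import Counter

def rank_configurations(outputs_per_config):
    """Rank bitstream configurations by agreement with the majority output
    over a set of runtime input evaluations: fewer discrepancies with the
    majority means a higher preference for instantiation."""
    names = list(outputs_per_config)
    n_inputs = len(outputs_per_config[names[0]])
    # Majority vote over all configurations, per input evaluation
    majority = [Counter(outputs_per_config[n][i] for n in names).most_common(1)[0][0]
                for i in range(n_inputs)]
    discrepancies = {n: sum(o != m for o, m in zip(outputs_per_config[n], majority))
                     for n in names}
    return sorted(names, key=lambda n: discrepancies[n])

configs = {
    "cfg_a": [1, 0, 1, 1],   # healthy
    "cfg_b": [1, 0, 1, 1],   # healthy
    "cfg_c": [1, 1, 0, 1],   # routed through a damaged resource
}
ranking = rank_configurations(configs)
```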

  12. Determination of the arterial input function in mouse-models using clinical MRI

    International Nuclear Information System (INIS)

    Theis, D.; Fachhochschule Giessen-Friedberg; Keil, B.; Heverhagen, J.T.; Klose, K.J.; Behe, M.; Fiebich, M.

    2008-01-01

    Dynamic contrast-enhanced magnetic resonance imaging is a promising method for quantitative analysis of tumor perfusion and is increasingly used in the study of cancer in small animal models. In such studies, the determination of the arterial input function (AIF) of the target tissue can be the first step. Series of short-axis images of the heart were acquired during administration of a bolus of Gd-DTPA using saturation-recovery gradient echo pulse sequences. The AIF was determined from the changes of the signal intensity in the left ventricle. The native T1 relaxation times and AIF were determined for 11 mice. An average value of (1.16 ± 0.09) s for the native T1 relaxation time was measured. However, the AIF showed significant inter-animal variability, as previously observed by other authors. The inter-animal variability shows that a direct measurement of the AIF is reasonable to avoid significant errors. The proposed method for determination of the AIF proved to be reliable. (orig.)
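    Converting the left-ventricular signal into a contrast agent concentration typically means inverting the saturation-recovery signal equation for T1 and then applying the linear relaxivity model 1/T1 = 1/T10 + r1·C. A minimal sketch; the saturation time, relaxivity, and signal values are illustrative assumptions, not the study's parameters:

```python
import math

def gd_concentration(signal, s0, ts, t10, r1=4.5):
    """Solve S = S0 * (1 - exp(-TS/T1)) for T1, then obtain the Gd-DTPA
    concentration from 1/T1 = 1/T10 + r1*C.
    ts, t10 in seconds; r1 in 1/(mM*s); all values here are illustrative."""
    t1 = -ts / math.log(1.0 - signal / s0)
    return (1.0 / t1 - 1.0 / t10) / r1

# Native blood T1 of ~1.16 s, as reported in the abstract above
c = gd_concentration(signal=80.0, s0=100.0, ts=0.3, t10=1.16)
```

    Repeating this for each time frame of the left-ventricular signal curve yields the AIF C(t).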

  13. PRA and the implementation of quantitative safety goals

    International Nuclear Information System (INIS)

    Okrent, D.

    1983-01-01

    With the adoption by the U.S. Nuclear Regulatory Commission (NRC) in January, 1983, of a Policy Statement on Safety Goals for the Operation of Nuclear Power Plants, probabilistic risk assessment (PRA) has taken on increased importance in nuclear reactor safety. Although the Reactor Safety Study, WASH-1400, was a major pioneering effort that revolutionized thinking about reactor safety, PRA was used only on occasion by the NRC regulatory staff prior to the accident at Three Mile Island. Since then, PRA has been used more and more as an important factor in decision making, usually for specific issues. The nuclear industry has also employed PRA, sometimes to make its case on specific issues, sometimes to present a position on overall risk. The advent of the Zion and Indian Point PRAs, with their treatment of risks from fire, wind, and earthquakes, and their examination of the course of core melt accidents, has added a new dimension to the overall picture. Although the NRC has stated that during the next two-year evaluation period, its quantitative design objectives and PRA are not to enter directly into the licensing process, many important issues will be influenced significantly by the results of risk and reliability studies. In fact, PRA may be coming into a position of great importance before the methodology, data, and process are sufficiently mature for the task. Large gaps still exist in our understanding of phenomena and in input information; and much of the final result depends on subjective input; large differences of opinion can and should be expected to persist. Accepted standards for quality assurance, and adequacy and depth of independent, peer review remain to be formulated and achieved. This paper will summarize the recently adopted NRC safety policy and the two-year evaluation plan, and will provide, by example, some words of caution concerning a few of the difficulties which may arise. (orig.)

  14. Development of a Math Input Interface with Flick Operation for Mobile Devices

    Science.gov (United States)

    Nakamura, Yasuyuki; Nakahara, Takahiro

    2016-01-01

    Developing online test environments for e-learning for mobile devices will be useful to increase drill practice opportunities. In order to provide a drill practice environment for calculus using an online math test system, such as STACK, we develop a flickable math input interface that can be easily used on mobile devices. The number of taps…

  15. Sensitivity analysis of complex models: Coping with dynamic and static inputs

    International Nuclear Information System (INIS)

    Anstett-Collin, F.; Goffart, J.; Mara, T.; Denis-Vidal, L.

    2015-01-01

    In this paper, we address the issue of conducting a sensitivity analysis of complex models with both static and dynamic uncertain inputs. While several approaches have been proposed to compute the sensitivity indices of the static inputs (i.e. parameters), those of the dynamic inputs (i.e. stochastic fields) have rarely been addressed. For this purpose, we first treat each dynamic input as a Gaussian process. Then, the truncated Karhunen–Loève expansion of each dynamic input is performed. Such an expansion allows one to generate independent Gaussian processes from a finite number of independent random variables. Given that a dynamic input is represented by a finite number of random variables, its variance-based sensitivity index is defined by the sensitivity index of this group of variables. Besides, an efficient sampling-based strategy is described to estimate the first-order indices of all the input factors by using only two input samples. The approach is applied to a building energy model, in order to assess the impact of the uncertainties of the material properties (static inputs) and the weather data (dynamic inputs) on the energy performance of a real low energy consumption house. - Highlights: • Sensitivity analysis of models with uncertain static and dynamic inputs is performed. • Karhunen–Loève (KL) decomposition of the spatio/temporal inputs is performed. • The influence of the dynamic inputs is studied through the modes of the KL expansion. • The proposed approach is applied to a building energy model. • Impact of weather data and material properties on performance of real house is given
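    A truncated KL expansion on a discrete grid amounts to an eigendecomposition of the covariance matrix: each realization of the dynamic input is driven by a small set of independent standard normal variables, whose group-wise sensitivity index then stands in for the whole field. A sketch under assumed choices (exponential kernel, correlation length 0.2, five retained modes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariance of the dynamic input on a time grid (exponential kernel;
# grid, correlation length and truncation order are illustrative choices)
t = np.linspace(0.0, 1.0, 50)
cov = np.exp(-np.abs(t[:, None] - t[None, :]) / 0.2)

# Truncated Karhunen-Loeve expansion: keep the M leading eigenpairs
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
M = 5
lam = eigvals[order[:M]]          # leading eigenvalues, descending
phi = eigvecs[:, order[:M]]       # corresponding discrete eigenfunctions

# One realization of the field: driven by M independent N(0,1) variables,
# so variance-based indices of the field reduce to indices of this group
xi = rng.standard_normal(M)
sample = phi @ (np.sqrt(lam) * xi)
```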

  16. Molecular structure input on the web

    Directory of Open Access Journals (Sweden)

    Ertl Peter

    2010-02-01

    Full Text Available Abstract A molecule editor, that is, a program for input and editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. This review focuses on a special type of molecule editors, namely those that are used for molecule structure input on the web. Scientific computing is now moving more and more in the direction of web services and cloud computing, with servers scattered all around the Internet. Thus a web browser has become the universal scientific user interface, and a tool to edit molecules directly within the web browser is essential. The review covers the history of web-based structure input, starting with simple text entry boxes and early molecule editors based on clickable maps, before moving to the current situation dominated by Java applets. One typical example - the popular JME Molecule Editor - will be described in more detail. Modern Ajax server-side molecule editors are also presented. Finally, the possible future direction of web-based molecule editing, based on technologies like JavaScript and Flash, is discussed.

  17. Input and output constraints-based stabilisation of switched nonlinear systems with unstable subsystems and its application

    Science.gov (United States)

    Chen, Chao; Liu, Qian; Zhao, Jun

    2018-01-01

    This paper studies the problem of stabilisation of switched nonlinear systems with output and input constraints. We propose a recursive approach to solve this issue. None of the subsystems is assumed to be stabilisable, while the switched system is stabilised by dual design of controllers for the subsystems and a switching law. When dealing only with bounded input, we provide nested switching controllers using an extended backstepping procedure. If both input and output constraints are taken into consideration, a Barrier Lyapunov Function is employed to construct multiple Lyapunov functions for the switched nonlinear system in the backstepping procedure. As a practical example, the control design of an equilibrium manifold expansion model of an aero-engine is given to demonstrate the effectiveness of the proposed design method.

  18. Geostatistical and multivariate modelling for large scale quantitative mapping of seafloor sediments using sparse datasets, a case study from the Cleaverbank area (the Netherlands)

    NARCIS (Netherlands)

    Alevizos, Evangelos; Siemes, K.; Janmaat, J.; Snellen, M.; Simons, D.G.; Greinert, J

    2016-01-01

    Quantitative mapping of seafloor sediment properties (e.g. grain size) requires the input of comprehensive Multi-Beam Echo Sounder (MBES) datasets along with adequate ground truth for establishing a functional relation between them. MBES surveys in extensive shallow shelf areas can be a rather

  19. A strategy for integrated low-input potato production

    NARCIS (Netherlands)

    Vereijken, P.H.; Loon, van C.D.

    1991-01-01

    Current systems of potato growing use large amounts of pesticides and fertilizers; these inputs are costly and cause environmental problems. In this paper a strategy for integrated low-input potato production is developed with the aim of reducing costs, improving product quality and reducing

  20. Canonical multi-valued input Reed-Muller trees and forms

    Science.gov (United States)

    Perkowski, M. A.; Johnson, P. D.

    1991-01-01

    There has recently been increased interest in logic synthesis using EXOR gates. The paper introduces the fundamental concept of Orthogonal Expansion, which generalizes the ring form of the Shannon expansion to logic with multiple-valued (mv) inputs. Based on this concept we are able to define a family of canonical tree circuits. Such circuits can be considered for binary and multiple-valued input cases. They can be multi-level (trees and DAGs) or flattened to two-level AND-EXOR circuits. Input decoders similar to those used in Sum of Products (SOP) PLAs are used in realizations of multiple-valued input functions. In the case of binary logic, the family of flattened AND-EXOR circuits includes several forms discussed by Davio and Green. For the case of logic with multiple-valued inputs, the family of flattened mv AND-EXOR circuits includes three expansions known from literature and two new expansions.
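    For the binary case, the flattened AND-EXOR form with all variables in positive polarity (the positive-polarity Reed-Muller, or algebraic normal form) can be computed from a truth table with the standard butterfly-style Möbius transform over GF(2). A sketch, not tied to the paper's mv generalization:

```python
def pprm_coefficients(truth_table):
    """Positive-polarity Reed-Muller (algebraic normal form) coefficients
    of a Boolean function over GF(2). truth_table[i] is f evaluated at the
    input whose bit-vector is the binary expansion of i; the result's
    entry j is the coefficient of the product of variables in bitmask j."""
    coeffs = list(truth_table)
    n = len(coeffs).bit_length() - 1
    for i in range(n):
        bit = 1 << i
        for x in range(len(coeffs)):
            if x & bit:
                # XOR in the value of the subfunction with this bit cleared
                coeffs[x] ^= coeffs[x ^ bit]
    return coeffs

# f(x1, x0) = x0 OR x1  ->  ANF: x0 XOR x1 XOR x0.x1
anf = pprm_coefficients([0, 1, 1, 1])
```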

  1. The Input-Output Relationship of the Cholinergic Basal Forebrain

    Directory of Open Access Journals (Sweden)

    Matthew R. Gielow

    2017-02-01

    Full Text Available Basal forebrain cholinergic neurons influence cortical state, plasticity, learning, and attention. They collectively innervate the entire cerebral cortex, differentially controlling acetylcholine efflux across different cortical areas and timescales. Such control might be achieved by differential inputs driving separable cholinergic outputs, although no input-output relationship on a brain-wide level has ever been demonstrated. Here, we identify input neurons to cholinergic cells projecting to specific cortical regions by infecting cholinergic axon terminals with a monosynaptically restricted viral tracer. This approach revealed several circuit motifs, such as central amygdala neurons synapsing onto basolateral amygdala-projecting cholinergic neurons or strong somatosensory cortical input to motor cortex-projecting cholinergic neurons. The presence of input cells in the parasympathetic midbrain nuclei contacting frontally projecting cholinergic neurons suggests that the network regulating the inner eye muscles additionally regulates cortical state via acetylcholine efflux. This dataset enables future circuit-level experiments to identify drivers of known cortical cholinergic functions.

  2. Multifunction input-output board for the IBM AT/XT (Lab-Master)

    Energy Technology Data Exchange (ETDEWEB)

    Pilyar, A V

    1996-12-31

    A multifunction input-output board for the IBM PC AT/XT is described. It consists of a CMOS analog input multiplexer, a programmable amplifier, a fast 12-bit ADC, four 10-bit DACs and two 8-bit digital input-output registers. Specifications of the analog inputs and outputs are given. 6 refs.

  3. DOG -II input generator program for DOT3.5 code

    International Nuclear Information System (INIS)

    Hayashi, Katsumi; Handa, Hiroyuki; Yamada, Koubun; Kamogawa, Susumu; Takatsu, Hideyuki; Koizumi, Kouichi; Seki, Yasushi

    1992-01-01

    DOT3.5 is widely used for radiation transport analysis of fission reactors, fusion experimental facilities and particle accelerators. We developed the input generator program for the DOT3.5 code with the aim of preparing input data effectively. The former program, DOG, was developed and used internally in Hitachi Engineering Company. In this new version, DOG-II, the limitation to R-Θ geometry was removed. All input data are created interactively in front of a color display, without using the DOT3.5 manual. Geometry-related input is also easily created without calculation of precise curved mesh points. By using DOG-II, reliable input data for the DOT3.5 code are obtained easily and quickly

  4. Double input converters for different voltage sources with isolated charger

    Directory of Open Access Journals (Sweden)

    Chalash Sattayarak

    2014-09-01

    Full Text Available This paper presents double input converters for different voltage input sources with isolated charger coils. This research aims to increase the performance of the battery charger circuit. In the circuit, there are different voltage levels of input source. A microcontroller controls the operating modes of the switches, managing battery charging and automatically switching to discharge mode when the input voltage sources are lost from the system. The experimental results show better charging performance over each switching period while the voltage input sources work together. This research can therefore be used and developed into battery chargers for present and future applications.

  5. Input shaping control with reentry commands of prescribed duration

    Directory of Open Access Journals (Sweden)

    Valášek M.

    2008-12-01

    Full Text Available Control of flexible mechanical structures often deals with the problem of unwanted vibration. Input shaping is a feedforward method based on modification of the input signal so that the output performs the demanded behaviour. The presented approach is based on a finite-time Laplace transform. It leads to a no-vibration control signal without any limitations on its time duration, because it is not strictly tied to the system resonant frequency. This idea, used for synthesis of the control input, is extended to the design of a dynamical shaper with the reentry property, which transforms an arbitrary input signal into a signal that causes no vibration. All these theoretical results are supported by simulation experiments.
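    For comparison, the classical two-impulse zero-vibration (ZV) shaper, whose duration is fixed at half the damped period (precisely the restriction the finite-time Laplace approach above relaxes), can be written down in a few lines. Parameter values are illustrative:

```python
import math

def zv_shaper(omega_n, zeta):
    """Textbook two-impulse zero-vibration (ZV) shaper for a second-order
    mode with natural frequency omega_n [rad/s] and damping ratio zeta.
    Convolving any command with these impulses cancels residual vibration
    of that mode; the shaper duration is fixed at pi/omega_d."""
    k = math.exp(-zeta * math.pi / math.sqrt(1.0 - zeta ** 2))
    omega_d = omega_n * math.sqrt(1.0 - zeta ** 2)
    amplitudes = [1.0 / (1.0 + k), k / (1.0 + k)]
    times = [0.0, math.pi / omega_d]
    return amplitudes, times

amps, times = zv_shaper(omega_n=10.0, zeta=0.05)
```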

  6. GARFEM input deck description

    Energy Technology Data Exchange (ETDEWEB)

    Zdunek, A.; Soederberg, M. (Aeronautical Research Inst. of Sweden, Bromma (Sweden))

    1989-01-01

    The input card deck for the finite element program GARFEM version 3.2 is described in this manual. The program includes, but is not limited to, capabilities to handle the following problems: * Linear bar and beam element structures, * Geometrically non-linear problems (bar and beam), both static and transient dynamic analysis, * Transient response dynamics from a catalog of time varying external forcing function types or input function tables, * Eigenvalue solution (modes and frequencies), * Multi point constraints (MPC) for the modelling of mechanisms and e.g. rigid links. The MPC definition is used only in the geometrically linearized sense, * Beams with disjunct shear axis and neutral axis, * Beams with rigid offset. An interface exists that connects GARFEM with the program GAROS. GAROS is a program for aeroelastic analysis of rotating structures. Since this interface was developed, GARFEM now serves as a preprocessor program in place of NASTRAN, which was formerly used. Documentation of the methods applied in GARFEM exists but is so far limited to the capacities in existence before the GAROS interface was developed.

  7. Completing the Physical Representation of Quantum Algorithms Provides a Quantitative Explanation of Their Computational Speedup

    Science.gov (United States)

    Castagnoli, Giuseppe

    2018-03-01

    The usual representation of quantum algorithms, limited to the process of solving the problem, is physically incomplete. We complete it in three steps: (i) extending the representation to the process of setting the problem, (ii) relativizing the extended representation to the problem solver to whom the problem setting must be concealed, and (iii) symmetrizing the relativized representation for time reversal to represent the reversibility of the underlying physical process. The third step projects the input state of the representation, where the problem solver is completely ignorant of the setting and thus of the solution of the problem, onto one where she knows half of the solution (half of the information specifying it when the solution is an unstructured bit string). Completing the physical representation shows that the number of computation steps (oracle queries) required to solve any oracle problem in an optimal quantum way should be that of a classical algorithm endowed with advance knowledge of half of the solution.

  8. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  9. Quantitative and qualitative coronary arteriography. 1

    International Nuclear Information System (INIS)

    Brown, B.G.; Simpson, Paul; Dodge, J.T. Jr; Bolson, E.L.; Dodge, H.T.

    1991-01-01

    The clinical objectives of arteriography are to obtain information that contributes to an understanding of the mechanisms of the clinical syndrome, provides prognostic information, facilitates therapeutic decisions, and guides invasive therapy. Quantitative and improved qualitative assessments of arterial disease provide us with a common descriptive language which has the potential to accomplish these objectives more effectively and thus to improve clinical outcome. In certain situations, this potential has been demonstrated. Clinical investigation using quantitative techniques has definitely contributed to our understanding of disease mechanisms and of atherosclerosis progression/regression. Routine quantitation of clinical images should permit more accurate and repeatable estimates of disease severity and promises to provide useful estimates of coronary flow reserve. But routine clinical QCA awaits more cost- and time-efficient methods and clear proof of a clinical advantage. Careful inspection of highly magnified, high-resolution arteriographic images reveals morphologic features related to the pathophysiology of the clinical syndrome and to the likelihood of future progression or regression of obstruction. Features that have been found useful include thrombus in its various forms, ulceration and irregularity, eccentricity, flexing and dissection. The description of such high-resolution features should be included among, rather than excluded from, the goals of image processing, since they contribute substantially to the understanding and treatment of the clinical syndrome. (author). 81 refs.; 8 figs.; 1 tab

  10. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    International Nuclear Information System (INIS)

    Vekemans, B.; Vincze, L.; Somogyi, A.; Drakopoulos, M.; Kempenaers, L.; Simionovici, A.; Adams, F.

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative use of the MC code gives a 'no-compromise' solution for the quantification problem
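    The "iterative use of the MC code" amounts to a fixed-point loop: scale each element's concentration by the ratio of measured to simulated line intensity, renormalize, and repeat until the simulated spectrum matches the measurement. A generic sketch with a toy stand-in for the Monte Carlo simulator (the damping factor mimics matrix absorption; all numbers are invented):

```python
def iterate_quantification(measured, simulate, c0, n_iter=20):
    """Fixed-point iteration common in simulation-based XRF quantification:
    c_i <- c_i * I_measured_i / I_simulated_i(c), then renormalize.
    `simulate` stands in for the Monte Carlo spectrum simulator."""
    c = dict(c0)
    for _ in range(n_iter):
        sim = simulate(c)
        c = {el: c[el] * measured[el] / sim[el] for el in c}
        total = sum(c.values())
        c = {el: v / total for el, v in c.items()}
    return c

def toy_simulate(c):
    """Toy 'simulator': intensity proportional to concentration with a
    matrix-absorption-like damping (purely illustrative)."""
    atten = 1.0 / (1.0 + 0.5 * c["Fe"])
    return {"Fe": 100.0 * c["Fe"] * atten, "Cu": 80.0 * c["Cu"] * atten}

true_c = {"Fe": 0.7, "Cu": 0.3}
measured = toy_simulate(true_c)
recovered = iterate_quantification(measured, toy_simulate, {"Fe": 0.5, "Cu": 0.5})
```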


  12. Direct qualitative and quantitative determination of rare earths after separation by high pressure liquid chromatography (HPLC)

    International Nuclear Information System (INIS)

    Weuster, W.; Specker, H.

    1980-01-01

    The rare earths from lanthanum to erbium can be separated by HPLC in an eluent system of di-isopropylether/tetrahydrofuran/nitric acid (100:30:3) and determined qualitatively and quantitatively after calibration. Detection relies on fluorescence quenching of THF as the individual elements break through; the quenching is proportional to concentration, and the calibration curve is linear over an input of 0.2 to 0.02 moles. Standards, ores (monazites, cerite earths, yttria earths) and technical products were analysed qualitatively and quantitatively, and the results agree well with values obtained by other analytical methods. The relative standard deviation is 1.8-3% (N = 10). The procedure takes 50 min from dissolution of the analytical sample. (orig.)

  13. Input Manipulation, Enhancement and Processing: Theoretical Views and Empirical Research

    Science.gov (United States)

    Benati, Alessandro

    2016-01-01

    Researchers in the field of instructed second language acquisition have been examining the issue of how learners interact with input by conducting research measuring particular kinds of instructional interventions (input-oriented and meaning-based). These interventions include such things as input flood, textual enhancement and processing…

  14. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    Science.gov (United States)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft, where rotor degrees of freedom can have a significant impact on system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk Helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed-loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.

  15. Modeling of heat transfer into a heat pipe for a localized heat input zone

    International Nuclear Information System (INIS)

    Rosenfeld, J.H.

    1987-01-01

    A general model is presented for heat transfer into a heat pipe with a localized heat input. Conduction in the wall of the heat pipe and boiling in the interior structure are treated simultaneously. The model is derived for circumferential heat transfer in a cylindrical heat pipe evaporator and for radial heat transfer in a circular disk with boiling from the interior surface. A comparison is made with data for a localized heat input zone; agreement between the data and the model is good. The model can be used for design purposes if a boiling correlation is available, and it can be extended to provide improved predictions of heat pipe performance

  16. The Economic Impact of Tourism. An Input-Output Analysis

    OpenAIRE

    Camelia SURUGIU

    2009-01-01

    The paper presents an Input-Output Analysis for Romania, an important source of information for investigating the inter-relations existing among different industries. Input-Output Analysis is used to determine the role and importance of different economic sectors in generating value added, incomes and employment, and it analyses the connections existing in an economy. This paper focuses on tourism, and the input-output analysis is carried out for the Hotels and Restaurants Sector.
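The inter-industry multipliers that such an analysis rests on come from the Leontief inverse, L = (I − A)⁻¹, where A is the technical-coefficient matrix. A minimal sketch with an entirely hypothetical three-sector matrix (the hotels-and-restaurants column standing in for tourism demand):

```python
import numpy as np

# Hypothetical technical-coefficient matrix A: entry A[i, j] is the input
# from sector i needed per unit of sector j's output (illustrative values).
A = np.array([
    [0.10, 0.05, 0.20],   # agriculture
    [0.15, 0.25, 0.10],   # manufacturing
    [0.05, 0.10, 0.15],   # hotels & restaurants
])

# Leontief inverse: total (direct + indirect) output required per unit
# of final demand in each sector.
L = np.linalg.inv(np.eye(3) - A)

# Output multipliers are the column sums of L; a tourism demand shock of
# 100 in hotels & restaurants generates this total output by sector:
multipliers = L.sum(axis=0)
total_output = L @ np.array([0.0, 0.0, 100.0])
```

Each multiplier exceeds 1 because a unit of final demand induces both the direct output and the chain of intermediate inputs behind it.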

  17. Comparison of different snow model formulations and their responses to input uncertainties in the Upper Indus Basin

    Science.gov (United States)

    Pritchard, David; Fowler, Hayley; Forsythe, Nathan; O'Donnell, Greg; Rutter, Nick; Bardossy, Andras

    2017-04-01

    Snow and glacier melt in the mountainous Upper Indus Basin (UIB) sustain water supplies, irrigation networks, hydropower production and ecosystems in extensive downstream lowlands. Understanding hydrological and cryospheric sensitivities to climatic variability and change in the basin is therefore critical for local, national and regional water resources management. Assessing these sensitivities using numerical modelling is challenging, due to limitations in the quality and quantity of input and evaluation data, as well as uncertainties in model structures and parameters. This study explores how these uncertainties in inputs and process parameterisations affect distributed simulations of ablation in the complex climatic setting of the UIB. The role of model forcing uncertainties is explored using combinations of local observations, remote sensing and reanalysis - including the high resolution High Asia Refined Analysis - to generate multiple realisations of spatiotemporal model input fields. Forcing a range of model structures with these input fields then provides an indication of how different ablation parameterisations respond to uncertainties and perturbations in climatic drivers. Model structures considered include simple, empirical representations of melt processes through to physically based, full energy balance models with multi-physics options for simulating snowpack evolution (including an adapted version of FSM). Analysing model input and structural uncertainties in this way provides insights for methodological choices in climate sensitivity assessments of data-sparse, high mountain catchments. Such assessments are key for supporting water resource management in these catchments, particularly given the potential complications of enhanced warming through elevation effects or, in the case of the UIB, limited understanding of how and why local climate change signals differ from broader patterns.
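The simplest of the empirical melt representations mentioned above, a temperature-index (degree-day) scheme, can be sketched as follows; the degree-day factor and threshold temperatures are illustrative values, not parameters from the study.

```python
def degree_day_melt(temps_c, precip_mm, ddf=4.0, t_melt=0.0, t_snow=1.0):
    """Temperature-index snowpack model: precipitation accumulates as snow
    below t_snow (deg C); melt proceeds at ddf (mm per degree-day) above
    t_melt, limited by the available snow water equivalent (SWE).
    Returns the daily melt series in mm."""
    swe = 0.0  # snow water equivalent, mm
    melt_series = []
    for t, p in zip(temps_c, precip_mm):
        if t < t_snow:
            swe += p  # accumulate precipitation as snowfall
        melt = min(swe, ddf * max(t - t_melt, 0.0))
        swe -= melt
        melt_series.append(melt)
    return melt_series
```

Full energy-balance schemes such as FSM replace the single degree-day factor with explicit radiative and turbulent fluxes, which is exactly where the input-uncertainty question in the abstract bites.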

  18. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of communicating the uses of climate model output to users and policy makers, as well as to other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation is given of the language often employed in communicating climate model output: a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach is outlined for evaluating the relevance of quantitative climate model output

  19. Quantitative Reasoning in Problem Solving

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.

  20. Quantitative cerebral H₂¹⁵O perfusion PET without arterial blood sampling, a method based on washout rate

    International Nuclear Information System (INIS)

    Treyer, Valerie; Jobin, Mathieu; Burger, Cyrill; Buck, Alfred; Teneggi, Vincenzo

    2003-01-01

    The quantitative determination of regional cerebral blood flow (rCBF) is important in certain clinical and research applications. The disadvantage of most quantitative methods using H₂¹⁵O positron emission tomography (PET) is the need for arterial blood sampling. In this study a new non-invasive method for rCBF quantification was evaluated. The method is based on the washout rate of H₂¹⁵O following intravenous injection. All results were obtained with Alpert's method, which yields maps of the washin parameter K1 (rCBF-K1) and the washout parameter k2 (rCBF-k2). Maps of rCBF-K1 were computed with measured arterial input curves. Maps of rCBF-k2* were calculated with a standard input curve, the mean of eight individual input curves. The mean grey matter rCBF-k2* (CBF-k2*) was then compared with the mean rCBF-K1 (CBF-K1) in ten healthy volunteer smokers who underwent two PET sessions, on day 1 and day 3. Each session consisted of three serial H₂¹⁵O scans. Reproducibility was analysed using the rCBF difference scan 3 - scan 2 in each session. The perfusion reserve (PR = rCBF(acetazolamide) - rCBF(baseline)) following acetazolamide challenge was calculated with rCBF-k2* (PR-k2*) and rCBF-K1 (PR-K1) in ten patients with cerebrovascular disease. The difference CBF-k2* - CBF-K1 was 5.90±8.12 ml/min/100 ml (mean±SD, n=55). The SD of the scan 3 - scan 1 difference was 6.1% for rCBF-k2* and rCBF-K1, demonstrating high reproducibility. Perfusion reserve values determined with rCBF-K1 and rCBF-k2* were in close agreement (difference PR-k2* - PR-K1 = -6.5±10.4%, PR expressed as percentage increase from baseline). In conclusion, a new non-invasive method for the quantitative determination of rCBF is presented. The method is in good agreement with Alpert's original method and its reproducibility is high. It does not require arterial blood sampling, yields quantitative voxel-by-voxel maps of rCBF, and is computationally efficient and easy to implement
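In its simplest form, the washout-rate idea amounts to fitting a mono-exponential decay to the late tissue time-activity curve after the water bolus has cleared the blood. The sketch below illustrates only that reduction; the sampling grid, starting values, and the single-exponential assumption are illustrative, not Alpert's full method.

```python
import numpy as np
from scipy.optimize import curve_fit

def washout(t, c0, k2):
    # late-phase tissue activity after a water bolus: once delivery has
    # effectively ceased, C(t) ~ C0 * exp(-k2 * t)
    return c0 * np.exp(-k2 * t)

def fit_k2(t_min, activity):
    """Fit the washout rate k2 (1/min) to a late tissue time-activity curve."""
    (c0, k2), _cov = curve_fit(washout, t_min, activity, p0=(activity[0], 0.5))
    return k2
```

Converting a fitted k2 into a flow estimate additionally requires the blood-tissue partition coefficient, which is why a calibration against a standard input curve appears in the abstract.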

  1. Outsourcing, public Input provision and policy cooperation

    OpenAIRE

    Aronsson, Thomas; Koskela, Erkki

    2009-01-01

    This paper concerns public input provision as an instrument for redistribution under international outsourcing by using a model-economy comprising two countries, North and South, where firms in the North may outsource part of their low-skilled labor intensive production to the South. We consider two interrelated issues: (i) the incentives for each country to modify the provision of public input goods in response to international outsourcing, and (ii) whether international outsourcing justifie...

  2. An assessment of equity in the distribution of non-financial health care inputs across public primary health care facilities in Tanzania.

    Science.gov (United States)

    Kuwawenaruwa, August; Borghi, Josephine; Remme, Michelle; Mtei, Gemini

    2017-07-11

    There is limited evidence on how health care inputs are distributed from the sub-national level down to health facilities and their potential influence on promoting health equity. To address this gap, this paper assesses equity in the distribution of health care inputs across public primary health facilities at the district level in Tanzania. This is a quantitative assessment of equity in the distribution of health care inputs (staff, drugs, medical supplies and equipment) from district to facility level. The study was carried out in three districts (Kinondoni, Singida Rural and Manyoni district) in Tanzania. These districts were selected because they were implementing primary care reforms. We administered 729 exit surveys with patients seeking out-patient care, and health facility surveys at 69 facilities in early 2014. A total of seventeen indices of input availability were constructed with the collected data. The distribution of inputs was considered in relation to (i) the wealth of patients accessing the facilities, which was taken as a proxy for the wealth of the population in the catchment area; and (ii) facility distance from the district headquarters. We assessed equity in the distribution of inputs through the use of equity ratios, concentration indices and curves. We found a significant pro-rich distribution of clinical staff and nurses per 1000 population. Facilities with the poorest patients (most remote facilities) have fewer staff per 1000 population than those with the least poor patients (least remote facilities): 0.6 staff per 1000 among the poorest, compared to 0.9 among the least poor; 0.7 staff per 1000 among the most remote facilities compared to 0.9 among the least remote. The negative concentration index for support staff suggests a pro-poor distribution of this cadre, but the 45-degree line of equality dominated the concentration curve. The distribution of vaccines, antibiotics, anti-diarrhoeals, anti-malarials and medical supplies was approximately
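The concentration index used in such analyses can be computed with the standard covariance formula, CI = 2·cov(y, r)/ȳ, where r is the fractional rank in the wealth distribution. A minimal sketch, assuming unweighted individual-level data:

```python
import numpy as np

def concentration_index(values, wealth):
    """CI = 2*cov(y, r)/mean(y), with r the fractional wealth rank.
    Negative => input concentrated among the poor; positive => among the rich."""
    order = np.argsort(wealth)          # rank observations from poorest to richest
    y = np.asarray(values, float)[order]
    n = len(y)
    r = (np.arange(1, n + 1) - 0.5) / n  # fractional ranks in (0, 1)
    return 2.0 * np.cov(y, r, bias=True)[0, 1] / y.mean()
```

A perfectly equal distribution gives CI = 0; inputs rising with wealth give a positive index, matching the pro-rich staffing pattern reported above.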

  3. A quantitative PGNAA study for use in aqueous solution measurements using Am–Be neutron source and BGO scintillation detector

    Energy Technology Data Exchange (ETDEWEB)

    Ghal-Eh, N., E-mail: ghal-eh@du.ac.ir [School of Physics, Damghan University, P.O. Box 36716-41167, Damghan (Iran, Islamic Republic of); Ahmadi, P. [School of Physics, Damghan University, P.O. Box 36716-41167, Damghan (Iran, Islamic Republic of); Doost-Mohammadi, V. [Nuclear Science and Technology Research Center, AEOI, P.O. Box 11365-8486, Tehran (Iran, Islamic Republic of)

    2016-02-01

    A prompt gamma neutron activation analysis (PGNAA) system comprising an Am–Be neutron source and a BGO scintillation detector is used for quantitative analysis of bulk samples. Both Monte Carlo-simulated and experimental data are considered as input data libraries for two different procedures based on neural network and least squares methods. The results confirm the feasibility and precision of the proposed methods.
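One way to picture a least-squares procedure that maps a measured spectrum onto a library of reference responses is nonnegative spectral unmixing. The five-channel, two-element library below is entirely hypothetical and stands in for the simulated/experimental libraries mentioned in the abstract.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical library: each column is the prompt-gamma response spectrum
# of one pure element, binned into 5 channels (illustrative numbers only).
library = np.array([
    [10.0, 1.0],
    [ 6.0, 2.0],
    [ 2.0, 8.0],
    [ 1.0, 5.0],
    [ 0.5, 1.0],
])

def element_weights(measured_spectrum):
    # nonnegative least squares: measured ~ library @ w, with w >= 0
    w, _residual = nnls(library, measured_spectrum)
    return w
```

The nonnegativity constraint keeps the recovered elemental contributions physical, which an unconstrained least-squares fit would not guarantee.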

  4. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation of user input. By giving rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document discusses the current state of input checking and testing practices in NJOY21.
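The kind of rapid, human-readable input checking described can be illustrated with a small schema-driven validator. This is a hypothetical Python sketch of the general idea, not NJOY21's actual input interface; the field names and ranges are invented for illustration.

```python
def validate_card(card, schema):
    """Check a parsed input card (dict) against a schema of
    (name, type, (lo, hi)) entries. Returns a list of human-readable
    error messages; an empty list means the card is valid."""
    errors = []
    for name, typ, (lo, hi) in schema:
        if name not in card:
            errors.append(f"missing required field '{name}'")
            continue
        value = card[name]
        if not isinstance(value, typ):
            errors.append(
                f"field '{name}': expected {typ.__name__}, got {type(value).__name__}"
            )
        elif not (lo <= value <= hi):
            errors.append(f"field '{name}': {value} outside allowed range [{lo}, {hi}]")
    return errors

# Illustrative schema: a material number and a temperature
SCHEMA = [("mat", int, (1, 9999)), ("temp", float, (0.0, 5000.0))]
```

Reporting every problem at once, rather than failing on the first, is what makes such feedback useful while a user is still writing the input file.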

  5. Inhalation Exposure Input Parameters for the Biosphere Model

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-09-24

    This analysis is one of the nine reports that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN) biosphere model. The "Biosphere Model Report" (BSC 2003a) describes in detail the conceptual model as well as the mathematical model and its input parameters. This report documents a set of input parameters for the biosphere model, and supports the use of the model to develop biosphere dose conversion factors (BDCFs). The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for a Yucca Mountain repository. This report, "Inhalation Exposure Input Parameters for the Biosphere Model", is one of the five reports that develop input parameters for the biosphere model. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the plan for development of the biosphere abstraction products for TSPA, as identified in the "Technical Work Plan: for Biosphere Modeling and Expert Support" (BSC 2003b). It should be noted that some documents identified in Figure 1-1 may be under development at the time this report is issued and therefore not available at that time. This figure is included to provide an understanding of how this analysis report contributes to biosphere modeling in support of the license application, and is not intended to imply that access to the listed documents is required to understand the contents of this analysis report. This analysis report defines and justifies values of mass loading, which is the total mass concentration of resuspended particles (e.g., dust, ash) in a volume of air. Measurements of mass loading are used in the air submodel of ERMYN to calculate concentrations of radionuclides in air surrounding crops and concentrations in air

  6. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Juan D Chavez

    Full Text Available Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remain challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data, as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by and validate previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  7. Statistical aspects of quantitative real-time PCR experiment design.

    Science.gov (United States)

    Kitchen, Robert R; Kubista, Mikael; Tichopad, Ales

    2010-04-01

    Experiments using quantitative real-time PCR to test hypotheses are limited by technical and biological variability; we seek to minimise sources of confounding variability through optimum use of biological and technical replicates. The quality of an experiment design is commonly assessed by calculating its prospective power. Such calculations rely on knowledge of the expected variances of the measurements of each group of samples and the magnitude of the treatment effect; the estimation of which is often uninformed and unreliable. Here we introduce a method that exploits a small pilot study to estimate the biological and technical variances in order to improve the design of a subsequent large experiment. We measure the variance contributions at several 'levels' of the experiment design and provide a means of using this information to predict both the total variance and the prospective power of the assay. A validation of the method is provided through a variance analysis of representative genes in several bovine tissue-types. We also discuss the effect of normalisation to a reference gene in terms of the measured variance components of the gene of interest. Finally, we describe a software implementation of these methods, powerNest, that gives the user the opportunity to input data from a pilot study and interactively modify the design of the assay. The software automatically calculates expected variances, statistical power, and optimal design of the larger experiment. powerNest enables the researcher to minimise the total confounding variance and maximise prospective power for a specified maximum cost for the large study. Copyright 2010 Elsevier Inc. All rights reserved.
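The pilot-study idea, estimating biological and technical variance components and propagating them into the variance of a treatment-group mean, can be sketched with a balanced one-way ANOVA decomposition. This is a generic illustration of the statistics involved, not the powerNest implementation.

```python
def variance_components(replicate_groups):
    """replicate_groups: list of technical-replicate Cq lists, one list per
    biological sample (balanced design assumed). Returns the estimated
    (biological_var, technical_var) from a one-way ANOVA decomposition."""
    k = len(replicate_groups)
    n = len(replicate_groups[0])
    group_means = [sum(g) / n for g in replicate_groups]
    grand = sum(group_means) / k
    # within-group mean square estimates the technical variance
    ms_within = sum(
        sum((x - m) ** 2 for x in g)
        for g, m in zip(replicate_groups, group_means)
    ) / (k * (n - 1))
    # between-group mean square mixes biological and technical variance
    ms_between = n * sum((m - grand) ** 2 for m in group_means) / (k - 1)
    tech_var = ms_within
    bio_var = max((ms_between - ms_within) / n, 0.0)
    return bio_var, tech_var

def var_of_group_mean(bio_var, tech_var, n_bio, n_tech):
    # variance of a treatment-group mean with n_bio biological samples,
    # each measured with n_tech technical replicates
    return bio_var / n_bio + tech_var / (n_bio * n_tech)
```

Plugging candidate (n_bio, n_tech) pairs into `var_of_group_mean` shows directly why adding biological samples usually buys more power than adding technical replicates.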

  8. Projecting the potential evapotranspiration by coupling different formulations and input data reliabilities: The possible uncertainty source for climate change impacts on hydrological regime

    Science.gov (United States)

    Wang, Weiguang; Li, Changni; Xing, Wanqiu; Fu, Jianyu

    2017-12-01

    Representing the atmospheric evaporative capability for a hypothetical reference surface, potential evapotranspiration (PET) determines the upper limit of actual evapotranspiration and is an important input to hydrological models. Because present climate models do not give direct estimates of PET when simulating the hydrological response to future climate change, PET must be estimated first and is subject to uncertainty arising from the many existing formulae and the differing reliabilities of their input data. Using four different PET estimation approaches, i.e., the more physically based Penman (PN) equation with less reliable input variables, the more empirical radiation-based Priestley-Taylor (PT) equation with relatively dependable downscaled data, the simple temperature-based Hamon (HM) equation with the most reliable downscaled variable, and downscaling PET directly with the statistical downscaling model, this paper investigated the differences in runoff projections caused by the alternative PET methods, using a well-calibrated abcd monthly hydrological model. Three catchments, i.e., the Luanhe River Basin, the Source Region of the Yellow River and the Ganjiang River Basin, representing a large climatic diversity, were chosen as examples to illustrate this issue. The results indicated that although the four methods provided similar monthly patterns of PET over the period 2021-2050 for each catchment, the magnitudes of PET still differed slightly, especially for spring and summer months in the Luanhe River Basin and the Source Region of the Yellow River, which have a relatively dry climate. The apparent discrepancy in the magnitude of change in future runoff, and even the diverse change directions for summer months in the Luanhe River Basin and spring months in the Source Region of the Yellow River, indicated that PET-method-related uncertainty occurred, especially in the Luanhe River Basin and the Source Region of the Yellow River with their smaller aridity indices. Moreover, the
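Of the formulations compared, the temperature-based Hamon equation is simple enough to sketch. One commonly used form is PET = 29.8 · D · e_s / (T + 273.2), with D the daylight hours and e_s the saturation vapour pressure (kPa) at the mean daily temperature T; the coefficients below follow that published form and should be treated as assumptions of this sketch.

```python
import math

def hamon_pet_mm_per_day(temp_c, daylight_hours):
    """Hamon-type PET (mm/day): 29.8 * D * e_s / (T + 273.2), where e_s is
    the saturation vapour pressure in kPa from a standard Magnus-type
    approximation. Only temperature and day length are required, which is
    why this class of formula pairs well with reliably downscaled data."""
    e_s = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))
    return 29.8 * daylight_hours * e_s / (temp_c + 273.2)
```

Its single-input simplicity is exactly the trade-off discussed above: the forcing is reliable, but the formulation is the least physical of the four.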

  9. Combining emission inventory and isotope ratio analyses for quantitative source apportionment of heavy metals in agricultural soil.

    Science.gov (United States)

    Chen, Lian; Zhou, Shenglu; Wu, Shaohua; Wang, Chunhui; Li, Baojie; Li, Yan; Wang, Junxiao

    2018-08-01

    Two quantitative methods (emission inventory and isotope ratio analysis) were combined to apportion source contributions of heavy metals entering agricultural soils in the Lihe River watershed (Taihu region, east China). Source apportionment based on the emission inventory method indicated that for Cd, Cr, Cu, Pb, and Zn, the mean percentage input from atmospheric deposition was highest (62-85%), followed by irrigation (12-27%) and fertilization (1-14%); these heavy metals were thus derived mainly from industrial activities and traffic emissions. For Ni, the combined percentage input from irrigation and fertilization was approximately 20% higher than that from atmospheric deposition, indicating that Ni was mainly derived from agricultural activities. Based on isotope ratio analysis, atmospheric deposition accounted for 57-93% of Pb entering soil, with a mean value of 69.3%, indicating that this was the major source of Pb entering soil in the study area. The mean contributions of irrigation and fertilization to Pb pollution of soil ranged from 0% to 10%, indicating that they played only a marginal role. Overall, the results obtained using the two methods were similar. This study provides a reliable approach for source apportionment of heavy metals entering agricultural soils in the study area, and the approach clearly has potential application for future studies in other regions. Copyright © 2018 Elsevier Ltd. All rights reserved.
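Isotope-based apportionment of Pb between two endmembers reduces, in the simplest case, to a binary mixing calculation. The sketch below assumes ratios mix approximately linearly (reasonable when the endmember Pb concentrations are comparable); the 206Pb/207Pb endmember values are purely hypothetical.

```python
def source_fraction(sample_ratio, source_ratio, background_ratio):
    """Two-endmember isotope mixing: fraction f of the 'source' endmember
    such that sample = f*source + (1-f)*background. The result is clamped
    to the physical range [0, 1]."""
    f = (sample_ratio - background_ratio) / (source_ratio - background_ratio)
    return min(max(f, 0.0), 1.0)

# Hypothetical 206Pb/207Pb endmembers: atmospheric deposition vs parent soil
f_deposition = source_fraction(sample_ratio=1.16, source_ratio=1.12,
                               background_ratio=1.20)
```

In practice, a concentration-weighted mixing model and propagated measurement uncertainty would refine the raw fraction computed here.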

  10. Consumer input into research: the Australian Cancer Trials website.

    Science.gov (United States)

    Dear, Rachel F; Barratt, Alexandra L; Crossing, Sally; Butow, Phyllis N; Hanson, Susan; Tattersall, Martin Hn

    2011-06-26

    The Australian Cancer Trials website (ACTO) was publicly launched in 2010 to help people search for cancer clinical trials recruiting in Australia, provide information about clinical trials and assist with doctor-patient communication about trials. We describe consumer involvement in the design and development of ACTO and report our preliminary patient evaluation of the website. Consumers, led by Cancer Voices NSW, provided the impetus to develop the website. Consumer representative groups were consulted by the research team during the design and development of ACTO, which combines a search engine, trial details, general information about trial participation and question prompt lists. Website use was analysed. A patient evaluation questionnaire was completed at one hospital, one week after exposure to the website. ACTO's main features and content reflect consumer input. In February 2011, it covered 1,042 cancer trials. Since ACTO's public launch in November 2010, until the end of February 2011, the website has had 2,549 new visits and generated 17,833 page views. In a sub-study of 47 patient users, 89% found the website helpful for learning about clinical trials and all respondents thought patients should have access to ACTO. The development of ACTO is an example of consumers working with doctors, researchers and policy makers to improve the information available to people whose lives are affected by cancer and to help them participate in their treatment decisions, including consideration of clinical trial enrolment. Consumer input has ensured that the website is informative, targets consumer priorities and is user-friendly. ACTO serves as a model for other health conditions.

  11. Does Input Enhancement Work for Learning Politeness Strategies?

    Science.gov (United States)

    Khatib, Mohammad; Safari, Mahmood

    2013-01-01

    The present study investigated the effect of input enhancement on the acquisition of English politeness strategies by intermediate EFL learners. Two groups of freshman English majors were randomly assigned to the experimental (enhanced input) group and the control (mere exposure) group. Initially, a TOEFL test and a discourse completion test (DCT)…

  12. Optimal Input Design for Aircraft Parameter Estimation using Dynamic Programming Principles

    Science.gov (United States)

    Morelli, Eugene A.; Klein, Vladislav

    1990-01-01

    A new technique was developed for designing optimal flight test inputs for aircraft parameter estimation experiments. The principles of dynamic programming were used for the design in the time domain. This approach made it possible to include realistic practical constraints on the input and output variables. A description of the new approach is presented, followed by an example for a multiple input linear model describing the lateral dynamics of a fighter aircraft. The optimal input designs produced by the new technique demonstrated improved quality and expanded capability relative to the conventional multiple input design method.

  13. New approach to derive linear power/burnup history input for CANDU fuel codes

    International Nuclear Information System (INIS)

    Lac Tang, T.; Richards, M.; Parent, G.

    2003-01-01

    The fuel element linear power / burnup history is a required input for the ELESTRES code in order to simulate CANDU fuel behavior during normal operating conditions and also to provide input for the accident analysis codes ELOCA and SOURCE. The purpose of this paper is to present a new approach to derive 'true', or at least more realistic linear power / burnup histories. Such an approach can be used to recreate any typical bundle power history if only a single pair of instantaneous values of bundle power and burnup, together with the position in the channel, are known. The histories obtained could be useful to perform more realistic simulations for safety analyses for cases where the reference (overpower) history is not appropriate. (author)

  14. Quantitative Phosphoproteomic Analysis Provides Insight into the Response to Short-Term Drought Stress in Ammopiptanthus mongolicus Roots

    Directory of Open Access Journals (Sweden)

    Huigai Sun

    2017-10-01

    Full Text Available Drought is one of the major abiotic stresses that negatively affect plant growth and development. Ammopiptanthus mongolicus is an ecologically important shrub in the mid-Asia desert region and is used as a model for abiotic tolerance research in trees. Protein phosphorylation participates in the regulation of various biological processes; however, knowledge of the phosphorylation events associated with drought stress signaling and response in plants is still limited. Here, we conducted a quantitative phosphoproteomic analysis of the response of A. mongolicus roots to short-term drought stress. Data are available via the iProx database with project ID IPX0000971000. In total, 7841 phosphorylation sites were found from the 2019 identified phosphopeptides, corresponding to 1060 phosphoproteins. Drought stress resulted in significant changes in the abundance of 103 phosphopeptides, corresponding to 90 differentially-phosphorylated phosphoproteins (DPPs). Motif-x analysis identified two motifs, [pSP] and [RXXpS], from these DPPs. Functional enrichment and protein-protein interaction analysis showed that the DPPs were mainly involved in signal transduction and transcriptional regulation, osmotic adjustment, stress response and defense, RNA splicing and transport, protein synthesis, folding and degradation, and epigenetic regulation. These drought-responsive phosphoproteins and the related signaling and metabolic pathways probably play important roles in drought stress signaling and response in A. mongolicus roots. Our results provide new information for understanding the molecular mechanism of the abiotic stress response in plants at the posttranslational level.
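
    The two reported motifs can be expressed as simple positional checks on a peptide window centered on the phosphosite. A minimal sketch (the example windows and the center index are hypothetical, not taken from the study's data):

```python
# Positional checks for the two reported motifs, with the phosphosite
# (here a phosphoserine, S) at index c of a sequence window.
def matches_sp(window: str, c: int) -> bool:
    """[pSP]: a proline immediately C-terminal to the phosphoserine."""
    return window[c] == "S" and c + 1 < len(window) and window[c + 1] == "P"

def matches_rxxs(window: str, c: int) -> bool:
    """[RXXpS]: an arginine three residues N-terminal to the phosphoserine."""
    return window[c] == "S" and c >= 3 and window[c - 3] == "R"

# Hypothetical 7-residue windows with the phosphosite at index 3.
hits_sp = matches_sp("AAASPKK", 3)      # S followed by P
hits_rxxs = matches_rxxs("RQLSDAA", 3)  # R at relative position -3
```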

  15. A parallel input composite transimpedance amplifier

    Science.gov (United States)

    Kim, D. J.; Kim, C.

    2018-01-01

    A new approach to high-performance current-to-voltage preamplifier design is presented. The design, using multiple operational amplifiers (op-amps), has a parasitic capacitance compensation network and a composite amplifier topology for fast, precise, and low-noise performance. The input stage, consisting of parallel-linked JFET op-amps, and a high-speed bipolar junction transistor (BJT) gain stage driving the output in the composite amplifier topology, cooperating with the capacitance compensation feedback network, ensure wide bandwidth stability in the presence of input capacitance above 40 nF. The design is ideal for any two-probe measurement, including high-impedance transport and scanning tunneling microscopy measurements.

  16. Integrate-and-fire vs Poisson models of LGN input to V1 cortex: noisier inputs reduce orientation selectivity.

    Science.gov (United States)

    Lin, I-Chun; Xing, Dajun; Shapley, Robert

    2012-12-01

    One of the reasons the visual cortex has attracted the interest of computational neuroscience is that it has well-defined inputs. The lateral geniculate nucleus (LGN) of the thalamus is the source of visual signals to the primary visual cortex (V1). Most large-scale cortical network models approximate the spike trains of LGN neurons as simple Poisson point processes. However, many studies have shown that neurons in the early visual pathway are capable of spiking with high temporal precision and their discharges are not Poisson-like. To gain an understanding of how response variability in the LGN influences the behavior of V1, we study response properties of model V1 neurons that receive purely feedforward inputs from LGN cells modeled either as noisy leaky integrate-and-fire (NLIF) neurons or as inhomogeneous Poisson processes. We first demonstrate that the NLIF model is capable of reproducing many experimentally observed statistical properties of LGN neurons. Then we show that a V1 model in which the LGN input to a V1 neuron is modeled as a group of NLIF neurons produces higher orientation selectivity than the one with Poisson LGN input. The second result implies that statistical characteristics of LGN spike trains are important for V1's function. We conclude that physiologically motivated models of V1 need to include more realistic LGN spike trains that are less noisy than inhomogeneous Poisson processes.
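
    The contrast between Poisson and integrate-and-fire spike trains can be illustrated by comparing trial-to-trial spike-count variability (the Fano factor): a Poisson process has a Fano factor near 1, while a drift-dominated noisy leaky integrate-and-fire neuron spikes more regularly. A rough sketch under assumed parameters (the time constant, drive and noise levels are illustrative, not from the study):

```python
import random

random.seed(0)

def fano(counts):
    """Trial-to-trial Fano factor: variance / mean of spike counts."""
    n, m = len(counts), sum(counts) / len(counts)
    return sum((c - m) ** 2 for c in counts) / (n - 1) / m

def poisson_counts(rate, T, trials, dt=1e-3):
    """Spike counts from a homogeneous Poisson process, thinned per time step."""
    return [sum(random.random() < rate * dt for _ in range(int(T / dt)))
            for _ in range(trials)]

def nlif_counts(mu, sigma, T, trials, dt=1e-3, tau=0.02, vth=1.0):
    """Spike counts from a noisy leaky integrate-and-fire neuron (Euler steps)."""
    out = []
    for _ in range(trials):
        v, n = 0.0, 0
        for _ in range(int(T / dt)):
            v += dt * (-v / tau + mu) + sigma * dt ** 0.5 * random.gauss(0, 1)
            if v >= vth:          # threshold crossing: spike and reset
                v, n = 0.0, n + 1
        out.append(n)
    return out

f_poisson = fano(poisson_counts(rate=20.0, T=2.0, trials=100))
f_nlif = fano(nlif_counts(mu=100.0, sigma=5.0, T=2.0, trials=100))
```

With these parameters the integrate-and-fire counts come out markedly sub-Poisson (f_nlif well below f_poisson), the "less noisy than Poisson" property the abstract refers to.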

  17. Graphical user interface for input output characterization of single variable and multivariable highly nonlinear systems

    Directory of Open Access Journals (Sweden)

    Shahrukh Adnan Khan M. D.

    2017-01-01

    Full Text Available This paper presents a Graphical User Interface (GUI) software utility for the input/output characterization of single-variable and multivariable nonlinear systems by obtaining the sinusoidal input describing function (SIDF) of the plant. The software utility is developed in the MATLAB R2011a environment. The developed GUI places no restriction on the nonlinearity type, arrangement or system order, provided that the output(s) of the system can be obtained either through simulation or experiments. An insight into the GUI and its features is presented in this paper, and example problems from both single-variable and multivariable cases are demonstrated. The formulation of the input/output behavior of the system is discussed, and the nucleus of the MATLAB commands underlying the user interface is outlined. The industries that would benefit from this software utility include, but are not limited to, aerospace, defense technology, robotics and automotive.
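
    The SIDF computation at the core of such a utility is the gain of the first harmonic of the nonlinearity's response to a sinusoid. The sketch below (a generic illustration, not the paper's MATLAB code) evaluates it numerically for an ideal unit-slope saturation, for which a closed form is known:

```python
import math

def sat(x, delta=1.0):
    """Ideal unit-slope saturation with limits +/- delta."""
    return max(-delta, min(delta, x))

def sidf_numeric(A, n=100_000):
    """Describing-function gain N(A): first-harmonic Fourier coefficient
    of sat(A*sin(theta)) divided by the input amplitude A."""
    s = 0.0
    for i in range(n):
        th = 2 * math.pi * (i + 0.5) / n        # midpoint rule over one period
        s += sat(A * math.sin(th)) * math.sin(th)
    return (s * 2 * math.pi / n) / (math.pi * A)

def sidf_analytic(A, delta=1.0):
    """Closed-form SIDF of the saturation (N(A) = 1 for A <= delta)."""
    if A <= delta:
        return 1.0
    r = delta / A
    return (2 / math.pi) * (math.asin(r) + r * math.sqrt(1 - r * r))
```

The numeric route generalizes to arbitrary nonlinearities (and, with simulated outputs, to the black-box characterization the GUI performs); the analytic formula here just validates it.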

  18. Prediction-Based Control for Nonlinear Systems with Input Delay

    Directory of Open Access Journals (Sweden)

    I. Estrada-Sánchez

    2017-01-01

    Full Text Available This work has two primary objectives. First, it presents a state prediction strategy for a class of nonlinear Lipschitz systems subject to constant time delay in the input signal. As a result of a suitable change of variable, the state predictor asymptotically provides the value of the state τ units of time ahead. Second, it proposes a solution to the stabilization and trajectory tracking problems for the considered class of systems using predicted states. The predictor-controller convergence is proved by considering a complete Lyapunov functional. The proposed predictor-based controller strategy is evaluated using numerical simulations.
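
    For a scalar linear analogue of the problem, the predictor has a closed form that uses only the current state and the stored input history over the last τ seconds. A sketch (the system, gains and input signal are illustrative; the paper treats nonlinear Lipschitz systems):

```python
import math

# Scalar system with input delay: x'(t) = a*x(t) + b*u(t - tau).
a, b, tau, dt = -0.5, 1.0, 0.4, 1e-4
u = lambda t: math.sin(t)              # illustrative input signal

# Simulate 2 s by forward Euler (zero input before t = tau).
x = [0.0]
for k in range(int(2.0 / dt)):
    t = k * dt
    ud = u(t - tau) if t >= tau else 0.0
    x.append(x[-1] + dt * (a * x[-1] + b * ud))

# Predictor at t0:
#   x(t0 + tau) = e^{a tau} x(t0) + integral_{t0-tau}^{t0} e^{a (t0 - s)} b u(s) ds,
# i.e. only the present state and the last tau seconds of input are needed.
t0 = 1.0
k0 = int(t0 / dt)
hist = sum(math.exp(a * (t0 - i * dt)) * b * u(i * dt) * dt
           for i in range(int((t0 - tau) / dt), k0))
x_pred = math.exp(a * tau) * x[k0] + hist
x_true = x[int((t0 + tau) / dt)]       # simulated state tau seconds later
```

Feeding x_pred instead of x into a stabilizing feedback law is exactly the predictor-based compensation idea: the controller acts on the state the plant will have when the delayed input arrives.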

  19. Constituency Input into Budget Management.

    Science.gov (United States)

    Miller, Norman E.

    1995-01-01

    Presents techniques for ensuring constituency involvement in district- and site-level budget management. Outlines four models for securing constituent input and focuses on strategies to orchestrate the more complex model for staff and community participation. Two figures are included. (LMI)

  20. A quantitative X-ray diffraction inventory of the tephra and volcanic glass inputs into the Holocene marine sediment archives off Iceland: A contribution to V.A.S.T.

    Science.gov (United States)

    Andrews, John T.; Kristjansdottir, Greta B.; Eberl, Dennis D.; Jennings, Anne E.

    2013-01-01

    This paper re-evaluates how well quantitative x-ray diffraction (qXRD) can be used as an exploratory method of the weight percentage (wt%) of volcaniclastic sediment, and to identify tephra events in marine cores. In the widely used RockJock v6 software programme, qXRD tephra and glass standards include the rhyodacite White River tephra (Alaska), a rhyolitic tephra (Hekla-4) and the basaltic Saksunarvatn tephra. Experiments of adding known wt% of tephra to felsic bedrock samples indicated that additions ≥10 wt% are accurately detected, but reliable estimates of lesser amounts are masked by amorphous material produced by milling. Volcaniclastic inputs range between 20 and 50 wt%. Primary tephra events are identified as peaks in residual qXRD glass wt% from fourth-order polynomial fits. In cores where tephras have been identified by shard counts in the > 150 µm fraction, there is a positive correlation (validation) with peaks in the wt% glass estimated by qXRD. Geochemistry of tephra shards confirms the presence of several Hekla-sourced tephras in cores B997-317PC1 and -319PC2 on the northern Iceland shelf. In core B997-338 (north-west Iceland), there are two rhyolitic tephras separated by ca. 100 cm with uncorrected radiocarbon dates on articulated shells of around 13 000 yr B.P. These tephras may be correlatives of the Borrobol and Penifiler tephras found in Scotland. The number of Holocene tephra events per 1000 yr was estimated from qXRD on 16 cores and showed a bimodal distribution with an increased number of events in both the late and early Holocene.

  1. Development of MIDAS/SMR Input Deck for SMART

    International Nuclear Information System (INIS)

    Cho, S. W.; Oh, H. K.; Lee, J. M.; Lee, J. H.; Yoo, K. J.; Kwun, S. K.; Hur, H.

    2010-01-01

    The objective of this study is to develop a basic MIDAS/SMR code input deck for severe accidents by simulating the steady state of the SMART plant. The SMART plant is an integrated reactor developed by KAERI. For the assessment of reactor safety and the severe accident management strategy, it is necessary to simulate severe accidents using the MIDAS/SMR code, which is being developed by KAERI. The input deck of the MIDAS/SMR code for the SMART plant is prepared to simulate severe accident sequences for users who are not familiar with the code. A steady state is obtained and the results are compared with design values. The input deck will be improved through simulation of the DBAs and severe accidents. The base input deck of the MIDAS/SMR code can then be used to simulate severe accident scenarios. Source terms and hydrogen generation can be analyzed through simulation of severe accidents. The information gained from analyses of severe accidents is expected to be helpful in developing the severe accident management strategy.

  2. Road simulation for four-wheel vehicle whole input power spectral density

    Science.gov (United States)

    Wang, Jiangbo; Qiang, Baomin

    2017-05-01

    The vibration of a running vehicle comes mainly from the road and influences ride performance, so simulation of the road roughness power spectral density is of great significance for analyzing automobile suspension system parameters and evaluating ride comfort. Firstly, based on the mathematical model of road roughness power spectral density, this paper established the integral white noise method for generating random road profiles. Then, in the MATLAB/Simulink environment, following the usual progression of automobile suspension research from a simple two-degree-of-freedom single-wheel vehicle model to a complex multi-degree-of-freedom vehicle model, this paper built a simple single-excitation input simulation model. Finally, a spectrum matrix was used to build a whole-vehicle excitation input simulation model. This simulation method is based on reliable and accurate mathematical theory and can be applied to random road simulation for any specified spectrum, providing a pavement excitation model and a foundation for vehicle ride performance research and vibration simulation.
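
    The integral (filtered) white noise idea can be sketched as a first-order shaping filter driven by unit white noise. All parameter values below are assumptions chosen to resemble a medium-rough road at 20 m/s, not the paper's data:

```python
import math, random

random.seed(1)

def road_profile(v=20.0, T=10.0, dt=1e-3, f0=0.0628, n0=0.1, Gq0=256e-6):
    """Road roughness time history from a filtered-white-noise model
        dq/dt = -2*pi*f0*v*q + 2*pi*n0*sqrt(Gq0*v)*w(t).
    v: vehicle speed [m/s]; f0: lower cutoff term; n0: reference spatial
    frequency [1/m]; Gq0: roughness coefficient at n0 (assumed values)."""
    q, out = 0.0, []
    for _ in range(int(T / dt)):
        w = random.gauss(0.0, 1.0) / math.sqrt(dt)   # unit-intensity white noise
        q += dt * (-2 * math.pi * f0 * v * q
                   + 2 * math.pi * n0 * math.sqrt(Gq0 * v) * w)
        out.append(q)
    return out

profile = road_profile()
rms = (sum(x * x for x in profile) / len(profile)) ** 0.5   # roughness RMS [m]
```

The output of this filter (one sample path per wheel, correlated through a spectrum matrix for the four-wheel case) is what feeds the suspension model as the pavement excitation.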

  3. Learning Complex Grammar in the Virtual Classroom: A Comparison of Processing Instruction, Structured Input, Computerized Visual Input Enhancement, and Traditional Instruction

    Science.gov (United States)

    Russell, Victoria

    2012-01-01

    This study investigated the effects of processing instruction (PI) and structured input (SI) on the acquisition of the subjunctive in adjectival clauses by 92 second-semester distance learners of Spanish. Computerized visual input enhancement (VIE) was combined with PI and SI in an attempt to increase the salience of the targeted grammatical form…

  4. Effects of shade and input management on economic performance of small-scale Peruvian coffee systems

    NARCIS (Netherlands)

    Jezeer, Rosalien E.|info:eu-repo/dai/nl/374336865; Ferreira Dos Santos, Maria Joao|info:eu-repo/dai/nl/371571979; Boot, René G.A.|info:eu-repo/dai/nl/069412928; Junginger, Martin|info:eu-repo/dai/nl/202130703; Verweij, Pita A.|info:eu-repo/dai/nl/145431843

    2018-01-01

    Tropical agroforestry systems provide a number of ecosystem services that might help sustain the production of multiple crops, improve farmers’ livelihoods and conserve biodiversity. A major drawback of agroforestry coffee systems is the perceived lower economic performance compared to high-input

  5. Discrete Input Signaling for MISO Visible Light Communication Channels

    KAUST Repository

    Arfaoui, Mohamed Amine; Rezki, Zouheir; Ghrayeb, Ali; Alouini, Mohamed-Slim

    2017-01-01

    In this paper, we study the achievable secrecy rate of visible light communication (VLC) links for discrete input distributions. We consider single user single eavesdropper multiple-input single-output (MISO) links. In addition, both beamforming

  6. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

    Full Text Available Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework provides elements that comprise the four components. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  7. Pseudo-BINPUT, a free format input package for Fortran programmes

    International Nuclear Information System (INIS)

    Gubbins, M.E.

    1977-11-01

    Pseudo-BINPUT is an input package for reading free format data under codeword control in a FORTRAN programme. To a large degree it mimics in function the Winfrith Subroutine Library routine BINPUT. By using calls of the data input package DECIN to mimic the input routine BINPUT, Pseudo-BINPUT combines some of the advantages of both systems. (U.K.)
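
    The codeword-controlled, free-format style of input can be mimicked in a few lines. A sketch in Python (the codeword names and the '*' comment convention are hypothetical, not BINPUT's actual syntax):

```python
def _conv(tok):
    """Best-effort token conversion: int, then float, else keep the raw token."""
    for cast in (int, float):
        try:
            return cast(tok)
        except ValueError:
            pass
    return tok

def read_codeword_input(text):
    """Parse free-format input keyed by codewords, one 'CODEWORD v1 v2 ...'
    record per line; '*' starts a comment (hypothetical convention)."""
    data = {}
    for line in text.splitlines():
        line = line.split("*", 1)[0].strip()   # strip comments and whitespace
        if not line:
            continue
        key, *vals = line.split()
        data[key.upper()] = [_conv(v) for v in vals]
    return data

deck = read_codeword_input("""
MESH 10 20 30      * node counts
POWER 1.5E3
TITLE demo case
""")
```

Because records are keyed by codeword rather than by column position, the order of records and the spacing within them do not matter, which is the main usability gain of this input style over fixed-format decks.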

  8. Distinctiveness and Bidirectional Effects in Input Enhancement for Vocabulary Learning

    Science.gov (United States)

    Barcroft, Joe

    2003-01-01

    This study examined input enhancement and second language (L2) vocabulary learning while exploring the role of "distinctiveness," the degree to which an item in the input diverges from the form in which other items in the input are presented, with regard to the nature and direction of the effects of enhancement. In this study,…

  9. Quantitative information in medical imaging

    International Nuclear Information System (INIS)

    Deconinck, F.

    1985-01-01

    When developing new imaging or image processing techniques, one constantly has in mind that the new technique should provide a better, or more optimal, answer to medical tasks than existing techniques do. 'Better' or 'more optimal' imply some kind of standard by which one can measure imaging or image processing performance. The choice of a particular imaging modality to answer a diagnostic task, such as the detection of coronary artery stenosis, is also based on an implicit optimisation of performance criteria. Performance is measured by the ability to provide information about an object (patient) to the person (referring doctor) who ordered a particular task. In medical imaging the task is generally to find quantitative information on bodily function (biochemistry, physiology) and structure (histology, anatomy). In medical imaging, a wide range of techniques is available. Each technique has its own characteristics. The techniques discussed in this paper are: nuclear magnetic resonance, X-ray fluorescence, scintigraphy, positron emission tomography, applied potential tomography, computerized tomography, and Compton tomography. This paper provides a framework for the comparison of imaging performance, based on the way the quantitative information flow is altered by the characteristics of the modality.

  10. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    Science.gov (United States)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-06-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M→A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology by constructing low
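
    The graph-theoretic geodesic step of such an isometric, Isomap-style reduction can be sketched on a toy manifold: points sampled on a half-circle, a k-nearest-neighbour graph, and Floyd-Warshall shortest paths recovering arc length rather than straight-line distance. This is a generic illustration, not the authors' implementation:

```python
import math

# Samples on a half-circle: a 1-D manifold embedded in R^2.
n, r, k = 40, 1.0, 2
pts = [(r * math.cos(math.pi * i / (n - 1)), r * math.sin(math.pi * i / (n - 1)))
       for i in range(n)]
d = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])

# k-nearest-neighbour graph, weighted by Euclidean distance.
INF = float("inf")
G = [[INF] * n for _ in range(n)]
for i in range(n):
    G[i][i] = 0.0
    nbrs = sorted(range(n), key=lambda j: d(pts[i], pts[j]))[1:k + 1]
    for j in nbrs:
        G[i][j] = G[j][i] = d(pts[i], pts[j])

# Floyd-Warshall shortest paths approximate geodesic (on-manifold) distances.
for m in range(n):
    for i in range(n):
        for j in range(n):
            if G[i][m] + G[m][j] < G[i][j]:
                G[i][j] = G[i][m] + G[m][j]

geodesic = G[0][n - 1]           # approximates the arc length pi * r
euclid = d(pts[0], pts[n - 1])   # the chord length 2 * r
```

The gap between the geodesic and Euclidean distances of the endpoints is exactly what a linear reduction would miss; embedding the geodesic distance matrix (e.g. by classical MDS) yields the low-dimensional set A.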

  11. A non-linear dimension reduction methodology for generating data-driven stochastic input models

    International Nuclear Information System (INIS)

    Ganapathysubramanian, Baskar; Zabaras, Nicholas

    2008-01-01

    Stochastic analysis of random heterogeneous media (polycrystalline materials, porous media, functionally graded materials) provides information of significance only if realistic input models of the topology and property variations are used. This paper proposes a framework to construct such input stochastic models for the topology and thermal diffusivity variations in heterogeneous media using a data-driven strategy. Given a set of microstructure realizations (input samples) generated from given statistical information about the medium topology, the framework constructs a reduced-order stochastic representation of the thermal diffusivity. This problem of constructing a low-dimensional stochastic representation of property variations is analogous to the problem of manifold learning and parametric fitting of hyper-surfaces encountered in image processing and psychology. Denote by M the set of microstructures that satisfy the given experimental statistics. A non-linear dimension reduction strategy is utilized to map M to a low-dimensional region, A. We first show that M is a compact manifold embedded in a high-dimensional input space R^n. An isometric mapping F from M to a low-dimensional, compact, connected set A ⊂ R^d (d ≪ n) is constructed. Given only a finite set of samples of the data, the methodology uses arguments from graph theory and differential geometry to construct the isometric transformation F:M→A. Asymptotic convergence of the representation of M by A is shown. This mapping F serves as an accurate, low-dimensional, data-driven representation of the property variations. The reduced-order model of the material topology and thermal diffusivity variations is subsequently used as an input in the solution of stochastic partial differential equations that describe the evolution of dependent variables. A sparse grid collocation strategy (Smolyak algorithm) is utilized to solve these stochastic equations efficiently. We showcase the methodology

  12. Athena: Providing Insight into the History of the Universe

    Science.gov (United States)

    Murphy, Gloria A.

    2010-01-01

    The American Institute of Aeronautics and Astronautics has provided a Request for Proposal which calls for a manned mission to a Near-Earth Object. It is the goal of Team COLBERT to respond to this request by providing a reusable system that can serve as a solid stepping stone for future manned trips to Mars and beyond. Although Team COLBERT consists only of Aerospace Engineering students, achieving this feat requires the use of Systems Engineering. Tools and processes from Systems Engineering provide quantitative and semi-quantitative means for making design decisions and evaluating items such as budgets and schedules. This paper provides an in-depth look at some of the Systems Engineering processes employed and steps through the design process of a Human Asteroid Exploration System.

  13. Development of Input/Output System for the Reactor Transient Analysis System (RETAS)

    International Nuclear Information System (INIS)

    Suh, Jae Seung; Kang, Doo Hyuk; Cho, Yeon Sik; Ahn, Seung Hoon; Cho, Yong Jin

    2009-01-01

    A Korea Institute of Nuclear Safety Reactor Transient Analysis System (KINS-RETAS) aims at providing a realistic prediction of core and RCS response to the potential or actual event scenarios in Korean nuclear power plants (NPPs). The thermal hydraulic system code MARS is the pivot code of the RETAS, used to predict thermal hydraulic (TH) behaviors in the core and associated systems. MARS alone can be applied to many types of transients, but is sometimes coupled with other codes developed for different objectives. Many tools have been developed to aid users in preparing input and displaying transient information and output data. Graphical User Interfaces (GUIs) that help prepare input decks include SNAP (Gitnick, 1998) and VISA (K.D. Kim, 2007), and display aids include eFAST (KINS, 2007). The input deck builders allow the user to create a functional diagram of the plant, pictorially on the screen. The functional diagram, when annotated with control volume and junction numbers, is a nodalization diagram. Data required for an input deck are entered for volumes and junctions through a mouse-driven menu and pop-up dialogs; after the information is complete, an input deck is generated. Display GUIs show data from MARS calculations, either during or after the transient. The RETAS requires the user to first generate a set of 'input': two-dimensional pictures of the plant on which some of the data are displayed either numerically or with a color map. The RETAS can generate XY-plots of the data. Time histories of plant conditions can be seen via the plots or through the RETAS's replay mode. The user input was combined with design input from MARS developers and experts from both the GUI and ergonomics fields. A partial list of capabilities follows. - 3D display for neutronics. - Easier method (less user time and effort) to generate 'input' for the 3D displays. - Detailed view of data at volume or

  14. Development of Input/Output System for the Reactor Transient Analysis System (RETAS)

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Jae Seung; Kang, Doo Hyuk; Cho, Yeon Sik [ENESYS, Daejeon (Korea, Republic of); Ahn, Seung Hoon; Cho, Yong Jin [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2009-05-15

    A Korea Institute of Nuclear Safety Reactor Transient Analysis System (KINS-RETAS) aims at providing a realistic prediction of core and RCS response to the potential or actual event scenarios in Korean nuclear power plants (NPPs). The thermal hydraulic system code MARS is the pivot code of the RETAS, used to predict thermal hydraulic (TH) behaviors in the core and associated systems. MARS alone can be applied to many types of transients, but is sometimes coupled with other codes developed for different objectives. Many tools have been developed to aid users in preparing input and displaying transient information and output data. Graphical User Interfaces (GUIs) that help prepare input decks include SNAP (Gitnick, 1998) and VISA (K.D. Kim, 2007), and display aids include eFAST (KINS, 2007). The input deck builders allow the user to create a functional diagram of the plant, pictorially on the screen. The functional diagram, when annotated with control volume and junction numbers, is a nodalization diagram. Data required for an input deck are entered for volumes and junctions through a mouse-driven menu and pop-up dialogs; after the information is complete, an input deck is generated. Display GUIs show data from MARS calculations, either during or after the transient. The RETAS requires the user to first generate a set of 'input': two-dimensional pictures of the plant on which some of the data are displayed either numerically or with a color map. The RETAS can generate XY-plots of the data. Time histories of plant conditions can be seen via the plots or through the RETAS's replay mode. The user input was combined with design input from MARS developers and experts from both the GUI and ergonomics fields. A partial list of capabilities follows. - 3D display for neutronics. - Easier method (less user time and effort) to generate 'input' for the 3D displays. - Detailed view

  15. Evaluating the efficiency of municipalities in collecting and processing municipal solid waste: a shared input DEA-model.

    Science.gov (United States)

    Rogge, Nicky; De Jaeger, Simon

    2012-10-01

    This paper proposes an adjusted "shared-input" version of the popular efficiency measurement technique Data Envelopment Analysis (DEA) that enables evaluating municipal waste collection and processing performance in settings in which one input (waste costs) is shared among the treatment efforts for multiple municipal solid waste fractions. The main advantage of this version of DEA is that it provides not only an estimate of each municipality's overall cost efficiency but also estimates of its cost efficiency in the treatment of the different fractions of municipal solid waste (MSW). To illustrate the practical usefulness of the shared-input DEA model, we apply it to data on 293 municipalities in Flanders, Belgium, for the year 2008. Copyright © 2012 Elsevier Ltd. All rights reserved.
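
    In the special case of a single input and a single output, constant-returns DEA efficiency reduces to each unit's output/input ratio relative to the best observed ratio; the full shared-input model additionally requires a linear-programming solver. A degenerate single-fraction sketch with hypothetical municipalities:

```python
# Hedged sketch: with one input (cost) and one output (tonnes collected),
# constant-returns DEA efficiency is each unit's output/input ratio divided
# by the best ratio observed. Data are hypothetical municipalities.
munis = {
    "A": {"cost": 100.0, "tonnes": 500.0},
    "B": {"cost": 80.0,  "tonnes": 480.0},
    "C": {"cost": 120.0, "tonnes": 480.0},
}
ratios = {m: v["tonnes"] / v["cost"] for m, v in munis.items()}
best = max(ratios.values())
efficiency = {m: ratio / best for m, ratio in ratios.items()}
```

Here B defines the efficient frontier (efficiency 1.0) and the others are scored against it; the shared-input extension splits the single cost figure across waste fractions inside the optimization rather than assuming a fixed allocation.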

  16. Accountability Requirements in the Cloud Provider Chain

    Directory of Open Access Journals (Sweden)

    Martin Gilje Jaatun

    2018-04-01

    Full Text Available In order to be responsible stewards of other people’s data, cloud providers must be accountable for their data handling practices. The potential long provider chains in cloud computing introduce additional accountability challenges, with many stakeholders involved. Symmetry is very important in any requirements’ elicitation activity, since input from diverse stakeholders needs to be balanced. This article ventures to answer the question “How can one create an accountable cloud service?” by examining requirements which must be fulfilled to achieve an accountability-based approach, based on interaction with over 300 stakeholders.

  17. Quantitative learning strategies based on word networks

    Science.gov (United States)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way that vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, learning process optimization will have a significant impact on English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and further investigate strategies for English learning. In this paper, quantitative English learning strategies based on word networks and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for unlearned words and the dynamic updating of the network are studied and analyzed. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness. In particular, the optimized-weight-first strategy and the segmented strategies outperform the other strategies. The results provide opportunities for researchers and practitioners to reconsider the way English is taught and to design vocabularies quantitatively by balancing efficiency and learning costs based on the word network.
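
    A minimal version of such a frequency-plus-connectivity weighting can be sketched as follows (the word list, frequencies, co-occurrence edges and the mixing weight alpha are all hypothetical, not the paper's model):

```python
# Score unlearned words higher when they are frequent and well connected
# to words already learned; pick the highest-scoring word next.
freq = {"make": 900, "decision": 300, "quantum": 5, "cat": 400}
edges = {("make", "decision"), ("make", "cat")}   # co-occurrence network
learned = {"make"}

def connected(w, v):
    return (w, v) in edges or (v, w) in edges

def weight(w, alpha=0.5):
    f = freq[w] / max(freq.values())                 # normalized frequency
    c = sum(1 for v in learned if connected(w, v))   # links to learned words
    k = len(learned)
    return alpha * f + (1 - alpha) * (c / k if k else 0.0)

next_word = max((w for w in freq if w not in learned), key=weight)
```

Recomputing the weights after each newly learned word gives the dynamic network updating the abstract describes: early choices reshape which words become cheap to learn next.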

  18. Input-variable sensitivity assessment for sediment transport relations

    Science.gov (United States)

    Fernández, Roberto; Garcia, Marcelo H.

    2017-09-01

    A methodology to assess input-variable sensitivity for sediment transport relations is presented. The Mean Value First Order Second Moment Method (MVFOSM) is applied to two bed load transport equations showing that it may be used to rank all input variables in terms of how their specific variance affects the overall variance of the sediment transport estimation. In sites where data are scarce or nonexistent, the results obtained may be used to (i) determine what variables would have the largest impact when estimating sediment loads in the absence of field observations and (ii) design field campaigns to specifically measure those variables for which a given transport equation is most sensitive; in sites where data are readily available, the results would allow quantifying the effect that the variance associated with each input variable has on the variance of the sediment transport estimates. An application of the method to two transport relations using data from a tropical mountain river in Costa Rica is implemented to exemplify the potential of the method in places where input data are limited. Results are compared against Monte Carlo simulations to assess the reliability of the method and validate its results. For both of the sediment transport relations used in the sensitivity analysis, accurate knowledge of sediment size was found to have more impact on sediment transport predictions than precise knowledge of other input variables such as channel slope and flow discharge.

  19. Westinghouse corporate development of a decision software program for Radiological Evaluation Decision Input (REDI)

    International Nuclear Information System (INIS)

    Bush, T.S.

    1995-01-01

    In December 1992, the Department of Energy (DOE) implemented the DOE Radiological Control Manual (RCM). Westinghouse Idaho Nuclear Company, Inc. (WINCO) submitted an implementation plan showing how compliance with the manual would be achieved. This implementation plan was approved by DOE in November 1992. Although WINCO had already been working under a similar Westinghouse RCM, the DOE RCM brought some new and challenging requirements. One such requirement was that of having procedure writers and job planners create the radiological input in work control procedures. Until this time, that information was being provided by radiological engineering or a radiation safety representative. As a result of this requirement, Westinghouse developed the Radiological Evaluation Decision Input (REDI) program

  20. Westinghouse corporate development of a decision software program for Radiological Evaluation Decision Input (REDI)

    Energy Technology Data Exchange (ETDEWEB)

    Bush, T.S. [Westinghouse Idaho Nuclear Co., Inc., Idaho Falls, ID (United States)]

    1995-03-01

    In December 1992, the Department of Energy (DOE) implemented the DOE Radiological Control Manual (RCM). Westinghouse Idaho Nuclear Company, Inc. (WINCO) submitted an implementation plan showing how compliance with the manual would be achieved. This implementation plan was approved by DOE in November 1992. Although WINCO had already been working under a similar Westinghouse RCM, the DOE RCM brought some new and challenging requirements. One such requirement was that of having procedure writers and job planners create the radiological input in work control procedures. Until this time, that information was being provided by radiological engineering or a radiation safety representative. As a result of this requirement, Westinghouse developed the Radiological Evaluation Decision Input (REDI) program.

  1. Efficient round-robin multicast scheduling for input-queued switches

    DEFF Research Database (Denmark)

    Rasmussen, Anders; Yu, Hao; Ruepp, Sarah Renée

    2014-01-01

    The input-queued (IQ) switch architecture is favoured for designing multicast high-speed switches because of its scalability and low implementation complexity. However, using the first-in-first-out (FIFO) queueing discipline at each input of the switch may cause the head-of-line (HOL) blocking problem. Using a separate queue for each output port at an input to reduce the HOL blocking, that is, the virtual output queuing discipline, increases the implementation complexity, which limits the scalability. Given the increasing link speed and network capacity, a low-complexity yet efficient multicast...... by means of queue look-ahead. Simulation results demonstrate that this FIFO-based IQ multicast architecture is able to achieve significant improvements in terms of multicast latency requirements by searching through a small number of cells beyond the HOL cells in the input queues. Furthermore, hardware...

  2. Barrier island forest ecosystem: role of meteorologic nutrient inputs.

    Science.gov (United States)

    Art, H W; Bormann, F H; Voigt, G K; Woodwell, G M

    1974-04-05

    The Sunken Forest, located on Fire Island, a barrier island in the Atlantic Ocean off Long Island, New York, is an ecosystem in which most of the basic cation input is in the form of salt spray. This meteorologic input is sufficient to compensate for the lack of certain nutrients in the highly weathered sandy soils. In other ecosystems these nutrients are generally supplied by weathering of soil particles. The compensatory effect of meteorologic input allows for primary production rates in the Sunken Forest similar to those of inland temperate forests.

  3. Jointness through vessel capacity input in a multispecies fishery

    DEFF Research Database (Denmark)

    Hansen, Lars Gårn; Jensen, Carsten Lynge

    2014-01-01

    ...are typically modeled as either independent single species fisheries or using standard multispecies functional forms characterized by jointness in inputs. We argue that production of each species is essentially independent but that jointness may be caused by competition for fixed but allocable input of vessel capacity. We develop a fixed but allocatable input model of purse seine fisheries capturing this particular type of jointness. We estimate the model for the Norwegian purse seine fishery and find that it is characterized by nonjointness, while estimations for this fishery using the standard models imply......

  4. Soil mineral assemblage influences on microbial communities and carbon cycling under fresh organic matter input

    Science.gov (United States)

    Finley, B. K.; Schwartz, E.; Koch, B.; Dijkstra, P.; Hungate, B. A.

    2017-12-01

    The interactions between soil mineral assemblages and microbial communities are important drivers of soil organic carbon (SOC) cycling and storage, although the mechanisms driving these interactions remain unclear. There is increasing evidence supporting the importance of associations with poorly crystalline, short-range order (SRO) minerals in protecting SOC from microbial utilization. However, how the microbial processing of SRO-associated SOC may be influenced by fresh organic matter inputs (priming) remains poorly understood. The influence of SRO minerals on soil microbial community dynamics is uncertain as well. Therefore, we conducted a priming incubation by adding either a simulated root exudate mixture or conifer needle litter to three soils from a mixed-conifer ecosystem. The parent materials of the soils were andesite, basalt, and granite, which decrease in SRO mineral content, respectively. We also conducted a parallel quantitative stable isotope probing incubation by adding 18O-labelled water to the soils to isotopically label microbial DNA in situ. This allowed us to characterize and identify the active bacterial and archaeal community and taxon-specific growth under fresh organic matter input. While the granite soil (lowest SRO content) had the largest total mineralization, it showed the least priming. The andesite and basalt soils (greater SRO content) had lower total respiration but greater priming. Across all treatments, the granite soil, while having the lowest species richness of the entire community (249 taxa, both active and inactive), had a larger active community (90%) in response to new SOC input. The andesite and basalt soils, while having greater total species richness of the entire community at 333 and 325 taxa, respectively, had fewer active taxa in response to new C compared to the granite soil (30% and 49% of taxa, respectively). These findings suggest that the soil mineral assemblage is an important driver of SOC cycling under fresh

  5. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
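As an illustration of one such quantitative characteristic (this sketch does not use the Quantiprot API itself; the sequence below is invented), the Zipf's-law coefficient of a sequence's n-gram distribution can be estimated with a log-log least-squares fit:

```python
import math
from collections import Counter

def zipf_coefficient(sequence, n=2):
    """Estimate the Zipf's-law exponent of the n-gram distribution:
    sort n-gram counts by rank and fit log(count) ~ -coef * log(rank)
    by ordinary least squares."""
    grams = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
    counts = sorted(Counter(grams).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(counts) + 1)]
    ys = [math.log(c) for c in counts]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope   # Zipf's law: count ~ rank**(-coef)

# a short protein-like sequence (illustrative only)
seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
coef = zipf_coefficient(seq, n=2)
```

Because more frequent bigrams occupy lower ranks, the fitted coefficient is positive; for real protein families, such scalar characteristics populate the feature space the abstract describes.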

  6. Discretizing LTI Descriptor (Regular) Differential Input Systems with Consistent Initial Conditions

    Directory of Open Access Journals (Sweden)

    Athanasios D. Karageorgos

    2010-01-01

    Full Text Available A technique for efficiently discretizing the solution of a linear descriptor (regular) differential input system with consistent initial conditions and time-invariant coefficients (LTI) is introduced and fully discussed. Additionally, an upper bound for the error ‖x̄(kT) − x̄_k‖ that derives from the discretization procedure is also provided. Practically speaking, we are interested in such kinds of systems, since they are inherent in many physical, economic, and engineering phenomena.
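For the slow (ordinary ODE) subsystem of a regular descriptor system, discretization reduces to standard zero-order-hold sampling of the exact solution. The scalar sketch below is an assumption-laden simplification of that idea (it is not the paper's matrix-pencil machinery): it computes the exact one-step map and checks it against a fine Euler integration.

```python
import math

def zoh_discretize(a, b, T):
    """Zero-order-hold (ZOH) discretization of the scalar system
    x'(t) = a*x(t) + b*u(t): sampling the exact solution with u held
    constant over each period T gives x[k+1] = Ad*x[k] + Bd*u[k]."""
    Ad = math.exp(a * T)
    Bd = (Ad - 1.0) / a * b
    return Ad, Bd

# one exact ZOH step versus a fine explicit-Euler integration of the ODE
a, b, T, x0, u = -2.0, 1.0, 0.1, 1.0, 0.5
Ad, Bd = zoh_discretize(a, b, T)
x_discrete = Ad * x0 + Bd * u

x, n = x0, 100000
dt = T / n
for _ in range(n):
    x += dt * (a * x + b * u)
```

The discretized step matches the continuous solution to within the Euler integration error, which is the kind of discrepancy the paper's error bound quantifies in the matrix case.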

  7. Micro-Level Management of Agricultural Inputs: Emerging Approaches

    Directory of Open Access Journals (Sweden)

    Jonathan Weekley

    2012-12-01

    Full Text Available Through the development of superior plant varieties that benefit from high agrochemical inputs and irrigation, the agricultural Green Revolution has doubled crop yields, yet introduced unintended impacts on the environment. An expected 50% growth in world population during the 21st century demands novel integration of advanced technologies and low-input production systems based on soil and plant biology, targeting precision delivery of inputs synchronized with growth stages of crop plants. Further, successful systems will integrate subsurface water, air and nutrient delivery, real-time soil parameter data and computer-based decision-making to mitigate plant stress and actively manipulate microbial rhizosphere communities that stimulate productivity. Such an approach will ensure food security and mitigate impacts of climate change.

  8. HAADF-STEM atom counting in atom probe tomography specimens: Towards quantitative correlative microscopy.

    Science.gov (United States)

    Lefebvre, W; Hernandez-Maldonado, D; Moyon, F; Cuvilly, F; Vaudolon, C; Shinde, D; Vurpillot, F

    2015-12-01

    The geometry of atom probe tomography tips strongly differs from that of standard scanning transmission electron microscopy foils. Whereas the latter are rather flat and thin (atom probe tomography specimens. Based on simulations (electron probe propagation and image simulations), the possibility of applying quantitative high angle annular dark field scanning transmission electron microscopy to atom probe tomography specimens has been tested. The influence of electron probe convergence and the benefit of deconvolving the electron probe point spread function have been established. Atom counting in atom probe tomography specimens is reported for the first time in the present work. It is demonstrated that, based on single projections of high angle annular dark field imaging, significant quantitative information can be used as additional input for refining the data obtained by correlative analysis of the specimen in APT, thereby opening new perspectives in the field of atomic scale tomography. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Soil-Related Input Parameters for the Biosphere Model

    International Nuclear Information System (INIS)

    Smith, A. J.

    2004-01-01

    This report presents one of the analyses that support the Environmental Radiation Model for Yucca Mountain Nevada (ERMYN). The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes the details of the conceptual model as well as the mathematical model and the required input parameters. The biosphere model is one of a series of process models supporting the postclosure Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A schematic representation of the documentation flow for the Biosphere input to TSPA is presented in Figure 1-1. This figure shows the evolutionary relationships among the products (i.e., analysis and model reports) developed for biosphere modeling, and the biosphere abstraction products for TSPA, as identified in the ''Technical Work Plan for Biosphere Modeling and Expert Support'' (TWP) (BSC 2004 [DIRS 169573]). This figure is included to provide an understanding of how this analysis report contributes to biosphere modeling in support of the license application, and is not intended to imply that access to the listed documents is required to understand the contents of this report. This report, ''Soil-Related Input Parameters for the Biosphere Model'', is one of the five analysis reports that develop input parameters for use in the ERMYN model. This report is the source documentation for the six biosphere parameters identified in Table 1-1. The purpose of this analysis was to develop the biosphere model parameters associated with the accumulation and depletion of radionuclides in the soil. These parameters support the calculation of radionuclide concentrations in soil from on-going irrigation or ash deposition and, as a direct consequence, radionuclide concentration in other environmental media that are affected by radionuclide concentrations in soil. The analysis was performed in accordance with the TWP (BSC 2004 [DIRS 169573]) where the governing procedure was defined as AP-SIII.9Q, ''Scientific Analyses''. This

  10. QTest: Quantitative Testing of Theories of Binary Choice.

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  11. QTest: Quantitative Testing of Theories of Binary Choice

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  12. A Stochastic Collocation Method for Elliptic Partial Differential Equations with Random Input Data

    KAUST Repository

    Babuška, Ivo; Nobile, Fabio; Tempone, Raul

    2010-01-01

    This work proposes and analyzes a stochastic collocation method for solving elliptic partial differential equations with random coefficients and forcing terms. These input data are assumed to depend on a finite number of random variables. The method consists of a Galerkin approximation in space and a collocation in the zeros of suitable tensor product orthogonal polynomials (Gauss points) in the probability space, and naturally leads to the solution of uncoupled deterministic problems as in the Monte Carlo approach. It treats easily a wide range of situations, such as input data that depend nonlinearly on the random variables, diffusivity coefficients with unbounded second moments, and random variables that are correlated or even unbounded. We provide a rigorous convergence analysis and demonstrate exponential convergence of the “probability error” with respect to the number of Gauss points in each direction of the probability space, under some regularity assumptions on the random input data. Numerical examples show the effectiveness of the method. Finally, we include a section with developments posterior to the original publication of this work. There we review sparse grid stochastic collocation methods, which are effective collocation strategies for problems that depend on a moderately large number of random variables.
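The key point, that collocation reduces the stochastic problem to uncoupled deterministic solves at quadrature nodes, can be illustrated on a toy one-dimensional problem with a single uniform random coefficient. The toy problem and its closed-form solution are assumptions made here for brevity; the paper treats general elliptic PDEs and Gauss points in higher-dimensional probability spaces.

```python
import math

# Toy elliptic problem with one random input: -(a u')' = 1 on (0,1),
# u(0) = u(1) = 0, with random constant coefficient a = 2 + Y where
# Y ~ Uniform(-1, 1).  The exact solution is u(x; Y) = x(1-x)/(2a),
# so each collocation point costs exactly one deterministic "solve".
nodes = [-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0)]  # 3-pt Gauss-Legendre
weights = [5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0]

def solve_deterministic(y, x=0.5):
    a = 2.0 + y
    return x * (1.0 - x) / (2.0 * a)

# E[u(1/2)]: quadrature over Y (Uniform(-1,1) has density 1/2 on [-1,1])
mean_u = 0.5 * sum(w * solve_deterministic(y) for y, w in zip(nodes, weights))

exact = math.log(3.0) / 16.0   # closed form of E[1/(8(2+Y))]
```

Three Gauss points already reproduce the exact mean to a few times 1e-5, a small-scale echo of the exponential convergence in the number of Gauss points that the paper proves.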

  13. Characterization of memory states of the Preisach operator with stochastic inputs

    International Nuclear Information System (INIS)

    Amann, A.; Brokate, M.; McCarthy, S.; Rachinskii, D.; Temnov, G.

    2012-01-01

    The Preisach operator with inputs defined by a Markov process x^t is considered. The question we address is: what is the distribution of the random memory state of the Preisach operator at a given time moment t_0 in the limit r→∞ of infinitely long input history x^t, t_0−r≤t≤t_0? In order to answer this question, we introduce a Markov chain (called the memory state Markov chain) where the states are pairs (m_k, M_k) of elements from the monotone sequences of the local minimum input values m_k and the local maximum input values M_k recorded in the memory state and the index k of the elements plays the role of time. We express the transition probabilities of this Markov chain in terms of the transition probabilities of the input stochastic process and show that the memory state Markov chain and the input process generate the same distribution of the memory states. These results are illustrated by several examples of stochastic inputs such as the Wiener and Bernoulli processes and their mixture (we first discuss a discrete version of these processes and then the continuous time and state setting). The memory state Markov chain is then used to find the distribution of the random number of elements in the memory state sequence. We show that this number has the Poisson distribution for the Wiener and Bernoulli process inputs. In particular, in the discrete setting, the mean value of the number of elements in the memory state scales as ln N, where N is the number of the input states, while the mean time it takes the input to generate this memory state scales as N^2 for the Wiener process and as N for the Bernoulli process. A similar relationship between the dimension of the memory state vector and the number of iterations in the numerical realization of the input is shown for the mixture of the Wiener and Bernoulli processes, thus confirming that the memory state Markov chain is an efficient tool for generating the distribution of the Preisach operator memory
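The memory state referred to above is produced by the Preisach wiping-out (deletion) rule: each new input value erases every recorded extremum it sweeps past, leaving alternating sequences of decreasing maxima M_k and increasing minima m_k. The fold below is a sketch of that deletion rule only (the paper's Markov-chain analysis is built on top of it); the input paths are invented.

```python
def memory_state(path):
    """Fold an input path into the Preisach memory state: a list of
    alternating dominant extrema whose oscillation amplitudes decrease,
    i.e. the surviving (m_k, M_k) turning points in time order."""
    mem = []
    for v in path:
        mem.append(v)
        while True:
            if len(mem) >= 3 and (mem[-1] - mem[-2]) * (mem[-2] - mem[-3]) >= 0:
                del mem[-2]      # monotone continuation: middle point is no extremum
            elif len(mem) >= 3 and (mem[-1] - mem[-3]) * (mem[-3] - mem[-2]) >= 0:
                del mem[-3:-1]   # v sweeps past an older extremum: wipe the pair
            elif len(mem) == 2 and mem[0] == mem[1]:
                del mem[-1]
            else:
                break
    return mem

# input rises to 5, dips to 1, rebounds to 3, drops to 0.5 (wiping 1 and 3),
# then jumps to 6 (wiping 5 and 0.5), and finally falls back to 2
state = memory_state([0.0, 5.0, 1.0, 3.0, 0.5, 6.0, 2.0])
```

Feeding such a fold with samples of a random walk yields the random memory states whose distribution the memory state Markov chain characterizes.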

  14. Characterization of memory states of the Preisach operator with stochastic inputs

    Energy Technology Data Exchange (ETDEWEB)

    Amann, A. [Department of Applied Mathematics, University College Cork (Ireland); Brokate, M. [Zentrum Mathematik, Technische Universitaet Muenchen (Germany); McCarthy, S. [Department of Applied Mathematics, University College Cork (Ireland); Rachinskii, D., E-mail: d.rachinskii@ucc.ie [Department of Applied Mathematics, University College Cork (Ireland); Temnov, G. [Department of Mathematics, University College Cork (Ireland)

    2012-05-01

    The Preisach operator with inputs defined by a Markov process x^t is considered. The question we address is: what is the distribution of the random memory state of the Preisach operator at a given time moment t_0 in the limit r→∞ of infinitely long input history x^t, t_0−r≤t≤t_0? In order to answer this question, we introduce a Markov chain (called the memory state Markov chain) where the states are pairs (m_k, M_k) of elements from the monotone sequences of the local minimum input values m_k and the local maximum input values M_k recorded in the memory state and the index k of the elements plays the role of time. We express the transition probabilities of this Markov chain in terms of the transition probabilities of the input stochastic process and show that the memory state Markov chain and the input process generate the same distribution of the memory states. These results are illustrated by several examples of stochastic inputs such as the Wiener and Bernoulli processes and their mixture (we first discuss a discrete version of these processes and then the continuous time and state setting). The memory state Markov chain is then used to find the distribution of the random number of elements in the memory state sequence. We show that this number has the Poisson distribution for the Wiener and Bernoulli process inputs. In particular, in the discrete setting, the mean value of the number of elements in the memory state scales as ln N, where N is the number of the input states, while the mean time it takes the input to generate this memory state scales as N^2 for the Wiener process and as N for the Bernoulli process. A similar relationship between the dimension of the memory state vector and the number of iterations in the numerical realization of the input is shown for the mixture of the Wiener and Bernoulli processes, thus confirming that the memory state Markov chain is an efficient tool for

  15. Life Cycle Assessment (LCA) for Wheat (Triticum aestivum L.) Production Systems of Iran: 1- Comparison of Inputs Level

    Directory of Open Access Journals (Sweden)

    Mahdi Nassiri Mahallati

    2018-02-01

    Full Text Available Introduction Agricultural intensification has serious environmental consequences, such as depletion of non-renewable resources, emission of greenhouse gases, threats to biodiversity, and pollution of both surface and underground water resources. Life cycle assessment (LCA) provides a standard method for assessing environmental impacts from various economic activities, including agriculture, and covers a wide range of impact categories across the entire production chain. Over the past few decades, food production in Iran has increased drastically due to heavier use of chemical inputs. Since the LCA method has been overlooked for assessing the effects of agricultural intensification in Iran and few studies have been conducted at the local level (such as provinces or cities), the purpose of this research is to evaluate wheat production systems throughout the country, classified by level of intensification, using the LCA method. Materials and Methods Fourteen provinces covering 80 percent of the total cultivated area of wheat production in the country were subjected to a cradle-to-gate LCA study using the standard method. The selected provinces were classified as low, medium, and high input based on the level of intensification, and all inputs and emissions were estimated within the system boundaries during the inventory stage. Required data on yield and applied input levels for the 14 provinces were collected from the official databases of the Ministry of Jihad Agriculture. The various environmental impacts, including abiotic resource depletion, land use, global warming potential, acidification and eutrophication potential, and human, aquatic, and terrestrial ecotoxicity potential of wheat production systems over the country, were studied based on emission coefficients and characterization factors provided by standard literature. The integrated effects of emissions in each impact category were calculated per functional unit (hectare of cultivated area as well as ton
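The characterization step described above multiplies each inventory emission by an impact-category factor and sums, then normalizes by the functional unit. Here is a sketch for the global warming category; the inventory amounts and the yield are invented for illustration, while the GWP100 factors are the standard IPCC AR5 values.

```python
# Hypothetical inventory (kg emitted per ha of wheat) and GWP100
# characterization factors (kg CO2-eq per kg; IPCC AR5 values).
inventory = {"CO2": 2400.0, "CH4": 3.0, "N2O": 4.5}
gwp100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

# characterization: sum of emission_i * factor_i, per hectare
impact_per_ha = sum(inventory[s] * gwp100[s] for s in inventory)

# re-express per the second functional unit, "ton of grain",
# given a hypothetical yield
yield_t_per_ha = 3.2
impact_per_ton = impact_per_ha / yield_t_per_ha
```

The same pattern, one factor table per impact category, yields the acidification, eutrophication, and ecotoxicity scores listed in the abstract.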

  16. History of the special committee on INIS input preparation

    International Nuclear Information System (INIS)

    Itabashi, Keizo

    2011-06-01

    The special committee on INIS input techniques was held 8 times from December 1970 to March 1973. The special committee on INIS input preparation was held 39 times from February 1974 to December 2004. The history of these two committees is described. (author)

  17. Parameter setting and input reduction

    NARCIS (Netherlands)

    Evers, A.; van Kampen, N.J.|info:eu-repo/dai/nl/126439737

    2008-01-01

    The language acquisition procedure identifies certain properties of the target grammar before others. The evidence from the input is processed in a stepwise order. Section 1 equates that order and its typical effects with an order of parameter setting. The question is how the acquisition procedure

  18. Lithium inputs to subduction zones

    NARCIS (Netherlands)

    Bouman, C.; Elliott, T.R.; Vroon, P.Z.

    2004-01-01

    We have studied the sedimentary and basaltic inputs of lithium to subduction zones. Various sediments from DSDP and ODP drill cores in front of the Mariana, South Sandwich, Banda, East Sunda and Lesser Antilles island arcs have been analysed and show highly variable Li contents and δ

  19. User's guide to input for WRAP: a water reactor analysis package

    International Nuclear Information System (INIS)

    Gregory, M.V.

    1977-06-01

    The document describes the input records required to execute the Water Reactor Analysis Package (WRAP) for the analysis of thermal-hydraulic transients, primarily in light water reactors. The card input required by RELAP4 has been significantly modified to broaden the code's input processing capabilities: (1) All input is in the form of templated, named records. (2) All components (volumes, junctions, etc.) are named rather than numbered, and system relationships are formed by defining associations between the names. (3) A hierarchical part structure is used which allows collections of components to be described as discrete parts (these parts may then be catalogued for use in a wide range of cases). A sample problem, the small break analysis of the Westinghouse Trojan Plant, is discussed, and detailed, step-by-step instructions for setting up an input data base are presented. A master list of all input templates for WRAP is compiled

  20. K2: Extending Kepler's Power to the Ecliptic-Ecliptic Plane Input Catalog

    Science.gov (United States)

    Huber, Daniel; Bryson, Stephen T.

    2017-01-01

    This document describes the Ecliptic Plane Input Catalog (EPIC) for the K2 mission (Howell et al. 2014). The primary purpose of this catalog is to provide positions and Kepler magnitudes for target management and aperture photometry. The Ecliptic Plane Input Catalog is hosted at MAST (http://archive.stsci.edu/k2/epic/search.php) and should be used for selecting targets whenever possible. The EPIC is updated for future K2 campaigns as their fields of view are finalized and the associated target management is completed. Table 0 summarizes the EPIC updates to date and the ID range for each. The main algorithms used to construct the EPIC are described in Sections 2 through 4. The details for individual campaigns are described in the subsequent sections, with the references listed in the last section. Further details can be found in Huber et al. (2016).

  1. Alternative input medium development for wheelchair user with severe spinal cord injury

    Science.gov (United States)

    Ihsan, Izzat Aqmar; Tomari, Razali; Zakaria, Wan Nurshazwani Wan; Othman, Nurmiza

    2017-09-01

    Quadriplegia (tetraplegia) patients have restricted movement of all four limbs and the torso, caused by severe spinal cord injury. Undoubtedly, these patients face difficulties when operating their powered electric wheelchair, since they are unable to control the wheelchair by means of a standard joystick. Due to the total loss of both sensory and motor function of the four limbs and torso, an alternative input medium for the wheelchair will be developed to assist the user in operating the wheelchair. In this framework, the direction of the wheelchair movement is determined by the user's conscious intent through a brain control interface (BCI) based on the electroencephalogram (EEG) signal. A laser range finder (LRF) is used to perceive environment information for determining a safe distance around the wheelchair. A local path planning algorithm will be developed to provide a navigation planner that works with the user's input to prevent collisions during operation.
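The interplay of the BCI command and the laser range finder described above can be sketched as a simple safety veto. The function name, the stop threshold, and the scan values are all invented for illustration; the paper's actual path planner is more elaborate.

```python
def safe_command(bci_direction, laser_ranges_m, stop_distance=0.5):
    """Collision-avoidance veto (sketch): the BCI supplies the intended
    direction; the laser range finder scan overrides it with 'stop'
    whenever any obstacle is closer than the safety distance."""
    if min(laser_ranges_m) < stop_distance:
        return "stop"
    return bci_direction

scan = [2.1, 1.8, 0.4, 1.9]          # hypothetical scan with a close obstacle
cmd = safe_command("forward", scan)  # vetoed because 0.4 m < 0.5 m
```

A real planner would steer around the obstacle rather than simply stopping, but the veto captures the division of labour: intent from the BCI, safety from the range data.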

  2. The Absolute Stability Analysis in Fuzzy Control Systems with Parametric Uncertainties and Reference Inputs

    Science.gov (United States)

    Wu, Bing-Fei; Ma, Li-Shan; Perng, Jau-Woei

    This study analyzes the absolute stability of P and PD type fuzzy logic control systems with both certain and uncertain linear plants. The stability analysis includes the reference input, actuator gain and interval plant parameters. For certain linear plants, the stability (i.e. the stable equilibria of the error) in P and PD types is analyzed with the Popov or linearization methods under various reference inputs and actuator gains. The steady state errors of fuzzy control systems are also addressed in the parameter plane. The parametric robust Popov criterion for parametric absolute stability based on Lur'e systems is also applied to the stability analysis of P type fuzzy control systems with uncertain plants. The PD type fuzzy logic controller in our approach is a single-input fuzzy logic controller and is transformed into the P type for analysis. Unlike previous works, our absolute stability analysis of fuzzy control systems is given with respect to a non-zero reference input and an uncertain linear plant via the parametric robust Popov criterion. Moreover, a fuzzy current controlled RC circuit is designed with PSPICE models. Both numerical and PSPICE simulations are provided to verify the analytical results. Furthermore, the oscillation mechanism in fuzzy control systems is examined from the viewpoint of various equilibrium points in the simulation example. Finally, comparisons are also given to show the effectiveness of the analysis method.

  3. Shaped input distributions for structural damage localization

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Bernal, Dionisio; Damkilde, Lars

    2018-01-01

    localization method is cast that operates on the premise of shaping inputs—whose spatial distribution is fixed—by use of a model, such that these inputs, in one structural subdomain at a time, suppress certain steady-state vibration quantities (depending on the type of damage one seeks to interrogate for......). Accordingly, damage is localized when the vibration signature induced by the shaped inputs in the damaged state corresponds to that in the reference state, hereby implying that the approach does not point directly to damage. Instead, it operates with interrogation based on postulated damage patterns...

  4. Nuclear reaction inputs based on effective interactions

    Energy Technology Data Exchange (ETDEWEB)

    Hilaire, S.; Peru, S.; Dubray, N.; Dupuis, M.; Bauge, E. [CEA, DAM, DIF, Arpajon (France); Goriely, S. [Universite Libre de Bruxelles, Institut d' Astronomie et d' Astrophysique, CP-226, Brussels (Belgium)

    2016-11-15

    Extensive nuclear structure studies have been performed for decades using effective interactions as the sole input. They have shown a remarkable ability to describe rather accurately many types of nuclear properties. In the early 2000s, a major effort was launched to produce nuclear reaction input data from the Gogny interaction, in order to challenge its quality with respect to nuclear reaction observables as well. The status of this project, well advanced today thanks to the use of modern computers and modern nuclear reaction codes, is reviewed and future developments are discussed. (orig.)

  5. Consumer input into research: the Australian Cancer Trials website

    Directory of Open Access Journals (Sweden)

    Butow Phyllis N

    2011-06-01

    Full Text Available Abstract Background The Australian Cancer Trials website (ACTO) was publicly launched in 2010 to help people search for cancer clinical trials recruiting in Australia, provide information about clinical trials and assist with doctor-patient communication about trials. We describe consumer involvement in the design and development of ACTO and report our preliminary patient evaluation of the website. Methods Consumers, led by Cancer Voices NSW, provided the impetus to develop the website. Consumer representative groups were consulted by the research team during the design and development of ACTO, which combines a search engine, trial details, general information about trial participation and question prompt lists. Website use was analysed. A patient evaluation questionnaire was completed at one hospital, one week after exposure to the website. Results ACTO's main features and content reflect consumer input. In February 2011, it covered 1,042 cancer trials. Since ACTO's public launch in November 2010 until the end of February 2011, the website has had 2,549 new visits and generated 17,833 page views. In a sub-study of 47 patient users, 89% found the website helpful for learning about clinical trials and all respondents thought patients should have access to ACTO. Conclusions The development of ACTO is an example of consumers working with doctors, researchers and policy makers to improve the information available to people whose lives are affected by cancer and to help them participate in their treatment decisions, including consideration of clinical trial enrolment. Consumer input has ensured that the website is informative, targets consumer priorities and is user-friendly. ACTO serves as a model for other health conditions.

  6. Multiple Input - Multiple Output (MIMO) SAR

    Data.gov (United States)

    National Aeronautics and Space Administration — This effort will research and implement advanced Multiple-Input Multiple-Output (MIMO) Synthetic Aperture Radar (SAR) techniques which have the potential to improve...

  7. EXIOBASE 3: Developing a Time Series of Detailed Environmentally Extended Multi-Regional Input-Output Tables

    NARCIS (Netherlands)

    Stadler, K.; Wood, R.; Bulavskaya, T.; Södersten, C.J.; Simas, M.; Schmidt, S.; Usubiaga, A.; Acosta-Fernández, J.; Kuenen, J.; Bruckner, M.; Giljum, S.; Lutter, S.; Merciai, S.; Schmidt, J.H.; Theurl, M.C.; Plutzar, C.; Kastner, T.; Eisenmenger, N.; Erb, K.H.; Koning, A. de; Tukker, A.

    2018-01-01

    Environmentally extended multiregional input-output (EE MRIO) tables have emerged as a key framework to provide a comprehensive description of the global economy and analyze its effects on the environment. Of the available EE MRIO databases, EXIOBASE stands out as a database compatible with the

  8. PREVIMER : Meteorological inputs and outputs

    Science.gov (United States)

    Ravenel, H.; Lecornu, F.; Kerléguer, L.

    2009-09-01

    PREVIMER is a pre-operational system aiming to provide a wide range of users, from private individuals to professionals, with short-term forecasts about the coastal environment along the French coastlines bordering the English Channel, the Atlantic Ocean, and the Mediterranean Sea. Observation data and digital modelling tools first provide 48-hour (probably 96-hour by summer 2009) forecasts of sea states, currents, sea water levels and temperatures. The follow-up of an increasing number of biological parameters will, in time, complete this overview of the coastal environment. Working in partnership with the French Naval Hydrographic and Oceanographic Service (Service Hydrographique et Océanographique de la Marine, SHOM), the French National Weather Service (Météo-France), the French public science and technology research institute (Institut de Recherche pour le Développement, IRD), the European Institute of Marine Studies (Institut Universitaire Européen de la Mer, IUEM) and many others, IFREMER (the French public institute for marine research) is supplying the technologies needed to ensure this pertinent information, available daily on the Internet at http://www.previmer.org, and stored at the Operational Coastal Oceanographic Data Centre. Since 2006, PREVIMER has published the results of demonstrators assigned to limited geographic areas and to specific applications. This system remains experimental. The following topics are covered: hydrodynamic circulation; sea states; follow-up of passive tracers, conservative or non-conservative (specifically of microbiological origin); biogeochemical state; and primary production. Lastly, PREVIMER provides researchers and R&D departments with modelling tools and access to the database, in which the observation data and the modelling results are stored, to undertake environmental studies on new sites. The communication will focus on meteorological inputs to and outputs from PREVIMER. It will draw the lessons from almost 3 years during

  9. History of nutrient inputs to the northeastern United States, 1930-2000

    Science.gov (United States)

    Hale, Rebecca L.; Hoover, Joseph H.; Wollheim, Wilfred M.; Vörösmarty, Charles J.

    2013-04-01

    Humans have dramatically altered nutrient cycles at local to global scales. We examined changes in anthropogenic nutrient inputs to the northeastern United States (NE) from 1930 to 2000. We created a comprehensive time series of anthropogenic N and P inputs to 437 counties in the NE at 5 year intervals. Inputs included atmospheric N deposition, biological N2 fixation, fertilizer, detergent P, livestock feed, and human food. Exports included exports of feed and food and volatilization of ammonia. N inputs to the NE increased throughout the study period, primarily due to increases in atmospheric deposition and fertilizer. P inputs increased until 1970 and then declined due to decreased fertilizer and detergent inputs. Livestock consistently consumed the majority of nutrient inputs over time and space. The area of crop agriculture declined during the study period but consumed more nutrients as fertilizer. We found that stoichiometry (N:P) of inputs and absolute amounts of N matched nutritional needs (livestock, humans, crops) when atmospheric components (N deposition, N2 fixation) were not included. Differences between N and P led to major changes in N:P stoichiometry over time, consistent with global trends. N:P decreased from 1930 to 1970 due to increased inputs of P, and increased from 1970 to 2000 due to increased N deposition and fertilizer and decreases in P fertilizer and detergent use. We found that nutrient use is a dynamic product of social, economic, political, and environmental interactions. Therefore, future nutrient management must take into account these factors to design successful and effective nutrient reduction measures.
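
    The budget bookkeeping described above (net input = inputs minus exports, with stoichiometry compared on a molar N:P basis) can be sketched as follows. Every figure below is made up for illustration; none comes from the study.

```python
# Sketch of a nutrient budget: net anthropogenic input = inputs - exports,
# and molar N:P computed from mass totals. All quantities are hypothetical
# illustrative numbers, not values from the study.

N_MOLAR_MASS, P_MOLAR_MASS = 14.007, 30.974   # g/mol

inputs_n = {"deposition": 120.0, "fixation": 40.0, "fertilizer": 90.0,
            "feed": 60.0, "food": 30.0}                    # kt N / yr
exports_n = {"feed_food_export": 25.0, "nh3_volatilization": 35.0}
inputs_p = {"fertilizer": 20.0, "detergent": 5.0,
            "feed": 10.0, "food": 4.0}                     # kt P / yr
exports_p = {"feed_food_export": 6.0}

net_n = sum(inputs_n.values()) - sum(exports_n.values())
net_p = sum(inputs_p.values()) - sum(exports_p.values())
np_molar = (net_n / N_MOLAR_MASS) / (net_p / P_MOLAR_MASS)

print(f"net N = {net_n} kt/yr, net P = {net_p} kt/yr, molar N:P = {np_molar:.1f}")
```

    Dropping or adding a term (say, excluding atmospheric deposition, as the study does when comparing against nutritional needs) is then a one-line change to the dictionaries.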

  10. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
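
    A common numerical counterpart to such an analysis is to sample correlated inputs through a Cholesky factor of the correlation matrix and compare the response variance with and without correlation. The sketch below uses a hypothetical two-input model, not the paper's analytic method or its HIV model.

```python
import numpy as np

# Monte Carlo uncertainty propagation with correlated Gaussian inputs.
# The model y = x1 + x1*x2 is a hypothetical stand-in; the correlation
# is imposed via the Cholesky factor of the correlation matrix.

rng = np.random.default_rng(0)

def model(x):
    return x[:, 0] + x[:, 0] * x[:, 1]   # hypothetical response surface

corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])
L = np.linalg.cholesky(corr)
n = 100_000

z = rng.standard_normal((n, 2))
x_corr = 1.0 + 0.1 * (z @ L.T)          # inputs ~ N(1, 0.1^2), rho = 0.8
x_ind = 1.0 + 0.1 * rng.standard_normal((n, 2))   # independent inputs

var_corr = model(x_corr).var()
var_ind = model(x_ind).var()
print(f"Var(y) with rho=0.8: {var_corr:.4f}   Var(y) independent: {var_ind:.4f}")
```

    For this model a positive correlation inflates the response variance (the linearized covariance term is positive), illustrating why ignoring input correlations can misstate the output uncertainty.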

  11. A Brief Talk on Cultural Input in English Teaching

    Institute of Scientific and Technical Information of China (English)

    王敏

    2007-01-01

    Different countries have different languages and cultures. This paper starts from the differences between Western and Chinese culture to point out the importance and necessity of cultural input in English teaching, and puts forward some approaches to reinforce cultural input in language teaching.

  12. The Effect of Pinyin Input Experience on the Link Between Semantic and Phonology of Chinese Character in Digital Writing.

    Science.gov (United States)

    Chen, Jingjun; Luo, Rong; Liu, Huashan

    2017-08-01

    With the development of ICT, digital writing is becoming much more common in daily life. Unlike English words, which are typed directly as alphabetic strings, Chinese characters are usually entered by typing phonetic alphabets and then selecting the glyph offered by the Pinyin input-method software. Because this process does not require users to produce the orthographic spelling themselves, it differs from the traditional model of written language production based on handwriting. Much of the research in this domain has found that using the Pinyin input method benefits Chinese character recognition, but only a small part has explored the effects of an individual's Pinyin input experience on the Chinese character production process. We ask whether using a Pinyin input method strengthens the semantic-phonology linkage or the semantic-orthography linkage in the Chinese character mental lexicon. Recording the reaction time (RT) and accuracy of participants completing semantic-syllable and semantic-glyph consistency judgments, we found that the accuracy of semantic-syllable consistency judgments in the high Pinyin input experience group was higher than in the low experience group, while RT showed the reverse pattern. There were no significant differences in semantic-glyph consistency judgments between the two groups. We conclude that using the Pinyin input method in Chinese digital writing can strengthen the semantic-phonology linkage without weakening the semantic-orthography linkage in the mental lexicon, which means that the Pinyin input method is beneficial to lexical processing involved in Chinese cognition.

  13. Robotics control using isolated word recognition of voice input

    Science.gov (United States)

    Weiner, J. M.

    1977-01-01

    A speech input/output system is presented that can be used to communicate with a task-oriented system. Human speech commands and synthesized voice output extend conventional information exchange capabilities between man and machine by utilizing audio input and output channels. The speech input facility comprises a hardware feature extractor and a microprocessor-implemented isolated word or phrase recognition system. The recognizer offers a medium-sized (100 commands), syntactically constrained vocabulary, and exhibits close to real-time performance. The major portion of the recognition processing is accomplished through software, minimizing the complexity of the hardware feature extractor.
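
    Isolated-word recognizers of this era commonly matched an incoming feature sequence against stored templates with dynamic time warping (DTW). The sketch below is a generic illustration of that idea using hypothetical one-dimensional "energy" sequences; the paper's actual feature extractor and matcher are not described here, so treat all names and data as assumptions.

```python
import numpy as np

# Generic isolated-word recognition by DTW template matching.
# Feature sequences are hypothetical 1-D values; a real front end
# would extract e.g. filter-bank or LPC features per frame.

def dtw_distance(a, b):
    """Classic DTW alignment cost with insert/delete/match steps."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def recognize(utterance, templates):
    """Return the vocabulary word whose template is DTW-closest."""
    return min(templates, key=lambda w: dtw_distance(utterance, templates[w]))

templates = {
    "stop": np.array([0.1, 0.9, 0.8, 0.1]),
    "go":   np.array([0.1, 0.3, 0.9, 0.9, 0.2]),
}
spoken = np.array([0.1, 0.85, 0.8, 0.75, 0.1])   # a time-stretched "stop"
print(recognize(spoken, templates))
```

    The warping in the inner `min` is what makes the match tolerant to the speaking-rate variation that defeats rigid frame-by-frame comparison.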

  14. Automation of Geometry Input for Building Code Compliance Check

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Johansen, Peter Lind; Jensen, Rasmus Lund

    2017-01-01

    Documentation of compliance with the energy performance regulations at the end of the detailed design phase is mandatory for building owners in Denmark. Therefore, besides multidisciplinary input, the building design process requires various iterative analyses, so that the optimal solutions can...... be identified amongst multiple alternatives. However, meeting performance criteria is often associated with manual data inputs and retroactive modifications of the design. Due to poor interoperability between the authoring tools and the compliance check program, the processes are redundant and inefficient...... from building geometry created in Autodesk Revit and its translation to input for compliance check analysis....

  15. Input Scanners: A Growing Impact In A Diverse Marketplace

    Science.gov (United States)

    Marks, Kevin E.

    1989-08-01

    Just as newly invented photographic processes revolutionized the printing industry at the turn of the century, electronic imaging has affected almost every computer application today. To completely emulate traditional mechanical means of information handling, computer-based systems must be able to capture graphic images. Thus, there is a widespread need for the electronic camera, the digitizer, the input scanner. This paper reviews how various types of input scanners are being used in many diverse applications. The following topics are covered: a historical overview of input scanners; new applications for scanners; the impact of scanning technology on select markets; and scanning systems issues.

  16. OFFSCALE: PC input processor for SCALE-4 criticality sequences

    International Nuclear Information System (INIS)

    Bowman, S.M.

    1991-01-01

    OFFSCALE is a personal computer program that serves as a user-friendly interface for the Criticality Safety Analysis Sequences (CSAS) available in SCALE-4. It is designed to assist a SCALE-4 user in preparing an input file for execution of criticality safety problems. Output from OFFSCALE is a card-image input file that may be uploaded to a mainframe computer to execute the CSAS4 control module in SCALE-4. OFFSCALE features a pulldown menu system that accesses sophisticated data entry screens. The program allows the user to quickly set up a CSAS4 input file and perform data checking

  17. Video-based Chinese Input System via Fingertip Tracking

    Directory of Open Access Journals (Sweden)

    Chih-Chang Yu

    2012-10-01

    Full Text Available In this paper, we propose a system that detects and tracks fingertips online and recognizes Mandarin Phonetic Symbols (MPS) for user-friendly Chinese input. Using fingertips and cameras to replace pens and touch panels as input devices can reduce cost and improve the ease of use and comfort of the human-computer interface. In the proposed framework, particle filters with enhanced appearance models are applied for robust fingertip tracking. MPS combination recognition is then performed on the tracked fingertip trajectories using hidden Markov models. In the proposed system, the users' fingertips can be robustly tracked, and the challenges of entering, leaving and virtual strokes caused by video-based fingertip input can be overcome. Experimental results show the feasibility and effectiveness of the proposed work.
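
    The tracking stage rests on a bootstrap particle filter, which can be sketched generically. The example below tracks a drifting 2-D point (a stand-in for a fingertip) from synthetic noisy observations, replacing the paper's enhanced appearance model with a simple Gaussian observation likelihood; all dynamics and noise levels are assumptions.

```python
import numpy as np

# Minimal bootstrap particle filter: predict with a random-walk motion
# model, weight particles by a Gaussian observation likelihood, estimate
# by the weighted mean, then resample (multinomial). Hypothetical setup.

rng = np.random.default_rng(1)
n_particles = 500
obs_sigma, proc_sigma = 0.5, 0.2

truth = np.array([0.0, 0.0])
particles = rng.normal(truth, 1.0, size=(n_particles, 2))

estimate = truth
for _ in range(30):
    truth = truth + np.array([0.1, 0.05])            # fingertip drifts
    z = truth + rng.normal(0.0, obs_sigma, 2)        # noisy measurement
    particles = particles + rng.normal(0.0, proc_sigma, particles.shape)
    d2 = ((particles - z) ** 2).sum(axis=1)
    weights = np.exp(-d2 / (2 * obs_sigma ** 2))     # Gaussian likelihood
    weights /= weights.sum()
    estimate = weights @ particles                   # posterior mean
    idx = rng.choice(n_particles, size=n_particles, p=weights)
    particles = particles[idx]                       # multinomial resampling

err = float(np.linalg.norm(estimate - truth))
print(f"final tracking error: {err:.3f}")
```

    In the paper's setting the Gaussian likelihood would be replaced by an appearance score around each particle's image position, and the estimated trajectory would feed the HMM recognizer.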

  18. Acoustic input and efferent activity regulate the expression of molecules involved in cochlear micromechanics

    Science.gov (United States)

    Lamas, Veronica; Arévalo, Juan C.; Juiz, José M.; Merchán, Miguel A.

    2015-01-01

    Electromotile activity in auditory outer hair cells (OHCs) is essential for sound amplification. It relies on the highly specialized membrane motor protein prestin and its interactions with the cytoskeleton. It is believed that the expression of prestin and related molecules involved in OHC electromotility may be dynamically regulated by signals from the acoustic environment. However, little is known about the nature of such signals and how they affect the expression of molecules involved in electromotility in OHCs. We show evidence that prestin oligomerization is regulated, over both short and relatively long terms, by acoustic input and descending efferent activity originating in the cortex, likely acting in concert. Unilateral removal of the middle ear ossicular chain reduces levels of trimeric prestin, particularly in the cochlea on the side of the lesion, whereas monomeric and dimeric forms are maintained or even increased, in particular on the contralateral side, as shown in Western blots. Unilateral removal of the auditory cortex (AC), which likely causes an imbalance in descending efferent activity on the cochlea, also reduces levels of trimeric and tetrameric forms of prestin on the side ipsilateral to the lesion, whereas on the contralateral side prestin remains unaffected, or is even increased in the case of the trimeric and tetrameric forms. As far as efferent inputs are concerned, unilateral ablation of the AC up-regulates the expression of α10 nicotinic ACh receptor (nAChR) transcripts in the cochlea, as shown by quantitative real-time PCR (RT-qPCR). This suggests that homeostatic synaptic scaling mechanisms may be involved in dynamically regulating OHC electromotility by medial olivocochlear efferents. Limited, unbalanced efferent activity after unilateral AC removal also affects prestin and β-actin mRNA levels. These findings support that the concerted action of acoustic and efferent inputs to the cochlea is needed to regulate the expression of major

  19. Added value of experts' knowledge to improve a quantitative microbial exposure assessment model--Application to aseptic-UHT food products.

    Science.gov (United States)

    Pujol, Laure; Johnson, Nicholas Brian; Magras, Catherine; Albert, Isabelle; Membré, Jeanne-Marie

    2015-10-15

    In a previous study, a quantitative microbial exposure assessment (QMEA) model applied to an aseptic-UHT food process was developed [Pujol, L., Albert, I., Magras, C., Johnson, N. B., Membré, J. M. Probabilistic exposure assessment model to estimate aseptic UHT product failure rate. 2015 International Journal of Food Microbiology. 192, 124-141]. It quantified the Sterility Failure Rate (SFR) associated with Bacillus cereus and Geobacillus stearothermophilus per process module (nine modules in total, from raw material reception to end-product storage). Previously, the probabilistic model inputs were set by experts (using knowledge and in-house data), but only the variability dimension was taken into account. The model was then improved using expert elicitation knowledge in two ways. First, the model was refined by adding the uncertainty dimension to the probabilistic inputs, enabling a second-order Monte Carlo analysis. The following eight inputs, and their impact on SFR, are presented in detail in this study: the D-value for each bacterium of interest (B. cereus and G. stearothermophilus) associated with the inactivation model for the UHT treatment step, i.e., two inputs; the log reduction (decimal reduction) number associated with the inactivation model for the packaging sterilization step, for each bacterium and each part of the packaging (product container and sealing component), i.e., four inputs; and the bacterial spore air load of the aseptic tank and filler cabinet rooms, i.e., two inputs. Second, the model was improved by leveraging expert knowledge to further develop the existing model. The proportion of bacteria in the product that settled on the surface of pipes (between the UHT treatment and the aseptic tank on one hand, and between the aseptic tank and the filler cabinet on the other), possibly leading to biofilm formation, was better characterized for each bacterium. It was modeled as a function of the hygienic design level of the aseptic
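
    A second-order Monte Carlo analysis of the kind added to the model separates uncertainty (outer loop over imperfectly known parameters) from variability (inner loop over unit-to-unit differences). The sketch below is purely illustrative: a hypothetical uncertain D-value and a variable initial spore load yield a distribution of per-parameter failure rates, summarized as a median with an uncertainty interval. None of the distributions or numbers are from the paper.

```python
import numpy as np

# Second-order (nested) Monte Carlo: outer loop samples uncertainty,
# inner loop samples variability. All distributions are assumptions.

rng = np.random.default_rng(2)
n_outer, n_inner = 200, 1000

failure_rates = []
for _ in range(n_outer):
    # uncertainty dimension: D-value (min) known only approximately
    D = rng.normal(0.3, 0.05)
    log_red = 0.5 / D                      # 0.5 min hold -> log10 reductions
    # variability dimension: initial contamination per container (log10)
    n0 = rng.normal(1.0, 0.5, size=n_inner)
    surv = n0 - log_red                    # log10 survivors per container
    failure_rates.append(float((surv >= 0).mean()))

failure_rates = np.array(failure_rates)
lo, med, hi = np.percentile(failure_rates, [2.5, 50, 97.5])
print(f"failure rate: median {med:.3f}, 95% uncertainty interval [{lo:.3f}, {hi:.3f}]")
```

    Collapsing the two loops into one (first-order Monte Carlo) would mix the two dimensions and hide how much of the spread in SFR is reducible by better knowledge of the D-value.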

  20. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of subsurface anomalies with enhanced depth resolution is a challenging task in thermographic estimation of anomaly depth. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier transform based methods used for post-processing unscramble the frequencies with limited frequency resolution and thus contribute a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest estimate to the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
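
    The spectral zooming idea can be illustrated with a toy signal: a plain FFT locates a tone only to within its bin spacing, while evaluating the spectrum on a fine grid inside a narrow band (what the chirp z-transform computes efficiently; done here by direct DTFT evaluation for clarity) pins the peak down much more precisely. The signal and all numbers below are illustrative, not thermal data from the paper.

```python
import numpy as np

# Spectral zooming on a toy tone: coarse FFT bins vs. a fine in-band grid.
# The fine grid is evaluated directly; a chirp z-transform would compute
# the same band more efficiently.

fs = 100.0
t = np.arange(0, 10, 1 / fs)          # 10 s record, N = 1000 samples
f0 = 5.013                            # true tone frequency, Hz (assumed)
x = np.sin(2 * np.pi * f0 * t)

# coarse FFT: bin spacing fs/N = 0.1 Hz
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), 1 / fs)
coarse = freqs[np.argmax(np.abs(X))]  # snaps to the nearest 0.1 Hz bin

# zoomed spectrum: 1 mHz grid inside 4.9-5.1 Hz
f_zoom = np.arange(4.9, 5.1, 0.001)
E = np.exp(-2j * np.pi * np.outer(f_zoom, t))
X_zoom = E @ x
fine = f_zoom[np.argmax(np.abs(X_zoom))]

print(f"coarse estimate: {coarse:.3f} Hz, zoomed estimate: {fine:.3f} Hz")
```

    Note that zooming refines the frequency sampling within the band; the fundamental resolution limit set by the record length still applies, which is why the abstract pairs the zoom with a suitably chosen excitation band.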